Prediction of global and local model quality in CASP8 using the ModFOLD server.
McGuffin, Liam J
2009-01-01
The development of effective methods for predicting the quality of three-dimensional (3D) models is fundamentally important for the success of tertiary structure (TS) prediction strategies. Since CASP7, the Quality Assessment (QA) category has existed to gauge the ability of various model quality assessment programs (MQAPs) at predicting the relative quality of individual 3D models. For the CASP8 experiment, automated predictions were submitted in the QA category using two methods from the ModFOLD server: ModFOLD version 1.1 and ModFOLDclust. ModFOLD version 1.1 is a single-model machine learning based method, which was used for automated predictions of global model quality (QMODE1). ModFOLDclust is a simple clustering based method, which was used for automated predictions of both global and local quality (QMODE2). In addition, manual predictions of model quality were made using ModFOLD version 2.0, an experimental method that combines the scores from ModFOLDclust and ModFOLD v1.1. Predictions from the ModFOLDclust method were the most successful of the three in terms of the global model quality, whilst the ModFOLD v1.1 method was comparable in performance to other single-model based methods. In addition, the ModFOLDclust method performed well at predicting the per-residue, or local, model quality scores. Predictions of the per-residue errors in our own 3D models, selected using the ModFOLD v2.0 method, were also the most accurate compared with those from other methods. All of the MQAPs described are publicly accessible via the ModFOLD server at: http://www.reading.ac.uk/bioinf/ModFOLD/. The methods are also freely available to download from: http://www.reading.ac.uk/bioinf/downloads/. Copyright 2009 Wiley-Liss, Inc.
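A minimal sketch of the consensus idea behind clustering-based quality assessment such as ModFOLDclust (an illustration, not the published implementation): a model's predicted global quality is taken as its mean pairwise structural similarity to the other models in the ensemble. The tm_score function and the model identifiers are assumed placeholders; any pairwise similarity measure such as TM-score or GDT-TS could be used.

import numpy as np

def consensus_global_quality(models, tm_score):
    # models: list of hashable model identifiers (e.g. file paths)
    # tm_score: assumed callable returning a symmetric similarity in [0, 1]
    n = len(models)
    sim = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            s = tm_score(models[i], models[j])
            sim[i, j] = sim[j, i] = s
    # mean similarity to the other n-1 models = predicted global quality
    return sim.sum(axis=1) / (n - 1)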
Deep learning architecture for air quality predictions.
Li, Xiang; Peng, Ling; Hu, Yuan; Shao, Jing; Chi, Tianhe
2016-11-01
With the rapid development of urbanization and industrialization, many developing countries are suffering from heavy air pollution. Governments and citizens have expressed increasing concern regarding air pollution because it affects human health and sustainable development worldwide. Current air quality prediction methods mainly use shallow models; however, these methods produce unsatisfactory results, which inspired us to investigate methods of predicting air quality based on deep architecture models. In this paper, a novel spatiotemporal deep learning (STDL)-based air quality prediction method that inherently considers spatial and temporal correlations is proposed. A stacked autoencoder (SAE) model is used to extract inherent air quality features, and it is trained in a greedy layer-wise manner. Compared with traditional time series prediction models, our model can predict the air quality of all stations simultaneously and shows the temporal stability in all seasons. Moreover, a comparison with the spatiotemporal artificial neural network (STANN), auto regression moving average (ARMA), and support vector regression (SVR) models demonstrates that the proposed method of performing air quality predictions has a superior performance.
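The following is an illustrative sketch, not the authors' code, of greedy layer-wise pretraining of a stacked autoencoder followed by a regression head, as described above. The layer sizes, optimiser settings, and the placeholder station data are assumptions.

import torch
import torch.nn as nn

def pretrain_layer(x, hidden_dim, epochs=50, lr=1e-3):
    in_dim = x.shape[1]
    enc, dec = nn.Linear(in_dim, hidden_dim), nn.Linear(hidden_dim, in_dim)
    opt = torch.optim.Adam(list(enc.parameters()) + list(dec.parameters()), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        recon = dec(torch.relu(enc(x)))
        loss = nn.functional.mse_loss(recon, x)   # reconstruct the layer's own input
        loss.backward()
        opt.step()
    return enc, torch.relu(enc(x)).detach()       # pretrained encoder + codes for the next layer

readings = torch.randn(1024, 12)                  # placeholder: readings from 12 monitoring stations
encoders, h = [], readings
for hidden in (64, 32, 16):                       # greedy, one layer at a time
    enc, h = pretrain_layer(h, hidden)
    encoders.append(enc)

# Stack the pretrained encoders, add a linear output layer predicting the
# next-hour concentration at every station, then fine-tune end to end.
layers = []
for enc in encoders:
    layers += [enc, nn.ReLU()]
sae = nn.Sequential(*layers, nn.Linear(16, 12))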
Cao, Renzhi; Wang, Zheng; Cheng, Jianlin
2014-04-15
Protein model quality assessment is an essential component of generating and using protein structural models. During the Tenth Critical Assessment of Techniques for Protein Structure Prediction (CASP10), we developed and tested four automated methods (MULTICOM-REFINE, MULTICOM-CLUSTER, MULTICOM-NOVEL, and MULTICOM-CONSTRUCT) that predicted both local and global quality of protein structural models. MULTICOM-REFINE was a clustering approach that used the average pairwise structural similarity between models to measure the global quality and the average Euclidean distance between a model and several top ranked models to measure the local quality. MULTICOM-CLUSTER and MULTICOM-NOVEL were two new support vector machine-based methods of predicting both the local and global quality of a single protein model. MULTICOM-CONSTRUCT was a new weighted pairwise model comparison (clustering) method that used the weighted average similarity between models in a pool to measure the global model quality. Our experiments showed that the pairwise model assessment methods worked better when a large portion of models in the pool were of good quality, whereas single-model quality assessment methods performed better on some hard targets when only a small portion of models in the pool were of reasonable quality. Since digging out a few good models from a large pool of low-quality models is a major challenge in protein structure prediction, single model quality assessment methods appear to be poised to make important contributions to protein structure modeling. The other interesting finding was that single-model quality assessment scores could be used to weight the models by the consensus pairwise model comparison method to improve its accuracy.
MQAPRank: improved global protein model quality assessment by learning-to-rank.
Jing, Xiaoyang; Dong, Qiwen
2017-05-25
Protein structure prediction has achieved a lot of progress during the last few decades, and a greater number of models can now be predicted for a given sequence. Consequently, assessing the qualities of predicted protein models is one of the key components of successful protein structure prediction. Over the past years, a number of methods have been developed to address this issue, which can be roughly divided into three categories: single methods, quasi-single methods and clustering (or consensus) methods. Although these methods achieve much success at different levels, accurate protein model quality assessment is still an open problem. Here, we present MQAPRank, a global protein model quality assessment program based on learning-to-rank. MQAPRank first sorts the decoy models using a single-model method based on a learning-to-rank algorithm to indicate their relative qualities for the target protein. It then takes the first five models as references and predicts the qualities of the other models using the average GDT_TS scores between the reference models and the other models. Benchmarked on the CASP11 and 3DRobot datasets, MQAPRank achieved better performance than other leading protein model quality assessment methods. Recently, MQAPRank participated in CASP12 under the group name FDUBio and achieved state-of-the-art performance. MQAPRank provides a convenient and powerful tool for protein model quality assessment with state-of-the-art performance, and it is useful for protein structure prediction and model quality assessment.
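A compact sketch of the quasi-single-model scheme described above (illustrative, not the MQAPRank source): decoys are ranked by a single-model scorer, the top five are kept as references, and every model is then scored by its average GDT_TS to those references. Both single_model_score and gdt_ts are assumed placeholder callables.

def quasi_single_model_qa(models, single_model_score, gdt_ts, n_refs=5):
    # models: hashable model identifiers; single_model_score ranks them on their own
    ranked = sorted(models, key=single_model_score, reverse=True)
    refs = ranked[:n_refs]                       # top-ranked models serve as references
    # predicted quality = average similarity to the reference set
    return {m: sum(gdt_ts(m, r) for r in refs) / len(refs) for m in models}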
Predicting the Overall Spatial Quality of Automotive Audio Systems
NASA Astrophysics Data System (ADS)
Koya, Daisuke
The spatial quality of automotive audio systems is often compromised due to their less-than-ideal listening environments. Automotive audio systems need to be developed quickly due to industry demands. A suitable perceptual model could evaluate the spatial quality of automotive audio systems with similar reliability to formal listening tests but take less time. Such a model is developed in this research project by adapting an existing model of spatial quality for automotive audio use. The requirements for the adaptation were investigated in a literature review. A perceptual model called QESTRAL was reviewed, which predicts the overall spatial quality of domestic multichannel audio systems. It was determined that automotive audio systems are likely to be impaired in terms of the spatial attributes that were not considered in developing the QESTRAL model, but metrics are available that might predict these attributes. To establish whether the QESTRAL model in its current form can accurately predict the overall spatial quality of automotive audio systems, MUSHRA listening tests using headphone auralisation with head tracking were conducted to collect results to be compared against predictions by the model. Based on guideline criteria, the model in its current form could not accurately predict the overall spatial quality of automotive audio systems. To improve prediction performance, the QESTRAL model was recalibrated and modified using existing metrics of the model, metrics proposed from the literature review, and newly developed metrics. The most important metrics for predicting the overall spatial quality of automotive audio systems included those that are interaural cross-correlation (IACC) based, relate to localisation of the frontal audio scene, and account for the perceived scene width in front of the listener. Modifying the model for automotive audio systems did not invalidate its use for domestic audio systems. The resulting model predicts the overall spatial quality of 2- and 5-channel automotive audio systems with a cross-validation performance of R^2 = 0.85 and root-mean-square error (RMSE) = 11.03%.
Cao, Renzhi; Bhattacharya, Debswapna; Adhikari, Badri; Li, Jilong; Cheng, Jianlin
2016-09-01
Model evaluation and selection is an important step and a big challenge in template-based protein structure prediction. Individual model quality assessment methods designed for recognizing some specific properties of protein structures often fail to consistently select good models from a model pool because of their limitations. Therefore, combining multiple complementary quality assessment methods is useful for improving model ranking and consequently tertiary structure prediction. Here, we report the performance and analysis of our human tertiary structure predictor (MULTICOM) based on the massive integration of 14 diverse complementary quality assessment methods that was successfully benchmarked in the 11th Critical Assessment of Techniques for Protein Structure Prediction (CASP11). The predictions of MULTICOM for 39 template-based domains were rigorously assessed by six scoring metrics covering global topology of Cα trace, local all-atom fitness, side chain quality, and physical reasonableness of the model. The results show that the massive integration of complementary, diverse single-model and multi-model quality assessment methods can effectively leverage the strength of single-model methods in distinguishing quality variation among similar good models and the advantage of multi-model quality assessment methods of identifying reasonable average-quality models. The overall excellent performance of the MULTICOM predictor demonstrates that integrating a large number of model quality assessment methods in conjunction with model clustering is a useful approach to improve the accuracy, diversity, and consequently robustness of template-based protein structure prediction. Proteins 2016; 84(Suppl 1):247-259. © 2015 Wiley Periodicals, Inc.
Cao, Renzhi; Wang, Zheng; Wang, Yiheng; Cheng, Jianlin
2014-04-28
It is important to predict the quality of a protein structural model before its native structure is known. Methods that can predict the absolute local quality of individual residues in a single protein model are rare, yet particularly needed for using, ranking and refining protein models. We developed a machine learning tool (SMOQ) that can predict the distance deviation of each residue in a single protein model. SMOQ uses support vector machines (SVM) with protein sequence and structural features (i.e. basic feature set), including amino acid sequence, secondary structures, solvent accessibilities, and residue-residue contacts to make predictions. We also trained a SVM model with two new additional features (profiles and SOV scores) on 20 CASP8 targets and found that including them only improves the performance when the real deviations between native and model are higher than 5 Å. The SMOQ tool finally released uses the basic feature set trained on 85 CASP8 targets. Moreover, SMOQ implements a way to convert predicted local quality scores into a global quality score. SMOQ was tested on the 84 CASP9 single-domain targets. The average difference between the residue-specific distance deviation predicted by our method and the actual distance deviation on the test data is 2.637 Å. The global quality prediction accuracy of the tool is comparable to other good tools on the same benchmark. SMOQ is a useful tool for protein single-model quality assessment. Its source code and executable are available at: http://sysbio.rnet.missouri.edu/multicom_toolbox/.
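One common way to convert predicted per-residue deviations into a single global score is sketched below; SMOQ's exact conversion may differ. Each deviation d is mapped through 1/(1 + (d/d0)^2) and the results are averaged, so small deviations contribute values near 1 and large ones near 0. The cutoff d0 = 3.8 Å is an assumed scaling constant.

def local_to_global(deviations_angstrom, d0=3.8):
    # deviations_angstrom: predicted per-residue distance deviations (one per residue)
    scores = [1.0 / (1.0 + (d / d0) ** 2) for d in deviations_angstrom]
    return sum(scores) / len(scores)

# e.g. local_to_global([1.2, 0.8, 4.5, 10.0]) -> a global quality value between 0 and 1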
Holcomb, David A; Messier, Kyle P; Serre, Marc L; Rowny, Jakob G; Stewart, Jill R
2018-06-25
Predictive modeling is promising as an inexpensive tool to assess water quality. We developed geostatistical predictive models of microbial water quality that empirically modeled spatiotemporal autocorrelation in measured fecal coliform (FC) bacteria concentrations to improve prediction. We compared five geostatistical models featuring different autocorrelation structures, fit to 676 observations from 19 locations in North Carolina's Jordan Lake watershed using meteorological and land cover predictor variables. Though stream distance metrics (with and without flow-weighting) failed to improve prediction over the Euclidean distance metric, incorporating temporal autocorrelation substantially improved prediction over the space-only models. We predicted FC throughout the stream network daily for one year, designating locations "impaired", "unimpaired", or "unassessed" if the probability of exceeding the state standard was ≥90%, ≤10%, or >10% but <90%, respectively. We could assign impairment status to more of the stream network on days any FC were measured, suggesting frequent sample-based monitoring remains necessary, though implementing spatiotemporal predictive models may reduce the number of concurrent sampling locations required to adequately assess water quality. Together, these results suggest that prioritizing sampling at different times and conditions using geographically sparse monitoring networks is adequate to build robust and informative geostatistical models of water quality impairment.
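The impairment rule described above can be written as a short helper; the thresholds follow the abstract (probability of exceedance >= 90% impaired, <= 10% unimpaired, otherwise unassessed), while the function name is illustrative.

def impairment_status(p_exceed):
    # p_exceed: model-predicted probability that fecal coliform exceeds the state standard
    if p_exceed >= 0.90:
        return "impaired"
    if p_exceed <= 0.10:
        return "unimpaired"
    return "unassessed"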
NASA Astrophysics Data System (ADS)
Choudhury, Anustup; Farrell, Suzanne; Atkins, Robin; Daly, Scott
2017-09-01
We present an approach to predict overall HDR display quality as a function of key HDR display parameters. We first performed subjective experiments on a high quality HDR display that explored five key HDR display parameters: maximum luminance, minimum luminance, color gamut, bit-depth and local contrast. Subjects rated overall quality for different combinations of these display parameters. We explored two models: a physical model solely based on physically measured display characteristics, and a perceptual model that transforms physical parameters using human vision system models. For the perceptual model, we use a family of metrics based on a recently published color volume model (ICtCp), which consists of the PQ luminance non-linearity (ST2084) and LMS-based opponent color, as well as an estimate of the display point spread function. To predict overall visual quality, we apply linear regression and machine learning techniques such as Multilayer Perceptron, RBF and SVM networks. We use RMSE and Pearson/Spearman correlation coefficients to quantify performance. We found that the perceptual model is better at predicting subjective quality than the physical model and that SVM is better at prediction than linear regression. The significance and contribution of each display parameter was investigated. In addition, we found that combined parameters such as contrast do not improve prediction. Traditional perceptual models were also evaluated and we found that models based on the PQ non-linearity performed better.
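A minimal sketch of the model-fitting step described above, with placeholder data standing in for the subjective ratings: an SVM regressor maps display parameters to quality scores and is evaluated with RMSE and Spearman correlation.

import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from scipy.stats import spearmanr

X = np.random.rand(120, 5)          # placeholder: max/min luminance, gamut, bit depth, local contrast
y = np.random.rand(120) * 100       # placeholder: mean opinion scores
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=0)
model = SVR(kernel="rbf", C=10.0).fit(Xtr, ytr)
pred = model.predict(Xte)
rmse = np.sqrt(np.mean((pred - yte) ** 2))
rho, _ = spearmanr(pred, yte)
print(f"RMSE = {rmse:.2f}, Spearman = {rho:.2f}")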
Real-time assessments of water quality: expanding nowcasting throughout the Great Lakes
2013-01-01
Nowcasts are systems that inform the public of current bacterial water-quality conditions at beaches on the basis of predictive models. During 2010–12, the U.S. Geological Survey (USGS) worked with 23 local and State agencies to improve existing operational beach nowcast systems at 4 beaches and expand the use of predictive models in nowcasts at an additional 45 beaches throughout the Great Lakes. The predictive models were specific to each beach, and the best model for each beach was based on a unique combination of environmental and water-quality explanatory variables. The variables used most often in models to predict Escherichia coli (E. coli) concentrations or the probability of exceeding a State recreational water-quality standard included turbidity, day of the year, wave height, wind direction and speed, antecedent rainfall for various time periods, and change in lake level over 24 hours. During validation of 42 beach models during 2012, the models performed better than the current method to assess recreational water quality (previous day's E. coli concentration). The USGS will continue to work with local agencies to improve nowcast predictions, enable technology transfer of predictive model development procedures, and implement more operational systems during 2013 and beyond.
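A hedged sketch of what such a beach nowcast can look like (the variable set and data are placeholders, not the USGS models): a logistic regression converts easily measured explanatory variables into the probability of exceeding the recreational E. coli standard.

import numpy as np
from sklearn.linear_model import LogisticRegression

# columns (assumed): turbidity, wave height, 48-h rainfall, day of year, 24-h lake-level change
X = np.random.rand(300, 5)
y = (np.random.rand(300) > 0.8).astype(int)    # placeholder labels: 1 = standard exceeded
nowcast = LogisticRegression(max_iter=1000).fit(X, y)
p_exceed = nowcast.predict_proba(X[:1])[0, 1]  # today's predicted probability of exceedance
advisory = "post advisory" if p_exceed >= 0.5 else "open"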
Researches of fruit quality prediction model based on near infrared spectrum
NASA Astrophysics Data System (ADS)
Shen, Yulin; Li, Lian
2018-04-01
With rising standards for food quality and safety, people pay more attention to the internal quality of fruit, so measuring fruit internal quality is increasingly imperative. Nondestructive analysis of soluble solid content (SSC) and total acid content (TAC) is vital and effective for quality measurement in global fresh produce markets, so in this paper we aim to establish a novel fruit internal quality prediction model based on SSC and TAC from near infrared spectra. First, prediction models based on PCA + BP neural network, PCA + GRNN network, PCA + BP AdaBoost strong classifier, PCA + ELM, and PCA + LS_SVM classifier are designed and implemented. Then, in the NSCT domain, a median filter and a Savitzky-Golay filter are used to preprocess the spectral signal, and the Kennard-Stone algorithm is used to automatically select the training and test samples. Third, we obtain the optimal models by comparing 15 prediction models under a multi-classifier competition mechanism; specifically, non-parametric estimation is introduced to measure the effectiveness of each model, with the reliability and variance of the non-parametric estimate used to evaluate the prediction result and the estimated value and confidence interval serving as a reference. The experimental results demonstrate that this approach achieves an optimal evaluation of the internal quality of fruit. Finally, we employ cat swarm optimization to optimize the two best models obtained from the non-parametric estimation; empirical testing indicates that the proposed method provides more accurate and effective results than other forecasting methods.
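A sketch of one preprocessing-plus-PCA modelling chain in the spirit of the pipeline above (placeholder data, not the authors' implementation): spectra are smoothed with a Savitzky-Golay filter, reduced with PCA, and regressed against soluble solid content with a small neural network.

import numpy as np
from scipy.signal import savgol_filter
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import FunctionTransformer

spectra = np.random.rand(100, 512)   # placeholder: 100 fruit samples x 512 wavelengths
ssc = np.random.rand(100) * 20       # placeholder: soluble solid content (degrees Brix)

smooth = FunctionTransformer(lambda X: savgol_filter(X, window_length=11, polyorder=2, axis=1))
model = make_pipeline(smooth, PCA(n_components=10),
                      MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000))
model.fit(spectra, ssc)
print(model.predict(spectra[:3]))    # predicted SSC for the first three samples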
NASA Astrophysics Data System (ADS)
Isingizwe Nturambirwe, J. Frédéric; Perold, Willem J.; Opara, Umezuruike L.
2016-02-01
Near infrared (NIR) spectroscopy has gained extensive use in quality evaluation. It is arguably one of the most advanced spectroscopic tools in non-destructive quality testing of food stuff, from measurement to data analysis and interpretation. NIR spectral data are interpreted through means often involving multivariate statistical analysis, sometimes associated with optimisation techniques for model improvement. The objective of this research was to explore the extent to which genetic algorithms (GA) can be used to enhance model development, for predicting fruit quality. Apple fruits were used, and NIR spectra in the range from 12000 to 4000 cm-1 were acquired on both bruised and healthy tissues, with different degrees of mechanical damage. GAs were used in combination with partial least squares regression methods to develop bruise severity prediction models, and compared to PLS models developed using the full NIR spectrum. A classification model was developed, which clearly separated bruised from unbruised apple tissue. GAs helped improve prediction models by over 10%, in comparison with full spectrum-based models, as evaluated in terms of error of prediction (Root Mean Square Error of Cross-validation). PLS models to predict internal quality, such as sugar content and acidity were developed and compared to the versions optimized by genetic algorithm. Overall, the results highlighted the potential use of GA method to improve speed and accuracy of fruit quality prediction.
Assessment and prediction of air quality using fuzzy logic and autoregressive models
NASA Astrophysics Data System (ADS)
Carbajal-Hernández, José Juan; Sánchez-Fernández, Luis P.; Carrasco-Ochoa, Jesús A.; Martínez-Trinidad, José Fco.
2012-12-01
In recent years, artificial intelligence methods have been used for the treatment of environmental problems. This work presents two models for the assessment and prediction of air quality. First, we develop a new computational model for air quality assessment in order to evaluate toxic compounds that can harm sensitive people in urban areas, affecting their normal activities. In this model we propose to use a Sigma operator to statistically assess air quality parameters using their historical data and to determine their negative impact on air quality based on toxicity limits, frequency averages and deviations of toxicological tests. We also introduce a fuzzy inference system to classify the parameters through a reasoning process and to integrate them into an air quality index that describes the pollution level in five stages: excellent, good, regular, bad and danger. The second model proposed in this work predicts air quality concentrations using an autoregressive model, providing a predicted air quality index based on the fuzzy inference system previously developed. Using data from the Mexico City Atmospheric Monitoring System, we perform a comparison among air quality indices developed by environmental agencies and similar models. Our results show that our models are an appropriate tool for assessing site pollution and for providing guidance to improve contingency actions in urban areas.
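A minimal sketch of the autoregressive prediction step on a synthetic series: an AR(p) model is fitted by least squares and used for a one-step-ahead forecast, which could then be converted to an index by the fuzzy inference system. The series and lag order are assumptions for illustration.

import numpy as np

def fit_ar(series, p=3):
    # design matrix of p lagged values plus an intercept
    X = np.column_stack([series[i:len(series) - p + i] for i in range(p)])
    y = series[p:]
    coefs, *_ = np.linalg.lstsq(np.column_stack([np.ones(len(y)), X]), y, rcond=None)
    return coefs                                 # intercept followed by p lag coefficients

def forecast_next(series, coefs):
    p = len(coefs) - 1
    return coefs[0] + np.dot(coefs[1:], series[-p:])

aqi = np.cumsum(np.random.randn(200)) + 50       # placeholder air-quality history
coefs = fit_ar(aqi, p=3)
print(forecast_next(aqi, coefs))                 # one-step-ahead forecast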
Zhou, Hongyi; Skolnick, Jeffrey
2009-01-01
In this work, we develop a fully automated method for the quality assessment prediction of protein structural models generated by structure prediction approaches such as fold recognition servers, or ab initio methods. The approach is based on fragment comparisons and a consensus Cα contact potential derived from the set of models to be assessed and was tested on CASP7 server models. The average Pearson linear correlation coefficient between predicted quality and model GDT-score per target is 0.83 for the 98 targets which is better than those of other quality assessment methods that participated in CASP7. Our method also outperforms the other methods by about 3% as assessed by the total GDT-score of the selected top models. PMID:18004783
A comparison of different functions for predicted protein model quality assessment.
Li, Juan; Fang, Huisheng
2016-07-01
In protein structure prediction, a considerable number of models are usually produced by either the Template-Based Method (TBM) or ab initio prediction. The purpose of this study is to find the critical parameter in assessing the quality of the predicted models. A non-redundant template library was developed and 138 target sequences were modeled. The target sequences were all distant from the proteins in the template library and were aligned with template library proteins on the basis of the transformation matrix. The quality of each model was first assessed with QMEAN and its six parameters, which are C_β interaction energy (C_beta), all-atom pairwise energy (PE), solvation energy (SE), torsion angle energy (TAE), secondary structure agreement (SSA), and solvent accessibility agreement (SAE). Finally, the alignment score (score) was also used to assess the quality of each model. Hence, a total of eight parameters (i.e., QMEAN, C_beta, PE, SE, TAE, SSA, SAE, score) were independently used to assess the quality of each model. The results indicate that SSA is the best parameter to estimate the quality of the model.
United3D: a protein model quality assessment program that uses two consensus based methods.
Terashi, Genki; Oosawa, Makoto; Nakamura, Yuuki; Kanou, Kazuhiko; Takeda-Shitaka, Mayuko
2012-01-01
In protein structure prediction, such as template-based modeling and free modeling (ab initio modeling), the step that assesses the quality of protein models is very important. We have developed a model quality assessment (QA) program, United3D, that uses an optimized clustering method and a simple Cα atom contact-based potential. United3D automatically estimates quality scores (Qscore) for predicted protein models that are highly correlated with the actual quality (GDT_TS). The performance of United3D was tested in the ninth Critical Assessment of protein Structure Prediction (CASP9) experiment. In CASP9, United3D showed the lowest average loss of GDT_TS (5.3) among the QA methods that participated in CASP9. This result indicates that United3D performed best among the QA methods tested in CASP9 at identifying high quality models from the models predicted by CASP9 servers on 116 targets. United3D also produced high average Pearson correlation coefficients (0.93) and acceptable Kendall rank correlation coefficients (0.68) between the Qscore and GDT_TS. This performance was competitive with the other top ranked QA methods that were tested in CASP9. These results indicate that United3D is a useful tool for selecting high quality models from many candidate model structures provided by various modeling methods. United3D will improve the accuracy of protein structure prediction.
Perceptual quality prediction on authentically distorted images using a bag of features approach
Ghadiyaram, Deepti; Bovik, Alan C.
2017-01-01
Current top-performing blind perceptual image quality prediction models are generally trained on legacy databases of human quality opinion scores on synthetically distorted images. Therefore, they learn image features that effectively predict human visual quality judgments of inauthentic and usually isolated (single) distortions. However, real-world images usually contain complex composite mixtures of multiple distortions. We study the perceptually relevant natural scene statistics of such authentically distorted images in different color spaces and transform domains. We propose a “bag of feature maps” approach that avoids assumptions about the type of distortion(s) contained in an image and instead focuses on capturing consistencies—or departures therefrom—of the statistics of real-world images. Using a large database of authentically distorted images, human opinions of them, and bags of features computed on them, we train a regressor to conduct image quality prediction. We demonstrate the competence of the features toward improving automatic perceptual quality prediction by testing a learned algorithm using them on a benchmark legacy database as well as on a newly introduced distortion-realistic resource called the LIVE In the Wild Image Quality Challenge Database. We extensively evaluate the perceptual quality prediction model and algorithm and show that it is able to achieve good-quality prediction power that is better than other leading models. PMID:28129417
Identifying pollution sources and predicting urban air quality using ensemble learning methods
NASA Astrophysics Data System (ADS)
Singh, Kunwar P.; Gupta, Shikha; Rai, Premanjali
2013-12-01
In this study, principal components analysis (PCA) was performed to identify air pollution sources, and tree-based ensemble learning models were constructed to predict the urban air quality of Lucknow (India) using air quality and meteorological databases covering a period of five years. PCA identified vehicular emissions and fuel combustion as the major air pollution sources. The air quality indices revealed that the air quality was unhealthy during the summer and winter. Ensemble models were constructed to discriminate between the seasonal air qualities, identify the factors responsible for the discrimination, and predict the air quality indices. Accordingly, single decision tree (SDT), decision tree forest (DTF), and decision treeboost (DTB) models were constructed, and their generalization and predictive performance was evaluated in terms of several statistical parameters and compared with a conventional machine learning benchmark, support vector machines (SVM). The decision tree and SVM models discriminated the seasonal air quality with misclassification rates (MR) of 8.32% (SDT), 4.12% (DTF), 5.62% (DTB), and 6.18% (SVM) on the complete data. The AQI and CAQI regression models yielded correlations between measured and predicted values and root mean squared errors of 0.901 and 6.67, and 0.825 and 9.45 (SDT); 0.951 and 4.85, and 0.922 and 6.56 (DTF); 0.959 and 4.38, and 0.929 and 6.30 (DTB); and 0.890 and 7.00, and 0.836 and 9.16 (SVR) on the complete data. The DTF and DTB models outperformed SVM in both classification and regression, which could be attributed to the incorporation of bagging and boosting algorithms in these models. The proposed ensemble models successfully predicted the urban ambient air quality and can be used as effective tools for its management.
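A brief sketch of such a comparison on synthetic data (the original work used single decision trees, decision tree forests and treeboost; a gradient-boosted ensemble stands in here): cross-validated RMSE is computed for a tree ensemble and for support vector regression.

import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

X = np.random.rand(500, 8)            # placeholder pollutant and meteorological variables
y = np.random.rand(500) * 300         # placeholder air quality index
for name, est in [("boosted trees", GradientBoostingRegressor()), ("SVR", SVR())]:
    rmse = -cross_val_score(est, X, y, cv=5, scoring="neg_root_mean_squared_error").mean()
    print(f"{name}: cross-validated RMSE = {rmse:.1f}")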
Development of wavelet-ANN models to predict water quality parameters in Hilo Bay, Pacific Ocean.
Alizadeh, Mohamad Javad; Kavianpour, Mohamad Reza
2015-09-15
The main objective of this study is to apply artificial neural network (ANN) and wavelet-neural network (WNN) models for predicting a variety of ocean water quality parameters. In this regard, several water quality parameters in Hilo Bay, Pacific Ocean, are taken under consideration. Different combinations of water quality parameters are applied as input variables to predict daily values of salinity, temperature and DO as well as hourly values of DO. The results demonstrate that the WNN models are superior to the ANN models. Also, the hourly models developed for DO prediction outperform the daily models of DO. For the daily models, the most accurate model has R equal to 0.96, while for the hourly model it reaches up to 0.98. Overall, the results show the ability of the model to monitor the ocean parameters, in condition with missing data, or when regular measurement and monitoring are impossible. Copyright © 2015 Elsevier Ltd. All rights reserved.
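A hedged sketch of a wavelet-neural-network style predictor (synthetic series; the wavelet family, decomposition level and network size are assumptions): the recent history of a water-quality series is decomposed with a discrete wavelet transform and the coefficients are fed to a small neural network that predicts the next value.

import numpy as np
import pywt
from sklearn.neural_network import MLPRegressor

series = np.sin(np.linspace(0, 60, 2000)) + 0.1 * np.random.randn(2000)  # placeholder hourly DO
window, X, y = 64, [], []
for t in range(window, len(series) - 1):
    coeffs = pywt.wavedec(series[t - window:t], "db4", level=3)  # wavelet decomposition of the window
    X.append(np.concatenate(coeffs))
    y.append(series[t + 1])                                      # next-step target
wnn = MLPRegressor(hidden_layer_sizes=(32,), max_iter=500).fit(np.array(X), np.array(y))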
APOLLO: a quality assessment service for single and multiple protein models.
Wang, Zheng; Eickholt, Jesse; Cheng, Jianlin
2011-06-15
We built a web server named APOLLO, which can evaluate the absolute global and local qualities of a single protein model using machine learning methods or the global and local qualities of a pool of models using a pair-wise comparison approach. Based on our evaluations on 107 CASP9 (Critical Assessment of Techniques for Protein Structure Prediction) targets, the predicted quality scores generated from our machine learning and pair-wise methods have an average per-target correlation of 0.671 and 0.917, respectively, with the true model quality scores. Based on our test on 92 CASP9 targets, our predicted absolute local qualities have an average difference of 2.60 Å with the actual distances to native structure. http://sysbio.rnet.missouri.edu/apollo/. Single and pair-wise global quality assessment software is also available at the site.
Challoner, Avril; Pilla, Francesco; Gill, Laurence
2015-12-01
NO₂ and particulate matter are the air pollutants of most concern in Ireland, with possible links to the higher respiratory and cardiovascular mortality and morbidity rates found in the country compared to the rest of Europe. Currently, air quality limits in Europe only cover outdoor environments yet the quality of indoor air is an essential determinant of a person's well-being, especially since the average person spends more than 90% of their time indoors. The modelling conducted in this research aims to provide a framework for epidemiological studies by the use of publically available data from fixed outdoor monitoring stations to predict indoor air quality more accurately. Predictions are made using two modelling techniques, the Personal-exposure Activity Location Model (PALM), to predict outdoor air quality at a particular building, and Artificial Neural Networks, to model the indoor/outdoor relationship of the building. This joint approach has been used to predict indoor air concentrations for three inner city commercial buildings in Dublin, where parallel indoor and outdoor diurnal monitoring had been carried out on site. This modelling methodology has been shown to provide reasonable predictions of average NO₂ indoor air quality compared to the monitored data, but did not perform well in the prediction of indoor PM2.5 concentrations. Hence, this approach could be used to determine NO₂ exposures more rigorously of those who work and/or live in the city centre, which can then be linked to potential health impacts.
NASA Astrophysics Data System (ADS)
Szeląg, Bartosz; Barbusiński, Krzysztof; Studziński, Jan; Bartkiewicz, Lidia
2017-11-01
In the study, models developed using data mining methods are proposed for predicting wastewater quality indicators: biochemical and chemical oxygen demand, total suspended solids, total nitrogen and total phosphorus at the inflow to a wastewater treatment plant (WWTP). The models are based on values measured in previous time steps and daily wastewater inflows. Also, independent prediction systems that can be used in case of monitoring device malfunction are provided. Models of wastewater quality indicators were developed using the MARS (multivariate adaptive regression spline) method, artificial neural networks (ANN) of the multilayer perceptron type combined with a classification model (SOM), and cascade neural networks (CNN). The lowest values of absolute and relative errors were obtained using ANN+SOM, whereas the MARS method produced the highest error values. It was shown that for the analysed WWTP it is possible to obtain continuous prediction of selected wastewater quality indicators using the two developed independent prediction systems. Such models can ensure reliable WWTP operation when wastewater quality monitoring systems become inoperable or are under maintenance.
ProTSAV: A protein tertiary structure analysis and validation server.
Singh, Ankita; Kaushik, Rahul; Mishra, Avinash; Shanker, Asheesh; Jayaram, B
2016-01-01
Quality assessment of predicted model structures of proteins is as important as the protein tertiary structure prediction. A highly efficient quality assessment of predicted model structures directs further research on function. Here we present a new server ProTSAV, capable of evaluating predicted model structures based on some popular online servers and standalone tools. ProTSAV furnishes the user with a single quality score in case of individual protein structure along with a graphical representation and ranking in case of multiple protein structure assessment. The server is validated on ~64,446 protein structures including experimental structures from RCSB and predicted model structures for CASP targets and from public decoy sets. ProTSAV succeeds in predicting quality of protein structures with a specificity of 100% and a sensitivity of 98% on experimentally solved structures and achieves a specificity of 88% and a sensitivity of 91% on predicted protein structures of CASP11 targets under 2 Å. The server overcomes the limitations of any single server/method and is seen to be robust in helping in quality assessment. ProTSAV is freely available at http://www.scfbio-iitd.res.in/software/proteomics/protsav.jsp. Copyright © 2015 Elsevier B.V. All rights reserved.
A model for predicting air quality along highways.
DOT National Transportation Integrated Search
1973-01-01
The subject of this report is an air quality prediction model for highways, AIRPOL Version 2, July 1973. AIRPOL has been developed by modifying the basic Gaussian approach to gaseous dispersion. The resultant model is smooth and continuous throughout...
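The "basic Gaussian approach to gaseous dispersion" referred to above is the standard Gaussian plume formula; a minimal sketch follows, where the dispersion coefficients sigma_y and sigma_z would come from a stability-class scheme and all inputs and units are assumptions for illustration.

import numpy as np

def gaussian_plume(Q, u, y, z, H, sigma_y, sigma_z):
    """Concentration (g/m^3) at crosswind distance y (m) and height z (m), for a
    source of strength Q (g/s), wind speed u (m/s) and effective stack height H (m)."""
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    # vertical term includes reflection of the plume at the ground (image source)
    vertical = np.exp(-(z - H)**2 / (2 * sigma_z**2)) + np.exp(-(z + H)**2 / (2 * sigma_z**2))
    return Q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# example call with assumed values
print(gaussian_plume(Q=10.0, u=3.0, y=0.0, z=1.5, H=20.0, sigma_y=30.0, sigma_z=15.0))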
Modeling the time--varying subjective quality of HTTP video streams with rate adaptations.
Chen, Chao; Choi, Lark Kwon; de Veciana, Gustavo; Caramanis, Constantine; Heath, Robert W; Bovik, Alan C
2014-05-01
Newly developed hypertext transfer protocol (HTTP)-based video streaming technologies enable flexible rate-adaptation under varying channel conditions. Accurately predicting the users' quality of experience (QoE) for rate-adaptive HTTP video streams is thus critical to achieve efficiency. An important aspect of understanding and modeling QoE is predicting the up-to-the-moment subjective quality of a video as it is played, which is difficult due to hysteresis effects and nonlinearities in human behavioral responses. This paper presents a Hammerstein-Wiener model for predicting the time-varying subjective quality (TVSQ) of rate-adaptive videos. To collect data for model parameterization and validation, a database of longer duration videos with time-varying distortions was built and the TVSQs of the videos were measured in a large-scale subjective study. The proposed method is able to reliably predict the TVSQ of rate adaptive videos. Since the Hammerstein-Wiener model has a very simple structure, the proposed method is suitable for online TVSQ prediction in HTTP-based streaming.
Evaluating Air-Quality Models: Review and Outlook.
NASA Astrophysics Data System (ADS)
Weil, J. C.; Sykes, R. I.; Venkatram, A.
1992-10-01
Over the past decade, much attention has been devoted to the evaluation of air-quality models with emphasis on model performance in predicting the high concentrations that are important in air-quality regulations. This paper stems from our belief that this practice needs to be expanded to 1) evaluate model physics and 2) deal with the large natural or stochastic variability in concentration. The variability is represented by the root-mean-square fluctuating concentration (σc) about the mean concentration (C) over an ensemble, that is, a given set of meteorological, source, and other conditions. Most air-quality models used in applications predict C, whereas observations are individual realizations drawn from an ensemble. When σc is comparable to or larger than C, large residuals exist between predicted and observed concentrations, which confuse model evaluations. This paper addresses ways of evaluating model physics in light of the large σc; the focus is on elevated point-source models. Evaluation of model physics requires the separation of the mean model error, the difference between the predicted and observed C, from the natural variability. A residual analysis is shown to be an effective way of doing this. Several examples demonstrate the usefulness of residuals as well as correlation analyses and laboratory data in judging model physics. In general, σc models and predictions of the probability distribution of the fluctuating concentration are in the developmental stage, with laboratory data playing an important role. Laboratory data from point-source plumes in a convection tank show that the distribution approximates a self-similar form along the plume center plane, a useful result in a residual analysis. At present, there is one model, ARAP, that predicts C, σc, and the concentration distribution for point-source plumes. This model is more computationally demanding than other dispersion models (for C only) and must be demonstrated as a practical tool. However, it predicts an important quantity for applications: the uncertainty in the very high and infrequent concentrations. The uncertainty is large and is needed in evaluating operational performance and in predicting the attainment of air-quality standards.
Presence of indicator plant species as a predictor of wetland vegetation integrity
Stapanian, Martin A.; Adams, Jean V.; Gara, Brian
2013-01-01
We fit regression and classification tree models to vegetation data collected from Ohio (USA) wetlands to determine (1) which species best predict Ohio vegetation index of biotic integrity (OVIBI) score and (2) which species best predict high-quality wetlands (OVIBI score >75). The simplest regression tree model predicted OVIBI score based on the occurrence of three plant species: skunk-cabbage (Symplocarpus foetidus), cinnamon fern (Osmunda cinnamomea), and swamp rose (Rosa palustris). The lowest OVIBI scores were best predicted by the absence of the selected plant species rather than by the presence of other species. The simplest classification tree model predicted high-quality wetlands based on the occurrence of two plant species: skunk-cabbage and marsh-fern (Thelypteris palustris). The overall misclassification rate from this tree was 13 %. Again, low-quality wetlands were better predicted than high-quality wetlands by the absence of selected species rather than the presence of other species using the classification tree model. Our results suggest that a species’ wetland status classification and coefficient of conservatism are of little use in predicting wetland quality. A simple, statistically derived species checklist such as the one created in this study could be used by field biologists to quickly and efficiently identify wetland sites likely to be regulated as high-quality, and requiring more intensive field assessments. Alternatively, it can be used for advanced determinations of low-quality wetlands. Agencies can save considerable money by screening wetlands for the presence/absence of such “indicator” species before issuing permits.
Utility of NCEP Operational and Emerging Meteorological Models for Driving Air Quality Prediction
NASA Astrophysics Data System (ADS)
McQueen, J.; Huang, J.; Huang, H. C.; Shafran, P.; Lee, P.; Pan, L.; Sleinkofer, A. M.; Stajner, I.; Upadhayay, S.; Tallapragada, V.
2017-12-01
Operational air quality predictions for the United States (U.S.) are provided at NOAA by the National Air Quality Forecasting Capability (NAQFC). NAQFC provides nationwide operational predictions of ozone and particulate matter twice per day (at the 06 and 12 UTC cycles) at 12 km resolution and 1 hour time intervals through 48 hours, which are distributed at http://airquality.weather.gov. The NOAA National Centers for Environmental Prediction (NCEP) operational North American Mesoscale (NAM) 12 km weather prediction model is used to drive the Community Multiscale Air Quality (CMAQ) model. In 2017, the NAM was upgraded (to V4), in part to reduce a warm summer 2 m temperature bias; at the same time, CMAQ was updated to V5.0.2. Both versions of the models were run in parallel for several months, so the impact of improvements in the atmospheric chemistry model could be assessed separately from upgrades to the weather prediction model. Improvements in CMAQ predictions were related to the reduction of the NAM 2 m temperature bias: increasing the opacity of clouds and reducing downward shortwave radiation resulted in reduced ozone photolysis. Higher resolution operational NWP models have recently been introduced as part of the NCEP modeling suite. These include the NAM CONUS Nest (3 km horizontal resolution), run four times per day through 60 hours, and the High Resolution Rapid Refresh (HRRR, 3 km), run hourly out to 18 hours. In addition, NCEP, together with other NOAA laboratories, has begun to develop and test the Next Generation Global Prediction System (NGGPS) based on the FV3 global model. This presentation also reviews recent developments in operational numerical weather prediction and evaluates the ability of these models to predict low-level temperatures and clouds and to capture boundary layer processes important for driving air quality prediction in complex terrain. The assessed meteorological model errors could help determine the magnitude of possible pollutant errors from CMAQ if these models are used as the driving meteorology. The NWP models will be evaluated against standard and mesonet fields averaged over various regions during summer 2017. An evaluation of meteorological fields important to air quality modeling (e.g., near-surface winds, temperatures, moisture, boundary layer heights, and cloud cover) will be reported.
Prediction of specialty coffee cup quality based on near infrared spectra of green coffee beans.
Tolessa, Kassaye; Rademaker, Michael; De Baets, Bernard; Boeckx, Pascal
2016-04-01
The growing global demand for specialty coffee increases the need for improved coffee quality assessment methods. Green bean coffee quality analysis is usually carried out by physical (e.g. black beans, immature beans) and cup quality (e.g. acidity, flavour) evaluation. However, these evaluation methods are subjective, costly, time consuming, require sample preparation and may end up in poor grading systems. This calls for the development of a rapid, low-cost, reliable and reproducible analytical method to evaluate coffee quality attributes and eventually chemical compounds of interest (e.g. chlorogenic acid) in coffee beans. The aim of this study was to develop a model able to predict coffee cup quality based on NIR spectra of green coffee beans. NIR spectra of 86 samples of green Arabica beans of varying quality were analysed. The partial least squares (PLS) regression method was used to develop a model correlating spectral data to cupping score data (cup quality). The selected PLS model had a good predictive power for total specialty cup quality and its individual quality attributes (overall cup preference, acidity, body and aftertaste), showing high correlation coefficients, with r-values of 90, 90, 78, 72 and 72, respectively, between measured and predicted cupping scores for 20 out of 86 samples. The corresponding root mean square error of prediction (RMSEP) was 1.04, 0.22, 0.27, 0.24 and 0.27 for total specialty cup quality, overall cup preference, acidity, body and aftertaste, respectively. The results obtained suggest that NIR spectra of green coffee beans are a promising tool for fast and accurate prediction of coffee quality and for classifying green coffee beans into different specialty grades. However, the model should be further tested on coffee samples from different regions in Ethiopia to determine whether one generic or several region-specific models should be developed. Copyright © 2015 Elsevier B.V. All rights reserved.
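A short sketch of the PLS calibration described above, with placeholder data standing in for the spectra and cupping scores: partial least squares regression maps green-bean NIR spectra to total cup score and is evaluated by the root mean square error of prediction.

import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

spectra = np.random.rand(86, 700)          # placeholder: 86 samples x 700 NIR wavelengths
cup_score = 75 + 10 * np.random.rand(86)   # placeholder total specialty cup scores
Xtr, Xte, ytr, yte = train_test_split(spectra, cup_score, test_size=20, random_state=0)
pls = PLSRegression(n_components=8).fit(Xtr, ytr)
rmsep = np.sqrt(np.mean((pls.predict(Xte).ravel() - yte) ** 2))
print(f"RMSEP = {rmsep:.2f}")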
Francy, Donna S.; Brady, Amie M.G.; Carvin, Rebecca B.; Corsi, Steven R.; Fuller, Lori M.; Harrison, John H.; Hayhurst, Brett A.; Lant, Jeremiah; Nevers, Meredith B.; Terrio, Paul J.; Zimmerman, Tammy M.
2013-01-01
Predictive models have been used at beaches to improve the timeliness and accuracy of recreational water-quality assessments over the most common current approach to water-quality monitoring, which relies on culturing fecal-indicator bacteria such as Escherichia coli (E. coli). Beach-specific predictive models use environmental and water-quality variables that are easily and quickly measured as surrogates to estimate concentrations of fecal-indicator bacteria or to provide the probability that a State recreational water-quality standard will be exceeded. When predictive models are used for beach closure or advisory decisions, they are referred to as “nowcasts.” During the recreational seasons of 2010-12, the U.S. Geological Survey (USGS), in cooperation with 23 local and State agencies, worked to improve existing nowcasts at 4 beaches, validate predictive models at another 38 beaches, and collect data for predictive-model development at 7 beaches throughout the Great Lakes. This report summarizes efforts to collect data and develop predictive models by multiple agencies and to compile existing information on the beaches and beach-monitoring programs into one comprehensive report. Local agencies measured E. coli concentrations and variables expected to affect E. coli concentrations such as wave height, turbidity, water temperature, and numbers of birds at the time of sampling. In addition to these field measurements, equipment was installed by the USGS or local agencies at or near several beaches to collect water-quality and meteorological measurements in near real time, including nearshore buoys, weather stations, and tributary staff gages and monitors. The USGS worked with local agencies to retrieve data from existing sources either manually or by use of tools designed specifically to compile and process data for predictive-model development. Predictive models were developed by use of linear regression and (or) partial least squares techniques for 42 beaches that had at least 2 years of data (2010-11 and sometimes earlier) and for 1 beach that had 1 year of data. For most models, software designed for model development by the U.S. Environmental Protection Agency (Virtual Beach) was used. The selected model for each beach was based on a combination of explanatory variables including, most commonly, turbidity, day of the year, change in lake level over 24 hours, wave height, wind direction and speed, and antecedent rainfall for various time periods. Forty-two predictive models were validated against data collected during an independent year (2012) and compared to the current method for assessing recreational water quality, which uses the previous day’s E. coli concentration (persistence model). Goals for good predictive-model performance were responses that were at least 5 percent greater than the persistence model and overall correct responses greater than or equal to 80 percent, sensitivities (percentage of exceedances of the bathing-water standard that were correctly predicted by the model) greater than or equal to 50 percent, and specificities (percentage of nonexceedances correctly predicted by the model) greater than or equal to 85 percent. Out of 42 predictive models, 24 models yielded overall correct responses that were at least 5 percent greater than the use of the persistence model.
Predictive-model responses met the performance goals more often than the persistence-model responses in terms of overall correctness (28 versus 17 models, respectively), sensitivity (17 versus 4 models), and specificity (34 versus 25 models). Gaining knowledge of each beach and the factors that affect E. coli concentrations is important for developing good predictive models. Collection of additional years of data with a wide range of environmental conditions may also help to improve future model performance. The USGS will continue to work with local agencies in 2013 and beyond to develop and validate predictive models at beaches and improve existing nowcasts, restructuring monitoring activities to accommodate future uncertainties in funding and resources.
Predicting perceptual quality of images in realistic scenario using deep filter banks
NASA Astrophysics Data System (ADS)
Zhang, Weixia; Yan, Jia; Hu, Shiyong; Ma, Yang; Deng, Dexiang
2018-03-01
Classical image perceptual quality assessment models usually resort to natural scene statistic methods, which are based on the assumption that certain reliable statistical regularities hold for undistorted images and are corrupted by introduced distortions. However, these models usually fail to accurately predict the degradation severity of images in realistic scenarios, since complex, multiple, and interacting authentic distortions usually appear in such images. We propose a quality prediction model based on a convolutional neural network. Quality-aware features extracted from the filter banks of multiple convolutional layers are aggregated into the image representation. Furthermore, an easy-to-implement and effective feature selection strategy is used to further refine the image representation, and finally a linear support vector regression model is trained to map the image representation to images' subjective perceptual quality scores. The experimental results on benchmark databases demonstrate the effectiveness and generalizability of the proposed model.
Lin, Fen-Fang; Wang, Ke; Yang, Ning; Yan, Shi-Guang; Zheng, Xin-Yu
2012-02-01
In this paper, to precisely obtain the spatial distribution characteristics of regional soil quality, the main factors that affect soil quality, such as soil type, land use pattern, lithology type, topography, road, and industry type, were considered; mutual information theory was adopted to select the main environmental factors, and the decision tree algorithm See5.0 was applied to predict the grade of regional soil quality. The main factors affecting regional soil quality were soil type, land use, lithology type, distance to town, distance to water area, altitude, distance to road, and distance to industrial land. The prediction accuracy of the decision tree model with the variables selected by mutual information was obviously higher than that of the model with all variables, and for the former model, whether expressed as a decision tree or as decision rules, the prediction accuracy was higher than 80%. Based on continuous and categorical data, the method of mutual information theory integrated with a decision tree could not only reduce the number of input parameters for the decision tree algorithm, but also predict and assess regional soil quality effectively.
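As an illustration of the two-stage workflow described above (mutual-information feature selection followed by a decision tree), the sketch below uses scikit-learn in place of the See5.0 algorithm; the factors and soil-quality grades are synthetic placeholders, not the study's data.

```python
# Hypothetical sketch: mutual-information feature selection, then a CART decision
# tree standing in for See5.0. Data and column meanings are invented.
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))                   # e.g. soil type, land use, distance to road, ...
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)   # synthetic soil-quality grade

# Rank candidate environmental factors by mutual information with the quality grade
mi = mutual_info_classif(X, y, random_state=0)
selected = np.argsort(mi)[::-1][:4]             # keep the most informative factors

X_train, X_test, y_train, y_test = train_test_split(X[:, selected], y, random_state=0)
tree = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X_train, y_train)
print("test accuracy:", tree.score(X_test, y_test))
```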
Quality by control: Towards model predictive control of mammalian cell culture bioprocesses.
Sommeregger, Wolfgang; Sissolak, Bernhard; Kandra, Kulwant; von Stosch, Moritz; Mayer, Martin; Striedner, Gerald
2017-07-01
The industrial production of complex biopharmaceuticals using recombinant mammalian cell lines is still mainly built on a quality by testing approach, which is represented by fixed process conditions and extensive testing of the end-product. In 2004 the FDA launched the process analytical technology initiative, aiming to guide the industry towards advanced process monitoring and better understanding of how critical process parameters affect the critical quality attributes. Implementation of process analytical technology into the bio-production process enables moving from the quality by testing to a more flexible quality by design approach. The application of advanced sensor systems in combination with mathematical modelling techniques offers enhanced process understanding, allows on-line prediction of critical quality attributes and subsequently real-time product quality control. In this review opportunities and unsolved issues on the road to a successful quality by design and dynamic control implementation are discussed. A major focus is directed on the preconditions for the application of model predictive control for mammalian cell culture bioprocesses. Design of experiments providing information about the process dynamics upon parameter change, dynamic process models, on-line process state predictions and powerful software environments seem to be a prerequisite for quality by control realization. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
US EPA 2012 Air Quality Fused Surface for the Conterminous U.S. Map Service
This web service contains a polygon layer that depicts fused air quality predictions for 2012 for census tracts in the conterminous United States. Fused air quality predictions (for ozone and PM2.5) are modeled using a Bayesian space-time downscaling fusion model approach described in a series of three published journal papers: 1) Berrocal, V., Gelfand, A. E. and Holland, D. M. (2012). Space-time fusion under error in computer model output: an application to modeling air quality. Biometrics 68, 837-848; 2) Berrocal, V., Gelfand, A. E. and Holland, D. M. (2010). A bivariate space-time downscaler under space and time misalignment. The Annals of Applied Statistics 4, 1942-1975; and 3) Berrocal, V., Gelfand, A. E., and Holland, D. M. (2010). A spatio-temporal downscaler for output from numerical models. J. of Agricultural, Biological, and Environmental Statistics 15, 176-197. This approach is used to provide daily, predictive PM2.5 (daily average) and O3 (daily 8-hr maximum) surfaces for 2012. Summer (O3) and annual (PM2.5) means were calculated and published. The downscaling fusion model uses both air quality monitoring data from the National Air Monitoring Stations/State and Local Air Monitoring Stations (NAMS/SLAMS) and numerical output from the Models-3/Community Multiscale Air Quality (CMAQ) model. Currently, predictions at the US census tract centroid locations within the 12 km CMAQ domain are archived. Predictions at the CMAQ grid cell centroids, or any desired set of locations co
NASA Astrophysics Data System (ADS)
Wang, Haixia; Suo, Tongchuan; Wu, Xiaolin; Zhang, Yue; Wang, Chunhua; Yu, Heshui; Li, Zheng
2018-03-01
The control of batch-to-batch quality variations remains a challenging task for pharmaceutical industries, e.g., traditional Chinese medicine (TCM) manufacturing. One difficult problem is to produce pharmaceutical products with consistent quality from raw material of large quality variations. In this paper, an integrated methodology combining the near infrared spectroscopy (NIRS) and dynamic predictive modeling is developed for the monitoring and control of the batch extraction process of licorice. With the spectra data in hand, the initial state of the process is firstly estimated with a state-space model to construct a process monitoring strategy for the early detection of variations induced by the initial process inputs such as raw materials. Secondly, the quality property of the end product is predicted at the mid-course during the extraction process with a partial least squares (PLS) model. The batch-end-time (BET) is then adjusted accordingly to minimize the quality variations. In conclusion, our study shows that with the help of the dynamic predictive modeling, NIRS can offer the past and future information of the process, which enables more accurate monitoring and control of process performance and product quality.
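A minimal sketch of the mid-course quality prediction step described above, assuming a partial least squares model fitted to NIR spectra; the spectra, the quality attribute and the number of latent variables are invented for illustration and are not the authors' calibration.

```python
# Minimal sketch (not the authors' code): predict an end-product quality attribute
# from mid-course NIR spectra with partial least squares regression.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
spectra = rng.normal(size=(60, 200))                              # 60 batches x 200 wavelengths (synthetic)
quality = spectra[:, 50] * 2.0 + rng.normal(scale=0.1, size=60)   # synthetic quality attribute

pls = PLSRegression(n_components=5)
r2 = cross_val_score(pls, spectra, quality, cv=5, scoring="r2")
print("cross-validated R^2:", r2.mean())

pls.fit(spectra, quality)
predicted = pls.predict(spectra[:5])   # mid-course prediction used to adjust batch-end-time
```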
Water quality management using statistical analysis and time-series prediction model
NASA Astrophysics Data System (ADS)
Parmar, Kulwinder Singh; Bhardwaj, Rashmi
2014-12-01
This paper deals with water quality management using statistical analysis and a time-series prediction model. The monthly variation of water quality standards has been used to compare the statistical mean, median, mode, standard deviation, kurtosis, skewness, and coefficient of variation at the Yamuna River. The model was validated using R-squared, root mean square error, mean absolute percentage error, maximum absolute percentage error, mean absolute error, maximum absolute error, normalized Bayesian information criterion, Ljung-Box analysis, predicted values and confidence limits. Using an autoregressive integrated moving average (ARIMA) model, future values of water quality parameters have been estimated. It is observed that the predictive model is useful at 95% confidence limits and the curve is platykurtic for potential of hydrogen (pH), free ammonia, total Kjeldahl nitrogen, dissolved oxygen and water temperature (WT), and leptokurtic for chemical oxygen demand and biochemical oxygen demand. Also, it is observed that the predicted series is close to the original series, which provides a perfect fit. All parameters except pH and WT cross the prescribed limits of the World Health Organization/United States Environmental Protection Agency, and thus the water is not fit for drinking, agricultural and industrial use.
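A small sketch of the ARIMA workflow described above using statsmodels; the monthly water-quality series, the model orders and the forecast horizon are assumptions for illustration rather than the fitted Yamuna River model.

```python
# Illustrative sketch: fit a seasonal ARIMA model to a monthly water-quality series
# and produce forecasts with 95% confidence limits. Data and orders are synthetic.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(2)
idx = pd.date_range("2005-01-01", periods=96, freq="MS")
series = pd.Series(7 + np.sin(np.arange(96) * 2 * np.pi / 12) + rng.normal(scale=0.3, size=96),
                   index=idx, name="dissolved_oxygen")

model = ARIMA(series, order=(1, 1, 1), seasonal_order=(1, 0, 1, 12)).fit()
forecast = model.get_forecast(steps=12)
print(forecast.predicted_mean)
print(forecast.conf_int(alpha=0.05))   # 95% confidence limits, as reported in the study
```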
Blind prediction of natural video quality.
Saad, Michele A; Bovik, Alan C; Charrier, Christophe
2014-03-01
We propose a blind (no reference or NR) video quality evaluation model that is nondistortion specific. The approach relies on a spatio-temporal model of video scenes in the discrete cosine transform domain, and on a model that characterizes the type of motion occurring in the scenes, to predict video quality. We use the models to define video statistics and perceptual features that are the basis of a video quality assessment (VQA) algorithm that does not require the presence of a pristine video to compare against in order to predict a perceptual quality score. The contributions of this paper are threefold. 1) We propose a spatio-temporal natural scene statistics (NSS) model for videos. 2) We propose a motion model that quantifies motion coherency in video scenes. 3) We show that the proposed NSS and motion coherency models are appropriate for quality assessment of videos, and we utilize them to design a blind VQA algorithm that correlates highly with human judgments of quality. The proposed algorithm, called video BLIINDS, is tested on the LIVE VQA database and on the EPFL-PoliMi video database and shown to perform close to the level of top performing reduced and full reference VQA algorithms.
Predicting Software Suitability Using a Bayesian Belief Network
NASA Technical Reports Server (NTRS)
Beaver, Justin M.; Schiavone, Guy A.; Berrios, Joseph S.
2005-01-01
The ability to reliably predict the end quality of software under development presents a significant advantage for a development team. It provides an opportunity to address high risk components earlier in the development life cycle, when their impact is minimized. This research proposes a model that captures the evolution of the quality of a software product, and provides reliable forecasts of the end quality of the software being developed in terms of product suitability. Development team skill, software process maturity, and software problem complexity are hypothesized as driving factors of software product quality. The cause-effect relationships between these factors and the elements of software suitability are modeled using Bayesian Belief Networks, a machine learning method. This research presents a Bayesian Network for software quality, and the techniques used to quantify the factors that influence and represent software quality. The developed model is found to be effective in predicting the end product quality of small-scale software development efforts.
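The cause-effect structure described above can be illustrated with a toy conditional probability table over the three hypothesized driving factors; the probabilities below are invented for illustration and are not taken from the paper or its Bayesian Belief Network.

```python
# Toy sketch: three binary driving factors (team skill, process maturity, problem
# complexity) feed a conditional probability table for end-product suitability.
# All probabilities are invented for illustration.
import itertools

p_skill, p_maturity, p_complex = 0.7, 0.6, 0.5   # prior P(factor is favourable) -- assumed

# P(suitable | skill, maturity, complexity), keyed by (skill, maturity, complexity)
cpt = {
    (1, 1, 1): 0.95, (1, 1, 0): 0.85, (1, 0, 1): 0.75, (1, 0, 0): 0.60,
    (0, 1, 1): 0.55, (0, 1, 0): 0.40, (0, 0, 1): 0.30, (0, 0, 0): 0.15,
}

# Marginalise over the factors to get the prior probability of a suitable product
p_suitable = 0.0
for s, m, c in itertools.product([0, 1], repeat=3):
    weight = (p_skill if s else 1 - p_skill) * \
             (p_maturity if m else 1 - p_maturity) * \
             (p_complex if c else 1 - p_complex)
    p_suitable += weight * cpt[(s, m, c)]
print("P(product suitable) =", round(p_suitable, 3))
```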
Methods are needed to improve the timeliness and accuracy of recreational water‐quality assessments. Traditional culture methods require 18–24 h to obtain results and may not reflect current conditions. Predictive models, based on environmental and water quality variables, have been...
Hoos, Anne B.; Patel, Anant R.
1996-01-01
Model-adjustment procedures were applied to the combined data bases of storm-runoff quality for Chattanooga, Knoxville, and Nashville, Tennessee, to improve predictive accuracy for storm-runoff quality for urban watersheds in these three cities and throughout Middle and East Tennessee. Data for 45 storms at 15 different sites (five sites in each city) constitute the data base. Comparison of observed values of storm-runoff load and event-mean concentration to the predicted values from the regional regression models for 10 constituents shows prediction errors as large as 806,000 percent. Model-adjustment procedures, which combine the regional model predictions with local data, are applied to improve predictive accuracy. Standard error of estimate after model adjustment ranges from 67 to 322 percent. Calibration results may be biased due to sampling error in the Tennessee data base. The relatively large values of standard error of estimate for some of the constituent models, although representing significant reduction (at least 50 percent) in prediction error compared to estimation with unadjusted regional models, may be unacceptable for some applications. The user may wish to collect additional local data for these constituents and repeat the analysis, or calibrate an independent local regression model.
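A hedged sketch of the general idea of a model-adjustment procedure, in which regional-model predictions are corrected against locally observed values; the single-factor log-space regression below is a generic stand-in, not the specific USGS adjustment procedure, and the loads are synthetic.

```python
# Generic sketch: correct a regional regression prediction with a local regression
# of observed loads on the regional predictions, fitted in log space (a common
# form for load data). Values are synthetic; this is not the USGS procedure.
import numpy as np

regional_pred = np.array([120., 450., 80., 900., 300.])   # regional-model loads (synthetic)
local_obs = np.array([95., 380., 60., 700., 260.])        # locally observed loads (synthetic)

# Fit log(observed) = a + b * log(predicted)
a, b = np.polyfit(np.log(regional_pred), np.log(local_obs), 1)[::-1]

def adjust(pred):
    """Apply the local adjustment to a regional-model prediction."""
    return np.exp(a + b * np.log(pred))

new_regional_estimate = 500.0
print("adjusted local estimate:", adjust(new_regional_estimate))
```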
Soyiri, Ireneous N.; Reidpath, Daniel D.
2013-01-01
Forecasting higher than expected numbers of health events provides potentially valuable insights in its own right, and may contribute to health services management and syndromic surveillance. This study investigates the use of quantile regression to predict higher than expected respiratory deaths. Data taken from 70,830 deaths occurring in New York were used. Temporal, weather and air quality measures were fitted using quantile regression at the 90th percentile with half the data (in-sample). Four QR models were fitted: an unconditional model predicting the 90th percentile of deaths (Model 1), a seasonal/temporal model (Model 2), a seasonal, temporal model plus lags of weather and air quality (Model 3), and a seasonal, temporal model with 7-day moving averages of weather and air quality (Model 4). Models were cross-validated with the out-of-sample data. Performance was measured as the proportionate reduction in the weighted sum of absolute deviations by a conditional over an unconditional model, i.e., the coefficient of determination (R1). The coefficient of determination showed an improvement over the unconditional model of between 0.16 and 0.19. The greatest improvement in predictive and forecasting accuracy of daily mortality was associated with the inclusion of seasonal and temporal predictors (Model 2). No gains were made in the predictive models with the addition of weather and air quality predictors (Models 3 and 4). However, forecasting models that included weather and air quality predictors performed slightly better than the seasonal and temporal model alone (i.e., Model 3 > Model 4 > Model 2). This study provided a new approach to predict higher than expected numbers of respiratory-related deaths. The approach, while promising, has limitations and should be treated at this stage as a proof of concept. PMID:24147122
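A minimal sketch of fitting a 90th-percentile quantile regression of daily deaths on seasonal/temporal terms, in the spirit of Model 2 above; the covariates and death counts are synthetic placeholders, not the New York data.

```python
# Sketch: quantile regression at the 90th percentile of daily respiratory deaths
# on seasonal and temporal terms (Model 2 analogue). Data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
days = np.arange(730)
df = pd.DataFrame({
    "deaths": 20 + 5 * np.sin(days * 2 * np.pi / 365) + rng.poisson(3, size=730),
    "doy_sin": np.sin(days * 2 * np.pi / 365),
    "doy_cos": np.cos(days * 2 * np.pi / 365),
    "dow": days % 7,
})

model = smf.quantreg("deaths ~ doy_sin + doy_cos + C(dow)", df).fit(q=0.9)
print(model.params)
df["p90_pred"] = model.predict(df)   # predicted 90th percentile of daily deaths
```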
Blind image quality assessment without training on human opinion scores
NASA Astrophysics Data System (ADS)
Mittal, Anish; Soundararajan, Rajiv; Muralidhar, Gautam S.; Bovik, Alan C.; Ghosh, Joydeep
2013-03-01
We propose a family of image quality assessment (IQA) models based on natural scene statistics (NSS) that can predict the subjective quality of a distorted image without reference to a corresponding distortionless image, and without any training on human opinion scores of distorted images. These 'completely blind' models compete well with standard non-blind image quality indices in terms of subjective predictive performance when tested on the large publicly available 'LIVE' Image Quality database.
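One common ingredient of such NSS-based 'completely blind' models is the statistics of mean-subtracted contrast-normalised (MSCN) coefficients, which change under distortion; the sketch below illustrates that ingredient only and is not the authors' full model.

```python
# Sketch: compute MSCN coefficients, whose statistics serve as quality-aware
# features in NSS-based blind IQA. Image and parameters are placeholders.
import numpy as np
from scipy.ndimage import gaussian_filter

def mscn(image, sigma=7.0 / 6.0, c=1e-3):
    image = image.astype(np.float64)
    mu = gaussian_filter(image, sigma)                                      # local Gaussian-weighted mean
    sigma_map = np.sqrt(np.abs(gaussian_filter(image * image, sigma) - mu * mu))
    return (image - mu) / (sigma_map + c)

rng = np.random.default_rng(4)
img = rng.uniform(0, 1, size=(128, 128))
coeffs = mscn(img)
print("MSCN variance (a simple quality-aware feature):", coeffs.var())
```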
DeepQA: improving the estimation of single protein model quality with deep belief networks.
Cao, Renzhi; Bhattacharya, Debswapna; Hou, Jie; Cheng, Jianlin
2016-12-05
Protein quality assessment (QA), useful for ranking and selecting protein models, has long been viewed as one of the major challenges for protein tertiary structure prediction. Especially, estimating the quality of a single protein model, which is important for selecting a few good models out of a large model pool consisting of mostly low-quality models, is still a largely unsolved problem. We introduce a novel single-model quality assessment method, DeepQA, based on a deep belief network that utilizes a number of selected features describing the quality of a model from different perspectives, such as energy, physico-chemical characteristics, and structural information. The deep belief network is trained on several large datasets consisting of models from the Critical Assessment of Protein Structure Prediction (CASP) experiments, several publicly available datasets, and models generated by our in-house ab initio method. Our experiments demonstrate that the deep belief network has better performance compared to Support Vector Machines and Neural Networks on the protein model quality assessment problem, and our method DeepQA achieves state-of-the-art performance on the CASP11 dataset. It also outperformed two well-established methods in selecting good outlier models from a large set of models of mostly low quality generated by ab initio modeling methods. DeepQA is a useful deep learning tool for protein single-model quality assessment and protein structure prediction. The source code, executable, documentation and training/test datasets of DeepQA for Linux are freely available to non-commercial users at http://cactus.rnet.missouri.edu/DeepQA/.
Atmospheric Model Evaluation Tool for meteorological and air quality simulations
The Atmospheric Model Evaluation Tool compares model predictions to observed data from various meteorological and air quality observation networks to help evaluate meteorological and air quality simulations.
Regional air quality models are frequently used for regulatory applications to predict changes in air quality due to changes in emissions or changes in meteorology. Dynamic model evaluation is thus an important step in establishing credibility in the model predicted pollutant re...
Tilburg, Charles E.; Jordan, Linda M.; Carlson, Amy E.; Zeeman, Stephan I.; Yund, Philip O.
2015-01-01
Faecal pollution in stormwater, wastewater and direct run-off can carry zoonotic pathogens to streams, rivers and the ocean, reduce water quality, and affect both recreational and commercial fishing areas of the coastal ocean. Typically, the closure of beaches and commercial fishing areas is governed by the testing for the presence of faecal bacteria, which requires an 18–24 h period for sample incubation. As water quality can change during this testing period, the need for accurate and timely predictions of coastal water quality has become acute. In this study, we: (i) examine the relationship between water quality, precipitation and river discharge at several locations within the Gulf of Maine, and (ii) use multiple linear regression models based on readily obtainable hydrometeorological measurements to predict water quality events at five coastal locations. Analysis of a 12 year dataset revealed that high river discharge and/or precipitation events can lead to reduced water quality; however, the use of only these two parameters to predict water quality can result in a number of errors. Analysis of a higher frequency, 2 year study using multiple linear regression models revealed that precipitation, salinity, river discharge, winds, seasonality and coastal circulation correlate with variations in water quality. Although there has been extensive development of regression models for freshwater, this is one of the first attempts to create a mechanistic model to predict water quality in coastal marine waters. Model performance is similar to that of efforts in other regions, which have incorporated models into water resource managers' decisions, indicating that the use of a mechanistic model in coastal Maine is feasible. PMID:26587258
Large-scale structure prediction by improved contact predictions and model quality assessment.
Michel, Mirco; Menéndez Hurtado, David; Uziela, Karolis; Elofsson, Arne
2017-07-15
Accurate contact predictions can be used for predicting the structure of proteins. Until recently these methods were limited to very big protein families, decreasing their utility. However, recent progress by combining direct coupling analysis with machine learning methods has made it possible to predict accurate contact maps for smaller families. To what extent these predictions can be used to produce accurate models of the families is not known. We present the PconsFold2 pipeline that uses contact predictions from PconsC3, the CONFOLD folding algorithm and model quality estimations to predict the structure of a protein. We show that the model quality estimation significantly increases the number of models that reliably can be identified. Finally, we apply PconsFold2 to 6379 Pfam families of unknown structure and find that PconsFold2 can, with an estimated 90% specificity, predict the structure of up to 558 Pfam families of unknown structure. Out of these, 415 have not been reported before. Datasets as well as models of all the 558 Pfam families are available at http://c3.pcons.net/ . All programs used here are freely available. arne@bioinfo.se. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
Large-scale model quality assessment for improving protein tertiary structure prediction.
Cao, Renzhi; Bhattacharya, Debswapna; Adhikari, Badri; Li, Jilong; Cheng, Jianlin
2015-06-15
Sampling structural models and ranking them are the two major challenges of protein structure prediction. Traditional protein structure prediction methods generally use one or a few quality assessment (QA) methods to select the best-predicted models, which cannot consistently select relatively better models and rank a large number of models well. Here, we develop a novel large-scale model QA method in conjunction with model clustering to rank and select protein structural models. It unprecedentedly applied 14 model QA methods to generate consensus model rankings, followed by model refinement based on model combination (i.e. averaging). Our experiment demonstrates that the large-scale model QA approach is more consistent and robust in selecting models of better quality than any individual QA method. Our method was blindly tested during the 11th Critical Assessment of Techniques for Protein Structure Prediction (CASP11) as MULTICOM group. It was officially ranked third out of all 143 human and server predictors according to the total scores of the first models predicted for 78 CASP11 protein domains and second according to the total scores of the best of the five models predicted for these domains. MULTICOM's outstanding performance in the extremely competitive 2014 CASP11 experiment proves that our large-scale QA approach together with model clustering is a promising solution to one of the two major problems in protein structure modeling. The web server is available at: http://sysbio.rnet.missouri.edu/multicom_cluster/human/. © The Author 2015. Published by Oxford University Press.
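The consensus step described above can be illustrated by normalising the scores of several QA methods and averaging them into a single ranking; the scores below are random placeholders, not CASP11 data, and the z-score normalisation is an assumption about how scores could be made comparable.

```python
# Toy sketch: combine the scores of several QA methods into a consensus ranking
# by z-normalising each method's scores and averaging. Scores are random.
import numpy as np

rng = np.random.default_rng(8)
scores = rng.uniform(size=(14, 200))   # 14 QA methods x 200 candidate models (synthetic)

# z-normalise each method's scores so they are comparable, then average
z = (scores - scores.mean(axis=1, keepdims=True)) / scores.std(axis=1, keepdims=True)
consensus = z.mean(axis=0)
ranking = np.argsort(consensus)[::-1]  # best-ranked models first
print("top five models by consensus score:", ranking[:5])
```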
Artificial neural network modeling of the water quality index using land use areas as predictors.
Gazzaz, Nabeel M; Yusoff, Mohd Kamil; Ramli, Mohammad Firuz; Juahir, Hafizan; Aris, Ahmad Zaharin
2015-02-01
This paper describes the design of an artificial neural network (ANN) model to predict the water quality index (WQI) using land use areas as predictors. Ten-year records of land use statistics and water quality data for Kinta River (Malaysia) were employed in the modeling process. The most accurate WQI predictions were obtained with the network architecture 7-23-1; the back propagation training algorithm; and a learning rate of 0.02. The WQI forecasts of this model had significant (p < 0.01), positive, very high correlation (ρs = 0.882) with the measured WQI values. Sensitivity analysis revealed that the relative importance of the land use classes to WQI predictions followed the order: mining > rubber > forest > logging > urban areas > agriculture > oil palm. These findings show that the ANNs are highly reliable means of relating water quality to land use, thus integrating land use development with river water quality management.
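A rough analogue of the reported 7-23-1 network, built here with scikit-learn's MLPRegressor; the seven land-use inputs and WQI values are synthetic, and the solver and other settings are scikit-learn defaults rather than the paper's exact back-propagation configuration (only the 23-node hidden layer and the 0.02 learning rate mirror the reported values).

```python
# Sketch: a 7-23-1 feed-forward network predicting WQI from seven land-use areas.
# Data are synthetic; architecture and learning rate follow the abstract.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(5)
land_use = rng.uniform(0, 100, size=(300, 7))   # mining, rubber, forest, logging, urban, agriculture, oil palm
wqi = 80 - 0.3 * land_use[:, 0] - 0.1 * land_use[:, 4] + rng.normal(scale=2, size=300)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(23,), learning_rate_init=0.02,
                 max_iter=5000, random_state=0),
)
model.fit(land_use, wqi)
print("predicted WQI:", model.predict(land_use[:3]))
```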
Protein single-model quality assessment by feature-based probability density functions.
Cao, Renzhi; Cheng, Jianlin
2016-04-04
Protein quality assessment (QA) has played an important role in protein structure prediction. We developed a novel single-model quality assessment method-Qprob. Qprob calculates the absolute error for each protein feature value against the true quality scores (i.e. GDT-TS scores) of protein structural models, and uses them to estimate its probability density distribution for quality assessment. Qprob has been blindly tested on the 11th Critical Assessment of Techniques for Protein Structure Prediction (CASP11) as MULTICOM-NOVEL server. The official CASP result shows that Qprob ranks as one of the top single-model QA methods. In addition, Qprob makes contributions to our protein tertiary structure predictor MULTICOM, which is officially ranked 3rd out of 143 predictors. The good performance shows that Qprob is good at assessing the quality of models of hard targets. These results demonstrate that this new probability density distribution based method is effective for protein single-model quality assessment and is useful for protein structure prediction. The webserver of Qprob is available at: http://calla.rnet.missouri.edu/qprob/. The software is now freely available in the web server of Qprob.
Francy, Donna S.; Graham, Jennifer L.; Stelzer, Erin A.; Ecker, Christopher D.; Brady, Amie M. G.; Pam Struffolino,; Loftin, Keith A.
2015-11-06
The results of this study showed that water-quality and environmental variables are promising for use in site-specific daily or long-term predictive models. In order to develop more accurate models to predict toxin concentrations at freshwater lake sites, data need to be collected more frequently and for consecutive days in future studies.
Model-based monitoring of stormwater runoff quality.
Birch, Heidi; Vezzaro, Luca; Mikkelsen, Peter Steen
2013-01-01
Monitoring of micropollutants (MP) in stormwater is essential to evaluate the impacts of stormwater on the receiving aquatic environment. The aim of this study was to investigate how different strategies for monitoring of stormwater quality (combining a model with field sampling) affect the information obtained about MP discharged from the monitored system. A dynamic stormwater quality model was calibrated using MP data collected by automatic volume-proportional sampling and passive sampling in a storm drainage system on the outskirts of Copenhagen (Denmark) and a 10-year rain series was used to find annual average (AA) and maximum event mean concentrations. Use of this model reduced the uncertainty of predicted AA concentrations compared to a simple stochastic method based solely on data. The predicted AA concentration, obtained by using passive sampler measurements (1 month installation) for calibration of the model, resulted in the same predicted level but with narrower model prediction bounds than by using volume-proportional samples for calibration. This shows that passive sampling allows for a better exploitation of the resources allocated for stormwater quality monitoring.
Soulat, J; Picard, B; Léger, S; Monteils, V
2018-06-01
In this study, four prediction models were developed by logistic regression using individual data from 96 heifers. Carcass and sensory rectus abdominis quality clusters were identified and then predicted using the rearing factors data. The models obtained from rearing factors applied during the fattening period were compared to those characterising the heifers' whole life. The highest prediction power for the carcass and meat quality clusters was obtained from the models considering the whole life, with success rates of 62.8% and 54.9%, respectively. Rearing factors applied during both the pre-weaning and fattening periods influenced carcass and meat quality. According to the models, carcass traits were improved when the heifer's mother was older at first calving, calves ingested concentrates during the pasture period preceding weaning, and heifers were slaughtered older. Meat traits were improved by the genetics of the heifers' parents (i.e., calving ease and early muscularity) and when heifers were slaughtered older. Management of carcass and meat quality traits is possible at different periods of the heifers' life. Copyright © 2018 Elsevier Ltd. All rights reserved.
Meteorological Processes Affecting Air Quality – Research and Model Development Needs
Meteorology modeling is an important component of air quality modeling systems that defines the physical and dynamical environment for atmospheric chemistry. The meteorology models used for air quality applications are based on numerical weather prediction models that were devel...
Valuing river characteristics using combined site choice and participation travel cost models.
Johnstone, C; Markandya, A
2006-08-01
This paper presents new welfare measures for marginal changes in river quality in selected English rivers. The river quality indicators used include chemical, biological and habitat-level attributes. Economic values for recreational use of three types of river (upland, lowland and chalk) are presented. A survey of anglers was carried out and, using these data, two travel cost models were estimated, one to predict the number of trips and the other to predict angling site choice. These models were then linked to estimate the welfare associated with marginal changes in river quality, using the participation levels estimated in the trip prediction model. The model results showed that higher flow rates, biological quality and nutrient pollution levels affect site choice and influence the likelihood of a fishing trip. Consumer surplus values per trip for a 10% change in river attributes range from £0.04 to £3.93 (2001 prices) depending on the attribute.
Season-ahead water quality forecasts for the Schuylkill River, Pennsylvania
NASA Astrophysics Data System (ADS)
Block, P. J.; Leung, K.
2013-12-01
Anticipating and preparing for elevated water quality parameter levels in critical water sources, using weather forecasts, is not uncommon. In this study, we explore the feasibility of extending this prediction scale to a season-ahead for the Schuylkill River in Philadelphia, utilizing both statistical and dynamical prediction models, to characterize the season. This advance information has relevance for recreational activities, ecosystem health, and water treatment, as the Schuylkill provides 40% of Philadelphia's water supply. The statistical model associates large-scale climate drivers with streamflow and water quality parameter levels; numerous variables from NOAA's CFSv2 model are evaluated for the dynamical approach. A multi-model combination is also assessed. Results indicate moderately skillful prediction of average summertime total coliform and wintertime turbidity, using season-ahead oceanic and atmospheric variables, predominantly from the North Atlantic Ocean. Models predicting the number of elevated turbidity events across the wintertime season are also explored.
LARGE-SCALE PREDICTIONS OF MOBILE SOURCE CONTRIBUTIONS TO CONCENTRATIONS OF TOXIC AIR POLLUTANTS
This presentation shows concentrations and deposition of toxic air pollutants predicted by a 3-D air quality model, the Community Multi Scale Air Quality (CMAQ) modeling system. Contributions from both on-road and non-road mobile sources are analyzed.
Hilkens, N A; Algra, A; Greving, J P
2016-01-01
ESSENTIALS: Prediction models may help to identify patients at high risk of bleeding on antiplatelet therapy. We identified existing prediction models for bleeding and validated them in patients with cerebral ischemia. Five prediction models were identified, all of which had some methodological shortcomings. Performance in patients with cerebral ischemia was poor. Background Antiplatelet therapy is widely used in secondary prevention after a transient ischemic attack (TIA) or ischemic stroke. Bleeding is the main adverse effect of antiplatelet therapy and is potentially life threatening. Identification of patients at increased risk of bleeding may help target antiplatelet therapy. This study sought to identify existing prediction models for intracranial hemorrhage or major bleeding in patients on antiplatelet therapy and evaluate their performance in patients with cerebral ischemia. We systematically searched PubMed and Embase for existing prediction models up to December 2014. The methodological quality of the included studies was assessed with the CHARMS checklist. Prediction models were externally validated in the European Stroke Prevention Study 2, comprising 6602 patients with a TIA or ischemic stroke. We assessed discrimination and calibration of included prediction models. Five prediction models were identified, of which two were developed in patients with previous cerebral ischemia. Three studies assessed major bleeding, one studied intracerebral hemorrhage and one gastrointestinal bleeding. None of the studies met all criteria of good quality. External validation showed poor discriminative performance, with c-statistics ranging from 0.53 to 0.64 and poor calibration. A limited number of prediction models is available that predict intracranial hemorrhage or major bleeding in patients on antiplatelet therapy. The methodological quality of the models varied, but was generally low. Predictive performance in patients with cerebral ischemia was poor. In order to reliably predict the risk of bleeding in patients with cerebral ischemia, development of a prediction model according to current methodological standards is needed. © 2015 International Society on Thrombosis and Haemostasis.
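The validation metrics mentioned above can be computed as in the sketch below: the c-statistic (area under the ROC curve) for discrimination, and a simple observed-to-expected ratio as a crude calibration check. The predicted risks and outcomes are simulated, not the European Stroke Prevention Study 2 data.

```python
# Sketch: external-validation metrics for a risk prediction model on synthetic data.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(6)
predicted_risk = rng.uniform(0, 0.2, size=2000)       # model-predicted bleeding risk
observed = rng.binomial(1, predicted_risk * 0.7)      # outcomes, deliberately miscalibrated

c_statistic = roc_auc_score(observed, predicted_risk)  # discrimination
oe_ratio = observed.mean() / predicted_risk.mean()     # crude calibration check
print(f"c-statistic = {c_statistic:.2f}, observed/expected = {oe_ratio:.2f}")
```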
Downey, Brandon; Schmitt, John; Beller, Justin; Russell, Brian; Quach, Anthony; Hermann, Elizabeth; Lyon, David; Breit, Jeffrey
2017-11-01
As the biopharmaceutical industry evolves to include more diverse protein formats and processes, more robust control of Critical Quality Attributes (CQAs) is needed to maintain processing flexibility without compromising quality. Active control of CQAs has been demonstrated using model predictive control techniques, which allow development of processes which are robust against disturbances associated with raw material variability and other potentially flexible operating conditions. Wide adoption of model predictive control in biopharmaceutical cell culture processes has been hampered, however, in part due to the large amount of data and expertise required to make a predictive model of controlled CQAs, a requirement for model predictive control. Here we developed a highly automated, perfusion apparatus to systematically and efficiently generate predictive models using application of system identification approaches. We successfully created a predictive model of %galactosylation using data obtained by manipulating galactose concentration in the perfusion apparatus in serialized step change experiments. We then demonstrated the use of the model in a model predictive controller in a simulated control scenario to successfully achieve a %galactosylation set point in a simulated fed-batch culture. The automated model identification approach demonstrated here can potentially be generalized to many CQAs, and could be a more efficient, faster, and highly automated alternative to batch experiments for developing predictive models in cell culture processes, and allow the wider adoption of model predictive control in biopharmaceutical processes. © 2017 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers Biotechnol. Prog., 33:1647-1661, 2017. © 2017 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers.
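A hedged sketch of the system-identification idea above: a first-order ARX model is fitted by least squares to step-change data relating a manipulated input (galactose feed) to a quality output (%galactosylation). The model order, parameter values and data are assumptions for illustration, not the authors' identified model.

```python
# Sketch: identify a first-order ARX model y[t] = a*y[t-1] + b*u[t-1] from
# serialized step-change data. All values are synthetic.
import numpy as np

rng = np.random.default_rng(9)
n = 200
u = np.zeros(n); u[50:120] = 1.0; u[120:] = 0.5        # serialized step changes in galactose feed
y = np.zeros(n)
for t in range(1, n):                                   # synthetic "true" process response
    y[t] = 0.9 * y[t - 1] + 0.4 * u[t - 1] + rng.normal(scale=0.02)

# Least-squares ARX fit
X = np.column_stack([y[:-1], u[:-1]])
a_hat, b_hat = np.linalg.lstsq(X, y[1:], rcond=None)[0]
print(f"identified model: y[t] = {a_hat:.3f}*y[t-1] + {b_hat:.3f}*u[t-1]")

# The identified model could then serve as the internal model of a model predictive
# controller that chooses future inputs to drive %galactosylation to a set point.
```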
Liu, Yu; Xi, Du-Gang; Li, Zhao-Liang
2015-01-01
Predicting the levels of chlorophyll-a (Chl-a) is a vital component of water quality management, which ensures that urban drinking water is safe from harmful algal blooms. This study developed a model to predict Chl-a levels in the Yuqiao Reservoir (Tianjin, China) biweekly using water quality and meteorological data from 1999-2012. First, six artificial neural networks (ANNs) and two non-ANN methods (principal component analysis and the support vector regression model) were compared to determine the appropriate training principle. Subsequently, three predictors with different input variables were developed to examine the feasibility of incorporating meteorological factors into Chl-a prediction, which usually only uses water quality data. Finally, a sensitivity analysis was performed to examine how the Chl-a predictor reacts to changes in input variables. The results were as follows: first, ANN is a powerful predictive alternative to the traditional modeling techniques used for Chl-a prediction. The back-propagation (BP) model yields slightly better results than all other ANNs, with the normalized mean square error (NMSE), the correlation coefficient (Corr), and the Nash-Sutcliffe coefficient of efficiency (NSE) at 0.003 mg/l, 0.880 and 0.754, respectively, in the testing period. Second, the incorporation of meteorological data greatly improved Chl-a prediction compared to models solely using water quality factors or meteorological data; the correlation coefficient increased from 0.574-0.686 to 0.880 when meteorological data were included. Finally, the Chl-a predictor is more sensitive to air pressure and pH compared to other water quality and meteorological variables.
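For reference, the Nash-Sutcliffe coefficient of efficiency (NSE) reported above can be computed as follows; the observed and predicted Chl-a values here are invented for illustration.

```python
# Sketch: Nash-Sutcliffe efficiency, a standard skill score for hydro-ecological
# predictions. Values are synthetic.
import numpy as np

def nse(observed, predicted):
    observed, predicted = np.asarray(observed), np.asarray(predicted)
    return 1 - np.sum((observed - predicted) ** 2) / np.sum((observed - observed.mean()) ** 2)

obs = np.array([2.1, 3.4, 5.0, 4.2, 6.8])    # observed Chl-a (synthetic)
pred = np.array([2.0, 3.6, 4.7, 4.5, 6.1])   # predicted Chl-a (synthetic)
print("NSE =", round(nse(obs, pred), 3))
```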
The Effect of Data Quality on Short-term Growth Model Projections
David Gartner
2005-01-01
This study was designed to determine the effect of FIA's data quality on short-term growth model projections. The data from Georgia's 1996 statewide survey were used for the Southern variant of the Forest Vegetation Simulator to predict Georgia's first annual panel. The effect of several data error sources on growth modeling prediction errors...
Improved model quality assessment using ProQ2.
Ray, Arjun; Lindahl, Erik; Wallner, Björn
2012-09-10
Employing methods to assess the quality of modeled protein structures is now standard practice in bioinformatics. In a broad sense, the techniques can be divided into methods relying on consensus prediction on the one hand, and single-model methods on the other. Consensus methods frequently perform very well when there is a clear consensus, but this is not always the case. In particular, they frequently fail in selecting the best possible model in the hard cases (lacking consensus) or in the easy cases where models are very similar. In contrast, single-model methods do not suffer from these drawbacks and could potentially be applied on any protein of interest to assess quality or as a scoring function for sampling-based refinement. Here, we present a new single-model method, ProQ2, based on ideas from its predecessor, ProQ. ProQ2 is a model quality assessment algorithm that uses support vector machines to predict local as well as global quality of protein models. Improved performance is obtained by combining previously used features with updated structural and predicted features. The most important contribution can be attributed to the use of profile weighting of the residue specific features and the use of features averaged over the whole model even though the prediction is still local. ProQ2 is significantly better than its predecessors at detecting high quality models, improving the sum of Z-scores for the selected first-ranked models by 20% and 32% compared to the second-best single-model method in CASP8 and CASP9, respectively. The absolute quality assessment of the models at both local and global level is also improved. The Pearson's correlation between the correct and the predicted local score is improved from 0.59 to 0.70 on CASP8 and from 0.62 to 0.68 on CASP9; for the global score against the correct GDT_TS, from 0.75 to 0.80 and from 0.77 to 0.80, again compared to the second-best single-model methods in CASP8 and CASP9, respectively. ProQ2 is available at http://proq2.wallnerlab.org.
Passion in sport: on the quality of the coach-athlete relationship.
Lafrenière, Marc-André K; Jowett, Sophia; Vallerand, Robert J; Gonahue, Eric G; Lorimer, Ross
2008-10-01
Vallerand et al. (2003) developed a dualistic model of passion, wherein two types of passion are proposed: harmonious (HP) and obsessive (OP) passion that predict adaptive and less adaptive interpersonal outcomes, respectively. In the present research, we were interested in understanding the role of passion in the quality of coach-athlete relationships. Results of Study 1, conducted with athletes (N=157), revealed that HP positively predicts a high-quality coach-athlete relationship, whereas OP was largely unrelated to such relationships. Study 2 was conducted with coaches (N=106) and showed that only HP positively predicted the quality of the coach-athlete relationship. Furthermore, these effects were fully mediated by positive emotions. Finally, the quality of the coach-athlete relationship positively predicted coaches' subjective well-being. Future research directions are discussed in light of the dualistic model of passion.
Perceptual tools for quality-aware video networks
NASA Astrophysics Data System (ADS)
Bovik, A. C.
2014-01-01
Monitoring and controlling the quality of the viewing experience of videos transmitted over increasingly congested networks (especially wireless networks) is a pressing problem owing to rapid advances in video-centric mobile communication and display devices that are straining the capacity of the network infrastructure. New developments in automatic perceptual video quality models offer tools that have the potential to be used to perceptually optimize wireless video, leading to more efficient video data delivery and better received quality. In this talk I will review key perceptual principles that are, or could be used to create effective video quality prediction models, and leading quality prediction models that utilize these principles. The goal is to be able to monitor and perceptually optimize video networks by making them "quality-aware."
Kochendorfer, Logan B; Kerns, Kathryn A
2017-05-01
Relationships with parents and friends are important contexts for developing romantic relationship skills. Parents and friends may influence both the timing of involvement and the quality of romantic relationships. Three models of the joint influence of parents and friends (direct effects model, mediation model, and moderator model) have been proposed. The present study uses data from a longitudinal study (n = 1012; 49.8% female; 81.1% Caucasian) to examine how attachment and friendship quality at age 10 years predict romantic relationship involvement and quality at ages 12 and 15 years. The results supported the direct effects model, with attachment and friendship quality uniquely predicting different romantic relationship outcomes. The findings provide further support for the important influence of family and friends on early romantic relationships.
Lee, Hasup; Baek, Minkyung; Lee, Gyu Rie; Park, Sangwoo; Seok, Chaok
2017-03-01
Many proteins function as homo- or hetero-oligomers; therefore, attempts to understand and regulate protein functions require knowledge of protein oligomer structures. The number of available experimental protein structures is increasing, and oligomer structures can be predicted using the experimental structures of related proteins as templates. However, template-based models may have errors due to sequence differences between the target and template proteins, which can lead to functional differences. Such structural differences may be predicted by loop modeling of local regions or refinement of the overall structure. In CAPRI (Critical Assessment of PRotein Interactions) round 30, we used recently developed features of the GALAXY protein modeling package, including template-based structure prediction, loop modeling, model refinement, and protein-protein docking to predict protein complex structures from amino acid sequences. Out of the 25 CAPRI targets, medium and acceptable quality models were obtained for 14 and 1 target(s), respectively, for which proper oligomer or monomer templates could be detected. Symmetric interface loop modeling on oligomer model structures successfully improved model quality, while loop modeling on monomer model structures failed. Overall refinement of the predicted oligomer structures consistently improved the model quality, in particular in interface contacts. Proteins 2017; 85:399-407. © 2016 Wiley Periodicals, Inc. © 2016 Wiley Periodicals, Inc.
Choudhry, Shahid A.; Li, Jing; Davis, Darcy; Erdmann, Cole; Sikka, Rishi; Sutariya, Bharat
2013-01-01
Introduction: Preventing the occurrence of hospital readmissions is needed to improve quality of care and foster population health across the care continuum. Hospitals are being held accountable for improving transitions of care to avert unnecessary readmissions. Advocate Health Care in Chicago and Cerner (ACC) collaborated to develop all-cause, 30-day hospital readmission risk prediction models to identify patients that need interventional resources. Ideally, prediction models should encompass several qualities: they should have high predictive ability; use reliable and clinically relevant data; use vigorous performance metrics to assess the models; be validated in populations where they are applied; and be scalable in heterogeneous populations. However, a systematic review of prediction models for hospital readmission risk determined that most performed poorly (average C-statistic of 0.66) and efforts to improve their performance are needed for widespread usage. Methods: The ACC team incorporated electronic health record data, utilized a mixed-method approach to evaluate risk factors, and externally validated their prediction models for generalizability. Inclusion and exclusion criteria were applied on the patient cohort and then split for derivation and internal validation. Stepwise logistic regression was performed to develop two predictive models: one for admission and one for discharge. The prediction models were assessed for discrimination ability, calibration, overall performance, and then externally validated. Results: The ACC Admission and Discharge Models demonstrated modest discrimination ability during derivation, internal and external validation post-recalibration (C-statistic of 0.76 and 0.78, respectively), and reasonable model fit during external validation for utility in heterogeneous populations. Conclusions: The ACC Admission and Discharge Models embody the design qualities of ideal prediction models. The ACC plans to continue its partnership to further improve and develop valuable clinical models. PMID:24224068
NASA Astrophysics Data System (ADS)
Lee, Soon Hwan; Kim, Ji Sun; Lee, Kang Yeol; Shon, Keon Tae
2017-04-01
Air quality in Korea is worsening due to increasing particulate matter (PM). At present, the PM forecast is announced based on the PM concentration predicted by a numerical air quality prediction model. However, forecast accuracy is not as high as expected because of various uncertainties in the physical and chemical characteristics of PM. The purpose of this study was to develop a numerical-statistical ensemble model to improve the accuracy of PM10 concentration predictions. The numerical models used in this study are the three-dimensional atmospheric Weather Research and Forecasting (WRF) model and the Community Multiscale Air Quality (CMAQ) model. The target areas for the PM forecast are the Seoul, Busan, Daegu, and Daejeon metropolitan areas in Korea. The data used in the model development are PM concentrations and CMAQ predictions, and the data period is 3 months (March 1 - May 31, 2014). The dynamic-statistical technique for reducing the systematic error of the CMAQ predictions was applied through a dynamic linear model (DLM) based on the Bayesian Kalman filter technique. Applying the outputs of the dynamic linear model to the forecasting of PM concentrations improved accuracy. In particular, at high PM concentrations, where the damage is relatively large, excellent improvement is shown.
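An illustrative sketch of a dynamic linear model of the kind described above: a random-walk bias state is updated by a scalar Kalman filter each day and subtracted from the raw CMAQ PM10 prediction. The noise variances and the synthetic series are assumptions, not the study's configuration.

```python
# Sketch: scalar Kalman-filter bias correction of a chemical-transport forecast.
# All values are synthetic and parameters are assumed.
import numpy as np

rng = np.random.default_rng(7)
n_days = 120
truth = 40 + 15 * np.sin(np.arange(n_days) / 10.0)
cmaq = truth + 12 + rng.normal(scale=6, size=n_days)   # raw model output with a +12 bias

bias, var = 0.0, 25.0    # bias state estimate and its variance (assumed priors)
q, r = 1.0, 36.0         # process and observation noise variances (assumed)
corrected = np.empty(n_days)
for t in range(n_days):
    corrected[t] = cmaq[t] - bias            # forecast for day t using yesterday's bias estimate
    var += q                                  # predict step
    innovation = cmaq[t] - truth[t] - bias    # observed error once the measurement arrives
    k = var / (var + r)                       # Kalman gain
    bias += k * innovation                    # update step
    var *= (1 - k)

print("raw RMSE:      ", np.sqrt(np.mean((cmaq - truth) ** 2)))
print("corrected RMSE:", np.sqrt(np.mean((corrected - truth) ** 2)))
```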
Ding, Jinliang; Chai, Tianyou; Wang, Hong
2011-03-01
This paper presents a novel offline modeling for product quality prediction of mineral processing which consists of a number of unit processes in series. The prediction of the product quality of the whole mineral process (i.e., the mixed concentrate grade) plays an important role and the establishment of its predictive model is a key issue for the plantwide optimization. For this purpose, a hybrid modeling approach of the mixed concentrate grade prediction is proposed, which consists of a linear model and a nonlinear model. The least-squares support vector machine is adopted to establish the nonlinear model. The inputs of the predictive model are the performance indices of each unit process, while the output is the mixed concentrate grade. In this paper, the model parameter selection is transformed into the shape control of the probability density function (PDF) of the modeling error. In this context, both the PDF-control-based and minimum-entropy-based model parameter selection approaches are proposed. Indeed, this is the first time that the PDF shape control idea is used to deal with system modeling, where the key idea is to tune model parameters so that either the modeling error PDF is controlled to follow a target PDF or the modeling error entropy is minimized. The experimental results using the real plant data and the comparison of the two approaches are discussed. The results show the effectiveness of the proposed approaches.
Impact of inherent meteorology uncertainty on air quality model predictions
It is well established that there are a number of different classifications and sources of uncertainties in environmental modeling systems. Air quality models rely on two key inputs, namely, meteorology and emissions. When using air quality models for decision making, it is impor...
Quantitative structure-activity relationship models that stand the test of time.
Davis, Andrew M; Wood, David J
2013-04-01
The pharmaceutical industry is in a period of intense change. While this has many drivers, attrition through the development process continues to be an important pressure. The emerging definitions of "compound quality" that are based on retrospective analyses of developmental attrition have highlighted a new direction for medicinal chemistry and the paradigm of "quality at the point of design". The time has come for retrospective analyses to catalyze prospective action. Quality at the point of design places pressure on the quality of our predictive models. Empirical QSAR models when built with care provide true predictive control, but their accuracy and precision can be improved. Here we describe AstraZeneca's experience of automation in QSAR model building and validation, and how an informatics system can provide a step-change in predictive power to project design teams, if they choose to use it.
Predicting aged pork quality using a portable Raman device.
Santos, C C; Zhao, J; Dong, X; Lonergan, S M; Huff-Lonergan, E; Outhouse, A; Carlson, K B; Prusa, K J; Fedler, C A; Yu, C; Shackelford, S D; King, D A; Wheeler, T L
2018-05-29
The utility of Raman spectroscopic signatures of fresh pork loin (1 d and 15 d postmortem) in predicting fresh pork tenderness and slice shear force (SSF) was determined. Partial least squares models showed that sensory tenderness and SSF are weakly correlated (R2 = 0.2). Raman spectral data were collected in 6 s using a portable Raman spectrometer (RS). A PLS regression model was developed to predict quantitatively the tenderness scores and SSF values from Raman spectral data, with very limited success. It was discovered that the prediction accuracies for day 15 postmortem samples are significantly greater than those for day 1 postmortem samples. Classification models were developed to predict tenderness at the two ends of sensory quality as "poor" vs. "good". The accuracies of classification into different quality categories (1st to 4th percentile) are also greater for the day 15 postmortem samples for sensory tenderness (93.5% vs 76.3%) and SSF (92.8% vs 76.1%). RS has the potential to become a rapid on-line screening tool for pork producers to quickly select meats with superior quality and/or cull poor quality to meet market demand/expectations. Copyright © 2018 Elsevier Ltd. All rights reserved.
Prediction of passenger ride quality in a multifactor environment
NASA Technical Reports Server (NTRS)
Dempsey, T. K.; Leatherwood, J. D.
1976-01-01
A model being developed permits the understanding and prediction of passenger discomfort in a multifactor environment, with particular emphasis upon combined noise and vibration. The model has general applicability to diverse transportation systems and provides a means of developing ride quality design criteria as well as a diagnostic tool for identifying the vibration and/or noise stimuli causing discomfort. Presented are: (1) a review of the basic theoretical and mathematical computations associated with the model, (2) a discussion of methodological and criteria investigations for both the vertical and roll axes of vibration, (3) a description of within-axis masking of discomfort responses for the vertical axis, thereby allowing prediction of the total discomfort due to any random vertical vibration, (4) a discussion of initial data on between-axis masking, and (5) a discussion of a study directed towards extension of the vibration model to the more general case of predicting ride quality in combined noise and vibration environments.
Thin-slice vision: inference of confidence measure from perceptual video quality
NASA Astrophysics Data System (ADS)
Hameed, Abdul; Balas, Benjamin; Dai, Rui
2016-11-01
There has been considerable research on thin-slice judgments, but no study has demonstrated the predictive validity of confidence measures when assessors watch videos acquired from communication systems, in which the perceptual quality of videos could be degraded by limited bandwidth and unreliable network conditions. This paper studies the relationship between high-level thin-slice judgments of human behavior and factors that contribute to perceptual video quality. Based on a large number of subjective test results, it has been found that the confidence of a single individual present in all the videos, called the speaker's confidence (SC), could be predicted by a list of features that contribute to perceptual video quality. Two prediction models, one based on an artificial neural network and the other on a decision tree, were built to predict SC. Experimental results have shown that both prediction models can achieve high correlation measures.
Performance of ANFIS versus MLP-NN dissolved oxygen prediction models in water quality monitoring.
Najah, A; El-Shafie, A; Karim, O A; El-Shafie, Amr H
2014-02-01
We discuss the accuracy and performance of the adaptive neuro-fuzzy inference system (ANFIS) in training and prediction of dissolved oxygen (DO) concentrations. The model was used to analyze historical data generated through continuous monitoring of water quality parameters at several stations on the Johor River to predict DO concentrations. Four water quality parameters were selected for ANFIS modeling, including temperature, pH, nitrate (NO3) concentration, and ammoniacal nitrogen concentration (NH3-NL). Sensitivity analysis was performed to evaluate the effects of the input parameters. The inputs with the greatest effect were those related to oxygen content (NO3) or oxygen demand (NH3-NL). Temperature was the parameter with the least effect, whereas pH provided the lowest contribution to the proposed model. To evaluate the performance of the model, three statistical indices were used: the coefficient of determination (R2), the mean absolute prediction error, and the correlation coefficient. The performance of the ANFIS model was compared with an artificial neural network model. The ANFIS model was capable of providing greater accuracy, particularly in the case of extreme events.
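A minimal sketch of the statistical indices used above to compare DO prediction models; the function names and the commented usage are illustrative assumptions, not code from the study.

```python
import numpy as np

def evaluation_indices(y_obs, y_pred):
    """Indices for comparing DO prediction models: coefficient of determination (R2),
    mean absolute error, and correlation coefficient."""
    y_obs, y_pred = np.asarray(y_obs, float), np.asarray(y_pred, float)
    ss_res = np.sum((y_obs - y_pred) ** 2)
    ss_tot = np.sum((y_obs - y_obs.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    mae = np.mean(np.abs(y_obs - y_pred))
    r = np.corrcoef(y_obs, y_pred)[0, 1]
    return {"R2": r2, "MAE": mae, "r": r}

# Hypothetical usage: compare ANFIS and ANN predictions on a held-out set.
# print(evaluation_indices(do_observed, do_anfis), evaluation_indices(do_observed, do_ann))
```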
Lindner-Lunsford, J. B.; Ellis, S.R.
1984-01-01
The U.S. Geological Survey's Distributed Routing Rainfall-Runoff Model--Version II was calibrated and verified for five urban basins in the Denver metropolitan area. Land-use types in the basins were light commercial, multifamily housing, single-family housing, and a shopping center. The overall accuracy of model predictions of peak flows and runoff volumes was about 15 percent for storms with rainfall intensities of less than 1 inch per hour and runoff volumes of greater than 0.01 inch. Predictions generally were unsatisfactory for storms having a rainfall intensity of more than 1 inch per hour, or runoff of 0.01 inch or less. The Distributed Routing Rainfall-Runoff Model-Quality, a multievent runoff-quality model developed by the U.S. Geological Survey, was calibrated and verified on four basins. The model was found to be most useful in the prediction of seasonal loads of constituents in the runoff resulting from rainfall. The model was not very accurate in the prediction of runoff loads of individual constituents. (USGS)
Grayson, Richard; Kay, Paul; Foulger, Miles
2008-01-01
Diffuse pollution poses a threat to water quality and results in the need for treatment of potable water supplies, which can prove costly. Within the Yorkshire region, UK, nitrates, pesticides and water colour present particular treatment problems. Catchment management techniques offer an alternative to 'end of pipe' solutions and allow resources to be targeted at the most polluting areas. This project has attempted to identify such areas using GIS-based modelling approaches in catchments where water quality data were available. As no model exists to predict water colour, a model was created using an MCE method which is capable of predicting colour concentrations at the catchment scale. CatchIS was used to predict pesticide and nitrate N concentrations and was found to be generally capable of reliably predicting nitrate N loads at the catchment scale. The pesticide results did not match the historic data, possibly due to problems with the historic pesticide data and temporal and spatial variability in pesticide usage. The use of these models can be extended to predict water quality problems in catchments where water quality data are unavailable and to highlight areas of concern. IWA Publishing 2008.
NASA Astrophysics Data System (ADS)
Ouyang, Qin; Chen, Quansheng; Zhao, Jiewen
2016-02-01
The approach presented herein reports the application of near infrared (NIR) spectroscopy, in contrast with a human sensory panel, as a tool for estimating Chinese rice wine quality; concretely, to achieve the prediction of the overall sensory scores assigned by the trained sensory panel. Back propagation artificial neural network (BPANN) combined with the adaptive boosting (AdaBoost) algorithm, namely BP-AdaBoost, was proposed as a novel nonlinear modeling algorithm. First, the optimal spectral intervals were selected by synergy interval partial least squares (Si-PLS). Then, a BP-AdaBoost model based on the optimal spectral intervals was established, called the Si-BP-AdaBoost model. These models were optimized by cross validation, and the performance of each final model was evaluated according to the correlation coefficient (Rp) and root mean square error of prediction (RMSEP) in the prediction set. Si-BP-AdaBoost showed excellent performance in comparison with other models. The best Si-BP-AdaBoost model was achieved with Rp = 0.9180 and RMSEP = 2.23 in the prediction set. It was concluded that NIR spectroscopy combined with Si-BP-AdaBoost is an appropriate method for predicting the sensory quality of Chinese rice wine.
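A minimal sketch of boosting a neural-network regressor in the spirit of the BP-AdaBoost model described above, assuming a recent scikit-learn (the base learner is passed via the estimator argument); the interval selection step (Si-PLS) is replaced by a simple column slice standing in for the selected spectral intervals, and the hyperparameters are illustrative.

```python
import numpy as np
from sklearn.ensemble import AdaBoostRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

def fit_bp_adaboost(spectra, sensory_scores, selected_cols):
    """Boosted back-propagation network on pre-selected spectral intervals."""
    X = spectra[:, selected_cols]              # stand-in for Si-PLS interval selection
    X_tr, X_te, y_tr, y_te = train_test_split(X, sensory_scores, test_size=0.25,
                                              random_state=0)
    base = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
    model = AdaBoostRegressor(estimator=base, n_estimators=30,
                              learning_rate=0.5, random_state=0).fit(X_tr, y_tr)
    pred = model.predict(X_te)
    rmsep = np.sqrt(np.mean((pred - y_te) ** 2))
    rp = np.corrcoef(pred, y_te)[0, 1]
    return model, rp, rmsep
```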
A next generation air quality modeling system is being developed at the U.S. EPA to enable seamless modeling of air quality from global to regional to (eventually) local scales. State of the science chemistry and aerosol modules from the Community Multiscale Air Quality (CMAQ) mo...
Tomperi, Jani; Leiviskä, Kauko
2018-06-01
Traditionally, modelling in an activated sludge process has been based solely on process measurements, but as interest in optically monitoring wastewater samples to characterize floc morphology has increased, the results of image analyses have in recent years been used more frequently to predict the characteristics of wastewater. This study shows that the traditional process measurements and the automated optical monitoring variables are not, by themselves, capable of producing the best predictive models for treated wastewater quality in a full-scale wastewater treatment plant; the optimal models, which capture the level of and changes in treated wastewater quality, are achieved by using these variables together. With this early warning, process operation can be optimized to avoid environmental damage and economic losses. The study also shows that specific optical monitoring variables are important in modelling a certain quality parameter, regardless of the other input variables available.
The second phase of the MicroArray Quality Control (MAQC-II) project evaluated common practices for developing and validating microarray-based models aimed at predicting toxicological and clinical endpoints. Thirty-six teams developed classifiers for 13 endpoints - some easy, som...
Prediction models for Arabica coffee beverage quality based on aroma analyses and chemometrics.
Ribeiro, J S; Augusto, F; Salva, T J G; Ferreira, M M C
2012-11-15
In this work, soft modeling based on chemometric analyses of coffee beverage sensory data and the chromatographic profiles of volatile roasted coffee compounds is proposed to predict the scores of acidity, bitterness, flavor, cleanliness, body, and overall quality of the coffee beverage. A partial least squares (PLS) regression method was used to construct the models. The ordered predictor selection (OPS) algorithm was applied to select the compounds for the regression model of each sensory attribute in order to take only significant chromatographic peaks into account. The prediction errors of these models, using 4 or 5 latent variables, were equal to 0.28, 0.33, 0.35, 0.33, 0.34 and 0.41 for the respective attributes, and were compatible with the errors of the mean scores of the experts. Thus, the results proved the feasibility of using a similar methodology in on-line or routine applications to predict the sensory quality of Brazilian Arabica coffee. Copyright © 2012 Elsevier B.V. All rights reserved.
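A minimal sketch of PLS calibration against sensory scores in the spirit of the study above, assuming scikit-learn; the ordered predictor selection (OPS) step is replaced by a simple correlation-based ranking of chromatographic peaks, which is an illustrative stand-in rather than the authors' algorithm.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

def pls_sensory_model(peak_areas, scores, n_keep=30, n_components=5):
    """PLS model for one sensory attribute using the n_keep peaks most
    correlated with the scores (stand-in for OPS predictor selection)."""
    corr = np.abs([np.corrcoef(peak_areas[:, j], scores)[0, 1]
                   for j in range(peak_areas.shape[1])])
    keep = np.argsort(corr)[-n_keep:]
    X = peak_areas[:, keep]
    pls = PLSRegression(n_components=n_components)
    pred = np.ravel(cross_val_predict(pls, X, scores, cv=5))
    rmsecv = np.sqrt(np.mean((pred - scores) ** 2))
    return pls.fit(X, scores), keep, rmsecv
```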
Wilson, Richard; Goodacre, Steve W; Klingbajl, Marcin; Kelly, Anne-Maree; Rainer, Tim; Coats, Tim; Holloway, Vikki; Townend, Will; Crane, Steve
2014-01-01
Background and objective Risk-adjusted mortality rates can be used as a quality indicator if it is assumed that the discrepancy between predicted and actual mortality can be attributed to the quality of healthcare (ie, the model has attributional validity). The Development And Validation of Risk-adjusted Outcomes for Systems of emergency care (DAVROS) model predicts 7-day mortality in emergency medical admissions. We aimed to test this assumption by evaluating the attributional validity of the DAVROS risk-adjustment model. Methods We selected cases that had the greatest discrepancy between observed mortality and predicted probability of mortality from seven hospitals involved in validation of the DAVROS risk-adjustment model. Reviewers at each hospital assessed hospital records to determine whether the discrepancy between predicted and actual mortality could be explained by the healthcare provided. Results We received 232/280 (83%) completed review forms relating to 179 unexpected deaths and 53 unexpected survivors. The healthcare system was judged to have potentially contributed to 10/179 (8%) of the unexpected deaths and 26/53 (49%) of the unexpected survivors. Failure of the model to appropriately predict risk was judged to be responsible for 135/179 (75%) of the unexpected deaths and 2/53 (4%) of the unexpected survivors. Some 10/53 (19%) of the unexpected survivors died within a few months of the 7-day period of model prediction. Conclusions We found little evidence that deaths occurring in patients with a low predicted mortality from risk-adjustment could be attributed to the quality of healthcare provided. PMID:23605036
NASA Technical Reports Server (NTRS)
Hess, R. A.
1977-01-01
A brief review of some of the more pertinent applications of analytical pilot models to the prediction of aircraft handling qualities is undertaken. The relative ease with which multiloop piloting tasks can be modeled via the optimal control formulation makes the use of optimal pilot models particularly attractive for handling qualities research. To this end, a rating hypothesis is introduced which relates the numerical pilot opinion rating assigned to a particular vehicle and task to the numerical value of the index of performance resulting from an optimal pilot modeling procedure as applied to that vehicle and task. This hypothesis is tested using data from piloted simulations and is shown to be reasonable. An example concerning a helicopter landing approach is introduced to outline the predictive capability of the rating hypothesis in multiaxis piloting tasks.
Cost Models for MMC Manufacturing Processes
NASA Technical Reports Server (NTRS)
Elzey, Dana M.; Wadley, Haydn N. G.
1996-01-01
Processes for the manufacture of advanced metal matrix composites are rapidly approaching maturity in the research laboratory and there is growing interest in their transition to industrial production. However, research conducted to date has almost exclusively focused on overcoming the technical barriers to producing high-quality material, and little attention has been given to the economic feasibility of these laboratory approaches and process cost issues. A quantitative cost modeling (QCM) approach was developed to address these issues. QCMs are cost analysis tools based on predictive process models relating process conditions to the attributes of the final product. An important attribute of the QCM approach is the ability to predict the sensitivity of material production costs to product quality and to quantitatively explore trade-offs between cost and quality. Applications of the cost models allow more efficient direction of future MMC process technology development and a more accurate assessment of MMC market potential. Cost models were developed for two state-of-the-art metal matrix composite (MMC) manufacturing processes: tape casting and plasma spray deposition. Quality and cost models are presented for both processes and the resulting predicted quality-cost curves are presented and discussed.
Hu, Meng-Han; Dong, Qing-Li; Liu, Bao-Lin
2016-08-01
Hyperspectral reflectance and transmittance sensing as well as near-infrared (NIR) spectroscopy were investigated as non-destructive tools for estimating blueberry firmness, elastic modulus and soluble solids content (SSC). Least squares-support vector machine models were established from these three spectra based on samples from three cultivars, viz. Bluecrop, Duke and M2, and two harvest years, viz. 2014 and 2015, for predicting blueberry postharvest quality. One-cultivar reflectance models (establishing the model using one cultivar) gave better results than the corresponding transmittance and NIR models for predicting blueberry firmness, with few cultivar effects. Two-cultivar NIR models (establishing the model using two cultivars) proved to be suitable for estimating blueberry SSC, with correlations over 0.83. Rp (RMSEp) values of the three-cultivar reflectance models (establishing the model using 75% of the three cultivars) were 0.73 (0.094) and 0.73 (0.186), respectively, for predicting blueberry firmness and elastic modulus. For SSC prediction, the three-cultivar NIR model was found to achieve an Rp (RMSEp) value of 0.85 (0.090). Adding Bluecrop samples harvested in 2014 could enhance the robustness of the three-cultivar models for firmness and elastic modulus. The above results indicated the potential for using spatial and spectral techniques to develop robust models for predicting blueberry postharvest quality in the presence of biological variability. © 2015 Society of Chemical Industry.
Reflexion on linear regression trip production modelling method for ensuring good model quality
NASA Astrophysics Data System (ADS)
Suprayitno, Hitapriya; Ratnasari, Vita
2017-11-01
Transport modelling is important. For certain cases the conventional model still has to be used, in which having a good trip production model is essential. A good model can only be obtained from a good sample. Two of the basic principles of good sampling are that the sample must be capable of representing the population characteristics and capable of producing an acceptable error at a certain confidence level. It seems that these principles are not yet well understood or used in trip production modelling. Therefore, it is necessary to investigate trip production modelling practice in Indonesia and to formulate a better modelling method for ensuring model quality. The research results are presented as follows. Statistics provides a method to calculate the span of a prediction value at a certain confidence level for linear regression, called the confidence interval of the predicted value. Common modelling practice uses R2 as the principal quality measure, while sampling practice varies and does not always conform to sampling principles. An experiment indicates that a small sample can already give an excellent R2 value and that the sample composition can significantly change the model. Hence, a good R2 value does not always mean good model quality. These findings lead to three basic ideas for ensuring good model quality: reformulating the quality measure, the calculation procedure, and the sampling method. The quality measure is defined as having both a good R2 value and a good confidence interval of the predicted value. The calculation procedure must incorporate statistical calculation methods and the appropriate statistical tests. A good sampling method must incorporate random, well-distributed, stratified sampling with a certain minimum number of samples. These three ideas need to be further developed and tested.
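A minimal sketch of the confidence interval of a predicted value for a linear regression trip production model, assuming statsmodels; the household-size predictor and the 95% level are illustrative assumptions rather than variables from the study.

```python
import numpy as np
import statsmodels.api as sm

def trip_production_with_intervals(household_size, trips, new_sizes, alpha=0.05):
    """Fit trips = b0 + b1 * household_size and report, for new observations,
    the predicted value with its confidence and prediction intervals."""
    X = sm.add_constant(np.asarray(household_size, float))
    model = sm.OLS(np.asarray(trips, float), X).fit()
    X_new = sm.add_constant(np.asarray(new_sizes, float), has_constant="add")
    frame = model.get_prediction(X_new).summary_frame(alpha=alpha)
    # 'mean_ci_*' bounds the regression line; 'obs_ci_*' bounds an individual prediction.
    return model, frame[["mean", "mean_ci_lower", "mean_ci_upper",
                         "obs_ci_lower", "obs_ci_upper"]]
```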
Berlow, Eric L.; Knapp, Roland A.; Ostoja, Steven M.; Williams, Richard J.; McKenny, Heather; Matchett, John R.; Guo, Qinghau; Fellers, Gary M.; Kleeman, Patrick; Brooks, Matthew L.; Joppa, Lucas
2013-01-01
A central challenge of conservation biology is using limited data to predict rare species occurrence and identify conservation areas that play a disproportionate role in regional persistence. Where species occupy discrete patches in a landscape, such predictions require data about the environmental quality of individual patches and the connectivity among high quality patches. We present a novel extension to species occupancy modeling that blends traditional predictions of individual patch environmental quality with network analysis to estimate connectivity characteristics using limited survey data. We demonstrate this approach using environmental and geospatial attributes to predict observed occupancy patterns of the Yosemite toad (Anaxyrus (= Bufo) canorus) across >2,500 meadows in Yosemite National Park (USA). A. canorus, a Federal Proposed Species, breeds in shallow water associated with meadows. Our generalized linear model (GLM) accurately predicted ~84% of true presence-absence data on a subset of data withheld for testing. The predicted environmental quality of each meadow was iteratively 'boosted' by the quality of neighbors within dispersal distance. We used this park-wide meadow connectivity network to estimate the relative influence of an individual meadow's 'environmental quality' versus its 'network quality' to predict: a) clusters of high quality breeding meadows potentially linked by dispersal, b) breeding meadows with high environmental quality that are isolated from other such meadows, c) breeding meadows with lower environmental quality where long-term persistence may critically depend on the network neighborhood, and d) breeding meadows with the biggest impact on park-wide breeding patterns. Combined with targeted data on dispersal, genetics, disease, and other potential stressors, these results can guide designation of core conservation areas for A. canorus in Yosemite National Park.
Tracing the influence of land-use change on water quality and coral reefs using a Bayesian model.
Brown, Christopher J; Jupiter, Stacy D; Albert, Simon; Klein, Carissa J; Mangubhai, Sangeeta; Maina, Joseph M; Mumby, Peter; Olley, Jon; Stewart-Koster, Ben; Tulloch, Vivitskaia; Wenger, Amelia
2017-07-06
Coastal ecosystems can be degraded by poor water quality. Tracing the causes of poor water quality back to land-use change is necessary to target catchment management for coastal zone management. However, existing models for tracing the sources of pollution require extensive data-sets which are not available for many of the world's coral reef regions that may have severe water quality issues. Here we develop a hierarchical Bayesian model that uses freely available satellite data to infer the connection between land-uses in catchments and water clarity in coastal oceans. We apply the model to estimate the influence of land-use change on water clarity in Fiji. We tested the model's predictions against underwater surveys, finding that predictions of poor water quality are consistent with observations of high siltation and low coverage of sediment-sensitive coral genera. The model thus provides a means to link land-use change to declines in coastal water quality.
Prediction of aircraft handling qualities using analytical models of the human pilot
NASA Technical Reports Server (NTRS)
Hess, R. A.
1982-01-01
The optimal control model (OCM) of the human pilot is applied to the study of aircraft handling qualities. Attention is focused primarily on longitudinal tasks. The modeling technique differs from previous applications of the OCM in that considerable effort is expended in simplifying the pilot/vehicle analysis. After briefly reviewing the OCM, a technique for modeling the pilot controlling higher order systems is introduced. Following this, a simple criterion for determining the susceptibility of an aircraft to pilot-induced oscillations (PIO) is formulated. Finally, a model-based metric for pilot rating prediction is discussed. The resulting modeling procedure provides a relatively simple, yet unified approach to the study of a variety of handling qualities problems.
NASA Astrophysics Data System (ADS)
SUN, N.; Yearsley, J. R.; Lettenmaier, D. P.
2013-12-01
Recent research shows that precipitation extremes in many of the largest U.S. urban areas have increased over the last 60 years. These changes have important implications for stormwater runoff and water quality, which in urban areas are dominated by the most extreme precipitation events. We assess the potential implications of changes in extreme precipitation and changing land cover in urban and urbanizing watersheds at the regional scale using a combination of hydrology and water quality models. Specifically, we describe the integration of a spatially distributed hydrological model - the Distributed Hydrology Soil Vegetation Model (DHSVM), the urban water quality model in EPA's Storm Water Management Model (SWMM), the semi-Lagrangian stream temperature model RBM10, and dynamical and statistical downscaling methods applied to global climate predictions. Key output water quality parameters include total suspended solids (TSS), total nitrogen, total phosphorus, fecal coliform bacteria and stream temperature. We have evaluated the performance of the modeling system in the highly urbanized Mercer Creek watershed in the rapidly growing Bellevue urban area in WA, USA. The results suggest that the model is able to (1) produce reasonable streamflow predictions at fine temporal and spatial scales; (2) provide spatially distributed water temperature predictions that mostly agree with observations throughout a complex stream network, and characterize impacts of climate, landscape, near-stream vegetation change on stream temperature at local and regional scales; and (3) capture plausibly the response of water quality constituents to varying magnitude of precipitation events in urban environments. Next we will extend the scope of the study from the Mercer Creek watershed to include the entire Puget Sound Basin, WA, USA.
NASA Astrophysics Data System (ADS)
Matichuk, R.; Tonnesen, G.; Luecken, D.; Roselle, S. J.; Napelenok, S. L.; Baker, K. R.; Gilliam, R. C.; Misenis, C.; Murphy, B.; Schwede, D. B.
2015-12-01
The western United States is an important source of domestic energy resources. One of the primary environmental impacts associated with oil and natural gas production is related to air emission releases of a number of air pollutants. Some of these pollutants are important precursors to the formation of ground-level ozone. To better understand ozone impacts and other air quality issues, photochemical air quality models are used to simulate the changes in pollutant concentrations in the atmosphere on local, regional, and national spatial scales. These models are important for air quality management because they assist in identifying source contributions to air quality problems and designing effective strategies to reduce harmful air pollutants. The success of predicting oil and natural gas air quality impacts depends on the accuracy of the input information, including emissions inventories, meteorological information, and boundary conditions. The treatment of chemical and physical processes within these models is equally important. However, given the limited amount of data collected for oil and natural gas production emissions in the past and the complex terrain and meteorological conditions in western states, the ability of these models to accurately predict pollution concentrations from these sources is uncertain. Therefore, this presentation will focus on understanding the Community Multiscale Air Quality (CMAQ) model's ability to predict air quality impacts associated with oil and natural gas production and its sensitivity to input uncertainties. The results will focus on winter ozone issues in the Uinta Basin, Utah and identify the factors contributing to model performance issues. The results of this study will help support future air quality model development, policy and regulatory decisions for the oil and gas sector.
Feature maps driven no-reference image quality prediction of authentically distorted images
NASA Astrophysics Data System (ADS)
Ghadiyaram, Deepti; Bovik, Alan C.
2015-03-01
Current blind image quality prediction models rely on benchmark databases comprised of singly and synthetically distorted images, thereby learning image features that are only adequate to predict human perceived visual quality on such inauthentic distortions. However, real world images often contain complex mixtures of multiple distortions. Rather than a) discounting the effect of these mixtures of distortions on an image's perceptual quality and considering only the dominant distortion or b) using features that are only proven to be efficient for singly distorted images, we deeply study the natural scene statistics of authentically distorted images, in different color spaces and transform domains. We propose a feature-maps-driven statistical approach which avoids any latent assumptions about the type of distortion(s) contained in an image, and focuses instead on modeling the remarkable consistencies in the scene statistics of real world images in the absence of distortions. We design a deep belief network that takes model-based statistical image features derived from a very large database of authentically distorted images as input and discovers good feature representations by generalizing over different distortion types, mixtures, and severities, which are later used to learn a regressor for quality prediction. We demonstrate the remarkable competence of our features for improving automatic perceptual quality prediction on a benchmark database and on the newly designed LIVE Authentic Image Quality Challenge Database and show that our approach of combining robust statistical features and the deep belief network dramatically outperforms the state-of-the-art.
Likelihood of achieving air quality targets under model uncertainties.
Digar, Antara; Cohan, Daniel S; Cox, Dennis D; Kim, Byeong-Uk; Boylan, James W
2011-01-01
Regulatory attainment demonstrations in the United States typically apply a bright-line test to predict whether a control strategy is sufficient to attain an air quality standard. Photochemical models are the best tools available to project future pollutant levels and are a critical part of regulatory attainment demonstrations. However, because photochemical models are uncertain and future meteorology is unknowable, future pollutant levels cannot be predicted perfectly and attainment cannot be guaranteed. This paper introduces a computationally efficient methodology for estimating the likelihood that an emission control strategy will achieve an air quality objective in light of uncertainties in photochemical model input parameters (e.g., uncertain emission and reaction rates, deposition velocities, and boundary conditions). The method incorporates Monte Carlo simulations of a reduced form model representing pollutant-precursor response under parametric uncertainty to probabilistically predict the improvement in air quality due to emission control. The method is applied to recent 8-h ozone attainment modeling for Atlanta, Georgia, to assess the likelihood that additional controls would achieve fixed (well-defined) or flexible (due to meteorological variability and uncertain emission trends) targets of air pollution reduction. The results show that in certain instances ranking of the predicted effectiveness of control strategies may differ between probabilistic and deterministic analyses.
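A minimal sketch of the Monte Carlo reduced-form approach described above, assuming a linear pollutant-precursor response whose sensitivity coefficient is uncertain; the lognormal uncertainty factor, the baseline design value, and the target are illustrative assumptions, not values from the Atlanta application.

```python
import numpy as np

def attainment_likelihood(baseline_o3=88.0, target=84.0, delta_nox=0.30,
                          sensitivity_mean=12.0, sensitivity_gsd=1.4,
                          n_draws=10_000, seed=0):
    """Probability that an emission control strategy attains the ozone target,
    given an uncertain reduced-form sensitivity (ppb ozone per unit NOx reduction)."""
    rng = np.random.default_rng(seed)
    # Lognormal uncertainty about the model's ozone-to-NOx sensitivity.
    sens = sensitivity_mean * np.exp(rng.normal(0.0, np.log(sensitivity_gsd), n_draws))
    projected = baseline_o3 - sens * delta_nox
    return float(np.mean(projected <= target))

# e.g. attainment_likelihood() -> fraction of Monte Carlo draws meeting the standard
```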
2014-01-01
This paper examined the efficiency of multivariate linear regression (MLR) and artificial neural network (ANN) models in the prediction of two major water quality parameters in a wastewater treatment plant. Biochemical oxygen demand (BOD) and chemical oxygen demand (COD), as indirect indicators of organic matter, are representative parameters of sewer water quality. Performance of the ANN models was evaluated using the coefficient of correlation (r), root mean square error (RMSE) and bias values. The values of BOD and COD computed by the ANN method and by regression analysis were in close agreement with their respective measured values. Results showed that the ANN model performed better than the MLR model. Comparative indices of the optimized ANN with input values of temperature (T), pH, total suspended solids (TSS) and total solids (TS) were RMSE = 25.1 mg/L and r = 0.83 for the prediction of BOD, and RMSE = 49.4 mg/L and r = 0.81 for the prediction of COD. It was found that the ANN model could be employed successfully in estimating the BOD and COD at the inlet of wastewater biochemical treatment plants. Moreover, sensitivity analysis showed that the pH parameter has a greater effect on BOD and COD prediction than the other parameters. Also, both implemented models predicted BOD better than COD. PMID:24456676
Sugeno-Fuzzy Expert System Modeling for Quality Prediction of Non-Contact Machining Process
NASA Astrophysics Data System (ADS)
Sivaraos; Khalim, A. Z.; Salleh, M. S.; Sivakumar, D.; Kadirgama, K.
2018-03-01
Modeling can be categorised into four main domains: prediction, optimisation, estimation and calibration. In this paper, the Takagi-Sugeno-Kang (TSK) fuzzy logic method is examined as a prediction modelling method to investigate the taper quality of laser lathing, which seeks to replace traditional lathe machines with 3D laser lathing in order to achieve the desired cylindrical shape of stock materials. Three design parameters were selected: feed rate, cutting speed and depth of cut. A total of twenty-four experiments were conducted with eight sequential runs, each replicated three times. The TSK fuzzy predictive model achieved an accuracy rate of 99%, which suggests that the model is a suitable and practical method for the non-linear laser lathing process.
Akbaş, Halil; Bilgen, Bilge; Turhan, Aykut Melih
2015-11-01
This study proposes an integrated prediction and optimization model by using multi-layer perceptron neural network and particle swarm optimization techniques. Three different objective functions are formulated. The first one is the maximization of methane percentage with single output. The second one is the maximization of biogas production with single output. The last one is the maximization of biogas quality and biogas production with two outputs. Methane percentage, carbon dioxide percentage, and other contents' percentage are used as the biogas quality criteria. Based on the formulated models and data from a wastewater treatment facility, optimal values of input variables and their corresponding maximum output values are found out for each model. It is expected that the application of the integrated prediction and optimization models increases the biogas production and biogas quality, and contributes to the quantity of electricity production at the wastewater treatment facility. Copyright © 2015 Elsevier Ltd. All rights reserved.
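A minimal sketch of coupling a neural-network prediction model with particle swarm optimization over the input variables, in the spirit of the study above, assuming scikit-learn for the perceptron and a hand-rolled PSO; the bounds, swarm size, and single biogas-production objective are illustrative assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def optimize_inputs_pso(X, y, bounds, n_particles=30, n_iter=100, seed=0):
    """Train an MLP on plant data, then search the input space with PSO
    for the operating point that maximizes the predicted biogas output."""
    rng = np.random.default_rng(seed)
    model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000,
                         random_state=seed).fit(X, y)
    lo, hi = np.asarray(bounds, float).T
    pos = rng.uniform(lo, hi, size=(n_particles, len(lo)))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), model.predict(pos)
    gbest = pbest[np.argmax(pbest_val)]
    for _ in range(n_iter):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        val = model.predict(pos)
        improved = val > pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], val[improved]
        gbest = pbest[np.argmax(pbest_val)]
    return gbest, float(model.predict(gbest[None, :])[0])
```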
Yang, Zhongshan; Wang, Jian
2017-10-01
Air pollution in many countries is worsening with industrialization and urbanization, resulting in climate change and affecting people's health, thus making the work of policymakers more difficult. It is therefore both urgent and necessary to establish a more scientific air quality monitoring and early warning system to evaluate the degree of air pollution objectively and predict pollutant concentrations accurately. However, the integration of air quality assessment and air pollutant concentration prediction to establish an air quality system is not common. In this paper, we propose a new air quality monitoring and early warning system, including an assessment module and a forecasting module. In the air quality assessment module, fuzzy comprehensive evaluation is used to determine the main pollutants and evaluate the degree of air pollution more scientifically. In the air pollutant concentration prediction module, a novel hybrid model combining complementary ensemble empirical mode decomposition, a modified cuckoo search and differential evolution algorithm, and an Elman neural network is proposed to improve the forecasting accuracy of six main air pollutant concentrations. To verify the effectiveness of this system, pollutant data for two cities in China are used. The result of the fuzzy comprehensive evaluation shows that the major air pollutants in Xi'an and Jinan are PM10 and PM2.5 respectively, and that the air quality of Xi'an is better than that of Jinan. The forecasting results indicate that the proposed hybrid model is remarkably superior to all benchmark models on account of its higher prediction accuracy and stability. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Kim, S.; Seo, D. J.
2017-12-01
When water temperature (TW) increases due to changes in hydrometeorological conditions, the overall ecological conditions in the aquatic system change. These changes can be harmful to human health and potentially fatal to fish habitat. Therefore, it is important to assess the impacts of thermal disturbances on in-stream processes of water quality variables and to be able to predict the effectiveness of possible actions that may be taken for water quality protection. For skillful prediction of in-stream water quality processes, watershed water quality models must be able to reflect such changes. Most of the currently available models, however, assume static parameters for the biophysiochemical processes and hence are not able to capture the nonstationarities seen in water quality observations. In this work, we assess the performance of the Hydrological Simulation Program-Fortran (HSPF) in predicting algal dynamics following a TW increase. The study area is located in the Republic of Korea, where a waterway change due to weir construction and a drought occurred concurrently around 2012. We use data assimilation (DA) techniques to update model parameters as well as the initial conditions of selected state variables for in-stream processes relevant to algal growth. For assessment of model performance and characterization of temporal variability, various goodness-of-fit measures and wavelet analysis are used.
NASA Astrophysics Data System (ADS)
Qiao, T.; Ren, J.; Craigie, C.; Zabalza, J.; Maltin, Ch.; Marshall, S.
2015-03-01
It is well known that the eating quality of beef has a significant influence on the repurchase behavior of consumers. There are several key factors that affect the perception of quality, including color, tenderness, juiciness, and flavor. To support consumer repurchase choices, there is a need for an objective measurement of quality that could be applied to meat prior to its sale. Objective approaches such as those offered by spectral technologies may be useful, but the analytical algorithms used remain to be optimized. For visible and near infrared (VISNIR) spectroscopy, Partial Least Squares Regression (PLSR) is a widely used technique for meat-related quality modeling and prediction. In this paper, a Support Vector Machine (SVM) based machine learning approach is presented to predict beef eating quality traits. Although SVM has been successfully used in various disciplines, it has not been applied extensively to the analysis of meat quality parameters. To this end, the performance of PLSR and SVM as tools for the analysis of meat tenderness is evaluated, using a large dataset acquired under industrial conditions. The spectral dataset was collected using VISNIR spectroscopy with wavelengths ranging from 350 to 1800 nm on 234 beef M. longissimus thoracis steaks from heifers, steers, and young bulls. As the dimensionality of the VISNIR data is very high (over 1600 spectral bands), the Principal Component Analysis (PCA) technique was applied for feature extraction and data reduction. The extracted principal components (fewer than 100) were then used for data modeling and prediction. The prediction results showed that SVM has a greater potential to predict beef eating quality than PLSR, especially for the prediction of tenderness. The influence of animal gender on beef quality prediction was also investigated, and it was found that beef quality traits were predicted most accurately in beef from young bulls.
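A minimal sketch of the PCA-plus-SVM pipeline compared with PLSR above, assuming scikit-learn; the number of principal components, the kernel, and the train/test split are illustrative choices rather than the settings used on the industrial beef dataset.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVR
from sklearn.cross_decomposition import PLSRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

def compare_svm_plsr(spectra, tenderness, n_components=50):
    """Predict a beef quality trait from VISNIR spectra with PCA+SVR and PLSR."""
    X_tr, X_te, y_tr, y_te = train_test_split(spectra, tenderness,
                                              test_size=0.3, random_state=0)
    svm = make_pipeline(StandardScaler(), PCA(n_components=n_components),
                        SVR(kernel="rbf", C=10.0, epsilon=0.1)).fit(X_tr, y_tr)
    pls = PLSRegression(n_components=15).fit(X_tr, y_tr)
    def rmsep(m):
        return float(np.sqrt(np.mean((np.ravel(m.predict(X_te)) - y_te) ** 2)))
    return {"PCA+SVR": rmsep(svm), "PLSR": rmsep(pls)}
```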
Linked Hydrologic-Hydrodynamic Model Framework to Forecast Impacts of Rivers on Beach Water Quality
NASA Astrophysics Data System (ADS)
Anderson, E. J.; Fry, L. M.; Kramer, E.; Ritzenthaler, A.
2014-12-01
The goal of NOAA's beach quality forecasting program is to use a multi-faceted approach to aid in detection and prediction of bacteria in recreational waters. In particular, our focus has been on the connection between tributary loads and bacteria concentrations at nearby beaches. While there is a clear link between stormwater runoff and beach water quality, quantifying the contribution of river loadings to nearshore bacterial concentrations is complicated due to multiple processes that drive bacterial concentrations in rivers as well as those processes affecting the fate and transport of bacteria upon exiting the rivers. In order to forecast potential impacts of rivers on beach water quality, we developed a linked hydrologic-hydrodynamic water quality framework that simulates accumulation and washoff of bacteria from the landscape, and then predicts the fate and transport of washed off bacteria from the watershed to the coastal zone. The framework includes a watershed model (IHACRES) to predict fecal indicator bacteria (FIB) loadings to the coastal environment (accumulation, wash-off, die-off) as a function of effective rainfall. These loadings are input into a coastal hydrodynamic model (FVCOM), including a bacteria transport model (Lagrangian particle), to simulate 3D bacteria transport within the coastal environment. This modeling system provides predictive tools to assist local managers in decision-making to reduce human health threats.
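A minimal sketch of the accumulation/wash-off/die-off idea behind the watershed component described above, assuming a simple exponential buildup-washoff formulation driven by effective rainfall; the rate constants and the daily time step are illustrative assumptions, not IHACRES parameters.

```python
import numpy as np

def fib_loading(effective_rainfall, accum_rate=1e9, decay=0.1, washoff_coef=0.4):
    """Daily fecal indicator bacteria load washed from the landscape.
    Bacteria build up on dry days, die off exponentially, and a rainfall-
    dependent fraction is washed into the river on wet days."""
    store, loads = 0.0, []
    for rain in np.asarray(effective_rainfall, float):
        store = store * np.exp(-decay) + accum_rate             # die-off + buildup
        washed = store * (1.0 - np.exp(-washoff_coef * rain))   # rainfall-driven wash-off
        store -= washed
        loads.append(washed)
    return np.array(loads)   # organisms/day delivered to the coastal model
```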
Learning a Continuous-Time Streaming Video QoE Model.
Ghadiyaram, Deepti; Pan, Janice; Bovik, Alan C
2018-05-01
Over-the-top adaptive video streaming services are frequently impacted by fluctuating network conditions that can lead to rebuffering events (stalling events) and sudden bitrate changes. These events visually impact video consumers' quality of experience (QoE) and can lead to consumer churn. The development of models that can accurately predict viewers' instantaneous subjective QoE under such volatile network conditions could potentially enable the more efficient design of quality-control protocols for media-driven services, such as YouTube, Amazon, Netflix, and so on. However, most existing models only predict a single overall QoE score on a given video and are based on simple global video features, without accounting for relevant aspects of human perception and behavior. We have created a QoE evaluator, called the time-varying QoE Indexer, that accounts for interactions between stalling events, analyzes the spatial and temporal content of a video, predicts the perceptual video quality, models the state of the client-side data buffer, and consequently predicts continuous-time quality scores that agree quite well with human opinion scores. The new QoE predictor also embeds the impact of relevant human cognitive factors, such as memory and recency, and their complex interactions with the video content being viewed. We evaluated the proposed model on three different video databases and attained standout QoE prediction performance.
Atashi, Alireza; Verburg, Ilona W; Karim, Hesam; Miri, Mirmohammad; Abu-Hanna, Ameen; de Jonge, Evert; de Keizer, Nicolette F; Eslami, Saeid
2018-06-01
Intensive Care Unit (ICU) length of stay (LoS) prediction models are used to compare different institutions and surgeons on their performance, and are useful as an efficiency indicator for quality control. There is little consensus about which prediction methods are most suitable to predict ICU length of stay. The aim of this study is to systematically review models for predicting ICU LoS after coronary artery bypass grafting and to assess the reporting and methodological quality of these models in order to apply them for benchmarking. A general search was conducted in Medline and Embase up to 31-12-2016. Three authors classified the papers for inclusion by reading their title, abstract and full text. All original papers describing development and/or validation of a prediction model for LoS in the ICU after CABG surgery were included. We used a checklist developed for critical appraisal and data extraction for systematic reviews of prediction modeling and extended it regarding the handling of specific patient subgroups. We also defined other items and scores to assess the methodological and reporting quality of the models. Of 5181 uniquely identified articles, fifteen studies were included, of which twelve concerned the development of new models and three the validation of existing models. All studies used linear or logistic regression as the method for model development, and reported various performance measures based on the difference between predicted and observed ICU LoS. Most used a prospective (46.6%) or retrospective (40%) study design. We found heterogeneity in patient inclusion/exclusion criteria, sample size, reported accuracy rates, and methods of candidate predictor selection. Most (60%) studies did not mention the handling of missing values, and none compared the model outcome measure of survivors with that of non-survivors. For model development and validation studies respectively, the maximum reporting (methodological) scores were 66/78 and 62/62 (14/22 and 12/22). There are relatively few models for predicting ICU length of stay after CABG. Several aspects of the methodological and reporting quality of studies in this field should be improved. There is a need for standardizing outcome and risk factor definitions in order to develop and validate a multi-institutional and international risk scoring system.
Foerster, Steffen; Zhong, Ying; Pintea, Lilian; Murray, Carson M; Wilson, Michael L; Mjungu, Deus C; Pusey, Anne E
2016-01-01
The distribution and abundance of food resources are among the most important factors that influence animal behavioral strategies. Yet, spatial variation in feeding habitat quality is often difficult to assess with traditional methods that rely on extrapolation from plot survey data or remote sensing. Here, we show that maximum entropy species distribution modeling can be used to successfully predict small-scale variation in the distribution of 24 important plant food species for chimpanzees at Gombe National Park, Tanzania. We combined model predictions with behavioral observations to quantify feeding habitat quality as the cumulative dietary proportion of the species predicted to occur in a given location. This measure exhibited considerable spatial heterogeneity with elevation and latitude, both within and across main habitat types. We used model results to assess individual variation in habitat selection among adult chimpanzees during a 10-year period, testing predictions about trade-offs between foraging and reproductive effort. We found that nonswollen females selected the highest-quality habitats compared with swollen females or males, in line with predictions based on their energetic needs. Swollen females appeared to compromise feeding in favor of mating opportunities, suggesting that females rather than males change their ranging patterns in search of mates. Males generally occupied feeding habitats of lower quality, which may exacerbate energetic challenges of aggression and territory defense. Finally, we documented an increase in feeding habitat quality with community residence time in both sexes during the dry season, suggesting an influence of familiarity on foraging decisions in a highly heterogeneous landscape.
Use of Air Quality Observations by the National Air Quality Forecast Capability
NASA Astrophysics Data System (ADS)
Stajner, I.; McQueen, J.; Lee, P.; Stein, A. F.; Kondragunta, S.; Ruminski, M.; Tong, D.; Pan, L.; Huang, J. P.; Shafran, P.; Huang, H. C.; Dickerson, P.; Upadhayay, S.
2015-12-01
The National Air Quality Forecast Capability (NAQFC) operational predictions of ozone and wildfire smoke for the United States (U.S.) and predictions of airborne dust for continental U.S. are available at http://airquality.weather.gov/. NOAA National Centers for Environmental Prediction (NCEP) operational North American Mesoscale (NAM) weather predictions are combined with the Community Multiscale Air Quality (CMAQ) model to produce the ozone predictions and test fine particulate matter (PM2.5) predictions. The Hybrid Single Particle Lagrangian Integrated Trajectory (HYSPLIT) model provides smoke and dust predictions. Air quality observations constrain emissions used by NAQFC predictions. NAQFC NOx emissions from mobile sources were updated using National Emissions Inventory (NEI) projections for year 2012. These updates were evaluated over large U.S. cities by comparing observed changes in OMI NO2 observations and NOx measured by surface monitors. The rate of decrease in NOx emission projections from year 2005 to year 2012 is in good agreement with the observed changes over the same period. Smoke emissions rely on the fire locations detected from satellite observations obtained from NESDIS Hazard Mapping System (HMS). Dust emissions rely on a climatology of areas with a potential for dust emissions based on MODIS Deep Blue aerosol retrievals. Verification of NAQFC predictions uses AIRNow compilation of surface measurements for ozone and PM2.5. Retrievals of smoke from GOES satellites are used for verification of smoke predictions. Retrievals of dust from MODIS are used for verification of dust predictions. In summary, observations are the basis for the emissions inputs for NAQFC, they are critical for evaluation of performance of NAQFC predictions, and furthermore they are used in real-time testing of bias correction of PM2.5 predictions, as we continue to work on improving modeling and emissions important for representation of PM2.5.
NASA Astrophysics Data System (ADS)
Tiebin, Wu; Yunlian, Liu; Xinjun, Li; Yi, Yu; Bin, Zhang
2018-06-01
Aiming at the difficulty of quality prediction for sintered ores, a hybrid prediction model is established that combines mechanism models of sintering with time-weighted error compensation based on the extreme learning machine (ELM). First, mechanism models of the drum index, total iron, and alkalinity are constructed according to the chemical reaction mechanisms and conservation of matter in the sintering process. Because the process is simplified in the mechanism models, they cannot describe the high nonlinearity, and errors are therefore inevitable. For this reason, a time-weighted ELM-based error compensation model is established. Simulation results verify that the hybrid model has high accuracy and can meet the requirements of industrial applications.
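A minimal sketch of an extreme learning machine used to compensate the residual error of a mechanism model, in the spirit of the hybrid scheme above, assuming NumPy only; the hidden-layer size, ridge regularization, and the placeholder mechanism model are illustrative assumptions.

```python
import numpy as np

class ELM:
    """Single-hidden-layer extreme learning machine: random input weights,
    output weights solved by ridge-regularized least squares."""
    def __init__(self, n_hidden=50, reg=1e-3, seed=0):
        self.n_hidden, self.reg, self.rng = n_hidden, reg, np.random.default_rng(seed)

    def _h(self, X):
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, y):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = self._h(X)
        self.beta = np.linalg.solve(H.T @ H + self.reg * np.eye(self.n_hidden), H.T @ y)
        return self

    def predict(self, X):
        return self._h(X) @ self.beta

def hybrid_predict(X, mechanism_model, elm):
    """Mechanism prediction plus ELM-estimated error compensation."""
    return mechanism_model(X) + elm.predict(X)

# Fit the compensator on residuals of the mechanism model (illustrative usage):
# elm = ELM().fit(X_train, y_train - mechanism_model(X_train))
# y_hat = hybrid_predict(X_test, mechanism_model, elm)
```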
Air quality models are used to predict changes in pollutant concentrations resulting from envisioned emission control policies. Recognizing the need to assess the credibility of air quality models in a policy-relevant context, we perform a dynamic evaluation of the community Mult...
NASA Technical Reports Server (NTRS)
Kirtman, Ben P.; Min, Dughong; Infanti, Johnna M.; Kinter, James L., III; Paolino, Daniel A.; Zhang, Qin; vandenDool, Huug; Saha, Suranjana; Mendez, Malaquias Pena; Becker, Emily;
2013-01-01
The recent US National Academies report "Assessment of Intraseasonal to Interannual Climate Prediction and Predictability" was unequivocal in recommending the need for the development of a North American Multi-Model Ensemble (NMME) operational predictive capability. Indeed, this effort is required to meet the specific tailored regional prediction and decision support needs of a large community of climate information users. The multi-model ensemble approach has proven extremely effective at quantifying prediction uncertainty due to uncertainty in model formulation, and has proven to produce better prediction quality (on average) than any single-model ensemble. This multi-model approach is the basis for several international collaborative prediction research efforts and an operational European system, and there are numerous examples of how this multi-model ensemble approach yields superior forecasts compared to any single model. Based on two NOAA Climate Test Bed (CTB) NMME workshops (February 18 and April 8, 2011) a collaborative and coordinated implementation strategy for a NMME prediction system has been developed and is currently delivering real-time seasonal-to-interannual predictions on the NOAA Climate Prediction Center (CPC) operational schedule. The hindcast and real-time prediction data are readily available (e.g., http://iridl.ldeo.columbia.edu/SOURCES/.Models/.NMME/) and in graphical format from CPC (http://origin.cpc.ncep.noaa.gov/products/people/wd51yf/NMME/index.html). Moreover, the NMME forecasts are already being used as guidance for operational forecasters. This paper describes the new NMME effort, presents an overview of the multi-model forecast quality, and the complementary skill associated with individual models.
Shearer, Heather M; Côté, Pierre; Boyle, Eleanor; Hayden, Jill A; Frank, John; Johnson, William G
2017-09-01
Purpose Our objective was to develop a clinical prediction model to identify workers with sustainable employment following an episode of work-related low back pain (LBP). Methods We used data from a cohort study of injured workers with incident LBP claims in the USA to predict employment patterns 1 and 6 months following a workers' compensation claim. We developed three sequential models to determine the contribution of three domains of variables: (1) basic demographic/clinical variables; (2) health-related variables; and (3) work-related factors. Multivariable logistic regression was used to develop the predictive models. We constructed receiver operating characteristic curves and used the c-index to measure predictive accuracy. Results Seventy-nine percent and 77% of workers had sustainable employment at 1 and 6 months, respectively. Sustainable employment at 1 month was predicted by initial back pain intensity, mental health-related quality of life, claim litigation and employer type (c-index = 0.77). At 6 months, sustainable employment was predicted by physical and mental health-related quality of life, claim litigation and employer type (c-index = 0.77). Adding health-related and work-related variables to the models improved predictive accuracy by 8.5% and 10% at 1 and 6 months, respectively. Conclusion We developed clinically relevant models to predict sustainable employment in injured workers who made a workers' compensation claim for LBP. Inquiring about back pain intensity, physical and mental health-related quality of life, claim litigation and employer type may be beneficial in developing programs of care. Our models need to be validated in other populations.
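A minimal sketch of developing a multivariable logistic prediction model and measuring its c-index, in the spirit of the study above, assuming scikit-learn; the predictor names mirror the domains described in the abstract, but the variable coding and column names are hypothetical.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

def sustainable_employment_model(df: pd.DataFrame):
    """Predict sustainable employment at 1 month from demographic/clinical,
    health-related, and work-related predictors; report the c-index (AUC)."""
    predictors = ["back_pain_intensity", "mental_hrqol", "physical_hrqol",
                  "claim_litigation", "employer_type_public"]   # hypothetical coding
    X = df[predictors].to_numpy(float)
    y = df["sustainable_employment_1m"].to_numpy(int)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                              random_state=0, stratify=y)
    model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    c_index = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    return model, c_index
```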
George M. Banzhaf; Thomas G. Matney; Emily B. Schultz; James S. Meadows; J. Paul Jeffreys; William C. Booth; Gan Li; Andrew W. Ezell; Theodor D. Leininger
2016-01-01
Red oak (Quercus section Lobatae)-sweetgum (Liquidambar styraciflua L.) stands growing on mid-south bottomland sites in the United States are well known for producing high-quality grade hardwood logs, but models for estimating the quantity and quality of standing grade wood in these stands have been unavailable. Prediction...
Zhu, Hongyan; Chu, Bingquan; Fan, Yangyang; Tao, Xiaoya; Yin, Wenxin; He, Yong
2017-08-10
We investigated the feasibility and potential of determining firmness, soluble solids content (SSC), and pH in kiwifruits using hyperspectral imaging combined with variable selection methods and calibration models. The images were acquired by a push-broom hyperspectral reflectance imaging system covering two spectral ranges. Weighted regression coefficients (BW), the successive projections algorithm (SPA) and genetic algorithm-partial least squares (GAPLS) were compared and evaluated for the selection of effective wavelengths. Moreover, multiple linear regression (MLR), partial least squares regression and least squares support vector machine (LS-SVM) models were developed to predict the quality attributes quantitatively using the effective wavelengths. The established models, particularly SPA-MLR, SPA-LS-SVM and GAPLS-LS-SVM, performed well. The SPA-MLR models for firmness (Rpre = 0.9812, RPD = 5.17) and SSC (Rpre = 0.9523, RPD = 3.26) at 380-1023 nm showed excellent performance, whereas GAPLS-LS-SVM was the optimal model at 874-1734 nm for predicting pH (Rpre = 0.9070, RPD = 2.60). Image processing algorithms were developed to apply the predictive model to every pixel and generate prediction maps that visualize the spatial distribution of firmness and SSC. Hence, the results clearly demonstrated that hyperspectral imaging has potential as a fast and non-invasive method to predict the quality attributes of kiwifruits.
Miyauchi, Shunsuke; Yonetani, Tsutomu; Yuki, Takayuki; Tomio, Ayako; Bamba, Takeshi; Fukusaki, Eiichiro
2017-02-01
For an experimental model to elucidate the relationship between light quality during plant culture and the quality of crops or vegetables, we cultured tea plants (Camellia sinensis) and analyzed their leaves as tea material. First, metabolic profiling of teas from a tea contest in Japan was performed with gas chromatography/mass spectrometry (GC/MS), and a ranking prediction model was then built to predict tea rankings from metabolite profiles. In addition, several compounds (glutamine, glutamic acid, oxalic acid, epigallocatechin, phosphoric acid, and inositol) were identified as important for assessing tea leaf quality. Subsequently, tea plants were cultured under artificial conditions to control these compounds. According to the ranking prediction model, the tea sample supplemented with ultraviolet-A (315-399 nm) showed the highest ranking. The improvement in quality was attributed to the higher amino acid content and the decreased epigallocatechin content of the tea leaves. The current study shows the use and value of metabolic profiling for the production of high-quality crops and vegetables, whose quality has conventionally been evaluated by human sensory analysis. Metabolic profiling enables hypotheses to be formed for understanding and developing high-quality plants cultured under artificial conditions. Copyright © 2016 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
Evaluating Predictive Models of Software Quality
NASA Astrophysics Data System (ADS)
Ciaschini, V.; Canaparo, M.; Ronchieri, E.; Salomoni, D.
2014-06-01
Applications from the High Energy Physics scientific community are constantly growing and are implemented by a large number of developers. This implies strong churn in the code and an associated risk of faults, which is unavoidable as long as the software undergoes active evolution. However, the necessities of production systems run counter to this. Stability and predictability are of paramount importance; in addition, a short turn-around time for the defect discovery-correction-deployment cycle is required. A way to reconcile these opposite foci is to use a software quality model to obtain an approximation of the risk before releasing a program, so that only software with a risk lower than an agreed threshold is delivered. In this article we evaluated two quality predictive models to identify the operational risk and the quality of some software products. We applied these models to the development history of several EMI packages with the intent of discovering the risk factor of each product and comparing it with its real history. We attempted to determine whether the models reasonably map reality for the applications under evaluation, and we conclude by suggesting directions for further studies.
Harris, Ted D.; Graham, Jennifer L.
2017-01-01
Cyanobacterial blooms degrade water quality in drinking water supply reservoirs by producing toxic and taste-and-odor causing secondary metabolites, which ultimately cause public health concerns and lead to increased treatment costs for water utilities. There have been numerous attempts to create models that predict cyanobacteria and their secondary metabolites, most using linear models; however, linear models are limited by assumptions about the data and have had limited success as predictive tools. Thus, lake and reservoir managers need improved modeling techniques that can accurately predict large bloom events that have the highest impact on recreational activities and drinking-water treatment processes. In this study, we compared 12 unique linear and nonlinear regression modeling techniques to predict cyanobacterial abundance and the cyanobacterial secondary metabolites microcystin and geosmin using 14 years of physicochemical water quality data collected from Cheney Reservoir, Kansas. Support vector machine (SVM), random forest (RF), boosted tree (BT), and Cubist modeling techniques were the most predictive of the compared modeling approaches. SVM, RF, and BT modeling techniques were able to successfully predict cyanobacterial abundance, microcystin, and geosmin concentrations <60,000 cells/mL, 2.5 µg/L, and 20 ng/L, respectively. Only Cubist modeling predicted maximum concentrations of cyanobacteria and geosmin; no modeling technique was able to predict maximum microcystin concentrations. Because maximum concentrations are a primary concern for lake and reservoir managers, Cubist modeling may help predict the largest and most noxious concentrations of cyanobacteria and their secondary metabolites.
The US EPA has a plan to leverage recent advances in meteorological modeling to develop a "Next-Generation" air quality modeling system that will allow consistent modeling of problems from global to local scale. The meteorological model of choice is the Model for Predic...
Childhood maltreatment and adulthood poor sleep quality: a longitudinal study.
Abajobir, Amanuel A; Kisely, Steve; Williams, Gail; Strathearn, Lane; Najman, Jake M
2017-08-01
Available evidence from cross-sectional studies suggests that childhood maltreatment may be associated with a range of sleep disorders. However, these studies have not controlled for potential individual-, familial- and environmental-level confounders. To determine the association between childhood maltreatment and lower sleep quality after adjusting for potential confounders. Data for the present study were obtained from a pre-birth cohort study of 3778 young adults (52.6% female) of the Mater Hospital-University of Queensland Study of Pregnancy follow up at a mean age of 20.6 years. The Mater Hospital-University of Queensland Study of Pregnancy is a prospective Australian pre-birth cohort study of mothers consecutively recruited during their first obstetric clinic visit at Brisbane's Mater Hospital in 1981-1983. Participants completed the Pittsburgh Sleep Quality Index at the 21-year follow up. We linked this dataset to agency-recorded substantiated cases of childhood maltreatment. A series of separate logistic regression models was used to test whether childhood maltreatment predicted lower sleep quality after adjustment for selected confounders. Substantiated physical abuse significantly predicted lower sleep quality in males. Single and multiple forms of childhood maltreatment, including age of maltreatment and number of substantiations, did not predict lower sleep quality in either gender in both crude and adjusted models. Not being married, living in a residential problem area, cigarette smoking and internalising were significantly associated with lower sleep quality in a fully adjusted model for the male-female combined sample. Childhood maltreatment does not appear to predict young adult poor sleep quality, with the exception of physical abuse for males. While childhood maltreatment has been found to predict a range of mental health problems, childhood maltreatment does not appear to predict sleep problems occurring in young adults. Poor sleep quality was accounted for by concurrent social disadvantage, cigarette smoking and internalising. © 2017 Royal Australasian College of Physicians.
Kramer, Andrew A; Higgins, Thomas L; Zimmerman, Jack E
2015-02-01
To compare ICU performance using standardized mortality ratios generated by the Acute Physiology and Chronic Health Evaluation IVa and a National Quality Forum-endorsed methodology and examine potential reasons for model-based standardized mortality ratio differences. Retrospective analysis of day 1 hospital mortality predictions at the ICU level using Acute Physiology and Chronic Health Evaluation IVa and National Quality Forum models on the same patient cohort. Forty-seven ICUs at 36 U.S. hospitals from January 2008 to May 2013. Eighty-nine thousand three hundred fifty-three consecutive unselected ICU admissions. None. We assessed standardized mortality ratios for each ICU using data for patients eligible for Acute Physiology and Chronic Health Evaluation IVa and National Quality Forum predictions in order to compare unit-level model performance, differences in ICU rankings, and how case-mix adjustment might explain standardized mortality ratio differences. Hospital mortality was 11.5%. Overall standardized mortality ratio was 0.89 using Acute Physiology and Chronic Health Evaluation IVa and 1.07 using National Quality Forum, the latter having a widely dispersed and multimodal standardized mortality ratio distribution. Model exclusion criteria eliminated mortality predictions for 10.6% of patients for Acute Physiology and Chronic Health Evaluation IVa and 27.9% for National Quality Forum. The two models agreed on the significance and direction of standardized mortality ratio only 45% of the time. Four ICUs had standardized mortality ratios significantly less than 1.0 using Acute Physiology and Chronic Health Evaluation IVa, but significantly greater than 1.0 using National Quality Forum. Two ICUs had standardized mortality ratios exceeding 1.75 using National Quality Forum, but nonsignificant performance using Acute Physiology and Chronic Health Evaluation IVa. Stratification by patient and institutional characteristics indicated that units caring for more severely ill patients and those with a higher percentage of patients on mechanical ventilation had the most discordant standardized mortality ratios between the two predictive models. Acute Physiology and Chronic Health Evaluation IVa and National Quality Forum models yield different ICU performance assessments due to differences in case-mix adjustment. Given the growing role of outcomes in driving prospective payment, patient referral, and public reporting, performance should be assessed by models with fewer exclusions, superior accuracy, and better case-mix adjustment.
Zhang, Lei; Zou, Zhihong; Shan, Wei
2017-06-01
Water quality forecasting is an essential part of water resource management. Spatiotemporal variations of water quality and their inherent constraints make it very complex. This study explored a data-based method for short-term water quality forecasting. Predictions of water quality indicators, including dissolved oxygen, chemical oxygen demand by KMnO4, and ammonia nitrogen, were obtained with support vector machines and taken as inputs to a particle swarm optimization-based optimal wavelet neural network to forecast the overall water quality status index. The Gubeikou monitoring section of Miyun Reservoir in Beijing, China, was taken as the study case to examine the effectiveness of this approach. The experimental results also revealed that the proposed model has advantages of stability and time reduction in comparison with other data-driven models including the traditional BP neural network model, the wavelet neural network model and the Gradient Boosting Decision Tree model. It can be used as an effective approach to perform short-term comprehensive water quality prediction. Copyright © 2016. Published by Elsevier B.V.
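As an illustrative sketch of the two-stage scheme described above (not the authors' implementation), the Python snippet below first fits one support vector regression model per indicator and then feeds those indicator predictions into a small feed-forward network that stands in for the particle swarm-optimized wavelet neural network; all data, variable names, and the substitution of MLPRegressor for the wavelet network are assumptions.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor

# Hypothetical data: X holds raw monitoring inputs; y_do, y_cod, y_nh3 are the
# individual indicators; y_wqi is the overall water quality status index.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
y_do, y_cod, y_nh3 = rng.normal(size=(3, 200))
y_wqi = 0.5 * y_do - 0.3 * y_cod - 0.2 * y_nh3 + rng.normal(scale=0.1, size=200)

# Stage 1: one SVR per water quality indicator.
indicator_models = {name: SVR().fit(X, y)
                    for name, y in [("DO", y_do), ("COD_Mn", y_cod), ("NH3-N", y_nh3)]}
stage1 = np.column_stack([m.predict(X) for m in indicator_models.values()])

# Stage 2: a small feed-forward network stands in for the PSO-optimized
# wavelet neural network that maps indicator predictions to the status index.
wqi_model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000,
                         random_state=0).fit(stage1, y_wqi)
print(wqi_model.predict(stage1[:3]))
```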
Gradient Magnitude Similarity Deviation: A Highly Efficient Perceptual Image Quality Index.
Xue, Wufeng; Zhang, Lei; Mou, Xuanqin; Bovik, Alan C
2014-02-01
It is an important task to faithfully evaluate the perceptual quality of output images in many applications, such as image compression, image restoration, and multimedia streaming. A good image quality assessment (IQA) model should not only deliver high quality prediction accuracy, but also be computationally efficient. The efficiency of IQA metrics is becoming particularly important due to the increasing proliferation of high-volume visual data in high-speed networks. We present a new effective and efficient IQA model, called gradient magnitude similarity deviation (GMSD). The image gradients are sensitive to image distortions, while different local structures in a distorted image suffer different degrees of degradations. This motivates us to explore the use of global variation of gradient based local quality map for overall image quality prediction. We find that the pixel-wise gradient magnitude similarity (GMS) between the reference and distorted images combined with a novel pooling strategy-the standard deviation of the GMS map-can predict accurately perceptual image quality. The resulting GMSD algorithm is much faster than most state-of-the-art IQA methods, and delivers highly competitive prediction accuracy. MATLAB source code of GMSD can be downloaded at http://www4.comp.polyu.edu.hk/~cslzhang/IQA/GMSD/GMSD.htm.
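As a rough, non-authoritative sketch of the pooling idea described in this abstract, the following Python code computes a pixel-wise gradient magnitude similarity map and pools it by its standard deviation; the Prewitt kernels and the stability constant c are assumptions and do not reproduce the published parameter values.

```python
import numpy as np
from scipy.ndimage import convolve

def gmsd(reference, distorted, c=170.0):
    """Sketch of Gradient Magnitude Similarity Deviation.

    `reference` and `distorted` are 2-D grayscale arrays on the same
    intensity scale. The kernels and the constant `c` are placeholders.
    """
    hx = np.array([[1, 0, -1], [1, 0, -1], [1, 0, -1]], dtype=float) / 3.0
    hy = hx.T

    def grad_mag(img):
        gx = convolve(img.astype(float), hx, mode="nearest")
        gy = convolve(img.astype(float), hy, mode="nearest")
        return np.sqrt(gx ** 2 + gy ** 2)

    gm_r = grad_mag(reference)
    gm_d = grad_mag(distorted)

    # Pixel-wise gradient magnitude similarity map.
    gms = (2.0 * gm_r * gm_d + c) / (gm_r ** 2 + gm_d ** 2 + c)

    # Standard-deviation pooling: larger deviation = worse predicted quality.
    return float(np.std(gms))
```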
Regional-scale air quality models are being used to demonstrate attainment of the ozone air quality standard. In current regulatory applications, a regional-scale air quality model is applied for a base year and a future year with reduced emissions using the same meteorological ...
This study is conducted in the framework of the Air Quality Modelling Evaluation International Initiative (AQMEII) and aims at the operational evaluation of an ensemble of 12 regional-scale chemical transport models used to predict air quality over the North American (NA) and Eur...
The construction of QSAR models is critically dependent on the quality of available data. As part of our efforts to develop public platforms to provide access to predictive models, we have attempted to discriminate the influence of the quality versus quantity of data available ...
A First Step towards a Clinical Decision Support System for Post-traumatic Stress Disorders.
Ma, Sisi; Galatzer-Levy, Isaac R; Wang, Xuya; Fenyö, David; Shalev, Arieh Y
2016-01-01
PTSD is distressful and debilitating, following a non-remitting course in about 10% to 20% of trauma survivors. Numerous risk indicators of PTSD have been identified, but individual-level prediction remains elusive. As an effort to bridge the gap between scientific discovery and practical application, we designed and implemented a clinical decision support pipeline to provide clinically relevant recommendations for trauma survivors. To meet the specific challenge of early prediction, this work uses data obtained within ten days of a traumatic event. The pipeline creates a personalized predictive model for each individual and computes quality metrics for each predictive model. Clinical recommendations are made based on both the prediction of the model and its quality, thus avoiding potentially detrimental recommendations based on insufficient information or a suboptimal model. The current pipeline outperforms acute stress disorder, a commonly used clinical risk factor for PTSD development, in terms of both sensitivity and specificity.
Improved protein model quality assessments by changing the target function.
Uziela, Karolis; Menéndez Hurtado, David; Shu, Nanjiang; Wallner, Björn; Elofsson, Arne
2018-06-01
Protein modeling quality is an important part of protein structure prediction. We have for more than a decade developed a set of methods for this problem. We have used various types of description of the protein and different machine learning methodologies. However, common to all these methods has been the target function used for training. The target function in ProQ describes the local quality of a residue in a protein model. In all versions of ProQ the target function has been the S-score. However, other quality estimation functions also exist, which can be divided into superposition- and contact-based methods. The superposition-based methods, such as S-score, are based on a rigid body superposition of a protein model and the native structure, while the contact-based methods compare the local environment of each residue. Here, we examine the effects of retraining our latest predictor, ProQ3D, using identical inputs but different target functions. We find that the contact-based methods are easier to predict and that predictors trained on these measures provide some advantages when it comes to identifying the best model. One possible reason for this is that contact-based methods are better at estimating the quality of multi-domain targets. However, training on the S-score gives the best correlation with the GDT_TS score, which is commonly used in CASP to score the global model quality. To take advantage of both of these features we provide an updated version of ProQ3D that predicts local and global model quality estimates based on different quality estimates. © 2018 Wiley Periodicals, Inc.
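For reference, the superposition-based S-score mentioned above is commonly written per residue as below (a sketch of the general form; d_i is the model-to-native distance for residue i after superposition, and the distance threshold d_0 is an implementation-dependent assumption here).

```latex
S_i = \frac{1}{1 + \left( d_i / d_0 \right)^{2}}
```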
The paper describes a project that combines the capabilities of urban geography, raster-based GIS, predictive meteorological and air pollutant diffusion modeling, to support a neighborhood-scale air quality monitoring pilot study under the U.S. EPA EMPACT Program. The study ha...
The ability to predict water quality in lakes is important since lakes are sources of water for agriculture, drinking, and recreational uses. Lakes are also home to a dynamic ecosystem of lacustrine wetlands and deep waters. They are sensitive to pH changes and are dependent on d...
Eisenberg, Nancy; Sulik, Michael J.; Spinrad, Tracy L.; Edwards, Alison; Eggum, Natalie D.; Liew, Jeffrey; Sallquist, Julie; Popp, Tierney K.; Smith, Cynthia L.; Hart, Daniel
2012-01-01
The purpose of the current study was to predict the development of aggressive behavior from young children’s respiratory sinus arrhythmia (RSA) and environmental quality. In a longitudinal sample of 213 children, baseline RSA, RSA suppression in response to a film of crying babies, and a composite measure of environmental quality (incorporating socioeconomic status and marital adjustment) were measured, and parent-reported aggression was assessed from 18 to 54 months of age. Predictions based on biological sensitivity-to-context/differential susceptibility and diathesis-stress models, as well as potential moderation by child sex, were examined. The interaction of baseline RSA with environmental quality predicted the development (slope) and 54-month intercept of mothers’ reports of aggression. For girls only, the interaction between baseline RSA and environmental quality predicted the 18-month intercept of fathers’ reports. In general, significant negative relations between RSA and aggression were found primarily at high levels of environmental quality. In addition, we found a significant Sex × RSA interaction predicting the slope and 54-month intercept of fathers’ reports of aggression, such that RSA was negatively related to aggression for boys but not for girls. Contrary to predictions, no significant main effects or interactions were found for RSA suppression. The results provide mixed but not full support for differential susceptibility theory and provide little support for the diathesis-stress model. PMID:22182294
Early experiences building a software quality prediction model
NASA Technical Reports Server (NTRS)
Agresti, W. W.; Evanco, W. M.; Smith, M. C.
1990-01-01
Early experiences building a software quality prediction model are discussed. The overall research objective is to establish a capability to project a software system's quality from an analysis of its design. The technical approach is to build multivariate models for estimating reliability and maintainability. Data from 21 Ada subsystems were analyzed to test hypotheses about various design structures leading to failure-prone or unmaintainable systems. Current design variables highlight the interconnectivity and visibility of compilation units. Other model variables provide for the effects of reusability and software changes. Reported results are preliminary because additional project data is being obtained and new hypotheses are being developed and tested. Current multivariate regression models are encouraging, explaining 60 to 80 percent of the variation in error density of the subsystems.
Predictive Techniques for Spacecraft Cabin Air Quality Control
NASA Technical Reports Server (NTRS)
Perry, J. L.; Cromes, Scott D. (Technical Monitor)
2001-01-01
As assembly of the International Space Station (ISS) proceeds, predictive techniques are used to determine the best approach for handling a variety of cabin air quality challenges. These techniques use equipment offgassing data collected from each ISS module before flight to characterize the trace chemical contaminant load. Combined with crew metabolic loads, these data serve as input to a predictive model for assessing the capability of the onboard atmosphere revitalization systems to handle the overall trace contaminant load as station assembly progresses. The techniques for predicting in-flight air quality are summarized along with results from early ISS mission analyses. Results from ground-based analyses of in-flight air quality samples are compared to the predictions to demonstrate the technique's relative conservatism.
Brady, Amie M.G.; Plona, Meg B.
2012-01-01
The Cuyahoga River within Cuyahoga Valley National Park (CVNP) is at times impaired for recreational use due to elevated concentrations of Escherichia coli (E. coli), a fecal-indicator bacterium. During the recreational seasons of mid-May through September during 2009–11, samples were collected 4 days per week and analyzed for E. coli concentrations at two sites within CVNP. Other water-quality and environmental data, including turbidity, rainfall, and streamflow, were measured and (or) tabulated for analysis. Regression models developed to predict recreational water quality in the river were implemented during the recreational seasons of 2009–11 for one site within CVNP–Jaite. For the 2009 and 2010 seasons, the regression models were better at predicting exceedances of Ohio's single-sample standard for primary-contact recreation compared to the traditional method of using the previous day's E. coli concentration. During 2009, the regression model was based on data collected during 2005 through 2008, excluding available 2004 data. The resulting model for 2009 did not perform as well as expected (based on the calibration data set) and tended to overestimate concentrations (correct responses at 69 percent). During 2010, the regression model was based on data collected during 2004 through 2009, including all of the available data. The 2010 model performed well, correctly predicting 89 percent of the samples above or below the single-sample standard, even though the predictions tended to be lower than actual sample concentrations. During 2011, the regression model was based on data collected during 2004 through 2010 and tended to overestimate concentrations. The 2011 model did not perform as well as the traditional method or as expected, based on the calibration dataset (correct responses at 56 percent). At a second site—Lock 29, approximately 5 river miles upstream from Jaite, a regression model based on data collected at the site during the recreational seasons of 2008–10 also did not perform as well as the traditional method or as well as expected (correct responses at 60 percent). Above normal precipitation in the region and a delayed start to the 2011 sampling season (sampling began mid-June) may have affected how well the 2011 models performed. With these new data, however, updated regression models may be better able to predict recreational water quality conditions due to the increased amount of diverse water quality conditions included in the calibration data. Daily recreational water-quality predictions for Jaite were made available on the Ohio Nowcast Web site at www.ohionowcast.info. Other public outreach included signage at trailheads in the park, articles in the park's quarterly-published schedule of events and volunteer newsletters. A U.S. Geological Survey Fact Sheet was also published to bring attention to water-quality issues in the park.
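As a hedged illustration of the kind of regression-based nowcast described above (not the USGS model itself), the sketch below fits a logistic regression that predicts whether the single-sample recreation standard is exceeded from turbidity, rainfall, and streamflow; all values, units, and the exceedance labels are hypothetical.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: turbidity (NTU), 24-h rainfall (in), and
# streamflow (ft^3/s), with each E. coli result flagged as exceeding (1) or
# not exceeding (0) the single-sample recreation standard.
X = np.array([[12.0, 0.0, 150.0],
              [85.0, 1.2, 900.0],
              [30.0, 0.3, 300.0],
              [140.0, 2.0, 1500.0],
              [8.0, 0.0, 120.0],
              [60.0, 0.8, 700.0]])
y = np.array([0, 1, 0, 1, 0, 1])

model = LogisticRegression().fit(X, y)

# Nowcast for today's conditions: probability that the standard is exceeded.
today = np.array([[75.0, 1.0, 820.0]])
print(model.predict_proba(today)[0, 1])
```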
NASA Astrophysics Data System (ADS)
Bray, Casey D.; Battye, William; Aneja, Viney P.; Tong, Daniel; Lee, Pius; Tang, Youhua; Nowak, John B.
2017-08-01
Atmospheric ammonia (NH3) is not only a major precursor gas for fine particulate matter (PM2.5), but it also negatively impacts the environment through eutrophication and acidification. As the need for agriculture, the largest contributing source of NH3, increases, NH3 emissions will also increase. Therefore, it is crucial to accurately predict ammonia concentrations. The objective of this study is to determine how well the U.S. National Oceanic and Atmospheric Administration (NOAA) National Air Quality Forecast Capability (NAQFC) system predicts ammonia concentrations using their Community Multiscale Air Quality (CMAQ) model (v4.6). Model predictions of atmospheric ammonia are compared against measurements taken during the NOAA California Nexus (CalNex) field campaign that took place between May and July of 2010. Additionally, the model predictions were also compared against ammonia measurements obtained from the Tropospheric Emission Spectrometer (TES) on the Aura satellite. The results of this study showed that the CMAQ model tended to underpredict concentrations of NH3. When comparing the CMAQ model with the CalNex measurements, the model underpredicted NH3 by a factor of 2.4 (NMB = -58%). However, the ratio of the median measured NH3 concentration to the median of the modeled NH3 concentration was 0.8. When compared with the TES measurements, the model underpredicted concentrations of NH3 by a factor of 4.5 (NMB = -77%), with a ratio of the median retrieved NH3 concentration to the median of the modeled NH3 concentration of 3.1. Because the model was the least accurate over agricultural regions, it is likely that the major source of error lies within the agricultural emissions in the National Emissions Inventory. In addition to this, the lack of the use of bidirectional exchange of NH3 in the model could also contribute to the observed bias.
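The normalized mean bias (NMB) quoted above is conventionally computed from paired model-observation values as in the short sketch below; the toy numbers are illustrative only and were chosen so the result matches the magnitude of underprediction reported in the abstract.

```python
import numpy as np

def normalized_mean_bias(modeled, observed):
    """NMB (%) = 100 * sum(M_i - O_i) / sum(O_i) for paired values."""
    modeled = np.asarray(modeled, dtype=float)
    observed = np.asarray(observed, dtype=float)
    return 100.0 * np.sum(modeled - observed) / np.sum(observed)

# Toy NH3 values (ppb); an NMB near -58% indicates underprediction by ~2.4x.
print(normalized_mean_bias([1.0, 2.0, 1.5], [2.4, 4.8, 3.6]))
```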
Overload, autonomy, and burnout as predictors of physicians' quality of care.
Shirom, Arie; Nirel, Nurit; Vinokur, Amiram D
2006-10-01
A model in which perceived overload and burnout mediated the relations of workload and autonomy with physicians' quality of care to their patients was examined. The study was based on data from 890 specialists representing six medical specialties. Including global burnout as well as its three first-order facets of physical fatigue, cognitive weariness, and emotional exhaustion improved the fit between the structural model and the data relative to an alternative model that included only global burnout. Workload (number of work hours) indirectly predicted quality of care through perceived overload. Additionally, the authors found that the paths from the first order factors of emotional exhaustion, physical fatigue, and cognitive weariness predicted quality of care negatively, positively, and nonsignificantly, respectively.
NASA Technical Reports Server (NTRS)
Quattrochi, D. A.; Lapenta, W. M.; Crosson, W. L.; Estes, M. G., Jr.; Limaye, A.; Kahn, M.
2006-01-01
Local and state agencies are responsible for developing state implementation plans to meet National Ambient Air Quality Standards. Numerical models used for this purpose simulate the transport and transformation of criteria pollutants and their precursors. The specification of land use/land cover (LULC) plays an important role in controlling modeled surface meteorology and emissions. NASA researchers have worked with partners and Atlanta stakeholders to incorporate an improved high-resolution LULC dataset for the Atlanta area within their modeling system and to assess meteorological and air quality impacts of Urban Heat Island (UHI) mitigation strategies. The new LULC dataset provides a more accurate representation of land use, has the potential to improve model accuracy, and facilitates prediction of LULC changes. Use of the new LULC dataset for two summertime episodes improved meteorological forecasts, with an existing daytime cold bias of approximately 3 °C reduced by 30%. Model performance for ozone prediction did not show improvement. In addition, LULC changes due to Atlanta area urbanization were predicted through 2030, for which model simulations predict higher urban air temperatures. The incorporation of UHI mitigation strategies partially offset this warming trend. The data and modeling methods used are generally applicable to other U.S. cities.
Ground-water models for water resources planning
Moore, John E.
1980-01-01
In the past decade hydrologists have emphasized the development of computer-based mathematical models to aid in the understanding of flow, the transport of solutes, transport of heat, and deformation in the groundwater system. These models have been used to provide information and predictions for water managers. Too frequently, groundwater was neglected in water-resource planning because managers believed that it could not be adequately evaluated in terms of availability, quality, and effect of development on surface water supplies. Now, however, with newly developed digital groundwater models, effects of development can be predicted. Such models have been used to predict hydrologic and quality changes under different stresses. These models have grown in complexity over the last 10 years from simple one-layer flow models to three-dimensional simulations of groundwater flow which may include solute transport, heat transport, effects of land subsidence, and encroachment of salt water. This paper illustrates, through case histories, how predictive groundwater models have provided the information needed for the sound planning and management of water resources in the United States. (USGS)
Water Quality Analysis Simulation Program (WASP)
The Water Quality Analysis Simulation Program (WASP) model helps users interpret and predict water quality responses to natural phenomena and manmade pollution for various pollution management decisions.
Li, Mingjie; Zhou, Ping; Wang, Hong; ...
2017-09-19
As one of the most important units in the papermaking industry, the high consistency (HC) refining system is confronted with challenges such as improving pulp quality, energy saving, and emissions reduction in its operation processes. In this correspondence, an optimal operation of the HC refining system is presented using nonlinear multiobjective model predictive control strategies that aim at the set-point tracking objective of pulp quality, the economic objective, and the specific energy (SE) consumption objective, respectively. First, a set of input and output data at different times are employed to construct the subprocess model of the state process model for the HC refining system, and the Wiener-type model is then obtained by combining the mechanism model of Canadian Standard Freeness with the state process model, whose structures are determined based on the Akaike information criterion. Second, a multiobjective optimization strategy that simultaneously optimizes the set-point tracking objective of pulp quality and SE consumption is proposed, which uses the NSGA-II approach to obtain the Pareto optimal set. Furthermore, targeting the set-point tracking objective of pulp quality, the economic objective, and the SE consumption objective, the sequential quadratic programming method is utilized to produce the optimal predictive controllers. The simulation results demonstrate that the proposed methods enable the HC refining system to provide better set-point tracking of pulp quality when these predictive controllers are employed. In addition, when the optimal predictive controllers are oriented toward the comprehensive economic objective and the SE consumption objective, they are shown to significantly reduce energy consumption.
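As a minimal, hedged sketch of the Pareto-selection step mentioned above (the NSGA-II machinery itself is omitted), the following code filters candidate controller settings down to their non-dominated set for two minimized objectives, here taken as pulp-quality tracking error and specific energy consumption; the candidate scores are hypothetical.

```python
import numpy as np

def pareto_front(objectives):
    """Return a boolean mask of non-dominated rows for minimized objectives."""
    n = objectives.shape[0]
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        # Row i is dominated if some other row is no worse in every objective
        # and strictly better in at least one.
        dominated_by = (np.all(objectives <= objectives[i], axis=1) &
                        np.any(objectives < objectives[i], axis=1))
        if np.any(dominated_by):
            keep[i] = False
    return keep

# Hypothetical candidate controller settings scored on
# (pulp-quality tracking error, specific energy consumption).
scores = np.array([[0.20, 5.1], [0.15, 6.0], [0.30, 4.2], [0.25, 5.9]])
print(scores[pareto_front(scores)])
```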
Mateescu, R G; Oltenacu, P A; Garmyn, A J; Mafi, G G; VanOverbeke, D L
2016-05-01
Product quality is a high priority for the beef industry because of its importance as a major driver of consumer demand for beef and the ability of the industry to improve it. A 2-prong approach based on implementation of a genetic program to improve eating quality and a system to communicate eating quality and increase the probability that consumers' eating quality expectations are met is outlined. The objectives of this study were 1) to identify the best carcass and meat composition traits to be used in a selection program to improve eating quality and 2) to develop a relatively small number of classes that reflect real and perceptible differences in eating quality that can be communicated to consumers and identify a subset of carcass and meat composition traits with the highest predictive accuracy across all eating quality classes. Carcass traits, meat composition, including Warner-Bratzler shear force (WBSF), intramuscular fat content (IMFC), trained sensory panel scores, and mineral composition traits of 1,666 Angus cattle were used in this study. Three eating quality indexes, EATQ1, EATQ2, and EATQ3, were generated by using different weights for the sensory traits (emphasis on tenderness, flavor, and juiciness, respectively). The best model for predicting eating quality explained 37%, 9%, and 19% of the variability of EATQ1, EATQ2, and EATQ3, and 2 traits, WBSF and IMFC, accounted for most of the variability explained by the best models. EATQ1 combines tenderness, juiciness, and flavor assessed by trained panels with 0.60, 0.15, and 0.25 weights, best describes North American consumers, and has a moderate heritability (0.18 ± 0.06). A selection index (I= -0.5[WBSF] + 0.3[IMFC]) based on phenotypic and genetic variances and covariances can be used to improve eating quality as a correlated trait. The 3 indexes (EATQ1, EATQ2, and EATQ3) were used to generate 3 equal (33.3%) low, medium, and high eating quality classes, and linear combinations of traits that best predict class membership were estimated using a predictive discriminant analysis. The best predictive model to classify new observations into low, medium, and high eating quality classes defined by the EATQ1 index included WBSF, IMFC, HCW, and marbling score and resulted in a total error rate of 47.06%, much lower than the 60.74% error rate when the prediction of class membership was based on the USDA grading system. The 2 best predictors were WBSF and IMFC, and they accounted for 97.2% of the variability explained by the best model.
A prototype surface ozone concentration forecasting model system for the Eastern U.S. has been developed. The model system consists of a regional meteorological model and a regional air quality model. It demonstrated a strong prediction dependence on its ozone boundary conditions....
DOE Office of Scientific and Technical Information (OSTI.GOV)
Louie, Alexander V.; Rodrigues, George, E-mail: george.rodrigues@lhsc.on.ca; Department of Epidemiology/Biostatistics, University of Western Ontario, London, ON
Purpose: To compare the quality-adjusted life expectancy and overall survival in patients with Stage I non-small-cell lung cancer (NSCLC) treated with either stereotactic body radiation therapy (SBRT) or surgery. Methods and Materials: We constructed a Markov model to describe health states after either SBRT or lobectomy for Stage I NSCLC for a 5-year time frame. We report various treatment strategy survival outcomes stratified by age, sex, and pack-year history of smoking, and compared these with an external outcome prediction tool (Adjuvant! Online). Results: Overall survival, cancer-specific survival, and other causes of death as predicted by our model correlated closely with those predicted by the external prediction tool. Overall survival at 5 years as predicted by baseline analysis of our model is in favor of surgery, with a benefit ranging from 2.2% to 3.0% for all cohorts. Mean quality-adjusted life expectancy ranged from 3.28 to 3.78 years after surgery and from 3.35 to 3.87 years for SBRT. The utility threshold for preferring SBRT over surgery was 0.90. Outcomes were sensitive to quality of life, the proportion of local and regional recurrences treated with standard vs. palliative treatments, and the surgery- and SBRT-related mortalities. Conclusions: The role of SBRT in the medically operable patient is yet to be defined. Our model indicates that SBRT may offer comparable overall survival and quality-adjusted life expectancy as compared with surgical resection. Well-powered prospective studies comparing surgery vs. SBRT in early-stage lung cancer are warranted to further investigate the relative survival, quality of life, and cost characteristics of both treatment paradigms.
Patient-specific dosimetric endpoints based treatment plan quality control in radiotherapy.
Song, Ting; Staub, David; Chen, Mingli; Lu, Weiguo; Tian, Zhen; Jia, Xun; Li, Yongbao; Zhou, Linghong; Jiang, Steve B; Gu, Xuejun
2015-11-07
In intensity modulated radiotherapy (IMRT), the optimal plan for each patient is specific due to unique patient anatomy. To achieve such a plan, patient-specific dosimetric goals reflecting each patient's unique anatomy should be defined and adopted in the treatment planning procedure for plan quality control. The aim of this study is to develop such a personalized treatment plan quality control tool by predicting patient-specific dosimetric endpoints (DEs). The incorporation of patient-specific DEs is realized by a multi-OAR geometry-dosimetry model, capable of predicting optimal DEs based on the individual patient's geometry. The overall quality of a treatment plan is then judged with a numerical treatment plan quality indicator and characterized as optimal or suboptimal. Taking advantage of clinically available prostate volumetric modulated arc therapy (VMAT) treatment plans, we built and evaluated our proposed plan quality control tool. Using our developed tool, six of twenty evaluated plans were identified as suboptimal plans. After plan re-optimization, these suboptimal plans achieved better OAR dose sparing without sacrificing the PTV coverage, and the dosimetric endpoints of the re-optimized plans agreed well with the model-predicted values, which validates the predictive capability of the proposed tool. In conclusion, the developed tool is able to accurately predict optimally achievable DEs of multiple OARs, identify suboptimal plans, and guide plan optimization. It is a useful tool for achieving patient-specific treatment plan quality control.
A comparative study of kinetic and connectionist modeling for shelf-life prediction of Basundi mix.
Ruhil, A P; Singh, R R B; Jain, D K; Patel, A A; Patil, G R
2011-04-01
A ready-to-reconstitute formulation of Basundi, a popular Indian dairy dessert, was subjected to storage at various temperatures (10, 25 and 40 °C) and deteriorative changes in the Basundi mix were monitored using quality indices like pH, hydroxyl methyl furfural (HMF), bulk density (BD) and insolubility index (II). The multiple regression equations and the Arrhenius functions that describe the parameters' dependence on temperature for the four physico-chemical parameters were integrated to develop mathematical models for predicting the sensory quality of Basundi mix. A connectionist model using a multilayer feed-forward neural network with a back-propagation algorithm was also developed for predicting the storage life of the product, employing the artificial neural network (ANN) toolbox of MATLAB software. The quality indices served as the input parameters whereas the output parameters were the sensorily evaluated flavour and total sensory score. A total of 140 observations were used and the prediction performance was judged on the basis of percent root mean square error. The results obtained from the two approaches were compared. Relatively lower magnitudes of percent root mean square error for both the sensory parameters indicated that the connectionist models were better fitted than kinetic models for predicting storage life.
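A minimal sketch of the connectionist approach described above, assuming hypothetical data: a small feed-forward network maps the four quality indices to the two sensory scores, and a percent root mean square error is reported as the figure of merit (sklearn's MLPRegressor stands in for the MATLAB ANN toolbox, and the normalization used for the percentage is an assumption).

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

# Hypothetical observations: columns are pH, HMF, bulk density, insolubility
# index; targets are flavour score and total sensory score.
rng = np.random.default_rng(1)
X = rng.normal(size=(140, 4))
y = np.column_stack([2.0 * X[:, 0] - X[:, 1], X[:, 2] - 0.5 * X[:, 3]])

ann = MLPRegressor(hidden_layer_sizes=(6,), max_iter=5000,
                   random_state=1).fit(X, y)
pred = ann.predict(X)

# Percent root mean square error (normalized here by the mean absolute score,
# an illustrative choice rather than the study's exact definition).
rmse = np.sqrt(mean_squared_error(y, pred))
print(100.0 * rmse / np.mean(np.abs(y)))
```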
A time-varying subjective quality model for mobile streaming videos with stalling events
NASA Astrophysics Data System (ADS)
Ghadiyaram, Deepti; Pan, Janice; Bovik, Alan C.
2015-09-01
Over-the-top mobile video streaming is invariably influenced by volatile network conditions which cause playback interruptions (stalling events), thereby impairing users' quality of experience (QoE). Developing models that can accurately predict users' QoE could enable the more efficient design of quality-control protocols for video streaming networks that reduce network operational costs while still delivering high-quality video content to the customers. Existing objective models that predict QoE are based on global video features, such as the number of stall events and their lengths, and are trained and validated on a small pool of ad hoc video datasets, most of which are not publicly available. The model we propose in this work goes beyond previous models as it also accounts for the fundamental effect that a viewer's recent level of satisfaction or dissatisfaction has on their overall viewing experience. In other words, the proposed model accounts for and adapts to the recency, or hysteresis effect caused by a stall event in addition to accounting for the lengths, frequency of occurrence, and the positions of stall events - factors that interact in a complex way to affect a user's QoE. On the recently introduced LIVE-Avvasi Mobile Video Database, which consists of 180 distorted videos of varied content that are afflicted solely with over 25 unique realistic stalling events, we trained and validated our model to accurately predict the QoE, attaining standout QoE prediction performance.
NASA Astrophysics Data System (ADS)
Stajner, I.; Hou, Y. T.; McQueen, J.; Lee, P.; Stein, A. F.; Tong, D.; Pan, L.; Huang, J.; Huang, H. C.; Upadhayay, S.
2016-12-01
NOAA provides operational air quality predictions using the National Air Quality Forecast Capability (NAQFC): ozone and wildfire smoke for the United States and airborne dust for the contiguous 48 states at http://airquality.weather.gov. NOAA's predictions of fine particulate matter (PM2.5) became publicly available in February 2016. Ozone and PM2.5 predictions are produced using a system that operationally links the Community Multiscale Air Quality (CMAQ) model with meteorological inputs from the North American mesoscale forecast Model (NAM). Smoke and dust predictions are provided using the Hybrid Single Particle Lagrangian Integrated Trajectory (HYSPLIT) model. Current NAQFC focus is on updating CMAQ to version 5.0.2, improving PM2.5 predictions, and updating emissions estimates, especially for NOx using recently observed trends. Wildfire smoke emissions from a newer version of the USFS BlueSky system are being included in a new configuration of the NAQFC NAM-CMAQ system, which is re-run for the previous 24 hours when the wildfires were observed from satellites, to better represent wildfire emissions prior to initiating predictions for the next 48 hours. In addition, NOAA is developing the Next Generation Global Prediction System (NGGPS) to represent the earth system for extended weather prediction. NGGPS will include a representation of atmospheric dynamics, physics, aerosols and atmospheric composition as well as coupling with ocean, wave, ice and land components. NGGPS is being developed with broad community involvement, including community-developed components and academic research to develop and test improvements for potential inclusion in NGGPS. Several investigators at NOAA's research laboratories and in academia are working to improve the aerosol and gaseous chemistry representation for NGGPS, to develop and evaluate the representation of atmospheric composition, and to establish and improve the coupling with radiation and microphysics. Additional efforts may include the improved use of predicted atmospheric composition in assimilation of observations and the linkage of full global atmospheric composition predictions with national air quality predictions.
Barba, Lida; Sánchez-Macías, Davinia; Barba, Iván; Rodríguez, Nibaldo
2018-06-01
Guinea pig meat consumption is increasing exponentially worldwide. Evaluating the contribution of carcass components to carcass quality can potentially allow estimation of the value added to foods of animal origin and make research on guinea pigs more practicable. The aim of this study was to propose a methodology for modelling the contribution of different carcass components to the overall carcass quality of guinea pigs by using non-invasive pre- and post-mortem carcass measurements. The selection of predictors was carried out through correlation analysis and statistical significance, whereas the prediction models were based on Multiple Linear Regression. The prediction results showed higher accuracy when carcass component contributions were expressed in grams than when they were expressed as a percentage of carcass quality components. The proposed prediction models can be useful for the guinea pig meat industry and research institutions by using non-invasive and time- and cost-efficient carcass component measuring techniques. Copyright © 2018 Elsevier Ltd. All rights reserved.
Prediction on carbon dioxide emissions based on fuzzy rules
NASA Astrophysics Data System (ADS)
Pauzi, Herrini; Abdullah, Lazim
2014-06-01
There are several ways to predict air quality, varying from simple regression to models based on artificial intelligence. Most of the conventional methods are not sufficiently able to provide good forecasting performance due to problems with non-linearity, uncertainty and complexity of the data. Artificial intelligence techniques are successfully used in modeling air quality in order to cope with these problems. This paper describes a fuzzy inference system (FIS) to predict CO2 emissions in Malaysia. Furthermore, an adaptive neuro-fuzzy inference system (ANFIS) is used to compare the prediction performance. Data on five variables: energy use, gross domestic product per capita, population density, combustible renewables and waste, and CO2 intensity are employed in this comparative study. The results from the two proposed models are compared and it is clearly shown that ANFIS outperforms FIS in CO2 prediction.
In this paper, the concept of scale analysis is applied to evaluate ozone predictions from two regional-scale air quality models. To this end, seasonal time series of observations and predictions from the RAMS3b/UAM-V and MM5/MAQSIP (SMRAQ) modeling systems for ozone were spectra...
Testing and analysis of internal hardwood log defect prediction models
R. Edward Thomas
2011-01-01
The severity and location of internal defects determine the quality and value of lumber sawn from hardwood logs. Models have been developed to predict the size and position of internal defects based on external defect indicator measurements. These models were shown to predict approximately 80% of all internal knots based on external knot indicators. However, the size...
Heddam, Salim
2016-09-01
This paper proposes a multilayer perceptron neural network (MLPNN) to predict phycocyanin (PC) pigment using water quality variables as predictors. In the proposed model, four water quality variables, namely water temperature, dissolved oxygen, pH, and specific conductance, were selected as the inputs for the MLPNN model, and PC as the output. To demonstrate the capability and usefulness of the MLPNN model, a total of 15,849 data points measured at 15-min intervals are used for the development of the model. The data were collected at the lower Charles River buoy and are available from the US Environmental Protection Agency (USEPA). For comparison purposes, a multiple linear regression (MLR) model, which was frequently used for predicting water quality variables in previous studies, is also built. The performances of the models are evaluated using a set of widely used statistical indices. The performance of the MLPNN and MLR models is compared with the measured data. The obtained results show that (i) all the proposed MLPNN models are more accurate than the MLR models and (ii) the results obtained are very promising and encouraging for the development of phycocyanin-predictive models.
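A minimal sketch of the comparison described above, assuming synthetic data: a multilayer perceptron and a multiple linear regression are fitted to the same four inputs (water temperature, dissolved oxygen, pH, specific conductance) to predict phycocyanin, and held-out R2 values are compared.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Hypothetical 15-min records: water temperature, dissolved oxygen, pH, and
# specific conductance as inputs; phycocyanin (PC) as the target.
rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 4))
pc = np.sin(X[:, 0]) + 0.5 * X[:, 1] - 0.2 * X[:, 2] + rng.normal(scale=0.1, size=1000)

X_tr, X_te, y_tr, y_te = train_test_split(X, pc, random_state=0)

mlr = LinearRegression().fit(X_tr, y_tr)
mlp = MLPRegressor(hidden_layer_sizes=(10,), max_iter=3000,
                   random_state=0).fit(X_tr, y_tr)

print("MLR   R2:", r2_score(y_te, mlr.predict(X_te)))
print("MLPNN R2:", r2_score(y_te, mlp.predict(X_te)))
```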
Air Quality Response Modeling for Decision Support | Science ...
Air quality management relies on photochemical models to predict the responses of pollutant concentrations to changes in emissions. Such modeling is especially important for secondary pollutants such as ozone and fine particulate matter which vary nonlinearly with changes in emissions. Numerous techniques for probing pollutant-emission relationships within photochemical models have been developed and deployed for a variety of decision support applications. However, atmospheric response modeling remains complicated by the challenge of validating sensitivity results against observable data. This manuscript reviews the state of the science of atmospheric response modeling as well as efforts to characterize the accuracy and uncertainty of sensitivity results. The National Exposure Research Laboratory's (NERL's) Atmospheric Modeling and Analysis Division (AMAD) conducts research in support of EPA's mission to protect human health and the environment. AMAD's research program is engaged in developing and evaluating predictive atmospheric models on all spatial and temporal scales for forecasting the Nation's air quality and for assessing changes in air quality and air pollutant exposures, as affected by changes in ecosystem management and regulatory decisions. AMAD is responsible for providing a sound scientific and technical basis for regulatory policies based on air quality models to improve ambient air quality. The models developed by AMAD are being use
Kona, Ravikanth; Fahmy, Raafat M; Claycamp, Gregg; Polli, James E; Martinez, Marilyn; Hoag, Stephen W
2015-02-01
The objective of this study is to use near-infrared spectroscopy (NIRS) coupled with multivariate chemometric models to monitor granule and tablet quality attributes in the formulation development and manufacturing of ciprofloxacin hydrochloride (CIP) immediate release tablets. Critical roller compaction process parameters, compression force (CFt), and formulation variables identified from our earlier studies were evaluated in more detail. Multivariate principal component analysis (PCA) and partial least squares (PLS) models were developed during the development stage and used as a control tool to predict the quality of granules and tablets. Validated models were used to monitor and control batches manufactured at different sites to assess their robustness to change. The results showed that roll pressure (RP) and CFt played a critical role in the quality of the granules and the finished product within the range tested. Replacing the binder source did not statistically influence the quality attributes of the granules and tablets. However, lubricant type significantly impacted granule size. Blend uniformity, crushing force, and disintegration time during manufacturing were predicted using validated PLS regression models with acceptable standard error of prediction (SEP) values, whereas the models resulted in higher SEP for batches obtained from a different manufacturing site. From this study, we were able to identify critical factors that could impact the quality attributes of the CIP IR tablets. In summary, we demonstrated the ability of near-infrared spectroscopy coupled with chemometrics as a powerful tool to monitor critical quality attributes (CQAs) identified during formulation development.
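As a hedged illustration of this kind of chemometric workflow (not the authors' calibration), the sketch below fits a PLS regression relating synthetic NIR spectra to a tablet attribute such as crushing force and reports the standard error of prediction (SEP) on a held-out set; the data, the number of latent variables, and the SEP formula used here are assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

# Hypothetical NIR spectra (rows = tablets, columns = wavelengths) and a
# measured quality attribute such as crushing force.
rng = np.random.default_rng(7)
spectra = rng.normal(size=(120, 300))
crushing_force = spectra[:, 50] * 3.0 + spectra[:, 120] + rng.normal(scale=0.2, size=120)

X_cal, X_val, y_cal, y_val = train_test_split(spectra, crushing_force, random_state=0)

pls = PLSRegression(n_components=5).fit(X_cal, y_cal)
residuals = y_val - pls.predict(X_val).ravel()

# Standard error of prediction (SEP) on the validation set,
# computed here as the bias-corrected standard deviation of the residuals.
sep = np.sqrt(np.mean((residuals - residuals.mean()) ** 2))
print(sep)
```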
Forecasting PM10 in metropolitan areas: Efficacy of neural networks.
Fernando, H J S; Mammarella, M C; Grandoni, G; Fedele, P; Di Marco, R; Dimitrova, R; Hyde, P
2012-04-01
Deterministic photochemical air quality models are commonly used for regulatory management and planning of urban airsheds. These models are complex, computer intensive, and hence are prohibitively expensive for routine air quality predictions. Stochastic methods are becoming increasingly popular as an alternative, which relegate decision making to artificial intelligence based on Neural Networks that are made of artificial neurons or 'nodes' capable of 'learning through training' via historic data. A Neural Network was used to predict particulate matter concentration at a regulatory monitoring site in Phoenix, Arizona; its development, efficacy as a predictive tool and performance vis-à-vis a commonly used regulatory photochemical model are described in this paper. It is concluded that Neural Networks are much easier, quicker and economical to implement without compromising the accuracy of predictions. Neural Networks can be used to develop rapid air quality warning systems based on a network of automated monitoring stations. Copyright © 2011 Elsevier Ltd. All rights reserved.
Defect measurement and analysis of JPL ground software: a case study
NASA Technical Reports Server (NTRS)
Powell, John D.; Spagnuolo, John N., Jr.
2004-01-01
Ground software systems at JPL must meet high assurance standards while remaining on schedule due to relatively immovable launch dates for spacecraft that will be controlled by such systems. Toward this end, the Software Quality Improvement (SQI) project's Measurement and Benchmarking (M&B) team is collecting and analyzing defect data of JPL ground system software projects to build software defect prediction models. The aim of these models is to improve predictability with regard to software quality activities. Predictive models will quantitatively define typical trends for JPL ground systems as well as Critical Discriminators (CDs) to provide explanations for atypical deviations from the norm at JPL. CDs are software characteristics that can be estimated or foreseen early in a software project's planning. Thus, these CDs will assist in planning for the predicted degree to which software quality activities for a project are likely to deviate from the normal JPL ground system based on past experience across the lab.
Urban Landscape Characterization Using Remote Sensing Data For Input into Air Quality Modeling
NASA Technical Reports Server (NTRS)
Quattrochi, Dale A.; Estes, Maurice G., Jr.; Crosson, William; Khan, Maudood
2005-01-01
The urban landscape is inherently complex and this complexity is not adequately captured in air quality models that are used to assess whether urban areas are in attainment of EPA air quality standards, particularly for ground level ozone. This inadequacy of air quality models to sufficiently respond to the heterogeneous nature of the urban landscape can impact how well these models predict ozone pollutant levels over metropolitan areas and ultimately, whether cities exceed EPA ozone air quality standards. We are exploring the utility of high-resolution remote sensing data and urban growth projections as improved inputs to meteorological and air quality models focusing on the Atlanta, Georgia metropolitan area as a case study. The National Land Cover Dataset at 30m resolution is being used as the land use/land cover input and aggregated to the 4km scale for the MM5 mesoscale meteorological model and the Community Multiscale Air Quality (CMAQ) modeling schemes. Use of these data has been found to better characterize low density/suburban development as compared with USGS 1 km land use/land cover data that have traditionally been used in modeling. Air quality prediction for future scenarios to 2030 is being facilitated by land use projections using a spatial growth model. Land use projections were developed using the 2030 Regional Transportation Plan developed by the Atlanta Regional Commission. This allows the state environmental protection agency to evaluate how these transportation plans will affect future air quality.
A spectral method for spatial downscaling
Complex computer models play a crucial role in air quality research. These models are used to evaluate potential regulatory impacts of emission control strategies and to estimate air quality in areas without monitoring data. For both of these purposes, it is important to calibrate model output with monitoring data to adjust for model biases and improve spatial prediction. In this paper, we propose a new spectral method to study and exploit complex relationships between model output and monitoring data. Spectral methods allow us to estimate the relationship between model output and monitoring data separately at different spatial scales, and to use model output for prediction only at the appropriate scales. The proposed method is computationally efficient and can be implemented using standard software. We apply the method to compare Community Multiscale Air Quality (CMAQ) model output with ozone measurements in the United States in July 2005. We find that CMAQ captures large-scale spatial trends, but has low correlation with the monitoring data at small spatial scales.
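The scale-separation idea can be sketched as follows: a smoothed (low-pass) version of a gridded field stands in for its large-scale component and the residual for its small-scale component, so each can be correlated with observations separately. This is a simplified surrogate for the paper's spectral method, run on synthetic fields.

```python
# Minimal sketch of scale separation: split a gridded model field into large-
# and small-scale components and correlate each with observations.
# A Gaussian low-pass filter stands in for a full spectral decomposition.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(1)
truth_large = gaussian_filter(rng.normal(size=(100, 100)), sigma=10)   # large-scale signal
truth_small = rng.normal(scale=0.3, size=(100, 100))                   # small-scale signal
obs = truth_large + truth_small
model_field = truth_large + rng.normal(scale=0.3, size=(100, 100))     # model misses small scales

def split_scales(field, sigma=10):
    large = gaussian_filter(field, sigma=sigma)
    return large, field - large

pairs = {"large": (split_scales(model_field)[0], split_scales(obs)[0]),
         "small": (split_scales(model_field)[1], split_scales(obs)[1])}
for name, (m, o) in pairs.items():
    r = np.corrcoef(m.ravel(), o.ravel())[0, 1]
    print(f"{name}-scale correlation: {r:.2f}")
```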
A diagnostic model for studying daytime urban air quality trends
NASA Technical Reports Server (NTRS)
Brewer, D. A.; Remsberg, E. E.; Woodbury, G. E.
1981-01-01
A single cell Eulerian photochemical air quality simulation model was developed and validated for selected days of the 1976 St. Louis Regional Air Pollution Study (RAPS) data sets; parameterizations of variables in the model and validation studies using the model are discussed. Good agreement was obtained between measured and modeled concentrations of NO, CO, and NO2 for all days simulated. The maximum concentration of O3 was also predicted well. Predicted species concentrations were relatively insensitive to small variations in CO and NOx emissions and to the concentrations of species which are entrained as the mixed layer rises.
NASA Astrophysics Data System (ADS)
Pandremmenou, K.; Shahid, M.; Kondi, L. P.; Lövström, B.
2015-03-01
In this work, we propose a No-Reference (NR) bitstream-based model for predicting the quality of H.264/AVC video sequences affected by both compression artifacts and transmission impairments. The proposed model is based on a feature extraction procedure, in which a large number of features are calculated from the packet-loss-impaired bitstream. Many of the features are proposed here for the first time, and the specific feature set as a whole is applied for the first time to NR video quality prediction. All feature observations are taken as input to the Least Absolute Shrinkage and Selection Operator (LASSO) regression method. LASSO indicates the most important features, and using only them, it is possible to estimate the Mean Opinion Score (MOS) with high accuracy. Indicatively, we point out that only 13 features are able to produce a Pearson Correlation Coefficient of 0.92 with the MOS. Interestingly, the performance statistics we computed in order to assess our method for predicting the Structural Similarity Index and the Video Quality Metric are equally good. Thus, the obtained experimental results verified the suitability of the features selected by LASSO as well as the ability of LASSO to make accurate predictions through sparse modeling.
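A minimal sketch of the LASSO step is shown below, assuming synthetic stand-ins for the bitstream features and MOS labels; LassoCV selects a sparse feature subset and the Pearson correlation of its predictions with MOS is reported.

```python
# Hedged sketch: LASSO regression selecting a sparse subset of candidate
# bitstream features to predict MOS. Features and labels are synthetic.
import numpy as np
from sklearn.linear_model import LassoCV
from scipy.stats import pearsonr

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 60))                    # 60 candidate bitstream features
true_coef = np.zeros(60)
true_coef[:13] = rng.uniform(0.5, 1.5, 13)        # only a few features truly matter
mos = X @ true_coef + rng.normal(scale=0.5, size=200)

lasso = LassoCV(cv=5).fit(X, mos)
selected = np.flatnonzero(lasso.coef_)
r, _ = pearsonr(lasso.predict(X), mos)
print(f"{selected.size} features selected, Pearson r = {r:.2f}")
```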
Rossato, Luana T; Barbosa, Cinthia D; Nahas, Paula C; Orsatti, Fábio L; de Oliveira, Erick P
2018-04-01
Low strength and/or lean mass quality are associated with higher hospitalization and mortality. The aim of this study was to evaluate the main demographic and anthropometric predictors of strength and lean mass quality in hospitalized patients. We evaluated 136 patients (18-86 years) of both sexes admitted to a public hospital. Waist circumference (WC) was measured using an inelastic tape, lean mass (LM) was assessed by bioimpedance, and handgrip strength (HGS) was measured using a dynamometer. Lean mass quality (HGS/LM) was also calculated. We noted that LM predicted 33.1% of HGS, whereas WC was not associated with HGS. Evaluating LM and WC in the same statistical model, WC (β = -0.249, p = 0.001) increased the prediction of HGS by 4.7% when compared to LM alone. When LM, WC, age, and sex were assessed in the same model, the prediction of HGS increased by 7.3% compared to LM alone, but only LM and sex were significant. In addition, WC predicted lean mass quality by 4% (β = -0.205, p = 0.016), and when WC, sex, and age were placed in the same model, WC (β = -0.172, p = 0.035) and sex (β = 0.332, p < 0.001) explained 15% of the variation in lean mass quality. The main predictor of lower HGS was lower LM, whereas sex showed a low association. Furthermore, although a low association was found, higher abdominal obesity and sex predicted lower lean mass quality. Copyright © 2018 European Society for Clinical Nutrition and Metabolism. Published by Elsevier Ltd. All rights reserved.
Chipman, Jonathan J; Sanda, Martin G; Dunn, Rodney L; Wei, John T; Litwin, Mark S; Crociani, Catrina M; Regan, Meredith M; Chang, Peter
2014-03-01
We expanded the clinical usefulness of EPIC-CP (Expanded Prostate Cancer Index Composite for Clinical Practice) by evaluating its responsiveness to health related quality of life changes, defining the minimally important differences for an individual patient change in each domain and applying it to a sexual outcome prediction model. In 1,201 subjects from a previously described multicenter longitudinal cohort we modeled the EPIC-CP domain scores of each treatment group before treatment, and at short-term and long-term followup. We considered a posttreatment domain score change from pretreatment of 0.5 SD or greater clinically significant and p ≤ 0.01 statistically significant. We determined the domain minimally important differences using the pooled 0.5 SD of the 2, 6, 12 and 24-month posttreatment changes from pretreatment values. We then recalibrated an EPIC-CP based nomogram model predicting 2-year post-prostatectomy functional erection from that developed using EPIC-26. For each health related quality of life domain EPIC-CP was sensitive to similar posttreatment health related quality of life changes with time, as was observed using EPIC-26. The EPIC-CP minimally important differences in changes in the urinary incontinence, urinary irritation/obstruction, bowel, sexual and vitality/hormonal domains were 1.0, 1.3, 1.2, 1.6 and 1.0, respectively. The EPIC-CP based sexual prediction model performed well (AUC 0.76). It showed robust agreement with its EPIC-26 based counterpart with 10% or less predicted probability differences between models in 95% of individuals and a mean ± SD difference of 0.0 ± 0.05 across all individuals. EPIC-CP is responsive to health related quality of life changes during convalescence and it can be used to predict 2-year post-prostatectomy sexual outcomes. It can facilitate shared medical decision making and patient centered care. Copyright © 2014 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Curry, Timothy J.; Batterson, James G. (Technical Monitor)
2000-01-01
Low order equivalent system (LOES) models for the Tu-144 supersonic transport aircraft were identified from flight test data. The mathematical models were given in terms of transfer functions with a time delay by the military standard MIL-STD-1797A, "Flying Qualities of Piloted Aircraft," and the handling qualities were predicted from the estimated transfer function coefficients. The coefficients and the time delay in the transfer functions were estimated using a nonlinear equation error formulation in the frequency domain. Flight test data from pitch, roll, and yaw frequency sweeps at various flight conditions were used for parameter estimation. Flight test results are presented in terms of the estimated parameter values, their standard errors, and output fits in the time domain. Data from doublet maneuvers at the same flight conditions were used to assess the predictive capabilities of the identified models. The identified transfer function models fit the measured data well and demonstrated good prediction capabilities. The Tu-144 was predicted to be between level 2 and 3 for all longitudinal maneuvers and level 1 for all lateral maneuvers. High estimates of the equivalent time delay in the transfer function model caused the poor longitudinal ratings.
Deep supervised dictionary learning for no-reference image quality assessment
NASA Astrophysics Data System (ADS)
Huang, Yuge; Liu, Xuesong; Tian, Xiang; Zhou, Fan; Chen, Yaowu; Jiang, Rongxin
2018-03-01
We propose a deep convolutional neural network (CNN) for general no-reference image quality assessment (NR-IQA), i.e., accurate prediction of image quality without a reference image. The proposed model consists of three components: a local feature extractor that is a fully convolutional network, an encoding module with an inherent dictionary that aggregates local features to output a fixed-length global quality-aware image representation, and a regression module that maps the representation to an image quality score. Our model can be trained in an end-to-end manner, and all of the parameters, including the weights of the convolutional layers, the dictionary, and the regression weights, are simultaneously learned from the loss function. In addition, the model can predict quality scores for input images of arbitrary sizes in a single step. We tested our method on commonly used image quality databases and showed that its performance is comparable with that of state-of-the-art general-purpose NR-IQA algorithms.
Modeling the Effects of Conservation Tillage on Water Quality at the Field Scale
USDA-ARS?s Scientific Manuscript database
The development and application of predictive tools to quantitatively assess the effects of tillage and related management activities should be carefully tested against high quality field data. This study reports on: 1) the calibration and validation of the Root Zone Water Quality Model (RZWQM) to a...
A hybrid model for predicting carbon monoxide from vehicular exhausts in urban environments
NASA Astrophysics Data System (ADS)
Gokhale, Sharad; Khare, Mukesh
Several deterministic air quality models evaluate and predict the frequently occurring pollutant concentrations well but, in general, are incapable of predicting the 'extreme' concentrations. In contrast, statistical distribution models overcome this limitation of the deterministic models and predict the 'extreme' concentrations. However, environmental damage is caused both by the extremes and by the sustained average concentration of pollutants, so a model should predict not only the 'extreme' ranges but also the 'middle' ranges of pollutant concentrations, i.e. the entire range. Hybrid modelling is one technique that estimates the entire range of the distribution of pollutant concentrations by combining deterministic models with suitable statistical distribution models (Jakeman et al., 1988). In the present paper, a hybrid model has been developed to predict carbon monoxide (CO) concentration distributions at a traffic intersection, the Income Tax Office (ITO), in Delhi, where the traffic is heterogeneous in nature, consisting of light vehicles, heavy vehicles, three-wheelers (auto rickshaws) and two-wheelers (scooters, motorcycles, etc.), and the meteorology is 'tropical'. The model combines the general finite line source model (GFLSM) as its deterministic component and the log-logistic distribution (LLD) model as its statistical component. The hybrid (GFLSM-LLD) model is then applied at the ITO intersection. The results show that the hybrid model predictions match the observed CO concentration data within the 5-99 percentile range. The model is further validated at a different street location, the Sirifort roadway, where it predicts CO concentrations fairly well (d=0.91) in the 10-95 percentile range. A regulatory compliance analysis is also developed to estimate the probability that hourly CO concentrations exceed the National Ambient Air Quality Standards (NAAQS) of India.
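The statistical component can be illustrated by fitting a log-logistic distribution to hourly CO concentrations and evaluating the probability of exceeding a standard; the synthetic data and the threshold used below are assumptions for illustration, not the paper's values.

```python
# Hedged sketch of the statistical component: fit a log-logistic distribution to
# hourly CO concentrations and estimate the probability of exceeding a standard.
import numpy as np
from scipy.stats import fisk   # fisk is the log-logistic distribution in SciPy

rng = np.random.default_rng(7)
co_hourly = fisk.rvs(c=3.0, scale=2.0, size=1000, random_state=rng)  # synthetic CO data (mg/m3)

c, loc, scale = fisk.fit(co_hourly, floc=0)     # fit shape and scale, location fixed at 0
threshold = 4.0                                 # hypothetical hourly standard (mg/m3)
p_exceed = fisk.sf(threshold, c, loc=loc, scale=scale)
print(f"P(CO > {threshold} mg/m3) = {p_exceed:.3f}")
```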
NASA Astrophysics Data System (ADS)
Benjankar, R. M.; Sohrabi, M.; Tonina, D.; McKean, J. A.
2013-12-01
Aquatic habitat models utilize flow variables which may be predicted with one-dimensional (1D) or two-dimensional (2D) hydrodynamic models to simulate aquatic habitat quality. Studies focusing on the effects of hydrodynamic model dimensionality on predicted aquatic habitat quality are limited. Here we present the analysis of the impact of flow variables predicted with 1D and 2D hydrodynamic models on simulated spatial distribution of habitat quality and Weighted Usable Area (WUA) for fall-spawning Chinook salmon. Our study focuses on three river systems located in central Idaho (USA), which are a straight and pool-riffle reach (South Fork Boise River), small pool-riffle sinuous streams in a large meadow (Bear Valley Creek) and a steep-confined plane-bed stream with occasional deep forced pools (Deadwood River). We consider low and high flows in simple and complex morphologic reaches. Results show that 1D and 2D modeling approaches have effects on both the spatial distribution of the habitat and WUA for both discharge scenarios, but we did not find noticeable differences between complex and simple reaches. In general, the differences in WUA were small, but depended on stream type. Nevertheless, spatially distributed habitat quality difference is considerable in all streams. The steep-confined plane bed stream had larger differences between aquatic habitat quality defined with 1D and 2D flow models compared to results for streams with well defined macro-topographies, such as pool-riffle bed forms. KEY WORDS: one- and two-dimensional hydrodynamic models, habitat modeling, weighted usable area (WUA), hydraulic habitat suitability, high and low discharges, simple and complex reaches
Sakoda, Lori C; Henderson, Louise M; Caverly, Tanner J; Wernli, Karen J; Katki, Hormuzd A
2017-12-01
Risk prediction models may be useful for facilitating effective and high-quality decision-making at critical steps in the lung cancer screening process. This review provides a current overview of published lung cancer risk prediction models and their applications to lung cancer screening and highlights both challenges and strategies for improving their predictive performance and use in clinical practice. Since the 2011 publication of the National Lung Screening Trial results, numerous prediction models have been proposed to estimate the probability of developing or dying from lung cancer or the probability that a pulmonary nodule is malignant. Respective models appear to exhibit high discriminatory accuracy in identifying individuals at highest risk of lung cancer or differentiating malignant from benign pulmonary nodules. However, validation and critical comparison of the performance of these models in independent populations are limited. Little is also known about the extent to which risk prediction models are being applied in clinical practice and influencing decision-making processes and outcomes related to lung cancer screening. Current evidence is insufficient to determine which lung cancer risk prediction models are most clinically useful and how to best implement their use to optimize screening effectiveness and quality. To address these knowledge gaps, future research should be directed toward validating and enhancing existing risk prediction models for lung cancer and evaluating the application of model-based risk calculators and its corresponding impact on screening processes and outcomes.
Moore, Kevin L; Schmidt, Rachel; Moiseenko, Vitali; Olsen, Lindsey A; Tan, Jun; Xiao, Ying; Galvin, James; Pugh, Stephanie; Seider, Michael J; Dicker, Adam P; Bosch, Walter; Michalski, Jeff; Mutic, Sasa
2015-06-01
The purpose of this study was to quantify the frequency and clinical severity of quality deficiencies in intensity modulated radiation therapy (IMRT) planning in the Radiation Therapy Oncology Group 0126 protocol. A total of 219 IMRT patients from the high-dose arm (79.2 Gy) of RTOG 0126 were analyzed. To quantify plan quality, we used established knowledge-based methods for patient-specific dose-volume histogram (DVH) prediction of organs at risk and a Lyman-Kutcher-Burman (LKB) model for grade ≥2 rectal complications to convert DVHs into normal tissue complication probabilities (NTCPs). The LKB model was validated by fitting dose-response parameters relative to observed toxicities. The 90th percentile (22 of 219) of plans with the lowest excess risk (difference between clinical and model-predicted NTCP) were used to create a model for the presumed best practices in the protocol (pDVH0126,top10%). Applying the resultant model to the entire sample enabled comparisons between DVHs that patients could have received to DVHs they actually received. Excess risk quantified the clinical impact of suboptimal planning. Accuracy of pDVH predictions was validated by replanning 30 of 219 patients (13.7%), including equal numbers of presumed "high-quality," "low-quality," and randomly sampled plans. NTCP-predicted toxicities were compared to adverse events on protocol. Existing models showed that bladder-sparing variations were less prevalent than rectum quality variations and that increased rectal sparing was not correlated with target metrics (dose received by 98% and 2% of the PTV, respectively). Observed toxicities were consistent with current LKB parameters. Converting DVH and pDVH0126,top10% to rectal NTCPs, we observed 94 of 219 patients (42.9%) with ≥5% excess risk, 20 of 219 patients (9.1%) with ≥10% excess risk, and 2 of 219 patients (0.9%) with ≥15% excess risk. Replanning demonstrated the predicted NTCP reductions while maintaining the volume of the PTV receiving prescription dose. An equivalent sample of high-quality plans showed fewer toxicities than low-quality plans, 6 of 73 versus 10 of 73 respectively, although these differences were not significant (P=.21) due to insufficient statistical power in this retrospective study. Plan quality deficiencies in RTOG 0126 exposed patients to substantial excess risk for rectal complications. Copyright © 2015 Elsevier Inc. All rights reserved.
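A sketch of the Lyman-Kutcher-Burman (LKB) calculation used to convert a rectal DVH into a complication probability is given below; the parameter values and the toy DVH are placeholders, not the fitted values from the protocol analysis.

```python
# Illustrative sketch of the LKB NTCP model: DVH -> generalized EUD -> NTCP.
# Parameter values (td50, m, n) and the toy DVH are placeholders only.
import numpy as np
from scipy.stats import norm

def lkb_ntcp(dose_bins, rel_volumes, td50=76.9, m=0.13, n=0.09):
    """dose_bins: dose per DVH bin (Gy); rel_volumes: fractional volume per bin (sums to 1)."""
    geud = np.sum(rel_volumes * dose_bins ** (1.0 / n)) ** n   # generalized EUD
    t = (geud - td50) / (m * td50)
    return norm.cdf(t)

# Toy differential DVH: 40% of rectum at 30 Gy, 40% at 60 Gy, 20% at 75 Gy
dose = np.array([30.0, 60.0, 75.0])
vol = np.array([0.4, 0.4, 0.2])
print(f"Predicted NTCP: {lkb_ntcp(dose, vol):.3f}")
```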
The environmental fluid dynamics code (EFDC) was used to study the three dimensional (3D) circulation, water quality, and ecology in Narragansett Bay, RI. Predictions of the Bay hydrodynamics included the behavior of the water surface elevation, currents, salinity, and temperatur...
NASA Astrophysics Data System (ADS)
Honti, Mark; Schuwirth, Nele; Rieckermann, Jörg; Stamm, Christian
2017-03-01
The design and evaluation of solutions for integrated surface water quality management requires an integrated modelling approach. Integrated models have to be comprehensive enough to cover the aspects relevant for management decisions, allowing for mapping of larger-scale processes such as climate change to the regional and local contexts. Besides this, models have to be sufficiently simple and fast to apply proper methods of uncertainty analysis, covering model structure deficits and error propagation through the chain of sub-models. Here, we present a new integrated catchment model satisfying both conditions. The conceptual iWaQa
model was developed to support the integrated management of small streams. It can be used to predict traditional water quality parameters, such as nutrients, and a wide set of organic micropollutants (plant and material protection products), by considering all major pollutant pathways in urban and agricultural environments. Due to its simplicity, the model allows for a full, propagative analysis of predictive uncertainty, including certain structural and input errors. The usefulness of the model is demonstrated by predicting future surface water quality in a small catchment with mixed land use in the Swiss Plateau. We consider climate change, population growth or decline, socio-economic development, and the implementation of management strategies to tackle urban and agricultural point and non-point sources of pollution. Our results indicate that input and model structure uncertainties are the most influential factors for certain water quality parameters. In these cases model uncertainty is already high for present conditions. Nevertheless, accounting for today's uncertainty makes management fairly robust to the foreseen range of potential changes in the next decades. The assessment of total predictive uncertainty allows for selecting management strategies that show small sensitivity to poorly known boundary conditions. The identification of important sources of uncertainty helps to guide future monitoring efforts and pinpoints key indicators, whose evolution should be closely followed to adapt management. The possible impact of climate change is clearly demonstrated by the fact that predicted water quality changes substantially depending on the individual climate model chain. However, when all climate trajectories are combined, human land use and management decisions have the larger influence on water quality over the study's 2050 time horizon.
Kramer, Andrew A; Higgins, Thomas L; Zimmerman, Jack E
2014-03-01
To examine the accuracy of the original Mortality Probability Admission Model III, ICU Outcomes Model/National Quality Forum modification of Mortality Probability Admission Model III, and Acute Physiology and Chronic Health Evaluation IVa models for comparing observed and risk-adjusted hospital mortality predictions. Retrospective paired analyses of day 1 hospital mortality predictions using three prognostic models. Fifty-five ICUs at 38 U.S. hospitals from January 2008 to December 2012. Among 174,001 intensive care admissions, 109,926 met model inclusion criteria and 55,304 had data for mortality prediction using all three models. None. We compared patient exclusions and the discrimination, calibration, and accuracy for each model. Acute Physiology and Chronic Health Evaluation IVa excluded 10.7% of all patients, ICU Outcomes Model/National Quality Forum 20.1%, and Mortality Probability Admission Model III 24.1%. Discrimination of Acute Physiology and Chronic Health Evaluation IVa was superior with area under receiver operating curve (0.88) compared with Mortality Probability Admission Model III (0.81) and ICU Outcomes Model/National Quality Forum (0.80). Acute Physiology and Chronic Health Evaluation IVa was better calibrated (lowest Hosmer-Lemeshow statistic). The accuracy of Acute Physiology and Chronic Health Evaluation IVa was superior (adjusted Brier score = 31.0%) to that for Mortality Probability Admission Model III (16.1%) and ICU Outcomes Model/National Quality Forum (17.8%). Compared with observed mortality, Acute Physiology and Chronic Health Evaluation IVa overpredicted mortality by 1.5% and Mortality Probability Admission Model III by 3.1%; ICU Outcomes Model/National Quality Forum underpredicted mortality by 1.2%. Calibration curves showed that Acute Physiology and Chronic Health Evaluation performed well over the entire risk range, unlike the Mortality Probability Admission Model and ICU Outcomes Model/National Quality Forum models. Acute Physiology and Chronic Health Evaluation IVa had better accuracy within patient subgroups and for specific admission diagnoses. Acute Physiology and Chronic Health Evaluation IVa offered the best discrimination and calibration on a large common dataset and excluded fewer patients than Mortality Probability Admission Model III or ICU Outcomes Model/National Quality Forum. The choice of ICU performance benchmarks should be based on a comparison of model accuracy using data for identical patients.
Watershed Models for Predicting Nitrogen Loads from Artificially Drained Lands
R. Wayne Skaggs; George M. Chescheir; Glenn Fernandez; Devendra M. Amatya
2003-01-01
Non-point sources of pollutants originate at the field scale but water quality problems usually occur at the watershed or basin scale. This paper describes a series of models developed for poorly drained watersheds. The models use DRAINMOD to predict hydrology at the field scale and a range of methods to predict channel hydraulics and nitrogen transport. In-stream...
Li, Jia; Xia, Yunni; Luo, Xin
2014-01-01
OWL-S, one of the most important Semantic Web service ontologies proposed to date, provides a core ontological framework and guidelines for describing the properties and capabilities of web services in an unambiguous, computer-interpretable form. Predicting the reliability of composite service processes specified in OWL-S allows service users to decide whether the process meets the quantitative quality requirement. In this study, we consider the runtime quality of services to be fluctuating and introduce a dynamic framework to predict the runtime reliability of services specified in OWL-S, employing the Non-Markovian stochastic Petri net (NMSPN) and a time series model. The framework includes the following steps: obtaining the historical response-time series of individual service components; fitting these series with an autoregressive moving average (ARMA) model and predicting the future firing rates of service components; mapping the OWL-S process into an NMSPN model; and employing the predicted firing rates as the model input of the NMSPN and calculating the normal completion probability as the reliability estimate. In the case study, a comparison between the static model and our approach based on experimental data is presented, and it is shown that our approach achieves higher prediction accuracy.
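The ARMA forecasting step can be sketched as below, assuming a synthetic response-time series for a single service component; statsmodels' ARIMA class with differencing order zero is used as the ARMA fit.

```python
# Hedged sketch of the time-series step: fit an ARMA model to a component's
# historical response times and forecast the next values. The series is synthetic.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(3)
# Synthetic response-time history (ms) with mild autocorrelation
rt = 100 + 0.1 * np.cumsum(rng.normal(scale=1.0, size=300)) + rng.normal(scale=2.0, size=300)

model = ARIMA(rt, order=(2, 0, 1))        # ARMA(2,1) == ARIMA with d = 0
fit = model.fit()
forecast = fit.forecast(steps=5)          # predicted future response times
print(forecast)
```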
Usability Prediction & Ranking of SDLC Models Using Fuzzy Hierarchical Usability Model
NASA Astrophysics Data System (ADS)
Gupta, Deepak; Ahlawat, Anil K.; Sagar, Kalpna
2017-06-01
Evaluation of software quality is an important aspect of controlling and managing software. Such evaluation enables improvements in the software process. Software quality is significantly dependent on software usability. Many researchers have proposed a number of usability models. Each model considers a set of usability factors but does not cover all usability aspects. Practical implementation of these models is still missing, as there is a lack of a precise definition of usability. Also, it is very difficult to integrate these models into current software engineering practices. In order to overcome these challenges, this paper aims to define the term `usability' using the proposed hierarchical usability model with its detailed taxonomy. The taxonomy considers generic evaluation criteria for identifying the quality components, which brings together factors, attributes and characteristics defined in various HCI and software models. For the first time, the usability model is also implemented to predict more accurate usability values. The proposed system is named the fuzzy hierarchical usability model and can be easily integrated into current software engineering practices. In order to validate the work, a dataset of six software development life cycle models is created and employed. These models are ranked according to their predicted usability values. This research also focuses on a detailed comparison of the proposed model with existing usability models.
A multisensor evaluation of the asymmetric convective model, version 2, in southeast Texas.
Kolling, Jenna S; Pleim, Jonathan E; Jeffries, Harvey E; Vizuete, William
2013-01-01
There currently exist a number of planetary boundary layer (PBL) schemes that can represent the effects of turbulence in daytime convective conditions, although these schemes remain a large source of uncertainty in meteorology and air quality model simulations. This study evaluates a recently developed combined local and nonlocal closure PBL scheme, the Asymmetric Convective Model, version 2 (ACM2), against PBL observations taken from radar wind profilers, a ground-based lidar, and multiple daytime radiosonde balloon launches. These observations were compared against predictions of PBLs from the Weather Research and Forecasting (WRF) model version 3.1 with the ACM2 PBL scheme option, and the Fifth-Generation Meteorological Model (MM5) version 3.7.3 with the Eta PBL scheme option that is currently being used to develop ozone control strategies in southeast Texas. MM5 and WRF predictions during the regulatory modeling episode were evaluated on their ability to predict the rise and fall of the PBL during daytime convective conditions across southeastern Texas. The MM5-predicted PBLs consistently underpredicted observations, and were also less than the WRF PBL predictions. The analysis reveals that the MM5 predicted a slower rising and shallower PBL not representative of the daytime urban boundary layer. Alternatively, the WRF model predicted a more accurate PBL evolution, improving the root mean square error (RMSE), both temporally and spatially. The WRF model also more accurately predicted vertical profiles of temperature and moisture in the lowest 3 km of the atmosphere. Inspection of median surface temperature and moisture time-series plots revealed higher predicted surface temperatures in WRF and more surface moisture in MM5. These could not be attributed to surface heat fluxes, and thus the differences in performance of the WRF and MM5 models are likely due to the PBL schemes. An accurate depiction of the diurnal evolution of the planetary boundary layer (PBL) is necessary for realistic air quality simulations, and for formulating effective policy. The meteorological model used to support the southeast Texas O3 attainment demonstration made predictions of the PBL that were consistently less than those found in observations. The use of the Asymmetric Convective Model, version 2 (ACM2), predicted taller PBL heights and improved model predictions. A lower predicted PBL height in an air quality model would increase precursor concentrations and change the chemical production of O3 and possibly the response to control strategies.
COMPARISONS OF SPATIAL PATTERNS OF WET DEPOSITION TO MODEL PREDICTIONS
The Community Multiscale Air Quality model, (CMAQ), is a "one-atmosphere" model, in that it uses a consistent set of chemical reactions and physical principles to predict concentrations of primary pollutants, photochemical smog, and fine aerosols, as well as wet and dry depositi...
A real-time air quality forecasting system (Eta-CMAQ model suite) has been developed by linking the NCEP Eta model to the U.S. EPA CMAQ model. This work presents results from the application of the Eta-CMAQ modeling system for forecasting O3 over the northeastern U.S d...
Automated antibody structure prediction using Accelrys tools: Results and best practices
Fasnacht, Marc; Butenhof, Ken; Goupil-Lamy, Anne; Hernandez-Guzman, Francisco; Huang, Hongwei; Yan, Lisa
2014-01-01
We describe the methodology and results from our participation in the second Antibody Modeling Assessment experiment. During the experiment we predicted the structure of eleven unpublished antibody Fv fragments. Our prediction methods centered on template-based modeling; potential templates were selected from an antibody database based on their sequence similarity to the target in the framework regions. Depending on the quality of the templates, we constructed models of the antibody framework regions using either a single, chimeric or multiple template approach. The hypervariable loop regions in the initial models were rebuilt by grafting the corresponding regions from suitable templates onto the model. For the H3 loop region, we further refined models using ab initio methods. The final models were subjected to constrained energy minimization to resolve severe local structural problems. The analysis of the submitted models shows that Accelrys tools allow for the construction of quite accurate models for the framework and the canonical CDR regions, with RMSDs to the X-ray structure on average below 1 Å for most of these regions. The results show that accurate prediction of the H3 hypervariable loops remains a challenge. Furthermore, model quality assessment of the submitted models shows that the models are of quite high quality, with local geometry assessment scores similar to those of the target X-ray structures. Proteins 2014; 82:1583–1598. © 2014 The Authors. Proteins published by Wiley Periodicals, Inc. PMID:24833271
Quality and price--impact on patient satisfaction.
Pantouvakis, Angelos; Bouranta, Nancy
2014-01-01
The purpose of this paper is to synthesize existing quality-measurement models and apply them to healthcare by combining a Nordic service-quality model with an American service-performance model. Results are based on a questionnaire survey of 1,298 respondents. Service quality dimensions were derived and related to satisfaction by employing a multinomial logistic model, which allows prediction and service improvement. Qualitative and empirical evidence indicates that customer satisfaction and service quality are multi-dimensional constructs, whose quality components, together with convenience and cost, influence the customer's overall satisfaction. The proposed model identifies important quality and satisfaction issues. It also enables transitions between different responses in different studies to be compared.
Assessing microscope image focus quality with deep learning.
Yang, Samuel J; Berndl, Marc; Michael Ando, D; Barch, Mariya; Narayanaswamy, Arunachalam; Christiansen, Eric; Hoyer, Stephan; Roat, Chris; Hung, Jane; Rueden, Curtis T; Shankar, Asim; Finkbeiner, Steven; Nelson, Philip
2018-03-15
Large image datasets acquired on automated microscopes typically have some fraction of low quality, out-of-focus images, despite the use of hardware autofocus systems. Identification of these images using automated image analysis with high accuracy is important for obtaining a clean, unbiased image dataset. Complicating this task is the fact that image focus quality is only well-defined in foreground regions of images, and as a result, most previous approaches only enable a computation of the relative difference in quality between two or more images, rather than an absolute measure of quality. We present a deep neural network model capable of predicting an absolute measure of image focus on a single image in isolation, without any user-specified parameters. The model operates at the image-patch level, and also outputs a measure of prediction certainty, enabling interpretable predictions. The model was trained on only 384 in-focus Hoechst (nuclei) stain images of U2OS cells, which were synthetically defocused to one of 11 absolute defocus levels during training. The trained model can generalize on previously unseen real Hoechst stain images, identifying the absolute image focus to within one defocus level (approximately 3 pixel blur diameter difference) with 95% accuracy. On a simpler binary in/out-of-focus classification task, the trained model outperforms previous approaches on both Hoechst and Phalloidin (actin) stain images (F-scores of 0.89 and 0.86, respectively over 0.84 and 0.83), despite only having been presented Hoechst stain images during training. Lastly, we observe qualitatively that the model generalizes to two additional stains, Hoechst and Tubulin, of an unseen cell type (Human MCF-7) acquired on a different instrument. Our deep neural network enables classification of out-of-focus microscope images with both higher accuracy and greater precision than previous approaches via interpretable patch-level focus and certainty predictions. The use of synthetically defocused images precludes the need for a manually annotated training dataset. The model also generalizes to different image and cell types. The framework for model training and image prediction is available as a free software library and the pre-trained model is available for immediate use in Fiji (ImageJ) and CellProfiler.
Anxiety, social skills, friendship quality, and peer victimization: an integrated model.
Crawford, A Melissa; Manassis, Katharina
2011-10-01
This cross-sectional study investigated whether anxiety and social functioning interact in their prediction of peer victimization. A structural equation model linking anxiety, social skills, and friendship quality to victimization was tested separately for children with anxiety disorders and normal comparison children to explore whether the processes involved in victimization differ for these groups. Participants were 8-14 year old children: 55 (34 boys, 21 girls) diagnosed with an anxiety disorder and 85 (37 boys, 48 girls) normal comparison children. The final models for both groups yielded two independent pathways to victimization: (a) anxiety independently predicted being victimized; and (b) poor social skills predicted lower friendship quality, which in turn, placed a child at risk for victimization. These findings have important implications for the treatment of childhood anxiety disorders and for school-based anti-bullying interventions, but replication with larger samples is indicated. Copyright © 2011 Elsevier Ltd. All rights reserved.
Beef quality grading using machine vision
NASA Astrophysics Data System (ADS)
Jeyamkondan, S.; Ray, N.; Kranzler, Glenn A.; Biju, Nisha
2000-12-01
A video image analysis system was developed to support automation of beef quality grading. Forty images of ribeye steaks were acquired. Fat and lean meat were differentiated using a fuzzy c-means clustering algorithm. Muscle longissimus dorsi (l.d.) was segmented from the ribeye using morphological operations. At the end of each iteration of erosion and dilation, a convex hull was fitted to the image and compactness was measured. The number of iterations was selected to yield the most compact l.d. Match between the l.d. muscle traced by an expert grader and that segmented by the program was 95.9%. Marbling and color features were extracted from the l.d. muscle and were used to build regression models to predict marbling and color scores. Quality grade was predicted using another regression model incorporating all features. Grades predicted by the model were statistically equivalent to the grades assigned by expert graders.
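A minimal NumPy implementation of the fuzzy c-means step used to separate fat and lean pixels is sketched below; the pixel intensities are synthetic and one-dimensional, whereas the study clustered real image data.

```python
# Minimal NumPy sketch of fuzzy c-means clustering, as used to separate fat and
# lean pixels by intensity. Real images would supply the pixel features.
import numpy as np

def fuzzy_cmeans(X, c=2, m=2.0, n_iter=100, tol=1e-5, seed=0):
    rng = np.random.default_rng(seed)
    u = rng.random((len(X), c))
    u /= u.sum(axis=1, keepdims=True)                       # initial membership matrix
    for _ in range(n_iter):
        um = u ** m
        centers = (um.T @ X) / um.sum(axis=0)[:, None]      # weighted cluster centers
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        new_u = 1.0 / (dist ** (2 / (m - 1)))               # inverse-distance memberships
        new_u /= new_u.sum(axis=1, keepdims=True)
        if np.abs(new_u - u).max() < tol:
            u = new_u
            break
        u = new_u
    return centers, u

# Synthetic pixel intensities: brighter "fat" and darker "lean" populations
rng = np.random.default_rng(1)
pixels = np.vstack([rng.normal(200, 10, (500, 1)), rng.normal(90, 15, (500, 1))])
centers, memberships = fuzzy_cmeans(pixels, c=2)
print("cluster centers:", centers.ravel())
```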
Water Quality Analysis Simulation
The Water Quality Analysis Simulation Program is an enhancement of the original WASP. This model helps users interpret and predict water quality responses to natural phenomena and man-made pollution for various pollution-management decisions.
Biological and functional relevance of CASP predictions
Liu, Tianyun; Ish‐Shalom, Shirbi; Torng, Wen; Lafita, Aleix; Bock, Christian; Mort, Matthew; Cooper, David N; Bliven, Spencer; Capitani, Guido; Mooney, Sean D.
2017-01-01
Our goal is to answer the question: compared with experimental structures, how useful are predicted models for functional annotation? We assessed the functional utility of predicted models by comparing the performances of a suite of methods for functional characterization on the predictions and the experimental structures. We identified 28 sites in 25 protein targets to perform functional assessment. These 28 sites included nine sites with known ligand binding (holo‐sites), nine sites that are expected or suggested by experimental authors for small molecule binding (apo‐sites), and ten sites containing important motifs, loops, or key residues with important disease‐associated mutations. We evaluated the utility of the predictions by comparing their microenvironments to the experimental structures. Overall structural quality correlates with functional utility. However, the best‐ranked predictions (global) may not have the best functional quality (local). Our assessment provides an ability to discriminate between predictions with high structural quality. When assessing ligand‐binding sites, most prediction methods have higher performance on apo‐sites than holo‐sites. Some servers show consistently high performance for certain types of functional sites. Finally, many functional sites are associated with protein‐protein interaction. We also analyzed biologically relevant features from the protein assemblies of two targets where the active site spanned the protein‐protein interface. For the assembly targets, we find that the features in the models are mainly determined by the choice of template. PMID:28975675
Brady, Amie M.G.; Bushon, Rebecca N.; Plona, Meg B.
2009-01-01
The Cuyahoga River within Cuyahoga Valley National Park (CVNP) in Ohio is often impaired for recreational use because of elevated concentrations of bacteria, which are indicators of fecal contamination. During the recreational seasons (May through August) of 2004 through 2007, samples were collected at two river sites, one upstream of and one centrally-located within CVNP. Bacterial concentrations and turbidity were determined, and streamflow at time of sampling and rainfall amounts over the previous 24 hours prior to sampling were ascertained. Statistical models to predict Escherichia coli (E. coli) concentrations were developed for each site (with data from 2004 through 2006) and tested during an independent year (2007). At Jaite, a sampling site near the center of CVNP, the predictive model performed better than the traditional method of determining the current day's water quality using the previous day's E. coli concentration. During 2007, the Jaite model, based on turbidity, produced more correct responses (81 percent) and fewer false negatives (3.2 percent) than the traditional method (68 and 26 percent, respectively). At Old Portage, a sampling site just upstream from CVNP, a predictive model with turbidity and rainfall as explanatory variables did not perform as well as the traditional method. The Jaite model was used to estimate water quality at three other sites in the park; although it did not perform as well as the traditional method, it performed well - yielding between 68 and 91 percent correct responses. Further research would be necessary to determine whether using the Jaite model to predict recreational water quality elsewhere on the river would provide accurate results.
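A simplified sketch of a turbidity-based model like the one developed for the Jaite site is shown below: regress log-transformed E. coli on log-transformed turbidity and compare above/below-standard calls. The synthetic data and the 235 CFU/100 mL threshold are assumptions for illustration.

```python
# Illustrative sketch: regress log10(E. coli) on log10(turbidity) and flag
# predicted exceedances of a recreational standard. Data and threshold are
# synthetic placeholders, not the study's values.
import numpy as np

rng = np.random.default_rng(5)
turbidity = rng.lognormal(mean=2.0, sigma=0.8, size=120)              # NTU
log_ecoli = 1.0 + 0.9 * np.log10(turbidity) + rng.normal(scale=0.3, size=120)

slope, intercept = np.polyfit(np.log10(turbidity), log_ecoli, deg=1)
predicted = 10 ** (intercept + slope * np.log10(turbidity))

standard = 235.0                                                      # CFU / 100 mL
agreement = np.mean((predicted > standard) == (10 ** log_ecoli > standard))
print(f"fraction of correct above/below-standard calls: {agreement:.2f}")
```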
Having Fun on Facebook?: Mothers' Enjoyment as a Moderator of Mental Health and Facebook Use.
Kaufmann, Renee; Buckner, Marjorie M; Ledbetter, Andrew M
2017-08-01
This study examined the extent to which contextual factors (i.e., income level and number of children) might predict a mother's mental health quality, which, in turn, may predict level of engagement with Facebook. Results supported this model, finding that mothers with more children and lower income possess lower mental health quality, and lower mental health quality predicted more frequent Facebook use. However, this pattern was qualified by a mother's level of enjoyment of Facebook, such that mental health quality did not significantly predict Facebook intensity when enjoyment of Facebook was low. This research extends practitioners' knowledge of mothers' mental health quality by identifying a behavior that may indicate lower mental health quality and by enhancing the ability to recognize mothers who may need support or treatment. Future directions for this research are included.
Air pollution dispersion models for human exposure predictions in London.
Beevers, Sean D; Kitwiroon, Nutthida; Williams, Martin L; Kelly, Frank J; Ross Anderson, H; Carslaw, David C
2013-01-01
The London household survey has shown that people travel and are exposed to air pollutants differently. This argues for human exposure to be based upon space-time-activity data and spatio-temporal air quality predictions. For the latter, we have demonstrated the role that dispersion models can play by using two complementary models: KCLurban, which gives source apportionment information, and the Community Multi-scale Air Quality Model (CMAQ)-urban, which predicts hourly air quality. The KCLurban model is in close agreement with observations of NO(X), NO(2) and particulate matter (PM)(10/2.5), having a small normalised mean bias (-6% to 4%) and a large Index of Agreement (0.71-0.88). The temporal trends of NO(X) from the CMAQ-urban model are also in reasonable agreement with observations. Spatially, NO(2) predictions show that within 10's of metres of major roads, concentrations can range from approximately 10-20 p.p.b. up to 70 p.p.b., and that for PM(10/2.5) central London roadside concentrations are approximately double the suburban background concentrations. Exposure to different PM sources is important, and we predict that brake wear-related PM(10) concentrations are approximately eight times greater near major roads than at suburban background locations. Temporally, we have shown that average NO(X) concentrations close to roads can range by a factor of approximately six between the early morning minimum and morning rush hour maximum periods. These results present strong arguments for the hybrid exposure model under development at King's and, in future, for in-building models and a model for the London Underground.
Brady, Amie M. G.; Meg B. Plona,
2015-07-30
A computer program was developed to manage the nowcasts by running the predictive models and posting the results to a publicly accessible Web site daily by 9 a.m. The nowcasts were able to correctly predict E. coli concentrations above or below the water-quality standard at Jaite for 79 percent of the samples compared with the measured concentrations. In comparison, the persistence model (using the previous day’s sample concentration) correctly predicted concentrations above or below the water-quality standard in only 68 percent of the samples. To determine if the Jaite nowcast could be used for the stretch of the river between Lock 29 and Jaite, the model predictions for Jaite were compared with the measured concentrations at Lock 29. The Jaite nowcast provided correct responses for 77 percent of the Lock 29 samples, which was a greater percentage than the percentage of correct responses (58 percent) from the persistence model at Lock 29.
Sun, Xin; Young, Jennifer; Liu, Jeng-Hung; Newman, David
2018-06-01
The objective of this project was to develop a computer vision system (CVS) for objective measurement of pork loin under industry speed requirements. Color images of pork loin samples were acquired using a CVS. Subjective color and marbling scores were determined according to the National Pork Board standards by a trained evaluator. Instrument color measurement and crude fat percentage were used as control measurements. Image features (18 color features; 1 marbling feature; 88 texture features) were extracted from whole pork loin color images. An artificial intelligence prediction model (support vector machine) was established for pork color and marbling quality grades. The results showed that the CVS with support vector machine modeling reached the highest prediction accuracy of 92.5% for measured pork color score and 75.0% for measured pork marbling score. This research shows that the proposed artificial intelligence prediction model with CVS can provide an effective tool for predicting color and marbling in the pork industry at online speeds. Copyright © 2018 Elsevier Ltd. All rights reserved.
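The SVM classification step can be sketched as follows, with synthetic stand-ins for the 107 extracted image features and a binary grade label; the study's actual feature values and grade scales are not reproduced here.

```python
# Hedged sketch: a support vector machine classifying a quality grade from
# image features, mirroring the CVS + SVM approach. All data are synthetic.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(8)
X = rng.normal(size=(400, 107))                 # 18 colour + 1 marbling + 88 texture features
grade = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=400) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, grade, test_size=0.25, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_tr, y_tr)
print(f"hold-out accuracy: {clf.score(X_te, y_te):.2f}")
```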
Ojanen, Tiina; Sijtsema, Jelle J; Hawley, Patricia H; Little, Todd D
2010-12-01
Friendships are essential for adolescent social development. However, they may be pursued for varying motives, which, in turn, may predict similarity in friendships via social selection or social influence processes, and likely help to explain friendship quality. We examined the effect of early adolescents' (N = 374, 12-14 years) intrinsic and extrinsic friendship motivation on friendship selection and social influence by utilizing social network modeling. In addition, longitudinal relations among motivation and friendship quality were estimated with structural equation modeling. Extrinsic motivation predicted activity in making friendship nominations during the sixth grade and lower friendship quality across time. Intrinsic motivation predicted inactivity in making friendship nominations during the sixth grade, popularity as a friend across the transition to middle school, and higher friendship quality across time. Social influence effects were observed for both motives, but were more pronounced for intrinsic motivation. Copyright © 2010 The Association for Professionals in Services for Adolescents. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Kim, Chan Moon; Parnichkun, Manukid
2017-11-01
Coagulation is an important process in drinking water treatment to attain acceptable treated water quality. However, the determination of coagulant dosage is still a challenging task for operators, because coagulation is nonlinear and complicated process. Feedback control to achieve the desired treated water quality is difficult due to lengthy process time. In this research, a hybrid of k-means clustering and adaptive neuro-fuzzy inference system ( k-means-ANFIS) is proposed for the settled water turbidity prediction and the optimal coagulant dosage determination using full-scale historical data. To build a well-adaptive model to different process states from influent water, raw water quality data are classified into four clusters according to its properties by a k-means clustering technique. The sub-models are developed individually on the basis of each clustered data set. Results reveal that the sub-models constructed by a hybrid k-means-ANFIS perform better than not only a single ANFIS model, but also seasonal models by artificial neural network (ANN). The finally completed model consisting of sub-models shows more accurate and consistent prediction ability than a single model of ANFIS and a single model of ANN based on all five evaluation indices. Therefore, the hybrid model of k-means-ANFIS can be employed as a robust tool for managing both treated water quality and production costs simultaneously.
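The cluster-then-model structure can be sketched as below: k-means partitions raw-water samples and a separate regressor is fitted per cluster. A gradient-boosting regressor stands in for the ANFIS sub-models, and the features and target are synthetic.

```python
# Sketch of the cluster-then-model idea: partition raw-water samples with k-means
# and fit one sub-model per cluster. GradientBoostingRegressor is only a stand-in
# for ANFIS; features and targets are synthetic.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(11)
X = rng.normal(size=(800, 4))                    # raw-water quality features
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=800)  # settled turbidity

kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
sub_models = {}
for k in range(4):
    mask = kmeans.labels_ == k
    sub_models[k] = GradientBoostingRegressor(random_state=0).fit(X[mask], y[mask])

def predict(x_new):
    # Route each new sample to its cluster's sub-model
    cluster = kmeans.predict(x_new)
    return np.array([sub_models[c].predict(x[None, :])[0] for c, x in zip(cluster, x_new)])

print(predict(X[:3]))
```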
Rocket exhaust effluent modeling for tropospheric air quality and environmental assessments
NASA Technical Reports Server (NTRS)
Stephens, J. B.; Stewart, R. B.
1977-01-01
The various techniques for diffusion predictions to support air quality predictions and environmental assessments for aerospace applications are discussed in terms of limitations imposed by atmospheric data. This affords an introduction to the rationale behind the selection of the National Aeronautics and Space Administration (NASA)/Marshall Space Flight Center (MSFC) Rocket Exhaust Effluent Diffusion (REED) program. The models utilized in the NASA/MSFC REED program are explained. This program is then evaluated in terms of some results from a joint MSFC/Langley Research Center/Kennedy Space Center Titan Exhaust Effluent Prediction and Monitoring Program.
Feedbacks between Air Pollution and Weather, Part 1: Effects on Weather
The meteorological predictions of fully coupled air-quality models running in “feedback” versus “nofeedback” simulations were compared against each other as part of Phase 2 of the Air Quality Model Evaluation International Initiative. The model simulations included a “no-feedback...
Extant process-based hydrologic and water quality models are indispensable to water resources planning and environmental management. However, models are only approximations of real systems and often calibrated with incomplete and uncertain data. Reliable estimates, or perhaps f...
Predicting the Accuracy of Protein–Ligand Docking on Homology Models
BORDOGNA, ANNALISA; PANDINI, ALESSANDRO; BONATI, LAURA
2011-01-01
Ligand–protein docking is increasingly used in Drug Discovery. The initial limitations imposed by a reduced availability of target protein structures have been overcome by the use of theoretical models, especially those derived by homology modeling techniques. While this greatly extended the use of docking simulations, it also introduced the need for general and robust criteria to estimate the reliability of docking results given the model quality. To this end, a large-scale experiment was performed on a diverse set including experimental structures and homology models for a group of representative ligand–protein complexes. A wide spectrum of model quality was sampled using templates at different evolutionary distances and different strategies for target–template alignment and modeling. The obtained models were scored by a selection of the most used model quality indices. The binding geometries were generated using AutoDock, one of the most common docking programs. An important result of this study is that indeed quantitative and robust correlations exist between the accuracy of docking results and the model quality, especially in the binding site. Moreover, state-of-the-art indices for model quality assessment are already an effective tool for an a priori prediction of the accuracy of docking experiments in the context of groups of proteins with conserved structural characteristics. PMID:20607693
NASA Astrophysics Data System (ADS)
Jiang, Sanyuan; Jomaa, Seifeddine; Büttner, Olaf; Rode, Michael
2014-05-01
Hydrological water quality modeling is increasingly used for investigating runoff and nutrient transport processes as well as watershed management, but it is largely unclear how data availability determines model identification. In this study, the HYPE (HYdrological Predictions for the Environment) model, which is a process-based, semi-distributed hydrological water quality model, was applied in two different mesoscale catchments (Selke (463 km2) and Weida (99 km2)) located in central Germany to simulate discharge and inorganic nitrogen (IN) transport. PEST and DREAM(ZS) were combined with the HYPE model to conduct parameter calibration and uncertainty analysis. A split-sample test was used for model calibration (1994-1999) and validation (1999-2004). IN concentration and daily IN load were found to be highly correlated with discharge, indicating that IN leaching is mainly controlled by runoff. Both dynamics and balances of water and IN load were well captured, with NSE greater than 0.83 during the validation period. Multi-objective calibration (calibrating hydrological and water quality parameters simultaneously) was found to outperform step-wise calibration in terms of model robustness. Multi-site calibration was able to improve model performance at internal sites and decrease parameter posterior uncertainty and prediction uncertainty. Nitrogen-process parameters calibrated using continuous daily averages of nitrate-N concentration observations produced better and more robust simulations of IN concentration and load, lower posterior parameter uncertainty and lower IN concentration prediction uncertainty compared to calibration against discontinuous biweekly nitrate-N concentration measurements. Both PEST and DREAM(ZS) are efficient in parameter calibration. However, DREAM(ZS) is more sound in terms of parameter identification and uncertainty analysis than PEST because of its capability to evolve parameter posterior distributions and estimate prediction uncertainty based on global search and Bayesian inference schemes.
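For reference, the Nash-Sutcliffe efficiency (NSE) used to judge the discharge and nitrogen simulations can be computed with the short helper below; the observed and simulated series are illustrative only.

```python
# Simple helper for the Nash-Sutcliffe efficiency (NSE); values approach 1 for
# a perfect fit and fall below 0 when the model is worse than the observed mean.
import numpy as np

def nse(observed, simulated):
    observed, simulated = np.asarray(observed), np.asarray(simulated)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

obs = np.array([2.1, 3.4, 5.0, 4.2, 3.3, 2.8])   # illustrative observed discharge
sim = np.array([2.0, 3.6, 4.8, 4.5, 3.1, 2.9])   # illustrative simulated discharge
print(f"NSE = {nse(obs, sim):.2f}")
```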
NASA Astrophysics Data System (ADS)
Johnston, J. M.
2013-12-01
Freshwater habitats provide fishable, swimmable and drinkable resources and are a nexus of geophysical and biological processes. These processes in turn influence the persistence and sustainability of populations, communities and ecosystems. Climate change and landuse change encompass numerous stressors of potential exposure, including the introduction of toxic contaminants, invasive species, and disease in addition to physical drivers such as temperature and hydrologic regime. A systems approach that includes the scientific and technologic basis of assessing the health of ecosystems is needed to effectively protect human health and the environment. The Integrated Environmental Modeling Framework 'iemWatersheds' has been developed as a consistent and coherent means of forecasting the cumulative impact of co-occurring stressors. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM) that automates the collection and standardization of input data; the Framework for Risk Assessment of Multimedia Environmental Systems (FRAMES) that manages the flow of information between linked models; and the Supercomputer for Model Uncertainty and Sensitivity Evaluation (SuperMUSE) that provides post-processing and analysis of model outputs, including uncertainty and sensitivity analysis. Five models are linked within the Framework to provide multimedia simulation capabilities for hydrology and water quality processes: the Soil Water Assessment Tool (SWAT) predicts surface water and sediment runoff and associated contaminants; the Watershed Mercury Model (WMM) predicts mercury runoff and loading to streams; the Water quality Analysis and Simulation Program (WASP) predicts water quality within the stream channel; the Habitat Suitability Index (HSI) model scores physicochemical habitat quality for individual fish species; and the Bioaccumulation and Aquatic System Simulator (BASS) predicts fish growth, population dynamics and bioaccumulation of toxic substances. The capability of the Framework to address cumulative impacts will be demonstrated for freshwater ecosystem services and mountaintop mining.
NASA Astrophysics Data System (ADS)
Booth, N. L.; Everman, E.; Kuo, I.; Sprague, L.; Murphy, L.
2011-12-01
A new web-based decision support system has been developed as part of the U.S. Geological Survey (USGS) National Water Quality Assessment Program's (NAWQA) effort to provide ready access to Spatially Referenced Regressions On Watershed attributes (SPARROW) results of stream water-quality conditions and to offer sophisticated scenario testing capabilities for research and water-quality planning via an intuitive graphical user interface with a map-based display. The SPARROW Decision Support System (DSS) is delivered through a web browser over an Internet connection, making it widely accessible to the public in a format that allows users to easily display water-quality conditions, distribution of nutrient sources, nutrient delivery to downstream waterbodies, and simulations of altered nutrient inputs including atmospheric and agricultural sources. The DSS offers other features for analysis including various background map layers, model output exports, and the ability to save and share prediction scenarios. SPARROW models currently supported by the DSS are based on the modified digital versions of the 1:500,000-scale River Reach File (RF1) and 1:100,000-scale National Hydrography Dataset (medium-resolution, NHDPlus) stream networks. The underlying modeling framework and server infrastructure illustrate innovations in the information technology and geosciences fields for delivering SPARROW model predictions over the web by performing intensive model computations and map visualizations of the predicted conditions within the stream network.
DOT National Transportation Integrated Search
1978-02-01
Ride-quality models for city buses and intercity trains are presented and discussed in terms of their ability to predict passenger comfort and ride acceptability. The report, the last of three volumes, contains procedural guidelines to be employed by...
The difficulty in assessing errors in numerical models of air quality is a major obstacle to improving their ability to predict and retrospectively map air quality. In this paper, using simulation outputs from the Community Multi-scale Air Quality Model (CMAQ), the statistic...
Huysmans, Maaike A; Eijckelhof, Belinda H W; Garza, Jennifer L Bruno; Coenen, Pieter; Blatter, Birgitte M; Johnson, Peter W; van Dieën, Jaap H; van der Beek, Allard J; Dennerlein, Jack T
2017-12-15
Alternative techniques to assess physical exposures, such as prediction models, could facilitate more efficient epidemiological assessments in future large cohort studies examining physical exposures in relation to work-related musculoskeletal symptoms. The aim of this study was to evaluate two types of models that predict arm-wrist-hand physical exposures (i.e. muscle activity, wrist postures and kinematics, and keyboard and mouse forces) during computer use, which differed only with respect to the candidate predicting variables: (i) a full set of predicting variables, including self-reported factors, software-recorded computer usage patterns, and worksite measurements of anthropometrics and workstation set-up (full models); and (ii) a practical set of predicting variables, including only the self-reported factors and software-recorded computer usage patterns, which are relatively easy to assess (practical models). Prediction models were built using data from a field study among 117 office workers who were symptom-free at the time of measurement. Arm-wrist-hand physical exposures were measured for approximately two hours while workers performed their own computer work. Each worker's anthropometry and workstation set-up were measured by an experimenter, computer usage patterns were recorded using software, and self-reported factors (including individual factors, job characteristics, computer work behaviours, psychosocial factors, workstation set-up characteristics, and leisure-time activities) were collected by an online questionnaire. We determined the predictive quality of the models in terms of R2 and root mean squared (RMS) values and exposure classification agreement to low-, medium-, and high-exposure categories (for the practical models only). The full models had R2 values that ranged from 0.16 to 0.80, whereas values for the practical models ranged from 0.05 to 0.43. Interquartile ranges were not that different for the two models, indicating that the full models performed better only for some physical exposures. Relative RMS errors ranged between 5% and 19% for the full models, and between 10% and 19% for the practical models. When the predicted physical exposures were classified into low, medium, and high, classification agreement ranged from 26% to 71%. The full prediction models, based on self-reported factors, software-recorded computer usage patterns, and additional measurements of anthropometrics and workstation set-up, showed better predictive quality than the practical models based on self-reported factors and recorded computer usage patterns only. However, predictive quality varied considerably across the different arm-wrist-hand exposure parameters. Future exploration of the relation between predicted physical exposure and symptoms is therefore only recommended for physical exposures that can be reasonably well predicted. © The Author 2017. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
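The three figures of merit used above (R2, relative RMS error, and low/medium/high classification agreement) can all be computed from paired predicted and observed exposure values. A small sketch with made-up numbers standing in for the study data (helper names and the tertile-based classification are illustrative assumptions):

```python
import numpy as np

def r_squared(y_obs, y_pred):
    ss_res = np.sum((y_obs - y_pred) ** 2)
    ss_tot = np.sum((y_obs - y_obs.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def relative_rms_error(y_obs, y_pred):
    """RMS prediction error expressed as a percentage of the mean observed exposure."""
    rms = np.sqrt(np.mean((y_obs - y_pred) ** 2))
    return 100.0 * rms / np.mean(y_obs)

def classification_agreement(y_obs, y_pred):
    """Fraction of workers placed in the same low/medium/high tertile by both series."""
    cuts = np.percentile(y_obs, [33.3, 66.7])
    return np.mean(np.digitize(y_obs, cuts) == np.digitize(y_pred, cuts))

# Hypothetical muscle-activity exposures (%MVC) for ten workers
y_obs = np.array([4.1, 6.3, 2.8, 9.5, 5.0, 7.2, 3.3, 8.1, 6.0, 4.7])
y_pred = np.array([4.5, 5.9, 3.1, 8.7, 5.6, 6.5, 3.9, 7.4, 5.5, 5.1])
print(r_squared(y_obs, y_pred), relative_rms_error(y_obs, y_pred), classification_agreement(y_obs, y_pred))
```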
Quality metrics for sensor images
NASA Technical Reports Server (NTRS)
Ahumada, AL
1993-01-01
Methods are needed for evaluating the quality of augmented visual displays (AVID). Computational quality metrics will help summarize, interpolate, and extrapolate the results of human performance tests with displays. The FLM Vision group at NASA Ames has been developing computational models of visual processing and using them to develop computational metrics for similar problems. For example, display modeling systems use metrics for comparing proposed displays, halftoning optimizing methods use metrics to evaluate the difference between the halftone and the original, and image compression methods minimize the predicted visibility of compression artifacts. The visual discrimination models take as input two arbitrary images A and B and compute an estimate of the probability that a human observer will report that A is different from B. If A is an image that one desires to display and B is the actual displayed image, such an estimate can be regarded as an image quality metric reflecting how well B approximates A. There are additional complexities associated with the problem of evaluating the quality of radar and IR enhanced displays for AVID tasks. One important problem is the question of whether intruding obstacles are detectable in such displays. Although the discrimination model can handle detection situations by making B the original image A plus the intrusion, this detection model makes the inappropriate assumption that the observer knows where the intrusion will be. Effects of signal uncertainty need to be added to our models. A pilot needs to make decisions rapidly. The models need to predict not just the probability of a correct decision, but the probability of a correct decision by the time the decision needs to be made. That is, the models need to predict latency as well as accuracy. Luce and Green have generated models for auditory detection latencies. Similar models are needed for visual detection. Most image quality models are designed for static imagery. Watson has been developing a general spatial-temporal vision model to optimize video compression techniques. These models need to be adapted and calibrated for AVID applications.
Nowcasting recreational water quality
Boehm, Alexandria B.; Whitman, Richard L.; Nevers, Meredith; Hou, Deyi; Weisberg, Stephen B.
2007-01-01
Advances in molecular techniques may soon offer new opportunities to provide more timely information on whether recreational beaches are free from fecal contamination. However, an alternative approach is the use of predictive models. This chapter presents a summary of these developing efforts. First, we describe documented physical, chemical, and biological factors that have been demonstrated by researchers to affect bacterial concentrations at beaches and thus represent logical parameters for inclusion in a model. Then, we illustrate how various types of models can be applied to predict water quality at freshwater and marine beaches.
The Impact of Truth Surrogate Variance on Quality Assessment/Assurance in Wind Tunnel Testing
NASA Technical Reports Server (NTRS)
DeLoach, Richard
2016-01-01
Minimum data volume requirements for wind tunnel testing are reviewed and shown to depend on error tolerance, response model complexity, random error variance in the measurement environment, and maximum acceptable levels of inference error risk. Distinctions are made between such related concepts as quality assurance and quality assessment in response surface modeling, as well as between precision and accuracy. Earlier research on the scaling of wind tunnel tests is extended to account for variance in the truth surrogates used at confirmation sites in the design space to validate proposed response models. A model adequacy metric is presented that represents the fraction of the design space within which model predictions can be expected to satisfy prescribed quality specifications. The impact of inference error on the assessment of response model residuals is reviewed. The number of sites where reasonably well-fitted response models actually predict inadequately is shown to be considerably less than the number of sites where residuals are out of tolerance. The significance of such inference error effects on common response model assessment strategies is examined.
Stone, Wesley W.; Gilliom, Robert J.; Crawford, Charles G.
2008-01-01
Regression models were developed for predicting annual maximum and selected annual maximum moving-average concentrations of atrazine in streams using the Watershed Regressions for Pesticides (WARP) methodology developed by the National Water-Quality Assessment Program (NAWQA) of the U.S. Geological Survey (USGS). The current effort builds on the original WARP models, which were based on the annual mean and selected percentiles of the annual frequency distribution of atrazine concentrations. Estimates of annual maximum and annual maximum moving-average concentrations for selected durations are needed to characterize the levels of atrazine and other pesticides for comparison to specific water-quality benchmarks for evaluation of potential concerns regarding human health or aquatic life. Separate regression models were derived for the annual maximum and annual maximum 21-day, 60-day, and 90-day moving-average concentrations. Development of the regression models used the same explanatory variables, transformations, model development data, model validation data, and regression methods as those used in the original development of WARP. The models accounted for 72 to 75 percent of the variability in the concentration statistics among the 112 sampling sites used for model development. Predicted concentration statistics from the four models were within a factor of 10 of the observed concentration statistics for most of the model development and validation sites. Overall, performance of the models for the development and validation sites supports the application of the WARP models for predicting annual maximum and selected annual maximum moving-average atrazine concentration in streams and provides a framework to interpret the predictions in terms of uncertainty. For streams with inadequate direct measurements of atrazine concentrations, the WARP model predictions for the annual maximum and the annual maximum moving-average atrazine concentrations can be used to characterize the probable levels of atrazine for comparison to specific water-quality benchmarks. Sites with a high probability of exceeding a benchmark for human health or aquatic life can be prioritized for monitoring.
Kim, Young-sun; Lim, Hyo Keun; Park, Min Jung; Rhim, Hyunchul; Jung, Sin-Ho; Sohn, Insuk; Kim, Tae-Joong; Keserci, Bilgin
2016-01-01
The aim of this study was to fit and validate screening magnetic resonance imaging (MRI)-based prediction models for assessing immediate therapeutic responses of uterine fibroids to MRI-guided high-intensity focused ultrasound (MR-HIFU) ablation. Informed consent from all subjects was obtained for our institutional review board-approved study. A total of 240 symptomatic uterine fibroids (mean diameter, 6.9 cm) in 152 women (mean age, 43.3 years) treated with MR-HIFU ablation were retrospectively analyzed (160 fibroids for training, 80 fibroids for validation). Screening MRI parameters (subcutaneous fat thickness [mm], x1; relative peak enhancement [%] in semiquantitative perfusion MRI, x2; T2 signal intensity ratio of fibroid to skeletal muscle, x3) were used to fit prediction models with regard to ablation efficiency (nonperfused volume/treatment cell volume, y1) and ablation quality (grade 1-5, poor to excellent, y2), respectively, using the generalized estimating equation method. Cutoff values for achievement of treatment intent (efficiency >1.0; quality grade 4/5) were determined based on receiver operating characteristic curve analysis. Prediction performances were validated by calculating positive and negative predictive values. Generalized estimating equation analyses yielded models of y1 = 2.2637 - 0.0415x1 - 0.0011x2 - 0.0772x3 and y2 = 6.8148 - 0.1070x1 - 0.0050x2 - 0.2163x3. Cutoff values were 1.312 for ablation efficiency (area under the curve, 0.7236; sensitivity, 0.6882; specificity, 0.6866) and 4.019 for ablation quality (0.8794; 0.7156; 0.9020). Positive and negative predictive values were 0.917 and 0.500 for ablation efficiency and 0.978 and 0.600 for ablation quality, respectively. Screening MRI-based prediction models for assessing immediate therapeutic responses of uterine fibroids to MR-HIFU ablation were fitted and validated, which may reduce the risk of unsuccessful treatment.
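The fitted equations and ROC-derived cutoffs reported above can be applied directly to screening MRI measurements. A short sketch using the published coefficients (the example input values and the function name are hypothetical, not from the study):

```python
def predict_hifu_response(fat_thickness_mm, rel_peak_enhancement_pct, t2_ratio):
    """Apply the published screening-MRI models for MR-HIFU ablation of uterine fibroids."""
    x1, x2, x3 = fat_thickness_mm, rel_peak_enhancement_pct, t2_ratio
    efficiency = 2.2637 - 0.0415 * x1 - 0.0011 * x2 - 0.0772 * x3   # y1: nonperfused volume / treatment cell volume
    quality = 6.8148 - 0.1070 * x1 - 0.0050 * x2 - 0.2163 * x3      # y2: ablation quality grade (1-5)
    return {
        "predicted_efficiency": efficiency,
        "predicted_quality": quality,
        "efficiency_likely_adequate": efficiency >= 1.312,   # ROC cutoff for efficiency > 1.0
        "quality_likely_grade_4_or_5": quality >= 4.019,     # ROC cutoff for grade 4/5
    }

# Hypothetical patient: 20 mm subcutaneous fat, 120% relative peak enhancement, T2 ratio 2.5
print(predict_hifu_response(20.0, 120.0, 2.5))
```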
Ground-water models for water resource planning
Moore, J.E.
1983-01-01
In the past decade hydrogeologists have emphasized the development of computer-based mathematical models to aid in the understanding of flow, the transport of solutes, transport of heat, and deformation in the ground-water system. These models have been used to provide information and predictions for water managers. Too frequently, ground-water was neglected in water resource planning because managers believed that it could not be adequately evaluated in terms of availability, quality, and effect of development on surface-water supplies. Now, however, with newly developed digital ground-water models, effects of development can be predicted. Such models have been used to predict hydrologic and quality changes under different stresses. These models have grown in complexity over the last ten years from simple one-layer models to three-dimensional simulations of ground-water flow, which may include solute transport, heat transport, effects of land subsidence, and encroachment of saltwater. Case histories illustrate how predictive ground-water models have provided the information needed for the sound planning and management of water resources in the USA. © 1983 D. Reidel Publishing Company.
Portilla, Ximena A.; Ballard, Parissa J.; Adler, Nancy E.; Boyce, W. Thomas; Obradović, Jelena
2014-01-01
This study investigates the dynamic interplay between teacher-child relationship quality and children’s behaviors across kindergarten and first grade to predict academic competence in first grade. Using a sample of 338 ethnically diverse 5-year-old children, nested path analytic models were conducted to examine bidirectional pathways between children’s behaviors and teacher-child relationship quality. Low self-regulation in kindergarten fall, as indexed by inattention and impulsive behaviors, predicted more conflict with teachers in kindergarten spring and this effect persisted into first grade. Conflict and low self-regulation jointly predicted decreases in school engagement which in turn predicted first grade academic competence. Findings illustrate the importance of considering transactions between self-regulation, teacher-child relationship quality, and school engagement in predicting academic competence. PMID:24916608
Machine Learning and Deep Learning Models to Predict Runoff Water Quantity and Quality
NASA Astrophysics Data System (ADS)
Bradford, S. A.; Liang, J.; Li, W.; Murata, T.; Simunek, J.
2017-12-01
Contaminants can be rapidly transported at the soil surface by runoff to surface water bodies. Physically-based models, which are based on the mathematical description of the main hydrological processes, are key tools for predicting surface water impairment. Along with physically-based models, data-driven models are becoming increasingly popular for describing the behavior of hydrological and water resources systems, since these models can be used to complement or even replace physically-based models. In this presentation we propose a new data-driven model as an alternative to a physically-based overland flow and transport model. First, we developed a physically-based numerical model to simulate overland flow and contaminant transport (the HYDRUS-1D overland flow module). A large number of numerical simulations were carried out to develop a database containing information about the impact of various input parameters (weather patterns, surface topography, vegetation, soil conditions, contaminants, and best management practices) on runoff water quantity and quality outputs. This database was used to train data-driven models. Three different methods (Neural Networks, Support Vector Machines, and Recurrent Neural Networks) were explored to prepare input-output functional relations. Results demonstrate the ability and limitations of machine learning and deep learning models to predict runoff water quantity and quality.
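To illustrate the surrogate idea described here, a regression model can be trained on input-output pairs generated by a physically based simulator and then used in its place. The sketch below uses scikit-learn support vector regression on synthetic data standing in for a database of HYDRUS-1D runs (the features, target formula, and hyperparameters are assumptions for illustration only):

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Synthetic stand-in for a database of simulator runs:
# columns = rainfall intensity, slope, vegetation cover fraction; target = runoff contaminant load
X = rng.uniform([5.0, 0.01, 0.0], [100.0, 0.30, 1.0], size=(500, 3))
y = 0.02 * X[:, 0] * (1 + 5 * X[:, 1]) * (1 - 0.7 * X[:, 2]) + rng.normal(0, 0.05, 500)

surrogate = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.01))
surrogate.fit(X[:400], y[:400])                               # train on most of the runs
print("held-out R^2:", surrogate.score(X[400:], y[400:]))     # evaluate on the remainder
```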
Roche, Daniel B; Buenavista, Maria T; Tetchner, Stuart J; McGuffin, Liam J
2011-07-01
The IntFOLD server is a novel independent server that integrates several cutting edge methods for the prediction of structure and function from sequence. Our guiding principles behind the server development were as follows: (i) to provide a simple unified resource that makes our prediction software accessible to all and (ii) to produce integrated output for predictions that can be easily interpreted. The output for predictions is presented as a simple table that summarizes all results graphically via plots and annotated 3D models. The raw machine readable data files for each set of predictions are also provided for developers, which comply with the Critical Assessment of Methods for Protein Structure Prediction (CASP) data standards. The server comprises an integrated suite of five novel methods: nFOLD4, for tertiary structure prediction; ModFOLD 3.0, for model quality assessment; DISOclust 2.0, for disorder prediction; DomFOLD 2.0 for domain prediction; and FunFOLD 1.0, for ligand binding site prediction. Predictions from the IntFOLD server were found to be competitive in several categories in the recent CASP9 experiment. The IntFOLD server is available at the following web site: http://www.reading.ac.uk/bioinf/IntFOLD/.
Zhang, Zugui; Jones, Philip; Weintraub, William S; Mancini, G B John; Sedlis, Steven; Maron, David J; Teo, Koon; Hartigan, Pamela; Kostuk, William; Berman, Daniel; Boden, William E; Spertus, John A
2018-05-01
Percutaneous coronary intervention (PCI) is a therapy to reduce angina and improve quality of life in patients with stable ischemic heart disease. However, it is unclear whether the quality of life after PCI is more dependent on the PCI or other patient-related factors. To address this question, we created models to predict angina and quality of life 1 year after PCI and medical therapy. Using data from the 2287 stable ischemic heart disease patients randomized in the COURAGE trial (Clinical Outcomes Utilizing Revascularization and Aggressive Drug Evaluation) to PCI plus optimal medical therapy (OMT) versus OMT alone, we built prediction models for 1-year Seattle Angina Questionnaire angina frequency, physical limitation, and quality of life scores, both as continuous outcomes and categorized by clinically desirable states, using multivariable techniques. Although most patients improved regardless of treatment, marked variability was observed in Seattle Angina Questionnaire scores 1 year after randomization. Adding PCI conferred a greater mean improvement (about 2 points) in Seattle Angina Questionnaire scores that was not affected by patient characteristics (P values for all interactions >0.05). The proportion of patients free of angina or having very good/excellent physical limitation (physical function) or quality of life at 1 year was 57%, 58%, and 66% with PCI+OMT and 50%, 55%, and 59% with OMT alone, respectively. However, other characteristics, such as baseline symptoms, age, diabetes mellitus, and the magnitude of myocardium subtended by narrowed coronary arteries, were as, or more, important than revascularization in predicting symptoms (partial R2=0.07 versus 0.29, 0.03 versus 0.22, and 0.05 versus 0.24 in the domains of angina frequency, physical limitation, and quality of life, respectively). There was modest/good discrimination of the models (C statistic=0.72-0.82) and excellent calibration (coefficients of determination for predicted versus observed deciles=0.83-0.97). The health status outcomes of stable ischemic heart disease patients treated by OMT+PCI versus OMT alone can be predicted with modest accuracy. Angina and quality of life at 1 year are improved by PCI but are more strongly associated with other patient characteristics. URL: https://www.clinicaltrials.gov. Unique identifier: NCT00007657. © 2018 American Heart Association, Inc.
Gram Quist, Helle; Christensen, Ulla; Christensen, Karl Bang; Aust, Birgit; Borg, Vilhelm; Bjorner, Jakob B
2013-01-17
Lifestyle variables may serve as important intermediate factors between psychosocial work environment and health outcomes. Previous studies, focussing on work stress models have shown mixed and weak results in relation to weight change. This study aims to investigate psychosocial factors outside the classical work stress models as potential predictors of change in body mass index (BMI) in a population of health care workers. A cohort study, with three years follow-up, was conducted among Danish health care workers (3982 women and 152 men). Logistic regression analyses examined change in BMI (more than +/- 2 kg/m(2)) as predicted by baseline psychosocial work factors (work pace, workload, quality of leadership, influence at work, meaning of work, predictability, commitment, role clarity, and role conflicts) and five covariates (age, cohabitation, physical work demands, type of work position and seniority). Among women, high role conflicts predicted weight gain, while high role clarity predicted both weight gain and weight loss. Living alone also predicted weight gain among women, while older age decreased the odds of weight gain. High leadership quality predicted weight loss among men. Associations were generally weak, with the exception of quality of leadership, age, and cohabitation. This study of a single occupational group suggested a few new risk factors for weight change outside the traditional work stress models.
Shi, Xiaohu; Zhang, Jingfen; He, Zhiquan; Shang, Yi; Xu, Dong
2011-09-01
One of the major challenges in protein tertiary structure prediction is structure quality assessment. In many cases, protein structure prediction tools generate good structural models, but fail to select the best models from a huge number of candidates as the final output. In this study, we developed a sampling-based machine-learning method to rank protein structural models by integrating multiple scores and features. First, features such as predicted secondary structure, solvent accessibility and residue-residue contact information are integrated by two Radial Basis Function (RBF) models trained from different datasets. Then, the two RBF scores and five selected scoring functions developed by others, i.e., Opus-CA, Opus-PSP, DFIRE, RAPDF, and Cheng Score, are synthesized by a sampling method. Finally, another integrated RBF model ranks the structural models according to the features of the sampling distribution. We tested the proposed method using two different datasets, including the CASP server prediction models of all CASP8 targets and a set of models generated by our in-house software MUFOLD. The test results show that our method outperforms any individual scoring function on both best model selection and overall correlation between the predicted ranking and the actual ranking of structural quality.
Predicting indoor pollutant concentrations, and applications to air quality management
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lorenzetti, David M.
Because most people spend more than 90% of their time indoors, predicting exposure to airborne pollutants requires models that incorporate the effect of buildings. Buildings affect the exposure of their occupants in a number of ways, both by design (for example, filters in ventilation systems remove particles) and incidentally (for example, sorption on walls can reduce peak concentrations, but prolong exposure to semivolatile organic compounds). Furthermore, building materials and occupant activities can generate pollutants. Indoor air quality depends not only on outdoor air quality, but also on the design, maintenance, and use of the building. For example, "sick building" symptoms such as respiratory problems and headaches have been related to the presence of air-conditioning systems, to carpeting, to low ventilation rates, and to high occupant density (1). The physical processes of interest apply even in simple structures such as homes. Indoor air quality models simulate the processes, such as ventilation and filtration, that control pollutant concentrations in a building. Section 2 describes the modeling approach, and the important transport processes in buildings. Because advection usually dominates among the transport processes, Sections 3 and 4 describe methods for predicting airflows. The concluding section summarizes the application of these models.
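A common starting point for the kind of indoor air quality model described here is a single-zone mass balance, in which the indoor concentration responds to air exchange with outdoors, indoor losses (deposition or filtration), and indoor emission. A minimal sketch under that assumption (the parameter values are illustrative, not taken from the report):

```python
def indoor_concentration(c_out, air_changes_per_hr, loss_rate_per_hr,
                         emission_ug_per_hr, volume_m3, hours, c0=0.0, dt=0.01):
    """Integrate dC/dt = a*C_out - (a + k)*C + E/V for a single well-mixed zone (explicit Euler)."""
    a, k = air_changes_per_hr, loss_rate_per_hr
    c = c0
    for _ in range(int(hours / dt)):
        c += dt * (a * c_out - (a + k) * c + emission_ug_per_hr / volume_m3)
    return c

# Hypothetical case: 20 ug/m3 outdoors, 0.5 ACH, deposition 0.2/h, no indoor source, 250 m3 home
print(indoor_concentration(c_out=20.0, air_changes_per_hr=0.5, loss_rate_per_hr=0.2,
                           emission_ug_per_hr=0.0, volume_m3=250.0, hours=24.0))
```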
Successful use of the Exposure Related Dose Estimating Model (ERDEM) in risk assessment of susceptible human sub-populations, e.g., infants and children, requires input of quality experimental data. In the clear absence of quality data, PBPK models can be developed and possibl...
The U.S. Environmental Protection Agency (USEPA) has a team of scientists developing a next generation air quality modeling system employing the Model for Prediction Across Scales – Atmosphere (MPAS-A) as its meteorological foundation. Several preferred physics schemes and ...
USDA-ARS?s Scientific Manuscript database
This chapter presents the development and application of a three-dimensional water quality model for predicting the distributions of nutrients, phytoplankton, dissolved oxygen, etc., in natural lakes. In this model, the computational domain was divided into two parts: the water column and the bed se...
An examination of data quality on QSAR Modeling in regards ...
The development of QSAR models is critically dependent on the quality of available data. As part of our efforts to develop public platforms to provide access to predictive models, we have attempted to discriminate the influence of the quality versus quantity of data available to develop and validate QSAR models. We have focused our efforts on the widely used EPISuite software that was initially developed over two decades ago and, specifically, on the PHYSPROP dataset used to train the EPISuite prediction models. This presentation will review our approaches to examining key datasets, the delivery of curated data and the development of machine-learning models for thirteen separate property endpoints of interest to environmental science. We will also review how these data will be made freely accessible to the community via a new “chemistry dashboard”. This abstract does not reflect U.S. EPA policy. presentation at UNC-CH.
NASA Astrophysics Data System (ADS)
Gao, Ming; Li, Shiwei
2017-05-01
Based on experimental data of the soybean yield and quality from 30 sampling points, a quantitative structure-activity relationship model (2D-QSAR) was established using the soil quality (elements, pH, organic matter content and cation exchange capacity) as independent variables and soybean yield or quality as the dependent variable, with SPSS software. During the modeling, the full data set (30 and 14 compounds) was divided into a training set (24 and 11 compounds) for model generation and a test set (6 and 3 compounds) for model validation. The R2 values of the resulting models and data were 0.826 and 0.808 for soybean yield and quality, respectively, and all regression coefficients were significant (P < 0.05). The correlation coefficient R2pred of observed values and predicted values of the soybean yield and soybean quality in the test set were 0.961 and 0.956, respectively, indicating that the models had a good predictive ability. Moreover, the Mo, Se, K, N and organic matter contents and the cation exchange capacity of soil had a positive effect on soybean production, and the B, Mo, Se, K and N contents and cation exchange coefficient had a positive effect on soybean quality. The results are instructive for enhancing soils to improve the yield and quality of soybean, and this method can also be used to study other crops or regions, providing a theoretical basis to improving the yield and quality of crops.
McHale, S M; Updegraff, K A; Helms-Erikson, H; Crouter, A C
2001-01-01
The development of gender role qualities (attitudes, personality, leisure activities) from middle childhood to early adolescence was studied to determine whether siblings' gender role qualities predicted those of their sisters and brothers. Participants were 198 firstborn and second-born siblings (Ms = 10 years 9 months and 8 years 3 months, respectively, in Year 1) and their parents. Families were interviewed annually for 3 years. Firstborn siblings' qualities in Year 1 predicted second-born children's qualities in Year 3 when both parent and child qualities in Year 1 were controlled, a pattern consistent with a social learning model of sibling influence. Parental influence was more evident and sibling influence less evident in predicting firstborns' qualities; for firstborns, sibling influences suggested a de-identification process.
Frimpter, M.H.; Donohue, J.J.; Rapacz, M.V.; Beye, H.G.
1990-01-01
A mass-balance accounting model can be used to guide the management of septic systems and fertilizers to control the degradation of groundwater quality in zones of an aquifer that contributes water to public supply wells. The nitrate nitrogen concentration of the mixture in the well can be predicted for steady-state conditions by calculating the concentration that results from the total weight of nitrogen and total volume of water entering the zone of contribution to the well. These calculations will allow water-quality managers to predict the nitrate concentrations that would be produced by different types and levels of development, and to plan development accordingly. Computations for different development schemes provide a technical basis for planners and managers to compare water quality effects and to select alternatives that limit nitrate concentration in wells. Appendix A contains tables of nitrate loads and water volumes from common sources for use with the accounting model. Appendix B describes the preparation of a spreadsheet for the nitrate loading calculations with a software package generally available for desktop computers. (USGS)
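The accounting described here reduces to a flow-weighted mixing calculation: at steady state, the nitrate concentration at the well equals the total nitrogen load entering the zone of contribution divided by the total volume of water entering it. A sketch with hypothetical loads and volumes (the report's Appendix A supplies the actual per-source values):

```python
# Hypothetical annual inputs to a well's zone of contribution
sources = [
    # (description, nitrogen load in kg/yr, recharge water volume in m3/yr)
    ("septic systems, 120 homes", 480.0, 33_000.0),
    ("lawn fertilizer",           150.0, 12_000.0),
    ("natural recharge",           20.0, 250_000.0),
]

total_n_kg = sum(load for _, load, _ in sources)
total_water_m3 = sum(vol for _, _, vol in sources)

# kg/m3 -> mg/L: 1 kg/m3 = 1000 mg/L
nitrate_n_mg_per_l = 1000.0 * total_n_kg / total_water_m3
print(f"Predicted steady-state nitrate-N at the well: {nitrate_n_mg_per_l:.1f} mg/L")
```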
Dong, Jian-Jun; Li, Qing-Liang; Yin, Hua; Zhong, Cheng; Hao, Jun-Guang; Yang, Pan-Fei; Tian, Yu-Hong; Jia, Shi-Ru
2014-10-15
Sensory evaluation is regarded as a necessary procedure to ensure a reproducible quality of beer. Meanwhile, high-throughput analytical methods provide a powerful tool to analyse various flavour compounds, such as higher alcohols and esters. In this study, the relationship between flavour compounds and sensory evaluation was established by non-linear models such as partial least squares (PLS), genetic algorithm back-propagation neural network (GA-BP), and support vector machine (SVM). SVM with a Radial Basis Function (RBF) kernel showed better prediction accuracy for both the calibration set (94.3%) and the validation set (96.2%) than the other models. Relatively lower prediction abilities were observed for GA-BP (52.1%) and PLS (31.7%). In addition, the kernel function played an essential role in SVM model training: the prediction accuracy of SVM with a polynomial kernel function was only 32.9%. As a powerful multivariate statistical method, SVM holds great potential to assess beer quality. Copyright © 2014 Elsevier Ltd. All rights reserved.
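The kernel comparison reported above (RBF versus polynomial) is straightforward to reproduce in outline. The sketch below trains SVM classifiers with both kernels on synthetic flavour-compound features standing in for the measured higher alcohols and esters (data, labels, and hyperparameters are assumptions for illustration):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Synthetic stand-in: 200 beers x 6 flavour compounds, sensory class 0/1 (acceptable or not)
X = rng.normal(size=(200, 6))
y = (X[:, 0] - 0.5 * X[:, 2] + 0.3 * X[:, 4] + rng.normal(0, 0.3, 200) > 0).astype(int)
X_cal, X_val, y_cal, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

for kernel in ("rbf", "poly"):
    model = make_pipeline(StandardScaler(), SVC(kernel=kernel, C=1.0))
    model.fit(X_cal, y_cal)
    print(kernel, "validation accuracy:", model.score(X_val, y_val))
```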
Advanced Computational Methods for High-accuracy Refinement of Protein Low-quality Models
NASA Astrophysics Data System (ADS)
Zang, Tianwu
Predicting the 3-dimensional structure of proteins has been a major interest in modern computational biology. While many successful methods can generate models within 3-5 A root-mean-square deviation (RMSD) of the solution, progress in refining these models has been quite slow. It is therefore urgently needed to develop effective methods to bring low-quality models to higher-accuracy ranges (e.g., less than 2 A RMSD). In this thesis, I present several novel computational methods to address the high-accuracy refinement problem. First, an enhanced sampling method, named parallel continuous simulated tempering (PCST), is developed to accelerate the molecular dynamics (MD) simulation. Second, two energy biasing methods, Structure-Based Model (SBM) and Ensemble-Based Model (EBM), are introduced to perform targeted sampling around important conformations. Third, a three-step method is developed to blindly select high-quality models along the MD simulation. These methods work together to make significant refinement of low-quality models without any knowledge of the solution. The effectiveness of these methods is examined in different applications. Using the PCST-SBM method, models with higher global distance test scores (GDT_TS) are generated and selected in the MD simulation of 18 targets from the refinement category of the 10th Critical Assessment of Structure Prediction (CASP10). In addition, in the refinement test of two CASP10 targets using the PCST-EBM method, it is indicated that EBM may bring the initial model to even higher-quality levels. Furthermore, a multi-round refinement protocol of PCST-SBM improves the model quality of a protein to a level sufficiently high for molecular replacement in X-ray crystallography. Our results justify the crucial position of enhanced sampling in protein structure prediction and demonstrate that a considerable improvement of low-accuracy structures is still achievable with current force fields.
Quality of life as a cancer nursing outcome variable.
Padilla, G V; Grant, M M
1985-10-01
A reliable and valid multidimensional instrument for measuring quality of life in cancer patients has been developed. Furthermore, a model has been offered that describes how quality of life works as an outcome variable. Using this model, predictions were made of how nursing interventions may directly or indirectly impact on quality of life. Initial testing of the model using data from 135 colostomy patients showed how satisfaction with nursing care and personal control act as cognitive mediators of self-worth, which then impacts on dimensions of quality of life.
A neighborhood statistics model for predicting stream pathogen indicator levels.
Pandey, Pramod K; Pasternack, Gregory B; Majumder, Mahbubul; Soupir, Michelle L; Kaiser, Mark S
2015-03-01
Because elevated levels of water-borne Escherichia coli in streams are a leading cause of water quality impairments in the U.S., water-quality managers need tools for predicting aqueous E. coli levels. Presently, E. coli levels may be predicted using complex mechanistic models that have a high degree of unchecked uncertainty or simpler statistical models. To assess spatio-temporal patterns of instream E. coli levels, herein we measured E. coli, a pathogen indicator, at 16 sites (at four different times) within the Squaw Creek watershed, Iowa, and subsequently, the Markov Random Field model was exploited to develop a neighborhood statistics model for predicting instream E. coli levels. Two observed covariates, local water temperature (degrees Celsius) and mean cross-sectional depth (meters), were used as inputs to the model. Predictions of E. coli levels in the water column were compared with independent observational data collected from 16 in-stream locations. The results revealed that spatio-temporal averages of predicted and observed E. coli levels were extremely close. Approximately 66 % of individual predicted E. coli concentrations were within a factor of 2 of the observed values. In only one event, the difference between prediction and observation was beyond one order of magnitude. The mean of all predicted values at 16 locations was approximately 1 % higher than the mean of the observed values. The approach presented here will be useful while assessing instream contaminations such as pathogen/pathogen indicator levels at the watershed scale.
This study examines ozone (O3) predictions from the Community Multiscale Air Quality (CMAQ) model version 4.5 and discusses potential factors influencing the model results. Daily maximum 8-hr average O3 levels are largely underpredicted when observed O...
[On-site evaluation of raw milk qualities by portable Vis/NIR transmittance technique].
Wang, Jia-Hua; Zhang, Xiao-Wei; Wang, Jun; Han, Dong-Hai
2014-10-01
To ensure the material safety of dairy products, visible (Vis)/near infrared (NIR) spectroscopy combined with chemometrics methods was used to develop models for on-site evaluation of fat, protein, dry matter (DM) and lactose. A total of 88 raw milk samples were collected from individual livestock in different years. The spectra of raw milk were measured by a portable Vis/NIR spectrometer with a diffuse transmittance accessory. To remove the scatter effect and baseline drift, the diffuse transmittance spectra were preprocessed by a 2nd order derivative with Savitzky-Golay smoothing (polynomial order 2, 25 data points). Changeable size moving window partial least squares (CSMWPLS) and genetic algorithm partial least squares (GAPLS) methods were suggested to select informative regions for PLS calibration. The PLS and multiple linear regression (MLR) methods were used to develop models for predicting the quality indexes of raw milk. The prediction performance of the CSMWPLS models was similar to that of the GAPLS models for fat, protein, DM and lactose evaluation; the root mean square errors of prediction (RMSEP) were 0.1156/0.1033, 0.0962/0.1137, 0.2013/0.1237 and 0.0774/0.0668, and the relative standard deviations of prediction (RPD) were 8.99/10.06, 3.53/2.99, 5.76/9.38 and 1.81/2.10, respectively. Meanwhile, the MLR models were also calibrated with 8, 10, 9 and 7 variables for fat, protein, DM and lactose, respectively. The prediction performance of the MLR models was better than or close to that of the PLS models. The MLR models to predict fat, protein, DM and lactose yielded RMSEP of 0.1070, 0.0930, 0.1360 and 0.0658, and RPD of 9.72, 3.66, 8.53 and 2.13, respectively. The results demonstrated the usefulness of Vis/NIR spectra combined with multivariate calibration methods as an objective and rapid method for the quality evaluation of complicated raw milks. The results also highlight the potential of portable Vis/NIR instruments for on-site assessment of the quality indexes of raw milk.
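For reference, the two figures of merit quoted throughout this abstract can be computed from reference and predicted values as follows; RPD is taken here as the standard deviation of the reference values divided by RMSEP, which is one common definition (the numbers below are hypothetical):

```python
import numpy as np

def rmsep(reference, predicted):
    """Root mean square error of prediction."""
    return float(np.sqrt(np.mean((np.asarray(reference) - np.asarray(predicted)) ** 2)))

def rpd(reference, predicted):
    """Ratio of the standard deviation of the reference values to the prediction error."""
    return float(np.std(reference, ddof=1) / rmsep(reference, predicted))

# Hypothetical fat contents (%) from a reference method vs Vis/NIR predictions
ref = [3.2, 3.8, 4.1, 2.9, 3.5, 4.4, 3.1, 3.9]
pred = [3.3, 3.7, 4.0, 3.0, 3.6, 4.2, 3.2, 3.8]
print("RMSEP:", rmsep(ref, pred), "RPD:", rpd(ref, pred))
```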
USDA-ARS?s Scientific Manuscript database
Water quality models are used to predict effects of conservation practices to mitigate the transport of herbicides to water bodies. We used two models - the Agricultural Policy/Environmental eXtender (APEX) and the Riparian Ecosystem Management Model (REMM) to predict the movement of atrazine from ...
Development and testing of watershed-scale models for poorly drained soils
Glenn P. Fernandez; George M. Chescheir; R. Wayne Skaggs; Devendra M. Amatya
2005-01-01
Watershed-scale hydrology and water quality models were used to evaluate the cumulative impacts of land use and management practices on downstream hydrology and nitrogen loading of poorly drained watersheds. Field-scale hydrology and nutrient dynamics are predicted by DRAINMOD in both models. In the first model (DRAINMOD-DUFLOW), field-scale predictions are coupled...
Social Anxiety and Friendship Quality over Time.
Rodebaugh, Thomas L; Lim, Michelle H; Shumaker, Erik A; Levinson, Cheri A; Thompson, Tess
2015-01-01
High social anxiety in adults is associated with self-report of impaired friendship quality, but not necessarily with impairment reported by friends. Further, prospective prediction of social anxiety and friendship quality over time has not been tested among adults. We therefore examined friendship quality and social anxiety prospectively in 126 young adults (67 primary participants and 59 friends, aged 17-22 years); the primary participants were screened to be extreme groups to increase power and relevance to clinical samples (i.e., they were recruited based on having very high or very low social interaction anxiety). The prospective relationships between friendship quality and social anxiety were then tested using an Actor-Partner Interdependence Model. Friendship quality prospectively predicted social anxiety over time within each individual in the friendship, such that higher friendship quality at Time 1 predicted lower social anxiety approximately 6 months later at Time 2. Social anxiety did not predict friendship quality. Although the results support the view that social anxiety and friendship quality have an important causal relationship, the results run counter to the assumption that high social anxiety causes poor friendship quality. Interventions to increase friendship quality merit further consideration.
Bettina Ohse; Falk Huettmann; Stefanie M. Ickert-Bond; Glenn P. Juday
2009-01-01
Most wilderness areas still lack accurate distribution information on tree species. We met this need with a predictive GIS modeling approach, using freely available digital data and computer programs to efficiently obtain high-quality species distribution maps. Here we present a digital map with the predicted distribution of white spruce (Picea glauca...
Reliability Prediction of Ontology-Based Service Compositions Using Petri Net and Time Series Models
Li, Jia; Xia, Yunni; Luo, Xin
2014-01-01
OWL-S, one of the most important Semantic Web service ontologies proposed to date, provides a core ontological framework and guidelines for describing the properties and capabilities of web services in an unambiguous, computer-interpretable form. Predicting the reliability of composite service processes specified in OWL-S allows service users to decide whether the process meets the quantitative quality requirement. In this study, we consider the runtime quality of services to be fluctuating and introduce a dynamic framework to predict the runtime reliability of services specified in OWL-S, employing the Non-Markovian stochastic Petri net (NMSPN) and a time series model. The framework includes the following steps: obtaining the historical response time series of individual service components; fitting these series with an autoregressive moving average (ARMA) model and predicting the future firing rates of service components; mapping the OWL-S process into an NMSPN model; employing the predicted firing rates as the model input of the NMSPN and calculating the normal completion probability as the reliability estimate. In the case study, a comparison between the static model and our approach based on experimental data is presented, and it is shown that our approach achieves higher prediction accuracy. PMID:24688429
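The first two steps of the framework (fit an ARMA model to each component's historical response times, then forecast ahead to obtain future firing rates) can be sketched with statsmodels; the series below is synthetic and stands in for the case-study data, and the conversion from forecast response time to firing rate is a simplifying assumption:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(2)

# Synthetic historical response times (ms) of one service component
response_times = 120 + 0.1 * np.cumsum(rng.normal(0, 1.5, 200)) + rng.normal(0, 4, 200)

# ARMA(2,1) is ARIMA with d = 0
model = ARIMA(response_times, order=(2, 0, 1)).fit()
forecast_ms = model.forecast(steps=5)        # predicted future response times
firing_rates = 1000.0 / forecast_ms          # approximate firing rates (1/s) for the NMSPN
print(firing_rates)
```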
Kahlen, Katrin; Stützel, Hartmut
2011-10-01
Light quantity and quality affect internode lengths in cucumber (Cucumis sativus), whereby leaf area and the optical properties of the leaves mainly control light quality within a cucumber plant community. This modelling study aimed at providing a simple, non-destructive method to predict final internode lengths (FILs) using light quantity and leaf area data. Several simplifications of a light quantity and quality sensitive model for estimating FILs in cucumber have been tested. The direct simplifications substitute the term for the red : far-red (R : FR) ratio by a term for (a) the leaf area index (LAI, m(2) m(-2)) or (b) partial LAI, the cumulative leaf area per m(2) ground, where leaf area per m(2) ground is accumulated from the top of each plant until a number, n, of leaves per plant is reached. The indirect simplifications estimate the input R : FR ratio based on partial leaf area and plant density. In all models, simulated FILs were in line with the measured FILs over various canopy architectures and light conditions, but the prediction quality varied. The indirect simplification based on the leaf area of ten leaves revealed the best fit with measured data. Its prediction quality was even higher than that of the original model. This study showed that for vertically trained cucumber plants, leaf area data can substitute for local light quality data when estimating FIL data. In unstressed canopies, leaf area over the upper ten ranks seems to represent the feedback of the growing architecture on internode elongation with respect to light quality. This highlights the role of this domain of leaves as the primary source for the specific R : FR signal controlling the final length of an internode and could therefore guide future research on up-scaling local processes to the crop level.
Biological and functional relevance of CASP predictions.
Liu, Tianyun; Ish-Shalom, Shirbi; Torng, Wen; Lafita, Aleix; Bock, Christian; Mort, Matthew; Cooper, David N; Bliven, Spencer; Capitani, Guido; Mooney, Sean D; Altman, Russ B
2018-03-01
Our goal is to answer the question: compared with experimental structures, how useful are predicted models for functional annotation? We assessed the functional utility of predicted models by comparing the performances of a suite of methods for functional characterization on the predictions and the experimental structures. We identified 28 sites in 25 protein targets to perform functional assessment. These 28 sites included nine sites with known ligand binding (holo-sites), nine sites that are expected or suggested by experimental authors for small molecule binding (apo-sites), and ten sites containing important motifs, loops, or key residues with important disease-associated mutations. We evaluated the utility of the predictions by comparing their microenvironments to the experimental structures. Overall structural quality correlates with functional utility. However, the best-ranked predictions (global) may not have the best functional quality (local). Our assessment provides an ability to discriminate between predictions with high structural quality. When assessing ligand-binding sites, most prediction methods have higher performance on apo-sites than holo-sites. Some servers show consistently high performance for certain types of functional sites. Finally, many functional sites are associated with protein-protein interaction. We also analyzed biologically relevant features from the protein assemblies of two targets where the active site spanned the protein-protein interface. For the assembly targets, we find that the features in the models are mainly determined by the choice of template. © 2017 The Authors Proteins: Structure, Function and Bioinformatics Published by Wiley Periodicals, Inc.
Sleep Quality Prediction From Wearable Data Using Deep Learning.
Sathyanarayana, Aarti; Joty, Shafiq; Fernandez-Luque, Luis; Ofli, Ferda; Srivastava, Jaideep; Elmagarmid, Ahmed; Arora, Teresa; Taheri, Shahrad
2016-11-04
The importance of sleep is paramount to health. Insufficient sleep can reduce physical, emotional, and mental well-being and can lead to a multitude of health complications among people with chronic conditions. Physical activity and sleep are highly interrelated health behaviors. Our physical activity during the day (ie, awake time) influences our quality of sleep, and vice versa. The current popularity of wearables for tracking physical activity and sleep, including actigraphy devices, can foster the development of new advanced data analytics. This can help to develop new electronic health (eHealth) applications and provide more insights into sleep science. The objective of this study was to evaluate the feasibility of predicting sleep quality (ie, poor or adequate sleep efficiency) given the physical activity wearable data during awake time. In this study, we focused on predicting good or poor sleep efficiency as an indicator of sleep quality. Actigraphy sensors are wearable medical devices used to study sleep and physical activity patterns. The dataset used in our experiments contained the complete actigraphy data from a subset of 92 adolescents over 1 full week. Physical activity data during awake time was used to create predictive models for sleep quality, in particular, poor or good sleep efficiency. The physical activity data from sleep time was used for the evaluation. We compared the predictive performance of traditional logistic regression with more advanced deep learning methods: multilayer perceptron (MLP), convolutional neural network (CNN), simple Elman-type recurrent neural network (RNN), long short-term memory (LSTM-RNN), and a time-batched version of LSTM-RNN (TB-LSTM). Deep learning models were able to predict the quality of sleep (ie, poor or good sleep efficiency) based on wearable data from awake periods. More specifically, the deep learning methods performed better than traditional logistic regression. CNN had the highest specificity and sensitivity, and an overall area under the receiver operating characteristic (ROC) curve (AUC) of 0.9449, which was 46% better as compared with traditional logistic regression (0.6463). Deep learning methods can predict the quality of sleep based on actigraphy data from awake periods. These predictive models can be an important tool for sleep research and to improve eHealth solutions for sleep. ©Aarti Sathyanarayana, Shafiq Joty, Luis Fernandez-Luque, Ferda Ofli, Jaideep Srivastava, Ahmed Elmagarmid, Teresa Arora, Shahrad Taheri. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 04.11.2016.
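As a point of reference for the comparison described above, the logistic-regression baseline over summary features of awake-time activity can be set up in a few lines; the deep models (MLP, CNN, LSTM) would replace the classifier while keeping the same labels and evaluation. Synthetic data stands in for the actigraphy set, and the chosen features are assumptions:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)

# Synthetic stand-in: per-day summaries of awake-time activity
# (mean counts, variance, sedentary fraction, bout count); label = good (1) / poor (0) sleep efficiency
X = rng.normal(size=(600, 4))
y = (0.8 * X[:, 0] - 0.6 * X[:, 2] + rng.normal(0, 0.8, 600) > 0).astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

clf = LogisticRegression().fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print("baseline logistic-regression AUC:", round(auc, 3))
```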
Uncertainties in ozone concentrations predicted with a Lagrangian photochemical air quality model have been estimated using Bayesian Monte Carlo (BMC) analysis. Bayesian Monte Carlo analysis provides a means of combining subjective "prior" uncertainty estimates developed ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lian, J; Yuan, L; Wu, Q
Purpose: The quality and efficiency of radiotherapy treatment planning are highly planner dependent. Previously we have developed a statistical model to correlate anatomical features with dosimetry features of head and neck Tomotherapy treatment. The model enables us to predict the best achievable dosimetry for an individual patient prior to treatment planning. The purpose of this work is to study whether the prediction model can improve treatment planning in both efficiency and dosimetric quality. Methods: The anatomy-dosimetry correlation model was used to calculate the expected DVH for nine patients formerly treated. In Group A (3 patients), the model prediction agreed with the clinical plan; in Group B (3 patients), the model predicted a lower larynx mean dose than the clinical plan; in Group C (3 patients), the model suggested the brainstem could be further spared. Guided by this prior knowledge, we re-planned all 9 cases. The number of interactions during the optimization process and the dosimetric endpoints were compared between the original clinical plan and the model-guided re-plan. Results: For Group A, the difference in target coverage and organ-at-risk sparing between the re-plan and the clinical plan is insignificant (p>0.05). For Group B, the clinical plan larynx median dose is 49.4±4.7 Gy, while the prediction suggested 40.0±6.2 Gy (p<0.05). The re-plan achieved 41.5±6.6 Gy, with doses to other structures similar to the clinical plan. For Group C, the clinical plan brainstem maximum dose is 44.7±5.5 Gy. The model predicted a lower value of 32.2±3.8 Gy (p<0.05). The re-plans reduced the brainstem maximum dose to 31.8±4.1 Gy without affecting the dosimetry of other structures. In re-planning the 9 cases, the number of times the operator interacted with the TPS was reduced on average by about 50% compared to the clinical plan. Conclusion: We have demonstrated that the prior expert knowledge embedded in the model improved the efficiency and quality of Tomotherapy treatment planning.
Can Predictive Modeling Identify Head and Neck Oncology Patients at Risk for Readmission?
Manning, Amy M; Casper, Keith A; Peter, Kay St; Wilson, Keith M; Mark, Jonathan R; Collar, Ryan M
2018-05-01
Objective Unplanned readmission within 30 days is a contributor to health care costs in the United States. The use of predictive modeling during hospitalization to identify patients at risk for readmission offers a novel approach to quality improvement and cost reduction. Study Design Two-phase study including retrospective analysis of prospectively collected data followed by prospective longitudinal study. Setting Tertiary academic medical center. Subjects and Methods Prospectively collected data for patients undergoing surgical treatment for head and neck cancer from January 2013 to January 2015 were used to build predictive models for readmission within 30 days of discharge using logistic regression, classification and regression tree (CART) analysis, and random forests. One model (logistic regression) was then placed prospectively into the discharge workflow from March 2016 to May 2016 to determine the model's ability to predict which patients would be readmitted within 30 days. Results In total, 174 admissions had descriptive data. Thirty-two were excluded due to incomplete data. Logistic regression, CART, and random forest predictive models were constructed using the remaining 142 admissions. When applied to 106 consecutive prospective head and neck oncology patients at the time of discharge, the logistic regression model predicted readmissions with a specificity of 94%, a sensitivity of 47%, a negative predictive value of 90%, and a positive predictive value of 62% (odds ratio, 14.9; 95% confidence interval, 4.02-55.45). Conclusion Prospectively collected head and neck cancer databases can be used to develop predictive models that can accurately predict which patients will be readmitted. This offers valuable support for quality improvement initiatives and readmission-related cost reduction in head and neck cancer care.
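The performance figures quoted for the prospective phase (sensitivity, specificity, PPV, NPV) all follow from the 2x2 table of predicted versus observed readmissions. A short sketch of the arithmetic (the counts below are hypothetical, not the study's):

```python
def readmission_metrics(tp, fp, fn, tn):
    """Standard diagnostic metrics from a 2x2 table of predicted vs observed readmissions."""
    return {
        "sensitivity": tp / (tp + fn),   # readmitted patients flagged by the model
        "specificity": tn / (tn + fp),   # non-readmitted patients correctly cleared
        "ppv": tp / (tp + fp),           # flagged patients who were actually readmitted
        "npv": tn / (tn + fn),           # cleared patients who were not readmitted
    }

# Hypothetical counts for 106 discharges
print(readmission_metrics(tp=10, fp=6, fn=11, tn=79))
```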
Air pollution exposure prediction approaches used in air pollution epidemiology studies.
Özkaynak, Halûk; Baxter, Lisa K; Dionisio, Kathie L; Burke, Janet
2013-01-01
Epidemiological studies of the health effects of outdoor air pollution have traditionally relied upon surrogates of personal exposures, most commonly ambient concentration measurements from central-site monitors. However, this approach may introduce exposure prediction errors and misclassification of exposures for pollutants that are spatially heterogeneous, such as those associated with traffic emissions (e.g., carbon monoxide, elemental carbon, nitrogen oxides, and particulate matter). We review alternative air quality and human exposure metrics applied in recent air pollution health effect studies discussed during the International Society of Exposure Science 2011 conference in Baltimore, MD. Symposium presenters considered various alternative exposure metrics, including: central site or interpolated monitoring data, regional pollution levels predicted using the national scale Community Multiscale Air Quality model or from measurements combined with local-scale (AERMOD) air quality models, hybrid models that include satellite data, statistically blended modeling and measurement data, concentrations adjusted by home infiltration rates, and population-based human exposure model (Stochastic Human Exposure and Dose Simulation, and Air Pollutants Exposure models) predictions. These alternative exposure metrics were applied in epidemiological applications to health outcomes, including daily mortality and respiratory hospital admissions, daily hospital emergency department visits, daily myocardial infarctions, and daily adverse birth outcomes. This paper summarizes the research projects presented during the symposium, with full details of the work presented in individual papers in this journal issue.
A Spectral Method for Spatial Downscaling
Reich, Brian J.; Chang, Howard H.; Foley, Kristen M.
2014-01-01
Summary Complex computer models play a crucial role in air quality research. These models are used to evaluate potential regulatory impacts of emission control strategies and to estimate air quality in areas without monitoring data. For both of these purposes, it is important to calibrate model output with monitoring data to adjust for model biases and improve spatial prediction. In this article, we propose a new spectral method to study and exploit complex relationships between model output and monitoring data. Spectral methods allow us to estimate the relationship between model output and monitoring data separately at different spatial scales, and to use model output for prediction only at the appropriate scales. The proposed method is computationally efficient and can be implemented using standard software. We apply the method to compare Community Multiscale Air Quality (CMAQ) model output with ozone measurements in the United States in July 2005. We find that CMAQ captures large-scale spatial trends, but has low correlation with the monitoring data at small spatial scales. PMID:24965037
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moore, Kevin L., E-mail: kevinmoore@ucsd.edu; Schmidt, Rachel; Moiseenko, Vitali
Purpose: The purpose of this study was to quantify the frequency and clinical severity of quality deficiencies in intensity modulated radiation therapy (IMRT) planning in the Radiation Therapy Oncology Group 0126 protocol. Methods and Materials: A total of 219 IMRT patients from the high-dose arm (79.2 Gy) of RTOG 0126 were analyzed. To quantify plan quality, we used established knowledge-based methods for patient-specific dose-volume histogram (DVH) prediction of organs at risk and a Lyman-Kutcher-Burman (LKB) model for grade ≥2 rectal complications to convert DVHs into normal tissue complication probabilities (NTCPs). The LKB model was validated by fitting dose-response parameters relative to observed toxicities. The 90th percentile (22 of 219) of plans with the lowest excess risk (difference between clinical and model-predicted NTCP) were used to create a model for the presumed best practices in the protocol (pDVH_(0126,top10%)). Applying the resultant model to the entire sample enabled comparisons between DVHs that patients could have received and DVHs they actually received. Excess risk quantified the clinical impact of suboptimal planning. Accuracy of pDVH predictions was validated by replanning 30 of 219 patients (13.7%), including equal numbers of presumed “high-quality,” “low-quality,” and randomly sampled plans. NTCP-predicted toxicities were compared to adverse events on protocol. Results: Existing models showed that bladder-sparing variations were less prevalent than rectum quality variations and that increased rectal sparing was not correlated with target metrics (dose received by 98% and 2% of the PTV, respectively). Observed toxicities were consistent with current LKB parameters. Converting DVH and pDVH_(0126,top10%) to rectal NTCPs, we observed 94 of 219 patients (42.9%) with ≥5% excess risk, 20 of 219 patients (9.1%) with ≥10% excess risk, and 2 of 219 patients (0.9%) with ≥15% excess risk. Replanning demonstrated the predicted NTCP reductions while maintaining the volume of the PTV receiving prescription dose. An equivalent sample of high-quality plans showed fewer toxicities than low-quality plans, 6 of 73 versus 10 of 73 respectively, although these differences were not significant (P=.21) due to insufficient statistical power in this retrospective study. Conclusions: Plan quality deficiencies in RTOG 0126 exposed patients to substantial excess risk for rectal complications.
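The conversion of a DVH into an NTCP via the Lyman-Kutcher-Burman model follows a standard generalized-EUD plus probit form. The sketch below shows that calculation; the parameter values (n, m, TD50) and the example DVH bins are placeholders, not the fitted values from the RTOG 0126 analysis.

```python
# Minimal sketch of the LKB NTCP calculation used to convert a rectal DVH into a
# complication probability. Parameter values below are placeholders, not study fits.
import numpy as np
from scipy.stats import norm

def lkb_ntcp(dose_bins_gy, volume_fractions, n=0.09, m=0.13, td50=76.9):
    """Generalized EUD followed by a probit dose-response (standard LKB form)."""
    v = np.asarray(volume_fractions, dtype=float)
    v = v / v.sum()                                        # normalize to fractional volume
    geud = np.sum(v * np.asarray(dose_bins_gy) ** (1.0 / n)) ** n
    t = (geud - td50) / (m * td50)
    return norm.cdf(t)

# Example: a coarse differential DVH for the rectum (illustrative bins).
doses = np.array([20.0, 40.0, 60.0, 75.0])   # Gy
vols = np.array([0.40, 0.30, 0.20, 0.10])    # fractional volume per bin
print(f"NTCP = {lkb_ntcp(doses, vols):.3f}")
# Excess risk, as defined above, would be NTCP(clinical DVH) - NTCP(predicted pDVH).
```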
PconsFold: improved contact predictions improve protein models.
Michel, Mirco; Hayat, Sikander; Skwark, Marcin J; Sander, Chris; Marks, Debora S; Elofsson, Arne
2014-09-01
Recently it has been shown that the quality of protein contact prediction from evolutionary information can be improved significantly if direct and indirect information is separated. Given sufficiently large protein families, the contact predictions contain sufficient information to predict the structure of many protein families. However, since the first studies, contact prediction methods have improved. Here, we ask how much the final models are improved if improved contact predictions are used. In a small benchmark of 15 proteins, we show that the TM-scores of top-ranked models are improved by on average 33% using PconsFold compared with the original version of EVfold. In a larger benchmark, we find that the quality is improved by 15-30% when using PconsC in comparison with earlier contact prediction methods. Further, using Rosetta instead of CNS does not significantly improve global model accuracy, but the chemistry of models generated with Rosetta is improved. PconsFold is a fully automated pipeline for ab initio protein structure prediction based on evolutionary information. PconsFold is based on PconsC contact prediction and uses the Rosetta folding protocol. Due to its modularity, the contact prediction tool can be easily exchanged. The source code of PconsFold is available on GitHub at https://www.github.com/ElofssonLab/pcons-fold under the MIT license. PconsC is available from http://c.pcons.net/. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
Regional-scale air quality models are used to estimate the response of air pollutants to potential emission control strategies as part of the decision-making process. Traditionally, the model predicted pollutant concentrations are evaluated for the “base case” to assess a model’s...
USDA-ARS?s Scientific Manuscript database
In order to control algal blooms, stressor-response relationships between water quality metrics, environmental variables, and algal growth should be understood and modeled. Machine-learning methods were suggested to express stressor-response relationships found by application of mechanistic water qu...
DOT National Transportation Integrated Search
1986-01-01
This report describes an investigation of state-of-the-art models for predicting the impact on air quality of additions or changes to a highway system identified by the U.S. Environmental Protection Agency as a "non-attainment area" for air quality s...
Stochastic performance modeling and evaluation of obstacle detectability with imaging range sensors
NASA Technical Reports Server (NTRS)
Matthies, Larry; Grandjean, Pierrick
1993-01-01
Statistical modeling and evaluation of the performance of obstacle detection systems for Unmanned Ground Vehicles (UGVs) is essential for the design, evaluation, and comparison of sensor systems. In this report, we address this issue for imaging range sensors by dividing the evaluation problem into two levels: quality of the range data itself and quality of the obstacle detection algorithms applied to the range data. We review existing models of the quality of range data from stereo vision and AM-CW LADAR, then use these to derive a new model for the quality of a simple obstacle detection algorithm. This model predicts the probability of detecting obstacles and the probability of false alarms, as a function of the size and distance of the obstacle, the resolution of the sensor, and the level of noise in the range data. We evaluate these models experimentally using range data from stereo image pairs of a gravel road with known obstacles at several distances. The results show that the approach is a promising tool for predicting and evaluating the performance of obstacle detection with imaging range sensors.
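A generic detection-theory sketch (not the report's derived model) of how detection and false-alarm probabilities can be computed for a step obstacle under Gaussian range noise that grows with distance. The stereo-noise constants and the use of range error as a proxy for the error in the estimated obstacle height are simplifying assumptions.

```python
# Illustrative obstacle-detection sketch: threshold the estimated obstacle height,
# with measurement noise that grows with range as in stereo triangulation.
import numpy as np
from scipy.stats import norm

def stereo_range_sigma(r, baseline=0.3, focal_px=700.0, disparity_sigma_px=0.3):
    """Range noise grows roughly quadratically with range for stereo (assumed constants)."""
    return disparity_sigma_px * r**2 / (focal_px * baseline)

def detection_probabilities(h, r, threshold):
    """P(detect) and P(false alarm) for a height-threshold test at range r.
    Range error is used as a proxy for height error -- a simplification."""
    sigma = stereo_range_sigma(r)
    p_detect = 1.0 - norm.cdf((threshold - h) / sigma)
    p_false_alarm = 1.0 - norm.cdf(threshold / sigma)
    return p_detect, p_false_alarm

p_det, p_fa = detection_probabilities(h=0.25, r=10.0, threshold=0.15)
print(f"P(detect)={p_det:.2f}, P(false alarm)={p_fa:.3f}")
```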
A review of AirQ Models and their applications for forecasting the air pollution health outcomes.
Oliveri Conti, Gea; Heibati, Behzad; Kloog, Itai; Fiore, Maria; Ferrante, Margherita
2017-03-01
Even though clean air is considered as a basic requirement for the maintenance of human health, air pollution continues to pose a significant health threat in developed and developing countries alike. Monitoring and modeling of classic and emerging pollutants is vital to our knowledge of health outcomes in exposed subjects and to our ability to predict them. The ability to anticipate and manage changes in atmospheric pollutant concentrations relies on an accurate representation of the chemical state of the atmosphere. The task of providing the best possible analysis of air pollution thus requires efficient computational tools enabling efficient integration of observational data into models. A number of air quality models have been developed and play an important role in air quality management. Even though a large number of air quality models have been discussed or applied, their heterogeneity makes it difficult to select one approach above the others. This paper provides a brief review on air quality models with respect to several aspects such as prediction of health effects.
Fire frequency in the Interior Columbia River Basin: Building regional models from fire history data
McKenzie, D.; Peterson, D.L.; Agee, James K.
2000-01-01
Fire frequency affects vegetation composition and successional pathways; thus it is essential to understand fire regimes in order to manage natural resources at broad spatial scales. Fire history data are lacking for many regions for which fire management decisions are being made, so models are needed to estimate past fire frequency where local data are not yet available. We developed multiple regression models and tree-based (classification and regression tree, or CART) models to predict fire return intervals across the interior Columbia River basin at 1-km resolution, using georeferenced fire history, potential vegetation, cover type, and precipitation databases. The models combined semiqualitative methods and rigorous statistics. The fire history data are of uneven quality; some estimates are based on only one tree, and many are not cross-dated. Therefore, we weighted the models based on data quality and performed a sensitivity analysis of the effects on the models of estimation errors that are due to lack of cross-dating. The regression models predict fire return intervals from 1 to 375 yr for forested areas, whereas the tree-based models predict a range of 8 to 150 yr. Both types of models predict latitudinal and elevational gradients of increasing fire return intervals. Examination of regional-scale output suggests that, although the tree-based models explain more of the variation in the original data, the regression models are less likely to produce extrapolation errors. Thus, the models serve complementary purposes in elucidating the relationships among fire frequency, the predictor variables, and spatial scale. The models can provide local managers with quantitative information and provide data to initialize coarse-scale fire-effects models, although predictions for individual sites should be treated with caution because of the varying quality and uneven spatial coverage of the fire history database. The models also demonstrate the integration of qualitative and quantitative methods when requisite data for fully quantitative models are unavailable. They can be tested by comparing new, independent fire history reconstructions against their predictions and can be continually updated, as better fire history data become available.
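A minimal sketch of a quality-weighted tree-based (CART) model for fire return interval, in the spirit of the approach above; the predictor names, synthetic site data, and the 0.5/1.0 quality weights (e.g., not cross-dated versus cross-dated) are illustrative assumptions.

```python
# Hedged sketch: CART regression of fire return interval on site variables,
# with observations weighted by data quality. All data here are synthetic.
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(1)
n = 300
sites = pd.DataFrame({
    "elevation_m": rng.uniform(300, 2500, n),
    "latitude": rng.uniform(42, 49, n),
    "annual_precip_mm": rng.uniform(250, 2000, n),
    "cover_type": rng.integers(0, 5, n),          # coded vegetation cover type (assumed)
})
fri = 10 + 0.05 * sites["elevation_m"] + 5 * (sites["latitude"] - 42) + rng.normal(0, 15, n)
quality_weight = rng.choice([0.5, 1.0], size=n)   # assumed data-quality weights

cart = DecisionTreeRegressor(max_depth=4, min_samples_leaf=10)
cart.fit(sites, fri, sample_weight=quality_weight)
print(cart.predict(sites.iloc[:3]))               # predicted fire return intervals (yr)
```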
An analytical approach for predicting pilot induced oscillations
NASA Technical Reports Server (NTRS)
Hess, R. A.
1981-01-01
The optimal control model (OCM) of the human pilot is applied to the study of aircraft handling qualities. Attention is focused primarily on longitudinal tasks. The modeling technique differs from previous applications of the OCM in that considerable effort is expended in simplifying the pilot/vehicle analysis. After briefly reviewing the OCM, a technique for modeling the pilot controlling higher order systems is introduced. Following this, a simple criterion for determining the susceptibility of an aircraft to pilot induced oscillations (PIO) is formulated. Finally, a model-based metric for pilot rating prediction is discussed. The resulting modeling procedure provides a relatively simple, yet unified approach to the study of a variety of handling qualities problems.
The current study examines predictions of transference ratios and related modeled parameters for oxidized sulfur and oxidized nitrogen using five years (2002-2006) of 12-km grid cell-specific annual estimates from EPA’s Community Air Quality Model (CMAQ) for five selected sub-re...
USDA-ARS?s Scientific Manuscript database
Few studies have attempted to quantify mass balances of both pesticides and degradates in multiple agricultural settings of the United States. We used inverse modeling to calibrate the Root Zone Water Quality Model (RZWQM) for predicting the unsaturated-zone transport and fate of metolachlor, metola...
In this study, the concept of scale analysis is applied to evaluate two state-of-science meteorological models, namely MM5 and RAMS3b, currently being used to drive regional-scale air quality models. To this end, seasonal time series of observations and predictions for temperatur...
The central purpose of our study was to examine the performance of the United States Environmental Protection Agency's (EPA) nonreactive Gaussian air quality dispersion model, the Industrial Source Complex Short Term Model (ISCST3) Version 98226, in predicting polychlorinated dib...
Individuation or Identification? Self-Objectification and the Mother-Adolescent Relationship
Katz-Wise, Sabra L.; Budge, Stephanie L.; Lindberg, Sara M.; Hyde, Janet S.
2013-01-01
Do adolescents model their mothers’ self-objectification? We measured self-objectification (body surveillance and body shame), body mass index (BMI), body esteem, and quality of the mother-adolescent relationship in 179 female and 162 male adolescents at age 15, as well as self-objectification in their mothers. Initial analyses indicated no improvement in model fit if paths were allowed to differ for females and males; therefore a single model was tested for the combined sample. Findings revealed that mothers’ body surveillance negatively predicted adolescents’ body surveillance. Mothers’ body shame was unrelated to adolescents’ body shame, but positively predicted adolescents’ body surveillance. Results for the relationship between mothers’ and adolescents’ self-objectification suggest that adolescents engaged in more individuation than modeling. A more positive mother-adolescent relationship predicted lower body shame and higher body esteem in adolescents, suggesting that the quality of the relationship with the mother may be a protective factor for adolescents’ body image. Mother-adolescent relationship quality did not moderate the association between mothers’ and adolescents’ self-objectification. These findings contribute to our understanding about the sociocultural role of parents in adolescents’ body image and inform interventions addressing negative body image in this age group. The quality of the mother-adolescent relationship is a clear point of entry for such interventions. Therapists should work with adolescents and their mothers toward a more positive relationship quality, which could then positively impact adolescents’ body image. PMID:24363490
Multiple Sensitivity Testing for Regional Air Quality Model in summer 2014
NASA Astrophysics Data System (ADS)
Tang, Y.; Lee, P.; Pan, L.; Tong, D.; Kim, H. C.; Huang, M.; Wang, J.; McQueen, J.; Lu, C. H.; Artz, R. S.
2015-12-01
The NOAA Air Resources Laboratory leads efforts to improve the performance of the U.S. National Air Quality Forecasting Capability (NAQFC), which is operational at the NOAA National Centers for Environmental Prediction (NCEP) and focuses on predicting surface ozone and PM2.5. In order to improve its performance, we tested several approaches, including NOAA Environmental Modeling System Global Aerosol Component (NGAC) simulation-derived ozone and aerosol lateral boundary conditions (LBC), bi-directional NH3 emissions, and HMS (Hazard Mapping System)-BlueSky emissions with the latest U.S. EPA Community Multi-scale Air Quality model (CMAQ) version and the U.S. EPA National Emission Inventory (NEI)-2011 anthropogenic emissions. The operational NAQFC uses static profiles for its lateral boundary conditions (LBC), which does not pose a severe issue for near-surface air quality prediction. However, its degraded performance for the upper layers (e.g., above 3 km) is evident when compared with aircraft-measured ozone. NCEP's Global Forecast System (GFS) treats tracer O3 as a 3-D prognostic variable (Moorthi and Iredell, 1998) after initialization with Solar Backscatter Ultra Violet-2 (SBUV-2) satellite data. We applied that ozone LBC to CMAQ's upper layers and obtained more reasonable O3 predictions than with the static LBC when compared with aircraft data from the DISCOVER-AQ Colorado campaign. NGAC's aerosol LBC also improved the PM2.5 prediction with more realistic background aerosols. The bi-directional NH3 emissions used in CMAQ also help reduce the NH3 and nitrate under-prediction issue. During summer 2014, strong wildfires occurred in the northwestern USA; we used the US Forest Service's BlueSky fire emissions with HMS fire counts to drive CMAQ and tested the difference between day-1 and day-2 fire emission estimates. Other related issues are also discussed.
Modelling of beef sensory quality for a better prediction of palatability.
Hocquette, Jean-François; Van Wezemael, Lynn; Chriki, Sghaier; Legrand, Isabelle; Verbeke, Wim; Farmer, Linda; Scollan, Nigel D; Polkinghorne, Rod; Rødbotten, Rune; Allen, Paul; Pethick, David W
2014-07-01
Despite efforts by the industry to control the eating quality of beef, there remains a high level of variability in palatability, which is one reason for consumer dissatisfaction. In Europe, there is still no reliable on-line tool to predict beef quality and deliver consistent quality beef to consumers. Beef quality traits depend in part on the physical and chemical properties of the muscles. The determination of these properties (known as muscle profiling) will allow for more informed decisions to be made in the selection of individual muscles for the production of value-added products. Therefore, scientists and professional partners of the ProSafeBeef project have brought together all the data they have accumulated over 20 years. The resulting BIF-Beef (Integrated and Functional Biology of Beef) data warehouse contains available data of animal growth, carcass composition, muscle tissue characteristics and beef quality traits. This database is useful to determine the most important muscle characteristics associated with a high tenderness, a high flavour or generally a high quality. Another more consumer driven modelling tool was developed in Australia: the Meat Standards Australia (MSA) grading scheme that predicts beef quality for each individual muscle×specific cooking method combination using various information on the corresponding animals and post-slaughter processing factors. This system has also the potential to detect variability in quality within muscles. The MSA system proved to be effective in predicting beef palatability not only in Australia but also in many other countries. The results of the work conducted in Europe within the ProSafeBeef project indicate that it would be possible to manage a grading system in Europe similar to the MSA system. The combination of the different modelling approaches (namely muscle biochemistry and a MSA-like meat grading system adapted to the European market) is a promising area of research to improve the prediction of beef quality. In both approaches, the volume of data available not only provides statistically sound correlations between various factors and beef quality traits but also a better understanding of the variability of beef quality according to various criteria (breed, age, sex, pH, marbling etc.). © 2013 The American Meat Science Association. All rights reserved.
A statistical model for water quality predictions from a river discharge using coastal observations
NASA Astrophysics Data System (ADS)
Kim, S.; Terrill, E. J.
2007-12-01
Understanding and predicting coastal ocean water quality has benefits for reducing human health risks, protecting the environment, and improving local economies which depend on clean beaches. Continuous observations of coastal physical oceanography increase the understanding of the processes which control the fate and transport of a riverine plume which potentially contains high levels of contaminants from the upstream watershed. A data-driven model of the fate and transport of river plume water from the Tijuana River has been developed using surface current observations provided by a network of HF radar operated as part of a local coastal observatory that has been in place since 2002. The model outputs are compared with water quality sampling of shoreline indicator bacteria, and the skill of an alarm for low water quality is evaluated using the receiver operating characteristic (ROC) curve. In addition, statistical analysis of beach closures in comparison with environmental variables is also discussed.
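The ROC-based skill evaluation of a water-quality alarm can be sketched as below, assuming a model-derived plume exposure score and binary indicator-bacteria exceedances; both arrays are synthetic placeholders rather than the Tijuana River data.

```python
# Sketch of evaluating a low-water-quality alarm with an ROC curve (synthetic data).
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(2)
exceedance = rng.integers(0, 2, 200)                         # 1 = bacteria above standard
plume_score = exceedance * 0.8 + rng.normal(0.5, 0.4, 200)   # model-derived exposure score

fpr, tpr, thresholds = roc_curve(exceedance, plume_score)
print(f"AUC = {roc_auc_score(exceedance, plume_score):.2f}")

# Pick an operating point, e.g. the threshold with the best TPR - FPR trade-off.
best = np.argmax(tpr - fpr)
print(f"alarm threshold = {thresholds[best]:.2f}, "
      f"hit rate = {tpr[best]:.2f}, false alarm rate = {fpr[best]:.2f}")
```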
Longer guts and higher food quality increase energy intake in migratory swans.
van Gils, Jan A; Beekman, Jan H; Coehoorn, Pieter; Corporaal, Els; Dekkers, Ten; Klaassen, Marcel; van Kraaij, Rik; de Leeuw, Rinze; de Vries, Peter P
2008-11-01
1. Within the broad field of optimal foraging, it is increasingly acknowledged that animals often face digestive constraints rather than constraints on rates of food collection. This therefore calls for a formalization of how animals could optimize food absorption rates. 2. Here we generate predictions from a simple graphical optimal digestion model for foragers that aim to maximize their (true) metabolizable food intake over total time (i.e. including nonforaging bouts) under a digestive constraint. 3. The model predicts that such foragers should maintain a constant food retention time, even if gut length or food quality changes. For phenotypically flexible foragers, which are able to change the size of their digestive machinery, this means that an increase in gut length should go hand in hand with an increase in gross intake rate. It also means that better quality food should be digested more efficiently. 4. These latter two predictions are tested in a large avian long-distance migrant, the Bewick's swan (Cygnus columbianus bewickii), feeding on grasslands in its Dutch wintering quarters. 5. Throughout winter, free-ranging Bewick's swans, growing a longer gut and experiencing improved food quality, increased their gross intake rate (i.e. bite rate) and showed a higher digestive efficiency. These responses were in accordance with the model and suggest maintenance of a constant food retention time. 6. These changes doubled the birds' absorption rate. Had only food quality changed (and not gut length), then absorption rate would have increased by only 67%; absorption rate would have increased by only 17% had only gut length changed (and not food quality). 7. The prediction that gross intake rate should go up with gut length parallels the mechanism included in some proximate models of foraging that feeding motivation scales inversely to gut fullness. We plea for a tighter integration between ultimate and proximate foraging models.
Real-time control of combined surface water quantity and quality: polder flushing.
Xu, M; van Overloop, P J; van de Giesen, N C; Stelling, G S
2010-01-01
In open water systems, keeping both water depths and water quality at specified values is critical for maintaining a 'healthy' water system. Many systems still require manual operation, at least for water quality management. When applying real-time control, both quantity and quality standards need to be met. In this paper, an artificial polder flushing case is studied. Model Predictive Control (MPC) is developed to control the system. In addition to MPC, a 'forward estimation' procedure is used to acquire water quality predictions for the simplified model used in MPC optimization. In order to illustrate the advantages of MPC, classical control [Proportional-Integral control (PI)] has been developed for comparison in the test case. The results show that both algorithms are able to control the polder flushing process, but MPC is more efficient in functionality and control flexibility.
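For context, a minimal sketch of the classical PI baseline mentioned above, applied to a toy single-reservoir water-level model; the gains, reservoir geometry, and inflow are assumed values, not the paper's calibrated polder system, and the MPC formulation itself is not reproduced here.

```python
# Toy PI water-level controller on a single-reservoir mass balance (all values assumed).
import numpy as np

dt = 600.0                 # control step [s]
area = 5.0e4               # reservoir surface area [m^2]
setpoint = 1.50            # target water depth [m]
kp, ki = 10.0, 0.0005      # PI gains (assumed)

depth, integral = 1.80, 0.0
inflow = 2.0               # uncontrolled inflow [m^3/s]
for step in range(36):     # simulate 6 hours
    error = depth - setpoint
    integral += error * dt
    pump_out = np.clip(kp * error + ki * integral, 0.0, 6.0)  # actuator limits [m^3/s]
    depth += (inflow - pump_out) * dt / area                  # mass-balance update
print(f"final depth = {depth:.3f} m")
```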
NASA Astrophysics Data System (ADS)
Saleh, D.; Domagalski, J. L.; Smith, R. A.
2016-12-01
The SPARROW (SPAtially-Referenced Regression On Watershed Attributes) model, developed by the U.S. Geological Survey, has been used to identify and quantify the sources of nitrogen and phosphorus in watersheds and to predict their fluxes and concentration at specified locations downstream. Existing SPARROW models use a hybrid statistical approach to describe an annual average ("steady-state") relationship between sources and stream conditions based on long-term water quality monitoring data and spatially-referenced explanatory information. Although these annual models are useful for some management purposes, many water quality issues stem from intra- and inter-annual changes in constituent sources, hydrologic forcing, or other environmental conditions, which cause a lag between watershed inputs and stream water quality. We are developing a seasonal dynamic SPARROW model of sources, fluxes, and yields of phosphorus for the watershed (approximately 9,700 square kilometers) draining to Upper Klamath Lake, Oregon. The lake is hyper-eutrophic and various options are being considered for water quality improvement. The model was calibrated with 11 years of water quality data (2000 to 2010) and simulates seasonal loads and yields for a total of 44 seasons. Phosphorus sources to the watershed include animal manure, farm fertilizer, discharges of treated wastewater, and natural sources (soil and streambed sediment). The model predicts that phosphorus delivery to the lake is strongly affected by intra- and inter-annual changes in precipitation and by temporary seasonal storage of phosphorus in the watershed. The model can be used to predict how different management actions for mitigating phosphorus sources might affect phosphorus loading to the lake as well as the time required for any changes in loading to occur following implementation of the action.
Baggott, Sarah; Cai, Xiaoming; McGregor, Glenn; Harrison, Roy M
2006-05-01
The Regional Atmospheric Modeling System (RAMS) and Urban Airshed Model (UAM IV) have been implemented for prediction of air pollutant concentrations within the West Midlands conurbation of the United Kingdom. The modelling results for wind speed, direction and temperature are in reasonable agreement with observations for two stations, one in a rural area and the other in an urban area. Predictions of surface temperature are generally good for both stations, but the results suggest that the quality of temperature prediction is sensitive to whether cloud cover is reproduced reliably by the model. Wind direction is captured very well by the model, while wind speed is generally overestimated. The air pollution climate of the UK West Midlands is very different to those for which the UAM model was primarily developed, and the methods used to overcome these limitations are described. The model shows a tendency towards under-prediction of primary pollutant (NOx and CO) concentrations, but with suitable attention to boundary conditions and vertical profiles gives fairly good predictions of ozone concentrations. Hourly updating of chemical concentration boundary conditions yields the best results, with input of vertical profiles desirable. The model seriously underpredicts NO2/NO ratios within the urban area and this appears to relate to inadequate production of peroxy radicals. Overall, the chemical reactivity predicted by the model appears to fall well below that occurring in the atmosphere.
The Level of Quality of Work Life to Predict Work Alienation
ERIC Educational Resources Information Center
Erdem, Mustafa
2014-01-01
The current research aims to determine the level of elementary school teachers' quality of work life (QWL) to predict work alienation. The study was designed using the relational survey model. The research population consisted of 1096 teachers employed at 25 elementary schools within the city of Van in the academic year 2010- 2011, and 346…
Accounting for and predicting the influence of spatial autocorrelation in water quality modeling
NASA Astrophysics Data System (ADS)
Miralha, L.; Kim, D.
2017-12-01
Although many studies have attempted to investigate the spatial trends of water quality, more attention is yet to be paid to the consequences of considering and ignoring the spatial autocorrelation (SAC) that exists in water quality parameters. Several studies have mentioned the importance of accounting for SAC in water quality modeling, as well as the differences in outcomes between models that account for and ignore SAC. However, the capacity to predict the magnitude of such differences is still ambiguous. In this study, we hypothesized that SAC inherently possessed by a response variable (i.e., water quality parameter) influences the outcomes of spatial modeling. We evaluated whether the level of inherent SAC is associated with changes in R-Squared, Akaike Information Criterion (AIC), and residual SAC (rSAC), after accounting for SAC during modeling procedure. The main objective was to analyze if water quality parameters with higher Moran's I values (inherent SAC measure) undergo a greater increase in R² and a greater reduction in both AIC and rSAC. We compared a non-spatial model (OLS) to two spatial regression approaches (spatial lag and error models). Predictor variables were the principal components of topographic (elevation and slope), land cover, and hydrological soil group variables. We acquired these data from federal online sources (e.g. USGS). Ten watersheds were selected, each in a different state of the USA. Results revealed that water quality parameters with higher inherent SAC showed substantial increase in R² and decrease in rSAC after performing spatial regressions. However, AIC values did not show significant changes. Overall, the higher the level of inherent SAC in water quality variables, the greater improvement of model performance. This indicates a linear and direct relationship between the spatial model outcomes (R² and rSAC) and the degree of SAC in each water quality variable. Therefore, our study suggests that the inherent level of SAC in response variables can predict improvements in models even before performing spatial regression approaches. We also recognize the constraints of this research and suggest that further studies focus on better ways of defining spatial neighborhoods, considering the differences among stations set in tributaries near to each other and in upstream areas.
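The inherent-SAC measure referred to above, global Moran's I, can be computed directly from a spatial weights matrix. The sketch below does so with synthetic station coordinates and an inverse-distance, row-standardized weights matrix, both of which are assumptions for illustration.

```python
# Sketch: global Moran's I for one water-quality variable over monitoring stations.
import numpy as np

def morans_i(values, weights):
    """Global Moran's I: I = (n / S0) * sum_ij(w_ij z_i z_j) / sum_i(z_i^2)."""
    x = np.asarray(values, dtype=float)
    w = np.asarray(weights, dtype=float)
    z = x - x.mean()
    num = (w * np.outer(z, z)).sum()
    return len(x) / w.sum() * num / (z @ z)

rng = np.random.default_rng(3)
coords = rng.uniform(0, 100, size=(20, 2))              # synthetic station locations
x = coords[:, 0] * 0.05 + rng.normal(0, 1, 20)          # spatially trending variable

# Inverse-distance weights, zero diagonal, row-standardized (assumed neighborhood definition).
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
w = np.where(d > 0, 1.0 / d, 0.0)
w = w / w.sum(axis=1, keepdims=True)

print(f"Moran's I = {morans_i(x, w):.3f}")
```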
No-Reference Image Quality Assessment by Wide-Perceptual-Domain Scorer Ensemble Method.
Liu, Tsung-Jung; Liu, Kuan-Hsien
2018-03-01
A no-reference (NR) learning-based approach to assess image quality is presented in this paper. The devised features are extracted from wide perceptual domains, including brightness, contrast, color, distortion, and texture. These features are used to train a model (scorer) which can predict scores. The scorer selection algorithms are utilized to help simplify the proposed system. In the final stage, the ensemble method is used to combine the prediction results from selected scorers. Two multiple-scale versions of the proposed approach are also presented along with the single-scale one. They turn out to have better performances than the original single-scale method. Because of having features from five different domains at multiple image scales and using the outputs (scores) from selected score prediction models as features for multi-scale or cross-scale fusion (i.e., ensemble), the proposed NR image quality assessment models are robust with respect to more than 24 image distortion types. They also can be used on the evaluation of images with authentic distortions. The extensive experiments on three well-known and representative databases confirm the performance robustness of our proposed model.
ERIC Educational Resources Information Center
Saavedra, Pedro; Kuchak, JoAnn
An error-prone model (EPM) to predict financial aid applicants who are likely to misreport on Basic Educational Opportunity Grant (BEOG) applications was developed, based on interviews conducted with a quality control sample of 1,791 students during 1978-1979. The model was designed to identify corrective methods appropriate for different types of…
Outcomes-Balanced Framework for Emergency Management: A Predictive Model for Preparedness
2013-09-01
Management Total Quality Management (TQM) was developed by W. Edwards Deming in the post-World War II reconstruction period in Japan. It ushered in a...FIGURES Figure 1. From Total Quality Management Principles ....................................................30 Figure 2. Outcomes Logic Model (After...THIRA Threat and Hazard Identification and Risk Assessment TQM Total Quality Management UTL Universal Task List xiv ACKNOWLEDGMENTS German
No-reference quality assessment based on visual perception
NASA Astrophysics Data System (ADS)
Li, Junshan; Yang, Yawei; Hu, Shuangyan; Zhang, Jiao
2014-11-01
The visual quality assessment of images/videos is an ongoing hot research topic, which has become more and more important for numerous image and video processing applications with the rapid development of digital imaging and communication technologies. The goal of image quality assessment (IQA) algorithms is to automatically assess the quality of images/videos in agreement with human quality judgments. Up to now, two kinds of models have been used for IQA, namely full-reference (FR) and no-reference (NR) models. For FR models, IQA algorithms interpret image quality as fidelity or similarity with a perfect image in some perceptual space. However, the reference image is not available in many practical applications, and a NR IQA approach is desired. Considering that natural vision has been optimized by millions of years of evolutionary pressure, many methods attempt to achieve consistency in quality prediction by modeling salient physiological and psychological features of the human visual system (HVS). To reach this goal, researchers try to simulate the HVS with image sparsity coding and supervised machine learning, which reflect two main features of the HVS: a typical HVS captures scenes by sparsity coding and uses experienced knowledge to apperceive objects. In this paper, we propose a novel IQA approach based on visual perception. First, a standard model of the HVS is studied and analyzed, and the sparse representation of the image is obtained with this model; then, the mapping correlation between sparse codes and subjective quality scores is trained with the regression technique of least squares support vector machine (LS-SVM), yielding a regressor that can predict image quality; finally, the visual quality metric of an image is predicted with the trained regressor. We validate the performance of the proposed approach on the Laboratory for Image and Video Engineering (LIVE) database; the distortion types present in the database are: 227 images of JPEG2000, 233 images of JPEG, 174 images of White Noise, 174 images of Gaussian Blur, 174 images of Fast Fading. The database includes a subjective differential mean opinion score (DMOS) for each image. The experimental results show that the proposed approach not only can assess the quality of many kinds of distorted images, but also exhibits superior accuracy and monotonicity.
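A hedged sketch of the regression stage: mapping per-image features to DMOS with a kernel least-squares regressor. Kernel ridge regression is used here as a readily available stand-in for LS-SVM, and the features and scores are synthetic rather than LIVE-database sparse codes.

```python
# Sketch: kernel least-squares regression from image features to DMOS (synthetic data).
import numpy as np
from scipy.stats import spearmanr
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
features = rng.normal(size=(200, 64))                            # stand-in for sparse-code features
dmos = features[:, :5].sum(axis=1) * 3 + 50 + rng.normal(0, 2, 200)

x_tr, x_te, y_tr, y_te = train_test_split(features, dmos, test_size=0.2, random_state=0)
reg = KernelRidge(kernel="rbf", alpha=1.0, gamma=0.01)           # stand-in for LS-SVM
reg.fit(x_tr, y_tr)

pred = reg.predict(x_te)
print(f"SROCC = {spearmanr(pred, y_te).correlation:.3f}")        # monotonicity with DMOS
```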
A manufacturing quality assessment model based-on two stages interval type-2 fuzzy logic
NASA Astrophysics Data System (ADS)
Purnomo, Muhammad Ridwan Andi; Helmi Shintya Dewi, Intan
2016-01-01
This paper presents the development of an assessment model for manufacturing quality using Interval Type-2 Fuzzy Logic (IT2-FL). The proposed model is developed based on one of the building blocks of sustainable supply chain management (SSCM), namely the benefits of SCM, and focuses on quality. The proposed model can be used to predict the quality level of the production chain in a company. The quality of production will affect the quality of the product. In practice, production quality is unique for every type of production system; hence, expert opinion plays a major role in developing the assessment model. The model becomes more complicated when the data contain ambiguity and uncertainty. In this study, IT2-FL is used to model that ambiguity and uncertainty. A case study taken from a company in Yogyakarta shows that the proposed manufacturing quality assessment model works well in determining the quality level of production.
Garcia-Menendez, Fernando; Hu, Yongtao; Odman, Mehmet T
2014-09-15
Air quality forecasts generated with chemical transport models can provide valuable information about the potential impacts of fires on pollutant levels. However, significant uncertainties are associated with fire-related emission estimates as well as their distribution on gridded modeling domains. In this study, we explore the sensitivity of fine particulate matter concentrations predicted by a regional-scale air quality model to the spatial and temporal allocation of fire emissions. The assessment was completed by simulating a fire-related smoke episode in which air quality throughout the Atlanta metropolitan area was affected on February 28, 2007. Sensitivity analyses were carried out to evaluate the significance of emission distribution among the model's vertical layers, along the horizontal plane, and into hourly inputs. Predicted PM2.5 concentrations were highly sensitive to emission injection altitude relative to planetary boundary layer height. Simulations were also responsive to the horizontal allocation of fire emissions and their distribution into single or multiple grid cells. Additionally, modeled concentrations were greatly sensitive to the temporal distribution of fire-related emissions. The analyses demonstrate that, in addition to adequate estimates of emitted mass, successfully modeling the impacts of fires on air quality depends on an accurate spatiotemporal allocation of emissions. Copyright © 2014 Elsevier B.V. All rights reserved.
Stochastic modeling for river pollution of Sungai Perlis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yunus, Nurul Izzaty Mohd.; Rahman, Haliza Abd.; Bahar, Arifah
2015-02-03
River pollution has been recognized as a contributor to a wide range of health problems and disorders in humans. It can pose health dangers to people who come into contact with it, either directly or indirectly. Therefore, it is most important to measure the concentration of Biochemical Oxygen Demand (BOD) as a water quality parameter, since this parameter has long been the basic means for determining the degree of water pollution in rivers. In this study, BOD is used as a parameter to estimate the water quality at Sungai Perlis. It has been observed that Sungai Perlis is polluted due to lack of management and improper use of resources. Therefore, it is important to model the Sungai Perlis water quality in order to describe and predict the water quality system. A secondary data set of BOD concentrations, extracted from the Perlis State Drainage and Irrigation Department website, was used. The first-order differential equation from the Streeter-Phelps model was utilized as the deterministic model, which was then developed into a stochastic model. Results from this study show that the stochastic model is more adequate for describing and predicting the BOD concentration and the water quality system in Sungai Perlis, having a smaller mean squared error (MSE).
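The deterministic component, the Streeter-Phelps dissolved-oxygen deficit solution driven by first-order BOD decay, is a standard result and can be sketched as follows; the rate constants and initial conditions are illustrative, not values calibrated to Sungai Perlis.

```python
# Sketch of the classical Streeter-Phelps DO-deficit solution (illustrative constants).
import numpy as np

def streeter_phelps_deficit(t_days, L0, D0, k1, k2):
    """DO deficit D(t) for initial BOD L0, initial deficit D0,
    deoxygenation rate k1 and reaeration rate k2 (k1 != k2)."""
    t = np.asarray(t_days, dtype=float)
    return (k1 * L0 / (k2 - k1)) * (np.exp(-k1 * t) - np.exp(-k2 * t)) + D0 * np.exp(-k2 * t)

t = np.linspace(0, 10, 11)   # days of travel time
deficit = streeter_phelps_deficit(t, L0=12.0, D0=1.0, k1=0.35, k2=0.55)  # mg/L, 1/day
print(np.round(deficit, 2))
```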
2015-07-15
Comparison of Neural Network and Linear Regression Models in Statistically Predicting Mental and Physical Health Status of Breast Cancer Survivors (front-matter excerpt; the fragment also lists "Long-term effects on cancer survivors' quality of life of physical training versus physical training combined with cognitive-behavioral therapy")
NASA Astrophysics Data System (ADS)
Kim, Dae-Yong; Cho, Byoung-Kwan
2015-11-01
The quality parameters of the Korean traditional rice wine "Makgeolli" were monitored using Fourier transform near-infrared (FT-NIR) spectroscopy with multivariate statistical analysis (MSA) during fermentation. Alcohol, reducing sugar, and titratable acid were the parameters assessed to determine the quality index of fermentation substrates and products. The acquired spectra were analyzed with partial least squares regression (PLSR). The best prediction model for alcohol was obtained with maximum normalization, showing a coefficient of determination (Rp2) of 0.973 and a standard error of prediction (SEP) of 0.760%. In addition, the best prediction model for reducing sugar was obtained with no data preprocessing, with a Rp2 value of 0.945 and a SEP of 1.233%. The prediction of titratable acidity was best with mean normalization, showing a Rp2 value of 0.882 and a SEP of 0.045%. These results demonstrate that FT-NIR spectroscopy can be used for rapid measurements of quality parameters during Makgeolli fermentation.
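A minimal sketch of the PLS-regression step, predicting alcohol content from preprocessed spectra; the simulated spectra, the number of latent components, and the preprocessing choice are assumptions that only mirror the workflow described above.

```python
# Sketch: PLS regression from (simulated) NIR spectra to alcohol content.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n_samples, n_wavelengths = 120, 500
spectra = rng.normal(size=(n_samples, n_wavelengths))                   # stand-in spectra
alcohol = spectra[:, 100:110].mean(axis=1) * 4 + 8 + rng.normal(0, 0.3, n_samples)

# Maximum normalization per spectrum (one of the preprocessing options mentioned above).
spectra = spectra / np.abs(spectra).max(axis=1, keepdims=True)

x_tr, x_te, y_tr, y_te = train_test_split(spectra, alcohol, test_size=0.25, random_state=0)
pls = PLSRegression(n_components=8)    # number of latent variables is an assumption
pls.fit(x_tr, y_tr)
pred = pls.predict(x_te).ravel()
print(f"Rp2 = {r2_score(y_te, pred):.3f}, "
      f"SEP = {np.sqrt(mean_squared_error(y_te, pred)):.3f} %")
```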
Multi-model analysis in hydrological prediction
NASA Astrophysics Data System (ADS)
Lanthier, M.; Arsenault, R.; Brissette, F.
2017-12-01
Hydrologic modelling, by nature, is a simplification of the real-world hydrologic system. Therefore, the ensemble hydrological predictions obtained do not present the full range of possible streamflow outcomes, producing ensembles which demonstrate errors in variance such as under-dispersion. Past studies show that lumped models used in prediction mode can return satisfactory results, especially when there is not enough information available on the watershed to run a distributed model. But all lumped models greatly simplify the complex processes of the hydrologic cycle. To generate more spread in the hydrologic ensemble predictions, multi-model ensembles have been considered. In this study, the aim is to propose and analyse a method that gives an ensemble streamflow prediction that properly represents the forecast probabilities and reduces ensemble bias. To achieve this, three simple lumped models are used to generate an ensemble. These will also be combined using multi-model averaging techniques, which generally generate a more accurate hydrograph than the best of the individual models in simulation mode. This new predictive combined hydrograph is added to the ensemble, thus creating a large ensemble which may improve the variability while also improving the ensemble mean bias. The quality of the predictions is then assessed on different periods: 2 weeks, 1 month, 3 months and 6 months, using a PIT Histogram of the percentiles of the real observation volumes with respect to the volumes of the ensemble members. Initially, the models were run using historical weather data to generate synthetic flows. This worked for individual models, but not for the multi-model and for the large ensemble. Consequently, by performing data assimilation at each prediction period and thus adjusting the initial states of the models, the PIT Histogram could be constructed using the observed flows while allowing the use of the multi-model predictions. The under-dispersion has been largely corrected on short-term predictions. For the longer term, the addition of the multi-model member has been beneficial to the quality of the predictions, although it is too early to determine whether the gain is related to the addition of a member or whether the multi-model member itself adds value.
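The PIT-histogram check described above amounts to locating each observation within its forecast ensemble and histogramming the resulting percentiles; the sketch below does this on synthetic volumes chosen to mimic an under-dispersed ensemble.

```python
# Sketch of a PIT-histogram calibration check for an ensemble streamflow forecast.
import numpy as np

rng = np.random.default_rng(6)
n_periods, n_members = 200, 31
ensemble = rng.normal(loc=100.0, scale=8.0, size=(n_periods, n_members))  # predicted volumes
observed = rng.normal(loc=100.0, scale=12.0, size=n_periods)              # under-dispersed case

# PIT value = fraction of ensemble members below the observation.
pit = (ensemble < observed[:, None]).mean(axis=1)
counts, edges = np.histogram(pit, bins=10, range=(0.0, 1.0))
print(counts)   # a U-shape indicates under-dispersion; a flat histogram indicates good spread
```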
Quality assessment of butter cookies applying multispectral imaging
Andresen, Mette S; Dissing, Bjørn S; Løje, Hanne
2013-01-01
A method for characterization of butter cookie quality by assessing the surface browning and water content using multispectral images is presented. Based on evaluations of the browning of butter cookies, cookies were manually divided into groups. From this categorization, reference values were calculated for a statistical prediction model correlating multispectral images with a browning score. The browning score is calculated as a function of oven temperature and baking time. It is presented as a quadratic response surface. The investigated process window was the intervals 4–16 min and 160–200°C in a forced convection electrically heated oven. In addition to the browning score, a model for predicting the average water content based on the same images is presented. This shows how multispectral images of butter cookies may be used for the assessment of different quality parameters. Statistical analysis showed that the most significant wavelengths for browning predictions were in the interval 400–700 nm and the wavelengths significant for water prediction were primarily located in the near-infrared spectrum. The water prediction model was found to correctly estimate the average water content with an absolute error of 0.22%. From the images it was also possible to follow the browning and drying propagation from the cookie edge toward the center. PMID:24804036
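Fitting the browning score as a quadratic response surface in baking time and oven temperature can be sketched as follows; the synthetic data and coefficients are illustrative and not those of the study.

```python
# Sketch: quadratic response surface for browning score vs. baking time and temperature.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(8)
time_min = rng.uniform(4, 16, 80)       # process window from the abstract: 4-16 min
temp_c = rng.uniform(160, 200, 80)      # and 160-200 degrees C
browning = 0.5 + 0.0015 * time_min * temp_c + rng.normal(0, 0.2, 80)   # synthetic scores

X = np.column_stack([time_min, temp_c])
surface = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
surface.fit(X, browning)
print(surface.predict([[12.0, 180.0]]))  # predicted browning score at 12 min, 180 degrees C
```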
Improving of local ozone forecasting by integrated models.
Gradišar, Dejan; Grašič, Boštjan; Božnar, Marija Zlata; Mlakar, Primož; Kocijan, Juš
2016-09-01
This paper discusses the problem of forecasting the maximum ozone concentrations in urban microlocations, where reliable alerting of the local population when thresholds have been surpassed is necessary. To improve the forecast, the methodology of integrated models is proposed. The model is based on multilayer perceptron neural networks that use as inputs all available information from the QualeAria air-quality model, the WRF numerical weather prediction model, and onsite measurements of meteorology and air pollution. While air-quality and meteorological models cover a large geographical 3-dimensional space, their local resolution is often not satisfactory. On the other hand, empirical methods have the advantage of good local forecasts. In this paper, integrated models are used for improved 1-day-ahead forecasting of the maximum hourly value of ozone within each day for representative locations in Slovenia. The WRF meteorological model is used for forecasting meteorological variables and the QualeAria air-quality model for gas concentrations. Their predictions, together with measurements from ground stations, are used as inputs to a neural network. The model validation results show that integrated models noticeably improve ozone forecasts and provide better alert systems.
Lin, Jie; Dai, Yi; Guo, Ya-nan; Xu, Hai-rong; Wang, Xiao-chang
2012-01-01
This study aimed to analyze the volatile chemical profile of Longjing tea, and further develop a prediction model for aroma quality of Longjing tea based on potent odorants. A total of 21 Longjing samples were analyzed by headspace solid phase microextraction (HS-SPME) coupled with gas chromatography-mass spectrometry (GC-MS). Pearson’s linear correlation analysis and partial least square (PLS) regression were applied to investigate the relationship between sensory aroma scores and the volatile compounds. Results showed that 60 volatile compounds could be commonly detected in this famous green tea. Terpenes and esters were two major groups characterized, representing 33.89% and 15.53% of the total peak area respectively. Ten compounds were determined to contribute significantly to the perceived aroma quality of Longjing tea, especially linalool (0.701), nonanal (0.738), (Z)-3-hexenyl hexanoate (−0.785), and β-ionone (−0.763). On the basis of these 10 compounds, a model (correlation coefficient of 89.4% and cross-validated correlation coefficient of 80.4%) was constructed to predict the aroma quality of Longjing tea. Summarily, this study has provided a novel option for quality prediction of green tea based on HS-SPME/GC-MS technique. PMID:23225852
Liu, Bing-Chun; Binaykia, Arihant; Chang, Pei-Chann; Tiwari, Manoj Kumar; Tsao, Cheng-Chin
2017-01-01
Today, China is facing a very serious air pollution issue because of its harmful impact on human health as well as the environment. The urban cities in China are the most affected due to their rapid industrial and economic growth. Therefore, it is of extreme importance to develop new, better and more reliable forecasting models to accurately predict air quality. This paper selected Beijing, Tianjin and Shijiazhuang, three cities from the Jingjinji Region, for a study of a new collaborative forecasting model using Support Vector Regression (SVR) for Urban Air Quality Index (AQI) prediction in China. The present study aims to improve forecasting results by minimizing the prediction error of existing machine learning algorithms, taking multi-city, multi-dimensional air quality information and weather conditions as input. The results show a decrease in MAPE for multi-city, multi-dimensional regression when there is a strong interaction and correlation of the air quality characteristic attributes with AQI. Also, geographical location is found to play a significant role in Beijing, Tianjin and Shijiazhuang AQI prediction. PMID:28708836
Predictability Analysis of PM10 Concentrations in Budapest
NASA Astrophysics Data System (ADS)
Ferenczi, Zita
2013-04-01
Climate, weather and air quality may have harmful effects on human health and the environment. Over the past few hundred years we have had to face changes in climate in parallel with changes in air quality. These observed changes in climate, weather and air quality continuously interact with each other: pollutants are changing the climate, thus changing the weather, but climate also has impacts on air quality. The increasing number of extreme weather situations may be a result of climate change, which could create favourable conditions for rising pollutant concentrations. Air quality in Budapest is determined by domestic and traffic emissions combined with the meteorological conditions. In some cases, the effect of long-range transport could also be essential. While the time variability of the industrial and traffic emissions is not significant, the domestic emissions increase in the winter season. In recent years, PM10 episodes have caused the most critical air quality problems in Budapest, especially in winter. In Budapest, an air quality network of 11 stations detects the concentration values of different pollutants hourly. The Hungarian Meteorological Service has developed an air quality prediction model system for the area of Budapest. The system forecasts the concentrations of air pollutants (PM10, NO2, SO2 and O3) two days in advance. In this work we used meteorological parameters and PM10 data detected by the stations of the air quality network, as well as the forecasted PM10 values of the air quality prediction model system. We present an evaluation of PM10 predictions over the last two years and of the most important meteorological parameters affecting PM10 concentrations. The results of this analysis determine the effect of the meteorological parameters and the emission of aerosol particles on the PM10 concentration values, as well as the limits of this prediction system.
Extending the cost-benefit model of thermoregulation: high-temperature environments.
Vickers, Mathew; Manicom, Carryn; Schwarzkopf, Lin
2011-04-01
The classic cost-benefit model of ectothermic thermoregulation compares energetic costs and benefits, providing a critical framework for understanding this process (Huey and Slatkin 1976 ). It considers the case where environmental temperature (T(e)) is less than the selected temperature of the organism (T(sel)), and it predicts that, to minimize increasing energetic costs of thermoregulation as habitat thermal quality declines, thermoregulatory effort should decrease until the lizard thermoconforms. We extended this model to include the case where T(e) exceeds T(sel), and we redefine costs and benefits in terms of fitness to include effects of body temperature (T(b)) on performance and survival. Our extended model predicts that lizards will increase thermoregulatory effort as habitat thermal quality declines, gaining the fitness benefits of optimal T(b) and maximizing the net benefit of activity. Further, to offset the disproportionately high fitness costs of high T(e) compared with low T(e), we predicted that lizards would thermoregulate more effectively at high values of T(e) than at low ones. We tested our predictions on three sympatric skink species (Carlia rostralis, Carlia rubrigularis, and Carlia storri) in hot savanna woodlands and found that thermoregulatory effort increased as thermal quality declined and that lizards thermoregulated most effectively at high values of T(e).
Pragmatic estimation of a spatio-temporal air quality model with irregular monitoring data
NASA Astrophysics Data System (ADS)
Sampson, Paul D.; Szpiro, Adam A.; Sheppard, Lianne; Lindström, Johan; Kaufman, Joel D.
2011-11-01
Statistical analyses of health effects of air pollution have increasingly used GIS-based covariates for prediction of ambient air quality in "land use" regression models. More recently these spatial regression models have accounted for spatial correlation structure in combining monitoring data with land use covariates. We present a flexible spatio-temporal modeling framework and pragmatic, multi-step estimation procedure that accommodates essentially arbitrary patterns of missing data with respect to an ideally complete space by time matrix of observations on a network of monitoring sites. The methodology incorporates a model for smooth temporal trends with coefficients varying in space according to Partial Least Squares regressions on a large set of geographic covariates and nonstationary modeling of spatio-temporal residuals from these regressions. This work was developed to provide spatial point predictions of PM 2.5 concentrations for the Multi-Ethnic Study of Atherosclerosis and Air Pollution (MESA Air) using irregular monitoring data derived from the AQS regulatory monitoring network and supplemental short-time scale monitoring campaigns conducted to better predict intra-urban variation in air quality. We demonstrate the interpretation and accuracy of this methodology in modeling data from 2000 through 2006 in six U.S. metropolitan areas and establish a basis for likelihood-based estimation.
This poster compares air quality modeling simulations under current climate and a future (approximately 2050) climate scenario. Differences in predicted ozone episodes and daily average PM2.5 concentrations are presented, along with vertical ozone profiles. Modeling ...
APPLICATION OF A WATER QUALITY ASSESSMENT MODELING SYSTEM AT A SUPERFUND SITE
Water quality modeling and related exposure assessments at a Superfund site, Silver Bow Creek-Clark Fork River in Montana, demonstrate the capability to predict the fate of mining waste pollutants in the environment. A linked assessment system--consisting of hydrology and erosion, r...
Åkerstedt, Torbjörn; Orsini, Nicola; Petersen, Helena; Axelsson, John; Lekander, Mats; Kecklund, Göran
2012-06-01
The connection between stress and sleep is well established in cross-sectional questionnaire studies and in a few prospective studies. Here, the intention was to study the link between stress and sleep on a day-to-day basis across 42 days. Fifty participants kept a sleep/wake diary across 42 days and responded to daily questions on sleep and stress. The results were analyzed with a mixed model approach using stress during the prior day to predict morning ratings of sleep quality. The results showed that bedtime stress and worries were the main predictors of sleep quality, and that late awakening, short prior sleep, high quality of prior sleep, and good health the prior day also predicted higher sleep quality. Stress during the day predicts subsequent sleep quality on a day-to-day basis across 42 days. The observed range of variation in stress/worries was modest, suggesting that the present data underestimate the impact of stress on subsequent sleep quality. Copyright © 2012 Elsevier B.V. All rights reserved.
A simplified model of all-sky artificial sky glow derived from VIIRS Day/Night band data
NASA Astrophysics Data System (ADS)
Duriscoe, Dan M.; Anderson, Sharolyn J.; Luginbuhl, Christian B.; Baugh, Kimberly E.
2018-07-01
We present a simplified method using geographic analysis tools to predict the average artificial luminance over the hemisphere of the night sky, expressed as a ratio to the natural condition. The VIIRS Day/Night Band upward radiance data from the Suomi NPP orbiting satellite was used for input to the model. The method is based upon a relation between sky glow brightness and the distance from the observer to the source of upward radiance. This relationship was developed using a Garstang radiative transfer model with Day/Night Band data as input, then refined and calibrated with ground-based all-sky V-band photometric data taken under cloudless and low atmospheric aerosol conditions. An excellent correlation was found between observed sky quality and the predicted values from the remotely sensed data. Thematic maps of large regions of the earth showing predicted artificial V-band sky brightness may be quickly generated with modest computing resources. We have found a fast and accurate method based on previous work to model all-sky quality. We provide limitations to this method. The proposed model meets requirements needed by decision makers and land managers of an easy to interpret and understand metric of sky quality.
Gong, Yin-Xi; He, Cheng; Yan, Fei; Feng, Zhong-Ke; Cao, Meng-Lei; Gao, Yuan; Miao, Jie; Zhao, Jin-Long
2013-10-01
Multispectral remote sensing data contain rich site information that is not fully used by the classic site quality evaluation system, which relies solely on manual ground survey data. To establish a more effective site quality evaluation system, a neural network model relating remote sensing spectral factors and site factors to the site index was developed and applied to sublot site quality evaluation in the Wangyedian Forest Farm, Chifeng City, Inner Mongolia. Based on an improved back-propagation artificial neural network (BPANN), the model combined multispectral remote sensing data with sublot survey data, taking larch as the example species. Through sensitivity analysis of the training data set, weak or irrelevant factors were excluded, the size of the neural network was reduced, and the efficiency of network training was improved. The resulting site index prediction model achieved an accuracy of up to 95.36%, which was 9.83% higher than that of a neural network model based only on the classic sublot survey data, showing that combining multispectral remote sensing with sublot survey data yields the most accurate larch site index predictions. The results indicate the effectiveness and superiority of this method.
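As a hedged illustration of this type of workflow (not the authors' implementation), the sketch below fits a small feed-forward network to hypothetical spectral and site factors, screens out weak inputs with permutation importance, and refits the reduced model; all variable names and data are invented.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)
# Hypothetical predictors: four spectral bands plus elevation and slope
X = rng.normal(size=(200, 6))
# Hypothetical site index depending on only two of the inputs
y = 12 + 2.0 * X[:, 0] - 1.5 * X[:, 4] + rng.normal(scale=0.5, size=200)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0))
model.fit(X, y)

# Screen out weak or irrelevant inputs via permutation importance
imp = permutation_importance(model, X, y, n_repeats=20, random_state=0)
keep = imp.importances_mean > 0.01
print("retained inputs:", np.where(keep)[0])

reduced = make_pipeline(StandardScaler(),
                        MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0))
reduced.fit(X[:, keep], y)
print("R^2 on training data:", reduced.score(X[:, keep], y))
```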
Study on the Influence of Building Materials on Indoor Pollutants and Pollution Sources
NASA Astrophysics Data System (ADS)
Wang, Yao
2018-01-01
The paper summarizes the achievements and open problems of indoor air quality research at home and abroad. Indoor pollutants and pollution sources are analyzed systematically, and the types of building materials and the pollutants they emit are discussed. The physical and chemical properties and health effects of the main pollutants are described. Based on the principle of mass balance, a basic mathematical model of indoor air quality is established; by considering pollutant release rates and indoor ventilation, a model for predicting the concentration of indoor air pollutants is derived. The model can be used to describe how pollutant concentrations in indoor air vary over time and to calculate the concentration of pollutants at a given time. The results show that the mathematical model established in this study can be used to analyze and predict the variation of pollutant concentrations in indoor air, and the accompanying evaluation model can be used to assess both the impact on indoor air quality and the current indoor air quality status. In particular, during building construction and interior decoration, pre-evaluation with the model can provide reliable design parameters for selecting building materials and determining ventilation rates.
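A common single-zone, well-mixed mass-balance formulation consistent with this description is sketched below; the symbols and the first-order sink term are assumptions, since the abstract does not give the equations.

```latex
% Illustrative single-zone indoor mass balance (symbols assumed):
% V room volume, C(t) indoor concentration, C_o outdoor concentration,
% Q ventilation flow rate, S indoor source emission rate, k first-order sink rate.
\begin{align}
V \frac{dC}{dt} &= S + Q\,C_{o} - Q\,C - k V C, \\
C(t) &= C_{ss} + \bigl(C(0) - C_{ss}\bigr)\, e^{-\left(\frac{Q}{V} + k\right) t},
\qquad
C_{ss} = \frac{S + Q\,C_{o}}{Q + k V}.
\end{align}
```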
Predicting soil quality indices with near infrared analysis in a wildfire chronosequence.
Cécillon, Lauric; Cassagne, Nathalie; Czarnes, Sonia; Gros, Raphaël; Vennetier, Michel; Brun, Jean-Jacques
2009-01-15
We investigated the power of near infrared (NIR) analysis for the quantitative assessment of soil quality in a wildfire chronosequence. The effect of wildfire disturbance and soil engineering activity of earthworms on soil organic matter quality was first assessed with principal component analysis of NIR spectra. Three soil quality indices were further calculated using an adaptation of the method proposed by Velasquez et al. [Velasquez, E., Lavelle, P., Andrade, M. GISQ, a multifunctional indicator of soil quality. Soil Biol Biochem 2007; 39: 3066-3080.], each one addressing an ecosystem service provided by soils: organic matter storage, nutrient supply and biological activity. Partial least squares regression models were developed to test the predicting ability of NIR analysis for these soil quality indices. All models reached coefficients of determination above 0.90 and ratios of performance to deviation above 2.8. This finding provides new opportunities for the monitoring of soil quality, using NIR scanning of soil samples.
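A minimal sketch of this kind of calibration (hypothetical data, not the study's): fit a partial least squares regression to spectra, then report R2 and the ratio of performance to deviation (RPD = standard deviation of observed values / RMSE) on a held-out set.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
# Hypothetical NIR spectra (absorbance at 200 wavelengths) and a soil quality index
spectra = rng.normal(size=(120, 200))
index = 0.8 * spectra[:, 40] + 0.5 * spectra[:, 150] + rng.normal(scale=0.2, size=120)

X_cal, X_val, y_cal, y_val = train_test_split(spectra, index, test_size=0.3, random_state=1)

pls = PLSRegression(n_components=8)
pls.fit(X_cal, y_cal)
pred = pls.predict(X_val).ravel()

rmse = np.sqrt(np.mean((pred - y_val) ** 2))
r2 = 1 - np.sum((y_val - pred) ** 2) / np.sum((y_val - y_val.mean()) ** 2)
rpd = np.std(y_val, ddof=1) / rmse  # ratio of performance to deviation
print(f"R2 = {r2:.2f}, RPD = {rpd:.2f}")
```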
Wen, Dongqi; Zhai, Wenjuan; Xiang, Sheng; Hu, Zhice; Wei, Tongchuan; Noll, Kenneth E
2017-11-01
Determination of the effect of vehicle emissions on air quality near roadways is important because vehicles are a major source of air pollution. A near-roadway monitoring program was undertaken in Chicago between August 4 and October 30, 2014, to measure ultrafine particles, carbon dioxide, carbon monoxide, traffic volume and speed, and wind direction and speed. The objective of this study was to develop a method to relate short-term changes in traffic mode of operation to air quality near roadways using data averaged over 5-min intervals to provide a better understanding of the processes controlling air pollution concentrations near roadways. Three different types of data analysis are provided to demonstrate the type of results that can be obtained from a near-roadway sampling program based on 5-min measurements: (1) development of vehicle emission factors (EFs) for ultrafine particles as a function of vehicle mode of operation, (2) comparison of measured and modeled CO 2 concentrations, and (3) application of dispersion models to determine concentrations near roadways. EFs for ultrafine particles are developed that are a function of traffic volume and mode of operation (free flow and congestion) for light-duty vehicles (LDVs) under real-world conditions. Two air quality models-CALINE4 (California Line Source Dispersion Model, version 4) and AERMOD (American Meteorological Society/U.S. Environmental Protection Agency Regulatory Model)-are used to predict the ultrafine particulate concentrations near roadways for comparison with measured concentrations. When using CALINE4 to predict air quality levels in the mixing cell, changes in surface roughness and stability class have no effect on the predicted concentrations. However, when using AERMOD to predict air quality in the mixing cell, changes in surface roughness have a significant impact on the predicted concentrations. The paper provides emission factors (EFs) that are a function of traffic volume and mode of operation (free flow and congestion) for LDVs under real-world conditions. The good agreement between monitoring and modeling results indicates that high-resolution, simultaneous measurements of air quality and meteorological and traffic conditions can be used to determine real-world, fleet-wide vehicle EFs as a function of vehicle mode of operation under actual driving conditions.
We incorporate the Regional Atmospheric Chemistry Mechanism (RACM2) into the Community Multiscale Air Quality (CMAQ) hemispheric model and compare model predictions to those obtained using the existing Carbon Bond chemical mechanism with the updated toluene chemistry (CB05TU). Th...
COMPARISON OF DATA FROM AN IAQ TEST HOUSE WITH PREDICTIONS OF AN IAQ COMPUTER MODEL
The paper describes several experiments to evaluate the impact of indoor air pollutant sources on indoor air quality (IAQ). Measured pollutant concentrations are compared with concentrations predicted by an IAQ model. The measured concentrations are in excellent agreement with th...
Latino Adolescents' Academic Motivation: The Role of Siblings
ERIC Educational Resources Information Center
Alfaro, Edna C.; Umana-Taylor, Adriana J.
2010-01-01
Guided by an ecological perspective, two competing models were tested to examine how sibling relationship quality directly predicted or interacted with academic support from siblings to predict Latino adolescents' academic motivation (N = 258). Gender differences were examined utilizing multiple group analysis in structural equation modeling.…
Isoprene significantly contributes to organic aerosol in the southeastern United States where biogenic hydrocarbons mix with anthropogenic emissions. In this work, the Community Multiscale Air Quality model is updated to predict isoprene aerosol from epoxides produced under both ...
This paper examines the operational performance of the Community Multiscale Air Quality (CMAQ) model simulations for 2002 - 2006 using both 36-km and 12-km horizontal grid spacing, with a primary focus on the performance of the CMAQ model in predicting wet deposition of sulfate (...
We describe a seagrass growth (SGG) model that is coupled to a water quality (WQ) model that includes the effects of phytoplankton (chlorophyll), colored dissolved organic matter (CDOM) and suspended solids (TSS) on water clarity. Phytoplankton growth was adjusted daily for PAR (...
This presentation will examine the impact of data quality on the construction of QSAR models being developed within the EPA‘s National Center for Computational Toxicology. We have developed a public-facing platform to provide access to predictive models. As part of the work we ha...
NASA Technical Reports Server (NTRS)
1971-01-01
A study of techniques for the prediction of crime in the City of Los Angeles was conducted. Alternative approaches to crime prediction (causal, quasicausal, associative, extrapolative, and pattern-recognition models) are discussed, as is the environment within which predictions were desired for the immediate application. The decision was made to use time series (extrapolative) models to produce the desired predictions. The characteristics of the data and the procedure used to choose equations for the extrapolations are discussed. The usefulness of different functional forms (constant, quadratic, and exponential forms) and of different parameter estimation techniques (multiple regression and multiple exponential smoothing) is compared, and the quality of the resultant predictions is assessed.
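As an illustration of the extrapolative (constant-form) approach described, here is a minimal single exponential smoothing forecaster; the smoothing constant and the monthly counts are invented.

```python
def exponential_smoothing_forecast(series, alpha=0.3, horizon=6):
    """Simple (single) exponential smoothing with a constant-form extrapolation.

    series  : list of monthly counts (e.g. reported offenses)
    alpha   : smoothing constant in (0, 1]
    horizon : number of future periods to forecast
    """
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level  # update smoothed level
    return [level] * horizon                      # flat extrapolation of the level

# Illustrative monthly counts (made up)
history = [120, 132, 128, 141, 150, 149, 160, 158]
print(exponential_smoothing_forecast(history, alpha=0.4, horizon=3))
```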
Development of an analytical-numerical model to predict radiant emission or absorption
NASA Technical Reports Server (NTRS)
Wallace, Tim L.
1994-01-01
The development of an analytical-numerical model to predict radiant emission or absorption is discussed. A Voigt profile is assumed to predict the spectral qualities of a singlet atomic transition line for atomic species of interest to the OPAD program. The present state of this model is described in each progress report required under contract. Model and code development is guided by experimental data where available. When completed, the model will be used to provide estimates of species erosion rates from spectral data collected from rocket exhaust plumes or other sources.
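A small sketch of evaluating a Voigt line shape numerically, assuming SciPy's voigt_profile (the convolution of a Gaussian of standard deviation sigma with a Lorentzian of half-width gamma); the line centre, widths and amplitude are placeholder values, not OPAD parameters.

```python
import numpy as np
from scipy.special import voigt_profile  # available in SciPy >= 1.4

# Illustrative singlet line centred at wavelength lambda0 (all values assumed)
lambda0 = 500.0   # nm, line centre
sigma = 0.02      # nm, Gaussian (Doppler) standard deviation
gamma = 0.01      # nm, Lorentzian (pressure) half-width at half maximum
wavelengths = np.linspace(499.8, 500.2, 801)

profile = voigt_profile(wavelengths - lambda0, sigma, gamma)  # normalized to unit area
peak_intensity = 1.0e3                                        # arbitrary emission amplitude
spectrum = peak_intensity * profile

print("peak value:", spectrum.max(), "at", wavelengths[spectrum.argmax()], "nm")
```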
Quality assessment of protein model-structures based on structural and functional similarities.
Konopka, Bogumil M; Nebel, Jean-Christophe; Kotulska, Malgorzata
2012-09-21
Experimental determination of protein 3D structures is expensive, time consuming and sometimes impossible. A gap between number of protein structures deposited in the World Wide Protein Data Bank and the number of sequenced proteins constantly broadens. Computational modeling is deemed to be one of the ways to deal with the problem. Although protein 3D structure prediction is a difficult task, many tools are available. These tools can model it from a sequence or partial structural information, e.g. contact maps. Consequently, biologists have the ability to generate automatically a putative 3D structure model of any protein. However, the main issue becomes evaluation of the model quality, which is one of the most important challenges of structural biology. GOBA--Gene Ontology-Based Assessment is a novel Protein Model Quality Assessment Program. It estimates the compatibility between a model-structure and its expected function. GOBA is based on the assumption that a high quality model is expected to be structurally similar to proteins functionally similar to the prediction target. Whereas DALI is used to measure structure similarity, protein functional similarity is quantified using standardized and hierarchical description of proteins provided by Gene Ontology combined with Wang's algorithm for calculating semantic similarity. Two approaches are proposed to express the quality of protein model-structures. One is a single model quality assessment method, the other is its modification, which provides a relative measure of model quality. Exhaustive evaluation is performed on data sets of model-structures submitted to the CASP8 and CASP9 contests. The validation shows that the method is able to discriminate between good and bad model-structures. The best of tested GOBA scores achieved 0.74 and 0.8 as a mean Pearson correlation to the observed quality of models in our CASP8 and CASP9-based validation sets. GOBA also obtained the best result for two targets of CASP8, and one of CASP9, compared to the contest participants. Consequently, GOBA offers a novel single model quality assessment program that addresses the practical needs of biologists. In conjunction with other Model Quality Assessment Programs (MQAPs), it would prove useful for the evaluation of single protein models.
CHENG, JIANLIN; EICKHOLT, JESSE; WANG, ZHENG; DENG, XIN
2013-01-01
After decades of research, protein structure prediction remains a very challenging problem. In order to address the different levels of complexity of structural modeling, two types of modeling techniques, template-based modeling and template-free modeling, have been developed. Template-based modeling can often generate a moderate- to high-resolution model when a similar, homologous template structure is found for a query protein but fails if no template or only incorrect templates are found. Template-free modeling, such as fragment-based assembly, may generate models of moderate resolution for small proteins of low topological complexity. Seldom have the two techniques been integrated to improve protein modeling. Here we develop a recursive protein modeling approach to selectively and collaboratively apply template-based and template-free modeling methods to model template-covered (i.e. certain) and template-free (i.e. uncertain) regions of a protein. A preliminary implementation of the approach was tested on a number of hard modeling cases during the 9th Critical Assessment of Techniques for Protein Structure Prediction (CASP9) and successfully improved the quality of modeling in most of these cases. Recursive modeling can significantly reduce the complexity of protein structure modeling and integrate template-based and template-free modeling to improve the quality and efficiency of protein structure prediction. PMID:22809379
Brady, Amie M.G.; Plona, Meg B.
2009-01-01
During the recreational season of 2008 (May through August), a regression model relating turbidity to concentrations of Escherichia coli (E. coli) was used to predict recreational water quality in the Cuyahoga River at the historical community of Jaite, within the present city of Brecksville, Ohio, a site centrally located within Cuyahoga Valley National Park. Samples were collected three days per week at Jaite and at three other sites on the river. Concentrations of E. coli were determined and compared to environmental and water-quality measures and to concentrations predicted with a regression model. Linear relations between E. coli concentrations and turbidity, gage height, and rainfall were statistically significant for Jaite. Relations between E. coli concentrations and turbidity were statistically significant for the three additional sites, and relations between E. coli concentrations and gage height were significant at the two sites where gage-height data were available. The turbidity model correctly predicted concentrations of E. coli above or below Ohio's single-sample standard for primary-contact recreation for 77 percent of samples collected at Jaite.
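A toy version of such a turbidity-based model (invented data, not the study's): regress log10 E. coli on log10 turbidity and flag predicted exceedances of a single-sample standard; the 235 CFU/100 mL threshold used here is an assumption.

```python
import numpy as np

# Made-up paired observations: turbidity (NTU) and E. coli (CFU/100 mL)
turbidity = np.array([5, 12, 30, 55, 80, 150, 220])
ecoli = np.array([40, 90, 180, 400, 700, 1500, 2600])

# Fit log10(E. coli) = b0 + b1 * log10(turbidity)
b1, b0 = np.polyfit(np.log10(turbidity), np.log10(ecoli), deg=1)

def predict_ecoli(turb_ntu):
    """Predicted E. coli concentration (CFU/100 mL) from turbidity."""
    return 10 ** (b0 + b1 * np.log10(turb_ntu))

standard = 235  # single-sample standard (CFU/100 mL) assumed for illustration
new_turbidity = 65.0
estimate = predict_ecoli(new_turbidity)
print(f"predicted E. coli: {estimate:.0f} CFU/100 mL ->",
      "advisory" if estimate > standard else "no advisory")
```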
van der Linden, Bernadette W.A.; Winkels, Renate M.; van Duijnhoven, Fränzel J.; Mols, Floortje; van Roekel, Eline H.; Kampman, Ellen; Beijer, Sandra; Weijenberg, Matty P.
2016-01-01
The population of colorectal cancer (CRC) survivors is growing and many survivors experience deteriorated health-related quality of life (HRQoL) in both early and late post-treatment phases. Identification of CRC survivors at risk for HRQoL deterioration can be improved by using prediction models. However, such models are currently not available for oncology practice. As a starting point for developing prediction models of HRQoL for CRC survivors, a comprehensive overview of potential candidate HRQoL predictors is necessary. Therefore, a systematic literature review was conducted to identify candidate predictors of HRQoL of CRC survivors. Original research articles on associations of biopsychosocial factors with HRQoL of CRC survivors were searched in PubMed, Embase, and Google Scholar. Two independent reviewers assessed eligibility and selected articles for inclusion (N = 53). Strength of evidence for candidate HRQoL predictors was graded according to predefined methodological criteria. The World Health Organization’s International Classification of Functioning, Disability and Health (ICF) was used to develop a biopsychosocial framework in which identified candidate HRQoL predictors were mapped across the main domains of the ICF: health condition, body structures and functions, activities, participation, and personal and environmental factors. The developed biopsychosocial ICF framework serves as a basis for selecting candidate HRQoL predictors, thereby providing conceptual guidance for developing comprehensive, evidence-based prediction models of HRQoL for CRC survivors. Such models are useful in clinical oncology practice to aid in identifying individual CRC survivors at risk for HRQoL deterioration and could also provide potential targets for a biopsychosocial intervention aimed at safeguarding the HRQoL of at-risk individuals. Implications for Practice: More and more people now survive a diagnosis of colorectal cancer. The quality of life of these cancer survivors is threatened by health problems persisting for years after diagnosis and treatment. Early identification of survivors at risk of experiencing low quality of life in the future is thus important for taking preventive measures. Clinical prediction models are tools that can help oncologists identify at-risk individuals. However, such models are currently not available for clinical oncology practice. This systematic review outlines candidate predictors of low quality of life of colorectal cancer survivors, providing a firm conceptual basis for developing prediction models. PMID:26911406
Modeling of video compression effects on target acquisition performance
NASA Astrophysics Data System (ADS)
Cha, Jae H.; Preece, Bradley; Espinola, Richard L.
2009-05-01
The effect of video compression on image quality was investigated from the perspective of target acquisition performance modeling. Human perception tests were conducted recently at the U.S. Army RDECOM CERDEC NVESD, measuring identification (ID) performance on simulated military vehicle targets at various ranges. These videos were compressed with different quality and/or quantization levels utilizing motion JPEG, motion JPEG2000, and MPEG-4 encoding. To model the degradation on task performance, the loss in image quality is fit to an equivalent Gaussian MTF scaled by the Structural Similarity Image Metric (SSIM). Residual compression artifacts are treated as 3-D spatio-temporal noise. This 3-D noise is found by taking the difference of the uncompressed frame, with the estimated equivalent blur applied, and the corresponding compressed frame. Results show good agreement between the experimental data and the model prediction. This method has led to a predictive performance model for video compression by correlating various compression levels to particular blur and noise input parameters for NVESD target acquisition performance model suite.
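The sketch below illustrates, on synthetic frames, the general idea of summarizing compression loss with SSIM, estimating an equivalent Gaussian blur by grid search, and treating the remainder as residual noise; it is a simplified stand-in for the modeling described, not the NVESD implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage.metrics import structural_similarity

rng = np.random.default_rng(2)
reference = rng.random((128, 128))  # stand-in for an uncompressed frame
compressed = gaussian_filter(reference, sigma=1.2) + rng.normal(scale=0.02, size=(128, 128))

# Image-quality loss summarized by SSIM between reference and compressed frames
ssim = structural_similarity(reference, compressed, data_range=1.0)

# Estimate an "equivalent blur": the Gaussian sigma whose blurred reference
# best matches the compressed frame (coarse grid search for illustration)
sigmas = np.linspace(0.1, 3.0, 30)
errors = [np.mean((gaussian_filter(reference, s) - compressed) ** 2) for s in sigmas]
best_sigma = sigmas[int(np.argmin(errors))]

# Residual compression artifacts treated as additive noise
residual = compressed - gaussian_filter(reference, best_sigma)
print(f"SSIM = {ssim:.3f}, equivalent blur sigma = {best_sigma:.2f}, "
      f"residual noise std = {residual.std():.4f}")
```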
Chapman, Robert W; Reading, Benjamin J; Sullivan, Craig V
2014-01-01
Inherited gene transcripts deposited in oocytes direct early embryonic development in all vertebrates, but transcript profiles indicative of embryo developmental competence have not previously been identified. We employed artificial intelligence to model profiles of maternal ovary gene expression and their relationship to egg quality, evaluated as production of viable mid-blastula stage embryos, in the striped bass (Morone saxatilis), a farmed species with serious egg quality problems. In models developed using artificial neural networks (ANNs) and supervised machine learning, collective changes in the expression of a limited suite of genes (233) representing <2% of the queried ovary transcriptome explained >90% of the eventual variance in embryo survival. Egg quality related to minor changes in gene expression (<0.2-fold), with most individual transcripts making a small contribution (<1%) to the overall prediction of egg quality. These findings indicate that the predictive power of the transcriptome as regards egg quality resides not in levels of individual genes, but rather in the collective, coordinated expression of a suite of transcripts constituting a transcriptomic "fingerprint". Correlation analyses of the corresponding candidate genes indicated that dysfunction of the ubiquitin-26S proteasome, COP9 signalosome, and subsequent control of the cell cycle engenders embryonic developmental incompetence. The affected gene networks are centrally involved in regulation of early development in all vertebrates, including humans. By assessing collective levels of the relevant ovarian transcripts via ANNs we were able, for the first time in any vertebrate, to accurately predict the subsequent embryo developmental potential of eggs from individual females. Our results show that the transcriptomic fingerprint evidencing developmental dysfunction is highly predictive of, and therefore likely to regulate, egg quality, a biologically complex trait crucial to reproductive fitness.
Testa, Alison C; Hane, James K; Ellwood, Simon R; Oliver, Richard P
2015-03-11
The impact of gene annotation quality on functional and comparative genomics makes gene prediction an important process, particularly in non-model species, including many fungi. Sets of homologous protein sequences are rarely complete with respect to the fungal species of interest and are often small or unreliable, especially when closely related species have not been sequenced or annotated in detail. In these cases, protein homology-based evidence fails to correctly annotate many genes, or significantly improve ab initio predictions. Generalised hidden Markov models (GHMM) have proven to be invaluable tools in gene annotation and, recently, RNA-seq has emerged as a cost-effective means to significantly improve the quality of automated gene annotation. As these methods do not require sets of homologous proteins, improving gene prediction from these resources is of benefit to fungal researchers. While many pipelines now incorporate RNA-seq data in training GHMMs, there has been relatively little investigation into additionally combining RNA-seq data at the point of prediction, and room for improvement in this area motivates this study. CodingQuarry is a highly accurate, self-training GHMM fungal gene predictor designed to work with assembled, aligned RNA-seq transcripts. RNA-seq data informs annotations both during gene-model training and in prediction. Our approach capitalises on the high quality of fungal transcript assemblies by incorporating predictions made directly from transcript sequences. Correct predictions are made despite transcript assembly problems, including those caused by overlap between the transcripts of adjacent gene loci. Stringent benchmarking against high-confidence annotation subsets showed CodingQuarry predicted 91.3% of Schizosaccharomyces pombe genes and 90.4% of Saccharomyces cerevisiae genes perfectly. These results are 4-5% better than those of AUGUSTUS, the next best performing RNA-seq driven gene predictor tested. Comparisons against whole genome Sc. pombe and S. cerevisiae annotations further substantiate a 4-5% improvement in the number of correctly predicted genes. We demonstrate the success of a novel method of incorporating RNA-seq data into GHMM fungal gene prediction. This shows that a high quality annotation can be achieved without relying on protein homology or a training set of genes. CodingQuarry is freely available ( https://sourceforge.net/projects/codingquarry/ ), and suitable for incorporation into genome annotation pipelines.
Urban development results in changes to land use and land cover and, consequently, to biogenic and anthropogenic emissions, meteorological processes, and processes such as dry deposition that influence future predictions of air quality. This study examines the impacts of alter...
Cutrona, Carolyn E.; Russell, Daniel W.; Abraham, W. Todd; Gardner, Kelli A.; Melby, Janet N.; Bryant, Chalandra; Conger, Rand D.
2007-01-01
Demographic characteristics, family financial strain, neighborhood-level economic disadvantage, and state of residence were tested as predictors of observed warmth, hostility, and self-reported marital quality. Participants were 202 married African American couples who resided in a range of neighborhood contexts. Neighborhood-level economic disadvantage predicted lower warmth during marital interactions, as did residence in the rural south. Consistent with the family stress model (e.g., Conger & Elder, 1994), family financial strain predicted lower perceived marital quality. Unexpectedly, neighborhood-level economic disadvantage predicted higher marital quality. Social comparison processes and degree of exposure to racially based discrimination are considered as explanations for this unexpected result. The importance of context in relationship outcomes is highlighted. PMID:17955056
Table Rock Lake Water-Clarity Assessment Using Landsat Thematic Mapper Satellite Data
Krizanich, Gary; Finn, Michael P.
2009-01-01
Water quality of Table Rock Lake in southwestern Missouri is assessed using Landsat Thematic Mapper satellite data. A pilot study uses multidate satellite image scenes in conjunction with physical measurements of Secchi disk transparency collected by the Lakes of Missouri Volunteer Program to construct a regression model used to estimate water clarity. The natural log of Secchi disk transparency is the dependent variable in the regression and the independent variables are Thematic Mapper band 1 (blue) reflectance and a ratio of the band 1 and band 3 (red) reflectance. The regression model can be used to reliably predict water clarity anywhere within the lake. A pixel-level lake map of predicted water clarity or computed trophic state can be produced from the model output. Information derived from this model can be used by water-resource managers to assess water quality and evaluate effects of changes in the watershed on water quality.
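A minimal sketch of the regression form stated above, ln(SDT) = b0 + b1*B1 + b2*(B1/B3), fitted by ordinary least squares; the reflectance and Secchi-depth values are invented for illustration.

```python
import numpy as np

# Hypothetical training data: Secchi depth (m) with TM band 1 and band 3 reflectance
secchi = np.array([1.2, 2.5, 3.8, 5.0, 6.5, 8.0])
band1 = np.array([0.092, 0.081, 0.074, 0.066, 0.060, 0.055])
band3 = np.array([0.045, 0.036, 0.029, 0.024, 0.020, 0.017])

# ln(SDT) = b0 + b1 * B1 + b2 * (B1 / B3), fitted by ordinary least squares
X = np.column_stack([np.ones_like(band1), band1, band1 / band3])
coeffs, *_ = np.linalg.lstsq(X, np.log(secchi), rcond=None)

def predict_secchi(b1, b3):
    """Predicted water clarity (Secchi depth, m) from band reflectances."""
    return np.exp(coeffs[0] + coeffs[1] * b1 + coeffs[2] * (b1 / b3))

print("predicted clarity (m):", predict_secchi(0.07, 0.026))
```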
Francy, Donna S.; Stelzer, Erin A.; Duris, Joseph W.; Brady, Amie M.G.; Harrison, John H.; Johnson, Heather E.; Ware, Michael W.
2013-01-01
Predictive models, based on environmental and water quality variables, have been used to improve the timeliness and accuracy of recreational water quality assessments, but their effectiveness has not been studied in inland waters. Sampling at eight inland recreational lakes in Ohio was done in order to investigate using predictive models for Escherichia coli and to understand the links between E. coli concentrations, predictive variables, and pathogens. Based upon results from 21 beach sites, models were developed for 13 sites, and the most predictive variables were rainfall, wind direction and speed, turbidity, and water temperature. Models were not developed at sites where the E. coli standard was seldom exceeded. Models were validated at nine sites during an independent year. At three sites, the model resulted in increased correct responses, sensitivities, and specificities compared to use of the previous day's E. coli concentration (the current method). Drought conditions during the validation year precluded being able to adequately assess model performance at most of the other sites. Cryptosporidium, adenovirus, eaeA (E. coli), ipaH (Shigella), and spvC (Salmonella) were found in at least 20% of samples collected for pathogens at five sites. The presence or absence of the three bacterial genes was related to some of the model variables but was not consistently related to E. coli concentrations. Predictive models were not effective at all inland lake sites; however, their use at two lakes with high swimmer densities will provide better estimates of public health risk than current methods and will be a valuable resource for beach managers and the public.
Li, Qiongge; Chan, Maria F
2017-01-01
Over half of cancer patients receive radiotherapy (RT) as partial or full cancer treatment. Daily quality assurance (QA) of RT in cancer treatment closely monitors the performance of the medical linear accelerator (Linac) and is critical for continuous improvement of patient safety and quality of care. Cumulative longitudinal QA measurements are valuable for understanding the behavior of the Linac and allow physicists to identify trends in the output and take preventive actions. In this study, artificial neural networks (ANNs) and autoregressive moving average (ARMA) time-series prediction modeling techniques were both applied to 5-year daily Linac QA data. Verification tests and other evaluations were then performed for all models. Preliminary results showed that ANN time-series predictive modeling has more advantages over ARMA techniques for accurate and effective applicability in the dosimetry and QA field. © 2016 New York Academy of Sciences.
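For the ARMA side of such a comparison, a minimal sketch using statsmodels is shown below; the series is synthetic and the ARMA(2,1) order is an arbitrary choice, not the study's.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(3)
# Stand-in for daily Linac output QA measurements (% deviation from baseline)
qa_series = (0.2 + 0.05 * np.cumsum(rng.normal(scale=0.1, size=300))
             + rng.normal(scale=0.3, size=300))

# ARMA(p, q) is ARIMA with d = 0
model = ARIMA(qa_series, order=(2, 0, 1)).fit()
forecast = model.forecast(steps=7)  # one-week-ahead predictions
print(np.round(forecast, 3))
```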
A user-friendly model for spray drying to aid pharmaceutical product development.
Grasmeijer, Niels; de Waard, Hans; Hinrichs, Wouter L J; Frijlink, Henderik W
2013-01-01
The aim of this study was to develop a user-friendly model for spray drying that can aid in the development of a pharmaceutical product, by shifting from a trial-and-error towards a quality-by-design approach. To achieve this, a spray dryer model was developed in commercial and open source spreadsheet software. The output of the model was first fitted to the experimental output of a Büchi B-290 spray dryer and subsequently validated. The predicted outlet temperatures of the spray dryer model matched the experimental values very well over the entire range of spray dryer settings that were tested. Finally, the model was applied to produce glassy sugars by spray drying, an often used excipient in formulations of biopharmaceuticals. For the production of glassy sugars, the model was extended to predict the relative humidity at the outlet, which is not measured in the spray dryer by default. This extended model was then successfully used to predict whether specific settings were suitable for producing glassy trehalose and inulin by spray drying. In conclusion, a spray dryer model was developed that is able to predict the output parameters of the spray drying process. The model can aid the development of spray dried pharmaceutical products by shifting from a trial-and-error towards a quality-by-design approach.
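A rough, steady-state energy-balance sketch of the kind of outlet-temperature prediction such a spreadsheet model performs; the property values and operating settings are assumptions and heat losses are ignored by default, so this is illustrative only.

```python
def outlet_temperature(t_in, gas_flow, feed_flow, solids_fraction,
                       t_feed=20.0, cp_gas=1.01, cp_water=4.18,
                       dh_vap=2260.0, heat_loss=0.0):
    """Rough steady-state energy balance for a lab-scale spray dryer.

    t_in            drying-air inlet temperature (degC)
    gas_flow        drying-air mass flow (kg/h)
    feed_flow       liquid feed mass flow (kg/h)
    solids_fraction mass fraction of dissolved solids in the feed
    Returns an estimated outlet temperature (degC), assuming all feed water
    evaporates and neglecting the heat capacity of the solids.
    """
    water = feed_flow * (1.0 - solids_fraction)             # kg/h evaporated
    duty = water * (cp_water * (100.0 - t_feed) + dh_vap)   # kJ/h to heat and evaporate
    return t_in - (duty + heat_loss) / (gas_flow * cp_gas)

# Illustrative settings loosely in the range of a benchtop dryer (assumed values)
print(f"predicted outlet T: "
      f"{outlet_temperature(t_in=150, gas_flow=35, feed_flow=0.3, solids_fraction=0.10):.1f} degC")
```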
Population-based human exposure models predict the distribution of personal exposures to pollutants of outdoor origin using a variety of inputs, including air pollution concentrations; human activity patterns, such as the amount of time spent outdoors versus indoors, commuting, w...
An air-quality forecasting (AQF) system based on the National Weather Service (NWS) National Centers for Environmental Prediction's (NCEP's) Eta model and the U.S. EPA's Community Multiscale Air Quality (CMAQ) Modeling System is used to simulate the distributions of tropospheric ...
DOT National Transportation Integrated Search
2008-04-01
The objective of this study was to develop resilient modulus prediction models for possible application in the quality control/quality assurance (QC/QA) procedures during and after the construction of pavement layers. Field and laboratory testing pro...
Bonny, S P F; Hocquette, J-F; Pethick, D W; Farmer, L J; Legrand, I; Wierzbicki, J; Allen, P; Polkinghorne, R J; Gardner, G E
2016-06-01
Delivering beef of consistent quality to the consumer is vital for consumer satisfaction and will help to ensure demand and therefore profitability within the beef industry. In Australia, this is being tackled with Meat Standards Australia (MSA), which uses carcass traits and processing factors to deliver an individual eating quality guarantee to the consumer for 135 different 'cut by cooking methods' from each carcass. The carcass traits used in the MSA model, such as ossification score, carcass weight and marbling, explain the majority of the differences between breeds and sexes. Therefore, it was expected that the model would predict the eating quality of bulls and dairy breeds with good accuracy. In total, 8128 muscle samples from 482 carcasses from France, Poland, Ireland and Northern Ireland were MSA graded at slaughter and then evaluated for tenderness, juiciness, flavour liking and overall liking by untrained consumers, according to MSA protocols. The scores were weighted (0.3, 0.1, 0.3, 0.3) and combined to form a global eating quality (meat quality, MQ4) score. The carcasses were grouped into one of three breed categories: beef breeds, dairy breeds and crosses. The difference between the actual and the MSA-predicted MQ4 scores was analysed using a linear mixed effects model including fixed effects for carcass hang method, cook type, muscle type, sex, country, breed category and postmortem ageing period, and random terms for animal identification, consumer country and kill group. Bulls had lower MQ4 scores than steers and females and were predicted less accurately by the MSA model. Beef breeds had lower eating quality scores than dairy breeds and crosses for five out of the 16 muscles tested. Beef breeds were also over-predicted in comparison with the cross and dairy breeds for six out of the 16 muscles tested. Therefore, even after accounting for differences in carcass traits, bulls still differ in eating quality when compared with females and steers. Breed also influenced eating quality beyond differences in carcass traits, although only for certain muscles. This should be taken into account when estimating the eating quality of meat. In addition, the coefficients used by the Australian MSA model for some muscles, marbling score and ultimate pH do not exactly reflect the influence of these factors on eating quality in this data set, and if this system were to be applied to Europe then the coefficients for these muscles and covariates would need further investigation.
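Using the weights quoted above, the MQ4 combination can be written directly; the example panel scores are invented.

```python
def mq4_score(tenderness, juiciness, flavour, overall):
    """Combine consumer scores into the MQ4 eating-quality score
    using the weights stated above (0.3, 0.1, 0.3, 0.3)."""
    return 0.3 * tenderness + 0.1 * juiciness + 0.3 * flavour + 0.3 * overall

# Example: consumer panel means for one muscle sample (0-100 scales, made up)
print(mq4_score(tenderness=62, juiciness=58, flavour=65, overall=64))
```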
Richard S. Holthausen; Michael J. Wisdom; John Pierce; Daniel K. Edwards; Mary M. Rowland
1994-01-01
We used expert opinion to evaluate the predictive reliability of a habitat effectiveness model for elk in western Oregon and Washington. Twenty-five experts in elk ecology were asked to rate habitat quality for 16 example landscapes. Rankings and ratings of 21 experts were significantly correlated with model output. Expert opinion and model predictions differed for 4...
Chesapeake Bay Forecast System: Oxygen Prediction for the Sustainable Ecosystem Management
NASA Astrophysics Data System (ADS)
Mathukumalli, B.; Long, W.; Zhang, X.; Wood, R.; Murtugudde, R. G.
2010-12-01
The Chesapeake Bay Forecast System (CBFS) is a flexible, end-to-end expert prediction tool for decision makers that will provide customizable, user-specified predictions and projections of the region's climate, air and water quality, local chemistry, and ecosystems on time scales of days to decades. As part of CBFS, long-term water quality data were collected and assembled to develop ecological models for the sustainable management of the Chesapeake Bay. Cultural eutrophication depletes oxygen levels in this ecosystem, particularly in summer, with several negative implications for ecosystem structure and function. To understand the dynamics of, and predict, spatially explicit oxygen levels in the Bay, an empirical, process-based ecological model was developed using long-term control variables (water temperature, salinity, nitrogen and phosphorus). Statistical validation methods were employed to demonstrate the usability of the predictions for management purposes, and the predicted oxygen levels agree closely with observations. The predicted oxygen values, together with other physical outputs from downscaled regional weather and climate predictions or forecasts from hydrodynamic models, can be used to forecast various ecological components. Such forecasts would be useful for both recreational and commercial users of the Bay (for example, bass fishing). Furthermore, this work can also be used to predict the extent of hypoxia/anoxia resulting not only from anthropogenic nutrient pollution but also from global warming. Some hindcasts and forecasts are discussed, along with ongoing efforts toward a mechanistic ecosystem model that provides prognostic oxygen predictions and projections, and toward upper trophic level modeling using an energetics approach.
Gonzalez, Raul; Conn, Kathleen E.; Crosswell, Joey; Noble, Rachel
2012-01-01
Coastal and estuarine waters are the site of intense anthropogenic influence with concomitant use for recreation and seafood harvesting. Therefore, coastal and estuarine water quality has a direct impact on human health. In eastern North Carolina (NC) there are over 240 recreational and 1025 shellfish harvesting water quality monitoring sites that are regularly assessed. Because of the large number of sites, sampling frequency is often only on a weekly basis. This frequency, along with an 18–24 h incubation time for fecal indicator bacteria (FIB) enumeration via culture-based methods, reduces the efficiency of the public notification process. In states like NC where beach monitoring resources are limited but historical data are plentiful, predictive models may offer an improvement for monitoring and notification by providing real-time FIB estimates. In this study, water samples were collected during 12 dry (n = 88) and 13 wet (n = 66) weather events at up to 10 sites. Statistical predictive models for Escherichia coli (EC), enterococci (ENT), and members of the Bacteroidales group were created and subsequently validated. Our results showed that models for EC and ENT (adjusted R2 were 0.61 and 0.64, respectively) incorporated a range of antecedent rainfall, climate, and environmental variables. The most important variables for EC and ENT models were 5-day antecedent rainfall, dissolved oxygen, and salinity. These models successfully predicted FIB levels over a wide range of conditions with a 3% (EC model) and 9% (ENT model) overall error rate for recreational threshold values and a 0% (EC model) overall error rate for shellfish threshold values. Though modeling of members of the Bacteroidales group had less predictive ability (adjusted R2 were 0.56 and 0.53 for fecal Bacteroides spp. and human Bacteroides spp., respectively), the modeling approach and testing provided information on Bacteroidales ecology. This is the first example of a set of successful statistical predictive models appropriate for assessment of both recreational and shellfish harvesting water quality in estuarine waters.
Mathematical model for prediction of efficiency indicators of educational activity in high school
NASA Astrophysics Data System (ADS)
Tikhonova, O. M.; Kushnikov, V. A.; Fominykh, D. S.; Rezchikov, A. F.; Ivashchenko, V. A.; Bogomolov, A. S.; Filimonyuk, L. Yu; Dolinina, O. N.; Kushnikov, O. V.; Shulga, T. E.; Tverdokhlebov, V. A.
2018-05-01
The quality of higher education is a pressing problem worldwide. The paper presents a system for predicting the accreditation indicators of technical universities based on J. Forrester's system dynamics approach. The mathematical model developed for predicting indicators of the efficiency of educational activity is based on systems of nonlinear differential equations.
Using the Gamma-Poisson Model to Predict Library Circulations.
ERIC Educational Resources Information Center
Burrell, Quentin L.
1990-01-01
Argues that the gamma mixture of Poisson processes, for all its perceived defects, can be used to make predictions regarding future library book circulations of a quality adequate for general management requirements. The use of the model is extensively illustrated with data from two academic libraries. (Nine references) (CLB)
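In one common parameterization, if a title's loan rate is Gamma(shape a, rate b) and loans are Poisson, the marginal number of circulations over t years is negative binomial; the sketch below uses that standard result with assumed parameter values.

```python
from scipy.stats import nbinom

def circulation_predictive(a, b, t=1.0):
    """Predictive distribution of circulations over t years when the per-title
    loan rate is Gamma(shape=a, rate=b) and counts given the rate are Poisson.

    The Gamma-Poisson mixture is negative binomial with r = a and p = b / (b + t)."""
    return nbinom(a, b / (b + t))

# Illustrative parameters (assumed, not from the article)
dist = circulation_predictive(a=0.8, b=2.0, t=1.0)
print("P(no circulations next year) =", round(dist.pmf(0), 3))
print("P(at least 3 circulations)  =", round(dist.sf(2), 3))
print("expected circulations       =", round(dist.mean(), 3))
```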
Modeling Benthic Sediment Processes to Predict Water Quality and Ecology in Narragansett Bay
The benthic sediment acts as a huge reservoir of particulate and dissolved material (within interstitial water) which can contribute to loading of contaminants and nutrients to the water column. A benthic sediment model is presented in this report to predict spatial and temporal ...
Effects of urbanization on the water quality of lakes in Eagan, Minnesota
Ayers, M.A.; Payne, G.A.; Have, Mark A.
1980-01-01
Three phosphorus-prediction models developed during the study are applicable to shallow (less than about 12 feet), nonstratifying lakes and ponds. The data base was not sufficient to select an appropriate model to predict the effects of future loading from continuing urbanization on the deeper lakes.
Nonstarch polysaccharides in wheat flour wire-cut cookie making.
Guttieri, Mary J; Souza, Edward J; Sneller, Clay
2008-11-26
Nonstarch polysaccharides in wheat flour have significant capacity to affect the processing quality of wheat flour dough and the finished quality of wheat flour products. Most research has focused on the effects of arabinoxylans (AX) in bread making. This study found that water-extractable AX and arabinogalactan peptides can predict variation in pastry wheat quality as captured by the wire-cut cookie model system. The sum of water-extractable AX plus arabinogalactan was highly predictive of cookie spread factor. The combination of cookie spread factor and the ratio of water-extractable arabinose to xylose predicted peak force of the three-point bend test of cookie texture.
Prediction of Protein Structure by Template-Based Modeling Combined with the UNRES Force Field.
Krupa, Paweł; Mozolewska, Magdalena A; Joo, Keehyoung; Lee, Jooyoung; Czaplewski, Cezary; Liwo, Adam
2015-06-22
A new approach to the prediction of protein structures that uses distance and backbone virtual-bond dihedral angle restraints derived from template-based models and simulations with the united residue (UNRES) force field is proposed. The approach combines the accuracy and reliability of template-based methods for the segments of the target sequence with high similarity to those having known structures with the ability of UNRES to pack the domains correctly. Multiplexed replica-exchange molecular dynamics with restraints derived from template-based models of a given target, in which each restraint is weighted according to the accuracy of the prediction of the corresponding section of the molecule, is used to search the conformational space, and the weighted histogram analysis method and cluster analysis are applied to determine the families of the most probable conformations, from which candidate predictions are selected. To test the capability of the method to recover template-based models from restraints, five single-domain proteins with structures that have been well-predicted by template-based methods were used; it was found that the resulting structures were of the same quality as the best of the original models. To assess whether the new approach can improve template-based predictions with incorrectly predicted domain packing, four such targets were selected from the CASP10 targets; for three of them the new approach resulted in significantly better predictions compared with the original template-based models. The new approach can be used to predict the structures of proteins for which good templates can be found for sections of the sequence or an overall good template can be found for the entire sequence but the prediction quality is remarkably weaker in putative domain-linker regions.
Zhang, Yang; Pun, Betty; Wu, Shiang-Yuh; Vijayaraghavan, Krish; Seigneur, Christian
2004-12-01
The Models-3 Community Multiscale Air Quality (CMAQ) Modeling System and the Particulate Matter Comprehensive Air Quality Model with extensions (PMCAMx) were applied to simulate the period June 29-July 10, 1999, of the Southern Oxidants Study episode with two nested horizontal grid sizes: a coarse resolution of 32 km and a fine resolution of 8 km. The predicted spatial variations of ozone (O3), particulate matter with an aerodynamic diameter less than or equal to 2.5 µm (PM2.5), and particulate matter with an aerodynamic diameter less than or equal to 10 µm (PM10) by both models are similar in rural areas but differ from one another significantly over some urban/suburban areas in the eastern and southern United States, where PMCAMx tends to predict higher values of O3 and PM than CMAQ. Both models tend to predict O3 values that are higher than those observed. For observed O3 values above 60 ppb, O3 performance meets the U.S. Environmental Protection Agency's criteria for CMAQ with both grids and for PMCAMx with the fine grid only. It becomes unsatisfactory for PMCAMx and marginally satisfactory for CMAQ for observed O3 values above 40 ppb. Both models predict similar amounts of sulfate (SO4(2-)) and organic matter, and both predict SO4(2-) to be the largest contributor to PM2.5. PMCAMx generally predicts higher amounts of ammonium (NH4+), nitrate (NO3-), and black carbon (BC) than does CMAQ. PM performance for CMAQ is generally consistent with that of other PM models, whereas PMCAMx predicts higher concentrations of NO3-, NH4+, and BC than observed, which degrades its performance. For PM10 and PM2.5 predictions over the southeastern U.S. domain, the ranges of mean normalized gross errors (MNGEs) and mean normalized bias are 37-43% and -33% to 4% for CMAQ and 50-59% and 7-30% for PMCAMx. Both models predict the largest MNGEs for NO3- (98-104% for CMAQ and 138-338% for PMCAMx). The inaccurate NO3- predictions by both models may be caused by the inaccuracies in the ammonia emission inventory and the uncertainties in the gas/particle partitioning under some conditions. In addition to these uncertainties, the significant PM overpredictions by PMCAMx may be attributed to the lack of wet removal for PM and a likely underprediction in the vertical mixing during the daytime.
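The standard definitions assumed here for the two performance statistics quoted above, for paired predictions P_i and observations O_i, are:

```latex
% Standard definitions assumed (not restated in the abstract):
\begin{align}
\mathrm{MNB}  &= \frac{1}{N}\sum_{i=1}^{N} \frac{P_i - O_i}{O_i} \times 100\%, &
\mathrm{MNGE} &= \frac{1}{N}\sum_{i=1}^{N} \frac{\lvert P_i - O_i \rvert}{O_i} \times 100\%.
\end{align}
```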
Procedures for adjusting regional regression models of urban-runoff quality using local data
Hoos, A.B.; Sisolak, J.K.
1993-01-01
Statistical operations termed model-adjustment procedures (MAPs) can be used to incorporate local data into existing regression models to improve the prediction of urban-runoff quality. Each MAP is a form of regression analysis in which the local data base is used as a calibration data set. Regression coefficients are determined from the local data base, and the resulting 'adjusted' regression models can then be used to predict storm-runoff quality at unmonitored sites. The response variable in the regression analyses is the observed load or mean concentration of a constituent in storm runoff for a single storm. The set of explanatory variables used in the regression analyses is different for each MAP, but always includes the predicted value of load or mean concentration from a regional regression model. The four MAPs examined in this study were: single-factor regression against the regional model prediction, P (termed MAP-1F-P); regression against P (termed MAP-R-P); regression against P and additional local variables (termed MAP-R-P+nV); and a weighted combination of P and a local-regression prediction (termed MAP-W). The procedures were tested by means of split-sample analysis, using data from three cities included in the Nationwide Urban Runoff Program: Denver, Colorado; Bellevue, Washington; and Knoxville, Tennessee. The MAP that provided the greatest predictive accuracy for the verification data set differed among the three test data bases and among model types (MAP-W for Denver and Knoxville, MAP-1F-P and MAP-R-P for Bellevue load models, and MAP-R-P+nV for Bellevue concentration models) and, in many cases, was not clearly indicated by the values of standard error of estimate for the calibration data set. A scheme to guide MAP selection, based on exploratory data analysis of the calibration data set, is presented and tested. The MAPs were tested for sensitivity to the size of the calibration data set. As expected, predictive accuracy of all MAPs for the verification data set decreased as the calibration data-set size decreased, but predictive accuracy was not as sensitive for the MAPs as it was for the local regression models.
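A toy sketch in the spirit of the single-factor procedure (MAP-1F-P): regress locally observed storm loads on the regional-model predictions and use the fitted line for adjusted predictions. The data are invented, and real applications commonly work with log-transformed loads.

```python
import numpy as np

# Hypothetical local calibration data: regional-model predictions (P) of storm-runoff
# load and the corresponding locally observed loads, both in kg per storm
regional_pred = np.array([12.0, 30.0, 8.5, 45.0, 22.0, 17.5])
observed_load = np.array([9.0, 26.0, 10.0, 38.0, 20.5, 14.0])

# Single-factor adjustment: regress observed load on the regional prediction P
slope, intercept = np.polyfit(regional_pred, observed_load, deg=1)

def adjusted_prediction(p_regional):
    """Locally adjusted storm-load prediction from the regional-model value."""
    return intercept + slope * p_regional

print("adjusted load for P = 25 kg:", round(adjusted_prediction(25.0), 1), "kg")
```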
Perceptual video quality assessment in H.264 video coding standard using objective modeling.
Karthikeyan, Ramasamy; Sainarayanan, Gopalakrishnan; Deepa, Subramaniam Nachimuthu
2014-01-01
Since digital video is in widespread use nowadays, quality considerations have become essential, and industry demand for video quality measurement is rising. This proposal provides a method of perceptual quality assessment for the H.264 standard encoder using objective modeling. For this purpose, quality impairments are calculated and a model is developed to compute a perceptual video quality metric based on a no-reference method. Subtle differences between the original and the encoded video degrade the quality of the encoded picture; this quality difference is introduced by encoding processes such as intra and inter prediction. The proposed model takes into account the artifacts introduced by these spatial and temporal activities in hybrid block-based coding methods, and an objective mapping of these artifacts onto subjective quality estimates is proposed. The proposed model calculates the objective quality metric from subjective impairments (blockiness, blur and jerkiness), in contrast to the existing bitrate-only calculation defined in the ITU-T G.1070 model. The accuracy of the proposed perceptual video quality metric is compared against popular full-reference objective methods as defined by VQEG.
Development of VIS/NIR spectroscopic system for real-time prediction of fresh pork quality
NASA Astrophysics Data System (ADS)
Zhang, Haiyun; Peng, Yankun; Zhao, Songwei; Sasao, Akira
2013-05-01
Quality attributes of fresh meat influence nutritional value and consumers' purchasing power. The aim of the research was to develop a prototype for real-time detection of meat quality, consisting of a hardware system and a software system. A VIS/NIR spectrograph covering the range of 350 to 1100 nm was used to collect the spectral data. In order to acquire more potential information from the sample, an optical fiber multiplexer was used. A conveyable cylindrical device was designed and fabricated to hold the optical fibers from the multiplexer. A high-power halogen tungsten lamp was selected as the light source. The spectral data were obtained with an exposure time of 2.17 ms from the surface of the sample by pressing the trigger switch on the self-developed system. The system could automatically acquire, process, display, and save the data, and the quality could be predicted on-line. A total of 55 fresh pork samples were used to develop the prediction model for real-time detection. The spectral data were pretreated with the standard normal variate (SNV) transformation, and partial least squares regression (PLSR) was used to develop the prediction model. The correlation coefficients and root mean square errors of the validation set were 0.810 and 0.653 for water content, and 0.803 and 0.098 for pH, respectively. The research shows that the real-time non-destructive detection system based on VIS/NIR spectroscopy can efficiently predict the quality of fresh meat.
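A minimal sketch of the SNV pretreatment and PLSR calibration step described above, using synthetic spectra; the data, the wavelength grid, and the number of latent variables are assumptions, not values from the study.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)
    n_samples, n_bands = 55, 200            # e.g. 55 pork samples, 200 spectral bands
    spectra = rng.normal(1.0, 0.2, (n_samples, n_bands)).cumsum(axis=1)
    water = spectra[:, 50] * 0.3 + rng.normal(0, 0.05, n_samples)   # fake reference values

    # Standard normal variate (SNV): centre and scale each spectrum individually.
    snv = (spectra - spectra.mean(axis=1, keepdims=True)) / spectra.std(axis=1, keepdims=True)

    X_cal, X_val, y_cal, y_val = train_test_split(snv, water, test_size=0.3, random_state=0)

    pls = PLSRegression(n_components=8)     # number of latent variables is a guess
    pls.fit(X_cal, y_cal)
    y_pred = pls.predict(X_val).ravel()

    r = np.corrcoef(y_val, y_pred)[0, 1]
    rmse = np.sqrt(np.mean((y_val - y_pred) ** 2))
    print(f"validation r = {r:.3f}, RMSEP = {rmse:.3f}")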
UV/blue light-induced fluorescence for assessing apple maturity
NASA Astrophysics Data System (ADS)
Noh, Hyun Kwon; Lu, Renfu
2005-11-01
Chlorophyll fluorescence has been researched for assessing fruit post-harvest quality and condition. The objective of this preliminary research was to investigate the potential of fluorescence spectroscopy for measuring apple fruit quality. Ultraviolet (UV) and blue light was used as an excitation source for inducing fluorescence in apples. Fluorescence spectra were measured from 'Golden Delicious' (GD) and 'Red Delicious' (RD) apples by using a visible/near-infrared spectrometer after one, three, and five minutes of continuous UV/blue light illumination. Standard destructive tests were performed to measure fruit firmness, skin and flesh color, and soluble solids and acid content of the apples. Calibration models for each of the three illumination time periods were developed to predict fruit quality indexes. The results showed that fluorescence emission decreased steadily during the first three minutes of UV/blue light illumination and stabilized by five minutes. The differences in the model prediction results based on fluorescence data at one, three, or five minutes of illumination were minimal. Overall, better predictions were obtained for apple skin chroma and hue and flesh hue, with correlation coefficients of validation between 0.80 and 0.90 for both GD and RD. Relatively poor predictions were obtained for fruit firmness, soluble solids content, titratable acidity, and flesh chroma. This research demonstrated that fluorescence spectroscopy is potentially useful for assessing selected quality attributes of apple fruit, and further research is needed to improve fluorescence measurements so that better predictions of fruit quality can be achieved.
Bergquist, John R; Thiels, Cornelius A; Etzioni, David A; Habermann, Elizabeth B; Cima, Robert R
2016-04-01
Colorectal surgical site infections (C-SSIs) are a major source of postoperative morbidity. Institutional C-SSI rates are modeled and scrutinized, and there is increasing movement in the direction of public reporting. External validation of C-SSI risk prediction models is lacking. Factors governing C-SSI occurrence are complicated and multifactorial. We hypothesized that existing C-SSI prediction models have limited ability to accurately predict C-SSI in independent data. Colorectal resections identified from our institutional ACS-NSQIP dataset (2006 to 2014) were reviewed. The primary outcome was any C-SSI according to the ACS-NSQIP definition. Emergency cases were excluded. Published C-SSI risk scores: the National Nosocomial Infection Surveillance (NNIS), Contamination, Obesity, Laparotomy, and American Society of Anesthesiologists (ASA) class (COLA), Preventie Ziekenhuisinfecties door Surveillance (PREZIES), and NSQIP-based models were compared with receiver operating characteristic (ROC) analysis to evaluate discriminatory quality. There were 2,376 cases included, with an overall C-SSI rate of 9% (213 cases). None of the models produced reliable and high quality C-SSI predictions. For any C-SSI, the NNIS c-index was 0.57 vs 0.61 for COLA, 0.58 for PREZIES, and 0.62 for NSQIP: all well below the minimum "reasonably" predictive c-index of 0.7. Predictions for superficial, deep, and organ space SSI were similarly poor. Published C-SSI risk prediction models do not accurately predict C-SSI in our independent institutional dataset. Application of externally developed prediction models to any individual practice must be validated or modified to account for institution and case-mix specific factors. This questions the validity of using externally or nationally developed models for "expected" outcomes and interhospital comparisons. Copyright © 2016 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
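The c-index comparison reported above amounts to computing the area under the ROC curve for each published risk score against the observed SSI outcomes. A minimal sketch with simulated data follows; the scores here are random stand-ins, not the actual NNIS, COLA, PREZIES, or NSQIP formulas.

    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(2)
    n = 2376
    ssi = rng.random(n) < 0.09                     # ~9% observed C-SSI rate

    # Hypothetical risk scores from several published models (random stand-ins).
    scores = {
        "NNIS":    rng.random(n) + 0.2 * ssi,
        "COLA":    rng.random(n) + 0.3 * ssi,
        "PREZIES": rng.random(n) + 0.2 * ssi,
        "NSQIP":   rng.random(n) + 0.35 * ssi,
    }

    # For a binary outcome, the c-index equals the ROC AUC of the risk score.
    for name, score in scores.items():
        print(f"{name:8s} c-index = {roc_auc_score(ssi, score):.2f}")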
Birmingham, R.S.; Bub, K.L.; Vaughn, B.E.
2017-01-01
Parenting and attachment are critical in the emergence of self-regulation (SR) in preschool. However, most studies use general indexes of parenting quality, failing to explore the unique contributions of sensitivity and home quality to SR. Further, the nature of the interplay between parenting and attachment history is not well understood. Using a sample of 938 children from The NICHD Study of Early Child Care and Youth Development, a series of structural equation models were fit to determine whether sensitivity and home quality concurrently predicted SR at 54 months, and whether attachment mediated or moderated these pathways. Results suggest that both sensitivity and home quality uniquely predict SR. Further, these early parenting variables were each indirectly associated with SR through children's attachment history. That is, higher levels of sensitivity and home quality predicted secure attachment history, which, along with parenting, predicted more advanced SR skills at 54 months. No moderated pathways emerged, suggesting attachment history may be best conceptualized as a mediating mechanism. PMID:27894211
NASA Astrophysics Data System (ADS)
Cameron, Enrico; Pilla, Giorgio; Stella, Fabio A.
2018-06-01
The application of statistical classification methods is investigated—in comparison also to spatial interpolation methods—for predicting the acceptability of well-water quality in a situation where an effective quantitative model of the hydrogeological system under consideration cannot be developed. In the example area in northern Italy, in particular, the aquifer is locally affected by saline water and the concentration of chloride is the main indicator of both saltwater occurrence and groundwater quality. The goal is to predict if the chloride concentration in a water well will exceed the allowable concentration so that the water is unfit for the intended use. A statistical classification algorithm achieved the best predictive performances and the results of the study show that statistical classification methods provide further tools for dealing with groundwater quality problems concerning hydrogeological systems that are too difficult to describe analytically or to simulate effectively.
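A minimal sketch of the classification setup described above: label each well by whether chloride exceeds the allowable limit and train a classifier on site attributes. The features, the 250 mg/L threshold, and the choice of a random forest are assumptions for illustration; the study does not specify them here.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(3)
    n_wells = 300
    # Hypothetical predictors: easting, northing, well depth, distance to the saline zone.
    X = np.column_stack([
        rng.uniform(0, 10, n_wells),
        rng.uniform(0, 10, n_wells),
        rng.uniform(5, 150, n_wells),
        rng.uniform(0.1, 20, n_wells),
    ])
    chloride = 500 * np.exp(-0.2 * X[:, 3]) + rng.normal(0, 30, n_wells)   # mg/L, fake
    exceeds = chloride > 250                                               # assumed limit

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    acc = cross_val_score(clf, X, exceeds, cv=5, scoring="accuracy")
    print("cross-validated accuracy:", acc.mean().round(3))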
NASA Astrophysics Data System (ADS)
Balasubramanian, S.; Nelson, A. J.; Koloutsou-Vakakis, S.; Lin, J.; Myles, L.; Rood, M. J.
2016-12-01
Biogeochemical models such as DeNitrification DeComposition (DNDC) are used to model greenhouse and other trace gas fluxes (e.g., ammonia (NH3)) from agricultural ecosystems. NH3 is of interest to air quality because it is a precursor to ambient particulate matter. NH3 fluxes from chemical fertilizer application are uncertain due to their dependence on local weather and soil properties and on farm nitrogen management practices. DNDC can be advantageously implemented to model the underlying spatial and temporal trends to support air quality modeling. However, such implementation requires a detailed evaluation of model predictions and model behavior. This is the first study to assess DNDC predictions of NH3 fluxes to/from the atmosphere from chemical fertilizer application during an entire crop growing season in the United States. Relaxed eddy accumulation (REA) measurements over corn in Central Illinois in 2014 were used to evaluate the magnitude and trends in modeled NH3 fluxes. DNDC was able to replicate both the magnitude and trends in measured NH3 fluxes, with greater accuracy during the initial 33 days after application, when NH3 was mostly emitted to the atmosphere. However, poorer performance was observed when depositional fluxes were measured. Sensitivity analysis using Monte Carlo simulations indicated that modeled NH3 fluxes were most sensitive to input air temperature and precipitation; soil organic carbon, field capacity, and pH; and fertilizer loading rate, timing, application depth, and tilling date. By constraining these inputs for conditions in Central Illinois, uncertainty in annual NH3 fluxes was estimated to vary from -87% to 61%. Results from this study provide insight to further improve DNDC predictions and inform efforts for upscaling site predictions to the regional scale for the development of emission inventories for air quality modeling.
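A minimal sketch of the Monte Carlo sensitivity idea described above: perturb uncertain inputs within plausible ranges, rerun a model, and rank the inputs by how strongly they correlate with the output. The toy flux function and input ranges below are assumptions for illustration, not DNDC or the study's values.

    import numpy as np

    rng = np.random.default_rng(4)
    n_runs = 2000

    # Sample uncertain inputs uniformly within assumed plausible ranges.
    inputs = {
        "air_temp_C":       rng.uniform(10, 30, n_runs),
        "precip_mm":        rng.uniform(0, 50, n_runs),
        "soil_organic_C":   rng.uniform(1, 4, n_runs),
        "soil_pH":          rng.uniform(5.5, 8.0, n_runs),
        "fert_rate_kgN_ha": rng.uniform(100, 250, n_runs),
    }

    # Toy stand-in for the flux model (NOT DNDC): NH3 volatilisation increases
    # with temperature, pH, and fertiliser rate, and decreases with precipitation.
    nh3_flux = (0.02 * inputs["fert_rate_kgN_ha"]
                * np.exp(0.08 * (inputs["air_temp_C"] - 20))
                * (inputs["soil_pH"] - 5.0)
                / (1 + 0.05 * inputs["precip_mm"]))

    # Rank inputs by the absolute correlation of each input with the output.
    for name, values in sorted(inputs.items(),
                               key=lambda kv: -abs(np.corrcoef(kv[1], nh3_flux)[0, 1])):
        print(f"{name:18s} |r| = {abs(np.corrcoef(values, nh3_flux)[0, 1]):.2f}")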
NASA Astrophysics Data System (ADS)
Jathar, Shantanu H.; Woody, Matthew; Pye, Havala O. T.; Baker, Kirk R.; Robinson, Allen L.
2017-03-01
Gasoline- and diesel-fueled engines are ubiquitous sources of air pollution in urban environments. They emit both primary particulate matter and precursor gases that react to form secondary particulate matter in the atmosphere. In this work, we updated the organic aerosol module and organic emissions inventory of a three-dimensional chemical transport model, the Community Multiscale Air Quality Model (CMAQ), using recent, experimentally derived inputs and parameterizations for mobile sources. The updated model included a revised volatile organic compound (VOC) speciation for mobile sources and secondary organic aerosol (SOA) formation from unspeciated intermediate volatility organic compounds (IVOCs). The updated model was used to simulate air quality in southern California during May and June 2010, when the California Research at the Nexus of Air Quality and Climate Change (CalNex) study was conducted. Compared to the Traditional version of CMAQ, which is commonly used for regulatory applications, the updated model did not significantly alter the predicted organic aerosol (OA) mass concentrations but did substantially improve predictions of OA sources and composition (e.g., POA-SOA split), as well as ambient IVOC concentrations. The updated model, despite substantial differences in emissions and chemistry, performed similarly to a recently released research version of CMAQ (Woody et al., 2016) that did not include the updated VOC and IVOC emissions and SOA data. Mobile sources were predicted to contribute 30-40% of the OA in southern California (half of which was SOA), making mobile sources the single largest source contributor to OA in southern California. The remainder of the OA was attributed to non-mobile anthropogenic sources (e.g., cooking, biomass burning), with biogenic sources contributing less than 5% of the total OA. Gasoline sources were predicted to contribute about 13 times more OA than diesel sources; this difference was driven by differences in SOA production. Model predictions highlighted the need to better constrain multi-generational oxidation reactions in chemical transport models.
DRAINMOD-GIS: a lumped parameter watershed scale drainage and water quality model
G.P. Fernandez; G.M. Chescheir; R.W. Skaggs; D.M. Amatya
2006-01-01
A watershed scale lumped parameter hydrology and water quality model that includes an uncertainty analysis component was developed and tested on a lower coastal plain watershed in North Carolina. Uncertainty analysis was used to determine the impacts of uncertainty in field and network parameters of the model on the predicted outflows and nitrate-nitrogen loads at the...
NASA Astrophysics Data System (ADS)
Kruse Christensen, Nikolaj; Ferre, Ty Paul A.; Fiandaca, Gianluca; Christensen, Steen
2017-03-01
We present a workflow for efficient construction and calibration of large-scale groundwater models that includes the integration of airborne electromagnetic (AEM) data and hydrological data. In the first step, the AEM data are inverted to form a 3-D geophysical model. In the second step, the 3-D geophysical model is translated, using a spatially dependent petrophysical relationship, to form a 3-D hydraulic conductivity distribution. The geophysical models and the hydrological data are used to estimate spatially distributed petrophysical shape factors. The shape factors primarily work as translators between resistivity and hydraulic conductivity, but they can also compensate for structural defects in the geophysical model. The method is demonstrated for a synthetic case study with sharp transitions among various types of deposits. Besides demonstrating the methodology, we demonstrate the importance of using geophysical regularization constraints that conform well to the depositional environment. This is done by inverting the AEM data using either smoothness (smooth) constraints or minimum gradient support (sharp) constraints, where the use of sharp constraints conforms best to the environment. The dependency on AEM data quality is also tested by inverting the geophysical model using data corrupted with four different levels of background noise. Subsequently, the geophysical models are used to construct competing groundwater models for which the shape factors are calibrated. The performance of each groundwater model is tested with respect to four types of prediction that are beyond the calibration base: a pumping well's recharge area and groundwater age, respectively, are predicted by applying the same stress as for the hydrologic model calibration; and head and stream discharge are predicted for a different stress situation. As expected, in this case the predictive capability of a groundwater model is better when it is based on a sharp geophysical model instead of a smoothness constraint. This is true for predictions of recharge area, head change, and stream discharge, while we find no improvement for prediction of groundwater age. Furthermore, we show that the model prediction accuracy improves with AEM data quality for predictions of recharge area, head change, and stream discharge, while there appears to be no accuracy improvement for the prediction of groundwater age.
Models that predict standing crop of stream fish from habitat variables: 1950-85.
K.D. Fausch; C.L. Hawkes; M.G. Parsons
1988-01-01
We reviewed mathematical models that predict standing crop of stream fish (number or biomass per unit area or length of stream) from measurable habitat variables and classified them by the types of independent habitat variables found significant, by mathematical structure, and by model quality. Habitat variables were of three types and were measured on different scales...
A model to predict stream water temperature across the conterminous USA
Catalina Segura; Peter Caldwell; Ge Sun; Steve McNulty; Yang Zhang
2014-01-01
Stream water temperature (ts) is a critical water quality parameter for aquatic ecosystems. However, ts records are sparse or nonexistent in many river systems. In this work, we present an empirical model to predict ts at the site scale across the USA. The model, derived using data from 171 reference sites selected from the Geospatial Attributes of Gages for Evaluating...
Advanced Water Quality Modelling in Marine Systems: Application to the Wadden Sea, the Netherlands
NASA Astrophysics Data System (ADS)
Boon, J.; Smits, J. G.
2006-12-01
There is an increasing demand for knowledge and models arising from water management in relation to water quality, sediment quality (ecology), and sediment accumulation (ecomorphology). Recently, models for sediment diagenesis and erosion developed or incorporated by Delft Hydraulics have integrated the relevant physical, (bio)chemical, and biological processes for the sediment-water exchange of substances. The aim of the diagenesis models is the prediction of both sediment quality and the return fluxes of substances such as nutrients and micropollutants to the overlying water. The resulting DELWAQ-G model is a new, generic version of the water and sediment quality model of the DELFT3D framework. One set of generic water quality process formulations is used to calculate process rates in both water and sediment compartments. DELWAQ-G involves the explicit simulation of sediment layers in the water quality model with state-of-the-art process kinetics. The local conditions in a water layer or sediment layer, such as the dissolved oxygen concentration, determine whether and how individual processes come to expression. New processes were added for sulphate, sulphide, methane, and the distribution of the electron-acceptor demand over dissolved oxygen, nitrate, sulphate, and carbon dioxide. DELWAQ-G also includes the dispersive and advective transport processes in the sediment and across the sediment-water interface. DELWAQ-G has been applied to the Wadden Sea, a very dynamic, tidal, and ecologically active estuary with complex hydrodynamic behaviour located in the north of the Netherlands. The predicted profiles in the sediment reflect the typical interactions of diagenesis processes.
Predicting Fecal Indicator Bacteria Fate and Removal in Urban Stormwater at the Watershed Scale
NASA Astrophysics Data System (ADS)
Wolfand, J.; Hogue, T. S.; Luthy, R. G.
2016-12-01
Urban stormwater is a major cause of water quality impairment, resulting in surface waters that fail to meet water quality standards and support their designated uses. Of the many stormwater pollutants, fecal indicator bacteria are particularly important to track because they are directly linked to pathogens which jeopardize public health; yet, their fate and transport in urban stormwater is poorly understood. Monitoring fecal bacteria in stormwater is possible, but due to the high variability of fecal indicators both spatially and temporally, single grab or composite samples do not fully capture fecal indicator loading. Models have been developed to predict fecal indicator bacteria at the watershed scale, but they are often limited to agricultural areas, or areas that receive frequent rainfall. Further, it is unclear whether best management practices (BMPs), such as bioretention or engineered wetlands, are able to reduce bacteria to meet water quality standards at watershed outlets. This research seeks to develop a model to predict fecal indicator bacteria in urban stormwater in a semi-arid climate at the watershed scale. Using the highly developed Ballona Creek watershed (89 mi2) located in Los Angeles County as a case study, several existing mechanistic models are coupled with a hydrologic model to predict fecal indicator concentrations (E. coli, enterococci, fecal coliform, and total coliform) at the outfall of Ballona Creek watershed, Santa Monica Bay. The hydrologic model was developed using InfoSWMM Sustain, calibrated for flow from WY 1998-2006 (NSE = 0.94; R2 = 0.95), and validated from WY 2007-2015 (NSE = 0.93; R2 = 0.95). The developed coupled model is being used to predict fecal indicator fate and transport and evaluate how BMPs can be optimized to reduce fecal indicator loading to surface waters and recreational beaches.
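The calibration and validation statistics quoted above (NSE and R²) can be computed directly from paired observed and simulated flow series; a minimal sketch with placeholder arrays standing in for the measured and modeled daily flows is shown below.

    import numpy as np

    def nse(obs, sim):
        """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    def r_squared(obs, sim):
        """Squared Pearson correlation between observed and simulated series."""
        return np.corrcoef(obs, sim)[0, 1] ** 2

    # Placeholder daily flow series standing in for a calibration period.
    rng = np.random.default_rng(5)
    observed = np.abs(rng.normal(10, 4, 365))
    simulated = observed + rng.normal(0, 1.0, 365)

    print(f"NSE = {nse(observed, simulated):.2f}, R2 = {r_squared(observed, simulated):.2f}")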
Evaluation of 3D-Jury on CASP7 models.
Kaján, László; Rychlewski, Leszek
2007-08-21
3D-Jury, the structure prediction consensus method publicly available in the Meta Server http://meta.bioinfo.pl/, was evaluated using models gathered in the 7th round of the Critical Assessment of Techniques for Protein Structure Prediction (CASP7). 3D-Jury is an automated expert process that generates protein structure meta-predictions from sets of models obtained from partner servers. The performance of 3D-Jury was analysed for three aspects. First, we examined the correlation between the 3D-Jury score and a model quality measure: the number of correctly predicted residues. The 3D-Jury score was shown to correlate significantly with the number of correctly predicted residues, the correlation is good enough to be used for prediction. 3D-Jury was also found to improve upon the competing servers' choice of the best structure model in most cases. The value of the 3D-Jury score as a generic reliability measure was also examined. We found that the 3D-Jury score separates bad models from good models better than the reliability score of the original server in 27 cases and falls short of it in only 5 cases out of a total of 38. We report the release of a new Meta Server feature: instant 3D-Jury scoring of uploaded user models. The 3D-Jury score continues to be a good indicator of structural model quality. It also provides a generic reliability score, especially important for models that were not assigned such by the original server. Individual structure modellers can also benefit from the 3D-Jury scoring system by testing their models in the new instant scoring feature http://meta.bioinfo.pl/compare_your_model_example.pl available in the Meta Server.
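A minimal sketch of the consensus idea behind a 3D-Jury-style score: rank each model by its average structural similarity to all other models in the pool. The similarity function below is a simple placeholder over dummy coordinate arrays, not the actual MaxSub-based 3D-Jury similarity.

    import numpy as np

    def similarity(a, b):
        """Placeholder pairwise similarity: higher when coordinates are closer.
        A real consensus method would use a structural score such as MaxSub."""
        rmsd = np.sqrt(np.mean(np.sum((a - b) ** 2, axis=1)))
        return 1.0 / (1.0 + rmsd)

    def consensus_scores(models):
        """Average similarity of each model to every other model in the pool."""
        n = len(models)
        scores = []
        for i in range(n):
            sims = [similarity(models[i], models[j]) for j in range(n) if j != i]
            scores.append(float(np.mean(sims)))
        return scores

    # Dummy "models": coordinate arrays of the same length (e.g. CA traces).
    rng = np.random.default_rng(6)
    native_like = rng.normal(0, 5, (100, 3))
    pool = [native_like + rng.normal(0, s, (100, 3)) for s in (0.5, 0.7, 1.0, 5.0)]

    for i, s in enumerate(consensus_scores(pool)):
        print(f"model {i}: consensus score = {s:.3f}")

The outlier model (largest perturbation) receives the lowest consensus score, which is the behaviour a consensus selector relies on.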
Development of techniques and data for evaluating ride quality. Volume 2 : ride-quality research
DOT National Transportation Integrated Search
1978-03-01
Ride-quality models for city buses and intercity trains are presented and discussed in terms of their ability to predict passenger comfort and ride acceptability. This, the second of three volumes, contains a technical discussion of the ride...
Physical Vapor Transport of Mercurous Chloride Crystals: Design of a Microgravity Experiment
NASA Technical Reports Server (NTRS)
Duval, W. M. B.; Singh, N. B.; Glicksman, M. E.
1997-01-01
Flow field characteristics predicted from a computational model show that the dynamical state of the flow, for practical crystal growth conditions of mercurous chloride, can range from steady to unsteady. Evidence that the flow field can be strongly dominated by convection for ground-based conditions is provided by the prediction of asymmetric velocity profiles by the model, which show reasonable agreement with laser Doppler velocimetry experiments in both magnitude and planform. Unsteady flow is shown to be correlated with a degradation of crystal quality as quantified by light scattering pattern measurements. A microgravity experiment is designed to show that an experiment performed with parameters which yield an unsteady flow becomes steady (diffusive-advective) in a microgravity environment of 10(exp -3) g(sub 0), as predicted by the model, and hence yields crystals with optimal quality.
Image and Video Quality Assessment Using LCD: Comparisons with CRT Conditions
NASA Astrophysics Data System (ADS)
Tourancheau, Sylvain; Callet, Patrick Le; Barba, Dominique
In this paper, the impact of the display on quality assessment is addressed. Subjective quality assessment experiments have been performed on both LCD and CRT displays. Two sets of still images and two sets of moving pictures have been assessed using either an ACR or a SAMVIQ protocol. Altogether, eight experiments have been conducted. Results are presented and discussed, and some differences are pointed out. Concerning moving pictures, these differences seem to be mainly due to LCD motion artefacts such as motion blur. LCD motion blur has been measured objectively and with psycho-physical experiments. A motion-blur metric based on the temporal characteristics of LCDs can be defined. A prediction model has then been designed which predicts the differences in perceived quality between CRT and LCD. This motion-blur-based model enables the estimation of perceived quality on LCD with respect to the perceived quality on CRT. Technical solutions to LCD motion blur can thus be evaluated on natural content by this means.
Comprehensive model for predicting perceptual image quality of smart mobile devices.
Gong, Rui; Xu, Haisong; Luo, M R; Li, Haifeng
2015-01-01
An image quality model for smart mobile devices was proposed based on visual assessments of several image quality attributes. A series of psychophysical experiments were carried out on two kinds of smart mobile devices, i.e., smart phones and tablet computers, in which naturalness, colorfulness, brightness, contrast, sharpness, clearness, and overall image quality were visually evaluated under three lighting environments via categorical judgment method for various application types of test images. On the basis of Pearson correlation coefficients and factor analysis, the overall image quality could first be predicted by its two constituent attributes with multiple linear regression functions for different types of images, respectively, and then the mathematical expressions were built to link the constituent image quality attributes with the physical parameters of smart mobile devices and image appearance factors. The procedure and algorithms were applicable to various smart mobile devices, different lighting conditions, and multiple types of images, and performance was verified by the visual data.
NASA Astrophysics Data System (ADS)
Wan, Xiaodong; Wang, Yuanxun; Zhao, Dawei; Huang, YongAn
2017-09-01
Our study aims at developing an effective quality monitoring system for small-scale resistance spot welding of titanium alloy. The measured electrical signals were interpreted in combination with the nugget development. Features were extracted from the dynamic resistance and electrode voltage curves. A higher welding current generally indicated a lower overall dynamic resistance level. A larger electrode voltage peak and a higher rate of change of electrode voltage could be detected under a smaller electrode force or higher welding current. Variation of the extracted features and weld quality was found to be more sensitive to changes in welding current than in electrode force. Different neural network models were proposed for weld quality prediction. The back propagation neural network was more suitable for failure load estimation, whereas the probabilistic neural network model was more appropriate for quality level classification. A real-time, on-line weld quality monitoring system may be developed by taking advantage of both methods.
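A minimal sketch of the two-model idea above, using scikit-learn stand-ins: an MLP regressor in place of the back-propagation network for failure load, and an MLP classifier in place of the probabilistic neural network for quality classes. The features, data, and class thresholds are made up for illustration.

    import numpy as np
    from sklearn.neural_network import MLPRegressor, MLPClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(7)
    n = 400
    # Hypothetical features extracted from the electrical signals.
    X = np.column_stack([
        rng.uniform(10, 30, n),     # mean dynamic resistance (a.u.)
        rng.uniform(0.5, 2.0, n),   # electrode voltage peak (a.u.)
        rng.uniform(0.1, 1.0, n),   # rate of change of electrode voltage (a.u.)
    ])
    failure_load = 30 + 15 * X[:, 1] - 0.3 * X[:, 0] + rng.normal(0, 2, n)
    quality_class = np.digitize(failure_load, [40, 50])     # 0 = poor, 1 = ok, 2 = good

    X_tr, X_te, y_tr, y_te, c_tr, c_te = train_test_split(
        X, failure_load, quality_class, test_size=0.3, random_state=0)

    reg = MLPRegressor(hidden_layer_sizes=(20,), max_iter=3000, random_state=0).fit(X_tr, y_tr)
    clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=3000, random_state=0).fit(X_tr, c_tr)

    print("failure-load R2:", round(reg.score(X_te, y_te), 3))
    print("class accuracy :", round(clf.score(X_te, c_te), 3))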
Predicting Bacteria Removal by Enhanced Stormwater Control Measures (SCMs) at the Watershed Scale
NASA Astrophysics Data System (ADS)
Wolfand, J.; Bell, C. D.; Boehm, A. B.; Hogue, T. S.; Luthy, R. G.
2017-12-01
Urban stormwater is a major cause of water quality impairment, resulting in surface waters that fail to meet water quality standards and support their designated uses. Fecal indicator bacteria are present in high concentrations in stormwater and are strictly regulated in receiving waters; yet, their fate and transport in urban stormwater is poorly understood. Stormwater control measures (SCMs) are often used to treat, infiltrate, and release urban runoff, but field measurements show that the removal of bacteria by these structural solutions is limited (median log removal = 0.24, n = 370). Researchers have therefore looked to improve bacterial removal by enhancing SCMs through alterations in flow regimes or adding geomedia such as biochar. The present research seeks to develop a model to predict removal of fecal indicator bacteria by enhanced SCMs at the watershed scale in a semi-arid climate. Using the highly developed Ballona Creek watershed (290 km2) located in Los Angeles County as a case study, a hydrologic model is coupled with a stochastic water quality model to predict E. coli concentration near the outfall of the Ballona Creek, Santa Monica Bay. A hydrologic model was developed using EPA SWMM, calibrated for flow from water year 1998-2006 (NSE = 0.94; R2 = 0.94), and validated from water year 2007-2015 (NSE = 0.90; R2 = 0.93). This bacterial loading model was then linked to EPA SUSTAIN and a SCM bacterial removal script to simulate log removal of bacteria by various SCMs and predict bacterial concentrations in Ballona Creek. Preliminary results suggest small enhancements to SCMs that improve bacterial removal (<0.5 log removal) may offer large benefits to surface water quality and enable communities such as Los Angeles to meet their regulatory requirements.
A Review of Surface Water Quality Models
Li, Shibei; Jia, Peng; Qi, Changjun; Ding, Feng
2013-01-01
Surface water quality models can be useful tools to simulate and predict the levels, distributions, and risks of chemical pollutants in a given water body. The modeling results from these models under different pollution scenarios are very important components of environmental impact assessment and can provide a basis and technical support for environmental management agencies to make sound decisions. Whether the model results are right or not can affect the soundness and scientific basis of approved construction projects and the effectiveness of pollution control measures. We reviewed the development of surface water quality models at three stages and analyzed the suitability, precision, and methods of different models. Standardization of water quality models can help environmental management agencies guarantee consistency in the application of water quality models for regulatory purposes. We summarize the status of standardization of these models in developed countries and put forward available measures for the standardization of these surface water quality models, especially in developing countries. PMID:23853533
Zhang, Chengxin; Mortuza, S M; He, Baoji; Wang, Yanting; Zhang, Yang
2018-03-01
We develop two complementary pipelines, "Zhang-Server" and "QUARK", based on the I-TASSER and QUARK pipelines for template-based modeling (TBM) and free modeling (FM), and test them in the CASP12 experiment. The combination of I-TASSER and QUARK successfully folds three medium-size FM targets that have more than 150 residues, even though the interplay between the two pipelines still awaits further optimization. Newly developed sequence-based contact prediction by NeBcon plays a critical role in enhancing the quality of models produced by the new pipelines, particularly for FM targets. The inclusion of NeBcon predicted contacts as restraints in the QUARK simulations results in an average TM-score of 0.41 for the best in top five predicted models, which is 37% higher than that of the QUARK simulations without contacts. In particular, there are seven targets that are converted from non-foldable to foldable (TM-score >0.5) due to the use of contact restraints in the simulations. Another additional feature of the current pipelines is local structure quality prediction by ResQ, which provides a robust residue-level modeling error estimation. Despite the success, significant challenges still remain in ab initio modeling of multi-domain proteins and folding of β-proteins with complicated topologies bound by long-range strand-strand interactions. Improvements in domain boundary and long-range contact prediction, as well as optimal use of the predicted contacts and multiple threading alignments, are critical to address these issues seen in the CASP12 experiment. © 2017 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Jima, T. G.; Roberts, A.
2013-12-01
Quality of coastal and freshwater resources in the Southeastern United States is threatened by eutrophication as a result of excessive nutrients, and phosphorus is acknowledged as one of the major limiting nutrients. In areas with much non-point source (NPS) pollution, land use/land cover and climate have been found to have a significant impact on water quality. Landscape metrics applied in catchment- and riparian-stream-based nutrient export models are known to significantly improve nutrient prediction. The regional SPARROW (Spatially Referenced Regression On Watershed attributes) model, which predicts total phosphorus, has been developed by the USGS for the Southeastern United States region as part of the National Water Quality Assessment (NAWQA) program, and the model accuracy was found to be 67%. However, landscape composition and configuration metrics, which play a significant role in the source, transport, and delivery of the nutrient, have not been incorporated in the model. Including these metrics in the model's parameterization will improve the model's accuracy and improve the decision making process for mitigating and managing NPS phosphorus in the region. The National Land Cover Data 2001 raster data will be used (since the baseline is 2002) for the region (with 8321 watersheds), with Fragstats 4.1 and ArcGIS Desktop 10.1 for the analysis of landscape metrics, buffers, and the creation of map layers. The result will be imported into the Southeast SPARROW model and analyzed. The resulting statistical significance and model accuracy will be assessed, and predictions will be made for those areas with no water quality monitoring station.
Pilkington, Sarah M; Crowhurst, Ross; Hilario, Elena; Nardozza, Simona; Fraser, Lena; Peng, Yongyan; Gunaseelan, Kularajathevan; Simpson, Robert; Tahir, Jibran; Deroles, Simon C; Templeton, Kerry; Luo, Zhiwei; Davy, Marcus; Cheng, Canhong; McNeilage, Mark; Scaglione, Davide; Liu, Yifei; Zhang, Qiong; Datson, Paul; De Silva, Nihal; Gardiner, Susan E; Bassett, Heather; Chagné, David; McCallum, John; Dzierzon, Helge; Deng, Cecilia; Wang, Yen-Yi; Barron, Lorna; Manako, Kelvina; Bowen, Judith; Foster, Toshi M; Erridge, Zoe A; Tiffin, Heather; Waite, Chethi N; Davies, Kevin M; Grierson, Ella P; Laing, William A; Kirk, Rebecca; Chen, Xiuyin; Wood, Marion; Montefiori, Mirco; Brummell, David A; Schwinn, Kathy E; Catanach, Andrew; Fullerton, Christina; Li, Dawei; Meiyalaghan, Sathiyamoorthy; Nieuwenhuizen, Niels; Read, Nicola; Prakash, Roneel; Hunter, Don; Zhang, Huaibi; McKenzie, Marian; Knäbel, Mareike; Harris, Alastair; Allan, Andrew C; Gleave, Andrew; Chen, Angela; Janssen, Bart J; Plunkett, Blue; Ampomah-Dwamena, Charles; Voogd, Charlotte; Leif, Davin; Lafferty, Declan; Souleyre, Edwige J F; Varkonyi-Gasic, Erika; Gambi, Francesco; Hanley, Jenny; Yao, Jia-Long; Cheung, Joey; David, Karine M; Warren, Ben; Marsh, Ken; Snowden, Kimberley C; Lin-Wang, Kui; Brian, Lara; Martinez-Sanchez, Marcela; Wang, Mindy; Ileperuma, Nadeesha; Macnee, Nikolai; Campin, Robert; McAtee, Peter; Drummond, Revel S M; Espley, Richard V; Ireland, Hilary S; Wu, Rongmei; Atkinson, Ross G; Karunairetnam, Sakuntala; Bulley, Sean; Chunkath, Shayhan; Hanley, Zac; Storey, Roy; Thrimawithana, Amali H; Thomson, Susan; David, Charles; Testolin, Raffaele; Huang, Hongwen; Hellens, Roger P; Schaffer, Robert J
2018-04-16
Most published genome sequences are drafts, and most are dominated by computational gene prediction. Draft genomes typically incorporate considerable sequence data that are not assigned to chromosomes, and predicted genes without quality confidence measures. The current Actinidia chinensis (kiwifruit) 'Hongyang' draft genome has 164 Mb of sequences unassigned to pseudo-chromosomes, and omissions have been identified in the gene models. A second genome of an A. chinensis (genotype Red5) was fully sequenced. This new sequence resulted in a 554.0 Mb assembly with all but 6 Mb assigned to pseudo-chromosomes. Pseudo-chromosomal comparisons showed a considerable number of translocation events have occurred following a whole genome duplication (WGD) event, some of which are consistent with centromeric Robertsonian-like translocations. RNA sequencing data from 12 tissues and ab initio analysis informed a genome-wide manual annotation, using the WebApollo tool. In total, 33,044 gene loci represented by 33,123 isoforms were identified, named and tagged for quality of evidential support. Of these, 3114 (9.4%) were identical to a protein within the 'Hongyang' annotation in The Kiwifruit Information Resource (KIR v2). Some proportion of the differences will be varietal polymorphisms. However, as most computationally predicted Red5 models required manual re-annotation, this proportion is expected to be small. The quality of the new gene models was tested by fully sequencing 550 cloned 'Hort16A' cDNAs and comparing them with the predicted protein models for Red5 and both the original 'Hongyang' assembly and the revised annotation from KIR v2. Only 48.9% and 63.5% of the cDNAs had a match with 90% identity or better to the original and revised 'Hongyang' annotation, respectively, compared with 90.9% to the Red5 models. Our study highlights the need to take a cautious approach to draft genomes and computationally predicted genes. Our use of the manual annotation tool WebApollo facilitated manual checking and correction of gene models, enabling improvement of computational prediction. This utility was especially relevant for certain types of gene families such as the EXPANSIN-like genes. Finally, this high-quality gene set will supply the kiwifruit and general plant community with a new tool for genomics and other comparative analysis.
Harlow C. Landphair
1979-01-01
This paper relates the evolution of an empirical model used to predict public response to scenic quality objectively. The text relates the methods used to develop the visual quality index model, explains the terms used in the equation and briefly illustrates how the model is applied and how it is tested. While the technical application of the model relies heavily on...
Within the context of the Air Quality Model Evaluation International Initiative phase 2 (AQMEII2) project, this part II paper performs a multi-model assessment of major column abundances of gases, radiation, aerosol, and cloud variables for 2006 and 2010 simulations with three on...
NASA Astrophysics Data System (ADS)
Taylan, Osman
2017-02-01
High ozone concentrations are an important cause of air pollution, mainly due to their role in greenhouse gas emissions. Ozone is produced by photochemical processes involving nitrogen oxides and volatile organic compounds in the lower atmosphere. Therefore, monitoring and controlling the quality of air in the urban environment is very important for public health. However, air quality prediction is a highly complex and non-linear process; usually several attributes have to be considered. Artificial intelligence (AI) techniques can be employed to monitor and evaluate the ozone concentration level. The aim of this study is to develop an adaptive neuro-fuzzy inference system (ANFIS) approach to determine the influence of peripheral factors on air quality and pollution, an emerging problem due to ozone levels in Jeddah city. The concentration of ozone was considered as a factor to predict the air quality (AQ) under the prevailing atmospheric conditions. Using the air quality standards of Saudi Arabia, the ozone concentration level was modelled by employing factors such as nitrogen oxides (NOx), atmospheric pressure, temperature, and relative humidity. Hence, an ANFIS model was developed to observe the ozone concentration level, and the model performance was assessed using test data obtained from the monitoring stations established by the General Authority of Meteorology and Environment Protection of the Kingdom of Saudi Arabia. The outcomes of the ANFIS model were re-assessed by fuzzy quality charts using quality specification and control limits based on US-EPA air quality standards. The results of the present study show that the ANFIS model is a comprehensive approach for the estimation and assessment of ozone levels and is a reliable approach for producing more realistic outcomes.
Development of a multi-ensemble Prediction Model for China
NASA Astrophysics Data System (ADS)
Brasseur, G. P.; Bouarar, I.; Petersen, A. K.
2016-12-01
As part of the EU-sponsored Panda and MarcoPolo Projects, a multi-model prediction system including 7 models has been developed. Most regional models use global air quality predictions provided by the Copernicus Atmospheric Monitoring Service and downscale the forecast at relatively high spatial resolution in eastern China. The paper will describe the forecast system and show examples of forecasts produced for several Chinese urban areas and displayed on a web site developed by the Dutch Meteorological service. A discussion on the accuracy of the predictions based on a detailed validation process using surface measurements from the Chinese monitoring network will be presented.
Xu, Yadong; Serre, Marc L; Reyes, Jeanette; Vizuete, William
2016-04-19
To improve ozone exposure estimates for ambient concentrations at a national scale, we introduce our novel Regionalized Air Quality Model Performance (RAMP) approach to integrate chemical transport model (CTM) predictions with the available ozone observations using the Bayesian Maximum Entropy (BME) framework. The framework models the nonlinear and nonhomoscedastic relation between air pollution observations and CTM predictions and, for the first time, accounts for variability in CTM model performance. A validation analysis using only noncollocated data outside of a validation radius rv was performed, and the R(2) between observations and re-estimated values for two daily metrics, the daily maximum 8-h average (DM8A) and the daily 24-h average (D24A) ozone concentrations, was obtained for the OBS scenario using ozone observations only, in contrast with the RAMP and Constant Air Quality Model Performance (CAMP) scenarios. We show that, by accounting for the spatial and temporal variability in model performance, our novel RAMP approach is able to extract more information from CTM predictions than the CAMP approach, which assumes that model performance does not change across space and time, with an R(2) increase percentage over 12 times larger for the DM8A and over 3.5 times larger for the D24A ozone concentrations.
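As a rough, heavily simplified illustration of characterising model performance as a function of the predicted value from paired observations, the sketch below bins CTM predictions and summarises the observed mean and spread in each bin; this is only a binned bias/variance summary with fabricated data, not the BME or RAMP methodology.

    import numpy as np

    rng = np.random.default_rng(8)
    n = 5000
    ctm = rng.uniform(20, 90, n)                                  # CTM ozone predictions (ppb)
    obs = 0.8 * ctm + 5 + rng.normal(0, 3 + 0.1 * ctm, n)         # collocated observations

    # Characterise performance in bins of the predicted value: the mean and spread
    # of observations given a prediction are allowed to vary with the prediction.
    bins = np.linspace(20, 90, 8)
    idx = np.digitize(ctm, bins)
    expected_obs = np.array([obs[idx == k].mean() for k in range(1, len(bins))])
    obs_spread = np.array([obs[idx == k].std() for k in range(1, len(bins))])

    def corrected(pred):
        """Map a new CTM prediction to the expected observed value for its bin."""
        k = np.clip(np.digitize(pred, bins), 1, len(bins) - 1)
        return expected_obs[k - 1]

    print("raw prediction 70 ppb -> corrected", round(float(corrected(70.0)), 1), "ppb")
    print("per-bin observation spread:", np.round(obs_spread, 1))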
NASA Astrophysics Data System (ADS)
Huang, H. C.; Pan, L.; McQueen, J.; Lee, P.; ONeill, S. M.; Ruminski, M.; Shafran, P.; DiMego, G.; Huang, J.; Stajner, I.; Upadhayay, S.; Larkin, N. K.
2016-12-01
Wildfires contribute to air quality problems not only through primary emissions of particulate matter (PM) but also through emitted ozone precursor gases that can lead to elevated ozone concentrations. Wildfires are unpredictable and can be ignited by natural causes such as lightning or accidentally by negligent human behavior such as a lit cigarette. Although wildfire impacts on air quality can be studied by collecting fire information after events, it is extremely difficult to predict the future occurrence and behavior of wildfires for real-time air quality forecasts. Because of the time constraints of operational air quality forecasting, assumptions about the next day's fire behavior often have to be made based on observed fire information from the past. The United States (U.S.) NOAA/NWS built the National Air Quality Forecast Capability (NAQFC), based on the U.S. EPA CMAQ model, to provide air quality forecast guidance (prediction) publicly. State and local forecasters use the forecast guidance to issue air quality alerts in their areas. The NAQFC fine particulate (PM2.5) prediction includes emissions from anthropogenic and biogenic sources, as well as natural sources such as dust storms and fires. The fire emission input to the NAQFC is derived from the NOAA NESDIS HMS fire and smoke detection product and the emission module of the US Forest Service BlueSky Smoke Modeling Framework. This study focuses on the error estimation of NAQFC PM2.5 predictions resulting from fire emissions. The comparisons between the NAQFC modeled PM2.5 and the EPA AirNow surface observations show that the present operational NAQFC fire emissions assumption can lead to large errors in PM2.5 prediction, as fire emissions are sometimes placed at the wrong location and time. This PM2.5 prediction error can propagate from the fire source in the Northwest U.S. to downstream areas as far as the Southeast U.S. From this study, a new procedure has been identified to minimize the aforementioned error. An additional 24-hour reanalysis run of NAQFC using same-day observed fire emissions is being tested. Preliminary results have shown that this procedure greatly improves the PM2.5 predictions in both nearby and downstream areas from fire sources. The 24-hour reanalysis run is critical and necessary, especially during extreme fire events, to provide better PM2.5 predictions.
Evaluation of 3D-Jury on CASP7 models
Kaján, László; Rychlewski, Leszek
2007-01-01
Background 3D-Jury, the structure prediction consensus method publicly available in the Meta Server , was evaluated using models gathered in the 7th round of the Critical Assessment of Techniques for Protein Structure Prediction (CASP7). 3D-Jury is an automated expert process that generates protein structure meta-predictions from sets of models obtained from partner servers. Results The performance of 3D-Jury was analysed for three aspects. First, we examined the correlation between the 3D-Jury score and a model quality measure: the number of correctly predicted residues. The 3D-Jury score was shown to correlate significantly with the number of correctly predicted residues, the correlation is good enough to be used for prediction. 3D-Jury was also found to improve upon the competing servers' choice of the best structure model in most cases. The value of the 3D-Jury score as a generic reliability measure was also examined. We found that the 3D-Jury score separates bad models from good models better than the reliability score of the original server in 27 cases and falls short of it in only 5 cases out of a total of 38. We report the release of a new Meta Server feature: instant 3D-Jury scoring of uploaded user models. Conclusion The 3D-Jury score continues to be a good indicator of structural model quality. It also provides a generic reliability score, especially important for models that were not assigned such by the original server. Individual structure modellers can also benefit from the 3D-Jury scoring system by testing their models in the new instant scoring feature available in the Meta Server. PMID:17711571
Ebrahimi, Milad; Gerber, Erin L; Rockaway, Thomas D
2017-05-15
For most water treatment plants, a significant number of performance data variables are attained on a time series basis. Due to the interconnectedness of the variables, it is often difficult to assess over-arching trends and quantify operational performance. The objective of this study was to establish simple and reliable predictive models to correlate target variables with specific measured parameters. This study presents a multivariate analysis of the physicochemical parameters of municipal wastewater. Fifteen quality and quantity parameters were analyzed using data recorded from 2010 to 2016. To determine the overall quality condition of raw and treated wastewater, a Wastewater Quality Index (WWQI) was developed. The index summarizes a large amount of measured quality parameters into a single water quality term by considering pre-established quality limitation standards. To identify treatment process performance, the interdependencies between the variables were determined by using Principal Component Analysis (PCA). The five extracted components from the 15 variables accounted for 75.25% of total dataset information and adequately represented the organic, nutrient, oxygen demanding, and ion activity loadings of influent and effluent streams. The study also utilized the model to predict quality parameters such as Biological Oxygen Demand (BOD), Total Phosphorus (TP), and WWQI. High accuracies ranging from 71% to 97% were achieved for fitting the models with the training dataset and relative prediction percentage errors less than 9% were achieved for the testing dataset. The presented techniques and procedures in this paper provide an assessment framework for the wastewater treatment monitoring programs. Copyright © 2017 Elsevier Ltd. All rights reserved.
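A minimal sketch of the PCA step and a regression on the retained components, as described above; the parameter set, component count, split, and data are placeholders, not the study's actual variables or WWQI definition.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(9)
    n_days, n_params = 500, 15                       # daily records, 15 quality/quantity variables
    X = rng.normal(0, 1, (n_days, n_params))
    bod = 2.0 * X[:, 0] - 1.5 * X[:, 3] + rng.normal(0, 0.5, n_days)   # fake BOD target

    # Standardise, reduce to a handful of principal components, then regress.
    model = make_pipeline(StandardScaler(), PCA(n_components=5), LinearRegression())
    model.fit(X[:350], bod[:350])                    # training portion
    print("explained variance of 5 PCs:",
          round(model.named_steps["pca"].explained_variance_ratio_.sum(), 2))
    print("test R2:", round(model.score(X[350:], bod[350:]), 2))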
An Artificial Intelligence System to Predict Quality of Service in Banking Organizations.
Castelli, Mauro; Manzoni, Luca; Popovič, Aleš
2016-01-01
Quality of service, that is, the waiting time that customers must endure in order to receive a service, is a critical performance aspect in private and public service organizations. Providing good service quality is particularly important in highly competitive sectors where similar services exist. In this paper, focusing on banking sector, we propose an artificial intelligence system for building a model for the prediction of service quality. While the traditional approach used for building analytical models relies on theories and assumptions about the problem at hand, we propose a novel approach for learning models from actual data. Thus, the proposed approach is not biased by the knowledge that experts may have about the problem, but it is completely based on the available data. The system is based on a recently defined variant of genetic programming that allows practitioners to include the concept of semantics in the search process. This will have beneficial effects on the search process and will produce analytical models that are based only on the data and not on domain-dependent knowledge.
Pestana, Maribela; Beja, Pedro; Correia, Pedro José; de Varennes, Amarilis; Faria, Eugénio Araújo
2005-06-01
To determine if flower nutrient composition can be used to predict fruit quality, a field experiment was conducted over three seasons (1996-1999) in a commercial orange orchard (Citrus sinensis (L.) Osbeck cv. 'Valencia Late', budded on Troyer citrange rootstock) established on a calcareous soil in southern Portugal. Flowers were collected from 20 trees during full bloom in April and their nutrient composition determined, and fruits were harvested the following March and their quality evaluated. Patterns of covariation in flower nutrient concentrations and in fruit quality variables were evaluated by principal component analysis. Regression models relating fruit quality variables to flower nutrient composition were developed by stepwise selection procedures. The predictive power of the regression models was evaluated with an independent data set. Nutrient composition of flowers at full bloom could be used to predict the fruit quality variables fresh fruit mass and maturation index in the following year. Magnesium, Ca and Zn concentrations measured in flowers were related to fruit fresh mass estimations and N, P, Mg and Fe concentrations were related to fruit maturation index. We also established reference values for the nutrient composition of flowers based on measurements made in trees that produced large (> 76 mm in diameter) fruit.
EFFECTS OF USING THE CB05 VERSUS THE CB4 CHEMICAL MECHANISMS ON MODEL PREDICTIONS
The Carbon Bond 4 (CB4) chemical mechanism has been widely used for many years in box and air quality models to predict the effect of atmospheric chemistry on pollutant concentrations. Because of the importance of this mechanism and the length of time since its original developm...
As part of our efforts to develop a public platform to provide access to predictive models we have attempted to disentangle the influence of the quality versus quantity of data available to develop and validate QSAR models. Using a thorough manual review of the data underlying t...
Heddam, Salim; Kisi, Ozgur
2017-07-01
In this paper, several extreme learning machine (ELM) models, including standard extreme learning machine with sigmoid activation function (S-ELM), extreme learning machine with radial basis activation function (R-ELM), online sequential extreme learning machine (OS-ELM), and optimally pruned extreme learning machine (OP-ELM), are newly applied for predicting dissolved oxygen concentration with and without water quality variables as predictors. Firstly, using data from eight United States Geological Survey (USGS) stations located in different river basins, USA, the S-ELM, R-ELM, OS-ELM, and OP-ELM were compared against the measured dissolved oxygen (DO) using four water quality variables, water temperature, specific conductance, turbidity, and pH, as predictors. For each station, we used data measured at an hourly time step for a period of 4 years. The dataset was divided into a training set (70%) and a validation set (30%). We selected several combinations of the water quality variables as inputs for each ELM model and six different scenarios were compared. Secondly, an attempt was made to predict DO concentration without water quality variables. To achieve this goal, we used the year numbers, 2008, 2009, etc., month numbers from (1) to (12), day numbers from (1) to (31), and hour numbers from (00:00) to (24:00) as predictors. Thirdly, the best ELM models were trained using the validation dataset and tested with the training dataset. The performances of the four ELM models were evaluated using four statistical indices: the coefficient of correlation (R), the Nash-Sutcliffe efficiency (NSE), the root mean squared error (RMSE), and the mean absolute error (MAE). Results obtained from the eight stations indicated that: (i) the best results were obtained by the S-ELM, R-ELM, OS-ELM, and OP-ELM models having four water quality variables as predictors; (ii) out of eight stations, the OP-ELM performed better than the other three ELM models at seven stations while the R-ELM performed the best at one station. The OS-ELM models performed the worst and provided the lowest accuracy; (iii) for predicting DO without water quality variables, the R-ELM performed the best at seven stations followed by the S-ELM in the second place, and the OP-ELM performed the worst with low accuracy; (iv) for the final application, where the ELM models were trained with the validation dataset and tested with the training dataset, the OP-ELM provided the best accuracy using water quality variables and the R-ELM performed the best at all eight stations without water quality variables. Fourthly, and finally, we compared the results obtained from different ELM models with those obtained using multiple linear regression (MLR) and multilayer perceptron neural network (MLPNN). Results obtained using MLPNN and MLR models reveal that: (i) using water quality variables as predictors, the MLR performed the worst and provided the lowest accuracy in all stations; (ii) MLPNN was ranked in the second place at two stations, in the third place at four stations, and finally, in the fourth place at two stations; (iii) for predicting DO without water quality variables, MLPNN is ranked in the second place at five stations, and ranked in the third, fourth, and fifth places in the remaining three stations, while MLR was ranked in the last place with very low accuracy at all stations. Overall, the results suggest that the ELM is more effective than the MLPNN and MLR for modelling DO concentration in river ecosystems.
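A minimal sketch of the core extreme learning machine idea referenced above: random, fixed hidden-layer weights followed by a least-squares fit of the output weights. The predictors and data below are placeholders standing in for the water quality variables, and this is a generic ELM, not the specific S-ELM/OP-ELM implementations used in the study.

    import numpy as np

    class SimpleELM:
        """Single-hidden-layer ELM: random fixed hidden weights, least-squares output."""
        def __init__(self, n_hidden=50, seed=0):
            self.n_hidden = n_hidden
            self.rng = np.random.default_rng(seed)

        def fit(self, X, y):
            n_features = X.shape[1]
            self.W = self.rng.normal(0, 1, (n_features, self.n_hidden))
            self.b = self.rng.normal(0, 1, self.n_hidden)
            H = 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))        # sigmoid hidden layer
            self.beta = np.linalg.pinv(H) @ y                       # output weights
            return self

        def predict(self, X):
            H = 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))
            return H @ self.beta

    # Placeholder predictors: water temperature, specific conductance, turbidity, pH.
    rng = np.random.default_rng(10)
    n = 1000
    X = np.column_stack([rng.uniform(5, 30, n), rng.uniform(100, 800, n),
                         rng.uniform(0, 50, n), rng.uniform(6.5, 8.5, n)])
    do = 14.6 - 0.35 * X[:, 0] + rng.normal(0, 0.3, n)              # fake DO (mg/L)

    Xs = (X - X.mean(axis=0)) / X.std(axis=0)                        # scale inputs
    elm = SimpleELM(n_hidden=50).fit(Xs[:700], do[:700])
    pred = elm.predict(Xs[700:])
    print("validation RMSE:", round(float(np.sqrt(np.mean((do[700:] - pred) ** 2))), 3))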
Predicting Trihalomethanes (THMs) in the New York City Water Supply
NASA Astrophysics Data System (ADS)
Mukundan, R.; Van Dreason, R.
2013-12-01
Chlorine, a commonly used disinfectant in most water supply systems, can combine with organic carbon to form disinfectant byproducts including carcinogenic trihalomethanes (THMs). We used water quality data from 24 monitoring sites within the New York City (NYC) water supply distribution system, measured between January 2009 and April 2012, to develop site-specific empirical models for predicting total trihalomethane (TTHM) levels. Terms in the model included various combinations of the following water quality parameters: total organic carbon, pH, specific conductivity, and water temperature. Reasonable estimates of TTHM levels were achieved with overall R2 of about 0.87 and predicted values within 5 μg/L of measured values. The relative importance of factors affecting TTHM formation was estimated by ranking the model regression coefficients. Site-specific models showed improved model performance statistics compared to a single model for the entire system most likely because the single model did not consider locational differences in the water treatment process. Although never out of compliance in 2011, the TTHM levels in the water supply increased following tropical storms Irene and Lee with 45% of the samples exceeding the 80 μg/L Maximum Contaminant Level (MCL) in October and November. This increase was explained by changes in water quality parameters, particularly by the increase in total organic carbon concentration and pH during this period.
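Because each site-specific model is essentially a multiple linear regression of TTHM on a handful of routinely measured parameters, the approach can be sketched in a few lines. The snippet below fits such a regression on synthetic data standing in for one monitoring site; the coefficients and parameter ranges are illustrative assumptions, not values from the NYC system.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins for one monitoring site: TOC (mg/L), pH,
# specific conductivity (uS/cm), water temperature (degC).
n = 200
toc = rng.uniform(1.0, 4.0, n)
ph = rng.uniform(6.8, 8.2, n)
cond = rng.uniform(50, 300, n)
temp = rng.uniform(2, 25, n)
tthm = 5 + 12 * toc + 4 * (ph - 7) + 0.5 * temp + rng.normal(0, 3, n)  # ug/L

# Ordinary least squares with an intercept column
X = np.column_stack([np.ones(n), toc, ph, cond, temp])
coef, *_ = np.linalg.lstsq(X, tthm, rcond=None)

fitted = X @ coef
ss_res = np.sum((tthm - fitted) ** 2)
ss_tot = np.sum((tthm - tthm.mean()) ** 2)
print("coefficients:", np.round(coef, 3))
print("R^2 =", round(1 - ss_res / ss_tot, 3))
```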
Downscaler Model for predicting daily air pollution
This model combines daily ozone and particulate matter monitoring and modeling data from across the U.S. to provide improved fine-scale estimates of air quality in communities and other specific locales.
Congestion Prediction Modeling for Quality of Service Improvement in Wireless Sensor Networks
Lee, Ga-Won; Lee, Sung-Young; Huh, Eui-Nam
2014-01-01
Information technology (IT) is pushing ahead with drastic reforms of modern life to improve human welfare. Objects form “Information Networks” through smart, self-regulated information gathering that also recognizes and controls current information states in Wireless Sensor Networks (WSNs). Information observed from sensor networks in real time is used to increase quality of life (QoL) in various industries and in daily life. One of the key challenges in WSNs is how to achieve lossless data transmission. Although sensor nodes now have enhanced capabilities, it is hard to assure lossless and reliable end-to-end data transmission in WSNs, because unstable wireless links and limited hardware resources make it difficult to satisfy high quality of service (QoS) requirements. We propose a node and path traffic prediction model to predict and minimize congestion. This solution includes prediction of packet generation arising from both periodic and event-driven data generation under network congestion. Simulations using NS-2 and Matlab demonstrate the effectiveness of the proposed solution. PMID:24784035
An Overview of Atmospheric Chemistry and Air Quality Modeling
NASA Technical Reports Server (NTRS)
Johnson, Matthew S.
2017-01-01
This presentation will present my personal research experience and give the participants of the NASA Student Airborne Research Program (SARP 2017) an overview of atmospheric chemistry and air quality modeling. The presentation will also provide examples of ways to apply airborne observations to chemical transport model (CTM) and air quality (AQ) model evaluation. CTM and AQ models are important tools for understanding tropospheric-stratospheric composition, atmospheric chemistry processes, meteorology, and air quality. This presentation will focus on how NASA scientists currently apply CTM and AQ models to better understand these topics. Finally, the importance of airborne observations in evaluating these topics, and how in situ and remote sensing observations can be used to evaluate and improve CTM and AQ model predictions, will be highlighted.
Key Issues for Seamless Integrated Chemistry–Meteorology Modeling
Online coupled meteorology–atmospheric chemistry models have greatly evolved in recent years. Although mainly developed by the air quality modeling community, these integrated models are also of interest for numerical weather prediction and climate modeling, as they can con...
Frameworks for Assessing the Quality of Modeling and Simulation Capabilities
NASA Astrophysics Data System (ADS)
Rider, W. J.
2012-12-01
The importance of assuring quality in modeling and simulation has spawned several frameworks for structuring the examination of quality. The format and content of these frameworks provide an emphasis, completeness, and flow to assessment activities. I will examine four frameworks that have been developed and describe how they can be improved and applied to a broader set of high-consequence applications. Perhaps the first of these frameworks was known as CSAU [Boyack] (code scaling, applicability and uncertainty), used for nuclear reactor safety and endorsed by the United States Nuclear Regulatory Commission (USNRC). This framework was shaped by nuclear safety practice and the practical structure needed after the Three Mile Island accident. It incorporated the dominant experimental program, the dominant analysis approach, and concerns about the quality of modeling. The USNRC gave it the force of law, which made the nuclear industry take it seriously. After the cessation of nuclear weapons testing, the United States began a program of examining the reliability of these weapons without testing. This program utilizes science including theory, modeling, simulation, and experimentation to replace underground testing. The emphasis on modeling and simulation necessitated attention to the quality of these simulations. Sandia developed the PCMM (predictive capability maturity model) to structure this attention [Oberkampf]. PCMM divides simulation into six core activities to be examined and graded relative to the needs of the modeling activity. NASA [NASA] has built yet another framework in response to the tragedy of the space shuttle accidents. Finally, Ben-Haim and Hemez focus upon modeling robustness and predictive fidelity in another approach. These frameworks are similar and are applied in a similar fashion. The adoption of these frameworks at Sandia and NASA has been slow and arduous because the force of law has not assisted acceptance. All existing frameworks are incomplete and need to be extended by incorporating elements from the others, as well as new elements related to how models are solved and how the model will be applied. I will describe this merger of approaches and how it should be applied. The problems in adoption are related to basic human nature, in that no one likes to be graded or told they are not sufficiently quality oriented. Rather than engage in an adversarial role, I suggest that the frameworks be viewed as collaborative tools, used to structure collaborations that help modeling and simulation efforts achieve high quality. The framework provides a comprehensive setting of modeling and simulation themes that should be explored in providing high quality. W. Oberkampf, M. Pilch, and T. Trucano, Predictive Capability Maturity Model for Computational Modeling and Simulation, SAND2007-5948, 2007. B. Boyack, Quantifying Reactor Safety Margins Part 1: An Overview of the Code Scaling, Applicability, and Uncertainty Evaluation Methodology, Nuc. Eng. Design, 119, pp. 1-15, 1990. National Aeronautics and Space Administration, Standard for Models and Simulations, NASA-STD-7009, 2008. Y. Ben-Haim and F. Hemez, Robustness, fidelity and prediction-looseness of models, Proc. R. Soc. A (2012) 468, 227-244.
Anazawa, Takayuki; Paruch, Jennifer L; Miyata, Hiroaki; Gotoh, Mitsukazu; Ko, Clifford Y; Cohen, Mark E; Hirahara, Norimichi; Zhou, Lynn; Konno, Hiroyuki; Wakabayashi, Go; Sugihara, Kenichi; Mori, Masaki
2015-12-01
International collaboration is important in healthcare quality evaluation; however, few international comparisons of general surgery outcomes have been accomplished. Furthermore, predictive model application for risk stratification has not been internationally evaluated. The National Clinical Database (NCD) in Japan was developed in collaboration with the American College of Surgeons National Surgical Quality Improvement Program (ACS-NSQIP), with a goal of creating a standardized surgery database for quality improvement. The study aimed to compare the consistency and impact of risk factors of 3 major gastroenterological surgical procedures in Japan and the United States (US) using web-based prospective data entry systems: right hemicolectomy (RH), low anterior resection (LAR), and pancreaticoduodenectomy (PD). Data from NCD and ACS-NSQIP, collected over 2 years, were examined. Logistic regression models were used for predicting 30-day mortality for both countries. Models were exchanged and evaluated to determine whether the models built for one population were accurate for the other population. We obtained data for 113,980 patients; 50,501 (Japan: 34,638; US: 15,863), 42,770 (Japan: 35,445; US: 7325), and 20,709 (Japan: 15,527; US: 5182) underwent RH, LAR, and PD, respectively. Thirty-day mortality rates for RH were 0.76% (Japan) and 1.88% (US); rates for LAR were 0.43% versus 1.08%; and rates for PD were 1.35% versus 2.57%. Patient background, comorbidities, and practice style differed between Japan and the US. In the models, the odds ratio for each variable was similar between NCD and ACS-NSQIP. Local risk models could predict mortality using local data, but could not accurately predict mortality using data from other countries. We demonstrated the feasibility and efficacy of international collaborative research between Japan and the US, but found that local risk models remain essential for quality improvement.
NASA Astrophysics Data System (ADS)
Gobrecht, Alexia; Bendoula, Ryad; Roger, Jean-Michel; Bellon-Maurel, Véronique
2014-05-01
Visible-near-infrared spectroscopy (Vis-NIRS) is now commonly used to measure different physical and chemical parameters of soils, including carbon content. However, prediction model accuracy is insufficient for Vis-NIRS to replace routine laboratory analysis. One of the biggest issues this technique faces is light scattering by soil particles. Scattering causes departures from the linear relationship between the absorbance spectrum and the concentration of the chemicals of interest assumed by the Beer-Lambert law, which underpins the calibration models. It therefore becomes essential to improve the metrological quality of the measured signal in order to optimize calibration, since light/matter interactions are the basis of the resulting linear modeling. Optics can help to mitigate scattering effects on the signal. We put forward a new optical setup coupling linearly polarized light with a Vis-NIR spectrometer to free the measured spectra from multi-scattering effects. The corrected measured spectrum was then used to compute an absorbance spectrum of the sample, using Dahm's equation in the frame of the Representative Layer Theory. This method had previously been tested and validated on liquid (milk + dye) and powdered (sand + dye) samples with scattering (and absorbing) properties. The obtained absorbance was a very good approximation of the Beer-Lambert absorbance. Here, we tested the method on a set of 54 soil samples to predict soil organic carbon content. In order to assess the signal quality improvement achieved by this method, we built and compared calibration models using the partial least squares (PLS) algorithm. The prediction model built from the new absorbance spectrum outperformed the model built with the classical absorbance traditionally obtained from Vis-NIR diffuse reflectance. This study is a good illustration of the strong influence of signal quality on prediction model performance.
NASA Astrophysics Data System (ADS)
Lobuglio, Joseph N.; Characklis, Gregory W.; Serre, Marc L.
2007-03-01
Sparse monitoring data and error inherent in water quality models make the identification of waters not meeting regulatory standards uncertain. Additional monitoring can be implemented to reduce this uncertainty, but it is often expensive. These costs are currently a major concern, since developing total maximum daily loads, as mandated by the Clean Water Act, will require assessing tens of thousands of water bodies across the United States. This work uses the Bayesian maximum entropy (BME) method of modern geostatistics to integrate water quality monitoring data together with model predictions to provide improved estimates of water quality in a cost-effective manner. This information includes estimates of uncertainty and can be used to aid probabilistic-based decisions concerning the status of a water (i.e., impaired or not impaired) and the level of monitoring needed to characterize the water for regulatory purposes. This approach is applied to the Catawba River reservoir system in western North Carolina as a means of estimating seasonal chlorophyll a concentration. Mean concentration and confidence intervals for chlorophyll a are estimated for 66 reservoir segments over an 11-year period (726 values) based on 219 measured seasonal averages and 54 model predictions. Although the model predictions had a high degree of uncertainty, integration of modeling results via BME methods reduced the uncertainty associated with chlorophyll estimates compared with estimates made solely with information from monitoring efforts. Probabilistic predictions of future chlorophyll levels on one reservoir are used to illustrate the cost savings that can be achieved by less extensive and rigorous monitoring methods within the BME framework. While BME methods have been applied in several environmental contexts, employing these methods as a means of integrating monitoring and modeling results, as well as application of this approach to the assessment of surface water monitoring networks, represent unexplored areas of research.
Stelzer, Erin A.; Duris, Joseph W.; Brady, Amie M. G.; Harrison, John H.; Johnson, Heather E.; Ware, Michael W.
2013-01-01
Predictive models, based on environmental and water quality variables, have been used to improve the timeliness and accuracy of recreational water quality assessments, but their effectiveness has not been studied in inland waters. Sampling at eight inland recreational lakes in Ohio was done in order to investigate using predictive models for Escherichia coli and to understand the links between E. coli concentrations, predictive variables, and pathogens. Based upon results from 21 beach sites, models were developed for 13 sites, and the most predictive variables were rainfall, wind direction and speed, turbidity, and water temperature. Models were not developed at sites where the E. coli standard was seldom exceeded. Models were validated at nine sites during an independent year. At three sites, the model resulted in increased correct responses, sensitivities, and specificities compared to use of the previous day's E. coli concentration (the current method). Drought conditions during the validation year precluded being able to adequately assess model performance at most of the other sites. Cryptosporidium, adenovirus, eaeA (E. coli), ipaH (Shigella), and spvC (Salmonella) were found in at least 20% of samples collected for pathogens at five sites. The presence or absence of the three bacterial genes was related to some of the model variables but was not consistently related to E. coli concentrations. Predictive models were not effective at all inland lake sites; however, their use at two lakes with high swimmer densities will provide better estimates of public health risk than current methods and will be a valuable resource for beach managers and the public. PMID:23291550
So, Rita; Teakles, Andrew; Baik, Jonathan; Vingarzan, Roxanne; Jones, Keith
2018-05-01
Visibility degradation, one of the most noticeable indicators of poor air quality, can occur despite relatively low levels of particulate matter when the risk to human health is low. The availability of timely and reliable visibility forecasts can provide a more comprehensive understanding of the anticipated air quality conditions to better inform local jurisdictions and the public. This paper describes the development of a visibility forecasting modeling framework, which leverages the existing air quality and meteorological forecasts from Canada's operational Regional Air Quality Deterministic Prediction System (RAQDPS) for the Lower Fraser Valley of British Columbia. A baseline model (GM-IMPROVE) was constructed using the revised IMPROVE algorithm based on unprocessed forecasts from the RAQDPS. Three additional prototypes (UMOS-HYB, GM-MLR, GM-RF) were also developed and assessed for forecast performance of up to 48 hr lead time during various air quality and meteorological conditions. Forecast performance was assessed by examining their ability to provide both numerical and categorical forecasts in the form of 1-hr total extinction and Visual Air Quality Ratings (VAQR), respectively. While GM-IMPROVE generally overestimated extinction more than twofold, it had skill in forecasting the relative species contribution to visibility impairment, including ammonium sulfate and ammonium nitrate. Both statistical prototypes, GM-MLR and GM-RF, performed well in forecasting 1-hr extinction during daylight hours, with correlation coefficients (R) ranging from 0.59 to 0.77. UMOS-HYB, a prototype based on postprocessed air quality forecasts without additional statistical modeling, provided reasonable forecasts during most daylight hours. In terms of categorical forecasts, the best prototype was approximately 75 to 87% correct, when forecasting for a condensed three-category VAQR. A case study, focusing on a poor visual air quality yet low Air Quality Health Index episode, illustrated that the statistical prototypes were able to provide timely and skillful visibility forecasts with lead time up to 48 hr. This study describes the development of a visibility forecasting modeling framework, which leverages the existing air quality and meteorological forecasts from Canada's operational Regional Air Quality Deterministic Prediction System. The main applications include tourism and recreation planning, input into air quality management programs, and educational outreach. Visibility forecasts, when supplemented with the existing air quality and health based forecasts, can assist jurisdictions to anticipate the visual air quality impacts as perceived by the public, which can potentially assist in formulating the appropriate air quality bulletins and recommendations.
Shelf-life prediction models for ready-to-eat fresh cut salads: Testing in real cold chain.
Tsironi, Theofania; Dermesonlouoglou, Efimia; Giannoglou, Marianna; Gogou, Eleni; Katsaros, George; Taoukis, Petros
2017-01-02
The aim of the study was to develop and test the applicability of predictive models for shelf-life estimation of ready-to-eat (RTE) fresh cut salads in realistic distribution temperature conditions in the food supply chain. A systematic kinetic study of quality loss of RTE mixed salad (lollo rosso lettuce-40%, lollo verde lettuce-45%, rocket-15%) packed under modified atmospheres (3% O2, 10% CO2, 87% N2) was conducted. Microbial population (total viable count, Pseudomonas spp., lactic acid bacteria), vitamin C, colour and texture were the measured quality parameters. Kinetic models for these indices were developed to determine the quality loss and calculate product remaining shelf-life (SLR). Storage experiments were conducted at isothermal (2.5-15°C) and non-isothermal temperature conditions (Teff = 7.8°C, defined as the constant temperature that results in the same quality value as the variable temperature distribution) for validation purposes. Pseudomonas dominated spoilage, followed by browning and chemical changes. The end of shelf-life correlated with a Pseudomonas spp. level of 8 log(cfu/g), and 20% loss of the initial vitamin C content. The effect of temperature on these quality parameters was expressed by the Arrhenius equation; the activation energy (Ea) value was 69.1 and 122.6 kJ/mol for Pseudomonas spp. growth and vitamin C loss rates, respectively. Shelf-life prediction models were also validated in real cold chain conditions (including the stages of transport to and storage at the retail distribution center, transport to and display at 7 retail stores, and transport to and storage in domestic refrigerators). The quality level and SLR estimated after 2-3 days of domestic storage (time of consumption) ranged between 1 and 8 days at 4°C and was predicted within satisfactory statistical error by the kinetic models. Teff in the cold chain ranged between 3.7 and 8.3°C. Using the validated models, SLR of RTE fresh cut salad can be estimated at any point of the cold chain if the temperature history is known. Shelf-life models of validated applicability can serve as an effective tool for shelf-life assessment and the development of new products in the fresh produce food sector. Copyright © 2016. Published by Elsevier B.V.
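To make the temperature dependence concrete, the sketch below applies the Arrhenius expression with the activation energy reported for Pseudomonas spp. growth and converts the resulting rate into a remaining-shelf-life estimate. The reference rate constant and initial load are hypothetical placeholders; only the activation energy and the 8 log(cfu/g) spoilage limit come from the abstract.

```python
import numpy as np

R = 8.314  # J/(mol K)

def arrhenius_rate(k_ref, Ea, T, T_ref):
    """Scale a rate constant from T_ref to T using the Arrhenius equation (temperatures in degC)."""
    return k_ref * np.exp(-Ea / R * (1.0 / (T + 273.15) - 1.0 / (T_ref + 273.15)))

# Ea for Pseudomonas spp. growth from the study; the reference rate k_ref
# (log(cfu/g) per day at 2.5 degC) is a hypothetical placeholder.
Ea_pseudomonas = 69.1e3
k_ref, T_ref = 0.45, 2.5

N0, N_limit = 4.0, 8.0           # assumed initial load and spoilage limit, log(cfu/g)
for T in (2.5, 4.0, 7.8, 15.0):  # storage temperatures, incl. the reported Teff
    k = arrhenius_rate(k_ref, Ea_pseudomonas, T, T_ref)
    shelf_life = (N_limit - N0) / k  # days until Pseudomonas reaches 8 log(cfu/g)
    print(f"T = {T:4.1f} degC  ->  shelf life ~ {shelf_life:4.1f} days")
```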
Eskelson, Bianca N.I.; Hagar, Joan; Temesgen, Hailemariam
2012-01-01
Snags (standing dead trees) are an essential structural component of forests. Because wildlife use of snags depends on size and decay stage, snag density estimation without any information about snag quality attributes is of little value for wildlife management decision makers. Little work has been done to develop models that allow multivariate estimation of snag density by snag quality class. Using climate, topography, Landsat TM data, stand age and forest type collected for 2356 forested Forest Inventory and Analysis plots in western Washington and western Oregon, we evaluated two multivariate techniques for their abilities to estimate density of snags by three decay classes. The density of live trees and snags in three decay classes (D1: recently dead, little decay; D2: decay, without top, some branches and bark missing; D3: extensive decay, missing bark and most branches) with diameter at breast height (DBH) ≥ 12.7 cm was estimated using a nonparametric random forest nearest neighbor imputation technique (RF) and a parametric two-stage model (QPORD), for which the number of trees per hectare was estimated with a Quasipoisson model in the first stage and the probability of belonging to a tree status class (live, D1, D2, D3) was estimated with an ordinal regression model in the second stage. The presence of large snags with DBH ≥ 50 cm was predicted using a logistic regression and RF imputation. Because of the more homogenous conditions on private forest lands, snag density by decay class was predicted with higher accuracies on private forest lands than on public lands, while presence of large snags was more accurately predicted on public lands, owing to the higher prevalence of large snags on public lands. RF outperformed the QPORD model in terms of percent accurate predictions, while QPORD provided smaller root mean square errors in predicting snag density by decay class. The logistic regression model achieved more accurate presence/absence classification of large snags than the RF imputation approach. Adjusting the decision threshold to account for unequal size for presence and absence classes is more straightforward for the logistic regression than for the RF imputation approach. Overall, model accuracies were poor in this study, which can be attributed to the poor predictive quality of the explanatory variables and the large range of forest types and geographic conditions observed in the data.
Nevers, Meredith B.; Whitman, Richard L.
2011-01-01
Efforts to improve public health protection in recreational swimming waters have focused on obtaining real-time estimates of water quality. Current monitoring techniques rely on the time-intensive culturing of fecal indicator bacteria (FIB) from water samples, but rapidly changing FIB concentrations result in management errors that lead to the public being exposed to high FIB concentrations (type II error) or beaches being closed despite acceptable water quality (type I error). Empirical predictive models may provide a rapid solution, but their effectiveness at improving health protection has not been adequately assessed. We sought to determine if emerging monitoring approaches could effectively reduce risk of illness exposure by minimizing management errors. We examined four monitoring approaches (inactive, current protocol, a single predictive model for all beaches, and individual models for each beach) with increasing refinement at 14 Chicago beaches using historical monitoring and hydrometeorological data and compared management outcomes using different standards for decision-making. Predictability (R2) of FIB concentration improved with model refinement at all beaches but one. Predictive models did not always reduce the number of management errors and therefore the overall illness burden. Use of a Chicago-specific single-sample standard, rather than the widely used default of 235 E. coli CFU/100 ml, together with predictive modeling resulted in the greatest number of open beach days without any increase in public health risk. These results emphasize that emerging monitoring approaches such as empirical models are not equally applicable at all beaches, and combining monitoring approaches may expand beach access.
Kivlighan, Dennis M; Hill, Clara E; Gelso, Charles J; Baumann, Ellen
2016-03-01
We used the Actor Partner Interdependence Model (APIM; Kashy & Kenny, 2000) to examine the dyadic associations of 74 clients and 23 therapists in their evaluations of working alliance, real relationship, session quality, and client improvement over time in ongoing psychodynamic or interpersonal psychotherapy. There were significant actor effects for both therapists and clients, with the participant's own ratings of working alliance and real relationship independently predicting their own evaluations of session quality. There were significant client partner effects, with clients' working alliance and real relationship independently predicting their therapists' evaluations of session quality. The client partner real relationship effect was stronger in later sessions than in earlier sessions. Therapists' real relationship ratings (partner effect) were a stronger predictor of clients' session quality ratings in later sessions than in earlier sessions. Therapists' working alliance ratings (partner effect) were a stronger predictor of clients' session quality ratings when clients made greater improvement than when clients made lesser improvement. For clients' session outcome ratings, there were complex three-way interactions, such that both client real relationship and working alliance interacted with client improvement and time in treatment to predict clients' session quality. These findings strongly suggest both individual and partner effects when clients and therapists evaluate psychotherapy process and outcome. Implications for research and practice are discussed. (c) 2016 APA, all rights reserved.
Bedoya, David; Manolakos, Elias S; Novotny, Vladimir
2011-03-01
Indices of Biological integrity (IBI) are considered valid indicators of the overall health of a water body because the biological community is an endpoint within natural systems. However, prediction of biological integrity using information from multi-parameter environmental observations is a challenging problem due to the hierarchical organization of the natural environment, the existence of nonlinear inter-dependencies among variables as well as natural stochasticity and measurement noise. We present a method for predicting the Fish Index of Biological Integrity (IBI) using multiple environmental observations at the state-scale in Ohio. Instream (chemical and physical quality) and offstream parameters (regional and local upstream land uses, stream fragmentation, and point source density and intensity) are used for this purpose. The IBI predictions are obtained using the environmental site-similarity concept and following a simple to implement leave-one-out cross validation approach. An IBI prediction for a sampling site is calculated by averaging the observed IBI scores of observations clustered in the most similar branch of a dendrogram--a hierarchical clustering tree of environmental observations--built using the rest of the observations. The standardized Euclidean distance is used to assess dissimilarity between observations. The constructed predictive model was able to explain 61% of the IBI variability statewide. Stream fragmentation and regional land use explained 60% of the variability; the remaining 1% was explained by instream habitat quality. Metrics related to local land use, water quality, and point source density and intensity did not improve the predictive model at the state-scale. The impact of local environmental conditions was evaluated by comparing local characteristics between well- and mispredicted sites. Significant differences in local land use patterns and upstream fragmentation density explained some of the model's over-predictions. Local land use conditions explained some of the model's IBI under-predictions at the state-scale since none of the variables within this group were included in the best final predictive model. Under-predicted sites also had higher levels of downstream fragmentation. The proposed variables ranking and predictive modeling methodology is very well suited for the analysis of hierarchical environments, such as natural fresh water systems, with many cross-correlated environmental variables. It is computationally efficient, can be fully automated, does not make any pre-conceived assumptions on the variables interdependency structure (such as linearity), and it is able to rank variables in a database and generate IBI predictions using only non-parametric easy to implement hierarchical clustering. Copyright © 2011 Elsevier Ltd. All rights reserved.
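The core of the prediction scheme (environmental site similarity plus leave-one-out averaging of observed IBI scores) can be prototyped compactly. The sketch below uses a k-nearest-sites average under a standardized Euclidean distance as a simplified stand-in for averaging over the most similar dendrogram branch; the predictor set, site count, and value of k are illustrative assumptions, not the Ohio data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-ins: environmental predictors (land use, fragmentation,
# habitat quality, ...) and observed IBI scores for 150 sites.
n_sites, n_vars = 150, 6
env = rng.normal(size=(n_sites, n_vars))
ibi = 40 + 10 * env[:, 0] - 6 * env[:, 1] + rng.normal(0, 5, n_sites)

# Standardize so plain Euclidean distance matches the paper's
# "standardized Euclidean" dissimilarity.
z = (env - env.mean(axis=0)) / env.std(axis=0)

def loo_predict(z, ibi, k=10):
    """Leave-one-out prediction: average IBI of the k most environmentally similar sites.
    (A simplified stand-in for averaging over the most similar dendrogram branch.)"""
    preds = np.empty(len(ibi))
    for i in range(len(ibi)):
        d = np.sqrt(((z - z[i]) ** 2).sum(axis=1))
        d[i] = np.inf                      # exclude the held-out site
        nearest = np.argsort(d)[:k]
        preds[i] = ibi[nearest].mean()
    return preds

pred = loo_predict(z, ibi)
r2 = 1 - np.sum((ibi - pred) ** 2) / np.sum((ibi - ibi.mean()) ** 2)
print("leave-one-out R^2 =", round(r2, 2))
```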
Susan J. Prichard; Eva C. Karau; Roger D. Ottmar; Maureen C. Kennedy; James B. Cronan; Clinton S. Wright; Robert E. Keane
2014-01-01
Reliable predictions of fuel consumption are critical in the eastern United States (US), where prescribed burning is frequently applied to forests and air quality is of increasing concern. CONSUME and the First Order Fire Effects Model (FOFEM), predictive models developed to estimate fuel consumption and emissions from wildland fires, have not been systematically...
The Simulations of Wildland Fire Smoke PM2.5 in the NWS Air Quality Forecasting Systems
NASA Astrophysics Data System (ADS)
Huang, H. C.; Pan, L.; McQueen, J.; Lee, P.; ONeill, S. M.; Ruminski, M.; Shafran, P.; Huang, J.; Stajner, I.; Upadhayay, S.; Larkin, N. K.
2017-12-01
The increase of wildland fire intensity and frequency in the United States (U.S.) has led to property loss, human fatality, and poor air quality due to elevated particulate matter and surface ozone concentrations. The NOAA/National Weather Service (NWS) built the National Air Quality Forecast Capability (NAQFC) based on the U.S. Environmental Protection Agency (EPA) Community Multi-scale Air Quality (CMAQ) Modeling System, driven by the NCEP North American Mesoscale Forecast System meteorology, to provide ozone and fine particulate matter (PM2.5) forecast guidance publicly. State and local forecasters use the NWS air quality forecast guidance to issue air quality alerts in their areas. The NAQFC PM2.5 predictions include emissions from anthropogenic and biogenic sources, as well as natural sources such as dust storms and wildland fires. The wildland fire emission inputs to the NAQFC are derived from the NOAA National Environmental Satellite, Data, and Information Service Hazard Mapping System fire and smoke detection product and the emission module of the U.S. Forest Service (USFS) BlueSky Smoke Modeling Framework. Wildland fires are unpredictable and can be ignited by natural causes such as lightning or be human-caused. It is extremely difficult to predict future occurrences and behavior of wildland fires for real-time air quality predictions, as is estimating the available bio-fuel to be burned. Assumptions about the next day's wildland fire behavior often have to be made from older observed wildland fire information. Comparisons between the NAQFC modeled PM2.5 and EPA AirNow surface observations show that large errors in PM2.5 prediction can occur when fire smoke emissions are placed at the wrong location and/or time. A configuration of the NAQFC CMAQ system that re-runs the previous 24 hours, during which wildland fires were observed from satellites, has recently been included. This study focuses on the effort to minimize errors in NAQFC PM2.5 predictions by incorporating fire smoke emissions into the NAQFC from a recently updated version of the USFS BlueSky system. This study will show how the new approaches have improved PM2.5 predictions both near fire sources and in downstream areas. Furthermore, Environment and Climate Change Canada (ECCC) fire emissions data are being tested.
Postprocessing for Air Quality Predictions
NASA Astrophysics Data System (ADS)
Delle Monache, L.
2017-12-01
In recent years, air quality (AQ) forecasting has made significant progress towards better predictions, with the goal of protecting the public from harmful pollutants. This progress is the result of improvements in weather and chemical transport models, their coupling, and more accurate emission inventories (e.g., the development of new algorithms to account for fires in near real time). Nevertheless, AQ predictions are still affected at times by significant biases, which stem from limitations in both weather and chemistry transport models. Those are the result of numerical approximations and the poor representation (and understanding) of important physical and chemical processes. Moreover, although the quality of emission inventories has been significantly improved, they are still one of the main sources of uncertainty in AQ predictions. For operational real-time AQ forecasting, a significant portion of these biases can be reduced with the implementation of postprocessing methods. We will review some of the techniques that have been proposed to reduce both systematic and random errors of AQ predictions and improve the correlation between predictions and observations of ground-level ozone and surface particulate matter less than 2.5 µm in diameter (PM2.5). These methods, which can be applied to both deterministic and probabilistic predictions, include simple bias-correction techniques, corrections inspired by the Kalman filter, regression methods, and the more recently developed analog-based algorithms. These approaches will be compared and contrasted, and the strengths and weaknesses of each will be discussed.
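As a concrete illustration of the simplest of these postprocessing techniques, the sketch below applies a running additive bias correction (a crude Kalman-filter-like update) to a synthetic PM2.5 forecast series; the series, bias magnitude, and smoothing factor are illustrative assumptions, not operational values.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic hourly PM2.5: observations and a forecast with a systematic
# positive bias plus random error (stand-ins for surface obs and raw model output).
n = 500
obs = 10 + 5 * np.sin(np.arange(n) * 2 * np.pi / 24) + rng.normal(0, 2, n)
fcst = obs + 4.0 + rng.normal(0, 3, n)

def bias_corrected(fcst, obs, alpha=0.1):
    """Running additive bias correction: the bias estimate is updated at each step
    from the latest forecast-observation pair (a simple Kalman-filter-like update)."""
    bias, out = 0.0, np.empty_like(fcst)
    for t in range(len(fcst)):
        out[t] = fcst[t] - bias                                 # correct with current bias estimate
        bias = (1 - alpha) * bias + alpha * (fcst[t] - obs[t])  # then update the estimate
    return out

corrected = bias_corrected(fcst, obs)
print("raw RMSE      :", round(np.sqrt(np.mean((fcst - obs) ** 2)), 2))
print("corrected RMSE:", round(np.sqrt(np.mean((corrected - obs) ** 2)), 2))
```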
Kaplan, Katherine A; Hirshman, Jason; Hernandez, Beatriz; Stefanick, Marcia L; Hoffman, Andrew R; Redline, Susan; Ancoli-Israel, Sonia; Stone, Katie; Friedman, Leah; Zeitzer, Jamie M
2017-02-01
Reports of subjective sleep quality are frequently collected in research and clinical practice. It is unclear, however, how well polysomnographic measures of sleep correlate with subjective reports of prior-night sleep quality in elderly men and women. Furthermore, the relative importance of various polysomnographic, demographic and clinical characteristics in predicting subjective sleep quality is not known. We sought to determine the correlates of subjective sleep quality in older adults using more recently developed machine learning algorithms that are suitable for selecting and ranking important variables. Community-dwelling older men (n=1024) and women (n=459), a subset of those participating in the Osteoporotic Fractures in Men study and the Study of Osteoporotic Fractures study, respectively, completed a single night of at-home polysomnographic recording of sleep followed by a set of morning questions concerning the prior night's sleep quality. Questionnaires concerning demographics and psychological characteristics were also collected prior to the overnight recording and entered into multivariable models. Two machine learning algorithms, lasso penalized regression and random forests, determined variable selection and the ordering of variable importance separately for men and women. Thirty-eight sleep, demographic and clinical correlates of sleep quality were considered. Together, these multivariable models explained only 11-17% of the variance in predicting subjective sleep quality. Objective sleep efficiency emerged as the strongest correlate of subjective sleep quality across all models, and across both sexes. Greater total sleep time and sleep stage transitions were also significant objective correlates of subjective sleep quality. The amount of slow wave sleep obtained was not determined to be important. Overall, the commonly obtained measures of polysomnographically-defined sleep contributed little to subjective ratings of prior-night sleep quality. Though they explained relatively little of the variance, sleep efficiency, total sleep time and sleep stage transitions were among the most important objective correlates. Published by Elsevier B.V.
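Both algorithms named above are available in standard libraries, so the selection-and-ranking step can be prototyped directly. The sketch below runs the lasso (with cross-validated penalty selection) and a random forest on synthetic data in which only a few of 38 candidate variables carry signal; the variable names, sample size, and effect sizes are illustrative assumptions, not the study's data.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(4)

# Synthetic stand-ins for polysomnographic/demographic correlates; only a few
# (e.g. sleep efficiency, total sleep time) truly drive the outcome.
n, p = 800, 38
X = rng.normal(size=(n, p))
names = [f"var_{i}" for i in range(p)]
names[0], names[1], names[2] = "sleep_efficiency", "total_sleep_time", "stage_transitions"
y = 0.6 * X[:, 0] + 0.3 * X[:, 1] + 0.2 * X[:, 2] + rng.normal(0, 1.5, n)  # mostly noise, as in the study

# Lasso: coefficients shrunk to zero act as variable selection.
lasso = LassoCV(cv=5).fit(X, y)
kept = [(names[i], round(c, 3)) for i, c in enumerate(lasso.coef_) if c != 0]

# Random forest: impurity-based importances give a ranking.
rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, y)
top = sorted(zip(names, rf.feature_importances_), key=lambda t: -t[1])[:5]

print("lasso-selected variables:", kept)
print("random-forest top 5     :", [(name, round(v, 3)) for name, v in top])
```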
ERIC Educational Resources Information Center
McDonough, Meghan H.; Crocker, Peter R. E.
2005-01-01
This study examined the factor structure of the Sport Friendship Quality Scale (SFQS; Weiss & Smith, 1999) and compared two models in which (a) self-worth mediated the relationship between physical self/friendship quality and sport commitment and (b) friendship quality and physical self-perceptions directly predicted self-worth and sport…
Srinivas, T R; Taber, D J; Su, Z; Zhang, J; Mour, G; Northrup, D; Tripathi, A; Marsden, J E; Moran, W P; Mauldin, P D
2017-03-01
We sought proof of concept of a Big Data solution incorporating longitudinal structured and unstructured patient-level data from electronic health records (EHR) to predict graft loss (GL) and mortality. For a quality improvement initiative, GL and mortality prediction models were constructed using baseline and follow-up data (0-90 days posttransplant, structured and unstructured, for 1-year models; data up to 1 year for 3-year models) on adult solitary kidney transplant recipients transplanted during 2007-2015, as follows: Model 1: United Network for Organ Sharing (UNOS) data; Model 2: UNOS & Transplant Database (Tx Database) data; Model 3: UNOS, Tx Database & EHR comorbidity data; and Model 4: UNOS, Tx Database, EHR data, posttransplant trajectory data, and unstructured data. A 10% 3-year GL rate was observed among 891 patients (2007-2015). Layering of data sources improved model performance: Model 1: area under the curve (AUC), 0.66 (95% confidence interval [CI]: 0.60-0.72); Model 2: AUC, 0.68 (95% CI: 0.61-0.74); Model 3: AUC, 0.72 (95% CI: 0.66-0.77); Model 4: AUC, 0.84 (95% CI: 0.79-0.89). One-year GL (AUC, 0.87; Model 4) and 3-year mortality (AUC, 0.84; Model 4) models performed similarly. A Big Data approach significantly adds efficacy to GL and mortality prediction models and is EHR-deployable to optimize outcomes. © 2016 The American Society of Transplantation and the American Society of Transplant Surgeons.
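The layering idea, in which each model adds a richer feature set and is scored by AUC, is straightforward to prototype. The sketch below fits logistic regressions on a registry-only and a registry-plus-EHR feature set built from synthetic data and compares held-out AUCs; the features, event rate, and effect sizes are illustrative assumptions, not values from the registry or EHR data described above.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)

# Synthetic recipients: registry-style baseline features plus EHR-derived
# follow-up features; graft loss depends on both, so layering should help.
n = 2000
registry = rng.normal(size=(n, 5))
ehr = rng.normal(size=(n, 5))
logit = -2.2 + 0.7 * registry[:, 0] + 0.9 * ehr[:, 0] + 0.6 * ehr[:, 1]
y = rng.uniform(size=n) < 1 / (1 + np.exp(-logit))

def auc_for(X, y):
    """Fit a logistic regression on a 70/30 split and return the held-out AUC."""
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    return roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])

print("registry features only :", round(auc_for(registry, y), 3))
print("registry + EHR features:", round(auc_for(np.hstack([registry, ehr]), y), 3))
```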
Maltesen, Morten Jonas; van de Weert, Marco; Grohganz, Holger
2012-09-01
Moisture content and aerodynamic particle size are critical quality attributes for spray-dried protein formulations. In this study, spray-dried insulin powders intended for pulmonary delivery were produced by applying design-of-experiments methodology. Near-infrared spectroscopy (NIR) in combination with preprocessing and multivariate analysis in the form of partial least squares projections to latent structures (PLS) was used to correlate the spectral data with moisture content and aerodynamic particle size measured by a time-of-flight principle. PLS models predicting the moisture content were based on the chemical information of the water molecules in the NIR spectrum. The models yielded prediction errors (RMSEP) between 0.39% and 0.48%, with thermal gravimetric analysis used as the reference method. The PLS models predicting the aerodynamic particle size were based on the baseline offset in the NIR spectra and yielded prediction errors between 0.27 and 0.48 μm. The morphology of the spray-dried particles had a significant impact on the predictive ability of the models. Good predictive models could be obtained for spherical particles, with a calibration error (RMSECV) of 0.22 μm, whereas wrinkled particles resulted in much less robust models with a Q2 of 0.69. Based on the results in this study, NIR is a suitable tool for analysis of the spray-drying process and for control of moisture content and particle size, in particular for smooth and spherical particles.
NASA Astrophysics Data System (ADS)
Weger, L.; Lupascu, A.; Cremonese, L.; Butler, T. M.
2017-12-01
Numerous countries in Europe that possess domestic shale gas reserves are considering exploiting this unconventional gas resource as part of their energy transition agenda. While natural gas generates lower CO2 emissions upon combustion than coal or oil, making it attractive as a bridge in the transition from fossil fuels to renewables, production of shale gas leads to emissions of CH4 and air pollutants such as NOx, VOCs and PM. These gases in turn influence the climate as well as air quality. In this study, we investigate the impact of a potential shale gas development in Germany and the United Kingdom on local and regional air quality. This work builds on our previous study in which we constructed emissions scenarios based on shale gas utilization in these countries. In order to explore the influence of shale gas production on air quality, we investigate emissions predicted from our shale gas scenarios with the Weather Research and Forecasting model with chemistry (WRF-Chem). In order to do this, we first design a model set-up over Europe and evaluate its performance for meteorological and chemical parameters. Subsequently, we add shale gas emission fluxes based on the scenarios over the area of the grid in which the shale gas activities are predicted to occur. Finally, we model these emissions and analyze the impact on air quality at both local and regional scales. The aims of this work are to predict the range of adverse effects on air quality, to highlight the importance of emission control strategies in reducing air pollution, to promote further discussion, and to provide policy makers with information for decision making on a potential shale gas development in the two study countries.
NASA Astrophysics Data System (ADS)
Fang, Kaizheng; Mu, Daobin; Chen, Shi; Wu, Borong; Wu, Feng
2012-06-01
In this study, a prediction model based on an artificial neural network is constructed for surface temperature simulation of a nickel-metal hydride battery. The model is developed from a back-propagation network trained by the Levenberg-Marquardt algorithm. At each ambient temperature of 10 °C, 20 °C, 30 °C and 40 °C, an 8 Ah cylindrical Ni-MH battery is charged at rates of 1 C, 3 C and 5 C to a state of charge (SOC) of 110% in order to provide data for model training. Linear regression analysis, together with the mean square error and absolute error, is adopted to check the quality of the model training. It is shown that the constructed model is trained well enough to guarantee prediction accuracy. The model is then used to predict the battery surface temperature during charging at ambient temperatures of 50 °C, 60 °C and 70 °C. The predictions are in good agreement with experimental data. The battery surface temperature is calculated to exceed 90 °C at an ambient temperature of 60 °C if the battery is overcharged at 5 C, which might cause battery safety issues.
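A minimal version of such a surface-temperature regressor can be built with a single-hidden-layer network. The sketch below uses scikit-learn's MLPRegressor with the lbfgs solver as a stand-in for the Levenberg-Marquardt training used in the study; the input ranges, the synthetic temperature relation, and the network size are illustrative assumptions, not the paper's data or architecture.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(6)

# Synthetic stand-ins: ambient temperature (degC), charge rate (C) and state of
# charge (%) as inputs, battery surface temperature as the target.
n = 1500
ambient = rng.uniform(10, 40, n)
c_rate = rng.choice([1.0, 3.0, 5.0], n)
soc = rng.uniform(0, 110, n)
surface_t = ambient + 0.04 * c_rate * soc + 2.0 * c_rate + rng.normal(0, 1.0, n)

X = np.column_stack([ambient, c_rate, soc])
X_tr, X_te, y_tr, y_te = train_test_split(X, surface_t, test_size=0.25, random_state=0)

# Single-hidden-layer back-propagation network; lbfgs stands in for the
# Levenberg-Marquardt training used in the study.
net = MLPRegressor(hidden_layer_sizes=(12,), solver="lbfgs", max_iter=5000,
                   random_state=0).fit(X_tr, y_tr)

pred = net.predict(X_te)
mse = np.mean((pred - y_te) ** 2)
mae = np.mean(np.abs(pred - y_te))
print(f"MSE = {mse:.2f}, mean absolute error = {mae:.2f} degC")

# Extrapolated prediction analogous to the paper's 60 degC ambient, 5 C case.
print("predicted surface T at 60 degC ambient, 5 C, SOC 100%:",
      round(float(net.predict([[60.0, 5.0, 100.0]])[0]), 1), "degC")
```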
Nondestructive detection of pork quality based on dual-band VIS/NIR spectroscopy
NASA Astrophysics Data System (ADS)
Wang, Wenxiu; Peng, Yankun; Li, Yongyu; Tang, Xiuying; Liu, Yuanyuan
2015-05-01
With rising living standards and changing dietary structures, consumers increasingly demand better-quality meat. Colour, pH value, and cooking loss are important quality attributes when evaluating meat, and nondestructive, simultaneous detection of multiple quality parameters is in demand in the production and processing of meat and meat products. The objectives of this research were to compare the effectiveness of two spectral bands for rapid, nondestructive and simultaneous detection of pork quality attributes. Reflectance spectra of 60 chilled pork samples were collected with a dual-band visible/near-infrared spectroscopy system covering 350-1100 nm and 1000-2600 nm. Colour, pH value and cooking loss were then determined by standard methods as reference values. A standard normal variate transformation (SNVT) was employed to eliminate spectral noise. A spectrum connection method was put forward for effective integration of the dual-band spectra to make full use of the available information. Partial least squares regression (PLSR) and principal component analysis (PCA) were applied to establish prediction models based on the single-band and dual-band spectra. The experimental results showed that the PLSR model based on dual-band spectral information was superior to the models based on single-band spectral information, with lower root mean square error (RMSE) and higher accuracy. The PLSR model based on the dual-band spectrum (using the overlapping part of the first band) yielded the best prediction results, with correlation coefficients of validation (Rv) of 0.9469, 0.9495, 0.9180, 0.9054 and 0.8789 for L*, a*, b*, pH value and cooking loss, respectively. This is mainly because the dual-band spectrum provides richer and more comprehensive information reflecting the quality attributes. Fusing data from the dual-band spectrum could therefore significantly improve the prediction performance for pork quality parameters. The research also indicated that multi-band spectral information fusion has potential for comprehensive evaluation of other quality and safety attributes of pork.
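A compact way to prototype the preprocessing and calibration pipeline described above is to apply the SNV transform to each band, concatenate the two ranges ("spectrum connection"), and fit a PLSR model. The sketch below does this on synthetic spectra; the band sizes, the pH relation, and the number of latent variables are illustrative assumptions, not the paper's settings.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)

def snv(spectra):
    """Standard normal variate transform: centre and scale each spectrum individually."""
    return (spectra - spectra.mean(axis=1, keepdims=True)) / spectra.std(axis=1, keepdims=True)

# Synthetic stand-ins for the two spectrometer ranges (350-1100 nm, 1000-2600 nm)
# and one quality attribute (e.g. pH); real reflectance spectra would be loaded here.
n = 60
band1 = rng.normal(size=(n, 200))
band2 = rng.normal(size=(n, 150))
ph = 5.6 + 0.4 * band1[:, 50] + 0.3 * band2[:, 20] + rng.normal(0, 0.05, n)

# "Spectrum connection": concatenate the SNV-treated bands into one predictor matrix.
X = np.hstack([snv(band1), snv(band2)])

X_tr, X_te, y_tr, y_te = train_test_split(X, ph, test_size=0.3, random_state=0)
pls = PLSRegression(n_components=8).fit(X_tr, y_tr)
pred = pls.predict(X_te).ravel()

rv = np.corrcoef(pred, y_te)[0, 1]
rmse = np.sqrt(np.mean((pred - y_te) ** 2))
print(f"validation R = {rv:.3f}, RMSE = {rmse:.3f}")
```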
A User-Friendly Model for Spray Drying to Aid Pharmaceutical Product Development
Grasmeijer, Niels; de Waard, Hans; Hinrichs, Wouter L. J.; Frijlink, Henderik W.
2013-01-01
The aim of this study was to develop a user-friendly model for spray drying that can aid in the development of a pharmaceutical product, by shifting from a trial-and-error towards a quality-by-design approach. To achieve this, a spray dryer model was developed in commercial and open source spreadsheet software. The output of the model was first fitted to the experimental output of a Büchi B-290 spray dryer and subsequently validated. The predicted outlet temperatures of the spray dryer model matched the experimental values very well over the entire range of spray dryer settings that were tested. Finally, the model was applied to produce glassy sugars by spray drying, an often used excipient in formulations of biopharmaceuticals. For the production of glassy sugars, the model was extended to predict the relative humidity at the outlet, which is not measured in the spray dryer by default. This extended model was then successfully used to predict whether specific settings were suitable for producing glassy trehalose and inulin by spray drying. In conclusion, a spray dryer model was developed that is able to predict the output parameters of the spray drying process. The model can aid the development of spray dried pharmaceutical products by shifting from a trial-and-error towards a quality-by-design approach. PMID:24040240
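Since the spreadsheet model's central output is the outlet temperature, a stripped-down steady-state energy balance illustrates the kind of calculation involved. Heat losses and sensible heating of the feed are ignored, and all parameter values are illustrative assumptions rather than Büchi B-290 specifications; real dryers lose additional heat, so measured outlet temperatures are lower than this idealized estimate.

```python
# Minimal steady-state energy balance for a lab-scale spray dryer: the heat given
# up by the drying air as it cools from inlet to outlet temperature is assumed to
# go entirely into evaporating the water in the feed (heat losses and sensible
# heating of the solids are ignored). All numbers below are illustrative
# assumptions, not Büchi B-290 specifications.
CP_AIR = 1.0e3        # J/(kg K), specific heat of drying air
LATENT_HEAT = 2.26e6  # J/kg, latent heat of vaporisation of water

def outlet_temperature(t_inlet_c, airflow_kg_h, feed_ml_min, solids_fraction):
    """Estimate outlet temperature (degC) from a simple energy balance."""
    air_kg_s = airflow_kg_h / 3600.0
    water_kg_s = feed_ml_min / 60.0 / 1e6 * 1000.0 * (1.0 - solids_fraction)  # ~1 g/mL feed
    dT = water_kg_s * LATENT_HEAT / (air_kg_s * CP_AIR)
    return t_inlet_c - dT

# Example: 150 degC inlet, 35 kg/h drying air, 5 mL/min feed with 10% solids.
print(round(outlet_temperature(150.0, 35.0, 5.0, 0.10), 1), "degC")
```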
NASA Astrophysics Data System (ADS)
Engwirda, Darren
2017-06-01
An algorithm for the generation of non-uniform, locally orthogonal staggered unstructured spheroidal grids is described. This technique is designed to generate very high-quality staggered Voronoi-Delaunay meshes appropriate for general circulation modelling on the sphere, including applications to atmospheric simulation, ocean-modelling and numerical weather prediction. Using a recently developed Frontal-Delaunay refinement technique, a method for the construction of high-quality unstructured spheroidal Delaunay triangulations is introduced. A locally orthogonal polygonal grid, derived from the associated Voronoi diagram, is computed as the staggered dual. It is shown that use of the Frontal-Delaunay refinement technique allows for the generation of very high-quality unstructured triangulations, satisfying a priori bounds on element size and shape. Grid quality is further improved through the application of hill-climbing-type optimisation techniques. Overall, the algorithm is shown to produce grids with very high element quality and smooth grading characteristics, while imposing relatively low computational expense. A selection of uniform and non-uniform spheroidal grids appropriate for high-resolution, multi-scale general circulation modelling are presented. These grids are shown to satisfy the geometric constraints associated with contemporary unstructured C-grid-type finite-volume models, including the Model for Prediction Across Scales (MPAS-O). The use of user-defined mesh-spacing functions to generate smoothly graded, non-uniform grids for multi-resolution-type studies is discussed in detail.
Quality assessment of protein model-structures based on structural and functional similarities
2012-01-01
Background Experimental determination of protein 3D structures is expensive, time consuming and sometimes impossible. The gap between the number of protein structures deposited in the Worldwide Protein Data Bank and the number of sequenced proteins broadens constantly. Computational modeling is deemed to be one of the ways to deal with the problem. Although protein 3D structure prediction is a difficult task, many tools are available. These tools can model a structure from a sequence or from partial structural information, e.g. contact maps. Consequently, biologists have the ability to generate a putative 3D structure model of any protein automatically. However, the main issue becomes evaluation of the model quality, which is one of the most important challenges of structural biology. Results GOBA (Gene Ontology-Based Assessment) is a novel protein model quality assessment program. It estimates the compatibility between a model-structure and its expected function. GOBA is based on the assumption that a high-quality model is expected to be structurally similar to proteins functionally similar to the prediction target. Whereas DALI is used to measure structure similarity, protein functional similarity is quantified using the standardized and hierarchical description of proteins provided by Gene Ontology, combined with Wang's algorithm for calculating semantic similarity. Two approaches are proposed to express the quality of protein model-structures. One is a single-model quality assessment method; the other is its modification, which provides a relative measure of model quality. Exhaustive evaluation is performed on data sets of model-structures submitted to the CASP8 and CASP9 contests. Conclusions The validation shows that the method is able to discriminate between good and bad model-structures. The best of the tested GOBA scores achieved mean Pearson correlations of 0.74 and 0.8 to the observed quality of models in our CASP8- and CASP9-based validation sets. GOBA also obtained the best result for two targets of CASP8, and one of CASP9, compared to the contest participants. Consequently, GOBA offers a novel single-model quality assessment program that addresses the practical needs of biologists. In conjunction with other Model Quality Assessment Programs (MQAPs), it would prove useful for the evaluation of single protein models. PMID:22998498
Evaluation of WRF Parameterizations for Air Quality Applications over the Midwest USA
NASA Astrophysics Data System (ADS)
Zheng, Z.; Fu, K.; Balasubramanian, S.; Koloutsou-Vakakis, S.; McFarland, D. M.; Rood, M. J.
2017-12-01
Reliable predictions from chemical transport models (CTMs) for air quality research require accurate gridded weather inputs. In this study, a sensitivity analysis of 17 Weather Research and Forecasting (WRF) model runs was conducted to explore the optimum configuration in six physics categories (i.e., cumulus, surface layer, microphysics, land surface model, planetary boundary layer, and longwave/shortwave radiation) for the Midwest USA. WRF runs were initially conducted over four days in May 2011 for a 12 km x 12 km domain over the contiguous USA and a nested 4 km x 4 km domain over the Midwest USA (i.e., Illinois and adjacent areas including Iowa, Indiana, and Missouri). Model outputs were evaluated statistically by comparison with meteorological observations (DS337.0, METAR data, and the Water and Atmospheric Resources Monitoring Network), and the resulting statistics were compared to benchmark values from the literature. The identified optimum configurations of physics parameterizations were then run for the whole months of May and October 2011 to evaluate WRF model performance for the Midwestern spring and fall seasons. This study demonstrated that, for the chosen physics options, WRF predicted temperature (index of agreement (IOA) = 0.99), pressure (IOA = 0.99), relative humidity (IOA = 0.93), wind speed (IOA = 0.85), and wind direction (IOA = 0.97) well. However, WRF did not predict daily precipitation satisfactorily (IOA = 0.16). The developed gridded weather fields will be used as inputs to a CTM ensemble consisting of the Comprehensive Air Quality Model with Extensions to study the impacts of chemical fertilizer usage on regional air quality in the Midwest USA.
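The index of agreement used throughout this evaluation is Willmott's IOA, which can be computed in a few lines; the sketch below shows it applied to a toy temperature series (the values are made up, not taken from the WRF runs).

```python
import numpy as np

def index_of_agreement(pred, obs):
    """Willmott's index of agreement: 1 is a perfect match, values near 0 indicate no skill."""
    pred, obs = np.asarray(pred, float), np.asarray(obs, float)
    num = np.sum((pred - obs) ** 2)
    den = np.sum((np.abs(pred - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
    return 1.0 - num / den

# Toy example: hourly 2-m temperature, a forecast with small errors vs one with a cold bias.
obs = np.array([12.1, 14.3, 17.0, 19.2, 20.5, 18.7, 15.9, 13.4])
good = obs + np.array([0.3, -0.2, 0.4, -0.1, 0.2, -0.3, 0.1, 0.0])
biased = obs - 3.0

print("IOA (small errors):", round(index_of_agreement(good, obs), 3))
print("IOA (cold bias)   :", round(index_of_agreement(biased, obs), 3))
```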
Drying characteristics and quality of bananas under infrared radiation heating
USDA-ARS?s Scientific Manuscript database
Hot air (HA) drying of banana has low drying efficiency and results in undesirable product quality. The objectives of this research were to investigate the feasibility of infrared (IR) heating to improve banana drying rate, evaluate quality of the dried product, and establish models for predicting d...
Surrogate Analysis and Index Developer (SAID) tool
Domanski, Marian M.; Straub, Timothy D.; Landers, Mark N.
2015-10-01
The regression models created in SAID can be used in utilities that have been developed to work with the USGS National Water Information System (NWIS) and for the USGS National Real-Time Water Quality (NRTWQ) Web site. The real-time dissemination of predicted SSC and prediction intervals for each time step has substantial potential to improve understanding of sediment-related water quality and associated engineering and ecological management decisions.
NASA Astrophysics Data System (ADS)
Dawson, H. E.
2003-12-01
This paper presents a mass balance approach to assessing the cumulative impacts of discharge from Coal Bed Methane (CBM) wells on surface water quality and its suitability for irrigation in the Powder River Basin. Key water quality parameters for predicting potential effects of CBM development on irrigated agriculture are sodicity, expressed as sodium adsorption ratio (SAR) and salinity, expressed as electrical conductivity (EC). The assessment was performed with the aid of a spreadsheet model, which was designed to estimate steady-state SAR and EC at gauged stream locations after mixing with CBM produced water. Model input included ambient stream water quality and flow, CBM produced water quality and discharge rates, conveyance loss (quantity of water loss that may occur between the discharge point and the receiving streams), beneficial uses, regulatory thresholds, and discharge allocation at state-line boundaries. Historical USGS data were used to establish ambient stream water quality and flow conditions. The resultant water quality predicted for each stream station included the cumulative discharge of CBM produced water in all reaches upstream of the station. Model output was presented in both tabular and graphical formats, and indicated the suitability of pre- and post-mixing water quality for irrigation. Advantages and disadvantages of the spreadsheet model are discussed. This approach was used by federal agencies to support the development of the January 2003 Environmental Impact Statements (EIS) for the Wyoming and Montana portions of the Powder River Basin.
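The spreadsheet model itself is not reproduced in the abstract, but a steady-state, flow-weighted mixing calculation of the kind described can be sketched as follows; the flows, the concentrations, and the standard SAR formula (Na divided by the square root of the mean of Ca and Mg, all in meq/L) are illustrative assumptions, not values from the study.

```python
def mix(c_stream, q_stream, c_cbm, q_cbm):
    """Steady-state, flow-weighted concentration after conservative mixing."""
    return (c_stream * q_stream + c_cbm * q_cbm) / (q_stream + q_cbm)

def sar(na, ca, mg):
    """Sodium adsorption ratio; ion concentrations in meq/L."""
    return na / ((ca + mg) / 2.0) ** 0.5

# Hypothetical ambient stream vs. CBM produced water (CBM flow net of conveyance loss)
q_stream, q_cbm = 10.0, 1.5                       # flows, m3/s
ec_mixed = mix(850.0, q_stream, 2200.0, q_cbm)    # electrical conductivity, uS/cm
na_mixed = mix(4.0, q_stream, 28.0, q_cbm)        # meq/L
ca_mixed = mix(3.0, q_stream, 1.0, q_cbm)
mg_mixed = mix(2.0, q_stream, 0.8, q_cbm)
print(f"mixed EC = {ec_mixed:.0f} uS/cm, mixed SAR = {sar(na_mixed, ca_mixed, mg_mixed):.1f}")
```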
Recknagel, Friedrich; Orr, Philip T; Bartkow, Michael; Swanepoel, Annelie; Cao, Hongqing
2017-11-01
An early warning scheme is proposed that runs ensembles of inferential models for predicting cyanobacterial population dynamics and cyanotoxin concentrations in drinking water reservoirs on a diel basis, driven by in situ sonde water quality data. When the 10- to 30-day-ahead predicted concentrations of cyanobacteria cells or cyanotoxins exceed pre-defined limit values, an early warning automatically activates an action plan considering in-lake control, e.g. intermittent mixing, and ad hoc water treatment in water works, respectively. Case studies of the sub-tropical Lake Wivenhoe (Australia) and the Mediterranean Vaal Reservoir (South Africa) demonstrate that ensembles of inferential models developed by the hybrid evolutionary algorithm HEA are capable of forecasting cyanobacteria and cyanotoxins up to 30 days ahead using data collected in situ. The resulting models for Dolicospermum circinale displayed validity for up to 10 days ahead, whilst concentrations of Cylindrospermopsis raciborskii and microcystins were successfully predicted up to 30 days ahead. Implementing the proposed scheme for drinking water reservoirs enhances current water quality monitoring practices by solely utilising in situ monitoring data, in addition to cyanobacteria and cyanotoxin measurements. Access to routinely measured cyanotoxin data allows for the development of models that predict cyanotoxin concentrations explicitly, avoiding the inadvertent modelling and prediction of non-toxic cyanobacterial strains. Copyright © 2017 Elsevier B.V. All rights reserved.
PconsD: ultra rapid, accurate model quality assessment for protein structure prediction.
Skwark, Marcin J; Elofsson, Arne
2013-07-15
Clustering methods are often needed for accurately assessing the quality of modeled protein structures. Recent blind evaluation of quality assessment methods in CASP10 showed that there is little difference between many different methods as far as ranking models and selecting the best model are concerned. When comparing many models, the computational cost of the model comparison can become significant. Here, we present PconsD, a fast, stream-computing method for distance-driven model quality assessment that runs on consumer hardware. PconsD is at least one order of magnitude faster than other methods of comparable accuracy. The source code for PconsD is freely available at http://d.pcons.net/, and supplementary benchmarking data are also available there. Contact: arne@bioinfo.se. Supplementary data are available at Bioinformatics online.
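PconsD's implementation is not shown here, but the consensus principle behind distance-driven quality assessment, scoring each model by its average structural similarity to every other model in the pool, can be illustrated with a toy similarity matrix; the numbers are invented.

```python
import numpy as np

def consensus_scores(similarity):
    """Score each model by its mean pairwise similarity to all other models."""
    sim = np.asarray(similarity, float)
    n = sim.shape[0]
    return (sim.sum(axis=1) - np.diag(sim)) / (n - 1)

# Toy symmetric similarity matrix (e.g., TM-score-like) for four candidate models
sim = np.array([[1.00, 0.72, 0.68, 0.30],
                [0.72, 1.00, 0.75, 0.28],
                [0.68, 0.75, 1.00, 0.25],
                [0.30, 0.28, 0.25, 1.00]])
print(consensus_scores(sim))  # the fourth model is the outlier and scores lowest
```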
Protein structure modeling and refinement by global optimization in CASP12.
Hong, Seung Hwan; Joung, InSuk; Flores-Canales, Jose C; Manavalan, Balachandran; Cheng, Qianyi; Heo, Seungryong; Kim, Jong Yun; Lee, Sun Young; Nam, Mikyung; Joo, Keehyoung; Lee, In-Ho; Lee, Sung Jong; Lee, Jooyoung
2018-03-01
For protein structure modeling in the CASP12 experiment, we have developed a new protocol based on our previous CASP11 approach. The global optimization method of conformational space annealing (CSA) was applied to 3 stages of modeling: multiple sequence-structure alignment, three-dimensional (3D) chain building, and side-chain re-modeling. For better template selection and model selection, we updated our model quality assessment (QA) method with the newly developed SVMQA (support vector machine for quality assessment). For 3D chain building, we updated our energy function by including restraints generated from predicted residue-residue contacts. New energy terms for the predicted secondary structure and predicted solvent accessible surface area were also introduced. For difficult targets, we proposed a new method, LEEab, where the template term played a less significant role than it did in LEE, complemented by increased contributions from other terms such as the predicted contact term. For TBM (template-based modeling) targets, LEE performed better than LEEab, but for FM targets, LEEab was better. For model refinement, we modified our CASP11 molecular dynamics (MD) based protocol by using explicit solvents and tuning down restraint weights. Refinement results from MD simulations that used a new augmented statistical energy term in the force field were quite promising. Finally, when using inaccurate information (such as the predicted contacts), it was important to use the Lorentzian function for which the maximal penalty arising from wrong information is always bounded. © 2017 Wiley Periodicals, Inc.
Eslami, Mohammad H; Rybin, Denis V; Doros, Gheorghe; Siracuse, Jeffrey J; Farber, Alik
2018-01-01
The purpose of this study is to externally validate a recently reported Vascular Study Group of New England (VSGNE) risk predictive model of postoperative mortality after elective abdominal aortic aneurysm (AAA) repair and to compare its predictive ability across different patient risk categories and against established risk predictive models using the Vascular Quality Initiative (VQI) AAA sample. The VQI AAA database (2010-2015) was queried for patients who underwent elective AAA repair. The VSGNE cases were excluded from the VQI sample. The external validation of a recently published VSGNE AAA risk predictive model, which includes only preoperative variables (age, gender, history of coronary artery disease, chronic obstructive pulmonary disease, cerebrovascular disease, creatinine levels, and aneurysm size) and planned type of repair, was performed using the VQI elective AAA repair sample. The predictive value of the model was assessed via the C-statistic. The Hosmer-Lemeshow method was used to assess calibration and goodness of fit. This model was then compared with the Medicare model, the Vascular Governance Northwest model, and the Glasgow Aneurysm Score for predicting mortality in the VQI sample. The Vuong test was performed to compare the model fit between the models. Model discrimination was assessed in different risk-group VQI quintiles. Data from 4431 cases from the VSGNE sample with an overall mortality rate of 1.4% were used to develop the model. The internally validated VSGNE model showed a very high discriminating ability in predicting mortality (C = 0.822) and good model fit (Hosmer-Lemeshow P = .309) among the VSGNE elective AAA repair sample. External validation on 16,989 VQI cases with an overall 0.9% mortality rate showed very robust predictive ability of mortality (C = 0.802). Vuong tests yielded a significant fit difference favoring the VSGNE model over the Medicare model (C = 0.780), the Vascular Governance Northwest model (0.774), and the Glasgow Aneurysm Score (0.639). Across the 5 risk quintiles, the VSGNE model predicted observed mortality with great accuracy. This simple VSGNE AAA risk predictive model showed very high discriminative ability in predicting mortality after elective AAA repair among a large external independent sample of AAA cases performed by a diverse array of physicians nationwide. The risk score based on this simple VSGNE model can reliably stratify patients according to their risk of mortality after elective AAA repair better than other established models. Copyright © 2017 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.
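Neither the VSGNE model coefficients nor the VQI data are given in the abstract, but the two validation statistics it relies on can be illustrated on simulated risk predictions; the Hosmer-Lemeshow variant below (deciles of predicted risk, chi-square with g-2 degrees of freedom) is a common choice and is assumed rather than taken from the study.

```python
import numpy as np
from scipy.stats import chi2
from sklearn.metrics import roc_auc_score

def hosmer_lemeshow_p(y_true, y_prob, n_bins=10):
    """Hosmer-Lemeshow goodness-of-fit p-value over deciles of predicted risk."""
    order = np.argsort(y_prob)
    stat = 0.0
    for idx in np.array_split(order, n_bins):
        n, obs, exp = len(idx), y_true[idx].sum(), y_prob[idx].sum()
        p_bar = exp / n
        stat += (obs - exp) ** 2 / (n * p_bar * (1 - p_bar) + 1e-12)
    return 1.0 - chi2.cdf(stat, df=n_bins - 2)

rng = np.random.default_rng(0)
y_prob = rng.uniform(0.001, 0.05, size=5000)   # simulated predicted mortality risks
y_true = rng.binomial(1, y_prob)               # simulated observed outcomes
print("C-statistic:", round(roc_auc_score(y_true, y_prob), 3))
print("Hosmer-Lemeshow p:", round(hosmer_lemeshow_p(y_true, y_prob), 3))
```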
GenePRIMP: A Gene Prediction Improvement Pipeline For Prokaryotic Genomes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kyrpides, Nikos C.; Ivanova, Natalia N.; Pati, Amrita
2010-07-08
GenePRIMP (Gene Prediction Improvement Pipeline, http://geneprimp.jgi-psf.org) is a computational process that performs evidence-based evaluation of gene models in prokaryotic genomes and reports anomalies including inconsistent start sites, missing genes, and split genes. We show that manual curation of gene models using the anomaly reports generated by GenePRIMP improves their quality, and we demonstrate the applicability of GenePRIMP in improving finishing quality and comparing different genome sequencing and annotation technologies. Keywords in context: Gene model, Quality Control, Translation start sites, Automatic correction. Hardware requirements: PC, MAC; Operating System: UNIX/LINUX; Compiler/Version: Perl 5.8.5 or higher; Special requirements: NCBI Blast and nr installation; File Types: Source Code, Executable module(s), Sample problem input data, installation instructions, other, programmer documentation. Location/transmission: http://geneprimp.jgi-psf.org/gp.tar.gz
NASA Technical Reports Server (NTRS)
Young, Stuart A.; Vaughan, Mark; Omar, Ali; Liu, Zhaoyan; Lee, Sunhee; Hu, Youngxiang; Cope, Martin
2008-01-01
Global measurements of the vertical distribution of clouds and aerosols have been recorded by the lidar on board the CALIPSO (Cloud Aerosol Lidar Infrared Pathfinder Satellite Observations) satellite since June 2006. Such extensive, height-resolved measurements provide a rare and valuable opportunity for developing, testing and validating various atmospheric models, including global climate, numerical weather prediction, chemical transport and air quality models. Here we report on the initial results of an investigation into the performance of the Australian Air Quality Forecast System (AAQFS) model in forecasting the distribution of elevated dust over the Australian region. The model forecasts of PM60 dust distribution are compared with the CALIPSO lidar Vertical Feature Mask (VFM) data product. The VFM classifies contiguous atmospheric regions of enhanced backscatter as either cloud or aerosols. Aerosols are further classified into six subtypes. By comparing forecast PM60 concentration profiles to the spatial distribution of dust reported in the CALIPSO VFM, we can assess the model's ability to predict the occurrence and the vertical and horizontal extents of dust events within the study area.
Implications of Modeling Uncertainty for Water Quality Decision Making
NASA Astrophysics Data System (ADS)
Shabman, L.
2002-05-01
The National Academy of Sciences report "Assessing the TMDL Approach to Water Quality Management" endorsed the watershed-based, ambient-water-quality-focused approach to water quality management called for in the TMDL program. The committee felt that available data and models were adequate to move such a program forward, if the EPA and all stakeholders better understood the nature of the scientific enterprise and its application to the TMDL program. Specifically, the report called for a greater acknowledgement of model prediction uncertainty in making and implementing TMDL plans. To assure that such uncertainty was addressed in water quality decision making, the committee called for a commitment to "adaptive implementation" of water quality management plans. The committee found that the number and complexity of the interactions of multiple stressors, combined with model prediction uncertainty, mean that we need to avoid the temptation to make assurances that specific actions will result in attainment of particular water quality standards. Until the work on solving a water quality problem begins, analysts and decision makers cannot be sure what the correct solutions are, or even what water quality goals a community should be seeking. In complex systems we need to act in order to learn; adaptive implementation is a concurrent process of action and learning. Learning requires (1) continued monitoring of the waterbody to determine how it responds to the actions taken and (2) carefully designed experiments in the watershed. If we do not design learning into what we attempt, we are not doing adaptive implementation. Therefore, there needs to be an increased commitment to monitoring and experiments in watersheds that will lead to learning. This presentation will 1) explain the logic for adaptive implementation; 2) discuss the ways that water quality modelers could characterize and explain model uncertainty to decision makers; and 3) speculate on the implications of adaptive implementation for the setting of water quality standards, for the design of watershed monitoring programs, and for the regulatory rules governing TMDL program implementation.
Global Environmental Multiscale model - a platform for integrated environmental predictions
NASA Astrophysics Data System (ADS)
Kaminski, Jacek W.; Struzewska, Joanna; Neary, Lori; Dearden, Frank
2017-04-01
The Global Environmental Multiscale model was developed by the Government of Canada as an operational weather prediction model in the mid-1990s. Subsequently, it was used as the host meteorological model for an on-line implementation of air quality chemistry and aerosols from global to the meso-gamma scale. Further model developments led to the vertical extension of the modelling domain to include stratospheric chemistry, aerosols, and formation of polar stratospheric clouds. In parallel, the modelling platform was used for planetary applications where dynamical, radiative transfer and chemical processes in the atmosphere of Mars were successfully simulated. Undoubtedly, the developed modelling platform can be classified as an example capable of the seamless and coupled modelling of the dynamics and chemistry of planetary atmospheres. We will present modelling results for global, regional, and local air quality episodes and the long-term air quality trends. Upper troposphere and lower stratosphere modelling results will be presented in terms of climate change and subsonic aviation emissions modelling. Model results for the atmosphere of Mars will be presented in the context of the 2016 ExoMars mission and the anticipated observations from the NOMAD instrument. Also, we will present plans and the design to extend the GEM model to the F region with further coupling with a magnetospheric model that extends to 15 Re.
Lassale, Camille; Gunter, Marc J.; Romaguera, Dora; Peelen, Linda M.; Van der Schouw, Yvonne T.; Beulens, Joline W. J.; Freisling, Heinz; Muller, David C.; Ferrari, Pietro; Huybrechts, Inge; Fagherazzi, Guy; Boutron-Ruault, Marie-Christine; Affret, Aurélie; Overvad, Kim; Dahm, Christina C.; Olsen, Anja; Roswall, Nina; Tsilidis, Konstantinos K.; Katzke, Verena A.; Kühn, Tilman; Buijsse, Brian; Quirós, José-Ramón; Sánchez-Cantalejo, Emilio; Etxezarreta, Nerea; Huerta, José María; Barricarte, Aurelio; Bonet, Catalina; Khaw, Kay-Tee; Key, Timothy J.; Trichopoulou, Antonia; Bamia, Christina; Lagiou, Pagona; Palli, Domenico; Agnoli, Claudia; Tumino, Rosario; Fasanelli, Francesca; Panico, Salvatore; Bueno-de-Mesquita, H. Bas; Boer, Jolanda M. A.; Sonestedt, Emily; Nilsson, Lena Maria; Renström, Frida; Weiderpass, Elisabete; Skeie, Guri; Lund, Eiliv; Moons, Karel G. M.; Riboli, Elio; Tzoulaki, Ioanna
2016-01-01
Scores of overall diet quality have received increasing attention in relation to disease aetiology; however, their value in risk prediction has been little examined. The objective was to assess and compare the association and predictive performance of 10 diet quality scores on 10-year risk of all-cause, CVD and cancer mortality in 451,256 healthy participants of the European Prospective Investigation into Cancer and Nutrition, followed up for a median of 12.8 years. All dietary scores studied showed significant inverse associations with all outcomes. The range of HRs (95% CI) in the top vs. lowest quartile of dietary scores in a composite model including non-invasive factors (age, sex, smoking, body mass index, education, physical activity and study centre) was 0.75 (0.72–0.79) to 0.88 (0.84–0.92) for all-cause, 0.76 (0.69–0.83) to 0.84 (0.76–0.92) for CVD and 0.78 (0.73–0.83) to 0.91 (0.85–0.97) for cancer mortality. Models with dietary scores alone showed low discrimination, but composite models also including age, sex and other non-invasive factors showed good discrimination and calibration, which varied little between the different diet scores examined. The mean C-statistic of the full models was 0.73, 0.80 and 0.71 for all-cause, CVD and cancer mortality, respectively. Dietary scores have poor predictive performance for 10-year mortality risk when used in isolation but display good predictive ability in combination with other non-invasive common risk factors. PMID:27409582
[A predictive model for the quality of sexual life in hysterectomized women].
Urrutia, María Teresa; Araya, Alejandra; Rivera, Soledad; Viviani, Paola; Villarroel, Luis
2007-03-01
The effects of hysterectomy on sexuality have been extensively studied. The aim was to establish a model to predict the quality of sexual life in hysterectomized women six months after surgery. Analytical, longitudinal and prospective study of 90 hysterectomized women aged 45+/-7 years. Two structured interviews, at the time of surgery and six months later, were carried out to determine the characteristics of sexuality and communication within the couple. In the two interviews, communication and the quality of sexual life were described as "good" by 72 and 77% of women, respectively (NS). The variables that had a 40% influence on the quality of sexual life six months after surgery were oophorectomy status, the presence of orgasm, the characteristics of communication, and the basal sexuality with the couple. The sexuality of hysterectomized women will depend, to a great extent, on pre-surgical variables. Therefore, it is important to consider these variables in the education of hysterectomized women.
NASA Astrophysics Data System (ADS)
Engelen, R. J.; Peuch, V. H.
2017-12-01
The European Copernicus Atmosphere Monitoring Service (CAMS) operationally provides daily forecasts of global atmospheric composition and regional air quality. The global forecasting system is using ECMWF's Integrated Forecasting System (IFS), which is used for numerical weather prediction and which has been extended with modules for atmospheric chemistry, aerosols and greenhouse gases. The regional forecasts are produced by an ensemble of seven operational European air quality models that take their boundary conditions from the global system and provide an ensemble median with ensemble spread as their main output. Both the global and regional forecasting systems are feeding their output into air quality models on a variety of scales in various parts of the world. We will introduce the CAMS service chain and provide illustrations of its use in downstream applications. Both the usage of the daily forecasts and the usage of global and regional reanalyses will be addressed.
ERIC Educational Resources Information Center
Hamilton, Stephen T.; Freed, Erin M.; Long, Debra L.
2013-01-01
The goal of this study was to examine predictions derived from the Lexical Quality Hypothesis regarding relations among word decoding, working-memory capacity, and the ability to integrate new concepts into a developing discourse representation. Hierarchical Linear Modeling was used to quantify the effects of three text properties (length,…
ERIC Educational Resources Information Center
Dettmers, Swantje; Trautwein, Ulrich; Ludtke, Oliver; Kunter, Mareike; Baumert, Jurgen
2010-01-01
The present study examined the associations of 2 indicators of homework quality (homework selection and homework challenge) with homework motivation, homework behavior, and mathematics achievement. Multilevel modeling was used to analyze longitudinal data from a representative national sample of 3,483 students in Grades 9 and 10; homework effects…
Souihi, Nabil; Dumarey, Melanie; Wikström, Håkan; Tajarobi, Pirjo; Fransson, Magnus; Svensson, Olof; Josefson, Mats; Trygg, Johan
2013-04-15
Roll compaction is a continuous process for solid dosage form manufacturing that is increasingly popular within the pharmaceutical industry. Although roll compaction has become an established technique for dry granulation, the influence of material properties is still not fully understood. In this study, a quality by design (QbD) approach was utilized, not only to understand the influence of different qualities of mannitol and dicalcium phosphate (DCP), but also to predict critical quality attributes of the drug product based solely on the material properties of the filler. By describing each filler quality in terms of several representative physical properties, orthogonal projections to latent structures (OPLS) was used to understand and predict how those properties affected drug product intermediates as well as critical quality attributes of the final drug product. These models were then validated by predicting product attributes for filler qualities not used in the model construction. The results of this study confirmed that the tensile strength reduction, known to affect plastic materials when roll compacted, is not prominent when using brittle materials. Some qualities of these fillers actually demonstrated improved compactability following roll compaction. While direct compression qualities are frequently used for roll compacted drug products because of their excellent flowability and good compaction properties, this study revealed that granules from these qualities flowed more poorly than the corresponding powder blends, which was not seen for granules from traditional qualities. The QbD approach used in this study could be extended beyond fillers. Thus any new compound/ingredient would first be characterized and then suitable formulation characteristics could be determined in silico, without running any additional experiments. Copyright © 2013 Elsevier B.V. All rights reserved.
USDA-ARS?s Scientific Manuscript database
Relevant data about subsurface water flow and solute transport at relatively large scales that are of interest to the public are inherently laborious and in most cases simply impossible to obtain. Upscaling in which fine-scale models and data are used to predict changes at the coarser scales is the...
Key Questions in Building Defect Prediction Models in Practice
NASA Astrophysics Data System (ADS)
Ramler, Rudolf; Wolfmaier, Klaus; Stauder, Erwin; Kossak, Felix; Natschläger, Thomas
The information about which modules of a future version of a software system are defect-prone is a valuable planning aid for quality managers and testers. Defect prediction promises to indicate these defect-prone modules. However, constructing effective defect prediction models in an industrial setting involves a number of key questions. In this paper we discuss ten key questions identified in the context of establishing defect prediction in a large software development project. Seven consecutive versions of the software system have been used to construct and validate defect prediction models for system test planning. Furthermore, the paper presents initial empirical results from the studied project and, by this means, contributes answers to the identified questions.
Predicting dire outcomes of patients with community acquired pneumonia.
Cooper, Gregory F; Abraham, Vijoy; Aliferis, Constantin F; Aronis, John M; Buchanan, Bruce G; Caruana, Richard; Fine, Michael J; Janosky, Janine E; Livingston, Gary; Mitchell, Tom; Monti, Stefano; Spirtes, Peter
2005-10-01
Community-acquired pneumonia (CAP) is an important clinical condition with regard to patient mortality, patient morbidity, and healthcare resource utilization. The assessment of the likely clinical course of a CAP patient can significantly influence decision making about whether to treat the patient as an inpatient or as an outpatient. That decision can in turn influence resource utilization, as well as patient well being. Predicting dire outcomes, such as mortality or severe clinical complications, is a particularly important component in assessing the clinical course of patients. We used a training set of 1601 CAP patient cases to construct 11 statistical and machine-learning models that predict dire outcomes. We evaluated the resulting models on 686 additional CAP-patient cases. The primary goal was not to compare these learning algorithms as a study end point; rather, it was to develop the best model possible to predict dire outcomes. A special version of an artificial neural network (NN) model predicted dire outcomes the best. Using the 686 test cases, we estimated the expected healthcare quality and cost impact of applying the NN model in practice. The particular, quantitative results of this analysis are based on a number of assumptions that we make explicit; they will require further study and validation. Nonetheless, the general implication of the analysis seems robust, namely, that even small improvements in predictive performance for prevalent and costly diseases, such as CAP, are likely to result in significant improvements in the quality and efficiency of healthcare delivery. Therefore, seeking models with the highest possible level of predictive performance is important. Consequently, seeking ever better machine-learning and statistical modeling methods is of great practical significance.
Dėdelė, Audrius; Miškinytė, Auksė
2015-09-01
In many countries, road traffic is one of the main sources of air pollution associated with adverse effects on human health and the environment. Nitrogen dioxide (NO2) is considered to be a measure of traffic-related air pollution, with concentrations tending to be higher near highways, along busy roads, and in city centers, and exceedances are mainly observed at measurement stations located close to traffic. In order to assess air quality in the city and the impact of air pollution on public health, air quality models are used. However, before a model can be used for these purposes, it is important to evaluate the accuracy of dispersion modelling, one of the most widely used methods. Monitoring and dispersion modelling are two components of an air quality monitoring system (AQMS), and a statistical comparison between the two was made in this research. The evaluation of the Atmospheric Dispersion Modelling System (ADMS-Urban) was made by comparing monthly modelled NO2 concentrations with data from continuous air quality monitoring stations in Kaunas city. The statistical measures of model performance were calculated for annual and monthly concentrations of NO2 for each monitoring station site. The spatial analysis was made using geographic information systems (GIS). The calculated statistical parameters indicated a good performance of the ADMS-Urban model for the prediction of NO2. The results of this study showed that the agreement between modelled values and observations was better for traffic monitoring stations compared to the background and residential stations.
Validating a model that predicts daily growth and feed quality of New Zealand dairy pastures.
Woodward, S J
2001-09-01
The Pasture Quality (PQ) model is a simple, mechanistic, dynamical system model that was designed to capture the essential biological processes in grazed grass-clover pasture, and to be optimised to derive improved grazing strategies for New Zealand dairy farms. While the individual processes represented in the model (photosynthesis, tissue growth, flowering, leaf death, decomposition, worms) were based on experimental data, this did not guarantee that the assembled model would accurately predict the behaviour of the system as a whole (i.e., pasture growth and quality). Validation of the whole model was thus a priority, since any strategy derived from the model could impact a farm business in the order of thousands of dollars per annum if adopted. This paper describes the process of defining performance criteria for the model, obtaining suitable data to test the model, and carrying out the validation analysis. The validation process highlighted a number of weaknesses in the model, which will lead to the model being improved. As a result, the model's utility will be enhanced. Furthermore, validation was found to have an unexpected additional benefit, in that despite the model's poor initial performance, support was generated for the model among field scientists involved in the wider project.
Human Thermal Model Evaluation Using the JSC Human Thermal Database
NASA Technical Reports Server (NTRS)
Bue, Grant; Makinen, Janice; Cognata, Thomas
2012-01-01
Human thermal modeling has considerable long-term utility to human space flight. Such models provide a tool to predict crew survivability in support of vehicle design and to evaluate crew response in untested space environments. It is to the benefit of any such model not only to collect relevant experimental data to correlate it against, but also to maintain an experimental standard or benchmark for future development in a readily and rapidly searchable and software-accessible format. The human thermal database project is intended to do just that: to collect relevant data from literature and experimentation and to store the data in a database structure for immediate and future use as a benchmark to judge human thermal models against, in identifying model strengths and weaknesses, to support model development and improve correlation, and to statistically quantify a model's predictive quality. The human thermal database developed at the Johnson Space Center (JSC) is intended to evaluate a set of widely used human thermal models. This set includes the Wissler human thermal model, a model that has been widely used to predict the human thermoregulatory response to a variety of cold and hot environments. These models are statistically compared to the current database, which contains experiments on human subjects primarily in air, drawn from a literature survey ranging between 1953 and 2004 and from a suited experiment recently performed by the authors, for a quantitative study of the relative strength and predictive quality of the models.
Rébufa, Catherine; Pany, Inès; Bombarda, Isabelle
2018-09-30
A rapid methodology was developed to simultaneously predict the water content and water activity (aw) values of Moringa oleifera leaf powders (MOLP) using near infrared (NIR) signatures and experimental sorption isotherms. NIR spectra of MOLP samples (n = 181) were recorded. A Partial Least Squares Regression model (PLS2) was obtained with low standard errors of prediction (SEP of 1.8% and 0.07 for water content and aw, respectively). Experimental sorption isotherms obtained at 20, 30 and 40 °C showed similar profiles. This result is particularly important for the use of MOLP in the food industry: a temperature variation in the drying process will not affect their available water content (shelf life). Nutrient contents based on protein and selected minerals (Ca, Fe, K) were also predicted from PLS1 models. Protein contents were well predicted (SEP of 2.3%). This methodology allowed for an improvement in MOLP safety, quality control and traceability. Published by Elsevier Ltd.
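As a rough illustration of the PLS2 approach described (one model predicting both water content and aw from NIR spectra), the sketch below uses scikit-learn on synthetic spectra; the data, the number of latent variables, and the SEP computation are assumptions for illustration only.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in for NIR spectra (samples x wavelengths) and two responses
rng = np.random.default_rng(1)
X = rng.normal(size=(181, 500))                          # absorbance values
Y = np.column_stack([X[:, :50].mean(axis=1) * 5 + 8.0,   # "water content" (%)
                     X[:, 50:100].mean(axis=1) + 0.4])   # "a_w"
X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.3, random_state=0)

pls = PLSRegression(n_components=8).fit(X_tr, Y_tr)      # PLS2: both responses at once
sep = np.sqrt(((pls.predict(X_te) - Y_te) ** 2).mean(axis=0))
print("SEP (water content, a_w):", np.round(sep, 3))
```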
Evaluation of ride quality prediction methods for operational military helicopters
NASA Technical Reports Server (NTRS)
Leatherwood, J. D.; Clevenson, S. A.; Hollenbaugh, D. D.
1984-01-01
The results of a simulator study conducted to compare and validate various ride quality prediction methods for use in assessing passenger/crew ride comfort within helicopters are presented. Included are results quantifying 35 helicopter pilots' discomfort responses to helicopter interior noise and vibration typical of routine flights, assessment of various ride quality metrics including the NASA ride comfort model, and examination of possible criteria approaches. Results of the study indicated that crew discomfort results from a complex interaction between vibration and interior noise. Overall measures such as weighted or unweighted root-mean-square acceleration level and A-weighted noise level were not good predictors of discomfort. Accurate prediction required a metric incorporating the interactive effects of both noise and vibration. The best metric for predicting crew comfort to the combined noise and vibration environment was the NASA discomfort index.
The unique role of lexical accessibility in predicting kindergarten emergent literacy.
Verhoeven, Ludo; van Leeuwe, Jan; Irausquin, Rosemarie; Segers, Eliane
The goal of this longitudinal study was to examine how lexical quality predicts the emergence of literacy abilities in 169 Dutch kindergarten children before formal reading instruction has started. At the beginning of the school year, a battery of precursor measures associated with lexical quality was related to the emergence of letter knowledge and word decoding. Confirmatory factor analysis evidenced five domains related to lexical quality, i.e., vocabulary, phonological coding, phonological awareness, lexical retrieval and phonological working memory. Structural equation modeling showed that the development of letter knowledge during the year could be predicted from children's phonological awareness and lexical retrieval, and the emergence of word decoding from their phonological awareness and letter knowledge. It is concluded that it is primarily the accessibility of phonological representations in the mental lexicon that predicts the emergence of literacy in kindergarten.
Seasonal Drought Prediction: Advances, Challenges, and Future Prospects
NASA Astrophysics Data System (ADS)
Hao, Zengchao; Singh, Vijay P.; Xia, Youlong
2018-03-01
Drought prediction is of critical importance to early warning for drought management. This review provides a synthesis of drought prediction based on statistical, dynamical, and hybrid methods. Statistical drought prediction is achieved by modeling the relationship between drought indices of interest and a suite of potential predictors, including large-scale climate indices, local climate variables, and land initial conditions. Dynamical meteorological drought prediction relies on seasonal climate forecasts from general circulation models (GCMs), which can be employed to drive hydrological models for agricultural and hydrological drought prediction, with the predictability determined by both climate forcings and initial conditions. Challenges still exist in drought prediction at long lead times and under a changing environment resulting from natural and anthropogenic factors. Future research prospects to improve drought prediction include, but are not limited to, high-quality data assimilation, improved model development with key processes related to drought occurrence, optimal ensemble forecasts to select or weight ensembles, and hybrid drought prediction to merge statistical and dynamical forecasts.
NASA Astrophysics Data System (ADS)
Hu, J.; Zhang, H.; Ying, Q.; Chen, S.-H.; Vandenberghe, F.; Kleeman, M. J.
2014-08-01
For the first time, a decadal (9 years from 2000 to 2008) air quality model simulation with 4 km horizontal resolution and daily time resolution has been conducted in California to provide air quality data for health effects studies. Model predictions are compared to measurements to evaluate the accuracy of the simulation with an emphasis on spatial and temporal variations that could be used in epidemiology studies. Better model performance is found at longer averaging times, suggesting that model results with averaging times ≥ 1 month should be the first to be considered in epidemiological studies. The UCD/CIT model predicts spatial and temporal variations in the concentrations of O3, PM2.5, EC, OC, nitrate, and ammonium that meet standard modeling performance criteria when compared to monthly-averaged measurements. Predicted sulfate concentrations do not meet target performance metrics due to missing sulfur sources in the emissions. Predicted seasonal and annual variations of PM2.5, EC, OC, nitrate, and ammonium have mean fractional biases that meet the model performance criteria in 95%, 100%, 71%, 73%, and 92% of the simulated months, respectively. The base dataset provides an improvement for predicted population exposure to PM concentrations in California compared to exposures estimated by central site monitors operated one day out of every 3 days at a few urban locations. Uncertainties in the model predictions arise from several issues. Incomplete understanding of secondary organic aerosol formation mechanisms leads to OC bias in the model results in summertime but does not affect OC predictions in winter when concentrations are typically highest. The CO and NO (species dominated by mobile emissions) results reveal temporal and spatial uncertainties associated with the mobile emissions generated by the EMFAC 2007 model. The WRF model tends to over-predict wind speed during stagnation events, leading to under-predictions of high PM concentrations, usually in winter months. The WRF model also generally under-predicts relative humidity, resulting in less particulate nitrate formation especially during winter months. These issues will be improved in future studies. All model results included in the current manuscript can be downloaded free of charge at http://faculty.engineering.ucdavis.edu/kleeman/.
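The mean fractional bias (MFB) statistic referred to above, together with the companion mean fractional error, has a standard form in air quality model evaluation; the sketch below computes both for invented monthly PM2.5 values and is illustrative only.

```python
import numpy as np

def mfb_mfe(pred, obs):
    """Mean fractional bias and mean fractional error between model and observations."""
    pred, obs = np.asarray(pred, float), np.asarray(obs, float)
    frac = 2.0 * (pred - obs) / (pred + obs)
    return frac.mean(), np.abs(frac).mean()

obs = np.array([22.0, 35.0, 18.0, 41.0, 27.0])    # monthly PM2.5, ug/m3 (illustrative)
pred = np.array([19.5, 30.0, 20.0, 36.0, 25.0])
mfb, mfe = mfb_mfe(pred, obs)
print(f"MFB = {mfb:+.2f}, MFE = {mfe:.2f}")
```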
Birmingham, R S; Bub, K L; Vaughn, B E
2017-04-01
Parenting and attachment are critical in the emergence of self-regulation (SR) in preschool. However, most studies use general indexes of parenting quality, failing to explore the unique contributions of sensitivity and home quality to SR. Further, the nature of the interplay between parenting and attachment history is not well understood. Using a sample of 938 children from The National Institute of Child Health and Human Development Study of Early Child Care and Youth Development, a series of structural equation models were fit to determine whether sensitivity and home quality concurrently predicted SR at 54 months, and whether attachment mediated or moderated these pathways. Results suggest that both sensitivity and home quality uniquely predict SR. Further, these early parenting variables were each indirectly associated with SR through children's attachment history. That is, higher levels of sensitivity and home quality predicted secure attachment history, which, along with parenting, predicted more advanced SR skills at 54 months. No moderated pathways emerged, suggesting that attachment history may be best conceptualized as a mediating mechanism.
Murphy, Jennifer C.; Farmer, James; Layton, Alice
2016-06-13
The U.S. Geological Survey, in cooperation with the Tennessee Duck River Development Agency, monitored water quality at several locations in the upper Duck River watershed between October 2007 and September 2010. Discrete water samples collected at 24 sites in the watershed were analyzed for water quality, and Escherichia coli (E. coli) and enterococci concentrations. Additional analyses, including the determination of anthropogenic-organic compounds, bacterial concentration of resuspended sediment, and bacterial-source tracking, were performed at a subset of sites. Continuous monitoring of streamflow, turbidity, and specific conductance was conducted at seven sites; a subset of sites also was monitored for water temperature and dissolved oxygen concentration. Multiple-regression models were developed to predict instantaneous E. coli concentrations and loads at sites with continuous monitoring. This data collection effort, along with the E. coli models and predictions, support analyses of the relations among land use, bacteria source and transport, and basin hydrology in the upper Duck River watershed.
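The report's regression equations are not listed in the abstract, but a surrogate regression of the general kind described, predicting log-transformed E. coli concentration from continuously monitored turbidity and streamflow, might be set up as below; the variables, coefficients, and data are hypothetical.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical surrogate regression: log10(E. coli) from log10(turbidity) and log10(flow)
rng = np.random.default_rng(4)
n = 120
turbidity = rng.lognormal(2.0, 0.8, n)            # FNU
streamflow = rng.lognormal(3.5, 0.6, n)           # ft3/s
log_ecoli = (0.8 + 0.9 * np.log10(turbidity) + 0.2 * np.log10(streamflow)
             + rng.normal(0, 0.3, n))

X = sm.add_constant(np.column_stack([np.log10(turbidity), np.log10(streamflow)]))
fit = sm.OLS(log_ecoli, X).fit()
print(fit.params)   # intercept and slopes of the fitted surrogate model
```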
Research on quality metrics of wireless adaptive video streaming
NASA Astrophysics Data System (ADS)
Li, Xuefei
2018-04-01
With the development of wireless networks and intelligent terminals, video traffic has increased dramatically. Adaptive video streaming has become one of the most promising video transmission technologies. For this type of service, a good QoS (Quality of Service) of the wireless network does not always guarantee that all customers have a good experience. Thus, new quality metrics have been widely studied recently. Taking this into account, the objective of this paper is to investigate quality metrics for wireless adaptive video streaming. In this paper, a wireless video streaming simulation platform with a DASH mechanism and a multi-rate video generator is established. Based on this platform, a PSNR model, an SSIM model and a Quality Level model are implemented. The Quality Level model considers QoE (Quality of Experience) factors such as image quality, stalling and switching frequency, while the PSNR and SSIM models mainly consider the quality of the video. To evaluate the performance of these QoE models, three performance metrics (SROCC, PLCC and RMSE), which are used to compare subjective and predicted MOS (Mean Opinion Score), are calculated. From these performance metrics, the monotonicity, linearity and accuracy of the quality metrics can be observed.
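The three performance metrics named above are straightforward to compute; the sketch below evaluates a quality model against subjective MOS values using SciPy, with invented scores.

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

def qoe_metrics(mos, predicted):
    """Monotonicity (SROCC), linearity (PLCC) and accuracy (RMSE) of predicted quality."""
    mos, predicted = np.asarray(mos, float), np.asarray(predicted, float)
    srocc, _ = spearmanr(mos, predicted)
    plcc, _ = pearsonr(mos, predicted)
    rmse = np.sqrt(np.mean((mos - predicted) ** 2))
    return srocc, plcc, rmse

mos = [4.2, 3.8, 2.1, 1.5, 3.0, 4.6]     # subjective scores (illustrative)
pred = [4.0, 3.5, 2.4, 1.8, 3.2, 4.4]    # model outputs mapped to the MOS scale
print(qoe_metrics(mos, pred))
```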
Forecasting asthma-related hospital admissions in London using negative binomial models.
Soyiri, Ireneous N; Reidpath, Daniel D; Sarran, Christophe
2013-05-01
Health forecasting can improve health service provision and individual patient outcomes. Environmental factors are known to impact chronic respiratory conditions such as asthma, but little is known about the extent to which these factors can be used for forecasting. Using weather, air quality and hospital asthma admission data for London (2005-2006), two related negative binomial models were developed and compared with a naive seasonal model. In the first approach, predictive forecasting models were fitted with 7-day averages of each potential predictor, and a subsequent multivariable model was then constructed. In the second strategy, an exhaustive search for the best-fitting models among possible combinations of lags (0-14 days) of all the environmental effects on asthma admissions was conducted. Three models were considered: a base model (seasonal effects), contrasted with a 7-day average model and a selected-lags model (weather and air quality effects). Season is the best predictor of asthma admissions. The 7-day average and seasonal models were trivial to implement. The selected-lags model was computationally intensive, but of no real value over the much more easily implemented models. Seasonal factors can predict daily hospital asthma admissions in London, and there is little evidence that additional weather and air quality information would add to forecast accuracy.
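A negative binomial model of daily admission counts with 7-day averaged environmental predictors, as in the first strategy above, could be fitted along the following lines with statsmodels; the simulated data, the dispersion parameter, and the predictor set are placeholders, not those of the London study.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated daily series standing in for admissions, temperature and air quality
rng = np.random.default_rng(2)
n = 365
df = pd.DataFrame({"temp": rng.normal(12, 6, n), "pm10": rng.gamma(4, 6, n)})
df["admissions"] = rng.poisson(np.exp(1.2 - 0.03 * df["temp"] + 0.01 * df["pm10"]))

# 7-day rolling means of each predictor, then a negative binomial GLM on the counts
X = df[["temp", "pm10"]].rolling(7).mean().dropna()
y = df.loc[X.index, "admissions"]
model = sm.GLM(y, sm.add_constant(X),
               family=sm.families.NegativeBinomial(alpha=1.0)).fit()
print(model.params)
```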
Communication Efficacy and Couples’ Cancer Management: Applying a Dyadic Appraisal Model
Magsamen-Conrad, Kate; Checton, Maria G.; Venetis, Maria K.; Greene, Kathryn
2014-01-01
The purpose of the present study was to apply Berg and Upchurch’s (2007) developmental-conceptual model to understand better how couples cope with cancer. Specifically, we hypothesized a dyadic appraisal model in which proximal factors (relational quality), dyadic appraisal (prognosis uncertainty), and dyadic coping (communication efficacy) predicted adjustment (cancer management). The study was cross-sectional and included 83 dyads in which one partner had been diagnosed with and/or treated for cancer. For both patients and partners, multilevel analyses using the actor-partner interdependence model (APIM) indicated that proximal contextual factors predicted dyadic appraisal and dyadic coping. Dyadic appraisal predicted dyadic coping, which then predicted dyadic adjustment. Patients’ confidence in their ability to talk about the cancer predicted their own cancer management. Partners’ confidence predicted their own and the patient’s ability to cope with cancer, which then predicted patients’ perceptions of their general health. Implications and future research are discussed. PMID:25983382
Barton, Catherine A; Zarzecki, Charles J; Russell, Mark H
2010-04-01
This work assessed the usefulness of a current air quality model (American Meteorological Society/Environmental Protection Agency Regulatory Model [AERMOD]) for predicting air concentrations and deposition of perfluorooctanoate (PFO) near a manufacturing facility. Air quality models play an important role in providing information for verifying permitting conditions and for exposure assessment purposes. It is important to ensure traditional modeling approaches are applicable to perfluorinated compounds, which are known to have unusual properties. Measured field data were compared with modeling predictions to show that AERMOD adequately located the maximum air concentration in the study area, provided representative or conservative air concentration estimates, and demonstrated bias and scatter not significantly different than that reported for other compounds. Surface soil/grass concentrations resulting from modeled deposition flux also showed acceptable bias and scatter compared with measured concentrations of PFO in soil/grass samples. Errors in predictions of air concentrations or deposition may be best explained by meteorological input uncertainty and conservatism in the PRIME algorithm used to account for building downwash. In general, AERMOD was found to be a useful screening tool for modeling the dispersion and deposition of PFO in air near a manufacturing facility.
Predicting human olfactory perception from chemical features of odor molecules.
Keller, Andreas; Gerkin, Richard C; Guan, Yuanfang; Dhurandhar, Amit; Turu, Gabor; Szalai, Bence; Mainland, Joel D; Ihara, Yusuke; Yu, Chung Wen; Wolfinger, Russ; Vens, Celine; Schietgat, Leander; De Grave, Kurt; Norel, Raquel; Stolovitzky, Gustavo; Cecchi, Guillermo A; Vosshall, Leslie B; Meyer, Pablo
2017-02-24
It is still not possible to predict whether a given molecule will have a perceived odor or what olfactory percept it will produce. We therefore organized the crowd-sourced DREAM Olfaction Prediction Challenge. Using a large olfactory psychophysical data set, teams developed machine-learning algorithms to predict sensory attributes of molecules based on their chemoinformatic features. The resulting models accurately predicted odor intensity and pleasantness and also successfully predicted 8 among 19 rated semantic descriptors ("garlic," "fish," "sweet," "fruit," "burnt," "spices," "flower," and "sour"). Regularized linear models performed nearly as well as random forest-based ones, with a predictive accuracy that closely approaches a key theoretical limit. These models help to predict the perceptual qualities of virtually any molecule with high accuracy and also reverse-engineer the smell of a molecule. Copyright © 2017, American Association for the Advancement of Science.
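To make the comparison between regularized linear models and random forests concrete, the sketch below cross-validates both on synthetic descriptor data with scikit-learn; the feature matrix and target are invented stand-ins for chemoinformatic features and a rated attribute.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import cross_val_score

# Synthetic stand-in: molecular descriptors -> one rated attribute (e.g. pleasantness)
rng = np.random.default_rng(3)
X = rng.normal(size=(400, 300))
y = X[:, :10].sum(axis=1) + rng.normal(scale=0.5, size=400)

ridge = RidgeCV(alphas=np.logspace(-2, 3, 20))
forest = RandomForestRegressor(n_estimators=200, random_state=0)
for name, est in [("ridge", ridge), ("random forest", forest)]:
    r2 = cross_val_score(est, X, y, cv=5, scoring="r2").mean()
    print(f"{name}: mean CV R^2 = {r2:.2f}")
```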
CONFOLD2: improved contact-driven ab initio protein structure modeling.
Adhikari, Badri; Cheng, Jianlin
2018-01-25
Contact-guided protein structure prediction methods are becoming more and more successful because of the latest advances in residue-residue contact prediction. To support contact-driven structure prediction, effective tools that can quickly build tertiary structural models of good quality from predicted contacts need to be developed. We develop an improved contact-driven protein modelling method, CONFOLD2, and study how it may be effectively used for ab initio protein structure prediction with predicted contacts as input. It builds models using various subsets of the input contacts to explore the fold space under the guidance of a soft square energy function, and then clusters the models to obtain the top five models. CONFOLD2 obtains an average reconstruction accuracy of 0.57 TM-score for the 150 proteins in the PSICOV contact prediction dataset. When benchmarked on the CASP11 contacts predicted using CONSIP2 and the CASP12 contacts predicted using Raptor-X, CONFOLD2 achieves a mean TM-score of 0.41 on both datasets. CONFOLD2 can quickly generate the top five structural models for a protein sequence when its secondary structure and contact predictions are at hand. The source code of CONFOLD2 is publicly available at https://github.com/multicom-toolbox/CONFOLD2/ .
NASA Astrophysics Data System (ADS)
Hu, Jianlin; Li, Xun; Huang, Lin; Ying, Qi; Zhang, Qiang; Zhao, Bin; Wang, Shuxiao; Zhang, Hongliang
2017-11-01
Accurate exposure estimates are required for health effect analyses of severe air pollution in China. Chemical transport models (CTMs) are widely used to provide spatial distribution, chemical composition, particle size fractions, and source origins of air pollutants. The accuracy of air quality predictions in China is greatly affected by the uncertainties of emission inventories. The Community Multiscale Air Quality (CMAQ) model with meteorological inputs from the Weather Research and Forecasting (WRF) model were used in this study to simulate air pollutants in China in 2013. Four simulations were conducted with four different anthropogenic emission inventories, including the Multi-resolution Emission Inventory for China (MEIC), the Emission Inventory for China by School of Environment at Tsinghua University (SOE), the Emissions Database for Global Atmospheric Research (EDGAR), and the Regional Emission inventory in Asia version 2 (REAS2). Model performance of each simulation was evaluated against available observation data from 422 sites in 60 cities across China. Model predictions of O3 and PM2.5 generally meet the model performance criteria, but performance differences exist in different regions, for different pollutants, and among inventories. Ensemble predictions were calculated by linearly combining the results from different inventories to minimize the sum of the squared errors between the ensemble results and the observations in all cities. The ensemble concentrations show improved agreement with observations in most cities. The mean fractional bias (MFB) and mean fractional errors (MFEs) of the ensemble annual PM2.5 in the 60 cities are -0.11 and 0.24, respectively, which are better than the MFB (-0.25 to -0.16) and MFE (0.26-0.31) of individual simulations. The ensemble annual daily maximum 1 h O3 (O3-1h) concentrations are also improved, with mean normalized bias (MNB) of 0.03 and mean normalized errors (MNE) of 0.14, compared to MNB of 0.06-0.19 and MNE of 0.16-0.22 of the individual predictions. The ensemble predictions agree better with observations with daily, monthly, and annual averaging times in all regions of China for both PM2.5 and O3-1h. The study demonstrates that ensemble predictions from combining predictions from individual emission inventories can improve the accuracy of predicted temporal and spatial distributions of air pollutants. This study is the first ensemble model study in China using multiple emission inventories, and the results are publicly available for future health effect studies.
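One simple way to realize "linearly combining the results from different inventories to minimize the sum of the squared errors" is an ordinary least-squares fit of per-inventory predictions to the observations; the sketch below does this with NumPy on toy city-level values, which are not data from the study.

```python
import numpy as np

def ensemble_weights(preds, obs):
    """Least-squares weights combining per-inventory predictions into one ensemble.

    preds: (n_obs, n_models) prediction matrix; obs: (n_obs,) observations.
    """
    w, *_ = np.linalg.lstsq(preds, obs, rcond=None)
    return w

# Toy example: annual PM2.5 in six cities predicted with three emission inventories
preds = np.array([[55, 60, 48], [80, 95, 70], [34, 30, 40],
                  [62, 70, 58], [45, 50, 41], [90, 99, 85]], float)
obs = np.array([52, 78, 36, 60, 44, 88], float)
w = ensemble_weights(preds, obs)
print("weights:", np.round(w, 2))
print("ensemble:", np.round(preds @ w, 1), "observed:", obs)
```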
Application-Driven No-Reference Quality Assessment for Dermoscopy Images With Multiple Distortions.
Xie, Fengying; Lu, Yanan; Bovik, Alan C; Jiang, Zhiguo; Meng, Rusong
2016-06-01
Dermoscopy images often suffer from blur and uneven illumination distortions that occur during acquisition, which can adversely influence consequent automatic image analysis results on potential lesion objects. The purpose of this paper is to deploy an algorithm that can automatically assess the quality of dermoscopy images. Such an algorithm could be used to direct image recapture or correction. We describe an application-driven no-reference image quality assessment (IQA) model for dermoscopy images affected by possibly multiple distortions. For this purpose, we created a multiple distortion dataset of dermoscopy images impaired by varying degrees of blur and uneven illumination. The basis of this model is two single distortion IQA metrics that are sensitive to blur and uneven illumination, respectively. The outputs of these two metrics are combined to predict the quality of multiply distorted dermoscopy images using a fuzzy neural network. Unlike traditional IQA algorithms, which use human subjective score as ground truth, here ground truth is driven by the application, and generated according to the degree of influence of the distortions on lesion analysis. The experimental results reveal that the proposed model delivers accurate and stable quality prediction results for dermoscopy images impaired by multiple distortions. The proposed model is effective for quality assessment of multiple distorted dermoscopy images. An application-driven concept for IQA is introduced, and at the same time, a solution framework for the IQA of multiple distortions is proposed.
Predicting effects of environmental change on river inflows to ...
Estuarine river watersheds provide valued ecosystem services to their surrounding communities including drinking water, fish habitat, and regulation of estuarine water quality. However, the provisioning of these services can be affected by changes in the quantity and quality of river water, such as those caused by altered landscapes or shifting temperatures or precipitation. We used the ecohydrology model, VELMA, in the Trask River watershed to simulate the effects of environmental change scenarios on estuarine river inputs to Tillamook Bay (OR) estuary. The Trask River watershed is 453 km2 and contains extensive agriculture, silviculture, urban, and wetland areas. VELMA was parameterized using existing spatial datasets of elevation, soil type, land use, air temperature, precipitation, river flow, and water quality. Simulated land use change scenarios included alterations in the distribution of the nitrogen-fixing tree species Alnus rubra, and comparisons of varying timber harvest plans. Scenarios involving spatial and temporal shifts in air temperature and precipitation trends were also simulated. Our research demonstrates the utility of ecohydrology models such as VELMA to aid in watershed management decision-making. Model outputs of river water flow, temperature, and nutrient concentrations can be used to predict effects on drinking water quality, salmonid populations, and estuarine water quality. This modeling effort is part of a larger framework of
Titus S. Seilheimer; Patrick L. Zimmerman; Kirk M. Stueve; Charles H. Perry
2013-01-01
The Great Lakes watersheds have an important influence on the water quality of the nearshore environment, therefore, watershed characteristics can be used to predict what will be observed in the streams. We used novel landscape information describing the forest cover change, along with forest census data and established land cover data to predict total phosphorus and...
The Role of Minority Stressors in Lesbian Relationship Commitment and Persistence over Time.
Barrantes, Renzo J; Eaton, Asia A; Veldhuis, Cindy B; Hughes, Tonda L
2017-06-01
The Investment Model of relationship commitment uses interpersonal investment, relationship satisfaction, quality of alternatives, and commitment to predict relationship longevity (Rusbult, 1980, 1983). Although ample support for the Investment Model has been found in heterosexual couples, it appears to be less powerful in predicting stability in same-sex relationships (Beals, Impett, & Peplau, 2002), potentially because the model does not account for factors unique to same-sex relationships, such as anti-gay discrimination. However, no research has tested the nature and power of sexual minority stress factors in predicting same-sex relationship stability over time. Using secondary, longitudinal data collected from a diverse sample of lesbian women in relationships (N = 211), we examined how internalized homonegativity, sexual identity disclosure, and workplace discrimination affected the Investment Model antecedents of relationship persistence: satisfaction, quality of alternatives, and investment. We tested the influence of sexual minority stressors on Investment Model processes using structural equation modeling and found that sexual identity disclosure was positively associated with satisfaction and investment, internalized homonegativity was negatively associated only with satisfaction and investment, and workplace discrimination was negatively associated with alternatives. Moreover, both relationship satisfaction and investment influenced commitment, which predicted persistence in these relationships over about seven years' time, demonstrating support for the Investment Model. Our findings support the addition of sexual minority stress variables to the Investment Model when examining same-sex relationships, and implications are discussed.
Shim, Eun-Jung; Hahm, Bong-Jin; Go, Dong Jin; Lee, Kwang-Min; Noh, Hae Lim; Park, Seung-Hee; Song, Yeong Wook
2018-06-01
To examine factors in the fear-avoidance model, such as pain, pain catastrophizing, fear-avoidance beliefs, physical disability, and depression, and their relationships with physical and psychological quality of life in patients with rheumatic diseases. The data were obtained from 360 patients with rheumatic diseases who completed self-report measures assessing the study variables. Structural equation modeling was used to examine the hypothesized relationships among factors specified in the fear-avoidance model predicting physical and psychological quality of life. The final models fit the data well, explaining 96% and 82% of the variance in physical and psychological quality of life, respectively. Higher pain catastrophizing was related to stronger fear-avoidance beliefs, which had a direct negative association with physical disability and depression, which, in turn, negatively affected physical quality of life. Pain severity was also directly related to physical disability. Physical disability also affected physical quality of life indirectly through depression. The hypothesized relationships specified in the model were also confirmed for psychological quality of life; however, physical disability had an indirect association with psychological quality of life via depression. The current results underscore the significant role of cognitive, affective, and behavioral factors in perceived physical disability and their mediated detrimental effect on physical and psychological quality of life in patients with rheumatic diseases. Implications for rehabilitation: The fear-avoidance model is applicable to the prediction of quality of life in patients with rheumatic diseases. As pain catastrophizing and fear-avoidance beliefs are important factors linked to physical disability and depression, intervening on these cognitive factors is necessary to improve physical function and depression in patients with rheumatic diseases. Considering the strong association between depression and physical and psychological quality of life, the assessment and treatment of depression should be included in the rehabilitation of patients with rheumatic diseases. Interventions targeting physical function and depression are likely to be effective in terms of improving physical and psychological quality of life in patients with rheumatic diseases.
NASA Astrophysics Data System (ADS)
Carmichael, G. R.; Saide, P. E.; Gao, M.; Streets, D. G.; Kim, J.; Woo, J. H.
2017-12-01
Ambient aerosols are important air pollutants with direct impacts on human health and on the Earth's weather and climate systems through their interactions with radiation and clouds. Their role is dependent on their distributions of size, number, phase and composition, which vary significantly in space and time. There remain large uncertainties in simulated aerosol distributions due to uncertainties in emission estimates and in chemical and physical processes associated with their formation and removal. These uncertainties lead to large uncertainties in weather and air quality predictions and in estimates of health and climate change impacts. Despite these uncertainties and challenges, regional-scale coupled chemistry-meteorological models such as WRF-Chem have significant capabilities in predicting aerosol distributions and explaining aerosol-weather interactions. We explore the hypothesis that new advances in on-line, coupled atmospheric chemistry/meteorological models, and new emission inversion and data assimilation techniques applicable to such coupled models, can be applied in innovative ways using current and evolving observation systems to improve predictions of aerosol distributions at regional scales. We investigate the impacts of assimilating AOD from geostationary satellite (GOCI) and surface PM2.5 measurements on predictions of AOD and PM in Korea during KORUS-AQ through a series of experiments. The results suggest assimilating datasets from multiple platforms can improve the predictions of aerosol temporal and spatial distributions.
Experience-based quality control of clinical intensity-modulated radiotherapy planning.
Moore, Kevin L; Brame, R Scott; Low, Daniel A; Mutic, Sasa
2011-10-01
To incorporate a quality control tool, based on previous planning experience and patient-specific anatomic information, into the intensity-modulated radiotherapy (IMRT) plan generation process and to determine whether the tool improved treatment plan quality. A retrospective study of 42 IMRT plans demonstrated a correlation between the mean dose to an organ at risk (OAR) and the fraction of the OAR overlapping the planning target volume (PTV). This yielded a model, predicted dose = prescription dose x (0.2 + 0.8 x [1 - exp(-3 x overlap volume / OAR volume)]), that predicted the achievable mean OAR dose from the PTV overlap fraction and the prescription dose. The model was incorporated into the planning process by way of a user-executable script that reported the predicted dose for any OAR. The script was introduced to clinicians engaged in IMRT planning and deployed thereafter. The script's effect was evaluated by tracking δ = (mean dose - predicted dose)/predicted dose, the fraction by which the mean dose exceeded the model. All OARs under investigation (rectum and bladder in prostate cancer; parotid glands, esophagus, and larynx in head-and-neck cancer) exhibited both smaller δ and reduced variability after script implementation. These effects were substantial for the parotid glands, for which the previous δ = 0.28 ± 0.24 was reduced to δ = 0.13 ± 0.10. The clinical relevance was most evident in the subset of cases in which the parotid glands were potentially salvageable (predicted dose <30 Gy). Before script implementation, an average of 30.1 Gy was delivered in the salvageable cases, with an average predicted dose of 20.3 Gy. After implementation, an average of 18.7 Gy was delivered in salvageable cases, with an average predicted dose of 17.2 Gy. In the prostate cases, the rectum model excess was reduced from δ = 0.28 ± 0.20 to δ = 0.07 ± 0.15. On surveying dosimetrists at the end of the study, most reported that the script both improved their IMRT planning (8 of 10) and increased their efficiency (6 of 10). This tool proved successful in increasing normal tissue sparing and reducing interclinician variability, providing effective quality control of the IMRT plan development process. Copyright © 2011 Elsevier Inc. All rights reserved.
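The overlap-volume dose model and the δ statistic quoted in this abstract are simple enough to compute directly; the sketch below encodes them with hypothetical parotid numbers (prescription dose, overlap and OAR volumes) chosen only for illustration.

```python
import math

def predicted_mean_dose(prescription_dose, overlap_volume, oar_volume):
    """Achievable OAR mean dose from the overlap-based model quoted above:
    Dpred = Drx * (0.2 + 0.8 * (1 - exp(-3 * Voverlap / Voar)))."""
    fraction = overlap_volume / oar_volume
    return prescription_dose * (0.2 + 0.8 * (1.0 - math.exp(-3.0 * fraction)))

def model_excess(mean_dose, prescription_dose, overlap_volume, oar_volume):
    """delta = (mean dose - predicted dose) / predicted dose, the fraction by
    which a delivered plan exceeds the experience-based prediction."""
    dpred = predicted_mean_dose(prescription_dose, overlap_volume, oar_volume)
    return (mean_dose - dpred) / dpred

# Hypothetical parotid example: 70 Gy prescription, 15% of the gland in the PTV.
dpred = predicted_mean_dose(70.0, overlap_volume=7.5, oar_volume=50.0)
print(round(dpred, 1), round(model_excess(28.0, 70.0, 7.5, 50.0), 2))
```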
Dunlop, Boadie W; Hill, Eric; Johnson, Benjamin N; Klein, Daniel N; Gelenberg, Alan J; Rothbaum, Barbara O; Thase, Michael E; Kocsis, James H
2015-03-01
Sexual dysfunction is common among depressed adults. Childhood sexual abuse (CSA) and depressive symptomatology are among the risk factors for sexual dysfunction, and these factors may interact to predict adult relationship functioning. Several models have been developed postulating interactions between these variables. We tested models of the effects of CSA to elucidate the associations between CSA, sexual dysfunction, depression severity, anxiety, and relationship quality in chronically depressed adults. Baseline data from 808 chronically depressed outpatients enrolled in the Research Evaluating the Value of Augmenting Medication with Psychotherapy study were evaluated using structural equation modeling. The Inventory of Depressive Symptomatology, self-report version (IDS-SR) assessed depression severity, and the Mood and Anxiety Symptom Questionnaire Anxious Arousal subscale assessed anxiety. Sexual function was assessed with the Arizona Sexual Experiences Scale (ASEX), and the Quality of Marriage Index (QMI) assessed relationship quality for patients in stable relationships. CSA scores predicted depression severity on the IDS-SR, as well as lower relationship quality and sexual satisfaction. ASEX scores were significantly associated with depression severity but were not correlated with the QMI. Two models were evaluated to elucidate these associations, revealing that (i) depression severity and anxious arousal mediated the relationship between CSA and adult sexual function, (ii) anxious arousal and sexual functioning mediated the association between CSA and depression symptoms, and (iii) when these models were combined, anxious arousal emerged as the most important mediator of the effect of CSA on depression, which, in turn, mediated associations with adult sexual satisfaction and relationship quality. Although CSA predicts lower relationship and sexual satisfaction among depressed adults, the long-term effects of CSA appear to be mediated by depressive and anxious symptoms. It is important to address depression and anxiety symptoms when treating patients with CSA who present with sexual dysfunction or marital concerns. © 2014 International Society for Sexual Medicine.
Evaluating the Impact of Aerosols on Numerical Weather Prediction
NASA Astrophysics Data System (ADS)
Freitas, Saulo; Silva, Arlindo; Benedetti, Angela; Grell, Georg; Members, Wgne; Zarzur, Mauricio
2015-04-01
The Working Group on Numerical Experimentation (WMO, http://www.wmo.int/pages/about/sec/rescrosscut/resdept_wgne.html) has organized an exercise to evaluate the impact of aerosols on NWP. This exercise will involve regional and global models currently used for weather forecasting by operational centers worldwide and aims at addressing the following questions: a) How important are aerosols for predicting the physical system (NWP, seasonal, climate) as distinct from predicting the aerosols themselves? b) How important is atmospheric model quality for air quality forecasting? c) What are the current capabilities of NWP models to simulate aerosol impacts on weather prediction? Toward this goal we have selected 3 strong or persistent events of aerosol pollution worldwide that could be fairly represented in current NWP models and that allowed for an evaluation of the aerosol impact on weather prediction. The selected events include a strong dust storm that blew off the coast of Libya and over the Mediterranean, an extremely severe episode of air pollution in Beijing and surrounding areas, and an extreme case of biomass burning smoke in Brazil. The experimental design calls for simulations with and without explicitly accounting for aerosol feedbacks in the cloud and radiation parameterizations. In this presentation we will summarize the results of this study, focusing on the evaluation of model performance in terms of its ability to faithfully simulate aerosol optical depth, and the assessment of the aerosol impact on predictions of near-surface wind, temperature, humidity, rainfall and the surface energy budget.
A model of service quality perceptions and health care consumer behavior.
O'Connor, S J; Shewchuk, R M; Bowers, M R
1991-01-01
Analysis of covariance structures (LISREL) was used to examine the influence of consumer held perceptions of service quality on consumer satisfaction and intentions to return. Results indicate that service quality is a significant predictor of consumer satisfaction which, in turn, predicts intention to return. Health care marketing implications are discussed.
The Research of the Personality Qualities of Future Educational Psychologists
ERIC Educational Resources Information Center
Dolgova, V. I.; Salamatov, A. A.; Potapova, M. V.; Yakovleva, N. O.
2016-01-01
In this article, the authors substantiate the existence of the personality qualities of future educational psychologists (PQFEP) that are, in fact, a sum of knowledge, skills, abilities, socially required qualities of personality allowing the psychologist to solve problems in all the fields of professional activities. A model of PQFEP predicts the…
Remote Sensing Characterization of the Urban Landscape for Improvement of Air Quality Modeling
NASA Technical Reports Server (NTRS)
Quattrochi, Dale A.; Estes, Maurice G., Jr.; Khan, Maudood
2005-01-01
The urban landscape is inherently complex and this complexity is not adequately captured in air quality models, particularly the Community Multiscale Air Quality (CMAQ) model that is used to assess whether urban areas are in attainment of EPA air quality standards, primarily for ground level ozone. This inadequacy of the CMAQ model to sufficiently respond to the heterogeneous nature of the urban landscape can impact how well the model predicts ozone pollutant levels over metropolitan areas and ultimately, whether cities exceed EPA ozone air quality standards. We are exploring the utility of high-resolution remote sensing data and urban growth projections as improved inputs to the meteorology component of the CMAQ model, focusing on the Atlanta, Georgia metropolitan area as a case study. These growth projections include "business as usual" and "smart growth" scenarios out to 2030. The growth projections illustrate the effects of employing urban heat island mitigation strategies, such as increasing tree canopy and albedo across the Atlanta metro area, in moderating ground-level ozone and air temperature, compared to "business as usual" simulations in which heat island mitigation strategies are not applied. The National Land Cover Dataset at 30 m resolution is being used as the land use/land cover input and aggregated to the 4 km scale for the MM5 mesoscale meteorological model and the CMAQ modeling schemes. Use of these data has been found to better characterize low density urban development as compared with USGS 1 km land use/land cover data that have traditionally been used in modeling. Air quality prediction for future scenarios to 2030 is being facilitated by land use projections using a spatial growth model. Land use projections were developed using the 2030 Regional Transportation Plan developed by the Atlanta Regional Commission, the regional planning agency for the area. This allows the state Environmental Protection agency to evaluate how these transportation plans will affect future air quality.
Grieger, Jessica A; Johnson, Brittany J; Wycherley, Thomas P; Golley, Rebecca K
2017-05-01
Background: Dietary simulation modeling can predict dietary strategies that may improve nutritional or health outcomes. Objectives: The study aims were to undertake a systematic review of simulation studies that model dietary strategies aiming to improve nutritional intake, body weight, and related chronic disease, and to assess the methodologic and reporting quality of these models. Methods: The Preferred Reporting Items for Systematic Reviews and Meta-Analyses guided the search strategy with studies located through electronic searches [Cochrane Library, Ovid (MEDLINE and Embase), EBSCOhost (CINAHL), and Scopus]. Study findings were described and dietary modeling methodology and reporting quality were critiqued by using a set of quality criteria adapted for dietary modeling from general modeling guidelines. Results: Forty-five studies were included and categorized as modeling moderation, substitution, reformulation, or promotion dietary strategies. Moderation and reformulation strategies targeted individual nutrients or foods to theoretically improve one particular nutrient or health outcome, estimating small to modest improvements. Substituting unhealthy foods with healthier choices was estimated to be effective across a range of nutrients, including an estimated reduction in intake of saturated fatty acids, sodium, and added sugar. Promotion of fruits and vegetables predicted marginal changes in intake. Overall, the quality of the studies was moderate to high, with certain features of the quality criteria consistently reported. Conclusions: Based on the results of reviewed simulation dietary modeling studies, targeting a variety of foods rather than individual foods or nutrients theoretically appears most effective in estimating improvements in nutritional intake, particularly reducing intake of nutrients commonly consumed in excess. A combination of strategies could theoretically be used to deliver the best improvement in outcomes. Study quality was moderate to high. However, given the lack of dietary simulation reporting guidelines, future work could refine the quality tool to harmonize consistency in the reporting of subsequent dietary modeling studies. © 2017 American Society for Nutrition.
NASA Astrophysics Data System (ADS)
Ayoko, Godwin A.; Singh, Kirpal; Balerea, Steven; Kokot, Serge
2007-03-01
Physico-chemical properties of surface water and groundwater samples from some developing countries have been subjected to multivariate analyses by the non-parametric multi-criteria decision-making methods PROMETHEE and GAIA. Complete ranking information necessary to select one source of water in preference to all others was obtained, and this enabled relationships between the physico-chemical properties and water quality to be assessed. Thus, the ranking of the quality of the water bodies was found to be strongly dependent on the total dissolved solids, phosphate, sulfate, ammonia-nitrogen, calcium, iron, chloride, magnesium, zinc, nitrate and fluoride contents of the waters. However, potassium, manganese and zinc composition showed the least influence in differentiating the water bodies. To model and predict the water quality influencing parameters, partial least squares analyses were carried out on a matrix made up of the results of water quality assessment studies carried out in Nigeria, Papua New Guinea, Egypt, Thailand and India/Pakistan. The results showed that the total dissolved solids, calcium, sulfate, sodium and chloride contents can be used to predict a wide range of physico-chemical characteristics of water. The potential implications of these observations on the financial and opportunity costs associated with elaborate water quality monitoring are discussed.
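A minimal sketch of the partial least squares step described above, using synthetic data in place of the reported water-quality matrix; the choice of three latent components and the simple hold-out split are assumptions for the example.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)

# Hypothetical matrix: five predictor properties per sample (standing in for
# TDS, Ca, SO4, Na, Cl) and one response water-quality characteristic.
X = rng.uniform(0, 1, size=(60, 5))
y = 2.0 * X[:, 0] + 1.5 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(0, 0.05, 60)

# Fit PLS on the first 40 samples and check prediction on the remaining 20.
pls = PLSRegression(n_components=3)
pls.fit(X[:40], y[:40])
print("held-out R2:", round(r2_score(y[40:], pls.predict(X[40:]).ravel()), 3))
```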
Improved Fuzzy Modelling to Predict the Academic Performance of Distance Education Students
ERIC Educational Resources Information Center
Yildiz, Osman; Bal, Abdullah; Gulsecen, Sevinc
2013-01-01
It is essential to predict distance education students' year-end academic performance early during the course of the semester and to take precautions using such prediction-based information. This will, in particular, help enhance their academic performance and, therefore, improve the overall educational quality. The present study was on the…
NASA Technical Reports Server (NTRS)
Lawrence, Stella
1991-01-01
The object of this project was to develop and calibrate quantitative models for predicting the quality of software. Reliable flight and supporting ground software is a highly important factor in the successful operation of the space shuttle program. The models used in the present study were those in SMERFS (Statistical Modeling and Estimation of Reliability Functions for Software), which contains ten models. For a first run, modeling the cumulative number of failures versus execution time gave fairly good results for our data. Plots of cumulative software failures versus calendar weeks were made and the model results were compared with the historical data on the same graph. If a model agrees with actual historical behavior for a set of data, then there is confidence in future predictions for that data. Considering the quality of the data, the models have given some significant results, even at this early stage. With better care in data collection, data analysis, recording of the fixing of failures, and CPU execution times, the models should prove extremely helpful in making predictions regarding the future pattern of failures, including an estimate of the number of errors remaining in the software and the additional testing time required for the software quality to reach acceptable levels. It appears that there is no one 'best' model for all cases; it is for this reason that the aim of this project was to test several models. One of the recommendations resulting from this study is that great care must be taken in the collection of data. When using a model, the data should satisfy the model assumptions.
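As one concrete example of fitting a reliability growth model to cumulative failure data, the sketch below fits the Goel-Okumoto mean value function to synthetic weekly counts; whether this particular model is among the ten in SMERFS is not stated in the abstract, and the data are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

# Cumulative failures observed at the end of each calendar week (synthetic).
weeks = np.arange(1, 13)
cum_failures = np.array([4, 9, 13, 18, 21, 24, 26, 28, 29, 30, 31, 31])

def goel_okumoto(t, a, b):
    """Mean value function of the Goel-Okumoto NHPP model:
    expected cumulative failures a * (1 - exp(-b * t))."""
    return a * (1.0 - np.exp(-b * t))

(a_hat, b_hat), _ = curve_fit(goel_okumoto, weeks, cum_failures, p0=(40.0, 0.1))

# 'a' estimates the eventual total number of faults; the difference from the
# failures seen so far estimates the errors still remaining in the software.
remaining = a_hat - cum_failures[-1]
print(f"estimated total faults {a_hat:.1f}, remaining {remaining:.1f}")
print("predicted cumulative failures at week 20:",
      round(goel_okumoto(20, a_hat, b_hat), 1))
```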
Hanna, R. Blair; Campbell, Sharon G.
2000-01-01
This report describes the water quality model developed for the Klamath River System Impact Assessment Model (SIAM). The Klamath River SIAM is a decision support system developed by the authors and other US Geological Survey (USGS), Midcontinent Ecological Science Center staff to study the effects of basin-wide water management decisions on anadromous fish in the Klamath River. The Army Corps of Engineers' HEC5Q water quality modeling software was used to simulate water temperature, dissolved oxygen and conductivity in 100 miles of the Klamath River Basin in Oregon and California. The water quality model simulated three reservoirs and the mainstem Klamath River influenced by the Shasta and Scott River tributaries. Model development, calibration and two validation exercises are described as well as the integration of the water quality model into the SIAM decision support system software. Within SIAM, data are exchanged between the water quantity model (MODSIM), the water quality model (HEC5Q), the salmon population model (SALMOD) and methods for evaluating ecosystem health. The overall predictive ability of the water quality model is described in the context of calibration and validation error statistics. Applications of SIAM and the water quality model are described.
Stormwater quality modelling in combined sewers: calibration and uncertainty analysis.
Kanso, A; Chebbo, G; Tassin, B
2005-01-01
Estimating the level of uncertainty in urban stormwater quality models is vital for their utilization. This paper presents the results of applying a Markov chain Monte Carlo method based on Bayesian theory to the calibration and uncertainty analysis of a stormwater quality model commonly used in available software. The tested model uses a hydrologic/hydrodynamic scheme to estimate the accumulation, erosion and transport of pollutants on surfaces and in sewers. It was calibrated for four different initial conditions of in-sewer deposits. The calibration results showed large variability in the model's responses as a function of the initial conditions, and demonstrated that the model's predictive capacity is very low.
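A minimal sketch of MCMC calibration in the spirit described above, using a random-walk Metropolis sampler on a toy wash-off model; the model form, priors, noise level, and data are placeholders, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy wash-off model: event load = w0 * (1 - exp(-k * rain_intensity)), with
# synthetic "observed" loads for a handful of storm events.
rain = np.array([2.0, 5.0, 8.0, 12.0, 20.0])
obs = np.array([1.6, 3.3, 4.4, 5.1, 5.8])

def model(theta, rain):
    w0, k = theta
    return w0 * (1.0 - np.exp(-k * rain))

def log_posterior(theta, sigma=0.3):
    w0, k = theta
    if w0 <= 0 or k <= 0:          # flat priors restricted to positive values
        return -np.inf
    resid = obs - model(theta, rain)
    return -0.5 * np.sum((resid / sigma) ** 2)

# Random-walk Metropolis sampler.
theta = np.array([5.0, 0.1])
logp = log_posterior(theta)
chain = []
for _ in range(20000):
    proposal = theta + rng.normal(0, [0.2, 0.01])
    logp_new = log_posterior(proposal)
    if np.log(rng.uniform()) < logp_new - logp:
        theta, logp = proposal, logp_new
    chain.append(theta)

chain = np.array(chain[5000:])       # discard burn-in
print("posterior means:", chain.mean(axis=0))
print("posterior std devs:", chain.std(axis=0))
```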
Adjusted hospital death rates: a potential screen for quality of medical care.
Dubois, R W; Brook, R H; Rogers, W H
1987-09-01
Increased economic pressure on hospitals has accelerated the need to develop a screening tool for identifying hospitals that potentially provide poor quality care. Based upon data from 93 hospitals and 205,000 admissions, we used a multiple regression model to adjust each hospital's crude death rate. The adjustment process used age, origin of the patient from the emergency department or a nursing home, and a hospital case mix index based on DRGs (diagnosis related groups). Before adjustment, hospital death rates ranged from 0.3 to 5.8 per 100 admissions. After adjustment, hospital death ratios (actual death rate divided by predicted death rate) ranged from 0.36 to 1.36. Eleven hospitals (12 per cent) were identified where the actual death rate exceeded the predicted death rate by more than two standard deviations. In nine hospitals (10 per cent), the predicted death rate exceeded the actual death rate by a similar statistical margin. The 11 hospitals with higher than predicted death rates may provide inadequate quality of care or have uniquely ill patient populations. The adjusted death rate model needs to be validated and generalized before it can be used routinely to screen hospitals. However, the remaining large differences in observed versus predicted death rates lead us to believe that important differences in hospital performance may exist.
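A hedged sketch of this kind of screening: fit a patient-level risk model, compare each hospital's actual and predicted death rates, and flag hospitals more than two standard deviations above prediction. The logistic regression, the synthetic data, and the binomial standard error below are assumptions for the example rather than the paper's exact multiple regression procedure.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 5000

# Hypothetical admission-level data: age, emergency-department origin,
# nursing-home origin, case-mix index, hospital identifier, and outcome.
df = pd.DataFrame({
    "hospital": rng.integers(0, 20, n),
    "age": rng.normal(65, 12, n),
    "ed_origin": rng.integers(0, 2, n),
    "nh_origin": rng.integers(0, 2, n),
    "case_mix": rng.normal(1.0, 0.2, n),
})
logit = -6 + 0.05 * df.age + 0.5 * df.ed_origin + 0.7 * df.nh_origin + 1.0 * df.case_mix
df["died"] = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)

# Patient-level risk model used to predict each hospital's expected death rate.
X = sm.add_constant(df[["age", "ed_origin", "nh_origin", "case_mix"]])
risk_model = sm.Logit(df["died"], X).fit(disp=0)
df["p_death"] = risk_model.predict(X)

by_hosp = df.groupby("hospital").agg(actual=("died", "mean"),
                                     predicted=("p_death", "mean"))
by_hosp["ratio"] = by_hosp.actual / by_hosp.predicted

# Flag hospitals whose actual rate exceeds prediction by > 2 standard deviations
# of the binomial sampling error around the predicted rate.
counts = df.groupby("hospital").size()
se = np.sqrt(by_hosp.predicted * (1 - by_hosp.predicted) / counts)
by_hosp["z"] = (by_hosp.actual - by_hosp.predicted) / se
print(by_hosp[by_hosp.z > 2])
```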
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Ping; Song, Heda; Wang, Hong
Blast furnace (BF) ironmaking is a nonlinear dynamic process with complicated physical-chemical reactions, in which multi-phase and multi-field coupling and large time delays occur during operation. In BF operation, the molten iron temperature (MIT) as well as the Si, P and S contents of the molten iron are the most essential molten iron quality (MIQ) indices, whose measurement, modeling and control have always been important issues in metallurgical engineering and the automation field. This paper develops a novel data-driven nonlinear state space model for the prediction and control of multivariate MIQ indices by integrating hybrid modeling and control techniques. First, to improve modeling efficiency, a data-driven hybrid method combining canonical correlation analysis and correlation analysis is proposed to identify, from the multitude of factors that affect the MIQ indices, the most influential controllable variables as the modeling inputs. Then, a Hammerstein model for the prediction of MIQ indices is established using an LS-SVM based nonlinear subspace identification method. This model is further simplified by using a piecewise cubic Hermite interpolating polynomial to fit the complex nonlinear kernel function. Compared to the original Hammerstein model, the simplified model can not only significantly reduce the computational complexity but also retains almost the same reliability and accuracy for stable prediction of the MIQ indices. Last, to verify the practicability of the developed model, it is applied in designing a genetic algorithm based nonlinear predictive controller for the multivariate MIQ indices by directly taking the established model as the predictor. Industrial experiments show the advantages and effectiveness of the proposed approach.
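The sketch below identifies a toy Hammerstein system (static nonlinearity followed by linear dynamics) with an over-parameterized polynomial basis and ordinary least squares; it is a simplified stand-in for the LS-SVM based nonlinear subspace identification described in the abstract, and all signals are simulated.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic SISO Hammerstein system: static nonlinearity f(u) = u + 0.5*u**2
# followed by first-order linear dynamics y[t] = 0.8*y[t-1] + 0.3*f(u[t-1]).
u = rng.uniform(-1, 1, 400)
y = np.zeros_like(u)
for t in range(1, len(u)):
    y[t] = 0.8 * y[t - 1] + 0.3 * (u[t - 1] + 0.5 * u[t - 1] ** 2) + rng.normal(0, 0.01)

# Over-parameterized least squares: regress y[t] on y[t-1] and a polynomial
# basis of u[t-1]; the unknown input nonlinearity is absorbed by the basis weights.
Phi = np.column_stack([y[:-1], u[:-1], u[:-1] ** 2, u[:-1] ** 3])
theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
print("estimated [a, b1, b2, b3]:", np.round(theta, 3))

# One-step-ahead prediction with the identified model.
y_hat = Phi @ theta
rmse = np.sqrt(np.mean((y[1:] - y_hat) ** 2))
print("one-step RMSE:", round(rmse, 4))
```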
NASA Astrophysics Data System (ADS)
Jia, Huizhen; Sun, Quansen; Ji, Zexuan; Wang, Tonghan; Chen, Qiang
2014-11-01
The goal of no-reference/blind image quality assessment (NR-IQA) is to devise a perceptual model that can accurately predict the quality of a distorted image in line with human opinions, in which feature extraction is an important issue. However, the features used in the state-of-the-art "general purpose" NR-IQA algorithms are usually natural scene statistics (NSS) based or perceptually relevant, so the performance of these models is limited. To further improve the performance of NR-IQA, we propose a general purpose NR-IQA algorithm which combines NSS-based features with perceptually relevant features. The new method extracts features in both the spatial and gradient domains. In the spatial domain, we extract the point-wise statistics of single pixel values, which are characterized by a generalized Gaussian distribution model to form the underlying features. In the gradient domain, statistical features based on neighboring gradient magnitude similarity are extracted. Then a mapping is learned to predict quality scores using support vector regression. The experimental results on benchmark image databases demonstrate that the proposed algorithm correlates highly with human judgments of quality and leads to significant performance improvements over state-of-the-art methods.
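Two of the feature families mentioned above can be sketched compactly: a moment-matching estimate of the generalized Gaussian shape parameter for pixel statistics and simple gradient-magnitude statistics, fed to a support vector regressor. The feature set, image data, and SVR settings below are illustrative assumptions, not the paper's full feature design.

```python
import numpy as np
from scipy.special import gamma
from sklearn.svm import SVR

def ggd_shape(x):
    """Moment-matching estimate of the generalized Gaussian shape parameter
    for zero-mean data x (a standard estimator in NSS-based IQA features)."""
    x = x - x.mean()
    rho = x.std() ** 2 / (np.mean(np.abs(x)) ** 2 + 1e-12)
    alphas = np.arange(0.2, 10.0, 0.001)
    candidates = gamma(1.0 / alphas) * gamma(3.0 / alphas) / gamma(2.0 / alphas) ** 2
    return alphas[np.argmin(np.abs(candidates - rho))]

def gradient_features(img):
    """Simple gradient-domain statistics: mean and std of gradient magnitude."""
    gx, gy = np.gradient(img.astype(float))
    gm = np.sqrt(gx ** 2 + gy ** 2)
    return gm.mean(), gm.std()

def features(img):
    return np.array([ggd_shape(img.ravel()), *gradient_features(img)])

# Hypothetical training data: feature vectors of distorted images vs. scores.
rng = np.random.default_rng(5)
imgs = [rng.normal(0.5, 0.1 + 0.1 * i, (32, 32)) for i in range(30)]
scores = np.linspace(80, 20, 30) + rng.normal(0, 2, 30)

X = np.vstack([features(im) for im in imgs])
svr = SVR(kernel="rbf", C=10.0).fit(X, scores)
print("predicted quality:", round(float(svr.predict(X[:1])[0]), 1))
```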
Nemes, Szilard; Rolfson, Ola; Garellick, Göran
2018-02-01
Clinicians considering improvements in health-related quality of life (HRQoL) after total hip replacement (THR) must account for multiple pieces of information. Evidence-based decisions are important to best assess the effect of THR on HRQoL. This work aims at constructing a shared decision-making tool that helps clinicians assess the future benefits of THR by offering predictions of the 1-year postoperative HRQoL of THR patients. We used data from the Swedish Hip Arthroplasty Register. Data from 2008 were used as the training set and data from 2009 to 2012 as the validation set. We adopted two approaches. First, we assumed a continuous distribution for the EQ-5D index and modelled the postoperative EQ-5D index with regression models. Second, we modelled the five dimensions of the EQ-5D and weighted together the predictions using the UK Time Trade-Off value set. As predictors, we used the preoperative EQ-5D dimensions and EQ-5D index, EQ visual analogue scale, visual analogue scale pain, Charnley classification, age, gender, body mass index, American Society of Anesthesiologists classification, surgical approach and prosthesis type. Additionally, the tested algorithms were combined into a single predictive tool by stacking. The best predictive power was obtained by multivariate adaptive regression splines (R^2 = 0.158). However, this was not significantly better than the predictive power of linear regressions (R^2 = 0.157). The stacked model had a predictive power of 17%. Successful implementation of a shared decision-making tool that can aid clinicians and patients in understanding expected improvement in HRQoL following THR would require higher predictive power than we achieved. For a shared decision-making tool to succeed, further variables, such as socioeconomics, need to be considered. © 2016 John Wiley & Sons, Ltd.
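A minimal sketch of the stacking idea described above, combining two regression models through a linear meta-learner on synthetic predictor and EQ-5D-like outcome data; the base learners, meta-learner, and data are stand-ins, not the register-based models of the study.

```python
import numpy as np
from sklearn.ensemble import StackingRegressor
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(6)

# Hypothetical predictors (e.g. preoperative EQ-5D index, EQ VAS, pain VAS,
# age, BMI) and a 1-year postoperative EQ-5D-like response; all synthetic.
X = rng.uniform(0, 1, size=(500, 5))
y = 0.4 + 0.3 * X[:, 0] - 0.1 * X[:, 2] + rng.normal(0, 0.15, 500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Stack a plain linear regression and a ridge regression, combined by a
# final linear meta-learner (a simplified stand-in for the paper's ensemble).
stack = StackingRegressor(
    estimators=[("ols", LinearRegression()), ("ridge", Ridge(alpha=1.0))],
    final_estimator=LinearRegression(),
)
stack.fit(X_train, y_train)
print("held-out R2:", round(r2_score(y_test, stack.predict(X_test)), 3))
```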
On the use and the performance of software reliability growth models
NASA Technical Reports Server (NTRS)
Keiller, Peter A.; Miller, Douglas R.
1991-01-01
We address the problem of predicting future failures for a piece of software. The number of failures occurring during a finite future time interval is predicted from the number of failures observed during an initial period of usage, using software reliability growth models. Two different methods for using the models are considered: straightforward use of individual models, and dynamic selection among models based on goodness-of-fit and quality-of-prediction criteria. Performance is judged by the error of the predicted number of failures over future finite time intervals, relative to the number of failures eventually observed during those intervals. Six of the former models and eight of the latter are evaluated, based on their performance on twenty data sets. Many open questions remain regarding the use and the performance of software reliability growth models.
This study presents the first evaluation of the performance of the Eta-CMAQ air quality forecast model to predict a variety of widely used seasonal mean and cumulative O3 exposure indices associated with vegetation using the U.S. AIRNow O3 observations.
Influence of rainfall and catchment characteristics on urban stormwater quality.
Liu, An; Egodawatta, Prasanna; Guan, Yuntao; Goonetilleke, Ashantha
2013-02-01
The accuracy and reliability of urban stormwater quality modelling outcomes are important for stormwater management decision making. The commonly adopted approach, where only a limited number of factors are used to predict urban stormwater quality, may not adequately represent the complexity of the quality response to a rainfall event or site-to-site differences to support efficient treatment design. This paper discusses an investigation into the influence of rainfall and catchment characteristics on urban stormwater quality in order to identify potential areas for errors in current stormwater quality modelling practices. It was found that the influence of rainfall characteristics on pollutant wash-off is step-wise, based on specific thresholds. This means that a modelling approach where the wash-off process is predicted as a continuous function of rainfall intensity and duration is not appropriate. Additionally, other than the conventional catchment characteristics, namely land use and impervious surface fraction, other catchment characteristics such as impervious area layout, urban form and site-specific characteristics have an important influence on both pollutant build-up and wash-off processes. Finally, the use of solids as a surrogate to estimate other pollutant species was found to be inappropriate. Individually considering build-up and wash-off processes for each pollutant species should be the preferred option. Copyright © 2012 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Cisneros, Felipe; Veintimilla, Jaime
2013-04-01
The main aim of this research is to create an Artificial Neural Network (ANN) model that allows predicting the flow in the Tomebamba River, both in real time and for a given day of the year. As inputs, we use rainfall and flow data from the stations along the river. This information is organized into scenarios, and each scenario is prepared for a specific area. The information is acquired from the hydrological stations placed in the watershed using a real-time electronic data acquisition system that supports any kind or brand of such sensors. The predictions work well up to three days in advance. This research includes two ANN models: back propagation and a hybrid model combining back propagation and OWO-HWO; both have been tested in a preliminary study. To validate the results we use error indicators such as MSE, RMSE, EF, CD and BIAS. The predictions reached high levels of reliability and the levels of error are minimal. These predictions are useful for flood and water quality control and management in the city of Cuenca, Ecuador.
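The validation indicators listed above (MSE, RMSE, EF, CD and BIAS) can be computed directly from observed and simulated series; the definitions below follow common hydrological conventions (EF as Nash-Sutcliffe efficiency, CD as the ratio of observed to simulated variability about the observed mean, BIAS as the mean residual) and may differ in detail from those used in the study.

```python
import numpy as np

def error_indicators(obs, sim):
    """Common goodness-of-fit indicators used to validate flow predictions."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    resid = obs - sim
    mse = np.mean(resid ** 2)
    rmse = np.sqrt(mse)
    # Nash-Sutcliffe model efficiency.
    ef = 1.0 - np.sum(resid ** 2) / np.sum((obs - obs.mean()) ** 2)
    # Coefficient of determination in its model-validation form.
    cd = np.sum((obs - obs.mean()) ** 2) / np.sum((sim - obs.mean()) ** 2)
    bias = np.mean(sim - obs)
    return {"MSE": mse, "RMSE": rmse, "EF": ef, "CD": cd, "BIAS": bias}

# Hypothetical observed vs. ANN-predicted daily flows (m3/s).
observed = [12.1, 15.4, 30.2, 22.8, 18.0, 16.5]
predicted = [11.5, 16.0, 27.9, 23.5, 18.8, 15.9]
print(error_indicators(observed, predicted))
```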
Passenger ride quality determined from commercial airline flights
NASA Technical Reports Server (NTRS)
Richards, L. G.; Kuhlthau, A. R.; Jacobson, I. D.
1975-01-01
The University of Virginia ride-quality research program is reviewed. Data from two flight programs, involving seven types of aircraft, are considered in detail. An apparatus for measuring physical variations in the flight environment and recording the subjective reactions of test subjects is described. Models are presented for predicting the comfort response of test subjects from the physical data, and predicting the overall comfort reaction of test subjects from their moment by moment responses. The correspondence of mean passenger comfort judgments and test subject response is shown. Finally, the models of comfort response based on data from the 5-point and 7-point comfort scales are shown to correspond.
Mamdani-Fuzzy Modeling Approach for Quality Prediction of Non-Linear Laser Lathing Process
NASA Astrophysics Data System (ADS)
Sivaraos; Khalim, A. Z.; Salleh, M. S.; Sivakumar, D.; Kadirgama, K.
2018-03-01
Lathing is a process for fashioning stock materials into desired cylindrical shapes, usually performed by a traditional lathe machine. However, recent rapid advancements in engineering materials and precision demands pose a great challenge to the traditional method. The main drawback of a conventional lathe is its mechanical contact, which leads to undesirable tool wear, a heat affected zone, and poor finishing and dimensional accuracy, especially taper quality, in machining of stock with a high length to diameter ratio. Therefore, a novel approach has been devised to investigate transforming a 2D flatbed CO2 laser cutting machine into a 3D laser lathing capability as an alternative solution. Three significant design parameters were selected for this experiment, namely cutting speed, spinning speed, and depth of cut. A total of 24 experiments were performed, consisting of eight (8) sequential runs replicated three (3) times. The experimental results were then used to establish a Mamdani-Fuzzy predictive model, which yields an accuracy of more than 95%. Thus, the proposed Mamdani-Fuzzy modelling approach is found to be suitable and practical for quality prediction of the non-linear laser lathing process for cylindrical stocks of 10 mm diameter.
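A toy Mamdani inference step, with triangular membership functions, min implication, max aggregation, and centroid defuzzification; the rule base, universes, and breakpoints below are invented for illustration and are not the model developed in the paper.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12),
                                 (c - x) / (c - b + 1e-12)), 0.0)

# Output universe: taper quality score on a hypothetical 0-10 scale.
q = np.linspace(0, 10, 501)
quality_sets = {"poor": tri(q, 0, 0, 5), "good": tri(q, 3, 7, 10)}

def mamdani_predict(cutting_speed, depth_of_cut):
    """Two illustrative rules with min implication, max aggregation and
    centroid defuzzification (a toy stand-in for the paper's rule base)."""
    # Input membership degrees (universes and breakpoints are hypothetical).
    speed_low = tri(cutting_speed, 0, 0, 50)
    speed_high = tri(cutting_speed, 30, 100, 100)
    depth_small = tri(depth_of_cut, 0, 0, 1.0)
    depth_large = tri(depth_of_cut, 0.5, 2.0, 2.0)

    # Rule 1: IF speed is high AND depth is small THEN quality is good.
    # Rule 2: IF speed is low  AND depth is large THEN quality is poor.
    w_good = min(speed_high, depth_small)
    w_poor = min(speed_low, depth_large)

    aggregated = np.maximum(np.minimum(w_good, quality_sets["good"]),
                            np.minimum(w_poor, quality_sets["poor"]))
    return np.sum(q * aggregated) / (np.sum(aggregated) + 1e-12)

print(round(mamdani_predict(cutting_speed=80, depth_of_cut=0.4), 2))
```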
Improved prediction of antibody VL–VH orientation
Marze, Nicholas A.; Lyskov, Sergey; Gray, Jeffrey J.
2016-01-01
Antibodies are important immune molecules with high commercial value and therapeutic interest because of their ability to bind diverse antigens. Computational prediction of antibody structure can quickly reveal valuable information about the nature of these antigen-binding interactions, but only if the models are of sufficient quality. To achieve high model quality during complementarity-determining region (CDR) structural prediction, one must account for the VL–VH orientation. We developed a novel four-metric VL–VH orientation coordinate frame. Additionally, we extended the CDR grafting protocol in RosettaAntibody with a new method that diversifies VL–VH orientation by using 10 VL–VH orientation templates rather than a single one. We tested the multiple-template grafting protocol on two datasets of known antibody crystal structures. During the template-grafting phase, the new protocol improved the fraction of accurate VL–VH orientation predictions from only 26% (12/46) to 72% (33/46) of targets. After the full RosettaAntibody protocol, including CDR H3 remodeling and VL–VH re-orientation, the new protocol produced more candidate structures with accurate VL–VH orientation than the standard protocol in 43/46 targets (93%). The improved ability to predict VL–VH orientation will bolster predictions of other parts of the paratope, including the conformation of CDR H3, a grand challenge of antibody homology modeling. PMID:27276984
Bayesian Revision of Residual Detection Power
NASA Technical Reports Server (NTRS)
DeLoach, Richard
2013-01-01
This paper addresses some issues with quality assessment and quality assurance in response surface modeling experiments executed in wind tunnels. The role of data volume in quality assurance for response surface models is reviewed. Specific wind tunnel response surface modeling experiments are considered for which apparent discrepancies exist between fit quality expectations based on implemented quality assurance tactics, and the actual fit quality achieved in those experiments. These discrepancies are resolved by using Bayesian inference to account for certain imperfections in the assessment methodology. Estimates of the fraction of out-of-tolerance model predictions based on traditional frequentist methods are revised to account for uncertainty in the residual assessment process. The number of sites in the design space for which residuals are out of tolerance is seen to exceed the number of sites where the model actually fails to fit the data. A method is presented to estimate how much of the design space is inadequately modeled by low-order polynomial approximations to the true but unknown underlying response function.
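One simple way to account for an imperfect out-of-tolerance check is the classical misclassification correction combined with Bayes' rule, sketched below with hypothetical flagging rates; this is an illustration of the general idea of revising the estimated failure fraction, not the paper's specific Bayesian formulation.

```python
def revised_failure_fraction(p_flagged, p_flag_given_fail, p_flag_given_ok):
    """Correct an observed fraction of out-of-tolerance residuals for an
    imperfect assessment: p_flagged = P(flag|fail)*p + P(flag|ok)*(1-p).
    Solving for p gives a revised estimate of true model inadequacy."""
    p = (p_flagged - p_flag_given_ok) / (p_flag_given_fail - p_flag_given_ok)
    return min(max(p, 0.0), 1.0)

def prob_fail_given_flag(p_fail, p_flag_given_fail, p_flag_given_ok):
    """Bayes' rule: probability that a flagged design-space site is truly
    inadequately modeled rather than a false alarm of the tolerance check."""
    num = p_flag_given_fail * p_fail
    den = num + p_flag_given_ok * (1.0 - p_fail)
    return num / den

# Hypothetical numbers: 8% of residuals flagged out of tolerance, a check that
# flags 95% of genuine misfits but also 5% of adequately modeled sites.
p_true = revised_failure_fraction(0.08, 0.95, 0.05)
print(round(p_true, 3), round(prob_fail_given_flag(p_true, 0.95, 0.05), 3))
```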
TESTS OF INDOOR AIR QUALITY SINKS
Experiments were conducted in a room-size test chamber to determine the sink effects of selected materials on indoor air concentrations of p-dichlorobenzene (PDCB). These effects might alter pollutant behavior from that predicted using similar indoor air quality models, by reducin...
Inverse modeling with RZWQM2 to predict water quality
USDA-ARS?s Scientific Manuscript database
Agricultural systems models such as RZWQM2 are complex and have numerous parameters that are unknown and difficult to estimate. Inverse modeling provides an objective statistical basis for calibration that involves simultaneous adjustment of model parameters and yields parameter confidence intervals...
DOT National Transportation Integrated Search
1982-01-01
This report presents the user instructions and data requirements for SIMCO, a combined simulation and probability computer model developed to quantify and evaluate carbon monoxide in roadside environments. The model permits direct determinations of t...
Utility of distributed hydrologic and water quality models for watershed management and sustainability studies should be accompanied by rigorous model uncertainty analysis. However, the use of complex watershed models primarily follows the traditional {calibrate/validate/predict}...
Harris, Jenny; Cornelius, Victoria; Ream, Emma; Cheevers, Katy; Armes, Jo
2017-07-01
The purpose of this review was to identify potential candidate predictors of anxiety in women with early-stage breast cancer (BC) after adjuvant treatments and evaluate methodological development of existing multivariable models to inform the future development of a predictive risk stratification model (PRSM). Databases (MEDLINE, Web of Science, CINAHL, CENTRAL and PsycINFO) were searched from inception to November 2015. Eligible studies were prospective, recruited women with stage 0-3 BC, used a validated anxiety outcome ≥3 months post-treatment completion and used multivariable prediction models. Internationally accepted quality standards were used to assess predictive risk of bias and strength of evidence. Seven studies were identified: five were observational cohorts and two secondary analyses of RCTs. Variability of measurement and selective reporting precluded meta-analysis. Twenty-one candidate predictors were identified in total. Younger age and previous mental health problems were identified as risk factors in ≥3 studies. Clinical variables (e.g. treatment, tumour grade) were not identified as predictors in any studies. No studies adhered to all quality standards. Pre-existing vulnerability to mental health problems and younger age increased the risk of anxiety after completion of treatment for BC survivors, but there was no evidence that chemotherapy was a predictor. Multiple predictors were identified but many lacked reproducibility or were not measured across studies, and inadequate reporting did not allow full evaluation of the multivariable models. The use of quality standards in the development of PRSM within supportive cancer care would improve model quality and performance, thereby allowing professionals to better target support for patients.
NASA Astrophysics Data System (ADS)
Smith, R. A.; Alexander, R. B.; Schwarz, G. E.
2003-12-01
Determining the effects of land use change (e.g. urbanization, deforestation) on water quality at large spatial scales has been difficult because water quality measurements in large rivers with heterogeneous basins show the integrated effects of multiple factors. Moreover, the observed effects of land use changes on water quality in small homogeneous stream basins may not be indicative of downstream effects (including effects on such ecologically relevant characteristics as nutrient levels and elemental ratios) because of loss processes occurring during downstream transport in river channels. In this study we used the USGS SPARROW (Spatially-Referenced Regression on Watersheds) models of total nitrogen (TN) and total phosphorus (TP) in streams and rivers of the conterminous US to examine the effects of various aspects of land use change on nutrient concentrations and flux from the pre-development era to the present. The models were calibrated with data from 370 long-term monitoring stations representing a wide range of basin sizes, land use/cover classes, climates, and physiographies. The non-linear formulation for each model includes 20+ statistically estimated parameters relating to land use/cover characteristics and other environmental variables such as temperature, soil conditions, hill slope, and the hydraulic characteristics of 2200 large lakes and reservoirs. Model predictions are available for 62,000 river/stream channel nodes. Model predictions of pre-development water quality compare favorably with nutrient data from 63 undeveloped (reference) sites. Error statistics are available for predictions at all nodes. Model simulations were chosen to compare the effects of selected aspects of land use change on nutrient levels at large and small basin scales, lacustrine and coastal receiving waters, and among the major US geographic regions.
Martínez Vega, Mabel V; Sharifzadeh, Sara; Wulfsohn, Dvoralai; Skov, Thomas; Clemmensen, Line Harder; Toldam-Andersen, Torben B
2013-12-01
Visible-near infrared spectroscopy remains a method of increasing interest as a fast alternative for the evaluation of fruit quality. The success of the method is assumed to be achieved by using large sets of samples to produce robust calibration models. In this study we used representative samples of an early and a late season apple cultivar to evaluate model robustness (in terms of prediction ability and error) for soluble solids content (SSC) and acidity prediction, in the wavelength range 400-1100 nm. A total of 196 middle-early season and 219 late season apples (Malus domestica Borkh.) of the cultivars 'Aroma' and 'Holsteiner Cox' were used to construct spectral models for SSC and acidity. Partial least squares (PLS), ridge regression (RR) and elastic net (EN) models were used to build prediction models. Furthermore, we compared three sub-sample arrangements for forming training and test sets ('smooth fractionator', by date of measurement after harvest, and random). Using the 'smooth fractionator' sampling method, fewer spectral bands (26) and elastic net resulted in improved performance for SSC models of 'Aroma' apples, with a coefficient of variation CV_SSC = 13%. The model showed consistently low errors and bias (PLS/EN: R^2_cal = 0.60/0.60; SEC = 0.88/0.88 °Brix; Bias_cal = 0.00/0.00; R^2_val = 0.33/0.44; SEP = 1.14/1.03; Bias_val = 0.04/0.03). However, the prediction of acidity and of SSC (CV = 5%) for the late cultivar 'Holsteiner Cox' produced inferior results compared with 'Aroma'. It was possible to construct local SSC and acidity calibration models for early season apple cultivars with CVs of SSC and acidity around 10%. The overall model performance of these data sets also depends on the proper selection of training and test sets. The 'smooth fractionator' protocol provided an objective method for obtaining training and test sets that capture the existing variability of the fruit samples for construction of visible-NIR prediction models. The implication is that by using such 'efficient' sampling methods for obtaining an initial sample of fruit that represents the variability of the population, and for sub-sampling to form training and test sets, it should be possible to use relatively small sample sizes to develop spectral predictions of fruit quality. Using feature selection and elastic net appears to improve the SSC model performance in terms of R^2, RMSECV and RMSEP for 'Aroma' apples. © 2013 Society of Chemical Industry.
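A minimal elastic net sketch on synthetic spectra, where the L1 penalty performs the kind of wavelength selection the abstract refers to; the spectra, the calibration/validation split, and the cross-validated regularization settings are assumptions for the example.

```python
import numpy as np
from sklearn.linear_model import ElasticNetCV
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(7)

# Synthetic stand-in for visible-NIR spectra (400-1100 nm) and soluble solids
# content: 200 apples x 350 wavelength bands, with a few informative bands.
X = rng.normal(0, 1, size=(200, 350))
y = 12 + 0.8 * X[:, 50] + 0.5 * X[:, 120] - 0.4 * X[:, 300] + rng.normal(0, 0.3, 200)

X_cal, X_val, y_cal, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

# Elastic net with cross-validated regularization; the L1 part performs the
# wavelength (feature) selection.
enet = ElasticNetCV(l1_ratio=0.5, cv=5, max_iter=10000).fit(X_cal, y_cal)
sep = np.sqrt(mean_squared_error(y_val, enet.predict(X_val)))
print("selected bands:", int(np.sum(enet.coef_ != 0)), " SEP:", round(sep, 2))
```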
A changing climate: impacts on human exposures to O3 using ...
Predicting the impacts of changing climate on human exposure to air pollution requires future scenarios that account for changes in ambient pollutant concentrations, population sizes and distributions, and housing stocks. An integrated methodology to model changes in human exposures due to these impacts was developed by linking climate, air quality, land-use, and human exposure models. This methodology was then applied to characterize changes in predicted human exposures to O3 under multiple future scenarios. Regional climate projections for the U.S. were developed by downscaling global circulation model (GCM) scenarios for three of the Intergovernmental Panel on Climate Change’s (IPCC’s) Representative Concentration Pathways (RCPs) using the Weather Research and Forecasting (WRF) model. The regional climate results were in turn used to generate air quality (concentration) projections using the Community Multiscale Air Quality (CMAQ) model. For each of the climate change scenarios, future U.S. census-tract level population distributions from the Integrated Climate and Land Use Scenarios (ICLUS) model for four future scenarios based on the IPCC’s Special Report on Emissions Scenarios (SRES) storylines were used. These climate, air quality, and population projections were used as inputs to EPA’s Air Pollutants Exposure (APEX) model for 12 U.S. cities. Probability density functions show changes in the population distribution of 8 h maximum daily O3 exposur
NASA Astrophysics Data System (ADS)
Schliep, E. M.; Gelfand, A. E.; Holland, D. M.
2015-12-01
There is considerable demand for accurate air quality information in human health analyses. The sparsity of ground monitoring stations across the United States motivates the need for advanced statistical models to predict air quality metrics, such as PM2.5, at unobserved sites. Remote sensing technologies have the potential to expand our knowledge of PM2.5 spatial patterns beyond what we can predict from current PM2.5 monitoring networks. Data from satellites have an additional advantage in not requiring extensive emission inventories necessary for most atmospheric models that have been used in earlier data fusion models for air pollution. Statistical models combining monitoring station data with satellite-obtained aerosol optical thickness (AOT), also referred to as aerosol optical depth (AOD), have been proposed in the literature with varying levels of success in predicting PM2.5. The benefit of using AOT is that satellites provide complete gridded spatial coverage. However, the challenges involved with using it in fusion models are (1) the correlation between the two data sources varies both in time and in space, (2) the data sources are temporally and spatially misaligned, and (3) there is extensive missingness in the monitoring data and also in the satellite data due to cloud cover. We propose a hierarchical autoregressive spatially varying coefficients model to jointly model the two data sources, which addresses the foregoing challenges. Additionally, we offer formal model comparison for competing models in terms of model fit and out of sample prediction of PM2.5. The models are applied to daily observations of PM2.5 and AOT in the summer months of 2013 across the conterminous United States. Most notably, during this time period, we find small in-sample improvement incorporating AOT into our autoregressive model but little out-of-sample predictive improvement.
Data governance in predictive toxicology: A review.
Fu, Xin; Wojak, Anna; Neagu, Daniel; Ridley, Mick; Travis, Kim
2011-07-13
Due to recent advances in data storage and sharing for further data processing in predictive toxicology, there is an increasing need for flexible data representations, secure and consistent data curation, and automated data quality checking. Toxicity prediction involves multidisciplinary data. There are hundreds of collections of chemical, biological and toxicological data that are widely dispersed, mostly in the open literature, professional research bodies and commercial companies. In order to better manage and make full use of such a large amount of toxicity data, there is a trend to develop functionalities aiming towards data governance in predictive toxicology to formalise a set of processes that guarantee high data quality and better data management. In this paper, data quality refers mainly to quality in a data storage sense (e.g. accuracy, completeness and integrity) and not in a toxicological sense (e.g. the quality of experimental results). This paper reviews seven widely used predictive toxicology data sources and applications, with a particular focus on their data governance aspects, including: data accuracy, data completeness, data integrity, metadata and its management, data availability and data authorisation. This review reveals the current problems (e.g. lack of systematic and standard measures of data quality) and desirable needs (e.g. better management and further use of captured metadata and the development of flexible multi-level user access authorisation schemas) of predictive toxicology data source development. The analytical results will help to address a significant gap in toxicology data quality assessment and lead to the development of novel frameworks for predictive toxicology data and model governance. While the discussed public data sources are well developed, there nevertheless remain some gaps in the development of a data governance framework to support predictive toxicology. In this paper, data governance is identified as the new challenge in predictive toxicology, and a good use of it may provide a promising framework for developing high quality and easily accessible toxicity data repositories. This paper also identifies important research directions that require further investigation in this area.
Improving the baking quality of bread wheat by genomic selection in early generations.
Michel, Sebastian; Kummer, Christian; Gallee, Martin; Hellinger, Jakob; Ametz, Christian; Akgöl, Batuhan; Epure, Doru; Güngör, Huseyin; Löschenberger, Franziska; Buerstmayr, Hermann
2018-02-01
Genomic selection shows great promise for pre-selecting lines with superior bread baking quality in early generations, 3 years ahead of labour-intensive, time-consuming, and costly quality analysis. The genetic improvement of baking quality is one of the grand challenges in wheat breeding, as the assessment of the associated traits often involves time-consuming, labour-intensive, and costly testing, forcing breeders to postpone sophisticated quality tests to the very last phases of variety development. The prospect of genomic selection for complex traits like grain yield has been shown in numerous studies, and it might thus also be an interesting method for selecting on baking quality traits. Hence, in this study we focused on the accuracy of genomic selection for laborious and expensive-to-phenotype quality traits, as well as its selection response in comparison with phenotypic selection. More than 400 genotyped wheat lines were therefore phenotyped for protein content and for dough viscoelastic and mixing properties related to baking quality in multi-environment trials from 2009 to 2016. The average prediction accuracy across three independent validation populations was r = 0.39 and could be increased to r = 0.47 by modelling major QTL as fixed effects and by employing multi-trait prediction models, which resulted in an acceptable prediction accuracy for all dough rheological traits (r = 0.38-0.63). Genomic selection can furthermore be applied 2-3 years earlier than direct phenotypic selection, and the estimated selection response for baking quality related traits was nearly twice as high as that of indirect selection by protein content. This considerable advantage of genomic selection could accordingly support breeders in their selection decisions and aid in efficiently combining superior baking quality with grain yield in newly developed wheat varieties.
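As an illustration of the genomic prediction step described above, the following sketch fits a heavily shrunken ridge regression to simulated SNP markers and reports accuracy as the correlation r between observed and predicted trait values; the marker matrix, trait values, and train/validation split are simulated stand-ins, not the study's GBLUP-style models or data.

import numpy as np
from sklearn.linear_model import Ridge
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_lines, n_markers = 400, 5000

# Simulated biallelic marker matrix (0/1/2) and a polygenic baking-quality trait.
X = rng.integers(0, 3, size=(n_lines, n_markers)).astype(float)
true_effects = rng.normal(0, 0.05, n_markers)
y = X @ true_effects + rng.normal(0, 1.0, n_lines)

# Train on earlier breeding cycles, validate on an independent set of lines.
train, test = slice(0, 300), slice(300, 400)
model = Ridge(alpha=float(n_markers))   # heavy shrinkage of marker effects
model.fit(X[train], y[train])

# Prediction accuracy reported as the correlation r between observed and predicted values.
r, _ = pearsonr(y[test], model.predict(X[test]))
print(f"prediction accuracy r = {r:.2f}")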
NASA Astrophysics Data System (ADS)
Moore, K.; Pierson, D.; Pettersson, K.; Naden, P.; Allott, N.; Jennings, E.; Tamm, T.; Järvet, A.; Nickus, U.; Thies, H.; Arvola, L.; Järvinen, M.; Schneiderman, E.; Zion, M.; Lounsbury, D.
2004-05-01
We are applying an existing watershed model in the EU CLIME (Climate and Lake Impacts in Europe) project to evaluate the effects of weather on seasonal and annual delivery of N, P, and DOC to lakes. Model calibration is based on long-term records of weather and water quality data collected from sites in different climatic regions spread across Europe and in New York State. The overall aim of the CLIME project is to develop methods and models to support lake and catchment management under current climate conditions and make predictions under future climate scenarios. Scientists from 10 partner countries are collaborating on developing a consistent approach to defining model parameters for the Generalized Watershed Loading Functions (GWLF) model, one of a larger suite of models used in the project. An example of the approach for the hydrological portion of the GWLF model will be presented, with consideration of the balance between model simplicity, ease of use, data requirements, and realistic predictions.
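The hydrological portion of GWLF is built around the SCS curve number runoff relation; a minimal sketch of that relation in metric units is given below, with the curve number and rainfall depth chosen purely for illustration rather than taken from any CLIME site calibration.

def scs_runoff_mm(precip_mm: float, curve_number: float) -> float:
    """Daily runoff depth from the SCS curve number relation (metric form)."""
    s = 25400.0 / curve_number - 254.0      # potential maximum retention (mm)
    initial_abstraction = 0.2 * s
    if precip_mm <= initial_abstraction:
        return 0.0
    return (precip_mm - initial_abstraction) ** 2 / (precip_mm + 0.8 * s)

# Example: a 40 mm rainfall day on a catchment with an assumed effective curve number of 75.
print(scs_runoff_mm(40.0, 75.0))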
Prediction of porosity of food materials during drying: Current challenges and directions.
Joardder, Mohammad U H; Kumar, C; Karim, M A
2017-07-18
Pore formation in food samples is a common physical phenomenon observed during dehydration processes. Pore evolution during drying significantly affects the physical properties and quality of dried foods, and it should therefore be taken into consideration when predicting transport processes in the drying sample. The characteristics of pore formation depend on the drying process parameters, product properties, and processing time. Understanding the physics of pore formation and evolution during drying will assist in accurately predicting the drying kinetics and quality of food materials. Researchers have been trying to develop mathematical models to describe pore formation and evolution during drying. In this study, existing porosity models are critically analysed and their limitations are identified. Better insight into the factors affecting porosity is provided, and suggestions are proposed to overcome the limitations, including consideration of process parameters such as glass transition temperature, sample temperature, and variable material properties in the porosity models. Several researchers have proposed models for predicting the porosity of food materials during drying; however, these models are either very simplistic or empirical in nature and fail to consider significant factors that influence porosity. An in-depth understanding of pore characteristics is required for developing a generic porosity model. A micro-level analysis of pore formation is presented for better understanding, which will help in developing an accurate and generic porosity model.
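As a point of reference for the limitations discussed above, the simplest class of porosity formulation (air volume fraction from bulk and true density, with true density from a mass-fraction mixture rule) can be sketched as follows; the solids density and example values are assumptions, and this is exactly the kind of simplistic formulation the authors argue needs extending with glass transition and process effects.

def porosity(bulk_density: float, true_density: float) -> float:
    """Apparent porosity: volume fraction of air in the drying sample."""
    return 1.0 - bulk_density / true_density

def true_density(moisture_wet_basis: float, rho_water: float = 998.0,
                 rho_solids: float = 1450.0) -> float:
    """True (pore-free) density from a mass-fraction mixture rule; rho_solids is an assumed value."""
    x_w = moisture_wet_basis
    x_s = 1.0 - x_w
    return 1.0 / (x_w / rho_water + x_s / rho_solids)

# Example: a partly dried sample at 30% moisture (wet basis) with a measured bulk density of 900 kg/m^3.
rho_t = true_density(0.30)
print(f"porosity = {porosity(900.0, rho_t):.2f}")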
Bravo, Gina; Sene, Modou; Arcand, Marcel
2017-07-01
Family members are often called upon to make decisions for an incapacitated relative. Yet they have difficulty predicting a loved one's desire to receive treatments in hypothetical situations. We tested the hypothesis that this difficulty could in part be explained by discrepant quality-of-life assessments. The data come from 235 community-dwelling adults aged 70 years and over who rated their quality of life and desire for specified interventions in four health states (current state, mild to moderate stroke, incurable brain cancer, and severe dementia). All ratings were made on Likert-type scales. Using identical rating scales, a surrogate chosen by the older adult was asked to predict the latter's responses. Linear mixed models were fitted to determine whether differences in quality-of-life ratings between the older adult and surrogate were associated with surrogates' inaccuracy in predicting desire for treatment. The difference in quality-of-life ratings was a significant predictor of prediction inaccuracy for the three hypothetical health states (p < 0.01) and nearly significant for the current health state (p = 0.077). All regression coefficients were negative, implying that the more the surrogate overestimated quality of life compared to the older adult, the more he or she overestimated the older adult's desire to be treated. Discrepant quality-of-life ratings are associated with surrogates' difficulty in predicting desire for life-sustaining interventions in hypothetical situations. This finding underscores the importance of discussing anticipated quality of life in states of cognitive decline, to better prepare family members for making difficult decisions for their loved ones. ISRCTN89993391.
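A minimal sketch of the kind of linear mixed model described above, fitted here with statsmodels on simulated dyad-level data; the package choice, variable names, and simulated effect sizes are assumptions rather than the authors' analysis, and a random intercept per dyad stands in for the four repeated health-state ratings.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_dyads, states = 235, ["current", "stroke", "cancer", "dementia"]

# Simulated data: one row per older adult / surrogate pair and health state.
df = pd.DataFrame({
    "dyad": np.repeat(np.arange(n_dyads), len(states)),
    "state": states * n_dyads,
    "qol_difference": rng.normal(0, 1, n_dyads * len(states)),
})
# Inaccuracy constructed with a negative slope, mirroring the direction reported above.
df["prediction_inaccuracy"] = -0.3 * df["qol_difference"] + rng.normal(0, 1, len(df))

# Random intercept per dyad accounts for the repeated ratings across the four health states.
model = smf.mixedlm("prediction_inaccuracy ~ qol_difference + state", df, groups=df["dyad"])
print(model.fit().summary())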
Integration of Air Quality & Exposure Models for Health Studies
The presentation describes a new community-scale tool called exposure model for individuals (EMI), which predicts five tiers of individual-level exposure metrics for ambient PM using outdoor concentrations, questionnaires, weather, and time-location information. In this modeling ...
Dimethylsulfide Chemistry: Annual, Seasonal, and Spatial Impacts on Sulfate
We incorporated oceanic emissions and atmospheric chemistry of dimethylsulfide (DMS) into the hemispheric Community Multiscale Air Quality model and performed annual model simulations without and with DMS chemistry. The model without DMS chemistry predicts higher concentrations o...
Modeling Benthic Sediment Processes to Predict Water ...
The benthic sediment acts as a huge reservoir of particulate and dissolved material (within interstitial water) which can contribute to the loading of contaminants and nutrients to the water column. A benthic sediment model is presented in this report to predict spatial and temporal benthic fluxes of nutrients and chemicals into the water column in Narragansett Bay. Representing this benthic flux is essential for properly modeling water quality and ecology in estuarine and coastal systems.
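The diffusive exchange term that benthic flux models of this kind typically include can be sketched with Fick's first law across the sediment-water interface; the function and parameter values below are a generic illustration under assumed units, not the formulation used in the Narragansett Bay report.

def diffusive_benthic_flux(c_porewater: float, c_overlying: float,
                           porosity: float, diff_coeff: float,
                           boundary_thickness: float) -> float:
    """Diffusive flux across the sediment-water interface (Fick's first law).

    Concentrations in mg/m^3, diff_coeff in m^2/day, thickness in m;
    a positive value is a flux out of the sediment into the water column.
    """
    gradient = (c_porewater - c_overlying) / boundary_thickness
    return porosity * diff_coeff * gradient   # mg m^-2 day^-1

# Example: nutrient-rich interstitial water diffusing into the overlying water.
print(diffusive_benthic_flux(c_porewater=2000.0, c_overlying=100.0,
                             porosity=0.8, diff_coeff=8.6e-5,
                             boundary_thickness=0.01))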
Nesakumar, Noel; Baskar, Chanthini; Kesavan, Srinivasan; Rayappan, John Bosco Balaguru; Alwarappan, Subbiah
2018-05-22
The moisture content of beetroot varies during long-term cold storage. In this work, we propose a strategy to identify the moisture content and age of beetroot using Fourier transform infrared (FTIR) spectroscopy coupled with principal component analysis. Frequent FTIR measurements were recorded directly from the beetroot sample surface over a period of 34 days to analyse its moisture content, employing attenuated total reflectance in the spectral ranges of 2614-4000 and 1465-1853 cm⁻¹ with a spectral resolution of 8 cm⁻¹. To estimate the transmittance peak height (Tp) and the area under the transmittance curve over the spectral ranges of 2614-4000 and 1465-1853 cm⁻¹, a Gaussian curve-fitting algorithm was applied to the FTIR data. Principal component and nonlinear regression analyses were used for FTIR data analysis. Score plots over the 2614-4000 and 1465-1853 cm⁻¹ ranges allowed beetroot quality discrimination. Beetroot quality predictive models were developed by employing a biphasic dose-response function. Validation experiments confirmed that the accuracy of the beetroot quality predictive model reached 97.5%. This work shows that FTIR spectroscopy, in combination with principal component analysis and beetroot quality predictive models, could serve as an effective tool for discriminating the moisture content of beetroot samples in fresh, half-spoiled, and completely spoiled stages and for providing status alerts.
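A compact sketch of the described pipeline (Gaussian band fitting for the transmittance peak height and area, followed by PCA score computation) is given below on simulated spectra; the band position, widths, and storage-day trend are invented for illustration, and the biphasic dose-response modelling step is omitted.

import numpy as np
from scipy.optimize import curve_fit
from sklearn.decomposition import PCA

def gaussian(wavenumber, height, centre, width):
    """Single Gaussian band used to estimate peak height and area."""
    return height * np.exp(-((wavenumber - centre) ** 2) / (2.0 * width ** 2))

# Simulated transmittance spectra over 1465-1853 cm^-1 at 8 cm^-1 resolution,
# one row per measurement day; real spectra would come from the ATR-FTIR instrument.
rng = np.random.default_rng(2)
wn = np.arange(1465, 1854, 8.0)
days = np.arange(34)
spectra = np.array([gaussian(wn, 0.6 - 0.01 * d, 1640.0, 40.0) + rng.normal(0, 0.01, wn.size)
                    for d in days])

# Peak height and area (trapezoidal) from a Gaussian fit to one spectrum.
popt, _ = curve_fit(gaussian, wn, spectra[0], p0=[0.5, 1640.0, 30.0])
peak_height, peak_area = popt[0], np.trapz(gaussian(wn, *popt), wn)

# Score plot coordinates from PCA for quality discrimination across storage days.
scores = PCA(n_components=2).fit_transform(spectra)
print(peak_height, peak_area, scores.shape)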
NASA Technical Reports Server (NTRS)
Leatherwood, J. D.; Clevenson, S. A.; Hollenbaugh, D. D.
1984-01-01
The results of a simulator study conducted to compare and validate various ride quality prediction methods for use in assessing passenger/crew ride comfort within helicopters are presented. Included are results quantifying 35 helicopter pilots' discomfort responses to helicopter interior noise and vibration typical of routine flights, an assessment of various ride quality metrics including the NASA ride comfort model, and an examination of possible criteria approaches. The study indicated that crew discomfort results from a complex interaction between vibration and interior noise. Overall measures such as weighted or unweighted root-mean-square acceleration level and A-weighted noise level were not good predictors of discomfort; accurate prediction required a metric incorporating the interactive effects of both noise and vibration. The best metric for predicting crew comfort in the combined noise and vibration environment was the NASA discomfort index.
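To make the idea of an interactive noise-vibration metric concrete, the sketch below combines an rms acceleration term, an A-weighted noise term, and a cross term; the functional form and coefficients are hypothetical placeholders and are not the calibrated NASA discomfort index.

import numpy as np

def rms_acceleration(accel_time_series: np.ndarray) -> float:
    """Root-mean-square of a measured acceleration signal (m/s^2)."""
    return float(np.sqrt(np.mean(np.square(accel_time_series))))

def combined_discomfort(vib_rms: float, noise_dba: float,
                        a: float = 1.0, b: float = 0.05, c: float = 0.02) -> float:
    """Hypothetical interactive index: vibration term, noise term, and a cross term.

    The coefficients a, b, c are placeholders, not the calibrated NASA model weights.
    """
    noise_excess = max(noise_dba - 65.0, 0.0)
    return a * vib_rms + b * noise_excess + c * vib_rms * noise_excess

# Example with a simulated cabin vibration record and an 85 dBA interior noise level.
accel = np.random.default_rng(3).normal(0, 0.3, 4096)
print(combined_discomfort(rms_acceleration(accel), noise_dba=85.0))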
Janneck, Robby; Vercesi, Federico; Heremans, Paul; Genoe, Jan; Rolin, Cedric
2016-09-01
A model that describes solvent evaporation dynamics in meniscus-guided coating techniques is developed. With a single fitting parameter, this formula is shown to accurately predict a processing window for various coating conditions. Organic thin-film transistors (OTFTs) fabricated with a zone-casting setup indeed show the best performance at the predicted coating speeds, with mobilities reaching 7 cm² V⁻¹ s⁻¹. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
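A generic evaporative-regime mass balance (dry film thickness scaling as evaporation flux times meniscus length times solute volume fraction over coating speed) can be used to sketch how a coating-speed window might be derived from thickness targets; the scaling, parameter names, and numbers below are assumptions for illustration and are not the paper's single-fitting-parameter model.

def evaporative_regime_thickness(evap_flux: float, meniscus_length: float,
                                 solute_volume_fraction: float, coating_speed: float) -> float:
    """Dry film thickness from a simple evaporative-regime mass balance (SI units).

    Thickness scales as E * L * phi / v; this is a generic scaling, not the paper's model.
    """
    return evap_flux * meniscus_length * solute_volume_fraction / coating_speed

def speed_window(evap_flux: float, meniscus_length: float, solute_volume_fraction: float,
                 t_min: float, t_max: float) -> tuple:
    """Coating speeds that keep the deposited thickness within [t_min, t_max]."""
    v_high = evap_flux * meniscus_length * solute_volume_fraction / t_min
    v_low = evap_flux * meniscus_length * solute_volume_fraction / t_max
    return v_low, v_high

# Example with assumed values: 1e-7 m/s evaporation flux, 1 mm meniscus, 0.5 vol% solute.
print(speed_window(1e-7, 1e-3, 0.005, t_min=20e-9, t_max=100e-9))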
External-environmental and internal-health early life predictors of adolescent development.
Hartman, Sarah; Li, Zhi; Nettle, Daniel; Belsky, Jay
2017-12-01
A wealth of evidence documents associations between various aspects of the rearing environment and later development. Two evolutionary-inspired models advance explanations for why and how such early experiences shape later functioning: (a) the external-prediction model, which highlights the role of the early environment (e.g., parenting) in regulating children's development, and (b) the internal-prediction model, which emphasizes internal state (i.e., health) as the critical regulator. Using data from the NICHD Study of Early Child Care and Youth Development, the current project draws on both models by investigating whether the effect of the early environment on later adolescent functioning operates indirectly through internal-health variables. Results showed a significant indirect effect of internal health on the relation between the early environment and adolescent behavior. Specifically, early environmental adversity during the first 5 years of life predicted lower-quality health during childhood, which in turn led to problematic adolescent functioning and, for girls, an earlier age of menarche. In addition, for girls, early adversity predicted lower-quality health that forecasted earlier age of menarche, leading to increased adolescent risk taking. The discussion highlights the importance of integrating both internal and external models to further understand the developmental processes that affect adolescent behavior.
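The indirect-effect logic described above can be sketched as a product-of-coefficients mediation on simulated data; the paths, effect sizes, and statsmodels-based fitting are illustrative assumptions, not the study's analysis of the NICHD data.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated illustration of a product-of-coefficients indirect effect:
# early adversity -> childhood health quality -> adolescent functioning.
rng = np.random.default_rng(4)
n = 1000
adversity = rng.normal(0, 1, n)
health = -0.4 * adversity + rng.normal(0, 1, n)                      # path a
functioning = 0.5 * health + 0.1 * adversity + rng.normal(0, 1, n)   # paths b and c'
df = pd.DataFrame({"adversity": adversity, "health": health, "functioning": functioning})

a = smf.ols("health ~ adversity", df).fit().params["adversity"]
b = smf.ols("functioning ~ health + adversity", df).fit().params["health"]
print(f"indirect effect (a*b) = {a * b:.2f}")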
Implementation of a WRF-CMAQ Air Quality Modeling System in Bogotá, Colombia
NASA Astrophysics Data System (ADS)
Nedbor-Gross, R.; Henderson, B. H.; Pachon, J. E.; Davis, J. R.; Baublitz, C. B.; Rincón, A.
2014-12-01
Due to continuous economic growth, Bogotá, Colombia has experienced air pollution problems in recent years. The local environmental authority has implemented several strategies to curb air pollution that have resulted in decreasing PM10 concentrations since 2010. However, further measures are necessary to meet international air quality standards in the city. The University of Florida Air Quality and Climate group is collaborating with the Universidad de La Salle to prioritize regulatory strategies for Bogotá using air pollution simulations. To simulate pollution, we developed a modeling platform that combines the Weather Research and Forecasting model (WRF), local emissions, and the Community Multiscale Air Quality model (CMAQ). This platform is the first of its kind to be implemented in the megacity of Bogotá, Colombia. The presentation will discuss the development and evaluation of the air quality modeling system, highlight initial results characterizing photochemical conditions in Bogotá, and characterize air pollution under proposed regulatory strategies. The WRF model has been configured and applied to Bogotá, which has a tropical climate with complex mountainous topography. Developing the configuration included incorporating local topography and land-use data, a physics sensitivity analysis, review, and systematic evaluation. The evaluation thresholds, however, were set based on a synthesis of model performance under less mountainous conditions; we will evaluate how differences in autocorrelation contribute to the non-ideal performance. Air pollution predictions are currently under way. CMAQ has been configured with WRF meteorology, global boundary conditions from GEOS-Chem, and a locally produced emission inventory. Preliminary results from the simulations show promising performance of CMAQ in Bogotá. Anticipated results include a systematic performance evaluation of ozone and PM10, a characterization of photochemical sensitivity, and air quality predictions under proposed regulatory scenarios.
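The systematic performance evaluation anticipated above typically reduces to a handful of paired observation-model statistics; a minimal sketch with hypothetical hourly ozone values is given below (the statistics chosen and the simulated data are assumptions, not results from the Bogotá platform).

import numpy as np

def evaluation_statistics(obs: np.ndarray, mod: np.ndarray) -> dict:
    """Common air quality model performance measures for paired obs/model values."""
    bias = mod - obs
    return {
        "mean_bias": float(np.mean(bias)),
        "normalized_mean_bias_pct": float(100.0 * np.sum(bias) / np.sum(obs)),
        "rmse": float(np.sqrt(np.mean(bias ** 2))),
        "correlation": float(np.corrcoef(obs, mod)[0, 1]),
    }

# Example with hypothetical hourly ozone observations and model predictions (ppb).
rng = np.random.default_rng(5)
obs = rng.gamma(shape=4.0, scale=8.0, size=500)
mod = obs * 0.9 + rng.normal(0, 5.0, 500)
print(evaluation_statistics(obs, mod))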