United3D: a protein model quality assessment program that uses two consensus-based methods.
Terashi, Genki; Oosawa, Makoto; Nakamura, Yuuki; Kanou, Kazuhiko; Takeda-Shitaka, Mayuko
2012-01-01
In protein structure prediction, whether template-based modeling or free modeling (ab initio modeling), the step that assesses the quality of protein models is very important. We have developed a model quality assessment (QA) program, United3D, that uses an optimized clustering method and a simple Cα atom contact-based potential. United3D automatically estimates quality scores (Qscore) for predicted protein models that are highly correlated with the actual quality (GDT_TS). The performance of United3D was tested in the ninth Critical Assessment of protein Structure Prediction (CASP9) experiment. In CASP9, United3D showed the lowest average loss of GDT_TS (5.3) among the QA methods that participated. This result indicates that United3D's ability to identify high-quality models among those predicted by CASP9 servers on 116 targets was the best among the QA methods tested in CASP9. United3D also produced a high average Pearson correlation coefficient (0.93) and an acceptable Kendall rank correlation coefficient (0.68) between Qscore and GDT_TS. This performance was competitive with the other top-ranked QA methods tested in CASP9. These results indicate that United3D is a useful tool for selecting high-quality models from the many candidate structures produced by various modeling methods, and that it will improve the accuracy of protein structure prediction.
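The consensus idea behind a Qscore can be illustrated with a minimal sketch: score each candidate model by its mean pairwise similarity to all other candidates. The `similarity` callback below is a placeholder for a GDT_TS-like comparison; United3D's actual optimized clustering and Cα contact potential are not reproduced here.

```python
# Hedged sketch of a generic consensus-based model quality score.
# Each model's score is its mean similarity to every other candidate;
# models that resemble the bulk of the ensemble score highly.

def consensus_qscore(similarity, n_models):
    """similarity(i, j) -> value in [0, 1]; returns one score per model."""
    scores = []
    for i in range(n_models):
        others = [similarity(i, j) for j in range(n_models) if j != i]
        scores.append(sum(others) / len(others))
    return scores

# Toy example: pairwise similarities stored in a small symmetric table.
table = {(0, 1): 0.9, (0, 2): 0.8, (1, 2): 0.85}

def sim(i, j):
    return table[(min(i, j), max(i, j))]

scores = consensus_qscore(sim, 3)  # model 0: (0.9 + 0.8) / 2 = 0.85
```

In a real QA setting the similarity function would itself be the expensive step (a full structural superposition per model pair), which is why clustering-based optimizations matter.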
Design and realization of high quality prime farmland planning and management information system
NASA Astrophysics Data System (ADS)
Li, Manchun; Liu, Guohong; Liu, Yongxue; Jiang, Zhixin
2007-06-01
The article discusses the design and realization of a high quality prime farmland planning and management information system based on SDSS. Models for concept integration and management planning are used in high quality prime farmland planning in order to refine the current model system, and the management information system is designed with a triangular structure. Finally, an example of the Tonglu County high quality prime farmland planning and management information system is introduced.
Fan, Xin-Gang; Mi, Wen-Bao; Ma, Zhen-Ning
2015-02-01
For a deeper analysis of the regional environmental-economic system, this paper analyzes the mutual relations among regional economic development, environmental quality, and environmental pollution, and builds the theoretical basis. An economy-pollution-environmental quality three-dimensional coupling evaluation model for districts is then constructed. It includes an economic development level index, an environmental pollution index, and an environmental quality index. The model is a cube with spatialization and visualization characteristics, divided into 8 sub-cubes that express 8 types of state (e.g., low pollution-inferior quality-low level of economic development). The model can be used to evaluate the status of a region, divide development phases, and analyze evolution trends. It supports two evaluation modes: relative meaning evaluation (RME) and absolute meaning evaluation (AME). Based on the model, Yinchuan City in the Ningxia Hui Autonomous Region is used as an example for the empirical study. Using RME with Guangzhou City as the reference, the results show that Yinchuan City remained in a high pollution-low quality-low level of economic development state for a long period during 1996-2010. After 2007, the state changed to high pollution-high quality-low level of economic development. The environmental quality of Yinchuan City has since improved, but pollutant discharge pressure remains high, and the city is near the break point between high and low environmental quality. With AME, using national standards, Yinchuan City remained in a high pollution-low quality-low level of economic development state throughout 1996-2010. The empirical research verifies that different target reference areas and different national standards yield different main parameters, so the evaluation result has a flexible range. The dimensionless data enhance the coupling of the indices, and the position of the data within the model increases its visibility for environmental management decisions. The model also addresses mismatches in calculated data size, temporal asymmetry of spatial data, and the verification of earlier multi-target coupling models.
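The cube's 8 sub-cubes amount to classifying a region along three thresholded axes. A minimal sketch, with hypothetical index values and thresholds (the paper's actual indices and cut-points are not reproduced):

```python
# Hedged sketch: classify a region into one of the 8 sub-cubes of an
# economy-pollution-quality evaluation cube. Thresholds are illustrative.

def cube_state(econ, pollution, env_quality, thresholds=(0.5, 0.5, 0.5)):
    """Return a 'high/low' label per axis, naming one of 8 octants."""
    labels = []
    for value, t, name in zip((econ, pollution, env_quality), thresholds,
                              ("economy", "pollution", "quality")):
        labels.append(("high " if value >= t else "low ") + name)
    return ", ".join(labels)

# A region with weak economy, heavy pollution, poor environmental quality:
state = cube_state(0.3, 0.8, 0.2)  # "low economy, high pollution, low quality"
```

Relative (RME) versus absolute (AME) evaluation then reduces to whether the thresholds come from a reference city or from national standards.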
Scotti, Dennis J; Harmon, Joel; Behson, Scott J
2007-01-01
Healthcare managers must deliver high-quality patient services that generate highly satisfied and loyal customers. In this article, we examine how a high-involvement approach to the work environment of healthcare employees may lead to exceptional service quality, satisfied patients, and ultimately to loyal customers. Specifically, we investigate the chain of events through which high-performance work systems (HPWS) and customer orientation influence employee and customer perceptions of service quality and patient satisfaction in a national sample of 113 Veterans Health Administration ambulatory care centers. We present a conceptual model for linking work environment to customer satisfaction and test this model using structural equation modeling. The results suggest that (1) HPWS is linked to employee perceptions of their ability to deliver high-quality customer service, both directly and through their perceptions of customer orientation; (2) employee perceptions of customer service are linked to customer perceptions of high-quality service; and (3) perceived service quality is linked with customer satisfaction. Theoretical and practical implications of our findings, including suggestions for how healthcare managers can implement changes to their work environments, are discussed.
Castellano, Michael J; Mueller, Kevin E; Olk, Daniel C; Sawyer, John E; Six, Johan
2015-09-01
Labile, 'high-quality' plant litters are hypothesized to promote soil organic matter (SOM) stabilization in mineral soil fractions that are physicochemically protected from rapid mineralization. However, the effect of litter quality on SOM stabilization is inconsistent. High-quality litters, characterized by high N concentrations, low C/N ratios, and low phenol/lignin concentrations, are not consistently stabilized in SOM with greater efficiency than 'low-quality' litters characterized by low N concentrations, high C/N ratios, and high phenol/lignin concentrations. Here, we attempt to resolve these inconsistent results by developing a new conceptual model that links litter quality to the soil C saturation concept. Our model builds on the Microbial Efficiency-Matrix Stabilization framework (Cotrufo et al., 2013) by suggesting that the effect of litter quality on SOM stabilization is modulated by the extent of soil C saturation, such that high-quality litters are not always stabilized in SOM with greater efficiency than low-quality litters. © 2015 John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Shaw, Amelia R.; Smith Sawyer, Heather; LeBoeuf, Eugene J.; McDonald, Mark P.; Hadjerioua, Boualem
2017-11-01
Hydropower operations optimization subject to environmental constraints is limited by challenges associated with dimensionality and spatial and temporal resolution. The need for high-fidelity hydrodynamic and water quality models within optimization schemes is driven by improved computational capabilities, increased requirements to meet specific points of compliance with greater resolution, and the need to optimize operations of not just single reservoirs but systems of reservoirs. This study describes an important advancement for computing hourly power generation schemes for a hydropower reservoir using high-fidelity models, surrogate modeling techniques, and optimization methods. The predictive power of the high-fidelity hydrodynamic and water quality model CE-QUAL-W2 is successfully emulated by an artificial neural network, then integrated into a genetic algorithm optimization approach to maximize hydropower generation subject to constraints on dam operations and water quality. This methodology is applied to a multipurpose reservoir near Nashville, Tennessee, USA. The model successfully reproduced high-fidelity reservoir information while enabling 6.8% and 6.6% increases in hydropower production value relative to actual operations for dissolved oxygen (DO) limits of 5 and 6 mg/L, respectively, while witnessing an expected decrease in power generation at more restrictive DO constraints. Exploration of simultaneous temperature and DO constraints revealed capability to address multiple water quality constraints at specified locations. The reduced computational requirements of the new modeling approach demonstrated an ability to provide decision support for reservoir operations scheduling while maintaining high-fidelity hydrodynamic and water quality information as part of the optimization decision support routines.
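The surrogate-assisted optimization loop described above can be sketched in a few lines: a cheap surrogate stands in for the expensive hydrodynamic model, and an evolutionary search maximizes power subject to a dissolved-oxygen (DO) constraint via a penalty. The analytic surrogate and all rates below are made-up stand-ins, not the paper's trained neural network or CE-QUAL-W2 outputs.

```python
import random

# Hedged sketch of surrogate-assisted, constraint-penalized optimization.
# release: turbine flow fraction in [0, 1]; more flow -> more power but
# lower downstream DO (an illustrative, invented trade-off).

def surrogate(release):
    power = 100.0 * release
    do_mg_l = 8.0 - 4.0 * release
    return power, do_mg_l

def optimize(do_limit, pop=30, gens=40, seed=1):
    rng = random.Random(seed)
    xs = [rng.random() for _ in range(pop)]

    def fitness(x):
        power, do = surrogate(x)
        # Heavy penalty for violating the DO constraint.
        return power if do >= do_limit else power - 1000.0 * (do_limit - do)

    for _ in range(gens):
        xs.sort(key=fitness, reverse=True)
        parents = xs[: pop // 2]                      # truncation selection
        children = [min(1.0, max(0.0, rng.choice(parents) + rng.gauss(0, 0.05)))
                    for _ in parents]                 # Gaussian mutation
        xs = parents + children
    return max(xs, key=fitness)

best = optimize(do_limit=5.0)
# DO >= 5 requires release <= 0.75 here, so the search should approach 0.75.
```

The real study's genetic algorithm operates on hourly generation schedules rather than a single scalar, but the structure (surrogate evaluation inside a penalized evolutionary loop) is the same.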
Error-proneness as a handicap signal.
De Jaegher, Kris
2003-09-21
This paper describes two discrete signalling models in which the error-proneness of signals can serve as a handicap signal. In the first model, the direct handicap of sending a high-quality signal is not large enough to ensure that a low-quality signaller will not send it. However, if the receiver sometimes mistakes a high-quality signal for a low-quality one, then there is an indirect handicap to sending a high-quality signal. The total handicap of sending such a signal may then still be large enough that a low-quality signaller would not want to send it. In the second model, there is no direct handicap of sending signals, so nothing would seem to stop a signaller from always sending a high-quality signal. However, the receiver sometimes fails to detect signals, and this causes an indirect handicap of sending a high-quality signal that still stops the low-quality signaller from sending such a signal. The conditions for honesty are that the probability of a detection error is higher for a high-quality than for a low-quality signal, and that the receiver who does not detect a signal adopts a response that is bad for the signaller. In both models we thus obtain the result that signal accuracy should not lie above a certain level for honest signalling to be possible. Moreover, we show that the maximal achievable accuracy is higher the lower the degree of conflict between signaller and receiver. Finally, we show that it may be the conditions for honest signalling that constrain signal accuracy, rather than the signaller trying to make honest signals as effective as possible given receiver psychology, or the signaller adapting the accuracy of honest signals to his interests.
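The first model's honesty condition can be checked numerically: a low-quality sender should prefer the low signal once the detection error adds an indirect handicap on top of the direct cost. The payoff numbers below are illustrative, not the paper's parameterization.

```python
# Hedged toy version of the first signalling model: a high-quality signal
# carries a direct cost and is misread as low-quality with probability
# `error`, so its expected benefit shrinks for any sender.

def prefers_honesty(benefit_high_response, benefit_low_response, cost, error):
    """True if a LOW-quality sender prefers the cheap low-quality signal."""
    mimic = ((1 - error) * benefit_high_response
             + error * benefit_low_response - cost)
    honest = benefit_low_response
    return honest >= mimic

# With no detection error, the direct cost alone cannot deter mimicry:
#   mimic = 10 - 5 = 5 > honest = 2
no_error = prefers_honesty(10, 2, 5, error=0.0)   # False
# A sufficiently error-prone signal adds an indirect handicap:
#   mimic = 0.5*10 + 0.5*2 - 5 = 1 <= honest = 2
with_error = prefers_honesty(10, 2, 5, error=0.5)  # True
```

This mirrors the paper's result that accuracy must not exceed a certain level: as `error` falls toward zero, the honesty condition eventually fails.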
Demirci, Müşerref Duygu Saçar; Allmer, Jens
2017-07-28
MicroRNAs (miRNAs) are involved in the post-transcriptional regulation of protein abundance and thus have a great impact on the resulting phenotype. It is therefore no wonder that they have been implicated in many diseases, ranging from virus infections to cancer. This impact on the phenotype creates great interest in establishing the miRNAs of an organism. Experimental methods are complicated, which has led to the development of computational methods for pre-miRNA detection. Such methods generally employ machine learning to build models that discriminate between miRNAs and other sequences. Positive training data for model building stem, for the most part, from miRBase, the miRNA registry. The quality of the entries in miRBase has been questioned, though. This unknown quality led to the development of filtering strategies in attempts to produce high-quality positive datasets, which can in turn lead to a scarcity of positive data. To analyze the quality of filtered data, we developed a machine learning model and found that it is well able to establish data quality based on intrinsic measures. Additionally, we analyzed which features describing pre-miRNAs can discriminate between low- and high-quality data. Both models are applicable to data from miRBase and can be used to establish high-quality positive data. This will facilitate the development of better miRNA detection tools, which will make the prediction of miRNAs in disease states more accurate. Finally, we applied both models to all miRBase data and provide the list of high-quality hairpins.
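The general approach, representing each hairpin by sequence-derived features and training a classifier to separate high- from low-quality entries, can be sketched minimally. The two features and the nearest-centroid rule below are illustrative stand-ins; the paper's actual feature set and model differ.

```python
# Hedged sketch of feature-based pre-miRNA quality classification using
# two toy features (GC content, scaled length) and a nearest-centroid rule.

def features(seq):
    gc = (seq.count("G") + seq.count("C")) / len(seq)
    return (gc, len(seq) / 100.0)

def train_centroids(labeled):
    """labeled: iterable of (sequence, label); returns label -> centroid."""
    sums = {}
    for seq, label in labeled:
        f = features(seq)
        acc = sums.setdefault(label, [0.0, 0.0, 0])
        acc[0] += f[0]; acc[1] += f[1]; acc[2] += 1
    return {lab: (a / n, b / n) for lab, (a, b, n) in sums.items()}

def classify(seq, centroids):
    f = features(seq)
    return min(centroids, key=lambda lab: sum((x - y) ** 2
                                              for x, y in zip(f, centroids[lab])))

# Toy training data: GC-rich hairpins labeled high quality, AT-rich low.
train = [("GCGCGCGCGCGCGCGCGCGC", "high"), ("GGCCGGCCGGCC", "high"),
         ("ATATATATAT", "low"), ("AATTAATT", "low")]
cents = train_centroids(train)
label = classify("GCGCGGCCGC", cents)  # lands near the "high" centroid
```

Real pre-miRNA classifiers use dozens of sequence, structure, and thermodynamic features, but the train-then-score pipeline is the same shape.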
Impact of High Resolution Land-Use Data in Meteorology and Air Quality Modeling Systems
Accurate land use information is important in meteorology for land surface exchanges, in emission modeling for emission spatial allocation, and in air quality modeling for chemical surface fluxes. Currently, meteorology, emission, and air quality models often use outdated USGS Gl...
NASA Astrophysics Data System (ADS)
Roy, K.; Peltier, W. R.
2017-12-01
Our understanding of the Earth-Ice-Ocean interactions that have accompanied the large glaciation-deglaciation process characteristic of the last half of the Pleistocene has benefited significantly from the development of high-quality models of the Glacial Isostatic Adjustment (GIA) process. These models provide fundamental insight into the large changes in sea level and land ice cover over this time period, as well as key constraints on the viscosity structure of the Earth's interior. Their development has benefited from the recent availability of high-quality constraints from regions of forebulge collapse. In particular, over North America, the joint use of high-quality sea level data from the U.S. East coast, together with the vast network of precise space-geodetic observations of crustal motion existing over most of the interior of the continent, has led to the latest ICE-7G_NA (VM7) model (Roy & Peltier, GJI, 2017). In this paper, the exciting opportunities provided by such high-quality observations of the GIA process will be discussed, not only in the context of the continuing effort to refine global models of this phenomenon, but also in terms of the fundamental insight they may provide on outstanding issues in high-pressure geophysics, paleoclimatology, and hydrogeology. Specific examples where such high-quality observations can be used (either separately, or in combination from independent sources) will be presented, focusing particularly on constraints from the North American continent and from the Mediterranean basin. This work will demonstrate that, given the high quality of currently available constraints on the GIA process, considerable further geophysical insight can be obtained from spherically-symmetric models of the viscosity structure of the planet.
NASA Astrophysics Data System (ADS)
Tang, U. W.; Wang, Z. S.
2008-10-01
Each city has its unique urban form. The importance of urban form for sustainable development has been recognized in recent years. Traditionally, air quality modelling in a city is performed at the mesoscale, with a grid resolution of kilometres, regardless of urban form. This paper introduces a GIS-based air quality and noise model system developed to study the built environment of highly compact urban forms. Compared with a traditional mesoscale air quality model system, the present model system has a higher spatial resolution, down to individual buildings along both sides of the street. Applying the developed model system to the Macao Peninsula, with its highly compact urban forms, the average spatial resolution of input and output data is as high as 174 receptor points per km2. Based on this high-resolution input/output dataset, this study shows that even highly compact urban forms can be fragmented into very small geographic scales of less than 3 km2, owing to significant temporal variation in urban development. The variation of urban form in each fragment in turn affects air dispersion and traffic conditions, and thus air quality and noise, to a measurable degree.
Application and evaluation of high-resolution WRF-CMAQ with simple urban parameterization.
The 2-way coupled WRF-CMAQ meteorology and air quality modeling system is evaluated for high-resolution applications by comparing to a regional air quality field study (Discover-AQ). The model was modified to better account for the effects of urban environments. High-resolution...
A Systematic Process for Developing High Quality SaaS Cloud Services
NASA Astrophysics Data System (ADS)
La, Hyun Jung; Kim, Soo Dong
Software-as-a-Service (SaaS) is a type of cloud service which provides software functionality through the Internet. Its benefits are well received in academia and industry. To fully realize these benefits, there should be effective methodologies to support the development of SaaS services with high reusability and applicability. Conventional approaches such as object-oriented methods do not effectively support SaaS-specific engineering activities such as modeling common features, variability, and designing quality services. In this paper, we present a systematic process for developing high quality SaaS and highlight the essential role of commonality and variability (C&V) modeling in maximizing reusability. We first define criteria for designing the process model and provide a theoretical foundation for SaaS: its meta-model and C&V model. We clarify the notion of commonality and variability in SaaS, and propose a SaaS development process accompanied by engineering instructions. Using the proposed process, SaaS services with high quality can be developed effectively.
Harvey-Knowles, Jacquelyn; Faw, Meara H
2018-04-01
Cancer caregivers often experience significant challenges in their motivation and ability to comfort cancer survivors, particularly in a spousal or romantic context. Spousal cancer caregivers have been known to report even greater levels of burden and distress than cancer sufferers, yet they still take on the role of informal caregiver so they can attend to their partner's needs. The current study tested whether a theoretical model of supportive outcomes, the dual-process model of supportive communication, explains variations in cancer caregivers' motivation and ability to create high-quality support messages. The study also tested whether participant engagement with reflective journaling on supportive acts was associated with increased motivation or ability to generate high-quality support messages. Based upon the dual-process model, we posited that, following supportive journaling tasks, caregivers of spouses currently managing a cancer experience would report greater motivation but also greater difficulty in generating high-quality support messages, while individuals caring for a patient in remission would report lower motivation but greater ability to create high-quality support messages. Findings supported these assertions and suggested that reflective journaling tasks might be a useful tool for improving remission caregivers' ability to provide high-quality social support to survivors. Corresponding theoretical and applied implications are discussed.
Water quality assessment and meta model development in Melen watershed - Turkey.
Erturk, Ali; Gurel, Melike; Ekdal, Alpaslan; Tavsan, Cigdem; Ugurluoglu, Aysegul; Seker, Dursun Zafer; Tanik, Aysegul; Ozturk, Izzet
2010-07-01
Istanbul, being one of the highly populated metropolitan areas of the world, has been facing water scarcity since the past decade. Water transfer from Melen Watershed was considered the most feasible option to supply water to Istanbul due to its high water potential and relatively less degraded water quality. This study consists of two parts. In the first part, water quality data covering 26 parameters from 5 monitoring stations were analyzed and assessed against the requirements of the "Quality Required of Surface Water Intended for the Abstraction of Drinking Water" regulation. In the second part, a one-dimensional stream water quality model with simple water quality kinetics was developed. It formed a basic design for more advanced water quality models for the watershed. The reason for assessing the water quality data and developing a model was to provide information for decision making on preliminary actions to prevent any further deterioration of existing water quality. According to the water quality assessment at the water abstraction point, the Melen River has relatively poor water quality with regard to NH4+, BOD5, faecal streptococcus, manganese and phenol, and is unsuitable for drinking water abstraction in terms of COD, PO4(3-), total coliform, total suspended solids, mercury and total chromium. The results derived from the model were found to be consistent with the water quality assessment. The model also showed that the relatively high inorganic nitrogen and phosphorus concentrations along the streams are related to diffuse nutrient loads that should be managed together with municipal and industrial wastewaters. Copyright 2010 Elsevier Ltd. All rights reserved.
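"Simple water quality kinetics" in a one-dimensional stream model typically means first-order decay along the flow path, the classic Streeter-Phelps ingredient. A minimal sketch, with an illustrative rate constant and flow velocity (not the Melen study's calibrated values):

```python
import math

# Hedged sketch of first-order BOD decay in a 1-D plug-flow stream:
# travel time t = x / velocity, and BOD(x) = BOD0 * exp(-k * t).

def bod_profile(bod0, k_per_day, velocity_km_day, distances_km):
    """BOD (mg/L) at the given downstream distances, assuming plug flow."""
    return [bod0 * math.exp(-k_per_day * (x / velocity_km_day))
            for x in distances_km]

# Illustrative numbers: 8 mg/L at the source, k = 0.23 /day, 10 km/day flow.
profile = bod_profile(bod0=8.0, k_per_day=0.23, velocity_km_day=10.0,
                      distances_km=[0, 10, 20, 30])
# After ~3 days of travel (30 km), BOD has decayed to roughly half.
```

A full model layers dissolved-oxygen balance, nutrient kinetics, and lateral inflows on top of this decay backbone.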
Gulf of Mexico dissolved oxygen model (GoMDOM) research and quality assurance project plan
An integrated high resolution mathematical modeling framework is being developed that will link hydrodynamic, atmospheric, and water quality models for the northern Gulf of Mexico. This Research and Quality Assurance Project Plan primarily focuses on the deterministic Gulf of Me...
NASA Technical Reports Server (NTRS)
Engwirda, Darren
2017-01-01
An algorithm for the generation of non-uniform, locally orthogonal staggered unstructured spheroidal grids is described. This technique is designed to generate very high-quality staggered Voronoi-Delaunay meshes appropriate for general circulation modelling on the sphere, including applications to atmospheric simulation, ocean-modelling and numerical weather prediction. Using a recently developed Frontal-Delaunay refinement technique, a method for the construction of high-quality unstructured spheroidal Delaunay triangulations is introduced. A locally orthogonal polygonal grid, derived from the associated Voronoi diagram, is computed as the staggered dual. It is shown that use of the Frontal-Delaunay refinement technique allows for the generation of very high-quality unstructured triangulations, satisfying a priori bounds on element size and shape. Grid quality is further improved through the application of hill-climbing-type optimisation techniques. Overall, the algorithm is shown to produce grids with very high element quality and smooth grading characteristics, while imposing relatively low computational expense. A selection of uniform and non-uniform spheroidal grids appropriate for high-resolution, multi-scale general circulation modelling are presented. These grids are shown to satisfy the geometric constraints associated with contemporary unstructured C-grid-type finite-volume models, including the Model for Prediction Across Scales (MPAS-O). The use of user-defined mesh-spacing functions to generate smoothly graded, non-uniform grids for multi-resolution-type studies is discussed in detail.
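The Voronoi-Delaunay duality at the heart of such staggered grids rests on one geometric fact: each Voronoi vertex of the dual is the circumcentre of a Delaunay triangle. A planar sketch (the paper works on the sphere, where great-circle analogues apply):

```python
# Hedged planar sketch: the circumcentre of a Delaunay triangle is a
# vertex of the staggered Voronoi dual. Standard closed-form formula.

def circumcentre(a, b, c):
    (ax, ay), (bx, by), (cx, cy) = a, b, c
    d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return (ux, uy)

# For a right triangle the circumcentre is the hypotenuse midpoint:
centre = circumcentre((0, 0), (2, 0), (0, 2))  # (1.0, 1.0)
```

Local orthogonality of the staggered pair follows because each Voronoi edge, joining two adjacent circumcentres, is perpendicular to the Delaunay edge it crosses.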
When environmentally persistent pathogens transform good habitat into ecological traps.
Leach, Clinton B; Webb, Colleen T; Cross, Paul C
2016-03-01
Habitat quality plays an important role in the dynamics and stability of wildlife metapopulations. However, the benefits of high-quality habitat may be modulated by the presence of an environmentally persistent pathogen. In some cases, the presence of environmental pathogen reservoirs on high-quality habitat may lead to the creation of ecological traps, wherein host individuals preferentially colonize high-quality habitat, but are then exposed to increased infection risk and disease-induced mortality. We explored this possibility through the development of a stochastic patch occupancy model, where we varied the pathogen's virulence, transmission rate and environmental persistence as well as the distribution of habitat quality in the host metapopulation. This model suggests that for pathogens with intermediate levels of spread, high-quality habitat can serve as an ecological trap, and can be detrimental to host persistence relative to low-quality habitat. This inversion of the relative roles of high- and low-quality habitat highlights the importance of considering the interaction between spatial structure and pathogen transmission when managing wildlife populations exposed to an environmentally persistent pathogen.
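The ecological-trap mechanism can be demonstrated with a toy stochastic patch occupancy model (SPOM): colonization is biased toward high-quality patches, but patches holding an environmental pathogen reservoir impose extra extinction risk. All rates below are illustrative, not the paper's parameters.

```python
import random

# Hedged toy SPOM: per step, each occupied patch goes extinct with a base
# probability (raised if it holds a pathogen reservoir), and each empty
# patch is colonized with probability proportional to occupancy and quality.

def simulate(qualities, reservoir, steps=200, seed=0):
    rng = random.Random(seed)
    occupied = [True] * len(qualities)
    for _ in range(steps):
        n_occ = sum(occupied)
        for i, q in enumerate(qualities):
            if occupied[i]:
                extinct = 0.1 + (0.4 if reservoir[i] else 0.0)
                if rng.random() < extinct:
                    occupied[i] = False
            elif rng.random() < 0.05 * n_occ * q:  # quality-biased colonization
                occupied[i] = True
    return sum(occupied) / len(occupied)

# Identical habitat quality; only the pathogen reservoir differs:
clean = simulate([1.0] * 10, [False] * 10)
trapped = simulate([1.0] * 10, [True] * 10)
# The reservoir landscape drifts toward extinction despite "good" habitat.
```

Varying the extra extinction term and the colonization bias reproduces the paper's qualitative result that intermediate pathogen spread inverts the value of high-quality habitat.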
The high-order decoupled direct method in three dimensions for particulate matter (HDDM-3D/PM) has been implemented in the Community Multiscale Air Quality (CMAQ) model to enable advanced sensitivity analysis. The major effort of this work is to develop high-order DDM sensitivity...
Modified-BRISQUE as no reference image quality assessment for structural MR images.
Chow, Li Sze; Rajagopal, Heshalini
2017-11-01
An effective and practical Image Quality Assessment (IQA) model is needed to assess the image quality produced by any new hardware or software in MRI. A highly competitive No-Reference IQA (NR-IQA) model called the Blind/Referenceless Image Spatial Quality Evaluator (BRISQUE), initially designed for natural images, was modified to evaluate structural MR images. The BRISQUE model measures image quality using locally normalized luminance coefficients, which are used to calculate the image features. The modified-BRISQUE model trained a new regression model using MR image features and the Difference Mean Opinion Score (DMOS) from 775 MR images. Two types of benchmarks, objective and subjective assessments, were used as performance evaluators for both the original and modified BRISQUE models. There was a high correlation between the modified BRISQUE and both benchmarks, higher than that of the original BRISQUE, with a significant percentage improvement in the correlation values. The modified BRISQUE was statistically better than the original BRISQUE. The modified-BRISQUE model can accurately measure the image quality of MR images and is a practical NR-IQA model for MR images that requires no reference images. Copyright © 2017 Elsevier Inc. All rights reserved.
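The "locally normalized luminance coefficients" at the heart of BRISQUE are mean-subtracted contrast-normalized (MSCN) coefficients; a minimal numpy sketch follows. The Gaussian window width `sigma`, the stabilizing constant `c`, and the function name `mscn` are illustrative choices, not taken from the modified-BRISQUE implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def mscn(image, sigma=7 / 6, c=1.0):
    """Mean-subtracted contrast-normalized (MSCN) coefficients.

    Local mean and variance are estimated with a Gaussian window;
    each pixel is then normalized by its local deviation.  The
    constant c keeps the division stable in flat regions.
    """
    image = image.astype(np.float64)
    mu = gaussian_filter(image, sigma)                     # local mean
    var = gaussian_filter(image * image, sigma) - mu * mu  # local variance
    sigma_local = np.sqrt(np.maximum(var, 0.0))
    return (image - mu) / (sigma_local + c)
```

BRISQUE-style features are then statistics fitted to the distribution of these coefficients; that fitting step is omitted here.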
Fang, Ruogu; Karlsson, Kolbeinn; Chen, Tsuhan; Sanelli, Pina C.
2014-01-01
Blood-brain-barrier permeability (BBBP) measurements extracted from perfusion computed tomography (PCT) using the Patlak model can be a valuable indicator for predicting hemorrhagic transformation in patients with acute stroke. Unfortunately, standard Patlak-model-based PCT requires excessive radiation exposure, which has raised radiation safety concerns. Minimizing radiation dose is of high value in clinical practice but can degrade image quality because of the severe noise it introduces. The purpose of this work is to construct high-quality BBBP maps from low-dose PCT data by using the brain structural similarity between different individuals and the relations between the high- and low-dose maps. The proposed sparse high-dose-induced (shd-Patlak) model works by building a high-dose-induced prior for the Patlak model with a set of location-adaptive dictionaries, followed by an optimized estimation of the BBBP map with the prior-regularized Patlak model. Evaluation with simulated low-dose clinical brain PCT datasets clearly demonstrates that the shd-Patlak model can achieve more significant gains than the standard Patlak model, with improved visual quality, higher fidelity to the gold standard, and more accurate details for clinical analysis. PMID:24200529
Cohen, Mark E; Dimick, Justin B; Bilimoria, Karl Y; Ko, Clifford Y; Richards, Karen; Hall, Bruce Lee
2009-12-01
Although logistic regression has commonly been used to adjust for risk differences in patient and case mix to permit quality comparisons across hospitals, hierarchical modeling has been advocated as the preferred methodology, because it accounts for clustering of patients within hospitals. It is unclear whether hierarchical models would yield important differences in quality assessments compared with logistic models when applied to American College of Surgeons (ACS) National Surgical Quality Improvement Program (NSQIP) data. Our objective was to evaluate differences in logistic versus hierarchical modeling for identifying hospitals with outlying outcomes in the ACS-NSQIP. Data from ACS-NSQIP patients who underwent colorectal operations in 2008 at hospitals that reported at least 100 operations were used to generate logistic and hierarchical prediction models for 30-day morbidity and mortality. Differences in risk-adjusted performance (ratio of observed-to-expected events) and outlier detections from the two models were compared. Logistic and hierarchical models identified the same 25 hospitals as morbidity outliers (14 low and 11 high outliers), but the hierarchical model identified 2 additional high outliers. Both models identified the same eight hospitals as mortality outliers (five low and three high outliers). The values of observed-to-expected events ratios and p values from the two models were highly correlated. Results were similar when data were included from hospitals providing <100 patients. When applied to ACS-NSQIP data, logistic and hierarchical models provided nearly identical results with respect to identification of hospitals' observed-to-expected events ratio outliers. As hierarchical models are prone to implementation problems, logistic regression will remain an accurate and efficient method for performing risk adjustment of hospital quality comparisons.
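The observed-to-expected (O/E) ratio at the core of both approaches can be sketched as below for the logistic case. This is a hedged illustration with synthetic inputs: the helper `oe_ratio` and its arguments are hypothetical names, and the hierarchical variant, which adds hospital-level random effects, is not shown.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def oe_ratio(X, y, hospital_ids):
    """Risk-adjusted observed-to-expected event ratio per hospital.

    A logistic model on patient risk factors supplies the expected
    event probability; each hospital's O/E is its observed event
    count divided by the sum of its expected probabilities.
    """
    model = LogisticRegression().fit(X, y)
    expected = model.predict_proba(X)[:, 1]
    ratios = {}
    for h in np.unique(hospital_ids):
        mask = hospital_ids == h
        ratios[h] = y[mask].sum() / expected[mask].sum()
    return ratios
```

An O/E ratio well above 1 flags a high outlier, well below 1 a low outlier, once confidence limits are attached.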
NASA Technical Reports Server (NTRS)
Quattrochi, Dale A.; Estes, Maurice G., Jr.; Crosson, William L.; Khan, Maudood N.
2006-01-01
The Atlanta Urban Heat Island and Air Quality Project had its genesis in Project ATLANTA (ATlanta Land use Analysis: Temperature and Air quality), which began in 1996. Project ATLANTA examined how high-spatial-resolution thermal remote sensing data could be used to derive better measurements of the Urban Heat Island effect over Atlanta. We have explored how these thermal remote sensing data, as well as other image datasets, can be used to better characterize the urban landscape for improved air quality modeling over the Atlanta area. For the air quality modeling project, the National Land Cover Dataset and the local-scale Landpro99 dataset, both at 30 m spatial resolution, have been used to derive land use/land cover characteristics for input into the MM5 mesoscale meteorological model, one of the foundations for the Community Multiscale Air Quality (CMAQ) model, to assess how these data can improve output from CMAQ. Additionally, land use changes to 2030 have been predicted using a Spatial Growth Model (SGM), which simulates growth around a region using population, employment and travel demand forecasts. Air quality modeling simulations were conducted using both current and future land cover. Meteorological modeling simulations indicate a 0.5°C increase in daily maximum air temperatures by 2030. Air quality modeling simulations show substantial differences in the relative contributions of individual atmospheric pollutant constituents as a result of land cover change. Enhanced boundary layer mixing over the city tends to offset the increase in ozone concentration expected from higher surface temperatures as a result of urbanization.
Artificial neural network modeling of the water quality index using land use areas as predictors.
Gazzaz, Nabeel M; Yusoff, Mohd Kamil; Ramli, Mohammad Firuz; Juahir, Hafizan; Aris, Ahmad Zaharin
2015-02-01
This paper describes the design of an artificial neural network (ANN) model to predict the water quality index (WQI) using land use areas as predictors. Ten-year records of land use statistics and water quality data for Kinta River (Malaysia) were employed in the modeling process. The most accurate WQI predictions were obtained with the network architecture 7-23-1; the back propagation training algorithm; and a learning rate of 0.02. The WQI forecasts of this model had significant (p < 0.01), positive, very high correlation (ρs = 0.882) with the measured WQI values. Sensitivity analysis revealed that the relative importance of the land use classes to WQI predictions followed the order: mining > rubber > forest > logging > urban areas > agriculture > oil palm. These findings show that the ANNs are highly reliable means of relating water quality to land use, thus integrating land use development with river water quality management.
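A 7-23-1 feed-forward network with a 0.02 learning rate, as described above, can be sketched with scikit-learn. The synthetic land-use data and coefficient vector below are placeholders; the paper's exact back-propagation variant, training records and preprocessing are not reproduced.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Seven land-use predictors -> 23 hidden units -> 1 WQI output,
# with the paper's reported learning rate of 0.02.
model = MLPRegressor(hidden_layer_sizes=(23,),
                     learning_rate_init=0.02,
                     max_iter=2000, random_state=0)

# Placeholder data: rows are observations, columns stand in for the
# seven land-use classes (mining, rubber, forest, logging, urban
# areas, agriculture, oil palm).
rng = np.random.default_rng(0)
X = rng.random((100, 7))
y = X @ np.array([3.0, 2.0, 1.5, 1.0, -1.0, -0.5, 0.5])  # synthetic WQI

model.fit(X, y)
pred = model.predict(X)
```

Sensitivity analysis of the kind reported would then perturb each input column in turn and rank the resulting changes in predicted WQI.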
Advanced Computational Methods for High-accuracy Refinement of Protein Low-quality Models
NASA Astrophysics Data System (ADS)
Zang, Tianwu
Predicting the three-dimensional structure of proteins has been a major interest in modern computational biology. While many successful methods can generate models with 3-5 Å root-mean-square deviation (RMSD) from the solution, progress in refining these models has been quite slow. It is therefore urgently necessary to develop effective methods that bring low-quality models into higher-accuracy ranges (e.g., less than 2 Å RMSD). In this thesis, I present several novel computational methods to address the high-accuracy refinement problem. First, an enhanced sampling method, named parallel continuous simulated tempering (PCST), is developed to accelerate molecular dynamics (MD) simulation. Second, two energy biasing methods, the Structure-Based Model (SBM) and the Ensemble-Based Model (EBM), are introduced to perform targeted sampling around important conformations. Third, a three-step method is developed to blindly select high-quality models along the MD simulation. These methods work together to achieve significant refinement of low-quality models without any knowledge of the solution. The effectiveness of these methods is examined in different applications. Using the PCST-SBM method, models with higher global distance test scores (GDT_TS) are generated and selected in MD simulations of 18 targets from the refinement category of the 10th Critical Assessment of Structure Prediction (CASP10). In addition, a refinement test of two CASP10 targets using the PCST-EBM method indicates that EBM may bring the initial model to even higher quality levels. Furthermore, a multi-round refinement protocol of PCST-SBM improves the model quality of a protein to a level sufficiently high for molecular replacement in X-ray crystallography. Our results confirm the crucial role of enhanced sampling in protein structure prediction and demonstrate that considerable improvement of low-accuracy structures is still achievable with current force fields.
Spence Laschinger, Heather K; Fida, Roberta
2015-05-01
A model linking authentic leadership, structural empowerment, and supportive professional practice environments to nurses' perceptions of patient care quality and job satisfaction was tested. Positive work environment characteristics are important for nurses' perceptions of patient care quality and job satisfaction (significant factors for retention). Few studies have examined the mechanism by which these characteristics operate to influence perceptions of patient care quality or job satisfaction. A cross-sectional provincial survey of 723 Canadian nurses was used to test the hypothesized models using structural equation modeling. The model was an acceptable fit and all paths were significant. Authentic leadership had a positive effect on structural empowerment, which had a positive effect on perceived support for professional practice and a negative effect on nurses' perceptions that inadequate unit staffing prevented them from providing high-quality patient care. These workplace conditions predicted job satisfaction. Authentic leaders play an important role in creating empowering professional practice environments that foster high-quality care and job satisfaction.
Tsai, Thomas C; Greaves, Felix; Zheng, Jie; Orav, E John; Zinner, Michael J; Jha, Ashish K
2016-09-01
US policy makers are making efforts to simultaneously improve the quality of and reduce spending on health care through alternative payment models such as bundled payment. Bundled payment models are predicated on the theory that aligning financial incentives for all providers across an episode of care will lower health care spending while improving quality. Whether this is true remains unknown. Using national Medicare fee-for-service claims for the period 2011-12 and data on hospital quality, we evaluated how thirty- and ninety-day episode-based spending were related to two validated measures of surgical quality-patient satisfaction and surgical mortality. We found that patients who had major surgery at high-quality hospitals cost Medicare less than those who had surgery at low-quality institutions, for both thirty- and ninety-day periods. The difference in Medicare spending between low- and high-quality hospitals was driven primarily by postacute care, which accounted for 59.5 percent of the difference in thirty-day episode spending, and readmissions, which accounted for 19.9 percent. These findings suggest that efforts to achieve value through bundled payment should focus on improving care at low-quality hospitals and reducing unnecessary use of postacute care. Project HOPE—The People-to-People Health Foundation, Inc.
Presence of indicator plant species as a predictor of wetland vegetation integrity
Stapanian, Martin A.; Adams, Jean V.; Gara, Brian
2013-01-01
We fit regression and classification tree models to vegetation data collected from Ohio (USA) wetlands to determine (1) which species best predict Ohio vegetation index of biotic integrity (OVIBI) score and (2) which species best predict high-quality wetlands (OVIBI score >75). The simplest regression tree model predicted OVIBI score based on the occurrence of three plant species: skunk-cabbage (Symplocarpus foetidus), cinnamon fern (Osmunda cinnamomea), and swamp rose (Rosa palustris). The lowest OVIBI scores were best predicted by the absence of the selected plant species rather than by the presence of other species. The simplest classification tree model predicted high-quality wetlands based on the occurrence of two plant species: skunk-cabbage and marsh-fern (Thelypteris palustris). The overall misclassification rate from this tree was 13 %. Again, low-quality wetlands were better predicted than high-quality wetlands by the absence of selected species rather than the presence of other species using the classification tree model. Our results suggest that a species’ wetland status classification and coefficient of conservatism are of little use in predicting wetland quality. A simple, statistically derived species checklist such as the one created in this study could be used by field biologists to quickly and efficiently identify wetland sites likely to be regulated as high-quality, and requiring more intensive field assessments. Alternatively, it can be used for advanced determinations of low-quality wetlands. Agencies can save considerable money by screening wetlands for the presence/absence of such “indicator” species before issuing permits.
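A presence/absence classification tree of the kind described can be sketched with scikit-learn. The two synthetic indicator columns stand in for skunk-cabbage and marsh-fern occurrence, and the rule used to generate the quality labels is an assumption for illustration only, not the Ohio data.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Columns: presence (1) / absence (0) of two indicator species,
# standing in for skunk-cabbage and marsh-fern.
# Target: 1 = high-quality wetland (OVIBI > 75), 0 = otherwise.
rng = np.random.default_rng(1)
X = rng.integers(0, 2, size=(200, 2))
# Synthetic rule with 10% noise: sites with both species present
# tend to be high quality.
y = ((X.sum(axis=1) == 2) | (rng.random(200) < 0.1)).astype(int)

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
pred = tree.predict([[1, 1], [0, 0]])
```

As in the study, the tree's misclassifications concentrate on sites labeled high quality despite the indicator species being absent, which is why absence predicts low quality more reliably than presence predicts high quality.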
NASA Astrophysics Data System (ADS)
Chu, Hone-Jay; Kong, Shish-Jeng; Chang, Chih-Hua
2018-03-01
The turbidity (TB) of a water body varies with time and space. Water quality is traditionally estimated via linear regression based on satellite images. However, estimating and mapping water quality require a spatio-temporal nonstationary model, while TB mapping necessitates the use of geographically and temporally weighted regression (GTWR) and geographically weighted regression (GWR) models, both of which are more precise than linear regression. Given the temporal nonstationary models for mapping water quality, GTWR offers the best option for estimating regional water quality. Compared with GWR, GTWR provides highly reliable information for water quality mapping, boasts a relatively high goodness of fit, improves the explanation of variance from 44% to 87%, and shows a sufficient space-time explanatory power. The seasonal patterns of TB and the main spatial patterns of TB variability can be identified using the estimated TB maps from GTWR and by conducting an empirical orthogonal function (EOF) analysis.
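A minimal geographically weighted regression of the kind GTWR extends can be sketched as below: a Gaussian spatial kernel weights the observations, and a local weighted least-squares fit is evaluated at the prediction point. The function name, the bandwidth value, and the omission of GTWR's additional temporal weighting are all simplifying assumptions.

```python
import numpy as np

def gwr_predict(coords, X, y, at_coord, at_x, bandwidth):
    """Minimal geographically weighted regression (sketch).

    Observations are weighted by a Gaussian kernel of their distance
    to the prediction point, and a local weighted least-squares fit
    is evaluated there.  GTWR would additionally down-weight
    observations that are distant in time.
    """
    Xd = np.column_stack([np.ones(len(X)), X])   # intercept + predictors
    d = np.linalg.norm(coords - at_coord, axis=1)
    w = np.exp(-(d / bandwidth) ** 2)            # Gaussian spatial kernel
    A = Xd.T * w                                 # weighted normal equations
    beta = np.linalg.solve(A @ Xd, A @ y)        # local coefficients
    return np.concatenate(([1.0], np.atleast_1d(at_x))) @ beta
```

Repeating this fit at every pixel of a satellite scene yields the spatially varying coefficient surfaces that make GWR/GTWR more flexible than a single global linear regression.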
Use of ocean color scanner data in water quality mapping
NASA Technical Reports Server (NTRS)
Khorram, S.
1981-01-01
Remotely sensed data, in combination with in situ data, are used in assessing water quality parameters within the San Francisco Bay-Delta. The parameters include suspended solids, chlorophyll, and turbidity. Regression models are developed between each of the water quality parameter measurements and the Ocean Color Scanner (OCS) data. The models are then extended to the entire study area for mapping water quality parameters. The results include a series of color-coded maps, each pertaining to one of the water quality parameters, and the statistical analysis of the OCS data and regression models. It is found that concurrently collected OCS data and surface truth measurements are highly useful in mapping the selected water quality parameters and locating areas having relatively high biological activity. In addition, it is found to be virtually impossible, at least within this test site, to locate such areas on U-2 color and color-infrared photography.
Information of urban morphological features at high resolution is needed to properly model and characterize the meteorological and air quality fields in urban areas. We describe a new project called National Urban Database with Access Portal Tool, (NUDAPT) that addresses this nee...
NASA Astrophysics Data System (ADS)
Hilali, Mohamed M.
2005-11-01
A simple cost-effective approach was proposed and successfully employed to fabricate high-quality screen-printed (SP) contacts to high sheet-resistance emitters (100 Ω/sq) to improve Si solar cell efficiency. Device modeling was used to quantify the performance enhancement possible from the high sheet-resistance emitter for various cell designs. It was found that certain cell design criteria must be satisfied for performance enhancement from the high sheet-resistance emitter. Model calculations showed that in order to achieve any performance enhancement over the conventional ~40 Ω/sq emitter, the high sheet-resistance emitter solar cell must have a reasonably good (<120,000 cm/s) or low front-surface recombination velocity (FSRV). Model calculations were also performed to establish requirements for high fill factors (FFs). The results showed that the series resistance should be less than 0.8 Ω·cm², the shunt resistance should be greater than 1000 Ω·cm², and the junction leakage current should be less than 25 nA/cm². Analytical microscopy and surface analysis techniques were used to study the Ag-Si contact interface of different SP Ag pastes. Physical and electrical properties of SP Ag thick-film contacts were studied and correlated to understand and achieve good-quality ohmic contacts to high sheet-resistance emitters for solar cells. This information was then used to define the criteria for high-quality screen-printed contacts. The roles of paste constituents and firing scheme in contact quality were investigated to tailor the high-quality screen-printed contact interface structure that results in high-performance solar cells. Results indicated that small particle size, high glass transition temperature, rapid firing and a less aggressive glass frit help in producing high-quality contacts. Based on these results, high-quality SP contacts with high FFs (>0.78) on high sheet-resistance emitters were achieved for the first time using a simple single-step firing process. This technology was applied to different substrates (monocrystalline and multicrystalline) and surfaces (textured and planar). Cell efficiencies of ~16.2% on low-cost EFG ribbon substrates were achieved on high sheet-resistance emitters with SP contacts. A record high-efficiency SP solar cell of 19% with a textured high sheet-resistance emitter was also fabricated and modeled.
Does High School Facility Quality Affect Student Achievement? A Two-Level Hierarchical Linear Model
ERIC Educational Resources Information Center
Bowers, Alex J.; Urick, Angela
2011-01-01
The purpose of this study is to isolate the independent effects of high school facility quality on student achievement using a large, nationally representative U.S. database of student achievement and school facility quality. Prior research on linking school facility quality to student achievement has been mixed. Studies that relate overall…
Shi, Yuan; Lau, Kevin Ka-Lun; Ng, Edward
2017-08-01
Urban air quality is an important determinant of the quality of urban life. Land use regression (LUR) modelling of air quality is essential for conducting health impact assessments but is more challenging in a mountainous high-density urban scenario due to the complexities of the urban environment. In this study, a total of 21 LUR models are developed for seven air pollutants (the gaseous pollutants CO, NO2, NOx, O3 and SO2, and the particulate pollutants PM2.5 and PM10) with reference to three different time periods (summertime, wintertime and the annual average of 5-year long-term hourly monitoring data from the local air quality monitoring network) in Hong Kong. Under the mountainous high-density urban scenario, we improved the traditional LUR modelling method by incorporating wind availability information into LUR modelling based on surface geomorphometrical analysis. As a result, 269 independent variables were examined to develop the LUR models by using the "ADDRESS" independent variable selection method and stepwise multiple linear regression (MLR). Cross-validation has been performed for each resultant model. The results show that wind-related variables are included in most of the resultant models as statistically significant independent variables. Compared with the traditional method, a maximum increase of 20% was achieved in the prediction performance of the annual averaged NO2 concentration level by incorporating wind-related variables into LUR model development. Copyright © 2017 Elsevier Inc. All rights reserved.
Modeling the Effects of Conservation Tillage on Water Quality at the Field Scale
USDA-ARS?s Scientific Manuscript database
The development and application of predictive tools to quantitatively assess the effects of tillage and related management activities should be carefully tested against high quality field data. This study reports on: 1) the calibration and validation of the Root Zone Water Quality Model (RZWQM) to a...
NASA Astrophysics Data System (ADS)
Liu, Qiong; Wang, Wen-xi; Zhu, Ke-ren; Zhang, Chao-yong; Rao, Yun-qing
2014-11-01
Mixed-model assembly line sequencing is significant in reducing the production time and overall cost of production. To improve production efficiency, a mathematical model aiming simultaneously to minimize overtime, idle time and total set-up costs is developed. To obtain high-quality and stable solutions, an advanced scatter search approach is proposed. In the proposed algorithm, a new diversification generation method based on a genetic algorithm is presented to generate a set of potentially diverse and high-quality initial solutions. Many methods, including reference set update, subset generation, solution combination and improvement methods, are designed to maintain the diversification of populations and to obtain high-quality ideal solutions. The proposed model and algorithm are applied and validated in a case company. The results indicate that the proposed advanced scatter search approach is significant for mixed-model assembly line sequencing in this company.
A high and low noise model for strong motion accelerometers
NASA Astrophysics Data System (ADS)
Clinton, J. F.; Cauzzi, C.; Olivieri, M.
2010-12-01
We present reference noise models for high-quality strong motion accelerometer installations. We use continuous accelerometer data acquired by the Swiss Seismological Service (SED) since 2006, together with other international high-quality accelerometer network data, to derive very broadband (50 Hz-100 s) high and low noise models. The proposed noise models are compared to the Peterson (1993) low and high noise models designed for broadband seismometers; the datalogger self-noise; background noise levels at existing Swiss strong motion stations; and typical earthquake signals recorded in Switzerland and worldwide. The standard strong motion station operated by the SED consists of a Kinemetrics Episensor (2g clip level; flat acceleration response from 200 Hz to DC; <155 dB dynamic range) coupled with a 24-bit Nanometrics Taurus datalogger. The proposed noise models are based on power spectral density (PSD) noise levels for each strong motion station, computed via PQLX (McNamara and Buland, 2004) from several years of continuous recording. The 'Accelerometer Low Noise Model' (ALNM) is dominated by instrument noise from the sensor and datalogger. The 'Accelerometer High Noise Model' (AHNM) reflects 1) at high frequencies, the acceptable site noise in urban areas; 2) at mid-periods, the peak microseism energy, as determined by the Peterson High Noise Model; and 3) at long periods, the maximum noise observed from well-insulated sensor/datalogger systems placed in vault-quality sites. At all frequencies there is at least one order of magnitude between the ALNM and the AHNM; at high frequencies (>1 Hz) this extends to two orders of magnitude. This study provides remarkable confirmation of the capability of modern strong motion accelerometers to record low-amplitude ground motions with seismic observation quality. In particular, an accelerometric station operating at the ALNM is capable of recording the full spectrum of near-source earthquakes, out to 100 km, down to M2.
Of particular interest for the SED, this study provides acceptable noise limits for candidate sites for the on-going Strong Motion Network modernisation.
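Noise models of this kind are built from station power spectral densities; a minimal sketch using Welch's estimator on synthetic acceleration data follows. The 200 Hz sampling rate, the record length and the white-noise test signal are assumptions, and PQLX's probability-density-function machinery is not reproduced.

```python
import numpy as np
from scipy.signal import welch

fs = 200.0                        # samples per second (assumed)
rng = np.random.default_rng(0)
# Ten minutes of synthetic instrument noise, ~1e-6 m/s^2 RMS.
accel = 1e-6 * rng.standard_normal(int(fs * 600))

# One-sided PSD in (m/s^2)^2/Hz via Welch's averaged periodogram.
freqs, psd = welch(accel, fs=fs, nperseg=4096)
# Express in dB, as in the Peterson-style noise models; skip the DC bin.
psd_db = 10.0 * np.log10(psd[1:])
```

Percentiles of many such PSD estimates, accumulated over years and across stations, are what define curves like the ALNM and AHNM.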
Oikawa, P. Y.; Ge, C.; Wang, J.; Eberwein, J. R.; Liang, L. L.; Allsman, L. A.; Grantz, D. A.; Jenerette, G. D.
2015-01-01
Fertilized soils have large potential for production of soil nitrogen oxide (NOx=NO+NO2), however these emissions are difficult to predict in high-temperature environments. Understanding these emissions may improve air quality modelling as NOx contributes to formation of tropospheric ozone (O3), a powerful air pollutant. Here we identify the environmental and management factors that regulate soil NOx emissions in a high-temperature agricultural region of California. We also investigate whether soil NOx emissions are capable of influencing regional air quality. We report some of the highest soil NOx emissions ever observed. Emissions vary nonlinearly with fertilization, temperature and soil moisture. We find that a regional air chemistry model often underestimates soil NOx emissions and NOx at the surface and in the troposphere. Adjusting the model to match NOx observations leads to elevated tropospheric O3. Our results suggest management can greatly reduce soil NOx emissions, thereby improving air quality. PMID:26556236
ERIC Educational Resources Information Center
Davis, Heather A.; Chang, Mei-Lin; Andrzejewski, Carey E.; Poirier, Ryan R.
2014-01-01
The purpose of this study was to examine changes in students' relational engagement across the transition to high school in three schools reformed to improve the quality of student-teacher relationships. In order to analyze this data we employed latent growth curve (LGC) modeling techniques (n = 637). We ran three LGC models on three…
A Markovian model market—Akerlof's lemons and the asymmetry of information
NASA Astrophysics Data System (ADS)
Tilles, Paulo F. C.; Ferreira, Fernando F.; Francisco, Gerson; Pereira, Carlos de B.; Sarti, Flavia M.
2011-07-01
In this work we study an agent based model to investigate the role of asymmetric information degrees for market evolution. This model is quite simple and may be treated analytically since the consumers evaluate the quality of a certain good taking into account only the quality of the last good purchased plus her perceptive capacity β. As a consequence, the system evolves according to a stationary Markov chain. The value of a good offered by the firms increases along with quality according to an exponent α, which is a measure of the technology. It incorporates all the technological capacity of the production systems such as education, scientific development and techniques that change the productivity rates. The technological level plays an important role to explain how the asymmetry of information may affect the market evolution in this model. We observe that, for high technological levels, the market can detect adverse selection. The model allows us to compute the maximum asymmetric information degree before the market collapses. Below this critical point the market evolves during a limited period of time and then dies out completely. When β is closer to 1 (symmetric information), the market becomes more profitable for high quality goods, although high and low quality markets coexist. The maximum asymmetric information level is a consequence of an ergodicity breakdown in the process of quality evaluation.
Gradient Magnitude Similarity Deviation: A Highly Efficient Perceptual Image Quality Index.
Xue, Wufeng; Zhang, Lei; Mou, Xuanqin; Bovik, Alan C
2014-02-01
It is an important task to faithfully evaluate the perceptual quality of output images in many applications, such as image compression, image restoration, and multimedia streaming. A good image quality assessment (IQA) model should not only deliver high prediction accuracy, but also be computationally efficient. The efficiency of IQA metrics is becoming particularly important due to the increasing proliferation of high-volume visual data in high-speed networks. We present a new effective and efficient IQA model, called gradient magnitude similarity deviation (GMSD). Image gradients are sensitive to image distortions, while different local structures in a distorted image suffer different degrees of degradation. This motivates us to explore the use of the global variation of a gradient-based local quality map for overall image quality prediction. We find that the pixel-wise gradient magnitude similarity (GMS) between the reference and distorted images, combined with a novel pooling strategy (the standard deviation of the GMS map), can accurately predict perceptual image quality. The resulting GMSD algorithm is much faster than most state-of-the-art IQA methods, and delivers highly competitive prediction accuracy. MATLAB source code of GMSD can be downloaded at http://www4.comp.polyu.edu.hk/~cslzhang/IQA/GMSD/GMSD.htm.
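The GMS map and its standard-deviation pooling can be sketched in a few lines of numpy. The Prewitt gradient filters and the contrast constant `c = 170` (appropriate for 8-bit intensity ranges) follow our reading of the paper, but this is an illustrative sketch rather than the authors' MATLAB implementation, which also downsamples the images first.

```python
import numpy as np
from scipy.ndimage import convolve

def gmsd(ref, dist, c=170.0):
    """Gradient magnitude similarity deviation (sketch).

    Gradient magnitudes are computed with Prewitt filters; their
    pixel-wise similarity map is pooled by its standard deviation,
    so identical images score 0 and larger values mean lower quality.
    """
    px = np.array([[1, 0, -1]] * 3) / 3.0        # Prewitt, horizontal
    py = px.T                                    # Prewitt, vertical

    def grad_mag(img):
        img = img.astype(np.float64)
        return np.hypot(convolve(img, px), convolve(img, py))

    g1, g2 = grad_mag(ref), grad_mag(dist)
    gms = (2.0 * g1 * g2 + c) / (g1 ** 2 + g2 ** 2 + c)
    return gms.std()
```

The standard-deviation pooling is the paper's key departure from mean pooling: it rewards uniformly good (or uniformly bad) maps and penalizes spatially uneven degradation.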
Uncertainty, ensembles and air quality dispersion modeling: applications and challenges
NASA Astrophysics Data System (ADS)
Dabberdt, Walter F.; Miller, Erik
The past two decades have seen significant advances in mesoscale meteorological modeling research and applications, such as the development of sophisticated and now widely used advanced mesoscale prognostic models, large eddy simulation models, four-dimensional data assimilation, adjoint models, adaptive and targeted observational strategies, and ensemble and probabilistic forecasts. Some of these advances are now being applied to urban air quality modeling and applications. Looking forward, it is anticipated that the high-priority air quality issues for the near-to-intermediate future will likely include: (1) routine operational forecasting of adverse air quality episodes; (2) real-time high-level support to emergency response activities; and (3) quantification of model uncertainty. Special attention is focused here on the quantification of model uncertainty through the use of ensemble simulations. Application to emergency-response dispersion modeling is illustrated using an actual event that involved the accidental release of the toxic chemical oleum. Both surface footprints of mass concentration and the associated probability distributions at individual receptors are seen to provide valuable quantitative indicators of the range of expected concentrations and their associated uncertainty.
DockQ: A Quality Measure for Protein-Protein Docking Models
Basu, Sankar
2016-01-01
The state-of-the-art for assessing the structural quality of docking models is currently based on three related yet independent quality measures, Fnat, LRMS, and iRMS, as proposed and standardized by CAPRI. These quality measures quantify different aspects of the quality of a particular docking model and need to be viewed together to reveal the true quality; e.g., a model with relatively poor LRMS (>10 Å) might still qualify as 'acceptable' with a decent Fnat (>0.50) and iRMS (<3.0 Å). This is also the reason why the so-called CAPRI criteria for assessing the quality of docking models are defined by applying various ad hoc cutoffs on these measures to classify a docking model into four classes: Incorrect, Acceptable, Medium, or High quality. This classification has been useful in CAPRI, but since models are grouped into only four bins it is also rather limiting, making it difficult to rank models, correlate with scoring functions, or use it as a target function in machine learning algorithms. Here, we present DockQ, a continuous protein-protein docking model quality measure derived by combining Fnat, LRMS, and iRMS into a single score in the range [0, 1] that can be used to assess the quality of protein docking models. By using DockQ on CAPRI models it is possible to almost completely reproduce the original CAPRI classification into Incorrect, Acceptable, Medium and High quality, with an average PPV of 94% at 90% recall, demonstrating that there is no need to apply predefined ad hoc cutoffs to classify docking models. Since DockQ recapitulates the CAPRI classification almost perfectly, it can be viewed as a higher-resolution version of the CAPRI classification, making it possible to estimate model quality in a more quantitative way using Z-scores or sums of top-ranked models, which has been so valuable for the CASP community.
The possibility to directly correlate a quality measure to a scoring function has been crucial for the development of scoring functions for protein structure prediction, and DockQ should be useful in a similar development in the protein docking field. DockQ is available at http://github.com/bjornwallner/DockQ/ PMID:27560519
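The combination of Fnat, LRMS, and iRMS into a single [0, 1] score can be sketched in a few lines. The form below (each RMSD mapped through a scaled term 1/(1 + (RMS/d)^2), then averaged with Fnat) and the constants d1 = 8.5 Å and d2 = 1.5 Å follow the published DockQ definition as best we recall it; treat this as an illustrative sketch and use the GitHub repository above for the reference implementation.

```python
def scaled_rms(rms, d):
    # Map an RMSD (in Angstrom) into (0, 1]: 0 A -> 1.0, large RMSD -> 0.
    return 1.0 / (1.0 + (rms / d) ** 2)

def dockq(fnat, lrms, irms, d1=8.5, d2=1.5):
    """Continuous docking-model quality score in [0, 1].

    fnat: fraction of native interface contacts recovered (0..1).
    lrms: ligand RMSD in Angstrom; irms: interface RMSD in Angstrom.
    d1, d2: scaling constants for LRMS and iRMS respectively.
    """
    return (fnat + scaled_rms(lrms, d1) + scaled_rms(irms, d2)) / 3.0
```

A perfect model (Fnat = 1, both RMSDs = 0) scores 1.0, while a model with no native contacts and large RMSDs approaches 0, giving the continuous ranking the four CAPRI bins cannot.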
Assessing product image quality for online shopping
NASA Astrophysics Data System (ADS)
Goswami, Anjan; Chung, Sung H.; Chittar, Naren; Islam, Atiq
2012-01-01
Assessing product-image quality is important in the context of online shopping. A high-quality image that conveys more information about a product can boost the buyer's confidence and attract more attention. However, the notion of image quality for product images is not the same as in other domains. The perceived quality of product images depends not only on various photographic quality features but also on high-level features such as clarity of the foreground or goodness of the background. In this paper, we define a notion of product-image quality based on such features. We conduct a crowd-sourced experiment to collect user judgments on thousands of eBay's images. We formulate a multi-class classification problem for modeling image quality by classifying images into good, fair, and poor quality based on the guided perceptual notions from the judges. We also conduct experiments with regression using the average crowd-sourced human judgments as the target. We compute a pseudo-regression score as the expected average of the predicted classes, and also compute a score from the regression technique. We design many experiments with various sampling and voting schemes on the crowd-sourced data and construct various experimental image-quality models. Most of our models have reasonable accuracies (greater than or equal to 70%) on the test data set. We observe that our computed image quality score has a high (0.66) rank correlation with the average votes from the crowd-sourced human judgments.
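The pseudo-regression score described above (an expected average over predicted classes) can be sketched as follows; the numeric values assigned to the poor/fair/good classes are an illustrative assumption, not taken from the paper.

```python
def pseudo_regression_score(class_probs, class_values=(1.0, 2.0, 3.0)):
    # Expected class value under the classifier's predicted distribution.
    # class_probs: probabilities for (poor, fair, good); the mapping
    # poor=1, fair=2, good=3 is an assumed convention for illustration.
    assert abs(sum(class_probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(p * v for p, v in zip(class_probs, class_values))
```

A classifier that puts 50% of its mass on "fair" and 30% on "good" yields a score between 2 and 3, turning discrete class predictions into a continuous quality estimate.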
Reputation and Competition in a Hidden Action Model
Fedele, Alessandro; Tedeschi, Piero
2014-01-01
The economic models of reputation and quality in markets can be classified into three categories. (i) Pure hidden action, where only one type of seller is present who can provide goods of different quality. (ii) Pure hidden information, where sellers of different types have no control over product quality. (iii) Mixed frameworks, which include both hidden action and hidden information. In this paper we develop a pure hidden action model of reputation and Bertrand competition, where consumers and firms interact repeatedly in a market with free entry. The price of the good produced by the firms is contractible, whilst the quality is noncontractible; hence it is promised by the firms when a contract is signed. Consumers infer future quality from all available information, i.e., both from what they know about past quality and from current prices. According to early contributions, competition should make reputation unable to induce the production of high-quality goods. We provide a simple solution to this problem by showing that high quality levels are sustained as an outcome of a stationary symmetric equilibrium. PMID:25329387
The difficulty in assessing errors in numerical models of air quality is a major obstacle to improving their ability to predict and retrospectively map air quality. In this paper, using simulation outputs from the Community Multi-scale Air Quality Model (CMAQ), the statistic...
Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R; Weber, Barbara
2016-05-09
A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.
Application of uncertainty and sensitivity analysis to the air quality SHERPA modelling tool
NASA Astrophysics Data System (ADS)
Pisoni, E.; Albrecht, D.; Mara, T. A.; Rosati, R.; Tarantola, S.; Thunis, P.
2018-06-01
Air quality has significantly improved in Europe over the past few decades. Nonetheless, we still find high concentrations in measurements, mainly in specific regions or cities. This dimensional shift, from EU-wide to hot-spot exceedances, calls for a novel approach to regional air quality management (to complement existing EU-wide policies). The SHERPA (Screening for High Emission Reduction Potentials on Air quality) modelling tool was developed in this context. It provides an additional tool to be used in support of regional/local decision makers responsible for the design of air quality plans. It is therefore important to evaluate the quality of the SHERPA model and its behaviour in the face of various kinds of uncertainty. Uncertainty and sensitivity analysis techniques can be used for this purpose. They both reveal the links between assumptions and forecasts, help with model simplification, and may highlight unexpected relationships between inputs and outputs. Thus, a policy-steered SHERPA module - predicting air quality improvement linked to emission reduction scenarios - was evaluated by means of (1) uncertainty analysis (UA), to quantify uncertainty in the model output, and (2) sensitivity analysis (SA), to identify the most influential input sources of this uncertainty. The results of this study provide relevant information about the key variables driving the SHERPA output uncertainty, and advise policy-makers and modellers where to place their efforts for an improved decision-making process.
Integrating multiple data sources in species distribution modeling: A framework for data fusion
Pacifici, Krishna; Reich, Brian J.; Miller, David A.W.; Gardner, Beth; Stauffer, Glenn E.; Singh, Susheela; McKerrow, Alexa; Collazo, Jaime A.
2017-01-01
The last decade has seen a dramatic increase in the use of species distribution models (SDMs) to characterize patterns of species’ occurrence and abundance. Efforts to parameterize SDMs often create a tension between the quality and quantity of data available to fit models. Estimation methods that integrate both standardized and non-standardized data types offer a potential solution to the tradeoff between data quality and quantity. Recently, several authors have developed approaches for jointly modeling two sources of data (one of high quality and one of lesser quality). We extend their work by allowing for explicit spatial autocorrelation in occurrence and detection error using a Multivariate Conditional Autoregressive (MVCAR) model, and develop three models that share information in a less direct manner, resulting in more robust performance when the auxiliary data are of lesser quality. We describe these three new approaches (“Shared,” “Correlation,” “Covariates”) for combining data sources and show their use in a case study of the Brown-headed Nuthatch in the southeastern U.S. and through simulations. All three approaches that used the second data source improved out-of-sample predictions relative to a single data source (“Single”). When the information in the second data source is of high quality, the Shared model performs best, but the Correlation and Covariates models also perform well. When the second data source is of lesser quality, the Correlation and Covariates models performed better, suggesting they are robust alternatives when little is known about auxiliary data collected opportunistically or through citizen scientists. Methods that allow both data types to be used will maximize the useful information available for estimating species distributions.
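The core of a "Shared" data-fusion approach can be illustrated with a toy joint likelihood: one occupancy probability feeds two observation models with different detection rates. This is a deliberately minimal sketch of the shared-parameter idea, not the authors' MVCAR formulation; all function names and parameter values are hypothetical.

```python
import math

def sigmoid(x):
    # Logistic link mapping a linear predictor to a probability in (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def joint_loglik(beta, x, y_survey, y_opportunistic, p_det=0.8, p_opp=0.5):
    # "Shared" fusion sketch: a single occupancy probability
    # psi(x) = sigmoid(beta * x) underlies both data sources -- a
    # standardized survey with detection probability p_det and an
    # opportunistic (lesser-quality) source with p_opp. Illustrative only.
    ll = 0.0
    for xi, ys, yo in zip(x, y_survey, y_opportunistic):
        psi = sigmoid(beta * xi)
        for y, p in ((ys, p_det), (yo, p_opp)):
            prob = psi * p if y else (1.0 - psi * p)
            ll += math.log(prob)
    return ll
```

Because both data sources contribute terms to the same log-likelihood in `beta`, the high-quality survey anchors the occupancy estimate while the opportunistic data adds information without requiring its own occupancy model.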
NASA Astrophysics Data System (ADS)
Sondkar, Pravin B.
The severity of the combined aerodynamic and power transmission response in high-speed, high-power-density systems such as a rotorcraft is still a major cause of annoyance, in spite of recent advances in passive, semi-active, and active control. With further increases in the capacity and power of this class of machinery, acoustic noise levels are expected to rise even more. To achieve further improvements in sound quality, a more refined understanding of the factors and attributes controlling human perception is needed. In the case of rotorcraft systems, the perceived quality of the interior sound field is a major determining factor of passenger comfort. Traditionally, this sound quality factor is determined by measuring the response of a chosen set of juries who are asked to compare their qualitative reactions to two or more sounds based on their subjective impressions. This type of testing is very time-consuming, costly, often inconsistent, and not useful for practical design purposes. Furthermore, there is no known universal model for sound quality. The primary aim of this research is to achieve significant improvements in quantifying the sound quality of the combined aerodynamic and power transmission response in high-speed, high-power-density machinery systems such as a rotorcraft by applying relevant objective measures related to the spectral characteristics of the sound field. Two models are proposed in this dissertation research. First, a classical multivariate regression model based on currently known sound quality metrics, as well as some new metrics derived in this study, is presented. Even though the analysis resulted in the best possible multivariate model as a measure of acoustic noise quality, it lacks any incorporation of the human judgment mechanism. The regression model can change depending on the specific application, the nature of the sounds, and the types of juries used in the study.
It also predicts only averaged preference scores and does not explain why two jury members differ in their judgments. To address these shortcomings of regression analysis, a new human judgment model is proposed to further improve the ability to predict the degree of subjective annoyance. The human judgment model involves extraction of subjective attributes and their values using a proposed artificial jury processor. In this approach, a set of ear transfer functions is employed to compute the characteristics of sound pressure waves as perceived subjectively by humans. The resulting basilar membrane displacement data from this model are then used to analyze the attribute values. Using this human judgment model, the highly sophisticated human judgment mechanism can be examined. Since the human judgment model is essentially based on jury attributes that are not expected to change significantly with the application or the nature of the sound field, it gives a more common basis for evaluating sound quality. This model also attempts to explain inter-juror differences in opinion, which is critical to understanding the variability in human response.
Gaia: automated quality assessment of protein structure models.
Kota, Pradeep; Ding, Feng; Ramachandran, Srinivas; Dokholyan, Nikolay V
2011-08-15
Increasing use of structural modeling for understanding structure-function relationships in proteins has led to the need to ensure that the protein models being used are of acceptable quality. Quality of a given protein structure can be assessed by comparing various intrinsic structural properties of the protein to those observed in high-resolution protein structures. In this study, we present tools to compare a given structure to high-resolution crystal structures. We assess packing by calculating the total void volume, the percentage of unsatisfied hydrogen bonds, the number of steric clashes and the scaling of the accessible surface area. We assess covalent geometry by determining bond lengths, angles, dihedrals and rotamers. The statistical parameters for the above measures, obtained from high-resolution crystal structures enable us to provide a quality-score that points to specific areas where a given protein structural model needs improvement. We provide these tools that appraise protein structures in the form of a web server Gaia (http://chiron.dokhlab.org). Gaia evaluates the packing and covalent geometry of a given protein structure and provides quantitative comparison of the given structure to high-resolution crystal structures. dokh@unc.edu Supplementary data are available at Bioinformatics online.
Mesh quality oriented 3D geometric vascular modeling based on parallel transport frame.
Guo, Jixiang; Li, Shun; Chui, Yim Pan; Qin, Jing; Heng, Pheng Ann
2013-08-01
While a number of methods have been proposed to reconstruct geometrically and topologically accurate 3D vascular models from medical images, little attention has been paid to constantly maintain high mesh quality of these models during the reconstruction procedure, which is essential for many subsequent applications such as simulation-based surgical training and planning. We propose a set of methods to bridge this gap based on parallel transport frame. An improved bifurcation modeling method and two novel trifurcation modeling methods are developed based on 3D Bézier curve segments in order to ensure the continuous surface transition at furcations. In addition, a frame blending scheme is implemented to solve the twisting problem caused by frame mismatch of two successive furcations. A curvature based adaptive sampling scheme combined with a mesh quality guided frame tilting algorithm is developed to construct an evenly distributed, non-concave and self-intersection free surface mesh for vessels with distinct radius and high curvature. Extensive experiments demonstrate that our methodology can generate vascular models with better mesh quality than previous methods in terms of surface mesh quality criteria. Copyright © 2013 Elsevier Ltd. All rights reserved.
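Centreline segments built from cubic Bézier curves, as used above for continuous surface transitions at furcations, reduce to a simple evaluation rule. The sketch below shows standard cubic Bézier evaluation in 3D, not the authors' specific frame-transport or furcation-blending code.

```python
def bezier3(p0, p1, p2, p3, t):
    # Evaluate a cubic Bezier segment at parameter t in [0, 1].
    # p0..p3 are 3D control points; p0 and p3 are the endpoints,
    # p1 and p2 shape the tangents (and hence the curvature).
    s = 1.0 - t
    return tuple(
        s**3 * a + 3 * s**2 * t * b + 3 * s * t**2 * c + t**3 * d
        for a, b, c, d in zip(p0, p1, p2, p3)
    )
```

Sampling such segments adaptively by curvature, as the paper proposes, then amounts to evaluating `bezier3` more densely where the curve bends sharply.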
Designing multifocal corneal models to correct presbyopia by laser ablation
NASA Astrophysics Data System (ADS)
Alarcón, Aixa; Anera, Rosario G.; Del Barco, Luis Jiménez; Jiménez, José R.
2012-01-01
Two multifocal corneal models and an aspheric model designed to correct presbyopia by corneal photoablation were evaluated. The design of each model was optimized to achieve the best possible visual quality for both near and distance vision. In addition, we evaluated the effect of miosis and pupil decentration on visual quality. The corrected model with the central zone for near vision provides better results, since it requires less ablated corneal surface area, permits higher addition values, presents more stable visual quality under pupil-size variations, and yields lower high-order aberrations.
Lin, Xiaotong; Liu, Mei; Chen, Xue-wen
2009-04-29
Protein-protein interactions play vital roles in nearly all cellular processes and are involved in the construction of biological pathways such as metabolic and signal transduction pathways. Although large-scale experiments have enabled the discovery of thousands of previously unknown linkages among proteins in many organisms, high-throughput interaction data are often associated with high error rates. Since protein interaction networks have been utilized in numerous biological inferences, these experimental errors inevitably affect the quality of such predictions. Thus, it is essential to assess the quality of the protein interaction data. In this paper, a novel Bayesian network-based integrative framework is proposed to assess the reliability of protein-protein interactions. We develop a cross-species in silico model that assigns likelihood scores to individual protein pairs based on information extracted entirely from model organisms. Our proposed approach integrates multiple microarray datasets and novel features derived from the Gene Ontology. Furthermore, the confidence scores for cross-species protein mappings are explicitly incorporated into our model. Applying our model to predict protein interactions in the human genome, we achieve 80% sensitivity and 70% specificity. Finally, we assess the overall quality of the experimentally determined yeast protein-protein interaction dataset. We observe that the more high-throughput experiments confirm an interaction, the higher its likelihood score, which confirms the effectiveness of our approach. This study demonstrates that model organisms provide important information for protein-protein interaction inference and assessment. The proposed method is able to assess not only the overall quality of an interaction dataset, but also the quality of individual protein-protein interactions.
We expect the method to continually improve as more high quality interaction data from more model organisms becomes available and is readily scalable to a genome-wide application.
Linking service quality, customer satisfaction, and behavioral intention.
Woodside, A G; Frey, L L; Daly, R T
1989-12-01
Based on the service quality and script theory literature, a framework of relationships among service quality, customer satisfaction, and behavioral intention for service purchases is proposed. Specific models are developed from the general framework and the models are applied and tested for the highly complex and divergent consumer service of overnight hospital care. Service quality, customer satisfaction, and behavioral intention data were collected from recent patients of two hospitals. The findings support the specific models and general framework. Implications for theory, service marketing, and future research are discussed.
NASA Astrophysics Data System (ADS)
Dorfner, Tobias; Förtsch, Christian; Boone, William; Neuhaus, Birgit J.
2017-09-01
A number of studies on single instructional quality features have been reported for mathematics and science instruction. To summarize single instructional quality features, researchers have created a model of three basic dimensions of instructional quality (classroom management, supportive climate, and cognitive activation), mainly through observing mathematics instruction. Considering this model valid for all subjects and usable for describing instruction, we used it in this study, which aimed to analyze characteristics of instructional quality in biology lessons of high-achieving and low-achieving classes, independently of content. We used data from three previous video studies of biology instruction conducted in Germany. From each video study, we selected three high-achieving and three low-achieving classes (N = 18 teachers; 35 videos) for our multiple-case study, in which conspicuous characteristics of instructional quality features were qualitatively identified and analyzed. The number of these characteristics was counted quantitatively across all the videos. The characteristics we found could be categorized using the model of three basic dimensions of instructional quality, despite some subject-specific differences for biology instruction. Our results revealed that many more characteristics were observable in high-achieving classes than in low-achieving classes. Thus, we believe this model could be used to describe biology instruction independently of content. We also identify qualities of biology instruction—working with concentration in a content-structured environment, being challenged in higher-order thinking, and being praised for performance—that could have a positive influence on students' achievement.
Model improvements and validation of TerraSAR-X precise orbit determination
NASA Astrophysics Data System (ADS)
Hackel, S.; Montenbruck, O.; Steigenberger, P.; Balss, U.; Gisinger, C.; Eineder, M.
2017-05-01
The radar imaging satellite mission TerraSAR-X requires precisely determined satellite orbits for validating geodetic remote sensing techniques. Since the achieved quality of the operationally derived, reduced-dynamic (RD) orbit solutions limits the capabilities of the synthetic aperture radar (SAR) validation, an effort is made to improve the estimated orbit solutions. This paper discusses the benefits of refined dynamical models on orbit accuracy as well as estimated empirical accelerations and compares different dynamic models in a RD orbit determination. Modeling aspects discussed in the paper include the use of a macro-model for drag and radiation pressure computation, the use of high-quality atmospheric density and wind models as well as the benefit of high-fidelity gravity and ocean tide models. The Sun-synchronous dusk-dawn orbit geometry of TerraSAR-X results in a particular high correlation of solar radiation pressure modeling and estimated normal-direction positions. Furthermore, this mission offers a unique suite of independent sensors for orbit validation. Several parameters serve as quality indicators for the estimated satellite orbit solutions. These include the magnitude of the estimated empirical accelerations, satellite laser ranging (SLR) residuals, and SLR-based orbit corrections. Moreover, the radargrammetric distance measurements of the SAR instrument are selected for assessing the quality of the orbit solutions and compared to the SLR analysis. The use of high-fidelity satellite dynamics models in the RD approach is shown to clearly improve the orbit quality compared to simplified models and loosely constrained empirical accelerations. The estimated empirical accelerations are substantially reduced by 30% in tangential direction when working with the refined dynamical models. 
Likewise the SLR residuals are reduced from -3 ± 17 to 2 ± 13 mm, and the SLR-derived normal-direction position corrections are reduced from 15 to 6 mm, obtained from the 2012-2014 period. The radar range bias is reduced from -10.3 to -6.1 mm with the updated orbit solutions, which coincides with the reduced standard deviation of the SLR residuals. The improvements are mainly driven by the satellite macro-model for the purpose of solar radiation pressure modeling, improved atmospheric density models, and the use of state-of-the-art gravity field models.
Ozgul, Arpat; Armitage, Kenneth B; Blumstein, Daniel T; Vanvuren, Dirk H; Oli, Madan K
2006-01-01
1. The presence/absence of a species at a particular site is the simplest form of data that can be collected during ecological field studies. We used 13 years (1990-2002) of survey data to parameterize a stochastic patch occupancy model for a metapopulation of the yellow-bellied marmot in Colorado, and investigated the significance of particular patches and the influence of site quality, network characteristics and regional stochasticity on the metapopulation persistence. 2. Persistence of the yellow-bellied marmot metapopulation was strongly dependent on the high quality colony sites, and persistence probability was highly sensitive to small changes in the quality of these sites. 3. A relatively small number of colony sites was ultimately responsible for the regional persistence. However, lower quality satellite sites also made a significant contribution to long-term metapopulation persistence, especially when regional stochasticity was high. 4. The northern network of the marmot metapopulation was more stable compared to the southern network, and the persistence of the southern network depended heavily on the northern network. 5. Although complex models of metapopulation dynamics may provide a more accurate description of metapopulation dynamics, such models are data-intensive. Our study, one of the very few applications of stochastic patch occupancy models to a mammalian species, suggests that stochastic patch occupancy models can provide important insights into metapopulation dynamics using data that are easy to collect.
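A stochastic patch occupancy model of the kind used above can be sketched minimally: each occupied patch risks extinction (less so at high quality), and empty patches are colonized in proportion to current occupancy. The functional forms and parameter values below are illustrative assumptions, not those fitted to the marmot survey data.

```python
import random

def simulate_spom(occupied, quality, colonization=0.3,
                  extinction_base=0.4, steps=100, seed=1):
    # Minimal stochastic patch occupancy model (SPOM).
    # occupied: initial True/False state per patch.
    # quality:  per-patch quality in [0, 1]; higher quality lowers
    #           extinction risk (extinction_base * (1 - quality)).
    # Colonization pressure scales with the fraction of occupied patches.
    rng = random.Random(seed)
    state = list(occupied)
    for _ in range(steps):
        n_occ = sum(state)
        new = []
        for occ, q in zip(state, quality):
            if occ:
                # Survive this step with probability 1 - extinction risk.
                new.append(rng.random() >= extinction_base * (1.0 - q))
            else:
                # Colonized in proportion to current regional occupancy.
                new.append(rng.random() < colonization * n_occ / len(state))
        state = new
    return state
```

Running many such replicates and varying `quality` at the colony sites is the kind of experiment that reveals how sensitive regional persistence is to a few high-quality patches.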
The World Meteorological Organization’s (WMO) Global Atmosphere Watch (GAW) Programme coordinates high-quality observations of atmospheric composition from global to local scales with the aim to drive high-quality and high-impact science while co-producing a new generation of pro...
Measuring the impact of urbanization on scenic quality: land use change in the northeast
Robert O. Brush; James F. Palmer
1979-01-01
The changes in scenic quality resulting from urbanization are explored for a region in the Northeast. The relative contributions to scenic quality of certain landscape features are examined by developing regression models for the region and for town landscapes within that region. The models provide empirical evidence of the importance of trees for maintaining high...
ERIC Educational Resources Information Center
Commonwealth of Learning, 2010
2010-01-01
The Commonwealth of Learning Review and Improvement Model (COL RIM) was developed by the Commonwealth of Learning in response to two key drivers: (1) Increased global emphasis on the quality of higher education; and (2) Rising concern about the high cost and uncertain benefits of conventional approaches to external quality assurance. Any…
NASA Astrophysics Data System (ADS)
Taylan, Osman
2017-02-01
High ozone concentration is an important cause of air pollution, partly owing to ozone's role as a greenhouse gas. Ozone is produced by photochemical processes involving nitrogen oxides and volatile organic compounds in the lower atmosphere. Monitoring and controlling the quality of urban air is therefore very important for public health. However, air quality prediction is a highly complex and non-linear process in which several attributes usually have to be considered. Artificial intelligence (AI) techniques can be employed to monitor and evaluate ozone concentration levels. The aim of this study is to develop an adaptive neuro-fuzzy inference system (ANFIS) approach to determine the influence of peripheral factors on air quality and pollution, an emerging problem due to ozone levels in the city of Jeddah. The ozone concentration level was considered as a factor for predicting air quality (AQ) under the prevailing atmospheric conditions. Using the air quality standards of Saudi Arabia, the ozone concentration level was modelled from factors such as nitrogen oxides (NOx), atmospheric pressure, temperature, and relative humidity. An ANFIS model was developed to estimate the ozone concentration level, and model performance was assessed with test data obtained from monitoring stations established by the General Authority of Meteorology and Environment Protection of the Kingdom of Saudi Arabia. The outcomes of the ANFIS model were re-assessed with fuzzy quality charts, using quality specification and control limits based on US-EPA air quality standards. The results of the present study show that the ANFIS model is a comprehensive approach for estimating and assessing ozone levels and a reliable means of producing more accurate outcomes.
A longitudinal study of clinical peer review's impact on quality and safety in U.S. hospitals.
Edwards, Marc T
2013-01-01
Clinical peer review is the dominant method of event analysis in U.S. hospitals. It is pivotal to medical staff efforts to improve quality and safety, yet the quality assurance process model that has prevailed for the past 30 years evokes fear and is fundamentally antithetical to a culture of safety. Two prior national studies characterized a quality improvement model that corrects this dysfunction but failed to demonstrate progress toward its adoption despite a high rate of program change between 2007 and 2009. This study's online survey of 470 organizations participating in either of the prior studies further assessed relationships between clinical peer review program factors, including the degree of conformance to the quality improvement model (the QI model score), and subjectively measured program impact variables. Among the 300 hospitals (64%) that responded, the median QI model score was only 60 on a 100-point scale. Scores increased somewhat for the 2007 cohort (mean pair-wise difference of 5.9 [2-10]), but not for the 2009 cohort. The QI model is expanded as the result of the finding that self-reporting of adverse events, near misses, and hazardous conditions--an essential practice in high-reliability organizations--is no longer rare in hospitals. Self-reporting and the quality of case review are additional multivariate predictors of the perceived ongoing impact of clinical peer review on quality and safety, medical staff perceptions of the program, and medical staff engagement in quality and safety initiatives. Hospital leaders and trustees who seek to improve patient outcomes should facilitate the adoption of this best practice model for clinical peer review.
Exploring network operations for data and information networks
NASA Astrophysics Data System (ADS)
Yao, Bing; Su, Jing; Ma, Fei; Wang, Xiaomin; Zhao, Xiyang; Yao, Ming
2017-01-01
In 1999, Barabási and Albert formulated scale-free models based on real networks: the World-Wide Web, the Internet, metabolic and protein networks, and language and sexual networks. Scale-free networks not only appear all around us but are also among the highest-quality networks in practice. High quality information networks can transfer data feasibly and efficiently, so their topological structures are clearly important for data safety. We build up network operations for constructing large-scale dynamic networks from smaller network models that have good properties and high quality. We focus on the simplest operators for formulating complex operations, and we are interested in how closely the resulting operations approach the desired network properties.
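For context, the Barabási-Albert scale-free model referenced here grows a network by preferential attachment: each new node links to m existing nodes with probability proportional to their current degree. A minimal pure-Python sketch of that baseline (not the paper's own operators):

```python
import random

def barabasi_albert(n, m, seed=0):
    """Grow a scale-free graph of n nodes: each new node attaches to m
    existing nodes with probability proportional to their degree."""
    rng = random.Random(seed)
    # Start from a small complete core of m+1 nodes.
    edges = [(i, j) for i in range(m + 1) for j in range(i)]
    # 'targets' repeats each endpoint once per incident edge, so sampling
    # uniformly from it realises degree-proportional attachment.
    targets = [v for e in edges for v in e]
    for new in range(m + 1, n):
        chosen = set()
        while len(chosen) < m:
            chosen.add(rng.choice(targets))
        for t in chosen:
            edges.append((new, t))
            targets.extend((new, t))
    return edges

g = barabasi_albert(200, 2)
```

The resulting degree distribution is heavy-tailed: a few early nodes become hubs, which is exactly the structural property the paper's composition operators aim to preserve at larger scales.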
A FRAMEWORK FOR FINE-SCALE COMPUTATIONAL FLUID DYNAMICS AIR QUALITY MODELING AND ANALYSIS
Fine-scale Computational Fluid Dynamics (CFD) simulation of pollutant concentrations within roadway and building microenvironments is feasible using high performance computing. Unlike currently used regulatory air quality models, fine-scale CFD simulations are able to account rig...
A FEDERATED PARTNERSHIP FOR URBAN METEOROLOGICAL AND AIR QUALITY MODELING
Recently, applications of urban meteorological and air quality models have been performed at resolutions on the order of km grid sizes. This necessitated development and incorporation of high resolution landcover data and additional boundary layer parameters that serve to descri...
Comparison of CMAQ Modeling Study with Discover-AQ 2014 Aircraft Measurements over Colorado
NASA Astrophysics Data System (ADS)
Tang, Y.; Pan, L.; Lee, P.; Tong, D.; Kim, H. C.; Artz, R. S.
2014-12-01
NASA and NCAR jointly led a recent multi-platform (space, air, and ground) measurement intensive to study air quality and to validate satellite data. The Discover-AQ/FRAPPE field experiment took place along the Colorado Front Range in July and August 2014. The air quality modeling team of the NOAA Air Resources Laboratory was one of the three teams that provided real-time air quality forecasting for the campaign. The U.S. EPA Community Multi-scale Air Quality (CMAQ) Model was used with emission inventories based on the data set used by the NOAA National Air Quality Forecast Capability (NAQFC). By comparing the forecast results against aircraft measurements, it was found that CO emissions tended to be overestimated, while ethane emissions were underestimated. Biogenic VOCs were also underpredicted. Because of the region's relatively high altitude, ozone concentrations in Denver and the surrounding areas are affected by both local emissions and transported ozone. The modeled ozone was highly dependent on the meteorological predictions over this region. The complex terrain of the Rocky Mountains also contributed to the model uncertainty. This study discusses the causes of model biases, the forecast performance under different meteorology, and results from using different model grid resolutions. Several data assimilation techniques were further tested to improve the "post-analysis" performance of the modeling system for the period.
Kline, Ronald M; Bazell, Carol; Smith, Erin; Schumacher, Heidi; Rajkumar, Rahul; Conway, Patrick H
2015-03-01
Cancer is a medically complex and expensive disease with costs projected to rise further as new treatment options increase and the United States population ages. Studies showing significant regional variation in oncology quality and costs and model tests demonstrating cost savings without adverse outcomes suggest there are opportunities to create a system of oncology care in the US that delivers higher quality care at lower cost. The Centers for Medicare and Medicaid Services (CMS) have designed an episode-based payment model centered around 6 month periods of chemotherapy treatment. Monthly per-patient care management payments will be made to practices to support practice transformation, including additional patient services and specific infrastructure enhancements. Quarterly reporting of quality metrics will drive continuous quality improvement and the adoption of best practices among participants. Practices achieving cost savings will also be eligible for performance-based payments. Savings are expected through improved care coordination and appropriately aligned payment incentives, resulting in decreased avoidable emergency department visits and hospitalizations and more efficient and evidence-based use of imaging, laboratory tests, and therapeutic agents, as well as improved end of life care. New therapies and better supportive care have significantly improved cancer survival in recent decades. This has come at a high cost, with cancer therapy consuming $124 billion in 2010. CMS has designed an episode-based model of oncology care that incorporates elements from several successful model tests. By providing care management and performance based payments in conjunction with quality metrics and a rapid learning environment, it is hoped that this model will demonstrate how oncology care in the US can transform into a high value, high quality system. Copyright © 2015 by American Society of Clinical Oncology.
NASA Astrophysics Data System (ADS)
Zhang, J.; Lin, L. F.; Bras, R. L.
2017-12-01
Hydrological applications rely on the availability and quality of precipitation products, especially model- and satellite-based products for use in areas without ground measurements. The qualities of model- and satellite-based precipitation products are known to be complementary: model-based products exhibit high quality during winter, while satellite-based products tend to be better during summer. To explore that behavior, this study uses 2-m air temperature as auxiliary information to evaluate high-resolution (0.1°×0.1°, hourly) precipitation products from Weather Research and Forecasting (WRF) simulations and from the version-4 Integrated Multi-satellite Retrievals for GPM (IMERG) early and final runs. The products are evaluated against the reference NCEP Stage IV precipitation estimates over the central United States in 2016. The results show that the WRF and IMERG final-run estimates are nearly unbiased, while the IMERG early-run estimates are positively biased. The results also show that the WRF estimates exhibit high correlations with the reference data when the temperature falls below 280 K, and the IMERG estimates (both early and final runs) do so when the temperature exceeds 280 K. Moreover, the 280 K temperature threshold, which distinguishes the quality of the WRF and the IMERG products, does not vary significantly with either season or location. This study not only adds insight into current research on the quality of precipitation products but also suggests a simple way to choose a model-based, satellite-based, or hybrid model/satellite product for applications.
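The product-selection rule suggested by the abstract can be sketched element-wise: take the model-based estimate where 2-m temperature is below the threshold and the satellite-based one above it. The 280 K default follows the study; the function and its signature are otherwise hypothetical.

```python
import numpy as np

def blend_by_temperature(wrf, imerg, t2m, threshold_k=280.0):
    """Hybrid precipitation field: use WRF estimates where 2-m temperature
    is below ~280 K (where the model correlated better with Stage IV) and
    IMERG estimates where it is above."""
    wrf, imerg, t2m = map(np.asarray, (wrf, imerg, t2m))
    return np.where(t2m < threshold_k, wrf, imerg)
```

Because the threshold was found to be stable across season and location, a single constant suffices in this sketch; a more careful blend might smooth the transition around the threshold.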
Miyauchi, Shunsuke; Yonetani, Tsutomu; Yuki, Takayuki; Tomio, Ayako; Bamba, Takeshi; Fukusaki, Eiichiro
2017-02-01
As an experimental model for elucidating the relationship between light quality during plant culture and the quality of crops or vegetables, we cultured tea plants (Camellia sinensis) and analyzed their leaves as tea material. First, metabolic profiling of teas from a tea contest in Japan was performed with gas chromatography/mass spectrometry (GC/MS), and a predictive model was built that estimates tea rankings from metabolite profiles. The analysis also highlighted the importance of several compounds (glutamine, glutamic acid, oxalic acid, epigallocatechin, phosphoric acid, and inositol) for assessing tea leaf quality. Tea plants were then cultured under artificial conditions to control these compounds. According to the ranking predictive model, the tea sample supplemented with ultraviolet-A (315-399 nm) light achieved the highest ranking. The improvement in quality is attributed to the high amino acid content and decreased epigallocatechin content of the tea leaves. The current study shows the use and value of metabolic profiling for producing high-quality crops and vegetables, whose quality has conventionally been evaluated by human sensory analysis. Metabolic profiling enables us to form hypotheses for understanding and developing high quality plants cultured under artificial conditions. Copyright © 2016 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Sadat Hashemi, Somayeh; Ghavami Sabouri, Saeed; Khorsandi, Alireza
2018-04-01
We present a theoretical model to study the effect of a thermally loaded crystal on the quality of a second-harmonic (SH) beam generated in a high-power pumping regime. The model is based on a particular oven structure for the MgO:PPsLT nonlinear crystal, designed to compensate for the thermal dephasing that degrades the conversion efficiency and beam quality of the interacting beams as the pumping power approaches 50 W. The quality of the fundamental beam is included in the modeling to investigate its final effect on the beam quality of the generated SH beam. Beam quality is then evaluated with a Hermite-Gaussian modal decomposition approach for fundamental beam qualities ranging from 1 to 3 and for different input power levels. To provide a meaningful comparison, the numerical simulation is correlated with real data from a high-power SH generation (SHG) experimental device. We find that with the open-top oven scheme and the fundamental M²-factor fixed at nearly 1, the M²-factor of the SHG beam degrades by 9% to 24% as the input power increases from 15 to 30 W, in very good agreement with the reported experimental results.
Evaluation of Model Recognition for Grammar-Based Automatic 3d Building Model Reconstruction
NASA Astrophysics Data System (ADS)
Yu, Qian; Helmholz, Petra; Belton, David
2016-06-01
In recent years, 3D city models have been in high demand by many public and private organisations, and their steadily growing quality and quantity further increase that demand. The quality evaluation of these 3D models is a relevant issue from both the scientific and practical points of view. In this paper, we present a method for the quality evaluation of 3D building models reconstructed automatically from terrestrial laser scanning (TLS) data based on an attributed building grammar. The entire evaluation is performed in all three dimensions in terms of completeness and correctness of the reconstruction. Six quality measures are introduced and applied to four datasets of reconstructed building models to describe the quality of the automatic reconstruction, and their validity is also assessed from the evaluation point of view.
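Completeness and correctness, the two notions the evaluation is built on, reduce to set overlap once reference and reconstructed geometry are discretised into comparable cells. A minimal sketch under that assumption (the paper's six measures are richer and are not reproduced here):

```python
def completeness_correctness(reference, reconstructed):
    """Per-cell evaluation of a reconstruction against reference geometry.
    Both arguments are sets of discretised cells (e.g. voxel indices).
    completeness = share of the reference that is covered;
    correctness  = share of the reconstruction supported by the reference."""
    tp = len(reference & reconstructed)
    completeness = tp / len(reference)
    correctness = tp / len(reconstructed)
    return completeness, correctness
```

In precision/recall terms, correctness plays the role of precision and completeness that of recall; both are needed, since an over-generated model scores perfectly on completeness alone.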
[Effect of occupational stress and effort-reward imbalance on sleep quality of police officers].
Wu, Hui; Gu, Guizhen; Yu, Shanfa
2014-04-01
To explore the effect of occupational stress and effort-reward imbalance on the sleep quality of police officers, a cluster sampling survey of sleep quality and occupational stress-related factors was conducted by questionnaire in May 2011 on 287 police officers from a city public security bureau. The relationship between sleep quality and occupational stress-related factors was analyzed by one-way ANOVA and multivariate non-conditional logistic regression using the effort-reward imbalance (ERI) model and the demand-control-support (DCS) model, and the subjects were divided into high-strain and low-strain groups using 1.0 as the cutoff for the ERI and DCS coefficients. The sleep quality score of shift-work police was higher than that of day-work police (11.95 ± 6.54 vs 9.52 ± 6.43, t = 2.77, P < 0.05). In the ERI model, the sleep quality score in the high-strain group was higher than in the low-strain group (14.50 ± 6.41 vs 8.60 ± 5.53, t = -5.32, P < 0.01), and in the DCS model the sleep quality score in the high-strain group was likewise higher (13.71 ± 6.62 vs 9.46 ± 6.04, t = -3.71, P < 0.01). In the regression analysis with the ERI model as the independent variable, sex (OR = 3.0, 95%CI: 1.16-7.73), age 30-39 years (OR = 3.48, 95%CI: 1.32-9.16), intrinsic effort (OR = 2.30, 95%CI: 1.10-4.81) and daily hassles (OR = 2.15, 95%CI: 1.06-4.33) were risk factors for low sleep quality, and reward (OR = 0.26, 95%CI: 0.12-0.52) was a protective factor. In the regression analysis with the DCS model as the independent variable, age 30-39 years (OR = 2.55, 95%CI: 1.02-6.37), depressive symptoms (OR = 2.10, 95%CI: 1.14-3.89) and daily hassles (OR = 3.25, 95%CI: 1.70-6.19) were risk factors for low sleep quality. When the ERI and DCS models were analyzed simultaneously, sex (OR = 3.03, 95%CI: 1.15-7.98), age 30-39 years (OR = 3.71, 95%CI: 1.38-9.98) and daily hassles (OR = 2.09, 95%CI: 1.01-4.30) were risk factors for low sleep quality, and reward (OR = 0.22, 95%CI: 0.10-0.48) was a protective factor.
Occupational stress and effort-reward imbalance affected the sleep quality of police officers.
Tracing the influence of land-use change on water quality and coral reefs using a Bayesian model.
Brown, Christopher J; Jupiter, Stacy D; Albert, Simon; Klein, Carissa J; Mangubhai, Sangeeta; Maina, Joseph M; Mumby, Peter; Olley, Jon; Stewart-Koster, Ben; Tulloch, Vivitskaia; Wenger, Amelia
2017-07-06
Coastal ecosystems can be degraded by poor water quality. Tracing the causes of poor water quality back to land-use change is necessary to target catchment management for coastal zone management. However, existing models for tracing the sources of pollution require extensive data-sets which are not available for many of the world's coral reef regions that may have severe water quality issues. Here we develop a hierarchical Bayesian model that uses freely available satellite data to infer the connection between land-uses in catchments and water clarity in coastal oceans. We apply the model to estimate the influence of land-use change on water clarity in Fiji. We tested the model's predictions against underwater surveys, finding that predictions of poor water quality are consistent with observations of high siltation and low coverage of sediment-sensitive coral genera. The model thus provides a means to link land-use change to declines in coastal water quality.
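The core inference, linking land-use covariates to water clarity under uncertainty, can be illustrated with a conjugate Bayesian linear regression on synthetic data. This is a deliberately simplified, non-hierarchical stand-in for the paper's model; the variable names and numbers are invented.

```python
import numpy as np

def bayes_linreg(X, y, prior_var=10.0, noise_var=1.0):
    """Posterior mean/covariance of regression weights under a Gaussian
    prior N(0, prior_var*I) and Gaussian observation noise."""
    X = np.asarray(X, float)
    y = np.asarray(y, float)
    d = X.shape[1]
    prec = np.eye(d) / prior_var + X.T @ X / noise_var  # posterior precision
    cov = np.linalg.inv(prec)
    mean = cov @ (X.T @ y) / noise_var
    return mean, cov

# Synthetic example: water clarity declines with cleared-land fraction.
rng = np.random.default_rng(0)
cleared = rng.uniform(0, 1, 100)
X = np.column_stack([np.ones(100), cleared])       # intercept + covariate
y = 8.0 - 5.0 * cleared + rng.normal(0, 0.5, 100)  # "observed" clarity (m)
w, _ = bayes_linreg(X, y, noise_var=0.25)
```

The hierarchical model in the paper additionally pools information across catchments and propagates satellite-retrieval uncertainty; the sketch shows only the regression backbone.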
Student Engagement in the Scottish Quality Enhancement Framework
ERIC Educational Resources Information Center
Gvaramadze, Irakli
2011-01-01
The research addressed the interplay of student engagement and quality enhancement mechanisms in the Scottish higher education system. The paper demonstrates increasing focus on student learning, learning experience and high-quality learning in the current quality enhancement approaches. The student-university coproduction model is used to…
Adaptive correction procedure for TVL1 image deblurring under impulse noise
NASA Astrophysics Data System (ADS)
Bai, Minru; Zhang, Xiongjun; Shao, Qianqian
2016-08-01
For the problem of image restoration of observed images corrupted by blur and impulse noise, the widely used TVL1 model may deviate from both the data-acquisition model and the prior model, especially for high noise levels. In order to seek a solution of high recovery quality beyond the reach of the TVL1 model, we propose an adaptive correction procedure for TVL1 image deblurring under impulse noise. Then, a proximal alternating direction method of multipliers (ADMM) is presented to solve the corrected TVL1 model and its convergence is also established under very mild conditions. It is verified by numerical experiments that our proposed approach outperforms the TVL1 model in terms of signal-to-noise ratio (SNR) values and visual quality, especially for high noise levels: it can handle salt-and-pepper noise as high as 90% and random-valued noise as high as 70%. In addition, a comparison with a state-of-the-art method, the two-phase method, demonstrates the superiority of the proposed approach.
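One building block of any ADMM solver for an L1 data-fidelity term, as in the TVL1 model, is the soft-thresholding (shrinkage) proximal operator. The sketch below shows just that step, not the authors' full corrected-TVL1 algorithm:

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1: shrinks each entry of v toward
    zero by tau, zeroing entries with magnitude below tau. This is the
    closed-form subproblem ADMM solves for the L1 term each iteration."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)
```

The robustness of the L1 norm to impulse (salt-and-pepper, random-valued) noise comes precisely from this shrinkage: gross outliers in the residual are absorbed at bounded cost rather than squared.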
Saadati, Fatemeh; Sehhatiei Shafaei, Fahimeh; Mirghafourvand, Mozhgan
2018-01-01
Sleep is one of the most basic human requirements. This research aims at determining the status of sleep quality and its relationship with quality of life among high-risk pregnant women in Tabriz, Iran, in 2015. This was a cross-sectional study of 364 eligible women at 28-36 weeks of pregnancy suffering from mild preeclampsia and gestational diabetes, recruited by convenience sampling. A personal-social-midwifery questionnaire, the Pittsburgh Sleep Quality Index, and the quality of life in pregnancy instrument (QOL-ORAV) were used for gathering data. A multivariate linear regression model was used to determine the relationship between sleep quality and its subscales and quality of life while controlling for confounders. In the current study, the prevalence of sleep disturbance was 96.4%. The mean (SD) total score of sleep quality was 10.1 (4.1), and the total score of quality of life was 61.7 (17.3). According to Pearson's correlation test, there was a statistically significant relationship between quality of life and sleep quality and all its subscales except sleep duration and use of sleep medication (p < 0.001). Meanwhile, according to the multivariate linear regression model, sleep latency, daytime dysfunction, health status, and home air-conditioning were related to quality of life. The findings of the current research show that sleep quality is low and quality of life is medium among high-risk pregnant women. It is therefore necessary that health care providers give mothers the required training for improving sleep quality and quality of life.
A design procedure for the handling qualities optimization of the X-29A aircraft
NASA Technical Reports Server (NTRS)
Bosworth, John T.; Cox, Timothy H.
1989-01-01
A design technique for handling qualities improvement was developed for the X-29A aircraft. As with any new aircraft, the X-29A control law designers were presented with a relatively high degree of uncertainty in their mathematical models. The presence of uncertainties, and the high level of static instability of the X-29A caused the control law designers to stress stability and robustness over handling qualities. During flight test, the mathematical models of the vehicle were validated or corrected to match the vehicle dynamic behavior. The updated models were then used to fine tune the control system to provide fighter-like handling characteristics. A design methodology was developed which works within the existing control system architecture to provide improved handling qualities and acceptable stability with a minimum of cost in both implementation as well as software verification and validation.
NASA Astrophysics Data System (ADS)
Huang, Shengzhou; Li, Mujun; Shen, Lianguan; Qiu, Jinfeng; Zhou, Youquan
2018-03-01
A novel fabrication method for high quality aspheric microlens arrays (MLAs) was developed by combining dose-modulated DMD-based lithography with a surface thermal reflow process. In this method, the complex shape of the aspheric microlens is pre-modeled via dose modulation in digital micromirror device (DMD) based maskless projection lithography, with the modulation determined mainly by the distribution of the exposure dose in the photoresist. The pre-shaped aspheric microlens is then polished by a subsequent non-contact thermal reflow (NCTR) process. Unlike the normal process, the reflow step here is designed to improve the surface quality while keeping the pre-modeled shape unchanged, thus avoiding the difficulty of generating the aspheric surface during reflow. Fabrication of a designed aspheric MLA with this method was demonstrated experimentally. The results showed that the obtained aspheric MLA was good in both shape accuracy and surface quality. The presented method may be a promising approach for rapidly fabricating high quality aspheric microlenses with complex surfaces.
McKean, J.R.; Johnson, D.M.; Johnson, Richard L.; Taylor, R.G.
2005-01-01
Central Idaho has superior environmental amenities, as evidenced by exceptionally high-value tourism, such as guided whitewater rafting. The focus of our study concerns the attainment of high-quality jobs in a high-quality natural environment. We estimate cumulative wage rate effects unique to nonconsumptive river recreation in central Idaho for comparison with other sectors. The cumulative effects are based on a detailed survey of recreation spending and a modified synthesized input–output model. Cumulative wage rate effects support using the abundance of environmental amenities to expand and attract high-wage, environmentally sensitive firms, as opposed to expanded tourism to improve employment quality.
Bundled Payments in Total Joint Replacement: Keeping Our Care Affordable and High in Quality.
McLawhorn, Alexander S; Buller, Leonard T
2017-09-01
The purpose of this review was to evaluate the literature regarding bundle payment reimbursement models for total joint arthroplasty (TJA). From an economic standpoint, TJA are cost-effective, but they represent a substantial expense to the Centers for Medicare & Medicaid Services (CMS). Historically, fee-for-service payment models resulted in highly variable cost and quality. CMS introduced Bundled Payments for Care Improvement (BPCI) in 2012 and subsequently the Comprehensive Care for Joint Replacement (CJR) reimbursement model in 2016 to improve the value of TJA from the perspectives of both CMS and patients, by improving quality via cost control. Early results of bundled payments are promising, but preserving access to care for patients with high comorbidity burdens and those requiring more complex care is a lingering concern. Hospitals, regardless of current participation in bundled payments, should develop care pathways for TJA to maximize efficiency and patient safety.
Režek Jambrak, Anet; Šimunek, Marina; Grbeš, Franjo; Mandura, Ana; Djekic, Ilija
2018-04-01
The objective of this paper was to demonstrate the application of quality function deployment (QFD) in analysing the effects of high power ultrasound on quality properties of apple juices and nectars. To develop the QFD model, together with instrumental analysis of the treated samples, a field survey was performed to identify consumer preferences for the quality characteristics of juices and nectars. Based on the field research, the three most important characteristics were 'taste' and 'aroma', with 28.5% of relative absolute weight importance, followed by 'odour' (16.9%). The QFD model showed that the top three 'quality scores' for apple juice were obtained with treatments at amplitude 90 µm, 9 min treatment time, and sample temperature 40 °C; 60 µm, 9 min, 60 °C; and 90 µm, 6 min, 40 °C. For nectars, the top three were 120 µm, 9 min, 20 °C; 60 µm, 9 min, 60 °C; and A2.16 60 µm, 9 min, 20 °C. This type of quality model enables a more complex measure across a large range of quality parameters. Its simplicity is a practical advantage, and, as such, the tool can be part of quality design when using novel preservation technologies. © 2017 Society of Chemical Industry.
USING CMAQ FOR EXPOSURE MODELING AND CHARACTERIZING THE SUB-GRID VARIABILITY FOR EXPOSURE ESTIMATES
Atmospheric processes and the associated transport and dispersion of atmospheric pollutants are known to be highly variable in time and space. Current air quality models that characterize atmospheric chemistry effects, e.g. the Community Multi-scale Air Quality (CMAQ), provide vo...
Modeling white sturgeon movement in a reservoir: The effect of water quality and sturgeon density
Sullivan, A.B.; Jager, H.I.; Myers, R.
2003-01-01
We developed a movement model to examine the distribution and survival of white sturgeon (Acipenser transmontanus) in a reservoir subject to large spatial and temporal variation in dissolved oxygen and temperature. Temperature and dissolved oxygen were simulated by a CE-QUAL-W2 model of Brownlee Reservoir, Idaho, for a typical wet, normal, and dry hydrologic year. We compared current water quality conditions to scenarios with reduced nutrient inputs to the reservoir. White sturgeon habitat quality was modeled as a function of temperature, dissolved oxygen and, in some cases, suitability for foraging and depth. We assigned a quality index to each cell along the bottom of the reservoir. The model simulated two aspects of daily movement: advective movement simulated the tendency of animals to move toward areas of high habitat quality, and diffusion simulated density-dependent movement away from areas of high sturgeon density within areas of non-lethal habitat conditions. Mortality resulted when sturgeon were unable to leave areas with lethal temperature or dissolved oxygen conditions. Water quality was highest in winter and early spring and lowest in mid to late summer. Limiting nutrient inputs reduced the area of Brownlee Reservoir with lethal conditions for sturgeon and raised the average habitat suitability throughout the reservoir. Without movement, simulated white sturgeon survival ranged between 45 and 89%. Allowing movement raised the predicted survival of sturgeon under all conditions to above 90%, as sturgeon avoided areas with low habitat quality. © 2003 Elsevier B.V. All rights reserved.
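The two movement processes described above, advection up the habitat-quality gradient and density-dependent diffusion, can be sketched on a 1-D chain of reservoir cells. The coefficients and grid are illustrative, not those of the published model:

```python
import numpy as np

def move_step(density, quality, adv=0.3, diff=0.1):
    """One daily update: fish advect toward the better-quality neighbour
    and diffuse away from crowded cells. Fluxes are computed from the
    start-of-day density, so total fish mass is conserved."""
    n = len(density)
    new = density.copy()
    for i in range(n):
        for j in (i - 1, i + 1):
            if 0 <= j < n:
                if quality[j] > quality[i]:  # advection up the quality gradient
                    flux = adv * density[i]
                    new[i] -= flux
                    new[j] += flux
                # density-dependent diffusion out of crowded cells
                flux = diff * max(density[i] - density[j], 0.0)
                new[i] -= flux
                new[j] += flux
    return new

q = np.array([0.2, 0.5, 1.0, 0.4])       # habitat quality index per cell
p = np.array([25.0, 25.0, 25.0, 25.0])   # initial sturgeon density
for _ in range(10):
    p = move_step(p, q)
```

After a few steps the population concentrates near the best-quality cell while diffusion prevents unbounded crowding, which is qualitatively the behaviour the survival results rely on.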
Helicopter mathematical models and control law development for handling qualities research
NASA Technical Reports Server (NTRS)
Chen, Robert T. N.; Lebacqz, J. Victor; Aiken, Edwin W.; Tischler, Mark B.
1988-01-01
Progress made in joint NASA/Army research concerning rotorcraft flight-dynamics modeling, design methodologies for rotorcraft flight-control laws, and rotorcraft parameter identification is reviewed. Research into these interactive disciplines is needed to develop the analytical tools necessary to conduct flying qualities investigations using both the ground-based and in-flight simulators, and to permit an efficient means of performing flight test evaluation of rotorcraft flying qualities for specification compliance. The need for the research is particularly acute for rotorcraft because of their mathematical complexity, high order dynamic characteristics, and demanding mission requirements. The research in rotorcraft flight-dynamics modeling is pursued along two general directions: generic nonlinear models and nonlinear models for specific rotorcraft. In addition, linear models are generated that extend their utilization from 1-g flight to high-g maneuvers and expand their frequency range of validity for the design analysis of high-gain flight control systems. A variety of methods ranging from classical frequency-domain approaches to modern time-domain control methodology that are used in the design of rotorcraft flight control laws is reviewed. Also reviewed is a study conducted to investigate the design details associated with high-gain, digital flight control systems for combat rotorcraft. Parameter identification techniques developed for rotorcraft applications are reviewed.
High-Performance Integrated Control of water quality and quantity in urban water reservoirs
NASA Astrophysics Data System (ADS)
Galelli, S.; Castelletti, A.; Goedbloed, A.
2015-11-01
This paper contributes a novel High-Performance Integrated Control framework to support the real-time operation of urban water supply storages affected by water quality problems. We use a 3-D, high-fidelity simulation model to predict the main water quality dynamics and inform a real-time controller based on Model Predictive Control. The integration of the simulation model into the control scheme is performed by a model reduction process that identifies a low-order, dynamic emulator running 4 orders of magnitude faster. The model reduction, which relies on a semiautomatic procedural approach integrating time series clustering and variable selection algorithms, generates a compact and physically meaningful emulator that can be coupled with the controller. The framework is used to design the hourly operation of Marina Reservoir, a 3.2 Mm3 storm-water-fed reservoir located in the center of Singapore, operated for drinking water supply and flood control. Because of its recent formation from a former estuary, the reservoir suffers from high salinity levels, whose behavior is modeled with Delft3D-FLOW. Results show that our control framework reduces the minimum salinity levels by nearly 40% and cuts the average annual deficit of drinking water supply by about 2 times the active storage of the reservoir (about 4% of the total annual demand).
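The control logic, a receding-horizon optimisation over a fast low-order emulator, can be sketched with a toy scalar salinity model. The dynamics, coefficients, and cost terms below are invented stand-ins; the real framework identifies its emulator from Delft3D-FLOW output:

```python
import itertools

def mpc_step(x0, horizon=3, controls=(0.0, 0.5, 1.0),
             a=0.95, b=-0.4, supply_target=0.5):
    """One receding-horizon step: enumerate short control sequences on a
    toy emulator x+ = a*x + b*u (salinity x falls as freshwater release u
    rises) and return the first action of the cheapest sequence."""
    best_u, best_cost = None, float("inf")
    for seq in itertools.product(controls, repeat=horizon):
        x, cost = x0, 0.0
        for u in seq:
            x = a * x + b * u
            # trade off salinity against deviation from the supply target
            cost += x ** 2 + 0.2 * (u - supply_target) ** 2
        if cost < best_cost:
            best_cost, best_u = cost, seq[0]
    return best_u
```

Only the first action of the optimal sequence is applied before re-optimising at the next hour, which is what makes the 4-orders-of-magnitude speed-up of the emulator essential.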
Iriza, Amalia; Dumitrache, Rodica C.; Lupascu, Aurelia; ...
2016-01-01
Our paper aims to evaluate the quality of high-resolution weather forecasts from the Weather Research and Forecasting (WRF) numerical weather prediction model. The lateral and boundary conditions were obtained from the numerical output of the Consortium for Small-scale Modeling (COSMO) model at 7 km horizontal resolution. Furthermore, the WRF model was run for January and July 2013 at two horizontal resolutions (3 and 1 km). The numerical forecasts of the WRF model were evaluated using different statistical scores for 2 m temperature and 10 m wind speed. Our results showed a tendency of the WRF model to overestimate the values of the analyzed parameters in comparison to observations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quality metrics in high-dimensional data visualization: an overview and systematization.
Bertini, Enrico; Tatu, Andrada; Keim, Daniel
2011-12-01
In this paper, we present a systematization of techniques that use quality metrics to help in the visual exploration of meaningful patterns in high-dimensional data. In a number of recent papers, different quality metrics are proposed to automate the demanding search through large spaces of alternative visualizations (e.g., alternative projections or ordering), allowing the user to concentrate on the most promising visualizations suggested by the quality metrics. Over the last decade, this approach has witnessed a remarkable development but few reflections exist on how these methods are related to each other and how the approach can be developed further. For this purpose, we provide an overview of approaches that use quality metrics in high-dimensional data visualization and propose a systematization based on a thorough literature review. We carefully analyze the papers and derive a set of factors for discriminating the quality metrics, visualization techniques, and the process itself. The process is described through a reworked version of the well-known information visualization pipeline. We demonstrate the usefulness of our model by applying it to several existing approaches that use quality metrics, and we provide reflections on implications of our model for future research. © 2010 IEEE
NASA Astrophysics Data System (ADS)
Tiebin, Wu; Yunlian, Liu; Xinjun, Li; Yi, Yu; Bin, Zhang
2018-06-01
To address the difficulty of predicting sintered ore quality, a hybrid prediction model is established that combines mechanism models of sintering with time-weighted error compensation based on the extreme learning machine (ELM). First, mechanism models of drum index, total iron, and alkalinity are constructed from the chemical reaction mechanisms and conservation of matter in the sintering process. Because the process is simplified in these mechanism models, they cannot capture the strong nonlinearity of sintering, so prediction errors are inevitable. For this reason, a time-weighted ELM-based error compensation model is established. Simulation results verify that the hybrid model is highly accurate and can meet the requirements of industrial applications.
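The paper's compensation model is built on an ELM; as a simplified illustration of the time-weighting idea alone (the ELM itself is omitted, and the geometric decay weighting is an assumption, not the paper's scheme), past mechanism-model errors can be averaged with recency-dependent weights and added back as a correction:

```python
def time_weighted_correction(errors, decay=0.8):
    """Weighted mean of past errors, with weight decaying geometrically
    with age; errors[0] is the most recent (observed - predicted)."""
    if not errors:
        return 0.0
    weights = [decay ** age for age in range(len(errors))]
    return sum(w * e for w, e in zip(weights, errors)) / sum(weights)

def hybrid_predict(mechanism_value, past_errors, decay=0.8):
    """Hybrid output = mechanism-model prediction + error compensation."""
    return mechanism_value + time_weighted_correction(past_errors, decay)

# The mechanism model has recently underestimated the drum index by ~0.5
pred = hybrid_predict(78.0, [0.5, 0.4, 0.6])
```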
Blind prediction of natural video quality.
Saad, Michele A; Bovik, Alan C; Charrier, Christophe
2014-03-01
We propose a blind (no reference or NR) video quality evaluation model that is nondistortion specific. The approach relies on a spatio-temporal model of video scenes in the discrete cosine transform domain, and on a model that characterizes the type of motion occurring in the scenes, to predict video quality. We use the models to define video statistics and perceptual features that are the basis of a video quality assessment (VQA) algorithm that does not require the presence of a pristine video to compare against in order to predict a perceptual quality score. The contributions of this paper are threefold. 1) We propose a spatio-temporal natural scene statistics (NSS) model for videos. 2) We propose a motion model that quantifies motion coherency in video scenes. 3) We show that the proposed NSS and motion coherency models are appropriate for quality assessment of videos, and we utilize them to design a blind VQA algorithm that correlates highly with human judgments of quality. The proposed algorithm, called video BLIINDS, is tested on the LIVE VQA database and on the EPFL-PoliMi video database and shown to perform close to the level of top performing reduced and full reference VQA algorithms.
ERIC Educational Resources Information Center
Bromer, Juliet; Korfmacher, Jon
2017-01-01
Research Findings: Home-based child care accounts for a significant proportion of nonparental child care arrangements for young children in the United States. Yet the early care and education field lacks clear models or pathways for how to improve quality in these settings. The conceptual model presented here articulates the components of…
NASA Astrophysics Data System (ADS)
Benjankar, R. M.; Sohrabi, M.; Tonina, D.; McKean, J. A.
2013-12-01
Aquatic habitat models use flow variables, predicted with one-dimensional (1D) or two-dimensional (2D) hydrodynamic models, to simulate aquatic habitat quality. Studies of how hydrodynamic model dimensionality affects predicted habitat quality are limited. Here we analyze the impact of flow variables predicted with 1D and 2D hydrodynamic models on the simulated spatial distribution of habitat quality and Weighted Usable Area (WUA) for fall-spawning Chinook salmon. Our study focuses on three river systems in central Idaho (USA): a straight pool-riffle reach (South Fork Boise River), small sinuous pool-riffle streams in a large meadow (Bear Valley Creek), and a steep, confined plane-bed stream with occasional deep forced pools (Deadwood River). We consider low and high flows in simple and complex morphologic reaches. Results show that model dimensionality affects both the spatial distribution of habitat quality and WUA for both discharge scenarios, but we did not find noticeable differences between complex and simple reaches. In general, differences in WUA were small but depended on stream type; in contrast, differences in spatially distributed habitat quality were considerable in all streams. The steep, confined plane-bed stream showed larger differences between habitat quality defined with 1D and 2D flow models than streams with well-defined macro-topography, such as pool-riffle bed forms. KEY WORDS: one- and two-dimensional hydrodynamic models, habitat modeling, weighted usable area (WUA), hydraulic habitat suitability, high and low discharges, simple and complex reaches
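WUA itself has a simple definition that the abstract takes as given: the sum over wetted cells of cell area times a composite suitability index (CSI). A minimal sketch, assuming a product-form CSI from depth and velocity suitabilities (one common aggregation choice, not necessarily the one used in this study):

```python
def weighted_usable_area(cells):
    """WUA = sum over cells of (area * composite suitability index).

    `cells` holds (area_m2, depth_suitability, velocity_suitability)
    tuples with suitabilities in [0, 1]; the CSI here is their product.
    """
    return sum(area * s_depth * s_vel for area, s_depth, s_vel in cells)

# Two 10 m2 cells: one half-suitable, one with fully unsuitable velocity
wua = weighted_usable_area([(10.0, 1.0, 0.5), (10.0, 0.8, 0.0)])
```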
Assuring quality in high-consequence engineering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoover, Marcey L.; Kolb, Rachel R.
2014-03-01
In high-consequence engineering organizations, such as Sandia, quality assurance may be heavily dependent on staff competency. Competency-dependent quality assurance models are at risk when the environment changes, as it has with increasing attrition rates, budget and schedule cuts, and competing program priorities. Risks in Sandia's competency-dependent culture can be mitigated through changes to hiring, training, and customer engagement approaches to manage people, partners, and products. Sandia's technical quality engineering organization has been able to mitigate corporate-level risks by driving changes that benefit all departments, and in doing so has assured Sandia's commitment to excellence in high-consequence engineering and national service.
Grieger, Jessica A; Johnson, Brittany J; Wycherley, Thomas P; Golley, Rebecca K
2017-05-01
Background: Dietary simulation modeling can predict dietary strategies that may improve nutritional or health outcomes. Objectives: The study aims were to undertake a systematic review of simulation studies that model dietary strategies aiming to improve nutritional intake, body weight, and related chronic disease, and to assess the methodologic and reporting quality of these models. Methods: The Preferred Reporting Items for Systematic Reviews and Meta-Analyses guided the search strategy with studies located through electronic searches [Cochrane Library, Ovid (MEDLINE and Embase), EBSCOhost (CINAHL), and Scopus]. Study findings were described and dietary modeling methodology and reporting quality were critiqued by using a set of quality criteria adapted for dietary modeling from general modeling guidelines. Results: Forty-five studies were included and categorized as modeling moderation, substitution, reformulation, or promotion dietary strategies. Moderation and reformulation strategies targeted individual nutrients or foods to theoretically improve one particular nutrient or health outcome, estimating small to modest improvements. Substituting unhealthy foods with healthier choices was estimated to be effective across a range of nutrients, including an estimated reduction in intake of saturated fatty acids, sodium, and added sugar. Promotion of fruits and vegetables predicted marginal changes in intake. Overall, the quality of the studies was moderate to high, with certain features of the quality criteria consistently reported. Conclusions: Based on the results of reviewed simulation dietary modeling studies, targeting a variety of foods rather than individual foods or nutrients theoretically appears most effective in estimating improvements in nutritional intake, particularly reducing intake of nutrients commonly consumed in excess. A combination of strategies could theoretically be used to deliver the best improvement in outcomes. 
Study quality was moderate to high. However, given the lack of dietary simulation reporting guidelines, future work could refine the quality tool to harmonize consistency in the reporting of subsequent dietary modeling studies. © 2017 American Society for Nutrition.
Monitoring and modeling of microbial and biological water quality
USDA-ARS?s Scientific Manuscript database
Microbial and biological water quality informs on the health of water systems and their suitability for uses in irrigation, recreation, aquaculture, and other activities. Indicators of microbial and biological water quality demonstrate high spatial and temporal variability. Therefore, monitoring str...
ERIC Educational Resources Information Center
Boronico, Jess; Murdy, Jim; Kong, Xinlu
2014-01-01
This manuscript proposes a mathematical model to address faculty sufficiency requirements towards assuring overall high quality management education at a global university. Constraining elements include full-time faculty coverage by discipline, location, and program, across multiple campus locations subject to stated service quality standards of…
Designing a Broadband Pump for High-Quality Micro-Lasers via Modified Net Radiation Method.
Nechayev, Sergey; Reusswig, Philip D; Baldo, Marc A; Rotschild, Carmel
2016-12-07
High-quality micro-lasers are key ingredients in non-linear optics, communication, sensing and low-threshold solar-pumped lasers. However, such micro-lasers exhibit negligible absorption of free-space broadband pump light. Recently, this limitation was lifted by cascade energy transfer, in which the absorption and quality factor are modulated with wavelength, enabling non-resonant pumping of high-quality micro-lasers and solar-pumped laser to operate at record low solar concentration. Here, we present a generic theoretical framework for modeling the absorption, emission and energy transfer of incoherent radiation between cascade sensitizer and laser gain media. Our model is based on linear equations of the modified net radiation method and is therefore robust, fast converging and has low complexity. We apply this formalism to compute the optimal parameters of low-threshold solar-pumped lasers. It is revealed that the interplay between the absorption and self-absorption of such lasers defines the optimal pump absorption below the maximal value, which is in contrast to conventional lasers for which full pump absorption is desired. Numerical results are compared to experimental data on a sensitized Nd 3+ :YAG cavity, and quantitative agreement with theoretical models is found. Our work modularizes the gain and sensitizing components and paves the way for the optimal design of broadband-pumped high-quality micro-lasers and efficient solar-pumped lasers.
Identify High-Quality Protein Structural Models by Enhanced K-Means.
Wu, Hongjie; Li, Haiou; Jiang, Min; Chen, Cheng; Lv, Qiang; Wu, Chuang
2017-01-01
Background. One critical issue in protein three-dimensional structure prediction using either ab initio or comparative modeling involves identification of high-quality protein structural models from generated decoys. Currently, clustering algorithms are widely used to identify near-native models; however, their performance is dependent upon different conformational decoys, and, for some algorithms, the accuracy declines when the decoy population increases. Results. Here, we proposed two enhanced K-means clustering algorithms capable of robustly identifying high-quality protein structural models. The first one employs the clustering algorithm SPICKER to determine the initial centroids for basic K-means clustering (SK-means), whereas the other employs squared distance to optimize the initial centroids (K-means++). Our results showed that SK-means and K-means++ were more robust as compared with SPICKER alone, detecting 33 (59%) and 42 (75%) of 56 targets, respectively, with template modeling scores better than or equal to those of SPICKER. Conclusions. We observed that the classic K-means algorithm showed a similar performance to that of SPICKER, which is a widely used algorithm for protein-structure identification. Both SK-means and K-means++ demonstrated substantial improvements relative to results from SPICKER and classical K-means.
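The K-means++ seeding mentioned above has a standard form: each new centroid is drawn with probability proportional to its squared distance from the nearest centroid already chosen. A minimal sketch on points represented as coordinate tuples (the SPICKER-seeded SK-means variant is not shown):

```python
import random

def kmeans_pp_init(points, k, rng=None):
    """k-means++ seeding: pick each new centroid with probability
    proportional to its squared distance to the nearest centroid
    chosen so far."""
    rng = rng or random.Random(0)
    centroids = [rng.choice(points)]
    while len(centroids) < k:
        # squared distance from each point to its nearest chosen centroid
        d2 = [min(sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids)
              for p in points]
        total = sum(d2)
        if total == 0:  # all points coincide with existing centroids
            centroids.append(rng.choice(points))
            continue
        r = rng.random() * total
        acc = 0.0
        for p, w in zip(points, d2):
            acc += w
            if acc >= r:
                centroids.append(p)
                break
    return centroids

# Two tight clusters; the two seeds should be distinct points
pts = [(0.0, 0.0), (0.1, 0.0), (10.0, 10.0), (10.1, 10.0)]
seeds = kmeans_pp_init(pts, 2)
```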
Urban Landscape Characterization Using Remote Sensing Data For Input into Air Quality Modeling
NASA Technical Reports Server (NTRS)
Quattrochi, Dale A.; Estes, Maurice G., Jr.; Crosson, William; Khan, Maudood
2005-01-01
The urban landscape is inherently complex and this complexity is not adequately captured in air quality models that are used to assess whether urban areas are in attainment of EPA air quality standards, particularly for ground level ozone. This inadequacy of air quality models to sufficiently respond to the heterogeneous nature of the urban landscape can impact how well these models predict ozone pollutant levels over metropolitan areas and ultimately, whether cities exceed EPA ozone air quality standards. We are exploring the utility of high-resolution remote sensing data and urban growth projections as improved inputs to meteorological and air quality models focusing on the Atlanta, Georgia metropolitan area as a case study. The National Land Cover Dataset at 30m resolution is being used as the land use/land cover input and aggregated to the 4km scale for the MM5 mesoscale meteorological model and the Community Multiscale Air Quality (CMAQ) modeling schemes. Use of these data have been found to better characterize low density/suburban development as compared with USGS 1 km land use/land cover data that have traditionally been used in modeling. Air quality prediction for future scenarios to 2030 is being facilitated by land use projections using a spatial growth model. Land use projections were developed using the 2030 Regional Transportation Plan developed by the Atlanta Regional Commission. This allows the State Environmental Protection agency to evaluate how these transportation plans will affect future air quality.
NASA Astrophysics Data System (ADS)
Mairota, Paola; Cafarelli, Barbara; Labadessa, Rocco; Lovergine, Francesco P.; Tarantino, Cristina; Nagendra, Harini; Didham, Raphael K.
2015-02-01
Modelling the empirical relationships between habitat quality and species distribution patterns is the first step to understanding human impacts on biodiversity. It is important to build on this understanding to develop a broader conceptual appreciation of the influence of surrounding landscape structure on local habitat quality, across multiple spatial scales. Traditional models which report that 'habitat amount' in the landscape is sufficient to explain patterns of biodiversity, irrespective of habitat configuration or spatial variation in habitat quality at edges, implicitly treat each unit of habitat as interchangeable and ignore the high degree of interdependence between spatial components of land-use change. Here, we test the contrasting hypothesis, that local habitat units are not interchangeable in their habitat attributes, but are instead dependent on variation in surrounding habitat structure at both patch- and landscape levels. As the statistical approaches needed to implement such hierarchical causal models are observation-intensive, we utilise very high resolution (VHR) Earth Observation (EO) images to rapidly generate fine-grained measures of habitat patch internal heterogeneities over large spatial extents. We use linear mixed-effects models to test whether these remotely-sensed proxies for habitat quality were influenced by surrounding patch or landscape structure. The results demonstrate the significant influence of surrounding patch and landscape context on local habitat quality. They further indicate that such an influence can be direct, when a landscape variable alone influences the habitat structure variable, and/or indirect when the landscape and patch attributes have a conjoined effect on the response variable. 
We conclude that a substantial degree of interaction among spatial configuration effects is likely to be the norm in determining the ecological consequences of habitat fragmentation, thus corroborating the notion of the spatial context dependence of habitat quality.
Quality improvement prototype: Johnson Space Center, National Aeronautics and Space Administration
NASA Technical Reports Server (NTRS)
1990-01-01
The Johnson Space Center was recognized by the Office of Management and Budget as a model for its high standards of quality. Included are an executive summary of the center's activities, an organizational overview, techniques for improving quality, the status of the quality effort, and a listing of key personnel.
Evaluating Quality of Students' Support Services in Open Distance Learning
ERIC Educational Resources Information Center
Nsamba, Asteria; Makoe, Mpine
2017-01-01
Evaluating the quality of students' support services in distance education institutions is vital because by nature Open Distance Learning (ODL) is a high-involvement service industry, with multiple student support service encounters. Most quality evaluation models tend to view quality from the institutional perspective. As a result, little is…
NASA Astrophysics Data System (ADS)
Mundhra, A.; Sain, K.; Shankar, U.
2012-12-01
The Indian National Gas Hydrate Program Expedition (NGHP) 01 discovered gas hydrate in unconsolidated sediments at several drilling sites along the continental margins of the Krishna-Godavari Basin, India. The presence of gas hydrate reduces the attenuation of travelling seismic waves, which can be measured by estimating the seismic quality factor (Dasgupta and Clark, 1998). Here, we use the log spectral ratio method (Sain et al., 2009) to compute the quality factor at three locations along a seismic cross-line near one of the drilling sites; two locations have a strong bottom simulating reflector (BSR) and one has none. The interval quality factor was measured for three submarine sedimentary layers bounded by the seafloor, the BSR, one reflector above and another reflector below the BSR. To avoid any influence of the processing sequence, unprocessed pre-stack seismic data were used. We estimate that the interval quality factor lies within 200-220 in the interval containing the BSR, while it varies within 90-100 in the other intervals; the high interval quality factor thus confirms that the observed BSR is due to the presence of gas hydrates. We performed rock physics modelling, using isotropic and anisotropic models, to quantitatively estimate gas hydrate saturation at one location where an interval has a high quality factor. Abruptly high measured resistivity and high P-wave velocity in the interval lead to much higher hydrate saturations (Archie, 1942; Helgerud et al., 1999) than the lower saturations estimated from pressure core and chlorinity measurements. This overestimation is attributed to near-vertical fractures identified from logging-while-drilling resistivity images. 
We carried out anisotropic modelling (Kennedy and Herrick, 2004; Lee, 2009), incorporating fracture volume and fracture porosity, to estimate hydrate saturation, and observed that the modelled gas hydrate saturations agree with the lower saturations obtained from pressure core and chlorinity measurements. Therefore, we find that (1) the quality factor is significantly higher in the interval bearing gas hydrates and is a useful tool for discovering hydrate deposits, (2) anisotropy due to near-vertical hydrate-filled fractures translates into elevated saturation estimates because of high measured resistivity and velocity, and (3) the anisotropic model greatly corrects saturation estimates in fractured media. References: Archie, G.E., 1942. Petroleum Transactions of AIME, 146, 54-62. Dasgupta, R., Clark, R.A., 1998. Geophysics 63, 2120-2128. Kennedy, W.D., Herrick, D.C., 2004. Petrophysics 45, 38-58. Lee, M.W., 2009. U.S. Geological Survey Scientific Investigations Report 2009-5141, 13. Sain, K., Singh, A.K., Thakur, N.K., Khanna, R.K., 2009. Marine Geophysical Researches 30, 137-145.
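The log spectral ratio method referred to above rests on a simple relation: for two reflections separated by travel time dt, ln(A2/A1) decays linearly with frequency with slope -pi*dt/Q, so Q follows from a least-squares slope fit. A minimal sketch with synthetic, noise-free log ratios (illustrative only; the cited Sain et al. (2009) implementation includes windowing and spectral-estimation steps omitted here):

```python
import math

def q_from_spectral_ratio(freqs, log_ratio, travel_time):
    """Estimate Q from the log spectral ratio of two reflections
    separated by `travel_time` seconds:
        ln(A2/A1) = -(pi * dt / Q) * f + const
    so Q = -pi * dt / slope, with the slope from a least-squares fit."""
    n = len(freqs)
    mean_f = sum(freqs) / n
    mean_r = sum(log_ratio) / n
    slope = (sum((f - mean_f) * (r - mean_r) for f, r in zip(freqs, log_ratio))
             / sum((f - mean_f) ** 2 for f in freqs))
    return -math.pi * travel_time / slope

# Noise-free synthetic ratios generated with Q = 200, dt = 0.1 s
dt, q_true = 0.1, 200.0
freqs = [10.0, 20.0, 30.0, 40.0]
log_ratio = [-math.pi * dt * f / q_true for f in freqs]
q_est = q_from_spectral_ratio(freqs, log_ratio, dt)
```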
NASA Astrophysics Data System (ADS)
Neal, Lucy S.; Dalvi, Mohit; Folberth, Gerd; McInnes, Rachel N.; Agnew, Paul; O'Connor, Fiona M.; Savage, Nicholas H.; Tilbee, Marie
2017-11-01
There is a clear need for the development of modelling frameworks for both climate change and air quality to help inform policies for addressing these issues simultaneously. This paper presents an initial attempt to develop a single modelling framework, by introducing a greater degree of consistency in the meteorological modelling framework by using a two-step, one-way nested configuration of models, from a global composition-climate model (GCCM) (140 km resolution) to a regional composition-climate model covering Europe (RCCM) (50 km resolution) and finally to a high (12 km) resolution model over the UK (AQUM). The latter model is used to produce routine air quality forecasts for the UK. All three models are based on the Met Office's Unified Model (MetUM). In order to better understand the impact of resolution on the downscaling of projections of future climate and air quality, we have used this nest of models to simulate a 5-year period using present-day emissions and under present-day climate conditions. We also consider the impact of running the higher-resolution model with higher spatial resolution emissions, rather than simply regridding emissions from the RCCM. We present an evaluation of the models compared to in situ air quality observations over the UK, plus a comparison against an independent 1 km resolution gridded dataset, derived from a combination of modelling and observations, effectively producing an analysis of annual mean surface pollutant concentrations. We show that using a high-resolution model over the UK has some benefits in improving air quality modelling, but that the use of higher spatial resolution emissions is important to capture local variations in concentrations, particularly for primary pollutants such as nitrogen dioxide and sulfur dioxide. For secondary pollutants such as ozone and the secondary component of PM10, the benefits of a higher-resolution nested model are more limited and reasons for this are discussed. 
This study highlights that the resolution of models is not the only factor determining model performance: consistency between nested models is also important.
Predicting Software Suitability Using a Bayesian Belief Network
NASA Technical Reports Server (NTRS)
Beaver, Justin M.; Schiavone, Guy A.; Berrios, Joseph S.
2005-01-01
The ability to reliably predict the end quality of software under development presents a significant advantage for a development team. It provides an opportunity to address high risk components earlier in the development life cycle, when their impact is minimized. This research proposes a model that captures the evolution of the quality of a software product, and provides reliable forecasts of the end quality of the software being developed in terms of product suitability. Development team skill, software process maturity, and software problem complexity are hypothesized as driving factors of software product quality. The cause-effect relationships between these factors and the elements of software suitability are modeled using Bayesian Belief Networks, a machine learning method. This research presents a Bayesian Network for software quality, and the techniques used to quantify the factors that influence and represent software quality. The developed model is found to be effective in predicting the end product quality of small-scale software development efforts.
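A Bayesian Belief Network of the kind described reduces, for discrete nodes, to summing products of conditional probabilities. A toy sketch with two of the hypothesized driving factors (team skill and problem complexity; process maturity omitted) and illustrative, invented probabilities, not values from the paper:

```python
# Illustrative prior over one driving factor (invented number)
P_SKILL_HIGH = 0.7

# CPT: P(quality = good | skill_high, complexity_high) -- invented numbers
P_GOOD = {
    (True, True): 0.6,
    (True, False): 0.9,
    (False, True): 0.2,
    (False, False): 0.5,
}

def prob_good(complexity_high):
    """P(quality = good | complexity) by summing out team skill."""
    return (P_SKILL_HIGH * P_GOOD[(True, complexity_high)]
            + (1 - P_SKILL_HIGH) * P_GOOD[(False, complexity_high)])

p_hard = prob_good(True)   # 0.7 * 0.6 + 0.3 * 0.2 = 0.48
```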
An annual assessment of air quality with the CALIOPE modeling system over Spain.
Baldasano, J M; Pay, M T; Jorba, O; Gassó, S; Jiménez-Guerrero, P
2011-05-01
The CALIOPE project, funded by the Spanish Ministry of the Environment, aims at establishing an air quality forecasting system for Spain. With this goal, CALIOPE modeling system was developed and applied with high resolution (4km×4km, 1h) using the HERMES emission model (including emissions of resuspended particles from paved roads) specifically built up for Spain. The present study provides an evaluation and the assessment of the modeling system, coupling WRF-ARW/HERMES/CMAQ/BSC-DREAM8b for a full-year simulation in 2004 over Spain. The evaluation focuses on the capability of the model to reproduce the temporal and spatial distribution of gas phase species (NO(2), O(3), and SO(2)) and particulate matter (PM10) against ground-based measurements from the Spanish air quality monitoring network. The evaluation of the modeling results on an hourly basis shows a strong dependency of the performance of the model on the type of environment (urban, suburban and rural) and the dominant emission sources (traffic, industrial, and background). The O(3) chemistry is best represented in summer, when mean hourly variability and high peaks are generally well reproduced. The mean normalized error and bias meet the recommendations proposed by the United States Environmental Protection Agency (US-EPA) and the European regulations. Modeled O(3) shows higher performance for urban than for rural stations, especially at traffic stations in large cities, since stations influenced by traffic emissions (i.e., high-NO(x) environments) are better characterized with a more pronounced daily variability. NO(x)/O(3) chemistry is better represented under non-limited-NO(2) regimes. SO(2) is mainly produced from isolated point sources (power generation and transformation industries) which generate large plumes of high SO(2) concentration affecting the air quality on a local to national scale where the meteorological pattern is crucial. 
The contribution of mineral dust from the Sahara desert through the BSC-DREAM8b model helps to satisfactorily reproduce episodic high PM10 concentration peaks at background stations. The model assessment indicates that one of the main air quality-related problems in Spain is the high level of O(3). A quarter of the Iberian Peninsula shows more than 30 days exceeding the value of 120 μg m(-3) for the maximum 8-h O(3) concentration as a consequence of the transport of O(3) precursors downwind to/from the Madrid and Barcelona metropolitan areas, and industrial areas and cities on the Mediterranean coast. Copyright © 2011 Elsevier B.V. All rights reserved.
LM-3: A High-resolution Lake Michigan Mass Balance Water Quality Model
This report is a user’s manual that describes the high-resolution mass balance model known as LM3. LM3 has been applied to Lake Michigan to describe the transport and fate of atrazine, PCB congeners, and chloride in that system. The model has also been used to model eutrophicat...
NASA Astrophysics Data System (ADS)
Gidey, Amanuel
2018-06-01
Determining the suitability and vulnerability of groundwater quality for irrigation use provides an early warning and a first step toward careful management of groundwater resources to diminish impacts on irrigation. This study was conducted to determine the overall suitability of groundwater quality for irrigation use and to generate spatial distribution maps in the Elala catchment, Northern Ethiopia. Thirty-nine groundwater samples were collected to analyze and map the water quality variables. Atomic absorption spectrophotometry, ultraviolet spectrophotometry, titration and calculation methods were used for laboratory groundwater quality analysis. ArcGIS geospatial analysis tools, semivariogram model types and interpolation methods were used to generate spatial distribution maps. Twelve and eight water quality variables were used to produce the weighted overlay and irrigation water quality index models, respectively. Root-mean-square error, mean square error, absolute square error, mean error, root-mean-square standardized error, and measured versus predicted values were used for cross-validation. The overall weighted overlay model result showed that 146 km2 of the catchment is highly suitable, 135 km2 moderately suitable and 60 km2 unsuitable for irrigation use. The irrigation water quality index confirms 10.26% of samples with no restriction, 23.08% with low restriction, 20.51% with moderate restriction, 15.38% with high restriction and 30.76% with severe restriction for irrigation use. GIS and the irrigation water quality index are useful methods for managing irrigation water resources to achieve full-yield irrigation production, improve food security and sustain it over the long term, avoiding increasing environmental problems for future generations.
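The abstract does not give the index formulation; as a generic sketch only, a weighted-sum water quality index can rate each variable against a permissible limit and combine the ratings with relative weights (all variable names, limits, and weights below are invented for illustration; published irrigation water quality index formulations differ in detail):

```python
def water_quality_index(values, limits, weights):
    """Generic weighted-sum index: each variable is rated as
    measured / permissible limit, weighted, summed, and scaled to 100.
    An index of 100 means the weighted variables sit, on average,
    exactly at their limits."""
    total_w = sum(weights.values())
    score = sum(weights[k] * (values[k] / limits[k]) for k in values)
    return 100.0 * score / total_w

# All names, limits and weights below are invented for illustration
idx = water_quality_index(
    {"EC": 1.0, "SAR": 5.0},    # measured values
    {"EC": 2.0, "SAR": 10.0},   # permissible limits
    {"EC": 2.0, "SAR": 1.0},    # relative weights
)
```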
Dunbar, Margaret; Mirpuri, Sheena; Yip, Tiffany
2017-10-01
Previous research has indicated that school engagement tends to decline across high school. At the same time, sleep problems and exposure to social stressors such as ethnic/racial discrimination increase. The current study uses a biopsychosocial perspective to examine the interactive and prospective effects of sleep and discrimination on trajectories of academic performance. Growth curve models were used to explore changes in 6 waves of academic outcomes in a sample of 310 ethnically and racially diverse adolescents (mean age = 14.47 years, SD = .78; 64.1% female). Ethnic/racial discrimination was assessed at Time 1 in a single survey. Sleep quality and duration were also assessed at Time 1 with daily diary surveys. School engagement and grades were reported every 6 months for 3 years. Higher self-reported sleep quality in the ninth grade was associated with higher levels of academic engagement at the start of high school. Ethnic/racial discrimination moderated the relationship between sleep quality and engagement such that adolescents reporting low levels of discrimination reported a steeper increase in engagement over time, whereas their peers reporting poor sleep quality and high levels of discrimination reported the worst engagement in the ninth grade and throughout high school. The combination of poor sleep quality and high levels of discrimination in ninth grade has downstream consequences for adolescent academic outcomes. This study applies the biopsychosocial model to understand the development and daily experiences of diverse adolescents. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Alava, Juan José; Ross, Peter S; Lachmuth, Cara; Ford, John K B; Hickie, Brendan E; Gobas, Frank A P C
2012-11-20
The development of an area-based polychlorinated biphenyl (PCB) food-web bioaccumulation model enabled a critical evaluation of the efficacy of sediment quality criteria and prey tissue residue guidelines in protecting fish-eating resident killer whales of British Columbia and adjacent waters. Model-predicted and observed PCB concentrations in resident killer whales and Chinook salmon were in good agreement, supporting the model's application for risk assessment and criteria development. Model application shows that PCB concentrations in the sediments from the resident killer whale's Critical Habitats and entire foraging range lead to PCB concentrations in most killer whales that exceed PCB toxicity threshold concentrations reported for marine mammals. Results further indicate that current PCB sediment quality and prey tissue residue criteria for fish-eating wildlife are not protective of killer whales and are not appropriate for assessing risks of PCB-contaminated sediments to high trophic level biota. We present a novel methodology for deriving sediment quality criteria and tissue residue guidelines that protect biota of high trophic levels under various PCB management scenarios. PCB concentrations in sediments and in prey that are deemed protective of resident killer whale health are much lower than current criteria values, underscoring the extreme vulnerability of high trophic level marine mammals to persistent and bioaccumulative contaminants.
Water quality modeling based on landscape analysis: Importance of riparian hydrology
Thomas Grabs
2010-01-01
Several studies in high-latitude catchments have demonstrated the importance of near-stream riparian zones as hydrogeochemical hotspots with a substantial influence on stream chemistry. An adequate representation of the spatial variability of riparian-zone processes and characteristics is the key for modeling spatiotemporal variations of stream-water quality. This...
Lessons learned from a one-dimensional water quality model for the Gulf of Mexico
Hypoxia in the northern Gulf of Mexico has been a major concern for many years. Several water quality models have attempted to describe the link between high nutrient loads from the Mississippi River and hypoxia in the Gulf of Mexico with varied success. Here we describe the dev...
Satellite remote sensing for modeling and monitoring of water quality in the Great Lakes
NASA Astrophysics Data System (ADS)
Coffield, S. R.; Crosson, W. L.; Al-Hamdan, M. Z.; Barik, M. G.
2017-12-01
Consistent and accurate monitoring of the Great Lakes is critical for protecting the freshwater ecosystems, quantifying the impacts of climate change, understanding harmful algal blooms, and safeguarding public health for the millions who rely on the Lakes for drinking water. While ground-based monitoring is often hampered by limited sampling resolution, satellite data provide surface reflectance measurements at much more complete spatial and temporal scales. In this study, we implemented NASA data from the Moderate Resolution Imaging Spectroradiometer (MODIS) onboard the Aqua satellite to build robust water quality models. We developed and validated models for chlorophyll-a, nitrogen, phosphorus, and turbidity based on combinations of the six MODIS Ocean Color bands (412, 443, 488, 531, 547, and 667 nm) for 2003-2016. We then applied these models to quantify trends in water quality through time and in relation to changing land cover, runoff, and climate for six selected coastal areas in Lakes Michigan and Erie. We found the strongest models for chlorophyll-a in Lake Huron (R2 = 0.75), nitrogen in Lake Ontario (R2 = 0.66), phosphorus in Lake Erie (R2 = 0.60), and turbidity in Lake Erie (R2 = 0.86). These offer improvements over previous efforts to model chlorophyll-a while adding nitrogen, phosphorus, and turbidity. Mapped water quality parameters showed high spatial variability, with nitrogen concentrated largely in Superior and coastal Michigan and high turbidity, phosphorus, and chlorophyll near urban and agricultural areas of Erie. Temporal analysis also showed concurrence of high runoff or precipitation and nitrogen in Lake Michigan offshore of wetlands, suggesting that water quality in these areas is sensitive to changes in climate.
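A band-based regression of the kind described can be sketched as below. The reflectances, coefficients, and noise level are synthetic stand-ins, not MODIS data; real work would use matched satellite/in-situ pairs per lake.

```python
import numpy as np

# Synthetic reflectances for six "bands" (standing in for MODIS Ocean Color
# 412-667 nm) and matched chlorophyll-a samples generated from invented
# coefficients plus noise.
rng = np.random.default_rng(0)
bands = rng.uniform(0.01, 0.1, size=(50, 6))           # 50 samples x 6 bands
true_coef = np.array([5.0, -3.0, 2.0, 1.0, -1.5, 0.5])
chl = bands @ true_coef + 0.8 + rng.normal(0, 0.01, 50)

# Ordinary least squares: chl ~ intercept + band reflectances
X = np.column_stack([np.ones(len(bands)), bands])
beta, *_ = np.linalg.lstsq(X, chl, rcond=None)

pred = X @ beta
ss_res = ((chl - pred) ** 2).sum()
ss_tot = ((chl - chl.mean()) ** 2).sum()
r2 = 1 - ss_res / ss_tot                                # goodness of fit
print(f"R^2 = {r2:.2f}")
```

Validation against held-out samples, rather than the in-sample R2 shown here, is what supports the per-lake R2 values reported in the abstract.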
NASA Astrophysics Data System (ADS)
Jung, J.; Choi, Y.; Souri, A.; Jeon, W.
2017-12-01
Particulate matter (PM) plays a significantly deleterious role in human health and climate. Recently, continuous high concentrations of PM in Korea drew public attention to this critical issue, and the Korea-United States Air Quality Study (KORUS-AQ) campaign was conducted in 2016 to investigate the causes. For this study, we adjusted the initial conditions in a chemical transport model (CTM) to improve its performance over the Korean Peninsula during the KORUS-AQ period, using the campaign data to evaluate model performance. We used the Optimal Interpolation (OI) approach with hourly surface air quality measurements from the Air Quality Monitoring Stations (AQMS) operated by NIER and the aerosol optical depth (AOD) measured by the GOCI sensor onboard the geostationary Communication, Ocean and Meteorological Satellite (COMS). The AOD at 550 nm has a 6 km spatial resolution and broad coverage over East Asia. After assimilating the surface air quality observations, model accuracy improved significantly compared to the base model result (without assimilation), with a very high correlation (0.98) and a considerably reduced mean bias. In particular, the assimilated model captured several high peaks that the base model underpredicted. To assimilate the satellite data, we applied AOD scaling factors to quantify each species' contribution to total PM concentration and the fine-mode fraction (FMF) to define the vertical distribution. The improved results showed fairly good agreement with observations.
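The Optimal Interpolation update used for assimilation has a standard form that can be sketched as follows. Grid size, covariance shapes, and observation values here are toy assumptions, not the study's configuration.

```python
import numpy as np

# Toy Optimal Interpolation (OI) analysis step: blend a model background
# field with point observations via the gain K = B H^T (H B H^T + R)^-1.
n = 5                                  # model grid cells
xb = np.full(n, 20.0)                  # background PM concentration (ug/m3)
H = np.array([[1, 0, 0, 0, 0],         # two stations observing cells 0 and 3
              [0, 0, 0, 1, 0]], dtype=float)
y = np.array([35.0, 28.0])             # observed concentrations
idx = np.arange(n)
B = 4.0 * np.exp(-np.abs(np.subtract.outer(idx, idx)) / 2.0)  # background cov
R = np.eye(2) * 1.0                    # observation error cov

K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # Kalman-style gain
xa = xb + K @ (y - H @ xb)                     # analysis (updated field)
print(np.round(xa, 2))
```

The analysis pulls the observed cells toward the measurements (and neighboring cells with them, through the spatial structure of B) without overshooting them, which is the behavior that reduces mean bias in the assimilated run.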
Extending the cost-benefit model of thermoregulation: high-temperature environments.
Vickers, Mathew; Manicom, Carryn; Schwarzkopf, Lin
2011-04-01
The classic cost-benefit model of ectothermic thermoregulation compares energetic costs and benefits, providing a critical framework for understanding this process (Huey and Slatkin 1976). It considers the case where environmental temperature (T(e)) is less than the selected temperature of the organism (T(sel)), and it predicts that, to minimize increasing energetic costs of thermoregulation as habitat thermal quality declines, thermoregulatory effort should decrease until the lizard thermoconforms. We extended this model to include the case where T(e) exceeds T(sel), and we redefine costs and benefits in terms of fitness to include effects of body temperature (T(b)) on performance and survival. Our extended model predicts that lizards will increase thermoregulatory effort as habitat thermal quality declines, gaining the fitness benefits of optimal T(b) and maximizing the net benefit of activity. Further, to offset the disproportionately high fitness costs of high T(e) compared with low T(e), we predicted that lizards would thermoregulate more effectively at high values of T(e) than at low ones. We tested our predictions on three sympatric skink species (Carlia rostralis, Carlia rubrigularis, and Carlia storri) in hot savanna woodlands and found that thermoregulatory effort increased as thermal quality declined and that lizards thermoregulated most effectively at high values of T(e).
Stock, Stephanie; Pitcavage, James M; Simic, Dusan; Altin, Sibel; Graf, Christian; Feng, Wen; Graf, Thomas R
2014-09-01
Improving the quality of care for chronic diseases is an important issue for most health care systems in industrialized nations. One widely adopted approach is the Chronic Care Model (CCM), which was first developed in the late 1990s. In this article we present the results from two large surveys in the United States and Germany that report patients' experiences in different models of patient-centered diabetes care, compared to the experiences of patients who received routine diabetes care in the same systems. The study populations were enrolled in either Geisinger Health System in Pennsylvania or Barmer, a German sickness fund that provides medical insurance nationwide. Our findings suggest that patients with type 2 diabetes who were enrolled in the care models that exhibited key features of the CCM were more likely to receive care that was patient-centered, high quality, and collaborative, compared to patients who received routine care. This study demonstrates that quality improvement can be realized through the application of the Chronic Care Model, regardless of the setting or distinct characteristics of the program.
Variations in Daily Sleep Quality and Type 1 Diabetes Management in Late Adolescents
Queen, Tara L.; Butner, Jonathan; Wiebe, Deborah; Berg, Cynthia A.
2016-01-01
Objective To determine how between- and within-person variability in perceived sleep quality were associated with adolescent diabetes management. Methods A total of 236 older adolescents with type 1 diabetes reported daily for 2 weeks on sleep quality, self-regulatory failures, frequency of blood glucose (BG) checks, and BG values. Average, inconsistent, and daily deviations in sleep quality were examined. Results Hierarchical linear models indicated that poorer average sleep quality and worse daily perceived sleep quality (compared with one's average) were each associated with more self-regulatory failures. Sleep quality was not associated with frequency of BG checking. Poorer average sleep quality was related to greater risk of high BG. Furthermore, inconsistent and daily deviations in sleep quality interacted to predict higher BG, with more consistent sleepers benefitting more from a night of high-quality sleep. Conclusions Good, consistent sleep quality during late adolescence may benefit diabetes management by reducing self-regulatory failures and risk of high BG. PMID:26994852
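The between-/within-person split of daily ratings that such hierarchical models rely on can be illustrated as below; the ratings are invented, and the decomposition (person mean plus daily deviation) is the standard centering scheme, not this study's exact code.

```python
import numpy as np

# Decompose daily sleep-quality ratings into a between-person component
# (each adolescent's average across the diary period) and within-person
# daily deviations from that average.
ratings = {                      # person -> example daily ratings (1-5 scale)
    "A": [4, 5, 4, 5],
    "B": [2, 3, 2, 1],
}
for person, days in ratings.items():
    mean = np.mean(days)                      # between-person predictor
    deviations = [d - mean for d in days]     # within-person predictors
    print(person, round(float(mean), 2), deviations)
```

In a multilevel model the person mean captures "poorer average sleep quality" effects, while the deviations capture "worse than one's own average" daily effects; by construction the deviations sum to zero within each person.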
Development and Application of New Quality Model for Software Projects
Karnavel, K.; Dillibabu, R.
2014-01-01
The IT industry employs a number of models to identify defects in the construction of software projects. In this paper, we review COQUALMO and its limitations, aiming to increase quality without increasing cost and time. The computation time, cost, and effort needed to predict residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows a significant improvement in the quality of the software projects. PMID:25478594
Azmat, Syed Khurram; Ali, Moazzam; Hameed, Waqas; Awan, Muhammad Ali
2018-01-01
Studies have documented the impact of quality family planning services on improved contraceptive uptake and continuation; however, relatively little is known about the quality of service provision, especially in the context of social franchising. This study examined the quality of clinical services and user experiences across two franchised service provider models in rural Pakistan. This facility-based assessment was carried out during May-June 2015 at 20 randomly selected social franchise providers in Chakwal and Faisalabad. In our case, a franchise health facility was a private clinic (mostly) run by a single provider, supported by an assistant. Within the selected health facilities, a total of 39 user-provider interactions were observed, and the same users were interviewed separately. Most of the health facilities were in the private sector. Comparatively, service providers at Greenstar Social Marketing/Population Services International (GSM/PSI) model franchised facilities had more rooms and staff employed, with greater provider ownership. Quality of service indices showed high scores for both Marie Stopes Society (MSS) and GSM/PSI franchised providers. MSS franchised providers had a comparative edge in clinical governance and method mix and were more user-focused, while PSI providers offered a broader range of non-FP services. Quality of counselling services was similar across both models. Service providers performed well on all indicators of interpersonal care; however, overall low scores were noted in technical care. Across both models, service providers attained average scores of 6.7 out of 8 for waste disposal mechanisms, 12.5 out of 15 for supplies, 2.7 out of 4 for user-centred facilities, and 6.5 out of 11 for clinical governance and respecting clients' privacy. The exit interviews yielded high user satisfaction in both service models.
The findings suggest that the MSS and GSM/PSI service providers maintained high quality standards in the provision of family planning information, services, and commodities, with little difference between the two models in quality or satisfaction. The results demonstrate that service quality and client satisfaction are important determinants of the use of clinical contraceptive methods in Pakistan.
High-quality, daily meteorological data at high spatial resolution are essential for a variety of hydrologic and ecological modeling applications that support environmental risk assessments and decision making. This paper describes the development, application, and assessment of ...
Modeling the ecological trap hypothesis: a habitat and demographic analysis for migrant songbirds
Therese M. Donovan; Frank R. Thompson III
2001-01-01
Most species occupy both high- and low-quality habitats throughout their ranges. As habitats become modified through anthropogenic change, low-quality habitat may become a more dominant component of the landscape for some species. To conserve species, information on how to assess habitat quality and guidelines for maintaining or eliminating low-quality habitats are...
NASA Astrophysics Data System (ADS)
Malsy, Marcus; Reder, Klara; Flörke, Martina
2014-05-01
Decreasing water quality is one of the main global issues posing risks to food security, the economy, and public health, and is consequently crucial for ensuring environmental sustainability. Access to clean drinking water has increased during the last decades, but 2.5 billion people still lack access to basic sanitation, especially in Africa and parts of Asia. In this context, not only connection to a sewage system is of high importance but also treatment, as an increasing connection rate leads to higher loadings and therefore higher pressure on water resources. Furthermore, poor people in developing countries use local surface waters for daily activities, e.g. bathing and washing, so water use and sewerage are inextricably connected. In this study, large-scale water quality modelling is used to identify hotspots of water pollution and to gain insight into potential environmental impacts, in particular in regions with a low observation density and gaps in measured water quality parameters. We applied the global water quality model WorldQual to calculate biological oxygen demand (BOD) loadings from point and diffuse sources, as well as in-stream concentrations. The regional focus of this study is on developing regions, i.e. Africa, Asia, and South America, as they are most affected by water pollution. Model runs were conducted for the year 2010 to characterize the recent status of surface water quality and to identify hotspots and main causes of pollution. First results show that hotspots mainly occur in highly agglomerated regions with high population density. Large urban areas are the main loading hotspots, and pollution prevention and control become increasingly important as point sources are subject to connection rates and treatment levels. Furthermore, river discharge plays a crucial role through its dilution potential, especially in terms of seasonal variability.
The highly varying shares of BOD sources across regions and sectors demand an integrated approach to assessing the main causes of water quality degradation.
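A domestic point-source BOD loading of the kind such a model aggregates can be sketched roughly as follows. Every number here (population, per-capita BOD production, connection rate, removal efficiency) is an illustrative assumption, not a WorldQual parameter value.

```python
# Back-of-the-envelope domestic BOD loading: people connected to sewers
# contribute treated effluent; unconnected people contribute untreated loads.
population = 1_000_000
per_capita_bod = 54.0          # g BOD per person per day (assumed)
connection_rate = 0.60         # share connected to a sewage system (assumed)
removal_efficiency = 0.85      # BOD removed by treatment (assumed)

connected = population * connection_rate
load_treated = connected * per_capita_bod * (1 - removal_efficiency)
load_unconnected = population * (1 - connection_rate) * per_capita_bod
total_kg_per_day = (load_treated + load_unconnected) / 1000.0
print(f"{total_kg_per_day:.0f} kg BOD/day")
```

The arithmetic makes the abstract's point concrete: raising the connection rate without treatment simply moves load into the sewer system, whereas raising removal efficiency is what actually reduces the in-stream loading.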
NASA Astrophysics Data System (ADS)
Sun, N.; Yearsley, J. R.; Nijssen, B.; Lettenmaier, D. P.
2014-12-01
Urban stream quality is particularly susceptible to extreme precipitation events and land use change. Although the projected effects of extreme events and land use change on hydrology have been reasonably well studied, the impacts on urban water quality have not been widely examined, due in part to the scale mismatch between global climate models and the spatial scales required to represent urban hydrology and water quality signals. Here we describe a grid-based modeling system that integrates the Distributed Hydrology Soil Vegetation Model (DHSVM) with an urban water quality module adapted from EPA's Storm Water Management Model (SWMM) and the Soil and Water Assessment Tool (SWAT). Using this model system, we evaluate, for four partially urbanized catchments within the Puget Sound basin, urban water quality under current climate conditions and projected potential changes associated with future changes in climate and land use. We examine in particular total suspended solids, total nitrogen, total phosphorus, and coliform bacteria, with catchment representations at 150-meter spatial resolution and a sub-daily time step. We report long-term streamflow and water quality predictions in response to extreme precipitation events of varying magnitudes in the four catchments. Our simulations show that urban water quality is highly sensitive to both climatic and land use change.
Industrial pollution and the management of river water quality: a model of Kelani River, Sri Lanka.
Gunawardena, Asha; Wijeratne, E M S; White, Ben; Hailu, Atakelty; Pandit, Ram
2017-08-19
Water quality of the Kelani River has become a critical issue in Sri Lanka due to the high cost of maintaining drinking water standards and the market and non-market costs of deteriorating river ecosystem services. By integrating a catchment model with a river model of water quality, we developed a method to estimate the effect of pollution sources on ambient water quality. Using integrated model simulations, we estimate (1) the relative contribution from point (industrial and domestic) and non-point sources (river catchment) to river water quality and (2) pollutant transfer coefficients for zones along the lower section of the river. Transfer coefficients provide the basis for policy analyses in relation to the location of new industries and the setting of priorities for industrial pollution control. They also offer valuable information to design socially optimal economic policy to manage industrialized river catchments.
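Transfer coefficients of the kind derived here act as a linear map from source loads to ambient zone concentrations. A minimal sketch, with invented coefficients and loads (the real coefficients come from the integrated catchment-river simulations):

```python
import numpy as np

# Ambient concentration contribution in each river zone as a linear
# combination of source loads via transfer coefficients T[i][j]
# (zone i, source j). Values are invented for illustration.
T = np.array([[0.8, 0.0, 0.0],    # upstream zone sees only source 0
              [0.5, 0.7, 0.0],    # middle zone: attenuated source 0 + source 1
              [0.3, 0.4, 0.9]])   # downstream zone sees all three sources
loads = np.array([10.0, 5.0, 2.0])   # e.g. industrial / domestic / catchment
conc = T @ loads                      # contribution per zone
print(conc)
```

Because the map is linear, policy questions become simple algebra: halving source 1's load reduces each zone's concentration by exactly `0.5 * T[:, 1] * loads[1]`, which is what makes the coefficients useful for siting decisions and pollution-control priorities.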
Effects of interface pressure distribution on human sleep quality.
Chen, Zongyong; Li, Yuqian; Liu, Rong; Gao, Dong; Chen, Quanhui; Hu, Zhian; Guo, Jiajun
2014-01-01
High sleep quality promotes efficient performance the following day. Sleep quality is influenced by environmental factors such as temperature, light, sound, and smell. Here, we investigated whether differences in the interface pressure distribution on healthy individuals during sleep influenced sleep quality. We defined four types of pressure models by differences in area distribution and the subjective feelings that occurred when participants slept on the mattresses. One model showed an "over-concentrated" distribution of pressure; one displayed "over-evenly" distributed interface pressure; the other two models displayed intermediate distributions of pressure. A polysomnography analysis demonstrated an increase in the duration and proportion of non-rapid-eye-movement sleep stages 3 and 4, as well as a decreased number of micro-arousals, in subjects sleeping on the models with intermediately distributed pressure compared to the models with over-concentrated or over-even distributions of pressure. Similarly, higher self-reported sleep quality scores were obtained from subjects sleeping on the two models with intermediate pressure distribution. Thus, pressure distribution influences, at least to some degree, sleep quality and self-reported sleep-related events, though the underlying mechanisms remain unknown. Regulating the pressure distribution imposed by the external sleep environment may be a new direction for improving sleep quality: only an appropriate interface pressure distribution benefits sleep, whereas over-concentrated or over-even distributions of pressure do not.
Local-Scale Air Quality Modeling in Support of Human Health and Exposure Research (Invited)
NASA Astrophysics Data System (ADS)
Isakov, V.
2010-12-01
Spatially- and temporally-sparse information on air quality is a key concern for air-pollution-related environmental health studies. Monitoring networks are sparse in both space and time, are costly to maintain, and are often designed purposely to avoid detecting highly localized sources. Recent studies have shown that more narrowly defining the geographic domain of the study populations and improvements in the measured/estimated ambient concentrations can lead to stronger associations between air pollution and hospital admissions and mortality records. Traditionally, ambient air quality measurements have been used as a primary input to support human health and exposure research. However, there is increasing evidence that the current ambient monitoring network is not capturing sharp gradients in exposure due to the presence of high concentration levels near, for example, major roadways. Many air pollutants exhibit large concentration gradients near large emitters such as major roadways, factories, and ports. To overcome these limitations, researchers are now beginning to use air quality models to support air pollution exposure and health studies. There are many advantages to using air quality models over traditional approaches based on existing ambient measurements alone. First, models can provide spatially- and temporally-resolved concentrations as direct input to exposure and health studies, thus better defining the concentration levels for the population in the geographic domain. Air quality models have a long history of use in air pollution regulation and are supported by regulatory agencies and a large user community. Also, models can provide bidirectional linkages between sources of emissions and ambient concentrations, thus allowing exploration of various mitigation strategies to reduce exposure risk.
In order to provide best estimates of air concentrations to support human health and exposure studies, model estimates should consider local-scale features, regional-scale transport, and photochemical transformations. Since these needs are currently not met by a single model, hybrid air quality modeling has recently been developed to combine these capabilities. In this paper, we present the results of two studies where we applied the hybrid modeling approach to provide spatial and temporal details in air quality concentrations to support exposure and health studies: a) an urban-scale air quality accountability study involving near-source exposures to multiple ambient air pollutants, and b) an urban-scale epidemiological study involving human health data based on emergency department visits.
NASA Astrophysics Data System (ADS)
Salha, A. A.; Stevens, D. K.
2013-12-01
This study presents the numerical application and statistical development of stream water quality modeling (SWQM) as a tool to investigate, manage, and research the transport and fate of water pollutants in the Lower Bear River, Box Elder County, Utah. The segment under study is the Bear River from Cutler Dam to its confluence with the Malad River (Subbasin HUC 16010204). Water quality problems arise primarily from high phosphorus and total suspended sediment concentrations caused by five permitted point source discharges and a complex network of canals and ducts of varying sizes and carrying capacities that transport water (for farming and agricultural uses) from the Bear River and back to it. The Utah Department of Environmental Quality (DEQ) has designated the entire reach of the Bear River between Cutler Reservoir and the Great Salt Lake as impaired. Stream water quality modeling requires specification of an appropriate model structure and process formulation according to the nature of the study area and the purpose of the investigation. The current model is (i) one-dimensional (1D), (ii) numerical, (iii) unsteady, (iv) mechanistic, (v) dynamic, and (vi) spatially distributed. The basic principle of the study is the use of mass balance equations and numerical methods (the Fickian advection-dispersion approach) to solve the related partial differential equations. Model error decreases and sensitivity increases as a model becomes more complex; as such, (i) uncertainty (in parameters, data input, and model structure) and (ii) model complexity will be under investigation. Watershed data (water quality parameters together with stream flow, seasonal variations, surrounding landscape, stream temperature, and point/nonpoint sources) were obtained mainly using HydroDesktop, a free and open-source GIS-enabled desktop application to find, download, visualize, and analyze time series of water and climate data registered with the CUAHSI Hydrologic Information System.
Processing, validity assessment, and distribution of the time-series data were explored using the GNU R language (a statistical computing and graphics environment). Equations for physical, chemical, and biological processes were written in Fortran (High Performance Fortran) to solve their hyperbolic and parabolic complexities, and post-analysis of the results was conducted in R. High-performance computing (HPC) will be introduced to expedite complex computations using parallel programming. It is expected that the model will assess nonpoint sources and specific point source data to understand pollutant causes, transfer, dispersion, and concentration at different locations along the Bear River. The impact of reducing or removing nonpoint nutrient loading on Bear River water quality management could also be investigated. Keywords: computer modeling; numerical solutions; sensitivity analysis; uncertainty analysis; ecosystem processes; high performance computing; water quality.
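The Fickian advection-dispersion mass balance at the core of such a model can be sketched with an explicit finite-difference step. The grid, velocity, and dispersion values below are illustrative only, chosen to satisfy the usual stability limits (Courant number u*dt/dx <= 1, diffusion number D*dt/dx^2 <= 0.5), and the scheme is a generic sketch rather than the study's Fortran implementation.

```python
import numpy as np

# Explicit step for the 1D advection-dispersion equation
#   dC/dt = -u dC/dx + D d2C/dx2
# using upwind advection and central dispersion differences.
nx, dx, dt = 100, 10.0, 1.0        # 1000 m reach, 10 m cells, 1 s step
u, D = 2.0, 5.0                    # velocity (m/s), dispersion (m2/s)
C = np.zeros(nx)
C[10] = 100.0                      # pollutant pulse at 100 m

for _ in range(200):
    adv = -u * (C[1:-1] - C[:-2]) / dx                   # upwind advection
    disp = D * (C[2:] - 2 * C[1:-1] + C[:-2]) / dx ** 2  # central dispersion
    C[1:-1] = C[1:-1] + dt * (adv + disp)
    C[0] = C[-1] = 0.0             # simple open boundaries

print("plume peak near", float(np.argmax(C) * dx), "m")
```

After 200 steps the pulse has been advected downstream at roughly u*t while spreading by dispersion, which is the qualitative behavior a calibrated model reproduces for point-source discharges.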
Adolescent HIV Prevention: An Application of the Elaboration Likelihood Model.
ERIC Educational Resources Information Center
Metzler, April E.; Weiskotten, David; Morgen, Keith J.
Ninth grade students (n=298) participated in a study to examine the influence of source credibility, message quality, and personal relevance on HIV prevention message efficacy. A pilot study with adolescent focus groups created the high and low quality messages, as well as the high (HIV+) and low (worried parent) credibility sources. Participants…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-24
... Schools Program (CSP) Grants for Replication and Expansion of High-Quality Charter Schools; Notice... purpose of the CSP is to increase national understanding of the charter school model and to expand the number of high-quality charter schools available to students across the Nation by providing financial...
[AIDS prevention among adolescents in school: a systematic review of the efficacy of interventions].
Juárez, O; Díez, E
1999-01-01
Preventive interventions are considered useful although poorly evaluated. Since 1990 there is growing evidence of effective school AIDS prevention interventions. This paper aims to identify school AIDS prevention programs for youngsters aged 13 to 19, published between 1990 and 1995; to analyze the quality of each evaluation and intervention; to assess their effectiveness; and to identify possible contributing factors. Reports were located through a Medline computerized search of published articles and reviews meeting the following criteria: school AIDS prevention programs, addressed to youngsters aged 13 to 19, published in Spanish, French, or English between 1990 and 1995 in the scientific literature, and evaluating changes in behavior or its determinants through quantitative measures. Evaluation quality was analyzed through sample size, the use of a control group, group comparability, drop-out analysis, and the time between pretest and posttest. Intervention quality was analyzed through the use of a psychological behavior-change model and the number of sessions. The effectiveness of the high-quality interventions in changing behaviors, intentions, attitudes, and knowledge was assessed. Twenty-nine studies were selected. On the evaluation quality criteria, 28% were considered high quality, 14% intermediate quality, and 58% low quality. On the intervention quality criteria, 27% were high quality, 41% intermediate quality, and 32% low quality. Eleven studies (38%) met high or intermediate quality criteria in both intervention and evaluation. All of these studies modified knowledge and attitudes, 80% modified behavioral intention, and 86% modified behavior.
The increases in knowledge and attitudes were in general substantial (greater than 10%), while changes in intentions and behaviors were smaller than 10%, although relevant. Only 38% of the studies may be considered of high or intermediate quality. Correctly evaluated preventive interventions that rely on a theoretical model and offer four or more sessions show evidence of moderate but relevant reduction of AIDS risk practices and important changes in the determinants of future behavior.
Berlow, Eric L.; Knapp, Roland A.; Ostoja, Steven M.; Williams, Richard J.; McKenny, Heather; Matchett, John R.; Guo, Qinghau; Fellers, Gary M.; Kleeman, Patrick; Brooks, Matthew L.; Joppa, Lucas
2013-01-01
A central challenge of conservation biology is using limited data to predict rare species occurrence and identify conservation areas that play a disproportionate role in regional persistence. Where species occupy discrete patches in a landscape, such predictions require data about the environmental quality of individual patches and the connectivity among high quality patches. We present a novel extension to species occupancy modeling that blends traditional predictions of individual patch environmental quality with network analysis to estimate connectivity characteristics using limited survey data. We demonstrate this approach using environmental and geospatial attributes to predict observed occupancy patterns of the Yosemite toad (Anaxyrus (= Bufo) canorus) across >2,500 meadows in Yosemite National Park (USA). A. canorus, a Federal Proposed Species, breeds in shallow water associated with meadows. Our generalized linear model (GLM) accurately predicted ~84% of true presence-absence data on a subset of data withheld for testing. The predicted environmental quality of each meadow was iteratively 'boosted' by the quality of neighbors within dispersal distance. We used this park-wide meadow connectivity network to estimate the relative influence of an individual meadow's 'environmental quality' versus its 'network quality' to predict: a) clusters of high quality breeding meadows potentially linked by dispersal, b) breeding meadows with high environmental quality that are isolated from other such meadows, c) breeding meadows with lower environmental quality where long-term persistence may critically depend on the network neighborhood, and d) breeding meadows with the biggest impact on park-wide breeding patterns. Combined with targeted data on dispersal, genetics, disease, and other potential stressors, these results can guide designation of core conservation areas for A. canorus in Yosemite National Park.
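The two-step "environmental quality plus network boost" idea can be sketched as follows. The coordinates, GLM coefficients, dispersal radius, and the additive boost rule are all hypothetical stand-ins, not the paper's fitted model.

```python
import numpy as np

# Step 1: a logistic (GLM-style) score of each patch's environmental quality.
# Step 2: boost each score by the mean quality of neighbors within a
# dispersal radius, approximating the network-quality idea.
rng = np.random.default_rng(1)
n = 30
coords = rng.uniform(0, 50, size=(n, 2))       # patch locations (km), invented
elev, wet = rng.normal(size=n), rng.normal(size=n)  # invented covariates

beta0, b_elev, b_wet = -0.2, 0.8, 1.1          # hypothetical GLM coefficients
quality = 1 / (1 + np.exp(-(beta0 + b_elev * elev + b_wet * wet)))

dispersal_km = 10.0
dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
neighbors = (dist > 0) & (dist <= dispersal_km)

# Network quality: mean quality of reachable neighbors (0 for isolated patches)
net = np.array([quality[nb].mean() if nb.any() else 0.0 for nb in neighbors])
boosted = quality + 0.5 * net                  # simple additive boost (assumed)
print(boosted.round(2)[:5])
```

Comparing `quality` against `boosted` separates the four cases the abstract lists: high-quality connected clusters, high-quality isolated patches, low-quality patches rescued by their neighborhood, and patches whose removal would most change the network.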
Virginia R. Tolbert; Carl C. Trettin; Dale W. Johnson; John W. Parsons; Allan E. Houston; David A. Mays
2001-01-01
Ensuring sustainability of intensively managed woody crops requires determining soil and water quality effects using a combination of field data and modeling projections. Plot- and catchment-scale research, models, and meta-analyses are addressing nutrient availability, site quality, and measures to increase short-rotation woody crop (SRWC) productivity and site...
ERIC Educational Resources Information Center
Barczyk, Casimir; Buckenmeyer, Janet; Feldman, Lori
2010-01-01
This article presents a four-stage model for mentoring faculty in higher education to deliver high quality online instruction. It provides a timeline that shows the stages of program implementation. Known as the Distance Education Mentoring Program, its major outcomes include certified instructors, student achievement, and the attainment of a…
Growing high quality hardwoods: Plantation trials of mixed hardwood species in Tennessee
Christopher M. Oswalt; Wayne K. Clatterbuck
2011-01-01
Hardwood plantations are becoming increasingly important in the United States. To date, many foresters have relied on a conifer plantation model as the basis of establishing and managing hardwood plantations. The monospecific approach suggested by the conifer plantation model does not appear to provide for the development of quality hardwood logs similar to those found...
ERIC Educational Resources Information Center
Dettmers, Swantje; Trautwein, Ulrich; Ludtke, Oliver; Kunter, Mareike; Baumert, Jurgen
2010-01-01
The present study examined the associations of 2 indicators of homework quality (homework selection and homework challenge) with homework motivation, homework behavior, and mathematics achievement. Multilevel modeling was used to analyze longitudinal data from a representative national sample of 3,483 students in Grades 9 and 10; homework effects…
Hudak, R P; Jacoby, I; Meyer, G S; Potter, A L; Hooper, T I; Krakauer, H
1997-01-01
This article describes a training model that focuses on health care management by applying epidemiologic methods to assess and improve the quality of clinical practice. The model's uniqueness is its focus on integrating clinical evidence-based decision making with fundamental principles of resource management to achieve attainable, cost-effective, high-quality health outcomes. The target students are current and prospective clinical and administrative executives who must optimize decision making at the clinical and managerial levels of health care organizations.
Air quality surfaces representing pollutant concentrations across space and time are needed for many applications, including tracking trends and relating air quality to human and ecosystem health. The spatial and temporal characteristics of these surfaces may reveal new informat...
Wang, Dan; Liu, Chenxi; Zhang, Zinan; Ye, Liping; Zhang, Xinping
2018-06-01
Background Patient-centeredness and participatory care are increasingly regarded as a proxy for high-quality interpersonal care. Considering the development of the patient-centeredness and participatory care relationship model in the pharmacist-patient domain, it is of great significance to explore the mechanism by which pharmacist and patient participative behaviors influence relationship quality and patient outcomes. Objective To validate the pharmacist-patient relationship quality model in Chinese hospitals. Setting Four tertiary hospitals in 2017. Methods The provision of pharmaceutical care was investigated. A cross-sectional questionnaire survey covering the constructs of the communicative relationship quality model was conducted, and the associations among pairs of the study constructs were explored. Based on the results of confirmatory factor analysis, path analysis was conducted to validate the proposed communicative relationship quality model. Main outcome measure Model fit indicators including the Tucker-Lewis index (TLI), comparative fit index (CFI), root mean square error of approximation (RMSEA) and weighted root mean square residual (WRMR). Results There were 589 patients included in our study. The final path model had an excellent fit (TLI = 0.98, CFI = 0.98, RMSEA = 0.05; WRMR = 1.06). HCP participative behavior/patient-centeredness (β = 0.79, p < 0.001) and interpersonal communication (β = 0.13, p < 0.001) directly impact communicative relationship quality, but patient participative behavior was not a predictor of either communicative relationship quality or patient satisfaction. Conclusion HCP participative behavior/patient-centeredness and interpersonal communication are positively related to relationship quality, and relationship quality mediates the effects of HCP participative behavior and interpersonal communication on patient satisfaction.
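The fit indices reported above are standard structural-equation-model statistics. As a small illustration of one of them, RMSEA can be computed from the model chi-square, its degrees of freedom, and the sample size; the chi-square and df values below are invented for the example (only N = 589 matches the study's sample size).

```python
import math

def rmsea(chi2, df, n):
    """Root mean square error of approximation:
    RMSEA = sqrt(max(chi2 - df, 0) / (df * (n - 1)))."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

# Common rule of thumb: RMSEA <= ~0.05 indicates close fit,
# and a chi-square at or below its df yields RMSEA = 0.
```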
NASA Astrophysics Data System (ADS)
Thomas, A.; Huff, A. K.; Gomori, S. G.; Sadoff, N.
2014-12-01
In order to enhance the capacity for air quality modeling and improve air quality monitoring and management in the SERVIR Mesoamerica region, members of SERVIR's Applied Sciences Team (AST) are developing national numerical air quality models for El Salvador and Costa Rica. We are working with stakeholders from the El Salvador Ministry of the Environment and Natural Resources (MARN); National University of Costa Rica (UNA); the Costa Rica Ministry of the Environment, Energy, and Telecommunications (MINAET); and Costa Rica National Meteorological Institute (IMN), who are leaders in air quality monitoring and management in the Mesoamerica region. Focusing initially on these institutions will build sustainability in regional modeling activities by developing air quality modeling capability that can be shared with other countries in Mesoamerica. The air quality models are based on the Community Multi-scale Air Quality (CMAQ) model and incorporate meteorological inputs from the Weather Research and Forecasting (WRF) model, as well as national emissions inventories from El Salvador and Costa Rica. The models are being optimized for urban air quality, which is a priority of decision-makers in Mesoamerica. Once experimental versions of the modeling systems are complete, they will be transitioned to servers run by stakeholders in El Salvador and Costa Rica. The numerical air quality models will provide decision support for stakeholders to identify 1) high-priority areas for expanding national ambient air monitoring networks, 2) needed revisions to national air quality regulations, and 3) gaps in national emissions inventories. This project illustrates SERVIR's goal of the transition of science to support decision-making through capacity building in Mesoamerica, and it aligns with the Group on Earth Observations' health societal benefit theme. 
This presentation will describe technical aspects of the development of the models and outline key steps in our successful collaboration with the Mesoamerican stakeholders, including the processes of identifying and engaging decision-makers, understanding their requirements and limitations, communicating status updates on a regular basis, and providing sufficient training for end users to be able to utilize the models in a decision-making context.
Air quality impacts of projections of natural gas-fired distributed generation
NASA Astrophysics Data System (ADS)
Horne, Jeremy R.; Carreras-Sospedra, Marc; Dabdub, Donald; Lemar, Paul; Nopmongcol, Uarporn; Shah, Tejas; Yarwood, Greg; Young, David; Shaw, Stephanie L.; Knipping, Eladio M.
2017-11-01
This study assesses the potential impacts on emissions and air quality from the increased adoption of natural gas-fired distributed generation of electricity (DG), including displacement of power from central power generation, in the contiguous United States. The study includes four major tasks: (1) modeling of distributed generation market penetration; (2) modeling of central power generation systems; (3) modeling of spatially and temporally resolved emissions; and (4) photochemical grid modeling to evaluate the potential air quality impacts of increased DG penetration, which includes both power-only DG and combined heat and power (CHP) units, for 2030. Low and high DG penetration scenarios estimate the largest penetration of future DG units in three regions - New England, New York, and California. Projections of DG penetration in the contiguous United States estimate 6.3 GW and 24 GW of market adoption in 2030 for the low DG penetration and high DG penetration scenarios, respectively. High DG penetration (all of which is natural gas-fired) serves to offset 8 GW of new natural gas combined cycle (NGCC) units, and 19 GW of solar photovoltaic (PV) installations by 2030. In all scenarios, air quality in the central United States and the northwest remains unaffected as there is little to no DG penetration in those states. California and several states in the northeast are the most impacted by emissions from DG units. Peak increases in maximum daily 8-h average ozone concentrations exceed 5 ppb, which may impede attainment of ambient air quality standards. Overall, air quality impacts from DG vary greatly based on meteorological conditions, proximity to emissions sources, the number and type of DG installations, and the emissions factors used for DG units.
Evaluating Air-Quality Models: Review and Outlook.
NASA Astrophysics Data System (ADS)
Weil, J. C.; Sykes, R. I.; Venkatram, A.
1992-10-01
Over the past decade, much attention has been devoted to the evaluation of air-quality models, with emphasis on model performance in predicting the high concentrations that are important in air-quality regulations. This paper stems from our belief that this practice needs to be expanded to 1) evaluate model physics and 2) deal with the large natural or stochastic variability in concentration. The variability is represented by the root-mean-square fluctuating concentration (σc) about the mean concentration (C) over an ensemble, i.e., a given set of meteorological, source, and other conditions. Most air-quality models used in applications predict C, whereas observations are individual realizations drawn from an ensemble. When σc is large relative to C, large residuals exist between predicted and observed concentrations, which confound model evaluations. This paper addresses ways of evaluating model physics in light of the large σc; the focus is on elevated point-source models. Evaluation of model physics requires the separation of the mean model error, the difference between the predicted and observed C, from the natural variability. A residual analysis is shown to be an effective way of doing this. Several examples demonstrate the usefulness of residuals, as well as correlation analyses and laboratory data, in judging model physics. In general, σc models and predictions of the probability distribution of the fluctuating concentration, p(c), are in the developmental stage, with laboratory data playing an important role. Laboratory data from point-source plumes in a convection tank show that p(c) approximates a self-similar distribution along the plume center plane, a useful result in a residual analysis. At present, there is one model, ARAP, that predicts C, σc, and p(c) for point-source plumes. This model is more computationally demanding than other dispersion models (for C only) and must be demonstrated as a practical tool. 
However, it predicts an important quantity for applications- the uncertainty in the very high and infrequent concentrations. The uncertainty is large and is needed in evaluating operational performance and in predicting the attainment of air-quality standards.
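A residual analysis of the kind discussed above can be sketched generically: form log residuals between observed and predicted concentrations, then read the mean as the systematic model error and the spread as the stochastic variability. This is a common formulation, not necessarily the authors' exact procedure.

```python
import math

def residual_analysis(observed, predicted):
    """Log residuals r = ln(Co/Cp): the mean estimates systematic model
    error, the standard deviation reflects stochastic (ensemble) variability."""
    r = [math.log(o / p) for o, p in zip(observed, predicted)]
    n = len(r)
    mean = sum(r) / n
    var = sum((x - mean) ** 2 for x in r) / (n - 1)
    return mean, math.sqrt(var)
```

A mean near zero with a large spread suggests unbiased physics swamped by natural variability; a nonzero mean points to a genuine model error.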
The Unmanned Aerial System SUMO: an alternative measurement tool for polar boundary layer studies
NASA Astrophysics Data System (ADS)
Mayer, S.; Jonassen, M. O.; Reuder, J.
2012-04-01
Numerical weather prediction and climate models face special challenges, in particular in the commonly stable conditions of the high-latitude environment. For process studies as well as for model validation purposes, in-situ observations in the atmospheric boundary layer are urgently needed but difficult to obtain. We introduce a new measurement system for such observations. The Small Unmanned Meteorological Observer SUMO consists of a small and light-weight auto-piloted model aircraft, equipped with a meteorological sensor package. SUMO has been operated in polar environments, among others during IPY on Spitsbergen in 2009, and has proven its capability for atmospheric measurements with high spatial and temporal resolution even at temperatures of -30 °C. A comparison of the SUMO data with radiosondes and tethered balloons shows that SUMO can provide atmospheric profiles with quality comparable to those well-established systems. Its high data quality allowed its use for evaluating high-resolution model runs performed with the Weather Research and Forecasting model WRF and for the detailed investigation of an orographically modified flow during a case study.
NASA Astrophysics Data System (ADS)
Byun, D. W.; Rappenglueck, B.; Lefer, B.
2007-12-01
Accurate meteorological and photochemical modeling efforts are necessary to understand the measurements made during the Texas Air Quality Study (TexAQS-II). The main objective of the study is to understand the meteorological and chemical processes behind high ozone and regional haze events in eastern Texas, including the Houston-Galveston metropolitan area. Real-time and retrospective meteorological and photochemical model simulations were performed to study key physical and chemical processes in the Houston-Galveston area. In particular, the Vertical Mixing Experiment (VME) at the University of Houston campus was performed on selected days during TexAQS-II. Results of the MM5 meteorological model and CMAQ air quality model simulations were compared with the VME and other TexAQS-II measurements to understand the interaction of boundary layer dynamics and photochemical evolution affecting Houston air quality.
NASA Astrophysics Data System (ADS)
Siepmann, Jens P.; Wortberg, Johannes; Heinzler, Felix A.
2016-03-01
The injection molding process is strongly influenced by the viscosity of the material, which can change from one material batch to another. Together with the processing parameters, the initial condition of the material defines the process and product quality. A high percentage of technical polymers processed in injection molding is refined in a follow-up production step, for example electroplating. Processing optimized for electroplating often requires avoiding high shear stresses by using low injection speeds and pressures. Differences between material batches therefore occur especially in the quality-relevant low-shear-rate region. These differences and their quality-related influences can be investigated by detailed rheological analysis and process simulation based on adapted material-describing models. Differences in viscosity between batches can be detected by measurements with high-pressure capillary rheometers or, at low shear rates, with oscillatory rheometers. A combination of both measurement techniques is possible via the Cox-Merz relation. The detected differences in the rheological behavior of the two batches are summarized in two material-behavior model approaches and added to the simulation. In this paper, the results of processing simulations with standard filling parameters are presented for two ABS batches. Quality-defining quantities such as temperature, pressure and shear stress are investigated, and the influence of batch variations is pointed out with respect to electroplating quality demands. Furthermore, the results of simulations with a new quality-related process control are presented and compared to standard processing.
NASA Astrophysics Data System (ADS)
Osterman, G. B.; Eldering, A.; Neu, J. L.; Tang, Y.; McQueen, J.; Pinder, R. W.
2011-12-01
To help protect human health and ecosystems, regional-scale atmospheric chemistry models are used to forecast high ozone events and to design emission control strategies that decrease the frequency and severity of ozone events. Despite the impact that nighttime aloft ozone can have on surface ozone, regional-scale atmospheric chemistry models often do not simulate nighttime ozone concentrations well, nor do they sufficiently capture ozone transport patterns. Fully characterizing the importance of nighttime ozone has been hampered by limited measurements of the vertical distribution of ozone and ozone precursors. The main focus of this work is to begin to utilize remote sensing data sets to characterize the impact of nighttime aloft ozone on air quality events. We will describe our plans to use NASA satellite data sets, transport models and air quality models to study ozone transport, focusing primarily on nighttime ozone, and provide initial results. We will use satellite and ozonesonde data to help understand how well the air quality models are simulating ozone in the lower free troposphere and attempt to characterize the impact of nighttime ozone on air quality events. Our specific objectives are: 1) Characterize nighttime aloft ozone using remote sensing data and sondes. 2) Evaluate the ability of the Community Multi-scale Air Quality (CMAQ) model and the National Air Quality Forecast Capability (NAQFC) model to capture nighttime aloft ozone and its relationship to air quality events. 3) Analyze a set of air quality events and determine their relationship to nighttime aloft ozone. 
We will achieve our objectives by utilizing the ozone profile data from the NASA Earth Observing System (EOS) Tropospheric Emission Spectrometer (TES) and other sensors, ozonesonde data collected during the Aura mission (IONS), EPA AirNow ground station ozone data, the CMAQ continental-scale air quality model, and the National Air Quality Forecast model.
NASA Astrophysics Data System (ADS)
Choudhury, Anustup; Farrell, Suzanne; Atkins, Robin; Daly, Scott
2017-09-01
We present an approach to predict overall HDR display quality as a function of key HDR display parameters. We first performed subjective experiments on a high quality HDR display that explored five key HDR display parameters: maximum luminance, minimum luminance, color gamut, bit-depth and local contrast. Subjects rated overall quality for different combinations of these display parameters. We explored two models: a physical model solely based on physically measured display characteristics, and a perceptual model that transforms physical parameters using human vision system models. For the perceptual model, we use a family of metrics based on a recently published color volume model (ICtCp), which consists of the PQ luminance non-linearity (ST 2084) and LMS-based opponent color, as well as an estimate of the display point spread function. To predict overall visual quality, we apply linear regression and machine learning techniques such as Multilayer Perceptron, RBF and SVM networks. We use RMSE and Pearson/Spearman correlation coefficients to quantify performance. We found that the perceptual model is better at predicting subjective quality than the physical model, and that SVM is better at prediction than linear regression. The significance and contribution of each display parameter was investigated. In addition, we found that combined parameters such as contrast do not improve prediction. Traditional perceptual models were also evaluated, and we found that models based on the PQ non-linearity performed better.
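The PQ luminance non-linearity mentioned above is standardized as SMPTE ST 2084. A minimal sketch of the PQ encoding (the inverse EOTF, mapping absolute luminance to a normalized signal) using the published constants:

```python
# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32     # 18.8515625
C3 = 2392 / 4096 * 32     # 18.6875

def pq_encode(luminance_nits):
    """Map absolute luminance (0..10000 cd/m^2) to a PQ signal in [0, 1]."""
    y = max(0.0, min(luminance_nits / 10000.0, 1.0)) ** M1
    return ((C1 + C2 * y) / (1.0 + C3 * y)) ** M2
```

The curve allocates code values roughly according to contrast sensitivity: 100 cd/m² (typical SDR peak) already maps to about half of the signal range, leaving the other half for highlights up to 10,000 cd/m².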
Quality assessment concept of the World Data Center for Climate and its application to CMIP5 data
NASA Astrophysics Data System (ADS)
Stockhause, M.; Höck, H.; Toussaint, F.; Lautenschlager, M.
2012-08-01
The preservation of data in a high state of quality which is suitable for interdisciplinary use is one of the most pressing and challenging current issues in long-term archiving. For high volume data such as climate model data, the data and data replicas are no longer stored centrally but distributed over several local data repositories, e.g. the data of the Climate Model Intercomparison Project Phase 5 (CMIP5). The most important part of the data is to be archived, assigned a DOI, and published according to the World Data Center for Climate's (WDCC) application of the DataCite regulations. The integrated part of WDCC's data publication process, the data quality assessment, was adapted to the requirements of a federated data infrastructure. A concept of a distributed and federated quality assessment procedure was developed, in which the workload and responsibility for quality control is shared between the three primary CMIP5 data centers: Program for Climate Model Diagnosis and Intercomparison (PCMDI), British Atmospheric Data Centre (BADC), and WDCC. This distributed quality control concept, its pilot implementation for CMIP5, and first experiences are presented. The distributed quality control approach is capable of identifying data inconsistencies and of making quality results immediately available for data creators, data users and data infrastructure managers. Continuous publication of new data versions and slow data replication prevent the quality-control checks from being completed. This, together with ongoing developments of the data and metadata infrastructure, requires adaptations in the code and concept of the distributed quality control approach.
Managing Chronic Disease in Ontario Primary Care: The Impact of Organizational Factors
Russell, Grant M.; Dahrouge, Simone; Hogg, William; Geneau, Robert; Muldoon, Laura; Tuna, Meltem
2009-01-01
PURPOSE New approaches to chronic disease management emphasize the need to improve the delivery of primary care services to meet the needs of chronically ill patients. This study (1) assessed whether chronic disease management differed among 4 models of primary health care delivery and (2) identified which practice organizational factors were independently associated with high-quality care. METHODS We undertook a cross-sectional survey with nested qualitative case studies (2 practices per model) in 137 randomly selected primary care practices from 4 delivery models in Ontario, Canada: fee for service, capitation, blended payment, and community health centers (CHCs). Practice and clinician surveys were based on the Primary Care Assessment Tool. A chart audit assessed evidence-based care delivery for patients with diabetes, congestive heart failure, and coronary artery disease. Intermediate outcomes were calculated for patients with diabetes and hypertension. Multiple linear regression identified those organizational factors independently associated with chronic disease management. RESULTS Chronic disease management was superior in CHCs. Clinicians in CHCs found it easier than those in the other models to promote high-quality care through longer consultations and interprofessional collaboration. Across the whole sample and independent of model, high-quality chronic disease management was associated with the presence of a nurse-practitioner. It was also associated with lower patient-family physician ratios and with practices having 4 or fewer full-time-equivalent family physicians. CONCLUSIONS The study adds to the literature supporting the value of nurse-practitioners within primary care teams and validates the contributions of Ontario’s CHCs. Our observation that quality of care decreased in larger, busier practices suggests that moves toward larger practices and greater patient-physician ratios may have unanticipated negative effects on processes of care quality. 
PMID:19597168
2010-06-01
…The Chi-Square test fails to reject the null hypothesis that there is no difference between 2008 and 2009 data (p-value = 0.601). …value attributed to process performance modeling. Table 4: Relationships between data quality and integrity activities and overall value attributed to…data quality and integrity; staffing and resources devoted to the work; pertinent training and coaching; and the alignment of the models with…
A key factor for improving models of ecosystem benefits is the availability of high quality spatial data. High resolution LIDAR data are now commonly available and can be used to produce more accurate model outputs. However, increased resolution leads to higher computer resource...
Quality assessment of protein model-structures based on structural and functional similarities.
Konopka, Bogumil M; Nebel, Jean-Christophe; Kotulska, Malgorzata
2012-09-21
Experimental determination of protein 3D structures is expensive, time consuming and sometimes impossible. The gap between the number of protein structures deposited in the Worldwide Protein Data Bank and the number of sequenced proteins broadens constantly. Computational modeling is deemed to be one of the ways to deal with the problem. Although protein 3D structure prediction is a difficult task, many tools are available. These tools can model a structure from a sequence or from partial structural information, e.g. contact maps. Consequently, biologists have the ability to generate automatically a putative 3D structure model of any protein. However, the main issue becomes evaluation of the model quality, which is one of the most important challenges of structural biology. GOBA (Gene Ontology-Based Assessment) is a novel Protein Model Quality Assessment Program. It estimates the compatibility between a model-structure and its expected function. GOBA is based on the assumption that a high quality model is expected to be structurally similar to proteins functionally similar to the prediction target. Whereas DALI is used to measure structure similarity, protein functional similarity is quantified using the standardized and hierarchical description of proteins provided by Gene Ontology, combined with Wang's algorithm for calculating semantic similarity. Two approaches are proposed to express the quality of protein model-structures. One is a single-model quality assessment method; the other is its modification, which provides a relative measure of model quality. Exhaustive evaluation is performed on data sets of model-structures submitted to the CASP8 and CASP9 contests. The validation shows that the method is able to discriminate between good and bad model-structures. The best of the tested GOBA scores achieved mean Pearson correlations of 0.74 and 0.8 to the observed quality of models in our CASP8- and CASP9-based validation sets. 
GOBA also obtained the best result for two targets of CASP8, and one of CASP9, compared to the contest participants. Consequently, GOBA offers a novel single model quality assessment program that addresses the practical needs of biologists. In conjunction with other Model Quality Assessment Programs (MQAPs), it would prove useful for the evaluation of single protein models.
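Wang's semantic-similarity algorithm, which GOBA uses for the functional side of its score, propagates "S-values" up the Gene Ontology graph and compares the shared ancestry of two terms. A sketch on a toy DAG (the term names and edges below are invented; the is_a/part_of contribution factors of 0.8 and 0.6 follow Wang et al.'s commonly cited defaults):

```python
# Toy GO-style DAG: child -> list of (parent, edge_weight)
IS_A, PART_OF = 0.8, 0.6
DAG = {
    "GO:C": [("GO:B", IS_A)],
    "GO:B": [("GO:A", IS_A)],
    "GO:D": [("GO:B", PART_OF)],
    "GO:A": [],
}

def s_values(term):
    """Wang's S-values: S(term)=1; S(ancestor) is the max over paths of the
    product of edge weights along the path."""
    s = {term: 1.0}
    stack = [term]
    while stack:
        t = stack.pop()
        for parent, w in DAG[t]:
            v = s[t] * w
            if v > s.get(parent, 0.0):
                s[parent] = v
                stack.append(parent)
    return s

def wang_similarity(a, b):
    """Shared semantic contribution of common ancestors over total contribution."""
    sa, sb = s_values(a), s_values(b)
    common = set(sa) & set(sb)
    return sum(sa[t] + sb[t] for t in common) / (sum(sa.values()) + sum(sb.values()))
```

Identical terms score 1, and similarity decays as the nearest common ancestors sit further up the graph or are reached through weaker (part_of) edges.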
Quality assessment of Digital Elevation Model (DEM) in view of the Altiplano hydrological modeling
NASA Astrophysics Data System (ADS)
Satgé, F.; Arsen, A.; Bonnet, M.; Timouk, F.; Calmant, S.; Pilco, R.; Molina, J.; Lavado, W.; Crétaux, J.; HASM
2013-05-01
Topography is a crucial data input for hydrological modeling, but in many regions of the world the only way to characterize topography is to use satellite-based Digital Elevation Models (DEM). In some regions the quality of these DEMs remains poor and induces modeling errors that may or may not be compensated by tuning model parameters. In such regions, the evaluation of these data uncertainties is an important step in the modeling procedure. In this study, which focuses on the Altiplano region, we present the evaluation of the two freely available DEMs. The Shuttle Radar Topography Mission (SRTM) DEM, a product of the National Aeronautics and Space Administration (NASA), and the Advanced Spaceborne Thermal Emission and Reflection Radiometer Global Digital Elevation Model (ASTER GDEM), provided by the Ministry of Economy, Trade and Industry of Japan (METI) in collaboration with NASA, are widely used. While the former has a resolution of 3 arc seconds (90 m), the latter has 1 arc second (30 m). In order to select the most reliable DEM, we compared the DEM elevations with high-quality control point elevations. Because of its large spatial coverage (tracks spaced 30 km apart, with a measurement every 172 m) and its high vertical accuracy, better than 15 cm in good weather conditions, the Geoscience Laser Altimeter System (GLAS) on board NASA's Ice, Cloud, and land Elevation Satellite (ICESat) represents the best source for a high-quality elevation database. After a quality check, more than 150,000 ICESat/GLAS measurements are suitable in terms of accuracy for the Altiplano watershed. This database has been used to evaluate the vertical accuracy of each DEM. Thanks to the full spatial coverage, the comparison has been carried out for all land-cover types, altitude ranges and mean slopes.
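The DEM-versus-ICESat comparison described above reduces to summary statistics of the elevation differences at the control points. A generic sketch (the sample elevations in the test are made up; bias, standard deviation and RMSE are the usual vertical-accuracy measures, not necessarily the only ones the authors report):

```python
import math

def vertical_accuracy(dem_elev, ref_elev):
    """Bias, standard deviation, and RMSE of DEM minus reference
    (e.g. ICESat/GLAS) elevations at co-located control points."""
    d = [a - b for a, b in zip(dem_elev, ref_elev)]
    n = len(d)
    bias = sum(d) / n
    std = math.sqrt(sum((x - bias) ** 2 for x in d) / (n - 1))
    rmse = math.sqrt(sum(x * x for x in d) / n)
    return bias, std, rmse
```

Separating bias from spread matters here: a constant offset can often be corrected, while a large spread (e.g. on steep slopes) feeds directly into hydrological-model error.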
Modelling of beef sensory quality for a better prediction of palatability.
Hocquette, Jean-François; Van Wezemael, Lynn; Chriki, Sghaier; Legrand, Isabelle; Verbeke, Wim; Farmer, Linda; Scollan, Nigel D; Polkinghorne, Rod; Rødbotten, Rune; Allen, Paul; Pethick, David W
2014-07-01
Despite efforts by the industry to control the eating quality of beef, there remains a high level of variability in palatability, which is one reason for consumer dissatisfaction. In Europe, there is still no reliable on-line tool to predict beef quality and deliver consistent quality beef to consumers. Beef quality traits depend in part on the physical and chemical properties of the muscles. The determination of these properties (known as muscle profiling) will allow more informed decisions to be made in the selection of individual muscles for the production of value-added products. Therefore, scientists and professional partners of the ProSafeBeef project have brought together all the data they have accumulated over 20 years. The resulting BIF-Beef (Integrated and Functional Biology of Beef) data warehouse contains available data on animal growth, carcass composition, muscle tissue characteristics and beef quality traits. This database is useful to determine the most important muscle characteristics associated with high tenderness, high flavour or generally high quality. Another, more consumer-driven modelling tool was developed in Australia: the Meat Standards Australia (MSA) grading scheme, which predicts beef quality for each individual muscle × cooking method combination using various information on the corresponding animals and post-slaughter processing factors. This system also has the potential to detect variability in quality within muscles. The MSA system proved to be effective in predicting beef palatability not only in Australia but also in many other countries. The results of the work conducted in Europe within the ProSafeBeef project indicate that it would be possible to manage a grading system in Europe similar to the MSA system. 
The combination of the different modelling approaches (namely muscle biochemistry and a MSA-like meat grading system adapted to the European market) is a promising area of research to improve the prediction of beef quality. In both approaches, the volume of data available not only provides statistically sound correlations between various factors and beef quality traits but also a better understanding of the variability of beef quality according to various criteria (breed, age, sex, pH, marbling etc.). © 2013 The American Meat Science Association. All rights reserved.
Moore, Kevin L; Schmidt, Rachel; Moiseenko, Vitali; Olsen, Lindsey A; Tan, Jun; Xiao, Ying; Galvin, James; Pugh, Stephanie; Seider, Michael J; Dicker, Adam P; Bosch, Walter; Michalski, Jeff; Mutic, Sasa
2015-06-01
The purpose of this study was to quantify the frequency and clinical severity of quality deficiencies in intensity modulated radiation therapy (IMRT) planning in the Radiation Therapy Oncology Group 0126 protocol. A total of 219 IMRT patients from the high-dose arm (79.2 Gy) of RTOG 0126 were analyzed. To quantify plan quality, we used established knowledge-based methods for patient-specific dose-volume histogram (DVH) prediction of organs at risk and a Lyman-Kutcher-Burman (LKB) model for grade ≥2 rectal complications to convert DVHs into normal tissue complication probabilities (NTCPs). The LKB model was validated by fitting dose-response parameters relative to observed toxicities. The 90th percentile (22 of 219) of plans with the lowest excess risk (difference between clinical and model-predicted NTCP) were used to create a model for the presumed best practices in the protocol (pDVH0126,top10%). Applying the resultant model to the entire sample enabled comparisons between DVHs that patients could have received to DVHs they actually received. Excess risk quantified the clinical impact of suboptimal planning. Accuracy of pDVH predictions was validated by replanning 30 of 219 patients (13.7%), including equal numbers of presumed "high-quality," "low-quality," and randomly sampled plans. NTCP-predicted toxicities were compared to adverse events on protocol. Existing models showed that bladder-sparing variations were less prevalent than rectum quality variations and that increased rectal sparing was not correlated with target metrics (dose received by 98% and 2% of the PTV, respectively). Observed toxicities were consistent with current LKB parameters. Converting DVH and pDVH0126,top10% to rectal NTCPs, we observed 94 of 219 patients (42.9%) with ≥5% excess risk, 20 of 219 patients (9.1%) with ≥10% excess risk, and 2 of 219 patients (0.9%) with ≥15% excess risk. 
Replanning demonstrated the predicted NTCP reductions while maintaining the volume of the PTV receiving prescription dose. An equivalent sample of high-quality plans showed fewer toxicities than low-quality plans, 6 of 73 versus 10 of 73 respectively, although these differences were not significant (P=.21) due to insufficient statistical power in this retrospective study. Plan quality deficiencies in RTOG 0126 exposed patients to substantial excess risk for rectal complications. Copyright © 2015 Elsevier Inc. All rights reserved.
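The DVH-to-NTCP conversion described above follows the standard Lyman-Kutcher-Burman probit form. A minimal sketch, assuming the usual gEUD-based formulation; the parameter values shown (TD50, m, n) are illustrative placeholders in the range commonly quoted for rectal toxicity, not the values fitted in this study:

```python
import math

def gEUD(dose_bins, vol_fracs, n):
    """Generalized equivalent uniform dose from a differential DVH.
    dose_bins: dose per bin (Gy); vol_fracs: fractional organ volume per bin."""
    a = 1.0 / n
    return sum(v * d ** a for d, v in zip(dose_bins, vol_fracs)) ** n

def lkb_ntcp(dose_bins, vol_fracs, TD50, m, n):
    """LKB NTCP: standard normal CDF of t = (gEUD - TD50) / (m * TD50)."""
    t = (gEUD(dose_bins, vol_fracs, n) - TD50) / (m * TD50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# a uniform dose equal to TD50 yields, by construction, NTCP = 0.5
risk = lkb_ntcp([76.9], [1.0], TD50=76.9, m=0.13, n=0.09)
```

A whole-organ uniform dose at TD50 returns 0.5; doses above TD50 push the probit argument positive and the complication probability above 50%.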
Operation quality assessment model for video conference system
NASA Astrophysics Data System (ADS)
Du, Bangshi; Qi, Feng; Shao, Sujie; Wang, Ying; Li, Weijian
2018-01-01
Video conference systems have become an important support platform for smart grid operation and management, and their operation quality is of growing concern to grid enterprises. First, an evaluation indicator system covering network, business, and operation-maintenance aspects was established on the basis of the video conference system's operation statistics. Then, an operation quality assessment model combining a genetic algorithm with a regularized BP neural network was proposed; it outputs the system's operation quality level over a time period and provides managers with optimization advice. Simulation results show that the proposed model offers faster convergence and higher prediction accuracy than a regularized BP neural network alone, and that its generalization ability is superior to LM-BP and Bayesian BP neural networks.
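The hybrid scheme described, a genetic algorithm seeding a regularized backpropagation network, can be sketched compactly. Everything below (the synthetic KPI data, layer sizes, population size, mutation scale, learning rate) is a made-up illustration of the idea, not the paper's configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(W1, b1, W2, b2, X):
    H = np.tanh(X @ W1 + b1)              # hidden layer
    return (H @ W2 + b2).ravel(), H       # linear output

def loss(params, X, y, lam=1e-3):
    W1, b1, W2, b2 = params
    out, _ = forward(W1, b1, W2, b2, X)
    return np.mean((out - y) ** 2) + lam * (np.sum(W1**2) + np.sum(W2**2))

def random_params(n_in=3, n_hid=8):
    return [rng.normal(0, 1, (n_in, n_hid)), np.zeros(n_hid),
            rng.normal(0, 1, (n_hid, 1)), np.zeros(1)]

# toy data: an operation-quality score built from three normalized KPIs
X = rng.uniform(0, 1, (200, 3))
y = 0.5 * X[:, 0] + 0.3 * X[:, 1] + 0.2 * X[:, 2]

# stage 1: genetic algorithm evolves promising initial weights
pop = [random_params() for _ in range(20)]
for _ in range(30):
    pop.sort(key=lambda p: loss(p, X, y))
    elite = pop[:5]
    children = []
    for _ in range(15):
        i, j = rng.choice(5, 2, replace=False)
        children.append([(a + b) / 2 + rng.normal(0, 0.05, a.shape)  # crossover + mutation
                         for a, b in zip(elite[i], elite[j])])
    pop = elite + children
best = min(pop, key=lambda p: loss(p, X, y))
l_ga = loss(best, X, y)

# stage 2: backpropagation with L2 regularization, seeded by the GA result
W1, b1, W2, b2 = [p.copy() for p in best]
lam, lr = 1e-3, 0.05
for _ in range(500):
    out, H = forward(W1, b1, W2, b2, X)
    err = 2 * (out - y)[:, None] / len(y)      # dL/d(output)
    gW2 = H.T @ err + 2 * lam * W2
    dH = (err @ W2.T) * (1 - H ** 2)           # tanh derivative
    gW1 = X.T @ dH + 2 * lam * W1
    W1 -= lr * gW1; b1 -= lr * dH.sum(0)
    W2 -= lr * gW2; b2 -= lr * err.sum(0)
```

The GA stage avoids the poor local minima that plague randomly initialized BP, which is the usual motivation for this combination; the L2 penalty is the regularization that controls overfitting.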
Modeling the Water-Quality Effects of Changes to the Klamath River Upstream of Keno Dam, Oregon
Sullivan, Annett B.; Sogutlugil, I. Ertugrul; Rounds, Stewart A.; Deas, Michael L.
2013-01-01
The Link River to Keno Dam (Link-Keno) reach of the Klamath River, Oregon, generally has periods of water-quality impairment during summer, including low dissolved oxygen, elevated concentrations of ammonia and algae, and high pH. Efforts are underway to improve water quality in this reach through a Total Maximum Daily Load (TMDL) program and other management and operational actions. To assist in planning, a hydrodynamic and water-quality model was used in this study to provide insight about how various actions could affect water quality in the reach. These model scenarios used a previously developed and calibrated CE-QUAL-W2 model of the Link-Keno reach developed by the U.S. Geological Survey (USGS), Watercourse Engineering Inc., and the Bureau of Reclamation for calendar years 2006-09 (referred to as the "USGS model" in this report). Another model of the same river reach was previously developed by Tetra Tech, Inc. and the Oregon Department of Environmental Quality for years 2000 and 2002 and was used in the TMDL process; that model is referred to as the "TMDL model" in this report. This report includes scenarios that (1) assess the effect of TMDL allocations on water quality, (2) provide insight on certain aspects of the TMDL model, (3) assess various methods to improve water quality in this reach, and (4) examine possible water-quality effects of a future warmer climate. Results presented in this report for the first 5 scenarios supersede or augment those that were previously published (scenarios 1 and 2 in Sullivan and others [2011], 3 through 5 in Sullivan and others [2012]); those previous results are still valid, but the results for those scenarios in this report are more current.
Abbott Preschool Program Longitudinal Effects Study: Fifth Grade Follow-Up
ERIC Educational Resources Information Center
Barnett, W. Steven; Jung, Kwanghee; Youn, Min-Jong; Frede, Ellen C.
2013-01-01
New Jersey's Abbott Preschool program is of broad national and international interest because the Abbott program provides a model for building a high-quality system of universal pre-K through public-private partnerships that transform the existing system. The program offers high-quality pre-K to all children in 31 New Jersey communities with high…
ERIC Educational Resources Information Center
Epstein, Ann S.
The Training of Trainers (ToT) Evaluation investigated the efficacy of the High/Scope model for improving the quality of early childhood programs on a national scale. To address this question, the High/Scope Foundation undertook a multimethod evaluation that collected anecdotal records from the consultants and 793 participants in 40 ToT projects,…
Wilcox, S.; Andreas, A.
2010-03-16
The U.S. Department of Energy's National Renewable Energy Laboratory collaborates with the solar industry to establish high quality solar and meteorological measurements. This Solar Resource and Meteorological Assessment Project (SOLRMAP) provides high quality measurements to support deployment of power projects in the United States. The no-funds-exchanged collaboration brings NREL solar resource assessment expertise together with industry needs for measurements. The end result is high quality data sets to support the financing, design, and monitoring of large scale solar power projects for industry in addition to research-quality data for NREL model development. NREL provides consultation for instrumentation and station deployment, along with instrument calibrations, data acquisition, quality assessment, data distribution, and summary reports. Industry participants provide equipment, infrastructure, and station maintenance.
Stoffel, T.; Andreas, A.
2010-04-26
Wilcox, S.; Andreas, A.
2010-07-13
Wilcox, S.; Andreas, A.
2012-11-03
Solar Resource & Meteorological Assessment Project (SOLRMAP): Sun Spot Two; Swink, Colorado (Data)
Wilcox, S.; Andreas, A.
2010-11-10
Wilcox, S.; Andreas, A.
2010-07-14
Wilcox, S.; Andreas, A.
2009-07-22
Wilcox, S.; Andreas, A.
2010-11-03
Air Quality Science and Regulatory Efforts Require Geostationary Satellite Measurements
NASA Technical Reports Server (NTRS)
Pickering, Kenneth E.; Allen, D. J.; Stehr, J. W.
2006-01-01
Air quality scientists and regulatory agencies would benefit from the high spatial and temporal resolution trace gas and aerosol data that could be provided by instruments on a geostationary platform. More detailed time-resolved data from a geostationary platform could be used in tracking regional transport and in evaluating mesoscale air quality model performance in terms of photochemical evolution throughout the day. The diurnal cycle of photochemical pollutants is missing from the data provided by the current generation of atmospheric chemistry satellites, which provide only one measurement per day. Often peak surface ozone mixing ratios are reached much earlier in the day during major regional pollution episodes than during local episodes due to downward mixing of ozone that had been transported above the boundary layer overnight. The regional air quality models often do not simulate this downward mixing well enough and underestimate surface ozone in regional episodes. Having high time-resolution geostationary data will make it possible to determine the magnitude of this lower- and mid-tropospheric transport that contributes to peak eight-hour average ozone and 24-hour average PM2.5 concentrations. We will show ozone and PM2.5 episodes from the CMAQ model and suggest ways in which geostationary satellite data would improve air quality forecasting. Current regulatory modeling is typically performed at 12 km horizontal resolution. State and regional air quality regulators in regions with complex topography and/or land-sea breezes are eager to move to 4-km or finer resolution simulations. Geostationary data at these or finer resolutions will be useful in evaluating such models.
NASA Astrophysics Data System (ADS)
Choi, Yu-Jin; Hyde, Peter; Fernando, H. J. S.
High (episodic) particulate matter (PM) events over the sister cities of Douglas (AZ) and Agua Prieta (Sonora), located on the US-Mexico border, were simulated using the 3D Eulerian air quality model MODELS-3/CMAQ. The best available input information was used for the simulations, with the pollution inventory specified on a fine grid. In spite of inherent uncertainties in the emission inventory as well as the chemistry and meteorology of the air quality simulation tool, model evaluations showed acceptable PM predictions, while demonstrating the need to include interaction between meteorology and emissions in the model, a capability currently unavailable in MODELS-3/CMAQ when dealing with PM. Sensitivity studies on boundary influence indicate an insignificant regional (advection) contribution of PM to the study area, and the contribution of secondary particles to the occurrence of high PM events was trivial. High PM episodes in the study area are therefore purely local events that largely depend on local meteorological conditions. The major PM emission sources were identified as vehicular activity on unpaved/paved roads and wind-blown dust. The results will be of immediate utility in devising PM mitigation strategies for the study area, which is one of the US EPA-designated non-attainment areas with respect to PM.
Neural networks with fuzzy Petri nets for modeling a machining process
NASA Astrophysics Data System (ADS)
Hanna, Moheb M.
1998-03-01
The paper presents an intelligent architecture based on a feedforward neural network with fuzzy Petri nets for modeling product quality in a CNC machining center. It discusses how the proposed architecture can be used for modeling, monitoring, and controlling a product quality specification such as surface roughness. Surface roughness represents the output quality specification of parts produced by a CNC machining center as a result of a milling process. The neural network approach employs input parameters selected and defined by the machine operator via the CNC code. The fuzzy Petri net approach uses the exact input milling parameters, such as spindle speed, feed rate, tool diameter, and coolant (off/on), which can be obtained from the machine or its sensor system. The aim of the proposed architecture is to model the demanded surface roughness quality as high, medium, or low.
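The final step, grading surface roughness into high/medium/low quality, is typically done with fuzzy membership functions. A minimal sketch using triangular memberships; the Ra breakpoints below are invented for illustration and are not taken from the paper:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def grade_roughness(ra_um):
    """Fuzzy quality grade for surface roughness Ra (micrometres).
    Smoother surfaces (lower Ra) map to higher quality; breakpoints are
    illustrative placeholders only."""
    memberships = {
        "high":   tri(ra_um, -0.1, 0.4, 1.6),
        "medium": tri(ra_um,  0.8, 1.6, 2.4),
        "low":    tri(ra_um,  1.6, 3.0, 6.0),
    }
    return max(memberships, key=memberships.get)
```

A measured Ra of 0.5 µm grades as "high", 1.6 µm as "medium", 3.5 µm as "low" under these placeholder breakpoints.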
TU-EF-204-02: High Quality and Sub-mSv Cerebral CT Perfusion Imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Ke; Niu, Kai; Wu, Yijing
2015-06-15
Purpose: CT Perfusion (CTP) imaging is of great importance in acute ischemic stroke management due to its potential to detect hypoperfused yet salvageable tissue and distinguish it from definitely unsalvageable tissue. However, current CTP imaging suffers from poor image quality and high radiation dose (up to 5 mSv). The purpose of this work was to demonstrate that technical innovations such as Prior Image Constrained Compressed Sensing (PICCS) have the potential to address these challenges and achieve high quality and sub-mSv CTP imaging. Methods: (1) A spatial-temporal 4D cascaded system model was developed to identify the bottlenecks in the current CTP technology; (2) A task-based framework was developed to optimize the CTP system parameters; (3) Guided by (1) and (2), PICCS was customized for the reconstruction of CTP source images. Digital anthropomorphic perfusion phantoms, animal studies, and preliminary human subject studies were used to validate and evaluate the potential of using these innovations to advance the CTP technology. Results: The 4D cascaded model was validated in both phantom and canine stroke models. Based upon this cascaded model, it has been discovered that, as long as the spatial resolution and noise properties of the 4D source CT images are given, the 3D MTF and NPS of the final CTP maps can be analytically derived for a given set of processing methods and parameters. The cascaded model analysis also identified that the most critical technical factor in CTP is how to acquire and reconstruct high quality source images; it has very little to do with the denoising techniques often used after parametric perfusion calculations. This explained why PICCS resulted in a five-fold dose reduction or substantial improvement in image quality. Conclusion: Technical innovations generated promising results towards achieving high quality and sub-mSv CTP imaging for reliable and safe assessment of acute ischemic strokes. K. Li, K. Niu, Y. Wu: Nothing to disclose. G.-H. Chen: Research funded, GE Healthcare; Research funded, Siemens AX.
NASA Astrophysics Data System (ADS)
Lu, Lin; Chang, Yunlong; Li, Yingmin; He, Youyou
2013-05-01
A transverse magnetic field was introduced to the arc plasma in high-speed Tungsten Inert Gas (TIG) welding of stainless steel tubes without filler wire, and the influence of the external magnetic field on welding quality was investigated. Nine parameter sets were designed by means of an orthogonal experiment. The tensile strength of the welded joint and the form factor of the weld were taken as the main measures of welding quality. A binary quadratic nonlinear regression equation was established with magnetic induction and Ar gas flow rate as the independent variables, and the residual standard deviation was calculated to assess the accuracy of the regression model. The results showed that the regression model was correct and effective in calculating the tensile strength and aspect ratio of the weld. Two 3D regression models were built, and the effect of magnetic induction on welding quality was then examined.
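A binary quadratic regression of the kind described (a two-variable quadratic surface fitted by least squares) can be sketched directly; the nine design points and the coefficients of the synthetic "strength" surface below are invented for illustration, not the paper's data:

```python
import numpy as np

def fit_binary_quadratic(x, y, z):
    """Least-squares fit of z = c0 + c1*x + c2*y + c3*x^2 + c4*y^2 + c5*x*y."""
    A = np.column_stack([np.ones_like(x), x, y, x**2, y**2, x * y])
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    return coeffs

def predict(coeffs, x, y):
    c0, c1, c2, c3, c4, c5 = coeffs
    return c0 + c1*x + c2*y + c3*x**2 + c4*y**2 + c5*x*y

# 9-point orthogonal design: 3 levels of magnetic induction (mT)
# crossed with 3 levels of Ar flow rate (L/min); strengths are synthetic
B, Q = np.meshgrid([2.0, 4.0, 6.0], [8.0, 10.0, 12.0])
B, Q = B.ravel(), Q.ravel()
strength = 400 + 12*B - 3*Q - 0.9*B**2 + 0.1*Q**2 + 0.4*B*Q
coeffs = fit_binary_quadratic(B, Q, strength)
```

With nine design points and six coefficients the fit is over-determined; on real measurements the leftover residuals give the residual standard deviation used to judge the model's accuracy.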
Kiely, Daniel J; Stephanson, Kirk; Ross, Sue
2011-10-01
Low-cost laparoscopic box trainers built using home computers and webcams may provide residents with a useful tool for practice at home. This study set out to evaluate the image quality of low-cost laparoscopic box trainers compared with a commercially available model. Five low-cost laparoscopic box trainers including the components listed were compared in random order to one commercially available box trainer: A (high-definition USB 2.0 webcam, PC laptop), B (Firewire webcam, Mac laptop), C (high-definition USB 2.0 webcam, Mac laptop), D (standard USB webcam, PC desktop), E (Firewire webcam, PC desktop), and F (the TRLCD03 3-DMEd Standard Minimally Invasive Training System). Participants observed still image quality and performed a peg transfer task using each box trainer. Participants rated still image quality, image quality with motion, and whether the box trainer had sufficient image quality to be useful for training. Sixteen residents in obstetrics and gynecology took part in the study. The box trainers showing no statistically significant difference from the commercially available model were A, B, C, D, and E for still image quality; A for image quality with motion; and A and B for usefulness of the simulator based on image quality. The cost of the box trainers A-E is approximately $100 to $160 each, not including a computer or laparoscopic instruments. Laparoscopic box trainers built from a high-definition USB 2.0 webcam with a PC (box trainer A) or from a Firewire webcam with a Mac (box trainer B) provide image quality comparable with a commercial standard.
Burn injury models of care: A review of quality and cultural safety for care of Indigenous children.
Fraser, Sarah; Grant, Julian; Mackean, Tamara; Hunter, Kate; Holland, Andrew J A; Clapham, Kathleen; Teague, Warwick J; Ivers, Rebecca Q
2018-05-01
Safety and quality in the systematic management of burn care are important to ensure optimal outcomes. It is not clear if or how burn injury models of care uphold these qualities, or if they provide a space for culturally safe healthcare for Indigenous peoples, especially for children. This review is a critique of publicly available models of care, analysing their ability to facilitate safe, high-quality burn care for Indigenous children. Models of care were identified and mapped against cultural safety principles in healthcare, and against the National Health and Medical Research Council standard for clinical practice guidelines. An initial search and appraisal of tools was conducted to assess suitability of the tools in providing a mechanism to address quality and cultural safety. From the 53 documents found, 6 were eligible for review. Aspects of cultural safety were addressed in the models, but not explicitly, and were recorded very differently across all models. There was also limited or no cultural consultation documented in the models of care reviewed. Quality in the documents against National Health and Medical Research Council guidelines was evident; however, description or application of quality measures was inconsistent and incomplete. Gaps concerning safety and quality in the documented care pathways for Indigenous peoples who sustain a burn injury and require burn care highlight the need for investigation and reform of current practices. Copyright © 2017 Elsevier Ltd and ISBI. All rights reserved.
Fuzzy intelligent quality monitoring model for X-ray image processing.
Khalatbari, Azadeh; Jenab, Kouroush
2009-01-01
Today's imaging diagnosis needs to adopt modern quality engineering techniques to maintain and improve its accuracy and reliability in the health care system. One of the main factors that influences the diagnostic accuracy of plain-film X-ray in detecting pathology is the level of film exposure. If the level of film exposure is not adequate, a normal body structure may be interpreted as pathology and vice versa. This not only influences patient management but also has an impact on health care cost and the patient's quality of life. Therefore, providing an accurate, high-quality image is the first step toward excellent patient management in any health care system. In this paper, we study these techniques and present a fuzzy intelligent quality monitoring model that can be used to keep variables from degrading image quality. The variables, derived from chemical activity, cleaning procedures, maintenance, and monitoring, may not be sensed, measured, or calculated precisely due to uncertain situations. Therefore, a gamma-level fuzzy Bayesian model for quality monitoring of image processing is proposed. In order to apply the Bayesian concept, the fuzzy quality characteristics are assumed to be fuzzy random variables. Using these fuzzy quality characteristics, the newly developed model calculates the degradation risk for image processing. A numerical example is also presented to demonstrate the application of the model.
van den Akker, Jeroen; Mishne, Gilad; Zimmer, Anjali D; Zhou, Alicia Y
2018-04-17
Next generation sequencing (NGS) has become a common technology for clinical genetic tests. The quality of NGS calls varies widely and is influenced by features like reference sequence characteristics, read depth, and mapping accuracy. With recent advances in NGS technology and software tools, the majority of variants called using NGS alone are in fact accurate and reliable. However, a small subset of difficult-to-call variants that still do require orthogonal confirmation exist. For this reason, many clinical laboratories confirm NGS results using orthogonal technologies such as Sanger sequencing. Here, we report the development of a deterministic machine-learning-based model to differentiate between these two types of variant calls: those that do not require confirmation using an orthogonal technology (high confidence), and those that require additional quality testing (low confidence). This approach allows reliable NGS-based calling in a clinical setting by identifying the few important variant calls that require orthogonal confirmation. We developed and tested the model using a set of 7179 variants identified by a targeted NGS panel and re-tested by Sanger sequencing. The model incorporated several signals of sequence characteristics and call quality to determine if a variant was identified at high or low confidence. The model was tuned to eliminate false positives, defined as variants that were called by NGS but not confirmed by Sanger sequencing. The model achieved very high accuracy: 99.4% (95% confidence interval: +/- 0.03%). It categorized 92.2% (6622/7179) of the variants as high confidence, and 100% of these were confirmed to be present by Sanger sequencing. Among the variants that were categorized as low confidence, defined as NGS calls of low quality that are likely to be artifacts, 92.1% (513/557) were found to be not present by Sanger sequencing. 
This work shows that NGS data contains sufficient characteristics for a machine-learning-based model to differentiate low from high confidence variants. Additionally, it reveals the importance of incorporating site-specific features as well as variant call features in such a model.
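The published model is a trained machine-learning classifier over call-quality and site-specific features; as a sketch of the triage interface it implements, here is a hand-set rule-based stand-in. The feature names and thresholds are hypothetical placeholders, not the model's actual inputs or decision boundaries:

```python
def classify_variant(depth, allele_balance, mapping_quality, in_homopolymer):
    """Toy confidence triage for an NGS variant call (heterozygous case).
    High confidence: report without orthogonal confirmation.
    Low confidence: queue for Sanger confirmation.
    All thresholds are illustrative, not from the published model."""
    if (depth >= 50 and mapping_quality >= 40
            and 0.35 <= allele_balance <= 0.65
            and not in_homopolymer):
        return "high"
    return "low"
```

The design goal mirrors the paper's tuning: the rule is deliberately conservative, so anything unusual (shallow coverage, skewed allele balance, difficult sequence context) falls into the low-confidence bucket and gets orthogonal confirmation.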
George M. Banzhaf; Thomas G. Matney; Emily B. Schultz; James S. Meadows; J. Paul Jeffreys; William C. Booth; Gan Li; Andrew W. Ezell; Theodor D. Leininger
2016-01-01
Red oak (Quercus section Lobatae)-sweetgum (Liquidambar styraciflua L.) stands growing on mid-south bottomland sites in the United States are well known for producing high-quality grade hardwood logs, but models for estimating the quantity and quality of standing grade wood in these stands have been unavailable. Prediction...
Improving the geomagnetic field modeling with a selection of high-quality archaeointensity data
NASA Astrophysics Data System (ADS)
Pavon-Carrasco, Francisco Javier; Gomez-Paccard, Miriam; Herve, Gwenael; Osete, Maria Luisa; Chauvin, Annick
2014-05-01
Geomagnetic field reconstructions for the last millennia are based on archaeomagnetic data. However, the scatter of the archaeointensity data is very puzzling and clearly suggests that some of the intensity data might not be reliable. In this work we apply different selection criteria to the European and Western Asian archaeointensity data covering the last three millennia in order to investigate whether data selection affects geomagnetic field model results. Thanks to the recently developed archaeomagnetic databases, new valuable information related to the methodology used to determine the archaeointensity data is now available. We therefore used this information to rank the archaeointensity data in four quality categories depending on the methodology used during the laboratory treatment of the samples and on the number of specimens retained to calculate the mean intensities. Results show that the intensity component given by the regional geomagnetic field models hardly depends on the quality of the data selected. When all the available data are used, a different behavior of the geomagnetic field is observed in Western and Eastern Europe. However, when the regional model is obtained from a selection of high-quality intensity data, the same features are observed across Europe.
NASA Astrophysics Data System (ADS)
Ahmadov, R.; McKeen, S. A.; Trainer, M.; Banta, R. M.; Brown, S. S.; Edwards, P. M.; Frost, G. J.; Gilman, J.; Helmig, D.; Johnson, B.; Karion, A.; Koss, A.; Lerner, B. M.; Oltmans, S. J.; Roberts, J. M.; Schnell, R. C.; Veres, P. R.; Warneke, C.; Williams, E. J.; Wild, R. J.; Yuan, B.; Zamora, R. J.; Petron, G.; De Gouw, J. A.; Peischl, J.
2014-12-01
The huge increase in production of oil and natural gas has been associated with high wintertime ozone events over some parts of the western US. The Uinta Basin, UT, where oil and natural gas production is abundant experienced high ozone concentrations in winters of recent years, when cold stagnant weather conditions were prevalent. It has been very challenging for conventional air quality models to accurately simulate such wintertime ozone pollution cases. Here, a regional air quality model study was successfully conducted for the Uinta Basin by using the WRF-Chem model. For this purpose a new emission dataset for the region's oil/gas sector was built based on atmospheric in-situ measurements made during 2012 and 2013 field campaigns in the Uinta Basin. The WRF-Chem model demonstrates that the major factors driving high ozone in the Uinta Basin in winter are shallow boundary layers with light winds, high emissions of volatile organic compounds (VOC) compared to nitrogen oxides emissions from the oil and natural gas industry, enhancement of photolysis rates and reduction of O3 dry deposition due to snow cover. We present multiple sensitivity simulations to quantify the contribution of various factors driving high ozone over the Uinta Basin. The emission perturbation simulations show that the photochemical conditions in the Basin during winter of 2013 were VOC sensitive, which suggests that targeting VOC emissions would be most beneficial for regulatory purposes. Shortcomings of the emissions within the most recent US EPA (NEI-2011, version 1) inventory are also discussed.
NASA Astrophysics Data System (ADS)
Gholami, V.; Khaleghi, M. R.; Sebghati, M.
2017-11-01
Water quality testing is an expensive, time-consuming, and difficult but important stage of routine monitoring, so models have become commonplace for simulating water quality. In this study, a coactive neuro-fuzzy inference system (CANFIS) was used to simulate groundwater quality, with a geographic information system (GIS) serving as the pre- and post-processing tool to map the spatial variation of groundwater quality. All important factors were quantified and a groundwater quality index (GWQI) was developed. The proposed model was trained and validated on a case study of the Mazandaran Plain in northern Iran. The factors affecting groundwater quality were the input variables for the simulation, and the GWQI was the output. Network validation was performed by comparing the estimated and actual GWQI values. In GIS, the study area was converted to raster format with a pixel size of 1 km, and geo-referenced layers of the factors affecting groundwater quality were obtained by incorporating the input data layers of the CANFIS model. The numeric values of each pixel, with their geographic coordinates, were then entered into the CANFIS model to simulate groundwater quality across the study area. Finally, the simulated GWQI values from the CANFIS model were imported into GIS to produce a groundwater quality map (raster layer) based on the network simulation results. The results confirm the high efficiency of combining neuro-fuzzy techniques with GIS. It is also worth noting that the general groundwater quality in most of the studied plain is fairly low.
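A groundwater quality index of the kind described aggregates several measured parameters into one score. A minimal sketch using one common weighted-average formulation; the parameter names, weights, and permissible limits below are illustrative, and the paper's exact GWQI definition may differ:

```python
def gwqi(measured, weights, limits):
    """Weighted-average groundwater quality index (one common formulation).
    measured: parameter -> concentration; limits: permissible values;
    weights: relative importance of each parameter. 100 means every
    parameter sits exactly at its permissible limit."""
    total_w = sum(weights.values())
    return sum(weights[p] * (measured[p] / limits[p]) * 100.0
               for p in measured) / total_w

# hypothetical sample versus hypothetical limits (mg/L)
index = gwqi(
    measured={"TDS": 900, "Cl": 300, "NO3": 30},
    weights={"TDS": 4, "Cl": 3, "NO3": 5},
    limits={"TDS": 500, "Cl": 250, "NO3": 45},
)
```

Computed per pixel over the rasterized study area, an index like this is what the neuro-fuzzy network is trained to reproduce from the input factor layers.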
Application of Wavelet Filters in an Evaluation of ...
Air quality model evaluation can be enhanced with time-scale-specific comparisons of outputs and observations. For example, high-frequency (hours to one day) time-scale information in observed ozone is not well captured by deterministic models, and its incorporation into model performance metrics leads one to devote resources to stochastic variations in model outputs. In this analysis, observations are compared with model outputs at seasonal, weekly, diurnal and intra-day time scales. Filters provide frequency-specific information that can be used to compare the strength (amplitude) and timing (phase) of observations and model estimates. The National Exposure Research Laboratory's (NERL's) Atmospheric Modeling and Analysis Division (AMAD) conducts research in support of EPA's mission to protect human health and the environment. AMAD's research program is engaged in developing and evaluating predictive atmospheric models on all spatial and temporal scales for forecasting the Nation's air quality and for assessing changes in air quality and air pollutant exposures, as affected by changes in ecosystem management and regulatory decisions. AMAD is responsible for providing a sound scientific and technical basis for regulatory policies based on air quality models to improve ambient air quality. The models developed by AMAD are being used by EPA, NOAA, and the air pollution community in understanding and forecasting not only the magnitude of the air pollu
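The frequency-specific comparison described above can be illustrated with a much simpler scale separation than the study's wavelet filters. The sketch below is a stand-in under assumptions (synthetic data, a plain moving-average filter): it splits a toy hourly ozone series into a slow baseline and a fast diurnal component whose amplitudes could then be compared between observations and model output.

```python
# Illustrative scale separation (NOT the study's actual wavelet filter):
# a 24-hour centered moving average isolates the slow baseline; the residual
# carries the diurnal/intra-day variation.
import math

def moving_average(series, window):
    half = window // 2
    out = []
    for i in range(len(series)):
        lo, hi = max(0, i - half), min(len(series), i + half + 1)
        out.append(sum(series[lo:hi]) / (hi - lo))
    return out

# Synthetic 10-day hourly ozone: 40 ppb mean with a 15 ppb diurnal cycle
hourly = [40.0 + 15.0 * math.sin(2 * math.pi * t / 24) for t in range(240)]
baseline = moving_average(hourly, 24)
fast = [x - b for x, b in zip(hourly, baseline)]

print(round(baseline[120], 1))  # slow component sits near 40 away from edges
```

Comparing `max(fast)` between an observed and a modeled series would quantify how much diurnal amplitude the model reproduces, which is the spirit of the amplitude comparison in the abstract.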
USDA-ARS?s Scientific Manuscript database
Pasta is a simple food made from water and durum wheat (Triticum turgidum subsp. durum) semolina. As pasta increases in popularity, studies have endeavored to analyze the attributes that contribute to high quality pasta. Despite being a simple food, the laboratory scale analysis of pasta quality is ...
Production of high-quality polydisperse construction mixes for additive 3D technologies.
NASA Astrophysics Data System (ADS)
Gerasimov, M. D.; Brazhnik, Yu V.; Gorshkov, P. S.; Latyshev, S. S.
2018-03-01
The paper describes a new mixer design allowing the production of high-quality polydisperse powders used in additive 3D technologies. A new principle of dry powder particle mixing is considered, implementing the possibility of a close-to-ideal distribution of such particles in a common space. A mathematical model of the mixer is presented, allowing the evaluation of quality indicators of the produced mixture. Experimental results are shown and rational values of the process parameters of the mixer are obtained.
High-quality poly-dispersed mixtures applied in additive 3D technologies.
NASA Astrophysics Data System (ADS)
Gerasimov, M. D.; Brazhnik, Yu V.; Gorshkov, P. S.; Latyshev, S. S.
2018-03-01
The paper describes a new mixer design for obtaining high-quality poly-dispersed powders applied in additive 3D technologies. It also considers a new mixing principle for dry powder particles that ensures a close-to-ideal distribution of such particles throughout the total volume. The paper presents a mathematical model of mixer operation that provides for quality assessment of the ready mixtures. It also demonstrates experimental results and the rational values of mixer process parameters obtained.
Lawson, Elise H; Zingmond, David S; Stey, Anne M; Hall, Bruce L; Ko, Clifford Y
2014-10-01
To evaluate the relationship between risk-adjusted cost and quality for colectomy procedures and to identify characteristics of "high value" hospitals (high quality, low cost). Policymakers are currently focused on rewarding high-value health care. Hospitals will increasingly be held accountable for both quality and cost. Records (2005-2008) for all patients undergoing colectomy procedures in the American College of Surgeons National Surgical Quality Improvement Program (ACS-NSQIP) were linked to Medicare inpatient claims. Cost was derived from hospital payments by Medicare. Quality was derived from the occurrence of 30-day postoperative major complications and/or death as recorded in ACS-NSQIP. Risk-adjusted cost and quality metrics were developed using hierarchical multivariable modeling, consistent with a National Quality Forum-endorsed colectomy measure. The study population included 14,745 colectomy patients in 169 hospitals. Average hospitalization cost was $21,350 (SD $20,773, median $16,092, interquartile range $14,341-$24,598). Thirty-four percent of patients had a postoperative complication and/or death. Higher hospital quality was significantly correlated with lower cost (correlation coefficient 0.38, P < 0.001). Among hospitals classified as high quality, 52% were found to be low cost (representing highest value hospitals) whereas 14% were high cost (P = 0.001). Forty-one percent of low-quality hospitals were high cost. Highest "value" hospitals represented a mix of teaching/nonteaching affiliation, small/large bed sizes, and regional locations. Using national ACS-NSQIP and Medicare data, this study reports an association between higher quality and lower cost surgical care. These results suggest that high-value surgical care is being delivered in a wide spectrum of hospitals and hospital types.
BiGG Models: A platform for integrating, standardizing and sharing genome-scale models
King, Zachary A.; Lu, Justin; Drager, Andreas; ...
2015-10-17
Genome-scale metabolic models are mathematically structured knowledge bases that can be used to predict metabolic pathway usage and growth phenotypes. Furthermore, they can generate and test hypotheses when integrated with experimental data. To maximize the value of these models, centralized repositories of high-quality models must be established, models must adhere to established standards and model components must be linked to relevant databases. Tools for model visualization further enhance their utility. To meet these needs, we present BiGG Models (http://bigg.ucsd.edu), a completely redesigned Biochemical, Genetic and Genomic knowledge base. BiGG Models contains more than 75 high-quality, manually-curated genome-scale metabolic models. On the website, users can browse, search and visualize models. BiGG Models connects genome-scale models to genome annotations and external databases. Reaction and metabolite identifiers have been standardized across models to conform to community standards and enable rapid comparison across models. Furthermore, BiGG Models provides a comprehensive application programming interface for accessing BiGG Models with modeling and analysis tools. As a resource for highly curated, standardized and accessible models of metabolism, BiGG Models will facilitate diverse systems biology studies and support knowledge-based analysis of diverse experimental data.
BiGG Models: A platform for integrating, standardizing and sharing genome-scale models
King, Zachary A.; Lu, Justin; Dräger, Andreas; Miller, Philip; Federowicz, Stephen; Lerman, Joshua A.; Ebrahim, Ali; Palsson, Bernhard O.; Lewis, Nathan E.
2016-01-01
Genome-scale metabolic models are mathematically-structured knowledge bases that can be used to predict metabolic pathway usage and growth phenotypes. Furthermore, they can generate and test hypotheses when integrated with experimental data. To maximize the value of these models, centralized repositories of high-quality models must be established, models must adhere to established standards and model components must be linked to relevant databases. Tools for model visualization further enhance their utility. To meet these needs, we present BiGG Models (http://bigg.ucsd.edu), a completely redesigned Biochemical, Genetic and Genomic knowledge base. BiGG Models contains more than 75 high-quality, manually-curated genome-scale metabolic models. On the website, users can browse, search and visualize models. BiGG Models connects genome-scale models to genome annotations and external databases. Reaction and metabolite identifiers have been standardized across models to conform to community standards and enable rapid comparison across models. Furthermore, BiGG Models provides a comprehensive application programming interface for accessing BiGG Models with modeling and analysis tools. As a resource for highly curated, standardized and accessible models of metabolism, BiGG Models will facilitate diverse systems biology studies and support knowledge-based analysis of diverse experimental data. PMID:26476456
Thin-slice vision: inference of confidence measure from perceptual video quality
NASA Astrophysics Data System (ADS)
Hameed, Abdul; Balas, Benjamin; Dai, Rui
2016-11-01
There has been considerable research on thin-slice judgments, but no study has demonstrated the predictive validity of confidence measures when assessors watch videos acquired from communication systems, in which the perceptual quality of videos can be degraded by limited bandwidth and unreliable network conditions. This paper studies the relationship between high-level thin-slice judgments of human behavior and factors that contribute to perceptual video quality. Based on a large number of subjective test results, it has been found that the confidence of a single individual present in all the videos, called the speaker's confidence (SC), can be predicted by a list of features that contribute to perceptual video quality. Two prediction models, one based on an artificial neural network and the other on a decision tree, were built to predict SC. Experimental results have shown that both prediction models can achieve high correlation measures.
Santiago, Luis E; Gonzalez-Caban, Armando; Loomis, John
2008-06-01
Visitor use surveys and water quality data indicate that high visitor use levels on two rivers in Puerto Rico do not appear to adversely affect several water quality parameters. Optimum visitor use to maximize visitor-defined satisfaction is a more constraining limit on visitor use than water quality. Our multiple regression analysis suggests that visitor use of about 150 visitors per day yields the highest level of visitor-reported satisfaction, a level that does not appear to affect the turbidity of the river. This high level of visitor use may be related to the gregarious nature of Puerto Ricans and their tolerance for crowding on this densely populated island. The daily peak visitation model indicates that regulating the number of parking spaces may be the most effective way to keep visitor use within the social carrying capacity.
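A regression-derived optimum like the ~150 visitors per day reported above typically comes from the vertex of a fitted downward-opening quadratic. The sketch below shows only that final step; the coefficients are invented to reproduce a 150-visitor peak and are not the study's fitted values.

```python
# Vertex of an assumed fitted quadratic satisfaction curve:
#   satisfaction = a*visitors^2 + b*visitors + c, with a < 0.
# Coefficients are illustrative assumptions, not the study's estimates.

def optimal_visitors(a, b):
    if a >= 0:
        raise ValueError("satisfaction curve must open downward (a < 0)")
    return -b / (2 * a)  # vertex of the parabola

a, b, c = -0.002, 0.6, 20.0  # assumed regression coefficients
print(round(optimal_visitors(a, b), 1))  # peak-satisfaction use level
```

With these assumed coefficients the peak lands at 150 visitors per day, matching the shape of result the abstract describes.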
Taylor, Sam D; He, Yi; Hiscock, Kevin M
2016-09-15
Agricultural diffuse water pollution remains a notable global pressure on water quality, posing risks to aquatic ecosystems, human health and water resources and as a result legislation has been introduced in many parts of the world to protect water bodies. Due to their efficiency and cost-effectiveness, water quality models have been increasingly applied to catchments as Decision Support Tools (DSTs) to identify mitigation options that can be introduced to reduce agricultural diffuse water pollution and improve water quality. In this study, the Soil and Water Assessment Tool (SWAT) was applied to the River Wensum catchment in eastern England with the aim of quantifying the long-term impacts of potential changes to agricultural management practices on river water quality. Calibration and validation were successfully performed at a daily time-step against observations of discharge, nitrate and total phosphorus obtained from high-frequency water quality monitoring within the Blackwater sub-catchment, covering an area of 19.6 km(2). A variety of mitigation options were identified and modelled, both singly and in combination, and their long-term effects on nitrate and total phosphorus losses were quantified together with the 95% uncertainty range of model predictions. Results showed that introducing a red clover cover crop to the crop rotation scheme applied within the catchment reduced nitrate losses by 19.6%. Buffer strips of 2 m and 6 m width represented the most effective options to reduce total phosphorus losses, achieving reductions of 12.2% and 16.9%, respectively. This is one of the first studies to quantify the impacts of agricultural mitigation options on long-term water quality for nitrate and total phosphorus at a daily resolution, in addition to providing an estimate of the uncertainties of those impacts. 
The results highlighted the need to consider multiple pollutants, the degree of uncertainty associated with model predictions and the risk of unintended pollutant impacts when evaluating the effectiveness of mitigation options, and showed that high-frequency water quality datasets can be applied to robustly calibrate water quality models, creating DSTs that are more effective and reliable. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
Suter, Paula; Hennessey, Beth; Florez, Donna; Newton Suter, W
2011-01-01
Individuals with chronic obstructive pulmonary disease (COPD) face significant challenges due to frequent distressing dyspnea and deficits related to activities of daily living. Individuals with COPD are often hospitalized frequently for disease exacerbations, negatively impacting quality of life and the healthcare expenditure burden. The home-based chronic care model (HBCCM) was designed to address the needs of patients with chronic diseases. This model facilitates the re-design of chronic care delivery within the home health sector by ensuring patient-centered, evidence-based care. The HBCCM's foundation is Dr. Edward Wagner's chronic care model, with four additional areas of focus: high-touch delivery, theory-based self-management, specialist oversight and the use of technology. This article describes the model in detail and outlines how its use for patients with COPD can bring value to stakeholders across the health care continuum.
Christopher Daly; Jonathan W. Smith; Joseph I. Smith; Robert B. McKane
2007-01-01
High-quality daily meteorological data at high spatial resolution are essential for a variety of hydrologic and ecological modeling applications that support environmental risk assessments and decision-making. This paper describes the development, application, and assessment of methods to construct daily high-resolution (~50-m cell size) meteorological grids for the...
Frameworks for Assessing the Quality of Modeling and Simulation Capabilities
NASA Astrophysics Data System (ADS)
Rider, W. J.
2012-12-01
The importance of assuring quality in modeling and simulation has spawned several frameworks for structuring the examination of quality. The format and content of these frameworks provide an emphasis, completeness and flow to assessment activities. I will examine four frameworks that have been developed and describe how they can be improved and applied to a broader set of high-consequence applications. Perhaps the first of these frameworks was CSAU (code scaling, applicability and uncertainty) [Boyack], used for nuclear reactor safety and endorsed by the United States Nuclear Regulatory Commission (USNRC). This framework was shaped by nuclear safety practice and the practical structure needed after the Three Mile Island accident. It incorporated the dominant experimental program, the dominant analysis approach, and concerns about the quality of modeling. The USNRC gave it the force of law, which made the nuclear industry take it seriously. After the cessation of nuclear weapons testing, the United States began a program of examining the reliability of these weapons without testing. This program utilizes science including theory, modeling, simulation and experimentation to replace underground testing. The emphasis on modeling and simulation necessitated attention to the quality of these simulations. Sandia developed the PCMM (predictive capability maturity model) to structure this attention [Oberkampf]. PCMM divides simulation into six core activities to be examined and graded relative to the needs of the modeling activity. NASA [NASA] has built yet another framework in response to the tragedy of the space shuttle accidents. Finally, Ben-Haim and Hemez focus upon modeling robustness and predictive fidelity in another approach. These frameworks are similar and are applied in a similar fashion. The adoption of these frameworks at Sandia and NASA has been slow and arduous because the force of law has not assisted acceptance.
All existing frameworks are incomplete and need to be extended, incorporating elements from the others as well as new elements related to how models are solved and how the model will be applied. I will describe this merger of approaches and how it should be applied. The problems in adoption are related to basic human nature, in that no one likes to be graded or told they are not sufficiently quality oriented. Rather than engage in an adversarial role, I suggest that the frameworks be viewed as collaborative tools: they should be used to structure collaborations that help modeling and simulation efforts achieve high quality. The framework provides a comprehensive setting of modeling and simulation themes that should be explored in providing high quality. W. Oberkampf, M. Pilch, and T. Trucano, Predictive Capability Maturity Model for Computational Modeling and Simulation, SAND2007-5948, 2007. B. Boyack, Quantifying Reactor Safety Margins Part 1: An Overview of the Code Scaling, Applicability, and Uncertainty Evaluation Methodology, Nuc. Eng. Design, 119, pp. 1-15, 1990. National Aeronautics and Space Administration, Standard for Models and Simulations, NASA-STD-7009, 2008. Y. Ben-Haim and F. Hemez, Robustness, fidelity and prediction-looseness of models, Proc. R. Soc. A (2012) 468, 227-244.
Study of Regional Downscaled Climate and Air Quality in the United States
NASA Astrophysics Data System (ADS)
Gao, Y.; Fu, J. S.; Drake, J.; Lamarque, J.; Lam, Y.; Huang, K.
2011-12-01
Due to increasing anthropogenic greenhouse gas emissions, global and regional climate patterns have changed significantly. Climate change has exerted a strong impact on ecosystems, air quality and human life. The global Community Earth System Model (CESM v1.0) was used to predict future climate and chemistry under projected emission scenarios. Two new emission scenarios, Representative Concentration Pathway (RCP) 4.5 and RCP 8.5, were used in this study for climate and chemistry simulations. The projected global mean temperature will increase by 1.2 and 1.7 degrees Celsius for the RCP 4.5 and RCP 8.5 scenarios in the 2050s, respectively. In order to take advantage of detailed local topography and land use data and to assess local climate impacts on air quality, we downscaled the CESM outputs to a 4 km by 4 km Eastern US domain using the Weather Research and Forecasting (WRF) Model and the Community Multi-scale Air Quality modeling system (CMAQ). Evaluations of regional model outputs against global model outputs, and of regional model outputs against observational data, were conducted to verify the downscaling methodology. Future climate change and air quality impacts were also examined at this 4 km by 4 km high resolution.
Uncertainty analyses of the calibrated parameter values of a water quality model
NASA Astrophysics Data System (ADS)
Rode, M.; Suhr, U.; Lindenschmidt, K.-E.
2003-04-01
For river basin management, water quality models are increasingly used for the analysis and evaluation of different management measures. However, substantial uncertainties exist in parameter values depending on the available calibration data. In this paper an uncertainty analysis for a water quality model is presented, which considers the impact of available model calibration data and the variance of input variables. The investigation was conducted based on four extensive flow-time-related longitudinal surveys in the River Elbe in the years 1996 to 1999, with varying discharges and seasonal conditions. For the model calculations, the deterministic model QSIM of the BfG (Germany) was used. QSIM is a one-dimensional water quality model and uses standard algorithms for hydrodynamics and phytoplankton dynamics in running waters, e.g. Michaelis-Menten/Monod kinetics, which are used in a wide range of models. The multi-objective calibration of the model was carried out with the nonlinear parameter estimator PEST. The results show that for individual flow-time-related measuring surveys, very good agreement between model calculations and measured values can be obtained. If these parameters are applied to deviating boundary conditions, substantial errors in the model calculations can occur. These uncertainties can be decreased with an increased calibration database: more reliable model parameters can be identified, which supply reasonable results for broader boundary conditions. Extending the application of the parameter set to a wider range of water quality conditions leads to a slight reduction of model precision for any specific water quality situation. Moreover, the investigations show that highly variable water quality variables like algal biomass always allow a lower forecast accuracy than variables with smaller coefficients of variation, such as nitrate.
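The Michaelis-Menten/Monod kinetics mentioned above take a simple saturating form that many water quality models share. The sketch below shows the standard formula; the mu_max and K_s values are illustrative assumptions, not QSIM's calibrated parameters.

```python
# Standard Monod (Michaelis-Menten) growth kinetics:
#   mu = mu_max * S / (K_s + S)
# where S is substrate (nutrient) concentration and K_s is the
# half-saturation constant. Parameter values below are assumptions.

def monod_growth(mu_max, substrate, k_s):
    """Specific growth rate (1/day) as a saturating function of substrate."""
    return mu_max * substrate / (k_s + substrate)

mu_max = 2.0   # 1/day, assumed maximum growth rate
k_s = 0.05     # mg/L, assumed half-saturation constant
print(monod_growth(mu_max, 0.05, k_s))  # at S = K_s the rate is mu_max / 2
```

The half-saturation property shown in the final line is exactly the kind of parameter (K_s) whose calibrated value carries the uncertainty the abstract analyzes.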
Kretschmer, Tina; Sentse, Miranda; Meeus, Wim; Verhulst, Frank C; Veenstra, René; Oldehinkel, Albertine J
2016-09-01
Adolescents' peer experiences embrace behavior, relationship quality, status, and victimization, but studies that account for multiple dimensions are rare. Using latent profile modeling and measures of peer behavior, relationship quality, peer status, and victimization assessed from 1,677 adolescents, four profiles were identified: High Quality, Low Quality, Low Quality Victimized, and Deviant Peers. Multinomial logistic regressions showed that negative parent-child relationships in preadolescence reduced the likelihood of High Quality peer relations in mid-adolescence but only partly differentiated between the other three profiles. Moderation by gender was partly found with girls showing greater sensitivity to parent-child relationship quality with respect to peer experiences. Results underline the multifaceted nature of peer experiences, and practical and theoretical implications are discussed. © 2015 The Authors. Journal of Research on Adolescence © 2015 Society for Research on Adolescence.
Tilburg, Charles E.; Jordan, Linda M.; Carlson, Amy E.; Zeeman, Stephan I.; Yund, Philip O.
2015-01-01
Faecal pollution in stormwater, wastewater and direct run-off can carry zoonotic pathogens to streams, rivers and the ocean, reduce water quality, and affect both recreational and commercial fishing areas of the coastal ocean. Typically, the closure of beaches and commercial fishing areas is governed by the testing for the presence of faecal bacteria, which requires an 18–24 h period for sample incubation. As water quality can change during this testing period, the need for accurate and timely predictions of coastal water quality has become acute. In this study, we: (i) examine the relationship between water quality, precipitation and river discharge at several locations within the Gulf of Maine, and (ii) use multiple linear regression models based on readily obtainable hydrometeorological measurements to predict water quality events at five coastal locations. Analysis of a 12 year dataset revealed that high river discharge and/or precipitation events can lead to reduced water quality; however, the use of only these two parameters to predict water quality can result in a number of errors. Analysis of a higher frequency, 2 year study using multiple linear regression models revealed that precipitation, salinity, river discharge, winds, seasonality and coastal circulation correlate with variations in water quality. Although there has been extensive development of regression models for freshwater, this is one of the first attempts to create a mechanistic model to predict water quality in coastal marine waters. Model performance is similar to that of efforts in other regions, which have incorporated models into water resource managers' decisions, indicating that the use of a mechanistic model in coastal Maine is feasible. PMID:26587258
Metrics for Evaluating the Utility of Air Quality Forecasting
NASA Astrophysics Data System (ADS)
Sumo, T. M.; Stockwell, W. R.
2013-12-01
Global warming is expected to lead to higher levels of air pollution, and therefore the forecasting of both long-term and daily air quality is an important component of assessing the costs of climate change and its impact on human health. Some of the risks associated with poor air quality days (where the Air Pollution Index is greater than 100) include hospital visits and mortality. Accurate air quality forecasting has the potential to allow sensitive groups to take appropriate precautions. This research builds metrics for evaluating the utility of air quality forecasting in terms of its potential impacts. Our analysis of air quality models focuses on the Washington, DC/Baltimore, MD region over the summertime ozone seasons between 2010 and 2012. The metrics that are relevant to our analysis include: (1) the number of times that a high ozone or particulate matter (PM) episode is correctly forecasted, (2) the number of times that a high ozone or PM episode is forecasted when it does not occur, and (3) the number of times when the air quality forecast predicts a cleaner air episode but the air was observed to have high ozone or PM. Our evaluation of the performance of air quality forecasts includes forecasts of ozone and particulate matter and data available from the U.S. Environmental Protection Agency (EPA)'s AIRNOW. We also examined observational ozone and particulate matter data available from Clean Air Partners. Overall, the forecast models perform well for our region and time interval.
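The three counts listed above are the cells of a standard 2x2 forecast contingency table (hits, false alarms, misses). The sketch below computes two common summary scores from such counts; the example data are an invented toy record, not the study's forecasts.

```python
# Forecast verification from a 2x2 contingency table (this is a generic
# sketch, not the authors' code): POD = probability of detection,
# FAR = false alarm ratio.

def verification_metrics(forecast, observed):
    hits = sum(f and o for f, o in zip(forecast, observed))
    false_alarms = sum(f and not o for f, o in zip(forecast, observed))
    misses = sum(o and not f for f, o in zip(forecast, observed))
    pod = hits / (hits + misses) if hits + misses else float("nan")
    far = false_alarms / (hits + false_alarms) if hits + false_alarms else float("nan")
    return pod, far

# Toy 10-day record: was an exceedance forecast / observed on each day?
fc  = [True, True, False, True, False, False, True, False, False, False]
obs = [True, False, False, True, True, False, True, False, False, False]
pod, far = verification_metrics(fc, obs)
print(round(pod, 2), round(far, 2))  # -> 0.75 0.25 for this toy record
```

Metric (1) in the abstract corresponds to hits, (2) to false alarms, and (3) to misses, so POD and FAR summarize all three.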
Characteristics of primary care practices associated with high quality of care
Beaulieu, Marie-Dominique; Haggerty, Jeannie; Tousignant, Pierre; Barnsley, Janet; Hogg, William; Geneau, Robert; Hudon, Éveline; Duplain, Réjean; Denis, Jean-Louis; Bonin, Lucie; Del Grande, Claudio; Dragieva, Natalyia
2013-01-01
Background: No primary practice care model has been shown to be superior in achieving high-quality primary care. We aimed to identify the organizational characteristics of primary care practices that provide high-quality primary care. Methods: We performed a cross-sectional observational study involving a stratified random sample of 37 primary care practices from 3 regions of Quebec. We recruited 1457 patients who had 1 of 2 chronic care conditions or 1 of 6 episodic care conditions. The main outcome was the overall technical quality score. We measured organizational characteristics by use of a validated questionnaire and the Team Climate Inventory. Statistical analyses were based on multilevel regression modelling. Results: The following characteristics were strongly associated with overall technical quality of care score: physician remuneration method (27.0; 95% confidence interval [CI] 19.0–35.0), extent of sharing of administrative resources (7.6; 95% CI 0.8–14.4), presence of allied health professionals (15.3; 95% CI 5.4–25.2) and/or specialist physicians (19.6; 95% CI 8.3–30.9), the presence of mechanisms for maintaining or evaluating competence (7.7; 95% CI 3.0–12.4) and average organizational access to the practice (4.9; 95% CI 2.6–7.2). The number of physicians (1.2; 95% CI 0.6–1.8) and the average Team Climate Inventory score (1.3; 95% CI 0.1–2.5) were modestly associated with high-quality care. Interpretation: We identified a common set of organizational characteristics associated with high-quality primary care. Many of these characteristics are amenable to change through practice-level organizational changes. PMID:23877669
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brinkman, J.J.; Griffioen, P.S.; Groot, S.
1987-03-01
The Netherlands has a rather complex water-management system consisting of a number of major rivers, canals, lakes and ditches. Water-quantity management on a regional scale is necessary for an effective water-quality policy. To support water management, a computer model was developed that includes both water quality and water quantity, based on three submodels: ABOPOL for the water movement, DELWAQ for the calculation of water quality variables and BLOOM-II for the phytoplankton growth. The northern province of Friesland was chosen as a test case for the integrated model to be developed, where water quality is highly related to the water distribution and the main trade-off is between minimizing the intake of (eutrophicated) alien water, in order to minimize the external nutrient load, and maximizing the intake, in order to flush channels and lakes. The results of applying these models to this situation and to a number of hypothetical future situations are described.
Zheng, Zhong-liang; Zuo, Zhen-yu; Liu, Zhi-gang; Tsai, Keng-chang; Liu, Ai-fu; Zou, Guo-lin
2005-01-01
A three-dimensional structural model of nattokinase (NK) from Bacillus natto was constructed by homology modeling. High-resolution X-ray structures of Subtilisin BPN' (SB), Subtilisin Carlsberg (SC), Subtilisin E (SE) and Subtilisin Savinase (SS), four proteins with sequence, structural and functional homology, were used as templates. Initial models of NK were built by MODELLER and analyzed by the PROCHECK programs. The best quality model was chosen for further refinement by constrained molecular dynamics simulations. The overall quality of the refined model was evaluated. The refined model NKC1 was analyzed by different protein analysis programs, including PROCHECK for the evaluation of Ramachandran plot quality, PROSA for testing interaction energies and WHATIF for the calculation of packing quality. This structure was found to be satisfactory and also stable at room temperature, as demonstrated by a 300 ps unconstrained molecular dynamics (MD) simulation. Further docking analysis suggested a new nucleophilic catalytic mechanism for NK, induced by attack from the hydroxyl-rich catalytic environment and the positioning of S221.
Low Temperature Rhombohedral Single Crystal SiGe Epitaxy on c-plane Sapphire
NASA Technical Reports Server (NTRS)
Duzik, Adam J.; Choi, Sang H.
2016-01-01
Current best practice in epitaxial growth of rhombohedral SiGe onto (0001) sapphire (Al2O3) substrate surfaces requires extreme conditions to grow a single crystal SiGe film. Previous models described the sapphire surface reconstruction as the overriding factor in rhombohedral epitaxy, requiring a high temperature Al-terminated surface for high quality films. Temperatures in the 850-1100 C range were thought to be necessary to get SiGe to form coherent atomic matching between the (111) SiGe plane and the (0001) sapphire surface. Such fabrication conditions are difficult and uneconomical, hindering widespread application. This work proposes an alternative model that considers the bulk sapphire structure and determines how the SiGe film nucleates and grows. Accounting for thermal expansion effects, calculations using this new model show that both pure Ge and SiGe can form single crystal films in the 450-550 C temperature range. Experimental results confirm these predictions, where x-ray diffraction and atomic force microscopy show the films fabricated at low temperature rival the high temperature films in crystallographic and surface quality. Finally, an explanation is provided for why films of comparable high quality can be produced in either temperature range.
Cernuda, Carlos; Lughofer, Edwin; Klein, Helmut; Forster, Clemens; Pawliczek, Marcin; Brandstetter, Markus
2017-01-01
During the production process of beer, it is of utmost importance to guarantee a high consistency of beer quality. For instance, bitterness is an essential quality parameter which has to be controlled within specifications at the beginning of the production process in the unfermented beer (wort) as well as in final products such as beer and beer mix beverages. Nowadays, analytical techniques for quality control in beer production are mainly based on manual supervision, i.e., samples are taken from the process and analyzed in the laboratory. This typically requires significant lab technicians' effort for only a small fraction of the samples to be analyzed, which leads to significant costs for beer breweries and companies. Fourier transform mid-infrared (FT-MIR) spectroscopy was used in combination with nonlinear multivariate calibration techniques to overcome (i) the time-consuming off-line analyses in beer production and (ii) already known limitations of standard linear chemometric methods, like partial least squares (PLS), for important quality parameters such as bitterness, citric acid, total acids, free amino nitrogen, final attenuation, or foam stability [Speers et al., J. Inst. Brewing 2003;109(3):229-235; Zhang et al., J. Inst. Brewing 2012;118(4):361-367]. The calibration models are established with enhanced nonlinear techniques based (i) on a new piece-wise linear version of PLS that employs fuzzy rules for locally partitioning the latent variable space and (ii) on extensions of support vector regression variants (ε-PLSSVR and ν-PLSSVR) that overcome high computation times in high-dimensional problems and time-intensive, inappropriate settings of the kernel parameters. Furthermore, we introduce a new model selection scheme based on bagged ensembles in order to improve robustness and thus the predictive quality of the final models.
The approaches are tested on real-world calibration data sets for wort and beer mix beverages and successfully compared to linear methods, showing a clear outperformance in most cases and meeting the model quality requirements defined by the experts at the beer company. [Figure: Workflow for calibration of nonlinear model ensembles from FT-MIR spectra in beer production.]
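The bagged-ensemble idea above can be illustrated with a deliberately simple one-variable calibration. This is a pure-Python sketch of bootstrap aggregation only; the paper's actual models are multivariate PLS/SVR calibrations, and all names here are hypothetical:

```python
import random

def fit_line(xs, ys):
    # ordinary least squares for y = a + b * x
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    return my - b * mx, b

def bagged_predict(xs, ys, x_new, n_models=25, seed=0):
    # bootstrap-aggregated calibration: fit one model per
    # resampled calibration set, then average the predictions
    rng = random.Random(seed)
    n = len(xs)
    preds = []
    for _ in range(n_models):
        idx = [rng.randrange(n) for _ in range(n)]  # bootstrap resample
        a, b = fit_line([xs[i] for i in idx], [ys[i] for i in idx])
        preds.append(a + b * x_new)
    return sum(preds) / n_models
```

Averaging over resampled fits is what gives the ensemble its robustness: a single unlucky calibration subset perturbs only one of the aggregated models.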
Li, Mingjie; Zhou, Ping; Wang, Hong; ...
2017-09-19
As one of the most important units in the papermaking industry, the high consistency (HC) refining system is confronted with challenges such as improving pulp quality, energy saving, and emissions reduction in its operation processes. In this correspondence, an optimal operation of the HC refining system is presented using nonlinear multiobjective model predictive control strategies that aim at the set-point tracking objective of pulp quality, the economic objective, and the specific energy (SE) consumption objective, respectively. First, a set of input and output data at different times is employed to construct the subprocess model of the state process model for the HC refining system; the Wiener-type model is then obtained by combining the mechanism model of Canadian Standard Freeness with the state process model, whose structures are determined by the Akaike information criterion. Second, a multiobjective optimization strategy that simultaneously optimizes both the set-point tracking objective of pulp quality and SE consumption is proposed, which uses the NSGA-II approach to obtain the Pareto optimal set. Furthermore, targeting the set-point tracking objective of pulp quality, the economic objective, and the SE consumption objective, the sequential quadratic programming method is utilized to produce the optimal predictive controllers. The simulation results demonstrate that the proposed methods enable the HC refining system to provide better set-point tracking performance of pulp quality when these predictive controllers are employed. In addition, when the optimal predictive controllers are oriented toward the comprehensive economic objective and the SE consumption objective, they significantly reduce the energy consumption.
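The core of the NSGA-II step described above is non-dominated sorting of candidate solutions. A minimal sketch for two minimization objectives (e.g., tracking error and SE consumption); this is an illustration of Pareto-front extraction only, not the authors' implementation:

```python
def pareto_front(points):
    # points: list of (tracking_error, energy_use) pairs, both to be
    # minimized; returns the non-dominated (Pareto-optimal) subset
    front = []
    for p in points:
        # p is dominated if some other point is at least as good in
        # both objectives (and is not p itself)
        dominated = any(
            q[0] <= p[0] and q[1] <= p[1] and q != p
            for q in points
        )
        if not dominated:
            front.append(p)
    return front
```

NSGA-II repeats this sorting over successive fronts and adds crowding-distance selection; the sketch shows only the first front, which is the Pareto optimal set the controller designer ultimately chooses from.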
[High-quality nursing health care environment: the patient safety perspective].
Tu, Yu-Ching; Wang, Ruey-Hsia
2011-06-01
Patient safety is regarded as an important indicator of nursing care quality, and nurses hold frontline responsibility to maintain patient safety. Many countries now face healthcare provider shortfalls, and recognize a close correlation between adequate manpower and patient safety. Many healthcare organizations work to foster positive work environments in order to improve health service quality. The active participation and "buy in" of nurses, patients and policymakers are critical to maximize healthcare environment quality and improve patient safety. This article adopts Donabedian's theoretical "Structure-Process-Outcome" model of quality (Donabedian, 1988) and presumes all high-quality healthcare environment indicators to be linked to patient safety. In addition to raising public awareness regarding the influence of healthcare environment quality on patient safety, this research suggests certain indicators for tracking and assessing healthcare environment quality. Future research may design an empirical study based on these indicators to help further enhance healthcare environment quality and the professional development of nurses.
2011-03-21
• …to and receive comprehensive high-quality, high-value reproductive health and maternity care.
• Comprehensive health care reform strategies… and its implementation, ensure that access to comprehensive, high-quality reproductive health and maternity care services are essential benefits for…
Reproductive Health, Centers for Disease Control and Prevention. Stakeholder Workgroup, Consumers and their Advocates. Chair: Judy Norsigian
Beyond technology acceptance to effective technology use: a parsimonious and actionable model.
Holahan, Patricia J; Lesselroth, Blake J; Adams, Kathleen; Wang, Kai; Church, Victoria
2015-05-01
To develop and test a parsimonious and actionable model of effective technology use (ETU). Cross-sectional survey of primary care providers (n = 53) in a large integrated health care organization that recently implemented new medication reconciliation technology. Surveys assessed 5 technology-related perceptions (compatibility with work values, implementation climate, compatibility with work processes, perceived usefulness, and ease of use) and 1 outcome variable, ETU. ETU was measured as both consistency and quality of technology use. Compatibility with work values and implementation climate were found to have differential effects on consistency and quality of use. When implementation climate was strong, consistency of technology use was high. However, quality of technology use was high only when implementation climate was strong and values compatibility was high. This is an important finding and highlights the importance of users' workplace values as a key determinant of quality of use. To extend our effectiveness in implementing new health care information technology, we need parsimonious models that include actionable determinants of ETU and account for the differential effects of these determinants on the multiple dimensions of ETU.
Alyeshmerni, Daniel; Froehlich, James B; Lewin, Jack; Eagle, Kim A
2014-07-01
Despite its status as a world leader in treatment innovation and medical education, a quality chasm exists in American health care. Care fragmentation and poor coordination contribute to expensive care with highly variable quality in the United States. The rising costs of health care since 1990 have had a huge impact on individuals, families, businesses, the federal and state governments, and the national budget deficit. The passage of the Affordable Care Act represents a large shift in how health care is financed and delivered in the United States. The objective of this review is to describe some of the economic and social forces driving health care reform, provide an overview of the Patient Protection and Affordable Care Act (ACA), and review model cardiovascular quality improvement programs underway in the state of Michigan. As health care reorganization occurs at the federal level, local and regional efforts can serve as models to accelerate improvement toward achieving better population health and better care at lower cost. Model programs in Michigan have achieved this goal in cardiovascular care through the systematic application of evidence-based care, the utilization of regional quality improvement collaboratives, community-based childhood wellness promotion, and medical device-based competitive bidding strategies. These efforts are examples of the direction cardiovascular care delivery will need to move in this era of the Affordable Care Act.
Quality Analysis of Open Street Map Data
NASA Astrophysics Data System (ADS)
Wang, M.; Li, Q.; Hu, Q.; Zhou, M.
2013-05-01
Crowd-sourced geographic data are open-source geographic data contributed by large numbers of non-professionals and provided to the public. Typical crowd-sourced geographic data include GPS track data such as OpenStreetMap, collaborative map data such as Wikimapia, social websites such as Twitter and Facebook, POIs tagged by Jiepang users, and so on. After processing, these data provide canonical geographic information for the public. Compared with conventional geographic data collection and update methods, crowd-sourced geographic data from non-professionals have the advantages of large data volume, high currency, abundant information, and low cost, and have become a research hotspot in international geographic information science in recent years. Large-volume, highly current crowd-sourced geographic data provide a new solution for geospatial database updating, but the quality of data obtained from non-professionals must first be assessed. In this paper, a quality analysis model for OpenStreetMap (OSM) crowd-sourced geographic data is proposed. First, a quality analysis framework is designed based on an analysis of the characteristics of OSM data. Second, a quality assessment model for OSM data is presented using three quality elements: completeness, thematic accuracy, and positional accuracy. Finally, taking the OSM data of Wuhan as an instance, the paper analyses and assesses the quality of OSM data against a 2011 navigation map for reference. The results show that the high-level roads and urban traffic network in the OSM data have high positional accuracy and completeness, so these OSM data can be used to update urban road network databases.
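Two of the three quality elements named above reduce to compact formulas. A minimal sketch, assuming features have already been matched between OSM and the reference map (the inputs and names are hypothetical, not the paper's actual procedure):

```python
import math

def completeness(osm_length, ref_length):
    # completeness as the ratio of total OSM feature length
    # (or feature count) to the reference dataset
    return osm_length / ref_length

def positional_rmse(matched_pairs):
    # matched_pairs: list of ((x_osm, y_osm), (x_ref, y_ref)) tuples
    # for features matched between OSM and the reference map;
    # returns root-mean-square positional error
    sq = [(a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2 for a, b in matched_pairs]
    return math.sqrt(sum(sq) / len(sq))
```

Thematic accuracy (the third element) requires comparing attribute values of matched features, e.g., the share of matched roads whose class codes agree.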
NASA Technical Reports Server (NTRS)
Quattrochi, Dale A.; Estes, Maurice G., Jr.; Crosson, William; Johnson, Hoyt; Khan, Maudood
2006-01-01
The growth of cities, both in population and areal extent, appears to be an inexorable process. Urbanization continues at a rapid rate, and it is estimated that by the year 2025, 60 percent of the world's population will live in cities. Urban expansion has profound impacts on a host of biophysical, environmental, and atmospheric processes within an urban ecosystems perspective. A reduction in air quality over cities is a major result of these impacts. Because of its complexity, the urban landscape is not adequately captured in air quality models such as the Community Multiscale Air Quality (CMAQ) model, which is used to assess whether urban areas are in attainment of EPA air quality standards, primarily for ground-level ozone. This inadequacy of the CMAQ model in responding to the heterogeneous nature of the urban landscape can affect how well the model predicts ozone levels over metropolitan areas and, ultimately, whether cities exceed EPA ozone air quality standards. We are exploring the utility of high-resolution remote sensing data and urban spatial growth model (SGM) projections as improved inputs to a meteorological/air quality modeling system, focusing on the Atlanta, Georgia metropolitan area as a case study. These growth projections include business-as-usual and smart-growth scenarios out to 2030. The projections illustrate the effects of employing urban heat island mitigation strategies, such as increasing tree canopy and albedo across the Atlanta metro area, which in turn are used to model how air temperatures, and thus ground-level ozone, can be moderated relative to scenarios without heat island mitigation. The National Land Cover Dataset at 30 m resolution is used as the land use/land cover input and aggregated to the 4 km scale for the MM5 mesoscale meteorological model and the CMAQ modeling schemes.
Use of these data has been found to better characterize low-density/suburban development as compared with the USGS 1 km land use/land cover data that have traditionally been used in modeling. Air quality prediction for future scenarios to 2030 is facilitated by land use projections using a spatial growth model. The land use projections were developed using the 2030 Regional Transportation Plan of the Atlanta Regional Commission, the regional planning agency for the area. This allows the Georgia Environmental Protection Division to evaluate how these transportation plans will affect future air quality. The coupled SGM and air quality modeling approach provides insight into the impacts of Atlanta's growth on the local and regional environment and serves as a mechanism that policy makers can use to make rational decisions on urban growth and sustainability for the metropolitan area in the future.
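Aggregating 30 m categorical land cover to a coarser model grid is typically done by majority (mode) resampling, since class codes cannot be averaged. A minimal sketch with a hypothetical class-code grid (illustrative only; operational workflows use GIS tooling on the full NLCD raster):

```python
from collections import Counter

def majority_aggregate(grid, block):
    # grid: 2D list of land-cover class codes (e.g., 30 m cells);
    # block: aggregation factor (e.g., block cells per coarse cell);
    # returns a coarser 2D grid where each cell holds the most
    # frequent class code within its block
    rows, cols = len(grid), len(grid[0])
    out = []
    for r0 in range(0, rows, block):
        row = []
        for c0 in range(0, cols, block):
            cells = [grid[r][c]
                     for r in range(r0, min(r0 + block, rows))
                     for c in range(c0, min(c0 + block, cols))]
            row.append(Counter(cells).most_common(1)[0][0])
        out.append(row)
    return out
```

For a 30 m source and a 4 km target, the aggregation factor would be on the order of 133 cells per side; the sketch uses a tiny factor for clarity.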
Optimum profit model considering production, quality and sale problem
NASA Astrophysics Data System (ADS)
Chen, Chung-Ho; Lu, Chih-Lun
2011-12-01
Chen and Liu ['Procurement Strategies in the Presence of the Spot Market-an Analytical Framework', Production Planning and Control, 18, 297-309] presented an optimum profit model between producers and purchasers for a supply chain system with a pure procurement policy. However, their model, with a simple manufacturing cost, did not consider the cost incurred by the customer. In this study, a modified version of Chen and Liu's model is addressed for determining the optimum product and process parameters. The authors propose a modified Chen and Liu model under a two-stage screening procedure. A surrogate variable having a high correlation with the measurable quality characteristic is directly measured in the first stage. The measurable quality characteristic is directly measured in the second stage when the product decision cannot be determined in the first stage. The cost incurred by the customer is measured by adopting Taguchi's quadratic quality loss function. The optimum purchaser's order quantity, the producer's product price, and the process quality level are jointly determined by maximising the expected profit between them.
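The Taguchi quadratic loss used above has a simple closed form in expectation, which is what makes it convenient inside a profit model. A minimal sketch with hypothetical cost parameters (illustrative only, not the authors' full joint optimisation):

```python
def taguchi_loss(y, target, k):
    # Taguchi quadratic quality loss: L(y) = k * (y - target)^2
    return k * (y - target) ** 2

def expected_profit(price, unit_cost, target, k, mean, variance):
    # for a quality characteristic Y with the given mean and variance,
    # E[L(Y)] = k * (variance + (mean - target)^2), so expected profit
    # per unit is revenue minus production cost minus expected loss
    expected_loss = k * (variance + (mean - target) ** 2)
    return price - unit_cost - expected_loss
```

The closed form makes the trade-off explicit: profit falls both when the process mean drifts off target and when process variance grows, which is what a joint price/quality-level optimisation must balance.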
Modeling, Monitoring and Fault Diagnosis of Spacecraft Air Contaminants
NASA Technical Reports Server (NTRS)
Ramirez, W. Fred; Skliar, Mikhail; Narayan, Anand; Morgenthaler, George W.; Smith, Gerald J.
1996-01-01
Progress and results in the development of an integrated air quality modeling, monitoring, fault detection, and isolation system are presented. The focus was on development of distributed models of contaminant transport, the study of air quality monitoring techniques based on the model of the transport process and on-line contaminant concentration measurements, and sensor placement. Different approaches to the modeling of spacecraft air contamination are discussed, and a three-dimensional distributed parameter air contaminant dispersion model applicable to both laminar and turbulent transport is proposed. A two-dimensional approximation of the full-scale transport model is also proposed, based on spatial averaging of the three-dimensional model over the least important space coordinate. A computer implementation of the transport model is considered, and a detailed development of the two- and three-dimensional models, illustrated by contaminant transport simulation results, is presented. The use of a well-established Kalman filtering approach is suggested as a method for generating on-line contaminant concentration estimates based on both real-time measurements and the model of the contaminant transport process. It is shown that the high computational requirements of the traditional Kalman filter can make its real-time implementation difficult for high-dimensional transport models, and a novel implicit Kalman filtering algorithm is proposed that leads to an order-of-magnitude faster computer implementation in the case of air quality monitoring.
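As a toy illustration of the monitoring idea (not the paper's distributed-parameter filter, which operates on a high-dimensional transport model), a scalar Kalman filter that fuses a one-state contaminant prediction with noisy sensor readings can be sketched as follows; all parameter values are hypothetical:

```python
def kalman_step(x, p, z, a=1.0, q=0.01, h=1.0, r=0.25):
    # predict: propagate the state estimate x and its variance p
    # through the scalar transport model x_k = a * x_{k-1},
    # inflating variance by process noise q
    x_pred = a * x
    p_pred = a * p * a + q
    # update: blend the prediction with sensor reading z, whose
    # noise variance is r; k is the Kalman gain
    k = p_pred * h / (h * p_pred * h + r)
    x_new = x_pred + k * (z - h * x_pred)
    p_new = (1.0 - k * h) * p_pred
    return x_new, p_new
```

Iterating the step over a stream of readings drives the estimate toward the true concentration while the posterior variance shrinks; in the paper's setting the scalars become large matrices, which is exactly where the computational burden the implicit algorithm addresses comes from.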
Effect of high hydrostatic pressure on overall quality parameters of watermelon juice.
Liu, Y; Zhao, X Y; Zou, L; Hu, X S
2013-06-01
High hydrostatic pressure, as a kind of non-thermal processing, might maintain the quality of thermo-sensitive watermelon juice. The effect of high hydrostatic pressure treatment on enzymes and quality of watermelon juice was therefore investigated. After high hydrostatic pressure treatment, the activities of polyphenol oxidase, peroxidase, and pectin methylesterase of the juice decreased significantly with pressure (P < 0.05). Inactivation of polyphenol oxidase and peroxidase could be fitted by a two-fraction model, and that of pectin methylesterase could be described by a first-order reaction model. Titratable acidity, pH, and total soluble solids of the juice did not change significantly (P > 0.05). No significant difference was observed in lycopene and total phenolics after high hydrostatic pressure treatment when compared to the control (P > 0.05). Cloudiness and viscosity increased with pressure (P < 0.05) but did not change significantly with treatment time (P > 0.05). The a*- and b*-values were both unchanged after high hydrostatic pressure treatment (P > 0.05), while the L*-value increased, although the values did not differ significantly among treated juices. Browning degree after high hydrostatic pressure treatment decreased with increasing pressure and treatment time (P < 0.05). Comparison of total color difference values showed that high hydrostatic pressure had little effect on the color of the juice. The results of this study demonstrate the efficacy of high hydrostatic pressure in inactivating enzymes and maintaining the quality of watermelon juice.
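The first-order inactivation model mentioned for pectin methylesterase can be fitted by log-linear least squares. A minimal sketch, assuming activity measurements `activities` taken at sampling `times` with initial activity `a0` (hypothetical names, not the authors' fitting procedure):

```python
import math

def first_order_rate(times, activities, a0):
    # first-order inactivation: A(t) = A0 * exp(-k * t); fit k by
    # least squares of ln(A / A0) = -k * t through the origin
    ys = [math.log(a / a0) for a in activities]
    stt = sum(t * t for t in times)
    sty = sum(t * y for t, y in zip(times, ys))
    return -sty / stt  # slope through the origin is -k
```

The two-fraction model used for the other two enzymes adds a pressure-resistant fraction (A(t) = A_stable + A_labile * exp(-k t)) and needs nonlinear fitting, so it is omitted here.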
High Fidelity BWR Fuel Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoon, Su Jong
This report describes the Consortium for Advanced Simulation of Light Water Reactors (CASL) work conducted for completion of the Thermal Hydraulics Methods (THM) Level 3 milestone THM.CFD.P13.03: High Fidelity BWR Fuel Simulation. High fidelity computational fluid dynamics (CFD) simulation for a Boiling Water Reactor (BWR) was conducted to investigate the applicability and robustness of BWR closures. As a preliminary study, a CFD model with simplified Ferrule spacer grid geometry of the NUPEC BWR Full-size Fine-mesh Bundle Test (BFBT) benchmark was implemented. The performance of the multiphase segregated solver with baseline boiling closures was evaluated. Although the mean values of void fraction and exit quality of the CFD results for BFBT case 4101-61 agreed with experimental data, the local void distribution was not predicted accurately. Mesh quality was one of the critical factors in obtaining a converged result. The stability and robustness of the simulation were mainly affected by the mesh quality and the combination of BWR closure models. In addition, CFD modeling of the fully detailed spacer grid geometry with mixing vanes is necessary for improving the accuracy of the CFD simulation.
Wilcox, S.; Andreas, A.
2010-09-27
The U.S. Department of Energy's National Renewable Energy Laboratory collaborates with the solar industry to establish high quality solar and meteorological measurements. This Solar Resource and Meteorological Assessment Project (SOLRMAP) provides high quality measurements to support deployment of power projects in the United States. The no-funds-exchanged collaboration brings NREL solar resource assessment expertise together with industry needs for measurements. The end result is high quality data sets to support the financing, design, and monitoring of large scale solar power projects for industry in addition to research-quality data for NREL model development. NREL provides consultation for instrumentation and station deployment, along with instrument calibrations, data acquisition, quality assessment, data distribution, and summary reports. Industry participants provide equipment, infrastructure, and station maintenance.
NASA Astrophysics Data System (ADS)
Wilcox, Steve; Myers, Daryl
2009-08-01
The U.S. Department of Energy's National Renewable Energy Laboratory has embarked on a collaborative effort with the solar industry to establish high quality solar and meteorological measurements. This Solar Resource and Meteorological Assessment Project (SOLRMAP) provides high quality measurements to support deployment of concentrating solar thermal power projects in the United States. The no-funds-exchanged collaboration brings NREL solar resource assessment expertise together with industry needs for measurements. The end result will be high quality data sets to support the financing, design, and monitoring of large scale solar power projects for industry in addition to research-quality data for NREL model development. NREL provides consultation for instrumentation and station deployment, along with instrument calibrations, data acquisition, quality assessment, data distribution, and summary reports. Industry participants provide equipment, infrastructure, and station maintenance.
Association between Personality Traits and Sleep Quality in Young Korean Women
Kim, Han-Na; Cho, Juhee; Chang, Yoosoo; Ryu, Seungho
2015-01-01
Personality is a trait that affects behavior and lifestyle, and sleep quality is an important component of a healthy life. We analyzed the association between personality traits and sleep quality in a cross-section of 1,406 young women (from 18 to 40 years of age) who were not reporting clinically meaningful depression symptoms. Surveys were carried out from December 2011 to February 2012, using the Revised NEO Personality Inventory and the Pittsburgh Sleep Quality Index (PSQI). All analyses were adjusted for demographic and behavioral variables. We considered beta weights, structure coefficients, unique effects, and common effects when evaluating the importance of sleep quality predictors in multiple linear regression models. Neuroticism was the most important contributor to PSQI global scores in the multiple regression models. By contrast, despite being strongly correlated with sleep quality, conscientiousness had a near-zero beta weight in linear regression models, because most variance was shared with other personality traits. However, conscientiousness was the most noteworthy predictor of poor sleep quality status (PSQI≥6) in logistic regression models and individuals high in conscientiousness were least likely to have poor sleep quality, which is consistent with an OR of 0.813, with conscientiousness being protective against poor sleep quality. Personality may be a factor in poor sleep quality and should be considered in sleep interventions targeting young women. PMID:26030141
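The distinction drawn above between beta weights and structure coefficients can be made concrete for a two-predictor model. A minimal sketch (illustrative only, not the study's actual multivariate analysis; the input names are hypothetical):

```python
import math

def pearson_r(xs, ys):
    # Pearson correlation between two equal-length sequences
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (sx * sy)

def weights_two_predictors(x1, x2, y):
    # standardized beta weights for a two-predictor regression, plus
    # structure coefficients r(x_i, y) / R, where R is the multiple
    # correlation of the fitted model
    r1, r2, r12 = pearson_r(x1, y), pearson_r(x2, y), pearson_r(x1, x2)
    b1 = (r1 - r2 * r12) / (1 - r12 ** 2)
    b2 = (r2 - r1 * r12) / (1 - r12 ** 2)
    big_r = math.sqrt(b1 * r1 + b2 * r2)  # sqrt of model R^2
    return (b1, b2), (r1 / big_r, r2 / big_r)
```

A predictor strongly correlated with the outcome can still receive a near-zero beta weight when its predictive variance is shared with other predictors, which is exactly the pattern the study reports for conscientiousness; the structure coefficient exposes that shared contribution.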
A Structural Model of the Retail Market for Illicit Drugs.
Galenianos, Manolis; Gavazza, Alessandro
2017-03-01
We estimate a model of illicit drugs markets using data on purchases of crack cocaine. Buyers are searching for high-quality drugs, but they determine drugs' quality (i.e., their purity) only after consuming them. Hence, sellers can rip off first-time buyers or can offer higher-quality drugs to induce buyers to purchase from them again. In equilibrium, a distribution of qualities persists. The estimated model implies that if drugs were legalized, in which case purity could be regulated and hence observable, the average purity of drugs would increase by approximately 20 percent and the dispersion would decrease by approximately 80 percent. Moreover, increasing penalties may raise the purity and affordability of the drugs traded by increasing sellers’ relative profitability of targeting loyal buyers versus first-time buyers.
A Regularized Volumetric Fusion Framework for Large-Scale 3D Reconstruction
NASA Astrophysics Data System (ADS)
Rajput, Asif; Funk, Eugen; Börner, Anko; Hellwich, Olaf
2018-07-01
Modern computational resources combined with low-cost depth sensing systems have enabled mobile robots to reconstruct 3D models of surrounding environments in real-time. Unfortunately, low-cost depth sensors are prone to produce undesirable estimation noise in depth measurements which result in either depth outliers or introduce surface deformations in the reconstructed model. Conventional 3D fusion frameworks integrate multiple error-prone depth measurements over time to reduce noise effects, therefore additional constraints such as steady sensor movement and high frame-rates are required for high quality 3D models. In this paper we propose a generic 3D fusion framework with controlled regularization parameter which inherently reduces noise at the time of data fusion. This allows the proposed framework to generate high quality 3D models without enforcing additional constraints. Evaluation of the reconstructed 3D models shows that the proposed framework outperforms state of art techniques in terms of both absolute reconstruction error and processing time.
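The abstract does not give the regularizer's exact form. As a per-voxel sketch of the general idea, conventional weighted running-average fusion can be combined with a hypothetical damping parameter `lam` that pulls the fused signed distance toward the previous estimate (an assumption for illustration, not the authors' formulation):

```python
def fuse_voxel(d_old, w_old, d_new, w_new, lam=0.0):
    # conventional TSDF-style fusion: running weighted average of
    # signed-distance observations for one voxel
    w = w_old + w_new
    d = (w_old * d_old + w_new * d_new) / w
    # regularization: lam > 0 damps abrupt changes by blending the
    # fused value back toward the previous estimate, suppressing
    # depth-sensor noise at fusion time
    d = (d + lam * d_old) / (1.0 + lam)
    return d, w
```

With `lam = 0` this reduces to plain weighted averaging, which is why such a scheme can be described as fusion with a controlled regularization parameter: the single knob trades responsiveness to new depth data against noise suppression.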
2016-11-04
The Medicare Access and CHIP Reauthorization Act of 2015 (MACRA) repeals the Medicare sustainable growth rate (SGR) methodology for updates to the physician fee schedule (PFS) and replaces it with a new approach to payment called the Quality Payment Program that rewards the delivery of high-quality patient care through two avenues: Advanced Alternative Payment Models (Advanced APMs) and the Merit-based Incentive Payment System (MIPS) for eligible clinicians or groups under the PFS. This final rule with comment period establishes incentives for participation in certain alternative payment models (APMs) and includes the criteria for use by the Physician-Focused Payment Model Technical Advisory Committee (PTAC) in making comments and recommendations on physician-focused payment models (PFPMs). Alternative Payment Models are payment approaches, developed in partnership with the clinician community, that provide added incentives to deliver high-quality and cost-efficient care. APMs can apply to a specific clinical condition, a care episode, or a population. This final rule with comment period also establishes the MIPS, a new program for certain Medicare-enrolled practitioners. MIPS will consolidate components of three existing programs, the Physician Quality Reporting System (PQRS), the Physician Value-based Payment Modifier (VM), and the Medicare Electronic Health Record (EHR) Incentive Program for Eligible Professionals (EPs), and will continue the focus on quality, cost, and use of certified EHR technology (CEHRT) in a cohesive program that avoids redundancies. In this final rule with comment period we have rebranded key terminology based on feedback from stakeholders, with the goal of selecting terms that will be more easily identified and understood by our stakeholders.
Benoit, Richard; Mion, Lorraine
2012-08-01
This paper presents a proposed conceptual model to guide research on pressure ulcer risk in critically ill patients, who are at high risk for pressure ulcer development. However, no conceptual model exists that guides risk assessment in this population. Results from a review of prospective studies were evaluated for design quality and level of statistical reporting. Multivariate findings from studies having high or medium design quality by the National Institute of Health and Clinical Excellence standards were conceptually grouped. The conceptual groupings were integrated into Braden and Bergstrom's (Braden and Bergstrom [1987] Rehabilitation Nursing, 12, 8-12, 16) conceptual model, retaining their original constructs and augmenting their concept of intrinsic factors for tissue tolerance. The model could enhance consistency in research on pressure ulcer risk factors.
Respite Care, Stress, Uplifts, and Marital Quality in Parents of Children with Down Syndrome
ERIC Educational Resources Information Center
Norton, Michelle; Dyches, Tina Taylor; Harper, James M.; Roper, Susanne Olsen; Caldarella, Paul
2016-01-01
Parents of children with disabilities are at risk for high stress and low marital quality; therefore, this study surveyed couples (n = 112) of children with Down syndrome (n = 120), assessing whether respite hours, stress, and uplifts were related to marital quality. Structural equation modeling indicated that respite hours were negatively related…
ERIC Educational Resources Information Center
Setodji, Claude Messan; Le, Vi-Nhuan; Schaack, Diana
2013-01-01
Research linking high-quality child care programs and children's cognitive development has contributed to the growing popularity of child care quality benchmarking efforts such as quality rating and improvement systems (QRIS). Consequently, there has been an increased interest in and a need for approaches to identifying thresholds, or cutpoints,…
Pediatric laryngeal simulator using 3D printed models: A novel technique.
Kavanagh, Katherine R; Cote, Valerie; Tsui, Yvonne; Kudernatsch, Simon; Peterson, Donald R; Valdez, Tulio A
2017-04-01
Simulation to acquire and test technical skills is an essential component of medical education and residency training in both surgical and nonsurgical specialties. High-quality simulation education relies on the availability, accessibility, and reliability of models. The objective of this work was to describe a practical pediatric laryngeal model for use in otolaryngology residency training. Ideally, this model would be low-cost, have tactile properties resembling human tissue, and be reliably reproducible. Pediatric laryngeal models were developed using two manufacturing methods: direct three-dimensional (3D) printing of anatomical models and casted anatomical models using 3D-printed molds. Polylactic acid, acrylonitrile butadiene styrene, and high-impact polystyrene (HIPS) were used for the directly printed models, whereas a silicone elastomer (SE) was used for the casted models. The models were evaluated for anatomic quality, ease of manipulation, hardness, and cost of production. A tissue likeness scale was created to validate the simulation model. Fleiss' Kappa rating was performed to evaluate interrater agreement, and analysis of variance was performed to evaluate differences among the materials. The SE provided the most anatomically accurate models, with the tactile properties allowing for surgical manipulation of the larynx. Direct 3D printing was more cost-effective than the SE casting method but did not possess the material properties and tissue likeness necessary for surgical simulation. The SE models of the pediatric larynx created from a casting method demonstrated high-quality anatomy, tactile properties comparable to human tissue, and easy manipulation with standard surgical instruments. Their use in a reliable, low-cost, accessible, modular simulation system provides a valuable training resource for otolaryngology residents. Laryngoscope, 127:E132-E137, 2017.
A Conceptual Framework for Quality of Care
Mosadeghrad, Ali Mohammad
2012-01-01
Despite extensive research on defining and measuring health care quality, little attention has been given to different stakeholders' perspectives of high-quality health care services. The main purpose of this study was to explore the attributes of quality healthcare in the Iranian context. Exploratory in-depth individual and focus group interviews were conducted with key healthcare stakeholders, including clients, providers, managers, policy makers, payers, suppliers, and accreditation panel members, to identify healthcare service quality attributes and dimensions. Data analysis was carried out by content analysis, with the constant comparative method. Over 100 attributes of quality healthcare service were elicited and grouped into five categories. The dimensions were: efficacy, effectiveness, efficiency, empathy, and environment. Consequently, a comprehensive model of service quality was developed for the health care context. The findings of the current study led to a conceptual framework of healthcare quality. This model leads to a better understanding of the different aspects of quality in health care and provides a better basis for defining, measuring and controlling quality of health care services. PMID:23922534
Social Anxiety and Friendship Quality over Time.
Rodebaugh, Thomas L; Lim, Michelle H; Shumaker, Erik A; Levinson, Cheri A; Thompson, Tess
2015-01-01
High social anxiety in adults is associated with self-report of impaired friendship quality, but not necessarily with impairment reported by friends. Further, prospective prediction of social anxiety and friendship quality over time has not been tested among adults. We therefore examined friendship quality and social anxiety prospectively in 126 young adults (67 primary participants and 59 friends, aged 17-22 years); the primary participants were screened to be extreme groups to increase power and relevance to clinical samples (i.e., they were recruited based on having very high or very low social interaction anxiety). The prospective relationships between friendship quality and social anxiety were then tested using an Actor-Partner Interdependence Model. Friendship quality prospectively predicted social anxiety over time within each individual in the friendship, such that higher friendship quality at Time 1 predicted lower social anxiety approximately 6 months later at Time 2. Social anxiety did not predict friendship quality. Although the results support the view that social anxiety and friendship quality have an important causal relationship, the results run counter to the assumption that high social anxiety causes poor friendship quality. Interventions to increase friendship quality merit further consideration.
NASA Astrophysics Data System (ADS)
Comyn-Wattiau, Isabelle; Thalheim, Bernhard
Quality assurance is a growing research domain within the Information Systems (IS) and Conceptual Modeling (CM) disciplines. Ongoing research on quality in IS and CM is highly diverse and encompasses theoretical aspects, including quality definition and quality models, and practical/empirical aspects, such as the development of methods, approaches and tools for quality measurement and improvement. Current research on quality also includes quality characteristics definitions, validation instruments, methodological and development approaches to quality assurance during software and information systems development, quality monitors, quality assurance during information systems development processes and practices, quality assurance both for data and (meta)schemata, quality support for information systems data import and export, quality of query answering, and cost/benefit analysis of quality assurance processes. Quality assurance also depends on the application area and the specific requirements of applications in the health sector, logistics, the public sector, the financial sector, manufacturing, services, e-commerce, software, and elsewhere. Furthermore, quality assurance must also be supported for data aggregation, ETL processes, web content management and other multi-layered applications. Quality assurance typically requires resources and therefore carries, besides its benefits, a computational and economic trade-off: a compromise between the value of quality data and the cost of quality assurance.
NASA Technical Reports Server (NTRS)
Quattrochi, D. A.; Lapenta, W. M.; Crosson, W. L.; Estes, M. G., Jr.; Limaye, A.; Kahn, M.
2006-01-01
Local and state agencies are responsible for developing state implementation plans to meet National Ambient Air Quality Standards. Numerical models used for this purpose simulate the transport and transformation of criteria pollutants and their precursors. The specification of land use/land cover (LULC) plays an important role in controlling modeled surface meteorology and emissions. NASA researchers have worked with partners and Atlanta stakeholders to incorporate an improved high-resolution LULC dataset for the Atlanta area within their modeling system and to assess meteorological and air quality impacts of Urban Heat Island (UHI) mitigation strategies. The new LULC dataset provides a more accurate representation of land use, has the potential to improve model accuracy, and facilitates prediction of LULC changes. Use of the new LULC dataset for two summertime episodes improved meteorological forecasts, with an existing daytime cold bias of approximately 3 °C reduced by 30%. Model performance for ozone prediction did not show improvement. In addition, LULC changes due to Atlanta area urbanization were predicted through 2030, for which model simulations predict higher urban air temperatures. The incorporation of UHI mitigation strategies partially offset this warming trend. The data and modeling methods used are generally applicable to other U.S. cities.
Yerramilli, Anjaneyulu; Dodla, Venkata B; Desamsetti, Srinivas; Challa, Srinivas V; Young, John H; Patrick, Chuck; Baham, Julius M; Hughes, Robert L; Yerramilli, Sudha; Tuluri, Francis; Hardy, Mark G; Swanier, Shelton J
2011-06-01
In this study, an attempt was made to simulate the air quality with reference to ozone over the Jackson (Mississippi) region using the online WRF/Chem (Weather Research and Forecasting-Chemistry) model. The WRF/Chem model has the advantage of integrating the meteorological and chemistry modules on the same computational grid with the same physical parameterizations, and it includes the feedback between atmospheric chemistry and physical processes. The model was designed with three nested domains, the innermost domain covering the study region at a resolution of 1 km. The model was integrated for 48 hours continuously starting from 0000 UTC of 6 June 2006, and the evolution of surface ozone and other precursor pollutants was analyzed. The model-simulated atmospheric flow fields and distributions of NO2 and O3 were evaluated for each of the three different time periods. The GIS-based spatial distribution maps for ozone and its precursors NO, NO2, CO and HONO, together with the back trajectories, indicate that mobile sources in Jackson, Ridgeland and Madison contribute significantly to their formation. The present study demonstrates the applicability of the WRF/Chem model to generate quantitative information at high spatial and temporal resolution for the development of decision support systems for air quality regulatory agencies and health administrators.
NASA Astrophysics Data System (ADS)
Lee, Soon Hwan; Kim, Ji Sun; Lee, Kang Yeol; Shon, Keon Tae
2017-04-01
Air quality in Korea is worsening due to increasing particulate matter (PM). At present, the PM forecast is issued based on PM concentrations predicted by an air quality numerical prediction model. However, forecast accuracy is not as high as expected due to various uncertainties in PM physical and chemical characteristics. The purpose of this study was to develop a numerical-statistical ensemble model to improve the accuracy of PM10 concentration prediction. The numerical models used in this study are the three-dimensional atmospheric model Weather Research and Forecasting (WRF) and the Community Multiscale Air Quality (CMAQ) model. The target areas for the PM forecast are the Seoul, Busan, Daegu, and Daejeon metropolitan areas in Korea. The data used in the model development are observed PM concentrations and CMAQ predictions, and the data period is 3 months (March 1 - May 31, 2014). A dynamic-statistical technique for reducing the systematic error of the CMAQ predictions was applied through a dynamic linear model (DLM) based on the Bayesian Kalman filter. Applying the corrections generated by the dynamic linear model to the forecast PM concentrations improved accuracy. In particular, excellent improvement is shown at high PM concentrations, where the damage is relatively large.
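The bias-correction step can be pictured as a scalar dynamic linear model whose hidden state is the systematic error of the raw CMAQ forecast, updated sequentially by a Kalman filter. A hypothetical sketch, not the paper's implementation; the noise variances `q` and `r` and the function name are illustrative assumptions:

```python
import numpy as np

def kalman_bias_correction(raw_pred, observed, q=4.0, r=25.0):
    """Correct a forecast stream with a random-walk dynamic linear model:
        bias_t = bias_{t-1} + w_t,        w_t ~ N(0, q)   (state equation)
        obs_t  = pred_t + bias_t + v_t,   v_t ~ N(0, r)   (observation equation)
    Returns one-step-ahead corrected predictions: each forecast is corrected
    using only observations available before that time step."""
    b, p = 0.0, 1.0                      # bias estimate and its variance
    corrected = []
    for f, y in zip(raw_pred, observed):
        corrected.append(f + b)          # issue today's corrected forecast
        p += q                           # time update (random-walk spread)
        k = p / (p + r)                  # Kalman gain
        b += k * ((y - f) - b)           # measurement update of the bias
        p *= 1.0 - k
    return np.array(corrected)
```

Fed a forecast stream with a constant systematic offset, the filter learns the offset within a few steps and the corrected forecasts converge to the observations.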
MODELS-3/CMAQ APPLICATIONS WHICH ILLUSTRATE CAPABILITY AND FUNCTIONALITY
The Models-3/CMAQ developed by the U.S. Environmental Protection Agency (USEPA) is a third-generation multiscale, multi-pollutant air quality modeling system within a high-level, object-oriented computer framework (Models-3). It has been available to the scientific community ...
The SCALE Verified, Archived Library of Inputs and Data - VALID
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marshall, William BJ J; Rearden, Bradley T
The Verified, Archived Library of Inputs and Data (VALID) at ORNL contains high-quality, independently reviewed models and results that improve confidence in analysis. VALID is developed and maintained according to a procedure of the SCALE quality assurance (QA) plan. This paper reviews the origins of the procedure and its intended purpose, the philosophy of the procedure, some highlights of its implementation, and the future of the procedure and associated VALID library. The original focus of the procedure was the generation of high-quality models that could be archived at ORNL and applied to many studies. The review process associated with model generation minimized the chances of errors in these archived models. Subsequently, the scope of the library and procedure was expanded to provide high-quality, reviewed sensitivity data files for deployment through the International Handbook of Evaluated Criticality Safety Benchmark Experiments (IHECSBE). Sensitivity data files for approximately 400 such models are currently available. The VALID procedure and library continue fulfilling these multiple roles. The VALID procedure is based on the quality assurance principles of ISO 9001 and nuclear safety analysis. Some of these key concepts include: independent generation and review of information, generation and review by qualified individuals, use of appropriate references for design data and documentation, and retrievability of the models, results, and documentation associated with entries in the library. Some highlights of the detailed procedure are discussed to provide background on its implementation and to indicate limitations of data extracted from VALID for use by the broader community. Specifically, external users of data generated within VALID must take responsibility for ensuring that the files are used within the QA framework of their organization and that use is appropriate.
The future plans for the VALID library include expansion to cover additional experiments from the IHECSBE, experiments from areas beyond criticality safety (such as reactor physics and shielding), and application models. In the future, external SCALE users may also obtain qualification under the VALID procedure and be involved in expanding the library. The VALID library provides a pathway for the criticality safety community to leverage modeling and analysis expertise at ORNL.
Impacts of Energy Sector Emissions on PM2.5 Air Quality in Northern India
NASA Astrophysics Data System (ADS)
Karambelas, A. N.; Kiesewetter, G.; Heyes, C.; Holloway, T.
2015-12-01
India experiences high concentrations of fine particulate matter (PM2.5), and several Indian cities currently rank among the world's most polluted cities. With ongoing urbanization and a growing economy, emissions from different energy sectors remain major contributors to air pollution in India. Emission sectors impact ambient air quality differently due to spatial distribution (typical urban vs. typical rural sources) as well as source height characteristics (low-level vs. high-stack sources). This study aims to assess the impacts of emissions from three distinct energy sectors (transportation, domestic, and electricity) on ambient PM2.5 in northern India using an advanced air quality analysis framework based on the U.S. EPA Community Multi-Scale Air Quality (CMAQ) model. Present air quality conditions are simulated using 2010 emissions from the Greenhouse Gas-Air Pollution Interaction and Synergies (GAINS) model. Modeled PM2.5 concentrations are compared with satellite observations of aerosol optical depth (AOD) from the Moderate Resolution Imaging Spectroradiometer (MODIS) for 2010. Energy sector emissions impacts on future (2030) PM2.5 are evaluated with three sensitivity simulations, each assuming maximum feasible reduction technologies in one of the transportation, domestic, or electricity sectors. These simulations are compared with a business-as-usual 2030 simulation to assess relative sectoral impacts spatially and temporally. CMAQ is run at 12 km by 12 km resolution and includes biogenic emissions from the Community Land Model coupled with the Model of Emissions of Gases and Aerosols in Nature (CLM-MEGAN), biomass burning emissions from the Global Fire Emissions Database (GFED), and ERA-Interim meteorology downscaled with the Weather Research and Forecasting (WRF) model for 2010, to quantify the impact of modified anthropogenic emissions on ambient PM2.5 concentrations. Energy sector emissions analysis supports decision-making to improve future air quality and public health in India.
Global motion compensated visual attention-based video watermarking
NASA Astrophysics Data System (ADS)
Oakes, Matthew; Bhowmik, Deepayan; Abhayaratne, Charith
2016-11-01
Imperceptibility and robustness are two key but competing requirements of any watermarking algorithm. Low-strength watermarking yields high imperceptibility but exhibits poor robustness. High-strength watermarking schemes achieve good robustness but often suffer from embedding distortions resulting in poor visual quality in host media. This paper proposes a unique video watermarking algorithm that offers a fine balance between imperceptibility and robustness using a motion-compensated, wavelet-based visual attention model (VAM). The proposed VAM includes spatial cues for visual saliency as well as temporal cues. The spatial modeling uses the spatial wavelet coefficients, while the temporal modeling accounts for both local and global motion to arrive at the spatiotemporal VAM for video. The model is then used to develop a video watermarking algorithm, where a two-level watermarking weighting parameter map is generated from the VAM saliency maps and data are embedded into the host image according to the visual attentiveness of each region. By avoiding higher-strength watermarking in the visually attentive regions, the resulting watermarked video achieves high perceived visual quality while preserving high robustness. The proposed VAM outperforms state-of-the-art video visual attention methods in joint saliency detection and low computational complexity performance. For the same embedding distortion, the proposed visual attention-based watermarking achieves up to 39% (nonblind) and 22% (blind) improvement in robustness against H.264/AVC compression, compared to existing watermarking methodology that does not use the VAM. The proposed visual attention-based video watermarking results in visual quality similar to that of low-strength watermarking and robustness similar to that of high-strength watermarking.
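The two-level weighting idea can be sketched in a few lines: binarize the saliency map and assign a lower embedding strength to attentive regions. This is a simplified, pixel-domain stand-in (the paper embeds in the wavelet domain); the function names, threshold, and strength values are all assumptions for illustration:

```python
import numpy as np

def two_level_weight_map(saliency, alpha_low=0.02, alpha_high=0.1, thresh=0.5):
    """Binarize a saliency map (values in [0, 1]) into a two-level
    embedding-strength map: weak embedding where visual attention is high,
    strong embedding in non-attentive regions."""
    return np.where(saliency >= thresh, alpha_low, alpha_high)

def embed(host, watermark_bits_pm1, weights):
    """Additive spread-spectrum-style embedding of +/-1 watermark bits,
    scaled per region by the two-level weight map (pixel domain for
    brevity; real schemes would modify wavelet coefficients)."""
    return host + weights * watermark_bits_pm1 * host.std()
```

The salient region receives the small perturbation (preserving perceived quality), while the non-salient region carries the stronger, more robust embedding.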
DOE Office of Scientific and Technical Information (OSTI.GOV)
Castillo-Villar, Krystel K.; Eksioglu, Sandra; Taherkhorsandi, Milad
2017-02-20
The production of biofuels using second-generation feedstocks has been recognized as an important alternative source of sustainable energy, and its demand is expected to increase due to regulations such as the Renewable Fuel Standard. However, the pathway to biofuel industry maturity faces unique, unaddressed challenges. To address these challenges, this article presents an optimization model which quantifies and controls the impact of biomass quality variability on supply chain related decisions and technology selection. We propose a two-stage stochastic programming model and associated efficient solution procedures for solving large-scale problems to (1) better represent the random nature of biomass quality (defined by moisture and ash contents) in supply chain modeling, and (2) assess the impact of these uncertainties on supply chain design and planning. The proposed model is then applied to a case study in the state of Tennessee. Results show that high moisture and ash contents negatively impact the unit delivery cost, since poor biomass quality requires the addition of quality control activities. Experimental results indicate that supply chain cost could increase by as much as 27%-31% when biomass quality is poor. We assess the impact of biomass quality on the supply chain topology. Our case study indicates that biomass quality impacts supply chain costs; thus, it is important to consider the impact of biomass quality in supply chain design and management decisions.
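The structure of such a two-stage model can be illustrated with a toy example: a here-and-now decision (install drying equipment at a depot or not) is made before the biomass-quality scenario (moisture, ash) is realized, and the recourse cost is averaged over scenarios. All numbers below are invented for illustration, and the tiny decision space is solved by enumeration rather than by the authors' decomposition procedures:

```python
# Hypothetical scenario set: (probability, moisture fraction, ash fraction).
scenarios = [
    (0.5, 0.10, 0.02),   # good-quality biomass
    (0.3, 0.25, 0.05),   # moderate quality
    (0.2, 0.40, 0.10),   # poor quality
]

def second_stage_cost(moisture, ash, has_dryer):
    """Recourse cost per dry ton delivered: hauling water and ash inflates
    transport cost, and a drying penalty is incurred unless the depot
    installed a dryer in the first stage."""
    haul = 30.0 * (1.0 + moisture + ash)
    drying = 0.0 if has_dryer else 80.0 * moisture
    return haul + drying

def expected_cost(has_dryer, dryer_capex=8.0):
    """First-stage (fixed) cost plus probability-weighted recourse cost."""
    fixed = dryer_capex if has_dryer else 0.0
    return fixed + sum(p * second_stage_cost(m, a, has_dryer)
                       for p, m, a in scenarios)

# Enumerate the two first-stage decisions and pick the cheaper in expectation.
best = min([False, True], key=expected_cost)
```

With these made-up numbers, investing in the dryer is optimal because the expected drying penalty under uncertainty exceeds the capital cost, echoing the paper's finding that poor quality forces costly quality-control activities.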
Cost Models for MMC Manufacturing Processes
NASA Technical Reports Server (NTRS)
Elzey, Dana M.; Wadley, Haydn N. G.
1996-01-01
Processes for the manufacture of advanced metal matrix composites are rapidly approaching maturity in the research laboratory, and there is growing interest in their transition to industrial production. However, research conducted to date has almost exclusively focused on overcoming the technical barriers to producing high-quality material, and little attention has been given to the economic feasibility of these laboratory approaches and process cost issues. A quantitative cost modeling (QCM) approach was developed to address these issues. QCMs are cost analysis tools based on predictive process models relating process conditions to the attributes of the final product. An important attribute of the QCM approach is the ability to predict the sensitivity of material production costs to product quality and to quantitatively explore trade-offs between cost and quality. Applications of the cost models allow more efficient direction of future MMC process technology development and a more accurate assessment of MMC market potential. Cost models were developed for two state-of-the-art metal matrix composite (MMC) manufacturing processes: tape casting and plasma spray deposition. Quality and cost models are presented for both processes, and the resulting predicted quality-cost curves are presented and discussed.
Reforming funding for chronic illness: Medicare-CDM.
Swerissen, Hal; Taylor, Michael J
2008-02-01
Chronic diseases are a major challenge for the Australian health care system in terms of both the provision of quality care and expenditure, and these challenges will only increase in the future. Various programs have been instituted under the Medicare system to provide increased funding for chronic care, but essentially these programs still follow the traditional fee-for-service model. This paper proposes a realignment and extension of current Medicare chronic disease management programs into a framework that provides general practitioners and other health professionals with the necessary "tools" for high quality care planning and ongoing management, and incorporating international models of outcome-linked funding. The integration of social support services with the Medicare system is also a necessary step in providing high quality care for patients with complex needs requiring additional support.
Facilitating the openEHR approach - organizational structures for defining high-quality archetypes.
Kohl, Christian Dominik; Garde, Sebastian; Knaup, Petra
2008-01-01
Using openEHR archetypes to establish an electronic patient record promises rapid development and system interoperability by using or adopting existing archetypes. However, internationally accepted, high quality archetypes which enable a comprehensive semantic interoperability require adequate development and maintenance processes. Therefore, structures have to be created involving different health professions. In the following we present a model which facilitates and governs distributed but cooperative development and adoption of archetypes by different professionals including peer reviews. Our model consists of a hierarchical structure of professional committees and descriptions of the archetype development process considering these different committees.
Mechanisms of deterioration of nutrients. [freeze drying methods for space flight food
NASA Technical Reports Server (NTRS)
Karel, M.; Flink, J. M.
1974-01-01
Methods are reported by which freeze-dried foods of improved quality will be produced. The applicability of theories of flavor retention has been demonstrated for a number of food polymers, both proteins and polysaccharides. Studies on the formation of structures during freeze drying have been continued for emulsified systems. Deterioration of the organoleptic quality of freeze-dried foods due to high-temperature heating has been evaluated, and improved procedures have been developed. The influence of water activity and high temperature on retention of model flavor materials and on browning deterioration has been evaluated for model systems and food materials.
Haas, Sheila A; Vlasses, Frances; Havey, Julia
2016-01-01
There are multiple demands and challenges inherent in establishing staffing models in ambulatory health care settings today. If health care administrators establish a supportive physical and interpersonal health care environment, and develop high-performing interprofessional teams and staffing models and electronic documentation systems that track performance, patients will have more opportunities to receive safe, high-quality evidence-based care that encourages patient participation in decision making, as well as in the provision of their care. The health care organization must be aligned with and responsive to the community within which it resides, fully invested in population health management, and continuously scanning the environment for competitive, regulatory, and external environmental risks. All of these challenges require highly competent providers willing to change attitudes and culture, such as movement toward collaborative practice among the interprofessional team, including the patient.
The Economic Value of Air Quality Forecasting
NASA Astrophysics Data System (ADS)
Anderson-Sumo, Tasha
Both long-term and daily air quality forecasts provide an essential component of protecting human health and limiting impact costs. According to the American Lung Association, the estimated current annual cost of air pollution-related illness in the United States, adjusted for inflation (3% per year), is approximately $152 billion. Many of the risks, such as hospital visits and mortality, are associated with poor air quality days (when the Air Quality Index is greater than 100). Sensitive groups become more susceptible under these conditions, and more accurate forecasts would help them take more appropriate precautions. This research focuses on evaluating the utility of air quality forecasting in terms of its potential impacts by building on air quality forecasting and economic metrics. Our analysis includes data collected during the summertime ozone seasons between 2010 and 2012 from air quality models for the Washington, DC/Baltimore, MD region. The metrics that are relevant to our analysis include: (1) the number of times that a high ozone or particulate matter (PM) episode is correctly forecasted, (2) the number of times that a high ozone or PM episode is forecasted when it does not occur, and (3) the number of times when the air quality forecast predicts a cleaner air episode but the air was observed to have high ozone or PM. Our collection of data included available air quality model forecasts of ozone and particulate matter from the U.S. Environmental Protection Agency (EPA)'s AIRNOW as well as observational data of ozone and particulate matter from Clean Air Partners. We evaluated the performance of the air quality forecasts against the observational data and found that the forecast models perform well for the Baltimore/Washington region and the time interval observed. We estimate the potential benefit for the Baltimore/Washington region at up to 5,905 lives and 5.9 billion dollars per year.
This total assumes perfect compliance with poor air quality warnings and accurate air quality forecasts. Evaluating the economic utility of the forecasts is difficult because not everyone complies. Even with a compliance rate as low as 5%, and with 72% as the average probability of detection of poor air quality days by the air quality models, we estimate that the forecasting program saves 412 lives, or 412 million dollars, per year for the region. The totals we found are as great as or greater than those of other typical yearly meteorological hazard programs, such as tornado or hurricane forecasting, and it is clear that the economic value of air quality forecasting in the Baltimore/Washington region is substantial.
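The three forecast-verification counts listed above form a standard contingency table, from which the probability of detection (POD) follows; the benefit estimate then scales the maximum savings by detection skill and compliance. A minimal sketch of that bookkeeping (illustrative only, not the study's actual accounting; all names and numbers in the usage are assumptions):

```python
def contingency(forecast_bad, observed_bad):
    """Count hits, false alarms, and misses for categorical poor-air-quality
    forecasts (e.g. 'AQI > 100 expected tomorrow')."""
    hits = sum(f and o for f, o in zip(forecast_bad, observed_bad))
    false_alarms = sum(f and not o for f, o in zip(forecast_bad, observed_bad))
    misses = sum(o and not f for f, o in zip(forecast_bad, observed_bad))
    return hits, false_alarms, misses

def pod(hits, misses):
    """Probability of detection: fraction of observed bad days forecast."""
    return hits / (hits + misses) if (hits + misses) else float("nan")

def expected_savings(max_savings, prob_detection, compliance):
    """Realized benefit scales with both detection skill and compliance."""
    return max_savings * prob_detection * compliance
```

For example, a perfect-compliance ceiling of 100 units shrinks to 3.6 units once 72% detection and 5% compliance are applied.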
Quality assessment of protein model-structures based on structural and functional similarities
2012-01-01
Background Experimental determination of protein 3D structures is expensive, time consuming and sometimes impossible. The gap between the number of protein structures deposited in the Worldwide Protein Data Bank and the number of sequenced proteins is constantly widening. Computational modeling is deemed to be one of the ways to deal with the problem. Although protein 3D structure prediction is a difficult task, many tools are available. These tools can model a structure from a sequence or from partial structural information, e.g. contact maps. Consequently, biologists have the ability to automatically generate a putative 3D structure model of any protein. However, the main issue becomes evaluation of the model quality, which is one of the most important challenges of structural biology. Results GOBA - Gene Ontology-Based Assessment is a novel protein Model Quality Assessment Program. It estimates the compatibility between a model-structure and its expected function. GOBA is based on the assumption that a high-quality model is expected to be structurally similar to proteins functionally similar to the prediction target. Whereas DALI is used to measure structure similarity, protein functional similarity is quantified using the standardized and hierarchical description of proteins provided by Gene Ontology, combined with Wang's algorithm for calculating semantic similarity. Two approaches are proposed to express the quality of protein model-structures. One is a single-model quality assessment method; the other is its modification, which provides a relative measure of model quality. Exhaustive evaluation is performed on data sets of model-structures submitted to the CASP8 and CASP9 contests. Conclusions The validation shows that the method is able to discriminate between good and bad model-structures. The best of the tested GOBA scores achieved 0.74 and 0.8 as mean Pearson correlations to the observed quality of models in our CASP8- and CASP9-based validation sets.
GOBA also obtained the best result for two targets of CASP8, and one of CASP9, compared to the contest participants. Consequently, GOBA offers a novel single model quality assessment program that addresses the practical needs of biologists. In conjunction with other Model Quality Assessment Programs (MQAPs), it would prove useful for the evaluation of single protein models. PMID:22998498
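GOBA's central assumption, that structural similarity to functionally similar proteins indicates model quality, can be caricatured in a few lines. Here `struct_sim` and `func_sim` stand in for DALI structural alignment and the GO/Wang semantic similarity, respectively; the function, its inputs, and the relative-score variant are hypothetical simplifications, not GOBA's actual scoring formula:

```python
def goba_like_scores(models, references, struct_sim, func_sim):
    """Absolute score per model: structural similarity to each reference
    protein, weighted by that reference's functional similarity to the
    prediction target, taking the best reference. Also returns a relative
    variant normalized by the best-scoring model in the set."""
    abs_scores = [max(struct_sim(m, r) * func_sim(r) for r in references)
                  for m in models]
    top = max(abs_scores)
    rel_scores = [s / top if top else 0.0 for s in abs_scores]
    return abs_scores, rel_scores
```

The relative variant mirrors the paper's second approach: scores are only meaningful compared with the other candidate models for the same target.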
Modeling water quality, temperature, and flow in Link River, south-central Oregon
Sullivan, Annett B.; Rounds, Stewart A.
2016-09-09
The 2.1-km (1.3-mi) Link River connects Upper Klamath Lake to the Klamath River in south-central Oregon. A CE-QUAL-W2 flow and water-quality model of Link River was developed to provide a connection between an existing model of the upper Klamath River and any existing or future models of Upper Klamath Lake. Water-quality sampling at six locations in Link River was done during 2013–15 to support model development and to provide a better understanding of instream biogeochemical processes. The short reach and high velocities in Link River resulted in fast travel times and limited water-quality transformations, except for dissolved oxygen. Reaeration through the reach, especially at the falls in Link River, was particularly important in moderating dissolved oxygen concentrations that at times entered the reach at Link River Dam with marked supersaturation or subsaturation. This reaeration resulted in concentrations closer to saturation downstream at the mouth of Link River.
Investigation of Possible Wellbore Cement Failures During Hydraulic Fracturing Operations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Jihoon; Moridis, George
2014-11-01
We model and assess the possibility of shear failure along the vertical well, using the Mohr-Coulomb model, by employing a rigorous coupled flow-geomechanics analysis. To this end, we vary the values of cohesion between the well casing and the surrounding cement to represent different quality levels of the cementing operation (low cohesion corresponds to low-quality cement and/or incomplete cementing). The simulation results show that there is very little fracturing when the cement is of high quality. Conversely, incomplete cementing and/or weak cement can cause significant shear failure and the evolution of long fractures/cracks along the vertical well. Specifically, low cohesion between the well and cemented areas can cause significant shear failure along the well, but cohesion equal to that of the cemented zone does not cause shear failure. When the hydraulic fracturing pressure is high, low cohesion of the cement can cause fast propagation of shear failure and of the resulting fracture/crack, but a high-quality cement with no weak zones exhibits limited shear failure that is concentrated near the bottom of the vertical part of the well. Thus, high-quality cement and complete cementing along the vertical well appear to be the strongest protection against shear failure of the wellbore cement and, consequently, against contamination hazards to drinking water aquifers during hydraulic fracturing operations.
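The role of cohesion in the Mohr-Coulomb criterion invoked here is simple to state: slip occurs once the shear stress on a plane exceeds c + sigma_n * tan(phi), so lowering the cement-casing cohesion c directly lowers the failure threshold. A minimal sketch of the criterion itself (the coupled flow-geomechanics simulation is far beyond this; the stress values and friction angle below are illustrative):

```python
import math

def mohr_coulomb_fails(tau, sigma_n, cohesion, friction_angle_deg):
    """Mohr-Coulomb shear-failure check: slip occurs when the shear stress
    tau exceeds the shear strength c + sigma_n * tan(phi). Lower cohesion c
    (poor or incomplete cement) fails at lower shear stress."""
    phi = math.radians(friction_angle_deg)
    return tau > cohesion + sigma_n * math.tan(phi)
```

At the same shear and normal stress, a low-cohesion interface fails while a high-cohesion one holds, which is the mechanism behind the contrast the abstract describes.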
Air quality high resolution simulations of Italian urban areas with WRF-CHIMERE
NASA Astrophysics Data System (ADS)
Falasca, Serena; Curci, Gabriele
2017-04-01
The new European Directive on ambient air quality and cleaner air for Europe (2008/50/EC) encourages the use of modelling techniques to support the observations in the assessment and forecasting of air quality. The modelling system based on the combination of the WRF meteorological model and the CHIMERE chemistry-transport model is used to perform simulations at high resolution over the main Italian cities (e.g. Milan, Rome). Three domains covering Europe, Italy and the urban areas are nested, with grid size decreasing down to 1 km. Numerical results are produced for a winter month and a summer month of the year 2010 and are validated using ground-based observations (e.g. from the European air quality database AirBase). A sensitivity study is performed using different physics options, domain resolutions and grid ratios; different urban parameterization schemes are tested, also using characteristic morphology parameters for the cities considered. A spatial reallocation of anthropogenic emissions, derived from international (e.g. EMEP, TNO, HTAP) and national (e.g. CTN-ACE) emission inventories and based on land cover datasets (Global Land Cover Facility and GlobCover) and the OpenStreetMap tool, is also included. Preliminary results indicate that the high-resolution spatial reallocation allows a more realistic reproduction of the distribution of emission flows, and thus of pollutant concentrations, with significant advantages especially for urban environments.
Depletion of deep marine food patches forces divers to give up early.
Thums, Michele; Bradshaw, Corey J A; Sumner, Michael D; Horsburgh, Judy M; Hindell, Mark A
2013-01-01
Many optimal foraging models for diving animals examine strategies that maximize time spent in the foraging zone, assuming that prey acquisition increases linearly with search time. Other models have considered the effect of patch quality and predict a net energetic benefit if dives where no prey is encountered early in the dive are abandoned. For deep divers, however, the energetic benefit of giving up is reduced owing to the elevated energy costs associated with descending to physiologically hostile depths, so patch residence time should be invariant. Others consider an asymptotic gain function where the decision to leave a patch is driven by patch-depletion effects - the marginal value theorem. As predator behaviour is increasingly being used as an index of marine resource density and distribution, it is important to understand the nature of this gain function. We investigated the dive behaviour of the world's deepest-diving seal, the southern elephant seal Mirounga leonina, in response to patch quality. Testing these models has largely been limited to controlled experiments on captive animals. By integrating in situ measurements of the seal's relative lipid content obtained from drift rate data (a measure of foraging success) with area-restricted search behaviour identified from first-passage time analysis, we identified regions of high- and low-quality patches. Dive durations and bottom times were not invariant and did not increase in regions of high quality; rather, both were longer when patches were of relatively low quality. This is consistent with the predictions of the marginal value theorem and provides support for a nonlinear relationship between search time and prey acquisition. We also found higher descent and ascent rates in high-quality patches suggesting that seals minimized travel time to the foraging patch when quality was high; however, this was not achieved by increasing speed or dive angle. 
Relative body lipid content was an important predictor of dive behaviour. Seals did not schedule their diving to maximize time spent in the foraging zone in higher-quality patches, challenging the widely held view that maximizing time in the foraging zone translates to greater foraging success. © 2012 The Authors. Journal of Animal Ecology © 2012 British Ecological Society.
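The marginal value theorem invoked above can be sketched numerically: with an asymptotic gain function g(t) = g_max(1 - e^(-kt)), the optimal patch residence time maximizes the long-run gain rate g(t)/(travel + t). In this hedged sketch (the gain function form and all parameter values are illustrative, not fitted to the seal data), a lower-quality patch (smaller k) yields a longer optimal residence time, consistent with the longer bottom times observed in low-quality patches:

```python
import math

def optimal_residence(k, g_max=1.0, travel=2.0, dt=1e-3, t_max=50.0):
    """Marginal value theorem sketch: asymptotic gain g(t) = g_max*(1 - exp(-k*t));
    return the residence time t that maximizes the long-run rate g(t)/(travel + t),
    found by a simple grid search."""
    best_t, best_rate = 0.0, -1.0
    t = dt
    while t < t_max:
        rate = g_max * (1.0 - math.exp(-k * t)) / (travel + t)
        if rate > best_rate:
            best_t, best_rate = t, rate
        t += dt
    return best_t

# Lower patch quality (slower gain, smaller k) -> longer optimal residence:
assert optimal_residence(k=0.2) > optimal_residence(k=1.0)
```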
Urban Air Quality Modelling with AURORA: Prague and Bratislava
NASA Astrophysics Data System (ADS)
Veldeman, N.; Viaene, P.; De Ridder, K.; Peelaerts, W.; Lauwaet, D.; Muhammad, N.; Blyth, L.
2012-04-01
The European Commission, in its strategy to protect the health of the European citizens, states that in order to assess the impact of air pollution on public health, information on long-term exposure to air pollution should be available. Currently, indicators of air quality are often being generated using measured pollutant concentrations. While air quality monitoring stations data provide accurate time series information at specific locations, air quality models have the advantage of being able to assess the spatial variability of air quality (for different resolutions) and predict air quality in the future based on different scenarios. When running such air quality models at a high spatial and temporal resolution, one can simulate the actual situation as closely as possible, allowing for a detailed assessment of the risk of exposure to citizens from different pollutants. AURORA (Air quality modelling in Urban Regions using an Optimal Resolution Approach), a prognostic 3-dimensional Eulerian chemistry-transport model, is designed to simulate urban- to regional-scale atmospheric pollutant concentration and exposure fields. The AURORA model also allows to calculate the impact of changes in land use (e.g. planting of trees) or of emission reduction scenario's on air quality. AURORA is currently being applied within the ESA atmospheric GMES service, PASODOBLE (http://www.myair-eu.org), that delivers information on air quality, greenhouse gases, stratospheric ozone, … At present there are two operational AURORA services within PASODOBLE. Within the "Air quality forecast service" VITO delivers daily air quality forecasts for Belgium at a resolution of 5 km and for the major Belgian cities: Brussels, Ghent, Antwerp, Liege and Charleroi. Furthermore forecast services are provided for Prague, Czech Republic and Bratislava, Slovakia, both at a resolution of 1 km. 
The "Urban/regional air quality assessment service" provides urban- and regional-scale maps (hourly resolution) for air pollution and human exposure statistics for an entire year. So far we concentrated on Brussels, Belgium and the Rotterdam harbour area, The Netherlands. In this contribution we focus on the operational forecast services. Reference Lefebvre W. et al. (2011) Validation of the MIMOSA-AURORA-IFDM model chain for policy support: Modeling concentrations of elemental carbon in Flanders, Atmospheric Environment 45, 6705-6713
Alkerwi, Ala'a; Sauvageot, Nicolas; Malan, Leoné; Shivappa, Nitin; Hébert, James R
2015-04-14
This study examined the association between nutritional awareness and diet quality, as indicated by energy density, dietary diversity and adequacy to dietary recommendations, while considering the potentially important role of socioeconomic status (SES). Data were derived from 1351 subjects, aged 18-69 years and enrolled in the ORISCAV-LUX study. Energy density score (EDS), dietary diversity score (DDS) and Recommendation Compliance Index (RCI) were calculated based on data derived from a food frequency questionnaire. Nutritional awareness was defined as self-perception of the importance assigned to eating balanced meals, and classified as high, moderate, or of little importance. Initially, a General Linear Model was fit that adjusted for age, sex, country of birth, and body mass index (BMI). Furthermore, the simultaneous contributions to diet quality of individual-level socioeconomic factors, education, and household income were examined across levels of nutritional awareness. Attributing high importance to eating balanced meals was associated inversely with energy density (p = 0.02) and positively with both dietary diversity (p < 0.0001) and adequacy to dietary recommendations (p < 0.0001), independent of demographic factors, weight status and SES. Further adjustment for household income in the EDS-related multivariable model reduced the β coefficient by 47% for the "moderate importance" category and 36% for the "high importance" category. Likewise, the β coefficient decreased by 13.6% and 10.7% in the DDS-related model, and by 12.5% and 7.1% in the RCI-related model, respectively, across awareness categories. Nutritional awareness has a direct effect on diet quality, with a minor component of variance explained by improved income. The impact of nutritional awareness on diet quality seems to be a promising area for both health promotion and health policy research.
ERIC Educational Resources Information Center
Park, Hee Sun; Levine, Timothy R.; Kingsley Westerman, Catherine Y.; Orfgen, Tierney; Foregger, Sarah
2007-01-01
Involvement has long been theoretically specified as a crucial factor determining the persuasive impact of messages. In social judgment theory, ego-involvement makes people more resistant to persuasion, whereas in dual-process models, high-involvement people are susceptible to persuasion when argument quality is high. It is argued that these…
ERIC Educational Resources Information Center
Shouse, A. Clay; Epstein, Ann S.
This document is the final report of the McGregor-funded High/Scope training initiative, a system-wide approach to improving the quality of early childhood programs in the Detroit metropolitan area. The 3-year project was based on the validated High/Scope educational approach and training model, which advocates hands-on active learning for both…
Transitioning to a High-Value Health Care Model: Academic Accountability.
Johnson, Pamela T; Alvin, Matthew D; Ziegelstein, Roy C
2018-06-01
Health care spending in the United States has increased to unprecedented levels, and these costs have broken medical providers' promise to do no harm. Medical debt is the leading contributor to U.S. personal bankruptcy, more than 50% of household foreclosures are secondary to medical debt and illness, and patients are choosing to avoid necessary care because of its cost. Evidence that the health care delivery model is contributing to patient hardship is a call to action for the profession to transition to a high-value model, one that delivers the highest health care quality and safety at the lowest personal and financial cost to patients. As such, value improvement work is being done at academic medical centers across the country. To promote measurable improvements in practice on a national scale, academic institutions need to align efforts and create a new model for collaboration, one that transcends cross-institutional competition, specialty divisions, and geographical constraints. Academic institutions are particularly accountable because of the importance of research and education in driving this transition. Investigations that elucidate effective implementation methodologies and evaluate safety outcomes data can facilitate transformation. Engaging trainees in quality improvement initiatives will instill high-value care into their practice. This article charges academic institutions to go beyond dissemination of best practice guidelines and demonstrate accountability for high-value quality improvement implementation. By effectively transitioning to a high-value health care system, medical providers will convincingly demonstrate that patients are their most important priority.
Modeling Best Management Practices (BMPs) with HSPF
The Hydrological Simulation Program-Fortran (HSPF) is a semi-distributed watershed model, which simulates hydrology and water quality processes at user-specified spatial and temporal scales. Although HSPF is a comprehensive and highly flexible model, a number of investigators not...
The impact of on-site wastewater from high density cluster developments on groundwater quality
NASA Astrophysics Data System (ADS)
Morrissey, P. J.; Johnston, P. M.; Gill, L. W.
2015-11-01
The net impact on groundwater quality from high density clusters of unsewered housing across a range of hydro(geo)logical settings has been assessed. Four separate cluster development sites were selected, each representative of different aquifer vulnerability categories. Groundwater samples were collected on a monthly basis over a two year period for chemical and microbiological analysis from nested multi-horizon sampling boreholes upstream and downstream of the study sites. The field results showed no statistically significant difference between upstream and downstream water quality at any of the study areas, although there were higher breakthroughs in contaminants in the High and Extreme vulnerability sites linked to high intensity rainfall events; these, however, could not be directly attributed to on-site effluent. Linked numerical models were then built for each site using HYDRUS 2D to simulate the attenuation of contaminants through the unsaturated zone, from which the resulting hydraulic and contaminant fluxes at the water table were used as inputs into MODFLOW MT3D models to simulate the groundwater flows. The results of the simulations confirmed the field observations at each site, indicating that the existing clustered on-site wastewater discharges would only cause limited and very localised impacts on groundwater quality, with contaminant loads being quickly dispersed and diluted downstream due to the relatively high groundwater flow rates. Further simulations were then carried out using the calibrated models to assess the impact of increasing cluster densities, revealing little impact at any of the study locations up to a density of 6 units/ha, with the exception of the Extreme vulnerability site.
Making difficult decisions: the role of quality of care in choosing a nursing home.
Pesis-Katz, Irena; Phelps, Charles E; Temkin-Greener, Helena; Spector, William D; Veazie, Peter; Mukamel, Dana B
2013-05-01
We investigated how quality of care affects choosing a nursing home. We examined nursing home choice in California, Ohio, New York, and Texas in 2001, a period before the federal Nursing Home Compare report card was published. Thus, consumers were less able to observe clinical quality or clinical quality was masked. We modeled nursing home choice by estimating a conditional multinomial logit model. In all states, consumers were more likely to choose nursing homes of high hotel services quality but not clinical care quality. Nursing home choice was also significantly associated with shorter distance from prior residence, not-for-profit status, and larger facility size. In the absence of quality report cards, consumers choose a nursing home on the basis of the quality dimensions that are easy for them to observe, evaluate, and apply to their situation. Future research should focus on identifying the quality information that offers the most value added to consumers.
Making Difficult Decisions: The Role of Quality of Care in Choosing a Nursing Home
Phelps, Charles E.; Temkin-Greener, Helena; Spector, William D.; Veazie, Peter; Mukamel, Dana B.
2013-01-01
Objectives. We investigated how quality of care affects choosing a nursing home. Methods. We examined nursing home choice in California, Ohio, New York, and Texas in 2001, a period before the federal Nursing Home Compare report card was published. Thus, consumers were less able to observe clinical quality or clinical quality was masked. We modeled nursing home choice by estimating a conditional multinomial logit model. Results. In all states, consumers were more likely to choose nursing homes of high hotel services quality but not clinical care quality. Nursing home choice was also significantly associated with shorter distance from prior residence, not-for-profit status, and larger facility size. Conclusions. In the absence of quality report cards, consumers choose a nursing home on the basis of the quality dimensions that are easy for them to observe, evaluate, and apply to their situation. Future research should focus on identifying the quality information that offers the most value added to consumers. PMID:23488519
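The conditional (McFadden) logit used in both versions of this study assigns each nursing home a choice probability proportional to the exponentiated utility of its attributes. A minimal sketch, with hypothetical attributes and taste weights rather than the coefficients fitted in the paper:

```python
import math

def conditional_logit_probs(attrs, beta):
    """Conditional (McFadden) logit choice probabilities.
    attrs: one attribute vector per alternative; beta: taste weights.
    P_j = exp(x_j . beta) / sum_k exp(x_k . beta)."""
    utils = [sum(b * x for b, x in zip(beta, xs)) for xs in attrs]
    u_max = max(utils)                       # subtract max for numerical stability
    exps = [math.exp(u - u_max) for u in utils]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical attributes: [hotel-services quality, clinical quality, distance].
# A taste vector that weights observable hotel quality heavily (as the study
# found) makes the high hotel-quality home the more likely choice:
homes = [[0.9, 0.5, 2.0],
         [0.4, 0.9, 1.0]]
beta = [2.0, 0.1, -0.3]
p = conditional_logit_probs(homes, beta)
assert abs(sum(p) - 1.0) < 1e-12 and p[0] > p[1]
```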
NASA Technical Reports Server (NTRS)
Donlan, Charles J.; Kuhn, Richard E.
1948-01-01
An analysis of the estimated high-speed flying qualities of the Chance Vought XF7U-1 airplane in the Mach number range from 0.40 to 0.91 has been made, based on tests of a 0.08-scale model of this airplane in the Langley high-speed 7- by 10-foot wind tunnel. The analysis indicates longitudinal control-position instability at transonic speeds, but the accompanying trim changes are not large. Control-position maneuvering stability, however, is present at all speeds. Longitudinal and lateral control appear adequate, but the damping of the short-period longitudinal and lateral oscillations at high altitudes is poor and may require artificial damping.
[Review on HSPF model for simulation of hydrology and water quality processes].
Li, Zhao-fu; Liu, Hong-Yu; Li, Yan
2012-07-01
Hydrological Simulation Program-FORTRAN (HSPF), written in FORTRAN, is one of the best semi-distributed hydrology and water quality models; it was first developed based on the Stanford Watershed Model. Many studies on HSPF model application have been conducted. It can represent the contributions of sediment, nutrients, pesticides, conservatives and fecal coliforms from agricultural areas, continuously simulate water quantity and quality processes, as well as the effects of climate change and land use change on water quantity and quality. HSPF consists of three basic application components: PERLND (Pervious Land Segment), IMPLND (Impervious Land Segment), and RCHRES (free-flowing reaches or mixed reservoirs). In general, HSPF has extensive application in the modeling of hydrology and water quality processes and the analysis of climate change and land use change. However, it has seen limited use in China. The main problems with HSPF include: (1) some algorithms and procedures still need revision; (2) due to the high standard for input data, the accuracy of the model is limited by spatial and attribute data; (3) the model is only applicable to the simulation of well-mixed rivers, reservoirs and one-dimensional water bodies, and must be integrated with other models to solve more complex problems. At present, studies on HSPF model development are still ongoing, covering revision of the model platform, extension of model functions, method development for model calibration, and analysis of parameter sensitivity. With the accumulation of basic data and improvement of data sharing, the HSPF model will be applied more extensively in China.
Constructing a consumption model of fine dining from the perspective of behavioral economics
Tsai, Sang-Bing
2018-01-01
Numerous factors affect how people choose a fine dining restaurant, including food quality, service quality, food safety, and hedonic value. A conceptual framework for evaluating restaurant selection behavior has not yet been developed. This study surveyed 150 individuals with fine dining experience and proposed the use of mental accounting and axiomatic design to construct a consumer economic behavior model. Linear and logistic regressions were employed to determine model correlations and the probability of each factor affecting behavior. The most crucial factor was food quality, followed by service and dining motivation, particularly regarding family dining. Safe ingredients, high cooking standards, and menu innovation all increased the likelihood of consumers choosing fine dining restaurants. PMID:29641554
Constructing a consumption model of fine dining from the perspective of behavioral economics.
Hsu, Sheng-Hsun; Hsiao, Cheng-Fu; Tsai, Sang-Bing
2018-01-01
Numerous factors affect how people choose a fine dining restaurant, including food quality, service quality, food safety, and hedonic value. A conceptual framework for evaluating restaurant selection behavior has not yet been developed. This study surveyed 150 individuals with fine dining experience and proposed the use of mental accounting and axiomatic design to construct a consumer economic behavior model. Linear and logistic regressions were employed to determine model correlations and the probability of each factor affecting behavior. The most crucial factor was food quality, followed by service and dining motivation, particularly regarding family dining. Safe ingredients, high cooking standards, and menu innovation all increased the likelihood of consumers choosing fine dining restaurants.
Multiphysical simulation analysis of the dislocation structure in germanium single crystals
NASA Astrophysics Data System (ADS)
Podkopaev, O. I.; Artemyev, V. V.; Smirnov, A. D.; Mamedov, V. M.; Sid'ko, A. P.; Kalaev, V. V.; Kravtsova, E. D.; Shimanskii, A. F.
2016-09-01
Growing high-quality germanium crystals is one of the most important problems in the crystal growth industry. The dislocation density is an important parameter of the quality of single crystals. The dislocation densities in germanium crystals 100 mm in diameter, which have various shapes of the side surface and are grown by the Czochralski technique, are experimentally measured. The crystal growth is numerically simulated using heat-transfer and hydrodynamics models and the Alexander-Haasen dislocation model in terms of the CGSim software package. A comparison of the experimental and calculated dislocation densities shows that the dislocation model can be applied to study lattice defects in germanium crystals and to improve their quality.
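The Alexander-Haasen model mentioned above couples dislocation multiplication to thermally activated glide under an effective stress reduced by a strain-hardening backstress. A heavily simplified sketch; the backstress form, activation energy, and all parameter values here are illustrative placeholders, not the CGSim implementation:

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def alexander_haasen_rate(n_disl, tau, temp, k0=1.0e-4, backstress_coef=4.0,
                          m=1.0, q_ev=1.6):
    """Simplified Alexander-Haasen dislocation multiplication rate.
    Effective stress: tau_eff = max(tau - backstress_coef*sqrt(N), 0)
    Glide velocity:   v ~ tau_eff^m * exp(-Q / (k_B * T))
    Multiplication:   dN/dt = k0 * N * v
    All parameter values are illustrative only."""
    tau_eff = max(tau - backstress_coef * math.sqrt(n_disl), 0.0)
    v = tau_eff ** m * math.exp(-q_ev / (K_B * temp))
    return k0 * n_disl * v

# Multiplication is faster at higher temperature, and stops entirely once
# the hardening backstress exceeds the applied stress:
assert alexander_haasen_rate(1.0e4, 500.0, 1300.0) > alexander_haasen_rate(1.0e4, 500.0, 1200.0)
assert alexander_haasen_rate(1.0e4, 300.0, 1200.0) == 0.0
```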
High resolution infrared datasets useful for validating stratospheric models
NASA Technical Reports Server (NTRS)
Rinsland, Curtis P.
1992-01-01
An important objective of the High Speed Research Program (HSRP) is to support research in the atmospheric sciences that will improve the basic understanding of the circulation and chemistry of the stratosphere and lead to an interim assessment of the impact of a projected fleet of High Speed Civil Transports (HSCT's) on the stratosphere. As part of this work, critical comparisons between models and existing high quality measurements are planned. These comparisons will be used to test the reliability of current atmospheric chemistry models. Two suitable sets of high resolution infrared measurements are discussed.
Johnsen, Bjørn Helge; Westli, Heidi Kristina; Espevik, Roar; Wisborg, Torben; Brattebø, Guttorm
2017-11-10
High quality team leadership is important for the outcome of medical emergencies. However, the behavioral markers of leadership are not well defined. The present study investigated the effect of the frequency of behavioral markers of shared mental models (SMM) on the quality of medical management. Training video recordings of 27 trauma teams simulating emergencies were analyzed according to the team leader's frequency of shared mental model behavioral markers. The results showed a positive correlation of quality of medical management with leaders sharing information without an explicit demand for the information ("push" of information) and with leaders communicating their situational awareness (SA) and demonstrating implicit supporting behavior. When separating the sample into higher versus lower performing teams, the higher performing teams had leaders who displayed a greater frequency of "push" of information and communication of SA and supportive behavior. No difference was found for the behavioral marker of team initiative, measured as bringing up suggestions to other team members. The results of this study emphasize the team leader's role in initiating and updating a team's shared mental model. Team leaders should also set expectations for acceptable interaction patterns (e.g., promoting information exchange) and create a team climate that encourages behaviors, such as mutual performance monitoring, backup behavior, and adaptability, to enhance SMM.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moore, Kevin L., E-mail: kevinmoore@ucsd.edu; Schmidt, Rachel; Moiseenko, Vitali
Purpose: The purpose of this study was to quantify the frequency and clinical severity of quality deficiencies in intensity modulated radiation therapy (IMRT) planning in the Radiation Therapy Oncology Group 0126 protocol. Methods and Materials: A total of 219 IMRT patients from the high-dose arm (79.2 Gy) of RTOG 0126 were analyzed. To quantify plan quality, we used established knowledge-based methods for patient-specific dose-volume histogram (DVH) prediction of organs at risk and a Lyman-Kutcher-Burman (LKB) model for grade ≥2 rectal complications to convert DVHs into normal tissue complication probabilities (NTCPs). The LKB model was validated by fitting dose-response parameters relative to observed toxicities. The 90th percentile (22 of 219) of plans with the lowest excess risk (difference between clinical and model-predicted NTCP) were used to create a model for the presumed best practices in the protocol (pDVH_{0126,top10%}). Applying the resultant model to the entire sample enabled comparisons between DVHs that patients could have received and DVHs they actually received. Excess risk quantified the clinical impact of suboptimal planning. Accuracy of pDVH predictions was validated by replanning 30 of 219 patients (13.7%), including equal numbers of presumed "high-quality," "low-quality," and randomly sampled plans. NTCP-predicted toxicities were compared to adverse events on protocol. Results: Existing models showed that bladder-sparing variations were less prevalent than rectum quality variations and that increased rectal sparing was not correlated with target metrics (dose received by 98% and 2% of the PTV, respectively). Observed toxicities were consistent with current LKB parameters. Converting DVH and pDVH_{0126,top10%} to rectal NTCPs, we observed 94 of 219 patients (42.9%) with ≥5% excess risk, 20 of 219 patients (9.1%) with ≥10% excess risk, and 2 of 219 patients (0.9%) with ≥15% excess risk. 
Replanning demonstrated the predicted NTCP reductions while maintaining the volume of the PTV receiving the prescription dose. An equivalent sample of high-quality plans showed fewer toxicities than low-quality plans, 6 of 73 versus 10 of 73, respectively, although these differences were not significant (P=.21) due to insufficient statistical power in this retrospective study. Conclusions: Plan quality deficiencies in RTOG 0126 exposed patients to substantial excess risk for rectal complications.
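The Lyman-Kutcher-Burman conversion used in this study reduces a DVH to a generalized equivalent uniform dose (gEUD) and maps it through a normal CDF. A sketch with illustrative parameters; the values of n, m, and TD50 below are placeholders, not the parameters fitted to the RTOG 0126 toxicities:

```python
import math

def lkb_ntcp(doses, volumes, n, m, td50):
    """Lyman-Kutcher-Burman NTCP from a differential DVH.
    gEUD = (sum_i v_i * d_i^(1/n))^n, with fractional volumes v_i summing to 1;
    NTCP = Phi((gEUD - TD50) / (m * TD50)), Phi the standard normal CDF."""
    geud = sum(v * d ** (1.0 / n) for d, v in zip(doses, volumes)) ** n
    t = (geud - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Illustrative rectal-type parameters (small n: serial-organ behavior):
ntcp = lkb_ntcp([40.0, 60.0, 75.0], [0.5, 0.3, 0.2], n=0.09, m=0.13, td50=76.9)
assert 0.0 < ntcp < 1.0
```

For a uniform dose the gEUD equals that dose, so NTCP increases monotonically with dose, which is the sanity check used below.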
Development of a Model Following Control Law for Inflight Simulation and Flight Controls Research
NASA Technical Reports Server (NTRS)
Takahashi, Mark; Fletcher, Jay; Aiken, Edwin W. (Technical Monitor)
1994-01-01
The U.S. Army and NASA are currently developing the Rotorcraft Aircrew Systems Concepts Airborne Laboratory (RASCAL) at the Ames Research Center. RASCAL, shown in Figure 1, is a UH-60, which is being modified in a phased development program to have a research fly-by-wire flight control system, and an advanced navigation research platform. An important part of the flight controls and handling qualities research on RASCAL will be an FCS design for the aircraft to achieve high bandwidth control responses and disturbance rejection characteristics. Initially, body states will be used as feedbacks, but research into the use of rotor states will also be considered in later stages to maximize agility and maneuverability. In addition to supporting flight controls research, this FCS design will serve as the inflight simulation control law to support basic handling qualities, guidance, and displays research. Research in high bandwidth controls laws is motivated by the desire to improve the handling qualities in aggressive maneuvering and in severely degraded weather conditions. Naturally, these advantages will also improve the quality of the model following, thereby improving the inflight simulation capabilities of the research vehicle. High bandwidth in the control laws provides tighter tracking allowing for higher response bandwidths which can meet handling qualities requirements for aggressive maneuvering. System sensitivity is also reduced preventing variations in the response from the vehicle due to changing flight conditions. In addition, improved gust rejection will result from this reduced sensitivity. The gust rejection coupled with a highly stable system will make more precise maneuvering and pointing possible in severely degraded weather conditions. The difficulty in achieving higher bandwidths from the control laws in the feedback and in the responses arises from the complexity of the models that are needed to produce a satisfactory design. 
In this case, high quality models that include rotor dynamics in a physically meaningful context must be available. A non-physical accounting of the rotor, such as lumping the effect as a time delay, is not likely to produce the desired results. High order simulation models based on first principles are satisfactory for the initial design phase in order to work out the control law design concept and get an initial set of gains. These models, however, have known deficiencies, which must be resolved in the final control law design. The error in the pitch-roll cross coupling is one notable deficiency that even sophisticated rotorcraft models including complex wake aerodynamics have yet to capture successfully. This error must be accounted for to achieve the desired decoupling. The approach to design the proposed inflight simulation control law is based on using a combination of simulation and identified models. The linear and nonlinear higher order models were used to develop an explicit model following control structure. This structure was developed to accommodate the design of control laws compliant with many of the quantitative requirements in ADS-33C. Furthermore, it also allows for control law research using rotor-state feedback and other design methodologies such as Quantitative Feedback and H-Infinity. Final gain selection will be based on higher order identified models which include rotor degrees of freedom.
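The explicit model-following structure described above can be illustrated on a scalar first-order plant: a command model generates the desired response, a feedforward term inverts the design plant, and error feedback absorbs the mismatch between the design model and the true vehicle. All dynamics and gains below are illustrative toy values, not the RASCAL design:

```python
def simulate_model_following(steps=200, r=1.0):
    """Explicit model-following sketch for a scalar first-order discrete plant.
    Command model: ym' = am*ym + bm*r   (desired handling-qualities response)
    Design plant:  y'  = a*y  + b*u     (inverted for the feedforward)
    The actual plant differs from the design plant (a_true != a), so the
    error-feedback term is what keeps the tracking error small."""
    am, bm = 0.8, 0.2          # command model
    a, b = 0.9, 0.5            # design plant used for inversion
    a_true = 0.95              # actual plant (deliberate mismatch)
    kp = 1.5                   # tracking-error feedback gain
    y = ym = 0.0
    for _ in range(steps):
        ym_next = am * ym + bm * r
        u = (ym_next - a * y) / b + kp * (ym - y)   # feedforward + feedback
        y = a_true * y + b * u
        ym = ym_next
    return y, ym

y, ym = simulate_model_following()
# |y - ym| stays small despite the plant mismatch
```

Raising the feedback bandwidth (larger kp, within stability limits) tightens the tracking, which is the motivation for the high-bandwidth control laws discussed in the abstract.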
Should we trust build-up/wash-off water quality models at the scale of urban catchments?
Bonhomme, Céline; Petrucci, Guido
2017-01-01
Models of runoff water quality at the scale of an urban catchment usually rely on build-up/wash-off formulations obtained through small-scale experiments. Often, the physical interpretation of the model parameters, valid at the small scale, is transposed to large-scale applications. Testing different levels of spatial variability, the parameter distributions of a water quality model are obtained in this paper through a Markov chain Monte Carlo algorithm and analyzed. The simulated variable is the total suspended solids concentration at the outlet of a periurban catchment in the Paris region (2.3 km²), for which high-frequency turbidity measurements are available. This application suggests that build-up/wash-off models applied at the catchment scale do not maintain their physical meaning, but should be considered "black-box" models. Copyright © 2016 Elsevier Ltd. All rights reserved.
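A typical build-up/wash-off formulation of the kind discussed accumulates surface load exponentially during dry weather and removes it exponentially with runoff. A sketch with illustrative parameter values and the common exponential forms (the paper's calibrated parameters are not reproduced here):

```python
import math

def buildup(t_dry, b_max, k_b):
    """Exponential build-up of surface pollutant load during dry weather:
    B(t) = B_max * (1 - exp(-k_b * t))."""
    return b_max * (1.0 - math.exp(-k_b * t_dry))

def washoff_step(load, runoff, k_w, w, dt):
    """Exponential wash-off over one time step, dB/dt = -k_w * q^w * B.
    Returns (mass removed during dt, remaining load)."""
    removed = load * (1.0 - math.exp(-k_w * runoff ** w * dt))
    return removed, load - removed

b0 = buildup(t_dry=5.0, b_max=100.0, k_b=0.4)          # load after 5 dry days
m, b1 = washoff_step(b0, runoff=10.0, k_w=0.05, w=1.0, dt=1.0)
assert 0 < b1 < b0 < 100.0                              # some, not all, washed off
assert abs(m + b1 - b0) < 1e-9                          # mass balance holds
```

At the catchment scale, the paper's point is precisely that k_b, k_w, and w lose their small-scale physical interpretation and become effective, calibrated quantities.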
Harris, Katherine M
2002-06-01
To investigate the impact of quality information on the willingness of consumers to enroll in health plans that restrict provider access. A survey was administered to respondents between the ages of 25 and 64 in the West Los Angeles area with private health insurance. An experimental approach is used to measure the effect of variation in provider network features and information about the quality of network physicians on hypothetical plan choices. Conditional logit models are used to analyze the experimental choice data. Next, choice model parameter estimates are used to simulate the impact of changes in plan features on the market shares of competing health plans and to calculate the quality level required to make consumers indifferent to changes in provider access. The presence of quality information reduced the importance of provider network features in plan choices, as hypothesized. However, there were no statistically meaningful differences by type of quality measure (i.e., consumer assessed versus expert assessed). The results imply that large quality differences are required to make consumers indifferent to changes in provider access. The impact of quality on plan choices depended more on the particular measure and less on the type of measure. Quality ratings based on the proportion of survey respondents "extremely satisfied with results of care" had the greatest impact on plan choice, while the proportion of network doctors "affiliated with university medical centers" had the least. Other consumer and expert assessed measures had more comparable effects. Overall, the results provide empirical evidence that consumers are willing to accept restrictions on provider access in exchange for high quality. This willingness to trade implies that relatively small plans that place restrictions on provider access can successfully compete against less restrictive plans when they can demonstrate high quality. 
However, the results of this study suggest that in many cases, the level of quality required for consumers to accept access restrictions may be so high as to be unattainable. The results provide empirical support for the current focus of decision support efforts on consumer assessed quality measures. At the same time, however, the results suggest that consumers would also value quality measures based on expert assessments. This finding is relevant given the lack of comparative quality information based on expert judgment and research suggesting that consumers have apprehensions about their ability to meaningfully interpret performance-based quality measures.
Highly Integrated Quality Assurance – An Empirical Case
DOE Office of Scientific and Technical Information (OSTI.GOV)
Drake Kirkham; Amy Powell; Lucas Rich
2011-02-01
Highly Integrated Quality Assurance – An Empirical Case Drake Kirkham1, Amy Powell2, Lucas Rich3 1Quality Manager, Radioisotope Power Systems (RPS) Program, Idaho National Laboratory, P.O. Box 1625 M/S 6122, Idaho Falls, ID 83415-6122 2Quality Engineer, RPS Program, Idaho National Laboratory 3Quality Engineer, RPS Program, Idaho National Laboratory Contact: Voice: (208) 533-7550 Email: Drake.Kirkham@inl.gov Abstract. The Radioisotope Power Systems Program of the Idaho National Laboratory makes an empirical case for a highly integrated Quality Assurance function pertaining to the preparation, assembly, testing, storage and transportation of 238Pu fueled radioisotope thermoelectric generators. Case data represents multiple campaigns including the Pluto/New Horizons mission, the Mars Science Laboratory mission in progress, and other related projects. Traditional Quality Assurance models would attempt to reduce cost by minimizing the role of dedicated Quality Assurance personnel in favor of either functional tasking or peer-based implementations. Highly integrated Quality Assurance adds value by placing trained quality inspectors on the production floor side-by-side with nuclear facility operators to enhance team dynamics, reduce inspection wait time, and provide for immediate, independent feedback. Value is also added by maintaining dedicated Quality Engineers to provide for rapid identification and resolution of corrective action, enhanced and expedited supply chain interfaces, improved bonded storage capabilities, and technical resources for requirements management including data package development and Certificates of Inspection. A broad examination of cost-benefit indicates highly integrated Quality Assurance can reduce cost through the mitigation of risk and reducing administrative burden thereby allowing engineers to be engineers, nuclear operators to be nuclear operators, and the cross-functional team to operate more efficiently.
Applicability of this case extends to any high-value, long-term project where traceability and accountability are determining factors.
Garcia-Menendez, Fernando; Hu, Yongtao; Odman, Mehmet T
2014-09-15
Air quality forecasts generated with chemical transport models can provide valuable information about the potential impacts of fires on pollutant levels. However, significant uncertainties are associated with fire-related emission estimates as well as their distribution on gridded modeling domains. In this study, we explore the sensitivity of fine particulate matter concentrations predicted by a regional-scale air quality model to the spatial and temporal allocation of fire emissions. The assessment was completed by simulating a fire-related smoke episode in which air quality throughout the Atlanta metropolitan area was affected on February 28, 2007. Sensitivity analyses were carried out to evaluate the significance of emission distribution among the model's vertical layers, along the horizontal plane, and into hourly inputs. Predicted PM2.5 concentrations were highly sensitive to emission injection altitude relative to planetary boundary layer height. Simulations were also responsive to the horizontal allocation of fire emissions and their distribution into single or multiple grid cells. Additionally, modeled concentrations were highly sensitive to the temporal distribution of fire-related emissions. The analyses demonstrate that, in addition to adequate estimates of emitted mass, successfully modeling the impacts of fires on air quality depends on an accurate spatiotemporal allocation of emissions. Copyright © 2014 Elsevier B.V. All rights reserved.
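The vertical-allocation choice that drives much of the reported sensitivity can be illustrated with a toy scheme that spreads a fire's emitted mass uniformly up to an assumed injection height; the layer structure and numbers below are invented, not the study's model configuration.

```python
# Hypothetical model layer tops in meters above ground (an assumption).
LAYER_TOPS_M = [50, 150, 400, 900, 1600, 2500]

def allocate_emissions(total_mass, injection_top_m):
    """Spread emitted mass uniformly with height from the surface up to
    the injection top, returning the mass assigned to each layer."""
    masses, prev = [], 0.0
    for top in LAYER_TOPS_M:
        depth = max(0.0, min(top, injection_top_m) - prev)
        masses.append(total_mass * depth / injection_top_m)
        prev = top
    return masses

# Injection below vs. above a 900 m boundary layer: the same emitted mass
# either stays in the lowest layers or is lofted aloft, which is the kind
# of difference that changes simulated surface PM2.5.
shallow = allocate_emissions(100.0, 400.0)
deep = allocate_emissions(100.0, 1600.0)
print(shallow, deep)
```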
Rasch analysis of the carers quality of life questionnaire for parkinsonism.
Pillas, Marios; Selai, Caroline; Schrag, Anette
2017-03-01
To assess the psychometric properties of the Carers Quality of Life Questionnaire for Parkinsonism using a Rasch modeling approach and determine the optimal cut-off score. We performed a Rasch analysis of the survey answers of 430 carers of patients with atypical parkinsonism. All of the scale items demonstrated acceptable goodness of fit to the Rasch model. The scale was unidimensional and no notable differential item functioning was detected in the items regarding age and disease type. Rating categories were functioning adequately in all scale items. The scale had high reliability (.95) and construct validity and a high degree of precision, distinguishing between 5 distinct groups of carers with different levels of quality of life. A cut-off score of 62 was found to have the optimal screening accuracy based on Hospital Anxiety and Depression Scale subscores. The results suggest that the Carers Quality of Life Questionnaire for Parkinsonism is a useful scale to assess carers' quality of life and allows analyses requiring interval scaling of variables. © 2016 International Parkinson and Movement Disorder Society.
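The dichotomous Rasch model underlying this kind of analysis is compact enough to sketch. The item difficulties below are invented for illustration; they are not the published calibration of the questionnaire.

```python
import math

def rasch_prob(theta, b):
    """Rasch model: probability that a respondent with latent trait
    theta (in logits) endorses an item of difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def expected_score(theta, difficulties):
    """Expected raw score: the nonlinear raw-score-to-interval-scale
    mapping that Rasch analysis makes explicit."""
    return sum(rasch_prob(theta, b) for b in difficulties)

# Illustrative item difficulties (assumed, not the published values).
ITEMS = [-1.5, -0.5, 0.0, 0.5, 1.5]
for theta in (-2.0, 0.0, 2.0):
    print(theta, round(expected_score(theta, ITEMS), 2))
```

A cut-off on the raw score, like the 62 reported, corresponds to a threshold on theta through this expected-score curve.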
Mental models of audit and feedback in primary care settings.
Hysong, Sylvia J; Smitham, Kristen; SoRelle, Richard; Amspoker, Amber; Hughes, Ashley M; Haidet, Paul
2018-05-30
Audit and feedback has been shown to be instrumental in improving quality of care, particularly in outpatient settings. The mental model individuals and organizations hold regarding audit and feedback can moderate its effectiveness, yet this has received limited study in the quality improvement literature. In this study we sought to uncover patterns in mental models of current feedback practices within high- and low-performing healthcare facilities. We purposively sampled 16 geographically dispersed VA hospitals based on high and low performance on a set of chronic and preventive care measures. We interviewed up to 4 personnel from each location (n = 48) to determine the facility's receptivity to audit and feedback practices. Interview transcripts were analyzed via content and framework analysis to identify emergent themes. We found high variability in the mental models of audit and feedback, which we organized into positive and negative themes. We were unable to associate mental models of audit and feedback with clinical performance due to high variance in facility performance over time. Positive mental models exhibited perceived utility of audit and feedback practices in improving performance, whereas negative mental models did not. Results speak to the variability of mental models of feedback, highlighting how facilities perceive current audit and feedback practices. Findings are consistent with prior research in that variability in feedback mental models is associated with lower performance. Future research should seek to empirically link mental models revealed in this paper to high and low levels of clinical performance.
A guide to calculating habitat-quality metrics to inform conservation of highly mobile species
Bieri, Joanna A.; Sample, Christine; Thogmartin, Wayne E.; Diffendorfer, James E.; Earl, Julia E.; Erickson, Richard A.; Federico, Paula; Flockhart, D. T. Tyler; Nicol, Sam; Semmens, Darius J.; Skraber, T.; Wiederholt, Ruscena; Mattsson, Brady J.
2018-01-01
Many metrics exist for quantifying the relative value of habitats and pathways used by highly mobile species. Properly selecting and applying such metrics requires substantial background in mathematics and understanding the relevant management arena. To address this multidimensional challenge, we demonstrate and compare three measurements of habitat quality: graph-, occupancy-, and demographic-based metrics. Each metric provides insights into system dynamics, at the expense of increasing amounts and complexity of data and models. Our descriptions and comparisons of diverse habitat-quality metrics provide means for practitioners to overcome the modeling challenges associated with management or conservation of such highly mobile species. Whereas previous guidance for applying habitat-quality metrics has been scattered in diversified tracks of literature, we have brought this information together into an approachable format including accessible descriptions and a modeling case study for a typical example that conservation professionals can adapt for their own decision contexts and focal populations.
Considerations for Resource Managers: Management objectives, proposed actions, data availability and quality, and model assumptions are all relevant considerations when applying and interpreting habitat-quality metrics. Graph-based metrics answer questions related to habitat centrality and connectivity, are suitable for populations with any movement pattern, quantify basic spatial and temporal patterns of occupancy and movement, and require the least data. Occupancy-based metrics answer questions about likelihood of persistence or colonization, are suitable for populations that undergo localized extinctions, quantify spatial and temporal patterns of occupancy and movement, and require a moderate amount of data. Demographic-based metrics answer questions about relative or absolute population size, are suitable for populations with any movement pattern, quantify demographic processes and population dynamics, and require the most data. More real-world examples applying occupancy-based, agent-based, and continuous-based metrics to seasonally migratory species are needed to better understand challenges and opportunities for applying these metrics more broadly.
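A graph-based metric of the kind described, computed on a hypothetical network of habitat patches (nodes) and movement pathways (edges), might look like this standard-library sketch.

```python
from collections import deque

# Hypothetical patch network (assumed for illustration).
EDGES = [("A", "B"), ("B", "C"), ("B", "D"), ("D", "E")]

adj = {}
for u, v in EDGES:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

def degree_centrality(adj):
    """Patch importance: fraction of other patches directly linked."""
    n = len(adj)
    return {p: len(nbrs) / (n - 1) for p, nbrs in adj.items()}

def is_connected(adj):
    """Connectivity: whether every patch is reachable from any patch."""
    start = next(iter(adj))
    seen, queue = {start}, deque([start])
    while queue:
        for w in adj[queue.popleft()]:
            if w not in seen:
                seen.add(w)
                queue.append(w)
    return len(seen) == len(adj)

print(degree_centrality(adj), is_connected(adj))
```

Patch B is the hub in this toy network (centrality 0.75), the kind of result a manager might use to prioritize protection of a stopover site.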
Sharafi, Mastaneh; Rawal, Shristi; Fernandez, Maria Luz; Huedo-Medina, Tania B; Duffy, Valerie B
2018-05-08
Sensations from foods and beverages drive dietary choices, which in turn affect risk of diet-related diseases. Perception of these sensations varies with environmental and genetic influences. This observational study aimed to examine associations between chemosensory phenotype, diet and cardiovascular disease (CVD) risk. Reportedly healthy women (n = 110, average age 45 ± 9 years) participated in laboratory-based measures of chemosensory phenotype (taste and smell function, propylthiouracil (PROP) bitterness) and CVD risk factors (waist circumference, blood pressure, serum lipids). Diet variables included preference and intake of sweet/high-fat foods, dietary restraint, and diet quality based on reported preference (Healthy Eating Preference Index-HEPI) and intake (Healthy Eating Index-HEI). We found that females who reported high preference yet low consumption of sweet/high-fat foods had the highest dietary restraint and depressed quinine taste function. PROP nontasters were more likely to report lower diet quality; PROP supertasters were more likely to consume but not like a healthy diet. Multivariate structural models were fitted to identify predictors of CVD risk factors. Reliable latent taste (quinine taste function, PROP tasting) and smell (odor intensity) variables were identified, with taste explaining more variance in the CVD risk factors. Lower bitter taste perception was associated with elevated risk. In multivariate models, the HEPI completely mediated the taste-adiposity and taste-HDL associations and partially mediated the taste-triglyceride or taste-systolic blood pressure associations. The taste-LDL pathway was significant and direct. The HEI could not replace the HEPI in adequate models. However, using a latent diet quality variable combining the HEPI and HEI increased the strength of association between diet quality and adiposity or CVD risk factors.
In conclusion, bitter taste phenotype was associated with CVD risk factors via diet quality, particularly when assessed by level of food liking/disliking. Copyright © 2018 Elsevier Inc. All rights reserved.
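The mediation structure reported (taste phenotype acting on CVD risk factors through diet quality) can be sketched with simulated data; the effect sizes and the simple two-regression decomposition below are illustrative assumptions, not the fitted structural model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Simulated data with an assumed mediation chain:
# bitter-taste function -> diet quality (HEPI-like) -> risk factor.
taste = rng.normal(size=n)
diet = 0.5 * taste + rng.normal(size=n)
risk = -0.4 * diet + rng.normal(size=n)

def ols_slope(x, y):
    """Slope from simple least squares with an intercept."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

a = ols_slope(taste, diet)   # path: taste -> mediator
b = ols_slope(diet, risk)    # path: mediator -> outcome
indirect = a * b             # mediated (indirect) effect
print(round(a, 2), round(b, 2), round(indirect, 2))
```

A "complete" mediation as reported corresponds to the direct taste-to-risk path vanishing once the mediator is in the model.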
Murray, A; Montgomery, J E; Chang, H; Rogers, W H; Inui, T; Safran, D G
2001-07-01
To examine the differences in physician satisfaction associated with open- versus closed-model practice settings and to evaluate changes in physician satisfaction between 1986 and 1997. Open-model practices refer to those in which physicians accept patients from multiple health plans and insurers (i.e., do not have an exclusive arrangement with any single health plan). Closed-model practices refer to those wherein physicians have an exclusive relationship with a single health plan (i.e., staff- or group-model HMO). Two cross-sectional surveys of physicians; one conducted in 1986 (Medical Outcomes Study) and one conducted in 1997 (Study of Primary Care Performance in Massachusetts). Primary care practices in Massachusetts. General internists and family practitioners in Massachusetts. Seven measures of physician satisfaction, including satisfaction with quality of care, the potential to achieve professional goals, time spent with individual patients, total earnings from practice, degree of personal autonomy, leisure time, and incentives for high quality. Physicians in open- versus closed-model practices differed significantly in several aspects of their professional satisfaction. In 1997, open-model physicians were less satisfied than closed-model physicians with their total earnings, leisure time, and incentives for high quality. Open-model physicians reported significantly more difficulty with authorization procedures and reported more denials for care. Overall, physicians in 1997 were less satisfied in every aspect of their professional life than 1986 physicians. Differences were significant in three areas: time spent with individual patients, autonomy, and leisure time (P ≤ .05). Among open-model physicians, satisfaction with autonomy and time with individual patients were significantly lower in 1997 than 1986 (P ≤ .01).
Among closed-model physicians, satisfaction with total earnings and with potential to achieve professional goals were significantly lower in 1997 than in 1986 (P ≤ .01). This study finds that the state of physician satisfaction in Massachusetts is extremely low, with the majority of physicians dissatisfied with the amount of time they have with individual patients, their leisure time, and their incentives for high quality. Satisfaction with most areas of practice declined significantly between 1986 and 1997. Open-model physicians were less satisfied than closed-model physicians in most aspects of practices.
Katagiri, Ryoko; Asakura, Keiko; Kobayashi, Satomi; Suga, Hitomi; Sasaki, Satoshi
2014-01-01
Although workers with poor sleep quality are reported to have problems with work performance, few studies have assessed the association between dietary factors and sleep quality using validated indexes. Here, we examined this association using information acquired from validated questionnaires. A total of 3,129 female workers aged 34 to 65 years were analyzed. Dietary intake was assessed using a self-administered diet history questionnaire (DHQ), and subjective sleep quality was assessed using the Pittsburgh Sleep Quality Index (PSQI). The relationship between the intake of several food groups and nutrients and sleep quality was examined using multivariable logistic regression models. The effect of eating habits on sleep quality was also examined. Poor sleep quality was associated with low intake of vegetables (p for trend 0.002) and fish (p for trend 0.04) and high intake of confectionery (p for trend 0.004) and noodles (p for trend 0.03) after adjustment for potential confounding factors (age, body mass index, physical activity, depression score, employment status, alcohol intake and smoking status). Poor sleep quality was also significantly and positively associated with consumption of energy drinks and sugar-sweetened beverages, skipping breakfast, and eating irregularly. In addition, poor sleep quality was significantly associated with high carbohydrate intake (p for trend 0.03). A low intake of vegetables and fish, high intake of confectionery and noodles and unhealthy eating habits were independently associated with poor sleep quality. Poor sleep quality was also associated with high carbohydrate intake in free-living Japanese middle-aged female workers.
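A multivariable logistic regression of the kind used here can be sketched end-to-end on simulated data; the exposures, effect sizes, and the plain gradient-descent fit below are illustrative assumptions, not the study's model.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000

# Simulated standardized intakes with assumed true effects:
# vegetables protective, confectionery adverse.
veg = rng.normal(size=n)
sweets = rng.normal(size=n)
logit = -0.5 * veg + 0.4 * sweets
poor_sleep = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Multivariable logistic regression via plain gradient descent."""
    X = np.column_stack([np.ones(len(y)), X])
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

w = fit_logistic(np.column_stack([veg, sweets]), poor_sleep)
print(np.round(w, 2))  # intercept, vegetable and confectionery coefficients
```

The fitted signs (negative for vegetables, positive for confectionery) reproduce the direction of association reported; exponentiating the coefficients would give the odds ratios usually tabulated.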
Deldar, Kolsoum
2017-01-01
Background Teleconsultation enables virtual supervision by clinical professors of clinical decisions made by medical residents in teaching hospitals. The type, format, volume, and quality of exchanged information have a great influence on the quality of remote clinical decisions, or tele-decisions. Thus, it is necessary to develop a reliable and standard model for these clinical relationships. Objective The goal of this study was to design and evaluate a data model for teleconsultation in the management of high-risk pregnancies. Methods This study was implemented in three phases. In the first phase, a systematic review, a qualitative study, and a Delphi approach were conducted in selected teaching hospitals. Systematic extraction and localization of diagnostic items to develop the tele-decision clinical archetypes were performed as the second phase. Finally, the developed model was evaluated using predefined consultation scenarios. Results Our review study showed that present medical consultations have no specific structure or template for patient information exchange. Furthermore, there are many challenges in the remote medical decision-making process, some of which are related to the lack of such a structure. The evaluation phase of our research showed that data quality (P<.001), adequacy (P<.001), organization (P<.001), confidence (P<.001), and convenience (P<.001) scored higher in archetype-based consultation scenarios than in routine-based ones. Conclusions Our archetype-based model achieved higher scores in the data quality, adequacy, organization, confidence, and convenience dimensions than routine scenarios did. It is probable that the suggested archetype-based teleconsultation model may improve the quality of physician-physician remote medical consultations. PMID:29242181
NASA Astrophysics Data System (ADS)
Mirauda, D.; Ostoich, M.; Di Maria, F.; Benacchio, S.; Saccardo, I.
2018-03-01
In this paper, a mathematical model has been applied to a river in North-East Italy to describe vulnerability scenarios due to environmental pollution phenomena. The model, based on influence diagram theory, made it possible to identify the most critical factors, such as wastewater discharges, drainage of diffuse pollution from agriculture and climate change, which might affect the water quality of the river. The results showed that water quality conditions have improved thanks to continuous controls on the territory, following the application of Water Framework Directive 2000/60/EC. Nevertheless, some fluvial stretches did not reach the “good ecological status” by 2015, because of growing urban populations in recent years and the large influx of tourists during the summer months, which have not been matched by upgrades to treatment plants.
Chen, Sheng-Po; Wang, Chieh-Heng; Lin, Wen-Dian; Tong, Yu-Huei; Chen, Yu-Chun; Chiu, Ching-Jui; Chiang, Hung-Chi; Fan, Chen-Lun; Wang, Jia-Lin; Chang, Julius S
2018-05-01
The present study combines high-resolution measurements at various distances from a giant, world-class petrochemical complex with model simulations to test a method to assess industrial emissions and their effect on local air quality. Because wind conditions were complex and highly seasonal, the dominant wind flow patterns in the coastal region of interest were classified into three types based on six years of monitoring data, namely northeast monsoonal (NEM) flows, southwest monsoonal (SWM) flows and local circulation (LC). Sulfur dioxide (SO2) was chosen as an indicative pollutant for prominent industrial emissions. A high-density monitoring network of 12 air-quality stations distributed within a 20-km radius surrounding the petrochemical complex provided hourly measurements of SO2 and wind parameters. The SO2 emissions from major industrial sources registered by the monitoring network were then used to validate model simulations and to illustrate the transport of the SO2 plumes under the three typical wind patterns. It was found that the coupling of observations and modeling was able to successfully explain the transport of the industrial plumes. Although the petrochemical complex was seemingly the only major source to affect local air quality, multiple prominent sources from afar also played a significant role. As a result, we found that a more complete and balanced assessment of local air quality can be achieved only after taking into account the wind characteristics and emission factors of a much larger spatial scale than the initial (20 km by 20 km) study domain. Copyright © 2018 Elsevier Ltd. All rights reserved.
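Classifying hourly observations into the three flow types is, at its simplest, a rule on wind direction and speed. The direction windows and the 2 m/s calm threshold below are illustrative assumptions, not the classification actually used in the study.

```python
def classify_flow(direction_deg, speed_ms):
    """Label an hourly wind observation as NEM (northeast monsoonal),
    SWM (southwest monsoonal), or LC (local circulation)."""
    if speed_ms < 2.0:
        return "LC"   # weak synoptic forcing: local circulation dominates
    if 0 <= direction_deg <= 90:
        return "NEM"
    if 180 <= direction_deg <= 270:
        return "SWM"
    return "LC"

# Three hypothetical hourly observations: (direction in degrees, speed m/s).
obs = [(45, 6.0), (225, 5.0), (120, 1.0)]
labels = [classify_flow(d, s) for d, s in obs]
print(labels)  # ['NEM', 'SWM', 'LC']
```

Binning six years of hourly winds this way yields the regime frequencies against which plume-transport simulations can be grouped and evaluated.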
Buckner-Brown, Joyce; Sharify, Denise Tung; Blake, Bonita; Phillips, Tom; Whitten, Kathleen
2014-01-01
Background Residents of many cities lack affordable, quality housing. Economically disadvantaged neighborhoods often have high rates of poverty and crime, few institutions that enhance the quality of their residents’ lives, and unsafe environments for walking and other physical activity. Deteriorating housing contributes to asthma-related illness. We describe the redevelopment of High Point, a West Seattle neighborhood, to improve its built environment, increase neighborhood physical activity, and reduce indoor asthma triggers. Community Context High Point is one of Seattle’s most demographically diverse neighborhoods. Prior to redevelopment, it had a distressed infrastructure, rising crime rates, and indoor environments that increased asthma-related illness in children and adolescents. High Point residents and partners developed and implemented a comprehensive redevelopment plan to create a sustainable built environment to increase outdoor physical activity and improve indoor environments. Methods We conducted a retrospective analysis of the High Point redevelopment, organized by the different stages of change in the Community Readiness Model. We also examined the multisector partnerships among government and community groups that contributed to the success of the High Point project. Outcome Overall quality of life for residents improved as a result of neighborhood redevelopment. Physical activity increased, residents reported fewer days of poor physical or mental health, and social connectedness between neighbors grew. Asthma-friendly homes significantly decreased asthma-related illness among children and adolescents. Interpretation Providing affordable, quality housing to low-income families improved individual and neighborhood quality of life. Efforts to create social change and improve the health outcomes for entire populations are more effective when multiple organizations work together to improve neighborhood health. PMID:25376016
NASA Astrophysics Data System (ADS)
Quiers, M.; Perrette, Y.; Etienne, D.; Develle, A. L.; Jacq, K.
2017-12-01
The use of organic proxies is increasing in paleoenvironmental reconstructions from natural archives. Major advances have been achieved through the development of new, highly informative molecular proxies usually linked to specific compounds. While studies of targeted compounds offer a high degree of information, advances on bulk organic matter remain limited. However, this bulk is the main contributor to the carbon cycle and has been shown to drive the transfer and record of many mineral and organic compounds. The development of target proxies needs complementary information on bulk organic matter to understand biases linked to controlling factors or analytical methods, and to provide robust interpretations. Fluorescence methods have often been employed to characterize and quantify organic matter. However, these techniques were mainly developed for liquid samples, causing material and resolution loss when working on natural archives (either stalagmites or sediments). High-resolution solid-phase fluorescence (SPF) was developed on speleothems. This method now allows analysis of organic matter quality and quantity, provided procedures to constrain the optical density are adopted. Indeed, a calibration method using liquid-phase fluorescence (LPF) was developed for speleothems, allowing organic carbon to be quantified at high resolution. We report here an application of this SPF/LPF procedure to lake sediments. To avoid sediment matrix effects on the fluorescence signal, a calibration using LPF measurements was performed. First results using this method provided a high-resolution record of the quality of different organic matter compounds (humic-like, protein-like and chlorophyll-like) for the sediment core. High-resolution organic matter fluxes were then obtained by applying pragmatic chemometric models (non-linear models, partial least squares models) to the high-resolution fluorescence data.
The SPF method can be considered a promising tool for high-resolution records of organic matter quality and quantity. Potential applications of this method will be discussed (lake ecosystem dynamics, changes in trophic levels).
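The SPF-to-organic-carbon calibration step can be sketched as a linear fit against paired LPF measurements; all intensities and concentrations below are invented for illustration.

```python
import numpy as np

# Paired measurements on calibration samples (invented values):
# solid-phase fluorescence intensity vs. liquid-phase organic carbon.
spf_intensity = np.array([120.0, 180.0, 260.0, 340.0, 410.0])  # counts
lpf_toc = np.array([1.1, 1.7, 2.5, 3.3, 4.0])                  # mg C/L

# Linear calibration fitted to the paired data.
slope, intercept = np.polyfit(spf_intensity, lpf_toc, 1)

def spf_to_toc(intensity):
    """Convert a high-resolution SPF reading to an organic-carbon
    estimate using the LPF-based calibration."""
    return slope * intercept * 0 + slope * intensity + intercept

print(round(spf_to_toc(300.0), 2))
```

Once calibrated, every SPF point along the core can be mapped to a carbon estimate, which is what turns the optical record into a quantitative flux record.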
NASA Astrophysics Data System (ADS)
Deanes, L. N.; Ahmadov, R.; McKeen, S. A.; Manross, K.; Grell, G. A.; James, E.
2016-12-01
Wildfires are increasing in number and size in the western United States as climate change contributes to warmer and drier conditions in this region. These fires lead to poor air quality and diminished visibility. The High Resolution Rapid Refresh-Smoke modeling system (HRRR-Smoke) is designed to simulate fire emissions and smoke transport with high resolution. The model is based on the Weather Research and Forecasting model, coupled with chemistry (WRF-Chem), and uses fire detection data from the Visible Infrared Imaging Radiometer Suite (VIIRS) satellite instrument to simulate wildfire emissions and their plume rise. HRRR-Smoke is used in both real-time applications and case studies. In this study, we evaluate HRRR-Smoke for August 2015, during one of the worst wildfire seasons on record in the United States, by focusing on wildfires that occurred in the northwestern US. We compare HRRR-Smoke simulations with hourly fine particulate matter (PM2.5) observations from the Air Quality System (https://www.epa.gov/aqs) from multiple air quality monitoring sites in Washington state. PM2.5 data include measurements from urban, suburban and remote sites in the state. We discuss the model performance in capturing large PM2.5 enhancements detected at surface sites due to wildfires. We present various statistical parameters to demonstrate HRRR-Smoke's performance in simulating surface PM2.5 levels.
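The statistical parameters typically reported in such an evaluation (mean bias, RMSE, and correlation between modeled and observed hourly PM2.5) can be sketched as follows; the two series are invented, not HRRR-Smoke output.

```python
import numpy as np

# Invented hourly PM2.5 series (ug/m3) at one monitoring site.
obs = np.array([8.0, 12.0, 35.0, 80.0, 55.0, 20.0])   # observations
mod = np.array([10.0, 15.0, 28.0, 95.0, 47.0, 18.0])  # model analogue

bias = np.mean(mod - obs)                   # mean bias
rmse = np.sqrt(np.mean((mod - obs) ** 2))   # root-mean-square error
r = np.corrcoef(obs, mod)[0, 1]             # Pearson correlation

print(round(bias, 2), round(rmse, 2), round(r, 3))
```

A near-zero bias with a large RMSE, as in this toy case, is the signature of a model that captures smoke enhancements on average but mistimes or misplaces individual plumes.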
Reyes, Jeanette M; Xu, Yadong; Vizuete, William; Serre, Marc L
2017-01-01
The regulatory Community Multiscale Air Quality (CMAQ) model is a means to understanding the sources, concentrations and regulatory attainment of air pollutants within a model's domain. Substantial resources are allocated to the evaluation of model performance. The Regionalized Air quality Model Performance (RAMP) method introduced here explores novel ways of visualizing and evaluating CMAQ model performance and errors for daily Particulate Matter ≤ 2.5 micrometers (PM2.5) concentrations across the continental United States. The RAMP method performs a non-homogeneous, non-linear, non-homoscedastic model performance evaluation at each CMAQ grid. This work demonstrates that CMAQ model performance, for a well-documented 2001 regulatory episode, is non-homogeneous across space/time. The RAMP correction of systematic errors outperforms other model evaluation methods as demonstrated by a 22.1% reduction in Mean Square Error compared to a constant domain-wide correction. The RAMP method is able to accurately reproduce simulated performance with a correlation of r = 76.1%. Most of the error coming from CMAQ is random error, with only a minority being systematic. Areas of high systematic error are collocated with areas of high random error, implying both error types originate from similar sources. Therefore, addressing underlying causes of systematic error will have the added benefit of also addressing underlying causes of random error.
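The split between systematic and random error at a single grid cell, central to the idea that correcting systematic error reduces MSE, follows from the identity MSE = (mean error)² + var(error). The paired values below are invented, not CMAQ output.

```python
import numpy as np

# Invented paired daily PM2.5 values at one grid cell (ug/m3).
obs = np.array([14.0, 9.0, 22.0, 17.0, 11.0])
cmaq = np.array([18.0, 13.0, 25.0, 22.0, 14.0])

err = cmaq - obs
mse = np.mean(err ** 2)
systematic = np.mean(err) ** 2   # squared mean bias
random_part = np.var(err)        # spread of errors around the bias

# A per-cell constant correction removes only the systematic part.
mse_corrected = np.mean((cmaq - np.mean(err) - obs) ** 2)
print(round(mse, 2), round(systematic, 2), round(random_part, 2),
      round(mse_corrected, 2))
```

In this toy cell the error is mostly systematic, so the bias correction removes almost all of the MSE; the paper reports the opposite regime for CMAQ, where random error dominates and the achievable reduction is correspondingly smaller.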
NASA Astrophysics Data System (ADS)
Brasington, James; James, Joe; Cook, Simon; Cox, Simon; Lotsari, Eliisa; McColl, Sam; Lehane, Niall; Williams, Richard; Vericat, Damia
2016-04-01
In recent years, 3D terrain reconstructions based on Structure-from-Motion photogrammetry have dramatically democratized the availability of high quality topographic data. This approach involves the use of a non-linear bundle adjustment to estimate simultaneously camera position, pose, distortion and 3D model coordinates. In contrast to traditional aerial photogrammetry, the bundle adjustment is typically solved without external constraints; instead, ground control is used a posteriori to transform the modelled coordinates to an established datum using a similarity transformation. The limited data requirements, coupled with the ability to self-calibrate compact cameras, have led to a burgeoning of applications using low-cost imagery acquired terrestrially or from low-altitude platforms. To date, most applications have focused on relatively small spatial scales (0.1-5 ha), where relaxed logistics permit the use of dense ground control networks and high resolution, close-range photography. It is less clear whether this low-cost approach can be successfully upscaled to tackle larger, watershed-scale projects extending over 10²-10³ km², where it could offer a competitive alternative to established landscape modelling with airborne lidar. At such scales, compromises over the density of ground control, the speed and height of the sensor platform and related image properties are inevitable. In this presentation we provide a systematic assessment of the quality of large-scale SfM terrain products derived for over 80 km² of the braided Dart River and its catchment in the Southern Alps of NZ. Reference data in the form of airborne and terrestrial lidar are used to quantify the quality of 3D reconstructions derived from helicopter photography and to establish baseline uncertainty models for geomorphic change detection.
Results indicate that camera network design is a key determinant of model quality, and that standard aerial photogrammetric networks based on strips of nadir photography can lead to unstable camera calibration and systematic errors that are difficult to model with sparse ground control. We demonstrate how a low cost multi-camera platform providing both nadir and oblique imagery can support robust camera calibration, enabling the generation of high quality, large-scale terrain products that are suitable for precision fluvial change detection.
Perceptual quality estimation of H.264/AVC videos using reduced-reference and no-reference models
NASA Astrophysics Data System (ADS)
Shahid, Muhammad; Pandremmenou, Katerina; Kondi, Lisimachos P.; Rossholm, Andreas; Lövström, Benny
2016-09-01
Reduced-reference (RR) and no-reference (NR) models for video quality estimation, using features that account for the impact of coding artifacts, spatio-temporal complexity, and packet losses, are proposed. The purpose of this study is to analyze a number of potentially quality-relevant features in order to select the most suitable set of features for building the desired models. The proposed sets of features have not been used in the literature and some of the features are used for the first time in this study. The features are employed by the least absolute shrinkage and selection operator (LASSO), which selects only the most influential of them toward perceptual quality. For comparison, we apply feature selection in the complete feature sets and ridge regression on the reduced sets. The models are validated using a database of H.264/AVC encoded videos that were subjectively assessed for quality in an ITU-T compliant laboratory. We infer that just two features selected by RR LASSO and two bitstream-based features selected by NR LASSO are able to estimate perceptual quality with high accuracy, higher than that of ridge, which uses more features. The comparisons with competing works and two full-reference metrics also verify the superiority of our models.
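A minimal sketch of LASSO-based feature selection of the kind described, using scikit-learn on synthetic data (the feature set, weights, and noise level are invented; only the first two features actually drive the synthetic quality score):

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(42)

# Hypothetical quality-relevant features for 200 video clips.
X = rng.normal(size=(200, 10))
mos = 3.0 + 1.5 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(0.0, 0.1, size=200)

# The L1 penalty shrinks uninformative coefficients exactly to zero,
# so the surviving nonzero coefficients are the selected features.
lasso = Lasso(alpha=0.1).fit(X, mos)
selected = np.flatnonzero(np.abs(lasso.coef_) > 1e-6)
print("selected feature indices:", selected)
```

This is the mechanism by which LASSO keeps "only the most influential" features, in contrast to ridge regression, whose L2 penalty shrinks coefficients but leaves all of them nonzero.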
Farkas, Caroline M; Moeller, Michael D; Felder, Frank A; Henderson, Barron H; Carlton, Annmarie G
2016-08-02
On high electricity demand days, when air quality is often poor, regional transmission organizations (RTOs), such as PJM Interconnection, ensure reliability of the grid by employing peak-use electric generating units (EGUs). These "peaking units" are exempt from some federal and state air quality rules. We identify RTO assignment and peaking unit classification for EGUs in the Eastern U.S. and estimate air quality for four emission scenarios with the Community Multiscale Air Quality (CMAQ) model during the July 2006 heat wave. Further, we population-weight ambient values as a surrogate for potential population exposure. Emissions from electricity reliability networks negatively impact air quality in their own region and in neighboring geographic areas. Monitored and controlled PJM peaking units are generally located in economically depressed areas and can contribute up to 87% of hourly maximum PM2.5 mass locally. Potential population exposure to peaking unit PM2.5 mass is highest in the model domain's most populated cities. Average daily temperature and national gross domestic product steer peaking unit heat input. Air quality planning that capitalizes on a priori knowledge of local electricity demand and economics may provide a more holistic approach to protect human health within the context of growing energy needs in a changing world.
Sentinel site data for model improvement – Definition and characterization
USDA-ARS?s Scientific Manuscript database
Crop models are increasingly being used to assess the impacts of future climate change on production and food security. High quality site-specific data on weather, soils, management, and cultivar are needed for those model applications. Also important, is that model development, evaluation, improvem...
Application of Wavelet Filters in an Evaluation of Photochemical Model Performance
Air quality model evaluation can be enhanced with time-scale specific comparisons of outputs and observations. For example, high-frequency (hours to one day) time scale information in observed ozone is not well captured by deterministic models and its incorporation into model pe...
Ribeiro, Manuel C; Pinho, P; Branquinho, C; Llop, Esteve; Pereira, Maria J
2016-08-15
In most studies correlating health outcomes with air pollution, personal exposure assignments are based on measurements collected at air-quality monitoring stations not coinciding with health data locations. In such cases, interpolators are needed to predict air quality in unsampled locations and to assign personal exposures. Moreover, a measure of the spatial uncertainty of exposures should be incorporated, especially in urban areas where concentrations vary at short distances due to changes in land use and pollution intensity. These studies are limited by the lack of literature comparing exposure uncertainty derived from distinct spatial interpolators. Here, we addressed these issues with two interpolation methods: regression Kriging (RK) and ordinary Kriging (OK). These methods were used to generate air-quality simulations with a geostatistical algorithm. For each method, the geostatistical uncertainty was drawn from generalized linear model (GLM) analysis. We analyzed the association between air quality and birth weight. Personal health data (n=227) and exposure data were collected in Sines (Portugal) during 2007-2010. Because air-quality monitoring stations in the city do not offer high-spatial-resolution measurements (n=1), we used lichen data as an ecological indicator of air quality (n=83). We found no significant difference in the fit of GLMs with either of the geostatistical methods. With RK, however, the models more often fit better and less often fit worse. Moreover, the geostatistical uncertainty results showed a marginally higher mean and precision with RK. Combined with lichen data and land-use data of high spatial resolution, RK is a more effective geostatistical method for relating health outcomes with air quality in urban areas. This is particularly important in small cities, which generally do not have expensive air-quality monitoring stations with high spatial resolution. 
Further, alternative ways of linking human activities with their environment are needed to improve human well-being. Copyright © 2016 Elsevier B.V. All rights reserved.
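Ordinary kriging, one of the two interpolators compared above, solves a small linear system for the interpolation weights, with a Lagrange multiplier enforcing that the weights sum to one. A sketch with an assumed spherical variogram (all parameters and coordinates are invented):

```python
import numpy as np

def ok_weights(coords, target, sill=1.0, rng_a=10.0, nugget=0.0):
    """Ordinary-kriging weights at `target` from sample `coords`,
    using a spherical variogram with assumed parameters."""
    def gamma(h):
        h = np.asarray(h, dtype=float)
        g = nugget + (sill - nugget) * (1.5 * h / rng_a - 0.5 * (h / rng_a) ** 3)
        return np.where(h >= rng_a, sill, np.where(h == 0, 0.0, g))

    n = len(coords)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    # Kriging system: [Gamma 1; 1' 0] [w; mu] = [gamma0; 1]
    A = np.ones((n + 1, n + 1)); A[:n, :n] = gamma(d); A[n, n] = 0.0
    b = np.ones(n + 1); b[:n] = gamma(np.linalg.norm(coords - target, axis=1))
    return np.linalg.solve(A, b)[:n]   # weights sum to 1 by construction

coords = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0], [4.0, 4.0]])
w = ok_weights(coords, np.array([2.0, 2.0]))
print(w, w.sum())
```

Regression kriging differs only in first fitting a trend on covariates (e.g. land use) and kriging the residuals with the same machinery.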
Measuring housing quality in the absence of a monetized real estate market.
Rindfuss, Ronald R; Piotrowski, Martin; Thongthai, Varachai; Prasartkul, Pramote
2007-03-01
Measuring housing quality or value or both has been a weak component of demographic and development research in less developed countries that lack an active real estate (housing) market. We describe a new method based on a standardized subjective rating process. It is designed to be used in settings that do not have an active, monetized housing market. The method is applied in an ongoing longitudinal study in north-east Thailand and could be straightforwardly used in many other settings. We develop a conceptual model of the process whereby households come to reside in high-quality or low-quality housing units. We use this theoretical model in conjunction with longitudinal data to show that the new method of measuring housing quality behaves as theoretically expected, thus providing evidence of face validity.
Assessing hypotheses about nesting site occupancy dynamics
Bled, Florent; Royle, J. Andrew; Cam, Emmanuelle
2011-01-01
Hypotheses about habitat selection developed in the evolutionary ecology framework assume that individuals, under some conditions, select breeding habitat based on expected fitness in different habitats. The relationship between habitat quality and fitness may be reflected by the breeding success of individuals, which may in turn be used to assess habitat quality. Habitat quality may also be assessed via local density: if high-quality sites are preferentially used, high density may reflect high-quality habitat. Here we assessed whether site occupancy dynamics vary with site surrogates for habitat quality. We modeled nest site use probability in a seabird subcolony (the Black-legged Kittiwake, Rissa tridactyla) over a 20-year period. We estimated site persistence (an occupied site remains occupied from time t to t + 1) and colonization through two subprocesses: first colonization (site creation at the timescale of the study) and recolonization (a site is colonized again after being deserted). Our model explicitly incorporated site-specific and neighboring breeding success and conspecific density in the neighborhood. Our results provided evidence that reproductively "successful" sites have a higher persistence probability than "unsuccessful" ones. Analyses of site fidelity in marked birds and of survival probability showed that high site persistence predominantly reflects site fidelity, not immediate colonization by new owners after emigration or death of previous owners. There is a negative quadratic relationship between local density and persistence probability. First colonization probability decreases with density, whereas recolonization probability is constant. This highlights the importance of distinguishing initial colonization and recolonization to understand site occupancy. All dynamics varied positively with neighboring breeding success. We found evidence of a positive interaction between site-specific and neighboring breeding success. 
We addressed local population dynamics using a site occupancy approach integrating hypotheses developed in behavioral ecology to account for individual decisions. This allows development of models of population and metapopulation dynamics that explicitly incorporate ecological and evolutionary processes.
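The persistence / first-colonization / recolonization decomposition above can be illustrated with a toy site-occupancy simulation (the rates are invented, not the paper's estimates):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy occupancy dynamics with the three rates distinguished above:
# persistence phi, first colonization gamma1 (never-used sites),
# recolonization gamma2 (previously deserted sites).
phi, gamma1, gamma2 = 0.85, 0.05, 0.15
n_sites, n_years = 500, 20

occupied = np.zeros(n_sites, dtype=bool)
occupied[:100] = True          # initial subcolony
ever_used = occupied.copy()

occupancy = []
for _ in range(n_years):
    # Each site's probability of being occupied next year depends on
    # its current state and its colonization history.
    p = np.where(occupied, phi, np.where(ever_used, gamma2, gamma1))
    occupied = rng.random(n_sites) < p
    ever_used |= occupied
    occupancy.append(occupied.mean())
print(f"final occupancy: {occupancy[-1]:.2f}")
```

Separating gamma1 from gamma2, as the paper argues, matters because the two pools of unoccupied sites can have very different colonization probabilities.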
Narinc, D; Aygun, A; Karaman, E; Aksoy, T
2015-07-01
The objective of the present study was to estimate heritabilities as well as genetic and phenotypic correlations for egg weight, specific gravity, shape index, shell ratio, egg shell strength, egg length, egg width and shell weight in Japanese quail eggs. External egg quality traits were measured on 5864 eggs of 934 female quails from a dam line selected for two generations. Within the Bayesian framework, using the Gibbs sampling algorithm, a multivariate animal model was applied to estimate heritabilities and genetic correlations for external egg quality traits. The heritability estimates for external egg quality traits were moderate to high, ranging from 0.29 to 0.81. The heritability estimates for egg and shell weight, 0.81 and 0.76, were fairly high. The genetic and phenotypic correlations of egg shell strength with specific gravity, shell ratio and shell weight, ranging from 0.55 to 0.79, were relatively high. It can be concluded that egg shell quality can be assessed using egg specific gravity, given its high heritability and its fairly high positive correlations with most of the egg shell quality traits. As a result, egg specific gravity may be preferred over other external egg traits as a selection criterion for genetic improvement of egg shell quality in Japanese quails.
Gómez-García, Francisco; Ruano, Juan; Aguilar-Luque, Macarena; Alcalde-Mellado, Patricia; Gay-Mimbrera, Jesús; Hernández-Romero, José Luis; Sanz-Cabanillas, Juan Luis; Maestre-López, Beatriz; González-Padilla, Marcelino; Carmona-Fernández, Pedro J; García-Nieto, Antonio Vélez; Isla-Tejera, Beatriz
2017-12-29
Article summaries' information and structure may influence researchers/clinicians' decisions to conduct deeper full-text analyses. Specifically, abstracts of systematic reviews (SRs) and meta-analyses (MA) should provide structured summaries for quick assessment. This study explored a method for determining the methodological quality and bias risk of full-text reviews using abstract information alone. Systematic literature searches for SRs and/or MA about psoriasis were undertaken on MEDLINE, EMBASE, and Cochrane database. For each review, quality, abstract-reporting completeness, full-text methodological quality, and bias risk were evaluated using Preferred Reporting Items for Systematic Reviews and Meta-analyses for abstracts (PRISMA-A), Assessing the Methodological Quality of Systematic Reviews (AMSTAR), and ROBIS tools, respectively. Article-, author-, and journal-derived metadata were systematically extracted from eligible studies using a piloted template, and explanatory variables concerning abstract-reporting quality were assessed using univariate and multivariate-regression models. Two classification models concerning SRs' methodological quality and bias risk were developed based on per-item and total PRISMA-A scores and decision-tree algorithms. This work was supported, in part, by project ICI1400136 (JR). No funding was received from any pharmaceutical company. This study analysed 139 SRs on psoriasis interventions. On average, they featured 56.7% of PRISMA-A items. The mean total PRISMA-A score was significantly higher for high-methodological-quality SRs than for moderate- and low-methodological-quality reviews. SRs with low-bias risk showed higher total PRISMA-A values than reviews with high-bias risk. In the final model, only 'authors per review > 6' (OR: 1.098; 95%CI: 1.012-1.194), 'academic source of funding' (OR: 3.630; 95%CI: 1.788-7.542), and 'PRISMA-endorsed journal' (OR: 4.370; 95%CI: 1.785-10.98) predicted PRISMA-A variability. 
Reviews with a total PRISMA-A score < 6, lacking identification as an SR or MA in the title, and lacking an explanation of the bias risk assessment methods were classified as low methodological quality. Abstracts with a total PRISMA-A score ≥ 9, including main outcome results and an explanation of the bias risk assessment method, were classified as having low bias risk. The methodological quality and bias risk of SRs may thus be determined by analysing the quality and completeness of their abstracts. Our proposal aims to facilitate the evaluation of synthesised evidence by clinical professionals lacking methodological skills. External validation is necessary.
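At their simplest, the reported classification rules reduce to thresholds on the total PRISMA-A score plus item checks. A sketch (the boolean flags stand in for the per-item scores used in the actual decision-tree models):

```python
# Sketch of the abstract-level screening rules reported above; the
# boolean arguments are simplified stand-ins for PRISMA-A items.
def classify_abstract(total_score, sr_or_ma_in_title,
                      explains_bias_method, reports_main_outcomes):
    low_quality = (total_score < 6 and not sr_or_ma_in_title
                   and not explains_bias_method)
    low_bias_risk = (total_score >= 9 and reports_main_outcomes
                     and explains_bias_method)
    return low_quality, low_bias_risk

print(classify_abstract(5, False, False, False))   # (True, False)
print(classify_abstract(10, True, True, True))     # (False, True)
```

The study's actual classifiers were decision trees fitted on per-item and total scores; this sketch only encodes the two summary rules quoted in the abstract.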
Improving NIR model for the prediction of cotton fiber strength
USDA-ARS?s Scientific Manuscript database
Cotton fiber strength is an important quality characteristic that is directly related to the manufacturing of quality consumer goods. Currently, two types of instruments have been implemented to assess cotton fiber strength, namely, the automation oriented high volume instrument (HVI) and the labora...
NASA Astrophysics Data System (ADS)
Hong, E.; Park, Y.; Muirhead, R.; Jeong, J.; Pachepsky, Y. A.
2017-12-01
Pathogenic microorganisms in recreational and irrigation waters remain a subject of concern. Water quality models are used to estimate the microbial quality of water sources, to evaluate microbial contamination-related risks, to guide microbial water quality monitoring, and to evaluate the effect of agricultural management on microbial water quality. The Agricultural Policy/Environmental eXtender (APEX) is a watershed-scale water quality model that includes a highly detailed representation of agricultural management. APEX currently does not have microbial fate and transport simulation capabilities. The objective of this work was to develop the first APEX microbial fate and transport module that could use the APEX conceptual model of manure removal together with recently introduced conceptualizations of in-stream microbial fate and transport. The module utilizes the manure erosion rates found in APEX. Bacteria survival in the soil-manure mixing layer was simulated with a two-stage survival model. Individual survival patterns were simulated for each manure application date. Simulated in-stream microbial fate and transport processes included the reach-scale passive release of bacteria with resuspended bottom sediment during high flow events, the transport of bacteria from bottom sediment due to hyporheic exchange during low flow periods, deposition with settling sediment, and two-stage survival. Default parameter values were available from recently published databases. The APEX model with the newly developed microbial fate and transport module was applied to simulate seven years of monitoring data for the Toenepi watershed in New Zealand. Based on calibration and testing results, APEX with the microbe module reproduced the monitored pattern of E. coli concentrations at the watershed outlet well. 
The APEX with the microbial fate and transport module will be utilized for predicting microbial quality of water under various agricultural practices, evaluating monitoring protocols, and supporting the selection of management practices based on regulations that rely on fecal indicator bacteria concentrations.
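One common form of the two-stage survival model mentioned above is biphasic exponential die-off, with a fast-decaying and a slow-decaying subpopulation. The functional form and parameter values here are illustrative, not the module's defaults:

```python
import numpy as np

def biphasic_survival(t, c0, f_fast, k_fast, k_slow):
    """Two-stage (biphasic) die-off: a fraction f_fast of the initial
    population decays at rate k_fast, the remainder at k_slow."""
    return c0 * (f_fast * np.exp(-k_fast * t)
                 + (1.0 - f_fast) * np.exp(-k_slow * t))

t = np.arange(0, 31)   # days since manure application
conc = biphasic_survival(t, c0=1e6, f_fast=0.9, k_fast=0.8, k_slow=0.05)
print(conc[0], conc[30])
```

Because each manure application date gets its own survival curve, the watershed-scale concentration at any time is the sum of several such curves at different stages of decay.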
Analysis and modeling of atmospheric turbulence on the high-resolution space optical systems
NASA Astrophysics Data System (ADS)
Lili, Jiang; Chen, Xiaomei; Ni, Guoqiang
2016-09-01
Modeling and simulation of optical remote sensing systems play an important role in remote sensing mission prediction, imaging system design, and image quality assessment, and have become a hot research topic at home and abroad. The influence of atmospheric turbulence on optical systems has received increasing attention as remote sensing technologies have developed. In order to study the influence of atmospheric turbulence on Earth observation systems, the atmospheric structure parameter was calculated using a weak atmospheric turbulence model; the relationship between the atmospheric coherence length and high-resolution remote sensing optical systems was established; the influence of atmospheric turbulence, through the coherence length r0(h), on the ground resolution of optical remote sensing systems was then derived; finally, the effect of atmospheric turbulence on the imaging quality of high-resolution optical systems at different orbit heights was analyzed. The results show that for high-resolution remote sensing optical systems, whose resolution has reached the sub-meter level (0.5 m, 0.35 m, and even 0.15 m) in recent years, the degradation of image quality caused by atmospheric turbulence will be quite serious, and the influence of the atmospheric turbulence must be corrected. Simulation algorithms for the PSF are presented based on the above results, and experimental and analytical results are presented.
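The qualitative conclusion, that turbulence dominates once the coherence length r0 is smaller than the aperture, can be illustrated with the standard seeing-limited resolution formula (all numbers below are invented, not the paper's values, and the simple λ/r0 scaling ignores the vertical distribution of turbulence for a downward-looking system):

```python
# Illustrative comparison of diffraction-limited vs. turbulence-limited
# ground resolution for a space optical system.
lam = 0.55e-6   # wavelength, m (visible)
h = 500e3       # orbit height, m
D = 0.5         # aperture diameter, m
r0 = 0.10       # Fried parameter (atmospheric coherence length), m

gsd_diffraction = 1.22 * lam * h / D    # diffraction limit projected to ground
gsd_turbulence = 1.22 * lam * h / r0    # seeing limit when r0 < D

print(f"diffraction-limited: {gsd_diffraction:.3f} m")
print(f"turbulence-limited:  {gsd_turbulence:.3f} m")
```

With these assumed numbers the aperture alone would support sub-meter imaging, but the seeing limit is several meters, which is why sub-meter systems require turbulence correction.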
NASA Astrophysics Data System (ADS)
Wang, Zu-liang; Zhang, Ting; Xie, Shi-yang
2017-01-01
In order to improve agricultural tracing efficiency and reduce tracking and monitoring costs, agricultural product quality tracking and tracing based on Radio-Frequency Identification (RFID) technology is studied, and a tracing and tracking model is set up. A three-layer structure model is established to realize high-quality traceability and tracking of agricultural products. To solve collision problems among multiple RFID tags and improve identification efficiency, a new reservation slot allocation mechanism is proposed, and its parameters are analyzed and optimized by numerical simulation.
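For context, the classic baseline that reservation-based slot allocation improves upon is framed slotted ALOHA, whose throughput peaks near 1/e when the tag count matches the frame size. A toy simulation of that baseline (not the paper's proposed mechanism):

```python
import random

def fsa_round(num_tags, frame_size, rng):
    """One frame of framed slotted ALOHA: each tag picks a slot at
    random; slots holding exactly one tag are successful reads."""
    slots = [0] * frame_size
    for _ in range(num_tags):
        slots[rng.randrange(frame_size)] += 1
    return sum(1 for s in slots if s == 1)

rng = random.Random(7)
reads = [fsa_round(num_tags=64, frame_size=64, rng=rng) for _ in range(500)]
throughput = sum(reads) / (500 * 64)   # successful slots per slot
print(f"throughput ~ {throughput:.3f}")
```

Theory gives roughly (1 - 1/F)^(n-1) ≈ 1/e ≈ 0.368 per slot at n = F; reservation mechanisms aim to beat this ceiling by avoiding repeated random collisions.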
Using uncertainty quantification, we aim to improve the quality of modeling data from high throughput screening assays for use in risk assessment. ToxCast is a large-scale screening program that analyzes thousands of chemicals using over 800 assays representing hundreds of bioche...
NASA Astrophysics Data System (ADS)
Farid, Sidra; Stroscio, Michael A.; Dutta, Mitra
2018-03-01
The thermal evaporation growth technique is presented as a route to grow cost-effective, high quality CdS thin films. We have successfully grown high quality CdS thin films on ITO-coated glass substrates by thermal evaporation and analyzed the effects of annealing and of excitation-dependent input on the CdS thin films using Raman and photoluminescence spectroscopy. LO phonon modes have been analyzed quantitatively, considering the contributions of anneal-induced effects on film quality, using a phonon spatial correlation model, line shape and defect state analysis. Asymmetry in the Raman line shape towards the low-frequency side is related to phonon confinement effects and is modeled by the spatial correlation model. Calculations of the width (FWHM), integrated intensity, and line shape for the longitudinal optical (LO) phonon modes indicate improved crystalline quality for the annealed films compared to the as-grown films. With increasing laser power, the intensity ratio of the 2-LO to 1-LO phonon modes is found to increase, while multiple overtones up to fourth order are observed. Power-dependent photoluminescence data indicate a direct band-to-band transition in the CdS thin films.
Dedicated education unit: student perspectives.
Nishioka, Vicki M; Coe, Michael T; Hanita, Makoto; Moscato, Susan R
2014-01-01
The study compared students' perceptions of their clinical learning experiences in a dedicated education unit (DEU) with their experiences in traditional clinical education. Unlike traditional academic-instructor models, expert nurses in the DEU provide clinical education to students with faculty support. This repeated measures design used student surveys, supplemented by focus group data. Students were more likely to agree that their clinical learning experience was high quality and that they had a consistent mentoring relationship during DEU rotations. Students also reported that the quality of the unit's learning environment, the leadership style of the nurse manager, and the nursing care on the unit were more favorable in DEUs than in traditional units. Consistent with their changed role in DEUs, faculty members were less active in helping students integrate theory and practice. These findings provide additional evidence of the value that the DEU model contributes to high-quality clinical education.
Mateescu, R G; Oltenacu, P A; Garmyn, A J; Mafi, G G; VanOverbeke, D L
2016-05-01
Product quality is a high priority for the beef industry because it is a major driver of consumer demand for beef and because the industry has the ability to improve it. A 2-prong approach based on implementation of a genetic program to improve eating quality and a system to communicate eating quality and increase the probability that consumers' eating quality expectations are met is outlined. The objectives of this study were 1) to identify the best carcass and meat composition traits to be used in a selection program to improve eating quality and 2) to develop a relatively small number of classes that reflect real and perceptible differences in eating quality that can be communicated to consumers, and to identify a subset of carcass and meat composition traits with the highest predictive accuracy across all eating quality classes. Carcass traits and meat composition, including Warner-Bratzler shear force (WBSF), intramuscular fat content (IMFC), trained sensory panel scores, and mineral composition traits, of 1,666 Angus cattle were used in this study. Three eating quality indexes, EATQ1, EATQ2, and EATQ3, were generated by using different weights for the sensory traits (emphasis on tenderness, flavor, and juiciness, respectively). The best model for predicting eating quality explained 37%, 9%, and 19% of the variability of EATQ1, EATQ2, and EATQ3, and 2 traits, WBSF and IMFC, accounted for most of the variability explained by the best models. EATQ1 combines tenderness, juiciness, and flavor assessed by trained panels with 0.60, 0.15, and 0.25 weights, best describes North American consumers, and has a moderate heritability (0.18 ± 0.06). A selection index (I = -0.5[WBSF] + 0.3[IMFC]) based on phenotypic and genetic variances and covariances can be used to improve eating quality as a correlated trait. 
The 3 indexes (EATQ1, EATQ2, and EATQ3) were used to generate 3 equal (33.3%) low, medium, and high eating quality classes, and linear combinations of traits that best predict class membership were estimated using a predictive discriminant analysis. The best predictive model to classify new observations into low, medium, and high eating quality classes defined by the EATQ1 index included WBSF, IMFC, HCW, and marbling score and resulted in a total error rate of 47.06%, much lower than the 60.74% error rate when the prediction of class membership was based on the USDA grading system. The 2 best predictors were WBSF and IMFC, and they accounted for 97.2% of the variability explained by the best model.
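The selection index above is directly computable; a sketch ranking hypothetical candidate animals (the trait values are invented):

```python
def eatq_index(wbsf, imfc):
    # I = -0.5[WBSF] + 0.3[IMFC]: lower shear force (more tender) and
    # more intramuscular fat both raise the index.
    return -0.5 * wbsf + 0.3 * imfc

# Hypothetical candidates: (WBSF in kg, IMFC in %).
candidates = {"A": (3.2, 6.5), "B": (4.5, 4.0), "C": (2.8, 5.0)}
ranked = sorted(candidates, key=lambda k: eatq_index(*candidates[k]),
                reverse=True)
print(ranked)   # -> ['A', 'C', 'B']
```

Animals with the highest index values would be retained as parents, improving eating quality as a correlated response even though the index itself uses only the two easily measured traits.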
NASA Astrophysics Data System (ADS)
Martinez Baquero, G. F.; Furnans, J.; Hudson, C.; Magan, C.
2012-12-01
Management decisions on rivers and associated habitats require sound tools to identify the major drivers of spatial and temporal variations in temperature and related water quality variables. 3D hydrodynamic and water quality models are key components for abstracting flow dynamics in complex river systems, as they allow extrapolating available observations to ungaged locations and alternative scenarios. The data collection and model development are intended to support the Mid-Columbia Fisheries Enhancement Group, in conjunction with the Benton Conservation District, in efforts to understand how seasonal flow patterns in the Yakima and Columbia rivers interact with the Yakima delta geometry to cause the relatively high water temperatures previously observed west of Bateman Island. These high temperatures are suspected of limiting salmonid success in the area, possibly contributing to adjustments in migration patterns and increased predation. The Environmental Fluid Dynamics Code (EFDC) and Water Quality Analysis Simulation Program (WASP) are used to model flow patterns and enable simulations of temperature distributions and water quality parameters at the confluence. Model development is supported by a bathymetric campaign in 2011 to evaluate delta geometry and construct the EFDC domain, a sonar river survey in 2012 to measure velocity profiles and enable model calibration, and continuous collection of temperature and dissolved oxygen records from Level Scout probes at key locations over the past year to drive water quality simulations. The current model is able to reproduce the main flow features observed at the confluence and is being prepared to integrate previous and current temperature observations. The final model is expected to evaluate scenarios for the removal or alteration of the Bateman Island Causeway. 
Alterations to the causeway that permit water passage to the south of Bateman Island are likely to dramatically alter the water flow patterns through the Yakima and Columbia River confluence, which in turn will alter water temperature distributions, sediment transport pathways, and salmonid migration routes.
Lin, Tung-Cheng; Hwang, Lih-Lian; Lai, Yung-Jye
2017-05-17
Previous studies have reported that credibility and content (argument quality) are the most critical factors affecting the quality of health information and its acceptance and use; however, this causal relationship merits further investigation in the context of health education. Moreover, message recipients' prior knowledge may moderate these relationships. This study used the elaboration likelihood model to determine the main effects of argument quality and source credibility and the moderating effect of self-reported diabetes knowledge on message attitudes. A between-subjects experimental design using an educational message concerning diabetes for manipulation was applied to validate the effects empirically. A total of 181 participants without diabetes were recruited from the Department of Health, Taipei City Government. Four group messages were manipulated in terms of argument quality (high and low) × source credibility (high and low). Argument quality and source credibility of health information significantly influenced the attitude of message recipients. Participants with high self-reported knowledge exhibited significant disapproval of messages with low argument quality. Effective health information should provide objective descriptions and cite reliable sources; in addition, it should provide accurate, customised messages for recipients who have a high background knowledge level and the ability to discern message quality. © 2017 Health Libraries Group Health Information & Libraries Journal.
Water quality modelling of Jadro spring.
Margeta, J; Fistanic, I
2004-01-01
Management of water quality in karst is a specific problem. Water generally moves very fast through infiltration processes, and faster still through concentrated flows in fissures and openings in the karst. This enables surface pollution to be transferred quickly, and without filtration, into groundwater springs. A typical example is the Jadro spring. Changes in water quality at the spring are sudden but short. Turbidity, the major water quality problem for karst springs, regularly exceeds allowable standards. Past practice has been reduced to intensive water disinfection in periods of great turbidity, without analysis of the disinfection by-product risks for water users. The main prerequisite for water quality control and optimization of water disinfection is knowledge of the raw water quality and the nature of its occurrence. The analysis of monitoring data and their functional relationship with hydrological parameters enables establishment of a stochastic model that will help obtain better information on turbidity in different periods of the year. Using the model, a great number of average monthly and extreme daily values are generated. Statistical analysis of these data yields the probability of high turbidity occurring in particular months. This information can be used to design an expert system for water quality management of karst springs. Thus, the time series model becomes a valuable tool in the management of the drinking water quality of the Jadro spring.
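A sketch of the kind of stochastic monthly generation described, using an assumed lognormal turbidity model (the parameters and threshold are invented, not fitted to Jadro data):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical monthly turbidity model: lognormal with higher log-means
# in the wetter months (Jan..Dec).
log_mean = np.array([1.8, 1.7, 1.5, 1.2, 1.0, 0.8,
                     0.7, 0.7, 1.0, 1.4, 1.9, 2.0])
n_years = 1000
sim = rng.lognormal(mean=log_mean, sigma=0.6, size=(n_years, 12))

threshold = 10.0                            # turbidity standard (assumed), NTU
p_exceed = (sim > threshold).mean(axis=0)   # per-month exceedance probability
print(np.round(p_exceed, 2))
```

The per-month exceedance probabilities are exactly the kind of output an expert system could use to schedule disinfection intensity ahead of high-turbidity periods.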
[Establishment of a 3D finite element model of the human skull using MSCT images and Mimics software].
Huang, Ping; Li, Zheng-dong; Shao, Yu; Zou, Dong-hua; Liu, Ning-guo; Li, Li; Chen, Yuan-yuan; Wan, Lei; Chen, Yi-jiu
2011-02-01
To establish a human 3D finite element skull model and to explore its value in biomechanical analysis. A cadaveric head was scanned and a 3D skull model was created with Mimics software from the 2D axial CT images. The 3D skull model was optimized in a preprocessor, with creation of surface and volume meshes. The stress changes after the head was struck by an object, or after the head hit the ground directly, were analyzed using ANSYS software. The original 3D skull model contained a large number of poor-quality triangles but closely resembled the real head, while the optimized model had high-quality surface and volume meshes with comparatively few triangles. The model could show local and global stress changes effectively. A human 3D skull model can be established using MSCT and Mimics software and provides a good finite element model for biomechanical analysis. This model may also provide a basis for studying head stress changes under different forces.
Midwife-led continuity models versus other models of care for childbearing women.
Sandall, Jane; Soltani, Hora; Gates, Simon; Shennan, Andrew; Devane, Declan
2016-04-28
Midwives are primary providers of care for childbearing women around the world. However, there is a lack of synthesised information to establish whether there are differences in morbidity and mortality, effectiveness and psychosocial outcomes between midwife-led continuity models and other models of care. To compare midwife-led continuity models of care with other models of care for childbearing women and their infants. We searched the Cochrane Pregnancy and Childbirth Group's Trials Register (25 January 2016) and reference lists of retrieved studies. All published and unpublished trials in which pregnant women are randomly allocated to midwife-led continuity models of care or other models of care during pregnancy and birth. Two review authors independently assessed trials for inclusion and risk of bias, extracted data and checked them for accuracy. The quality of the evidence was assessed using the GRADE approach. We included 15 trials involving 17,674 women. We assessed the quality of the trial evidence for all primary outcomes (i.e. regional analgesia (epidural/spinal), caesarean birth, instrumental vaginal birth (forceps/vacuum), spontaneous vaginal birth, intact perineum, preterm birth (less than 37 weeks) and all fetal loss before and after 24 weeks plus neonatal death) using the GRADE methodology: all primary outcomes were graded as high quality. For the primary outcomes, women who had midwife-led continuity models of care were less likely to experience regional analgesia (average risk ratio (RR) 0.85, 95% confidence interval (CI) 0.78 to 0.92; participants = 17,674; studies = 14; high quality), instrumental vaginal birth (average RR 0.90, 95% CI 0.83 to 0.97; participants = 17,501; studies = 13; high quality), preterm birth less than 37 weeks (average RR 0.76, 95% CI 0.64 to 0.91; participants = 13,238; studies = eight; high quality) and all fetal loss before and after 24 weeks plus neonatal death (average RR 0.84, 95% CI 0.71 to 0.99; participants = 17,561; studies = 13; high quality evidence). Women who had midwife-led continuity models of care were more likely to experience spontaneous vaginal birth (average RR 1.05, 95% CI 1.03 to 1.07; participants = 16,687; studies = 12; high quality). There were no differences between groups for caesarean births or intact perineum. For the secondary outcomes, women who had midwife-led continuity models of care were less likely to experience amniotomy (average RR 0.80, 95% CI 0.66 to 0.98; participants = 3253; studies = four), episiotomy (average RR 0.84, 95% CI 0.77 to 0.92; participants = 17,674; studies = 14) and fetal loss less than 24 weeks and neonatal death (average RR 0.81, 95% CI 0.67 to 0.98; participants = 15,645; studies = 11). Women who had midwife-led continuity models of care were more likely to experience no intrapartum analgesia/anaesthesia (average RR 1.21, 95% CI 1.06 to 1.37; participants = 10,499; studies = seven), to have a longer mean length of labour (hours) (mean difference (MD) 0.50, 95% CI 0.27 to 0.74; participants = 3328; studies = three) and to be attended at birth by a known midwife (average RR 7.04, 95% CI 4.48 to 11.08; participants = 6917; studies = seven). There were no differences between groups for fetal loss equal to/after 24 weeks and neonatal death, induction of labour, antenatal hospitalisation, antepartum haemorrhage, augmentation/artificial oxytocin during labour, opiate analgesia, perineal laceration requiring suturing, postpartum haemorrhage, breastfeeding initiation, low birthweight infant, five-minute Apgar score less than or equal to seven, neonatal convulsions, admission of infant to special care or neonatal intensive care unit(s), or mean length of neonatal hospital stay (days). Due to a lack of consistency in measuring women's satisfaction and assessing the cost of various maternity models, these outcomes were reported narratively. The majority of included studies reported a higher rate of maternal satisfaction in midwife-led continuity models of care. Similarly, there was a trend towards a cost-saving effect for midwife-led continuity care compared to other care models. This review suggests that women who received midwife-led continuity models of care were less likely to experience intervention and more likely to be satisfied with their care, with at least comparable adverse outcomes for women and their infants, compared with women who received other models of care. Further research is needed to explore findings of fewer preterm births and fewer fetal deaths less than 24 weeks, and all fetal loss/neonatal death associated with midwife-led continuity models of care.
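The review's effect measure can be illustrated with a single 2 × 2 table. The sketch below computes a risk ratio with a 95% CI via the standard log-RR normal approximation; the counts are invented, and a real meta-analysis pools log-RRs across trials before exponentiating:

```python
import math

def risk_ratio_ci(events_a, n_a, events_b, n_b, z=1.96):
    """Risk ratio of group A vs group B with an approximate 95% CI.

    Uses the normal approximation on the log scale:
    SE(ln RR) = sqrt(1/a - 1/n_a + 1/b - 1/n_b).
    """
    rr = (events_a / n_a) / (events_b / n_b)
    se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi
```

For example, 100/1000 events under one model of care versus 125/1000 under another gives RR = 0.8; whether the CI excludes 1 determines whether the difference is reported as significant.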
Yerramilli, Anjaneyulu; Dodla, Venkata B.; Desamsetti, Srinivas; Challa, Srinivas V.; Young, John H.; Patrick, Chuck; Baham, Julius M.; Hughes, Robert L.; Yerramilli, Sudha; Tuluri, Francis; Hardy, Mark G.; Swanier, Shelton J.
2011-01-01
In this study, an attempt was made to simulate air quality with reference to ozone over the Jackson (Mississippi) region using the online WRF/Chem (Weather Research and Forecasting-Chemistry) model. The WRF/Chem model has the advantage of integrating the meteorological and chemistry modules on the same computational grid with the same physical parameterizations, and it includes feedback between atmospheric chemistry and physical processes. The model was designed with three nested domains, the innermost covering the study region at a resolution of 1 km. The model was integrated for 48 hours continuously starting from 0000 UTC on 6 June 2006, and the evolution of surface ozone and other precursor pollutants was analyzed. The model-simulated atmospheric flow fields and distributions of NO2 and O3 were evaluated for each of three different time periods. The GIS-based spatial distribution maps for ozone and its precursors NO, NO2, CO and HONO, together with back trajectories, indicate that mobile sources in Jackson, Ridgeland and Madison contribute significantly to their formation. The present study demonstrates the applicability of the WRF/Chem model for generating quantitative information at high spatial and temporal resolution for the development of decision support systems for air quality regulatory agencies and health administrators. PMID:21776240
Verma, Sadhna; Sarkar, Saradwata; Young, Jason; Venkataraman, Rajesh; Yang, Xu; Bhavsar, Anil; Patil, Nilesh; Donovan, James; Gaitonde, Krishnanath
2016-05-01
The purpose of this study was to compare high b-value (b = 2000 s/mm²) acquired diffusion-weighted imaging (aDWI) with computed DWI (cDWI) obtained using four diffusion models, mono-exponential (ME), intra-voxel incoherent motion (IVIM), stretched exponential (SE), and diffusional kurtosis (DK), with respect to lesion visibility, conspicuity, contrast, and ability to predict significant prostate cancer (PCa). Ninety-four patients underwent 3 T MRI including acquisition of b = 2000 s/mm² aDWI and low b-value DWI. High b = 2000 s/mm² cDWI was obtained using the ME, IVIM, SE, and DK models. All images were scored on quality independently by three radiologists. Lesions were identified on all images and graded for conspicuity. For a subset of lesions for which pathological truth was established, lesion-to-background contrast ratios (LBCRs) were computed, and binomial generalized linear mixed model analysis was conducted to compare the clinically significant PCa predictive capabilities of all DWI. For all readers and all models except DK, cDWI demonstrated higher ratings for image quality and lesion conspicuity than aDWI (p < 0.001). The LBCRs of ME, IVIM, and SE were significantly higher than the LBCR of aDWI (p < 0.001). Receiver operating characteristic curves obtained from the binomial generalized linear mixed model analysis demonstrated higher areas under the curve for ME, SE, IVIM, and aDWI compared with DK or PSAD alone in predicting significant PCa. High b-value cDWI using the ME, IVIM, and SE diffusion models provides better image quality, lesion conspicuity, and increased LBCR compared with high b-value aDWI. Using cDWI can potentially provide sensitivity and specificity for detecting significant PCa comparable to high b-value aDWI without increased scan times and image degradation artifacts.
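The mono-exponential (ME) model underlying the simplest cDWI variant is S(b) = S0 * exp(-b * ADC). A minimal sketch of the idea, fitting the ADC from two acquired b-values and extrapolating to a higher b-value (all signal numbers below are illustrative only):

```python
import math

def fit_adc(s_low, b_low, s_high, b_high):
    """ADC from two acquired b-values under S(b) = S0 * exp(-b * ADC)."""
    return math.log(s_low / s_high) / (b_high - b_low)

def computed_dwi(s_ref, b_ref, adc, b_target=2000.0):
    """Extrapolate the signal to a high b-value (the cDWI idea)."""
    s0 = s_ref * math.exp(b_ref * adc)  # recover S0 from the reference point
    return s0 * math.exp(-b_target * adc)
```

With a hypothetical voxel whose signal halves every 500 s/mm², extrapolating from b = 0 to b = 1000 halves it twice; the other three models (IVIM, SE, DK) replace the single exponential with richer decay forms.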
Learning a Health Knowledge Graph from Electronic Medical Records.
Rotmensch, Maya; Halpern, Yoni; Tlimat, Abdulhakim; Horng, Steven; Sontag, David
2017-07-20
Demand for clinical decision support systems in medicine and self-diagnostic symptom checkers has substantially increased in recent years. Existing platforms rely on knowledge bases manually compiled through a labor-intensive process or automatically derived using simple pairwise statistics. This study explored an automated process to learn high quality knowledge bases linking diseases and symptoms directly from electronic medical records. Medical concepts were extracted from 273,174 de-identified patient records and maximum likelihood estimation of three probabilistic models was used to automatically construct knowledge graphs: logistic regression, naive Bayes classifier and a Bayesian network using noisy OR gates. A graph of disease-symptom relationships was elicited from the learned parameters and the constructed knowledge graphs were evaluated and validated, with permission, against Google's manually-constructed knowledge graph and against expert physician opinions. Our study shows that direct and automated construction of high quality health knowledge graphs from medical records using rudimentary concept extraction is feasible. The noisy OR model produces a high quality knowledge graph reaching precision of 0.85 for a recall of 0.6 in the clinical evaluation. Noisy OR significantly outperforms all tested models across evaluation frameworks (p < 0.01).
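The noisy-OR gate used in the best-performing model has a closed form: the symptom is absent only if every present disease, and a background "leak" cause, all fail to produce it. A minimal sketch with hypothetical disease-symptom parameters (not values learned from the records):

```python
def noisy_or(p_cause, present_diseases, leak=0.01):
    """P(symptom | present diseases) under a noisy-OR gate.

    p_cause maps disease -> probability that the disease alone causes
    the symptom; leak covers causes outside the model.  All parameters
    here are hypothetical.
    """
    p_absent = 1.0 - leak
    for d in present_diseases:
        p_absent *= 1.0 - p_cause.get(d, 0.0)
    return 1.0 - p_absent
```

The factorized form is what makes the model learnable per-edge from co-occurrence data while still combining multiple diseases coherently.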
Enhancing mathematics teachers' quality through Lesson Study.
Lomibao, Laila S
2016-01-01
The efficiency and effectiveness of the learning experience depend on teacher quality; thus, enhancing teacher quality is vital to improving students' learning outcomes. The usual top-down, one-shot cascading model of teachers' professional development in the Philippines has been observed to suffer considerable information dilution, and the Southeast Asian Ministers of Education Organization has called for mathematics teacher quality standards through the Southeast Asia Regional Standards for Mathematics Teachers (SEARS-MT); an intensive, ongoing professional development model should therefore be provided to teachers. This study was undertaken to determine the impact of Lesson Study on the quality level of Bulua National High School mathematics teachers in terms of the SEARS-MT dimensions. A mixed quantitative-qualitative research design was employed. Results of the analysis revealed that Lesson Study effectively enhanced mathematics teachers' quality and promoted their professional development. Teachers perceived Lesson Study positively, as beneficial in helping them become better mathematics teachers.
Image resolution enhancement via image restoration using neural network
NASA Astrophysics Data System (ADS)
Zhang, Shuangteng; Lu, Yihong
2011-04-01
Image super-resolution aims to obtain a high-quality image at a resolution higher than that of the original coarse one. This paper presents a new neural network-based method for image super-resolution. In this technique, super-resolution is treated as an inverse problem. An observation model that closely follows the physical image acquisition process is established to solve the problem. Based on this model, a cost function is created and minimized by a Hopfield neural network to produce high-resolution images from the corresponding low-resolution ones. Unlike some other single-frame super-resolution techniques, this technique takes into consideration point spread function blurring as well as additive noise, and therefore generates high-resolution images with more image details preserved or restored. Experimental results demonstrate that the high-resolution images obtained by this technique have very high quality in terms of PSNR and are visually more pleasing.
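The paper's Hopfield-network minimization is not reproduced here; as a stand-in, this sketch minimizes the same kind of cost, 0.5 * ||y - D(B(x))||^2 for a blur-then-downsample observation model, by plain gradient descent on a tiny 1D signal:

```python
# 1D sketch of super-resolution as an inverse problem: y = D(B(x)),
# with B a symmetric blur (self-adjoint under zero padding) and D a
# factor-2 downsampler.  Gradient descent stands in for the paper's
# Hopfield-network minimizer.

def blur(x):
    """Kernel [0.25, 0.5, 0.25] with zero padding (self-adjoint)."""
    n = len(x)
    out = []
    for i in range(n):
        left = x[i - 1] if i > 0 else 0.0
        right = x[i + 1] if i < n - 1 else 0.0
        out.append(0.25 * left + 0.5 * x[i] + 0.25 * right)
    return out

def down(x):
    """Keep every second sample."""
    return x[::2]

def up(y, n):
    """Adjoint of down: zero-fill back to length n."""
    out = [0.0] * n
    for j, v in enumerate(y):
        out[2 * j] = v
    return out

def cost(x, y):
    r = [a - b for a, b in zip(down(blur(x)), y)]
    return 0.5 * sum(v * v for v in r)

def restore(y, n, steps=500, lr=0.5):
    """Gradient descent on 0.5*||DBx - y||^2; grad = B^T D^T (DBx - y)."""
    x = up(y, n)  # crude zero-filled initial guess
    for _ in range(steps):
        r = [a - b for a, b in zip(down(blur(x)), y)]
        g = blur(up(r, n))
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x
```

Because the blur is modeled explicitly, the minimizer can undo part of the point-spread blurring rather than merely interpolating, which is the advantage the abstract claims over simpler single-frame techniques.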
40 CFR 52.1164 - Localized high concentrations-carbon monoxide.
Code of Federal Regulations, 2010 CFR
2010-07-01
... meteorological modeling, traffic flow monitoring, air quality monitoring and other measures necessary to... reviewing all available traffic data, physical site data and air quality and meteorological data for all... containing measures to regulate traffic and parking so as to reduce carbon monoxide emissions to achieve air...
Meteorological and air pollution modeling for an urban airport
NASA Technical Reports Server (NTRS)
Swan, P. R.; Lee, I. Y.
1980-01-01
Results are presented of numerical experiments modeling meteorology, multiple pollutant sources, and nonlinear photochemical reactions for the case of an airport in a large urban area with complex terrain. A planetary boundary-layer model which predicts the mixing depth and generates wind, moisture, and temperature fields was used; it utilizes only surface and synoptic boundary conditions as input data. A version of the Hecht-Seinfeld-Dodge chemical kinetics model is integrated with a new, rapid numerical technique; both the San Francisco Bay Area Air Quality Management District source inventory and the San Jose Airport aircraft inventory are utilized. The air quality model results are presented in contour plots; the combined results illustrate that the highly nonlinear interactions which are present require that the chemistry and meteorology be considered simultaneously to make a valid assessment of the effects of individual sources on regional air quality.
NASA Astrophysics Data System (ADS)
KIM, M.; Kim, J.; Baek, J.; Kim, C.; Shin, H.
2013-12-01
Flash floods and red or green tides are occurring in various natural settings as a result of climate change and indiscriminate development of rivers and land. Water, which is vital to humans, must be protected and managed against pollution, and in water resources management real-time watershed monitoring systems are operated to keep watch over and manage rivers. Monitoring and forecasting water quality in a watershed is especially important. The study area selected was Nak_K, one of the TMDL unit watersheds of the Nakdong River. This study develops a water quality forecasting model that makes full use of observation data collected at 8-day intervals by the Nakdong River Environment Research Center. When establishing forecasting models for BOD, DO, COD, and chlorophyll-a that account for the correlations among water quality factors, it is necessary to select the factors showing the strongest correlation with each target variable. To analyse the correlations among the factors (reservoir discharge, precipitation, air temperature, DO, BOD, COD, Tw, TN, TP, chlorophyll-a), a self-organizing map was used, and cross-correlation analysis was applied to compare the results. Based on these results, forecasting models for BOD, DO, COD, and chlorophyll-a were developed for short lead times of 8, 16, 24, and 32 days at 8-day intervals. Each forecasting model is a neural network trained with the back-propagation algorithm; that is, the study couples a self-organizing map for analysing correlations among factors with a neural network model for forecasting water quality. The approach is effective for managing water quality in many rivers and makes it possible to monitor a variety of water quality incidents. It should help protect water quality and prevent increasingly serious environmental damage before it occurs.
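The cross-correlation screening step described above can be sketched as a lagged Pearson correlation between a candidate driver (e.g. precipitation) and a target water quality factor (e.g. BOD); the lag with the largest |r| suggests which factor, and at what delay, to feed into the forecasting network. The series below are hypothetical, assumed equally spaced at the 8-day observation interval:

```python
def pearson(a, b):
    """Pearson correlation of two equal-length series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

def cross_correlation(driver, response, max_lag):
    """r between response[t] and driver[t - lag] for lag = 0..max_lag."""
    return {lag: pearson(driver[:len(driver) - lag], response[lag:])
            for lag in range(max_lag + 1)}
```

A driver whose best lag is two steps, for instance, would enter the back-propagation network as the value observed 16 days earlier.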
NASA Astrophysics Data System (ADS)
Han, Qing; Zhang, Chi; Xu, Bo; Chen, Jiangping
2013-07-01
The hydrodynamic flow behavior and the effects of geometry and working conditions of a gas-liquid cylindrical cyclone separator with a new structure are investigated by computational fluid dynamics and experiment. The gas-liquid cylindrical cyclone separator is widely used in the oil industry and in refrigeration systems because of its simple structure, high separation efficiency, low maintenance, and absence of moving parts or internal devices. In this work, a gas-liquid cylindrical cyclone separator with a new structure, installed before the evaporator in a refrigeration system, removes vapor from the mixture and makes the evaporator more compact by improving its heat-exchange efficiency through a lower inlet quality. It also decreases the evaporator pressure drop and reduces compressor work. The two pipes are placed symmetrically so that either can be treated as the inlet; when the flow is reversed, the separator performance is not affected. Four samples with different geometric parameters were tested experimentally over ranges of inlet quality (0.18-0.33) and inlet mass flow rate (65-100 kg/h). The CFD simulation results show good agreement with the experimental data. The Eulerian multiphase model and the Reynolds stress turbulence model were applied in the CFD simulations to obtain the inner flow field, including phase path lines, tangential velocity profiles, and pressure and volume-fraction distribution contours. The separator body diameter (24, 36, 48 mm) and inlet diameter (3.84, 4.8, 5.76 mm) determine the maximum tangential velocity, which governs the centrifugal force. Tangential velocity profiles were simulated and compared among the different models. Higher tangential velocity yields higher gas-outlet quality but also higher pressure drop, and decreasing the inlet diameter increases both the gas-outlet quality and the pressure drop; high gas-outlet quality thus comes at the cost of high pressure drop. Increasing the separator diameter causes the gas-outlet quality to increase at first and then decrease, while the pressure drop decreases monotonically. The offset (0, 2.4, 3.6 mm) of the gas outlet is an insensitive factor with little influence on quality and pressure drop.
Stochastic Packet Loss Model to Evaluate QoE Impairments
NASA Astrophysics Data System (ADS)
Hohlfeld, Oliver
With the provisioning of broadband access for the mass market, even in wireless and mobile networks, multimedia content, especially real-time streaming of high-quality audio and video, is extensively viewed and exchanged over the Internet. Quality of Experience (QoE), describing the service quality perceived by the user, is a vital factor in ensuring customer satisfaction in today's communication networks. Frameworks for assessing quality degradations in streamed video are currently investigated as a complex multi-layered research topic, involving network traffic load, codec functions and measures of user perception of video quality.
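Stochastic packet-loss modelling of this kind is commonly done with a two-state (Gilbert-Elliott) Markov chain; whether that is the exact model used in this work is an assumption, and the parameters below are invented. A minimal sketch:

```python
import random

def gilbert_elliott(n, p_gb, p_bg, loss_good=0.0, loss_bad=1.0, seed=1):
    """Two-state Markov packet-loss trace.

    In the 'good' state a packet is lost with probability loss_good,
    in the 'bad' state with loss_bad; p_gb and p_bg are the good->bad
    and bad->good transition probabilities.
    """
    rng = random.Random(seed)
    bad = False
    trace = []
    for _ in range(n):
        if bad:
            if rng.random() < p_bg:
                bad = False
        elif rng.random() < p_gb:
            bad = True
        trace.append(rng.random() < (loss_bad if bad else loss_good))
    return trace

def stationary_loss(p_gb, p_bg, loss_good=0.0, loss_bad=1.0):
    """Long-run loss rate from the chain's stationary distribution."""
    pi_bad = p_gb / (p_gb + p_bg)
    return pi_bad * loss_bad + (1.0 - pi_bad) * loss_good
```

With p_gb = 0.01 and p_bg = 0.09 the chain spends 10% of its time in the bad state and produces bursty rather than independent losses, and burstiness is precisely what degrades perceived video quality disproportionately to the raw loss rate.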
NASA Astrophysics Data System (ADS)
Ribeiro Piffer, P.; Reverberi Tambosi, L.; Uriarte, M.
2017-12-01
One of the most pressing challenges faced by modern societies is ensuring a sufficient supply of water given the ever-growing conflict between environmental conservation and the expansion of agricultural and urban frontiers worldwide. Land use cover change has marked effects on natural landscapes, putting key watershed ecosystem services in jeopardy. We investigated the consequences of land use cover change and precipitation regimes for water quality in the state of São Paulo, Brazil, a landscape that underwent major changes in the past century. Water quality data collected bi-monthly between 2000 and 2014 from 229 water monitoring stations were analyzed together with 2011 land use cover maps. We focused on six water quality metrics (dissolved oxygen, total nitrogen, total phosphorus, turbidity, total dissolved solids and fecal coliforms) and used generalized linear mixed models to analyze the data. Models were built at two scales, the entire watershed and a 60 meter riparian buffer along the river network. Models accounted for 46-67% of the variance in water quality metrics and, apart from dissolved oxygen, which reflected land cover composition in riparian buffers, all metrics responded to land use at the watershed scale. Highly urbanized areas had low dissolved oxygen and high fecal coliform, dissolved solids, phosphorus and nitrogen levels in streams. Pasture was associated with increases in turbidity, while sugarcane plantations significantly increased nitrogen concentrations. Watersheds with high forest cover had greater dissolved oxygen and lower turbidity. Silviculture plantations had little impact on water quality. Precipitation decreased dissolved oxygen and was associated with higher levels of turbidity, fecal coliforms and phosphorus. The results indicate that conversion of forest cover to other land uses had negative impacts on water quality in the study area, highlighting the need for landscape restoration to improve watershed ecosystem services.
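The generalized linear mixed models themselves are beyond a short sketch; as a deliberately simplified, fixed-effects stand-in, ordinary least squares of one water quality metric on one land cover predictor illustrates the slope-and-variance-explained reasoning (data below are invented, and a GLMM would additionally handle repeated samples per monitoring station):

```python
def simple_regression(x, y):
    """Slope, intercept and R^2 of y ~ x by ordinary least squares."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - intercept - slope * xi) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return slope, intercept, 1.0 - ss_res / ss_tot
```

A positive slope of dissolved oxygen on watershed forest fraction, with a sizeable R², would correspond to the paper's finding that forested watersheds have greater dissolved oxygen.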
Testing a pharmacist-patient relationship quality model among older persons with diabetes.
Worley, Marcia M
2006-03-01
Considering recent changes to the Medicare program, pharmacists will have unique opportunities to be reimbursed for providing Medication Therapy Management Services to older persons with diabetes. A high-quality pharmacist-patient relationship can lay the foundation for effective provision of Medication Therapy Management Services and improved care in this cohort. To test a pharmacist-patient relationship quality model in a group of older persons with diabetes from the patient's perspective. Antecedents to relationship quality were pharmacist participative behavior/patient-centeredness of relationship, patient participative behavior, and pharmacist-patient interpersonal communication. Pharmacist-patient relationship commitment was the outcome of relationship quality studied. Data were collected via mailed questionnaire from a random sample of 600 community-dwelling adults in the United States who (1) were 65 years of age and older, (2) had type 1 or type 2 diabetes, (3) used at least one prescription medication to treat their diabetes, and (4) used some type of nonmail order pharmacy as their primary source of obtaining prescription medications. Model relationships were tested using path analysis. The adjusted response rate was 41.6% (221/531). The models explained 47% and 49% of the variance in relationship quality and relationship commitment, respectively. In the relationship quality model, pharmacist participative behavior/patient-centeredness of relationship (beta=.51, P<.001) and pharmacist-patient interpersonal communication (beta=.17, P=.008) had direct effects on relationship quality. In the relationship commitment model, relationship quality had a direct effect on relationship commitment (beta=.60, P<.001). Pharmacist participative behavior/patient-centeredness and pharmacist-patient interpersonal communication had indirect effects on relationship commitment through their effects on relationship quality, which is a mediator in the model. 
Results affirm findings from previous research showing that patients' perceptions of pharmacist participative behavior/patient-centeredness of relationship and pharmacist-patient interpersonal communication are positively related to perceptions of relationship quality. Also, relationship quality is a strong mediator between pharmacist participative behavior/patient-centeredness of relationship and relationship commitment, as well as between pharmacist-patient interpersonal communication and relationship commitment.
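In a simple mediation path the indirect effect is the product of the two path coefficients. Using the standardized coefficients reported above as an illustration (a = .51 from participative behaviour/patient-centeredness to relationship quality, b = .60 from relationship quality to commitment; the direct path is set to zero only because the abstract does not report it):

```python
def mediation_effects(a, b, c_prime=0.0):
    """Indirect, direct and total effects for X -> M -> Y:
    a = X->M path, b = M->Y path, c_prime = direct X->Y path."""
    indirect = a * b
    return indirect, c_prime, indirect + c_prime

# Illustrative call with the abstract's reported path coefficients.
indirect, direct, total = mediation_effects(0.51, 0.60)
```

The product a * b is what path analysis reports as the effect transmitted through the mediator, here relationship quality.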
A statistical model for water quality predictions from a river discharge using coastal observations
NASA Astrophysics Data System (ADS)
Kim, S.; Terrill, E. J.
2007-12-01
Understanding and predicting coastal ocean water quality has benefits for reducing human health risks, protecting the environment, and improving local economies which depend on clean beaches. Continuous observations of coastal physical oceanography increase the understanding of the processes which control the fate and transport of a riverine plume which potentially contains high levels of contaminants from the upstream watershed. A data-driven model of the fate and transport of river plume water from the Tijuana River has been developed using surface current observations provided by a network of HF radar operated as part of a local coastal observatory that has been in place since 2002. The model outputs are compared with water quality sampling of shoreline indicator bacteria, and the skill of an alarm for low water quality is evaluated using the receiver operating characteristic (ROC) curve. In addition, statistical analysis of beach closures in comparison with environmental variables is also discussed.
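The alarm's skill is summarized by the area under the ROC curve, which equals the probability that a randomly chosen low-quality day receives a higher alarm score than a randomly chosen clean day. A minimal rank-based sketch (labels and scores below are hypothetical):

```python
def roc_auc(labels, scores):
    """AUC via the Mann-Whitney identity: the probability that a random
    positive (e.g. a poor-water-quality day) outscores a random negative,
    counting ties as one half."""
    pos = [s for l, s in zip(labels, scores) if l]
    neg = [s for l, s in zip(labels, scores) if not l]
    wins = 0.0
    for p in pos:
        for q in neg:
            if p > q:
                wins += 1.0
            elif p == q:
                wins += 0.5
    return wins / (len(pos) * len(neg))
```

An AUC of 0.5 means the alarm is no better than chance; values near 1 mean the model's plume-transport score cleanly separates contaminated from clean shoreline days.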
Improving Quality of Care in Patients with Liver Cirrhosis.
Saberifiroozi, Mehdi
2017-10-01
Liver cirrhosis is a major chronic disease in the field of digestive diseases, causing more than one million deaths per year. Despite established evidence-based guidelines, adherence to standards of care and quality indicators is variable; complete adherence to guideline recommendations is below 50%. Improving the quality of care in patients with cirrhosis requires a more holistic view. Because of the high rate of death from cardiovascular disease and neoplasms, care for comorbid conditions and risk factors such as smoking, hypertension, and high blood sugar or cholesterol is important in addition to management of the primary liver disease. Even within a holistic multidisciplinary approach, management should be patient-centred and individualised. Diagnosing the underlying etiology and treating it appropriately is the most important step. Quality indicators need to be defined and customised for measuring quality of care in these patients; because most proposed indicators are designed for decompensated cirrhosis, dedicated indicators are also needed for compensated and milder forms of chronic liver disease. Training patients to participate in their own management, and designing special clinics staffed by dedicated health professionals along the lines of a chronic disease model, are suggested to improve the quality of care in this group of patients. Dedicated day care centres run by a gastroenterologist and a trained nurse may be a practical model for better management of such patients.
A downloadable meshed human canine tooth model with PDL and bone for finite element simulations.
Boryor, Andrew; Hohmann, Ansgar; Geiger, Martin; Wolfram, Uwe; Sander, Christian; Sander, Franz Günter
2009-09-01
The aim of this study is to relieve scientists from the complex and time-consuming task of model generation by providing a model of a canine tooth and its periradicular tissues for Finite Element Method (FEM) simulations. This was achieved with diverse commercial software, based on a micro-computed tomography of the specimen. The Finite Element (FE) Model consists of enamel, dentin, nerve (innervation), periodontal ligament (PDL), and the surrounding cortical bone with trabecular structure. The area and volume meshes are of a very high quality in order to represent the model in a detailed form. Material properties are to be set individually by every user. The tooth model is provided for Abaqus, Ansys, HyperMesh, Nastran and as STL files, in an ASCII format for free download. This can help reduce the cost and effort of generating a tooth model for some research institutions, and may encourage other research groups to provide their high quality models for other researchers. By providing FE models, research results, especially FEM simulations, could be easily verified by others.
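Since the model is distributed as ASCII STL among other formats, a quick sanity check before import into an FE preprocessor is counting the `facet normal` records that open each triangle. A minimal sketch using a two-triangle demo solid (not the actual tooth file):

```python
# A two-triangle demo solid in ASCII STL (hypothetical, for illustration).
DEMO_STL = """solid demo
  facet normal 0 0 1
    outer loop
      vertex 0 0 0
      vertex 1 0 0
      vertex 0 1 0
    endloop
  endfacet
  facet normal 0 0 1
    outer loop
      vertex 1 0 0
      vertex 1 1 0
      vertex 0 1 0
    endloop
  endfacet
endsolid demo
"""

def count_stl_facets(text):
    """Triangles in an ASCII STL body, counted by 'facet normal' records."""
    return sum(1 for line in text.splitlines()
               if line.strip().startswith("facet normal"))
```

Comparing the facet count against the mesh statistics reported by the preprocessor is a cheap way to detect a truncated download before investing time in a simulation run.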
Provider perceptions of an integrated primary care quality improvement strategy: The PPAQ toolkit.
Beehler, Gregory P; Lilienthal, Kaitlin R
2017-02-01
The Primary Care Behavioral Health (PCBH) model of integrated primary care is challenging to implement with high fidelity. The Primary Care Behavioral Health Provider Adherence Questionnaire (PPAQ) was designed to assess provider adherence to essential model components and has recently been adapted into a quality improvement toolkit. The aim of this pilot project was to gather preliminary feedback on providers' perceptions of the acceptability and utility of the PPAQ toolkit for making beneficial practice changes. Twelve mental health providers working in Department of Veterans Affairs integrated primary care clinics participated in semistructured interviews to gather quantitative and qualitative data. Descriptive statistics and qualitative content analysis were used to analyze data. Providers identified several positive features of the PPAQ toolkit organization and structure that resulted in high ratings of acceptability, while also identifying several toolkit components in need of modification to improve usability. Toolkit content was considered highly representative of the PCBH model and therefore could be used as a diagnostic self-assessment of model adherence. The toolkit was considered to be high in applicability to providers regardless of their degree of prior professional preparation or current clinical setting. Additionally, providers identified several system-level contextual factors that could impact the usefulness of the toolkit. These findings suggest that frontline mental health providers working in PCBH settings may be receptive to using an adherence-focused toolkit for ongoing quality improvement. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Influences of water and sediment quality and hydrologic processes on mussels in the Clinch River
Johnson, Gregory C.; Krstolic, Jennifer L.; Ostby, Brett J.K.
2014-01-01
Segments of the Clinch River in Virginia have experienced declining freshwater mussel populations during the past 40 years, while other segments of the river continue to support some of the richest mussel communities in the country. The close proximity of these contrasting reaches provides a study area where differences in climate, hydrology, and historic mussel distribution are minimal. The USGS conducted a study between 2009 and 2011 to evaluate possible causes of the mussel declines. Evaluation of mussel habitat showed no differences in physical habitat quality, leaving water and sediment quality as possible causes for declines. Three years of continuous water-quality data showed higher turbidity and specific conductance in the reaches with low-quality mussel assemblages compared to reaches with high-quality mussel assemblages. Discrete water-quality samples showed higher concentrations of major ions and metals in the low-quality reach. Base-flow samples contained high major ion and metal concentrations coincident with low-quality mussel populations. These results support a conceptual model of dilution and augmentation where increased concentrations of major ions and other dissolved constituents from mined tributaries result in reaches with declining mussel populations. Tributaries from unmined basins provide water with low concentrations of dissolved constituents, diluting reaches of the Clinch River where high-quality mussel populations occur.
Bettina Ohse; Falk Huettmann; Stefanie M. Ickert-Bond; Glenn P. Juday
2009-01-01
Most wilderness areas still lack accurate distribution information on tree species. We met this need with a predictive GIS modeling approach, using freely available digital data and computer programs to efficiently obtain high-quality species distribution maps. Here we present a digital map with the predicted distribution of white spruce (Picea glauca...
Mid-Frequency Sonar Interactions With Beaked Whales
2009-09-30
to acquire new high-resolution morphometric and physical-property data on beaked whales for use in the model. It is hoped that the availability of such a system, together with high-quality... morphometric data through computerized tomography (CT) scans on marine mammal carcasses, and constructing finite-element models of the anatomy
The link between leadership and safety outcomes in hospitals.
Squires, Mae; Tourangeau, Ann; Spence Laschinger, Heather K; Doran, Diane
2010-11-01
To test and refine a model examining relationships among leadership, interactional justice, quality of the nursing work environment, safety climate and patient and nurse safety outcomes. The quality of nursing work environments may pose serious threats to patient and nurse safety. Justice is an important element in work environments that support safety initiatives yet little research has been done that looks at how leader interactional justice influences safety outcomes. A cross-sectional survey was conducted with 600 acute care registered nurses (RNs) to test and refine a model linking interactional justice, the quality of nurse leader-nurse relationships, work environment and safety climate with patient and nurse outcomes. In general the hypothesized model was supported. Resonant leadership and interactional justice influenced the quality of the leader-nurse relationship which in turn affected the quality of the work environment and safety climate. This ultimately was associated with decreased reported medication errors, intentions to leave and emotional exhaustion. Quality relationships based on fairness and empathy play a pivotal role in creating positive safety climates and work environments. To advocate for safe work environments, managers must strive to develop high-quality relationships through just leadership practices. © 2010 The Authors. Journal compilation © 2010 Blackwell Publishing Ltd.
Evaluating Predictive Models of Software Quality
NASA Astrophysics Data System (ADS)
Ciaschini, V.; Canaparo, M.; Ronchieri, E.; Salomoni, D.
2014-06-01
Applications from the High Energy Physics scientific community are constantly growing and implemented by a large number of developers. This implies a strong churn on the code and an associated risk of faults, which is unavoidable as long as the software undergoes active evolution. However, the necessities of production systems run counter to this. Stability and predictability are of paramount importance; in addition, a short turn-around time for the defect discovery-correction-deployment cycle is required. A way to reconcile these opposite foci is to use a software quality model to obtain an approximation of the risk before releasing a program, so that only software with a risk lower than an agreed threshold is delivered. In this article we evaluated two quality predictive models to identify the operational risk and the quality of some software products. We applied these models to the development history of several EMI packages with the intent of discovering the risk factor of each product and comparing it with its real history. We attempted to determine whether the models reasonably map reality for the applications under evaluation, and we conclude by suggesting directions for further studies.
Engineering High Assurance Distributed Cyber Physical Systems
2015-01-15
decisions: number of interacting agents and co-dependent decisions made in real-time without causing interference. To engineer a high assurance DART... environment specification, architecture definition, domain-specific languages, design patterns, code-generation, analysis, test-generation, and simulation... include synchronization between the models and source code, debugging at the model level, expression of the design intent, and quality of service
ERIC Educational Resources Information Center
Fransen, Shelly Lynette
2013-01-01
High quality student engagement activities are essential if students are to be successful learners. Over the years, many instructional strategies and models have been devised to encourage teachers to develop student engagement activities that result in high achievement. The Reading First Model initiative was introduced as a part of the No Child…
Passion in sport: on the quality of the coach-athlete relationship.
Lafrenière, Marc-André K; Jowett, Sophia; Vallerand, Robert J; Donahue, Eric G; Lorimer, Ross
2008-10-01
Vallerand et al. (2003) developed a dualistic model of passion, wherein two types of passion are proposed: harmonious (HP) and obsessive (OP) passion that predict adaptive and less adaptive interpersonal outcomes, respectively. In the present research, we were interested in understanding the role of passion in the quality of coach-athlete relationships. Results of Study 1, conducted with athletes (N=157), revealed that HP positively predicts a high-quality coach-athlete relationship, whereas OP was largely unrelated to such relationships. Study 2 was conducted with coaches (N=106) and showed that only HP positively predicted the quality of the coach-athlete relationship. Furthermore, these effects were fully mediated by positive emotions. Finally, the quality of the coach-athlete relationship positively predicted coaches' subjective well-being. Future research directions are discussed in light of the dualistic model of passion.
USDA-ARS?s Scientific Manuscript database
Tomato (Solanum lycopersicum L.) is an excellent plant model for unraveling physiological processes, fruit quality and fruit shelf determinants, stress responsive signaling, pathogenicity, and ripening development in climacteric fruits. Tomato is a popular vegetable, and along with potato, it is cla...
Assessing the Agriculture Teacher Workforce in New England
ERIC Educational Resources Information Center
Uricchio, Cassandra Kay
2011-01-01
High quality teachers are an essential piece of the agricultural education model and directly influence the quality of the total program. However, there has been a steady consolidation and elimination of agricultural education teacher preparation programs in New England. In order to understand how this trend affected agricultural education in this…
Constraint-Driven Software Design: An Escape from the Waterfall Model.
ERIC Educational Resources Information Center
de Hoog, Robert; And Others
1994-01-01
Presents the principles of a development methodology for software design based on a nonlinear, product-driven approach that integrates quality aspects. Two examples are given to show that the flexibility needed for building high quality systems leads to integrated development environments in which methodology, product, and tools are closely…
Rapid Training System Self-Assessment
ERIC Educational Resources Information Center
Flesher, Jeff
2007-01-01
A systematic self-assessment mirrors quality system and certification models, thus making a strong argument for high-quality design, control, and management of the training function. Accomplished for the ongoing betterment of the function, not as a summative judgment of conformance, it discovers strengths and weaknesses and results in a common…
Online high-speed NIR diffuse-reflectance imaging spectroscopy in food quality monitoring
NASA Astrophysics Data System (ADS)
Driver, Richard D.; Didona, Kevin
2009-05-01
The use of hyperspectral technology in the NIR for food quality monitoring is discussed. An example of the use of hyperspectral diffuse reflectance scanning and post-processing with a chemometric model shows discrimination between four pharmaceutical samples comprising Aspirin, Acetaminophen, Vitamin C and Vitamin D.
Service quality in health care setting.
Rashid, Wan Edura Wan; Jusoff, Hj Kamaruzaman
2009-01-01
This paper attempts to explore the concept of service quality in a health care setting. This paper probes the definition of service quality from technical and functional aspects for a better understanding on how consumers evaluate the quality of health care. It adopts the conceptual model of service quality frequently used by most researchers in the health care sector. The paper also discusses several service quality dimensions and service quality problems in order to provide a more holistic conception of hospital service quality. The paper finds that service quality in health care is very complex as compared to other services because this sector involves high risk. The paper adds a new perspective towards understanding how the concept of service quality is adopted in a health care setting.
[Quality process control system of Chinese medicine preparation based on "holistic view"].
Wang, Ya-Qi; Jiao, Jiao-Jiao; Wu, Zhen-Feng; Zheng, Qin; Yang, Ming
2018-01-01
"High quality, safety and effectiveness" are the primary principles for the pharmaceutical research and development process in China. The quality of products relies not only on the inspection method, but also on design and development, process control and standardized management; quality ultimately depends on the level of process control. In this paper, the history and current state of quality control of traditional Chinese medicine (TCM) preparations are reviewed systematically. Based on the international development model of drug quality control and common misunderstandings about the quality control of TCM preparations, the reasons affecting the homogeneity of TCM preparations are analyzed and summarized. In line with TCM characteristics, efforts were made to control the diversity of TCM, turn "unstable" TCM into "stable" Chinese patent medicines, and put forward the concepts of "holistic view" and "QbD (quality by design)", so as to create a TCM preparation quality process control model with "holistic, modular, data-driven, standardized" at its core. Scientific studies should conform to the actual production of TCM preparations, support advanced equipment and technology upgrades, thoroughly apply scientific research achievements to Chinese patent medicines, and promote the cluster application and transformation of TCM pharmaceutical technology, so as to improve the quality and effectiveness of the TCM industry and realize green development. Copyright© by the Chinese Pharmaceutical Association.
Growth of high quality germanium films on patterned silicon substrates and applications
NASA Astrophysics Data System (ADS)
Vanamu, Ganesh
The principal objective of this work is to determine optimal pattern structures for the highest quality (defect-free) heteroepitaxial growth. High quality films of Ge on Si are of significant importance and can be used in high electron mobility devices, photodetectors for optical communications (1.3 μm or 1.55 μm) and for integrating III-V optoelectronic devices. However, a 4% lattice mismatch and ~50% thermal expansion mismatch between Ge and Si create three major challenges in growing high quality Ge films on Si: (a) high surface roughness due to a pronounced <110> crosshatch pattern, (b) high dislocation densities in the Ge films and (c) a high density of microcracks and wafer bending. A common way of reducing lattice and thermal expansion mismatch is to form a "virtual substrate (VS)" by growing a graded composition followed by a uniform layer of the desired epitaxial film on a defect-free Si substrate. Graded virtual layers could not decrease the dislocation densities to numbers acceptable for most devices. Mathews et al. first proposed that limiting the lateral dimensions of the sample prior to growth could reduce the dislocation density. Later, Fitzgerald proposed that patterning decreases the dislocation density in the films. In this work we show high quality crosshatch-free Ge films with dislocation density ~10^5 cm^-2 on nano-patterned Si and also high quality GaAs films on the Ge/Si virtual substrate. The first step in this research was a systematic study to identify the role of pattern width on the quality of Ge growth. We investigated micrometer- and submicrometer-scale patterns. We demonstrated that the quality of the heteroepitaxial layers improves as the pattern width decreases. We then decreased the pattern width to nanometer-scale dimensions, and significant improvement of the Ge film quality was observed.
We used novel interferometric lithography techniques combined with reactive ion and wet chemical etching to fabricate the Si structures. The patterning was done using standard photomask-based lithography. We analyzed the quality of the Ge films using high resolution x-ray diffraction, TEM and SEM. We performed etch pit density (EPD) measurements by counting the pits formed using a Nomarski optical microscope. In order to correlate characterization with device performance, we designed an inter-digitated pattern to form a Ge-based metal-semiconductor-metal photodetector and measured the photoresponse of the Ge films. Preliminary results were very promising. We then grew 4 μm of GaAs on the Ge/Si using MBE (0.5 μm/hr at 570°C) and analyzed the GaAs film quality. We also performed modeling to calculate strain energy density and wafer bending in multi-layer films grown epitaxially on planar Si substrates, and compared the models with experiments. (Abstract shortened by UMI.)
USDA-ARS?s Scientific Manuscript database
The large size and relative complexity of many plant genomes make creation, quality control, and dissemination of high-quality gene structure annotations challenging. In response, we have developed MAKER-P, a fast and easy-to-use genome annotation engine for plants. Here, we report the use of MAKER-...
Guidelines for Calibration and Application of Storm.
1977-12-01
combination method uses the SCS method on pervious areas and the coefficient method on impervious areas of the watershed. Storm water quality is computed... stations, it should be accomplished according to procedures outlined in Reference 7. Adequate storm water quality data are the most difficult and costly... mass discharge of pollutants is negligible. The state-of-the-art in urban storm water quality modeling precludes highly accurate simulation of
Statistical Downscaling of WRF-Chem Model: An Air Quality Analysis over Bogota, Colombia
NASA Astrophysics Data System (ADS)
Kumar, Anikender; Rojas, Nestor
2015-04-01
Statistical downscaling is a technique used to extract high-resolution information from regional-scale variables produced by coarse-resolution models such as Chemical Transport Models (CTMs). The fully coupled WRF-Chem (Weather Research and Forecasting with Chemistry) model is used to simulate air quality over Bogota. Bogota is a tropical Andean megacity located on a high-altitude plateau in the middle of very complex terrain. The WRF-Chem model was adopted for simulating the hourly ozone concentrations. The computational domains comprised 120x120x32, 121x121x32 and 121x121x32 grid points with horizontal resolutions of 27, 9 and 3 km, respectively. The model was initialized with real boundary conditions using NCAR-NCEP's Final Analysis (FNL) at a 1°x1° (~111 km x 111 km) resolution. Boundary conditions were updated every 6 hours using reanalysis data. The emission rates were obtained from global inventories, namely the REanalysis of the TROpospheric (RETRO) chemical composition and the Emission Database for Global Atmospheric Research (EDGAR). Multiple linear regression and artificial neural network techniques are used to downscale the model output at each monitoring station. The results confirm that the statistically downscaled outputs reduce simulated errors by up to 25%. This study provides a general overview of statistical downscaling of chemical transport models and can constitute a reference for future air quality modeling exercises over Bogota and other Colombian cities.
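The per-station regression step of this kind of downscaling can be sketched as follows. This is a minimal illustration, not the study's actual setup: the predictors (coarse-model ozone plus two local meteorological variables), coefficients, and sample sizes are all assumed for the example.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic stand-ins: coarse-model ozone at the grid cell containing a
# station, plus two assumed local predictors (temperature, wind speed).
# "Observed" station ozone is a linear function of these plus noise.
n = 500
coarse_o3 = rng.uniform(10, 80, n)
temp = rng.uniform(5, 30, n)
wind = rng.uniform(0, 10, n)
obs = 0.6 * coarse_o3 + 1.2 * temp - 2.0 * wind + 15 + rng.normal(0, 2, n)

# Fit the station-specific multiple linear regression on a training period,
# then correct the coarse model output on a held-out period.
X = np.column_stack([coarse_o3, temp, wind])
model = LinearRegression().fit(X[:400], obs[:400])
pred = model.predict(X[400:])

# Error of the raw coarse output vs. error of the downscaled output.
raw_rmse = np.sqrt(np.mean((coarse_o3[400:] - obs[400:]) ** 2))
dsc_rmse = np.sqrt(np.mean((pred - obs[400:]) ** 2))
```

In practice one such regression (or a small neural network) is trained per monitoring station, and the error reduction is evaluated against the station observations, as in the abstract's reported 25% improvement.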
Using the SIOP Model for Effective Content Teaching with Second and Foreign Language Learners
ERIC Educational Resources Information Center
Kareva, Veronika; Echevarria, Jana
2013-01-01
In this paper we present a comprehensive model of instruction for providing consistent, high quality teaching to L2 students. This model, the SIOP Model (Sheltered Instruction Observation Protocol), provides an explicit framework for organizing instructional practices to optimize the effectiveness of teaching second and foreign language learners.…
Papageorgiou, Spyridon N; Papadopoulos, Moschos A; Athanasiou, Athanasios E
2014-02-01
Ideally meta-analyses (MAs) should consolidate the characteristics of orthodontic research in order to produce an evidence-based answer. However, severe flaws are frequently observed in most of them. The aim of this study was to evaluate the statistical methods, the methodology, and the quality characteristics of orthodontic MAs and to assess their reporting quality during the last years. Electronic databases were searched for MAs (with or without a proper systematic review) in the field of orthodontics, indexed up to 2011. The AMSTAR tool was used for quality assessment of the included articles. Data were analyzed with Student's t-test, one-way ANOVA, and generalized linear modelling. Risk ratios with 95% confidence intervals were calculated to represent changes during the years in reporting of key items associated with quality. A total of 80 MAs with 1086 primary studies were included in this evaluation. Using the AMSTAR tool, 25 (27.3%) of the MAs were found to be of low quality, 37 (46.3%) of medium quality, and 18 (22.5%) of high quality. Specific characteristics like explicit protocol definition, extensive searches, and quality assessment of included trials were associated with a higher AMSTAR score. Model selection and dealing with heterogeneity or publication bias were often problematic in the identified reviews. The number of published orthodontic MAs is constantly increasing, while their overall quality is considered to range from low to medium. Although the number of medium- and high-quality MAs seems lately to rise, several other aspects need improvement to increase their overall quality.
Tom, Sarah E; Berenson, Abbey B
2013-01-01
Prior studies have not examined the role of psychosocial stress in the relationship between poor sleep quality and obesity among women of lower socioeconomic status (SES). We tested the following hypotheses in a sample of reproductive-age women of lower SES: 1) Poor sleep quality is related to increased risk of obesity, and 2) psychosocial stress confounds this association between poor sleep quality and obesity. A total of 927 women age 16 to 40 years attending public health clinics in Southeastern Texas provided information on the Pittsburgh Sleep Quality Index and sociodemographic and health characteristics, including the Perceived Stress Scale. Height, weight, and waist circumference (WC) were measured in clinic. A series of models examined the associations between sleep disturbance, perceived stress, and weight outcomes, accounting for potential confounding factors. Nearly 30% of women were overweight, and 35% were obese. Half of women had a WC of greater than 35 inches. Most women had poor sleep quality and high levels of stress. Sleep quality and perceived stress were not related to body mass index category or WC in models that adjusted for age and race/ethnicity. Adjusting for potential confounding factors did not alter results. Perceived stress did not modify the association between sleep quality and weight outcomes. Poor sleep quality and psychosocial stress were not related to weight in reproductive-aged women of lower SES. However, poor sleep quality, high stress, overweight, and obesity were common in this group. Copyright © 2013 Jacobs Institute of Women's Health. Published by Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
McElwain, Michael; Van Gorkom, Kyle; Bowers, Charles W.; Carnahan, Timothy M.; Kimble, Randy A.; Knight, J. Scott; Lightsey, Paul; Maghami, Peiman G.; Mustelier, David; Niedner, Malcolm B.;
2017-01-01
The James Webb Space Telescope (JWST) is a large (6.5 m) cryogenic segmented aperture telescope with science instruments that cover the near- and mid-infrared from 0.6-27 microns. The large aperture not only provides high photometric sensitivity, but it also enables high angular resolution across the bandpass, with a diffraction limited point spread function (PSF) at wavelengths longer than 2 microns. The JWST PSF quality and stability are intimately tied to the science capabilities as it is convolved with the astrophysical scene. However, the PSF evolves at a variety of timescales based on telescope jitter and thermal distortion as the observatory attitude is varied. We present the image quality and stability requirements, recent predictions from integrated modeling, measurements made during ground-based testing, and performance characterization activities that will be carried out as part of the commissioning process.
Heterogeneous sharpness for cross-spectral face recognition
NASA Astrophysics Data System (ADS)
Cao, Zhicheng; Schmid, Natalia A.
2017-05-01
Matching images acquired in different electromagnetic bands remains a challenging problem. An example of this type of comparison is matching active or passive infrared (IR) against a gallery of visible face images, known as cross-spectral face recognition. Among many unsolved issues is that of quality disparity between the heterogeneous images. Images acquired in different spectral bands are of unequal image quality due to distinct imaging mechanisms, standoff distances, or imaging environments. To reduce the effect of quality disparity on recognition performance, one can manipulate the images either to improve the quality of the poor-quality images or to degrade the high-quality images to the level of their heterogeneous counterparts. To estimate the level of discrepancy in quality between two heterogeneous images, a quality metric such as image sharpness is needed. It provides guidance on how much quality improvement or degradation is appropriate. In this work we consider sharpness as a relative measure of heterogeneous image quality. We propose a generalized definition of sharpness by first achieving image quality parity and then finding and building a relationship between the image quality of two heterogeneous images. Therefore, the new sharpness metric is named heterogeneous sharpness. Image quality parity is achieved by experimentally finding the optimal cross-spectral face recognition performance while the quality of the heterogeneous images is varied using a Gaussian smoothing function with different standard deviations. This relationship is established using two models; one of them involves a regression model and the other involves a neural network. To train, test and validate the model, we use composite operators developed in our lab to extract features from heterogeneous face images and use the sharpness metric to evaluate the face image quality within each band.
Images from three different spectral bands (visible light, near infrared, and short-wave infrared) are considered in this work. Both the error of the regression model and the validation error of the neural network are analyzed.
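The degradation step described above, Gaussian smoothing with a varying standard deviation scored by a sharpness measure, can be sketched as follows. The gradient-magnitude sharpness proxy and the random test image are illustrative assumptions, not the authors' composite-operator features.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def sharpness(img):
    """Mean gradient magnitude: a simple, assumed proxy for image sharpness."""
    gy, gx = np.gradient(img.astype(float))
    return float(np.mean(np.hypot(gx, gy)))

rng = np.random.default_rng(2)
img = rng.uniform(0, 255, (64, 64))  # stand-in for a face image

# Increasing the Gaussian standard deviation monotonically degrades quality,
# so sweeping sigma lets one bring a high-quality image down to the measured
# sharpness of its lower-quality heterogeneous counterpart.
s0 = sharpness(img)
s1 = sharpness(gaussian_filter(img, sigma=1.0))
s2 = sharpness(gaussian_filter(img, sigma=3.0))
```

In the paper's setting, the sigma achieving quality parity is found experimentally from recognition performance rather than from the proxy alone.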
Prediction of specialty coffee cup quality based on near infrared spectra of green coffee beans.
Tolessa, Kassaye; Rademaker, Michael; De Baets, Bernard; Boeckx, Pascal
2016-04-01
The growing global demand for specialty coffee increases the need for improved coffee quality assessment methods. Green bean coffee quality analysis is usually carried out by physical (e.g. black beans, immature beans) and cup quality (e.g. acidity, flavour) evaluation. However, these evaluation methods are subjective, costly, time consuming, require sample preparation and may result in poor grading systems. This calls for the development of a rapid, low-cost, reliable and reproducible analytical method to evaluate coffee quality attributes and eventually chemical compounds of interest (e.g. chlorogenic acid) in coffee beans. The aim of this study was to develop a model able to predict coffee cup quality based on NIR spectra of green coffee beans. NIR spectra of 86 samples of green Arabica beans of varying quality were analysed. The partial least squares (PLS) regression method was used to develop a model correlating spectral data to cupping score data (cup quality). The selected PLS model had a good predictive power for total specialty cup quality and its individual quality attributes (overall cup preference, acidity, body and aftertaste), showing high correlation coefficients, with r-values of 90, 90, 78, 72 and 72, respectively, between measured and predicted cupping scores for 20 out of 86 samples. The corresponding root mean square error of prediction (RMSEP) was 1.04, 0.22, 0.27, 0.24 and 0.27 for total specialty cup quality, overall cup preference, acidity, body and aftertaste, respectively. The results obtained suggest that NIR spectra of green coffee beans are a promising tool for fast and accurate prediction of coffee quality and for classifying green coffee beans into different specialty grades. However, the model should be further tested on coffee samples from different regions in Ethiopia, to determine whether one generic model or region-specific models should be developed. Copyright © 2015 Elsevier B.V. All rights reserved.
MODELS-3 COMMUNITY MULTISCALE AIR QUALITY (CMAQ) MODEL AEROSOL COMPONENT 2. MODEL EVALUATION
Ambient air concentrations of particulate matter (atmospheric suspensions of solid of liquid materials, i.e., aerosols) continue to be a major concern for the U.S. Environmental Protection Agency (EPA). High particulate matter (PM) concentrations are associated not only with adv...
Zhuo, Limeng; Peng, Jingjing; Zhao, Yunli; Li, Dongxiang; Xie, Xiuman; Tong, Ling; Yu, Zhiguo
2017-10-01
Traditional Chinese medicine consists of complex phytochemical constituents. Selecting appropriate analytical markers of traditional Chinese medicine is a critical step in quality control. Currently, the combination of fingerprinting and efficacy evaluation is considered as a useful method for screening active ingredients in complex mixtures. This study was designed to develop an orthogonal partial least squares model for screening bioactive quality control markers of QishenYiqi dripping pills based on the fingerprint-efficacy relationship. First, the chemical fingerprints of 49 batches of QishenYiqi dripping pill samples were established by ultra-high performance liquid chromatography coupled with a photodiode array detector. Second, ultra-high performance liquid chromatography coupled with quadrupole-time-of-flight mass spectrometry was exploited to systematically investigate the 36 copossessing fingerprint components in QishenYiqi dripping pills. The vascular protective activity of QishenYiqi dripping pills was determined by using a cell counting kit-8 assay. Finally, fingerprint-efficacy relationship was established by orthogonal partial least squares model. The results indicated that ten components exhibited strong correlation with vascular protective activity, and these were preliminarily screened as quality control markers. The present study provided a novel idea for the study of the pharmacodynamic material basis and quality evaluation of QishenYiqi dripping pills. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Harris, Katherine M
2002-01-01
Objective To investigate the impact of quality information on the willingness of consumers to enroll in health plans that restrict provider access. Data Sources and Setting A survey administered to respondents between the ages of 25 and 64 in the West Los Angeles area with private health insurance. Study Design An experimental approach is used to measure the effect of variation in provider network features and information about the quality of network physicians on hypothetical plan choices. Conditional logit models are used to analyze the experimental choice data. Next, choice model parameter estimates are used to simulate the impact of changes in plan features on the market shares of competing health plans and to calculate the quality level required to make consumers indifferent to changes in provider access. Principal Findings The presence of quality information reduced the importance of provider network features in plan choices as hypothesized. However, there were not statistically meaningful differences by type of quality measure (i.e., consumer assessed versus expert assessed). The results imply that large quality differences are required to make consumers indifferent to changes in provider access. The impact of quality on plan choices depended more on the particular measure and less on the type of measure. Quality ratings based on the proportion of survey respondents “extremely satisfied with results of care” had the greatest impact on plan choice while the proportion of network doctors “affiliated with university medical centers” had the least. Other consumer and expert assessed measures had more comparable effects. Conclusions Overall the results provide empirical evidence that consumers are willing to trade high quality for restrictions on provider access. This willingness to trade implies that relatively small plans that place restrictions on provider access can successfully compete against less restrictive plans when they can demonstrate high quality. 
However, the results of this study suggest that in many cases, the level of quality required for consumers to accept access restrictions may be so high as to be unattainable. The results provide empirical support for the current focus of decision support efforts on consumer assessed quality measures. At the same time, however, the results suggest that consumers would also value quality measures based on expert assessments. This finding is relevant given the lack of comparative quality information based on expert judgment and research suggesting that consumers have apprehensions about their ability to meaningfully interpret performance-based quality measures. PMID:12132595
Schmitt, John; Beller, Justin; Russell, Brian; Quach, Anthony; Hermann, Elizabeth; Lyon, David; Breit, Jeffrey
2017-01-01
As the biopharmaceutical industry evolves to include more diverse protein formats and processes, more robust control of Critical Quality Attributes (CQAs) is needed to maintain processing flexibility without compromising quality. Active control of CQAs has been demonstrated using model predictive control techniques, which allow the development of processes that are robust against disturbances associated with raw material variability and other potentially flexible operating conditions. Wide adoption of model predictive control in biopharmaceutical cell culture processes has been hampered, however, in part due to the large amount of data and expertise required to make a predictive model of controlled CQAs, a requirement for model predictive control. Here we developed a highly automated perfusion apparatus to systematically and efficiently generate predictive models through the application of system identification approaches. We successfully created a predictive model of %galactosylation using data obtained by manipulating galactose concentration in the perfusion apparatus in serialized step change experiments. We then demonstrated the use of the model in a model predictive controller in a simulated control scenario to successfully achieve a %galactosylation set point in a simulated fed‐batch culture. The automated model identification approach demonstrated here can potentially be generalized to many CQAs, and could be a more efficient, faster, and highly automated alternative to batch experiments for developing predictive models in cell culture processes, and allow the wider adoption of model predictive control in biopharmaceutical processes. © 2017 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers Biotechnol. Prog., 33:1647–1661, 2017 PMID:28786215
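The workflow described above can be sketched in miniature: fit a predictive model from step-change data, then use it inside a controller. The first-order model, the one-step-ahead control law, and all numbers below are illustrative assumptions, not the authors' actual system identification or MPC formulation.

```python
# Hypothetical sketch: identify y[k+1] = a*y[k] + b*u[k] from
# serialized step-change data, then use the fitted model in a
# one-step-ahead "model predictive" controller that drives a CQA
# (e.g. %galactosylation) to a set point. Invented numbers throughout.
import numpy as np

def identify_first_order(y, u):
    """Least-squares fit of y[k+1] = a*y[k] + b*u[k] (len(y) == len(u)+1)."""
    X = np.column_stack([y[:-1], u])
    a, b = np.linalg.lstsq(X, y[1:], rcond=None)[0]
    return a, b

def mpc_step(a, b, y_now, setpoint):
    """Pick the input u that makes the model predict y[k+1] == setpoint."""
    return (setpoint - a * y_now) / b

# "True" plant the controller does not know (e.g. %galactosylation
# responding to galactose feed concentration).
a_true, b_true = 0.8, 0.5
rng = np.random.default_rng(0)
u = rng.uniform(0.0, 2.0, 50)          # serialized step changes
y = np.zeros(51)
for k in range(50):
    y[k + 1] = a_true * y[k] + b_true * u[k]

a_hat, b_hat = identify_first_order(y, u)

# Closed loop: drive the CQA to a set point of 60.
y_cl, setpoint = 10.0, 60.0
for _ in range(20):
    u_k = mpc_step(a_hat, b_hat, y_cl, setpoint)
    y_cl = a_true * y_cl + b_true * u_k
print(round(y_cl, 2))  # → 60.0
```

A real application would add input constraints, a multi-step prediction horizon, and noise handling; the point here is only the identify-then-control loop.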
An Overview of Controls and Flying Qualities Technology on the F/A-18 High Alpha Research Vehicle
NASA Technical Reports Server (NTRS)
Pahle, Joseph W.; Wichman, Keith D.; Foster, John V.; Bundick, W. Thomas
1996-01-01
The NASA F/A-18 High Alpha Research Vehicle (HARV) has been the flight test bed of a focused technology effort to significantly increase maneuvering capability at high angles of attack. Development and flight test of control law design methodologies, handling qualities metrics, performance guidelines, and flight evaluation maneuvers are described. The HARV has been modified to include two research control effectors, thrust vectoring, and actuated forebody strakes in order to provide increased control power at high angles of attack. A research flight control system has been used to provide a flexible, easily modified capability for high-angle-of-attack research controls. Different control law design techniques have been implemented and flight-tested, including eigenstructure assignment, variable gain output feedback, pseudo controls, and model-following. Extensive piloted simulation has been used to develop nonlinear performance guidelines and handling qualities criteria for high angles of attack. This paper reviews the development and evaluation of technologies useful for high-angle-of-attack control. Design, development, and flight test of the research flight control system, control laws, flying qualities specifications, and flight test maneuvers are described. Flight test results are used to illustrate some of the lessons learned during flight test and handling qualities evaluations.
Rump, A; Schöffski, O
2018-07-01
Healthcare systems in developed countries may differ in financing and organisation. Maternity services and delivery are particularly influenced by culture and habits. In this study, we compared the pregnancy care quality and efficiency of the German, French and Japanese healthcare systems. Comparative healthcare data analysis. In an international comparison based mainly on Organisation for Economic Co-operation and Development (OECD) indicators, we analysed the health resources significantly affecting pregnancy care and quantified its quality using structural equation modelling. Pregnancy care efficiency was studied using data envelopment analysis. Pregnancy output was quantified overall or separately using indicators based on perinatal, neonatal or maternal mortality. The density of obstetricians, midwives, paediatricians and the average annual doctor's consultations were positively and the caesarean delivery rate negatively associated with pregnancy outcome. In the international comparison at an aggregate level, Japan ranked first for pregnancy care quality, whereas Germany and France were positioned in the second part of the ranking. Similarly, at an aggregate level, the Japanese system showed pure technical efficiency, whereas Germany and France revealed mediocre efficiency results. Perinatal, neonatal and maternal care quality and efficiency taken separately were quite similar and mediocre in Germany and France. In Japan, there was a marked difference between a highly effective and efficient care of the unborn and newborn baby, and a rather mediocre quality and efficiency of maternal care. Germany, France, and Japan have to struggle with quality and efficiency issues that are nevertheless different: in Germany and France, disappointing pregnancy care quality does not correspond to the high health care expenditures and leads to low technical efficiency. The Japanese system shows a high variability in outcomes and technical efficiency.
Maternal care quality during delivery seems to be a particular issue that could possibly be addressed by legally implementing quality assurance systems with stricter rules for reimbursement in obstetrics. Copyright © 2018 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
Loftus, Kelli; Tilley, Terry; Hoffman, Jason; Bradburn, Eric; Harvey, Ellen
2015-01-01
The creation of a consistent culture of safety and quality in an intensive care unit is challenging. We applied the Six Sigma Define-Measure-Analyze-Improve-Control (DMAIC) model for quality improvement (QI) to develop a long-term solution to improve outcomes in a high-risk neurotrauma intensive care unit. We sought to reduce central line utilization as a cornerstone in preventing central line-associated bloodstream infections (CLABSIs). This study describes the successful application of the DMAIC model in the creation and implementation of evidence-based quality improvement designed to reduce CLABSIs to below national benchmarks.
Perez-Ponce, Hector; Daul, Christian; Wolf, Didier; Noel, Alain
2013-08-01
In mammography, image quality assessment has to be directly related to breast cancer indicator (e.g. microcalcifications) detectability. Recently, we proposed an X-ray source/digital detector (XRS/DD) model leading to such an assessment. This model simulates very realistic contrast-detail phantom (CDMAM) images leading to gold disc (representing microcalcifications) detectability thresholds that are very close to those of real images taken under the simulated acquisition conditions. The detection step was performed with a mathematical observer. The aim of this contribution is to include human observers into the disc detection process in real and virtual images to validate the simulation framework based on the XRS/DD model. Mathematical criteria (contrast-detail curves, image quality factor, etc.) are used to assess and to compare, from the statistical point of view, the cancer indicator detectability in real and virtual images. The quantitative results given in this paper show that the images simulated by the XRS/DD model are useful for image quality assessment in the case of all studied exposure conditions using either human or automated scoring. Also, this paper confirms that with the XRS/DD model the image quality assessment can be automated and the whole time of the procedure can be drastically reduced. Compared to standard quality assessment methods, the number of images to be acquired is divided by a factor of eight. Copyright © 2012 IPEM. Published by Elsevier Ltd. All rights reserved.
Stodola, Kirk W; Ward, Michael P
2017-06-01
Multiple biotic, abiotic, and evolutionary constraints interact to determine a species' range. However, most species are not present in all suitable and accessible locations. Dispersal ability may explain why many species do not occupy all suitable habitat, but highly mobile species also exhibit a mismatch. Habitat selection behavior where individuals are site faithful and settle near conspecifics could create a social pressure that makes a species' geographic range resistant to change. We investigated this possibility by using an individual-based model of habitat selection where habitat quality moved each year. Our model demonstrated the benefits of conspecific attraction in relatively stable environments and its detrimental influence when habitat quality shifted rapidly. These results were most apparent when adult survival was high, because site fidelity led to more individuals occupying poor-quality habitat areas as habitat quality changed. These individuals attracted other dispersing individuals, thereby decreasing the ability to track shifts in habitat quality, which we refer to as "social inertia."  Consequently, social inertia may arise for species that exhibit site fidelity and conspecific attraction, which may have conservation implications in light of climate change and widespread alteration of natural habitats.
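The "social inertia" mechanism lends itself to a tiny toy simulation: on a one-dimensional habitat whose quality peak drifts each year, site-faithful adults stay put and new settlers weigh conspecific density against habitat quality, so the population lags the moving peak. This is a minimal sketch with invented parameters, not the authors' individual-based model.

```python
# Toy model of "social inertia": adults are site-faithful, recruits
# are attracted to conspecifics, and the habitat-quality peak drifts
# one site per year. All parameters are illustrative assumptions.
import numpy as np

def simulate(attraction, n_years=40, n_sites=100, n_birds=200,
             survival=0.8, seed=1):
    rng = np.random.default_rng(seed)
    peak = 20.0
    sites = np.arange(n_sites)
    pop = rng.integers(0, n_sites, n_birds)     # site index per individual
    lag = 0.0
    for _ in range(n_years):
        peak += 1.0                             # habitat quality shifts
        quality = np.exp(-((sites - peak) ** 2) / 50.0)
        # surviving adults keep their site (site fidelity)
        adults = pop[rng.random(pop.size) < survival]
        # recruits weigh quality and conspecific density
        dens = np.bincount(adults, minlength=n_sites)
        w = quality + attraction * dens / max(adults.size, 1)
        w /= w.sum()
        recruits = rng.choice(sites, size=n_birds - adults.size, p=w)
        pop = np.concatenate([adults, recruits])
        lag = peak - pop.mean()                 # distance behind the peak
    return lag

lag_social = simulate(attraction=20.0)
lag_asocial = simulate(attraction=0.0)
print(round(lag_social, 1), round(lag_asocial, 1))
```

Averaged over seeds, strong conspecific attraction produces a larger lag behind the moving quality peak than quality-only settlement, mirroring the abstract's result.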
Building A High Quality Oncology Nursing Workforce Through Lifelong Learning: The De Souza Model.
Esplen, Mary Jane; Wong, Jiahui; Green, Esther; Richards, Joy; Li, Jane
2018-01-05
Cancer is one of the leading causes of death in the world. Along with the increase in new cases, cancer care has become increasingly complex due to advances in diagnostics and treatments, greater survival, and new models of palliative care. Nurses are a critical resource for cancer patients and their families. Their roles and responsibilities are expanding across the cancer care continuum, calling for specialized training and support. Formal education prepares nurses for entry-level practice; however, it does not provide the specialized competencies required for quality care of cancer patients. There is an urgent need to align the educational system with the demands of the health care system, ease the transition from formal academic systems to care settings, and instill a philosophy of lifelong learning. We describe a model of education developed by de Souza Institute in Canada, based on the Novice to Expert specialty training framework, and its success in offering structured oncology continuing education training to nurses, from undergraduate levels to continued career development in the clinical setting. This model may have global relevance, given the challenge in managing the demand for high quality care in all disease areas and in keeping pace with the emerging advances in technologies.
NASA Astrophysics Data System (ADS)
Choi, Hyun-Jung; Lee, Hwa Woon; Jeon, Won-Bae; Lee, Soon-Hwan
2012-01-01
This study evaluated an atmospheric and air quality model of the spatial variability in low-level coastal winds and ozone concentration, which are affected by sea surface temperature (SST) forcing with different thermal gradients. Several numerical experiments examined the effect of sea surface SST forcing on the coastal atmosphere and air quality. In this study, the RAMS-CAMx model was used to estimate the sensitivity to two different resolutions of SST forcing during the episode day as well as to simulate the low-level coastal winds and ozone concentration over a complex coastal area. The regional model reproduced the qualitative effect of SST forcing and thermal gradients on the coastal flow. The high-resolution SST derived from NGSST-O (New Generation Sea Surface Temperature Open Ocean) forcing to resolve the warm SST appeared to enhance the mean response of low-level winds to coastal regions. These wind variations have important implications for coastal air quality. A higher ozone concentration was forecasted when SST data with a high resolution was used with the appropriate limitation of temperature, regional wind circulation, vertical mixing height and nocturnal boundary layer (NBL) near coastal areas.
Remote Sensing Characterization of the Urban Landscape for Improvement of Air Quality Modeling
NASA Technical Reports Server (NTRS)
Quattrochi, Dale A.; Estes, Maurice G., Jr.; Khan, Maudood
2005-01-01
The urban landscape is inherently complex and this complexity is not adequately captured in air quality models, particularly the Community Multiscale Air Quality (CMAQ) model that is used to assess whether urban areas are in attainment of EPA air quality standards, primarily for ground level ozone. This inadequacy of the CMAQ model to sufficiently respond to the heterogeneous nature of the urban landscape can impact how well the model predicts ozone pollutant levels over metropolitan areas and ultimately, whether cities exceed EPA ozone air quality standards. We are exploring the utility of high-resolution remote sensing data and urban growth projections as improved inputs to the meteorology component of the CMAQ model focusing on the Atlanta, Georgia metropolitan area as a case study. These growth projections include "business as usual" and "smart growth" scenarios out to 2030. The growth projections illustrate the effects of employing urban heat island mitigation strategies, such as increasing tree canopy and albedo across the Atlanta metro area, in moderating ground-level ozone and air temperature, compared to "business as usual" simulations in which heat island mitigation strategies are not applied. The National Land Cover Dataset at 30m resolution is being used as the land use/land cover input and aggregated to the 4km scale for the MM5 mesoscale meteorological model and the CMAQ modeling schemes. Use of these data has been found to better characterize low density urban development as compared with USGS 1 km land use/land cover data that have traditionally been used in modeling. Air quality prediction for future scenarios to 2030 is being facilitated by land use projections using a spatial growth model. Land use projections were developed using the 2030 Regional Transportation Plan developed by the Atlanta Regional Commission, the regional planning agency for the area.
This allows the state Environmental Protection Agency to evaluate how these transportation plans will affect future air quality.
Hybrid Air Quality Modeling Approach For Use in the Near ...
The Near-road EXposures to Urban air pollutant Study (NEXUS) investigated whether children with asthma living in close proximity to major roadways in Detroit, MI, (particularly near roadways with high diesel traffic) have greater health impacts associated with exposure to air pollutants than those living farther away. A major challenge in such health and exposure studies is the lack of information regarding pollutant exposure characterization. Air quality modeling can provide spatially and temporally varying exposure estimates for examining relationships between traffic-related air pollutants and adverse health outcomes. This paper presents a hybrid air quality modeling approach and its application in NEXUS in order to provide spatial and temporally varying exposure estimates and identification of the mobile source contribution to the total pollutant exposure. Model-based exposure metrics, associated with local variations of emissions and meteorology, were estimated using a combination of the AERMOD and R-LINE dispersion models, local emission source information from the National Emissions Inventory, detailed road network locations and traffic activity, and meteorological data from the Detroit City Airport. The regional background contribution was estimated using a combination of the Community Multiscale Air Quality (CMAQ) model and the Space/Time Ordinary Kriging (STOK) model. To capture the near-road pollutant gradients, refined “mini-grids” of model receptors were used.
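The core of the hybrid approach is additive: total concentration at a receptor is a local, dispersion-modelled traffic contribution plus a regional background. The sketch below replaces the real dispersion (AERMOD/R-LINE) and background (CMAQ/STOK) components with crude stand-ins (an exponential fall-off and a constant field) purely to illustrate the combination; every number is an assumption.

```python
# Toy version of the hybrid exposure idea: near-road local term plus
# regional background. The exponential decay is a stand-in for a
# line-source dispersion model, not a real one.
import math

def local_contribution(dist_m, emission=30.0, length_scale=150.0):
    """Crude near-road fall-off with distance to the road (stand-in)."""
    return emission * math.exp(-dist_m / length_scale)

def hybrid_concentration(dist_m, background=8.0):
    """Total = regional background + local traffic contribution."""
    return background + local_contribution(dist_m)

near = hybrid_concentration(10.0)    # receptor 10 m from the road
far = hybrid_concentration(500.0)    # receptor 500 m away
print(round(near, 1), round(far, 1))  # → 36.1 9.1
```

The steep near-to-far gradient is exactly what the dense "mini-grids" of receptors are meant to resolve; far from the road the background term dominates.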
NASA Astrophysics Data System (ADS)
Abbaspour, K. C.; Rouholahnejad, E.; Vaghefi, S.; Srinivasan, R.; Yang, H.; Kløve, B.
2015-05-01
A combination of driving forces is increasing pressure on local, national, and regional water supplies needed for irrigation, energy production, industrial uses, domestic purposes, and the environment. In many parts of Europe groundwater quantity, and in particular quality, have come under severe degradation and water levels have decreased, resulting in negative environmental impacts. Rapid improvements in the economy of the eastern European block of countries and uncertainties with regard to freshwater availability create challenges for water managers. At the same time, climate change adds a new level of uncertainty with regard to freshwater supplies. In this research we build and calibrate an integrated hydrological model of Europe using the Soil and Water Assessment Tool (SWAT) program. Different components of water resources are simulated and crop yield and water quality are considered at the Hydrological Response Unit (HRU) level. The water resources are quantified at subbasin level with monthly time intervals. Leaching of nitrate into groundwater is also simulated at a finer spatial level (HRU). The use of large-scale, high-resolution water resources models enables consistent and comprehensive examination of integrated system behavior through physically-based, data-driven simulation. In this article we discuss issues with data availability, calibration of large-scale distributed models, and outline procedures for model calibration and uncertainty analysis. The calibrated model and results provide information support to the European Water Framework Directive and lay the basis for further assessment of the impact of climate change on water availability and quality. The approach and methods developed are general and can be applied to any large region around the world.
Visual air quality simulation techniques
NASA Astrophysics Data System (ADS)
Molenar, John V.; Malm, William C.; Johnson, Christopher E.
Visual air quality is primarily a human perceptual phenomenon beginning with the transfer of image-forming information through an illuminated, scattering and absorbing atmosphere. Visibility, especially the visual appearance of industrial emissions or the degradation of a scenic view, is the principal atmospheric characteristic through which humans perceive air pollution, and is more sensitive to changing pollution levels than any other air pollution effect. Every attempt to quantify economic costs and benefits of air pollution has indicated that good visibility is a highly valued and desired environmental condition. Measurement programs can at best approximate the state of the ambient atmosphere at a few points in a scenic vista viewed by an observer. To fully understand the visual effect of various changes in the concentration and distribution of optically important atmospheric pollutants requires the use of aerosol and radiative transfer models. Communication of the output of these models to scientists, decision makers and the public is best done by applying modern image-processing systems to generate synthetic images representing the modeled air quality conditions. This combination of modeling techniques has been under development for the past 15 yr. Initially, visual air quality simulations were limited by a lack of computational power to simplified models depicting Gaussian plumes or uniform haze conditions. Recent explosive growth in low cost, high powered computer technology has allowed the development of sophisticated aerosol and radiative transfer models that incorporate realistic terrain, multiple scattering, non-uniform illumination, varying spatial distribution, concentration and optical properties of atmospheric constituents, and relative humidity effects on aerosol scattering properties. This paper discusses these improved models and image-processing techniques in detail. 
Results addressing uniform and non-uniform layered haze conditions in both urban and remote pristine areas will be presented.
EMRinger: side chain–directed model and map validation for 3D cryo-electron microscopy
Barad, Benjamin A.; Echols, Nathaniel; Wang, Ray Yu-Ruei; ...
2015-08-17
Advances in high-resolution cryo-electron microscopy (cryo-EM) require the development of validation metrics to independently assess map quality and model geometry. We report EMRinger, a tool that assesses the precise fitting of an atomic model into the map during refinement and shows how radiation damage alters scattering from negatively charged amino acids. EMRinger (https://github.com/fraser-lab/EMRinger) will be useful for monitoring progress in resolving and modeling high-resolution features in cryo-EM.
Towards an operational high-resolution air quality forecasting system at ECCC
NASA Astrophysics Data System (ADS)
Munoz-Alpizar, Rodrigo; Stroud, Craig; Ren, Shuzhan; Belair, Stephane; Leroyer, Sylvie; Souvanlasy, Vanh; Spacek, Lubos; Pavlovic, Radenko; Davignon, Didier; Moran, Moran
2017-04-01
Urban environments are particularly sensitive to weather, air quality (AQ), and climatic conditions. Despite the efforts made in Canada to reduce pollution in urban areas, AQ continues to be a concern for the population, especially during short-term episodes that could lead to exceedances of daily air quality standards. Furthermore, urban air pollution has long been associated with significant adverse health effects. In Canada, the large percentage of the population living in urban areas (~81%, according to Canada's 2011 census) is exposed to elevated air pollution due to local emissions sources. Thus, in order to improve the services offered to the Canadian public, Environment and Climate Change Canada has launched an initiative to develop a high-resolution air quality prediction capacity for urban areas in Canada. This presentation will show observed pollution trends (2010-2016) for Canadian mega-cities along with some preliminary high-resolution air quality modelling results. Short-term and long-term plans for urban AQ forecasting in Canada will also be described.
Competition for resources can explain patterns of social and individual learning in nature.
Smolla, Marco; Gilman, R Tucker; Galla, Tobias; Shultz, Susanne
2015-09-22
In nature, animals often ignore socially available information despite the multiple theoretical benefits of social learning over individual trial-and-error learning. Using information filtered by others is quicker, more efficient and less risky than randomly sampling the environment. To explain the mix of social and individual learning used by animals in nature, most models penalize the quality of socially derived information as either out of date, of poor fidelity or costly to acquire. Competition for limited resources, a fundamental evolutionary force, provides a compelling, yet hitherto overlooked, explanation for the evolution of mixed-learning strategies. We present a novel model of social learning that incorporates competition and demonstrates that (i) social learning is favoured when competition is weak, but (ii) if competition is strong social learning is favoured only when resource quality is highly variable and there is low environmental turnover. The frequency of social learning in our model always evolves until it reduces the mean foraging success of the population. The results of our model are consistent with empirical studies showing that individuals rely less on social information where resources vary little in quality and where there is high within-patch competition. Our model provides a framework for understanding the evolution of social learning, a prerequisite for human cumulative culture. © 2015 The Author(s).
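The competition argument can be illustrated with a minimal foraging toy: payoffs are split among a patch's occupants (scramble competition), social learners all head for the best-known patch, and individual learners sample at random. When social learners are common, the copied patch gets crowded and their advantage erodes. Parameters and patch values are invented for illustration, not the authors' model.

```python
# Toy of competition eroding social learning: patch payoffs are
# divided among occupants; social learners copy the best-known patch,
# individual learners sample randomly. Illustrative assumption-laden
# sketch, not the published model.
import random

def mean_payoffs(n_social, n_individual, patch_values, seed=0):
    rng = random.Random(seed)
    patches = list(range(len(patch_values)))
    best = max(patches, key=lambda p: patch_values[p])
    choices = [best] * n_social + [rng.choice(patches)
                                   for _ in range(n_individual)]
    counts = {p: choices.count(p) for p in set(choices)}
    payoffs = [patch_values[p] / counts[p] for p in choices]
    social = sum(payoffs[:n_social]) / n_social
    individual = sum(payoffs[n_social:]) / n_individual
    return social, individual

values = [1.0, 2.0, 3.0, 10.0]                     # one high-quality patch
weak_s, weak_i = mean_payoffs(2, 10, values)       # few social learners
strong_s, strong_i = mean_payoffs(40, 10, values)  # many social learners
print(round(weak_s, 2), round(strong_s, 2))
```

With few social learners the copied patch pays well; with many, each copier's share collapses (here to at most 10/40), matching the prediction that social learning spreads until it depresses mean foraging success.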
Feldmesser, Ester; Rosenwasser, Shilo; Vardi, Assaf; Ben-Dor, Shifra
2014-02-22
The advent of Next Generation Sequencing technologies and corresponding bioinformatics tools allows the definition of transcriptomes in non-model organisms. Non-model organisms are of great ecological and biotechnological significance, and consequently the understanding of their unique metabolic pathways is essential. Several methods that integrate de novo assembly with genome-based assembly have been proposed. Yet, there are many open challenges in defining genes, particularly where genomes are not available or incomplete. Despite the large number of transcriptome assemblies that have been performed, quality control of the transcript building process, particularly on the protein level, is rarely, if ever, performed. To test and improve the quality of the automated transcriptome reconstruction, we used manually defined and curated genes, several of them experimentally validated. Several approaches to transcript construction were utilized, based on the available data: a draft genome, high quality RNAseq reads, and ESTs. In order to maximize the contribution of the various data, we integrated methods including de novo and genome based assembly, as well as EST clustering. After each step a set of manually curated genes was used for quality assessment of the transcripts. The interplay between the automated pipeline and the quality control indicated which additional processes were required to improve the transcriptome reconstruction. We discovered that E. huxleyi has a very high percentage of non-canonical splice junctions, and relatively high rates of intron retention, which caused unique issues with the currently available tools. While individual tools missed genes and artificially joined overlapping transcripts, combining the results of several tools improved the completeness and quality considerably.
The final collection, created from the integration of several quality control and improvement rounds, was compared to the manually defined set both on the DNA and protein levels, and resulted in an improvement of 20% versus any of the read-based approaches alone. To the best of our knowledge, this is the first time that an automated transcript definition is subjected to quality control using manually defined and curated genes and thereafter the process is improved. We recommend using a set of manually curated genes to troubleshoot transcriptome reconstruction.
NASA Astrophysics Data System (ADS)
Hong, Yi; Bonhomme, Celine; Giangola-Murzyn, Agathe; Schertzer, Daniel; Chebbo, Ghassan
2015-04-01
Nowadays, the increasing use of vehicles generates growing volumes of contaminated storm-water runoff from roads and the surrounding neighbourhoods. In addition, the widespread use of separate urban sewer systems underlines the need to precisely evaluate the growing impact of these polluted effluents on receiving water bodies. Traditional water quality modelling approaches have shown their limits (Kanso, 2004), so more accurate modelling schemes are required. In this paper, we found that a physically based and fully distributed model coupled with detailed high-resolution data is a promising approach for reproducing the various dynamics and interactions of water quantity/quality processes in urban or peri-urban environments. Over recent years, the physically based and spatially distributed numerical platform Multi-Hydro (MH) has been developed at Ecole des Ponts ParisTech (El-Tabach et al., 2009; Gires et al., 2013; Giangola-Murzyn et al., 2014). This platform is particularly well adapted to representing the hydrological processes of medium-size watersheds, including surface runoff, drainage water routing and infiltration in permeable zones. It is formed by the interactive coupling of several independent modules, which build on widely used open-access models. In the framework of the ANR (French National Agency for Research) Trafipollu project, a new extension of MH, MH-quality, was set up for water-quality modelling. MH-quality was used to simulate pollutant transport on a peri-urban and highly trafficked catchment located near Paris (Le Perreux-sur-Marne, 0.2 km2). The set-up of this model is based on a detailed description of urban land use features. For this purpose, 15 classes of urban land use relevant to water quality modelling were defined in collaboration with the National Institute of Geography of France (IGN) using Digital Orthophoto Quadrangles (5 cm).
The urban catchment was then delimited using a Digital Terrain Model generated from Lidar data (20 cm) together with GIS information on the drainage system. In addition to land use information, the representation of different human activities allows a better evaluation of contamination. Experimental data such as rainfall intensities, particle size distributions and dry weather deposition are also used in order to feed the model with realistic input data and parameters. Runoff and water quality are then simulated for a few rainfall events. Taking advantage of the available continuous observations of precipitation, water discharge and turbidity at the outlet of the drainage system, a sensitivity analysis is carried out to evaluate the performance of MH-quality and to identify its most sensitive parameters. Using appropriate parameters, we are now able to follow pollutant transport on our experimental urban catchment. The limitations and perspectives of MH-quality are discussed as well.
Durrieu, Gilles; Pham, Quang-Khoai; Foltête, Anne-Sophie; Maxime, Valérie; Grama, Ion; Tilly, Véronique Le; Duval, Hélène; Tricot, Jean-Marie; Naceur, Chiraz Ben; Sire, Olivier
2016-07-01
Water quality can be evaluated using biomarkers such as the tissular enzymatic activities of endemic species. Measuring the activity of bivalve molluscs at high frequency (e.g., valvometry) over a long time period is another way to record animal behavior and to evaluate perturbations of water quality in real time. Because pollution affects the activity of oysters, we consider the opening and closing velocities of their valves to monitor water quality. We propose to model the huge volume of velocity data collected by valvometry using a new nonparametric extreme values statistical model. The objective is to estimate the tail probabilities and the extreme quantiles of the distribution of valve closing velocity. The tail of the distribution function of valve closing velocity is modeled by a Pareto distribution with a parameter θ(t, τ) that depends on the time t of the experiment, beyond a threshold τ. Our modeling approach reveals the dependence between the specific activity of two enzymatic biomarkers (glutathione-S-transferase and acetylcholinesterase) and the continuous recording of oyster valve velocity, proving the suitability of this tool for water quality assessment. Thus, valvometry allows real-time in situ analysis of bivalve behavior and appears to be an effective early warning tool in ecological risk assessment and marine environment monitoring.
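The tail-modelling step can be sketched with a standard peaks-over-threshold calculation: treat exceedances over a threshold τ as Pareto, estimate the tail index by maximum likelihood (the Hill estimator), and invert the fitted tail for an extreme quantile. The synthetic data, the fixed 95% threshold, and the time-invariant index are simplifying assumptions; the paper's estimator lets the parameter vary with time.

```python
# Peaks-over-threshold sketch on synthetic "velocity" data: Hill/ML
# estimate of the Pareto tail index beyond tau, then an extreme
# quantile by inverting P(X > x) ≈ (k/n) * (x/tau)^(-alpha).
import numpy as np

rng = np.random.default_rng(42)
velocities = rng.pareto(3.0, 100_000) + 1.0   # true tail index alpha = 3

tau = np.quantile(velocities, 0.95)           # threshold
exceed = velocities[velocities > tau]

# Hill / maximum-likelihood estimator of the Pareto index above tau
alpha_hat = exceed.size / np.sum(np.log(exceed / tau))

def extreme_quantile(p, tau, alpha, n, k):
    """Invert the fitted tail survival function for the p-quantile."""
    return tau * ((1 - p) * n / k) ** (-1.0 / alpha)

q999 = extreme_quantile(0.999, tau, alpha_hat,
                        velocities.size, exceed.size)
print(round(alpha_hat, 2))   # close to the true index 3
```

With ~5,000 exceedances the index estimate is tight; the 0.999 quantile estimate lands near the true value of 10 for this distribution.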
ProTSAV: A protein tertiary structure analysis and validation server.
Singh, Ankita; Kaushik, Rahul; Mishra, Avinash; Shanker, Asheesh; Jayaram, B
2016-01-01
Quality assessment of predicted model structures of proteins is as important as tertiary structure prediction itself. Highly efficient quality assessment of predicted model structures directs further research on function. Here we present a new server, ProTSAV, capable of evaluating predicted model structures based on several popular online servers and standalone tools. ProTSAV furnishes the user with a single quality score for an individual protein structure, along with a graphical representation and ranking for multiple protein structure assessment. The server is validated on ~64,446 protein structures, including experimental structures from RCSB and predicted model structures for CASP targets and from public decoy sets. ProTSAV predicts the quality of protein structures with a specificity of 100% and a sensitivity of 98% on experimentally solved structures, and achieves a specificity of 88% and a sensitivity of 91% on predicted protein structures of CASP11 targets under 2 Å. The server overcomes the limitations of any single server/method and is robust in quality assessment. ProTSAV is freely available at http://www.scfbio-iitd.res.in/software/proteomics/protsav.jsp. Copyright © 2015 Elsevier B.V. All rights reserved.
Murillo, Gabriel H; You, Na; Su, Xiaoquan; Cui, Wei; Reilly, Muredach P; Li, Mingyao; Ning, Kang; Cui, Xinping
2016-05-15
Single nucleotide variant (SNV) detection procedures are being utilized as never before to analyze the recent abundance of high-throughput DNA sequencing data, both on single and multiple sample datasets. Building on previously published work with the single sample SNV caller genotype model selection (GeMS), a multiple sample version of GeMS (MultiGeMS) is introduced. Unlike other popular multiple sample SNV callers, the MultiGeMS statistical model accounts for enzymatic substitution sequencing errors. It also addresses the multiple testing problem endemic to multiple sample SNV calling and utilizes high performance computing (HPC) techniques. A simulation study demonstrates that MultiGeMS ranks highest in precision among a selection of popular multiple sample SNV callers, while showing exceptional recall in calling common SNVs. Further, both simulation studies and real data analyses indicate that MultiGeMS is robust to low-quality data. We also demonstrate that accounting for enzymatic substitution sequencing errors not only improves SNV call precision at low mapping quality regions, but also improves recall at reference allele-dominated sites with high mapping quality. The MultiGeMS package can be downloaded from https://github.com/cui-lab/multigems. Contact: xinping.cui@ucr.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Predicting fire effects on water quality: a perspective and future needs
NASA Astrophysics Data System (ADS)
Smith, Hugh; Sheridan, Gary; Nyman, Petter; Langhans, Christoph; Noske, Philip; Lane, Patrick
2017-04-01
Forest environments are a globally significant source of drinking water. Fire presents a credible threat to the supply of high quality water in many forested regions. The post-fire risk to water supplies depends on storm event characteristics, vegetation cover and fire-related changes in soil infiltration and erodibility modulated by landscape position. The resulting magnitude of runoff generation, erosion and constituent flux to streams and reservoirs determines the severity of water quality impacts in combination with the physical and chemical composition of the entrained material. Research to date suggests that most post-fire water quality impacts are due to large increases in the supply of particulates (fine-grained sediment and ash) and particle-associated chemical constituents. The largest water quality impacts result from high magnitude erosion events, including debris flow processes, which typically occur in response to short duration, high intensity storm events during the recovery period. Most research to date focuses on impacts on water quality after fire. However, information on potential water quality impacts is required prior to fire events for risk planning. Moreover, changes in climate and forest management (e.g. prescribed burning) that affect fire regimes may alter water quality risks. Therefore, prediction requires spatial-temporal representation of fire and rainfall regimes coupled with information on fire-related changes to soil hydrologic parameters. Recent work has applied such an approach by combining a fire spread model with historic fire weather data in a Monte Carlo simulation to quantify probabilities associated with fire and storm events generating debris flows and fine sediment influx to a reservoir located in Victoria, Australia. Prediction of fire effects on water quality would benefit from further research in several areas. 
First, more work on regional-scale stochastic modelling of intersecting fire and storm events with landscape zones of erosion vulnerability is required to support quantitative evaluation of water quality risk and the effect of future changes in climate and land management. Second, we underscore previous calls for characterisation of landscape-scale domains to support regionalisation of parameter sets derived from empirical studies. Recent examples include work identifying aridity as a control of hydro-geomorphic response to fire and the use of spectral-based indices to predict spatial heterogeneity in ash loadings. Third, information on post-fire erosion from colluvial or alluvial stores is needed to determine their significance as both sediment-contaminant sinks and sources. Such sediment stores may require explicit spatial representation in risk models for some environments and sediment tracing can be used to determine their relative importance as secondary sources. Fourth, increased dating of sediment archives could provide regional datasets of fire-related erosion event frequency. Presently, the lack of such data hinders evaluation of risk models linking fire and storm events to erosion and water quality impacts.
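The stochastic fire-and-storm intersection described above can be caricatured as a short Monte Carlo loop. All numbers below (fire probability, lognormal storm-intensity model, triggering threshold) are invented placeholders, not calibrated values from the Victorian case study.

```python
import numpy as np

def debris_flow_probability(n_years=100_000, p_fire=0.05,
                            storms_per_year=6, i_threshold=25.0, seed=0):
    """Monte Carlo estimate of the annual probability that a fire is
    followed, within the recovery window, by a storm intense enough
    to trigger a debris flow. Parameters are illustrative only."""
    rng = np.random.default_rng(seed)
    burned = rng.random(n_years) < p_fire          # did a fire occur?
    # Annual maximum short-duration storm intensity (mm/h)
    intensity = rng.lognormal(mean=2.5, sigma=0.6,
                              size=(n_years, storms_per_year)).max(axis=1)
    # A debris flow requires burned soil AND an intense enough storm
    return np.mean(burned & (intensity > i_threshold))

p_event = debris_flow_probability()
```

In a real application the single `p_fire` would be replaced by the output of a fire spread model driven by historic fire weather, and the threshold would depend on soil and terrain parameters.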
de Lange, Annet H; Kompier, Michiel A J; Taris, Toon W; Geurts, Sabine A E; Beckers, Debby G J; Houtman, Irene L D; Bongers, Paulien M
2009-09-01
This prospective four-wave study examined (i) the causal direction of the longitudinal relations among job demands, job control, sleep quality and fatigue; and (ii) the effects of stability and change in demand-control history on the development of sleep quality and fatigue. Based on results of a four-wave complete panel study among 1163 Dutch employees, we found significant effects of job demands and job control on sleep quality and fatigue across a 1-year time lag, supporting the strain hypothesis (Demand-Control model; Karasek and Theorell, Basic Books, New York, 1990). No reversed or reciprocal causal patterns were detected. Furthermore, our results revealed that cumulative exposure to a high-strain work environment (characterized by high job demands and low job control) was associated with elevated levels of sleep-related complaints. Cumulative exposure to a low-strain work environment (i.e. low job demands and high job control) was associated with the highest sleep quality and lowest level of fatigue. Our results revealed further that changes in exposure history were related to changes in reported sleep quality and fatigue across time. As expected, a transition from a non-high-strain towards a high-strain job was associated with a significant increase in sleep-related complaints; conversely, a transition towards a non-high-strain job was not related to an improvement in sleep-related problems.
Research on Objectives for High-School Biology
ERIC Educational Resources Information Center
Korgan, John J., Jr.; Wilson, John T.
1973-01-01
Describes procedures to develop instructional objectives for high school biology. Two kinds of objectives are identified as pre-objectives and performance objectives. Models to classify these in branches of biology and to ensure quality control are provided. (PS)
The collection of chemical structures and associated experimental data for QSAR modeling is facilitated by the increasing number and size of public databases. However, the performance of QSAR models highly depends on the quality of the data used and the modeling methodology. The ...
Model of care transformation: a health care system CNE's journey.
Swick, Maureen; Doulaveris, Phyllis; Christensen, Patricia
2012-01-01
In 2001, the Institute of Medicine released the report "Crossing the Quality Chasm: A New Health System for the 21st Century." The report criticizes our health care system and argues that we are failing to provide Americans with the high-quality and affordable health care they deserve and need. While incremental progress has been made, we continue to strive for improved care quality, and our rising costs are potentially catastrophic. Consistent with the Institute of Medicine report, and its reputation for innovation, Inova Health System identified care model transformation as a system priority. Given that the organization is replacing its electronic health record and introducing advanced analytic capabilities, the opportunity to transform the model of care in tandem with core clinical platform enhancement was a compelling reason to move forward.
NASA Astrophysics Data System (ADS)
Petoussi-Henss, Nina; Becker, Janine; Greiter, Matthias; Schlattl, Helmut; Zankl, Maria; Hoeschen, Christoph
2014-03-01
In radiography there is generally a conflict between the best image quality and the lowest possible patient dose. A proven method of dosimetry is the simulation of radiation transport in virtual human models (i.e. phantoms). However, while the resolution of these voxel models is adequate for most dosimetric purposes, they cannot provide the organ fine structures necessary for assessing imaging quality. The aim of this work is to develop hybrid/dual-lattice voxel models (also called phantoms) as well as simulation methods by which patient dose and image quality for typical radiographic procedures can be determined. The results will provide a basis for investigating, by means of simulations, the relationships between patient dose and image quality for various imaging parameters, and for developing methods for their optimization. A hybrid model, based on NURBS (Non-Uniform Rational B-Spline) and PM (Polygon Mesh) surfaces, was constructed from an existing voxel model of a female patient. The organs of the hybrid model can then be scaled and deformed in a non-uniform way, i.e. organ by organ; they can thus be adapted to patient characteristics without losing their anatomical realism. Furthermore, the left lobe of the lung was substituted by a high-resolution lung voxel model, resulting in a dual-lattice geometry model. "Dual lattice" means in this context the combination of voxel models with different resolutions. Monte Carlo simulations of radiographic imaging were performed with the code EGS4nrc, modified to perform dual-lattice transport. Results are presented for a thorax examination.
Improved model quality assessment using ProQ2.
Ray, Arjun; Lindahl, Erik; Wallner, Björn
2012-09-10
Employing methods to assess the quality of modeled protein structures is now standard practice in bioinformatics. In a broad sense, the techniques can be divided into methods relying on consensus prediction on the one hand, and single-model methods on the other. Consensus methods frequently perform very well when there is a clear consensus, but this is not always the case. In particular, they frequently fail to select the best possible model in the hard cases (lacking consensus) or in the easy cases where models are very similar. In contrast, single-model methods do not suffer from these drawbacks and could potentially be applied to any protein of interest to assess quality or as a scoring function for sampling-based refinement. Here, we present a new single-model method, ProQ2, based on ideas from its predecessor, ProQ. ProQ2 is a model quality assessment algorithm that uses support vector machines to predict local as well as global quality of protein models. Improved performance is obtained by combining previously used features with updated structural and predicted features. The most important contribution can be attributed to the use of profile weighting of the residue-specific features and the use of features averaged over the whole model, even though the prediction is still local. ProQ2 is significantly better than its predecessors at detecting high-quality models, improving the sum of Z-scores for the selected first-ranked models by 20% and 32% compared to the second-best single-model method in CASP8 and CASP9, respectively. The absolute quality assessment of the models at both the local and global level is also improved. The Pearson correlation between the correct and the predicted local score improves from 0.59 to 0.70 on CASP8 and from 0.62 to 0.68 on CASP9; for the global score against the correct GDT_TS, it improves from 0.75 to 0.80 and from 0.77 to 0.80, again compared to the second-best single-model methods in CASP8 and CASP9, respectively.
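The evaluation metric quoted above, the Z-score of the first-ranked model, can be sketched as follows; the decoy pool and the noisy predictor are synthetic stand-ins, not ProQ2 output.

```python
import numpy as np

def first_rank_zscore(predicted, true_gdt):
    """Z-score of the true GDT_TS of the model the predictor ranks
    first, relative to the whole model pool for that target."""
    top = int(np.argmax(predicted))
    return (true_gdt[top] - true_gdt.mean()) / true_gdt.std()

# Synthetic pool of 300 models for one target
rng = np.random.default_rng(0)
true_gdt = rng.uniform(20, 80, size=300)
predicted = true_gdt + rng.normal(0, 5, size=300)  # imperfect QA score
z = first_rank_zscore(predicted, true_gdt)
```

Summing this quantity over all CASP targets gives the ranking criterion: a better predictor selects models with higher true GDT_TS, pushing the sum of Z-scores up.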
ProQ2 is available at http://proq2.wallnerlab.org.
Quality choice in a health care market: a mixed duopoly approach.
Sanjo, Yasuo
2009-05-01
We investigate a health care market with uncertainty in a mixed duopoly, where a partially privatized public hospital competes against a private hospital in terms of quality choice. We use a simple Hotelling-type spatial competition model by incorporating mean-variance analysis and the framework of partial privatization. We show how the variance in the quality perceived by patients affects the true quality of medical care provided by hospitals. In addition, we show that a case exists in which the quality of the partially privatized hospital becomes higher than that of the private hospital when the patient's preference for quality is relatively high.
Van Laethem, Michelle; Beckers, Debby G J; Geurts, Sabine A E; Garefelt, Johanna; Magnusson Hanson, Linda L; Leineweber, Constanze
2018-04-01
The aim of this longitudinal three-wave study was to examine (i) reciprocal associations among job demands, work-related perseverative cognition (PC), and sleep quality; (ii) PC as a mediator in-between job demands and sleep quality; and (iii) continuous high job demands in relation to sleep quality and work-related PC over time. A representative sample of the Swedish working population was approached in 2010, 2012, and 2014, and 2316 respondents were included in this longitudinal full-panel survey study. Structural equation modelling was performed to analyse the temporal relations between job demands, work-related PC, and sleep quality. Additionally, a subsample (N = 1149) consisting of individuals who reported the same level of exposure to job demands during all three waves (i.e. stable high, stable moderate, or stable low job demands) was examined in relation to PC and sleep quality over time. Analyses showed that job demands, PC, and poor sleep quality were positively and reciprocally related. Work-related PC mediated the normal and reversed, direct across-wave relations between job demands and sleep quality. Individuals with continuous high job demands reported significantly lower sleep quality and higher work-related PC, compared to individuals with continuous moderate/low job demands. This study substantiated reciprocal relations between job demands, work-related PC, and sleep quality and supported work-related PC as an underlying mechanism of the reciprocal job demands-sleep relationship. Moreover, this study showed that chronically high job demands are a risk factor for low sleep quality.
Silvestri, Jennifer
2017-01-01
Purpose To examine the implications of chronic shoulder pain on quality of life and occupational engagement in spinal cord injury (SCI). The Ecology of Human Performance Model and Self-Efficacy Theory will be used to further examine the interplay of shoulder pain, quality of life and engagement in this population. Method Analysis of literature. Results Persons with SCI have a high prevalence of shoulder pain and injury, with prevalence estimates of 37-84% across the analysed studies; chronic pain limits occupational engagement and decreases quality of life. Remediation of pain provides improved occupational engagement, functional independence and quality of life in those with high self-efficacy and low depression. Conclusion Shoulder pain is a serious complication following SCI, and the Ecology of Human Performance Model and Self-Efficacy Theory can be utilized in conjunction as a framework to evaluate, treat and prevent shoulder pain and its devastating effects on occupational engagement and quality of life in the spinal cord injured population. Thereafter, rehabilitation professionals will have a greater understanding of these interactions to serve as a guide for evaluation and intervention planning to promote optimal occupational engagement by limiting the experiences of occupational injustice for those with SCI and shoulder pain. Implications for Rehabilitation Musculoskeletal pain at the shoulder joint and depression are common complications following spinal cord injury that limit occupational engagement and decrease quality of life. To increase engagement and quality of life in this population, treatments need to address all factors, including the underlying psychosocial ones, instead of task and environment modification alone. The Ecology of Human Performance Model and Self-Efficacy Theory are effective frameworks that can be used for evaluation, treatment planning and outcome measurement to maximize occupational engagement and quality of life.
Anderson, Jared R; Van Ryzin, Mark J; Doherty, William J
2010-10-01
Most contemporary studies of change in marital quality over time have used growth curve modeling to describe continuously declining mean curves. However, there is some evidence that different trajectories of marital quality exist for different subpopulations. Group-based trajectory modeling provides the opportunity to conduct an empirical investigation of the variance in marital quality trajectories. We applied this method to analyze data from continuously married individuals from the Marital Instability over the Life Course Study (N = 706). Instead of a single continuously declining trajectory of marital happiness, we found 5 distinct trajectories. Nearly two thirds of participants reported high and stable levels of happiness over time, and the other one third showed either a pattern of continuous low happiness, low happiness that subsequently declined, or a curvilinear pattern of high happiness, decline, and recovery. Marital problems, time spent in shared activities, and (to a lesser degree) economic hardship were able to distinguish trajectory group membership. Our results suggest that marital happiness may have multiple distinct trajectories across reasonably diverse populations. Implications for theory, research, and practice are discussed.
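Group-based trajectory modeling is normally fit with specialized mixture software; as a rough numpy-only stand-in, the sketch below summarises each synthetic happiness panel by its fitted slope and separates a stable-high group from a declining group with a tiny 1-D k-means. The group sizes and noise level are invented for the demo.

```python
import numpy as np

rng = np.random.default_rng(1)
waves = np.arange(5.0)
# Synthetic marital-happiness panels: 80 stable-high, 40 declining
high = 8.0 + rng.normal(0, 0.3, (80, 5))
declining = 8.0 - waves + rng.normal(0, 0.3, (40, 5))
panel = np.vstack([high, declining])

# Summarise each trajectory by its least-squares slope over the waves
slopes = np.array([np.polyfit(waves, y, 1)[0] for y in panel])

# Two-group 1-D k-means on the slopes (a crude stand-in for the mixture)
centers = np.array([slopes.min(), slopes.max()])
for _ in range(20):
    labels = np.abs(slopes[:, None] - centers[None, :]).argmin(axis=1)
    centers = np.array([slopes[labels == k].mean() for k in (0, 1)])
```

A full GBTM additionally fits polynomial shapes per group and chooses the number of groups by information criteria, which is how the five trajectories in the study were identified.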
Robust model-based 3D/3D fusion using sparse matching for minimally invasive surgery.
Neumann, Dominik; Grbic, Sasa; John, Matthias; Navab, Nassir; Hornegger, Joachim; Ionasec, Razvan
2013-01-01
Classical surgery is being disrupted by minimally invasive and transcatheter procedures. As there is no direct view or access to the affected anatomy, advanced imaging techniques such as 3D C-arm CT and C-arm fluoroscopy are routinely used for intra-operative guidance. However, intra-operative modalities have limited image quality of the soft tissue and a reliable assessment of the cardiac anatomy can only be made by injecting contrast agent, which is harmful to the patient and requires complex acquisition protocols. We propose a novel sparse matching approach for fusing high quality pre-operative CT and non-contrasted, non-gated intra-operative C-arm CT by utilizing robust machine learning and numerical optimization techniques. Thus, high-quality patient-specific models can be extracted from the pre-operative CT and mapped to the intra-operative imaging environment to guide minimally invasive procedures. Extensive quantitative experiments demonstrate that our model-based fusion approach has an average execution time of 2.9 s, while the accuracy lies within expert user confidence intervals.
Effect of tape recording on perturbation measures.
Jiang, J; Lin, E; Hanson, D G
1998-10-01
Tape recorders have been shown to affect measures of voice perturbation. Few studies, however, have been conducted to quantitatively justify the use or exclusion of certain types of recorders in voice perturbation studies. This study used sinusoidal and triangular waves and synthesized vowels to compare perturbation measures extracted from directly digitized signals with those recorded and played back through various tape recorders, including 3 models of digital audio tape recorders, 2 models of analog audio cassette tape recorders, and 2 models of video tape recorders. Signal contamination for frequency perturbation values was found to be consistently minimal with digital recorders (percent jitter = 0.01%-0.02%), mildly increased with video recorders (0.05%-0.10%), moderately increased with a high-quality analog audio cassette tape recorder (0.15%), and most prominent with a low-quality analog audio cassette tape recorder (0.24%). Recorder effect on amplitude perturbation measures was lowest in digital recorders (percent shimmer = 0.09%-0.20%), mildly to moderately increased in video recorders and a high-quality analog audio cassette tape recorder (0.25%-0.45%), and most prominent in a low-quality analog audio cassette tape recorder (0.98%). The effect of cassette tape material, length of spooled tape, and duration of analysis were also tested and are discussed.
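In their classic local form, percent jitter and shimmer are the mean absolute cycle-to-cycle change relative to the mean value; a minimal sketch, assuming the cycle periods and peak amplitudes have already been extracted from the signal:

```python
import numpy as np

def percent_jitter(periods):
    """Local jitter: mean absolute difference between consecutive
    cycle periods, as a percentage of the mean period."""
    p = np.asarray(periods, dtype=float)
    return 100.0 * np.mean(np.abs(np.diff(p))) / p.mean()

def percent_shimmer(amplitudes):
    """Local shimmer: same measure applied to cycle peak amplitudes."""
    a = np.asarray(amplitudes, dtype=float)
    return 100.0 * np.mean(np.abs(np.diff(a))) / a.mean()

# Periods in ms for a ~133 Hz voice with slight cycle-to-cycle variation
j = percent_jitter([7.5, 7.6, 7.5, 7.4, 7.5])
```

Recorder-induced wow, flutter and noise perturb consecutive periods and amplitudes, which is why these measures are sensitive to the recording chain compared above.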
Rapid and Simultaneous Prediction of Eight Diesel Quality Parameters through ATR-FTIR Analysis.
Nespeca, Maurilio Gustavo; Hatanaka, Rafael Rodrigues; Flumignan, Danilo Luiz; de Oliveira, José Eduardo
2018-01-01
Quality assessment of diesel fuel is highly necessary for society, but standard methods are costly and time-consuming. Therefore, this study aimed to develop an analytical method capable of simultaneously determining eight diesel quality parameters (density; flash point; total sulfur content; distillation temperatures at 10% (T10), 50% (T50), and 85% (T85) recovery; cetane index; and biodiesel content) through attenuated total reflection Fourier transform infrared (ATR-FTIR) spectroscopy and the multivariate regression method partial least squares (PLS). For this purpose, the quality parameters of 409 samples were determined using standard methods, and their spectra were acquired in the range of 4000-650 cm-1. The use of the multivariate filters generalized least squares weighting (GLSW) and orthogonal signal correction (OSC) was evaluated to improve the signal-to-noise ratio of the models. Likewise, four variable selection approaches were tested: manual exclusion, forward interval PLS (FiPLS), backward interval PLS (BiPLS), and genetic algorithm (GA). The multivariate filters and variable selection algorithms generated better-fitted and more accurate PLS models. According to the validation, the FTIR/PLS models presented accuracy comparable to the reference methods and, therefore, the proposed method can be applied in routine diesel monitoring to significantly reduce costs and analysis time.
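The calibration step can be sketched with a minimal single-response NIPALS implementation of PLS on synthetic "spectra". The latent-factor data generator and the three-component choice are assumptions for the demo; a real FTIR/PLS pipeline would add the multivariate filters and variable selection discussed above.

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """Minimal PLS1 (NIPALS) for one response; X and y must be
    centred. Returns regression coefficients in the X space."""
    Xr, yr = X.copy(), y.copy()
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xr.T @ yr
        w /= np.linalg.norm(w)           # weight vector
        t = Xr @ w                       # scores
        tt = t @ t
        p = Xr.T @ t / tt                # X loadings
        c = yr @ t / tt                  # y loading
        Xr -= np.outer(t, p)             # deflate X and y
        yr -= c * t
        W.append(w); P.append(p); q.append(c)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    return W @ np.linalg.solve(P.T @ W, q)

# Synthetic "spectra": 300 channels driven by 3 latent factors, with a
# quality parameter (think cetane index) linear in those factors
rng = np.random.default_rng(0)
scores = rng.normal(size=(200, 3))
X = scores @ rng.normal(size=(3, 300)) + 0.05 * rng.normal(size=(200, 300))
y = scores @ np.array([1.0, -0.5, 2.0]) + 0.05 * rng.normal(size=200)

Xtr, ytr, mx, my = X[:150], y[:150], X[:150].mean(0), y[:150].mean()
B = pls1_fit(Xtr - mx, ytr - my, n_components=3)
pred = (X[150:] - mx) @ B + my           # predictions on held-out samples
r = np.corrcoef(pred, y[150:])[0, 1]
```

PLS is preferred over ordinary least squares here because the spectral channels are far more numerous than the samples and heavily collinear; the latent scores compress them into a few predictive directions.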
Robert, Alexandre; Paiva, Vitor H; Bolton, Mark; Jiguet, Frédéric; Bried, Joël
2012-08-01
Environmental variability, costs of reproduction, and heterogeneity in individual quality are three important sources of the temporal and interindividual variations in vital rates of wild populations. Based on an 18-year monitoring of an endangered, recently described, long-lived seabird, Monteiro's Storm-Petrel (Oceanodroma monteiroi), we designed multistate survival models to separate the effects of the reproductive cost (breeders vs. nonbreeders) and individual quality (successful vs. unsuccessful breeders) in relation to temporally variable demographic and oceanographic properties. The analysis revealed a gradient of individual quality from nonbreeders, to unsuccessful breeders, to successful breeders. The survival rates of unsuccessful breeders (0.90 +/- 0.023, mean +/- SE) tended to decrease in years of high average breeding success and were more sensitive to oceanographic variation than those of both (high-quality) successful breeders (0.97 +/- 0.015) and (low-quality) nonbreeders (0.83 +/- 0.028). Overall, our results indicate that reproductive costs act on individuals of intermediate quality and are mediated by environmental harshness.
Nakamura, Akira; Ohtsuka, Jun; Kashiwagi, Tatsuki; Numoto, Nobutaka; Hirota, Noriyuki; Ode, Takahiro; Okada, Hidehiko; Nagata, Koji; Kiyohara, Motosuke; Suzuki, Ei-Ichiro; Kita, Akiko; Wada, Hitoshi; Tanokura, Masaru
2016-02-26
Precise protein structure determination provides significant information for life science research, although high-quality crystals are not easily obtained. We developed a system for producing high-quality protein crystals with high throughput. Using this system, gravity-controlled crystallization is made possible by a magnetic microgravity environment. In addition, in situ, real-time observation and time-lapse imaging of crystal growth are feasible for over 200 solution samples independently. In this paper, we also report the results of crystallization experiments on two protein samples. Crystals grown in the system exhibited magnetic orientation and showed higher and more homogeneous quality compared with the control crystals. The structural analysis reveals that making use of magnetic microgravity during the crystallization process helps us build a well-refined protein structure model that has no significant structural differences from a control structure. Therefore, the system contributes to improved efficiency of structural analysis for "difficult" proteins, such as membrane proteins and supramolecular complexes.
Pharmacophore-Map-Pick: A Method to Generate Pharmacophore Models for All Human GPCRs.
Dai, Shao-Xing; Li, Gong-Hua; Gao, Yue-Dong; Huang, Jing-Fei
2016-02-01
GPCR-based drug discovery is hindered by a lack of effective screening methods for the many GPCRs that have neither ligands nor high-quality structures. With the aim of identifying lead molecules for these GPCRs, we developed a new method, called Pharmacophore-Map-Pick, to generate pharmacophore models for all human GPCRs. The model of ADRB2 generated using this method not only predicts the binding mode of ADRB2 ligands correctly but also performs well in virtual screening. The findings also demonstrate that this method is powerful for generating high-quality pharmacophore models. The average enrichment for the pharmacophore models of the 15 targets in different GPCR families reached 15-fold at a 0.5% false-positive rate. Therefore, the pharmacophore models can be applied directly in virtual screening, with no requirement for any ligand information or shape constraints. A total of 2386 pharmacophore models for 819 different GPCRs (99% coverage; 819/825) were generated and are available at http://bsb.kiz.ac.cn/GPCRPMD. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Modeling Family Adaptation to Fragile X Syndrome
ERIC Educational Resources Information Center
Raspa, Melissa; Bailey, Donald, Jr.; Bann, Carla; Bishop, Ellen
2014-01-01
Using data from a survey of 1,099 families who have a child with Fragile X syndrome, we examined adaptation across 7 dimensions of family life: parenting knowledge, social support, social life, financial impact, well-being, quality of life, and overall impact. Results illustrate that although families report a high quality of life, they struggle…
Charter School Replication. Policy Guide
ERIC Educational Resources Information Center
Rhim, Lauren Morando
2009-01-01
"Replication" is the practice of a single charter school board or management organization opening several more schools that are each based on the same school model. The most rapid strategy to increase the number of new high-quality charter schools available to children is to encourage the replication of existing quality schools. This policy guide…
Is Bigger Better? Customer Base Expansion through Word-of-Mouth Reputation
ERIC Educational Resources Information Center
Rob, Rafael; Fishman, Arthur
2005-01-01
A model of gradual reputation formation through a process of continuous investment in product quality is developed. We assume that the ability to produce high-quality products requires continuous investment and that as a consequence of informational frictions, such as search costs, information about firms' past performance diffuses only gradually…
Factors Affecting School Quality in Florida
ERIC Educational Resources Information Center
Thornton, Barry; Arbogast, Gordon
2014-01-01
This paper examines the factors that are theorized to be determinants of school quality in the 67 counties of Florida from 2000 to 2011. The model constructed for this purpose is comprised of a mix of independent variables that include county educational attainment (number of high school graduates and State University System enrollees) and…
A Face-to-Face Professional Development Model to Enhance Teaching of Online Research Strategies
ERIC Educational Resources Information Center
Terrazas-Arellanes, Fatima E.; Knox, Carolyn; Strycker, Lisa A.; Walden, Emily
2016-01-01
To help students navigate the digital environment, teachers not only need access to the right technology tools but they must also engage in pedagogically sound, high-quality professional development. For teachers, quality professional development can mean the difference between merely using technology tools and creating transformative change in…
USDA-ARS?s Scientific Manuscript database
High frequency in situ measurements of nitrate can greatly reduce the uncertainty in nitrate flux estimates. Water quality databases maintained by various federal and state agencies often consist of pollutant concentration data obtained from periodic grab samples collected from gauged reaches of a s...
Gottschalk, Maren; Sieme, Harald; Martinsson, Gunilla; Distl, Ottmar
2017-02-01
High-quality stallion semen is of particular importance for maximum reproductive efficiency. In the present study, we estimated the relationships among estimated breeding values (EBVs) of semen traits and EBVs for the paternal component of the pregnancy rate per estrus cycle (EBV-PAT) for 100 German Warmblood stallions using correlation and general linear model analyses. The most highly correlated sperm quality trait was total number of progressively motile sperm (r = 0.36). EBV-PAT was considered in three classes, with stallions 1 SD below (<80), around (80-120), and above (>120) the population mean of 100. The general linear model analysis showed significant effects for EBVs of all semen traits. EBVs of sperm quality traits greater than 100 to 110 were indicative of EBV-PAT greater than 120. Recommendations for breeding soundness examinations based on assessments of sperm quality traits and estimation of breeding values appear to be an option for supporting breeders in improving stallion fertility in present and future stallion generations. Copyright © 2016 Elsevier Inc. All rights reserved.
Process service quality evaluation based on Dempster-Shafer theory and support vector machine.
Pei, Feng-Que; Li, Dong-Bo; Tong, Yi-Fei; He, Fei
2017-01-01
Traditional service quality evaluations rely on human judgment, which leads to low accuracy, poor reliability and weak predictive power. This paper proposes a method, called SVMs-DS, that employs support vector machines (SVMs) and Dempster-Shafer evidence theory to evaluate the service quality of a production process, handling a large number of input features with a small sample set. Features that can affect production quality are extracted by a large number of sensors. Preprocessing steps such as feature simplification and normalization are reduced. Based on three individual SVM models, basic probability assignments (BPAs) are constructed, which support both qualitative and quantitative evaluation. The process service quality evaluation results are validated by Dempster's rules; the decision threshold used to resolve conflicting results is generated from the three SVM models. A case study demonstrates the effectiveness of the SVMs-DS method.
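The evidence-fusion step in an SVMs-DS approach can be illustrated with a minimal sketch of Dempster's rule of combination. The frame of discernment ({good, bad}), the mass values, and the function names below are invented for illustration; in the paper the BPAs are derived from three SVM outputs.

```python
# Minimal sketch of Dempster's rule of combination for two basic
# probability assignments (BPAs) over the frame {"good", "bad"}.
# The numeric masses here are illustrative, not from the paper.
from itertools import product

def dempster_combine(m1, m2):
    """Combine two BPAs given as dicts mapping frozenset hypotheses to mass."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict; Dempster's rule undefined")
    # Normalize by 1 - K, where K is the conflict mass
    return {h: w / (1.0 - conflict) for h, w in combined.items()}

GOOD, BAD = frozenset({"good"}), frozenset({"bad"})
BOTH = GOOD | BAD  # ignorance: mass not committed to either hypothesis

# Hypothetical BPAs, e.g. derived from two classifiers' outputs
m1 = {GOOD: 0.6, BAD: 0.1, BOTH: 0.3}
m2 = {GOOD: 0.5, BAD: 0.2, BOTH: 0.3}
fused = dempster_combine(m1, m2)
```

Fusing a third BPA is just another call to `dempster_combine(fused, m3)`, which is how evidence from three SVM models could be pooled.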
Effect of quality chronic disease management for alcohol and drug dependence on addiction outcomes.
Kim, Theresa W; Saitz, Richard; Cheng, Debbie M; Winter, Michael R; Witas, Julie; Samet, Jeffrey H
2012-12-01
We examined the effect of the quality of primary care-based chronic disease management (CDM) for alcohol and/or other drug (AOD) dependence on addiction outcomes. We assessed quality using (1) a visit frequency based measure and (2) a self-reported assessment measuring alignment with the chronic care model. The visit frequency based measure had no significant association with addiction outcomes. The self-reported measure of care-when care was at a CDM clinic-was associated with lower drug addiction severity. The self-reported assessment of care from any healthcare source (CDM clinic or elsewhere) was associated with lower alcohol addiction severity and abstinence. These findings suggest that high quality CDM for AOD dependence may improve addiction outcomes. Quality measures based upon alignment with the chronic care model may better capture features of effective CDM care than a visit frequency measure. Copyright © 2012 Elsevier Inc. All rights reserved.
Quality evaluation on an e-learning system in continuing professional education of nurses.
Lin, I-Chun; Chien, Yu-Mei; Chang, I-Chiu
2006-01-01
Maintaining high quality in Web-based learning is a powerful means of increasing the overall efficiency and effectiveness of distance learning. Many studies have evaluated Web-based learning, but seldom from the information systems (IS) perspective. This study applied the well-known IS Success model to measure the quality of a Web-based learning system, using a Web-based questionnaire for data collection. One hundred and fifty-four nurses participated in the survey. Based on confirmatory factor analysis, the variables of the research model were a good fit for measuring the quality of a Web-based learning system. As Web-based education continues to grow worldwide, the results of this study may assist the system adopter (hospital executives), the learners (nurses), and the system designers in making reasonable and informed judgments about the quality of a Web-based learning system in continuing professional education.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Jin; Serbin, Shawn P.; Xu, Xiangtao
Leaf quantity (i.e., canopy leaf area index, LAI), quality (i.e., per-area photosynthetic capacity), and longevity all influence the photosynthetic seasonality of tropical evergreen forests. However, these components of tropical leaf phenology are poorly represented in most terrestrial biosphere models (TBMs). Here we explored alternative options for representing leaf phenology effects in TBMs that employ the Farquhar, von Caemmerer & Berry (FvCB) representation of CO2 assimilation. We developed a two-fraction leaf (sun and shade), two-layer canopy (upper and lower) photosynthesis model to evaluate different modeling approaches and assessed three components of phenological variation (i.e., leaf quantity, quality, and within-canopy variation in leaf longevity). Our model was driven by the prescribed seasonality of leaf quantity and quality derived from ground-based measurements within an Amazonian evergreen forest. Modeled photosynthetic seasonality was not sensitive to leaf quantity, but was highly sensitive to leaf quality and its vertical distribution within the canopy, with markedly more sensitivity to upper canopy leaf quality. This is because light absorption in tropical canopies is near maximal for the entire year, implying that seasonal changes in LAI have little impact on total canopy light absorption, and because leaf quality has a greater effect on photosynthesis of sunlit leaves than of light-limited, shaded leaves, and sunlit foliage is more abundant in the upper canopy. Our two-fraction leaf, two-layer canopy model, which accounted for all three phenological components, was able to simulate photosynthetic seasonality, explaining ~90% of the average seasonal variation in eddy covariance-derived CO2 assimilation. This work identifies a parsimonious approach for representing tropical evergreen forest photosynthetic seasonality in TBMs that utilize the FvCB model of CO2 assimilation and highlights the importance of incorporating more realistic phenological mechanisms in models that seek to improve the projection of future carbon dynamics in tropical evergreen forests.
Wu, Jin; Serbin, Shawn P.; Xu, Xiangtao; ...
2017-04-18
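The sunlit/shaded bookkeeping behind a two-fraction, two-layer canopy scheme of the kind described above can be sketched with Beer's-law light extinction. This is a generic illustration under assumed values (extinction coefficient k = 0.5, total LAI = 6, equal-LAI layers), not the study's actual model; the function names are hypothetical.

```python
import math

# Illustrative sketch of the sunlit/shaded leaf-area bookkeeping used by
# two-fraction canopy schemes: sunlit leaf area decays with depth in the
# canopy following Beer's law. Parameter values are examples only.
def sunlit_lai(lai, k=0.5):
    """Sunlit leaf area index of a canopy with total LAI and
    extinction coefficient k: integral of exp(-k*l) from 0 to LAI."""
    return (1.0 - math.exp(-k * lai)) / k

def two_layer_split(lai_total, k=0.5):
    """Split a canopy into upper/lower layers of equal LAI and return
    the sunlit and shaded LAI of each layer."""
    half = lai_total / 2.0
    sun_total = sunlit_lai(lai_total, k)
    sun_upper = sunlit_lai(half, k)    # sunlit LAI within the upper layer
    sun_lower = sun_total - sun_upper  # remainder reaching the lower layer
    return {
        "upper": {"sunlit": sun_upper, "shaded": half - sun_upper},
        "lower": {"sunlit": sun_lower, "shaded": half - sun_lower},
    }

layers = two_layer_split(6.0)  # LAI ~6, typical of tropical evergreen forest
```

With these assumptions most sunlit foliage sits in the upper layer, which is consistent with the abstract's argument for why upper-canopy leaf quality dominates the photosynthesis signal.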
Data assimilation and model evaluation experiment datasets
NASA Technical Reports Server (NTRS)
Lai, Chung-Cheng A.; Qian, Wen; Glenn, Scott M.
1994-01-01
The Institute for Naval Oceanography, in cooperation with Naval Research Laboratories and universities, executed the Data Assimilation and Model Evaluation Experiment (DAMEE) for the Gulf Stream region during fiscal years 1991-1993. Enormous effort went into the preparation of several high-quality and consistent datasets for model initialization and verification. This paper describes the preparation process and the temporal and spatial scope, contents, and structure of these datasets. The goal of DAMEE and the need for data in the four phases of the experiment are briefly stated. The preparation of the DAMEE datasets consisted of a series of processes: (1) collection of observational data; (2) analysis and interpretation; (3) interpolation using the Optimum Thermal Interpolation System package; (4) quality control and re-analysis; and (5) data archiving and software documentation. The data products from these processes included a time series of 3D fields of temperature and salinity, 2D fields of surface dynamic height and mixed-layer depth, analyses of the Gulf Stream and rings system, and bathythermograph profiles. To date, these are the most detailed and highest-quality data available for mesoscale ocean modeling, data assimilation, and forecasting research. Feedback from ocean modeling groups that tested the data was incorporated into its refinement. Suggested uses of the DAMEE data include (1) ocean modeling and data assimilation studies, (2) diagnostic and theoretical studies, and (3) comparisons with locally detailed observations.
CerebroMatic: A Versatile Toolbox for Spline-Based MRI Template Creation
Wilke, Marko; Altaye, Mekibib; Holland, Scott K.
2017-01-01
Brain image spatial normalization and tissue segmentation rely on prior tissue probability maps. Appropriately selecting these tissue maps becomes particularly important when investigating “unusual” populations, such as young children or elderly subjects. When creating such priors, the disadvantage of applying more deformation must be weighed against the benefit of achieving a crisper image. We have previously suggested that statistically modeling demographic variables, instead of simply averaging images, is advantageous. Both aspects (more vs. less deformation and modeling vs. averaging) were explored here. We used imaging data from 1914 subjects, aged 13 months to 75 years, and employed multivariate adaptive regression splines to model the effects of age, field strength, gender, and data quality. Within the spm/cat12 framework, we compared an affine-only with a low- and a high-dimensional warping approach. As expected, more deformation on the individual level results in lower group dissimilarity. Consequently, effects of age in particular are less apparent in the resulting tissue maps when using a more extensive deformation scheme. Using statistically-described parameters, high-quality tissue probability maps could be generated for the whole age range; they are consistently closer to a gold standard than conventionally-generated priors based on 25, 50, or 100 subjects. Distinct effects of field strength, gender, and data quality were seen. We conclude that an extensive matching for generating tissue priors may model much of the variability inherent in the dataset which is then not contained in the resulting priors. Further, the statistical description of relevant parameters (using regression splines) allows for the generation of high-quality tissue probability maps while controlling for known confounds. The resulting CerebroMatic toolbox is available for download at http://irc.cchmc.org/software/cerebromatic.php. PMID:28275348
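The hinge-function basis underlying multivariate adaptive regression splines (used above to model age and other demographic effects on tissue maps) can be sketched with a single, fixed knot. Real MARS also searches over knot locations and interactions; the knot, data, and function names here are illustrative assumptions, not the toolbox's implementation.

```python
# Hedged sketch of the hinge ("hockey-stick") basis functions that
# multivariate adaptive regression splines (MARS) build models from.
# We fit y = b0 + b1*h(x - t) + b2*h(t - x) by ordinary least squares
# for one fixed, illustrative knot t; real MARS searches over knots.
def hinge(z):
    return z if z > 0 else 0.0

def fit_single_knot(xs, ys, t):
    """Least-squares fit of a piecewise-linear model with one knot at t."""
    # Design matrix columns: intercept, h(x - t), h(t - x)
    rows = [[1.0, hinge(x - t), hinge(t - x)] for x in xs]
    # Solve the 3x3 normal equations A^T A b = A^T y by Gaussian elimination
    ata = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    aty = [sum(r[i] * y for r, y in zip(rows, ys)) for i in range(3)]
    for col in range(3):  # forward elimination with partial pivoting
        piv = max(range(col, 3), key=lambda r: abs(ata[r][col]))
        ata[col], ata[piv] = ata[piv], ata[col]
        aty[col], aty[piv] = aty[piv], aty[col]
        for r in range(col + 1, 3):
            f = ata[r][col] / ata[col][col]
            ata[r] = [a - f * b for a, b in zip(ata[r], ata[col])]
            aty[r] -= f * aty[col]
    beta = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):  # back substitution
        beta[r] = (aty[r] - sum(ata[r][c] * beta[c] for c in range(r + 1, 3))) / ata[r][r]
    return beta

# Synthetic "tissue property vs. age" data with a slope change at age 20
ages = [5, 10, 15, 20, 25, 30, 35, 40]
vals = [1.0 + 0.1 * a if a <= 20 else 3.0 + 0.01 * (a - 20) for a in ages]
b0, b1, b2 = fit_single_knot(ages, vals, 20.0)
```

Because the synthetic data are exactly piecewise linear with a break at the knot, the fit recovers the two slopes, illustrating how hinge bases capture age effects that a single average (or a single slope) would smear out.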
NASA Astrophysics Data System (ADS)
Pohle, Ina; Glendell, Miriam; Stutter, Marc I.; Helliwell, Rachel C.
2017-04-01
An understanding of catchment response to climate and land use change at a regional scale is necessary for the assessment of mitigation and adaptation options addressing diffuse nutrient pollution. It is well documented that the physicochemical properties of a river ecosystem respond to change in a non-linear fashion. This is particularly important when threshold water concentrations, relevant to national and EU legislation, are exceeded. Large scale (regional) model assessments required for regulatory purposes must represent the key processes and mechanisms that are more readily understood in catchments with water quantity and water quality data monitored at high spatial and temporal resolution. While daily discharge data are available for most catchments in Scotland, nitrate and phosphorus are mostly available on a monthly basis only, as typified by regulatory monitoring. However, high resolution (hourly to daily) water quantity and water quality data exist for a limited number of research catchments. To successfully implement adaptation measures across Scotland, an upscaling from data-rich to data-sparse catchments is required. In addition, the widespread availability of spatial datasets affecting hydrological and biogeochemical responses (e.g. soils, topography/geomorphology, land use, vegetation etc.) provide an opportunity to transfer predictions between data-rich and data-sparse areas by linking processes and responses to catchment attributes. Here, we develop a framework of catchment typologies as a prerequisite for transferring information from data-rich to data-sparse catchments by focusing on how hydrological catchment similarity can be used as an indicator of grouped behaviours in water quality response. As indicators of hydrological catchment similarity we use flow indices derived from observed discharge data across Scotland as well as hydrological model parameters. 
For the latter, we calibrated the lumped rainfall-runoff model TUWModel using multiple objective functions. The relationships between indicators of hydrological catchment similarity, physical catchment characteristics and nitrate and phosphorus concentrations in rivers are then investigated using multivariate statistics. This understanding of the relationship between catchment characteristics, hydrological processes and water quality will allow us to implement more efficient regulatory water quality monitoring strategies, to improve existing water quality models and to model mitigation and adaptation scenarios to global change in data-sparse catchments.
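As an illustration of the kind of flow indices used to judge hydrological catchment similarity, the sketch below computes two common ones from daily discharge. The index choices (a Q5/Q95 high-flow/low-flow ratio and the Richards-Baker flashiness index) and the data are assumptions for illustration; the abstract does not list the exact index set used.

```python
# Hedged sketch: two flow-duration/variability indices often used to
# characterise hydrological catchment similarity. Data are synthetic.
def percentile(values, p):
    """Exceedance-style percentile: Qp is the flow exceeded p% of the time."""
    s = sorted(values, reverse=True)
    idx = min(len(s) - 1, int(round(p / 100.0 * (len(s) - 1))))
    return s[idx]

def q5_q95_ratio(flows):
    """High-flow/low-flow ratio: larger values indicate a flashier regime."""
    return percentile(flows, 5) / percentile(flows, 95)

def rb_flashiness(flows):
    """Richards-Baker index: sum of day-to-day flow changes over total flow."""
    changes = sum(abs(b - a) for a, b in zip(flows, flows[1:]))
    return changes / sum(flows)

# Synthetic daily discharge (m3/s) for a flashy and a damped catchment
flashy = [1, 9, 2, 8, 1, 10, 2, 7, 1, 9]
damped = [5, 5.5, 6, 5.8, 5.5, 5.2, 5, 5.3, 5.6, 5.4]
```

Catchments with similar index vectors could then be grouped (e.g. by clustering) to transfer water quality behaviour from data-rich to data-sparse catchments, as the framework proposes.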
Fabrication of high quality cDNA microarray using a small amount of cDNA.
Park, Chan Hee; Jeong, Ha Jin; Jung, Jae Jun; Lee, Gui Yeon; Kim, Sang-Chul; Kim, Tae Soo; Yang, Sang Hwa; Chung, Hyun Cheol; Rha, Sun Young
2004-05-01
DNA microarray technology has become an essential part of biological research. It enables genome-scale analysis of gene expression in various types of model systems. Manufacturing high-quality cDNA microarrays of the microdeposition type depends on several key factors, including the printing device, spotting pins, glass slides, spotting solution, and humidity during spotting. Using the Microgrid II TAS model printing device, this study defined the optimal conditions for producing high-density, high-quality cDNA microarrays with the least amount of cDNA product. Aminosilane-modified slides were observed to be superior to other types of surface-modified slides. A humidity of 30+/-3% in a closed environment and overnight drying of the spotted slides gave the best conditions for arraying. In addition, cDNA dissolved in 30% DMSO gave the optimal conditions for spotting compared to 1X ArrayIt, 3X SSC, and 50% DMSO. Lastly, cDNA in the concentration range of 100-300 ng/µl was determined to be best for arraying and post-processing. Currently, the printing system used in this study reproducibly yields 9,000 spots with a spot diameter of 150 µm and a spot spacing of 200 µm.
Dlamini, Nomcebo; Ntshalintshali, Nyasatu; Pindolia, Deepa; Allen, Regan; Nhlabathi, Nomcebo; Novotny, Joseph; Kang Dufour, Mi-Suk; Midekisa, Alemayehu; Gosling, Roly; LeMenach, Arnaud; Cohen, Justin; Dorsey, Grant; Greenhouse, Bryan; Kunene, Simon
2017-01-01
Background: Low-quality housing may confer risk of malaria infection, but evidence in low transmission settings is limited. Methods: To examine the relationship between individual-level housing quality and locally acquired infection in children and adults, a population-based cross-sectional analysis was performed using existing surveillance data from the low transmission setting of Swaziland. From 2012 to 2015, cases were identified through standard diagnostics in health facilities and by loop-mediated isothermal amplification in active surveillance, with uninfected subjects being household members and neighbors. Housing was visually assessed in a home visit and then classified as low, high, or medium quality, based on housing components being traditional, modern, or both, respectively. Results: Overall, 11426 individuals were included in the study: 10960 uninfected and 466 infected (301 symptomatic and 165 asymptomatic). Six percent resided in low-quality houses, 26% in medium-quality houses, and 68% in high-quality houses. In adjusted models, low- and medium-quality construction was associated with increased risk of malaria compared with high-quality construction (adjusted odds ratio [AOR], 2.11 and 95% confidence interval [CI], 1.26–3.53 for low vs high; AOR, 1.56 and 95% CI, 1.15–2.11 for medium vs high). The relationship was independent of vector control, which also conferred a protective effect (AOR, 0.67; 95% CI, .50–.90) for sleeping under an insecticide-treated bed net or a sprayed structure compared with neither. Conclusions: Our study adds to the limited literature on housing quality and malaria risk from low transmission settings. Housing improvements may offer an attractive and sustainable additional strategy to support countries in malaria elimination. PMID:28580365
Dlamini, Nomcebo; Hsiang, Michelle S; Ntshalintshali, Nyasatu; Pindolia, Deepa; Allen, Regan; Nhlabathi, Nomcebo; Novotny, Joseph; Kang Dufour, Mi-Suk; Midekisa, Alemayehu; Gosling, Roly; LeMenach, Arnaud; Cohen, Justin; Dorsey, Grant; Greenhouse, Bryan; Kunene, Simon
2017-01-01
Low-quality housing may confer risk of malaria infection, but evidence in low transmission settings is limited. To examine the relationship between individual level housing quality and locally acquired infection in children and adults, a population-based cross-sectional analysis was performed using existing surveillance data from the low transmission setting of Swaziland. From 2012 to 2015, cases were identified through standard diagnostics in health facilities and by loop-mediated isothermal amplification in active surveillance, with uninfected subjects being household members and neighbors. Housing was visually assessed in a home visit and then classified as low, high, or medium quality, based on housing components being traditional, modern, or both, respectively. Overall, 11426 individuals were included in the study: 10960 uninfected and 466 infected (301 symptomatic and 165 asymptomatic). Six percent resided in low-quality houses, 26% in medium-quality houses, and 68% in high-quality houses. In adjusted models, low- and medium-quality construction was associated with increased risk of malaria compared with high-quality construction (adjusted odds ratio [AOR], 2.11 and 95% confidence interval [CI], 1.26-3.53 for low vs high; AOR, 1.56 and 95% CI, 1.15-2.11 for medium vs high). The relationship was independent of vector control, which also conferred a protective effect (AOR, 0.67; 95% CI, .50-.90) for sleeping under an insecticide-treated bed net or a sprayed structure compared with neither. Our study adds to the limited literature on housing quality and malaria risk from low transmission settings. Housing improvements may offer an attractive and sustainable additional strategy to support countries in malaria elimination. © The Author 2017. Published by Oxford University Press on behalf of Infectious Diseases Society of America.
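For readers unfamiliar with the statistics reported above, a minimal sketch shows how an unadjusted odds ratio and its Wald 95% confidence interval are computed from a 2x2 exposure-outcome table. The counts below are invented for illustration; the study's adjusted ORs additionally control for covariates via regression modeling.

```python
import math

# Hedged sketch: odds ratio and Wald 95% CI from a 2x2 table.
# Counts are hypothetical, not the study's data.
def odds_ratio_ci(a, b, c, d, z=1.96):
    """a=exposed cases, b=exposed non-cases,
    c=unexposed cases, d=unexposed non-cases."""
    or_ = (a * d) / (b * c)                           # cross-product ratio
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)         # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: malaria cases vs. non-cases in
# low-quality vs. high-quality housing
or_, lo, hi = odds_ratio_ci(a=40, b=600, c=300, d=9000)
```

A confidence interval that excludes 1 (as in the study's low- vs high-quality comparison) indicates an association unlikely to be due to chance alone at the 5% level.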
NASA Astrophysics Data System (ADS)
SUN, N.; Yearsley, J. R.; Lettenmaier, D. P.
2013-12-01
Recent research shows that precipitation extremes in many of the largest U.S. urban areas have increased over the last 60 years. These changes have important implications for stormwater runoff and water quality, which in urban areas are dominated by the most extreme precipitation events. We assess the potential implications of changes in extreme precipitation and changing land cover in urban and urbanizing watersheds at the regional scale using a combination of hydrology and water quality models. Specifically, we describe the integration of a spatially distributed hydrological model - the Distributed Hydrology Soil Vegetation Model (DHSVM), the urban water quality model in EPA's Storm Water Management Model (SWMM), the semi-Lagrangian stream temperature model RBM10, and dynamical and statistical downscaling methods applied to global climate predictions. Key output water quality parameters include total suspended solids (TSS), total nitrogen, total phosphorus, fecal coliform bacteria and stream temperature. We have evaluated the performance of the modeling system in the highly urbanized Mercer Creek watershed in the rapidly growing Bellevue urban area in WA, USA. The results suggest that the model is able to (1) produce reasonable streamflow predictions at fine temporal and spatial scales; (2) provide spatially distributed water temperature predictions that mostly agree with observations throughout a complex stream network, and characterize impacts of climate, landscape, and near-stream vegetation change on stream temperature at local and regional scales; and (3) plausibly capture the response of water quality constituents to precipitation events of varying magnitude in urban environments. Next we will extend the scope of the study from the Mercer Creek watershed to the entire Puget Sound Basin, WA, USA.
Rangé, G; Chassaing, S; Marcollet, P; Saint-Étienne, C; Dequenne, P; Goralski, M; Bardiére, P; Beverilli, F; Godillon, L; Sabine, B; Laure, C; Gautier, S; Hakim, R; Albert, F; Angoulvant, D; Grammatico-Guillon, L
2018-05-01
To assess the reliability and low cost of a computerized interventional cardiology (IC) registry designed to prospectively and systematically collect high-quality data for all consecutive coronary patients referred for coronary angiogram and/or coronary angioplasty. Rigorous clinical practice assessment is a key factor in improving prognosis in IC. A prospective and permanent registry could achieve this goal but, presumably, at high cost and with low data quality. One multicentric IC registry (the CRAC registry), fully integrated with the usual coronary activity report software, started in the Centre-Val de Loire (CVL) region of France in 2014. Quality assessment of the CRAC registry was conducted in five IC cath labs of the CVL region, from January 1st to December 31st 2014. Quality of collected data was evaluated by measuring procedure exhaustivity (by comparison with data from the hospital information system), data completeness (quality controls) and data consistency (by checking complete medical charts as the gold standard). Cost per procedure (global registry operating cost/number of collected procedures) was also estimated. The CRAC model provided a high quality level, with 98.2% procedure exhaustivity, 99.6% data completeness and 89% data consistency. The operating cost per procedure was €14.70 ($16.51) for data collection and quality control, including ST-segment elevation myocardial infarction (STEMI) preadmission information and one-year follow-up after angioplasty. This integrated computerized IC registry led to the construction of an exhaustive, reliable and inexpensive database, including all coronary patients entering the participating IC centers in the CVL region. This solution will be deployed in other French regions, setting up a national IC database for coronary patients in 2020: France PCI. Copyright © 2018 Elsevier Masson SAS. All rights reserved.
2012-01-01
Background A relationship between current socio-economic position and subjective quality of life has been demonstrated, using wellbeing, life and needs satisfaction approaches. Less is known regarding the influence of different life course socio-economic trajectories on later quality of life. Several conceptual models have been proposed to help explain potential life course effects on health, including accumulation, latent, pathway and social mobility models. This systematic review aimed to assess whether evidence supported an overall relationship between life course socio-economic position and quality of life during adulthood and if so, whether there was support for one or more life course models. Methods A review protocol was developed detailing explicit inclusion and exclusion criteria, search terms, data extraction items and quality appraisal procedures. Literature searches were performed in 12 electronic databases during January 2012 and the references and citations of included articles were checked for additional relevant articles. Narrative synthesis was used to analyze extracted data and studies were categorized based on the life course model analyzed. Results Twelve studies met the eligibility criteria and used data from 10 datasets and five countries. Study quality varied and heterogeneity between studies was high. Seven studies assessed social mobility models, five assessed the latent model, two assessed the pathway model and three tested the accumulation model. Evidence indicated an overall relationship, but mixed results were found for each life course model. Some evidence was found to support the latent model among women, but not men. Social mobility models were supported in some studies, but overall evidence suggested little to no effect. Few studies addressed accumulation and pathway effects and study heterogeneity limited synthesis. 
Conclusions To improve potential for synthesis in this area, future research should aim to increase study comparability. Recommendations include testing all life course models within individual studies and the use of multiple measures of socio-economic position and quality of life. Comparable cross-national data would be beneficial to enable investigation of between-country differences. PMID:22873945
Achieving excellence in community health centers: implications for health reform.
Gurewich, Deborah; Capitman, John; Sirkin, Jenna; Traje, Diana
2012-02-01
Existing studies tell us little about care quality variation within the community health center (CHC) delivery system, or about the organizational conditions associated with CHCs that deliver especially high-quality care. The purpose of this study was to examine the operational practices associated with a sample of high-performing CHCs. Qualitative case studies of eight CHCs identified as delivering high-quality care relative to other CHCs were used to examine operational practices, including systems to facilitate care access, manage patient care, and monitor performance. Four common themes emerged that may contribute to high performance. At the same time, important differences across health centers were observed, reflecting differences in local environments and CHC capacity. In the development of effective, community-based models of care, adapting care standards to meet the needs of local conditions may be important.
Water quality modeling for urban reach of Yamuna river, India (1999-2009), using QUAL2Kw
NASA Astrophysics Data System (ADS)
Sharma, Deepshikha; Kansal, Arun; Pelletier, Greg
2017-06-01
The aim of the study was to characterize and understand the water quality of the river Yamuna in Delhi (India) as a basis for an efficient restoration plan. Monitoring data collection, mathematical modeling, and sensitivity and uncertainty analyses were combined using QUAL2Kw, a river water quality model. The model was applied to simulate DO, BOD, total coliform, and total nitrogen at four monitoring stations, namely Palla, Old Delhi Railway Bridge, Nizamuddin, and Okhla, for 10 years (October 1999-June 2009), excluding the monsoon seasons (July-September). The study period was divided into two parts: monthly average data from October 1999-June 2004 (45 months) were used to calibrate the model, and monthly average data from October 2005-June 2009 (45 months) were used to validate the model. The R2 values for CBODf and TN ranged from 0.53-0.75 and 0.68-0.83, respectively, indicating satisfactory model performance in terms of R2 for CBODf, TN, and TC. Sensitivity analysis showed that the DO, CBODf, TN, and TC predictions are highly sensitive to headwater flow and to point-source flow and quality. Monte Carlo uncertainty analysis showed that the simulations are consistent with the prevailing river conditions.
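The R2 statistic quoted above can be illustrated with a short sketch. The coefficient-of-determination form (1 - SSres/SStot) is used here; the abstract does not state which R2 variant was computed, and the data values below are invented.

```python
# Hedged sketch of a goodness-of-fit statistic for calibration/validation:
# R^2 between observed and simulated concentrations. Data are invented.
def r_squared(observed, simulated):
    """R^2 = 1 - SS_res / SS_tot for paired observed/simulated values."""
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

# Hypothetical monthly BOD concentrations (mg/L) at one station
obs = [12.0, 15.0, 11.0, 18.0, 14.0, 16.0]
sim = [11.5, 14.0, 12.0, 17.0, 14.5, 15.0]
r2 = r_squared(obs, sim)
```

Values near 1 mean the simulation tracks the observations closely; values near 0 mean it does no better than predicting the observed mean.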
Animal models for studying homeopathy and high dilutions: conceptual critical review.
Bonamin, Leoni Villano; Endler, Peter Christian
2010-01-01
This is a systematic review of the animal models used in studies of high dilutions. The objectives are to analyze the methodological quality of papers and reported results, and to highlight key conceptual aspects of high dilution in order to suggest clues concerning putative mechanisms of action. Papers for inclusion were identified systematically from the Pubmed-Medline database, using 'Homeopathy' and 'Animal' as keywords. Only original full papers in English published between January 1999 and June 2009 were included; reviews, scientific reports, theses, older papers, papers extracted from Medline using similar keywords, papers about mixed commercial formulas, and books were considered for discussion only. 31 papers describing 33 experiments were identified for the main analysis, and a total of 89 items were cited. Systematic analysis of the selected papers yielded evidence of some important intrinsic features of high dilution studies performed in animal models: a) methodological quality was generally adequate, although some aspects could be improved; b) convergence between results and materia medica is seen in some studies, pointing toward the possibility of systematic study of the Similia principle; c) both isopathic and Similia models seem useful for understanding some complex biological phenomena, such as parasite-host interactions; d) the effects of high dilutions seem to stimulate restoration of a 'stable state', as seen in several experimental models from both descriptive and mathematical points of view. Copyright 2009 The Faculty of Homeopathy. Published by Elsevier Ltd. All rights reserved.
A test of 3 models of Kirtland's warbler habitat suitability
Mark D. Nelson; Richard R. Buech
1996-01-01
We tested 3 models of Kirtland's warbler (Dendroica kirtlandii) habitat suitability during a period when we believe there was a surplus of good quality breeding habitat. A jack pine canopy-cover model was superior to 2 jack pine stem-density models in predicting Kirtland's warbler habitat use and non-use. Estimated density of birds in high...
High-Definition Infrared Spectroscopic Imaging
Reddy, Rohith K.; Walsh, Michael J.; Schulmerich, Matthew V.; Carney, P. Scott; Bhargava, Rohit
2013-01-01
The quality of images from an infrared (IR) microscope has traditionally been limited by considerations of throughput and signal-to-noise ratio (SNR). An understanding of the achievable quality as a function of instrument parameters, from first principles, is needed for improved instrument design. Here, we first present a model for light propagation through an IR spectroscopic imaging system based on scalar wave theory. The model analytically describes the propagation of light along the entire beam path from the source to the detector. The effect of various optical elements and the sample in the microscope is understood in terms of the accessible spatial frequencies by using a Fourier optics approach, and simulations are conducted to gain insights into spectroscopic image formation. The optimal pixel size at the sample plane is calculated and shown to be much smaller than that in current mid-IR microscopy systems. A commercial imaging system is modified, and experimental data are presented to demonstrate the validity of the developed model. Building on this validated theoretical foundation, an optimal sampling configuration is set up. Acquired data were of high spatial quality but, as expected, of poorer SNR. Signal processing approaches were implemented to improve the spectral SNR. The resulting data demonstrated the ability to perform high-definition IR imaging in the laboratory by using minimally-modified commercial instruments. PMID:23317676
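The idea of an optimal pixel size can be illustrated with a standard Fourier-optics estimate: the diffraction-limited resolution is roughly lambda / (2 NA), and Nyquist sampling requires two pixels per resolvable element. This is a textbook sketch under those assumptions, not the paper's actual derivation; the wavelength and NA values are illustrative.

```python
def nyquist_pixel_size(wavelength_um, numerical_aperture):
    """Pixel size (in the wavelength's units) for Nyquist sampling of the
    Abbe diffraction limit: resolution = lambda / (2 NA); pixel = resolution / 2."""
    resolution = wavelength_um / (2.0 * numerical_aperture)
    return resolution / 2.0

# Mid-IR example: 6 um light through a 0.5 NA objective
print(nyquist_pixel_size(6.0, 0.5))  # 6.0 um resolution -> 3.0 um pixels
```

Since mid-IR wavelengths span several micrometres, such estimates suggest sample-plane pixels of a few micrometres, smaller than the pixel sizes of many conventional IR imaging systems.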
Beck, H J; Birch, G F
2013-06-01
Stormwater contaminant loading estimates using event mean concentration (EMC), rainfall/runoff relationship calculations and computer modelling (Model of Urban Stormwater Infrastructure Conceptualisation--MUSIC) demonstrated high variability in common methods of water quality assessment. Predictions of metal, nutrient and total suspended solid loadings for three highly urbanised catchments in Sydney estuary, Australia, varied greatly within and amongst methods tested. EMC and rainfall/runoff relationship calculations produced similar estimates (within 1 SD) in a statistically significant number of trials; however, considerable variability within estimates (∼50 and ∼25 % relative standard deviation, respectively) questions the reliability of these methods. Likewise, upper and lower default inputs in a commonly used loading model (MUSIC) produced an extensive range of loading estimates (3.8-8.3 times above and 2.6-4.1 times below typical default inputs, respectively). Default and calibrated MUSIC simulations produced loading estimates that agreed with EMC and rainfall/runoff calculations in some trials (4-10 from 18); however, they were not frequent enough to statistically infer that these methods produced the same results. Great variance within and amongst mean annual loads estimated by common methods of water quality assessment has important ramifications for water quality managers requiring accurate estimates of the quantities and nature of contaminants requiring treatment.
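The EMC approach compared above multiplies an event mean concentration by an estimated runoff volume. A minimal sketch, with hypothetical units and values (not taken from the study), is:

```python
def emc_annual_load_kg(emc_mg_per_l, annual_rainfall_mm, area_ha, runoff_coeff):
    """Annual contaminant load (kg) = EMC (mg/L) x runoff volume (L).

    1 mm of runoff over 1 m^2 equals 1 L, so
    volume_L = rainfall_mm * runoff_coeff * area_m2.
    """
    area_m2 = area_ha * 10_000.0
    runoff_volume_l = annual_rainfall_mm * runoff_coeff * area_m2
    load_mg = emc_mg_per_l * runoff_volume_l
    return load_mg / 1e6  # mg -> kg

# Hypothetical: 0.2 mg/L total zinc, 1200 mm/yr rain, 50 ha catchment, C = 0.6
print(round(emc_annual_load_kg(0.2, 1200.0, 50.0, 0.6), 1))
```

The large relative standard deviations reported above enter through the EMC term itself, which varies considerably between storm events for the same catchment.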
Gram Quist, Helle; Christensen, Ulla; Christensen, Karl Bang; Aust, Birgit; Borg, Vilhelm; Bjorner, Jakob B
2013-01-17
Lifestyle variables may serve as important intermediate factors between psychosocial work environment and health outcomes. Previous studies, focussing on work stress models have shown mixed and weak results in relation to weight change. This study aims to investigate psychosocial factors outside the classical work stress models as potential predictors of change in body mass index (BMI) in a population of health care workers. A cohort study, with three years follow-up, was conducted among Danish health care workers (3982 women and 152 men). Logistic regression analyses examined change in BMI (more than +/- 2 kg/m(2)) as predicted by baseline psychosocial work factors (work pace, workload, quality of leadership, influence at work, meaning of work, predictability, commitment, role clarity, and role conflicts) and five covariates (age, cohabitation, physical work demands, type of work position and seniority). Among women, high role conflicts predicted weight gain, while high role clarity predicted both weight gain and weight loss. Living alone also predicted weight gain among women, while older age decreased the odds of weight gain. High leadership quality predicted weight loss among men. Associations were generally weak, with the exception of quality of leadership, age, and cohabitation. This study of a single occupational group suggested a few new risk factors for weight change outside the traditional work stress models.
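The outcome definition used above, a change in BMI of more than +/- 2 kg/m^2, can be sketched as a simple three-way classifier; the function name and sample values are illustrative, not from the study.

```python
def bmi_change_category(bmi_baseline, bmi_followup, threshold=2.0):
    """Classify follow-up BMI change as 'gain', 'loss', or 'stable'
    using a +/- threshold (kg/m^2) cut-off."""
    delta = bmi_followup - bmi_baseline
    if delta > threshold:
        return "gain"
    if delta < -threshold:
        return "loss"
    return "stable"

print(bmi_change_category(24.0, 26.5))  # delta of +2.5 exceeds the cut-off
```

In the study's logistic regressions, the 'gain' and 'loss' categories would each be modeled against 'stable' as binary outcomes.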
Davenport, Todd E
2015-12-01
Physical therapists increasingly are contributing clinical case reports to the health literature, which form the basis for higher quality evidence that has been incorporated into clinical practice guidelines. Yet few resources exist to assist physical therapists with the basic mechanics and quality standards of producing a clinical case report. This situation is further complicated by the absence of uniform standards for quality in case reporting. The importance of including a concise yet comprehensive description of patient functioning in all physical therapy case reports suggests the potential appropriateness of basing quality guidelines on the World Health Organization's International Classification of Functioning, Disability and Health (ICF) model. The purpose of this paper is to assist physical therapists in creating high-quality clinical case reports for the peer-reviewed literature using the ICF model as a guiding framework. Along these lines, current recommendations related to the basic mechanics of writing a successful clinical case report are reviewed, and a proposal for uniform, ICF-informed clinical case reporting requirements is introduced with the aim of improving the quality and feasibility of clinical case reporting in physical therapy. Copyright © 2013 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Yanidar, R.; Hartono, D. M.; Moersidik, S. S.
2018-03-01
Cipayung Landfill receives approximately 750 tons/day of solid waste generated by Depok City. The Pesanggrahan River, about 200 m away, forms the south and west boundaries of the landfill. The objectives of this study are to identify the parameters that most strongly affect the water quality of the Pesanggrahan River, and to propose a dynamic model that captures the interactions and feedbacks among these important in-river parameters, in order to identify and assess the effects of treated leachate from final solid waste disposal activity as the river responds to changes over time. High concentrations of BOD and COD are not the only causes that significantly affect the quality of the Pesanggrahan water; the river has also been contaminated in the upstream area. A water quality model is needed to support calculation of the effectiveness of measures for controlling selected pollutant sources. The model should be developed to simulate and predict trends in water quality in the Pesanggrahan River, and can potentially be used by policy makers in strategic management to sustain river water quality as raw drinking water.
NASA Astrophysics Data System (ADS)
Chang, N. B.
2016-12-01
Many countries are concerned about development and redevelopment efforts in urban regions to reduce flood risk from hazards such as high-tide events, storm surge, flash floods, stormwater runoff, and sea level rise. Combining these present and future hazards with vulnerable characteristics found throughout coastal communities, such as predominantly low-lying areas and increasing urban development, creates scenarios of increasing exposure to flood hazard. As such, the most vulnerable areas require adaptation strategies and mitigation actions for flood hazard management. In addition, in the U.S., Numeric Nutrient Criteria (NNC) are a critical tool for protecting and restoring the designated uses of a waterbody with regard to nitrogen and phosphorus pollution. Strategies such as low impact development (LID) have been promoted in recent years as an alternative to traditional stormwater management and drainage to control both flooding and water quality impacts. LID utilizes decentralized, multifunctional site designs and incorporates on-site stormwater management practices rather than conventional stormwater management approaches that divert flow toward centralized facilities. How to integrate hydrologic and water quality models to achieve decision support becomes a challenge. The Cross Bayou Watershed of Pinellas County in Tampa Bay, a highly urbanized coastal watershed, is utilized as a case study due to its sensitivity to flood hazards and water quality management within the watershed. This study will aid the County, as a decision maker, in implementing its stormwater management policy and honoring recent NNC state policy via demonstration of an integrated hydrologic and water quality model, including the Interconnected Channel and Pond Routing Model v.4 (ICPR4) and the BMPTRAIN model, as a decision support tool. The ICPR4 can be further coupled with the ADCIRC/SWAN model to reflect storm surge and sea level rise in coastal regions.
Charalambous, Andreas; Radwin, Laurel; Berg, Agneta; Sjovall, Katarina; Patiraki, Elisabeth; Lemonidou, Chryssoula; Katajisto, Jouko; Suhonen, Riitta
2016-09-01
Providing high quality nursing care for patients with malignancies is complex and driven by many factors. Many of the associations between nursing care quality, trust, health status and individualized care remain obscure. To empirically test a model of association linking hospitalized cancer patients' health status, nursing care quality, perceived individuality in care and trust in nurses. A cross-sectional, exploratory and correlational study design was used. This multi-site study was conducted in cancer care clinics and in-patient wards of five tertiary care hospitals in Cyprus, Finland, Greece and Sweden. Out of 876 hospitalized patients with a confirmed histopathological diagnosis of cancer approached to participate in the study in consecutive order, 599 (response rate 68%) agreed to participate, and the data from 590 were used for path analysis. Data were collected in 2012-2013 with the Individualized Care Scale-Patient (ICS-Patient), the Oncology Patients' Perceptions of Quality Nursing Care Scale (OPPQNCS), the Euro-Qol (EQ-5D-3L) and the Trust in Nurses Scale. Data were analysed statistically using descriptive and inferential statistics. Mplus version 7.11 was used to determine the best Trust model with path analysis. Although the model fit indices suggested that the hypothesized model did not fit the data perfectly, a slightly modified model, which includes the reciprocal path between individualized care and nursing care quality, demonstrated a good fit. A model of trust in nurses was developed. Health status, individualized care, and nursing care quality were found to be associated with trust. The model highlights the complexity of caring for cancer patients. Trust in nurses is influenced by the provision of individualized care. Generating and promoting trust requires interventions which promote nursing care quality, individuality and patients' health status. Copyright © 2016 Elsevier Ltd. All rights reserved.
HIGH-RESOLUTION DATASET OF URBAN CANOPY PARAMETERS FOR HOUSTON, TEXAS
Urban dispersion and air quality simulation models applied at various horizontal scales require different levels of fidelity for specifying the characteristics of the underlying surfaces. As the modeling scales approach the neighborhood level (~1 km horizontal grid spacing), the...
Jones, Martyn C; Johnston, Derek
2013-03-01
To examine the effect of nurse mood in the worst event of shift (negative affect, positive affect), receipt of work-based support from managers and colleagues, and colleague and patient involvement on perceived quality of care delivery. While the effect of the work environment on nurse mood is well documented, little is known about the effects of the worst event of shift on the quality of care delivered by nurses. This behavioural diary study employed within-subject and between-subject designs incorporating both cross-sectional and longitudinal elements. One hundred and seventy-one nurses in four large district general hospitals in England completed end-of-shift computerised behavioural diaries over three shifts to explore the effects of the worst clinical incident of shift. Diaries measured negative affect, positive affect, colleague involvement, receipt of work-based support and perceived quality of care delivery. Analysis used multilevel modelling (MLWIN 2.19; Centre for Multi-level Modelling, University of Bristol, Bristol, UK). High levels of negative affect and low levels of positive affect reported in the worst clinical incident of shift were associated with reduced perceived quality of care delivery. Receipt of managerial support and its interaction with negative affect had no relationship with perceived quality of care delivery. Perceived quality of care delivery deteriorated the most when the nurse reported a combination of high negative affect and no receipt of colleague support in the worst clinical incident of shift. Perceived quality of care delivery was also particularly influenced when the nurse reported low positive affect and colleague actions contributed to the problem. Receipt of colleague support is particularly salient in protecting perceived quality of care delivery, especially if the nurse also reports high levels of negative affect in the worst event of shift.
The effect of work-based support on care delivery is complex and requires further investigation. © 2012 Blackwell Publishing Ltd.
Midwife-led continuity models versus other models of care for childbearing women.
Sandall, Jane; Soltani, Hora; Gates, Simon; Shennan, Andrew; Devane, Declan
2015-09-15
Midwives are primary providers of care for childbearing women around the world. However, there is a lack of synthesised information to establish whether there are differences in morbidity and mortality, effectiveness and psychosocial outcomes between midwife-led continuity models and other models of care. To compare midwife-led continuity models of care with other models of care for childbearing women and their infants. We searched the Cochrane Pregnancy and Childbirth Group's Trials Register (31 May 2015) and reference lists of retrieved studies. All published and unpublished trials in which pregnant women are randomly allocated to midwife-led continuity models of care or other models of care during pregnancy and birth. Two review authors independently assessed trials for inclusion and risk of bias, extracted data and checked them for accuracy. We included 15 trials involving 17,674 women. We assessed the quality of the trial evidence for all primary outcomes (i.e., regional analgesia (epidural/spinal), caesarean birth, instrumental vaginal birth (forceps/vacuum), spontaneous vaginal birth, intact perineum, preterm birth (less than 37 weeks) and overall fetal loss and neonatal death (fetal loss was assessed by gestation using 24 weeks as the cut-off for viability in many countries)) using the GRADE methodology: all primary outcomes were graded as high quality. For the primary outcomes, women who had midwife-led continuity models of care were less likely to experience regional analgesia (average risk ratio (RR) 0.85, 95% confidence interval (CI) 0.78 to 0.92; participants = 17,674; studies = 14; high quality), instrumental vaginal birth (average RR 0.90, 95% CI 0.83 to 0.97; participants = 17,501; studies = 13; high quality), preterm birth less than 37 weeks (average RR 0.76, 95% CI 0.64 to 0.91; participants = 13,238; studies = 8; high quality) and less overall fetal/neonatal death (average RR 0.84, 95% CI 0.71 to 0.99; participants = 17,561; studies = 13; high quality evidence). Women who had midwife-led continuity models of care were more likely to experience spontaneous vaginal birth (average RR 1.05, 95% CI 1.03 to 1.07; participants = 16,687; studies = 12; high quality). There were no differences between groups for caesarean births or intact perineum. For the secondary outcomes, women who had midwife-led continuity models of care were less likely to experience amniotomy (average RR 0.80, 95% CI 0.66 to 0.98; participants = 3253; studies = 4), episiotomy (average RR 0.84, 95% CI 0.77 to 0.92; participants = 17,674; studies = 14) and fetal loss/neonatal death before 24 weeks (average RR 0.81, 95% CI 0.67 to 0.98; participants = 15,645; studies = 11). Women who had midwife-led continuity models of care were more likely to experience no intrapartum analgesia/anaesthesia (average RR 1.21, 95% CI 1.06 to 1.37; participants = 10,499; studies = 7), have a longer mean length of labour (hours) (mean difference (MD) 0.50, 95% CI 0.27 to 0.74; participants = 3328; studies = 3) and more likely to be attended at birth by a known midwife (average RR 7.04, 95% CI 4.48 to 11.08; participants = 6917; studies = 7). There were no differences between groups for fetal loss or neonatal death more than or equal to 24 weeks, induction of labour, antenatal hospitalisation, antepartum haemorrhage, augmentation/artificial oxytocin during labour, opiate analgesia, perineal laceration requiring suturing, postpartum haemorrhage, breastfeeding initiation, low birthweight infant, five-minute Apgar score less than or equal to seven, neonatal convulsions, admission of infant to special care or neonatal intensive care unit(s), or in mean length of neonatal hospital stay (days). Due to a lack of consistency in measuring women's satisfaction and assessing the cost of various maternity models, these outcomes were reported narratively. The majority of included studies reported a higher rate of maternal satisfaction in midwife-led continuity models of care. Similarly, there was a trend towards a cost-saving effect for midwife-led continuity care compared to other care models. This review suggests that women who received midwife-led continuity models of care were less likely to experience intervention and more likely to be satisfied with their care, with at least comparable adverse outcomes for women or their infants, than women who received other models of care. Further research is needed to explore findings of fewer preterm births and fewer fetal deaths less than 24 weeks, and overall fetal loss/neonatal death associated with midwife-led continuity models of care.
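Risk ratios with 95% confidence intervals of the kind reported in this review are conventionally computed from 2x2 event counts with the log-scale standard-error method. The counts below are invented for illustration, not taken from the review.

```python
import math

def risk_ratio_ci(events_a, n_a, events_b, n_b, z=1.96):
    """Risk ratio and 95% CI via the standard error of log(RR):
    SE = sqrt(1/a - 1/n_a + 1/b - 1/n_b)."""
    rr = (events_a / n_a) / (events_b / n_b)
    se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical trial arm counts: 120/1000 events vs 150/1000 events
rr, lo, hi = risk_ratio_ci(120, 1000, 150, 1000)
print(round(rr, 2), round(lo, 2), round(hi, 2))
```

A CI that excludes 1.0 (as for regional analgesia, RR 0.85, CI 0.78 to 0.92) indicates a statistically detectable difference between the two models of care.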
NASA Astrophysics Data System (ADS)
Javernick, Luke; Redolfi, Marco; Bertoldi, Walter
2018-05-01
New data collection techniques offer numerical modelers the ability to gather and utilize high quality data sets with high spatial and temporal resolution. Such data sets are currently needed for calibration, verification, and to fuel future model development, particularly morphological simulations. This study explores the use of high quality spatial and temporal data sets of observed bed load transport in braided river flume experiments to evaluate the ability of a two-dimensional model, Delft3D, to predict bed load transport. This study uses a fixed bed model configuration and examines the model's shear stress calculations, which are the foundation for predicting the sediment fluxes necessary for morphological simulations. The evaluation is conducted for three flow rates, and the model setup used highly accurate Structure-from-Motion (SfM) topography and discharge boundary conditions. The model was hydraulically calibrated using bed roughness, and performance was evaluated based on depth and inundation agreement. Model bed load performance was evaluated in terms of critical shear stress exceedance area compared to maps of observed bed mobility in the flume. Following the standard hydraulic calibration, bed load performance was tested for sensitivity to horizontal eddy viscosity parameterization and bed morphology updating. Simulations produced depth errors equal to the SfM inherent errors, inundation agreement of 77-85%, and critical shear stress exceedance in agreement with 49-68% of the observed active area. This study provides insight into the ability of physically based, two-dimensional simulations to accurately predict bed load as well as the effects of horizontal eddy viscosity and bed updating. Further, this study highlights how using high spatial and temporal data to capture the physical processes at work during flume experiments can help to improve morphological modeling.
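The critical shear stress exceedance metric used above can be sketched with the Shields criterion for incipient motion. The Shields number, grain size, and cell values below are illustrative assumptions, not the flume's actual parameters.

```python
def shields_critical_stress(d50_m, theta_c=0.045, rho_s=2650.0, rho=1000.0, g=9.81):
    """Critical bed shear stress (Pa): tau_c = theta_c * (rho_s - rho) * g * d50."""
    return theta_c * (rho_s - rho) * g * d50_m

def exceedance_fraction(tau_cells, tau_crit):
    """Fraction of model cells whose computed shear stress exceeds tau_crit,
    i.e. the predicted active (mobile) area."""
    active = sum(1 for tau in tau_cells if tau > tau_crit)
    return active / len(tau_cells)

tau_c = shields_critical_stress(0.001)  # 1 mm sand
print(round(tau_c, 3), exceedance_fraction([0.2, 0.5, 0.9, 1.4], tau_c))
```

Comparing the cells above threshold against a map of observed bed mobility yields the kind of 49-68% active-area agreement reported in the study.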
Crasta, Dev; Funk, Janette L; Lee, Soonhee; Rogge, Ronald D
2017-12-27
Neighborhood quality has been cross-sectionally linked to both relationship behaviors and relationship well-being. Consistent with the Vulnerability-Stress-Adaptation model of relationship functioning (Karney & Bradbury, 1995), we hypothesized that associations between social behaviors (e.g., drinking) and relationship quality could be moderated by neighborhood factors. Specifically, we characterized neighborhoods along multiple dimensions using multiple methods (self-report, census) to investigate how neighborhood factors might clarify ambiguous effects of alcohol use on marital functioning. A nationally recruited sample of 303 newlywed couples completed a baseline assessment around the time of marriage and was then assessed yearly across the first 4 years of marriage (94% retention). Three-level HLM slope-intercept models were used to model changes in relationship satisfaction across the first 4 years of marriage. Results suggested that, for couples living in highly disordered neighborhoods, positive shifts in overall levels of drinking within specific waves of assessment were associated with corresponding negative shifts in satisfaction, whereas in neighborhoods without perceived disorder, this effect was reversed. For couples living in neighborhoods with low levels of domestic structures (high census rates of single renters without children), within-couple discrepancies favoring higher rates of husband drinking in specific waves predicted poorer relationship quality for both partners in those same waves, whereas those same discrepancies predicted higher satisfaction in high domesticity neighborhoods (high census rates of married homeowners with children). The findings provide insight into the different roles of alcohol use in relationship maintenance and highlight the importance of using external context to understand intradyadic processes. © 2017 Family Process Institute.
Toronto area ozone: Long-term measurements and modeled sources of poor air quality events
NASA Astrophysics Data System (ADS)
Whaley, C. H.; Strong, K.; Jones, D. B. A.; Walker, T. W.; Jiang, Z.; Henze, D. K.; Cooke, M. A.; McLinden, C. A.; Mittermeier, R. L.; Pommier, M.; Fogal, P. F.
2015-11-01
The University of Toronto Atmospheric Observatory and Environment Canada's Centre for Atmospheric Research Experiments each has over a decade of ground-based Fourier transform infrared (FTIR) spectroscopy measurements in southern Ontario. We present the Toronto area FTIR time series from 2002 to 2013 of two tropospheric trace gases—ozone and carbon monoxide—along with surface in situ measurements taken by government monitoring programs. We interpret their variability with the GEOS-Chem chemical transport model and determine the atmospheric conditions that cause pollution events in the time series. Our analysis includes a regionally tagged O3 model of the 2004-2007 time period, which quantifies the geographical contributions to Toronto area O3. The important emission types for 15 pollution events are then determined with a high-resolution adjoint model. Toronto O3, during pollution events, is most sensitive to southern Ontario and U.S. fossil fuel NOx emissions and natural isoprene emissions. The sources of Toronto pollution events are found to be highly variable, and this is demonstrated in four case studies representing local, short-, middle-, and long-range transport scenarios. This suggests that continental-scale emission reductions could improve air quality in the Toronto region. We also find that abnormally high temperatures and high-pressure systems are common to all pollution events studied, suggesting that climate change may impact Toronto O3. Finally, we quantitatively compare the sensitivity of the surface and column measurements to anthropogenic NOx emissions and show that they are remarkably similar. This work thus demonstrates the usefulness of FTIR measurements in an urban area to assess air quality.
Wang, Hongqing; Chen, Qin; Hu, Kelin; LaPeyre, Megan K.
2017-01-01
Freshwater and sediment management in estuaries affects water quality, particularly in deltaic estuaries. Furthermore, climate change-induced sea-level rise (SLR) and land subsidence also affect estuarine water quality by changing salinity, circulation, stratification, sedimentation, erosion, residence time, and other physical and ecological processes. However, little is known about how the magnitudes and spatial and temporal patterns in estuarine water quality variables will change in response to freshwater and sediment management in the context of future SLR. In this study, we applied the Delft3D model that couples hydrodynamics and water quality processes to examine the spatial and temporal variations of salinity, total suspended solids, and chlorophyll-α concentration in response to small (142 m3 s−1) and large (7080 m3 s−1) Mississippi River (MR) diversions under low (0.38 m) and high (1.44 m) relative SLR (RSLR = eustatic SLR + subsidence) scenarios in the Breton Sound Estuary, Louisiana, USA. The hydrodynamics and water quality model were calibrated and validated via field observations at multiple stations across the estuary. Model results indicate that the large MR diversion would significantly affect the magnitude and spatial and temporal patterns of the studied water quality variables across the entire estuary, whereas the small diversion tends to influence water quality only in small areas near the diversion. RSLR would also play a significant role on the spatial heterogeneity in estuary water quality by acting as an opposite force to river diversions; however, RSLR plays a greater role than the small-scale diversion on the magnitude and spatial pattern of the water quality parameters in this deltaic estuary.
Gurdak, Jason J.; McMahon, Peter B.; Dennehy, Kevin; Qi, Sharon L.
2009-01-01
This report contains the major findings of a 1999-2004 assessment of water quality in the High Plains aquifer. It is one of a series of reports by the National Water-Quality Assessment (NAWQA) Program that present major findings for principal and other aquifers and major river basins across the Nation. In these reports, water quality is discussed in terms of local, regional, State, and national issues. Conditions in the aquifer system are compared to conditions found elsewhere and to selected national benchmarks, such as those for drinking-water quality. This report is intended for individuals working with water-resource issues in Federal, State, or local agencies, universities, public interest groups, or the private sector. The information will be useful in addressing a number of current issues, such as drinking-water quality, the effects of agricultural practices on water quality, source-water protection, and monitoring and sampling strategies. This report is also for individuals who wish to know more about the quality of ground water in areas near where they live and how that water quality compares to the quality of water in other areas across the region and the Nation. The water-quality conditions in the High Plains aquifer summarized in this report are discussed in greater detail in other reports that can be accessed in Appendix 1 of http://pubs.usgs.gov/pp/1749/. Detailed technical information, data and analyses, collection and analytical methodology, models, graphs, and maps that support the findings presented in this report in addition to reports in this series from other basins can be accessed from the national NAWQA Web site (http://water.usgs.gov/nawqa). This report accompanies the detailed and technical report of water-quality conditions in the High Plains aquifer 'Water-quality assessment of the High Plains aquifer, 1999-2004' (http://pubs.usgs.gov/pp/1749/)
Application of receptor models on water quality data in source apportionment in Kuantan River Basin
2012-01-01
Recent techniques in the management of surface river water have expanded demand for methods that can represent multivariate data sets more faithfully. A properly designed artificial neural network (ANN) architecture and multiple linear regression (MLR) provide advanced tools for surface water modeling and forecasting. A receptor model was developed in order to determine the major sources of pollutants in the Kuantan River Basin, Malaysia. Thirteen water quality parameters were used in principal component analysis (PCA), and new variables representing fertilizer waste, surface runoff, anthropogenic input, chemical and mineral changes, and erosion were successfully developed for modeling purposes. Two models were compared in terms of efficiency and goodness-of-fit for water quality index (WQI) prediction. The results show that the APCS-ANN model gives better performance, with a high R2 value (0.9680) and a small root mean square error (RMSE) value (2.6409), compared to the APCS-MLR model. Meanwhile, the sensitivity analysis shows that fertilizer waste is the dominant pollutant contributor (59.82%) in the studied basin, followed by anthropogenic input (22.48%), surface runoff (13.42%), erosion (2.33%) and, lastly, chemical and mineral changes (1.95%). Thus, this study concluded that APCS-ANN receptor modeling can resolve the relationships among water distribution variables and support appropriate water quality management. PMID:23369363
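The model comparison above rests on two goodness-of-fit statistics, R2 and RMSE. A minimal sketch of how they are computed follows; the observed and predicted WQI values below are illustrative placeholders, not data from the study:

```python
import math

def r_squared(obs, pred):
    # coefficient of determination: 1 - SS_res / SS_tot
    mean_obs = sum(obs) / len(obs)
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    return 1 - ss_res / ss_tot

def rmse(obs, pred):
    # root mean square error of predictions against observations
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

# hypothetical observed vs predicted water quality index (WQI) values
obs = [80.0, 65.0, 72.0, 90.0, 55.0]
pred = [78.5, 66.0, 70.5, 88.0, 57.0]
print(round(r_squared(obs, pred), 4))  # 0.9814
print(round(rmse(obs, pred), 4))       # 1.6432
```

A model such as APCS-ANN is preferred over APCS-MLR when it pairs a higher R2 with a lower RMSE on the same data, which is the comparison the abstract reports.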
Mellem, Daniel; Fischer, Frank; Jaspers, Sören; Wenck, Horst; Rübhausen, Michael
2016-01-01
Mitochondria are essential for the energy production of eukaryotic cells. During aging, mitochondria go through various processes that change their quality in terms of activity, health and metabolic supply. In recent years, many of these processes, such as fission and fusion of mitochondria, mitophagy, mitochondrial biogenesis and energy consumption, have been the subject of research. Based on numerous experimental insights, it has become possible to characterize mitochondrial behaviour in computational simulations. Here, we present a new biophysical model based on the approach of Figge et al. in 2012. We introduce exponential decay and growth laws for each mitochondrial process to derive its time-dependent probability during the aging of cells. All mitochondrial processes of the original model are mathematically and biophysically redefined, and additional processes are implemented: mitochondrial fission and fusion is separated into a metabolic outer-membrane part and a protein-related inner-membrane part, a quality-dependent threshold for mitophagy and mitochondrial biogenesis is introduced, and processes for activity-dependent internal oxidative stress as well as mitochondrial repair mechanisms are newly included. Our findings reveal a decrease of mitochondrial quality and a fragmentation of the mitochondrial network during aging. Additionally, the model discloses a quality-increasing mechanism due to the interplay of the mitophagy-biogenesis cycle and the fission-fusion cycle of mitochondria. It reveals that decreased mitochondrial repair can be a quality-saving process in aged cells. Furthermore, the model finds strategies to sustain the quality of the mitochondrial network in cells with high production rates of reactive oxygen species due to large energy demands. Hence, the model adds new insights to biophysical mechanisms of mitochondrial aging and provides novel understanding of the interdependency of mitochondrial processes. PMID:26771181
Franzosa, Emily; Tsui, Emma K; Baron, Sherry
2018-02-01
Home care payment models, quality measures, and care plans are based on physical tasks workers perform, ignoring relational care that supports clients' cognitive, emotional, and social well-being. As states seek to rein in costs and improve the efficiency and quality of care, they will need to consider how to measure and support relational care. In four focus groups (n = 27) of unionized, agency-based New York City home health aides, workers reported aide-client relationships were a cornerstone of high-quality care, and building them required communication, respect, and going the extra mile. Since much of this care was invisible outside the worker-client relationship, aides received little supervisory support and felt excluded from the formal care team. Aligning payment models with quality requires understanding the full scope of services aides provide and a quality work environment that offers support and supervision, engages aides in patient care, and gives them a voice in policy decisions.
Frederix, Geert W J; Hövels, Anke M; Severens, Johan L; Raaijmakers, Jan A M; Schellens, Jan H M
2015-01-01
There is increasing discussion in the Netherlands about the introduction of a threshold value for the costs per extra year of life when reimbursing costs of new drugs. The Medicines Committee ('Commissie Geneesmiddelen'), a division of the Netherlands National Healthcare Institute ('Zorginstituut Nederland'), advises on reimbursement of costs of new drugs. This advice is based upon the determination of the therapeutic value of the drug and the results of economic evaluations. Mathematical models that predict future costs and effectiveness are often used in economic evaluations; these models can vary greatly in transparency and quality due to author assumptions. Standardisation of cost-effectiveness models is one solution to overcome the unwanted variation in quality. Discussions about the introduction of a threshold value can only be meaningful if all involved are adequately informed and supported by high-quality cost-effectiveness research and, in particular, economic evaluations. Collaboration and discussion between medical specialists, patients or patient organisations, health economists and policy makers, both in the development of methods and in standardisation, are essential to improve the quality of decision making.
An Artificial Intelligence System to Predict Quality of Service in Banking Organizations
Castelli, Mauro; Manzoni, Luca; Popovič, Aleš
2016-01-01
Quality of service, that is, the waiting time that customers must endure in order to receive a service, is a critical performance aspect in private and public service organizations. Providing good service quality is particularly important in highly competitive sectors where similar services exist. In this paper, focusing on banking sector, we propose an artificial intelligence system for building a model for the prediction of service quality. While the traditional approach used for building analytical models relies on theories and assumptions about the problem at hand, we propose a novel approach for learning models from actual data. Thus, the proposed approach is not biased by the knowledge that experts may have about the problem, but it is completely based on the available data. The system is based on a recently defined variant of genetic programming that allows practitioners to include the concept of semantics in the search process. This will have beneficial effects on the search process and will produce analytical models that are based only on the data and not on domain-dependent knowledge. PMID:27313604
NASA Astrophysics Data System (ADS)
Shavers, Mark Randall
1999-12-01
High-energy protons in the galactic cosmic rays (GCR)-or generated by nuclear interactions of GCR heavy-ions with material-are capable of penetrating great thicknesses of shielding to irradiate humans in spacecraft or in lunar or Martian habitats. As protons interact with the nuclei of the elemental constituents of soft tissue and bone, low energy nuclei-target fragments-are emitted into the cells responsible for bone development and maintenance and for hematopoiesis. Leukemogenesis is the principal endpoint of concern because it is the most likely deleterious effect, and it has a short latency period and comparatively low survival rate, although other myeloproliferative disorders and osteosarcoma also may be induced. A one-dimensional proton-target fragment transport model was used to calculate the energy spectra of fragments produced in bone and soft tissue, and present in marrow cavities at distances from a bone interface. In terms of dose equivalent, the target fragments are as significant as the incident protons. An average radiation quality factor was found to be between 1.8 and 2.6. Biological response to the highly non-uniform energy deposition of the target fragments is such that an alternative approach to conventional predictive risk assessment is needed. Alternative procedures are presented. In vitro cell response and relative biological effectiveness were calculated from the radial dose distribution of each fragment produced by 1-GeV protons using parameters of a modified Ion-Gamma-Kill (IGK) model of radiation action. The modelled endpoints were survival of C3H10T1/2 and V79 cells, neoplastic transformation of C3H10T1/2 cells, and mutation of the X-linked hypoxanthine phosphoribosyltransferase (HPRT) locus in V79 cells. The dose equivalent and cell responses increased by 10% or less near the interface. Since RBE increases with decreasing dose in the IGK model, comparisons with quality factors were made at dose levels 0.01 <= D [Gy] <= 2.
Applying the average quality factors derived herein to GCR exposures results in a <= 5% increase in average quality factor. Calculated RBEs indicate that accepted quality factors for high-energy protons may be too low due to the relatively high effectiveness of the low-charged target fragments. Derived RBEs for target fragments increase the calculated biological effectiveness of GCR by 20% to 180%.
Level of quality management in the Municipal Sports Services, contrast trough EFQM Excellence Model.
Martínez-Moreno, Alfonso; Díaz Suárez, Arturo
2016-01-01
Quality management in Municipal Sports Services is embedded in the servuction provided to citizens, the internal customers who drive quality improvement and ensure competitiveness against excellence criteria. The Model of the European Foundation for Quality Management (EFQM) enables the evaluation of an organization's progress towards quality goals through a structured, measurable and comparable methodology. The aim is to diagnose the level of implementation of quality in the Municipal Sports Services of the Region of Murcia, Spain. The sample of 287 workers from 30 sports services showed a high level of reliability on all scales, with an overall coefficient of .985 (range .810-.943). Scores on the criteria of Policy and Strategy, People Management, Alliances and Resources, Processes and People Results were significantly higher (p < .05) in municipalities with more than 25,000 inhabitants than in those with fewer than 10,000 and those with 10,000 to 25,000 inhabitants. Globally, municipalities with fewer than 10,000 inhabitants obtained 571 points, those with 10,000 to 25,000 obtained 590 points, and those with more than 25,000 reached 636 points, a good level of quality on the scale that the model defines.
Medina-Mirapeix, Francesc; Jimeno-Serrano, Francisco J; Escolar-Reina, Pilar; Del Baño-Aledo, M Elena
2013-06-01
To assess the relationships between patient experiences and two overall evaluations - satisfaction and service quality - in outpatient rehabilitation settings. A cross-sectional, self-reported survey carried out in the year 2009. Three outpatient rehabilitation units belonging to Spanish hospitals located in Barcelona, Madrid and Seville. Four hundred and sixty-five outpatients (response rate 90%), mean age 39.4 (SD = 11.9) years. Self-reported experiences on aspects of care, participants' perception of service quality, satisfaction with care, socio-demographic and health characteristics. Satisfaction and service quality were highly correlated (rho = 0.72, P < 0.001). Two multivariate logistic regression models using satisfaction and service quality (with adjusted R2 of 31.5% and 37.1%, respectively) indicated that patients' experiences and global rating of health improvement have more effect on those evaluations than socio-demographic characteristics. Mean satisfaction was 8.9 (SD = 1.2), and 88% of respondents described high service quality. However, nearly 25% of the respondents who reported high-quality evaluations also indicated a problem score of more than 50% in almost all aspects of care studied. Satisfaction and service quality provide a poor indicator of patients' experiences; both are proxies, but distinct constructs, in rehabilitation care. Besides, not all problems encountered by patients are equally important to them.
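The rho reported above is Spearman's rank correlation: both variables are ranked (averaging ties) and the Pearson correlation of the ranks is taken. A self-contained sketch, using hypothetical satisfaction and service-quality ratings rather than the survey data:

```python
def rank(values):
    # assign 1-based ranks, averaging over ties
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank of the tied group
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    # Pearson correlation computed on the rank vectors
    rx, ry = rank(x), rank(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# hypothetical satisfaction (0-10) and perceived service-quality ratings
sat = [9, 8, 10, 7, 9, 6, 8]
qual = [8, 8, 9, 6, 9, 7, 7]
print(round(spearman_rho(sat, qual), 2))  # 0.86
```

Because the correlation is computed on ranks, it captures any monotonic association, which is why it suits ordinal survey scales like these.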
Soil Quality Index Determination Models for Restinga Forest
NASA Astrophysics Data System (ADS)
Bonilha, R. M.; Casagrande, J. C.; Soares, R. M.
2012-04-01
The Restinga Forest is a set of plant communities in a mosaic determined by the characteristics of their substrates, which result from depositional processes and ages. This complex mosaic includes restinga forest physiognomies at a high stage of regeneration (high restinga) and at a middle stage of regeneration (low restinga), each with plant characteristics that differentiate them. Located on the coastal plains of the Brazilian coast, the forest is influenced both by the continental slopes and by the sea. Its soils date from the Quaternary and are subject to constant deposition of sediments. The coastal climate is tropical (Köppen). This work was conducted in four locations: (1) Anchieta Island, Ubatuba; (2) Juréia-Itatins Ecological Station, Iguape; (3) Vila das Pedrinhas, Comprida Island; and (4) Cardoso Island, Cananeia. Soil samples were collected at depths of 0-5, 0-10, 0-20, 20-40 and 40-60 cm for chemical and physical analysis. Additive and weighted additive models were studied to evaluate soil quality. It was concluded that: a) the additive model produces comparative results and the weighted additive model quantitative results; b) under the weighted additive model, the Soil Quality Index (SQI) values for soils under restinga forest are low and realistic, demonstrating the small plant biomass production potential of these soils as well as their low resilience; c) the similarity of SQI values between areas with and without restinga forest quantitatively supports treating restinga as a soil phase; d) the restinga forest is probably maintained solely by a closed nutrient cycle; e) routine chemical analysis is adequate for determining the SQI of soils under restinga vegetation. Keywords: Model, restinga forest, Soil Quality Index (SQI).
Spreadsheet WATERSHED modeling for nonpoint-source pollution management in a Wisconsin basin
Walker, J.F.; Pickard, S.A.; Sonzogni, W.C.
1989-01-01
Although several sophisticated nonpoint pollution models exist, few are available that are easy to use, cover a variety of conditions, and integrate a wide range of information to allow managers and planners to assess different control strategies. Here, a straightforward pollutant input accounting approach is presented in the form of an existing model (WATERSHED) that has been adapted to run on modern electronic spreadsheets. As an application, WATERSHED is used to assess options to improve the quality of highly eutrophic Delavan Lake in Wisconsin. WATERSHED is flexible in that several techniques, such as the Universal Soil Loss Equation or unit-area loadings, can be used to estimate nonpoint-source inputs. Once the model parameters are determined (and calibrated, if possible), the spreadsheet features can be used to conduct a sensitivity analysis of management options. In the case of Delavan Lake, it was concluded that, although some nonpoint controls were cost-effective, the overall reduction in phosphorus would be insufficient to measurably improve water quality.
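The unit-area loading technique mentioned above is simple enough to sketch: each land use contributes its area times an export coefficient, and the spreadsheet sums the contributions; a sensitivity analysis just re-runs the sum with altered coefficients. The areas and coefficients below are hypothetical placeholders, not values from the Delavan Lake study:

```python
# hypothetical land-use areas (ha) and phosphorus export coefficients (kg/ha/yr)
land_use = {
    "row crops": {"area_ha": 1200.0, "p_export_kg_ha_yr": 2.0},
    "pasture":   {"area_ha": 800.0,  "p_export_kg_ha_yr": 0.8},
    "urban":     {"area_ha": 300.0,  "p_export_kg_ha_yr": 1.1},
    "forest":    {"area_ha": 1500.0, "p_export_kg_ha_yr": 0.1},
}

def total_p_load(uses):
    # unit-area loading: annual load = sum over land uses of area x export coefficient
    return sum(u["area_ha"] * u["p_export_kg_ha_yr"] for u in uses.values())

baseline = total_p_load(land_use)
print(round(baseline, 1))  # 3520.0 (kg P/yr)

# sensitivity analysis of a management option: halve the row-crop export
land_use["row crops"]["p_export_kg_ha_yr"] *= 0.5
print(round(total_p_load(land_use), 1))  # 2320.0 (kg P/yr)
```

This mirrors the spreadsheet workflow the abstract describes: the accounting itself is trivial, and the value of the tool lies in quickly comparing control strategies once the coefficients are calibrated.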
Air quality impact assessment of multiple open pit coal mines in northern Colombia.
Huertas, José I; Huertas, María E; Izquierdo, Sebastián; González, Enrique D
2012-01-01
The coal mining region in northern Colombia is one of the largest open pit mining regions of the world. In 2009, there were 8 mining companies in operation with a coal production of ∼70 Mtons/year. Since 2007, the Colombian air quality monitoring network has reported readings that exceed the daily and annual air quality standards for total suspended particulate (TSP) matter and particles with an equivalent aerodynamic diameter smaller than 10 μm (PM₁₀) in nearby villages. This paper describes work carried out in order to establish an appropriate clean air program for this region, based on the Colombian national environmental authority requirement for modeling of TSP and PM₁₀ dispersion. A TSP and PM₁₀ emission inventory was initially developed, and topographic and meteorological information for the region was collected and analyzed. Using this information, the dispersion of TSP was modeled in ISC3 and AERMOD using meteorological data collected by 3 local stations during 2008 and 2009. The results obtained were compared to actual values measured by the air quality monitoring network. High correlation coefficients (>0.73) were obtained, indicating that the models accurately described the main factors affecting particle dispersion in the region. The model was then used to forecast concentrations of particulate matter for 2010. Based on results from the model, areas within the modeling region were identified as highly, fairly, moderately and marginally polluted according to local regulations. Additionally, the contribution of particulate matter to the pollution at each village was estimated. Using these predicted values, the Colombian environmental authority imposed new decontamination measures on the mining companies operating in the region. These measures included the relocation of three villages, financed by the mine companies, based on forecasted pollution levels. Copyright © 2011. Published by Elsevier Ltd.
NASA Astrophysics Data System (ADS)
Kim, Y. J.; Sunwoo, Y.; Hwang, I.; Song, S.; Sin, J.; Kim, D.
2015-12-01
A very high population and a correspondingly high number of vehicles in the Seoul Metropolitan Area (SMA) are aggravating the air quality of this region. The Korean government continues to make concerted efforts to improve air quality. Among the major policies that the Ministry of Environment of Korea enforced are "The Special Act for Improvement of Air Quality in SMA" and "The 1st Air Quality Management Plan of SMA". Mobile source emission controls are an important part of the policy. Thus, it is timely to evaluate the air quality improvement due to the controls. Therefore, we performed a quantitative analysis of the difference in air quality using the Community Multiscale Air Quality (CMAQ) model, and December 2011 was set as the target period to capture the impact of the above control plans. We considered four fuel-type vehicle emission scenarios and compared the air quality improvement differences between them. The scenarios are as follows: no control, gasoline vehicle control only, diesel vehicle control only, and control of both; utilizing the revised mobile source emissions from the Clean Air Policy Support System (CAPSS), which is the national emission inventory reflecting current policy. In order to improve the accuracy of the modeling data, we developed new temporal allocation coefficients based on traffic volume observation data and spatially reallocated the mobile source emissions using vehicle flow survey data. Furthermore, we calculated the PM10 and PM2.5 emissions of gasoline vehicles, which are omitted in CAPSS. The results of the air quality modeling show that vehicle control plans for both gasoline and diesel lead to decreases of 0.65-8.75 ppb and 0.02-7.09 µg/m³ in NO2 and PM10 monthly average concentrations, respectively. The large percentage decreases mainly appear near the center of the metropolis.
However, the largest NO2 decrease percentages are found in the northeast region of Gyeonggi-do, the province that surrounds the capital, Seoul. Comparing the results between the different scenarios, the impact of diesel vehicle control dominates relative to that of gasoline control. The diesel-only reduction plan shows that NO2 and PM10 improved by 2.93 ppb and 3.32 µg/m³, respectively.
Adams, Russell; Quinn, Paul F; Perks, Matthew; Barber, Nicholas J; Jonczyk, Jennine; Owen, Gareth J
2016-12-01
High resolution water quality data have recently become widely available from numerous catchment based monitoring schemes. However, the models that can reproduce time series of concentrations or fluxes have not kept pace with the advances in monitoring data. Model performance at predicting phosphorus (P) and sediment concentrations has frequently been poor, with models not fit for purpose except for predicting annual losses. Here, the data from the Eden Demonstration Test Catchments (DTC) project have been used to calibrate the Catchment Runoff Attenuation Flux Tool (CRAFT), a new, parsimonious model developed with the aim of modelling both the generation and attenuation of nutrients and sediments in small to medium sized catchments. The CRAFT has the ability to run on an hourly timestep and can calculate the mass of sediments and nutrients transported by three flow pathways representing rapid surface runoff, fast subsurface drainage and slow groundwater flow (baseflow). The attenuation feature of the model is introduced here; this enables surface runoff and contaminants transported via this pathway to be delayed in reaching the catchment outlet. It was used to investigate some hypotheses of nutrient and sediment transport in the Newby Beck Catchment (NBC). Model performance was assessed using a suite of metrics including visual best fit and the Nash-Sutcliffe efficiency; such a combined assessment may suit water quality models better than any single metric. Furthermore, it was found that, when the aim of the simulations was to reproduce the time series of total P (TP) or total reactive P (TRP) with the best visual fit, attenuation was required. The model will be used in the future to explore the impacts on water quality of different mitigation options in the catchment; these will include attenuation of surface runoff. Copyright © 2016 Elsevier B.V. All rights reserved.
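The Nash-Sutcliffe efficiency used above compares the model's residual variance to the variance of the observations: 1 is a perfect fit, 0 means the model is no better than predicting the observed mean, and negative values mean it is worse. A minimal sketch, with hypothetical TP concentrations rather than Newby Beck data:

```python
def nse(observed, simulated):
    # Nash-Sutcliffe efficiency: 1 - sum of squared residuals
    # divided by total variance of the observations about their mean
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1 - num / den

# hypothetical hourly total P concentrations (mg/L), observed vs simulated
obs = [0.05, 0.12, 0.40, 0.25, 0.10, 0.06]
sim = [0.06, 0.10, 0.35, 0.28, 0.12, 0.05]
print(round(nse(obs, sim), 3))  # 0.953
```

Because the denominator is the variance of the observations, NSE rewards capturing the storm peaks that dominate P and sediment transport, which is one reason the authors pair it with a visual-fit check rather than relying on it alone.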
Automatic Generation of High Quality DSM Based on IRS-P5 Cartosat-1 Stereo Data
NASA Astrophysics Data System (ADS)
d'Angelo, Pablo; Uttenthaler, Andreas; Carl, Sebastian; Barner, Frithjof; Reinartz, Peter
2010-12-01
IRS-P5 Cartosat-1 high resolution stereo satellite imagery is well suited for the creation of digital surface models (DSM). A system for highly automated and operational DSM and orthoimage generation based on IRS-P5 Cartosat-1 imagery is presented, with an emphasis on automated processing and product quality. The proposed system processes IRS-P5 level-1 stereo scenes using the rational polynomial coefficients (RPC) universal sensor model. The described method uses an RPC correction based on DSM alignment instead of reference images with a lower lateral accuracy; this results in improved geolocation of the DSMs and orthoimages. Following RPC correction, highly detailed DSMs with 5 m grid spacing are derived using Semiglobal Matching. The proposed method is part of an operational Cartosat-1 processor for the generation of a high resolution DSM. Evaluation of 18 scenes against independent ground truth measurements indicates a mean lateral error (CE90) of 6.7 meters and a mean vertical accuracy (LE90) of 5.1 meters.
NASA Astrophysics Data System (ADS)
Hagemann, M.; Jeznach, L. C.; Park, M. H.; Tobiason, J. E.
2016-12-01
Extreme precipitation events such as tropical storms and hurricanes are by their nature rare, yet have disproportionate and adverse effects on surface water quality. In the context of drinking water reservoirs, common concerns of such events include increased erosion and sediment transport and influx of natural organic matter and nutrients. As part of an effort to model the effects of an extreme precipitation event on water quality at the reservoir intake of a major municipal water system, this study sought to estimate extreme-event watershed responses including streamflow and exports of nutrients and organic matter for use as inputs to a 2-D hydrodynamic and water quality reservoir model. Since extreme-event watershed exports are highly uncertain, we characterized and propagated predictive uncertainty using a quasi-Monte Carlo approach to generate reservoir model inputs. Three storm precipitation depths—corresponding to recurrence intervals of 5, 50, and 100 years—were converted to streamflow in each of 9 tributaries by volumetrically scaling 2 storm hydrographs from the historical record. Rating-curve models for concentration, calibrated using 10 years of data for each of 5 constituents, were then used to estimate the parameters of a multivariate lognormal probability model of constituent concentrations, conditional on each scenario's storm date and streamflow. A quasi-random Halton sequence (n = 100) was drawn from the conditional distribution for each event scenario, and used to generate input files to a calibrated CE-QUAL-W2 reservoir model. The resulting simulated concentrations at the reservoir's drinking water intake constitute a low-discrepancy sample from the estimated uncertainty space of extreme-event source water-quality.
Limiting factors to the suitability of this approach include poorly constrained relationships between hydrology and constituent concentrations, a high-dimensional space from which to generate inputs, and relatively long run-time for the reservoir model. This approach proved useful in probing a water supply's resilience to extreme events, and to inform management responses, particularly in a region such as the American Northeast where climate change is expected to bring such events with higher frequency and intensity than have occurred in the past.
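The quasi-random sampling step described above can be sketched in one dimension: generate a Halton (van der Corput) low-discrepancy sequence of uniforms and map each point through the lognormal inverse CDF. The log-mean and log-standard deviation below are hypothetical placeholders, not the study's calibrated rating-curve parameters:

```python
import math
from statistics import NormalDist

def halton(n, base=2):
    # first n points of the one-dimensional Halton (van der Corput) sequence
    seq = []
    for i in range(1, n + 1):
        f, r, x = 1.0, 0.0, i
        while x > 0:
            f /= base
            r += f * (x % base)
            x //= base
        seq.append(r)
    return seq

def lognormal_qmc(n, mu, sigma):
    # map low-discrepancy uniforms through the lognormal inverse CDF
    nd = NormalDist(mu, sigma)
    return [math.exp(nd.inv_cdf(u)) for u in halton(n)]

# 100 quasi-random constituent concentrations (mg/L); mu and sigma are illustrative
sample = lognormal_qmc(100, mu=1.0, sigma=0.5)
print(len(sample))  # 100
```

In the workflow described, each such draw would seed one reservoir-model input file; the low-discrepancy design covers the conditional distribution more evenly than the same number of pseudo-random draws, which matters when each model run is expensive.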
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pino, Francisco; Roé, Nuria; Aguiar, Pablo, E-mail: pablo.aguiar.fernandez@sergas.es
2015-02-15
Purpose: Single photon emission computed tomography (SPECT) has become an important noninvasive imaging technique in small-animal research. Due to the high resolution required in small-animal SPECT systems, the spatially variant system response needs to be included in the reconstruction algorithm. Accurate modeling of the system response should result in a major improvement in the quality of reconstructed images. The aim of this study was to quantitatively assess the impact that an accurate modeling of spatially variant collimator/detector response has on image-quality parameters, using a low magnification SPECT system equipped with a pinhole collimator and a small gamma camera. Methods: Three methods were used to model the point spread function (PSF). For the first, only the geometrical pinhole aperture was included in the PSF. For the second, the septal penetration through the pinhole collimator was added. In the third method, the measured intrinsic detector response was incorporated. Tomographic spatial resolution was evaluated and contrast, recovery coefficients, contrast-to-noise ratio, and noise were quantified using a custom-built NEMA NU 4–2008 image-quality phantom. Results: A high correlation was found between the experimental data corresponding to intrinsic detector response and the fitted values obtained by means of an asymmetric Gaussian distribution. For all PSF models, resolution improved as the distance from the point source to the center of the field of view increased and when the acquisition radius diminished. An improvement of resolution was observed after a minimum of five iterations when the PSF modeling included more corrections. Contrast, recovery coefficients, and contrast-to-noise ratio were better for the same level of noise in the image when more accurate models were included. Ring-type artifacts were observed when the number of iterations exceeded 12.
Conclusions: Accurate modeling of the PSF improves resolution, contrast, and recovery coefficients in the reconstructed images. To avoid the appearance of ring-type artifacts, the number of iterations should be limited. In low magnification systems, the intrinsic detector PSF plays a major role in improvement of the image-quality parameters.
High Tech Training at Arthur Andersen and Co.
ERIC Educational Resources Information Center
Dennis, Verl E.
1984-01-01
Discusses Arthur Andersen and Company's reasons for using high technology in job training, including its ability to improve productivity, provide training on demand, reduce training costs, and keep educational quality consistent. A Life Cycle Model which is used to integrate high technology into this accounting company's educational programs is…
Linking Dynamic Habitat Selection with Wading Bird Foraging Distributions across Resource Gradients
Beerens, James M.; Noonburg, Erik G.; Gawlik, Dale E.
2015-01-01
Species distribution models (SDM) link species occurrence with a suite of environmental predictors and provide an estimate of habitat quality when the variable set captures the biological requirements of the species. SDMs are inherently more complex when they include components of a species’ ecology such as conspecific attraction and behavioral flexibility to exploit resources that vary across time and space. Wading birds are highly mobile, demonstrate flexible habitat selection, and respond quickly to changes in habitat quality, thus serving as important indicator species for wetland systems. We developed a spatio-temporal, multi-SDM framework using Great Egret (Ardea alba), White Ibis (Eudocimus albus), and Wood Stork (Mycteria americana) distributions over a decadal gradient of environmental conditions to predict species-specific abundance across space and locations used on the landscape over time. In models of temporal dynamics, species demonstrated conditional preferences for resources based on resource levels linked to differing temporal scales. Wading bird abundance was highest when prey production from optimal periods of inundation was concentrated in shallow depths. Similar responses were observed in models predicting locations used over time, accounting for spatial autocorrelation. Species clustered in response to differing habitat conditions, indicating that social attraction can co-vary with foraging strategy, water-level changes, and habitat quality. This modeling framework can be applied to evaluate the multi-annual resource pulses occurring in real-time, climate change scenarios, or restorative hydrological regimes by tracking changing seasonal and annual distribution and abundance of high quality foraging patches. PMID:26107386
Here the data: the new FLUXNET collection and the future for model-data integration
NASA Astrophysics Data System (ADS)
Papale, D.; Pastorello, G.; Trotta, C.; Chu, H.; Canfora, E.; Agarwal, D.; Baldocchi, D. D.; Torn, M. S.
2016-12-01
Seven years after the release of the LaThuile FLUXNET database, widely used in synthesis activities and model-data fusion exercises, a new FLUXNET collection has been released (FLUXNET 2015 - http://fluxnet.fluxdata.org) with the aim of increasing the quality of the measurements and providing high quality standardized data obtained through a new processing pipeline. The new FLUXNET collection also includes sites with time series of 20 years of continuous carbon and energy fluxes, opening new opportunities for their use in model parameterization and validation. The main characteristics of the FLUXNET 2015 dataset are uncertainty quantification, multiple products (e.g. partitioning into photosynthesis and ecosystem respiration) that allow consistency analysis for each site, and new long-term downscaled meteorological data provided with the data. Feedback from new users, in particular from the modelling communities, is crucial to further improve the quality of the products and move in the direction of a coherent integration across multi-disciplinary communities. In this presentation, the new FLUXNET 2015 dataset will be explained and explored, with particular focus on the meaning of the different products and variables, their potential but also their limitations. The future development of the dataset will be discussed, including the role of the regional networks and the ongoing efforts to provide new and advanced services such as near-real-time data provision and a completely open access policy for high quality standardized measurements of GHG exchanges and additional ecological quantities.
Downey, Brandon; Schmitt, John; Beller, Justin; Russell, Brian; Quach, Anthony; Hermann, Elizabeth; Lyon, David; Breit, Jeffrey
2017-11-01
As the biopharmaceutical industry evolves to include more diverse protein formats and processes, more robust control of Critical Quality Attributes (CQAs) is needed to maintain processing flexibility without compromising quality. Active control of CQAs has been demonstrated using model predictive control techniques, which allow development of processes that are robust against disturbances associated with raw material variability and other potentially flexible operating conditions. Wide adoption of model predictive control in biopharmaceutical cell culture processes has been hampered, however, in part by the large amount of data and expertise required to build a predictive model of controlled CQAs, a prerequisite for model predictive control. Here we developed a highly automated perfusion apparatus to systematically and efficiently generate predictive models through application of system identification approaches. We successfully created a predictive model of %galactosylation using data obtained by manipulating galactose concentration in the perfusion apparatus in serialized step-change experiments. We then demonstrated the use of the model in a model predictive controller in a simulated control scenario to successfully achieve a %galactosylation set point in a simulated fed-batch culture. The automated model identification approach demonstrated here can potentially be generalized to many CQAs and offers a more efficient, faster, and highly automated alternative to batch experiments for developing predictive models in cell culture processes, allowing wider adoption of model predictive control in biopharmaceutical processes. © 2017 The Authors. Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers. Biotechnol. Prog., 33:1647-1661, 2017.
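The control scheme described above can be sketched in miniature. The snippet below is an illustrative one-step model predictive controller acting on an assumed first-order response of %galactosylation to galactose concentration; the gain, time constant, and input bounds are invented for the example and are not parameters from the study.

```python
# Hypothetical sketch: one-step model predictive control of %galactosylation.
# The first-order gain/time-constant process model and all numbers are
# illustrative assumptions, not values identified in the paper.

def simulate_mpc(setpoint, steps=50, dt=1.0, gain=0.8, tau=5.0,
                 u_min=0.0, u_max=20.0):
    """Drive %galactosylation y toward `setpoint` by choosing the galactose
    concentration u that minimizes the next-step tracking error."""
    y = 0.0
    history = []
    for _ in range(steps):
        # Identified process model: dy/dt = (gain*u - y) / tau
        # One-step prediction:      y_next = y + dt*(gain*u - y)/tau
        # Choose u so that the predicted y_next equals the setpoint,
        # then clip to the actuator bounds.
        u = (setpoint - y * (1 - dt / tau)) * tau / (dt * gain)
        u = max(u_min, min(u_max, u))
        y = y + dt * (gain * u - y) / tau   # apply the input to the "plant"
        history.append(y)
    return history

traj = simulate_mpc(setpoint=10.0)
```

With the bounds active early on, the controller saturates at `u_max` and then settles exactly on the set point once the unconstrained input becomes feasible.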
The increasing number and size of public databases is facilitating the collection of chemical structures and associated experimental data for QSAR modeling. However, the performance of QSAR models is highly dependent not only on the modeling methodology, but also on the quality o...
Flux-tunable heat sink for quantum electric circuits.
Partanen, M; Tan, K Y; Masuda, S; Govenius, J; Lake, R E; Jenei, M; Grönberg, L; Hassel, J; Simbierowicz, S; Vesterinen, V; Tuorila, J; Ala-Nissila, T; Möttönen, M
2018-04-20
Superconducting microwave circuits show great potential for practical quantum technological applications such as quantum information processing. However, fast and on-demand initialization of the quantum degrees of freedom in these devices remains a challenge. Here, we experimentally implement a tunable heat sink that is potentially suitable for the initialization of superconducting qubits. Our device consists of two coupled resonators. The first resonator has a high quality factor and a fixed frequency, whereas the second resonator is designed to have a low quality factor and a tunable resonance frequency. We engineer the low quality factor using an on-chip resistor and the frequency tunability using a superconducting quantum interference device. When the two resonators are in resonance, the photons in the high-quality resonator can be efficiently dissipated. We show that the corresponding loaded quality factor can be tuned from above 10^5 down to a few thousand at 10 GHz, in good quantitative agreement with our theoretical model.
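The tuning range quoted above follows from the standard Purcell-type decay formula for a resonator coupled to a lossy, detunable mode. The sketch below uses that textbook expression; the coupling rate and linewidths are illustrative choices, not the measured device parameters.

```python
import math

def loaded_quality_factor(f0, q_internal, g, kappa2, detuning):
    """Loaded Q of a high-Q resonator (frequency f0, internal quality factor
    q_internal) coupled with rate g to a lossy resonator of linewidth kappa2,
    detuned by `detuning`.  Uses the Purcell-type induced decay rate
    kappa_induced = g**2 * kappa2 / (detuning**2 + (kappa2/2)**2)."""
    kappa1 = 2 * math.pi * f0 / q_internal
    kappa_induced = g**2 * kappa2 / (detuning**2 + (kappa2 / 2) ** 2)
    return 2 * math.pi * f0 / (kappa1 + kappa_induced)

f0 = 10e9                                   # 10 GHz, as in the experiment
# Illustrative coupling g and linewidth kappa2 (angular frequencies):
q_on = loaded_quality_factor(f0, 1e5, g=2 * math.pi * 10e6,
                             kappa2=2 * math.pi * 100e6, detuning=0.0)
q_off = loaded_quality_factor(f0, 1e5, g=2 * math.pi * 10e6,
                              kappa2=2 * math.pi * 100e6,
                              detuning=2 * math.pi * 1e9)
```

On resonance the induced dissipation dominates and the loaded Q drops to a few thousand; detuned by 1 GHz it recovers toward the internal value near 10^5, mirroring the tuning range reported.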
Knowledge management systems success in healthcare: Leadership matters.
Ali, Nor'ashikin; Tretiakov, Alexei; Whiddett, Dick; Hunter, Inga
2017-01-01
To deliver high-quality healthcare, doctors need to access, interpret, and share appropriate and localised medical knowledge. Information technology is widely used to facilitate the management of this knowledge in healthcare organisations. The purpose of this study is to develop a knowledge management systems success model for healthcare organisations. A model was formulated by extending an existing generic knowledge management systems success model by including organisational and system factors relevant to healthcare. It was tested by using data obtained from 263 doctors working within two district health boards in New Zealand. Of the system factors, knowledge content quality was found to be particularly important for knowledge management systems success. Of the organisational factors, leadership was the most important, and more important than incentives. Leadership promoted knowledge management systems success primarily by positively affecting knowledge content quality. Leadership also promoted knowledge management use for retrieval, which should lead to the use of that better quality knowledge by the doctors, ultimately resulting in better outcomes for patients. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Thompson, William K; Rasmussen, Luke V; Pacheco, Jennifer A; Peissig, Peggy L; Denny, Joshua C; Kho, Abel N; Miller, Aaron; Pathak, Jyotishman
2012-01-01
The development of Electronic Health Record (EHR)-based phenotype selection algorithms is a non-trivial and highly iterative process involving domain experts and informaticians. To make it easier to port algorithms across institutions, it is desirable to represent them using an unambiguous formal specification language. For this purpose we evaluated the recently developed National Quality Forum (NQF) information model designed for EHR-based quality measures: the Quality Data Model (QDM). We selected 9 phenotyping algorithms that had been previously developed as part of the eMERGE consortium and translated them into QDM format. Our study concluded that the QDM contains several core elements that make it a promising format for EHR-driven phenotyping algorithms for clinical research. However, we also found areas in which the QDM could be usefully extended, such as representing information extracted from clinical text, and the ability to handle algorithms that do not consist of Boolean combinations of criteria.
An Investigation of Large Aircraft Handling Qualities
NASA Astrophysics Data System (ADS)
Joyce, Richard D.
An analytical technique for investigating transport aircraft handling qualities is exercised in a study using models of two such vehicles, a Boeing 747 and Lockheed C-5A. Two flight conditions are employed for climb and directional tasks, and a third is included for a flare task. The analysis technique is based upon a "structural model" of the human pilot developed by Hess. The associated analysis procedure has been discussed previously in the literature, but has centered almost exclusively on the characteristics of high-performance fighter aircraft. The handling qualities rating level (HQRL) and pilot induced oscillation tendencies rating level (PIORL) are predicted for nominal configurations of the aircraft and for "damaged" configurations where actuator rate limits are introduced as nonlinearities. It is demonstrated that the analysis can accommodate nonlinear pilot/vehicle behavior and do so in the context of specific flight tasks, yielding estimates of handling qualities, pilot-induced oscillation tendencies and upper limits of task performance. A brief human-in-the-loop tracking study was performed to provide a limited validation of the pilot model employed.
Becker, Nina I; Encarnação, Jorge A
2015-01-01
Species distribution and endangerment can be assessed by habitat-suitability modelling. This study addresses methodological aspects of habitat suitability modelling and includes an application example in actual species conservation and landscape planning. Models using species presence-absence data are preferable to presence-only models. In contrast to species presence data, absences are rarely recorded. Therefore, many studies generate pseudo-absence data for modelling. However, in this study model quality was higher with null samples collected in the field. Next to species data, the choice of landscape data is crucial for suitability modelling. Landscape data with high resolution and ecological relevance for the study species improve model reliability and quality for small elusive mammals like Muscardinus avellanarius. For large-scale assessment of species distribution, models with low-detailed data are sufficient. For regional site-specific conservation issues, like a conflict-free site for new wind turbines, high-detailed regional models are needed. Even though the overlap with optimally suitable habitat for M. avellanarius was low, the installation of wind plants can pose a threat due to habitat loss and fragmentation. To conclude, modellers should clearly state the purpose of their models and choose the corresponding level of detail for species and environmental data.
Mathematical model for prediction of efficiency indicators of educational activity in high school
NASA Astrophysics Data System (ADS)
Tikhonova, O. M.; Kushnikov, V. A.; Fominykh, D. S.; Rezchikov, A. F.; Ivashchenko, V. A.; Bogomolov, A. S.; Filimonyuk, L. Yu; Dolinina, O. N.; Kushnikov, O. V.; Shulga, T. E.; Tverdokhlebov, V. A.
2018-05-01
The quality of higher education is a pressing problem worldwide. The paper presents a system dedicated to predicting the accreditation indicators of technical universities based on J. Forrester's system dynamics approach. A mathematical model for predicting efficiency indicators of educational activity is developed, based on the apparatus of nonlinear differential equations.
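A Forrester-style system dynamics model of the kind referenced can be sketched as a level integrated under feedback rates. The equation, coefficients, and indicator below are illustrative assumptions for the example only; the paper does not publish its equations in this abstract.

```python
# Minimal sketch of a Forrester-style system dynamics model: a single "level"
# (e.g. a university accreditation indicator) changes at a rate set by a
# reinforcing investment term and a limiting attrition term, integrated with
# Euler's method.  All equations and numbers are illustrative assumptions.

def predict_indicator(x0, funding, capacity, years, dt=0.1):
    """Integrate dx/dt = funding*x*(1 - x/capacity) - 0.1*x:
    logistic growth from investment minus a constant attrition rate."""
    x, t = x0, 0.0
    while t < years:
        dxdt = funding * x * (1 - x / capacity) - 0.1 * x
        x += dt * dxdt
        t += dt
    return x

final = predict_indicator(x0=10.0, funding=0.5, capacity=100.0, years=30.0)
```

The trajectory settles at the equilibrium where the feedback terms balance (here x = 80), which is the kind of long-run indicator forecast such models produce.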
Lean business model and implementation of a geriatric fracture center.
Kates, Stephen L
2014-05-01
Geriatric hip fracture is a common event associated with high costs of care and often with suboptimal outcomes for the patients. Ideally, a new care model to manage geriatric hip fractures would address both quality and safety of patient care as well as the need for reduced costs of care. The geriatric fracture center model of care is one such model reported to improve both outcomes and quality of care. It is a lean business model applied to medicine. This article describes basic lean business concepts applied to geriatric fracture care and information needed to successfully implement a geriatric fracture center. It is written to assist physicians and surgeons in their efforts to implement an improved care model for their patients. Copyright © 2014 Elsevier Inc. All rights reserved.
Discovery learning model with geogebra assisted for improvement mathematical visual thinking ability
NASA Astrophysics Data System (ADS)
Juandi, D.; Priatna, N.
2018-05-01
The main goal of this study is to improve the mathematical visual thinking ability of high school student through implementation the Discovery Learning Model with Geogebra Assisted. This objective can be achieved through study used quasi-experimental method, with non-random pretest-posttest control design. The sample subject of this research consist of 62 senior school student grade XI in one of school in Bandung district. The required data will be collected through documentation, observation, written tests, interviews, daily journals, and student worksheets. The results of this study are: 1) Improvement students Mathematical Visual Thinking Ability who obtain learning with applied the Discovery Learning Model with Geogebra assisted is significantly higher than students who obtain conventional learning; 2) There is a difference in the improvement of students’ Mathematical Visual Thinking ability between groups based on prior knowledge mathematical abilities (high, medium, and low) who obtained the treatment. 3) The Mathematical Visual Thinking Ability improvement of the high group is significantly higher than in the medium and low groups. 4) The quality of improvement ability of high and low prior knowledge is moderate category, in while the quality of improvement ability in the high category achieved by student with medium prior knowledge.
Merkow, Ryan P; Hall, Bruce L; Cohen, Mark E; Wang, Xue; Adams, John L; Chow, Warren B; Lawson, Elise H; Bilimoria, Karl Y; Richards, Karen; Ko, Clifford Y
2013-03-01
To develop a reliable, robust, parsimonious, risk-adjusted 30-day composite colectomy outcome measure. A fundamental aspect in the pursuit of high-quality care is the development of valid and reliable performance measures in surgery. Colon resection is associated with appreciable morbidity and mortality and therefore is an ideal quality improvement target. From 2010 American College of Surgeons National Surgical Quality Improvement Program data, patients were identified who underwent colon resection for any indication. A composite outcome of death or any serious morbidity within 30 days of the index operation was established. A 6-predictor, parsimonious model was developed and compared with a more complex model with more variables. National caseload requirements were calculated on the basis of increasing reliability thresholds. From 255 hospitals, 22,346 patients were accrued who underwent a colon resection in 2010, most commonly for neoplasm (46.7%). A mortality or serious morbidity event occurred in 4461 patients (20.0%). At the hospital level, the median composite event rate was 20.7% (interquartile range: 15.8%-26.3%). The parsimonious model performed similarly to the full model (Akaike information criterion: 19,411 vs 18,988), and hospital-level performance comparisons were highly correlated (R = 0.97). At a reliability threshold of 0.4, 56 annual colon resections would be required and achievable at an estimated 42% of US and 69% of American College of Surgeons National Surgical Quality Improvement Program hospitals. This 42% of US hospitals performed approximately 84% of all colon resections in the country in 2008. It is feasible to design a measure with a composite outcome of death or serious morbidity after colon surgery that has a low burden for data collection, has substantial clinical importance, and has acceptable reliability.
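The caseload threshold reported above comes from a standard reliability calculation for hierarchical outcome data. The sketch below uses the usual variance-components formula; the between-hospital and within-hospital variances are illustrative placeholders, not the study's estimates.

```python
# Sketch of the reliability calculation behind caseload thresholds.
# Hospital-level reliability of an outcome rate observed over n cases:
#     R = var_between / (var_between + var_within / n)
# Solving for the annual caseload that achieves a target reliability:
#     n = (R / (1 - R)) * (var_within / var_between)
# Variance components below are illustrative, not the study's estimates.

def reliability(n, var_between, var_within):
    return var_between / (var_between + var_within / n)

def cases_needed(target_r, var_between, var_within):
    return (target_r / (1.0 - target_r)) * (var_within / var_between)

n = cases_needed(0.4, var_between=0.002, var_within=0.16)
```

Reliability rises with caseload but with diminishing returns, which is why a moderate threshold such as 0.4 is achievable at many hospitals while higher thresholds would exclude most of them.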
NASA Technical Reports Server (NTRS)
Lawrence, Stella
1992-01-01
This paper is concerned with methods of measuring and developing quality software. Reliable flight and ground support software is a highly important factor in the successful operation of the space shuttle program. Reliability is probably the most important of the characteristics inherent in the concept of 'software quality'. It is the probability of failure free operation of a computer program for a specified time and environment.
Coles, Graeme D; Wratten, Stephen D; Porter, John R
2016-01-01
Human food security requires the production of sufficient quantities of both high-quality protein and dietary energy. In a series of case-studies from New Zealand, we show that while production of food ingredients from crops on arable land can meet human dietary energy requirements effectively, requirements for high-quality protein are met more efficiently by animal production from such land. We present a model that can be used to assess dietary energy and quality-corrected protein production from various crop and crop/animal production systems, and demonstrate its utility. We extend our analysis with an accompanying economic analysis of commercially-available, pre-prepared or simply-cooked foods that can be produced from our case-study crop and animal products. We calculate the per-person, per-day cost of both quality-corrected protein and dietary energy as provided in the processed foods. We conclude that mixed dairy/cropping systems provide the greatest quantity of high-quality protein per unit price to the consumer, have the highest food energy production and can support the dietary requirements of the highest number of people, when assessed as all-year-round production systems. Global food and nutritional security will largely be an outcome of national or regional agroeconomies addressing their own food needs. We hope that our model will be used for similar analyses of food production systems in other countries, agroecological zones and economies.
NASA Astrophysics Data System (ADS)
Brasington, J.
2015-12-01
Over the last five years, Structure-from-Motion photogrammetry has dramatically democratized the availability of high quality topographic data. This approach involves the use of a non-linear bundle adjustment to estimate simultaneously camera position, pose, distortion and 3D model coordinates. In contrast to traditional aerial photogrammetry, the bundle adjustment is typically solved without external constraints and instead ground control is used a posteriori to transform the modelled coordinates to an established datum using a similarity transformation. The limited data requirements, coupled with the ability to self-calibrate compact cameras, has led to a burgeoning of applications using low-cost imagery acquired terrestrially or from low-altitude platforms. To date, most applications have focused on relatively small spatial scales where relaxed logistics permit the use of dense ground control and high resolution, close-range photography. It is less clear whether this low-cost approach can be successfully upscaled to tackle larger, watershed-scale projects extending over 10^2-10^3 km2 where it could offer a competitive alternative to landscape modelling with airborne lidar. At such scales, compromises over the density of ground control, the speed and height of sensor platform and related image properties are inevitable. In this presentation we provide a systematic assessment of large-scale SfM terrain products derived for over 80 km2 of the braided Dart River and its catchment in the Southern Alps of NZ. Reference data in the form of airborne and terrestrial lidar are used to quantify the quality of 3D reconstructions derived from helicopter photography and used to establish baseline uncertainty models for geomorphic change detection.
Results indicate that camera network design is a key determinant of model quality, and that standard aerial networks based on strips of nadir photography can lead to unstable camera calibration and systematic errors that are difficult to model with sparse ground control. We demonstrate how a low cost multi-camera platform providing both nadir and oblique imagery can support robust camera calibration, enabling the generation of high quality, large-scale terrain products that are suitable for precision fluvial change detection.
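The a-posteriori georeferencing step described above, mapping arbitrary SfM model coordinates onto ground control with a similarity transformation, is a least-squares problem with a closed-form solution (Umeyama's method). The sketch below implements it; the synthetic ground-control points are illustrative.

```python
import numpy as np

# Estimate the similarity transform (scale s, rotation R, translation t)
# minimizing ||gcp - (s * model @ R.T + t)||^2 (Umeyama's closed form).

def similarity_transform(src, dst):
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_s, dst - mu_d
    cov = dst_c.T @ src_c / len(src)          # cross-covariance matrix
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1.0                        # guard against reflections
    R = U @ S @ Vt
    s = np.trace(np.diag(D) @ S) / src_c.var(axis=0).sum()
    t = mu_d - s * R @ mu_s
    return s, R, t

# Synthetic check: model-space points and ground control related by a known
# similarity transform (all values illustrative).
rng = np.random.default_rng(0)
model_pts = rng.normal(size=(20, 3))          # arbitrary SfM coordinates
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
gcp_pts = 2.5 * model_pts @ R_true.T + np.array([100.0, 200.0, 50.0])
s_est, R_est, t_est = similarity_transform(model_pts, gcp_pts)
```

With exact correspondences the known scale, rotation, and translation are recovered; with noisy ground control the same routine gives the least-squares datum transformation.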
NASA Astrophysics Data System (ADS)
Liu, Yuanyuan; Peng, Yankun; Zhang, Leilei; Dhakal, Sagar; Wang, Caiping
2014-05-01
Pork is one of the most highly consumed meats in the world. With the growing improvement of living standards, stakeholders including consumers and regulatory bodies pay more attention to the comprehensive quality of fresh pork. Different analytical laboratory-based technologies exist to determine quality attributes of pork; however, none of them meets the industry's desire for rapid and non-destructive measurement. The current study used an optical instrument as a rapid and non-destructive tool to classify 24 h-aged pork longissimus dorsi samples into three kinds of meat (PSE, normal and DFD) on the basis of color L* and pH24. A total of 66 samples were used in the experiment. An optical system based on Vis/NIR spectral acquisition (300-1100 nm) was developed in the laboratory to acquire spectral signals of pork samples. A median smoothing filter (M-filter) and multiplicative scatter correction (MSC) were used to remove spectral noise and signal drift. A support vector machine (SVM) prediction model was developed to classify the samples based on their comprehensive qualities. The results showed that the classification model is highly correlated with the actual quality parameters, with classification accuracy above 85%. The system being simple, easy to use, and promising, it can be used in the meat processing industry for real-time, non-destructive and rapid detection of pork quality in the future.
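The two preprocessing steps named in the abstract, median smoothing and multiplicative scatter correction, can be sketched as follows. The toy "spectra" are synthetic; in the study the corrected spectra would feed the SVM classifier.

```python
import numpy as np

# Illustrative sketch of spectral preprocessing: a median smoothing filter
# (M-filter) to suppress spikes and multiplicative scatter correction (MSC)
# to remove additive/multiplicative signal drift between samples.

def median_filter(spectrum, window=5):
    """Median smoothing to suppress spike noise in a 1-D spectrum."""
    half = window // 2
    padded = np.pad(spectrum, half, mode="edge")
    return np.array([np.median(padded[i:i + window])
                     for i in range(len(spectrum))])

def msc(spectra):
    """MSC: regress each spectrum on the mean spectrum (x = a*ref + b),
    then remove scatter effects by returning (x - b) / a."""
    ref = spectra.mean(axis=0)
    corrected = np.empty_like(spectra)
    for i, x in enumerate(spectra):
        a, b = np.polyfit(ref, x, 1)
        corrected[i] = (x - b) / a
    return corrected

base = np.sin(np.linspace(0.0, 3.0, 50))      # underlying "true" spectrum
spiky = base.copy()
spiky[25] += 5.0                              # a single spike artifact
smoothed = median_filter(spiky)

# Two measurements of the same sample under different scatter conditions;
# MSC should map both back onto a common shape.
spectra = np.stack([1.5 * base + 0.2, 0.7 * base - 0.1])
corrected = msc(spectra)
```

After correction the two distorted copies coincide, so downstream classification sees chemistry rather than scatter.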
Quality of care and patient satisfaction in hospitals with high concentrations of black patients.
Brooks-Carthon, J Margo; Kutney-Lee, Ann; Sloane, Douglas M; Cimiotti, Jeannie P; Aiken, Linda H
2011-09-01
To examine the influence of nursing-specifically nurse staffing and the nurse work environment-on quality of care and patient satisfaction in hospitals with varying concentrations of Black patients. Cross-sectional secondary analysis of 2006-2007 nurse survey data collected across four states (Florida, Pennsylvania, New Jersey, and California), the Hospital Consumer Assessment of Healthcare Providers and Systems survey, and administrative data. Global analysis of variance and linear regression models were used to examine the association between the concentration of Black patients on quality measures (readiness for discharge, patient or family complaints, health care-associated infections) and patient satisfaction, before and after accounting for nursing and hospital characteristics. Nurses working in hospitals with higher concentrations of Blacks reported poorer confidence in patients' readiness for discharge and more frequent complaints and infections. Patients treated in hospitals with higher concentrations of Blacks were less satisfied with their care. In the fully adjusted regression models for quality and patient satisfaction outcomes, the effects associated with the concentration of Blacks were explained in part by nursing and structural hospital characteristics. This study demonstrates a relationship between nursing, structural hospital characteristics, quality of care, and patient satisfaction in hospitals with high concentrations of Black patients. Consideration of nursing factors, in addition to other important hospital characteristics, is critical to understanding and improving quality of care and patient satisfaction in minority-serving hospitals. © 2011 Sigma Theta Tau International.
Forecasting relative impacts of land use on anadromous fish habitat to guide conservation planning.
Lohse, Kathleen A; Newburn, David A; Opperman, Jeff J; Merenlender, Adina M
2008-03-01
Land use change can adversely affect water quality and freshwater ecosystems, yet our ability to predict how systems will respond to different land uses, particularly rural-residential development, is limited by data availability and our understanding of biophysical thresholds. In this study, we use spatially explicit parcel-level data to examine the influence of land use (including urban, rural-residential, and vineyard) on salmon spawning substrate quality in tributaries of the Russian River in California. We develop a land use change model to forecast the probability of losses in high-quality spawning habitat and recommend priority areas for incentive-based land conservation efforts. Ordinal logistic regression results indicate that all three land use types were negatively associated with spawning substrate quality, with urban development having the largest marginal impact. For two reasons, however, forecasted rural-residential and vineyard development have much larger influences on decreasing spawning substrate quality relative to urban development. First, the land use change model estimates 10 times greater land use conversion to both rural-residential and vineyard compared to urban. Second, forecasted urban development is concentrated in the most developed watersheds, which already have poor spawning substrate quality, such that the marginal response to future urban development is less significant. To meet the goals of protecting salmonid spawning habitat and optimizing investments in salmon recovery, we suggest investing in watersheds where future rural-residential development and vineyards threaten high-quality fish habitat, rather than the most developed watersheds, where land values are higher.
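The ordinal logistic (proportional-odds) model used above maps land-use covariates onto probabilities over ordered substrate-quality classes. The sketch below shows that mapping; the coefficients, cutpoints, and covariate values are illustrative assumptions, not the fitted values from the paper.

```python
import math

# Sketch of a proportional-odds (ordinal logistic) model over ordered
# spawning substrate quality classes.  Coefficients and cutpoints are
# illustrative, not the study's estimates.

def ordinal_probs(x, beta, cutpoints):
    """Class probabilities from cumulative logits:
    P(Y <= k) = sigmoid(c_k - x.beta); classes ordered poor -> high quality."""
    eta = sum(xi * bi for xi, bi in zip(x, beta))
    cum = [1.0 / (1.0 + math.exp(-(c - eta))) for c in cutpoints] + [1.0]
    return [cum[0]] + [cum[k] - cum[k - 1] for k in range(1, len(cum))]

# Covariates: fractions of (urban, rural-residential, vineyard) land use.
# Negative coefficients shift probability mass toward poorer substrate,
# with urban given the largest marginal impact, as the regression found.
beta = (-2.0, -1.2, -1.0)
cutpoints = (-1.0, 1.0)                 # poor | fair | high quality
probs_low_dev = ordinal_probs((0.05, 0.10, 0.00), beta, cutpoints)
probs_high_dev = ordinal_probs((0.60, 0.20, 0.10), beta, cutpoints)
```

A heavily developed watershed gets a visibly larger probability of the poorest substrate class than a lightly developed one, which is the qualitative pattern the regression reports.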
Pathak, Jyotishman; Bailey, Kent R; Beebe, Calvin E; Bethard, Steven; Carrell, David S; Chen, Pei J; Dligach, Dmitriy; Endle, Cory M; Hart, Lacey A; Haug, Peter J; Huff, Stanley M; Kaggal, Vinod C; Li, Dingcheng; Liu, Hongfang; Marchant, Kyle; Masanz, James; Miller, Timothy; Oniki, Thomas A; Palmer, Martha; Peterson, Kevin J; Rea, Susan; Savova, Guergana K; Stancl, Craig R; Sohn, Sunghwan; Solbrig, Harold R; Suesse, Dale B; Tao, Cui; Taylor, David P; Westberg, Les; Wu, Stephen; Zhuo, Ning; Chute, Christopher G
2013-01-01
Research objective: To develop scalable informatics infrastructure for normalization of both structured and unstructured electronic health record (EHR) data into a unified, concept-based model for high-throughput phenotype extraction. Materials and methods: Software tools and applications were developed to extract information from EHRs. Representative and convenience samples of both structured and unstructured data from two EHR systems, Mayo Clinic and Intermountain Healthcare, were used for development and validation. Extracted information was standardized and normalized to meaningful use (MU) conformant terminology and value set standards using Clinical Element Models (CEMs). These resources were used to demonstrate semi-automatic execution of MU clinical-quality measures modeled using the Quality Data Model (QDM) and an open-source rules engine. Results: Using CEMs and open-source natural language processing and terminology services engines, namely Apache clinical Text Analysis and Knowledge Extraction System (cTAKES) and Common Terminology Services (CTS2), we developed a data-normalization platform that ensures data security, end-to-end connectivity, and reliable data flow within and across institutions. We demonstrated the applicability of this platform by executing a QDM-based MU quality measure that determines the percentage of patients between 18 and 75 years with diabetes whose most recent low-density lipoprotein cholesterol test result during the measurement year was <100 mg/dL on a randomly selected cohort of 273 Mayo Clinic patients. The platform identified 21 and 18 patients for the denominator and numerator of the quality measure, respectively. Validation results indicate that all identified patients meet the QDM-based criteria.
Conclusions: End-to-end automated systems for extracting clinical information from diverse EHR systems require extensive use of standardized vocabularies and terminologies, as well as robust information models for storing, discovering, and processing that information. This study demonstrates the application of modular and open-source resources for enabling secondary use of EHR data through normalization into a standards-based, comparable, and consistent format for high-throughput phenotyping to identify patient cohorts. PMID:24190931
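The quality-measure logic executed by the platform can be sketched as a cohort filter. The record structure below is hypothetical; the real system evaluates CEM-normalized EHR data through a rules engine rather than Python dictionaries.

```python
# Hypothetical sketch of the QDM-based measure: denominator = patients 18-75
# with diabetes; numerator = those whose most recent LDL result during the
# measurement year was < 100 mg/dL.  Record fields are illustrative.

def diabetes_ldl_measure(patients, year=2012):
    denominator, numerator = [], []
    for p in patients:
        if not (18 <= p["age"] <= 75 and p["has_diabetes"]):
            continue                                  # fails denominator
        denominator.append(p["id"])
        ldl = [r for r in p["ldl_results"] if r["year"] == year]
        if ldl and max(ldl, key=lambda r: r["date"])["value"] < 100:
            numerator.append(p["id"])
    return denominator, numerator

cohort = [
    {"id": 1, "age": 60, "has_diabetes": True,
     "ldl_results": [{"year": 2012, "date": "2012-03-01", "value": 130},
                     {"year": 2012, "date": "2012-09-01", "value": 95}]},
    {"id": 2, "age": 45, "has_diabetes": True,
     "ldl_results": [{"year": 2012, "date": "2012-05-01", "value": 120}]},
    {"id": 3, "age": 80, "has_diabetes": True, "ldl_results": []},
]
den, num = diabetes_ldl_measure(cohort)
```

Note that only the most recent result in the measurement year counts: patient 1 qualifies for the numerator on the September value despite an earlier high reading.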
ERIC Educational Resources Information Center
Wolf-Branigin, Michael; Schuyler, Vincent; White, Patience
2007-01-01
Improving quality of life is the primary focus as adolescents with disabilities enter adulthood. They increasingly, however, encounter difficulties transitioning into domains such as employment as these services occur near the end of their high school experience. Using an ecosystems model within a developmental approach, the program sought to…
NASA Astrophysics Data System (ADS)
Ranatunga, T.; Tong, S.; Yang, J.
2011-12-01
Hydrologic and water quality models can provide a general framework to conceptualize and investigate the relationships between climate and water resources. Under a hot and dry climate, highly urbanized watersheds are more vulnerable to changes in climate, such as excess heat and drought. In this study, a comprehensive watershed model, the Hydrological Simulation Program FORTRAN (HSPF), is used to assess the impacts of future climate change on the stream discharge and water quality in Las Vegas Wash in Nevada, the only surface water body that drains from the Las Vegas Valley (an area with rapid population growth and urbanization) to Lake Mead. In this presentation, the process of model building, calibration and validation, the generation of climate change scenarios, and the assessment of future climate change effects on stream hydrology and quality are demonstrated. The hydrologic and water quality model is developed based on data from current national databases and the existing major land use categories of the watershed. The model is calibrated for stream discharge, nutrients (nitrogen and phosphorus) and sediment yield. The climate change scenarios are derived from the outputs of Global Climate Model (GCM) and Regional Climate Model (RCM) simulations, and from the recent assessment reports of the Intergovernmental Panel on Climate Change (IPCC). The Climate Assessment Tool from US EPA's BASINS is used to assess the effects of likely future climate scenarios on the water quantity and quality in Las Vegas Wash. The presentation also discusses the consequences of these hydrologic changes, including shortfalls in the supply of clean water during peak seasons of water demand, increased eutrophication potential, wetland deterioration, and impacts on wildlife habitats.
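The abstract does not detail how the climate scenarios are imposed on the calibrated model, but tools such as BASINS' Climate Assessment Tool typically work by perturbing the historical forcing records. A minimal sketch of such a delta-change adjustment (multiplicative for precipitation, additive for temperature) might look like the following; the function and argument names are illustrative, not from HSPF or BASINS:

```python
def apply_delta_change(precip_mm, temp_c, precip_factor=1.0, temp_delta_c=0.0):
    """Perturb an observed forcing series into a climate scenario:
    precipitation is scaled multiplicatively, temperature is shifted
    additively. Both inputs are simple per-timestep lists."""
    scenario_precip = [p * precip_factor for p in precip_mm]
    scenario_temp = [t + temp_delta_c for t in temp_c]
    return scenario_precip, scenario_temp

# Example: a scenario with 50% more precipitation and +2 degrees C warming,
# with the factors loosely standing in for GCM/RCM-derived changes.
wet_warm_precip, wet_warm_temp = apply_delta_change(
    [10.0, 0.0, 4.0], [20.0, 22.0, 25.0], precip_factor=1.5, temp_delta_c=2.0)
```

The perturbed series would then be fed back into the calibrated watershed model to simulate scenario discharge and water quality.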
Global Atmosphere Watch Workshop on Measurement-Model ...
The World Meteorological Organization’s (WMO) Global Atmosphere Watch (GAW) Programme coordinates high-quality observations of atmospheric composition from global to local scales with the aim to drive high-quality and high-impact science while co-producing a new generation of products and services. In line with this vision, GAW’s Scientific Advisory Group for Total Atmospheric Deposition (SAG-TAD) has a mandate to produce global maps of wet, dry and total atmospheric deposition for important atmospheric chemicals to enable research into biogeochemical cycles and assessments of ecosystem and human health effects. The most suitable scientific approach for this activity is the emerging technique of measurement-model fusion for total atmospheric deposition. This technique requires global-scale measurements of atmospheric trace gases, particles, precipitation composition and precipitation depth, as well as predictions of the same from global/regional chemical transport models. The fusion of measurement and model results requires data assimilation and mapping techniques. The objective of the GAW Workshop on Measurement-Model Fusion for Global Total Atmospheric Deposition (MMF-GTAD), an initiative of the SAG-TAD, was to review the state-of-the-science and explore the feasibility and methodology of producing, on a routine retrospective basis, global maps of atmospheric gas and aerosol concentrations as well as wet, dry and total deposition via measurement-model
Liu, Mei; Lu, Jun
2014-09-01
Water quality forecasting in agricultural drainage river basins is difficult because of the complicated nonpoint source (NPS) pollution transport processes and river self-purification processes involved in highly nonlinear problems. Artificial neural network (ANN) and support vector machine (SVM) models were developed to predict total nitrogen (TN) and total phosphorus (TP) concentrations at any location of a river polluted by agricultural NPS pollution in eastern China. River flow, water temperature, flow travel time, rainfall, dissolved oxygen, and upstream TN or TP concentrations were selected as initial inputs of the two models. Monthly, bimonthly, and trimonthly datasets were used to train the two models, respectively, and the same monthly dataset, which had not been used for training, was chosen to test the models in order to compare their generalization performance. Trial-and-error analysis and genetic algorithms (GA) were employed to optimize the parameters of the ANN and SVM models, respectively. The results indicated that the proposed SVM models showed better generalization ability because they avoided overtraining and required optimization of fewer parameters, based on the structural risk minimization (SRM) principle. Furthermore, both the TN and TP SVM models trained on trimonthly datasets achieved greater forecasting accuracy than the corresponding ANN models. Thus, SVM models are a powerful alternative method, offering an efficient and economical tool for accurately predicting water quality with low risk. The sensitivity analyses of the two models indicated that decreasing upstream input concentrations during the dry season, and NPS emissions along the reach during the average or flood season, should be an effective way to improve Changle River water quality. If the necessary water quality and hydrology data, even only trimonthly data, are available, the SVM methodology developed here can easily be applied to other NPS-polluted rivers.
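The fitted models themselves are not given in the abstract, but the prediction step of a trained kernel SVM reduces to a weighted sum of kernel evaluations against the support vectors. A minimal pure-Python sketch with a Gaussian (RBF) kernel; the support vectors, coefficients, and intercept below are invented stand-ins for a trained TN model, not values from the study:

```python
import math

def rbf_kernel(x, z, gamma=0.5):
    """Gaussian (RBF) kernel between two feature vectors."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, z)))

def svm_predict(x, support_vectors, dual_coefs, intercept, gamma=0.5):
    """Regression prediction of a trained kernel SVM:
    f(x) = sum_i alpha_i * K(x_i, x) + b."""
    return sum(alpha * rbf_kernel(sv, x, gamma)
               for alpha, sv in zip(dual_coefs, support_vectors)) + intercept

# Hypothetical trained model: the feature order follows the abstract's inputs
# (river flow, water temperature, travel time, rainfall, DO, upstream TN),
# scaled to [0, 1], but all parameter values are illustrative.
support_vectors = [(0.2, 0.6, 0.1, 0.0, 0.8, 0.3),
                   (0.7, 0.4, 0.5, 0.9, 0.2, 0.6)]
dual_coefs = [1.2, -0.4]
intercept = 0.05
tn_estimate = svm_predict((0.3, 0.5, 0.2, 0.1, 0.7, 0.4),
                          support_vectors, dual_coefs, intercept)
```

Because only the support vectors and their dual coefficients enter the decision function, the number of free parameters is tied to the SRM principle mentioned above rather than to network size, which is one reason SVMs can generalize well on small hydrology datasets.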
Banović, Marija; Grunert, Klaus G; Barreira, Maria Madalena; Fontes, Magda Aguiar
2010-01-01
This study investigated differences in consumers' quality perception of national branded, national store branded, and imported store branded beef. Partial Least Squares analysis is used for modelling the quality perception process. Results show that consumers perceived national branded Carnalentejana beef as better on all quality cues and quality aspects than the other two store branded beefs. Preference for Carnalentejana beef stayed highly consistent even after the blind test, where consumers differentiated this beef from the other two beef brands on all sensory dimensions (taste, tenderness, and juiciness) and chose it as the preferred one. Consumers utilized more perceived intrinsic cues to infer the expected eating quality of store branded beefs.
Respite Care, Stress, Uplifts, and Marital Quality in Parents of Children with Down Syndrome.
Norton, Michelle; Dyches, Tina Taylor; Harper, James M; Roper, Susanne Olsen; Caldarella, Paul
2016-12-01
Parents of children with disabilities are at risk for high stress and low marital quality; therefore, this study surveyed couples (n = 112) of children with Down syndrome (n = 120), assessing whether respite hours, stress, and uplifts were related to marital quality. Structural equation modeling indicated that respite hours were negatively related to wife/husband stress, which was in turn negatively related to wife/husband marital quality. Also, wife uplifts were positively related to both wife and husband marital quality. Husband uplifts were positively related to husband marital quality. Therefore, it is important that respite care is provided and accessible to parents of children with Down syndrome.
Hagger-Johnson, Gareth; Mõttus, René; Craig, Leone C A; Starr, John M; Deary, Ian J
2012-07-01
C-reactive protein (CRP) is an acute-phase marker of systemic inflammation and is considered an established risk marker for cardiovascular disease (CVD) in old age. Previous studies have suggested that low childhood intelligence, lower socioeconomic status (SES) in childhood or in later life, unhealthy behaviors, poor wellbeing, and high body mass index (BMI) are associated with inflammation. Life course models that simultaneously incorporate all these risk factors can explain how CVD risks accumulate over time, from childhood to old age. Using data from 1,091 Scottish adults (Lothian Birth Cohort Study, 1936), a path model was constructed to predict CRP at age 70 from concurrent health behaviors, self-perceived quality of life, BMI, and adulthood SES as mediating variables, and from parental SES and childhood intelligence as distal risk factors. A well-fitting path model (CFI = .92, SRMR = .05) demonstrated significant indirect effects from childhood intelligence and parental social class to inflammation via BMI, health behaviors, and quality of life (all ps < .05). Low childhood intelligence, unhealthy behaviors, and higher BMI were also direct predictors of CRP. The life course model illustrated how CVD risks may accumulate over time, beginning in childhood, with effects both direct and transmitted indirectly via low adult SES, unhealthy behaviors, impaired quality of life, and high BMI. Knowledge of childhood risk factors and their pathways to poor health can be used to identify high-risk individuals for more intensive and tailored behavior change interventions, and to develop effective public health policies.
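In a path model of this kind, an indirect effect along a mediation chain is simply the product of the standardized coefficients on its constituent paths. A small sketch with illustrative coefficients (not estimates from the study):

```python
def indirect_effect(path_coefficients):
    """Indirect effect along one mediation chain: the product of the
    standardized path coefficients (e.g. a*b for X -> M -> Y)."""
    result = 1.0
    for coefficient in path_coefficients:
        result *= coefficient
    return result

# Hypothetical chain: childhood intelligence -> BMI -> CRP at age 70.
# Both values below are made up for illustration.
a = -0.15  # higher childhood intelligence, lower adult BMI
b = 0.30   # higher BMI, higher CRP
effect = indirect_effect([a, b])  # negative: intelligence lowers CRP via BMI
```

Summing such products over every chain from a distal predictor (childhood intelligence, parental SES) to the outcome gives that predictor's total indirect effect on inflammation.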
Sandler, Irwin N; Wheeler, Lorey A; Braver, Sanford L
2013-12-01
The current study examined the associations between child mental health problems and the quality of maternal and paternal parenting, and how these associations were moderated by three contextual factors: quality of parenting by the other parent, interparental conflict, and the number of overnights parents had with the child. Data for the current study came from a sample of divorcing families who are in high legal conflict over developing or maintaining a parenting plan following divorce. Analyses revealed that the associations between child mental health problems and positive maternal and paternal parenting were moderated by the quality of parenting provided by the other parent and by the number of overnights children spent with parents, but not by the level of interparental conflict. When parenting by the other parent and number of overnights were considered together in the same model, only number of overnights moderated the relations between parenting and child-behavior problems. The results support the proposition that the well-being of children in high-conflict divorcing families is better when they spend adequate time with at least one parent who provides high-quality parenting.
A new WRF-CMAQ two-way coupled model was developed to provide a pathway for chemical feedbacks from the air quality model to the meteorological model. The essence of this interaction is focused on the direct radiative effects of scattering and absorbing aerosols in the tropospher...
ERIC Educational Resources Information Center
Lee, Shinyoung; Kim, Heui-Baik
2014-01-01
The purpose of this study is to identify the epistemological features and model qualities depending on model evaluation levels and to explore the reasoning process behind high-level evaluation through small group interaction about blood circulation. Nine groups of three to four students in the eighth grade participated in the modeling practice.…
Collibee, Charlene; Furman, Wyndol
2015-01-01
The present study assessed a developmental task theory of romantic relationships by examining associations between romantic relationship qualities and adjustment across 9 years using a community-based sample of 100 male and 100 female participants (M age at Wave 1 = 15.83) in a Western U.S. city. Using multilevel modeling, the study examined the moderating effect of age on links between romantic relationship qualities and adjustment. Consistent with developmental task theory, high romantic quality was more strongly associated with internalizing symptoms and dating satisfaction during young adulthood than in adolescence. Romantic relationship qualities were also associated with externalizing symptoms and substance use, but the degree of association was consistent across ages. The findings underscore the significance of romantic relationship qualities across development. PMID:26283151