Sample records for quality modeling approach

  1. Receiving water quality assessment: comparison between simplified and detailed integrated urban modelling approaches.

    PubMed

    Mannina, Giorgio; Viviani, Gaspare

    2010-01-01

    Urban water quality management often requires the use of numerical models allowing the evaluation of the cause-effect relationship between the input(s) (i.e. rainfall, pollutant concentrations on catchment surface and in sewer system) and the resulting water quality response. The conventional approach to the system (i.e. sewer system, wastewater treatment plant and receiving water body), considering each component separately, does not enable optimisation of the whole system. However, recent gains in understanding and modelling make it possible to represent the system as a whole and optimise its overall performance. Indeed, integrated urban drainage modelling is of growing interest as a tool to cope with Water Framework Directive requirements. Two different approaches can be employed for modelling the whole urban drainage system: detailed and simplified. Each has its advantages and disadvantages. Specifically, detailed approaches can offer a higher level of reliability in the model results, but can be very time consuming from the computational point of view. Simplified approaches are faster but may lead to greater model uncertainty due to over-simplification. To gain insight into the above problem, two different modelling approaches have been compared with respect to their uncertainty. The first urban drainage integrated model approach uses the Saint-Venant equations and the 1D advection-dispersion equations for the quantity and the quality aspects, respectively. The second model approach consists of the simplified reservoir model. The analysis used a parsimonious bespoke model developed in previous studies. For the uncertainty analysis, the Generalised Likelihood Uncertainty Estimation (GLUE) procedure was used. Model reliability was evaluated on the basis of its capacity to globally limit the uncertainty. Both models show a good capability to fit the experimental data, suggesting that the adopted approaches are equivalent for both quantity and quality. The
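
    The GLUE procedure mentioned in this record is straightforward to prototype. The sketch below is a minimal illustration, not the authors' implementation: it Monte Carlo samples parameters of a placeholder simulator, scores each set with a Nash-Sutcliffe-type likelihood, keeps the behavioural sets above a threshold, and derives likelihood-weighted 5-95% prediction bounds. The `simulate` function, prior ranges and threshold are all invented for the example.

```python
import numpy as np

def simulate(params, t):
    """Placeholder model: exponential pollutant decay with two parameters."""
    c0, k = params
    return c0 * np.exp(-k * t)

def glue(observed, t, n_samples=5000, threshold=0.3, seed=0):
    """Minimal GLUE sketch: Monte Carlo sample parameters, keep behavioural
    sets, and return likelihood-weighted 5-95% prediction bounds."""
    rng = np.random.default_rng(seed)
    # Uniform priors over invented ranges for the two illustrative parameters.
    samples = np.column_stack([rng.uniform(5.0, 20.0, n_samples),   # c0
                               rng.uniform(0.01, 1.0, n_samples)])  # k
    sims = np.array([simulate(p, t) for p in samples])
    # Nash-Sutcliffe efficiency used as an informal likelihood measure.
    ns = 1.0 - np.sum((sims - observed) ** 2, axis=1) / np.sum(
        (observed - observed.mean()) ** 2)
    behavioural = ns > threshold
    beh_sims = sims[behavioural]
    weights = ns[behavioural] / ns[behavioural].sum()
    bounds = []
    for j in range(len(t)):                 # weighted quantiles per time step
        idx = np.argsort(beh_sims[:, j])
        vals, cum = beh_sims[idx, j], np.cumsum(weights[idx])
        bounds.append((np.interp(0.05, cum, vals), np.interp(0.95, cum, vals)))
    return samples[behavioural], np.array(bounds)

t = np.linspace(0.0, 10.0, 25)
obs = simulate((12.0, 0.3), t) + np.random.default_rng(1).normal(0.0, 0.4, t.size)
params, bounds = glue(obs, t)
print(len(params), "behavioural sets; bounds at t=0:", bounds[0].round(2))
```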

  2. Hybrid Air Quality Modeling Approach For Use in the Near ...

    EPA Pesticide Factsheets

    The Near-road EXposures to Urban air pollutant Study (NEXUS) investigated whether children with asthma living in close proximity to major roadways in Detroit, MI (particularly near roadways with high diesel traffic) have greater health impacts associated with exposure to air pollutants than those living farther away. A major challenge in such health and exposure studies is the lack of information regarding pollutant exposure characterization. Air quality modeling can provide spatially and temporally varying exposure estimates for examining relationships between traffic-related air pollutants and adverse health outcomes. This paper presents a hybrid air quality modeling approach and its application in NEXUS in order to provide spatially and temporally varying exposure estimates and to identify the mobile source contribution to the total pollutant exposure. Model-based exposure metrics, associated with local variations of emissions and meteorology, were estimated using a combination of the AERMOD and R-LINE dispersion models, local emission source information from the National Emissions Inventory, detailed road network locations and traffic activity, and meteorological data from the Detroit City Airport. The regional background contribution was estimated using a combination of the Community Multiscale Air Quality (CMAQ) model and the Space/Time Ordinary Kriging (STOK) model. To capture the near-road pollutant gradients, refined “mini-grids” of model recep

  3. Fast Geometric Consensus Approach for Protein Model Quality Assessment

    PubMed Central

    Adamczak, Rafal; Pillardy, Jaroslaw; Vallat, Brinda K.

    2011-01-01

    Model quality assessment (MQA) is an integral part of protein structure prediction methods that typically generate multiple candidate models. The challenge lies in ranking and selecting the best models using a variety of physical, knowledge-based, and geometric consensus (GC)-based scoring functions. In particular, 3D-Jury and related GC methods assume that well-predicted (sub-)structures are more likely to occur frequently in a population of candidate models, compared to incorrectly folded fragments. While this approach is very successful in the context of diversified sets of models, identifying similar substructures is computationally expensive since all pairs of models need to be superimposed using MaxSub or related heuristics for structure-to-structure alignment. Here, we consider a fast alternative, in which structural similarity is assessed using 1D profiles, e.g., consisting of relative solvent accessibilities and secondary structures of equivalent amino acid residues in the respective models. We show that the new approach, dubbed 1D-Jury, makes it possible to implicitly compare and rank N models in O(N) time, as opposed to the quadratic complexity of 3D-Jury and related clustering-based methods. In addition, 1D-Jury avoids computationally expensive 3D superposition of pairs of models. At the same time, structural similarity scores based on 1D profiles are shown to correlate strongly with those obtained using MaxSub. In terms of the ability to select the best models as top candidates, 1D-Jury performs on par with other GC methods. Other potential applications of the new approach, including fast clustering of large numbers of intermediate structures generated by folding simulations, are discussed as well. PMID:21244273
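
    The 1D-profile consensus idea can be sketched compactly. The code below is an illustrative reconstruction, not the 1D-Jury implementation: each model is reduced to a per-residue secondary-structure string and a relative-solvent-accessibility vector, a consensus profile is built in a single pass over all models, and every model is then scored against that consensus, so the ranking cost grows linearly with the number of models and no 3D superposition is needed.

```python
import numpy as np
from collections import Counter

def consensus_rank(ss_profiles, rsa_profiles, w_ss=0.5, w_rsa=0.5):
    """Illustrative 1D consensus scoring: build a per-position consensus from
    all models in one pass, then score each model against it, so the ranking
    cost grows linearly with the number of models."""
    n_models, length = len(ss_profiles), len(ss_profiles[0])
    # Per-position secondary-structure letter frequencies (e.g. H/E/C).
    ss_freq = [Counter(ss[i] for ss in ss_profiles) for i in range(length)]
    rsa = np.asarray(rsa_profiles, dtype=float)      # shape (n_models, length)
    rsa_mean = rsa.mean(axis=0)
    scores = []
    for m in range(n_models):
        ss_score = np.mean([ss_freq[i][ss_profiles[m][i]] / n_models
                            for i in range(length)])          # SS agreement
        rsa_score = 1.0 - np.mean(np.abs(rsa[m] - rsa_mean))   # RSA closeness
        scores.append(w_ss * ss_score + w_rsa * rsa_score)
    scores = np.array(scores)
    return np.argsort(scores)[::-1], scores

# Toy example: four candidate models of a six-residue fragment.
ss = ["HHHEEC", "HHHEEC", "HHCEEC", "CCCCCC"]
rsa = [[0.1, 0.2, 0.1, 0.6, 0.7, 0.9],
       [0.1, 0.2, 0.2, 0.6, 0.6, 0.9],
       [0.2, 0.3, 0.1, 0.5, 0.7, 0.8],
       [0.9, 0.9, 0.9, 0.9, 0.9, 0.9]]
ranking, scores = consensus_rank(ss, rsa)
print("ranking (best first):", ranking, "scores:", scores.round(3))
```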

  4. Performance of stochastic approaches for forecasting river water quality.

    PubMed

    Ahmad, S; Khan, I H; Parida, B P

    2001-12-01

    This study analysed water quality data collected from the river Ganges in India from 1981 to 1990 for forecasting using stochastic models. Initially, box and whisker plots and Kendall's tau test were used to identify trends during the study period. For detecting possible interventions in the data, time series plots and cusum charts were used. Three approaches to stochastic modelling, which account for the effect of seasonality in different ways, i.e. the multiplicative autoregressive integrated moving average (ARIMA) model, the deseasonalised model and the Thomas-Fiering model, were used to model the observed pattern in water quality. Multiplicative ARIMA models having both nonseasonal and seasonal components were, in general, identified as appropriate. In the deseasonalised modelling approach, lower order ARIMA models were found appropriate for the stochastic component. A set of Thomas-Fiering models was formed for each month for all water quality parameters. These models were then used to forecast future values. The error estimates of forecasts from the three approaches were compared to identify the most suitable approach for reliable forecasting. The deseasonalised modelling approach was recommended for forecasting of water quality parameters of a river.
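
    As an illustration of the multiplicative (seasonal) ARIMA approach, the sketch below fits a SARIMA model to a synthetic monthly water quality series with statsmodels and produces a 12-month forecast. The model order (1,0,1)x(1,1,1,12) and the synthetic data are arbitrary choices for the example, not those identified in the study.

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(42)
# Synthetic 10-year monthly series with a seasonal cycle (e.g. DO in mg/l).
months = np.arange(120)
series = (8.0 + 1.5 * np.sin(2 * np.pi * months / 12)
          + rng.normal(0.0, 0.3, months.size))

# Multiplicative seasonal ARIMA: nonseasonal (1,0,1), seasonal (1,1,1), period 12.
model = SARIMAX(series, order=(1, 0, 1), seasonal_order=(1, 1, 1, 12))
result = model.fit(disp=False)

forecast = result.forecast(steps=12)     # next 12 months
print("12-month forecast:", np.round(forecast, 2))
```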

  5. FINE SCALE AIR QUALITY MODELING USING DISPERSION AND CMAQ MODELING APPROACHES: AN EXAMPLE APPLICATION IN WILMINGTON, DE

    EPA Science Inventory

    Characterization of spatial variability of air pollutants in an urban setting at fine scales is critical for improved air toxics exposure assessments, for model evaluation studies and also for air quality regulatory applications. For this study, we investigate an approach that su...

  6. Developing a quality by design approach to model tablet dissolution testing: an industrial case study.

    PubMed

    Yekpe, Ketsia; Abatzoglou, Nicolas; Bataille, Bernard; Gosselin, Ryan; Sharkawi, Tahmer; Simard, Jean-Sébastien; Cournoyer, Antoine

    2018-07-01

    This study applied the concept of Quality by Design (QbD) to tablet dissolution. Its goal was to propose a quality control strategy to model dissolution testing of solid oral dose products according to International Conference on Harmonization guidelines. The methodology involved the following three steps: (1) a risk analysis to identify the material- and process-related parameters impacting the critical quality attributes of dissolution testing, (2) an experimental design to evaluate the influence of design factors (attributes and parameters selected by risk analysis) on dissolution testing, and (3) an investigation of the relationship between design factors and dissolution profiles. Results show that (a) in the case studied, the two parameters impacting dissolution kinetics are active pharmaceutical ingredient particle size distributions and tablet hardness and (b) these two parameters could be monitored with PAT tools to predict dissolution profiles. Moreover, based on the results obtained, modeling dissolution is possible. The practicality and effectiveness of the QbD approach were demonstrated through this industrial case study. Implementing such an approach systematically in industrial pharmaceutical production would reduce the need for tablet dissolution testing.

  7. Image quality guided approach for adaptive modelling of biometric intra-class variations

    NASA Astrophysics Data System (ADS)

    Abboud, Ali J.; Jassim, Sabah A.

    2010-04-01

    The high intra-class variability of acquired biometric data can be attributed to several factors such as the quality of the acquisition sensor (e.g. thermal), environmental conditions (e.g. lighting) and behavioural factors (e.g. a change in face pose). Such variability can cause a large difference between acquired and stored biometric data that will eventually lead to reduced performance. Many systems store multiple templates in order to account for such variations in the biometric data during the enrolment stage. The number and typicality of these templates are the factors that most affect system performance. In this paper, a novel offline approach is proposed for systematic modelling of intra-class variability and typicality in biometric data by regularly selecting new templates from a set of available biometric images. The proposed technique is a two-stage algorithm: in the first stage, image samples are clustered in terms of their image quality profile vectors rather than their biometric feature vectors, and in the second stage a per-cluster template is selected from a small number of samples in each cluster to create the final template set. Experiments were conducted on five face image databases and the results demonstrate the effectiveness of the proposed quality-guided approach.
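
    The two-stage template selection can be sketched as follows (an illustrative reconstruction, not the authors' code): enrolment images are clustered on their image-quality profile vectors with k-means, and the sample closest to each cluster centre is kept as that cluster's template. The quality features used here (sharpness, brightness, contrast) are placeholder examples.

```python
import numpy as np
from sklearn.cluster import KMeans

def select_templates(quality_profiles, n_templates=3, seed=0):
    """Cluster enrolment samples by image-quality profile and keep, per
    cluster, the sample closest to the cluster centre as a template."""
    X = np.asarray(quality_profiles, dtype=float)
    km = KMeans(n_clusters=n_templates, n_init=10, random_state=seed).fit(X)
    templates = []
    for c in range(n_templates):
        members = np.where(km.labels_ == c)[0]
        dists = np.linalg.norm(X[members] - km.cluster_centers_[c], axis=1)
        templates.append(members[np.argmin(dists)])
    return sorted(templates)

# Toy quality profiles: [sharpness, brightness, contrast] per enrolment image.
rng = np.random.default_rng(7)
profiles = np.vstack([rng.normal([0.8, 0.5, 0.6], 0.05, (10, 3)),   # good lighting
                      rng.normal([0.4, 0.2, 0.3], 0.05, (10, 3)),   # dim images
                      rng.normal([0.6, 0.9, 0.5], 0.05, (10, 3))])  # overexposed
print("template indices:", select_templates(profiles, n_templates=3))
```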

  8. Mamdani-Fuzzy Modeling Approach for Quality Prediction of Non-Linear Laser Lathing Process

    NASA Astrophysics Data System (ADS)

    Sivaraos; Khalim, A. Z.; Salleh, M. S.; Sivakumar, D.; Kadirgama, K.

    2018-03-01

    Lathing is a process for fashioning stock materials into desired cylindrical shapes, usually performed on a traditional lathe machine. However, recent rapid advancements in engineering materials and precision demands pose a great challenge to the traditional method. The main drawback of the conventional lathe is its mechanical contact, which leads to undesirable tool wear, a heat-affected zone, and poor finishing and dimensional accuracy, especially taper quality, in machining of stock with a high length-to-diameter ratio. Therefore, a novel approach has been devised to investigate transforming a 2D flatbed CO2 laser cutting machine into a 3D laser lathing capability as an alternative solution. Three significant design parameters were selected for this experiment, namely cutting speed, spinning speed, and depth of cut. A total of 24 experiments were performed, with eight (8) sequential runs replicated three (3) times. The experimental results were then used to establish a Mamdani fuzzy predictive model, which yields an accuracy of more than 95%. Thus, the proposed Mamdani fuzzy modelling approach is found to be suitable and practical for quality prediction of the non-linear laser lathing process for cylindrical stocks of 10 mm diameter.
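
    To make the Mamdani idea concrete, here is a minimal self-contained sketch of Mamdani-style inference with triangular membership functions, min activation, max aggregation and centroid defuzzification. The variables, membership ranges and the two rules are invented for illustration and are not those of the study.

```python
import numpy as np

def trimf(x, a, b, c):
    """Triangular membership on [a, c] with peak at b (flat shoulders allowed)."""
    x = np.asarray(x, dtype=float)
    left = np.ones_like(x) if b == a else np.clip((x - a) / (b - a), 0.0, 1.0)
    right = np.ones_like(x) if c == b else np.clip((c - x) / (c - b), 0.0, 1.0)
    return np.minimum(left, right)

def mamdani_predict(cut_speed, depth):
    """Toy two-rule Mamdani system predicting a 0-10 'taper quality' score."""
    y = np.linspace(0.0, 10.0, 501)                 # output universe
    # Input membership functions (invented ranges).
    slow = trimf(cut_speed, 0.0, 0.0, 50.0)         # cutting speed, mm/s
    fast = trimf(cut_speed, 30.0, 100.0, 100.0)
    shallow = trimf(depth, 0.0, 0.0, 2.0)           # depth of cut, mm
    deep = trimf(depth, 1.0, 4.0, 4.0)
    # Output membership functions.
    poor = trimf(y, 0.0, 0.0, 5.0)
    good = trimf(y, 4.0, 10.0, 10.0)
    # Rule 1: slow AND shallow -> good quality; Rule 2: fast AND deep -> poor.
    act_good = min(slow, shallow)                   # min = fuzzy AND
    act_poor = min(fast, deep)
    aggregated = np.maximum(np.minimum(act_good, good),   # max aggregation
                            np.minimum(act_poor, poor))
    if aggregated.sum() == 0.0:
        return 5.0                                  # no rule fired: neutral score
    return float(np.sum(y * aggregated) / np.sum(aggregated))  # centroid

print("predicted quality score:", round(mamdani_predict(cut_speed=20.0, depth=0.5), 2))
```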

  9. Modeling water quality in an urban river using hydrological factors--data driven approaches.

    PubMed

    Chang, Fi-John; Tsai, Yu-Hsuan; Chen, Pin-An; Coynel, Alexandra; Vachaud, Georges

    2015-03-15

    Contrasting seasonal variations occur in river flow and water quality as a result of short duration, severe intensity storms and typhoons in Taiwan. Sudden changes in river flow caused by impending extreme events may impose serious degradation on river water quality and fateful impacts on ecosystems. Water quality is measured on a monthly/quarterly scale, and therefore an estimate of water quality at a daily scale would be of great help for timely river pollution management. This study proposes a systematic analysis scheme (SAS) to assess the spatio-temporal interrelation of water quality in an urban river and construct water quality estimation models using two static and one dynamic artificial neural networks (ANNs) coupled with the Gamma test (GT) based on water quality, hydrological and economic data. The Dahan River basin in Taiwan is the study area. Ammonia nitrogen (NH3-N) is considered as the representative parameter, a correlative indicator for judging the contamination level in the study. Key factors most closely related to the representative parameter (NH3-N) are extracted by the Gamma test for modeling NH3-N concentration, and as a result, four hydrological factors (discharge, days without discharge, water temperature and rainfall) are identified as model inputs. The modeling results demonstrate that the nonlinear autoregressive with exogenous input (NARX) network furnished with recurrent connections can accurately estimate NH3-N concentration with a very high coefficient of efficiency value (0.926) and a low RMSE value (0.386 mg/l). Besides, the NARX network can suitably catch peak values that mainly occur in dry periods (September-April in the study area), which is particularly important to water pollution treatment. The proposed SAS suggests a promising approach to reliably modeling the spatio-temporal NH3-N concentration based solely on hydrological data, without using water quality sampling data. It is worth noticing that such estimation can be
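
    A NARX-style estimator of the kind described here can be prototyped as a nonlinear regression on lagged exogenous inputs and lagged outputs. The sketch below is an illustrative stand-in, not the study's network: it builds a lagged design matrix from synthetic hydrological drivers and fits a small MLP to predict NH3-N one day ahead.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def lagged_design(exog, target, lags=2):
    """Stack lagged exogenous inputs and lagged target values (NARX-style,
    open loop) to predict the target one step ahead."""
    rows = []
    for t in range(lags, len(target)):
        rows.append(np.concatenate([exog[t - lags:t].ravel(),
                                    target[t - lags:t]]))
    return np.array(rows), target[lags:]

rng = np.random.default_rng(3)
n = 400
discharge = 20 + 5 * np.sin(np.arange(n) / 30) + rng.normal(0, 1, n)
rainfall = np.clip(rng.normal(2, 3, n), 0, None)
temperature = 20 + 8 * np.sin(np.arange(n) / 58) + rng.normal(0, 0.5, n)
# Synthetic NH3-N (mg/l), roughly inverse to discharge, with noise.
nh3n = 2.5 - 0.05 * discharge + 0.02 * temperature + rng.normal(0, 0.05, n)

exog = np.column_stack([discharge, rainfall, temperature])
X, y = lagged_design(exog, nh3n, lags=2)
split = int(0.8 * len(y))
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000, random_state=0)
model.fit(X[:split], y[:split])
print("test R^2:", round(model.score(X[split:], y[split:]), 3))
```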

  10. Intelligent Systems Approaches to Product Sound Quality Analysis

    NASA Astrophysics Data System (ADS)

    Pietila, Glenn M.

    As a product market becomes more competitive, consumers become more discriminating in the way in which they differentiate between engineered products. The consumer often makes a purchasing decision based on the sound emitted from the product during operation by using the sound to judge quality or annoyance. Therefore, in recent years, many sound quality analysis tools have been developed to evaluate the consumer preference as it relates to a product sound and to quantify this preference based on objective measurements. This understanding can be used to direct a product design process in order to help differentiate the product from competitive products or to establish an impression on consumers regarding a product's quality or robustness. The sound quality process is typically a statistical tool that is used to model subjective preference, or merit score, based on objective measurements, or metrics. In this way, new product developments can be evaluated in an objective manner without the laborious process of gathering a sample population of consumers for subjective studies each time. The most common model used today is the Multiple Linear Regression (MLR), although recently non-linear Artificial Neural Network (ANN) approaches are gaining popularity. This dissertation will review publicly available published literature and present additional intelligent systems approaches that can be used to improve on the current sound quality process. The focus of this work is to address shortcomings in the current paired comparison approach to sound quality analysis. This research will propose a framework for an adaptive jury analysis approach as an alternative to the current Bradley-Terry model. The adaptive jury framework uses statistical hypothesis testing to focus on sound pairings that are most interesting and is expected to address some of the restrictions required by the Bradley-Terry model. It will also provide a more amicable framework for an intelligent systems approach
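
    Since the dissertation builds on, and proposes alternatives to, the Bradley-Terry model for paired-comparison jury data, a short sketch of fitting Bradley-Terry worth parameters with the classic minorization-maximization (MM) update is given below; the win-count matrix is invented for illustration.

```python
import numpy as np

def bradley_terry(wins, n_iter=200, tol=1e-8):
    """Fit Bradley-Terry worth parameters with the MM update
    p_i <- W_i / sum_j n_ij / (p_i + p_j), then renormalize.
    wins[i, j] = number of times sound i was preferred over sound j."""
    wins = np.asarray(wins, dtype=float)
    n = wins.shape[0]
    comparisons = wins + wins.T            # n_ij: total pairings of sounds i, j
    total_wins = wins.sum(axis=1)
    p = np.ones(n)
    for _ in range(n_iter):
        new_p = np.array([total_wins[i] / np.sum(comparisons[i] / (p[i] + p))
                          for i in range(n)])
        new_p /= new_p.sum()
        if np.max(np.abs(new_p - p)) < tol:
            return new_p
        p = new_p
    return p

# Invented jury results: wins[i, j] = times product sound i beat sound j.
wins = [[0, 7, 9, 8],
        [3, 0, 6, 7],
        [1, 4, 0, 6],
        [2, 3, 4, 0]]
print("estimated preference worths:", bradley_terry(wins).round(3))
```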

  11. EPA RESEARCH HIGHLIGHTS -- MODELS-3/CMAQ OFFERS COMPREHENSIVE APPROACH TO AIR QUALITY MODELING

    EPA Science Inventory

    Regional and global coordinated efforts are needed to address air quality problems that are growing in complexity and scope. Models-3 CMAQ contains a community multi-scale air quality modeling system for simulating urban to regional scale pollution problems relating to troposphe...

  12. Hybrid Air Quality Modeling Approach for use in the Near-road Exposures to Urban air pollutant Study (NEXUS)

    EPA Science Inventory

    The paper presents a hybrid air quality modeling approach and its application in NEXUS in order to provide spatially and temporally varying exposure estimates and to identify the mobile source contribution to the total pollutant exposure. Model-based exposure metrics, associa...

  13. Model-driven approach to data collection and reporting for quality improvement.

    PubMed

    Curcin, Vasa; Woodcock, Thomas; Poots, Alan J; Majeed, Azeem; Bell, Derek

    2014-12-01

    Continuous data collection and analysis have been shown essential to achieving improvement in healthcare. However, the data required for local improvement initiatives are often not readily available from hospital Electronic Health Record (EHR) systems or not routinely collected. Furthermore, improvement teams are often restricted in time and funding thus requiring inexpensive and rapid tools to support their work. Hence, the informatics challenge in healthcare local improvement initiatives consists of providing a mechanism for rapid modelling of the local domain by non-informatics experts, including performance metric definitions, and grounded in established improvement techniques. We investigate the feasibility of a model-driven software approach to address this challenge, whereby an improvement model designed by a team is used to automatically generate required electronic data collection instruments and reporting tools. To that goal, we have designed a generic Improvement Data Model (IDM) to capture the data items and quality measures relevant to the project, and constructed Web Improvement Support in Healthcare (WISH), a prototype tool that takes user-generated IDM models and creates a data schema, data collection web interfaces, and a set of live reports, based on Statistical Process Control (SPC) for use by improvement teams. The software has been successfully used in over 50 improvement projects, with more than 700 users. We present in detail the experiences of one of those initiatives, Chronic Obstructive Pulmonary Disease project in Northwest London hospitals. The specific challenges of improvement in healthcare are analysed and the benefits and limitations of the approach are discussed. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  14. Model-driven approach to data collection and reporting for quality improvement

    PubMed Central

    Curcin, Vasa; Woodcock, Thomas; Poots, Alan J.; Majeed, Azeem; Bell, Derek

    2014-01-01

    Continuous data collection and analysis have been shown essential to achieving improvement in healthcare. However, the data required for local improvement initiatives are often not readily available from hospital Electronic Health Record (EHR) systems or not routinely collected. Furthermore, improvement teams are often restricted in time and funding thus requiring inexpensive and rapid tools to support their work. Hence, the informatics challenge in healthcare local improvement initiatives consists of providing a mechanism for rapid modelling of the local domain by non-informatics experts, including performance metric definitions, and grounded in established improvement techniques. We investigate the feasibility of a model-driven software approach to address this challenge, whereby an improvement model designed by a team is used to automatically generate required electronic data collection instruments and reporting tools. To that goal, we have designed a generic Improvement Data Model (IDM) to capture the data items and quality measures relevant to the project, and constructed Web Improvement Support in Healthcare (WISH), a prototype tool that takes user-generated IDM models and creates a data schema, data collection web interfaces, and a set of live reports, based on Statistical Process Control (SPC) for use by improvement teams. The software has been successfully used in over 50 improvement projects, with more than 700 users. We present in detail the experiences of one of those initiatives, Chronic Obstructive Pulmonary Disease project in Northwest London hospitals. The specific challenges of improvement in healthcare are analysed and the benefits and limitations of the approach are discussed. PMID:24874182

  15. Quality Concerns in Technical Education in India: A Quantifiable Quality Enabled Model

    ERIC Educational Resources Information Center

    Gambhir, Victor; Wadhwa, N. C.; Grover, Sandeep

    2016-01-01

    Purpose: The paper aims to discuss current Technical Education scenarios in India. It proposes modelling the factors affecting quality in a technical institute and then applying a suitable technique for assessment, comparison and ranking. Design/methodology/approach: The paper chose graph theoretic approach for quantification of quality-enabled…

  16. Monitoring scale scores over time via quality control charts, model-based approaches, and time series techniques.

    PubMed

    Lee, Yi-Hsuan; von Davier, Alina A

    2013-07-01

    Maintaining a stable score scale over time is critical for all standardized educational assessments. Traditional quality control tools and approaches for assessing scale drift either require special equating designs, or may be too time-consuming to be considered on a regular basis with an operational test that has a short time window between an administration and its score reporting. Thus, the traditional methods are not sufficient to catch unusual testing outcomes in a timely manner. This paper presents a new approach for score monitoring and assessment of scale drift. It involves quality control charts, model-based approaches, and time series techniques to accommodate the following needs of monitoring scale scores: continuous monitoring, adjustment of customary variations, identification of abrupt shifts, and assessment of autocorrelation. Performance of the methodologies is evaluated using manipulated data based on real responses from 71 administrations of a large-scale high-stakes language assessment.
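
    As a small illustration of the quality-control-chart component (not the operational procedure described in the paper), the sketch below computes Shewhart individuals-chart limits and a simple two-sided CUSUM on a synthetic series of mean scale scores across administrations, flagging administrations that break either rule.

```python
import numpy as np

def shewhart_limits(baseline):
    """3-sigma individuals chart limits, with sigma estimated from the
    average moving range of a baseline period (d2 = 1.128 for ranges of 2)."""
    mr = np.abs(np.diff(baseline))
    sigma = mr.mean() / 1.128
    centre = baseline.mean()
    return centre, centre - 3 * sigma, centre + 3 * sigma, sigma

def cusum(series, centre, sigma, k=0.5, h=4.0):
    """Tabular two-sided CUSUM; returns indices where either side exceeds h*sigma."""
    hi = lo = 0.0
    alarms = []
    for i, x in enumerate(series):
        hi = max(0.0, hi + (x - centre) - k * sigma)
        lo = max(0.0, lo - (x - centre) - k * sigma)
        if hi > h * sigma or lo > h * sigma:
            alarms.append(i)
    return alarms

rng = np.random.default_rng(11)
scores = rng.normal(500, 5, 71)        # mean scale score per administration
scores[60:] += 6                       # simulated scale drift late in the series
centre, lcl, ucl, sigma = shewhart_limits(scores[:40])
print("Shewhart signals:", np.where((scores < lcl) | (scores > ucl))[0])
print("CUSUM signals:", cusum(scores, centre, sigma))
```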

  17. Assessment of CT image quality using a Bayesian approach

    NASA Astrophysics Data System (ADS)

    Reginatto, M.; Anton, M.; Elster, C.

    2017-08-01

    One of the most promising approaches for evaluating CT image quality is task-specific quality assessment. This involves a simplified version of a clinical task, e.g. deciding whether an image belongs to the class of images that contain the signature of a lesion or not. Task-specific quality assessment can be done by model observers, which are mathematical procedures that carry out the classification task. The most widely used figure of merit for CT image quality is the area under the ROC curve (AUC), a quantity which characterizes the performance of a given model observer. In order to estimate AUC from a finite sample of images, different approaches from classical statistics have been suggested. The goal of this paper is to introduce task-specific quality assessment of CT images to metrology and to propose a novel Bayesian estimation of AUC for the channelized Hotelling observer (CHO) applied to the task of detecting a lesion at a known image location. It is assumed that signal-present and signal-absent images follow multivariate normal distributions with the same covariance matrix. The Bayesian approach results in a posterior distribution for the AUC of the CHO which provides in addition a complete characterization of the uncertainty of this figure of merit. The approach is illustrated by its application to both simulated and experimental data.
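
    The model-observer workflow can be illustrated in a few lines of code. The sketch below is a simplified frequentist stand-in, not the paper's Bayesian estimator: it builds a channelized Hotelling observer template from channelized training data, scores signal-present and signal-absent images, and estimates AUC with the Mann-Whitney statistic. The channels, image size and signal profile are placeholders.

```python
import numpy as np

def cho_auc(signal_imgs, noise_imgs, channels):
    """Channelized Hotelling observer: project images onto channels, build the
    Hotelling template, score both classes, and estimate AUC from the
    Mann-Whitney statistic (frequentist illustration only)."""
    vs = channels @ signal_imgs.T          # channel outputs, shape (n_ch, n_img)
    vn = channels @ noise_imgs.T
    cov = 0.5 * (np.cov(vs) + np.cov(vn))  # pooled channel covariance
    template = np.linalg.solve(cov, vs.mean(axis=1) - vn.mean(axis=1))
    ts, tn = template @ vs, template @ vn  # decision variables
    return np.mean(ts[:, None] > tn[None, :])   # P(signal score > noise score)

rng = np.random.default_rng(0)
n_pix, n_img, n_ch = 64, 200, 5
channels = rng.normal(size=(n_ch, n_pix))     # placeholder channel set
signal_profile = np.zeros(n_pix)
signal_profile[28:36] = 1.5                   # lesion at a known location
noise_imgs = rng.normal(size=(n_img, n_pix))
signal_imgs = rng.normal(size=(n_img, n_pix)) + signal_profile
print("estimated AUC:", round(float(cho_auc(signal_imgs, noise_imgs, channels)), 3))
```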

  18. Human factors systems approach to healthcare quality and patient safety

    PubMed Central

    Carayon, Pascale; Wetterneck, Tosha B.; Rivera-Rodriguez, A. Joy; Hundt, Ann Schoofs; Hoonakker, Peter; Holden, Richard; Gurses, Ayse P.

    2013-01-01

    Human factors systems approaches are critical for improving healthcare quality and patient safety. The SEIPS (Systems Engineering Initiative for Patient Safety) model of work system and patient safety is a human factors systems approach that has been successfully applied in healthcare research and practice. Several research and practical applications of the SEIPS model are described. Important implications of the SEIPS model for healthcare system and process redesign are highlighted. Principles for redesigning healthcare systems using the SEIPS model are described. Balancing the work system and encouraging the active and adaptive role of workers are key principles for improving healthcare quality and patient safety. PMID:23845724

  19. A participatory sensing approach to characterize ride quality

    NASA Astrophysics Data System (ADS)

    Bridgelall, Raj

    2014-03-01

    Rough roads increase vehicle operation and road maintenance costs. Consequently, transportation agencies spend a significant portion of their budgets on ride-quality characterization to forecast maintenance needs. The ubiquity of smartphones and social media, and the emergence of a connected vehicle environment present lucrative opportunities for cost-reduction and continuous, network-wide, ride-quality characterization. However, there is a lack of models to transform inertial and position information from voluminous data flows into indices that transportation agencies currently use. This work expands on theories of the Road Impact Factor introduced in previous research. The index characterizes road roughness by aggregating connected vehicle data and reporting roughness in direct proportion to the International Roughness Index. Their theoretical relationships are developed, and a case study is presented to compare the relative data quality from an inertial profiler and a regular passenger vehicle. Results demonstrate that the approach is a viable alternative to existing models that require substantially more resources and provide less network coverage. One significant benefit of the participatory sensing approach is that transportation agencies can monitor all network facilities continuously to locate distress symptoms, such as frost heaves, that appear and disappear between ride assessment cycles. Another benefit of the approach is continuous monitoring of all high-risk intersections such as rail grade crossings to better understand the relationship between ride-quality and traffic safety.

  20. An Integrated model for Product Quality Development—A case study on Quality functions deployment and AHP based approach

    NASA Astrophysics Data System (ADS)

    Maitra, Subrata; Banerjee, Debamalya

    2010-10-01

    The present article is based on the application of product quality and design improvement related to the nature of failure of machinery and plant operational problems at an industrial blower fan company. The project aims at developing the product on the basis of standardized production parameters for selling its products in the market. Special attention is also paid to blower fans which have been ordered directly by the customer on the basis of the installed capacity of air to be provided by the fan. Application of quality function deployment (QFD) is primarily a customer-oriented approach. The proposed model integrates QFD with AHP to select and rank the decision criteria on commercial and technical factors and to measure the decision parameters for selection of the best product in a competitive environment. The present AHP-QFD model justifies the selection of a blower fan with the help of a group of experts' opinions through pairwise comparison of the customer's and ergonomics-based technical design requirements. The steps involved in implementing the QFD-AHP approach and selecting weighted criteria may be helpful for all similar-purpose industries in maintaining cost and utility for a competitive product.
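
    The AHP step referred to above reduces to computing priority weights from a pairwise comparison matrix and checking its consistency. The following generic sketch (with an invented comparison matrix, not the study's expert judgements) uses the principal eigenvector and Saaty's consistency ratio.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from an AHP pairwise comparison matrix via the
    principal eigenvector, plus Saaty's consistency ratio (CR)."""
    A = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)              # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n]      # Saaty's random index
    return w, ci / ri

# Invented judgements comparing four criteria: cost, delivery, air capacity, ergonomics.
pairwise = [[1,   3,   2,   5],
            [1/3, 1,   1/2, 2],
            [1/2, 2,   1,   3],
            [1/5, 1/2, 1/3, 1]]
weights, cr = ahp_weights(pairwise)
print("weights:", weights.round(3), "consistency ratio:", round(cr, 3))
```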

  1. A parsimonious dynamic model for river water quality assessment.

    PubMed

    Mannina, Giorgio; Viviani, Gaspare

    2010-01-01

    Water quality modelling is of crucial importance for the assessment of physical, chemical, and biological changes in water bodies. Mathematical approaches to water modelling have become more prevalent over recent years. Different model types ranging from detailed physical models to simplified conceptual models are available. Actually, a possible middle ground between detailed and simplified models may be parsimonious models that represent the simplest approach that fits the application. The appropriate modelling approach depends on the research goal as well as on data available for correct model application. When there is inadequate data, it is mandatory to focus on a simple river water quality model rather than detailed ones. The study presents a parsimonious river water quality model to evaluate the propagation of pollutants in natural rivers. The model is made up of two sub-models: a quantity one and a quality one. The model employs a river schematisation that considers different stretches according to the geometric characteristics and to the gradient of the river bed. Each stretch is represented with a conceptual model of a series of linear channels and reservoirs. The channels determine the delay in the pollution wave and the reservoirs cause its dispersion. To assess the river water quality, the model employs four state variables: DO, BOD, NH(4), and NO. The model was applied to the Savena River (Italy), which is the focus of a European-financed project in which quantity and quality data were gathered. A sensitivity analysis of the model output to the model input or parameters was done based on the Generalised Likelihood Uncertainty Estimation methodology. The results demonstrate the suitability of such a model as a tool for river water quality management.
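
    The DO-BOD pair among the four state variables follows the classic Streeter-Phelps coupling, which is easy to sketch: BOD decays first-order while the DO deficit grows from that decay and recovers by reaeration. The rate constants and initial values below are illustrative, not the calibrated Savena values.

```python
import numpy as np

def streeter_phelps(l0, d0, kd, ka, t):
    """Classic DO sag: BOD L(t) = L0*exp(-kd t); deficit
    D(t) = kd*L0/(ka-kd) * (exp(-kd t) - exp(-ka t)) + D0*exp(-ka t)."""
    bod = l0 * np.exp(-kd * t)
    deficit = ((kd * l0 / (ka - kd)) * (np.exp(-kd * t) - np.exp(-ka * t))
               + d0 * np.exp(-ka * t))
    return bod, deficit

t = np.linspace(0.0, 10.0, 101)            # travel time downstream (days)
do_sat = 9.0                               # saturation DO (mg/l), illustrative
bod, deficit = streeter_phelps(l0=12.0, d0=1.0, kd=0.35, ka=0.6, t=t)
do = do_sat - deficit
t_crit = t[np.argmax(deficit)]
print(f"minimum DO {do.min():.2f} mg/l at {t_crit:.1f} days travel time")
```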

  2. A system identification approach for developing model predictive controllers of antibody quality attributes in cell culture processes

    PubMed Central

    Schmitt, John; Beller, Justin; Russell, Brian; Quach, Anthony; Hermann, Elizabeth; Lyon, David; Breit, Jeffrey

    2017-01-01

    As the biopharmaceutical industry evolves to include more diverse protein formats and processes, more robust control of Critical Quality Attributes (CQAs) is needed to maintain processing flexibility without compromising quality. Active control of CQAs has been demonstrated using model predictive control techniques, which allow development of processes which are robust against disturbances associated with raw material variability and other potentially flexible operating conditions. Wide adoption of model predictive control in biopharmaceutical cell culture processes has been hampered, however, in part due to the large amount of data and expertise required to make a predictive model of controlled CQAs, a requirement for model predictive control. Here we developed a highly automated, perfusion apparatus to systematically and efficiently generate predictive models using application of system identification approaches. We successfully created a predictive model of %galactosylation using data obtained by manipulating galactose concentration in the perfusion apparatus in serialized step change experiments. We then demonstrated the use of the model in a model predictive controller in a simulated control scenario to successfully achieve a %galactosylation set point in a simulated fed‐batch culture. The automated model identification approach demonstrated here can potentially be generalized to many CQAs, and could be a more efficient, faster, and highly automated alternative to batch experiments for developing predictive models in cell culture processes, and allow the wider adoption of model predictive control in biopharmaceutical processes. © 2017 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers Biotechnol. Prog., 33:1647–1661, 2017 PMID:28786215

  3. A fuzzy-logic based decision-making approach for identification of groundwater quality based on groundwater quality indices.

    PubMed

    Vadiati, M; Asghari-Moghaddam, A; Nakhaei, M; Adamowski, J; Akbarzadeh, A H

    2016-12-15

    Due to inherent uncertainties in measurement and analysis, groundwater quality assessment is a difficult task. Artificial intelligence techniques, specifically fuzzy inference systems, have proven useful in evaluating groundwater quality in uncertain and complex hydrogeological systems. In the present study, a Mamdani fuzzy-logic-based decision-making approach was developed to assess groundwater quality based on relevant indices. In an effort to develop a set of new hybrid fuzzy indices for groundwater quality assessment, a Mamdani fuzzy inference model was developed with widely-accepted groundwater quality indices: the Groundwater Quality Index (GQI), the Water Quality Index (WQI), and the Ground Water Quality Index (GWQI). To present generalized hybrid fuzzy indices, well-known groundwater quality index acceptability ranges were employed as fuzzy model output ranges, rather than expert knowledge, in the fuzzification of output parameters. The proposed approach was evaluated for its ability to assess the drinking water quality of 49 samples collected seasonally from groundwater resources in Iran's Sarab Plain during 2013-2014. Input membership functions were defined as "desirable", "acceptable" and "unacceptable" based on expert knowledge and the standard and permissible limits prescribed by the World Health Organization. Output data were categorized into multiple categories based on the GQI (5 categories), WQI (5 categories), and GWQI (3 categories). Given the potential of fuzzy models to minimize uncertainties, hybrid fuzzy-based indices produce significantly more accurate assessments of groundwater quality than traditional indices. The developed models' accuracy was assessed and a comparison of the performance indices demonstrated the Fuzzy Groundwater Quality Index model to be more accurate than both the Fuzzy Water Quality Index and Fuzzy Ground Water Quality Index models. This suggests that the new hybrid fuzzy

  4. A system identification approach for developing model predictive controllers of antibody quality attributes in cell culture processes.

    PubMed

    Downey, Brandon; Schmitt, John; Beller, Justin; Russell, Brian; Quach, Anthony; Hermann, Elizabeth; Lyon, David; Breit, Jeffrey

    2017-11-01

    As the biopharmaceutical industry evolves to include more diverse protein formats and processes, more robust control of Critical Quality Attributes (CQAs) is needed to maintain processing flexibility without compromising quality. Active control of CQAs has been demonstrated using model predictive control techniques, which allow development of processes which are robust against disturbances associated with raw material variability and other potentially flexible operating conditions. Wide adoption of model predictive control in biopharmaceutical cell culture processes has been hampered, however, in part due to the large amount of data and expertise required to make a predictive model of controlled CQAs, a requirement for model predictive control. Here we developed a highly automated, perfusion apparatus to systematically and efficiently generate predictive models using application of system identification approaches. We successfully created a predictive model of %galactosylation using data obtained by manipulating galactose concentration in the perfusion apparatus in serialized step change experiments. We then demonstrated the use of the model in a model predictive controller in a simulated control scenario to successfully achieve a %galactosylation set point in a simulated fed-batch culture. The automated model identification approach demonstrated here can potentially be generalized to many CQAs, and could be a more efficient, faster, and highly automated alternative to batch experiments for developing predictive models in cell culture processes, and allow the wider adoption of model predictive control in biopharmaceutical processes. © 2017 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers Biotechnol. Prog., 33:1647-1661, 2017. © 2017 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers.
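
    The system identification step can be illustrated with a generic sketch (a synthetic process, not the authors' perfusion data): a first-order ARX model y[t] = a*y[t-1] + b*u[t-1] is fitted by least squares from serialized step-change data, and the identified model is then inverted to compute the input that drives the predicted quality attribute toward a set point, the simplest possible stand-in for a model predictive controller.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic "truth": %galactosylation responds to galactose feed (arbitrary units).
a_true, b_true = 0.85, 0.6
u = np.repeat([0.0, 1.0, 2.0, 1.0, 3.0], 40)        # serialized step changes
y = np.zeros(len(u))
for t in range(1, len(u)):
    y[t] = a_true * y[t - 1] + b_true * u[t - 1] + rng.normal(0, 0.05)

# System identification: least-squares fit of y[t] = a*y[t-1] + b*u[t-1].
X = np.column_stack([y[:-1], u[:-1]])
a_hat, b_hat = np.linalg.lstsq(X, y[1:], rcond=None)[0]
print(f"identified a={a_hat:.3f}, b={b_hat:.3f}")

# One-step predictive control move toward a set point (clip to a feasible
# feed range in practice).
setpoint = 14.0
y_now = y[-1]
u_next = (setpoint - a_hat * y_now) / b_hat          # invert the one-step model
print(f"current y={y_now:.2f}, suggested next input u={u_next:.2f}")
```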

  5. Experienced quality factors: qualitative evaluation approach to audiovisual quality

    NASA Astrophysics Data System (ADS)

    Jumisko-Pyykkö, Satu; Häkkinen, Jukka; Nyman, Göte

    2007-02-01

    Subjective evaluation is used to identify impairment factors of multimedia quality. The final quality is often formulated via quantitative experiments, but this approach has its constraints, as subjects' quality interpretations, experiences and quality evaluation criteria are disregarded. To identify these quality evaluation factors, this study examined qualitatively the criteria participants used to evaluate audiovisual video quality. A semi-structured interview was conducted with 60 participants after a subjective audiovisual quality evaluation experiment. The assessment compared several relatively low audio-video bitrate ratios with five different television contents on a mobile device. In the analysis, methodological triangulation (grounded theory, Bayesian networks and correspondence analysis) was applied to the qualitative quality data. The results showed that the most important evaluation criteria were the factors of visual quality, contents, factors of audio quality, usefulness - followability and audiovisual interaction. Several relations between the quality factors and the similarities between the contents were identified. As a methodological recommendation, content- and usage-related factors need to be examined further to improve quality evaluation experiments.

  6. An integrated modeling approach for estimating the water quality benefits of conservation practices at the river basin scale.

    PubMed

    Santhi, C; Kannan, N; White, M; Di Luzio, M; Arnold, J G; Wang, X; Williams, J R

    2014-01-01

    The USDA initiated the Conservation Effects Assessment Project (CEAP) to quantify the environmental benefits of conservation practices at regional and national scales. For this assessment, a sampling and modeling approach is used. This paper provides a technical overview of the modeling approach used in the CEAP cropland assessment to estimate the off-site water quality benefits of conservation practices, using the Ohio River Basin (ORB) as an example. The modeling approach uses a farm-scale model, the Agricultural Policy Environmental Extender (APEX), a watershed-scale model (the Soil and Water Assessment Tool [SWAT]), and databases in the Hydrologic Unit Modeling for the United States system. Databases of land use, soils, land use management, topography, weather, point sources, and atmospheric depositions were developed to derive model inputs. APEX simulates the cultivated cropland, Conservation Reserve Program land, and the practices implemented on them, whereas SWAT simulates the noncultivated land (e.g., pasture, range, urban, and forest) and point sources. Simulation results from APEX are input into SWAT. SWAT routes all sources, including APEX's, to the basin outlet through each eight-digit watershed. Each basin is calibrated for stream flow, sediment, and nutrient loads at multiple gaging sites and then used to simulate the effects of conservation practice scenarios on water quality. Results indicate that sediment, nitrogen, and phosphorus loads delivered to the Mississippi River from the ORB could be reduced by 16, 15, and 23%, respectively, due to current conservation practices. Modeling tools are useful for providing science-based information for assessing existing conservation programs, developing future programs, and developing insights on the load reductions necessary to address hypoxia in the Gulf of Mexico. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.

  7. A simulation-based approach for estimating premining water quality: Red Mountain Creek, Colorado

    USGS Publications Warehouse

    Runkel, Robert L.; Kimball, Briant A; Walton-Day, Katherine; Verplanck, Philip L.

    2007-01-01

    Regulatory agencies are often charged with the task of setting site-specific numeric water quality standards for impaired streams. This task is particularly difficult for streams draining highly mineralized watersheds with past mining activity. Baseline water quality data obtained prior to mining are often non-existent and application of generic water quality standards developed for unmineralized watersheds is suspect given the geology of most watersheds affected by mining. Various approaches have been used to estimate premining conditions, but none of the existing approaches rigorously consider the physical and geochemical processes that ultimately determine instream water quality. An approach based on simulation modeling is therefore proposed herein. The approach utilizes synoptic data that provide spatially-detailed profiles of concentration, streamflow, and constituent load along the study reach. This field data set is used to calibrate a reactive stream transport model that considers the suite of physical and geochemical processes that affect constituent concentrations during instream transport. A key input to the model is the quality and quantity of waters entering the study reach. This input is based on chemical analyses available from synoptic sampling and observed increases in streamflow along the study reach. Given the calibrated model, additional simulations are conducted to estimate premining conditions. In these simulations, the chemistry of mining-affected sources is replaced with the chemistry of waters that are thought to be unaffected by mining (proximal, premining analogues). The resultant simulations provide estimates of premining water quality that reflect both the reduced loads that were present prior to mining and the processes that affect these loads as they are transported downstream. This simulation-based approach is demonstrated using data from Red Mountain Creek, Colorado, a small stream draining a heavily-mined watershed. Model

  8. A pilot modeling technique for handling-qualities research

    NASA Technical Reports Server (NTRS)

    Hess, R. A.

    1980-01-01

    A brief survey of the more dominant analysis techniques used in closed-loop handling-qualities research is presented. These techniques are shown to rely on so-called classical and modern analytical models of the human pilot which have their foundation in the analysis and design principles of feedback control. The optimal control model of the human pilot is discussed in some detail and a novel approach to the a priori selection of pertinent model parameters is discussed. Frequency domain and tracking performance data from 10 pilot-in-the-loop simulation experiments involving 3 different tasks are used to demonstrate the parameter selection technique. Finally, the utility of this modeling approach in handling-qualities research is discussed.

  9. European Approaches to Quality Assurance: Models, Characteristics and Challenges.

    ERIC Educational Resources Information Center

    van Damme, D.

    2000-01-01

    Examines models, characteristics, and challenges of quality assurance in higher education in the Netherlands, Belgium, Germany, Denmark, France, Finland, Italy, and Spain. Notes a common move toward institutional autonomy and output oriented steering, and the absence of accreditation procedures comparable to those in Anglo-Saxon countries. Finds…

  10. Community Multiscale Air Quality Model

    EPA Science Inventory

    The U.S. EPA developed the Community Multiscale Air Quality (CMAQ) system to apply a “one atmosphere” multiscale and multi-pollutant modeling approach based mainly on the “first principles” description of the atmosphere. The multiscale capability is supported by the governing di...

  11. Quality management, a directive approach to patient safety.

    PubMed

    Ayuso-Murillo, Diego; de Andrés-Gimeno, Begoña; Noriega-Matanza, Concha; López-Suárez, Rafael Jesús; Herrera-Peco, Ivan

    Nowadays the implementation of effective quality management systems and external evaluation in healthcare is a necessity to ensure not only transparency in activities related to health but also access to health and patient safety. The key to correctly implementing a quality management system is support from the managers of health facilities, since it is managers who design and communicate to health professionals the strategies of action involved in quality management systems. This article focuses on nursing managers' approach to quality management through the implementation of cycles of continuous improvement, participation of improvement groups, monitoring systems and external evaluation quality models (EFQM, ISO). The implementation of a quality management system will enable preventable adverse effects to be minimized or eliminated, and promote patient safety and safe practice by health professionals. Copyright © 2017 Elsevier España, S.L.U. All rights reserved.

  12. Solving multi-customer FPR model with quality assurance and discontinuous deliveries using a two-phase algebraic approach.

    PubMed

    Chiu, Yuan-Shyi Peter; Chou, Chung-Li; Chang, Huei-Hsin; Chiu, Singa Wang

    2016-01-01

    A multi-customer finite production rate (FPR) model with quality assurance and a discontinuous delivery policy was investigated in a recent paper (Chiu et al. in J Appl Res Technol 12(1):5-13, 2014) using a differential calculus approach. This study employs mathematical modeling along with a two-phase algebraic method to resolve such a specific multi-customer FPR model. As a result, the optimal replenishment lot size and number of shipments can be derived without using differential calculus. Such a straightforward method may assist practitioners with insufficient knowledge of calculus in learning and managing real multi-customer FPR systems more effectively.

  13. Patient perception of nursing service quality; an applied model of Donabedian's structure-process-outcome approach theory.

    PubMed

    Kobayashi, Hideyuki; Takemura, Yukie; Kanda, Katsuya

    2011-09-01

    Nursing is a labour-intensive field, and an extensive amount of latent information exists to aid in evaluating the quality of nursing service, with patients' experiences being the primary focus of such evaluations. To effect further improvement in nursing as well as medical care, Donabedian's structure-process-outcome approach has been applied. To classify and confirm patients' specific experiences with regard to nursing service based on Donabedian's structure-process-outcome model for improving the quality of nursing care. Items were compiled from existing scales and assigned to structure, process or outcome in Donabedian's model through discussion among expert nurses and pilot data collection. With regard to comfort, surroundings were classified as structure (e.g. accessibility to nurses, disturbance); with regard to patient-practitioner interaction, patient participation was classified as a process (e.g. expertise and skill, patient decision-making); and with regard to changes in patients, satisfaction was classified as an outcome (e.g. information support, overall satisfaction). Patient inquiry was carried out using the finalized questionnaire at general wards in Japanese hospitals in 2005-2006. Reliability and validity were tested using psychometric methods. Data from 1,810 patients (mean age: 59.7 years; mean length of stay: 23.7 days) were analysed. Internal consistency reliability was supported (α = 0.69-0.96); in factor analysis, the structure items aggregated to one factor and overall satisfaction under outcome aggregated to one. The remaining items of outcome and process were distributed together in two factors. Inter-scale correlation (r = 0.442-0.807) supported the construct validity of each structure-process-outcome approach. All structure items were represented as negative-worded examples, as they dealt with basic conditions under the Japanese universal health care system, and were regarded as representative related to concepts of dissatisfaction and no

  14. Joint analysis of input and parametric uncertainties in watershed water quality modeling: A formal Bayesian approach

    NASA Astrophysics Data System (ADS)

    Han, Feng; Zheng, Yi

    2018-06-01

    Significant input uncertainty is a major source of error in watershed water quality (WWQ) modeling. It remains challenging to address the input uncertainty in a rigorous Bayesian framework. This study develops the Bayesian Analysis of Input and Parametric Uncertainties (BAIPU), an approach for the joint analysis of input and parametric uncertainties through a tight coupling of Markov Chain Monte Carlo (MCMC) analysis and Bayesian Model Averaging (BMA). The formal likelihood function for this approach is derived considering a lag-1 autocorrelated, heteroscedastic, and Skew Exponential Power (SEP) distributed error model. A series of numerical experiments were performed based on a synthetic nitrate pollution case and on a real study case in the Newport Bay Watershed, California. The Soil and Water Assessment Tool (SWAT) and Differential Evolution Adaptive Metropolis (DREAM(ZS)) were used as the representative WWQ model and MCMC algorithm, respectively. The major findings include the following: (1) the BAIPU can be implemented and used to appropriately identify the uncertain parameters and characterize the predictive uncertainty; (2) the compensation effect between the input and parametric uncertainties can seriously mislead modeling-based management decisions, if the input uncertainty is not explicitly accounted for; (3) the BAIPU accounts for the interaction between the input and parametric uncertainties and therefore provides more accurate calibration and uncertainty results than a sequential analysis of the uncertainties; and (4) the BAIPU quantifies the credibility of different input assumptions on a statistical basis and can be implemented as an effective inverse modeling approach to the joint inference of parameters and inputs.
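
    The full BAIPU couples SWAT with DREAM(ZS) and BMA, but its core ingredient, a formal likelihood with lag-1 autocorrelated errors inside an MCMC sampler, can be sketched compactly. The example below is a deliberately simplified stand-in: a Gaussian (rather than Skew Exponential Power) error model with AR(1) residuals, sampled with a plain random-walk Metropolis algorithm for a toy export-coefficient model with flat priors.

```python
import numpy as np

def log_likelihood(params, x, obs):
    """Gaussian log-likelihood with lag-1 autocorrelated (AR(1)) residuals.
    params = (slope of a toy nitrate export model, error sd, autocorrelation phi)."""
    slope, sigma, phi = params
    if sigma <= 0 or not (-0.99 < phi < 0.99):
        return -np.inf
    resid = obs - slope * x
    innov = resid[1:] - phi * resid[:-1]          # decorrelated innovations
    n = innov.size
    return (-0.5 * n * np.log(2 * np.pi * sigma**2)
            - 0.5 * np.sum(innov**2) / sigma**2)

def metropolis(x, obs, start, step, n_iter=20000, seed=0):
    """Plain random-walk Metropolis sampler over (slope, sigma, phi)."""
    rng = np.random.default_rng(seed)
    chain = np.empty((n_iter, len(start)))
    current = np.array(start, dtype=float)
    ll = log_likelihood(current, x, obs)
    for i in range(n_iter):
        proposal = current + rng.normal(0.0, step)
        ll_prop = log_likelihood(proposal, x, obs)
        if np.log(rng.uniform()) < ll_prop - ll:   # flat priors assumed
            current, ll = proposal, ll_prop
        chain[i] = current
    return chain

rng = np.random.default_rng(1)
x = rng.uniform(0.5, 5.0, 200)                     # e.g. daily runoff depth (mm)
errors = np.zeros(200)
for t in range(1, 200):                            # AR(1) observation errors
    errors[t] = 0.6 * errors[t - 1] + rng.normal(0, 0.3)
obs = 2.0 * x + errors                             # "observed" nitrate load

chain = metropolis(x, obs, start=(1.0, 0.5, 0.0), step=(0.02, 0.02, 0.02))
print("posterior means (slope, sigma, phi):", chain[10000:].mean(axis=0).round(3))
```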

  15. Measuring quality of care: considering conceptual approaches to quality indicator development and evaluation.

    PubMed

    Stelfox, Henry T; Straus, Sharon E

    2013-12-01

    In this article, we describe one approach for developing and evaluating quality indicators. We focus on describing different conceptual approaches to quality indicator development, review one approach for developing quality indicators, outline how to evaluate quality indicators once developed, and discuss quality indicator maintenance. The key steps for developing quality indicators include specifying a clear goal for the indicators; using methodologies to incorporate evidence, expertise, and patient perspectives; and considering contextual factors and logistics of implementation. The Strategic Framework Board and the National Quality Measure Clearinghouse have developed criteria for evaluating quality indicators that complement traditional psychometric evaluations. Optimal strategies for quality indicator maintenance and dissemination have not been determined, but experiences with clinical guideline maintenance may be informative. For quality indicators to effectively guide quality improvement efforts, they must be developed, evaluated, maintained, and implemented using rigorous evidence-informed practices. Copyright © 2013 Elsevier Inc. All rights reserved.

  16. Institutional Response to the Swedish Model of Quality Assurance.

    ERIC Educational Resources Information Center

    Nilsson, Karl-Axel; Wahlen, Staffan

    2000-01-01

    Evaluates the Swedish model of quality assurance of higher education by examining the response of institutions to 27 quality audits and 19 follow-up interviews. Discusses the relationship between top-down and bottom-up approaches to internal quality assurance and suggests that, with growing professionalization, more limited result-oriented audits…

  17. A model for predicting air quality along highways.

    DOT National Transportation Integrated Search

    1973-01-01

    The subject of this report is an air quality prediction model for highways, AIRPOL Version 2, July 1973. AIRPOL has been developed by modifying the basic Gaussian approach to gaseous dispersion. The resultant model is smooth and continuous throughout...
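
    The AIRPOL formulation itself is not given in this record, but the Gaussian dispersion approach it modifies is standard. The sketch below evaluates a textbook Gaussian plume for a single point source with illustrative power-law dispersion coefficients, as a generic example of the approach rather than the AIRPOL Version 2 model.

```python
import numpy as np

def gaussian_plume(q, u, h, x, y, z):
    """Textbook Gaussian plume concentration (g/m^3) downwind of a point source.
    q: emission rate (g/s), u: wind speed (m/s), h: effective source height (m).
    Dispersion coefficients follow illustrative power laws, not AIRPOL's."""
    sigma_y = 0.08 * x / np.sqrt(1 + 0.0001 * x)    # Briggs-style rural, class D
    sigma_z = 0.06 * x / np.sqrt(1 + 0.0015 * x)
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - h)**2 / (2 * sigma_z**2))
                + np.exp(-(z + h)**2 / (2 * sigma_z**2)))   # ground reflection
    return q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Receptor concentrations 200-1000 m directly downwind of a roadway-scale source.
x = np.array([200.0, 500.0, 1000.0])
conc = gaussian_plume(q=10.0, u=3.0, h=2.0, x=x, y=0.0, z=1.5)
print(dict(zip(x, np.round(conc * 1e6, 1))))       # micrograms per m^3
```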

  18. MODELS-3 COMMUNITY MULTISCALE AIR QUALITY (CMAQ) MODEL AEROSOL COMPONENT 1: MODEL DESCRIPTION

    EPA Science Inventory

    The aerosol component of the Community Multiscale Air Quality (CMAQ) model is designed to be an efficient and economical depiction of aerosol dynamics in the atmosphere. The approach taken represents the particle size distribution as the superposition of three lognormal subdis...

  19. Formulation of consumables management models. Development approach for the mission planning processor working model

    NASA Technical Reports Server (NTRS)

    Connelly, L. C.

    1977-01-01

    The mission planning processor is a user oriented tool for consumables management and is part of the total consumables subsystem management concept. The approach to be used in developing a working model of the mission planning processor is documented. The approach includes top-down design, structured programming techniques, and application of NASA approved software development standards. This development approach: (1) promotes cost effective software development, (2) enhances the quality and reliability of the working model, (3) encourages the sharing of the working model through a standard approach, and (4) promotes portability of the working model to other computer systems.

  20. Development and testing of a fast conceptual river water quality model.

    PubMed

    Keupers, Ingrid; Willems, Patrick

    2017-04-15

    Modern, model based river quality management strongly relies on river water quality models to simulate the temporal and spatial evolution of pollutant concentrations in the water body. Such models are typically constructed by extending detailed hydrodynamic models with a component describing the advection-diffusion and water quality transformation processes in a detailed, physically based way. This approach is too computationally demanding, especially when simulating long time periods that are needed for statistical analysis of the results or when model sensitivity analysis, calibration and validation require a large number of model runs. To overcome this problem, a structure identification method to set up a conceptual river water quality model has been developed. Instead of calculating the water quality concentrations at each water level and discharge node, the river branch is divided into conceptual reservoirs based on user information such as location of interest and boundary inputs. These reservoirs are modelled as Plug Flow Reactor (PFR) and Continuously Stirred Tank Reactor (CSTR) to describe advection and diffusion processes. The same water quality transformation processes as in the detailed models are considered but with adjusted residence times based on the hydrodynamic simulation results and calibrated to the detailed water quality simulation results. The developed approach allows for a much faster calculation time (a factor of 10^5) without significant loss of accuracy, making it feasible to perform time demanding scenario runs. Copyright © 2017 Elsevier Ltd. All rights reserved.
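
    The PFR/CSTR conceptualisation can be illustrated directly (a generic sketch with invented parameters, not the calibrated model): an upstream BOD pulse passes through a plug-flow reactor, modelled as a pure delay with first-order decay over the residence time, and then a continuously stirred tank reactor, modelled as a mixing ODE with the same decay, reproducing delay, spreading and transformation at negligible computational cost.

```python
import numpy as np

def pfr(conc_in, residence_steps, k_decay, dt=1.0):
    """Plug flow reactor: pure advection delay with first-order decay
    applied over the residence time."""
    decay = np.exp(-k_decay * residence_steps * dt)
    delayed = np.concatenate([np.full(residence_steps, conc_in[0]), conc_in])
    return decay * delayed[:len(conc_in)]

def cstr(conc_in, residence_time, k_decay, dt=1.0):
    """Continuously stirred tank reactor: dC/dt = (C_in - C)/tau - k*C."""
    c = conc_in[0]
    out = np.empty_like(conc_in)
    for t, cin in enumerate(conc_in):
        c += dt * ((cin - c) / residence_time - k_decay * c)   # explicit Euler
        out[t] = c
    return out

hours = np.arange(96)
upstream_bod = np.where((hours >= 10) & (hours < 20), 12.0, 3.0)   # mg/l pulse
after_pfr = pfr(upstream_bod, residence_steps=6, k_decay=0.02)
after_cstr = cstr(after_pfr, residence_time=8.0, k_decay=0.02)
print("upstream peak:", upstream_bod.max(),
      "downstream peak:", round(float(after_cstr.max()), 2))
```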

  1. The role of pesticide fate modelling in a prevention-led approach to potable water quality management

    NASA Astrophysics Data System (ADS)

    Dolan, Tom; Pullan, Stephanie; Whelan, Mick; Parsons, David

    2013-04-01

    Diffuse inputs from agriculture are commonly the main source of pesticide contamination in surface water and may have implications for the quality of treated drinking water. After privatisation in 1991, UK water companies primarily focused on the provision of sufficient water treatment to reduce the risk of non-compliance with the European Drinking Water Directive (DWD), under which all pesticide concentrations must be below 0.1 µg/l, and with UK Water Supply Regulations for the potable water they supply. Since 2000, Article 7 of the Water Framework Directive (WFD) has begun to drive a prevention-led approach to compliance with the DWD. As a consequence, water companies are now more interested in the quality of 'raw' (untreated) water at the point of abstraction. Modelling (based upon best available estimates of cropping, pesticide use, weather conditions, pesticide characteristics and catchment characteristics) and monitoring of raw water quality can both help to determine the compliance risks associated with the quality of this 'raw' water resource. This knowledge allows water companies to prioritise active substances for action in their catchments, and is currently used in many cases to support the design of monitoring programmes for pesticide active substances. Additional value can be provided if models are able to help identify the type and scale of catchment management interventions required to achieve DWD compliance for pesticide active substances through pollution prevention at source or along transport pathways. These questions were explored using a simple catchment-scale pesticide fate and transport model. The model employs a daily time-step and is semi-lumped, with calculations performed for soil type and crop combinations, weighted by their proportions within the catchment. Soil properties are derived from the national soil database and the model can, therefore, be applied to any catchment in England and Wales. Various realistic catchment management…

  2. A quality by design approach to investigate the effect of mannitol and dicalcium phosphate qualities on roll compaction.

    PubMed

    Souihi, Nabil; Dumarey, Melanie; Wikström, Håkan; Tajarobi, Pirjo; Fransson, Magnus; Svensson, Olof; Josefson, Mats; Trygg, Johan

    2013-04-15

    Roll compaction is a continuous process for solid dosage form manufacturing that is increasingly popular within the pharmaceutical industry. Although roll compaction has become an established technique for dry granulation, the influence of material properties is still not fully understood. In this study, a quality by design (QbD) approach was utilized, not only to understand the influence of different qualities of mannitol and dicalcium phosphate (DCP), but also to predict critical quality attributes of the drug product based solely on the material properties of the filler. By describing each filler quality in terms of several representative physical properties, orthogonal projections to latent structures (OPLS) was used to understand and predict how those properties affected drug product intermediates as well as critical quality attributes of the final drug product. These models were then validated by predicting product attributes for filler qualities not used in the model construction. The results of this study confirmed that the tensile strength reduction, known to affect plastic materials when roll compacted, is not prominent when using brittle materials. Some qualities of these fillers actually demonstrated improved compactability following roll compaction. While direct compression qualities are frequently used for roll compacted drug products because of their excellent flowability and good compaction properties, this study revealed that granules from these qualities flowed more poorly than the corresponding powder blends, which was not seen for granules from traditional qualities. The QbD approach used in this study could be extended beyond fillers. Thus any new compound/ingredient would first be characterized, and then suitable formulation characteristics could be determined in silico, without running any additional experiments. Copyright © 2013 Elsevier B.V. All rights reserved.

  3. E-Learning Quality Assurance: A Process-Oriented Lifecycle Model

    ERIC Educational Resources Information Center

    Abdous, M'hammed

    2009-01-01

    Purpose: The purpose of this paper is to propose a process-oriented lifecycle model for ensuring quality in e-learning development and delivery. As a dynamic and iterative process, quality assurance (QA) is intertwined with the e-learning development process. Design/methodology/approach: After reviewing the existing literature, particularly…

  4. Two modelling approaches to water-quality simulation in a flooded iron-ore mine (Saizerais, Lorraine, France): a semi-distributed chemical reactor model and a physically based distributed reactive transport pipe network model.

    PubMed

    Hamm, V; Collon-Drouaillet, P; Fabriol, R

    2008-02-19

    The flooding of abandoned mines in the Lorraine Iron Basin (LIB) over the past 25 years has degraded the quality of the groundwater tapped for drinking water. High concentrations of dissolved sulphate have made the water unsuitable for human consumption. This problematic issue has led to the development of numerical tools to support water-resource management in mining contexts. Here we examine two modelling approaches using different numerical tools that we tested on the Saizerais flooded iron-ore mine (Lorraine, France). A first approach considers the Saizerais Mine as a network of two chemical reactors (NCR). The second approach is based on a physically distributed pipe network model (PNM) built with EPANET 2 software. This approach considers the mine as a network of pipes defined by their geometric and chemical parameters. Each reactor in the NCR model includes a detailed chemical model built to simulate quality evolution in the flooded mine water. However, in order to obtain a robust PNM, we simplified the detailed chemical model into a specific sulphate dissolution-precipitation model that is included as a sulphate source/sink in both the NCR model and the pipe network model. Both the NCR model and the PNM, based on different numerical techniques, give good post-calibration agreement between the simulated and measured sulphate concentrations in the drinking-water well and overflow drift. The NCR model incorporating the detailed chemical model is useful when detailed chemical behaviour at the overflow is needed. The PNM incorporating the simplified sulphate dissolution-precipitation model provides better information about the physics controlling the effects of flow and low-flow zones and the timing of solid sulphate removal, whereas the NCR model will underestimate clean-up time due to its complete-mixing assumption. In conclusion, the detailed NCR model gives a first assessment of the chemical processes at the overflow, and the PNM model subsequently provides more…

  5. A Network Approach to Curriculum Quality Assessment

    ERIC Educational Resources Information Center

    Jordens, J. Zoe; Zepke, Nick

    2009-01-01

    This paper argues for an alternative approach to quality assurance in New Zealand universities that locates evaluation not with external auditors but with members of the teaching team. In the process, aspects of network theories are introduced as the basis for an approach to quality assurance. From this, the concept of networks is extended to…

  6. Implications of Modeling Uncertainty for Water Quality Decision Making

    NASA Astrophysics Data System (ADS)

    Shabman, L.

    2002-05-01

    The National Academy of Sciences report "Assessing the TMDL Approach to Water Quality Management" endorsed the "watershed" and "ambient water quality focused" approach to water quality management called for in the TMDL program. The committee felt that available data and models were adequate to move such a program forward, if the EPA and all stakeholders better understood the nature of the scientific enterprise and its application to the TMDL program. Specifically, the report called for a greater acknowledgement of model prediction uncertainty in making and implementing TMDL plans. To assure that such uncertainty was addressed in water quality decision making, the committee called for a commitment to "adaptive implementation" of water quality management plans. The committee found that the number and complexity of the interactions of multiple stressors, combined with model prediction uncertainty, mean that we need to avoid the temptation to make assurances that specific actions will result in attainment of particular water quality standards. Until the work on solving a water quality problem begins, analysts and decision makers cannot be sure what the correct solutions are, or even what water quality goals a community should be seeking. In complex systems we need to act in order to learn; adaptive implementation is a concurrent process of action and learning. Learning requires (1) continued monitoring of the waterbody to determine how it responds to the actions taken and (2) carefully designed experiments in the watershed. If we do not design learning into what we attempt, we are not doing adaptive implementation. Therefore, there needs to be an increased commitment to monitoring and experiments in watersheds that will lead to learning. This presentation will 1) explain the logic for adaptive implementation; 2) discuss the ways that water quality modelers could characterize and explain model uncertainty to decision makers; 3) speculate on the implications…

  7. Competition and quality in health care markets: a differential-game approach.

    PubMed

    Brekke, Kurt R; Cellini, Roberto; Siciliani, Luigi; Straume, Odd Rune

    2010-07-01

    We investigate the effect of competition on quality in health care markets with regulated prices taking a differential game approach, in which quality is a stock variable. Using a Hotelling framework, we derive the open-loop solution (health care providers set the optimal investment plan at the initial period) and the feedback closed-loop solution (providers move investments in response to the dynamics of the states). Under the closed-loop solution competition is more intense in the sense that providers observe quality in each period and base their investment on this information. If the marginal provision cost is constant, the open-loop and closed-loop solutions coincide, and the results are similar to the ones obtained by static models. If the marginal provision cost is increasing, investment and quality are lower in the closed-loop solution (when competition is more intense). In this case, static models tend to exaggerate the positive effect of competition on quality.

  8. An index approach to performance-based payments for water quality.

    PubMed

    Maille, Peter; Collins, Alan R

    2012-05-30

    In this paper we describe elements of a field research project that presented farmers with economic incentives to control nitrate runoff. The approach used is novel in that payments are based on ambient water quality and water quantity produced by a watershed rather than proxies for water quality conservation. Also, payments are made based on water quality relative to a control watershed, and therefore, account for stochastic fluctuations in background nitrate levels. Finally, the program pays farmers as a group to elicit team behavior. We present our approach to modeling that allowed us to estimate prices for water and resulting payment levels. We then compare these preliminary estimates to the actual values recorded over 33 months of fieldwork. We find that our actual payments were 29% less than our preliminary estimates, due in part to the failure of our ecological model to estimate discharge accurately. Despite this shortfall, the program attracted the participation of 53% of the farmers in the watershed, and resulted in substantial nitrate abatement activity. Given this favorable response, we propose that research efforts focus on implementing field trials of group-level performance-based payments. Ideally these programs would be low risk and control for naturally occurring contamination. Copyright © 2012 Elsevier Ltd. All rights reserved.

  9. Image quality assessment by preprocessing and full reference model combination

    NASA Astrophysics Data System (ADS)

    Bianco, S.; Ciocca, G.; Marini, F.; Schettini, R.

    2009-01-01

    This paper focuses on full-reference image quality assessment and presents different computational strategies aimed at improving the robustness and accuracy of some well-known and widely used state-of-the-art models, namely the Structural Similarity approach (SSIM) by Wang and Bovik and the S-CIELAB spatial-color model by Zhang and Wandell. We investigate the hypothesis that combining error images with a visual attention model could allow a better fit of the psycho-visual data of the LIVE Image Quality Assessment Database Release 2. We show that the proposed quality assessment metric better correlates with the experimental data.
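
    The SSIM component of such a metric can be reproduced with scikit-image; the sketch below computes the global SSIM score and its local similarity map, then applies a crude centre-weighted map as a stand-in for the visual-attention weighting discussed in the abstract. The file names and the Gaussian weighting are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from skimage import img_as_float, io, color
from skimage.metrics import structural_similarity

# Hypothetical file names; any RGB reference/distorted pair will do.
reference = img_as_float(color.rgb2gray(io.imread("reference.png")))
distorted = img_as_float(color.rgb2gray(io.imread("distorted.png")))

# Global SSIM score plus the local similarity map that attention-based
# strategies would re-weight with a saliency map.
score, ssim_map = structural_similarity(reference, distorted,
                                        data_range=1.0, full=True)

# A crude attention weighting: emphasise the image centre
# (a placeholder for a real visual-attention model).
h, w = ssim_map.shape
yy, xx = np.mgrid[0:h, 0:w]
weights = np.exp(-(((yy - h / 2) ** 2 + (xx - w / 2) ** 2)
                   / (2 * (0.25 * min(h, w)) ** 2)))
weighted_score = np.sum(ssim_map * weights) / np.sum(weights)
print(score, weighted_score)
```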

  10. Enhancing population pharmacokinetic modeling efficiency and quality using an integrated workflow.

    PubMed

    Schmidt, Henning; Radivojevic, Andrijana

    2014-08-01

    Population pharmacokinetic (popPK) analyses are at the core of Pharmacometrics and need to be performed regularly. Although these analyses are relatively standard, a large variability can be observed in both the time (efficiency) and the way they are performed (quality). Main reasons for this variability include the level of experience of a modeler, personal preferences and tools. This paper aims to examine how the process of popPK model building can be supported in order to increase its efficiency and quality. The presented approach to the conduct of popPK analyses is centered around three key components: (1) identification of the most common and important popPK model features, (2) required information content and formatting of the data for modeling, and (3) methodology, workflow and workflow-supporting tools. This approach has been used in several popPK modeling projects and a documented example is provided in the supplementary material. Efficiency of model building is improved by avoiding repetitive coding and other labor-intensive tasks and by putting the emphasis on a fit-for-purpose model. Quality is improved by ensuring that the workflow and tools are in alignment with a popPK modeling guidance which is established within an organization. The main conclusion of this paper is that workflow-based approaches to popPK modeling are feasible and have significant potential to ameliorate its various aspects. However, the implementation of such an approach in a pharmacometric organization requires openness towards innovation and change, the key ingredient for the evolution of integrative and quantitative drug development in the pharmaceutical industry.

  11. Reduced-form air quality modeling for community-scale ...

    EPA Pesticide Factsheets

    Transportation plays an important role in modern society, but its impact on air quality has been shown to have significant adverse effects on public health. Numerous reviews (HEI, CDC, WHO) summarizing findings of hundreds of studies conducted mainly in the last decade, conclude that exposures to traffic emissions near roads are a public health concern. The Community LINE Source Model (C-LINE) is a web-based model designed to inform the community user of local air quality impacts due to roadway vehicles in their region of interest using a simplified modeling approach. Reduced-form air quality modeling is a useful tool for examining what-if scenarios of changes in emissions, such as those due to changes in traffic volume, fleet mix, or vehicle speed. Examining various scenarios of air quality impacts in this way can identify potentially at-risk populations located near roadways, and the effects that a change in traffic activity may have on them. C-LINE computes dispersion of primary mobile source pollutants using meteorological conditions for the region of interest and computes air-quality concentrations corresponding to these selected conditions. C-LINE functionality has been expanded to model emissions from port-related activities (e.g. ships, trucks, cranes, etc.) in a reduced-form modeling system for local-scale near-port air quality analysis. This presentation describes the Community modeling tools C-LINE and C-PORT that are intended to be used by local gove
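
    A reduced-form line-source calculation of the kind C-LINE wraps in a web interface can be sketched in a few lines: the road is discretised into point sources and a steady-state Gaussian plume with linear dispersion-coefficient growth is summed at the receptor. The dispersion coefficients, wind speed and emission rate below are placeholders, not the R-LINE/C-LINE formulation itself.

```python
import numpy as np

def line_source_concentration(receptor_x, receptor_y, emis_rate_per_m,
                              road_length=1000.0, wind_speed=3.0,
                              n_points=200, a=0.08, b=0.06):
    """Very simplified Gaussian estimate for a ground-level road source.

    The road runs along the y-axis; the wind blows along +x. Each road segment
    is treated as a point source; sigma_y and sigma_z grow linearly with
    downwind distance (a*x, b*x), a crude stand-in for real stability curves.
    Returns the receptor concentration in g/m3 for an emission rate in g/s/m.
    """
    seg_y = np.linspace(-road_length / 2, road_length / 2, n_points)
    seg_q = emis_rate_per_m * road_length / n_points   # g/s per segment
    x = receptor_x                                     # downwind distance, m
    if x <= 0:
        return 0.0                                     # receptor is upwind
    sigma_y, sigma_z = a * x, b * x
    cross = receptor_y - seg_y
    # ground-level source and receptor: full ground reflection doubles the plume
    c = (seg_q / (np.pi * wind_speed * sigma_y * sigma_z)
         * np.exp(-cross ** 2 / (2 * sigma_y ** 2)))
    return float(np.sum(c))

# Concentration 50 m downwind of the road centreline for 0.01 g/s per metre
print(line_source_concentration(50.0, 0.0, emis_rate_per_m=0.01))
```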

  12. Taxonomy-Based Approaches to Quality Assurance of Ontologies

    PubMed Central

    Perl, Yehoshua; Ochs, Christopher

    2017-01-01

    Ontologies are important components of health information management systems. As such, the quality of their content is of paramount importance. It has proven practical to develop quality assurance (QA) methodologies based on the automated identification of sets of concepts expected to have a higher likelihood of errors. Four kinds of such sets (called QA-sets), organized around the themes of complex and uncommonly modeled concepts, are introduced. A survey of different methodologies based on these QA-sets and the results of applying them to various ontologies are presented. Overall, following these approaches leads to higher QA yields and better utilization of QA personnel. The formulation of additional QA-set methodologies will further enhance the suite of available ontology QA tools. PMID:29158885

  13. DEVELOPMENT OF AN AGGREGATION AND EPISODE SELECTION SCHEME TO SUPPORT THE MODELS-3 COMMUNITY MULTISCALE AIR QUALITY MODEL

    EPA Science Inventory

    The development of an episode selection and aggregation approach, designed to support distributional estimation for use with the Models-3 Community Multiscale Air Quality (CMAQ) model, is described. The approach utilized cluster analysis of the 700-hPa east-west and north-south...

  14. A novel client service quality measuring model and an eHealthcare mitigating approach.

    PubMed

    Cheng, L M; Choi, Wai Ping Choi; Wong, Anita Yiu Ming

    2016-07-01

    Facing population ageing in Hong Kong, the demand for long-term elderly health care services is increasing. The challenge is to support a good quality of service under the constraints of a recent shortage of nursing and care services professionals, without redesigning the workflow operated in the existing elderly health care industries. The Total QoS measure based on a Finite Capacity Queuing Model is a reliable method and an effective measurement of quality of service. The value is good for measuring the staffing level and offers a measurement for efficiency enhancement when incorporating new technologies such as ICT. The implemented system has improved the Quality of Service by more than 14%, and the extra released manpower resource will allow clinical care providers to offer further value-added services without actually increasing head count. We have developed a novel Quality of Service measurement for clinical care services based on multiple queues using the finite capacity queue model M/M/c/K/n, and the measurement is useful for estimating the shortage of staff resources in a caring institution. It is essential for future integration with the existing widely used assessment model to develop reliable measuring limits which allow an effective measurement of public funds used in health care industries. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
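
    The steady-state behaviour of an M/M/c/K queue follows directly from the birth-death balance equations; the sketch below computes the state probabilities and two simple service-quality indicators (blocking probability and mean number of clients in the system). The ward parameters are hypothetical, and the finite-population refinement of the paper's M/M/c/K/n model is not included.

```python
from math import factorial

def mmck_state_probs(lam, mu, c, K):
    """Steady-state probabilities of an M/M/c/K queue.

    lam : arrival rate (requests/hour)
    mu  : service rate per carer (requests/hour)
    c   : number of carers on duty
    K   : system capacity (waiting + in service)
    """
    a = lam / mu
    unnorm = []
    for n in range(K + 1):
        if n <= c:
            unnorm.append(a ** n / factorial(n))
        else:
            unnorm.append(a ** n / (factorial(c) * c ** (n - c)))
    total = sum(unnorm)
    return [p / total for p in unnorm]

# Hypothetical ward: 12 requests/hour, each carer handles 4/hour,
# 4 carers on duty, at most 10 clients in the system.
probs = mmck_state_probs(lam=12.0, mu=4.0, c=4, K=10)
p_block = probs[-1]                          # probability a request is turned away
mean_in_system = sum(n * p for n, p in enumerate(probs))
print(f"blocking probability: {p_block:.3f}, mean clients in system: {mean_in_system:.2f}")
```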

  15. A methodology model for quality management in a general hospital.

    PubMed

    Stern, Z; Naveh, E

    1997-01-01

    A reappraisal is made of the relevance of industrial modes of quality management to the issues of medical care. Analysis of the nature of medical care, which differentiates it from the supplier-client relationships of industry, presents the main intrinsic characteristics, which create problems in application of the industrial quality management approaches to medical care. Several examples are the complexity of the relationship between the medical action and the result obtained, the client's nonacceptance of economic profitability as a value in his medical care, and customer satisfaction biased by variable standards of knowledge. The real problems unique to hospitals are addressed, and a methodology model for their quality management is offered. Included is a sample of indicator vectors, measurements of quality care, cost of medical care, quality of service, and human resources. These are based on the trilogy of planning quality, quality control, and improving quality. The conclusions confirm the inadequacy of industrial quality management approaches for medical institutions and recommend investment in formulation of appropriate concepts.

  16. Early experiences building a software quality prediction model

    NASA Technical Reports Server (NTRS)

    Agresti, W. W.; Evanco, W. M.; Smith, M. C.

    1990-01-01

    Early experiences building a software quality prediction model are discussed. The overall research objective is to establish a capability to project a software system's quality from an analysis of its design. The technical approach is to build multivariate models for estimating reliability and maintainability. Data from 21 Ada subsystems were analyzed to test hypotheses about various design structures leading to failure-prone or unmaintainable systems. Current design variables highlight the interconnectivity and visibility of compilation units. Other model variables provide for the effects of reusability and software changes. Reported results are preliminary because additional project data is being obtained and new hypotheses are being developed and tested. Current multivariate regression models are encouraging, explaining 60 to 80 percent of the variation in error density of the subsystems.

  17. Drinking water quality management: a holistic approach.

    PubMed

    Rizak, S; Cunliffe, D; Sinclair, M; Vulcano, R; Howard, J; Hrudey, S; Callan, P

    2003-01-01

    A growing list of water contaminants has led to some water suppliers relying primarily on compliance monitoring as a mechanism for managing drinking water quality. While such monitoring is a necessary part of drinking water quality management, experiences with waterborne disease threats and outbreaks have shown that compliance monitoring for numerical limits is not, in itself, sufficient to guarantee the safety and quality of drinking water supplies. To address these issues, the Australian National Health and Medical Research Council (NHMRC) has developed a Framework for Management of Drinking Water Quality (the Framework) for incorporation in the Australian Drinking Water Guidelines, the primary reference on drinking water quality in Australia. The Framework was developed specifically for drinking water supplies and provides a comprehensive and preventive risk management approach from catchment to consumer. It includes holistic guidance on a range of issues considered good practice for system management. The Framework addresses four key areas: Commitment to Drinking Water Quality Management, System Analysis and System Management, Supporting Requirements, and Review. The Framework represents a significantly enhanced approach to the management and regulation of drinking water quality and offers a flexible and proactive means of optimising drinking water quality and protecting public health. Rather than the primary reliance on compliance monitoring, the Framework emphasises prevention, the importance of risk assessment, maintaining the integrity of water supply systems and application of multiple barriers to assure protection of public health. Development of the Framework was undertaken in collaboration with the water industry, regulators and other stakeholders, and will promote a common and unified approach to drinking water quality management throughout Australia. The Framework has attracted international interest.

  18. An approach for quantitative image quality analysis for CT

    NASA Astrophysics Data System (ADS)

    Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe

    2016-03-01

    An objective and standardized approach to assess the image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and to objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines, and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis toolkit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method that generates a modified set of components with sparse loadings, as compared to standard principal component analysis (PCA), used in conjunction with the Hotelling T2 statistical analysis method to compare, qualify, and detect faults in the tested systems.
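
    The flavour of the SPCA-plus-Hotelling-T² step can be reproduced with scikit-learn's SparsePCA on a scans-by-metrics table; the sketch below flags the scans whose metric profile departs most from the rest. The synthetic data and the plain (unmodified) SparsePCA are assumptions; the paper's modified SPCA and MATLAB toolkit are not reproduced here.

```python
import numpy as np
from sklearn.decomposition import SparsePCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Synthetic stand-in for a scans-by-metrics table (50 scans x 20 metrics).
X = rng.normal(size=(50, 20))
X[:, :3] += rng.normal(scale=3.0, size=(50, 1))   # a few correlated metrics

X_std = StandardScaler().fit_transform(X)
spca = SparsePCA(n_components=5, alpha=1.0, random_state=0)
scores = spca.fit_transform(X_std)

# Hotelling T^2 on the sparse-component scores: large values flag scans whose
# image-quality profile departs from the bulk of the tested systems.
cov = np.cov(scores, rowvar=False)
t2 = np.einsum("ij,jk,ik->i", scores, np.linalg.pinv(cov), scores)
suspect = np.argsort(t2)[-3:]
print("scans with the largest T^2 statistics:", suspect)
```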

  19. Analyzing the quality robustness of chemotherapy plans with respect to model uncertainties.

    PubMed

    Hoffmann, Anna; Scherrer, Alexander; Küfer, Karl-Heinz

    2015-01-01

    Mathematical models of chemotherapy planning problems contain various biomedical parameters whose values are difficult to quantify and thus subject to some uncertainty. This uncertainty propagates into the therapy plans computed on these models, which raises the question of how robust the expected therapy quality is. This work introduces a combined approach for analyzing the quality robustness of plans in terms of dosing levels with respect to model uncertainties in chemotherapy planning. It uses concepts from multi-criteria decision making for studying parameters related to the balancing between the different therapy goals, and concepts from sensitivity analysis for the examination of parameters describing the underlying biomedical processes and their interplay. This approach allows for a profound assessment of how stable a therapy plan's quality is with respect to parametric changes in the underlying mathematical model. Copyright © 2014 Elsevier Inc. All rights reserved.

  20. The relationship between quality management practices and organisational performance: A structural equation modelling approach

    NASA Astrophysics Data System (ADS)

    Jamaluddin, Z.; Razali, A. M.; Mustafa, Z.

    2015-02-01

    The purpose of this paper is to examine the relationship between quality management practices (QMPs) and organisational performance for the manufacturing industry in Malaysia. In this study, a QMPs and organisational performance framework is developed according to a comprehensive literature review which covers aspects of hard and soft quality factors in the manufacturing process environment. A total of 11 hypotheses have been put forward to test the relationship amongst the six constructs, which are management commitment, training, process management, quality tools, continuous improvement and organisational performance. The model is analysed using Structural Equation Modeling (SEM) with AMOS software version 18.0 using Maximum Likelihood (ML) estimation. A total of 480 questionnaires were distributed, and 210 questionnaires were valid for analysis. The results of the modeling analysis using ML estimation indicate that the fit statistics of the QMPs and organisational performance model for the manufacturing industry are admissible. From the results, it was found that management commitment has a significant impact on training and process management. Similarly, training had a significant effect on quality tools, process management and continuous improvement. Furthermore, quality tools have a significant influence on process management and continuous improvement. Likewise, process management also has a significant impact on continuous improvement. In addition, continuous improvement has a significant influence on organisational performance. However, the results of the study also found that there is no significant relationship between management commitment and quality tools, and between management commitment and continuous improvement. The results of the study can be used by managers to prioritize the implementation of QMPs. For instance, those practices that are found to have a positive impact on organisational performance can be recommended to…

  1. Diagnostic Analysis of Ozone Concentrations Simulated by Two Regional-Scale Air Quality Models

    EPA Science Inventory

    Since the Community Multiscale Air Quality modeling system (CMAQ) and the Weather Research and Forecasting with Chemistry model (WRF/Chem) use different approaches to simulate the interaction of meteorology and chemistry, this study compares the CMAQ and WRF/Chem air quality simu...

  2. Accreditation and quality approach in operating theatre departments: the French approach.

    PubMed

    Soudée, M

    2005-01-01

    Since 1996, French health establishments have been subject to a process of evaluating the quality of care, called "accreditation". This process was controlled by ANAES, which, after January 1st, 2005, became the Haute Autorité de Santé (HAS). The accreditation is characterized by a dual process of self-assessment and external audit, leading to four levels of accreditation. In spite of requiring a time-consuming methodology, this approach provides an important means of consolidating the development of the quality approach and re-stimulating the compliance of establishments with standards of safety and vigilance. The professional teams of many French operating theatre departments have been able to use the regulatory and restrictive framework of accreditation to organize quality approaches specific to the operative system, supported by the organizational structures of the department such as the operating suite committee, departmental boards and the steering group. Based on quality guidelines including a commitment from the manager and operating suite committee, as well as a quality flow chart and a quality system, these teams describe the main procedures for running the operating theatre. They also organize the follow-up of incidents and undesirable events, along with the risks and points to watch. Audits of the operative system are planned on a regular basis. The second version of the accreditation process considerably reinforces the assessment of professional practices by evaluating the relevance, the risks and the methods of managing care for pathologies. It will make it possible to implement assessments of the health care provided by operating theatre departments and will reinforce the importance of the search for quality.

  3. Development and application of air quality models at the US ...

    EPA Pesticide Factsheets

    Overview of the development and application of air quality models at the U.S. EPA, particularly focused on the development and application of the Community Multiscale Air Quality (CMAQ) model developed within the Computational Exposure Division (CED) of the National Exposure Research Laboratory (NERL). This presentation will provide a simple overview of air quality model development and application geared toward a non-technical student audience. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  4. Do Energy Efficiency Standards Improve Quality? Evidence from a Revealed Preference Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Houde, Sebastien; Spurlock, C. Anna

    Minimum energy efficiency standards have occupied a central role in U.S. energy policy for more than three decades, but little is known about their welfare effects. In this paper, we employ a revealed preference approach to quantify the impact of past revisions in energy efficiency standards on product quality. The micro-foundation of our approach is a discrete choice model that allows us to compute a price-adjusted index of vertical quality. Focusing on the appliance market, we show that several standard revisions during the period 2001-2011 have led to an increase in quality. We also show that these standards have had a modest effect on prices, and in some cases they even led to decreases in prices. For revision events where overall quality increases and prices decrease, the consumer welfare effect of tightening the standards is unambiguously positive. Finally, we show that after controlling for the effect of improvement in energy efficiency, standards have induced an expansion of quality in the non-energy dimension. We discuss how imperfect competition can rationalize these results.

  5. Subjective Values of Quality of Life Dimensions in Elderly People. A SEM Preference Model Approach

    ERIC Educational Resources Information Center

    Elosua, Paula

    2011-01-01

    This article proposes a Thurstonian model in the framework of Structural Equation Modelling (SEM) to assess preferences among quality of life dimensions for the elderly. Data were gathered by a paired comparison design in a sample comprised of 323 people aged from 65 to 94 years old. Five dimensions of quality of life were evaluated: Health,…

  6. APOLLO: a quality assessment service for single and multiple protein models.

    PubMed

    Wang, Zheng; Eickholt, Jesse; Cheng, Jianlin

    2011-06-15

    We built a web server named APOLLO, which can evaluate the absolute global and local qualities of a single protein model using machine learning methods or the global and local qualities of a pool of models using a pair-wise comparison approach. Based on our evaluations on 107 CASP9 (Critical Assessment of Techniques for Protein Structure Prediction) targets, the predicted quality scores generated from our machine learning and pair-wise methods have an average per-target correlation of 0.671 and 0.917, respectively, with the true model quality scores. Based on our test on 92 CASP9 targets, our predicted absolute local qualities have an average difference of 2.60 Å with the actual distances to native structure. http://sysbio.rnet.missouri.edu/apollo/. Single and pair-wise global quality assessment software is also available at the site.

  7. SIMULATION OF AEROSOL DYNAMICS: A COMPARATIVE REVIEW OF ALGORITHMS USED IN AIR QUALITY MODELS

    EPA Science Inventory

    A comparative review of algorithms currently used in air quality models to simulate aerosol dynamics is presented. This review addresses coagulation, condensational growth, nucleation, and gas/particle mass transfer. Two major approaches are used in air quality models to repres...

  8. Dynamic Evaluation of Long-Term Air Quality Model Simulations Over the Northeastern U.S.

    EPA Science Inventory

    Dynamic model evaluation assesses a modeling system's ability to reproduce changes in air quality induced by changes in meteorology and/or emissions. In this paper, we illustrate various approaches to dynamic model evaluation utilizing 18 years of air quality simulations perform...

  9. Offline modeling for product quality prediction of mineral processing using modeling error PDF shaping and entropy minimization.

    PubMed

    Ding, Jinliang; Chai, Tianyou; Wang, Hong

    2011-03-01

    This paper presents a novel offline modeling approach for product quality prediction of mineral processing, which consists of a number of unit processes in series. The prediction of the product quality of the whole mineral process (i.e., the mixed concentrate grade) plays an important role, and the establishment of its predictive model is a key issue for plantwide optimization. For this purpose, a hybrid modeling approach for mixed concentrate grade prediction is proposed, which consists of a linear model and a nonlinear model. The least-squares support vector machine is adopted to establish the nonlinear model. The inputs of the predictive model are the performance indices of each unit process, while the output is the mixed concentrate grade. In this paper, the model parameter selection is transformed into the shape control of the probability density function (PDF) of the modeling error. In this context, both PDF-control-based and minimum-entropy-based model parameter selection approaches are proposed. Indeed, this is the first time that the PDF shape control idea is used to deal with system modeling, where the key idea is to tune model parameters so that either the modeling error PDF is controlled to follow a target PDF or the modeling error entropy is minimized. The experimental results using the real plant data and the comparison of the two approaches are discussed. The results show the effectiveness of the proposed approaches.
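
    A generic version of entropy-based parameter selection can be sketched with a kernel density estimate of the validation residuals: candidate model parameters are ranked by the estimated differential entropy of the modeling error. The example below uses an SVR in place of the least-squares support vector machine and synthetic plant data, so it illustrates the selection criterion rather than the paper's exact algorithm.

```python
import numpy as np
from scipy.stats import gaussian_kde
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
# Synthetic stand-in for unit-process indices (X) and mixed concentrate grade (y).
X = rng.uniform(0, 1, size=(300, 4))
y = 60 + 5 * X[:, 0] - 3 * X[:, 1] ** 2 + rng.normal(scale=0.5, size=300)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

def error_entropy(residuals):
    """Monte-Carlo estimate of the differential entropy of the residual PDF."""
    kde = gaussian_kde(residuals)
    return -np.mean(np.log(kde(residuals)))

# Rank candidate kernel widths by the entropy of the validation error.
best = None
for gamma in [0.1, 0.5, 1.0, 5.0]:
    model = SVR(C=10.0, gamma=gamma).fit(X_tr, y_tr)
    h = error_entropy(y_val - model.predict(X_val))
    if best is None or h < best[0]:
        best = (h, gamma)

print(f"entropy-minimizing gamma: {best[1]} (entropy {best[0]:.3f})")
```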

  10. Assessment and modeling of the groundwater hydrogeochemical quality parameters via geostatistical approaches

    NASA Astrophysics Data System (ADS)

    Karami, Shawgar; Madani, Hassan; Katibeh, Homayoon; Fatehi Marj, Ahmad

    2018-03-01

    Geostatistical methods are one of the advanced techniques used for interpolation of groundwater quality data. The results obtained from geostatistics are useful for decision makers to adopt suitable remedial measures to protect the quality of groundwater sources. Data used in this study were collected from 78 wells in the Varamin plain aquifer, located in the southeast of Tehran, Iran, in 2013. The ordinary kriging method was used in this study to evaluate groundwater quality parameters. Seven main quality parameters (i.e. total dissolved solids (TDS), sodium adsorption ratio (SAR), electrical conductivity (EC), sodium (Na⁺), total hardness (TH), chloride (Cl⁻) and sulfate (SO₄²⁻)) have been analyzed and interpreted by statistical and geostatistical methods. After data normalization by the Nscore method in WinGslib software, variography, as a geostatistical tool to define spatial regression, was compiled and experimental variograms were plotted with GS+ software. Then, the best theoretical model was fitted to each variogram based on the minimum RSS. A cross-validation method was used to determine the accuracy of the estimated data. Eventually, estimation maps of groundwater quality were prepared in WinGslib software, and an estimation variance map and an estimation error map were presented to evaluate the quality of the estimation at each estimated point. Results showed that the kriging method is more accurate than traditional interpolation methods.
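
    A comparable ordinary-kriging workflow can be run in Python with the pykrige package, used here as an assumed stand-in for the WinGslib/GS+ toolchain; the well coordinates and TDS values below are synthetic. The returned kriging variance surface plays the role of the estimation variance map described in the abstract.

```python
import numpy as np
from pykrige.ok import OrdinaryKriging

rng = np.random.default_rng(2)
# Synthetic stand-in for 78 wells: easting, northing (km) and TDS (mg/L).
x = rng.uniform(0, 40, 78)
y = rng.uniform(0, 30, 78)
tds = 800 + 15 * x + 10 * y + rng.normal(scale=80, size=78)

ok = OrdinaryKriging(x, y, tds, variogram_model="spherical",
                     verbose=False, enable_plotting=False)

gridx = np.arange(0, 40, 1.0)
gridy = np.arange(0, 30, 1.0)
# z holds the kriged TDS estimates, ss the kriging (estimation) variance —
# the two surfaces the study maps to judge estimation quality.
z, ss = ok.execute("grid", gridx, gridy)
print(z.shape, float(ss.max()))
```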

  11. Practical Approaches to Quality Improvement for Radiologists.

    PubMed

    Kelly, Aine Marie; Cronin, Paul

    2015-10-01

    Continuous quality improvement is a fundamental attribute of high-performing health care systems. Quality improvement is an essential component of health care, with the current emphasis on adding value. It is also a regulatory requirement, with reimbursements increasingly being linked to practice performance metrics. Practice quality improvement efforts must be demonstrated for credentialing purposes and for certification of radiologists in practice. Continuous quality improvement must occur for radiologists to remain competitive in an increasingly diverse health care market. This review provides an introduction to the main approaches available to undertake practice quality improvement, which will be useful for busy radiologists. Quality improvement plays multiple roles in radiology services, including ensuring and improving patient safety, providing a framework for implementing and improving processes to increase efficiency and reduce waste, analyzing and depicting performance data, monitoring performance and implementing change, enabling personnel assessment and development through continued education, and optimizing customer service and patient outcomes. The quality improvement approaches and underlying principles overlap, which is not surprising given that they all align with good patient care. The application of these principles to radiology practices not only benefits patients but also enhances practice performance through promotion of teamwork and achievement of goals. © RSNA, 2015.

  12. Theoretical approach to society-wide environmental quality control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ayano, K.

    1982-01-01

    The study outlines the basis for a theory of societal control of environmental quality in the US based on the concepts and philosophy of company-wide quality control which has developed in Japan as a cross-disciplinary approach to problem-solving in the industrial realm. The basic concepts are: 1) every member of society, as a producer of environmental products and services for future generations, in principle has the responsibility to control the quality of his output; 2) environment quality is the quality of life, or the fitness of use of environment for humans; and 3) societal control is any activity necessary for quality production of environmental products and services continuously or in the long run. A motivator-hygiene theory of environmental quality is identified, and a proposal is made that the policy provision must be formulated differently between those aimed at hygiene factors of environmental quality and those aimed at motivators, the former in a collectivistic manner, the latter as an individual problem. The concept of societal cost of environmental quality is introduced. Based on the motivator-hygiene theory of environmental quality, the collectivistic and individual approaches are differentiated and discussed.

  13. Incorporating principal component analysis into air quality model evaluation

    EPA Science Inventory

    The efficacy of standard air quality model evaluation techniques is becoming compromised as the simulation periods continue to lengthen in response to ever increasing computing capacity. Accordingly, the purpose of this paper is to demonstrate a statistical approach called Princi...

  14. Electronic Cigarettes and Indoor Air Quality: A Simple Approach to Modeling Potential Bystander Exposures to Nicotine

    PubMed Central

    Colard, Stéphane; O’Connell, Grant; Verron, Thomas; Cahours, Xavier; Pritchard, John D.

    2014-01-01

    There has been rapid growth in the use of electronic cigarettes (“vaping”) in Europe, North America and elsewhere. With such increased prevalence, there is currently a debate on whether the aerosol exhaled following the use of e-cigarettes has implications for the quality of air breathed by bystanders. Conducting chemical analysis of the indoor environment can be costly and resource intensive, limiting the number of studies which can be conducted. However, this can be modelled reasonably accurately based on empirical emissions data and using some basic assumptions. Here, we present a simplified model, based on physical principles, which considers aerosol propagation, dilution and extraction to determine the potential contribution of a single puff from an e-cigarette to indoor air. From this, it was then possible to simulate the cumulative effect of vaping over time. The model was applied to a virtual, but plausible, scenario considering an e-cigarette user and a non-user working in the same office space. The model was also used to reproduce published experimental studies and showed good agreement with the published values of indoor air nicotine concentration. With some additional refinements, such an approach may be a cost-effective and rapid way of assessing the potential exposure of bystanders to exhaled e-cigarette aerosol constituents. PMID:25547398
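
    The core of such a model is a single-zone, well-mixed mass balance: each puff adds mass to the room air and ventilation removes it exponentially. The sketch below implements that balance; the puff mass, puffing rate, room volume and air-exchange rate are illustrative placeholders rather than the paper's empirical emission data.

```python
import numpy as np

def indoor_concentration(puff_mass_ug, puffs_per_hour, room_volume_m3,
                         air_changes_per_hour, hours=8.0, dt_s=10.0):
    """Well-mixed single-zone model of a repeatedly emitted constituent.

    Each puff releases `puff_mass_ug` of the constituent (e.g. nicotine) and
    ventilation removes it at `air_changes_per_hour`. Returns the time series
    of the room-average concentration in ug/m3.
    """
    n = int(hours * 3600 / dt_s)
    puff_interval = 3600.0 / puffs_per_hour
    removal = air_changes_per_hour / 3600.0            # 1/s
    conc = np.zeros(n)
    next_puff = 0.0
    for i in range(1, n):
        t = i * dt_s
        c = conc[i - 1] * np.exp(-removal * dt_s)      # exponential loss by ventilation
        if t >= next_puff:                             # instantaneous, well-mixed release
            c += puff_mass_ug / room_volume_m3
            next_puff += puff_interval
        conc[i] = c
    return conc

# Illustrative office: 0.05 ug nicotine per exhaled puff, 15 puffs/h,
# 50 m3 room, 2 air changes per hour, 8-hour working day.
series = indoor_concentration(0.05, 15, 50.0, 2.0)
print(f"peak {series.max():.4f} ug/m3, 8-h mean {series.mean():.4f} ug/m3")
```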

  15. Quality by control: Towards model predictive control of mammalian cell culture bioprocesses.

    PubMed

    Sommeregger, Wolfgang; Sissolak, Bernhard; Kandra, Kulwant; von Stosch, Moritz; Mayer, Martin; Striedner, Gerald

    2017-07-01

    The industrial production of complex biopharmaceuticals using recombinant mammalian cell lines is still mainly built on a quality by testing approach, which is represented by fixed process conditions and extensive testing of the end-product. In 2004 the FDA launched the process analytical technology initiative, aiming to guide the industry towards advanced process monitoring and better understanding of how critical process parameters affect the critical quality attributes. Implementation of process analytical technology into the bio-production process enables moving from the quality by testing to a more flexible quality by design approach. The application of advanced sensor systems in combination with mathematical modelling techniques offers enhanced process understanding, allows on-line prediction of critical quality attributes and subsequently real-time product quality control. In this review opportunities and unsolved issues on the road to a successful quality by design and dynamic control implementation are discussed. A major focus is directed on the preconditions for the application of model predictive control for mammalian cell culture bioprocesses. Design of experiments providing information about the process dynamics upon parameter change, dynamic process models, on-line process state predictions and powerful software environments seem to be a prerequisite for quality by control realization. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Modelling and analysis of ozone concentration by artificial intelligent techniques for estimating air quality

    NASA Astrophysics Data System (ADS)

    Taylan, Osman

    2017-02-01

    High ozone concentrations are an important cause of air pollution, mainly because of ozone's role as a greenhouse gas. Ozone is produced by photochemical processes involving nitrogen oxides and volatile organic compounds in the lower atmosphere. Monitoring and controlling air quality in the urban environment is therefore very important for public health. However, air quality prediction is a highly complex and non-linear process; usually several attributes have to be considered. Artificial intelligence (AI) techniques can be employed to monitor and evaluate the ozone concentration level. The aim of this study is to develop an adaptive neuro-fuzzy inference system (ANFIS) approach to determine the influence of peripheral factors on air quality and pollution, a growing problem in Jeddah city due to ozone levels. The ozone concentration level was considered as a factor for predicting Air Quality (AQ) under the prevailing atmospheric conditions. Using the Air Quality Standards of Saudi Arabia, the ozone concentration level was modelled from factors such as nitrogen oxides (NOx), atmospheric pressure, temperature, and relative humidity. Hence, an ANFIS model was developed to predict the ozone concentration level, and model performance was assessed with test data obtained from the monitoring stations established by the General Authority of Meteorology and Environment Protection of the Kingdom of Saudi Arabia. The outcomes of the ANFIS model were re-assessed by fuzzy quality charts using quality specification and control limits based on US-EPA air quality standards. The results of the present study show that the ANFIS model is a comprehensive approach for estimating and assessing ozone levels and is a reliable approach for producing more credible outcomes.

  17. Vector-model-supported approach in prostate plan optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Eva Sau Fan; Department of Health Technology and Informatics, The Hong Kong Polytechnic University; Wu, Vincent Wing Cheung

    Lengthy time consumed in traditional manual plan optimization can limit the use of step-and-shoot intensity-modulated radiotherapy/volumetric-modulated radiotherapy (S&S IMRT/VMAT). A vector model base for retrieving similar radiotherapy cases was developed with respect to the structural and physiologic features extracted from the Digital Imaging and Communications in Medicine (DICOM) files. Planning parameters were retrieved from the selected similar reference case and applied to the test case to bypass the gradual adjustment of planning parameters. Therefore, the planning time spent on the traditional trial-and-error manual optimization approach at the beginning of optimization could be reduced. Each S&S IMRT/VMAT prostate reference database comprised 100 previously treated cases. Prostate cases were replanned with both traditional optimization and vector-model-supported optimization based on the oncologists' clinical dose prescriptions. A total of 360 plans, which consisted of 30 cases of S&S IMRT, 30 cases of 1-arc VMAT, and 30 cases of 2-arc VMAT plans including first optimization and final optimization with/without vector-model-supported optimization, were compared using the 2-sided t-test and paired Wilcoxon signed rank test, with a significance level of 0.05 and a false discovery rate of less than 0.05. For S&S IMRT, 1-arc VMAT, and 2-arc VMAT prostate plans, there was a significant reduction in planning time and iterations with vector-model-supported optimization, by almost 50%. When the first optimization plans were compared, 2-arc VMAT prostate plans had better plan quality than 1-arc VMAT plans. The volume receiving 35 Gy in the femoral head for 2-arc VMAT plans was reduced with vector-model-supported optimization compared with the traditional manual optimization approach. Otherwise, the quality of plans from both approaches was comparable. Vector-model-supported optimization was shown to offer much shortened planning time and iteration

  18. Protein model quality assessment prediction by combining fragment comparisons and a consensus Cα contact potential

    PubMed Central

    Zhou, Hongyi; Skolnick, Jeffrey

    2009-01-01

    In this work, we develop a fully automated method for the quality assessment prediction of protein structural models generated by structure prediction approaches such as fold recognition servers, or ab initio methods. The approach is based on fragment comparisons and a consensus Cα contact potential derived from the set of models to be assessed and was tested on CASP7 server models. The average Pearson linear correlation coefficient between predicted quality and model GDT-score per target is 0.83 for the 98 targets which is better than those of other quality assessment methods that participated in CASP7. Our method also outperforms the other methods by about 3% as assessed by the total GDT-score of the selected top models. PMID:18004783

  19. Use of Multiple Linear Regression Models for Setting Water Quality Criteria for Copper: A Complementary Approach to the Biotic Ligand Model.

    PubMed

    Brix, Kevin V; DeForest, David K; Tear, Lucinda; Grosell, Martin; Adams, William J

    2017-05-02

    …performances of the two models. The pooled MLR model was then applied to the species sensitivity distribution to derive acute and chronic criteria equations similar in form to the USEPA's current hardness-based criteria equations but with DOC, pH, and hardness as the independent variables. Overall, the MLR is less responsive to DOC than the BLM across a range of hardness and pH conditions but more responsive to hardness than the BLM. Additionally, at low and intermediate hardness, the MLR model is less responsive than the BLM to pH, but the two models respond comparably at high hardness. The net effect of these different response profiles is that under many typical water quality conditions, MLR- and BLM-based criteria are quite comparable. Indeed, conditions where the two models differ most (high pH/low hardness and low pH/high hardness) are relatively rare in natural aquatic systems. We suggest that this MLR-based approach, which includes the mechanistic foundation of the BLM but is also consistent with widely accepted hardness-dependent WQC in terms of development and form, may facilitate adoption of updated state-wide Cu criteria that more accurately account for the parameters influencing Cu bioavailability than current hardness-based criteria.
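
    An MLR criteria model of this general form (a log toxicity endpoint expressed as a linear function of ln DOC, pH and ln hardness) can be fitted with statsmodels; the sketch below uses synthetic toxicity data, so the coefficients are purely illustrative and not the derived Cu criteria equations.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
# Synthetic stand-in for paired toxicity tests and water chemistry.
n = 120
data = pd.DataFrame({
    "doc": rng.uniform(0.5, 20, n),        # mg/L dissolved organic carbon
    "ph": rng.uniform(6.0, 8.5, n),
    "hardness": rng.uniform(10, 300, n),   # mg/L as CaCO3
})
data["ln_ec50"] = (0.9 * np.log(data["doc"]) + 0.4 * data["ph"]
                   + 0.3 * np.log(data["hardness"])
                   + rng.normal(scale=0.3, size=n))

# MLR in the same form as hardness-based criteria equations:
# ln(EC50) = a*ln(DOC) + b*pH + c*ln(hardness) + intercept
model = smf.ols("ln_ec50 ~ np.log(doc) + ph + np.log(hardness)", data).fit()
print(model.params)

# A criterion-style prediction for one site's water chemistry
site = pd.DataFrame({"doc": [5.0], "ph": [7.5], "hardness": [100.0]})
print(float(np.exp(model.predict(site)).iloc[0]))
```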

  20. Assessing the Quality of the Learning Outcome in Vocational Education: The Expero Model

    ERIC Educational Resources Information Center

    Cervai, Sara; Cian, Luca; Berlanga, Alicia; Borelli, Massimo; Kekale, Tauno

    2013-01-01

    Purpose: This paper aims to present an innovative model to evaluate the quality of the learning outcome in vocational education and training (VET) considering a wide approach that includes, in particular, stakeholders' expectations and perceptions. Design/methodology/approach: The Expero model was implemented in various kinds of vocational schools…

  1. Performance-Based Service Quality Model: An Empirical Study on Japanese Universities

    ERIC Educational Resources Information Center

    Sultan, Parves; Wong, Ho

    2010-01-01

    Purpose: This paper aims to develop and empirically test the performance-based higher education service quality model. Design/methodology/approach: The study develops a 67-item instrument for measuring performance-based service quality with a particular focus on the higher education sector. Scale reliability is confirmed using Cronbach's alpha.…

  2. A Synthesis of a Quality Management Model for Education in Universities

    ERIC Educational Resources Information Center

    Srikanthan, G.; Dalrymple, John

    2004-01-01

    The paper attempts to synthesise the features of the model for quality management in education based on the approaches spelt out in four well-articulated methodologies for the practice of quality in higher education. Each methodology contributes to different views of education from the learners' and the institution's perspectives, providing…

  3. Classroom quality and academic skills: Approaches to learning as a moderator.

    PubMed

    Meng, Christine

    2015-12-01

    The purpose of this study was to examine whether approaches to learning moderated the association between child care classroom environment and Head Start children's academic skills. The data came from the Head Start Family and Child Experiences Survey (FACES-2003 Cohort). The dataset is a nationally representative longitudinal study of Head Start children. The sample was selected using a stratified 4-stage sampling procedure. Data were collected in fall 2003, spring 2004, spring 2005, and spring 2006 in the first year of kindergarten. Participants included 3- and 4-year-old Head Start children (n = 786; 387 boys, 399 girls; 119 Hispanic children, 280 African American children, 312 Caucasian children). Head Start children's academic skills in letter-word identification, dictation/spelling, and mathematics at the 4 time points were measured by the Woodcock-Johnson Achievement Battery tests. Approaches to learning in fall 2003 were measured by the teacher report of the Preschool Learning Behaviors Scale. Child care classroom quality in fall 2003 was measured by the revised Early Childhood Environment Rating Scale. Results of the linear mixed effects models demonstrated that approaches to learning significantly moderated the effect of child care classroom quality on Head Start children's writing and spelling. Specifically, positive approaches to learning mitigated the negative effect of lower levels of classroom quality on dictation/spelling. Results underscore the important role of approaches to learning as a protective factor. Implications for early childhood educators with an emphasis on learning goals for disengaged children are discussed. (c) 2015 APA, all rights reserved.
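
    The moderation test described here amounts to an interaction term in a linear mixed-effects model with children as the grouping factor. The sketch below sets that model up with statsmodels; the variable names and synthetic scores are placeholders, not the FACES-2003 data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
# Synthetic stand-in for repeated Woodcock-Johnson scores nested within children.
n_children, n_waves = 200, 4
df = pd.DataFrame({
    "child": np.repeat(np.arange(n_children), n_waves),
    "wave": np.tile(np.arange(n_waves), n_children),
    "class_quality": np.repeat(rng.normal(size=n_children), n_waves),
    "learning_approach": np.repeat(rng.normal(size=n_children), n_waves),
})
df["spelling"] = (90 + 3 * df["wave"] + 2 * df["class_quality"]
                  + 1.5 * df["learning_approach"]
                  + 1.0 * df["class_quality"] * df["learning_approach"]
                  + rng.normal(scale=5, size=len(df)))

# The product term carries the moderation hypothesis: does the effect of
# classroom quality depend on the child's approaches to learning?
model = smf.mixedlm("spelling ~ wave + class_quality * learning_approach",
                    df, groups=df["child"]).fit()
print(model.summary())
```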

  4. Linking Air Quality and Human Health Effects Models: An Application to the Los Angeles Air Basin.

    PubMed

    Stewart, Devoun R; Saunders, Emily; Perea, Roberto A; Fitzgerald, Rosa; Campbell, David E; Stockwell, William R

    2017-01-01

    Proposed emission control strategies for reducing ozone and particulate matter are evaluated better when air quality and health effects models are used together. The Community Multiscale Air Quality (CMAQ) model is the US Environmental Protection Agency's model for determining public policy and forecasting air quality. CMAQ was used to forecast air quality changes due to several emission control strategies that could be implemented between 2008 and 2030 for the South Coast Air Basin that includes Los Angeles. The Environmental Benefits Mapping and Analysis Program-Community Edition (BenMAP-CE) was used to estimate health and economic impacts of the different emission control strategies based on CMAQ simulations. BenMAP-CE is a computer program based on epidemiologic studies that link human health and air quality. This modeling approach is better for determining optimum public policy than approaches that only examine concentration changes.
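
    For readers unfamiliar with how BenMAP-CE-style tools turn modeled concentration changes into health endpoints, the sketch below shows the log-linear concentration-response form commonly used in such assessments; the baseline rate, coefficient, and population are illustrative, not values from this study.

    ```python
    import math

    def excess_cases(baseline_rate, beta, delta_conc, population):
        """Log-linear concentration-response form commonly used in BenMAP-style
        health impact assessments (all inputs here are illustrative only):
            delta_y = y0 * (1 - exp(-beta * delta_C)) * population
        """
        return baseline_rate * (1.0 - math.exp(-beta * delta_conc)) * population

    # Hypothetical example: a 2 ug/m3 PM2.5 reduction over 1 million people
    print(excess_cases(baseline_rate=0.008, beta=0.006, delta_conc=2.0,
                       population=1_000_000))
    ```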

  5. Hydropower Optimization Using Artificial Neural Network Surrogate Models of a High-Fidelity Hydrodynamics and Water Quality Model

    NASA Astrophysics Data System (ADS)

    Shaw, Amelia R.; Smith Sawyer, Heather; LeBoeuf, Eugene J.; McDonald, Mark P.; Hadjerioua, Boualem

    2017-11-01

    Hydropower operations optimization subject to environmental constraints is limited by challenges associated with dimensionality and spatial and temporal resolution. The need for high-fidelity hydrodynamic and water quality models within optimization schemes is driven by improved computational capabilities, increased requirements to meet specific points of compliance with greater resolution, and the need to optimize operations of not just single reservoirs but systems of reservoirs. This study describes an important advancement for computing hourly power generation schemes for a hydropower reservoir using high-fidelity models, surrogate modeling techniques, and optimization methods. The predictive power of the high-fidelity hydrodynamic and water quality model CE-QUAL-W2 is successfully emulated by an artificial neural network, then integrated into a genetic algorithm optimization approach to maximize hydropower generation subject to constraints on dam operations and water quality. This methodology is applied to a multipurpose reservoir near Nashville, Tennessee, USA. The model successfully reproduced high-fidelity reservoir information while enabling 6.8% and 6.6% increases in hydropower production value relative to actual operations for dissolved oxygen (DO) limits of 5 and 6 mg/L, respectively, while witnessing an expected decrease in power generation at more restrictive DO constraints. Exploration of simultaneous temperature and DO constraints revealed capability to address multiple water quality constraints at specified locations. The reduced computational requirements of the new modeling approach demonstrated an ability to provide decision support for reservoir operations scheduling while maintaining high-fidelity hydrodynamic and water quality information as part of the optimization decision support routines.
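
    A minimal sketch of the surrogate-plus-evolutionary-optimization idea follows. The surrogate below is an invented stand-in for the trained ANN emulator of CE-QUAL-W2, SciPy's differential evolution is used in place of the paper's genetic algorithm, and the DO constraint is enforced with a simple penalty; none of the numbers come from the study.

    ```python
    import numpy as np
    from scipy.optimize import differential_evolution

    HOURS = 24
    DO_LIMIT = 5.0      # mg/L, the kind of constraint explored in the study

    def surrogate(release):
        """Stand-in for the trained ANN emulator of CE-QUAL-W2: maps an hourly
        release schedule to (power value, minimum downstream DO). Both relations
        here are invented placeholders, not the fitted network."""
        power_value = np.sum(release * (50.0 + 10.0 * np.sin(np.arange(HOURS) / 3.0)))
        min_do = 8.0 - 0.02 * np.max(release)          # crude monotone proxy
        return power_value, min_do

    def objective(release):
        value, min_do = surrogate(release)
        penalty = 1e6 * max(0.0, DO_LIMIT - min_do)    # penalize DO violations
        return -(value - penalty)                      # minimize the negative value

    bounds = [(0.0, 150.0)] * HOURS                    # hourly release limits
    result = differential_evolution(objective, bounds, seed=1, maxiter=50)
    print("best schedule value:", -result.fun)
    ```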

  6. Hydropower Optimization Using Artificial Neural Network Surrogate Models of a High-Fidelity Hydrodynamics and Water Quality Model

    DOE PAGES

    Shaw, Amelia R.; Sawyer, Heather Smith; LeBoeuf, Eugene J.; ...

    2017-10-24

    Hydropower operations optimization subject to environmental constraints is limited by challenges associated with dimensionality and spatial and temporal resolution. The need for high-fidelity hydrodynamic and water quality models within optimization schemes is driven by improved computational capabilities, increased requirements to meet specific points of compliance with greater resolution, and the need to optimize operations of not just single reservoirs but systems of reservoirs. This study describes an important advancement for computing hourly power generation schemes for a hydropower reservoir using high-fidelity models, surrogate modeling techniques, and optimization methods. The predictive power of the high-fidelity hydrodynamic and water quality model CE-QUAL-W2 is successfully emulated by an artificial neural network, then integrated into a genetic algorithm optimization approach to maximize hydropower generation subject to constraints on dam operations and water quality. This methodology is applied to a multipurpose reservoir near Nashville, Tennessee, USA. The model successfully reproduced high-fidelity reservoir information while enabling 6.8% and 6.6% increases in hydropower production value relative to actual operations for dissolved oxygen (DO) limits of 5 and 6 mg/L, respectively, while witnessing an expected decrease in power generation at more restrictive DO constraints. Exploration of simultaneous temperature and DO constraints revealed capability to address multiple water quality constraints at specified locations. Here, the reduced computational requirements of the new modeling approach demonstrated an ability to provide decision support for reservoir operations scheduling while maintaining high-fidelity hydrodynamic and water quality information as part of the optimization decision support routines.

  7. Hydropower Optimization Using Artificial Neural Network Surrogate Models of a High-Fidelity Hydrodynamics and Water Quality Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shaw, Amelia R.; Sawyer, Heather Smith; LeBoeuf, Eugene J.

    Hydropower operations optimization subject to environmental constraints is limited by challenges associated with dimensionality and spatial and temporal resolution. The need for high-fidelity hydrodynamic and water quality models within optimization schemes is driven by improved computational capabilities, increased requirements to meet specific points of compliance with greater resolution, and the need to optimize operations of not just single reservoirs but systems of reservoirs. This study describes an important advancement for computing hourly power generation schemes for a hydropower reservoir using high-fidelity models, surrogate modeling techniques, and optimization methods. The predictive power of the high-fidelity hydrodynamic and water quality model CE-QUAL-W2 is successfully emulated by an artificial neural network, then integrated into a genetic algorithm optimization approach to maximize hydropower generation subject to constraints on dam operations and water quality. This methodology is applied to a multipurpose reservoir near Nashville, Tennessee, USA. The model successfully reproduced high-fidelity reservoir information while enabling 6.8% and 6.6% increases in hydropower production value relative to actual operations for dissolved oxygen (DO) limits of 5 and 6 mg/L, respectively, while witnessing an expected decrease in power generation at more restrictive DO constraints. Exploration of simultaneous temperature and DO constraints revealed capability to address multiple water quality constraints at specified locations. Here, the reduced computational requirements of the new modeling approach demonstrated an ability to provide decision support for reservoir operations scheduling while maintaining high-fidelity hydrodynamic and water quality information as part of the optimization decision support routines.

  8. Perceptual quality prediction on authentically distorted images using a bag of features approach

    PubMed Central

    Ghadiyaram, Deepti; Bovik, Alan C.

    2017-01-01

    Current top-performing blind perceptual image quality prediction models are generally trained on legacy databases of human quality opinion scores on synthetically distorted images. Therefore, they learn image features that effectively predict human visual quality judgments of inauthentic and usually isolated (single) distortions. However, real-world images usually contain complex composite mixtures of multiple distortions. We study the perceptually relevant natural scene statistics of such authentically distorted images in different color spaces and transform domains. We propose a “bag of feature maps” approach that avoids assumptions about the type of distortion(s) contained in an image and instead focuses on capturing consistencies—or departures therefrom—of the statistics of real-world images. Using a large database of authentically distorted images, human opinions of them, and bags of features computed on them, we train a regressor to conduct image quality prediction. We demonstrate the competence of the features toward improving automatic perceptual quality prediction by testing a learned algorithm using them on a benchmark legacy database as well as on a newly introduced distortion-realistic resource called the LIVE In the Wild Image Quality Challenge Database. We extensively evaluate the perceptual quality prediction model and algorithm and show that it is able to achieve good-quality prediction power that is better than other leading models. PMID:28129417
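
    As a rough illustration of the feature-based pipeline (not the authors' full multi-color-space, multi-domain feature set), the sketch below computes a small set of luminance-domain natural scene statistics from MSCN coefficients and fits a generic regressor to hypothetical opinion scores.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter
    from sklearn.svm import SVR

    def mscn(gray, sigma=7.0 / 6.0):
        """Mean-subtracted contrast-normalized (MSCN) coefficients, one classic
        NSS feature map; computed here only in the luminance domain."""
        gray = gray.astype(np.float64)
        mu = gaussian_filter(gray, sigma)
        var = gaussian_filter(gray * gray, sigma) - mu * mu
        return (gray - mu) / (np.sqrt(np.abs(var)) + 1.0)

    def feature_vector(gray):
        c = mscn(gray)
        z = (c - c.mean()) / (c.std() + 1e-12)
        # mean, variance, skewness- and kurtosis-like moments of the MSCN map
        return np.array([c.mean(), c.var(), (z ** 3).mean(), (z ** 4).mean()])

    def train_quality_model(images, mos):
        """images: list of 2-D grayscale arrays; mos: human opinion scores
        (both hypothetical here). Returns a fitted quality regressor."""
        X = np.vstack([feature_vector(im) for im in images])
        return SVR(kernel="rbf").fit(X, mos)
    ```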

  9. Viticulture microzoning: a functional approach aiming to grape and wine qualities

    NASA Astrophysics Data System (ADS)

    Bonfante, A.; Agrillo, A.; Albrizio, R.; Basile, A.; Buonomo, R.; De Mascellis, R.; Gambuti, A.; Giorio, P.; Guida, G.; Langella, G.; Manna, P.; Minieri, L.; Moio, L.; Siani, T.; Terribile, F.

    2014-12-01

    This paper aims to test a new physically oriented approach to viticulture zoning at the farm scale, strongly rooted in hydropedology and aiming to achieve a better use of environmental features with respect to plant requirements and wine production. The physics of our approach lies in the use of soil-plant-atmosphere simulation models, which apply physically based equations to describe the soil hydrological processes and solve the soil-plant water status. This study (ZOVISA project) was conducted on a farm devoted to high-quality wine production (Aglianico DOC), located in southern Italy (Campania region, Mirabella Eclano-AV). The soil spatial distribution was obtained from a standard soil survey informed by a geophysical survey. Two Homogeneous Zones (HZs) were identified; in each of them a physically based model was applied to solve the soil water balance and estimate the soil functional behaviour (crop water stress index, CWSI), defining the functional Homogeneous Zones (fHZs). In the latter, experimental plots were established and monitored to investigate soil-plant water status, crop development (biometric and physiological parameters) and daily climate variables (temperature, solar radiation, rainfall, wind). The effects of crop water status on crop response, must and wine quality were then evaluated in the fHZs. This was performed by comparing crop water stress with (i) crop physiological measurements (leaf gas exchange, chlorophyll a fluorescence, leaf water potential, chlorophyll content, LAI), (ii) grape bunch measurements (berry weight, sugar content, titratable acidity, etc.) and (iii) wine quality (aromatic response). Ultimately, this experiment proved the usefulness of the physically based approach for mapping viticulture microzones.
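
    The functional indicator used to delineate the fHZs, the crop water stress index, is commonly derived from the simulated water balance as one minus the ratio of actual to potential transpiration; the sketch below assumes that definition and uses invented seasonal totals.

    ```python
    def crop_water_stress_index(actual_transpiration, potential_transpiration):
        """CWSI as commonly derived from a soil-plant-atmosphere water balance:
        0 = no stress (Ta = Tp), 1 = full stress (Ta = 0). Assumes the index is
        defined on the transpiration ratio, as in many hydropedological studies."""
        if potential_transpiration <= 0:
            return 0.0
        return 1.0 - actual_transpiration / potential_transpiration

    # Hypothetical seasonal transpiration totals (mm) for two homogeneous zones
    print(crop_water_stress_index(310.0, 420.0))   # more stressed zone
    print(crop_water_stress_index(400.0, 420.0))   # less stressed zone
    ```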

  10. Frameworks for Assessing the Quality of Modeling and Simulation Capabilities

    NASA Astrophysics Data System (ADS)

    Rider, W. J.

    2012-12-01

    The importance of assuring quality in modeling and simulation has spawned several frameworks for structuring the examination of quality. The format and content of these frameworks provide an emphasis, completeness and flow to assessment activities. I will examine four frameworks that have been developed and describe how they can be improved and applied to a broader set of high consequence applications. Perhaps the first of these frameworks was known as CSAU [Boyack] (code scaling, applicability and uncertainty), used for nuclear reactor safety and endorsed by the United States Nuclear Regulatory Commission (USNRC). This framework was shaped by nuclear safety practice, and the practical structure needed after the Three Mile Island accident. It incorporated the dominant experimental program, the dominant analysis approach, and concerns about the quality of modeling. The USNRC gave it the force of law that made the nuclear industry take it seriously. After the cessation of nuclear weapons testing, the United States began a program of examining the reliability of these weapons without testing. This program utilizes science including theory, modeling, simulation and experimentation to replace the underground testing. The emphasis on modeling and simulation necessitated attention to the quality of these simulations. Sandia developed the PCMM (predictive capability maturity model) to structure this attention [Oberkampf]. PCMM divides simulation into six core activities to be examined and graded relative to the needs of the modeling activity. NASA [NASA] has built yet another framework in response to the tragedy of the space shuttle accidents. Finally, Ben-Haim and Hemez focus upon modeling robustness and predictive fidelity in another approach. These frameworks are similar, and applied in a similar fashion. The adoption of these frameworks at Sandia and NASA has been slow and arduous because the force of law has not assisted acceptance. All existing frameworks are

  11. Evaluation of the Combined AERCOARE/AERMOD Modeling Approach for Offshore Sources

    EPA Science Inventory

    ENVIRON conducted an evaluation of the combined AERCOARE/AERMOD (AERCOARE-MOD) modeling approach for offshore sources using tracer data from four field studies. AERCOARE processes overwater meteorological data for use by the AERMOD air quality dispersion model (EPA, 2004a). AERC...

  12. Management-focused approach to investigating coastal water-quality drivers and impacts in the Baltic Sea

    NASA Astrophysics Data System (ADS)

    Vigouroux, G.; Destouni, G.; Chen, Y.; Bring, A.; Jönsson, A.; Cvetkovic, V.

    2017-12-01

    Coastal areas link human-driven conditions on land with open sea conditions, and include crucial and vulnerable ecosystems that provide a variety of ecosystem services. Eutrophication is a common problem that is not least observed in the Baltic Sea, where coastal water quality is influenced both by land-based nutrient loading and by partly eutrophic open sea conditions. Robust and adaptive management of coastal systems is essential and necessitates integration of large scale catchment-coastal-marine systems as well as consideration of anthropogenic drivers and impacts, and climate change. To address this coastal challenge, relevant methodological approaches are required for characterization of coupled land, local coastal, and open sea conditions under an adaptive management framework for water quality. In this paper we present a new general and scalable dynamic characterization approach, developed for and applied to the Baltic Sea and its coastal areas. A simple carbon-based water quality model is implemented, dividing the Baltic Sea into main management basins that are linked to corresponding hydrological catchments on land, as well as to each other through aggregated three-dimensional marine hydrodynamics. Relevant hydrodynamic variables and associated water quality results have been validated on the Baltic Sea scale and show good accordance with available observation data and other modelling approaches. Based on its scalability, this methodology is further used at the coastal zone scale to investigate the effects of hydrodynamic, hydro-climatic and nutrient load drivers on water quality and management implications for coastal areas in the Baltic Sea.

  13. Urban Air Quality Modelling with AURORA: Prague and Bratislava

    NASA Astrophysics Data System (ADS)

    Veldeman, N.; Viaene, P.; De Ridder, K.; Peelaerts, W.; Lauwaet, D.; Muhammad, N.; Blyth, L.

    2012-04-01

    The European Commission, in its strategy to protect the health of European citizens, states that in order to assess the impact of air pollution on public health, information on long-term exposure to air pollution should be available. Currently, indicators of air quality are often being generated using measured pollutant concentrations. While air quality monitoring station data provide accurate time series information at specific locations, air quality models have the advantage of being able to assess the spatial variability of air quality (for different resolutions) and predict air quality in the future based on different scenarios. When running such air quality models at a high spatial and temporal resolution, one can simulate the actual situation as closely as possible, allowing for a detailed assessment of the risk of exposure to citizens from different pollutants. AURORA (Air quality modelling in Urban Regions using an Optimal Resolution Approach), a prognostic 3-dimensional Eulerian chemistry-transport model, is designed to simulate urban- to regional-scale atmospheric pollutant concentration and exposure fields. The AURORA model also allows calculation of the impact of changes in land use (e.g. planting of trees) or of emission reduction scenarios on air quality. AURORA is currently being applied within the ESA atmospheric GMES service, PASODOBLE (http://www.myair-eu.org), which delivers information on air quality, greenhouse gases, stratospheric ozone, … At present there are two operational AURORA services within PASODOBLE. Within the "Air quality forecast service", VITO delivers daily air quality forecasts for Belgium at a resolution of 5 km and for the major Belgian cities: Brussels, Ghent, Antwerp, Liege and Charleroi. Furthermore, forecast services are provided for Prague, Czech Republic and Bratislava, Slovakia, both at a resolution of 1 km. The "Urban/regional air quality assessment service" provides urban- and regional-scale maps (hourly resolution

  14. Estimating Lightning NOx Emissions for Regional Air Quality Modeling

    NASA Astrophysics Data System (ADS)

    Holloway, T.; Scotty, E.; Harkey, M.

    2014-12-01

    Lightning emissions have long been recognized as an important source of nitrogen oxides (NOx) on a global scale, and an essential emission component for global atmospheric chemistry models. However, only in recent years have regional air quality models incorporated lightning NOx emissions into simulations. The growth in regional modeling of lightning emissions has been driven in part by comparisons with satellite-derived estimates of column NO2, especially from the Ozone Monitoring Instrument (OMI) aboard the Aura satellite. We present and evaluate a lightning inventory for the EPA Community Multiscale Air Quality (CMAQ) model. Our approach follows Koo et al. [2010] in spatially and temporally allocating a given emission total based on cloud-top height and convective precipitation. However, we consider alternate total NOx emission values (which translate into alternate lightning emission factors) based on a review of the literature and performance evaluation against OMI NO2 for July 2007 conditions over the U.S. and parts of Canada and Mexico. The vertical distribution of lightning emissions follows a bimodal distribution from Allen et al. [2012] calculated over 27 vertical model layers. Total lightning NO emissions for July 2007 show the highest above-land emissions in Florida, southeastern Texas and southern Louisiana. Although agreement with OMI NO2 across the domain varied significantly depending on lightning NOx assumptions, agreement among the simulations at ground-based NO2 monitors from the EPA Air Quality System database showed no meaningful sensitivity to lightning NOx. Emissions are compared with prior studies, which find similar distribution patterns, but a wide range of calculated magnitudes.
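
    The allocation scheme can be illustrated schematically. In the sketch below, a domain-total lightning NO value is spread horizontally with a simple convective-precipitation-by-cloud-top weight and split vertically with a normalized profile; the weighting is a simplified stand-in for the Koo et al. [2010] scheme and all numbers are invented.

    ```python
    import numpy as np

    def allocate_lightning_no(total_no_moles, conv_precip, cloud_top, profile):
        """Distribute a domain-total lightning NO emission to grid columns in
        proportion to a convective-precipitation x cloud-top-height weight, then
        split each column over model layers with a normalized vertical profile."""
        weight = conv_precip * np.clip(cloud_top, 0.0, None)   # (ny, nx)
        weight = weight / weight.sum()
        profile = np.asarray(profile) / np.sum(profile)        # (nz,)
        column_totals = total_no_moles * weight                # (ny, nx)
        # result[k, j, i] = emissions in layer k of column (j, i)
        return profile[:, None, None] * column_totals[None, :, :]

    # Toy 2x2 domain with a 4-layer, roughly bimodal vertical profile
    emis = allocate_lightning_no(
        1000.0,
        conv_precip=np.array([[0.0, 2.0], [1.0, 3.0]]),
        cloud_top=np.array([[8.0, 12.0], [10.0, 14.0]]),
        profile=[0.1, 0.4, 0.1, 0.4],
    )
    print(emis.sum())   # ~1000.0, the domain total is conserved
    ```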

  15. Linking Air Quality and Human Health Effects Models: An Application to the Los Angeles Air Basin

    PubMed Central

    Stewart, Devoun R; Saunders, Emily; Perea, Roberto A; Fitzgerald, Rosa; Campbell, David E; Stockwell, William R

    2017-01-01

    Proposed emission control strategies for reducing ozone and particulate matter are evaluated better when air quality and health effects models are used together. The Community Multiscale Air Quality (CMAQ) model is the US Environmental Protection Agency’s model for determining public policy and forecasting air quality. CMAQ was used to forecast air quality changes due to several emission control strategies that could be implemented between 2008 and 2030 for the South Coast Air Basin that includes Los Angeles. The Environmental Benefits Mapping and Analysis Program—Community Edition (BenMAP-CE) was used to estimate health and economic impacts of the different emission control strategies based on CMAQ simulations. BenMAP-CE is a computer program based on epidemiologic studies that link human health and air quality. This modeling approach is better for determining optimum public policy than approaches that only examine concentration changes. PMID:29162976

  16. A data-driven approach to quality risk management

    PubMed Central

    Alemayehu, Demissie; Alvir, Jose; Levenstein, Marcia; Nickerson, David

    2013-01-01

    Aim: An effective clinical trial strategy to ensure patient safety as well as trial quality and efficiency involves an integrated approach, including prospective identification of risk factors, mitigation of the risks through proper study design and execution, and assessment of quality metrics in real-time. Such an integrated quality management plan may also be enhanced by using data-driven techniques to identify risk factors that are most relevant in predicting quality issues associated with a trial. In this paper, we illustrate such an approach using data collected from actual clinical trials. Materials and Methods: Several statistical methods were employed, including the Wilcoxon rank-sum test and logistic regression, to identify the presence of association between risk factors and the occurrence of quality issues, applied to data on quality of clinical trials sponsored by Pfizer. Results: Only a subset of the risk factors had a significant association with quality issues, and included: whether the study used a placebo, whether an agent was a biologic, unusual packaging label, complex dosing, and over 25 planned procedures. Conclusion: Proper implementation of the strategy can help to optimize resource utilization without compromising trial integrity and patient safety. PMID:24312890
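
    A minimal sketch of this kind of risk-factor screening is shown below, using synthetic trial-level data and the two methods named in the abstract (Wilcoxon rank-sum test and logistic regression); the column names and data are hypothetical, not Pfizer's.

    ```python
    import numpy as np
    import pandas as pd
    from scipy.stats import ranksums
    import statsmodels.api as sm

    # Hypothetical trial-level data: one row per study
    rng = np.random.default_rng(0)
    df = pd.DataFrame({
        "quality_issue": rng.integers(0, 2, 200),           # 1 = issue occurred
        "planned_procedures": rng.integers(5, 40, 200),     # continuous factor
        "uses_placebo": rng.integers(0, 2, 200),            # binary factor
    })

    # Wilcoxon rank-sum: do trials with issues have more planned procedures?
    with_issue = df.loc[df.quality_issue == 1, "planned_procedures"]
    without = df.loc[df.quality_issue == 0, "planned_procedures"]
    print(ranksums(with_issue, without))

    # Logistic regression of issue occurrence on candidate risk factors
    X = sm.add_constant(df[["planned_procedures", "uses_placebo"]])
    print(sm.Logit(df["quality_issue"], X).fit(disp=0).summary())
    ```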

  17. A data-driven approach to quality risk management.

    PubMed

    Alemayehu, Demissie; Alvir, Jose; Levenstein, Marcia; Nickerson, David

    2013-10-01

    An effective clinical trial strategy to ensure patient safety as well as trial quality and efficiency involves an integrated approach, including prospective identification of risk factors, mitigation of the risks through proper study design and execution, and assessment of quality metrics in real-time. Such an integrated quality management plan may also be enhanced by using data-driven techniques to identify risk factors that are most relevant in predicting quality issues associated with a trial. In this paper, we illustrate such an approach using data collected from actual clinical trials. Several statistical methods were employed, including the Wilcoxon rank-sum test and logistic regression, to identify the presence of association between risk factors and the occurrence of quality issues, applied to data on quality of clinical trials sponsored by Pfizer. Only a subset of the risk factors had a significant association with quality issues, and included: whether the study used a placebo, whether an agent was a biologic, unusual packaging label, complex dosing, and over 25 planned procedures. Proper implementation of the strategy can help to optimize resource utilization without compromising trial integrity and patient safety.

  18. Improving Healthcare Quality in the United States: A New Approach.

    PubMed

    Nix, Kathryn A; O'Shea, John S

    2015-06-01

    Improving the quality of health care has been a focus of health reformers during the last 2 decades, yet meaningful and sustainable quality improvement has remained elusive in many ways. Although a number of individual institutions have made great strides toward more effective and efficient care, progress has not gone far enough on a national scale. Barriers to quality of care lie in fundamental, systemwide factors that impede large-scale change. Notable among these is the third-party financing arrangement that dominates the healthcare system. Long-term goals for healthcare reform should address this barrier to higher quality of care. A new model for healthcare financing that includes patient awareness of the cost of care will encourage better quality and reduced spending by engaging patients in the pursuit of value, aligning incentives for insurers to reduce costs with patients' desire to receive excellent care, and holding providers accountable for the quality and cost of the care they provide. Several new programs implemented under the Patient Protection and Affordable Care Act aim to catalyze improvement in the quality of care, but the law takes the wrong approach, directing incentives at providers only and maintaining a system that excludes patients from the search for high-value care.

  19. Uncertainty estimation of a complex water quality model: The influence of Box-Cox transformation on Bayesian approaches and comparison with a non-Bayesian method

    NASA Astrophysics Data System (ADS)

    Freni, Gabriele; Mannina, Giorgio

    In urban drainage modelling, uncertainty analysis is of undoubted necessity. However, uncertainty analysis in urban water-quality modelling is still in its infancy and only a few studies have been carried out. Therefore, several methodological aspects still need to be explored and clarified, especially regarding water quality modelling. The use of the Bayesian approach for uncertainty analysis has been stimulated by its rigorous theoretical framework and by the possibility of evaluating the impact of new knowledge on the modelling predictions. Nevertheless, the Bayesian approach relies on some restrictive hypotheses that are not present in less formal methods like the Generalised Likelihood Uncertainty Estimation (GLUE). One crucial point in the application of the Bayesian method is the formulation of a likelihood function that is conditioned by the hypotheses made regarding model residuals. Statistical transformations, such as the Box-Cox transformation, are generally used to ensure the homoscedasticity of residuals. However, this practice may affect the reliability of the analysis, leading to a wrong uncertainty estimation. The present paper aims to explore the influence of the Box-Cox transformation on uncertainty estimation for environmental water quality models. To this end, five cases were considered, one of which was the “real” residual distribution (i.e. drawn from available data). The analysis was applied to the Nocella experimental catchment (Italy), which is an agricultural and semi-urbanised basin where two sewer systems, two wastewater treatment plants and a river reach were monitored during both dry and wet weather periods. The results show that the uncertainty estimation is greatly affected by residual transformation and a wrong assumption may also affect the evaluation of model uncertainty. The use of less formal methods always provides an overestimation of modelling uncertainty with respect to the Bayesian method, but such effect is reduced if a wrong assumption is made regarding the
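
    For reference, the Box-Cox transformation at issue is shown below, together with the usual maximum-likelihood estimate of its exponent; the concentration values are invented and serve only to show how residuals change under the transformation.

    ```python
    import numpy as np
    from scipy import stats

    def box_cox(y, lam):
        """Box-Cox transform: (y**lam - 1)/lam for lam != 0, log(y) for lam == 0.
        y must be strictly positive (concentrations usually are)."""
        y = np.asarray(y, dtype=float)
        return np.log(y) if lam == 0 else (y ** lam - 1.0) / lam

    # Hypothetical observed vs. modelled concentrations (mg/L)
    observed = np.array([1.2, 3.5, 0.8, 7.1, 2.4, 5.9])
    modelled = np.array([1.0, 4.1, 1.1, 6.0, 2.0, 6.5])

    # Estimate lambda from the observations, then compare residuals in both spaces
    _, lam = stats.boxcox(observed)
    raw_residuals = observed - modelled
    transformed_residuals = box_cox(observed, lam) - box_cox(modelled, lam)
    print(lam, raw_residuals.var(), transformed_residuals.var())
    ```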

  20. Innovations in projecting emissions for air quality modeling ...

    EPA Pesticide Factsheets

    Air quality modeling is used in setting air quality standards and in evaluating their costs and benefits. Historically, modeling applications have projected emissions and the resulting air quality only 5 to 10 years into the future. Recognition that the choice of air quality management strategy has climate change implications is encouraging longer modeling time horizons. However, for multi-decadal time horizons, many questions about future conditions arise. For example, will current population, economic, and land use trends continue, or will we see shifts that may alter the spatial and temporal pattern of emissions? Similarly, will technologies such as building-integrated solar photovoltaics, battery storage, electric vehicles, and CO2 capture emerge as disruptive technologies - shifting how we produce and use energy - or will these technologies achieve only niche markets and have little impact? These are some of the questions that are being evaluated by researchers within the U.S. EPA’s Office of Research and Development. In this presentation, Dr. Loughlin will describe a range of analytical approaches that are being explored. These include: (i) the development of alternative scenarios of the future that can be used to evaluate candidate management strategies over wide-ranging conditions, (ii) the application of energy system models to project emissions decades into the future and to assess the environmental implications of new technologies, (iii) and methodo

  1. Modelling the effect of wildfire on forested catchment water quality using the SWAT model

    NASA Astrophysics Data System (ADS)

    Yu, M.; Bishop, T.; van Ogtrop, F. F.; Bell, T.

    2016-12-01

    Wildfire removes the surface vegetation, releases ash, and increases erosion and runoff, and therefore affects the hydrological cycle of a forested water catchment. It is important to understand this change and how the catchment recovers. These processes are spatially sensitive and affected by interactions between fire severity and hillslope, soil type and surface vegetation conditions. Thus, a distributed hydrological modelling approach is required. In this study, the Soil and Water Assessment Tool (SWAT) is used to predict the effect of the 2001/02 Sydney wildfire on catchment water quality. Ten years of pre-fire data are used to create and calibrate the SWAT model. The calibrated model was then used to simulate water quality for the 10-year post-fire period without the fire effect. The simulated water quality data are compared with recorded water quality data provided by the Sydney Catchment Authority. The mean changes in flow, total suspended solids, total nitrate and total phosphate are compared on monthly, three-month, six-month and annual bases. Two control catchments and three burnt catchments were analysed.

  2. The Air Quality Model Evaluation International Initiative ...

    EPA Pesticide Factsheets

    This presentation provides an overview of the Air Quality Model Evaluation International Initiative (AQMEII). It contains a synopsis of the three phases of AQMEII, including objectives, logistics, and timelines. It also provides a number of examples of analyses conducted through AQMEII with a particular focus on past and future analyses of deposition. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  3. Modeling subjective evaluation of soundscape quality in urban open spaces: An artificial neural network approach.

    PubMed

    Yu, Lei; Kang, Jian

    2009-09-01

    This research aims to explore the feasibility of using computer-based models to predict the soundscape quality evaluation of potential users in urban open spaces at the design stage. With the data from large scale field surveys in 19 urban open spaces across Europe and China, the importance of various physical, behavioral, social, demographical, and psychological factors for the soundscape evaluation has been statistically analyzed. Artificial neural network (ANN) models have then been explored at three levels. It has been shown that for both subjective sound level and acoustic comfort evaluation, a general model for all the case study sites is less feasible due to the complex physical and social environments in urban open spaces; models based on individual case study sites perform well but the application range is limited; and specific models for certain types of location/function would be reliable and practical. The performance of acoustic comfort models is considerably better than that of sound level models. Based on the ANN models, soundscape quality maps can be produced and this has been demonstrated with an example.

  4. Quality Assurance in Post-Secondary Education: Some Common Approaches

    ERIC Educational Resources Information Center

    Law, Dennis Chung Sea

    2010-01-01

    Purpose: The common approaches to quality assurance (QA), as practiced by most post-secondary education institutions for internal quality monitoring and most QA authorities for external quality monitoring (EQM), have been considered by many researchers as having largely failed to address the essence of educational quality. The purpose of this…

  5. Modeling Approaches in Planetary Seismology

    NASA Technical Reports Server (NTRS)

    Weber, Renee; Knapmeyer, Martin; Panning, Mark; Schmerr, Nick

    2014-01-01

    Of the many geophysical means that can be used to probe a planet's interior, seismology remains the most direct. Given that the seismic data gathered on the Moon over 40 years ago revolutionized our understanding of the Moon and are still being used today to produce new insight into the state of the lunar interior, it is no wonder that many future missions, both real and conceptual, plan to take seismometers to other planets. To best facilitate the return of high-quality data from these instruments, as well as to further our understanding of the dynamic processes that modify a planet's interior, various modeling approaches are used to quantify parameters such as the amount and distribution of seismicity, tidal deformation, and seismic structure on and of the terrestrial planets. In addition, recent advances in wavefield modeling have permitted a renewed look at seismic energy transmission and the effects of attenuation and scattering, as well as the presence and effect of a core, on recorded seismograms. In this chapter, we will review these approaches.

  6. A new approach to the tradeoff between quality and accessibility of health care.

    PubMed

    Tanke, Marit A C; Ikkersheim, David E

    2012-05-01

    Quality of care is associated with patient volume. Regionalization of care is therefore one of the approaches that is suited to improve quality of care. A disadvantage of regionalization is that the accessibility of the facilities can decrease. By investigating the tradeoff between quality and accessibility it is possible to determine the optimal number of treatment locations in a health care system. In this article we present a new model to quantitatively 'solve' this tradeoff. We use the condition breast cancer in the Netherlands as an example. We calculated the expected quality gains in quality-adjusted life years (QALYs) due to stepwise regionalization using 'volume-outcome' literature for breast cancer. Decreased accessibility was operationalized as increased (travel) costs due to regionalization by using demographic data, drive-time information, and the national median income. The total sum of the quality and accessibility function determines the optimum range of treatment locations for this particular condition, given the 'volume-quality' relationship and Dutch demographics and geography. Currently, 94 locations offer breast cancer treatment in the Netherlands. Our model estimates that the optimum range of treatment locations for this particular condition in the Netherlands varies from 15 locations to 44 locations. Our study shows that Dutch society would benefit from regionalization of breast cancer care as the possible quality gains outweigh the increased travel costs. In addition, this model can be used for other medical conditions and in other countries. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
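
    The tradeoff can be illustrated with a simple net-benefit calculation: monetized QALY gains from regionalization minus the added travel costs, maximized over the number of treatment locations. The functional forms and numbers below are hypothetical stand-ins for the paper's estimates, chosen only to produce an interior optimum.

    ```python
    def net_benefit(n_locations, qaly_gain, travel_cost, value_per_qaly=80_000.0):
        """Net societal benefit of concentrating care in n_locations:
        monetized QALY gains from higher volumes minus extra travel costs.
        Both input functions are hypothetical stand-ins for the paper's estimates."""
        return qaly_gain(n_locations) * value_per_qaly - travel_cost(n_locations)

    # Illustrative shapes: quality gains shrink as more sites are kept open,
    # while total travel costs fall as the number of sites grows.
    qaly_gain = lambda n: 400.0 * (1.0 - n / 94.0)   # QALYs/year vs. the 94-site status quo
    travel_cost = lambda n: 3.0e8 / n                # cost units/year

    candidates = range(5, 95)
    best = max(candidates, key=lambda n: net_benefit(n, qaly_gain, travel_cost))
    print(best, net_benefit(best, qaly_gain, travel_cost))
    ```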

  7. A slow fashion design model for bluejeans using house of quality approach

    NASA Astrophysics Data System (ADS)

    Nergis, B.; Candan, C.; Sarısaltık, S.; Seneloglu, N.; Bozuk, R.; Amzayev, K.

    2017-10-01

    The purpose of this study was to develop a slow fashion design model using the house of quality (HOQ) model to provide fashion designers with a tool to improve the overall sustainability of denim jeans for Generation Y consumers in the Turkish market. To this end, a survey was conducted to collect data on the design and performance expectations of the targeted consumer group, as well as their perception of slow fashion in the design process of denim jeans. The results showed that Generation Y consumers in this market attached the most importance to sustainable production techniques when identifying slow fashion.

  8. Multisensor satellite data for water quality analysis and water pollution risk assessment: decision making under deep uncertainty with fuzzy algorithm in framework of multimodel approach

    NASA Astrophysics Data System (ADS)

    Kostyuchenko, Yuriy V.; Sztoyka, Yulia; Kopachevsky, Ivan; Artemenko, Igor; Yuschenko, Maxim

    2017-10-01

    A multi-model approach for remote sensing data processing and interpretation is described. The problem of satellite data utilization in a multi-model approach for socio-ecological risk assessment is formally defined, and a method for using observation, measurement and modeling data within that framework is described. The methodology and models for risk assessment in a decision support framework are then defined. A method of water quality assessment using satellite observation data is presented, based on the analysis of the spectral reflectance of water areas. Spectral signatures of freshwater bodies and offshore areas are analyzed, and correlations between spectral reflectance, pollution and selected water quality parameters are quantified. Data from the MODIS, MISR, AIRS and Landsat sensors acquired in 2002-2014 have been utilized and verified by in-field spectrometry and laboratory measurements. A fuzzy-logic-based approach for decision support on water quality degradation risk is discussed: the decision on the water quality category is made with a fuzzy algorithm using a limited set of uncertain parameters, drawing on satellite observations, field measurements and modeling. It is shown that this algorithm allows estimation of the water quality degradation rate and pollution risks. Problems in constructing the spatial and temporal distributions of the calculated parameters, as well as the problem of data regularization, are discussed. Using the proposed approach, maps of surface water pollution risk from point and diffuse sources are calculated and discussed.
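
    The fuzzy classification step can be sketched as follows, with triangular membership functions over a single reflectance-derived pollution indicator; the categories and breakpoints are illustrative, not the study's calibrated rules.

    ```python
    import numpy as np

    def tri(x, a, b, c):
        """Triangular membership function peaking at b on support [a, c]."""
        return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

    def water_quality_category(indicator):
        """Assign a category by maximum membership. The indicator (e.g. a
        reflectance-derived chlorophyll/turbidity proxy) and the breakpoints
        are illustrative, not the study's calibrated values."""
        memberships = {
            "good":     tri(indicator, -0.1, 0.0, 0.4),
            "moderate": tri(indicator,  0.2, 0.5, 0.8),
            "degraded": tri(indicator,  0.6, 1.0, 1.1),
        }
        return max(memberships, key=memberships.get), memberships

    print(water_quality_category(0.55))
    ```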

  9. Identification of watershed priority management areas under water quality constraints: A simulation-optimization approach with ideal load reduction

    NASA Astrophysics Data System (ADS)

    Dong, Feifei; Liu, Yong; Wu, Zhen; Chen, Yihui; Guo, Huaicheng

    2018-07-01

    Targeting nonpoint source (NPS) pollution hot spots is of vital importance for placement of best management practices (BMPs). Although physically-based watershed models have been widely used to estimate nutrient emissions, connections between nutrient abatement and compliance with water quality standards have rarely been considered in NPS hotspot ranking, which may lead to ineffective decision-making. It is critical to develop a strategy to identify priority management areas (PMAs) based on water quality response to nutrient load mitigation. A water quality constrained PMA identification framework was thereby proposed in this study, based on the simulation-optimization approach with ideal load reduction (ILR-SO). It integrates the physically-based Soil and Water Assessment Tool (SWAT) model and an optimization model under constraints of site-specific water quality standards. To our knowledge, it was the first effort to identify PMAs with simulation-based optimization. The SWAT model was established to simulate temporal and spatial nutrient loading and evaluate the effectiveness of pollution mitigation. A metamodel was trained to establish a quantitative relationship between sources and water quality. Ranking of priority areas is based on the nutrient load reduction required in each sub-watershed to satisfy water quality standards in the receiving waterbodies, calculated with a genetic algorithm (GA). The proposed approach was used for identification of PMAs on the basis of diffuse total phosphorus (TP) in the watershed of Lake Dianchi, one of the three most eutrophic large lakes in China. The modeling results demonstrated that 85% of diffuse TP came from 30% of the watershed area. Compared with the two conventional targeting strategies based on overland nutrient loss and instream nutrient loading, the ILR-SO model identified distinct PMAs and narrowed down the coverage of management areas. This study addressed the urgent need to incorporate water quality response into PMA

  10. The choices, choosing model of quality of life: description and rationale.

    PubMed

    Gurland, Barry J; Gurland, Roni V

    2009-01-01

    This introductory paper offers a critical review of current models and measures of quality of life, and describes a choices and choosing (c-c) process as a new model of quality of life. Criteria are proposed for judging the relative merits of models of quality of life with preference being given to explicit mechanisms, linkages to a science base, a means of identifying deficits amenable to rational restorative interventions, and with embedded values of the whole person. A conjectured model, based on the processes of gaining access to choices and choosing among them, matches the proposed criteria. The c-c process is an evolved adaptive mechanism dedicated to the pursuit of quality of life, driven by specific biological and psychological systems, and influenced by social and environmental forces. This model strengthens the science base for the field of quality of life, unifies approaches to concept and measurement, and guides the evaluation of impairments of quality of life. Corresponding interventions can be aimed at relieving restrictions or distortions of the c-c process; thus helping people to preserve and improve their quality of life. RELATED WORK: Companion papers detail relevant aspects of the science base, present methods of identifying deficits and distortions of the c-c model so as to open opportunities for rational restorative interventions, and explore empirical analyses of the relationship between health imposed restrictions of c-c and conventional indicators of diminished quality of life. [corrected] (c) 2008 John Wiley & Sons, Ltd.

  11. Innovative United Kingdom Approaches To Measuring Service Quality.

    ERIC Educational Resources Information Center

    Winkworth, Ian

    2001-01-01

    Reports on approaches to measuring the service quality of academic libraries in the United Kingdom. Discusses the role of government and the national background of quality measurement; measurement frameworks; better use of statistics; benchmarking; measuring user satisfaction; and possible future development. (Author/LRW)

  12. Toward an integrated approach to nutritional quality, environmental sustainability, and economic viability: research and measurement gaps.

    PubMed

    Herforth, Anna; Frongillo, Edward A; Sassi, Franco; Mclean, Mireille Seneclauze; Arabi, Mandana; Tirado, Cristina; Remans, Roseline; Mantilla, Gilma; Thomson, Madeleine; Pingali, Prabhu

    2014-12-01

    Nutrition is affected by numerous environmental and societal causes. This paper starts with a simple framework based on three domains: nutritional quality, economic viability, and environmental sustainability, and calls for an integrated approach in research to simultaneously account for all three. It highlights limitations in the current understanding of each domain, and how they influence one another. Five research topics are identified: measuring the three domains (nutritional quality, economic viability, environmental sustainability); modeling across disciplines; furthering the analysis of food systems in relation to the three domains; connecting climate change and variability to nutritional quality; and increasing attention to inequities among population groups in relation to the three domains. For an integrated approach to be developed, there is a need to identify and disseminate available metrics, modeling techniques, and tools to researchers, practitioners, and policy makers. This is a first step so that a systems approach that takes into account potential environmental and economic trade-offs becomes the norm in analyzing nutrition and food-security patterns. Such an approach will help fill critical knowledge gaps and will guide researchers seeking to define and address specific research questions in nutrition in their wider socioeconomic and environmental contexts. © 2014 New York Academy of Sciences.

  13. Merging Digital Surface Models Implementing Bayesian Approaches

    NASA Astrophysics Data System (ADS)

    Sadeq, H.; Drummond, J.; Li, Z.

    2016-06-01

    In this research different DSMs from different sources have been merged. The merging is based on a probabilistic model using a Bayesian Approach. The implemented data have been sourced from very high resolution satellite imagery sensors (e.g. WorldView-1 and Pleiades). It is deemed preferable to use a Bayesian Approach when the data obtained from the sensors are limited and it is difficult to obtain many measurements or it would be very costly, thus the problem of the lack of data can be solved by introducing a priori estimations of data. To infer the prior data, it is assumed that the roofs of the buildings are specified as smooth, and for that purpose local entropy has been implemented. In addition to the a priori estimations, GNSS RTK measurements have been collected in the field which are used as check points to assess the quality of the DSMs and to validate the merging result. The model has been applied in the West-End of Glasgow containing different kinds of buildings, such as flat roofed and hipped roofed buildings. Both quantitative and qualitative methods have been employed to validate the merged DSM. The validation results have shown that the model was successfully able to improve the quality of the DSMs and improving some characteristics such as the roof surfaces, which consequently led to better representations. In addition to that, the developed model has been compared with the well established Maximum Likelihood model and showed similar quantitative statistical results and better qualitative results. Although the proposed model has been applied on DSMs that were derived from satellite imagery, it can be applied to any other sourced DSMs.
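
    A minimal per-cell version of the Bayesian fusion idea is inverse-variance (precision-weighted) averaging of the input DSMs, optionally including a prior surface as an extra Gaussian term; this is a simplified stand-in for the full model described above, with invented heights and uncertainties.

    ```python
    import numpy as np

    def fuse_dsm(heights, variances, prior_mean=None, prior_var=None):
        """Per-cell Gaussian (Bayesian) fusion of several DSMs: the posterior mean
        is the precision-weighted average of the observations and, optionally, a
        prior surface. heights, variances: arrays of shape (n_dsm, ny, nx);
        prior_mean, prior_var: optional arrays of shape (ny, nx)."""
        precision = 1.0 / np.asarray(variances, dtype=float)
        weighted = precision * np.asarray(heights, dtype=float)
        if prior_mean is not None:
            pm = np.asarray(prior_mean, dtype=float)
            pv = np.asarray(prior_var, dtype=float)
            precision = np.concatenate([precision, (1.0 / pv)[None]], axis=0)
            weighted = np.concatenate([weighted, (pm / pv)[None]], axis=0)
        posterior_var = 1.0 / precision.sum(axis=0)
        posterior_mean = weighted.sum(axis=0) * posterior_var
        return posterior_mean, posterior_var

    # Two toy 2x2 DSM tiles with different per-cell uncertainties (m, m^2)
    h = np.array([[[10.0, 12.0], [11.0, 13.0]],
                  [[10.4, 11.6], [11.5, 12.7]]])
    v = np.array([[[0.25, 0.25], [0.25, 0.25]],
                  [[1.00, 0.25], [0.50, 1.00]]])
    print(fuse_dsm(h, v)[0])
    ```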

  14. AQMEII: A New International Initiative on Air Quality Model Evaluation

    EPA Science Inventory

    We provide a conceptual view of the process of evaluating regional-scale three-dimensional numerical photochemical air quality modeling systems, based on an examination of existing approaches to the evaluation of such systems as they are currently used in a variety of application....

  15. AN ECOEPIDEMIOLOGICAL APPROACH FOR DEVELOPING WATER QUALITY CRITERIA

    EPA Science Inventory

    The USEPA's Draft Framework for Developing Suspended and Bedded Sediments Water Quality Criteria is based on an ecoepidemiological approach that is potentially applicable to any chemical or non-chemical agent. An ecoepidemiological approach infers associations from the co-occurre...

  16. Electromagnetic forward modelling for realistic Earth models using unstructured tetrahedral meshes and a meshfree approach

    NASA Astrophysics Data System (ADS)

    Farquharson, C.; Long, J.; Lu, X.; Lelievre, P. G.

    2017-12-01

    Real-life geology is complex, and so, even when allowing for the diffusive, low resolution nature of geophysical electromagnetic methods, we need Earth models that can accurately represent this complexity when modelling and inverting electromagnetic data. This is particularly the case for the scales, detail and conductivity contrasts involved in mineral and hydrocarbon exploration and development, but also for the larger scale of lithospheric studies. Unstructured tetrahedral meshes provide a flexible means of discretizing a general, arbitrary Earth model. This is important when wanting to integrate a geophysical Earth model with a geological Earth model parameterized in terms of surfaces. Finite-element and finite-volume methods can be derived for computing the electric and magnetic fields in a model parameterized using an unstructured tetrahedral mesh. A number of such variants have been proposed and have proven successful. However, the efficiency and accuracy of these methods can be affected by the "quality" of the tetrahedral discretization, that is, how many of the tetrahedral cells in the mesh are long, narrow and pointy. This is particularly the case if one wants to use an iterative technique to solve the resulting linear system of equations. One approach to deal with this issue is to develop sophisticated model and mesh building and manipulation capabilities in order to ensure that any mesh built from geological information is of sufficient quality for the electromagnetic modelling. Another approach is to investigate other methods of synthesizing the electromagnetic fields. One such example is a "meshfree" approach in which the electromagnetic fields are synthesized using a mesh that is distinct from the mesh used to parameterize the Earth model. There are then two meshes, one describing the Earth model and one used for the numerical mathematics of computing the fields. This means that there are no longer any quality requirements on the model mesh, which

  17. Elucidating hydraulic fracturing impacts on groundwater quality using a regional geospatial statistical modeling approach.

    PubMed

    Burton, Taylour G; Rifai, Hanadi S; Hildenbrand, Zacariah L; Carlton, Doug D; Fontenot, Brian E; Schug, Kevin A

    2016-03-01

    Hydraulic fracturing operations have been viewed as the cause of certain environmental issues including groundwater contamination. The potential for hydraulic fracturing to induce contaminant pathways in groundwater is not well understood since gas wells are completed while isolating the water table and the gas-bearing reservoirs lie thousands of feet below the water table. Recent studies have attributed groundwater contamination to poor well construction and leaks in the wellbore annulus due to ruptured wellbore casings. In this paper, a geospatial model of the Barnett Shale region was created using ArcGIS. The model was used for spatial analysis of groundwater quality data in order to determine if regional variations in groundwater quality, as indicated by various groundwater constituent concentrations, may be associated with the presence of hydraulically fractured gas wells in the region. The Barnett Shale reservoir pressure, completions data, and fracture treatment data were evaluated as predictors of groundwater quality change. Results indicated that elevated concentrations of certain groundwater constituents are likely related to natural gas production in the study area and that beryllium, in this formation, could be used as an indicator variable for evaluating fracturing impacts on regional groundwater quality. Results also indicated that gas well density and formation pressures correlate with changes in regional water quality, whereas proximity to gas wells, by itself, does not. The results also provided indirect evidence supporting the possibility that micro annular fissures serve as a pathway transporting fluids and chemicals from the fractured wellbore to the overlying groundwater aquifers. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Quality models for audiovisual streaming

    NASA Astrophysics Data System (ADS)

    Thang, Truong Cong; Kim, Young Suk; Kim, Cheon Seog; Ro, Yong Man

    2006-01-01

    Quality is an essential factor in multimedia communication, especially in compression and adaptation. Quality metrics can be divided into three categories: within-modality quality, cross-modality quality, and multi-modality quality. Most research has so far focused on within-modality quality. Moreover, quality is normally just considered from the perceptual perspective. In practice, content may be drastically adapted, even converted to another modality. In this case, we should consider the quality from the semantic perspective as well. In this work, we investigate multi-modality quality from the semantic perspective. To model the semantic quality, we apply the concept of the "conceptual graph", which consists of semantic nodes and relations between the nodes. As a typical multi-modality example, we focus on audiovisual streaming services. Specifically, we evaluate the amount of information conveyed by audiovisual content whose video and audio channels may be strongly degraded, or whose audio is even converted to text. In the experiments, we also consider the perceptual quality model of audiovisual content, so as to see the difference from the semantic quality model.

  19. New directions: Time for a new approach to modeling surface-atmosphere exchanges in air quality models?

    NASA Astrophysics Data System (ADS)

    Saylor, Rick D.; Hicks, Bruce B.

    2016-03-01

    Just as the exchange of heat, moisture and momentum between the Earth's surface and the atmosphere are critical components of meteorological and climate models, the surface-atmosphere exchange of many trace gases and aerosol particles is a vitally important process in air quality (AQ) models. Current state-of-the-art AQ models treat the emission and deposition of most gases and particles as separate model parameterizations, even though evidence has accumulated over time that the emission and deposition processes of many constituents are often two sides of the same coin, with the upward (emission) or downward (deposition) flux over a landscape depending on a range of environmental, seasonal and biological variables. In this note we argue that the time has come to integrate the treatment of these processes in AQ models to provide biological, physical and chemical consistency and improved predictions of trace gases and particles.
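
    The kind of unified treatment the authors call for is often expressed with a compensation-point (bidirectional) flux formulation, in which a single expression yields emission or deposition depending on the sign of the concentration difference; the exchange velocity and concentrations below are illustrative only.

    ```python
    def surface_atmosphere_flux(c_air, c_compensation, exchange_velocity):
        """Single bidirectional (compensation-point) flux expression of the kind
        the authors advocate: positive = emission, negative = deposition.
        F = v_ex * (chi_c - C_air); all values here are illustrative."""
        return exchange_velocity * (c_compensation - c_air)

    # Ammonia-like example: emission when ambient air is cleaner than the canopy,
    # deposition when the ambient concentration exceeds the compensation point.
    print(surface_atmosphere_flux(c_air=2.0, c_compensation=5.0, exchange_velocity=0.01))
    print(surface_atmosphere_flux(c_air=8.0, c_compensation=5.0, exchange_velocity=0.01))
    ```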

  20. APPLICATION OF A NEW LAND-SURFACE, DRY DEPOSITION, AND PBL MODEL IN THE MODELS-3 COMMUNITY MULTI-SCALE AIR QUALITY (CMAQ) MODEL SYSTEM

    EPA Science Inventory

    Like most air quality modeling systems, CMAQ divides the treatment of meteorological and chemical/transport processes into separate models run sequentially. A potential drawback to this approach is that it creates the illusion that these processes are minimally interdependent an...

  1. Metacognition: towards a new approach to quality of life.

    PubMed

    Blanc, Julien; Boyer, Laurent; Le Coz, Pierre; Auquier, Pascal

    2014-03-01

    Recent studies have demonstrated that various disease states (e.g., schizophrenia, Alzheimer's disease) and events (e.g., a stroke) alter a person's perception of their physical and mental status. Most often this involves alterations in a person's metacognitive capabilities, and this calls into question the conceptual model of quality of life (QoL) based on a "perspectivist" approach. Using the example of schizophrenia, we applied a philosophical model, developed by Griffin, to deal with this potential threat to the validity of QoL assessment. Patients with schizophrenia are at risk for being impaired in their ability to assess their QoL. We hypothesise that metacognition (i.e., the ability to attribute mental states in terms of beliefs and goals to one's self and others) is a formal condition to assess QoL. This particular skill is important because self-reflection is necessary for making a qualitative judgment. A link between this psychological concept and the philosophical concept of reflexivity may be established. We propose a conceptual approach to QoL that takes into account the patient's reflexivity. This approach is derived from Griffin's theory based on the list of "prudential values" and the satisfaction of the informed desires of the individual. The ability of patients to evaluate and value their life should be considered to enrich the concept of QoL. The approach derived from Griffin's theory might constitute a new avenue for QoL research.

  2. Population Modeling Approach to Optimize Crop Harvest Strategy. The Case of Field Tomato.

    PubMed

    Tran, Dinh T; Hertog, Maarten L A T M; Tran, Thi L H; Quyen, Nguyen T; Van de Poel, Bram; Mata, Clara I; Nicolaï, Bart M

    2017-01-01

    In this study, the aim is to develop a population model-based approach to optimize fruit harvesting strategies with regard to fruit quality and its derived economic value. This approach was applied to the case of tomato fruit harvesting under Vietnamese conditions. Fruit growth and development of tomato (cv. "Savior") was monitored in terms of fruit size and color during both the Vietnamese winter and summer growing seasons. A kinetic tomato fruit growth model was applied to quantify biological fruit-to-fruit variation in terms of their physiological maturation. This model was successfully calibrated. Finally, the model was extended to translate the fruit-to-fruit variation at harvest into the economic value of the harvested crop. It can be concluded that a model-based approach to the optimization of harvest date and harvest frequency with regard to the economic value of the crop as such is feasible. This approach allows growers to optimize their harvesting strategy by harvesting the crop at more uniform maturity stages, meeting the stringent retail demands for homogeneous high-quality product. The total farm profit would still depend on the impact a change in harvesting strategy might have on related expenditures. This model-based harvest optimisation approach can be easily transferred to other fruit and vegetable crops, improving homogeneity of the postharvest product streams.

  3. Client satisfaction with reproductive health-care quality: integrating business approaches to modeling and measurement.

    PubMed

    Alden, Dana L; Do, Mai Hoa; Bhawuk, Dharm

    2004-12-01

    Health-care managers are increasingly interested in client perceptions of clinic service quality and satisfaction. While tremendous progress has occurred, additional perspectives on the conceptualization, modeling and measurement of these constructs may further assist health-care managers seeking to provide high-quality care. To that end, this study draws on theories from business and health to develop an integrated model featuring antecedents to and consequences of reproductive health-care client satisfaction. In addition to developing a new model, this study contributes by testing how well Western-based theories of client satisfaction hold in a developing, Asian country. Applied to urban, reproductive health clinic users in Hanoi, Vietnam, test results suggest that hypothesized antecedents such as pre-visit expectations, perceived clinic performance and how much performance exceeds expectations impact client satisfaction. However, the relative importance of these predictors appears to vary depending on a client's level of service-related experience. Finally, higher levels of client satisfaction are positively related to future clinic use intentions. This study demonstrates the value of: (1) incorporating theoretical perspectives from multiple disciplines to model processes underlying health-care satisfaction and (2) field testing those models before implementation. It also furthers research designed to provide health-care managers with actionable measures of the complex processes related to their clients' satisfaction.

  4. A quality risk management model approach for cell therapy manufacturing.

    PubMed

    Lopez, Fabio; Di Bartolo, Chiara; Piazza, Tommaso; Passannanti, Antonino; Gerlach, Jörg C; Gridelli, Bruno; Triolo, Fabio

    2010-12-01

    International regulatory authorities view risk management as an essential production need for the development of innovative, somatic cell-based therapies in regenerative medicine. The available risk management guidelines, however, provide little guidance on specific risk analysis approaches and procedures applicable in clinical cell therapy manufacturing. This raises a number of problems. Cell manufacturing is a poorly automated process, prone to operator-introduced variations, and affected by heterogeneity of the processed organs/tissues and lot-dependent variability of reagent (e.g., collagenase) efficiency. In this study, the principal challenges faced in a cell-based product manufacturing context (i.e., high dependence on human intervention and absence of reference standards for acceptable risk levels) are identified and addressed, and a risk management model approach applicable to manufacturing of cells for clinical use is described for the first time. The use of the heuristic and pseudo-quantitative failure mode and effect analysis/failure mode and critical effect analysis risk analysis technique associated with direct estimation of severity, occurrence, and detection is, in this specific context, as effective as, but more efficient than, the analytic hierarchy process. Moreover, a severity/occurrence matrix and Pareto analysis can be successfully adopted to identify priority failure modes on which to act to mitigate risks. The application of this approach to clinical cell therapy manufacturing in regenerative medicine is also discussed. © 2010 Society for Risk Analysis.

  5. Data-base development for water-quality modeling of the Patuxent River basin, Maryland

    USGS Publications Warehouse

    Fisher, G.T.; Summers, R.M.

    1987-01-01

    Procedures and rationale used to develop a data base and data management system for the Patuxent Watershed Nonpoint Source Water Quality Monitoring and Modeling Program of the Maryland Department of the Environment and the U.S. Geological Survey are described. A detailed data base and data management system has been developed to facilitate modeling of the watershed for water quality planning purposes; statistical analysis; plotting of meteorologic, hydrologic and water quality data; and geographic data analysis. The system is Maryland's prototype for development of a basinwide water quality management program. A key step in the program is to build a calibrated and verified water quality model of the basin using the Hydrological Simulation Program--FORTRAN (HSPF) hydrologic model, which has been used extensively in large-scale basin modeling. The compilation of the substantial existing data base for preliminary calibration of the basin model, including meteorologic, hydrologic, and water quality data from federal and state data bases and a geographic information system containing digital land use and soils data is described. The data base development is significant in its application of an integrated, uniform approach to data base management and modeling. (Lantz-PTT)

  6. Quality Improvement on the Acute Inpatient Psychiatry Unit Using the Model for Improvement

    PubMed Central

    Singh, Kuldeep; Sanderson, Joshua; Galarneau, David; Keister, Thomas; Hickman, Dean

    2013-01-01

    Background: A need exists for constant evaluation and modification of processes within healthcare systems to achieve quality improvement. One common approach is the Model for Improvement that can be used to clearly define aims, measures, and changes that are then implemented through a plan-do-study-act (PDSA) cycle. This approach is a commonly used method for improving quality in a wide range of fields. The Model for Improvement allows for a systematic process that can be revised at set time intervals to achieve a desired result. Methods: We used the Model for Improvement in an acute psychiatry unit (APU) to improve the screening incidence of abnormal involuntary movements in eligible patients—those starting or continuing on standing neuroleptics—with the Abnormal Involuntary Movement Scale (AIMS). Results: After 8 weeks of using the Model for Improvement, both of the participating inpatient services in the APU showed substantial overall improvement in screening for abnormal involuntary movements using the AIMS. Conclusion: Crucial aspects of a successful quality improvement initiative based on the Model for Improvement are well-defined goals, process measures, and structured PDSA cycles. Success also requires communication, organization, and participation of the entire team. PMID:24052768

  7. Quality improvement on the acute inpatient psychiatry unit using the model for improvement.

    PubMed

    Singh, Kuldeep; Sanderson, Joshua; Galarneau, David; Keister, Thomas; Hickman, Dean

    2013-01-01

    A need exists for constant evaluation and modification of processes within healthcare systems to achieve quality improvement. One common approach is the Model for Improvement that can be used to clearly define aims, measures, and changes that are then implemented through a plan-do-study-act (PDSA) cycle. This approach is a commonly used method for improving quality in a wide range of fields. The Model for Improvement allows for a systematic process that can be revised at set time intervals to achieve a desired result. We used the Model for Improvement in an acute psychiatry unit (APU) to improve the screening incidence of abnormal involuntary movements in eligible patients-those starting or continuing on standing neuroleptics-with the Abnormal Involuntary Movement Scale (AIMS). After 8 weeks of using the Model for Improvement, both of the participating inpatient services in the APU showed substantial overall improvement in screening for abnormal involuntary movements using the AIMS. Crucial aspects of a successful quality improvement initiative based on the Model for Improvement are well-defined goals, process measures, and structured PDSA cycles. Success also requires communication, organization, and participation of the entire team.

  8. A Systemwide Approach to Improving Early Childhood Program Quality in the Detroit Metropolitan Area. Final Report.

    ERIC Educational Resources Information Center

    Shouse, A. Clay; Epstein, Ann S.

    This document is the final report of the McGregor-funded High/Scope training initiative, a system-wide approach to improving the quality of early childhood programs in the Detroit metropolitan area. The 3-year project was based on the validated High/Scope educational approach and training model, which advocates hands-on active learning for both…

  9. Using "big data" to optimally model hydrology and water quality across expansive regions

    USGS Publications Warehouse

    Roehl, E.A.; Cook, J.B.; Conrads, P.A.

    2009-01-01

    This paper describes a new divide and conquer approach that leverages big environmental data, utilizing all available categorical and time-series data without subjectivity, to empirically model hydrologic and water-quality behaviors across expansive regions. The approach decomposes large, intractable problems into smaller ones that are optimally solved; decomposes complex signals into behavioral components that are easier to model with "sub-models"; and employs a sequence of numerically optimizing algorithms that include time-series clustering, nonlinear, multivariate sensitivity analysis and predictive modeling using multi-layer perceptron artificial neural networks, and classification for selecting the best sub-models to make predictions at new sites. This approach has many advantages over traditional modeling approaches, including being faster and less expensive, more comprehensive in its use of available data, and more accurate in representing a system's physical processes. This paper describes the application of the approach to model groundwater levels in Florida, stream temperatures across Western Oregon and Wisconsin, and water depths in the Florida Everglades. © 2009 ASCE.
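    A minimal sketch of the divide-and-conquer flavour of this approach, assuming scikit-learn and synthetic site data; clustering sites by their time-series behaviour and fitting one MLP per cluster stands in for the authors' full pipeline:

```python
# Cluster monitoring sites by their time-series behaviour, then fit one
# multi-layer perceptron "sub-model" per cluster -- a simplified stand-in
# for the decomposition strategy described above.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n_sites, n_weeks = 60, 104
series = rng.normal(size=(n_sites, n_weeks)).cumsum(axis=1)   # synthetic water levels
drivers = rng.normal(size=(n_sites, n_weeks, 3))               # e.g. rain, temp, pumping

clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(series)

sub_models = {}
for c in range(4):
    idx = np.where(clusters == c)[0]
    X = drivers[idx].reshape(-1, 3)          # stack (site, week) samples
    y = series[idx].reshape(-1)
    sub_models[c] = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                                 random_state=0).fit(X, y)

# Predict at a site by routing it to its behavioural cluster's sub-model.
site = 0
pred = sub_models[clusters[site]].predict(drivers[site])
print(pred[:5])
```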

  10. REVIEW OF THE GOVERNING EQUATIONS, COMPUTATIONAL ALGORITHMS, AND OTHER COMPONENTS OF THE MODELS-3 COMMUNITY MULTISCALE AIR QUALITY (CMAQ) MODELING SYSTEM

    EPA Science Inventory

    This article describes the governing equations, computational algorithms, and other components entering into the Community Multiscale Air Quality (CMAQ) modeling system. This system has been designed to approach air quality as a whole by including state-of-the-science capabiliti...

  11. Comparison of approaches to Total Quality Management. Including an examination of the Department of Energy's position on quality management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, C.T.

    1994-03-01

    This paper presents a comparison of several qualitatively different approaches to Total Quality Management (TQM). The continuum ranges from management approaches that are primarily standards -- with specific guidelines, but few theoretical concepts -- to approaches that are primarily philosophical, with few specific guidelines. The approaches to TQM discussed in this paper include the International Organization for Standardization (ISO) 9000 Standard, the Malcolm Baldrige National Quality Award, Senge's Learning Organization, Watkins and Marsick's approach to organizational learning, Covey's Seven Habits of Highly Effective People, and Deming's Fourteen Points for Management. Some of these approaches (Deming and ISO 9000) are then compared to the DOE's official position on quality management and conduct of operations (DOE Orders 5700.6C and 5480.19). Using a tabular format, it is shown that while 5700.6C (Quality Assurance) maps well to many of the current approaches to TQM, DOE's principal guide to management, Order 5480.19 (Conduct of Operations), has many significant conflicts with some of the modern approaches to continuous quality improvement.

  12. Accounting for and predicting the influence of spatial autocorrelation in water quality modeling

    NASA Astrophysics Data System (ADS)

    Miralha, L.; Kim, D.

    2017-12-01

    Although many studies have attempted to investigate the spatial trends of water quality, more attention is yet to be paid to the consequences of considering and ignoring the spatial autocorrelation (SAC) that exists in water quality parameters. Several studies have mentioned the importance of accounting for SAC in water quality modeling, as well as the differences in outcomes between models that account for and ignore SAC. However, the capacity to predict the magnitude of such differences is still ambiguous. In this study, we hypothesized that SAC inherently possessed by a response variable (i.e., water quality parameter) influences the outcomes of spatial modeling. We evaluated whether the level of inherent SAC is associated with changes in R², Akaike Information Criterion (AIC), and residual SAC (rSAC) after accounting for SAC during the modeling procedure. The main objective was to analyze if water quality parameters with higher Moran's I values (inherent SAC measure) undergo a greater increase in R² and a greater reduction in both AIC and rSAC. We compared a non-spatial model (OLS) to two spatial regression approaches (spatial lag and error models). Predictor variables were the principal components of topographic (elevation and slope), land cover, and hydrological soil group variables. We acquired these data from federal online sources (e.g. USGS). Ten watersheds were selected, each in a different state of the USA. Results revealed that water quality parameters with higher inherent SAC showed a substantial increase in R² and decrease in rSAC after performing spatial regressions. However, AIC values did not show significant changes. Overall, the higher the level of inherent SAC in water quality variables, the greater the improvement in model performance. This indicates a linear and direct relationship between the spatial model outcomes (R² and rSAC) and the degree of SAC in each water quality variable. Therefore, our study suggests that the inherent level of
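    A minimal sketch of the inherent-SAC diagnostic the analysis starts from, using a global Moran's I computed in plain NumPy with a row-standardised neighbour matrix; the five sub-watershed values are invented:

```python
# Global Moran's I for a water quality variable: values near +1 indicate
# strong positive spatial autocorrelation, values near 0 indicate none.
import numpy as np

def morans_i(y, W):
    """y: (n,) attribute values; W: (n, n) spatial weights with zero diagonal."""
    y = np.asarray(y, dtype=float)
    z = y - y.mean()
    n = y.size
    s0 = W.sum()
    return (n / s0) * (z @ W @ z) / (z @ z)

# Toy example: 5 sub-watersheds on a line, each neighbouring the next.
W = np.array([[0, 1, 0, 0, 0],
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
W = W / W.sum(axis=1, keepdims=True)        # row-standardise

tn = np.array([2.1, 2.3, 2.6, 3.0, 3.4])    # smoothly varying total nitrogen
print(morans_i(tn, W))                       # clearly positive (high inherent SAC)
```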

  13. Prediction of aircraft handling qualities using analytical models of the human pilot

    NASA Technical Reports Server (NTRS)

    Hess, R. A.

    1982-01-01

    The optimal control model (OCM) of the human pilot is applied to the study of aircraft handling qualities. Attention is focused primarily on longitudinal tasks. The modeling technique differs from previous applications of the OCM in that considerable effort is expended in simplifying the pilot/vehicle analysis. After briefly reviewing the OCM, a technique for modeling the pilot controlling higher order systems is introduced. Following this, a simple criterion for determining the susceptibility of an aircraft to pilot-induced oscillations (PIO) is formulated. Finally, a model-based metric for pilot rating prediction is discussed. The resulting modeling procedure provides a relatively simple, yet unified approach to the study of a variety of handling qualities problems.

  14. Prediction of aircraft handling qualities using analytical models of the human pilot

    NASA Technical Reports Server (NTRS)

    Hess, R. A.

    1982-01-01

    The optimal control model (OCM) of the human pilot is applied to the study of aircraft handling qualities. Attention is focused primarily on longitudinal tasks. The modeling technique differs from previous applications of the OCM in that considerable effort is expended in simplifying the pilot/vehicle analysis. After briefly reviewing the OCM, a technique for modeling the pilot controlling higher order systems is introduced. Following this, a simple criterion for determining the susceptibility of an aircraft to pilot induced oscillations is formulated. Finally, a model based metric for pilot rating prediction is discussed. The resulting modeling procedure provides a relatively simple, yet unified approach to the study of a variety of handling qualities problems.

  15. An Integrated Computer Modeling Environment for Regional Land Use, Air Quality, and Transportation Planning

    DOT National Transportation Integrated Search

    1997-04-01

    The Land Use, Air Quality, and Transportation Integrated Modeling Environment (LATIME) represents an integrated approach to computer modeling and simulation of land use allocation, travel demand, and mobile source emissions for the Albuquerque, New M...

  16. An examination of data quality on QSAR Modeling in regards ...

    EPA Pesticide Factsheets

    The development of QSAR models is critically dependent on the quality of available data. As part of our efforts to develop public platforms to provide access to predictive models, we have attempted to discriminate the influence of the quality versus quantity of data available to develop and validate QSAR models. We have focused our efforts on the widely used EPISuite software that was initially developed over two decades ago and, specifically, on the PHYSPROP dataset used to train the EPISuite prediction models. This presentation will review our approaches to examining key datasets, the delivery of curated data and the development of machine-learning models for thirteen separate property endpoints of interest to environmental science. We will also review how these data will be made freely accessible to the community via a new “chemistry dashboard”. This abstract does not reflect U.S. EPA policy. presentation at UNC-CH.

  17. Audiovisual quality estimation of mobile phone video cameras with interpretation-based quality approach

    NASA Astrophysics Data System (ADS)

    Radun, Jenni E.; Virtanen, Toni; Olives, Jean-Luc; Vaahteranoksa, Mikko; Vuori, Tero; Nyman, Göte

    2007-01-01

    We present an effective method for comparing subjective audiovisual quality and the features related to the quality changes of different video cameras. Both quantitative estimation of overall quality and qualitative description of critical quality features are achieved by the method. The aim was to combine two image quality evaluation methods, the quantitative Absolute Category Rating (ACR) method with hidden reference removal and the qualitative Interpretation-Based Quality (IBQ) method, in order to see how they complement each other in audiovisual quality estimation tasks. 26 observers estimated the audiovisual quality of six different cameras, mainly mobile phone video cameras. In order to achieve an efficient subjective estimation of audiovisual quality, only two contents with different quality requirements were recorded with each camera. The results show that the subjectively important quality features were more related to the overall estimations of the cameras' visual video quality than to the features related to sound. The data demonstrated two significant quality dimensions related to visual quality: darkness and sharpness. We conclude that the qualitative methodology can also complement quantitative quality estimations with audiovisual material. The IBQ approach is especially valuable when the induced quality changes are multidimensional.

  18. Measuring Service Quality in Higher Education: Development of a Hierarchical Model (HESQUAL)

    ERIC Educational Resources Information Center

    Teeroovengadum, Viraiyan; Kamalanabhan, T. J.; Seebaluck, Ashley Keshwar

    2016-01-01

    Purpose: This paper aims to develop and empirically test a hierarchical model for measuring service quality in higher education. Design/methodology/approach: The first phase of the study consisted of qualitative research methods and a comprehensive literature review, which allowed the development of a conceptual model comprising 53 service quality…

  19. On the validity of the incremental approach to estimate the impact of cities on air quality

    NASA Astrophysics Data System (ADS)

    Thunis, Philippe

    2018-01-01

    The question of to what extent cities are the sources of their own air pollution is not only theoretical: it is critical to the design of effective strategies for urban air quality planning. In this work, we assess the validity of the commonly used incremental approach to estimate the likely impact of cities on their air pollution. With the incremental approach, the city impact (i.e. the concentration change generated by the city emissions) is estimated as the concentration difference between a rural background and an urban background location, also known as the urban increment. We show that the city impact is in reality made up of the urban increment and two additional components, and consequently two assumptions need to be fulfilled for the urban increment to be representative of the urban impact. The first assumption is that the rural background location is not influenced by emissions from within the city, whereas the second requires that background concentration levels, obtained with zero city emissions, are equal at both locations. Because the urban impact is not measurable, the SHERPA modelling approach, based on a full air quality modelling system, is used in this work to assess the validity of these assumptions for some European cities. Results indicate that for PM2.5, these two assumptions are far from being fulfilled for many large and medium-sized cities. For these cities, urban increments largely underestimate city impacts. Although results are in better agreement for NO2, similar issues arise. In many situations the incremental approach is therefore not an adequate estimate of the urban impact on air pollution. This poses issues in terms of interpretation when these increments are used to define strategic options in terms of air quality planning. We finally illustrate the value of comparing modelled and measured increments to improve our confidence in the model results.
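    Spelling out the decomposition implied by the two assumptions, in our own notation rather than the paper's: write $C_u$ and $C_r$ for modelled concentrations at the urban and rural background locations, with a "no city" superscript for the run with zero city emissions.

```latex
\[
\underbrace{C_u - C_u^{\text{no city}}}_{\text{city impact}}
= \underbrace{\left(C_u - C_r\right)}_{\text{urban increment}}
+ \underbrace{\left(C_r - C_r^{\text{no city}}\right)}_{\text{city influence at the rural site}}
+ \underbrace{\left(C_r^{\text{no city}} - C_u^{\text{no city}}\right)}_{\text{difference in zero-city backgrounds}}
\]
```

    The urban increment equals the city impact only when the last two terms vanish, which is precisely what the two stated assumptions require.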

  20. Clinical governance: bridging the gap between managerial and clinical approaches to quality of care

    PubMed Central

    Buetow, S. A.; Roland, M.

    1999-01-01

    Clinical governance has been introduced as a new approach to quality improvement in the UK national health service. This article maps clinical governance against a discussion of the four main approaches to measuring and improving quality of care: quality assessment, quality assurance, clinical audit, and quality improvement (including continuous quality improvement). Quality assessment underpins each approach. Whereas clinical audit has, in general, been professionally led, managers have driven quality improvement initiatives. Quality assurance approaches have been perceived to be externally driven by managers or to involve professional inspection. It is discussed how clinical governance seeks to bridge these approaches. Clinical governance allows clinicians in the UK to lead a comprehensive strategy to improve quality within provider organisations, although with an expectation of greatly increased external accountability. Clinical governance aims to bring together managerial, organisational, and clinical approaches to improving quality of care. If successful, it will define a new type of professionalism for the next century. Failure by the professions to seize the opportunity is likely to result in increasingly detailed external control of clinical activity in the UK, as has occurred in some other countries. PMID:10847876

  1. Predicting health-related quality of life in cancer patients receiving chemotherapy: a structural equation approach using the self-control model.

    PubMed

    Park, Yu-Ri; Park, Eun-Young; Kim, Jung-Hee

    2017-11-09

    According to the self-control model, self-control works as a protective factor and a psychological resource. Although an understanding of the effect(s) of peripheral neuropathy on quality of life is important to healthcare professionals, previous studies do not facilitate broad comprehension in this regard. The purpose of this cross-sectional study was to test the multidimensional assumptions of quality of life of patients with cancer, with focus on their self-control. A structural equation model was tested on patients with cancer at the oncology clinic of a university hospital where patients received chemotherapy. A model was tested using structural equation modeling, which allows the researcher to find the empirical evidence by testing a measurement model and a structural model. The model comprised three variables, self-control, health related quality of life, and chemotherapy-induced peripheral neuropathy. Among the variables, self-control was the endogenous and mediating variable. The proposed models showed good fit indices. Self-control partially mediated chemotherapy-induced peripheral neuropathy and quality of life. It was found that the physical symptoms of peripheral neuropathy influenced health-related quality of life both indirectly and directly. Self-control plays a significant role in the protection and promotion of physical and mental health in various stressful situations, and thus, as a psychological resource, it plays a significant role in quality of life. Our results can be used to develop a quality of life model for patients receiving chemotherapy and as a theoretical foundation for the development of appropriate nursing interventions.

  2. PREDICTIVE UNCERTAINTY IN HYDROLOGIC AND WATER QUALITY MODELING: APPROACHES, APPLICATION TO ENVIRONMENTAL MANAGEMENT, AND FUTURE CHALLENGES

    EPA Science Inventory

    Extant process-based hydrologic and water quality models are indispensable to water resources planning and environmental management. However, models are only approximations of real systems and often calibrated with incomplete and uncertain data. Reliable estimates, or perhaps f...

  3. Approaches to measuring quality of the wilderness experience

    Treesearch

    William T. Borrie; Robert M. Birzell

    2001-01-01

    Wilderness is a special place that provides opportunity for unique and profound experiences. An essential task for the maintenance of these recreational opportunities is the definition and monitoring of experience quality. Four approaches to the measurement of the wilderness experience have developed in over 30 years of research: satisfaction approaches (which focus on...

  4. Approach to developing numeric water quality criteria for ...

    EPA Pesticide Factsheets

    Human activities on land increase nutrient loads to coastal waters, which can increase phytoplankton production and biomass and potentially cause harmful ecological effects. States can adopt numeric water quality criteria into their water quality standards to protect the designated uses of their coastal waters from eutrophication impacts. The first objective of this study was to provide an approach for developing numeric water quality criteria for coastal waters based on archived SeaWiFS ocean color satellite data. The second objective was to develop an approach for transferring water quality criteria assessments to newer ocean color satellites such as MODIS and MERIS. Spatial and temporal measures of SeaWiFS, MODIS, and MERIS chlorophyll-a (ChlRS-a, mg m-3) were resolved across Florida’s coastal waters between 1998 and 2009. Annual geometric means of SeaWiFS ChlRS-a were evaluated to determine a quantitative reference baseline from the 90th percentile of the annual geometric means. A method for transferring to multiple ocean color sensors was implemented with SeaWiFS as the reference instrument. The ChlRS-a annual geometric means for each coastal segment from MODIS and MERIS were regressed against SeaWiFS to provide a similar response among all three satellites. Standardization factors for each coastal segment were calculated based on differences between 90th percentiles from SeaWiFS to MODIS and SeaWiFS to MERIS. This transfer approach allowed for futu
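    A minimal sketch of the baseline and sensor-transfer calculations, assuming NumPy/SciPy and invented annual ChlRS-a values for one coastal segment:

```python
# Derive a reference baseline from SeaWiFS annual geometric means (90th
# percentile), then regress MODIS annual geometric means against SeaWiFS so
# assessments can be transferred to the newer sensor.
import numpy as np
from scipy.stats import linregress

def annual_geometric_mean(chl):
    """Geometric mean of chlorophyll-a retrievals for one segment-year."""
    chl = np.asarray(chl, dtype=float)
    return np.exp(np.log(chl).mean())

rng = np.random.default_rng(1)
years = 12
seawifs = np.array([annual_geometric_mean(rng.lognormal(0.8, 0.3, 40)) for _ in range(years)])
modis = 1.1 * seawifs + rng.normal(0, 0.05, years)    # synthetic sensor offset

baseline = np.percentile(seawifs, 90)                  # candidate numeric criterion
fit = linregress(seawifs, modis)                       # MODIS ~ SeaWiFS mapping
standardization = np.percentile(modis, 90) - baseline  # segment-specific offset

print(f"SeaWiFS 90th-percentile baseline: {baseline:.2f} mg m-3")
print(f"MODIS vs SeaWiFS slope: {fit.slope:.2f}, offset to apply: {standardization:.2f}")
```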

  5. Air Quality Modeling | Air Quality Planning & Standards | US ...

    EPA Pesticide Factsheets

    2016-06-08

    The basic mission of the Office of Air Quality Planning and Standards is to preserve and improve the quality of our nation's air. One facet of accomplishing this goal requires that new and existing air pollution sources be modeled for compliance with the National Ambient Air Quality Standards (NAAQS).

  6. Higher Education Quality Assessment Model: Towards Achieving Educational Quality Standard

    ERIC Educational Resources Information Center

    Noaman, Amin Y.; Ragab, Abdul Hamid M.; Madbouly, Ayman I.; Khedra, Ahmed M.; Fayoumi, Ayman G.

    2017-01-01

    This paper presents a developed higher education quality assessment model (HEQAM) that can be applied for enhancement of university services. This is because there is no universal unified quality standard model that can be used to assess the quality criteria of higher education institutes. The analytical hierarchy process is used to identify the…

  7. Modelling road accidents: An approach using structural time series

    NASA Astrophysics Data System (ADS)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir

    2014-09-01

    In this paper, the trend of road accidents in Malaysia for the years 2001 until 2012 was modelled using a structural time series approach. The structural time series model was identified using a stepwise method, and the residuals for each model were tested. The best-fitted model was chosen based on the smallest Akaike Information Criterion (AIC) and prediction error variance. In order to check the quality of the model, a data validation procedure was performed by predicting the monthly number of road accidents for the year 2012. Results indicate that the best specification of the structural time series model to represent road accidents is the local level with a seasonal model.
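    A minimal sketch of fitting and validating a "local level with seasonal" structural time series model, assuming statsmodels and a synthetic monthly series in place of the Malaysian accident counts:

```python
# Fit structural time series models, compare them by AIC, and validate the
# preferred specification by predicting the final 12 months.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
months = 144
season = 10 * np.sin(2 * np.pi * np.arange(months) / 12)
level = np.cumsum(rng.normal(0, 1, months)) + 400
y = level + season + rng.normal(0, 5, months)         # synthetic monthly accidents

train, test = y[:-12], y[-12:]

local_level = sm.tsa.UnobservedComponents(train, level='local level').fit(disp=False)
level_seasonal = sm.tsa.UnobservedComponents(train, level='local level',
                                             seasonal=12).fit(disp=False)

print("AIC local level:           ", round(local_level.aic, 1))
print("AIC local level + seasonal:", round(level_seasonal.aic, 1))

forecast = level_seasonal.forecast(steps=12)           # validate against held-out year
print("Mean absolute prediction error:", round(np.abs(forecast - test).mean(), 1))
```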

  8. Large-scale model quality assessment for improving protein tertiary structure prediction.

    PubMed

    Cao, Renzhi; Bhattacharya, Debswapna; Adhikari, Badri; Li, Jilong; Cheng, Jianlin

    2015-06-15

    Sampling structural models and ranking them are the two major challenges of protein structure prediction. Traditional protein structure prediction methods generally use one or a few quality assessment (QA) methods to select the best-predicted models, which cannot consistently select relatively better models and rank a large number of models well. Here, we develop a novel large-scale model QA method in conjunction with model clustering to rank and select protein structural models. It unprecedentedly applied 14 model QA methods to generate consensus model rankings, followed by model refinement based on model combination (i.e. averaging). Our experiment demonstrates that the large-scale model QA approach is more consistent and robust in selecting models of better quality than any individual QA method. Our method was blindly tested during the 11th Critical Assessment of Techniques for Protein Structure Prediction (CASP11) as MULTICOM group. It was officially ranked third out of all 143 human and server predictors according to the total scores of the first models predicted for 78 CASP11 protein domains and second according to the total scores of the best of the five models predicted for these domains. MULTICOM's outstanding performance in the extremely competitive 2014 CASP11 experiment proves that our large-scale QA approach together with model clustering is a promising solution to one of the two major problems in protein structure modeling. The web server is available at: http://sysbio.rnet.missouri.edu/multicom_cluster/human/. © The Author 2015. Published by Oxford University Press.
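    A minimal sketch of the consensus-ranking idea in plain Python; the three QA methods and their scores are invented, and simple rank averaging stands in for the paper's combination of 14 QA methods:

```python
# Combine rankings from several model quality assessment (QA) methods by
# averaging each model's rank, then select the consensus top model.
from statistics import mean

# scores[qa_method][model] -- higher is better (illustrative values only).
scores = {
    "qa_clustering": {"model_1": 0.61, "model_2": 0.74, "model_3": 0.58},
    "qa_energy":     {"model_1": 0.55, "model_2": 0.70, "model_3": 0.66},
    "qa_single":     {"model_1": 0.69, "model_2": 0.64, "model_3": 0.60},
}

def consensus_ranking(scores):
    models = next(iter(scores.values())).keys()
    ranks = {m: [] for m in models}
    for qa in scores.values():
        ordered = sorted(qa, key=qa.get, reverse=True)      # best model gets rank 1
        for rank, model in enumerate(ordered, start=1):
            ranks[model].append(rank)
    return sorted(models, key=lambda m: mean(ranks[m]))     # lowest mean rank first

print(consensus_ranking(scores))   # ['model_2', 'model_1', 'model_3']
```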

  9. Water quality modelling of an impacted semi-arid catchment using flow data from the WEAP model

    NASA Astrophysics Data System (ADS)

    Slaughter, Andrew R.; Mantel, Sukhmani K.

    2018-04-01

    the two models were compared to the available observed data, with the initial focus within WQSAM on a simulation of instream total dissolved solids (TDS) and nutrient concentrations. The WEAP model was able to adequately simulate flow in the Buffalo River catchment, with consideration of human inputs and outputs. WQSAM was adapted to successfully take as input the flow output of the WEAP model, and the simulations of nutrients by WQSAM provided a good representation of the variability of observed nutrient concentrations in the catchment. This study showed that the WQSAM model is able to accept flow inputs from the WEAP model, and that this approach is able to provide satisfactory estimates of both flow and water quality for a small, semi-arid and impacted catchment. It is hoped that this research will encourage the application of WQSAM to an increased number of catchments within southern Africa and beyond.

  10. A cost-efficiency and health benefit approach to improve urban air quality.

    PubMed

    Miranda, A I; Ferreira, J; Silveira, C; Relvas, H; Duque, L; Roebeling, P; Lopes, M; Costa, S; Monteiro, A; Gama, C; Sá, E; Borrego, C; Teixeira, J P

    2016-11-01

    When ambient air quality standards established in the EU Directive 2008/50/EC are exceeded, Member States are obliged to develop and implement Air Quality Plans (AQP) to improve air quality and health. Notwithstanding the achievements in emission reductions and air quality improvement, additional efforts need to be undertaken to improve air quality in a sustainable way - i.e. through a cost-efficiency approach. This work was developed in the scope of the recently concluded MAPLIA project "Moving from Air Pollution to Local Integrated Assessment", and focuses on the definition and assessment of emission abatement measures and their associated costs, air quality and health impacts and benefits by means of air quality modelling tools, health impact functions and cost-efficiency analysis. The MAPLIA system was applied to the Grande Porto urban area (Portugal), addressing PM10 and NOx as the most important pollutants in the region. Four different measures to reduce PM10 and NOx emissions were defined and characterized in terms of emissions and implementation costs, and combined into 15 emission scenarios, simulated by the TAPM air quality modelling tool. Air pollutant concentration fields were then used to estimate health benefits in terms of avoided costs (external costs), using dose-response health impact functions. Results revealed that, among the 15 scenarios analysed, the scenario including all 4 measures led to a total net benefit of 0.3 M€·y⁻¹. The largest net benefit is obtained for the scenario considering the conversion of 50% of open fireplaces into heat recovery wood stoves. Although the implementation costs of this measure are high, the benefits outweigh the costs. Research outcomes confirm that the MAPLIA system is useful for policy decision support on air quality improvement strategies, and could be applied to other urban areas where AQP need to be implemented and monitored. Copyright © 2016. Published by Elsevier B.V.
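    A minimal sketch of the scenario ranking step with invented costs and avoided external costs (in million EUR per year); the measure names are not the MAPLIA measures:

```python
# Rank emission-abatement scenarios by net benefit: avoided external (health)
# costs minus implementation costs. All numbers are invented for illustration.
from itertools import combinations

measures = {
    "wood_stove_conversion": {"cost": 1.8, "avoided_external_cost": 2.3},
    "low_emission_zone":     {"cost": 0.9, "avoided_external_cost": 1.0},
    "bus_fleet_renewal":     {"cost": 1.2, "avoided_external_cost": 1.1},
    "industrial_controls":   {"cost": 0.5, "avoided_external_cost": 0.6},
}

def net_benefit(selected):
    return sum(measures[m]["avoided_external_cost"] - measures[m]["cost"] for m in selected)

# Every non-empty combination of the 4 measures gives 15 scenarios, as in the study.
scenarios = [c for r in range(1, len(measures) + 1) for c in combinations(measures, r)]
best = max(scenarios, key=net_benefit)

print(f"{len(scenarios)} scenarios evaluated")
print("best scenario:", best, f"net benefit = {net_benefit(best):.1f} M EUR/y")
```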

  11. A direct sensitivity approach to predict hourly ozone resulting from compliance with the National Ambient Air Quality Standard.

    PubMed

    Simon, Heather; Baker, Kirk R; Akhtar, Farhan; Napelenok, Sergey L; Possiel, Norm; Wells, Benjamin; Timin, Brian

    2013-03-05

    In setting primary ambient air quality standards, the EPA's responsibility under the law is to establish standards that protect public health. As part of the current review of the ozone National Ambient Air Quality Standard (NAAQS), the US EPA evaluated the health exposure and risks associated with ambient ozone pollution using a statistical approach to adjust recent air quality to simulate just meeting the current standard level, without specifying emission control strategies. One drawback of this purely statistical concentration rollback approach is that it does not take into account spatial and temporal heterogeneity of ozone response to emissions changes. The application of the higher-order decoupled direct method (HDDM) in the community multiscale air quality (CMAQ) model is discussed here to provide an example of a methodology that could incorporate this variability into the risk assessment analyses. Because this approach includes a full representation of the chemical production and physical transport of ozone in the atmosphere, it does not require assumed background concentrations, which have been applied to constrain estimates from past statistical techniques. The CMAQ-HDDM adjustment approach is extended to measured ozone concentrations by determining typical sensitivities at each monitor location and hour of the day based on a linear relationship between first-order sensitivities and hourly ozone values. This approach is demonstrated by modeling ozone responses for monitor locations in Detroit and Charlotte to domain-wide reductions in anthropogenic NOx and VOCs emissions. As seen in previous studies, ozone response calculated using HDDM compared well to brute-force emissions changes up to approximately a 50% reduction in emissions. A new stepwise approach is developed here to apply this method to emissions reductions beyond 50% allowing for the simulation of more stringent reductions in ozone concentrations. Compared to previous rollback methods, this
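    A minimal sketch of the kind of sensitivity-based adjustment described here, assuming synthetic modelled ozone and first-order HDDM sensitivities; the linear fit, step size and sign convention are our illustrative assumptions, not the EPA implementation:

```python
# Adjust observed hourly ozone for an emissions reduction using a linear
# relationship between modelled first-order sensitivities and modelled hourly
# ozone, applied in small steps so large reductions stay near-linear.
import numpy as np

rng = np.random.default_rng(3)
o3_model = rng.uniform(20, 90, 500)                     # modelled hourly O3 (ppb)
s1_model = 0.15 * o3_model - 3 + rng.normal(0, 1, 500)  # dO3/d(NOx scaling), ppb per unit

slope, intercept = np.polyfit(o3_model, s1_model, 1)    # sensitivity ~ hourly O3

def adjust(o3_obs, total_reduction, step=0.10):
    """Apply `total_reduction` (fraction of NOx emissions) in `step` increments."""
    o3 = np.asarray(o3_obs, dtype=float)
    applied = 0.0
    while applied < total_reduction - 1e-9:
        frac = min(step, total_reduction - applied)
        sens = slope * o3 + intercept                   # sensitivity at current O3
        o3 = o3 - frac * sens                           # first-order step for the cut
        applied += frac
    return o3

obs = np.array([75.0, 62.0, 48.0])
print(adjust(obs, total_reduction=0.70))                # simulate a 70% NOx cut
```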

  12. Contribution of regional-scale fire events to ozone and PM2.5 air quality estimated by photochemical modeling approaches

    NASA Astrophysics Data System (ADS)

    Baker, K. R.; Woody, M. C.; Tonnesen, G. S.; Hutzell, W.; Pye, H. O. T.; Beaver, M. R.; Pouliot, G.; Pierce, T.

    2016-09-01

    Two specific fires from 2011 are tracked for local to regional scale contribution to ozone (O3) and fine particulate matter (PM2.5) using a freely available regulatory modeling system that includes the BlueSky wildland fire emissions tool, Sparse Matrix Operator Kernel Emissions (SMOKE) model, Weather Research and Forecasting (WRF) meteorological model, and Community Multiscale Air Quality (CMAQ) photochemical grid model. The modeling system was applied to track the contribution from a wildfire (Wallow) and prescribed fire (Flint Hills) using both source sensitivity and source apportionment approaches. The model-estimated fire contributions to primary and secondary pollutants are comparable using source sensitivity (brute-force zero out) and source apportionment (Integrated Source Apportionment Method) approaches. Model-estimated O3 enhancement relative to CO is similar to values reported in the literature, indicating the modeling system captures the range of O3 inhibition possible near fires and O3 production both near the fire and downwind. O3 and peroxyacetyl nitrate (PAN) are formed in the fire plume and transported downwind along with highly reactive VOC species such as formaldehyde and acetaldehyde that are both emitted by the fire and rapidly produced in the fire plume by VOC oxidation reactions. PAN and aldehydes contribute to continued downwind O3 production. The transport and thermal decomposition of PAN to nitrogen oxides (NOX) enables O3 production in areas limited by NOX availability, and the photolysis of aldehydes to produce free radicals (HOX) causes increased O3 production in NOX rich areas. The modeling system tends to overestimate hourly surface O3 at routine rural monitors in close proximity to the fires when the model predicts elevated fire impacts on O3 and Hazard Mapping System (HMS) data indicates possible fire impact. A sensitivity simulation in which solar radiation and photolysis rates were more aggressively attenuated by aerosol in the plume
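    A minimal sketch of the brute-force zero-out side of that comparison, with synthetic concentrations standing in for CMAQ output and a made-up array standing in for the tagged (ISAM) contribution:

```python
# Brute-force (zero-out) fire contribution: difference between a base run and
# a run with fire emissions removed, compared against a tagged source
# apportionment estimate at the same receptors.
import numpy as np

rng = np.random.default_rng(7)
base_run = rng.uniform(40, 80, 200)                  # hourly O3 with fire emissions
no_fire = base_run - rng.uniform(0, 12, 200)         # rerun with fire emissions zeroed
zero_out = base_run - no_fire                        # brute-force contribution

# Stand-in for an ISAM-style tagged contribution at the same receptors.
apportioned = zero_out * rng.normal(1.0, 0.1, 200)

bias = (apportioned - zero_out).mean()
corr = np.corrcoef(zero_out, apportioned)[0, 1]
print(f"mean bias {bias:.2f} ppb, correlation {corr:.2f}")
```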

  13. Protocol for Reliability Assessment of Structural Health Monitoring Systems Incorporating Model-assisted Probability of Detection (MAPOD) Approach

    DTIC Science & Technology

    2011-09-01

    a quality evaluation with limited data, a model-based assessment must be... that affect system performance, a multistage approach to system validation, a modeling and experimental methodology for efficiently addressing a wide range

  14. Working conditions, self-perceived stress, anxiety, depression and quality of life: A structural equation modelling approach

    PubMed Central

    Rusli, Bin Nordin; Edimansyah, Bin Abdin; Naing, Lin

    2008-01-01

    Background: The relationships between working conditions [job demand, job control and social support]; stress, anxiety, and depression; and perceived quality of life factors [physical health, psychological wellbeing, social relationships and environmental conditions] were assessed using a sample of 698 male automotive assembly workers in Malaysia. Methods: The validated Malay version of the Job Content Questionnaire (JCQ), Depression Anxiety Stress Scales (DASS) and the World Health Organization Quality of Life-Brief (WHOQOL-BREF) were used. A structural equation modelling (SEM) analysis was applied to test the structural relationships of the model using AMOS version 6.0, with the maximum likelihood ratio as the method of estimation. Results: The results of the SEM supported the hypothesized structural model (χ2 = 22.801, df = 19, p = 0.246). The final model shows that social support (JCQ) was directly related to all 4 factors of the WHOQOL-BREF and inversely related to depression and stress (DASS). Job demand (JCQ) was directly related to stress (DASS) and inversely related to the environmental conditions (WHOQOL-BREF). Job control (JCQ) was directly related to social relationships (WHOQOL-BREF). Stress (DASS) was directly related to anxiety and depression (DASS) and inversely related to physical health, environment conditions and social relationships (WHOQOL-BREF). Anxiety (DASS) was directly related to depression (DASS) and inversely related to physical health (WHOQOL-BREF). Depression (DASS) was inversely related to the psychological wellbeing (WHOQOL-BREF). Finally, stress, anxiety and depression (DASS) mediate the relationships between job demand and social support (JCQ) to the 4 factors of WHOQOL-BREF. Conclusion: These findings suggest that higher social support increases the self-reported quality of life of these workers. Higher job control increases the social relationships, whilst higher job demand increases the self-perceived stress and decreases the self

  15. Working conditions, self-perceived stress, anxiety, depression and quality of life: a structural equation modelling approach.

    PubMed

    Rusli, Bin Nordin; Edimansyah, Bin Abdin; Naing, Lin

    2008-02-06

    The relationships between working conditions [job demand, job control and social support]; stress, anxiety, and depression; and perceived quality of life factors [physical health, psychological wellbeing, social relationships and environmental conditions] were assessed using a sample of 698 male automotive assembly workers in Malaysia. The validated Malay version of the Job Content Questionnaire (JCQ), Depression Anxiety Stress Scales (DASS) and the World Health Organization Quality of Life-Brief (WHOQOL-BREF) were used. A structural equation modelling (SEM) analysis was applied to test the structural relationships of the model using AMOS version 6.0, with the maximum likelihood ratio as the method of estimation. The results of the SEM supported the hypothesized structural model (chi2 = 22.801, df = 19, p = 0.246). The final model shows that social support (JCQ) was directly related to all 4 factors of the WHOQOL-BREF and inversely related to depression and stress (DASS). Job demand (JCQ) was directly related to stress (DASS) and inversely related to the environmental conditions (WHOQOL-BREF). Job control (JCQ) was directly related to social relationships (WHOQOL-BREF). Stress (DASS) was directly related to anxiety and depression (DASS) and inversely related to physical health, environment conditions and social relationships (WHOQOL-BREF). Anxiety (DASS) was directly related to depression (DASS) and inversely related to physical health (WHOQOL-BREF). Depression (DASS) was inversely related to the psychological wellbeing (WHOQOL-BREF). Finally, stress, anxiety and depression (DASS) mediate the relationships between job demand and social support (JCQ) to the 4 factors of WHOQOL-BREF. These findings suggest that higher social support increases the self-reported quality of life of these workers. Higher job control increases the social relationships, whilst higher job demand increases the self-perceived stress and decreases the self-perceived quality of life related to
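    A minimal sketch of how such a path model could be specified in code, assuming the open-source semopy package as a stand-in for the AMOS software used in the study; the path list is a simplified subset of the published model and the column names are our own:

```python
# Simplified SEM for working conditions -> distress -> quality of life, using
# semopy (assumed substitute for AMOS). The data frame is expected to hold the
# JCQ, DASS and WHOQOL-BREF scale scores under the column names used below.
import pandas as pd
import semopy

MODEL_DESC = """
stress ~ job_demand + social_support
anxiety ~ stress
depression ~ stress + anxiety + social_support
psychological ~ depression + social_support
physical ~ stress + anxiety + social_support
"""

def fit_sem(df: pd.DataFrame):
    model = semopy.Model(MODEL_DESC)
    model.fit(df)                  # maximum likelihood estimation by default
    return model.inspect()         # parameter estimates and p-values

# usage: estimates = fit_sem(pd.read_csv("workers_scales.csv"))
```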

  16. Linking Meteorology, Air Quality Models and Observations to ...

    EPA Pesticide Factsheets

    Epidemiologic studies are critical in establishing the association between exposure to air pollutants and adverse health effects. Results of epidemiologic studies are used by U.S. EPA in developing air quality standards to protect the public from the health effects of air pollutants. A major challenge in environmental epidemiology is adequate exposure characterization. Numerous health studies have used measurements from a few central-site ambient monitors to characterize air pollution exposures. Relying solely on central-site ambient monitors does not account for the spatial-heterogeneity of ambient air pollution patterns, the temporal variability in ambient concentrations, nor the influence of infiltration and indoor sources. Central-site monitoring becomes even more problematic for certain air pollutants that exhibit significant spatial heterogeneity. Statistical interpolation techniques and passive monitoring methods can provide additional spatial resolution in ambient concentration estimates. In addition, spatio-temporal models, which integrate GIS data and other factors, such as meteorology, have also been developed to produce more resolved estimates of ambient concentrations. Models, such as the Community Multi-Scale Air Quality (CMAQ) model, estimate ambient concentrations by combining information on meteorology, source emissions, and chemical-fate and transport. Hybrid modeling approaches, which integrate regional scale models with local scale dispersion

  17. Joint space-time geostatistical model for air quality surveillance

    NASA Astrophysics Data System (ADS)

    Russo, A.; Soares, A.; Pereira, M. J.

    2009-04-01

    Air pollution and people's generalized concern about air quality are, nowadays, considered to be a global problem. Although the introduction of rigid air pollution regulations has reduced pollution from industry and power stations, the growing number of cars on the road poses a new pollution problem. Considering the characteristics of the atmospheric circulation and also the residence times of certain pollutants in the atmosphere, a generalized and growing interest in air quality issues has led to research intensification and the publication of several articles with quite different levels of scientific depth. Like most natural phenomena, air quality can be seen as a space-time process, where space-time relationships usually have quite different characteristics and levels of uncertainty. As a result, the simultaneous integration of space and time is not an easy task to perform. This problem is overcome by a variety of methodologies. The use of stochastic models and neural networks to characterize the space-time dispersion of air quality is becoming a common practice. The main objective of this work is to produce an air quality model which allows forecasting critical concentration episodes of a certain pollutant by means of a hybrid approach, based on the combined use of neural network models and stochastic simulations. A stochastic simulation of the spatial component with a space-time trend model is proposed to characterize critical situations, taking into account data from the past and a space-time trend from the recent past. To identify near-future critical episodes, predicted values from neural networks are used at each monitoring station. In this paper, we describe the design of a hybrid forecasting tool for ambient NO2 concentrations in Lisbon, Portugal.

  18. [Hyperspectral Remote Sensing Estimation Models for Pasture Quality].

    PubMed

    Ma, Wei-wei; Gong, Cai-lan; Hu, Yong; Wei, Yong-lin; Li, Long; Liu, Feng-yi; Meng, Peng

    2015-10-01

    Crude protein (CP), crude fat (CFA) and crude fiber (CFI) are key indicators for evaluation of the quality and feeding value of pasture. Hence, identification of these biological contents is an essential practice for animal husbandry. As current approaches to pasture quality estimation are time-consuming and costly, and even generate hazardous waste, a real-time and non-destructive method is therefore developed in this study using pasture canopy hyperspectral data. A field campaign was carried out in August 2013 around Qinghai Lake in order to obtain the field spectral properties of 19 types of natural pasture using the ASD FieldSpec 3, a field spectrometer that works in the optical region (350-2500 nm) of the electromagnetic spectrum. In addition to the spectral data, pasture samples were also collected from the field and examined in the laboratory to measure the relative concentrations of CP (%), CFA (%) and CFI (%). After spectral denoising and smoothing, the relationship of the pasture quality parameters with the reflectance spectrum, the first derivatives of reflectance (FDR), band ratios and the wavelet coefficients (WCs) was analyzed. The concentrations of CP, CFA and CFI of pasture were found to be closely correlated with FDR at wavebands centered at 424, 1668, and 918 nm, as well as with the low-scale (scale = 2, 4) Morlet, Coiflets and Gaussian WCs. Accordingly, linear, exponential, and polynomial equations between each pasture variable and FDR or WCs were developed. Validation of the developed equations indicated that the polynomial model with an independent variable of Coiflets WCs (scale = 4, wavelength = 1209 nm), the polynomial model with an independent variable of FDR, and the exponential model with an independent variable of FDR were the optimal models for prediction of the concentrations of CP, CFA and CFI of pasture, respectively. The R² of the pasture quality estimation models was between 0.646 and 0.762 at the 0.01 significance level. Results suggest
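    A minimal sketch of the FDR step and a polynomial calibration fit, assuming NumPy and synthetic canopy spectra; the band centre is taken from the abstract, but the spectra and coefficients are invented:

```python
# Compute the first derivative of canopy reflectance (FDR), extract the value
# at a diagnostic waveband, and fit a quadratic calibration against measured
# crude protein (CP, %). Synthetic spectra stand in for ASD FieldSpec data.
import numpy as np

rng = np.random.default_rng(5)
wavelengths = np.arange(350, 2501)                  # nm, 1 nm sampling
n_samples = 19
cp = rng.uniform(6, 14, n_samples)                  # measured crude protein (%)

# Synthetic canopy reflectance whose slope near 1668 nm scales with CP.
spectra = (0.2
           + 0.002 * cp[:, None] * np.tanh((wavelengths - 1668) / 200)
           + rng.normal(0, 1e-5, (n_samples, wavelengths.size)))

fdr = np.gradient(spectra, wavelengths, axis=1)     # first derivative of reflectance
band = np.argmin(np.abs(wavelengths - 1668))        # diagnostic band from the paper
x = fdr[:, band]

coeffs = np.polyfit(x, cp, deg=2)                   # quadratic calibration model
predicted = np.polyval(coeffs, x)
r2 = 1 - np.sum((cp - predicted) ** 2) / np.sum((cp - cp.mean()) ** 2)
print(f"R^2 of the FDR-based CP model: {r2:.3f}")
```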

  19. Quality assessment of protein model-structures based on structural and functional similarities

    PubMed Central

    2012-01-01

    Background: Experimental determination of protein 3D structures is expensive, time consuming and sometimes impossible. A gap between number of protein structures deposited in the World Wide Protein Data Bank and the number of sequenced proteins constantly broadens. Computational modeling is deemed to be one of the ways to deal with the problem. Although protein 3D structure prediction is a difficult task, many tools are available. These tools can model it from a sequence or partial structural information, e.g. contact maps. Consequently, biologists have the ability to generate automatically a putative 3D structure model of any protein. However, the main issue becomes evaluation of the model quality, which is one of the most important challenges of structural biology. Results: GOBA - Gene Ontology-Based Assessment is a novel Protein Model Quality Assessment Program. It estimates the compatibility between a model-structure and its expected function. GOBA is based on the assumption that a high quality model is expected to be structurally similar to proteins functionally similar to the prediction target. Whereas DALI is used to measure structure similarity, protein functional similarity is quantified using standardized and hierarchical description of proteins provided by Gene Ontology combined with Wang's algorithm for calculating semantic similarity. Two approaches are proposed to express the quality of protein model-structures. One is a single model quality assessment method, the other is its modification, which provides a relative measure of model quality. Exhaustive evaluation is performed on data sets of model-structures submitted to the CASP8 and CASP9 contests. Conclusions: The validation shows that the method is able to discriminate between good and bad model-structures. The best of tested GOBA scores achieved 0.74 and 0.8 as a mean Pearson correlation to the observed quality of models in our CASP8 and CASP9-based validation sets. GOBA also obtained the best

  20. Quality assessment of protein model-structures based on structural and functional similarities.

    PubMed

    Konopka, Bogumil M; Nebel, Jean-Christophe; Kotulska, Malgorzata

    2012-09-21

    Experimental determination of protein 3D structures is expensive, time consuming and sometimes impossible. A gap between number of protein structures deposited in the World Wide Protein Data Bank and the number of sequenced proteins constantly broadens. Computational modeling is deemed to be one of the ways to deal with the problem. Although protein 3D structure prediction is a difficult task, many tools are available. These tools can model it from a sequence or partial structural information, e.g. contact maps. Consequently, biologists have the ability to generate automatically a putative 3D structure model of any protein. However, the main issue becomes evaluation of the model quality, which is one of the most important challenges of structural biology. GOBA--Gene Ontology-Based Assessment is a novel Protein Model Quality Assessment Program. It estimates the compatibility between a model-structure and its expected function. GOBA is based on the assumption that a high quality model is expected to be structurally similar to proteins functionally similar to the prediction target. Whereas DALI is used to measure structure similarity, protein functional similarity is quantified using standardized and hierarchical description of proteins provided by Gene Ontology combined with Wang's algorithm for calculating semantic similarity. Two approaches are proposed to express the quality of protein model-structures. One is a single model quality assessment method, the other is its modification, which provides a relative measure of model quality. Exhaustive evaluation is performed on data sets of model-structures submitted to the CASP8 and CASP9 contests. The validation shows that the method is able to discriminate between good and bad model-structures. The best of tested GOBA scores achieved 0.74 and 0.8 as a mean Pearson correlation to the observed quality of models in our CASP8 and CASP9-based validation sets. GOBA also obtained the best result for two targets of CASP8, and

  1. Information quality-control model

    NASA Technical Reports Server (NTRS)

    Vincent, D. A.

    1971-01-01

    Model serves as graphic tool for estimating complete product objectives from limited input information, and is applied to cost estimations, product-quality evaluations, and effectiveness measurements for manpower resources allocation. Six product quality levels are defined.

  2. Surgery-first orthognathic approach vs traditional orthognathic approach: Oral health-related quality of life assessed with 2 questionnaires.

    PubMed

    Pelo, Sandro; Gasparini, Giulio; Garagiola, Umberto; Cordaro, Massimo; Di Nardo, Francesco; Staderini, Edoardo; Patini, Romeo; de Angelis, Paolo; D'Amato, Giuseppe; Saponaro, Gianmarco; Moro, Alessandro

    2017-08-01

    The purposes of the study were to investigate and evaluate the differences detected by the patients between the traditional orthognathic approach and the surgery-first one in terms of level of satisfaction and quality of life. A total of 30 patients who underwent orthognathic surgery for correction of malocclusions were selected and included in this study. Fifteen patients were treated with the conventional orthognathic surgery approach, and 15 patients with the surgery-first approach. Variables were assessed through the Orthognathic Quality of Life Questionnaire and the Oral Health Impact Profile questionnaire and analyzed with 2-way repeated-measures analysis of variance. The results showed significant differences in terms of the Orthognathic Quality of Life Questionnaire (P <0.001) and the Oral Health Impact Profile (P <0.001) scores within groups between the first and last administrations of both questionnaires. Differences in the control group between first and second administrations were also significant. Questionnaire scores showed an immediate increase of quality of life after surgery in the surgery-first group and an initial worsening during orthodontic treatment in the traditional approach group followed by postoperative improvement. This study showed that the worsening of the facial profile during the traditional orthognathic surgery approach decompensation phase has a negative impact on the perception of patients' quality of life. Surgeons should consider the possibility of a surgery-first approach to prevent this occurrence. Copyright © 2017 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.

  3. The consistency approach for quality control of vaccines - a strategy to improve quality control and implement 3Rs.

    PubMed

    De Mattia, Fabrizio; Chapsal, Jean-Michel; Descamps, Johan; Halder, Marlies; Jarrett, Nicholas; Kross, Imke; Mortiaux, Frederic; Ponsar, Cecile; Redhead, Keith; McKelvie, Jo; Hendriksen, Coenraad

    2011-01-01

    Current batch release testing of established vaccines emphasizes quality control of the final product and is often characterized by extensive use of animals. This report summarises the discussions of a joint ECVAM/EPAA workshop on the applicability of the consistency approach for routine release of human and veterinary vaccines and its potential to reduce animal use. The consistency approach is based upon thorough characterization of the vaccine during development and the principle that the quality of subsequent batches is the consequence of the strict application of a quality system and of a consistent production of batches. The concept of consistency of production is state-of-the-art for new-generation vaccines, where batch release is mainly based on non-animal methods. There is now the opportunity to introduce the approach into established vaccine production, where it has the potential to replace in vivo tests with non-animal tests designed to demonstrate batch quality while maintaining the highest quality standards. The report indicates how this approach may be further developed for application to established human and veterinary vaccines and emphasizes the continuing need for co-ordination and harmonization. It also gives recommendations for work to be undertaken in order to encourage acceptance and implementation of the consistency approach. Copyright © 2011. Published by Elsevier Ltd. All rights reserved.

  4. Showing the Unsayable: Participatory Visual Approaches and the Constitution of 'Patient Experience' in Healthcare Quality Improvement.

    PubMed

    Papoulias, Constantina

    2018-06-01

    This article considers the strengths and potential contributions of participatory visual methods for healthcare quality improvement research. It argues that such approaches may enable us to expand our understanding of 'patient experience' and of its potential for generating new knowledge for health systems. In particular, they may open up dimensions of people's engagement with services and treatments which exceed both the declarative nature of responses to questionnaires and the narrative sequencing of self-reports gathered through qualitative interviewing. I will suggest that working with such methods may necessitate a more reflexive approach to the constitution of evidence in quality improvement work. To this end, the article will first consider the emerging rationale for the use of visual participatory methods in improvement before outlining the implications of two related approaches, photo-elicitation and PhotoVoice, for the constitution of 'experience'. It will then move to a participatory model for healthcare improvement work, Experience Based Co-Design (EBCD). It will argue that EBCD exemplifies both the strengths and the limitations of adapting visual participatory approaches to quality improvement ends. The article will conclude with a critical reflection on a small photographic study, in which the author participated, and which sought to harness service user perspectives for the design of psychiatric facilities, as a way of considering the potential contribution of visual participatory methods for quality improvement.

  5. ESTIMATION OF EMISSION ADJUSTMENTS FROM THE APPLICATION OF FOUR-DIMENSIONAL DATA ASSIMILATION TO PHOTOCHEMICAL AIR QUALITY MODELING. (R826372)

    EPA Science Inventory

    Four-dimensional data assimilation applied to photochemical air quality modeling is used to suggest adjustments to the emissions inventory of the Atlanta, Georgia metropolitan area. In this approach, a three-dimensional air quality model, coupled with direct sensitivity analys...

  6. The State and the Quality Agenda: A Theoretical Approach

    ERIC Educational Resources Information Center

    Filippakou, Ourania; Tapper, Ted

    2010-01-01

    This article adopts a theoretical approach to analyse the evolution of the quality agenda in English higher education. Using the concept of reification, it shows how the quasi-state has attempted to build a "natural" understanding of the idea of quality. However, the policy implementation process has demonstrated the fragility of the…

  7. Numerical and Qualitative Contrasts of Two Statistical Models for Water Quality Change in Tidal Waters

    EPA Science Inventory

    Two statistical approaches, weighted regression on time, discharge, and season and generalized additive models, have recently been used to evaluate water quality trends in estuaries. Both models have been used in similar contexts despite differences in statistical foundations and...
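
    As a rough illustration of the GAM side of this comparison, the sketch below fits smooth terms for time, discharge and season to synthetic data using the pyGAM library. This is only an assumed workflow for trend evaluation, not the models or data used in the study.

```python
# Sketch: a generalized additive model of a water quality variable (synthetic
# chlorophyll-a) as smooth functions of decimal time, log discharge and day of
# year, broadly in the spirit of the GAM approach mentioned above. pyGAM is
# used only as a convenient open-source implementation.
import numpy as np
from pygam import LinearGAM, s

rng = np.random.default_rng(0)
n = 500
t = rng.uniform(2000, 2015, n)                      # decimal year
logq = rng.normal(2.0, 0.5, n)                      # log of daily discharge
doy = rng.uniform(0, 365, n)                        # day of year (season proxy)
chl = (0.05 * (t - 2000) + 0.3 * logq
       + np.sin(2 * np.pi * doy / 365) + rng.normal(0, 0.2, n))

X = np.column_stack([t, logq, doy])
gam = LinearGAM(s(0) + s(1) + s(2)).fit(X, chl)     # one smooth term per predictor
pred = gam.predict(X)
print("R^2 of the fitted GAM:", round(np.corrcoef(pred, chl)[0, 1] ** 2, 3))
```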

  8. Water Quality Criteria for Copper Based on the BLM Approach in the Freshwater in China

    PubMed Central

    Zhang, Yahui; Zang, Wenchao; Qin, Lumei; Zheng, Lei; Cao, Ying; Yan, Zhenguang; Yi, Xianliang; Zeng, Honghu; Liu, Zhengtao

    2017-01-01

    The bioavailability and toxicity of metals to aquatic organisms are highly dependent on water quality parameters in freshwaters. The biotic ligand model (BLM) for copper is an approach to generating water quality criteria (WQC) that account for water chemistry in the ambient environment. However, few studies have been carried out on WQCs for copper based on the BLM approach in China. In the present study, toxicity tests for copper with native Chinese aquatic organisms were conducted, and published toxicity data with water quality parameters for Chinese aquatic species were collected to derive the WQCs for copper by the BLM approach. The BLM-based WQCs (the criterion maximum concentration (CMC) and the criterion continuous concentration (CCC)) for copper in freshwater for the nation and in Taihu Lake were obtained. The CMC and CCC values for copper in China were derived to be 1.391 μg/L and 0.495 μg/L, respectively, and the CMC and CCC in Taihu Lake were 32.194 μg/L and 9.697 μg/L. The high concentration of dissolved organic carbon might be a main reason for the higher WQC values in Taihu Lake. The WQC for copper in freshwater would provide a scientific foundation for water quality standards and environmental risk assessment in China. PMID:28166229
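
    For readers unfamiliar with how numerical criteria are typically derived from toxicity data, the sketch below fits a log-normal species sensitivity distribution to hypothetical BLM-normalized endpoints and takes its 5th percentile (HC5). It illustrates the general SSD idea only; it is not the specific derivation procedure behind the CMC and CCC values reported above.

```python
# Generic species-sensitivity-distribution sketch: fit a log-normal
# distribution to (BLM-normalized) species toxicity values and take the 5th
# percentile as a criterion-like value. Endpoints below are invented.
import numpy as np
from scipy import stats

# hypothetical copper endpoints (ug/L), already normalized to a common
# water chemistry with a biotic ligand model
ec50 = np.array([3.2, 5.1, 8.7, 12.4, 20.0, 35.5, 60.2, 110.0])

log_vals = np.log10(ec50)
mu, sigma = log_vals.mean(), log_vals.std(ddof=1)
hc5 = 10 ** stats.norm.ppf(0.05, loc=mu, scale=sigma)
print(f"HC5 ~= {hc5:.2f} ug/L")
```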

  9. PREDICTIVE UNCERTAINTY IN HYDROLOGIC AND WATER QUALITY MODELING: APPROACHES, APPLICATION TO ENVIRONMENTAL MANAGEMENT, AND FUTURE CHALLENGES (PRESENTATION)

    EPA Science Inventory

    Extant process-based hydrologic and water quality models are indispensable to water resources planning and environmental management. However, models are only approximations of real systems and often calibrated with incomplete and uncertain data. Reliable estimates, or perhaps f...

  10. Massive integration of diverse protein quality assessment methods to improve template based modeling in CASP11

    PubMed Central

    Cao, Renzhi; Bhattacharya, Debswapna; Adhikari, Badri; Li, Jilong; Cheng, Jianlin

    2015-01-01

    Model evaluation and selection is an important step and a big challenge in template-based protein structure prediction. Individual model quality assessment methods designed for recognizing some specific properties of protein structures often fail to consistently select good models from a model pool because of their limitations. Therefore, combining multiple complementary quality assessment methods is useful for improving model ranking and consequently tertiary structure prediction. Here, we report the performance and analysis of our human tertiary structure predictor (MULTICOM) based on the massive integration of 14 diverse complementary quality assessment methods that was successfully benchmarked in the 11th Critical Assessment of Techniques for Protein Structure Prediction (CASP11). The predictions of MULTICOM for 39 template-based domains were rigorously assessed by six scoring metrics covering global topology of Cα trace, local all-atom fitness, side chain quality, and physical reasonableness of the model. The results show that the massive integration of complementary, diverse single-model and multi-model quality assessment methods can effectively leverage the strength of single-model methods in distinguishing quality variation among similar good models and the advantage of multi-model quality assessment methods of identifying reasonable average-quality models. The overall excellent performance of the MULTICOM predictor demonstrates that integrating a large number of model quality assessment methods in conjunction with model clustering is a useful approach to improve the accuracy, diversity, and consequently robustness of template-based protein structure prediction. PMID:26369671
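
    A minimal sketch of the consensus idea follows: each quality assessment method's scores are standardized across the model pool and averaged, and models are ranked by the combined score. The 14 methods used by MULTICOM and their actual combination scheme are not reproduced; the matrix and weighting below are assumptions for illustration.

```python
# Fuse several model quality assessment (QA) scores by converting each
# method's scores to z-scores across the model pool and ranking models by
# the average. This only illustrates consensus over complementary scores.
import numpy as np

def fuse_qa_scores(score_matrix):
    """score_matrix: shape (n_models, n_methods), higher = better for every method."""
    z = (score_matrix - score_matrix.mean(axis=0)) / (score_matrix.std(axis=0) + 1e-9)
    combined = z.mean(axis=1)
    return np.argsort(combined)[::-1]       # model indices, best first

scores = np.array([[0.62, 55.0, 0.71],      # toy scores from three QA methods
                   [0.70, 60.5, 0.69],
                   [0.55, 48.0, 0.60]])
print(fuse_qa_scores(scores))
```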

  11. Ensuring quality and safety.

    PubMed

    Reid, Jerry

    2010-01-01

    The certification model addresses quality and safety by directly targeting the qualifications of individuals. The practice accreditation model takes a more global approach to quality and safety and addresses the qualifications of individuals and standards for additional components of the quality chain. Although both certification and practice accreditation fundamentally are voluntary, the programs may become mandatory when enforcement mechanisms are linked to the programs via state or federal legislation or via private reimbursement policies, effectively resulting in mandatory standards. The CARE bill takes a certification approach to quality and safety by focusing on the qualifications of the individual. MIPPA takes an accreditation approach by focusing on the practice. MQSA is somewhat of a hybrid in that it takes an accreditation approach, but spells out standards for the individual that the accreditor must follow. If the practice accreditation standards require that all technologists employed in the practice be certified in the modalities performed, then the practice accreditation model and the certification model become functionally equivalent in terms of personnel qualifications. To the extent that practice accreditation models are less prescriptive regarding personnel standards, the certification model results in more stringent standards.

  12. Evaluating the Quality of the Learning Outcome in Healthcare Sector: The Expero4care Model

    ERIC Educational Resources Information Center

    Cervai, Sara; Polo, Federica

    2015-01-01

    Purpose: This paper aims to present the Expero4care model. Considering the growing need for a training evaluation model that does not simply fix processes, the Expero4care model represents the first attempt of a "quality model" dedicated to the learning outcomes of healthcare trainings. Design/Methodology/Approach: Created as development…

  13. Input variable selection and calibration data selection for storm water quality regression models.

    PubMed

    Sun, Siao; Bertrand-Krajewski, Jean-Luc

    2013-01-01

    Storm water quality models are useful tools in storm water management. Interest has been growing in analyzing existing data for developing models for urban storm water quality evaluations. It is important to select appropriate model inputs when many candidate explanatory variables are available. Model calibration and verification are essential steps in any storm water quality modeling. This study investigates input variable selection and calibration data selection in storm water quality regression models. The two selection problems are mutually dependent. A procedure is developed to fulfil the two selection tasks in sequence. The procedure first selects model input variables using a cross-validation method. An appropriate number of variables is identified as model inputs to ensure that a model is neither overfitted nor underfitted. Based on the model input selection results, calibration data selection is studied. Uncertainty of model performance due to calibration data selection is investigated with a random selection method. An approach using the cluster method is applied in order to enhance model calibration practice, based on the principle of selecting representative data for calibration. The comparison between results from the cluster selection method and random selection shows that the former can significantly improve the performance of calibrated models. It is found that the information content of the calibration data is important in addition to its size.
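
    The two selection steps can be sketched with standard scikit-learn tools, as below: cross-validation guides a greedy choice of input variables, and k-means clustering picks representative events for calibration. Variable names, the forward-selection strategy and the synthetic data are illustrative assumptions, not the paper's exact procedure.

```python
# (1) pick explanatory variables by cross-validated forward selection, and
# (2) pick calibration events with k-means so the calibration set spans the
# range of conditions. Data and thresholds are synthetic.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
X = rng.normal(size=(80, 6))                                 # candidate explanatory variables
y = 2 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(0, 0.5, 80)     # event mean concentration (synthetic)

# (1) greedy forward selection scored by cross-validated R^2
selected, remaining = [], list(range(X.shape[1]))
while remaining:
    scores = {j: cross_val_score(LinearRegression(), X[:, selected + [j]], y, cv=5).mean()
              for j in remaining}
    best = max(scores, key=scores.get)
    if selected and scores[best] <= cross_val_score(LinearRegression(), X[:, selected], y, cv=5).mean():
        break                                                # no improvement: stop adding variables
    selected.append(best); remaining.remove(best)

# (2) cluster the events and take one representative event per cluster
km = KMeans(n_clusters=10, n_init=10, random_state=0).fit(X[:, selected])
calib_idx = [np.where(km.labels_ == k)[0][0] for k in range(10)]
print("selected variables:", selected, "calibration events:", calib_idx)
```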

  14. Systems approach to managing educational quality in the engineering classroom

    NASA Astrophysics Data System (ADS)

    Grygoryev, Kostyantyn

    Today's competitive environment in post-secondary education requires universities to demonstrate the quality of their programs in order to attract financing, and student and academic talent. Despite significant efforts devoted to improving the quality of higher education, systematic, continuous performance measurement and management still have not reached the level where educational outputs and outcomes are actually produced---the classroom. An engineering classroom is a complex environment in which educational inputs are transformed by educational processes into educational outputs and outcomes. By treating a classroom as a system, one can apply tools such as Structural Equation Modeling, Statistical Process Control, and System Dynamics in order to discover cause-and-effect relationships among the classroom variables, control the classroom processes, and evaluate the effect of changes to the course organization, content, and delivery, on educational processes and outcomes. Quality improvement is best achieved through the continuous, systematic application of efforts and resources. Improving classroom processes and outcomes is an iterative process that starts with identifying opportunities for improvement, designing the action plan, implementing the changes, and evaluating their effects. Once the desired objectives are achieved, the quality improvement cycle may start again. The goal of this research was to improve the educational processes and outcomes in an undergraduate engineering management course taught at the University of Alberta. The author was involved with the course, first, as a teaching assistant, and, then, as a primary instructor. The data collected from the course over four years were used to create, first, a static and, then, a dynamic model of a classroom system. By using model output and qualitative feedback from students, changes to the course organization and content were introduced. These changes led to a lower perceived course workload and

  15. A Review of Surface Water Quality Models

    PubMed Central

    Li, Shibei; Jia, Peng; Qi, Changjun; Ding, Feng

    2013-01-01

    Surface water quality models can be useful tools to simulate and predict the levels, distributions, and risks of chemical pollutants in a given water body. The modeling results from these models under different pollution scenarios are very important components of environmental impact assessment and can provide a basis and technical support for environmental management agencies to make sound decisions. Whether the model results are reliable or not affects the soundness of approved construction projects and the effectiveness of pollution control measures. We reviewed the development of surface water quality models in three stages and analyzed the suitability, precision, and methods of different models. Standardization of water quality models can help environmental management agencies guarantee consistency in the application of water quality models for regulatory purposes. We summarized the status of standardization of these models in developed countries and put forward feasible measures for the standardization of surface water quality models, especially in developing countries. PMID:23853533

  16. Pesticide fate at regional scale: Development of an integrated model approach and application

    NASA Astrophysics Data System (ADS)

    Herbst, M.; Hardelauf, H.; Harms, R.; Vanderborght, J.; Vereecken, H.

    As a result of agricultural practice, many soils and aquifers are contaminated with pesticides. In order to quantify the side-effects of these anthropogenic impacts on groundwater quality at regional scale, a process-based, integrated model approach was developed. The Richards-equation-based numerical model TRACE calculates the three-dimensional saturated/unsaturated water flow. For the modeling of regional-scale pesticide transport we linked TRACE with the plant module SUCROS and with 3DLEWASTE, a hybrid Lagrangian/Eulerian approach to solve the convection/dispersion equation. We used measurements, standard methods such as pedotransfer functions, and parameters from the literature to derive the model input for the process model. A first-step application of TRACE/3DLEWASTE to the 20 km² test area 'Zwischenscholle' for the period 1983-1993 reveals the behaviour of the pesticide isoproturon. The selected test area is characterised by intense agricultural use and shallow groundwater, resulting in a high vulnerability of the groundwater to pesticide contamination. The model results stress the importance of the unsaturated zone for the occurrence of pesticides in groundwater. Remarkable isoproturon concentrations in groundwater are predicted for locations with thin layered and permeable soils. For four selected locations we used measured piezometric heads to validate predicted groundwater levels. In general, the model results are consistent and reasonable. Thus the developed integrated model approach is seen as a promising tool for quantifying the impact of agricultural practice on groundwater quality.
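
    The transport component can be illustrated with a minimal one-dimensional advection-dispersion solver, shown below. It is a toy explicit finite-difference scheme with arbitrary parameters, not the 3-D TRACE/3DLEWASTE code, which additionally handles variably saturated flow, sorption and degradation.

```python
# Explicit finite-difference sketch of the 1-D advection-dispersion equation
# dC/dt = -v dC/dx + D d2C/dx2, the kind of transport equation solved (in 3-D,
# with sorption and degradation) by 3DLEWASTE. Parameter values are arbitrary
# and chosen only to keep the explicit scheme stable.
import numpy as np

nx, dx, dt, nsteps = 200, 0.1, 0.01, 2000
v, D = 0.05, 0.01                      # pore velocity (m/d), dispersion coeff (m2/d)
C = np.zeros(nx)
C[0] = 1.0                             # constant-concentration inlet boundary

for _ in range(nsteps):
    adv = -v * (C[1:-1] - C[:-2]) / dx                     # upwind advection
    disp = D * (C[2:] - 2 * C[1:-1] + C[:-2]) / dx ** 2    # dispersion
    C[1:-1] += dt * (adv + disp)
    C[0], C[-1] = 1.0, C[-2]                               # boundary conditions

print("plume front (x where C drops below 0.5):", np.argmax(C < 0.5) * dx, "m")
```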

  17. Modeling hospital infrastructure by optimizing quality, accessibility and efficiency via a mixed integer programming model.

    PubMed

    Ikkersheim, David; Tanke, Marit; van Schooten, Gwendy; de Bresser, Niels; Fleuren, Hein

    2013-06-16

    The majority of curative health care is organized in hospitals. As in most other countries, the current 94 hospital locations in the Netherlands offer almost all treatments, ranging from rather basic to very complex care. Recent studies show that concentration of care can lead to substantial quality improvements for complex conditions and that dispersion of care for chronic conditions may increase quality of care. In previous studies on allocation of hospital infrastructure, the allocation is usually only based on accessibility and/or efficiency of hospital care. In this paper, we explore the possibility of including a quality function in the objective function, to give a global indication of what the 'optimal' hospital infrastructure would look like in the Dutch context. To create optimal societal value we have used a mathematical mixed integer programming (MIP) model that balances quality, efficiency and accessibility of care for 30 ICD-9 diagnosis groups. Typical aspects that are taken into account are the volume-outcome relationship, the maximum accepted travel times for diagnosis groups that may need emergency treatment and the minimum use of facilities. The optimal number of hospital locations per diagnosis group varies from 12 to 14 locations for diagnosis groups which have a strong volume-outcome relationship, such as neoplasms, to 150 locations for chronic diagnosis groups such as diabetes and chronic obstructive pulmonary disease (COPD). In conclusion, our study presents a new approach for allocating hospital infrastructure over a country or region that includes quality of care in relation to volume per provider. In addition, our model shows that within the Dutch context chronic care may be too concentrated and complex and/or acute care may be too dispersed. Our approach can relatively easily be adapted to other countries or regions and is very suitable for 'what-if' analyses.
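
    A toy version of such an allocation model can be written with an open-source MIP solver, as in the sketch below (PuLP with the bundled CBC solver). Regions, travel times, the volume threshold and the travel-time limit are invented; the real model covers 30 ICD-9 diagnosis groups and a richer set of constraints.

```python
# Toy MIP in the spirit of the paper: decide which hospital locations offer a
# diagnosis group so that total travel burden is minimized, subject to a
# minimum volume per open location (volume-outcome proxy for quality) and a
# maximum travel time. All data are invented.
import pulp

patients = {"A": 500, "B": 300, "C": 200}                    # demand per region
travel = {("A", 1): 10, ("A", 2): 35, ("B", 1): 25,
          ("B", 2): 15, ("C", 1): 40, ("C", 2): 20}          # minutes
hospitals, min_volume, max_travel = [1, 2], 250, 45

prob = pulp.LpProblem("hospital_allocation", pulp.LpMinimize)
open_h = pulp.LpVariable.dicts("open", hospitals, cat="Binary")
assign = pulp.LpVariable.dicts("assign", list(travel), cat="Binary")

prob += pulp.lpSum(patients[r] * travel[r, h] * assign[r, h] for (r, h) in travel)
for r in patients:                                           # every region served exactly once
    prob += pulp.lpSum(assign[r, h] for h in hospitals) == 1
for (r, h) in travel:                                        # only to open hospitals, within time limit
    prob += assign[r, h] <= open_h[h]
    if travel[r, h] > max_travel:
        prob += assign[r, h] == 0
for h in hospitals:                                          # volume-outcome threshold
    prob += pulp.lpSum(patients[r] * assign[r, h] for r in patients) >= min_volume * open_h[h]

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print({h: int(open_h[h].value()) for h in hospitals})
```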

  18. Quality based approach for adaptive face recognition

    NASA Astrophysics Data System (ADS)

    Abboud, Ali J.; Sellahewa, Harin; Jassim, Sabah A.

    2009-05-01

    Recent advances in biometric technology have pushed towards more robust and reliable systems. We aim to build systems that have low recognition errors and are less affected by variation in recording conditions. Recognition errors are often attributed to the usage of low-quality biometric samples. Hence, there is a need to develop new intelligent techniques and strategies to automatically measure/quantify the quality of biometric image samples and if necessary restore image quality according to the needs of the intended application. In this paper, we present no-reference image quality measures in the spatial domain that have impact on face recognition. The first is called symmetrical adaptive local quality index (SALQI) and the second is called middle halve (MH). Also, an adaptive strategy has been developed to select the best way to restore the image quality, called symmetrical adaptive histogram equalization (SAHE). The main benefits of using quality measures for an adaptive strategy are: (1) avoidance of excessive unnecessary enhancement procedures that may cause undesired artifacts, and (2) reduced computational complexity, which is essential for real-time applications. We test the success of the proposed measures and adaptive approach for a wavelet-based face recognition system that uses the nearest neighbor classifier. We demonstrate noticeable improvements in the performance of the adaptive face recognition system over the corresponding non-adaptive scheme.

  19. Application of uncertainty and sensitivity analysis to the air quality SHERPA modelling tool

    NASA Astrophysics Data System (ADS)

    Pisoni, E.; Albrecht, D.; Mara, T. A.; Rosati, R.; Tarantola, S.; Thunis, P.

    2018-06-01

    Air quality has significantly improved in Europe over the past few decades. Nonetheless, high concentrations are still measured, mainly in specific regions and cities. This dimensional shift, from EU-wide to hot-spot exceedances, calls for a novel approach to regional air quality management (to complement EU-wide existing policies). The SHERPA (Screening for High Emission Reduction Potentials on Air quality) modelling tool was developed in this context. It provides an additional tool to be used in support of regional/local decision makers responsible for the design of air quality plans. It is therefore important to evaluate the quality of the SHERPA model, and its behavior in the face of various kinds of uncertainty. Uncertainty and sensitivity analysis techniques can be used for this purpose. They both reveal the links between assumptions and forecasts, help in model simplification and may highlight unexpected relationships between inputs and outputs. Thus, a policy-steered SHERPA module - predicting air quality improvement linked to emission reduction scenarios - was evaluated by means of (1) uncertainty analysis (UA) to quantify uncertainty in the model output, and (2) sensitivity analysis (SA) to identify the most influential input sources of this uncertainty. The results of this study provide relevant information about the key variables driving the SHERPA output uncertainty, and advise policy-makers and modellers where to place their efforts for an improved decision-making process.
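
    A common way to implement such a variance-based sensitivity analysis is Saltelli sampling plus Sobol indices, sketched below with the SALib package on a deliberately simple stand-in function. The input names and bounds are assumptions; the actual analysis runs the SHERPA source-receptor module itself.

```python
# Variance-based sensitivity analysis sketch with SALib: sample the input
# space, run a cheap stand-in model, and compute first-order Sobol indices.
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["alpha", "omega", "flat_weight"],   # illustrative SHERPA-like inputs
    "bounds": [[0.1, 2.0], [1.0, 3.0], [0.0, 1.0]],
}

X = saltelli.sample(problem, 1024)                # N*(2D+2) model runs

def surrogate_model(x):
    # stand-in for "concentration change per unit emission reduction"
    return x[:, 0] ** x[:, 1] + 0.3 * x[:, 2]

Y = surrogate_model(X)
Si = sobol.analyze(problem, Y)
print(dict(zip(problem["names"], Si["S1"].round(2))))   # first-order indices
```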

  20. Local-Scale Air Quality Modeling in Support of Human Health and Exposure Research (Invited)

    NASA Astrophysics Data System (ADS)

    Isakov, V.

    2010-12-01

    Spatially- and temporally-sparse information on air quality is a key concern for air-pollution-related environmental health studies. Monitor networks are sparse in both space and time, are costly to maintain, and are often designed purposely to avoid detecting highly localized sources. Recent studies have shown that more narrowly defining the geographic domain of the study populations and improvements in the measured/estimated ambient concentrations can lead to stronger associations between air pollution and hospital admissions and mortality records. Traditionally, ambient air quality measurements have been used as a primary input to support human health and exposure research. However, there is increasing evidence that the current ambient monitoring network is not capturing sharp gradients in exposure due to the presence of high concentration levels near, for example, major roadways. Many air pollutants exhibit large concentration gradients near large emitters such as major roadways, factories, ports, etc. To overcome these limitations, researchers are now beginning to use air quality models to support air pollution exposure and health studies. There are many advantages to using air quality models over traditional approaches based on existing ambient measurements alone. First, models can provide spatially- and temporally-resolved concentrations as direct input to exposure and health studies and thus better define the concentration levels for the population in the geographic domain. Air quality models have a long history of use in air pollution regulation, and are supported by regulatory agencies and a large user community. Also, models can provide bidirectional linkages between sources of emissions and ambient concentrations, thus allowing exploration of various mitigation strategies to reduce exposure risk. In order to provide the best estimates of air concentrations to support human health and exposure studies, model estimates should consider local-scale features

  1. QMEANclust: estimation of protein model quality by combining a composite scoring function with structural density information.

    PubMed

    Benkert, Pascal; Schwede, Torsten; Tosatto, Silvio Ce

    2009-05-20

    The selection of the most accurate protein model from a set of alternatives is a crucial step in protein structure prediction both in template-based and ab initio approaches. Scoring functions have been developed which can either return a quality estimate for a single model or derive a score from the information contained in the ensemble of models for a given sequence. Local structural features occurring more frequently in the ensemble have a greater probability of being correct. Within the context of the CASP experiment, these so-called consensus methods have been shown to perform considerably better in selecting good candidate models, but tend to fail if the best models are far from the dominant structural cluster. In this paper we show that model selection can be improved if both approaches are combined by pre-filtering the models used during the calculation of the structural consensus. Our recently published QMEAN composite scoring function has been improved by including an all-atom interaction potential term. The preliminary model ranking based on the new QMEAN score is used to select a subset of reliable models against which the structural consensus score is calculated. This scoring function, called QMEANclust, achieves a correlation coefficient between predicted quality score and GDT_TS of 0.9 averaged over the 98 CASP7 targets and performs significantly better in selecting good models from the ensemble of server models than any other group participating in the quality estimation category of CASP7. Both scoring functions are also benchmarked on the MOULDER test set consisting of 20 target proteins each with 300 alternative models generated by MODELLER. QMEAN outperforms all other tested scoring functions operating on individual models, while the consensus method QMEANclust only works properly on decoy sets containing a certain fraction of near-native conformations. We also present a local version of QMEAN for the per-residue estimation of model quality (QMEANlocal
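
    The two-stage idea (single-model pre-filtering followed by structural consensus) can be sketched as below. The pairwise similarity matrix is a placeholder for a superposition-based score such as GDT_TS; the keep fraction and data are assumptions, not the published QMEANclust parameters.

```python
# Sketch of a QMEANclust-style scheme: rank models with a single-model score,
# then compute a structural consensus score only against the top-ranked subset.
import numpy as np

def consensus_after_prefilter(single_scores, pairwise_sim, keep_fraction=0.2):
    """single_scores: (n,) composite scores, higher = better.
    pairwise_sim: (n, n) structural similarity between models, in [0, 1]."""
    n = len(single_scores)
    k = max(2, int(keep_fraction * n))
    reference = np.argsort(single_scores)[::-1][:k]          # pre-filtered reliable subset
    return pairwise_sim[:, reference].mean(axis=1)            # agreement with that subset

rng = np.random.default_rng(2)
sims = rng.uniform(0.3, 1.0, size=(10, 10)); sims = (sims + sims.T) / 2
np.fill_diagonal(sims, 1.0)
print(consensus_after_prefilter(rng.uniform(size=10), sims))
```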

  2. A deterministic aggregate production planning model considering quality of products

    NASA Astrophysics Data System (ADS)

    Madadi, Najmeh; Yew Wong, Kuan

    2013-06-01

    Aggregate Production Planning (APP) is medium-term planning concerned with the lowest-cost method of production planning to meet customers' requirements and to satisfy fluctuating demand over a planning time horizon. The APP problem has been studied widely since it was introduced and formulated in the 1950s. However, in most studies conducted in the APP area, researchers have concentrated on common objectives such as minimization of cost, fluctuation in the number of workers, and inventory level. Specifically, maintaining quality at a desirable level as an objective while minimizing cost has not been considered in previous studies. In this study, an attempt has been made to develop a multi-objective mixed integer linear programming model that serves those companies aiming to incur the minimum level of operational cost while maintaining quality at an acceptable level. In order to obtain a solution to the multi-objective model, the Fuzzy Goal Programming approach and the max-min operator of Bellman-Zadeh were applied to the model. In the final step, IBM ILOG CPLEX Optimization Studio software was used to obtain the experimental results based on data collected from an automotive parts manufacturing company. The results show that incorporating quality in the model imposes some costs; however, a trade-off should be made between the cost of producing higher-quality products and the cost that the firm may incur due to customer dissatisfaction and lost sales.
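
    The Bellman-Zadeh max-min step can be illustrated with a small linear program: maximize the minimum membership value lambda over linear membership functions for the cost and quality goals. The coefficients, capacities and pay-off ranges below are invented for illustration and solved with SciPy rather than CPLEX.

```python
# Max-min (Bellman-Zadeh) fuzzy goal programming sketch: maximize lambda
# subject to linear membership functions for a cost goal (minimize) and a
# quality goal (maximize). All numbers are invented.
from scipy.optimize import linprog

# decision variables: x1, x2 = production quantities, lam = overall satisfaction
# cost = 4*x1 + 6*x2 (minimize), quality index = 3*x1 + 5*x2 (maximize)
# assumed pay-off ranges: cost in [440, 740], quality in [340, 590]
c = [0, 0, -1]                               # maximize lambda  <=>  minimize -lambda
A_ub = [
    [4, 6, 300],                             # (740 - cost)/300 >= lam
    [-3, -5, 250],                           # (quality - 340)/250 >= lam
    [-1, -1, 0],                             # demand: x1 + x2 >= 100
]
b_ub = [740, -340, -100]
bounds = [(0, 80), (0, 70), (0, 1)]          # capacities and 0 <= lam <= 1

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
x1, x2, lam = res.x
print(f"x1={x1:.0f}, x2={x2:.0f}, satisfaction lambda={lam:.2f}")
```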

  3. Antecedents and Consequences of Service Quality in a Higher Education Context: A Qualitative Research Approach

    ERIC Educational Resources Information Center

    Sultan, Parves; Wong, Ho Yin

    2013-01-01

    Purpose: The purpose of the paper is to report on the perception of students in regard to critical antecedents, dimensions and consequences of service quality with an aim to develop a theoretical model in the context of a university in Australia. Design/methodology/approach: This research used focus group discussions with 19 students who had been…

  4. A state of the art regarding urban air quality prediction models

    NASA Astrophysics Data System (ADS)

    Croitoru, Cristiana; Nastase, Ilinca

    2018-02-01

    Urban pollution represents an increasing risk to residents of urban regions, particularly in large, over-industrialized cities, where traffic is responsible for more than 25% of gaseous air pollutants and dust particles. Air quality modelling plays an important role in addressing air pollution control and management approaches by providing guidelines for better and more efficient air quality forecasting, along with smart monitoring sensor networks. The advances in technology regarding simulation, forecasting and monitoring are part of the new smart cities, which offer a healthy environment for their occupants.

  5. ENSEMBLE and AMET: Two Systems and Approaches to a Harmonized, Simplified and Efficient Facility for Air Quality Models Development and Evaluation

    EPA Science Inventory

    The complexity of air quality modeling systems and air quality monitoring data makes ad hoc systems for model evaluation important aids to the modeling community. Among these are the ENSEMBLE system developed by the EC Joint Research Centre, and the AMET software developed by the US-...

  6. Modelling of beef sensory quality for a better prediction of palatability.

    PubMed

    Hocquette, Jean-François; Van Wezemael, Lynn; Chriki, Sghaier; Legrand, Isabelle; Verbeke, Wim; Farmer, Linda; Scollan, Nigel D; Polkinghorne, Rod; Rødbotten, Rune; Allen, Paul; Pethick, David W

    2014-07-01

    Despite efforts by the industry to control the eating quality of beef, there remains a high level of variability in palatability, which is one reason for consumer dissatisfaction. In Europe, there is still no reliable on-line tool to predict beef quality and deliver consistent quality beef to consumers. Beef quality traits depend in part on the physical and chemical properties of the muscles. The determination of these properties (known as muscle profiling) will allow for more informed decisions to be made in the selection of individual muscles for the production of value-added products. Therefore, scientists and professional partners of the ProSafeBeef project have brought together all the data they have accumulated over 20 years. The resulting BIF-Beef (Integrated and Functional Biology of Beef) data warehouse contains available data of animal growth, carcass composition, muscle tissue characteristics and beef quality traits. This database is useful to determine the most important muscle characteristics associated with high tenderness, high flavour or generally high quality. Another, more consumer-driven, modelling tool was developed in Australia: the Meat Standards Australia (MSA) grading scheme that predicts beef quality for each individual muscle × cooking method combination using various information on the corresponding animals and post-slaughter processing factors. This system also has the potential to detect variability in quality within muscles. The MSA system proved to be effective in predicting beef palatability not only in Australia but also in many other countries. The results of the work conducted in Europe within the ProSafeBeef project indicate that it would be possible to manage a grading system in Europe similar to the MSA system. The combination of the different modelling approaches (namely muscle biochemistry and an MSA-like meat grading system adapted to the European market) is a promising area of research to improve the prediction

  7. The choices, choosing model of quality of life: linkages to a science base.

    PubMed

    Gurland, Barry J; Gurland, Roni V

    2009-01-01

    A previous paper began with a critical review of current models and measures of quality of life and then proposed criteria for judging the relative merits of alternative models: preference was given to finding a model with explicit mechanisms, linkages to a science base, a means of identifying deficits amenable to rational restorative interventions, and with embedded values of the whole person. A conjectured model, based on the processes of accessing choices and choosing among them, matched the proposed criteria. The choices and choosing (c-c) process is an evolved adaptive mechanism dedicated to the pursuit of quality of life, driven by specific biological and psychological systems, and influenced also by social and environmental forces. In this paper the c-c model is examined for its potential to strengthen the science base for the field of quality of life and thus to unify many approaches to concept and measurement. A third paper in this set will lay out a guide to applying the c-c model in evaluating impairments of quality of life and will tie this evaluation to corresponding interventions aimed at relieving restrictions or distortions of the c-c process; thus helping people to preserve and improve their quality of life. The fourth paper will demonstrate empirical analyses of the relationship between health imposed restrictions of options for living and conventional indicators of diminished quality of life. (c) 2008 John Wiley & Sons, Ltd.

  8. A photosynthesis-based two-leaf canopy stomatal conductance model for meteorology and air quality modeling with WRF/CMAQ PX LSM

    NASA Astrophysics Data System (ADS)

    Ran, Limei; Pleim, Jonathan; Song, Conghe; Band, Larry; Walker, John T.; Binkowski, Francis S.

    2017-02-01

    A coupled photosynthesis-stomatal conductance model with single-layer sunlit and shaded leaf canopy scaling is implemented and evaluated in a diagnostic box model with the Pleim-Xiu land surface model (PX LSM) and ozone deposition model components taken directly from the meteorology and air quality modeling system - WRF/CMAQ (Weather Research and Forecast model and Community Multiscale Air Quality model). The photosynthesis-based model for PX LSM (PX PSN) is evaluated at a FLUXNET site for implementation against different parameterizations and the current PX LSM approach with a simple Jarvis function (PX Jarvis). Latent heat flux (LH) from PX PSN is further evaluated at five FLUXNET sites with different vegetation types and landscape characteristics. Simulated ozone deposition and flux from PX PSN are evaluated at one of the sites with ozone flux measurements. Overall, the PX PSN simulates LH as well as the PX Jarvis approach. The PX PSN, however, shows distinct advantages over the PX Jarvis approach for grassland that likely result from its treatment of C3 and C4 plants for CO2 assimilation. Simulations using Moderate Resolution Imaging Spectroradiometer (MODIS) leaf area index (LAI) rather than LAI measured at each site assess how the model would perform with grid averaged data used in WRF/CMAQ. MODIS LAI estimates degrade model performance at all sites but one site having exceptionally old and tall trees. Ozone deposition velocity and ozone flux along with LH are simulated especially well by the PX PSN compared to significant overestimation by the PX Jarvis for a grassland site.
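
    For orientation, a generic Ball-Berry-type conductance calculation is sketched below, showing how a photosynthesis-based scheme ties stomatal conductance to net assimilation; the actual PX PSN formulation, with its sunlit/shaded canopy scaling and C3/C4 treatment, is more elaborate and is not reproduced here. Parameter values are typical defaults assumed for illustration.

```python
# Generic Ball-Berry-type stomatal conductance: gs = g0 + g1 * A * RH / Cs.
# Shown only to illustrate the coupling of conductance to carbon assimilation.
def ball_berry_gs(a_net, rh, cs, g0=0.01, g1=9.0):
    """a_net: net assimilation (umol CO2 m-2 s-1)
    rh:    relative humidity at the leaf surface (0..1)
    cs:    CO2 concentration at the leaf surface (umol mol-1)
    g0,g1: intercept and slope parameters (typical C3 defaults, assumed)."""
    return g0 + g1 * a_net * rh / cs       # stomatal conductance (mol m-2 s-1)

# sunlit vs shaded leaf: lower assimilation on the shaded fraction
print(ball_berry_gs(a_net=15.0, rh=0.7, cs=380.0))   # sunlit leaf
print(ball_berry_gs(a_net=4.0, rh=0.7, cs=380.0))    # shaded leaf
```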

  9. Massive integration of diverse protein quality assessment methods to improve template based modeling in CASP11.

    PubMed

    Cao, Renzhi; Bhattacharya, Debswapna; Adhikari, Badri; Li, Jilong; Cheng, Jianlin

    2016-09-01

    Model evaluation and selection is an important step and a big challenge in template-based protein structure prediction. Individual model quality assessment methods designed for recognizing some specific properties of protein structures often fail to consistently select good models from a model pool because of their limitations. Therefore, combining multiple complementary quality assessment methods is useful for improving model ranking and consequently tertiary structure prediction. Here, we report the performance and analysis of our human tertiary structure predictor (MULTICOM) based on the massive integration of 14 diverse complementary quality assessment methods that was successfully benchmarked in the 11th Critical Assessment of Techniques for Protein Structure Prediction (CASP11). The predictions of MULTICOM for 39 template-based domains were rigorously assessed by six scoring metrics covering global topology of Cα trace, local all-atom fitness, side chain quality, and physical reasonableness of the model. The results show that the massive integration of complementary, diverse single-model and multi-model quality assessment methods can effectively leverage the strength of single-model methods in distinguishing quality variation among similar good models and the advantage of multi-model quality assessment methods of identifying reasonable average-quality models. The overall excellent performance of the MULTICOM predictor demonstrates that integrating a large number of model quality assessment methods in conjunction with model clustering is a useful approach to improve the accuracy, diversity, and consequently robustness of template-based protein structure prediction. Proteins 2016; 84(Suppl 1):247-259. © 2015 Wiley Periodicals, Inc.

  10. Air Quality Modeling

    EPA Pesticide Factsheets

    In this technical support document (TSD) EPA describes the air quality modeling performed to support the Environmental Protection Agency’s Transport Rule proposal (now known as the Cross-State Air Pollution Rule).

  11. Got (the Right) Milk? How a Blended Quality Improvement Approach Catalyzed Change.

    PubMed

    Luton, Alexandra; Bondurant, Patricia G; Campbell, Amy; Conkin, Claudia; Hernandez, Jae; Hurst, Nancy

    2015-10-01

    The expression, storage, preparation, fortification, and feeding of breast milk are common ongoing activities in many neonatal intensive care units (NICUs) today. Errors in breast milk administration are a serious issue that should be prevented to preserve the health and well-being of NICU babies and their families. This paper describes how a program to improve processes surrounding infant feeding was developed, implemented, and evaluated. The project team used a blended quality improvement approach that included the Model for Improvement, Lean and Six Sigma methodologies, and principles of High Reliability Organizations to identify and drive short-term, medium-term, and long-term improvement strategies. Through its blended quality improvement approach, the team strengthened the entire dispensation system for both human milk and formula and outlined a clear vision and plan for further improvements as well. The NICU reduced feeding errors by 83%. Be systematic in the quality improvement approach, and apply proven methods to improving processes surrounding infant feeding. Involve expert project managers with nonclinical perspective to guide work in a systematic way and provide unbiased feedback. Create multidisciplinary, cross-departmental teams that include a vast array of stakeholders in NICU feeding processes to ensure comprehensive examination of current state, identification of potential risks, and "outside the box" potential solutions. As in the realm of pharmacy, the processes involved in preparing feedings for critically ill infants should be carried out via predictable, reliable means including robust automated verification that integrates seamlessly into existing processes. The use of systems employed in pharmacy for medication preparation should be considered in the human milk and formula preparation setting.

  12. Towards a model for the measurement of data quality in websites

    NASA Astrophysics Data System (ADS)

    Leite, Patrícia; Gonçalves, Joaquim; Teixeira, Paulo; Rocha, Álvaro

    2014-10-01

    Websites are, nowadays, the face of institutions, but they are often neglected, especially when it comes to content. In the present paper, we present an investigation whose final goal is the development of a model for the measurement of data quality in institutional websites of health units. To that end, we have carried out a bibliographic review of the available approaches for the evaluation of website content quality, in order to identify the most recurrent dimensions and attributes, and we are currently carrying out a Delphi Method process, now in its second stage, with the purpose of reaching an adequate set of attributes for the measurement of content quality.

  13. Tamoxifen for breast cancer risk reduction: impact of alternative approaches to quality-of-life adjustment on cost-effectiveness analysis.

    PubMed

    Melnikow, Joy; Birch, Stephen; Slee, Christina; McCarthy, Theodore J; Helms, L Jay; Kuppermann, Miriam

    2008-09-01

    In cost-effectiveness analysis (CEA), the effects of health-care interventions on multiple health dimensions typically require consideration of both quantity and quality of life. The objective of this study was to explore the impact of alternative approaches to quality-of-life adjustment using patient preferences (utilities) on the outcome of a CEA of tamoxifen use for breast cancer risk reduction. A state transition Markov model tracked hypothetical cohorts of women who did or did not take 5 years of tamoxifen for breast cancer risk reduction. Incremental quality-adjusted effectiveness and cost-effectiveness ratios (ICERs) for models including and excluding a utility adjustment for menopausal symptoms were compared with each other and to a global utility model. Two hundred fifty-five women aged 50 and over with estimated 5-year breast cancer risk ≥1.67% participated in utility assessment interviews. Standard gamble utilities were assessed for specified tamoxifen-related health outcomes, current health, and for a global assessment of possible outcomes of tamoxifen use. Inclusion of a utility for menopausal symptoms in the outcome-specific models substantially increased the ICER; at the threshold 5-year breast cancer risk of 1.67%, tamoxifen was dominated. When a global utility for tamoxifen was used in place of outcome-specific utilities, tamoxifen was dominated under all circumstances. CEAs may be profoundly affected by the types of outcomes considered for quality-of-life adjustment and how these outcomes are grouped for utility assessment. Comparisons of ICERs across analyses must consider effects of different approaches to using utilities for quality-of-life adjustment.
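
    The structure of such an analysis can be sketched with a small Markov cohort model, as below: two strategies are run through the same state-transition machinery and an ICER is computed from discounted costs and QALYs. All transition probabilities, costs and utilities are invented; they are not the study's estimates.

```python
# Toy Markov cohort model comparing two strategies and reporting an ICER.
import numpy as np

# states: well, breast_cancer, dead; annual transition matrices (rows sum to 1)
P_no_tam = np.array([[0.975, 0.020, 0.005],
                     [0.000, 0.900, 0.100],
                     [0.000, 0.000, 1.000]])
P_tam = np.array([[0.984, 0.010, 0.006],     # lower cancer incidence, slight excess risk
                  [0.000, 0.900, 0.100],
                  [0.000, 0.000, 1.000]])

cost = {"no_tam": np.array([100.0, 15000.0, 0.0]),
        "tam":    np.array([900.0, 15000.0, 0.0])}      # drug cost added while well
utility = {"no_tam": np.array([0.90, 0.70, 0.0]),
           "tam":    np.array([0.88, 0.70, 0.0])}       # menopausal-symptom decrement

def run(P, c, u, years=20, disc=0.03):
    dist, total_cost, total_qaly = np.array([1.0, 0.0, 0.0]), 0.0, 0.0
    for t in range(years):
        df = 1 / (1 + disc) ** t
        total_cost += df * dist @ c
        total_qaly += df * dist @ u
        dist = dist @ P
    return total_cost, total_qaly

c0, q0 = run(P_no_tam, cost["no_tam"], utility["no_tam"])
c1, q1 = run(P_tam, cost["tam"], utility["tam"])
dc, dq = c1 - c0, q1 - q0
if dq <= 0 and dc >= 0:
    print("tamoxifen is dominated (costs more, yields fewer QALYs)")
else:
    print(f"ICER = {dc / dq:,.0f} per QALY gained")
```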

  14. Model compilation: An approach to automated model derivation

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Baudin, Catherine; Iwasaki, Yumi; Nayak, Pandurang; Tanaka, Kazuo

    1990-01-01

    An approach to automated model derivation for knowledge-based systems is introduced. The approach, model compilation, involves procedurally generating the set of domain models used by a knowledge-based system. An implemented example illustrates how this approach can be used to derive models of different precision and abstraction, tailored to different tasks, from a given set of base domain models. In particular, two implemented model compilers are described, each of which takes as input a base model that describes the structure and behavior of a simple electromechanical device, the Reaction Wheel Assembly of NASA's Hubble Space Telescope. The compilers transform this relatively general base model into simple task-specific models for troubleshooting and redesign, respectively, by applying a sequence of model transformations. Each transformation in this sequence produces an increasingly more specialized model. The compilation approach lessens the burden of updating and maintaining consistency among models by enabling their automatic regeneration.

  15. A multiobjective response surface approach for improved water quality planning in lakes and reservoirs

    NASA Astrophysics Data System (ADS)

    Castelletti, A.; Pianosi, F.; Soncini-Sessa, R.; Antenucci, J. P.

    2010-06-01

    Improved data collection techniques as well as increasing computing power are opening up new opportunities for the development of sophisticated models that can accurately reproduce hydrodynamic and biochemical conditions of water bodies. While increasing model complexity is considered a virtue for scientific purposes, it is a definite disadvantage for management (engineering) purposes, as it limits the model applicability to what-if analysis over a few, a priori defined interventions. In the recent past, this has become a significant limitation, particularly considering recent advances in water quality rehabilitation technologies (e.g., mixers or oxygenators) for which many design parameters have to be decided. In this paper, a novel approach toward integrating science-oriented and engineering-oriented models and improving water quality planning is presented. It is based on the use of a few appropriately designed simulations of a complex process-based model to iteratively identify the multidimensional function (response surface) that maps the rehabilitation interventions into the objective function. On the basis of the response surface (RS), a greater number of interventions can be quickly evaluated and the corresponding Pareto front can be approximated. Interesting points on the front are then selected and the corresponding interventions are simulated using the original process-based model, thus obtaining new decision-objective samples to refine the RS approximation. The approach is demonstrated in Googong Reservoir (Australia), which is periodically affected by high concentrations of manganese and cyanobacteria. Results indicate that significant improvements could be observed by simply changing the location of the two mixers installed in 2007. Furthermore, it also suggests the best location for an additional pair of mixers.
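
    A stripped-down version of the response-surface loop might look like the sketch below: a handful of (stand-in) simulator runs are used to fit a quadratic surrogate for each objective, many candidate interventions are screened on the surrogate, and the non-dominated ones are kept for re-simulation. The simulator, design and objectives are placeholders, not the Googong Reservoir model.

```python
# Response-surface sketch: fit cheap quadratic surrogates to a few expensive
# runs, screen many candidates, and keep the Pareto-optimal interventions.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

def expensive_simulator(x):
    # placeholder for the hydrodynamic/water-quality model:
    # x = (mixer 1 setting, mixer 2 setting); returns (manganese index, cyanobacteria index)
    mn = (x[:, 0] - 0.3) ** 2 + 0.5 * x[:, 1]
    cyano = (x[:, 1] - 0.7) ** 2 + 0.4 * x[:, 0]
    return np.column_stack([mn, cyano])

rng = np.random.default_rng(3)
X_design = rng.uniform(0, 1, size=(20, 2))            # a few designed runs
Y_design = expensive_simulator(X_design)

surrogate = [make_pipeline(PolynomialFeatures(2), LinearRegression()).fit(X_design, Y_design[:, j])
             for j in range(2)]

X_cand = rng.uniform(0, 1, size=(2000, 2))            # cheap surrogate evaluations
Y_cand = np.column_stack([m.predict(X_cand) for m in surrogate])

# keep non-dominated candidates (both objectives minimized)
pareto = [i for i in range(len(Y_cand))
          if not np.any(np.all(Y_cand <= Y_cand[i], axis=1) & np.any(Y_cand < Y_cand[i], axis=1))]
print(f"{len(pareto)} Pareto-optimal interventions to re-check with the full model")
```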

  16. An integrated modeling approach for estimating the water quality benefits of conservation practices at the river basin scale

    USDA-ARS?s Scientific Manuscript database

    The USDA initiated the Conservation Effects Assessment Project (CEAP) to quantify the environmental benefits of conservation practices at regional and national scales. For this assessment, a sampling and modeling approach is used. This paper provides a technical overview of the modeling approach use...

  17. Problem of data quality and the limitations of the infrastructure approach

    NASA Astrophysics Data System (ADS)

    Behlen, Fred M.; Sayre, Richard E.; Rackus, Edward; Ye, Dingzhong

    1998-07-01

    The 'Infrastructure Approach' is a PACS implementation methodology wherein the archive, network and information systems interfaces are acquired first, and workstations are installed later. The approach allows building a history of archived image data, so that most prior examinations are available in digital form when workstations are deployed. A limitation of the Infrastructure Approach is that the deferred use of digital image data defeats many data quality management functions that are provided automatically by human mechanisms when data is immediately used for the completion of clinical tasks. If the digital data is used solely for archiving while reports are interpreted from film, the radiologist serves only as a check against lost films, and another person must be designated as responsible for the quality of the digital data. Data from the Radiology Information System and the PACS were analyzed to assess the nature and frequency of system and data quality errors. The error level was found to be acceptable if supported by auditing and error resolution procedures requiring additional staff time, and in any case was better than the loss rate of a hardcopy film archive. It is concluded that the problem of data quality compromises but does not negate the value of the Infrastructure Approach. The Infrastructure Approach is best employed only to a limited extent; any phased PACS implementation should have a substantial complement of workstations dedicated to softcopy interpretation for at least some applications, with full deployment following not long thereafter.

  18. Development of Gridded Fields of Urban Canopy Parameters for Advanced Urban Meteorological and Air Quality Models

    EPA Science Inventory

    Urban dispersion and air quality simulation models applied at various horizontal scales require different levels of fidelity for specifying the characteristics of the underlying surfaces. As the modeling scales approach the neighborhood level (~1 km horizontal grid spacing), the...

  19. Quantile-based Bayesian maximum entropy approach for spatiotemporal modeling of ambient air quality levels.

    PubMed

    Yu, Hwa-Lung; Wang, Chih-Hsin

    2013-02-05

    Understanding the daily changes in ambient air quality concentrations is important for assessing human exposure and environmental health. However, the fine temporal scales (e.g., hourly) involved in this assessment often lead to high variability in air quality concentrations. This is because of the complex short-term physical and chemical mechanisms among the pollutants. Consequently, high heterogeneity is usually present not only in the averaged pollution levels, but also in the intraday variance levels of the daily observations of ambient concentration across space and time. This characteristic decreases the estimation performance of common techniques. This study proposes a novel quantile-based Bayesian maximum entropy (QBME) method to account for the nonstationary and nonhomogeneous characteristics of ambient air pollution dynamics. The QBME method characterizes the spatiotemporal dependence among the ambient air quality levels based on their location-specific quantiles and accounts for spatiotemporal variations using a local weighted smoothing technique. The epistemic framework of the QBME method can allow researchers to further consider the uncertainty of space-time observations. This study presents the spatiotemporal modeling of daily CO and PM10 concentrations across Taiwan from 1998 to 2009 using the QBME method. Results show that the QBME method can effectively improve estimation accuracy in terms of lower mean absolute errors and standard deviations over space and time, especially for pollutants with strong nonhomogeneous variances across space. In addition, the epistemic framework can allow researchers to assimilate site-specific secondary information where observations are absent because of the common preferential sampling issues of environmental data. The proposed QBME method provides a practical and powerful framework for the spatiotemporal modeling of ambient pollutants.
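
    The sketch below is not the Bayesian maximum entropy machinery itself, but a toy illustration of the two ingredients named in the abstract: location-specific quantiles of the observed record and a locally weighted (Gaussian-kernel) space-time smoother. Coordinates, bandwidths and data are synthetic assumptions.

```python
# (1) summarize observations by quantiles, (2) estimate a value at an
# unmonitored space-time point by Gaussian-kernel weighted smoothing.
import numpy as np

rng = np.random.default_rng(4)
n_obs = 300
xy = rng.uniform(0, 100, size=(n_obs, 2))        # station coordinates (km)
t = rng.uniform(0, 365, n_obs)                   # day of year
pm10 = 40 + 0.2 * xy[:, 0] + 10 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 5, n_obs)

# (1) pooled quantiles of the observed record (site-specific in practice)
q25, q50, q75 = np.quantile(pm10, [0.25, 0.5, 0.75])
print("quantiles:", round(q25, 1), round(q50, 1), round(q75, 1))

# (2) locally weighted estimate at an unmonitored space-time point
def local_estimate(x0, y0, t0, space_bw=20.0, time_bw=15.0):
    d2 = (((xy[:, 0] - x0) ** 2 + (xy[:, 1] - y0) ** 2) / space_bw ** 2
          + ((t - t0) / time_bw) ** 2)
    w = np.exp(-0.5 * d2)                        # Gaussian kernel in space and time
    return np.sum(w * pm10) / np.sum(w)

print("estimate at (50 km, 50 km, day 180):", round(local_estimate(50, 50, 180), 1))
```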

  20. Adaptation of a weighted regression approach to evaluate water quality trends in an estuary

    EPA Science Inventory

    To improve the description of long-term changes in water quality, a weighted regression approach developed to describe trends in pollutant transport in rivers was adapted to analyze a long-term water quality dataset from Tampa Bay, Florida. The weighted regression approach allows...
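
    A WRTDS-style locally weighted regression can be sketched as below: for each prediction point, observations are weighted by tri-cube kernels on their distance in time, discharge and season, and a small linear model is fit with those weights. Window half-widths and the synthetic data are illustrative assumptions, not those of the Tampa Bay analysis.

```python
# Locally weighted regression in the spirit of WRTDS: tri-cube weights on
# time, discharge and season distances, then weighted least squares.
import numpy as np

rng = np.random.default_rng(5)
n = 400
dec_year = rng.uniform(1995, 2015, n)
log_q = rng.normal(2.0, 0.6, n)                                  # log discharge
conc = (1.5 - 0.02 * (dec_year - 1995) + 0.4 * log_q
        + 0.3 * np.sin(2 * np.pi * dec_year) + rng.normal(0, 0.2, n))

def tricube(d, h):
    d = np.abs(d) / h
    return np.where(d < 1, (1 - d ** 3) ** 3, 0.0)

def wrtds_like_estimate(t0, q0, h_year=7.0, h_q=1.0, h_season=0.5):
    season_dist = np.minimum(np.abs(dec_year % 1 - t0 % 1), 1 - np.abs(dec_year % 1 - t0 % 1))
    w = tricube(dec_year - t0, h_year) * tricube(log_q - q0, h_q) * tricube(season_dist, h_season)
    X = np.column_stack([np.ones(n), dec_year - t0, log_q,
                         np.sin(2 * np.pi * dec_year), np.cos(2 * np.pi * dec_year)])
    beta, *_ = np.linalg.lstsq(X * np.sqrt(w)[:, None], conc * np.sqrt(w), rcond=None)
    x0 = np.array([1.0, 0.0, q0, np.sin(2 * np.pi * t0), np.cos(2 * np.pi * t0)])
    return x0 @ beta

print("estimated concentration in mid-2010 at log Q = 2:", round(wrtds_like_estimate(2010.5, 2.0), 2))
```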

  1. Using a Quality-Led Multimedia Approach for Interpersonal Communication Training

    ERIC Educational Resources Information Center

    Labour, Michel; Leleu-Merviel, Sylvie; Vieville, Nicholas

    2004-01-01

    Faced with a fast changing society, the need to develop quality instructional materials to update professional skills has become a growing necessity. This article shows how certain instructional design techniques, such as the "Scenistic" approach and the SNOW analysis, can ensure the educational and the broad technical quality of interactive…

  2. A frequency-domain approach to improve ANNs generalization quality via proper initialization.

    PubMed

    Chaari, Majdi; Fekih, Afef; Seibi, Abdennour C; Hmida, Jalel Ben

    2018-08-01

    The ability to train a network without memorizing the input/output data, thereby allowing a good predictive performance when applied to unseen data, is paramount in ANN applications. In this paper, we propose a frequency-domain approach to evaluate the network initialization in terms of quality of training, i.e., generalization capabilities. As an alternative to the conventional time-domain methods, the proposed approach eliminates the approximate nature of network validation using an excess of unseen data. The benefits of the proposed approach are demonstrated using two numerical examples, where two trained networks performed similarly on the training and the validation data sets, yet they revealed a significant difference in prediction accuracy when tested using a different data set. This observation is of utmost importance in modeling applications requiring a high degree of accuracy. The efficiency of the proposed approach is further demonstrated on a real-world problem, where unlike other initialization methods, a more conclusive assessment of generalization is achieved. On the practical front, subtle methodological and implementational facets are addressed to ensure reproducibility and pinpoint the limitations of the proposed approach. Copyright © 2018 Elsevier Ltd. All rights reserved.

  3. Quality measurement and benchmarking of HPV vaccination services: a new approach.

    PubMed

    Maurici, Massimo; Paulon, Luca; Campolongo, Alessandra; Meleleo, Cristina; Carlino, Cristiana; Giordani, Alessandro; Perrelli, Fabrizio; Sgricia, Stefano; Ferrante, Maurizio; Franco, Elisabetta

    2014-01-01

    A new measurement process based upon a well-defined mathematical model was applied to evaluate the quality of human papillomavirus (HPV) vaccination centers in 3 of 12 Local Health Units (ASLs) within the Lazio Region of Italy. The quality aspects considered for evaluation were communicational efficiency, organizational efficiency and comfort. The overall maximum achievable value was 86.10%, while the HPV vaccination quality scores for ASL1, ASL2 and ASL3 were 73.07%, 71.08%, and 67.21%, respectively. With this new approach it is possible to represent the probabilistic reasoning of a stakeholder who evaluates the quality of a healthcare provider. All ASLs had margins for improvements and optimal quality results can be assessed in terms of better performance conditions, confirming the relationship between the resulting quality scores and HPV vaccination coverage. The measurement process was structured into three steps and involved four stakeholder categories: doctors, nurses, parents and vaccinated women. In Step 1, questionnaires were administered to collect different stakeholders' points of view (i.e., subjective data) that were elaborated to obtain the best and worst performance conditions when delivering a healthcare service. Step 2 of the process involved the gathering of performance data during the service delivery (i.e., objective data collection). Step 3 of the process involved the elaboration of all data: subjective data from step 1 are used to define a "standard" to test objective data from step 2. This entire process led to the creation of a set of scorecards. Benchmarking is presented as a result of the probabilistic meaning of the evaluated scores.

  4. Approaches to quality management and accreditation in a genetic testing laboratory

    PubMed Central

    Berwouts, Sarah; Morris, Michael A; Dequeker, Elisabeth

    2010-01-01

    Medical laboratories, and specifically genetic testing laboratories, provide vital medical services to different clients: clinicians requesting a test, patients from whom the sample was collected, public health and medical-legal instances, referral laboratories and authoritative bodies. All expect results that are accurate and obtained in an efficient and effective manner, within a suitable time frame and at acceptable cost. There are different ways of achieving the end results, but compliance with International Organization for Standardization (ISO) 15189, the international standard for the accreditation of medical laboratories, is becoming progressively accepted as the optimal approach to assuring quality in medical testing. We present recommendations and strategies designed to aid genetic testing laboratories with the implementation of a quality management system, including key aspects such as document control, external quality assessment, internal quality control, internal audit, management review, validation, as well as managing the human side of change. The focus is on pragmatic approaches to attain the levels of quality management and quality assurance required for accreditation according to ISO 15189, within the context of genetic testing. Attention is also given to implementing efficient and effective quality improvement. PMID:20720559

  5. Extreme learning machines: a new approach for modeling dissolved oxygen (DO) concentration with and without water quality variables as predictors.

    PubMed

    Heddam, Salim; Kisi, Ozgur

    2017-07-01

    In this paper, several extreme learning machine (ELM) models, including standard extreme learning machine with sigmoid activation function (S-ELM), extreme learning machine with radial basis activation function (R-ELM), online sequential extreme learning machine (OS-ELM), and optimally pruned extreme learning machine (OP-ELM), are newly applied for predicting dissolved oxygen concentration with and without water quality variables as predictors. Firstly, using data from eight United States Geological Survey (USGS) stations located in different river basins in the USA, the S-ELM, R-ELM, OS-ELM, and OP-ELM were compared against the measured dissolved oxygen (DO) using four water quality variables, water temperature, specific conductance, turbidity, and pH, as predictors. For each station, we used data measured at an hourly time step for a period of 4 years. The dataset was divided into a training set (70%) and a validation set (30%). We selected several combinations of the water quality variables as inputs for each ELM model and six different scenarios were compared. Secondly, an attempt was made to predict DO concentration without water quality variables. To achieve this goal, we used the year numbers, 2008, 2009, etc., month numbers from (1) to (12), day numbers from (1) to (31) and hour numbers from (00:00) to (24:00) as predictors. Thirdly, the best ELM models were trained using the validation dataset and tested with the training dataset. The performances of the four ELM models were evaluated using four statistical indices: the coefficient of correlation (R), the Nash-Sutcliffe efficiency (NSE), the root mean squared error (RMSE), and the mean absolute error (MAE). Results obtained from the eight stations indicated that: (i) the best results were obtained by the S-ELM, R-ELM, OS-ELM, and OP-ELM models having four water quality variables as predictors; (ii) out of eight stations, the OP-ELM performed better than the other three ELM models at seven stations while the R
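
    The four evaluation indices named in this abstract are standard and can be computed as in the short Python sketch below; the toy observed and predicted DO values are illustrative, not the study's data.

      # Minimal sketch of the four evaluation metrics named above (R, NSE, RMSE, MAE).
      import numpy as np

      def metrics(obs, pred):
          obs, pred = np.asarray(obs, float), np.asarray(pred, float)
          r = np.corrcoef(obs, pred)[0, 1]                                       # coefficient of correlation
          nse = 1 - np.sum((obs - pred) ** 2) / np.sum((obs - obs.mean()) ** 2)  # Nash-Sutcliffe efficiency
          rmse = np.sqrt(np.mean((obs - pred) ** 2))                             # root mean squared error
          mae = np.mean(np.abs(obs - pred))                                      # mean absolute error
          return {"R": r, "NSE": nse, "RMSE": rmse, "MAE": mae}

      do_obs = [8.1, 7.9, 7.4, 6.8, 6.5, 6.9, 7.2, 7.8]    # hypothetical hourly DO (mg/L)
      do_pred = [8.0, 7.7, 7.5, 7.0, 6.4, 6.8, 7.4, 7.6]
      print({k: round(v, 3) for k, v in metrics(do_obs, do_pred).items()})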

  6. COMMUNITY MULTISCALE AIR QUALITY (CMAQ) MODEL - QUALITY ASSURANCE AND VERSION CONTROL

    EPA Science Inventory

    This presentation will be given to the EPA Exposure Modeling Workgroup on January 24, 2006. The quality assurance and version control procedures for the Community Multiscale Air Quality (CMAQ) Model are presented. A brief background of CMAQ is given, then issues related to qual...

  7. Handbook for the Commonwealth of Learning Review and Improvement Model: Making Quality Work in Higher Education

    ERIC Educational Resources Information Center

    Commonwealth of Learning, 2010

    2010-01-01

    The Commonwealth of Learning Review and Improvement Model (COL RIM) was developed by the Commonwealth of Learning in response to two key drivers: (1) Increased global emphasis on the quality of higher education; and (2) Rising concern about the high cost and uncertain benefits of conventional approaches to external quality assurance. Any…

  8. Designing and evaluating the MULTICOM protein local and global model quality prediction methods in the CASP10 experiment

    PubMed Central

    2014-01-01

    Background Protein model quality assessment is an essential component of generating and using protein structural models. During the Tenth Critical Assessment of Techniques for Protein Structure Prediction (CASP10), we developed and tested four automated methods (MULTICOM-REFINE, MULTICOM-CLUSTER, MULTICOM-NOVEL, and MULTICOM-CONSTRUCT) that predicted both local and global quality of protein structural models. Results MULTICOM-REFINE was a clustering approach that used the average pairwise structural similarity between models to measure the global quality and the average Euclidean distance between a model and several top ranked models to measure the local quality. MULTICOM-CLUSTER and MULTICOM-NOVEL were two new support vector machine-based methods of predicting both the local and global quality of a single protein model. MULTICOM-CONSTRUCT was a new weighted pairwise model comparison (clustering) method that used the weighted average similarity between models in a pool to measure the global model quality. Our experiments showed that the pairwise model assessment methods worked better when a large portion of models in the pool were of good quality, whereas single-model quality assessment methods performed better on some hard targets when only a small portion of models in the pool were of reasonable quality. Conclusions Since digging out a few good models from a large pool of low-quality models is a major challenge in protein structure prediction, single model quality assessment methods appear to be poised to make important contributions to protein structure modeling. The other interesting finding was that single-model quality assessment scores could be used to weight the models by the consensus pairwise model comparison method to improve its accuracy. PMID:24731387

  9. Designing and evaluating the MULTICOM protein local and global model quality prediction methods in the CASP10 experiment.

    PubMed

    Cao, Renzhi; Wang, Zheng; Cheng, Jianlin

    2014-04-15

    Protein model quality assessment is an essential component of generating and using protein structural models. During the Tenth Critical Assessment of Techniques for Protein Structure Prediction (CASP10), we developed and tested four automated methods (MULTICOM-REFINE, MULTICOM-CLUSTER, MULTICOM-NOVEL, and MULTICOM-CONSTRUCT) that predicted both local and global quality of protein structural models. MULTICOM-REFINE was a clustering approach that used the average pairwise structural similarity between models to measure the global quality and the average Euclidean distance between a model and several top ranked models to measure the local quality. MULTICOM-CLUSTER and MULTICOM-NOVEL were two new support vector machine-based methods of predicting both the local and global quality of a single protein model. MULTICOM-CONSTRUCT was a new weighted pairwise model comparison (clustering) method that used the weighted average similarity between models in a pool to measure the global model quality. Our experiments showed that the pairwise model assessment methods worked better when a large portion of models in the pool were of good quality, whereas single-model quality assessment methods performed better on some hard targets when only a small portion of models in the pool were of reasonable quality. Since digging out a few good models from a large pool of low-quality models is a major challenge in protein structure prediction, single model quality assessment methods appear to be poised to make important contributions to protein structure modeling. The other interesting finding was that single-model quality assessment scores could be used to weight the models by the consensus pairwise model comparison method to improve its accuracy.
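
    The clustering idea behind MULTICOM-REFINE's global score, as summarised above, reduces to averaging each model's pairwise structural similarity to the rest of the pool. A toy Python sketch follows; the symmetric similarity matrix is a hypothetical stand-in for real structural similarity scores.

      # Minimal sketch of the pairwise-clustering global quality score described above.
      import numpy as np

      def global_quality(similarity):
          """Average pairwise similarity of each model to all other models in the pool."""
          s = np.asarray(similarity, float)
          n = s.shape[0]
          off_diag_sum = s.sum(axis=1) - np.diag(s)
          return off_diag_sum / (n - 1)

      # Hypothetical symmetric similarity matrix for four candidate models.
      sim = np.array([[1.00, 0.82, 0.78, 0.40],
                      [0.82, 1.00, 0.75, 0.38],
                      [0.78, 0.75, 1.00, 0.35],
                      [0.40, 0.38, 0.35, 1.00]])
      print(global_quality(sim))   # models 0-2 score higher than the outlier model 3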

  10. Identifying western yellow-billed cuckoo breeding habitat with a dual modelling approach

    USGS Publications Warehouse

    Johnson, Matthew J.; Hatten, James R.; Holmes, Jennifer A.; Shafroth, Patrick B.

    2017-01-01

    The western population of the yellow-billed cuckoo (Coccyzus americanus) was recently listed as threatened under the federal Endangered Species Act. Yellow-billed cuckoo conservation efforts require the identification of features and area requirements associated with high quality, riparian forest habitat at spatial scales that range from nest microhabitat to landscape, as well as lower-suitability areas that can be enhanced or restored. Spatially explicit models inform conservation efforts by increasing ecological understanding of a target species, especially at landscape scales. Previous yellow-billed cuckoo modelling efforts derived plant-community maps from aerial photography, an expensive and oftentimes inconsistent approach. Satellite models can remotely map vegetation features (e.g., vegetation density, heterogeneity in vegetation density or structure) across large areas with near perfect repeatability, but they usually cannot identify plant communities. We used aerial photos and satellite imagery, and a hierarchical spatial scale approach, to identify yellow-billed cuckoo breeding habitat along the Lower Colorado River and its tributaries. Aerial-photo and satellite models identified several key features associated with yellow-billed cuckoo breeding locations: (1) a 4.5 ha core area of dense cottonwood-willow vegetation, (2) a large native, heterogeneously dense forest (72 ha) around the core area, and (3) moderately rough topography. The odds of yellow-billed cuckoo occurrence decreased rapidly as the amount of tamarisk cover increased or when cottonwood-willow vegetation was limited. We achieved model accuracies of 75–80% in the project area the following year after updating the imagery and location data. The two model types had very similar probability maps, largely predicting the same areas as high quality habitat. While each model provided unique information, a dual-modelling approach provided a more complete picture of yellow-billed cuckoo habitat

  11. Assessing ecosystem effects of reservoir operations using food web-energy transfer and water quality models

    USGS Publications Warehouse

    Saito, L.; Johnson, B.M.; Bartholow, J.; Hanna, R.B.

    2001-01-01

    We investigated the effects on the reservoir food web of a new temperature control device (TCD) on the dam at Shasta Lake, California. We followed a linked modeling approach that used a specialized reservoir water quality model to forecast operation-induced changes in phytoplankton production. A food web–energy transfer model was also applied to propagate predicted changes in phytoplankton up through the food web to the predators and sport fishes of interest. The food web–energy transfer model employed a 10% trophic transfer efficiency through a food web that was mapped using carbon and nitrogen stable isotope analysis. Stable isotope analysis provided an efficient and comprehensive means of estimating the structure of the reservoir's food web with minimal sampling and background data. We used an optimization procedure to estimate the diet proportions of all food web components simultaneously from their isotopic signatures. Some consumers were estimated to be much more sensitive than others to perturbations to phytoplankton supply. The linked modeling approach demonstrated that interdisciplinary efforts enhance the value of information obtained from studies of managed ecosystems. The approach exploited the strengths of engineering and ecological modeling methods to address concerns that neither of the models could have addressed alone: (a) the water quality model could not have addressed quantitatively the possible impacts to fish, and (b) the food web model could not have examined how phytoplankton availability might change due to reservoir operations.
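
    The diet-proportion estimation step described above can be illustrated with a small constrained least-squares mixing calculation; the prey signatures, consumer signature and trophic fractionation values in the Python sketch below are illustrative assumptions, not the study's data.

      # Minimal sketch of the optimization idea: estimate a consumer's diet proportions
      # from C/N stable isotope signatures by least squares, with proportions that are
      # non-negative and sum to one.
      import numpy as np
      from scipy.optimize import minimize

      # Columns: hypothetical prey sources; rows: delta13C, delta15N (per mil).
      sources = np.array([[-28.0, -24.0, -20.0],
                          [  6.0,   9.0,  12.0]])
      frac = np.array([0.4, 3.4])            # assumed trophic fractionation (C, N)
      consumer = np.array([-23.5, 13.0])     # hypothetical consumer signature

      def misfit(p):
          mix = sources @ p + frac           # predicted consumer signature for diet p
          return np.sum((mix - consumer) ** 2)

      p0 = np.full(3, 1 / 3)
      res = minimize(misfit, p0, method="SLSQP", bounds=[(0, 1)] * 3,
                     constraints=[{"type": "eq", "fun": lambda p: p.sum() - 1}])
      print(np.round(res.x, 3))              # estimated diet proportions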

  12. A CNN based neurobiology inspired approach for retinal image quality assessment.

    PubMed

    Mahapatra, Dwarikanath; Roy, Pallab K; Sedai, Suman; Garnavi, Rahil

    2016-08-01

    Retinal image quality assessment (IQA) algorithms use different hand crafted features for training classifiers without considering the working of the human visual system (HVS) which plays an important role in IQA. We propose a convolutional neural network (CNN) based approach that determines image quality using the underlying principles behind the working of the HVS. CNNs provide a principled approach to feature learning and hence higher accuracy in decision making. Experimental results demonstrate the superior performance of our proposed algorithm over competing methods.

  13. Student laboratory reports: an approach to improving feedback and quality

    NASA Astrophysics Data System (ADS)

    Ellingsen, Pål Gunnar; Støvneng, Jon Andreas

    2018-05-01

    We present an ongoing effort in improving the quality of laboratory reports written by first and second year physics students. The effort involves a new approach where students are given the opportunity to submit reports at intermediate deadlines, receive feedback, and then resubmit for the final deadline. In combination with a differential grading system, instead of pass/fail, the improved feedback results in higher quality reports. Improvement in the quality of the reports is visible through the grade statistics.

  14. Flared landing approach flying qualities. Volume 2: Appendices

    NASA Technical Reports Server (NTRS)

    Weingarten, Norman C.; Berthe, Charles J., Jr.; Rynaski, Edmund G.; Sarrafian, Shahan K.

    1986-01-01

    An in-flight research study was conducted utilizing the USAF/Total In-Flight Simulator (TIFS) to investigate longitudinal flying qualities for the flared landing approach phase of flight. A consistent set of data was generated for: determining what kind of command response the pilot prefers/requires in order to flare and land an aircraft with precision, and refining a time history criterion that took into account all the necessary variables and the characteristics that would accurately predict flying qualities. Seven evaluation pilots participated, representing NASA Langley, NASA Dryden, Calspan, Boeing, Lockheed, and DFVLR (Braunschweig, Germany). The results of the first part of the study provide guidelines to the flight control system designer, using MIL-F-8785-(C) as a guide, that yield the dynamic behavior pilots prefer in flared landings. The results of the second part provide the flying qualities engineer with a derived flying qualities predictive tool which appears to be highly accurate. This time-domain predictive flying qualities criterion was applied to the flight data as well as six previous flying qualities studies, and the results indicate that the criterion predicted the flying qualities level 81% of the time and the Cooper-Harper pilot rating, within ±1, 60% of the time.

  15. A systems approach to modeling Community-Based Environmental Monitoring: a case of participatory water quality monitoring in rural Mexico.

    PubMed

    Burgos, Ana; Páez, Rosaura; Carmona, Estela; Rivas, Hilda

    2013-12-01

    Community-Based Environmental Monitoring (CBM) is a social practice that makes a valuable contribution to environmental management and to the construction of active societies for a sustainable future. However, its documentation and analysis show deficiencies that hinder the contrast and comparison of processes and effects. Based on a systems approach, this article presents a model of CBM to orient the assessment of programs, with heuristic or practical goals. At the focal level, the model comprises three components, the social subject, the object of monitoring, and the means of action, and five processes, data management, social learning, assimilation/decision making, direct action, and linking. Emergent properties were also identified at the focal and suprafocal levels, considering community self-organization, response capacity, and autonomy for environmental management. The model was applied to the assessment of a CBM program of water quality implemented in rural areas in Mexico. Attributes and variables (indicators) for components, processes, and emergent properties were selected to measure changes that emerged since the program implementation. The assessment of the first 3 years (2010-2012) detected changes that indicated movement towards the expected results, but it also revealed the need to adjust the intervention strategy and procedures. Components and processes of the model reflected relevant aspects of CBM in the real world. The component called means of action emerged as a key element in the transition "from the data to the action." The CBM model offered a conceptual framework with advantages for understanding CBM as a socioecological event and for strengthening its implementation under different conditions and contexts.

  16. Downscaling modelling system for multi-scale air quality forecasting

    NASA Astrophysics Data System (ADS)

    Nuterman, R.; Baklanov, A.; Mahura, A.; Amstrup, B.; Weismann, J.

    2010-09-01

    Urban modelling for real meteorological situations, in general, considers only a small part of the urban area in a micro-meteorological model, and urban heterogeneities outside a modelling domain affect micro-scale processes. Therefore, it is important to build a chain of models of different scales with nesting of higher resolution models into larger scale lower resolution models. Usually, the up-scaled city- or meso-scale models consider parameterisations of urban effects or statistical descriptions of the urban morphology, whereas the micro-scale (street canyon) models are obstacle-resolved and they consider a detailed geometry of the buildings and the urban canopy. The developed system consists of the meso-, urban- and street-scale models. First, it is the Numerical Weather Prediction (HIgh Resolution Limited Area Model) model combined with Atmospheric Chemistry Transport (the Comprehensive Air quality Model with extensions) model. Several levels of urban parameterisation are considered. They are chosen depending on selected scales and resolutions. For regional scale, the urban parameterisation is based on the roughness and flux corrections approach; for urban scale - building effects parameterisation. Modern methods of computational fluid dynamics allow solving environmental problems connected with atmospheric transport of pollutants within the urban canopy in the presence of penetrable (vegetation) and impenetrable (buildings) obstacles. For local- and micro-scale nesting, the Micro-scale Model for Urban Environment is applied. This is a comprehensive obstacle-resolved urban wind-flow and dispersion model based on the Reynolds averaged Navier-Stokes approach and several turbulent closures, i.e. the k-ε linear eddy-viscosity model, the k-ε non-linear eddy-viscosity model and the Reynolds stress model. Boundary and initial conditions for the micro-scale model are used from the up-scaled models with corresponding interpolation conserving the mass. For the boundaries a

  17. Spatial Double Generalized Beta Regression Models: Extensions and Application to Study Quality of Education in Colombia

    ERIC Educational Resources Information Center

    Cepeda-Cuervo, Edilberto; Núñez-Antón, Vicente

    2013-01-01

    In this article, a proposed Bayesian extension of the generalized beta spatial regression models is applied to the analysis of the quality of education in Colombia. We briefly revise the beta distribution and describe the joint modeling approach for the mean and dispersion parameters in the spatial regression models' setting. Finally, we motivate…

  18. Model-based approach to partial tracking for musical transcription

    NASA Astrophysics Data System (ADS)

    Sterian, Andrew; Wakefield, Gregory H.

    1998-10-01

    We present a new method for musical partial tracking in the context of musical transcription using a time-frequency Kalman filter structure. The filter is based upon a model for the evolution of a partial behavior across a wide range of pitch from four brass instruments. Statistics are computed independently for the partial attributes of frequency and log-power first differences. We present observed power spectral density shapes, total powers, and histograms, as well as least-squares approximations to these. We demonstrate that a Kalman filter tracker using this partial model is capable of tracking partials in music. We discuss how the filter structure naturally provides quality-of-fit information about the data for use in further processing and how this information can be used to perform partial track initiation and termination within a common framework. We propose that a model-based approach to partial tracking is preferable to existing approaches which generally use heuristic rules or birth/death notions over a small time neighborhood. The advantages include better performance in the presence of cluttered data and simplified tracking over missed observations.
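
    A stripped-down version of the tracking idea, a Kalman filter following one partial's frequency under a random-walk model, is sketched below; the noise variances and the simulated spectral-peak measurements are illustrative assumptions rather than the statistics reported in the paper.

      # Minimal sketch of a random-walk Kalman filter for one partial's frequency track.
      import numpy as np

      def kalman_track(z, q=0.5, r=4.0):
          """Track a slowly drifting frequency from noisy frame-wise estimates z (Hz)."""
          x, p = z[0], 1.0          # state estimate and its variance
          track = []
          for meas in z:
              p += q                # predict: random-walk model for the frequency
              k = p / (p + r)       # Kalman gain
              x += k * (meas - x)   # update with the measured peak frequency
              p *= (1 - k)
              track.append(x)
          return np.array(track)

      rng = np.random.default_rng(1)
      true_f = 440 + np.cumsum(rng.normal(0, 0.3, 100))   # drifting partial
      meas_f = true_f + rng.normal(0, 2.0, 100)           # noisy spectral peaks
      print(np.round(kalman_track(meas_f)[:5], 1))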

  19. ISO 9000 quality standards: a model for blood banking?

    PubMed

    Nevalainen, D E; Lloyd, H L

    1995-06-01

    The recent American Association of Blood Banks publications Quality Program and Quality Systems in the Blood Bank and Laboratory Environment, the FDA's draft guidelines, and recent changes in the GMP regulations all discuss the benefits of implementing quality systems in blood center and/or manufacturing operations. While the medical device GMPs in the United States have been rewritten to accommodate a quality system approach similar to ISO 9000, the Center for Biologics Evaluation and Research of the FDA is also beginning to make moves toward adopting "quality systems audits" as an inspection process rather than using the historical approach of record reviews. The approach is one of prevention of errors rather than detection after the fact (Tourault MA, oral communication, November 1994). The ISO 9000 series of standards is a quality system that has worldwide scope and can be applied in any industry or service. The use of such international standards in blood banking should raise the level of quality within an organization, among organizations on a regional level, within a country, and among nations on a worldwide basis. Whether an organization wishes to become registered to a voluntary standard or not, the use of such standards to become ISO 9000-compliant would be a move in the right direction and would be a positive sign to the regulatory authorities and the public that blood banking is making a visible effort to implement world-class quality systems in its operations. Implementation of quality system standards such as the ISO 9000 series will provide an organized approach for blood banks and blood bank testing operations. With the continued trend toward consolidation and mergers, resulting in larger operational units with more complexity, quality systems will become even more important as the industry moves into the future.(ABSTRACT TRUNCATED AT 250 WORDS)

  20. An approach to quality and security of supply for single-use bioreactors.

    PubMed

    Barbaroux, Magali; Gerighausen, Susanne; Hackel, Heiko

    2014-01-01

    Single-use systems (also referred to as disposables) have become a huge part of the bioprocessing industry, which raised concern in the industry regarding quality and security of supply. Processes must be in place to assure the supply and control of outsourced activities and quality of purchased materials along the product life cycle. Quality and security of supply for single-use bioreactors (SUBs) are based on a multidisciplinary approach. Developing a state-of-the-art SUB-system based on quality by design (QbD) principles requires broad expertise and know-how including the cell culture application, polymer chemistry, regulatory requirements, and a deep understanding of the biopharmaceutical industry. Using standardized products reduces the complexity and strengthens the robustness of the supply chain. Well-established supplier relations including risk mitigation strategies are the basis for achieving long-term security of supply. Well-developed quality systems including change control approaches aligned with the requirements of the biopharmaceutical industry are a key factor in supporting long-term product availability. This chapter outlines the approach to security of supply for key materials used in single-use production processes for biopharmaceuticals from a supplier perspective.

  1. Integrated modeling approach using SELECT and SWAT models to simulate source loading and in-stream conditions of fecal indicator bacteria.

    NASA Astrophysics Data System (ADS)

    Ranatunga, T.

    2016-12-01

    Modeling of the fate and transport of fecal bacteria in a watershed is generally a process-based approach that considers releases from manure, point sources, and septic systems. Overland transport with water and sediments, infiltration into soils, transport in the vadose zone and groundwater, die-off and growth processes, and in-stream transport are considered the other major processes in bacteria simulation. This presentation will discuss a simulation of fecal indicator bacteria (E. coli) source loading and in-stream conditions of a non-tidal watershed (Cedar Bayou Watershed) in South Central Texas using two models: the Spatially Explicit Load Enrichment Calculation Tool (SELECT) and the Soil and Water Assessment Tool (SWAT). Furthermore, it will discuss a probable approach to bacteria source load reduction in order to meet the water quality standards in the streams. The selected watershed is listed by the Texas Commission on Environmental Quality (TCEQ) as having levels of fecal indicator bacteria that pose a risk for contact recreation and wading. The SELECT modeling approach was used to estimate the bacteria source loading from land categories. The major bacteria sources considered were failing septic systems, discharges from wastewater treatment facilities, excreta from livestock (cattle, horses, sheep, and goats), excreta from wildlife (feral hogs and deer), pet waste (mainly from dogs), and runoff from urban surfaces. The estimated source loads were input to the SWAT model in order to simulate the transport through the land and in-stream conditions. The calibrated SWAT model was then used to estimate the indicator bacteria in-stream concentrations for future years based on H-GAC's regional land use, population and household projections (up to 2040). Based on the in-stream reductions required to meet the water quality standards, the corresponding required source load reductions were estimated.

  2. A Model for the Departmental Quality Management Infrastructure Within an Academic Health System.

    PubMed

    Mathews, Simon C; Demski, Renee; Hooper, Jody E; Biddison, Lee Daugherty; Berry, Stephen A; Petty, Brent G; Chen, Allen R; Hill, Peter M; Miller, Marlene R; Witter, Frank R; Allen, Lisa; Wick, Elizabeth C; Stierer, Tracey S; Paine, Lori; Puttgen, Hans A; Tamargo, Rafael J; Pronovost, Peter J

    2017-05-01

    As quality improvement and patient safety come to play a larger role in health care, academic medical centers and health systems are poised to take a leadership role in addressing these issues. Academic medical centers can leverage their large integrated footprint and have the ability to innovate in this field. However, a robust quality management infrastructure is needed to support these efforts. In this context, quality and safety are often described at the executive level and at the unit level. Yet, the role of individual departments, which are often the dominant functional unit within a hospital, in realizing health system quality and safety goals has not been addressed. Developing a departmental quality management infrastructure is challenging because departments are diverse in composition, size, resources, and needs. In this article, the authors describe the model of departmental quality management infrastructure that has been implemented at the Johns Hopkins Hospital. This model leverages the fractal approach, linking departments horizontally to support peer and organizational learning and connecting departments vertically to support accountability to the hospital, health system, and board of trustees. This model also provides both structure and flexibility to meet individual departmental needs, recognizing that independence and interdependence are needed for large academic medical centers. The authors describe the structure, function, and support system for this model as well as the practical and essential steps for its implementation. They also provide examples of its early success.

  3. New approach for optimal electricity planning and dispatching with hourly time-scale air quality and health considerations.

    PubMed

    Kerl, Paul Y; Zhang, Wenxian; Moreno-Cruz, Juan B; Nenes, Athanasios; Realff, Matthew J; Russell, Armistead G; Sokol, Joel; Thomas, Valerie M

    2015-09-01

    Integrating accurate air quality modeling with decision making is hampered by complex atmospheric physics and chemistry and its coupling with atmospheric transport. Existing approaches to model the physics and chemistry accurately lead to significant computational burdens in computing the response of atmospheric concentrations to changes in emissions profiles. By integrating a reduced form of a fully coupled atmospheric model within a unit commitment optimization model, we allow, for the first time to our knowledge, a fully dynamical approach toward electricity planning that accurately and rapidly minimizes both cost and health impacts. The reduced-form model captures the response of spatially resolved air pollutant concentrations to changes in electricity-generating plant emissions on an hourly basis with accuracy comparable to a comprehensive air quality model. The integrated model allows for the inclusion of human health impacts into cost-based decisions for power plant operation. We use the new capability in a case study of the state of Georgia over the years of 2004-2011, and show that a shift in utilization among existing power plants during selected hourly periods could have provided a health cost savings of $175.9 million for an additional electricity generation cost of $83.6 million in 2007 US dollars (USD2007). The case study illustrates how air pollutant health impacts can be cost-effectively minimized by intelligently modulating power plant operations over multihour periods, without implementing additional emissions control technologies.
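
    The core idea of folding health damages into dispatch decisions can be illustrated with a toy linear program; the plant costs, emission rates and the single damage factor below are illustrative assumptions, and the paper's reduced-form air quality model is far more detailed than this sketch.

      # Toy dispatch problem: choose hourly generation from three hypothetical plants to
      # minimise fuel cost plus a crude health-damage proxy (emission rate x assumed
      # damage per ton), subject to meeting demand and capacity limits.
      import numpy as np
      from scipy.optimize import linprog

      fuel_cost = np.array([20.0, 35.0, 50.0])      # $/MWh for plants A, B, C
      emis_rate = np.array([0.9, 0.4, 0.1])         # tons pollutant per MWh
      damage = 60.0                                 # assumed $ health damage per ton
      capacity = np.array([400.0, 300.0, 300.0])    # MW
      demand = 650.0                                # MW for this hour

      c = fuel_cost + damage * emis_rate            # combined $/MWh objective
      res = linprog(c, A_eq=np.ones((1, 3)), b_eq=[demand],
                    bounds=list(zip(np.zeros(3), capacity)), method="highs")
      print(np.round(res.x, 1))                     # dispatch shifts toward cleaner plants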

  4. New approach for optimal electricity planning and dispatching with hourly time-scale air quality and health considerations

    PubMed Central

    Kerl, Paul Y.; Zhang, Wenxian; Moreno-Cruz, Juan B.; Nenes, Athanasios; Realff, Matthew J.; Russell, Armistead G.; Sokol, Joel; Thomas, Valerie M.

    2015-01-01

    Integrating accurate air quality modeling with decision making is hampered by complex atmospheric physics and chemistry and its coupling with atmospheric transport. Existing approaches to model the physics and chemistry accurately lead to significant computational burdens in computing the response of atmospheric concentrations to changes in emissions profiles. By integrating a reduced form of a fully coupled atmospheric model within a unit commitment optimization model, we allow, for the first time to our knowledge, a fully dynamical approach toward electricity planning that accurately and rapidly minimizes both cost and health impacts. The reduced-form model captures the response of spatially resolved air pollutant concentrations to changes in electricity-generating plant emissions on an hourly basis with accuracy comparable to a comprehensive air quality model. The integrated model allows for the inclusion of human health impacts into cost-based decisions for power plant operation. We use the new capability in a case study of the state of Georgia over the years of 2004–2011, and show that a shift in utilization among existing power plants during selected hourly periods could have provided a health cost savings of $175.9 million for an additional electricity generation cost of $83.6 million in 2007 US dollars (USD2007). The case study illustrates how air pollutant health impacts can be cost-effectively minimized by intelligently modulating power plant operations over multihour periods, without implementing additional emissions control technologies. PMID:26283358

  5. Test of the efficiency of three storm water quality models with a rich set of data.

    PubMed

    Ahyerre, M; Henry, F O; Gogien, F; Chabanel, M; Zug, M; Renaudet, D

    2005-01-01

    The objective of this article is to test the efficiency of three different storm water quality models (SWQMs) on the same data set (34 rain events, SS measurements) sampled on a 42 ha watershed in the center of Paris. The models were calibrated at the scale of the rain event. Considering the mass of pollution calculated per event, the results of the models are satisfactory, but they are of the same order of magnitude as those of the simple hydraulic approach associated with a constant concentration. Secondly, the mass of pollutant at the outlet of the catchment over the global scale of the 34 events was calculated. This approach shows that the simple hydraulic calculation gives better results than the SWQMs. Finally, the pollutographs are analysed, showing that storm water quality models are interesting tools for representing the shape of the pollutographs and the dynamics of the phenomenon, which can be useful in some projects for managers.
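
    The benchmark used in this comparison, a constant site-mean concentration multiplied by the event runoff volume, is easy to reproduce; the Python sketch below contrasts it with a stand-in for SWQM predictions on synthetic events (all numbers are illustrative, not the Paris data set).

      # Toy per-event and total-load comparison: "model" predictions versus the simple
      # constant-concentration alternative. All series are synthetic stand-ins.
      import numpy as np

      rng = np.random.default_rng(2)
      volume = rng.gamma(3.0, 500.0, 34)                 # m3 runoff per event
      obs_conc = rng.gamma(4.0, 40.0, 34)                # mg/L event-mean SS concentration
      obs_load = volume * obs_conc / 1e6                 # tonnes per event
      model_load = obs_load * rng.normal(1.0, 0.3, 34)   # stand-in for SWQM predictions

      const_load = volume * obs_conc.mean() / 1e6        # constant-concentration estimate
      for name, est in (("SWQM", model_load), ("constant C", const_load)):
          rel_err = np.abs(est - obs_load) / obs_load
          print(name, "mean relative error per event:", round(rel_err.mean(), 2),
                "| total-load error:", round(abs(est.sum() - obs_load.sum()) / obs_load.sum(), 2))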

  6. An approach to secure weather and climate models against hardware faults

    NASA Astrophysics Data System (ADS)

    Düben, Peter D.; Dawson, Andrew

    2017-03-01

    Enabling Earth System models to run efficiently on future supercomputers is a serious challenge for model development. Many publications study efficient parallelization to allow better scaling of performance on an increasing number of computing cores. However, one of the most alarming threats for weather and climate predictions on future high performance computing architectures is widely ignored: the presence of hardware faults that will frequently hit large applications as we approach exascale supercomputing. Changes in the structure of weather and climate models that would allow them to be resilient against hardware faults are hardly discussed in the model development community. In this paper, we present an approach to secure the dynamical core of weather and climate models against hardware faults using a backup system that stores coarse resolution copies of prognostic variables. Frequent checks of the model fields on the backup grid allow the detection of severe hardware faults, and prognostic variables that are changed by hardware faults on the model grid can be restored from the backup grid to continue model simulations with no significant delay. To justify the approach, we perform model simulations with a C-grid shallow water model in the presence of frequent hardware faults. As long as the backup system is used, simulations do not crash and a high level of model quality can be maintained. The overhead due to the backup system is reasonable and additional storage requirements are small. Runtime is increased by only 13 % for the shallow water model.
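
    The backup-grid mechanism described above can be sketched in a few lines: keep a coarse copy of a prognostic field, flag a step whose fine field departs implausibly from the upsampled copy, and restore from it. The grid sizes, the simulated corruption and the tolerance below are illustrative assumptions, not the authors' implementation.

      # Minimal sketch of a coarse backup copy with fault detection and restoration.
      import numpy as np

      def coarsen(field, factor=4):
          n = field.shape[0] // factor
          return field.reshape(n, factor, n, factor).mean(axis=(1, 3))

      def restore(backup, factor=4):
          return np.kron(backup, np.ones((factor, factor)))

      rng = np.random.default_rng(3)
      h = 1000.0 + rng.normal(0, 1.0, (64, 64))   # prognostic field (e.g. layer thickness)
      backup = coarsen(h)                          # coarse backup copy

      h_faulty = h.copy()
      h_faulty[10, 10] = 1e9                       # simulated bit-flip style corruption

      def fault_detected(field, backup, tol=50.0):
          return np.any(np.abs(field - restore(backup)) > tol)

      if fault_detected(h_faulty, backup):
          h_faulty = restore(backup)               # continue the run from the coarse copy
      print(fault_detected(h_faulty, backup))      # False after restoration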

  7. Multivariate Bias Correction Procedures for Improving Water Quality Predictions from the SWAT Model

    NASA Astrophysics Data System (ADS)

    Arumugam, S.; Libera, D.

    2017-12-01

    Water quality observations are usually not available on a continuous basis for longer than 1-2 years at a time over a decadal period, given the labor requirements, which makes calibrating and validating mechanistic models difficult. Further, any physical model's predictions inherently have bias (i.e., under/over estimation) and require post-simulation techniques to preserve the long-term mean monthly attributes. This study suggests a multivariate bias-correction technique and compares it to a common technique for improving the performance of the SWAT model in predicting daily streamflow and TN loads across the southeast based on split-sample validation. The approach is a dimension reduction technique, canonical correlation analysis (CCA), that regresses the observed multivariate attributes on the SWAT model simulated values. The common approach is a regression-based technique that uses an ordinary least squares regression to adjust model values. The observed cross-correlation between loadings and streamflow is better preserved when using canonical correlation while simultaneously reducing individual biases. Additionally, canonical correlation analysis does a better job of preserving the observed joint likelihood of observed streamflow and loadings. These procedures were applied to 3 watersheds chosen from the Water Quality Network in the Southeast Region; specifically, watersheds with sufficiently large drainage areas and numbers of observed data points. The performance of the two approaches is compared for the observed period and over a multi-decadal period using loading estimates from the USGS LOADEST model. Lastly, the CCA technique is applied in a forecasting sense by using 1-month ahead forecasts of P & T from ECHAM4.5 as forcings in the SWAT model. Skill in using the SWAT model for forecasting loadings and streamflow at the monthly and seasonal timescale is also discussed.
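
    The contrast between the two post-processing ideas, per-variable ordinary least squares versus a joint CCA-based correction, is sketched below on synthetic data; the "simulated" series are biased transforms of the observations and merely stand in for SWAT output, so the numbers carry no scientific meaning.

      # Toy comparison of per-variable OLS correction and a joint CCA-based correction
      # of two simulated quantities (streamflow and TN load).
      import numpy as np
      from sklearn.cross_decomposition import CCA
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(4)
      n = 240                                            # e.g. 20 years of monthly values
      obs = np.column_stack([rng.gamma(2.0, 30.0, n),    # observed streamflow
                             rng.gamma(1.5, 4.0, n)])    # observed TN load
      sim = 1.4 * obs + rng.normal(0, 8.0, obs.shape) + 10.0   # biased stand-in for model output

      train, test = slice(0, 180), slice(180, None)

      # Per-variable OLS: regress each observed variable on its own simulated counterpart.
      corr_ols = np.column_stack([
          LinearRegression().fit(sim[train, i:i + 1], obs[train, i]).predict(sim[test, i:i + 1])
          for i in range(2)])
      # Joint CCA-based correction of both variables at once.
      corr_cca = CCA(n_components=2).fit(sim[train], obs[train]).predict(sim[test])

      def xcorr(a):   # cross-correlation between the two corrected variables
          return np.corrcoef(a[:, 0], a[:, 1])[0, 1]

      print("observed flow-load correlation:", round(xcorr(obs[test]), 2))
      print("OLS-corrected:", round(xcorr(corr_ols), 2), "| CCA-corrected:", round(xcorr(corr_cca), 2))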

  8. Identification of water quality degradation hotspots in developing countries by applying large scale water quality modelling

    NASA Astrophysics Data System (ADS)

    Malsy, Marcus; Reder, Klara; Flörke, Martina

    2014-05-01

    Decreasing water quality is one of the main global issues that poses risks to food security, the economy, and public health, and it is consequently crucial for ensuring environmental sustainability. During the last decades access to clean drinking water increased, but 2.5 billion people still do not have access to basic sanitation, especially in Africa and parts of Asia. In this context not only connection to a sewage system is of high importance, but also treatment, as an increasing connection rate will lead to higher loadings and therefore higher pressure on water resources. Furthermore, poor people in developing countries use local surface waters for daily activities, e.g. bathing and washing. It is thus clear that water utilization and water sewerage are inseparably connected. In this study, large scale water quality modelling is used to point out hotspots of water pollution and to gain insight into potential environmental impacts, in particular in regions with a low observation density and data gaps in measured water quality parameters. We applied the global water quality model WorldQual to calculate biological oxygen demand (BOD) loadings from point and diffuse sources, as well as in-stream concentrations. The regional focus in this study is on developing countries, i.e. Africa, Asia, and South America, as they are most affected by water pollution. Model runs were conducted for the year 2010 to draw a picture of the recent status of surface water quality and to identify hotspots and main causes of pollution. First results show that hotspots mainly occur in highly agglomerated regions where population density is high. Large urban areas are the primary loading hotspots, and pollution prevention and control become increasingly important as point sources are subject to connection rates and treatment levels. Furthermore, river discharge plays a crucial role due to its dilution potential, especially in terms of seasonal variability. Highly varying shares of BOD sources across

  9. Innovations in projecting emissions for air quality modeling

    EPA Science Inventory

    Air quality modeling is used in setting air quality standards and in evaluating their costs and benefits. Historically, modeling applications have projected emissions and the resulting air quality only 5 to 10 years into the future. Recognition that the choice of air quality mana...

  10. An analytical probabilistic model of the quality efficiency of a sewer tank

    NASA Astrophysics Data System (ADS)

    Balistrocchi, Matteo; Grossi, Giovanna; Bacchi, Baldassare

    2009-12-01

    The assessment of the efficiency of a storm water storage facility devoted to sewer overflow control in urban areas strictly depends on the ability to model the main features of the rainfall-runoff routing process and the related wet weather pollution delivery. In this paper the possibility of applying the analytical probabilistic approach to develop a tank design method, whose potential is similar to that of continuous simulation, is demonstrated. The quality behaviour of such devices was incorporated in the model derivation. The formulation is based on a Weibull probabilistic model of the main characteristics of the rainfall process and on a power law describing the relationship between the dimensionless storm water cumulative runoff volume and the dimensionless cumulative pollutograph. Following this approach, efficiency indexes were established. The proposed model was verified by comparing its results to those obtained by continuous simulations; satisfactory agreement is shown for the proposed efficiency indexes.
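
    The two probabilistic ingredients named above can be written in generic form (the abstract does not give the authors' exact parameterisation, so the symbols below are illustrative): a Weibull density for a rainfall characteristic x, and a power law linking the dimensionless cumulative runoff volume V* to the dimensionless cumulative pollutant mass M*.

      % Generic, illustrative forms only (not the paper's exact equations):
      f(x) = \frac{k}{\lambda}\left(\frac{x}{\lambda}\right)^{k-1}
             \exp\!\left[-\left(\frac{x}{\lambda}\right)^{k}\right], \qquad x \ge 0,
      \qquad
      M^{*}(V^{*}) = \left(V^{*}\right)^{b}, \qquad 0 \le V^{*} \le 1,\; b > 0.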

  11. An approach to secure weather and climate models against hardware faults

    NASA Astrophysics Data System (ADS)

    Düben, Peter; Dawson, Andrew

    2017-04-01

    Enabling Earth System models to run efficiently on future supercomputers is a serious challenge for model development. Many publications study efficient parallelisation to allow better scaling of performance on an increasing number of computing cores. However, one of the most alarming threats for weather and climate predictions on future high performance computing architectures is widely ignored: the presence of hardware faults that will frequently hit large applications as we approach exascale supercomputing. Changes in the structure of weather and climate models that would allow them to be resilient against hardware faults are hardly discussed in the model development community. We present an approach to secure the dynamical core of weather and climate models against hardware faults using a backup system that stores coarse resolution copies of prognostic variables. Frequent checks of the model fields on the backup grid allow the detection of severe hardware faults, and prognostic variables that are changed by hardware faults on the model grid can be restored from the backup grid to continue model simulations with no significant delay. To justify the approach, we perform simulations with a C-grid shallow water model in the presence of frequent hardware faults. As long as the backup system is used, simulations do not crash and a high level of model quality can be maintained. The overhead due to the backup system is reasonable and additional storage requirements are small. Runtime is increased by only 13% for the shallow water model.

  12. Quantifying Co-benefits of Renewable Energy through Integrated Electricity and Air Quality Modeling

    NASA Astrophysics Data System (ADS)

    Abel, D.

    2016-12-01

    This work focuses on the coordination of electricity sector changes with air quality and health improvement strategies through the integration of electricity and air quality models. Two energy models are used to calculate emission perturbations associated with changes in generation technology (20% generation from solar photovoltaics) and demand (future electricity use under a warmer climate). Impacts from increased solar PV penetration are simulated with the electricity model GridView, in collaboration with the National Renewable Energy Laboratory (NREL). Generation results are used to scale power plant emissions from an inventory developed by the Lake Michigan Air Directors Consortium (LADCO). Perturbed emissions are then used to calculate secondary particulate matter with the Community Multiscale Air Quality (CMAQ) model. We find that electricity NOx and SO2 emissions decrease at a rate similar to the total fraction of electricity supplied by solar. Across the Eastern U.S. region, average PM2.5 is reduced 5% over the summer, with the highest reductions in regions and on days of greater PM2.5. A similar approach evaluates the air quality impacts of elevated electricity demand under a warmer climate. Meteorology is selected from the North American Regional Climate Change Assessment Program (NARCCAP) and input to a building energy model, eQUEST, to assess electricity demand as a function of ambient temperature. The associated generation and emissions are calculated on a plant-by-plant basis by the MyPower power sector model. These emissions are referenced to the 2011 National Emissions Inventory to be modeled in CMAQ for the Eastern U.S. and extended to health impact evaluation with the Environmental Benefits Mapping and Analysis Program (BenMAP). All results focus on the air quality and health consequences of energy system changes, considering grid-level changes to meet climate and air quality goals.

  13. Engineering of an inhalable DDA/TDB liposomal adjuvant: a quality-by-design approach towards optimization of the spray drying process.

    PubMed

    Ingvarsson, Pall Thor; Yang, Mingshi; Mulvad, Helle; Nielsen, Hanne Mørck; Rantanen, Jukka; Foged, Camilla

    2013-11-01

    The purpose of this study was to identify and optimize spray drying parameters of importance for the design of an inhalable powder formulation of a cationic liposomal adjuvant composed of dimethyldioctadecylammonium (DDA) bromide and trehalose-6,6'-dibehenate (TDB). A quality by design (QbD) approach was applied to identify and link critical process parameters (CPPs) of the spray drying process to critical quality attributes (CQAs) using risk assessment and design of experiments (DoE), followed by identification of an optimal operating space (OOS). A central composite face-centered design was carried out followed by multiple linear regression analysis. Four CQAs were identified; the mass median aerodynamic diameter (MMAD), the liposome stability (size) during processing, the moisture content and the yield. Five CPPs (drying airflow, feed flow rate, feedstock concentration, atomizing airflow and outlet temperature) were identified and tested in a systematic way. The MMAD and the yield were successfully modeled. For the liposome size stability, the ratio between the size after and before spray drying was modeled successfully. The model for the residual moisture content was poor, although, the moisture content was below 3% in the entire design space. Finally, the OOS was drafted from the constructed models for the spray drying of trehalose stabilized DDA/TDB liposomes. The QbD approach for the spray drying process should include a careful consideration of the quality target product profile. This approach implementing risk assessment and DoE was successfully applied to optimize the spray drying of an inhalable DDA/TDB liposomal adjuvant designed for pulmonary vaccination.
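
    The style of DoE analysis described here, regressing a critical quality attribute on coded process parameters, can be illustrated with a small response-surface fit; the two factors, design points and MMAD responses in the Python sketch below are hypothetical, and a real central composite face-centred design would cover all five CPPs.

      # Toy second-order (response-surface) regression of one CQA on two coded CPPs.
      import numpy as np

      # Coded levels (-1, 0, +1) for two factors: feed flow rate and outlet temperature.
      X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
                    [-1, 0], [1, 0], [0, -1], [0, 1], [0, 0]])
      y = np.array([3.8, 4.6, 3.2, 4.1, 3.6, 4.4, 4.2, 3.5, 3.9])   # hypothetical MMAD (um)

      x1, x2 = X[:, 0], X[:, 1]
      design = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
      coef, *_ = np.linalg.lstsq(design, y, rcond=None)
      terms = ["intercept", "flow", "temp", "flow*temp", "flow^2", "temp^2"]
      print(dict(zip(terms, np.round(coef, 3))))   # effect estimates used to draft an operating space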

  14. Advanced scatter search approach and its application in a sequencing problem of mixed-model assembly lines in a case company

    NASA Astrophysics Data System (ADS)

    Liu, Qiong; Wang, Wen-xi; Zhu, Ke-ren; Zhang, Chao-yong; Rao, Yun-qing

    2014-11-01

    Mixed-model assembly line sequencing is significant in reducing the production time and overall cost of production. To improve production efficiency, a mathematical model aiming simultaneously to minimize overtime, idle time and total set-up costs is developed. To obtain high-quality and stable solutions, an advanced scatter search approach is proposed. In the proposed algorithm, a new diversification generation method based on a genetic algorithm is presented to generate a set of potentially diverse and high-quality initial solutions. Many methods, including reference set update, subset generation, solution combination and improvement methods, are designed to maintain the diversification of populations and to obtain high-quality ideal solutions. The proposed model and algorithm are applied and validated in a case company. The results indicate that the proposed advanced scatter search approach is significant for mixed-model assembly line sequencing in this company.
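
    A compact scatter-search skeleton, reduced to a toy sequencing objective, is sketched below to show how the components named above (diverse initial solutions, an improvement method, a reference set update and a combination method) fit together; the processing times, due dates and tardiness objective are illustrative and much simpler than the paper's mixed-model sequencing formulation.

      # Toy scatter search for ordering n jobs to minimise total tardiness.
      import random

      random.seed(0)
      proc = [4, 2, 7, 3, 5, 6]          # hypothetical processing times
      due = [6, 4, 18, 10, 14, 20]       # hypothetical due dates

      def cost(seq):
          t, total = 0, 0
          for j in seq:
              t += proc[j]
              total += max(0, t - due[j])   # tardiness of job j
          return total

      def improve(seq):                     # simple pairwise-swap local search
          best, improved = list(seq), True
          while improved:
              improved = False
              for i in range(len(best)):
                  for k in range(i + 1, len(best)):
                      cand = best[:]
                      cand[i], cand[k] = cand[k], cand[i]
                      if cost(cand) < cost(best):
                          best, improved = cand, True
          return best

      def combine(a, b):                    # order-preserving combination of two parents
          head = a[:len(a) // 2]
          return head + [j for j in b if j not in head]

      pool = [improve(random.sample(range(6), 6)) for _ in range(10)]   # diversification
      refset = sorted(pool, key=cost)[:4]                               # reference set

      for _ in range(20):                   # subset generation + combination + update
          a, b = random.sample(refset, 2)
          child = improve(combine(a, b))
          if cost(child) < cost(refset[-1]) and child not in refset:
              refset[-1] = child
              refset.sort(key=cost)

      print(refset[0], cost(refset[0]))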

  15. Adaptation of a Weighted Regression Approach to Evaluate Water Quality Trends in an Estuary

    EPA Science Inventory

    To improve the description of long-term changes in water quality, we adapted a weighted regression approach to analyze a long-term water quality dataset from Tampa Bay, Florida. The weighted regression approach, originally developed to resolve pollutant transport trends in rivers...

  16. Quality by Design approach for studying the impact of formulation and process variables on product quality of oral disintegrating films.

    PubMed

    Mazumder, Sonal; Pavurala, Naresh; Manda, Prashanth; Xu, Xiaoming; Cruz, Celia N; Krishnaiah, Yellela S R

    2017-07-15

    The present investigation was carried out to understand the impact of formulation and process variables on the quality of oral disintegrating films (ODF) using Quality by Design (QbD) approach. Lamotrigine (LMT) was used as a model drug. Formulation variable was plasticizer to film former ratio and process variables were drying temperature, air flow rate in the drying chamber, drying time and wet coat thickness of the film. A Definitive Screening Design of Experiments (DoE) was used to identify and classify the critical formulation and process variables impacting critical quality attributes (CQA). A total of 14 laboratory-scale DoE formulations were prepared and evaluated for mechanical properties (%elongation at break, yield stress, Young's modulus, folding endurance) and other CQA (dry thickness, disintegration time, dissolution rate, moisture content, moisture uptake, drug assay and drug content uniformity). The main factors affecting mechanical properties were plasticizer to film former ratio and drying temperature. Dissolution rate was found to be sensitive to air flow rate during drying and plasticizer to film former ratio. Data were analyzed for elucidating interactions between different variables, rank ordering the critical materials attributes (CMA) and critical process parameters (CPP), and for providing a predictive model for the process. Results suggested that plasticizer to film former ratio and process controls on drying are critical to manufacture LMT ODF with the desired CQA. Published by Elsevier B.V.

  17. Water quality modelling of Jadro spring.

    PubMed

    Margeta, J; Fistanic, I

    2004-01-01

    Management of water quality in karst is a specific problem. Water generally moves very fast by infiltration processes, but far more by concentrated flows through fissures and openings in the karst. This enables surface pollution to be transferred quickly and without filtration into groundwater springs. A typical example is the Jadro spring. Changes in water quality at the spring are sudden but short. Turbidity, as a major water quality problem for karst springs, regularly exceeds allowable standards. Former practice in problem solving has been reduced to intensive water disinfection in periods of great turbidity, without analysis of the risks of disinfection by-products for water users. The main prerequisite for water quality control and an optimization of water disinfection is knowledge of the raw water quality and the nature of its occurrence. The analysis of monitoring data and their functional relationship with hydrological parameters enables the establishment of a stochastic model that helps obtain better information on turbidity in different periods of the year. Using the model, a great number of average monthly and extreme daily values are generated. Statistical analysis of these data gives the probability of occurrence of high turbidity in certain months. This information can be used for designing an expert system for water quality management of karst springs. Thus, the time series model becomes a valuable tool in the management of the drinking water quality of the Jadro spring.
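
    One simple way to realise the workflow described above (fit a seasonal stochastic model, generate many synthetic series, and read off monthly probabilities of high turbidity) is sketched in Python below; the turbidity record, threshold and monthly-mean-plus-AR(1) model form are illustrative assumptions, not the Jadro data or the authors' model.

      # Toy seasonal stochastic model: monthly means plus an AR(1) residual, used to
      # estimate monthly exceedance probabilities for a turbidity threshold.
      import numpy as np

      rng = np.random.default_rng(5)
      months = np.tile(np.arange(12), 12)                       # 12 years of monthly data
      seasonal = 5 + 8 * np.exp(-((months - 11) % 12) / 3.0)    # wetter-season turbidity
      turb = seasonal + rng.gamma(2.0, 1.5, months.size)        # hypothetical NTU record

      mu = np.array([turb[months == m].mean() for m in range(12)])
      resid = turb - mu[months]
      phi = np.corrcoef(resid[:-1], resid[1:])[0, 1]            # lag-1 autocorrelation
      sigma = resid.std() * np.sqrt(1 - phi ** 2)

      def simulate(n_years=1000):
          out, e = np.empty((n_years, 12)), 0.0
          for y in range(n_years):
              for m in range(12):
                  e = phi * e + rng.normal(0, sigma)
                  out[y, m] = mu[m] + e
          return out

      exceed = (simulate() > 15.0).mean(axis=0)                 # P(turbidity > 15 NTU) by month
      print(np.round(exceed, 2))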

  18. Fox Valley Technical College Quality First Process Model.

    ERIC Educational Resources Information Center

    Fox Valley Technical Coll., Appleton, WI.

    An overview is provided of the Quality First Process Model developed by Fox Valley Technical College (FVTC), Wisconsin, to provide guidelines for quality instruction and service consistent with the highest educational standards. The 16-step model involves activities that should be adaptable to any organization. The steps of the quality model are…

  19. Evaluation of Weighted Scale Reliability and Criterion Validity: A Latent Variable Modeling Approach

    ERIC Educational Resources Information Center

    Raykov, Tenko

    2007-01-01

    A method is outlined for evaluating the reliability and criterion validity of weighted scales based on sets of unidimensional measures. The approach is developed within the framework of latent variable modeling methodology and is useful for point and interval estimation of these measurement quality coefficients in counseling and education…

  20. Statistical approaches used to assess and redesign surface water-quality-monitoring networks.

    PubMed

    Khalil, B; Ouarda, T B M J

    2009-11-01

    An up-to-date review of the statistical approaches utilized for the assessment and redesign of surface water quality monitoring (WQM) networks is presented. The main technical aspects of network design are covered in four sections, addressing monitoring objectives, water quality variables, sampling frequency and spatial distribution of sampling locations. This paper discusses various monitoring objectives and related procedures used for the assessment and redesign of long-term surface WQM networks. The appropriateness of each approach for the design, contraction or expansion of monitoring networks is also discussed. For each statistical approach, its advantages and disadvantages are examined from a network design perspective. Possible methods to overcome disadvantages and deficiencies in the statistical approaches that are currently in use are recommended.

  1. Investigating the Effect of Approach Angle and Nose Radius on Surface Quality of Inconel 718

    NASA Astrophysics Data System (ADS)

    Kumar, Sunil; Singh, Dilbag; Kalsi, Nirmal S.

    2017-11-01

    This experimental work presents a surface quality evaluation of the Ni-Cr-Fe based superalloy Inconel 718, which is widely used in aero engine and turbine components. During machining, however, early tool wear leads to a decrease in surface quality. Coating the cutting tool plays a significant role in increasing its wear resistance and life. The aim of this work is to study the surface quality of Inconel 718 machined with TiAlN-coated carbide tools. The influence of geometrical parameters (tool nose radius, approach angle) and machining variables (cutting velocity, feed rate) on the quality of the machined surface (surface roughness) was determined using a central composite design (CCD) matrix, and a mathematical model of the response was developed. Analysis of variance was used to assess the significance of the parameters. Results showed that tool nose radius and feed were the main active factors, and the experiment established that TiAlN-coated carbide inserts give better surface quality than uncoated carbide inserts.
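
    As an illustration of the kind of model such a CCD-based study produces, the sketch below fits a second-order (quadratic plus interaction) response surface for surface roughness to synthetic data; the factor ranges, response values, and fitted coefficients are invented and are not the paper's results, and the ANOVA step is omitted.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

# Hypothetical CCD-style data: nose radius r (mm), approach angle a (deg),
# cutting speed v (m/min), feed f (mm/rev) -> surface roughness Ra (um).
rng = np.random.default_rng(1)
X = rng.uniform([0.4, 45.0, 40.0, 0.05], [1.2, 95.0, 80.0, 0.15], size=(30, 4))
r, a, v, f = X.T
Ra = 3.0 - 1.5 * r + 0.005 * a - 0.01 * v + 20.0 * f + rng.normal(0, 0.1, 30)  # made-up response

# Second-order (quadratic + interaction) response surface, as typically fitted to CCD data.
model = make_pipeline(PolynomialFeatures(degree=2, include_bias=False), LinearRegression())
model.fit(X, Ra)
print("R^2 on the design points:", round(model.score(X, Ra), 3))
print("predicted Ra at r=1.2 mm, a=60 deg, v=60 m/min, f=0.08 mm/rev:",
      round(float(model.predict([[1.2, 60.0, 60.0, 0.08]])[0]), 3))
```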

  2. Measuring management's perspective of data quality in Pakistan's Tuberculosis control programme: a test-based approach to identify data quality dimensions.

    PubMed

    Ali, Syed Mustafa; Anjum, Naveed; Kamel Boulos, Maged N; Ishaq, Muhammad; Aamir, Javariya; Haider, Ghulam Rasool

    2018-01-16

    Data quality is a core theme of a programme's performance assessment, yet many organizations have no data quality improvement strategy, of which data quality dimensions and a data quality assessment framework are important constituents. As there is limited published research on the data quality specifics relevant to the context of Pakistan's Tuberculosis (TB) control programme, this study aims to identify the applicable data quality dimensions using a 'fitness-for-purpose' perspective. Forty-two respondents pooled a total of 473 years of professional experience, of which 223 years (47%) were in TB control related programmes. Based on their responses to 11 practical cases adopted from the routine recording and reporting system of Pakistan's TB control programme (patients' real identities were masked), completeness, accuracy, consistency, vagueness, uniqueness and timeliness are the data quality dimensions relevant to the programme's context, i.e. its work settings and field of practice. Using this 'fitness-for-purpose' approach to data quality, the study applied a test-based method to capture management's perspective and identified data quality dimensions pertinent to programme and country specific requirements. Implementing a data quality improvement strategy and achieving enhanced data quality would greatly help organizations promote data use for informed decision making.

  3. Stakeholder involvement in establishing a milk quality sub-index in dairy cow breeding goals: a Delphi approach.

    PubMed

    Henchion, M; McCarthy, M; Resconi, V C; Berry, D P; McParland, S

    2016-05-01

    The relative weightings on traits within breeding goals are generally determined by bio-economic models or profit functions. While such methods have generally delivered profitability gains to producers, and are being expanded to consider non-market values, current approaches generally do not consider the numerous and diverse stakeholders that affect, or are affected by, such tools. Based on principles of respondent anonymity, iteration, controlled feedback and statistical aggregation of feedback, a Delphi study was undertaken to gauge stakeholder opinion of the importance of detailed milk quality traits within an overall dairy breeding goal for profit, with the aim of assessing its suitability as a complementary, participatory approach to defining breeding goals. The questionnaires used over two survey rounds asked stakeholders: (a) their opinion on incorporating an explicit sub-index for milk quality into a national breeding goal; (b) the importance they would assign to a pre-determined list of milk quality traits and (c) the (relative) weighting they would give such a milk quality sub-index. Results from the survey highlighted a good degree of consensus among stakeholders on the issues raised. Similarly, revelation of the underlying assumptions and knowledge used by stakeholders to make their judgements illustrated their ability to consider a range of perspectives when evaluating traits, and to reconsider their answers based on the responses and rationales given by others, which demonstrated social learning. Finally, while the relative importance assigned by stakeholders in the Delphi survey (4% to 10%) and the results of calculations based on selection index theory of the relative emphasis that should be placed on milk quality to halt any deterioration (16%) are broadly in line, the difference indicates the benefit of considering more than one approach to determining breeding goals. This study thus illustrates the role of the Delphi technique, as a complementary

  4. Effects of Meteorological Data Quality on Snowpack Modeling

    NASA Astrophysics Data System (ADS)

    Havens, S.; Marks, D. G.; Robertson, M.; Hedrick, A. R.; Johnson, M.

    2017-12-01

    Detailed quality control of meteorological inputs is the most time-intensive component of running the distributed, physically-based iSnobal snow model, and the effect of data quality of the inputs on the model is unknown. The iSnobal model has been run operationally since WY2013, and is currently run in several basins in Idaho and California. The largest amount of user input during modeling is for the quality control of precipitation, temperature, relative humidity, solar radiation, wind speed and wind direction inputs. Precipitation inputs require detailed user input and are crucial to correctly model the snowpack mass. This research applies a range of quality control methods to meteorological input, from raw input with minimal cleaning, to complete user-applied quality control. The meteorological input cleaning generally falls into two categories. The first is global minimum/maximum and missing value correction that could be corrected and/or interpolated with automated processing. The second category is quality control for inputs that are not globally erroneous, yet are still unreasonable and generally indicate malfunctioning measurement equipment, such as temperature or relative humidity that remains constant, or does not correlate with daily trends observed at nearby stations. This research will determine how sensitive model outputs are to different levels of quality control and guide future operational applications.
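
    The two categories of cleaning described above can be sketched as follows for a single synthetic station record; the plausible-range limits, gap-filling window, and stuck-sensor rule are assumed examples, not the thresholds used operationally.

```python
import numpy as np
import pandas as pd

# Synthetic hourly air-temperature record with a missing-value code and a stuck sensor.
idx = pd.date_range("2017-01-01", periods=240, freq="h")
air_temp = pd.Series(5 + 8 * np.sin(np.arange(240) * 2 * np.pi / 24), index=idx)
air_temp.iloc[50:60] = -9999.0       # missing-value code
air_temp.iloc[100:130] = 3.2         # malfunctioning sensor reporting a constant value

# Category 1: missing-value codes and global min/max limits -> automated interpolation.
qc = air_temp.replace(-9999.0, np.nan)
qc = qc.where((qc > -40) & (qc < 45))        # physically plausible range (assumed)
qc = qc.interpolate(limit=12)                # fill short gaps only

# Category 2: values that are not globally erroneous but indicate a malfunction,
# e.g. a reading that stays constant for many hours -> flag for manual review.
stuck = qc.rolling(12).std() < 1e-3
print("hours flagged as potentially stuck:", int(stuck.sum()))
```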

  5. A task force model for statewide change in nursing education: building quality and safety.

    PubMed

    Mundt, Mary H; Clark, Margherita Procaccini; Klemczak, Jeanette Wrona

    2013-01-01

    The purpose of this article was to describe a statewide planning process to transform nursing education in Michigan to improve the quality and safety of patient care. A task force model was used to engage diverse partners in issue identification, consensus building, and recommendations. One statewide intervention in nursing education and practice that was executed was the Michigan Quality and Safety in Nursing Education Institute, which brought together academic and practice partners from all regions of the state. This paper describes the unique advantage of leadership by the Michigan Chief Nurse Executive, the existence of a nursing strategic plan, and a funding model. An overview of the Task Force on Nursing Education is presented with a focus on the model's 10 process steps and resulting seven recommendations. The Michigan Nurse Education Council was established to implement the recommendations, which included quality and safety. Copyright © 2013 Elsevier Inc. All rights reserved.

  6. AIR QUALITY MODELING OF AMMONIA: A REGIONAL MODELING PERSPECTIVE

    EPA Science Inventory

    The talk will address the status of modeling of ammonia from a regional modeling perspective, yet the observations and comments should have general applicability. The air quality modeling system components that are central to modeling ammonia will be noted and a perspective on ...

  7. Factual Approach in Decision Making - the Prerequisite of Success in Quality Management

    NASA Astrophysics Data System (ADS)

    Kučerová, Marta; Škůrková Lestyánszka, Katarína

    2013-12-01

    In a quality management system, as in other managerial systems, effective decisions must always be based on the analysis of data and information, i.e. on facts, in accordance with the factual approach principle of quality management. It is therefore necessary to measure and collect data and information about processes. The article presents the results of a survey focused on the application of the factual approach in decision making, and offers suggestions for improving the application of this principle in business practice. This article was prepared using the research results of VEGA project No. 1/0229/08 "Perspectives of the quality management development in relation to the requirements of market in the Slovak Republic".

  8. Evaluation of image quality

    NASA Technical Reports Server (NTRS)

    Pavel, M.

    1993-01-01

    This presentation outlines in viewgraph format a general approach to the evaluation of display system quality for aviation applications. This approach is based on the assumption that it is possible to develop a model of the display which captures most of the significant properties of the display. The display characteristics should include spatial and temporal resolution, intensity quantizing effects, spatial sampling, delays, etc. The model must be sufficiently well specified to permit generation of stimuli that simulate the output of the display system. The first step in the evaluation of display quality is an analysis of the tasks to be performed using the display. Thus, for example, if a display is used by a pilot during a final approach, the aesthetic aspects of the display may be less relevant than its dynamic characteristics. The opposite task requirements may apply to imaging systems used for displaying navigation charts. Thus, display quality is defined with regard to one or more tasks. Given a set of relevant tasks, there are many ways to approach display evaluation. The range of evaluation approaches includes visual inspection, rapid evaluation, part-task simulation, and full mission simulation. The work described is focused on two complementary approaches to rapid evaluation. The first approach is based on a model of the human visual system. A model of the human visual system is used to predict the performance of the selected tasks. The model-based evaluation approach permits very rapid and inexpensive evaluation of various design decisions. The second rapid evaluation approach employs specifically designed critical tests that embody many important characteristics of actual tasks. These are used in situations where a validated model is not available. These rapid evaluation tests are being implemented in a workstation environment.

  9. A Conceptual Analytics Model for an Outcome-Driven Quality Management Framework as Part of Professional Healthcare Education.

    PubMed

    Hervatis, Vasilis; Loe, Alan; Barman, Linda; O'Donoghue, John; Zary, Nabil

    2015-10-06

    Preparing the future health care professional workforce in a changing world is a significant undertaking. Educators and other decision makers look to evidence-based knowledge to improve quality of education. Analytics, the use of data to generate insights and support decisions, have been applied successfully across numerous application domains. Health care professional education is one area where great potential is yet to be realized. Previous research of Academic and Learning analytics has mainly focused on technical issues. The focus of this study relates to its practical implementation in the setting of health care education. The aim of this study is to create a conceptual model for a deeper understanding of the synthesizing process, and transforming data into information to support educators' decision making. A deductive case study approach was applied to develop the conceptual model. The analytics loop works both in theory and in practice. The conceptual model encompasses the underlying data, the quality indicators, and decision support for educators. The model illustrates how a theory can be applied to a traditional data-driven analytics approach, and alongside the context- or need-driven analytics approach.

  10. A Conceptual Analytics Model for an Outcome-Driven Quality Management Framework as Part of Professional Healthcare Education

    PubMed Central

    Loe, Alan; Barman, Linda; O'Donoghue, John; Zary, Nabil

    2015-01-01

    Background Preparing the future health care professional workforce in a changing world is a significant undertaking. Educators and other decision makers look to evidence-based knowledge to improve quality of education. Analytics, the use of data to generate insights and support decisions, have been applied successfully across numerous application domains. Health care professional education is one area where great potential is yet to be realized. Previous research of Academic and Learning analytics has mainly focused on technical issues. The focus of this study relates to its practical implementation in the setting of health care education. Objective The aim of this study is to create a conceptual model for a deeper understanding of the synthesizing process, and transforming data into information to support educators’ decision making. Methods A deductive case study approach was applied to develop the conceptual model. Results The analytics loop works both in theory and in practice. The conceptual model encompasses the underlying data, the quality indicators, and decision support for educators. Conclusions The model illustrates how a theory can be applied to a traditional data-driven analytics approach, and alongside the context- or need-driven analytics approach. PMID:27731840

  11. Using Clinical Data Standards to Measure Quality: A New Approach.

    PubMed

    D'Amore, John D; Li, Chun; McCrary, Laura; Niloff, Jonathan M; Sittig, Dean F; McCoy, Allison B; Wright, Adam

    2018-04-01

    Value-based payment for care requires the consistent, objective calculation of care quality. Previous initiatives to calculate ambulatory quality measures have relied on billing data or individual electronic health records (EHRs) to calculate and report performance. New methods for quality measure calculation promoted by federal regulations allow qualified clinical data registries to report quality outcomes based on data aggregated across facilities and EHRs using interoperability standards. This research evaluates the use of clinical document interchange standards as the basis for quality measurement. Using data on 1,100 patients from 11 ambulatory care facilities and 5 different EHRs, challenges to quality measurement are identified and addressed for 17 certified quality measures. Iterative solutions were identified for 14 measures that improved patient inclusion and measure calculation accuracy. Findings validate this approach to improving measure accuracy while maintaining measure certification. Organizations that report care quality should be aware of how the identified issues affect quality measure selection and calculation. Quality measure authors should consider increasing real-world validation and the consistency of measure logic with respect to the issues identified in this research. Schattauer GmbH Stuttgart.

  12. STREAM WATER QUALITY MODEL

    EPA Science Inventory

    QUAL2K (or Q2K) is a river and stream water quality model that is intended to represent a modernized version of the QUAL2E (or Q2E) model (Brown and Barnwell 1987). Q2K is similar to Q2E in the following respects:

    • One dimensional. The channel is well-mixed vertically a...

    • Hydrodynamics and water quality models applied to Sepetiba Bay

      NASA Astrophysics Data System (ADS)

      Cunha, Cynara de L. da N.; Rosman, Paulo C. C.; Ferreira, Aldo Pacheco; Carlos do Nascimento Monteiro, Teófilo

      2006-10-01

      A coupled hydrodynamic and water quality model is used to simulate the pollution in Sepetiba Bay due to sewage effluent. Sepetiba Bay has a complicated geometry and bottom topography, and is located on the Brazilian coast near Rio de Janeiro. In the simulation, the dissolved oxygen (DO) concentration and biochemical oxygen demand (BOD) are used as indicators for the presence of organic matter in the body of water, and as parameters for evaluating the environmental pollution of the eastern part of Sepetiba Bay. Effluent sources in the model are taken from DO and BOD field measurements. The simulation results are consistent with field observations and demonstrate that the model has been correctly calibrated. The model is suitable for evaluating the environmental impact of sewage effluent on Sepetiba Bay from river inflows, assessing the feasibility of different treatment schemes, and developing specific monitoring activities. This approach has general applicability for environmental assessment of complicated coastal bays.
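
      The record does not give the model equations, so the sketch below falls back on the classical Streeter-Phelps dissolved-oxygen sag, a much-simplified way of relating a BOD load to downstream DO; it is not the coupled hydrodynamic and water quality model used in the study, and all rate constants and loads are assumed.

```python
import numpy as np

# Classical Streeter-Phelps DO sag downstream of an outfall (illustrative only).
kd, ka = 0.35, 0.6           # BOD decay and reaeration rates (1/day), assumed
L0, D0 = 12.0, 1.0           # initial BOD and initial DO deficit (mg/L), assumed
DO_sat = 8.0                 # saturation DO (mg/L)

t = np.linspace(0, 10, 101)                       # travel time (days)
D = (kd * L0 / (ka - kd)) * (np.exp(-kd * t) - np.exp(-ka * t)) + D0 * np.exp(-ka * t)
DO = DO_sat - D                                   # dissolved oxygen along the reach

tc = (1.0 / (ka - kd)) * np.log((ka / kd) * (1 - D0 * (ka - kd) / (kd * L0)))
print("critical travel time (days):", round(float(tc), 2))
print("minimum DO (mg/L):", round(float(DO.min()), 2))
```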

    • A priori discretization quality metrics for distributed hydrologic modeling applications

      NASA Astrophysics Data System (ADS)

      Liu, Hongli; Tolson, Bryan; Craig, James; Shafii, Mahyar; Basu, Nandita

      2016-04-01

      In distributed hydrologic modelling, a watershed is treated as a set of small homogeneous units that address the spatial heterogeneity of the watershed being simulated. The ability of models to reproduce observed spatial patterns firstly depends on the spatial discretization, which is the process of defining homogeneous units in the form of grid cells, subwatersheds, or hydrologic response units etc. It is common for hydrologic modelling studies to simply adopt a nominal or default discretization strategy without formally assessing alternative discretization levels. This approach lacks formal justifications and is thus problematic. More formalized discretization strategies are either a priori or a posteriori with respect to building and running a hydrologic simulation model. A posteriori approaches tend to be ad-hoc and compare model calibration and/or validation performance under various watershed discretizations. The construction and calibration of multiple versions of a distributed model can become a seriously limiting computational burden. Current a priori approaches are more formalized and compare overall heterogeneity statistics of dominant variables between candidate discretization schemes and input data or reference zones. While a priori approaches are efficient and do not require running a hydrologic model, they do not fully investigate the internal spatial pattern changes of variables of interest. Furthermore, the existing a priori approaches focus on landscape and soil data and do not assess impacts of discretization on stream channel definition even though its significance has been noted by numerous studies. The primary goals of this study are to (1) introduce new a priori discretization quality metrics considering the spatial pattern changes of model input data; (2) introduce a two-step discretization decision-making approach to compress extreme errors and meet user-specified discretization expectations through non-uniform discretization threshold

    • Quality and Quality Assurance in Vocational Education and Training in the Mediterranean Countries: Lessons from the European Approach

      ERIC Educational Resources Information Center

      Masson, Jean-Raymond; Baati, Mounir; Seyfried, Erwin

      2010-01-01

      This article reflects on the development of the European approach towards quality and quality assurance in vocational education and training (VET) and its relevance for VET reforms in the European Training Foundation (ETF) partner countries. The analysis is based on an ETF project conducted in 2007-2008 in the Mediterranean partner countries to…

    • The Educational Situation Quality Model: Recent Advances.

      PubMed

      Doménech-Betoret, Fernando

      2018-01-01

      The purpose of this work was to present an educational model developed in recent years, entitled "The Educational Situation Quality Model" (MOCSE, acronym in Spanish). MOCSE can be defined as an instructional model that simultaneously considers the teaching-learning process, in which motivation plays a central role. It explains the functioning of an educational setting by organizing and relating the most important variables which, according to the literature, contribute to student learning. Besides being a conceptual framework, the model also provides a methodological procedure to guide research and to promote reflection in the classroom. It allows teachers to implement effective action-research programs to improve teacher and student satisfaction and learning outcomes in the classroom context. This work explains the model's characteristics and functioning, recent advances, and how teachers can use it in an educational setting with a specific subject. The proposal integrates approaches from several relevant psycho-educational theories and introduces a new perspective into the existing literature that will allow researchers to make progress in studying how educational settings function. The initial MOCSE configuration has been refined over time in accordance with the empirical results obtained from previous research carried out within the MOCSE framework and with the subsequent reflections derived from these results. Finally, the contribution of the model to improving learning outcomes and satisfaction, and its applicability in the classroom, are also discussed.

    • A quality improvement management model for renal care.

      PubMed

      Vlchek, D L; Day, L M

      1991-04-01

      The purpose of this article is to explore the potential for applying the theory and tools of quality improvement (total quality management) in the renal care setting. We believe that the coupling of the statistical techniques used in the Deming method of quality improvement, with modern approaches to outcome and process analysis, will provide the renal care community with powerful tools, not only for improved quality (i.e., reduced morbidity and mortality), but also for technology evaluation and resource allocation.

    • Air Quality Response Modeling for Decision Support | Science ...

      EPA Pesticide Factsheets

      Air quality management relies on photochemical models to predict the responses of pollutant concentrations to changes in emissions. Such modeling is especially important for secondary pollutants such as ozone and fine particulate matter which vary nonlinearly with changes in emissions. Numerous techniques for probing pollutant-emission relationships within photochemical models have been developed and deployed for a variety of decision support applications. However, atmospheric response modeling remains complicated by the challenge of validating sensitivity results against observable data. This manuscript reviews the state of the science of atmospheric response modeling as well as efforts to characterize the accuracy and uncertainty of sensitivity results. The National Exposure Research Laboratory's (NERL's) Atmospheric Modeling and Analysis Division (AMAD) conducts research in support of EPA's mission to protect human health and the environment. AMAD's research program is engaged in developing and evaluating predictive atmospheric models on all spatial and temporal scales for forecasting the Nation's air quality and for assessing changes in air quality and air pollutant exposures, as affected by changes in ecosystem management and regulatory decisions. AMAD is responsible for providing a sound scientific and technical basis for regulatory policies based on air quality models to improve ambient air quality. The models developed by AMAD are being use

    • One multi-media environmental system with linkage between meteorology/ hydrology/ air quality models and water quality model

      NASA Astrophysics Data System (ADS)

      Tang, C.; Lynch, J. A.; Dennis, R. L.

      2016-12-01

      The biogeochemical processing of nitrogen and associated pollutants is driven by meteorological and hydrological processes in conjunction with pollutant loading. There are feedbacks between meteorology and hydrology that will be affected by land-use change and climate change, and changes in meteorology will affect pollutant deposition. It is important to account for those feedbacks and produce internally consistent simulations of meteorology, hydrology, and pollutant loading to drive the (watershed/water quality) biogeochemical models. In this study, the ecological response to emission reductions in streams of the Potomac watershed was evaluated. First, we simulated deposition using the fully coupled Weather Research & Forecasting (WRF) model and the Community Multiscale Air Quality (CMAQ) model; second, we created the hydrological data with the offline-linked Variable Infiltration Capacity (VIC) model and the WRF model. Lastly, we investigated water quality with one comprehensive environmental model, namely the linkage of CMAQ, WRF, VIC and the Model of Acidification of Groundwater In Catchments (MAGIC) from 2002 to 2010. The simulated results (such as NO3, SO4, and SBC) fit the observed values well. The linkage provides a generally accurate, well-tested tool for evaluating sensitivities to varying meteorology and environmental changes on acidification and other biogeochemical processes, with the capability to comprehensively explore strategic policy and management design.

    • Combination of a Stressor-Response Model with a Conditional Probability Analysis Approach to Develop Candidate Criteria from Empirical Data

      EPA Science Inventory

      We show that a conditional probability analysis that utilizes a stressor-response model based on a logistic regression provides a useful approach for developing candidate water quality criteria from empirical data. The critical step in this approach is transforming the response ...
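
      A minimal sketch of the general idea, assuming synthetic data: fit a logistic stressor-response model, then read off the conditional probability of an impaired response at candidate stressor levels. The stressor variable, sample size, and risk interpretation are placeholders, not the study's.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Fit a logistic stressor-response model to synthetic data, then read off the
# conditional probability of an impaired biological response at candidate levels.
rng = np.random.default_rng(0)
stressor = rng.uniform(0, 2.0, 300)                    # e.g. a nutrient concentration (mg/L)
p_true = 1 / (1 + np.exp(-(4.0 * stressor - 3.0)))
impaired = (rng.random(300) < p_true).astype(int)      # 1 = degraded biological condition

model = LogisticRegression().fit(stressor.reshape(-1, 1), impaired)

candidates = np.array([0.25, 0.5, 0.75, 1.0]).reshape(-1, 1)
p_impaired = model.predict_proba(candidates)[:, 1]
for c, p in zip(candidates.ravel(), p_impaired):
    print(f"P(impaired | stressor = {c:.2f}) = {p:.2f}")
# A candidate criterion could be the largest stressor level at which this conditional
# probability stays below a management-defined risk level.
```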

  1. Does a hospital's quality depend on the quality of other hospitals? A spatial econometrics approach

    PubMed Central

    Gravelle, Hugh; Santos, Rita; Siciliani, Luigi

    2014-01-01

    We examine whether a hospital's quality is affected by the quality provided by other hospitals in the same market. We first sketch a theoretical model with regulated prices and derive conditions on demand and cost functions which determine whether a hospital will increase its quality if its rivals increase their quality. We then apply spatial econometric methods to a sample of English hospitals in 2009–10 and a set of 16 quality measures including mortality rates, readmission, revision and redo rates, and three patient reported indicators, to examine the relationship between the quality of hospitals. We find that a hospital's quality is positively associated with the quality of its rivals for seven out of the sixteen quality measures. There are no statistically significant negative associations. In those cases where there is a significant positive association, an increase in rivals' quality by 10% increases a hospital's quality by 1.7% to 2.9%. The finding suggests that for some quality measures a policy which improves the quality in one hospital will have positive spillover effects on the quality in other hospitals. PMID:25843994

  2. Does a hospital's quality depend on the quality of other hospitals? A spatial econometrics approach.

    PubMed

    Gravelle, Hugh; Santos, Rita; Siciliani, Luigi

    2014-11-01

    We examine whether a hospital's quality is affected by the quality provided by other hospitals in the same market. We first sketch a theoretical model with regulated prices and derive conditions on demand and cost functions which determine whether a hospital will increase its quality if its rivals increase their quality. We then apply spatial econometric methods to a sample of English hospitals in 2009-10 and a set of 16 quality measures including mortality rates, readmission, revision and redo rates, and three patient reported indicators, to examine the relationship between the quality of hospitals. We find that a hospital's quality is positively associated with the quality of its rivals for seven out of the sixteen quality measures. There are no statistically significant negative associations. In those cases where there is a significant positive association, an increase in rivals' quality by 10% increases a hospital's quality by 1.7% to 2.9%. The finding suggests that for some quality measures a policy which improves the quality in one hospital will have positive spillover effects on the quality in other hospitals.
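
    To make the spatial-lag structure concrete, the sketch below builds a row-standardised inverse-distance weight matrix and regresses each hospital's (synthetic) quality score on the weighted average quality of its neighbours. A plain least-squares fit of a spatial lag model is biased and is shown only for illustration; the paper relies on formal spatial econometric estimators.

```python
import numpy as np

# Synthetic hospitals: locations and a quality score.
rng = np.random.default_rng(3)
n = 120
coords = rng.uniform(0, 100, size=(n, 2))       # hospital locations (km)
quality = rng.normal(80, 5, n)                  # e.g. a composite quality score

# Row-standardised inverse-distance spatial weight matrix W.
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
W = 1.0 / (d + np.eye(n))                       # avoid division by zero on the diagonal
np.fill_diagonal(W, 0.0)
W /= W.sum(axis=1, keepdims=True)

rivals_quality = W @ quality                    # spatial lag: weighted rivals' quality
X = np.column_stack([np.ones(n), rivals_quality])
beta, *_ = np.linalg.lstsq(X, quality, rcond=None)
print("naive spatial-lag coefficient (rho):", round(float(beta[1]), 3))
```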

  3. Determinants of quality of life in patients with fibromyalgia: A structural equation modeling approach.

    PubMed

    Lee, Jeong-Won; Lee, Kyung-Eun; Park, Dong-Jin; Kim, Seong-Ho; Nah, Seong-Su; Lee, Ji Hyun; Kim, Seong-Kyu; Lee, Yeon-Ah; Hong, Seung-Jae; Kim, Hyun-Sook; Lee, Hye-Soon; Kim, Hyoun Ah; Joung, Chung-Il; Kim, Sang-Hyon; Lee, Shin-Seok

    2017-01-01

    Health-related quality of life (HRQOL) in patients with fibromyalgia (FM) is lower than in patients with other chronic diseases and the general population. Although various factors affect HRQOL, no study has examined a structural equation model of HRQOL as an outcome variable in FM patients. The present study assessed relationships among physical function, social factors, psychological factors, and HRQOL, and the effects of these variables on HRQOL in a hypothesized model using structural equation modeling (SEM). HRQOL was measured using SF-36, and the Fibromyalgia Impact Questionnaire (FIQ) was used to assess physical dysfunction. Social and psychological statuses were assessed using the Beck Depression Inventory (BDI), the State-Trait Anxiety Inventory (STAI), the Arthritis Self-Efficacy Scale (ASES), and the Social Support Scale. SEM analysis was used to test the structural relationships of the model using the AMOS software. Of the 336 patients, 301 (89.6%) were women with an average age of 47.9±10.9 years. The SEM results supported the hypothesized structural model (χ2 = 2.336, df = 3, p = 0.506). The final model showed that Physical Component Summary (PCS) was directly related to self-efficacy and inversely related to FIQ, and that Mental Component Summary (MCS) was inversely related to FIQ, BDI, and STAI. In our model of FM patients, HRQOL was affected by physical, social, and psychological variables. In these patients, higher levels of physical function and self-efficacy can improve the PCS of HRQOL, while physical function, depression, and anxiety negatively affect the MCS of HRQOL.

  4. Determinants of quality of life in patients with fibromyalgia: A structural equation modeling approach

    PubMed Central

    Lee, Jeong-Won; Lee, Kyung-Eun; Park, Dong-Jin; Kim, Seong-Ho; Nah, Seong-Su; Lee, Ji Hyun; Kim, Seong-Kyu; Lee, Yeon-Ah; Hong, Seung-Jae; Kim, Hyun-Sook; Lee, Hye-Soon; Kim, Hyoun Ah; Joung, Chung-Il; Kim, Sang-Hyon

    2017-01-01

    Objective Health-related quality of life (HRQOL) in patients with fibromyalgia (FM) is lower than in patients with other chronic diseases and the general population. Although various factors affect HRQOL, no study has examined a structural equation model of HRQOL as an outcome variable in FM patients. The present study assessed relationships among physical function, social factors, psychological factors, and HRQOL, and the effects of these variables on HRQOL in a hypothesized model using structural equation modeling (SEM). Methods HRQOL was measured using SF-36, and the Fibromyalgia Impact Questionnaire (FIQ) was used to assess physical dysfunction. Social and psychological statuses were assessed using the Beck Depression Inventory (BDI), the State-Trait Anxiety Inventory (STAI), the Arthritis Self-Efficacy Scale (ASES), and the Social Support Scale. SEM analysis was used to test the structural relationships of the model using the AMOS software. Results Of the 336 patients, 301 (89.6%) were women with an average age of 47.9±10.9 years. The SEM results supported the hypothesized structural model (χ2 = 2.336, df = 3, p = 0.506). The final model showed that Physical Component Summary (PCS) was directly related to self-efficacy and inversely related to FIQ, and that Mental Component Summary (MCS) was inversely related to FIQ, BDI, and STAI. Conclusions In our model of FM patients, HRQOL was affected by physical, social, and psychological variables. In these patients, higher levels of physical function and self-efficacy can improve the PCS of HRQOL, while physical function, depression, and anxiety negatively affect the MCS of HRQOL. PMID:28158289
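
    A sketch of how such a path model can be specified and fitted, assuming the third-party semopy package (its Model/fit/inspect interface) and using synthetic stand-in data with the variable names from the abstract; the specification only loosely mirrors the reported final model and none of the estimates correspond to the study.

```python
import numpy as np
import pandas as pd
import semopy  # assumed third-party SEM package (Model / fit / inspect interface)

# Synthetic stand-in data using the variable names from the abstract.
rng = np.random.default_rng(7)
n = 336
df = pd.DataFrame({
    "FIQ":  rng.normal(50, 15, n),
    "BDI":  rng.normal(20, 8, n),
    "STAI": rng.normal(45, 10, n),
    "ASES": rng.normal(60, 12, n),
})
df["PCS"] = 55 - 0.3 * df["FIQ"] + 0.2 * df["ASES"] + rng.normal(0, 5, n)
df["MCS"] = 60 - 0.2 * df["FIQ"] - 0.3 * df["BDI"] - 0.2 * df["STAI"] + rng.normal(0, 5, n)

# Path model loosely following the final model described above.
desc = """
PCS ~ FIQ + ASES
MCS ~ FIQ + BDI + STAI
"""
model = semopy.Model(desc)
model.fit(df)
print(model.inspect())   # path estimates, standard errors, p-values
```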

  5. Putting people into water quality modelling.

    NASA Astrophysics Data System (ADS)

    Strickert, G. E.; Hassanzadeh, E.; Noble, B.; Baulch, H. M.; Morales-Marin, L. A.; Lindenschmidt, K. E.

    2017-12-01

    Water quality in the Qu'Appelle River Basin, Saskatchewan is under pressure due to nutrient pollution entering the river system from major cities, industrial zones and agricultural areas. Among these stressors, agricultural activities are basin-wide; therefore, they are the largest non-point source of water pollution in this region. The dynamics of agricultural impacts on water quality are complex and stem from decisions and activities of two distinct stakeholder groups, namely grain farmers and cattle producers, which have different business plans, values, and attitudes towards water quality. As a result, improving water quality in this basin requires engaging with stakeholders to: (1) understand their perspectives regarding a range of agricultural Beneficial Management Practices (BMPs) that can improve water quality in the region, (2) show them the potential consequences of their selected BMPs, and (3) work with them to better understand the barriers and incentives to implementing the effective BMPs. To this end, we held a series of workshops in the Qu'Appelle River Basin with both groups of stakeholders to understand their viewpoints about alternative agricultural BMPs and their impact on water quality. Workshop participants took part in a statement sorting activity (Q-sorts), group discussions, and a mapping activity. The workshop outcomes show that stakeholders held four distinct viewpoints about the BMPs that can improve water quality, i.e., flow and erosion control, fertilizer management, cattle site management, and mixed cattle and wetland management. Accordingly, to simulate the consequences of stakeholder-selected BMPs, a conceptual water quality model was developed using System Dynamics (SD). The model estimates potential changes in water quality at the farm, tributary and regional scale in the Qu'Appelle River Basin under each and/or combination of stakeholder-selected BMPs. The SD model was then used for real

  6. Model design for predicting extreme precipitation event impacts on water quality in a water supply reservoir

    NASA Astrophysics Data System (ADS)

    Hagemann, M.; Jeznach, L. C.; Park, M. H.; Tobiason, J. E.

    2016-12-01

    Extreme precipitation events such as tropical storms and hurricanes are by their nature rare, yet have disproportionate and adverse effects on surface water quality. In the context of drinking water reservoirs, common concerns of such events include increased erosion and sediment transport and influx of natural organic matter and nutrients. As part of an effort to model the effects of an extreme precipitation event on water quality at the reservoir intake of a major municipal water system, this study sought to estimate extreme-event watershed responses including streamflow and exports of nutrients and organic matter for use as inputs to a 2-D hydrodynamic and water quality reservoir model. Since extreme-event watershed exports are highly uncertain, we characterized and propagated predictive uncertainty using a quasi-Monte Carlo approach to generate reservoir model inputs. Three storm precipitation depths, corresponding to recurrence intervals of 5, 50, and 100 years, were converted to streamflow in each of 9 tributaries by volumetrically scaling 2 storm hydrographs from the historical record. Rating-curve models for concentration, calibrated using 10 years of data for each of 5 constituents, were then used to estimate the parameters of a multivariate lognormal probability model of constituent concentrations, conditional on each scenario's storm date and streamflow. A quasi-random Halton sequence (n = 100) was drawn from the conditional distribution for each event scenario, and used to generate input files to a calibrated CE-QUAL-W2 reservoir model. The resulting simulated concentrations at the reservoir's drinking water intake constitute a low-discrepancy sample from the estimated uncertainty space of extreme-event source water quality. Limiting factors to the suitability of this approach include poorly constrained relationships between hydrology and constituent concentrations, a high-dimensional space from which to generate inputs, and relatively long run
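
    The quasi-Monte Carlo step can be sketched as below: draw a Halton sequence and map it through a multivariate lognormal model of constituent concentrations. The log-space means, standard deviations and correlation structure are placeholders, not the calibrated rating-curve parameters of the study.

```python
import numpy as np
from scipy.stats import norm, qmc

# Placeholder lognormal model for five constituent concentrations.
d = 5
mu = np.log([3.0, 0.02, 0.5, 4.0, 1.0])          # log-space means (assumed)
sd = np.array([0.4, 0.5, 0.3, 0.4, 0.6])         # log-space standard deviations (assumed)
corr = 0.5 * np.ones((d, d)) + 0.5 * np.eye(d)   # simple common-correlation structure
cov = np.outer(sd, sd) * corr
L = np.linalg.cholesky(cov)

halton = qmc.Halton(d=d, scramble=True, seed=1)
u = np.clip(halton.random(n=100), 1e-12, 1 - 1e-12)  # 100 low-discrepancy points in (0, 1)^d
z = norm.ppf(u)                                       # map to standard normal space
conc = np.exp(mu + z @ L.T)                           # correlated lognormal concentrations

print(conc.shape)              # (100, 5): one input set per reservoir-model run
print(np.round(conc[:3], 3))
```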

  7. Simulating ensembles of source water quality using a K-nearest neighbor resampling approach.

    PubMed

    Towler, Erin; Rajagopalan, Balaji; Seidel, Chad; Summers, R Scott

    2009-03-01

    Climatological, geological, and water management factors can cause significant variability in surface water quality. As drinking water quality standards become more stringent, the ability to quantify the variability of source water quality becomes more important for decision-making and planning in water treatment for regulatory compliance. However, paucity of long-term water quality data makes it challenging to apply traditional simulation techniques. To overcome this limitation, we have developed and applied a robust nonparametric K-nearest neighbor (K-nn) bootstrap approach utilizing the United States Environmental Protection Agency's Information Collection Rule (ICR) data. In this technique, first an appropriate "feature vector" is formed from the best available explanatory variables. The nearest neighbors to the feature vector are identified from the ICR data and are resampled using a weight function. Repetition of this results in water quality ensembles, and consequently the distribution and the quantification of the variability. The main strengths of the approach are its flexibility, simplicity, and the ability to use a large amount of spatial data with limited temporal extent to provide water quality ensembles for any given location. We demonstrate this approach by applying it to simulate monthly ensembles of total organic carbon for two utilities in the U.S. with very different watersheds and to alkalinity and bromide at two other U.S. utilities.
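
    A minimal sketch of the K-nearest-neighbour bootstrap idea with synthetic data: find the K records closest to a site's feature vector and resample their observed values with rank-based weights. The feature variables, K, and weight function are illustrative choices, not necessarily those used with the ICR data.

```python
import numpy as np

# Synthetic "ICR-like" records: explanatory features and observed TOC.
rng = np.random.default_rng(5)
n = 500
features = rng.normal(size=(n, 3))                       # e.g. alkalinity, season index, source type
toc = 2 + features[:, 1] + 0.5 * rng.normal(size=n)      # observed total organic carbon (mg/L)

target = np.array([0.2, 1.0, -0.3])     # feature vector for the utility of interest
k = 20
dist = np.linalg.norm(features - target, axis=1)
nn = np.argsort(dist)[:k]               # indices of the K nearest neighbours

ranks = np.arange(1, k + 1)
w = (1.0 / ranks) / np.sum(1.0 / ranks) # decreasing weight with rank (a common choice)

ensemble = rng.choice(toc[nn], size=1000, replace=True, p=w)
print("simulated TOC ensemble: median %.2f, 90th percentile %.2f"
      % (np.median(ensemble), np.percentile(ensemble, 90)))
```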

  8. Helicopter mathematical models and control law development for handling qualities research

    NASA Technical Reports Server (NTRS)

    Chen, Robert T. N.; Lebacqz, J. Victor; Aiken, Edwin W.; Tischler, Mark B.

    1988-01-01

    Progress made in joint NASA/Army research concerning rotorcraft flight-dynamics modeling, design methodologies for rotorcraft flight-control laws, and rotorcraft parameter identification is reviewed. Research into these interactive disciplines is needed to develop the analytical tools necessary to conduct flying qualities investigations using both the ground-based and in-flight simulators, and to permit an efficient means of performing flight test evaluation of rotorcraft flying qualities for specification compliance. The need for the research is particularly acute for rotorcraft because of their mathematical complexity, high order dynamic characteristics, and demanding mission requirements. The research in rotorcraft flight-dynamics modeling is pursued along two general directions: generic nonlinear models and nonlinear models for specific rotorcraft. In addition, linear models are generated that extend their utilization from 1-g flight to high-g maneuvers and expand their frequency range of validity for the design analysis of high-gain flight control systems. A variety of methods ranging from classical frequency-domain approaches to modern time-domain control methodology that are used in the design of rotorcraft flight control laws is reviewed. Also reviewed is a study conducted to investigate the design details associated with high-gain, digital flight control systems for combat rotorcraft. Parameter identification techniques developed for rotorcraft applications are reviewed.

  9. Principal Component Clustering Approach to Teaching Quality Discriminant Analysis

    ERIC Educational Resources Information Center

    Xian, Sidong; Xia, Haibo; Yin, Yubo; Zhai, Zhansheng; Shang, Yan

    2016-01-01

    Teaching quality is the lifeline of higher education. Many universities have made effective achievements in evaluating teaching quality. In this paper, we establish a students' evaluation of teaching (SET) discriminant analysis model and algorithm based on principal component clustering analysis. Additionally, we classify the SET…
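
    A small sketch of the principal-component-plus-clustering idea on synthetic rating data; the discriminant-analysis step of the paper is not reproduced, and the numbers of components and clusters are arbitrary choices.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Synthetic SET ratings: 200 teachers, 10 rating items on a 1-5 scale.
rng = np.random.default_rng(2)
ratings = np.clip(rng.normal(4.0, 0.6, size=(200, 10)), 1, 5)

# Reduce the correlated items to a few principal components, then cluster teachers.
scores = PCA(n_components=3).fit_transform(StandardScaler().fit_transform(ratings))
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
print("teachers per cluster:", np.bincount(labels))
```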

  10. Graphical approach to assess the soil fertility evaluation model validity for rice (case study: southern area of Merapi Mountain, Indonesia)

    NASA Astrophysics Data System (ADS)

    Julianto, E. A.; Suntoro, W. A.; Dewi, W. S.; Partoyo

    2018-03-01

    Climate change has been reported to exacerbate land resource degradation, including soil fertility decline. Appropriate validation of soil fertility evaluation models could reduce the risk of climate change effects on plant cultivation. This study aims to assess the validity of soil fertility evaluation models using a graphical approach. The models evaluated were the Indonesian Soil Research Center (PPT) version, the FAO Unesco version, and the Kyuma version. Each model was then correlated with rice production (dry grain weight/GKP). The goodness of fit of each model, as well as the regression coefficient (R2), can be used to evaluate the quality and validity of the model. The research used the EViews 9 program with a graphical approach, which produces three curves: actual, fitted, and residual. If the actual and fitted curves are far apart or irregular, the quality of the model is not good, meaning that many other factors are still not included in the model (large residual); conversely, if the actual and fitted curves show exactly the same shape, all relevant factors have already been included in the model. Modification of the standard soil fertility evaluation models can improve their quality and validity.

  11. Market-Based Approaches to Quality Assessment and Management of Higher Education in the Republic of Kazakhstan

    ERIC Educational Resources Information Center

    Valikhanova, Zarina

    2015-01-01

    This article considers the problems of defining quality in the educational sphere. Alternative approaches to the concept of quality of education and its evaluation are examined in light of the different approaches of scientists and experts. The most important criteria for assessing quality are distinguished and formed into a matrix for…

  12. Cost-effective water quality assessment through the integration of monitoring data and modeling results

    NASA Astrophysics Data System (ADS)

    Lobuglio, Joseph N.; Characklis, Gregory W.; Serre, Marc L.

    2007-03-01

    Sparse monitoring data and error inherent in water quality models make the identification of waters not meeting regulatory standards uncertain. Additional monitoring can be implemented to reduce this uncertainty, but it is often expensive. These costs are currently a major concern, since developing total maximum daily loads, as mandated by the Clean Water Act, will require assessing tens of thousands of water bodies across the United States. This work uses the Bayesian maximum entropy (BME) method of modern geostatistics to integrate water quality monitoring data together with model predictions to provide improved estimates of water quality in a cost-effective manner. This information includes estimates of uncertainty and can be used to aid probabilistic-based decisions concerning the status of a water (i.e., impaired or not impaired) and the level of monitoring needed to characterize the water for regulatory purposes. This approach is applied to the Catawba River reservoir system in western North Carolina as a means of estimating seasonal chlorophyll a concentration. Mean concentration and confidence intervals for chlorophyll a are estimated for 66 reservoir segments over an 11-year period (726 values) based on 219 measured seasonal averages and 54 model predictions. Although the model predictions had a high degree of uncertainty, integration of modeling results via BME methods reduced the uncertainty associated with chlorophyll estimates compared with estimates made solely with information from monitoring efforts. Probabilistic predictions of future chlorophyll levels on one reservoir are used to illustrate the cost savings that can be achieved by less extensive and rigorous monitoring methods within the BME framework. While BME methods have been applied in several environmental contexts, employing these methods as a means of integrating monitoring and modeling results, as well as application of this approach to the assessment of surface water monitoring networks
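
    The BME framework itself is considerably more general (it handles non-Gaussian knowledge and space/time correlation), but the basic idea of combining an uncertain model prediction with a monitoring-based estimate can be conveyed by a simple inverse-variance fusion of two Gaussian estimates, as in the sketch below; all numbers are invented.

```python
from math import erf, sqrt

# Inverse-variance fusion of a monitoring estimate and a model prediction for one
# reservoir segment and season (a drastic simplification of BME; numbers invented).
chl_monitoring, sd_monitoring = 9.0, 3.0   # seasonal mean chlorophyll a (ug/L) from sparse samples
chl_model, sd_model = 14.0, 6.0            # model prediction, with larger uncertainty

w_mon, w_mod = 1 / sd_monitoring**2, 1 / sd_model**2
chl_combined = (w_mon * chl_monitoring + w_mod * chl_model) / (w_mon + w_mod)
sd_combined = (w_mon + w_mod) ** -0.5

p_exceed = 1 - 0.5 * (1 + erf((10.0 - chl_combined) / (sd_combined * sqrt(2))))
print("combined estimate: %.1f ug/L (sd %.1f)" % (chl_combined, sd_combined))
print("P(chlorophyll a > 10 ug/L) = %.2f" % p_exceed)
```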

  13. Root Zone Water Quality Model (RZWQM2): Model use, calibration, and validation

    USDA-ARS?s Scientific Manuscript database

    The Root Zone Water Quality Model (RZWQM2) has been used widely for simulating agricultural management effects on crop production and soil and water quality. Although it is a one-dimensional model it has many desirable features for the modeling community. This paper outlines the principles of calibr...

  14. [Quality assurance and total quality management in residential home care].

    PubMed

    Nübling, R; Schrempp, C; Kress, G; Löschmann, C; Neubart, R; Kuhlmey, A

    2004-02-01

    Quality, quality assurance, and quality management have been important topics in residential care homes for several years. However, only as a result of reform processes in the German legislation (long-term care insurance, care quality assurance) is a systematic discussion taking place. Furthermore, initiatives and holistic model projects, which deal with the assessment and improvement of service quality, were developed in the field of care for the elderly. The present article gives a critical overview of essential developments. Different comprehensive approaches such as the implementation of quality management systems, nationwide expert-based initiatives, and developments towards professionalizing care are discussed. Empirically based approaches, especially those emphasizing the assessment of outcome quality, are focused on in this work. Overall, the authors conclude that in the past few years comprehensive efforts have been made to improve the quality of care. However, the current situation still requires much work to establish a nationwide launch and implementation of evidence-based quality assurance and quality management.

  15. Nurse practice environment, workload, burnout, job outcomes, and quality of care in psychiatric hospitals: a structural equation model approach.

    PubMed

    Van Bogaert, Peter; Clarke, Sean; Willems, Riet; Mondelaers, Mieke

    2013-07-01

    To study the relationships between nurse practice environment, workload, burnout, job outcomes and nurse-reported quality of care in psychiatric hospital staff. Nurses' practice environments in general hospitals have been extensively investigated. Potential variations across practice settings, for instance in psychiatric hospitals, have been much less studied. A cross-sectional design with a survey. A structural equation model previously tested in acute hospitals was evaluated using survey data from a sample of 357 registered nurses, licensed practical nurses, and non-registered caregivers from two psychiatric hospitals in Belgium between December 2010-April 2011. The model included paths between practice environment dimensions and outcome variables, with burnout in a mediating position. A workload measure was also tested as a potential mediator between the practice environment and outcome variables. An improved model, slightly modified from the one validated earlier in samples of acute care nurses, was confirmed. This model explained 50% and 38% of the variance in job outcomes and nurse-reported quality of care respectively. In addition, workload was found to play a mediating role in accounting for job outcomes and significantly improved a model that ultimately explained 60% of the variance in these variables. In psychiatric hospitals as in general hospitals, nurse-physician relationship and other organizational dimensions such as nursing and hospital management were closely associated with perceptions of workload and with burnout and job satisfaction, turnover intentions, and nurse-reported quality of care. Mechanisms linking key variables and differences across settings in these relationships merit attention by managers and researchers. © 2012 Blackwell Publishing Ltd.

  16. A Wireless Sensor Network-Based Approach with Decision Support for Monitoring Lake Water Quality.

    PubMed

    Huang, Xiaoci; Yi, Jianjun; Chen, Shaoli; Zhu, Xiaomin

    2015-11-19

    Online monitoring and water quality analysis of lakes are urgently needed. A feasible and effective approach is to use a Wireless Sensor Network (WSN). Lake water environments, like other real-world environments, present many changing and unpredictable situations, so to ensure flexibility the WSN node has to be prepared to deal with varying conditions. This paper presents a WSN self-configuration approach for lake water quality monitoring. The approach is based on the integration of a semantic framework in which a reasoner can make decisions on the configuration of WSN services. We present a WSN ontology and the relevant water quality monitoring context information, designed with their suitability for a pervasive computing environment in mind. We also propose a rule-based reasoning engine that provides decision support through reasoning techniques and context-awareness. To evaluate the approach, we conduct usability experiments and performance benchmarks.

  17. A multi-model assessment of the co-benefits of climate mitigation for global air quality

    NASA Astrophysics Data System (ADS)

    Rao, Shilpa; Klimont, Zbigniew; Leitao, Joana; Riahi, Keywan; van Dingenen, Rita; Aleluia Reis, Lara; Calvin, Katherine; Dentener, Frank; Drouet, Laurent; Fujimori, Shinichiro; Harmsen, Mathijs; Luderer, Gunnar; Heyes, Chris; Strefler, Jessica; Tavoni, Massimo; van Vuuren, Detlef P.

    2016-12-01

    We present a model comparison study that combines multiple integrated assessment models with a reduced-form global air quality model to assess the potential co-benefits of global climate mitigation policies in relation to the World Health Organization (WHO) goals on air quality and health. We include in our assessment a range of alternative assumptions on the implementation of current and planned pollution control policies. The resulting air pollution emission ranges significantly extend those in the Representative Concentration Pathways. Climate mitigation policies complement current efforts on air pollution control through technology and fuel transformations in the energy system. A combination of stringent policies on air pollution control and climate change mitigation results in 40% of the global population being exposed to PM levels below the WHO air quality guideline, with the largest improvements estimated for India, China, and the Middle East. Our results stress the importance of integrated multisector policy approaches to achieve the Sustainable Development Goals.

  18. A quality by design approach to optimization of emulsions for electrospinning using factorial and D-optimal designs.

    PubMed

    Badawi, Mariam A; El-Khordagui, Labiba K

    2014-07-16

    Emulsion electrospinning is a multifactorial process used to generate nanofibers loaded with hydrophilic drugs or macromolecules for diverse biomedical applications. Emulsion electrospinnability is greatly impacted by the emulsion's pharmaceutical attributes. The aim of this study was to apply a quality by design (QbD) approach based on design of experiments as a risk-based proactive approach to achieve predictable critical quality attributes (CQAs) in w/o emulsions for electrospinning. Polycaprolactone (PCL)-thickened w/o emulsions containing doxycycline HCl were formulated using a Span 60/sodium lauryl sulfate (SLS) emulsifier blend. The identified emulsion CQAs (stability, viscosity and conductivity) were linked with electrospinnability using a 3³ factorial design to optimize emulsion composition for phase stability and a D-optimal design to optimize stable emulsions for viscosity and conductivity after shifting the design space. The three independent variables, emulsifier blend composition, organic:aqueous phase ratio and polymer concentration, had a significant effect (p<0.05) on emulsion CQAs, with the emulsifier blend composition exerting prominent main and interaction effects. Scanning electron microscopy (SEM) of emulsion-electrospun NFs and desirability functions allowed modeling of emulsion CQAs to predict electrospinnable formulations. A QbD approach successfully built quality into electrospinnable emulsions, allowing development of hydrophilic drug-loaded nanofibers with desired morphological characteristics. Copyright © 2014 Elsevier B.V. All rights reserved.
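
    The screening design can be sketched as follows: a 3³ full factorial over the three coded factors named in the abstract. The coded levels are generic; the actual factor level values and the D-optimal follow-up design are not reproduced.

```python
from itertools import product

# 3^3 full factorial over the three factors named in the abstract, at coded levels.
factors = {
    "emulsifier_blend_composition": (-1, 0, 1),   # Span 60 : SLS ratio (coded)
    "organic_aqueous_phase_ratio":  (-1, 0, 1),
    "polymer_concentration":        (-1, 0, 1),   # PCL level (coded)
}
design = list(product(*factors.values()))
print("number of runs:", len(design))             # 27 = 3^3
for run in design[:5]:
    print(dict(zip(factors.keys(), run)))
```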

  19. Production system with process quality control: modelling and application

    NASA Astrophysics Data System (ADS)

    Tsou, Jia-Chi

    2010-07-01

    Over the past decade, a great deal of research has been dedicated to the study of quality and the economics of production. In this article, we develop a dynamic model based on the assumptions of a traditional economic production quantity model. Taguchi's quality loss function is used to evaluate the cost of poor quality in the dynamic production system. A practical case from the automotive industry, which uses the Six Sigma DMAIC methodology, is discussed to verify the proposed model. This study shows that there is an optimal value of quality investment that allows the production system to reach a reasonable quality level and minimise the production cost. Based on our model, management can adjust its investment in quality improvement to generate a considerable financial return.
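
    The cost trade-off described above can be sketched as follows, assuming a quadratic (Taguchi) loss on deviation from target and an assumed relationship between quality investment and process variance; the numbers are illustrative, not the article's case-study values.

```python
import numpy as np

# Trade-off between quality investment and Taguchi's quadratic quality loss.
k_loss = 50.0                 # loss coefficient ($ per squared unit of deviation), assumed

def expected_cost(investment, base_sigma=0.8, unit_cost=2.0):
    """Per-unit cost = production cost + quality investment + expected Taguchi loss."""
    sigma = base_sigma / (1.0 + investment)       # assumed variance-reduction response
    expected_loss = k_loss * sigma**2             # E[k (y - target)^2] for an on-target process
    return unit_cost + investment + expected_loss

investments = np.linspace(0, 10, 1001)
costs = np.array([expected_cost(i) for i in investments])
best = investments[np.argmin(costs)]
print("optimal per-unit quality investment: %.2f (total cost %.2f)" % (best, costs.min()))
```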

  20. System Behavior Models: A Survey of Approaches

    DTIC Science & Technology

    2016-06-01

    Master's thesis by Scott R. Ruppel, June 2016 (Thesis Advisor: Kristin Giammarco; Second Reader: John M. Green), 85 pages. Keywords: Monterey Phoenix, Petri nets, behavior modeling, model-based systems engineering, modeling approaches, modeling survey.

  1. A nonlinear quality-related fault detection approach based on modified kernel partial least squares.

    PubMed

    Jiao, Jianfang; Zhao, Ning; Wang, Guang; Yin, Shen

    2017-01-01

    In this paper, a new nonlinear quality-related fault detection method is proposed based on a kernel partial least squares (KPLS) model. To deal with the nonlinear characteristics among process variables, the proposed method maps the original variables into a feature space in which the linear relationship between the kernel matrix and the output matrix is established by means of KPLS. The kernel matrix is then decomposed into two orthogonal parts by singular value decomposition (SVD), and the statistics for each part are determined appropriately for the purpose of quality-related fault detection. Compared with relevant existing nonlinear approaches, the proposed method has the advantages of simple diagnosis logic and stable performance. A widely used literature example and an industrial process are used for the performance evaluation of the proposed method. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
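
    A heavily simplified sketch of the general KPLS monitoring idea, not the authors' algorithm: map process data into a kernel space, extract PLS components that predict the quality variable, and monitor new samples with a T²-type statistic on those quality-related scores. The SVD-based decomposition into orthogonal quality-related and quality-unrelated parts is not reproduced, and all data are synthetic.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics.pairwise import rbf_kernel

# Normal operating data and a nonlinear quality variable (synthetic).
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 6))
y = (X[:, 0] ** 2 + 0.5 * X[:, 1]).reshape(-1, 1)

gamma = 0.1
K = rbf_kernel(X, X, gamma=gamma)
N = len(X)
H = np.eye(N) - np.ones((N, N)) / N
Kc = H @ K @ H                                    # double-centered kernel matrix

pls = PLSRegression(n_components=3).fit(Kc, y)
T = pls.x_scores_                                 # quality-related latent scores
S_inv = np.linalg.inv(np.cov(T, rowvar=False))

# T^2 of the training samples and an empirical 99% control limit.
T2_train = np.einsum("ij,jk,ik->i", T, S_inv, T)
limit = np.quantile(T2_train, 0.99)

# Score a new sample: center its kernel row against the training kernel, project, test.
x_new = np.array([4.0, 4.0, 0.0, 0.0, 0.0, 0.0])  # shift in the quality-driving variables
k_new = rbf_kernel(x_new.reshape(1, -1), X, gamma=gamma)
k_new_c = k_new - k_new.mean() - K.mean(axis=0) + K.mean()
t_new = pls.transform(k_new_c)
t2_new = float(t_new @ S_inv @ t_new.T)
print("99%% T^2 limit: %.2f, new sample T^2: %.2f -> %s"
      % (limit, t2_new, "alarm" if t2_new > limit else "no alarm"))
```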

  2. A GIS-based multi-source and multi-box modeling approach (GMSMB) for air pollution assessment--a North American case study.

    PubMed

    Wang, Bao-Zhen; Chen, Zhi

    2013-01-01

    This article presents a GIS-based multi-source and multi-box modeling approach (GMSMB) to predict the spatial concentration distributions of airborne pollutants on local and regional scales. In this method, an extended multi-box model combined with a multi-source and multi-grid Gaussian model is developed within the GIS framework to examine the contributions from both point- and area-source emissions. By using GIS, a large amount of data including emission sources, air quality monitoring, meteorological data, and spatial location information required for air quality modeling are brought into an integrated modeling environment. This allows the spatial variation in source distribution and meteorological conditions to be analyzed quantitatively and in greater detail. The developed modeling approach has been examined to predict the spatial concentration distribution of four air pollutants (CO, NO₂, SO₂ and PM2.5) for the State of California. The modeling results are compared with the monitoring data. Good agreement was obtained, demonstrating that the developed modeling approach could deliver an effective air pollution assessment on both regional and local scales to support air pollution control and management planning.

  3. A quality improvement approach to capacity building in low- and middle-income countries.

    PubMed

    Bardfield, Joshua; Agins, Bruce; Akiyama, Matthew; Basenero, Apollo; Luphala, Patience; Kaindjee-Tjituka, Francina; Natanael, Salomo; Hamunime, Ndapewa

    2015-07-01

    To describe the HEALTHQUAL framework consisting of the following three components: performance measurement, quality improvement and the quality management program, representing an adaptive approach to building capacity in national quality management programs in low and middle-income countries. We present a case study from Namibia illustrating how this approach is adapted to country context. HEALTHQUAL partners with Ministries of Health to build knowledge and expertise in modern improvement methods, including data collection, analysis and reporting, process analysis and the use of data to implement quality improvement projects that aim to improve systems and processes of care. Clinical performance measures are selected in each country by the Ministry of Health on the basis of national guidelines. Patient records are sampled using a standardized statistical table to achieve a minimum confidence interval of 90%, with a spread of ±8% in participating facilities. Data are routinely reviewed to identify gaps in patient care, and aggregated to produce facility mean scores that are trended over time. A formal organizational assessment is conducted at facility and national levels to review the implementation progress. Aggregate mean rates of performance for 10 of 11 indicators of HIV care improved for adult HIV-positive patients between 2008 and 2013. Quality improvement is an approach to capacity building and health systems strengthening that offers adaptive methodology. Synergistic implementation of elements of a national quality program can lead to improvements in care, in parallel with systematic capacity development for measurement, improvement and quality management throughout the healthcare delivery system.

  4. Surgical quality assessment. A simplified approach.

    PubMed

    DeLong, D L

    1991-10-01

    The current approach to QA primarily involves taking action when problems are discovered and designing a documentation system that records the delivery of quality care. Involving the entire staff helps eliminate problems before they occur. By keeping abreast of current problems and soliciting input from staff members, the QA program at our hospital has improved dramatically. The cross-referencing of JCAHO and AORN standards on the assessment form and the single-sheet reporting form expedite the evaluation process and simplify record keeping. The bulletin board increases staff members' understanding of QA and boosts morale and participation. A sound and effective QA program does not require reorganizing an entire department, nor should it carry negative connotations. Developing an effective QA program merely requires rethinking current processes. The program must meet the department's specific needs, and although many departments concentrate on documentation, auditing charts does not give a complete picture of the quality of care delivered. The QA committee must employ a variety of data collection methods on multiple indicators to ensure an accurate representation of the care delivered, and they must not overlook any issues that directly affect patient outcomes.

  5. Minnesota 4-H Youth Program Quality Improvement Model

    ERIC Educational Resources Information Center

    Herman, Margo; Grant, Samantha

    2015-01-01

    The University of Minnesota Extension Center for Youth Development made an organizational decision in 2011 to invest in a system-wide approach to implement youth program quality into the 4-H program using the Youth Program Quality Assessment (YPQA) tool. This article describes the four key components to the Minnesota Youth Program Quality…

  6. Quality Assurance of Cancer Study Common Data Elements Using A Post-Coordination Approach

    PubMed Central

    Jiang, Guoqian; Solbrig, Harold R.; Prud’hommeaux, Eric; Tao, Cui; Weng, Chunhua; Chute, Christopher G.

    2015-01-01

    Domain-specific common data elements (CDEs) are emerging as an effective approach to standards-based clinical research data storage and retrieval. A limiting factor, however, is the lack of robust automated quality assurance (QA) tools for the CDEs in clinical study domains. The objectives of the present study are to prototype and evaluate a QA tool for the study of cancer CDEs using a post-coordination approach. The study starts by integrating the NCI caDSR CDEs and The Cancer Genome Atlas (TCGA) data dictionaries in a single Resource Description Framework (RDF) data store. We designed a compositional expression pattern based on the Data Element Concept model structure informed by ISO/IEC 11179, and developed a transformation tool that converts the pattern-based compositional expressions into the Web Ontology Language (OWL) syntax. Invoking reasoning and explanation services, we tested the system utilizing the CDEs extracted from two TCGA clinical cancer study domains. The system could automatically identify duplicate CDEs, and detect CDE modeling errors. In conclusion, compositional expressions not only enable reuse of existing ontology codes to define new domain concepts, but also provide an automated mechanism for QA of terminological annotations for CDEs. PMID:26958201

  7. The administrative and clinical rationale for the total organization approach to continuous quality improvement.

    PubMed

    Jones, D J; Ziegenfuss, J T

    1993-01-01

    In our view TQM and CQI represent important innovations in the continuing effort to develop higher performance organizations. Never before has the need been so great to improve quality while at the same time constraining, or reducing, costs. An increasing number of health care organizations can document their experiences that as quality goes up, costs can come down. The contribution of these new approaches is in some sense the wedding of many long established methodologies--the scientific method, statistical quality control, planning, joint problem solving, participative management, and empowerment of the work force. While this recognition could lend support to those who label this new model a fad, that perception denies the linkage of TQM/CQI to the greater stream of innovations pushing us toward ever-greater organizational excellence. Can we not take the philosophy and methods that are potentially useful and try them experimentally? Let our empirical tests tell us of their contribution. We believe the concepts and procedures of TQM/CQI will help us to be better in years to come, even though we highly respect our starting point.

  8. Taking a 'Big Data' approach to data quality in a citizen science project.

    PubMed

    Kelling, Steve; Fink, Daniel; La Sorte, Frank A; Johnston, Alison; Bruns, Nicholas E; Hochachka, Wesley M

    2015-11-01

    Data from well-designed experiments provide the strongest evidence of causation in biodiversity studies. However, for many species the collection of these data is not scalable to the spatial and temporal extents required to understand patterns at the population level. Only citizen science projects can gather sufficient quantities of data, but data collected from volunteers are inherently noisy and heterogeneous. Here we describe a 'Big Data' approach to improve the data quality in eBird, a global citizen science project that gathers bird observations. First, eBird's data submission design ensures that all data meet high standards of completeness and accuracy. Second, we take a 'sensor calibration' approach to measure individual variation in eBird participants' ability to detect and identify birds. Third, we use species distribution models to fill in data gaps. Finally, we provide examples of novel analyses exploring population-level patterns in bird distributions.

  9. A Systems Engineering Approach to Quality Assurance for Aerospace Testing

    NASA Technical Reports Server (NTRS)

    Shepherd, Christena C.

    2015-01-01

    On the surface, it appears that AS9100 has little to say about how to apply a Quality Management System (QMS) to major aerospace test programs (or even smaller ones). It also appears that there is little in the quality engineering Body of Knowledge (BOK) that applies to testing, unless it is nondestructive examination (NDE), or some type of lab or bench testing associated with the manufacturing process. However, if one examines: a) how the systems engineering (SE) processes are implemented throughout a test program; and b) how these SE processes can be mapped to the requirements of AS9100, a number of areas for involvement of the quality professional are revealed. What often happens is that quality assurance during a test program is limited to inspections of the test article; what could be considered a manufacturing al fresco approach. This limits the quality professional and is a disservice to the programs and projects, since there are a number of ways that quality can enhance critical processes, and support efforts to improve risk reduction, efficiency and effectiveness.

  10. The Atlanta Urban Heat Island Mitigation and Air Quality Modeling Project: How High-Resolution Remote Sensing Data Can Improve Air Quality Models

    NASA Technical Reports Server (NTRS)

    Quattrochi, Dale A.; Estes, Maurice G., Jr.; Crosson, William L.; Khan, Maudood N.

    2006-01-01

    The Atlanta Urban Heat Island and Air Quality Project had its genesis in Project ATLANTA (ATlanta Land use Analysis: Temperature and Air quality) that began in 1996. Project ATLANTA examined how high-spatial resolution thermal remote sensing data could be used to derive better measurements of the Urban Heat Island effect over Atlanta. We have explored how these thermal remote sensing data, as well as other image datasets, can be used to better characterize the urban landscape for improved air quality modeling over the Atlanta area. For the air quality modeling project, the National Land Cover Dataset and the local-scale Landpro99 dataset at 30 m spatial resolution have been used to derive land use/land cover characteristics for input into the MM5 mesoscale meteorological model that is one of the foundations for the Community Multiscale Air Quality (CMAQ) model to assess how these data can improve output from CMAQ. Additionally, land use changes to 2030 have been predicted using a Spatial Growth Model (SGM). SGM simulates growth around a region using population, employment and travel demand forecasts. Air quality modeling simulations were conducted using both current and future land cover. Meteorological modeling simulations indicate a 0.5 °C increase in daily maximum air temperatures by 2030. Air quality modeling simulations show substantial differences in relative contributions of individual atmospheric pollutant constituents as a result of land cover change. Enhanced boundary layer mixing over the city tends to offset the increase in ozone concentration expected due to higher surface temperatures as a result of urbanization.

  11. Prediction of Indoor Air Exposure from Outdoor Air Quality Using an Artificial Neural Network Model for Inner City Commercial Buildings.

    PubMed

    Challoner, Avril; Pilla, Francesco; Gill, Laurence

    2015-12-01

    NO₂ and particulate matter are the air pollutants of most concern in Ireland, with possible links to the higher respiratory and cardiovascular mortality and morbidity rates found in the country compared to the rest of Europe. Currently, air quality limits in Europe only cover outdoor environments, yet the quality of indoor air is an essential determinant of a person's well-being, especially since the average person spends more than 90% of their time indoors. The modelling conducted in this research aims to provide a framework for epidemiological studies by the use of publicly available data from fixed outdoor monitoring stations to predict indoor air quality more accurately. Predictions are made using two modelling techniques, the Personal-exposure Activity Location Model (PALM), to predict outdoor air quality at a particular building, and Artificial Neural Networks, to model the indoor/outdoor relationship of the building. This joint approach has been used to predict indoor air concentrations for three inner city commercial buildings in Dublin, where parallel indoor and outdoor diurnal monitoring had been carried out on site. This modelling methodology has been shown to provide reasonable predictions of average NO₂ indoor air quality compared to the monitored data, but did not perform well in the prediction of indoor PM2.5 concentrations. Hence, this approach could be used to determine more rigorously the NO₂ exposures of those who work and/or live in the city centre, which can then be linked to potential health impacts.
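
    A minimal sketch of the second modelling stage described above (the indoor/outdoor neural network), using scikit-learn's MLPRegressor rather than the authors' implementation; the outdoor NO₂, meteorology and indoor response below are all invented.

```python
# Illustrative indoor/outdoor ANN: map outdoor NO2 and simple meteorology at a fixed
# monitor to indoor NO2 in one building.  Features, coefficients and noise levels
# are invented; this is not the study's PALM/ANN code.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
n = 2000
outdoor_no2 = rng.gamma(shape=4.0, scale=10.0, size=n)      # ug/m3
wind_speed = rng.gamma(shape=2.0, scale=2.0, size=n)        # m/s
hour = rng.integers(0, 24, size=n).astype(float)

indoor_no2 = (0.6 * outdoor_no2 / (1 + 0.1 * wind_speed)    # attenuated outdoor signal
              + 5.0 * ((hour >= 8) & (hour <= 18))          # hypothetical daytime indoor source
              + rng.normal(0, 3, size=n))

X = np.column_stack([outdoor_no2, wind_speed, hour])
X_tr, X_te, y_tr, y_te = train_test_split(X, indoor_no2, random_state=0)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0))
model.fit(X_tr, y_tr)
print("held-out R^2:", round(model.score(X_te, y_te), 3))
```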

  12. A multiple objective optimization approach to quality control

    NASA Technical Reports Server (NTRS)

    Seaman, Christopher Michael

    1991-01-01

    The use of product quality as the performance criterion for manufacturing system control is explored. The goal in manufacturing, for economic reasons, is to optimize product quality. The problem is that since quality is a rather nebulous product characteristic, there is seldom an analytic function that can be used as a measure. Therefore standard control approaches, such as optimal control, cannot readily be applied. A second problem with optimizing product quality is that it is typically measured along many dimensions: there are many aspects of quality which must be optimized simultaneously. Very often these different aspects are incommensurate and competing. The concept of optimality must now include accepting tradeoffs among the different quality characteristics. These problems are addressed using multiple objective optimization. It is shown that the quality control problem can be defined as a multiple objective optimization problem. A controller structure is defined using this as the basis. Then, an algorithm is presented which can be used by an operator to interactively find the best operating point. Essentially, the algorithm uses process data to provide the operator with two pieces of information: (1) if it is possible to simultaneously improve all quality criteria, then determine what changes to the process input or controller parameters should be made to do this; and (2) if it is not possible to improve all criteria, and the current operating point is not a desirable one, select a criterion on which a tradeoff should be made, and make input changes to improve all other criteria. The process is not operating at an optimal point in any sense if no tradeoff has to be made to move to a new operating point. This algorithm ensures that operating points are optimal in some sense and provides the operator with information about tradeoffs when seeking the best operating point. The multiobjective algorithm was implemented in two different injection molding scenarios.
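
    One building block of the interactive procedure described above is deciding whether any candidate operating point improves all quality criteria at once; a minimal, hypothetical sketch of that check is a Pareto-dominance filter over candidate points.

```python
# Minimal Pareto filter: keep operating points for which no other point is at least
# as good on every quality criterion and strictly better on one.  Criteria values
# are hypothetical scores where larger is better.
import numpy as np

def pareto_front(quality):
    """Indices of non-dominated rows in a (points x criteria) array."""
    keep = []
    for i, q in enumerate(quality):
        dominated = np.any(np.all(quality >= q, axis=1) & np.any(quality > q, axis=1))
        if not dominated:
            keep.append(i)
    return keep

candidates = np.array([
    [0.90, 0.40],   # e.g. [surface finish score, dimensional accuracy score]
    [0.70, 0.70],
    [0.60, 0.65],   # dominated by the point above
    [0.40, 0.90],
])
print(pareto_front(candidates))   # -> [0, 1, 3]
```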

  13. A new approach in the development of quality management systems for (micro)electronics

    NASA Astrophysics Data System (ADS)

    Bacivarov, Ioan C.; Bacivarov, Angelica; Gherghina, Cătălina

    2016-12-01

    This paper presents a new approach to the analysis of the Quality Management Systems (QMS) of companies, based on the revised standard ISO 9001:2015. In the first part of the paper, QMS based on ISO 9001 certification are introduced; the changes and the updates proposed for the new version of ISO 9001:2015 are critically analyzed, based on the documents elaborated by ISO/TC 176. The approach based on ISO 9001:2015 could be considered as “the beginning of a new era in the development of quality management systems”. A comparison between the “old” standard ISO 9001:2008 and the “new” standard ISO 9001:2015 is made. In the second part of the paper, steps to be followed in a company to implement this new standard are presented. Particular attention is given to the new concept of risk-based thinking in order to support and improve application of the process-based approach. The authors conclude that, by considering risk throughout the organization, the likelihood of achieving stated objectives is improved, output is more consistent and customers can be confident that they will receive the expected results. Finally, the benefits of the new approach in the development of quality management systems are outlined, as well as how they are reflected in the management of companies in general and those in the electronics field in particular. As demonstrated in this paper, well understood and properly applied, the new approach based on the revised standard ISO 9001:2015 could offer better quality management for companies operating in electronics and beyond.

  14. INDOOR AIR QUALITY MODELING (CHAPTER 58)

    EPA Science Inventory

    The chapter discussses indoor air quality (IAQ) modeling. Such modeling provides a way to investigate many IAQ problems without the expense of large field experiments. Where experiments are planned, IAQ models can be used to help design experiments by providing information on exp...

  15. Quality Assurance Model for Digital Adult Education Materials

    ERIC Educational Resources Information Center

    Dimou, Helen; Kameas, Achilles

    2016-01-01

    Purpose: This paper aims to present a model for the quality assurance of digital educational material that is appropriate for adult education. The proposed model adopts the software quality standard ISO/IEC 9126 and takes into account adult learning theories, Bloom's taxonomy of learning objectives and two instructional design models: Kolb's model…

  16. Pharmaceutical product development: A quality by design approach

    PubMed Central

    Pramod, Kannissery; Tahir, M. Abu; Charoo, Naseem A.; Ansari, Shahid H.; Ali, Javed

    2016-01-01

    The application of quality by design (QbD) in pharmaceutical product development is now a thrust area for the regulatory authorities and the pharmaceutical industry. The International Conference on Harmonization and the United States Food and Drug Administration (USFDA) emphasized the principles and applications of QbD in pharmaceutical development in their guidance for the industry. QbD attributes are addressed in question-based review, developed by USFDA for the chemistry, manufacturing, and controls section of abbreviated new drug applications. QbD principles, when implemented, lead to successful product development and prompt regulatory approval, reduce the exhaustive validation burden, and significantly reduce post-approval changes. The key elements of QbD, viz. target product quality profile, critical quality attributes, risk assessments, design space, control strategy, product lifecycle management, and continual improvement, are discussed to understand the performance of dosage forms within the design space. Design of experiments, risk assessment tools, and process analytical technology are also discussed for their role in QbD. This review underlines the importance of QbD in inculcating a science-based approach in pharmaceutical product development. PMID:27606256

  17. Pharmaceutical product development: A quality by design approach.

    PubMed

    Pramod, Kannissery; Tahir, M Abu; Charoo, Naseem A; Ansari, Shahid H; Ali, Javed

    2016-01-01

    The application of quality by design (QbD) in pharmaceutical product development is now a thrust area for the regulatory authorities and the pharmaceutical industry. The International Conference on Harmonization and the United States Food and Drug Administration (USFDA) emphasized the principles and applications of QbD in pharmaceutical development in their guidance for the industry. QbD attributes are addressed in question-based review, developed by USFDA for the chemistry, manufacturing, and controls section of abbreviated new drug applications. QbD principles, when implemented, lead to successful product development and prompt regulatory approval, reduce the exhaustive validation burden, and significantly reduce post-approval changes. The key elements of QbD, viz. target product quality profile, critical quality attributes, risk assessments, design space, control strategy, product lifecycle management, and continual improvement, are discussed to understand the performance of dosage forms within the design space. Design of experiments, risk assessment tools, and process analytical technology are also discussed for their role in QbD. This review underlines the importance of QbD in inculcating a science-based approach in pharmaceutical product development.

  18. Performance of the Hydrological Portion of a Simple Water Quality Model in Different Climatic Regions

    NASA Astrophysics Data System (ADS)

    Moore, K.; Pierson, D.; Pettersson, K.; Naden, P.; Allott, N.; Jennings, E.; Tamm, T.; Järvet, A.; Nickus, U.; Thies, H.; Arvola, L.; Järvinen, M.; Schneiderman, E.; Zion, M.; Lounsbury, D.

    2004-05-01

    We are applying an existing watershed model in the EU CLIME (Climate and Lake Impacts in Europe) project to evaluate the effects of weather on seasonal and annual delivery of N, P, and DOC to lakes. Model calibration is based on long-term records of weather and water quality data collected from sites in different climatic regions spread across Europe and in New York State. The overall aim of the CLIME project is to develop methods and models to support lake and catchment management under current climate conditions and make predictions under future climate scenarios. Scientists from 10 partner countries are collaborating on developing a consistent approach to defining model parameters for the Generalized Watershed Loading Functions (GWLF) model, one of a larger suite of models used in the project. An example of the approach for the hydrological portion of the GWLF model will be presented, with consideration of the balance between model simplicity, ease of use, data requirements, and realistic predictions.

  19. Linked Hydrologic-Hydrodynamic Model Framework to Forecast Impacts of Rivers on Beach Water Quality

    NASA Astrophysics Data System (ADS)

    Anderson, E. J.; Fry, L. M.; Kramer, E.; Ritzenthaler, A.

    2014-12-01

    The goal of NOAA's beach quality forecasting program is to use a multi-faceted approach to aid in detection and prediction of bacteria in recreational waters. In particular, our focus has been on the connection between tributary loads and bacteria concentrations at nearby beaches. While there is a clear link between stormwater runoff and beach water quality, quantifying the contribution of river loadings to nearshore bacterial concentrations is complicated due to multiple processes that drive bacterial concentrations in rivers as well as those processes affecting the fate and transport of bacteria upon exiting the rivers. In order to forecast potential impacts of rivers on beach water quality, we developed a linked hydrologic-hydrodynamic water quality framework that simulates accumulation and washoff of bacteria from the landscape, and then predicts the fate and transport of washed off bacteria from the watershed to the coastal zone. The framework includes a watershed model (IHACRES) to predict fecal indicator bacteria (FIB) loadings to the coastal environment (accumulation, wash-off, die-off) as a function of effective rainfall. These loadings are input into a coastal hydrodynamic model (FVCOM), including a bacteria transport model (Lagrangian particle), to simulate 3D bacteria transport within the coastal environment. This modeling system provides predictive tools to assist local managers in decision-making to reduce human health threats.
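
    A heavily simplified sketch of the accumulation/wash-off/die-off chain described above (not the IHACRES/FVCOM implementation): bacteria accumulate on the landscape between storms, a rainfall-dependent fraction washes off each day, and the surface store decays. All rates are illustrative.

```python
# Toy buildup/wash-off model for fecal indicator bacteria (FIB) loading to a river.
# Rate constants and the rainfall series are illustrative placeholders.
import numpy as np

days = 60
rain = np.zeros(days)
rain[[10, 11, 30, 45]] = [12.0, 6.0, 20.0, 8.0]   # effective rainfall (mm/day)

accum_rate = 1.0e9        # FIB added to the surface store each day (cfu/day)
decay = 0.10              # first-order die-off of the surface store (1/day)
washoff_coef = 0.05       # fraction of the store washed off per mm of effective rain

store, load = 0.0, np.zeros(days)
for t in range(days):
    store = store * np.exp(-decay) + accum_rate        # die-off then accumulation
    washed_fraction = min(1.0, washoff_coef * rain[t])
    load[t] = store * washed_fraction                  # FIB delivered to the coastal model
    store -= load[t]

print(f"peak daily FIB load {load.max():.2e} cfu/day on day {int(load.argmax())}")
```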

  20. A Bayesian joint probability modeling approach for seasonal forecasting of streamflows at multiple sites

    NASA Astrophysics Data System (ADS)

    Wang, Q. J.; Robertson, D. E.; Chiew, F. H. S.

    2009-05-01

    Seasonal forecasting of streamflows can be highly valuable for water resources management. In this paper, a Bayesian joint probability (BJP) modeling approach for seasonal forecasting of streamflows at multiple sites is presented. A Box-Cox transformed multivariate normal distribution is proposed to model the joint distribution of future streamflows and their predictors such as antecedent streamflows and El Niño-Southern Oscillation indices and other climate indicators. Bayesian inference of model parameters and uncertainties is implemented using Markov chain Monte Carlo sampling, leading to joint probabilistic forecasts of streamflows at multiple sites. The model provides a parametric structure for quantifying relationships between variables, including intersite correlations. The Box-Cox transformed multivariate normal distribution has considerable flexibility for modeling a wide range of predictors and predictands. The Bayesian inference formulated allows the use of data that contain nonconcurrent and missing records. The model flexibility and data-handling ability means that the BJP modeling approach is potentially of wide practical application. The paper also presents a number of statistical measures and graphical methods for verification of probabilistic forecasts of continuous variables. Results for streamflows at three river gauges in the Murrumbidgee River catchment in southeast Australia show that the BJP modeling approach has good forecast quality and that the fitted model is consistent with observed data.
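
    Under strong simplifications (one predictor, one predictand, and moment estimates instead of the Bayesian MCMC inference the paper describes), the core of the BJP idea can be sketched as: Box-Cox transform both variables, fit a bivariate normal, and forecast from the conditional distribution before back-transforming. All data below are synthetic.

```python
# Minimal BJP-flavoured sketch: Box-Cox transform antecedent and future streamflow,
# fit a bivariate normal by moments, and issue a probabilistic forecast from the
# conditional distribution.  Data are synthetic; the paper uses MCMC over parameters.
import numpy as np
from scipy import stats
from scipy.special import inv_boxcox

rng = np.random.default_rng(7)
antecedent = rng.gamma(3.0, 50.0, size=80)                    # GL, predictor season
future = 0.6 * antecedent + rng.gamma(2.0, 30.0, size=80)     # GL, forecast season

x, lam_x = stats.boxcox(antecedent)
y, lam_y = stats.boxcox(future)
mu = np.array([x.mean(), y.mean()])
cov = np.cov(x, y)

x_new = stats.boxcox(np.array([150.0]), lmbda=lam_x)[0]       # new predictor value
slope = cov[0, 1] / cov[0, 0]
cond_mean = mu[1] + slope * (x_new - mu[0])
cond_sd = np.sqrt(cov[1, 1] - slope * cov[0, 1])

forecast = inv_boxcox(rng.normal(cond_mean, cond_sd, size=5000), lam_y)
print("median and 90% interval (GL):", np.round(np.percentile(forecast, [50, 5, 95]), 1))
```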

  1. An innovative modeling approach using Qual2K and HEC-RAS integration to assess the impact of tidal effect on River Water quality simulation.

    PubMed

    Fan, Chihhao; Ko, Chun-Han; Wang, Wei-Shen

    2009-04-01

    Water quality modeling has been shown to be a useful tool in strategic water quality management. The present study combines the Qual2K model with the HEC-RAS model to assess the water quality of a tidal river in northern Taiwan. The contaminant loadings of biochemical oxygen demand (BOD), ammonia nitrogen (NH₃-N), total phosphorus (TP), and sediment oxygen demand (SOD) are utilized in the Qual2K simulation. The HEC-RAS model is used to: (i) estimate the hydraulic constants for atmospheric re-aeration constant calculation; and (ii) calculate the water level profile variation to account for concentration changes as a result of the tidal effect. The results show that HEC-RAS-assisted Qual2K simulations taking the tidal effect into consideration produce water quality indices that, in general, agree with the monitoring data of the river. Comparisons of simulations with different combinations of contaminant loadings demonstrate that BOD is the most important contaminant. Streeter-Phelps simulation (in combination with HEC-RAS) is also performed for comparison, and the results show excellent agreement with the observed data. This paper is the first report of the innovative use of a combination of the HEC-RAS model and the Qual2K model (or Streeter-Phelps equation) to simulate water quality in a tidal river. The combination is shown to provide an alternative for water quality simulation of a tidal river when available dynamic-monitoring data are insufficient to assess the tidal effect of the river.
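
    For context on the simplified Streeter-Phelps comparison mentioned above, the classical dissolved-oxygen sag below a BOD discharge can be written down directly; the rate constants and saturation value here are illustrative, not the study's calibrated values (in the study the re-aeration rate was derived from HEC-RAS hydraulic constants).

```python
# Classical Streeter-Phelps dissolved-oxygen deficit along travel time below a BOD
# discharge.  Coefficients are illustrative, not calibrated to the Taiwanese river.
import numpy as np

def do_deficit(t, L0, D0, kd, ka):
    """DO deficit (mg/L) at travel time t (days); assumes ka != kd.

    L0: initial ultimate BOD (mg/L); D0: initial deficit (mg/L);
    kd: deoxygenation rate (1/day); ka: atmospheric re-aeration rate (1/day).
    """
    return (kd * L0 / (ka - kd)) * (np.exp(-kd * t) - np.exp(-ka * t)) + D0 * np.exp(-ka * t)

t = np.linspace(0.0, 5.0, 101)                       # days of travel time
deficit = do_deficit(t, L0=20.0, D0=1.0, kd=0.35, ka=0.9)
do_sat = 8.0                                         # illustrative saturation DO (mg/L)
print("minimum DO %.2f mg/L at %.2f days" % ((do_sat - deficit).min(), t[deficit.argmax()]))
```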

  2. Load-based approaches for modelling visual clarity in streams at regional scale.

    PubMed

    Elliott, A H; Davies-Colley, R J; Parshotam, A; Ballantine, D

    2013-01-01

    Reduction of visual clarity in streams by diffuse sources of fine sediment is a cause of water quality impairment in New Zealand and internationally. In this paper we introduce the concept of a load of optical cross section (LOCS), which can be used for load-based management of light-attenuating substances and for water quality models that are based on mass accounting. In this approach, the beam attenuation coefficient (units of m(-1)) is estimated from the inverse of the visual clarity (units of m) measured with a black disc. This beam attenuation coefficient can also be considered as an optical cross section (OCS) per volume of water, analogous to a concentration. The instantaneous 'flux' of cross section is obtained from the attenuation coefficient multiplied by the water discharge, and this can be accumulated over time to give an accumulated 'load' of cross section (LOCS). Moreover, OCS is a conservative quantity, in the sense that the OCS of two combined water volumes is the sum of the OCS of the individual water volumes (barring effects such as coagulation, settling, or sorption). The LOCS can be calculated for a water quality station using rating curve methods applied to measured time series of visual clarity and flow. This approach was applied to the sites in New Zealand's National Rivers Water Quality Network (NRWQN). Although the attenuation coefficient follows roughly a power relation with flow at some sites, more flexible loess rating curves are required at other sites. The hybrid mechanistic-statistical catchment model SPARROW (SPAtially Referenced Regressions On Watershed attributes), which is based on a mass balance for mean annual load, was then applied to the NRWQN dataset. Preliminary results from this model are presented, highlighting the importance of factors related to erosion, such as rainfall, slope, hardness of catchment rock types, and the influence of pastoral development on the load of optical cross section.
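
    A small sketch of the LOCS bookkeeping as described above: attenuation is taken as the inverse of black-disc visual clarity, multiplied by discharge to give an instantaneous flux of optical cross section, and integrated over time. The short time series is invented.

```python
# Load of optical cross section (LOCS) from paired clarity/discharge observations.
# The 24-hour series is an invented example.
import numpy as np

time_h = np.array([0.0, 6.0, 12.0, 18.0, 24.0])      # hours
clarity_m = np.array([1.8, 1.2, 0.6, 0.9, 1.5])      # black-disc visual clarity (m)
discharge = np.array([3.0, 5.0, 12.0, 8.0, 4.0])     # m3/s

attenuation = 1.0 / clarity_m                        # optical cross section per volume (m2/m3)
flux = attenuation * discharge                       # instantaneous flux of cross section (m2/s)

dt_s = np.diff(time_h) * 3600.0                      # trapezoidal integration over the day
locs = np.sum(0.5 * (flux[1:] + flux[:-1]) * dt_s)
print(f"LOCS over 24 h: {locs:.3e} m2")
```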

  3. A System Dynamics Model to Study the Importance of Infrastructure Facilities on Quality of Primary Education System in Developing Countries

    NASA Astrophysics Data System (ADS)

    Pedamallu, Chandra Sekhar; Ozdamar, Linet; Weber, Gerhard-Wilhelm; Kropat, Erik

    2010-06-01

    The system dynamics approach is a holistic way of solving problems in real-time scenarios. This is a powerful methodology and computer simulation modeling technique for framing, analyzing, and discussing complex issues and problems. System dynamics modeling and simulation is often the background of a systemic thinking approach and has become a management and organizational development paradigm. This paper proposes a system dynamics approach for studying the impact of infrastructure facilities on the quality of the primary education system in developing nations. The model is proposed to be built using the Cross Impact Analysis (CIA) method of relating entities and attributes relevant to the primary education system in any given community. We offer a survey to build the cross-impact correlation matrix and, hence, to better understand the primary education system and the impact of infrastructural facilities on the quality of primary education. The resulting model enables us to predict the effects of infrastructural facilities on the community's access to primary education. This may support policy makers in taking more effective actions in campaigns.

  4. Revisiting Cyberbullying in Schools Using the Quality Circle Approach

    ERIC Educational Resources Information Center

    Paul, Simone; Smith, Peter K.; Blumberg, Herbert H.

    2012-01-01

    An earlier study reported the use of Quality Circles (QC) in a UK school in the context of understanding and reducing bullying and cyberbullying. Here, we report further work in the same school setting. The QC approach allows explorative analysis of problems in school settings, whereby students embark on a problem-solving exercise over a period of…

  5. Approaches to quality improvement in nursing homes: Lessons learned from the six-state pilot of CMS's Nursing Home Quality Initiative

    PubMed Central

    Kissam, Stephanie; Gifford, David; Parks, Peggy; Patry, Gail; Palmer, Laura; Wilkes, Linda; Fitzgerald, Matthew; Petrulis, Alice Stollenwerk; Barnette, Leslie

    2003-01-01

    Background In November 2002, the Centers for Medicare & Medicaid Services (CMS) launched a Nursing Home Quality Initiative that included publicly reporting a set of Quality Measures for all nursing homes in the country, and providing quality improvement assistance to nursing homes nationwide. A pilot of this initiative occurred in six states for six months prior to the launch. Methods Review and analysis of the lessons learned from the six Quality Improvement Organizations (QIOs) that led quality improvement efforts in nursing homes from the six pilot states. Results QIOs in the six pilot states found several key outcomes of the Nursing Home Quality Initiative that help to maximize the potential of public reporting to leverage effective improvement in nursing home quality of care. First, public reporting focuses the attention of all stakeholders in the nursing home industry on achieving good quality outcomes on a defined set of measures, and creates an incentive for partnership formation. Second, publicly reported quality measures motivate nursing home providers to improve in certain key clinical areas, and in particular to seek out new ways of changing processes of care, such as engaging physicians and the medical director more directly. Third, the lessons learned by QIOs in the pilot of this Initiative indicate that certain approaches to providing quality improvement assistance are key to guiding nursing home providers' desire and enthusiasm to improve towards using a systematic approach to quality improvement. Conclusion The Nursing Home Quality Initiative has already demonstrated the potential of public reporting to foster collaboration and coordination among nursing home stakeholders and to heighten the interest of nursing homes in quality improvement techniques. The lessons learned from this pilot project have implications for any organizations or individuals planning quality improvement projects in the nursing home setting. PMID:12753699

  6. Students' Approaches to Essay-Writing and the Quality of the Written Product.

    ERIC Educational Resources Information Center

    Biggs, John B.

    Studies of text comprehension have suggested that readers focus on different levels of ideational unit while reading, thereby affecting the quality of their comprehension of the text. A study examined the viability of the deep-surface categorization with regard to essay-writing and the relation of different approaches to writing to the quality of…

  7. Comparisons of Multilevel Modeling and Structural Equation Modeling Approaches to Actor-Partner Interdependence Model.

    PubMed

    Hong, Sehee; Kim, Soyoung

    2018-01-01

    There are basically two modeling approaches applicable to analyzing an actor-partner interdependence model: the multilevel modeling (hierarchical linear model) and the structural equation modeling. This article explains how to use these two models in analyzing an actor-partner interdependence model and how these two approaches work differently. As an empirical example, marital conflict data were used to analyze an actor-partner interdependence model. The multilevel modeling and the structural equation modeling produced virtually identical estimates for a basic model. However, the structural equation modeling approach allowed more realistic assumptions on measurement errors and factor loadings, rendering better model fit indices.

  8. Evaluating Predictive Models of Software Quality

    NASA Astrophysics Data System (ADS)

    Ciaschini, V.; Canaparo, M.; Ronchieri, E.; Salomoni, D.

    2014-06-01

    Applications from the High Energy Physics scientific community are constantly growing and are implemented by a large number of developers. This implies a strong churn on the code and an associated risk of faults, which is unavoidable as long as the software undergoes active evolution. However, the necessities of production systems run counter to this. Stability and predictability are of paramount importance; in addition, a short turn-around time for the defect discovery-correction-deployment cycle is required. A way to reconcile these opposing foci is to use a software quality model to obtain an approximation of the risk before releasing a program, so that only software with a risk lower than an agreed threshold is delivered. In this article we evaluated two quality predictive models to identify the operational risk and the quality of some software products. We applied these models to the development history of several EMI packages with the intent of discovering the risk factor of each product and comparing it with its real history. We attempted to determine whether the models reasonably map reality for the applications under evaluation, and we concluded by suggesting directions for further studies.

  9. A top-down approach to fabrication of high quality vertical heterostructure nanowire arrays.

    PubMed

    Wang, Hua; Sun, Minghua; Ding, Kang; Hill, Martin T; Ning, Cun-Zheng

    2011-04-13

    We demonstrate a novel top-down approach for fabricating nanowires with unprecedented complexity and optical quality by taking advantage of a nanoscale self-masking effect. We realized vertical arrays of nanowires of 20-40 nm in diameter with 16 segments of complex longitudinal InGaAsP/InP structures. The unprecedented high quality of etched wires is evidenced by the narrowest photoluminescence linewidth ever produced in similar wavelengths, indistinguishable from that of the corresponding wafer. This top-down, mask-free, large scale approach is compatible with the established device fabrication processes and could serve as an important alternative to the bottom-up approach, significantly expanding ranges and varieties of applications of nanowire technology.

  10. DockQ: A Quality Measure for Protein-Protein Docking Models

    PubMed Central

    Basu, Sankar

    2016-01-01

    The state of the art for assessing the structural quality of docking models is currently based on three related yet independent quality measures: Fnat, LRMS, and iRMS as proposed and standardized by CAPRI. These quality measures quantify different aspects of the quality of a particular docking model and need to be viewed together to reveal the true quality, e.g. a model with relatively poor LRMS (>10Å) might still qualify as 'acceptable' with a decent Fnat (>0.50) and iRMS (<3.0Å). This is also the reason why the so-called CAPRI criteria for assessing the quality of docking models are defined by applying various ad-hoc cutoffs on these measures to classify a docking model into the four classes: Incorrect, Acceptable, Medium, or High quality. This classification has been useful in CAPRI, but since models are grouped in only four bins it is also rather limiting, making it difficult to rank models, correlate with scoring functions or use it as a target function in machine learning algorithms. Here, we present DockQ, a continuous protein-protein docking model quality measure derived by combining Fnat, LRMS, and iRMS into a single score in the range [0, 1] that can be used to assess the quality of protein docking models. By using DockQ on CAPRI models it is possible to almost completely reproduce the original CAPRI classification into Incorrect, Acceptable, Medium and High quality. An average PPV of 94% at 90% Recall demonstrates that there is no need to apply predefined ad-hoc cutoffs to classify docking models. Since DockQ recapitulates the CAPRI classification almost perfectly, it can be viewed as a higher resolution version of the CAPRI classification, making it possible to estimate model quality in a more quantitative way using Z-scores or sum of top ranked models, which has been so valuable for the CASP community. The possibility to directly correlate a quality measure to a scoring function has been crucial for the development of scoring functions for protein structure
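
    The combination rule can be sketched in a few lines. The form below (an average of Fnat with two inverse-quadratic rescalings of LRMS and iRMS) and the scaling constants of 8.5 Å and 1.5 Å reflect one reading of the DockQ paper and should be treated as assumptions rather than a verified reimplementation.

```python
# DockQ-style combination of Fnat, LRMS and iRMS into a single score in [0, 1].
# The scaling constants d1=8.5 A (LRMS) and d2=1.5 A (iRMS) are assumptions here.
def rms_scaled(rms, d):
    """Map an RMS deviation (A) to (0, 1]; smaller RMS gives a value closer to 1."""
    return 1.0 / (1.0 + (rms / d) ** 2)

def dockq(fnat, lrms, irms, d1=8.5, d2=1.5):
    """Continuous docking-model quality score combining the three CAPRI measures."""
    return (fnat + rms_scaled(lrms, d1) + rms_scaled(irms, d2)) / 3.0

# The 'acceptable' example from the abstract: poor LRMS but decent Fnat and iRMS.
print(round(dockq(fnat=0.52, lrms=11.0, irms=2.8), 3))
```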

  11. Community Multiscale Air Quality Modeling System (CMAQ)

    EPA Pesticide Factsheets

    CMAQ is a computational tool used for air quality management. It models air pollutants including ozone, particulate matter and other air toxics to help determine optimum air quality management scenarios.

  12. Reflexion on linear regression trip production modelling method for ensuring good model quality

    NASA Astrophysics Data System (ADS)

    Suprayitno, Hitapriya; Ratnasari, Vita

    2017-11-01

    Transport modelling is important. For certain cases, the conventional model still has to be used, for which a good trip production model is essential. A good model can only be obtained from a good sample. Two basic principles of good sampling are that the sample must be able to represent the population characteristics and must produce an acceptable error at a certain confidence level. These principles do not yet seem to be well understood or applied in trip production modelling. Therefore, it is necessary to investigate trip production modelling practice in Indonesia and to formulate a better modelling method for ensuring model quality. The results of this research are presented as follows. Statistics provides a method for calculating the span of a predicted value at a certain confidence level for linear regression, called the Confidence Interval of Predicted Value. Common modelling practice uses R2 as the principal quality measure, while sampling practice varies and does not always conform to sampling principles. An experiment indicates that a small sample can already give an excellent R2 value and that sample composition can significantly change the model. Hence, a good R2 value does not, in fact, always mean good model quality. These findings lead to three basic ideas for ensuring good model quality, i.e. reformulating the quality measure, the calculation procedure, and the sampling method. A quality measure is defined as having both a good R2 value and a good Confidence Interval of Predicted Value. The calculation procedure must incorporate statistical calculation methods and the appropriate statistical tests. A good sampling method must incorporate well-distributed random stratified sampling with a certain minimum number of samples. These three ideas need to be further developed and tested.
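
    As a concrete reading of the proposed quality measure, the sketch below fits a trip production regression on an invented zonal sample with statsmodels and reports R² together with the interval around a predicted value; the data and zone size are hypothetical.

```python
# Linear trip production model: report R^2 and the interval of a predicted value
# (statsmodels' prediction interval) for a new zone.  The sample is invented.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
households = rng.integers(50, 500, size=30).astype(float)    # households per zone
trips = 1.8 * households + rng.normal(0, 80, size=30)        # observed trip production

X = sm.add_constant(households)
model = sm.OLS(trips, X).fit()

pred = model.get_prediction([[1.0, 300.0]]).summary_frame(alpha=0.05)  # zone with 300 households
print("R^2:", round(model.rsquared, 3))
print(pred[["mean", "obs_ci_lower", "obs_ci_upper"]])        # 95% interval of the predicted value
```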

  13. A multi-objective approach to improve SWAT model calibration in alpine catchments

    NASA Astrophysics Data System (ADS)

    Tuo, Ye; Marcolini, Giorgia; Disse, Markus; Chiogna, Gabriele

    2018-04-01

    Multi-objective hydrological model calibration can represent a valuable solution to reduce model equifinality and parameter uncertainty. The Soil and Water Assessment Tool (SWAT) model is widely applied to investigate water quality and water management issues in alpine catchments. However, the model calibration is generally based on discharge records only, and most of the previous studies have defined a unique set of snow parameters for an entire basin. Only a few studies have considered snow observations to validate model results or have taken into account the possible variability of snow parameters for different subbasins. This work presents and compares three possible calibration approaches. The first two procedures are single-objective calibration procedures, for which all parameters of the SWAT model were calibrated according to river discharge alone. Procedures I and II differ from each other by the assumption used to define snow parameters: The first approach assigned a unique set of snow parameters to the entire basin, whereas the second approach assigned different subbasin-specific sets of snow parameters to each subbasin. The third procedure is a multi-objective calibration, in which we considered snow water equivalent (SWE) information at two different spatial scales (i.e. subbasin and elevation band), in addition to discharge measurements. We tested these approaches in the Upper Adige river basin where a dense network of snow depth measurement stations is available. Only the set of parameters obtained with this multi-objective procedure provided an acceptable prediction of both river discharge and SWE. These findings offer the large community of SWAT users a strategy to improve SWAT modeling in alpine catchments.

  14. The Educational Situation Quality Model: Recent Advances

    PubMed Central

    Doménech-Betoret, Fernando

    2018-01-01

    The purpose of this work was to present an educational model developed in recent years entitled the “The Educational Situation Quality Model” (MOCSE, acronym in Spanish). MOCSE can be defined as an instructional model that simultaneously considers the teaching-learning process, where motivation plays a central role. It explains the functioning of an educational setting by organizing and relating the most important variables which, according to the literature, contribute to student learning. Besides being a conceptual framework, this model also provides a methodological procedure to guide research and to promote reflection in the classroom. It allows teachers to implement effective research-action programs to improve teacher–students satisfaction and learning outcomes in the classroom context. This work explains the model’s characteristics and functioning, recent advances, and how teachers can use it in an educational setting with a specific subject. This proposal integrates approaches from several relevant psycho-educational theories and introduces a new perspective into the existing literature that will allow researchers to make progress in studying educational setting functioning. The initial MOCSE configuration has been refined over time in accordance with the empirical results obtained from previous research, carried out within the MOCSE framework and with the subsequent reflections that derived from these results. Finally, the contribution of the model to improve learning outcomes and satisfaction, and its applicability in the classroom, are also discussed. PMID:29593623

  15. Approaches to ensuring and improving quality in the context of health system strengthening: a cross-site analysis of the five African Health Initiative Partnership programs

    PubMed Central

    2013-01-01

    Background Integrated into the work in health systems strengthening (HSS) is a growing focus on the importance of ensuring quality of the services delivered and systems which support them. Understanding how to define and measure quality in the different key World Health Organization building blocks is critical to providing the information needed to address gaps and identify models for replication. Description of approaches We describe the approaches to defining and improving quality across the five country programs funded through the Doris Duke Charitable Foundation African Health Initiative. While each program has independently developed and implemented country-specific approaches to strengthening health systems, they all included quality of services and systems as a core principle. We describe the differences and similarities across the programs in defining and improving quality as an embedded process essential for HSS to achieve the goal of improved population health. The programs measured quality across most or all of the six WHO building blocks, with specific areas of overlap in improving quality falling into four main categories: 1) defining and measuring quality; 2) ensuring data quality, and building capacity for data use for decision making and response to quality measurements; 3) strengthened supportive supervision and/or mentoring; and 4) operational research to understand the factors associated with observed variation in quality. Conclusions Learning the value and challenges of these approaches to measuring and improving quality across the key components of HSS as the projects continue their work will help inform similar efforts both now and in the future to ensure quality across the critical components of a health system and the impact on population health. PMID:23819662

  16. A sound quality model for objective synthesis evaluation of vehicle interior noise based on artificial neural network

    NASA Astrophysics Data System (ADS)

    Wang, Y. S.; Shen, G. Q.; Xing, Y. F.

    2014-03-01

    Based on the artificial neural network (ANN) technique, an objective sound quality evaluation (SQE) model for the synthetical annoyance of vehicle interior noises is presented in this paper. Following the standard GB/T18697, the interior noises under different working conditions of a sample vehicle are first measured and saved in a noise database. Mathematical models for loudness, sharpness and roughness of the measured vehicle noises are established and implemented in Matlab. Sound qualities of the vehicle interior noises are also estimated by jury tests following the anchored semantic differential (ASD) procedure. Using the objective and subjective evaluation results, an ANN-based model for synthetical annoyance evaluation of vehicle noises, termed ANN-SAE, is then developed. Finally, the ANN-SAE model is validated by verification tests using the leave-one-out algorithm. The results suggest that the proposed ANN-SAE model is accurate and effective and can be directly used to estimate sound quality of the vehicle interior noises, which is very helpful for vehicle acoustical designs and improvements. The ANN-SAE approach may be extended to deal with other sound-related fields for product quality evaluations in SQE engineering.

  17. Taguchi's off line method and Multivariate loss function approach for quality management and optimization of process parameters -A review

    NASA Astrophysics Data System (ADS)

    Bharti, P. K.; Khan, M. I.; Singh, Harbinder

    2010-10-01

    Off-line quality control is considered to be an effective approach to improving product quality at a relatively low cost. The Taguchi method is one of the conventional approaches for this purpose. Through this approach, engineers can determine a feasible combination of design parameters such that the variability of a product's response can be reduced and the mean is close to the desired target. The traditional Taguchi method was focused on ensuring good performance at the parameter design stage with one quality characteristic, but most products and processes have multiple quality characteristics. The optimal parameter design minimizes the total quality loss for multiple quality characteristics. Several studies have presented approaches addressing multiple quality characteristics. Most of these papers were concerned with finding the parameter combination that maximizes the signal-to-noise (SN) ratios. The results reveal two advantages of this approach: for a single quality characteristic, the optimal parameter design is the same as in the traditional Taguchi method; for multiple quality characteristics, the optimal design maximizes the reduction in total quality loss. This paper presents a literature review on solving multi-response problems in the Taguchi method and its successful implementation in various industries.
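
    Two quantities recur throughout this literature, the quadratic quality loss and the signal-to-noise ratio; a short sketch with illustrative replicate data is given below, using the standard textbook forms rather than any particular paper's variant.

```python
# Taguchi quadratic loss and two common SN ratios, computed from replicated
# measurements at one parameter setting.  Numbers are illustrative.
import numpy as np

def quality_loss(y, target, k):
    """Average Taguchi loss k*(y - target)^2 over replicate measurements y."""
    y = np.asarray(y, dtype=float)
    return k * np.mean((y - target) ** 2)

def sn_nominal_the_best(y):
    """SN ratio (dB) for a nominal-the-best characteristic: 10*log10(mean^2/variance)."""
    y = np.asarray(y, dtype=float)
    return 10.0 * np.log10(y.mean() ** 2 / y.var(ddof=1))

def sn_smaller_the_better(y):
    """SN ratio (dB) when smaller is better: -10*log10(mean of y^2)."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

replicates = [10.2, 9.8, 10.1, 10.4, 9.7]
print("loss:", round(quality_loss(replicates, target=10.0, k=2.0), 3))
print("SN (nominal-the-best):", round(sn_nominal_the_best(replicates), 2), "dB")
```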

  18. COMMUNITY MULTISCALE AIR QUALITY MODELING SYSTEM (ONE ATMOSPHERE)

    EPA Science Inventory

    This task supports ORD's strategy by providing responsive technical support of EPA's mission and provides credible state of the art air quality models and guidance. This research effort is to develop and improve the Community Multiscale Air Quality (CMAQ) modeling system, a mu...

  19. Microscale Obstacle Resolving Air Quality Model Evaluation with the Michelstadt Case

    PubMed Central

    Rakai, Anikó; Kristóf, Gergely

    2013-01-01

    Modelling pollutant dispersion in cities is challenging for air quality models as the urban obstacles have an important effect on the flow field and thus the dispersion. Computational Fluid Dynamics (CFD) models with an additional scalar dispersion transport equation are a possible way to resolve the flowfield in the urban canopy and model dispersion taking into consideration the effect of the buildings explicitly. These models need detailed evaluation with the method of verification and validation to gain confidence in their reliability and use them as a regulatory purpose tool in complex urban geometries. This paper shows the performance of an open source general purpose CFD code, OpenFOAM for a complex urban geometry, Michelstadt, which has both flow field and dispersion measurement data. Continuous release dispersion results are discussed to show the strengths and weaknesses of the modelling approach, focusing on the value of the turbulent Schmidt number, which was found to give best statistical metric results with a value of 0.7. PMID:24027450

  20. Microscale obstacle resolving air quality model evaluation with the Michelstadt case.

    PubMed

    Rakai, Anikó; Kristóf, Gergely

    2013-01-01

    Modelling pollutant dispersion in cities is challenging for air quality models as the urban obstacles have an important effect on the flow field and thus the dispersion. Computational Fluid Dynamics (CFD) models with an additional scalar dispersion transport equation are a possible way to resolve the flowfield in the urban canopy and model dispersion taking into consideration the effect of the buildings explicitly. These models need detailed evaluation with the method of verification and validation to gain confidence in their reliability and use them as a regulatory purpose tool in complex urban geometries. This paper shows the performance of an open source general purpose CFD code, OpenFOAM for a complex urban geometry, Michelstadt, which has both flow field and dispersion measurement data. Continuous release dispersion results are discussed to show the strengths and weaknesses of the modelling approach, focusing on the value of the turbulent Schmidt number, which was found to give best statistical metric results with a value of 0.7.

  1. RAQ–A Random Forest Approach for Predicting Air Quality in Urban Sensing Systems

    PubMed Central

    Yu, Ruiyun; Yang, Yu; Yang, Leyou; Han, Guangjie; Move, Oguti Ann

    2016-01-01

    Air quality information such as the concentration of PM2.5 is of great significance for human health and city management. It affects travel behaviour, urban planning, government policies and so on. However, in major cities there is typically only a limited number of air quality monitoring stations. Meanwhile, air quality varies across urban areas, and there can be large differences even between closely neighboring regions. In this paper, a random forest approach for predicting air quality (RAQ) is proposed for urban sensing systems. The data generated by urban sensing include meteorology data, road information, real-time traffic status and point of interest (POI) distribution. The random forest algorithm is exploited for data training and prediction. The performance of RAQ is evaluated with real city data. Compared with three other algorithms, this approach achieves better prediction precision. The experiments show that air quality can be inferred with high accuracy from the data obtained through urban sensing. PMID:26761008
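
    A hedged sketch of the core prediction step: a random forest regressor maps meteorology, traffic and point-of-interest features to PM2.5. The feature names, synthetic relationship and cross-validation setup below are placeholders, not the paper's dataset or tuning.

```python
# Random-forest air quality prediction from urban-sensing style features.
# All features and the synthetic PM2.5 relationship are invented placeholders.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(11)
n = 1500
features = np.column_stack([
    rng.normal(15, 8, n),       # temperature (C)
    rng.uniform(0, 10, n),      # wind speed (m/s)
    rng.uniform(0, 1, n),       # congestion index of nearby roads
    rng.integers(0, 50, n),     # industrial POIs within 2 km
])
pm25 = (30.0 + 40.0 * features[:, 2] + 0.8 * features[:, 3]
        - 1.5 * features[:, 1] + rng.normal(0, 5, n))

model = RandomForestRegressor(n_estimators=300, random_state=0)
scores = cross_val_score(model, features, pm25, cv=5, scoring="r2")
print("cross-validated R^2:", np.round(scores, 3))
```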

  2. A tutorial for developing a topical cream formulation based on the Quality by Design approach.

    PubMed

    Simões, Ana; Veiga, Francisco; Vitorino, Carla; Figueiras, Ana

    2018-06-20

    The pharmaceutical industry has entered a new era, as there is a growing interest in increasing the quality standards of dosage forms, through the implementation of more structured development and manufacturing approaches. For many decades, the manufacturing of drug products was controlled by a regulatory framework to guarantee the quality of the final product through a fixed process and exhaustive testing. Limitations related to the Quality by Test (QbT) system have been widely acknowledged. The emergence of Quality by Design (QbD) as a systematic and risk-based approach introduced a new quality concept based on a good understanding of how raw materials and process parameters influence the final quality profile. Although the QbD system has been recognized as a revolutionary approach to product development and manufacturing, its full implementation in the pharmaceutical field is still limited. This is particularly evident in the case of semisolid complex formulation development. The present review aims at establishing a practical QbD framework to describe, in a comprehensible manner, all the stages involved in the pharmaceutical development of a conventional cream. Copyright © 2018. Published by Elsevier Inc.

  3. A quality approach for conducting training needs assessments in the Ministry of Health, State of Bahrain.

    PubMed

    Benjamin, S; al-Darazi, F

    2000-01-01

    In health care organizations around the world, Training Needs Assessments (TNAs) have generally followed a professions-based approach. For example, the training needs of doctors, nurses, each allied health profession, and distinct support staff have been analyzed separately--individualized TNAs conducted for each speciality and functional area. Although a professions-based TNA model can provide useful information to human resource development (HRD) professionals, there are two major drawbacks: (1) it is possible that important training needs might be overlooked because of lack of information sharing among professions and (2) such an approach does not encourage an interdisciplinary, team orientation to service provision. This paper proposes an improved method of conceptualizing TNAs, using an approach that builds on the quality management literature (TQM, CQI, etc.) which stresses the importance of customer- and service-orientations to organizing and measuring organizational and individual performance.

  4. Towards the Next Generation Air Quality Modeling System ...

    EPA Pesticide Factsheets

    The community multiscale air quality (CMAQ) model of the U.S. Environmental Protection Agency is one of the most widely used air quality models worldwide; it is employed for both research and regulatory applications at major universities and government agencies for improving understanding of the formation and transport of air pollutants. It is noted, however, that air quality issues and climate change assessments need to be addressed globally, recognizing the linkages and interactions between meteorology and atmospheric chemistry across a wide range of scales. Therefore, an effort is currently underway to develop the next generation air quality modeling system (NGAQM) that will be based on a global integrated meteorology and chemistry system. The Model for Prediction Across Scales-Atmosphere (MPAS-A), a global fully compressible non-hydrostatic model with seamlessly refined centroidal Voronoi grids, has been chosen as the meteorological driver of this modeling system. The initial step of adapting MPAS-A for the NGAQM was to implement and test the physics parameterizations and options that are preferred for retrospective air quality simulations (see the work presented by R. Gilliam, R. Bullock, and J. Herwehe at this workshop). The next step, presented herein, would be to link the chemistry from CMAQ to MPAS-A to build a prototype for the NGAQM. Furthermore, the techniques to harmonize transport processes between CMAQ and MPAS-A, methodologies to connect the chemis

  5. A gap approach to exploring quality of life in mental health.

    PubMed

    Welham, J; Haire, M; Mercer, D; Stedman, T

    2001-01-01

    Improving quality of life (QoL) is an important treatment outcome for the seriously mentally ill. There is, however, a need for an instrument which both captures consumers' own assessments and gives direct information for intervention. A useful approach is to define QoL as the gap between actual and ideal life circumstances, weighted by importance. In this paper we detail how we developed and evaluated a QoL instrument which follows this model. This instrument, the 'QoL-GAP', is based on self-appraised items within various life domains. For each item respondents first identify what they have (actual) and then what they would like (ideal). They then rate the item for its importance and make any comments. A weighted gap score for each item is subsequently derived from the ideal-actual gap weighted by the importance rating. This weighted gap score is then related to domain satisfaction ratings, while the average of the weighted gap scores in each domain is related to overall satisfaction and well-being. We surveyed 120 individuals with a serious and enduring mental illness living in different types of residences, such as psychiatric hospitals, hostels, or their own homes, in a largely urban part of Queensland. Sixty-eight percent were males, and 92% had schizophrenia or related disorders. We found that our approach demonstrated good psychometric properties, and that the model-based predictions were borne out: weighted gap measures were consistently more strongly related to domain satisfaction than were the actual circumstances alone. While further work is being undertaken--in such matters as short forms and further evaluation of the QoL-GAP in a longitudinal study--our results suggest that this 'gap' approach helps consumers state their own goals and give their opinions, and so is particularly relevant for consumer-focused mental health delivery and research.

  6. Improved model quality assessment using ProQ2.

    PubMed

    Ray, Arjun; Lindahl, Erik; Wallner, Björn

    2012-09-10

    Employing methods to assess the quality of modeled protein structures is now standard practice in bioinformatics. In a broad sense, the techniques can be divided into methods relying on consensus prediction on the one hand, and single-model methods on the other. Consensus methods frequently perform very well when there is a clear consensus, but this is not always the case. In particular, they frequently fail in selecting the best possible model in the hard cases (lacking consensus) or in the easy cases where models are very similar. In contrast, single-model methods do not suffer from these drawbacks and could potentially be applied to any protein of interest to assess quality or as a scoring function for sampling-based refinement. Here, we present a new single-model method, ProQ2, based on ideas from its predecessor, ProQ. ProQ2 is a model quality assessment algorithm that uses support vector machines to predict local as well as global quality of protein models. Improved performance is obtained by combining previously used features with updated structural and predicted features. The most important contribution can be attributed to the use of profile weighting of the residue-specific features and the use of features averaged over the whole model, even though the prediction is still local. ProQ2 is significantly better than its predecessors at detecting high-quality models, improving the sum of Z-scores for the selected first-ranked models by 20% and 32% compared to the second-best single-model method in CASP8 and CASP9, respectively. The absolute quality assessment of the models at both local and global level is also improved. The Pearson's correlation between the correct and the locally predicted score is improved from 0.59 to 0.70 on CASP8 and from 0.62 to 0.68 on CASP9; for the global score against the correct GDT_TS, from 0.75 to 0.80 and from 0.77 to 0.80, again compared to the second-best single-model methods in CASP8 and CASP9, respectively. ProQ2 is available at http://proq2

  7. Evaluating Air-Quality Models: Review and Outlook.

    NASA Astrophysics Data System (ADS)

    Weil, J. C.; Sykes, R. I.; Venkatram, A.

    1992-10-01

    Over the past decade, much attention has been devoted to the evaluation of air-quality models, with emphasis on model performance in predicting the high concentrations that are important in air-quality regulations. This paper stems from our belief that this practice needs to be expanded to 1) evaluate model physics and 2) deal with the large natural or stochastic variability in concentration. The variability is represented by the root-mean-square fluctuating concentration (σc) about the mean concentration (C) over an ensemble, i.e., a given set of meteorological, source, and other conditions. Most air-quality models used in applications predict C, whereas observations are individual realizations drawn from an ensemble. When σc is comparable to or larger than C, large residuals exist between predicted and observed concentrations, which confuse model evaluations. This paper addresses ways of evaluating model physics in light of the large σc; the focus is on elevated point-source models. Evaluation of model physics requires the separation of the mean model error, the difference between the predicted and observed C, from the natural variability. A residual analysis is shown to be an effective way of doing this. Several examples demonstrate the usefulness of residuals, as well as correlation analyses and laboratory data, in judging model physics. In general, σc models and predictions of the probability distribution of the fluctuating concentration (c) are in the developmental stage, with laboratory data playing an important role. Laboratory data from point-source plumes in a convection tank show that this distribution approximates a self-similar form along the plume center plane, a useful result in a residual analysis. At present, there is one model, ARAP, that predicts C, σc, and the distribution of c for point-source plumes. This model is more computationally demanding than other dispersion models (for C only) and must be demonstrated as a practical tool. However, it predicts an important quantity for applications: the uncertainty in the very high and
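    A minimal numerical illustration of the residual analysis described above is sketched below on synthetic data: the mean residual isolates the model bias, while the residual scatter is dominated by the stochastic variability σc even for a well-performing model. The numbers and the assumed σc level are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
n_cases = 500
C_true = rng.lognormal(mean=3.0, sigma=0.5, size=n_cases)   # ensemble-mean concentration
sigma_c = 0.8 * C_true                                      # assumed stochastic fluctuation level
observed = C_true + rng.normal(0.0, sigma_c)                # one realization per case
predicted = 1.1 * C_true                                    # model with a 10% mean bias

residual = observed - predicted
print("mean residual (reflects model bias):", round(residual.mean(), 2))
print("residual std (dominated by sigma_c):", round(residual.std(), 2))
```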

  8. A Visual Analytics Approach for Station-Based Air Quality Data

    PubMed Central

    Du, Yi; Ma, Cuixia; Wu, Chao; Xu, Xiaowei; Guo, Yike; Zhou, Yuanchun; Li, Jianhui

    2016-01-01

    With the deployment of multi-modality and large-scale sensor networks for monitoring air quality, we are now able to collect large and multi-dimensional spatio-temporal datasets. For these sensed data, we present a comprehensive visual analysis approach for air quality analysis. This approach integrates several visual methods, such as map-based views, calendar views, and trends views, to assist the analysis. Among these visual methods, map-based views are used to display the locations of interest, while the calendar and trends views are used to discover linear and periodic patterns. The system also provides various interaction tools to combine the map-based visualization, trends view, calendar view and multi-dimensional view. In addition, we propose a self-adaptive calendar-based controller that can flexibly adapt to changes in data size and granularity in the trends view. Such a visual analytics system would facilitate big-data analysis in real applications, especially for decision-making support. PMID:28029117

  9. A Visual Analytics Approach for Station-Based Air Quality Data.

    PubMed

    Du, Yi; Ma, Cuixia; Wu, Chao; Xu, Xiaowei; Guo, Yike; Zhou, Yuanchun; Li, Jianhui

    2016-12-24

    With the deployment of multi-modality and large-scale sensor networks for monitoring air quality, we are now able to collect large and multi-dimensional spatio-temporal datasets. For these sensed data, we present a comprehensive visual analysis approach for air quality analysis. This approach integrates several visual methods, such as map-based views, calendar views, and trends views, to assist the analysis. Among these visual methods, map-based views are used to display the locations of interest, while the calendar and trends views are used to discover linear and periodic patterns. The system also provides various interaction tools to combine the map-based visualization, trends view, calendar view and multi-dimensional view. In addition, we propose a self-adaptive calendar-based controller that can flexibly adapt to changes in data size and granularity in the trends view. Such a visual analytics system would facilitate big-data analysis in real applications, especially for decision-making support.

  10. Development of Innovative Business Model of Modern Manager's Qualities

    ERIC Educational Resources Information Center

    Yashkova, Elena V.; Sineva, Nadezda L.; Shkunova, Angelika A.; Bystrova, Natalia V.; Smirnova, Zhanna V.; Kolosova, Tatyana V.

    2016-01-01

    The paper defines a complex of manager's qualities based on theoretical and methodological analysis and synthesis methods, available national and world literature, research papers and publications. The complex approach methodology was used, which provides an innovative view of the development of modern manager's qualities. The methodological…

  11. Model-based monitoring of stormwater runoff quality.

    PubMed

    Birch, Heidi; Vezzaro, Luca; Mikkelsen, Peter Steen

    2013-01-01

    Monitoring of micropollutants (MP) in stormwater is essential to evaluate the impacts of stormwater on the receiving aquatic environment. The aim of this study was to investigate how different strategies for monitoring of stormwater quality (combining a model with field sampling) affect the information obtained about MP discharged from the monitored system. A dynamic stormwater quality model was calibrated using MP data collected by automatic volume-proportional sampling and passive sampling in a storm drainage system on the outskirts of Copenhagen (Denmark) and a 10-year rain series was used to find annual average (AA) and maximum event mean concentrations. Use of this model reduced the uncertainty of predicted AA concentrations compared to a simple stochastic method based solely on data. The predicted AA concentration, obtained by using passive sampler measurements (1 month installation) for calibration of the model, resulted in the same predicted level but with narrower model prediction bounds than by using volume-proportional samples for calibration. This shows that passive sampling allows for a better exploitation of the resources allocated for stormwater quality monitoring.

  12. A systematic literature review of open source software quality assessment models.

    PubMed

    Adewumi, Adewole; Misra, Sanjay; Omoregbe, Nicholas; Crawford, Broderick; Soto, Ricardo

    2016-01-01

    Many open source software (OSS) quality assessment models are proposed and available in the literature. However, there is little or no adoption of these models in practice. In order to guide the formulation of newer models so that they can be acceptable to practitioners, there is a need for clear discrimination of the existing models based on their specific properties. Based on this, the aim of this study is to perform a systematic literature review to investigate the properties of the existing OSS quality assessment models by classifying them with respect to their quality characteristics, the methodology they use for assessment, and their domain of application, so as to guide the formulation and development of newer models. Searches in IEEE Xplore, ACM, Science Direct, Springer and Google Search were performed so as to retrieve all relevant primary studies in this regard. Journal and conference papers between the years 2003 and 2015 were considered, since the first known OSS quality model emerged in 2003. A total of 19 OSS quality assessment model papers were selected. To select these models we developed assessment criteria to evaluate the quality of the existing studies. Quality assessment models are classified into five categories based on the quality characteristics they possess, namely: single-attribute, rounded-category, community-only-attribute, non-community-attribute and non-quality-in-use models. Our study reflects that software selection based on hierarchical structures is the most popular selection method in the existing OSS quality assessment models. Furthermore, we found that the majority (47%) of the existing models do not specify any domain of application. In conclusion, our study will be a valuable contribution to the community and helps quality assessment model developers in formulating newer models and also practitioners (software evaluators) in selecting suitable OSS from among alternatives.

  13. The Attributive Theory of Quality: A Model for Quality Measurement in Higher Education.

    ERIC Educational Resources Information Center

    Afshar, Arash

    A theoretical basis for defining and measuring the quality of institutions of higher education, namely for accreditation purposes, is developed. The theory, the Attributive Theory of Quality, is illustrated using a calculation model that is based on general systems theory. The theory postulates that quality only exists in relation to the…

  14. Prediction of Indoor Air Exposure from Outdoor Air Quality Using an Artificial Neural Network Model for Inner City Commercial Buildings

    PubMed Central

    Challoner, Avril; Pilla, Francesco; Gill, Laurence

    2015-01-01

    NO2 and particulate matter are the air pollutants of most concern in Ireland, with possible links to the higher respiratory and cardiovascular mortality and morbidity rates found in the country compared to the rest of Europe. Currently, air quality limits in Europe only cover outdoor environments, yet the quality of indoor air is an essential determinant of a person’s well-being, especially since the average person spends more than 90% of their time indoors. The modelling conducted in this research aims to provide a framework for epidemiological studies by using publicly available data from fixed outdoor monitoring stations to predict indoor air quality more accurately. Predictions are made using two modelling techniques: the Personal-exposure Activity Location Model (PALM), to predict outdoor air quality at a particular building, and Artificial Neural Networks, to model the indoor/outdoor relationship of the building. This joint approach has been used to predict indoor air concentrations for three inner city commercial buildings in Dublin, where parallel indoor and outdoor diurnal monitoring had been carried out on site. This modelling methodology has been shown to provide reasonable predictions of average NO2 indoor air quality compared to the monitored data, but did not perform well in the prediction of indoor PM2.5 concentrations. Hence, this approach could be used to determine more rigorously the NO2 exposures of those who work and/or live in the city centre, which can then be linked to potential health impacts. PMID:26633448
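    A minimal sketch of the indoor/outdoor modelling step is given below, using a small scikit-learn neural network on synthetic data; the features, the assumed infiltration relationship and the noise level are hypothetical, not the Dublin monitoring data or the PALM model.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n = 2000
outdoor_no2 = rng.gamma(shape=4.0, scale=10.0, size=n)   # ug/m3
wind = rng.exponential(3.0, n)                           # m/s
temp = rng.normal(10.0, 5.0, n)                          # deg C
# Assumed indoor response: damped, wind-dependent infiltration plus noise.
indoor_no2 = outdoor_no2 * (0.4 + 0.02 * wind) + 0.1 * temp + rng.normal(0.0, 3.0, n)

X = np.column_stack([outdoor_no2, wind, temp])
X_tr, X_te, y_tr, y_te = train_test_split(X, indoor_no2, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0))
model.fit(X_tr, y_tr)
print("held-out R^2:", round(model.score(X_te, y_te), 3))
```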

  15. Collaborative problem solving with a total quality model.

    PubMed

    Volden, C M; Monnig, R

    1993-01-01

    A collaborative problem-solving system committed to the interests of those involved complies with the teachings of the total quality management movement in health care. Deming espoused that any quality system must become an integral part of routine activities. A process that is used consistently in dealing with problems, issues, or conflicts provides a mechanism for accomplishing total quality improvement. The collaborative problem-solving process described here results in quality decision-making. This model incorporates Ishikawa's cause-and-effect (fishbone) diagram, Moore's key causes of conflict, and the steps of the University of North Dakota Conflict Resolution Center's collaborative problem solving model.

  16. Evaluating Regional-Scale Air Quality Models

    EPA Science Inventory

    Numerical air quality models are being used to understand the complex interplay among emission loadings, meteorology, and atmospheric chemistry leading to the formation and accumulation of pollutants in the atmosphere. A model evaluation framework is presented here that considers ...

  17. Modeling and Simulation Resource Repository (MSRR)(System Engineering/Integrated M&S Management Approach

    NASA Technical Reports Server (NTRS)

    Milroy, Audrey; Hale, Joe

    2006-01-01

    NASA's Exploration Systems Mission Directorate (ESMD) is implementing a management approach for modeling and simulation (M&S) that will provide decision-makers information on the model's fidelity, credibility, and quality, including the verification, validation and accreditation information. The NASA MSRR will be implemented leveraging M&S industry best practices. This presentation will discuss the requirements that will enable NASA to capture and make available the "meta data" or "simulation biography" data associated with a model. The presentation will also describe the requirements that drive how NASA will collect and document relevant information for models or suites of models in order to facilitate use and reuse of relevant models and provide visibility across NASA organizations and the larger M&S community.

  18. Quality assessment of Isfahan Medical Faculty web site electronic services and prioritizing solutions using analytic hierarchy process approach.

    PubMed

    Hajrahimi, Nafiseh; Dehaghani, Sayed Mehdi Hejazi; Hajrahimi, Nargess; Sarmadi, Sima

    2014-01-01

    Implementing information technology in the best possible way can bring many advantages, such as offering electronic services and facilitating tasks. Therefore, assessment of service-providing systems is a way to improve the quality of these systems, including e-commerce, e-government, e-banking, and e-learning. This study aimed to evaluate the electronic services on the website of Isfahan University of Medical Sciences in order to propose solutions to improve them. Furthermore, we aimed to rank the solutions based on the factors that enhance the quality of electronic services by using the analytic hierarchy process (AHP) method. A non-parametric test was used to assess the quality of electronic services. The assessment of propositions was based on the Aqual model, and they were prioritized using the AHP approach. The AHP approach was used because it directly applies experts' judgements in the model and leads to more objective results in analysing and prioritizing the risks. After evaluating the quality of the electronic services, a multi-criteria decision-making framework was used to prioritize the proposed solutions. Analyses were performed with non-parametric tests and the AHP approach using Expert Choice software. The results showed that students were satisfied with most of the indicators. Only a few indicators received low satisfaction from students, including design attractiveness, the amount of explanation and detail of information, honesty and responsiveness of authorities, and the role of e-services in the user's relationship with the university. After interviewing Information and Communications Technology (ICT) experts at the university, measurement criteria and solutions to improve the quality were collected. The best solutions were selected with the EC software. According to the results, the solution "controlling and improving the process in handling users' complaints" is of the utmost importance, and authorities have to have it on the website and place great importance on updating this process

  19. Modelling End-User of Electronic-Government Service: The Role of Information quality, System Quality and Trust

    NASA Astrophysics Data System (ADS)

    Witarsyah Jacob, Deden; Fudzee, Mohd Farhan Md; Aizi Salamat, Mohamad; Kasim, Shahreen; Mahdin, Hairulnizam; Azhar Ramli, Azizul

    2017-08-01

    Many governments around the world increasingly use internet technologies such as electronic government to provide public services. These services range from providing the most basic informational website to deploying sophisticated tools for managing interactions between government agencies and beyond government. Electronic government (e-government) aims to provide more accurate, easily accessible, cost-effective and time-saving services for the community. In this study, we develop a new model of e-government service adoption by extending the Unified Theory of Acceptance and Use of Technology (UTAUT) through the incorporation of variables such as System Quality, Information Quality and Trust. The model is then tested using a large-scale, multi-site survey of 237 Indonesian citizens and validated using Structural Equation Modeling (SEM). The results indicate that the System Quality, Information Quality and Trust variables are proven to affect user behavior. This study extends the current understanding of the influence of System Quality, Information Quality and Trust factors for researchers, practitioners, and policy makers.

  20. Air quality modeling for effective environmental management in the mining region.

    PubMed

    Asif, Zunaira; Chen, Zhi; Han, Yi

    2018-04-18

    Air quality in the mining sector is a serious environmental concern and is associated with many health issues. Air quality management in mining regions has been facing many challenges due to a lack of understanding of atmospheric factors and physical removal mechanisms. A modeling approach called the mining air dispersion model (MADM) is developed to predict air pollutant concentrations in the mining region while considering the deposition effect. The model accounts for planetary boundary layer conditions and assumes that the eddy diffusivity depends on the downwind distance. The developed MADM is applied to a mining site in Canada. The model predicts concentrations of PM10, PM2.5, TSP, NO2 and six heavy metals (As, Pb, Hg, Cd, Zn, Cr) at various receptor locations. The model shows that neutral stability conditions are dominant for the study site. The maximum mixing height (1280 m) is reached during summer evenings, and the minimum mixing height (380 m) during winter evenings. The dust fall (coarse PM) deposition flux is at its maximum during February and March, with a deposition velocity of 4.67 cm/s. The results are evaluated against field monitoring values, revealing good agreement for the target air pollutants, with R-squared ranging from 0.72 to 0.96 for PM2.5, from 0.71 to 0.82 for PM10 and from 0.71 to 0.89 for NO2. The analyses illustrate that the algorithm presented in this model can be used to assess air quality for the mining site in a systematic way. A comparison of MADM and CALPUFF modeled values is made for four different pollutants (PM2.5, PM10, TSP, and NO2) under three different atmospheric stability classes (stable, neutral and unstable). Further, MADM results are statistically tested against CALPUFF for the air pollutants, and model performance is found to be satisfactory.
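    The goodness-of-fit measure quoted above can be computed as in the short sketch below (one common definition of R-squared, applied to hypothetical receptor values; this is not part of the MADM code).

```python
import numpy as np

def r_squared(observed, predicted):
    """Coefficient of determination between observed and modeled values."""
    observed, predicted = np.asarray(observed), np.asarray(predicted)
    ss_res = np.sum((observed - predicted) ** 2)
    ss_tot = np.sum((observed - observed.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Hypothetical PM2.5 values at a few receptors [ug/m3].
monitored = [12.0, 18.5, 25.1, 9.8, 31.2]
modeled   = [10.5, 20.0, 23.7, 11.2, 29.0]
print("R^2:", round(r_squared(monitored, modeled), 3))
```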

  1. Graphene growth process modeling: a physical-statistical approach

    NASA Astrophysics Data System (ADS)

    Wu, Jian; Huang, Qiang

    2014-09-01

    As a zero-bandgap semiconductor, graphene is an attractive material for a wide variety of applications such as optoelectronics. Among various techniques developed for graphene synthesis, chemical vapor deposition on copper foils shows high potential for producing few-layer and large-area graphene. Since fabrication of high-quality graphene sheets requires an understanding of growth mechanisms, and methods of characterization and control of the grain size of graphene flakes, analytical modeling of the graphene growth process is therefore essential for controlled fabrication. The graphene growth process starts with randomly nucleated islands that gradually develop into complex shapes, grow in size, and eventually connect together to cover the copper foil. To model this complex process, we develop a physical-statistical approach under the assumption of self-similarity during graphene growth. The growth kinetics is uncovered by separating island shapes from the area growth rate. We propose to characterize the area growth velocity using a confined exponential model, which not only has a clear physical explanation but also fits the real data well. For the shape modeling, we develop a parametric shape model which can be well explained by the angular-dependent growth rate. This work can provide useful information for the control and optimization of the graphene growth process on Cu foil.
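    The sketch below illustrates one plausible reading of a "confined exponential" area-growth model, A(t) = A_max(1 - exp(-kt)), fitted with a standard least-squares routine to synthetic coverage data; the functional form and the parameter values are assumptions for illustration only, not the paper's calibrated model.

```python
import numpy as np
from scipy.optimize import curve_fit

def confined_exponential(t, a_max, k):
    """Saturating (confined) exponential growth of covered area."""
    return a_max * (1.0 - np.exp(-k * t))

rng = np.random.default_rng(3)
t = np.linspace(0, 30, 40)                                   # growth time [min]
area = confined_exponential(t, a_max=100.0, k=0.15) + rng.normal(0, 2, t.size)

popt, pcov = curve_fit(confined_exponential, t, area, p0=[80.0, 0.1])
print("fitted A_max = %.1f, k = %.3f" % tuple(popt))
```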

  2. Water Quality and Quantity Modeling for Hydrologic and Policy Decision Making

    NASA Astrophysics Data System (ADS)

    Rubiano, J.; Giron, E.; Quintero, M.; O'Brien, R.

    2004-12-01

    This paper presents the results of a research project that elucidates the excesses of nitrogen and phosphorus using a spatial-temporal modeling approach. The project integrates biophysical and socio-economic knowledge to offer sound solutions to multiple stakeholders within a watershed context. The aim is to promote rural development and solve environmental conflicts by focusing on the internalization of externalities derived from watershed management, triggering the transfer of funding from urban to rural populations and making the city invest in environmental goods or services offered by rural environments. The integrated modeling is focused on identifying causal relationships between land use and management on the one hand, and water quantity/quality and sedimentation downstream on the other. The amount of contaminated sediment transported in the study area and its impact are also estimated. The soil runoff information within the study area is obtained by considering the characteristics of erosion, using a MUSLE model as a sub-model of the SWAT model. Using regression analysis, mathematical relationships between rainfall and surface runoff and between land use or management practices and the measured nitrate and phosphate loads are established. The methodology first integrates most of the key spatial information available for the site to facilitate envisioning different land use scenarios and their impacts upon water resources. Subsequently, selected alternative scenarios regarding the identified externalities are analyzed using optimization models. Opportunities for and constraints to promoting co-operation among users are exposed with the aid of economic games in which more sustainable land use or management alternatives are suggested. Strategic alliances and collective action are promoted in order to implement those alternatives that are environmentally sound and economically feasible. Such options are supported by co

  3. A Model Based Approach to Increase the Part Accuracy in Robot Based Incremental Sheet Metal Forming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meier, Horst; Laurischkat, Roman; Zhu Junhong

    One main influence on the dimensional accuracy in robot-based incremental sheet metal forming results from the compliance of the involved robot structures. Compared to conventional machine tools, the low stiffness of the robot's kinematics results in a significant deviation from the planned tool path and therefore in a shape of insufficient quality. To predict and compensate for these deviations offline, a model-based approach has been developed, consisting of a finite element model to simulate the sheet forming and a multi-body system to model the compliant robot structure. This paper describes the implementation and experimental verification of the multi-body system model and its included compensation method.

  4. Conceptual Models, Choices, and Benchmarks for Building Quality Work Cultures.

    ERIC Educational Resources Information Center

    Acker-Hocevar, Michele

    1996-01-01

    The two models in Florida's Educational Quality Benchmark System represent a new way of thinking about developing schools' work culture. The Quality Performance System Model identifies nine dimensions of work within a quality system. The Change Process Model provides a theoretical framework for changing existing beliefs, attitudes, and behaviors…

  5. Jointly modeling longitudinal proportional data and survival times with an application to the quality of life data in a breast cancer trial.

    PubMed

    Song, Hui; Peng, Yingwei; Tu, Dongsheng

    2017-04-01

    Motivated by the joint analysis of longitudinal quality of life data and recurrence-free survival times from a cancer clinical trial, we present in this paper two approaches to jointly model the longitudinal proportional measurements, which are confined to a finite interval, and survival data. Both approaches assume a proportional hazards model for the survival times. For the longitudinal component, the first approach applies the classical linear mixed model to logit-transformed responses, while the second approach directly models the responses using a simplex distribution. A semiparametric method based on a penalized joint likelihood generated by the Laplace approximation is derived to fit the joint model defined by the second approach. The proposed procedures are evaluated in a simulation study and applied to the analysis of the breast cancer data that motivated this research.
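    A minimal sketch of the first approach (the longitudinal component only) is given below: a linear mixed model fitted to logit-transformed proportional responses using statsmodels on synthetic data. The survival sub-model and the joint likelihood are omitted, and the data-generating assumptions are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n_patients, n_visits = 50, 4
patient = np.repeat(np.arange(n_patients), n_visits)
time = np.tile(np.arange(n_visits), n_patients)
# Synthetic QoL proportions in (0, 1), declining slightly over time.
qol = np.clip(0.7 - 0.03 * time + rng.normal(0, 0.1, patient.size), 0.01, 0.99)

df = pd.DataFrame({"patient": patient, "time": time,
                   "logit_qol": np.log(qol / (1 - qol))})
# Random intercept per patient, fixed effect of time on the logit scale.
fit = smf.mixedlm("logit_qol ~ time", df, groups=df["patient"]).fit()
print(fit.params)
```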

  6. Analysis and Model Based Assessment of Water Quality in European Mesoscale Forest Catchments with Different Management Strategies (a Climatic Gradient Approach)

    NASA Astrophysics Data System (ADS)

    Tavares, Filipa; Schwaerzel, Kai; Nunes, João. Pedro; Feger, Karl-Heinz

    2010-05-01

    Forestry activities affect the environmental conditions of river basins by modifying soil properties and vegetation cover, leading to changes in e.g. runoff generation and routing, water yield or the trophic status of water bodies. Climate change is directly linked to forestry, since site-adapted sustainable forest management can buffer negative climate change impacts in river basins, while practices leading to over-harvesting or increasing wildfires can exacerbate these impacts. While studies relating hydrological processes to forestry practices or climate change have already been conducted, the combined impacts of both are rarely discussed. The main objective of the proposed work is to study the interactions between forest management and climate change and their effects upon water fluxes and water quality at the catchment scale, over medium- to long-term periods and following an East-West climate gradient. Additional objectives are to increase knowledge about the relations between forest, water quality and soil conservation/degradation, and to improve the modelling of hydrological and matter transport processes in managed forests. The present poster shows a conceptual approach to understanding this combined interaction by analysing an East-West climatic gradient (Ukraine-Germany-Portugal) with contrasting forestry practices and climate vulnerabilities. The activities within this workplan, to take place during the period 2010 - 2014, will be developed in close collaboration with several ongoing research projects in the host institution at the Dresden University of Technology (TUD) and at the University of Aveiro (UA). The Institute of Soil Science and Site-Ecology (ISSE) at TUD has an internationally renowned research tradition in forest hydrological topics, using methods and findings from various (sub)disciplines in a multidisciplinary approach. The measurement and simulation of forest catchments has also been a point of research at the Centre for

  7. Taking Teacher Quality Seriously: A Collaborative Approach to Teacher Evaluation

    ERIC Educational Resources Information Center

    Karp, Stan

    2012-01-01

    If narrow, test-based evaluation of teachers is unfair, unreliable, and has negative effects on kids, classrooms, and curricula, what's a better approach? By demonizing teachers and unions, and sharply polarizing the education debate, the corporate reform movement has actually undermined serious efforts to improve teacher quality and evaluation.…

  8. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers.

    PubMed

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R; Weber, Barbara

    2016-05-09

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.

  9. Design and Establishment of Quality Model of Fundamental Geographic Information Database

    NASA Astrophysics Data System (ADS)

    Ma, W.; Zhang, J.; Zhao, Y.; Zhang, P.; Dang, Y.; Zhao, T.

    2018-04-01

    In order to make the quality evaluation of Fundamental Geographic Information Databases (FGIDB) more comprehensive, objective and accurate, this paper studies and establishes a quality model of FGIDB, formed by the standardization of database construction and quality control, the conformity of data set quality, and the functionality of the database management system. It also designs the overall principles, contents and methods of the quality evaluation for FGIDB, providing the basis and reference for carrying out quality control and quality evaluation of FGIDB. This paper gradually designs the quality elements, evaluation items and properties of the Fundamental Geographic Information Database based on the quality model framework. Connected organically, these quality elements and evaluation items constitute the quality model of the Fundamental Geographic Information Database. This model is the foundation for stipulating quality requirements and for quality evaluation of the Fundamental Geographic Information Database, and is of great significance for quality assurance in the design and development stage, the formulation of requirements in the testing and evaluation stage, and the construction of a standard system for quality evaluation technology of the Fundamental Geographic Information Database.

  10. Prediction of pilot opinion ratings using an optimal pilot model. [of aircraft handling qualities in multiaxis tasks

    NASA Technical Reports Server (NTRS)

    Hess, R. A.

    1977-01-01

    A brief review of some of the more pertinent applications of analytical pilot models to the prediction of aircraft handling qualities is undertaken. The relative ease with which multiloop piloting tasks can be modeled via the optimal control formulation makes the use of optimal pilot models particularly attractive for handling qualities research. To this end, a rating hypothesis is introduced which relates the numerical pilot opinion rating assigned to a particular vehicle and task to the numerical value of the index of performance resulting from an optimal pilot modeling procedure as applied to that vehicle and task. This hypothesis is tested using data from piloted simulations and is shown to be reasonable. An example concerning a helicopter landing approach is introduced to outline the predictive capability of the rating hypothesis in multiaxis piloting tasks.

  11. Design of a practical model-observer-based image quality assessment method for x-ray computed tomography imaging systems

    PubMed Central

    Tseng, Hsin-Wu; Fan, Jiahua; Kupinski, Matthew A.

    2016-01-01

    The use of a channelization mechanism on model observers not only makes mimicking human visual behavior possible, but also reduces the amount of image data needed to estimate the model observer parameters. The channelized Hotelling observer (CHO) and channelized scanning linear observer (CSLO) have recently been used to assess CT image quality for detection tasks and combined detection/estimation tasks, respectively. Although the use of channels substantially reduces the amount of data required to compute image quality, the number of scans required for CT imaging is still not practical for routine use. It is our desire to further reduce the number of scans required to make CHO or CSLO an image quality tool for routine and frequent system validations and evaluations. This work explores different data-reduction schemes and designs an approach that requires only a few CT scans. Three different kinds of approaches are included in this study: a conventional CHO/CSLO technique with a large sample size, a conventional CHO/CSLO technique with fewer samples, and an approach that we will show requires fewer samples to mimic conventional performance with a large sample size. The mean value and standard deviation of the areas under the ROC/EROC curves were estimated using the well-validated shuffle approach. The results indicate that an 80% data reduction can be achieved without loss of accuracy. This substantial data reduction is a step toward a practical tool for routine-task-based QA/QC CT system assessment. PMID:27493982
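    For orientation, the sketch below implements a generic channelized Hotelling observer test statistic and an empirical AUC on synthetic channel outputs; it is a textbook construction under assumed Gaussian statistics, not the paper's data-reduction scheme or its shuffle-based estimator.

```python
import numpy as np

rng = np.random.default_rng(5)
n_channels, n_images = 10, 200
# Synthetic channel outputs for signal-absent and signal-present images.
v_absent = rng.normal(0.0, 1.0, (n_images, n_channels))
v_present = rng.normal(0.3, 1.0, (n_images, n_channels))

# Hotelling template: pooled channel covariance applied to the mean difference.
s_pooled = 0.5 * (np.cov(v_absent.T) + np.cov(v_present.T))
delta_v = v_present.mean(axis=0) - v_absent.mean(axis=0)
w = np.linalg.solve(s_pooled, delta_v)

# Decision variables and a simple empirical AUC (probability of correct ranking).
t_absent, t_present = v_absent @ w, v_present @ w
auc = np.mean(t_present[:, None] > t_absent[None, :])
print("empirical AUC:", round(auc, 3))
```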

  12. Dynamic extreme values modeling and monitoring by means of sea shores water quality biomarkers and valvometry.

    PubMed

    Durrieu, Gilles; Pham, Quang-Khoai; Foltête, Anne-Sophie; Maxime, Valérie; Grama, Ion; Tilly, Véronique Le; Duval, Hélène; Tricot, Jean-Marie; Naceur, Chiraz Ben; Sire, Olivier

    2016-07-01

    Water quality can be evaluated using biomarkers such as tissular enzymatic activities of endemic species. Measurement of bivalve mollusc activity at high frequency (i.e., valvometry) over a long time period is another way to record animal behavior and to evaluate perturbations of water quality in real time. As pollution affects the activity of oysters, we consider the valve opening and closing velocities for water quality assessment. We propose to model the huge volume of velocity data collected in the framework of valvometry using a new nonparametric extreme values statistical model. The objective is to estimate the tail probabilities and the extreme quantiles of the distribution of valve closing velocity. The tail of the distribution function of valve closing velocity is modeled, beyond a threshold τ, by a Pareto distribution whose parameter depends on the time t of the experiment and on τ. Our modeling approach reveals the dependence between the specific activity of two enzymatic biomarkers (glutathione-S-transferase and acetylcholinesterase) and the continuous recording of oyster valve velocity, proving the suitability of this tool for water quality assessment. Thus, valvometry allows real-time in situ analysis of bivalve behavior and appears to be an effective early-warning tool for ecological risk assessment and marine environment monitoring.
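    A simplified peaks-over-threshold sketch is shown below: a generalized Pareto distribution is fitted to exceedances of a (synthetic) valve-closing-velocity series above a high threshold and used to estimate an extreme quantile. This is a plain stationary fit, not the paper's nonparametric time-varying estimator.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
velocity = rng.lognormal(mean=0.0, sigma=0.6, size=5000)   # synthetic closing speeds (arb. units)

tau = np.quantile(velocity, 0.95)                          # high threshold for the tail
exceedances = velocity[velocity > tau] - tau
shape, loc, scale = stats.genpareto.fit(exceedances, floc=0.0)

# Extreme quantile: the velocity exceeded with overall probability 1e-3.
p_cond = 1e-3 / (1 - 0.95)                                 # conditional tail probability given exceedance
q_extreme = tau + stats.genpareto.ppf(1 - p_cond, shape, loc=0.0, scale=scale)
print("estimated 99.9th-percentile closing velocity:", round(q_extreme, 3))
```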

  13. Nosocomial Infection Reduction in VLBW Infants With a Statewide Quality-Improvement Model

    PubMed Central

    Powers, Richard J.; Pettit, Janet S.; Lee, Henry C.; Boscardin, W. John; Ahmad Subeh, Mohammad; Gould, Jeffrey B.

    2011-01-01

    OBJECTIVE: To evaluate the effectiveness of the California Perinatal Quality Care Collaborative quality-improvement model using a toolkit supplemented by workshops and Web casts in decreasing nosocomial infections in very low birth weight infants. DESIGN: This was a retrospective cohort study of continuous California Perinatal Quality Care Collaborative members' data during the years 2002–2006. The primary dependent variable was nosocomial infection, defined as a late bacterial or coagulase-negative staphylococcal infection diagnosed after the age of 3 days by positive blood/cerebro-spinal fluid culture(s) and clinical criteria. The primary independent variable of interest was voluntary attendance at the toolkit's introductory event, a direct indicator that at least 1 member of an NICU team had been personally exposed to the toolkit's features rather than being only notified of its availability. The intervention's effects were assessed using a multivariable logistic regression model that risk adjusted for selected demographic and clinical factors. RESULTS: During the study period, 7733 eligible very low birth weight infants were born in 27 quality-improvement participant hospitals and 4512 very low birth weight infants were born in 27 non–quality-improvement participant hospitals. For the entire cohort, the rate of nosocomial infection decreased from 16.9% in 2002 to 14.5% in 2006. For infants admitted to NICUs participating in at least 1 quality-improvement event, there was an associated decreased risk of nosocomial infection (odds ratio: 0.81 [95% confidence interval: 0.68–0.96]) compared with those admitted to nonparticipating hospitals. CONCLUSIONS: The structured intervention approach to quality improvement in the NICU setting, using a toolkit along with attendance at a workshop and/or Web cast, is an effective means by which to improve care outcomes. PMID:21339273

  14. Development of an Instructional Quality Assurance Model in Nursing Science

    ERIC Educational Resources Information Center

    Ajpru, Haruthai; Pasiphol, Shotiga; Wongwanich, Suwimon

    2011-01-01

    The purpose of this study was to develop an instructional quality assurance model in nursing science. The study was divided into 3 phases: (1) to study the information needed for instructional quality assurance model development, (2) to develop an instructional quality assurance model in nursing science, and (3) to audit and assess the developed…

  15. Combined comfort model of thermal comfort and air quality on buses in Hong Kong.

    PubMed

    Shek, Ka Wing; Chan, Wai Tin

    2008-01-25

    Air-conditioning settings are important factors in controlling the comfort of passengers on buses. The local bus operators control in-bus air quality and the thermal environment by conforming to the prescribed levels stated in published standards. As a result, the settings are merely adjusted to fulfill the standards, rather than to satisfy the passengers' thermal comfort and air quality. Such "standard-oriented" practices are not appropriate; the passengers' preferences and satisfaction should be emphasized instead. Thus a "comfort-oriented" philosophy should be implemented to achieve a comfortable in-bus commuting environment. In this study, the achievement of a comfortable in-bus environment was examined with emphasis on thermal comfort and air quality. Both measurements of physical parameters and subjective questionnaire surveys were conducted to collect practical in-bus thermal and air parameter data, as well as subjective satisfaction and sensation votes from the passengers. By analyzing the correlation between the objective and subjective data, a combined comfort model was developed. The model helped in evaluating the percentage of dissatisfaction under various combinations of passengers' sensation votes towards thermal comfort and air quality. An approach integrating the combined comfort model, hardware and software systems, and the bus air-conditioning system could effectively control the transient in-bus environment. By processing and analyzing the data from the continuous monitoring system with the combined comfort model, air-conditioning setting adjustment commands could be determined and delivered to the hardware. This system adjusted air-conditioning settings depending on real-time commands along the bus journey. Therefore, a comfortable in-bus air quality and thermal environment could be achieved and efficiently maintained along the bus journey despite dynamic outdoor influences. Moreover, this model can help optimize air

  16. Measuring the value of air quality: application of the spatial hedonic model.

    PubMed

    Kim, Seung Gyu; Cho, Seong-Hoon; Lambert, Dayton M; Roberts, Roland K

    2010-03-01

    This study applies a hedonic model to assess the economic benefits of air quality improvement following the 1990 Clean Air Act Amendment at the county level in the lower 48 United States. An instrumental variable approach that combines geographically weighted regression and spatial autoregression methods (GWR-SEM) is adopted to simultaneously account for spatial heterogeneity and spatial autocorrelation. SEM mitigates spatial dependency while GWR addresses spatial heterogeneity by allowing response coefficients to vary across observations. Positive amenity values of improved air quality are found in four major clusters: (1) in East Kentucky and most of Georgia around the Southern Appalachian area; (2) in a few counties in Illinois; (3) on the border of Oklahoma and Kansas, on the border of Kansas and Nebraska, and in east Texas; and (4) in a few counties in Montana. Clusters of significant positive amenity values may exist because of a combination of intense air pollution and consumer awareness of diminishing air quality.

  17. A fuzzy MCDM model with objective and subjective weights for evaluating service quality in hotel industries

    NASA Astrophysics Data System (ADS)

    Zoraghi, Nima; Amiri, Maghsoud; Talebi, Golnaz; Zowghi, Mahdi

    2013-12-01

    This paper presents a fuzzy multi-criteria decision-making (FMCDM) model that integrates both subjective and objective weights for ranking and evaluating the service quality of hotels. The objective method selects weights of criteria through mathematical calculation, while the subjective method uses judgments of decision makers. In this paper, we use a combination of weights obtained by both approaches in evaluating service quality in hotel industries. A real case study ranking five hotels is presented. Examples are shown to indicate the capabilities of the proposed method.
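    A crisp (non-fuzzy) simplification of the weight-combination idea is sketched below: entropy-based objective weights are blended with subjective expert weights by a convex combination and used to rank alternatives. The decision matrix, subjective weights and blend factor are hypothetical, and the fuzzy machinery of the paper is not reproduced.

```python
import numpy as np

# Hypothetical decision matrix: rows = hotels, columns = service criteria.
scores = np.array([[7, 8, 6, 9],
                   [6, 9, 7, 7],
                   [8, 6, 8, 6],
                   [7, 7, 9, 8],
                   [9, 6, 6, 7]], dtype=float)

# Objective weights from the Shannon entropy of the normalized columns.
p = scores / scores.sum(axis=0)
entropy = -(p * np.log(p)).sum(axis=0) / np.log(len(scores))
w_obj = (1 - entropy) / (1 - entropy).sum()

w_subj = np.array([0.4, 0.3, 0.2, 0.1])   # hypothetical expert judgements
lam = 0.5                                  # assumed blend factor
w = lam * w_obj + (1 - lam) * w_subj

ranking = np.argsort(scores @ w)[::-1]     # hotel indices, best first
print("combined weights:", np.round(w, 3), "ranking:", ranking)
```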

  18. Quality Saving Mechanisms of Mitochondria during Aging in a Fully Time-Dependent Computational Biophysical Model

    PubMed Central

    Mellem, Daniel; Fischer, Frank; Jaspers, Sören; Wenck, Horst; Rübhausen, Michael

    2016-01-01

    Mitochondria are essential for the energy production of eukaryotic cells. During aging mitochondria run through various processes which change their quality in terms of activity, health and metabolic supply. In recent years, many of these processes such as fission and fusion of mitochondria, mitophagy, mitochondrial biogenesis and energy consumption have been subject of research. Based on numerous experimental insights, it was possible to qualify mitochondrial behaviour in computational simulations. Here, we present a new biophysical model based on the approach of Figge et al. in 2012. We introduce exponential decay and growth laws for each mitochondrial process to derive its time-dependent probability during the aging of cells. All mitochondrial processes of the original model are mathematically and biophysically redefined and additional processes are implemented: Mitochondrial fission and fusion is separated into a metabolic outer-membrane part and a protein-related inner-membrane part, a quality-dependent threshold for mitophagy and mitochondrial biogenesis is introduced and processes for activity-dependent internal oxidative stress as well as mitochondrial repair mechanisms are newly included. Our findings reveal a decrease of mitochondrial quality and a fragmentation of the mitochondrial network during aging. Additionally, the model discloses a quality increasing mechanism due to the interplay of the mitophagy and biogenesis cycle and the fission and fusion cycle of mitochondria. It is revealed that decreased mitochondrial repair can be a quality saving process in aged cells. Furthermore, the model finds strategies to sustain the quality of the mitochondrial network in cells with high production rates of reactive oxygen species due to large energy demands. Hence, the model adds new insights to biophysical mechanisms of mitochondrial aging and provides novel understandings of the interdependency of mitochondrial processes. PMID:26771181

  19. A Multivariate Quality Loss Function Approach for Optimization of Spinning Processes

    NASA Astrophysics Data System (ADS)

    Chakraborty, Shankar; Mitra, Ankan

    2018-05-01

    Recent advancements in the textile industry have given rise to several spinning techniques, such as ring spinning, rotor spinning, etc., which can be used to produce a wide variety of textile apparel to fulfil the end requirements of customers. To achieve the best out of these processes, they should be utilized at their optimal parametric settings. However, in the presence of multiple yarn characteristics which are often conflicting in nature, it becomes a challenging task for spinning industry personnel to identify the best parametric mix which would simultaneously optimize all the responses. Hence, in this paper, the applicability of a new systematic approach in the form of the multivariate quality loss function technique is explored for optimizing multiple quality characteristics of yarns while identifying the ideal settings of two spinning processes. It is observed that this approach performs well against other multi-objective optimization techniques, such as the desirability function, distance function and mean squared error methods. With slight modifications in the upper and lower specification limits of the considered quality characteristics, and in the constraints of the non-linear optimization problem, it can be successfully applied to other processes in the textile industry to determine their optimal parametric settings.

  20. A systematic approach towards the development of quality indicators for postnatal care after discharge in Flanders, Belgium.

    PubMed

    Helsloot, Kaat; Walraevens, Mieke; Besauw, Saskia Van; Van Parys, An-Sofie; Devos, Hanne; Holsbeeck, Ann Van; Roelens, Kristien

    2017-05-01

    OBJECTIVE: to develop a set of quality indicators for postnatal care after discharge from the hospital, using a systematic approach. DESIGN: key elements of qualitative postnatal care were defined by performing a systematic review, and the literature was searched for potential indicators (step 1). The potential indicators were evaluated against five criteria (validity, reliability, sensitivity, feasibility and acceptability) and by making use of the 'Appraisal of Guidelines for Research and Evaluation', the AIRE instrument (step 2). In a modified Delphi survey, the quality indicators were presented to a panel of experts in the field of postnatal care using an online tool (step 3). The final results led to a Flemish model of postnatal care (step 4). SETTING: Flanders, Belgium. PARTICIPANTS: health care professionals, representatives of health care organisations and policy makers with expertise in the field of postnatal care. FINDINGS: after analysis, 57 research articles, 10 reviews, one book and eight other documents resulted in 150 potential quality indicators in seven critical care domains. Quality assessment of the indicators resulted in 58 concept quality indicators, which were presented to an expert panel of health care professionals. After two Delphi rounds, 30 quality indicators (six structure, 17 process, and seven outcome indicators) were found appropriate to monitor and improve the quality of postnatal care after discharge from the hospital. KEY CONCLUSIONS AND IMPLICATIONS FOR CLINICAL PRACTICE: the quality indicators resulted in a Flemish model of qualitative postnatal care that was implemented by health authorities as a minimum standard in the context of shortened length of stay. Postnatal care should be adjusted to a flexible length of stay and start in pregnancy with an individualised care plan that follows mother and new-born throughout pregnancy, childbirth and the postnatal period. Criteria for discharge and local protocols about the organisation and content of care are essential to facilitate

  1. SLS Navigation Model-Based Design Approach

    NASA Technical Reports Server (NTRS)

    Oliver, T. Emerson; Anzalone, Evan; Geohagan, Kevin; Bernard, Bill; Park, Thomas

    2018-01-01

    The SLS Program chose to implement a Model-based Design and Model-based Requirements approach for managing component design information and system requirements. This approach differs from previous large-scale design efforts at Marshall Space Flight Center where design documentation alone conveyed information required for vehicle design and analysis and where extensive requirements sets were used to scope and constrain the design. The SLS Navigation Team has been responsible for the Program-controlled Design Math Models (DMMs) which describe and represent the performance of the Inertial Navigation System (INS) and the Rate Gyro Assemblies (RGAs) used by Guidance, Navigation, and Controls (GN&C). The SLS Navigation Team is also responsible for the navigation algorithms. The navigation algorithms are delivered for implementation on the flight hardware as a DMM. For the SLS Block 1-B design, the additional GPS Receiver hardware is managed as a DMM at the vehicle design level. This paper provides a discussion of the processes and methods used to engineer, design, and coordinate engineering trades and performance assessments using SLS practices as applied to the GN&C system, with a particular focus on the Navigation components. These include composing system requirements, requirements verification, model development, model verification and validation, and modeling and analysis approaches. The Model-based Design and Requirements approach does not reduce the effort associated with the design process versus previous processes used at Marshall Space Flight Center. Instead, the approach takes advantage of overlap between the requirements development and management process, and the design and analysis process by efficiently combining the control (i.e. the requirement) and the design mechanisms. The design mechanism is the representation of the component behavior and performance in design and analysis tools. The focus in the early design process shifts from the development and

  2. Advances in the meta-analysis of heterogeneous clinical trials II: The quality effects model.

    PubMed

    Doi, Suhail A R; Barendregt, Jan J; Khan, Shahjahan; Thalib, Lukman; Williams, Gail M

    2015-11-01

    This article examines the performance of the updated quality effects (QE) estimator for meta-analysis of heterogeneous studies. It is shown that this approach leads to a decreased mean squared error (MSE) of the estimator while maintaining the nominal level of coverage probability of the confidence interval. Extensive simulation studies confirm that this approach leads to the maintenance of the correct coverage probability of the confidence interval, regardless of the level of heterogeneity, as well as a lower observed variance compared to the random effects (RE) model. The QE model is robust to subjectivity in quality assessment down to completely random entry, in which case its MSE equals that of the RE estimator. When the proposed QE method is applied to a meta-analysis of magnesium for myocardial infarction data, the pooled mortality odds ratio (OR) becomes 0.81 (95% CI 0.61-1.08) which favors the larger studies but also reflects the increased uncertainty around the pooled estimate. In comparison, under the RE model, the pooled mortality OR is 0.71 (95% CI 0.57-0.89) which is less conservative than that of the QE results. The new estimation method has been implemented into the free meta-analysis software MetaXL which allows comparison of alternative estimators and can be downloaded from www.epigear.com. Copyright © 2015 Elsevier Inc. All rights reserved.
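
    As a rough numerical illustration of the idea only (not the published QE estimator of Doi et al.), the sketch below scales inverse-variance weights by 0-1 quality scores before pooling hypothetical log odds ratios; the actual QE model redistributes the subtracted weight across studies rather than simply renormalising.

      import numpy as np

      # Hypothetical study effects (log odds ratios), variances and 0-1 quality scores.
      yi = np.array([-0.35, -0.10, 0.05, -0.60, -0.20])
      vi = np.array([0.040, 0.010, 0.090, 0.150, 0.025])
      Qi = np.array([0.9, 1.0, 0.4, 0.6, 0.8])

      w_iv = 1.0 / vi          # inverse-variance (fixed-effect) weights
      w_q = Qi * w_iv          # simplified quality adjustment (illustrative only)
      w_q = w_q / w_q.sum()

      pooled_iv = np.sum(w_iv * yi) / w_iv.sum()
      pooled_q = np.sum(w_q * yi)
      print(f"IV-pooled OR: {np.exp(pooled_iv):.2f}, quality-weighted OR: {np.exp(pooled_q):.2f}")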

  3. Measuring, evaluating and improving hospital quality parameters/dimensions--an integrated healthcare quality approach.

    PubMed

    Zineldin, Mosad; Camgöz-Akdağ, Hatice; Vasicheva, Valiantsina

    2011-01-01

    This paper aims to empirically examine the major factors affecting patients' cumulative satisfaction and to address the question of whether patients in Kazakhstan evaluate healthcare similarly or differently from patients in Egypt and Jordan. A questionnaire, adapted from previous research, was distributed to Kazakhstan inpatients. The questionnaire contained 39 attributes about five newly-developed quality dimensions (5Qs), which were identified to be the most relevant attributes for hospitals. The questionnaire was translated into Russian to increase the response rate and improve data quality. Almost 200 usable questionnaires were returned. Frequency distribution, factor analysis and reliability checks were used to analyze the data. The three biggest concerns for Kazakhstan patients are: infrastructure; atmosphere; and interaction. Hospital staff's concern for patients' needs, parking facilities for visitors, waiting time and food temperature were all common specific attributes, which were perceived as concerns. These were shortcomings in all three countries. Improving health service quality by applying total relationship management and the 5Qs model together with a customer-orientation strategy is recommended. Results can be used by hospital staff to creatively reengineer and redesign their quality management processes and help move towards more effective healthcare quality strategies. Patients in three countries have similar concerns and quality perceptions. The paper describes a new instrument and method. The study assures relevance, validity and reliability, while being explicitly change-oriented. The authors argue that patient satisfaction is a cumulative construct, summing satisfaction as five different qualities (5Qs): object; processes; infrastructure; interaction and atmosphere.

  4. Structural dynamic model obtained from flight use with piloted simulation and handling qualities analysis

    NASA Technical Reports Server (NTRS)

    Powers, Bruce G.

    1996-01-01

    The ability to use flight data to determine an aircraft model with structural dynamic effects suitable for piloted simulation and handling qualities analysis has been developed. This technique was demonstrated using SR-71 flight test data. For the SR-71 aircraft, the most significant structural response is the longitudinal first-bending mode. This mode was modeled as a second-order system, and the other higher order modes were modeled as a time delay. The distribution of the modal response at various fuselage locations was developed using a uniform beam solution, which can be calibrated using flight data. This approach was compared to the mode shape obtained from the ground vibration test, and the general form of the uniform beam solution was found to be a good representation of the mode shape in the areas of interest. To calibrate the solution, pitch-rate and normal-acceleration instrumentation is required for at least two locations. With the resulting structural model incorporated into the simulation, a good representation of the flight characteristics was provided for handling qualities analysis and piloted simulation.
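
    A minimal sketch of this kind of representation, assuming illustrative (non-SR-71) modal parameters: the first-bending mode is expressed as a second-order transfer function and the unmodelled higher-order modes are lumped into a pure time delay applied to the response.

      import numpy as np
      from scipy import signal

      omega_n = 2 * np.pi * 7.0   # modal frequency, rad/s (assumed)
      zeta = 0.02                 # modal damping ratio (assumed)
      gain = 0.05                 # modal participation at the sensor location (assumed)

      # G(s) = gain * wn^2 / (s^2 + 2*zeta*wn*s + wn^2), the second-order bending mode
      bending_mode = signal.TransferFunction([gain * omega_n**2],
                                             [1.0, 2 * zeta * omega_n, omega_n**2])

      t = np.linspace(0.0, 2.0, 2000)
      t_out, y = signal.step(bending_mode, T=t)

      # Higher-order modes approximated as a fixed time delay of the modal response.
      time_delay = 0.01  # seconds (assumed)
      y_delayed = np.interp(t_out - time_delay, t_out, y, left=0.0)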

  5. Response in the water quality of the Salton Sea, California, to changes in phosphorus loading: An empirical modeling approach

    USGS Publications Warehouse

    Robertson, Dale M.; Schladow, S.G.

    2008-01-01

    Salton Sea, California, like many other lakes, has become eutrophic because of excessive nutrient loading, primarily phosphorus (P). A Total Maximum Daily Load (TMDL) is being prepared for P to reduce the input of P to the Sea. In order to better understand how P-load reductions should affect the average annual water quality of this terminal saline lake, three different eutrophication programs (BATHTUB, WiLMS, and the Seepage Lake Model) were applied. After verifying that specific empirical models within these programs were applicable to this saline lake, each model was calibrated using water-quality and nutrient-loading data for 1999 and then used to simulate the effects of specific P-load reductions. Model simulations indicate that a 50% decrease in external P loading would decrease near-surface total phosphorus concentrations (TP) by 25-50%. Application of other empirical models demonstrated that this decrease in loading should decrease near-surface chlorophyll a concentrations (Chl a) by 17-63% and increase Secchi depths (SD) by 38-97%. The wide range in estimated responses in Chl a and SD was primarily caused by uncertainty in how non-algal turbidity would respond to P-load reductions. If only the models most applicable to the Salton Sea are considered, a 70-90% P-load reduction is required for the Sea to be classified as moderately eutrophic (trophic state index of 55). These models simulate steady-state conditions in the Sea; therefore, it is difficult to ascertain how long it would take for the simulated changes to occur after load reductions. © 2008 Springer Science+Business Media B.V.
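
    A minimal sketch of the kind of empirical steady-state response involved, assuming a Vollenweider-type relation with illustrative inputs rather than the calibrated BATHTUB or WiLMS equations. A strictly linear formulation like this one responds proportionally to the load cut; sub-models with concentration-dependent sedimentation respond less than proportionally, which is part of why the simulated TP response spans a range.

      def steady_state_tp(load_mg_m2_yr, mean_depth_m, residence_time_yr, sigma=1.0):
          """In-lake total phosphorus (mg/m^3) for an areal P load (Vollenweider-type)."""
          q_s = mean_depth_m / residence_time_yr          # areal hydraulic loading, m/yr
          return load_mg_m2_yr / (q_s * (1.0 + sigma * residence_time_yr ** 0.5))

      baseline_load = 800.0  # mg P m^-2 yr^-1 (assumed for illustration)
      tp_now = steady_state_tp(baseline_load, mean_depth_m=8.0, residence_time_yr=7.0)
      tp_cut = steady_state_tp(0.5 * baseline_load, mean_depth_m=8.0, residence_time_yr=7.0)
      print(f"TP response to a 50% load cut: {100 * (1 - tp_cut / tp_now):.0f}% decrease")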

  6. MQAPRank: improved global protein model quality assessment by learning-to-rank.

    PubMed

    Jing, Xiaoyang; Dong, Qiwen

    2017-05-25

    Protein structure prediction has achieved a lot of progress during the last few decades and a greater number of models for a certain sequence can be predicted. Consequently, assessing the qualities of predicted protein models in perspective is one of the key components of successful protein structure prediction. Over the past years, a number of methods have been developed to address this issue, which can be roughly divided into three categories: single methods, quasi-single methods and clustering (or consensus) methods. Although these methods achieve much success at different levels, accurate protein model quality assessment is still an open problem. Here, we present MQAPRank, a global protein model quality assessment program based on learning-to-rank. MQAPRank first sorts the decoy models using a single-model method based on a learning-to-rank algorithm to indicate their relative qualities for the target protein. It then takes the first five models as references and predicts the qualities of the other models from their average GDT_TS scores against the reference models. Benchmarked on the CASP11 and 3DRobot datasets, MQAPRank achieved better performance than other leading protein model quality assessment methods. Recently, MQAPRank participated in CASP12 under the group name FDUBio and achieved state-of-the-art performance. MQAPRank provides a convenient and powerful tool for protein model quality assessment with state-of-the-art performance, and it is useful for protein structure prediction and model quality assessment.
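
    A sketch of the two-stage scoring described above; `learned_ranker` (the trained learning-to-rank scorer) and `gdt_ts` (a structure superposition score) are hypothetical placeholders, and decoys are assumed to be identified by file path.

      def rank_and_score(decoys, learned_ranker, gdt_ts, n_refs=5):
          """Score decoys by mean GDT_TS against the top-ranked reference models."""
          # Stage 1: sort decoys by the learning-to-rank model's predicted quality.
          ranked = sorted(decoys, key=learned_ranker, reverse=True)
          references = ranked[:n_refs]

          # Stage 2: consensus-style score = average GDT_TS against the reference set.
          scores = {}
          for decoy in decoys:
              refs = [r for r in references if r != decoy] or references
              scores[decoy] = sum(gdt_ts(decoy, r) for r in refs) / len(refs)
          return scores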

  7. Utilizing Operational and Improved Remote Sensing Measurements to Assess Air Quality Monitoring Model Forecasts

    NASA Astrophysics Data System (ADS)

    Gan, Chuen-Meei

    Air quality model forecasts from the Weather Research and Forecasting (WRF) and Community Multiscale Air Quality (CMAQ) models are often used to support air quality applications such as regulatory issues and scientific inquiries on atmospheric science processes. In urban environments, these models become more complex due to the inherent complexity of the land surface coupling and the enhanced pollutant emissions. This makes it very difficult to diagnose the model if the surface parameter forecasts, such as PM2.5 (particulate matter with aerodynamic diameter less than 2.5 μm), are not accurate. For this reason, getting accurate boundary layer dynamic forecasts is as essential as quantifying realistic pollutant emissions. In this thesis, we explore the usefulness of vertical sounding measurements for assessing meteorological and air quality forecast models. In particular, we focus on assessing the WRF model (12 km x 12 km) coupled with the CMAQ model for the urban New York City (NYC) area using multiple vertical profiling and column-integrated remote sensing measurements. This assessment is helpful in probing the root causes of WRF-CMAQ overestimates of surface PM2.5 occurring both predawn and post-sunset in the NYC area during the summer. In particular, we find that significant underestimates in the WRF PBL height forecast are a key factor in explaining this anomaly. On the other hand, the model predictions of the PBL height during daytime, when convective heating dominates, were found to be highly correlated to lidar-derived PBL height with minimal bias. Additional topics covered in this thesis include a mathematical method using a direct Mie scattering approach to convert aerosol microphysical properties from CMAQ into optical parameters, making direct comparisons with lidar and multispectral radiometers feasible. Finally, we explore some tentative ideas on combining visible (VIS) and mid-infrared (MIR) sensors to better separate aerosols into fine and coarse modes.

  8. Analytical Quality by Design Approach in RP-HPLC Method Development for the Assay of Etofenamate in Dosage Forms

    PubMed Central

    Peraman, R.; Bhadraya, K.; Reddy, Y. Padmanabha; Reddy, C. Surayaprakash; Lokesh, T.

    2015-01-01

    By considering the current regulatory requirements for analytical method development, a reversed-phase high performance liquid chromatographic method for routine analysis of etofenamate in dosage form has been optimized using an analytical quality by design approach. Unlike the routine approach, the present study was initiated with an understanding of the quality target product profile, the analytical target profile and a risk assessment for method variables that affect the method response. A liquid chromatography system equipped with a C18 column (250×4.6 mm, 5 μm), a binary pump and a photodiode array detector was used in this work. The experiments were conducted according to a plan based on a central composite design, which could save time, reagents and other resources. Sigma Tech software was used to plan and analyse the experimental observations and to obtain a quadratic process model. The process model was used to predict the retention time. The retention times predicted from the contour diagram were verified experimentally and agreed with the observed data. The optimized method used a flow rate of 1.2 ml/min with a mobile phase of methanol and 0.2% triethylamine in water (85:15, % v/v), pH adjusted to 6.5. The method was validated and verified for targeted method performance, robustness and system suitability during method transfer. PMID:26997704
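
    A small sketch of the kind of experimental plan involved: a face-centred central composite design for three factors suggested by the abstract (flow rate, mobile phase composition and pH), with the factor ranges assumed for illustration.

      import itertools

      # Coded levels -1/0/+1 for three factors; the ranges are assumed, not from the paper.
      factors = {
          "flow_ml_min": (1.0, 1.2, 1.4),
          "percent_methanol": (80.0, 85.0, 90.0),
          "pH": (6.0, 6.5, 7.0),
      }
      LO, MID, HI = 0, 1, 2
      names = list(factors)

      # 2^3 factorial corners, face-centred axial (star) points, and the centre point.
      corners = [{m: factors[m][lvl] for m, lvl in zip(names, combo)}
                 for combo in itertools.product((LO, HI), repeat=3)]
      axials = []
      for name in names:
          for lvl in (LO, HI):
              point = {m: factors[m][MID] for m in names}
              point[name] = factors[name][lvl]
              axials.append(point)
      centre = [{m: factors[m][MID] for m in names}]

      design = corners + axials + centre
      print(f"{len(design)} runs")  # 8 + 6 + 1 = 15; replicate the centre point in practice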

  9. An Innovative Approach To Teaching High School Students about Indoor Air Quality.

    ERIC Educational Resources Information Center

    Neumann, Catherine M.; Bloomfield, Molly M.; Harding, Anna K.; Sherburne, Holly

    1999-01-01

    Describes an innovative approach used to help high school students develop critical thinking and real-world problem-solving skills while learning about indoor air quality. (Contains 13 references.) (Author/WRM)

  10. Flared landing approach flying qualities. Volume 1: Experiment design and analysis

    NASA Technical Reports Server (NTRS)

    Weingarten, Norman C.; Berthe, Charles J., Jr.; Rynaski, Edmund G.; Sarrafian, Shahan K.

    1986-01-01

    An inflight research study was conducted utilizing the USAF Total Inflight Simulator (TIFS) to investigate longitudinal flying qualities for the flared landing approach phase of flight. The purpose of the experiment was to generate a consistent set of data for: (1) determining what kind of commanded response the pilot prefers in order to flare and land an airplane with precision, and (2) refining a time history criterion that takes into account all the necessary variables and their characteristics and would accurately predict flying qualities. The results of the first part provide guidelines to the flight control system designer, using MIL-F-8785C as a guide, that yield the dynamic behavior pilots prefer in flared landings. The results of the second part provide the flying qualities engineer with a newly derived flying qualities predictive tool which appears to be highly accurate. This time domain predictive flying qualities criterion was applied to the flight data as well as six previous flying qualities studies, and the results indicate that the criterion predicted the flying qualities level 81% of the time and the Cooper-Harper pilot rating, within ±1, 60% of the time.

  11. Improvement of quality of 3D printed objects by elimination of microscopic structural defects in fused deposition modeling.

    PubMed

    Gordeev, Evgeniy G; Galushko, Alexey S; Ananikov, Valentine P

    2018-01-01

    Additive manufacturing with fused deposition modeling (FDM) is currently optimized for a wide range of research and commercial applications. The major disadvantage of FDM-created products is their low quality and structural defects (porosity), which pose an obstacle to utilizing them in functional prototyping and direct digital manufacturing of objects intended to be in contact with gases and liquids. This article describes a simple and efficient approach for assessing the quality of 3D printed objects. Using this approach it was shown that the wall permeability of a printed object depends on its geometric shape and is gradually reduced in the following series: cylinder > cube > pyramid > sphere > cone. Filament feed rate, wall geometry and G-code-defined wall structure were found to be the primary parameters that influence the quality of 3D-printed products. Optimization of these parameters led to an overall increase in quality and improvement of sealing properties. It was demonstrated that high quality of 3D printed objects can be achieved using routinely available printers and standard filaments.

  12. Effect of the spatiotemporal variability of rainfall inputs in water quality integrated catchment modelling for dissolved oxygen concentrations

    NASA Astrophysics Data System (ADS)

    Moreno Ródenas, Antonio Manuel; Cecinati, Francesca; ten Veldhuis, Marie-Claire; Langeveld, Jeroen; Clemens, Francois

    2016-04-01

    Maintaining water quality standards in highly urbanised hydrological catchments is a worldwide challenge. Water management authorities struggle to cope with a changing climate and an increase in pollution pressures. Water quality modelling has been used as a decision support tool for investment and regulatory developments. This approach led to the development of integrated catchment models (ICM), which account for the link between the urban/rural hydrology and the in-river pollutant dynamics. In the modelled system, rainfall triggers the drainage systems of urban areas scattered along a river. When flow exceeds the sewer infrastructure capacity, untreated wastewater enters the natural system by combined sewer overflows. This results in a degradation of the river water quality, depending on the magnitude of the emission and river conditions. Thus, being capable of representing these dynamics in the modelling process is key for a correct assessment of the water quality. In many urbanised hydrological systems the distances between draining sewer infrastructures go beyond the de-correlation length of rainfall processes, especially for convective summer storms. Hence, the spatial and temporal scales of selected rainfall inputs are expected to affect water quality dynamics. The objective of this work is to evaluate how the use of rainfall data from different sources and with different space-time characteristics affects modelled output concentrations of dissolved oxygen in a simplified ICM. The study area is located at the Dommel, a relatively small and sensitive river flowing through the city of Eindhoven (The Netherlands). This river stretch receives the discharge of the 750,000 p.e. WWTP of Eindhoven and of over 200 combined sewer overflows scattered along its length. A pseudo-distributed water quality model has been developed in WEST (mikedhi.com); this is a lumped, physically based model that accounts for urban drainage processes, WWTP and river dynamics for several

  13. Klang River water quality modelling using music

    NASA Astrophysics Data System (ADS)

    Zahari, Nazirul Mubin; Zawawi, Mohd Hafiz; Muda, Zakaria Che; Sidek, Lariyah Mohd; Fauzi, Nurfazila Mohd; Othman, Mohd Edzham Fareez; Ahmad, Zulkepply

    2017-09-01

    Water is an essential resource that sustains life on earth; changes in the natural quality and distribution of water have ecological impacts that can sometimes be devastating. Recently, Malaysia has been facing many environmental issues regarding water pollution. The main causes of river pollution are rapid urbanisation arising from the development of residential, commercial and industrial sites, infrastructural facilities and others. The purpose of the study was to predict the water quality of the Klang River at the Connaught Bridge Power Station (CBPS), to assess the effects of low and high tide, and to forecast the pollutant concentrations of Biochemical Oxygen Demand (BOD) and Total Suspended Solids (TSS) for the existing land use of the catchment area through water quality modelling using the MUSIC software. A further aim was to identify an integrated urban stormwater treatment system (Best Management Practices, or BMPs) that achieves optimal performance in improving the water quality of catchments with tropical climates. Results from the MUSIC model show that the BOD5 concentration at station 1 can be reduced from Class IV to Class III, whereas the TSS concentration at station 1 improves from Class III to Class II. The model predicted a mean TSS reduction of 0.17%, TP reduction of 0.14%, TN reduction of 0.48% and BOD5 reduction of 0.31% for station 1. Thus, after the proposed BMPs the modelled water quality is suitable for use; water quality monitoring remains important because of threats from activities that are harmful to aquatic organisms and public health.

  14. Using Water Quality Models in Management - A Multiple Model Assessment, Analysis of Confidence, and Evaluation of Climate Change Impacts

    NASA Astrophysics Data System (ADS)

    Irby, Isaac David

    pollution diet in light of future projections for air temperature, sea level, and precipitation was examined. While a changing climate will reduce the ability of the nutrient reduction to improve oxygen concentrations, that effect is trumped by the improvements in dissolved oxygen stemming from the pollution diet itself. However, climate change still has the potential to cause the current level of nutrient reduction to be inadequate. This is primarily due to the fact that low-oxygen conditions are predicted to start one week earlier, on average, in the future, with the primary changes resulting from the increase in temperature. Overall, this research lends an increased degree of confidence in the water quality modeling of the potential impact of the Chesapeake Bay pollution diet. This research also establishes the efficacy of utilizing a multiple model approach to examining projected changes in water quality while establishing that the pollution diet trumps the impact from climate change. This work will lead directly to advances in scientific understanding of the response of water quality, ecosystem health, and ecological resilience to the impacts of nutrient reduction and climate change.

  15. Hydrological and water quality processes simulation by the integrated MOHID model

    NASA Astrophysics Data System (ADS)

    Epelde, Ane; Antiguedad, Iñaki; Brito, David; Eduardo, Jauch; Neves, Ramiro; Sauvage, Sabine; Sánchez-Pérez, José Miguel

    2016-04-01

    Different modelling approaches have been used in recent decades to study the water quality degradation caused by non-point source pollution. In this study, the MOHID fully distributed and physics-based model has been employed to simulate hydrological processes and nitrogen dynamics in a nitrate vulnerable zone: the Alegria River watershed (Basque Country, Northern Spain). The results of this study indicate that the MOHID code is suitable for hydrological process simulation at the watershed scale, as the model shows satisfactory performance at simulating the discharge (NSE: 0.74 and 0.76 during the calibration and validation periods, respectively). The agronomical component of the code allowed the simulation of agricultural practices, which led to adequate crop yield simulation in the model. Furthermore, the nitrogen exportation also shows satisfactory performance (NSE: 0.64 and 0.69 during the calibration and validation periods, respectively). While the lack of field measurements does not allow the nutrient cycling processes to be evaluated in depth, the MOHID model simulates annual denitrification within the general ranges established for agricultural watersheds (in this study, 9 kg N ha-1 year-1). In addition, the model coherently simulated the spatial distribution of the denitrification process, which is directly linked to the simulated hydrological conditions: the highest rates are located near the discharge zone of the aquifer and where the aquifer thickness is low. These results demonstrate the strength of this model in simulating watershed-scale hydrological processes as well as crop production and the water quality degradation derived from agricultural activity (considering both nutrient exportation and nutrient cycling processes).
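
    For reference, the performance metric quoted above is the Nash-Sutcliffe efficiency; a minimal implementation with hypothetical discharge values is sketched below.

      import numpy as np

      def nash_sutcliffe(observed, simulated):
          """NSE: 1 is a perfect fit; 0 means no better than the mean of the observations."""
          observed = np.asarray(observed, dtype=float)
          simulated = np.asarray(simulated, dtype=float)
          return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

      obs = np.array([1.2, 1.5, 2.8, 2.1, 1.7, 1.4, 1.3])  # hypothetical discharge, m3/s
      sim = np.array([1.1, 1.6, 2.5, 2.3, 1.8, 1.5, 1.2])
      print(f"NSE = {nash_sutcliffe(obs, sim):.2f}")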

  16. The Triangle Model for evaluating the effect of health information technology on healthcare quality and safety

    PubMed Central

    Kern, Lisa M; Abramson, Erika; Kaushal, Rainu

    2011-01-01

    With the proliferation of relatively mature health information technology (IT) systems with large numbers of users, it becomes increasingly important to evaluate the effect of these systems on the quality and safety of healthcare. Previous research on the effectiveness of health IT has had mixed results, which may be in part attributable to the evaluation frameworks used. The authors propose a model for evaluation, the Triangle Model, developed for designing studies of quality and safety outcomes of health IT. This model identifies structure-level predictors, including characteristics of: (1) the technology itself; (2) the provider using the technology; (3) the organizational setting; and (4) the patient population. In addition, the model outlines process predictors, including (1) usage of the technology, (2) organizational support for and customization of the technology, and (3) organizational policies and procedures about quality and safety. The Triangle Model specifies the variables to be measured, but is flexible enough to accommodate both qualitative and quantitative approaches to capturing them. The authors illustrate this model, which integrates perspectives from both health services research and biomedical informatics, with examples from evaluations of electronic prescribing, but it is also applicable to a variety of types of health IT systems. PMID:21857023

  17. Combination of a Stressor-Response Model with a Conditional Probability Analysis Approach for Developing Candidate Criteria from MBSS

    EPA Science Inventory

    I show that a conditional probability analysis using a stressor-response model based on a logistic regression provides a useful approach for developing candidate water quality criteria from empirical data, such as the Maryland Biological Streams Survey (MBSS) data.
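
    A minimal sketch of the general idea under synthetic data (not the MBSS analysis itself): fit a logistic stressor-response model for the probability of biological impairment, then compute the empirical conditional probability of impairment above candidate criterion values.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      # Synthetic stressor values (e.g., specific conductance) and impairment indicator.
      rng = np.random.default_rng(0)
      stressor = rng.uniform(50, 800, size=200)
      p_true = 1.0 / (1.0 + np.exp(-(stressor - 400.0) / 80.0))
      impaired = rng.binomial(1, p_true)

      # Stressor-response model: logistic regression of impairment on the stressor.
      model = LogisticRegression().fit(stressor.reshape(-1, 1), impaired)

      # Conditional probability analysis: P(impaired | stressor >= x) for candidate criteria x.
      for x in (200.0, 400.0, 600.0):
          p_cond = impaired[stressor >= x].mean()
          p_model = model.predict_proba([[x]])[0, 1]
          print(f"x={x:5.0f}  P(impaired | stressor>=x)={p_cond:.2f}  logistic p(x)={p_model:.2f}")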

  18. Quality control of the RMS US flood model

    NASA Astrophysics Data System (ADS)

    Jankowfsky, Sonja; Hilberts, Arno; Mortgat, Chris; Li, Shuangcai; Rafique, Farhat; Rajesh, Edida; Xu, Na; Mei, Yi; Tillmanns, Stephan; Yang, Yang; Tian, Ye; Mathur, Prince; Kulkarni, Anand; Kumaresh, Bharadwaj Anna; Chaudhuri, Chiranjib; Saini, Vishal

    2016-04-01

    The RMS US flood model predicts the flood risk in the US with a 30 m resolution for different return periods. The model is designed for the insurance industry to estimate the cost of flood risk for a given location. Different statistical, hydrological and hydraulic models are combined to develop the flood maps for different return periods. A rainfall-runoff and routing model, calibrated with observed discharge data, is run with 10 000 years of stochastically simulated precipitation to create time series of discharge and surface runoff. The 100, 250 and 500 year events are extracted from these time series as forcing for a two-dimensional pluvial and fluvial inundation model. The coupling of all the different models, which are run over the large area of the US, implies a certain amount of uncertainty. Therefore, special attention is paid to the final quality control of the flood maps. First, a thorough quality analysis of the Digital Terrain Model (DTM) and the river network was carried out, as the final quality of the flood maps depends heavily on the DTM quality. Secondly, the simulated 100 year discharge in the major river network (600 000 km) is compared to the 100 year discharge derived from extreme value distributions fitted at all USGS gauges with more than 20 years of peak values (around 11 000 gauges). Thirdly, for each gauge the modelled flood depth is compared to the depth derived from the USGS rating curves. Fourthly, the modelled flood depth is compared to the base flood elevation given in the FEMA flood maps. Fifthly, the flood extent is compared to the FEMA flood extent. Then, for historic events we compare flood extents and flood depths at given locations. Finally, all the data and spatial layers are uploaded to a geoserver to facilitate the manual investigation of outliers. The feedback from the quality control is used to improve the model and estimate its uncertainty.
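
    A sketch of the second quality-control step with hypothetical data: fit a generalised extreme value distribution to a gauge's annual peak discharges and compare its 100-year quantile with the value extracted from the simulated series.

      import numpy as np
      from scipy.stats import genextreme

      # Hypothetical annual peak discharges (m3/s) at one gauge, >= 20 years as in the QC step.
      annual_peaks = np.array([310, 450, 290, 520, 610, 380, 430, 700, 350, 480,
                               560, 410, 330, 640, 470, 390, 500, 720, 440, 360], dtype=float)

      # Fit a GEV distribution and estimate the 100-year (1% annual exceedance) discharge.
      shape, loc, scale = genextreme.fit(annual_peaks)
      q100_gauge = genextreme.ppf(1.0 - 1.0 / 100.0, shape, loc=loc, scale=scale)

      q100_model = 660.0  # value that would be extracted from the simulated series (assumed)
      bias = 100 * (q100_model - q100_gauge) / q100_gauge
      print(f"gauge Q100 = {q100_gauge:.0f} m3/s, model Q100 = {q100_model:.0f} m3/s, bias = {bias:+.0f}%")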

  19. On Regional Modeling to Support Air Quality Policies

    EPA Science Inventory

    We examine the use of the Community Multiscale Air Quality (CMAQ) model in simulating the changes in the extreme values of air quality that are of interest to the regulatory agencies. Year-to-year changes in ozone air quality are attributable to variations in the prevailing mete...

  20. A Study Investigating the Perceived Service Quality Levels of Sport Center Members: A Kano Model Perspective

    ERIC Educational Resources Information Center

    Yildiz, Kadir; Polat, Ercan; Güzel, Pinar

    2018-01-01

    The purpose of this study is to investigate sport center members' perceived service quality levels with reference to the Kano model of customer expectations and requirements. To that end, a descriptive approach and a correlational research design featuring a survey method are adopted. The research group consists of 680 sport center members (300 women, 380 men) who…

  1. A trait based approach to defining valued mentoring qualities

    NASA Astrophysics Data System (ADS)

    Pendall, E.

    2012-12-01

    Graduate training in the sciences requires strong personal interactions among faculty, senior lab members and more junior members. Within the lab-group setting we learn to frame problems, to conduct research and to communicate findings. The result is that individual scientists are partly shaped by a few influential mentors. We have all been influenced by special relationships with mentors, and on reflection we may find that certain qualities have been especially influential in our career choices. In this presentation I will discuss favorable mentoring traits as determined from an informal survey of scientists in varying stages of careers and from diverse backgrounds. Respondents addressed questions about traits they value in their mentors in several categories: 1) personal qualities such as approachability, humor and encouragement; background including gender, ethnicity, and family status; 2) scientific qualities including discipline or specialization, perceived stature in discipline, seniority, breadth of perspective, and level of expectations; and 3) community-oriented qualities promoted by mentors, such as encouraging service contributions and peer-mentoring within the lab group. The results will be compared among respondents by gender, ethnicity, stage of career, type of work, and subdiscipline within the broadly defined Biogeoscience community. We hope to contribute to the growing discussion on building a diverse and balanced scientific workforce.

  2. Influence of Examinations Oriented Approaches on Quality Education in Primary Schools in Kenya

    ERIC Educational Resources Information Center

    Mackatiani, Caleb Imbova

    2017-01-01

    This paper provides a critical appraisal of the influence of examinations oriented approaches on quality education in primary schools in Kenya. The purpose of the study was to determine effects of examination oriented teaching approaches on learning achievement among primary school pupils in Kakamega County, Kenya. It explored the assumptions…

  3. Pragmatic quality metrics for evolutionary software development models

    NASA Technical Reports Server (NTRS)

    Royce, Walker

    1990-01-01

    Due to the large number of product, project, and people parameters which impact large custom software development efforts, measurement of software product quality is a complex undertaking. Furthermore, the absolute perspective from which quality is measured (customer satisfaction) is intangible. While we probably can't say what the absolute quality of a software product is, we can determine the relative quality, the adequacy of this quality with respect to pragmatic considerations, and identify good and bad trends during development. While no two software engineers will ever agree on an optimum definition of software quality, they will agree that the most important perspective of software quality is its ease of change. We can call this flexibility, adaptability, or some other vague term, but the critical characteristic of software is that it is soft. The easier the product is to modify, the easier it is to achieve any other software quality perspective. This paper presents objective quality metrics derived from consistent lifecycle perspectives of rework which, when used in concert with an evolutionary development approach, can provide useful insight to produce better quality per unit cost/schedule or to achieve adequate quality more efficiently. The usefulness of these metrics is evaluated by applying them to a large, real world, Ada project.

  4. Development of Water Quality Modeling in the United States

    EPA Science Inventory

    This presentation describes historical trends in water quality model development in the United States, reviews current efforts, and projects promising future directions. Water quality modeling has a relatively long history in the United States. While its origins lie in the work...

  5. Breastfeeding attitude, health-related quality of life and maternal obesity among multi-ethnic pregnant women: A multi-group structural equation approach.

    PubMed

    Lau, Ying; Htun, Tha Pyai; Lim, Peng Im; Ho-Lim, Sarah Su Tin; Chi, Claudia; Tsai, Cammy; Ong, Kai Wen; Klainin-Yobas, Piyanee

    2017-02-01

    Identifying the factors influencing breastfeeding attitude is significant for the implementation of effective promotion policies and counselling activities. To our best knowledge, no previous studies have modelled the relationships among breastfeeding attitude, health-related quality of life and maternal obesity among multi-ethnic pregnant women; the current study attempts to fill this research gap. This study investigated the relationships among maternal characteristics, health-related quality of life and breastfeeding attitude in normal-weight and overweight/obese pregnant women using a multi-group structural equation modelling approach. An exploratory cross-sectional design was used. SETTING: antenatal clinics of a university-affiliated hospital. PARTICIPANTS: pregnant women were invited to participate; 708 (78.8%) agreed to participate in the study. We examined a hypothetical model integrating the concepts of a breastfeeding decision-making model, a theory of planned behaviour-based model for breastfeeding and a health-related quality of life model among 708 multi-ethnic pregnant women in Singapore. The Iowa Infant Feeding Attitude Scale and the Medical Outcomes Study Short Form Health Survey were used to measure breastfeeding attitude and health-related quality of life, respectively. Two structural equation models demonstrated that better health-related quality of life, higher monthly household income, planned pregnancy and previous exclusive breastfeeding experience were significantly associated with positive breastfeeding attitude among normal-weight and overweight/obese pregnant women. Among normal-weight pregnant women, those who were older with a higher educational level were more likely to have a positive breastfeeding attitude. Among overweight/obese pregnant women, Chinese women with a confinement nanny plan were less likely to have a positive breastfeeding attitude. No significant difference existed between normal-weight and overweight/obese pregnant women concerning

  6. A quality by design approach to scale-up of high-shear wet granulation process.

    PubMed

    Pandey, Preetanshu; Badawy, Sherif

    2016-01-01

    High-shear wet granulation is a complex process that in turn makes scale-up a challenging task. Scale-up of high-shear wet granulation process has been studied extensively in the past with various different methodologies being proposed in the literature. This review article discusses existing scale-up principles and categorizes the various approaches into two main scale-up strategies - parameter-based and attribute-based. With the advent of quality by design (QbD) principle in drug product development process, an increased emphasis toward the latter approach may be needed to ensure product robustness. In practice, a combination of both scale-up strategies is often utilized. In a QbD paradigm, there is also a need for an increased fundamental and mechanistic understanding of the process. This can be achieved either by increased experimentation that comes at higher costs, or by using modeling techniques, that are also discussed as part of this review.

  7. An Overview of Atmospheric Chemistry and Air Quality Modeling

    NASA Technical Reports Server (NTRS)

    Johnson, Matthew S.

    2017-01-01

    This presentation will share my personal research experience and provide an overview of atmospheric chemistry and air quality modeling for the participants of the NASA Student Airborne Research Program (SARP 2017). The presentation will also provide examples of ways to apply airborne observations to chemical transport model (CTM) and air quality (AQ) model evaluation. CTM and AQ models are important tools for understanding tropospheric-stratospheric composition, atmospheric chemistry processes, meteorology, and air quality. This presentation will focus on how NASA scientists currently apply CTM and AQ models to better understand these topics. Finally, the importance of airborne observations in evaluating these topics, and how in situ and remote sensing observations can be used to evaluate and improve CTM and AQ model predictions, will be highlighted.

  8. Service quality benchmarking via a novel approach based on fuzzy ELECTRE III and IPA: an empirical case involving the Italian public healthcare context.

    PubMed

    La Fata, Concetta Manuela; Lupo, Toni; Piazza, Tommaso

    2017-11-21

    A novel fuzzy-based approach which combines ELECTRE III along with the Importance-Performance Analysis (IPA) is proposed in the present work to comparatively evaluate the service quality in the public healthcare context. Specifically, ELECTRE III is firstly considered to compare the service performance of examined hospitals in a noncompensatory manner. Afterwards, IPA is employed to support the service quality management to point out improvement needs and their priorities. The proposed approach also incorporates features of the Fuzzy Set Theory so as to address the possible uncertainty, subjectivity and vagueness of involved experts in evaluating the service quality. The model is applied to five major Sicilian public hospitals, and strengths and criticalities of the delivered service are finally highlighted and discussed. Although several approaches combining multi-criteria methods have already been proposed in the literature to evaluate the service performance in the healthcare field, to the best of the authors' knowledge the present work represents the first attempt at comparing service performance of alternatives in a noncompensatory manner in the investigated context.

  9. Quality choice in a health care market: a mixed duopoly approach.

    PubMed

    Sanjo, Yasuo

    2009-05-01

    We investigate a health care market with uncertainty in a mixed duopoly, where a partially privatized public hospital competes against a private hospital in terms of quality choice. We use a simple Hotelling-type spatial competition model by incorporating mean-variance analysis and the framework of partial privatization. We show how the variance in the quality perceived by patients affects the true quality of medical care provided by hospitals. In addition, we show that a case exists in which the quality of the partially privatized hospital becomes higher than that of the private hospital when the patient's preference for quality is relatively high.

  10. Imputation approaches for animal movement modeling

    USGS Publications Warehouse

    Scharf, Henry; Hooten, Mevin B.; Johnson, Devin S.

    2017-01-01

    The analysis of telemetry data is common in animal ecological studies. While the collection of telemetry data for individual animals has improved dramatically, the methods to properly account for inherent uncertainties (e.g., measurement error, dependence, barriers to movement) have lagged behind. Still, many new statistical approaches have been developed to infer unknown quantities affecting animal movement or predict movement based on telemetry data. Hierarchical statistical models are useful to account for some of the aforementioned uncertainties, as well as provide population-level inference, but they often come with an increased computational burden. For certain types of statistical models, it is straightforward to provide inference if the latent true animal trajectory is known, but challenging otherwise. In these cases, approaches related to multiple imputation have been employed to account for the uncertainty associated with our knowledge of the latent trajectory. Despite the increasing use of imputation approaches for modeling animal movement, the general sensitivity and accuracy of these methods have not been explored in detail. We provide an introduction to animal movement modeling and describe how imputation approaches may be helpful for certain types of models. We also assess the performance of imputation approaches in two simulation studies. Our simulation studies suggest that inference for model parameters directly related to the location of an individual may be more accurate than inference for parameters associated with higher-order processes such as velocity or acceleration. Finally, we apply these methods to analyze a telemetry data set involving northern fur seals (Callorhinus ursinus) in the Bering Sea. Supplementary materials accompanying this paper appear online.

  11. A Statistical Approach For Modeling Tropical Cyclones. Synthetic Hurricanes Generator Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pasqualini, Donatella

    This manuscript briefly describes a statistical approach to generate synthetic tropical cyclone tracks to be used in risk evaluations. The Synthetic Hurricane Generator (SynHurG) model allows modeling hurricane risk in the United States, supporting decision makers and implementations of adaptation strategies to extreme weather. In the literature there are mainly two approaches to model hurricane hazard for risk prediction: deterministic-statistical approaches, where the storm key physical parameters are calculated using complex physical climate models and the tracks are usually determined statistically from historical data; and statistical approaches, where both variables and tracks are estimated stochastically using historical records. SynHurG falls in the second category, adopting a pure stochastic approach.
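
    A purely illustrative stochastic track sketch (not the SynHurG formulation): genesis location, translation speed and changes of heading are drawn from simple distributions standing in for statistics that SynHurG would estimate from the historical record.

      import numpy as np

      rng = np.random.default_rng(42)

      def synthetic_track(n_steps=40, dt_hours=6.0):
          lon, lat = rng.uniform(-60, -45), rng.uniform(10, 20)   # genesis box (assumed)
          heading = np.deg2rad(rng.normal(150.0, 20.0))           # ccw from east; ~150 deg = WNW
          track = [(lon, lat)]
          for _ in range(n_steps):
              speed = max(rng.normal(15.0, 5.0), 2.0)             # translation speed, km/h (assumed)
              heading += np.deg2rad(rng.normal(1.0, 6.0))         # random change of bearing
              step_deg = speed * dt_hours / 111.0                 # rough conversion to degrees
              lon += step_deg * np.cos(heading)
              lat += step_deg * np.sin(heading)
              track.append((lon, lat))
          return track

      tracks = [synthetic_track() for _ in range(10)]             # a small synthetic sample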

  12. The phenology of leaf quality and its within-canopy variation is essential for accurate modeling of photosynthesis in tropical evergreen forests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Jin; Serbin, Shawn P.; Xu, Xiangtao

    Leaf quantity (i.e., canopy leaf area index, LAI), quality (i.e., per-area photosynthetic capacity), and longevity all influence the photosynthetic seasonality of tropical evergreen forests. However, these components of tropical leaf phenology are poorly represented in most terrestrial biosphere models (TBMs). Here, we explored alternative options for the representation of leaf phenology effects in TBMs that employ the Farquhar, von Caemmerer & Berry (FvCB) representation of CO2 assimilation. We developed a two-fraction leaf (sun and shade), two-layer canopy (upper and lower) photosynthesis model to evaluate different modeling approaches and assessed three components of phenological variations (i.e., leaf quantity, quality, and within-canopy variation in leaf longevity). Our model was driven by the prescribed seasonality of leaf quantity and quality derived from ground-based measurements within an Amazonian evergreen forest. Modeled photosynthetic seasonality was not sensitive to leaf quantity, but was highly sensitive to leaf quality and its vertical distribution within the canopy, with markedly more sensitivity to upper canopy leaf quality. This is because light absorption in tropical canopies is near maximal for the entire year, implying that seasonal changes in LAI have little impact on total canopy light absorption, and because leaf quality has a greater effect on the photosynthesis of sunlit leaves than of light-limited shade leaves, and sunlit foliage is more abundant in the upper canopy. Our two-fraction leaf, two-layer canopy model, which accounted for all three phenological components, was able to simulate photosynthetic seasonality, explaining ~90% of the average seasonal variation in eddy covariance-derived CO2 assimilation. This work identifies a parsimonious approach for representing tropical evergreen forest photosynthetic seasonality in TBMs that utilize the FvCB model of CO2 assimilation and highlights the importance of

  13. The phenology of leaf quality and its within-canopy variation is essential for accurate modeling of photosynthesis in tropical evergreen forests

    DOE PAGES

    Wu, Jin; Serbin, Shawn P.; Xu, Xiangtao; ...

    2017-04-18

    Leaf quantity (i.e., canopy leaf area index, LAI), quality (i.e., per-area photosynthetic capacity), and longevity all influence the photosynthetic seasonality of tropical evergreen forests. However, these components of tropical leaf phenology are poorly represented in most terrestrial biosphere models (TBMs). Here, we explored alternative options for the representation of leaf phenology effects in TBMs that employ the Farquhar, von Caemmerer & Berry (FvCB) representation of CO2 assimilation. We developed a two-fraction leaf (sun and shade), two-layer canopy (upper and lower) photosynthesis model to evaluate different modeling approaches and assessed three components of phenological variations (i.e., leaf quantity, quality, and within-canopy variation in leaf longevity). Our model was driven by the prescribed seasonality of leaf quantity and quality derived from ground-based measurements within an Amazonian evergreen forest. Modeled photosynthetic seasonality was not sensitive to leaf quantity, but was highly sensitive to leaf quality and its vertical distribution within the canopy, with markedly more sensitivity to upper canopy leaf quality. This is because light absorption in tropical canopies is near maximal for the entire year, implying that seasonal changes in LAI have little impact on total canopy light absorption, and because leaf quality has a greater effect on the photosynthesis of sunlit leaves than of light-limited shade leaves, and sunlit foliage is more abundant in the upper canopy. Our two-fraction leaf, two-layer canopy model, which accounted for all three phenological components, was able to simulate photosynthetic seasonality, explaining ~90% of the average seasonal variation in eddy covariance-derived CO2 assimilation. This work identifies a parsimonious approach for representing tropical evergreen forest photosynthetic seasonality in TBMs that utilize the FvCB model of CO2 assimilation and highlights the importance of
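
    A minimal FvCB-style leaf calculation (not the authors' two-layer model) with standard-looking kinetic constants at 25 C and assumed Vcmax/J values: the sunlit upper-canopy leaf comes out Rubisco-limited, so its assimilation tracks leaf quality (Vcmax), while the shaded lower-canopy leaf is light-limited, illustrating why upper-canopy leaf quality dominates the modelled seasonality.

      def fvcb_assimilation(vcmax, j, ci=250.0, gamma_star=42.75, kc=404.9, ko=278.4, o=210.0):
          """Gross assimilation (umol m-2 s-1) as the minimum of Rubisco- and RuBP-limited rates."""
          ac = vcmax * (ci - gamma_star) / (ci + kc * (1.0 + o / ko))   # Rubisco-limited
          aj = j * (ci - gamma_star) / (4.0 * ci + 8.0 * gamma_star)    # electron-transport-limited
          return min(ac, aj)

      # Assumed leaf-quality parameters for a sunlit upper-canopy leaf and a shaded lower leaf.
      a_sun_upper = fvcb_assimilation(vcmax=60.0, j=120.0)
      a_shade_lower = fvcb_assimilation(vcmax=25.0, j=30.0)
      print(f"upper sunlit: {a_sun_upper:.1f}, lower shaded: {a_shade_lower:.1f} umol m-2 s-1")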

  14. THE ATMOSPHERIC MODEL EVALUATION TOOL (AMET); AIR QUALITY MODULE

    EPA Science Inventory

    This presentation reviews the development of the Atmospheric Model Evaluation Tool (AMET) air quality module. The AMET tool is being developed to aid in the model evaluation. This presentation focuses on the air quality evaluation portion of AMET. Presented are examples of the...

  15. AIR QUALITY SIMULATION MODEL PERFORMANCE FOR ONE-HOUR AVERAGES

    EPA Science Inventory

    If a one-hour standard for sulfur dioxide were promulgated, air quality dispersion modeling in the vicinity of major point sources would be an important air quality management tool. Would currently available dispersion models be suitable for use in demonstrating attainment of suc...

  16. Developing a customised approach for strengthening tuberculosis laboratory quality management systems toward accreditation

    PubMed Central

    Trollip, Andre; Erni, Donatelle; Kao, Kekeletso

    2017-01-01

    Background Quality-assured tuberculosis laboratory services are critical to achieve global and national goals for tuberculosis prevention and care. Implementation of a quality management system (QMS) in laboratories leads to improved quality of diagnostic tests and better patient care. The Strengthening Laboratory Management Toward Accreditation (SLMTA) programme has led to measurable improvements in the QMS of clinical laboratories. However, progress in tuberculosis laboratories has been slower, which may be attributed to the need for a structured tuberculosis-specific approach to implementing QMS. We describe the development and early implementation of the Strengthening Tuberculosis Laboratory Management Toward Accreditation (TB SLMTA) programme. Development The TB SLMTA curriculum was developed by customizing the SLMTA curriculum to include specific tools, job aids and supplementary materials specific to the tuberculosis laboratory. The TB SLMTA Harmonized Checklist was developed from the World Health Organisation Regional Office for Africa Stepwise Laboratory Quality Improvement Process Towards Accreditation checklist, and incorporated tuberculosis-specific requirements from the Global Laboratory Initiative Stepwise Process Towards Tuberculosis Laboratory Accreditation online tool. Implementation Four regional training-of-trainers workshops have been conducted since 2013. The TB SLMTA programme has been rolled out in 37 tuberculosis laboratories in 10 countries using the Workshop approach in 32 laboratories in five countries and the Facility-based approach in five tuberculosis laboratories in five countries. Conclusion Lessons learnt from early implementation of TB SLMTA suggest that a structured training and mentoring programme can build a foundation towards further quality improvement in tuberculosis laboratories. Structured mentoring, and institutionalisation of QMS into country programmes, is needed to support tuberculosis laboratories to achieve

  17. Application of various FLD modelling approaches

    NASA Astrophysics Data System (ADS)

    Banabic, D.; Aretz, H.; Paraianu, L.; Jurco, P.

    2005-07-01

    This paper focuses on a comparison between different modelling approaches to predict the forming limit diagram (FLD) for sheet metal forming under a linear strain path using the recently introduced orthotropic yield criterion BBC2003 (Banabic D et al 2005 Int. J. Plasticity 21 493-512). The FLD models considered here are a finite element based approach, the well known Marciniak-Kuczynski model, the modified maximum force criterion according to Hora et al (1996 Proc. Numisheet'96 Conf. (Dearborn/Michigan) pp 252-6), Swift's diffuse (Swift H W 1952 J. Mech. Phys. Solids 1 1-18) and Hill's classical localized necking approach (Hill R 1952 J. Mech. Phys. Solids 1 19-30). The FLD of an AA5182-O aluminium sheet alloy has been determined experimentally in order to quantify the predictive capabilities of the models mentioned above.
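
    A textbook-level sketch of two of the approaches listed above, reduced to their classical closed forms for power-law hardening (sigma = K*eps^n) under proportional loading with strain ratio rho = d_eps2/d_eps1; the paper's calculations couple these criteria with the BBC2003 yield surface and the Marciniak-Kuczynski analysis, which are not reproduced here, and the hardening exponent below is an assumed value.

      n = 0.3  # hardening exponent, assumed

      def hill_localized(rho):
          """Hill (1952) localized necking limit strain, left-hand side of the FLD (rho <= 0)."""
          return n / (1.0 + rho)

      def swift_diffuse(rho):
          """Swift (1952) diffuse necking major limit strain."""
          return 2.0 * n * (1.0 + rho + rho**2) / ((1.0 + rho) * (2.0 * rho**2 - rho + 2.0))

      for rho in (-0.5, -0.25, 0.0, 0.5, 1.0):
          line = f"rho={rho:+.2f}  Swift eps1*={swift_diffuse(rho):.3f}"
          if rho <= 0.0:
              line += f"  Hill eps1*={hill_localized(rho):.3f}"
          print(line)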

  18. Quality of care for patients with diabetes mellitus type 2 in 'model practices' in Slovenia - first results.

    PubMed

    Petek, Davorina; Mlakar, Mitja

    2016-09-01

    A new organisation at the primary level, called model practices, introduces a 0.5 full-time equivalent nurse practitioner as a regular member of the team. Nurse practitioners are in charge of registers of chronic patients, and implement an active approach into medical care. Selected quality indicators define the quality of management. The majority of studies confirm the effectiveness of the extended team in the quality of care, which is similar or improved when compared to care performed by the physician alone. The aim of the study is to compare the quality of management of patients with diabetes mellitus type 2 before and after the introduction of model practices. A retrospective cohort study was based on medical records from three practices. Process quality indicators, such as regularity of HbA1c measurement, blood pressure measurement, foot exam, referral to eye exam, performance of yearly laboratory tests and HbA1c level before and after the introduction of model practices, were compared. The final sample consisted of 132 patients, whose diabetes care was performed exclusively at the primary care level. The process of care improved significantly after the introduction of model practices. The most notable improvements were the increases in foot exams and HbA1c testing. We could not prove better glycaemic control (p>0.1). Nevertheless, the proposed benchmarks for the suggested process and outcome quality indicators were mostly exceeded in this cohort. The introduction of a nurse into the team improves the process quality of care. Benchmarks for quality indicators are obtainable. Better outcomes of care need further confirmation.

  19. Technical note: Comparison of methane ebullition modelling approaches used in terrestrial wetland models

    NASA Astrophysics Data System (ADS)

    Peltola, Olli; Raivonen, Maarit; Li, Xuefei; Vesala, Timo

    2018-02-01

    Emission via bubbling, i.e. ebullition, is one of the main methane (CH4) emission pathways from wetlands to the atmosphere. Direct measurement of gas bubble formation, growth and release in the peat-water matrix is challenging and in consequence these processes are relatively unknown and are coarsely represented in current wetland CH4 emission models. In this study we aimed to evaluate three ebullition modelling approaches and their effect on model performance. This was achieved by implementing the three approaches in one process-based CH4 emission model. All the approaches were based on some kind of threshold: either on CH4 pore water concentration (ECT), pressure (EPT) or free-phase gas volume (EBG) threshold. The model was run using 4 years of data from a boreal sedge fen and the results were compared with eddy covariance measurements of CH4 fluxes.

    Modelled annual CH4 emissions were largely unaffected by the different ebullition modelling approaches; however, the temporal variability in CH4 emissions varied by an order of magnitude between the approaches. Hence the ebullition modelling approach drives the temporal variability in modelled CH4 emissions and therefore significantly impacts, for instance, high-frequency (daily scale) model comparison and calibration against measurements. The modelling approach based on the most recent knowledge of the ebullition process (volume threshold, EBG) agreed the best with the measured fluxes (R2 = 0.63) and hence produced the most reasonable results, although there was a scale mismatch between the measurements (ecosystem scale with heterogeneous ebullition locations) and the model results (a single horizontally homogeneous peat column). This approach should be favoured over the two other more widely used ebullition modelling approaches, and researchers are encouraged to implement it in their CH4 emission models.
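
    A minimal sketch of a volume-threshold (EBG-style) trigger in a single peat layer, with all rates and constants assumed: CH4 produced beyond a fixed solubility ceiling accumulates as free-phase gas, and the excess above a pore-volume threshold is vented as an ebullition event.

      POROSITY = 0.9                # peat porosity (assumed)
      LAYER_THICKNESS = 0.1         # m
      V_THRESHOLD = 0.10            # release once free-phase gas exceeds 10% of the pore space
      SOLUBILITY_CEILING = 1.4e-3   # mol CH4 held in solution in the layer (assumed)
      MOLAR_VOLUME = 0.024          # m3 of gas per mol near the surface (approximate)

      pore_volume = LAYER_THICKNESS * POROSITY   # m3 per m2 of ground
      dissolved, bubbles = 0.0, 0.0              # mol in solution, m3 of free-phase gas per m2

      for day in range(120):
          dissolved += 5.0e-3                            # daily CH4 production, mol (assumed)
          excess = max(0.0, dissolved - SOLUBILITY_CEILING)
          dissolved -= excess
          bubbles += excess * MOLAR_VOLUME               # CH4 beyond solubility forms bubbles

          if bubbles > V_THRESHOLD * pore_volume:
              released = bubbles - V_THRESHOLD * pore_volume
              bubbles -= released                        # ebullition event: vent the excess
              print(f"day {day}: ebullition of {released / MOLAR_VOLUME * 1e3:.1f} mmol CH4 m-2")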

  20. Quality assessment of Isfahan Medical Faculty web site electronic services and prioritizing solutions using analytic hierarchy process approach

    PubMed Central

    Hajrahimi, Nafiseh; Dehaghani, Sayed Mehdi Hejazi; Hajrahimi, Nargess; Sarmadi, Sima

    2014-01-01

    Context: Implementing information technology in the best possible way can bring many advantages, such as applying electronic services and facilitating tasks. Therefore, assessment of service-providing systems is a way to improve the quality of these systems and elevate them, including e-commerce, e-government, e-banking, and e-learning. Aims: This study aimed to evaluate the electronic services on the website of Isfahan University of Medical Sciences in order to propose solutions to improve them. Furthermore, we aimed to rank the solutions based on the factors that enhance the quality of electronic services by using the analytic hierarchy process (AHP) method. Materials and Methods: A non-parametric test was used to assess the quality of electronic services. The assessment of the propositions was based on the Aqual model and they were prioritized using the AHP approach. The AHP approach was used because it directly applies experts' deductions in the model and leads to more objective results in the analysis and in prioritizing the risks. After evaluating the quality of the electronic services, a multi-criteria decision-making framework was used to prioritize the proposed solutions. Statistical Analysis Used: Non-parametric tests and the AHP approach using Expert Choice software. Results: The results showed that students were satisfied with most of the indicators. Only a few indicators received low satisfaction from students, including design attractiveness, the amount of explanation and detail of information, honesty and responsiveness of authorities, and the role of e-services in the user's relationship with the university. After interviews with Information and Communications Technology (ICT) experts at the university, measurement criteria and solutions to improve the quality were collected. The best solutions were selected using the Expert Choice software. According to the results, the solution “controlling and improving the process in handling users' complaints” is of the utmost importance and authorities
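
    A minimal AHP calculation (values hypothetical, not the study's judgements): priorities are taken as the normalised principal eigenvector of a pairwise comparison matrix on Saaty's 1-9 scale, with a consistency ratio check.

      import numpy as np

      # Pairwise comparisons of three improvement solutions against one criterion (hypothetical).
      A = np.array([
          [1.0, 3.0, 5.0],
          [1/3, 1.0, 2.0],
          [1/5, 1/2, 1.0],
      ])

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)
      weights = np.abs(eigvecs[:, k].real)
      weights /= weights.sum()                       # priority vector

      n = A.shape[0]
      ci = (eigvals.real[k] - n) / (n - 1)           # consistency index
      cr = ci / 0.58                                 # random index RI = 0.58 for n = 3
      print("priorities:", np.round(weights, 3), " CR =", round(cr, 3))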

  1. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    PubMed

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    control laboratory: It is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and how they can assess and manage method performance risk. This paper discusses, considering the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool makes it possible to find the method procedural steps with the highest impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the better alternative analytical methodology for residual cellular DNA analysis. © PDA, Inc. 2018.

  2. Web quality control for lectures: Supercourse and Amazon.com.

    PubMed

    Linkov, Faina; LaPorte, Ronald; Lovalekar, Mita; Dodani, Sunita

    2005-12-01

    Peer review has been the cornerstone of quality control in biomedical journals for the past 300 years. With the emergence of the Internet, new models of quality control and peer review are emerging; however, such models are poorly investigated. We would argue that the popular system of quality control used by Amazon.com offers a way to ensure continuous quality improvement in the area of research communications on the Internet. Such a system provides an interesting alternative to the traditional peer review approaches used by biomedical journals and challenges the traditional paradigms of scientific publishing. This idea is being explored in the context of the Supercourse, a library of 2,350 prevention lectures shared for free by faculty members from over 150 countries. The Supercourse is successfully utilizing quality control approaches that are similar to the Amazon.com model. Clearly, the existing approaches and emerging alternatives for quality control in scientific communications need to be assessed scientifically. The rapid expansion of Internet technologies could be leveraged to produce better, more cost-effective systems for quality control in biomedical publications and across all sciences.

  3. Make or buy analysis model based on tolerance allocation to minimize manufacturing cost and fuzzy quality loss

    NASA Astrophysics Data System (ADS)

    Rosyidi, C. N.; Puspitoingrum, W.; Jauhari, W. A.; Suhardi, B.; Hamada, K.

    2016-02-01

    The specification of tolerances has a significant impact on product quality and final production cost. A company should pay careful attention to component and product tolerances so that it can produce a good-quality product at the lowest cost. Tolerance allocation has been widely used to solve the problem of selecting a particular process or supplier. Before entering the selection process, however, the company must first analyse whether a component should be made in house (make), purchased from a supplier (buy), or obtained through a combination of both. This paper discusses an optimization model of process and supplier selection that minimizes manufacturing cost and fuzzy quality loss. The model can also be used to determine the allocation of components to the selected processes or suppliers. Tolerance, process capability and production capacity are three important constraints that affect the decision. A fuzzy quality loss function is used in this paper to describe the semantics of quality, in which the product quality level is divided into several grades. The implementation of the proposed model is demonstrated by solving a numerical example that uses a simple assembly product consisting of three components. A metaheuristic approach was implemented in the OptQuest software from Oracle Crystal Ball in order to obtain the optimal solution of the numerical example.
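
    As a rough illustration of the trade-off the model formalizes, the sketch below compares make and buy options by summing unit cost and a crisp Taguchi-style quadratic quality loss; the paper instead uses a fuzzy, graded quality loss, and all figures here are hypothetical.

```python
# Illustrative make-or-buy comparison: each option has a manufacturing/purchase cost
# and a process standard deviation; quality loss is approximated with a crisp Taguchi
# quadratic loss L = k * sigma^2 (the paper uses a fuzzy, graded loss instead).
# All figures are hypothetical.

k = 40.0  # loss coefficient ($ per mm^2 of variance), hypothetical

options = {
    "make (in-house milling)":  {"unit_cost": 2.10, "sigma": 0.040},
    "buy (supplier A)":         {"unit_cost": 1.80, "sigma": 0.060},
    "buy (supplier B)":         {"unit_cost": 2.40, "sigma": 0.025},
}

def total_cost(opt):
    # expected cost per unit = manufacturing/purchase cost + expected quality loss
    return opt["unit_cost"] + k * opt["sigma"] ** 2

best = min(options, key=lambda name: total_cost(options[name]))
for name, opt in options.items():
    print(f"{name}: total expected cost = {total_cost(opt):.3f}")
print("selected:", best)
```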

  4. Pairing top-down and bottom-up approaches to analyze catchment scale management of water quality and quantity

    NASA Astrophysics Data System (ADS)

    Lovette, J. P.; Duncan, J. M.; Band, L. E.

    2016-12-01

    Watershed management requires information on the hydrologic impacts of local to regional land use, land cover and infrastructure conditions. Management of runoff volumes, storm flows, and water quality can benefit from large scale, "top-down" screening tools, using readily available information, as well as more detailed, "bottom-up" process-based models that explicitly track local runoff production and routing from sources to receiving water bodies. Regional scale data, available nationwide through the NHD+, and top-down models based on aggregated catchment information provide useful tools for estimating regional patterns of peak flows, volumes and nutrient loads at the catchment level. Management impacts can be estimated with these models, but the models have limited ability to resolve impacts beyond simple changes to land cover proportions. Alternatively, distributed process-based models provide more flexibility in modeling management impacts by resolving spatial patterns of nutrient sources, runoff generation, and uptake. This bottom-up approach can incorporate explicit patterns of land cover, drainage connectivity, and vegetation extent, but is typically applied over smaller areas. Here, we first model peak flood flows and nitrogen loads across North Carolina's 70,000 NHD+ catchments using USGS regional streamflow regression equations and the SPARROW model. We also estimate management impact by altering aggregated sources in each of these models. To address the missing spatial implications of the top-down approach, we further explore the demand for riparian buffers as a management strategy, simulating the accumulation of nutrient sources along flow paths and the potential mitigation of these sources through forested buffers. We use the Regional Hydro-Ecological Simulation System (RHESSys) to model changes across several basins in North Carolina's Piedmont and Blue Ridge regions, ranging in size from 15 to 1,130 km2. The two approaches provide a complementary set of tools

  5. Exercising Quality Control in Interdisciplinary Education: Toward an Epistemologically Responsible Approach

    ERIC Educational Resources Information Center

    Stein, Zachary; Connell, Michael; Gardner, Howard

    2008-01-01

    This article argues that certain philosophically devised quality control parameters should guide approaches to interdisciplinary education. We sketch the kind of reflections we think are necessary in order to produce epistemologically responsible curricula. We suggest that the two overarching epistemic dimensions of levels of analysis and basic…

  6. Constructivist-Visual Mind Map Teaching Approach and the Quality of Students' Cognitive Structures

    ERIC Educational Resources Information Center

    Dhindsa, Harkirat S.; Makarimi-Kasim; Anderson, O. Roger

    2011-01-01

    This study compared the effects of a constructivist-visual mind map teaching approach (CMA) and of a traditional teaching approach (TTA) on (a) the quality and richness of students' knowledge structures and (b) TTA and CMA students' perceptions of the extent that a constructivist learning environment (CLE) was created in their classes. The sample…

  7. Development of the Next Generation Air Quality Modeling System

    EPA Science Inventory

    A next generation air quality modeling system is being developed at the U.S. EPA to enable modeling of air quality from global to regional to (eventually) local scales. We envision that the system will have three configurations: 1. Global meteorology with seamless mesh refinemen...

  8. A Total Quality Leadership Process Improvement Model

    DTIC Science & Technology

    1993-12-01

    A Total Quality Leadership Process Improvement Model, by Archester Houston, Ph.D., and Steven L. Dockstader, Ph.D. Final report, 12/93. Performing organization: Total Quality Leadership Office.

  9. Voice Quality Modelling for Expressive Speech Synthesis

    PubMed Central

    Socoró, Joan Claudi

    2014-01-01

    This paper presents the perceptual experiments that were carried out in order to validate the methodology of transforming expressive speech styles using voice quality (VoQ) parameters modelling, along with the well-known prosody (F0, duration, and energy), from a neutral style into a number of expressive ones. The main goal was to validate the usefulness of VoQ in the enhancement of expressive synthetic speech in terms of speech quality and style identification. A harmonic plus noise model (HNM) was used to modify VoQ and prosodic parameters that were extracted from an expressive speech corpus. Perception test results indicated the improvement of obtained expressive speech styles using VoQ modelling along with prosodic characteristics. PMID:24587738

  10. Atmospheric Model Evaluation Tool for meteorological and air quality simulations

    EPA Pesticide Factsheets

    The Atmospheric Model Evaluation Tool compares model predictions to observed data from various meteorological and air quality observation networks to help evaluate meteorological and air quality simulations.

  11. A Unified Approach to Modeling Multidisciplinary Interactions

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.; Bhatia, Kumar G.

    2000-01-01

    There are a number of existing methods to transfer information among various disciplines. For a multidisciplinary application with n disciplines, the traditional methods may be required to model (n² - n) interactions. This paper presents a unified three-dimensional approach that reduces the number of interactions from (n² - n) to 2n by using a computer-aided design model. The proposed modeling approach unifies the interactions among various disciplines. The approach is independent of specific discipline implementation, and a number of existing methods can be reformulated in the context of the proposed unified approach. This paper provides an overview of the proposed unified approach and reformulations for two existing methods. The unified approach is specially tailored for application environments where the geometry is created and managed through a computer-aided design system. Results are presented for a blended-wing body and a high-speed civil transport.
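
    The quoted interaction counts can be checked with a line of arithmetic, as below: pairwise discipline-to-discipline transfers require n² - n mappings, whereas routing all transfers through a central CAD model requires only 2n.

```python
# Worked arithmetic for the interaction counts quoted above: pairwise discipline-to-
# discipline transfers need n^2 - n mappings, while routing everything through one
# central CAD model needs only 2n (one mapping to and one from the model per discipline).
for n in (3, 5, 10):
    print(n, "disciplines:", n**2 - n, "pairwise interactions vs", 2 * n, "via a central model")
```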

  12. Certifying leaders? high-quality management practices and healthy organisations: an ISO-9000 based standardisation approach

    PubMed Central

    MONTANO, Diego

    2016-01-01

    The present study proposes a set of quality requirements for management practices by taking into account the empirical evidence on their potential effects on health, the systemic nature of social organisations, and the current conceptualisations of management functions within the framework of comprehensive quality management systems. Systematic reviews and meta-analyses focusing on the associations between leadership and/or supervision and health in occupational settings are evaluated, and the core elements of an ISO 9001 standardisation approach are presented. Six major occupational health requirements for high-quality management practices are identified, pertaining to communication processes, organisational justice, role clarity, decision making, social influence processes and management support. It is concluded that the quality of management practices may be improved by developing a quality management system of management practices that ensures conformity not only to product requirements but also to occupational safety and health requirements. Further research may evaluate the practicability of the proposed approach. PMID:26860787

  13. Certifying leaders? high-quality management practices and healthy organisations: an ISO-9000 based standardisation approach.

    PubMed

    Montano, Diego

    2016-08-05

    The present study proposes a set of quality requirements for management practices by taking into account the empirical evidence on their potential effects on health, the systemic nature of social organisations, and the current conceptualisations of management functions within the framework of comprehensive quality management systems. Systematic reviews and meta-analyses focusing on the associations between leadership and/or supervision and health in occupational settings are evaluated, and the core elements of an ISO 9001 standardisation approach are presented. Six major occupational health requirements for high-quality management practices are identified, pertaining to communication processes, organisational justice, role clarity, decision making, social influence processes and management support. It is concluded that the quality of management practices may be improved by developing a quality management system of management practices that ensures conformity not only to product requirements but also to occupational safety and health requirements. Further research may evaluate the practicability of the proposed approach.

  14. Quality-by-Design approach to monitor the operation of a batch bioreactor in an industrial avian vaccine manufacturing process.

    PubMed

    Largoni, Martina; Facco, Pierantonio; Bernini, Donatella; Bezzo, Fabrizio; Barolo, Massimiliano

    2015-10-10

    Monitoring batch bioreactors is a complex task, because several sources of variability can affect a running batch and impact the final product quality. Additionally, the product quality itself may not be measurable online, but may require sampling and lab analysis that take several days to complete. In this study we show that, by using appropriate process analytical technology tools, the operation of an industrial batch bioreactor used in avian vaccine manufacturing can be effectively monitored as the batch progresses. Multivariate statistical models are built from historical databases of batches already completed, and they are used to enable the real-time identification of the variability sources, to reliably predict the final product quality, and to improve process understanding, paving the way to a reduction in final product rejections as well as in product cycle time. It is also shown that the product quality "builds up" mainly during the first half of a batch, suggesting on the one hand that reducing the variability during this period is crucial, and on the other hand that the batch length can possibly be shortened. Overall, the study demonstrates that, by using a Quality-by-Design approach centered on the appropriate use of mathematical modeling, quality can indeed be built "by design" into the final product, whereas the role of end-point product testing in product manufacturing can progressively lose importance. Copyright © 2015 Elsevier B.V. All rights reserved.
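
    The abstract does not name the specific multivariate statistical method; a common choice for this kind of batch monitoring is a PCA-style latent variable model with a Hotelling's T2 control limit, sketched below on synthetic data as a rough illustration only.

```python
# Sketch of multivariate statistical batch monitoring, assuming a PCA-style latent
# variable model (the abstract does not name the specific method). Historical "good"
# batches are used to fit the model; a new batch is flagged when its Hotelling's T^2
# exceeds a simple empirical control limit. Data here are synthetic.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
historical = rng.normal(size=(50, 12))          # 50 completed batches x 12 process variables
new_batch = rng.normal(size=(1, 12)) + 1.5      # a deviating batch

pca = PCA(n_components=3).fit(historical)

def hotelling_t2(model, X):
    scores = model.transform(X)
    return np.sum(scores**2 / model.explained_variance_, axis=1)

t2_hist = hotelling_t2(pca, historical)
limit = np.percentile(t2_hist, 95)              # empirical 95% control limit
t2_new = hotelling_t2(pca, new_batch)[0]
print(f"T2 = {t2_new:.2f}, limit = {limit:.2f}, out of control: {t2_new > limit}")
```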

  15. Impact of inherent meteorology uncertainty on air quality model predictions

    EPA Science Inventory

    It is well established that there are a number of different classifications and sources of uncertainties in environmental modeling systems. Air quality models rely on two key inputs, namely, meteorology and emissions. When using air quality models for decision making, it is impor...

  16. Analysis student self efficacy in terms of using Discovery Learning model with SAVI approach

    NASA Astrophysics Data System (ADS)

    Sahara, Rifki; Mardiyana, S., Dewi Retno Sari

    2017-12-01

    Students are often unable to demonstrate their academic achievement to the best of their abilities. One reason is that they often feel unsure that they are capable of completing the tasks assigned to them. For students, such beliefs are necessary. This belief is called self-efficacy. Self-efficacy is not something a person is born with, nor a permanent quality of an individual, but the result of cognitive processes, meaning that one's self-efficacy can be stimulated through learning activities. Self-efficacy can be developed and enhanced by a learning model that stimulates students to build confidence in their capabilities. One such model is the Discovery Learning model with the SAVI approach, a learning model that involves the active participation of students in exploring and discovering their own knowledge and using it in problem solving by utilizing all the sensory devices they have. This naturalistic qualitative study aims to analyze student self-efficacy when the Discovery Learning model with the SAVI approach is used. The subjects of this study were 30 students, with a focus on eight students with high, medium, and low self-efficacy selected through purposive sampling. The data analysis used three stages: data reduction, data display, and conclusion drawing. Based on the results of the data analysis, it was concluded that the dimension of self-efficacy that appeared most dominantly in learning with the Discovery Learning model and SAVI approach was the magnitude dimension.

  17. Adding spatial flexibility to source-receptor relationships for air quality modeling.

    PubMed

    Pisoni, E; Clappier, A; Degraeuwe, B; Thunis, P

    2017-04-01

    To cope with computing power limitations, air quality models that are used in integrated assessment applications are generally approximated by simpler expressions referred to as "source-receptor relationships (SRR)". In addition to speed, it is desirable for the SRR also to be spatially flexible (applicable over a wide range of situations) and to require a "light setup" (based on a limited number of full Air Quality Model - AQM simulations). But "speed", "flexibility" and "light setup" do not naturally come together and a good compromise must be found that preserves "accuracy", i.e. a good comparability between SRR results and the AQM. In this work we further develop a SRR methodology to better capture spatial flexibility. The updated methodology is based on a cell-to-cell relationship, in which a bell-shaped function links emissions to concentrations. Maintaining a cell-to-cell relationship is shown to be the key element needed to ensure spatial flexibility, while at the same time the proposed approach to link emissions and concentrations guarantees a "light set-up" phase. Validation has been repeated on different areas and domain sizes (countries, regions and provinces throughout Europe) for precursors reduced independently or simultaneously. All runs showed a bias of around 10% between the full AQM and the SRR. This methodology allows assessing the impact on air quality of emission scenarios applied over any given area in Europe (regions, sets of regions, countries), provided that a limited number of AQM simulations are performed for training.
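
    A minimal sketch of the cell-to-cell idea follows: the concentration change at a receptor cell is expressed as a weighted sum of emission changes in surrounding cells, with a bell-shaped weight decaying with distance. The Gaussian form and all coefficients below are assumptions for illustration; in practice the weights would be fitted to a small set of full AQM runs.

```python
# Minimal sketch of a cell-to-cell source-receptor relationship: the concentration
# change at a receptor cell is a weighted sum of emission changes in all cells, with
# a bell-shaped (here Gaussian) weight decaying with distance. The weight parameters
# would normally be fitted to a small set of full AQM runs; values here are invented.
import numpy as np

def bell_weight(dist, alpha=1.0, sigma=25.0):
    return alpha * np.exp(-(dist / sigma) ** 2)

def delta_concentration(receptor_xy, cell_xy, delta_emis, alpha=1.0, sigma=25.0):
    d = np.linalg.norm(cell_xy - receptor_xy, axis=1)     # distances in km
    return np.sum(bell_weight(d, alpha, sigma) * delta_emis)

cells = np.array([[x, y] for x in range(0, 100, 10) for y in range(0, 100, 10)], float)
d_emis = np.full(len(cells), -0.1)                        # uniform 10% emission cut (arbitrary units)
print(delta_concentration(np.array([50.0, 50.0]), cells, d_emis))
```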

  18. Work stressors, sleep quality, and alcohol-related problems across deployment: A parallel process latent growth modeling approach among Navy members.

    PubMed

    Bravo, Adrian J; Kelley, Michelle L; Hollis, Brittany F

    2017-10-01

    This study examined how work stressors were associated with sleep quality and alcohol-related problems among U.S. Navy members over the course of deployment. Participants were 101 U.S. Navy members assigned to an Arleigh Burke-class destroyer who experienced an 8-month deployment after Operation Enduring Freedom/Operation Iraqi Freedom. Approximately 6 weeks prior to deployment, 6 weeks after deployment, and at 6-month reintegration, participants completed measures that assessed work stressors, sleep quality, and alcohol-related problems. A piecewise latent growth model was estimated in which the structural paths assessed whether work stressors influenced sleep quality or its growth over time and, in turn, whether sleep quality influenced the intercepts or growth of alcohol-related problems over time. A significant indirect effect was found such that increases in work stressors from pre- to postdeployment predicted decreases in sleep quality, which in turn were associated with increases in alcohol-related problems from pre- to postdeployment. These effects were maintained from postdeployment through the 6-month reintegration. Findings suggest that work stressors may have important implications for sleep quality and alcohol-related problems. Positive methods of addressing stress and techniques to improve sleep quality are needed as both may be associated with alcohol-related problems among current Navy members. Copyright © 2016 John Wiley & Sons, Ltd.
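
    To illustrate the piecewise ("two-slope") time coding that underlies such models, the sketch below simulates three measurement waves and fits a linear mixed model with statsmodels as a simpler stand-in for the SEM-based latent growth model used in the study; all data and variable names are hypothetical.

```python
# Rough sketch of the piecewise ("two-slope") time coding used in piecewise growth
# models, fitted here with a linear mixed model as a simpler stand-in for the SEM-based
# latent growth model in the study. Data are simulated; variable names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
rows = []
for pid in range(100):
    intercept = rng.normal(3.0, 0.5)
    slope1 = rng.normal(-0.4, 0.2)      # pre- to post-deployment change
    slope2 = rng.normal(0.1, 0.2)       # post-deployment to reintegration change
    for t1, t2 in [(0, 0), (1, 0), (1, 1)]:   # piecewise time codes for the three waves
        y = intercept + slope1 * t1 + slope2 * t2 + rng.normal(0, 0.3)
        rows.append({"id": pid, "t1": t1, "t2": t2, "sleep_quality": y})
df = pd.DataFrame(rows)

# Random-intercept model (random slopes omitted to keep the sketch small).
model = smf.mixedlm("sleep_quality ~ t1 + t2", df, groups=df["id"]).fit()
print(model.summary())
```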

  19. Quality by design approach for understanding the critical quality attributes of cyclosporine ophthalmic emulsion.

    PubMed

    Rahman, Ziyaur; Xu, Xiaoming; Katragadda, Usha; Krishnaiah, Yellela S R; Yu, Lawrence; Khan, Mansoor A

    2014-03-03

    Restasis is an ophthalmic cyclosporine emulsion used for the treatment of dry eye syndrome. There are no generic versions of this product, probably because of the limitations of establishing in vivo bioequivalence and the lack of alternative in vitro bioequivalence testing methods. The present investigation was carried out to understand and identify appropriate in vitro methods that can discriminate the effect of formulation and process variables on the critical quality attributes (CQA) of cyclosporine microemulsion formulations having the same qualitative (Q1) and quantitative (Q2) composition as Restasis. A quality by design (QbD) approach was used to understand the effect of formulation and process variables on the CQA of the cyclosporine microemulsion. The formulation variables chosen were mixing order method, phase volume ratio, and pH adjustment method, while the process variables were the temperature of primary and raw emulsion formation, microfluidizer pressure, and number of pressure cycles. The responses selected were particle size, turbidity, zeta potential, viscosity, osmolality, surface tension, contact angle, pH, and drug diffusion. The selected independent variables showed a statistically significant (p < 0.05) effect on droplet size, zeta potential, viscosity, turbidity, and osmolality. However, surface tension, contact angle, pH, and drug diffusion were not significantly affected by the independent variables. In summary, in vitro methods can detect formulation and manufacturing changes and would thus be important for quality control and for establishing sameness of cyclosporine ophthalmic products.

  20. A machine learning calibration model using random forests to improve sensor performance for lower-cost air quality monitoring

    NASA Astrophysics Data System (ADS)

    Zimmerman, Naomi; Presto, Albert A.; Kumar, Sriniwasa P. N.; Gu, Jason; Hauryliuk, Aliaksei; Robinson, Ellis S.; Robinson, Allen L.; Subramanian, R.

    2018-01-01

    Low-cost sensing strategies hold the promise of denser air quality monitoring networks, which could significantly improve our understanding of personal air pollution exposure. Additionally, low-cost air quality sensors could be deployed to areas where limited monitoring exists. However, low-cost sensors are frequently sensitive to environmental conditions and pollutant cross-sensitivities, which have historically been poorly addressed by laboratory calibrations, limiting their utility for monitoring. In this study, we investigated different calibration models for the Real-time Affordable Multi-Pollutant (RAMP) sensor package, which measures CO, NO2, O3, and CO2. We explored three methods: (1) laboratory univariate linear regression, (2) empirical multiple linear regression, and (3) machine-learning-based calibration models using random forests (RF). Calibration models were developed for 16-19 RAMP monitors (varied by pollutant) using training and testing windows spanning August 2016 through February 2017 in Pittsburgh, PA, US. The random forest models matched (CO) or significantly outperformed (NO2, CO2, O3) the other calibration models, and their accuracy and precision were robust over time for testing windows of up to 16 weeks. Following calibration, average mean absolute error on the testing data set from the random forest models was 38 ppb for CO (14 % relative error), 10 ppm for CO2 (2 % relative error), 3.5 ppb for NO2 (29 % relative error), and 3.4 ppb for O3 (15 % relative error), and Pearson r versus the reference monitors exceeded 0.8 for most units. Model performance is explored in detail, including a quantification of model variable importance, accuracy across different concentration ranges, and performance in a range of monitoring contexts including the National Ambient Air Quality Standards (NAAQS) and the US EPA Air Sensors Guidebook recommendations of minimum data quality for personal exposure measurement. A key strength of the RF approach is that
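
    A minimal sketch of the random-forest calibration idea follows: a low-cost sensor's raw signal, together with temperature and relative humidity, is regressed against a co-located reference monitor. The data below are synthetic stand-ins, not RAMP measurements, and the variable names are purely illustrative.

```python
# Minimal sketch of random-forest calibration of a low-cost gas sensor: the raw sensor
# signal plus temperature and relative humidity are used to predict the reference
# monitor concentration. Data are synthetic stand-ins for co-location measurements.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(42)
n = 2000
temp = rng.uniform(-5, 35, n)                 # deg C
rh = rng.uniform(20, 90, n)                   # %
true_co = rng.gamma(shape=2.0, scale=150, size=n)                  # "reference" CO in ppb
raw = 0.8 * true_co + 4 * temp - 1.5 * rh + rng.normal(0, 20, n)   # cross-sensitive raw signal

X = np.column_stack([raw, temp, rh])
X_train, X_test, y_train, y_test = train_test_split(X, true_co, test_size=0.3, random_state=0)

rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
pred = rf.predict(X_test)
print("MAE (ppb):", round(mean_absolute_error(y_test, pred), 1))
print("feature importances (raw, T, RH):", rf.feature_importances_.round(2))
```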

  1. Indoor Air Quality Building Education and Assessment Model

    EPA Pesticide Factsheets

    The Indoor Air Quality Building Education and Assessment Model (I-BEAM), released in 2002, is a guidance tool designed for use by building professionals and others interested in indoor air quality in commercial buildings.

  2. Towards new approaches in phenological modelling

    NASA Astrophysics Data System (ADS)

    Chmielewski, Frank-M.; Götz, Klaus-P.; Rawel, Harshard M.; Homann, Thomas

    2014-05-01

    For many decades, the modelling of phenological stages has been based on temperature sums, describing both the chilling and the forcing requirement of woody plants until the beginning of leafing or flowering. Parts of this approach go back to Reaumur (1735), who originally proposed the concept of growing degree-days. Now, there is a growing body of opinion that asks for new methods in phenological modelling and more in-depth studies on dormancy release of woody plants. This requirement is easily understandable if we consider the wide application of phenological models, which can even affect the results of climate models. To this day, a number of parameters in phenological models still need to be optimised against observations, although some basic physiological knowledge of the chilling and forcing requirement of plants is already considered in these approaches (semi-mechanistic models). What limits a fundamental improvement of these models is the lack of knowledge about the course of dormancy in woody plants, which cannot be directly observed and which is also insufficiently described in the literature. Modern metabolomic methods provide a solution for this problem and allow both the validation of currently used phenological models and the development of mechanistic approaches. In order to develop this kind of model, changes in metabolites (concentration, temporal course) must be set in relation to the variability of environmental (steering) parameters (weather, day length, etc.). This necessarily requires multi-year (3-5 yr) and high-resolution (weekly probes between autumn and spring) data. The feasibility of this approach has already been tested in a 3-year pilot study on sweet cherries. The suggested methodology is not limited to the flowering of fruit trees; it can also be applied to tree species of the natural vegetation, where even greater deficits in phenological modelling exist.
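
    A minimal sketch of the classical chilling/forcing temperature-sum model discussed above is given below: chill units accumulate until a chilling requirement is met, after which growing degree-days accumulate until a forcing requirement triggers the stage. All thresholds are invented for illustration.

```python
# Minimal sketch of a classical two-phase phenology model of the kind discussed above:
# chilling units accumulate below a base temperature until a chilling requirement is met,
# then forcing units (growing degree-days) accumulate until a forcing requirement triggers
# the phenological stage (e.g., flowering). All thresholds are illustrative.

def predict_onset(daily_mean_temp, t_chill=5.0, chill_req=60.0, t_base=5.0, force_req=150.0):
    chill, force = 0.0, 0.0
    chilling_done = False
    for day, temp in enumerate(daily_mean_temp, start=1):
        if not chilling_done:
            if temp < t_chill:
                chill += 1.0                        # one chill unit per cold day
            chilling_done = chill >= chill_req
        else:
            force += max(temp - t_base, 0.0)        # growing degree-days
            if force >= force_req:
                return day                          # day of year of predicted onset
    return None                                     # requirement not met within the series

if __name__ == "__main__":
    import math
    temps = [10 + 12 * math.sin(2 * math.pi * (d - 105) / 365) for d in range(1, 366)]
    print("predicted onset (day of year):", predict_onset(temps))
```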

  3. Work stressors, depressive symptoms and sleep quality among US Navy members: a parallel process latent growth modelling approach across deployment.

    PubMed

    Bravo, Adrian J; Kelley, Michelle L; Swinkels, Cindy M; Ulmer, Christi S

    2017-11-03

    The present study examined whether work stressors contribute to sleep problems and depressive symptoms over the course of deployment (i.e. pre-deployment, post-deployment and 6-month reintegration) among US Navy members. Specifically, we examined whether depressive symptoms or sleep quality mediate the relationships between work stressors and these outcomes. Participants were 101 US Navy members who experienced an 8-month deployment after Operational Enduring Freedom/Operation Iraqi Freedom. Using piecewise latent growth models, we found that increased work stressors were linked to increased depressive symptoms and decreased sleep quality across all three deployment stages. Further, increases in work stressors from pre- to post-deployment contributed to poorer sleep quality post-deployment via increasing depressive symptoms. Moreover, sleep quality mediated the association between increases in work stressors and increases in depressive symptoms from pre- to post-deployment. These effects were maintained from post-deployment through the 6-month reintegration. Although preliminary, our results suggest that changes in work stressors may have small, but significant implications for both depressive symptoms and quality of sleep over time, and a bi-directional relationship persists between sleep quality and depression across deployment. Strategies that target both stress and sleep could address both precipitating and perpetuating factors that affect sleep and depressive symptoms. © 2017 European Sleep Research Society.

  4. Development of a regional bio-optical model for water quality assessment in the US Virgin Islands

    NASA Astrophysics Data System (ADS)

    Kerrigan, Kristi Lisa

    Previous research in the US Virgin Islands (USVI) has demonstrated that land-based sources of pollution associated with watershed development and climate change are local and global factors causing coral reef degradation. A good indicator that can be used to assess stress on these environments is water quality. Conventional assessment methods based on in situ measurements are time-consuming and costly. Satellite remote sensing techniques offer better spatial coverage and temporal resolution to accurately characterize the dynamic nature of water quality parameters by applying bio-optical models. Chlorophyll-a, suspended sediments (TSM), and colored dissolved organic matter are color-producing agents (CPAs) that define the water quality and can be measured remotely. However, the interference of multiple optically active constituents that characterize the water column, as well as reflectance from the bottom, poses a challenge in the shallow coastal environments of the USVI. In this study, field and laboratory based data were collected from sites on St. Thomas and St. John to characterize the CPAs and the bottom reflectance of substrates. Results indicate that the optical properties of these waters are a function of multiple CPAs, with chlorophyll-a values ranging from 0.10 to 2.35 microg/L and TSM values from 8.97 to 15.7 mg/L. These data were combined with in situ hyperspectral radiometric and Landsat OLI satellite data to develop a regionally tiered model that can predict CPA concentrations using traditional band ratio and multivariate approaches. Band ratio models for the hyperspectral dataset (R2 = 0.35; RMSE = 0.10 microg/L) and the Landsat OLI dataset (R2 = 0.35; RMSE = 0.12 microg/L) indicated promising accuracy. However, a stronger model was developed using a multivariate partial least squares regression to identify wavelengths that are more sensitive to chlorophyll-a (R2 = 0.62, RMSE = 0.08 microg/L) and TSM (R2 = 0.55). This approach takes advantage of the full spectrum of
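
    The multivariate step can be illustrated with a partial least squares regression of chlorophyll-a on many reflectance bands at once, as sketched below using scikit-learn on synthetic spectra; these are stand-ins for the study's in situ hyperspectral data.

```python
# Sketch of the partial least squares (PLS) step: regress chlorophyll-a concentration on
# many reflectance bands at once instead of a single band ratio. Spectra and chlorophyll
# values below are synthetic stand-ins for the in situ hyperspectral data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n_samples, n_bands = 60, 150
chl = rng.uniform(0.1, 2.5, n_samples)                        # microg/L
wavelengths = np.linspace(400, 700, n_bands)
# reflectance with a chlorophyll-sensitive absorption feature near 675 nm plus noise
spectra = 0.05 - 0.01 * chl[:, None] * np.exp(-((wavelengths - 675) / 15) ** 2)
spectra += rng.normal(0, 0.002, spectra.shape)

pls = PLSRegression(n_components=5)
r2 = cross_val_score(pls, spectra, chl, cv=5, scoring="r2")
print("cross-validated R2:", r2.mean().round(2))
```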

  5. Challenges and opportunities for integrating lake ecosystem modelling approaches

    USGS Publications Warehouse

    Mooij, Wolf M.; Trolle, Dennis; Jeppesen, Erik; Arhonditsis, George; Belolipetsky, Pavel V.; Chitamwebwa, Deonatus B.R.; Degermendzhy, Andrey G.; DeAngelis, Donald L.; Domis, Lisette N. De Senerpont; Downing, Andrea S.; Elliott, J. Alex; Ruberto, Carlos Ruberto; Gaedke, Ursula; Genova, Svetlana N.; Gulati, Ramesh D.; Hakanson, Lars; Hamilton, David P.; Hipsey, Matthew R.; Hoen, Jochem 't; Hulsmann, Stephan; Los, F. Hans; Makler-Pick, Vardit; Petzoldt, Thomas; Prokopkin, Igor G.; Rinke, Karsten; Schep, Sebastiaan A.; Tominaga, Koji; Van Dam, Anne A.; Van Nes, Egbert H.; Wells, Scott A.; Janse, Jan H.

    2010-01-01

    A large number and wide variety of lake ecosystem models have been developed and published during the past four decades. We identify two challenges for making further progress in this field. One such challenge is to avoid developing more models largely following the concept of others ('reinventing the wheel'). The other challenge is to avoid focusing on only one type of model, while ignoring new and diverse approaches that have become available ('having tunnel vision'). In this paper, we aim at improving the awareness of existing models and knowledge of concurrent approaches in lake ecosystem modelling, without covering all possible model tools and avenues. First, we present a broad variety of modelling approaches. To illustrate these approaches, we give brief descriptions of rather arbitrarily selected sets of specific models. We deal with static models (steady state and regression models), complex dynamic models (CAEDYM, CE-QUAL-W2, Delft 3D-ECO, LakeMab, LakeWeb, MyLake, PCLake, PROTECH, SALMO), structurally dynamic models and minimal dynamic models. We also discuss a group of approaches that could all be classified as individual based: super-individual models (Piscator, Charisma), physiologically structured models, stage-structured models and trait-based models. We briefly mention genetic algorithms, neural networks, Kalman filters and fuzzy logic. Thereafter, we zoom in, as an in-depth example, on the multi-decadal development and application of the lake ecosystem model PCLake and related models (PCLake Metamodel, Lake Shira Model, IPH-TRIM3D-PCLake). In the discussion, we argue that while the historical development of each approach and model is understandable given its 'leading principle', there are many opportunities for combining approaches. We take the point of view that a single 'right' approach does not exist and should not be strived for. Instead, multiple modelling approaches, applied concurrently to a given problem, can help develop an integrative

  6. A new model in teaching undergraduate research: A collaborative approach and learning cooperatives.

    PubMed

    O'Neal, Pamela V; McClellan, Lynx Carlton; Jarosinski, Judith M

    2016-05-01

    Forming new, innovative collaborative approaches and cooperative learning methods between universities and hospitals maximizes learning for undergraduate nursing students in a research course and provides professional development for nurses on the unit. The purpose of this Collaborative Approach and Learning Cooperatives (CALC) Model is to foster working relations between faculty and hospital administrators, maximize small group learning of undergraduate nursing students, and promote onsite knowledge of evidence-based care for unit nurses. A quality improvement study using the CALC Model was implemented in an undergraduate nursing research course at a southern university. Hospital administrators provided a list of clinical concerns based on national performance outcome measures. Undergraduate junior nursing student teams chose a clinical question, gathered evidence from the literature, synthesized results, demonstrated practice application, and developed practice recommendations. The student teams developed posters, which were evaluated by hospital administrators. The administrators selected several posters to display on hospital units as a continuing education opportunity. The CALC Model is a systematic, calculated approach and an economically feasible plan to maximize personnel and financial resources to optimize collaboration and cooperative learning. Universities and hospital administrators, nurses, and students benefit from working together and learning from each other. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Development of hybrid genetic-algorithm-based neural networks using regression trees for modeling air quality inside a public transportation bus.

    PubMed

    Kadiyala, Akhil; Kaur, Devinder; Kumar, Ashok

    2013-02-01

    The present study developed a novel approach to modeling the indoor air quality (IAQ) of a public transportation bus through the development of hybrid genetic-algorithm-based neural networks (also known as evolutionary neural networks) with input variables optimized using regression trees, referred to as the GART approach. This study validated the applicability of the GART modeling approach to solving complex nonlinear systems by accurately predicting the monitored contaminants of carbon dioxide (CO2), carbon monoxide (CO), nitric oxide (NO), sulfur dioxide (SO2), 0.3-0.4 microm sized particle numbers, 0.4-0.5 microm sized particle numbers, particulate matter (PM) concentrations less than 1.0 microm (PM1.0), and PM concentrations less than 2.5 microm (PM2.5) inside a public transportation bus operating on 20% grade biodiesel in Toledo, OH. First, the important variables affecting each monitored in-bus contaminant were determined using regression trees. Second, analysis of variance was used as a complementary sensitivity analysis to the regression tree results to determine a subset of statistically significant variables affecting each monitored in-bus contaminant. Finally, the identified subsets of statistically significant variables were used as inputs to develop three artificial neural network (ANN) models. The models developed were the regression tree-based back-propagation network (BPN-RT), the regression tree-based radial basis function network (RBFN-RT), and the GART model. Performance measures were used to validate the predictive capacity of the developed IAQ models. The results from this approach were compared with the results obtained from using a theoretical approach and a generalized practicable approach to modeling IAQ that included the consideration of additional independent variables when developing the aforementioned ANN models. The hybrid GART models were able to capture the majority of the variance in the monitored in-bus contaminants. The genetic
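
    A minimal sketch of the two-stage idea follows: a regression tree ranks the candidate input variables, and a neural network is then trained on the selected subset. scikit-learn's gradient-trained MLPRegressor is used here as a stand-in for the genetic-algorithm-based networks of the paper, and the data are synthetic.

```python
# Minimal sketch of the two-stage GART idea: rank candidate inputs with a regression
# tree, keep the most important ones, then fit a neural network on that subset.
# sklearn's MLPRegressor (gradient-based) stands in for the genetic-algorithm-trained
# network described in the paper; data are synthetic.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 1500
X = rng.normal(size=(n, 8))                       # e.g. speed, ventilation, outdoor levels, ...
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + 0.5 * X[:, 5] + rng.normal(0, 0.3, n)  # in-bus CO2 proxy

tree = DecisionTreeRegressor(max_depth=5, random_state=0).fit(X, y)
keep = np.argsort(tree.feature_importances_)[::-1][:3]       # top-3 variables
print("selected input columns:", keep)

X_train, X_test, y_train, y_test = train_test_split(X[:, keep], y, test_size=0.3, random_state=0)
net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X_train, y_train)
print("R2 on held-out data:", round(net.score(X_test, y_test), 2))
```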

  8. An Approach to Improve the Quality of Infrared Images of Vein-Patterns

    PubMed Central

    Lin, Chih-Lung

    2011-01-01

    This study develops an approach to improve the quality of infrared (IR) images of vein-patterns, which usually have noise, low contrast, low brightness and small objects of interest, thus requiring preprocessing to improve their quality. The main characteristics of the proposed approach are that no prior knowledge about the IR image is necessary and no parameters must be preset. Two main goals are sought: impulse noise reduction and adaptive contrast enhancement technologies. In our study, a fast median-based filter (FMBF) is developed as a noise reduction method. It is based on an IR imaging mechanism to detect the noisy pixels and on a modified median-based filter to remove the noisy pixels in IR images. FMBF has the advantage of a low computation load. In addition, FMBF can retain reasonably good edges and texture information when the size of the filter window increases. The most important advantage is that the peak signal-to-noise ratio (PSNR) caused by FMBF is higher than the PSNR caused by the median filter. A hybrid cumulative histogram equalization (HCHE) is proposed for adaptive contrast enhancement. HCHE can automatically generate a hybrid cumulative histogram (HCH) based on two different pieces of information about the image histogram. HCHE can improve the enhancement effect on hot objects rather than background. The experimental results are addressed and demonstrate that the proposed approach is feasible for use as an effective and adaptive process for enhancing the quality of IR vein-pattern images. PMID:22247674

  9. An approach to improve the quality of infrared images of vein-patterns.

    PubMed

    Lin, Chih-Lung

    2011-01-01

    This study develops an approach to improve the quality of infrared (IR) images of vein-patterns, which usually have noise, low contrast, low brightness and small objects of interest, thus requiring preprocessing to improve their quality. The main characteristics of the proposed approach are that no prior knowledge about the IR image is necessary and no parameters must be preset. Two main goals are sought: impulse noise reduction and adaptive contrast enhancement technologies. In our study, a fast median-based filter (FMBF) is developed as a noise reduction method. It is based on an IR imaging mechanism to detect the noisy pixels and on a modified median-based filter to remove the noisy pixels in IR images. FMBF has the advantage of a low computation load. In addition, FMBF can retain reasonably good edges and texture information when the size of the filter window increases. The most important advantage is that the peak signal-to-noise ratio (PSNR) caused by FMBF is higher than the PSNR caused by the median filter. A hybrid cumulative histogram equalization (HCHE) is proposed for adaptive contrast enhancement. HCHE can automatically generate a hybrid cumulative histogram (HCH) based on two different pieces of information about the image histogram. HCHE can improve the enhancement effect on hot objects rather than background. The experimental results are addressed and demonstrate that the proposed approach is feasible for use as an effective and adaptive process for enhancing the quality of IR vein-pattern images.
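
    As a generic illustration of the two preprocessing stages described above, the sketch below applies a standard median filter for impulse-noise removal followed by histogram equalization for contrast enhancement. It does not reproduce the paper's FMBF or HCHE algorithms; the input image is synthetic.

```python
# Generic sketch of the two preprocessing stages described above: impulse-noise
# removal with a median filter followed by contrast enhancement via histogram
# equalization. This is standard filtering/equalization, not the paper's FMBF or
# HCHE algorithms; the input image is synthetic.
import numpy as np
from scipy.ndimage import median_filter

rng = np.random.default_rng(0)
img = rng.normal(90, 10, size=(128, 128)).clip(0, 255)          # dim, low-contrast "IR" image
salt = rng.random(img.shape) < 0.02                             # 2% impulse noise
img[salt] = 255

denoised = median_filter(img, size=3)

# Histogram equalization: map grey levels through the normalized cumulative histogram.
hist, bins = np.histogram(denoised.astype(np.uint8), bins=256, range=(0, 256))
cdf = hist.cumsum() / hist.sum()
equalized = (255 * cdf[denoised.astype(np.uint8)]).astype(np.uint8)

print("contrast (std) before:", round(img.std(), 1), "after:", round(equalized.std(), 1))
```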

  10. MOOC Quality: The Need for New Measures

    ERIC Educational Resources Information Center

    Hood, Nina; Littlejohn, Allison

    2016-01-01

    MOOCs are re-operationalising traditional concepts in education. While they draw on elements of existing educational and learning models, they represent a new approach to instruction and learning. The challenges MOOCs present to traditional education models have important implications for approaching and assessing quality. This paper foregrounds…

  11. ONE-ATMOSPHERE DYNAMICS DESCRIPTION IN THE MODELS-3 COMMUNITY MULTI-SCALE AIR QUALITY (CMAQ) MODELING SYSTEM

    EPA Science Inventory

    This paper proposes a general procedure to link meteorological data with air quality models, such as U.S. EPA's Models-3 Community Multi-scale Air Quality (CMAQ) modeling system. CMAQ is intended to be used for studying multi-scale (urban and regional) and multi-pollutant (ozon...

  12. The Quality Improvement Management Approach as Implemented in a Middle School.

    ERIC Educational Resources Information Center

    Bayless, David L.; And Others

    1992-01-01

    The Total Quality Management Theory of W. E. Deming can be adapted within an educational organization to build structures that support educators' beliefs. A case study of the implementation of Deming principles at the LeRoy Martin Middle School in Raleigh (North Carolina) illustrates the effectiveness of this management approach. (SLD)

  13. Education Quality in Kazakhstan in the Context of Competence-Based Approach

    ERIC Educational Resources Information Center

    Nabi, Yskak; Zhaxylykova, Nuriya Ermuhametovna; Kenbaeva, Gulmira Kaparbaevna; Tolbayev, Abdikerim; Bekbaeva, Zeinep Nusipovna

    2016-01-01

    The background of this paper is to present how the education system of Kazakhstan has evolved during the last 24 years of independence, highlighting the contemporary transformational processes. We defined the aim as identifying education quality in the context of a competence-based approach. Methods: Analysis of references, interviewing, experimental work.…

  14. Refining models for quantifying the water quality benefits of improved animal management for use in water quality trading

    USDA-ARS?s Scientific Manuscript database

    Water quality trading (WQT) is a market-based approach that allows point sources of water pollution to meet their water quality obligations by purchasing credits from the reduced discharges from other point or nonpoint sources. Non-permitted animal operations and fields of permitted animal operatio...

  15. A linked hydrodynamic and water quality model for the Salton Sea

    USGS Publications Warehouse

    Chung, E.G.; Schladow, S.G.; Perez-Losada, J.; Robertson, Dale M.

    2008-01-01

    A linked hydrodynamic and water quality model was developed and applied to the Salton Sea. The hydrodynamic component is based on the one-dimensional numerical model, DLM. The water quality model is based on a new conceptual model for nutrient cycling in the Sea, and simulates temperature, total suspended sediment concentration, nutrient concentrations, including PO4(3-), NO3(-) and NH4(+), DO concentration and chlorophyll a concentration as functions of depth and time. Existing water temperature data from 1997 were used to verify that the model could accurately represent the onset and breakup of thermal stratification. 1999 is the only year with a near-complete dataset for water quality variables for the Salton Sea. The linked hydrodynamic and water quality model was run for 1999, and by adjustment of rate coefficients and other water quality parameters, a good match with the data was obtained. In this article, the model is fully described and the model results for reductions in external phosphorus load on chlorophyll a distribution are presented. © 2008 Springer Science+Business Media B.V.

  16. EVALUATING AND USING AIR QUALITY MODELS

    EPA Science Inventory

    Grid-based models are being used to assess the magnitude of the pollution problem and to design emission control strategies to achieve compliance with the relevant air quality standards in the United States.

  17. Spatial Allocator for air quality modeling

    EPA Pesticide Factsheets

    The Spatial Allocator is a set of tools that helps users manipulate and generate data files related to emissions and air quality modeling without requiring the use of a commercial Geographic Information System.

  18. An assessment model for quality management

    NASA Astrophysics Data System (ADS)

    Völcker, Chr.; Cass, A.; Dorling, A.; Zilioli, P.; Secchi, P.

    2002-07-01

    SYNSPACE, together with InterSPICE and Alenia Spazio, is developing an assessment method to determine the capability of an organisation in the area of quality management. The method, sponsored by the European Space Agency (ESA), is called S9kS (SPiCE-9000 for SPACE). S9kS is based on ISO 9001:2000 with additions from the quality standards issued by the European Committee for Space Standardization (ECSS) and ISO 15504 - Process Assessment. The result is a reference model that supports the expansion of the generic process assessment framework provided by ISO 15504 to non-software areas. In order to be compliant with ISO 15504, requirements from ISO 9001 and ECSS-Q-20 and Q-20-09 have been turned into process definitions in terms of Purpose and Outcomes, supported by a list of detailed indicators such as Practices, Work Products and Work Product Characteristics. In coordination with this project, the capability dimension of ISO 15504 has been revised to be consistent with ISO 9001. As the contributions from ISO 9001 and the space quality assurance standards are separable, the stripped-down version S9k offers organisations in all industries an assessment model based solely on ISO 9001, and is therefore of interest to any organisation that intends to improve its ISO 9001-based quality management system.

  19. [Integrated Quality Management System (IQMS): a model for improving the quality of reproductive health care in rural Kenya].

    PubMed

    Herrler, Claudia; Bramesfeld, Anke; Brodowski, Marc; Prytherch, Helen; Marx, Irmgard; Nafula, Maureen; Richter-Aairijoki, Heide; Musyoka, Lucy; Marx, Michael; Szecsenyi, Joachim

    2015-01-01

    To develop a model aiming to improve the quality of services for reproductive health care in rural Kenya and designed to measure the quality of reproductive health services in such a way that allows these services to identify measures for improving their performance. The Integrated Quality Management System (IQMS) was developed on the basis of a pre-existing and validated model for quality promotion, namely the European Practice Assessment (EPA). The methodology for quality assessment and feedback of assessment results to the service teams was adopted from the EPA model. Quality assessment methodology included data assessment through staff, patient surveys and service visitation. Quality is assessed by indicators, and so indicators had to be developed that were appropriate for assessing reproductive health care in rural Kenya. A search of the Kenyan and international literature was conducted to identify potential indicators. These were then rated for their relevance and clarity by a panel of Kenyan experts. 260 indicators were rated as relevant and assigned to 29 quality dimensions and 5 domains. The implementation of IQMS in ten facilities showed that IQMS is a feasible model for assessing the quality of reproductive health services in rural Kenya. IQMS enables these services to identify quality improvement targets and necessary improvement measures. Both strengths and limitations of IQMS will be discussed. Copyright © 2015. Published by Elsevier GmbH.

  20. Water Quality Modeling in the Dead End Sections of Drinking ...

    EPA Pesticide Factsheets

    Dead-end sections of drinking water distribution networks are known to be problematic zones in terms of water quality degradation. Extended residence time due to water stagnation leads to rapid reduction of disinfectant residuals allowing the regrowth of microbial pathogens. Water quality models developed so far apply spatial aggregation and temporal averaging techniques for hydraulic parameters by assigning hourly averaged water demands to the main nodes of the network. Although this practice has generally resulted in minimal loss of accuracy for the predicted disinfectant concentrations in main water transmission lines, this is not the case for the peripheries of a distribution network. This study proposes a new approach for simulating disinfectant residuals in dead end pipes while accounting for both spatial and temporal variability in hydraulic and transport parameters. A stochastic demand generator was developed to represent residential water pulses based on a non-homogenous Poisson process. Dispersive solute transport was considered using highly dynamic dispersion rates. A genetic algorithm was used to calibrate the axial hydraulic profile of the dead-end pipe based on the different demand shares of the withdrawal nodes. A parametric sensitivity analysis was done to assess the model performance under variation of different simulation parameters. A group of Monte-Carlo ensembles was carried out to investigate the influence of spatial and temporal variations
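
    The stochastic demand generator described above draws residential water pulses from a non-homogeneous Poisson process; a common way to simulate such a process is thinning (Lewis-Shedler), sketched below with an invented diurnal intensity function. This is a generic illustration under stated assumptions, not the study's calibrated generator.

```python
# Minimal sketch of a stochastic demand generator: arrival times of residential water
# pulses are drawn from a non-homogeneous Poisson process by thinning (Lewis-Shedler).
# The diurnal intensity function and its parameters are invented for illustration.
import math
import random

def intensity(t_hours):
    """Expected pulses per hour, with morning and evening peaks (hypothetical)."""
    return 2.0 + 6.0 * math.exp(-((t_hours % 24 - 7) / 1.5) ** 2) \
               + 5.0 * math.exp(-((t_hours % 24 - 19) / 2.0) ** 2)

def nhpp_thinning(t_end_hours, lam_max=9.0, seed=1):
    random.seed(seed)
    t, arrivals = 0.0, []
    while t < t_end_hours:
        t += random.expovariate(lam_max)          # candidate from homogeneous process
        if t < t_end_hours and random.random() < intensity(t) / lam_max:
            arrivals.append(t)                    # accept with probability lambda(t)/lambda_max
    return arrivals

pulses = nhpp_thinning(24.0)
print(len(pulses), "demand pulses in 24 h; first few (h):", [round(p, 2) for p in pulses[:5]])
```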

  1. A simplified approach to quasi-linear viscoelastic modeling

    PubMed Central

    Nekouzadeh, Ali; Pryse, Kenneth M.; Elson, Elliot L.; Genin, Guy M.

    2007-01-01

    The fitting of quasi-linear viscoelastic (QLV) constitutive models to material data often involves somewhat cumbersome numerical convolution. A new approach to treating quasi-linearity in one dimension is described and applied to characterize the behavior of reconstituted collagen. This approach is based on a new principle for including nonlinearity and requires considerably less computation than other comparable models for both model calibration and response prediction, especially for smoothly applied stretching. Additionally, the approach allows relaxation to adapt with the strain history. The modeling approach is demonstrated through tests on pure reconstituted collagen. Sequences of “ramp-and-hold” stretching tests were applied to rectangular collagen specimens. The relaxation force data from the “hold” was used to calibrate a new “adaptive QLV model” and several models from literature, and the force data from the “ramp” was used to check the accuracy of model predictions. Additionally, the ability of the models to predict the force response on a reloading of the specimen was assessed. The “adaptive QLV model” based on this new approach predicts collagen behavior comparably to or better than existing models, with much less computation. PMID:17499254
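
    For context, the sketch below evaluates the conventional QLV hereditary integral by direct numerical convolution of a reduced relaxation function with increments of a nonlinear elastic response; this is the kind of computation whose cost the adaptive QLV formulation aims to reduce. The relaxation and elastic functions and all parameter values are illustrative, not the calibrated collagen values from the paper.

```python
# Sketch of the conventional QLV hereditary integral evaluated by direct numerical
# convolution, i.e. the computation whose cost the "adaptive QLV" formulation above
# aims to reduce. The reduced relaxation function, elastic response and parameters
# are illustrative, not the calibrated collagen values from the paper.
import numpy as np

def reduced_relaxation(t, g1=0.6, tau=5.0):
    """Single-term Prony series: G(t) = g_inf + g1 * exp(-t/tau), normalised so G(0) = 1."""
    return (1.0 - g1) + g1 * np.exp(-t / tau)

def elastic_response(strain, a=2.0, b=8.0):
    """Nonlinear instantaneous elastic stress (exponential toe region)."""
    return a * (np.exp(b * strain) - 1.0)

dt = 0.01
t = np.arange(0, 30, dt)
strain = np.minimum(t / 2.0, 0.1)                 # ramp to 10% strain in 2 s, then hold

dsigma_e = np.diff(elastic_response(strain), prepend=0.0)   # increments of elastic stress
G = reduced_relaxation(t)
# sigma(t_i) = sum_j G(t_i - t_j) * dsigma_e_j  (discrete hereditary integral)
stress = np.convolve(G, dsigma_e)[: len(t)]

print("peak stress:", round(stress.max(), 3), "stress after relaxation:", round(stress[-1], 3))
```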

  2. Meteorological Processes Affecting Air Quality – Research and Model Development Needs

    EPA Science Inventory

    Meteorology modeling is an important component of air quality modeling systems that defines the physical and dynamical environment for atmospheric chemistry. The meteorology models used for air quality applications are based on numerical weather prediction models that were devel...

  3. Private healthcare quality: applying a SERVQUAL model.

    PubMed

    Butt, Mohsin Muhammad; de Run, Ernest Cyril

    2010-01-01

    This paper seeks to develop and test the SERVQUAL model scale for measuring Malaysian private health service quality. The study consists of 340 randomly selected participants visiting a private healthcare facility during a three-month data collection period. Data were analyzed using means, correlations, principal component and confirmatory factor analysis to establish the modified SERVQUAL scale's reliability, underlying dimensionality and convergent, discriminant validity. Results indicate a moderate negative quality gap for overall Malaysian private healthcare service quality. Results also indicate a moderate negative quality gap on each service quality scale dimension. However, scale development analysis yielded excellent results, which can be used in wider healthcare policy and practice. Respondents were skewed towards a younger population, causing concern that the results might not represent all Malaysian age groups. The study's major contribution is that it offers a way to assess private healthcare service quality. Second, it successfully develops a scale that can be used to measure health service quality in Malaysian contexts.
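
    As an illustration of how SERVQUAL gap scores are typically computed (mean perception minus mean expectation per dimension), the sketch below uses hypothetical Likert ratings; it is not the study's Malaysian survey data.

```python
# Minimal sketch of the SERVQUAL gap computation: for each quality dimension the gap
# is the mean perception score minus the mean expectation score, so a negative value
# indicates service falling short of expectations. Ratings below are hypothetical.
import statistics

responses = {
    #  dimension: (expectation ratings, perception ratings) on a 1-7 Likert scale
    "tangibles":      ([6, 7, 6, 5], [5, 6, 5, 5]),
    "reliability":    ([7, 7, 6, 7], [5, 6, 5, 6]),
    "responsiveness": ([6, 6, 7, 6], [5, 5, 6, 5]),
    "assurance":      ([7, 6, 6, 7], [6, 6, 5, 6]),
    "empathy":        ([6, 6, 5, 6], [5, 5, 5, 5]),
}

gaps = {dim: statistics.mean(p) - statistics.mean(e) for dim, (e, p) in responses.items()}
for dim, gap in gaps.items():
    print(f"{dim:15s} gap = {gap:+.2f}")
print("overall service quality gap:", round(statistics.mean(gaps.values()), 2))
```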

  4. No-reference quality assessment based on visual perception

    NASA Astrophysics Data System (ADS)

    Li, Junshan; Yang, Yawei; Hu, Shuangyan; Zhang, Jiao

    2014-11-01

    The visual quality assessment of images/videos is an ongoing hot research topic, which has become more and more important for numerous image and video processing applications with the rapid development of digital imaging and communication technologies. The goal of image quality assessment (IQA) algorithms is to automatically assess the quality of images/videos in agreement with human quality judgments. Up to now, two kinds of models have been used for IQA, namely full-reference (FR) and no-reference (NR) models. For FR models, IQA algorithms interpret image quality as fidelity or similarity with a perfect image in some perceptual space. However, the reference image is not available in many practical applications, and an NR IQA approach is desired. Considering natural vision as optimized by millions of years of evolutionary pressure, many methods attempt to achieve consistency in quality prediction by modeling salient physiological and psychological features of the human visual system (HVS). To reach this goal, researchers try to simulate the HVS with image sparsity coding and supervised machine learning, which are two main features of the HVS. A typical HVS captures scenes by sparsity coding and uses experienced knowledge to apperceive objects. In this paper, we propose a novel IQA approach based on visual perception. First, a standard model of the HVS is studied and analyzed, and the sparse representation of the image is accomplished with the model; then, the mapping correlation between sparse codes and subjective quality scores is trained with the regression technique of least squares support vector machine (LS-SVM), which yields a regressor that can predict image quality; finally, the visual quality metric of an image is predicted with the trained regressor. We validate the performance of the proposed approach on the Laboratory for Image and Video Engineering (LIVE) database; the specific distortion types present in the database are: 227 images of JPEG2000, 233

  5. Posterior limb of the internal capsule predicts poor quality of life in patients with Parkinson's disease: connectometry approach.

    PubMed

    Ghazi Sherbaf, Farzaneh; Mojtahed Zadeh, Mahtab; Haghshomar, Maryam; Aarabi, Mohammad Hadi

    2018-03-14

    Psychiatric symptoms and motor impairment are major contributors to poor quality of life in patients with Parkinson's disease (PD). Here, we applied a novel diffusion-weighted imaging approach, diffusion MRI connectometry, to investigate the correlation of quality of life, evaluated by the Parkinson's Disease Questionnaire (PDQ-39), with white matter structural connectivity in 27 non-demented PD patients (disease duration 5.3 ± 2.9 years, H and Y stage 1.5 ± 0.6, UPDRS-III 13.7 ± 6.5, indicating unilateral and mild motor involvement). The connectometry analysis demonstrated that increased connectivity in the bilateral posterior limbs of the internal capsule (PLIC) was related to higher quality of life (FDR = 0.027) in a multiple regression model. The present study suggests for the first time a neural basis of quality of life in PD, in light of the major determinants of poor quality of life in these patients: anxiety, depression, apathy and motor impairment. The results in our sample of non-demented PD patients, who had relatively mild motor impairment and no apparent sign of depression or anxiety, also identify an association of the PLIC with quality of life in PD that is not explained by these factors.

  6. Fuzzy intelligent quality monitoring model for X-ray image processing.

    PubMed

    Khalatbari, Azadeh; Jenab, Kouroush

    2009-01-01

    Today's imaging diagnosis needs to adapt modern techniques of quality engineering to maintain and improve its accuracy and reliability in the health care system. One of the main factors that influences the diagnostic accuracy of plain film X-ray in detecting pathology is the level of film exposure. If the level of film exposure is not adequate, a normal body structure may be interpreted as pathology and vice versa. This not only influences patient management but also has an impact on health care cost and the patient's quality of life. Therefore, providing an accurate and high quality image is the first step toward excellent patient management in any health care system. In this paper, we study these techniques and also present a fuzzy intelligent quality monitoring model, which can be used to keep variables from degrading image quality. The variables derived from chemical activity, cleaning procedures, maintenance, and monitoring may not be sensed, measured, or calculated precisely due to uncertain situations. Therefore, a gamma-level fuzzy Bayesian model for quality monitoring of image processing is proposed. In order to apply the Bayesian concept, the fuzzy quality characteristics are assumed to be fuzzy random variables. Using the fuzzy quality characteristics, the newly developed model calculates the degradation risk for image processing. A numerical example is also presented to demonstrate the application of the model.

  7. Life course socio-economic position and quality of life in adulthood: a systematic review of life course models

    PubMed Central

    2012-01-01

    Background A relationship between current socio-economic position and subjective quality of life has been demonstrated, using wellbeing, life and needs satisfaction approaches. Less is known regarding the influence of different life course socio-economic trajectories on later quality of life. Several conceptual models have been proposed to help explain potential life course effects on health, including accumulation, latent, pathway and social mobility models. This systematic review aimed to assess whether evidence supported an overall relationship between life course socio-economic position and quality of life during adulthood and if so, whether there was support for one or more life course models. Methods A review protocol was developed detailing explicit inclusion and exclusion criteria, search terms, data extraction items and quality appraisal procedures. Literature searches were performed in 12 electronic databases during January 2012 and the references and citations of included articles were checked for additional relevant articles. Narrative synthesis was used to analyze extracted data and studies were categorized based on the life course model analyzed. Results Twelve studies met the eligibility criteria and used data from 10 datasets and five countries. Study quality varied and heterogeneity between studies was high. Seven studies assessed social mobility models, five assessed the latent model, two assessed the pathway model and three tested the accumulation model. Evidence indicated an overall relationship, but mixed results were found for each life course model. Some evidence was found to support the latent model among women, but not men. Social mobility models were supported in some studies, but overall evidence suggested little to no effect. Few studies addressed accumulation and pathway effects and study heterogeneity limited synthesis. Conclusions To improve potential for synthesis in this area, future research should aim to increase study

  8. A nested observation and model approach to non-linear groundwater-surface water interactions.

    NASA Astrophysics Data System (ADS)

    van der Velde, Y.; Rozemeijer, J. C.; de Rooij, G. H.

    2009-04-01

    Surface water quality measurements in The Netherlands are scattered in time and space. Therefore, the water quality status and its variations and trends are difficult to determine. In order to reach the water quality goals of the European Water Framework Directive, we need to improve our understanding of the dynamics of surface water quality and of the processes that affect it. In heavily drained lowland catchments, groundwater influences the discharge towards the surface water network in many complex ways. In particular, a strongly seasonal contracting and expanding system of discharging ditches and streams affects discharge and solute transport. At a tube-drained field site, the tube drain flux and the combined flux of all other flow routes towards a 45 m stretch of surface water were measured for a year. Groundwater levels at various locations in the field and the discharge at two nested catchment scales were also monitored. The distinct response of individual flow routes to rainfall events at the field site allowed us to separate the discharge at a 4 ha catchment and at a 6 km2 catchment into flow route contributions. The results of this nested experimental setup, combined with the results of a distributed hydrological model, have led to the formulation of a process model approach that focuses on the spatial variability of discharge generation driven by temporal and spatial variations in groundwater levels. The main idea of this approach is that discharge is not generated by catchment-average storages or groundwater heads, but mainly by point-scale extremes, i.e. extremely low permeability, extremely high groundwater heads or extremely low surface elevations, all leading to catchment discharge. We focused on describing the spatial extremes in point-scale storages, and this led to a simple and measurable expression that governs the non-linear groundwater-surface water interaction. We will present the analysis of the field site data to demonstrate the potential

  9. Chronic care model strategies in the United States and Germany deliver patient-centered, high-quality diabetes care.

    PubMed

    Stock, Stephanie; Pitcavage, James M; Simic, Dusan; Altin, Sibel; Graf, Christian; Feng, Wen; Graf, Thomas R

    2014-09-01

    Improving the quality of care for chronic diseases is an important issue for most health care systems in industrialized nations. One widely adopted approach is the Chronic Care Model (CCM), which was first developed in the late 1990s. In this article we present the results from two large surveys in the United States and Germany that report patients' experiences in different models of patient-centered diabetes care, compared to the experiences of patients who received routine diabetes care in the same systems. The study populations were enrolled in either Geisinger Health System in Pennsylvania or Barmer, a German sickness fund that provides medical insurance nationwide. Our findings suggest that patients with type 2 diabetes who were enrolled in the care models that exhibited key features of the CCM were more likely to receive care that was patient-centered, high quality, and collaborative, compared to patients who received routine care. This study demonstrates that quality improvement can be realized through the application of the Chronic Care Model, regardless of the setting or distinct characteristics of the program. Project HOPE—The People-to-People Health Foundation, Inc.

  10. Approaches to Identify Exceedances of Water Quality Thresholds Associated with Ocean Conditions

    EPA Science Inventory

    WED scientists have developed a method to help distinguish whether failures to meet water quality criteria are associated with natural coastal upwelling by using the statistical approach of logistic regression. Estuaries along the west coast of the United States periodically ha...
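
    As a rough illustration of the statistical approach named above, the following Python sketch fits a logistic regression that relates a hypothetical upwelling index and near-bottom temperature to a binary exceedance flag. The predictors, coefficients and data are invented for demonstration and are not the WED method or dataset.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Hypothetical data: rows are monitoring events; columns are an upwelling index and
    # near-bottom temperature; y flags whether a water quality criterion was exceeded.
    rng = np.random.default_rng(42)
    upwelling = rng.normal(size=200)
    temperature = 12 - 1.5 * upwelling + rng.normal(scale=0.5, size=200)
    X = np.column_stack([upwelling, temperature])
    latent = 1.8 * upwelling - 0.4 * temperature + rng.normal(scale=0.5, size=200)
    y = (latent > np.median(latent)).astype(int)

    model = LogisticRegression().fit(X, y)
    print("coefficients:", model.coef_, "intercept:", model.intercept_)
    # Probability of exceedance for a strong-upwelling, cold-water event (made-up values)
    print("P(exceedance):", model.predict_proba([[2.0, 9.0]])[0, 1])
    ```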

  11. Linkages between Total Quality Management and the Outcomes-Based Approach in an Education Environment

    ERIC Educational Resources Information Center

    de Jager, H. J.; Nieuwenhuis, F. J.

    2005-01-01

    South Africa has embarked on a process of education renewal by adopting outcomes-based education (OBE). This paper focuses on the linkages between total quality management (TQM) and the outcomes-based approach in an education context. Quality assurance in academic programmes in higher education in South Africa is, in some instances, based on the…

  12. Quality-by-design approach for the development of telmisartan potassium tablets.

    PubMed

    Oh, Ga-Hui; Park, Jin-Hyun; Shin, Hye-Won; Kim, Joo-Eun; Park, Young-Joon

    2018-05-01

    A quality-by-design (QbD) approach was adopted to develop telmisartan potassium (TP) tablets bioequivalent to the commercially available Micardis® (telmisartan free base) tablets. The dissolution pattern and impurity profile of TP tablets differed from those of Micardis® tablets because telmisartan free base is poorly soluble in water. After identifying the quality target product profile and critical quality attributes (CQAs), drug dissolution and impurities were predicted to be high-risk CQAs. To determine the exact range and cause of the risks, we used the risk assessment (RA) tools preliminary hazard analysis and failure mode and effects analysis to identify the parameters affecting drug dissolution, impurities, and formulation. The design space was optimized using a face-centered central composite design, one of the design of experiments (DoE) methods. The binder, disintegrant, and kneading time in wet granulation were identified as the X values (factors) affecting the Y values (responses: disintegration, hardness, friability, dissolution, and impurities). After determining the design space that delivered the desired Y values, the TP tablets were formulated and their dissolution pattern was compared with that of the reference tablet. The selected TP tablet formulated within the design space showed dissolution similar to that of Micardis® tablets at pH 7.5, and the QbD-developed TP tablet was bioequivalent to Micardis® tablets in beagle dogs.
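
    A face-centered central composite design such as the one mentioned above can be generated directly. The sketch below builds the coded design matrix for three factors and decodes it to illustrative factor ranges; the binder, disintegrant and kneading-time ranges are assumptions for demonstration, not the values used in the study.

    ```python
    import itertools
    import numpy as np

    def face_centered_ccd(n_factors=3, n_center=3):
        """Coded design matrix (-1, 0, +1) for a face-centered central composite design."""
        corners = np.array(list(itertools.product([-1, 1], repeat=n_factors)), float)
        axial = np.zeros((2 * n_factors, n_factors))
        for i in range(n_factors):
            axial[2 * i, i] = -1.0          # alpha = 1 places axial runs on the cube faces
            axial[2 * i + 1, i] = 1.0
        center = np.zeros((n_center, n_factors))
        return np.vstack([corners, axial, center])

    # Decode to hypothetical factor ranges: binder %, disintegrant %, kneading time (min).
    lows = np.array([2.0, 3.0, 5.0])
    highs = np.array([6.0, 9.0, 15.0])
    coded = face_centered_ccd()
    actual = lows + (coded + 1) / 2 * (highs - lows)
    print(actual)                           # 8 factorial + 6 axial + 3 center runs
    ```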

  13. A Systems Approach to Manage Drinking Water Quality ...

    EPA Pesticide Factsheets

    Drinking water supplies can be vulnerable to impacts from short-term weather events, long-term changes in land use and climate, and water quality controls in treatment and distribution. Disinfection by-product (DBP) formation in drinking water is a prominent example used here to illustrate water supply vulnerability and examine technological options for adaptation. Total organic carbon (TOC) in surface water can vary significantly due to changes, or a combination of changes, in watershed land use, climate variability, and extreme meteorological events (e.g., hurricanes). On the other hand, water demand is known to vary temporally and spatially, leading to changes in water age and hence in DBP formation potential. Typically, a drinking water facility is designed to operate within a projected range of influent water quality and water demand. When the variations exceed the design range, the water supply becomes vulnerable with respect to compliance with the Safe Drinking Water Act (SDWA) Stage 2 disinfection by-product (DBP) rules. This paper describes a framework of systems-level modeling, monitoring and control for adaptive planning and system operation. The framework, built upon the integration of model projections, adaptive monitoring and systems control, has three primary functions. Its advantages and limitations are discussed with application examples in Cincinnati (Ohio, USA) and Las Vegas (Nevada, USA). At a conceptual level, an integrated land use and hydrological model

  14. Identifying quality-markers from Shengmai San protects against transgenic mouse model of Alzheimer's disease using chinmedomics approach.

    PubMed

    Zhang, Ai-Hua; Yu, Jing-Bo; Sun, Hui; Kong, Ling; Wang, Xiang-Qian; Zhang, Qing-Yu; Wang, Xi-Jun

    2018-04-03

    Shengmai San (SMS), a classic Chinese herbal formula, has been widely used for the treatment of Qi-Yin deficiency syndrome in Asia. Modern pharmacological studies have shown that SMS improves cognitive function; however, the quality markers (Q-markers) of SMS still need further research. We used a chinmedomics strategy to systematically evaluate the efficacy of SMS in the treatment of the APPswe/PS1dE9 (APP/PS1) transgenic mouse model of Alzheimer's disease (AD) and to discover the efficacy-related Q-markers. The effect of SMS on APP/PS1 mice was evaluated by behavioral tests, immunohistochemistry and urine metabolic profiling, and the urine marker metabolites associated with SMS treatment of AD were characterized using a metabolomics method. On the premise of efficacy, serum pharmacochemistry of traditional Chinese medicine was applied to investigate the in vivo constituents of SMS. A correlation analysis between the marker metabolites of therapeutic effect and the serum constituents was completed using the chinmedomics approach. SMS had a therapeutic effect on APP/PS1 mice, and 34 potential urine biomarkers were reversed by SMS treatment. A total of 17 in vivo constituents were detected, including 14 prototype components and 3 metabolites. The correlation analysis showed that eight constituents were strongly correlated with the protective effects of SMS in AD and are considered potential Q-markers of SMS: schisandrin, isoschisandrin, angeloylgomisin Q, gomisin D, angeloylgomisin H, gomisin M2, ginsenoside F1, and 20(R)-ginsenoside Rg3. This study demonstrates that chinmedomics is a novel strategy for discovering the potential effective constituents of a herbal formula, which are recognized as Q-markers. Copyright © 2018 Elsevier GmbH. All rights reserved.

  15. Differences in aquatic habitat quality as an impact of one- and two-dimensional hydrodynamic model simulated flow variables

    NASA Astrophysics Data System (ADS)

    Benjankar, R. M.; Sohrabi, M.; Tonina, D.; McKean, J. A.

    2013-12-01

    Aquatic habitat models use flow variables predicted with one-dimensional (1D) or two-dimensional (2D) hydrodynamic models to simulate aquatic habitat quality. Studies focusing on the effects of hydrodynamic model dimensionality on predicted aquatic habitat quality are limited. Here we analyse the impact of flow variables predicted with 1D and 2D hydrodynamic models on the simulated spatial distribution of habitat quality and on Weighted Usable Area (WUA) for fall-spawning Chinook salmon. Our study focuses on three river systems located in central Idaho (USA): a straight, pool-riffle reach (South Fork Boise River), small sinuous pool-riffle streams in a large meadow (Bear Valley Creek), and a steep, confined plane-bed stream with occasional deep forced pools (Deadwood River). We consider low and high flows in simple and complex morphologic reaches. Results show that the choice between 1D and 2D modeling affects both the spatial distribution of habitat and WUA for both discharge scenarios, but we did not find noticeable differences between complex and simple reaches. In general, the differences in WUA were small but depended on stream type. Nevertheless, differences in spatially distributed habitat quality are considerable in all streams. The steep, confined plane-bed stream showed larger differences between habitat quality defined with 1D and 2D flow models than the streams with well-defined macro-topographies, such as pool-riffle bed forms. KEY WORDS: one- and two-dimensional hydrodynamic models, habitat modeling, weighted usable area (WUA), hydraulic habitat suitability, high and low discharges, simple and complex reaches
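
    To make the WUA comparison concrete, here is a minimal sketch of how a weighted usable area is typically computed from cell areas and suitability curves. The curves, the geometric-mean aggregation and the 1D versus 2D depth fields are illustrative assumptions, not the study's actual habitat model.

    ```python
    import numpy as np

    def suitability(value, curve_x, curve_y):
        """Interpolate a habitat suitability index (0-1) from a suitability curve."""
        return np.interp(value, curve_x, curve_y)

    def weighted_usable_area(cell_area, depth, velocity, depth_curve, vel_curve):
        """WUA = sum over cells of area * composite suitability (geometric mean here)."""
        si_d = suitability(depth, *depth_curve)
        si_v = suitability(velocity, *vel_curve)
        csi = np.sqrt(si_d * si_v)          # one common aggregation choice (an assumption)
        return np.sum(cell_area * csi)

    # Hypothetical suitability curves and model output for a handful of cells.
    depth_curve = ([0.0, 0.2, 0.5, 1.5, 3.0], [0.0, 0.4, 1.0, 1.0, 0.2])
    vel_curve = ([0.0, 0.1, 0.4, 1.0, 2.0], [0.2, 0.8, 1.0, 0.5, 0.0])
    area = np.array([4.0, 4.0, 4.0, 4.0])          # m^2 per cell
    depth_1d = np.array([0.6, 0.6, 0.6, 0.6])      # a 1D model gives one depth per cross-section
    depth_2d = np.array([0.3, 0.8, 0.9, 0.4])      # a 2D model resolves lateral variation
    vel = np.array([0.5, 0.6, 0.7, 0.3])
    print(weighted_usable_area(area, depth_1d, vel, depth_curve, vel_curve))
    print(weighted_usable_area(area, depth_2d, vel, depth_curve, vel_curve))
    ```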

  16. Quality assessment of protein model-structures using evolutionary conservation.

    PubMed

    Kalman, Matan; Ben-Tal, Nir

    2010-05-15

    Programs that evaluate the quality of a protein structural model are important both for validating the structure determination procedure and for guiding the model-building process. Such programs are based on properties of native structures that are generally not expected in faulty models. One such property, which is rarely used for automatic structure quality assessment, is the tendency for conserved residues to be located at the structural core and for variable residues to be located at the surface. We present ConQuass, a novel quality assessment program based on the consistency between the model structure and the protein's conservation pattern. We show that it can identify problematic structural models, and that the scores it assigns to the server models in CASP8 correlate with the similarity of the models to the native structure. We also show that when the conservation information is reliable, the method's performance is comparable and complementary to that of the other single-structure quality assessment methods that participated in CASP8 and that do not use additional structural information from homologs. A Perl implementation of the method, as well as the various Perl and R scripts used for the analysis, are available at http://bental.tau.ac.il/ConQuass/. Contact: nirb@tauex.tau.ac.il. Supplementary data are available at Bioinformatics online.

  17. Comparison of different uncertainty techniques in urban stormwater quantity and quality modelling.

    PubMed

    Dotto, Cintia B S; Mannina, Giorgio; Kleidorfer, Manfred; Vezzaro, Luca; Henrichs, Malte; McCarthy, David T; Freni, Gabriele; Rauch, Wolfgang; Deletic, Ana

    2012-05-15

    Urban drainage models are important tools used by both practitioners and scientists in the field of stormwater management. These models are often conceptual and usually require calibration using local datasets. Quantification of the uncertainty associated with the models is essential, although it is rarely practiced. The International Working Group on Data and Models, which works under the IWA/IAHR Joint Committee on Urban Drainage, has been developing a framework for defining and assessing uncertainties in the field of urban drainage modelling. Part of that work is the assessment and comparison of different techniques generally used in the uncertainty assessment of the parameters of water models. This paper compares a number of these techniques: the Generalized Likelihood Uncertainty Estimation (GLUE), the Shuffled Complex Evolution Metropolis algorithm (SCEM-UA), an approach based on multi-objective auto-calibration (AMALGAM, a multialgorithm, genetically adaptive multi-objective method) and a Bayesian approach based on a simplified Markov Chain Monte Carlo method (implemented in the software MICA). To allow a meaningful comparison among the different uncertainty techniques, common criteria were set for the likelihood formulation, the number of simulations, and the measure of uncertainty bounds. Moreover, all the uncertainty techniques were applied to the same case study, in which the same stormwater quantity and quality model was used alongside the same dataset. The comparison results for a well-posed rainfall/runoff model showed that the four methods provide similar probability distributions of model parameters and similar model prediction intervals. For the ill-posed water quality model, the differences between the results were much wider, and the paper discusses the specific advantages and disadvantages of each method. In relation to computational efficiency (i.e. the number of iterations required to generate the probability
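
    The GLUE procedure compared above can be summarised in a few lines of Python: sample parameters from a prior, keep "behavioural" sets whose likelihood (here a Nash-Sutcliffe efficiency) exceeds a threshold, and derive likelihood-weighted prediction bounds. The toy linear-reservoir model, the threshold and the prior range are assumptions for illustration only.

    ```python
    import numpy as np

    def nash_sutcliffe(obs, sim):
        return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    def glue(model, obs, prior_sampler, n_runs=5000, threshold=0.5):
        """Keep behavioural parameter sets (NSE > threshold) and return
        likelihood-weighted 5th/95th percentile prediction bounds per time step."""
        sims, weights = [], []
        for _ in range(n_runs):
            sim = model(prior_sampler())
            ns = nash_sutcliffe(obs, sim)
            if ns > threshold:
                sims.append(sim)
                weights.append(ns)
        sims, weights = np.array(sims), np.array(weights)
        weights = weights / weights.sum()
        bounds = []
        for t in range(sims.shape[1]):
            order = np.argsort(sims[:, t])
            s, w = sims[order, t], np.cumsum(weights[order])
            bounds.append((np.interp(0.05, w, s), np.interp(0.95, w, s)))
        return np.array(bounds)

    # Toy linear-reservoir "model" with one parameter k, driven by a fixed rainfall series.
    rain = np.array([0, 5, 10, 3, 0, 0, 8, 2, 0, 0], float)
    def reservoir(k, rain=rain):
        q, out = 0.0, []
        for r in rain:
            q = q + r - k * q
            out.append(k * q)
        return np.array(out)

    obs = reservoir(0.3) + np.random.default_rng(1).normal(scale=0.1, size=len(rain))
    print(glue(lambda k: reservoir(k), obs, lambda: np.random.uniform(0.05, 0.9)))
    ```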

  18. Comparative evaluation of urban storm water quality models

    NASA Astrophysics Data System (ADS)

    Vaze, J.; Chiew, Francis H. S.

    2003-10-01

    The estimation of urban storm water pollutant loads is required for the development of mitigation and management strategies to minimize impacts to receiving environments. Event pollutant loads are typically estimated using either regression equations or "process-based" water quality models. The relative merit of using regression models compared to process-based models is not clear. A modeling study is carried out here to evaluate the comparative ability of the regression equations and process-based water quality models to estimate event diffuse pollutant loads from impervious surfaces. The results indicate that, once calibrated, both the regression equations and the process-based model can estimate event pollutant loads satisfactorily. In fact, the loads estimated using the regression equation as a function of rainfall intensity and runoff rate are better than the loads estimated using the process-based model. Therefore, if only estimates of event loads are required, regression models should be used because they are simpler and require less data compared to process-based models.

  19. Modeling autism: a systems biology approach

    PubMed Central

    2012-01-01

    Autism is the fastest growing developmental disorder in the world today. The prevalence of autism in the US has risen from 1 in 2500 in 1970 to 1 in 88 children today. People with autism present with repetitive movements and with social and communication impairments that can range from mild to profound. The estimated total lifetime societal cost of caring for one individual with autism is $3.2 million US dollars. With the rapid growth in this disorder and the great expense of caring for those with autism, it is imperative for both individuals and society that techniques be developed to model and understand autism. There is increasing evidence that individuals diagnosed with autism present with a highly diverse set of abnormalities affecting multiple systems of the body. To date, little to no work has been done using a whole-body systems biology approach to model the characteristics of this disorder. Identification and modelling of these systems might lead to new and improved treatment protocols and to better diagnosis and treatment of the affected systems, which might improve quality of life by themselves and might also help the core symptoms of autism, given the potential interconnections between the brain and nervous system and all the other systems being modeled. This paper first reviews research showing that autism impacts many systems in the body, including the metabolic, mitochondrial, immunological, gastrointestinal and neurological systems. These systems interact in complex and highly interdependent ways, and many of these disturbances have effects in most of the systems of the body. In particular, clinical evidence exists for increased oxidative stress, inflammation, and immune and mitochondrial dysfunction, which can affect almost every cell in the body. Three promising research areas are discussed: hierarchical modeling, subgroup analysis, and modeling over time. This paper reviews some of the systems disturbed in autism and

  20. Optimum profit model considering production, quality and sale problem

    NASA Astrophysics Data System (ADS)

    Chen, Chung-Ho; Lu, Chih-Lun

    2011-12-01

    Chen and Liu ['Procurement Strategies in the Presence of the Spot Market-an Analytical Framework', Production Planning and Control, 18, 297-309] presented an optimum profit model between producers and purchasers for a supply chain system with a pure procurement policy. However, their model, which used a simple manufacturing cost, did not consider the customer's usage cost. In this study, a modified Chen and Liu model is developed to determine the optimum product and process parameters. The authors propose a modified Chen and Liu model under a two-stage screening procedure. A surrogate variable having a high correlation with the measurable quality characteristic is measured directly in the first stage; the measurable quality characteristic itself is measured in the second stage, when the product decision cannot be made in the first stage. The customer's usage cost is measured by adopting Taguchi's quadratic quality loss function. The purchaser's optimum order quantity, the producer's product price and the process quality level are jointly determined by maximising the expected profit between them.
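
    Taguchi's quadratic loss, used above to represent the customer's usage cost, has a simple expected-value form; the sketch below computes it under assumed cost, tolerance and process parameters that are not taken from the paper.

    ```python
    def taguchi_loss_coefficient(cost_at_limit, tolerance):
        """k = A / Delta^2: loss coefficient from the repair/scrap cost A incurred
        when the characteristic deviates by the tolerance Delta from target."""
        return cost_at_limit / tolerance ** 2

    def expected_quality_loss(k, mean, std, target):
        """E[L] = k * (sigma^2 + (mu - m)^2) for the quadratic loss L(y) = k (y - m)^2."""
        return k * (std ** 2 + (mean - target) ** 2)

    # Hypothetical numbers: $5 cost when the characteristic drifts 0.2 mm off target.
    k = taguchi_loss_coefficient(cost_at_limit=5.0, tolerance=0.2)
    print(expected_quality_loss(k, mean=10.05, std=0.06, target=10.0))
    ```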

  1. A comparison of different functions for predicted protein model quality assessment.

    PubMed

    Li, Juan; Fang, Huisheng

    2016-07-01

    In protein structure prediction, a considerable number of models are usually produced by either the template-based method (TBM) or ab initio prediction. The purpose of this study is to find the critical parameter for assessing the quality of the predicted models. A non-redundant template library was developed and 138 target sequences were modeled. The target sequences were all distant from the proteins in the template library and were aligned with template library proteins on the basis of the transformation matrix. The quality of each model was first assessed with QMEAN and its six parameters, which are C_β interaction energy (C_beta), all-atom pairwise energy (PE), solvation energy (SE), torsion angle energy (TAE), secondary structure agreement (SSA), and solvent accessibility agreement (SAE). Finally, the alignment score (score) was also used to assess the quality of each model. Hence, a total of eight parameters (i.e., QMEAN, C_beta, PE, SE, TAE, SSA, SAE, score) were used independently to assess the quality of each model. The results indicate that SSA is the best parameter for estimating the quality of a model.

  2. Evaluating the implementation of confusion assessment method-intensive care unit using a quality improvement approach.

    PubMed

    Stewart, C; Bench, S

    2018-05-15

    Quality improvement (QI) is a way through which health care delivery can be made safer and more effective. Various models of quality improvement methods exist in health care today. These models can help guide and manage the process of introducing changes into clinical practice. The aim of this project was to implement the use of a delirium assessment tool into three adult critical care units within the same hospital using a QI approach. The objective was to improve the identification and management of delirium. Using the Model for Improvement framework, a multidisciplinary working group was established. A delirium assessment tool was introduced via a series of educational initiatives. New local guidelines regarding the use of delirium assessment and management for the multidisciplinary team were also produced. Audit data were collected at 6 weeks and 5 months post-implementation to evaluate compliance with the use of the tool across three critical care units within a single hospital in London. At 6 weeks, in 134 assessment points out of a possible 202, the tool was deemed to be used appropriately, meaning that 60% of patients received timely assessment; 18% of patients were identified as delirious in audit one. Five months later, only 95 assessment points out of a possible 199 were being appropriately assessed (47%); however, a greater number (32%) were identified as delirious. This project emphasizes the complexity of changing practice in a large busy critical care centre. Despite an initial increase in delirium assessment, this was not sustained over time. The use of a QI model highlights the continuous process of embedding changes into clinical practice and the need to use a QI method that can address the challenging nature of modern health care. QI models guide changes in practice. Consideration should be given to the type of QI model used. © 2018 British Association of Critical Care Nurses.

  3. Evaluation of regional climate simulations for air quality modelling purposes

    NASA Astrophysics Data System (ADS)

    Menut, Laurent; Tripathi, Om P.; Colette, Augustin; Vautard, Robert; Flaounas, Emmanouil; Bessagnet, Bertrand

    2013-05-01

    In order to evaluate the future potential benefits of emission regulation on regional air quality while taking into account the effects of climate change, off-line air quality projection simulations are driven using weather forcing taken from regional climate models. These regional models are themselves driven by simulations carried out using global climate models (GCMs) and economic scenarios. Uncertainties and biases in climate models introduce an additional "climate modeling" source of uncertainty that adds to all other types of uncertainty in air quality modeling for policy evaluation. In this article we evaluate the changes in air quality-related weather variables induced by replacing reanalysis-forced regional climate simulations with GCM-forced ones. As an example we use GCM simulations carried out in the framework of the ERA-Interim programme and of the CMIP5 project using the Institut Pierre-Simon Laplace climate model (IPSLcm), driving regional simulations performed in the framework of the EURO-CORDEX programme. In summer, we found compensating deficiencies acting on photochemistry: an overestimation in GCM-driven weather due to a positive bias in short-wave radiation, a negative bias in wind speed, too many stagnant episodes, and a negative temperature bias. In winter, air quality is mostly driven by dispersion, and we could not identify significant differences in either wind or planetary boundary layer height statistics between GCM-driven and reanalysis-driven regional simulations. However, precipitation appears largely overestimated in GCM-driven simulations, which could significantly affect the simulation of aerosol concentrations. The identification of these biases will help in interpreting the results of future air quality simulations using these data. Despite these biases, we conclude that the identified differences should not lead to major difficulties in using GCM-driven regional climate simulations for air quality projections.

  4. Useful measures and models for analytical quality management in medical laboratories.

    PubMed

    Westgard, James O

    2016-02-01

    The 2014 Milan Conference "Defining analytical performance goals 15 years after the Stockholm Conference" initiated a new discussion of issues concerning goals for precision, trueness or bias, total analytical error (TAE), and measurement uncertainty (MU). Goal-setting models are critical for analytical quality management, along with error models, quality-assessment models, quality-planning models, and comprehensive models for quality management systems. There are also critical underlying issues, such as an emphasis on MU to the possible exclusion of TAE and a corresponding preference for separate precision and bias goals instead of a combined total error goal. This opinion recommends careful consideration of the differences between the concepts of accuracy and traceability and of the appropriateness of different measures, particularly TAE as a measure of accuracy and MU as a measure of traceability. TAE is essential for managing quality within a medical laboratory, while MU and trueness are essential for achieving comparability of results across laboratories. With this perspective, laboratory scientists can better understand the many measures and models needed for analytical quality management and assess their usefulness for practical applications in medical laboratories.
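
    Two measures often used alongside this discussion are a total-analytical-error estimate and the sigma metric for quality planning. The sketch below shows their common textbook forms (TAE roughly |bias| + z*CV, Sigma = (TEa - |bias|)/CV); the z-multiplier, allowable total error and bias/CV numbers are assumptions, and the article itself does not prescribe these exact formulas.

    ```python
    def total_analytical_error(bias_pct, cv_pct, z=1.65):
        """TAE estimate as |bias| + z * CV (z = 1.65 gives a one-sided 95% bound)."""
        return abs(bias_pct) + z * cv_pct

    def sigma_metric(allowable_total_error_pct, bias_pct, cv_pct):
        """Sigma = (TEa - |bias|) / CV, a common quality-planning figure of merit."""
        return (allowable_total_error_pct - abs(bias_pct)) / cv_pct

    # Hypothetical assay: allowable total error 10%, observed bias 2%, imprecision (CV) 2%.
    print(total_analytical_error(2.0, 2.0))   # 5.3 %
    print(sigma_metric(10.0, 2.0, 2.0))       # 4 sigma
    ```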

  5. Investigating the impact of mindfulness meditation training on working memory: a mathematical modeling approach.

    PubMed

    van Vugt, Marieke K; Jha, Amishi P

    2011-09-01

    We investigated whether mindfulness training (MT) influences information processing in a working memory task with complex visual stimuli. Participants were tested before (T1) and after (T2) participation in an intensive one-month MT retreat, and their performance was compared with that of an age- and education-matched control group. Accuracy did not differ across groups at either time point. Response times were faster and significantly less variable in the MT versus the control group at T2. Since these results could be due to changes in mnemonic processes, speed-accuracy trade-off, or nondecisional factors (e.g., motor execution), we used a mathematical modeling approach to disentangle these factors. The EZ-diffusion model (Wagenmakers, van der Maas, & Grasman, Psychonomic Bulletin & Review, 14(1), 3-22, 2007) suggested that MT leads to improved information quality and reduced response conservativeness, with no changes in nondecisional factors. The noisy exemplar model further suggested that the increase in information quality reflected a decrease in encoding noise and not an increase in forgetting. Thus, mathematical modeling may help clarify the mechanisms by which MT produces salutary effects on performance.
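
    The EZ-diffusion model referenced above has closed-form expressions for drift rate, boundary separation and non-decision time; the sketch below implements those standard equations. The input summary statistics are illustrative values, not data from the mindfulness study.

    ```python
    import numpy as np

    def ez_diffusion(prop_correct, rt_variance, rt_mean, s=0.1):
        """Closed-form EZ-diffusion estimates (Wagenmakers et al., 2007):
        drift rate v, boundary separation a, non-decision time Ter.
        Inputs: proportion correct, variance and mean of correct RTs (seconds)."""
        if prop_correct in (0.0, 0.5, 1.0):
            raise ValueError("apply an edge correction before using the closed form")
        L = np.log(prop_correct / (1 - prop_correct))            # logit of accuracy
        x = L * (L * prop_correct**2 - L * prop_correct + prop_correct - 0.5) / rt_variance
        v = np.sign(prop_correct - 0.5) * s * x**0.25            # drift rate (information quality)
        a = s**2 * L / v                                         # boundary (response conservativeness)
        y = -v * a / s**2
        mdt = (a / (2 * v)) * (1 - np.exp(y)) / (1 + np.exp(y))  # mean decision time
        ter = rt_mean - mdt                                      # non-decision component
        return v, a, ter

    # Illustrative summary statistics (not from the study): 80% correct, RT variance 0.112 s^2, mean RT 0.723 s.
    print(ez_diffusion(prop_correct=0.80, rt_variance=0.112, rt_mean=0.723))
    ```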

  6. THE EMERGENCE OF NUMERICAL AIR QUALITY FORECASTING MODELS AND THEIR APPLICATION

    EPA Science Inventory

    In recent years the U.S. and other nations have begun programs for short-term local through regional air quality forecasting based upon numerical three-dimensional air quality grid models. These numerical air quality forecast (NAQF) models and systems have been developed and test...

  7. THE EMERGENCE OF NUMERICAL AIR QUALITY FORECASTING MODELS AND THEIR APPLICATIONS

    EPA Science Inventory

    In recent years the U.S. and other nations have begun programs for short-term local through regional air quality forecasting based upon numerical three-dimensional air quality grid models. These numerical air quality forecast (NAQF) models and systems have been developed and test...

  8. Affinity comparison of different THCA synthase to CBGA using modeling computational approaches.

    PubMed

    Alaoui, Moulay Abdelaziz El; Ibrahimi, Azeddine; Semlali, Oussama; Tarhda, Zineb; Marouane, Melloul; Najwa, Alaoui; Soulaymani, Abdelmajid; Fahime, Elmostafa El

    2014-01-01

    Δ9-Tetrahydrocannabinolic acid (THCA) is the acidic precursor of the primary psychoactive compound of Cannabis sativa. It is produced by THCA synthase, which catalyzes the oxidative cyclization of cannabigerolic acid (CBGA), the precursor of THCA. In this study, we were interested in the three-dimensional structure of the THCA synthase protein. Models were generated with MODELLER v9.11 by homology modeling against the Δ1-tetrahydrocannabinolic acid (THCA) synthase X-ray structure (PDB code 3VTE), on the basis of sequences retrieved from GenBank. Procheck, Errat, and Verify 3D were used to verify the reliability of the six 3D models obtained; the overall quality factor and the ProSA Z-score were also used to check the quality of the six modeled proteins. The RMSDs for C-alpha atoms, main-chain atoms, side-chain atoms and all atoms between the modeled structures and the corresponding template ranged from 0.290 Å to 1.252 Å, reflecting the good quality of the obtained models. Our study of CBGA-THCA synthase docking demonstrated that the active-site pocket was successfully recognized using a computational approach. The interaction energy of CBGA computed for 'fiber-type' proteins ranged between -4.195 kcal/mol and -5.95 kcal/mol, whereas for the 'drug type' it was about -7.02 kcal/mol to -7.16 kcal/mol, which may indicate the important role played by the interaction energy of CBGA in determining the THCA level in Cannabis sativa L. varieties. Finally, we propose an experimental design to explore the source of the ligand-enzyme binding energy in Cannabis sativa and the THCA production level, in the absence of any information regarding the correlation between enzyme affinity and THCA production level. This report opens the door to further studies predicting the binding-site pocket with accuracy from the perspective of protein affinity and the THCA level produced in Cannabis sativa.

  9. Improved protein model quality assessments by changing the target function.

    PubMed

    Uziela, Karolis; Menéndez Hurtado, David; Shu, Nanjiang; Wallner, Björn; Elofsson, Arne

    2018-06-01

    Assessment of protein model quality is an important part of protein structure prediction, and we have developed a set of methods for this problem over more than a decade. We have used various types of protein descriptions and different machine learning methodologies. However, common to all these methods has been the target function used for training. The target function in ProQ describes the local quality of a residue in a protein model, and in all versions of ProQ it has been the S-score. However, other quality estimation functions also exist, which can be divided into superposition- and contact-based methods. The superposition-based methods, such as the S-score, are based on a rigid-body superposition of a protein model and the native structure, while the contact-based methods compare the local environment of each residue. Here, we examine the effects of retraining our latest predictor, ProQ3D, using identical inputs but different target functions. We find that the contact-based measures are easier to predict and that predictors trained on them provide some advantages when it comes to identifying the best model, possibly because contact-based methods are better at estimating the quality of multi-domain targets. However, training on the S-score gives the best correlation with the GDT_TS score, which is commonly used in CASP to score global model quality. To take advantage of both features, we provide an updated version of ProQ3D that predicts local and global model quality estimates based on different quality measures. © 2018 Wiley Periodicals, Inc.
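
    For context, the S-score target function mentioned above is commonly written as S_i = 1/(1 + (d_i/d0)^2) for each residue after superposition onto the native structure; the sketch below computes it for hypothetical per-residue deviations, with d0 = 3 Å as an assumed threshold rather than the value used by the authors.

    ```python
    import numpy as np

    def s_score(distances, d0=3.0):
        """Per-residue S-score: S_i = 1 / (1 + (d_i / d0)^2), where d_i is the
        CA-CA distance between model and native after rigid superposition.
        d0 is a tunable threshold (3 A here is an assumption)."""
        d = np.asarray(distances, float)
        return 1.0 / (1.0 + (d / d0) ** 2)

    # Hypothetical per-residue deviations (Angstrom) for a 6-residue fragment.
    deviations = [0.4, 0.9, 1.5, 3.0, 6.0, 12.0]
    local = s_score(deviations)
    print("local S-scores:", np.round(local, 2))
    print("global quality (mean):", round(local.mean(), 3))
    ```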

  10. Identifying best-fitting inputs in health-economic model calibration: a Pareto frontier approach.

    PubMed

    Enns, Eva A; Cipriano, Lauren E; Simons, Cyrena T; Kong, Chung Yin

    2015-02-01

    To identify best-fitting input sets using model calibration, individual calibration target fits are often combined into a single goodness-of-fit (GOF) measure using a set of weights. Decisions in the calibration process, such as which weights to use, influence which sets of model inputs are identified as best-fitting, potentially leading to different health economic conclusions. We present an alternative approach to identifying best-fitting input sets based on the concept of Pareto-optimality. A set of model inputs is on the Pareto frontier if no other input set simultaneously fits all calibration targets as well or better. We demonstrate the Pareto frontier approach in the calibration of 2 models: a simple, illustrative Markov model and a previously published cost-effectiveness model of transcatheter aortic valve replacement (TAVR). For each model, we compare the input sets on the Pareto frontier to an equal number of best-fitting input sets according to 2 possible weighted-sum GOF scoring systems, and we compare the health economic conclusions arising from these different definitions of best-fitting. For the simple model, outcomes evaluated over the best-fitting input sets according to the 2 weighted-sum GOF schemes were virtually nonoverlapping on the cost-effectiveness plane and resulted in very different incremental cost-effectiveness ratios ($79,300 [95% CI 72,500-87,600] v. $139,700 [95% CI 79,900-182,800] per quality-adjusted life-year [QALY] gained). Input sets on the Pareto frontier spanned both regions ($79,000 [95% CI 64,900-156,200] per QALY gained). The TAVR model yielded similar results. Choices in generating a summary GOF score may result in different health economic conclusions. The Pareto frontier approach eliminates the need to make these choices by using an intuitive and transparent notion of optimality as the basis for identifying best-fitting input sets. © The Author(s) 2014.
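
    A minimal way to identify the Pareto frontier described above is to flag non-dominated parameter sets; the sketch below does this for a small matrix of hypothetical calibration-target errors and is not the authors' implementation.

    ```python
    import numpy as np

    def pareto_frontier(errors):
        """Return a boolean mask of non-dominated rows, where each row holds the
        fit error of one input set against each calibration target (lower = better)."""
        errors = np.asarray(errors, float)
        on_front = np.ones(len(errors), dtype=bool)
        for i in range(len(errors)):
            # i is dominated if some set is no worse on every target and strictly better on one
            dominated = np.all(errors <= errors[i], axis=1) & np.any(errors < errors[i], axis=1)
            on_front[i] = not dominated.any()
        return on_front

    # Hypothetical fits of five candidate input sets against two calibration targets.
    errs = np.array([[0.10, 0.40],
                     [0.20, 0.20],
                     [0.35, 0.15],
                     [0.30, 0.30],    # dominated by [0.20, 0.20]
                     [0.12, 0.45]])   # dominated by [0.10, 0.40]
    print(pareto_frontier(errs))      # [ True  True  True False False]
    ```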

  11. Assessment and prediction of air quality using fuzzy logic and autoregressive models

    NASA Astrophysics Data System (ADS)

    Carbajal-Hernández, José Juan; Sánchez-Fernández, Luis P.; Carrasco-Ochoa, Jesús A.; Martínez-Trinidad, José Fco.

    2012-12-01

    In recent years, artificial intelligence methods have been used for the treatment of environmental problems. This work presents two models for the assessment and prediction of air quality. First, we develop a new computational model for air quality assessment in order to evaluate toxic compounds that can harm sensitive people in urban areas, affecting their normal activities. In this model we propose a Sigma operator to statistically assess air quality parameters using their historical data and to determine their negative impact on air quality based on toxicity limits, frequency averages and deviations of toxicological tests. We also introduce a fuzzy inference system that classifies the parameters through a reasoning process and integrates them into an air quality index describing pollution levels in five stages: excellent, good, regular, bad and danger. The second model predicts air quality concentrations using an autoregressive model, providing a predicted air quality index based on the fuzzy inference system previously developed. Using data from the Mexico City Atmospheric Monitoring System, we compare the proposed indices with air quality indices developed by environmental agencies and with similar models. Our results show that the models are an appropriate tool for assessing site pollution and for guiding contingency actions in urban areas.

  12. Quality Assurance and Its Impact from Higher Education Institutions' Perspectives: Methodological Approaches, Experiences and Expectations

    ERIC Educational Resources Information Center

    Bejan, Stelian Andrei; Janatuinen, Tero; Jurvelin, Jouni; Klöpping, Susanne; Malinen, Heikki; Minke, Bernhard; Vacareanu, Radu

    2015-01-01

    This paper reports on methodological approaches, experiences and expectations referring to impact analysis of quality assurance from the perspective of three higher education institutions (students, teaching staff, quality managers) from Germany, Finland and Romania. The presentations of the three sample institutions focus on discussing the core…

  13. AN APPROACH FOR EVALUATING THE EFFECTIVENESS OF VARIOUS OZONE AIR QUALITY STANDARDS FOR PROTECTING TREES

    EPA Science Inventory

    We demonstrate an approach for evaluating the level of protection attained using a variety of forms and levels of past, current, and proposed Air Quality Standards (AQSs). The U.S. Clean Air Act requires the establishment of ambient air quality standards to protect health and pub...

  14. A continental-scale hydrology and water quality model for Europe: Calibration and uncertainty of a high-resolution large-scale SWAT model

    NASA Astrophysics Data System (ADS)

    Abbaspour, K. C.; Rouholahnejad, E.; Vaghefi, S.; Srinivasan, R.; Yang, H.; Kløve, B.

    2015-05-01

    A combination of driving forces is increasing pressure on local, national, and regional water supplies needed for irrigation, energy production, industrial uses, domestic purposes, and the environment. In many parts of Europe, groundwater quantity, and in particular quality, has come under severe degradation, and water levels have decreased, resulting in negative environmental impacts. Rapid improvements in the economies of the eastern European bloc of countries and uncertainties with regard to freshwater availability create challenges for water managers. At the same time, climate change adds a new level of uncertainty with regard to freshwater supplies. In this research we build and calibrate an integrated hydrological model of Europe using the Soil and Water Assessment Tool (SWAT) program. Different components of the water resources are simulated, and crop yield and water quality are considered at the Hydrological Response Unit (HRU) level. The water resources are quantified at the subbasin level with monthly time intervals. Leaching of nitrate into groundwater is also simulated at a finer spatial level (HRU). The use of large-scale, high-resolution water resources models enables consistent and comprehensive examination of integrated system behavior through physically based, data-driven simulation. In this article we discuss issues of data availability and calibration of large-scale distributed models, and we outline procedures for model calibration and uncertainty analysis. The calibrated model and results provide information to support the European Water Framework Directive and lay the basis for further assessment of the impact of climate change on water availability and quality. The approach and methods developed are general and can be applied to any large region around the world.

  15. Quality assurance of weather data for agricultural system model input

    USDA-ARS?s Scientific Manuscript database

    It is well known that crop production and hydrologic variation on watersheds are weather related. Rarely, however, are meteorological data quality checks reported for agricultural systems model research. We present quality assurance procedures for agricultural system model weather data input. Problems...

  16. Space shuttle flying qualities and criteria assessment

    NASA Technical Reports Server (NTRS)

    Myers, T. T.; Johnston, D. E.; Mcruer, Duane T.

    1987-01-01

    Work accomplished under a series of study tasks for the Flying Qualities and Flight Control Systems Design Criteria Experiment (OFQ) of the Shuttle Orbiter Experiments Program (OEX) is summarized. The tasks involved review of the applicability of existing flying quality and flight control system specifications and criteria to the Shuttle; identification of potentially crucial flying quality deficiencies; dynamic modeling of the Shuttle Orbiter pilot/vehicle system in the terminal flight phases; devising a nonintrusive experimental program for extraction and identification of vehicle dynamics, pilot control strategy, and approach and landing performance metrics; and preparation of an OEX approach to produce a data archive and optimize use of the data to develop flying quality criteria for future space shuttle craft in general. The work covered analytic modeling of the Orbiter's unconventional closed-loop dynamics in landing; modeling of pilot control strategies; verification of vehicle dynamics and pilot control strategy from flight data; review of various existing or proposed aircraft flying quality parameters and criteria in comparison with the unique dynamic characteristics and control aspects of the Shuttle in landing; and, finally, a summary of conclusions and recommendations for developing flying quality criteria and design guides for future Shuttle craft.

  17. MODELED MESOSCALE METEOROLOGICAL FIELDS WITH FOUR-DIMENSIONAL DATA ASSIMILATION IN REGIONAL SCALE AIR QUALITY MODELS

    EPA Science Inventory

    This paper addresses the need to increase the temporal and spatial resolution of meteorological data currently used in air quality simulation models (AQSMs). Transport and diffusion parameters, including mixing heights and stability, used in regulatory air quality dispersion models a...

  18. Authoring and verification of clinical guidelines: a model driven approach.

    PubMed

    Pérez, Beatriz; Porres, Ivan

    2010-08-01

    The goal of this research is to provide a framework to enable the authoring and verification of clinical guidelines. The framework is part of a larger research project aimed at improving the representation, quality and application of clinical guidelines in daily clinical practice. The verification process of a guideline is based on (1) model checking techniques to verify guidelines against semantic errors and inconsistencies in their definition, combined with (2) Model Driven Development (MDD) techniques, which enable us to automatically process manually created guideline specifications and the temporal-logic statements to be checked against these specifications, making the verification process faster and more cost-effective. In particular, we use UML statecharts to represent the dynamics of guidelines and, based on these manually defined guideline specifications, we use an MDD-based tool chain to automatically generate the input model of a model checker. The model checker takes the resulting model together with the specific guideline requirements and verifies whether the guideline fulfils such properties. The overall framework has been implemented as an Eclipse plug-in named GBDSSGenerator which, starting from the UML statechart representing a guideline, allows the verification of the guideline against specific requirements. Additionally, we have established a pattern-based approach for defining commonly occurring types of requirements in guidelines. We have successfully validated our overall approach by verifying properties in different clinical guidelines, resulting in the detection of some inconsistencies in their definition. The proposed framework allows (1) the authoring and (2) the verification of clinical guidelines against specific requirements defined on the basis of a set of property specification patterns, enabling non-experts to easily write formal specifications and thus easing the verification process. Copyright 2010 Elsevier Inc

  19. The Quality Adaptation Model: Adaptation and Adoption of the Quality Standard ISO/IEC 19796-1 for Learning, Education, and Training

    ERIC Educational Resources Information Center

    Pawlowski, Jan M.

    2007-01-01

    In 2005, the new quality standard for learning, education, and training, ISO/IEC 19796-1, was published. Its purpose is to help educational organizations to develop quality systems and to improve the quality of their processes, products, and services. In this article, the standard is presented and compared to existing approaches, showing the…

  20. ROLE OF MODELS IN AIR QUALITY MANAGEMENT DECISIONS

    EPA Science Inventory

    Within the frame of the US-India bilateral agreement on environmental cooperation, a team of US scientists have been helping India in designing emission control policies to address urban air quality problems. This presentation discusses how air quality models need to be used for ...

  1. ENHANCED STREAM WATER QUALITY MODEL (QUAL2EU)

    EPA Science Inventory

    The enhanced stream water quality models QUAL2E and QUAL2E-UNCAS (37) permit simulation of several water quality constituents in a branching stream system using a finite-difference solution to the one-dimensional advective-dispersive mass transport and reaction equation. The con...
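
    The one-dimensional advective-dispersive mass transport and reaction equation mentioned above can be illustrated with a very simple explicit finite-difference scheme. The sketch below (first-order upwind advection, central dispersion, first-order decay) uses invented reach properties and is a didactic stand-in for, not a reproduction of, the QUAL2E solution.

    ```python
    import numpy as np

    def adr_step(c, u, D, k, dx, dt):
        """One explicit step of dC/dt = -u dC/dx + D d2C/dx2 - k C
        (upwind advection assuming u > 0, central dispersion, first-order decay)."""
        adv = -u * (c[1:-1] - c[:-2]) / dx
        disp = D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2
        new = c.copy()
        new[1:-1] = c[1:-1] + dt * (adv + disp - k * c[1:-1])
        new[0], new[-1] = c[0], new[-2]          # fixed upstream value, zero-gradient outflow
        return new

    # Hypothetical reach: 10 km discretised with 100 m spacing (101 nodes),
    # u = 0.3 m/s, D = 5 m^2/s, decay k = 1e-5 1/s.
    dx, u, D, k = 100.0, 0.3, 5.0, 1e-5
    dt = 0.5 * min(dx / u, dx**2 / (2 * D))      # keep the explicit scheme stable
    c = np.zeros(101)
    c[0] = 10.0                                  # constant upstream concentration (mg/L)
    for _ in range(2000):
        c = adr_step(c, u, D, k, dx, dt)
    print(c[::10].round(2))
    ```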

  2. Remote Sensing and Spatial Growth Modeling Coupled With Air Quality Modeling to Assess the Impact of Atlanta, Georgia on the Local and Regional Environment

    NASA Astrophysics Data System (ADS)

    Quattrochi, D. A.; Estes, M. G.; Crosson, W. L.; Johnson, H.; Khan, M.

    2006-05-01

    compared with USGS 1km land use/land cover data that have traditionally been used in modeling. Air quality prediction for future scenarios to 2030 is being facilitated by land use projections using a spatial growth model. Land use projections were developed using the 2030 Regional Transportation Plan developed by the Atlanta Regional Commission, the regional planning agency for the area. This allows the Georgia Environmental Protection Division to evaluate how these transportation plans will affect future air quality. The coupled SGM and air quality modeling approach provides insight on what the impacts of Atlanta's growth will be on the local and regional environment and exists as a mechanism that can be used by policy makers to make rational decisions on urban growth and sustainability for the metropolitan area in the future.

  3. [Autism: educational models for a quality life].

    PubMed

    Tamarit, J

    2005-01-15

    Our aim is to describe the change that is taking place in the field of education in developmental disabilities from models centred on the clinical symptoms and on the limitations in the adaptive skills to models that focus on valuable personal results in terms of quality of life. In order to understand these changes, we outline some of the key points that have given rise to a particular cultural construction of disability and we also discuss how the situation is changing towards models aimed at achieving important personal results. In autism, as in the other developmental disorders, special emphasis has traditionally been placed on an education focusing on symptoms and on skills, and, although things are now beginning to head in that direction, little attention has been given to education based on the person and his or her quality of life. These changes imply new roles for the professionals attending these people. These roles involve combining technique with empathy and ethics, and they are more firmly based on the active role of individuals with autism, together with their rights, interests and opinions. Models of intervention must pay special attention to the pursuit of valuable personal results, which are oriented towards living a quality life and must involve the active participation of the individuals themselves as well as their relatives.

  4. Physiologically-based pharmacokinetic models: approaches for enabling personalized medicine.

    PubMed

    Hartmanshenn, Clara; Scherholz, Megerle; Androulakis, Ioannis P

    2016-10-01

    Personalized medicine strives to deliver the 'right drug at the right dose' by considering inter-person variability, one of the causes for therapeutic failure in specialized populations of patients. Physiologically-based pharmacokinetic (PBPK) modeling is a key tool in the advancement of personalized medicine to evaluate complex clinical scenarios, making use of physiological information as well as physicochemical data to simulate various physiological states to predict the distribution of pharmacokinetic responses. The increased dependency on PBPK models to address regulatory questions is aligned with the ability of PBPK models to minimize ethical and technical difficulties associated with pharmacokinetic and toxicology experiments for special patient populations. Subpopulation modeling can be achieved through an iterative and integrative approach using an adopt, adapt, develop, assess, amend, and deliver methodology. PBPK modeling has two valuable applications in personalized medicine: (1) determining the importance of certain subpopulations within a distribution of pharmacokinetic responses for a given drug formulation and (2) establishing the formulation design space needed to attain a targeted drug plasma concentration profile. This review article focuses on model development for physiological differences associated with sex (male vs. female), age (pediatric vs. young adults vs. elderly), disease state (healthy vs. unhealthy), and temporal variation (influence of biological rhythms), connecting them to drug product formulation development within the quality by design framework. Although PBPK modeling has come a long way, there is still a lengthy road before it can be fully accepted by pharmacologists, clinicians, and the broader industry.

  5. Identifying approaches for assessing methodological and reporting quality of systematic reviews: a descriptive study.

    PubMed

    Pussegoda, Kusala; Turner, Lucy; Garritty, Chantelle; Mayhew, Alain; Skidmore, Becky; Stevens, Adrienne; Boutron, Isabelle; Sarkis-Onofre, Rafael; Bjerre, Lise M; Hróbjartsson, Asbjørn; Altman, Douglas G; Moher, David

    2017-06-19

    The methodological quality and completeness of reporting of systematic reviews (SRs) are fundamental to optimal implementation of evidence-based health care and the reduction of research waste. Methods exist to appraise SRs, yet little is known about how they are used in SRs or where potential gaps exist in research best-practice guidance materials. The aims of this study are to identify reports assessing the methodological quality (MQ) and/or reporting quality (RQ) of a cohort of SRs and to assess their number, general characteristics, and approaches to 'quality' assessment over time. The Cochrane Library, MEDLINE®, and EMBASE® were searched from January 1990 to October 16, 2014, for reports assessing MQ and/or RQ of SRs. Title, abstract, and full-text screening of all reports were conducted independently by two reviewers. Reports assessing the MQ and/or RQ of a cohort of ten or more SRs of interventions were included. All results are reported as frequencies and percentages of reports. Of 20,765 unique records retrieved, 1189 were assessed at the full-text level, of which 76 reports were included. Eight previously published approaches to assessing MQ, or reporting guidelines used as a proxy to assess RQ, were used in 80% (61/76) of identified reports. These comprised two reporting guidelines (PRISMA and QUOROM), five quality assessment tools (AMSTAR, R-AMSTAR, OQAQ, Mulrow, Sacks), and the GRADE criteria. A further 24% (18/76) of reports developed their own criteria. PRISMA, OQAQ, and AMSTAR were the most commonly used published tools to assess MQ or RQ. In conjunction with other approaches, published tools were used in 29% (22/76) of reports, with 36% (8/22) assessing adherence to both PRISMA and AMSTAR criteria and 26% (6/22) using QUOROM and OQAQ. The methods used to assess the quality of SRs are diverse, and none has become universally accepted. The most commonly used quality assessment tools are AMSTAR, OQAQ, and PRISMA. As new tools and guidelines are

  6. A data management approach to quality assurance using colorectal carcinoma reports from two institutions as a model.

    PubMed

    Sorge, John P; Harmon, C Reid; Sherman, Susan M; Baillie, E Eugene

    2005-07-01

    We used data management software to compare pathology report data concerning regional lymph node sampling for colorectal carcinoma from 2 institutions using different dissection methods. Data were retrieved from 2 disparate anatomic pathology information systems for all cases of colorectal carcinoma in 2003 involving the ascending and descending colon. Initial sorting of the data included overall lymph node recovery to assess differences between the dissection methods at the 2 institutions. Additional segregation of the data was used to challenge the application's capability of accurately addressing the complexity of the process. This software approach can be used to evaluate data from disparate computer systems, and we demonstrate how an automated function can enable institutions to compare internal pathologic assessment processes and the results of those comparisons. The use of this process has future implications for pathology quality assurance in other areas.
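
    As a hedged illustration of the kind of cross-institution comparison described, the sketch below pulls case-level lymph node counts into a common table and summarizes recovery by institution and anatomic site; the column names, values, and the two-sample test are placeholders, not the software or data used in the study.

```python
# Illustrative placeholder data and comparison; not the study's system or values.
import pandas as pd
from scipy import stats

# Hypothetical export: one row per colorectal carcinoma case from each institution.
cases = pd.DataFrame({
    "institution": ["A", "A", "A", "B", "B", "B"],
    "site": ["ascending", "descending", "ascending",
             "ascending", "descending", "descending"],
    "nodes_recovered": [14, 9, 17, 21, 12, 18],
})

# Overall lymph node recovery by institution (the initial sort in the study).
print(cases.groupby("institution")["nodes_recovered"].agg(["count", "mean", "median"]))

# Further segregation, e.g. by anatomic site, to probe the comparison in more detail.
print(cases.groupby(["institution", "site"])["nodes_recovered"].mean())

# A simple two-sample test of the difference in mean recovery (illustrative only).
a = cases.loc[cases.institution == "A", "nodes_recovered"]
b = cases.loc[cases.institution == "B", "nodes_recovered"]
print(stats.ttest_ind(a, b, equal_var=False))
```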

  7. The effects of RN staffing hours on nursing home quality: a two-stage model.

    PubMed

    Lee, Hyang Yuol; Blegen, Mary A; Harrington, Charlene

    2014-03-01

    Based on the structure-process-outcome approach, this study examined the association of registered nurse (RN) staffing hours with five quality indicators: two process measures (catheter use and antipsychotic drug use) and three outcome measures (pressure ulcers, urinary tract infections, and weight loss). We used data on resident assessments, RN staffing, organizational characteristics, and market factors to examine the quality of 195 nursing homes operating in a rural state of the United States (Colorado). Two-stage least squares regression models were used to address the endogenous relationship between RN staffing and the outcome-related quality indicators, and ordinary least squares regression was used for the process-related ones. The analysis focused on the relationship of RN staffing to nursing home quality indicators, controlling for organizational characteristics, resources, resident casemix, and market factors, with clustering to control for geographical differences. Higher RN hours were associated with fewer pressure ulcers, but RN hours were not related to the other quality indicators. The findings show the importance of understanding the role of nurse staffing in nursing home care, as well as the significance of associated contextual factors for nursing home quality, even in a small rural state. Copyright © 2013 Elsevier Ltd. All rights reserved.
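
    The two-stage least squares logic used for the outcome-related indicators can be sketched as follows, assuming RN staffing hours are endogenous; the variables, instruments, and simulated data are hypothetical, and the manual second stage is shown only to expose the mechanics (dedicated IV routines also correct the standard errors).

```python
# Minimal 2SLS sketch with hypothetical, simulated variables (not the study's data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 195                                    # number of nursing homes
z = rng.normal(size=(n, 2))                # hypothetical instruments (e.g. local RN wage, market factors)
x_exog = rng.normal(size=(n, 3))           # exogenous controls (ownership, size, casemix)
rn_hours = z @ np.array([0.6, -0.4]) + x_exog @ np.array([0.2, 0.1, 0.0]) + rng.normal(size=n)
pressure_ulcers = -0.5 * rn_hours + x_exog @ np.array([0.3, 0.0, 0.2]) + rng.normal(size=n)

# Stage 1: regress the endogenous regressor on instruments plus exogenous controls.
X1 = sm.add_constant(np.column_stack([z, x_exog]))
rn_hat = sm.OLS(rn_hours, X1).fit().fittedvalues

# Stage 2: regress the outcome on the fitted staffing values plus exogenous controls.
# Note: standard errors from this manual version need correction; IV routines handle that.
X2 = sm.add_constant(np.column_stack([rn_hat, x_exog]))
stage2 = sm.OLS(pressure_ulcers, X2).fit()
print("estimated effect of RN hours on pressure ulcers:", stage2.params[1])
```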

  8. Integrating microbial physiology and enzyme traits in the quality model

    NASA Astrophysics Data System (ADS)

    Sainte-Marie, Julien; Barrandon, Matthieu; Martin, Francis; Saint-André, Laurent; Derrien, Delphine

    2017-04-01

    Microbial activity plays an indisputable role in soil carbon storage, and there have been many calls to integrate microbial ecology into soil carbon (C) models. With regard to this challenge, a few trait-based microbial models of C dynamics have emerged during the past decade. They parameterize specific traits related to decomposer physiology (substrate use efficiency, growth and mortality rates...) and enzyme properties (enzyme production rate, catalytic properties of enzymes…). But these models are built on the premise that organic matter (OM) can be represented as one single entity or divided into a few pools, whereas organic matter exists as a continuum of many different compounds spanning from intact plant molecules to highly oxidised microbial metabolites. In addition, a given molecule may also exist in different forms, depending on its stage of polymerization or on its interactions with other organic compounds or mineral phases of the soil. Here we develop a general theoretical model relating the evolution of soil organic matter, as a continuum of progressively decomposing compounds, to decomposer activity and enzyme traits. The model is based on the notion of quality developed by Agren and Bosatta (1998), which is a measure of a molecule's accessibility to degradation. The model integrates three major processes: OM depolymerisation by enzyme action, OM assimilation and OM biotransformation. For any enzyme, the model reports the quality range where this enzyme selectively operates and how the initial quality distribution of the OM subset evolves into another distribution of qualities under the enzyme's action. The model also defines the quality range where the OM can be taken up and assimilated by microbes. It finally describes how the quality of the assimilated molecules is transformed into another quality distribution, corresponding to the decomposers' metabolite signature. Upon decomposer death, these metabolites return to the substrate. We explore here the how
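
    A highly simplified numerical sketch of the three processes named above (depolymerisation within an enzyme-specific quality window, uptake of the most accessible qualities, and return of metabolites with the decomposers' own quality signature) is given below; the quality axis, windows, and rate constants are invented for illustration and are not the authors' parameterization.

```python
# Crude discretized "quality continuum" sketch; all parameters are hypothetical.
import numpy as np

nq = 50
q = np.linspace(0.0, 1.0, nq)                 # quality axis (1 = most accessible)
om = np.exp(-((q - 0.3) / 0.1) ** 2)          # initial, plant-like OM distribution
om /= om.sum()

enzyme_window = (q > 0.2) & (q < 0.5)         # qualities this enzyme can attack
uptake_window = q > 0.6                       # qualities microbes can assimilate
cue = 0.4                                     # carbon use efficiency
metabolite_sig = np.exp(-((q - 0.7) / 0.05) ** 2)
metabolite_sig /= metabolite_sig.sum()        # quality signature of microbial products

for step in range(100):
    # (i) depolymerisation: attacked OM is shifted towards higher quality classes
    depoly = 0.05 * om * enzyme_window
    om = om - depoly + np.roll(depoly, 10)
    # (ii) assimilation: accessible OM is taken up; (1 - CUE) is respired as CO2
    uptake = 0.1 * om * uptake_window
    om -= uptake
    # (iii) biotransformation: assimilated C returns as metabolites upon decomposer death
    om += cue * uptake.sum() * metabolite_sig

print("remaining OM fraction:", om.sum())
```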

  9. The Total Quality Management Model, Department of Personnel, State of Colorado

    DTIC Science & Technology

    A panel of three members will present the Total Quality Management model recently designed for the Department of Personnel, State of Colorado. This model was selected to increase work quality and productivity of the Department and to exemplify Governor Romer’s commitment to quality work within state government.

  10. Healthcare service quality perception in Japan.

    PubMed

    Eleuch, Amira ep Koubaa

    2011-01-01

    This study aims to assess Japanese patients' healthcare service quality perceptions and to shed light on the most meaningful service features. It follows up a study published in IJHCQA Vol. 21 No. 7. Through a non-linear approach, the study relied on the scatter model to detect the importance of healthcare service features in forming overall quality judgments. Japanese patients perceive healthcare services through a linear compensatory process: features related to technical quality and staff behavior compensate for each other in determining service quality. A limitation of the study is the small sample size. Non-linear approaches could help researchers better understand patients' healthcare service quality perceptions. The study highlights the need for improvements that enhance technical quality and medical practices in Japanese healthcare settings. The study relies on a non-linear approach to assess patients' overall quality perceptions in order to enrich knowledge. Furthermore, the research was conducted in Japan, where healthcare marketing studies are scarce owing to cultural and language barriers. Japanese culture and healthcare system characteristics are used to explain and interpret the results.

  11. Limitations Of The Current State Space Modelling Approach In Multistage Machining Processes Due To Operation Variations

    NASA Astrophysics Data System (ADS)

    Abellán-Nebot, J. V.; Liu, J.; Romero, F.

    2009-11-01

    The State Space modelling approach has recently been proposed as an engineering-driven technique for part quality prediction in Multistage Machining Processes (MMP). Current State Space models incorporate fixture and datum variations in the multi-stage variation propagation, without explicitly considering common operation variations such as machine-tool thermal distortions, cutting-tool wear, cutting-tool deflections, etc. This paper shows the limitations of the current State Space model through an experimental case study in which the effects of spindle thermal expansion, cutting-tool flank wear and locator errors are introduced. The paper also discusses the extension of the current State Space model to include operation variations and its potential benefits.
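
    The state-space variation propagation form referred to here is commonly written as x_k = A_k x_{k-1} + B_k u_k + w_k, where x_k is the part deviation state after stage k and u_k collects the fixture/datum (and, in the proposed extension, operation-induced) error sources; the sketch below propagates such a model with hypothetical matrices and dimensions.

```python
# Illustrative stage-to-stage propagation of part deviations; matrices are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
n_stages, n_state, n_input = 3, 4, 2

A = [np.eye(n_state) + 0.05 * rng.normal(size=(n_state, n_state)) for _ in range(n_stages)]
B = [rng.normal(size=(n_state, n_input)) for _ in range(n_stages)]
u = [rng.normal(scale=0.01, size=n_input) for _ in range(n_stages)]   # e.g. locator errors

x = np.zeros(n_state)                                 # incoming stock assumed nominal
for k in range(n_stages):
    w = rng.normal(scale=0.001, size=n_state)         # unmodeled variation at stage k
    x = A[k] @ x + B[k] @ u[k] + w                    # x_k = A_k x_{k-1} + B_k u_k + w_k

print("final part deviation state:", x)
```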

  12. Seasonal rationalization of river water quality sampling locations: a comparative study of the modified Sanders and multivariate statistical approaches.

    PubMed

    Varekar, Vikas; Karmakar, Subhankar; Jha, Ramakar

    2016-02-01

    The design of surface water quality sampling locations is a crucial decision-making process for the rationalization of monitoring networks. The quantity, quality, and types of available data (watershed characteristics and water quality data) may affect the selection of an appropriate design methodology. The modified Sanders approach and multivariate statistical techniques [particularly factor analysis (FA)/principal component analysis (PCA)] are well-accepted and widely used techniques for the design of sampling locations. However, their performance may vary significantly with the quantity, quality, and types of available data. In this paper, an attempt has been made to evaluate the performance of these techniques while accounting for the effect of seasonal variation, under a situation of limited water quality data but extensive watershed characteristics information, as continuous and consistent river water quality data are usually difficult to obtain, whereas watershed information can be made available through the application of geospatial techniques. A case study of the Kali River, Western Uttar Pradesh, India, was selected for the analysis. Monitoring was carried out at 16 sampling locations. The discrete and diffuse pollution loads at different sampling sites were estimated and accounted for using the modified Sanders approach, whereas the monitored physical and chemical water quality parameters were used as inputs for FA/PCA. The optimum numbers of sampling locations designed for the monsoon and non-monsoon seasons were eight and seven with the modified Sanders approach, and eleven and nine with FA/PCA, respectively. Little variation in the number and locations of the designed sampling sites was obtained between the two techniques, which shows the stability of the results. A geospatial analysis was also carried out to check the significance of the designed sampling locations with respect to river basin characteristics and the land use of the study area. Both methods are equally efficient; however, modified Sanders
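
    One ingredient of the FA/PCA route can be sketched as follows: standardized water quality parameters at each location are projected onto principal components, and locations contributing little independent information become candidates for rationalization. The data below are random placeholders, not the Kali River measurements.

```python
# Illustrative PCA step for sampling-network rationalization; data are placeholders.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
n_sites, n_params = 16, 8                    # 16 sampling locations, 8 quality parameters
X = rng.normal(size=(n_sites, n_params))     # placeholder monitoring data

X_std = StandardScaler().fit_transform(X)
pca = PCA()
scores = pca.fit_transform(X_std)

explained = np.cumsum(pca.explained_variance_ratio_)
n_keep = int(np.searchsorted(explained, 0.90)) + 1   # components explaining ~90% of variance
print("components retained:", n_keep)

# Rank sites by how strongly they express the retained components.
site_strength = np.abs(scores[:, :n_keep]).sum(axis=1)
print("sites ranked by information content:", np.argsort(site_strength)[::-1])
```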

  13. Operation quality assessment model for video conference system

    NASA Astrophysics Data System (ADS)

    Du, Bangshi; Qi, Feng; Shao, Sujie; Wang, Ying; Li, Weijian

    2018-01-01

    Video conference systems have become an important support platform for smart grid operation and management, and their operation quality is of growing concern to grid enterprises. First, an evaluation indicator system covering network, business and operation-maintenance aspects was established on the basis of the video conference system's operation statistics. Then, an operation quality assessment model combining a genetic algorithm with a regularized BP neural network was proposed, which outputs the operation quality level of the system over a given time period and provides company managers with optimization advice. The simulation results show that the proposed evaluation model offers fast convergence and high prediction accuracy in contrast with a regularized BP neural network alone, and that its generalization ability is superior to LM-BP and Bayesian BP neural networks.
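
    A rough sketch of coupling a genetic algorithm with a regularized BP neural network is shown below, with the GA searching over the hidden-layer size and L2 penalty of an MLP and cross-validated error as the fitness; this is an illustrative stand-in, not the paper's exact model or indicator system.

```python
# Toy GA over MLP hyperparameters; data and genome encoding are hypothetical.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 6))                              # placeholder operation statistics
y = X @ rng.normal(size=6) + 0.1 * rng.normal(size=200)    # placeholder quality level

def fitness(genome):
    hidden, log_alpha = genome
    model = MLPRegressor(hidden_layer_sizes=(int(hidden),), alpha=10.0 ** log_alpha,
                         max_iter=500, random_state=0)
    return cross_val_score(model, X, y, cv=3, scoring="neg_mean_squared_error").mean()

# Tiny GA: truncation selection plus Gaussian mutation of the two genes.
pop = [(rng.integers(4, 40), rng.uniform(-5, -1)) for _ in range(8)]
for generation in range(5):
    parents = sorted(pop, key=fitness, reverse=True)[:4]
    children = [(max(2, int(h + rng.normal(scale=3))), a + rng.normal(scale=0.3))
                for h, a in parents]
    pop = parents + children

print("best hidden units / log10(alpha):", max(pop, key=fitness))
```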

  14. A software quality model and metrics for risk assessment

    NASA Technical Reports Server (NTRS)

    Hyatt, L.; Rosenberg, L.

    1996-01-01

    A software quality model and its associated attributes are defined and used as the basis for a discussion of risk. Specific quality goals and attributes are selected based on their importance to a software development project and their ability to be quantified. Risks that can be determined from the model's metrics are identified. A core set of metrics relating to the software development process and its products is defined. Measurements for each metric, and their usability and applicability, are discussed.
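
    The general shape of such a model, a set of weighted, quantifiable attributes rolled up into a risk indicator, can be illustrated as below; the attribute names, weights, and thresholds are invented and are not those of the NASA model.

```python
# Toy weighted quality/risk roll-up; attributes, weights, and thresholds are invented.
attributes = {
    # attribute: (weight, normalized metric value in [0, 1], where 1 = best)
    "requirements_stability": (0.25, 0.60),
    "code_complexity":        (0.25, 0.45),
    "test_coverage":          (0.30, 0.80),
    "defect_density":         (0.20, 0.70),
}

quality_score = sum(w * v for w, v in attributes.values())
risk_score = 1.0 - quality_score          # a low quality score implies high risk

for name, (w, v) in attributes.items():
    if v < 0.5:
        print(f"risk flag: {name} below threshold (measured {v:.2f})")
print(f"overall quality {quality_score:.2f}, risk {risk_score:.2f}")
```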

  15. Analysis of apple beverages treated with high-power ultrasound: a quality function deployment approach.

    PubMed

    Režek Jambrak, Anet; Šimunek, Marina; Grbeš, Franjo; Mandura, Ana; Djekic, Ilija

    2018-04-01

    The objective of this paper was to demonstrate the application of quality function deployment (QFD) in analysing the effects of high-power ultrasound on quality properties of apple juices and nectars. To develop a quality function deployment model, jointly with instrumental analysis of the treated samples, a field survey was performed to identify consumer preferences for quality characteristics of juices/nectars. Based on the field research, the three most important characteristics were 'taste' and 'aroma', with a relative absolute weight importance of 28.5%, followed by 'odour' (16.9%). The quality function deployment model showed that the top three 'quality scores' for apple juice were the treatments with amplitude 90 µm, 9 min treatment time and sample temperature 40 °C; 60 µm, 9 min, 60 °C; and 90 µm, 6 min, 40 °C. For nectars, the top three were the treatments 120 µm, 9 min, 20 °C; 60 µm, 9 min, 60 °C; and A2.16 60 µm, 9 min, 20 °C. This type of quality model enables a more complex measure across a large set of different quality parameters. Its simplicity should be understood as a practical advantage and, as such, this tool can be part of design quality when using novel preservation technologies. © 2017 Society of Chemical Industry.
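
    A QFD-style roll-up of treatment scores against consumer-derived importance weights might look like the sketch below; the per-treatment characteristic scores are invented placeholders, and only the survey weights quoted above are taken from the abstract.

```python
# Illustrative QFD-style scoring; treatment scores are invented placeholders.
import numpy as np

characteristics = ["taste", "aroma", "odour"]
weights = np.array([0.285, 0.285, 0.169])          # relative importance from the survey
weights = weights / weights.sum()                  # renormalize over this subset

treatments = {
    "90 um, 9 min, 40 C":  np.array([4.2, 4.0, 3.8]),   # hypothetical 1-5 panel scores
    "60 um, 9 min, 60 C":  np.array([4.0, 4.1, 3.9]),
    "120 um, 9 min, 20 C": np.array([3.6, 3.9, 4.1]),
}

for name, scores in treatments.items():
    print(f"{name:22s} quality score = {float(weights @ scores):.2f}")
```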

  16. A hybrid agent-based approach for modeling microbiological systems.

    PubMed

    Guo, Zaiyi; Sloot, Peter M A; Tay, Joc Cing

    2008-11-21

    Models for systems biology commonly adopt differential equation or agent-based modeling approaches for simulating the processes as a whole. Models based on differential equations presuppose phenomenological intracellular behavioral mechanisms, while models based on the multi-agent approach often use directly translated, and quantitatively less precise, if-then logical rule constructs. We propose an extendible systems model based on a hybrid agent-based approach where biological cells are modeled as individuals (agents) while molecules are represented by quantities. This hybridization in entity representation entails a combined modeling strategy with agent-based behavioral rules and differential equations, thereby balancing the requirements of extendible model granularity with computational tractability. We demonstrate the efficacy of this approach with models of chemotaxis involving an assay of 10³ cells and 1.2 × 10⁶ molecules. The model produces cell migration patterns that are comparable to laboratory observations.
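
    The hybrid representation can be sketched as follows: cells are discrete agents following a simple biased-random-walk rule up a chemoattractant gradient, while the molecules form a continuous field updated by a discretized diffusion-decay equation. Scales and parameters are arbitrary and far smaller than the assay reported.

```python
# Toy hybrid model: agent cells on a grid plus a continuous chemoattractant field.
import numpy as np

rng = np.random.default_rng(4)
n, n_cells, steps = 50, 100, 200
field = np.zeros((n, n))
cells = rng.integers(0, n, size=(n_cells, 2))    # agent positions on the grid

def diffuse(f, D=0.2, decay=0.01):
    # Explicit finite-difference diffusion with first-order decay (periodic boundaries).
    lap = (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
           np.roll(f, 1, 1) + np.roll(f, -1, 1) - 4 * f)
    return f + D * lap - decay * f

for _ in range(steps):
    field = diffuse(field)
    field[n // 2, n // 2] += 5.0                 # continuous secretion at a point source
    for i, (x, y) in enumerate(cells):
        # Agent rule: usually step towards the highest-concentration neighbour,
        # otherwise move randomly (a crude biased random walk).
        moves = [((x + 1) % n, y), ((x - 1) % n, y), (x, (y + 1) % n), (x, (y - 1) % n)]
        if rng.random() < 0.8:
            cells[i] = max(moves, key=lambda p: field[p])
        else:
            cells[i] = moves[rng.integers(4)]

dist = np.hypot(cells[:, 0] - n // 2, cells[:, 1] - n // 2)
print("mean distance to source after migration:", dist.mean())
```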

  17. Electrification Futures Study Modeling Approach | Energy Analysis | NREL

    Science.gov Websites

    Electrification Futures Study Modeling Approach. To quantitatively answer the research questions of the Electrification Futures Study, researchers will use multiple models, accounting for infrastructure inertia through stock turnover; the study also includes load modeling.

  18. A comparison of two coaching approaches to enhance implementation of a recovery-oriented service model.

    PubMed

    Deane, Frank P; Andresen, Retta; Crowe, Trevor P; Oades, Lindsay G; Ciarrochi, Joseph; Williams, Virginia

    2014-09-01

    Moving to recovery-oriented service provision in mental health may entail retraining existing staff, as well as training new staff. This represents a substantial burden on organisations, particularly since transfer of training into practice is often poor. Follow-up supervision and/or coaching have been found to improve the implementation and sustainment of new approaches. We compared the effect of two coaching conditions, skills-based and transformational coaching, on the implementation of a recovery-oriented model following training. Training followed by coaching led to significant sustained improvements in the quality of care planning in accordance with the new model over the 12-month study period. No interaction effect was observed between the two conditions. However, post hoc analyses suggest that transformational coaching warrants further exploration. The results support the provision of supervision in the form of coaching in the implementation of a recovery-oriented service model, and suggest the need to better elucidate the mechanisms within different coaching approaches that might contribute to improved care.

  19. Objective vocal quality in children using cochlear implants: a multiparameter approach.

    PubMed

    Baudonck, Nele; D'haeseleer, Evelien; Dhooge, Ingeborg; Van Lierde, Kristiane

    2011-11-01

    The purpose of this study was to determine the objective vocal quality of 36 prelingually deaf children using a cochlear implant (CI), with a mean age of 9 years. An additional purpose was to compare the objective vocal quality of these 36 CI users with that of 25 age-matched children with prelingual severe hearing loss using conventional hearing aids (HAs) and 25 normal-hearing (NH) children. This cross-sectional study used a multigroup posttest-only design. Objective vocal quality was measured by means of the dysphonia severity index (DSI). Moreover, perceptual voice assessment using the GRBASI scale was performed. The CI children had a DSI of +1.8, corresponding to a DSI% of 68%, indicating a borderline vocal quality situated 2% above the limit of normality. The voice was perceptually characterized by the presence of a very slight grade of hoarseness, roughness, strained phonation, and higher pitch and intensity levels. No significant objective vocal quality differences were measured between the voices of the CI children, HA users, and NH children. According to the results, one aspect of the vocal approach in children with CIs and HAs should focus on reducing the strained vocal characteristic and on the use of a lower pitch and intensity level. Copyright © 2011 The Voice Foundation. Published by Mosby, Inc. All rights reserved.
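
    For orientation, the DSI is commonly reported as a weighted combination of four voice measures (maximum phonation time, highest F0, softest intensity, and jitter), following Wuyts et al. (2000). The sketch below uses that frequently cited formulation with hypothetical child values chosen so the result lands near the +1.8 reported for the CI group; it should be read as an illustration, not as the study's exact computation.

```python
# Commonly cited DSI formulation (Wuyts et al., 2000); inputs below are hypothetical.
def dysphonia_severity_index(mpt_s, f0_high_hz, i_low_db, jitter_pct):
    """Maximum phonation time (s), highest F0 (Hz), softest intensity (dB), jitter (%)."""
    return 0.13 * mpt_s + 0.0053 * f0_high_hz - 0.26 * i_low_db - 1.18 * jitter_pct + 12.4

# Hypothetical example values yielding roughly +1.8, near the lower limit of normality.
print(round(dysphonia_severity_index(mpt_s=10.0, f0_high_hz=600.0,
                                     i_low_db=53.0, jitter_pct=1.1), 2))
```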

  20. A nonpharmacological approach to improve sleep quality in older adults.

    PubMed

    Rawtaer, Iris; Mahendran, Rathi; Chan, Hui Yu; Lei, Feng; Kua, Ee Heok

    2018-06-01

    Poor sleep quality is highly prevalent among older adults and is associated with poor quality of life, cognitive and physical decline, depression, and increased mortality. Medication options commonly used are not ideal, and alternative treatment strategies are needed. We evaluate a community-based psychosocial intervention program and its effect on sleep quality in older adults. Elderly participants aged 60 and above were included. Those with Geriatric Depression Scale and Geriatric Anxiety Inventory scores above 5 and 10, respectively, were excluded. The community program included tai chi exercise, art therapy, mindfulness awareness practice, and music reminiscence therapy. Pittsburgh Sleep Quality Index, Geriatric Depression Scale, and Geriatric Anxiety Inventory were administered at baseline and at 1 year. A hundred and eighty-nine subjects (44 men, 145 women; mean age = 69 years, SD = 5.7, range = 60-89) participated. The proportion of participants with good sleep quality had increased from 58.2% to 64.6%. Sleep disturbance was significantly reduced (baseline, 1.04; postintervention, 0.76; mean difference 0.28; P < .01); men experienced greater improvement (P < .001). Improvements were independent of changes in depressive and anxiety symptoms. Participation in this community program led to positive effects on sleep disturbances after a year. Psychosocial interventions have potential as a nondrug intervention approach for sleep problems, and further research is needed to understand its mediating mechanisms. © 2017 John Wiley & Sons Australia, Ltd.