Sample records for model quality information

  1. Design and Establishment of Quality Model of Fundamental Geographic Information Database

    NASA Astrophysics Data System (ADS)

    Ma, W.; Zhang, J.; Zhao, Y.; Zhang, P.; Dang, Y.; Zhao, T.

    2018-04-01

    To make the quality evaluation of Fundamental Geographic Information Databases (FGIDB) more comprehensive, objective, and accurate, this paper establishes a quality model of FGIDB formed by the standardization of database construction and quality control, the conformity of data set quality, and the functionality of the database management system. It also designs the overall principles, contents, and methods of quality evaluation for FGIDB, providing a basis and reference for carrying out quality control and quality evaluation. Based on this quality model framework, the paper designs the quality elements, evaluation items, and properties of the FGIDB step by step. Organically connected, these quality elements and evaluation items constitute the quality model of the FGIDB. The model is the foundation for stipulating quality requirements and evaluating the quality of the FGIDB, and is of great significance for quality assurance in the design and development stage, requirement formulation in the testing and evaluation stage, and the construction of a standard system for FGIDB quality evaluation technology.

  2. Modelling End-User of Electronic-Government Service: The Role of Information quality, System Quality and Trust

    NASA Astrophysics Data System (ADS)

    Witarsyah Jacob, Deden; Fudzee, Mohd Farhan Md; Aizi Salamat, Mohamad; Kasim, Shahreen; Mahdin, Hairulnizam; Azhar Ramli, Azizul

    2017-08-01

    Many governments around the world increasingly use Internet technologies such as electronic government to provide public services. These services range from basic informational websites to sophisticated tools for managing interactions between government agencies and beyond. Electronic government (e-government) aims to give the community services that are more accurate, easily accessible, cost-effective, and time-saving. In this study, we develop a new model of e-government service adoption by extending the Unified Theory of Acceptance and Use of Technology (UTAUT) with the variables System Quality, Information Quality, and Trust. The model is then tested using a large-scale, multi-site survey of 237 Indonesian citizens and validated using Structural Equation Modeling (SEM). The results indicate that the System Quality, Information Quality, and Trust variables are proven to affect user behavior. This study extends the current understanding of the influence of System Quality, Information Quality, and Trust for researchers, practitioners, and policy makers.

  3. Video quality assessment using a statistical model of human visual speed perception.

    PubMed

    Wang, Zhou; Li, Qiang

    2007-12-01

    Motion is one of the most important types of information contained in natural video, but direct use of motion information in the design of video quality assessment algorithms has not been deeply investigated. Here we propose to incorporate a recent model of human visual speed perception [Nat. Neurosci. 9, 578 (2006)] and model visual perception in an information communication framework. This allows us to estimate both the motion information content and the perceptual uncertainty in video signals. Improved video quality assessment algorithms are obtained by incorporating the model as spatiotemporal weighting factors, where the weight increases with the information content and decreases with the perceptual uncertainty. Consistent improvement over existing video quality assessment algorithms is observed in our validation with the Video Quality Experts Group (VQEG) Phase I test data set.
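
    The pooling rule described here can be sketched with toy numbers (an illustrative reading of the weighting idea, not the authors' exact formulation; all values below are hypothetical): each local distortion score is weighted by a factor that grows with motion information content and shrinks with perceptual uncertainty.

```python
import numpy as np

# Hypothetical per-region quantities: estimated motion information
# content I and perceptual uncertainty U (illustration only).
info_content = np.array([2.0, 5.0, 1.0, 4.0])
uncertainty  = np.array([1.0, 0.5, 2.0, 1.0])

# Local distortion scores from some base quality metric (higher = worse).
distortion = np.array([0.10, 0.30, 0.20, 0.05])

# Weight increases with information content, decreases with uncertainty.
weights = info_content / uncertainty
weights /= weights.sum()          # normalize the weights to sum to 1

pooled_distortion = float(np.dot(weights, distortion))
print(round(pooled_distortion, 4))
```

    The high-information, low-uncertainty region dominates the pooled score, which is the intended effect of the spatiotemporal weighting.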

  4. Information quality-control model

    NASA Technical Reports Server (NTRS)

    Vincent, D. A.

    1971-01-01

    Model serves as graphic tool for estimating complete product objectives from limited input information, and is applied to cost estimations, product-quality evaluations, and effectiveness measurements for manpower resources allocation. Six product quality levels are defined.

  5. Quality Inspection and Analysis of Three-Dimensional Geographic Information Model Based on Oblique Photogrammetry

    NASA Astrophysics Data System (ADS)

    Dong, S.; Yan, Q.; Xu, Y.; Bai, J.

    2018-04-01

    To promote the construction of the digital geo-spatial framework in China and accelerate the construction of an informatized mapping system, the three-dimensional geographic information model has emerged. Compared with traditional methods, three-dimensional geographic information models based on oblique photogrammetry offer higher accuracy, shorter production periods, and lower cost, and reflect the elevation, position, and appearance of features more directly. Production technology for such models is developing rapidly; market demand and model deliverables have grown substantially, and the associated need for quality inspection is growing with them. A review of the relevant literature shows a great deal of research on the basic principles and technical characteristics of the technology, but relatively little on quality inspection and analysis. After summarizing the basic principles and technical characteristics of oblique photogrammetry, this paper introduces the inspection contents and inspection methods for three-dimensional geographic information models based on the technology. Drawing on actual inspection work, the paper summarizes the quality problems of such models, analyzes the causes of the problems, and puts forward quality control measures. It provides technical guidance for the quality inspection of oblique-photogrammetry-based three-dimensional geographic information model data products in China, and technical support for the vigorous development of the technology.

  6. Design and realization of high quality prime farmland planning and management information system

    NASA Astrophysics Data System (ADS)

    Li, Manchun; Liu, Guohong; Liu, Yongxue; Jiang, Zhixin

    2007-06-01

    The article discusses the design and realization of a high quality prime farmland planning and management information system based on SDSS. Models for concept integration and management planning are used in high quality prime farmland planning to refine the current model system, and the management information system is designed with a triangular structure. Finally, an example of the Tonglu County high quality prime farmland planning and management information system is introduced.

  7. Information Quality Evaluation of C2 Systems at Architecture Level

    DTIC Science & Technology

    2014-06-01

    Capability evaluation of C2 systems at the architecture level becomes necessary and important for improving system capability at the stage of architecture design. This paper proposes a method for information quality evaluation of C2 systems at the architecture level, based on architecture models of C2 systems, which can help identify key factors impacting information quality and improve system capability at the architecture design stage. First, the information quality model is…

  8. Development of the information model for consumer assessment of key quality indicators by goods labelling

    NASA Astrophysics Data System (ADS)

    Koshkina, S.; Ostrinskaya, L.

    2018-04-01

    An information model for “key” quality indicators of goods has been developed. The model is based on an assessment of the existing state of standardization and of product labeling quality. In the authors’ opinion, the proposed “key” indicators are the most significant for purchasing decisions. Customers will be able to use the model through their mobile devices. The developed model makes it possible to decompose existing processes into data flows and to reveal the levels of possible architectural solutions. In further research, in-depth analysis of the decomposition levels of the presented information model will allow determining the stages of its improvement and revealing additional indicators of goods quality that interest customers. Examining architectural solutions for the functioning of the customer’s information environment when integrating existing databases will allow determining the boundaries of the model’s flexibility and customizability.

  9. The Impact of Internet Health Information on Patient Compliance: A Research Model and an Empirical Study.

    PubMed

    Laugesen, John; Hassanein, Khaled; Yuan, Yufei

    2015-06-11

    Patients have been increasingly seeking and using Internet health information to become more active in managing their own health in a partnership with their physicians. This trend has both positive and negative effects on the interactions between patients and their physicians. Therefore, it is important to understand the impact that the increasing use of Internet health information has on the patient-physician relationship and patients' compliance with their treatment regimens. This study examines the impact of patients' use of Internet health information on various elements that characterize the interactions between a patient and her/his physician through a theoretical model based on principal-agent theory and the information asymmetry perspective. A survey-based study consisting of 225 participants was used to validate a model through various statistical techniques. A full assessment of the measurement model and structural model was completed in addition to relevant post hoc analyses. This research revealed that both patient-physician concordance and perceived information asymmetry have significant effects on patient compliance, with patient-physician concordance exhibiting a considerably stronger relationship. Additionally, both physician quality and Internet health information quality have significant effects on patient-physician concordance, with physician quality exhibiting a much stronger relationship. Finally, only physician quality was found to have a significant impact on perceived information asymmetry, whereas Internet health information quality had no impact on perceived information asymmetry. 
Overall, this study found that physicians can relax regarding their fears concerning patient use of Internet health information because physician quality has the greatest impact on patients and their physician coming to an agreement on their medical situation and recommended treatment regimen as well as patient's compliance with their physician's advice when compared to the impact that Internet health information quality has on these same variables. The findings also indicate that agreement between the patient and physician on the medical situation and treatment is much more important to compliance than the perceived information gap between the patient and physician (ie, the physician having a higher level of information in comparison to the patient). In addition, the level of agreement between a patient and their physician regarding the medical situation is more reliant on the perceived quality of their physician than on the perceived quality of Internet health information used. This research found that only the perceived quality of the physician has a significant relationship with the perceived information gap between the patient and their physician and the quality of the Internet health information has no relationship with this perceived information gap.

  10. How users adopt healthcare information: An empirical study of an online Q&A community.

    PubMed

    Jin, Jiahua; Yan, Xiangbin; Li, Yijun; Li, Yumei

    2016-02-01

    The emergence of social media technology has led to the creation of many online healthcare communities, where patients can easily share and look for healthcare-related information from peers who have experienced a similar problem. However, with increased user-generated content, there is a need to constantly analyse which content should be trusted as one sifts through enormous amounts of healthcare information. This study aims to explore patients' healthcare information seeking behavior in online communities. Based on dual-process theory and the knowledge adoption model, we proposed a healthcare information adoption model for online communities. This model highlights that information quality, emotional support, and source credibility are antecedent variables of adoption likelihood of healthcare information, and competition among repliers and involvement of recipients moderate the relationship between the antecedent variables and adoption likelihood. Empirical data were collected from the healthcare module of China's biggest Q&A community, Baidu Knows. Text mining techniques were adopted to calculate the information quality and emotional support contained in each reply text. A binary logistic regression model and hierarchical regression approach were employed to test the proposed conceptual model. Information quality, emotional support, and source credibility have significant and positive impact on healthcare information adoption likelihood, and among these factors, information quality has the biggest impact on a patient's adoption decision. In addition, competition among repliers and involvement of recipients were tested as moderating effects between these antecedent factors and the adoption likelihood. 
Results indicate competition among repliers positively moderates the relationship between source credibility and adoption likelihood, and recipients' involvement positively moderates the relationship between information quality, source credibility, and adoption decision. In addition to information quality and source credibility, emotional support has significant positive impact on individuals' healthcare information adoption decisions. Moreover, the relationships between information quality, source credibility, emotional support, and adoption decision are moderated by competition among repliers and involvement of recipients. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
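
    The kind of moderated adoption model tested in this study can be sketched on synthetic data (a hedged illustration, not the study's actual data or estimation code; the variable names and effect sizes are invented): a binary logistic regression with an interaction term carrying the moderation hypothesis.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Hypothetical stand-ins for the study's constructs (all invented):
quality = rng.normal(size=n)      # information quality of a reply
credibility = rng.normal(size=n)  # source credibility of the replier
involvement = rng.normal(size=n)  # recipient involvement (moderator)

# Assumed "true" process: involvement strengthens the quality effect.
logits = 0.9 * quality + 0.5 * credibility + 0.4 * quality * involvement
adopted = (rng.random(n) < 1.0 / (1.0 + np.exp(-logits))).astype(float)

# Design matrix: intercept, main effects, and the interaction term.
X = np.column_stack([np.ones(n), quality, credibility,
                     involvement, quality * involvement])

# Fit a binary logistic regression by plain gradient ascent on the
# log-likelihood (kept dependency-free on purpose).
beta = np.zeros(X.shape[1])
for _ in range(3000):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    beta += 0.1 * X.T @ (adopted - p) / n

b_quality, b_interaction = beta[1], beta[4]
print(round(b_quality, 2), round(b_interaction, 2))
```

    A positive interaction coefficient would be read as involvement amplifying the effect of information quality on adoption likelihood, mirroring the moderation the study reports.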

  11. Integrated care: an Information Model for Patient Safety and Vigilance Reporting Systems.

    PubMed

    Rodrigues, Jean-Marie; Schulz, Stefan; Souvignet, Julien

    2015-01-01

    Quality management information systems for safety as a whole or for specific vigilances share the same information types but are not interoperable. An international initiative is trying to develop an integrated information model for patient safety and vigilance reporting to support a global approach to health care quality.

  12. Quality evaluation of health information system's architectures developed using the HIS-DF methodology.

    PubMed

    López, Diego M; Blobel, Bernd; Gonzalez, Carolina

    2010-01-01

    Requirement analysis, design, implementation, evaluation, use, and maintenance of semantically interoperable Health Information Systems (HIS) have to be based on eHealth standards. HIS-DF is a comprehensive approach for HIS architectural development based on standard information models and vocabulary. The empirical validity of HIS-DF has not been demonstrated so far. Through an empirical experiment, the paper demonstrates that using HIS-DF and HL7 information models, semantic quality of HIS architecture can be improved, compared to architectures developed using traditional RUP process. Semantic quality of the architecture has been measured in terms of model's completeness and validity metrics. The experimental results demonstrated an increased completeness of 14.38% and an increased validity of 16.63% when using the HIS-DF and HL7 information models in a sample HIS development project. Quality assurance of the system architecture in earlier stages of HIS development presumes an increased quality of final HIS systems, which supposes an indirect impact on patient care.

  13. A Theory of Information Genetics: How Four Subforces Generate Information and the Implications for Total Quality Knowledge Management.

    ERIC Educational Resources Information Center

    Tsai, Bor-sheng

    2002-01-01

    Proposes a model called information genetics to elaborate on the origin of information generating. Explains conceptual and data models, and describes a software program that was developed for citation data mining, infomapping, and information repackaging for total quality knowledge management in Web representation. (Contains 112 references.)

  14. Information system success model for customer relationship management system in health promotion centers.

    PubMed

    Choi, Wona; Rho, Mi Jung; Park, Jiyun; Kim, Kwang-Jum; Kwon, Young Dae; Choi, In Young

    2013-06-01

    Intensified competitiveness in the healthcare industry has increased the number of healthcare centers and propelled the introduction of customer relationship management (CRM) systems to meet diverse customer demands. This study aimed to develop the information system success model of the CRM system by investigating previously proposed indicators within the model. The evaluation of the CRM system covers three areas: the system characteristics area (system quality, information quality, and service quality), the user area (perceived usefulness and user satisfaction), and the performance area (personal performance and organizational performance). Detailed evaluation criteria of the three areas were developed, and their validity was verified by a survey administered to CRM system users in 13 nationwide health promotion centers. The survey data were analyzed by the structural equation modeling method, and the results confirmed that the model is feasible. Information quality and service quality showed a statistically significant relationship with perceived usefulness and user satisfaction. Consequently, the perceived usefulness and user satisfaction had significant influence on individual performance as well as an indirect influence on organizational performance. This study extends the research area on information success from general information systems to CRM systems in health promotion centers applying a previous information success model. This lays a foundation for evaluating health promotion center systems and provides a useful guide for successful implementation of hospital CRM systems.

  15. Information System Success Model for Customer Relationship Management System in Health Promotion Centers

    PubMed Central

    Choi, Wona; Rho, Mi Jung; Park, Jiyun; Kim, Kwang-Jum; Kwon, Young Dae

    2013-01-01

    Objectives Intensified competitiveness in the healthcare industry has increased the number of healthcare centers and propelled the introduction of customer relationship management (CRM) systems to meet diverse customer demands. This study aimed to develop the information system success model of the CRM system by investigating previously proposed indicators within the model. Methods The evaluation of the CRM system covers three areas: the system characteristics area (system quality, information quality, and service quality), the user area (perceived usefulness and user satisfaction), and the performance area (personal performance and organizational performance). Detailed evaluation criteria of the three areas were developed, and their validity was verified by a survey administered to CRM system users in 13 nationwide health promotion centers. The survey data were analyzed by the structural equation modeling method, and the results confirmed that the model is feasible. Results Information quality and service quality showed a statistically significant relationship with perceived usefulness and user satisfaction. Consequently, the perceived usefulness and user satisfaction had significant influence on individual performance as well as an indirect influence on organizational performance. Conclusions This study extends the research area on information success from general information systems to CRM systems in health promotion centers applying a previous information success model. This lays a foundation for evaluating health promotion center systems and provides a useful guide for successful implementation of hospital CRM systems. PMID:23882416

  16. Software Quality Evaluation Models Applicable in Health Information and Communications Technologies. A Review of the Literature.

    PubMed

    Villamor Ordozgoiti, Alberto; Delgado Hito, Pilar; Guix Comellas, Eva María; Fernandez Sanchez, Carlos Manuel; Garcia Hernandez, Milagros; Lluch Canut, Teresa

    2016-01-01

    The use of Information and Communications Technologies (ICT) in healthcare has increased the need to consider quality criteria through standardised processes. The aim of this study was to analyse the software quality evaluation models applicable to healthcare from the perspective of ICT purchasers. Through a systematic literature review with the keywords software, product, quality, evaluation and health, we selected and analysed 20 original research papers published from 2005 to 2016 in health science and technology databases. The results showed four main topics: non-ISO models, software quality evaluation models based on ISO/IEC standards, studies analysing software quality evaluation models, and studies analysing ISO standards for software quality evaluation. The models provide cost-efficiency criteria for specific software and improve use outcomes. The ISO/IEC 25000 standard is shown to be the most suitable for evaluating the quality of ICTs for healthcare use from the perspective of institutional acquisition.

  17. Decision Maker Perception of Information Quality: A Case Study of Military Command and Control

    ERIC Educational Resources Information Center

    Morgan, Grayson B.

    2013-01-01

    Decision maker perception of information quality cues from an "information system" (IS) and the process which creates such meta cueing, or data about cues, is a critical yet un-modeled component of "situation awareness" (SA). Examples of common information quality meta cueing for quality criteria include custom ring-tones for…

  18. The Impact of Internet Health Information on Patient Compliance: A Research Model and an Empirical Study

    PubMed Central

    Hassanein, Khaled; Yuan, Yufei

    2015-01-01

    Background Patients have been increasingly seeking and using Internet health information to become more active in managing their own health in a partnership with their physicians. This trend has both positive and negative effects on the interactions between patients and their physicians. Therefore, it is important to understand the impact that the increasing use of Internet health information has on the patient-physician relationship and patients’ compliance with their treatment regimens. Objective This study examines the impact of patients’ use of Internet health information on various elements that characterize the interactions between a patient and her/his physician through a theoretical model based on principal-agent theory and the information asymmetry perspective. Methods A survey-based study consisting of 225 participants was used to validate a model through various statistical techniques. A full assessment of the measurement model and structural model was completed in addition to relevant post hoc analyses. Results This research revealed that both patient-physician concordance and perceived information asymmetry have significant effects on patient compliance, with patient-physician concordance exhibiting a considerably stronger relationship. Additionally, both physician quality and Internet health information quality have significant effects on patient-physician concordance, with physician quality exhibiting a much stronger relationship. Finally, only physician quality was found to have a significant impact on perceived information asymmetry, whereas Internet health information quality had no impact on perceived information asymmetry. 
Conclusions Overall, this study found that physicians can relax regarding their fears concerning patient use of Internet health information because physician quality has the greatest impact on patients and their physician coming to an agreement on their medical situation and recommended treatment regimen as well as patient’s compliance with their physician’s advice when compared to the impact that Internet health information quality has on these same variables. The findings also indicate that agreement between the patient and physician on the medical situation and treatment is much more important to compliance than the perceived information gap between the patient and physician (ie, the physician having a higher level of information in comparison to the patient). In addition, the level of agreement between a patient and their physician regarding the medical situation is more reliant on the perceived quality of their physician than on the perceived quality of Internet health information used. This research found that only the perceived quality of the physician has a significant relationship with the perceived information gap between the patient and their physician and the quality of the Internet health information has no relationship with this perceived information gap. PMID:26068214

  19. A Markovian model market—Akerlof's lemons and the asymmetry of information

    NASA Astrophysics Data System (ADS)

    Tilles, Paulo F. C.; Ferreira, Fernando F.; Francisco, Gerson; Pereira, Carlos de B.; Sarti, Flavia M.

    2011-07-01

    In this work we study an agent based model to investigate the role of asymmetric information degrees for market evolution. This model is quite simple and may be treated analytically since the consumers evaluate the quality of a certain good taking into account only the quality of the last good purchased plus her perceptive capacity β. As a consequence, the system evolves according to a stationary Markov chain. The value of a good offered by the firms increases along with quality according to an exponent α, which is a measure of the technology. It incorporates all the technological capacity of the production systems such as education, scientific development and techniques that change the productivity rates. The technological level plays an important role to explain how the asymmetry of information may affect the market evolution in this model. We observe that, for high technological levels, the market can detect adverse selection. The model allows us to compute the maximum asymmetric information degree before the market collapses. Below this critical point the market evolves during a limited period of time and then dies out completely. When β is closer to 1 (symmetric information), the market becomes more profitable for high quality goods, although high and low quality markets coexist. The maximum asymmetric information level is a consequence of an ergodicity breakdown in the process of quality evaluation.

  20. Using Modified-ISS Model to Evaluate Medication Administration Safety During Bar Code Medication Administration Implementation in Taiwan Regional Teaching Hospital.

    PubMed

    Ma, Pei-Luen; Jheng, Yan-Wun; Jheng, Bi-Wei; Hou, I-Ching

    2017-01-01

    Bar code medication administration (BCMA) can reduce medical errors and promote patient safety. This research uses a modified information systems success model (M-ISS model) to evaluate nurses' acceptance of BCMA. The results showed moderate correlations between medication administration safety (MAS) and system quality, information quality, service quality, user satisfaction, and limited satisfaction.

  21. Establishing a Cloud Computing Success Model for Hospitals in Taiwan.

    PubMed

    Lian, Jiunn-Woei

    2017-01-01

    The purpose of this study is to understand the critical quality-related factors that affect cloud computing success of hospitals in Taiwan. In this study, private cloud computing is the major research target. The chief information officers participated in a questionnaire survey. The results indicate that the integration of trust into the information systems success model will have acceptable explanatory power to understand cloud computing success in the hospital. Moreover, information quality and system quality directly affect cloud computing satisfaction, whereas service quality indirectly affects the satisfaction through trust. In other words, trust serves as the mediator between service quality and satisfaction. This cloud computing success model will help hospitals evaluate or achieve success after adopting private cloud computing health care services.
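
    The mediating role of trust reported here can be illustrated with a standard product-of-coefficients mediation check on simulated data (a generic sketch, not the study's SEM analysis; the constructs and effect sizes are invented): the indirect effect of service quality on satisfaction is the a-path times the b-path, while the direct effect vanishes when mediation is full.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 5000

# Invented construct scores; effect sizes are assumptions for the sketch.
service_quality = rng.normal(size=n)
trust = 0.6 * service_quality + rng.normal(size=n)   # a-path
satisfaction = 0.5 * trust + rng.normal(size=n)      # b-path; no direct path

def ols_coefs(X, y):
    """Least-squares coefficients of y ~ X (X already has an intercept)."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

# a-path: service quality -> trust
a = ols_coefs(np.column_stack([np.ones(n), service_quality]), trust)[1]

# b-path and direct effect: satisfaction ~ trust + service quality
coefs = ols_coefs(np.column_stack([np.ones(n), trust, service_quality]),
                  satisfaction)
b, direct = coefs[1], coefs[2]

indirect = a * b   # the mediated (indirect) effect
print(round(indirect, 2), round(direct, 2))
```

    A sizeable indirect effect with a near-zero direct effect is the pattern one would expect if, as the study concludes, trust fully mediates the link between service quality and satisfaction.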

  22. Establishing a Cloud Computing Success Model for Hospitals in Taiwan

    PubMed Central

    Lian, Jiunn-Woei

    2017-01-01

    The purpose of this study is to understand the critical quality-related factors that affect cloud computing success of hospitals in Taiwan. In this study, private cloud computing is the major research target. The chief information officers participated in a questionnaire survey. The results indicate that the integration of trust into the information systems success model will have acceptable explanatory power to understand cloud computing success in the hospital. Moreover, information quality and system quality directly affect cloud computing satisfaction, whereas service quality indirectly affects the satisfaction through trust. In other words, trust serves as the mediator between service quality and satisfaction. This cloud computing success model will help hospitals evaluate or achieve success after adopting private cloud computing health care services. PMID:28112020

  23. Understanding Intention to Use Electronic Information Resources: A Theoretical Extension of the Technology Acceptance Model (TAM)

    PubMed Central

    Tao, Donghua

    2008-01-01

    This study extended the Technology Acceptance Model (TAM) by examining the roles of two aspects of e-resource characteristics, namely, information quality and system quality, in predicting public health students’ intention to use e-resources for completing research paper assignments. Both focus groups and a questionnaire were used to collect data. Descriptive analysis, data screening, and Structural Equation Modeling (SEM) techniques were used for data analysis. The study found that perceived usefulness played a major role in determining students’ intention to use e-resources. Perceived usefulness and perceived ease of use fully mediated the impact that information quality and system quality had on behavior intention. The research model enriches the existing technology acceptance literature by extending TAM. Representing two aspects of e-resource characteristics provides greater explanatory information for diagnosing problems of system design, development, and implementation. PMID:18999300

  24. Understanding intention to use electronic information resources: A theoretical extension of the technology acceptance model (TAM).

    PubMed

    Tao, Donghua

    2008-11-06

    This study extended the Technology Acceptance Model (TAM) by examining the roles of two aspects of e-resource characteristics, namely, information quality and system quality, in predicting public health students' intention to use e-resources for completing research paper assignments. Both focus groups and a questionnaire were used to collect data. Descriptive analysis, data screening, and Structural Equation Modeling (SEM) techniques were used for data analysis. The study found that perceived usefulness played a major role in determining students' intention to use e-resources. Perceived usefulness and perceived ease of use fully mediated the impact that information quality and system quality had on behavior intention. The research model enriches the existing technology acceptance literature by extending TAM. Representing two aspects of e-resource characteristics provides greater explanatory information for diagnosing problems of system design, development, and implementation.

  5. Preface to QoIS 2009

    NASA Astrophysics Data System (ADS)

    Comyn-Wattiau, Isabelle; Thalheim, Bernhard

Quality assurance is a growing research domain within the Information Systems (IS) and Conceptual Modeling (CM) disciplines. Ongoing research on quality in IS and CM is highly diverse and encompasses theoretical aspects, including quality definitions and quality models, and practical/empirical aspects, such as the development of methods, approaches and tools for quality measurement and improvement. Current research on quality also includes quality characteristics definitions, validation instruments, methodological and development approaches to quality assurance during software and information systems development, quality monitors, quality assurance during information systems development processes and practices, quality assurance both for data and (meta)schemata, quality support for information systems data import and export, quality of query answering, and cost/benefit analysis of quality assurance processes. Quality assurance also depends on the application area and on specific requirements in domains such as the health sector, logistics, the public sector, the financial sector, manufacturing, services, e-commerce, and software. Furthermore, quality assurance must also be supported for data aggregation, ETL processes, web content management and other multi-layered applications. Because quality assurance typically requires resources, its benefits come with a computational and economic trade-off; it therefore rests on a compromise between the value of quality data and the cost of quality assurance.

  6. Integrating multiple data sources in species distribution modeling: A framework for data fusion

    USGS Publications Warehouse

    Pacifici, Krishna; Reich, Brian J.; Miller, David A.W.; Gardner, Beth; Stauffer, Glenn E.; Singh, Susheela; McKerrow, Alexa; Collazo, Jaime A.

    2017-01-01

The last decade has seen a dramatic increase in the use of species distribution models (SDMs) to characterize patterns of species’ occurrence and abundance. Efforts to parameterize SDMs often create a tension between the quality and quantity of data available to fit models. Estimation methods that integrate both standardized and non-standardized data types offer a potential solution to the tradeoff between data quality and quantity. Recently several authors have developed approaches for jointly modeling two sources of data (one of high quality and one of lesser quality). We extend their work by allowing for explicit spatial autocorrelation in occurrence and detection error using a Multivariate Conditional Autoregressive (MVCAR) model and develop three models that share information in a less direct manner, resulting in more robust performance when the auxiliary data are of lesser quality. We describe these three new approaches (“Shared,” “Correlation,” “Covariates”) for combining data sources and show their use in a case study of the Brown-headed Nuthatch in the Southeastern U.S. and through simulations. All three approaches that used the second data source improved out-of-sample predictions relative to a single data source (“Single”). When information in the second data source is of high quality, the Shared model performs best, but the Correlation and Covariates models also perform well. When the information in the second data source is of lesser quality, the Correlation and Covariates models performed better, suggesting they are robust alternatives when little is known about auxiliary data collected opportunistically or through citizen scientists. Methods that allow both data types to be used will maximize the useful information available for estimating species distributions.

  7. Characterizing CO and NOy Sources and Relative Ambient Ratios in the Baltimore Area Using Ambient Measurements and Source Attribution Modeling

    EPA Science Inventory

    Modeled source attribution information from the Community Multiscale Air Quality model was coupled with ambient data from the 2011 Deriving Information on Surface conditions from Column and Vertically Resolved Observations Relevant to Air Quality Baltimore field study. We assess ...

  8. Nurses' Satisfaction With Using Nursing Information Systems From Technology Acceptance Model and Information Systems Success Model Perspectives: A Reductionist Approach.

    PubMed

    Lin, Hsien-Cheng

    2017-02-01

Nursing information systems can enhance nursing practice and the efficiency and quality of administrative affairs within the nursing department and thus have been widely considered for implementation. Close alignment of human-computer interaction can advance optimal clinical performance with the use of information systems. However, a lack of attention to the alignment between users' perceptions and technological functionality has caused dissatisfaction, as shown in the existing literature. This study provides insight into how the alignment between nurses' perceptions and technological functionality affects their satisfaction with Nursing Information System use, viewed through a reductionist perspective of alignment. This cross-sectional study collected data from 531 registered nurses in Taiwan. The results indicated that "perceived usefulness in system quality alignment," "perceived usefulness in information quality alignment," "perceived ease of use in system quality alignment," "perceived ease of use in information quality alignment," and "perceived ease of use in service quality alignment" significantly affected nurses' satisfaction with Nursing Information System use. However, "perceived usefulness in service quality alignment" had no significant effect on nurses' satisfaction. This study also provides some meaningful implications for theoretical and practical aspects of design.

  9. Information security system quality assessment through the intelligent tools

    NASA Astrophysics Data System (ADS)

    Trapeznikov, E. V.

    2018-04-01

Advances in technology have made a comprehensive analysis of automated-system information security necessary, and an analysis of the subject area confirms the relevance of the study. The research objective is to develop a methodology for assessing information security system quality based on intelligent tools. The methodology rests on a neural-network model for assessing information security within an information system. The paper presents the security assessment model and its algorithm, shows the results of the methodology's practical implementation in the form of a software flow diagram, and notes the practical significance of the model in the conclusions.

  10. Where Is the Xerox Corporation of the LIS Sector?

    ERIC Educational Resources Information Center

    Gilchrist, Alan; Brockman, John

    1996-01-01

    Discusses barriers to the implementation of quality management in the library and information science sector in Europe. Topics include Total Quality Management and other business experiences, an information quality infrastructure, supplier/customer relations, customer satisfaction, and a European Quality Model. (LRW)

  11. Development of an Instructional Quality Assurance Model in Nursing Science

    ERIC Educational Resources Information Center

    Ajpru, Haruthai; Pasiphol, Shotiga; Wongwanich, Suwimon

    2011-01-01

The purpose of this study was to develop an instructional quality assurance model in nursing science. The study was divided into 3 phases: (1) to study the information needed for instructional quality assurance model development; (2) to develop an instructional quality assurance model in nursing science; and (3) to audit and assess the developed…

  12. Exploring nursing e-learning systems success based on information system success model.

    PubMed

    Chang, Hui-Chuan; Liu, Chung-Feng; Hwang, Hsin-Ginn

    2011-12-01

E-learning is thought of as an innovative approach to enhance nurses' care service knowledge. Extensive research has provided rich information toward system development, courses design, and nurses' satisfaction with an e-learning system. However, a comprehensive view of nursing e-learning system success is an important but less-explored topic. The purpose of this research was to explore net benefits of nursing e-learning systems based on the updated DeLone and McLean Information System Success Model. The study used a self-administered questionnaire to collect 208 valid nurses' responses from 21 of Taiwan's medium- and large-scale hospitals that have implemented nursing e-learning systems. The result confirms that the model is sufficient to explore the nurses' use of e-learning systems in terms of intention to use, user satisfaction, and net benefits. However, while the three exogenous quality factors (system quality, information quality, and service quality) were all found to be critical factors affecting user satisfaction, only information quality showed a direct effect on the intention to use. This study provides useful insights for evaluating nursing e-learning system qualities as well as an understanding of nurses' intentions and satisfaction related to performance benefits.

  13. Impact of High Resolution Land-Use Data in Meteorology and Air Quality Modeling Systems

    EPA Science Inventory

    Accurate land use information is important in meteorology for land surface exchanges, in emission modeling for emission spatial allocation, and in air quality modeling for chemical surface fluxes. Currently, meteorology, emission, and air quality models often use outdated USGS Gl...

  14. [Prediction of regional soil quality based on mutual information theory integrated with decision tree algorithm].

    PubMed

    Lin, Fen-Fang; Wang, Ke; Yang, Ning; Yan, Shi-Guang; Zheng, Xin-Yu

    2012-02-01

To characterize the spatial distribution of regional soil quality precisely, this paper considered the main factors affecting soil quality, such as soil type, land use pattern, lithology type, topography, road, and industry type. Mutual information theory was adopted to select the main environmental factors, and the See5.0 decision tree algorithm was applied to predict the grade of regional soil quality. The main factors affecting regional soil quality were soil type, land use, lithology type, distance to town, distance to water area, altitude, distance to road, and distance to industrial land. The prediction accuracy of the decision tree model built with the variables selected by mutual information was markedly higher than that of the model using all variables, and for the former model, whether expressed as a decision tree or as decision rules, prediction accuracy exceeded 80%. Applied to both continuous and categorical data, mutual information theory integrated with a decision tree could not only reduce the number of input parameters for the decision tree algorithm but also predict and assess regional soil quality effectively.
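The mutual-information selection step described above can be sketched for discrete variables in a few lines. The land-use and lithology values below are hypothetical toy data, not the paper's; the point is only that factors carrying no information about the soil-quality grade score zero and get dropped before tree building.

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """I(X;Y) in bits for two equal-length sequences of discrete values."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# Hypothetical records: land use tracks the soil-quality grade exactly,
# while lithology varies independently of it.
land_use  = ["farmland"] * 4 + ["forest"] * 4
lithology = ["granite", "basalt"] * 4
grade     = ["high"] * 4 + ["low"] * 4

scores = {
    "land_use": mutual_information(land_use, grade),
    "lithology": mutual_information(lithology, grade),
}
# Keep only factors that actually carry information about the grade.
selected = [factor for factor, mi in scores.items() if mi > 0.1]
print(scores, selected)
```

Here land use carries a full bit of information about the grade while lithology carries none, so only land use would be passed on to the decision tree.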

  15. IS Success Model in E-Learning Context Based on Students' Perceptions

    ERIC Educational Resources Information Center

    Freeze, Ronald D.; Alshare, Khaled A.; Lane, Peggy L.; Wen, H. Joseph

    2010-01-01

    This study utilized the Information Systems Success (ISS) model in examining e-learning systems success. The study was built on the premise that system quality (SQ) and information quality (IQ) influence system use and user satisfaction, which in turn impact system success. A structural equation model (SEM), using LISREL, was used to test the…

  16. Validation of the DeLone and McLean Information Systems Success Model.

    PubMed

    Ojo, Adebowale I

    2017-01-01

    This study is an adaptation of the widely used DeLone and McLean information system success model in the context of hospital information systems in a developing country. A survey research design was adopted in the study. A structured questionnaire was used to collect data from 442 health information management personnel in five Nigerian teaching hospitals. A structural equation modeling technique was used to validate the model's constructs. It was revealed that system quality significantly influenced use (β = 0.53, p < 0.001) and user satisfaction (β = 0.17, p < 0.001). Information quality significantly influenced use (β = 0.24, p < 0.001) and user satisfaction (β = 0.17, p < 0.001). Also, service quality significantly influenced use (β = 0.22, p < 0.001) and user satisfaction (β = 0.51, p < 0.001). However, use did not significantly influence user satisfaction (β = 0.00, p > 0.05), but it significantly influenced perceived net benefits (β = 0.21, p < 0.001). Furthermore, user satisfaction did not significantly influence perceived net benefits (β = 0.00, p > 0.05). The study validates the DeLone and McLean information system success model in the context of a hospital information system in a developing country. Importantly, system quality and use were found to be important measures of hospital information system success. It is, therefore, imperative that hospital information systems are designed in such ways that are easy to use, flexible, and functional to serve their purpose.
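Taking the coefficients reported in the abstract above at face value, standard path tracing gives the indirect effects the model implies: an indirect effect is the product of the coefficients along the path. This is a back-of-the-envelope sketch under the usual product-of-paths assumption, not a computation from the study's data.

```python
# Path coefficients as reported in the abstract above.
paths = {
    ("system_quality", "use"): 0.53,
    ("information_quality", "use"): 0.24,
    ("service_quality", "use"): 0.22,
    ("use", "net_benefits"): 0.21,
}

def indirect_effect(source, mediator, target):
    """Product-of-paths indirect effect: source -> mediator -> target."""
    return paths[(source, mediator)] * paths[(mediator, target)]

# System quality reaches perceived net benefits through use.
sq_via_use = indirect_effect("system_quality", "use", "net_benefits")
print(round(sq_via_use, 3))
```

So the reported paths imply an indirect effect of system quality on perceived net benefits of roughly 0.53 x 0.21, about 0.11, consistent with the study's conclusion that system quality and use are important measures of success.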

  17. Validation of the DeLone and McLean Information Systems Success Model

    PubMed Central

    2017-01-01

    Objectives This study is an adaptation of the widely used DeLone and McLean information system success model in the context of hospital information systems in a developing country. Methods A survey research design was adopted in the study. A structured questionnaire was used to collect data from 442 health information management personnel in five Nigerian teaching hospitals. A structural equation modeling technique was used to validate the model's constructs. Results It was revealed that system quality significantly influenced use (β = 0.53, p < 0.001) and user satisfaction (β = 0.17, p < 0.001). Information quality significantly influenced use (β = 0.24, p < 0.001) and user satisfaction (β = 0.17, p < 0.001). Also, service quality significantly influenced use (β = 0.22, p < 0.001) and user satisfaction (β = 0.51, p < 0.001). However, use did not significantly influence user satisfaction (β = 0.00, p > 0.05), but it significantly influenced perceived net benefits (β = 0.21, p < 0.001). Furthermore, user satisfaction did not significantly influence perceived net benefits (β = 0.00, p > 0.05). Conclusions The study validates the DeLone and McLean information system success model in the context of a hospital information system in a developing country. Importantly, system quality and use were found to be important measures of hospital information system success. It is, therefore, imperative that hospital information systems are designed in such ways that are easy to use, flexible, and functional to serve their purpose. PMID:28261532

  18. AIR QUALITY FORECAST DATABASE AND ANALYSIS

    EPA Science Inventory

    In 2003, NOAA and EPA signed a Memorandum of Agreement to collaborate on the design and implementation of a capability to produce daily air quality modeling forecast information for the U.S. NOAA's ETA meteorological model and EPA's Community Multiscale Air Quality (CMAQ) model ...

  19. Total Quality Management in a Knowledge Management Perspective.

    ERIC Educational Resources Information Center

    Johannsen, Carl Gustav

    2000-01-01

    Presents theoretical considerations on both similarities and differences between information management and knowledge management and presents a conceptual model of basic knowledge management processes. Discusses total quality management and quality control in the context of information management. (Author/LRW)

  20. Reputation and Competition in a Hidden Action Model

    PubMed Central

    Fedele, Alessandro; Tedeschi, Piero

    2014-01-01

    The economics models of reputation and quality in markets can be classified in three categories. (i) Pure hidden action, where only one type of seller is present who can provide goods of different quality. (ii) Pure hidden information, where sellers of different types have no control over product quality. (iii) Mixed frameworks, which include both hidden action and hidden information. In this paper we develop a pure hidden action model of reputation and Bertrand competition, where consumers and firms interact repeatedly in a market with free entry. The price of the good produced by the firms is contractible, whilst the quality is noncontractible, hence it is promised by the firms when a contract is signed. Consumers infer future quality from all available information, i.e., both from what they know about past quality and from current prices. According to early contributions, competition should make reputation unable to induce the production of high-quality goods. We provide a simple solution to this problem by showing that high quality levels are sustained as an outcome of a stationary symmetric equilibrium. PMID:25329387
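The repeated-game intuition behind the result above, that a firm honors its noncontractible quality promise when the discounted value of keeping its reputation outweighs the one-shot gain from shirking, can be sketched with the textbook grim-trigger condition delta/(1-delta) * margin >= deviation gain. This is a generic illustration under assumed parameters, not the paper's actual equilibrium construction, which involves Bertrand competition and free entry.

```python
def quality_sustainable(delta, per_period_margin, deviation_gain):
    """Grim-trigger check: the firm produces high quality if the discounted
    stream of future margins, delta/(1-delta) * margin, beats the one-shot
    saving from secretly cutting quality."""
    continuation_value = delta / (1.0 - delta) * per_period_margin
    return continuation_value >= deviation_gain

# Patient players (high delta) sustain quality; impatient ones do not.
print(quality_sustainable(0.9, 1.0, 5.0))   # continuation value 9 beats 5
print(quality_sustainable(0.5, 1.0, 5.0))   # continuation value 1 falls short
```

The interesting part of the paper is precisely that such high-quality equilibria survive even under competition with free entry, where early contributions predicted reputation would fail.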

  1. Reputation and competition in a hidden action model.

    PubMed

    Fedele, Alessandro; Tedeschi, Piero

    2014-01-01

    The economics models of reputation and quality in markets can be classified in three categories. (i) Pure hidden action, where only one type of seller is present who can provide goods of different quality. (ii) Pure hidden information, where sellers of different types have no control over product quality. (iii) Mixed frameworks, which include both hidden action and hidden information. In this paper we develop a pure hidden action model of reputation and Bertrand competition, where consumers and firms interact repeatedly in a market with free entry. The price of the good produced by the firms is contractible, whilst the quality is noncontractible, hence it is promised by the firms when a contract is signed. Consumers infer future quality from all available information, i.e., both from what they know about past quality and from current prices. According to early contributions, competition should make reputation unable to induce the production of high-quality goods. We provide a simple solution to this problem by showing that high quality levels are sustained as an outcome of a stationary symmetric equilibrium.

  2. Developing a theoretical model and questionnaire survey instrument to measure the success of electronic health records in residential aged care.

    PubMed

    Yu, Ping; Qian, Siyu

    2018-01-01

Electronic health records (EHR) are introduced into healthcare organizations worldwide to improve patient safety, healthcare quality and efficiency. A rigorous evaluation of this technology is important to reduce potential negative effects on patients and staff, to provide decision makers with accurate information for system improvement and to ensure return on investment. Therefore, this study develops a theoretical model and questionnaire survey instrument to assess the success of organizational EHR in routine use from the viewpoint of nursing staff in residential aged care homes. The proposed research model incorporates the six variables of the reformulated DeLone and McLean information systems success model: system quality, information quality, service quality, use, user satisfaction and net benefits. Two further variables, training and self-efficacy, were also incorporated into the model. A questionnaire survey instrument was designed to measure the eight variables in the model. After a pilot test, the measurement scale was used to collect data from 243 nursing staff members in 10 residential aged care homes belonging to three management groups in Australia. Partial least squares path modeling was conducted to validate the model. The validated EHR systems success model predicts the impact of the four antecedent variables-training, self-efficacy, system quality and information quality-on the net benefits, the indicator of EHR systems success, through the intermediate variables use and user satisfaction. A 24-item measurement scale was developed to quantitatively evaluate the performance of an EHR system. The parsimonious EHR systems success model and the measurement scale can be used to benchmark EHR systems success across organizations and units and over time.
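The study validates its 24-item scale with partial least squares path modeling. As a smaller illustration of instrument checking, the internal-consistency reliability of a set of scale items can be computed directly as Cronbach's alpha; the Likert responses below are toy data, not the study's, and alpha is a common companion check rather than the PLS procedure the paper used.

```python
import numpy as np

def cronbach_alpha(items):
    """items: shape (n_respondents, k_items). Returns alpha =
    k/(k-1) * (1 - sum of item variances / variance of the summed scale)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses to three closely related items.
responses = [
    [5, 5, 4],
    [4, 4, 4],
    [2, 3, 2],
    [1, 2, 1],
]
alpha = cronbach_alpha(responses)
print(round(alpha, 2))
```

Items that move together, as these do, yield alpha near 1, which is the kind of evidence used to argue a subscale measures a single construct such as system quality.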

  3. How the national healthcare quality and disparities reports can catalyze quality improvement.

    PubMed

    McNeill, Dwight; Kelley, Ed

    2005-03-01

The purpose of the National Reports on Healthcare Quality and Disparities is to enhance awareness of quality and health care disparities, track progress, understand variations, and catalyze improvements in health care. The objective of this paper is to propose a model that facilitates a user's progression from knowledge to action and to show how the reports, their data warehouse, associated products, and Agency for Healthcare Research and Quality resources are integrated and focused on a comprehensive campaign to improve health care quality. The paper presents a conceptual model and shows how implementation strategies for the reports fit that model. The authors propose a quality improvement supply chain model to help elucidate the links of the process, the corresponding developmental stages that potential users need to master and progress through, and the "just-in-time" supply chain inputs at each of the corresponding stages, and they populate the model with examples. The traditional ways of disseminating knowledge derived from science, through reports and conferences, are inadequate to the humbling need for vast improvements in the US health care system. Our model suggests the need for a wide variety of information, packaged in diverse ways and delivered just in time and on demand. It encourages the alignment of decision makers and researchers, along with information intermediaries and innovation brokers, to make the information production cycle more efficient and effective. Future iterations of the reports will improve the relevance, meaning, and distribution of information to facilitate its uptake by potential users.

  4. Comparing a Japanese and a German hospital information system.

    PubMed

    Jahn, F; Issler, L; Winter, A; Takabayashi, K

    2009-01-01

To examine the architectural differences and similarities of a Japanese and a German hospital information system (HIS) in a case study. This cross-cultural comparison, which focuses on structural quality characteristics, offers the chance to gain new insights into different HIS architectures that inner-country comparisons possibly cannot provide. A reference model for the domain layer of hospital information systems, containing the typical enterprise functions of a hospital, provides the basis of comparison for the two hospital information systems. 3LGM² models, which describe the two HISs and are based on that reference model, are used to assess several structural quality criteria, four of which are introduced in detail. The two HISs differ in terms of the four structural quality criteria examined. Whereas the centralized architecture of the hospital information system at Chiba University Hospital causes only a few functional redundancies but leads to low use of communication standards, the hospital information system at the University Hospital of Leipzig, with its decentralized architecture, exhibits more functional redundancies and higher use of communication standards. Using a model-based comparison, it was possible to detect remarkable differences between hospital information systems from completely different cultural areas. However, the usability of 3LGM² models for such comparisons has to be improved in order to apply key figures and to assess or benchmark the structural quality of health information system architectures more thoroughly.

  5. Service quality and asymmetric information in the regulation of monopolies: The Chilean electricity distribution industry

    NASA Astrophysics Data System (ADS)

    Melo, Oscar Alfredo

This study is an enquiry about the role that service quality, asymmetric information, scope of regulation and regulator's preferences play in the regulation of monopolies, with an application to the case of the Chilean electricity distribution industry. In Chapter 1, I present the problem of regulating a monopolist and introduce the special conditions that the electricity sector has. Later I discuss the main characteristics of the electricity system that operates in Chile. The literature on regulation is reviewed in Chapter 2. A special emphasis is given to the problems of quality and information, and the lack of their proper joint treatment. In Chapter 3, I develop four theoretical models of regulation that explicitly consider the regulation of price and quality versus price-only regulation, and a symmetric versus an asymmetric information structure in which only the regulated firm knows its true costs. In these models, I also consider the effect of a regulator that may have a preference between consumers and the regulated monopolistic firms. I conclude that with symmetric information and independent of the scope of regulation, having a regulator that prefers consumers or producers does not affect the efficiency of the outcome. I also show that the regulator's inability to set quality, thus regulating only price, leads to an inefficient outcome, away from the first best solution that can be achieved by regulating both price and quality, even with asymmetric information, as long as the regulator does not have a "biased" preference for consumers or the monopolistic producers. If the regulator has a "bias," then the equilibrium will be inefficient with asymmetric information. But the effect on equilibrium price and quality depends on the direction of the effect of quality on the marginal effect of price in demand. More importantly, no closed-form solution can be derived unless drastic simplifications are made.
To further investigate the outcome of the models, I use numerical simulation in Chapter 4, assuming flexible functional forms and alternative sets of parameters that represent the scenarios of interest. The results show that when the regulator is biased toward consumers (producers), symmetric information models yield higher (lower) quality except for the most efficient firm. Chapter 5 uses data from the electricity sector in Chile and estimates the price and quality elasticity of demand and finds a positive effect of quality on the price elasticity of demand.

  6. Examining the functionality of the DeLone and McLean information system success model as a framework for synthesis in nursing information and communication technology research.

    PubMed

    Booth, Richard G

    2012-06-01

In this review, studies of information and communication technology used by nurses in clinical practice were examined. Overall, a total of 39 studies were assessed, spanning the period from 1995 to 2008. The impacts of the various health information and communication technologies evaluated by individual studies were synthesized using DeLone and McLean's six-dimensional framework for evaluating information systems success (i.e., System Quality, Information Quality, Service Quality, Use, User Satisfaction, and Net Benefits). The majority of researchers reported results related to the overall Net Benefits (positive, negative, and indifferent) of the health information and communication technology used by nurses. Attitudes and user satisfaction with technology were also commonly measured attributes. The current iteration of the DeLone and McLean model is effective at synthesizing basic elements of health information and communication technology use by nurses; nevertheless, it lacks the sociotechnical sensitivity to capture deeper nurse-technology relationalities. Limitations and recommendations are provided for researchers considering the DeLone and McLean model for evaluating health information and communication technology used by nurses.

  7. Information Landscaping: Information Mapping, Charting, Querying and Reporting Techniques for Total Quality Knowledge Management.

    ERIC Educational Resources Information Center

    Tsai, Bor-sheng

    2003-01-01

    Total quality management and knowledge management are merged and used as a conceptual model to direct and develop information landscaping techniques through the coordination of information mapping, charting, querying, and reporting. Goals included: merge citation analysis and data mining, and apply data visualization and information architecture…

  8. Public reporting in health care: how do consumers use quality-of-care information? A systematic review.

    PubMed

    Faber, Marjan; Bosch, Marije; Wollersheim, Hub; Leatherman, Sheila; Grol, Richard

    2009-01-01

One of the underlying goals of public reporting is to encourage the consumer to select health care providers or health plans that offer comparatively better quality-of-care. To review the weight consumers give to quality-of-care information in the process of choice, to summarize the effect of presentation formats, and to examine the impact of quality information on consumers' choice behavior. The evidence is organized in a theoretical consumer choice model. English language literature was searched in PubMed, the Cochrane Clinical Trial, and the EPOC Databases (January 1990-January 2008). Study selection was limited to randomized controlled trials, controlled before-after trials or interrupted time series. Included interventions focused on choice behavior of consumers in health care settings. Outcome measures referred to one of the steps in a consumer choice model. The quality of the study design was rated, and studies with low quality ratings were excluded. All 14 included studies examine quality information, usually CAHPS, with respect to its impact on the consumer's choice of health plans. Easy-to-read presentation formats and explanatory messages improve knowledge about and attitude towards the use of quality information; however, the weight given to quality information depends on other features, including free provider choice and costs. In real-world settings, having seen quality information is a strong determinant for choosing higher quality-rated health plans. This review contributes to an understanding of consumer choice behavior in health care settings. The small number of included studies limits the strength of our conclusions.

  9. Spatial Data Quality Control Procedure applied to the Okavango Basin Information System

    NASA Astrophysics Data System (ADS)

    Butchart-Kuhlmann, Daniel

    2014-05-01

Spatial data is a powerful form of information, capable of providing results of great interest and tremendous use to a variety of users. However, much like other data representing the 'real world', its precision and accuracy must be high for the results of data analysis to be deemed reliable and thus applicable to real-world projects and undertakings. The spatial data quality control (QC) procedure presented here was developed as the topic of a Master's thesis, in the sphere of and using data from the Okavango Basin Information System (OBIS), itself a part of The Future Okavango (TFO) project. The aim of the QC procedure was to form the basis of a method for determining the quality of spatial data relevant to hydrological, solute, and erosion transport modelling using the Jena Adaptable Modelling System (JAMS). As such, the quality of all data in OBIS classified under the topics of elevation, geoscientific information, or inland waters was evaluated. Now that the initial data quality has been evaluated, efforts are underway to correct the errors found, thus improving the quality of the dataset.
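A first-pass QC of the kind described, checking completeness and plausible value ranges before spatial data feed a hydrological model, might look like the sketch below. The field names and bounds are hypothetical illustrations, not OBIS's actual schema or thresholds.

```python
def qc_elevation_records(records, lat_bounds=(-90, 90),
                         lon_bounds=(-180, 180), elev_bounds=(-500, 9000)):
    """Flag records with missing fields or out-of-range coordinates/elevation.
    Returns (clean, flagged) lists of (record, problems) pairs."""
    clean, flagged = [], []
    for rec in records:
        problems = []
        for field, (lo, hi) in (("lat", lat_bounds), ("lon", lon_bounds),
                                ("elev_m", elev_bounds)):
            value = rec.get(field)
            if value is None:
                problems.append(f"missing {field}")
            elif not lo <= value <= hi:
                problems.append(f"{field} out of range")
        (flagged if problems else clean).append((rec, problems))
    return clean, flagged

# Toy records: one good, one with an impossible latitude, one incomplete.
records = [
    {"lat": -19.0, "lon": 22.5, "elev_m": 980.0},
    {"lat": 123.0, "lon": 22.5, "elev_m": 990.0},
    {"lat": -19.2, "lon": 22.7, "elev_m": None},
]
clean, flagged = qc_elevation_records(records)
print(len(clean), len(flagged))
```

Separating clean records from flagged ones, with a reason attached to each flag, is what lets a later correction pass target exactly the errors found, as the abstract describes.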

  10. AIR QUALITY MODELING OF PM AND AIR TOXICS AT NEIGHBORHOOD SCALES

    EPA Science Inventory

    The current interest in fine particles and toxic pollutants provides an impetus for extending air quality modeling capability towards improving exposure modeling and assessments. Human exposure models require information on concentration derived from interpolation of observati...

  11. Santa Margarita Estuary Water Quality Monitoring Data

    DTIC Science & Technology

    2018-02-01

    ADMINISTRATIVE INFORMATION The work described in this report was performed for the Water Quality Section of the Environmental Security Marine Corps Base...water quality model calibration given interest and the necessary resources. The dataset should also inform the stakeholders and Regional Board on...period. Several additional ancillary datasets were collected during the monitoring timeframe that provide key information though they were not collected

  12. INDOOR AIR QUALITY MODELING (CHAPTER 58)

    EPA Science Inventory

    The chapter discusses indoor air quality (IAQ) modeling. Such modeling provides a way to investigate many IAQ problems without the expense of large field experiments. Where experiments are planned, IAQ models can be used to help design experiments by providing information on exp...

  13. Research on manufacturing service behavior modeling based on block chain theory

    NASA Astrophysics Data System (ADS)

    Zhao, Gang; Zhang, Guangli; Liu, Ming; Yu, Shuqin; Liu, Yali; Zhang, Xu

    2018-04-01

    According to the attribute characteristics of the processing craft, manufacturing service behavior is divided into service, basic, process, and resource attributes. The attribute information model of manufacturing service is established, and the manufacturing service behavior information is divided into public and private domains. Additionally, block chain technology is introduced, and an information model of manufacturing service based on block chain principles is established, which addresses both the sharing and the confidentiality of processing-behavior information and ensures that data are not tampered with. Based on the key-pairing verification relationship, a selective publishing mechanism for manufacturing information is established, achieving traceability of product data and guaranteeing processing quality.
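The tamper-evidence property the abstract relies on can be illustrated with a minimal hash chain. This is a sketch of the general block-chain principle, not the paper's actual scheme; record fields and payloads are invented:

```python
# Minimal hash chain: each block's digest covers its payload and the
# previous block's digest, so editing any earlier record breaks the chain.

import hashlib
import json

def make_block(prev_hash, payload):
    body = json.dumps({"prev": prev_hash, "data": payload}, sort_keys=True)
    return {"prev": prev_hash, "data": payload,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

def verify(chain):
    """Recompute every digest from the genesis hash onward."""
    prev = "0" * 64
    for blk in chain:
        body = json.dumps({"prev": prev, "data": blk["data"]}, sort_keys=True)
        if blk["prev"] != prev or blk["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
        prev = blk["hash"]
    return True

chain = []
prev = "0" * 64
for step in ({"op": "milling", "temp_c": 42}, {"op": "coating", "temp_c": 55}):
    blk = make_block(prev, step)
    chain.append(blk)
    prev = blk["hash"]

print(verify(chain))            # → True
chain[0]["data"]["temp_c"] = 99  # tamper with an earlier process record
print(verify(chain))            # → False
```

A public/private split like the paper's would additionally encrypt the private-domain payload and publish only its digest.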

  14. Developing a theoretical model and questionnaire survey instrument to measure the success of electronic health records in residential aged care

    PubMed Central

    Yu, Ping; Qian, Siyu

    2018-01-01

    Electronic health records (EHR) are introduced into healthcare organizations worldwide to improve patient safety, healthcare quality and efficiency. A rigorous evaluation of this technology is important to reduce potential negative effects on patients and staff, to provide decision makers with accurate information for system improvement and to ensure return on investment. Therefore, this study develops a theoretical model and questionnaire survey instrument to assess the success of organizational EHR in routine use from the viewpoint of nursing staff in residential aged care homes. The proposed research model incorporates six variables in the reformulated DeLone and McLean information systems success model: system quality, information quality, service quality, use, user satisfaction and net benefits. Two further variables, training and self-efficacy, were also incorporated into the model. A questionnaire survey instrument was designed to measure the eight variables in the model. After a pilot test, the measurement scale was used to collect data from 243 nursing staff members in 10 residential aged care homes belonging to three management groups in Australia. Partial least squares path modeling was conducted to validate the model. The validated EHR systems success model predicts the impact of the four antecedent variables—training, self-efficacy, system quality and information quality—on the net benefits, the indicator of EHR systems success, through the mediating variables use and user satisfaction. A 24-item measurement scale was developed to quantitatively evaluate the performance of an EHR system. The parsimonious EHR systems success model and the measurement scale can be used to benchmark EHR systems success across organizations and units and over time. PMID:29315323
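The path structure described above (antecedent → mediator → outcome) can be illustrated in miniature. This sketch substitutes simple ordinary-least-squares slopes for the paper's PLS path modeling and uses invented toy survey scores; it only shows the product-of-paths idea behind an indirect effect:

```python
# Toy illustration of an indirect effect: an antecedent (system quality)
# influences the outcome (net benefits) through a mediator (user
# satisfaction). Simple OLS slopes stand in for PLS path coefficients.

def ols_slope(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Invented Likert-style (1-5) scores for six respondents.
system_quality    = [2, 3, 3, 4, 4, 5]
user_satisfaction = [2, 2, 3, 4, 5, 5]
net_benefits      = [1, 2, 3, 4, 4, 5]

a = ols_slope(system_quality, user_satisfaction)  # quality -> satisfaction
b = ols_slope(user_satisfaction, net_benefits)    # satisfaction -> benefits
print(round(a * b, 3))  # product-of-paths indirect effect → 1.182
```

Real PLS-SEM estimates all paths jointly from latent variables measured by multiple items, and bootstraps the significance of such indirect effects.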

  15. Satellite data driven modeling system for predicting air quality and visibility during wildfire and prescribed burn events

    NASA Astrophysics Data System (ADS)

    Nair, U. S.; Keiser, K.; Wu, Y.; Maskey, M.; Berendes, D.; Glass, P.; Dhakal, A.; Christopher, S. A.

    2012-12-01

    The Alabama Forestry Commission (AFC) is responsible for wildfire control and prescribed burn management in the state of Alabama. Visibility and air quality degradation resulting from smoke are two pieces of information that are crucial for this activity. Currently the tools available to AFC are the dispersion index available from the National Weather Service and surface smoke concentrations. The former provides broad guidance for prescribed burning activities but does not provide specific information regarding smoke transport, areas affected, or quantification of air quality and visibility degradation. While the NOAA operational air quality guidance includes surface smoke concentrations from existing fire events, it does not account for contributions from background aerosols, which are important for the southeastern region including Alabama. Also lacking is the quantification of visibility. The University of Alabama in Huntsville has developed a state-of-the-art integrated modeling system to address these concerns. This system is based on the Community Multiscale Air Quality (CMAQ) model; it ingests satellite-derived smoke emissions and assimilates NASA MODIS-derived aerosol optical thickness. In addition, this operational modeling system simulates the impact of potential prescribed burn events based on location information derived from the AFC prescribed burn permit database. A Lagrangian model is used to simulate smoke plumes for prescribed-burn requests. The combined air quality and visibility degradation resulting from these smoke plumes and background aerosols is computed, and the information is made available through a web-based decision support system utilizing open source GIS components. This system provides information regarding intersections between highways and other critical facilities such as old age homes, hospitals and schools. The system also includes satellite-detected fire locations and other satellite-derived datasets relevant for fire and smoke management.

  16. Intelligent evaluation of color sensory quality of black tea by visible-near infrared spectroscopy technology: A comparison of spectra and color data information

    NASA Astrophysics Data System (ADS)

    Ouyang, Qin; Liu, Yan; Chen, Quansheng; Zhang, Zhengzhu; Zhao, Jiewen; Guo, Zhiming; Gu, Hang

    2017-06-01

    Instrumental testing of black tea samples, as an alternative to human panel tests, has recently attracted considerable attention. This study investigated the feasibility of estimating the color sensory quality of black tea samples using the VIS-NIR spectroscopy technique, comparing the performances of models based on spectra and on color information. In model calibration, the variables were first selected by genetic algorithm (GA); then nonlinear back propagation-artificial neural network (BPANN) models were established based on the optimal variables. In comparison with the other models, GA-BPANN models built from the spectra information showed the best performance, with a correlation coefficient of 0.8935 and a root mean square error of 0.392 in the prediction set. In addition, models based on the spectra information performed better than those based on the color parameters. Therefore, the VIS-NIR spectroscopy technique is a promising tool for rapid and accurate evaluation of the sensory quality of black tea samples.
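The GA variable-selection step described above can be sketched in a few lines. This is a toy illustration only: the data are synthetic, and the fitness here is the squared correlation between the mean of the selected variables and the target score, whereas the paper feeds the selected variables into a BP-ANN:

```python
# Toy genetic algorithm for spectral variable selection. A bit mask over
# the variables evolves by elitist selection, one-point crossover, and
# bit-flip mutation; fitness rewards masks whose selected variables
# correlate with the target.

import random

random.seed(0)

def corr2(xs, ys):
    """Squared Pearson correlation between two sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return 0.0 if sxx == 0 or syy == 0 else sxy * sxy / (sxx * syy)

def fitness(mask, spectra, scores):
    if not any(mask):
        return 0.0
    feats = [sum(row[i] for i, m in enumerate(mask) if m) / sum(mask)
             for row in spectra]
    return corr2(feats, scores)

def ga_select(spectra, scores, pop=20, gens=30, pmut=0.1):
    n = len(spectra[0])
    population = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda m: -fitness(m, spectra, scores))
        parents = population[: pop // 2]          # elitist selection
        children = []
        while len(children) < pop - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n)          # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (random.random() < pmut) for bit in child]
            children.append(child)
        population = parents + children
    return max(population, key=lambda m: fitness(m, spectra, scores))

# Synthetic "spectra": variable 0 carries the signal, the rest are noise.
scores = [1.0, 2.0, 3.0, 4.0, 5.0]
spectra = [[s] + [random.random() for _ in range(5)] for s in scores]
best = ga_select(spectra, scores)
print(round(fitness(best, spectra, scores), 3))  # high: the signal variable is retained
```

In the paper, the mask chosen this way would define the optimal wavelength subset passed to the BP-ANN regression model.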

  17. Intelligent evaluation of color sensory quality of black tea by visible-near infrared spectroscopy technology: A comparison of spectra and color data information.

    PubMed

    Ouyang, Qin; Liu, Yan; Chen, Quansheng; Zhang, Zhengzhu; Zhao, Jiewen; Guo, Zhiming; Gu, Hang

    2017-06-05

    Instrumental testing of black tea samples, as an alternative to human panel tests, has recently attracted considerable attention. This study investigated the feasibility of estimating the color sensory quality of black tea samples using the VIS-NIR spectroscopy technique, comparing the performances of models based on spectra and on color information. In model calibration, the variables were first selected by genetic algorithm (GA); then nonlinear back propagation-artificial neural network (BPANN) models were established based on the optimal variables. In comparison with the other models, GA-BPANN models built from the spectra information showed the best performance, with a correlation coefficient of 0.8935 and a root mean square error of 0.392 in the prediction set. In addition, models based on the spectra information performed better than those based on the color parameters. Therefore, the VIS-NIR spectroscopy technique is a promising tool for rapid and accurate evaluation of the sensory quality of black tea samples. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Urban Air Quality Modelling with AURORA: Prague and Bratislava

    NASA Astrophysics Data System (ADS)

    Veldeman, N.; Viaene, P.; De Ridder, K.; Peelaerts, W.; Lauwaet, D.; Muhammad, N.; Blyth, L.

    2012-04-01

    The European Commission, in its strategy to protect the health of the European citizens, states that in order to assess the impact of air pollution on public health, information on long-term exposure to air pollution should be available. Currently, indicators of air quality are often being generated using measured pollutant concentrations. While air quality monitoring station data provide accurate time series information at specific locations, air quality models have the advantage of being able to assess the spatial variability of air quality (for different resolutions) and predict air quality in the future based on different scenarios. When running such air quality models at a high spatial and temporal resolution, one can simulate the actual situation as closely as possible, allowing for a detailed assessment of the risk of exposure to citizens from different pollutants. AURORA (Air quality modelling in Urban Regions using an Optimal Resolution Approach), a prognostic 3-dimensional Eulerian chemistry-transport model, is designed to simulate urban- to regional-scale atmospheric pollutant concentration and exposure fields. The AURORA model also allows calculation of the impact of changes in land use (e.g. planting of trees) or of emission-reduction scenarios on air quality. AURORA is currently being applied within the ESA atmospheric GMES service, PASODOBLE (http://www.myair-eu.org), which delivers information on air quality, greenhouse gases, stratospheric ozone, and related quantities. At present there are two operational AURORA services within PASODOBLE. Within the "Air quality forecast service" VITO delivers daily air quality forecasts for Belgium at a resolution of 5 km and for the major Belgian cities: Brussels, Ghent, Antwerp, Liege and Charleroi. Furthermore, forecast services are provided for Prague, Czech Republic and Bratislava, Slovakia, both at a resolution of 1 km. 
The "Urban/regional air quality assessment service" provides urban- and regional-scale maps (hourly resolution) for air pollution and human exposure statistics for an entire year. So far we have concentrated on Brussels, Belgium and the Rotterdam harbour area, The Netherlands. In this contribution we focus on the operational forecast services. Reference: Lefebvre W. et al. (2011) Validation of the MIMOSA-AURORA-IFDM model chain for policy support: Modeling concentrations of elemental carbon in Flanders, Atmospheric Environment 45, 6705-6713.

  19. Development and Validation of a Consumer Quality Assessment Instrument for Dentistry.

    ERIC Educational Resources Information Center

    Johnson, Jeffrey D.; And Others

    1990-01-01

    This paper reviews the literature on consumer involvement in dental quality assessment, argues for inclusion of this information in quality assessment measures, outlines a conceptual model for measuring dental consumer quality assessment, and presents data relating to the development and validation of an instrument based on the conceptual model.…

  20. 40 CFR 52.60 - Significant deterioration of air quality.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... quality. 52.60 Section 52.60 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... deterioration of air quality. (a) All applications and other information required pursuant to § 52.21 from... “Guideline on Air Quality Models (Revised)” or other models approved by EPA. [42 FR 22869, May 5, 1977, as...

  1. 40 CFR 52.60 - Significant deterioration of air quality.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... quality. 52.60 Section 52.60 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... deterioration of air quality. (a) All applications and other information required pursuant to § 52.21 from... “Guideline on Air Quality Models (Revised)” or other models approved by EPA. [42 FR 22869, May 5, 1977, as...

  2. 40 CFR 52.60 - Significant deterioration of air quality.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... quality. 52.60 Section 52.60 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... deterioration of air quality. (a) All applications and other information required pursuant to § 52.21 from... “Guideline on Air Quality Models (Revised)” or other models approved by EPA. [42 FR 22869, May 5, 1977, as...

  3. 40 CFR 52.60 - Significant deterioration of air quality.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... quality. 52.60 Section 52.60 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... deterioration of air quality. (a) All applications and other information required pursuant to § 52.21 from... “Guideline on Air Quality Models (Revised)” or other models approved by EPA. [42 FR 22869, May 5, 1977, as...

  4. 40 CFR 52.60 - Significant deterioration of air quality.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... quality. 52.60 Section 52.60 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... deterioration of air quality. (a) All applications and other information required pursuant to § 52.21 from... “Guideline on Air Quality Models (Revised)” or other models approved by EPA. [42 FR 22869, May 5, 1977, as...

  5. An Ontology of Quality Initiatives and a Model for Decentralized, Collaborative Quality Management on the (Semantic) World Wide Web

    PubMed Central

    2001-01-01

    This editorial provides a model of how quality initiatives concerned with health information on the World Wide Web may in the future interact with each other. This vision fits into the evolving "Semantic Web" architecture - i.e., the prospect that the World Wide Web may evolve from a mess of unstructured, human-readable information sources into a global knowledge base with an additional layer providing richer and more meaningful relationships between resources. One first prerequisite for forming such a "Semantic Web" or "web of trust" among the players active in quality management of health information is that these initiatives make statements about themselves and about each other in a machine-processable language. I present a concrete model of what this collaboration could look like, and provide some recommendations on what the role of the World Health Organization (WHO) and other policy makers in this framework could be. PMID:11772549

  6. An ontology of quality initiatives and a model for decentralized, collaborative quality management on the (semantic) World-Wide-Web.

    PubMed

    Eysenbach, G

    2001-01-01

    This editorial provides a model of how quality initiatives concerned with health information on the World Wide Web may in the future interact with each other. This vision fits into the evolving "Semantic Web" architecture - i.e., the prospect that the World Wide Web may evolve from a mess of unstructured, human-readable information sources into a global knowledge base with an additional layer providing richer and more meaningful relationships between resources. One first prerequisite for forming such a "Semantic Web" or "web of trust" among the players active in quality management of health information is that these initiatives make statements about themselves and about each other in a machine-processable language. I present a concrete model of what this collaboration could look like, and provide some recommendations on what the role of the World Health Organization (WHO) and other policy makers in this framework could be.

  7. A framework for improving the quality of health information on the world-wide-web and bettering public (e-)health: the MedCERTAIN approach.

    PubMed

    Eysenbach, G; Köhler, C; Yihune, G; Lampe, K; Cross, P; Brickley, D

    2001-01-01

    There has been considerable debate about the variable quality of health information on the world-wide-web and its impact on public health. While central authorities that regulate, control, censor, or centrally approve information, information providers, or websites are neither realistic nor desirable, public health professionals are interested in making systems available that direct patient streams to the best available information sources. National governments and medical societies have also recognized their responsibility to help users identify "good quality" information sources. But what constitutes good quality, and how can such a system be implemented in a decentralized and democratic manner? This paper presents a model which combines aspects of consumer education, encouragement of best practices among information providers, self-labeling, and external evaluations. The model is currently being implemented and evaluated in the MedCERTAIN project, funded by the European Union under the Action Plan for Safer Use of the Internet. The aim is to develop a technical and organisational infrastructure for a pilot system that allows consumers to access metainformation about websites and health information providers, including disclosure information from health providers and opinions of external evaluators. The paper explains the general conceptual framework of the model and presents preliminary experiences, including results from an expert consensus meeting where the framework was discussed.

  8. Comprehensive evaluation of land quality basing on 3S technology and farmers' survey: A case study in Crisscross Region of Wind-drift Sand Regions along the Great Wall and Loess Plateau

    NASA Astrophysics Data System (ADS)

    Zhang, Yan-yu; Wang, Jing; Shi, Yan-xi; Li, Yu-huan; Lv, Chun-yan

    2005-10-01

    The Crisscross Region of Wind-drift Sand Regions along the Great Wall and Loess Plateau is located in the southern Ordos Plateau and the northern Chinese Loess Plateau, where wind erosion and water erosion coexist and specific environmental and socio-economic factors, especially human activities, induce serious land degradation. However, only a few studies provide an overall assessment. Integrated land quality assessment considering the impacts of soil, topography, vegetation, environmental hazards, socio-economic factors and land management is imperative for sustainable regional land management. A pilot study was conducted in Hengshan County (Shanxi Province) with the objective of developing a comprehensive land quality evaluation model integrating data from farmers' surveys and remote sensing. Surveys were carried out in 107 households of the study area in 2003 and 2004 to obtain farmers' perceptions of land quality and to collect related information. It was found that farmers evaluated land quality by slope, water availability, soil texture, yields, amount of fertilizer, crop performance, sandy erosion degree and water erosion degree. Scientists' indicators, which emphasize obtaining information by RS technology, were introduced to reflect the above indicator information in order to develop a rapid, efficient and locally fitted land quality assessment model covering socio-economic, environmental and anthropogenic factors. Data from satellites and surveys were integrated with socio-economic statistical data using a geographical information system (GIS), and three indexes, namely the Production Press Index (PPI), Land State Index (LSI) and Farmer Behavior Index (FBI), were proposed to measure different aspects of land quality. A model was further derived from the three indexes to explore the overall land quality of the study area. Results suggest that local land prevalently had poor quality. 
This paper shows that the model performed well in the study area and that the evaluation results can supply useful information for management decisions.

  9. High-performing trauma teams: frequency of behavioral markers of a shared mental model displayed by team leaders and quality of medical performance.

    PubMed

    Johnsen, Bjørn Helge; Westli, Heidi Kristina; Espevik, Roar; Wisborg, Torben; Brattebø, Guttorm

    2017-11-10

    High-quality team leadership is important for the outcome of medical emergencies. However, the behavioral markers of leadership are not well defined. The present study investigated the effect of the frequency of shared mental model (SMM) behavioral markers on the quality of medical management. Training video recordings of 27 trauma teams simulating emergencies were analyzed according to the team leader's frequency of shared mental model behavioral markers. The results showed a positive correlation of quality of medical management with leaders sharing information without an explicit demand for the information ("push" of information) and with leaders communicating their situational awareness (SA) and demonstrating implicit supporting behavior. When separating the sample into higher versus lower performing teams, the higher performing teams had leaders who displayed a greater frequency of "push" of information and communication of SA and supportive behavior. No difference was found for the behavioral marker of team initiative, measured as bringing up suggestions to other team members. The results of this study emphasize the team leader's role in initiating and updating a team's shared mental model. Team leaders should also set expectations for acceptable interaction patterns (e.g., promoting information exchange) and create a team climate that encourages behaviors such as mutual performance monitoring, backup behavior, and adaptability to enhance SMM.

  10. DEVELOPMENT OF GUIDELINES FOR CALIBRATING, VALIDATING, AND EVALUATING HYDROLOGIC AND WATER QUALITY MODELS: ASABE ENGINEERING PRACTICE 621

    USDA-ARS?s Scientific Manuscript database

    Information to support application of hydrologic and water quality (H/WQ) models abounds, yet modelers commonly use arbitrary, ad hoc methods to conduct, document, and report model calibration, validation, and evaluation. Consistent methods are needed to improve model calibration, validation, and e...

  11. Spatial Allocator for air quality modeling

    EPA Pesticide Factsheets

    The Spatial Allocator is a set of tools that helps users manipulate and generate data files related to emissions and air quality modeling without requiring the use of a commercial Geographic Information System.

  12. A Comparative Study of the Proposed Models for the Components of the National Health Information System

    PubMed Central

    Ahmadi, Maryam; Damanabi, Shahla; Sadoughi, Farahnaz

    2014-01-01

    Introduction: A National Health Information System plays an important role in ensuring timely and reliable access to health information, which is essential for strategic and operational decisions that improve health and the quality and effectiveness of health care. In other words, using a National Health Information System one can improve the quality of the health data, information and knowledge used to support decision making at all levels and areas of the health sector. Since full identification of the components of this system – necessary for better planning and management of the factors influencing performance – is needed, in this study different attitudes towards the components of this system are explored comparatively. Methods: This is a descriptive, comparative study. The material includes printed and electronic documents containing components of the national health information system in three parts: input, process and output. In this context, searches were conducted using library resources and the internet, and the analysis was expressed using comparative tables and qualitative data. Results: The findings showed three different perspectives on the components of a national health information system: the Lippeveld, Sauerborn and Bodart model (2000), the Health Metrics Network (HMN) model from the World Health Organization (2008), and Gattini's model (2009). In the input section (resources and structure), all three models require components of management and leadership; planning and design of programs; and the supply of staff, software and hardware facilities and equipment. In the "process" section, all three models point to actions ensuring the quality of the health information system, and in the output section, all models except Lippeveld's consider information products and the use and distribution of information as components of the national health information system. 
Conclusion: The results showed that all three models treat the components of health information only briefly in the input section, and the Lippeveld model overlooks the components of a national health information system in the process and output sections. Therefore, the Health Metrics Network model appears to offer the most comprehensive presentation of the components of a health information system across all three sections: input, process and output. PMID:24825937

  13. A comparative study of the proposed models for the components of the national health information system.

    PubMed

    Ahmadi, Maryam; Damanabi, Shahla; Sadoughi, Farahnaz

    2014-04-01

    A National Health Information System plays an important role in ensuring timely and reliable access to health information, which is essential for strategic and operational decisions that improve health and the quality and effectiveness of health care. In other words, using a National Health Information System one can improve the quality of the health data, information and knowledge used to support decision making at all levels and areas of the health sector. Since full identification of the components of this system – necessary for better planning and management of the factors influencing performance – is needed, in this study different attitudes towards the components of this system are explored comparatively. This is a descriptive, comparative study. The material includes printed and electronic documents containing components of the national health information system in three parts: input, process and output. In this context, searches were conducted using library resources and the internet, and the analysis was expressed using comparative tables and qualitative data. The findings showed three different perspectives on the components of a national health information system: the Lippeveld, Sauerborn and Bodart model (2000), the Health Metrics Network (HMN) model from the World Health Organization (2008), and Gattini's model (2009). In the input section (resources and structure), all three models require components of management and leadership; planning and design of programs; and the supply of staff, software and hardware facilities and equipment. In the "process" section, all three models point to actions ensuring the quality of the health information system, and in the output section, all models except Lippeveld's consider information products and the use and distribution of information as components of the national health information system. 
The results showed that all three models treat the components of health information only briefly in the input section, and the Lippeveld model overlooks the components of a national health information system in the process and output sections. Therefore, the Health Metrics Network model appears to offer the most comprehensive presentation of the components of a health information system across all three sections: input, process and output.

  14. Mental Mechanisms for Topics Identification

    PubMed Central

    2014-01-01

    Topics identification (TI) is the process that consists in determining the main themes present in natural language documents. The current TI modeling paradigm aims at acquiring semantic information from statistic properties of large text datasets. We investigate the mental mechanisms responsible for the identification of topics in a single document given existing knowledge. Our main hypothesis is that topics are the result of accumulated neural activation of loosely organized information stored in long-term memory (LTM). We experimentally tested our hypothesis with a computational model that simulates LTM activation. The model assumes activation decay as an unavoidable phenomenon originating from the bioelectric nature of neural systems. Since decay should negatively affect the quality of topics, the model predicts the presence of short-term memory (STM) to keep the focus of attention on a few words, with the expected outcome of restoring quality to a baseline level. Our experiments measured topics quality of over 300 documents with various decay rates and STM capacity. Our results showed that accumulated activation of loosely organized information was an effective mental computational commodity to identify topics. It was furthermore confirmed that rapid decay is detrimental to topics quality but that limited capacity STM restores quality to a baseline level, even exceeding it slightly. PMID:24744775
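The hypothesized mechanism above (accumulated LTM activation with decay, plus a small STM that keeps re-activating the focus of attention) is simple enough to sketch. All parameter values here are assumed for illustration, not taken from the paper:

```python
# Toy simulation of topic identification by accumulated activation:
# every incoming word activates its LTM trace, all traces decay
# exponentially, and a small STM re-boosts the few most active words.

def topics(words, decay=0.9, stm_size=3):
    activation = {}
    for w in words:
        for k in activation:          # decay all stored activations
            activation[k] *= decay
        activation[w] = activation.get(w, 0.0) + 1.0  # incoming activation
        # STM keeps the focus of attention on the most active words
        stm = sorted(activation, key=activation.get, reverse=True)[:stm_size]
        for k in stm:
            activation[k] += 0.5
    return sorted(activation, key=activation.get, reverse=True)

doc = ("tea quality model tea spectra quality tea model "
       "noise quality tea spectra").split()
print(topics(doc)[:3])  # → ['tea', 'quality', 'model']
```

Consistent with the paper's prediction, raising `decay` loss (lower `decay` values) degrades the ranking, while the STM boost keeps frequent words dominant.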

  15. An integrated view of data quality in Earth observation

    PubMed Central

    Yang, X.; Blower, J. D.; Bastin, L.; Lush, V.; Zabala, A.; Masó, J.; Cornford, D.; Díaz, P.; Lumsden, J.

    2013-01-01

    Data quality is a difficult notion to define precisely, and different communities have different views and understandings of the subject. This causes confusion, a lack of harmonization of data across communities and omission of vital quality information. For some existing data infrastructures, data quality standards cannot address the problem adequately and cannot fulfil all user needs or cover all concepts of data quality. In this study, we discuss some philosophical issues on data quality. We identify actual user needs on data quality, review existing standards and specifications on data quality, and propose an integrated model for data quality in the field of Earth observation (EO). We also propose a practical mechanism for applying the integrated quality information model to a large number of datasets through metadata inheritance. While our data quality management approach is in the domain of EO, we believe that the ideas and methodologies for data quality management can be applied to wider domains and disciplines to facilitate quality-enabled scientific research. PMID:23230156

  16. An integrated view of data quality in Earth observation.

    PubMed

    Yang, X; Blower, J D; Bastin, L; Lush, V; Zabala, A; Masó, J; Cornford, D; Díaz, P; Lumsden, J

    2013-01-28

    Data quality is a difficult notion to define precisely, and different communities have different views and understandings of the subject. This causes confusion, a lack of harmonization of data across communities and omission of vital quality information. For some existing data infrastructures, data quality standards cannot address the problem adequately and cannot fulfil all user needs or cover all concepts of data quality. In this study, we discuss some philosophical issues on data quality. We identify actual user needs on data quality, review existing standards and specifications on data quality, and propose an integrated model for data quality in the field of Earth observation (EO). We also propose a practical mechanism for applying the integrated quality information model to a large number of datasets through metadata inheritance. While our data quality management approach is in the domain of EO, we believe that the ideas and methodologies for data quality management can be applied to wider domains and disciplines to facilitate quality-enabled scientific research.

  17. Nondestructive detection of pork quality based on dual-band VIS/NIR spectroscopy

    NASA Astrophysics Data System (ADS)

    Wang, Wenxiu; Peng, Yankun; Li, Yongyu; Tang, Xiuying; Liu, Yuanyuan

    2015-05-01

    With the continuous improvement of living standards and the corresponding change in dietary structure, consumers' demand for better meat quality keeps rising. Colour, pH value, and cooking loss are important attributes when evaluating meat quality, and simultaneous nondestructive detection of multiple quality parameters is in high demand in the production and processing of meat and meat products. The objective of this research was to compare the effectiveness of two spectral bands for rapid, nondestructive, and simultaneous detection of pork quality attributes. Reflectance spectra of 60 chilled pork samples were collected with a dual-band visible/near-infrared spectroscopy system covering 350-1100 nm and 1000-2600 nm. Colour, pH value, and cooking loss were then determined by standard methods as reference values. The standard normal variate transform (SNVT) was employed to reduce spectral noise. A spectrum connection method was proposed to integrate the dual-band spectra effectively and make full use of the available information. Partial least squares regression (PLSR) and principal component analysis (PCA) were applied to establish prediction models based on the single-band and dual-band spectra, respectively. The experimental results showed that the PLSR model based on dual-band spectral information was superior to the models based on single-band spectral information, with lower root mean square error (RMSE) and higher accuracy. The PLSR model based on the dual-band spectrum (using the overlapping part of the first band) yielded the best prediction results, with correlation coefficients of validation (Rv) of 0.9469, 0.9495, 0.9180, 0.9054 and 0.8789 for L*, a*, b*, pH value and cooking loss, respectively. This is mainly because the dual-band spectrum provides sufficient, comprehensive information reflecting the quality attributes. Data fusion from the dual-band spectrum could significantly improve the prediction of pork quality parameters. The research also indicated that multi-band spectral information fusion has potential for comprehensively evaluating other quality and safety attributes of pork.
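The preprocessing pipeline described above (per-band SNV followed by spectrum connection) can be sketched as follows; the reflectance values are made up, and the simple concatenation stands in for, rather than reproduces, the paper's exact connection method.

```python
import numpy as np

def snv(spectra):
    """Standard normal variate transform: center and scale each sample's
    spectrum to zero mean and unit variance (reduces scatter effects)."""
    spectra = np.asarray(spectra, dtype=float)
    mean = spectra.mean(axis=1, keepdims=True)
    std = spectra.std(axis=1, keepdims=True)
    return (spectra - mean) / std

def connect_bands(band1, band2):
    """Concatenate dual-band spectra after per-band SNV -- a simple
    stand-in for the paper's spectrum connection step."""
    return np.hstack([snv(band1), snv(band2)])

# two samples, toy reflectance values for each band
vis = [[0.2, 0.4, 0.6], [0.3, 0.5, 0.9]]
nir = [[0.1, 0.2, 0.4], [0.2, 0.3, 0.5]]
X = connect_bands(vis, nir)
print(X.shape)  # (2, 6); each half has zero mean and unit variance
```

The connected matrix `X` would then feed a PLSR model (e.g. scikit-learn's `PLSRegression`) against the colour, pH, and cooking-loss reference values.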

  18. DEVELOPMENT OF A LAND-SURFACE MODEL PART I: APPLICATION IN A MESOSCALE METEOROLOGY MODEL

    EPA Science Inventory

    Parameterization of land-surface processes and consideration of surface inhomogeneities are very important to mesoscale meteorological modeling applications, especially those that provide information for air quality modeling. To provide crucial, reliable information on the diurn...

  19. Building Information Modeling (BIM) Primer. Report 1: Facility Life-Cycle Process and Technology Innovation

    DTIC Science & Technology

    2012-08-01

    Building Information Modeling (BIM) Primer. Report 1: Facility Life-Cycle Process and Technology Innovation (ERDC/ITL TR-12-2, August 2012; distribution is unlimited). ...to enhance the quality of projects through the design, construction, and handover phases. Building Information Modeling (BIM) is a

  20. Information sharing model in supporting implementation of e-procurement service: Case of Bandung city

    NASA Astrophysics Data System (ADS)

    Ramantoko, Gadang; Irawan, Herry

    2017-10-01

    This research examines the factors influencing the information sharing model supporting the implementation of e-procurement services in Bandung City at its early maturity stage. The early maturity stage of information sharing was determined using the e-Government Maturity Stage Conceptual Framework of Estevez; the Bandung City e-procurement information sharing system was categorized at stage 1 in Estevez's model, where the concern is mainly on assessing the benefits and risks of implementing the system. The authors used the DeLone & McLean (D&M) Information System Success model to study the benefits and risks of implementing the system in Bandung City. The model was then empirically tested using survey data collected from the 40 available listed supplier firms. The D&M model, adjusted following Klischewski's description, introduced Information Quality, System Quality, and Service Quality as independent variables; Usability and User Satisfaction as intermediate dependent variables; and Perceived Net Benefit as the final dependent variable. The findings suggested that all of the predictors in the D&M model significantly influenced the perceived net benefit of implementing the e-procurement system in the early maturity stage. The theoretical contribution of this research is that the D&M model might prove useful in modeling the success of complex information technology such as e-procurement services. The research could also have implications for policy makers (LPSE) and system providers (LKPP) following the introduction of the service. However, the small number of respondents is a limitation of the study: the model needs to be further tested with a larger sample drawn from the population of firms in the extended boundary/municipality area around Bandung.

  1. Testing a Nursing-Specific Model of Electronic Patient Record documentation with regard to information completeness, comprehensiveness and consistency.

    PubMed

    von Krogh, Gunn; Nåden, Dagfinn; Aasland, Olaf Gjerløw

    2012-10-01

    To present the results from the test-site application of the documentation model KPO (quality assurance, problem solving and caring), designed to improve the quality of nursing information in the electronic patient record (EPR). The KPO model was developed by means of a consensus group and clinical testing. Four documentation arenas and eight content categories, nursing terminologies and a decision-support system were designed to improve the completeness, comprehensiveness and consistency of nursing information. The testing was performed in a pre-test/post-test time series design, three times at one-year intervals. Content analysis of the nursing documentation was accomplished through the identification, interpretation and coding of information units. Data from the pre-test and post-test 2 were subjected to statistical analyses; paired t-tests were used to estimate the differences. At post-test 2, the information was found to be more complete, comprehensive and consistent than at pre-test. The findings indicate that documentation arenas combining workflow and content categories deduced from theories on nursing practice can influence the quality of nursing information. The KPO model can be used as a guide when shifting from paper-based to electronic nursing documentation with the aim of obtaining complete, comprehensive and consistent nursing information. © 2012 Blackwell Publishing Ltd.

  2. A Completely Blind Video Integrity Oracle.

    PubMed

    Mittal, Anish; Saad, Michele A; Bovik, Alan C

    2016-01-01

    Considerable progress has been made toward developing still-picture perceptual quality analyzers that do not require any reference picture and that are not trained on human opinion scores of distorted images. However, there do not yet exist any such completely blind video quality assessment (VQA) models. Here, we attempt to bridge this gap by developing a new VQA model called the video intrinsic integrity and distortion evaluation oracle (VIIDEO). The new model does not require any information beyond the video whose quality is being evaluated. VIIDEO embodies models of intrinsic statistical regularities that are observed in natural videos, which are used to quantify disturbances introduced by distortions. An algorithm derived from the VIIDEO model is thereby able to predict the quality of distorted videos without any external knowledge about the pristine source, anticipated distortions, or human judgments of video quality. Even with such a paucity of information, we are able to show that the VIIDEO algorithm performs much better than the legacy full-reference quality measure MSE on the LIVE VQA database and delivers performance comparable with a leading blind VQA model trained on human judgments. We believe that the VIIDEO algorithm is a significant step toward making real-time monitoring of completely blind video quality possible.
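As a loose illustration of exploiting intrinsic statistical regularities without any reference: quantities derived from natural video, such as frame differences, tend to be heavy-tailed, so a simple sample statistic like excess kurtosis can separate natural-like from distorted signals. This toy example is not the VIIDEO algorithm; the two distributions are synthetic stand-ins.

```python
import numpy as np

def excess_kurtosis(x):
    """Sample excess kurtosis: 0 for Gaussian data, positive for
    heavy-tailed (natural-scene-like) data."""
    x = np.asarray(x, dtype=float).ravel()
    z = (x - x.mean()) / x.std()
    return (z ** 4).mean() - 3.0

rng = np.random.default_rng(1)
natural_like = rng.laplace(size=10_000)   # heavy-tailed, like frame differences
distorted    = rng.uniform(-1, 1, size=10_000)  # regularity destroyed
print(excess_kurtosis(natural_like) > excess_kurtosis(distorted))  # True
```

A blind model scores quality by how far such statistics deviate from the values expected of pristine natural video, with no access to the source.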

  3. Sensitivity of air quality simulation to smoke plume rise

    Treesearch

    Yongqiang Liu; Gary Achtemeier; Scott Goodrick

    2008-01-01

    Plume rise is the height smoke plumes can reach. This information is needed by air quality models such as the Community Multiscale Air Quality (CMAQ) model to simulate the physical and chemical processes of point-source fire emissions. This study seeks to understand the sensitivity of CMAQ air quality simulations of prescribed burning to plume rise. CMAQ...

  4. Inventory of environmental impact models related to energy technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owen, P.T.; Dailey, N.S.; Johnson, C.A.

    The purpose of this inventory is to identify and collect data on computer simulations and computational models related to the environmental effects of energy source development, energy conversion, or energy utilization. Information for 33 data fields was sought for each model reported. All of the information which could be obtained within the time allotted for completion of the project is presented for each model listed. Efforts will be continued toward acquiring the needed information. Readers who are interested in these particular models are invited to contact ESIC for assistance in locating them. In addition to the standard bibliographic information, other data fields of interest to modelers, such as computer hardware and software requirements, algorithms, applications, and existing model validation information, are included. Indexes are provided for contact person, acronym, keyword, and title. The models are grouped into the following categories: atmospheric transport, air quality, aquatic transport, terrestrial food chains, soil transport, aquatic food chains, water quality, dosimetry and human effects, animal effects, plant effects, and generalized environmental transport. Within these categories, the models are arranged alphabetically by last name of the contact person.

  5. Optimal design of focused experiments and surveys

    NASA Astrophysics Data System (ADS)

    Curtis, Andrew

    1999-10-01

    Experiments and surveys are often performed to obtain data that constrain some previously underconstrained model. Often, constraints are most desired in a particular subspace of model space. Experiment design optimization requires that the quality of any particular design can be both quantified and maximized. This study shows how the quality can be defined such that it depends on the amount of information that is focused in the particular subspace of interest. In addition, algorithms are presented which allow one particular focused quality measure (from the class of focused measures) to be evaluated efficiently. A subclass of focused quality measures is also related to the standard variance and resolution measures from linearized inverse theory. The theory presented here requires that the relationship between model parameters and data can be linearized around a reference model without significant loss of information. Physical and financial constraints define the space of possible experiment designs. Cross-well tomographic examples are presented, plus a strategy for survey design to maximize information about linear combinations of parameters such as the bulk modulus, κ = λ + 2μ/3.
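Under the linearization assumption, one plausible focused quality measure (an illustration, not necessarily the paper's exact definition) is the trace of the model resolution matrix restricted to the subspace of interest:

```python
import numpy as np

def focused_quality(G, P):
    """Quality of a design with sensitivity matrix G, focused on the model
    subspace spanned by the columns of P. Uses the model resolution matrix
    R = pinv(G) @ G from linearized inverse theory; the focused measure is
    trace(P.T @ R @ P), i.e. how well that subspace is resolved."""
    R = np.linalg.pinv(G) @ G
    return np.trace(P.T @ R @ P)

# two candidate designs over a 3-parameter model; focus on parameter 0
P = np.array([[1.0], [0.0], [0.0]])
G_a = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])  # senses parameter 0
G_b = np.array([[0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])  # does not
print(focused_quality(G_a, P), focused_quality(G_b, P))  # 1.0 0.0
```

Design optimization then searches the space of physically and financially feasible `G` matrices for the one maximizing this measure.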

  6. SPARROW MODELING - Enhancing Understanding of the Nation's Water Quality

    USGS Publications Warehouse

    Preston, Stephen D.; Alexander, Richard B.; Woodside, Michael D.; Hamilton, Pixie A.

    2009-01-01

    The information provided here is intended to assist water-resources managers with interpretation of the U.S. Geological Survey (USGS) SPARROW model and its products. SPARROW models can be used to explain spatial patterns in monitored stream-water quality in relation to human activities and natural processes as defined by detailed geospatial information. Previous SPARROW applications have identified the sources and transport of nutrients in the Mississippi River basin, Chesapeake Bay watershed, and other major drainages of the United States. New SPARROW models with improved accuracy and interpretability are now being developed by the USGS National Water Quality Assessment (NAWQA) Program for six major regions of the conterminous United States. These new SPARROW models are based on updated geospatial data and stream-monitoring records from local, State, and other federal agencies.

  7. Validating archetypes for the Multiple Sclerosis Functional Composite.

    PubMed

    Braun, Michael; Brandt, Alexander Ulrich; Schulz, Stefan; Boeker, Martin

    2014-08-03

    Numerous information models for electronic health records, such as openEHR archetypes, are available. The quality of such clinical models is important to guarantee standardised semantics and to facilitate their interoperability. However, validation aspects have not yet received sufficient attention. The objective of this report is to investigate the feasibility of archetype development and its community-based validation process, presuming that this review process is a practical way to ensure high-quality information models amending the formal reference model definitions. A standard archetype development approach was applied to a case set of three clinical tests for multiple sclerosis assessment: after an analysis of the tests, the obtained data elements were organised and structured. The appropriate archetype class was selected and the data elements were implemented in an iterative refinement process. Clinical and information modelling experts validated the models in a structured review process. Four new archetypes were developed and publicly deployed in the openEHR Clinical Knowledge Manager, an online platform provided by the openEHR Foundation. Afterwards, these four archetypes were validated by domain experts in a team review. The review was a formalised process, organised in the Clinical Knowledge Manager. Both the development and the review process turned out to be time-consuming, mostly due to difficult selection between alternative modelling approaches. The archetype review was a straightforward team process with the goal of validating archetypes pragmatically. The quality of medical information models is crucial to guarantee standardised semantic representation and thereby improve interoperability. The validation process is a practical way to better harmonise models that diverge due to the necessary flexibility left open by the underlying formal reference model definitions. This case study provides evidence that both community- and tool-enabled review processes, structured in the Clinical Knowledge Manager, ensure archetype quality. It offers a pragmatic but feasible way to reduce variation in the representation of clinical information models towards a more unified and interoperable model.

  8. Validating archetypes for the Multiple Sclerosis Functional Composite

    PubMed Central

    2014-01-01

    Background Numerous information models for electronic health records, such as openEHR archetypes, are available. The quality of such clinical models is important to guarantee standardised semantics and to facilitate their interoperability. However, validation aspects have not yet received sufficient attention. The objective of this report is to investigate the feasibility of archetype development and its community-based validation process, presuming that this review process is a practical way to ensure high-quality information models amending the formal reference model definitions. Methods A standard archetype development approach was applied to a case set of three clinical tests for multiple sclerosis assessment: after an analysis of the tests, the obtained data elements were organised and structured. The appropriate archetype class was selected and the data elements were implemented in an iterative refinement process. Clinical and information modelling experts validated the models in a structured review process. Results Four new archetypes were developed and publicly deployed in the openEHR Clinical Knowledge Manager, an online platform provided by the openEHR Foundation. Afterwards, these four archetypes were validated by domain experts in a team review. The review was a formalised process, organised in the Clinical Knowledge Manager. Both the development and the review process turned out to be time-consuming, mostly due to difficult selection between alternative modelling approaches. The archetype review was a straightforward team process with the goal of validating archetypes pragmatically. Conclusions The quality of medical information models is crucial to guarantee standardised semantic representation and thereby improve interoperability. The validation process is a practical way to better harmonise models that diverge due to the necessary flexibility left open by the underlying formal reference model definitions. This case study provides evidence that both community- and tool-enabled review processes, structured in the Clinical Knowledge Manager, ensure archetype quality. It offers a pragmatic but feasible way to reduce variation in the representation of clinical information models towards a more unified and interoperable model. PMID:25087081

  9. An interval programming model for continuous improvement in micro-manufacturing

    NASA Astrophysics Data System (ADS)

    Ouyang, Linhan; Ma, Yizhong; Wang, Jianjun; Tu, Yiliu; Byun, Jai-Hyun

    2018-03-01

    Continuous quality improvement in micro-manufacturing processes relies on optimization strategies that relate an output performance to a set of machining parameters. However, when determining the optimal machining parameters in a micro-manufacturing process, the economics of continuous quality improvement and decision makers' preference information are typically neglected. This article proposes an economic continuous improvement strategy based on an interval programming model. The proposed strategy differs from previous studies in two ways. First, an interval programming model is proposed to measure the quality level, where decision makers' preference information is considered in order to determine the weight of location and dispersion effects. Second, the proposed strategy is a more flexible approach since it considers the trade-off between the quality level and the associated costs, and leaves engineers a larger decision space through adjusting the quality level. The proposed strategy is compared with its conventional counterparts using an Nd:YLF laser beam micro-drilling process.

  10. Motivating medical information system performance by system quality, service quality, and job satisfaction for evidence-based practice.

    PubMed

    Chang, Ching-Sheng; Chen, Su-Yueh; Lan, Yi-Ting

    2012-11-21

    No previous studies have addressed the integrated relationships among system quality, service quality, job satisfaction, and system performance; this study attempts to bridge that gap with an evidence-based practice study. Convenience sampling was applied to the information system users of three hospitals in southern Taiwan. A total of 500 questionnaires were distributed, and 283 valid copies were returned, a valid response rate of 56.6%. SPSS 17.0 and AMOS 17.0 (structural equation modeling) statistical software packages were used for data analysis and processing. The findings are as follows: system quality has a positive influence on service quality (γ11 = 0.55), job satisfaction (γ21 = 0.32), and system performance (γ31 = 0.47); service quality (β31 = 0.38) and job satisfaction (β32 = 0.46) positively influence system performance. It is thus recommended that hospital information offices and developers take enhancement of service quality and user satisfaction into consideration, in addition to placing emphasis on system quality and information quality, when designing, developing, or purchasing an information system, in order to improve the benefits and achievements generated by hospital information systems.
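The reported path coefficients imply indirect effects of system quality on performance through service quality and job satisfaction. A small worked calculation (standard path-tracing arithmetic on the published coefficients, not additional results from the study):

```python
# Direct and indirect effects implied by the reported path coefficients.
g_sys_service  = 0.55  # system quality -> service quality
g_sys_jobsat   = 0.32  # system quality -> job satisfaction
g_sys_perf     = 0.47  # system quality -> performance (direct)
b_service_perf = 0.38  # service quality -> performance
b_jobsat_perf  = 0.46  # job satisfaction -> performance

# indirect effect = sum over mediating paths of the products of coefficients
indirect = g_sys_service * b_service_perf + g_sys_jobsat * b_jobsat_perf
total = g_sys_perf + indirect
print(round(indirect, 3), round(total, 3))  # 0.356 0.826
```

So under the fitted model, roughly 43% of system quality's total standardized effect on performance is mediated by service quality and job satisfaction.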

  11. Application of Wavelet Filters in an Evaluation of ...

    EPA Pesticide Factsheets

    Air quality model evaluation can be enhanced with time-scale-specific comparisons of outputs and observations. For example, high-frequency (hours to one day) time-scale information in observed ozone is not well captured by deterministic models, and its incorporation into model performance metrics leads one to devote resources to stochastic variations in model outputs. In this analysis, observations are compared with model outputs at seasonal, weekly, diurnal and intra-day time scales. Filters provide frequency-specific information that can be used to compare the strength (amplitude) and timing (phase) of observations and model estimates. The National Exposure Research Laboratory's (NERL's) Atmospheric Modeling and Analysis Division (AMAD) conducts research in support of EPA's mission to protect human health and the environment. AMAD's research program is engaged in developing and evaluating predictive atmospheric models on all spatial and temporal scales for forecasting the Nation's air quality and for assessing changes in air quality and air pollutant exposures, as affected by changes in ecosystem management and regulatory decisions. AMAD is responsible for providing a sound scientific and technical basis for regulatory policies based on air quality models to improve ambient air quality. The models developed by AMAD are being used by EPA, NOAA, and the air pollution community in understanding and forecasting not only the magnitude of the air pollu
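The idea of frequency-specific comparison can be illustrated with a crude moving-average band split standing in for the wavelet filters; the ozone-like series below is synthetic, not EPA data.

```python
import numpy as np

def band_split(x, window):
    """Split a series into a smooth (low-frequency) part and a residual
    (high-frequency) part with a moving average -- a crude stand-in for
    frequency-specific wavelet filters."""
    kernel = np.ones(window) / window
    low = np.convolve(x, kernel, mode="same")
    return low, x - low

hours = np.arange(0, 24 * 7)
# synthetic hourly "ozone": diurnal cycle plus stochastic variation
obs = 40 + 10 * np.sin(2 * np.pi * hours / 24) \
         + np.random.default_rng(0).normal(0, 2, hours.size)
model = 40 + 8 * np.sin(2 * np.pi * hours / 24)   # deterministic, no noise

obs_lo, obs_hi = band_split(obs, 6)
mod_lo, mod_hi = band_split(model, 6)
# the model's intra-day (high-frequency) amplitude is far smaller than
# observed, while the diurnal band matches much more closely
print(obs_hi.std() > mod_hi.std())  # True
```

Comparing amplitude and phase per band, instead of a single aggregate error, shows which time scales the deterministic model actually resolves.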

  12. Relations between continuous real-time physical properties and discrete water-quality constituents in the Little Arkansas River, south-central Kansas, 1998-2014

    USGS Publications Warehouse

    Rasmussen, Patrick P.; Eslick, Patrick J.; Ziegler, Andrew C.

    2016-08-11

    Water from the Little Arkansas River is used as source water for artificial recharge of the Equus Beds aquifer, one of the primary water-supply sources for the city of Wichita, Kansas. The U.S. Geological Survey has operated two continuous real-time water-quality monitoring stations since 1995 on the Little Arkansas River in Kansas. Regression models were developed to establish relations between discretely sampled constituent concentrations and continuously measured physical properties to compute concentrations of those constituents of interest. Site-specific regression models were originally published in 2000 for the near Halstead and near Sedgwick U.S. Geological Survey streamgaging stations and the site-specific regression models were then updated in 2003. This report updates those regression models using discrete and continuous data collected during May 1998 through August 2014. In addition to the constituents listed in the 2003 update, new regression models were developed for total organic carbon. The real-time computations of water-quality concentrations and loads are available at http://nrtwq.usgs.gov. The water-quality information in this report is important to the city of Wichita because water-quality information allows for real-time quantification and characterization of chemicals of concern (including chloride), in addition to nutrients, sediment, bacteria, and atrazine transported in the Little Arkansas River. The water-quality information in this report aids in the decision making for water treatment before artificial recharge.
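The site-specific regression approach amounts to fitting a surrogate relation between a continuously measured physical property and a discretely sampled constituent, then evaluating it in real time. The conductance-chloride pairs below are hypothetical, not values from the report:

```python
import numpy as np

# Hypothetical paired measurements: continuously monitored specific
# conductance (uS/cm) vs. discretely sampled chloride (mg/L).
cond     = np.array([300.0, 450.0, 600.0, 800.0, 1000.0])
chloride = np.array([ 25.0,  55.0,  90.0, 130.0,  170.0])

# fit chloride = a * conductance + b by ordinary least squares
a, b = np.polyfit(cond, chloride, 1)

def estimate_chloride(sc):
    """Compute a real-time chloride estimate from a conductance reading."""
    return a * sc + b

print(round(estimate_chloride(700.0), 1))
```

Updating the model, as done in 2003 and again here, means refitting `a` and `b` (or a more elaborate form) on the longer 1998-2014 paired record.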

  13. A Curriculum for a Master of Science in Information Quality

    ERIC Educational Resources Information Center

    Lee, Yang W.; Pierce, Elizabeth; Talburt, John; Wang, Richard Y.; Zhu, Hongwei

    2007-01-01

    The first Master of Science in Information Quality (IQ) degree is designed and being offered to prepare students for careers in industry and government as well as advanced graduate studies. The curriculum is guided by the Model Curriculum and Guidelines for Graduate Degree Programs in Information Systems, which are endorsed by the Association for…

  14. Colonoscopy video quality assessment using hidden Markov random fields

    NASA Astrophysics Data System (ADS)

    Park, Sun Young; Sargent, Dusty; Spofford, Inbar; Vosburgh, Kirby

    2011-03-01

    With colonoscopy becoming a common procedure for individuals aged 50 or more who are at risk of developing colorectal cancer (CRC), colon video data are being accumulated at an ever-increasing rate. However, the clinically valuable information contained in these videos is not being maximally exploited to improve patient care and accelerate the development of new screening methods. One well-known difficulty in colonoscopy video analysis is the abundance of frames with no diagnostic information. Approximately 40%-50% of the frames in a colonoscopy video are contaminated by noise, acquisition errors, glare, blur, and uneven illumination. Therefore, filtering out low-quality frames containing no diagnostic information can significantly improve the efficiency of colonoscopy video analysis. To address this challenge, we present a quality assessment algorithm to detect and remove low-quality, uninformative frames. The goal of our algorithm is to discard low-quality frames while retaining all diagnostically relevant information. Our algorithm is based on a hidden Markov model (HMM) in combination with two measures of data quality to filter out uninformative frames. Furthermore, we present a two-level framework based on an embedded hidden Markov model (EHMM) to incorporate the proposed quality assessment algorithm into a complete, automated diagnostic image analysis system for colonoscopy video.
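A minimal sketch of the HMM idea: quantize a per-frame quality measure into discrete observations and decode the most likely informative/uninformative state sequence with the Viterbi algorithm. The transition and emission probabilities here are invented for illustration, not the paper's trained values.

```python
import numpy as np

def viterbi(obs, states, start, trans, emit):
    """Most likely hidden state sequence for a discrete-observation HMM."""
    logp = np.log(start) + np.log(emit[:, obs[0]])
    back = []
    for o in obs[1:]:
        step = logp[:, None] + np.log(trans)   # score of each transition
        back.append(step.argmax(axis=0))       # best predecessor per state
        logp = step.max(axis=0) + np.log(emit[:, o])
    path = [int(logp.argmax())]
    for ptr in reversed(back):                 # trace back the best path
        path.append(int(ptr[path[-1]]))
    return [states[s] for s in reversed(path)]

# Hypothetical setup: each frame's quality score is quantized to
# 0 = "sharp" or 1 = "blurry"; hidden states are frame informativeness.
states = ["informative", "uninformative"]
start = np.array([0.5, 0.5])
trans = np.array([[0.8, 0.2],        # segments of either kind persist
                  [0.2, 0.8]])
emit  = np.array([[0.9, 0.1],        # informative frames are usually sharp
                  [0.2, 0.8]])
frames = [0, 0, 1, 1, 1, 0]
print(viterbi(frames, states, start, trans, emit))
```

Decoding whole runs of frames, rather than thresholding each frame independently, is what lets an isolated blurry frame inside a sharp segment survive filtering.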

  15. A Functional Model of Quality Assurance for Psychiatric Hospitals and Corresponding Staffing Requirements.

    ERIC Educational Resources Information Center

    Kamis-Gould, Edna; And Others

    1991-01-01

    A model for quality assurance (QA) in psychiatric hospitals is described. Its functions (general QA, utilization review, clinical records, evaluation, management information systems, risk management, and infection control), subfunctions, and corresponding staffing requirements are reviewed. This model was designed to foster standardization in QA…

  16. McCook Reservoir Water Quality Model. Numerical Model Investigation

    DTIC Science & Technology

    1991-09-01

    Final report. McCook Reservoir Water Quality Model...oxygen injected by the aeration system. Manufacturers of diffusers supply OTE information specific to gas flow rate and depth. The depths at which most

  17. The quality of mental disorder information websites: a review.

    PubMed

    Reavley, Nicola J; Jorm, Anthony F

    2011-11-01

    This paper reviews studies assessing the quality of websites providing information about mental disorders. The review included 31 articles identified by searching research databases in March 2010. Topics covered included affective disorders, anxiety disorders, eating disorders, substance use disorders and schizophrenia/psychosis. The largest number of articles (13) reported studies assessing the quality of affective disorder information. Methodologies varied in site selection and rating methods, some of limited validity. Most studies concluded that quality was poor, although the quality of affective disorder sites may be improving. There is currently very little understanding of the influence of website quality on user behaviour. Future quality assessments might use criteria informed by key behaviour-change theories. A possible approach to research on websites and user behaviour might be to develop an evaluation framework incorporating strategies from behaviour-change models, key mental health literacy elements and health outcomes relevant to mental health promotion. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  18. The National Water-Quality Assessment (NAWQA) Program planned monitoring and modeling activities for Texas, 2013–23

    USGS Publications Warehouse

    Ging, Patricia

    2013-01-01

    The U.S. Geological Survey’s (USGS) National Water-Quality Assessment (NAWQA) Program was established by Congress in 1992 to answer the following question: What is the status of the Nation’s water quality and is it getting better or worse? Since 1992, NAWQA has been a primary source of nationally consistent data and information on the quality of the Nation’s streams and groundwater. Data and information obtained from objective and nationally consistent water-quality monitoring and modeling activities provide answers to where, when, and why the Nation’s water quality is degraded and what can be done to improve and protect it for human and ecosystem needs. For NAWQA’s third decade (2013–23), a new strategic Science Plan has been developed that describes a strategy for building upon and enhancing the USGS’s ongoing assessment of the Nation’s freshwater quality and aquatic ecosystems.

  19. A method of groundwater quality assessment based on fuzzy network-CANFIS and geographic information system (GIS)

    NASA Astrophysics Data System (ADS)

    Gholami, V.; Khaleghi, M. R.; Sebghati, M.

    2017-11-01

    Water quality testing is a costly, time-consuming, and difficult stage of routine monitoring; models have therefore become commonplace for simulating water quality. In this study, the coactive neuro-fuzzy inference system (CANFIS) was used to simulate groundwater quality, and a geographic information system (GIS) was used as the pre-processor and post-processor to display the spatial variation of groundwater quality. All important factors were quantified and a groundwater quality index (GWQI) was developed. The proposed model was trained and validated on a case study of the Mazandaran Plain in northern Iran. The factors affecting groundwater quality were the input variables for the simulation, and the GWQI was the output. Network validation was performed by comparing estimated and actual GWQI values. In GIS, the study area was converted to a raster format with a pixel size of 1 km, and geo-referenced layers of the factors affecting groundwater quality were obtained by incorporating the input data layers of the fuzzy network-CANFIS model. The numeric values of each pixel, with geographical coordinates, were then entered into the fuzzy network-CANFIS model, yielding a simulation of groundwater quality across the study area. Finally, the simulated GWQI indices from the fuzzy network-CANFIS model were entered into GIS to produce a groundwater quality map (raster layer) based on the results of the network simulation. The results confirm the high efficiency of combining neuro-fuzzy techniques with GIS. It is also worth noting that the general quality of the groundwater in most of the studied plain is fairly low.
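A GWQI of the kind described is typically a weighted aggregation of normalized factor scores computed per pixel; the weights and sub-scores below are illustrative, not the paper's trained CANFIS output.

```python
def gwqi(factors, weights):
    """Weighted groundwater quality index for one raster pixel: each factor
    is a sub-quality score on a 0-100 scale and the weights sum to 1
    (an illustrative scheme, not the paper's trained model)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[k] * factors[k] for k in weights)

# hypothetical factor weights and one pixel's normalized sub-scores
weights = {"EC": 0.3, "nitrate": 0.3, "chloride": 0.2, "hardness": 0.2}
pixel = {"EC": 40.0, "nitrate": 55.0, "chloride": 70.0, "hardness": 60.0}
print(gwqi(pixel, weights))  # 54.5
```

Applying such a function to every 1 km pixel of the geo-referenced factor layers is what produces the final raster quality map.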

  20. Measuring health care process quality with software quality measures.

    PubMed

    Yildiz, Ozkan; Demirörs, Onur

    2012-01-01

    Existing quality models focus on specific diseases, clinics, or clinical areas. Although they contain structure, process, or output measures, no model measures the quality of health care processes comprehensively. In addition, because overall process quality is not measured, hospitals cannot compare the quality of their processes internally or externally. To address these problems, a new model was developed from software quality measures. We adapted the ISO/IEC 9126 software quality standard to health care processes, then added the JCIAS (Joint Commission International Accreditation Standards for Hospitals) measurable elements to the model's scope to unify the functional requirements. Measurement results for the assessment (diagnosing) process are provided in this paper. After the application, we concluded that the model identifies weak and strong aspects of the processes, gives a more detailed picture of process quality, and provides quantifiable information that lets hospitals compare their processes with those of multiple organizations.

  1. Water quality modeling using geographic information system (GIS) data

    NASA Technical Reports Server (NTRS)

    Engel, Bernard A

    1992-01-01

    Protection of the environment and natural resources at the Kennedy Space Center (KSC) is of great concern. The potential for surface and ground water quality problems resulting from non-point sources of pollution was examined using models. Since spatial variation of parameters required was important, geographic information systems (GIS) and their data were used. The potential for groundwater contamination was examined using the SEEPAGE (System for Early Evaluation of the Pollution Potential of Agricultural Groundwater Environments) model. A watershed near the VAB was selected to examine potential for surface water pollution and erosion using the AGNPS (Agricultural Non-Point Source Pollution) model.

  2. Examining the Factors That Contribute to Successful Database Application Implementation Using the Technology Acceptance Model

    ERIC Educational Resources Information Center

    Nworji, Alexander O.

    2013-01-01

    Most organizations spend millions of dollars due to the impact of improperly implemented database application systems as evidenced by poor data quality problems. The purpose of this quantitative study was to use, and extend, the technology acceptance model (TAM) to assess the impact of information quality and technical quality factors on database…

  3. DIAGNOSTIC EVALUATION OF NUMERICAL AIR QUALITY MODELS WITH SPECIALIZED AMBIENT OBSERVATIONS: TESTING THE COMMUNITY MULTISCALE AIR QUALITY MODELING SYSTEM (CMAQ) AT SELECTED SOS 95 GROUND SITES

    EPA Science Inventory

    Three probes for diagnosing photochemical dynamics are presented and applied to specialized ambient surface-level observations and to a numerical photochemical model to better understand rates of production and other process information in the atmosphere and in the model. Howeve...

  4. Quality assessment of protein model-structures using evolutionary conservation.

    PubMed

    Kalman, Matan; Ben-Tal, Nir

    2010-05-15

    Programs that evaluate the quality of a protein structural model are important both for validating the structure determination procedure and for guiding the model-building process. Such programs are based on properties of native structures that are generally not expected in faulty models. One such property, rarely used for automatic structure quality assessment, is the tendency for conserved residues to be located in the structural core and for variable residues to be located at the surface. We present ConQuass, a novel quality assessment program based on the consistency between the model structure and the protein's conservation pattern. We show that it can identify problematic structural models, and that the scores it assigns to the server models in CASP8 correlate with the similarity of the models to the native structure. We also show that when the conservation information is reliable, the method's performance is comparable and complementary to that of the other single-structure quality assessment methods that participated in CASP8 and do not use additional structural information from homologs. A Perl implementation of the method, as well as the various Perl and R scripts used for the analysis, are available at http://bental.tau.ac.il/ConQuass/. Contact: nirb@tauex.tau.ac.il. Supplementary data are available at Bioinformatics online.
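
    The core idea behind conservation-based assessment can be sketched simply: in a plausible model, conserved residues tend to be buried (low solvent exposure) and variable residues exposed, so a toy consistency score is the negative correlation between per-residue conservation and exposure. This is an illustration of the principle, not ConQuass's actual algorithm, and the residue data below are invented.

```python
# Toy consistency score for a protein model: anti-correlation between
# per-residue conservation and solvent exposure. ConQuass itself is more
# elaborate; all numbers here are illustrative.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

conservation = [0.9, 0.8, 0.2, 0.1, 0.7, 0.3]   # 1 = highly conserved
exposure     = [0.1, 0.2, 0.8, 0.9, 0.3, 0.7]   # relative solvent accessibility

consistency = -pearson(conservation, exposure)   # higher = more native-like
print(round(consistency, 2))  # -> 1.0 for this perfectly consistent toy model
```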

  5. Evaluation of the Community Multiscale Air Quality Model for Simulating Winter Ozone Formation in the Uinta Basin with Intensive Oil and Gas Production

    NASA Astrophysics Data System (ADS)

    Matichuk, R.; Tonnesen, G.; Luecken, D.; Roselle, S. J.; Napelenok, S. L.; Baker, K. R.; Gilliam, R. C.; Misenis, C.; Murphy, B.; Schwede, D. B.

    2015-12-01

    The western United States is an important source of domestic energy resources. One of the primary environmental impacts associated with oil and natural gas production is the release of a number of air pollutants, some of which are important precursors to the formation of ground-level ozone. To better understand ozone impacts and other air quality issues, photochemical air quality models are used to simulate the changes in pollutant concentrations in the atmosphere on local, regional, and national spatial scales. These models are important for air quality management because they assist in identifying source contributions to air quality problems and in designing effective strategies to reduce harmful air pollutants. The success of predicting oil and natural gas air quality impacts depends on the accuracy of the input information, including emissions inventories, meteorological information, and boundary conditions. The treatment of chemical and physical processes within these models is equally important. However, given the limited amount of data collected on oil and natural gas production emissions in the past, and the complex terrain and meteorological conditions in western states, the ability of these models to accurately predict pollution concentrations from these sources is uncertain. Therefore, this presentation will focus on understanding the Community Multiscale Air Quality (CMAQ) model's ability to predict air quality impacts associated with oil and natural gas production, and its sensitivity to input uncertainties. The results will focus on winter ozone issues in the Uinta Basin, Utah, and identify the factors contributing to model performance issues. The results of this study will help support future air quality model development and policy and regulatory decisions for the oil and gas sector.

  6. Developing a long-term condition's information service in collaboration with third sector organisations.

    PubMed

    McShane, Lesley; Greenwell, Kate; Corbett, Sally; Walker, Richard

    2014-06-01

    People with long-term conditions need to be signposted to high quality information and advice to understand and manage their condition. Information seeking tools combined with third sector information could help address their information needs. To describe the development and implementation of an information service for people living with long-term conditions at one NHS acute trust in the Northeast of England. An information service was trialled using bespoke information models for three long-term conditions in collaboration with third sector organisations. These guided people to relevant, timely and reliable information. Both clinician and service user questionnaires were used to evaluate satisfaction with the service. Appropriately designed information models can be used interchangeably across all services. Between 75% and 91% of users agreed that they were satisfied with various aspects of the service. Generally, users received relevant, understandable and high quality information at the right time. Nearly all health professionals (94-100%) felt the service was accessible, provided high quality information and did not significantly impact on their consultation time. The developed information service was well received by service users and health professionals. Specifically, the use of information prescriptions and menus facilitated access to information for people with long-term conditions. © 2014 The authors. Health Information and Libraries Journal © 2014 Health Libraries Group.

  7. How Clean is your Local Air? Here's an app for that

    NASA Astrophysics Data System (ADS)

    Maskey, M.; Yang, E.; Christopher, S. A.; Keiser, K.; Nair, U. S.; Graves, S. J.

    2011-12-01

    Air quality is a vital element of our environment. Accurate and localized air quality information is critical for characterizing environmental impacts at the local and regional levels. Advances in location-aware handheld devices and air quality modeling have enabled a group of UAHuntsville scientists to develop a mobile app, LocalAQI, that informs users of current air quality index conditions and forecasts up to twenty-four hours ahead. The air quality index is based on the Community Multiscale Air Quality Modeling System (CMAQ). UAHuntsville scientists have used satellite remote sensing products as inputs to CMAQ, yielding forecast guidance for particulate-matter air quality. The CMAQ output is processed to compute a standardized air quality index, which is currently available for the eastern half of the United States. LocalAQI consists of two main views: an air quality index view and a map view. The air quality index view displays current air quality for the zip code of a location of interest, translating the index value into a color-coded advisory system; users can also cycle through the available hourly forecasts for a location. This location-aware app defaults to the current air quality of the user's location. The map view displays color-coded air quality information for the eastern US with the ability to animate through the available forecasts. The app is built with a cross-platform native application development tool, Appcelerator; hence LocalAQI is available for iOS and Android phones and tablets.
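
    The abstract does not specify how the standardized index is computed; the standard US EPA approach, which such apps typically follow, interpolates a pollutant concentration linearly between category breakpoints. A sketch, using PM2.5 breakpoints that EPA revises periodically (verify against the current tables before reuse):

```python
# Standard EPA-style AQI computation: piecewise-linear interpolation of a
# concentration between breakpoints. The PM2.5 breakpoints below are
# illustrative and subject to EPA revision.

PM25_BREAKPOINTS = [  # (C_lo, C_hi, I_lo, I_hi, category)
    (0.0, 12.0, 0, 50, "Good"),
    (12.1, 35.4, 51, 100, "Moderate"),
    (35.5, 55.4, 101, 150, "Unhealthy for Sensitive Groups"),
    (55.5, 150.4, 151, 200, "Unhealthy"),
]

def pm25_aqi(conc: float):
    """Map a PM2.5 concentration (ug/m^3) to (AQI, category label)."""
    for c_lo, c_hi, i_lo, i_hi, label in PM25_BREAKPOINTS:
        if c_lo <= conc <= c_hi:
            aqi = (i_hi - i_lo) / (c_hi - c_lo) * (conc - c_lo) + i_lo
            return round(aqi), label
    raise ValueError("concentration outside breakpoint table")

print(pm25_aqi(35.4))  # -> (100, 'Moderate'): upper edge of the band
```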

  8. Hospital quality choice and market structure in a regulated duopoly.

    PubMed

    Beitia, Arantza

    2003-11-01

    This paper analyzes the optimal structure of a regulated health care industry in a model in which the regulator cannot enforce what hospitals do (unverifiable quality of health) or does not know what hospitals know (incomplete information about production costs) or both. We show that if quality is unverifiable the choice between monopoly and duopoly does not change with respect to the verifiable case but, if there are fixed costs (assumed to be quality dependent) and the monopoly is the optimal market structure, the quality level of the operative hospital decreases. Asymmetry of information introduces informational rents that can be reduced by increasing the most efficient hospital's market share. A monopoly is chosen more often.

  9. Motivating medical information system performance by system quality, service quality, and job satisfaction for evidence-based practice

    PubMed Central

    2012-01-01

    Background No previous studies have addressed the integrated relationships among system quality, service quality, job satisfaction, and system performance; this study attempts to bridge that gap with an evidence-based practice study. Methods The convenience sampling method was applied to the information system users of three hospitals in southern Taiwan. A total of 500 questionnaires were distributed, and 283 returned copies were valid, a valid response rate of 56.6%. SPSS 17.0 and AMOS 17.0 (structural equation modeling) statistical software packages were used for data analysis and processing. Results The findings are as follows: system quality has a positive influence on service quality (γ11 = 0.55), job satisfaction (γ21 = 0.32), and system performance (γ31 = 0.47). Service quality (β31 = 0.38) and job satisfaction (β32 = 0.46) positively influence system performance. Conclusions It is thus recommended that hospital information offices and developers take enhancement of service quality and user satisfaction into consideration, in addition to placing emphasis on system quality and information quality, when designing, developing, or purchasing an information system, in order to improve the benefits and achievements generated by hospital information systems. PMID:23171394
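
    Given the reported coefficients, the total effect of system quality on system performance follows from standard path tracing (the direct path plus the two indirect paths through service quality and job satisfaction), assuming the usual recursive SEM decomposition:

```python
# Path-tracing decomposition of the total effect of system quality on
# system performance, using the coefficients reported in the abstract.

g11, g21, g31 = 0.55, 0.32, 0.47   # system quality -> service quality, job satisfaction, performance
b31, b32 = 0.38, 0.46              # service quality -> performance, job satisfaction -> performance

direct = g31
indirect = g11 * b31 + g21 * b32   # via service quality + via job satisfaction
total = direct + indirect
print(round(total, 3))             # -> 0.826
```

    The indirect paths together (about 0.36) contribute almost as much as the direct path, which is why the conclusion stresses service quality and satisfaction alongside system quality.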

  10. Evaluation models and criteria of the quality of hospital websites: a systematic review study

    PubMed Central

    Jeddi, Fatemeh Rangraz; Gilasi, Hamidreza; Khademi, Sahar

    2017-01-01

    Introduction Hospital websites are important tools in establishing communication and exchanging information between patients and staff, and thus should enjoy an acceptable level of quality. The aim of this study was to identify proper models and criteria to evaluate the quality of hospital websites. Methods This research was a systematic review study. The international databases such as Science Direct, Google Scholar, PubMed, Proquest, Ovid, Elsevier, Springer, and EBSCO together with regional database such as Magiran, Scientific Information Database, Persian Journal Citation Report (PJCR) and IranMedex were searched. Suitable keywords including website, evaluation, and quality of website were used. Full text papers related to the research were included. The criteria and sub criteria of the evaluation of website quality were extracted and classified. Results To evaluate the quality of the websites, various models and criteria were presented. The WEB-Q-IM, Mile, Minerva, Seruni Luci, and Web-Qual models were the designed models. The criteria of accessibility, content and apparent features of the websites, the design procedure, the graphics applied in the website, and the page’s attractions have been mentioned in the majority of studies. Conclusion The criteria of accessibility, content, design method, security, and confidentiality of personal information are the essential criteria in the evaluation of all websites. It is suggested that the ease of use, graphics, attractiveness and other apparent properties of websites are considered as the user-friendliness sub criteria. Further, the criteria of speed and accessibility of the website should be considered as sub criterion of efficiency. When determining the evaluation criteria of the quality of websites, attention to major differences in the specific features of any website is essential. PMID:28465807

  11. Evaluation models and criteria of the quality of hospital websites: a systematic review study.

    PubMed

    Jeddi, Fatemeh Rangraz; Gilasi, Hamidreza; Khademi, Sahar

    2017-02-01

    Hospital websites are important tools in establishing communication and exchanging information between patients and staff, and thus should enjoy an acceptable level of quality. The aim of this study was to identify proper models and criteria to evaluate the quality of hospital websites. This research was a systematic review study. The international databases such as Science Direct, Google Scholar, PubMed, Proquest, Ovid, Elsevier, Springer, and EBSCO together with regional database such as Magiran, Scientific Information Database, Persian Journal Citation Report (PJCR) and IranMedex were searched. Suitable keywords including website, evaluation, and quality of website were used. Full text papers related to the research were included. The criteria and sub criteria of the evaluation of website quality were extracted and classified. To evaluate the quality of the websites, various models and criteria were presented. The WEB-Q-IM, Mile, Minerva, Seruni Luci, and Web-Qual models were the designed models. The criteria of accessibility, content and apparent features of the websites, the design procedure, the graphics applied in the website, and the page's attractions have been mentioned in the majority of studies. The criteria of accessibility, content, design method, security, and confidentiality of personal information are the essential criteria in the evaluation of all websites. It is suggested that the ease of use, graphics, attractiveness and other apparent properties of websites are considered as the user-friendliness sub criteria. Further, the criteria of speed and accessibility of the website should be considered as sub criterion of efficiency. When determining the evaluation criteria of the quality of websites, attention to major differences in the specific features of any website is essential.

  12. Models and Methods of Aggregating Linguistic Information in Multi-criteria Hierarchical Quality Assessment Systems

    NASA Astrophysics Data System (ADS)

    Azarnova, T. V.; Titova, I. A.; Barkalov, S. A.

    2018-03-01

    The article presents an algorithm for obtaining an integral assessment of the quality of an organization from the customers' perspective, based on aggregating linguistic information over a multilevel hierarchical quality assessment system. The algorithm is constructive: it yields not only an integral evaluation but also a quality improvement strategy, based on the method of linguistic decomposition, which identifies the minimal set of client-facing areas whose quality changes will achieve the required level of integrated quality assessment.

  13. A Model for Discussing the Quality of Technology-Enhanced Learning in Blended Learning Programmes

    ERIC Educational Resources Information Center

    Casanova, Diogo; Moreira, António

    2017-01-01

    This paper presents a comprehensive model for supporting informed and critical discussions concerning the quality of Technology-Enhanced Learning in Blended Learning programmes. The model aims to support discussions around domains such as how institutions are prepared, the participants' background and expectations, the course design, and the…

  14. Development of water environment information management and water pollution accident response system

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Ruan, H.

    2009-12-01

    In recent years, many water pollution accidents have occurred alongside rapid economic development. In this study, a water environment information management and water pollution accident response system was developed based on geographic information system (GIS) techniques. The system integrated a spatial database, an attribute database, a hydraulic model, and a water quality model under a user-friendly interface in a GIS environment. It ran on both Client/Server (C/S) and Browser/Server (B/S) platforms, which focused on modeling and inquiry, respectively. The system provided spatial and attribute data inquiry, water quality evaluation, statistics, water pollution accident response case management (e.g., opening a reservoir), and 2D and 3D visualization, and supplied supporting information for decision-making on water pollution accident response. A polluted plume in the Huaihe River was selected to simulate pollutant transport.

  15. A Personalized Health Information Retrieval System

    PubMed Central

    Wang, Yunli; Liu, Zhenkai

    2005-01-01

    Consumers face barriers when seeking health information on the Internet. A Personalized Health Information Retrieval System (PHIRS) is proposed to recommend health information for consumers. The system consists of four modules: (1) User modeling module captures user’s preference and health interests; (2) Automatic quality filtering module identifies high quality health information; (3) Automatic text difficulty rating module classifies health information into professional or patient educational materials; and (4) User profile matching module tailors health information for individuals. The initial results show that PHIRS could assist consumers with simple search strategies. PMID:16779435

  16. Urban air quality estimation study, phase 1

    NASA Technical Reports Server (NTRS)

    Diamante, J. M.; Englar, T. S., Jr.; Jazwinski, A. H.

    1976-01-01

    Possibilities are explored for applying estimation theory to the analysis, interpretation, and use of air quality measurements in conjunction with simulation models to provide a cost effective method of obtaining reliable air quality estimates for wide urban areas. The physical phenomenology of real atmospheric plumes from elevated localized sources is discussed. A fluctuating plume dispersion model is derived. Individual plume parameter formulations are developed along with associated a priori information. Individual measurement models are developed.
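
    For context, the classical steady-state Gaussian plume formula, the baseline that fluctuating-plume formulations extend, is easy to sketch. This is textbook background, not the study's own model, and all parameter values below are illustrative.

```python
# Classical ground-reflected Gaussian plume concentration. The fluctuating
# plume model in the study generalizes this steady-state picture; values
# here are hypothetical.
import math

def gaussian_plume(Q, u, sigma_y, sigma_z, y, z, H):
    """Concentration (g/m^3) at crosswind offset y and height z.
    Q: emission rate (g/s), u: wind speed (m/s), H: effective stack height (m),
    sigma_y / sigma_z: dispersion parameters (m) at the downwind distance."""
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = (math.exp(-(z - H)**2 / (2 * sigma_z**2))
                + math.exp(-(z + H)**2 / (2 * sigma_z**2)))  # ground reflection
    return Q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Centerline ground-level concentration with hypothetical inputs.
print(gaussian_plume(Q=100.0, u=5.0, sigma_y=80.0, sigma_z=40.0, y=0.0, z=0.0, H=50.0))
```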

  17. Switching health insurers: the role of price, quality and consumer information search.

    PubMed

    Boonen, Lieke H H M; Laske-Aldershof, Trea; Schut, Frederik T

    2016-04-01

    We examine the impact of price, service quality and information search on people's propensity to switch health insurers in the competitive Dutch health insurance market. Using panel data from annual household surveys and data on health insurers' premiums and quality ratings over the period 2006-2012, we estimate a random effects logit model of people's switching decisions. We find that switching propensities depend on health plan price and quality, and on people's age, health, education and having supplementary or group insurance. Young people (18-35 years) are more sensitive to price, whereas older people are more sensitive to quality. Searching for health plan information has a much stronger impact on people's sensitivity to price than to service quality. In addition, searching for health plan information has a stronger impact on the switching propensity of higher than lower educated people, suggesting that higher educated people make better use of available health plan information. Finally, having supplementary insurance significantly reduces older people's switching propensity.

  18. Data Quality in Institutional Arthroplasty Registries: Description of a Model of Validation and Report of Preliminary Results.

    PubMed

    Bautista, Maria P; Bonilla, Guillermo A; Mieth, Klaus W; Llinás, Adolfo M; Rodríguez, Fernanda; Cárdenas, Laura L

    2017-07-01

    Arthroplasty registries are a relevant source of information for research and quality improvement in patient care and its value depends on the quality of the recorded data. The purpose of this study is to describe a model of validation and present the findings of validation of an Institutional Arthroplasty Registry (IAR). Information from 209 primary arthroplasties and revision surgeries of the hip, knee, and shoulder recorded in the IAR between March and September 2015 were analyzed in the following domains. Adherence is defined as the proportion of patients included in the registry, completeness is defined as the proportion of data effectively recorded, and accuracy is defined as the proportion of data consistent with medical records. A random sample of 53 patients (25.4%) was selected to assess the latest 2 domains. A direct comparison between the registry's database and medical records was performed. In total, 324 variables containing information on demographic data, surgical procedure, clinical outcomes, and key performance indicators were analyzed. Two hundred nine of 212 patients who underwent surgery during the study period were included in the registry, accounting for an adherence of 98.6%. Completeness was 91.7% and accuracy was 85.8%. Most errors were found in the preoperative range of motion and timely administration of prophylactic antibiotics and thromboprophylaxis. This model provides useful information regarding the quality of the recorded data since it identified deficient areas within the IAR. We recommend that institutional arthroplasty registries be constantly monitored for data quality before using their information for research or quality improvement purposes. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. Quality models for audiovisual streaming

    NASA Astrophysics Data System (ADS)

    Thang, Truong Cong; Kim, Young Suk; Kim, Cheon Seog; Ro, Yong Man

    2006-01-01

    Quality is an essential factor in multimedia communication, especially in compression and adaptation. Quality metrics can be divided into three categories: within-modality quality, cross-modality quality, and multi-modality quality. Most research has so far focused on within-modality quality, and quality is normally considered only from the perceptual perspective. In practice, content may be drastically adapted, even converted to another modality, and in that case quality should be considered from the semantic perspective as well. In this work, we investigate multi-modality quality from the semantic perspective. To model semantic quality, we apply the concept of the "conceptual graph", which consists of semantic nodes and the relations between them. As a typical multi-modality example, we focus on an audiovisual streaming service. Specifically, we evaluate the amount of information conveyed by audiovisual content when both the video and audio channels may be strongly degraded, or when the audio is converted to text. In the experiments, we also consider the perceptual quality model of audiovisual content, so as to see how it differs from the semantic quality model.

  20. Evaluation and Quality Control for the Copernicus Seasonal Forecast Systems

    NASA Astrophysics Data System (ADS)

    Manubens, N.; Hunter, A.; Bedia, J.; Bretonnière, P. A.; Bhend, J.; Doblas-Reyes, F. J.

    2017-12-01

    The EU-funded Copernicus Climate Change Service (C3S) will provide authoritative information about past, current and future climate for a wide range of users, from climate scientists to stakeholders from sectors including insurance, energy and transport. It has been recognized that providing information about the products' quality and provenance is paramount to establish trust in the service and allow users to make best use of the available information. This presentation outlines the work being conducted within the Quality Assurance for Multi-model Seasonal Forecast Products project (QA4Seas). The aim of QA4Seas is to develop a strategy for the evaluation and quality control (EQC) of the multi-model seasonal forecasts provided by C3S. First, we present the set of guidelines the data providers must comply with, ensuring the data is fully traceable and harmonized across data sets. Second, we discuss the ongoing work on defining a provenance and metadata model that is able to encode such information, and that can be extended to describe the steps followed to obtain the final verification products such as maps and time series of forecast quality measures. The metadata model is based on the Resource Description Framework W3C standard, and is thus extensible and reusable. It benefits from widely adopted vocabularies for describing data provenance and workflows, as well as from expert consensus and community support for the development of the verification- and downscaling-specific ontologies. Third, we describe the open source software being developed to generate fully reproducible and certifiable seasonal forecast products, which also attaches provenance and metadata information to the verification measures and enables the user to visually inspect the quality of the C3S products. QA4Seas is seeking collaboration with similar initiatives, as well as extending the discussion to interested parties outside the C3S community, to share experiences and establish global common guidelines or best practices regarding data provenance.

  1. Total Quality Management of Information System for Quality Assessment of Pesantren Using Fuzzy-SERVQUAL

    NASA Astrophysics Data System (ADS)

    Faizah, Arbiati; Syafei, Wahyul Amien; Isnanto, R. Rizal

    2018-02-01

    This research proposed a model combining Total Quality Management (TQM) with a fuzzy Service Quality (SERVQUAL) method to assess service quality. TQM was implemented as quality management oriented toward customer satisfaction and involving all stakeholders. The SERVQUAL model measured service quality along five dimensions: tangibles, reliability, responsiveness, assurance, and empathy. Fuzzy set theory was used to accommodate the subjectivity and ambiguity of quality assessment. The input data consisted of indicator data and quality assessment aspects, which were processed into service quality assessment questionnaires for the Pesantren and scored for service quality using the fuzzy method. The process consisted of the following steps: entering the dimension and questionnaire data into the database, filling in the questionnaires through the system, and having the system calculate the fuzzification, the defuzzification, the gap between the quality expected and the quality received, and the rating of each dimension, which indicates the priority for quality refinement. The rating of each quality dimension was then displayed on the system dashboard for users to inspect. The system showed that the tangible dimension had the largest gap, -0.399, so it should be prioritized for evaluation and refinement.
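
    The fuzzification, defuzzification, and gap steps can be sketched as follows. The triangular fuzzy scale below is a common choice for five-point Likert ratings, not necessarily the one used in the paper, and the sample ratings are invented.

```python
# Fuzzy-SERVQUAL sketch: map Likert ratings to triangular fuzzy numbers,
# defuzzify by centroid, and compute the gap (perception minus expectation)
# for one dimension. Scale and ratings are illustrative.

FUZZY_SCALE = {  # Likert score -> triangular fuzzy number (a, b, c)
    1: (0.0, 0.0, 2.5), 2: (0.0, 2.5, 5.0), 3: (2.5, 5.0, 7.5),
    4: (5.0, 7.5, 10.0), 5: (7.5, 10.0, 10.0),
}

def centroid(tfn):
    a, b, c = tfn
    return (a + b + c) / 3.0  # centroid defuzzification of a triangle

def gap(expectation_scores, perception_scores):
    e = sum(centroid(FUZZY_SCALE[s]) for s in expectation_scores) / len(expectation_scores)
    p = sum(centroid(FUZZY_SCALE[s]) for s in perception_scores) / len(perception_scores)
    return p - e  # negative gap: received quality falls short of expectations

# Hypothetical "tangibles" items: high expectations, middling perceptions.
print(round(gap([5, 5, 4], [4, 3, 4]), 3))  # -> -1.944
```

    A per-dimension gap like this is what the dashboard ranks: the most negative gap (tangibles in the paper, at -0.399) becomes the top refinement priority.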

  2. QUANTIFYING SUBGRID POLLUTANT VARIABILITY IN EULERIAN AIR QUALITY MODELS

    EPA Science Inventory

    In order to properly assess human risk due to exposure to hazardous air pollutants or air toxics, detailed information is needed on the location and magnitude of ambient air toxic concentrations. Regional scale Eulerian air quality models are typically limited to relatively coar...

  3. Determinants of user satisfaction with a Clinical Information System.

    PubMed

    Palm, Jean-Marc; Colombet, Isabelle; Sicotte, Claude; Degoulet, Patrice

    2006-01-01

    Clinical Information Systems (CIS) implementation has faced user resistance. Consequently, we aimed to assess the acceptability of an integrated CIS. We designed an electronic survey instrument from two theoretical models (DeLone and McLean, and the Technology Acceptance Model). The dimensions hypothesized to determine user satisfaction were: user characteristics, CIS use, quality, usefulness, and service quality. The questionnaire was administered to physicians, nurses, and medical secretaries of the Georges Pompidou University Hospital (HEGP) in Paris. Answers were obtained from 324 users (93 physicians, 174 nurses, and 57 secretaries). Cronbach's alpha coefficients showed correct reliability within each dimension. Secretaries and nurses were more satisfied with the CIS than physicians. Except for CIS use, after adjustment for confounders, female gender, perceived CIS quality, usefulness, and service quality were strongly correlated with user satisfaction. This study reinforces the necessity of using several models and dimensions to evaluate the acceptability of a complex CIS, with a specific approach for different user profiles.

  4. Competition for resources can explain patterns of social and individual learning in nature.

    PubMed

    Smolla, Marco; Gilman, R Tucker; Galla, Tobias; Shultz, Susanne

    2015-09-22

    In nature, animals often ignore socially available information despite the multiple theoretical benefits of social learning over individual trial-and-error learning. Using information filtered by others is quicker, more efficient and less risky than randomly sampling the environment. To explain the mix of social and individual learning used by animals in nature, most models penalize the quality of socially derived information as either out of date, of poor fidelity or costly to acquire. Competition for limited resources, a fundamental evolutionary force, provides a compelling, yet hitherto overlooked, explanation for the evolution of mixed-learning strategies. We present a novel model of social learning that incorporates competition and demonstrates that (i) social learning is favoured when competition is weak, but (ii) if competition is strong social learning is favoured only when resource quality is highly variable and there is low environmental turnover. The frequency of social learning in our model always evolves until it reduces the mean foraging success of the population. The results of our model are consistent with empirical studies showing that individuals rely less on social information where resources vary little in quality and where there is high within-patch competition. Our model provides a framework for understanding the evolution of social learning, a prerequisite for human cumulative culture. © 2015 The Author(s).

  5. Competition for resources can explain patterns of social and individual learning in nature

    PubMed Central

    Smolla, Marco; Gilman, R. Tucker; Galla, Tobias; Shultz, Susanne

    2015-01-01

    In nature, animals often ignore socially available information despite the multiple theoretical benefits of social learning over individual trial-and-error learning. Using information filtered by others is quicker, more efficient and less risky than randomly sampling the environment. To explain the mix of social and individual learning used by animals in nature, most models penalize the quality of socially derived information as either out of date, of poor fidelity or costly to acquire. Competition for limited resources, a fundamental evolutionary force, provides a compelling, yet hitherto overlooked, explanation for the evolution of mixed-learning strategies. We present a novel model of social learning that incorporates competition and demonstrates that (i) social learning is favoured when competition is weak, but (ii) if competition is strong social learning is favoured only when resource quality is highly variable and there is low environmental turnover. The frequency of social learning in our model always evolves until it reduces the mean foraging success of the population. The results of our model are consistent with empirical studies showing that individuals rely less on social information where resources vary little in quality and where there is high within-patch competition. Our model provides a framework for understanding the evolution of social learning, a prerequisite for human cumulative culture. PMID:26354936

  6. WATGIS: A GIS-Based Lumped Parameter Water Quality Model

    Treesearch

    Glenn P. Fernandez; George M. Chescheir; R. Wayne Skaggs; Devendra M. Amatya

    2002-01-01

    A Geographic Information System (GIS)-based, lumped parameter water quality model was developed to estimate the spatial and temporal nitrogen-loading patterns for lower coastal plain watersheds in eastern North Carolina. The model uses a spatially distributed delivery ratio (DR) parameter to account for nitrogen retention or loss along a drainage network. Delivery...
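    The delivery-ratio idea above can be sketched in a few lines: a fraction of each nitrogen load survives every reach it passes through on its way to the outlet. The network topology, DR values, and source loads below are invented for illustration and are not data from the WATGIS study.

    ```python
    # Lumped delivery-ratio (DR) nitrogen routing on a toy drainage network.
    # All values are illustrative assumptions, not WATGIS inputs.
    downstream = {"A": "B", "B": "outlet", "C": "B"}   # reach -> next reach
    dr = {"A": 0.8, "B": 0.9, "C": 0.7}                # per-reach delivery ratio
    source_load = {"A": 100.0, "B": 50.0, "C": 80.0}   # kg N entering at each reach

    def delivered(reach: str) -> float:
        """Load from `reach` reaching the outlet: source load times the
        product of delivery ratios along its flow path."""
        load, node = source_load[reach], reach
        while node != "outlet":
            load *= dr[node]
            node = downstream[node]
        return load

    total = sum(delivered(r) for r in dr)
    print(round(total, 1))  # aggregate N load at the watershed outlet
    ```

    Because the DR parameter is spatially distributed, a GIS layer can supply a different ratio for every reach without changing the routing logic.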

  7. Is Bigger Better? Customer Base Expansion through Word-of-Mouth Reputation

    ERIC Educational Resources Information Center

    Rob, Rafael; Fishman, Arthur

    2005-01-01

    A model of gradual reputation formation through a process of continuous investment in product quality is developed. We assume that the ability to produce high-quality products requires continuous investment and that as a consequence of informational frictions, such as search costs, information about firms' past performance diffuses only gradually…

  8. Air Quality Modeling in Support of the Near-Road Exposures and Effects of Urban Air Pollutants Study (NEXUS)

    EPA Science Inventory

    A major challenge in traffic-related air pollution exposure studies is the lack of information regarding pollutant exposure characterization. Air quality modeling can provide spatially and temporally varying exposure estimates for examining relationships between traffic-related a...

  9. Water quality modelling of Jadro spring.

    PubMed

    Margeta, J; Fistanic, I

    2004-01-01

    Management of water quality in karst is a specific problem. Water moves quickly by infiltration, and faster still through concentrated flows in karst fissures and openings, so the entire surface pollution is transferred rapidly and without filtration into groundwater springs. The Jadro spring is a typical example: changes in water quality at the spring are sudden but short-lived. Turbidity, the major water quality problem for karst springs, regularly exceeds allowable standards. Past practice has been reduced to intensive water disinfection during periods of high turbidity, without analysis of the disinfection by-product risks for water users. The main prerequisite for water quality control and optimized disinfection is knowledge of raw water quality and the nature of its occurrence. Analysis of monitoring data and their functional relationship with hydrological parameters enables establishment of a stochastic model that provides better information on turbidity in different periods of the year. The model is used to generate a large number of average monthly and extreme daily values, and statistical analysis of these data yields the probability of high turbidity occurring in particular months. This information can be used to design an expert system for water quality management of karst springs; the time series model thus becomes a valuable tool in managing the drinking water quality of the Jadro spring.
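    The kind of analysis the abstract describes, generating many synthetic turbidity values and estimating how often a treatment threshold is exceeded in each month, can be sketched as follows. The lognormal form, the seasonal parameters, and the threshold are all assumptions for illustration, not the actual Jadro model.

    ```python
    # Illustrative stochastic turbidity exceedance analysis.
    # Distribution, parameters, and threshold are invented assumptions.
    import random

    random.seed(42)

    # Hypothetical monthly log-mean/log-sd of turbidity (NTU), higher in winter.
    month_params = {m: (1.0 + (0.8 if m in (11, 12, 1, 2) else 0.0), 0.6)
                    for m in range(1, 13)}
    THRESHOLD = 4.0  # NTU limit above which disinfection becomes problematic

    def exceedance_prob(month: int, n: int = 10_000) -> float:
        """Monte Carlo estimate of P(turbidity > THRESHOLD) in a month."""
        mu, sigma = month_params[month]
        hits = sum(random.lognormvariate(mu, sigma) > THRESHOLD for _ in range(n))
        return hits / n

    print(exceedance_prob(1) > exceedance_prob(7))  # winter riskier than summer
    ```

    Replacing the assumed lognormal with a distribution fitted to monitoring data, conditioned on hydrological parameters, gives the month-by-month risk information the abstract uses for disinfection planning.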

  10. Design of Cycle 3 of the National Water-Quality Assessment Program, 2013-23: Part 2: Science plan for improved water-quality information and management

    USGS Publications Warehouse

    Rowe, Gary L.; Belitz, Kenneth; Demas, Charlie R.; Essaid, Hedeff I.; Gilliom, Robert J.; Hamilton, Pixie A.; Hoos, Anne B.; Lee, Casey J.; Munn, Mark D.; Wolock, David W.

    2013-01-01

    This report presents a science strategy for the third decade of the National Water-Quality Assessment (NAWQA) Program, which since 1991, has been responsible for providing nationally consistent information on the quality of the Nation's streams and groundwater; how water quality is changing over time; and the major natural and human factors that affect current water quality conditions and trends. The strategy is based on an extensive evaluation of the accomplishments of NAWQA over its first two decades, the current status of water-quality monitoring activities by USGS and its partners, and an updated analysis of stakeholder priorities. The plan is designed to address priority issues and national needs identified by NAWQA stakeholders and the National Research Council (2012) irrespective of budget constraints. This plan describes four major goals for the third decade (Cycle 3), the approaches for monitoring, modeling, and scientific studies, key partnerships required to achieve these goals, and products and outcomes that will result from planned assessment activities. The science plan for 2013–2023 is a comprehensive approach to meet stakeholder priorities for: (1) rebuilding NAWQA monitoring networks for streams, rivers, and groundwater, and (2) upgrading models used to extrapolate and forecast changes in water-quality and stream ecosystem condition in response to changing climate and land use. The Cycle 3 plan continues approaches that have been central to the Program’s long-term success, but adjusts monitoring intensities and study designs to address critical information needs and identified data gaps. Restoration of diminished monitoring networks and new directions in modeling and interpretative studies address growing and evolving public and stakeholder needs for water-quality information and improved management, particularly in the face of increasing challenges related to population growth, increasing demands for water, and changing land use and climate. 
However, a combination of funding growth and extensive collaboration with other USGS programs and other Federal, State, and local agencies, public interest groups, professional and trade associations, academia, and private industry will be needed to fully realize the monitoring and modeling goals laid out in this plan (USGS Fact Sheet 2013-3008).

  11. Service Quality and Customer Satisfaction: An Assessment and Future Directions.

    ERIC Educational Resources Information Center

    Hernon, Peter; Nitecki, Danuta A.; Altman, Ellen

    1999-01-01

    Reviews the literature of library and information science to examine issues related to service quality and customer satisfaction in academic libraries. Discusses assessment, the application of a business model to higher education, a multiple constituency approach, decision areas regarding service quality, resistance to service quality, and future…

  12. Quality of Big Data in Healthcare

    DOE PAGES

    Sukumar, Sreenivas R.; Ramachandran, Natarajan; Ferrell, Regina Kay

    2015-01-01

    The current trend in Big Data Analytics and in particular Health information technology is towards building sophisticated models, methods and tools for business, operational and clinical intelligence, but the critical issue of data quality required for these models is not getting the attention it deserves. The objective of the paper is to highlight the issues of data quality in the context of Big Data Healthcare Analytics.

  13. Quality of Big Data in Healthcare

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sukumar, Sreenivas R.; Ramachandran, Natarajan; Ferrell, Regina Kay

    The current trend in Big Data Analytics and in particular Health information technology is towards building sophisticated models, methods and tools for business, operational and clinical intelligence, but the critical issue of data quality required for these models is not getting the attention it deserves. The objective of the paper is to highlight the issues of data quality in the context of Big Data Healthcare Analytics.

  14. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers.

    PubMed

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R; Weber, Barbara

    2016-05-09

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.

  15. Monitoring and modeling of microbial and biological water quality

    USDA-ARS?s Scientific Manuscript database

    Microbial and biological water quality informs on the health of water systems and their suitability for uses in irrigation, recreation, aquaculture, and other activities. Indicators of microbial and biological water quality demonstrate high spatial and temporal variability. Therefore, monitoring str...

  16. Understanding Employee Awareness of Health Care Quality Information: How Can Employers Benefit?

    PubMed Central

    Abraham, Jean; Feldman, Roger; Carlin, Caroline

    2004-01-01

    Objective To analyze the factors associated with employee awareness of employer-disseminated quality information on providers. Data Sources Primary data were collected in 2002 on a stratified, random sample of 1,365 employees in 16 firms that are members of the Buyers Health Care Action Group (BHCAG) located in the Minneapolis–St. Paul region. An employer survey was also conducted to assess how employers communicated the quality information to employees. Study Design In 2001, BHCAG sponsored two programs for reporting provider quality. We specify employee awareness of the quality information to depend on factors that influence the benefits and costs of search. Factors influencing the benefits include age, sex, provider satisfaction, health status, job tenure, and Twin Cities tenure. Factors influencing search costs include employee income, education, and employer communication strategies. We estimate the model using bivariate probit analysis. Data Collection Employee data were collected by phone survey. Principal Findings Overall, the level of quality information awareness is low. However, employer communication strategies such as distributing booklets to all employees or making them available on request have a large effect on the probability of quality information awareness. Employee education and utilization of providers' services are also positively related to awareness. Conclusions This study is one of the first to investigate employee awareness of provider quality information. Given the direct implications for medical outcomes, one might anticipate higher rates of awareness regarding provider quality, relative to plan quality. However, we do not find empirical evidence to support this assertion. PMID:15533188

  17. The impact of working memory and the “process of process modelling” on model quality: Investigating experienced versus inexperienced modellers

    NASA Astrophysics Data System (ADS)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R.; Weber, Barbara

    2016-05-01

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.

  18. The impact of working memory and the “process of process modelling” on model quality: Investigating experienced versus inexperienced modellers

    PubMed Central

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R.; Weber, Barbara

    2016-01-01

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling. PMID:27157858

  19. Defining Quality in Assisted Living: Comparing Apples, Oranges, and Broccoli

    ERIC Educational Resources Information Center

    Hawes, Catherine; Phillips, Charles D.

    2007-01-01

    Purpose: The purpose of this article is to discuss and describe various measures of quality, quality indicators, and uses of information on quality with specific reference to the role or purpose of assisted living. Design and Methods: We reviewed a variety of major studies of assisted living quality. We elaborated models of assisted living based…

  20. Corn stover harvest increases herbicide movement to subsurface drains – Root Zone Water Quality Model simulations

    USDA-ARS?s Scientific Manuscript database

    BACKGROUND: Removal of crop residues for bioenergy production can alter soil hydrologic properties, but there is little information on its impact on transport of herbicides and their degradation products to subsurface drains. The Root Zone Water Quality Model, previously calibrated using measured fl...

  1. 77 FR 4808 - Conference on Air Quality Modeling

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-31

    ... update our available modeling tools with state-of-the-science techniques and for the public to offer new... C111, 109 T.W. Alexander Drive, Research Triangle Park, NC 27711. FOR FURTHER INFORMATION CONTACT... Quality Assessment Division, Mail Code C439-01, Research Triangle Park, NC 27711; telephone: (919) 541...

  2. USE OF REMOTE SENSING AIR QUALITY INFORMATION IN REGIONAL SCALE AIR POLLUTION MODELING: CURRENT USE AND REQUIREMENTS

    EPA Science Inventory

    In recent years the applications of regional air quality models are continuously being extended to address atmospheric pollution phenomenon from local to hemispheric spatial scales over time scales ranging from episodic to annual. The need to represent interactions between physic...

  3. A combined geostatistical-optimization model for the optimal design of a groundwater quality monitoring network

    NASA Astrophysics Data System (ADS)

    Kolosionis, Konstantinos; Papadopoulou, Maria P.

    2017-04-01

    Monitoring networks provide essential information for water resources management, especially in areas with significant groundwater exploitation due to extensive agricultural activities. In this work, a simulation-optimization framework is developed based on heuristic optimization methodologies and geostatistical modeling approaches to obtain an optimal design for a groundwater quality monitoring network. Groundwater quantity and quality data obtained from 43 existing observation locations at 3 different hydrological periods in the Mires basin in Crete, Greece will be used in the proposed framework in terms of Regression Kriging to develop the spatial distribution of nitrates concentration in the aquifer of interest. Based on the existing groundwater quality mapping, the proposed optimization tool will determine a cost-effective observation wells network that contributes significant information to water managers and authorities. The elimination of observation wells that add little or no beneficial information to groundwater level and quality mapping of the area can be obtained using estimation uncertainty and statistical error metrics without affecting the assessment of the groundwater quality. Given the high maintenance cost of groundwater monitoring networks, the proposed tool could be used by water regulators in the decision-making process to obtain an efficient network design.
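    The core network-reduction idea, dropping the wells whose removal least degrades the quality of the spatial map, can be sketched with a greedy loop. Here mean distance to the nearest remaining well stands in for kriging variance; the coordinates, the budget, and the coverage metric are illustrative assumptions, whereas the paper itself uses Regression Kriging and formal estimation uncertainty.

    ```python
    # Greedy monitoring-network reduction with a distance-based proxy for
    # kriging variance. All values are illustrative assumptions.
    import itertools, math, random

    random.seed(1)
    wells = [(random.random(), random.random()) for _ in range(10)]
    grid = [(x / 9, y / 9) for x, y in itertools.product(range(10), repeat=2)]

    def coverage_error(active):
        """Mean nearest-well distance over the grid (lower is better)."""
        return sum(min(math.dist(g, w) for w in active) for g in grid) / len(grid)

    active = list(wells)
    while len(active) > 6:  # hypothetical 6-well maintenance budget
        # drop the well whose absence increases the error the least
        worst = min(active,
                    key=lambda w: coverage_error([v for v in active if v != w]))
        active.remove(worst)
    print(len(active), coverage_error(active) >= coverage_error(wells))
    ```

    Swapping the proxy for the Regression Kriging prediction variance turns this toy loop into the simulation-optimization framework the abstract describes.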

  4. The Effectiveness of Health Care Information Technologies: Evaluation of Trust, Security Beliefs, and Privacy as Determinants of Health Care Outcomes

    PubMed Central

    2018-01-01

    Background The diffusion of health information technologies (HITs) within the health care sector continues to grow. However, there is no theory explaining how success of HITs influences patient care outcomes. With the increase in data breaches, HITs’ success now hinges on the effectiveness of data protection solutions. Still, empirical research has only addressed privacy concerns, with little regard for other factors of information assurance. Objective The objective of this study was to study the effectiveness of HITs using the DeLone and McLean Information Systems Success Model (DMISSM). We examined the role of information assurance constructs (ie, the role of information security beliefs, privacy concerns, and trust in health information) as measures of HIT effectiveness. We also investigated the relationships between information assurance and three aspects of system success: attitude toward health information exchange (HIE), patient access to health records, and perceived patient care quality. Methods Using structural equation modeling, we analyzed the data from a sample of 3677 cancer patients from a public dataset. We used R software (R Project for Statistical Computing) and the Lavaan package to test the hypothesized relationships. Results Our extension of the DMISSM to health care was supported. We found that increased privacy concerns reduce the frequency of patient access to health records use, positive attitudes toward HIE, and perceptions of patient care quality. Also, belief in the effectiveness of information security increases the frequency of patient access to health records and positive attitude toward HIE. Trust in health information had a positive association with attitudes toward HIE and perceived patient care quality. Trust in health information had no direct effect on patient access to health records; however, it had an indirect relationship through privacy concerns. 
Conclusions Trust in health information and belief in the effectiveness of information security safeguards increase perceptions of patient care quality. Privacy concerns reduce patients’ frequency of accessing health records, patients’ positive attitudes toward HIE, and overall perceived patient care quality. Health care organizations are encouraged to implement security safeguards to increase trust and the frequency of health record use, and to reduce privacy concerns, consequently increasing patient care quality. PMID:29643052

  5. 78 FR 11808 - Approval and Promulgation of Implementation Plans; Tennessee: Approve Knox County Supplemental...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-20

    ... INFORMATION: On December 18, 2012, EPA proposed to approve, through parallel processing, a draft revision to... County to account for changes in the emissions model and vehicle miles traveled projection model. EPA is... submit comments. FOR FURTHER INFORMATION CONTACT: Kelly Sheckler, Air Quality Modeling and Transportation...

  6. A study of undue pain and surfing: using hierarchical criteria to assess website quality.

    PubMed

    Lorence, Daniel; Abraham, Joanna

    2008-09-01

    In studies of web-based consumer health information, scant attention has been paid to the selective development of differential methodologies for website quality evaluation, or to selective grouping and analysis of specific 'domains of uncertainty' in healthcare. Our objective is to introduce a more refined model for website evaluation, and illustrate its application using assessment of websites within an area of ongoing medical uncertainty, back pain. In this exploratory technology assessment, we suggest a model for assessing these 'domains of uncertainty' within healthcare, using qualitative assessment of websites and hierarchical concepts. Using such a hierarchy of quality criteria, we review medical information provided by the most frequently accessed websites related to back pain. Websites are evaluated using standardized criteria, with results rated from the viewpoint of the consumer. Results show that standardization of quality rating across subjective content, and between commercial and niche search results, can provide a consumer-friendly dimension to health information.
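    A hierarchy of quality criteria of the kind the abstract proposes can be represented as weighted dimensions, each scored from its sub-criteria. The dimensions, weights, and scores below are invented for illustration; they are not the instrument used in the study.

    ```python
    # Minimal hierarchical website-quality scoring sketch.
    # Dimensions, weights, and sub-criterion scores are hypothetical.
    criteria = {
        "accuracy":     (0.4, {"cites_sources": 1.0, "peer_reviewed": 0.0}),
        "readability":  (0.3, {"plain_language": 1.0, "layout": 0.5}),
        "transparency": (0.3, {"authorship_shown": 1.0, "ads_disclosed": 0.5}),
    }

    def site_score(criteria):
        """Weighted sum of the mean sub-criterion score per dimension."""
        total = 0.0
        for weight, subs in criteria.values():
            total += weight * (sum(subs.values()) / len(subs))
        return total

    print(round(site_score(criteria), 3))  # overall quality in [0, 1]
    ```

    Scoring every candidate site against the same fixed hierarchy is what makes ratings comparable across subjective content and across commercial versus niche search results.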

  7. Quality evaluation on an e-learning system in continuing professional education of nurses.

    PubMed

    Lin, I-Chun; Chien, Yu-Mei; Chang, I-Chiu

    2006-01-01

    Maintaining high quality in Web-based learning is a powerful means of increasing the overall efficiency and effectiveness of distance learning. Many studies have evaluated Web-based learning but seldom evaluate from the information systems (IS) perspective. This study applied the famous IS Success model in measuring the quality of a Web-based learning system using a Web-based questionnaire for data collection. One hundred and fifty four nurses participated in the survey. Based on confirmatory factor analysis, the variables of the research model fit for measuring the quality of a Web-based learning system. As Web-based education continues to grow worldwide, the results of this study may assist the system adopter (hospital executives), the learner (nurses), and the system designers in making reasonable and informed judgments with regard to the quality of Web-based learning system in continuing professional education.

  8. An Empirical Study of Logistics Organization, Electronic Linkage, and Performance

    DTIC Science & Technology

    1993-01-01

    ...utilization of transportation resources, and improved quality management. Researchers have proposed an information technology (IT) implementation model for ... coordination of (1) facility structure, (2) forecasting and order management, (3) transportation, (4) inventory, and (5) warehousing and packaging.

  9. Overview and Evaluation of a Smoke Modeling System and other Tools used during Wildfire Incident Deployments

    NASA Astrophysics Data System (ADS)

    ONeill, S. M.; Larkin, N. K.; Martinez, M.; Rorig, M.; Solomon, R. C.; Dubowy, J.; Lahm, P. W.

    2017-12-01

    Specialists operationally deployed to wildfires use many tools and sources of information to forecast expected smoke conditions for the public. These Air Resource Advisors (ARAs) are deployed as part of the Wildland Fire Air Quality Response Program (WFAQRP) and rely on smoke models, monitoring data, meteorological information, and satellite information to produce daily Smoke Outlooks for a region impacted by smoke from wildfires. These Smoke Outlooks are distributed to air quality and health agencies, published online via smoke blogs and other social media, and distributed by the Incident Public Information Officer (PIO), and ultimately to the public. Fundamental to these operations are smoke modeling systems such as the BlueSky Smoke Modeling Framework, which combines fire activity information, mapped fuel loadings, consumption and emissions models, and air quality/dispersion models such as HYSPLIT to produce predictions of PM2.5 concentrations downwind of wildland fires. Performance of this system at a variety of meteorological resolutions, fire initialization information, and vertical allocation of emissions is evaluated for the summer of 2015, when over 400,000 hectares burned in the northwestern US state of Washington and 1-hr average fine particulate matter (PM2.5) concentrations exceeded 700 μg/m3. The performance of the system at the 12-km, 4-km, and 1.33-km resolutions is evaluated using 1-hr average PM2.5 measurements from permanent monitors and temporary monitors deployed specifically for wildfires by ARAs on wildfire incident command teams. At the higher meteorological resolution (1.33-km) the terrain features are more detailed, showing better valley structures, and in general PM2.5 concentrations were greater in the valleys with the 1.33-km meteorological domain than with the 4-km domain.
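    The resolution comparison described above typically reduces to standard verification statistics against monitor observations. A minimal sketch, using synthetic numbers rather than the 2015 monitor data:

    ```python
    # Bias/RMSE evaluation of PM2.5 predictions at two grid resolutions.
    # Observation and prediction values are synthetic placeholders.
    import math

    obs         = [12.0, 45.0, 230.0, 88.0, 15.0]   # monitor PM2.5 (ug/m3)
    pred_4km    = [10.0, 30.0, 120.0, 60.0, 14.0]
    pred_1_33km = [11.0, 40.0, 190.0, 75.0, 15.0]

    def bias(pred, obs):
        """Mean prediction error; negative means underprediction."""
        return sum(p - o for p, o in zip(pred, obs)) / len(obs)

    def rmse(pred, obs):
        """Root-mean-square error, penalizing large misses."""
        return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

    for name, pred in [("4 km", pred_4km), ("1.33 km", pred_1_33km)]:
        print(name, round(bias(pred, obs), 1), round(rmse(pred, obs), 1))
    ```

    In this synthetic example the finer grid underpredicts less, mirroring the abstract's finding that the 1.33-km domain better resolves valley concentrations.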

  10. Hybrid Air Quality Modeling Approach For Use in the Near ...

    EPA Pesticide Factsheets

    The Near-road EXposures to Urban air pollutant Study (NEXUS) investigated whether children with asthma living in close proximity to major roadways in Detroit, MI, (particularly near roadways with high diesel traffic) have greater health impacts associated with exposure to air pollutants than those living farther away. A major challenge in such health and exposure studies is the lack of information regarding pollutant exposure characterization. Air quality modeling can provide spatially and temporally varying exposure estimates for examining relationships between traffic-related air pollutants and adverse health outcomes. This paper presents a hybrid air quality modeling approach and its application in NEXUS in order to provide spatial and temporally varying exposure estimates and identification of the mobile source contribution to the total pollutant exposure. Model-based exposure metrics, associated with local variations of emissions and meteorology, were estimated using a combination of the AERMOD and R-LINE dispersion models, local emission source information from the National Emissions Inventory, detailed road network locations and traffic activity, and meteorological data from the Detroit City Airport. The regional background contribution was estimated using a combination of the Community Multiscale Air Quality (CMAQ) model and the Space/Time Ordinary Kriging (STOK) model. To capture the near-road pollutant gradients, refined “mini-grids” of model recep
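    The hybrid decomposition the abstract describes, a local dispersion-model contribution plus a regional background, reduces at each receptor to a simple sum from which the mobile-source share falls out. The receptor names and concentration values below are invented placeholders, not NEXUS outputs.

    ```python
    # Hybrid exposure estimate: local near-road contribution (e.g. from a
    # dispersion model) plus regional background (e.g. from a gridded model).
    # All concentrations are hypothetical.
    local_mobile = {"receptor_1": 3.2, "receptor_2": 7.9}   # ug/m3, near-road
    background   = {"receptor_1": 9.1, "receptor_2": 9.4}   # ug/m3, regional

    total = {r: local_mobile[r] + background[r] for r in local_mobile}
    mobile_share = {r: local_mobile[r] / total[r] for r in total}
    print(round(total["receptor_2"], 1), round(mobile_share["receptor_2"], 2))
    ```

    Separating the two terms is what lets the study attribute part of each child's total exposure specifically to mobile sources.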

  11. Applications of MIDAS regression in analysing trends in water quality

    NASA Astrophysics Data System (ADS)

    Penev, Spiridon; Leonte, Daniela; Lazarov, Zdravetz; Mann, Rob A.

    2014-04-01

    We discuss novel statistical methods in analysing trends in water quality. Such analysis uses complex data sets of different classes of variables, including water quality, hydrological and meteorological. We analyse the effect of rainfall and flow on trends in water quality utilising a flexible model called Mixed Data Sampling (MIDAS). This model arises because of the mixed frequency in the data collection. Typically, water quality variables are sampled fortnightly, whereas the rain data is sampled daily. The advantage of using MIDAS regression is in the flexible and parsimonious modelling of the influence of the rain and flow on trends in water quality variables. We discuss the model and its implementation on a data set from the Shoalhaven Supply System and Catchments in the state of New South Wales, Australia. Information criteria indicate that MIDAS modelling improves upon simplistic approaches that do not utilise the mixed data sampling nature of the data.
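    The mixed-frequency step in MIDAS is usually handled by collapsing the high-frequency lags through a low-dimensional weight function, commonly the two-parameter exponential Almon polynomial, so daily rainfall enters a fortnightly regression as a single regressor. The parameter values and rain series below are illustrative, not estimates from the Shoalhaven data.

    ```python
    # Exponential Almon lag weights, the usual MIDAS weighting scheme.
    # theta values and the daily rain series are illustrative assumptions.
    import math

    def almon_weights(theta1, theta2, n_lags):
        """Normalized exponential Almon weights over n_lags daily lags."""
        raw = [math.exp(theta1 * k + theta2 * k * k) for k in range(n_lags)]
        s = sum(raw)
        return [r / s for r in raw]  # weights are positive and sum to 1

    # 14 daily rainfall lags aggregated into one regressor per fortnight
    w = almon_weights(0.1, -0.05, 14)
    daily_rain = [0, 5, 12, 0, 0, 3, 8, 0, 0, 0, 1, 0, 0, 2]
    midas_regressor = sum(wk * r for wk, r in zip(w, daily_rain))
    print(abs(sum(w) - 1.0) < 1e-12, midas_regressor > 0)
    ```

    Only the two theta parameters are estimated, which is what makes the MIDAS treatment of daily rain both flexible and parsimonious.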

  12. Hydropower Optimization Using Artificial Neural Network Surrogate Models of a High-Fidelity Hydrodynamics and Water Quality Model

    NASA Astrophysics Data System (ADS)

    Shaw, Amelia R.; Smith Sawyer, Heather; LeBoeuf, Eugene J.; McDonald, Mark P.; Hadjerioua, Boualem

    2017-11-01

    Hydropower operations optimization subject to environmental constraints is limited by challenges associated with dimensionality and spatial and temporal resolution. The need for high-fidelity hydrodynamic and water quality models within optimization schemes is driven by improved computational capabilities, increased requirements to meet specific points of compliance with greater resolution, and the need to optimize operations of not just single reservoirs but systems of reservoirs. This study describes an important advancement for computing hourly power generation schemes for a hydropower reservoir using high-fidelity models, surrogate modeling techniques, and optimization methods. The predictive power of the high-fidelity hydrodynamic and water quality model CE-QUAL-W2 is successfully emulated by an artificial neural network, then integrated into a genetic algorithm optimization approach to maximize hydropower generation subject to constraints on dam operations and water quality. This methodology is applied to a multipurpose reservoir near Nashville, Tennessee, USA. The model successfully reproduced high-fidelity reservoir information while enabling 6.8% and 6.6% increases in hydropower production value relative to actual operations for dissolved oxygen (DO) limits of 5 and 6 mg/L, respectively, while witnessing an expected decrease in power generation at more restrictive DO constraints. Exploration of simultaneous temperature and DO constraints revealed capability to address multiple water quality constraints at specified locations. The reduced computational requirements of the new modeling approach demonstrated an ability to provide decision support for reservoir operations scheduling while maintaining high-fidelity hydrodynamic and water quality information as part of the optimization decision support routines.
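    The surrogate-plus-genetic-algorithm loop can be caricatured in a few lines: a cheap stand-in surrogate maps an operating decision to (power, dissolved oxygen), and a small GA searches for the decision that maximizes power subject to a DO floor. The surrogate formula, GA settings, and constraint values are invented; in the study the surrogate is an artificial neural network trained on CE-QUAL-W2 output.

    ```python
    # Toy GA maximizing power from a stand-in surrogate under a DO constraint.
    # Surrogate, parameters, and constraint are illustrative assumptions.
    import random

    random.seed(0)
    DO_LIMIT = 5.0  # mg/L minimum dissolved oxygen at the compliance point

    def surrogate(flow):            # flow: fraction of turbine capacity, 0..1
        power = 100.0 * flow         # MW, assumed increasing in flow
        do = 8.0 - 4.0 * flow        # DO assumed to drop as more water is drawn
        return power, do

    def fitness(flow):
        power, do = surrogate(flow)
        return power if do >= DO_LIMIT else -1.0   # penalize infeasible schedules

    pop = [random.random() for _ in range(30)]
    for _ in range(50):                            # generations
        pop.sort(key=fitness, reverse=True)
        parents = pop[:10]                         # elitist selection
        pop = parents + [min(1.0, max(0.0, random.choice(parents)
                                      + random.gauss(0, 0.05)))
                         for _ in range(20)]       # mutated offspring
    best = max(pop, key=fitness)
    print(round(best, 2))  # should settle near the DO-constrained optimum
    ```

    The point of the surrogate is speed: once the neural network emulates the high-fidelity model, the GA can evaluate thousands of candidate schedules in the time one CE-QUAL-W2 run would take.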

  13. Hydropower Optimization Using Artificial Neural Network Surrogate Models of a High-Fidelity Hydrodynamics and Water Quality Model

    DOE PAGES

    Shaw, Amelia R.; Sawyer, Heather Smith; LeBoeuf, Eugene J.; ...

    2017-10-24

    Hydropower operations optimization subject to environmental constraints is limited by challenges associated with dimensionality and spatial and temporal resolution. The need for high-fidelity hydrodynamic and water quality models within optimization schemes is driven by improved computational capabilities, increased requirements to meet specific points of compliance with greater resolution, and the need to optimize operations of not just single reservoirs but systems of reservoirs. This study describes an important advancement for computing hourly power generation schemes for a hydropower reservoir using high-fidelity models, surrogate modeling techniques, and optimization methods. The predictive power of the high-fidelity hydrodynamic and water quality model CE-QUAL-W2 is successfully emulated by an artificial neural network, then integrated into a genetic algorithm optimization approach to maximize hydropower generation subject to constraints on dam operations and water quality. This methodology is applied to a multipurpose reservoir near Nashville, Tennessee, USA. The model successfully reproduced high-fidelity reservoir information while enabling 6.8% and 6.6% increases in hydropower production value relative to actual operations for dissolved oxygen (DO) limits of 5 and 6 mg/L, respectively, while witnessing an expected decrease in power generation at more restrictive DO constraints. Exploration of simultaneous temperature and DO constraints revealed capability to address multiple water quality constraints at specified locations. Here, the reduced computational requirements of the new modeling approach demonstrated an ability to provide decision support for reservoir operations scheduling while maintaining high-fidelity hydrodynamic and water quality information as part of the optimization decision support routines.

  14. Hydropower Optimization Using Artificial Neural Network Surrogate Models of a High-Fidelity Hydrodynamics and Water Quality Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shaw, Amelia R.; Sawyer, Heather Smith; LeBoeuf, Eugene J.

    Hydropower operations optimization subject to environmental constraints is limited by challenges associated with dimensionality and spatial and temporal resolution. The need for high-fidelity hydrodynamic and water quality models within optimization schemes is driven by improved computational capabilities, increased requirements to meet specific points of compliance with greater resolution, and the need to optimize operations of not just single reservoirs but systems of reservoirs. This study describes an important advancement for computing hourly power generation schemes for a hydropower reservoir using high-fidelity models, surrogate modeling techniques, and optimization methods. The predictive power of the high-fidelity hydrodynamic and water quality model CE-QUAL-W2 is successfully emulated by an artificial neural network, then integrated into a genetic algorithm optimization approach to maximize hydropower generation subject to constraints on dam operations and water quality. This methodology is applied to a multipurpose reservoir near Nashville, Tennessee, USA. The model successfully reproduced high-fidelity reservoir information while enabling 6.8% and 6.6% increases in hydropower production value relative to actual operations for dissolved oxygen (DO) limits of 5 and 6 mg/L, respectively, while witnessing an expected decrease in power generation at more restrictive DO constraints. Exploration of simultaneous temperature and DO constraints revealed capability to address multiple water quality constraints at specified locations. Here, the reduced computational requirements of the new modeling approach demonstrated an ability to provide decision support for reservoir operations scheduling while maintaining high-fidelity hydrodynamic and water quality information as part of the optimization decision support routines.

  15. Impact of Information Technology, Clinical Resource Constraints, and Patient-Centered Practice Characteristics on Quality of Care.

    PubMed

    Baek, JongDeuk; Seidman, Robert L

    2015-01-01

    Factors in the practice environment, such as health information technology (IT) infrastructure, availability of other clinical resources, and financial incentives, may influence whether practices are able to successfully implement the patient-centered medical home (PCMH) model and realize its benefits. This study investigates the impacts of those PCMH-related elements on primary care physicians' perception of quality of care. A multiple logistic regression model was estimated using the 2004 to 2005 CTS Physician Survey, a national sample of salaried primary care physicians (n = 1733). The patient-centered practice environment and availability of clinical resources increased physicians' perceived quality of care. Although IT use for clinical information access did enhance physicians' ability to provide high-quality care, a similar positive impact of IT use was not found for e-prescribing or the exchange of clinical patient information. Lack of resources was negatively associated with physician perception of quality of care. Since health IT is an important foundation of the PCMH, patient-centered practices are more likely to have health IT in place to support care delivery. However, despite its potential to enhance delivery of primary care, simply making health IT available does not necessarily translate into physicians' perceptions that it enhances the quality of care they provide. It is critical for health-care managers and policy makers to ensure that primary care physicians fully recognize and embrace the use of new technology to improve both the quality of care provided and patient outcomes.
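    As a purely illustrative aside, a multiple logistic regression of the kind estimated in this study can be fit by gradient ascent on the log-likelihood. Everything below is synthetic: the predictor names, outcome, and coefficients are invented, and only the sample size (n = 1733) is taken from the abstract.

```python
import numpy as np

# Synthetic data: three invented practice-environment predictors
# (it_use, resources, pcmh_environment) and a binary perceived-quality outcome.
rng = np.random.default_rng(0)
n = 1733                                   # sample size reported in the abstract
X = rng.normal(size=(n, 3))
true_beta = np.array([0.1, 0.8, 0.6])      # fabricated effect sizes
logits = X @ true_beta - 0.2
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(float)

# Fit by batch gradient ascent on the mean log-likelihood
Xb = np.hstack([np.ones((n, 1)), X])       # prepend an intercept column
beta = np.zeros(4)
for _ in range(2000):
    p = 1 / (1 + np.exp(-(Xb @ beta)))     # predicted probabilities
    beta += 0.01 * Xb.T @ (y - p) / n      # gradient step

odds_ratios = np.exp(beta[1:])             # exponentiated coefficients
print(np.round(odds_ratios, 2))
```

Reading the exponentiated coefficients as odds ratios is the usual way such survey models are reported: values above 1 indicate a predictor associated with higher odds of perceiving high quality of care.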

  16. Exploring network operations for data and information networks

    NASA Astrophysics Data System (ADS)

    Yao, Bing; Su, Jing; Ma, Fei; Wang, Xiaomin; Zhao, Xiyang; Yao, Ming

    2017-01-01

    Barabási and Albert, in 1999, formulated scale-free models based on real networks: the World-Wide Web, the Internet, metabolic and protein networks, and language and sexual networks. Scale-free networks not only appear all around us, they are also among the highest-quality networks in existence. High-quality information networks can transfer data feasibly and efficiently, so their topological structures are clearly important for data safety. We build up network operations for constructing large-scale dynamic networks from smaller network models that have good properties and high quality. We focus on the simplest operators for formulating complex operations, and we are interested in how closely the resulting operations approach desired network properties.

  17. Information quantity and quality affect the realistic accuracy of personality judgment.

    PubMed

    Letzring, Tera D; Wells, Shannon M; Funder, David C

    2006-07-01

    Triads of unacquainted college students interacted in 1 of 5 experimental conditions that manipulated information quantity (amount of information) and information quality (relevance of information to personality), and they then made judgments of each other's personalities. To determine accuracy, the authors compared the ratings of each judge to a broad-based accuracy criterion composed of personality ratings from 3 types of knowledgeable informants (the self, real-life acquaintances, and clinician-interviewers). Results supported the hypothesis that information quantity and quality would be positively related to objective knowledge about the targets and realistic accuracy. Interjudge consensus and self-other agreement followed a similar pattern. These findings are consistent with expectations based on models of the process of accurate judgment (D. C. Funder, 1995, 1999) and consensus (D. A. Kenny, 1994). Copyright 2006 APA, all rights reserved.

  18. MO-C-18A-01: Advances in Model-Based 3D Image Reconstruction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, G; Pan, X; Stayman, J

    2014-06-15

    Recent years have seen the emergence of CT image reconstruction techniques that exploit physical models of the imaging system, photon statistics, and even the patient to achieve improved 3D image quality and/or reduction of radiation dose. With numerous advantages in comparison to conventional 3D filtered backprojection, such techniques bring a variety of challenges as well, including: a demanding computational load associated with sophisticated forward models and iterative optimization methods; nonlinearity and nonstationarity in image quality characteristics; a complex dependency on multiple free parameters; and the need to understand how best to incorporate prior information (including patient-specific prior images) within the reconstruction process. The advantages, however, are even greater – for example: improved image quality; reduced dose; robustness to noise and artifacts; task-specific reconstruction protocols; suitability to novel CT imaging platforms and noncircular orbits; and incorporation of known characteristics of the imager and patient that are conventionally discarded. This symposium features experts in 3D image reconstruction, image quality assessment, and the translation of such methods to emerging clinical applications. Dr. Chen will address novel methods for the incorporation of prior information in 3D and 4D CT reconstruction techniques. Dr. Pan will show recent advances in optimization-based reconstruction that enable potential reduction of dose and sampling requirements. Dr. Stayman will describe a “task-based imaging” approach that leverages models of the imaging system and patient in combination with a specification of the imaging task to optimize both the acquisition and reconstruction process. Dr. Samei will describe the development of methods for image quality assessment in such nonlinear reconstruction techniques and the use of these methods to characterize and optimize image quality and dose in a spectrum of clinical applications. Learning Objectives: Learn the general methodologies associated with model-based 3D image reconstruction. Learn the potential advantages in image quality and dose associated with model-based image reconstruction. Learn the challenges associated with computational load and image quality assessment for such reconstruction methods. Learn how imaging task can be incorporated as a means to drive optimal image acquisition and reconstruction techniques. Learn how model-based reconstruction methods can incorporate prior information to improve image quality, ease sampling requirements, and reduce dose.

  19. Genome-Wide Association Mapping for Kernel and Malting Quality Traits Using Historical European Barley Records

    PubMed Central

    Röder, Marion S.; van Eeuwijk, Fred

    2014-01-01

    Malting quality is an important trait in breeding barley (Hordeum vulgare L.). It requires elaborate, expensive phenotyping, which involves micro-malting experiments. Although there is abundant historical information available for different cultivars in different years and trials, that historical information is not often used in genetic analyses. This study aimed to exploit historical records to assist in identifying genomic regions that affect malting and kernel quality traits in barley. This genome-wide association study utilized information on grain yield and 18 quality traits accumulated over 25 years on 174 European spring and winter barley cultivars combined with diversity array technology markers. Marker-trait associations were tested with a mixed linear model. This model took into account the genetic relatedness between cultivars based on principal components scores obtained from marker information. We detected 140 marker-trait associations. Some of these associations confirmed previously known quantitative trait loci for malting quality (on chromosomes 1H, 2H, and 5H). Other associations were reported for the first time in this study. The genetic correlations between traits are discussed in relation to the chromosomal regions associated with the different traits. This approach is expected to be particularly useful when designing strategies for multiple trait improvements. PMID:25372869
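    The association-testing step can be sketched as follows. This is a simplified fixed-effects stand-in for the mixed linear model used in the study: principal-component scores enter as covariates to control for relatedness, and each marker is tested in turn. All data here are simulated; the marker count, effect sizes, and significance threshold are invented, and only the cultivar count (174) comes from the abstract.

```python
import numpy as np

# Simulated genotypes and trait: regress a quality trait on each marker with
# principal-component scores as covariates to control for population structure.
rng = np.random.default_rng(42)
n_cultivars, n_markers = 174, 200          # 174 cultivars as in the study
markers = rng.integers(0, 2, size=(n_cultivars, n_markers)).astype(float)
pcs = rng.normal(size=(n_cultivars, 2))    # structure covariates
# Marker 0 is given a real effect; the rest are null.
trait = (1.5 * markers[:, 0] + pcs @ np.array([0.5, -0.3])
         + rng.normal(size=n_cultivars))

def marker_t_stat(j):
    # Design matrix: intercept, the j-th marker, and the PC covariates
    X = np.column_stack([np.ones(n_cultivars), markers[:, j], pcs])
    beta, *_ = np.linalg.lstsq(X, trait, rcond=None)
    resid = trait - X @ beta
    sigma2 = resid @ resid / (n_cultivars - X.shape[1])
    cov = sigma2 * np.linalg.inv(X.T @ X)
    return beta[1] / np.sqrt(cov[1, 1])    # t statistic for the marker effect

t_stats = np.array([marker_t_stat(j) for j in range(n_markers)])
hits = np.flatnonzero(np.abs(t_stats) > 4)  # crude significance threshold
print(hits)
```

In a real genome-wide scan the threshold would be corrected for multiple testing, and the mixed model would carry a kinship-based random effect rather than (or in addition to) PC covariates.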

  20. Developing hydrological model for water quality in Iraq marshes zone using Landsat-TM

    NASA Astrophysics Data System (ADS)

    Marghany, Maged; Hasab, Hashim Ali; Mansor, Shattri; Shariff, Abdul Rashid Bin Mohamed

    2016-06-01

    The Mesopotamian marshlands constitute the largest wetland ecosystem in the Middle East and Western Eurasia. These wetlands are located at the confluence of the Tigris and Euphrates Rivers in southern Iraq. However, the wetland zones have been seriously reduced because neighbouring countries, i.e., Turkey and Syria, built dams upstream on the Tigris and Euphrates Rivers. In addition, the first Gulf war of the 1980s damaged the majority of the marshes' resources. In fact, the marshes had been reduced to less than 7% of their 1973 extent and their water quality parameters had deteriorated. The study integrates the hydrological model RMA-2 with Geographic Information System and remote sensing techniques to map water quality in the marshlands of southern Iraq. This study shows that RMA-2 captures the two-dimensional water flow pattern and water quality quantities in the marshlands. The integration of the RMA-2 hydrological model, Geographic Information System, and remote sensing techniques can therefore be used to monitor water quality in the marshlands of southern Iraq.

  1. Care zoning in a psychiatric intensive care unit: strengthening ongoing clinical risk assessment.

    PubMed

    Mullen, Antony; Drinkwater, Vincent; Lewin, Terry J

    2014-03-01

    To implement and evaluate the care zoning model in an eight-bed psychiatric intensive care unit and, specifically, to examine the model's ability to improve the documentation and communication of clinical risk assessment and management. Care zoning guides nurses in assessing clinical risk and planning care within a mental health context. Concerns about the varying quality of clinical risk assessment prompted a trial of the care zoning model in a psychiatric intensive care unit within a regional mental health facility. The care zoning model assigns patients to one of 3 'zones' according to their clinical risk, encouraging nurses to document and implement targeted interventions required to manage those risks. An implementation trial framework was used for this research to refine, implement and evaluate the impact of the model on nurses' clinical practice within the psychiatric intensive care unit, predominantly as a quality improvement initiative. The model was trialled for three months using a pre- and postimplementation staff survey, a pretrial file audit and a weekly file audit. Informal staff feedback was also sought via surveys and regular staff meetings. This trial demonstrated improvement in the quality of mental state documentation, and clinical risk information was identified more accurately. There was limited improvement in the quality of care planning and the documentation of clinical interventions. Nurses' initial concerns over the introduction of the model shifted into overall acceptance and recognition of the benefits. The results of this trial demonstrate that the care zoning model was able to improve the consistency and quality of risk assessment information documented. Care planning and evaluation of associated outcomes showed less improvement. Care zoning remains a highly applicable model for the psychiatric intensive care unit environment and is a useful tool in guiding nurses to carry out routine patient risk assessments. 
© 2013 John Wiley & Sons Ltd.

  2. Designing Excellence and Quality Model for Training Centers of Primary Health Care: A Delphi Method Study.

    PubMed

    Tabrizi, Jafar-Sadegh; Farahbakhsh, Mostafa; Shahgoli, Javad; Rahbar, Mohammad Reza; Naghavi-Behzad, Mohammad; Ahadi, Hamid-Reza; Azami-Aghdash, Saber

    2015-10-01

    Excellence and quality models are comprehensive methods for improving the quality of healthcare. The aim of this study was to design an excellence and quality model for training centers of primary health care using the Delphi method. First, comprehensive information was collected through a literature review. From the extracted references, 39 models from 34 countries were identified, and related sub-criteria and standards were extracted from 34 of these 39 primary models. A primary pattern including 8 criteria, 55 sub-criteria, and 236 standards was then developed as a Delphi questionnaire and evaluated in four stages by 9 specialists of the health care system in Tabriz and 50 specialists from all around the country. After the four stages of evaluation by the specialists, the designed primary model (8 criteria, 55 sub-criteria, and 236 standards) was concluded with 8 criteria, 45 sub-criteria, and 192 standards. The major criteria of the model are leadership, strategic and operational planning, resource management, information analysis, human resources management, process management, customer results, and functional results, to which the specialists assigned a total score of 1000. Functional results had the maximum score of 195 whereas planning had the minimum score of 60. Furthermore, leadership had the most sub-criteria (10) and strategic planning the fewest (3). The model introduced in this research was designed following 34 reference models from around the world. This model could provide a proper frame for managers of the health system in improving quality.

  3. Performance evaluation of public hospital information systems by the information system success model.

    PubMed

    Cho, Kyoung Won; Bae, Sung-Kwon; Ryu, Ji-Hye; Kim, Kyeong Na; An, Chang-Ho; Chae, Young Moon

    2015-01-01

    This study evaluated the performance of the newly developed information system (IS) implemented on July 1, 2014 at three public hospitals in Korea. User satisfaction scores on twelve key performance indicators of six IS success factors, based on the DeLone and McLean IS Success Model, were utilized to evaluate IS performance before and after the newly developed system was introduced. All scores increased after system introduction except for the completeness of medical records and impact on the clinical environment. The relationships among the six IS factors were also analyzed to identify the important factors influencing three IS success factors (Intention to Use, User Satisfaction, and Net Benefits). All relationships were significant except for those among Service Quality, Intention to Use, and Net Benefits. The results suggest that hospitals should not only focus on system and information quality; rather, they should also continuously improve service quality to improve user satisfaction and eventually realize the full potential of IS performance.

  4. A Socio-technical assessment of the success of picture archiving and communication systems: the radiology technologist’s perspective

    PubMed Central

    2013-01-01

    Background With the increasing prevalence of Picture Archiving and Communication Systems (PACS) in healthcare institutions, there is a growing need to measure their success. However, there is a lack of published literature emphasizing the technical and social factors underlying a successful PACS. Methods An updated Information Systems Success Model was utilized by radiology technologists (RTs) to evaluate the success of PACS at a large medical center in Taiwan. A survey, consisting of 109 questionnaires, was analyzed by Structural Equation Modeling. Results Socio-technical factors (including system quality, information quality, service quality, perceived usefulness, user satisfaction, and PACS dependence) were proven to be effective measures of PACS success. Although the relationship between service quality and perceived usefulness was not significant, other proposed relationships amongst the six measurement parameters of success were all confirmed. Conclusions Managers have an obligation to improve the attributes of PACS. At the onset of its deployment, RTs will have formed their own subjective opinions with regards to its quality (system quality, information quality, and service quality). As these personal concepts are either refuted or reinforced based on personal experiences, RTs will become either satisfied or dissatisfied with PACS, based on their perception of its usefulness or lack of usefulness. A satisfied RT may play a pivotal role in the implementation of PACS in the future. PMID:24053458

  5. FACILITATING ADVANCED URBAN METEOROLOGY AND AIR QUALITY MODELING CAPABILITIES WITH HIGH RESOLUTION URBAN DATABASE AND ACCESS PORTAL TOOLS

    EPA Science Inventory

    Information of urban morphological features at high resolution is needed to properly model and characterize the meteorological and air quality fields in urban areas. We describe a new project called National Urban Database with Access Portal Tool, (NUDAPT) that addresses this nee...

  6. Measuring Perceptual (In) Congruence between Information Service Providers and Users

    ERIC Educational Resources Information Center

    Boyce, Crystal

    2017-01-01

    Library quality is no longer evaluated solely on the value of its collections, as user perceptions of service quality play an increasingly important role in defining overall library value. This paper presents a retooling of the LibQUAL+ survey instrument, blending the gap measurement model with perceptual congruence model studies from information…

  7. Evaluating the success of an emergency response medical information system.

    PubMed

    Petter, Stacie; Fruhling, Ann

    2011-07-01

    STATPack™ is an information system used to aid in the diagnosis of pathogens in hospitals and state public health laboratories. STATPack™ is used as a communication and telemedicine diagnosis tool during emergencies. This paper explores the success of this emergency response medical information system (ERMIS) using a well-known framework of information systems success developed by DeLone and McLean. Using an online survey, the entire population of STATPack™ users evaluated the success of the information system by considering system quality, information quality, system use, intention to use, user satisfaction, individual impact, and organizational impact. The results indicate that the overall quality of this ERMIS (i.e., system quality, information quality, and service quality) has a positive impact on both user satisfaction and intention to use the system. However, given the nature of ERMIS, overall quality does not necessarily predict use of the system. Moreover, the user's satisfaction with the information system positively affected the intention to use the system. User satisfaction, intention to use, and system use had a positive influence on the system's impact on the individual. Finally, the organizational impacts of the system were positively influenced by use of the system and the system's individual impact on the user. The results of the study demonstrate how to evaluate the success of an ERMIS as well as introduce potential changes in how one applies the DeLone and McLean success model in an emergency response medical information system context. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  8. Congestion Prediction Modeling for Quality of Service Improvement in Wireless Sensor Networks

    PubMed Central

    Lee, Ga-Won; Lee, Sung-Young; Huh, Eui-Nam

    2014-01-01

    Information technology (IT) is pushing ahead with drastic reforms of modern life for the improvement of human welfare. Objects constitute “Information Networks” through smart, self-regulated information gathering that also recognizes and controls current information states in Wireless Sensor Networks (WSNs). Information observed from sensor networks in real time is used to increase quality of life (QoL) in various industries and daily life. One of the key challenges of WSNs is how to achieve lossless data transmission. Although sensor nodes nowadays have enhanced capacities, it is hard to assure lossless and reliable end-to-end data transmission in WSNs due to unstable wireless links and limited hardware resources, which makes it difficult to satisfy high quality of service (QoS) requirements. We propose a node and path traffic prediction model to predict and minimize congestion. This solution includes prediction of packet generation due to network congestion from both periodic and event data generation. Simulation using NS-2 and Matlab is used to demonstrate the effectiveness of the proposed solution. PMID:24784035

  9. Modelling health care processes for eliciting user requirements: a way to link a quality paradigm and clinical information system design.

    PubMed

    Staccini, P; Joubert, M; Quaranta, J F; Fieschi, D; Fieschi, M

    2000-01-01

    Hospital information systems have to support quality improvement objectives. The design issues of health care information systems can be classified into three categories: 1) time-oriented and event-labelled storage of patient data; 2) contextual support of decision-making; 3) capabilities for modular upgrading. The elicitation of requirements has to meet users' needs in relation to both the quality (efficacy, safety) and the monitoring of all health care activities (traceability). Information analysts need methods to conceptualize clinical information systems that provide actors with individual benefits and guide behavioural changes. A methodology is proposed to elicit and structure users' requirements using a process-oriented analysis, and it is applied to the field of blood transfusion. An object-oriented data model of a process has been defined in order to identify its main components: activity, sub-process, resources, constraints, guidelines, parameters and indicators. Although some aspects of activity, such as "where", "what else", and "why", are poorly represented by the data model alone, this method of requirement elicitation fits the dynamics of data input for the process to be traced. A hierarchical representation of hospital activities has to be found for this approach to be generalised within the organisation, for the processes to be interrelated, and for their characteristics to be shared.
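    A hypothetical rendering of the process components named above (activity, sub-process, resources, constraints, guidelines, parameters, indicators) might look like the following. The class and field names are illustrative, not the authors' actual schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Activity:
    name: str
    resources: List[str] = field(default_factory=list)
    constraints: List[str] = field(default_factory=list)
    guidelines: List[str] = field(default_factory=list)
    parameters: dict = field(default_factory=dict)
    indicators: dict = field(default_factory=dict)

@dataclass
class Process:
    name: str
    activities: List[Activity] = field(default_factory=list)
    subprocesses: List["Process"] = field(default_factory=list)

    def trace(self):
        """Flatten the hierarchy so every activity can be audited (traceability)."""
        for a in self.activities:
            yield self.name, a.name
        for sp in self.subprocesses:
            yield from sp.trace()

# Invented blood-transfusion example in the spirit of the application field
transfusion = Process("blood transfusion", activities=[
    Activity("order blood unit", resources=["physician"],
             constraints=["patient consent recorded"]),
], subprocesses=[
    Process("compatibility testing",
            activities=[Activity("crossmatch", resources=["lab tech"])]),
])
print(list(transfusion.trace()))
```

The recursive `trace` method reflects the hierarchical representation the authors call for: processes nest sub-processes, and flattening the tree yields the audit trail needed for traceability.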

  10. The Triangle Model for evaluating the effect of health information technology on healthcare quality and safety

    PubMed Central

    Kern, Lisa M; Abramson, Erika; Kaushal, Rainu

    2011-01-01

    With the proliferation of relatively mature health information technology (IT) systems with large numbers of users, it becomes increasingly important to evaluate the effect of these systems on the quality and safety of healthcare. Previous research on the effectiveness of health IT has had mixed results, which may be in part attributable to the evaluation frameworks used. The authors propose a model for evaluation, the Triangle Model, developed for designing studies of quality and safety outcomes of health IT. This model identifies structure-level predictors, including characteristics of: (1) the technology itself; (2) the provider using the technology; (3) the organizational setting; and (4) the patient population. In addition, the model outlines process predictors, including (1) usage of the technology, (2) organizational support for and customization of the technology, and (3) organizational policies and procedures about quality and safety. The Triangle Model specifies the variables to be measured, but is flexible enough to accommodate both qualitative and quantitative approaches to capturing them. The authors illustrate this model, which integrates perspectives from both health services research and biomedical informatics, with examples from evaluations of electronic prescribing, but it is also applicable to a variety of types of health IT systems. PMID:21857023

  11. An evaluation of the NQF Quality Data Model for representing Electronic Health Record driven phenotyping algorithms.

    PubMed

    Thompson, William K; Rasmussen, Luke V; Pacheco, Jennifer A; Peissig, Peggy L; Denny, Joshua C; Kho, Abel N; Miller, Aaron; Pathak, Jyotishman

    2012-01-01

    The development of Electronic Health Record (EHR)-based phenotype selection algorithms is a non-trivial and highly iterative process involving domain experts and informaticians. To make it easier to port algorithms across institutions, it is desirable to represent them using an unambiguous formal specification language. For this purpose we evaluated the recently developed National Quality Forum (NQF) information model designed for EHR-based quality measures: the Quality Data Model (QDM). We selected 9 phenotyping algorithms that had been previously developed as part of the eMERGE consortium and translated them into QDM format. Our study concluded that the QDM contains several core elements that make it a promising format for EHR-driven phenotyping algorithms for clinical research. However, we also found areas in which the QDM could be usefully extended, such as representing information extracted from clinical text, and the ability to handle algorithms that do not consist of Boolean combinations of criteria.
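    An algorithm built from Boolean combinations of criteria, the case the authors found QDM handles well, can be sketched as below. The diagnosis codes, lab threshold, and record layout are invented for illustration and are not taken from the eMERGE algorithms or the QDM specification.

```python
# Criterion helpers over a simple dict-based patient record (invented layout)
def has_code(record, code):
    return code in record.get("diagnoses", [])

def lab_over(record, test, threshold):
    return any(v > threshold for t, v in record.get("labs", []) if t == test)

def type2_diabetes_case(record):
    # case = (T2DM diagnosis OR elevated HbA1c) AND NOT a type 1 diagnosis
    return ((has_code(record, "E11") or lab_over(record, "HbA1c", 6.5))
            and not has_code(record, "E10"))

patients = [
    {"diagnoses": ["E11"], "labs": []},
    {"diagnoses": ["E10"], "labs": [("HbA1c", 7.2)]},
    {"diagnoses": [], "labs": [("HbA1c", 7.0)]},
    {"diagnoses": [], "labs": [("HbA1c", 5.4)]},
]
cases = [i for i, p in enumerate(patients) if type2_diabetes_case(p)]
print(cases)  # → [0, 2]
```

The limitation the abstract raises is visible here: criteria over coded diagnoses and lab values compose cleanly as Booleans, but information that lives only in clinical text has no such ready-made predicate.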

  12. A Product Development Decision Model for Cockpit Weather Information System

    NASA Technical Reports Server (NTRS)

    Sireli, Yesim; Kauffmann, Paul; Gupta, Surabhi; Kachroo, Pushkin; Johnson, Edward J., Jr. (Technical Monitor)

    2003-01-01

    There is a significant market demand for advanced cockpit weather information products. However, it is unclear how to identify the most promising technological options that provide the desired mix of consumer requirements by employing feasible technical systems at a price that achieves market success. This study develops a unique product development decision model that employs Quality Function Deployment (QFD) and Kano's model of consumer choice. This model is specifically designed for exploration and resolution of this and similar information technology related product development problems.
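    The QFD prioritization step at the heart of such a model can be illustrated with a small worked example. The requirements, weights, technical options, and relationship strengths below are all invented; only the 0/1/3/9 relationship scale is a common QFD convention.

```python
# Customer requirements for a cockpit weather product, with invented weights
requirements = ["timely updates", "graphical display", "low cost"]
weights = [0.5, 0.3, 0.2]                 # relative importance (sums to 1)

# Hypothetical technical options under consideration
options = ["satellite datalink", "VHF uplink", "ground pre-briefing"]

# Relationship strengths (0 = none, 1 = weak, 3 = moderate, 9 = strong)
relationship = [
    [9, 3, 1],   # timely updates
    [9, 3, 0],   # graphical display
    [1, 3, 9],   # low cost
]

# Priority score of each option: weighted column sum of the matrix
scores = [sum(w * row[j] for w, row in zip(weights, relationship))
          for j in range(len(options))]
ranked = sorted(zip(options, scores), key=lambda t: -t[1])
print(ranked)
```

In a full QFD/Kano analysis the weights themselves would be adjusted by Kano category (must-be, one-dimensional, attractive), so that exciting-quality features are not drowned out by basic expectations.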

  13. A Product Development Decision Model for Cockpit Weather Information Systems

    NASA Technical Reports Server (NTRS)

    Sireli, Yesim; Kauffmann, Paul; Gupta, Surabhi; Kachroo, Pushkin

    2003-01-01

    There is a significant market demand for advanced cockpit weather information products. However, it is unclear how to identify the most promising technological options that provide the desired mix of consumer requirements by employing feasible technical systems at a price that achieves market success. This study develops a unique product development decision model that employs Quality Function Deployment (QFD) and Kano's model of consumer choice. This model is specifically designed for exploration and resolution of this and similar information technology related product development problems.

  14. The ends of uncertainty: Air quality science and planning in Central California

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fine, James

    Air quality planning in Central California is complicated and controversial despite millions of dollars invested to improve scientific understanding. This research describes and critiques the use of photochemical air quality simulation modeling studies in planning to attain standards for ground-level ozone in the San Francisco Bay Area and the San Joaquin Valley during the 1990's. Data are gathered through documents and interviews with planners, modelers, and policy-makers at public agencies and with representatives from the regulated and environmental communities. Interactions amongst organizations are diagramed to identify significant nodes of interaction. Dominant policy coalitions are described through narratives distinguished by theirmore » uses of and responses to uncertainty, their exposures to risks, and their responses to the principles of conservatism, civil duty, and caution. Policy narratives are delineated using aggregated respondent statements to describe and understand advocacy coalitions. I found that models impacted the planning process significantly, but were used not purely for their scientific capabilities. Modeling results provided justification for decisions based on other constraints and political considerations. Uncertainties were utilized opportunistically by stakeholders instead of managed explicitly. Ultimately, the process supported the partisan views of those in control of the modeling. Based on these findings, as well as a review of model uncertainty analysis capabilities, I recommend modifying the planning process to allow for the development and incorporation of uncertainty information, while addressing the need for inclusive and meaningful public participation. By documenting an actual air quality planning process these findings provide insights about the potential for using new scientific information and understanding to achieve environmental goals, most notably the analysis of uncertainties in modeling applications. 
Concurrently, needed uncertainty information is identified and capabilities to produce it are assessed. Practices to facilitate incorporation of uncertainty information are suggested based on research findings, as well as theory from the literatures of the policy sciences, decision sciences, science and technology studies, consensus-based and communicative planning, and modeling.

  15. Social percolation and the influence of mass media

    NASA Astrophysics Data System (ADS)

    Proykova, Ana; Stauffer, Dietrich

    2002-09-01

    In the marketing model of Solomon and Weisbuch, people buy a product only if their neighbours tell them of its quality, and if this quality is higher than their own quality expectations. Now we introduce additional information from the mass media, which is analogous to the ghost field in percolation theory. The mass media shift the percolative phase transition observed in the model, and decrease the time after which the stationary state is reached.
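The ghost-field analogy lends itself to a compact simulation. The sketch below assumes Solomon-Weisbuch-style dynamics on a square lattice (agents buy when informed of a product whose quality meets their expectation, buyers inform their neighbours, and mass media informs any site with some probability per sweep); it illustrates the mechanism, not the paper's exact model.

```python
import random

def simulate(n=50, product_quality=0.6, media_prob=0.0, seed=1):
    """Toy market-percolation model with a mass-media 'ghost field'.

    Sites buy when informed of a product whose quality meets their
    expectation; buyers inform their lattice neighbours; mass media
    additionally informs any site with probability media_prob each
    sweep (the percolation ghost-field analogue). Returns the final
    fraction of buyers."""
    rng = random.Random(seed)
    expect = [[rng.random() for _ in range(n)] for _ in range(n)]
    informed = [[c == 0 for c in range(n)] for _ in range(n)]  # seed: left column
    bought = [[False] * n for _ in range(n)]
    changed = True
    while changed:
        changed = False
        for i in range(n):
            for j in range(n):
                if media_prob and rng.random() < media_prob:
                    informed[i][j] = True  # mass-media (ghost field) channel
                if informed[i][j] and not bought[i][j] and product_quality >= expect[i][j]:
                    bought[i][j] = True
                    changed = True
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        a, b = i + di, j + dj
                        if 0 <= a < n and 0 <= b < n:
                            informed[a][b] = True  # word-of-mouth channel
    return sum(map(sum, bought)) / (n * n)
```

With `media_prob > 0`, even clusters of willing buyers that are disconnected from the word-of-mouth front are eventually reached, which is the sense in which the media field shifts the percolative transition observed in the model.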

  16. From data to wisdom: quality improvement strategies supporting large-scale implementation of evidence-based services.

    PubMed

    Daleiden, Eric L; Chorpita, Bruce F

    2005-04-01

    The Hawaii Department of Health Child and Adolescent Mental Health Division has explored various strategies to promote widespread use of empirical evidence to improve the quality of services and outcomes for youth. This article describes a core set of clinical decisions and how several general and local evidence bases may inform those decisions. Multiple quality improvement strategies are illustrated in the context of a model that outlines four phases of evidence: data, information, knowledge, and wisdom.

  17. NWS Marine Forecast Areas

    Science.gov Websites


  18. Modeling the ecological trap hypothesis: a habitat and demographic analysis for migrant songbirds

    Treesearch

    Therese M. Donovan; Frank R. Thompson, III

    2001-01-01

    Most species occupy both high- and low-quality habitats throughout their ranges. As habitats become modified through anthropogenic change, low-quality habitat may become a more dominant component of the landscape for some species. To conserve species, information on how to assess habitat quality and guidelines for maintaining or eliminating low-quality habitats are...

  19. Social foraging with partial (public) information.

    PubMed

    Mann, Ofri; Kiflawi, Moshe

    2014-10-21

    Group foragers can utilize public information to better estimate patch quality and arrive at more efficient patch-departure rules. However, acquiring such information may come at a cost, e.g. reduced search efficiency. We present a Bayesian group-foraging model in which social foragers do not require full awareness of their companions' foraging success; only of their number. In our model, patch departure is based on direct estimates of the number of remaining items. This is achieved by considering all likely combinations of initial patch quality and group foraging success, given the individual forager's experience within the patch. Slower rates of information acquisition lead our 'partially-aware' foragers to over-utilize poor patches more than fully-aware foragers do. However, our model suggests that the ensuing loss in long-term intake rates can be matched by a relatively low cost to the acquisition of full public information. In other words, we suggest that group size offers sufficient information for optimal patch utilization by social foragers. We also suggest that our model is applicable to other situations where resources undergo 'background depletion', which is coincident with but independent of the consumer's own utilization. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. Dimensions of service quality in healthcare: a systematic review of literature.

    PubMed

    Fatima, Iram; Humayun, Ayesha; Iqbal, Usman; Shafiq, Muhammad

    2018-06-13

    Various dimensions of healthcare service quality have been used and discussed in the literature across the globe. This study presents an updated, meaningful review of the extensive research that has been conducted on measuring dimensions of healthcare service quality. The systematic review method in the current study is based on PRISMA guidelines. We searched for literature using databases such as Google, Google Scholar, PubMed, and the Social Science Citation Index. In this study, we screened 1921 identified papers using search terms/phrases. Snowball strategies were adopted to extract published articles from January 1997 to December 2016. Two hundred and fourteen papers were identified as relevant for data extraction, which was completed by two researchers and double-checked by two others to resolve discrepancies. In total, 74 studies fulfilled our pre-defined inclusion and exclusion criteria for data analysis. Service quality is mainly measured as technical and functional, incorporating many sub-dimensions. We synthesized the information about dimensions of healthcare service quality with reference to developed and developing countries. 'Tangibility' was found to be the most common contributing factor, whereas 'SERVQUAL' is the most commonly used model to measure healthcare service quality. There are core dimensions of healthcare service quality that are commonly found in all models used in the reviewed studies. We found little difference in these core dimensions between developed and developing countries, as SERVQUAL is mostly used as the basic model, either to generate a new one or to add further contextual dimensions. The current study ranked the contributing factors based on their frequency in the literature. Based on these priorities, addressing these factors, irrespective of context, may help improve healthcare quality and provide important information for evidence-informed decision-making.

  1. Quality Metadata Management for Geospatial Scientific Workflows: from Retrieving to Assessing with Online Tools

    NASA Astrophysics Data System (ADS)

    Leibovici, D. G.; Pourabdollah, A.; Jackson, M.

    2011-12-01

    Experts and decision-makers use or develop models to monitor global and local changes of the environment. Their activities require the combination of data and processing services in a flow of operations and spatial data computations: a geospatial scientific workflow. The seamless ability to generate, re-use and modify a geospatial scientific workflow is an important requirement, but the quality of outcomes is equally important [1]. Metadata information attached to the data and processes, and particularly their quality, is essential to assess the reliability of the scientific model that a workflow represents [2]. Managing tools, dealing with qualitative and quantitative metadata measures of the quality associated with a workflow, are therefore required for the modellers. To ensure interoperability, ISO and OGC standards [3] are to be adopted, allowing one, for example, to define metadata profiles and to retrieve them via web service interfaces. However, these standards need a few extensions for workflows, particularly in the context of geoprocess metadata. We propose to fill this gap (i) through the provision of a metadata profile for the quality of processes, and (ii) through a framework, based on XPDL [4], to manage the quality information. Web Processing Services are used to implement a range of metadata analyses on the workflow in order to evaluate and present quality information at different levels of the workflow. This generates the metadata quality, stored in the XPDL file. The focus is (a) on visual representations of the quality, summarizing the quality information retrieved either from the standardized metadata profiles of the components or from non-standard quality information, e.g., Web 2.0 information, and (b) on the estimated qualities of the outputs derived from meta-propagation of uncertainties (a principle that we have introduced [5]).
An a priori validation of the future decision-making supported by the outputs of the workflow, once run, is then provided using the meta-propagated qualities, obtained without running the workflow [6], together with a visualization pointing out, on the workflow graph itself, where the workflow needs better data or better processes. [1] Leibovici, DG, Hobona, G, Stock, K, Jackson, M (2009) Qualifying geospatial workflow models for adaptive controlled validity and accuracy. In: IEEE 17th GeoInformatics, 1-5 [2] Leibovici, DG, Pourabdollah, A (2010) Workflow Uncertainty using a Metamodel Framework and Metadata for Data and Processes. OGC TC/PC Meetings, September 2010, Toulouse, France [3] OGC (2011) www.opengeospatial.org [4] XPDL (2008) Workflow Process Definition Interface - XML Process Definition Language. Workflow Management Coalition, Document WfMC-TC-1025, 2008 [5] Leibovici, DG, Pourabdollah, A, Jackson, M (2011) Meta-propagation of Uncertainties for Scientific Workflow Management in Interoperable Spatial Data Infrastructures. In: Proceedings of the European Geosciences Union (EGU2011), April 2011, Austria [6] Pourabdollah, A, Leibovici, DG, Jackson, M (2011) MetaPunT: an Open Source tool for Meta-Propagation of uncerTainties in Geospatial Processing. In: Proceedings of OSGIS2011, June 2011, Nottingham, UK

  2. Healthcare delivery systems: designing quality into health information systems.

    PubMed

    Joyce, Phil; Green, Rosamund; Winch, Graham

    2007-01-01

    To ensure that quality is 'engineered in', a holistic, integrated, quality-driven approach is required, and Total Quality Management (TQM) principles are the obvious foundations for this. This paper describes a novel approach to viewing the operations of a healthcare provider where electronic means could be used to distribute information (including electronic fund settlements), building around the Full Service Provider core. Specifically, an approach called the "triple pair flow" model is used to provide a view of healthcare delivery that is integrated, yet detailed, and that combines the strategic enterprise view with a business process view.

  3. Assessment and prediction of air quality using fuzzy logic and autoregressive models

    NASA Astrophysics Data System (ADS)

    Carbajal-Hernández, José Juan; Sánchez-Fernández, Luis P.; Carrasco-Ochoa, Jesús A.; Martínez-Trinidad, José Fco.

    2012-12-01

    In recent years, artificial intelligence methods have been used for the treatment of environmental problems. This work presents two models for assessment and prediction of air quality. First, we develop a new computational model for air quality assessment in order to evaluate toxic compounds that can harm sensitive people in urban areas, affecting their normal activities. In this model we propose a Sigma operator to statistically assess air quality parameters using their historical data and to determine their negative impact on air quality based on toxicity limits, frequency averages, and deviations of toxicological tests. We also introduce a fuzzy inference system that classifies the parameters through a reasoning process and integrates them into an air quality index describing the pollution level in five stages: excellent, good, regular, bad, and dangerous. The second model predicts air quality concentrations using an autoregressive model, providing a predicted air quality index based on the fuzzy inference system previously developed. Using data from the Mexico City Atmospheric Monitoring System, we compare our indices with air quality indices developed by environmental agencies and with similar models. Our results show that our models are an appropriate tool for assessing site pollution and for providing guidance to improve contingency actions in urban areas.
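The autoregressive component can be illustrated with a minimal AR(1) fit by least squares. This is a sketch only; the paper's model order, fitted parameters, and the fuzzy index built on top of the forecasts are not reproduced here.

```python
def ar1_forecast(series, steps=1):
    """Forecast an air quality series with an AR(1) model fitted by
    least squares on the demeaned data: x_t = phi * x_{t-1} + noise.
    Returns `steps` successive forecasts (illustrative sketch only)."""
    n = len(series)
    mean = sum(series) / n
    x = [v - mean for v in series]  # demean before fitting
    num = sum(x[t] * x[t - 1] for t in range(1, n))
    den = sum(x[t - 1] ** 2 for t in range(1, n))
    phi = num / den if den else 0.0  # least-squares AR(1) coefficient
    preds = []
    last = x[-1]
    for _ in range(steps):
        last = phi * last  # iterate the fitted recursion
        preds.append(last + mean)
    return preds
```

In the paper's setup, the forecast concentrations would then be passed through the fuzzy inference system to obtain a predicted air quality index rather than reported directly.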

  4. Mapping urban air quality in near real-time using observations from low-cost sensors and model information.

    PubMed

    Schneider, Philipp; Castell, Nuria; Vogt, Matthias; Dauge, Franck R; Lahoz, William A; Bartonova, Alena

    2017-09-01

    The recent emergence of low-cost microsensors measuring various air pollutants has significant potential for carrying out high-resolution mapping of air quality in the urban environment. However, the data obtained by such sensors are generally less reliable than those from standard equipment and they are subject to significant data gaps in both space and time. In order to overcome this issue, we present here a data fusion method based on geostatistics that allows for merging observations of air quality from a network of low-cost sensors with spatial information from an urban-scale air quality model. The performance of the methodology is evaluated for nitrogen dioxide in Oslo, Norway, using both simulated datasets and real-world measurements from a low-cost sensor network for January 2016. The results indicate that the method is capable of producing realistic hourly concentration fields of urban nitrogen dioxide that inherit the spatial patterns from the model and adjust the prior values using the information from the sensor network. The accuracy of the data fusion method is dependent on various factors including the total number of observations, their spatial distribution, their uncertainty (both in terms of systematic biases and random errors), as well as the ability of the model to provide realistic spatial patterns of urban air pollution. A validation against official data from air quality monitoring stations equipped with reference instrumentation indicates that the data fusion method is capable of reproducing city-wide averaged official values with an R² of 0.89 and a root mean squared error of 14.3 μg m⁻³. It is further capable of reproducing the typical daily cycles of nitrogen dioxide. Overall, the results indicate that the method provides a robust way of extracting useful information from uncertain sensor data using only a time-invariant model dataset and the knowledge contained within an entire sensor network. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
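A stripped-down version of such model-measurement fusion can be sketched by interpolating observation-minus-model residuals onto the model grid. Here inverse-distance weighting stands in for the geostatistical (kriging-based) machinery of the paper; grid layout and function names are illustrative assumptions.

```python
def fuse(model_field, observations, power=2.0):
    """Merge a model concentration field with sparse sensor observations:
    compute observation-minus-model residuals, spread them over the grid
    by inverse-distance weighting, and add them to the model prior.

    model_field: dict {(x, y): modeled value}
    observations: list of ((x, y), observed value) at sensor locations"""
    residuals = [(xy, obs - model_field[xy]) for xy, obs in observations]
    fused = {}
    for xy, m in model_field.items():
        num = den = 0.0
        exact = None
        for (ox, oy), r in residuals:
            d2 = (xy[0] - ox) ** 2 + (xy[1] - oy) ** 2
            if d2 == 0:
                exact = r  # sensor sits on this cell: use its residual directly
                break
            w = 1.0 / d2 ** (power / 2)
            num += w * r
            den += w
        adj = exact if exact is not None else (num / den if den else 0.0)
        fused[xy] = m + adj  # fused field inherits model patterns, shifted by data
    return fused
```

As in the paper's setup, the model supplies the spatial pattern while the sensor network pulls the prior values toward the observations.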

  5. Rehabilitation of compensable workplace injuries: effective payment models for quality vocational rehabilitation outcomes in a changing social landscape.

    PubMed

    Matthews, Lynda R; Hanley, Francine; Lewis, Virginia; Howe, Caroline

    2015-01-01

    With social and economic costs of workplace injury on the increase, efficient payment models that deliver quality rehabilitation outcomes are of increasing interest. This paper provides a perspective on the issue informed by both refereed literature and published research material not available commercially (gray literature). A review of payment models, workers' compensation and compensable injury identified relevant peer-reviewed and gray literature that informed our discussion. Fee-for-service and performance-based payment models dominate the health and rehabilitation literature, each described as having benefits and challenges to achieving quality outcomes for consumers. There appears to be a movement toward performance-based payments in compensable workplace injury settings as they are perceived to promote time-efficient services and support innovation in rehabilitation practice. However, it appears that the challenges that arise for workplace-based rehabilitation providers and professionals when working under the various payment models, such as staff retention and quality of client-practitioner relationship, are absent from the literature and this could lead to flawed policy decisions. Robust evidence of the benefits and costs associated with different payment models - from the perspectives of clients/consumers, funders and service providers - is needed to inform best practice in rehabilitation of compensable workplace injuries. Available but limited evidence suggests that payment models providing financial incentives for stakeholder-agreed vocational rehabilitation outcomes tend to improve service effectiveness in workers' compensation settings, although there is little evidence of service quality or client satisfaction. Working in a system that identifies payments for stakeholder-agreed outcomes may be more satisfying for rehabilitation practitioners in workers' compensation settings by allowing more clinical autonomy and innovative practice. 
Researchers need to work closely with the compensation and rehabilitation sector as well as governments to establish robust evidence of the benefits and costs of payment models, from the perspectives of clients/consumers, funders, service providers and rehabilitation professionals.

  6. Foraging Behaviour in Magellanic Woodpeckers Is Consistent with a Multi-Scale Assessment of Tree Quality

    PubMed Central

    Vergara, Pablo M.; Soto, Gerardo E.; Rodewald, Amanda D.; Meneses, Luis O.; Pérez-Hernández, Christian G.

    2016-01-01

    Theoretical models predict that animals should make foraging decisions after assessing the quality of available habitat, but most models fail to consider the spatio-temporal scales at which animals perceive habitat availability. We tested three foraging strategies that explain how Magellanic woodpeckers (Campephilus magellanicus) assess the relative quality of trees: 1) Woodpeckers with local knowledge select trees based on the available trees in the immediate vicinity. 2) Woodpeckers lacking local knowledge select trees based on their availability at previously visited locations. 3) Woodpeckers using information from long-term memory select trees based on knowledge about trees available within the entire landscape. We observed foraging woodpeckers and used a Brownian Bridge Movement Model to identify trees available to woodpeckers along foraging routes. Woodpeckers selected trees with a later decay stage than available trees. Selection models indicated that preferences of Magellanic woodpeckers were based on clusters of trees near the most recently visited trees, thus suggesting that woodpeckers use visual cues from neighboring trees. In a second analysis, Cox’s proportional hazards models showed that woodpeckers used information consolidated across broader spatial scales to adjust tree residence times. Specifically, woodpeckers spent more time at trees with larger diameters and in a more advanced stage of decay than trees available along their routes. These results suggest that Magellanic woodpeckers make foraging decisions based on the relative quality of trees that they perceive and memorize information at different spatio-temporal scales. PMID:27416115

  7. Foraging Behaviour in Magellanic Woodpeckers Is Consistent with a Multi-Scale Assessment of Tree Quality.

    PubMed

    Vergara, Pablo M; Soto, Gerardo E; Moreira-Arce, Darío; Rodewald, Amanda D; Meneses, Luis O; Pérez-Hernández, Christian G

    2016-01-01

    Theoretical models predict that animals should make foraging decisions after assessing the quality of available habitat, but most models fail to consider the spatio-temporal scales at which animals perceive habitat availability. We tested three foraging strategies that explain how Magellanic woodpeckers (Campephilus magellanicus) assess the relative quality of trees: 1) Woodpeckers with local knowledge select trees based on the available trees in the immediate vicinity. 2) Woodpeckers lacking local knowledge select trees based on their availability at previously visited locations. 3) Woodpeckers using information from long-term memory select trees based on knowledge about trees available within the entire landscape. We observed foraging woodpeckers and used a Brownian Bridge Movement Model to identify trees available to woodpeckers along foraging routes. Woodpeckers selected trees with a later decay stage than available trees. Selection models indicated that preferences of Magellanic woodpeckers were based on clusters of trees near the most recently visited trees, thus suggesting that woodpeckers use visual cues from neighboring trees. In a second analysis, Cox's proportional hazards models showed that woodpeckers used information consolidated across broader spatial scales to adjust tree residence times. Specifically, woodpeckers spent more time at trees with larger diameters and in a more advanced stage of decay than trees available along their routes. These results suggest that Magellanic woodpeckers make foraging decisions based on the relative quality of trees that they perceive and memorize information at different spatio-temporal scales.

  8. A Linear Algebra Measure of Cluster Quality.

    ERIC Educational Resources Information Center

    Mather, Laura A.

    2000-01-01

    Discussion of models for information retrieval focuses on an application of linear algebra to text clustering, namely, a metric for measuring cluster quality based on the theory that cluster quality is proportional to the number of terms that are disjoint across the clusters. Explains term-document matrices and clustering algorithms. (Author/LRW)
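The underlying idea, that cluster quality grows with the number of terms confined to a single cluster, can be sketched directly from a term-document matrix. This is an illustrative proxy, not the paper's exact linear-algebra metric.

```python
def cluster_term_disjointness(td_matrix, clusters):
    """Fraction of terms whose occurrences fall in exactly one cluster.

    td_matrix[t][d] > 0 means term t occurs in document d;
    clusters[d] is the cluster label of document d.
    Higher values mean the clusters share fewer terms."""
    disjoint = total = 0
    for row in td_matrix:
        labels = {clusters[d] for d, count in enumerate(row) if count > 0}
        if labels:  # skip terms occurring in no document
            total += 1
            if len(labels) == 1:
                disjoint += 1
    return disjoint / total if total else 0.0
```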

  9. Models, Measurements, and Local Decisions: Assessing and Addressing Impacts from Port Expansion and Traffic Activity

    EPA Science Inventory

    This presentation includes a combination of modeling and measurement results to characterize near-source air quality in Newark, New Jersey with consideration of how this information could be used to inform decision making to reduce risk of health impacts. Decisions could include ...

  10. A Statistical Decision Model for Periodical Selection for a Specialized Information Center

    ERIC Educational Resources Information Center

    Dym, Eleanor D.; Shirey, Donald L.

    1973-01-01

    An experiment is described which attempts to define a quantitative methodology for the identification and evaluation of all possibly relevant periodical titles containing toxicological-biological information. A statistical decision model was designed and employed, along with yes/no criteria questions, a training technique and a quality control…

  11. The development and evaluation of a nursing information system for caring clinical in-patient.

    PubMed

    Fang, Yu-Wen; Li, Chih-Ping; Wang, Mei-Hua

    2015-01-01

    The research aimed to develop a nursing information system that simplifies the admission procedure for clinical in-patient care and enhances the efficiency of medical information documentation. By correctly delivering patients' health records and providing continuous care, patient safety and care quality can be effectively improved. The study applied the Spiral Model of system development with a nursing informatics team, using data collection, observation of the working environment, use-case modeling, and Joint Application Design (JAD) conferences to complete the system requirement analysis and design. The Admission Care Management Information System (ACMIS) mainly included: (1) an admission nursing management information system; (2) an inter-shift meeting information management system; (3) linkage to the drug management system and the physical examination record system. The evaluation framework contained qualitative and quantitative components, providing both formative and summative elements. System evaluation applied an information success model, with a questionnaire developed to measure nurses' acceptance and satisfaction. The questionnaire results showed that users' satisfaction, perceived self-involvement, age, and information quality were positively related to personal and organizational effectiveness. According to these results, the Admission Care Management Information System was practical in simplifying the clinical working procedure and effective in communicating and documenting admission medical information.

  12. Uncertainties have a meaning: Information entropy as a quality measure for 3-D geological models

    NASA Astrophysics Data System (ADS)

    Wellmann, J. Florian; Regenauer-Lieb, Klaus

    2012-03-01

    Analyzing, visualizing and communicating uncertainties are important issues as geological models can never be fully determined. To date, there exists no general approach to quantify uncertainties in geological modeling. We propose here to use information entropy as an objective measure to compare and evaluate model and observational results. Information entropy was introduced in the 1950s and defines, at every location in the model, a scalar measure of predictability. We show that this method not only provides a quantitative insight into model uncertainties but, due to the underlying concept of information entropy, can be related to questions of data integration (i.e. how is the model quality interconnected with the used input data) and model evolution (i.e. does new data - or a changed geological hypothesis - optimize the model). In other words, information entropy is a powerful measure to be used for data assimilation and inversion. As a first test of feasibility, we present the application of the new method to the visualization of uncertainties in geological models, here understood as structural representations of the subsurface. Applying the concept of information entropy to a suite of simulated models, we can clearly identify (a) uncertain regions within the model, even for complex geometries; (b) the overall uncertainty of a geological unit, which is, for example, of great relevance in any type of resource estimation; (c) a mean entropy for the whole model, important to track model changes with one overall measure. These results cannot easily be obtained with existing standard methods. The results suggest that information entropy is a powerful method to visualize uncertainties in geological models, and to quantitatively classify the indefiniteness of single units and the mean entropy of a model. Due to the relationship of this measure to the missing information, we expect the method to have great potential in many types of geoscientific data assimilation problems, beyond pure visualization.
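The measure itself is the standard Shannon entropy applied per model cell: if, across model realizations, a cell belongs to geological unit i with probability p_i, then H = -Σ p_i log p_i is zero where the cell is fully determined and maximal where all units are equally likely. A minimal sketch:

```python
from math import log

def cell_entropy(probabilities):
    """Shannon information entropy H = -sum(p_i * log(p_i)) for the unit
    probabilities at one model cell; 0 means the cell is fully determined."""
    return -sum(p * log(p) for p in probabilities if p > 0)

def model_entropy(cells):
    """Mean entropy over all cells: a single scalar for tracking how overall
    model uncertainty changes as data or hypotheses are updated."""
    return sum(cell_entropy(p) for p in cells) / len(cells)
```

Mapping `cell_entropy` over a 3-D grid gives the uncertain-region visualization described in the abstract; `model_entropy` gives the one-number summary used to track model evolution.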

  13. An Integrated Risk Management Model for Source Water Protection Areas

    PubMed Central

    Chiueh, Pei-Te; Shang, Wei-Ting; Lo, Shang-Lien

    2012-01-01

    Watersheds are recognized as the most effective management unit for the protection of water resources. For surface water supplies that use water from upstream watersheds, evaluating threats to water quality and implementing a watershed management plan are crucial for maintaining drinking water that is safe for humans. The aim of this article is to establish a risk assessment model that provides basic information for identifying critical pollutants and areas at high risk for degraded water quality. In this study, a quantitative risk model that uses hazard quotients for each water quality parameter was combined with a qualitative risk model that uses the relative risk level of potential pollution events in order to characterize the current condition and potential risk of watersheds providing drinking water. In a case study of the Taipei Source Water Area in northern Taiwan, total coliforms and total phosphorus were the top two pollutants of concern. Intensive tea-growing and recreational activities around the riparian zone may contribute the greatest pollution to the watershed. Our risk assessment tool may be enhanced by developing, recording, and updating information on pollution sources in the water supply watersheds. Moreover, management authorities could use the resultant information to create watershed risk management plans. PMID:23202770
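The quantitative half of such a model reduces to hazard quotients: the ratio of an observed parameter value to its threshold, with HQ > 1 flagging a parameter of concern. A minimal sketch with placeholder threshold values, not the study's actual limits:

```python
def hazard_quotients(measurements, thresholds):
    """Hazard quotient per water-quality parameter: observed value divided
    by its regulatory threshold. Returns the quotients and the parameters
    of concern (HQ > 1), ranked from highest to lowest quotient."""
    hq = {p: measurements[p] / thresholds[p] for p in measurements}
    ranked = sorted(((q, p) for p, q in hq.items() if q > 1), reverse=True)
    return hq, [p for q, p in ranked]
```

In the full model these quotients would be combined with the qualitative relative-risk levels of potential pollution events to map high-risk areas.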

  14. Data-base development for water-quality modeling of the Patuxent River basin, Maryland

    USGS Publications Warehouse

    Fisher, G.T.; Summers, R.M.

    1987-01-01

    Procedures and rationale used to develop a data base and data management system for the Patuxent Watershed Nonpoint Source Water Quality Monitoring and Modeling Program of the Maryland Department of the Environment and the U.S. Geological Survey are described. A detailed data base and data management system has been developed to facilitate modeling of the watershed for water quality planning purposes; statistical analysis; plotting of meteorologic, hydrologic and water quality data; and geographic data analysis. The system is Maryland's prototype for development of a basinwide water quality management program. A key step in the program is to build a calibrated and verified water quality model of the basin using the Hydrological Simulation Program--FORTRAN (HSPF) hydrologic model, which has been used extensively in large-scale basin modeling. The compilation of the substantial existing data base for preliminary calibration of the basin model, including meteorologic, hydrologic, and water quality data from federal and state data bases and a geographic information system containing digital land use and soils data is described. The data base development is significant in its application of an integrated, uniform approach to data base management and modeling. (Lantz-PTT)

  15. Developments to the Sylvan stand structure model to describe wood quality changes in southern bottomland hardwood forests because of forest management

    Treesearch

    Ian R. Scott

    2009-01-01

    Growth models can produce a wealth of detailed information that is often very difficult to perceive because it is frequently presented either as summary tables, stand view or landscape view visualizations. We have developed new tools for use with the Sylvan model (Larsen 1994) that allow the analysis of wood-quality changes as a consequence of forest management....

  16. Normalization and standardization of electronic health records for high-throughput phenotyping: the SHARPn consortium

    PubMed Central

    Pathak, Jyotishman; Bailey, Kent R; Beebe, Calvin E; Bethard, Steven; Carrell, David S; Chen, Pei J; Dligach, Dmitriy; Endle, Cory M; Hart, Lacey A; Haug, Peter J; Huff, Stanley M; Kaggal, Vinod C; Li, Dingcheng; Liu, Hongfang; Marchant, Kyle; Masanz, James; Miller, Timothy; Oniki, Thomas A; Palmer, Martha; Peterson, Kevin J; Rea, Susan; Savova, Guergana K; Stancl, Craig R; Sohn, Sunghwan; Solbrig, Harold R; Suesse, Dale B; Tao, Cui; Taylor, David P; Westberg, Les; Wu, Stephen; Zhuo, Ning; Chute, Christopher G

    2013-01-01

    Research objective To develop scalable informatics infrastructure for normalization of both structured and unstructured electronic health record (EHR) data into a unified, concept-based model for high-throughput phenotype extraction. Materials and methods Software tools and applications were developed to extract information from EHRs. Representative and convenience samples of both structured and unstructured data from two EHR systems—Mayo Clinic and Intermountain Healthcare—were used for development and validation. Extracted information was standardized and normalized to meaningful use (MU) conformant terminology and value set standards using Clinical Element Models (CEMs). These resources were used to demonstrate semi-automatic execution of MU clinical-quality measures modeled using the Quality Data Model (QDM) and an open-source rules engine. Results Using CEMs and open-source natural language processing and terminology services engines—namely, Apache clinical Text Analysis and Knowledge Extraction System (cTAKES) and Common Terminology Services (CTS2)—we developed a data-normalization platform that ensures data security, end-to-end connectivity, and reliable data flow within and across institutions. We demonstrated the applicability of this platform by executing a QDM-based MU quality measure that determines the percentage of patients between 18 and 75 years with diabetes whose most recent low-density lipoprotein cholesterol test result during the measurement year was <100 mg/dL on a randomly selected cohort of 273 Mayo Clinic patients. The platform identified 21 and 18 patients for the denominator and numerator of the quality measure, respectively. Validation results indicate that all identified patients meet the QDM-based criteria. 
Conclusions End-to-end automated systems for extracting clinical information from diverse EHR systems require extensive use of standardized vocabularies and terminologies, as well as robust information models for storing, discovering, and processing that information. This study demonstrates the application of modular and open-source resources for enabling secondary use of EHR data through normalization into standards-based, comparable, and consistent format for high-throughput phenotyping to identify patient cohorts. PMID:24190931

  17. The Effectiveness of Health Care Information Technologies: Evaluation of Trust, Security Beliefs, and Privacy as Determinants of Health Care Outcomes.

    PubMed

    Kisekka, Victoria; Giboney, Justin Scott

    2018-04-11

The diffusion of health information technologies (HITs) within the health care sector continues to grow. However, there is no theory explaining how the success of HITs influences patient care outcomes. With the increase in data breaches, HITs' success now hinges on the effectiveness of data protection solutions. Still, empirical research has only addressed privacy concerns, with little regard for other factors of information assurance. The objective of this study was to examine the effectiveness of HITs using the DeLone and McLean Information Systems Success Model (DMISSM). We examined the role of information assurance constructs (ie, the role of information security beliefs, privacy concerns, and trust in health information) as measures of HIT effectiveness. We also investigated the relationships between information assurance and three aspects of system success: attitude toward health information exchange (HIE), patient access to health records, and perceived patient care quality. Using structural equation modeling, we analyzed the data from a sample of 3677 cancer patients from a public dataset. We used R software (R Project for Statistical Computing) and the Lavaan package to test the hypothesized relationships. Our extension of the DMISSM to health care was supported. We found that increased privacy concerns reduce the frequency of patient access to health records, positive attitudes toward HIE, and perceptions of patient care quality. Also, belief in the effectiveness of information security increases the frequency of patient access to health records and positive attitudes toward HIE. Trust in health information had a positive association with attitudes toward HIE and perceived patient care quality. Trust in health information had no direct effect on patient access to health records; however, it had an indirect relationship through privacy concerns. 
Trust in health information and belief in the effectiveness of information security safeguards increase perceptions of patient care quality. Privacy concerns reduce patients' frequency of accessing health records, patients' positive attitudes toward HIE, and overall perceived patient care quality. Health care organizations are encouraged to implement security safeguards to increase trust and the frequency of health record use, and to reduce privacy concerns, consequently increasing patient care quality. ©Victoria Kisekka, Justin Scott Giboney. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 11.04.2018.
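The reported indirect effect (trust lowering privacy concerns, which in turn raise or lower record access) can be illustrated with a toy mediation computation: the product of two simple OLS slopes on synthetic data. This is a much-simplified stand-in for the authors' lavaan structural equation model; all numbers and path strengths below are synthetic assumptions.

```python
import random

def slope(x, y):
    """OLS slope of y on x (single predictor)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
           sum((a - mx) ** 2 for a in x)

rng = random.Random(0)
trust = [rng.gauss(0, 1) for _ in range(500)]
# Synthetic paths: higher trust lowers privacy concerns;
# higher concerns lower the frequency of record access.
concern = [-0.5 * t + rng.gauss(0, 0.5) for t in trust]
access = [-0.6 * c + rng.gauss(0, 0.5) for c in concern]

a = slope(trust, concern)   # path trust -> concern (negative)
b = slope(concern, access)  # path concern -> access (negative)
indirect = a * b            # positive: trust raises access via lower concern
```

The product-of-paths rule is the standard way an indirect effect is read off a mediation model: two negative paths compose into a positive overall influence, matching the sign pattern reported in the abstract.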

  18. A method for tailoring the information content of a software process model

    NASA Technical Reports Server (NTRS)

    Perkins, Sharon; Arend, Mark B.

    1990-01-01

A framework is defined for a general method of selecting a necessary and sufficient subset of a general software life cycle's information products to support a new software development process. Procedures for characterizing problem domains in general and mapping to a tailored set of life cycle processes and products are presented. An overview of the method comprises the following steps: (1) During the problem concept definition phase, perform standardized interviews and dialogs between developer and user, and between user and customer; (2) Generate a quality needs profile of the software to be developed, based on information gathered in step 1; (3) Translate the quality needs profile into a profile of quality criteria that must be met by the software to satisfy the quality needs; (4) Map the quality criteria to a set of accepted processes and products for achieving each criterion; (5) Select the information products which match or support the accepted processes and products of step 4; and (6) Select the design methodology which produces the information products selected in step 5.
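Steps 2 through 5 above can be sketched as a chain of lookups from quality needs to criteria to processes to information products. All profile names, criteria, and products below are illustrative placeholders, not the taxonomy used in the report.

```python
# Hypothetical mappings for the tailoring pipeline (steps 2-5).
NEEDS_TO_CRITERIA = {
    "high reliability": ["fault tolerance", "testability"],
    "ease of change": ["modularity", "self-descriptiveness"],
}

CRITERIA_TO_PROCESSES = {
    "fault tolerance": ["design reviews", "failure-mode analysis"],
    "testability": ["unit test planning"],
    "modularity": ["structured design"],
    "self-descriptiveness": ["coding standards"],
}

PROCESS_TO_PRODUCTS = {
    "design reviews": ["design inspection report"],
    "failure-mode analysis": ["FMEA worksheet"],
    "unit test planning": ["unit test plan"],
    "structured design": ["module hierarchy chart"],
    "coding standards": ["coding standards document"],
}

def tailor(needs):
    """Map a quality-needs profile to the information products that support it."""
    criteria = {c for n in needs for c in NEEDS_TO_CRITERIA.get(n, [])}
    processes = {p for c in criteria for p in CRITERIA_TO_PROCESSES.get(c, [])}
    products = {d for p in processes for d in PROCESS_TO_PRODUCTS.get(p, [])}
    return sorted(products)
```

For example, `tailor(["high reliability"])` traces fault tolerance and testability through to the three supporting products, giving a necessary-and-sufficient subset rather than the full life cycle's document set.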

  20. A random optimization approach for inherent optic properties of nearshore waters

    NASA Astrophysics Data System (ADS)

    Zhou, Aijun; Hao, Yongshuai; Xu, Kuo; Zhou, Heng

    2016-10-01

Traditional water quality sampling is time-consuming and costly and cannot meet the needs of social development. Hyperspectral remote sensing offers good temporal resolution, wide spatial coverage, and rich spectral information, giving it strong potential for water quality supervision. Through a semi-analytical method, remote sensing information can be related to water quality. The inherent optical properties are used to quantify water quality, and an optical model of the water body is established to analyze its features. Using the stochastic optimization algorithm Threshold Accepting, a global optimization of the unknown model parameters determines the distributions of chlorophyll, dissolved organic matter, and suspended particles in the water. Improving the search step of the optimization algorithm markedly reduces processing time and creates room for increasing the number of parameters. Redefining the optimization steps and acceptance criteria makes the whole inversion process more targeted, thus improving the accuracy of the inversion. Based on results for simulated data provided by the IOCCG and field data provided by NASA, the model was continuously improved and refined. The outcome is a low-cost, effective model for retrieving water quality from hyperspectral remote sensing.
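The inversion strategy can be illustrated with a minimal Threshold Accepting loop: perturb the parameter vector, and accept any candidate whose misfit is not worse than the current misfit plus a shrinking threshold. The toy forward model, parameter names, and tuning constants below are assumptions for illustration, not the authors' semi-analytical model.

```python
import random

def forward_rrs(params, wavelengths):
    """Toy forward model: reflectance from chlorophyll (chl),
    dissolved organics (cdom), and suspended particles (tsm)."""
    chl, cdom, tsm = params
    return [tsm * 0.01 + chl * 0.005 / (1 + cdom * (w / 440.0))
            for w in wavelengths]

def cost(params, wavelengths, observed):
    """Sum-of-squares misfit between modelled and observed spectra."""
    model = forward_rrs(params, wavelengths)
    return sum((m - o) ** 2 for m, o in zip(model, observed))

def threshold_accepting(observed, wavelengths, steps=5000, seed=1):
    rng = random.Random(seed)
    current = [1.0, 1.0, 1.0]
    cur_cost = cost(current, wavelengths, observed)
    best, best_cost = list(current), cur_cost
    threshold = cur_cost * 0.5          # accept mildly worse moves at first
    for _ in range(steps):
        cand = [max(0.0, p + rng.uniform(-0.1, 0.1)) for p in current]
        c = cost(cand, wavelengths, observed)
        if c < cur_cost + threshold:    # the threshold acceptance rule
            current, cur_cost = cand, c
            if c < best_cost:
                best, best_cost = list(cand), c
        threshold *= 0.999              # shrink toward greedy search
    return best, best_cost

wavelengths = [410, 440, 490, 550, 670]
observed = forward_rrs([2.0, 0.5, 1.5], wavelengths)   # synthetic "truth"
fitted, residual = threshold_accepting(observed, wavelengths)
```

Unlike simulated annealing, Threshold Accepting needs no probabilistic acceptance test, only a deterministic threshold schedule, which is part of why the search step is cheap to tune and fast to run.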

  1. Air quality assessment in the periurban area of Mexico Megacity during dry hot season in 2011 and 2012

    NASA Astrophysics Data System (ADS)

    Garcia-Reynoso, Agustin; Santos Garcia-Yee, Jose; Barrera-Huertas, Hugo; Gerardo Ruiz-Suárez, Luis

    2016-04-01

Air quality is a threat to human health not only in urbanized areas; it also affects the surrounding zones. Interaction between urban and rural areas can be evaluated through measurements and through regional models whose domains include the peri-urban regions. Monitoring sites in remote areas are useful, but because they cannot cover the whole region, models can provide valuable information about the sources, fate, and transformation of pollution. To evaluate the influence of the Mexico Megacity on the air quality of the region, two field campaigns were performed during the dry hot seasons of 2011 and 2012. Meteorological and pollutant measurements were made during February and March 2011 at three sites southeast of the Mexico Megacity, and from March to April 2012 to the west, beyond the Popocatepetl-Iztaccihuatl mountain range. Air quality modeling was performed for the studied periods using the National Emissions Inventory 2008, and the model results were compared with the measurements. Studies of this type offer information about pollutant distribution, meteorological conditions, and the accuracy of emissions inventories; the latter is useful for emissions inventory developers and policy makers.

  2. A Kind of Transformation of Information Service--Science and Technology Novelty Search in Chinese University Libraries

    ERIC Educational Resources Information Center

    Aiguo, Li

    2007-01-01

    Science and Technology Novelty Search (S&TNS) is a special information consultation service developed as part of the Chinese Sci-Tech system. The author introduces the concept of S&TNS, and explains its role, and the role of the university library in the process. A quality control model to improve the quality of service of the S&TNS at…

  3. 76 FR 15004 - Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-18

    ... computer software-based models or applications, termed under the rule as ``interactive websites.'' These... information; (c) Ways to enhance the quality, utility, and clarity of the information collected; and (d) Ways...

  4. Analysis of the Factors Affecting Consumer Acceptance of Accredited Online Health Information

    PubMed Central

    2017-01-01

    With the increasing use of the internet and the spread of smartphones, health information seekers obtain considerable information through the internet. As the amount of online health information increases, the need for quality management of health information has been emphasized. The purpose of this study was to investigate the factors affecting the intention of using accredited online health information by applying the extended technology acceptance model (Extended-TAM). An online survey was conducted from September 15, 2016 to October 3, 2016, on 500 men and women aged 19–69 years. The results showed that the greatest factor influencing the acceptance of the accredited health information was perceived usefulness, and the expectation for the quality of the accreditation system was the most important mediator variable. In order to establish the health information accreditation system as a means to provide easy and useful information to the consumers, it is necessary to carry out quality management and promote the system through the continuous monitoring of the accreditation system. PMID:28960026

  5. Website Quality, Expectation, Confirmation, and End User Satisfaction: The Knowledge-Intensive Website of the Korean National Cancer Information Center

    PubMed Central

    Koo, Chulmo; Wati, Yulia; Park, Keeho

    2011-01-01

Background The fact that patient satisfaction with primary care clinical practices and physician-patient communications has decreased gradually has brought a new opportunity to the online channel as a supplementary service to provide additional information. Objective In this study, our objectives were to examine the process of cognitive knowledge expectation-confirmation from eHealth users and to recommend the attributes of a “knowledge-intensive website.” Knowledge expectation can be defined as users’ existing attitudes or beliefs regarding expected levels of knowledge they may gain by accessing the website. Knowledge confirmation is the extent to which a user’s knowledge expectation of information systems use is realized during actual use. In our hypothesized research model, perceived information quality, presentation and attractiveness as well as knowledge expectation influence knowledge confirmation, which in turn influences perceived usefulness and end user satisfaction, which feeds back to knowledge expectation. Methods An empirical study was conducted at the National Cancer Center (NCC), Republic of Korea (South Korea), by evaluating its official website. A user survey was administered containing items to measure subjectively perceived website quality and expectation-confirmation attributes. A study sample of 198 usable responses was used for further analysis. We used the structural equation model to test the proposed research model. Results Knowledge expectation exhibited a positive effect on knowledge confirmation (beta = .27, P < .001). The paths from information quality, information presentation, and website attractiveness to knowledge confirmation were also positive and significant (beta = .24, P < .001; beta = .29, P < .001; beta = .18, P < .001, respectively). Moreover, the effect of knowledge confirmation on perceived usefulness was also positively significant (beta = .64, P < .001). 
Knowledge expectation together with knowledge confirmation and perceived usefulness also significantly affected end user satisfaction (beta = .22, P < .001; beta = .39, P < .001; beta = .25, P < .001, respectively). Conclusions Theoretically, this study has (1) identified knowledge-intensive website attributes, (2) enhanced the theoretical foundation of eHealth from the information systems (IS) perspective by adopting the expectation-confirmation theory (ECT), and (3) examined the importance of information and knowledge attributes and explained their impact on user satisfaction. Practically, our empirical results suggest that perceived website quality (ie, information quality, information presentation, and website attractiveness) is a core requirement for knowledge building. In addition, our study has also shown that knowledge confirmation has a greater effect on satisfaction than both knowledge expectation and perceived usefulness. PMID:22047810

  6. A Survey of Precipitation Data for Environmental Modeling

    EPA Science Inventory

    This report explores the types of precipitation data available for environmental modeling. Precipitation is the main driver in the hydrological cycle and modelers use this information to understand water quality and water availability. Models use observed precipitation informatio...

  7. Orienting health care information systems toward quality: how Group Health Cooperative of Puget Sound did it.

    PubMed

    Goverman, I L

    1994-11-01

    Group Health Cooperative of Puget Sound (GHC), a large staff-model health maintenance organization based in Seattle, is redesigning its information systems to provide the systems and information needed to support its quality agenda. Long-range planning for GHC's information resources was done in three phases. In assessment, interviews, surveys, and a benchmarking effort identified strengths and weaknesses of the existing information systems. We concluded that we needed to improve clinical care and patient management systems and enhance health plan applications. In direction setting, we developed six objectives (for example, approach information systems in a way that is consistent with quality improvement principles). Detailed planning was used to define projects, timing, and resource allocations. Some of the most important efforts in the resulting five-year plan include the development of (1) a computerized patient record; (2) a provider-based clinical workstation for access to patient information, order entry, results reporting, guidelines, and reminders; (3) a comprehensive set of patient management and service quality systems; (4) reengineered structures, policies, and processes within the health plan, supported by a complete set of integrated information systems; (5) a standardized, high-capacity communications network to provide linkages both within GHC and among its business partners; and (6) a revised oversight structure for information services, which forms partnerships with users. A quality focus ensured that each project not only produced its own benefits but also supported the larger organizational goals associated with "total" quality.

  8. Importance of A Priori Vertical Ozone Profiles for TEMPO Air Quality Retrievals

    NASA Astrophysics Data System (ADS)

    Johnson, M. S.; Sullivan, J. T.; Liu, X.; Zoogman, P.; Newchurch, M.; Kuang, S.; McGee, T. J.; Leblanc, T.

    2017-12-01

Ozone (O3) is a toxic pollutant which plays a major role in air quality. Typically, monitoring of surface air quality and O3 mixing ratios is conducted using in situ measurement networks. This is partly because high-quality air quality information from space-borne platforms is limited by coarse spatial resolution, limited temporal frequency, and minimal sensitivity to lower-tropospheric and surface-level O3. The Tropospheric Emissions: Monitoring of Pollution (TEMPO) satellite is designed to address the limitations of current space-based platforms and to improve our ability to monitor North American air quality. TEMPO will provide hourly data on total column and vertical profiles of O3 with high spatial resolution, to be used as a near-real-time air quality product. TEMPO O3 retrievals will apply the Smithsonian Astrophysical Observatory profile algorithm, developed based on work from GOME, GOME-2, and OMI. This algorithm is suggested to use a priori O3 profile information from a climatological database developed from long-term ozonesonde measurements (the tropopause-based (TB-Clim) O3 climatology). This study evaluates the TB-Clim dataset and model-simulated O3 profiles, which could potentially serve as a priori O3 profile information in TEMPO retrievals, from near-real-time data assimilation model products (NASA GMAO's operational GEOS-5 FP model and reanalysis data from MERRA2) and a full chemical transport model (CTM), GEOS-Chem. In this study, vertical profile products are evaluated against surface (0-2 km) and tropospheric (0-10 km) TOLNet observations, and the theoretical impact of individual a priori profile sources on the accuracy of TEMPO O3 retrievals in the troposphere and at the surface is presented. 
Results indicate that while the TB-Clim climatological dataset can replicate seasonally-averaged tropospheric O3 profiles, model-simulated profiles from a full CTM resulted in more accurate tropospheric and surface-level O3 retrievals from TEMPO when compared to hourly and daily-averaged TOLNet observations. Furthermore, it is shown that when large surface O3 mixing ratios are observed, TEMPO retrieval values at the surface are most accurate when applying CTM a priori profile information compared to all other data products.

  9. Winter wheat quality monitoring and forecasting system based on remote sensing and environmental factors

    NASA Astrophysics Data System (ADS)

    Haiyang, Yu; Yanmei, Liu; Guijun, Yang; Xiaodong, Yang; Dong, Ren; Chenwei, Nie

    2014-03-01

To achieve dynamic winter wheat quality monitoring and forecasting over larger regions, the objective of this study was to design and develop a winter wheat quality monitoring and forecasting system using a remote sensing index and environmental factors. The winter wheat quality trend was forecasted before harvest, and quality was monitored after harvest. The traditional quality-vegetation index models for remote sensing monitoring and forecasting were improved: combined with latitude information, the vegetation index was used in the early growth stages to estimate agronomy parameters related to winter wheat quality and thereby forecast the quality trend. A combination of environmental factors, including May rainfall, May temperature, illumination in late May, and soil available nitrogen content, established the quality monitoring model. Compared with a simple quality-vegetation index, the remote sensing monitoring and forecasting models used in this system achieved greatly improved accuracy. Winter wheat quality was monitored and forecasted with the above models, and the system was implemented using WebGIS technology. Finally, the operation of the winter wheat quality monitoring system was demonstrated in Beijing in 2010, with the monitoring and forecasting results output as thematic maps.
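A model of the kind described, a regression combining a vegetation index with latitude and a May weather factor, can be sketched with ordinary least squares via the normal equations. The data, factor choices, and resulting coefficients below are synthetic stand-ins, not the system's calibrated model.

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_ols(X, y):
    """Ordinary least squares via the normal equations X'X b = X'y."""
    n, p = len(X), len(X[0])
    XtX = [[sum(X[i][a] * X[i][c] for i in range(n)) for c in range(p)]
           for a in range(p)]
    Xty = [sum(X[i][a] * y[i] for i in range(n)) for a in range(p)]
    return solve(XtX, Xty)

# Synthetic samples: [1, NDVI, latitude, May rainfall (mm)] -> protein (%)
X = [[1, 0.61, 39.9, 35], [1, 0.55, 40.2, 28], [1, 0.70, 39.5, 42],
     [1, 0.48, 40.8, 22], [1, 0.66, 39.7, 38], [1, 0.52, 40.5, 25]]
y = [13.2, 12.1, 14.0, 11.3, 13.6, 11.8]
beta = fit_ols(X, y)
predict = lambda row: sum(b * v for b, v in zip(beta, row))
```

The same fitting machinery serves both stages described in the abstract: a vegetation-index-plus-latitude model for pre-harvest trend forecasting, and an environmental-factor model for post-harvest monitoring; only the predictor columns change.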

  10. Benefits of Enterprise Ontology for the Development of ICT-Based Value Networks

    NASA Astrophysics Data System (ADS)

    Albani, Antonia; Dietz, Jan L. G.

    The competitiveness of value networks is highly dependent on the cooperation between business partners and the interoperability of their information systems. Innovations in information and communication technology (ICT), primarily the emergence of the Internet, offer possibilities to increase the interoperability of information systems and therefore enable inter-enterprise cooperation. For the design of inter-enterprise information systems, the concept of business component appears to be very promising. However, the identification of business components is strongly dependent on the appropriateness and the quality of the underlying business domain model. The ontological model of an enterprise - or an enterprise network - as presented in this article, is a high-quality and very adequate business domain model. It provides all essential information that is necessary for the design of the supporting information systems, and at a level of abstraction that makes it also understandable for business people. The application of enterprise ontology for the identification of business components is clarified. To exemplify our approach, a practical case is taken from the domain of strategic supply network development. By doing this, a widespread problem of the practical application of inter-enterprise information systems is being addressed.

  11. DeepQA: improving the estimation of single protein model quality with deep belief networks.

    PubMed

    Cao, Renzhi; Bhattacharya, Debswapna; Hou, Jie; Cheng, Jianlin

    2016-12-05

Protein quality assessment (QA) useful for ranking and selecting protein models has long been viewed as one of the major challenges for protein tertiary structure prediction. Especially, estimating the quality of a single protein model, which is important for selecting a few good models out of a large model pool consisting of mostly low-quality models, is still a largely unsolved problem. We introduce a novel single-model quality assessment method DeepQA based on deep belief network that utilizes a number of selected features describing the quality of a model from different perspectives, such as energy, physio-chemical characteristics, and structural information. The deep belief network is trained on several large datasets consisting of models from the Critical Assessment of Protein Structure Prediction (CASP) experiments, several publicly available datasets, and models generated by our in-house ab initio method. Our experiments demonstrate that deep belief network has better performance compared to Support Vector Machines and Neural Networks on the protein model quality assessment problem, and our method DeepQA achieves the state-of-the-art performance on CASP11 dataset. It also outperformed two well-established methods in selecting good outlier models from a large set of models of mostly low quality generated by ab initio modeling methods. DeepQA is a useful deep learning tool for protein single model quality assessment and protein structure prediction. The source code, executable, documentation, and training/test datasets of DeepQA for Linux are freely available to non-commercial users at http://cactus.rnet.missouri.edu/DeepQA/ .
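The core idea, mapping hand-selected per-model features to a single quality score, can be sketched with a far simpler learner than the paper's deep belief network. The single logistic unit, feature names, and training data below are illustrative assumptions, not DeepQA's architecture or inputs.

```python
import math

def train(X, y, lr=0.5, epochs=2000):
    """Fit one logistic unit by stochastic gradient descent."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))
            g = p - yi                       # gradient of log-loss wrt z
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b

def score(features, w, b):
    """Quality score in [0, 1] for one model's feature vector."""
    z = sum(wj * xj for wj, xj in zip(w, features)) + b
    return 1.0 / (1.0 + math.exp(-z))

# Synthetic training set: [normalized energy term, secondary-structure
# agreement, solvent-accessibility agreement] -> 1 good model, 0 poor model.
X = [[0.9, 0.8, 0.85], [0.8, 0.9, 0.8], [0.2, 0.3, 0.25], [0.3, 0.1, 0.2]]
y = [1, 1, 0, 0]
w, b = train(X, y)
```

Ranking a model pool then reduces to sorting by `score`; the paper's contribution is replacing this single unit with a deep belief network that learns richer combinations of the same kinds of features.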

  12. A Framework of the Use of Information in Software Testing

    ERIC Educational Resources Information Center

    Kaveh, Payman

    2010-01-01

    With the increasing role that software systems play in our daily lives, software quality has become extremely important. Software quality is impacted by the efficiency of the software testing process. There are a growing number of software testing methodologies, models, and initiatives to satisfy the need to improve software quality. The main…

  13. Assessment of surgical discharge summaries and evaluation of a new quality improvement model.

    PubMed

    Stein, Ran; Neufeld, David; Shwartz, Ivan; Erez, Ilan; Haas, Ilana; Magen, Ada; Glassberg, Elon; Shmulevsky, Pavel; Paran, Haim

    2014-11-01

Discharge summaries after hospitalization provide the most reliable description and implications of the hospitalization. A concise discharge summary is crucial for maintaining continuity of care through the transition from inpatient to ambulatory care. Discharge summaries often lack information and are imprecise. Errors and insufficient recommendations regarding changes in the medical regimen may harm the patient's health and may result in readmission. Our aim was to evaluate a quality improvement model and training program for writing postoperative discharge summaries for three surgical procedures. Medical records and surgical discharge summaries were reviewed and scored. Essential points for communication between surgeons and family physicians were included in automated forms. Staff was briefed twice regarding required summary contents, with an interim evaluation, and changes in quality were evaluated. Summaries from 61 cholecystectomies, 42 hernioplasties, and 45 colectomies were reviewed. The average quality score of all discharge summaries increased from 72.1 to 78.3 after the first intervention (P < 0.0005) and to 81.0 following the second intervention. As the discharge summary's quality improved, its length decreased significantly. Discharge summaries lack important information and are too long. Developing a model for discharge summaries and instructing surgical staff regarding their contents resulted in measurable improvement. Frequent interventions and supervision are needed to maintain the quality of the surgical discharge summary.

  14. Development and testing of a fast conceptual river water quality model.

    PubMed

    Keupers, Ingrid; Willems, Patrick

    2017-04-15

Modern, model-based river quality management strongly relies on river water quality models to simulate the temporal and spatial evolution of pollutant concentrations in the water body. Such models are typically constructed by extending detailed hydrodynamic models with a component describing the advection-diffusion and water quality transformation processes in a detailed, physically based way. This approach is too demanding of computation time, especially when simulating the long time periods needed for statistical analysis of the results, or when model sensitivity analysis, calibration, and validation require a large number of model runs. To overcome this problem, a structure identification method for setting up a conceptual river water quality model has been developed. Instead of calculating the water quality concentrations at each water level and discharge node, the river branch is divided into conceptual reservoirs based on user information such as locations of interest and boundary inputs. These reservoirs are modelled as Plug Flow Reactors (PFRs) and Continuously Stirred Tank Reactors (CSTRs) to describe advection and diffusion processes. The same water quality transformation processes as in the detailed models are considered, but with residence times adjusted on the basis of the hydrodynamic simulation results and calibrated to the detailed water quality simulation results. The developed approach allows for a much faster calculation time (by a factor of 10^5) without significant loss of accuracy, making it feasible to perform time-demanding scenario runs. Copyright © 2017 Elsevier Ltd. All rights reserved.
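The conceptual-reservoir idea can be sketched as a chain of CSTRs in series, with a single first-order decay standing in for the water quality transformation processes. Tank volume, flow, and rate constants below are illustrative values, not calibrated ones.

```python
def simulate_cstr_chain(n_tanks, volume, flow, k_decay, c_in, dt, steps):
    """Explicit-Euler simulation of n CSTRs in series:
    dC_i/dt = (Q/V) * (C_{i-1} - C_i) - k * C_i
    with C_0 = c_in (constant upstream boundary concentration).
    """
    conc = [0.0] * n_tanks
    for _ in range(steps):
        upstream = c_in
        new = []
        for c in conc:
            dcdt = (flow / volume) * (upstream - c) - k_decay * c
            new.append(c + dt * dcdt)
            upstream = c        # old value of this tank feeds the next one
        conc = new
    return conc

# One day (1440 steps of 60 s) of a constant 10 mg/L load through 3 reservoirs.
profile = simulate_cstr_chain(n_tanks=3, volume=5e4, flow=2.0,
                              k_decay=2e-6, c_in=10.0, dt=60.0, steps=1440)
```

Each reservoir both attenuates (decay) and lags (storage) the upstream signal, which is why a handful of CSTRs can stand in for thousands of hydrodynamic nodes at a tiny fraction of the run time; the calibration step then tunes residence times so the lumped response matches the detailed model's.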

  15. Framework for Design of Traceability System on Organic Rice Certification

    NASA Astrophysics Data System (ADS)

    Purwandoko, P. B.; Seminar, K. B.; Sutrisno; Sugiyanta

    2018-05-01

Nowadays, preference for organic products such as organic rice has increased, because people's awareness of healthy and eco-friendly food consumption has grown. It is therefore very important to ensure the organic quality of the product to be produced. Certification is a series of processes conducted to ensure that the quality of a product meets all criteria of organic standards. Currently, no traceability information system for organic rice certification is available; the process is still conducted manually, causing loss of information during storage. This paper aims to develop a traceability framework for the organic rice certification process. First, the organic certification process itself is discussed. Second, the Unified Modeling Language (UML) is used to model user requirements in order to develop a traceability system for all actors in the certification process. The paper then explains the model of information captured along the certification process, showing the information flow that must be recorded for each actor. Finally, the challenges in implementing the system are discussed.

  16. Information technology model for evaluating emergency medicine teaching

    NASA Astrophysics Data System (ADS)

    Vorbach, James; Ryan, James

    1996-02-01

    This paper describes work in progress to develop an Information Technology (IT) model and supporting information system for the evaluation of clinical teaching in the Emergency Medicine (EM) Department of North Shore University Hospital. In the academic hospital setting student physicians, i.e. residents, and faculty function daily in their dual roles as teachers and students respectively, and as health care providers. Databases exist that are used to evaluate both groups in either academic or clinical performance, but rarely has this information been integrated to analyze the relationship between academic performance and the ability to care for patients. The goal of the IT model is to improve the quality of teaching of EM physicians by enabling the development of integrable metrics for faculty and resident evaluation. The IT model will include (1) methods for tracking residents in order to develop experimental databases; (2) methods to integrate lecture evaluation, clinical performance, resident evaluation, and quality assurance databases; and (3) a patient flow system to monitor patient rooms and the waiting area in the Emergency Medicine Department, to record and display status of medical orders, and to collect data for analyses.

  17. Complex adaptive systems: a tool for interpreting responses and behaviours.

    PubMed

    Ellis, Beverley

    2011-01-01

Quality improvement is a priority for health services worldwide. There are many barriers to implementing change at the locality level, and misinterpreting responses and behaviours can effectively block change. Electronic health records will influence the means by which knowledge and information are generated and sustained among those operating quality improvement programmes. The aim was to explain how complex adaptive system (CAS) theory provides a useful tool and new insight into the responses and behaviours that relate to quality improvement programmes in primary care enabled by informatics. Case studies were conducted in two English localities that participated in the implementation and development of quality improvement programmes. The research strategy included purposefully sampled case studies, conducted within a social constructionist ontological perspective. Responses and behaviours of quality improvement programmes in the two localities include both positive and negative influences associated with a networked model of governance. Pressures of time, resources, and workload are common issues, along with the need for education and training about capturing, coding, recording, and sharing information held within electronic health records to support various information requirements. Primary care informatics enables information symmetry among those operating quality improvement programmes by making some aspects of care explicit, allowing consensus about quality improvement priorities and implementable solutions.

  18. Class Model Development Using Business Rules

    NASA Astrophysics Data System (ADS)

    Skersys, Tomas; Gudas, Saulius

    New developments in the area of computer-aided system engineering (CASE) greatly improve processes of the information systems development life cycle (ISDLC). Much effort is put into quality improvement issues, but IS development projects still suffer from the poor quality of models during the system analysis and design cycles. To some degree, the quality of models that are developed using CASE tools can be assured using various automated model comparison and syntax checking procedures. It is also reasonable to check these models against the business domain knowledge, but the domain knowledge stored in the repository of a CASE tool (enterprise model) is insufficient (Gudas et al. 2004). Involvement of business domain experts in these processes is complicated because non-IT people often find it difficult to understand models that were developed by IT professionals using some specific modeling language.

  19. Tracking and forecasting the Nation’s water quality - Priorities and strategies for 2013-2023

    USGS Publications Warehouse

    Rowe, Gary L.; Gilliom, Robert J.; Woodside, Michael D.

    2013-01-01

    Water-quality issues facing the Nation are growing in number and complexity, and solutions are becoming more challenging and costly. Key factors that affect the quality of our drinking water supplies and ecosystem health include contaminants of human and natural origin in streams and groundwater; excess nutrients and sediment; alteration of natural streamflow; eutrophication of lakes, reservoirs, and coastal estuaries; and changes in surface and groundwater quality associated with changes in climate, land and water use, and management practices. Tracking and forecasting the Nation's water quality in the face of these and other pressing water-quality issues are important goals for 2013-2023, the third decade of the U.S. Geological Survey's National Water-Quality Assessment (NAWQA) program. In consultation with stakeholders and the National Research Council, a new strategic Science Plan has been developed that describes a strategy for building upon and enhancing assessment of the Nation's freshwater quality and aquatic ecosystems. The plan continues strategies that have been central to the NAWQA program's long-term success, but it also makes adjustments to the monitoring and modeling approaches NAWQA will use to address critical data and science information needs identified by stakeholders. This fact sheet describes surface-water and groundwater monitoring and modeling activities that will start in fiscal year 2013. It also provides examples of the types of data and information products planned for the next decade, including (1) restored monitoring for reliable and timely status and trend assessments, (2) maps and models that show the distribution of selected contaminants (such as atrazine, nitrate, and arsenic) in streams and aquifers, and (3) Web-based modeling tools that allow managers to evaluate how water quality may change in response to different scenarios of population growth, climate change, or land-use management.

  20. The importance of data quality for generating reliable distribution models for rare, elusive, and cryptic species

    Treesearch

    Keith B. Aubry; Catherine M. Raley; Kevin S. McKelvey

    2017-01-01

    The availability of spatially referenced environmental data and species occurrence records in online databases enables practitioners to easily generate species distribution models (SDMs) for a broad array of taxa. Such databases often include occurrence records of unknown reliability, yet little information is available on the influence of data quality on SDMs generated...

  1. Improving the geomagnetic field modeling with a selection of high-quality archaeointensity data

    NASA Astrophysics Data System (ADS)

    Pavon-Carrasco, Francisco Javier; Gomez-Paccard, Miriam; Herve, Gwenael; Osete, Maria Luisa; Chauvin, Annick

    2014-05-01

    Geomagnetic field reconstructions for the last millennia are based on archeomagnetic data. However, the scatter of the archaeointensity data is very puzzling and clearly suggests that some of the intensity data might not be reliable. In this work we apply different selection criteria to the European and Western Asian archaeointensity data covering the last three millennia in order to investigate whether data selection affects geomagnetic field model results. Thanks to the recently developed archeomagnetic databases, new valuable information related to the methodology used to determine the archeointensity data is now available. We therefore used this information to rank the archaeointensity data in four quality categories depending on the methodology used during the laboratory treatment of the samples and on the number of specimens retained to calculate the mean intensities. Results show that the intensity component of the geomagnetic field given by the regional models hardly depends on the quality of the selected data. When all the available data are used, a different behavior of the geomagnetic field is observed in Western and Eastern Europe. However, when the regional model is obtained from a selection of high-quality intensity data, the same features are observed at the European scale.

  2. Modelling and management of subjective information in a fuzzy setting

    NASA Astrophysics Data System (ADS)

    Bouchon-Meunier, Bernadette; Lesot, Marie-Jeanne; Marsala, Christophe

    2013-01-01

    Subjective information is very natural for human beings. It is an issue at the crossroads of cognition, semiotics, linguistics, and psycho-physiology. Its management requires dedicated methods, among which we point out the usefulness of fuzzy and possibilistic approaches and related methods, such as evidence theory. We distinguish three aspects of subjectivity: the first deals with perception and sensory information, including the elicitation of quality assessment and the establishment of a link between physical and perceived properties; the second is related to emotions, their fuzzy nature, and their identification; and the last aspect stems from natural language and takes into account information quality and reliability of information.

  3. Trends and Issues in U.S. Navy Manpower

    DTIC Science & Technology

    1985-01-01

    Planning (ADSTAP) system7, consists of several subsystems and models for planning and managing enlisted manpower, personnel, and training. It was... models to provide information for formulating goals and planning the transition from current inventory to estab- lished objectives 9 Operational...planning models to provide information for formulating operating plans to control the size and quality (ratings or skills and pay grades) of the active-duty

  4. [Application of entropy-weight TOPSIS model in synthetical quality evaluation of Angelica sinensis growing in Gansu Province].

    PubMed

    Gu, Zhi-rong; Wang, Ya-li; Sun, Yu-jing; Dind, Jun-xia

    2014-09-01

    To investigate the establishment and application of the entropy-weight TOPSIS model in the synthetical quality evaluation of traditional Chinese medicine, with Angelica sinensis growing in Gansu Province as an example. The contents of ferulic acid, 3-butylphthalide, Z-butylidenephthalide, Z-ligustilide, linolic acid, volatile oil, and ethanol-soluble extractive were used as an evaluation index set. The weights of each evaluation index were determined by the information entropy method. The entropy-weight TOPSIS model was established to synthetically evaluate the quality of Angelica sinensis growing in Gansu Province by the Euclidean closeness degree. The results based on the established model were in line with the daodi meaning and the knowledge of clinical experience. The established model is simple in calculation, objective, and reliable, and can be applied to the synthetical quality evaluation of traditional Chinese medicine.
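
    The entropy-weight TOPSIS procedure summarized above can be sketched as follows. This is a minimal illustration assuming benefit-type indices; the function names and the toy score matrix are invented, not the paper's data.

```python
import numpy as np

def entropy_weights(X):
    """Objective index weights from Shannon entropy; X is an
    (alternatives x indices) matrix of positive benefit-type scores."""
    P = X / X.sum(axis=0)                        # column-wise proportions
    k = 1.0 / np.log(X.shape[0])
    logP = np.log(P, out=np.zeros_like(P), where=P > 0)
    E = -k * (P * logP).sum(axis=0)              # entropy per index
    d = 1.0 - E                                  # degree of diversification
    return d / d.sum()

def topsis_closeness(X, w):
    """Euclidean closeness of each alternative to the ideal solution."""
    V = w * X / np.sqrt((X ** 2).sum(axis=0))    # weighted normalized matrix
    ideal, anti = V.max(axis=0), V.min(axis=0)
    d_pos = np.sqrt(((V - ideal) ** 2).sum(axis=1))
    d_neg = np.sqrt(((V - anti) ** 2).sum(axis=1))
    return d_neg / (d_pos + d_neg)               # higher = closer to ideal

# Three hypothetical samples scored on two indices (e.g. ferulic acid,
# volatile oil); higher closeness = better synthetical quality.
scores = np.array([[1.0, 2.5], [2.0, 4.0], [3.0, 6.5]])
rank_values = topsis_closeness(scores, entropy_weights(scores))
```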

  5. Cumulative uncertainty in measured streamflow and water quality data for small watersheds

    USGS Publications Warehouse

    Harmel, R.D.; Cooper, R.J.; Slade, R.M.; Haney, R.L.; Arnold, J.G.

    2006-01-01

    The scientific community has not established an adequate understanding of the uncertainty inherent in measured water quality data, which is introduced by four procedural categories: streamflow measurement, sample collection, sample preservation/storage, and laboratory analysis. Although previous research has produced valuable information on relative differences in procedures within these categories, little information is available that compares the procedural categories or presents the cumulative uncertainty in resulting water quality data. As a result, quality control emphasis is often misdirected, and data uncertainty is typically either ignored or accounted for with an arbitrary margin of safety. Faced with the need for scientifically defensible estimates of data uncertainty to support water resource management, the objectives of this research were to: (1) compile selected published information on uncertainty related to measured streamflow and water quality data for small watersheds, (2) use a root mean square error propagation method to compare the uncertainty introduced by each procedural category, and (3) use the error propagation method to determine the cumulative probable uncertainty in measured streamflow, sediment, and nutrient data. Best case, typical, and worst case "data quality" scenarios were examined. Averaged across all constituents, the calculated cumulative probable uncertainty (±%) contributed under typical scenarios ranged from 6% to 19% for streamflow measurement, from 4% to 48% for sample collection, from 2% to 16% for sample preservation/storage, and from 5% to 21% for laboratory analysis. Under typical conditions, errors in storm loads ranged from 8% to 104% for dissolved nutrients, from 8% to 110% for total N and P, and from 7% to 53% for TSS. Results indicated that uncertainty can increase substantially under poor measurement conditions and limited quality control effort.
This research provides introductory scientific estimates of uncertainty in measured water quality data. The results and procedures presented should also assist modelers in quantifying the "quality"of calibration and evaluation data sets, determining model accuracy goals, and evaluating model performance.
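
    The root mean square error propagation used in this study can be sketched in a few lines; the component percentages below are invented mid-range placeholders, not values from the paper.

```python
import math

def rms_cumulative_uncertainty(components):
    """Combine independent procedural uncertainties (each in percent)
    into one cumulative probable uncertainty by root mean square
    (square-root-of-sum-of-squares) propagation."""
    return math.sqrt(sum(u ** 2 for u in components))

# Hypothetical mid-range values for the four procedural categories:
# streamflow, sample collection, preservation/storage, lab analysis.
cumulative = rms_cumulative_uncertainty([12.0, 20.0, 8.0, 12.0])
```

    Because the components add in quadrature, the largest single source (here, sample collection) dominates the cumulative value, which is why quality control effort is best spent there first.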

  6. Modelling health care processes for eliciting user requirements: a way to link a quality paradigm and clinical information system design.

    PubMed

    Staccini, P; Joubert, M; Quaranta, J F; Fieschi, D; Fieschi, M

    2001-12-01

    Healthcare institutions are looking at ways to increase their efficiency by reducing costs while providing care services with a high level of safety. Thus, hospital information systems have to support quality improvement objectives. The elicitation of the requirements has to meet users' needs in relation to both the quality (efficacy, safety) and the monitoring of all health care activities (traceability). Information analysts need methods to conceptualise clinical information systems that provide actors with individual benefits and guide behavioural changes. A methodology is proposed to elicit and structure users' requirements using a process-oriented analysis, and it is applied to the blood transfusion process. An object-oriented data model of a process has been defined in order to organise the data dictionary. Although some aspects of activity, such as 'where', 'what else', and 'why' are poorly represented by the data model alone, this method of requirement elicitation fits the dynamic of data input for the process to be traced. A hierarchical representation of hospital activities has to be found for the processes to be interrelated, and for their characteristics to be shared, in order to avoid data redundancy and to fit the gathering of data with the provision of care.

  7. 30 CFR 550.218 - What air emissions information must accompany the EP?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... quality modeling, you must use the guidelines in appendix W of 40 CFR part 51 with a model approved by the.... (f) Modeling report. A modeling report or the modeling results (if § 550.303 requires you to use an...

  8. 30 CFR 550.218 - What air emissions information must accompany the EP?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... quality modeling, you must use the guidelines in Appendix W of 40 CFR part 51 with a model approved by the.... (f) Modeling report. A modeling report or the modeling results (if § 550.303 requires you to use an...

  9. 30 CFR 550.218 - What air emissions information must accompany the EP?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... quality modeling, you must use the guidelines in Appendix W of 40 CFR part 51 with a model approved by the.... (f) Modeling report. A modeling report or the modeling results (if § 550.303 requires you to use an...

  10. [Real-time detection of quality of Chinese materia medica: strategy of NIR model evaluation].

    PubMed

    Wu, Zhi-sheng; Shi, Xin-yuan; Xu, Bing; Dai, Xing-xing; Qiao, Yan-jiang

    2015-07-01

    The definition of critical quality attributes of Chinese materia medica (CMM) was put forward based on the top-level design concept. With the development of rapid analytical science, rapid assessment of the critical quality attributes of CMM, a secondary discipline branch of CMM, was carried out for the first time. Taking near-infrared (NIR) spectroscopy, a rapid analytical technology applied to pharmaceutical processes over the past decade, as an example, the chemometric parameters used in NIR model evaluation are systematically reviewed. According to the complexity of CMM and the characteristics of trace-component analysis, a multi-source information fusion strategy for NIR models was developed for assessing the critical quality attributes of CMM. The strategy provides a guideline for reliable NIR analysis of the critical quality attributes of CMM.

  11. Importance of a Priori Vertical Ozone Profiles for TEMPO Air Quality Retrievals

    NASA Technical Reports Server (NTRS)

    Johnson, Matthew S.; Sullivan, John; Liu, Xiong; Zoogman, Peter; Newchurch, Mike; Kuang, Shi; McGee, Thomas; Leblanc, Thierry

    2017-01-01

    Ozone (O3) is a toxic pollutant which plays a major role in air quality. Typically, monitoring of surface air quality and O3 mixing ratios is conducted using in situ measurement networks. This is partially due to high-quality information related to air quality being limited from space-borne platforms due to coarse spatial resolution, limited temporal frequency, and minimal sensitivity to lower tropospheric and surface-level O3. The Tropospheric Emissions: Monitoring of Pollution (TEMPO) satellite is designed to address the limitations of current space-based platforms and to improve our ability to monitor North American air quality. TEMPO will provide hourly data of total column and vertical profiles of O3 with high spatial resolution to be used as a near-real-time air quality product. TEMPO O3 retrievals will apply the Smithsonian Astrophysical Observatory profile algorithm developed based on work from GOME (Global Ozone Monitoring Experiment), GOME-2, and OMI (Ozone Monitoring Instrument). This algorithm is suggested to use a priori O3 profile information from a climatological database developed from long-term ozonesonde measurements (tropopause-based (TB-Clim) O3 climatology). This study evaluates the TB-Clim dataset and model-simulated O3 profiles, which could potentially serve as a priori O3 profile information in TEMPO retrievals, from near-real-time data assimilation model products (NASA GMAO's (Global Modeling and Assimilation Office) operational GEOS-5 (Goddard Earth Observing System, Version 5) FP (Forecast Products) model and reanalysis data from MERRA2 (Modern-Era Retrospective analysis for Research and Applications, Version 2)) and a full chemical transport model (CTM), GEOS-Chem.
In this study, vertical profile products are evaluated with surface (0-2 kilometers) and tropospheric (0-10 kilometers) TOLNet (Tropospheric Ozone Lidar Network) observations, and the theoretical impact of individual a priori profile sources on the accuracy of TEMPO O3 retrievals in the troposphere and at the surface is presented. Results indicate that while the TB-Clim climatological dataset can replicate seasonally-averaged tropospheric O3 profiles, model-simulated profiles from a full CTM resulted in more accurate tropospheric and surface-level O3 retrievals from TEMPO when compared to hourly and daily-averaged TOLNet observations. Furthermore, it is shown that when large surface O3 mixing ratios are observed, TEMPO retrieval values at the surface are most accurate when applying CTM a priori profile information compared to all other data products.

  12. Gene Wiki Reviews-Raising the quality and accessibility of information about the human genome.

    PubMed

    Tsueng, Ginger; Good, Benjamin M; Ping, Peipei; Golemis, Erica; Hanukoglu, Israel; van Wijnen, Andre J; Su, Andrew I

    2016-11-05

    Wikipedia and other openly available resources are increasingly becoming commonly used sources of information not just among the lay public but even in academic circles including undergraduate students and postgraduate trainees. To enhance the quality of the Wikipedia articles, in 2013, we initiated the Gene Wiki Reviews on genes and proteins as a series of invited reviews that stipulated editing the corresponding Wikipedia article(s) that would be also subject to peer-review. Thus, while the review article serves as an authoritative snapshot of the field, the "living article" can continue to evolve with the crowdsourcing model of Wikipedia. After publication of over 50 articles, we surveyed the authors to assess the impact of the project. The author survey results revealed that the Gene Wiki project is achieving its major objectives to increase the involvement of scientists in authoring Wikipedia articles and to enhance the quantity and quality of the information about genes and their protein products. Thus, the dual publication model introduced in the Gene Wiki Reviews series represents a valuable innovation in scientific publishing and biomedical knowledge management. We invite experts on specific genes to contact the editors to take part in this project to enhance the quality and accessibility of information about the human genome. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Increasing the Use of Earth Science Data and Models in Air Quality Management.

    PubMed

    Milford, Jana B; Knight, Daniel

    2017-04-01

    In 2010, the U.S. National Aeronautics and Space Administration (NASA) initiated the Air Quality Applied Science Team (AQAST) as a 5-year, $17.5-million award with 19 principal investigators. AQAST aims to increase the use of Earth science products in air quality-related research and to help meet air quality managers' information needs. We conducted a Web-based survey and a limited number of follow-up interviews to investigate federal, state, tribal, and local air quality managers' perspectives on usefulness of Earth science data and models, and on the impact AQAST has had. The air quality managers we surveyed identified meeting the National Ambient Air Quality Standards for ozone and particulate matter, emissions from mobile sources, and interstate air pollution transport as top challenges in need of improved information. Most survey respondents viewed inadequate coverage or frequency of satellite observations, data uncertainty, and lack of staff time or resources as barriers to increased use of satellite data by their organizations. Managers who have been involved with AQAST indicated that the program has helped build awareness of NASA Earth science products, and assisted their organizations with retrieval and interpretation of satellite data and with application of global chemistry and climate models. AQAST has also helped build a network between researchers and air quality managers with potential for further collaborations. NASA's Air Quality Applied Science Team (AQAST) aims to increase the use of satellite data and global chemistry and climate models for air quality management purposes, by supporting research and tool development projects of interest to both groups. Our survey and interviews of air quality managers indicate they found value in many AQAST projects and particularly appreciated the connections to the research community that the program facilitated. 
Managers expressed interest in receiving continued support for their organizations' use of satellite data, including assistance in retrieving and interpreting data from future geostationary platforms meant to provide more frequent coverage for air quality and other applications.

  14. Indicators to support the dynamic evaluation of air quality models

    NASA Astrophysics Data System (ADS)

    Thunis, P.; Clappier, A.

    2014-12-01

    Air quality models are useful tools for the assessment and forecast of pollutant concentrations in the atmosphere. Most of the evaluation process relies on the “operational phase”, in other words the comparison of model results with available measurements, which provides insight into the model's capability to reproduce measured concentrations for a given application. But one of the key advantages of air quality models lies in their ability to assess the impact of precursor emission reductions on air quality levels. Models are then used in a dynamic mode (i.e. response to a change in a given model input data), for which evaluation of model performance becomes a challenge. The objective of this work is to propose common indicators and diagrams to facilitate the understanding of model responses to emission changes when models are to be used for policy support. These indicators are shown to be useful for retrieving information on the magnitude of the locally produced impacts of emission reductions on concentrations with respect to the “external to the domain” contribution, but also for identifying, distinguishing and quantifying impacts arising from different factors (different precursors). In addition, information about the robustness of the model results is provided. As such, these indicators might prove useful as a first screening methodology to identify the feasibility of a given action as well as to prioritize the factors on which to act for increased efficiency. Finally, all indicators are made dimensionless to facilitate the comparison of results obtained with different models, different resolutions, or on different geographical areas.

  15. Competition and quality in a physiotherapy market with fixed prices.

    PubMed

    Pekola, Piia; Linnosmaa, Ismo; Mikkola, Hennamari

    2017-01-01

    Our study focuses on competition and quality in physiotherapy organized and regulated by the Social Insurance Institution of Finland (Kela). We first derive a hypothesis with a theoretical model and then perform empirical analyses of the data. Within the physiotherapy market, prices are regulated by Kela, and after registration eligible firms are accepted to join a pool of firms from which patients choose service providers based on their individual preferences. By using 2SLS estimation techniques, we analyzed the relationship among quality, competition and regulated price. According to the results, competition has a statistically significant (yet weak) negative effect (p = 0.019) on quality. The outcome for quality is likely caused by imperfect information. It seems that Kela has provided too little information for patients about the quality of the service.
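
    The 2SLS estimation the authors mention can be sketched as follows, under the usual instrumental-variable assumptions. The simulated data and the instrument below are invented for illustration and are not Kela's data or the study's actual specification.

```python
import numpy as np

def two_stage_least_squares(y, X, Z):
    """2SLS: stage 1 projects the endogenous regressors X onto the
    instruments Z; stage 2 regresses y on the stage-1 fitted values."""
    X_hat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]   # first stage
    return np.linalg.lstsq(X_hat, y, rcond=None)[0]    # second stage

# Simulated example: x (say, local competition) is endogenous because it
# shares the unobserved shock u with y (say, quality); z instruments x.
rng = np.random.default_rng(0)
n = 2000
z = rng.normal(size=(n, 1))
u = rng.normal(size=(n, 1))
x = z + 0.5 * u + 0.5 * rng.normal(size=(n, 1))
y = 2.0 * x + u + 0.5 * rng.normal(size=(n, 1))
beta = two_stage_least_squares(y, x, z)   # close to the true effect, 2.0
```

    Ordinary least squares on the same data would be biased upward because x and u are correlated; instrumenting with z removes that bias.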

  16. Industrial pollution and the management of river water quality: a model of Kelani River, Sri Lanka.

    PubMed

    Gunawardena, Asha; Wijeratne, E M S; White, Ben; Hailu, Atakelty; Pandit, Ram

    2017-08-19

    Water quality of the Kelani River has become a critical issue in Sri Lanka due to the high cost of maintaining drinking water standards and the market and non-market costs of deteriorating river ecosystem services. By integrating a catchment model with a river model of water quality, we developed a method to estimate the effect of pollution sources on ambient water quality. Using integrated model simulations, we estimate (1) the relative contribution from point (industrial and domestic) and non-point sources (river catchment) to river water quality and (2) pollutant transfer coefficients for zones along the lower section of the river. Transfer coefficients provide the basis for policy analyses in relation to the location of new industries and the setting of priorities for industrial pollution control. They also offer valuable information to design socially optimal economic policy to manage industrialized river catchments.
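
    As an illustration of how such transfer coefficients might be used to screen a siting decision (all coefficients, loads, and units below are invented, not results from the Kelani study):

```python
import numpy as np

# Hypothetical zonal transfer coefficients: incremental concentration at a
# downstream checkpoint (mg/L) per unit pollutant load (kg/day) per zone.
T = np.array([0.020, 0.012, 0.004])         # zones A, B, C
loads = np.array([120.0, 300.0, 450.0])     # current loads, kg/day

baseline = T @ loads                        # current ambient contribution
proposed = loads + np.array([0.0, 50.0, 0.0])   # new discharge in zone B
increment = T @ proposed - baseline         # screening-level impact, mg/L
```

    Because the model is linear in loads, the same coefficients rank candidate zones and size the abatement needed to hold ambient quality constant.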

  17. Effects of argument quality, source credibility and self-reported diabetes knowledge on message attitudes: an experiment using diabetes related messages.

    PubMed

    Lin, Tung-Cheng; Hwang, Lih-Lian; Lai, Yung-Jye

    2017-05-17

    Previous studies have reported that credibility and content (argument quality) are the most critical factors affecting the quality of health information and its acceptance and use; however, this causal relationship merits further investigation in the context of health education. Moreover, message recipients' prior knowledge may moderate these relationships. This study used the elaboration likelihood model to determine the main effects of argument quality and source credibility and the moderating effect of self-reported diabetes knowledge on message attitudes. A between-subjects experimental design using an educational message concerning diabetes for manipulation was applied to validate the effects empirically. A total of 181 participants without diabetes were recruited from the Department of Health, Taipei City Government. Four group messages were manipulated in terms of argument quality (high and low) × source credibility (high and low). Argument quality and source credibility of health information significantly influenced the attitudes of message recipients. Participants with high self-reported knowledge exhibited significant disapproval of messages with low argument quality. Effective health information should provide objective descriptions and cite reliable sources; in addition, it should provide accurate, customised messages for recipients who have a high background knowledge level and the ability to discern message quality. © 2017 Health Libraries Group Health Information & Libraries Journal.

  18. Progress and lessons learned from water-quality monitoring networks

    USGS Publications Warehouse

    Myers, Donna N.; Ludtke, Amy S.

    2017-01-01

    Stream-quality monitoring networks in the United States were initiated and expanded after passage of successive federal water-pollution control laws from 1948 to 1972. The first networks addressed information gaps on the extent and severity of stream pollution and served as early warning systems for spills. From 1965 to 1972, monitoring networks expanded to evaluate compliance with stream standards, track emerging issues, and assess water-quality status and trends. After 1972, concerns arose regarding the ability of monitoring networks to determine if water quality was getting better or worse and why. As a result, monitoring networks adopted a hydrologic systems approach targeted to key water-quality issues, accounted for human and natural factors affecting water quality, innovated new statistical methods, and introduced geographic information systems and models that predict water quality at unmeasured locations. Despite improvements, national-scale monitoring networks have declined over time. Only about 1%, or 217, of more than 36,000 US Geological Survey monitoring sites sampled from 1975 to 2014 have been operated throughout the four decades since passage of the 1972 Clean Water Act. Efforts to sustain monitoring networks are important because these networks have collected information crucial to the description of water-quality trends over time and are providing information against which to evaluate future trends.

  19. Stochastic empirical loading and dilution model for analysis of flows, concentrations, and loads of highway runoff constituents

    USGS Publications Warehouse

    Granato, Gregory E.; Jones, Susan C.

    2014-01-01

    In cooperation with FHWA, the U.S. Geological Survey developed the stochastic empirical loading and dilution model (SELDM) to supersede the 1990 FHWA runoff quality model. The SELDM tool is designed to transform disparate and complex scientific data into meaningful information about the adverse risks of runoff on receiving waters, the potential need for mitigation measures, and the potential effectiveness of such measures for reducing such risks. The SELDM tool is easy to use because much of the information and data needed to run it are embedded in the model and obtained by defining the site location and five simple basin properties. Information and data from thousands of sites across the country were compiled to facilitate the use of the SELDM tool. A case study illustrates how to use the SELDM tool for conducting the types of sensitivity analyses needed to properly assess water quality risks. For example, the use of deterministic values to model upstream stormflows instead of representative variations in prestorm flow and runoff may substantially overestimate the proportion of highway runoff in downstream flows. Also, the risks for total phosphorus excursions are substantially affected by the selected criteria and the modeling methods used. For example, if a single deterministic concentration is used rather than a stochastic population of values to model upstream concentrations, then the percentage of water quality excursions in the downstream receiving waters may depend entirely on the selected upstream concentration.
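
    The stochastic-versus-deterministic point can be illustrated with a toy Monte Carlo mass-balance dilution calculation. All distributions, parameters, and the criterion below are invented placeholders; SELDM's actual inputs and methods are far richer.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 10_000

# Toy stochastic populations for upstream flow/concentration and runoff.
upstream_flow = rng.lognormal(mean=2.0, sigma=0.6, size=n)    # m^3/s
upstream_conc = rng.lognormal(mean=-2.3, sigma=0.5, size=n)   # mg/L TP
runoff_flow = rng.lognormal(mean=0.0, sigma=0.8, size=n)
runoff_conc = rng.lognormal(mean=-1.2, sigma=0.7, size=n)

# Flow-weighted mass balance at the outfall.
downstream = (upstream_flow * upstream_conc + runoff_flow * runoff_conc) \
    / (upstream_flow + runoff_flow)

excursion_rate = float((downstream > 0.1).mean())  # share above a criterion
```

    Replacing the upstream populations with single deterministic values collapses the spread of `downstream`, which is why the excursion percentage can then depend entirely on the one value chosen.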

  20. Short-term effects of air quality and thermal stress on non-accidental morbidity-a multivariate meta-analysis comparing indices to single measures.

    PubMed

    Lokys, Hanna Leona; Junk, Jürgen; Krein, Andreas

    2018-01-01

    Air quality and thermal stress lead to increased morbidity and mortality. Studies on morbidity and the combined impact of air pollution and thermal stress are still rare. To analyse the correlations between air quality, thermal stress and morbidity, we used a two-stage meta-analysis approach, consisting of a Poisson regression model combined with distributed lag non-linear models (DLNMs) and a meta-analysis investigating whether latitude or the number of inhabitants significantly influence the correlations. We used air pollution, meteorological and hospital admission data from 28 administrative districts along a north-south gradient in western Germany from 2001 to 2011. We compared the performance of the single measure particulate matter (PM10) and air temperature to air quality indices (MPI and CAQI) and the biometeorological index UTCI. Based on the Akaike information criterion (AIC), it can be shown that using air quality indices instead of single measures increases the model strength. However, using the UTCI in the model does not give additional information compared to mean air temperature. Interaction between the 3-day average of air quality (max PM10, max CAQI and max MPI) and meteorology (mean air temperature and mean UTCI) did not improve the models. Using the mean air temperature, we found immediate effects of heat stress (RR 1.0013, 95% CI: 0.9983-1.0043) and by 3 days delayed effects of cold stress (RR: 1.0184, 95% CI: 1.0117-1.0252). The results for air quality differ between both air quality indices and PM10. CAQI and MPI show a delayed impact on morbidity with a maximum RR after 2 days (MPI 1.0058, 95% CI: 1.0013-1.0102; CAQI 1.0068, 95% CI: 1.0030-1.0107). Latitude was identified as a significant meta-variable, whereas the number of inhabitants was not significant in the model.
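
    As a small worked illustration of how a log-linear (Poisson) coefficient translates into the relative risks with confidence intervals reported above (the coefficient and standard error below are invented placeholders, not the study's estimates):

```python
import math

def relative_risk(beta, se, delta=1.0):
    """RR and 95% CI for a `delta`-unit exposure increase, given a
    Poisson log-linear coefficient `beta` with standard error `se`."""
    rr = math.exp(beta * delta)
    lo = math.exp((beta - 1.96 * se) * delta)
    hi = math.exp((beta + 1.96 * se) * delta)
    return rr, (lo, hi)

# e.g. a hypothetical lag-2 coefficient per air quality index unit:
rr, ci = relative_risk(0.0058, 0.0022)
```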

  2. Determining the Uncertainties in Prescribed Burn Emissions Through Comparison of Satellite Estimates to Ground-based Estimates and Air Quality Model Evaluations in Southeastern US

    NASA Astrophysics Data System (ADS)

    Odman, M. T.; Hu, Y.; Russell, A. G.

    2016-12-01

    Prescribed burning is practiced throughout the US, and most widely in the Southeast, for the purpose of maintaining and improving the ecosystem and reducing the wildfire risk. However, prescribed burn emissions contribute significantly to the trace gas and particulate matter loads in the atmosphere. In places where air quality is already stressed by other anthropogenic emissions, prescribed burns can lead to major health and environmental problems. Air quality modeling efforts are under way to assess the impacts of prescribed burn emissions. Operational forecasts of the impacts are also emerging for use in dynamic management of air quality as well as of the burns. Unfortunately, large uncertainties exist in the process of estimating prescribed burn emissions, and these uncertainties limit the accuracy of the burn impact predictions. Prescribed burn emissions are estimated by using either ground-based information or satellite observations. When there is sufficient local information about the burn area, the types of fuels, their consumption amounts, and the progression of the fire, ground-based estimates are more accurate. In the absence of such information, satellites remain the only reliable source for emission estimation. To determine the level of uncertainty in prescribed burn emissions, we compared estimates derived from a burn permit database and other ground-based information to the estimates of the Biomass Burning Emissions Product derived from a constellation of NOAA and NASA satellites. Using these emissions estimates, we conducted simulations with the Community Multiscale Air Quality (CMAQ) model and predicted trace gas and particulate matter concentrations throughout the Southeast for two consecutive burn seasons (2015 and 2016). In this presentation, we will compare model-predicted concentrations to measurements at monitoring stations and evaluate whether the differences are commensurate with our emission uncertainty estimates. We will also investigate whether spatial and temporal patterns in the differences reveal the sources of the uncertainty in the prescribed burn emission estimates.
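Bottom-up burn emission estimates of the kind compared here typically multiply area burned, fuel loading, combustion completeness, and a pollutant emission factor. A sketch with hypothetical values (the function name and all numbers are illustrative, not from the study):

```python
def burn_emissions_kg(area_ha, fuel_load_t_per_ha, combustion_completeness, ef_g_per_kg):
    """Pollutant emissions (kg) from a single burn:
    fuel consumed = area * loading * completeness (converted to kg),
    emissions = consumed fuel * emission factor (g pollutant per kg fuel).
    """
    consumed_kg = area_ha * fuel_load_t_per_ha * combustion_completeness * 1000.0
    return consumed_kg * ef_g_per_kg / 1000.0  # grams -> kilograms

# Hypothetical 40 ha burn: 5 t/ha fuel, 85% consumed, PM2.5 factor of 8 g/kg
pm25_kg = burn_emissions_kg(40, 5.0, 0.85, 8.0)
```

Field-specific area, loading, and completeness replace inventory defaults in exactly these slots, which is why the abstract reports a 60% jump in estimated fuel consumption.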

  3. Predicting the synaptic information efficacy in cortical layer 5 pyramidal neurons using a minimal integrate-and-fire model.

    PubMed

    London, Michael; Larkum, Matthew E; Häusser, Michael

    2008-11-01

    Synaptic information efficacy (SIE) is a statistical measure to quantify the efficacy of a synapse. It measures how much information is gained, on average, about the output spike train of a postsynaptic neuron if the input spike train is known. It is a particularly appropriate measure for assessing the input-output relationship of neurons receiving dynamic stimuli. Here, we compare the SIE of simulated synaptic inputs measured experimentally in layer 5 cortical pyramidal neurons in vitro with the SIE computed from a minimal model constructed to fit the recorded data. We show that even with a simple model that is far from perfect in predicting the precise timing of the output spikes of the real neuron, the SIE can still be accurately predicted. This arises from the ability of the model to predict output spikes influenced by the input more accurately than those driven by the background current. This indicates that in this context, some spikes may be more important than others. Lastly, we demonstrate another aspect where mutual information could be beneficial in evaluating the quality of a model: measuring the mutual information between the model's output and the neuron's output. The SIE could thus be a useful tool for assessing how well models of single neurons preserve the input-output relationship, a property that becomes crucial when these reduced models are connected to construct complex realistic neuronal networks.
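SIE is grounded in the mutual information between input and output spike trains. A minimal plug-in estimator over discretized (binned) trains, purely illustrative of the quantity involved:

```python
import math
from collections import Counter

def mutual_information_bits(x, y):
    """Plug-in mutual information (bits) between two equal-length
    discrete sequences, e.g. binned spike counts."""
    n = len(x)
    joint = Counter(zip(x, y))
    px, py = Counter(x), Counter(y)
    mi = 0.0
    for (a, b), c in joint.items():
        p_ab = c / n
        # p_ab * log2( p_ab / (p_a * p_b) ), with the marginals as counts/n
        mi += p_ab * math.log2(p_ab * n * n / (px[a] * py[b]))
    return mi

# A train compared with itself carries maximal information;
# a constant output carries none.
train = [0, 1, 0, 1, 1, 0, 1, 0]
mi_self = mutual_information_bits(train, train)
mi_flat = mutual_information_bits(train, [0] * len(train))
```

Real SIE estimation must additionally correct for finite-sample bias and handle spike timing at fine resolution, which this sketch ignores.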

  4. Patient involvement in the decision-making process improves satisfaction and quality of life in postmastectomy breast reconstruction.

    PubMed

    Ashraf, Azra A; Colakoglu, Salih; Nguyen, John T; Anastasopulos, Alexandra J; Ibrahim, Ahmed M S; Yueh, Janet H; Lin, Samuel J; Tobias, Adam M; Lee, Bernard T

    2013-09-01

    The patient-physician relationship has evolved from the paternalistic, physician-dominant model to the shared-decision-making and informed-consumerist model. The level of patient involvement in this decision-making process can potentially influence patient satisfaction and quality of life. In this study, patient-physician decision models are evaluated in patients undergoing postmastectomy breast reconstruction. All women who underwent breast reconstruction at an academic hospital from 1999-2007 were identified. Patients meeting inclusion criteria were mailed questionnaires at a minimum of 1 y postoperatively with questions about decision making, satisfaction, and quality of life. There were 707 women eligible for our study and 465 completed surveys (68% response rate). Patients were divided into one of three groups: paternalistic (n = 18), informed-consumerist (n = 307), shared (n = 140). There were differences in overall general satisfaction (P = 0.034), specifically comparing the informed group to the paternalistic group (66.7% versus 38.9%, P = 0.020) and the shared to the paternalistic group (69.3% versus 38.9%, P = 0.016). There were no differences in aesthetic satisfaction. There were differences found in the SF-12 physical component summary score across all groups (P = 0.033), and a difference was found between the informed and paternalistic groups (P < 0.05). There were no differences in the mental component score (P = 0.42). Women undergoing breast reconstruction predominantly used the informed model of decision making. Patients who adopted a more active role, whether using an informed or shared approach, had higher general patient satisfaction and physical component summary scores compared with patients whose decision making was paternalistic. Copyright © 2013 Elsevier Inc. All rights reserved.

  5. Developing a nutrient pollution model to assist policy makers by using a meso-scale Minimum Information Requirement (MIR) approach

    NASA Astrophysics Data System (ADS)

    Adams, R.; Quinn, P. F.; Bowes, M. J.

    2014-09-01

    A model for simulating runoff pathways and water quality fluxes has been developed using the Minimum Information Requirement (MIR) approach. The model, the Catchment Runoff Attenuation Tool (CRAFT), is applicable to meso-scale catchments and focusses primarily on the hydrological pathways that mobilise nutrients. Hence CRAFT can be used to investigate the impact of management intervention strategies designed to reduce the loads of nutrients into receiving watercourses. The model can help policy makers, for example in Europe, meet water quality targets and consider methods to obtain "good" ecological status. A case study of the 414 km2 Frome catchment, Dorset, UK, is described here as an application of the CRAFT model. The model was primarily calibrated on ten years of weekly data to reproduce the observed flows and nutrient (nitrate nitrogen, N, and phosphorus, P) concentrations. Data from two years of sub-daily high-resolution monitoring at the same site were also analysed. These data highlighted some additional signals in the nutrient flux, particularly of soluble reactive phosphorus, which were not observable in the weekly data. This analysis prompted the choice of a daily timestep as the minimum information requirement for this meso-scale modelling study. A management intervention scenario was also run to show how the model can support catchment managers in investigating the effect of reducing the concentrations of N and P in the various flow pathways. This scale-appropriate modelling tool can help policy makers consider a range of strategies to meet the European Union (EU) water quality targets for this type of catchment.
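The nutrient fluxes such a model simulates reduce, at each timestep, to flow times concentration. A toy daily-load calculation and a 30%-reduction intervention scenario (all numbers hypothetical, not CRAFT output):

```python
def daily_load_kg(flow_m3_s, conc_mg_l):
    """Daily nutrient load (kg/day) from mean daily flow and concentration."""
    litres_per_day = flow_m3_s * 1000.0 * 86400.0  # m3/s -> L/day
    return litres_per_day * conc_mg_l / 1e6        # mg -> kg

# Hypothetical baseline vs. an intervention cutting P concentration by 30%
baseline_kg = daily_load_kg(4.2, 0.12)
scenario_kg = daily_load_kg(4.2, 0.12 * 0.7)
```

Summed over a season, differences of this kind are what an intervention-scenario run reports back to catchment managers.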

  6. 77 FR 38239 - Partial Approval and Disapproval of Air Quality Implementation Plans; Arizona; Infrastructure...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-27

    ... anonymous access system, and EPA will not know your identity or contact information unless you provide it in... control measures. Section 110(a)(2)(B): Ambient air quality monitoring/data system. Section 110(a)(2)(C... significant deterioration (PSD) and visibility protection. Section 110(a)(2)(K): Air quality modeling and...

  7. Medicare Program; FY 2017 Hospice Wage Index and Payment Rate Update and Hospice Quality Reporting Requirements. Final rule.

    PubMed

    2016-08-05

    This final rule will update the hospice wage index, payment rates, and cap amount for fiscal year (FY) 2017. In addition, this rule changes the hospice quality reporting program, including adopting new quality measures. Finally, this final rule includes information regarding the Medicare Care Choices Model (MCCM).

  8. Measuring, managing and maximizing refinery performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bascur, O.A.; Kennedy, J.P.

    1996-01-01

    Implementing continuous quality improvement is a confluence of total quality management, people empowerment, performance indicators and information engineering. Supporting information technologies allow a refiner to narrow the gap between management objectives and the process control level. Dynamic performance monitoring benefits come from production cost savings, improved communications and enhanced decision making. A refinery workgroup information flow model helps automate continuous improvement of processes, performance and the organization. The paper discusses the rethinking of refinery operations, dynamic performance monitoring, continuous process improvement, the knowledge coordinator and repository manager, an integrated plant operations workflow, and successful implementation.

  9. Integration of Air Quality & Exposure Models for Health Studies

    EPA Science Inventory

    The presentation describes a new community-scale tool called exposure model for individuals (EMI), which predicts five tiers of individual-level exposure metrics for ambient PM using outdoor concentrations, questionnaires, weather, and time-location information. In this modeling ...

  10. QUOTEchemo: a patient-centred instrument to measure quality of communication preceding chemotherapy treatment through the patient's eyes.

    PubMed

    van Weert, Julia C M; Jansen, Jesse; de Bruijn, Gert-Jan; Noordman, Janneke; van Dulmen, Sandra; Bensing, Jozien M

    2009-11-01

    Knowing patients' needs is a prerequisite to ensure high quality cancer care. This study describes the development and psychometric properties of a patient-centred instrument to measure needs and actual experiences with communication preceding chemotherapy treatment: QUOTE(chemo). QUOTE-questionnaires (Quality Of care Through the patients' Eyes) are widely used to gain insight into unmet needs, but no validated, standardised questionnaire combining patients' needs and experiences surrounding chemotherapy treatment was available yet. To evaluate the psychometric properties of the QUOTE(chemo), content validity, internal structure and convergent validity were investigated amongst 345 cancer patients, new to chemotherapy, from 10 different hospitals. A literature study, focus group discussions and a categorisation procedure of 67 relevant topics revealed seven main themes: Treatment-related information, Prognosis information, Rehabilitation information, Coping information, Interpersonal communication, Tailored communication and Affective communication. Confirmatory factor analysis using structural equation modelling indicated that the measurement model provided a good fit to the data, with factor loadings ranging from .43 to .77. The seven QUOTE(chemo) dimensions captured relevant issues of concern with good internal consistency (alpha .72-.92), satisfactory item-total correlations (.35-.79) and satisfactory convergent validity. Affective communication, Treatment-related information and Rehabilitation information were perceived as most important by patients. The instrument also appeared to be able to determine which aspects need improvement to ensure high quality care. The highest need for improvement was found for communicating Prognosis information and Rehabilitation information and for Interpersonal communication. These findings provide preliminary evidence of the reliability and validity of the QUOTE(chemo) for use in cancer care surrounding chemotherapy treatment. Researchers and health care providers can use the instrument to measure patients' needs and experiences with communication and to identify aspects that need improvement.

  11. Improving the Quality of Ability Estimates through Multidimensional Scoring and Incorporation of Ancillary Variables

    ERIC Educational Resources Information Center

    de la Torre, Jimmy

    2009-01-01

    For one reason or another, various sources of information, namely, ancillary variables and correlational structure of the latent abilities, which are usually available in most testing situations, are ignored in ability estimation. A general model that incorporates these sources of information is proposed in this article. The model has a general…

  12. Tourism guide cloud service quality: What actually delights customers?

    PubMed

    Lin, Shu-Ping; Yang, Chen-Lung; Pi, Han-Chung; Ho, Thao-Minh

    2016-01-01

    The emergence of advanced IT and cloud services has beneficially supported the information-intensive tourism industry, while simultaneously causing extreme competition in attracting customers through building efficient service platforms. In response, numerous nations have implemented cloud platforms to provide value-added sightseeing information and personal intelligent service experiences. Despite these efforts, customers' actual perspectives have not yet been sufficiently understood. To bridge the gap, this study investigates what aspects of tourism cloud services actually delight customers, driving satisfaction and loyalty. 336 valid survey questionnaire answers were analyzed using the structural equation modeling method. The results show positive impacts of function quality, enjoyment, multiple visual aids, and information quality on customers' satisfaction, as well as of enjoyment and satisfaction on use loyalty. The findings provide helpful references on customer use behaviors for enhancing cloud service quality in order to achieve better organizational competitiveness.

  13. Modeling crop residue burning experiments to evaluate smoke emissions and plume transport.

    PubMed

    Zhou, Luxi; Baker, Kirk R; Napelenok, Sergey L; Pouliot, George; Elleman, Robert; O'Neill, Susan M; Urbanski, Shawn P; Wong, David C

    2018-06-15

    Crop residue burning is a common land management practice that results in emissions of a variety of pollutants with negative health impacts. Modeling systems are used to estimate air quality impacts of crop residue burning to support retrospective regulatory assessments and also for forecasting purposes. Ground and airborne measurements from a recent field experiment in the Pacific Northwest focused on cropland residue burning were used to evaluate model performance in capturing surface and aloft impacts from the burning events. The Community Multiscale Air Quality (CMAQ) model was used to simulate multiple crop residue burns with 2 km grid spacing using field-specific information and also the more general assumptions traditionally used to support National Emission Inventory based assessments. Field-study-specific information, which includes area burned, fuel consumption, and combustion completeness, resulted in increased biomass consumption by 123 tons (a 60% increase) on average compared to consumption estimated with default methods in the National Emission Inventory (NEI) process. Buoyancy heat flux, a key parameter for model-predicted fire plume rise, estimated from fuel loading obtained from field measurements can be 30% to 200% more than when estimated using default field information. The increased buoyancy heat flux resulted in higher plume rise by 30% to 80%. This evaluation indicates that the regulatory air quality modeling system can replicate intensity and transport (horizontal and vertical) features of crop residue burning in this region when region-specific information is used to inform emissions and plume rise calculations. Further, the previous vertical emissions allocation treatment of putting all cropland residue burning in the surface layer does not compare well with the measured plume structure, and these types of burns should be modeled more similarly to prescribed fires, such that plume rise is based on an estimate of buoyancy. Copyright © 2018 Elsevier B.V. All rights reserved.
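The abstract ties plume rise to buoyancy heat flux. One widely used parameterization of that relationship, not necessarily the exact scheme applied in this study, is the two-regime Briggs final-rise formula found in regulatory plume-rise codes:

```python
def briggs_final_rise_m(buoyancy_flux, wind_speed):
    """Final buoyant plume rise (m) via the Briggs formula:
    buoyancy_flux F in m^4/s^3, wind speed in m/s. Treat this as a
    sketch of the F -> rise relationship, not the CMAQ implementation."""
    if buoyancy_flux >= 55.0:
        return 38.71 * buoyancy_flux ** 0.6 / wind_speed
    return 21.425 * buoyancy_flux ** 0.75 / wind_speed

# Raising the heat flux by 60% lifts the plume, consistent with the abstract
rise_low = briggs_final_rise_m(100.0, 4.0)
rise_high = briggs_final_rise_m(160.0, 4.0)
```

Because rise scales with a fractional power of F, a 30-200% increase in heat flux maps to the smaller 30-80% increase in plume rise the study reports.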

  14. Toward a Science of Cooperation.

    ERIC Educational Resources Information Center

    Newbern, Dianna; And Others

    Scripted cooperative learning and individual learning of descriptive information were compared in a 2-x-2 factorial design with 104 undergraduates. Influenced by models of individual learning and cognition, differences were assessed in (1) information acquisition and retrieval, (2) the quality and quantity of recalled information, and (3) the…

  15. Overcoming information asymmetry in consumer-directed health plans.

    PubMed

    Retchin, Sheldon M

    2007-04-01

    Consumer-centric healthcare has been extolled as the centerpiece of a new model for managing both quality and price. However, information asymmetry in consumer-directed health plans (CDHPs) is a challenge that must be addressed. For CDHPs to work as intended and to gain acceptance, consumers need information regarding the quality and price of healthcare purchases. The federal government, particularly the Agency for Healthcare Research and Quality, could function as an official resource for information on performance and comparisons among facilities and providers. Because of workforce constraints among primary care physicians, a new group of healthcare professionals called "medical decision advisors" could be trained. Academic health centers would have to play a critical role in devising an appropriate curriculum, as well as designing a certification and credentialing process. However, with appropriate curricula and training, medical decision advisors could furnish information for consumers and aid in the complicated decisions they will face under CDHPs.

  16. Benchmark Evaluation of Start-Up and Zero-Power Measurements at the High-Temperature Engineering Test Reactor

    DOE PAGES

    Bess, John D.; Fujimoto, Nozomu

    2014-10-09

    Benchmark models were developed to evaluate six cold-critical and two warm-critical, zero-power measurements of the HTTR. Additional measurements of a fully-loaded subcritical configuration, core excess reactivity, shutdown margins, six isothermal temperature coefficients, and axial reaction-rate distributions were also evaluated as acceptable benchmark experiments. Insufficient information is publicly available to develop finely-detailed models of the HTTR, as much of the design information is still proprietary. However, the uncertainties in the benchmark models are judged to be of sufficient magnitude to encompass any biases and bias uncertainties incurred through the simplification process used to develop the benchmark models. Dominant uncertainties in the experimental keff for all core configurations come from uncertainties in the impurity content of the various graphite blocks that comprise the HTTR. Monte Carlo calculations of keff are between approximately 0.9% and 2.7% greater than the benchmark values. Reevaluation of the HTTR models as additional information becomes available could improve the quality of this benchmark and possibly reduce the computational biases. High-quality characterization of graphite impurities would significantly improve the quality of the HTTR benchmark assessment. Simulations of the other reactor physics measurements are in good agreement with the benchmark experiment values. The complete benchmark evaluation details are available in the 2014 edition of the International Handbook of Evaluated Reactor Physics Benchmark Experiments.
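The quoted 0.9% to 2.7% computational biases are simple relative differences between calculated and benchmark keff. For concreteness (the values below are illustrative, not HTTR results):

```python
def keff_bias_percent(k_calc, k_benchmark):
    """Relative computational bias in percent: 100 * (calc - benchmark) / benchmark."""
    return 100.0 * (k_calc - k_benchmark) / k_benchmark

def keff_bias_pcm(k_calc, k_benchmark):
    """Absolute bias in pcm (1 pcm = 1e-5 in k)."""
    return (k_calc - k_benchmark) * 1e5

# Illustrative: a Monte Carlo keff of 1.0180 against a benchmark value of 1.0000
bias_pct = keff_bias_percent(1.0180, 1.0000)
```

A benchmark-quality comparison would also propagate the experimental uncertainty (here dominated by graphite impurities) alongside the bias itself.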

  17. An Integrated Decision Support System for Water Quality Management of Songhua River Basin

    NASA Astrophysics Data System (ADS)

    Zhang, Haiping; Yin, Qiuxiao; Chen, Ling

    2010-11-01

    In the Songhua River Basin of China, many water resource and water environment conflicts interact. A Decision Support System (DSS) for water quality management has been established for the Basin. The system is distinguished by the incorporation of a numerical water quality model system into a conventional water quality management system, which usually consists of a geographic information system (GIS), WebGIS technology, a database system and network technology. The model system is built on DHI MIKE software and comprises a basin rainfall-runoff module, a basin pollution load evaluation module, a river hydrodynamic module and a river water quality module. The DSS provides a friendly graphical user interface that enables the rapid and transparent calculation of various water quality management scenarios, as well as convenient access to, and interpretation of, the modeling results to assist decision-making.

  18. Spatio-temporal water quality mapping from satellite images using geographically and temporally weighted regression

    NASA Astrophysics Data System (ADS)

    Chu, Hone-Jay; Kong, Shish-Jeng; Chang, Chih-Hua

    2018-03-01

    The turbidity (TB) of a water body varies with time and space. Water quality is traditionally estimated via linear regression based on satellite images. However, estimating and mapping water quality require a spatio-temporally nonstationary model, so TB mapping calls for the geographically and temporally weighted regression (GTWR) and geographically weighted regression (GWR) models, both of which are more precise than linear regression. Among the nonstationary models for mapping water quality, GTWR offers the best option for estimating regional water quality. Compared with GWR, GTWR provides highly reliable information for water quality mapping, boasts a relatively high goodness of fit, improves the explained variance from 44% to 87%, and shows sufficient space-time explanatory power. The seasonal patterns of TB and the main spatial patterns of TB variability can be identified using the TB maps estimated from GTWR and by conducting an empirical orthogonal function (EOF) analysis.
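GTWR fits a separate weighted least-squares regression at each estimation point, down-weighting observations by spatio-temporal distance. A compact numpy sketch with a Gaussian kernel and synthetic data (the bandwidths, coordinates, and band-ratio predictor are all assumptions for illustration):

```python
import numpy as np

def gtwr_local_fit(X, y, coords, times, target_xy, target_t, hs, ht):
    """One local regression: Gaussian weights from scaled spatial and
    temporal distances, then weighted normal equations."""
    d2 = ((coords - target_xy) ** 2).sum(axis=1) / hs ** 2 \
         + (times - target_t) ** 2 / ht ** 2
    w = np.exp(-0.5 * d2)
    Xw = X * w[:, None]
    return np.linalg.solve(Xw.T @ X, Xw.T @ y)

rng = np.random.default_rng(0)
coords = rng.uniform(0.0, 10.0, size=(50, 2))  # synthetic pixel locations
times = rng.uniform(0.0, 12.0, size=50)        # synthetic image dates (months)
X = np.column_stack([np.ones(50), rng.uniform(0.0, 1.0, 50)])  # intercept + band ratio
y = X @ np.array([2.0, 3.0]) + 0.01 * rng.standard_normal(50)  # "turbidity"
beta = gtwr_local_fit(X, y, coords, times, np.array([5.0, 5.0]), 6.0, hs=3.0, ht=4.0)
```

GWR is the special case with no temporal term; shrinking `ht` makes the fit increasingly local in time, which is what lets GTWR capture the seasonal TB patterns.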

  19. How do you perceive this author? Understanding and modeling authors’ communication quality in social media

    PubMed Central

    Han, Kyungsik

    2018-01-01

    In this study, we leverage human evaluations, content analysis, and computational modeling to generate a comprehensive analysis of readers’ evaluations of authors’ communication quality in social media with respect to four factors: author credibility, interpersonal attraction, communication competence, and intent to interact. We review previous research on the human evaluation process and highlight its limitations in providing sufficient information for readers to assess authors’ communication quality. From our analysis of the evaluations of 1,000 Twitter authors’ communication quality from 300 human evaluators, we provide empirical evidence of the impact of the characteristics of the reader (demographic, social media experience, and personality), author (profile and social media engagement), and content (linguistic, syntactic, similarity, and sentiment) on the evaluation of an author’s communication quality. In addition, based on the author and message characteristics, we demonstrate the potential for building accurate models that can indicate an author’s communication quality. PMID:29389979

  1. Applying the Practical Inquiry Model to Investigate the Quality of Students' Online Discourse in an Information Ethics Course Based on Bloom's Teaching Goal and Bird's 3C Model

    ERIC Educational Resources Information Center

    Liu, Chien-Jen; Yang, Shu Ching

    2012-01-01

    The goal of this study is to better understand how the study participants' cognitive discourse is displayed in their learning transaction in an asynchronous, text-based conferencing environment based on Garrison's Practical Inquiry Model (2001). The authors designed an online information ethics course based on Bloom's taxonomy of educational…

  2. An enhanced PM 2.5 air quality forecast model based on nonlinear regression and back-trajectory concentrations

    NASA Astrophysics Data System (ADS)

    Cobourn, W. Geoffrey

    2010-08-01

    An enhanced PM 2.5 air quality forecast model based on nonlinear regression (NLR) and back-trajectory concentrations has been developed for use in the Louisville, Kentucky metropolitan area. The PM 2.5 air quality forecast model is designed for use in the warm season, from May through September, when PM 2.5 air quality is more likely to be critical for human health. The enhanced PM 2.5 model consists of a basic NLR model, developed for use with an automated air quality forecast system, and an additional parameter based on upwind PM 2.5 concentration, called PM24. The PM24 parameter is designed to be determined manually, by synthesizing backward air trajectory and regional air quality information to compute 24-h back-trajectory concentrations. The PM24 parameter may be used by air quality forecasters to adjust the forecast provided by the automated forecast system. In this study of the 2007 and 2008 forecast seasons, the enhanced model performed well using forecasted meteorological data and PM24 as input. The enhanced PM 2.5 model was compared with three alternative models, including the basic NLR model, the basic NLR model with a persistence parameter added, and the NLR model with persistence and PM24. The two models that included PM24 were of comparable accuracy. The two models incorporating back-trajectory concentrations had lower mean absolute errors and higher rates of detecting unhealthy PM2.5 concentrations compared to the other models.
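Model comparisons like the one described rest on simple error metrics such as mean absolute error over the forecast season. A sketch with hypothetical daily PM2.5 values (none of these numbers come from the Louisville study):

```python
def mean_absolute_error(pred, obs):
    """Mean absolute forecast error over paired predictions/observations."""
    return sum(abs(p - o) for p, o in zip(pred, obs)) / len(obs)

# Hypothetical observed and forecast daily PM2.5 concentrations (ug/m3)
obs      = [12.0, 35.2, 18.4, 40.1, 22.3]
basic    = [15.0, 28.0, 21.0, 31.5, 25.0]   # NLR only
enhanced = [13.1, 33.0, 19.5, 37.8, 23.4]   # NLR + back-trajectory PM24
mae_basic = mean_absolute_error(basic, obs)
mae_enhanced = mean_absolute_error(enhanced, obs)
```

Alongside MAE, the study also scores detection of unhealthy days, i.e. how often forecast and observation both exceed a health threshold.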

  3. Healthy participants in phase I clinical trials: the quality of their decision to take part.

    PubMed

    Rabin, Cheryl; Tabak, Nili

    2006-08-01

    This study set out to test the quality of the decision-making process of healthy volunteers in clinical trials. Researchers fear that the decision to volunteer for clinical trials is taken inadequately and that the signature on the consent forms, meant to affirm that consent was 'informed', is actually insubstantial. The study design was quasi-experimental, using a convenience quota sample. Over a period of a year, candidates were approached during their screening process for a proposed clinical trial, after concluding the required 'informed consent' procedure. In all, 100 participants in phase I trials filled out questionnaires, based ultimately on the Janis and Mann model of vigilant information processing, during their stay in the research centre. Only 35% of the participants reached a 'quality decision'. There is a definite correlation between information processing and quality decision-making. However, many of the healthy research volunteers (58%) do not seek out information or check alternatives before making a decision. Full disclosure is essential to a valid informed consent procedure but not sufficient; emphasis must be put on having the information understood and assimilated. Research nurses play a central role in achieving this objective.

  4. Web portal on environmental sciences "ATMOS''

    NASA Astrophysics Data System (ADS)

    Gordov, E. P.; Lykosov, V. N.; Fazliev, A. Z.

    2006-06-01

    The web portal ATMOS (http://atmos.iao.ru and http://atmos.scert.ru), developed under an INTAS grant, makes available to the international research community, environmental managers, and the interested public a bilingual information source for the domain of atmospheric physics and chemistry and the related application domain of air quality assessment and management. It offers access to integrated thematic information, experimental data, analytical tools and models, case studies, and related information and educational resources compiled, structured, and edited by the partners into a coherent and consistent thematic information resource. While offering the usual components of a thematic site such as link collections, user group registration, a discussion forum, a news section, etc., the site is distinguished by its scientific information services and tools: on-line models and analytical tools, and data collections and case studies together with tutorial material. The portal is organized as a set of interrelated scientific sites that address basic branches of the atmospheric sciences and climate modeling as well as the applied domains of air quality assessment and management, modeling, and environmental impact assessment. Each scientific site is an information-computational system open for external access, realized by means of Internet technologies. The main basic science topics are devoted to atmospheric chemistry, atmospheric spectroscopy and radiation, atmospheric aerosols, and atmospheric dynamics and atmospheric models, including climate models. The portal ATMOS reflects the current transformation of the environmental sciences into exact (quantitative) sciences and is an effective example of the integration of modern information technologies and the environmental sciences. This makes the portal both an auxiliary instrument to support interdisciplinary projects on regional environments and an extensive educational resource in this important domain.

  5. Mediation effects of medication information processing and adherence on association between health literacy and quality of life.

    PubMed

    Song, Sunmi; Lee, Seung-Mi; Jang, Sunmee; Lee, Yoon Jin; Kim, Na-Hyun; Sohn, Hye-Ryoung; Suh, Dong-Churl

    2017-09-16

    To examine whether medication-related information processing (defined as reading over-the-counter drug labels, understanding prescription instructions, and information seeking) and medication adherence account for the association between health literacy and quality of life, and whether these associations are moderated by age and gender. A sample of 305 adults in South Korea was recruited through proportional quota sampling to take part in a cross-sectional survey on health literacy, medication-related information processing, medication adherence, and quality of life. Descriptive statistics and structural equation modeling (SEM) were performed. Two mediation pathways linking health literacy with quality of life were found. First, health literacy was positively associated with reading drug labels, which was subsequently linked to medication adherence and quality of life. Second, health literacy was positively associated with accurate understanding of prescription instructions, which was associated with quality of life. Age moderation was found: the mediation by reading drug labels was significant only among young adults, whereas the mediation by understanding of prescription instructions was significant only among older adults. Reading drug labels and understanding prescription instructions explained the pathways by which health literacy affects medication adherence and quality of life. The results suggest that training skills for processing medication information can be effective in enhancing the health of those with limited health literacy.
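    The serial mediation pathway described above (health literacy to label reading to adherence to quality of life) can be sketched with a product-of-coefficients estimate on synthetic data. The variable names, effect sizes, and noise levels below are illustrative assumptions, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 305  # sample size mirroring the study

# Synthetic data along the pathway: literacy -> label reading -> adherence -> QoL
literacy = rng.normal(size=n)
reading = 0.5 * literacy + rng.normal(scale=0.8, size=n)
adherence = 0.4 * reading + rng.normal(scale=0.8, size=n)
qol = 0.3 * adherence + rng.normal(scale=0.8, size=n)

def slope(x, y):
    """OLS slope of y on x (with intercept)."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

# Product-of-coefficients estimate of the serial indirect effect
a = slope(literacy, reading)
b = slope(reading, adherence)
c = slope(adherence, qol)
indirect = a * b * c
print(f"indirect effect (a*b*c) = {indirect:.3f}")
```

    A full SEM additionally fits all paths simultaneously and bootstraps the indirect effect's confidence interval; the product shown here is only the point-estimate logic.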

  6. College quality and hourly wages: evidence from the self-revelation model, sibling models and instrumental variables.

    PubMed

    Borgen, Nicolai T

    2014-11-01

    This paper addresses the recent discussion on confounding in the returns to college quality literature using the Norwegian case. The main advantage of studying Norway is the quality of the data. Norwegian administrative data provide information on college applications, family relations and a rich set of control variables for all Norwegian citizens applying to college between 1997 and 2004 (N = 141,319) and their succeeding wages between 2003 and 2010 (676,079 person-year observations). With these data, this paper uses a subset of the models that have rendered mixed findings in the literature in order to investigate to what extent confounding biases the returns to college quality. I compare estimates obtained using standard regression models to estimates obtained using the self-revelation model of Dale and Krueger (2002), a sibling fixed effects model and the instrumental variable model used by Long (2008). Using these methods, I consistently find increasing returns to college quality over the course of students' work careers, with positive returns emerging only later in those careers. I conclude that the standard regression estimate provides a reasonable estimate of the returns to college quality. Copyright © 2014 Elsevier Inc. All rights reserved.

  7. Can high quality overcome consumer resistance to restricted provider access? Evidence from a health plan choice experiment.

    PubMed

    Harris, Katherine M

    2002-06-01

    To investigate the impact of quality information on the willingness of consumers to enroll in health plans that restrict provider access. A survey administered to respondents between the ages of 25 and 64 in the West Los Angeles area with private health insurance. An experimental approach is used to measure the effect of variation in provider network features and information about the quality of network physicians on hypothetical plan choices. Conditional logit models are used to analyze the experimental choice data. Next, choice model parameter estimates are used to simulate the impact of changes in plan features on the market shares of competing health plans and to calculate the quality level required to make consumers indifferent to changes in provider access. The presence of quality information reduced the importance of provider network features in plan choices as hypothesized. However, there were not statistically meaningful differences by type of quality measure (i.e., consumer assessed versus expert assessed). The results imply that large quality differences are required to make consumers indifferent to changes in provider access. The impact of quality on plan choices depended more on the particular measure and less on the type of measure. Quality ratings based on the proportion of survey respondents "extremely satisfied with results of care" had the greatest impact on plan choice while the proportion of network doctors "affiliated with university medical centers" had the least. Other consumer and expert assessed measures had more comparable effects. Overall the results provide empirical evidence that consumers are willing to trade high quality for restrictions on provider access. This willingness to trade implies that relatively small plans that place restrictions on provider access can successfully compete against less restrictive plans when they can demonstrate high quality. 
However, the results of this study suggest that in many cases, the level of quality required for consumers to accept access restrictions may be so high as to be unattainable. The results provide empirical support for the current focus of decision support efforts on consumer assessed quality measures. At the same time, however, the results suggest that consumers would also value quality measures based on expert assessments. This finding is relevant given the lack of comparative quality information based on expert judgment and research suggesting that consumers have apprehensions about their ability to meaningfully interpret performance-based quality measures.
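    The choice-simulation logic the study describes, conditional-logit choice probabilities plus the quality level that makes consumers indifferent to an access restriction, can be sketched as follows. The plan attributes and taste weights are hypothetical, not the study's estimated parameters.

```python
import numpy as np

# Hypothetical plan attributes: [provider-network breadth, quality rating], both on 0-1
plans = {
    "restricted-high-quality": np.array([0.3, 0.9]),
    "open-average-quality":    np.array([0.9, 0.5]),
}
beta = np.array([2.0, 3.0])  # assumed taste weights for access and quality

def shares(plans, beta):
    """Conditional-logit choice probabilities: softmax of linear utilities."""
    names = list(plans)
    u = np.array([plans[k] @ beta for k in names])
    p = np.exp(u - u.max())
    p /= p.sum()
    return dict(zip(names, p))

s = shares(plans, beta)
# Quality gap needed to offset the access loss: beta_access * d_access = beta_quality * d_quality
d_quality = beta[0] * (0.9 - 0.3) / beta[1]
print(s, f"quality gap offsetting access loss: {d_quality:.2f}")
```

    With these made-up weights the restricted plan's quality advantage (0.4) exactly offsets its access disadvantage, so the simulated market splits evenly; shifting either weight tips the shares.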

  8. Ground-water models for water resources planning

    USGS Publications Warehouse

    Moore, John E.

    1980-01-01

    In the past decade hydrologists have emphasized the development of computer-based mathematical models to aid in the understanding of flow, the transport of solutes, transport of heat, and deformation in the groundwater system. These models have been used to provide information and predictions for water managers. Too frequently, groundwater was neglected in water-resource planning because managers believed that it could not be adequately evaluated in terms of availability, quality, and effect of development on surface water supplies. Now, however, with newly developed digital groundwater models, effects of development can be predicted. Such models have been used to predict hydrologic and quality changes under different stresses. These models have grown in complexity over the last 10 years from simple one-layer flow models to three-dimensional simulations of groundwater flow which may include solute transport, heat transport, effects of land subsidence, and encroachment of salt water. This paper illustrates, through case histories, how predictive groundwater models have provided the information needed for the sound planning and management of water resources in the United States. (USGS)

  9. Measuring Quality in Special Libraries: Lessons from Service Marketing.

    ERIC Educational Resources Information Center

    White, Marilyn Domas; Abels, Eileen G.

    1995-01-01

    Surveys the service marketing literature for models and data-gathering instruments measuring service quality, particularly the instruments SERVQUAL and SERVPERF, and assesses their applicability to special libraries and information centers. Topics include service characteristics and definitions of service; performance-minus-expectations and…
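    The performance-minus-expectations logic behind SERVQUAL reduces to simple gap arithmetic per service dimension; the dimensions and ratings below are hypothetical.

```python
# Hypothetical SERVQUAL-style gap scores: perception minus expectation per dimension
expectations = {"reliability": 6.5, "responsiveness": 6.0, "empathy": 5.5}
perceptions  = {"reliability": 5.8, "responsiveness": 6.2, "empathy": 5.0}

gaps = {d: perceptions[d] - expectations[d] for d in expectations}
overall = sum(gaps.values()) / len(gaps)
print(gaps, f"overall gap: {overall:.2f}")
```

    A negative gap means the service underperforms expectations on that dimension; SERVPERF, by contrast, uses the perception scores alone.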

  10. Analysis of the Factors Affecting Consumer Acceptance of Accredited Online Health Information.

    PubMed

    Jo, Heui Sug; Song, Tae Min; Kim, Bong Gi

    2017-11-01

    With the increasing use of the internet and the spread of smartphones, health information seekers obtain considerable information through the internet. As the amount of online health information increases, the need for quality management of health information has been emphasized. The purpose of this study was to investigate the factors affecting the intention of using accredited online health information by applying the extended technology acceptance model (Extended-TAM). An online survey was conducted from September 15, 2016 to October 3, 2016, on 500 men and women aged 19-69 years. The results showed that the greatest factor influencing the acceptance of the accredited health information was perceived usefulness, and the expectation for the quality of the accreditation system was the most important mediator variable. In order to establish the health information accreditation system as a means to provide easy and useful information to the consumers, it is necessary to carry out quality management and promote the system through the continuous monitoring of the accreditation system. © 2017 The Korean Academy of Medical Sciences.

  11. Radiation Quality Effects on Transcriptome Profiles in 3-D Cultures After Charged Particle Irradiation

    NASA Technical Reports Server (NTRS)

    Patel, Zarana S.; Kidane, Yared H.; Huff, Janice L.

    2014-01-01

    In this work, we evaluated the differential effects of low- and high-LET radiation on 3-D organotypic cultures in order to investigate radiation quality impacts on gene expression and cellular responses. Current risk models for assessment of space radiation-induced cancer have large uncertainties because the models for adverse health effects following radiation exposure are founded on epidemiological analyses of human populations exposed to low-LET radiation. Reducing these uncertainties requires new knowledge on the fundamental differences in biological responses (the so-called radiation quality effects) triggered by heavy ion particle radiation versus low-LET radiation associated with Earth-based exposures. In order to better quantify these radiation quality effects in biological systems, we are utilizing novel 3-D organotypic human tissue models for space radiation research. These models hold promise for risk assessment as they provide a format for study of human cells within a realistic tissue framework, thereby bridging the gap between 2-D monolayer culture and animal models for risk extrapolation to humans. To identify biological pathway signatures unique to heavy ion particle exposure, functional gene set enrichment analysis (GSEA) was used with whole transcriptome profiling. GSEA has been used extensively as a method to garner biological information in a variety of model systems but has not been commonly used to analyze radiation effects. It is a powerful approach for assessing the functional significance of radiation quality-dependent changes from datasets where the changes are subtle but broad, and where single gene based analysis using rankings of fold-change may not reveal important biological information.
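    A minimal, unweighted version of the GSEA running-sum statistic mentioned above can be sketched as follows. The gene lists are toy data; the published GSEA method additionally weights hits by correlation and assesses significance by permutation.

```python
import numpy as np

def enrichment_score(ranked_genes, gene_set):
    """Minimal unweighted GSEA-style running-sum enrichment score (KS-like)."""
    in_set = np.array([g in gene_set for g in ranked_genes])
    n, n_hit = len(ranked_genes), in_set.sum()
    step_hit = 1.0 / n_hit          # increment at each set member
    step_miss = 1.0 / (n - n_hit)   # decrement otherwise
    running = np.cumsum(np.where(in_set, step_hit, -step_miss))
    return running[np.argmax(np.abs(running))]  # signed maximum deviation

ranked = ["g1", "g2", "g3", "g4", "g5", "g6"]       # most up- to most down-regulated
es_top = enrichment_score(ranked, {"g1", "g2"})     # set clustered at the top
es_spread = enrichment_score(ranked, {"g1", "g6"})  # set spread across the ranking
print(es_top, es_spread)
```

    A set concentrated at one end of the ranking yields a large score even when no single gene passes a fold-change cutoff, which is exactly the "subtle but broad" situation the abstract describes.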

  12. Information system of quality assessment for liquid and gaseous medium production

    NASA Astrophysics Data System (ADS)

    Bobrov, V. N.; Us, N. A.; Davidov, I. S.

    2018-05-01

    A method and a technical solution for controlling the quality of production of liquid and gaseous media is proposed. It is also proposed to monitor harmful factors in production while ensuring safe working conditions. Initially, using the mathematical model of an ideal atmosphere, the projection to the horizontal surface of the observation trajectory is calculated. At the second stage, the horizontal projection of the observation trajectory in real conditions is measured. The quality of the medium is judged by the difference between the projections of observation trajectories. The technical result is presented in the form of a device allowing obtaining information about the quality of the medium under investigation.

  13. Recent Enhancements to the Community Multiscale Air Quality Model (CMAQ)

    EPA Science Inventory

    This presentation overviews recent updates to the CMAQ modeling system. The presentation will be given as part of the information exchange session on Regional Air Pollution Modeling at the UK-US Collaboration Meeting on Air Pollution Exposure Science.

  14. [Fast Detection of Camellia Sinensis Growth Process and Tea Quality Informations with Spectral Technology: A Review].

    PubMed

    Peng, Ji-yu; Song, Xing-lin; Liu, Fei; Bao, Yi-dan; He, Yong

    2016-03-01

    The research achievements and trends of spectral technology for fast detection of Camellia sinensis growth-process information and tea quality information are reviewed. Spectral technology is a fast, nondestructive, and efficient detection technology that mainly comprises infrared spectroscopy, fluorescence spectroscopy, Raman spectroscopy, and mass spectrometry. Rapid detection of Camellia sinensis growth-process information and tea quality helps realize the informatization and automation of tea production and ensures tea quality and safety. This paper reviews applications including the detection of tea (Camellia sinensis) growing status (nitrogen, chlorophyll, diseases, and insect pests), the discrimination of tea varieties, the grade discrimination of tea, the detection of tea internal quality (catechins, total polyphenols, caffeine, amino acids, pesticide residues, and so on), the quality evaluation of tea beverages and tea by-products, and machinery for tea quality determination and discrimination. It briefly introduces trends in the determination of tea growth-process information, sensors, and industrial application. In conclusion, spectral technology shows high potential to detect Camellia sinensis growth-process information, to predict tea internal quality, and to classify tea varieties and grades. Suitable chemometric and preprocessing methods help improve model performance and remove redundancy, which makes portable instruments feasible. Future work on portable instruments and on-line detection systems is recommended to advance practical application. This paper is the first to outline the applications and research achievements of spectral technology concerning tea, covering Camellia sinensis growth, tea production, the quality and safety of tea and its by-products, together with problems to be solved and future applicability in the modern tea industry.
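    As one concrete example of the preprocessing the review recommends, the standard normal variate (SNV) correction, a common scatter correction applied to NIR spectra before chemometric modelling, can be sketched as follows; the toy spectra are illustrative.

```python
import numpy as np

def snv(spectra):
    """Standard normal variate: center and scale each spectrum (row) individually,
    removing multiplicative scatter and baseline offsets before modelling."""
    spectra = np.asarray(spectra, dtype=float)
    mean = spectra.mean(axis=1, keepdims=True)
    std = spectra.std(axis=1, keepdims=True)
    return (spectra - mean) / std

raw = np.array([[1.0, 2.0, 3.0, 4.0],
                [10.0, 20.0, 30.0, 40.0]])  # two toy spectra differing only in scale
corrected = snv(raw)
print(corrected)
```

    After SNV the two spectra, which differed only by a multiplicative scatter factor, become identical, so a downstream regression model sees the chemical signal rather than the scatter.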

  15. Continual improvement: A bibliography with indexes, 1992-1993

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This bibliography lists 606 references to reports and journal articles entered into the NASA Scientific and Technical Information Database during 1992 to 1993. Topics cover the philosophy and history of Continual Improvement (CI), basic approaches and strategies for implementation, and lessons learned from public and private sector models. Entries are arranged according to the following categories: Leadership for Quality, Information and Analysis, Strategic Planning for CI, Human Resources Utilization, Management of Process Quality, Supplier Quality, Assessing Results, Customer Focus and Satisfaction, TQM Tools and Philosophies, and Applications. Indexes include subject, personal author, corporate source, contract number, report number, and accession number.

  16. Hanging Together To Avoid Hanging Separately: Opportunities for Academic Libraries and Consortia.

    ERIC Educational Resources Information Center

    Allen, Barbara McFadden; Hirshon, Arnold

    1998-01-01

    Discusses academic library consortia, examines types of consortia, and presents three case histories (OhioLINK, PALCI and CIC). Highlights include economic competition; changes in information access and delivery; growth of information technology; quality improvement; and future strategies, including pricing models for electronic information,…

  17. Integrating prior information into microwave tomography Part 1: Impact of detail on image quality.

    PubMed

    Kurrant, Douglas; Baran, Anastasia; LoVetri, Joe; Fear, Elise

    2017-12-01

    The authors investigate the impact that incremental increases in the level of detail of patient-specific prior information have on image quality and the convergence behavior of an inversion algorithm in the context of near-field microwave breast imaging. A methodology is presented that uses image quality measures to characterize the ability of the algorithm to reconstruct both internal structures and lesions embedded in fibroglandular tissue. The approach permits key aspects that impact the quality of reconstruction of these structures to be identified and quantified. This provides insight into opportunities to improve image reconstruction performance. Patient-specific information is acquired using radar-based methods that form a regional map of the breast. This map is then incorporated into a microwave tomography algorithm. Previous investigations have demonstrated the effectiveness of this approach to improve image quality when applied to data generated with two-dimensional (2D) numerical models. The present study extends this work by generating prior information that is customized to vary the degree of structural detail to facilitate the investigation of the role of prior information in image formation. Numerical 2D breast models constructed from magnetic resonance (MR) scans, and reconstructions formed with a three-dimensional (3D) numerical breast model are used to assess if trends observed for the 2D results can be extended to 3D scenarios. For the blind reconstruction scenario (i.e., no prior information), the breast surface is not accurately identified and internal structures are not clearly resolved. A substantial improvement in image quality is achieved by incorporating the skin surface map and constraining the imaging domain to the breast. Internal features within the breast appear in the reconstructed image. 
However, it is challenging to discriminate between adipose and glandular regions and there are inaccuracies in both the structural properties of the glandular region and the dielectric properties reconstructed within this structure. Using a regional map with a skin layer only marginally improves this situation. Increasing the structural detail in the prior information to include internal features leads to reconstructions for which the interface that delineates the fat and gland regions can be inferred. Different features within the glandular region corresponding to tissues with varying relative permittivity values, such as a lesion embedded within glandular structure, emerge in the reconstructed images. Including knowledge of the breast surface and skin layer leads to a substantial improvement in image quality compared to the blind case, but the images have limited diagnostic utility for applications such as tumor response tracking. The diagnostic utility of the reconstruction technique is improved considerably when patient-specific structural information is used. This qualitative observation is supported quantitatively with image metrics. © 2017 American Association of Physicists in Medicine.

  18. Evidence synthesis for medical decision making and the appropriate use of quality scores.

    PubMed

    Doi, Suhail A R

    2014-09-01

    Meta-analyses today continue to be run using conventional random-effects models that ignore tangible information about the studies involved, such as their quality, despite the expectation that better-quality studies yield more valid results. Previous research has suggested that quality scores derived from such quality appraisals are unlikely to be useful in meta-analysis because they would produce biased estimates of effects that are unlikely to be offset by a variance reduction within the studied models. However, previous discussions took place in a context where such scores were viewed in terms of their ability to maximize their association with both the magnitude and direction of bias. In this review, another look is taken at this concept, this time asserting that probabilistic bias quantification is neither possible nor required of quality scores when they are used in meta-analysis to redistribute weights. The use of such a model is contrasted with the conventional random-effects model of meta-analysis to demonstrate why the latter is inadequate in the face of a properly specified quality-score weighting method. © 2014 Marshfield Clinic.
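    The weight-redistribution idea can be illustrated with a deliberately simplified sketch (not Doi's full quality-effects estimator): multiply each study's inverse-variance weight by a 0-1 quality score before pooling. All numbers below are hypothetical.

```python
# Simplified quality-weighted pooling sketch. Effect sizes, variances, and
# quality scores are made-up illustrative values.
effects   = [0.30, 0.10, 0.50]   # hypothetical study effect sizes
variances = [0.02, 0.01, 0.05]
quality   = [1.0, 0.4, 0.8]      # hypothetical quality scores in [0, 1]

w = [q / v for q, v in zip(quality, variances)]
pooled = sum(wi * e for wi, e in zip(w, effects)) / sum(w)

# Conventional fixed-effect (inverse-variance) pooling for contrast
w_iv = [1 / v for v in variances]
pooled_iv = sum(wi * e for wi, e in zip(w_iv, effects)) / sum(w_iv)
print(f"quality-weighted: {pooled:.3f}  inverse-variance only: {pooled_iv:.3f}")
```

    Down-weighting the low-quality middle study shifts the pooled estimate toward the higher-quality studies without any attempt to quantify the direction of its bias, which is the point the review makes.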

  19. A decision support system for delivering optimal quality peach and tomato

    NASA Technical Reports Server (NTRS)

    Thai, C. N.; Pease, J. N.; Shewfelt, R. L.

    1990-01-01

    Several studies have indicated that color and firmness are the two quality attributes most important to consumers in making purchasing decisions of fresh peaches and tomatoes. However, at present, retail produce managers do not have the proper information for handling fresh produce so it has the most appealing color and firmness when it reaches the consumer. This information should help them predict the consumer color and firmness perception and preference for produce from various storage conditions. Since 1987, for 'Redglobe' peach and 'Sunny' tomato, we have been generating information about their physical quality attributes (firmness and color) and their corresponding consumer sensory scores. This article reports on our current progress toward the goal of integrating such information into a model-based decision support system for retail level managers in handling fresh peaches and tomatoes.

  20. Comparing the information conveyed by envelope modulation for speech intelligibility, speech quality, and music quality.

    PubMed

    Kates, James M; Arehart, Kathryn H

    2015-10-01

    This paper uses mutual information to quantify the relationship between envelope modulation fidelity and perceptual responses. Data from several previous experiments that measured speech intelligibility, speech quality, and music quality are evaluated for normal-hearing and hearing-impaired listeners. A model of the auditory periphery is used to generate envelope signals, and envelope modulation fidelity is calculated using the normalized cross-covariance of the degraded signal envelope with that of a reference signal. Two procedures are used to describe the envelope modulation: (1) modulation within each auditory frequency band and (2) spectro-temporal processing that analyzes the modulation of spectral ripple components fit to successive short-time spectra. The results indicate that low modulation rates provide the highest information for intelligibility, while high modulation rates provide the highest information for speech and music quality. The low-to-mid auditory frequencies are most important for intelligibility, while mid frequencies are most important for speech quality and high frequencies are most important for music quality. Differences between the spectral ripple components used for the spectro-temporal analysis were not significant in five of the six experimental conditions evaluated. The results indicate that different modulation-rate and auditory-frequency weights may be appropriate for indices designed to predict different types of perceptual relationships.
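    The envelope-fidelity measure described above, the normalized cross-covariance between a reference and a degraded envelope, can be sketched directly; the 4 Hz test envelope and noise level are illustrative.

```python
import numpy as np

def envelope_fidelity(reference, degraded):
    """Normalized cross-covariance between a reference and a degraded
    envelope signal (1.0 = perfect modulation fidelity)."""
    r = reference - reference.mean()
    d = degraded - degraded.mean()
    return float(r @ d / np.sqrt((r @ r) * (d @ d)))

t = np.linspace(0, 1, 1000)
ref = np.sin(2 * np.pi * 4 * t)                        # 4 Hz envelope modulation
noisy = ref + 0.5 * np.random.default_rng(1).normal(size=t.size)
print(envelope_fidelity(ref, ref), envelope_fidelity(ref, noisy))
```

    In the paper this statistic is computed per auditory band on envelopes from a peripheral model; the sketch shows only the core covariance normalization.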

  1. Comparing the information conveyed by envelope modulation for speech intelligibility, speech quality, and music quality

    PubMed Central

    Kates, James M.; Arehart, Kathryn H.

    2015-01-01

    This paper uses mutual information to quantify the relationship between envelope modulation fidelity and perceptual responses. Data from several previous experiments that measured speech intelligibility, speech quality, and music quality are evaluated for normal-hearing and hearing-impaired listeners. A model of the auditory periphery is used to generate envelope signals, and envelope modulation fidelity is calculated using the normalized cross-covariance of the degraded signal envelope with that of a reference signal. Two procedures are used to describe the envelope modulation: (1) modulation within each auditory frequency band and (2) spectro-temporal processing that analyzes the modulation of spectral ripple components fit to successive short-time spectra. The results indicate that low modulation rates provide the highest information for intelligibility, while high modulation rates provide the highest information for speech and music quality. The low-to-mid auditory frequencies are most important for intelligibility, while mid frequencies are most important for speech quality and high frequencies are most important for music quality. Differences between the spectral ripple components used for the spectro-temporal analysis were not significant in five of the six experimental conditions evaluated. The results indicate that different modulation-rate and auditory-frequency weights may be appropriate for indices designed to predict different types of perceptual relationships. PMID:26520329

  2. Correction of stream quality trends for the effects of laboratory measurement bias

    USGS Publications Warehouse

    Alexander, Richard B.; Smith, Richard A.; Schwarz, Gregory E.

    1993-01-01

    We present a statistical model relating measurements of water quality to associated errors in laboratory methods. Estimation of the model allows us to correct trends in water quality for long-term and short-term variations in laboratory measurement errors. An illustration of the bias correction method for a large national set of stream water quality and quality assurance data shows that reductions in the bias of estimates of water quality trend slopes are achieved at the expense of increases in the variance of these estimates. Slight improvements occur in the precision of estimates of trend in bias by using correlative information on bias and water quality to estimate random variations in measurement bias. The results of this investigation stress the need for reliable, long-term quality assurance data and efficient statistical methods to assess the effects of measurement errors on the detection of water quality trends.
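    The core idea, removing an estimated laboratory bias before fitting a trend, can be sketched on synthetic data. The step-change bias and the assumption that quality-assurance data recover it exactly are illustrative simplifications of the paper's statistical model.

```python
import numpy as np

years = np.arange(1980, 1990, dtype=float)
true_trend = 0.1 * (years - years[0])                 # rising concentration
lab_bias = np.where(years < 1985, 0.5, -0.2)          # hypothetical method change in 1985
measured = 2.0 + true_trend + lab_bias

# QA data: repeated measurements of a reference standard reveal the bias
qa_bias_estimate = lab_bias                            # assume QA recovers it exactly

def slope(x, y):
    """OLS trend slope of y on x (with intercept)."""
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

naive = slope(years, measured)
corrected = slope(years, measured - qa_bias_estimate)
print(f"naive slope: {naive:.3f}  bias-corrected slope: {corrected:.3f}")
```

    The downward step in laboratory bias nearly cancels the true upward trend in the naive fit; subtracting the QA-estimated bias recovers it, at the cost (noted in the paper) of added variance when the bias itself must be estimated from noisy QA data.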

  3. Urban-rural variations in air quality and health impacts in northern India

    NASA Astrophysics Data System (ADS)

    Karambelas, A. N.; Holloway, T.; Fiore, A. M.; Kinney, P.; DeFries, R. S.; Kiesewetter, G.; Heyes, C.

    2017-12-01

    Ambient air pollution in India is a severe problem, contributing to negative health impacts and early death. Ground-based monitors, often used to quantify health impacts, are typically located in urban regions; however, approximately 70% of India's population resides in rural areas. We use high-resolution concentrations from the regional Community Multi-scale Air Quality (CMAQ) model over densely-populated northern India to estimate air quality and health impacts due to anthropogenic emission sectors separately for urban and rural regions. Modeled concentrations inform relative risk calculations and exposure estimates as performed in the Global Burden of Disease. Anthropogenic emissions from the International Institute for Applied Systems Analysis (IIASA) Greenhouse Gas-Air Pollution Interactions and Synergies (GAINS) model following version 5a of the Evaluating the Climate and Air Quality Impacts of Short-Lived Pollutants project gridding structure are updated to reflect urban- and rural-specific activity information for transportation and residential combustion, and industrial and electrical generating unit location and magnitude information. We estimate 314,000 (95% Confidence Interval: 304,000—323,000) and 58,000 (CI: 39,000—70,000) adults (25 years or older) die prematurely each year from PM2.5 and O3, respectively, in northern India, with the greatest impacts along the Indo-Gangetic Plain. Using urban and rural population distributions, we estimate that the majority of premature deaths resulting from PM2.5 and O3 are in rural (292,000) as opposed to urban (79,000) regions. These findings indicate the need for designing monitoring networks and ground-based health studies in rural areas of India to more accurately quantify the true health implications of ambient air pollution, in addition to supporting model evaluation. 
Using this urban-versus-rural emissions framework, we are assessing anthropogenic contributions to regional air quality and health impacts, and examining mitigation strategies to reduce anthropogenic emissions, improve air quality, and reduce PM2.5 and O3 attributable premature death in the near-term.
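    The burden calculation described, relative risk informing exposure-attributable mortality, follows the population-attributable-fraction form used in Global Burden of Disease assessments. The population, baseline rate, and relative risk below are illustrative assumptions, not the study's inputs.

```python
# Attributable deaths = population * baseline mortality rate * (RR - 1) / RR,
# the population-attributable-fraction (PAF) form. All inputs are hypothetical.
def attributable_deaths(adult_population, baseline_mortality_rate, relative_risk):
    paf = (relative_risk - 1.0) / relative_risk  # population attributable fraction
    return adult_population * baseline_mortality_rate * paf

deaths = attributable_deaths(adult_population=300e6,
                             baseline_mortality_rate=0.008,
                             relative_risk=1.15)
print(f"{deaths:,.0f} premature deaths/year attributable")
```

    In practice the relative risk varies with the modeled concentration in each grid cell, so the calculation is applied per cell and summed over the urban and rural population distributions separately.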

  4. A FAST BAYESIAN METHOD FOR UPDATING AND FORECASTING HOURLY OZONE LEVELS

    EPA Science Inventory

    A Bayesian hierarchical space-time model is proposed by combining information from real-time ambient AIRNow air monitoring data, and output from a computer simulation model known as the Community Multi-scale Air Quality (Eta-CMAQ) forecast model. A model validation analysis shows...

  5. Application of Wavelet Filters in an Evaluation of Photochemical Model Performance

    EPA Science Inventory

    Air quality model evaluation can be enhanced with time-scale specific comparisons of outputs and observations. For example, high-frequency (hours to one day) time scale information in observed ozone is not well captured by deterministic models and its incorporation into model pe...

  6. QMEANclust: estimation of protein model quality by combining a composite scoring function with structural density information.

    PubMed

    Benkert, Pascal; Schwede, Torsten; Tosatto, Silvio Ce

    2009-05-20

    The selection of the most accurate protein model from a set of alternatives is a crucial step in protein structure prediction, both in template-based and ab initio approaches. Scoring functions have been developed which can either return a quality estimate for a single model or derive a score from the information contained in the ensemble of models for a given sequence. Local structural features occurring more frequently in the ensemble have a greater probability of being correct. Within the context of the CASP experiment, these so-called consensus methods have been shown to perform considerably better in selecting good candidate models, but tend to fail if the best models are far from the dominant structural cluster. In this paper we show that model selection can be improved if both approaches are combined by pre-filtering the models used during the calculation of the structural consensus. Our recently published QMEAN composite scoring function has been improved by including an all-atom interaction potential term. The preliminary model ranking based on the new QMEAN score is used to select a subset of reliable models against which the structural consensus score is calculated. This scoring function, called QMEANclust, achieves a correlation coefficient between predicted quality score and GDT_TS of 0.9 averaged over the 98 CASP7 targets and performs significantly better in selecting good models from the ensemble of server models than any other group participating in the quality estimation category of CASP7. Both scoring functions are also benchmarked on the MOULDER test set consisting of 20 target proteins, each with 300 alternative models generated by MODELLER. QMEAN outperforms all other tested scoring functions operating on individual models, while the consensus method QMEANclust only works properly on decoy sets containing a certain fraction of near-native conformations. 
We also present a local version of QMEAN for the per-residue estimation of model quality (QMEANlocal) and compare it to a new local consensus-based approach. Improved model selection is obtained by using a composite scoring function operating on single models in order to enrich higher quality models which are subsequently used to calculate the structural consensus. The performance of consensus-based methods such as QMEANclust highly depends on the composition and quality of the model ensemble to be analysed. Therefore, performance estimates for consensus methods based on large meta-datasets (e.g. CASP) might overrate their applicability in more realistic modelling situations with smaller sets of models based on individual methods.
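    The pre-filtered consensus idea can be sketched as follows: rank models by a single-model score, keep the top fraction as a reference subset, then score every model by its mean similarity to that subset. The similarity matrix and scores are toy values; the real QMEANclust method uses structural superposition measures such as GDT_TS.

```python
import numpy as np

def consensus_scores(similarity, single_scores, keep_frac=0.5):
    """Pre-filtered consensus sketch: rank models by a single-model score,
    keep the top fraction as a reference subset, then score every model by
    its mean structural similarity to that subset."""
    n = len(single_scores)
    k = max(1, int(n * keep_frac))
    reference = np.argsort(single_scores)[::-1][:k]   # indices of best-scored models
    return similarity[:, reference].mean(axis=1)

# Toy pairwise similarity matrix for 4 models (1.0 = identical structures)
sim = np.array([[1.0, 0.8, 0.7, 0.2],
                [0.8, 1.0, 0.6, 0.3],
                [0.7, 0.6, 1.0, 0.1],
                [0.2, 0.3, 0.1, 1.0]])
single = np.array([0.9, 0.8, 0.5, 0.4])  # hypothetical single-model quality scores
print(consensus_scores(sim, single))
```

    The structurally isolated fourth model is penalized even though plain consensus over all models would dilute that signal, illustrating why pre-filtering by a single-model score helps when decoy sets are heterogeneous.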

  7. [Quality by design approaches for pharmaceutical development and manufacturing of Chinese medicine].

    PubMed

    Xu, Bing; Shi, Xin-Yuan; Wu, Zhi-Sheng; Zhang, Yan-Ling; Wang, Yun; Qiao, Yan-Jiang

    2017-03-01

    Pharmaceutical quality is built in by design, formed in the manufacturing process and improved over the product's lifecycle. Based on a comprehensive literature review of pharmaceutical quality by design (QbD), the essential ideas and implementation strategies of pharmaceutical QbD are interpreted. Considering the complex nature of Chinese medicine, the "4H" model is proposed for implementing QbD in the pharmaceutical development and industrial manufacture of Chinese medicine products. "4H" is an acronym for holistic design, holistic information analysis, holistic quality control, and holistic process optimization, consistent with the holistic concept of Chinese medicine theory. Holistic design aims at constructing both the quality problem space from patient requirements and the quality solution space from multidisciplinary knowledge. Holistic information analysis emphasizes understanding the quality pattern of Chinese medicine by integrating and mining multisource data and information at a relatively high level. Batch-to-batch quality consistency and manufacturing system reliability can be realized by the comprehensive application of inspective, statistical, predictive and intelligent quality control strategies. Holistic process optimization improves product quality and process capability during product lifecycle management. The implementation of QbD helps to eliminate the ecosystem contradictions in the pharmaceutical development and manufacturing process of Chinese medicine products, and helps guarantee cost-effectiveness. Copyright© by the Chinese Pharmaceutical Association.

  8. DISCOVER-AQ SJV Surface Measurements and Initial Comparisons with Photochemical Model Simulations

    EPA Science Inventory

    NASA’s DISCOVER-AQ (Deriving Information on Surface Conditions from Column and Vertically Resolved Observations Relevant to Air Quality) campaign studied the air quality throughout California’s San Joaquin Valley (SJV) during January and February of 2013. The SJV is a...

  9. URBAN SPRAWL MODELING, AIR QUALITY MONITORING AND RISK COMMUNICATION: THE NORTHEAST OHIO PROJECT

    EPA Science Inventory

    The Northeast Ohio Urban Sprawl, Air Quality Monitoring, and Communications Project (hereafter called the Northeast Ohio Project) provides local environmental and health information useful to residents, local officials, community planners, and others in a 15 county region in the ...

  10. Thermal modelling using discrete vasculature for thermal therapy: a review

    PubMed Central

    Kok, H.P.; Gellermann, J.; van den Berg, C.A.T.; Stauffer, P.R.; Hand, J.W.; Crezee, J.

    2013-01-01

    Reliable temperature information during clinical hyperthermia and thermal ablation is essential for adequate treatment control, but conventional temperature measurements do not provide 3D temperature information. Treatment planning is a very useful tool to improve treatment quality and substantial progress has been made over the last decade. Thermal modelling is a very important and challenging aspect of hyperthermia treatment planning. Various thermal models have been developed for this purpose, with varying complexity. Since blood perfusion is such an important factor in thermal redistribution of energy in in vivo tissue, thermal simulations are most accurately performed by modelling discrete vasculature. This review describes the progress in thermal modelling with discrete vasculature for the purpose of hyperthermia treatment planning and thermal ablation. There has been significant progress in thermal modelling with discrete vasculature. Recent developments have made real-time simulations possible, which can provide feedback during treatment for improved therapy. Future clinical application of thermal modelling with discrete vasculature in hyperthermia treatment planning is expected to further improve treatment quality. PMID:23738700

  11. Using technology to enhance the quality of home health care: three case studies of health information technology initiatives at the visiting nurse service of New York.

    PubMed

    Russell, David; Rosenfeld, Peri; Ames, Sylvia; Rosati, Robert J

    2010-01-01

    There is a growing recognition among health services researchers and policy makers that Health Information Technology (HIT) has the potential to address challenging issues that face patients and providers of healthcare. The Visiting Nurse Service of New York (VNSNY), a large not-for-profit home healthcare agency, has integrated technology applications into the service delivery model of several programs. Case studies of three informatics initiatives at VNSNY, including their development and implementation, are presented: (1) Quality Scorecards that utilize process, outcomes, cost, and satisfaction measures to assess performance among clinical staff and programs; (2) a tool to identify patients at risk of being hospitalized; and (3) a predictive model that identifies patients who are eligible for physical rehabilitation services. Following a description of these initiatives, we discuss their impact on quality and process indicators, as well as the opportunities and challenges to implementation. © 2010 National Association for Healthcare Quality.

  12. Characterization of urban air quality using GIS as a management system.

    PubMed

    Puliafito, E; Guevara, M; Puliafito, C

    2003-01-01

    Keeping air quality acceptable has become an important task for decision makers as well as for non-governmental organizations. Particulate and gaseous emissions of pollutants from industries and auto exhausts are responsible for rising discomfort, increasing airway diseases, decreasing productivity and the deterioration of artistic and cultural patrimony in urban centers. A model to determine the air quality in urban areas using a geographical information system is presented here. This system permits the integration, handling, analysis and simulation of spatial and temporal data on the ambient concentrations of the main pollutants. It allows users to characterize and recognize areas with a potential worsening or improvement in their air pollution situation. It is also possible to compute past or present conditions by changing basic input information such as traffic flow or stack emission rates. Additionally, the model may be used to test compliance with local air quality standards, to study the environmental impact of new industries, or to determine how conditions change when vehicle circulation increases.

  13. It Takes Two to Tango: How Parents' and Adolescents' Personalities Link to the Quality of Their Mutual Relationship

    ERIC Educational Resources Information Center

    Denissen, Jaap J. A.; van Aken, Marcel A. G.; Dubas, Judith S.

    2009-01-01

    According to J. Belsky's (1984) process model of parenting, both adolescents' and parents' personality should exert a significant impact on the quality of their mutual relationship. Using multi-informant, symmetric data on the Big Five personality traits and the relationship quality of mothers, fathers, and two adolescent children, the current…

  14. 75 FR 2860 - Clean Water Act Section 303(d): Call for Data for the Illinois River Watershed in Oklahoma and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-19

    ... in Oklahoma and Arkansas to address nutrient water quality impairments. The results of this watershed... Watershed. EPA requests that the public provide any water quality related data and information that may be... loads that are needed to meet water quality standards in both States. This watershed model will serve as...

  15. Signal-transfer Modeling for Regional Assessment of Forest Responses to Environmental Changes in the Southeastern United States

    Treesearch

    Robert J. Luxmoore; William W. Hargrove; M. Lynn Tharp; Wilfred M. Post; Michael W. Berry; Karen S. Minser; Wendell P. Cropper; Dale W. Johnson; Boris Zeide; Ralph L. Amateis; Harold E. Burkhart; V. Clark Baldwin; Kelly D. Peterson

    2000-01-01

    Stochastic transfer of information in a hierarchy of simulators is offered as a conceptual approach for assessing forest responses to changing climate and air quality across 13 southeastern states of the USA. This assessment approach combines geographic information system and Monte Carlo capabilities with several scales of computer modeling for southern pine species...

  16. Risk assessment of consuming agricultural products irrigated with reclaimed wastewater: An exposure model

    NASA Astrophysics Data System (ADS)

    van Ginneken, Meike; Oron, Gideon

    2000-09-01

    This study assesses health risks to consumers due to the use of agricultural products irrigated with reclaimed wastewater. The analysis is based on the definition of an exposure model which takes into account several parameters: (1) the quality of the applied wastewater, (2) the irrigation method, (3) the elapsed times between irrigation, harvest, and product consumption, and (4) the consumers' habits. The exposure model is used for numerical simulation of human consumers' risks using the Monte Carlo simulation method. The results of the numerical simulation show large deviations, probably caused by uncertainty (imprecision in the quality of input data) and by variability due to diversity among populations. There is a difference of 10 orders of magnitude in the risk of infection between the different exposure scenarios with the same water quality. This variation indicates the need for setting risk-based criteria for wastewater reclamation rather than single water quality guidelines. Additional data are required to decrease uncertainty in the risk assessment. Future research needs include the definition of acceptable risk criteria, more accurate dose-response modeling, information regarding pathogen survival in treated wastewater, additional data on the passage of pathogens onto and into plants during irrigation, and information regarding the behavior patterns of the community of human consumers.
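
    The Monte Carlo structure of such an exposure model can be sketched in a few lines. Every number below (pathogen concentration distribution, die-off rate, residual water per serving, beta-Poisson coefficients) is a hypothetical placeholder chosen for illustration, not a value from this study.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # Monte Carlo draws

# Hypothetical exposure chain: water quality -> die-off -> ingested dose
conc = rng.lognormal(mean=1.0, sigma=1.5, size=n)  # pathogens per litre of applied wastewater
elapsed = rng.uniform(1, 14, size=n)               # days between irrigation and consumption
k_decay = 0.5                                      # assumed first-order die-off rate (1/day)
residual_l = 0.01                                  # assumed litres of water clinging to one serving
dose = conc * residual_l * np.exp(-k_decay * elapsed)

# Beta-Poisson dose-response with illustrative parameters
alpha, beta = 0.2, 40.0
p_inf = 1.0 - (1.0 + dose / beta) ** (-alpha)

print(f"median risk {np.median(p_inf):.2e}, 95th percentile {np.quantile(p_inf, 0.95):.2e}")
```

    The spread between draws, here driven by elapsed time and inflow quality alone, illustrates why a single water quality guideline can correspond to very different infection risks.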

  17. Integration of external metadata into the Earth System Grid Federation (ESGF)

    NASA Astrophysics Data System (ADS)

    Berger, Katharina; Levavasseur, Guillaume; Stockhause, Martina; Lautenschlager, Michael

    2015-04-01

    International projects with high data volumes usually disseminate their data in a federated data infrastructure, e.g. the Earth System Grid Federation (ESGF). The ESGF aims to make the geographically distributed data seamlessly discoverable and accessible. Additional data-related information is currently collected and stored in separate repositories by each data provider. This scattered but useful information is unavailable, or only partly available, to ESGF users. Examples of such additional information systems are ES-DOC/metafor for model and simulation information, IPSL's versioning information, CHARMe for user annotations, DKRZ's quality information and data citation information. The ESGF Quality Control working team (esgf-qcwt) aims to integrate these valuable pieces of additional information into the ESGF in order to make them available to users and data archive managers by (i) integrating external information into the ESGF portal, (ii) integrating links to external information objects into the ESGF metadata index, e.g. through the use of PIDs (Persistent IDentifiers), and (iii) automating the collection of external information during the ESGF data publication process. For the sixth phase of CMIP (Coupled Model Intercomparison Project), the ESGF metadata index is to be enriched with additional information on data citation, file versions, etc. This information will support users directly and can be exploited automatically by higher-level services (human and machine readability).

  18. Classifying the health of Connecticut streams using benthic macroinvertebrates with implications for water management.

    PubMed

    Bellucci, Christopher J; Becker, Mary E; Beauchene, Mike; Dunbar, Lee

    2013-06-01

    Bioassessments have formed the foundation of many water quality monitoring programs throughout the United States. Like many state water quality programs, Connecticut has developed a relational database containing information about species richness, species composition, relative abundance, and feeding relationships among macroinvertebrates present in stream and river systems. Geographic Information Systems can provide estimates of landscape condition and watershed characteristics and, when combined with measurements of stream biology, provide a useful visual display of information in a management context. The objective of our study was to estimate stream health for all wadeable stream kilometers in Connecticut using a combination of macroinvertebrate metrics and landscape variables. We developed and evaluated models using an information-theoretic approach to predict stream health as measured by a macroinvertebrate multimetric index (MMI), and identified the best-fitting model as a three-variable model including percent impervious land cover, a wetlands metric, and catchment slope (adj-R² = 0.56, SE = 11.73). We then provide examples of how modeling can augment existing programs to support water management policies under the Federal Clean Water Act, such as stream assessments and anti-degradation.
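
    The reported three-variable model can be illustrated with ordinary least squares; only the predictor names follow the record, while the data and coefficients below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Synthetic stand-ins for the three landscape predictors
impervious = rng.uniform(0, 40, n)   # percent impervious land cover
wetlands = rng.uniform(0, 1, n)      # wetlands metric
slope = rng.uniform(0, 15, n)        # catchment slope
# Invented relationship: imperviousness depresses the multimetric index (MMI)
mmi = 80 - 1.2 * impervious + 10 * wetlands + 0.8 * slope + rng.normal(0, 8, n)

X = np.column_stack([np.ones(n), impervious, wetlands, slope])
coef, *_ = np.linalg.lstsq(X, mmi, rcond=None)

resid = mmi - X @ coef
r2 = 1 - (resid @ resid) / ((mmi - mmi.mean()) ** 2).sum()
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - X.shape[1])  # adjusted R-squared, as in the record
```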

  19. Health service changes to address diabetes in pregnancy in a complex setting: perspectives of health professionals.

    PubMed

    Kirkham, R; Boyle, J A; Whitbread, C; Dowden, M; Connors, C; Corpus, S; McCarthy, L; Oats, J; McIntyre, H D; Moore, E; O'Dea, K; Brown, A; Maple-Brown, L

    2017-08-03

    Australian Aboriginal and Torres Strait Islander women have high rates of gestational and pre-existing type 2 diabetes in pregnancy. The Northern Territory (NT) Diabetes in Pregnancy Partnership was established to enhance systems and services to improve health outcomes. It has three arms: a clinical register, developing models of care, and a longitudinal birth cohort. This study used a process evaluation to report on health professionals' perceptions of models of care and related quality improvement activities since the implementation of the Partnership. Changes to models of care were documented according to the goals and aims of the Partnership and reviewed annually by the Partnership Steering group. A 'systems assessment tool' was used to guide six focus groups (49 healthcare professionals). Transcripts were coded and analysed according to the pre-identified themes of orientation and guidelines, education, communication, logistics and access, and information technology. Key improvements since implementation of the Partnership include health professional relationships, communication and education, and the integration of quality improvement activities. The focus groups provided in-depth information about how these activities have impacted practice and models of care for diabetes in pregnancy. Co-ordination of care was reported to have improved; however, it was also identified as an opportunity for further development. Recommendations included a central care coordinator, better integration of information technology systems, and ongoing comprehensive quality improvement processes. The Partnership has facilitated quality improvement by supporting the development of improved systems that enhance models of care. Persisting challenges exist in delivering care to a high-risk population; however, improvements in formal processes and structures, as demonstrated in this work thus far, play an important role in working towards improved health outcomes.

  20. Quality data collection and management technology of aerospace complex product assembly process

    NASA Astrophysics Data System (ADS)

    Weng, Gang; Liu, Jianhua; He, Yongxi; Zhuang, Cunbo

    2017-04-01

    Aiming to solve the problems of difficult management and poor traceability of discrete assembly process quality data, a data collection and management method is proposed which takes the assembly process and the BOM as its core. The method includes a data collection approach based on workflow technology, a data model based on the BOM, and quality traceability of the assembly process. Finally, an assembly process quality data management system is developed, realizing effective control and management of quality information for the complex product assembly process.
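
    The BOM-centred data model can be pictured as quality records keyed to assembly steps, each of which references a BOM node; all class and field names here are illustrative, not the paper's schema.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class QualityRecord:
    characteristic: str      # measured characteristic, e.g. "torque"
    value: float
    unit: str
    operator: str
    timestamp: datetime = field(default_factory=datetime.now)

@dataclass
class AssemblyStep:
    step_id: str
    bom_item: str            # BOM node this step assembles
    records: list = field(default_factory=list)

def trace(steps, bom_item):
    """Traceability query: all quality records collected for one BOM item."""
    return [r for s in steps if s.bom_item == bom_item for r in s.records]
```

    Keying every record to a BOM node is what makes the traceability query a simple filter rather than a join across ad hoc spreadsheets.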

  1. Quality and security - They work together

    NASA Technical Reports Server (NTRS)

    Carr, Richard; Tynan, Marie; Davis, Russell

    1991-01-01

    This paper describes the importance of considering computer security as part of software quality assurance practice. The intended audience is primarily those professionals involved in the design, development, and quality assurance of software. Many issues are raised which ultimately point to the need for integration of the quality assurance and computer security disciplines. To address some of the issues raised, the NASA Automated Information Security program is presented as a model which may be used for improving interactions between the quality assurance and computer security communities of professionals.

  2. Quality of asthma care under different primary care models in Canada: a population-based study.

    PubMed

    To, Teresa; Guan, Jun; Zhu, Jingqin; Lougheed, M Diane; Kaplan, Alan; Tamari, Itamar; Stanbrook, Matthew B; Simatovic, Jacqueline; Feldman, Laura; Gershon, Andrea S

    2015-02-14

    Previous research has shown variations in quality of care and patient outcomes under different primary care models. The objective of this study was to use previously validated, evidence-based performance indicators to measure quality of asthma care over time and to compare quality of care between different primary care models. Data were obtained for years 2006 to 2010 from the Ontario Asthma Surveillance Information System, which uses health administrative databases to track individuals with asthma living in the province of Ontario, Canada. Individuals with asthma (n=1,813,922) were divided into groups based on the practice model of their primary care provider (i.e., fee-for-service, blended fee-for-service, blended capitation). Quality of asthma care was measured using six validated, evidence-based asthma care performance indicators. All of the asthma performance indicators improved over time within each of the primary care models. Compared to the traditional fee-for-service model, the blended fee-for-service and blended capitation models had higher use of spirometry for asthma diagnosis and monitoring, higher rates of inhaled corticosteroid prescription, and lower outpatient claims. Emergency department visits were lowest in the blended fee-for-service group. Quality of asthma care improved over time within each of the primary care models. However, the amount by which they improved differed between the models. The newer primary care models (i.e., blended fee-for-service, blended capitation) appear to provide better quality of asthma care compared to the traditional fee-for-service model.

  3. Modeling dispersion of traffic-related pollutants in the NEXUS health study

    EPA Science Inventory

    Dispersion modeling tools have traditionally provided critical information for air quality management decisions, but have been used recently to provide exposure estimates to support health studies. However, these models can be challenging to implement, particularly in near-road s...

  4. Information needs for making clinical recommendations about potential drug-drug interactions: a synthesis of literature review and interviews.

    PubMed

    Romagnoli, Katrina M; Nelson, Scott D; Hines, Lisa; Empey, Philip; Boyce, Richard D; Hochheiser, Harry

    2017-02-22

    Drug information compendia and drug-drug interaction information databases are critical resources for clinicians and pharmacists working to avoid adverse events due to exposure to potential drug-drug interactions (PDDIs). Our goal is to develop information models, annotated data, and search tools that will facilitate the interpretation of PDDI information. To better understand the information needs and work practices of specialists who search and synthesize PDDI evidence for drug information resources, we conducted an inquiry that combined a thematic analysis of published literature with unstructured interviews. Starting from an initial set of relevant articles, we developed search terms and conducted a literature search. Two reviewers conducted a thematic analysis of included articles. Unstructured interviews with drug information experts were conducted and similarly coded. Information needs, work processes, and indicators of potential strengths and weaknesses of information systems were identified. Review of 92 papers and 10 interviews identified 56 categories of information needs related to the interpretation of PDDI information including drug and interaction information; study design; evidence including clinical details, quality and content of reports, and consequences; and potential recommendations. We also identified strengths/weaknesses of PDDI information systems. We identified the kinds of information that might be most effective for summarizing PDDIs. The drug information experts we interviewed had differing goals, suggesting a need for detailed information models and flexible presentations. Several information needs not discussed in previous work were identified, including temporal overlaps in drug administration, biological plausibility of interactions, and assessment of the quality and content of reports. Richly structured depictions of PDDI information may help drug information experts more effectively interpret data and develop recommendations. 
Effective information models and system designs will be needed to maximize the utility of this information.

  5. Developing a workflow to identify inconsistencies in volunteered geographic information: a phenological case study

    USGS Publications Warehouse

    Mehdipoor, Hamed; Zurita-Milla, Raul; Rosemartin, Alyssa; Gerst, Katharine L.; Weltzin, Jake F.

    2015-01-01

    Recent improvements in online information communication and mobile location-aware technologies have led to the production of large volumes of volunteered geographic information. Widespread, large-scale efforts by volunteers to collect data can inform and drive scientific advances in diverse fields, including ecology and climatology. Traditional workflows to check the quality of such volunteered information can be costly and time consuming as they heavily rely on human interventions. However, identifying factors that can influence data quality, such as inconsistency, is crucial when these data are used in modeling and decision-making frameworks. Recently developed workflows use simple statistical approaches that assume that the majority of the information is consistent. However, this assumption is not generalizable, and ignores underlying geographic and environmental contextual variability that may explain apparent inconsistencies. Here we describe an automated workflow to check inconsistency based on the availability of contextual environmental information for sampling locations. The workflow consists of three steps: (1) dimensionality reduction to facilitate further analysis and interpretation of results, (2) model-based clustering to group observations according to their contextual conditions, and (3) identification of inconsistent observations within each cluster. The workflow was applied to volunteered observations of flowering in common and cloned lilac plants (Syringa vulgaris and Syringa x chinensis) in the United States for the period 1980 to 2013. About 97% of the observations for both common and cloned lilacs were flagged as consistent, indicating that volunteers provided reliable information for this case study. 
Relative to the original dataset, the exclusion of inconsistent observations changed the apparent rate of change in lilac bloom dates by two days per decade, indicating the importance of inconsistency checking as a key step in data quality assessment for volunteered geographic information. Initiatives that leverage volunteered geographic information can adapt this workflow to improve the quality of their datasets and the robustness of their scientific analyses.
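
    The three-step workflow can be sketched with plain NumPy on synthetic data: SVD-based dimensionality reduction, a crude k-means in place of the paper's model-based clustering, and robust per-cluster outlier flagging. The covariates, injected outliers, and thresholds are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic contextual covariates and bloom dates; first 5 records made inconsistent
X = rng.normal(size=(300, 6))
bloom_doy = 100 + 3 * X[:, 0] + rng.normal(0, 2, 300)  # day of year of first bloom
bloom_doy[:5] += 40

# Step 1: dimensionality reduction (PCA via SVD on centred data)
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:2].T

# Step 2: group observations by contextual conditions (crude k-means stand-in)
k = 3
centres = Z[rng.choice(len(Z), k, replace=False)]
for _ in range(20):
    labels = ((Z[:, None, :] - centres[None, :, :]) ** 2).sum(-1).argmin(axis=1)
    centres = np.array([Z[labels == j].mean(axis=0) if np.any(labels == j) else centres[j]
                        for j in range(k)])

# Step 3: flag observations inconsistent with their cluster (robust z-score > 3.5)
flags = np.zeros(len(Z), dtype=bool)
for j in range(k):
    grp = bloom_doy[labels == j]
    med = np.median(grp)
    mad = np.median(np.abs(grp - med)) + 1e-9
    flags[labels == j] = np.abs(grp - med) / (1.4826 * mad) > 3.5
```

    Judging each observation against its own cluster, rather than against the global majority, is what lets the check tolerate genuine geographic and environmental variability.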

  6. Reduced-form air quality modeling for community-scale ...

    EPA Pesticide Factsheets

    Transportation plays an important role in modern society, but its impact on air quality has been shown to have significant adverse effects on public health. Numerous reviews (HEI, CDC, WHO) summarizing the findings of hundreds of studies, conducted mainly in the last decade, conclude that exposures to traffic emissions near roads are a public health concern. The Community LINE Source Model (C-LINE) is a web-based model designed to inform the community user of local air quality impacts due to roadway vehicles in their region of interest using a simplified modeling approach. Reduced-form air quality modeling is a useful tool for examining what-if scenarios of changes in emissions, such as those due to changes in traffic volume, fleet mix, or vehicle speed. Examining various scenarios of air quality impacts in this way can identify potentially at-risk populations located near roadways, and the effects that a change in traffic activity may have on them. C-LINE computes the dispersion of primary mobile-source pollutants using meteorological conditions for the region of interest and computes air quality concentrations corresponding to these selected conditions. C-LINE functionality has been expanded to model emissions from port-related activities (e.g. ships, trucks, cranes, etc.) in a reduced-form modeling system for local-scale near-port air quality analysis. This presentation describes the community modeling tools C-LINE and C-PORT that are intended to be used by local gove
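
    A reduced-form line-source calculation in this spirit can be sketched by discretising a road into ground-level point sources, each with a Gaussian plume. The dispersion coefficients are approximate textbook rural class-D fits and every input is illustrative; this is not C-LINE's actual formulation.

```python
import numpy as np

def line_source_concentration(x_recep, y_recep, u=3.0, q_per_m=1e-4, n_seg=200):
    """Concentration (g/m^3) at a ground-level receptor downwind of a road on
    the y-axis, wind blowing along +x. Emission rate q_per_m in g/s per metre."""
    if x_recep <= 0:
        return 0.0                           # receptor upwind of the road
    ys = np.linspace(-500.0, 500.0, n_seg)   # road segment midpoints (m)
    seg_len = ys[1] - ys[0]
    # Approximate Briggs rural class-D dispersion coefficients
    sigma_y = 0.08 * x_recep / np.sqrt(1 + 0.0001 * x_recep)
    sigma_z = 0.06 * x_recep / np.sqrt(1 + 0.0015 * x_recep)
    cross = y_recep - ys
    # Ground-level source with full ground reflection
    contrib = (q_per_m * seg_len / (np.pi * u * sigma_y * sigma_z)
               * np.exp(-cross**2 / (2 * sigma_y**2)))
    return contrib.sum()
```

    Summing segment contributions approximates the line integral, and halving traffic emissions simply scales q_per_m, which is exactly the kind of what-if scenario described above.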

  7. Sensitivity of an Integrated Mesoscale Atmosphere and Agriculture Land Modeling System (WRF/CMAQ-EPIC) to MODIS Vegetation and Lightning Assimilation

    NASA Astrophysics Data System (ADS)

    Ran, L.; Cooter, E. J.; Gilliam, R. C.; Foroutan, H.; Kang, D.; Appel, W.; Wong, D. C.; Pleim, J. E.; Benson, V.; Pouliot, G.

    2017-12-01

    The combined meteorology and air quality modeling system composed of the Weather Research and Forecasting (WRF) model and the Community Multiscale Air Quality (CMAQ) model is an important decision support tool used in research and in regulatory decisions related to emissions, meteorology, climate, and chemical transport. The Environmental Policy Integrated Climate (EPIC) model is a cropping model which has long been used in a range of applications related to soil erosion, crop productivity, climate change, and water quality around the world. We have integrated WRF/CMAQ with EPIC using the Fertilizer Emission Scenario Tool for CMAQ (FEST-C) to estimate daily soil N information with fertilization for CMAQ bi-directional ammonia flux modeling. Driven by the weather and N deposition from WRF/CMAQ, FEST-C EPIC simulations are conducted for 22 different agricultural production systems, ranging from managed grasslands (e.g. hay and alfalfa) to croplands (e.g. corn grain and soybean) with rainfed and irrigated information, across any defined conterminous United States (U.S.) CMAQ domain and grid resolution. In recent years, this integrated system has been enhanced and applied in many different air quality and ecosystem assessment projects related to land-water-atmosphere interactions. These enhancements have advanced the system into a valuable tool for integrated assessments of air, land and water quality in light of social drivers and human and ecological outcomes. This presentation focuses on evaluating the sensitivity of precipitation and N deposition in the integrated system to MODIS vegetation input and lightning assimilation, and their impacts on agricultural production and fertilization. We describe the integrated modeling system and evaluate simulated precipitation and N deposition, along with other weather information (e.g. temperature, humidity), for 2011 over the conterminous U.S. on a 12-km grid from a coupled WRF/CMAQ with MODIS and lightning assimilation. Simulated agricultural production and fertilization from FEST-C EPIC, driven by the changed meteorology and N deposition from MODIS and lightning assimilation, will be evaluated and analyzed.

  8. Utilization of building information modeling in infrastructure’s design and construction

    NASA Astrophysics Data System (ADS)

    Zak, Josef; Macadam, Helen

    2017-09-01

    Building Information Modeling (BIM) is a concept that has gained its place in the design, construction and maintenance of buildings in the Czech Republic in recent years. This paper describes the usage, applications and potential benefits and disadvantages connected with the implementation of BIM principles in the preparation and construction of infrastructure projects. Part of the paper describes the status of BIM implementation in the Czech Republic and reviews several virtual design and construction practices there. Examples of best practice are presented from current infrastructure projects. The paper further summarizes experience with new technologies gained from the application of BIM-related workflows. The focus is on BIM model utilization for machine control systems on site, quality assurance, quality management and construction management.

  9. Real-time assessments of water quality: expanding nowcasting throughout the Great Lakes

    USGS Publications Warehouse

    ,

    2013-01-01

    Nowcasts are systems that inform the public of current bacterial water-quality conditions at beaches on the basis of predictive models. During 2010–12, the U.S. Geological Survey (USGS) worked with 23 local and State agencies to improve existing operational beach nowcast systems at 4 beaches and expand the use of predictive models in nowcasts at an additional 45 beaches throughout the Great Lakes. The predictive models were specific to each beach, and the best model for each beach was based on a unique combination of environmental and water-quality explanatory variables. The variables used most often in models to predict Escherichia coli (E. coli) concentrations or the probability of exceeding a State recreational water-quality standard included turbidity, day of the year, wave height, wind direction and speed, antecedent rainfall for various time periods, and change in lake level over 24 hours. During validation of 42 beach models during 2012, the models performed better than the current method to assess recreational water quality (previous day's E. coli concentration). The USGS will continue to work with local agencies to improve nowcast predictions, enable technology transfer of predictive model development procedures, and implement more operational systems during 2013 and beyond.
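
    A nowcast-style model of this kind can be sketched as a logistic regression on a few of the explanatory variables named above; the data, coefficients, and fitting loop below are synthetic and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 500

# Synthetic stand-ins for three of the explanatory variables in the record
turbidity = rng.lognormal(2.0, 0.5, n)
rain48 = rng.exponential(5.0, n)      # antecedent rainfall (mm)
wave = rng.uniform(0.0, 2.0, n)       # wave height (m)
X = np.column_stack([np.ones(n), np.log(turbidity), rain48, wave])

# Labels: did E. coli exceed the standard? Generated from a known linear predictor
true_logit = -4 + 1.2 * np.log(turbidity) + 0.15 * rain48 + 0.8 * wave
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-true_logit))).astype(float)

# Fit logistic regression by plain gradient ascent on the log-likelihood
w = np.zeros(X.shape[1])
for _ in range(5000):
    p = 1 / (1 + np.exp(-X @ w))
    w += 0.01 * X.T @ (y - p) / n

p_exceed = 1 / (1 + np.exp(-X @ w))   # predicted exceedance probability
```

    The record notes that the best predictor set differed beach by beach, which in this framing is simply a per-beach choice of columns for X.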

  10. Evaluating a complex model designed to increase access to high quality primary mental health care for under-served groups: a multi-method study.

    PubMed

    Dowrick, Christopher; Bower, Peter; Chew-Graham, Carolyn; Lovell, Karina; Edwards, Suzanne; Lamb, Jonathan; Bristow, Katie; Gabbay, Mark; Burroughs, Heather; Beatty, Susan; Waheed, Waquas; Hann, Mark; Gask, Linda

    2016-02-17

    Many people with mental distress are disadvantaged because care is not available or does not address their needs. In order to increase access to high quality primary mental health care for under-served groups, we created a model of care with three discrete elements: community engagement, primary care training and tailored wellbeing interventions. We have previously demonstrated the individual impact of each element of the model. Here we assess the effectiveness of the combined model in increasing access to and improving the quality of primary mental health care. We test the assumptions that access to the wellbeing interventions is increased by the presence of community engagement and primary care training; and that quality of primary mental health care is increased by the presence of community engagement and the wellbeing interventions. We implemented the model in four under-served localities in North-West England, focusing on older people and minority ethnic populations. Using a quasi-experimental design with no-intervention comparators, we gathered a combination of quantitative and qualitative information. Quantitative information, including referral and recruitment rates for the wellbeing interventions, and practice referrals to mental health services, was analysed descriptively. Qualitative information derived from interview and focus group responses to topic guides from more than 110 participants. Framework analysis was used to generate findings from the qualitative data. Access to the wellbeing interventions was associated with the presence of the community engagement and the primary care training elements. Referrals to the wellbeing interventions were associated with community engagement, while recruitment was associated with primary care training. Qualitative data suggested that the mechanisms underlying these associations were increased awareness and sense of agency. 
The quality of primary mental health care was enhanced by information gained from our community mapping activities, and by the offer of access to the wellbeing interventions. There were variable benefits from health practitioner participation in community consultative groups. We also found that participation in the wellbeing interventions led to increased community engagement. We explored the interactions between elements of a multilevel intervention and identified important associations and underlying mechanisms. Further research is needed to test the generalisability of the model. Current Controlled Trials, reference ISRCTN68572159 . Registered 25 February 2013.

  11. Determinants of perceived sleep quality in normal sleepers.

    PubMed

    Goelema, M S; Regis, M; Haakma, R; van den Heuvel, E R; Markopoulos, P; Overeem, S

    2017-09-20

This study aimed to establish the determinants of perceived sleep quality over a longer period of time, taking into account the separate contributions of actigraphy-based sleep measures and self-reported sleep indices. Fifty participants (52 ± 6.6 years; 27 females) completed two consecutive weeks of home monitoring, during which they kept a sleep-wake diary while their sleep was monitored using a wrist-worn actigraph. The diary included questions on perceived sleep quality, sleep-wake information, and additional factors such as well-being and stress. The data were analyzed using multilevel models to compare a model that included only actigraphy-based sleep measures (model Acti) to a model that included only self-reported sleep measures to explain perceived sleep quality (model Self). In addition, a model based on the self-reported sleep measures and extended with nonsleep-related factors was analyzed to find the most significant determinants of perceived sleep quality (model Extended). Self-reported sleep measures (model Self) explained 61% of the total variance, while actigraphy-based sleep measures (model Acti) accounted for only 41% of the variance in perceived sleep quality. The main predictors in the self-reported model were number of awakenings during the night, sleep onset latency, and wake time after sleep onset. In the extended model, the number of awakenings during the night and total sleep time of the previous night were the strongest determinants of perceived sleep quality, with 64% of the variance explained. In our cohort, perceived sleep quality was mainly determined by self-reported sleep measures and less by actigraphy-based sleep indices. These data further stress the importance of taking multiple nights into account when trying to understand perceived sleep quality.
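The headline comparison (self-reported measures explaining more variance than actigraphy) can be illustrated with a minimal sketch. All data below are synthetic, the multilevel (repeated-measures) structure is ignored for brevity, and `awakenings`, `sol`, `waso`, and `acti_tst` are hypothetical stand-ins for the study's measures:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 700  # roughly 50 participants x 14 nights, pooled

# Hypothetical predictors; effect sizes chosen so that self-reported
# indices carry most of the signal, as in the study's findings.
awakenings = rng.poisson(2, n)           # self-reported awakenings
sol = rng.exponential(15, n)             # sleep onset latency (min)
waso = rng.exponential(20, n)            # wake after sleep onset (min)
acti_tst = rng.normal(420, 40, n)        # actigraphy total sleep time (min)

quality = (8 - 0.5 * awakenings - 0.03 * sol - 0.02 * waso
           + 0.002 * acti_tst + rng.normal(0, 1, n))

X_self = np.column_stack([awakenings, sol, waso])
X_acti = np.column_stack([acti_tst])

r2_self = LinearRegression().fit(X_self, quality).score(X_self, quality)
r2_acti = LinearRegression().fit(X_acti, quality).score(X_acti, quality)
print(f"self-report R^2: {r2_self:.2f}, actigraphy R^2: {r2_acti:.2f}")
```

A proper reanalysis would use mixed-effects models with per-participant intercepts; this sketch only shows how the two predictor sets are compared on explained variance.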

  12. Quality Analysis of Open Street Map Data

    NASA Astrophysics Data System (ADS)

    Wang, M.; Li, Q.; Hu, Q.; Zhou, M.

    2013-05-01

Crowd-sourced geographic data are open-source geographic data contributed by large numbers of non-professionals and provided to the public. Typical crowd-sourced geographic data include GPS track data such as OpenStreetMap, collaborative map data such as Wikimapia, content from social websites such as Twitter and Facebook, POIs tagged by Jiepang users, and so on. After processing, these data can provide canonical geographic information to the public. Compared with conventional geographic data collection and update methods, crowd-sourced geographic data from non-professionals have the advantages of large data volume, high currency, abundant information and low cost, and have become a research hotspot in international geographic information science in recent years. Large-volume, highly current crowd-sourced geographic data offer a new solution for geospatial database updating, but the quality problems of data obtained from non-professionals must first be addressed. In this paper, a quality analysis model for OpenStreetMap (OSM) crowd-sourced geographic data is proposed. Firstly, a quality analysis framework is designed based on an analysis of the characteristics of OSM data. Secondly, a quality assessment model for OSM data is presented using three quality elements: completeness, thematic accuracy and positional accuracy. Finally, taking the OSM data of Wuhan as an example, the paper analyses and assesses the quality of OSM data against a 2011 navigation map used as reference. The results show that the high-level roads and urban traffic network in the OSM data have high positional accuracy and completeness, so these OSM data can be used for updating an urban road network database.
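As a hedged illustration of one quality element above, road-network completeness is often approximated as the ratio of OSM road length to the length in a reference map, per road class. The class names and lengths below are invented placeholders, not values from the Wuhan assessment:

```python
# Hypothetical road lengths (km) per road class in one district,
# comparing OSM against a reference navigation map. Completeness is
# sketched here simply as a length ratio; real assessments also match
# individual features between the two datasets.
osm_km = {"motorway": 182.0, "primary": 341.0, "residential": 905.0}
ref_km = {"motorway": 185.0, "primary": 350.0, "residential": 1400.0}

for cls in osm_km:
    completeness = osm_km[cls] / ref_km[cls]
    print(f"{cls:11s} completeness: {completeness:.1%}")
```

The pattern matches the paper's finding: high-level roads are near-complete while minor roads lag behind.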

  13. MODEL ANALYSIS OF RIPARIAN BUFFER EFFECTIVENESS FOR REDUCING NUTRIENT INPUTS TO STREAMS IN AGRICULTURAL LANDSCAPES

    EPA Science Inventory

    Federal and state agencies responsible for protecting water quality rely mainly on statistically-based methods to assess and manage risks to the nation's streams, lakes and estuaries. Although statistical approaches provide valuable information on current trends in water quality...

  14. German dentists' websites on periodontitis have low quality of information.

    PubMed

    Schwendicke, Falk; Stange, Jörg; Stange, Claudia; Graetz, Christian

    2017-08-02

The internet is an increasingly relevant source of health information. We aimed to assess the quality of German dentists' websites on periodontitis, hypothesizing that quality was significantly associated with a number of practice-specific parameters. We searched four electronic search engines and included pages which were freely accessible, posted by a dental practice in Germany, and mentioned periodontal disease/therapy. Websites were assessed for (1) technical and functional aspects, (2) generic quality and risk of bias, and (3) disease-specific information. For (1) and (2), validated tools (LIDA/DISCERN) were used for assessment. For (3), we developed a criterion catalogue encompassing items on etiologic and prognostic factors for periodontitis, the diagnostic and treatment process, and the generic chance of tooth retention in periodontitis patients. Inter- and intra-rater reliabilities were largely moderate. Generalized linear modeling was used to assess the association between information quality (measured as % of maximally available scores) and practice-specific characteristics. Seventy-one websites were included. Technical and functional aspects were reported in significantly higher quality (median: 71%, 25th/75th percentiles: 67/79%) than all other aspects (p < 0.05). Generic risk of bias and most disease-specific aspects showed significantly lower reporting quality (median range 0-40%), with poorest reporting for prognostic factors (9; 0/27%), the diagnostic process (0; 0/33%) and chances of tooth retention (0; 0/2%). We found none of the practice-specific parameters to have a significant impact on the overall quality of the websites. Most German dentists' websites on periodontitis are not fully trustworthy, and relevant information is often missing or insufficiently presented. There is a great need to improve the information quality of such websites, at least with regard to periodontitis.

  15. Assessing reliability of protein-protein interactions by integrative analysis of data in model organisms.

    PubMed

    Lin, Xiaotong; Liu, Mei; Chen, Xue-wen

    2009-04-29

    Protein-protein interactions play vital roles in nearly all cellular processes and are involved in the construction of biological pathways such as metabolic and signal transduction pathways. Although large-scale experiments have enabled the discovery of thousands of previously unknown linkages among proteins in many organisms, the high-throughput interaction data is often associated with high error rates. Since protein interaction networks have been utilized in numerous biological inferences, the inclusive experimental errors inevitably affect the quality of such prediction. Thus, it is essential to assess the quality of the protein interaction data. In this paper, a novel Bayesian network-based integrative framework is proposed to assess the reliability of protein-protein interactions. We develop a cross-species in silico model that assigns likelihood scores to individual protein pairs based on the information entirely extracted from model organisms. Our proposed approach integrates multiple microarray datasets and novel features derived from gene ontology. Furthermore, the confidence scores for cross-species protein mappings are explicitly incorporated into our model. Applying our model to predict protein interactions in the human genome, we are able to achieve 80% in sensitivity and 70% in specificity. Finally, we assess the overall quality of the experimentally determined yeast protein-protein interaction dataset. We observe that the more high-throughput experiments confirming an interaction, the higher the likelihood score, which confirms the effectiveness of our approach. This study demonstrates that model organisms certainly provide important information for protein-protein interaction inference and assessment. The proposed method is able to assess not only the overall quality of an interaction dataset, but also the quality of individual protein-protein interactions. 
We expect the method to improve continually as more high-quality interaction data from more model organisms become available, and it is readily scalable to genome-wide applications.
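The integrative scoring idea can be sketched as a naive-Bayes combination of evidence sources, each contributing a likelihood ratio to the posterior odds that an interaction is real. This is a simplified stand-in for the paper's Bayesian network; the feature names, likelihood ratios, and prior below are hypothetical:

```python
import math

# Hypothetical likelihood ratios P(feature | true interaction) /
# P(feature | false interaction) for three evidence sources.
LR = {
    "coexpression_high": 4.0,
    "go_term_shared": 6.0,
    "ortholog_interacts": 10.0,
}

def reliability_score(features, prior_odds=0.01):
    """Posterior probability that an interaction is real, assuming
    conditionally independent evidence sources (naive Bayes)."""
    log_odds = math.log(prior_odds)
    for f in features:
        log_odds += math.log(LR[f])
    odds = math.exp(log_odds)
    return odds / (1 + odds)

weak = reliability_score(["coexpression_high"])
strong = reliability_score(list(LR))
print(f"one source: {weak:.3f}, all sources: {strong:.3f}")
```

This mirrors the abstract's observation that interactions confirmed by more independent evidence receive higher likelihood scores.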

  16. Using Data From Ontario's Episode-Based Funding Model to Assess Quality of Chemotherapy.

    PubMed

    Kaizer, Leonard; Simanovski, Vicky; Lalonde, Carlin; Tariq, Huma; Blais, Irene; Evans, William K

    2016-10-01

    A new episode-based funding model for ambulatory systemic therapy was implemented in Ontario, Canada on April 1, 2014, after a comprehensive knowledge transfer and exchange strategy with providers and administrators. An analysis of the data from the first year of the new funding model provided an opportunity to assess the quality of chemotherapy, which was not possible under the old funding model. Options for chemotherapy regimens given with adjuvant/curative intent or palliative intent were informed by input from disease site groups. Bundles were developed and priced to enable evidence-informed best practice. Analysis of systemic therapy utilization after model implementation was performed to assess the concordance rate of the treatments chosen with recommended practice. The actual number of cycles of treatment delivered was also compared with expert recommendations. Significant improvement compared with baseline was seen in the proportion of adjuvant/curative regimens that aligned with disease site group-recommended options (98% v 90%). Similar improvement was seen for palliative regimens (94% v 89%). However, overall, the number of cycles of adjuvant/curative therapy delivered was lower than recommended best practice in 57.5% of patients. There was significant variation by disease site and between facilities. Linking funding to quality, supported by knowledge transfer and exchange, resulted in a rapid improvement in the quality of systemic treatment in Ontario. This analysis has also identified further opportunities for improvement and the need for model refinement.

17. Modeling and Simulation Resource Repository (MSRR) (System Engineering/Integrated M&S Management Approach)

    NASA Technical Reports Server (NTRS)

    Milroy, Audrey; Hale, Joe

    2006-01-01

NASA's Exploration Systems Mission Directorate (ESMD) is implementing a management approach for modeling and simulation (M&S) that will provide decision-makers information on a model's fidelity, credibility, and quality, including verification, validation and accreditation information. The NASA MSRR will be implemented leveraging M&S industry best practices. This presentation will discuss the requirements that will enable NASA to capture and make available the "meta data" or "simulation biography" data associated with a model. The presentation will also describe the requirements that drive how NASA will collect and document relevant information for models or suites of models in order to facilitate use and reuse of relevant models and provide visibility across NASA organizations and the larger M&S community.

  18. Assessing potential effects of highway runoff on receiving-water quality at selected sites in Oregon with the Stochastic Empirical Loading and Dilution Model (SELDM)

    USGS Publications Warehouse

    Risley, John C.; Granato, Gregory E.

    2014-01-01

An analysis of the use of grab sampling and nonstochastic upstream modeling methods was done to evaluate the potential effects on modeling outcomes. Additional analyses using surrogate water-quality datasets for the upstream basin and highway catchment were provided for six Oregon study sites to illustrate the risk-based information that SELDM will produce. These analyses show that the potential effects of highway runoff on receiving-water quality downstream of the outfall depend on the ratio of drainage areas (dilution), the quality of the receiving water upstream of the highway, and the criterion concentration for the constituent of interest. These analyses also show that the probability of exceeding a water-quality criterion may depend on the input statistics used, so careful selection of representative values is important.
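At its core, SELDM propagates a stochastic mass balance: upstream and runoff flows and concentrations are drawn from fitted distributions, mixed at the outfall, and the downstream exceedance probability is estimated. A minimal Monte Carlo sketch follows; the lognormal parameters and criterion value are illustrative placeholders, not SELDM's calibrated statistics:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical lognormal flow and concentration distributions for the
# upstream receiving water and the highway runoff.
q_up = rng.lognormal(mean=2.0, sigma=0.5, size=n)   # upstream flow
q_hw = rng.lognormal(mean=0.0, sigma=0.7, size=n)   # runoff flow
c_up = rng.lognormal(mean=1.0, sigma=0.4, size=n)   # upstream concentration
c_hw = rng.lognormal(mean=3.0, sigma=0.8, size=n)   # runoff concentration

# Mass-balance mixing at the outfall (flow-weighted concentration)
c_down = (c_up * q_up + c_hw * q_hw) / (q_up + q_hw)

criterion = 15.0  # hypothetical water-quality criterion
p_exceed = float(np.mean(c_down > criterion))
print(f"P(downstream conc. > criterion) ~ {p_exceed:.3f}")
```

As the abstract notes, the exceedance probability is sensitive to the drainage-area (flow) ratio and to the input statistics chosen, which the simulation makes easy to explore.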

  19. Information Integration for Concurrent Engineering (IICE) IDEF3 Process Description Capture Method Report

    DTIC Science & Technology

    1995-09-01

    vital processes of a business. process, IDEF, method, methodology, modeling, knowledge acquisition, requirements definition, information systems... knowledge resources. Like manpower, materials, and machines, information and knowledge assets are recognized as vital resources that can be leveraged to...integrated enterprise. These technologies are designed to leverage information and knowledge resources as the key enablers for high quality systems

  20. Applying a health behavior theory to explore the influence of information and experience on arsenic risk representations, policy beliefs, and protective behavior.

    PubMed

    Severtson, Dolores J; Baumann, Linda C; Brown, Roger L

    2006-04-01

    The common sense model (CSM) shows how people process information to construct representations, or mental models, that guide responses to health threats. We applied the CSM to understand how people responded to information about arsenic-contaminated well water. Constructs included external information (arsenic level and information use), experience (perceived water quality and arsenic-related health effects), representations, safety judgments, opinions about policies to mitigate environmental arsenic, and protective behavior. Of 649 surveys mailed to private well users with arsenic levels exceeding the maximum contaminant level, 545 (84%) were analyzed. Structural equation modeling quantified CSM relationships. Both external information and experience had substantial effects on behavior. Participants who identified a water problem were more likely to reduce exposure to arsenic. However, about 60% perceived good water quality and 60% safe water. Participants with higher arsenic levels selected higher personal safety thresholds and 20% reported a lower arsenic level than indicated by their well test. These beliefs would support judgments of safe water. A variety of psychological and contextual factors may explain judgments of safe water when information suggested otherwise. Information use had an indirect effect on policy beliefs through understanding environmental causes of arsenic. People need concrete information about environmental risk at both personal and environmental-systems levels to promote a comprehensive understanding and response. The CSM explained responses to arsenic information and may have application to other environmental risks.

  1. Using a Birth Center Model of Care to Improve Reproductive Outcomes in Informal Settlements-a Case Study.

    PubMed

    Wallace, Jacqueline

    2018-06-04

The world is becoming increasingly urban. For the first time in history, more than 50% of human beings live in cities (United Nations, Department of Economic and Social Affairs, Population Division, ed. (2015)). Rapid urbanization is often chaotic and unstructured, leading to the formation of informal settlements or slums. Informal settlements are frequently located in environmentally hazardous areas and typically lack adequate sanitation and clean water, leading to poor health outcomes for residents. In these difficult circumstances women and children fare the worst, and reproductive outcomes for women living in informal settlements are grim. Insufficient uptake of antenatal care, lack of skilled birth attendants and poor-quality care contribute to maternal mortality rates in informal settlements that far outpace those of wealthier urban neighborhoods (Chant and McIlwaine (2016)). In response, a birth center model of maternity care is proposed for informal settlements. Birth centers have been shown to provide high quality, respectful, culturally appropriate care in high resource settings (Stapleton et al. J Midwifery Women's Health 58(1):3-14, 2013; Hodnett et al. Cochrane Database Syst Rev CD000012, 2012; Brocklehurst et al. BMJ 343:d7400, 2011). In this paper, three case studies are described that support the use of this model in low resource, urban settings.

  2. Health information exchange in the wild: the association between organizational capability and perceived utility of clinical event notifications in ambulatory and community care.

    PubMed

    Vest, Joshua R; Ancker, Jessica S

    2017-01-01

Event notifications are real-time, electronic, automatic alerts to providers of their patients' health care encounters at other facilities. Our objective was to examine the effects of organizational capability and related social/organizational issues upon users' perceptions of the impact of event notifications on quality, efficiency, and satisfaction. We surveyed representatives (n = 49) of 10 organizations subscribing to the Bronx Regional Health Information Organization's event notification services about organizational capabilities, notification information quality, perceived usage, perceived impact, and organizational and respondent characteristics. The response rate was 89%. Average item scores were used to create an individual domain summary score. The association between the impact of event notifications and organizational characteristics was modeled using random-intercept logistic regression models. Respondents estimated that organizations followed up on the majority (83%) of event notifications. Supportive organizational policies were associated with the perception that event notifications improved quality of care (odds ratio [OR] = 2.12; 95% CI = 1.05, 4.45), efficiency (OR = 2.06; 95% CI = 1.00, 4.21), and patient satisfaction (OR = 2.56; 95% CI = 1.13, 5.81). Higher quality of event notification information was also associated with a perceived positive impact on quality of care (OR = 2.84; 95% CI = 1.31, 6.12), efficiency (OR = 3.04; 95% CI = 1.38, 6.69), and patient satisfaction (OR = 2.96; 95% CI = 1.25, 7.03). Health care organizations with appropriate processes, workflows, and staff may be better positioned to use event notifications. Additionally, information quality remains critical in users' assessments and perceptions. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
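Odds ratios like those reported above can be computed from a 2x2 table, with an approximate 95% confidence interval on the log-odds scale. The counts below are hypothetical, chosen only to show the mechanics:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and approximate 95% CI for a 2x2 table:
                 outcome+  outcome-
    exposed         a         b
    unexposed       c         d
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: perceived quality improvement among users with
# vs. without supportive organizational policies.
or_, lo, hi = odds_ratio_ci(20, 10, 12, 18)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}, {hi:.2f})")
```

The study's random-intercept models additionally adjust for clustering of respondents within organizations, which this unadjusted calculation does not capture.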

  3. Optimizing cardiothoracic surgery information for a managed care environment.

    PubMed

    Denton, T A; Matloff, J M

    1995-11-01

    The rapid change occurring in American healthcare is a direct response to rising costs. Managed care is the fastest growing model that attempts to control escalating costs through limitations in patient choice, the active use of guidelines, and placing providers at risk. Managed care is an information intensive system, and those providers who use information effectively will be at an advantage in the competitive healthcare marketplace. There are five classes of information that providers must collect to be competitive in a managed care environment: patient satisfaction, medical outcomes, continuous quality improvement, quality of the decision, and financial data. Each of these should be actively used in marketing, assuring the quality of patient care, and maintaining financial stability. Although changes in our healthcare system are occurring rapidly, we need to respond to the marketplace to maintain our viability, but as physicians, we have the singular obligation to maintain the supremacy of the individual patient and the physician-patient relationship.

4. Psychometric analysis and dimensional structure of the Brazilian version of the melasma quality of life scale (MELASQoL-BP)*

    PubMed Central

    Maranzatto, Camila Fernandes Pollo; Miot, Hélio Amante; Miot, Luciane Donida Bartoli; Meneguin, Silmara

    2016-01-01

Background Although asymptomatic, melasma inflicts a significant impact on quality of life. MELASQoL is the main instrument used to assess quality of life associated with melasma; it has been validated in several languages, but its latent dimensional structure and psychometric properties have not been fully explored. Objectives To evaluate the psychometric characteristics, informativeness and dimensional structure of the Brazilian version of MELASQoL. Methods Survey of patients with facial melasma using a socio-demographic questionnaire, DLQI-BRA, MASI and MELASQoL-BP; exploratory and confirmatory factor analysis; and assessment of the internal consistency of MELASQoL and its latent dimensions (Cronbach's alpha). The informativeness of the model and its items was investigated using the Rasch model (ordinal data). Results We evaluated 154 patients; 134 (87%) were female, with a mean age (± SD) of 39 (± 8) years and onset of melasma at 27 (± 8) years. The median (p25-p75) MASI, DLQI and MELASQoL scores were 8 (5-15), 2 (1-6) and 30 (17-44). The correlations (rho) of MELASQoL with DLQI and MASI were 0.70 and 0.36. Exploratory factor analysis identified two latent dimensions (Q1-Q3 and Q4-Q10), which yielded a significantly better-fitting factor structure than the one-dimensional model: χ2/df = 2.03, CFI = 0.95, AGFI = 0.94, RMSEA = 0.08. Cronbach's coefficients for the one-dimensional model and the two factors were 0.95, 0.92 and 0.93. Rasch analysis demonstrated that the use of seven alternatives per item resulted in no increase in model informativeness. Conclusions MELASQoL-BP showed good psychometric performance and a latent structure of two dimensions. We also identified an oversizing of item alternatives for characterizing the information aggregated to each dimension. PMID:27579735
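Cronbach's alpha, used above for internal consistency, can be computed directly from an item-score matrix as α = k/(k−1)·(1 − Σσᵢ²/σ_total²). A minimal sketch with simulated Likert-type responses (not the MELASQoL-BP data):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total score
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical 7-point responses for a 3-item dimension: each item is a
# noisy reading of one latent trait, so consistency should be high.
rng = np.random.default_rng(1)
latent = rng.normal(4, 1.2, size=200)
scores = np.clip(latent[:, None] + rng.normal(0, 0.8, (200, 3)), 1, 7)
alpha = cronbach_alpha(scores)
print(f"alpha = {alpha:.2f}")
```

High alpha across a dimension's items is what justifies summing them into a single subscale score, as done for the two MELASQoL-BP factors.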

  5. Expert decision-making strategies

    NASA Technical Reports Server (NTRS)

    Mosier, Kathleen L.

    1991-01-01

A recognition-primed decision (RPD) model is employed as a framework to investigate crew decision-making processes. The quality of information transfer, a critical component of the team RPD model and an indicator of the team's 'collective consciousness', is measured and analyzed with respect to crew performance. As indicated by the RPD model, the timing and patterns of information search and transfer were expected to reflect extensive and continual situation assessment, and serial evaluation of alternative states of the world or decision response options.

  6. E-Service Quality Evaluation on E-Government Website: Case Study BPJS Kesehatan Indonesia

    NASA Astrophysics Data System (ADS)

    Rasyid, A.; Alfina, I.

    2017-01-01

This research develops a model to evaluate the quality of e-services in e-government. The proposed model consists of seven dimensions: web design, reliability, responsiveness, privacy and security, personalization, information, and ease of use. The model is used to measure the quality of the e-registration service of BPJS Kesehatan, an Indonesian government health insurance program. Validity and reliability testing showed that of the seven dimensions proposed, only four were suitable for the case study. The results show that the BPJS Kesehatan e-registration service performs well on the reliability and responsiveness dimensions, while on the web design and ease of use dimensions the e-service still needs to be optimized.

  7. A Data Model for Teleconsultation in Managing High-Risk Pregnancies: Design and Preliminary Evaluation

    PubMed Central

    Deldar, Kolsoum

    2017-01-01

Background Teleconsultation enables virtual supervision by clinical professors of the clinical decisions made by medical residents in teaching hospitals. The type, format, volume, and quality of exchanged information have a great influence on the quality of remote clinical decisions, or tele-decisions. Thus, it is necessary to develop a reliable, standard model for these clinical interactions. Objective The goal of this study was to design and evaluate a data model for teleconsultation in the management of high-risk pregnancies. Methods This study was implemented in three phases. In the first phase, a systematic review, a qualitative study, and a Delphi approach were conducted in selected teaching hospitals. Systematic extraction and localization of diagnostic items to develop the tele-decision clinical archetypes were performed as the second phase. Finally, the developed model was evaluated using predefined consultation scenarios. Results Our review study showed that present medical consultations have no specific structure or template for patient information exchange. Furthermore, there are many challenges in the remote medical decision-making process, some of which are related to the lack of such a structure. The evaluation phase of our research showed that data quality (P<.001), adequacy (P<.001), organization (P<.001), confidence (P<.001), and convenience (P<.001) scored higher in archetype-based consultation scenarios than in routine-based ones. Conclusions Our archetype-based model achieved higher scores on the data quality, adequacy, organization, confidence, and convenience dimensions than routine scenarios. The suggested archetype-based teleconsultation model may therefore improve the quality of physician-physician remote medical consultations. PMID:29242181

  8. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    NASA Astrophysics Data System (ADS)

    Li, Yu; Oberweis, Andreas

    Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets as basic methodology for business processes modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.

  9. Benefits of using enhanced air quality information in human health studies

    EPA Science Inventory

    The ability of four (4) enhancements of gridded PM2.5 concentrations derived from observations and air quality models to detect the relative risk of long-term exposure to PM2.5 are evaluated with a simulation study. The four enhancements include nearest-nei...

  10. Importance and Challenges in Use and Uptake of Air Quality Modelling in Developing Countries: Use of CAMx for Air Quality Management in the City of Johannesburg.

    NASA Astrophysics Data System (ADS)

    Garland, R. M.; Naidoo, M.; Sibiya, B.; Naidoo, S.; Bird, T.; von Gruenewaldt, R.; Liebenberg-Enslin, H.; Nekhwalivhe, M.; Netshandama, J.; Mahlatji, M.

    2017-12-01

Ambient air pollution levels are regulated in South Africa; however, in many areas pollution concentrations exceed these levels. The South African Air Quality Act also stipulates that government across all levels must have Air Quality Management Plans (AQMP) in place that outline the current state of air quality and emissions, as well as an implementable plan to manage, and where necessary improve, air quality. Historically, dispersion models have been used to support air quality management decisions, including in AQMPs. However, with the focus of air quality management shifting from industrial point sources to a more integrated and holistic management of all sources, chemical transport models are needed. CAMx was used in the review and development of the City of Johannesburg's AQMP to simulate hot spots of air pollution, as well as to model intervention scenarios. As the pollutants of concern in Johannesburg are ozone and particulate matter, it is critical to use a model that can simulate chemistry. CAMx was run at 1 km resolution with a locally derived emissions inventory for 2014. The sources of pollution in the City are diverse (including industrial, vehicle, domestic burning, and natural sources), and many source estimates carry large uncertainties owing to a lack of the necessary data and local emission factors. These uncertainties, together with a lack of measurements against which to validate the model, hinder the model's ability to simulate air quality and thus inform air quality management. However, as air quality worsens in Africa, it is critical for decision makers to have a strong evidence base on the state of air quality and the impact of interventions in order to improve air quality effectively. This presentation will highlight the findings from using a chemical transport model for air quality management in the largest city in South Africa, the use and limitations of these findings for decision-makers, and a proposed way forward.

  11. Quality by control: Towards model predictive control of mammalian cell culture bioprocesses.

    PubMed

    Sommeregger, Wolfgang; Sissolak, Bernhard; Kandra, Kulwant; von Stosch, Moritz; Mayer, Martin; Striedner, Gerald

    2017-07-01

    The industrial production of complex biopharmaceuticals using recombinant mammalian cell lines is still mainly built on a quality by testing approach, which is represented by fixed process conditions and extensive testing of the end-product. In 2004 the FDA launched the process analytical technology initiative, aiming to guide the industry towards advanced process monitoring and better understanding of how critical process parameters affect the critical quality attributes. Implementation of process analytical technology into the bio-production process enables moving from the quality by testing to a more flexible quality by design approach. The application of advanced sensor systems in combination with mathematical modelling techniques offers enhanced process understanding, allows on-line prediction of critical quality attributes and subsequently real-time product quality control. In this review opportunities and unsolved issues on the road to a successful quality by design and dynamic control implementation are discussed. A major focus is directed on the preconditions for the application of model predictive control for mammalian cell culture bioprocesses. Design of experiments providing information about the process dynamics upon parameter change, dynamic process models, on-line process state predictions and powerful software environments seem to be a prerequisite for quality by control realization. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
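The model predictive control idea discussed above can be illustrated on a deliberately tiny example: a scalar linear process model where, at each step, the controller chooses the input that drives the one-step prediction to a setpoint, subject to actuator limits. All numbers are illustrative toys, not a real mammalian cell-culture model:

```python
# Toy one-step model predictive control of x[t+1] = a*x[t] + b*u[t].
a, b = 0.9, 0.5
setpoint = 10.0          # e.g. a target process state
u_min, u_max = 0.0, 5.0  # actuator (input) constraints

def mpc_step(x):
    """Pick u minimizing (x_pred - setpoint)^2, then clip to limits."""
    u = (setpoint - a * x) / b
    return min(max(u, u_min), u_max)

x = 0.0
trajectory = [x]
for _ in range(20):
    u = mpc_step(x)
    x = a * x + b * u    # plant equals the model in this idealized toy
    trajectory.append(x)

print(f"final state: {trajectory[-1]:.3f}")
```

Real quality-by-control applications use multi-step horizons, plant-model mismatch handling, and state estimation from on-line sensors; the sketch only shows the receding-horizon "predict, optimize, apply, repeat" loop in its simplest form.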

  12. Nursing Home Price and Quality Responses to Publicly Reported Quality Information

    PubMed Central

    Clement, Jan P; Bazzoli, Gloria J; Zhao, Mei

    2012-01-01

    Objective To assess whether the release of Nursing Home Compare (NHC) data affected self-pay per diem prices and quality of care. Data Sources Primary data sources are the Annual Survey of Wisconsin Nursing Homes for 2001–2003, Online Survey and Certification Reporting System, NHC, and Area Resource File. Study Design We estimated fixed effects models with robust standard errors of per diem self-pay charge and quality before and after NHC. Principal Findings After NHC, low-quality nursing homes raised their prices by a small but significant amount and decreased their use of restraints but did not reduce pressure sores. Mid-level and high-quality nursing homes did not significantly increase self-pay prices after NHC nor consistently change quality. Conclusions Our findings suggest that the release of quality information affected nursing home behavior, especially pricing and quality decisions among low-quality facilities. Policy makers should continue to monitor quality and prices for self-pay residents and scrutinize low-quality homes over time to see whether they are on a pathway to improve quality. In addition, policy makers should not expect public reporting to result in quick fixes to nursing home quality problems. PMID:22092366

  13. Input variable selection and calibration data selection for storm water quality regression models.

    PubMed

    Sun, Siao; Bertrand-Krajewski, Jean-Luc

    2013-01-01

    Storm water quality models are useful tools in storm water management. Interest has been growing in analyzing existing data to develop models for urban storm water quality evaluations. It is important to select appropriate model inputs when many candidate explanatory variables are available. Model calibration and verification are essential steps in any storm water quality modeling. This study investigates input variable selection and calibration data selection in storm water quality regression models. The two selection problems interact with each other, so a procedure is developed to carry out the two selection tasks in sequence. The procedure first selects model input variables using a cross-validation method. An appropriate number of variables is identified as model inputs to ensure that a model is neither overfitted nor underfitted. Based on the input selection results, calibration data selection is studied. Uncertainty of model performance due to calibration data selection is investigated with a random selection method. A cluster-based approach is then applied to enhance model calibration practice, based on the principle of selecting representative data for calibration. The comparison between results from the cluster selection method and random selection shows that the former can significantly improve the performance of calibrated models. It is found that the information content of the calibration data is important in addition to its size.
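
    The cluster-based calibration-event selection described above can be sketched in a few lines. This is a hypothetical illustration, not the authors' code: a minimal Lloyd's k-means partitions candidate storm events in the space of explanatory variables, and the event nearest each centroid is taken as a representative calibration point.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Minimal Lloyd's k-means: returns cluster labels and centroids."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # distance of every event to every centroid, then reassign
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

def select_calibration_events(X, n_select, seed=0):
    """Pick one representative event per cluster (the one closest to its centroid)."""
    labels, centroids = kmeans(X, n_select, seed=seed)
    chosen = []
    for j in range(n_select):
        members = np.flatnonzero(labels == j)
        if members.size:
            dist = np.linalg.norm(X[members] - centroids[j], axis=1)
            chosen.append(int(members[dist.argmin()]))
    return sorted(set(chosen))

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 4))     # 60 hypothetical storm events, 4 explanatory variables
idx = select_calibration_events(X, 10)
print(len(idx))                  # up to 10 representative events
```

    Random selection, by contrast, would simply subsample rows of X; the clustering step is what guarantees the calibration set spans the range of observed conditions.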

  14. Patient experiences of burn scars in adults and children and development of a health-related quality of life conceptual model: A qualitative study.

    PubMed

    Simons, Megan; Price, Nathaniel; Kimble, Roy; Tyack, Zephanie

    2016-05-01

    The aim of this study was to understand the impact of burn scars on health-related quality of life (HRQOL) from the perspective of adults and children with burn scars, and caregivers to inform the development of a conceptual model of burn scar HRQOL. Twenty-one participants (adults and children) with burn scars and nine caregivers participated in semi-structured, face-to-face interviews between 2012 and 2013. During the interviews, participants were asked to describe features about their (or their child's) burn scars and its impact on everyday life. Two coders conducted thematic analysis, with consensus achieved through discussion and review with a third coder. The literature on HRQOL models was then reviewed to further inform the development of a conceptual model of burn scar HRQOL. Five themes emerged from the qualitative data: 'physical and sensory symptoms', 'impact of burn scar interventions', 'impact of burn scar symptoms', 'personal factors' and 'change over time'. Caregivers offered further insights into family functioning after burn, and the impacts of burn scars and burn scar interventions on family life. In the conceptual model, symptoms (sensory and physical) of burn scars are considered proximal to HRQOL, with distal indicators including functioning (physical, emotional, social, cognitive), individual factors and the environment. Overall quality of life was affected by HRQOL. Understanding the impact of burn scars on HRQOL and the development of a conceptual model will inform future burn scar research and clinical practice. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.

  15. Representing the Effects of Long-Range Transport and Lateral Boundary Conditions in Regional Air Pollution Models

    EPA Science Inventory

    The Community Multiscale Air Quality (CMAQ) modeling system was applied to a domain covering the northern hemisphere; meteorological information was derived from the Weather Research and Forecasting (WRF) model run on identical grid and projection configuration, while the emissio...

  16. Morphologic Quality of DSMs Based on Optical and Radar Space Imagery

    NASA Astrophysics Data System (ADS)

    Sefercik, U. G.; Bayik, C.; Karakis, S.; Jacobsen, K.

    2011-09-01

    Digital Surface Models (DSMs) represent the visible surface of the earth by a height value Z at each X, Y location. The quality of a DSM can be described by its accuracy and its morphologic detail. Both depend upon the input information used, the technique used and the roughness of the terrain. The influence of topographic detail on DSM quality is shown for the test fields Istanbul and Zonguldak. Zonguldak has a rough mountainous character with heights from sea level up to 1640 m, while Istanbul is dominated by rolling hills reaching an elevation of 435 m. DSMs from SPOT-5, the SRTM C-band height models and ASTER GDEM have been investigated. The DSMs have been verified against height models from large-scale aerial photos, which are more accurate and include morphologic details. It was necessary to determine and account for shifts between the height models caused by datum problems and the orientation of the height models. The DSM quality is analyzed as a function of terrain inclination and differs between the two test fields. The morphologic quality depends upon the point spacing of the analyzed DSMs and the terrain characteristics.
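
    The dependence of height accuracy on terrain inclination mentioned above is commonly modelled in photogrammetry with a Koppe-type relation, sigma = a + b * tan(slope). A hypothetical sketch (not the authors' code) fitting a and b to per-point height discrepancies by least squares:

```python
import numpy as np

def fit_koppe(slope_deg, abs_dz):
    """Least-squares fit of |dZ| = a + b*tan(slope), the classical
    Koppe-type model of height accuracy versus terrain inclination."""
    t = np.tan(np.radians(slope_deg))
    A = np.column_stack([np.ones_like(t), t])   # design matrix [1, tan(slope)]
    (a, b), *_ = np.linalg.lstsq(A, abs_dz, rcond=None)
    return a, b

# Synthetic check: discrepancies generated with a = 0.5 m, b = 2.0 m
slope = np.linspace(0.0, 40.0, 50)       # terrain inclination in degrees
dz = 0.5 + 2.0 * np.tan(np.radians(slope))
a, b = fit_koppe(slope, dz)
print(round(float(a), 3), round(float(b), 3))   # recovers 0.5 and 2.0
```

    In practice the discrepancies would come from differencing the investigated DSM against the reference aerial-photo height model after removing datum shifts.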

  17. Near infrared spectroscopy based monitoring of extraction processes of raw material with the help of dynamic predictive modeling

    NASA Astrophysics Data System (ADS)

    Wang, Haixia; Suo, Tongchuan; Wu, Xiaolin; Zhang, Yue; Wang, Chunhua; Yu, Heshui; Li, Zheng

    2018-03-01

    The control of batch-to-batch quality variations remains a challenging task for pharmaceutical industries, e.g., traditional Chinese medicine (TCM) manufacturing. One difficult problem is to produce pharmaceutical products with consistent quality from raw material with large quality variations. In this paper, an integrated methodology combining near infrared spectroscopy (NIRS) and dynamic predictive modeling is developed for the monitoring and control of the batch extraction process of licorice. With the spectral data in hand, the initial state of the process is first estimated with a state-space model to construct a process monitoring strategy for the early detection of variations induced by the initial process inputs such as raw materials. Second, the quality property of the end product is predicted at mid-course during the extraction process with a partial least squares (PLS) model. The batch-end-time (BET) is then adjusted accordingly to minimize the quality variations. In conclusion, our study shows that with the help of dynamic predictive modeling, NIRS can offer past and future information about the process, which enables more accurate monitoring and control of process performance and product quality.
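
    The PLS prediction step can be illustrated with a minimal single-response NIPALS implementation; this is a generic sketch on invented data, not the authors' model (their inputs would be mid-course NIR spectra and the response an end-product quality attribute).

```python
import numpy as np

def pls1_fit(X, y, n_comp):
    """PLS1 via NIPALS. Returns regression coefficients plus the means
    needed to form the intercept: (B, x_mean, y_mean)."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xr, yr = X - x_mean, y - y_mean
    W, P, Q = [], [], []
    for _ in range(n_comp):
        w = Xr.T @ yr
        w = w / np.linalg.norm(w)     # weight vector
        t = Xr @ w                    # scores
        tt = t @ t
        p = Xr.T @ t / tt             # X loadings
        q = yr @ t / tt               # y loading
        Xr = Xr - np.outer(t, p)      # deflate spectra
        yr = yr - q * t               # deflate response
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    B = W @ np.linalg.solve(P.T @ W, Q)
    return B, x_mean, y_mean

def pls1_predict(X, B, x_mean, y_mean):
    return (X - x_mean) @ B + y_mean

# Invented "spectra" (40 batches x 6 wavelengths) with a noiseless linear response
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 6))
beta = np.array([1.0, -0.5, 0.0, 2.0, 0.3, -1.2])
y = X @ beta + 5.0
B, xm, ym = pls1_fit(X, y, n_comp=6)
pred = pls1_predict(X, B, xm, ym)
print(float(np.max(np.abs(pred - y))))   # near zero on this noiseless data
```

    With fewer components than wavelengths, PLS trades a little fit for robustness against the collinearity typical of NIR spectra, which is why it is preferred over ordinary regression in this setting.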

  18. Scan-To Output Validation: Towards a Standardized Geometric Quality Assessment of Building Information Models Based on Point Clouds

    NASA Astrophysics Data System (ADS)

    Bonduel, M.; Bassier, M.; Vergauwen, M.; Pauwels, P.; Klein, R.

    2017-11-01

    The use of Building Information Modeling (BIM) for existing buildings based on point clouds is increasing. Standardized geometric quality assessment of the BIMs is needed to make them more reliable and thus reusable for future users. First, available literature on the subject is studied. Next, an initial proposal for a standardized geometric quality assessment is presented. Finally, this method is tested and evaluated with a case study. The number of specifications on BIM relating to existing buildings is limited. The Levels of Accuracy (LOA) specification of the USIBD provides definitions and suggestions regarding geometric model accuracy, but lacks a standardized assessment method. A deviation analysis is found to be dependent on (1) the used mathematical model, (2) the density of the point clouds and (3) the order of comparison. Results of the analysis can be graphical and numerical. An analysis on macro (building) and micro (BIM object) scale is necessary. On macro scale, the complete model is compared to the original point cloud and vice versa to get an overview of the general model quality. The graphical results show occluded zones and non-modeled objects respectively. Colored point clouds are derived from this analysis and integrated in the BIM. On micro scale, the relevant surface parts are extracted per BIM object and compared to the complete point cloud. Occluded zones are extracted based on a maximum deviation. What remains is classified according to the LOA specification. The numerical results are integrated in the BIM with the use of object parameters.
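
    The deviation analysis described above reduces to nearest-neighbour distances between point cloud and model, followed by bucketing into accuracy bands. A hypothetical sketch (brute-force distances; the band thresholds below are illustrative only and are not the normative USIBD LOA values):

```python
import numpy as np

def deviations(cloud, model_pts):
    """Per-point deviation: distance from each point-cloud point to its
    nearest model point (brute force; a k-d tree would scale better)."""
    d = np.linalg.norm(cloud[:, None, :] - model_pts[None, :, :], axis=2)
    return d.min(axis=1)

def classify(dev_m, bands):
    """Bucket deviations (metres) into accuracy bands given as
    (label, upper_limit) pairs sorted from tightest to loosest."""
    labels = []
    for d in dev_m:
        for label, limit in bands:
            if d <= limit:
                labels.append(label)
                break
        else:
            labels.append("out-of-spec")
    return labels

# Illustrative thresholds only
BANDS = [("tight", 0.005), ("medium", 0.015), ("loose", 0.050)]

model_pts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
cloud = np.array([[0.002, 0.0, 0.0], [1.0, 0.012, 0.0], [0.5, 0.0, 0.2]])
print(classify(deviations(cloud, model_pts), BANDS))
```

    Points landing in the "out-of-spec" bucket correspond to the occluded zones and non-modeled objects the abstract mentions; in the proposed workflow these would be extracted before LOA classification rather than counted against the model.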

  19. Can High Quality Overcome Consumer Resistance to Restricted Provider Access? Evidence from a Health Plan Choice Experiment

    PubMed Central

    Harris, Katherine M

    2002-01-01

    Objective To investigate the impact of quality information on the willingness of consumers to enroll in health plans that restrict provider access. Data Sources and Setting A survey administered to respondents between the ages of 25 and 64 in the West Los Angeles area with private health insurance. Study Design An experimental approach is used to measure the effect of variation in provider network features and information about the quality of network physicians on hypothetical plan choices. Conditional logit models are used to analyze the experimental choice data. Next, choice model parameter estimates are used to simulate the impact of changes in plan features on the market shares of competing health plans and to calculate the quality level required to make consumers indifferent to changes in provider access. Principal Findings The presence of quality information reduced the importance of provider network features in plan choices as hypothesized. However, there were no statistically meaningful differences by type of quality measure (i.e., consumer assessed versus expert assessed). The results imply that large quality differences are required to make consumers indifferent to changes in provider access. The impact of quality on plan choices depended more on the particular measure and less on the type of measure. Quality ratings based on the proportion of survey respondents “extremely satisfied with results of care” had the greatest impact on plan choice while the proportion of network doctors “affiliated with university medical centers” had the least. Other consumer and expert assessed measures had more comparable effects. Conclusions Overall, the results provide empirical evidence that consumers are willing to accept restrictions on provider access in exchange for high quality.
This willingness to trade implies that relatively small plans that place restrictions on provider access can successfully compete against less restrictive plans when they can demonstrate high quality. However, the results of this study suggest that in many cases, the level of quality required for consumers to accept access restrictions may be so high as to be unattainable. The results provide empirical support for the current focus of decision support efforts on consumer assessed quality measures. At the same time, however, the results suggest that consumers would also value quality measures based on expert assessments. This finding is relevant given the lack of comparative quality information based on expert judgment and research suggesting that consumers have apprehensions about their ability to meaningfully interpret performance-based quality measures. PMID:12132595
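
    The share-simulation step generalizes easily: under a conditional logit, predicted market shares follow the softmax of plan utilities, and the "compensating quality" is the rating that equates a restricted plan's utility with an unrestricted one's. A hypothetical two-plan sketch with invented coefficients (not the study's estimates):

```python
import math

def logit_shares(utilities):
    """Conditional-logit choice probabilities (softmax over plan utilities)."""
    m = max(utilities)
    expu = [math.exp(u - m) for u in utilities]
    s = sum(expu)
    return [e / s for e in expu]

# Invented preference weights: consumers dislike network restrictions
# and value quality ratings (on a 0-1 scale).
b_restricted = -1.2   # disutility of a restricted provider network
b_quality = 2.0       # marginal utility of the quality rating

def utility(restricted, quality):
    return b_restricted * restricted + b_quality * quality

# Plan A: open network, average quality; Plan B: restricted, same quality.
shares = logit_shares([utility(0, 0.5), utility(1, 0.5)])
print([round(s, 3) for s in shares])   # the open-network plan dominates

# Compensating quality: rating q* at which Plan B matches Plan A,
# i.e. b_restricted + b_quality*q* = b_quality*0.5
q_star = 0.5 - b_restricted / b_quality
print(q_star)   # 1.1, above the top of the scale: unattainable
```

    With these invented numbers the required rating exceeds the maximum of the scale, mirroring the paper's conclusion that the quality level needed to offset access restrictions may be unattainably high.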

  20. Tools for proactive collection and use of quality metadata in GEOSS

    NASA Astrophysics Data System (ADS)

    Bastin, L.; Thum, S.; Maso, J.; Yang, K. X.; Nüst, D.; Van den Broek, M.; Lush, V.; Papeschi, F.; Riverola, A.

    2012-12-01

    The GEOSS Common Infrastructure allows interactive evaluation and selection of Earth Observation datasets by the scientific community and decision makers, but the data quality information needed to assess fitness for use is often patchy and hard to visualise when comparing candidate datasets. In a number of studies over the past decade, users repeatedly identified the same types of gaps in quality metadata, specifying the need for enhancements such as peer and expert review, better traceability and provenance information, information on citations and usage of a dataset, warning about problems identified with a dataset and potential workarounds, and 'soft knowledge' from data producers (e.g. recommendations for use which are not easily encoded using the existing standards). Despite clear identification of these issues in a number of recommendations, the gaps persist in practice and are highlighted once more in our own, more recent, surveys. This continuing deficit may well be the result of a historic paucity of tools to support the easy documentation and continual review of dataset quality. However, more recent developments in tools and standards, as well as more general technological advances, present the opportunity for a community of scientific users to adopt a more proactive attitude by commenting on their uses of data, and for that feedback to be federated with more traditional and static forms of metadata, allowing a user to more accurately assess the suitability of a dataset for their own specific context and reliability thresholds. The EU FP7 GeoViQua project aims to develop this opportunity by adding data quality representations to the existing search and visualisation functionalities of the Geo Portal. Subsequently we will help to close the gap by providing tools to easily create quality information, and to permit user-friendly exploration of that information as the ultimate incentive for improved data quality documentation. 
Quality information is derived from producer metadata, from the data themselves, from validation of in-situ sensor data, from provenance information and from user feedback, and will be aggregated to produce clear and useful summaries of quality, including a GEO Label. GeoViQua's conceptual quality information models for users and producers are specifically described and illustrated in this presentation. These models (which have been encoded as XML schemas and can be accessed at http://schemas.geoviqua.org/) are designed to satisfy the identified user needs while remaining consistent with current standards such as ISO 19115 and advanced drafts such as ISO 19157. The resulting components being developed for the GEO Portal are designed to lower the entry barrier to users who wish to help to generate and explore rich and useful metadata. This metadata will include reviews, comments and ratings, reports of usage in specific domains and specification of datasets used for benchmarking, as well as rich quantitative information encoded in more traditional data quality elements such as thematic correctness and positional accuracy. The value of the enriched metadata will also be enhanced by graphical tools for visualizing spatially distributed uncertainties. We demonstrate practical example applications in selected environmental application domains.

  1. Effects of natural and human factors on groundwater quality of basin-fill aquifers in the southwestern United States-conceptual models for selected contaminants

    USGS Publications Warehouse

    Bexfield, Laura M.; Thiros, Susan A.; Anning, David W.; Huntington, Jena M.; McKinney, Tim S.

    2011-01-01

    As part of the U.S. Geological Survey National Water-Quality Assessment (NAWQA) Program, the Southwest Principal Aquifers (SWPA) study is building a better understanding of the factors that affect water quality in basin-fill aquifers in the Southwestern United States. The SWPA study area includes four principal aquifers of the United States: the Basin and Range basin-fill aquifers in California, Nevada, Utah, and Arizona; the Rio Grande aquifer system in New Mexico and Colorado; and the California Coastal Basin and Central Valley aquifer systems in California. Similarities in the hydrogeology, land- and water-use practices, and water-quality issues for alluvial basins within the study area allow for regional analysis through synthesis of the baseline knowledge of groundwater-quality conditions in basins previously studied by the NAWQA Program. Resulting improvements in the understanding of the sources, movement, and fate of contaminants are assisting in the development of tools used to assess aquifer susceptibility and vulnerability.This report synthesizes previously published information about the groundwater systems and water quality of 15 information-rich basin-fill aquifers (SWPA case-study basins) into conceptual models of the primary natural and human factors commonly affecting groundwater quality with respect to selected contaminants, thereby helping to build a regional understanding of the susceptibility and vulnerability of basin-fill aquifers to those contaminants. Four relatively common contaminants (dissolved solids, nitrate, arsenic, and uranium) and two contaminant classes (volatile organic compounds (VOCs) and pesticide compounds) were investigated for sources and controls affecting their occurrence and distribution above specified levels of concern in groundwater of the case-study basins. Conceptual models of factors that are important to aquifer vulnerability with respect to those contaminants and contaminant classes were subsequently formed. 
The conceptual models are intended in part to provide a foundation for subsequent development of regional-scale statistical models that relate specific constituent concentrations or occurrence in groundwater to natural and human factors.

  2. Investigating the impact of mindfulness meditation training on working memory: a mathematical modeling approach.

    PubMed

    van Vugt, Marieke K; Jha, Amishi P

    2011-09-01

    We investigated whether mindfulness training (MT) influences information processing in a working memory task with complex visual stimuli. Participants were tested before (T1) and after (T2) participation in an intensive one-month MT retreat, and their performance was compared with that of an age- and education-matched control group. Accuracy did not differ across groups at either time point. Response times were faster and significantly less variable in the MT versus the control group at T2. Since these results could be due to changes in mnemonic processes, speed-accuracy trade-off, or nondecisional factors (e.g., motor execution), we used a mathematical modeling approach to disentangle these factors. The EZ-diffusion model (Wagenmakers, van der Maas, & Grasman, Psychonomic Bulletin & Review, 14(1), 3-22, 2007) suggested that MT leads to improved information quality and reduced response conservativeness, with no changes in nondecisional factors. The noisy exemplar model further suggested that the increase in information quality reflected a decrease in encoding noise and not an increase in forgetting. Thus, mathematical modeling may help clarify the mechanisms by which MT produces salutary effects on performance.
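
    The EZ-diffusion model cited above has a closed form: it maps accuracy (Pc), response-time variance (VRT) and mean (MRT) to drift rate, boundary separation and nondecision time. A minimal sketch following Wagenmakers et al. (2007); the edge correction needed when Pc equals 0.5 or 1.0 is omitted for brevity.

```python
import math

def ez_diffusion(pc, vrt, mrt, s=0.1):
    """Closed-form EZ-diffusion estimates.

    pc  : proportion correct (strictly between 0.5 and 1.0 here)
    vrt : variance of correct response times (s^2)
    mrt : mean of correct response times (s)
    s   : scaling parameter (0.1 by convention)
    Returns (drift rate v, boundary separation a, nondecision time Ter).
    """
    L = math.log(pc / (1.0 - pc))                  # logit of accuracy
    x = L * (L * pc**2 - L * pc + pc - 0.5) / vrt
    v = math.copysign(s * x**0.25, pc - 0.5)       # drift rate
    a = s**2 * L / v                               # boundary separation
    y = -v * a / s**2
    mdt = (a / (2.0 * v)) * (1.0 - math.exp(y)) / (1.0 + math.exp(y))
    return v, a, mrt - mdt                         # Ter = MRT - mean decision time

# Worked example from the original EZ-diffusion paper:
v, a, ter = ez_diffusion(pc=0.802, vrt=0.112, mrt=0.723)
print(round(v, 3), round(a, 3), round(ter, 3))
```

    In the study's terms, "information quality" corresponds to the drift rate v and "response conservativeness" to the boundary separation a, so the two effects of MT fall directly out of these three summary statistics.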

  3. Ground-water models for water resource planning

    USGS Publications Warehouse

    Moore, J.E.

    1983-01-01

    In the past decade hydrogeologists have emphasized the development of computer-based mathematical models to aid in the understanding of flow, the transport of solutes, transport of heat, and deformation in the ground-water system. These models have been used to provide information and predictions for water managers. Too frequently, ground-water was neglected in water resource planning because managers believed that it could not be adequately evaluated in terms of availability, quality, and effect of development on surface-water supplies. Now, however, with newly developed digital ground-water models, effects of development can be predicted. Such models have been used to predict hydrologic and quality changes under different stresses. These models have grown in complexity over the last ten years from simple one-layer models to three-dimensional simulations of ground-water flow, which may include solute transport, heat transport, effects of land subsidence, and encroachment of saltwater. Case histories illustrate how predictive ground-water models have provided the information needed for the sound planning and management of water resources in the USA. © 1983 D. Reidel Publishing Company.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bell, L.; Castaldi, A.; Jones, C.

    The ultimate goal of the project is to develop procedures, techniques, data and other information that will aid in the design of cost effective and energy efficient drying processes that produce high quality foods. This objective has been sought by performing studies to determine the pertinent properties of food products, by developing models to describe the fundamental phenomena of food drying and by testing the models at laboratory scale. Finally, this information is used to develop recommendations and strategies for improved dryer design and control. This volume emphasizes a detailed literature review and several extensive experimental studies. Since the basic principle of food dehydration is the removal of water from food, the process of removing water causes quality changes which can be categorized as physical, chemical, and nutritional. These changes often have adverse effects on the quality of the resulting dehydrated food. In this work, the types of physical and chemical changes common in food drying and the important factors for them were reviewed. Pertinent kinetic models and kinetic data reported in the literature were also collected and compiled as the results of the review study. The overall objectives of this study were to identify major quality changes in foods caused by the drying process and to gain knowledge of the relationship between the quality changes and the factors known to affect them. The quality parameters reviewed included: browning, lipid oxidation, color loss, shrinkage, solubility, texture, aroma and flavor, vitamin and protein loss and microbiological concerns. 54 refs., 74 figs., 49 tabs.

  5. Nowcasting recreational water quality

    USGS Publications Warehouse

    Boehm, Alexandria B.; Whitman, Richard L.; Nevers, Meredith; Hou, Deyi; Weisberg, Stephen B.

    2007-01-01

    Advances in molecular techniques may soon provide new opportunities to provide more timely information on whether recreational beaches are free from fecal contamination. However, an alternative approach is the use of predictive models. This chapter presents a summary of these developing efforts. First, we describe documented physical, chemical, and biological factors that have been demonstrated by researchers to affect bacterial concentrations at beaches and thus represent logical parameters for inclusion in a model. Then, we illustrate how various types of models can be applied to predict water quality at freshwater and marine beaches.

  6. Study on a pattern classification method of soil quality based on simplified learning sample dataset

    USGS Publications Warehouse

    Zhang, Jiahua; Liu, S.; Hu, Y.; Tian, Y.

    2011-01-01

    Based on the massive soil information involved in current soil quality grade evaluation, this paper constructs an intelligent classification approach for soil quality grade based on classical sampling techniques and a disordered multiclassification Logistic regression model. As a case study, the learning sample capacity was determined under a given confidence level and estimation accuracy, and a c-means algorithm was used to automatically extract a simplified learning sample dataset from the cultivated soil quality grade evaluation database for the study area, Longchuan County in Guangdong Province. A disordered Logistic classifier model was then built and the calculation and analysis steps of soil quality grade intelligent classification were given. The results indicated that the soil quality grade can be effectively learned and predicted from the extracted simplified dataset through this method, which changes the traditional method of soil quality grade evaluation. © 2011 IEEE.

  7. A spatial model to aggregate point-source and nonpoint-source water-quality data for large areas

    USGS Publications Warehouse

    White, D.A.; Smith, R.A.; Price, C.V.; Alexander, R.B.; Robinson, K.W.

    1992-01-01

    More objective and consistent methods are needed to assess water quality over large areas. A spatial model, one that capitalizes on the topologic relationships among spatial entities, is described for aggregating pollution sources from upstream drainage areas; it can be implemented on land surfaces having heterogeneous water-pollution effects. An infrastructure of stream networks and drainage basins, derived from 1:250,000-scale digital elevation models, defines the hydrologic system in this spatial model. The spatial relationships between point- and nonpoint-pollution sources and measurement locations are referenced to the hydrologic infrastructure with the aid of a geographic information system. A maximum-branching algorithm has been developed to simulate the effect of distance from a pollutant source to an arbitrary downstream location, a function traditionally employed in deterministic water quality models. © 1992.
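
    The distance-dependent aggregation idea can be sketched on a toy stream network. This is a hypothetical illustration (network, loads and decay constant all invented, and a simple first-order decay stands in for the paper's distance function): each source load is attenuated by exp(-k * flow distance) along its path to the evaluation point and the delivered loads are summed.

```python
import math

# Hypothetical stream network: node -> (downstream node, reach length in km)
downstream = {"A": ("C", 2.0), "B": ("C", 3.0), "C": ("OUT", 1.0)}

def distance_to(node, target):
    """Flow-path distance from a node to a downstream target."""
    d = 0.0
    while node != target:
        node, length = downstream[node]
        d += length
    return d

def delivered_load(source_loads, target, k=0.1):
    """Aggregate upstream source loads at `target`, attenuated by
    first-order decay exp(-k * flow distance)."""
    return sum(load * math.exp(-k * distance_to(node, target))
               for node, load in source_loads.items())

loads = {"A": 10.0, "B": 5.0}          # e.g. kg/day from two point sources
print(round(delivered_load(loads, "OUT"), 2))
```

    In the actual model the network topology would come from the 1:250,000-scale DEM-derived stream network rather than a hand-built dictionary.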

  8. Using climate models to estimate the quality of global observational data sets.

    PubMed

    Massonnet, François; Bellprat, Omar; Guemas, Virginie; Doblas-Reyes, Francisco J

    2016-10-28

    Observational estimates of the climate system are essential to monitoring and understanding ongoing climate change and to assessing the quality of climate models used to produce near- and long-term climate information. This study poses the dual and unconventional question: Can climate models be used to assess the quality of observational references? We show that this question not only rests on solid theoretical grounds but also offers insightful applications in practice. By comparing four observational products of sea surface temperature with a large multimodel climate forecast ensemble, we find compelling evidence that models systematically score better against the most recent, advanced, but also most independent product. These results call for generalized procedures of model-observation comparison and provide guidance for a more objective observational data set selection. Copyright © 2016, American Association for the Advancement of Science.
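
    One simple way to operationalize the comparison described above is to score each observational product by its mean RMSE against the members of a forecast ensemble; the product with systematic, model-independent errors scores worse. A hypothetical sketch in an idealized setup where ensemble members scatter around an unknown truth (not the study's actual verification procedure):

```python
import numpy as np

def ensemble_score(product, ensemble):
    """Mean RMSE of a gridded product against each ensemble member.
    Lower is better."""
    rmses = [np.sqrt(np.mean((product - member) ** 2)) for member in ensemble]
    return float(np.mean(rmses))

rng = np.random.default_rng(0)
truth = rng.normal(size=500)                               # unknown SST field
ensemble = [truth + rng.normal(scale=0.3, size=500) for _ in range(20)]
product_good = truth + rng.normal(scale=0.05, size=500)    # low-error product
product_biased = truth + 1.0                               # product with a 1 K bias
print(ensemble_score(product_good, ensemble) < ensemble_score(product_biased, ensemble))
```

    The key caveat, which the paper addresses, is independence: if a product was assimilated into the models, this score is optimistically biased in its favour.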

  9. Comparison of modelling accuracy with and without exploiting automated optical monitoring information in predicting the treated wastewater quality.

    PubMed

    Tomperi, Jani; Leiviskä, Kauko

    2018-06-01

    Traditionally, modelling of an activated sludge process has been based solely on the process measurements, but as interest in optically monitoring wastewater samples to characterize floc morphology has increased, in recent years the results of image analyses have been more frequently utilized to predict the characteristics of wastewater. This study shows that neither the traditional process measurements nor the automated optical monitoring variables by themselves are capable of producing the best predictive models for the treated wastewater quality in a full-scale wastewater treatment plant; utilizing these variables together yields the optimal models, which show the level of, and changes in, the treated wastewater quality. With this early warning, process operation can be optimized to avoid environmental damage and economic losses. The study also shows that specific optical monitoring variables are important in modelling a certain quality parameter, regardless of the other input variables available.

  10. A mass-density model can account for the size-weight illusion

    PubMed Central

    Bergmann Tiest, Wouter M.; Drewing, Knut

    2018-01-01

    When judging the heaviness of two objects with equal mass, people perceive the smaller and denser of the two as being heavier. Despite the large number of theories, covering bottom-up and top-down approaches, none of them can fully account for all aspects of this size-weight illusion and thus for human heaviness perception. Here we propose a new maximum-likelihood estimation model which describes the illusion as the weighted average of two heaviness estimates with correlated noise: One estimate derived from the object’s mass, and the other from the object’s density, with estimates’ weights based on their relative reliabilities. While information about mass can directly be perceived, information about density will in some cases first have to be derived from mass and volume. However, according to our model at the crucial perceptual level, heaviness judgments will be biased by the objects’ density, not by its size. In two magnitude estimation experiments, we tested model predictions for the visual and the haptic size-weight illusion. Participants lifted objects which varied in mass and density. We additionally varied the reliability of the density estimate by varying the quality of either visual (Experiment 1) or haptic (Experiment 2) volume information. As predicted, with increasing quality of volume information, heaviness judgments were increasingly biased towards the object’s density: Objects of the same density were perceived as more similar and big objects were perceived as increasingly lighter than small (denser) objects of the same mass. This perceived difference increased with an increasing difference in density. In an additional two-alternative forced choice heaviness experiment, we replicated that the illusion strength increased with the quality of volume information (Experiment 3). 
Overall, the results highly corroborate our model, which seems promising as a starting point for a unifying framework for the size-weight illusion and human heaviness perception. PMID:29447183

  11. A mass-density model can account for the size-weight illusion.

    PubMed

    Wolf, Christian; Bergmann Tiest, Wouter M; Drewing, Knut

    2018-01-01

    When judging the heaviness of two objects with equal mass, people perceive the smaller and denser of the two as being heavier. Despite the large number of theories, covering bottom-up and top-down approaches, none of them can fully account for all aspects of this size-weight illusion and thus for human heaviness perception. Here we propose a new maximum-likelihood estimation model which describes the illusion as the weighted average of two heaviness estimates with correlated noise: One estimate derived from the object's mass, and the other from the object's density, with estimates' weights based on their relative reliabilities. While information about mass can directly be perceived, information about density will in some cases first have to be derived from mass and volume. However, according to our model at the crucial perceptual level, heaviness judgments will be biased by the objects' density, not by its size. In two magnitude estimation experiments, we tested model predictions for the visual and the haptic size-weight illusion. Participants lifted objects which varied in mass and density. We additionally varied the reliability of the density estimate by varying the quality of either visual (Experiment 1) or haptic (Experiment 2) volume information. As predicted, with increasing quality of volume information, heaviness judgments were increasingly biased towards the object's density: Objects of the same density were perceived as more similar and big objects were perceived as increasingly lighter than small (denser) objects of the same mass. This perceived difference increased with an increasing difference in density. In an additional two-alternative forced choice heaviness experiment, we replicated that the illusion strength increased with the quality of volume information (Experiment 3). 
Overall, the results highly corroborate our model, which seems promising as a starting point for a unifying framework for the size-weight illusion and human heaviness perception.
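
    The core of the model above is standard reliability-weighted cue combination: each heaviness estimate is weighted by its inverse variance. A minimal sketch, assuming independent noise for simplicity (the paper's full model additionally allows correlated noise between the two estimates) and using invented numbers:

```python
def combined_heaviness(mass_est, density_est, var_mass, var_density):
    """Reliability-weighted average of the mass and density heaviness cues."""
    r_mass = 1.0 / var_mass          # reliability = inverse variance
    r_density = 1.0 / var_density
    w_mass = r_mass / (r_mass + r_density)
    return w_mass * mass_est + (1.0 - w_mass) * density_est

# Poor volume information makes the density cue noisy, so the percept
# tracks mass; good volume information pulls it toward density.
poor_volume = combined_heaviness(1.0, 2.0, var_mass=0.1, var_density=1.0)
good_volume = combined_heaviness(1.0, 2.0, var_mass=0.1, var_density=0.1)
assert poor_volume < good_volume
```

    Shrinking the density cue's variance corresponds to improving the quality of volume information, which is exactly the manipulation used in Experiments 1-3.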

  12. The complexity of earth observation valuation: Modeling the patterns and processes of agricultural production and groundwater quality to construct a production possibilities frontier

    NASA Astrophysics Data System (ADS)

    Forney, W.; Raunikar, R. P.; Bernknopf, R.; Mishra, S.

    2012-12-01

    A production possibilities frontier (PPF) is a graph comparing the production interdependencies for two commodities. In this case, the commodities are defined as the ecosystem services of agricultural production and groundwater quality. This presentation focuses on the refinement of techniques used in an application to estimate the value of remote sensing information. Value of information focuses on the use of uncertain and varying qualities of information within a specific decision-making context for a certain application, which in this case included land use, biogeochemical, hydrogeologic, economic and geospatial data and models. The refined techniques include deriving alternate patterns and processes of ecosystem functions, new estimates of ecosystem service values to construct a PPF, and the extension of this work into decision support systems. We have coupled earth observations of agricultural production with groundwater quality measurements to estimate the value of remote sensing information in northeastern Iowa to be $857M ± $198M (at the 2010 price level) per year. We will present an improved method for modeling crop rotation patterns to include multiple years of rotation, reduction in the assumptions associated with optimal land use allocations, and prioritized improvement of the resolution of input data (for example, soil resources and topography). The prioritization focuses on watersheds that were identified at a coarse scale of analysis to have higher intensities of agricultural production and lower probabilities of groundwater survivability (in other words, remaining below a regulatory threshold for nitrate pollution) over time, and thus require finer-scaled modeling and analysis. These improved techniques, together with the simulation of certain scale-dependent policy and management actions that trade off the objectives of optimizing crop value versus maintaining potable groundwater, provide new estimates for the empirical values of the PPF.
The calculation of a PPF in this way provides a decision maker with a tool to consider the ramifications of different policies, management practices and regional objectives.
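
    In this setting the empirical PPF is, in effect, the set of land-use allocations that no other allocation beats in both objectives at once. A hypothetical sketch with invented (crop value, groundwater quality) pairs; the study derives such pairs from its coupled land-use, biogeochemical, and economic models:

```python
def pareto_frontier(points):
    """Keep the (crop value, groundwater quality) pairs that no other
    allocation dominates in both objectives."""
    return sorted(
        (c, w) for c, w in points
        if not any(c2 >= c and w2 >= w and (c2, w2) != (c, w)
                   for c2, w2 in points))

allocations = [(10, 90), (40, 80), (60, 55), (55, 50), (80, 20), (30, 60)]
print(pareto_frontier(allocations))  # [(10, 90), (40, 80), (60, 55), (80, 20)]
```

    A decision maker reading the frontier left to right sees exactly the trade-off the abstract describes: more crop value can only be had at the cost of groundwater quality.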

  13. Creating quality improvement culture in public health agencies.

    PubMed

    Davis, Mary V; Mahanna, Elizabeth; Joly, Brenda; Zelek, Michael; Riley, William; Verma, Pooja; Fisher, Jessica Solomon

    2014-01-01

    We conducted case studies of 10 agencies that participated in early quality improvement efforts. The agencies participated in a project conducted by the National Association of County and City Health Officials (2007-2008). Case study participants included health directors and quality improvement team leaders and members. We implemented multiple qualitative analysis processes, including cross-case analysis and logic modeling. We categorized agencies according to the extent to which they had developed a quality improvement culture. Agencies were conducting informal quality improvement projects (n = 4), conducting formal quality improvement projects (n = 3), or creating a quality improvement culture (n = 4). Agencies conducting formal quality improvement and creating a quality improvement culture had leadership support for quality improvement, participated in national quality improvement initiatives, had a greater number of staff trained in quality improvement, and had quality improvement teams with decision-making authority that met regularly. Agencies conducting informal quality improvement were likely to report that accreditation is the major driver for quality improvement work. Agencies creating a quality improvement culture were more likely to have a history of evidence-based decision-making and use quality improvement to address emerging issues. Our findings support previous research and add the roles of national public health accreditation and emerging issues as factors in agencies' ability to create and sustain a quality improvement culture.

  14. Identification of Long Bone Fractures in Radiology Reports Using Natural Language Processing to support Healthcare Quality Improvement.

    PubMed

    Grundmeier, Robert W; Masino, Aaron J; Casper, T Charles; Dean, Jonathan M; Bell, Jamie; Enriquez, Rene; Deakyne, Sara; Chamberlain, James M; Alpern, Elizabeth R

    2016-11-09

    Important information to support healthcare quality improvement is often recorded in free text documents such as radiology reports. Natural language processing (NLP) methods may help extract this information, but these methods have rarely been applied outside the research laboratories where they were developed. To implement and validate NLP tools to identify long bone fractures for pediatric emergency medicine quality improvement. Using freely available statistical software packages, we implemented NLP methods to identify long bone fractures from radiology reports. A sample of 1,000 radiology reports was used to construct three candidate classification models. A test set of 500 reports was used to validate the model performance. Blinded manual review of radiology reports by two independent physicians provided the reference standard. Each radiology report was segmented and word stem and bigram features were constructed. Common English "stop words" and rare features were excluded. We used 10-fold cross-validation to select optimal configuration parameters for each model. Accuracy, recall, precision and the F1 score were calculated. The final model was compared to the use of diagnosis codes for the identification of patients with long bone fractures. There were 329 unique word stems and 344 bigrams in the training documents. A support vector machine classifier with Gaussian kernel performed best on the test set with accuracy=0.958, recall=0.969, precision=0.940, and F1 score=0.954. Optimal parameters for this model were cost=4 and gamma=0.005. The three classification models that we tested all performed better than diagnosis codes in terms of accuracy, precision, and F1 score (diagnosis code accuracy=0.932, recall=0.960, precision=0.896, and F1 score=0.927). NLP methods using a corpus of 1,000 training documents accurately identified acute long bone fractures from radiology reports. 
Strategic use of straightforward NLP methods, implemented with freely available software, offers quality improvement teams new opportunities to extract information from narrative documents.
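
    The feature-construction step described above can be sketched in a few lines. The stop-word list and report text here are illustrative, and lowercased tokens stand in for true word stems; the study fed such unigram and bigram features (after pruning rare ones) to a Gaussian-kernel support vector machine with cost=4 and gamma=0.005:

```python
# Illustrative stop-word list; the study used a standard English one.
STOP_WORDS = {"the", "of", "a", "is", "no", "there", "in"}

def features(report):
    """Unigram and bigram features of a radiology report, stop words dropped."""
    tokens = [t for t in report.lower().split() if t not in STOP_WORDS]
    unigrams = set(tokens)
    bigrams = {f"{a}_{b}" for a, b in zip(tokens, tokens[1:])}
    return unigrams | bigrams

print(sorted(features("There is a transverse fracture of the distal radius")))
```

    Each report's feature set becomes one row of the document-term matrix on which the classifier is trained and cross-validated.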

  15. Computing and Interpreting Fisher Information as a Metric of Sustainability: Regime Changes in the United States Air Quality

    EPA Science Inventory

    As a key tool in information theory, Fisher Information has been used to explore the observable behavior of a variety of systems. In particular, recent work has demonstrated its ability to assess the dynamic order of real and model systems. However, in order to solidify the use o...

  16. Understanding User Resistance to Information Technology: Toward a Comprehensive Model in Health Information Technology

    ERIC Educational Resources Information Center

    Ngafeeson, Madison N.

    2013-01-01

    The successful implementation of health information systems is expected to increase legibility, reduce medical errors, boost the quality of healthcare and shrink costs. Yet, evidence points to the fact that healthcare professionals resist the full use of these systems. Physicians and nurses have been reported to resist the system. Even though…

  17. Health Information Retrieval Tool (HIRT)

    PubMed Central

    Nyun, Mra Thinzar; Ogunyemi, Omolola; Zeng, Qing

    2002-01-01

    The World Wide Web (WWW) is a powerful way to deliver on-line health information, but one major problem limits its value to consumers: content is highly distributed, while relevant and high quality information is often difficult to find. To address this issue, we experimented with an approach that utilizes three-dimensional anatomic models in conjunction with free-text search.

  18. Game-Theoretic Models of Information Overload in Social Networks

    NASA Astrophysics Data System (ADS)

    Borgs, Christian; Chayes, Jennifer; Karrer, Brian; Meeder, Brendan; Ravi, R.; Reagans, Ray; Sayedi, Amin

    We study the effect of information overload on user engagement in an asymmetric social network like Twitter. We introduce simple game-theoretic models that capture rate competition between celebrities producing updates in such networks where users non-strategically choose a subset of celebrities to follow based on the utility derived from high quality updates as well as disutility derived from having to wade through too many updates. Our two variants model the two behaviors of users dropping some potential connections (followership model) or leaving the network altogether (engagement model). We show that under a simple formulation of celebrity rate competition, there is no pure strategy Nash equilibrium under the first model. We then identify special cases in both models when pure rate equilibria exist for the celebrities: For the followership model, we show existence of a pure rate equilibrium when there is a global ranking of the celebrities in terms of the quality of their updates to users. This result also generalizes to the case when there is a partial order consistent with all the linear orders of the celebrities based on their qualities to the users. Furthermore, these equilibria can be computed in polynomial time. For the engagement model, pure rate equilibria exist when all users are interested in the same number of celebrities, or when they are interested in at most two. Finally, we also give a finite though inefficient procedure to determine if pure equilibria exist in the general case of the followership model.
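
    The flavor of the rate-competition analysis can be conveyed with a toy followership game: celebrities pick update rates, a single user follows them in quality order within a fixed attention budget, and pure Nash equilibria are found by enumeration. The qualities, budget, and rate grid below are invented, and this model is far simpler than the paper's:

```python
from itertools import product

QUALITIES = [3, 2]      # celebrity 0 posts higher-quality updates
BUDGET = 4              # the user tolerates at most 4 updates in total
RATES = [1, 2, 3, 4]    # candidate update rates

def payoffs(rates):
    """A celebrity's payoff is its rate if followed, else 0; the user
    follows celebrities in quality order while the budget allows."""
    order = sorted(range(len(rates)), key=lambda i: -QUALITIES[i])
    used, pay = 0, [0] * len(rates)
    for i in order:
        if used + rates[i] <= BUDGET:
            pay[i] = rates[i]
            used += rates[i]
    return pay

def pure_nash():
    """Enumerate rate profiles and keep those with no profitable deviation."""
    eqs = []
    for profile in product(RATES, repeat=len(QUALITIES)):
        stable = all(
            payoffs(list(profile))[i] >=
            payoffs([r if j == i else profile[j]
                     for j in range(len(profile))])[i]
            for i in range(len(profile)) for r in RATES)
        if stable:
            eqs.append(profile)
    return eqs

# In this toy game every equilibrium has the top celebrity saturating
# the user's attention budget, crowding out the other celebrity.
print(pure_nash())
```

    The same enumeration idea underlies the paper's finite (though inefficient) procedure for deciding whether pure equilibria exist in the general followership model.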

  19. Spatially explicit assessment of estuarine fish after Deepwater Horizon oil spill: trade-off in complexity and parsimony

    EPA Science Inventory

    Evaluating long- term contaminant effects on wildlife populations depends on spatial information about habitat quality, heterogeneity in contaminant exposure, and sensitivities and distributions of species integrated into a systems modeling approach. Rarely is this information re...

  20. The Assessment of Instruments for Detecting Surface Water Spills Associated with Oil and Gas Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, Aubrey E.; Hopkinson, Leslie; Soeder, Daniel

    Surface water and groundwater risks associated with unconventional oil and gas development result from potential spills of the large volumes of chemicals stored on-site during drilling and hydraulic fracturing operations, and the return to the surface of significant quantities of saline water produced during oil or gas well production. To better identify and mitigate risks, watershed models and tools are needed to evaluate the dispersion of pollutants in possible spill scenarios. This information may be used to determine the placement of in-stream water-quality monitoring instruments and to develop early-warning systems and emergency plans. A chemical dispersion model has been used to estimate the contaminant signal for in-stream measurements. Spills associated with oil and gas operations were identified within the Susquehanna River Basin Commission’s Remote Water Quality Monitoring Network. The volume of some contaminants was found to be sufficient to affect the water quality of certain drainage areas. The most commonly spilled compounds and expected peak concentrations at monitoring stations were used in laboratory experiments to determine if a signal could be detected and positively identified using standard water-quality monitoring equipment. The results were compared to historical data and baseline observations of water quality parameters, and showed that the chemicals tested do commonly affect water quality parameters. This work is an effort to demonstrate that hydrologic and water quality models may be applied to improve the placement of in-stream water quality monitoring devices. This information may increase the capability of early-warning systems to alert community health and environmental agencies of surface water spills associated with unconventional oil and gas operations.

  1. Assessment of Gas Potential in the Niobrara Formation, Rosebud Reservation, South Dakota

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, Aubrey E.; Hopkinson, Leslie; Soeder, Daniel

    2016-01-23


  2. A risk-based coverage model for video surveillance camera control optimization

    NASA Astrophysics Data System (ADS)

    Zhang, Hongzhou; Du, Zhiguo; Zhao, Xingtao; Li, Peiyue; Li, Dehua

    2015-12-01

    Visual surveillance systems for law enforcement or police case investigation differ from traditional applications, as they are designed to monitor pedestrians, vehicles, or potential accidents. In the present work, visual surveillance risk is defined as the uncertainty of visual information about monitored targets and events, and risk entropy is introduced to model the requirements a police surveillance task places on the quality and quantity of video information. The proposed coverage model is applied to calculate the preset field-of-view (FoV) positions of PTZ cameras.

  3. Privacy-preserving periodical publishing for medical information

    NASA Astrophysics Data System (ADS)

    Jin, Hua; Ju, Shi-guang; Liu, Shan-cheng

    2013-07-01

    Existing privacy-preserving publishing models, whether static or dynamic, cannot meet the requirements of periodical publishing for medical information. This paper presents a (k,l)-anonymity model that preserves individual associations, together with a principle based on ε-invariance groups for subsequent periodical publishing; the PKIA and PSIGI algorithms are then designed for them. The proposed methods preserve more individual associations under privacy protection and achieve better publishing quality. Experiments confirm our theoretical results and the practicability of the approach.

  4. A Science Plan for a Comprehensive Regional Assessment of the Atlantic Coastal Plain Aquifer System in Maryland

    USGS Publications Warehouse

    Shedlock, Robert J.; Bolton, David W.; Cleaves, Emery T.; Gerhart, James M.; Nardi, Mark R.

    2007-01-01

    The Maryland Coastal Plain region is, at present, largely dependent upon ground water for its water supply. Decades of increasing pumpage have caused ground-water levels in parts of the Maryland Coastal Plain to decline by as much as 2 feet per year in some areas of southern Maryland. Continued declines at this rate could affect the long-term sustainability of ground-water resources in Maryland's heavily populated Coastal Plain communities and the agricultural industry of the Eastern Shore. In response to a recommendation in 2004 by the Advisory Committee on the Management and Protection of the State's Water Resources, the Maryland Geological Survey and the U.S. Geological Survey have developed a science plan for a comprehensive assessment that will provide new scientific information and new data management and analysis tools for the State to use in allocating ground water in the Coastal Plain. The comprehensive assessment has five goals aimed at improving the current information and tools used to understand the resource potential of the aquifer system: (1) document the geologic and hydrologic characteristics of the aquifer system in the Maryland Coastal Plain and appropriate areas of adjacent states; (2) conduct detailed studies of the regional ground-water-flow system and water budget for the aquifer system; (3) improve documentation of patterns of water quality in all Coastal Plain aquifers, including the distribution of saltwater; (4) enhance ground-water-level, streamflow, and water-quality-monitoring networks in the Maryland Coastal Plain; and (5) develop science-based tools to facilitate sound management of the ground-water resources in the Maryland Coastal Plain. The assessment, as designed, will be conducted in three phases and if fully implemented, is expected to take 7 to 8 years to complete. 
    Phase I, which was initiated in January 2006, is an effort to assemble all the information and investigation tools needed to do a more comprehensive assessment of the aquifer system. The work will include updating the hydrogeologic framework, developing a Geographic Information System-based aquifer information system, refining water-use information, assessing existing water-quality data, and developing detailed plans for ground-water-flow and management models. Phase II is an intensive study phase during which a regional ground-water-flow model will be developed and calibrated for the entire region of Maryland in the Atlantic Coastal Plain as well as appropriate areas of Delaware and Virginia. The model will be used to simulate flow and water levels in the aquifer system and to study the water budget of the system. The model analysis will be based on published information but will be supplemented with field investigations of recharge and leakage in the aquifer system. Localized and finely discretized ground-water-flow models that are embedded in the regional model will be developed for selected areas of heavy withdrawals. Other modeling studies will be conducted to better understand flow in the unconfined parts of the aquifer system and to support the recharge studies. Phase II will also include selected water-quality studies and a study to determine how hydrologic and water-quality-monitoring networks need to be enhanced to appropriately assess the sustainability of the Coastal Plain aquifer system. Phase III will be largely devoted to the development and application of a ground-water optimization model. This model will be linked to the ground-water-flow model to create a model package that can be used to test different water-management scenarios. The management criteria that will be used to develop these scenarios will be determined in consultation with a variety of state and local stakeholders and policy makers in Phases I and II of the assessment.
The development of the aquifer information system is a key component of the assessment. The system will store all relevant aquifer data

  5. Water quality assessment and meta model development in Melen watershed - Turkey.

    PubMed

    Erturk, Ali; Gurel, Melike; Ekdal, Alpaslan; Tavsan, Cigdem; Ugurluoglu, Aysegul; Seker, Dursun Zafer; Tanik, Aysegul; Ozturk, Izzet

    2010-07-01

    Istanbul, being one of the highly populated metropolitan areas of the world, has been facing water scarcity for the past decade. Water transfer from Melen Watershed was considered the most feasible option to supply water to Istanbul due to its high water potential and relatively less degraded water quality. This study consists of two parts. In the first part, water quality data covering 26 parameters from 5 monitoring stations were analyzed and assessed against the requirements of the "Quality Required of Surface Water Intended for the Abstraction of Drinking Water" regulation. In the second part, a one-dimensional stream water quality model with simple water quality kinetics was developed. It forms a basic design for more advanced water quality models for the watershed. The reason for assessing the water quality data and developing a model was to provide information for decision making on preliminary actions to prevent any further deterioration of existing water quality. According to the water quality assessment at the water abstraction point, Melen River has relatively poor water quality with regard to NH4+, BOD5, faecal streptococcus, manganese and phenol parameters, and is unsuitable for drinking water abstraction in terms of COD, PO43-, total coliform, total suspended solids, mercury and total chromium parameters. The results derived from the model were found to be consistent with the water quality assessment. It also showed that relatively high inorganic nitrogen and phosphorus concentrations along the streams are related to diffuse nutrient loads that should be managed together with municipal and industrial wastewaters. Copyright 2010 Elsevier Ltd. All rights reserved.
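
    The abstract does not state the model's kinetics, but a typical "simple" one-dimensional formulation is first-order decay under steady advection, C(x) = C0 * exp(-k*x/u). A sketch with invented rate, velocity, and concentration values:

```python
import math

def downstream_concentration(c0, k_per_day, u_km_per_day, x_km):
    """Concentration x_km downstream of a source under first-order decay
    at rate k while advected at stream velocity u."""
    return c0 * math.exp(-k_per_day * x_km / u_km_per_day)

# e.g. a BOD5-like load of 8 mg/L decaying at 0.3/day in a 10 km/day stream:
print(round(downstream_concentration(8.0, 0.3, 10.0, 20.0), 2))  # 4.39
```

    Comparing such computed profiles with monitored concentrations is the kind of consistency check the study reports between its model and the water quality assessment.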

  6. Strategic collaborative quality management and employee job satisfaction

    PubMed Central

    Mosadeghrad, Ali Mohammad

    2014-01-01

    Background: This study aimed to examine Strategic Collaborative Quality Management (SCQM) impact on employee job satisfaction. Methods: The study presents a case study over six years following the implementation of the SCQM programme in a public hospital. A validated questionnaire was used to measure employees’ job satisfaction. The impact of the intervention was measured by comparing the pre-intervention and post-intervention measures in the hospital. Results: The hospital reported a significant improvement in some dimensions of job satisfaction like management and supervision, organisational policies, task requirement, and working conditions. Conclusion: This paper provides detailed information on how a quality management model implementation affects employees. A well developed, well introduced and institutionalised quality management model can improve employees’ job satisfaction. However, the success of quality management needs top management commitment and stability. PMID:24847482

  7. Strategic collaborative quality management and employee job satisfaction.

    PubMed

    Mosadeghrad, Ali Mohammad

    2014-05-01

    This study aimed to examine Strategic Collaborative Quality Management (SCQM) impact on employee job satisfaction. The study presents a case study over six years following the implementation of the SCQM programme in a public hospital. A validated questionnaire was used to measure employees' job satisfaction. The impact of the intervention was measured by comparing the pre-intervention and post-intervention measures in the hospital. The hospital reported a significant improvement in some dimensions of job satisfaction like management and supervision, organisational policies, task requirement, and working conditions. This paper provides detailed information on how a quality management model implementation affects employees. A well developed, well introduced and institutionalised quality management model can improve employees' job satisfaction. However, the success of quality management needs top management commitment and stability.

  8. Criteria for Reviewing District Competency Tests.

    ERIC Educational Resources Information Center

    Herman, Joan L.

    A formative evaluation minimum competency test model is examined. The model systematically uses assessment information to support and facilitate program improvement. In terms of the model, four inter-related qualities are essential for a sound testing program. The content validity perspective looks at how well the district has defined competency…

  9. Sensitivity Analysis of Dispersion Model Results in the NEXUS Health Study Due to Uncertainties in Traffic-Related Emissions Inputs

    EPA Science Inventory

    Dispersion modeling tools have traditionally provided critical information for air quality management decisions, but have been used recently to provide exposure estimates to support health studies. However, these models can be challenging to implement, particularly in near-road s...

  10. Valuing recreational fishing quality at rivers and streams

    NASA Astrophysics Data System (ADS)

    Melstrom, Richard T.; Lupi, Frank; Esselman, Peter C.; Stevenson, R. Jan

    2015-01-01

    This paper describes an economic model that links the demand for recreational stream fishing to fish biomass. Useful measures of fishing quality are often difficult to obtain. In the past, economists have linked the demand for fishing sites to species presence-absence indicators or average self-reported catch rates. The demand model presented here takes advantage of a unique data set of statewide biomass estimates for several popular game fish species in Michigan, including trout, bass and walleye. These data are combined with fishing trip information from a 2008-2010 survey of Michigan anglers in order to estimate a demand model. Fishing sites are defined by hydrologic unit boundaries and information on fish assemblages so that each site corresponds to the area of a small subwatershed, about 100-200 square miles in size. The random utility model choice set includes nearly all fishable streams in the state. The results indicate a significant relationship between the site choice behavior of anglers and the biomass of certain species. Anglers are more likely to visit streams in watersheds high in fish abundance, particularly for brook trout and walleye. The paper includes estimates of the economic value of several quality change and site loss scenarios.

  11. The Catchment Runoff Attenuation Flux Tool, a minimum information requirement nutrient pollution model

    NASA Astrophysics Data System (ADS)

    Adams, R.; Quinn, P. F.; Bowes, M. J.

    2015-04-01

    A model for simulating runoff pathways and water quality fluxes has been developed using the minimum information requirement (MIR) approach. The model, the Catchment Runoff Attenuation Flux Tool (CRAFT), is applicable to mesoscale catchments and focusses primarily on hydrological pathways that mobilise nutrients. Hence CRAFT can be used to investigate the impact of flow pathway management intervention strategies designed to reduce the loads of nutrients into receiving watercourses. The model can help policy makers meet water quality targets and consider methods to obtain "good" ecological status. A case study of the 414 km2 Frome catchment, Dorset, UK, has been described here as an application of CRAFT in order to highlight the above issues at the mesoscale. The model was primarily calibrated on 10-year records of weekly data to reproduce the observed flows and nutrient (nitrate nitrogen - N; phosphorus - P) concentrations. Data from 2 years with sub-daily monitoring at the same site were also analysed. These data highlighted some additional signals in the nutrient flux, particularly of soluble reactive phosphorus, which were not observable in the weekly data. This analysis has prompted the choice of a daily time step as the minimum information requirement to simulate the processes observed at the mesoscale, including the impact of uncertainty. A management intervention scenario was also run to demonstrate how the model can support catchment managers investigating how to reduce the concentrations of N and P in the various flow pathways. This mesoscale modelling tool can help policy makers consider a range of strategies to meet the European Union (EU) water quality targets for this type of catchment.
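
    The MIR idea can be sketched as a daily nutrient balance over a handful of flow pathways. The pathway names and concentrations below are invented for illustration, not CRAFT's calibrated values:

```python
# Characteristic nutrient concentration per pathway (mg/L), invented values.
PATHWAY_CONC_MG_L = {"surface": 2.5, "subsurface": 6.0, "groundwater": 4.0}

def daily_load_kg(flows_m3):
    """Daily nutrient load: sum over pathways of concentration times flow.

    mg/L times m3 gives grams, hence the division by 1000 for kg.
    """
    return sum(PATHWAY_CONC_MG_L[p] * q / 1000.0 for p, q in flows_m3.items())

print(daily_load_kg({"surface": 10000, "subsurface": 5000, "groundwater": 20000}))  # 135.0
```

    A management intervention scenario, in this sketch, amounts to editing a pathway's concentration or flow and re-running, which is the style of what-if analysis the abstract describes.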

  12. Making difficult decisions: the role of quality of care in choosing a nursing home.

    PubMed

    Pesis-Katz, Irena; Phelps, Charles E; Temkin-Greener, Helena; Spector, William D; Veazie, Peter; Mukamel, Dana B

    2013-05-01

    We investigated how quality of care affects choosing a nursing home. We examined nursing home choice in California, Ohio, New York, and Texas in 2001, a period before the federal Nursing Home Compare report card was published. Thus, consumers were less able to observe clinical quality or clinical quality was masked. We modeled nursing home choice by estimating a conditional multinomial logit model. In all states, consumers were more likely to choose nursing homes of high hotel services quality but not clinical care quality. Nursing home choice was also significantly associated with shorter distance from prior residence, not-for-profit status, and larger facility size. In the absence of quality report cards, consumers choose a nursing home on the basis of the quality dimensions that are easy for them to observe, evaluate, and apply to their situation. Future research should focus on identifying the quality information that offers the most value added to consumers.
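
    A conditional multinomial logit assigns each nursing home j a utility u_j from its attributes and predicts choice with probability proportional to exp(u_j). The coefficients and attributes below are invented for illustration, not the study's estimates:

```python
import math

# Hypothetical coefficients: closer, higher hotel quality, and
# not-for-profit homes are preferred, as in the study's findings.
BETA = {"distance": -0.10, "hotel_quality": 0.80, "not_for_profit": 0.30}

def choice_probabilities(homes):
    """Conditional logit choice probabilities over a set of alternatives."""
    utilities = [sum(BETA[k] * v for k, v in x.items()) for x in homes]
    m = max(utilities)                      # stabilised softmax
    exps = [math.exp(u - m) for u in utilities]
    total = sum(exps)
    return [e / total for e in exps]

homes = [
    {"distance": 5.0, "hotel_quality": 4.0, "not_for_profit": 1},
    {"distance": 2.0, "hotel_quality": 2.0, "not_for_profit": 0},
]
print([round(p, 3) for p in choice_probabilities(homes)])  # [0.832, 0.168]
```

    Estimation fits BETA by maximum likelihood over observed choices; the sign and size of each coefficient then quantify how much a quality dimension matters to consumers.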

  13. Making Difficult Decisions: The Role of Quality of Care in Choosing a Nursing Home

    PubMed Central

    Phelps, Charles E.; Temkin-Greener, Helena; Spector, William D.; Veazie, Peter; Mukamel, Dana B.

    2013-01-01

    Objectives. We investigated how quality of care affects choosing a nursing home. Methods. We examined nursing home choice in California, Ohio, New York, and Texas in 2001, a period before the federal Nursing Home Compare report card was published. Thus, consumers were less able to observe clinical quality or clinical quality was masked. We modeled nursing home choice by estimating a conditional multinomial logit model. Results. In all states, consumers were more likely to choose nursing homes of high hotel services quality but not clinical care quality. Nursing home choice was also significantly associated with shorter distance from prior residence, not-for-profit status, and larger facility size. Conclusions. In the absence of quality report cards, consumers choose a nursing home on the basis of the quality dimensions that are easy for them to observe, evaluate, and apply to their situation. Future research should focus on identifying the quality information that offers the most value added to consumers. PMID:23488519

  14. The Importance of Context: Risk-based De-identification of Biomedical Data.

    PubMed

    Prasser, Fabian; Kohlmayer, Florian; Kuhn, Klaus A

    2016-08-05

    Data sharing is a central aspect of modern biomedical research. It is accompanied by significant privacy concerns, and often data needs to be protected from re-identification. With methods of de-identification, datasets can be transformed in such a way that it becomes extremely difficult to link their records to identified individuals. The most important challenge in this process is to find an adequate balance between an increase in privacy and a decrease in data quality. Accurately measuring the risk of re-identification in a specific data sharing scenario is an important aspect of data de-identification. Overestimation of risks will significantly deteriorate data quality, while underestimation will leave data prone to attacks on privacy. Several models have been proposed for measuring risks, but there is a lack of generic methods for risk-based data de-identification. The aim of the work described in this article was to bridge this gap and to show how the quality of de-identified datasets can be improved by using risk models to tailor the process of de-identification to a concrete context. We implemented a generic de-identification process and several models for measuring re-identification risks in the ARX de-identification tool for biomedical data. By integrating the methods into an existing framework, we were able to automatically transform datasets in such a way that information loss is minimized while it is ensured that re-identification risks meet a user-defined threshold. We performed an extensive experimental evaluation to analyze the impact of using different risk models and assumptions about the goals and the background knowledge of an attacker on the quality of de-identified data. The results of our experiments show that data quality can be improved significantly by using risk models for data de-identification.
On a scale where 100 % represents the original input dataset and 0 % represents a dataset from which all information has been removed, the loss of information content could be reduced by up to 10 % when protecting datasets against strong adversaries and by up to 24 % when protecting datasets against weaker adversaries. The methods studied in this article are well suited for protecting sensitive biomedical data and our implementation is available as open-source software. Our results can be used by data custodians to increase the information content of de-identified data by tailoring the process to a specific data sharing scenario. Improving data quality is important for fostering the adoption of de-identification methods in biomedical research.
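
    The risk-based approach described above can be illustrated with a minimal sketch (not ARX's actual implementation): re-identification risk is taken as the prosecutor risk, 1 divided by the size of the smallest group of records sharing the same quasi-identifier values, and the weakest generalization step that meets a user-defined risk threshold is chosen. All record values and generalization steps below are hypothetical.

```python
from collections import Counter

def equivalence_classes(records, quasi_identifiers):
    """Group records by their quasi-identifier values."""
    return Counter(tuple(r[q] for q in quasi_identifiers) for r in records)

def max_reidentification_risk(records, quasi_identifiers):
    """Prosecutor risk: 1 / size of the smallest equivalence class."""
    classes = equivalence_classes(records, quasi_identifiers)
    return 1.0 / min(classes.values())

def generalize_age(record, width):
    """Coarsen age into bands of the given width (one generalization step)."""
    r = dict(record)
    r["age"] = (record["age"] // width) * width
    return r

def deidentify(records, quasi_identifiers, threshold, widths=(1, 5, 10, 20)):
    """Apply the weakest age generalization that meets the risk threshold."""
    for width in widths:
        candidate = [generalize_age(r, width) for r in records]
        if max_reidentification_risk(candidate, quasi_identifiers) <= threshold:
            return candidate, width
    return None, None  # threshold not reachable with these steps

# Hypothetical dataset: age and ZIP code act as quasi-identifiers.
records = [{"age": a, "zip": "12345"} for a in (21, 22, 23, 24, 41, 42, 43, 44)]
deidentified, width = deidentify(records, ["age", "zip"], threshold=0.5)
```

    With these toy records, 5-year age bands are the weakest step that pushes the maximum risk below the threshold; a stricter threshold would force wider bands and hence more information loss, which is exactly the trade-off the article quantifies.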

  15. Application of uncertainty and sensitivity analysis to the air quality SHERPA modelling tool

    NASA Astrophysics Data System (ADS)

    Pisoni, E.; Albrecht, D.; Mara, T. A.; Rosati, R.; Tarantola, S.; Thunis, P.

    2018-06-01

    Air quality has improved significantly in Europe over the past few decades. Nonetheless, measurements still show high concentrations, mainly in specific regions and cities. This dimensional shift, from EU-wide to hot-spot exceedances, calls for a novel approach to regional air quality management (complementing existing EU-wide policies). The SHERPA (Screening for High Emission Reduction Potentials on Air quality) modelling tool was developed in this context. It provides an additional tool to support regional and local decision makers responsible for the design of air quality plans. It is therefore important to evaluate the quality of the SHERPA model and its behaviour in the face of various kinds of uncertainty. Uncertainty and sensitivity analysis techniques can be used for this purpose. They both reveal the links between assumptions and forecasts, help in model simplification, and may highlight unexpected relationships between inputs and outputs. Thus, a policy-steered SHERPA module - predicting air quality improvements linked to emission reduction scenarios - was evaluated by means of (1) uncertainty analysis (UA), to quantify uncertainty in the model output, and (2) sensitivity analysis (SA), to identify the most influential input sources of this uncertainty. The results of this study provide relevant information about the key variables driving the SHERPA output uncertainty, and advise policy makers and modellers where to place their efforts for an improved decision-making process.
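
    The UA/SA workflow described above can be sketched generically (this is not the SHERPA code): Monte Carlo sampling propagates input uncertainty through a toy model standing in for the air-quality response, and squared input-output correlations approximate first-order sensitivity indices for a linear response. The model weights are invented.

```python
import random

random.seed(42)

def model(x1, x2, x3):
    # Toy stand-in for an air-quality response: linear, with x1 dominant.
    return 3.0 * x1 + 1.0 * x2 + 0.1 * x3

# Uncertainty analysis: propagate input uncertainty by Monte Carlo sampling.
n = 10_000
samples = [(random.gauss(0, 1), random.gauss(0, 1), random.gauss(0, 1))
           for _ in range(n)]
outputs = [model(*s) for s in samples]
mean = sum(outputs) / n
var = sum((y - mean) ** 2 for y in outputs) / n

# Sensitivity analysis: for this linear model, the squared correlation of
# each input with the output approximates its first-order variance share.
def first_order_index(i):
    xs = [s[i] for s in samples]
    mx = sum(xs) / n
    cov = sum((x - mx) * (y - mean) for x, y in zip(xs, outputs)) / n
    vx = sum((x - mx) ** 2 for x in xs) / n
    return cov ** 2 / (vx * var)

indices = [first_order_index(i) for i in range(3)]
```

    The UA step yields the output variance; the SA step apportions it among inputs, telling modellers (as in the study) which input uncertainties dominate and therefore where refinement effort pays off.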

  16. Refinement of pollutant gas emissions in the state of Rio de Janeiro for applications in modeling air quality on a local scale

    NASA Astrophysics Data System (ADS)

    Domínguez Chovert, Angel; Félix Alonso, Marcelo; Frassoni, Ariane; José Ferreira, Valter; Eiras, Denis; Longo, Karla; Freitas, Saulo

    2017-04-01

    Numerical modeling is a fundamental tool for studying the components of the Earth system, along with weather and climate forecasting. The development of on-line models makes it possible to simulate atmospheric conditions, for example to evaluate certain chemical species during weather events, with the aim of improving a region's air quality. For this purpose, on-line models draw on information from a broad range of sources to generate their forecast variables. Beyond these many sources, an air quality study requires data on the amount and distribution of pollutant gas emissions that are representative of the region, and complete georeferenced emissions are required for high-resolution simulations. This work therefore presents modifications to the PREP-CHEM-SRC tool (Preprocessor of trace gas and aerosol emission fields for regional and global atmospheric chemistry models) that improve the initialization files for the BRAMS model, version 5.2 (Brazilian Developments on the Regional Atmospheric Modeling System), and the WRF model (Weather Research and Forecasting Model) with vehicle emissions in the state of Rio de Janeiro, Brazil. Annual vehicle emissions of nitrogen oxides (NOx) and carbon monoxide (CO) were estimated for each city, up to the year 2030, under different scenarios. For the city of Rio de Janeiro, a procedure for spatially distributing the emissions of the main pollutant gases was implemented: five different route types were used, and the emission percentage for each was calculated from the most current traffic information available for them. Industrial contributions to the emissions were checked against the global datasets RETRO (REanalysis of TROpospheric chemical composition) and EDGAR-HTAP (Emission Database for Global Atmospheric Research), while biogenic contributions were taken from the MEGAN model (Model of Emissions of Gases and Aerosols from Nature). For all the analyzed species, a strong influence of vehicular activity on the emission distribution was observed.
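
    The route-based distribution step described above amounts to allocating a city's emission total across route types in proportion to traffic shares. A minimal sketch, with invented route names, traffic counts, and an invented 1000 t/yr CO total:

```python
def distribute_emissions(total_emission, traffic_counts):
    """Allocate a city's total vehicle emission to route types
    in proportion to observed traffic counts."""
    total_traffic = sum(traffic_counts.values())
    return {route: total_emission * count / total_traffic
            for route, count in traffic_counts.items()}

# Hypothetical traffic counts for five route types (illustrative numbers only).
counts = {"highway": 50_000, "arterial": 25_000, "collector": 15_000,
          "local": 8_000, "rural": 2_000}
co_by_route = distribute_emissions(1_000.0, counts)  # 1000 t/yr of CO
```

    In the actual preprocessor the per-route totals would then be rasterized onto the georeferenced road network before being written to the model initialization files.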

  17. Updating of states in operational hydrological models

    NASA Astrophysics Data System (ADS)

    Bruland, O.; Kolberg, S.; Engeland, K.; Gragne, A. S.; Liston, G.; Sand, K.; Tøfte, L.; Alfredsen, K.

    2012-04-01

    Operationally, the main purpose of hydrological models is to provide runoff forecasts. The quality of the model state and the accuracy of the weather forecast, together with the model quality, define the runoff forecast quality. Input and model errors accumulate over time and may leave the model in a poor state. Model states can usually be related to observable conditions in the catchment, and updating these states, given their relation to observable catchment conditions, directly influences forecast quality. Norway is internationally at the forefront of hydropower scheduling on both short and long time scales, and inflow forecasts are fundamental to this scheduling. Their quality directly influences the producers' profit as they optimize hydropower production to market demand while minimizing spill of water and maximizing available hydraulic head. The quality of the inflow forecasts strongly depends on the quality of the models applied and of the information they use. The focus of this project has been to improve the quality of the model states on which the forecast is based. Runoff and snow storage are two observable quantities that reflect the model state and are used in this project for updating. The methods used can generally be divided into three groups: the first re-estimates the forcing data in the updating period; the second alters the weights in the forecast ensemble; and the third directly changes the model states. The uncertainty in the forcing data through the updating period stems both from uncertainty in the actual observations and from how well the gauging stations represent the catchment with respect to temperature and precipitation. The project examines methodologies that automatically re-estimate the forcing data and tests the result against the observed response. Model uncertainty is reflected in a joint distribution of model parameters estimated using the DREAM algorithm.
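
    Of the three groups of updating methods listed above, the third (directly changing model states) can be sketched as a simple nudge of a state variable toward an observation. The gain value and snow-storage numbers below are illustrative, not taken from the project:

```python
def update_state(modeled, observed, gain=0.7):
    """Nudge a model state toward an observation.
    gain=0 keeps the model value unchanged; gain=1 adopts the observation."""
    return modeled + gain * (observed - modeled)

# Hypothetical snow-storage states (mm water equivalent) and an observation.
snow_model = 120.0
snow_observed = 150.0
snow_updated = update_state(snow_model, snow_observed)
```

    In practice the gain would reflect the relative confidence in model versus observation (a Kalman-filter-style weighting); a poorly chosen gain either discards useful observations or overwrites a sound model state with observation noise.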

  18. Quality assessment of protein model-structures based on structural and functional similarities.

    PubMed

    Konopka, Bogumil M; Nebel, Jean-Christophe; Kotulska, Malgorzata

    2012-09-21

    Experimental determination of protein 3D structures is expensive, time consuming and sometimes impossible. The gap between the number of protein structures deposited in the Worldwide Protein Data Bank and the number of sequenced proteins is constantly widening, and computational modeling is deemed one of the ways to deal with the problem. Although protein 3D structure prediction is a difficult task, many tools are available that can model a structure from a sequence or from partial structural information, e.g. contact maps. Consequently, biologists can automatically generate a putative 3D structure model of any protein. The main issue then becomes the evaluation of the model quality, which is one of the most important challenges of structural biology. GOBA (Gene Ontology-Based Assessment) is a novel protein Model Quality Assessment Program. It estimates the compatibility between a model-structure and its expected function. GOBA is based on the assumption that a high-quality model should be structurally similar to proteins that are functionally similar to the prediction target. Whereas DALI is used to measure structural similarity, protein functional similarity is quantified using the standardized, hierarchical description of proteins provided by the Gene Ontology, combined with Wang's algorithm for calculating semantic similarity. Two approaches are proposed to express the quality of protein model-structures: one is a single-model quality assessment method, and the other is a modification of it that provides a relative measure of model quality. An exhaustive evaluation is performed on data sets of model-structures submitted to the CASP8 and CASP9 contests. The validation shows that the method is able to discriminate between good and bad model-structures. The best of the tested GOBA scores achieved mean Pearson correlations of 0.74 and 0.80 with the observed quality of models in our CASP8- and CASP9-based validation sets. 
GOBA also obtained the best result for two targets of CASP8, and one of CASP9, compared to the contest participants. Consequently, GOBA offers a novel single model quality assessment program that addresses the practical needs of biologists. In conjunction with other Model Quality Assessment Programs (MQAPs), it would prove useful for the evaluation of single protein models.
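
    GOBA's core assumption (a good model is structurally similar to proteins functionally similar to the target) can be paraphrased as a toy weighted score. This is not GOBA's actual formula, and the similarity values below are invented:

```python
def goba_like_score(struct_sim, func_sim):
    """Toy quality score in the spirit of GOBA: structural similarity to
    reference proteins, weighted by their functional similarity to the target.
    struct_sim[i]: structural similarity of the model to reference i (0-1,
                   e.g. a normalized DALI-style score)
    func_sim[i]:   GO-based semantic similarity of reference i to the target (0-1)
    """
    weight = sum(func_sim)
    if weight == 0:
        return 0.0
    return sum(s * f for s, f in zip(struct_sim, func_sim)) / weight

# A model that is structurally close to functionally similar references
# should score higher than one close only to unrelated proteins.
good = goba_like_score([0.9, 0.8, 0.2], [1.0, 0.9, 0.1])
bad  = goba_like_score([0.2, 0.3, 0.9], [1.0, 0.9, 0.1])
```

    The weighting means references with little functional relation to the target contribute almost nothing, which is the sense in which the score ties structural resemblance to expected function.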

  19. One-year simulation of ozone and particulate matter in China using WRF/CMAQ modeling system

    NASA Astrophysics Data System (ADS)

    Hu, Jianlin; Chen, Jianjun; Ying, Qi; Zhang, Hongliang

    2016-08-01

    China has been experiencing severe air pollution in recent decades. Although an ambient air quality monitoring network for criteria pollutants has been constructed in over 100 Chinese cities since 2013, the temporal and spatial characteristics of some important pollutants, such as particulate matter (PM) components, remain unknown, limiting studies that investigate potential air pollution control strategies to improve air quality and that associate human health outcomes with air pollution exposure. In this study, a yearlong (2013) air quality simulation using the Weather Research and Forecasting (WRF) model and the Community Multi-scale Air Quality (CMAQ) model was conducted to provide detailed temporal and spatial information on ozone (O3), total PM2.5, and PM2.5 chemical components. The Multi-resolution Emission Inventory for China (MEIC) was used for anthropogenic emissions, and observation data obtained from the national air quality monitoring network were collected to validate model performance. The model successfully reproduces the O3 and PM2.5 concentrations at most cities for most months, with model performance statistics meeting the performance criteria. However, O3 is generally overpredicted in the low concentration range, while PM2.5 is underpredicted in the low concentration range in summer. Spatially, the model performs better in southern China than in northern China, central China, and the Sichuan Basin. PM2.5 exhibits strong seasonal variations, and wind speed and direction play important roles in high-PM2.5 events. Secondary components have a broader spatial distribution than primary components. Sulfate (SO42-), nitrate (NO3-), ammonium (NH4+), and primary organic aerosol (POA) are the most important PM2.5 components. All components reach their highest concentrations in winter except secondary organic aerosol (SOA). 
This study proves the ability of the CMAQ model to reproduce severe air pollution in China, identifies the directions where improvements are needed, and provides information for human exposure to multiple pollutants for assessing health effects.
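
    The model performance statistics mentioned above commonly include mean fractional bias (MFB) and mean fractional error (MFE). A sketch with hypothetical daily PM2.5 values, checked against the widely used Boylan and Russell performance criteria for PM:

```python
def mean_fractional_bias(model, obs):
    """MFB = (2/N) * sum((M - O) / (M + O)); bounded in [-2, 2]."""
    return 2.0 / len(model) * sum((m - o) / (m + o) for m, o in zip(model, obs))

def mean_fractional_error(model, obs):
    """MFE = (2/N) * sum(|M - O| / (M + O)); bounded in [0, 2]."""
    return 2.0 / len(model) * sum(abs(m - o) / (m + o) for m, o in zip(model, obs))

# Hypothetical daily PM2.5 concentrations (ug/m3) at one site.
obs = [80.0, 120.0, 60.0, 150.0]
mod = [70.0, 110.0, 75.0, 130.0]
mfb = mean_fractional_bias(mod, obs)
mfe = mean_fractional_error(mod, obs)
# Boylan & Russell performance criteria for PM: |MFB| <= 0.60 and MFE <= 0.75.
meets_criteria = abs(mfb) <= 0.60 and mfe <= 0.75
```

    Because the denominator is the mean of model and observation, these metrics avoid the blow-up of normalized bias at near-zero observations, which is why they are preferred for PM evaluation.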

  20. Assessment of the operational efficiency of special-purpose infocommunication systems under violations of information relevance

    NASA Astrophysics Data System (ADS)

    Parinov, A. V.; Korotkikh, L. P.; Desyatov, D. B.; Stepanov, L. V.

    2018-03-01

    The unique information processing mechanisms of special-purpose infocommunication systems, and the increased interest of intruders in them, make the problems associated with their protection increasingly relevant. The paper considers the construction of risk models for violations of the relevance and value of information in special-purpose infocommunication systems. Particular attention is paid to the connection between the relevance and the value of the information produced by such systems. Analytical expressions for the risk and damage functions over time are obtained, which can serve as a mathematical basis for risk assessment. An analytical expression is also derived for the chance of obtaining up-to-date information from an infocommunication system before the quality of the information is violated; this expression can be used to calculate the effectiveness of a special-purpose infocommunication system.

  1. Where should Momma go? Current nursing home performance measurement strategies and a less ambitious approach.

    PubMed

    Phillips, Charles D; Hawes, Catherine; Lieberman, Trudy; Koren, Mary Jane

    2007-06-25

    Nursing home performance measurement systems are practically ubiquitous. The vast majority of these systems aspire to rank-order all nursing homes based on quantitative measures of quality. However, the ability of such systems to identify homes differing in quality is hampered by the multidimensional nature of nursing homes and their residents. As a result, the authors doubt the ability of many nursing home performance systems to truly help consumers differentiate among homes providing different levels of quality. We also argue that, for consumers, performance measurement models are better at identifying problem facilities than potentially good homes. In response to these concerns, we propose a less ambitious approach to nursing home performance measurement than has previously been used. We believe consumers can make better-informed choices using a simpler system designed to pinpoint poor-quality nursing homes, rather than one designed to rank hundreds of facilities based on differences in quality-of-care indicators of questionable importance. The suggested performance model is based on five principles used in the development of the Consumers Union 2006 Nursing Home Quality Monitor. We can best serve policy-makers and consumers by eschewing nursing home reporting systems that present information about all the facilities in a city, a state, or the nation on a website or in a report. We argue for greater modesty in our efforts and for a focus on identifying only the potentially poorest or best homes. In the end, however, it is important to remember that information from any performance measurement website or report is no substitute for multiple visits to a home at different times of day to personally assess quality.

  2. Simple yet effective: Historical proximity variables improve the species distribution models for invasive giant hogweed (Heracleum mantegazzianum s.l.) in Poland.

    PubMed

    Mędrzycki, Piotr; Jarzyna, Ingeborga; Obidziński, Artur; Tokarska-Guzik, Barbara; Sotek, Zofia; Pabjanek, Piotr; Pytlarczyk, Adam; Sachajdakiewicz, Izabela

    2017-01-01

    Species distribution models are scarcely applicable to invasive species because such species break the models' assumptions. So far, few mechanistic, semi-mechanistic or statistical solutions, such as dispersal constraints or propagule limitation, have been applied. We evaluated a novel quasi-semi-mechanistic approach for regional-scale models, using historical proximity variables (HPV) that represent the state of the population at a given moment in the past. Our aim was to test the effects of adding HPV sets of different minimal recentness, information capacity and total number of variables on the quality of the species distribution model for Heracleum mantegazzianum over 116,000 km2 in Poland. As environmental predictors, we used fragments of 103 worldwide, free-access 1×1 km rasters from WorldGrids.org. Single and ensemble models were computed using the BIOMOD2 package (version 3.1.47) running in the R environment (3.1.0). The addition of HPV improved the quality of single and ensemble models from poor to good and excellent. Quality was highest for the variants with HPVs based on the distance from the most recent past occurrences. It was mostly affected by the algorithm type, but all HPV traits (minimal recentness, information capacity, model type and the number of time periods) were significantly important determinants. The addition of HPVs improved the quality of current projections, raising the occurrence probability in regions where the species had occurred before. We conclude that HPV addition enables a semi-realistic estimation of the rate of spread and can be applied to short-term forecasting of invasive or declining species, which also break equal-dispersal-probability assumptions.
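
    A historical proximity variable of the kind described above can be computed simply as, for each grid cell, the distance to the nearest known past occurrence. A minimal sketch with invented coordinates (the study's actual raster workflow is in R/BIOMOD2):

```python
import math

def historical_proximity(cells, past_occurrences):
    """For each cell centre, the distance to the nearest known past occurrence.
    Cells and occurrences are (x, y) coordinates in km."""
    return [min(math.dist(c, p) for p in past_occurrences) for c in cells]

# Hypothetical 1x1 km cell centres and two historical occurrence records.
cells = [(0.5, 0.5), (10.5, 0.5), (50.5, 50.5)]
past = [(0.0, 0.0), (12.0, 0.0)]
hpv = historical_proximity(cells, past)
```

    Fed into the model as an extra predictor, such a variable raises predicted occurrence probability near previously invaded locations, which is the effect reported in the abstract.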

  3. A Cross-Cultural Multi-agent Model of Opportunism in Trade

    NASA Astrophysics Data System (ADS)

    Hofstede, Gert Jan; Jonker, Catholijn M.; Verwaart, Tim

    According to transaction cost economics, contracts are always incomplete and offer opportunities to defect. Some level of trust is a sine qua non for trade. If the seller is better informed about product quality than the buyer, the buyer has to rely on information the seller provides, or has to check that information by testing the product or tracing the supply chain processes, thus incurring extra transaction cost. An opportunistic seller who assumes the buyer will trust may deliver a lower-quality product than agreed upon. Human decisions to deceive and to show trust or distrust involve issues such as mutual expectations, shame, self-esteem, personality, and reputation. These factors depend in part on traders' cultural background. This paper proposes an agent model of deceit and trust and describes a multi-agent simulation in which trading agents are differentiated according to Hofstede's dimensions of national culture. Simulations of USA and Dutch trading situations are compared.

  4. Using Feedback from Data Consumers to Capture Quality Information on Environmental Research Data

    NASA Astrophysics Data System (ADS)

    Devaraju, A.; Klump, J. F.

    2015-12-01

    Data quality information is essential to facilitate the reuse of Earth science data. Recorded quality information must be sufficient for other researchers to select suitable data sets for their analysis and to confirm the results and conclusions. In the research data ecosystem, several entities are responsible for data quality. Data producers (researchers and agencies) play a major role in this respect, as they often include validation checks or data cleaning as part of their work. Quality information is, however, not always supplied with published data sets, and where it is available, the descriptions may be incomplete, ambiguous, or address only specific quality aspects. Data repositories have built infrastructures to share data, but not all of them assess data quality; they normally only provide guidelines for documenting quality information. Some suggest that scholarly and data journals should take a role in ensuring data quality by involving reviewers in assessing the data sets used in articles and by incorporating data quality criteria in the author guidelines. However, this mechanism primarily addresses data sets submitted to journals. We believe that data consumers can complement existing entities in assessing and documenting the quality of published data sets, an approach already adopted in crowdsourced platforms such as Zooniverse, OpenStreetMap, Wikipedia, Mechanical Turk and Tomnod. This paper presents a framework, built on open source tools, to capture and share data users' feedback on the application and assessment of research data. The framework comprises a browser plug-in, a web service and a data model, such that feedback can be easily reported, retrieved and searched. The feedback records are also made available as Linked Data to promote integration with other sources on the Web. Vocabularies from Dublin Core and PROV-O are used to clarify the source and attribution of feedback. The application of the framework is illustrated with CSIRO's Data Access Portal.
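
    A feedback record of the kind the framework shares as Linked Data might look as follows. The exact schema is not given in the abstract, so the JSON-LD structure, identifiers, and field values below are purely illustrative uses of Dublin Core and PROV-O terms:

```python
import json

# Hypothetical feedback record; all URNs and values are invented.
feedback_record = {
    "@context": {
        "dct": "http://purl.org/dc/terms/",     # Dublin Core terms
        "prov": "http://www.w3.org/ns/prov#",   # PROV-O vocabulary
    },
    "@id": "urn:example:feedback/42",
    "dct:subject": "urn:example:dataset/soil-moisture-2014",
    "dct:description": "Sensor drift after day 120; usable with bias correction.",
    "dct:created": "2015-06-01",
    "prov:wasAttributedTo": "urn:example:user/jdoe",
}
serialized = json.dumps(feedback_record, indent=2)
```

    Attributing each comment to an agent via prov:wasAttributedTo is what lets downstream consumers weigh feedback by its source, one of the stated goals of using PROV-O here.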

  5. Mining Information from a Coupled Air Quality Model to Examine the Impacts of Agricultural Management Practices on Air and Groundwater Quality

    EPA Science Inventory

    Attributing nitrogen (N) in the environment to emissions from agricultural management practices is difficult because of the complex and inter-related chemical and biological reactions associated with N and its cascading effects across land, air and water. Such analyses are criti...

  6. Fine-scale application of the WRF-CMAQ modeling system to the 2013 DISCOVER-AQ San Joaquin Valley study

    EPA Science Inventory

    The DISCOVER-AQ project (Deriving Information on Surface conditions from Column and Vertically Resolved Observations Relevant to Air Quality), is a joint collaboration between NASA, U.S. EPA and a number of other local organizations with the goal of characterizing air quality in ...

  7. 75 FR 28789 - Office of Innovation and Improvement; Overview Information; Charter Schools Program (CSP) Grants...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-24

    ... Schools Program (CSP) Grants for Replication and Expansion of High-Quality Charter Schools; Notice... purpose of the CSP is to increase national understanding of the charter school model and to expand the number of high-quality charter schools available to students across the Nation by providing financial...

  8. Principles of health economic evaluations of lipid-lowering strategies.

    PubMed

    Ara, Roberta; Basarir, Hasan; Ward, Sue Elizabeth

    2012-08-01

    Policy decision-making in cardiovascular disease is increasingly informed by results generated from decision-analytic models (DAMs). The methodological approaches and assumptions used in these DAMs affect the results generated and can influence a policy decision based on a cost per quality-adjusted life year (QALY) threshold. Decision makers need a clear understanding of the key sources of evidence, and of how they are used in the DAM, to make an informed judgement on the quality and appropriateness of the results generated. Our review identified 12 studies exploring the cost-effectiveness of pharmaceutical lipid-lowering interventions published since January 2010. All studies used Markov models with annual cycles to represent the long-term clinical pathway. Important differences in the model structures and in the evidence base used within the DAMs were identified. Whereas the reporting standards were reasonably good, there were many instances where the reporting of methods could be improved, particularly with respect to baseline risk levels, the long-term benefit of treatment, and health state utility values. There is scope for improvement in the reporting of evidence and modelling approaches used within DAMs, to provide decision makers with a clearer understanding of the quality and validity of the results generated. This would be assisted by fuller publication of models, perhaps through detailed web appendices.

  9. The Betting Odds Rating System: Using soccer forecasts to forecast soccer.

    PubMed

    Wunderlich, Fabian; Memmert, Daniel

    2018-01-01

    Betting odds are frequently found to outperform mathematical models in sports-related forecasting tasks; however, the factors contributing to betting odds are not fully traceable and, in contrast to rating-based forecasts, no straightforward measure of team-specific quality can be deduced from the betting odds. The present study investigates an approach combining the methods of mathematical models with the information included in betting odds. A soccer forecasting model based on the well-known ELO rating system and taking advantage of betting odds as a source of information is presented. Data from almost 15,000 soccer matches (seasons 2007/2008 until 2016/2017) are used, including both domestic matches (English Premier League, German Bundesliga, Spanish Primera Division and Italian Serie A) and international matches (UEFA Champions League, UEFA Europa League). The novel betting-odds-based ELO model is shown to outperform classic ELO models, thus demonstrating that betting odds prior to a match contain more relevant information than the result of the match itself. It is shown how the novel model can help to gain valuable insights into the quality of soccer teams and its development over time, and thus has a practical benefit in performance analysis. Moreover, it is argued that network-based approaches might help in further improving rating and forecasting methods.
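
    The key idea, updating ELO ratings against an odds-implied expectation rather than the usual rating-difference expectation, can be sketched as follows. This is a simplified reconstruction, not the authors' exact model, and the decimal odds and K-factor are invented:

```python
def implied_probabilities(odds_home, odds_draw, odds_away):
    """Convert decimal betting odds to probabilities, removing the
    bookmaker margin by normalizing the inverse odds."""
    inv = [1.0 / odds_home, 1.0 / odds_draw, 1.0 / odds_away]
    margin = sum(inv)
    return [p / margin for p in inv]

def odds_elo_update(rating_home, rating_away, result_home,
                    p_home_win, p_draw, k=20.0):
    """Update ratings against the odds-implied expectation.
    result_home: 1 for a home win, 0.5 for a draw, 0 for a loss."""
    expected_home = p_home_win + 0.5 * p_draw  # expected match points
    delta = k * (result_home - expected_home)
    return rating_home + delta, rating_away - delta

probs = implied_probabilities(1.50, 4.00, 7.00)
home, away = odds_elo_update(1500.0, 1500.0, 1.0, probs[0], probs[1])
```

    A heavy favourite that wins gains only a few points (the odds already expected it), whereas an upset moves ratings sharply; this is how pre-match odds information flows into the team-quality measure.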

  10. The Betting Odds Rating System: Using soccer forecasts to forecast soccer

    PubMed Central

    Memmert, Daniel

    2018-01-01

    Betting odds are frequently found to outperform mathematical models in sports-related forecasting tasks; however, the factors contributing to betting odds are not fully traceable and, in contrast to rating-based forecasts, no straightforward measure of team-specific quality can be deduced from the betting odds. The present study investigates an approach combining the methods of mathematical models with the information included in betting odds. A soccer forecasting model based on the well-known ELO rating system and taking advantage of betting odds as a source of information is presented. Data from almost 15,000 soccer matches (seasons 2007/2008 until 2016/2017) are used, including both domestic matches (English Premier League, German Bundesliga, Spanish Primera Division and Italian Serie A) and international matches (UEFA Champions League, UEFA Europa League). The novel betting-odds-based ELO model is shown to outperform classic ELO models, thus demonstrating that betting odds prior to a match contain more relevant information than the result of the match itself. It is shown how the novel model can help to gain valuable insights into the quality of soccer teams and its development over time, and thus has a practical benefit in performance analysis. Moreover, it is argued that network-based approaches might help in further improving rating and forecasting methods. PMID:29870554

  11. Archetype modeling methodology.

    PubMed

    Moner, David; Maldonado, José Alberto; Robles, Montserrat

    2018-03-01

    Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although many experiences of using archetypes have been reported in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential for developing quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. The paper describes its phases, the inputs and outputs of each phase, and the participants and tools involved. It also describes possible strategies for organizing the modeling process. The proposed methodology is inspired by existing best practices in CIM, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects; its application provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can also serve as a reference for CIM development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.

  12. The role of fine-grained annotations in supervised recognition of risk factors for heart disease from EHRs.

    PubMed

    Roberts, Kirk; Shooshan, Sonya E; Rodriguez, Laritza; Abhyankar, Swapna; Kilicoglu, Halil; Demner-Fushman, Dina

    2015-12-01

    This paper describes a supervised machine learning approach for identifying heart disease risk factors in clinical text, and assessing the impact of annotation granularity and quality on the system's ability to recognize these risk factors. We utilize a series of support vector machine models in conjunction with manually built lexicons to classify triggers specific to each risk factor. The features used for classification were quite simple, utilizing only lexical information and ignoring higher-level linguistic information such as syntax and semantics. Instead, we incorporated high-quality data to train the models by annotating additional information on top of a standard corpus. Despite the relative simplicity of the system, it achieves the highest scores (micro- and macro-F1, and micro- and macro-recall) out of the 20 participants in the 2014 i2b2/UTHealth Shared Task. This system obtains a micro- (macro-) precision of 0.8951 (0.8965), recall of 0.9625 (0.9611), and F1-measure of 0.9276 (0.9277). Additionally, we perform a series of experiments to assess the value of the annotated data we created. These experiments show how manually-labeled negative annotations can improve information extraction performance, demonstrating the importance of high-quality, fine-grained natural language annotations. Copyright © 2015 Elsevier Inc. All rights reserved.
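
    The trigger-classification setup, lexical features plus manually built lexicons, can be sketched in miniature. Note this substitutes a tiny perceptron for the paper's SVMs (to keep the sketch dependency-free), and the lexicon and example sentences are invented:

```python
# Bag-of-words features augmented with a lexicon-membership flag, as a
# stand-in for the paper's lexical SVM features.
def featurize(tokens, lexicon):
    feats = {f"w={t}": 1.0 for t in tokens}
    if any(t in lexicon for t in tokens):
        feats["in_lexicon"] = 1.0
    return feats

def train(examples, epochs=10):
    """examples: list of (features, label), label +1 (trigger) / -1 (not)."""
    weights = {}
    for _ in range(epochs):
        for feats, label in examples:
            score = sum(weights.get(f, 0.0) * v for f, v in feats.items())
            if label * score <= 0:  # misclassified: perceptron update
                for f, v in feats.items():
                    weights[f] = weights.get(f, 0.0) + label * v
    return weights

def predict(weights, feats):
    return 1 if sum(weights.get(f, 0.0) * v for f, v in feats.items()) > 0 else -1

# Invented lexicon and toy training sentences for a "smoker" risk factor.
smoker_lexicon = {"smokes", "tobacco", "pack-per-day"}
data = [(featurize(t.split(), smoker_lexicon), y) for t, y in [
    ("patient smokes daily", 1), ("denies tobacco use", 1),
    ("blood pressure stable", -1), ("no acute distress", -1)]]
weights = train(data)
```

    The lexicon flag is doing the heavy lifting here, mirroring the paper's point that carefully curated lexical resources can make even simple feature sets highly effective.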

  13. Decision-making in honeybee swarms based on quality and distance information of candidate nest sites.

    PubMed

    Laomettachit, Teeraphan; Termsaithong, Teerasit; Sae-Tang, Anuwat; Duangphakdee, Orawan

    2015-01-07

    In the nest-site selection process of honeybee swarms, an individual bee performs a waggle dance to communicate information about direction, quality, and distance of a discovered site to other bees at the swarm. Initially, different groups of bees dance to represent different potential sites, but eventually the swarm usually reaches an agreement for only one site. Here, we model the nest-site selection process in honeybee swarms of Apis mellifera and show how the swarms make adaptive decisions based on a trade-off between the quality and distance to candidate nest sites. We use bifurcation analysis and stochastic simulations to reveal that the swarm's site distance preference is moderate>near>far when the swarms choose between low quality sites. However, the distance preference becomes near>moderate>far when the swarms choose between high quality sites. Our simulations also indicate that swarms with large population size prefer nearer sites and, in addition, are more adaptive at making decisions based on available information compared to swarms with smaller population size. Copyright © 2014 Elsevier Ltd. All rights reserved.
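
    The quality-distance trade-off can be illustrated with a minimal positive-feedback recruitment sketch (much simpler than the paper's model): each candidate site recruits scouts in proportion to its current support and an effective dance strength that grows with quality and shrinks with distance. The functional form and all numbers below are illustrative, not the authors' equations:

```python
def simulate(sites, steps=200, dt=0.1, uncommitted=100.0):
    """sites: list of (quality, distance) pairs. Returns final support per site."""
    support = [1.0 for _ in sites]  # initial scouts committed to each site
    for _ in range(steps):
        # Toy dance strength: increases with quality, decays with distance.
        strengths = [q / (1.0 + d) for q, d in sites]
        recruit = [s * w * uncommitted * dt * 0.01
                   for s, w in zip(support, strengths)]
        uncommitted = max(uncommitted - sum(recruit), 0.0)
        support = [s + r for s, r in zip(support, recruit)]
    return support

# Hypothetical (quality, distance) pairs: a near mediocre site vs a far
# excellent one; with this strength function the near site wins consensus.
support = simulate([(0.6, 0.5), (0.9, 3.0)])
```

    The positive feedback (support breeds recruitment) is what drives the swarm toward a single site, and the distance discount in the strength term is a crude analogue of the quality-distance trade-off the paper analyzes with bifurcation methods.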

  14. GEO Label Web Services for Dynamic and Effective Communication of Geospatial Metadata Quality

    NASA Astrophysics Data System (ADS)

    Lush, Victoria; Nüst, Daniel; Bastin, Lucy; Masó, Joan; Lumsden, Jo

    2014-05-01

    We present demonstrations of the GEO label Web services and their integration into a prototype extension of the GEOSS portal (http://scgeoviqua.sapienzaconsulting.com/web/guest/geo_home), the GMU portal (http://gis.csiss.gmu.edu/GADMFS/) and a GeoNetwork catalog application (http://uncertdata.aston.ac.uk:8080/geonetwork/srv/eng/main.home). The GEO label is designed to communicate, and facilitate interrogation of, geospatial quality information with a view to supporting efficient and effective dataset selection on the basis of quality, trustworthiness and fitness for use. The GEO label we propose was developed and evaluated according to a user-centred design (UCD) approach in order to maximise the likelihood of user acceptance once deployed. The resulting label is dynamically generated from producer metadata in ISO or FGDC format, and incorporates user feedback on dataset usage, ratings and discovered issues, in order to supply a highly informative summary of metadata completeness and quality. The label was easily incorporated into a community portal as part of the GEO Architecture Implementation Programme (AIP-6) and has been successfully integrated into a prototype extension of the GEOSS portal, as well as the popular metadata catalog and editor, GeoNetwork. The design of the GEO label was based on four user studies conducted to: (1) elicit initial user requirements; (2) investigate initial user views on the concept of a GEO label and its potential role; (3) evaluate prototype label visualizations; and (4) evaluate and validate physical GEO label prototypes. The results of these studies indicated that users and producers support the concept of a label with a drill-down interrogation facility, combining eight geospatial data informational aspects, namely: producer profile, producer comments, lineage information, standards compliance, quality information, user feedback, expert reviews, and citations information. 
These are delivered as eight facets of a wheel-like label, which are coloured according to metadata availability and are clickable to allow a user to engage with the original metadata and explore specific aspects in more detail. To support this graphical representation and allow for wider deployment architectures we have implemented two Web services, a PHP and a Java implementation, that generate GEO label representations by combining producer metadata (from standard catalogues or other published locations) with structured user feedback. Both services accept encoded URLs of publicly available metadata documents or metadata XML files as HTTP POST and GET requests and apply XPath and XSLT mappings to transform producer and feedback XML documents into clickable SVG GEO label representations. The label and services are underpinned by two XML-based quality models. The first is a producer model that extends ISO 19115 and 19157 to allow fuller citation of reference data, presentation of pixel- and dataset- level statistical quality information, and encoding of 'traceability' information on the lineage of an actual quality assessment. The second is a user quality model (realised as a feedback server and client) which allows reporting and query of ratings, usage reports, citations, comments and other domain knowledge. Both services are Open Source and are available on GitHub at https://github.com/lushv/geolabel-service and https://github.com/52North/GEO-label-java. The functionality of these services can be tested using our GEO label generation demos, available online at http://www.geolabel.net/demo.html and http://geoviqua.dev.52north.org/glbservice/index.jsf.
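    The abstract states that both services accept encoded URLs of metadata documents as HTTP GET or POST requests and return SVG label representations. As a minimal sketch of how a client might build such a GET request, assuming a hypothetical `/api/v1/svg` endpoint and `metadata`/`feedback` parameter names (the actual request shape is defined by the service code on GitHub):

```python
from urllib.parse import urlencode

# Hypothetical endpoint path; the real demo endpoints are listed in the record above.
SERVICE = "http://www.geolabel.net/api/v1/svg"

def geolabel_url(metadata_url, feedback_url=None):
    """Build a GET URL asking the service to render an SVG GEO label from a
    publicly reachable producer-metadata document (ISO 19115/19157 or FGDC),
    optionally combined with a structured user-feedback document."""
    params = {"metadata": metadata_url}
    if feedback_url:
        params["feedback"] = feedback_url
    # urlencode percent-encodes the embedded URL so it survives as a query value
    return SERVICE + "?" + urlencode(params)

url = geolabel_url("http://example.org/dataset/iso19115.xml")
print(url)
```

A POST variant would instead carry the metadata XML document itself in the request body, per the abstract's description.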

  15. A neuromathematical model of human information processing and its application to science content acquisition

    NASA Astrophysics Data System (ADS)

    Anderson, O. Roger

    The rate of information processing during science learning, and the efficiency with which the learner mobilizes relevant information in long-term memory to help transfer newly acquired information into stable long-term storage, are fundamental aspects of science content acquisition. These cognitive processes, moreover, may be substantially related in tempo and quality of organization to the efficiency of higher thought processes, such as divergent thinking and problem-solving ability, that characterize scientific thought. As a contribution to our quantitative understanding of these fundamental information processes, a mathematical model of information acquisition is presented and empirically evaluated against evidence from experimental studies of science content acquisition. Computer-based models are used to simulate variations in learning parameters and to generate the theoretical predictions to be empirically tested. Initial tests of the model's predictive accuracy show close agreement between predicted and actual mean recall scores in short-term learning tasks. Implications of the model for human information acquisition and possible future research are discussed in the context of the model's unique theoretical framework.

  16. From 'solution shop' model to 'focused factory' in hospital surgery: increasing care value and predictability.

    PubMed

    Cook, David; Thompson, Jeffrey E; Habermann, Elizabeth B; Visscher, Sue L; Dearani, Joseph A; Roger, Veronique L; Borah, Bijan J

    2014-05-01

    The full-service US hospital has been described organizationally as a "solution shop," in which medical problems are assumed to be unstructured and to require expert physicians to determine each course of care. If universally applied, this model contributes to unwarranted variation in care, which leads to lower quality and higher costs. We purposely disrupted the adult cardiac surgical practice that we led at Mayo Clinic, in Rochester, Minnesota, by creating a "focused factory" model (characterized by a uniform approach to delivering a limited set of high-quality products) within the practice's solution shop. Key elements of implementing the new model were mapping the care process, segmenting the patient population, using information technology to communicate clearly defined expectations, and empowering nonphysician providers at the bedside. Using a set of criteria, we determined that the focused-factory model was appropriate for 67 percent of cardiac surgical patients. We found that implementation of the model reduced resource use, length-of-stay, and cost. Variation was markedly reduced, and outcomes were improved. Assigning patients to different care models increases care value and the predictability of care process, outcomes, and costs while preserving (in a lesser clinical footprint) the strengths of the solution shop. We conclude that creating a focused-factory model within a solution shop, by applying industrial engineering principles and health information technology tools and changing the model of work, is very effective in both improving quality and reducing costs.

  17. Real-time dissemination of air quality information using data streams and Web technologies: linking air quality to health risks in urban areas.

    PubMed

    Davila, Silvije; Ilić, Jadranka Pečar; Bešlić, Ivan

    2015-06-01

    This article presents a new, original application of modern information and communication technology to provide effective real-time dissemination of air quality information and related health risks to the general public. Our online subsystem for urban real-time air quality monitoring is a crucial component of a more comprehensive integrated information system developed by the Institute for Medical Research and Occupational Health. It relies on a StreamInsight data stream management system and a service-oriented architecture to process data streamed from seven monitoring stations across Zagreb. Monitored parameters include gases (NO, NO2, CO, O3, H2S, SO2, benzene, NH3), particulate matter (PM10 and PM2.5), and meteorological data (wind speed and direction, temperature and pressure). Streamed data are processed in real time using complex continuous queries. They first go through automated validation; then an hourly air quality index is calculated for every station and a report is sent to the Croatian Environment Agency. If parameter values exceed the corresponding regulation limits for three consecutive hours, the web service generates an alert for population groups at risk. Coupled with the Common Air Quality Index model, our web application brings air pollution information closer to the general population and raises awareness of environmental and health issues. We intend to extend the service soon with a mobile application that is currently under development.
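    The alert rule described in this record, firing when a parameter exceeds its regulation limit for three consecutive hours, can be re-stated as a simple batch check. This is only an illustrative sketch of the rule's logic, not the StreamInsight continuous query the system actually runs:

```python
def exceedance_alerts(hourly_values, limit, run_length=3):
    """Yield the hour indices at which an alert should fire: the regulation
    limit has been exceeded for `run_length` consecutive hourly values."""
    run = 0
    for hour, value in enumerate(hourly_values):
        run = run + 1 if value > limit else 0  # extend or reset the run
        if run >= run_length:
            yield hour

# Toy hourly NO2 series (units arbitrary) against a limit of 200:
no2 = [180, 210, 220, 230, 150, 240, 250, 260, 270]
print(list(exceedance_alerts(no2, limit=200)))  # → [3, 7, 8]
```

In a streaming setting the same logic would run incrementally over a sliding window per station and per parameter, with alerts continuing each hour while the exceedance persists.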

  18. Results of a modeling workshop concerning economic and environmental trends and concomitant resource management issues in the Mobile Bay area

    USGS Publications Warehouse

    Hamilton, David B.; Andrews, Austin K.; Auble, Gregor T.; Ellison, Richard A.; Johnson, Richard A.; Roelle, James E.; Staley, Michael J.

    1982-01-01

    During the past decade, the southern regions of the U.S. have experienced rapid change which is expected to continue into the foreseeable future. Growth in population, industry, and resource development has been attributed to a variety of advantages such as an abundant and inexpensive labor force, a mild climate, and the availability of energy, water, land, and other natural resources. While this growth has many benefits for the region, it also creates the potential for increased air, water, and solid waste pollution, and modification of natural habitats. A workshop was convened to consider the Mobile Bay area as a site-specific case of growth and its environmental consequences in the southern region. The objectives of the modeling workshop were to: (1) identify major factors of economic development as they relate to growth in the area over the immediate and longer term; (2) identify major environmental and resource management issues associated with this expected growth; and (3) identify and characterize the complex interrelationships among economic and environmental factors. This report summarizes the activities and results of a modeling workshop concerning economic growth and concomitant resource management issues in the Mobile Bay area. The workshop was organized around construction of a simulation model representing the relationships between a series of actions and indicators identified by participants. The workshop model had five major components. An Industry Submodel generated scenarios of growth in several industrial and transportation sectors. A Human Population/Economy Submodel calculated human population and economic variables in response to employment opportunities. A Land Use/Air Quality Submodel tabulated changes in land use, shoreline use, and air quality. 
    A Water Submodel calculated indicators of water quality and quantity for fresh surface water, ground water, and Mobile Bay based on discharge information provided by the Industry and Human Population/Economy Submodels. Finally, a Fish Submodel calculated indicators of habitat quality for finfish and shellfish, utilizing information on water quality and wetlands acreage. The workshop was successful in identifying many of the critical interrelations between components of the Mobile area system. Not all of those interactions, such as the feedback of air quality as a limitation on development, could be incorporated into the workshop model because of the model's broad spatial scale and because of uncertainties or data gaps. Thus, the value of the modeling workshop was in the areas outlined below, rather than in the predictive power of the initial model developed at the workshop. First, participants developed a holistic perspective on the interactions which will determine future economic and environmental trends within the Mobile Bay area. Potential environmental consequences and limitations to growth identified at the workshop included: shoreline and water access; water quality of Mobile Bay; finfish and shellfish habitat quality with respect to dissolved oxygen and coliforms; air quality; and acreage of critical wetland habitat. Second, the model's requirements for specific, quantitative information stimulated supporting analyses, such as economic input-output calculations, which provide additional insight into the Mobile Bay area system. Third, the perspective of the Mobile area as an interacting system was developed in an open, cooperative forum which may provide a foundation for conflict resolution based on common understanding. Finally, the identification of model limitations and uncertainties should be useful in guiding the efficient allocation of future research effort.

  19. Information system of forest growth and productivity by site quality type and elements of forest

    NASA Astrophysics Data System (ADS)

    Khlyustov, V.

    2012-04-01

    V.K. Khlustov (Head of the Forestry Department, Russian State Agrarian University named after K.A. Timiryazev; doctor of agricultural sciences, professor). The efficiency of forest management can be improved substantially by developing and introducing fundamentally new models of forest growth and productivity dynamics based on regionalized, site-specific parameters. An innovative information system was therefore developed. It describes the current state of, and forecasts, forest stand parameters: growth, structure, and commercial and biological productivity, depending on the site quality type. In contrast to existing yield tables, the new system has an environmental basis: the site quality type. The information system contains a set of multivariate statistical models and can operate at the level of individual trees or at the stand level. The system provides graphical visualization as well as export of simulation results. The system can produce a detailed description of any forest stand from five initial indicators: site quality type, site index, stocking, composition, and tree age by elements of the forest. The model outputs the following parameters: average diameter and height, top height, number of trees, basal area, growing stock (total; commercial, with distribution by size; firewood; and residuals), and live biomass (stem, bark, branches, foliage). The system also provides the distribution of the above stand parameters by tree diameter class. To predict future stand dynamics, the system additionally requires only a time horizon; the full set of forest parameters mentioned above is then provided. The most conservative initial parameters (site quality type and site index) can be stored as georeferenced polygons.
    In this case the system needs only three dynamic initial parameters (stocking, composition, and age) to simulate forest parameters and their dynamics. The system can replace traditional processing of forest inventory field data, providing users with detailed information on the current state of the forest together with predictions. Implementing the proposed system in combination with high-resolution remote sensing can significantly increase the quality of forest inventory while reducing its cost. The system is a contribution to site-oriented forest management. The system is registered in the Russian State Register of Computer Programs, 12.07.2011, No. 2011615418.

  20. Puget Sound Dissolved Oxygen Modeling Study: Development of an Intermediate Scale Water Quality Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khangaonkar, Tarang; Sackmann, Brandon S.; Long, Wen

    2012-10-01

    The Salish Sea, including Puget Sound, is a large estuarine system bounded by more than seven thousand miles of complex shoreline; it consists of several subbasins and many large inlets with distinct properties of their own. Pacific Ocean water enters Puget Sound through the Strait of Juan de Fuca at depth over the Admiralty Inlet sill. Ocean water mixed with freshwater discharges from runoff, rivers, and wastewater outfalls exits Puget Sound through the brackish surface outflow layer. Nutrient pollution is considered one of the largest threats to Puget Sound. There is considerable interest in understanding the effect of nutrient loads on the water quality and ecological health of Puget Sound in particular and the Salish Sea as a whole. The Washington State Department of Ecology (Ecology) contracted with Pacific Northwest National Laboratory (PNNL) to develop a coupled hydrodynamic and water quality model. The water quality model simulates algae growth, dissolved oxygen (DO), and nutrient dynamics in Puget Sound to inform potential Puget Sound-wide nutrient management strategies. Specifically, the project is expected to help determine (1) whether current and potential future nitrogen loadings from point and non-point sources are significantly impairing water quality at a large scale and (2) what level of nutrient reductions is necessary to reduce or control human impacts to DO levels in sensitive areas. The project did not include any additional data collection but instead relied on currently available information. This report describes the model development effort conducted during the period 2009 to 2012 under a U.S. Environmental Protection Agency (EPA) cooperative agreement with PNNL, Ecology, and the University of Washington awarded under the National Estuary Program.

  1. Improving the quality of mass produced maps

    USGS Publications Warehouse

    Simley, J.

    2001-01-01

    Quality is critical in cartography because key decisions are often made based on the information the map communicates. The mass production of digital cartographic information to support geographic information science has now added a new dimension to the problem of cartographic quality, as problems once limited to small volumes can now proliferate in mass production programs. These problems can also affect the economics of map production by diverting a sizeable portion of production cost to pay for rework on maps with poor quality. Such problems are common to general industry; in response, the quality engineering profession has developed a number of successful methods to overcome them. Two important methods are the reduction of error through statistical analysis and addressing the quality environment in which people work. Once initial and obvious quality problems have been solved, outside influences periodically appear that cause adverse variations in quality and consequently increase production costs. Such errors can be difficult to detect before the customer is affected. However, a number of statistical techniques can be employed to detect variation so that the problem is eliminated before significant damage is caused. Additionally, the environment in which the workforce operates must be conducive to quality. Managers have a powerful responsibility to create this environment. Two sets of guidelines, known as Deming's Fourteen Points and ISO-9000, provide models for this environment.

  2. Multi-level multi-task learning for modeling cross-scale interactions in nested geospatial data

    USGS Publications Warehouse

    Yuan, Shuai; Zhou, Jiayu; Tan, Pang-Ning; Fergus, Emi; Wagner, Tyler; Sorrano, Patricia

    2017-01-01

    Predictive modeling of nested geospatial data is a challenging problem as the models must take into account potential interactions among variables defined at different spatial scales. These cross-scale interactions, as they are commonly known, are particularly important to understand relationships among ecological properties at macroscales. In this paper, we present a novel, multi-level multi-task learning framework for modeling nested geospatial data in the lake ecology domain. Specifically, we consider region-specific models to predict lake water quality from multi-scaled factors. Our framework enables distinct models to be developed for each region using both its local and regional information. The framework also allows information to be shared among the region-specific models through their common set of latent factors. Such information sharing helps to create more robust models especially for regions with limited or no training data. In addition, the framework can automatically determine cross-scale interactions between the regional variables and the local variables that are nested within them. Our experimental results show that the proposed framework outperforms all the baseline methods in at least 64% of the regions for 3 out of 4 lake water quality datasets evaluated in this study. Furthermore, the latent factors can be clustered to obtain a new set of regions that is more aligned with the response variables than the original regions that were defined a priori from the ecology domain.
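    The weight-sharing idea behind such a multi-level framework can be pictured with a minimal sketch: each region's linear model is a combination of latent factors shared across all regions, which is what lets data-poor regions borrow strength from the others. The factor matrices here are random placeholders, not learned as in the paper, and the cross-scale interaction terms are omitted:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: 5 regions, 8 local (lake-level) features, 2 shared latent factors.
n_regions, n_features, n_latent = 5, 8, 2

U = rng.normal(size=(n_features, n_latent))  # latent factors shared across regions
V = rng.normal(size=(n_latent, n_regions))   # region-specific factor loadings

W = U @ V  # column r is the effective weight vector of region r's model

def predict(x_local, region):
    """Per-region linear prediction with weights tied through shared factors."""
    return x_local @ W[:, region]

x = rng.normal(size=n_features)
print(predict(x, region=3))
```

In the actual framework, U and V would be estimated jointly from all regions' training data with regularization, and clustering the columns of V is what yields the data-driven regionalization the abstract describes.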

  3. Understanding the Quality Factors That Influence the Continuance Intention of Students toward Participation in MOOCs

    ERIC Educational Resources Information Center

    Yang, Ming; Shao, Zhen; Liu, Qian; Liu, Chuiyi

    2017-01-01

    The massive open online course (MOOC) is emerging as the new paradigm for modern education. The success of MOOCs depends on learners' continued usage. Drawing upon the information systems success model (IS success model) and technology acceptance model, a theoretical model for studying learners' continuance intentions toward participation in MOOCs…

  4. Approach for environmental baseline water sampling

    USGS Publications Warehouse

    Smith, K.S.

    2011-01-01

    Samples collected during the exploration phase of mining represent baseline conditions at the site. As such, they can be very important in forecasting potential environmental impacts should mining proceed, and can become measurements against which future changes are compared. Constituents in stream water draining mined and mineralized areas tend to be geochemically, spatially, and temporally variable, which presents challenges in collecting both exploration and baseline water-quality samples. Because short-term (daily) variations can complicate long-term trends, it is important to consider recent findings concerning geochemical variability of stream-water constituents at short-term timescales in designing sampling plans. Also, adequate water-quality information is key to forecasting potential ecological impacts from mining. Therefore, it is useful to collect baseline water samples adequate for geochemical and toxicological modeling. This requires complete chemical analyses of dissolved constituents that include major and minor chemical elements as well as physicochemical properties (including pH, specific conductance, dissolved oxygen) and dissolved organic carbon. Applying chemical-equilibrium and appropriate toxicological models to water-quality information leads to an understanding of the speciation, transport, sequestration, bioavailability, and aquatic toxicity of potential contaminants. Insights gained from geochemical and toxicological modeling of water-quality data can be used to design appropriate mitigation and for economic planning for future mining activities.

  5. Trend analysis of salt load and evaluation of the frequency of water-quality measurements for the Gunnison, the Colorado, and the Dolores rivers in Colorado and Utah

    USGS Publications Warehouse

    Kircher, J.E.; Dinicola, Richard S.; Middelburg, R.F.

    1984-01-01

    Monthly values were computed for water-quality constituents at four streamflow gaging stations in the Upper Colorado River basin for the determination of trends. Seasonal regression and seasonal Kendall trend analysis techniques were applied to two monthly data sets at each station site for four different time periods. A recently developed method for determining optimal water-discharge data-collection frequency was also applied to the monthly water-quality data. Trend analysis results varied with each monthly load computational method, period of record, and trend detection model used. No conclusions could be reached regarding which computational method was best to use in trend analysis. Time-period selection for analysis was found to be important with regard to intended use of the results. Seasonal Kendall procedures were found to be applicable to most data sets. Seasonal regression models were more difficult to apply and were sometimes of questionable validity; however, those results were more informative than seasonal Kendall results. The best model to use depends upon the characteristics of the data and the amount of trend information needed. The measurement-frequency optimization method had potential for application to water-quality data, but refinements are needed. (USGS)
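    The seasonal Kendall procedure referred to above sums the Mann-Kendall S statistic over like seasons, so that only values from the same month are compared across years; this avoids mistaking seasonal cycles for trend. A sketch of the statistic itself, with variance estimation and significance testing omitted:

```python
from itertools import combinations

def mann_kendall_s(series):
    """Mann-Kendall S: count of increasing pairs minus decreasing pairs,
    taken over all pairs in time order."""
    return sum((b > a) - (b < a) for a, b in combinations(series, 2))

def seasonal_kendall_s(monthly_values):
    """Sum Mann-Kendall S over each season (here: calendar month), comparing
    only like months across years. `monthly_values` is a flat list in time
    order with 12 values per year."""
    seasons = [monthly_values[m::12] for m in range(12)]
    return sum(mann_kendall_s(s) for s in seasons)

# Three years of a steadily rising constituent load (toy data):
loads = [10 + 0.5 * t for t in range(36)]
print(seasonal_kendall_s(loads))  # → 36 (every within-season pair increases)
```

With 3 years, each month contributes 3 values and hence 3 pairs; a monotone rise gives S = 3 per month, 36 in total. A significance test would compare the summed S to its null variance.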

  6. Model-based monitoring of stormwater runoff quality.

    PubMed

    Birch, Heidi; Vezzaro, Luca; Mikkelsen, Peter Steen

    2013-01-01

    Monitoring of micropollutants (MP) in stormwater is essential to evaluate the impacts of stormwater on the receiving aquatic environment. The aim of this study was to investigate how different strategies for monitoring of stormwater quality (combining a model with field sampling) affect the information obtained about MP discharged from the monitored system. A dynamic stormwater quality model was calibrated using MP data collected by automatic volume-proportional sampling and passive sampling in a storm drainage system on the outskirts of Copenhagen (Denmark) and a 10-year rain series was used to find annual average (AA) and maximum event mean concentrations. Use of this model reduced the uncertainty of predicted AA concentrations compared to a simple stochastic method based solely on data. The predicted AA concentration, obtained by using passive sampler measurements (1 month installation) for calibration of the model, resulted in the same predicted level but with narrower model prediction bounds than by using volume-proportional samples for calibration. This shows that passive sampling allows for a better exploitation of the resources allocated for stormwater quality monitoring.
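    The annual average and maximum event mean concentrations discussed above follow standard flow-weighted definitions; a sketch under those usual definitions (this is not the paper's dynamic stormwater quality model, just the summary statistics it reports):

```python
def annual_average_concentration(events):
    """Flow-weighted annual average concentration over a list of
    (pollutant_mass_kg, runoff_volume_m3) runoff events."""
    total_mass = sum(m for m, _ in events)
    total_volume = sum(v for _, v in events)
    return total_mass / total_volume  # kg/m3

# Toy events for one micropollutant: (mass discharged, runoff volume).
events = [(0.4, 2000.0), (1.2, 5000.0), (0.1, 800.0)]

aa = annual_average_concentration(events)
emcs = [m / v for m, v in events]  # event mean concentration of each event
print(aa, max(emcs))
```

In the study, the event masses and volumes come from the calibrated model driven by a 10-year rain series rather than from direct measurement, which is what lets a short passive-sampling campaign constrain long-term concentration estimates.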

  7. Evaluating Health Information Systems Using Ontologies

    PubMed Central

    Anderberg, Peter; Larsson, Tobias C; Fricker, Samuel A; Berglund, Johan

    2016-01-01

    Background There are several frameworks that attempt to address the challenges of evaluation of health information systems by offering models, methods, and guidelines about what to evaluate, how to evaluate, and how to report the evaluation results. Model-based evaluation frameworks usually suggest universally applicable evaluation aspects but do not consider case-specific aspects. On the other hand, evaluation frameworks that are case specific, by eliciting user requirements, limit their output to the evaluation aspects suggested by the users in the early phases of system development. In addition, these case-specific approaches extract different sets of evaluation aspects from each case, making it challenging to collectively compare, unify, or aggregate the evaluation of a set of heterogeneous health information systems. Objectives The aim of this paper is to find a method capable of suggesting evaluation aspects for a set of one or more health information systems—whether similar or heterogeneous—by organizing, unifying, and aggregating the quality attributes extracted from those systems and from an external evaluation framework. Methods On the basis of the available literature in semantic networks and ontologies, a method (called Unified eValuation using Ontology; UVON) was developed that can organize, unify, and aggregate the quality attributes of several health information systems into a tree-style ontology structure. The method was extended to integrate its generated ontology with the evaluation aspects suggested by model-based evaluation frameworks. An approach was developed to extract evaluation aspects from the ontology that also considers evaluation case practicalities such as the maximum number of evaluation aspects to be measured or their required degree of specificity. 
The method was applied and tested in Future Internet Social and Technological Alignment Research (FI-STAR), a project of 7 cloud-based eHealth applications that were developed and deployed across European Union countries. Results The relevance of the evaluation aspects created by the UVON method for the FI-STAR project was validated by the corresponding stakeholders of each case. These evaluation aspects were extracted from a UVON-generated ontology structure that reflects both the internally declared required quality attributes in the 7 eHealth applications of the FI-STAR project and the evaluation aspects recommended by the Model for ASsessment of Telemedicine applications (MAST) evaluation framework. The extracted evaluation aspects were used to create questionnaires (for the corresponding patients and health professionals) to evaluate each individual case and the whole of the FI-STAR project. Conclusions The UVON method can provide a relevant set of evaluation aspects for a heterogeneous set of health information systems by organizing, unifying, and aggregating the quality attributes through ontological structures. Those quality attributes can be either suggested by evaluation models or elicited from the stakeholders of those systems in the form of system requirements. The method continues to be systematic, context sensitive, and relevant across a heterogeneous set of health information systems. PMID:27311735

  8. Evaluating Health Information Systems Using Ontologies.

    PubMed

    Eivazzadeh, Shahryar; Anderberg, Peter; Larsson, Tobias C; Fricker, Samuel A; Berglund, Johan

    2016-06-16

    There are several frameworks that attempt to address the challenges of evaluation of health information systems by offering models, methods, and guidelines about what to evaluate, how to evaluate, and how to report the evaluation results. Model-based evaluation frameworks usually suggest universally applicable evaluation aspects but do not consider case-specific aspects. On the other hand, evaluation frameworks that are case specific, by eliciting user requirements, limit their output to the evaluation aspects suggested by the users in the early phases of system development. In addition, these case-specific approaches extract different sets of evaluation aspects from each case, making it challenging to collectively compare, unify, or aggregate the evaluation of a set of heterogeneous health information systems. The aim of this paper is to find a method capable of suggesting evaluation aspects for a set of one or more health information systems-whether similar or heterogeneous-by organizing, unifying, and aggregating the quality attributes extracted from those systems and from an external evaluation framework. On the basis of the available literature in semantic networks and ontologies, a method (called Unified eValuation using Ontology; UVON) was developed that can organize, unify, and aggregate the quality attributes of several health information systems into a tree-style ontology structure. The method was extended to integrate its generated ontology with the evaluation aspects suggested by model-based evaluation frameworks. An approach was developed to extract evaluation aspects from the ontology that also considers evaluation case practicalities such as the maximum number of evaluation aspects to be measured or their required degree of specificity. 
The method was applied and tested in Future Internet Social and Technological Alignment Research (FI-STAR), a project of 7 cloud-based eHealth applications that were developed and deployed across European Union countries. The relevance of the evaluation aspects created by the UVON method for the FI-STAR project was validated by the corresponding stakeholders of each case. These evaluation aspects were extracted from a UVON-generated ontology structure that reflects both the internally declared required quality attributes in the 7 eHealth applications of the FI-STAR project and the evaluation aspects recommended by the Model for ASsessment of Telemedicine applications (MAST) evaluation framework. The extracted evaluation aspects were used to create questionnaires (for the corresponding patients and health professionals) to evaluate each individual case and the whole of the FI-STAR project. The UVON method can provide a relevant set of evaluation aspects for a heterogeneous set of health information systems by organizing, unifying, and aggregating the quality attributes through ontological structures. Those quality attributes can be either suggested by evaluation models or elicited from the stakeholders of those systems in the form of system requirements. The method continues to be systematic, context sensitive, and relevant across a heterogeneous set of health information systems.
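    The tree-style ontology structure described in this record can be pictured as merging quality-attribute paths (general to specific) from several systems and from an external framework into one nested tree. This is a schematic re-statement of the idea, not the published UVON algorithm, and the attribute names are illustrative:

```python
def build_ontology(attribute_paths):
    """Merge quality-attribute paths into one nested-dict tree; shared
    prefixes unify automatically, which is the aggregation step."""
    tree = {}
    for path in attribute_paths:
        node = tree
        for level in path:
            node = node.setdefault(level, {})  # reuse or create the branch
    return tree

paths = [
    ("quality", "usability", "learnability"),   # elicited from system A
    ("quality", "usability", "accessibility"),  # elicited from system B
    ("quality", "reliability"),                 # suggested by an external framework
]
tree = build_ontology(paths)
print(sorted(tree["quality"]["usability"]))  # → ['accessibility', 'learnability']
```

Extracting evaluation aspects then amounts to cutting this tree at a chosen depth, which matches the abstract's note about controlling the number and specificity of measured aspects.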

  9. Complex adaptive systems (CAS): an overview of key elements, characteristics and application to management theory.

    PubMed

    Ellis, Beverley; Herbert, Stuart Ian

    2011-01-01

    To identify key elements and characteristics of complex adaptive systems (CAS) relevant to implementing clinical governance, drawing on lessons from quality improvement programmes and the use of informatics in primary care. The research strategy includes a literature review to develop theoretical models of clinical governance of quality improvement in primary care organisations (PCOs) and a survey of PCOs. Complex adaptive system theories are a valuable tool to help make sense of natural phenomena, which include human responses to problem solving within the sampled PCOs. The research commenced with a survey; 76% (n = 16) of respondents preferred to support the implementation of clinical governance initiatives guided by outputs from general practice electronic health records. There was considerable variation in the way in which consultation data were captured, recorded and organised. Incentivised information sharing led to consensus on coding policies and models of data recording ahead of national contractual requirements. Informatics was acknowledged as a mechanism to link electronic health record outputs, quality improvement and resources. Investment in informatics was identified as a development priority in order to embed clinical governance principles in practice. Complex adaptive system theory usefully describes evolutionary change processes, providing insight into how the origins of quality assurance were predicated on rational reductionism and linearity. New forms of governance do not neutralise previous models, but add further dimensions to them. Clinical governance models have moved from deterministic and 'objective' factors to incorporate cultural aspects, with feedback about quality enabled by informatics. The socio-technical lessons highlighted should inform healthcare management.

  10. A framework for considering business models.

    PubMed

    Anderson, James G

    2003-01-01

    Information technology (IT) such as computerized physician order entry, computer-based decision support and alerting systems, and electronic prescribing can reduce medical errors and improve the quality of health care. However, the business value of these systems is frequently questioned. At present a number of barriers exist to realizing the potential of IT to improve quality of care. Some of these barriers are: the ineffectiveness of existing error reporting systems, low investment in IT infrastructure, legal impediments to reforms, and the difficulty in demonstrating a sufficient return on investment to justify expenditures for quality improvement. This paper provides an overview of these issues, a framework for considering business models, and examples of successful implementations of IT to improve quality of patient care.

  11. Enhancing mathematics teachers' quality through Lesson Study.

    PubMed

    Lomibao, Laila S

    2016-01-01

The efficiency and effectiveness of the learning experience depend on teacher quality; thus, enhancing teachers' quality is vital to improving students' learning outcomes. Because the usual top-down, one-shot cascading model of teachers' professional development in the Philippines has been observed to suffer considerable information dilution, and because the Southeast Asian Ministers of Education Organization called for the development of mathematics teachers' quality standards through the Southeast Asia Regional Standards for Mathematics Teachers (SEARS-MT), an intensive, ongoing professional development model should be provided to teachers. This study was undertaken to determine the impact of Lesson Study on the quality level of Bulua National High School mathematics teachers in terms of the SEARS-MT dimensions. A mixed quantitative-qualitative research design was employed. Results of the analysis revealed that Lesson Study effectively enhanced mathematics teachers' quality and promoted teachers' professional development. Teachers perceived Lesson Study as beneficial in helping them become better mathematics teachers.

  12. Competition and quality in health care markets: a differential-game approach.

    PubMed

    Brekke, Kurt R; Cellini, Roberto; Siciliani, Luigi; Straume, Odd Rune

    2010-07-01

    We investigate the effect of competition on quality in health care markets with regulated prices taking a differential game approach, in which quality is a stock variable. Using a Hotelling framework, we derive the open-loop solution (health care providers set the optimal investment plan at the initial period) and the feedback closed-loop solution (providers move investments in response to the dynamics of the states). Under the closed-loop solution competition is more intense in the sense that providers observe quality in each period and base their investment on this information. If the marginal provision cost is constant, the open-loop and closed-loop solutions coincide, and the results are similar to the ones obtained by static models. If the marginal provision cost is increasing, investment and quality are lower in the closed-loop solution (when competition is more intense). In this case, static models tend to exaggerate the positive effect of competition on quality.

  13. Evaluating agricultural nonpoint-source pollution using integrated geographic information systems and hydrologic/water quality model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tim, U.S.; Jolly, R.

    1994-01-01

Considerable progress has been made in developing physically based, distributed parameter, hydrologic/water quality (HIWQ) models for planning and control of nonpoint-source pollution. The widespread use of these models is often constrained by the excessive and time-consuming input data demands and the lack of computing efficiencies necessary for iterative simulation of alternative management strategies. Recent developments in geographic information systems (GIS) provide techniques for handling large amounts of spatial data for modeling nonpoint-source pollution problems. Because a GIS can be used to combine information from several sources to form an array of model input data and to examine any combination of spatial input/output data, it represents a highly effective tool for HIWQ modeling. This paper describes the integration of a distributed-parameter model (AGNPS) with a GIS (ARC/INFO) to examine nonpoint sources of pollution in an agricultural watershed. The ARC/INFO GIS provided the tools to generate and spatially organize the disparate data to support modeling, while the AGNPS model was used to predict several water quality variables including soil erosion and sedimentation within a watershed. The integrated system was used to evaluate the effectiveness of several alternative management strategies in reducing sediment pollution in a 417-ha watershed located in southern Iowa. The implementation of vegetative filter strips and contour buffer (grass) strips resulted in a 41 and 47% reduction in sediment yield at the watershed outlet, respectively. In addition, when the integrated system was used, the combination of the above management strategies resulted in a 71% reduction in sediment yield. In general, the study demonstrated the utility of integrating a simulation model with GIS for nonpoint-source pollution control and planning. Such techniques can help characterize the diffuse sources of pollution at the landscape level. 52 refs., 6 figs., 1 tab.

  14. Season-ahead water quality forecasts for the Schuylkill River, Pennsylvania

    NASA Astrophysics Data System (ADS)

    Block, P. J.; Leung, K.

    2013-12-01

    Anticipating and preparing for elevated water quality parameter levels in critical water sources, using weather forecasts, is not uncommon. In this study, we explore the feasibility of extending this prediction scale to a season-ahead for the Schuylkill River in Philadelphia, utilizing both statistical and dynamical prediction models, to characterize the season. This advance information has relevance for recreational activities, ecosystem health, and water treatment, as the Schuylkill provides 40% of Philadelphia's water supply. The statistical model associates large-scale climate drivers with streamflow and water quality parameter levels; numerous variables from NOAA's CFSv2 model are evaluated for the dynamical approach. A multi-model combination is also assessed. Results indicate moderately skillful prediction of average summertime total coliform and wintertime turbidity, using season-ahead oceanic and atmospheric variables, predominantly from the North Atlantic Ocean. Models predicting the number of elevated turbidity events across the wintertime season are also explored.
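The statistical approach described in this record associates large-scale, season-ahead climate drivers with later water quality. A minimal sketch of such a season-ahead regression, using NumPy and entirely synthetic data (the climate index, coefficients, and noise level are illustrative assumptions, not values from the study):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data: a season-ahead climate index (e.g. a North
# Atlantic SST anomaly) and observed seasonal-mean turbidity (arbitrary units).
climate_index = rng.normal(size=30)
turbidity = 5.0 + 1.8 * climate_index + rng.normal(scale=0.5, size=30)

# Fit a season-ahead linear regression: turbidity ~ climate index.
X = np.column_stack([np.ones_like(climate_index), climate_index])
coef, *_ = np.linalg.lstsq(X, turbidity, rcond=None)
intercept, slope = coef

# Forecast next season's mean turbidity from the current index value.
next_index = 0.7
forecast = intercept + slope * next_index
print(f"forecast: {forecast:.2f}")
```

A multi-model combination, as in the study, would average such a statistical forecast with a dynamical-model prediction.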

  15. Cumulative Impact Assessment: Approaching Environmental Capacity in Development Area Using Environmental Impact Assessment Information

    NASA Astrophysics Data System (ADS)

    Cho, N.; Lee, M. J.; Maeng, J. H.

    2017-12-01

Environmental impact assessment estimates the impact of a development at the level of an individual project and establishes a mitigation plan. Once a development is completed, its economic effects can spread to nearby areas, so further developments may follow at different time intervals. The impacts of these new developments combine with existing environmental impacts and can produce a larger overall effect. Cumulative impact assessment is therefore needed to account for the environmental capacity of the surrounding area. Cumulative impact assessments require policy tools such as environmental impact assessment information and cumulative impact estimation models. In Korea, environmental information (water quality, air quality, etc.) of a development site is measured for environmental impact assessment and monitored for a certain period (generally 5 years) after the project. In addition, by organizing this environmental information in a spatial database, the environmental impact can be expressed spatially on a regional basis and used intuitively for development site selection. A composite model combining environmental impact assessment information with remote sensing data for cumulative impact estimation can serve as a policy decision support tool, providing quantitative information for managing development areas, such as time-series effects and sprawl phenomena.

  16. Sequential Sampling Models in Cognitive Neuroscience: Advantages, Applications, and Extensions.

    PubMed

    Forstmann, B U; Ratcliff, R; Wagenmakers, E-J

    2016-01-01

    Sequential sampling models assume that people make speeded decisions by gradually accumulating noisy information until a threshold of evidence is reached. In cognitive science, one such model--the diffusion decision model--is now regularly used to decompose task performance into underlying processes such as the quality of information processing, response caution, and a priori bias. In the cognitive neurosciences, the diffusion decision model has recently been adopted as a quantitative tool to study the neural basis of decision making under time pressure. We present a selective overview of several recent applications and extensions of the diffusion decision model in the cognitive neurosciences.
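The accumulate-noisy-evidence-to-threshold idea behind the diffusion decision model can be sketched in a few lines. The parameter values and names below are illustrative assumptions for a single-drift, symmetric-boundary simulation, not a fitted model from the literature:

```python
import numpy as np

def diffusion_trial(drift, threshold=1.0, noise=1.0, dt=0.001, rng=None):
    """Accumulate noisy evidence until either decision boundary is hit.

    Returns (choice, reaction_time): choice is +1 for the upper boundary,
    -1 for the lower one. Higher drift models better information quality;
    a higher threshold models greater response caution.
    """
    rng = rng or np.random.default_rng()
    evidence, t = 0.0, 0.0
    while abs(evidence) < threshold:
        evidence += drift * dt + noise * np.sqrt(dt) * rng.normal()
        t += dt
    return (1 if evidence > 0 else -1), t

rng = np.random.default_rng(42)
trials = [diffusion_trial(drift=0.8, rng=rng) for _ in range(200)]
accuracy = np.mean([choice == 1 for choice, _ in trials])
mean_rt = np.mean([t for _, t in trials])
print(f"accuracy={accuracy:.2f}, mean RT={mean_rt:.2f}s")
```

Decomposing task performance, as described above, amounts to inverting this simulation: estimating drift, threshold, and starting-point bias from observed choices and reaction times.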

  17. Using Modeling and Simulation to Examine the Benefits of a Network Tasking Order

    DTIC Science & Technology

    2010-01-01

Without careful planning, the topologies that form can suffer from poor Quality of Service (QoS). The networks could have bottlenecks, or worse, be… For each mission type, planning considerations include: • expected communications partners; • type of data transmitted; • bandwidth required (average, burst); • quality of service.

  18. Geostatistical Prediction of Microbial Water Quality Throughout a Stream Network Using Meteorology, Land Cover, and Spatiotemporal Autocorrelation.

    PubMed

    Holcomb, David A; Messier, Kyle P; Serre, Marc L; Rowny, Jakob G; Stewart, Jill R

    2018-06-25

    Predictive modeling is promising as an inexpensive tool to assess water quality. We developed geostatistical predictive models of microbial water quality that empirically modeled spatiotemporal autocorrelation in measured fecal coliform (FC) bacteria concentrations to improve prediction. We compared five geostatistical models featuring different autocorrelation structures, fit to 676 observations from 19 locations in North Carolina's Jordan Lake watershed using meteorological and land cover predictor variables. Though stream distance metrics (with and without flow-weighting) failed to improve prediction over the Euclidean distance metric, incorporating temporal autocorrelation substantially improved prediction over the space-only models. We predicted FC throughout the stream network daily for one year, designating locations "impaired", "unimpaired", or "unassessed" if the probability of exceeding the state standard was ≥90%, ≤10%, or >10% but <90%, respectively. We could assign impairment status to more of the stream network on days any FC were measured, suggesting frequent sample-based monitoring remains necessary, though implementing spatiotemporal predictive models may reduce the number of concurrent sampling locations required to adequately assess water quality. Together, these results suggest that prioritizing sampling at different times and conditions using geographically sparse monitoring networks is adequate to build robust and informative geostatistical models of water quality impairment.
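The three-way impairment designation rule stated in this abstract is simple to express in code. The exceedance probabilities below are hypothetical model outputs, not values from the study:

```python
# A reach is "impaired" if its probability of exceeding the state standard
# is >= 0.9, "unimpaired" if <= 0.1, and "unassessed" otherwise.
def impairment_status(p_exceed):
    if p_exceed >= 0.9:
        return "impaired"
    if p_exceed <= 0.1:
        return "unimpaired"
    return "unassessed"

# Hypothetical predicted exceedance probabilities for three stream reaches.
predictions = {"reach_A": 0.95, "reach_B": 0.04, "reach_C": 0.55}
statuses = {reach: impairment_status(p) for reach, p in predictions.items()}
print(statuses)  # → {'reach_A': 'impaired', 'reach_B': 'unimpaired', 'reach_C': 'unassessed'}
```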

  19. Simulating smoke transport from wildland fires with a regional-scale air quality model: sensitivity to spatiotemporal allocation of fire emissions.

    PubMed

    Garcia-Menendez, Fernando; Hu, Yongtao; Odman, Mehmet T

    2014-09-15

    Air quality forecasts generated with chemical transport models can provide valuable information about the potential impacts of fires on pollutant levels. However, significant uncertainties are associated with fire-related emission estimates as well as their distribution on gridded modeling domains. In this study, we explore the sensitivity of fine particulate matter concentrations predicted by a regional-scale air quality model to the spatial and temporal allocation of fire emissions. The assessment was completed by simulating a fire-related smoke episode in which air quality throughout the Atlanta metropolitan area was affected on February 28, 2007. Sensitivity analyses were carried out to evaluate the significance of emission distribution among the model's vertical layers, along the horizontal plane, and into hourly inputs. Predicted PM2.5 concentrations were highly sensitive to emission injection altitude relative to planetary boundary layer height. Simulations were also responsive to the horizontal allocation of fire emissions and their distribution into single or multiple grid cells. Additionally, modeled concentrations were greatly sensitive to the temporal distribution of fire-related emissions. The analyses demonstrate that, in addition to adequate estimates of emitted mass, successfully modeling the impacts of fires on air quality depends on an accurate spatiotemporal allocation of emissions. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. Simple yet effective: Historical proximity variables improve the species distribution models for invasive giant hogweed (Heracleum mantegazzianum s.l.) in Poland

    PubMed Central

    Jarzyna, Ingeborga; Obidziński, Artur; Tokarska-Guzik, Barbara; Sotek, Zofia; Pabjanek, Piotr; Pytlarczyk, Adam; Sachajdakiewicz, Izabela

    2017-01-01

Species distribution models are scarcely applicable to invasive species because these species violate the models' assumptions. So far, few mechanistic, semi-mechanistic or statistical solutions, such as dispersal constraints or propagule limitation, have been applied. We evaluated a novel quasi-semi-mechanistic approach for regional-scale models, using historical proximity variables (HPV) representing a state of the population at a given moment in the past. Our aim was to test the effects of adding HPV sets of different minimal recentness, information capacity and total number of variables on the quality of the species distribution model for Heracleum mantegazzianum across 116,000 km² in Poland. As environmental predictors, we used fragments of 103 1×1 km, worldwide, free-access rasters from WorldGrids.org. Single and ensemble models were computed using the BIOMOD2 package 3.1.47 in the R environment 3.1.0. The addition of HPV improved the quality of single and ensemble models from poor to good and excellent. The quality was highest for the variants with HPVs based on the distance from the most recent past occurrences. It was mostly affected by the algorithm type, but all HPV traits (minimal recentness, information capacity, model type and the number of time periods) were significant determinants. The addition of HPVs improved the quality of current projections, raising the occurrence probability in regions where the species had occurred before. We conclude that HPV addition enables semi-realistic estimation of the rate of spread and can be applied to the short-term forecasting of invasive or declining species, which also break equal-dispersal probability assumptions. PMID:28926580
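A historical proximity variable of the kind described here can be computed as, for each grid cell, the distance to the nearest past occurrence record. The coordinates below are hypothetical; the study derives HPVs from mapped historical records at several minimal recentness levels:

```python
import numpy as np

# Hypothetical past occurrence records (x, y in km on a local grid).
past_occurrences = np.array([[2.0, 3.0], [7.5, 1.0], [4.0, 8.0]])

def historical_proximity(cell_xy, occurrences):
    """Euclidean distance from a cell centre to the nearest past record."""
    d = np.linalg.norm(occurrences - np.asarray(cell_xy), axis=1)
    return d.min()

# Evaluate the HPV over a toy 10 x 10 grid of 1 km cells (cell centres).
grid = [(x + 0.5, y + 0.5) for x in range(10) for y in range(10)]
hpv = np.array([historical_proximity(c, past_occurrences) for c in grid])
print(f"HPV range: {hpv.min():.2f} to {hpv.max():.2f} km")
```

The resulting raster would then enter the model alongside the environmental predictors.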

  1. Modeling and Management of Increased Urban Stormwater Runoff Using InfoSWMM Sustain in the Berkeley Neighborhood of Denver, Colorado

    NASA Astrophysics Data System (ADS)

    Panos, C.; Hogue, T. S.; McCray, J. E.

    2016-12-01

Few urban studies have evaluated the hydrologic impacts of redevelopment - for example, a rapid conversion from single- to multi-family homes - known as infill, or re-urbanization. Redevelopment poses unique stormwater challenges, as private property owners in many cities are not mandated to undertake stormwater retrofits, leading to an overall increase in stormwater quantity and decrease in quality. This research utilizes a version of the EPA's Storm Water Management Model (SWMM), InfoSWMM Sustain, to model and analyze the impacts of impervious cover change due to redevelopment on stormwater quantity and quality in Denver, Colorado, with a focus on the Berkeley Neighborhood, where imperviousness is expected to increase significantly by 2025 from its current value of 53%. We utilize flow data from multiple pressure transducers installed directly within the storm sewer network, as well as water quality data from storm and low-flow sampling, to initially calibrate InfoSWMM Sustain using September 2015 through September 2016 storm data. Model scenarios include current land cover conditions as well as future imperviousness predictions from redevelopment. The Urban Drainage and Flood Control District's Colorado Urban Hydrograph Procedure (CUHP) model is also implemented and used for calibration and comparison to the InfoSWMM stormwater model. Model simulations predicting average annual stormwater runoff for the basin will be used to inform stormwater capture for the Berkeley Neighborhood at the downstream Willis Case Golf Course, where treatment trains are being designed to provide irrigation water (a 250 ac-ft per year demand) and improved water quality for discharge to the nearby receiving waters of Clear Creek. Ultimately, study results will better inform regional stormwater capture requirements when transitioning from single- to multi-family units by providing a quantitative basis for treatment and regulation priorities.

  2. Does competition improve health care quality?

    PubMed

    Scanlon, Dennis P; Swaminathan, Shailender; Lee, Woolton; Chernew, Michael

    2008-12-01

    To identify the effect of competition on health maintenance organizations' (HMOs) quality measures. Longitudinal analysis of a 5-year panel of the Healthcare Effectiveness Data and Information Set (HEDIS) and Consumer Assessment of Health Plans Survey(R) (CAHPS) data (calendar years 1998-2002). All plans submitting data to the National Committee for Quality Assurance (NCQA) were included regardless of their decision to allow NCQA to disclose their results publicly. NCQA, Interstudy, the Area Resource File, and the Bureau of Labor Statistics. Fixed-effects models were estimated that relate HMO competition to HMO quality controlling for an unmeasured, time-invariant plan, and market traits. Results are compared with estimates from models reliant on cross-sectional variation. Estimates suggest that plan quality does not improve with increased levels of HMO competition (as measured by either the Herfindahl index or the number of HMOs). Similarly, increased HMO penetration is generally not associated with improved quality. Cross-sectional models tend to suggest an inverse relationship between competition and quality. The strategies that promote competition among HMOs in the current market setting may not lead to improved HMO quality. It is possible that price competition dominates, with purchasers and consumers preferring lower premiums at the expense of improved quality, as measured by HEDIS and CAHPS. It is also possible that the fragmentation associated with competition hinders quality improvement.
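The Herfindahl index used above as a competition measure is simply the sum of squared market shares (lower values indicate more competitive markets). The enrollment figures below are hypothetical:

```python
# Herfindahl-Hirschman index from raw enrollment counts. Shares are
# normalized first; the index is often scaled by 10,000 when shares
# are expressed as percentages.
def herfindahl(shares):
    total = sum(shares)
    return sum((s / total) ** 2 for s in shares)

enrollments = [50_000, 30_000, 20_000]   # three competing HMOs in one market
hhi = herfindahl(enrollments)
print(round(hhi, 2))  # → 0.38  (0.5² + 0.3² + 0.2²)
```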

  3. Gaps, disconnections, and discontinuities--the role of information exchange in the delivery of quality long-term care.

    PubMed

    Georgiou, Andrew; Marks, Anne; Braithwaite, Jeffrey; Westbrook, Johanna Irene

    2013-10-01

    The smart use of information and communication technologies (ICT) is widely seen as a means of enhancing the quality of aged care services. One of the barriers to ICT diffusion in aged care is the failure to cater for the complex and interdisciplinary requirements of the aged care environment. The aim of this qualitative study was to identify the layers of information exchange and communication and produce a conceptual model that can help to inform decisions related to the design, implementation, and sustainability of ICT. A qualitative study conducted in 2010 within seven Australian residential aged care facilities. It included 11 focus groups involving 47 staff and 54 individual interviews and observation sessions. The analysis of work processes identified key information exchange components related to the type of information (residential, clinical, and administrative) that is collected, stored, and communicated. This information relies on a diverse number of internal and external communication channels that are important for the organization of care. The findings highlight potential areas of communication dysfunction as a consequence of structural holes, fragmentation, or disconnections that can adversely affect the continuity and coordination of care, its safety, and quality.

  4. IS 2010 and ABET Accreditation: An Analysis of ABET-Accredited Information Systems Programs

    ERIC Educational Resources Information Center

    Saulnier, Bruce; White, Bruce

    2011-01-01

    Many strong forces are converging on information systems academic departments. Among these forces are quality considerations, accreditation, curriculum models, declining/steady student enrollments, and keeping current with respect to emerging technologies and trends. ABET, formerly the Accrediting Board for Engineering and Technology, is at…

  5. Information Technology in Education: The Critical Lack of Principled Leadership.

    ERIC Educational Resources Information Center

    Maddux, Cleborne D.

    2002-01-01

    Suggests there is a crisis in educational leadership, especially as it affects information technology. Highlights include educational leaders as managers; the commercialization of education; management strategies on campus; students as customers; quality control, online distance education, and the business model; and the future of online distance…

  6. Optimizing an estuarine water quality monitoring program through an entropy-based hierarchical spatiotemporal Bayesian framework

    NASA Astrophysics Data System (ADS)

    Alameddine, Ibrahim; Karmakar, Subhankar; Qian, Song S.; Paerl, Hans W.; Reckhow, Kenneth H.

    2013-10-01

    The total maximum daily load program aims to monitor more than 40,000 standard violations in around 20,000 impaired water bodies across the United States. Given resource limitations, future monitoring efforts have to be hedged against the uncertainties in the monitored system, while taking into account existing knowledge. In that respect, we have developed a hierarchical spatiotemporal Bayesian model that can be used to optimize an existing monitoring network by retaining stations that provide the maximum amount of information, while identifying locations that would benefit from the addition of new stations. The model assumes the water quality parameters are adequately described by a joint matrix normal distribution. The adopted approach allows for a reduction in redundancies, while emphasizing information richness rather than data richness. The developed approach incorporates the concept of entropy to account for the associated uncertainties. Three different entropy-based criteria are adopted: total system entropy, chlorophyll-a standard violation entropy, and dissolved oxygen standard violation entropy. A multiple attribute decision making framework is adopted to integrate the competing design criteria and to generate a single optimal design. The approach is implemented on the water quality monitoring system of the Neuse River Estuary in North Carolina, USA. The model results indicate that the high priority monitoring areas identified by the total system entropy and the dissolved oxygen violation entropy criteria are largely coincident. The monitoring design based on the chlorophyll-a standard violation entropy proved to be less informative, given the low probabilities of violating the water quality standard in the estuary.
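The entropy criterion can be illustrated for a Gaussian model: the differential entropy of a multivariate normal depends only on its covariance, so each station can be scored by the entropy lost when it is dropped from the network. The covariance values below are toy numbers, not estimates from the Neuse River Estuary:

```python
import numpy as np

# Toy covariance matrix of a water quality variable at four monitoring
# stations (the study uses a richer matrix normal model).
cov = np.array([
    [1.0, 0.8, 0.2, 0.1],
    [0.8, 1.0, 0.3, 0.1],
    [0.2, 0.3, 1.0, 0.4],
    [0.1, 0.1, 0.4, 1.0],
])

def gaussian_entropy(sigma):
    """Differential entropy of a multivariate normal with covariance sigma."""
    k = sigma.shape[0]
    return 0.5 * np.log((2 * np.pi * np.e) ** k * np.linalg.det(sigma))

# Score each station by the entropy lost when it is removed: stations whose
# removal costs the most information are the ones to retain; highly
# redundant stations (here, the strongly correlated pair 0 and 1) cost little.
full = gaussian_entropy(cov)
loss = {}
for i in range(cov.shape[0]):
    keep = [j for j in range(cov.shape[0]) if j != i]
    loss[i] = full - gaussian_entropy(cov[np.ix_(keep, keep)])
print(sorted(loss, key=loss.get, reverse=True))  # most informative first
```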

  7. Competition in the pharmaceutical industry: how do quality differences shape advertising strategies?

    PubMed

    de Frutos, Maria-Angeles; Ornaghi, Carmine; Siotis, Georges

    2013-01-01

    We present a Hotelling model of price and advertising competition between prescription drugs that differ in quality/side effects. Promotional effort results in the endogenous formation of two consumer groups: brand loyal and non-brand loyal ones. We show that advertising intensities are strategic substitutes, with the better quality drugs being the ones that are most advertised. This positive association stems from the higher rents that firms can extract from consumers whose brand loyalty is endogenously determined by promotional effort. The model's main results on advertising and pricing strategies are taken to the data. The latter consists of product level data on prices and quantities, product level advertising data, as well as the qualitative information on drug quality contained in the Orange Book compiled by the Food and Drug Administration (FDA). The empirical results provide strong support to the model's predictions. Copyright © 2012 Elsevier B.V. All rights reserved.

  8. E-nursing documentation as a tool for quality assurance.

    PubMed

    Rajkovic, Vladislav; Sustersic, Olga; Rajkovic, Uros

    2006-01-01

    The article presents the results of a project with which we describe the reengineering of nursing documentation. Documentation in nursing is an efficient tool for ensuring quality health care and consequently quality patient treatment along the whole clinical path. We have taken into account the nursing process and patient treatment based on Henderson theoretical model of nursing that consists of 14 basic living activities. The model of new documentation enables tracing, transparency, selectivity, monitoring and analyses. All these factors lead to improvements of a health system as well as to improved safety of patients and members of nursing teams. Thus the documentation was developed for three health care segments: secondary and tertiary level, dispensaries and community health care. The new quality introduced to the documentation process by information and communication technology is presented by a database model and a software prototype for managing documentation.

  9. Model of epidemic control based on quarantine and message delivery

    NASA Astrophysics Data System (ADS)

    Wang, Xingyuan; Zhao, Tianfang; Qin, Xiaomeng

    2016-09-01

The model provides two novel strategies for the preventive control of epidemic diseases. One concerns different isolation rates in the latent and invasion periods: experiments show that increasing the isolation rate in the invasion period beyond 0.5 contributes little to preventing the epidemic, whereas improving the isolation rate in the latent period is key to controlling disease spread. The other is a specific mechanism of message delivery and forwarding, which also accounts for information quality and the process of information accumulation. Macroscopically, diseases are easy to control as long as the immune messages reach a certain quality; individually, the accumulated messages confer a degree of immunity on each person. The model is evaluated on classic complex networks, such as scale-free and small-world networks, as well as on location-based social networks. Results show that the proposed measures perform well and significantly reduce the negative impact of epidemic disease.
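The asymmetry between latent-period and invasion-period isolation can be reproduced with a much simpler stand-in than the paper's network model: a compartmental SEIR-style sketch with pre-symptomatic transmission, where an isolated fraction of each compartment does not transmit. All parameter values here are illustrative assumptions, not the paper's:

```python
def attack_rate(q_latent, q_invasion, beta_e=0.5, beta_i=0.5,
                sigma=0.125, gamma=0.25, days=1000, n=10_000.0, i0=10.0):
    """Discrete-time SEIR-style sketch. A fraction q_latent of the latent
    (E) compartment and q_invasion of the invasion (I) compartment is
    isolated and does not transmit. Returns the final attack rate.
    """
    s, e, i = n - i0, 0.0, i0
    for _ in range(days):
        force = (beta_e * (1 - q_latent) * e + beta_i * (1 - q_invasion) * i) / n
        new_e = force * s       # newly exposed
        new_i = sigma * e       # latent -> invasion
        new_r = gamma * i       # invasion -> removed
        s -= new_e
        e += new_e - new_i
        i += new_i - new_r
    return (n - s) / n

# Pushing invasion-period isolation from 0.5 to 0.9 barely changes the
# outbreak, while the same push applied to latent-period isolation
# shrinks it substantially, echoing the result summarized above.
print(attack_rate(q_latent=0.2, q_invasion=0.5))
print(attack_rate(q_latent=0.2, q_invasion=0.9))
print(attack_rate(q_latent=0.8, q_invasion=0.5))
```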

  10. Effectiveness of e-learning in hospitals.

    PubMed

    Chuo, Yinghsiang; Liu, Chuangchun; Tsai, Chunghung

    2015-01-01

Because medical personnel work different shifts (i.e., three work shifts) and do not have fixed schedules, implementing timely, flexible, and rapid e-learning methods for their continuing education is imperative, and hospitals are currently focusing on developing e-learning. This study explores the key factors that influence the effectiveness of e-learning for medical personnel. It recruited medical personnel as participants and collected sample data using a questionnaire survey. The study is based on the information systems success model (IS success model), a prominent model in MIS research. It found that information quality, service quality, convenience, and learning climate influence e-learning satisfaction, which in turn influences effectiveness for medical personnel. Based on these findings, the study provides recommendations that medical institutions can use as a reference when establishing e-learning systems in the future.

  11. Development, implementation, and evaluation of the Apollo model of pediatric rehabilitation service delivery.

    PubMed

    Camden, Chantal; Swaine, Bonnie; Tétreault, Sylvie; Bergeron, Sophie; Lambert, Carole

    2013-05-01

    This article presents the experience of a rehabilitation program that undertook the challenge to reorganize its services to address accessibility issues and improve service quality. The context in which the reorganization process occurred, along with the relevant literature justifying the need for a new service delivery model, and an historical perspective on the planning; implementation; and evaluation phases of the process are described. In the planning phase, the constitution of the working committee, the data collected, and the information found in the literature are presented. Apollo, the new service delivery model, is then described along with each of its components (e.g., community, group, and individual interventions). Actions and lessons learnt during the implementation of each component are presented. We hope by sharing our experiences that we can help others make informed decisions about service reorganization to improve the quality of services provided to children with disabilities, their families, and their communities.

  12. Predicting fire effects on water quality: a perspective and future needs

    NASA Astrophysics Data System (ADS)

    Smith, Hugh; Sheridan, Gary; Nyman, Petter; Langhans, Christoph; Noske, Philip; Lane, Patrick

    2017-04-01

    Forest environments are a globally significant source of drinking water. Fire presents a credible threat to the supply of high quality water in many forested regions. The post-fire risk to water supplies depends on storm event characteristics, vegetation cover and fire-related changes in soil infiltration and erodibility modulated by landscape position. The resulting magnitude of runoff generation, erosion and constituent flux to streams and reservoirs determines the severity of water quality impacts in combination with the physical and chemical composition of the entrained material. Research to date suggests that most post-fire water quality impacts are due to large increases in the supply of particulates (fine-grained sediment and ash) and particle-associated chemical constituents. The largest water quality impacts result from high magnitude erosion events, including debris flow processes, which typically occur in response to short duration, high intensity storm events during the recovery period. Most research to date focuses on impacts on water quality after fire. However, information on potential water quality impacts is required prior to fire events for risk planning. Moreover, changes in climate and forest management (e.g. prescribed burning) that affect fire regimes may alter water quality risks. Therefore, prediction requires spatial-temporal representation of fire and rainfall regimes coupled with information on fire-related changes to soil hydrologic parameters. Recent work has applied such an approach by combining a fire spread model with historic fire weather data in a Monte Carlo simulation to quantify probabilities associated with fire and storm events generating debris flows and fine sediment influx to a reservoir located in Victoria, Australia. Prediction of fire effects on water quality would benefit from further research in several areas. 
First, more work on regional-scale stochastic modelling of intersecting fire and storm events with landscape zones of erosion vulnerability is required to support quantitative evaluation of water quality risk and the effect of future changes in climate and land management. Second, we underscore previous calls for characterisation of landscape-scale domains to support regionalisation of parameter sets derived from empirical studies. Recent examples include work identifying aridity as a control of hydro-geomorphic response to fire and the use of spectral-based indices to predict spatial heterogeneity in ash loadings. Third, information on post-fire erosion from colluvial or alluvial stores is needed to determine their significance as both sediment-contaminant sinks and sources. Such sediment stores may require explicit spatial representation in risk models for some environments and sediment tracing can be used to determine their relative importance as secondary sources. Fourth, increased dating of sediment archives could provide regional datasets of fire-related erosion event frequency. Presently, the lack of such data hinders evaluation of risk models linking fire and storm events to erosion and water quality impacts.

  13. Artificial neural networks applied to flow prediction scenarios in Tomebamba River - Paute watershed, for flood and water quality control and management at City of Cuenca Ecuador

    NASA Astrophysics Data System (ADS)

    Cisneros, Felipe; Veintimilla, Jaime

    2013-04-01

    The main aim of this research is to create an Artificial Neural Network (ANN) model that predicts the flow of the Tomebamba River, both in real time and for a given day of the year. The inputs are rainfall and flow records from the stations along the river. This information is organized into scenarios, each prepared for a specific area. The data are acquired from the hydrological stations in the watershed through a real-time electronic system that supports sensors of any type or brand. The predictions perform well up to three days in advance. This research includes two ANN models: back propagation and a hybrid of back propagation and OWO-HWO, both previously tested in a preliminary study. To validate the results we use error indicators such as MSE, RMSE, EF, CD and BIAS. The results reached high levels of reliability with minimal error. These predictions are useful for flood and water quality control and management in the city of Cuenca, Ecuador.
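    As a rough illustration of the back-propagation approach the paper builds on, the sketch below trains a one-hidden-layer network on synthetic rainfall and flow data. The data, network size, and learning rate are our own assumptions for a self-contained example, not the authors' scenario setup or their OWO-HWO hybrid.

```python
import numpy as np

# Synthetic rainfall/flow data standing in for the station records.
rng = np.random.default_rng(0)
rain = rng.uniform(0.0, 50.0, size=(200, 1))                 # rainfall, mm
flow = 2.0 * rain + 5.0 + rng.normal(0.0, 1.0, rain.shape)   # flow, m3/s

x = rain / 50.0    # scale inputs to [0, 1]
y = flow / 110.0   # scale targets to roughly [0, 1]

# One hidden tanh layer, linear output, trained by plain back propagation.
W1 = rng.normal(0.0, 0.5, (1, 4)); b1 = np.zeros(4)
W2 = rng.normal(0.0, 0.5, (4, 1)); b2 = np.zeros(1)
lr = 0.2
for _ in range(6000):
    h = np.tanh(x @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - y                       # gradient of 0.5 * MSE
    dW2 = h.T @ err / len(x); db2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h ** 2)   # back-propagate through tanh
    dW1 = x.T @ dh / len(x); db1 = dh.mean(axis=0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

pred_flow = (np.tanh(x @ W1 + b1) @ W2 + b2) * 110.0
rmse = float(np.sqrt(np.mean((pred_flow - flow) ** 2)))      # RMSE, m3/s
```

    The RMSE here lands near the 1 m3/s noise floor of the synthetic data; the paper's validation instead uses indicators such as MSE, RMSE, EF, CD and BIAS against observed flows.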

  14. Informal cash payments for birth in Hungary: Are women paying to secure a known provider, respect, or quality of care?

    PubMed

    Baji, Petra; Rubashkin, Nicholas; Szebik, Imre; Stoll, Kathrin; Vedam, Saraswathi

    2017-09-01

    In Central and Eastern Europe, many women make informal cash payments to ensure continuity of provider, i.e., to ensure that a "chosen" doctor who provided their prenatal care is present for the birth. High rates of obstetric interventions and disrespectful maternity care are also common to the region. No previous study has examined the associations among informal payments, intervention rates, and quality of maternity care. We distributed an online cross-sectional survey in 2014 to a nationally representative sample of Hungarian internet-using women (N = 600) who had given birth in the last 5 years. The survey included items related to socio-demographics, type of provider, obstetric interventions, and experiences of care. Women reported if they paid informally, and how much. We built a two-part model, where a bivariate probit model was used to estimate conditional probabilities of women paying informally, and a GLM to explore the amount of payments. We calculated marginal effects of the covariates (provider choice, interventions, respectful care). Many more women (79%) with a chosen doctor paid informally (191 euros on average) compared to 17% of women without a chosen doctor (86 euros). Based on regression analysis, the chosen doctor's presence at birth was the principal determinant of payment. Intervention and procedure rates were significantly higher for women with a chosen doctor versus without (cesareans 45% vs. 33%; inductions 32% vs. 19%; episiotomy 75% vs. 62%; epidural 13% vs. 5%), but had no direct effect on payments. Half of the sample (42% with a chosen doctor, 62% without) reported some form of disrespectful care, but this did not reduce payments. Despite reporting disrespect and higher rates of interventions, women rewarded the presence of a chosen doctor with informal payments. They may be unaware of evidence-based standards, and trust that their chosen doctor provided high quality maternity care. Copyright © 2017 Elsevier Ltd. All rights reserved.
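    The two-part structure described above can be sketched in miniature: part one (the probit in the paper) gives the probability of any informal payment, part two (the GLM) gives the expected amount among payers, and their product is the unconditional expected payment. The function name is ours and the figures are simply the averages quoted in the abstract, so this illustrates the model's logic, not the authors' estimation code.

```python
# Two-part model logic: E[payment] = P(pays) * E[amount | pays].
def expected_payment(prob_paying, mean_amount_if_paid):
    return prob_paying * mean_amount_if_paid

# Averages reported in the abstract.
with_chosen_doctor = expected_payment(0.79, 191.0)     # ~150.9 euros
without_chosen_doctor = expected_payment(0.17, 86.0)   # ~14.6 euros
```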

  15. A geographic data model for representing ground water systems.

    PubMed

    Strassberg, Gil; Maidment, David R; Jones, Norm L

    2007-01-01

    The Arc Hydro ground water data model is a geographic data model for representing spatial and temporal ground water information within a geographic information system (GIS). The data model is a standardized representation of ground water systems within a spatial database that provides a public domain template for GIS users to store, document, and analyze commonly used spatial and temporal ground water data sets. This paper describes the data model framework, a simplified version of the complete ground water data model that includes two-dimensional and three-dimensional (3D) object classes for representing aquifers, wells, and borehole data, and the 3D geospatial context in which these data exist. The framework data model also includes tabular objects for representing temporal information such as water levels and water quality samples that are related to spatial features.
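    The split the framework describes, between spatial feature classes and related tabular objects, can be sketched as follows. The class and field names here are illustrative stand-ins, not the actual Arc Hydro schema.

```python
from dataclasses import dataclass

@dataclass
class Well:                 # spatial feature (simplified to a 2D point)
    well_id: str
    x: float
    y: float
    aquifer: str

@dataclass
class WaterLevel:           # tabular, time-stamped record
    well_id: str
    date: str
    level_m: float

def levels_for(well, records):
    # Join temporal records to their spatial feature via the shared key.
    return [r for r in records if r.well_id == well.well_id]

w = Well("W-1", 534200.0, 4178300.0, "ExampleAquifer")
obs = [
    WaterLevel("W-1", "2006-01-15", 12.3),
    WaterLevel("W-2", "2006-01-15", 8.1),
    WaterLevel("W-1", "2006-02-15", 12.1),
]
```

    In the real data model the join key relates time-series tables to well and borehole feature classes inside the spatial database, rather than to in-memory objects.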

  16. Concept for Future Data Services at the Long-Term Archive of WDCC combining DOIs with common PIDs

    NASA Astrophysics Data System (ADS)

    Stockhause, Martina; Weigel, Tobias; Toussaint, Frank; Höck, Heinke; Thiemann, Hannes; Lautenschlager, Michael

    2013-04-01

    The World Data Center for Climate (WDCC) hosted at the German Climate Computing Center (DKRZ) maintains a long-term archive (LTA) of climate model data as well as observational data. WDCC distinguishes between two types of LTA data: Structured data: Data output of an instrument or of a climate model run consists of numerous, highly structured individual datasets in a uniform format. Part of these data is also published on an ESGF (Earth System Grid Federation) data node. Detailed metadata is available allowing for fine-grained user-defined data access. Unstructured data: LTA data of finished scientific projects are in general unstructured and consist of datasets of different formats, different sizes, and different contents. For these data compact metadata is available as content information. The structured data is suitable for WDCC's DataCite DOI process, the project data only in exceptional cases. The DOI process includes a thorough quality control process of technical as well as scientific aspects by the publication agent and the data creator. DOIs are assigned to data collections appropriate to be cited in scientific publications, like a simulation run. The data collection is defined in agreement with the data creator. At present it is not possible to identify and cite individual datasets within this DOI data collection analogous to the citation of chapters in a book. Also missing is a compact citation regulation for a user-specified collection of data. WDCC therefore complements its existing LTA/DOI concept by Persistent Identifier (PID) assignment to datasets using Handles. In addition to data identification for internal and external use, the concept of PIDs makes it possible to define relations among PIDs. Such structural information is stored as key-value pairs directly in the handles. Thus, relations provide basic provenance or lineage information, even if part of the data, such as intermediate results, is lost.
WDCC intends to use additional PIDs on metadata entities with a relation to the data PID(s). These add background information on the data creation process (e.g. descriptions of the experiment, model, model set-up, and platform for the model run) to the data. These pieces of additional information significantly increase the re-usability of the archived model data. Other valuable information for scientific collaboration, such as quality information and annotations, could be added by the same mechanism. Apart from relations among data and metadata entities, PIDs on collections are advantageous for model data: collections allow for persistent references to single datasets or to subsets of data assigned a DOI, and data objects and additional-information objects can be consistently connected via relations (provenance, creation, and quality information for data).
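    The key-value relation mechanism described above can be sketched as follows; the identifiers and relation names are invented placeholders, not WDCC's actual handle records.

```python
# Each PID maps to key-value pairs held in its handle record; relation
# values point at other PIDs, giving basic provenance information.
handles = {
    "hdl:example/data-001": {
        "URL": "https://example.org/data-001",
        "isPartOf": "hdl:example/collection-A",
        "hasMetadata": "hdl:example/meta-001",
    },
    "hdl:example/meta-001": {
        "describes": "hdl:example/data-001",
        "type": "experiment-description",
    },
    "hdl:example/collection-A": {
        "hasPart": "hdl:example/data-001",
    },
}

def related(pid, relation):
    # Resolve a relation stored directly in the handle's key-value pairs.
    return handles.get(pid, {}).get(relation)
```

    Because the relations live in the handle records themselves, the metadata link survives even if the referenced data object is later lost.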

  17. A fingerprint classification algorithm based on combination of local and global information

    NASA Astrophysics Data System (ADS)

    Liu, Chongjin; Fu, Xiang; Bian, Junjie; Feng, Jufu

    2011-12-01

    Fingerprint recognition is one of the most important technologies in biometric identification and has been widely applied in commercial and forensic areas. Fingerprint classification, as the fundamental procedure in fingerprint recognition, can sharply decrease the number of candidates for fingerprint matching and improve the efficiency of fingerprint recognition. Most fingerprint classification algorithms are based on the number and position of singular points. Because singular point detection commonly considers only local information, these classification algorithms are sensitive to noise. In this paper, we propose a novel fingerprint classification algorithm combining the local and global information of the fingerprint. First, we use local information to detect singular points and measure their quality, considering orientation structure and image texture in adjacent areas. Then a global orientation model is adopted to measure the reliability of the singular point group. Finally, the local quality and global reliability are weighted to classify the fingerprint. Experiments demonstrate the accuracy and effectiveness of our algorithm, especially for poor quality fingerprint images.
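    The final weighting step can be sketched as below; the weight and the example scores are illustrative assumptions, not the paper's tuned values.

```python
# Weighted combination of per-point local quality and group-level global
# reliability; higher scores indicate a more trustworthy classification.
def combined_score(local_qualities, global_reliability, w_local=0.6):
    local = sum(local_qualities) / len(local_qualities)
    return w_local * local + (1 - w_local) * global_reliability

# Two singular points of mixed local quality, fairly reliable as a group.
score = combined_score([0.9, 0.5], 0.8)   # 0.6*0.7 + 0.4*0.8 = 0.74
```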

  18. “Wrong, but Useful”: Negotiating Uncertainty in Infectious Disease Modelling

    PubMed Central

    Christley, Robert M.; Mort, Maggie; Wynne, Brian; Wastling, Jonathan M.; Heathwaite, A. Louise; Pickup, Roger; Austin, Zoë; Latham, Sophia M.

    2013-01-01

    For infectious disease dynamical models to inform policy for the containment of infectious diseases, the models must be able to predict; however, it is well recognised that such prediction will never be perfect. Nevertheless, the consensus is that although models are uncertain, some may yet inform effective action. This assumes that the quality of a model can be ascertained in order to sufficiently evaluate model uncertainties, and to decide whether or not, or in what ways or under what conditions, the model should be ‘used’. We examined uncertainty in modelling, utilising a range of data: interviews with scientists, policy-makers and advisors, and analysis of policy documents, scientific publications and reports of major inquiries into key livestock epidemics. We show that the discourse of uncertainty in infectious disease models is multi-layered, flexible, contingent, embedded in context and plays a critical role in negotiating model credibility. We argue that the usability and stability of a model are an outcome of the negotiation that occurs within the networks and discourses surrounding it. This negotiation employs a range of discursive devices that renders uncertainty in infectious disease modelling a plastic quality that is amenable to ‘interpretive flexibility’. The utility of models in the face of uncertainty is a function of this flexibility, the negotiation this allows, and the contexts in which model outputs are framed and interpreted in the decision making process. We contend that rather than being based predominantly on beliefs about quality, the usefulness and authority of a model may at times be primarily based on its functional status within the broad social and political environment in which it acts. PMID:24146851

  19. Integrating prior information into microwave tomography part 2: Impact of errors in prior information on microwave tomography image quality.

    PubMed

    Kurrant, Douglas; Fear, Elise; Baran, Anastasia; LoVetri, Joe

    2017-12-01

    The authors have developed a method to combine a patient-specific map of tissue structure and average dielectric properties with microwave tomography. The patient-specific map is acquired with radar-based techniques and serves as prior information for microwave tomography. The impact that the degree of structural detail included in this prior information has on image quality was reported in a previous investigation. The aim of the present study is to extend this previous work by identifying and quantifying the impact that errors in the prior information have on image quality, including the reconstruction of internal structures and lesions embedded in fibroglandular tissue. This study also extends the work of others reported in the literature by emulating a clinical setting with a set of experiments that incorporate heterogeneity into both the breast interior and glandular region, as well as prior information related to both fat and glandular structures. Patient-specific structural information is acquired using radar-based methods that form a regional map of the breast. Errors are introduced to create a discrepancy in the geometry and electrical properties between the regional map and the model used to generate the data. This permits the impact that errors in the prior information have on image quality to be evaluated. Image quality is quantitatively assessed by measuring the ability of the algorithm to reconstruct both internal structures and lesions embedded in fibroglandular tissue. The study is conducted using both 2D and 3D numerical breast models constructed from MRI scans. The reconstruction results demonstrate robustness of the method relative to errors in the dielectric properties of the background regional map, and to misalignment errors. These errors do not significantly influence the reconstruction accuracy of the underlying structures, or the ability of the algorithm to reconstruct malignant tissue.
Although misalignment errors do not significantly impact the quality of the reconstructed fat and glandular structures for the 3D scenarios, the dielectric properties are reconstructed less accurately within the glandular structure for these cases relative to the 2D cases. However, general agreement between the 2D and 3D results was found. A key contribution of this paper is the detailed analysis of the impact of prior information errors on the reconstruction accuracy and ability to detect tumors. The results support the utility of acquiring patient-specific information with radar-based techniques and incorporating this information into MWT. The method is robust to errors in the dielectric properties of the background regional map, and to misalignment errors. Completion of this analysis is an important step toward developing the method into a practical diagnostic tool. © 2017 American Association of Physicists in Medicine.

  20. Spiking Cortical Model Based Multimodal Medical Image Fusion by Combining Entropy Information with Weber Local Descriptor

    PubMed Central

    Zhang, Xuming; Ren, Jinxia; Huang, Zhiwen; Zhu, Fei

    2016-01-01

    Multimodal medical image fusion (MIF) plays an important role in clinical diagnosis and therapy. Existing MIF methods tend to introduce artifacts, lead to loss of image details or produce low-contrast fused images. To address these problems, a novel spiking cortical model (SCM) based MIF method has been proposed in this paper. The proposed method can generate high-quality fused images using the weighting fusion strategy based on the firing times of the SCM. In the weighting fusion scheme, the weight is determined by combining the entropy information of pulse outputs of the SCM with the Weber local descriptor operating on the firing mapping images produced from the pulse outputs. The extensive experiments on multimodal medical images show that compared with the numerous state-of-the-art MIF methods, the proposed method can preserve image details very well and avoid the introduction of artifacts effectively, and thus it significantly improves the quality of fused images in terms of human vision and objective evaluation criteria such as mutual information, edge preservation index, structural similarity based metric, fusion quality index, fusion similarity metric and standard deviation. PMID:27649190
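    The weighting idea behind the fusion scheme can be sketched as follows. As a stand-in for the paper's entropy and Weber-descriptor weights derived from SCM firing times, this toy uses squared deviation from the image mean as the per-pixel "information" measure; the arrays are illustrative.

```python
import numpy as np

def info_weight(img):
    # Stand-in "information" measure: squared deviation from the image mean.
    return (img - img.mean()) ** 2

def fuse(img_a, img_b, eps=1e-6):
    # Per-pixel weighted average; pixels judged more informative dominate.
    wa = info_weight(img_a) + eps
    wb = info_weight(img_b) + eps
    return (wa * img_a + wb * img_b) / (wa + wb)

ct = np.array([[0.1, 0.9], [0.2, 0.8]])   # toy modality A
mri = np.array([[0.6, 0.3], [0.7, 0.4]])  # toy modality B
fused = fuse(ct, mri)
```

    Because the weights are positive and normalised, each fused pixel is a convex combination of the two source pixels, which is what keeps weighting-based fusion from introducing out-of-range artifacts.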

  2. Using quality indicators in anaesthesia: feeding back data to improve care.

    PubMed

    Benn, J; Arnold, G; Wei, I; Riley, C; Aleva, F

    2012-07-01

    After recent UK policy developments, considerable attention has been focused upon how clinical specialties measure and report on the quality of care delivered to patients. Defining the right indicators alone is insufficient to close the feedback loop. This narrative review aims to describe and synthesize a diverse body of research relevant to the question of how information from quality indicators can be fed back and used effectively to improve care. Anaesthesia poses certain challenges in the identification of valid outcome indicators sensitive to variations in anaesthetic care. Metrics collected during the immediate post-anaesthetic recovery period, such as patient temperature, patient-reported quality of recovery, and pain and nausea, provide potentially useful information for the anaesthetist, yet this information is not routinely fed back. Reviews of the effects of feeding back performance data to healthcare providers suggest that this may result in small to moderate positive effects upon outcomes and professional practice, with stronger effects where feedback is integrated within a broader quality improvement strategy. The dominant model for use of data within quality improvement is based upon the industrial process control approach, in which care processes are monitored continuously for process changes which are rapidly detectable for corrective action. From this review and experience of implementing these principles in practice, effective feedback from quality indicators is timely, credible, confidential, tailored to the recipient, and continuous. Considerable further work is needed to understand how information from quality indicators can be fed back in an effective way to clinicians and clinical units, in order to support revalidation and continuous improvement.
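    The industrial process control model described above can be sketched as a Shewhart-style chart: a recovery-room indicator is monitored against control limits at the baseline mean plus or minus three standard deviations, and excursions are flagged for corrective action. The readings below are invented for illustration.

```python
import statistics

# Baseline observations of a recovery-room indicator (invented figures).
baseline = [36.2, 36.4, 36.1, 36.3, 36.5, 36.2, 36.3, 36.4]  # temp, degC
mean = statistics.fmean(baseline)
sd = statistics.stdev(baseline)
upper, lower = mean + 3 * sd, mean - 3 * sd   # Shewhart control limits

# New readings are checked continuously; excursions trigger review.
new_readings = [36.3, 36.2, 35.1, 36.4]       # 35.1 falls below the limit
signals = [r for r in new_readings if not (lower <= r <= upper)]
```

    In practice such charts are only the detection half of the loop; as the review argues, the feedback to clinicians must also be timely, credible, confidential, and tailored.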

  3. Creating Quality Improvement Culture in Public Health Agencies

    PubMed Central

    Mahanna, Elizabeth; Joly, Brenda; Zelek, Michael; Riley, William; Verma, Pooja; Fisher, Jessica Solomon

    2014-01-01

    Objectives. We conducted case studies of 10 agencies that participated in early quality improvement efforts. Methods. The agencies participated in a project conducted by the National Association of County and City Health Officials (2007–2008). Case study participants included health directors and quality improvement team leaders and members. We implemented multiple qualitative analysis processes, including cross-case analysis and logic modeling. We categorized agencies according to the extent to which they had developed a quality improvement culture. Results. Agencies were conducting informal quality improvement projects (n = 4), conducting formal quality improvement projects (n = 3), or creating a quality improvement culture (n = 4). Agencies conducting formal quality improvement and creating a quality improvement culture had leadership support for quality improvement, participated in national quality improvement initiatives, had a greater number of staff trained in quality improvement, and had quality improvement teams that met regularly and held decision-making authority. Agencies conducting informal quality improvement were likely to report that accreditation is the major driver for quality improvement work. Agencies creating a quality improvement culture were more likely to have a history of evidence-based decision-making and use quality improvement to address emerging issues. Conclusions. Our findings support previous research and add the roles of national public health accreditation and emerging issues as factors in agencies’ ability to create and sustain a quality improvement culture. PMID:24228680

  4. Incorporating Learning Characteristics into Automatic Essay Scoring Models: What Individual Differences and Linguistic Features Tell Us about Writing Quality

    ERIC Educational Resources Information Center

    Crossley, Scott A.; Allen, Laura K.; Snow, Erica L.; McNamara, Danielle S.

    2016-01-01

    This study investigates a novel approach to automatically assessing essay quality that combines natural language processing approaches that assess text features with approaches that assess individual differences in writers such as demographic information, standardized test scores, and survey results. The results demonstrate that combining text…

  5. Indicators in Perspective. The Use of Quality Indicators in Vocational Education and Training. CEDEFOP Document.

    ERIC Educational Resources Information Center

    Van den Berghe, Wouter

    Indicators are used in quite different ways in vocational education and training, from control and accountability to performance and quality purposes. A classification model has been proposed in which many indicators can fit. It is based on two important dimensions of indicators: (1) the "message" relating to the information content,…

  6. Developing a Dynamic SPARROW Water Quality Decision Support System Using NASA Remotely-Sensed Products

    NASA Astrophysics Data System (ADS)

    Al-Hamdan, M. Z.; Smith, R. A.; Hoos, A.; Schwarz, G. E.; Alexander, R. B.; Crosson, W. L.; Srikishen, J.; Estes, M., Jr.; Cruise, J.; Al-Hamdan, A.; Ellenburg, W. L., II; Flores, A.; Sanford, W. E.; Zell, W.; Reitz, M.; Miller, M. P.; Journey, C. A.; Befus, K. M.; Swann, R.; Herder, T.; Sherwood, E.; Leverone, J.; Shelton, M.; Smith, E. T.; Anastasiou, C. J.; Seachrist, J.; Hughes, A.; Graves, D.

    2017-12-01

    The USGS Spatially Referenced Regression on Watershed Attributes (SPARROW) surface water quality modeling system has been widely used for long term, steady state water quality analysis. However, users have increasingly requested a dynamic version of SPARROW that can provide seasonal estimates of nutrients and suspended sediment to receiving waters. The goal of this NASA-funded project is to develop a dynamic decision support system to enhance the southeast SPARROW water quality model and finer-scale dynamic models for selected coastal watersheds through the use of remotely-sensed data and other NASA Land Information System (LIS) products. The spatial and temporal scale of satellite remote sensing products and LIS modeling data make these sources ideal for the purposes of development and operation of the dynamic SPARROW model. Remote sensing products including MODIS vegetation indices, SMAP surface soil moisture, and OMI atmospheric chemistry along with LIS-derived evapotranspiration (ET) and soil temperature and moisture products will be included in model development and operation. MODIS data will also be used to map annual land cover/land use in the study areas and in conjunction with Landsat and Sentinel to identify disturbed areas that might be sources of sediment and increased phosphorus loading through exposure of the bare soil. These data and others constitute the independent variables in a regression analysis whose dependent variables are the water quality constituents total nitrogen, total phosphorus, and suspended sediment. Remotely-sensed variables such as vegetation indices and ET can be proxies for nutrient uptake by vegetation; MODIS Leaf Area Index can indicate sources of phosphorus from vegetation; soil moisture and temperature are known to control rates of denitrification; and bare soil areas serve as sources of enhanced nutrient and sediment production. 
The enhanced SPARROW dynamic models will provide improved tools for end users to manage water quality in near real time and for the formulation of future scenarios to inform strategic planning. Time-varying SPARROW outputs will aid water managers in decision making regarding allocation of resources in protecting aquatic habitats, planning for harmful algal blooms, and restoration of degraded habitats, stream segments, or lakes.
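    The regression idea can be sketched on synthetic data: a water-quality load modelled as a linear function of remotely-sensed predictors, fitted by least squares. Real SPARROW uses a nonlinear mass-balance formulation, so this shows only the fitting concept; all coefficients and data here are made up.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 120
ndvi = rng.uniform(0.1, 0.9, n)          # MODIS-style vegetation index
soil_moist = rng.uniform(0.05, 0.45, n)  # SMAP-style surface soil moisture
bare_soil = rng.uniform(0.0, 0.3, n)     # bare-soil fraction from land cover

# Synthetic "load" generated from known coefficients plus noise.
load = (4.0 - 2.5 * ndvi + 1.8 * soil_moist + 6.0 * bare_soil
        + rng.normal(0.0, 0.1, n))

# Least-squares fit recovers the coefficients from the predictors.
X = np.column_stack([np.ones(n), ndvi, soil_moist, bare_soil])
coef, *_ = np.linalg.lstsq(X, load, rcond=None)
```

    The sign pattern mirrors the reasoning in the abstract: denser vegetation reduces the load, while wetter soils and exposed bare soil increase it.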

  7. Fine-Scale Application of the coupled WRF-CMAQ System to ...

    EPA Pesticide Factsheets

    The DISCOVER-AQ project (Deriving Information on Surface conditions from Column and Vertically Resolved Observations Relevant to Air Quality), is a joint collaboration between NASA, U.S. EPA and a number of other local organizations with the goal of characterizing air quality in urban areas using satellite, aircraft, vertical profiler and ground based measurements (http://discover-aq.larc.nasa.gov). In July 2011, the DISCOVER-AQ project conducted intensive air quality measurements in the Baltimore, MD and Washington, D.C. area in the eastern U.S. To take advantage of these unique data, the Community Multiscale Air Quality (CMAQ) model, coupled with the Weather Research and Forecasting (WRF) model is used to simulate the meteorology and air quality in the same region using 12-km, 4-km and 1-km horizontal grid spacings. The goal of the modeling exercise is to demonstrate the capability of the coupled WRF-CMAQ modeling system to simulate air quality at fine grid spacings in an urban area. Development of new data assimilation techniques and the use of higher resolution input data for the WRF model have been implemented to improve the meteorological results, particularly at the 4-km and 1-km grid resolutions. In addition, a number of updates to the CMAQ model were made to enhance the capability of the modeling system to accurately represent the magnitude and spatial distribution of pollutants at fine model resolutions. Data collected during the 2011 DISCOVER-AQ campa

  8. Application and evaluation of the two-way coupled WRF ...

    EPA Pesticide Factsheets

    The DISCOVER-AQ project (Deriving Information on Surface conditions from Column and Vertically Resolved Observations Relevant to Air Quality), is a joint collaboration between NASA, U.S. EPA and a number of other local organizations with the goal of characterizing air quality in urban areas using satellite, aircraft, vertical profiler and ground based measurements (http://discover-aq.larc.nasa.gov). In July 2011, the DISCOVER-AQ project conducted intensive air quality measurements in the Baltimore, MD and Washington, D.C. area in the eastern U.S. To take advantage of these unique data, the Community Multiscale Air Quality (CMAQ) model, coupled with the Weather Research and Forecasting (WRF) model is used to simulate the meteorology and air quality in the same region using 12-km, 4-km and 1-km horizontal grid spacings. The goal of the modeling exercise is to demonstrate the capability of the coupled WRF-CMAQ modeling system to simulate air quality at fine grid spacings in an urban area. Development of new data assimilation techniques and the use of higher resolution input data for the WRF model have been implemented to improve the meteorological results, particularly at the 4-km and 1-km grid resolutions. In addition, a number of updates to the CMAQ model were made to enhance the capability of the modeling system to accurately represent the magnitude and spatial distribution of pollutants at fine model resolutions. Data collected during the 2011 DISCOVER-AQ campa

  9. Instructional Teleconferencing Models for Enhancing the Quality of Education.

    ERIC Educational Resources Information Center

    Showalter, Robert G.; Showalter, Marietta B.

    This compilation of models for the utilization of instructional teleconferencing begins with narrative information from the final report of the Inter-Institutional Instructional Teleconferencing for Indiana School Personnel Serving Handicapped Children Project, a collaborative venture undertaken by Purdue University and the Lafayette School…

  10. Development of a hydrologic connectivity dataset for SWAT assessments in the U.S.

    USDA-ARS?s Scientific Manuscript database

    Model-based water quality assessments are an important informer of conservation and environmental policy in the US. The recently completed national scale Conservation Effects Assessment Project (CEAP) is being repeated using newer data, greater resolution, and enhanced models. National assessment...

  11. Promoting Continuous Quality Improvement in Online Teaching: The META Model

    ERIC Educational Resources Information Center

    Dittmar, Eileen; McCracken, Holly

    2012-01-01

    Experienced e-learning faculty members share strategies for implementing a comprehensive postsecondary faculty development program essential to continuous improvement of instructional skills. The high-impact META Model (centered around Mentoring, Engagement, Technology, and Assessment) promotes information sharing and content creation, and fosters…

  12. Air Quality Modeling of Traffic-related Air Pollutants for the NEXUS Study

    EPA Science Inventory

    The paper presents the results of the model applications to estimate exposure metrics in support of an epidemiologic study in Detroit, Michigan. A major challenge in traffic-related air pollution exposure studies is the lack of information regarding pollutant exposure characteriz...

  13. AIR QUALITY MODELING AT COARSE-TO-FINE SCALES IN URBAN AREAS

    EPA Science Inventory

    Urban air toxics control strategies are moving towards a community based modeling approach, with an emphasis on assessing those areas that experience high air toxic concentration levels, the so-called "hot spots". This approach will require information that accurately maps and...

  14. Improving of local ozone forecasting by integrated models.

    PubMed

    Gradišar, Dejan; Grašič, Boštjan; Božnar, Marija Zlata; Mlakar, Primož; Kocijan, Juš

    2016-09-01

    This paper discusses the problem of forecasting the maximum ozone concentrations in urban microlocations, where reliable alerting of the local population when thresholds have been surpassed is necessary. To improve the forecast, a methodology of integrated models is proposed. The model is based on multilayer perceptron neural networks whose inputs are all available information from the QualeAria air-quality model, the WRF numerical weather prediction model, and on-site measurements of meteorology and air pollution. While air-quality and meteorological models cover large geographical 3-dimensional space, their local resolution is often not satisfactory. On the other hand, empirical methods have the advantage of good local forecasts. In this paper, integrated models are used for improved 1-day-ahead forecasting of the maximum hourly value of ozone within each day for representative locations in Slovenia. The WRF meteorological model is used for forecasting meteorological variables and the QualeAria air-quality model for gas concentrations. Their predictions, together with measurements from ground stations, are used as inputs to a neural network. The model validation results show that integrated models noticeably improve ozone forecasts and provide better alert systems.
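    The integrated-model idea can be sketched in miniature: a one-hidden-layer perceptron, trained by plain gradient descent on synthetic data, mapping an air-quality-model forecast, a weather variable, and a station measurement to the next day's maximum ozone. Everything here (input names, data, network size) is an illustrative assumption, not the paper's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training data: each row holds [AQ-model ozone forecast,
# forecast max temperature, yesterday's measured max ozone] -- stand-ins
# for the QualeAria, WRF, and station inputs described in the abstract.
X = rng.normal(size=(200, 3))
true_w = np.array([0.6, 0.3, 0.4])
y = np.tanh(X @ true_w) + 0.05 * rng.normal(size=200)  # target: scaled max ozone

# One-hidden-layer MLP trained with full-batch gradient descent.
H = 8
W1 = 0.5 * rng.normal(size=(3, H)); b1 = np.zeros(H)
W2 = 0.5 * rng.normal(size=H);      b2 = 0.0

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

def mse(pred, y):
    return float(np.mean((pred - y) ** 2))

lr = 0.05
_, pred0 = forward(X)
loss_start = mse(pred0, y)
for _ in range(500):
    h, pred = forward(X)
    err = pred - y                          # d(loss)/d(pred), up to a constant
    gW2 = h.T @ err / len(y)
    gb2 = err.mean()
    dh = np.outer(err, W2) * (1 - h ** 2)   # backprop through tanh
    gW1 = X.T @ dh / len(y)
    gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred1 = forward(X)
loss_end = mse(pred1, y)
print(loss_start, loss_end)
```

    Training drives the mean squared error well below its initial value, which is all the sketch is meant to show; a real integrated model would add input scaling and a held-out validation set.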

  15. Air pollution and public health: a guidance document for risk managers.

    PubMed

    Craig, Lorraine; Brook, Jeffrey R; Chiotti, Quentin; Croes, Bart; Gower, Stephanie; Hedley, Anthony; Krewski, Daniel; Krupnick, Alan; Krzyzanowski, Michal; Moran, Michael D; Pennell, William; Samet, Jonathan M; Schneider, Jurgen; Shortreed, John; Williams, Martin

    2008-01-01

    This guidance document is a reference for air quality policymakers and managers providing state-of-the-art, evidence-based information on key determinants of air quality management decisions. The document reflects the findings of five annual meetings of the NERAM (Network for Environmental Risk Assessment and Management) International Colloquium Series on Air Quality Management (2001-2006), as well as the results of supporting international research. The topics covered in the guidance document reflect critical science and policy aspects of air quality risk management including i) health effects, ii) air quality emissions, measurement and modeling, iii) air quality management interventions, and iv) clean air policy challenges and opportunities.

  16. Evaluating clinical librarian services: a systematic review.

    PubMed

    Brettle, Alison; Maden-Jenkins, Michelle; Anderson, Lucy; McNally, Rosalind; Pratchett, Tracey; Tancock, Jenny; Thornton, Debra; Webb, Anne

    2011-03-01

    Background: Previous systematic reviews have indicated limited evidence and poor quality evaluations of clinical librarian (CL) services. Rigorous evaluations should demonstrate the value of CL services, but guidance is needed before this can be achieved. Objectives: To undertake a systematic review which examines models of CL services, quality, methods and perspectives of clinical librarian service evaluations. Methods: Systematic review methodology and synthesis of evidence, undertaken collaboratively by a group of 8 librarians to develop research and critical appraisal skills. Results: There are four clear models of clinical library service provision. Clinical librarians are effective in saving health professionals time, providing relevant, useful information and high quality services. Clinical librarians have a positive effect on clinical decision making by contributing to better informed decisions, diagnosis and choice of drug or therapy. The quality of CL studies is improving, but more work is needed on reducing bias and providing evidence of specific impacts on patient care. The Critical Incident Technique as part of a mixed method approach appears to offer a useful approach to demonstrating impact. Conclusions: This systematic review provides practical guidance regarding the evaluation of CL services. It also provides updated evidence regarding the effectiveness and impact of CL services. The approach used was successful in developing research and critical appraisal skills in a group of librarians. © 2010 The authors. Health Information and Libraries Journal © 2010 Health Libraries Group.

  17. Localized real-time information on outdoor air quality at kindergartens in Oslo, Norway using low-cost sensor nodes.

    PubMed

    Castell, Nuria; Schneider, Philipp; Grossberndt, Sonja; Fredriksen, Mirjam F; Sousa-Santos, Gabriela; Vogt, Mathias; Bartonova, Alena

    2018-08-01

    In Norway, children in kindergartens spend significant time outdoors under all weather conditions, and there is thus a natural concern about the quality of outdoor air. It is well known that air pollution is associated with a wide variety of adverse health impacts for children, with greater impact on children with asthma. Especially during winter and spring, kindergartens in Oslo that are situated close to streets with busy traffic, or in areas where wood burning is used for house heating, can experience many days with bad air quality. During these periods, updated information on air quality levels can help the kindergarten teachers to plan appropriate outdoor activities and thus protect children's health. We have installed 17 low-cost air quality nodes in kindergartens in Oslo. These nodes are smaller, cheaper and less complex to use than traditional equipment. Performance evaluation shows that while they are less accurate and suffer from higher uncertainty than reference equipment, they can still provide reliable coarse information about local pollution. The main challenge when using this technology is that calibration parameters might change with time depending on the atmospheric conditions. Thus, even if the sensors are calibrated a priori, once deployed, and especially if they are deployed for a long time, it is not possible to determine if a node is over- or under-estimating the concentration levels. To enhance the data from the sensors, we employed a data fusion technique that generates a detailed air quality map by merging the sensor data with output from an urban model, making it possible to offer air quality information for any location within Oslo. We arranged a focus group with the participation of local administration, kindergarten staff and parents to understand their opinion and needs related to the air quality information that was provided to the participant kindergartens. They expressed concern about the data quality but agreed that having updated information on the air quality in the surroundings of kindergartens can help them reduce children's exposure to air pollution. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  18. The Evaluative Advantage of Novel Alternatives: An Information-Sampling Account.

    PubMed

    Le Mens, Gaël; Kareev, Yaakov; Avrahami, Judith

    2016-02-01

    New products, services, and ideas are often evaluated more favorably than similar but older ones. Although several explanations of this phenomenon have been proposed, we identify an overlooked asymmetry in information about new and old items that emerges when people seek positive experiences and learn about the qualities of (noisy) alternatives by experiencing them. The reason for the asymmetry is that people avoid rechoosing alternatives that previously led to poor outcomes; hence, additional feedback on their qualities is precluded. Negative quality estimates, even when caused by noise, thus tend to persist. This negative bias takes time to develop, and affects old alternatives more strongly than similar but newer alternatives. We analyze a simple learning model and demonstrate the process by which people would tend to evaluate a new alternative more positively than an older alternative with the same payoff distribution. The results from two experimental studies (Ns = 769 and 805) support the predictions of our model. © The Author(s) 2015.
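    The sampling asymmetry described above can be reproduced with a toy simulation: agents sample a noisy alternative, stop for good once their running estimate turns negative (a poor impression is never corrected), and are compared after short ("new") versus long ("old") exposure. The Gaussian payoffs, stopping rule, and horizon lengths are illustrative assumptions, not the authors' exact model.

```python
import random

random.seed(42)

def final_estimate(periods, mu=0.0, sigma=1.0):
    """Sample an alternative once per period, updating a running-mean
    quality estimate, but stop permanently once the estimate drops below
    zero (avoidance precludes further feedback). Return the estimate the
    agent is left with."""
    samples = []
    for _ in range(periods):
        samples.append(random.gauss(mu, sigma))
        est = sum(samples) / len(samples)
        if est < 0:          # never resample a disliked option
            return est
    return sum(samples) / len(samples)

N = 20000
# "Old" alternatives have been available for 20 periods, "new" ones for 2,
# yet both have identical payoff distributions.
old_mean = sum(final_estimate(20) for _ in range(N)) / N
new_mean = sum(final_estimate(2) for _ in range(N)) / N
print(old_mean, new_mean)
```

    Because negative impressions lock in while positive ones keep being tested against the true mean, longer exposure produces a lower average estimate, so the older alternative ends up evaluated less favorably despite the identical payoffs.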

  19. Habitat Suitability Index Models: Black-shouldered kite

    USGS Publications Warehouse

    Faanes, Craig A.; Howard, Rebecca J.

    1987-01-01

    A review and synthesis of existing information were used to develop a model for evaluating black-shouldered kite habitat quality. The model is scaled to produce an index between 0 (unsuitable habitat) and 1.0 (optimal habitat). Habitat suitability index models are designed for use with the Habitat Evaluation Procedures previously developed by the U.S. Fish and Wildlife Service. Guidelines for model application are provided.
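    A habitat suitability index of this general form can be sketched as a geometric mean of component indices, a common aggregation in USFWS HSI models; the component variables named in the comments are hypothetical, since the abstract does not list the kite model's actual variables.

```python
from math import prod

def hsi(suitability_indices):
    """Combine component suitability indices (each already scaled 0-1)
    into an overall Habitat Suitability Index via a geometric mean, so
    that any fully unsuitable component (0) forces an unsuitable overall
    score."""
    if not all(0.0 <= v <= 1.0 for v in suitability_indices):
        raise ValueError("component indices must lie in [0, 1]")
    n = len(suitability_indices)
    return prod(suitability_indices) ** (1.0 / n)

# Hypothetical kite habitat components: foraging cover, perch
# availability, nest-site quality (names are illustrative only).
print(hsi([0.8, 0.6, 0.9]))   # moderately good habitat
print(hsi([0.8, 0.0, 0.9]))   # one unsuitable component -> 0.0
```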

  20. Flight testing and frequency domain analysis for rotorcraft handling qualities characteristics

    NASA Technical Reports Server (NTRS)

    Ham, Johnnie A.; Gardner, Charles K.; Tischler, Mark B.

    1993-01-01

    A demonstration of frequency domain flight testing techniques and analyses was performed on a U.S. Army OH-58D helicopter in support of the OH-58D Airworthiness and Flight Characteristics Evaluation and the Army's development and ongoing review of Aeronautical Design Standard 33C, Handling Qualities Requirements for Military Rotorcraft. Hover and forward flight (60 knots) tests were conducted in 1 flight hour by Army experimental test pilots. Further processing of the hover data generated a complete database of velocity, angular rate, and acceleration frequency responses to control inputs. A joint effort was then undertaken by the Airworthiness Qualification Test Directorate (AQTD) and the U.S. Army Aeroflightdynamics Directorate (AFDD) to derive handling qualities information from the frequency response database. A significant amount of information could be extracted from the frequency domain database using a variety of approaches. This report documents numerous results that have been obtained from the simple frequency domain tests; in many areas, these results provide more insight into the aircraft dynamics that affect handling qualities than do traditional flight tests. The handling qualities results include ADS-33C bandwidth and phase delay calculations, vibration spectral determinations, transfer function models to examine single axis results, and a six degree of freedom fully coupled state space model. The ability of this model to accurately predict aircraft responses was verified using data from pulse inputs. This report also documents the frequency-sweep flight test technique and data analysis used to support the tests.
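    The ADS-33 bandwidth and phase-delay quantities mentioned above can be illustrated on a simple analytic frequency response: an integrator plus pure time delay standing in for an identified rotorcraft attitude response. This is a sketch under that stated assumption, not flight-test processing; for this particular model the phase delay works out analytically to half the pure delay, which the code confirms.

```python
import math

def phase(w, tau=0.1):
    """Phase (radians) of the simple response model H(jw) = e^(-jw*tau)/(jw):
    an integrator plus pure time delay, a stand-in for a frequency response
    identified from flight data."""
    return -math.pi / 2 - w * tau

def find_w_at_phase(target, tau=0.1, lo=1e-4, hi=1e4):
    """Bisect for the frequency where the (monotonically decreasing)
    phase crosses `target` radians."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if phase(mid, tau) > target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

tau = 0.1
w180 = find_w_at_phase(-math.pi, tau=tau)          # phase-crossover frequency
w_bw = find_w_at_phase(-3 * math.pi / 4, tau=tau)  # phase bandwidth (-135 deg)
# ADS-33 phase delay: extra phase lag beyond -180 deg at twice the
# crossover frequency, divided by that frequency.
tau_p = -(phase(2 * w180, tau) + math.pi) / (2 * w180)
print(w_bw, w180, tau_p)
```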

  1. Fire and Smoke Model Evaluation Experiment (FASMEE): Modeling gaps and data needs

    Treesearch

    Yongqiang Liu; Adam Kochanski; Kirk Baker; Ruddy Mell; Rodman Linn; Ronan Paugam; Jan Mandel; Aime Fournier; Mary Ann Jenkins; Scott Goodrick; Gary Achtemeier; Andrew Hudak; Matthew Dickson; Brian Potter; Craig Clements; Shawn Urbanski; Roger Ottmar; Narasimhan Larkin; Timothy Brown; Nancy French; Susan Prichard; Adam Watts; Derek McNamara

    2017-01-01

    Fire and smoke models are numerical tools for simulating fire behavior, smoke dynamics, and air quality impacts of wildland fires. Fire models are developed based on the fundamental chemistry and physics of combustion and fire spread or statistical analysis of experimental data (Sullivan 2009). They provide information on fire spread and fuel consumption for safe and...

  2. Classification of antecedents towards safety use of health information technology: A systematic review.

    PubMed

    Salahuddin, Lizawati; Ismail, Zuraini

    2015-11-01

    This paper provides a systematic review of the safety use of health information technology (IT). The first objective is to identify the antecedents towards safety use of health IT by conducting a systematic literature review (SLR). The second objective is to classify the identified antecedents based on the work system in the Systems Engineering Initiative for Patient Safety (SEIPS) model and an extension of the DeLone and McLean (D&M) information system (IS) success model. A systematic literature review was conducted of peer-reviewed scholarly publications between January 2000 and July 2014, carried out and reported according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement. Related articles were identified by searching the Science Direct, Medline, EMBASE, and CINAHL databases. Data extracted from the included studies were analysed based on the work system in the SEIPS model and the extended D&M IS success model. 55 articles delineating antecedents that influence the safety use of health IT were included in the review. Antecedents were identified and then classified into five key categories: (1) person, (2) technology, (3) tasks, (4) organization, and (5) environment. Specifically, person is attributed by competence, while technology is associated with system quality, information quality, and service quality. Tasks are attributed by task-related stressors. Organization is related to training, organizational resources, and teamwork. Lastly, environment is attributed by physical layout and noise. This review provides evidence that the antecedents for safety use of health IT originate from both social and technical aspects. However, inappropriate health IT usage potentially increases the incidence of errors and produces new safety risks. The review cautions future implementation and adoption of health IT to carefully consider the complex interactions between social and technical elements present in healthcare settings. Copyright © 2015. Published by Elsevier Ireland Ltd.

  3. An expert system for water quality modelling.

    PubMed

    Booty, W G; Lam, D C; Bobba, A G; Wong, I; Kay, D; Kerby, J P; Bowen, G S

    1992-12-01

    The RAISON-micro (Regional Analysis by Intelligent System ON a micro-computer) expert system is being used to predict the effects of mine effluents on receiving waters in Ontario. The potential of this system to assist regulatory agencies and mining industries in defining more acceptable effluent limits was shown in an initial study. The system has been further developed so that it now helps the model user choose the most appropriate model for a particular application from a hierarchy of models. The system currently contains seven models which range from steady-state to time-dependent models, for both conservative and nonconservative substances in rivers and lakes. The menu-driven expert system prompts the model user for information such as the nature of the receiving water system, the type of effluent being considered, and the range of background data available for use as input to the models. The system can also be used to determine the nature of the environmental conditions at the site which are not available in the textual information database, such as the components of river flow. Applications of the water quality expert system are presented for representative mine sites in the Timmins area of Ontario.
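    The model-selection dialogue such an expert system conducts can be sketched as a small rule table keyed on the user's answers; the model names below are invented for illustration, since the abstract does not enumerate the actual RAISON-micro hierarchy of seven models.

```python
def choose_model(time_dependent, conservative, water_body):
    """Pick a water-quality model from a small hierarchy based on the
    answers an expert system would elicit from the user. The rule
    structure is illustrative, not the real RAISON-micro rule base."""
    rules = {
        (False, True,  "river"): "steady-state river dilution model",
        (False, False, "river"): "steady-state river decay model",
        (True,  True,  "river"): "time-dependent river transport model",
        (True,  False, "river"): "time-dependent river transport-decay model",
        (False, True,  "lake"):  "steady-state lake mass-balance model",
        (True,  True,  "lake"):  "time-dependent lake mixing model",
        (True,  False, "lake"):  "time-dependent lake mixing-decay model",
    }
    key = (time_dependent, conservative, water_body)
    return rules.get(key, "no model available; consult a modelling expert")

print(choose_model(time_dependent=False, conservative=True, water_body="river"))
```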

  4. Creating Physical 3D Stereolithograph Models of Brain and Skull

    PubMed Central

    Kelley, Daniel J.; Farhoud, Mohammed; Meyerand, M. Elizabeth; Nelson, David L.; Ramirez, Lincoln F.; Dempsey, Robert J.; Wolf, Alan J.; Alexander, Andrew L.; Davidson, Richard J.

    2007-01-01

    The human brain and skull are three dimensional (3D) anatomical structures with complex surfaces. However, medical images are often two dimensional (2D) and provide incomplete visualization of structural morphology. To overcome this loss in dimension, we developed and validated a freely available, semi-automated pathway to build 3D virtual reality (VR) and hand-held, stereolithograph models. To evaluate whether surface visualization in 3D was more informative than in 2D, undergraduate students (n = 50) used the Gillespie scale to rate 3D VR and physical models of both a living patient-volunteer's brain and the skull of Phineas Gage, a historically famous railroad worker whose misfortune with a projectile tamping iron provided the first evidence of a structure-function relationship in brain. Using our processing pathway, we successfully fabricated human brain and skull replicas and validated that the stereolithograph model preserved the scale of the VR model. Based on the Gillespie ratings, students indicated that the biological utility and quality of visual information at the surface of VR and stereolithograph models were greater than the 2D images from which they were derived. The method we developed is useful to create VR and stereolithograph 3D models from medical images and can be used to model hard or soft tissue in living or preserved specimens. Compared to 2D images, VR and stereolithograph models provide an extra dimension that enhances both the quality of visual information and utility of surface visualization in neuroscience and medicine. PMID:17971879

  5. A priori discretization error metrics for distributed hydrologic modeling applications

    NASA Astrophysics Data System (ADS)

    Liu, Hongli; Tolson, Bryan A.; Craig, James R.; Shafii, Mahyar

    2016-12-01

    Watershed spatial discretization is an important step in developing a distributed hydrologic model. A key difficulty in the spatial discretization process is maintaining a balance between the aggregation-induced information loss and the increase in computational burden caused by the inclusion of additional computational units. Objective identification of an appropriate discretization scheme still remains a challenge, in part because of the lack of quantitative measures for assessing discretization quality, particularly prior to simulation. This study proposes a priori discretization error metrics to quantify the information loss of any candidate discretization scheme without having to run and calibrate a hydrologic model. These error metrics are applicable to multi-variable and multi-site discretization evaluation and provide directly interpretable information to the hydrologic modeler about discretization quality. The first metric, a subbasin error metric, quantifies the routing information loss from discretization, and the second, a hydrological response unit (HRU) error metric, improves upon existing a priori metrics by quantifying the information loss due to changes in land cover or soil type property aggregation. The metrics are straightforward to understand and easy to recode. Informed by the error metrics, a two-step discretization decision-making approach is proposed with the advantage of reducing extreme errors and meeting the user-specified discretization error targets. The metrics and decision-making approach are applied to the discretization of the Grand River watershed in Ontario, Canada. Results show that information loss increases as discretization gets coarser. Moreover, results help to explain the modeling difficulties associated with smaller upstream subbasins since the worst discretization errors and highest error variability appear in smaller upstream areas instead of larger downstream drainage areas. 
Hydrologic modeling experiments under candidate discretization schemes validate the strong correlation between the proposed discretization error metrics and hydrologic simulation responses. Discretization decision-making results show that the common and convenient approach of making uniform discretization decisions across the watershed performs worse than the proposed non-uniform discretization approach in terms of preserving spatial heterogeneity under the same computational cost.
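    A generic version of an aggregation-induced information-loss metric, in the spirit of (but not identical to) the paper's a priori error metrics, can be computed as the area-weighted absolute deviation lost when heterogeneous cells are collapsed into a single computational unit:

```python
def aggregation_error(cells):
    """Area-weighted information loss when a set of cells is merged into
    one computational unit that carries only the area-weighted mean of an
    attribute (e.g. slope or a soil property). `cells` is a list of
    (area, value) pairs. A generic sketch, not the paper's exact metric."""
    total_area = sum(a for a, _ in cells)
    mean = sum(a * v for a, v in cells) / total_area
    return sum(a * abs(v - mean) for a, v in cells) / total_area

fine   = [(1.0, 0.30), (1.0, 0.32), (1.0, 0.31)]   # nearly homogeneous cells
coarse = [(1.0, 0.05), (1.0, 0.60), (1.0, 0.35)]   # heterogeneous cells
print(aggregation_error(fine), aggregation_error(coarse))
```

    Homogeneous cells lose almost nothing under aggregation, while heterogeneous cells incur a large loss, which is exactly the trade-off the proposed metrics let a modeler assess before any simulation is run.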

  6. Water quality management using statistical analysis and time-series prediction model

    NASA Astrophysics Data System (ADS)

    Parmar, Kulwinder Singh; Bhardwaj, Rashmi

    2014-12-01

    This paper deals with water quality management using statistical analysis and a time-series prediction model. The monthly variation of water quality standards has been used to compare the statistical mean, median, mode, standard deviation, kurtosis, skewness, and coefficient of variation at the Yamuna River. The model was validated using R-squared, root mean square error, mean absolute percentage error, maximum absolute percentage error, mean absolute error, maximum absolute error, normalized Bayesian information criterion, Ljung-Box analysis, predicted values and confidence limits. Using an auto-regressive integrated moving average (ARIMA) model, future values of the water quality parameters have been estimated. It is observed that the predictive model is useful at 95 % confidence limits and that the curve is platykurtic for potential of hydrogen (pH), free ammonia, total Kjeldahl nitrogen, dissolved oxygen and water temperature (WT), and leptokurtic for chemical oxygen demand and biochemical oxygen demand. Also, it is observed that the predicted series is close to the original series, which provides a good fit. All parameters except pH and WT cross the prescribed limits of the World Health Organization/United States Environmental Protection Agency, and thus the water is not fit for drinking, agricultural or industrial use.
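    The descriptive statistics used in the comparison (mean, standard deviation, skewness, excess kurtosis) can be computed directly from a monthly series; the pH values below are illustrative numbers, not Yamuna River data.

```python
def moments(xs):
    """Descriptive statistics of a water-quality series: mean, standard
    deviation, skewness, and excess kurtosis (population formulas).
    Excess kurtosis < 0 indicates a platykurtic distribution, > 0
    leptokurtic."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    sd = var ** 0.5
    skew = sum((x - mean) ** 3 for x in xs) / (n * sd ** 3)
    kurt = sum((x - mean) ** 4 for x in xs) / (n * sd ** 4) - 3.0
    return mean, sd, skew, kurt

# Hypothetical monthly pH readings (illustrative, not observed data).
ph = [7.1, 7.3, 7.2, 7.4, 7.0, 7.2, 7.3, 7.1, 7.2, 7.5, 7.2, 7.1]
print(moments(ph))
```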

  7. Mapping carcass and meat quality QTL on Sus Scrofa chromosome 2 in commercial finishing pigs

    PubMed Central

    Heuven, Henri CM; van Wijk, Rik HJ; Dibbits, Bert; van Kampen, Tony A; Knol, Egbert F; Bovenhuis, Henk

    2009-01-01

    Quantitative trait loci (QTL) affecting carcass and meat quality located on SSC2 were identified using variance component methods. QTL for a large number of meat and carcass quality traits were detected in a commercial crossbred population: 1855 pigs sired by 17 boars from a synthetic line, which were homozygous (A/A) for IGF2. Using combined linkage and linkage disequilibrium mapping (LDLA), several QTL significantly affecting loin muscle mass, ham weight and ham muscles (outer ham and knuckle ham) and meat quality traits, such as Minolta-L* and -b*, ultimate pH and Japanese colour score were detected. These results agreed well with previous QTL studies involving SSC2. Since our study was carried out on crossbreds, different QTL may be segregating in the parental lines. To address this question, we compared models with a single QTL-variance component with models allowing for separate sire and dam QTL-variance components. The same QTL were identified using a single QTL-variance component model compared to a model allowing for separate variances, with minor differences with respect to QTL location. However, the variance component method made it possible to detect QTL segregating in the paternal line (e.g. HAMB), the maternal lines (e.g. Ham) or in both (e.g. pHu). Combining association and linkage information among haplotypes slightly improved the significance of the QTL compared to an analysis using linkage information only. PMID:19284675

  8. On Obtaining Estimates of the Fraction of Missing Information from Full Information Maximum Likelihood

    ERIC Educational Resources Information Center

    Savalei, Victoria; Rhemtulla, Mijke

    2012-01-01

    Fraction of missing information λ_j is a useful measure of the impact of missing data on the quality of estimation of a particular parameter. This measure can be computed for all parameters in the model, and it communicates the relative loss of efficiency in the estimation of a particular parameter due to missing data. It has…
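    As a concrete illustration, a fraction of missing information can be approximated from multiply imputed estimates with Rubin's rules, λ ≈ (1 + 1/m)B / T. Note that the abstract concerns estimating λ_j from full information maximum likelihood, so this multiple-imputation sketch is an analogous computation under stated assumptions, not the paper's method; the numbers are invented.

```python
def fraction_missing_information(estimates, variances):
    """Rubin's-rules approximation of the fraction of missing information
    for one parameter across m multiply imputed datasets. `estimates` are
    the per-imputation point estimates, `variances` the per-imputation
    sampling variances."""
    m = len(estimates)
    qbar = sum(estimates) / m                              # pooled estimate
    w = sum(variances) / m                                 # within-imputation variance
    b = sum((q - qbar) ** 2 for q in estimates) / (m - 1)  # between-imputation variance
    t = w + (1 + 1 / m) * b                                # total variance
    return (1 + 1 / m) * b / t

# Illustrative numbers: five imputations of one regression coefficient.
lam = fraction_missing_information(
    estimates=[0.52, 0.48, 0.55, 0.50, 0.45],
    variances=[0.010, 0.011, 0.009, 0.010, 0.010],
)
print(lam)
```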

  9. Multi-pollutant surface objective analyses and mapping of air quality health index over North America.

    PubMed

    Robichaud, Alain; Ménard, Richard; Zaïtseva, Yulia; Anselmo, David

    2016-01-01

    Air quality, like weather, can affect everyone, but responses differ depending on the sensitivity and health condition of a given individual. To help protect exposed populations, many countries have put in place real-time air quality nowcasting and forecasting capabilities. We present in this paper an optimal combination of air quality measurements and model outputs and show that it leads to significant improvements in the spatial representativeness of air quality. The product is referred to as multi-pollutant surface objective analyses (MPSOAs). Moreover, based on MPSOA, a geographical mapping of the Canadian Air Quality Health Index (AQHI) is also presented which provides users (policy makers, public, air quality forecasters, and epidemiologists) with a more accurate picture of the health risk anytime and anywhere in Canada and the USA. Since pollutants can also behave as passive atmospheric tracers, they provide information about transport and dispersion and, hence, reveal synoptic and regional meteorological phenomena. MPSOA could also be used to build air pollution climatology, compute local and national trends in air quality, and detect systematic biases in numerical air quality (AQ) models. Finally, initializing AQ models at regular time intervals with MPSOA can produce more accurate air quality forecasts. It is for these reasons that the Canadian Meteorological Centre (CMC) in collaboration with the Air Quality Research Division (AQRD) of Environment Canada has recently implemented MPSOA in their daily operations.
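    At a single point, the core of an objective analysis (optimally combining a model background value with an observation) reduces to a variance-weighted average; this scalar sketch illustrates the principle behind products like MPSOA, with invented ozone numbers, and is not Environment Canada's implementation.

```python
def optimal_analysis(background, obs, var_b, var_o):
    """One-point, least-error-variance combination of a model background
    value with an observation: the observation receives weight inversely
    proportional to its error variance."""
    k = var_b / (var_b + var_o)           # gain given to the observation
    analysis = background + k * (obs - background)
    var_a = (1 - k) * var_b               # analysis error variance
    return analysis, var_a

# Illustrative ozone values (ppb): model says 42, a nearby monitor says 50;
# the monitor is trusted more (smaller error variance).
analysis, var_a = optimal_analysis(background=42.0, obs=50.0,
                                   var_b=16.0, var_o=4.0)
print(analysis, var_a)
```

    The analysis error variance is smaller than either input's variance, which is why the combined field is more spatially representative than the model or the observations alone.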

  10. Research governance: implications for health library and information professionals.

    PubMed

    Sen, Barbara A

    2003-03-01

    The Research Governance Framework for Health and Social Care published by the Department of Health in 2001 provides a model of best practice and a framework for research in the health and social care sector. This article reviews the Department of Health Research Governance Framework, discusses the implications of research governance for library and information professionals undertaking research in the health- and social-care sector and recommends strategies for best practice within the information profession relating to research governance. The scope of the Framework document that covers both clinical and non-clinical research is outlined. Any research involving, amongst other issues, patients, NHS staff and use or access to NHS premises may require ethics committee approval. Particular reference is made to the roles, responsibilities and professional conduct and the systems needed to support effective research practice. Issues such as these combine to encourage the development of a quality research culture which supports best practice. Questions arise regarding the training and experience of researchers, and access to the necessary information and support. The use of the Framework to guide research practice complements the quality issues within the evidence-based practice movement and supports the ongoing development of a quality research culture. Recommendations are given in relation to the document's five domains of ethics, science, information, health and safety and finance and intellectual property. Practical recommendations are offered for incorporating research governance into research practice in ways which conform to the Framework's standards and which are particularly relevant for research practitioners in information science. Concluding comments support the use of the Research Governance Framework as a model for best practice.

  11. Effective Materials Property Information Management for the 21st Century

    NASA Technical Reports Server (NTRS)

    Ren, Weiju; Cebon, David; Arnold, Steve

    2009-01-01

    This paper discusses key principles for the development of materials property information management software systems. There are growing needs for automated materials information management in various organizations. In part these are fueled by the demands for higher efficiency in material testing, product design and engineering analysis. But equally important, organizations are driven by the need for consistency, quality and traceability of data, as well as control of access to sensitive information such as proprietary data. Further, the use of increasingly sophisticated nonlinear, anisotropic and multi-scale engineering analyses requires both processing of large volumes of test data for development of constitutive models and complex materials data input for Computer-Aided Engineering (CAE) software. And finally, the globalization of the economy often generates a great need for sharing a single "gold source" of materials information between members of global engineering teams in extended supply chains. Fortunately, material property management systems have kept pace with the growing user demands and evolved into versatile data management systems that can be customized to specific user needs. The more sophisticated of these provide facilities for: (i) data management functions such as access, version, and quality controls; (ii) a wide range of data import, export and analysis capabilities; (iii) data "pedigree" traceability mechanisms; (iv) data searching, reporting and viewing tools; and (v) access to the information via a wide range of interfaces. In this paper the important requirements for advanced material data management systems, and future challenges and opportunities such as automated error checking, data quality characterization, identification of gaps in datasets, as well as functionalities and business models to fuel database growth and maintenance, are discussed.

  12. Environmental Fate and Bioaccumulation Modelling at the US Environmental Protection Agency: Application to Inform Decision-Making

    EPA Science Inventory

    This chapter reviews the regulatory background and policy applications driving the use of various types of environmental fate and bioaccumulation models at US EPA (air quality, surface water and watersheds, contaminated sites). Comparing current research frontiers with contempora...

  13. Impact of Increased Corn Production on Ground Water Quality and Human Health

    EPA Science Inventory

    In this study, we use a complex coupled modeling system to assess the impacts of increased corn production on groundwater. In particular, we show how the models provide new information on the drivers of contamination in groundwater, and then relate pollutant concentration change...

  14. Automated workflows for data curation and standardization of chemical structures for QSAR modeling

    EPA Science Inventory

    Large collections of chemical structures and associated experimental data are publicly available, and can be used to build robust QSAR models for applications in different fields. One common concern is the quality of both the chemical structure information and associated experime...

  15. Factors affecting success of an integrated community-based telehealth system.

    PubMed

    Hsieh, Hui-Lung; Tsai, Chung-Hung; Chih, Wen-Hai; Lin, Huei-Hsieh

    2015-01-01

    The rise of chronic and degenerative diseases in developed countries has become a critical epidemiologic issue. Telehealth can provide one viable way to enhance health care, public health, and health education delivery and support. The study aims to empirically examine and evaluate the success factors of community-based telehealth system adoption. The 336 valid respondents were residents of a rural community in Taiwan. Structural equation modeling (SEM) was used to assess the proposed model applied to telehealth. The findings showed the research model had good explanatory power and fitness. The findings also indicated that system quality exerted the strongest overall effect on intention to use, while service quality exerted the strongest overall effect on user satisfaction. In addition, the findings illustrated that the joint effects of three intrinsic qualities (system quality, information quality, and service quality) on use were mediated by user satisfaction and intention to use. The study implies that community-based telehealth service providers should improve these three intrinsic qualities to enhance user satisfaction and intention to use, which in turn can increase the usage of the telehealth equipment. The integrated community-based telehealth system may become an innovative and suitable way to deliver better care to the residents of communities.

  16. Quality assessment of protein model-structures based on structural and functional similarities

    PubMed Central

    2012-01-01

    Background Experimental determination of protein 3D structures is expensive, time consuming and sometimes impossible. The gap between the number of protein structures deposited in the World Wide Protein Data Bank and the number of sequenced proteins is constantly broadening. Computational modeling is deemed to be one of the ways to deal with the problem. Although protein 3D structure prediction is a difficult task, many tools are available. These tools can model a structure from a sequence or from partial structural information, e.g. contact maps. Consequently, biologists have the ability to generate automatically a putative 3D structure model of any protein. However, the main issue then becomes evaluation of the model quality, which is one of the most important challenges of structural biology. Results GOBA - Gene Ontology-Based Assessment is a novel Protein Model Quality Assessment Program. It estimates the compatibility between a model-structure and its expected function. GOBA is based on the assumption that a high quality model is expected to be structurally similar to proteins functionally similar to the prediction target. Whereas DALI is used to measure structural similarity, protein functional similarity is quantified using the standardized and hierarchical description of proteins provided by the Gene Ontology, combined with Wang's algorithm for calculating semantic similarity. Two approaches are proposed to express the quality of protein model-structures. One is a single model quality assessment method; the other is its modification, which provides a relative measure of model quality. Exhaustive evaluation is performed on data sets of model-structures submitted to the CASP8 and CASP9 contests. Conclusions The validation shows that the method is able to discriminate between good and bad model-structures. The best of the tested GOBA scores achieved mean Pearson correlations of 0.74 and 0.80 to the observed quality of models in our CASP8- and CASP9-based validation sets. GOBA also obtained the best result for two targets of CASP8 and one of CASP9, compared with the contest participants. Consequently, GOBA offers a novel single-model quality assessment program that addresses the practical needs of biologists. In conjunction with other Model Quality Assessment Programs (MQAPs), it would prove useful for the evaluation of single protein models. PMID:22998498

  17. Air quality impact assessment of multiple open pit coal mines in northern Colombia.

    PubMed

    Huertas, José I; Huertas, María E; Izquierdo, Sebastián; González, Enrique D

    2012-01-01

    The coal mining region in northern Colombia is one of the largest open pit mining regions of the world. In 2009, there were 8 mining companies in operation with an approximate coal production of ∼70 Mtons/year. Since 2007, the Colombian air quality monitoring network has reported readings that exceed the daily and annual air quality standards for total suspended particulate (TSP) matter and particles with an equivalent aerodynamic diameter smaller than 10 μm (PM₁₀) in nearby villages. This paper describes work carried out to establish an appropriate clean air program for this region, based on the Colombian national environmental authority's requirement for modeling of TSP and PM₁₀ dispersion. A TSP and PM₁₀ emission inventory was initially developed, and topographic and meteorological information for the region was collected and analyzed. Using this information, the dispersion of TSP was modeled in ISC3 and AERMOD using meteorological data collected by 3 local stations during 2008 and 2009. The results obtained were compared to actual values measured by the air quality monitoring network. High correlation coefficients (>0.73) were obtained, indicating that the models accurately described the main factors affecting particle dispersion in the region. The model was then used to forecast concentrations of particulate matter for 2010. Based on results from the model, areas within the modeling region were identified as highly, fairly, moderately and marginally polluted according to local regulations. Additionally, the contribution of particulate matter to the pollution at each village was estimated. Using these predicted values, the Colombian environmental authority imposed new decontamination measures on the mining companies operating in the region. These measures included the relocation of three villages, financed by the mining companies, based on forecasted pollution levels. Copyright © 2011. Published by Elsevier Ltd.

  18. ICT and e-Governance: A Conceptual Model of e-DISC

    NASA Astrophysics Data System (ADS)

    Tejasvee, Sanjay; Sarangdevot, S. S.; Gahlot, Devendra; Gour, Vishal; Sandal, Shruti

    2010-11-01

    One of the most important objectives of e-governance is the proper distribution and delivery of government information and services to citizens. Advances in information technology resources give government great opportunities to deliver information and services to citizens and the public sector in a better manner. This paper examines and explores the conceptual model of e-DISC (Effective Deliverance of Information and Services to the Citizens). The purpose of this paper is to gain a better understanding of e-government in India through the concept of e-DISC with ICTs, and of how to deal with the challenges and barriers to a successful and accurate e-DISC model. The results indicate that by utilizing the new electronic information and communication technologies (ICTs) and the e-DISC model, government has in recent times improved the quality of e-governance and the delivery of information and services, and that awareness of the system is also valuable.

  19. Development of a model web-based system to support a statewide quality consortium in radiation oncology.

    PubMed

    Moran, Jean M; Feng, Mary; Benedetti, Lisa A; Marsh, Robin; Griffith, Kent A; Matuszak, Martha M; Hess, Michael; McMullen, Matthew; Fisher, Jennifer H; Nurushev, Teamour; Grubb, Margaret; Gardner, Stephen; Nielsen, Daniel; Jagsi, Reshma; Hayman, James A; Pierce, Lori J

    A database in which patient data are compiled allows analytic opportunities for continuous improvements in treatment quality and comparative effectiveness research. We describe the development of a novel, web-based system that supports the collection of complex radiation treatment planning information from centers that use diverse techniques, software, and hardware for radiation oncology care in a statewide quality collaborative, the Michigan Radiation Oncology Quality Consortium (MROQC). The MROQC database seeks to enable assessment of physician- and patient-reported outcomes and quality improvement as a function of treatment planning and delivery techniques for breast and lung cancer patients. We created tools to collect anonymized data based on all plans. The MROQC system representing 24 institutions has been successfully deployed in the state of Michigan. Since 2012, dose-volume histogram and Digital Imaging and Communications in Medicine-radiation therapy plan data and information on simulation, planning, and delivery techniques have been collected. Audits indicated >90% accurate data submission and spurred refinements to data collection methodology. This model web-based system captures detailed, high-quality radiation therapy dosimetry data along with patient- and physician-reported outcomes and clinical data for a radiation therapy collaborative quality initiative. The collaborative nature of the project has been integral to its success. Our methodology can be applied to setting up analogous consortiums and databases. Copyright © 2016 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.

  20. Economic Evaluations of Multicomponent Disease Management Programs with Markov Models: A Systematic Review.

    PubMed

    Kirsch, Florian

    2016-12-01

    Disease management programs (DMPs) for chronic diseases are being increasingly implemented worldwide. This review presents a systematic overview of the economic effects of DMPs evaluated with Markov models. The quality of the models is assessed, the method by which the DMP intervention is incorporated into the model is examined, and the differences in the structure and data used in the models are considered. A literature search was conducted; the Preferred Reporting Items for Systematic Reviews and Meta-Analyses statement was followed to ensure systematic selection of the articles. Study characteristics, e.g., results, the intensity of the DMP and usual care, model design, time horizon, discount rates, utility measures, and cost of illness, were extracted from the reviewed studies. Model quality was assessed by two researchers with two different appraisals: one proposed by Philips et al. (Good practice guidelines for decision-analytic modelling in health technology assessment: a review and consolidation of quality assessment. Pharmacoeconomics 2006;24:355-71) and the other proposed by Caro et al. (Questionnaire to assess relevance and credibility of modeling studies for informing health care decision making: an ISPOR-AMCP-NPC Good Practice Task Force report. Value Health 2014;17:174-82). A total of 16 studies (9 on chronic heart disease, 2 on asthma, and 5 on diabetes) met the inclusion criteria. Five studies reported cost savings and 11 studies reported additional costs. In the quality assessment, the overall score of the models ranged from 39% to 65% with the first appraisal and from 34% to 52% with the second. Eleven models integrated effectiveness derived from a clinical trial or a meta-analysis of complete DMPs, and only five models combined intervention effects from different sources into a DMP. The main limitations of the models are poor reporting practice and the variation in the selection of input parameters. Eleven of the 14 studies reported cost-effectiveness results of less than $30,000 per quality-adjusted life-year, and the remaining two studies reported less than $30,000 per life-year gained. Nevertheless, if the reporting and data-selection problems are addressed, Markov models should provide more reliable information for decision makers, because understanding under what circumstances a DMP is cost-effective is an important determinant of efficient resource allocation. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
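The Markov cohort models surveyed in this review share a common mechanical core: a cohort is advanced through health states by a transition matrix, accruing discounted costs and quality-adjusted life-years (QALYs) each cycle. A minimal sketch follows; the state names, transition probabilities, costs, utilities, and horizon are invented for illustration and are not taken from any reviewed study.

```python
import numpy as np

def run_cohort(transition, cost, utility, cycles=20, discount=0.03):
    """Discounted total cost and QALYs for a simple Markov cohort model."""
    state = np.array([1.0, 0.0, 0.0])          # whole cohort starts healthy
    total_cost = total_qaly = 0.0
    for t in range(cycles):
        df = 1.0 / (1.0 + discount) ** t       # discount factor for cycle t
        total_cost += df * (state @ cost)
        total_qaly += df * (state @ utility)
        state = state @ transition             # advance the cohort one cycle
    return total_cost, total_qaly

# States: healthy / sick / dead; rows are from-states (hypothetical numbers)
usual = np.array([[0.85, 0.10, 0.05],
                  [0.00, 0.80, 0.20],
                  [0.00, 0.00, 1.00]])
dmp = np.array([[0.90, 0.07, 0.03],            # the DMP slows progression
                [0.00, 0.85, 0.15],
                [0.00, 0.00, 1.00]])
utility = np.array([0.9, 0.6, 0.0])            # QALY weight per state

c_usual, q_usual = run_cohort(usual, np.array([500.0, 5000.0, 0.0]), utility)
c_dmp, q_dmp = run_cohort(dmp, np.array([800.0, 5000.0, 0.0]), utility)

icer = (c_dmp - c_usual) / (q_dmp - q_usual)   # incremental cost per QALY
print(f"{icer:,.0f} per QALY gained")
```

The cost-effectiveness thresholds the review reports (e.g., $30,000 per QALY) are judged against exactly this kind of incremental cost-effectiveness ratio.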

  1. Assessing the quality of the nation's water resources

    USGS Publications Warehouse

    Hamilton, Pixie A.

    2002-01-01

    This issue of IMPACT highlights findings from the first decade of studies (1991 to 2001) by the National Water-Quality Assessment (NAWQA) Program of the U.S. Geological Survey (USGS). The articles also discuss the Program’s approaches and models designed to help understand and estimate the fate and transport of contaminants in different geographic areas and environmental settings and over different time frames. NAWQA was established by Congress in 1991 with a goal of developing long-term, consistent, and comparable science-based information on nationwide water-quality conditions. This information is used to support sound management and policy decisions by decision makers at all levels – local, state, and national – who, every day, face complex regulations and management issues related to water resources.

  2. PconsFold: improved contact predictions improve protein models.

    PubMed

    Michel, Mirco; Hayat, Sikander; Skwark, Marcin J; Sander, Chris; Marks, Debora S; Elofsson, Arne

    2014-09-01

    Recently it has been shown that the quality of protein contact prediction from evolutionary information can be improved significantly if direct and indirect information are separated. Given sufficiently large protein families, the contact predictions contain sufficient information to predict the structure of many protein families. However, since the first studies, contact prediction methods have improved. Here, we ask how much the final models are improved if improved contact predictions are used. In a small benchmark of 15 proteins, we show that the TM-scores of top-ranked models are improved on average by 33% using PconsFold compared with the original version of EVfold. In a larger benchmark, we find that the quality is improved by 15-30% when using PconsC in comparison with earlier contact prediction methods. Further, using Rosetta instead of CNS does not significantly improve global model accuracy, but the chemistry of models generated with Rosetta is improved. PconsFold is a fully automated pipeline for ab initio protein structure prediction based on evolutionary information. PconsFold is based on PconsC contact prediction and uses the Rosetta folding protocol. Due to its modularity, the contact prediction tool can be easily exchanged. The source code of PconsFold is available on GitHub at https://www.github.com/ElofssonLab/pcons-fold under the MIT license. PconsC is available from http://c.pcons.net/. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.

  3. Spatial and Social Diffusion of Information and Influence: Models and Algorithms

    ERIC Educational Resources Information Center

    Doo, Myungcheol

    2012-01-01

    In this dissertation research, we argue that spatial alarms and activity-based social networks are two fundamentally new types of information and influence diffusion channels. Such new channels have the potential of enriching our professional experiences and our personal life quality in many unprecedented ways. First, we develop an activity driven…

  4. 40 CFR 52.820 - Identification of plan.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 66219; at EPA Air and Radiation Docket and Information Center, EPA West Building, 1301 Constitution... (202) 566-1742. For information on the availability of this material at NARA, call (202) 741-6030, or... Were Submitted on July 17, 1975 Statewide 9/3/75 10/1/76, 41 FR 43407 (10) Air Quality Modeling to...

  5. 40 CFR 52.820 - Identification of plan.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Information Center, Room Number 3334, EPA West Building, 1301 Constitution Avenue, NW., Washington, DC 20460... Library, please call the Office of Air and Radiation Docket at (202) 566-1742. For information on the... Were Submitted on July 17, 1975 Statewide 9/3/75 10/1/76, 41 FR 43407 (10) Air Quality Modeling to...

  6. An Entropy-Based Measure for Assessing Fuzziness in Logistic Regression

    PubMed Central

    Weiss, Brandi A.; Dardick, William

    2015-01-01

    This article introduces an entropy-based measure of data–model fit that can be used to assess the quality of logistic regression models. Entropy has previously been used in mixture-modeling to quantify how well individuals are classified into latent classes. The current study proposes the use of entropy for logistic regression models to quantify the quality of classification and separation of group membership. Entropy complements preexisting measures of data–model fit and provides unique information not contained in other measures. Hypothetical data scenarios, an applied example, and Monte Carlo simulation results are used to demonstrate the application of entropy in logistic regression. Entropy should be used in conjunction with other measures of data–model fit to assess how well logistic regression models classify cases into observed categories. PMID:29795897
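The entropy measure this abstract describes can be sketched in a minimal form. The normalization below (one minus the total Shannon entropy of the predicted class probabilities divided by its maximum) follows the convention used in mixture modeling and is an assumption for illustration, not necessarily the authors' exact formula.

```python
import numpy as np

def normalized_entropy(probs):
    """Entropy-based fit measure for an (N, K) array of predicted class
    probabilities: 1 = perfectly separated classes, 0 = maximally fuzzy
    (every case gets equal probability for all classes)."""
    probs = np.clip(probs, 1e-12, 1.0)   # guard log(0)
    n, k = probs.shape
    h = -(probs * np.log(probs)).sum()   # total Shannon entropy
    return 1.0 - h / (n * np.log(k))     # scale by the maximum entropy

# Sharp predicted probabilities score near 1; fuzzy ones near 0
sharp = np.array([[0.99, 0.01], [0.02, 0.98]])
fuzzy = np.array([[0.55, 0.45], [0.48, 0.52]])
print(normalized_entropy(sharp), normalized_entropy(fuzzy))
```

Applied to the fitted probabilities of a logistic regression, a low value flags poor separation of group membership even when other fit statistics look acceptable.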

  7. An Entropy-Based Measure for Assessing Fuzziness in Logistic Regression.

    PubMed

    Weiss, Brandi A; Dardick, William

    2016-12-01

    This article introduces an entropy-based measure of data-model fit that can be used to assess the quality of logistic regression models. Entropy has previously been used in mixture-modeling to quantify how well individuals are classified into latent classes. The current study proposes the use of entropy for logistic regression models to quantify the quality of classification and separation of group membership. Entropy complements preexisting measures of data-model fit and provides unique information not contained in other measures. Hypothetical data scenarios, an applied example, and Monte Carlo simulation results are used to demonstrate the application of entropy in logistic regression. Entropy should be used in conjunction with other measures of data-model fit to assess how well logistic regression models classify cases into observed categories.

  8. Towards knowledge-based systems in clinical practice: development of an integrated clinical information and knowledge management support system.

    PubMed

    Kalogeropoulos, Dimitris A; Carson, Ewart R; Collinson, Paul O

    2003-09-01

    Given that clinicians presented with identical clinical information will act in different ways, there is a need to introduce into routine clinical practice methods and tools that support the scientific homogeneity and accountability of healthcare decisions and actions. The benefits expected from such action include an overall reduction in cost, improved quality of care, and greater patient and public satisfaction. Computer-based medical data processing has yielded methods and tools for managing the task away from the hospital management level and closer to the desired disease and patient management level. To this end, advanced applications of information and disease-process modelling technologies have already demonstrated an ability to significantly augment clinical decision making as a by-product. The widespread acceptance of evidence-based medicine as the basis of cost-conscious and concurrently quality-wise accountable clinical practice suffices as evidence supporting this claim. Electronic libraries are one step towards an online status of this key healthcare delivery quality control environment. Nonetheless, to date, the underlying information and knowledge management technologies have failed to be integrated into any form of pragmatic or marketable online and real-time clinical decision-making tool. One of the main obstacles that needs to be overcome is the development of systems that treat both information and knowledge as clinical objects with the same modelling requirements. This paper describes the development of such a system in the form of an intelligent clinical information management system: a system which, at the most fundamental level of clinical decision support, facilitates the organised acquisition of both clinical information and knowledge and provides a test-bed for the development and evaluation of knowledge-based decision support functions.

  9. Table Rock Lake Water-Clarity Assessment Using Landsat Thematic Mapper Satellite Data

    USGS Publications Warehouse

    Krizanich, Gary; Finn, Michael P.

    2009-01-01

    Water quality of Table Rock Lake in southwestern Missouri is assessed using Landsat Thematic Mapper satellite data. A pilot study uses multidate satellite image scenes in conjunction with physical measurements of Secchi disk transparency collected by the Lakes of Missouri Volunteer Program to construct a regression model used to estimate water clarity. The natural log of Secchi disk transparency is the dependent variable in the regression, and the independent variables are Thematic Mapper band 1 (blue) reflectance and the ratio of the band 1 and band 3 (red) reflectances. The regression model can be used to reliably predict water clarity anywhere within the lake. A pixel-level lake map of predicted water clarity or computed trophic state can be produced from the model output. Information derived from this model can be used by water-resource managers to assess water quality and evaluate the effects of changes in the watershed on water quality.
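The model form described above (log of Secchi disk transparency regressed on band 1 reflectance and a band 1 / band 3 ratio) can be sketched with synthetic data. The reflectance ranges, coefficients, and noise level below are hypothetical stand-ins, not the study's fitted values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic Thematic Mapper reflectances (hypothetical ranges)
b1 = rng.uniform(0.04, 0.12, 50)                 # band 1 (blue)
b3 = rng.uniform(0.02, 0.08, 50)                 # band 3 (red)

# Simulated ln(Secchi depth) with invented coefficients plus noise
ln_sdt = 2.0 - 8.0 * b1 + 0.5 * (b1 / b3) + rng.normal(0.0, 0.05, 50)

# Least-squares fit of ln(Secchi depth) on band 1 and the band1/band3 ratio
X = np.column_stack([np.ones_like(b1), b1, b1 / b3])
coef, *_ = np.linalg.lstsq(X, ln_sdt, rcond=None)

# Applying the fitted model pixel-by-pixel yields a predicted-clarity map
sdt_pred = np.exp(X @ coef)
print(coef.round(2))
```

Exponentiating the fitted values recovers transparency in its original units, which is what allows a pixel-level clarity map to be produced from the regression output.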

  10. Anatomy of Data Integration

    PubMed Central

    Brazhnik, Olga; Jones, John F.

    2007-01-01

    Producing reliable information is the ultimate goal of data processing. The ocean of data created by advances in science and technology calls for the integration of data coming from heterogeneous sources that are diverse in their purposes, business rules, underlying models and enabling technologies. Reference models, the Semantic Web, standards, ontology, and other technologies enable fast and efficient merging of heterogeneous data, while the reliability of the produced information is largely defined by how well the data represent reality. In this paper we initiate a framework for assessing the informational value of data that includes data dimensions; aligning data quality with business practices; identifying authoritative sources and integration keys; merging models; and uniting updates of varying frequency and overlapping or gapped data sets. PMID:17071142

  11. The Use of Health Information Technology Within Collaborative and Integrated Models of Child Psychiatry Practice.

    PubMed

    Coffey, Sara; Vanderlip, Erik; Sarvet, Barry

    2017-01-01

    There is a consistent need for more child and adolescent psychiatrists. Despite increased recruitment of child and adolescent psychiatry trainees, traditional models of care will likely not be able to meet the need of youth with mental illness. Integrated care models focusing on population-based, team-based, measurement-based, and evidenced-based care have been effective in addressing accessibility and quality of care. These integrated models have specific needs regarding health information technology (HIT). HIT has been used in a variety of different ways in several integrated care models. HIT can aid in implementation of these models but is not without its challenges. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. Performance and cost evaluation of health information systems using micro-costing and discrete-event simulation.

    PubMed

    Rejeb, Olfa; Pilet, Claire; Hamana, Sabri; Xie, Xiaolan; Durand, Thierry; Aloui, Saber; Doly, Anne; Biron, Pierre; Perrier, Lionel; Augusto, Vincent

    2018-06-01

    Innovation and health-care funding reforms have contributed to the deployment of Information and Communication Technology (ICT) to improve patient care. Many health-care organizations consider the application of ICT a crucial key to enhancing health-care management. The purpose of this paper is to provide a methodology to assess the organizational impact of a high-level Health Information System (HIS) on the patient pathway. We propose an integrated HIS performance evaluation approach that combines formal modeling using Architecture of Integrated Information Systems (ARIS) models, a micro-costing approach for cost evaluation, and a Discrete-Event Simulation (DES) approach. The methodology is applied to the consultation process for cancer treatment. Simulation scenarios are established to draw conclusions about the impact of the HIS on the patient pathway. We demonstrate that although a high-level HIS lengthens the consultation, the occupation rate of oncologists is lower and the quality of service is higher (measured through the amount of information available during the consultation to formulate the diagnosis). The method also allows identification of the most cost-effective ICT elements for improving care process quality while minimizing costs. The methodology is flexible enough to be applied to other health-care systems.
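The DES component of such an evaluation can be illustrated with a toy single-oncologist consultation clinic. Everything below (exponential arrival and consultation times, the parameter values, the output metrics) is a hypothetical sketch of the simulation technique, not the paper's actual model.

```python
import random

random.seed(1)

def simulate(n_patients=200, arrival_mean=20.0, consult_mean=15.0):
    """Single-server consultation clinic with exponential inter-arrival
    and service times; returns (mean patient wait, clinician utilization)."""
    t = 0.0          # arrival clock (minutes)
    free_at = 0.0    # time at which the oncologist next becomes free
    busy = 0.0       # accumulated consultation time
    waits = []
    for _ in range(n_patients):
        t += random.expovariate(1.0 / arrival_mean)   # next arrival
        start = max(t, free_at)                       # wait if clinician busy
        service = random.expovariate(1.0 / consult_mean)
        waits.append(start - t)
        free_at = start + service
        busy += service
    return sum(waits) / len(waits), busy / free_at

wait, util = simulate()
print(f"mean wait {wait:.1f} min, utilization {util:.0%}")
```

Lengthening `consult_mean` (the effect the paper attributes to a high-level HIS) and re-running the scenario shows how a process-level change propagates to waiting time and clinician occupation rate, which is the kind of trade-off the study quantifies.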

  13. Optimal spatio-temporal design of water quality monitoring networks for reservoirs: Application of the concept of value of information

    NASA Astrophysics Data System (ADS)

    Maymandi, Nahal; Kerachian, Reza; Nikoo, Mohammad Reza

    2018-03-01

    This paper presents a new methodology for optimizing Water Quality Monitoring (WQM) networks of reservoirs and lakes using the concept of the value of information (VOI) and the results of a calibrated numerical water quality simulation model. With reference to value of information theory, the water quality of every checkpoint with a specific prior probability differs in time. After analyzing water quality samples taken from potential monitoring points, the posterior probabilities are updated using Bayes' theorem, and the VOI of the samples is calculated. In the next step, the stations with maximum VOI are selected as optimal stations. This process is repeated for each sampling interval to obtain optimal monitoring network locations for each interval. The results of the proposed VOI-based methodology are compared with those obtained using an entropy-theoretic approach. As the results of the two methodologies can be partially different, in the next step the results are combined using a weighting method. Finally, the optimal sampling interval and locations of WQM stations are chosen using the Evidential Reasoning (ER) decision-making method. The efficiency and applicability of the methodology are evaluated using available water quantity and quality data for the Karkheh Reservoir in southwestern Iran.
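The Bayes update and VOI calculation at the heart of this methodology can be sketched on a hypothetical two-state example ("clean" vs "polluted" at a checkpoint). All priors, likelihoods, and losses below are invented for illustration; the paper's actual probabilities come from its calibrated simulation model.

```python
import numpy as np

prior = np.array([0.7, 0.3])             # P(state) before sampling
likelihood = np.array([[0.9, 0.2],       # P(observation | state); rows = obs
                       [0.1, 0.8]])
loss = np.array([[0.0, 10.0],            # loss(action, state):
                 [2.0,  2.0]])           # "do nothing" vs "treat"

def loss_without_info(prior):
    """Expected loss of the best single action chosen from the prior alone."""
    return (loss @ prior).min()

def loss_with_info(prior):
    """Expected loss when the best action is chosen per posterior,
    averaged over the possible observations."""
    total = 0.0
    for obs in range(likelihood.shape[0]):
        p_obs = likelihood[obs] @ prior               # evidence P(obs)
        posterior = likelihood[obs] * prior / p_obs   # Bayes' theorem
        total += p_obs * (loss @ posterior).min()
    return total

# VOI = expected-loss reduction from observing the sample
voi = loss_without_info(prior) - loss_with_info(prior)
print(round(voi, 3))  # → 0.78
```

Repeating this calculation for each candidate monitoring point and each sampling interval, then ranking points by VOI, mirrors the station-selection step the abstract describes.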

  14. The association between health-related quality of life/dietary satisfaction and perceived food environment among Japanese individuals with spinal cord injury.

    PubMed

    Hata, K; Inayama, T; Yoshiike, N

    2017-08-01

    Cross-sectional. To examine the association between health-related quality of life (HRQOL)/dietary satisfaction and perceived food environment in community-dwelling individuals with spinal cord injury (SCI). Members of the Spinal Injuries Japan organization. Subjects were 2007 Japanese individuals with SCI. A questionnaire conducted in 2015 included items addressing sociodemographic characteristics, HRQOL, dietary satisfaction and eight perceived food environment items. Responses from 506 individuals were analyzed (valid response rate=25%). Dependent variables were the physical and mental summary scores of the HRQOL and dietary satisfaction. The independent variable was the perceived food environment. We used a univariate analysis (in Model 1) and a multivariate analysis (in Models 2 and 3) as part of a binominal logistic regression analysis. In Model 3, we divided and analyzed the perceived food environment variable into 'access to food' and 'access to information'. Both physical and mental summary scores were related to 'dietary information acquisition in the community'. Dietary satisfaction was related to 'balanced meals in the household', 'food and health information available from family' and 'right health and dietary information acquisition from the media'. HRQOL and dietary satisfaction were differentially associated with perceived food environment factors in community-dwelling individuals with SCI. HRQOL was positively related to dietary information of perceived food environment in the community. Dietary satisfaction was positively related to perceived food environment in the household.

  15. “Application and evaluation of the two-way coupled WRF-CMAQ modeling system to the 2011 DISCOVER-AQ campaign in the Baltimore-Washington D.C. area.”

    EPA Science Inventory

    The DISCOVER-AQ project (Deriving Information on Surface conditions from Column and Vertically Resolved Observations Relevant to Air Quality), is a joint collaboration between NASA, U.S. EPA and a number of other local organizations with the goal of characterizing air quality in ...

  16. Sequentially Executed Model Evaluation Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-10-20

    Provides a message passing framework between generic input, model and output drivers, and specifies an API for developing such drivers. Also provides batch and real-time controllers which step the model and I/O through the time domain (or other discrete domain), and sample I/O drivers. This is a library framework, and does not, itself, solve any problems or execute any modeling. The SeMe framework aids in development of models which operate on sequential information, such as time-series, where evaluation is based on prior results combined with new data for this iteration. Has applications in quality monitoring, and was developed as part of the CANARY-EDS software, where real-time water quality data are analyzed for anomalies.

  17. Preliminary description of the area navigation software for a microcomputer-based Loran-C receiver

    NASA Technical Reports Server (NTRS)

    Oguri, F.

    1983-01-01

    The development of new navigation software, and its implementation on a microcomputer (MOS 6502) to provide high-quality navigation information, is described. The software provides Area/Route Navigation (RNAV) information from Time Differences (TDs) in raw form, using an elliptical Earth model and a spherical model, and is prepared for the microcomputer-based Loran-C receiver. To compute navigation information, a MOS 6502 microcomputer and a mathematical chip (AM 9511A) were combined with the Loran-C receiver. Final data reveal that this software does indeed provide accurate information with reasonable execution times.

  18. Developing Model Assessment for Learning (AFL) to Improve Quality and Evaluation in Pragmatic Course in IAIN Surakarta

    ERIC Educational Resources Information Center

    Retnaningsih, Woro; Djatmiko; Sumarlam

    2017-01-01

    The research objective is to develop a model of Assessment for Learning (AFL) in the Pragmatic course in IAIN Surakarta. The research problems are as follows: How did the lecturer develop a model of AFL? What form of assessment information was used in the model of AFL? What were the results of implementing the assessment model? The…

  19. 20180318 - Automated workflows for data curation and standardization of chemical structures for QSAR modeling (ACS Spring)

    EPA Science Inventory

    Large collections of chemical structures and associated experimental data are publicly available, and can be used to build robust QSAR models for applications in different fields. One common concern is the quality of both the chemical structure information and associated experime...

  20. Motivation-Oriented Teaching Model for Certification Education

    ERIC Educational Resources Information Center

    Huang, Tien-Chi

    2013-01-01

    To evoke public emphasis on the professional skills training in vocational education, the Ministry of Education (MOE) in Taiwan has recently focused on the certificate of information skills. However, the lack of a systematic and institutionally certified instructor training model within Taiwan not only varies the quality of certified instructors,…
