Sample records for data-driven quality improvement

  1. An evaluation of the effectiveness of a risk-based monitoring approach implemented with clinical trials involving implantable cardiac medical devices.

    PubMed

    Diani, Christopher A; Rock, Angie; Moll, Phil

    2017-12-01

    Background: Risk-based monitoring is a concept endorsed by the Food and Drug Administration to improve clinical trial data quality by focusing monitoring efforts on critical data elements and higher-risk investigator sites. BIOTRONIK approached this by implementing a comprehensive strategy that assesses risk and data quality through a combination of operational controls and data surveillance. This publication demonstrates the effectiveness of a data-driven risk assessment methodology when used in conjunction with a tailored monitoring plan. Methods: We developed a data-driven risk assessment system to rank 133 investigator sites, comprising 3442 subjects, and to identify those sites that pose a potential risk to the integrity of data collected in implantable cardiac device clinical trials. This included identification of specific risk factors and a weighted scoring mechanism. We conducted trend analyses for risk assessment data collected over 1 year to assess the overall impact of our data surveillance process combined with other operational monitoring efforts. Results: Trend analyses of key risk factors revealed an improvement in the quality of data collected during the observation period. The three risk factors (follow-up compliance rate, unavailability of critical data, and noncompliance rate) correspond closely with the Food and Drug Administration's risk-based monitoring guidance document. Among these three risk factors, 100% (12/12) of the quantiles analyzed showed an increase in data quality. Of these, 67% (8/12) of the improving trends in the worst performing quantiles had p-values less than 0.05, and 17% (2/12) had p-values between 0.05 and 0.06. Among the poorest performing site quantiles, there was a statistically significant decrease in subject follow-up noncompliance rates, protocol noncompliance rates, and the incidence of missing critical data. Conclusion: One year after implementation of a comprehensive strategy for risk-based monitoring, including a data-driven risk assessment methodology to target on-site monitoring visits, statistically significant improvement was seen in a majority of measurable risk factors at the worst performing site quantiles. For the three risk factors most critical to the overall compliance of cardiac rhythm management medical device studies (follow-up compliance rate, unavailability of critical data, and noncompliance rate), we measured significant improvement in data quality. Although improvement at the worst performing site quantiles did not reach significance for some risk factors, such as subject attrition, the data-driven risk assessment highlighted key areas on which to continue focusing both on-site and centralized monitoring efforts. Data-driven surveillance of clinical trial performance provides actionable observations that can improve site performance. Clinical trials that apply risk-based monitoring by leveraging a data-driven quality assessment combined with specific operational procedures may see improvements in data quality and resource efficiency.
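The weighted scoring mechanism described in the Methods can be pictured as a simple ranking computation. The sketch below is illustrative only: the risk factor names echo those reported in the abstract, but the weights, normalization, and site data are invented and do not describe BIOTRONIK's actual system.

```python
# Hypothetical weighted site risk score; weights and data are illustrative.
WEIGHTS = {
    "followup_noncompliance_rate": 0.40,  # missed/late follow-up visits
    "missing_critical_data_rate": 0.35,   # unavailability of critical data
    "protocol_noncompliance_rate": 0.25,  # protocol deviations
}

def site_risk_score(site_metrics: dict) -> float:
    """Combine normalized risk-factor rates (0..1) into one weighted score."""
    return sum(WEIGHTS[k] * site_metrics[k] for k in WEIGHTS)

def rank_sites(sites: dict) -> list:
    """Return site IDs ordered from highest (riskiest) to lowest score."""
    return sorted(sites, key=lambda s: site_risk_score(sites[s]), reverse=True)

sites = {
    "site_A": {"followup_noncompliance_rate": 0.10,
               "missing_critical_data_rate": 0.05,
               "protocol_noncompliance_rate": 0.02},
    "site_B": {"followup_noncompliance_rate": 0.30,
               "missing_critical_data_rate": 0.20,
               "protocol_noncompliance_rate": 0.15},
}
print(rank_sites(sites))  # site_B ranks first: higher weighted risk
```

Sites at the top of such a ranking would then be targeted for on-site monitoring visits.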

  2. Defining datasets and creating data dictionaries for quality improvement and research in chronic disease using routinely collected data: an ontology-driven approach.

    PubMed

    de Lusignan, Simon; Liaw, Siaw-Teng; Michalakidis, Georgios; Jones, Simon

    2011-01-01

    The burden of chronic disease is increasing, and research and quality improvement will be less effective if case-finding strategies are suboptimal. This paper describes an ontology-driven approach to case finding in chronic disease and how this approach can be used to create a data dictionary and make the codes used in case finding transparent. A five-step process is used: (1) identifying a reference coding system or terminology; (2) using an ontology-driven approach to identify cases; (3) developing metadata that can be used to identify the extracted data; (4) mapping the extracted data to the reference terminology; and (5) creating the data dictionary. Hypertension is presented as an exemplar. A patient with hypertension can be represented by a range of codes, including diagnostic, history, and administrative codes. Metadata can link the coding system and data extraction queries to the correct data mapping and translation tool, which then maps each code to its equivalent in the reference terminology. The code extracted, the term, its domain and subdomain, and the name of the data extraction query can then be automatically grouped and published online as a readily searchable data dictionary; an exemplar is available at www.clininf.eu/qickd-data-dictionary.html. Adopting an ontology-driven approach to case finding could improve the quality of disease registers and of research based on routine data, and would offer considerable advantages over using limited datasets to define cases. This approach should be considered by those involved in research and quality improvement projects that utilise routine data.
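Steps 3 to 5 of the process above (metadata, mapping to the reference terminology, and building the dictionary) can be sketched as a small mapping routine. All local codes, reference codes, and field names below are hypothetical stand-ins, not the actual QICKD schema.

```python
# Hypothetical mapping of local extraction codes to a reference terminology,
# grouped into data-dictionary entries. Codes and field names are invented.
REFERENCE_MAP = {  # local code -> (reference code, reference term)
    "G20..": ("38341003", "Hypertensive disorder"),
    "662..": ("38341003", "Hypertensive disorder"),
}

def build_dictionary_entries(query_name, domain, local_codes):
    """One dictionary entry per extracted code, linked back to its query."""
    entries = []
    for code in local_codes:
        ref_code, term = REFERENCE_MAP.get(code, (None, "UNMAPPED"))
        entries.append({
            "query": query_name, "domain": domain,
            "local_code": code, "reference_code": ref_code, "term": term,
        })
    return entries

entries = build_dictionary_entries("hypertension_case_finding", "diagnosis",
                                   ["G20..", "662.."])
```

Publishing such entries as a searchable table is what makes the case-finding codes transparent to other teams.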

  3. Demystifying Data: Designing and Implementing Data-Driven Systems and Practices for Continuous Quality Improvement

    ERIC Educational Resources Information Center

    Krugly, Andrew; Stein, Amanda; Centeno, Maribel G.

    2014-01-01

    Data-based decision making should be the driving force in any early care and education setting. Data usage compels early childhood practitioners and leaders to make decisions on the basis of more than just professional instinct. This article explores why early childhood schools should be using data for continuous quality improvement at various…

  4. A data-driven approach for quality assessment of radiologic interpretations.

    PubMed

    Hsu, William; Han, Simon X; Arnold, Corey W; Bui, Alex At; Enzmann, Dieter R

    2016-04-01

    Given the increasing emphasis on delivering high-quality, cost-efficient healthcare, improved methodologies are needed to measure the accuracy and utility of ordered diagnostic examinations in achieving the appropriate diagnosis. Here, we present a data-driven approach for performing automated quality assessment of radiologic interpretations using other clinical information (e.g., pathology) as a reference standard for individual radiologists, subspecialty sections, imaging modalities, and entire departments. Downstream diagnostic conclusions from the electronic medical record are utilized as "truth" to which upstream diagnoses generated by radiology are compared. The described system automatically extracts and compares patient medical data to characterize concordance between clinical sources. Initial results are presented in the context of breast imaging, matching 18 101 radiologic interpretations with 301 pathology diagnoses and achieving a precision and recall of 84% and 92%, respectively. The presented data-driven method highlights the challenges of integrating multiple data sources and the application of information extraction tools to facilitate healthcare quality improvement.
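The headline precision and recall figures follow directly from counts of correct, spurious, and missed matches between the radiology and pathology sources. A minimal sketch, with invented counts chosen only to illustrate the arithmetic:

```python
# Precision/recall over matched interpretations; the counts are invented
# examples, not figures from the study.
def precision_recall(true_pos, false_pos, false_neg):
    precision = true_pos / (true_pos + false_pos)  # correct / all asserted
    recall = true_pos / (true_pos + false_neg)     # correct / all true cases
    return precision, recall

p, r = precision_recall(true_pos=252, false_pos=48, false_neg=22)
print(f"precision={p:.2f} recall={r:.2f}")  # precision=0.84 recall=0.92
```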

  5. A multi-band environment-adaptive approach to noise suppression for cochlear implants.

    PubMed

    Saki, Fatemeh; Mirzahasanloo, Taher; Kehtarnavaz, Nasser

    2014-01-01

    This paper presents an improved environment-adaptive noise suppression solution for the cochlear implants speech processing pipeline. This improvement is achieved by using a multi-band data-driven approach in place of a previously developed single-band data-driven approach. Seven commonly encountered noisy environments of street, car, restaurant, mall, bus, pub and train are considered to quantify the improvement. The results obtained indicate about 10% improvement in speech quality measures.
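The multi-band idea can be illustrated with a generic per-band gain rule: estimate noise power in each frequency band and attenuate each band according to its own signal-to-noise ratio. This is a hedged sketch of ordinary Wiener-style band suppression, not the paper's trained, environment-adaptive method; all powers and the spectral floor are illustrative.

```python
# Generic multi-band suppression sketch: one gain per band, clamped to a
# spectral floor so quiet bands are attenuated rather than zeroed.
def band_gain(signal_power, noise_power, floor=0.1):
    """Wiener-style gain for one band from estimated powers."""
    snr = max(signal_power - noise_power, 0.0) / max(noise_power, 1e-12)
    return max(snr / (1.0 + snr), floor)

def suppress(band_powers, noise_powers):
    """Apply each band's squared gain to that band's power."""
    return [p * band_gain(p, n) ** 2 for p, n in zip(band_powers, noise_powers)]

clean = suppress([4.0, 1.0, 9.0], [1.0, 1.0, 0.5])
```

A single-band version would apply one gain to the whole spectrum; computing the gain per band is what lets noisy bands be suppressed without distorting clean ones.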

  6. Data Driven Quality Improvement of Health Professions Education: Design and Development of CLUE - An Interactive Curriculum Data Visualization Tool.

    PubMed

    Canning, Claire Ann; Loe, Alan; Cockett, Kathryn Jane; Gagnon, Paul; Zary, Nabil

    2017-01-01

    Curriculum mapping and dynamic visualization are quickly becoming integral to quality improvement in support of the innovations that drive curriculum quality assurance processes in medical education. CLUE (Curriculum Explorer), a highly interactive, engaging, and independent platform, was developed to support curriculum transparency, enhance student engagement, and enable granular search and display. Reflecting a design-based approach to meeting the needs of the school's varied stakeholders, CLUE employs an iterative and reflective process to drive the evolution of its platform as it seeks to accommodate the ever-changing needs of stakeholders in the fast-paced world of medicine and medical education today. CLUE exists independently of institutional systems and is thereby uniquely positioned to deliver a data-driven quality improvement resource that is easily adaptable for use by any member of the health care professions.

  7. Leveraging information technology to drive improvement in patient satisfaction.

    PubMed

    Nash, Mary; Pestrue, Justin; Geier, Peter; Sharp, Karen; Helder, Amy; McAlearney, Ann Scheck

    2010-01-01

    A healthcare organization's commitment to quality and the patient experience requires senior leader involvement in improvement strategies and accountability for goals. Further, improvement strategies are most effective when driven by data, and in the world of patient satisfaction, evidence is growing that nurse leader rounding and discharge calls are strategic tactics that can improve patient satisfaction. This article describes how The Ohio State University Medical Center (OSUMC) leveraged health information technology (IT) to apply data-driven strategy execution to improve the patient experience. Specifically, two IT-driven approaches were used: (1) business intelligence reporting tools were used to create a meaningful reporting system including dashboards, scorecards, and tracking reports, and (2) an improvement plan was implemented that focused on two high-impact tactics and used data to hardwire accountability. Targeted information from the IT systems enabled clinicians and administrators to execute these strategic tactics, and senior leaders to monitor achievement of strategic goals. As a result, the proportion of OSUMC inpatients giving top-box ratings (9 or 10) on the Hospital Consumer Assessment of Healthcare Providers and Systems survey improved from 56% in 2006 to 71% in 2009.

  8. Quality improvement in pediatrics: past, present, and future.

    PubMed

    Schwartz, Stephanie P; Rehder, Kyle J

    2017-01-01

    Almost two decades ago, the landmark report "To Err Is Human" compelled healthcare to address the large numbers of hospitalized patients experiencing preventable harm. Concurrently, it became clear that the rapidly rising cost of healthcare would be unsustainable in the long term. As a result, quality improvement methodologies initially rooted in other high-reliability industries have become a primary focus of healthcare. Multiple pediatric studies demonstrate remarkable quality and safety improvements in several domains, including handoffs, catheter-associated bloodstream infections, and other serious safety events. While both quality improvement and research are data-driven processes, significant differences exist between the two. Research utilizes a hypothesis-driven approach to obtain new knowledge, while quality improvement often incorporates a cyclic approach to translate existing knowledge into clinical practice. Recent publications have provided guidelines and methods for effectively reporting quality and safety work and improvement implementations. This review examines not only how quality improvement in pediatrics has led to improved outcomes, but also looks to the future of quality improvement in healthcare, with a focus on education and collaboration to ensure best-practice approaches to caring for children.

  9. Continuous quality improvement at work: the first team--Part II.

    PubMed

    Bolt, B J; Lehany-Trese, A M; Williams, T P

    1995-01-01

    This second part of a two-part article follows Cape Canaveral Hospital's first continuous quality improvement team through the processes of goal setting, system analysis, data gathering, and problem resolution in the area of patients' assignment to observation status. The team's primary goal was data-driven improvement. As detailed here, the team's solution to improve the use of observation status is time-efficient and offers opportunities for financial gain.

  10. Data quality and processing for decision making: divergence between corporate strategy and manufacturing processes

    NASA Astrophysics Data System (ADS)

    McNeil, Ronald D.; Miele, Renato; Shaul, Dennis

    2000-10-01

    Information technology is driving improvements in manufacturing systems, with higher productivity and quality as the results. However, corporate strategy is driven by a number of factors and includes data and pressure from multiple stakeholders: employees, managers, executives, stockholders, boards, suppliers, and customers. It is also driven by information about competitors and emerging technology. Much information is based on the processing of data and the resulting biases of the processors; thus, stakeholders can base inputs on faulty perceptions that are not grounded in reality, and the underlying data may be inaccurate even before processing. Sources of data and information may include demographic reports, statistical analyses, intelligence reports (e.g., marketing data), technology, and primary data collection. The reliability and validity of data, as well as the management of sources and information, are critical elements of strategy formulation. The paper explores data collection, processing, and analysis from secondary and primary sources, information generation, and report presentation for strategy formulation, and contrasts this with the data and information used to drive internal processes such as manufacturing. The hypothesis is that internal processes, such as manufacturing, are subordinate to corporate strategies. Possible divergence in the quality of decisions at the corporate level can therefore have a significant impact on IT-driven, quality-focused manufacturing processes based on measurable outcomes. Recommendations for IT improvements at the corporate strategy level are given.

  11. Stakeholder-Driven Quality Improvement: A Compelling Force for Clinical Practice Guidelines.

    PubMed

    Rosenfeld, Richard M; Wyer, Peter C

    2018-01-01

    Clinical practice guideline development should be driven by rigorous methodology, but what is less clear is where quality improvement enters the process: should it be a priority-guiding force, or should it enter only after recommendations are formulated? We argue for a stakeholder-driven approach to guideline development, with an overriding goal of quality improvement based on stakeholder perceptions of needs, uncertainties, and knowledge gaps. In contrast, the widely used topic-driven approach, which often makes recommendations based only on randomized controlled trials, is driven by epidemiologic purity and evidence rigor, with quality improvement a downstream consideration. The advantages of a stakeholder-driven versus a topic-driven approach are highlighted by comparisons of guidelines for otitis media with effusion, thyroid nodules, sepsis, and acute bacterial rhinosinusitis. These comparisons show that stakeholder-driven guidelines are more likely to address the quality improvement needs and pressing concerns of clinicians and patients, including understudied populations and patients with multiple chronic conditions. Conversely, a topic-driven approach often addresses "typical" patients, based on research that may not reflect the needs of high-risk groups excluded from studies because of ethical issues or a desire for purity of research design.

  12. 42 CFR 418.58 - Condition of participation: Quality assessment and performance improvement.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    .... The hospice must develop, implement, and maintain an effective, ongoing, hospice-wide data-driven... learning throughout the hospice. (3) The hospice must take actions aimed at performance improvement and...

  13. 42 CFR 418.58 - Condition of participation: Quality assessment and performance improvement.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    .... The hospice must develop, implement, and maintain an effective, ongoing, hospice-wide data-driven... learning throughout the hospice. (3) The hospice must take actions aimed at performance improvement and...

  14. 42 CFR 418.58 - Condition of participation: Quality assessment and performance improvement.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    .... The hospice must develop, implement, and maintain an effective, ongoing, hospice-wide data-driven... learning throughout the hospice. (3) The hospice must take actions aimed at performance improvement and...

  15. Data key to quest for quality.

    PubMed

    Chang, Florence S; Nielsen, Jon; Macias, Charles

    2013-11-01

    Late-binding data warehousing reduces the time it takes to obtain data needed to make crucial decisions. Late binding refers to when and how tightly data from the source applications are bound to the rules and vocabularies that make it useful. In some cases, data can be seen in real time. In historically paper-driven environments where data-driven decisions may be a new concept, buy-in from clinicians, physicians, and hospital leaders is key to success in using data to improve outcomes.

  16. Data-Driven Sampling Matrix Boolean Optimization for Energy-Efficient Biomedical Signal Acquisition by Compressive Sensing.

    PubMed

    Wang, Yuhao; Li, Xin; Xu, Kai; Ren, Fengbo; Yu, Hao

    2017-04-01

    Compressive sensing is widely used in biomedical applications, and the sampling matrix plays a critical role on both quality and power consumption of signal acquisition. It projects a high-dimensional vector of data into a low-dimensional subspace by matrix-vector multiplication. An optimal sampling matrix can ensure accurate data reconstruction and/or high compression ratio. Most existing optimization methods can only produce real-valued embedding matrices that result in large energy consumption during data acquisition. In this paper, we propose an efficient method that finds an optimal Boolean sampling matrix in order to reduce the energy consumption. Compared to random Boolean embedding, our data-driven Boolean sampling matrix can improve the image recovery quality by 9 dB. Moreover, in terms of sampling hardware complexity, it reduces the energy consumption by 4.6× and the silicon area by 1.9× over the data-driven real-valued embedding.
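The hardware argument above rests on the fact that the projection y = Φx reduces to pure additions when Φ contains only 0s and 1s, so no multipliers are needed during acquisition. A sketch with a random (not optimized) Boolean matrix; the paper's contribution, optimizing the matrix entries, is not shown here.

```python
# Compressive-sensing acquisition with a Boolean sampling matrix: each
# measurement is the sum of a subset of input samples. The matrix here is
# random; the cited work optimizes its 0/1 entries in a data-driven way.
import random

def boolean_matrix(m, n, density=0.5, seed=0):
    """Random m-by-n 0/1 sampling matrix (seeded for reproducibility)."""
    rng = random.Random(seed)
    return [[1 if rng.random() < density else 0 for _ in range(n)]
            for _ in range(m)]

def acquire(phi, x):
    """y = phi @ x, computed with additions only because phi is 0/1."""
    return [sum(xj for pij, xj in zip(row, x) if pij) for row in phi]

phi = boolean_matrix(m=4, n=10)      # project 10 samples to 4 measurements
y = acquire(phi, list(range(10)))    # the compressed measurement vector
```

Reconstruction of x from y (the expensive step) happens off-sensor, which is why acquisition-side energy dominates the hardware comparison.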

  17. Building an Evidence-Driven Child Welfare Workforce: A University-Agency Partnership

    ERIC Educational Resources Information Center

    Lery, Bridgette; Wiegmann, Wendy; Berrick, Jill Duerr

    2015-01-01

    The federal government increasingly expects child welfare systems to be more responsive to the needs of their local populations, connect strategies to results, and use continuous quality improvement (CQI) to accomplish these goals. A method for improving decision making, CQI relies on an inflow of high-quality data, up-to-date research evidence,…

  18. Study protocol of a mixed-methods evaluation of a cluster randomized trial to improve the safety of NSAID and antiplatelet prescribing: data-driven quality improvement in primary care.

    PubMed

    Grant, Aileen; Dreischulte, Tobias; Treweek, Shaun; Guthrie, Bruce

    2012-08-28

    Trials of complex interventions are criticized for being 'black box', so the UK Medical Research Council recommends carrying out a process evaluation to explain the trial findings. We believe it is good practice to pre-specify and publish process evaluation protocols to set standards and minimize bias. Unlike protocols for trials, little guidance or standards exist for the reporting of process evaluations. This paper presents the mixed-method process evaluation protocol of a cluster randomized trial, drawing on a framework designed by the authors. This mixed-method evaluation is based on four research questions and maps data collection to a logic model of how the data-driven quality improvement in primary care (DQIP) intervention is expected to work. Data collection will be predominantly by qualitative case studies in eight to ten of the trial practices, focus groups with patients affected by the intervention, and quantitative analysis of routine practice data, trial outcome and questionnaire data, and data from the DQIP intervention. We believe that pre-specifying the intentions of a process evaluation can help to minimize bias arising from potentially misleading post-hoc analysis. We recognize it is also important to retain flexibility to examine the unexpected and the unintended. From that perspective, a mixed-methods evaluation allows the combination of exploratory and flexible qualitative work with more pre-specified quantitative analysis, with each method contributing to the design, implementation, and interpretation of the other. As well as strengthening the study, the authors hope to stimulate discussion among their academic colleagues about publishing protocols for evaluations of randomized trials of complex interventions. Trial registration: ClinicalTrials.gov NCT01425502 (data-driven quality improvement in primary care).

  19. Ethical issues in using data from quality management programs.

    PubMed

    Nerenz, David R

    2009-08-01

    Since the advent of formal, data-driven quality improvement programs in health care in the late 1980s and early 1990s, there have been questions raised about requirements for ethics committee review of quality improvement activities. A form of consensus emerged through a series of articles published between 1996 and 2007, but there is still significant variation among ethics review committees and individual project leaders in applying broad policies on requirements for committee review and/or written informed consent by participants. Recent developments in quality management, particularly the creation and use of multi-site disease registries, have raised new questions about requirements for review and consent, since these activities often have simultaneous research and quality improvement goals. This article discusses ways in which policies designed for local quality improvement projects and databases may be adapted to apply to multi-site registries and related research projects.

  20. Using IT to improve quality at NewYork-Presbyterian Hospital: a requirements-driven strategic planning process.

    PubMed

    Kuperman, Gilad J; Boyer, Aurelia; Cole, Curt; Forman, Bruce; Stetson, Peter D; Cooper, Mary

    2006-01-01

    At NewYork-Presbyterian Hospital, we are committed to the delivery of high quality care. We have implemented a strategic planning process to determine the information technology initiatives that will best help us improve quality. The process began with the creation of a Clinical Quality and IT Committee. The Committee identified 2 high priority goals that would enable demonstrably high quality care: 1) excellence at data warehousing, and 2) optimal use of automated clinical documentation to capture encounter-related quality and safety data. For each high priority goal, a working group was created to develop specific recommendations. The Data Warehousing subgroup has recommended the implementation of an architecture management process and an improved ability for users to get access to aggregate data. The Structured Documentation subgroup is establishing recommendations for a documentation template creation process. The strategic planning process at times is slow, but assures that the organization is focusing on the information technology activities most likely to lead to improved quality.

  1. Using IT to Improve Quality at NewYork-Presbyterian Hospital: A Requirements-Driven Strategic Planning Process

    PubMed Central

    Kuperman, Gilad J.; Boyer, Aurelia; Cole, Curt; Forman, Bruce; Stetson, Peter D.; Cooper, Mary

    2006-01-01

    At NewYork-Presbyterian Hospital, we are committed to the delivery of high quality care. We have implemented a strategic planning process to determine the information technology initiatives that will best help us improve quality. The process began with the creation of a Clinical Quality and IT Committee. The Committee identified 2 high priority goals that would enable demonstrably high quality care: 1) excellence at data warehousing, and 2) optimal use of automated clinical documentation to capture encounter-related quality and safety data. For each high priority goal, a working group was created to develop specific recommendations. The Data Warehousing subgroup has recommended the implementation of an architecture management process and an improved ability for users to get access to aggregate data. The Structured Documentation subgroup is establishing recommendations for a documentation template creation process. The strategic planning process at times is slow, but assures that the organization is focusing on the information technology activities most likely to lead to improved quality. PMID:17238381

  2. Model-driven approach to data collection and reporting for quality improvement

    PubMed Central

    Curcin, Vasa; Woodcock, Thomas; Poots, Alan J.; Majeed, Azeem; Bell, Derek

    2014-01-01

    Continuous data collection and analysis have been shown to be essential to achieving improvement in healthcare. However, the data required for local improvement initiatives are often not readily available from hospital Electronic Health Record (EHR) systems, or are not routinely collected. Furthermore, improvement teams are often restricted in time and funding, and thus require inexpensive and rapid tools to support their work. Hence, the informatics challenge in healthcare local improvement initiatives consists of providing a mechanism for rapid modelling of the local domain by non-informatics experts, including performance metric definitions, grounded in established improvement techniques. We investigate the feasibility of a model-driven software approach to address this challenge, whereby an improvement model designed by a team is used to automatically generate the required electronic data collection instruments and reporting tools. To that end, we have designed a generic Improvement Data Model (IDM) to capture the data items and quality measures relevant to a project, and constructed Web Improvement Support in Healthcare (WISH), a prototype tool that takes user-generated IDM models and creates a data schema, data collection web interfaces, and a set of live reports based on Statistical Process Control (SPC) for use by improvement teams. The software has been successfully used in over 50 improvement projects, with more than 700 users. We present in detail the experiences of one of those initiatives, the Chronic Obstructive Pulmonary Disease project in Northwest London hospitals. The specific challenges of improvement in healthcare are analysed, and the benefits and limitations of the approach are discussed. PMID:24874182
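The model-driven idea, declaring data items and measures once and generating the collection and reporting artefacts from that declaration, can be sketched as follows. The model fields and measure definitions below are invented for illustration and do not reflect the real IDM or WISH schema.

```python
# Hypothetical improvement model: data items and measures declared once,
# with validation and reporting derived from the declaration.
MODEL = {
    "data_items": [
        {"name": "admission_date", "type": "date"},
        {"name": "los_days", "type": "int"},
    ],
    "measures": [{"name": "mean_los", "item": "los_days", "agg": "mean"}],
}

def validate(record, model=MODEL):
    """Check a submitted record against the declared data items."""
    expected = {item["name"] for item in model["data_items"]}
    return set(record) == expected

def report(records, model=MODEL):
    """Compute each declared measure over the collected records."""
    out = {}
    for m in model["measures"]:
        vals = [r[m["item"]] for r in records]
        if m["agg"] == "mean":
            out[m["name"]] = sum(vals) / len(vals)
    return out

rows = [{"admission_date": "2014-01-02", "los_days": 3},
        {"admission_date": "2014-01-05", "los_days": 5}]
print(report(rows))  # {'mean_los': 4.0}
```

Because the data schema, entry forms, and reports all derive from one declared model, a team can change a metric definition without any hand-written plumbing.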

  3. Quality and Efficiency Improvement Tools for Every Radiologist.

    PubMed

    Kudla, Alexei U; Brook, Olga R

    2018-06-01

    In an era of value-based medicine, data-driven quality improvement is more important than ever to ensure safe and efficient imaging services. Familiarity with high-value tools enables all radiologists to successfully engage in quality and efficiency improvement. In this article, we review the model for improvement, strategies for measurement, and common practical tools with real-life examples that include Run chart, Control chart (Shewhart chart), Fishbone (Cause-and-Effect or Ishikawa) diagram, Pareto chart, 5 Whys, and Root Cause Analysis.
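Of the tools listed, the control chart is the most algorithmic. Below is a minimal sketch of an individuals (XmR) chart, one common Shewhart-style chart: the center line is the mean, the limits sit 2.66 average moving ranges away (2.66 = 3/1.128, the standard XmR constant), and points outside the limits are flagged as special-cause signals. The turnaround data are invented.

```python
# Individuals (XmR) control chart: limits from the average moving range.
def xmr_limits(values):
    """Center line and lower/upper control limits for an XmR chart."""
    mean = sum(values) / len(values)
    mrbar = (sum(abs(a - b) for a, b in zip(values, values[1:]))
             / (len(values) - 1))              # average moving range
    return mean, mean - 2.66 * mrbar, mean + 2.66 * mrbar

def special_cause_points(values):
    """Indices of points outside the control limits."""
    _, lcl, ucl = xmr_limits(values)
    return [i for i, v in enumerate(values) if v < lcl or v > ucl]

daily_turnaround_minutes = [31, 29, 30, 32, 28, 30, 55, 31, 29, 30]
print(special_cause_points(daily_turnaround_minutes))  # [6]: the 55-minute day
```

Points inside the limits reflect common-cause variation and do not warrant intervention; only flagged points call for root-cause investigation.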

  4. Model-driven approach to data collection and reporting for quality improvement.

    PubMed

    Curcin, Vasa; Woodcock, Thomas; Poots, Alan J; Majeed, Azeem; Bell, Derek

    2014-12-01

    Continuous data collection and analysis have been shown to be essential to achieving improvement in healthcare. However, the data required for local improvement initiatives are often not readily available from hospital Electronic Health Record (EHR) systems, or are not routinely collected. Furthermore, improvement teams are often restricted in time and funding, and thus require inexpensive and rapid tools to support their work. Hence, the informatics challenge in healthcare local improvement initiatives consists of providing a mechanism for rapid modelling of the local domain by non-informatics experts, including performance metric definitions, grounded in established improvement techniques. We investigate the feasibility of a model-driven software approach to address this challenge, whereby an improvement model designed by a team is used to automatically generate the required electronic data collection instruments and reporting tools. To that end, we have designed a generic Improvement Data Model (IDM) to capture the data items and quality measures relevant to a project, and constructed Web Improvement Support in Healthcare (WISH), a prototype tool that takes user-generated IDM models and creates a data schema, data collection web interfaces, and a set of live reports based on Statistical Process Control (SPC) for use by improvement teams. The software has been successfully used in over 50 improvement projects, with more than 700 users. We present in detail the experiences of one of those initiatives, the Chronic Obstructive Pulmonary Disease project in Northwest London hospitals. The specific challenges of improvement in healthcare are analysed, and the benefits and limitations of the approach are discussed.

  5. Improving Population Health Through an Innovative Collaborative: The Be There San Diego Data for Quality Group.

    PubMed

    Fremont, Allen; Kranz, Ashley M; Phillips, Jessica; Garber, Chandra

    2017-06-01

    In 2012, leaders from disparate health care organizations established a data group aligned around a regional goal of preventing heart attacks and strokes in San Diego. The group, now named the Be There San Diego Data for Quality (DFQ) Group, is a safe venue for medical directors and other quality-improvement leaders to share performance data on quality-of-care measures for diabetes, hypertension, and cardiovascular disease, as well as insights, lessons learned, and challenges faced by each organization in treating these conditions. The DFQ Group has focused its efforts on improving the quality of services provided by each participating health care organization, and has placed a strong emphasis on analyzing trends in combined quality data to better understand the health of the entire San Diego population. By fostering collaboration among organizations that collectively serve a large portion of the local population and other key community stakeholders, the DFQ Group has helped form the foundation of a unique, multifaceted, multi-stakeholder, regional effort that is gaining national attention and funding for its community-driven approach.

  6. A Conceptual Analytics Model for an Outcome-Driven Quality Management Framework as Part of Professional Healthcare Education.

    PubMed

    Hervatis, Vasilis; Loe, Alan; Barman, Linda; O'Donoghue, John; Zary, Nabil

    2015-10-06

    Preparing the future health care professional workforce in a changing world is a significant undertaking. Educators and other decision makers look to evidence-based knowledge to improve the quality of education. Analytics, the use of data to generate insights and support decisions, has been applied successfully across numerous application domains. Health care professional education is one area where great potential is yet to be realized. Previous research on Academic and Learning Analytics has mainly focused on technical issues. The focus of this study relates to its practical implementation in the setting of health care education. The aim of this study is to create a conceptual model for a deeper understanding of the process of synthesizing and transforming data into information to support educators' decision making. A deductive case study approach was applied to develop the conceptual model. The analytics loop works both in theory and in practice. The conceptual model encompasses the underlying data, the quality indicators, and decision support for educators. The model illustrates how a theory can be applied to a traditional data-driven analytics approach alongside a context- or need-driven analytics approach.

  7. A Conceptual Analytics Model for an Outcome-Driven Quality Management Framework as Part of Professional Healthcare Education

    PubMed Central

    Loe, Alan; Barman, Linda; O'Donoghue, John; Zary, Nabil

    2015-01-01

    Background Preparing the future health care professional workforce in a changing world is a significant undertaking. Educators and other decision makers look to evidence-based knowledge to improve the quality of education. Analytics, the use of data to generate insights and support decisions, has been applied successfully across numerous application domains. Health care professional education is one area where great potential is yet to be realized. Previous research on Academic and Learning Analytics has mainly focused on technical issues. The focus of this study relates to its practical implementation in the setting of health care education. Objective The aim of this study is to create a conceptual model for a deeper understanding of the process of synthesizing and transforming data into information to support educators' decision making. Methods A deductive case study approach was applied to develop the conceptual model. Results The analytics loop works both in theory and in practice. The conceptual model encompasses the underlying data, the quality indicators, and decision support for educators. Conclusions The model illustrates how a theory can be applied to a traditional data-driven analytics approach alongside a context- or need-driven analytics approach. PMID:27731840

  8. Co-designing for quality: Creating a user-driven tool to improve quality in youth mental health services.

    PubMed

    Hackett, Christina L; Mulvale, Gillian; Miatello, Ashleigh

    2018-04-29

    Although high quality mental health care for children and youth is a goal of many health systems, little is known about the dimensions of quality mental health care from users' perspectives. We engaged young people, caregivers and service providers to share experiences, which shed light on quality dimensions for youth mental health care. Using experience-based co-design (EBCD), we collected qualitative data from young people aged 16-24 with a mental disorder (n = 19), identified caregivers (n = 12) and service providers (n = 14) about their experiences with youth mental health services. Experience data were collected using multiple approaches, including interviews, a suite of online and smartphone applications (n = 22), and a co-design event (n = 16), and were analysed to extract touch points. These touch points were used to prioritize and co-design a user-driven prototype of a questionnaire to provide feedback to service providers. Reports of service experiences from young people, caregivers and service providers were used to identify aspects of care quality at eight mental health service contact points: Access to mental health care; Transfer to/from hospital; Intake into hospital; Services provided; Assessment and treatment; Treatment environment; and Caregiver involvement in care. In some cases, low-quality care was harmful to users and their caregivers. Young people co-designed a prototype of a user-driven feedback questionnaire to improve the quality of service experiences, which was supported by service providers and caregivers at the co-design event. By using EBCD to capture in-depth data regarding the experiences of young people, their caregivers and service providers, study participants have begun to establish a baseline for acceptable quality of mental health care for young people. © 2018 The Authors. Health Expectations published by John Wiley & Sons Ltd.

  9. Proving Value in Radiology: Experience Developing and Implementing a Shareable Open Source Registry Platform Driven by Radiology Workflow.

    PubMed

    Gichoya, Judy Wawira; Kohli, Marc D; Haste, Paul; Abigail, Elizabeth Mills; Johnson, Matthew S

    2017-10-01

    Numerous initiatives are in place to support value-based care in radiology, including decision support using appropriateness criteria, quality metrics such as radiation dose monitoring, and efforts to improve the quality of the radiology report for consumption by referring providers. These initiatives are largely data driven. Organizations can choose to purchase proprietary registry systems, pay for a software-as-a-service solution, or deploy/build their own registry systems. Traditionally, registries are created for a single purpose, such as radiation dose monitoring or tracking a specific disease (for example, a diabetes registry). This results in a fragmented view of the patient, and increases the overhead of maintaining such single-purpose registry systems by requiring an alternative data entry workflow and additional infrastructure to host and maintain multiple registries for different clinical needs. This complexity is magnified in the health care enterprise, where radiology systems usually run parallel to other clinical systems because of the different clinical workflow of radiologists. In the new era of value-based care, where data needs are increasing along with demand for shorter turnaround times to provide data for information and decision making, there is a critical gap: registries better adapted to the radiology workflow, with minimal setup and maintenance overhead. We share our experience of developing and implementing an open source registry system for quality improvement and research at our academic institution that is driven by our radiology workflow.

  10. A Continuous Quality Improvement Project to Implement Infant-Driven Feeding as a Standard of Practice in the Newborn/Infant Intensive Care Unit.

    PubMed

    Chrupcala, Kimberly A; Edwards, Taryn M; Spatz, Diane L

    2015-01-01

    To increase the number of neonates who were fed according to cues prior to discharge and potentially decrease length of stay. Continuous quality improvement. Eighty-five-bed level IV neonatal intensive care unit. Surgical and nonsurgical neonates of all gestational ages. Neonates younger than 32 weeks gestation, or who required intubation, continuous positive airway pressure (CPAP), or high-flow nasal cannula (HFNC), or who did not have suck or gag reflexes, were excluded as potential candidates for infant-driven feeding. The project was conducted over a 13-month period using the following methods: (a) baseline data collection, (b) designation of Infant-Driven Feeding (IDF) Champions, (c) creation of a multidisciplinary team, (d) creation of electronic health record documentation, (e) initial staff education, (f) monthly team meetings, (g) reeducation throughout the duration of the project, and (h) patient-family education. Baseline data were collected on 20 neonates with a mean gestational age of 36 0/7 weeks and a mean total length of stay (LOS) of 43 days. Postimplementation data were collected on 150 neonates with a mean gestational age of 36 1/7 weeks and a mean total LOS of 36.4 days. A potential decrease in mean total LOS of 6.63 days was achieved during this continuous quality improvement (CQI) project. Neonates who are fed according to cues can become successful oral feeders and can be safely discharged home regardless of gestational age or diagnosis. © 2015 AWHONN, the Association of Women's Health, Obstetric and Neonatal Nurses.

  11. Affordable non-traditional source data mining for context assessment to improve distributed fusion system robustness

    NASA Astrophysics Data System (ADS)

    Bowman, Christopher; Haith, Gary; Steinberg, Alan; Morefield, Charles; Morefield, Michael

    2013-05-01

    This paper describes methods to affordably improve the robustness of distributed fusion systems by opportunistically leveraging non-traditional data sources. Adaptive methods help find relevant data, create models, and characterize model quality. These methods can also measure the conformity of this non-traditional data with fusion system products, including situation modeling and mission impact prediction. Non-traditional data can improve the quantity, quality, availability, timeliness, and diversity of the baseline fusion system sources, and therefore can improve prediction and estimation accuracy and robustness at all levels of fusion. Techniques are described that automatically learn to characterize and search non-traditional contextual data, enabling operators to integrate the data with high-level fusion systems and ontologies. These techniques apply the extension of the Data Fusion & Resource Management Dual Node Network (DNN) technical architecture at Level 4. The DNN architecture supports effective assessment and management of the expanded portfolio of data sources, entities of interest, models, and algorithms, including data pattern discovery and context conformity. Affordable model-driven and data-driven data mining methods to discover unknown models from non-traditional and `big data' sources are used to automatically learn entity behaviors and correlations with fusion products [14, 15]. This paper describes our context assessment software development, and a demonstration of context assessment of non-traditional data compared to an intelligence, surveillance, and reconnaissance fusion product based on an IED POIs workflow.

  12. Data That Drive: Closing the Loop in the Learning Hospital System

    PubMed Central

    Liu, Vincent X.; Morehouse, John W.; Baker, Jennifer M.; Greene, John D.; Kipnis, Patricia; Escobar, Gabriel J.

    2017-01-01

    The learning healthcare system describes a vision of US healthcare that capitalizes on science, information technology, incentives, and care culture to drive improvements in the quality of health care. The inpatient setting, one of the most costly and impactful domains of healthcare, is an ideal setting in which to use data and information technology to foster continuous learning and quality improvement. The rapid digitization of inpatient medicine offers incredible new opportunities to use data from routine care to generate new discovery and thus close the virtuous cycle of learning. We use an object lesson—sepsis care within the 21 hospitals of the Kaiser Permanente Northern California integrated healthcare delivery system—to offer insight into the critical elements necessary for developing a learning hospital system. We then describe how a hospital-wide data-driven approach to inpatient care can facilitate improvements in the quality of hospital care. PMID:27805797

  13. Implementing a user-driven online quality improvement toolkit for cancer care.

    PubMed

    Luck, Jeff; York, Laura S; Bowman, Candice; Gale, Randall C; Smith, Nina; Asch, Steven M

    2015-05-01

    Peer-to-peer collaboration within integrated health systems requires a mechanism for sharing quality improvement lessons. The Veterans Health Administration (VA) developed online compendia of tools linked to specific cancer quality indicators. We evaluated awareness and use of the toolkits, variation across facilities, impact of social marketing, and factors influencing toolkit use. A diffusion of innovations conceptual framework guided the collection of user activity data from the Toolkit Series SharePoint site and an online survey of potential Lung Cancer Care Toolkit users. The VA Toolkit Series site had 5,088 unique visitors in its first 22 months; 5% of users accounted for 40% of page views. Social marketing communications were correlated with site usage. Of survey respondents (n = 355), 54% had visited the site, of whom 24% downloaded at least one tool. Respondents' awareness of the lung cancer quality performance of their facility, and facility participation in quality improvement collaboratives, were positively associated with Toolkit Series site use. Facility-level lung cancer tool implementation varied widely across tool types. The VA Toolkit Series achieved widespread use and a high degree of user engagement, although use varied widely across facilities. The most active users were aware of and active in cancer care quality improvement. Toolkit use seemed to be reinforced by other quality improvement activities. A combination of user-driven tool creation and centralized toolkit development seemed to be effective for leveraging health information technology to spread disease-specific quality improvement tools within an integrated health care system. Copyright © 2015 by American Society of Clinical Oncology.

  14. What are the most effective strategies for improving quality and safety of health care?

    PubMed

    Scott, I

    2009-06-01

    There is now a plethora of different quality improvement strategies (QIS) for optimizing health care, some clinician/patient driven, others manager/policy-maker driven. Which of these are most effective remains unclear, despite expressed concerns about the potential for QIS-related patient harm and waste of resources. The objective of this study was to review published literature assessing the relative effectiveness of different QIS. Data sources comprising PubMed Clinical Queries, the Cochrane Library and its Effective Practice and Organization of Care database, and HealthStar were searched for studies of QIS between January 1985 and February 2008, using search terms based on an a priori QIS classification suggested by experts. Systematic reviews of controlled trials were selected in determining effect sizes for specific QIS, which were compared as a narrative meta-review. Clinician/patient driven QIS were associated with stronger evidence of efficacy and larger effect sizes than manager/policy-maker driven QIS. The most effective strategies (>10% absolute increase in appropriate care or equivalent measure) included clinician-directed audit and feedback cycles, clinical decision support systems, specialty outreach programmes, chronic disease management programmes, continuing professional education based on interactive small-group case discussions, and patient-mediated clinician reminders. Pay-for-performance schemes directed to clinician groups and organizational process redesign were modestly effective. Other manager/policy-maker driven QIS, including continuous quality improvement programmes, risk and safety management systems, public scorecards and performance reports, external accreditation, and clinical governance arrangements, have not been adequately evaluated with regard to effectiveness. QIS are heterogeneous, and methodological flaws in much of the evaluative literature limit the validity and generalizability of results. Based on current best available evidence, clinician/patient driven QIS appear to be more effective than manager/policy-maker driven QIS, although the latter have, in many instances, attracted insufficient robust evaluations to accurately determine their comparative effectiveness.

  15. Macro vs micro level surgical quality improvement: a regional collaborative demonstrates the case for a national NSQIP initiative.

    PubMed

    Tepas, Joseph J; Kerwin, Andrew J; deVilla, Jhun; Nussbaum, Michael S

    2014-04-01

    The Florida Surgical Care Initiative (FSCI) is a quality improvement collaborative of the American College of Surgeons National Surgical Quality Improvement Program (NSQIP) and the Florida Hospital Association. In the wake of a dramatic decrease in complications and cost documented over 15 months, we analyzed the semiannual measures reports (SAR) to determine whether this improvement was driven by specific institutions or was a global accomplishment by all participants. Reports from NSQIP were analyzed to determine rank change of participants. Odds ratios (OR) of observed-to-expected incidence of the 4 FSCI outcomes (catheter-associated urinary tract infection [CAUTI], surgical site infection [SSI], colorectal surgery, and surgery in patients older than 65 years) were used to assess individual and group performance. Data from SAR 2 (October 2011 to April 2012) were compared with data from SAR 3 (May to July 2012). Poorly performing hospitals were tracked to determine evidence of improvement. Individual facility performance was evaluated by determining the proportion of hospitals showing improved rank across all measures. Fifty-four hospitals were evaluated. SAR 2 reported 28,112 general and vascular surgical cases; SAR 3 added 10,784 more. The proportion of institutions with OR < 1 for each measure did not change significantly. Only the urinary tract infection and colorectal measures demonstrated an increased number of hospitals with OR < 1. Each institution that was a significant negative outlier in SAR 2 demonstrated improvement. Three of 54 hospitals demonstrated improvement across all 4 measures. Of 15 hospitals with improved performance across 3 measures, all included elderly surgery. The increase in quality achieved across this population of surgical patients was the result of a quality assessment process driven by NSQIP, rather than disproportionate improvement by a few raising the bar for all. The NSQIP process, applied collaboratively across a population by committed institutions, produces dramatic results. Copyright © 2014 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
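    The observed-to-expected ratios used in the record above reduce to simple arithmetic. A minimal sketch with hypothetical site counts (not NSQIP's risk-adjustment model, which is proprietary and far more elaborate):

    ```python
    def oe_ratio(observed, expected_rate, n_cases):
        """Observed-to-expected (O/E) ratio for one site and one measure.

        expected_rate is the risk-adjusted event probability predicted for
        the site's case mix; O/E < 1 means the site had fewer events than
        its case mix predicted.
        """
        expected = expected_rate * n_cases
        return observed / expected if expected else float("nan")

    # Hypothetical sites reporting surgical site infections (SSI):
    # (observed events, model-predicted event rate, case count)
    sites = {"A": (8, 0.03, 400), "B": (20, 0.03, 400), "C": (11, 0.03, 400)}
    ratios = {name: oe_ratio(*args) for name, args in sites.items()}
    better_than_expected = sorted(name for name, r in ratios.items() if r < 1)
    ```

    With 400 cases at a 3% expected rate, 12 events are expected, so sites A and C (8 and 11 events) land below an O/E of 1 while site B (20 events) is a negative outlier of the kind the SAR comparisons tracked.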

  16. Resident-Specific Morbidity Reduced Following ACS NSQIP Data-Driven Quality Program.

    PubMed

    Turrentine, Florence E; Hanks, John B; Tracci, Megan C; Jones, R Scott; Schirmer, Bruce D; Smith, Philip W

    2018-04-16

    The Accreditation Council for Graduate Medical Education Milestone Project for general surgery provided a more robust method for developing and tracking residents' competence. This framework enhanced systematic and progressive development of residents' competencies in surgical quality improvement. A 22-month interactive, educational program based on resident-specific surgical outcomes data culminated in a quality improvement project for postgraduate year 4 surgery residents. Self-assessment, quality knowledge test, and resident-specific American College of Surgeons National Surgical Quality Improvement Program Quality In-Training Initiative morbidity were compared before and after the intervention. Quality In-Training Initiative morbidity decreased from 25% (82/325) to 18% (93/517), p = 0.015, despite residents performing more complex cases. All participants achieved level 4 competency (4/4) in the practice-based learning and improvement competency of the general surgery milestones improvement of care domain. Institutional American College of Surgeons National Surgical Quality Improvement Program general surgery morbidity improved from the ninth to the sixth decile. Quality assessment and improvement self-assessment postintervention scores (M = 23.80, SD = 4.97) were not significantly higher than preintervention scores (M = 19.20, SD = 5.26), p = 0.061. Quality Improvement Knowledge Application Tool postintervention test scores (M = 17.4, SD = 4.88) were not significantly higher than pretest scores (M = 13.2, SD = 1.92), p = 0.12. Sharing validated resident-specific clinical data with participants was associated with improved surgical outcomes. Participating fourth-year surgical residents achieved the highest score, level 4, in the practice-based learning and improvement competency of the improvement of care practice domain, and observed significantly reduced surgical morbidity for cases in which they participated. Copyright © 2018. Published by Elsevier Inc.

  17. Practice Facilitator Strategies for Addressing Electronic Health Record Data Challenges for Quality Improvement: EvidenceNOW.

    PubMed

    Hemler, Jennifer R; Hall, Jennifer D; Cholan, Raja A; Crabtree, Benjamin F; Damschroder, Laura J; Solberg, Leif I; Ono, Sarah S; Cohen, Deborah J

    2018-01-01

    Practice facilitators ("facilitators") can play an important role in supporting primary care practices in performing quality improvement (QI), but they need complete and accurate clinical performance data from practices' electronic health records (EHRs) to help them set improvement priorities, guide clinical change, and monitor progress. Here, we describe the strategies facilitators use to help practices perform QI when complete or accurate performance data are not available. Seven regional cooperatives enrolled approximately 1500 small-to-medium-sized primary care practices and 136 facilitators in EvidenceNOW, the Agency for Healthcare Research and Quality's initiative to improve cardiovascular preventive services. The national evaluation team analyzed qualitative data from online diaries, site visit field notes, and interviews to discover how facilitators worked with practices on EHR data challenges to obtain and use data for QI. We found facilitators faced practice-level EHR data challenges, such as a lack of clinical performance data, partial or incomplete clinical performance data, and inaccurate clinical performance data. Facilitators responded to these challenges, respectively, by using other data sources or tools to fill in for missing data, approximating performance reports and generating patient lists, and teaching practices how to document care and confirm performance measures. In addition, facilitators helped practices communicate with EHR vendors or health systems when requesting the data they needed. Overall, facilitators tailored strategies to fit the individual practice and helped build data skills and trust. Facilitators can use a range of strategies to help practices perform data-driven QI when performance data are inaccurate, incomplete, or missing. Support is necessary to help practices, particularly those with EHR data challenges, build their capacity for the data-driven QI required of them for participation in practice transformation and performance-based payment programs. It is questionable how practices with data challenges would perform in such programs without this kind of support. © Copyright 2018 by the American Board of Family Medicine.

  18. Building Data-Driven Pathways From Routinely Collected Hospital Data: A Case Study on Prostate Cancer

    PubMed Central

    Clark, Jeremy; Cooper, Colin S; Mills, Robert; Rayward-Smith, Victor J; de la Iglesia, Beatriz

    2015-01-01

    Background Routinely collected data in hospitals are complex, typically heterogeneous, and scattered across multiple Hospital Information Systems (HIS). This big data, created as a byproduct of health care activities, has the potential to provide a better understanding of diseases, unearth hidden patterns, and improve services and cost. The extent and uses of such data rely on its quality, which is neither consistently checked nor fully understood. Nevertheless, using routine data for the construction of data-driven clinical pathways, describing processes and trends, is a key topic receiving increasing attention in the literature. Traditional algorithms do not cope well with unstructured processes or data, and do not produce clinically meaningful visualizations. Supporting systems that provide additional information, context, and quality assurance inspection are needed. Objective The objective of the study is to explore how routine hospital data can be used to develop data-driven pathways that describe the journeys that patients take through care, and their potential uses in biomedical research; it proposes a framework for the construction, quality assessment, and visualization of patient pathways for clinical studies and decision support, using a case study on prostate cancer. Methods Data pertaining to prostate cancer patients were extracted from a large UK hospital from eight different HIS, validated, and complemented with information from the local cancer registry. Data-driven pathways were built for each of the 1904 patients, and an expert knowledge base, containing rules on the prostate cancer biomarker, was used to assess the completeness and utility of the pathways for a specific clinical study. Software components were built to provide meaningful visualizations for the constructed pathways. Results The proposed framework and pathway formalism enable the summarization, visualization, and querying of complex patient-centric clinical information, as well as the computation of quality indicators and dimensions. A novel graphical representation of the pathways allows the synthesis of such information. Conclusions Clinical pathways built from routinely collected hospital data can unearth information about patients and diseases that may otherwise be unavailable or overlooked in hospitals. Data-driven clinical pathways allow heterogeneous data (ie, semistructured and unstructured data) to be collated over a unified data model, and data quality dimensions to be assessed. This work has enabled further research on prostate cancer and its biomarkers, and on the development and application of methods to mine, compare, analyze, and visualize pathways constructed from routine data. This is an important development for the reuse of big data in hospitals. PMID:26162314
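    At its core, building a data-driven pathway of the kind this record describes means grouping event records by patient and ordering them in time. A minimal sketch with hypothetical events (not the authors' framework, which also validates, quality-scores, and visualizes the pathways):

    ```python
    from collections import defaultdict

    # Hypothetical event records as they might arrive from several hospital
    # information systems: (patient_id, iso_date, source_system, event)
    records = [
        ("p1", "2014-03-02", "pathology", "biopsy"),
        ("p1", "2014-01-15", "clinic", "referral"),
        ("p2", "2014-02-01", "clinic", "referral"),
        ("p1", "2014-02-10", "lab", "PSA test"),
        ("p2", "2014-02-20", "lab", "PSA test"),
    ]

    def build_pathways(records):
        """Group events by patient and order them in time, yielding one
        data-driven pathway (an ordered list of events) per patient."""
        by_patient = defaultdict(list)
        for pid, date, system, event in records:
            by_patient[pid].append((date, system, event))
        # ISO dates sort correctly as strings, so tuple sort orders by time
        return {pid: [event for _, _, event in sorted(events)]
                for pid, events in by_patient.items()}

    pathways = build_pathways(records)
    ```

    For patient p1 this yields the ordered pathway referral → PSA test → biopsy, the kind of per-patient sequence the framework then checks against its expert knowledge base.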

  19. Improving Quality: How Leaders Advance Student Engagement at Private, Tuition-Driven Institutions

    ERIC Educational Resources Information Center

    Sluis, Kimberly A.

    2017-01-01

    Students and families, lawmakers, and the general public have become increasingly concerned about the quality of U.S. higher education. Given the competitive higher education landscape, private, tuition-driven colleges and universities are particularly vulnerable to concerns about quality. This study investigates how faculty and administrative…

  20. Clinical governance: bridging the gap between managerial and clinical approaches to quality of care

    PubMed Central

    Buetow, S. A.; Roland, M.

    1999-01-01

    Clinical governance has been introduced as a new approach to quality improvement in the UK national health service. This article maps clinical governance against a discussion of the four main approaches to measuring and improving quality of care: quality assessment, quality assurance, clinical audit, and quality improvement (including continuous quality improvement). Quality assessment underpins each approach. Whereas clinical audit has, in general, been professionally led, managers have driven quality improvement initiatives. Quality assurance approaches have been perceived to be externally driven by managers or to involve professional inspection. It is discussed how clinical governance seeks to bridge these approaches. Clinical governance allows clinicians in the UK to lead a comprehensive strategy to improve quality within provider organisations, although with an expectation of greatly increased external accountability. Clinical governance aims to bring together managerial, organisational, and clinical approaches to improving quality of care. If successful, it will define a new type of professionalism for the next century. Failure by the professions to seize the opportunity is likely to result in increasingly detailed external control of clinical activity in the UK, as has occurred in some other countries. PMID:10847876

  1. Data that drive: Closing the loop in the learning hospital system.

    PubMed

    Liu, Vincent X; Morehouse, John W; Baker, Jennifer M; Greene, John D; Kipnis, Patricia; Escobar, Gabriel J

    2016-11-01

    The learning healthcare system describes a vision of US healthcare that capitalizes on science, information technology, incentives, and care culture to drive improvements in the quality of health care. The inpatient setting, one of the most costly and impactful domains of healthcare, is an ideal setting in which to use data and information technology to foster continuous learning and quality improvement. The rapid digitization of inpatient medicine offers incredible new opportunities to use data from routine care to generate new discovery and thus close the virtuous cycle of learning. We use an object lesson, sepsis care within the 21 hospitals of the Kaiser Permanente Northern California integrated healthcare delivery system, to offer insight into the critical elements necessary for developing a learning hospital system. We then describe how a hospital-wide data-driven approach to inpatient care can facilitate improvements in the quality of hospital care. Journal of Hospital Medicine 2016;11:S11-S17. © 2016 Society of Hospital Medicine.

  2. Hypothesis driven drug design: improving quality and effectiveness of the design-make-test-analyse cycle.

    PubMed

    Plowright, Alleyn T; Johnstone, Craig; Kihlberg, Jan; Pettersson, Jonas; Robb, Graeme; Thompson, Richard A

    2012-01-01

    In drug discovery, the central process of constructing and testing hypotheses, carefully conducting experiments and analysing the associated data for new findings and information is known as the design-make-test-analyse cycle. Each step relies heavily on the inputs and outputs of the other three components. In this article we report our efforts to improve and integrate all parts to enable smooth and rapid flow of high quality ideas. Key improvements include enhancing multi-disciplinary input into 'Design', increasing the use of knowledge and reducing cycle times in 'Make', providing parallel sets of relevant data within ten working days in 'Test' and maximising the learning in 'Analyse'. Copyright © 2011 Elsevier Ltd. All rights reserved.

  3. Seed-Specific Expression of OsDWF4, a Rate-Limiting Gene Involved in Brassinosteroids Biosynthesis, Improves Both Grain Yield and Quality in Rice.

    PubMed

    Li, Qian-Feng; Yu, Jia-Wen; Lu, Jun; Fei, Hong-Yuan; Luo, Ming; Cao, Bu-Wei; Huang, Li-Chun; Zhang, Chang-Quan; Liu, Qiao-Quan

    2018-04-18

    Brassinosteroids (BRs) are essential plant-specific steroidal hormones that regulate diverse growth and developmental processes in plants. We evaluated the effects of OsDWF4, a gene that encodes a rate-limiting enzyme in BR biosynthesis, on both rice yield and quality when driven by the Gt1 or Ubi promoter, which correspond to seed-specific or constitutive expression, respectively. Generally, transgenic plants expressing OsDWF4 showed increased grain yield with more tillers and longer and heavier seeds. Moreover, the starch physicochemical properties of the transgenic rice were also improved. Interestingly, OsDWF4 was found to exert different effects on either rice yield or quality when driven by the different promoters. The overall performance of the pGt1::OsDWF4 lines was better than that of the pUbi::OsDWF4 lines. Our data not only demonstrate the effects of OsDWF4 overexpression on both rice yield and quality but also suggest that a seed-specific promoter is a good choice in BR-mediated rice breeding programs.

  4. School Improvement under Test-Driven Accountability: A Comparison of High- and Low-Performing Middle Schools in California. CSE Report 717

    ERIC Educational Resources Information Center

    Mintrop, Heinrich; Trujillo, Tina

    2007-01-01

    Based on in-depth data from nine demographically similar schools, the study asks five questions that address key aspects of the improvement process and speak to the consequential validity of accountability indicators: Do schools that differ widely according to system performance criteria also differ on the quality of the educational…

  5. Integrating watershed- and farm-scale modeling framework for targeting critical source areas while maintaining farm economic viability

    USDA-ARS?s Scientific Manuscript database

    Quantitative risk assessments of pollution and data related to the effectiveness of mitigating best management practices (BMPs) are important aspects of nonpoint source (NPS) pollution control efforts, particularly those driven by specific water quality objectives and by measurable improvement goals...

  6. The Promise of Baldrige for K-12 Education. ACT Policy Report.

    ERIC Educational Resources Information Center

    Walpole, MaryBeth; Noeth, Richard J.

    This report examines the evidence available on improving school quality through implementation of the Malcolm Baldrige Education Criteria for Performance Excellence. The Baldrige criteria address many issues that other failed educational efforts have not, including leadership, systems thinking, changes in school culture, and data-driven decision…

  7. Surveying Professionals' Views of Positive Behavior Support and Behavior Analysis

    ERIC Educational Resources Information Center

    Filter, Kevin J.; Tincani, Matt; Fung, Daniel

    2009-01-01

    Positive behavior support (PBS) is an empirically driven approach to improve quality of life influenced by the science of behavior analysis. Recent discussions have evolved around PBS, behavior analysis, and their relationship within education and human services fields. To date, few data have been offered to guide behaviorally oriented…

  8. Agreement and disagreement on health care quality concepts among academic health professionals: the Saudi case.

    PubMed

    Mahrous, Mohamed Saad

    2014-01-01

    A systematic and rigorous implementation of quality improvement processes is likely to improve the well-being of staff members and heighten their job satisfaction. Assessing professionals' perceptions of health care quality should lead to the betterment of health care services. In Saudi Arabia, no previous studies have examined how university health professionals view health care quality concepts. A cross-sectional analytical study was conducted using a self-administered questionnaire with 43 statements assessing the quality perceptions of academic health care professionals. Despite the agreement of health professionals on numerous quality concepts addressed in this study, there was insufficient agreement on 10 core quality concepts, 3 of which were the following: "quality focuses on customers" (50%), "quality is tangible and therefore measurable" (29.3%), and "quality is data-driven" (62%). Hence, providing health professionals with relevant training is likely to generate a better understanding of quality concepts and optimize their performance.

  9. New Driving Scheme to Improve Hysteresis Characteristics of Organic Thin Film Transistor-Driven Active-Matrix Organic Light Emitting Diode Display

    NASA Astrophysics Data System (ADS)

    Yamamoto, Toshihiro; Nakajima, Yoshiki; Takei, Tatsuya; Fujisaki, Yoshihide; Fukagawa, Hirohiko; Suzuki, Mitsunori; Motomura, Genichi; Sato, Hiroto; Tokito, Shizuo; Fujikake, Hideo

    2011-02-01

    A new driving scheme for an active-matrix organic light emitting diode (AMOLED) display was developed to prevent the picture-quality degradation caused by the hysteresis characteristics of organic thin film transistors (OTFTs). In this driving scheme, the gate electrode voltage of the driving OTFT is controlled directly through the storage capacitor, so that once the signal data are stored in the capacitor, the driving OTFT in every pixel operates at a point on the same hysteresis curve. Although the number of OTFTs per pixel in an AMOLED display is restricted, because each OTFT must be large enough to drive the organic light emitting diodes (OLEDs) given their small carrier mobility, the scheme can improve the picture quality of an OTFT-driven flexible OLED display.

  10. Epilepsy informatics and an ontology-driven infrastructure for large database research and patient care in epilepsy.

    PubMed

    Sahoo, Satya S; Zhang, Guo-Qiang; Lhatoo, Samden D

    2013-08-01

    The epilepsy community increasingly recognizes the need for a modern classification system that can also be easily integrated with effective informatics tools. The 2010 reports by the United States President's Council of Advisors on Science and Technology (PCAST) identified informatics as a critical resource to improve quality of patient care, drive clinical research, and reduce the cost of health services. An effective informatics infrastructure for epilepsy, underpinned by a formal knowledge model or ontology, can leverage an ever-increasing amount of multimodal data to (1) improve clinical decision support, (2) broaden access to information for patients and their families, (3) ease data sharing, and (4) accelerate secondary use of clinical data. Modeling the recommendations of the International League Against Epilepsy (ILAE) classification system in the form of an epilepsy domain ontology is essential for consistent use of terminology in a variety of applications, including electronic health record systems and clinical applications. In this review, we discuss the data management issues in epilepsy and explore the benefits of an ontology-driven informatics infrastructure and its role in the adoption of a "data-driven" paradigm in epilepsy research. Wiley Periodicals, Inc. © 2013 International League Against Epilepsy.

  11. Improved data retrieval from TreeBASE via taxonomic and linguistic data enrichment

    PubMed Central

    Anwar, Nadia; Hunt, Ela

    2009-01-01

    Background TreeBASE, the only data repository for phylogenetic studies, is not being used effectively because it does not meet the taxonomic data retrieval requirements of the systematics community. We show, through an examination of the queries performed on TreeBASE, that data retrieval using taxon names is unsatisfactory. Results We report on a new wrapper supporting taxon queries on TreeBASE that utilises a Taxonomy and Classification Database (TCl-Db) we created. TCl-Db holds merged and consolidated taxonomic names from multiple data sources and can be used to translate hierarchical, vernacular and synonym queries into specific query terms in TreeBASE. The query expansion supported by TCl-Db yields a very significant improvement in information retrieval quality. The wrapper can be accessed at the URL given in the original publication. The methodology we developed is scalable and can be applied to new data as they become available in the future. Conclusion Significantly improved data retrieval quality is shown for all queries, and additional flexibility is achieved via user-driven taxonomy selection. PMID:19426482
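
    The query-expansion idea behind TCl-Db can be illustrated with a minimal sketch: map a vernacular name or synonym onto the canonical taxon names actually stored in the repository, then search with the expanded term set. The dictionary entries and function name below are illustrative, not TCl-Db's real contents or API.

```python
# Minimal sketch of taxonomic query expansion (hypothetical data, not TCl-Db).

TAXONOMY = {
    # vernacular / synonym -> canonical taxon names
    "sunflower": ["Helianthus annuus"],
    "daisy family": ["Asteraceae", "Compositae"],  # accepted name + synonym
}

def expand_query(term: str) -> list[str]:
    """Return canonical search terms for a user query."""
    canonical = TAXONOMY.get(term.lower().strip(), [])
    return canonical or [term]  # fall back to the literal term

print(expand_query("Sunflower"))   # ['Helianthus annuus']
print(expand_query("Quercus"))     # ['Quercus'] (no mapping, passed through)
```

    A real implementation would additionally expand hierarchical queries (e.g., a family name into its member genera) against the consolidated taxonomy.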

  12. System-Level Action Required for Wide-Scale Improvement in Quality of Primary Health Care: Synthesis of Feedback from an Interactive Process to Promote Dissemination and Use of Aggregated Quality of Care Data.

    PubMed

    Bailie, Jodie; Laycock, Alison; Matthews, Veronica; Bailie, Ross

    2016-01-01

    There is an enduring gap between recommended practice and care that is actually delivered; and there is wide variation between primary health care (PHC) centers in delivery of care. Where aspects of care are not being done well across a range of PHC centers, this is likely due to inadequacies in the broader system. This paper aims to describe stakeholders' perceptions of the barriers and enablers to addressing gaps in Australian Aboriginal and Torres Strait Islander chronic illness care and child health, and to identify key drivers for improvement. This paper draws on data collected as part of a large-scale continuous quality improvement project in Australian Indigenous PHC settings. We undertook a qualitative assessment of stakeholder feedback on the main barriers and enablers to addressing gaps in care for Aboriginal and Torres Strait Islander children and in chronic illness care. Themes on barriers and enablers were further analyzed to develop a "driver diagram," an improvement tool used to locate barriers and enablers within causal pathways (as primary and secondary drivers), enabling them to be targeted by tailored interventions. We identified 5 primary drivers and 11 secondary drivers of high-quality care, and associated strategies that have potential for wide-scale implementation to address barriers and enablers for improving care. Perceived barriers to addressing gaps in care included both health system and staff attributes. Primary drivers were: staff capability to deliver high-quality care; availability and use of clinical information systems and decision support tools; embedding of quality improvement processes and data-driven decision-making; appropriate and effective recruitment and retention of staff; and community capacity, engagement and mobilization for health. 
Suggested strategies included mechanisms for increasing clinical supervision and support, staff retention, reorientation of service delivery, use of information systems and community health literacy. The findings identify areas of focus for development of barrier-driven, tailored interventions to improve health outcomes. They reinforce the importance of system-level action to improve health center performance and health outcomes, and of developing strategies to address system-wide challenges that can be adapted to local contexts.

  13. Performance enhancement using a balanced scorecard in a Patient-centered Medical Home.

    PubMed

    Fields, Scott A; Cohen, Deborah

    2011-01-01

    Oregon Health & Science University Family Medicine implemented a balanced scorecard within our clinics that embraces the inherent tensions between care quality, financial productivity, and operational efficiency. This data-driven performance improvement process involved: (1) consensus-building around specific indicators to be measured, (2) developing and refining the balanced scorecard, and (3) using the balanced scorecard in the quality improvement process. Developing and implementing the balanced scorecard stimulated an important culture shift among clinics; practice members now actively use data to recognize successes, understand emerging problems, and make changes in response to these problems. Our experience shows how Patient-centered Medical Homes can be enhanced through use of information technology and evidence-based tools that support improved decision making and performance and help practices develop into learning organizations.

  14. Web-Based Predictive Analytics to Improve Patient Flow in the Emergency Department

    NASA Technical Reports Server (NTRS)

    Buckler, David L.

    2012-01-01

    The Emergency Department (ED) simulation project was established to demonstrate how requirements-driven analysis and process simulation can help improve the quality of patient care for the Veterans Health Administration's (VHA) Veterans Affairs Medical Centers (VAMC). This project developed a web-based simulation prototype of patient flow in EDs, validated the performance of the simulation against operational data, and documented IT requirements for the ED simulation.

  15. The six critical attributes of the next generation of quality management software systems.

    PubMed

    Clark, Kathleen

    2011-07-01

    Driven by both the need to meet regulatory requirements and a genuine desire to drive improved quality, quality management systems encompassing standard operating procedure, corrective and preventative actions and related processes have existed for many years, both in paper and electronic form. The impact of quality management systems on 'actual' quality, however, is often reported as far less than desired. A quality management software system that moves beyond formal forms-driven processes to include a true closed loop design, manage disparate processes across the enterprise, provide support for collaborative processes and deliver insight into the overall state of control has the potential to close the gap between simply accomplishing regulatory compliance and delivering measurable improvements in quality and efficiency.

  16. A roadmap for improving healthcare service quality.

    PubMed

    Kennedy, Denise M; Caselli, Richard J; Berry, Leonard L

    2011-01-01

    A data-driven, comprehensive model for improving service and creating long-term value was developed and implemented at Mayo Clinic Arizona (MCA). Healthcare organizations can use this model to prepare for value-based purchasing, a payment system in which quality and patient experience measures will influence reimbursement. Surviving and thriving in such a system will require a comprehensive approach to sustaining excellent service performance from physicians and allied health staff (e.g., nurses, technicians, nonclinical staff). The seven prongs in MCA's service quality improvement model are (1) multiple data sources to drive improvement, (2) accountability for service quality, (3) service consultation and improvement tools, (4) service values and behaviors, (5) education and training, (6) ongoing monitoring and control, and (7) recognition and reward. The model was fully implemented and tested in five departments in which patient perception of provider-specific service attributes and/or overall quality of care were below the 90th percentile for patient satisfaction in the vendor's database. Extent of the implementation was at the discretion of department leadership. Perception data rating various service attributes were collected from randomly selected patients and monitored over a 24-month period. The largest increases in patient perception of excellence over the pilot period were realized when all seven prongs of the model were implemented as a comprehensive improvement approach. The results of this pilot may help other healthcare organizations prepare for value-based purchasing.

  17. Building an Evidence-Driven Child Welfare Workforce: A University–Agency Partnership

    PubMed Central

    Lery, Bridgette; Wiegmann, Wendy; Berrick, Jill Duerr

    2016-01-01

    The federal government increasingly expects child welfare systems to be more responsive to the needs of their local populations, connect strategies to results, and use continuous quality improvement (CQI) to accomplish these goals. A method for improving decision making, CQI relies on an inflow of high-quality data, up-to-date research evidence, and a robust organizational structure and climate that supports the deliberate use of evidence for decision making. This article describes an effort to build and support these essential system components through one public-private child welfare agency–university partnership. PMID:27429534

  18. Using an Inpatient Quality Improvement Curriculum for Internal Medicine Residents to Improve Pneumococcal Conjugate Vaccine Administration Rates.

    PubMed

    Jolin, Jonathan; van Aalst, Robertus; Volpp, Bryan; Taylor, Thomas; Cohen, Emily

    2018-06-01

    Pneumococcal infections are an important source of morbidity and mortality in older adults and persons with compromised immune systems. New recommendations from the Advisory Committee on Immunization Practices (ACIP) became available September 2014, which included recommendations for the use of the 13-valent pneumococcal conjugate vaccine (PCV13). A study was conducted to increase the PCV13 vaccination rates of hospitalized patients at the White River Junction Veterans Affairs Medical Center (White River Junction, Vermont) through the use of a resident-driven quality improvement (QI) project. From December 2014 through April 2016, 16 internal medicine inpatient residents addressed inpatient PCV13 vaccination rates by participating in the facility's QI curriculum. Eight Plan-Do-Study-Act cycles were used, including discharge template editing, electronic reminders, and the discovery of a vaccination administration documentation error in the record through data validation. The measure was the monthly percentage of patients who received PCV13 vaccination (vaccination completion rate) of those discharged from the hospital medicine service who were due for PCV13 vaccination. The percentage of veterans discharged with an up-to-date PCV13 vaccination on discharge increased from approximately 30% to 87% and was sustained. Despite being driven by many different residents, this project demonstrates that continuous improvement can be achieved through a structured and iterative process while providing active learning of core QI concepts to residents. It also displays a method in which new guidelines can be incorporated into practice in an effective manner. Finally, this project is an example of how resident-driven data validation can lead to further improvement. Published by Elsevier Inc.

  19. Collecting data along the continuum of prevention and care: a Continuous Quality Improvement approach.

    PubMed

    Indyk, Leonard; Indyk, Debbie

    2006-01-01

    For the past 14 years, a team of applied social scientists and system analysts has worked with a wide variety of Community- Based Organizations (CBO's), other grassroots agencies and networks, and Medical Center departments to support resource, program, staff and data development and evaluation for hospital- and community-based programs and agencies serving HIV at-risk and affected populations. A by-product of this work has been the development, elaboration and refinement of an approach to Continuous Quality Improvement (CQI) which is appropriate for diverse community-based providers and agencies. A key component of our CQI system involves the installation of a sophisticated relational database management and reporting system (DBMS) which is used to collect, analyze, and report data in an iterative process to provide feedback among the evaluators, agency administration and staff. The database system is designed for two purposes: (1) to support the agency's administrative internal and external reporting requirements; (2) to support the development of practice driven health services and early intervention research. The body of work has fostered a unique opportunity for the development of exploratory service-driven research which serves both administrative and research needs.

  20. Statistical iterative reconstruction to improve image quality for digital breast tomosynthesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Shiyu, E-mail: shiyu.xu@gmail.com; Chen, Ying, E-mail: adachen@siu.edu; Lu, Jianping

    2015-09-15

    Purpose: Digital breast tomosynthesis (DBT) is a novel modality with the potential to improve early detection of breast cancer by providing three-dimensional (3D) imaging with a low radiation dose. 3D image reconstruction presents some challenges: cone-beam and flat-panel geometry, and highly incomplete sampling. A promising means to overcome these challenges is statistical iterative reconstruction (IR), since it provides the flexibility of accurate physics modeling and a general description of system geometry. The authors’ goal was to develop techniques for applying statistical IR to tomosynthesis imaging data. Methods: These techniques include the following: a physics model with a local voxel-pair based prior with flexible parameters to fine-tune image quality; a precomputed parameter λ in the prior, to remove data dependence and to achieve a uniform resolution property; an effective ray-driven technique to compute the forward and backprojection; and an oversampled, ray-driven method to perform high resolution reconstruction with a practical region-of-interest technique. To assess the performance of these techniques, the authors acquired phantom data on the stationary DBT prototype system. To solve the estimation problem, the authors proposed an optimization-transfer based algorithm framework that potentially allows fewer iterations to achieve an acceptably converged reconstruction. Results: IR improved the detectability of low-contrast and small microcalcifications, reduced cross-plane artifacts, improved spatial resolution, and lowered noise in reconstructed images. Conclusions: Although the computational load remains a significant challenge for practical development, the superior image quality provided by statistical IR, combined with advancing computational techniques, may bring benefits to screening, diagnostics, and intraoperative imaging in clinical applications.
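
    The core of iterative reconstruction, repeatedly correcting an image estimate so that its forward projection matches the measured data, can be shown with a toy algebraic example. The sketch below uses a generic SIRT-style update on a random system matrix; it is not the authors' voxel-pair prior or optimization-transfer framework, and the matrix A and problem sizes are made up.

```python
import numpy as np

# Toy iterative reconstruction: drive A @ x toward the measured data b.
rng = np.random.default_rng(0)
n_pixels, n_rays = 8, 32
A = rng.random((n_rays, n_pixels))   # forward-projection (system) matrix
x_true = rng.random(n_pixels)        # "true" object
b = A @ x_true                       # noiseless projection data

x = np.zeros(n_pixels)               # initial image estimate
row_sums = A.sum(axis=1)             # per-ray normalization
col_sums = A.sum(axis=0)             # per-pixel normalization

for _ in range(1000):
    residual = (b - A @ x) / row_sums   # normalized data mismatch per ray
    x += (A.T @ residual) / col_sums    # back-project and update the image

rel_err = np.linalg.norm(A @ x - b) / np.linalg.norm(b)
print(f"relative projection error: {rel_err:.1e}")
```

    In a real DBT system the forward and back projections are computed ray by ray rather than as dense matrix products, and a regularizing prior shapes the update.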

  1. Heterogeneous postsurgical data analytics for predictive modeling of mortality risks in intensive care units.

    PubMed

    Yun Chen; Hui Yang

    2014-01-01

    The rapid advancements of biomedical instrumentation and healthcare technology have resulted in data-rich environments in hospitals. However, the meaningful information extracted from these rich datasets is limited. There is a dire need to go beyond current medical practices and develop data-driven methods and tools that enable (i) the handling of big data, (ii) the extraction of data-driven knowledge, and (iii) the exploitation of acquired knowledge to optimize clinical decisions. The present study focuses on the prediction of mortality rates in Intensive Care Units (ICU) using patient-specific healthcare recordings. It is worth mentioning that postsurgical monitoring in the ICU leads to massive datasets with unique properties, e.g., variable heterogeneity, patient heterogeneity, and time asynchronization. To cope with the challenges in ICU datasets, we developed a postsurgical decision support system with a series of analytical tools, including data categorization, data pre-processing, feature extraction, feature selection, and predictive modeling. Experimental results show that the proposed data-driven methodology outperforms traditional approaches, yielding better results in an evaluation of real-world ICU data from 4000 subjects in the database. This research shows great potential for the use of data-driven analytics to improve the quality of healthcare services.
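
    The pipeline the study describes (pre-processing, then feature selection, then predictive modeling) can be sketched end to end on synthetic data. Everything below is made up for illustration: the feature counts, the correlation-based selector, and the hand-rolled logistic regression are stand-ins, not the authors' actual tools.

```python
import numpy as np

# Synthetic "ICU" dataset: 500 patients, 20 features, 3 of them informative.
rng = np.random.default_rng(42)
n, p = 500, 20
X = rng.normal(size=(n, p))
w_true = np.zeros(p)
w_true[:3] = [1.5, -1.0, 0.8]
y = (rng.random(n) < 1 / (1 + np.exp(-(X @ w_true)))).astype(float)

# Pre-processing: standardize each feature.
X = (X - X.mean(axis=0)) / X.std(axis=0)

# Feature selection: keep the k features most correlated with the outcome.
k = 5
corr = np.abs(np.corrcoef(X.T, y)[:-1, -1])
keep = np.argsort(corr)[-k:]
Xs = X[:, keep]

# Predictive modeling: logistic regression fitted by gradient descent.
w = np.zeros(k)
for _ in range(2000):
    pred = 1 / (1 + np.exp(-(Xs @ w)))
    w -= 0.1 * Xs.T @ (pred - y) / n    # gradient step on the log-loss

acc = np.mean(((1 / (1 + np.exp(-(Xs @ w)))) > 0.5) == (y == 1))
print(f"training accuracy: {acc:.2f}")
```

    Real ICU data would additionally require the categorization, asynchronous-time handling, and heterogeneity treatment the abstract mentions before any such model could be fit.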

  2. Data-Driven Decision Making as a Tool to Improve Software Development Productivity

    ERIC Educational Resources Information Center

    Brown, Mary Erin

    2013-01-01

    The worldwide software project failure rate, based on a survey of information technology software manager's view of user satisfaction, product quality, and staff productivity, is estimated to be between 24% and 36% and software project success has not kept pace with the advances in hardware. The problem addressed by this study was the limited…

  3. Using Quality Improvement to Improve Internal and External Coordination and Referrals.

    PubMed

    Cain, Katherine L; Collins, Ragan P

    As part of accreditation, Public Health Accreditation Board site visitors recommended that the New Orleans Health Department strengthen its quality improvement program. With support from the Public Health Accreditation Board, the New Orleans Health Department subsequently embarked on a data-driven planning process through which it prioritized quality improvement projects for 2016. One of these projects aimed to improve referrals to New Orleans Health Department's direct services programs from local clinics and hospitals to better provide our most vulnerable residents with a continuum of care. After completing a cause-and-effect analysis, we implemented a solution involving increased outreach to health care institutions and saw annual participation increase in 3 out of 4 of our programs. We leveraged this work to successfully apply for funding to create a centralized referral system, which will facilitate partnerships among local health and human service agencies and improve access to services. This is one example of how accreditation has benefited our health department and our community. We have found that the accreditation process promotes a culture of quality and helps health departments identify and address areas for improvement.

  4. Environmental Data-Driven Inquiry and Exploration (EDDIE)- Water Focused Modules for interacting with Big Hydrologic Data

    NASA Astrophysics Data System (ADS)

    Meixner, T.; Gougis, R.; O'Reilly, C.; Klug, J.; Richardson, D.; Castendyk, D.; Carey, C.; Bader, N.; Stomberg, J.; Soule, D. C.

    2016-12-01

    High-frequency sensor data are driving a shift in the Earth and environmental sciences. The availability of high-frequency data creates an engagement opportunity for undergraduate students in primary research by using large, long-term, sensor-based data directly in the scientific curriculum. Project EDDIE (Environmental Data-Driven Inquiry & Exploration) has developed flexible classroom activity modules designed to meet a series of pedagogical goals that include (1) developing the skills required to manipulate large datasets at different scales to conduct inquiry-based investigations; (2) developing students' reasoning about statistical variation; and (3) fostering accurate student conceptions about the nature of environmental science. The modules cover a wide range of topics, including lake physics and metabolism, stream discharge, water quality, soil respiration, seismology, and climate change. In this presentation we will focus on a sequence of modules of particular interest to hydrologists: stream discharge, water quality and nutrient loading. Assessment results show that our modules make students more comfortable analyzing data and lead to improved understanding of statistical concepts and stronger data-analysis skills. This project is funded by an NSF TUES grant (NSF DEB 1245707).

  5. Accurate evaluation and analysis of functional genomics data and methods

    PubMed Central

    Greene, Casey S.; Troyanskaya, Olga G.

    2016-01-01

    The development of technology capable of inexpensively performing large-scale measurements of biological systems has generated a wealth of data. Integrative analysis of these data holds the promise of uncovering gene function and regulation and, in the longer run, of understanding complex disease. However, their analysis has proved very challenging, as it is difficult to quickly and effectively assess the relevance and accuracy of these data for individual biological questions. Here, we identify biases that present challenges for the assessment of functional genomics data and methods. We then discuss evaluation methods that, taken together, begin to address these issues. We also argue that the funding of systematic data-driven experiments and of high-quality curation efforts will further improve evaluation metrics so that they more accurately assess functional genomics data and methods. Such metrics will allow researchers in the field of functional genomics to continue to answer important biological questions in a data-driven manner. PMID:22268703

  6. Good, better, best? A comprehensive comparison of healthcare providers' performance: An application to physiotherapy practices in primary care.

    PubMed

    Steenhuis, Sander; Groeneweg, Niels; Koolman, Xander; Portrait, France

    2017-12-01

    Most payment methods in healthcare stimulate volume-driven care rather than value-driven care. Value-based payment methods such as Pay-For-Performance have the potential to reduce costs and improve quality of care. Ideally, outcome indicators are used in the assessment of providers' performance. The aim of this paper is to describe the feasibility of assessing and comparing the performances of providers using a comprehensive set of quality and cost data. We had access to unique and extensive datasets containing individual data on PROMs, PREMs and costs of physiotherapy practices in Dutch primary care. We merged these datasets at the patient level and compared the performances of these practices using case-mix corrected linear regression models. Several significant differences in performance were detected between practices. These results can be used both by physiotherapists, to improve the treatment they provide, and by insurers, to support their purchasing decisions. The study demonstrates that it is feasible to compare the performance of providers using PROMs and PREMs. However, it would take extra effort to increase their usefulness, and it remains unclear under which conditions this effort is cost-effective. Healthcare providers need to be aware of the added value of registering outcomes to improve their quality. Insurers need to facilitate this by designing value-based contracts with the right incentives. Only then can payment methods contribute to value-based healthcare and increase value for patients. Copyright © 2017 Elsevier B.V. All rights reserved.
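
    A case-mix corrected comparison of this kind amounts to regressing a patient outcome on case-mix variables plus provider indicators, then comparing the estimated provider effects. The sketch below does this on synthetic data; the variables (age, severity) and the simple OLS specification are hypothetical stand-ins, and the paper's actual model may differ.

```python
import numpy as np

# Synthetic data: 3 providers, 200 patients each, with case-mix confounding.
rng = np.random.default_rng(1)
n_per, n_providers = 200, 3
provider = np.repeat(np.arange(n_providers), n_per)
age = rng.normal(60, 10, size=provider.size)        # case-mix variable
severity = rng.normal(0, 1, size=provider.size)     # case-mix variable
true_effect = np.array([0.0, 0.5, -0.3])            # provider performance
outcome = (true_effect[provider] - 0.02 * age + 0.4 * severity
           + rng.normal(0, 1, size=provider.size))  # e.g., a PROM score

# Design matrix: intercept, case-mix variables, dummies for providers 1 and 2
# (provider 0 is the reference category).
X = np.column_stack([
    np.ones(provider.size), age, severity,
    (provider == 1).astype(float), (provider == 2).astype(float),
])
beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
print("case-mix-adjusted effects vs provider 0:", beta[3:])
```

    The dummy coefficients recover the providers' performance differences net of patient mix, which is what makes the between-practice comparison fair.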

  7. The Alliance for Innovation in Maternal Health Care: A Way Forward.

    PubMed

    Mahoney, Jeanne

    2018-06-01

    The Alliance for Innovation in Maternal Health is a program supported by the Health Resources and Services Administration to reduce maternal mortality and severe maternal morbidity in the United States. The program develops bundles of evidence-based action steps for birth facilities to adapt. Progress is monitored at the facility, state and national levels to foster data-driven quality improvement efforts.

  8. MyPOD: an EMR-Based Tool that Facilitates Quality Improvement and Maintenance of Certification.

    PubMed

    Berman, Loren; Duffy, Brian; Randall Brenn, B; Vinocur, Charles

    2017-03-01

    Maintenance of Certification (MOC) was designed to assess physician competencies including operative case volume and outcomes. This information, if collected consistently and systematically, can be used to facilitate quality improvement. Information automatically extracted from the electronic medical record (EMR) can be used as a prompt to compile these data. We developed an EMR-based program called MyPOD (My Personal Outcomes Data) to track surgical outcomes at our institution. We compared occurrences reported in the first 18 months to those captured in the American College of Surgeons National Surgical Quality Improvement Program-Pediatric (ACS NSQIP-P) over the same time period. During the first 18 months of using MyPOD, 691 cases were captured in both MyPOD and NSQIP-P. There were 48 cases with occurrences in NSQIP-P (6.9% occurrence rate). MyPOD captured 33% of the occurrences and 83% of the deaths reported in NSQIP-P. Use of the MyPOD program helped to identify series of complications and facilitated systematic change to improve outcomes. MyPOD provides comparative data that is essential in performance evaluation and facilitates quality improvement in surgery. This program and similar EMR-driven tools are becoming essential components of the MOC process. Our initial review has revealed opportunities for improvement in self-reporting which we can continue to measure by comparison to NSQIP-P. In addition, it has identified systems issues that have led to hospital-wide improvements.

  9. Impact of Data-driven Respiratory Gating in Clinical PET.

    PubMed

    Büther, Florian; Vehren, Thomas; Schäfers, Klaus P; Schäfers, Michael

    2016-10-01

    Purpose To study the feasibility and impact of respiratory gating in positron emission tomographic (PET) imaging in a clinical trial comparing conventional hardware-based gating with a data-driven approach and to describe the distribution of determined parameters. Materials and Methods This prospective study was approved by the ethics committee of the University Hospital of Münster (AZ 2014-217-f-N). Seventy-four patients suspected of having abdominal or thoracic fluorine 18 fluorodeoxyglucose (FDG)-positive lesions underwent clinical whole-body FDG PET/computed tomographic (CT) examinations. Respiratory gating was performed by using a pressure-sensitive belt system (belt gating [BG]) and an automatic data-driven approach (data-driven gating [DDG]). PET images were analyzed for lesion uptake, metabolic volumes, respiratory shifts of lesions, and diagnostic image quality. Results Forty-eight patients had at least one lesion in the field of view, resulting in a total of 164 lesions analyzed (range of number of lesions per patient, one to 13). Both gating methods revealed respiratory shifts of lesions (4.4 mm ± 3.1 for BG vs 4.8 mm ± 3.6 for DDG, P = .76). Increase in uptake of the lesions compared with nongated values did not differ significantly between both methods (maximum standardized uptake value [SUVmax], +7% ± 13 for BG vs +8% ± 16 for DDG, P = .76). Similarly, gating significantly decreased metabolic lesion volumes with both methods (-6% ± 26 for BG vs -7% ± 21 for DDG, P = .44) compared with nongated reconstructions. Blinded reading revealed significant improvements in diagnostic image quality when using gating, without significant differences between the methods (DDG was judged to be inferior to BG in 22 cases, equal in 12 cases, and superior in 15 cases; P = .32). Conclusion Respiratory gating increases diagnostic image quality and uptake values and decreases metabolic volumes compared with nongated acquisitions. 
Data-driven approaches are clinically applicable alternatives to belt-based methods and might help establish routine respiratory gating in clinical PET/CT. © RSNA, 2016. Online supplemental material is available for this article.
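
    The amplitude-based binning that underlies both belt-based and data-driven gating can be sketched as follows. This is a simplified illustration, not the study's implementation; the gate count, sampling rate, and synthetic respiratory trace are assumptions.

```python
import numpy as np

def amplitude_gate(signal, n_gates=8):
    """Assign each sample of a respiratory trace to one of n_gates
    amplitude bins with roughly equal counts (amplitude-based gating)."""
    # Quantile edges give bins holding similar numbers of samples
    edges = np.quantile(signal, np.linspace(0, 1, n_gates + 1))
    edges[-1] += 1e-9                      # make the top edge inclusive
    return np.digitize(signal, edges) - 1  # gate index 0 .. n_gates-1

# Synthetic trace: ~0.25 Hz breathing sampled at 10 Hz for 60 s (illustrative only)
rng = np.random.default_rng(0)
t = np.arange(0, 60, 0.1)
resp = np.sin(2 * np.pi * 0.25 * t) + 0.05 * rng.normal(size=t.size)
gates = amplitude_gate(resp)
```

    Each gate can then be reconstructed separately, so that lesions are imaged near a single respiratory amplitude rather than blurred across the cycle.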

  10. Humor adds the creative touch to CQI teams.

    PubMed

    Balzer, J W

    1994-07-01

    The health care industry is looking to continuous quality improvement as a process to both improve patient care and promote cost effectiveness. Interdisciplinary teams are learning to work together and to use data-driven problem solving. Humor adds a creative and welcome touch to the process that makes it easier and more fun to work in teams. The team leader or facilitator who uses humor along the journey sanctions the risk-taking behavior that accompanies creative solutions to tough problems.

  11. Evaluation of Normalization Methods to Pave the Way Towards Large-Scale LC-MS-Based Metabolomics Profiling Experiments

    PubMed Central

    Valkenborg, Dirk; Baggerman, Geert; Vanaerschot, Manu; Witters, Erwin; Dujardin, Jean-Claude; Burzykowski, Tomasz; Berg, Maya

    2013-01-01

Abstract Combining liquid chromatography-mass spectrometry (LC-MS)-based metabolomics experiments that were collected over a long period of time remains problematic due to systematic variability between LC-MS measurements. Until now, most normalization methods for LC-MS data have been model-driven, based on internal standards or intermediate quality control runs, where an external model is extrapolated to the dataset of interest. In the first part of this article, we evaluate several existing data-driven normalization approaches on LC-MS metabolomics experiments, which do not require the use of internal standards. According to variability measures, each normalization method performs relatively well, showing that the use of any normalization method will greatly improve data analysis originating from multiple experimental runs. In the second part, we apply cyclic-Loess normalization to a Leishmania sample. This normalization method allows the removal of systematic variability between two measurement blocks over time and maintains the differential metabolites. In conclusion, normalization allows for pooling datasets from different measurement blocks over time and increases the statistical power of the analysis, hence paving the way to increase the scale of LC-MS metabolomics experiments. From our investigation, we recommend data-driven normalization methods over model-driven normalization methods if only a few internal standards were used. Moreover, data-driven normalization methods are the best option to normalize datasets from untargeted LC-MS experiments. PMID:23808607
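
    The two-sample case of cyclic-Loess normalization can be sketched in Python: fit a smooth trend to the log-ratio M versus the mean log-intensity A, then split the correction between the two runs. The local-linear smoother and the simulated intensities below are simplified assumptions, not the authors' implementation.

```python
import numpy as np

def loess(x, y, frac=0.3):
    """Minimal tricube-weighted local-linear smoother (a stand-in for loess)."""
    k = max(2, int(frac * len(x)))
    yhat = np.empty(len(x))
    for i in range(len(x)):
        d = np.abs(x - x[i])
        idx = np.argsort(d)[:k]                               # k nearest neighbours
        w = (1 - (d[idx] / (d[idx].max() + 1e-12)) ** 3) ** 3  # tricube weights
        b, a = np.polyfit(x[idx], y[idx], 1, w=np.sqrt(w))     # weighted linear fit
        yhat[i] = a + b * x[i]
    return yhat

def cyclic_loess_pair(s1, s2, n_iter=3):
    """Normalize two log-intensity vectors against each other: fit a trend to
    M = s1 - s2 versus A = (s1 + s2) / 2 and shift each run by half of it."""
    s1, s2 = s1.astype(float).copy(), s2.astype(float).copy()
    for _ in range(n_iter):
        trend = loess((s1 + s2) / 2, s1 - s2)
        s1 -= trend / 2
        s2 += trend / 2
    return s1, s2

rng = np.random.default_rng(1)
run1 = rng.normal(10, 1, 200)                   # log intensities, measurement block 1
run2 = run1 + 0.5 + rng.normal(0, 0.02, 200)    # block 2 with a systematic offset
n1, n2 = cyclic_loess_pair(run1, run2)
```

    With more than two runs, cyclic-Loess iterates this pairwise correction over all pairs of samples, which is where the "cyclic" in the name comes from.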

  12. Influencers on quality of life as reported by people living with dementia in long-term care: a descriptive exploratory approach.

    PubMed

    Moyle, Wendy; Fetherstonhaugh, Deirdre; Greben, Melissa; Beattie, Elizabeth

    2015-04-23

Over half of the residents in long-term care have a diagnosis of dementia. Maintaining quality of life is important, as there is no cure for dementia. Quality of life may be used as a benchmark for caregiving, and can help to enhance respect for the person with dementia and to improve care provision. The purpose of this study was to describe quality of life as reported by people living with dementia in long-term care, in terms of the factors that influence quality of life and the strategies needed to improve it. A descriptive exploratory approach was used. A subsample of twelve residents across two Australian states from a national quantitative study on quality of life was interviewed. Data were analysed thematically from a realist perspective. The approach to the thematic analysis was inductive and data-driven. Three themes emerged in relation to influencers and strategies related to quality of life: (a) maintaining independence, (b) having something to do, and (c) the importance of social interaction. The findings highlight the importance of understanding individual resident needs and consideration of the complexity of living in large group living situations, in particular with regard to resident decision-making.

  13. Mass imbalances in EPANET water-quality simulations

    NASA Astrophysics Data System (ADS)

    Davis, Michael J.; Janke, Robert; Taxon, Thomas N.

    2018-04-01

    EPANET is widely employed to simulate water quality in water distribution systems. However, in general, the time-driven simulation approach used to determine concentrations of water-quality constituents provides accurate results only for short water-quality time steps. Overly long time steps can yield errors in concentration estimates and can result in situations in which constituent mass is not conserved. The use of a time step that is sufficiently short to avoid these problems may not always be feasible. The absence of EPANET errors or warnings does not ensure conservation of mass. This paper provides examples illustrating mass imbalances and explains how such imbalances can occur because of fundamental limitations in the water-quality routing algorithm used in EPANET. In general, these limitations cannot be overcome by the use of improved water-quality modeling practices. This paper also presents a preliminary event-driven approach that conserves mass with a water-quality time step that is as long as the hydraulic time step. Results obtained using the current approach converge, or tend to converge, toward those obtained using the preliminary event-driven approach as the water-quality time step decreases. Improving the water-quality routing algorithm used in EPANET could eliminate mass imbalances and related errors in estimated concentrations. The results presented in this paper should be of value to those who perform water-quality simulations using EPANET or use the results of such simulations, including utility managers and engineers.
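
    The mass-balance check underlying the paper's analysis can be illustrated for a single pipe: over a simulation period, inflow mass minus outflow mass minus the change in stored mass should be zero. This sketch is a conceptual illustration only, not EPANET's routing algorithm; the flow, travel time, and concentration series are hypothetical.

```python
import numpy as np

def mass_balance_error(c_in, c_out, q, mass_start, mass_end, dt):
    """Mass-balance residual for one pipe over a simulation period:
    inflow mass minus outflow mass minus the change in stored mass.
    Concentrations in mg/L, flow q in L/s, time step dt in s."""
    m_in = np.sum(np.asarray(c_in) * q * dt)
    m_out = np.sum(np.asarray(c_out) * q * dt)
    return m_in - m_out - (mass_end - mass_start)  # ~0 if mass is conserved

q, dt, lag = 10.0, 60.0, 3                        # hypothetical flow, step, travel time
c_in = np.array([0, 0, 0, 0, 0, 5, 5, 5], float)  # inlet concentration per step
c_out = np.concatenate([np.zeros(lag), c_in[:-lag]])  # ideal plug flow: shifted inlet
mass_end = np.sum(c_in[-lag:] * q * dt)           # pulse still in transit at period end
err = mass_balance_error(c_in, c_out, q, 0.0, mass_end, dt)
```

    A time-driven scheme whose water-quality step is too coarse to resolve the travel time can misplace or drop in-transit parcels, in which case this residual becomes nonzero even though the simulator reports no error.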

  14. Combining Knowledge and Data Driven Insights for Identifying Risk Factors using Electronic Health Records

    PubMed Central

    Sun, Jimeng; Hu, Jianying; Luo, Dijun; Markatou, Marianthi; Wang, Fei; Edabollahi, Shahram; Steinhubl, Steven E.; Daar, Zahra; Stewart, Walter F.

    2012-01-01

Background: The ability to identify the risk factors related to an adverse condition, e.g., heart failure (HF) diagnosis, is very important for improving care quality and reducing cost. Existing approaches for risk factor identification are either knowledge driven (from guidelines or literature) or data driven (from observational data). No existing method provides a model to effectively combine expert knowledge with data-driven insight for risk factor identification. Methods: We present a systematic approach to enhance known knowledge-based risk factors with additional potential risk factors derived from data. The core of our approach is a sparse regression model with regularization terms that correspond to both knowledge- and data-driven risk factors. Results: The approach is validated using a large dataset containing 4,644 heart failure cases and 45,981 controls. The outpatient electronic health records (EHRs) for these patients include diagnoses, medications, and lab results from 2003–2010. We demonstrate that the proposed method can identify complementary risk factors that are not among the existing known factors and can better predict the onset of HF. We quantitatively compare different sets of risk factors in the context of predicting onset of HF using the area under the ROC curve (AUC) as the performance metric. The combined knowledge and data-driven risk factors significantly outperform knowledge-based risk factors alone. Furthermore, the additional risk factors were confirmed to be clinically meaningful by a cardiologist. Conclusion: We present a systematic framework for combining knowledge- and data-driven insights for risk factor identification. We demonstrate the power of this framework in the context of predicting onset of HF, where our approach can successfully identify intuitive and predictive risk factors beyond a set of known HF risk factors. PMID:23304365
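
    One generic way to realize a sparse model with separate penalties for knowledge-driven and data-driven features is a weighted Lasso: known risk factors get a near-zero L1 weight so they stay in the model, while candidate features get a larger weight that enforces sparsity. The ISTA solver and simulated data below are an assumption-laden sketch, not the authors' formulation.

```python
import numpy as np

def weighted_lasso(X, y, lam, n_iter=500):
    """ISTA for least squares with a per-feature L1 weight vector lam."""
    n, p = X.shape
    L = np.linalg.norm(X, 2) ** 2 / n              # Lipschitz constant of the gradient
    beta = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y) / n            # least-squares gradient
        z = beta - grad / L                        # gradient step
        beta = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return beta

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 10))
true = np.zeros(10)
true[[0, 1, 5]] = [1.0, -1.0, 0.8]                 # features 0,1 "known"; 5 "novel"
y = X @ true + 0.1 * rng.normal(size=200)
lam = np.full(10, 0.1)                             # default penalty on candidates
lam[[0, 1]] = 0.001                                # barely penalize known factors
beta = weighted_lasso(X, y, lam)
```

    The differential penalty lets the data surface a complementary predictor (feature 5 here) while irrelevant candidates are shrunk to zero, mirroring the paper's goal of augmenting a known risk-factor set.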

  15. Cost and impact of a quality improvement programme in mental health services.

    PubMed

    Beecham, Jennifer; Ramsay, Angus; Gordon, Kate; Maltby, Sophie; Walshe, Kieran; Shaw, Ian; Worrall, Adrian; King, Sarah

    2010-04-01

To estimate the cost and impact of a centrally-driven quality improvement initiative in four UK mental health communities. Total costs in year 1 were identified using documentation, a staff survey, semi-structured interviews and discussion groups. Few outcome data were collected within the programme, so thematic analysis was used to identify the programme's impact within its five broad underlying principles. The survey had a 40% response rate. Total costs ranged between £164,000 and £458,000 per site, plus staff time spent on workstreams. There was a very hazy view of the resources absorbed and poor recording of expenditure and activity. The initiative generated few demonstrable improvements in service quality, although some participants reported changes in attitudes. Given the difficult contexts, short time-scales and capacity constraints, the programme's lack of impact is not surprising. It may, however, represent a worthwhile investment in cultural change which might facilitate improvements in how services are delivered.

  16. A comprehensive review of the SLMTA literature part 2: Measuring success.

    PubMed

    Luman, Elizabeth T; Yao, Katy; Nkengasong, John N

    2014-01-01

    Since its introduction in 2009, the Strengthening Laboratory Management Toward Accreditation (SLMTA) programme has been implemented in 617 laboratories in 47 countries. We completed a systematic review of the published literature on SLMTA. The review consists of two companion papers; this article examines quantitative evidence presented in the publications along with a meta-analysis of selected results. We identified 28 published articles with data from SLMTA implementation. The SLMTA programme was evaluated through audits based on a standard checklist, which is divided into 12 sections corresponding to the 12 Quality System Essentials (QSEs). Several basic service delivery indicators reported by programmes were also examined. Results for various components of the programme were reviewed and summarised; a meta-analysis of QSE results grouped by the three stages of the quality cycle was conducted for 126 laboratories in 12 countries. Global programme data show improved quality in SLMTA laboratories in every country, with average improvements on audit scores of 25 percentage points. Meta-analysis identified Improvement Management as the weakest stage, with internal audit (8%) and occurrence management (16%) showing the lowest scores. Studies documented 19% - 95% reductions in turn-around times, 69% - 93% reductions in specimen rejection rates, 76% - 81% increases in clinician satisfaction rates, 67% - 85% improvements in external quality assessment results, 50% - 66% decreases in nonconformities and 67% increases in staff punctuality. The wide array of results reported provides a comprehensive picture of the SLMTA programme overall, suggesting a substantive impact on provision of quality laboratory services and patient care. These comprehensive results establish a solid data-driven foundation for program improvement and further expansion.

  17. Improved spatial accuracy of functional maps in the rat olfactory bulb using supervised machine learning approach.

    PubMed

    Murphy, Matthew C; Poplawsky, Alexander J; Vazquez, Alberto L; Chan, Kevin C; Kim, Seong-Gi; Fukuda, Mitsuhiro

    2016-08-15

    Functional MRI (fMRI) is a popular and important tool for noninvasive mapping of neural activity. As fMRI measures the hemodynamic response, the resulting activation maps do not perfectly reflect the underlying neural activity. The purpose of this work was to design a data-driven model to improve the spatial accuracy of fMRI maps in the rat olfactory bulb. This system is an ideal choice for this investigation since the bulb circuit is well characterized, allowing for an accurate definition of activity patterns in order to train the model. We generated models for both cerebral blood volume weighted (CBVw) and blood oxygen level dependent (BOLD) fMRI data. The results indicate that the spatial accuracy of the activation maps is either significantly improved or at worst not significantly different when using the learned models compared to a conventional general linear model approach, particularly for BOLD images and activity patterns involving deep layers of the bulb. Furthermore, the activation maps computed by CBVw and BOLD data show increased agreement when using the learned models, lending more confidence to their accuracy. The models presented here could have an immediate impact on studies of the olfactory bulb, but perhaps more importantly, demonstrate the potential for similar flexible, data-driven models to improve the quality of activation maps calculated using fMRI data. Copyright © 2016 Elsevier Inc. All rights reserved.
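
    For reference, the conventional general linear model (GLM) baseline that the learned models are compared against can be sketched as follows: regress each voxel's time series on a task regressor and map the resulting t statistics. The boxcar regressor and two simulated voxels are illustrative assumptions, not the study's data.

```python
import numpy as np

def glm_t_map(ts, regressor):
    """Per-voxel t statistics from a GLM: regress each voxel time series
    (columns of ts) on an intercept plus a task regressor."""
    n_t = ts.shape[0]
    X = np.column_stack([np.ones(n_t), regressor])
    beta, rss, *_ = np.linalg.lstsq(X, ts, rcond=None)  # rss: residual SS per voxel
    sigma2 = rss / (n_t - X.shape[1])                   # residual variance estimate
    c_var = np.linalg.inv(X.T @ X)[1, 1]                # variance factor for the slope
    return beta[1] / np.sqrt(sigma2 * c_var)            # t statistic per voxel

rng = np.random.default_rng(3)
regressor = np.tile(np.r_[np.zeros(10), np.ones(10)], 5)  # 100-pt boxcar "stimulus"
ts = rng.normal(size=(100, 2))        # two voxels of noise
ts[:, 0] += 2.0 * regressor           # voxel 0 responds to the stimulus
t_map = glm_t_map(ts, regressor)
```

    A supervised approach of the kind the paper describes would instead learn the mapping from hemodynamic responses to activation labels using training data with known activity patterns, rather than assuming a fixed regressor shape.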

  18. Data-Driven Diffusion Of Innovations: Successes And Challenges In 3 Large-Scale Innovative Delivery Models

    PubMed Central

    Dorr, David A.; Cohen, Deborah J.; Adler-Milstein, Julia

    2018-01-01

    Failed diffusion of innovations may be linked to an inability to use and apply data, information, and knowledge to change perceptions of current practice and motivate change. Using qualitative and quantitative data from three large-scale health care delivery innovations—accountable care organizations, advanced primary care practice, and EvidenceNOW—we assessed where data-driven innovation is occurring and where challenges lie. We found that implementation of some technological components of innovation (for example, electronic health records) has occurred among health care organizations, but core functions needed to use data to drive innovation are lacking. Deficits include the inability to extract and aggregate data from the records; gaps in sharing data; and challenges in adopting advanced data functions, particularly those related to timely reporting of performance data. The unexpectedly high costs and burden incurred during implementation of the innovations have limited organizations’ ability to address these and other deficits. Solutions that could help speed progress in data-driven innovation include facilitating peer-to-peer technical assistance, providing tailored feedback reports to providers from data aggregators, and using practice facilitators skilled in using data technology for quality improvement to help practices transform. Policy efforts that promote these solutions may enable more rapid uptake of and successful participation in innovative delivery system reforms. PMID:29401031

  19. Data-Driven Diffusion Of Innovations: Successes And Challenges In 3 Large-Scale Innovative Delivery Models.

    PubMed

    Dorr, David A; Cohen, Deborah J; Adler-Milstein, Julia

    2018-02-01

    Failed diffusion of innovations may be linked to an inability to use and apply data, information, and knowledge to change perceptions of current practice and motivate change. Using qualitative and quantitative data from three large-scale health care delivery innovations-accountable care organizations, advanced primary care practice, and EvidenceNOW-we assessed where data-driven innovation is occurring and where challenges lie. We found that implementation of some technological components of innovation (for example, electronic health records) has occurred among health care organizations, but core functions needed to use data to drive innovation are lacking. Deficits include the inability to extract and aggregate data from the records; gaps in sharing data; and challenges in adopting advanced data functions, particularly those related to timely reporting of performance data. The unexpectedly high costs and burden incurred during implementation of the innovations have limited organizations' ability to address these and other deficits. Solutions that could help speed progress in data-driven innovation include facilitating peer-to-peer technical assistance, providing tailored feedback reports to providers from data aggregators, and using practice facilitators skilled in using data technology for quality improvement to help practices transform. Policy efforts that promote these solutions may enable more rapid uptake of and successful participation in innovative delivery system reforms.

  20. Five key pillars of an analytics center of excellence, which are required to manage populations and transform organizations into the next era of health care.

    PubMed

    Reichert, Jim; Furlong, Gerry

    2014-01-01

Acute care facilities are experiencing fiscal challenges, as noted by decreasing admissions and lower reimbursement, creating an unsustainable fiscal environment as we move into the next era of health care. This situation necessitates a strategy to move away from acting solely on hunches and instinct to using analytics to become a truly data-driven organization that identifies opportunities within patient populations to improve the quality and efficiency of care across the continuum. A brief overview of knowledge management philosophies is provided, along with how they are used to enable organizations to leverage data, information, and knowledge for operational transformation leading to improved outcomes. This article outlines the 5 key pillars of an Analytics Center of Excellence (ACoE): governance, organizational structure, people, process, and technology, which are foundational to the development of this strategy. While culture is the most important factor in achieving organizational transformation and improved care delivery, it is the 5 pillars of the ACoE that will enable the culture shift necessary to become a truly data-driven organization and thus achieve transformation into the next era of health care.

  1. Six sigma tools for a patient safety-oriented, quality-checklist driven radiation medicine department.

    PubMed

    Kapur, Ajay; Potters, Louis

    2012-01-01

    The purpose of this work was to develop and implement six sigma practices toward the enhancement of patient safety in an electronic, quality checklist-driven, multicenter, paperless radiation medicine department. A quality checklist process map (QPM), stratified into consultation through treatment-completion stages was incorporated into an oncology information systems platform. A cross-functional quality management team conducted quality-function-deployment and define-measure-analyze-improve-control (DMAIC) six sigma exercises with a focus on patient safety. QPM procedures were Pareto-sorted in order of decreasing patient safety risk with failure mode and effects analysis (FMEA). Quantitative metrics for a grouped set of highest risk procedures were established. These included procedural delays, associated standard deviations and six sigma Z scores. Baseline performance of the QPM was established over the previous year of usage. Data-driven analysis led to simplification, standardization, and refinement of the QPM with standard deviation, slip-day reduction, and Z-score enhancement goals. A no-fly policy (NFP) for patient safety was introduced at the improve-control DMAIC phase, with a process map interlock imposed on treatment initiation in the event of FMEA-identified high-risk tasks being delayed or not completed. The NFP was introduced in a pilot phase with specific stopping rules and the same metrics used for performance assessments. A custom root-cause analysis database was deployed to monitor patient safety events. Relative to the baseline period, average slip days and standard deviations for the risk-enhanced QPM procedures improved by over threefold factors in the NFP period. The Z scores improved by approximately 20%. A trend for proactive delays instead of reactive hard stops was observed with no adverse effects of the NFP. The number of computed potential no-fly delays per month dropped from 60 to 20 over a total of 520 cases. 
The fraction of computed potential no-fly cases that were delayed in NFP compliance rose from 28% to 45%. Proactive delays rose to 80% of all delayed cases. For potential no-fly cases, event reporting rose from 18% to 50%, while for actually delayed cases, event reporting rose from 65% to 100%. With complex technologies, resource-compromised staff, and pressures to hasten treatment initiation, the use of the six sigma driven process interlocks may mitigate potential patient safety risks as demonstrated in this study. Copyright © 2012 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
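
    The six sigma Z score used as a metric here is simply the number of standard deviations between the mean procedural delay and its upper specification limit, so reducing the mean slip days or their spread raises Z. The slip-day numbers below are hypothetical, chosen only to illustrate the calculation.

```python
import numpy as np

def sigma_z(slip_days, upper_spec):
    """Six sigma Z score: distance from the mean delay to the upper
    specification limit, measured in sample standard deviations."""
    return (upper_spec - np.mean(slip_days)) / np.std(slip_days, ddof=1)

baseline = [2, 4, 6, 8, 10]   # hypothetical slip days before the no-fly policy
nfp = [1, 2, 2, 3]            # hypothetical slip days under the policy
z_before, z_after = sigma_z(baseline, 10), sigma_z(nfp, 10)
```

    A threefold reduction in both the mean and the standard deviation, as reported for the risk-enhanced procedures, improves Z through both terms at once.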

  2. Application of a theoretical model to evaluate COPD disease management.

    PubMed

    Lemmens, Karin M M; Nieboer, Anna P; Rutten-Van Mölken, Maureen P M H; van Schayck, Constant P; Asin, Javier D; Dirven, Jos A M; Huijsman, Robbert

    2010-03-26

Disease management programmes are heterogeneous in nature and often lack a theoretical basis. An evaluation model has been developed in which theoretically driven inquiries link disease management interventions to outcomes. The aim of this study is to methodically evaluate the impact of a disease management programme for patients with chronic obstructive pulmonary disease (COPD) on process, intermediate and final outcomes of care in a general practice setting. A quasi-experimental study was performed with 12-month follow-up of 189 COPD patients in primary care in the Netherlands. The programme included patient education, protocolised assessment and treatment of COPD, structural follow-up and coordination by practice nurses at 3, 6 and 12 months. Data on intermediate outcomes (knowledge, psychosocial mediators, self-efficacy and behaviour) and final outcomes (dyspnoea, quality of life, measured by the CRQ and CCQ, and patient experiences) were obtained from questionnaires and electronic registries. Implementation of the programme was associated with significant improvements in dyspnoea (p < 0.001) and patient experiences (p < 0.001). No significant improvement was found in mean quality of life scores. Improvements were found in several intermediate outcomes, including investment beliefs (p < 0.05), disease-specific knowledge (p < 0.01; p < 0.001) and medication compliance (p < 0.01). Overall, process improvement was established. The model showed associations between significantly improved intermediate outcomes and improvements in quality of life and dyspnoea. The application of a theory-driven model enhances the design and evaluation of disease management programmes aimed at improving health outcomes. This study supports the notion that a theoretical approach strengthens the evaluation designs of complex interventions. 
Moreover, it provides prudent evidence that the implementation of COPD disease management programmes can positively influence outcomes of care.

  3. Application of a theoretical model to evaluate COPD disease management

    PubMed Central

    2010-01-01

Background Disease management programmes are heterogeneous in nature and often lack a theoretical basis. An evaluation model has been developed in which theoretically driven inquiries link disease management interventions to outcomes. The aim of this study is to methodically evaluate the impact of a disease management programme for patients with chronic obstructive pulmonary disease (COPD) on process, intermediate and final outcomes of care in a general practice setting. Methods A quasi-experimental study was performed with 12-month follow-up of 189 COPD patients in primary care in the Netherlands. The programme included patient education, protocolised assessment and treatment of COPD, structural follow-up and coordination by practice nurses at 3, 6 and 12 months. Data on intermediate outcomes (knowledge, psychosocial mediators, self-efficacy and behaviour) and final outcomes (dyspnoea, quality of life, measured by the CRQ and CCQ, and patient experiences) were obtained from questionnaires and electronic registries. Results Implementation of the programme was associated with significant improvements in dyspnoea (p < 0.001) and patient experiences (p < 0.001). No significant improvement was found in mean quality of life scores. Improvements were found in several intermediate outcomes, including investment beliefs (p < 0.05), disease-specific knowledge (p < 0.01; p < 0.001) and medication compliance (p < 0.01). Overall, process improvement was established. The model showed associations between significantly improved intermediate outcomes and improvements in quality of life and dyspnoea. Conclusions The application of a theory-driven model enhances the design and evaluation of disease management programmes aimed at improving health outcomes. This study supports the notion that a theoretical approach strengthens the evaluation designs of complex interventions. 
Moreover, it provides prudent evidence that the implementation of COPD disease management programmes can positively influence outcomes of care. PMID:20346135

  4. Developing Cyberinfrastructure Tools and Services for Metadata Quality Evaluation

    NASA Astrophysics Data System (ADS)

    Mecum, B.; Gordon, S.; Habermann, T.; Jones, M. B.; Leinfelder, B.; Powers, L. A.; Slaughter, P.

    2016-12-01

    Metadata and data quality are at the core of reusable and reproducible science. While great progress has been made over the years, much of the metadata collected only addresses data discovery, covering concepts such as titles and keywords. Improving metadata beyond the discoverability plateau means documenting detailed concepts within the data such as sampling protocols, instrumentation used, and variables measured. Given that metadata commonly do not describe their data at this level, how might we improve the state of things? Giving scientists and data managers easy to use tools to evaluate metadata quality that utilize community-driven recommendations is the key to producing high-quality metadata. To achieve this goal, we created a set of cyberinfrastructure tools and services that integrate with existing metadata and data curation workflows which can be used to improve metadata and data quality across the sciences. These tools work across metadata dialects (e.g., ISO19115, FGDC, EML, etc.) and can be used to assess aspects of quality beyond what is internal to the metadata such as the congruence between the metadata and the data it describes. The system makes use of a user-friendly mechanism for expressing a suite of checks as code in popular data science programming languages such as Python and R. This reduces the burden on scientists and data managers to learn yet another language. We demonstrated these services and tools in three ways. First, we evaluated a large corpus of datasets in the DataONE federation of data repositories against a metadata recommendation modeled after existing recommendations such as the LTER best practices and the Attribute Convention for Dataset Discovery (ACDD). Second, we showed how this service can be used to display metadata and data quality information to data producers during the data submission and metadata creation process, and to data consumers through data catalog search and access tools. 
Third, we showed how the centrally deployed DataONE quality service can achieve major efficiency gains by allowing member repositories to customize and use recommendations that fit their specific needs without having to create de novo infrastructure at their site.
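
    Expressing metadata checks as code in a data-science language, as described above, might look like the following minimal Python sketch. The check names, thresholds, and example record are hypothetical, not checks from the DataONE service.

```python
def check_title(meta):
    """Discovery-level check: the title exists and is reasonably descriptive."""
    return len(meta.get("title", "").split()) >= 5

def check_attributes_described(meta):
    """Content-level check: every measured variable has a definition and a unit."""
    attrs = meta.get("attributes", [])
    return bool(attrs) and all(a.get("definition") and a.get("unit") for a in attrs)

def run_checks(meta, checks):
    """Apply a suite of checks to one metadata record and report pass/fail."""
    return {check.__name__: check(meta) for check in checks}

good = {
    "title": "Stream temperature at a hypothetical monitoring site, 2014-2016",
    "attributes": [{"name": "temp", "definition": "water temperature", "unit": "degC"}],
}
bad = {"title": "data"}  # vague title, no attribute documentation
report_good = run_checks(good, [check_title, check_attributes_described])
report_bad = run_checks(bad, [check_title, check_attributes_described])
```

    Because each check is an ordinary function, a repository can assemble its own suite against a community recommendation and surface the pass/fail report during submission or in catalog search results.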

  5. Secondary data analysis of large data sets in urology: successes and errors to avoid.

    PubMed

    Schlomer, Bruce J; Copp, Hillary L

    2014-03-01

Secondary data analysis is the use of data collected for research by someone other than the investigator. In the last several years there has been a dramatic increase in the number of these studies being published in urological journals and presented at urological meetings, especially those involving secondary data analysis of large administrative data sets. Along with this expansion, skepticism of secondary data analysis studies has increased among many urologists. In this narrative review we discuss the types of large data sets that are commonly used for secondary data analysis in urology, and discuss the advantages and disadvantages of secondary data analysis. A literature search was performed to identify urological secondary data analysis studies published since 2008 using commonly used large data sets, and examples of high quality studies published in high impact journals are given. We outline an approach for performing a successful hypothesis- or goal-driven secondary data analysis study and highlight common errors to avoid. More than 350 secondary data analysis studies using large data sets have been published on urological topics since 2008, with likely many more studies presented at meetings but never published. Studies that were neither hypothesis nor goal driven have likely constituted some of this work and probably contributed to the increased skepticism of this type of research. However, many high quality, hypothesis-driven studies addressing research questions that would have been difficult to conduct with other methods have been performed in the last few years. Secondary data analysis is a powerful tool that can address questions which could not be adequately studied by another method. Knowledge of the limitations of secondary data analysis and of the data sets used is critical for a successful study. There are also important errors to avoid when planning and performing a secondary data analysis study. 
Investigators and the urological community need to strive to use secondary data analysis of large data sets appropriately to produce high quality studies that hopefully lead to improved patient outcomes. Copyright © 2014 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.

  6. Feasibility and Efficacy of Nurse-Driven Acute Stroke Care.

    PubMed

    Mainali, Shraddha; Stutzman, Sonja; Sengupta, Samarpita; Dirickson, Amanda; Riise, Laura; Jones, Donald; Yang, Julian; Olson, DaiWai M

    2017-05-01

    Acute stroke care requires rapid assessment and intervention. Replacing traditional sequential algorithms in stroke care with parallel processing using telestroke consultation could be useful in the management of acute stroke patients. The purpose of this study was to assess the feasibility of a nurse-driven acute stroke protocol using a parallel processing model. This is a prospective, nonrandomized, feasibility study of a quality improvement initiative. Stroke team members had a 1-month training phase, and then the protocol was implemented for 6 months and data were collected on a "run-sheet." The primary outcome of this study was to determine if a nurse-driven acute stroke protocol is feasible and assists in decreasing door to needle (intravenous tissue plasminogen activator [IV-tPA]) times. Of the 153 stroke patients seen during the protocol implementation phase, 57 were designated as "level 1" (symptom onset <4.5 hours) strokes requiring acute stroke management. Among these strokes, 78% were nurse-driven, and 75% of the telestroke encounters were also nurse-driven. The average door to computerized tomography time was significantly reduced in nurse-driven codes (38.9 minutes versus 24.4 minutes; P < .04). The use of a nurse-driven protocol is feasible and effective. When used in conjunction with a telestroke specialist, it may be of value in improving patient outcomes by decreasing the time for door to decision for IV-tPA. Copyright © 2017 National Stroke Association. Published by Elsevier Inc. All rights reserved.

  7. Quality-Focused Management.

    ERIC Educational Resources Information Center

    Needham, Robbie Lee

    1993-01-01

    Presents the quality-focused management (QFM) system and explains the departure QFM makes from established community college management practices. Describes the system's self-directed teams engaged in a continuous improvement process driven by customer demand and long-term commitment to quality and cost control. (13 references.) (MAB)

  8. Epilepsy informatics and an ontology-driven infrastructure for large database research and patient care in epilepsy

    PubMed Central

    Sahoo, Satya S.; Zhang, Guo-Qiang; Lhatoo, Samden D.

    2013-01-01

    Summary The epilepsy community increasingly recognizes the need for a modern classification system that can also be easily integrated with effective informatics tools. The 2010 reports by the United States President's Council of Advisors on Science and Technology (PCAST) identified informatics as a critical resource to improve quality of patient care, drive clinical research, and reduce the cost of health services. An effective informatics infrastructure for epilepsy, underpinned by a formal knowledge model or ontology, can leverage an ever-increasing amount of multimodal data to (1) improve clinical decision support, (2) broaden access to information for patients and their families, (3) ease data sharing, and (4) accelerate secondary use of clinical data. Modeling the recommendations of the International League Against Epilepsy (ILAE) classification system in the form of an epilepsy domain ontology is essential for consistent use of terminology in a variety of applications, including electronic health record systems and clinical applications. In this review, we discuss the data management issues in epilepsy and explore the benefits of an ontology-driven informatics infrastructure and its role in adoption of a "data-driven" paradigm in epilepsy research. PMID:23647220

  9. Results-driven approach to improving quality and productivity

    Treesearch

    John Dramm

    2000-01-01

    Quality control (QC) programs often do not realize their full potential. Elaborate and expensive QC programs can easily get sidetracked by the process of building a program with promises of "Someday, this will all pay off." Training employees in QC methods is no guarantee that quality will improve. Several documented cases show that such activity-centered efforts...

  10. Are Improvements in Measured Performance Driven by Better Treatment or "Denominator Management"?

    PubMed

    Harris, Alex H S; Chen, Cheng; Rubinsky, Anna D; Hoggatt, Katherine J; Neuman, Matthew; Vanneman, Megan E

    2016-04-01

    Process measures of healthcare quality are usually formulated as the number of patients who receive evidence-based treatment (numerator) divided by the number of patients in the target population (denominator). When the systems being evaluated can influence which patients are included in the denominator, it is reasonable to wonder whether improvements in measured quality are driven by expanding numerators or contracting denominators. In 2003, the US Department of Veterans Affairs (VA) based executive compensation in part on performance on a substance use disorder (SUD) continuity-of-care quality measure. The first goal of this study was to evaluate whether implementing the measure in this way resulted in expected improvements in measured performance. The second goal was to examine whether the proportion of patients with SUD who qualified for the denominator contracted after the quality measure was implemented, and to describe the facility-level variation in and correlates of denominator contraction or expansion. Using 40 quarters of data straddling the implementation of the performance measure, we applied an interrupted time series design to evaluate changes in two outcomes for all veterans with an SUD diagnosis in all VA facilities from fiscal year 2000 to 2009. The two outcomes were 1) measured performance (patients retained/patients qualified) and 2) denominator prevalence (patients qualified/patients with SUD program contact). Measured performance improved over time (p < 0.001). Notably, the proportion of patients with SUD program contact who qualified for the denominator decreased more rapidly after the measure was implemented (p = 0.02). Facilities with higher pre-implementation denominator prevalence had steeper declines in denominator prevalence after implementation (p < 0.001).
These results should motivate the development of measures that are less vulnerable to denominator management, and also the exploration of "shadow measures" to monitor and reduce undesirable denominator management.
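
The interrupted time series logic above can be sketched as a before/after comparison of least-squares trends. A full segmented regression would fit level and trend changes jointly; the two-segment slope comparison below is a simplified illustration, and all numbers are synthetic, not the study's data.

```python
def ls_slope(ts, ys):
    """Closed-form least-squares slope of ys regressed on ts."""
    n = len(ts)
    mt = sum(ts) / n
    my = sum(ys) / n
    return (sum((t - mt) * (y - my) for t, y in zip(ts, ys))
            / sum((t - mt) ** 2 for t in ts))

# 40 quarters straddling implementation at quarter 20 (synthetic, noise-free):
# denominator prevalence drifts up slightly before, then declines afterward
quarters = list(range(40))
prevalence = [0.40 + 0.002 * q if q < 20
              else 0.40 + 0.002 * q - 0.004 * (q - 20)
              for q in quarters]

pre_slope = ls_slope(quarters[:20], prevalence[:20])
post_slope = ls_slope(quarters[20:], prevalence[20:])
trend_change = post_slope - pre_slope  # negative: steeper decline post-measure
```

A negative `trend_change`, as here, corresponds to the paper's finding that denominator prevalence fell more rapidly after the measure was implemented.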

  11. Predictors of Success for Community-Driven Water Quality Management--Lessons from Three Catchments in New Zealand

    ERIC Educational Resources Information Center

    Tyson, Ben; Unson, Christine; Edgar, Nick

    2017-01-01

    Three community engagement projects on the South Island of New Zealand are enacting education and communication initiatives to improve the uptake of best management practices on farms regarding nutrient management for improving water quality. Understanding the enablers and barriers to effective community-based catchment management is fundamental…

  12. WE-A-BRC-00: The Quality Gap

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    Quality and safety in healthcare are inextricably linked. There are compelling data that link poor quality radiation therapy to inferior patient survival. Radiation Oncology clinical trial protocol deviations often involve incorrect target volume delineation or dosing, akin to radiotherapy incidents which also often involve partial geometric miss or improper radiation dosing. When patients with radiation protocol variations are compared to those without significant protocol variations, clinical outcome is negatively impacted. Traditionally, quality assurance in radiation oncology has been driven largely by new technological advances, and safety improvement has been driven by reactive responses to past system failures and prescriptive mandatesmore » recommended by professional organizations and promulgated by regulators. Prescriptive approaches to quality and safety alone often do not address the huge variety of process and technique used in radiation oncology. Risk-based assessments of radiotherapy processes provide a mechanism to enhance quality and safety, both for new and for established techniques. It is imperative that we explore such a paradigm shift at this time, when expectations from patients as well as providers are rising while available resources are falling. There is much we can learn from our past experiences to be applied towards the new risk-based assessments. Learning Objectives: Understand the impact of clinical and technical quality on outcomes Understand the importance of quality care in radiation oncology Learn to assess the impact of quality on clinical outcomes D. Followill, NIH Grant CA180803.« less

  13. Better Patient Care At High-Quality Hospitals May Save Medicare Money And Bolster Episode-Based Payment Models.

    PubMed

    Tsai, Thomas C; Greaves, Felix; Zheng, Jie; Orav, E John; Zinner, Michael J; Jha, Ashish K

    2016-09-01

    US policy makers are making efforts to simultaneously improve the quality of and reduce spending on health care through alternative payment models such as bundled payment. Bundled payment models are predicated on the theory that aligning financial incentives for all providers across an episode of care will lower health care spending while improving quality. Whether this is true remains unknown. Using national Medicare fee-for-service claims for the period 2011-12 and data on hospital quality, we evaluated how thirty- and ninety-day episode-based spending were related to two validated measures of surgical quality-patient satisfaction and surgical mortality. We found that patients who had major surgery at high-quality hospitals cost Medicare less than those who had surgery at low-quality institutions, for both thirty- and ninety-day periods. The difference in Medicare spending between low- and high-quality hospitals was driven primarily by postacute care, which accounted for 59.5 percent of the difference in thirty-day episode spending, and readmissions, which accounted for 19.9 percent. These findings suggest that efforts to achieve value through bundled payment should focus on improving care at low-quality hospitals and reducing unnecessary use of postacute care. Project HOPE—The People-to-People Health Foundation, Inc.
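
The decomposition of the spending gap (postacute care 59.5 percent, readmissions 19.9 percent) is each component's share of the total low-minus-high difference. The sketch below uses invented component spending figures, not the study's data.

```python
def share_of_difference(low_q, high_q):
    """Each component's share of the total low-minus-high spending gap."""
    total = sum(low_q.values()) - sum(high_q.values())
    return {k: (low_q[k] - high_q[k]) / total for k in low_q}

# hypothetical 30-day episode spending per component (USD)
low_quality = {"index": 15000, "postacute": 6000, "readmission": 2500}
high_quality = {"index": 14800, "postacute": 4200, "readmission": 1900}

shares = share_of_difference(low_quality, high_quality)
# shares["postacute"] dominates the gap, mirroring the study's pattern
```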

  14. Low-Burden Strategies to Promote Smoking Cessation Treatment Among Patients With Serious Mental Illness.

    PubMed

    Chen, Li-Shiun; Baker, Timothy B; Korpecki, Jeanette M; Johnson, Kelly E; Hook, Jaime P; Brownson, Ross C; Bierut, Laura J

    2018-06-01

    Patients with serious mental illness have high smoking prevalence and early mortality. Inadequate implementation of evidence-based smoking cessation treatment in community mental health centers (CMHCs) contributes to this disparity. This column describes a study of the effects of quality improvement strategies on treatment and cessation outcomes among patients with serious mental illness at four CMHCs. Two low-burden strategies, decision support and academic detailing with data-driven feedback, were implemented in the CMHCs' clinics from 2014 to 2016. Pre- and postimplementation data from pharmacy and medical records were analyzed. The percentage of patients receiving cessation medication increased from 5% to 18% (p≤.001), and smoking prevalence decreased from 57% to 54% (p≤.001). This quality improvement approach holds great potential for increasing the level of smoking cessation care for patients treated in CMHC settings. Decision support and academic detailing with feedback may be effective strategies to promote best practices.

  15. Content-Based Multi-Channel Network Coding Algorithm in the Millimeter-Wave Sensor Network

    PubMed Central

    Lin, Kai; Wang, Di; Hu, Long

    2016-01-01

    With the development of wireless technology, the widespread adoption of 5G is an irreversible trend, and millimeter-wave sensor networks are becoming increasingly common. However, due to their high complexity and bandwidth bottlenecks, millimeter-wave sensor networks still face numerous problems. In this paper, we propose a novel content-based multi-channel network coding (CMNC) algorithm, which combines data fusion, multi-channel transmission, and network coding to improve data transmission. The CMNC algorithm provides a fusion-driven model based on Dempster-Shafer (D-S) evidence theory to classify the sensor nodes into different classes according to their data content. Using the result of the classification, the CMNC algorithm also provides a channel assignment strategy and uses network coding to further improve the quality of data transmission in the millimeter-wave sensor network. Extensive simulations are carried out and compared to other methods. Our simulation results show that the proposed CMNC algorithm can effectively improve the quality of data transmission and outperforms the compared methods. PMID:27376302
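
The D-S fusion step rests on Dempster's rule of combination. The sketch below implements the rule for two mass functions over a tiny frame of discernment; the frame and mass values are invented for illustration and are not taken from the paper.

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule: combine two mass functions given as
    {frozenset(hypotheses): mass}. Mass assigned to conflicting
    (disjoint) pairs is discarded and the rest renormalized."""
    raw = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            raw[inter] = raw.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    return {s: v / (1.0 - conflict) for s, v in raw.items()}

# two sensors reporting on a node's (hypothetical) data class
A, B = frozenset({"class_1"}), frozenset({"class_2"})
AB = A | B                       # "either" (ignorance)
m1 = {A: 0.6, AB: 0.4}           # sensor 1 leans toward class_1
m2 = {B: 0.5, AB: 0.5}           # sensor 2 leans toward class_2
fused = combine(m1, m2)
```

Here 0.3 of the product mass is conflicting ({class_1} vs {class_2}), so the surviving masses are renormalized by 0.7.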

  16. Reduction in pediatric identification band errors: a quality collaborative.

    PubMed

    Phillips, Shannon Connor; Saysana, Michele; Worley, Sarah; Hain, Paul D

    2012-06-01

    Accurate and consistent placement of a patient identification (ID) band is used in health care to reduce errors associated with patient misidentification. Multiple safety organizations have devoted time and energy to improving patient ID, but no multicenter improvement collaboratives have shown scalability of previously successful interventions. We aimed to halve the pediatric patient ID band error rate, defined as an absent, illegible, or inaccurate ID band, across a quality improvement learning collaborative of hospitals in 1 year. On the basis of a previously successful single-site intervention, we conducted a self-selected 6-site collaborative to reduce ID band errors in heterogeneous pediatric hospital settings. The collaborative had 3 phases: preparatory work and an employee survey of current practice and barriers, data collection (ID band failure rate), and intervention driven by data and collaborative learning to accelerate change. The collaborative audited 11,377 patients for ID band errors between September 2009 and September 2010. The ID band failure rate decreased from 17% to 4.1% (a 77% relative reduction). Interventions applied at all institutions included education of frontline staff on correct ID bands as a safety strategy; a change to softer ID bands, including "luggage tag" type ID bands for some patients; and partnering with families and patients through education. Over 13 months, a collaborative of pediatric institutions significantly reduced the ID band failure rate. This quality improvement learning collaborative demonstrates that safety improvements tested in a single institution can be disseminated to improve quality of care across large populations of children.

  17. Observational study using the tools of lean six sigma to improve the efficiency of the resident rounding process.

    PubMed

    Chand, David V

    2011-06-01

    Recent focus on resident work hours has challenged residency programs to modify their curricula to meet established duty hour restrictions and fulfill their mission to develop the next generation of clinicians. Simultaneously, health care systems strive to deliver efficient, high-quality care to patients and families. The primary goal of this observational study was to use a data-driven approach to eliminate examples of waste and variation identified in resident rounding using Lean Six Sigma methodology. A secondary goal was to improve the efficiency of the rounding process, as measured by the reduction in nonvalue-added time. We used the "DMAIC" methodology: define, measure, analyze, improve, and control. Pediatric and family medicine residents rotating on the pediatric hospitalist team participated in the observation phase. Residents, nurses, hospitalists, and parents of patients completed surveys to gauge their attitudes toward rounds. The Mann-Whitney test was used to test for differences in the median times measured during the preimprovement and postimprovement phases, and the Student t test was used for comparison of survey data. Collaborative, family-centered rounding with elimination of the "prerounding" process, as well as standard work instructions and pacing the process to meet customer demand (takt time), were implemented. Nonvalue-added time per patient was reduced by 64% (P  =  .005). Survey data suggested that team members preferred the collaborative, family-centered approach to the traditional model of rounding. Lean Six Sigma provides tools, a philosophy, and a structured, data-driven approach to address a problem. In our case this facilitated an effort to adhere to duty hour restrictions while promoting education and quality care. Such approaches will become increasingly useful as health care delivery and education continue to transform.
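
The Mann-Whitney comparison of pre- and post-improvement times can be sketched with a rank-based U statistic (statistic only; the p-value step is omitted). The rounding-time samples below are hypothetical, not the study's measurements.

```python
def mann_whitney_u(x, y):
    """Mann-Whitney U statistic for sample x versus sample y,
    using average ranks for ties."""
    pooled = list(x) + list(y)
    order = sorted(range(len(pooled)), key=lambda i: pooled[i])
    ranks = [0.0] * len(pooled)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and pooled[order[j + 1]] == pooled[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average 1-based rank for the tie block
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    r1 = sum(ranks[:len(x)])              # rank sum of sample x
    return r1 - len(x) * (len(x) + 1) / 2  # U statistic for x

# hypothetical nonvalue-added minutes per patient, before and after
pre = [38, 42, 45, 47, 51, 54, 58, 62]
post = [12, 15, 17, 19, 21, 24, 26, 29]
u = mann_whitney_u(pre, post)
```

With every pre-improvement time exceeding every post-improvement time, U equals its maximum, len(pre) * len(post) = 64, i.e. complete separation of the two samples.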

  19. A comprehensive review of the SLMTA literature part 2: Measuring success

    PubMed Central

    Yao, Katy; Nkengasong, John N.

    2014-01-01

    Background Since its introduction in 2009, the Strengthening Laboratory Management Toward Accreditation (SLMTA) programme has been implemented in 617 laboratories in 47 countries. Objective We completed a systematic review of the published literature on SLMTA. The review consists of two companion papers; this article examines quantitative evidence presented in the publications along with a meta-analysis of selected results. Methods We identified 28 published articles with data from SLMTA implementation. The SLMTA programme was evaluated through audits based on a standard checklist, which is divided into 12 sections corresponding to the 12 Quality System Essentials (QSEs). Several basic service delivery indicators reported by programmes were also examined. Results for various components of the programme were reviewed and summarised; a meta-analysis of QSE results grouped by the three stages of the quality cycle was conducted for 126 laboratories in 12 countries. Results Global programme data show improved quality in SLMTA laboratories in every country, with average improvements on audit scores of 25 percentage points. Meta-analysis identified Improvement Management as the weakest stage, with internal audit (8%) and occurrence management (16%) showing the lowest scores. Studies documented 19% – 95% reductions in turn-around times, 69% – 93% reductions in specimen rejection rates, 76% – 81% increases in clinician satisfaction rates, 67% – 85% improvements in external quality assessment results, 50% – 66% decreases in nonconformities and 67% increases in staff punctuality. Conclusions The wide array of results reported provides a comprehensive picture of the SLMTA programme overall, suggesting a substantive impact on provision of quality laboratory services and patient care. These comprehensive results establish a solid data-driven foundation for programme improvement and further expansion. PMID:29043201

  20. Using New Technologies for Time Diary Data Collection: Instrument Design and Data Quality Findings from a Mixed-Mode Pilot Survey.

    PubMed

    Chatzitheochari, Stella; Fisher, Kimberly; Gilbert, Emily; Calderwood, Lisa; Huskinson, Tom; Cleary, Andrew; Gershuny, Jonathan

    2018-01-01

    Recent years have witnessed a steady growth of time-use research, driven by the increased research and policy interest in population activity patterns and their associations with long-term outcomes. There is recent interest in moving beyond traditional paper-administered time diaries to use new technologies for data collection in order to reduce respondent burden and administration costs, and to improve data quality. This paper presents two novel diary instruments that were employed by a large-scale multi-disciplinary cohort study in order to obtain information on the time allocation of adolescents in the United Kingdom. A web-administered diary and a smartphone app were created, and a mixed-mode data collection approach was followed: cohort members were asked to choose between these two modes, and those who were unable or refused to use the web/app modes were offered a paper diary. Using data from a pilot survey of 86 participants, we examine diary data quality indicators across the three modes. Results suggest that the web and app modes yield an overall better time diary data quality than the paper mode, with a higher proportion of diaries with complete activity and contextual information. Results also show that the web and app modes yield a comparable number of activity episodes to the paper mode. These results suggest that the use of new technologies can improve diary data quality. Future research using larger samples should systematically investigate selection and measurement effects in mixed-mode time-use survey designs.

  1. A model to begin to use clinical outcomes in medical education.

    PubMed

    Haan, Constance K; Edwards, Fred H; Poole, Betty; Godley, Melissa; Genuardi, Frank J; Zenni, Elisa A

    2008-06-01

    The latest phase of the Accreditation Council for Graduate Medical Education (ACGME) Outcome Project challenges graduate medical education (GME) programs to select meaningful clinical quality indicators by which to measure trainee performance and progress, as well as to assess and improve educational effectiveness of programs. The authors describe efforts to measure educational quality, incorporating measurable patient-care outcomes to guide improvement. University of Florida College of Medicine-Jacksonville education leaders developed a tiered framework for selecting clinical indicators whose outcomes would illustrate integration of the ACGME competencies and their assessment with learning and clinical care. In order of preference, indicators selected should align with a specialty's (1) national benchmarked consensus standards, (2) national specialty society standards, (3) standards of local, institutional, or regional quality initiatives, or (4) top-priority diagnostic and/or therapeutic categories for the specialty, based on areas of high frequency, impact, or cost. All programs successfully applied the tiered process to clinical indicator selection and then identified data sources to track clinical outcomes. Using clinical outcomes in resident evaluation assesses the resident's performance as reflective of his or her participation in the health care delivery team. Programmatic improvements are driven by clinical outcomes that are shown to be below benchmark across the residents. Selecting appropriate clinical indicators, representative of quality of care and of graduate medical education, is the first step toward tracking educational outcomes using clinical data as the basis for evaluation and improvement. This effort is an important aspect of orienting trainees to using data for monitoring and improving care processes and outcomes throughout their careers.
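
The tiered framework reduces to "pick the most-preferred tier for which indicators actually exist." The sketch below encodes that preference order; the tier names and the example specialty's indicators are invented labels, not the authors' data.

```python
# preference order from the framework: (1) national benchmarked consensus,
# (2) national specialty society, (3) local/institutional/regional,
# (4) top-priority diagnostic/therapeutic categories
TIER_ORDER = [
    "national_benchmarked_consensus",
    "national_specialty_society",
    "local_or_regional_initiative",
    "top_priority_categories",
]

def select_indicators(available):
    """Return (tier, indicators) for the most-preferred non-empty tier."""
    for tier in TIER_ORDER:
        indicators = available.get(tier)
        if indicators:
            return tier, indicators
    raise ValueError("no clinical indicators available in any tier")

# hypothetical specialty lacking national benchmarked consensus standards
specialty = {
    "national_benchmarked_consensus": [],
    "national_specialty_society": ["30-day readmission",
                                   "surgical site infection"],
}
tier, chosen = select_indicators(specialty)
```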

  2. Electronic Records, Registries, and the Development of "Big Data": Crowd-Sourcing Quality toward Knowledge.

    PubMed

    Dewdney, Summer B; Lachance, Jason

    2016-01-01

    Despite many perceived advances in treatment over the past few decades, cancer continues to present a significant health burden, particularly to the aging US population. Forces including shrinking funding mechanisms, cost and quality concerns, and disappointing clinical outcomes have driven a surge of recent efforts to apply the technological innovation that has permeated other industries by leveraging large and complex data sets, so-called "big data." In this review, we survey the history of oncology data collection, beginning with the earliest data registries, and explore the future directions of this new brand of research, highlighting some of the more recent and promising efforts to harness the power of the electronic health record and the multitude of data co-located there, in an effort to improve individualized cancer-related outcomes in near real time.

  3. Reconstruction of dynamical systems from resampled point processes produced by neuron models

    NASA Astrophysics Data System (ADS)

    Pavlova, Olga N.; Pavlov, Alexey N.

    2018-04-01

    Characterization of the dynamical features of chaotic oscillations from point processes is based on embedding theorems for non-uniformly sampled signals such as sequences of interspike intervals (ISIs). This theoretical background confirms that attractors can be reconstructed from ISIs generated by chaotically driven neuron models. The quality of such a reconstruction depends on the length of the available dataset. We discuss how data resampling improves the reconstruction when only a short dataset is available, and show that this effect is observed for different types of spike-generation mechanisms.
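
Attractor reconstruction from an ISI sequence is standard time-delay embedding applied to the interval series. The sketch below builds the delay vectors; the ISI values, embedding dimension, and lag are arbitrary illustrative choices.

```python
def delay_embed(series, dim, tau):
    """Time-delay embedding: build vectors
    x_i = (s_i, s_{i+tau}, ..., s_{i+(dim-1)*tau})."""
    span = (dim - 1) * tau
    return [tuple(series[i + k * tau] for k in range(dim))
            for i in range(len(series) - span)]

# hypothetical interspike intervals (ms)
isi = [12.1, 8.4, 15.3, 9.9, 11.7, 14.2, 10.5, 13.0]
vectors = delay_embed(isi, dim=3, tau=2)
# each 3-dimensional vector is one reconstructed state-space point
```

A series of length N yields N - (dim - 1) * tau vectors, which is why reconstruction quality degrades for short datasets: fewer points sample the attractor.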

  4. A Collaborative Data Chat: Teaching Summative Assessment Data Use in Pre-Service Teacher Education

    ERIC Educational Resources Information Center

    Piro, Jody S.; Dunlap, Karen; Shutt, Tammy

    2014-01-01

    As the quality of educational outputs has been problematized, accountability systems have driven reform based upon summative assessment data. These policies impact the ways that educators use data within schools and subsequently, how teacher education programs may adjust their curricula to teach data-driven decision-making to inform instruction.…

  5. Evaluation of regional climate simulations for air quality modelling purposes

    NASA Astrophysics Data System (ADS)

    Menut, Laurent; Tripathi, Om P.; Colette, Augustin; Vautard, Robert; Flaounas, Emmanouil; Bessagnet, Bertrand

    2013-05-01

    In order to evaluate the future potential benefits of emission regulation on regional air quality, while taking into account the effects of climate change, off-line air quality projection simulations are driven using weather forcing taken from regional climate models. These regional models are themselves driven by simulations carried out using global climate models (GCMs) and economic scenarios. Uncertainties and biases in climate models introduce an additional "climate modeling" source of uncertainty that adds to all other types of uncertainty in air quality modeling for policy evaluation. In this article we evaluate the changes in air quality-related weather variables induced by replacing reanalysis forcing with GCM forcing in regional climate simulations. As an example we use GCM simulations carried out in the framework of the ERA-Interim programme and of the CMIP5 project using the Institut Pierre-Simon Laplace climate model (IPSLcm), driving regional simulations performed in the framework of the EURO-CORDEX programme. In summer, we found compensating deficiencies acting on photochemistry in GCM-driven weather: a positive bias in short-wave radiation, a negative bias in wind speed, too many stagnant episodes, and a negative temperature bias. In winter, air quality is mostly driven by dispersion, and we could not identify significant differences in either wind or planetary boundary layer height statistics between GCM-driven and reanalysis-driven regional simulations. However, precipitation appears largely overestimated in GCM-driven simulations, which could significantly affect the simulation of aerosol concentrations. The identification of these biases will help in interpreting the results of future air quality simulations using these data. Despite these biases, we conclude that the identified differences should not lead to major difficulties in using GCM-driven regional climate simulations for air quality projections.
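
Quantifying such deficiencies per weather variable comes down to simple bias statistics between the GCM-driven and reanalysis-driven series. The sketch below computes a mean bias; the radiation values are invented placeholders, not model output.

```python
def mean_bias(model, reference):
    """Mean bias of a model series against a reference series
    (positive = model overestimates)."""
    if len(model) != len(reference):
        raise ValueError("series must be the same length")
    return sum(m - r for m, r in zip(model, reference)) / len(model)

# hypothetical summer daily-mean short-wave radiation (W/m^2)
reanalysis_sw = [210.0, 225.0, 198.0, 240.0, 217.0]
gcm_sw = [228.0, 241.0, 215.0, 252.0, 231.0]

sw_bias = mean_bias(gcm_sw, reanalysis_sw)
# positive bias here mirrors the paper's summer short-wave overestimation
```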

  6. Data-Driven Robust M-LS-SVR-Based NARX Modeling for Estimation and Control of Molten Iron Quality Indices in Blast Furnace Ironmaking.

    PubMed

    Zhou, Ping; Guo, Dongwei; Wang, Hong; Chai, Tianyou

    2017-09-29

    Optimal operation of an industrial blast furnace (BF) ironmaking process largely depends on reliable measurement of molten iron quality (MIQ) indices, which is not feasible with conventional sensors. This paper proposes a novel data-driven robust modeling method for the online estimation and control of MIQ indices. First, a nonlinear autoregressive exogenous (NARX) model is constructed for the MIQ indices to fully capture the nonlinear dynamics of the BF process. Then, because standard least-squares support vector regression (LS-SVR) cannot directly cope with the multioutput problem, multitask transfer learning is used to design a novel multioutput LS-SVR (M-LS-SVR) for learning the NARX model. Furthermore, a novel M-estimator is proposed to reduce the interference of outliers and improve the robustness of the M-LS-SVR model. Since the weight function properly assigns weights to outlier data, their contributions to the model can be properly distinguished, yielding a robust modeling result. Finally, a novel multiobjective evaluation index of modeling performance is developed that jointly considers the root-mean-square error of modeling and the correlation coefficient of trend fitting; based on this index, the nondominated sorting genetic algorithm II is used to globally optimize the model parameters. Experiments on industrial data and industrial applications both illustrate that the proposed method can efficiently eliminate the adverse effects caused by fluctuations in BF process data, indicating stronger robustness and higher accuracy. Moreover, control testing shows that the developed model can be applied to realize data-driven control of the BF process.
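
The M-estimator's role is to down-weight outlier residuals so they contribute less to the fit. The abstract does not give the paper's exact weight function, so the classic Huber weight is used below purely as a stand-in for the idea.

```python
def huber_weight(residual, c=1.345):
    """Huber weight: 1 inside the threshold c, c/|r| outside,
    so large (outlier) residuals are progressively down-weighted."""
    r = abs(residual)
    return 1.0 if r <= c else c / r

# hypothetical standardized model residuals; the last two are outliers
residuals = [0.2, -0.8, 1.0, 5.0, -10.0]
weights = [huber_weight(r) for r in residuals]
# inlier residuals keep weight 1.0; the outliers get sharply reduced weights
```

In a weighted LS-SVR fit, these weights would scale each sample's contribution, which is what lets the robust model tolerate fluctuating BF process data.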

  7. Data-Driven Robust M-LS-SVR-Based NARX Modeling for Estimation and Control of Molten Iron Quality Indices in Blast Furnace Ironmaking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Ping; Guo, Dongwei; Wang, Hong

    Optimal operation of an industrial blast furnace (BF) ironmaking process largely depends on a reliable measurement of molten iron quality (MIQ) indices, which are not feasible using the conventional sensors. This paper proposes a novel data-driven robust modeling method for the online estimation and control of MIQ indices. First, a nonlinear autoregressive exogenous (NARX) model is constructed for the MIQ indices to completely capture the nonlinear dynamics of the BF process. Then, considering that the standard least-squares support vector regression (LS-SVR) cannot directly cope with the multioutput problem, a multitask transfer learning is proposed to design a novel multioutput LS-SVRmore » (M-LS-SVR) for the learning of the NARX model. Furthermore, a novel M-estimator is proposed to reduce the interference of outliers and improve the robustness of the M-LS-SVR model. Since the weights of different outlier data are properly given by the weight function, their corresponding contributions on modeling can properly be distinguished, thus a robust modeling result can be achieved. Finally, a novel multiobjective evaluation index on the modeling performance is developed by comprehensively considering the root-mean-square error of modeling and the correlation coefficient on trend fitting, based on which the nondominated sorting genetic algorithm II is used to globally optimize the model parameters. Both experiments using industrial data and industrial applications illustrate that the proposed method can eliminate the adverse effect caused by the fluctuation of data in BF process efficiently. In conclusion, this indicates its stronger robustness and higher accuracy. Moreover, control testing shows that the developed model can be well applied to realize data-driven control of the BF process.« less

  8. Data-Driven Robust M-LS-SVR-Based NARX Modeling for Estimation and Control of Molten Iron Quality Indices in Blast Furnace Ironmaking

    DOE PAGES

    Zhou, Ping; Guo, Dongwei; Wang, Hong; ...

    2017-09-29

    Optimal operation of an industrial blast furnace (BF) ironmaking process largely depends on reliable measurement of molten iron quality (MIQ) indices, which is not feasible using conventional sensors. This paper proposes a novel data-driven robust modeling method for the online estimation and control of MIQ indices. First, a nonlinear autoregressive exogenous (NARX) model is constructed for the MIQ indices to fully capture the nonlinear dynamics of the BF process. Then, considering that standard least-squares support vector regression (LS-SVR) cannot directly cope with the multioutput problem, multitask transfer learning is used to design a novel multioutput LS-SVR (M-LS-SVR) for learning the NARX model. Furthermore, a novel M-estimator is proposed to reduce the interference of outliers and improve the robustness of the M-LS-SVR model. Since the weights of outlier data are properly set by the weight function, their contributions to the model can be properly discounted, yielding a robust modeling result. Finally, a novel multiobjective evaluation index of modeling performance is developed that jointly considers the root-mean-square error of modeling and the correlation coefficient of trend fitting, based on which the nondominated sorting genetic algorithm II is used to globally optimize the model parameters. Experiments on industrial data and industrial applications both show that the proposed method efficiently eliminates the adverse effects caused by fluctuations in BF process data, indicating stronger robustness and higher accuracy. Moreover, control testing shows that the developed model can be applied to realize data-driven control of the BF process.

  9. Perceived Factors Associated with Sustained Improvement Following Participation in a Multicenter Quality Improvement Collaborative.

    PubMed

    Stone, Sohini; Lee, Henry C; Sharek, Paul J

    2016-07-01

    The California Perinatal Quality Care Collaborative led the Breastmilk Nutrition Quality Improvement Collaborative from October 2009 to September 2010 to increase the percentage of very low birth weight infants receiving breast milk at discharge in 11 collaborative neonatal ICUs (NICUs). Observed increases in breast milk feeding and decreases in necrotizing enterocolitis persisted for 6 months after the collaborative ended. Eighteen to 24 months after the end of the collaborative, some sites maintained or further increased their gains, while others trended back toward baseline. A study was conducted to assess the qualitative factors that affect sustained improvement following participation. Collaborative leaders at each of the 11 NICUs that participated in the Breastmilk Nutrition Quality Improvement Collaborative were invited to participate in a site-specific one-hour phone interview. Interviews were recorded and transcribed and then analyzed using qualitative research analysis software to identify themes associated with sustained improvement. Eight of 11 invited centers agreed to participate in the interviews. Thematic saturation was achieved by the sixth interview, so further interviews were not pursued. Factors contributing to sustainability included physician involvement within the multidisciplinary teams, continuous education, incorporation of interventions into the daily work flow, and integration of a data-driven feedback system. Early consideration by site leaders of how to integrate best-practice interventions into the daily work flow, and ensuring physician commitment and ongoing education based in continuous data review, should enhance the likelihood of sustaining improvements. To maximize sustained success, future collaborative design should consider proactively identifying and supporting these factors at participating sites.

  10. Performance improvement: one model to reduce length of stay.

    PubMed

    Chisari, E; Mele, J A

    1994-01-01

    Dedicated quality professionals are tired of quick fixes, Band-Aids, and other first-aid strategies that offer only temporary relief of nagging problems rather than a long-term cure. Implementing strategies that can produce permanent solutions to crucial problems is a challenge confronted by organizations striving for continuous performance improvement. One vehicle, driven by data and customer requirements, that can help to solve problems and sustain success over time is the storyboard. This article illustrates the use of the storyboard as the framework for reducing length of stay--one of the most important problems facing healthcare organizations today.

  11. QUEST®: A Data-Driven Collaboration to Improve Quality, Efficiency, Safety, and Transparency in Acute Care.

    PubMed

    Crimmins, Mary M; Lowe, Timothy J; Barrington, Monica; Kaylor, Courtney; Phipps, Terri; Le-Roy, Charlene; Brooks, Tammy; Jones, Mashekia; Martin, John

    2016-06-01

    In 2008 Premier (Premier, Inc., Charlotte, North Carolina) began its Quality, Efficiency, and Safety with Transparency (QUEST®) collaborative, which is an acute health care organization program focused on improving quality and reducing patient harm. Retrospective performance data for QUEST hospitals were used to establish trends from the third quarter (Q3; July–September) of 2006 through Q3 2015. The study population included past and present members of the QUEST collaborative (N = 356), with each participating hospital considered a member. The QUEST program engages with member hospitals through a routine-coaching structure, sprints, minicollaboratives, and face-to-face meetings. Cost and efficiency data showed reductions in adjusted cost per discharge for hospitals between Q3 2013 (mean, $8,296; median, $8,459) and Q3 2015 (mean, $8,217; median, $7,895). Evidence-based care (EBC) measures showed improvement from baseline (Q3 2006; mean, 77%; median, 79%) to Q3 2015 (mean, 95%; median, 96%). Observed-to-expected (O/E) mortality improved from 1% to 22% better-than-expected outcomes on average. The QUEST safety harm composite score showed moderate reduction from Q1 2009 to Q3 2015, as did the O/E readmission rates--from Q1 2010 to Q3 2015--with improvement from a 5% to an 8% better-than-expected score. Quantitative and qualitative evaluation of QUEST collaborative hospitals indicated that for the 2006-2015 period, QUEST facilities reduced cost per discharge, improved adherence with evidence-based practice, reduced safety harm composite score, improved patient experience, and reduced unplanned readmissions.
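    The observed-to-expected (O/E) ratios reported above follow directly from risk-adjusted event counts; a minimal sketch with invented counts (not QUEST data):

```python
def oe_ratio(observed, expected):
    # O/E below 1.0 means fewer events than the risk model expected
    return observed / expected

def pct_better_than_expected(observed, expected):
    # e.g. O/E = 0.78 corresponds to 22% better-than-expected outcomes
    return (1.0 - oe_ratio(observed, expected)) * 100.0

# Hypothetical hospital-quarter: 78 observed deaths vs. 100 expected
print(round(pct_better_than_expected(78, 100), 1))  # 22.0
```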

  12. Towards Customer-Driven Management in Hospitality Education: A Case Study of the Higher Hotel Institute, Cyprus.

    ERIC Educational Resources Information Center

    Varnavas, Andreas P.; Soteriou, Andreas C.

    2002-01-01

    Presents and discusses the approach used by the Higher Hotel Institute in Cyprus to incorporate total quality management through establishment of a customer-driven management culture in its hospitality education program. Discusses how it collects and uses service-quality related data from future employers, staff, and students in pursuing this…

  13. Using aircraft and satellite observations to improve regulatory air quality models

    NASA Astrophysics Data System (ADS)

    Canty, T. P.; Vinciguerra, T.; Anderson, D. C.; Carpenter, S. F.; Goldberg, D. L.; Hembeck, L.; Montgomery, L.; Liu, X.; Salawitch, R. J.; Dickerson, R. R.

    2014-12-01

    Federal and state agencies rely on EPA approved models to develop attainment strategies that will bring states into compliance with the National Ambient Air Quality Standards (NAAQS). We will describe modifications to the Community Multi-Scale Air Quality (CMAQ) model and Comprehensive Air Quality Model with Extensions (CAMx) frameworks motivated by analysis of NASA satellite and aircraft measurements. Observations of tropospheric column NO2 from OMI have already led to the identification of an important deficiency in the chemical mechanisms used by models; data collected during the DISCOVER-AQ field campaign have been instrumental in devising an improved representation of the chemistry of nitrogen species. Our recent work has focused on the use of: OMI observations of tropospheric O3 to assess and improve the representation of boundary conditions used by AQ models, OMI NO2 to derive a top-down NOx emission inventory for commercial shipping vessels that affect air quality in the Eastern U.S., and OMI HCHO to assess the C5H8 emission inventories provided by biogenic emissions models. We will describe how these OMI-driven model improvements are being incorporated into the State Implementation Plans (SIPs) being prepared for submission to EPA in summer 2015 and how future modeling efforts may be impacted by our findings.

  14. Behavior analysis: the science of training.

    PubMed

    Farhoody, Parvene

    2012-09-01

    Behavior analysis is a data-driven science dedicated to understanding the mechanisms of behavior. Applied behavior analysis is a branch of this scientific field that systematically applies scientific principles to real-world problems in an effort to improve quality of life. The use of the behavioral technology provides a way to teach human and nonhuman animals more effectively and efficiently and offers those using this technology increased success in achieving behavioral goals. Copyright © 2012 Elsevier Inc. All rights reserved.

  15. Agriculture-driven deforestation in the tropics from 1990-2015: emissions, trends and uncertainties

    NASA Astrophysics Data System (ADS)

    Carter, Sarah; Herold, Martin; Avitabile, Valerio; de Bruin, Sytze; De Sy, Veronique; Kooistra, Lammert; Rufino, Mariana C.

    2018-01-01

    Limited data exist on emissions from agriculture-driven deforestation, and the available data are typically uncertain. In this paper, we provide comparable estimates of emissions from both all deforestation and agriculture-driven deforestation, with uncertainties, for 91 countries across the tropics between 1990 and 2015. Uncertainties associated with the input datasets (activity data and emission factors) were used to combine the datasets, such that the most certain datasets contribute the most. This method utilizes all the input data while minimizing the uncertainty of the emissions estimate. The uncertainty of the input datasets was influenced by the quality of the data, the sample size (for sample-based datasets), and the extent to which the timeframe of the data matches the period of interest. The area of deforestation and the agriculture-driver factor (the extent to which agriculture drives deforestation) were the most uncertain components of the emissions estimates, so improvements in these estimates would provide the greatest reductions in the uncertainty of the emissions estimates. Over the period of the study, Latin America had the highest proportion of deforestation driven by agriculture (78%) and Africa the lowest (62%). Latin America had the highest emissions from agriculture-driven deforestation, which peaked at 974 ± 148 Mt CO2 yr-1 in 2000-2005. Africa saw a continuous increase in emissions between 1990 and 2015 (from 154 ± 21 to 412 ± 75 Mt CO2 yr-1), so mitigation initiatives could be prioritized there. Uncertainties for emissions from agriculture-driven deforestation are ± 62.4% (average over 1990-2015); uncertainties were highest in Asia and lowest in Latin America. Uncertainty information is crucial for transparency when reporting and gives credibility to related mitigation initiatives. We demonstrate that uncertainty data can also be useful when combining multiple open datasets, so we recommend that new data providers include this information.
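    The combination rule described above, where the most certain datasets contribute the most, is standard inverse-variance weighting; a minimal sketch with made-up emissions numbers (not the paper's data):

```python
import math

def combine(estimates, sigmas):
    """Inverse-variance weighted mean: lower-uncertainty inputs get more weight."""
    weights = [1.0 / s ** 2 for s in sigmas]
    total = sum(weights)
    mean = sum(w * x for w, x in zip(weights, estimates)) / total
    sigma = math.sqrt(1.0 / total)  # combined uncertainty is below the best input's
    return mean, sigma

# Two hypothetical estimates of emissions from agriculture-driven
# deforestation (Mt CO2/yr), one twice as uncertain as the other
mean, sigma = combine([900.0, 1000.0], [50.0, 100.0])
print(round(mean, 1), round(sigma, 1))  # 920.0 44.7
```

    Note how the combined mean (920) sits closer to the 900 estimate, which carried the smaller uncertainty.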

  16. Determination of significance in Ecological Impact Assessment: Past change, current practice and future improvements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Briggs, Sam; Hudson, Malcolm D., E-mail: mdh@soton.ac.uk

    2013-01-15

    Ecological Impact Assessment (EcIA) is an important tool for conservation and achieving sustainable development. 'Significant' impacts are those which disturb or alter the environment to a measurable degree. Significance is a crucial part of EcIA, and our understanding of the concept in practice is vital if it is to be effective as a tool. This study employed three methods to assess how the determination of significance has changed through time, what current practice is, and what would lead to future improvements. Three data streams were collected: interviews with expert stakeholders, a review of 30 Environmental Statements, and a broad-scale survey of the United Kingdom Institute of Ecology and Environmental Management (IEEM) members. The approach taken in the determination of significance has become more standardised, and subjectivity has been constrained through a transparent framework. This has largely been driven by a set of guidelines produced by IEEM in 2006. The significance of impacts is now more clearly justified and the accuracy with which it is determined has improved. However, there are limits to the accuracy and effectiveness of the determination of significance: the quality of baseline survey data, our scientific understanding of ecological processes, and the lack of monitoring and feedback of results. These in turn are restricted by the limited resources available in consultancies. The most notable recommendations for future practice are the implementation of monitoring and the publication of feedback, the creation of a central database for baseline survey data, and the streamlining of guidance. Highlights: the assessment of significance has changed markedly through time; the IEEM guidelines have driven a standardisation of practice; practice is currently limited by the quality of baseline data and scientific understanding; monitoring and feedback would rapidly improve practice.

  17. The MDE Diploma: First International Postgraduate Specialization in Model-Driven Engineering

    ERIC Educational Resources Information Center

    Cabot, Jordi; Tisi, Massimo

    2011-01-01

    Model-Driven Engineering (MDE) is changing the way we build, operate, and maintain our software-intensive systems. Several projects using MDE practices are reporting significant improvements in quality and performance but, to be able to handle these projects, software engineers need a set of technical and interpersonal skills that are currently…

  18. Data-Driven Robust RVFLNs Modeling of a Blast Furnace Iron-Making Process Using Cauchy Distribution Weighted M-Estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Ping; Lv, Youbin; Wang, Hong

    Optimal operation of a practical blast furnace (BF) ironmaking process depends largely on a good measurement of molten iron quality (MIQ) indices. However, measuring the MIQ online is not feasible using the available techniques. In this paper, a novel data-driven robust modeling approach is proposed for online estimation of MIQ using improved random vector functional-link networks (RVFLNs). Since the output weights of traditional RVFLNs are obtained by the least squares approach, a robustness problem may occur when the training dataset is contaminated with outliers, which affects the modeling accuracy of RVFLNs. To solve this problem, a Cauchy distribution weighted M-estimation based robust RVFLNs model is proposed. Since the weights of outlier data are properly determined by the Cauchy distribution, their corresponding contributions to the model can be properly discounted, achieving robust and more accurate modeling results. Moreover, given that the BF is a complex nonlinear system with numerous coupled variables, data-driven canonical correlation analysis is employed to identify the most influential components among the multitude of factors that affect the MIQ indices, thereby reducing the model dimension. Finally, experiments on industrial data and comparative studies demonstrate that the obtained model achieves better modeling and estimation accuracy and stronger robustness than other modeling methods.
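    The Cauchy-weighted M-estimation idea, downweighting each sample by w = 1 / (1 + (r/c)^2) where r is its residual, can be sketched as an iteratively reweighted least-squares line fit. This is a toy illustration, not the paper's RVFLNs pipeline; the data and the tuning constant c are assumptions:

```python
def fit_line_irls(xs, ys, c=2.385, iters=20):
    """Robust slope/intercept via IRLS with Cauchy weights w = 1/(1 + (r/c)^2)."""
    w = [1.0] * len(xs)
    a, b = 0.0, 0.0  # slope, intercept
    for _ in range(iters):
        # Weighted least-squares normal equations for y = a*x + b
        sw = sum(w)
        sx = sum(wi * x for wi, x in zip(w, xs))
        sy = sum(wi * y for wi, y in zip(w, ys))
        sxx = sum(wi * x * x for wi, x in zip(w, xs))
        sxy = sum(wi * x * y for wi, x, y in zip(w, xs, ys))
        a = (sw * sxy - sx * sy) / (sw * sxx - sx * sx)
        b = (sy - a * sx) / sw
        # Re-derive Cauchy weights from the current residuals
        w = [1.0 / (1.0 + ((y - (a * x + b)) / c) ** 2) for x, y in zip(xs, ys)]
    return a, b

# y = 2x with one gross outlier at x = 3
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.0, 2.0, 4.0, 60.0, 8.0]
a, b = fit_line_irls(xs, ys)
```

    An ordinary least-squares fit of these points gives a slope of 7.4, dragged up by the outlier; the Cauchy-weighted fit converges near the true slope of 2 because the outlier's weight collapses toward zero.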

  19. A data-driven approach to quality risk management.

    PubMed

    Alemayehu, Demissie; Alvir, Jose; Levenstein, Marcia; Nickerson, David

    2013-10-01

    An effective clinical trial strategy to ensure patient safety as well as trial quality and efficiency involves an integrated approach, including prospective identification of risk factors, mitigation of the risks through proper study design and execution, and assessment of quality metrics in real time. Such an integrated quality management plan may also be enhanced by using data-driven techniques to identify the risk factors that are most relevant in predicting quality issues associated with a trial. In this paper, we illustrate such an approach using data collected from actual clinical trials. Several statistical methods were employed, including the Wilcoxon rank-sum test and logistic regression, to identify associations between risk factors and the occurrence of quality issues, applied to data on the quality of clinical trials sponsored by Pfizer. Only a subset of the risk factors had a significant association with quality issues: whether the study used a placebo, whether the agent was a biologic, unusual packaging labels, complex dosing, and more than 25 planned procedures. Proper implementation of the strategy can help to optimize resource utilization without compromising trial integrity and patient safety.
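    The Wilcoxon rank-sum screening step can be illustrated with a pure-Python sketch, using the normal approximation and hypothetical trial data (this is not Pfizer's actual analysis, and tie correction is omitted):

```python
import math

def rank_sum_z(with_issues, without_issues):
    """Wilcoxon rank-sum z statistic via the normal approximation.

    Sketch only: assumes no tied values, so average-rank tie handling is omitted.
    """
    combined = sorted([(v, True) for v in with_issues] +
                      [(v, False) for v in without_issues])
    # Rank sum of the "with quality issues" group (ranks start at 1)
    r1 = sum(rank for rank, (_, flagged) in enumerate(combined, start=1) if flagged)
    n1, n2 = len(with_issues), len(without_issues)
    mu = n1 * (n1 + n2 + 1) / 2.0                      # H0 mean of the rank sum
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)  # H0 standard deviation
    return (r1 - mu) / sigma

# Hypothetical risk factor: number of planned procedures per trial
issues = [30, 32, 28, 35, 31]  # trials that had quality issues
clean = [20, 22, 19, 24, 21]   # trials that did not
z = rank_sum_z(issues, clean)
print(round(z, 2))  # 2.61 -> |z| > 1.96, so the factor is flagged at the 0.05 level
```

    In practice one would use a library routine (e.g. SciPy's rank-sum test) plus logistic regression across many candidate factors, as the abstract describes.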

  20. Reality check of socio-hydrological interactions in water quality and ecosystem management

    NASA Astrophysics Data System (ADS)

    Destouni, Georgia; Fischer, Ida; Prieto, Carmen

    2017-04-01

    Socio-hydrological interactions in water management for improving water quality and ecosystem status include as key components both (i) the societal measures taken for mitigation and control, and (ii) the societal characterization and monitoring efforts made for choosing management targets and checking the effects of measures taken to reach the targets. This study investigates such monitoring, characterization and management efforts and effects over the first six-year management cycle of the EU Water Framework Directive (WFD). The investigation uses Sweden and the WFD-regulated management of its stream and lake waters as a concrete quantification example, with focus on the nutrient and eutrophication conditions that determine the most prominent water quality and ecosystem problems in need of mitigation in the Swedish waters. The case results show a relatively small available monitoring base for determination of these nutrient and eutrophication conditions, even though they constitute key parts in the overall WFD-based approach to classification and management of ecosystem status. Specifically, actual nutrient monitoring exists in only around 1% (down to 0.2% for nutrient loads) of the Swedish stream and lake water bodies; modeling is used to fill the gaps for the remaining unmonitored fraction of classified and managed waters. The available data show that the hydro-climatically driven stream water discharge is a primary explanatory variable for the resulting societal classification of ecosystem status in Swedish waters; this may be due to the discharge magnitude being dominant in determining nutrient loading to these waters. At any rate, with such a hydro-climatically related, rather than human-pressure related, determinant of the societal ecosystem-status classification, the main human-driven causes and effects of eutrophication may not be appropriately identified, and the measures taken for mitigating these may not be well chosen. 
The available monitoring data from Swedish waters support this hypothesis, by showing that the first WFD management cycle 2009-2015 has led to only slight changes in measured nutrient concentrations, with moderate-to-bad status waters mostly undergoing concentration increases. These management results are in direct contrast to the WFD management goals that ecosystem status in all member-state waters must be improved to at least good level, and in any case not be allowed to further deteriorate. In general, the present results show that societal approaches to ecosystem status classification, monitoring and improvement may need a focus shift for improved identification and quantification of the human-driven components of nutrient inputs, concentrations and loads in water environments. Dominant hydro-climatic change drivers and effects must of course also be understood and accounted for. However, adaptation to hydro-climatic changes should be additional to and aligned with, rather than instead of, necessary mitigation of human-driven eutrophication. The present case results call for further science-based testing and evidence of societal water quality and ecosystem management actually targeting and following up the potential achievement of such mitigation.

  1. Achieving quality of service in IP networks

    NASA Astrophysics Data System (ADS)

    Hays, Tim

    2001-07-01

    The Internet Protocol (IP) has served global networks well, providing a standardized method to transmit data among many disparate systems. But IP is designed for simplicity, and only enables a 'best effort' service that can be subject to delays and loss of data. For data networks, this is an acceptable trade-off. In the emerging world of convergence, driven by new applications such as video streaming and IP telephony, minimizing latency, packet loss, and jitter can be critical. Simply increasing the size of the IP network 'pipe' to meet those demands is not always sufficient. In this environment, vendors and standards bodies are endeavoring to create technologies and techniques that enable IP to improve the quality of service it can provide, while retaining the characteristics that have enabled it to become the dominant networking protocol.

  2. Evidence of Impact of High Quality Principal Training in Illinois. Proposal Submitted to Governor Elect Rauner Education Transition Team on Behalf of the Center for the Study of Education Policy at Illinois State University

    ERIC Educational Resources Information Center

    Center for the Study of Education Policy, 2011

    2011-01-01

    Driven by research that evidences the direct impact of principals on school-wide improvements, Illinois has been working at the forefront of innovation and improvement in principal quality for quite some time. A large body of research supports the impact of school leadership on school and student outcomes. While high quality instruction is…

  3. Implementation of a Nurse Driven Pathway to Reduce Incidence of Hospital Acquired Pressure Injuries in the Pediatric Intensive Care Setting.

    PubMed

    Rowe, Angela D; McCarty, Karen; Huett, Amy

    2018-03-13

    A large, freestanding pediatric hospital in the southern United States saw a 117% increase in reported hospital acquired pressure injuries (HAPI) between 2013 and 2015, with the intensive care units being the units of highest occurrence. Design and Methods: A quality improvement project was designed and implemented to assist with pressure injury prevention. A literature review confirmed that pediatric HAPIs are a challenge and that bundles and user-friendly guidelines/pathways can help eliminate barriers to prevention. This quality improvement project had three aims: first, to reduce HAPI incidence in the PICU by 10%; second, to increase consistent usage of pressure injury prevention strategies, as evidenced by a 10% increase in pressure injury bundle compliance; and third, to identify whether the percentage of interventions implemented differed between two groups of patients. Donabedian's model of Structure, Process, and Outcomes guided the development and implementation of this quality improvement project. Interventions focused on risk assessment subscale scores have the opportunity to mitigate specific risk factors and improve pressure injury prevention. Through implementation of the nurse driven pathway there was a 57% decrease in reported HAPIs in the PICU as well as a 66% increase in pressure ulcer prevention bundle compliance. Implementation of the nurse driven pressure injury prevention pathway was successful: there was a significant increase in bundle compliance for pressure ulcer prevention and a decrease in reported HAPIs. The pathway developed and implemented for this quality improvement project could be adapted to other populations and care settings to provide guidance across the continuum. Copyright © 2018 Elsevier Inc. All rights reserved.

  4. Have Nursing Home Compare quality measure scores changed over time in response to competition?

    PubMed

    Castle, Nicholas G; Engberg, John; Liu, Darren

    2007-06-01

    Currently, the Centers for Medicare and Medicaid Services report on 15 Quality Measures (QMs) on the Nursing Home Compare (NHC) website. It is assumed that nursing homes are able to make improvements on these QMs, and that in doing so they will attract more residents. In this investigation, we examine changes in QM scores, and whether competition and/or excess demand have influenced these change scores over a period of 1 year. Data come from NHC and the Online Survey, Certification, and Reporting (OSCAR) system. QM change scores are calculated using values from January 2003 to January 2004. A series of regression analyses is used to examine the association of competition and excess demand with QM scores. Eight QMs show an average decrease in scores (i.e., better quality) and six QMs show an average increase in scores (i.e., worse quality). However, for 13 of the 14 QMs these changes averaged less than 1%. The regression analyses show an association between higher competition and improving QM scores, and an association between lower occupancy and improving QM scores. As would be predicted by the market-driven mechanism underlying quality improvement via report cards, we show that improvements in QM scores are more likely in the most competitive markets and in those with the lowest average occupancy rates.

  5. Optimizing Facility Configurations and Operating Conditions for Improved Performance in the NASA Ames 24 Inch Shock Tube

    NASA Technical Reports Server (NTRS)

    Bogdanoff, David W.; Cruden, Brett A.

    2016-01-01

    The Ames Electric Arc Shock Tube (EAST) is a shock tube wherein the driver gas can be heated by an electric arc discharge. The electrical energy is stored in a 1.2 MJ capacitor bank. Four inch and 24 inch diameter driven tubes are available. The facility is described and the need for testing in the 24 inch tube to better simulate low density NASA mission profiles is discussed. Three test entries, 53, 53B and 59, are discussed. Tests are done with air or Mars gas (95.7% CO2/2.7% N2/1.6% Ar) at pressures of 0.01 to 0.14 Torr. Velocities spanned 6.3-9.2 km/s, with a nominal center of 7 km/s. Many facility configurations are studied in an effort to improve data quality. Various driver and driven tube configurations and the use of a buffer section between the driver and the driven tube are studied. Diagnostics include test times, time histories of the shock light pulses and tilts of the shock wave off the plane normal to the tube axis. The report will detail the results of the various trials, give the best configuration/operating conditions found to date and provide recommendations for further improvements. Finally, diaphragm performance is discussed.

  6. Key Performance Indicators in Radiology: You Can't Manage What You Can't Measure.

    PubMed

    Harvey, H Benjamin; Hassanzadeh, Elmira; Aran, Shima; Rosenthal, Daniel I; Thrall, James H; Abujudeh, Hani H

    2016-01-01

    Quality assurance (QA) is a fundamental component of every successful radiology operation. A radiology QA program must be able to efficiently and effectively monitor and respond to quality problems. However, as radiology QA has expanded into the depths of radiology operations, the task of defining and measuring quality has become more difficult. Key performance indicators (KPIs) are highly valuable data points and measurement tools that can be used to monitor and evaluate the quality of services provided by a radiology operation. As such, KPIs empower a radiology QA program to bridge normative understandings of health care quality with on-the-ground quality management. This review introduces the importance of KPIs in health care QA, a framework for structuring KPIs, a method to identify and tailor KPIs, and strategies to analyze and communicate KPI data that would drive process improvement. Adopting a KPI-driven QA program is both good for patient care and allows a radiology operation to demonstrate measurable value to other health care stakeholders. Copyright © 2015 Mosby, Inc. All rights reserved.

  7. WE-A-BRC-01: Introduction to the Certificate Course

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palta, J.

    Quality and safety in healthcare are inextricably linked. There are compelling data that link poor quality radiation therapy to inferior patient survival. Radiation Oncology clinical trial protocol deviations often involve incorrect target volume delineation or dosing, akin to radiotherapy incidents which also often involve partial geometric miss or improper radiation dosing. When patients with radiation protocol variations are compared to those without significant protocol variations, clinical outcome is negatively impacted. Traditionally, quality assurance in radiation oncology has been driven largely by new technological advances, and safety improvement has been driven by reactive responses to past system failures and prescriptive mandates recommended by professional organizations and promulgated by regulators. Prescriptive approaches to quality and safety alone often do not address the huge variety of processes and techniques used in radiation oncology. Risk-based assessments of radiotherapy processes provide a mechanism to enhance quality and safety, both for new and for established techniques. It is imperative that we explore such a paradigm shift at this time, when expectations from patients as well as providers are rising while available resources are falling. There is much we can learn from our past experiences to be applied towards the new risk-based assessments. Learning Objectives: (1) understand the impact of clinical and technical quality on outcomes; (2) understand the importance of quality care in radiation oncology; (3) learn to assess the impact of quality on clinical outcomes. D. Followill, NIH Grant CA180803.

  8. WE-A-BRC-03: Lessons Learned: IROC Audits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Followill, D.

    Quality and safety in healthcare are inextricably linked. There are compelling data that link poor quality radiation therapy to inferior patient survival. Radiation Oncology clinical trial protocol deviations often involve incorrect target volume delineation or dosing, akin to radiotherapy incidents which also often involve partial geometric miss or improper radiation dosing. When patients with radiation protocol variations are compared to those without significant protocol variations, clinical outcome is negatively impacted. Traditionally, quality assurance in radiation oncology has been driven largely by new technological advances, and safety improvement has been driven by reactive responses to past system failures and prescriptive mandates recommended by professional organizations and promulgated by regulators. Prescriptive approaches to quality and safety alone often do not address the huge variety of processes and techniques used in radiation oncology. Risk-based assessments of radiotherapy processes provide a mechanism to enhance quality and safety, both for new and for established techniques. It is imperative that we explore such a paradigm shift at this time, when expectations from patients as well as providers are rising while available resources are falling. There is much we can learn from our past experiences to be applied towards the new risk-based assessments. Learning Objectives: (1) understand the impact of clinical and technical quality on outcomes; (2) understand the importance of quality care in radiation oncology; (3) learn to assess the impact of quality on clinical outcomes. D. Followill, NIH Grant CA180803.

  9. WE-A-BRC-02: Lessons Learned: Clinical Trials and Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, S.

    Quality and safety in healthcare are inextricably linked. There are compelling data linking poor-quality radiation therapy to inferior patient survival. Radiation oncology clinical trial protocol deviations often involve incorrect target volume delineation or dosing, akin to radiotherapy incidents, which also often involve partial geometric miss or improper radiation dosing. When patients with radiation protocol variations are compared to those without significant protocol variations, clinical outcome is negatively impacted. Traditionally, quality assurance in radiation oncology has been driven largely by new technological advances, and safety improvement has been driven by reactive responses to past system failures and by prescriptive mandates recommended by professional organizations and promulgated by regulators. Prescriptive approaches to quality and safety alone often do not address the huge variety of processes and techniques used in radiation oncology. Risk-based assessments of radiotherapy processes provide a mechanism to enhance quality and safety, both for new and for established techniques. It is imperative that we explore such a paradigm shift at this time, when expectations from patients as well as providers are rising while available resources are falling. There is much we can learn from our past experiences to apply towards the new risk-based assessments. Learning Objectives: (1) understand the impact of clinical and technical quality on outcomes; (2) understand the importance of quality care in radiation oncology; (3) learn to assess the impact of quality on clinical outcomes. Funding: D. Followill, NIH Grant CA180803.

  10. OCEAN-PC and a distributed network for ocean data

    NASA Technical Reports Server (NTRS)

    Mclain, Douglas R.

    1992-01-01

    The Intergovernmental Oceanographic Commission (IOC) wishes to develop an integrated software package for oceanographic data entry and access in developing countries. The software, called 'OCEAN-PC', would run on low-cost PC microcomputers and would encourage and standardize: (1) entry of local ocean observations; (2) quality control of the local data; (3) merging local data with historical data; (4) improved display and analysis of the merged data; and (5) international data exchange. OCEAN-PC will link existing MS-DOS oceanographic programs and data sets with table-driven format conversions. Since many ocean data sets are now being distributed on optical discs (Compact Disc Read-Only Memory, CD-ROM; Mass et al. 1987), OCEAN-PC will emphasize access to CD-ROMs.

  11. Improving prehospital trauma care in Rwanda through continuous quality improvement: an interrupted time series analysis.

    PubMed

    Scott, John W; Nyinawankusi, Jeanne D'Arc; Enumah, Samuel; Maine, Rebecca; Uwitonze, Eric; Hu, Yihan; Kabagema, Ignace; Byiringiro, Jean Claude; Riviello, Robert; Jayaraman, Sudha

    2017-07-01

    Injury is a major cause of premature death and disability in East Africa, and high-quality pre-hospital care is essential for optimal trauma outcomes. The Rwandan pre-hospital emergency care service (SAMU) uses an electronic database to evaluate and optimize pre-hospital care through a continuous quality improvement programme (CQIP), begun in March 2014. The SAMU database was used to assess pre-hospital quality metrics including supplementary oxygen for hypoxia (O2), intravenous fluids for hypotension (IVF), cervical collar placement for head injuries (c-collar), and either splinting (splint) or administration of pain medications (pain) for long bone fractures. Targets of >90% were set for each metric, and daily team meetings and monthly feedback sessions were implemented to address opportunities for improvement. These five pre-hospital quality metrics were assessed monthly before and after implementation of the CQIP. Met and unmet needs for O2, IVF, and c-collar were combined into a summative monthly SAMU Trauma Quality Score (STQ score). An interrupted time series linear regression model compared the STQ score during the 14 months before CQIP implementation to the first 14 months after. During the 29-month study period, 3,822 patients met study criteria, and 1,028 of them needed one or more of the five studied interventions. All five endpoints showed a significant increase between the pre-CQIP and post-CQIP periods (p<0.05 for all), and all five achieved a post-CQIP average of at least 90% completion. The monthly composite STQ scores ranged from 76.5 to 97.9 pre-CQIP, but tightened to 86.1-98.7 post-CQIP. Interrupted time series analysis of the STQ score showed that the CQIP led to both an immediate improvement of +6.1% (p=0.017) and sustained monthly improvements in care delivery, improving at a rate of 0.7% per month (p=0.028). 
The SAMU experience demonstrates the utility of a responsive, data-driven quality improvement programme to yield significant immediate and sustained improvements in pre-hospital care for trauma in Rwanda. This programme may be used as an example for additional efforts engaging frontline staff with real-time data feedback in order to rapidly translate data collection efforts into improved care for the injured in a resource-limited setting. Copyright © 2017 Elsevier Ltd. All rights reserved.
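
The interrupted time series analysis described above is, in its standard form, a segmented linear regression with a level-change term and a slope-change term. The sketch below shows that formulation under stated assumptions: the 28-month synthetic score series and its coefficients are illustrative inventions, not the SAMU data.

```python
import numpy as np

def interrupted_time_series(scores, interruption):
    """Segmented OLS: score ~ b0 + b1*t + b2*post + b3*t_since.
    b2 estimates the immediate level change at the interruption;
    b3 estimates the change in monthly slope afterwards."""
    n = len(scores)
    t = np.arange(n, dtype=float)
    post = (t >= interruption).astype(float)              # 1 after the intervention starts
    t_since = np.where(post == 1.0, t - interruption, 0.0)  # months since intervention
    X = np.column_stack([np.ones(n), t, post, t_since])
    beta, *_ = np.linalg.lstsq(X, np.asarray(scores, dtype=float), rcond=None)
    return beta  # [intercept, pre-slope, level change, slope change]

# Illustrative 28-month series: flat at 85 for 14 months, then an immediate
# jump of +6 followed by a +0.7/month trend (numbers are made up).
pre = 85.0 + 0.0 * np.arange(14)
post_scores = 91.0 + 0.7 * np.arange(14)
beta = interrupted_time_series(np.concatenate([pre, post_scores]), 14)
```

On this noise-free series the fit recovers the built-in level change (+6) and slope change (+0.7/month) exactly; on real data the same coefficients come with p-values from the usual OLS inference.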

  12. Intelligent Data Granulation on Load: Improving Infobright's Knowledge Grid

    NASA Astrophysics Data System (ADS)

    Ślęzak, Dominik; Kowalski, Marcin

    One of the major aspects of Infobright's relational database technology is the automatic decomposition of each data table into Rough Rows, each consisting of 64K of the original rows. Rough Rows are automatically annotated by Knowledge Nodes that represent compact information about the rows' values. Query performance depends on the quality of Knowledge Nodes, i.e., their efficiency in minimizing access to the compressed portions of data stored on disk, according to the specific query optimization procedures. We show how to implement a mechanism for organizing the incoming data into Rough Rows that maximize the quality of the corresponding Knowledge Nodes. Given clear business-driven requirements, the implemented mechanism needs to be fully integrated with the data load process, causing no decrease in data load speed. The performance gain resulting from better data organization is illustrated by tests over our benchmark data. The differences between the proposed mechanism and some well-known procedures of database clustering or partitioning are discussed. The paper is a continuation of our patent application [22].
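
The Knowledge Node idea above can be illustrated with a minimal sketch, assuming nothing beyond the abstract's description: each pack of rows carries a min/max annotation, and a range query skips any pack whose annotation proves no row can match. The pack size is shrunk from 64K to 4 for readability, and all names are hypothetical.

```python
def build_knowledge_nodes(values, pack_size=4):
    """Split rows into packs ('Rough Rows') and annotate each with min/max
    ('Knowledge Nodes'). Real packs hold 64K rows; 4 here for readability."""
    packs = [values[i:i + pack_size] for i in range(0, len(values), pack_size)]
    return [(min(p), max(p), p) for p in packs]

def range_query(nodes, lo, hi):
    """Answer lo <= v <= hi, opening (scanning) only packs whose min/max
    summary cannot classify them as fully relevant or fully irrelevant."""
    hits, packs_opened = [], 0
    for mn, mx, rows in nodes:
        if mx < lo or mn > hi:        # irrelevant pack: skip without decompressing
            continue
        if lo <= mn and mx <= hi:     # fully relevant pack: take all rows, no scan
            hits.extend(rows)
            continue
        packs_opened += 1             # suspect pack: must be scanned row by row
        hits.extend(v for v in rows if lo <= v <= hi)
    return hits, packs_opened

nodes = build_knowledge_nodes([1, 2, 3, 4, 10, 11, 12, 13, 5, 6, 20, 21])
hits, opened = range_query(nodes, 10, 13)  # only the third pack is 'suspect'
```

The paper's point about data organization follows directly: the more clustered the values within each pack, the tighter the min/max annotations and the fewer suspect packs a query must open.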

  13. Surgical adverse outcome reporting as part of routine clinical care.

    PubMed

    Kievit, J; Krukerink, M; Marang-van de Mheen, P J

    2010-12-01

    In The Netherlands, health professionals have created a doctor-driven standardised system to report and analyse adverse outcomes (AO). The aim is to improve healthcare by learning from past experiences. The key elements of this system are (1) an unequivocal definition of an adverse outcome, (2) appropriate contextual information and (3) a three-dimensional hierarchical classification system. First, to assess whether routine doctor-driven AO reporting is feasible. Second, to investigate how doctors can learn from AO reporting and analysis to improve the quality of care. Feasibility was assessed by how well doctors reported AO in the surgical department of a Dutch university hospital over a period of 9 years. AO incidence was analysed per patient subgroup and over time, in a time-trend analysis of three equal 3-year periods. AO were analysed case by case and statistically, to learn lessons from past events. In 19,907 surgical admissions, 9,189 AOs were reported: one or more AO in 18.2% of admissions. On average, 55 lessons were learnt each year (in 4.3% of AO). More AO were reported in the third 3-year period (P3) than in the first (P1) (OR 1.39 (1.23-1.57)). Although minor AO increased, fatal AO decreased over time (OR 0.59 (0.45-0.77)). Doctor-driven AO reporting is shown to be feasible. Lessons can be learnt from case-by-case analyses of individual AO, as well as by statistical analysis of AO groups and subgroups (illustrated by time-trend analysis), thus contributing to the improvement of the quality of care. Moreover, by standardising AO reporting, data can be compared across departments or hospitals, to generate (confidential) mirror information for professionals cooperating in a peer-review setting.

  14. Surgical adverse outcome reporting as part of routine clinical care

    PubMed Central

    Krukerink, M; Marang-van de Mheen, P J

    2010-01-01

    Background In The Netherlands, health professionals have created a doctor-driven standardised system to report and analyse adverse outcomes (AO). The aim is to improve healthcare by learning from past experiences. The key elements of this system are (1) an unequivocal definition of an adverse outcome, (2) appropriate contextual information and (3) a three-dimensional hierarchical classification system. Objectives First, to assess whether routine doctor-driven AO reporting is feasible. Second, to investigate how doctors can learn from AO reporting and analysis to improve the quality of care. Methods Feasibility was assessed by how well doctors reported AO in the surgical department of a Dutch university hospital over a period of 9 years. AO incidence was analysed per patient subgroup and over time, in a time-trend analysis of three equal 3-year periods. AO were analysed case by case and statistically, to learn lessons from past events. Results In 19 907 surgical admissions, 9189 AOs were reported: one or more AO in 18.2% of admissions. On average, 55 lessons were learnt each year (in 4.3% of AO). More AO were reported in the third 3-year period (P3) than in the first (P1) (OR 1.39 (1.23–1.57)). Although minor AO increased, fatal AO decreased over time (OR 0.59 (0.45–0.77)). Conclusions Doctor-driven AO reporting is shown to be feasible. Lessons can be learnt from case-by-case analyses of individual AO, as well as by statistical analysis of AO groups and subgroups (illustrated by time-trend analysis), thus contributing to the improvement of the quality of care. Moreover, by standardising AO reporting, data can be compared across departments or hospitals, to generate (confidential) mirror information for professionals cooperating in a peer-review setting. PMID:20430928

  15. Data Driven Math Intervention: What the Numbers Say

    ERIC Educational Resources Information Center

    Martin, Anthony W.

    2013-01-01

    This study was designed to determine whether or not data driven math skills groups would be effective in increasing student academic achievement. From this topic three key questions arose: "Would the implementation of data driven math skills groups improve student academic achievement more than standard instruction as measured by the…

  16. Effects of coconut granular activated carbon pretreatment on membrane filtration in a gravitational driven process to improve drinking water quality.

    PubMed

    da Silva, Flávia Vieira; Yamaguchi, Natália Ueda; Lovato, Gilselaine Afonso; da Silva, Fernando Alves; Reis, Miria Hespanhol Miranda; de Amorim, Maria Teresa Pessoa Sousa; Tavares, Célia Regina Granhen; Bergamasco, Rosângela

    2012-01-01

    This study evaluates the performance of a polymeric microfiltration membrane, as well as its combination with a coconut granular activated carbon (GAC) pretreatment, in a gravitational filtration module, to improve the quality of water destined for human consumption. The proposed membrane and adsorbent were thoroughly characterized using instrumental techniques such as contact angle, Brunauer-Emmett-Teller (BET), and Fourier transform infrared spectroscopy analyses. The applied processes (membrane and GAC + membrane) were evaluated regarding permeate flux, fouling percentage, pH and removal of Escherichia coli, colour, turbidity and free chlorine. The results obtained for filtrations with and without GAC pretreatment were similar in terms of water quality, although GAC pretreatment ensured higher chlorine removals as well as higher initial permeate fluxes. This system, applying GAC as a pretreatment followed by gravitationally driven membrane filtration, could be considered as an alternative point-of-use treatment for water destined for human consumption.

  17. Are All Clinical Studies Sponsored by Industry Not Valid?

    PubMed Central

    Heinemann, Lutz

    2008-01-01

    Industry-sponsored studies have such a bad reputation that some journals require an additional statistical analysis by an independent statistician. This commentary discusses some of the reasons why academics tend to believe that “academic” science is better than industry-driven science. Most likely, when it comes to publications, the risk of fraud exists in both worlds, as the pressure to publish “significant” data is prevalent in both. In contrast to the academic world, the level of control by regulatory bodies for industry-sponsored studies is much higher. Therefore, the quality of industry-driven studies is high, at least when it comes to the quality of data. One of the main reasons why academics are so skeptical about the pharmaceutical industry is a lack of knowledge about the work done in industry, which is as demanding and scientific as elsewhere. In turn, many physicians working in the pharmaceutical industry have low self-esteem. The pharmaceutical industry should also present itself more effectively to shed its bad image. There is a clear need for more communication between both worlds in order to better understand each other's difficulties and needs. PMID:19885307

  18. Using Quality Improvement Methods and Time-Driven Activity-Based Costing to Improve Value-Based Cancer Care Delivery at a Cancer Genetics Clinic.

    PubMed

    Tan, Ryan Y C; Met-Domestici, Marie; Zhou, Ke; Guzman, Alexis B; Lim, Soon Thye; Soo, Khee Chee; Feeley, Thomas W; Ngeow, Joanne

    2016-03-01

    To meet increasing demand for cancer genetic testing and improve value-based cancer care delivery, National Cancer Centre Singapore restructured the Cancer Genetics Service in 2014. Care delivery processes were redesigned. We sought to improve access by increasing the clinic capacity of the Cancer Genetics Service by 100% within 1 year without increasing direct personnel costs. Process mapping and plan-do-study-act (PDSA) cycles were used in a quality improvement project for the Cancer Genetics Service clinic. The impact of interventions was evaluated by tracking the weekly number of patient consultations and access times for appointments between April 2014 and May 2015. The cost impact of implemented process changes was calculated using the time-driven activity-based costing method. Our study completed two PDSA cycles. An important outcome was achieved after the first cycle: The inclusion of a genetic counselor increased clinic capacity by 350%. The number of patients seen per week increased from two in April 2014 (range, zero to four patients) to seven in November 2014 (range, four to 10 patients). Our second PDSA cycle showed that manual preappointment reminder calls reduced the variation in the nonattendance rate and contributed to a further increase in patients seen per week to 10 in May 2015 (range, seven to 13 patients). There was a concomitant decrease in costs of the patient care cycle by 18% after both PDSA cycles. This study shows how quality improvement methods can be combined with time-driven activity-based costing to increase value. In this paper, we demonstrate how we improved access while reducing costs of care delivery. Copyright © 2016 by American Society of Clinical Oncology.
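
Time-driven activity-based costing, as applied above, reduces to a simple computation: each resource's cost per minute multiplied by the minutes that resource spends per patient care cycle, summed across process steps. The sketch below uses entirely hypothetical rates, times, and step names, not the clinic's figures.

```python
def tdabc_cost(steps):
    """Time-driven activity-based cost of one pass through a care cycle.
    steps: list of (capacity cost rate in $/minute, minutes spent)."""
    return sum(rate * minutes for rate, minutes in steps)

# Hypothetical care cycle: reminder call, counselor session, physician review.
before = [(0.50, 10), (2.00, 60), (5.00, 30)]   # ($/min, minutes) per step
after  = [(0.50, 10), (2.00, 60), (5.00, 15)]   # physician time halved by redesign
saving = 1 - tdabc_cost(after) / tdabc_cost(before)
```

Because the cost model is just rate x time, any process change that shifts minutes from a high-rate resource to a lower-rate one (here, hypothetically halving physician review time) shows up directly as a percentage reduction in the care-cycle cost.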

  19. Depicting the interplay between organisational tiers in the use of a national quality registry to develop quality of care in Sweden.

    PubMed

    Eldh, Ann Catrine; Fredriksson, Mio; Vengberg, Sofie; Halford, Christina; Wallin, Lars; Dahlström, Tobias; Winblad, Ulrika

    2015-11-25

    With a pending need to identify potential means to improved quality of care, national quality registries (NQRs) have been identified as a promising route. Yet, there is limited evidence regarding what hinders and facilitates the NQR innovation, and what characterises the contexts in which NQRs are applied and drive quality improvement. Supposedly, barriers and facilitators to NQR-driven quality improvement may be found in the healthcare context, in the politico-administrative context, as well as within an NQR itself. In this study, we investigated the potential variation in whether and how an NQR was applied by decision-makers and users in regions and clinical settings. The aim was to depict the interplay between the clinical and the politico-administrative tiers in the use of NQRs to develop quality of care, examining an established registry on stroke care as a case study. We interviewed 44 individuals representing the clinical and the politico-administrative settings of 4 out of 21 regions, strategically chosen to include stroke units representing a variety of outcomes in the NQR on stroke (Riksstroke) and a variety of settings. The transcribed interviews were analysed by applying the Consolidated Framework for Implementation Research (CFIR). In two regions, decision-makers and/or administrators had initiated healthcare process projects for stroke, engaging the health professionals in the local stroke units, who contributed with, for example, local data from Riksstroke. The Riksstroke data were used for identifying improvement issues, for setting goals, and for asserting that the stroke units achieved an equivalent standard of care and a certain level of quality of stroke care. Meanwhile, one region had more recently initiated such a project, and the fourth region had no similar collaboration across tiers. 
Apart from these projects, there was limited joint communication across tiers and none that included all individuals and functions engaged in quality improvement with regards to stroke care. If NQRs are to provide for quality improvement and learning opportunities, advances must be made in the links between the structures and processes across all organisational tiers, including decision-makers, administrators and health professionals engaged in a particular healthcare process.

  20. The Quality of Teaching Staff: Higher Education Institutions' Compliance with the European Standards and Guidelines for Quality Assurance--The Case of Portugal

    ERIC Educational Resources Information Center

    Cardoso, Sónia; Tavares, Orlanda; Sin, Cristina

    2015-01-01

    In recent years, initiatives for the improvement of teaching quality have been pursued both at European and national levels. Such is the case of the European Standards and Guidelines for Quality Assurance (ESG) and of legislation passed by several European countries, including Portugal, in response to European policy developments driven by the…

  1. Multi-hospital Community NICU Quality Improvement Improves Survival of ELBW Infants.

    PubMed

    Owens, Jack D; Soltau, Thomas; McCaughn, Danny; Miller, Jason; O'Mara, Patrick; Robbins, Kenny; Temple, David M; Wender, David F

    2015-08-01

    Quality improvement or high reliability in medicine is an evolving science in which we seek to integrate evidence-based medicine, structural resources, process management, leadership models, culture, and education. Newborn Associates is a community-based neonatology practice that staffs and manages neonatal intensive care units (NICUs) at Central Mississippi Medical Center, Mississippi Baptist Medical Center, River Oaks Hospital, St Dominic's Hospital and Woman's Hospital within the Jackson, Mississippi, metropolitan area. These hospitals participate in the Vermont-Oxford Neonatal Network (VON), a voluntary national network of about 1000 NICU groups that submit data allowing them to benchmark their patient outcomes. This network currently holds data on 1.5 million infants. Participation may also include the Newborn Improvement Quality Collaborative (NICQ), an intensive quality improvement program in which 40-60 of the almost 1000 VON centers participate each year, or the iNICQ, an internet-based collaborative involving about 150 centers per year. From 2008-2009, our group concentrated efforts on quality improvement, which included consolidating resources of three corporately managed hospitals to allow focused care of babies under 800-1000 grams at a single center, expanding participation in the VON NICQ to include all physicians and centers, and establishing a group QI-focused committee aimed at sharing practice bundles and adopting quality improvement methodology. The goal of this article is to report the impact of these QI activities on survival of the smallest preterm infants who weigh less than 1500 grams at birth. Two epochs were compared: 2006-2009 and 2010-2013. 551 VLBW (<1500 grams) infants from epoch 1 were compared to 583 VLBW infants from epoch 2. Mortality in this group decreased from 18% to 11.1% (OR 0.62, 95% CI 0.44-0.88). 
Mortality in the 501-750 grams birth weight category decreased from 45.7% to 18% (OR 0.39, 95% CI 0.21-0.74). Improved survival was noted in all centers over the time period. These findings suggest that a physician-driven, multidisciplinary, individualized and multifactorial quality improvement effort can positively impact the care of extremely preterm infants in the community NICU setting.
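
The odds ratios reported above follow the standard 2x2-table computation, with a Wald confidence interval on the log-odds scale. The sketch below shows the arithmetic; the example counts are hypothetical, not the article's exact tallies.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for group 1 (a events, b non-events) vs group 2 (c, d),
    with a 95% Wald CI: exp(ln(OR) +/- z * sqrt(1/a + 1/b + 1/c + 1/d))."""
    or_ = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(OR)
    log_or = math.log(or_)
    return or_, math.exp(log_or - z * se), math.exp(log_or + z * se)

# Hypothetical counts: 20 deaths / 80 survivors vs 40 deaths / 60 survivors.
or_, lo, hi = odds_ratio_ci(20, 80, 40, 60)  # OR = (20/80)/(40/60) = 0.375
```

An OR below 1 with a CI excluding 1, as in the article's 0.62 (0.44-0.88), indicates a statistically significant reduction in the odds of death in the later epoch.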

  2. Data-driven event-by-event respiratory motion correction using TOF PET list-mode centroid of distribution

    NASA Astrophysics Data System (ADS)

    Ren, Silin; Jin, Xiao; Chan, Chung; Jian, Yiqiang; Mulnix, Tim; Liu, Chi; E Carson, Richard

    2017-06-01

    Data-driven respiratory gating techniques were developed to correct for respiratory motion in PET studies, without the help of external motion tracking systems. Due to the greatly increased image noise in gated reconstructions, it is desirable to develop a data-driven event-by-event respiratory motion correction method. In this study, using the Centroid-of-distribution (COD) algorithm, we established a data-driven event-by-event respiratory motion correction technique using TOF PET list-mode data, and investigated its performance by comparing with an external system-based correction method. Ten human scans with the pancreatic β-cell tracer 18F-FP-(+)-DTBZ were employed. Data-driven respiratory motions in superior-inferior (SI) and anterior-posterior (AP) directions were first determined by computing the centroid of all radioactive events during each short time frame with further processing. The Anzai belt system was employed to record respiratory motion in all studies. COD traces in both SI and AP directions were first compared with Anzai traces by computing the Pearson correlation coefficients. Then, respiratory gated reconstructions based on either COD or Anzai traces were performed to evaluate their relative performance in capturing respiratory motion. Finally, based on correlations of displacements of organ locations in all directions and COD information, continuous 3D internal organ motion in SI and AP directions was calculated based on COD traces to guide event-by-event respiratory motion correction in the MOLAR reconstruction framework. Continuous respiratory correction results based on COD were compared with that based on Anzai, and without motion correction. Data-driven COD traces showed a good correlation with Anzai in both SI and AP directions for the majority of studies, with correlation coefficients ranging from 63% to 89%. 
Based on the determined respiratory displacements of pancreas between end-expiration and end-inspiration from gated reconstructions, there was no significant difference between COD-based and Anzai-based methods. Finally, data-driven COD-based event-by-event respiratory motion correction yielded comparable results to that based on Anzai respiratory traces, in terms of contrast recovery and reduced motion-induced blur. Data-driven event-by-event respiratory motion correction using COD showed significant image quality improvement compared with reconstructions with no motion correction, and gave comparable results to the Anzai-based method.

  3. Data-driven event-by-event respiratory motion correction using TOF PET list-mode centroid of distribution.

    PubMed

    Ren, Silin; Jin, Xiao; Chan, Chung; Jian, Yiqiang; Mulnix, Tim; Liu, Chi; Carson, Richard E

    2017-06-21

    Data-driven respiratory gating techniques were developed to correct for respiratory motion in PET studies, without the help of external motion tracking systems. Due to the greatly increased image noise in gated reconstructions, it is desirable to develop a data-driven event-by-event respiratory motion correction method. In this study, using the Centroid-of-distribution (COD) algorithm, we established a data-driven event-by-event respiratory motion correction technique using TOF PET list-mode data, and investigated its performance by comparing with an external system-based correction method. Ten human scans with the pancreatic β-cell tracer 18F-FP-(+)-DTBZ were employed. Data-driven respiratory motions in superior-inferior (SI) and anterior-posterior (AP) directions were first determined by computing the centroid of all radioactive events during each short time frame with further processing. The Anzai belt system was employed to record respiratory motion in all studies. COD traces in both SI and AP directions were first compared with Anzai traces by computing the Pearson correlation coefficients. Then, respiratory gated reconstructions based on either COD or Anzai traces were performed to evaluate their relative performance in capturing respiratory motion. Finally, based on correlations of displacements of organ locations in all directions and COD information, continuous 3D internal organ motion in SI and AP directions was calculated based on COD traces to guide event-by-event respiratory motion correction in the MOLAR reconstruction framework. Continuous respiratory correction results based on COD were compared with that based on Anzai, and without motion correction. Data-driven COD traces showed a good correlation with Anzai in both SI and AP directions for the majority of studies, with correlation coefficients ranging from 63% to 89%. 
Based on the determined respiratory displacements of pancreas between end-expiration and end-inspiration from gated reconstructions, there was no significant difference between COD-based and Anzai-based methods. Finally, data-driven COD-based event-by-event respiratory motion correction yielded comparable results to that based on Anzai respiratory traces, in terms of contrast recovery and reduced motion-induced blur. Data-driven event-by-event respiratory motion correction using COD showed significant image quality improvement compared with reconstructions with no motion correction, and gave comparable results to the Anzai-based method.
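
The core of the centroid-of-distribution method described above is simple: within each short time frame, average the coordinates of all detected events to get one displacement sample per direction, yielding a respiratory trace without any external device. The sketch below shows that step only; the synthetic event arrays, the 0.5 s frame length, and the 0.25 Hz "breathing" signal are illustrative assumptions, and the actual MOLAR pipeline applies further processing the abstract mentions.

```python
import numpy as np

def cod_trace(event_times, event_z, frame_s=0.5):
    """Centroid-of-distribution respiratory trace: mean axial (SI) coordinate
    of all list-mode events falling in each short time frame."""
    edges = np.arange(0.0, event_times.max() + frame_s, frame_s)
    idx = np.digitize(event_times, edges) - 1              # frame index per event
    return np.array([event_z[idx == k].mean() for k in range(len(edges) - 1)])

# Illustrative list-mode data: 200k events over 60 s whose mean axial position
# oscillates like breathing at 0.25 Hz, buried in heavy per-event noise.
rng = np.random.default_rng(0)
t = rng.uniform(0, 60, 200_000)                            # event arrival times (s)
z = 5.0 * np.sin(2 * np.pi * 0.25 * t) + rng.normal(0, 30, t.size)
trace = cod_trace(t, z)
```

Although each individual event is dominated by noise, averaging thousands of events per frame recovers the underlying respiratory waveform, which is why the COD trace can correlate well with a belt-based reference such as Anzai.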

  4. The promise and peril of chemical probes

    PubMed Central

    Arrowsmith, Cheryl H; Audia, James E; Austin, Christopher; Baell, Jonathan; Bennett, Jonathan; Blagg, Julian; Bountra, Chas; Brennan, Paul E; Brown, Peter J; Bunnage, Mark E; Buser-Doepner, Carolyn; Campbell, Robert M; Carter, Adrian J; Cohen, Philip; Copeland, Robert A; Cravatt, Ben; Dahlin, Jayme L; Dhanak, Dashyant; Frederiksen, Mathias; Frye, Stephen V; Gray, Nathanael; Grimshaw, Charles E; Hepworth, David; Howe, Trevor; Huber, Kilian V M; Jin, Jian; Knapp, Stefan; Kotz, Joanne D; Kruger, Ryan G; Lowe, Derek; Mader, Mary M; Marsden, Brian; Mueller-Fahrnow, Anke; Müller, Susanne; O'Hagan, Ronan C; Overington, John P; Owen, Dafydd R; Rosenberg, Saul H; Ross, Ruth; Roth, Bryan; Schapira, Matthieu; Schreiber, Stuart L; Shoichet, Brian; Sundström, Michael; Superti-Furga, Giulio; Taunton, Jack; Toledo-Sherman, Leticia; Walpole, Chris; Walters, Michael A; Willson, Timothy M; Workman, Paul; Young, Robert N; Zuercher, William J

    2016-01-01

    Chemical probes are powerful reagents with increasing impacts on biomedical research. However, probes of poor quality or that are used incorrectly generate misleading results. To help address these shortcomings, we will create a community-driven wiki resource to improve quality and convey current best practice. PMID:26196764

  5. The New Zealand Major Trauma Registry: the foundation for a data-driven approach in a contemporary trauma system.

    PubMed

    Isles, Siobhan; Christey, Grant; Civil, Ian; Hicks, Peter

    2017-10-06

    To describe the development of the New Zealand Major Trauma Registry (NZ-MTR) and the initial experiences of its use. The background to the development of the NZ-MTR was reviewed, and the processes undertaken to implement a single-instance, web-based national registry are described. A national minimum dataset was defined and utilised, and key structures to support the Registry, such as a data governance group, were established. The NZ-MTR was successfully implemented and is the foundation for a new, data-driven model of quality improvement. In its first year of operation, over 1,300 patients were entered into the Registry, although coverage is not yet universal. The overall incidence is 40.8 major trauma cases per 100,000 population; the incidence in the Māori population was 69/100,000, compared with 31/100,000 in the non-Māori population. The case fatality rate was 9%. Three age peaks were observed: 20-24 years, 50-59 years and above 85 years. Road traffic crashes accounted for 50% of all caseload, and a significant proportion of major trauma patients (21%) were transferred to one or more hospitals before reaching a definitive care facility. Despite the challenges of working across multiple jurisdictions, initiation of a single-instance, web-based registry has been achieved. The NZ-MTR enables New Zealand to have a national view of trauma treatment and outcomes for the first time. It will inform quality improvement and injury prevention initiatives and potentially decrease the burden of injury on all New Zealanders.

  6. Plant species traits are the predominant control on litter decomposition rates within biomes worldwide.

    PubMed

    Cornwell, William K; Cornelissen, Johannes H C; Amatangelo, Kathryn; Dorrepaal, Ellen; Eviner, Valerie T; Godoy, Oscar; Hobbie, Sarah E; Hoorens, Bart; Kurokawa, Hiroko; Pérez-Harguindeguy, Natalia; Quested, Helen M; Santiago, Louis S; Wardle, David A; Wright, Ian J; Aerts, Rien; Allison, Steven D; van Bodegom, Peter; Brovkin, Victor; Chatain, Alex; Callaghan, Terry V; Díaz, Sandra; Garnier, Eric; Gurvich, Diego E; Kazakou, Elena; Klein, Julia A; Read, Jenny; Reich, Peter B; Soudzilovskaia, Nadejda A; Vaieretti, M Victoria; Westoby, Mark

    2008-10-01

    Worldwide decomposition rates depend both on climate and on the legacy of plant functional traits as litter quality. To quantify the degree to which functional differentiation among species affects their litter decomposition rates, we brought together leaf trait and litter mass loss data for 818 species from 66 decomposition experiments on six continents. We show that: (i) the magnitude of species-driven differences is much larger than previously thought and greater than climate-driven variation; and (ii) the decomposability of a species' litter is consistently correlated with that species' ecological strategy within different ecosystems globally, representing a new connection between whole-plant carbon strategy and biogeochemical cycling. This connection between plant strategies and decomposability is crucial both for understanding vegetation-soil feedbacks and for improving forecasts of the global carbon cycle.

  7. A data-driven approach to quality risk management

    PubMed Central

    Alemayehu, Demissie; Alvir, Jose; Levenstein, Marcia; Nickerson, David

    2013-01-01

    Aim: An effective clinical trial strategy to ensure patient safety as well as trial quality and efficiency involves an integrated approach, including prospective identification of risk factors, mitigation of the risks through proper study design and execution, and assessment of quality metrics in real-time. Such an integrated quality management plan may also be enhanced by using data-driven techniques to identify risk factors that are most relevant in predicting quality issues associated with a trial. In this paper, we illustrate such an approach using data collected from actual clinical trials. Materials and Methods: Several statistical methods were employed, including the Wilcoxon rank-sum test and logistic regression, to identify the presence of association between risk factors and the occurrence of quality issues, applied to data on quality of clinical trials sponsored by Pfizer. Results: Only a subset of the risk factors had a significant association with quality issues; these included: whether the study used a placebo, whether an agent was a biologic, unusual packaging label, complex dosing, and over 25 planned procedures. Conclusion: Proper implementation of the strategy can help to optimize resource utilization without compromising trial integrity and patient safety. PMID:24312890
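
    The first of the statistical methods named above can be sketched in miniature. The following is a hedged, self-contained illustration of the Wilcoxon rank-sum test (normal approximation, no tie correction in the variance) applied to hypothetical counts of quality issues for trials with and without a given risk factor; the data, grouping, and function names are invented for illustration and are not the paper's analysis:

```python
import math

def _ranks(values):
    # Average ranks (1-based); tied values share their mean rank.
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def rank_sum_p(x, y):
    """Two-sided Wilcoxon rank-sum p-value via the normal approximation."""
    n1, n2 = len(x), len(y)
    r = _ranks(list(x) + list(y))
    w = sum(r[:n1])                      # rank sum of the first sample
    mean = n1 * (n1 + n2 + 1) / 2
    var = n1 * n2 * (n1 + n2 + 1) / 12   # no tie correction in this sketch
    z = (w - mean) / math.sqrt(var)
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Hypothetical quality-issue counts per trial, grouped by a risk factor.
placebo = [5, 7, 6, 9, 8]
no_placebo = [1, 2, 3, 2, 4]
p = rank_sum_p(placebo, no_placebo)   # small p suggests an association
```

    In practice one would reach for a library implementation (e.g. a SciPy rank-sum test) rather than hand-rolling the ranks; the point here is only the shape of the risk-factor screen.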

  8. What Drives Teachers to Improve? The Role of Teacher Mindset in Professional Learning

    ERIC Educational Resources Information Center

    Gero, Greg Philip

    2013-01-01

    Teacher quality has received increasing focus over the past decade, yet, by some measures, teachers rarely improve after their first few years of teaching, and not all teachers seem driven to improve. Traditional models of professional learning have emphasized the processes that teachers take part in as a facilitator of their improvement. Research…

  9. Money matters: exploiting the data from outcomes research for quality improvement initiatives.

    PubMed

    Impellizzeri, Franco M; Bizzini, Mario; Leunig, Michael; Maffiuletti, Nicola A; Mannion, Anne F

    2009-08-01

    In recent years, there has been an increase in studies that have sought to identify predictors of treatment outcome and to examine the efficacy of surgical and non-surgical treatments. In addition to the scientific advancement associated with these studies per se, the hospitals and clinics where the studies are conducted may gain indirect financial benefit from participating in such projects as a result of the prestige derived from corporate social responsibility, a reputational lever used to reward such institutions. It is known that there is a positive association between corporate social performance and corporate financial performance. However, in addition to this, the research findings and the research staff can constitute resources from which the provider can reap a more direct benefit, by means of their contribution to quality control and improvement. Poor quality is costly. Patient satisfaction increases the chances that the patient will be a promoter of the provider to friends and colleagues. As such, involvement of the research staff in the improvement of the quality of care can ultimately result in economic revenue for the provider. The most advanced methodologies for continuous quality improvement (e.g., six-sigma) are data-driven and use statistical tools similar to those utilized in the traditional research setting. Given that these methods rely on the application of the scientific process to quality improvement, researchers have the adequate skills and mind-set to embrace them and thereby contribute effectively to the quality team. The aim of this article is to demonstrate by means of real-life examples how to utilize the findings of outcome studies for quality management in a manner similar to that used in the business community. It also aims to stimulate research groups to better understand that, by adopting a different perspective, their studies can be an additional resource for the healthcare provider. 
The change in perspective should stimulate researchers to go beyond the traditional studies examining predictors of treatment outcome and to see things instead in terms of the "bigger picture", i.e., the improvement of the process outcome, the quality of the service.

  10. Money matters: exploiting the data from outcomes research for quality improvement initiatives

    PubMed Central

    Bizzini, Mario; Leunig, Michael; Maffiuletti, Nicola A.; Mannion, Anne F.

    2009-01-01

    In recent years, there has been an increase in studies that have sought to identify predictors of treatment outcome and to examine the efficacy of surgical and non-surgical treatments. In addition to the scientific advancement associated with these studies per se, the hospitals and clinics where the studies are conducted may gain indirect financial benefit from participating in such projects as a result of the prestige derived from corporate social responsibility, a reputational lever used to reward such institutions. It is known that there is a positive association between corporate social performance and corporate financial performance. However, in addition to this, the research findings and the research staff can constitute resources from which the provider can reap a more direct benefit, by means of their contribution to quality control and improvement. Poor quality is costly. Patient satisfaction increases the chances that the patient will be a promoter of the provider to friends and colleagues. As such, involvement of the research staff in the improvement of the quality of care can ultimately result in economic revenue for the provider. The most advanced methodologies for continuous quality improvement (e.g., six-sigma) are data-driven and use statistical tools similar to those utilized in the traditional research setting. Given that these methods rely on the application of the scientific process to quality improvement, researchers have the adequate skills and mind-set to embrace them and thereby contribute effectively to the quality team. The aim of this article is to demonstrate by means of real-life examples how to utilize the findings of outcome studies for quality management in a manner similar to that used in the business community. It also aims to stimulate research groups to better understand that, by adopting a different perspective, their studies can be an additional resource for the healthcare provider. 
The change in perspective should stimulate researchers to go beyond the traditional studies examining predictors of treatment outcome and to see things instead in terms of the “bigger picture”, i.e., the improvement of the process outcome, the quality of the service. PMID:19294433

  11. Mentorship and coaching to support strengthening healthcare systems: lessons learned across the five Population Health Implementation and Training partnership projects in sub-Saharan Africa.

    PubMed

    Manzi, Anatole; Hirschhorn, Lisa R; Sherr, Kenneth; Chirwa, Cindy; Baynes, Colin; Awoonor-Williams, John Koku

    2017-12-21

    Despite global efforts to increase health workforce capacity through training and guidelines, challenges remain in bridging the gap between knowledge and quality clinical practice and addressing health system deficiencies preventing health workers from providing high quality care. In many developing countries, supervision activities focus on data collection, auditing and report completion rather than catalyzing learning and supporting system quality improvement. To address this gap, mentorship and coaching interventions were implemented in projects in five African countries (Ghana, Mozambique, Rwanda, Tanzania, and Zambia) as components of health systems strengthening (HSS) strategies funded through the Doris Duke Charitable Foundation's African Health Initiative. We report on lessons learned from a cross-country evaluation. The evaluation was designed based on a conceptual model derived from the project-specific interventions. Semi-structured interviews were administered to key informants to capture data in six categories: 1) mentorship and coaching goals, 2) selection and training of mentors and coaches, 3) integration with the existing systems, 4) monitoring and evaluation, 5) reported outcomes, and 6) challenges and successes. A review of project-published articles and technical reports from the individual projects supplemented interview information. Although there was heterogeneity in the approaches to mentorship and coaching and targeted areas of the country projects, all led to improvements in core health system areas, including quality of clinical care, data-driven decision making, leadership and accountability, and staff satisfaction. Adaptation of approaches to reflect local context encouraged their adoption and improved their effectiveness and sustainability. 
    We found that incorporating mentorship and coaching activities into HSS strategies was associated with improvements in quality of care and health systems; mentorship and coaching thus represent an important component of HSS activities designed to improve not just coverage but, beyond that, effective coverage on the path to Universal Health Care.

  12. Total Quality Management in Libraries: A Sourcebook.

    ERIC Educational Resources Information Center

    O'Neil, Rosanna M., Comp.

    Total Quality Management (TQM) brings together the best aspects of organizational excellence by driving out fear, offering customer-driven products and services, doing it right the first time by eliminating error, and maintaining inventory control without waste. Libraries are service organizations which are constantly trying to improve service.…

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saide, Pablo E.; Peterson, David A.; de Silva, Arlindo

    We couple airborne, ground-based, and satellite observations; conduct regional simulations; and develop and apply an inversion technique to constrain hourly smoke emissions from the Rim Fire, the third largest observed in California, USA. Emissions constrained with multiplatform data show notable nocturnal enhancements (sometimes over a factor of 20), correlate better with daily burned area data, and are a factor of 2–4 higher than a priori estimates, highlighting the need for improved characterization of diurnal profiles and day-to-day variability when modeling extreme fires. Constraining only with satellite data results in smaller enhancements, mainly due to missing retrievals near the emissions source, suggesting that top-down emission estimates for these events could be underestimated and a multiplatform approach is required to resolve them. Predictions driven by emissions constrained with multiplatform data present significant variations in downwind air quality and in aerosol feedback on meteorology, emphasizing the need for improved emissions estimates during exceptional events.
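
    The abstract does not spell out the inversion itself; as a loose, hypothetical sketch of the underlying idea only (rescaling a priori emissions so that modeled smoke best matches observations, assuming the model responds linearly to emissions), a least-squares scale factor can be computed as follows. All numbers and names are invented:

```python
def lsq_scale(observed, modeled):
    """Least-squares factor s minimizing sum((observed - s * modeled)**2)."""
    num = sum(o * m for o, m in zip(observed, modeled))
    den = sum(m * m for m in modeled)
    return num / den

# Hypothetical hourly smoke concentrations: the a priori run underestimates.
obs = [12.0, 30.0, 21.0, 9.0]
prior = [4.0, 10.0, 7.0, 3.0]
scale = lsq_scale(obs, prior)   # posterior emissions = scale * a priori
```

    A real top-down inversion resolves hourly factors, observation errors, and transport, but the scalar version above shows why missing retrievals (dropped terms in both sums) bias the recovered factor low.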

  14. Measuring the Cost of Quality in Higher Education: A Faculty Perspective

    ERIC Educational Resources Information Center

    Ruhupatty, LeRoy; Maguad, Ben A.

    2015-01-01

    Most critical activities in colleges and universities are driven by financial considerations. It is thus important that revenues are found to support these activities or ways identified to streamline costs. One way to cut cost is to improve the efficiency of schools to address the issue of poor quality. In this paper, the cost of poor quality in…

  15. The Potential of Knowing More: A Review of Data-Driven Urban Water Management.

    PubMed

    Eggimann, Sven; Mutzner, Lena; Wani, Omar; Schneider, Mariane Yvonne; Spuhler, Dorothee; Moy de Vitry, Matthew; Beutler, Philipp; Maurer, Max

    2017-03-07

    The promise of collecting and utilizing large amounts of data has never been greater in the history of urban water management (UWM). This paper reviews several data-driven approaches which play a key role in bringing forward a sea change. It critically investigates whether data-driven UWM offers a promising foundation for addressing current challenges and supporting fundamental changes in UWM. We discuss the examples of better rain-data management, urban pluvial flood-risk management and forecasting, drinking water and sewer network operation and management, integrated design and management, increasing water productivity, wastewater-based epidemiology and on-site water and wastewater treatment. The accumulated evidence from literature points toward a future UWM that offers significant potential benefits thanks to increased collection and utilization of data. The findings show that data-driven UWM allows us to develop and apply novel methods, to optimize the efficiency of the current network-based approach, and to extend functionality of today's systems. However, generic challenges related to data-driven approaches (e.g., data processing, data availability, data quality, data costs) and the specific challenges of data-driven UWM need to be addressed, namely data access and ownership, current engineering practices and the difficulty of assessing the cost benefits of data-driven UWM.

  16. Improving rates of cotrimoxazole prophylaxis in resource-limited settings: implementation of a quality improvement approach.

    PubMed

    Bardfield, J; Agins, B; Palumbo, M; Wei, A L; Morris, J; Marston, B

    2014-12-01

    To demonstrate the effectiveness of quality improvement methods to monitor and improve administration of cotrimoxazole (CTX) prophylaxis, thereby improving health outcomes among adults living with HIV/AIDS in low resource countries. Program evaluation. HIV/AIDS health care facilities in Uganda, Mozambique, Namibia and Haiti. Performance measures based on national guidelines are developed in each country. These may include CD4 monitoring, ART adherence and uptake of CTX prophylaxis. CTX prophylaxis is routinely selected because it has been shown to reduce HIV-related morbidity and mortality. Patient records are sampled using a standard statistical table to achieve a minimum confidence interval of 90% with a spread of ±8% in participating clinics. If an electronic medical record is available, all patients are reviewed. Routine review of performance measures, usually every 6 months, is conducted to identify gaps in care. Improvement interventions are developed and implemented at health facilities, informed by performance results and local/national public health priorities. Median clinic rates of CTX prophylaxis. Median performance rates of CTX prophylaxis generally improved for adult HIV+ patients between 2006 and 2013 across countries, with median clinic rates higher than baseline at follow-up in 16 of 18 groups of clinics implementing CTX-focused improvement projects. Quality management offers a data-driven method to improve the quality of HIV care in low resource countries. Application of improvement principles has been shown to be effective to increase the rates of CTX prophylaxis in national HIV programs in multiple countries. © The Author 2014. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.
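
    The stated sampling target (a 90% confidence interval with a spread of ±8%) matches the standard sample-size formula for a proportion. A small sketch under the conservative assumption p = 0.5 (the program's actual lookup table is not given, so this is only an approximation of it):

```python
import math

def sample_size(z, margin, p=0.5):
    """Records to sample so a proportion's CI half-width is at most `margin`."""
    return math.ceil(z * z * p * (1 - p) / (margin * margin))

# z = 1.645 is the critical value for a two-sided 90% confidence interval.
n = sample_size(z=1.645, margin=0.08)
```

    For small clinics a finite-population correction would shrink this further, which is presumably why full review is used wherever an electronic medical record makes it cheap.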

  17. Identifying effective pathways in a successful continuous quality improvement programme: the GEDAPS study.

    PubMed

    Bodicoat, Danielle H; Mundet, Xavier; Gray, Laura J; Cos, Xavier; Davies, Melanie J; Khunti, Kamlesh; Cano, Juan-Franciso

    2014-12-01

    Continuous quality improvement programmes often target several aspects of care, some of which may be more effective meaning that resources could be focussed on these. The objective was to identify the effective and ineffective aspects of a successful continuous quality improvement programme for individuals with type 2 diabetes in primary care. Data were from a series of cross-sectional studies (GEDAPS) in primary care, Catalonia, Spain, in 55 centres (2239 participants) in 1993, and 92 centres (5819 participants) in 2002. A structural equation modelling approach was used. The intervention was associated with improved microvascular outcomes through microalbuminuria and funduscopy screening, which had a direct effect on microvascular outcomes, and through attending 2-4 nurse visits and having ≥1 blood pressure measurement, which acted through reducing systolic blood pressure. The intervention was associated with improved macrovascular outcomes through blood pressure measurement and attending 2-4 nurse visits (through systolic blood pressure) and having ≥3 education topics, ≥1 HbA1c measurement and adequate medication (through HbA1c). Cholesterol measurement, weight measurement and foot examination did not contribute towards the effectiveness of the intervention. The pathways through which a continuous quality improvement programme appeared to act to reduce microvascular and macrovascular complications were driven by reductions in systolic blood pressure and HbA1c, which were attained through changes in nurse and education visits, measurement and medication. This suggests that these factors are potential areas on which future quality improvement programmes should focus. © 2014 John Wiley & Sons, Ltd.

  18. Stability improvement of a four cable-driven parallel manipulator using a center of mass balance system

    NASA Astrophysics Data System (ADS)

    Salafian, Iman; Stewart, Blake; Newman, Matthew; Zygielbaum, Arthur I.; Terry, Benjamin

    2017-04-01

    A four cable-driven parallel manipulator (CDPM), consisting of sophisticated spectrometers and imagers, is under development for use in acquiring phenotypic and environmental data over an acre-sized crop field. To obtain accurate and high quality data from the instruments, the end effector must be stable during sensing. One of the factors that reduces stability is the center of mass offset of the end effector, which can cause a pendulum effect or undesired tilt angle. The purpose of this work is to develop a system and method for balancing the center of mass of a 12th-scale CDPM to minimize vibration that can cause error in the acquired data. A simple method for balancing the end effector is needed to enable end users of the CDPM to arbitrarily add and remove sensors and imagers from the end effector as their experiments may require. A Center of Mass Balancing System (CMBS) is developed in this study which consists of an adjustable system of weights and a gimbal for tilt mitigation. An electronic circuit board including an orientation sensor, wireless data communication, and load cells was designed to validate the CMBS. To measure improvements gained by the CMBS, several static and dynamic experiments are carried out. In the experiments, the dynamic vibrations due to the translational motion and static orientation were measured with and without CMBS use. The results show that the CMBS system improves the stability of the end-effector by decreasing vibration and static tilt angle.

  19. The Role of Community-Driven Data Curation for Enterprises

    NASA Astrophysics Data System (ADS)

    Curry, Edward; Freitas, Andre; O'Riáin, Sean

    With increased utilization of data within their operational and strategic processes, enterprises need to ensure data quality and accuracy. Data curation is a process that can ensure the quality of data and its fitness for use. Traditional approaches to curation are struggling with increased data volumes and near-real-time demands for curated data. In response, curation teams have turned to community crowd-sourcing and semi-automated metadata tools for assistance. This chapter provides an overview of data curation, discusses the business motivations for curating data and investigates the role of community-based data curation, focusing on internal communities and pre-competitive data collaborations. The chapter is supported by case studies from Wikipedia, The New York Times, Thomson Reuters, Protein Data Bank and ChemSpider, upon which best practices for both social and technical aspects of community-driven data curation are described.

  20. A Middle School Principal's and Teachers' Perceptions of Leadership Practices in Data-Driven Decision Making

    ERIC Educational Resources Information Center

    Godreau Cimma, Kelly L.

    2011-01-01

    The purpose of this qualitative case study was to describe one Connecticut middle school's voluntary implementation of a data-driven decision making process in order to improve student academic performance. Data-driven decision making is a component of Connecticut's accountability system to assist schools in meeting the requirements of the No…

  1. A Viewpoint on Wearable Technology-Enabled Measurement of Wellbeing and Health-Related Quality of Life in Parkinson’s Disease

    PubMed Central

    van Uem, Janet M.T.; Isaacs, Tom; Lewin, Alan; Bresolin, Eros; Salkovic, Dina; Espay, Alberto J.; Matthews, Helen; Maetzler, Walter

    2016-01-01

    In this viewpoint, we discuss how several aspects of Parkinson’s disease (PD) – known to be correlated with wellbeing and health-related quality of life – could be measured using wearable devices (‘wearables’). Moreover, three people with PD (PwP) having exhaustive experience with using such devices write about their personal understanding of wellbeing and health-related quality of life, building a bridge between the true needs defined by PwP and the available methods of data collection. Rapidly evolving new technologies are producing wearables that probe function and behaviour in the domestic environments of people with chronic conditions such as PD and have the potential to serve their needs. Gathered data can serve to inform patient-driven management changes, enabling greater control by PwP and enhancing the likelihood of improvements in wellbeing and health-related quality of life. Data can also be used to quantify wellbeing and health-related quality of life. Additionally, these techniques can uncover novel, more sensitive and more ecologically valid disease-related endpoints. Active involvement of PwP in data collection and interpretation stands to provide personally and clinically meaningful endpoints and milestones to inform advances in research and the relevance of translational efforts in PD. PMID:27003779

  2. Preventive care quality of Medicare Accountable Care Organizations: Associations of organizational characteristics with performance

    PubMed Central

    Albright, Benjamin B.; Lewis, Valerie A.; Ross, Joseph S.; Colla, Carrie H.

    2015-01-01

    Background Accountable Care Organizations (ACOs) are a delivery and payment model aiming to coordinate care, control costs, and improve quality. Medicare ACOs are responsible for eight measures of preventive care quality. Objectives To create composite measures of preventive care quality and examine associations of ACO characteristics with performance. Design Cross-sectional study of Medicare Shared Savings Program and Pioneer participants. We linked quality performance to descriptive data from the National Survey of ACOs. We created composite measures using exploratory factor analysis, and used regression to assess associations with organizational characteristics. Results Of 252 eligible ACOs, 246 reported on preventive care quality, 177 of which completed the survey (response rate=72%). In their first year, ACOs lagged behind PPO performance on the majority of comparable measures. We identified two underlying factors among eight measures and created composites for each: disease prevention, driven by vaccines and cancer screenings, and wellness screening, driven by annual health screenings. Participation in the Advanced Payment Model, having fewer specialists, and having more Medicare ACO beneficiaries per primary care provider were associated with significantly better performance on both composites. Better performance on disease prevention was also associated with inclusion of a hospital, greater electronic health record capabilities, a larger primary care workforce, and fewer minority beneficiaries. Conclusions ACO preventive care quality performance is related to provider composition and benefitted by upfront investment. Vaccine and cancer screening quality performance is more dependent on organizational structure and characteristics than performance on annual wellness screenings, likely due to greater complexity in eligibility determination and service administration. PMID:26759974

  3. Preventive Care Quality of Medicare Accountable Care Organizations: Associations of Organizational Characteristics With Performance.

    PubMed

    Albright, Benjamin B; Lewis, Valerie A; Ross, Joseph S; Colla, Carrie H

    2016-03-01

    Accountable Care Organizations (ACOs) are a delivery and payment model aiming to coordinate care, control costs, and improve quality. Medicare ACOs are responsible for 8 measures of preventive care quality. To create composite measures of preventive care quality and examine associations of ACO characteristics with performance. This is a cross-sectional study of Medicare Shared Savings Program and Pioneer participants. We linked quality performance to descriptive data from the National Survey of ACOs. We created composite measures using exploratory factor analysis, and used regression to assess associations with organizational characteristics. Of 252 eligible ACOs, 246 reported on preventive care quality, 177 of which completed the survey (response rate=72%). In their first year, ACOs lagged behind PPO performance on the majority of comparable measures. We identified 2 underlying factors among 8 measures and created composites for each: disease prevention, driven by vaccines and cancer screenings, and wellness screening, driven by annual health screenings. Participation in the Advanced Payment Model, having fewer specialists, and having more Medicare ACO beneficiaries per primary care provider were associated with significantly better performance on both composites. Better performance on disease prevention was also associated with inclusion of a hospital, greater electronic health record capabilities, a larger primary care workforce, and fewer minority beneficiaries. ACO preventive care quality performance is related to provider composition and benefitted by upfront investment. Vaccine and cancer screening quality performance is more dependent on organizational structure and characteristics than performance on annual wellness screenings, likely due to greater complexity in eligibility determination and service administration.
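
    The composites in these two records were built with exploratory factor analysis; as a simplified stand-in (an equal-weight z-score composite rather than factor scores, on invented rates for three hypothetical ACOs), the construction looks roughly like:

```python
def zscores(xs):
    # Standardize one quality measure across ACOs (population SD).
    m = sum(xs) / len(xs)
    sd = (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5
    return [(x - m) / sd for x in xs]

def composite(measure_columns):
    """Equal-weight composite: mean of each ACO's standardized measures."""
    z_cols = [zscores(col) for col in measure_columns]
    n = len(measure_columns[0])
    return [sum(col[i] for col in z_cols) / len(z_cols) for i in range(n)]

# Hypothetical rates for three ACOs on two disease-prevention measures.
flu_vaccine = [0.60, 0.70, 0.80]
cancer_screen = [0.50, 0.55, 0.75]
disease_prevention = composite([flu_vaccine, cancer_screen])
```

    Factor analysis would instead weight each measure by its loading on the underlying factor, which is how the papers separate "disease prevention" from "wellness screening".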

  4. A statistical model for water quality predictions from a river discharge using coastal observations

    NASA Astrophysics Data System (ADS)

    Kim, S.; Terrill, E. J.

    2007-12-01

    Understanding and predicting coastal ocean water quality has benefits for reducing human health risks, protecting the environment, and improving local economies which depend on clean beaches. Continuous observations of coastal physical oceanography increase the understanding of the processes which control the fate and transport of a riverine plume which potentially contains high levels of contaminants from the upstream watershed. A data-driven model of the fate and transport of river plume water from the Tijuana River has been developed using surface current observations provided by a network of HF radar operated as part of a local coastal observatory that has been in place since 2002. The model outputs are compared with water quality sampling of shoreline indicator bacteria, and the skill of an alarm for low water quality is evaluated using the receiver operating characteristic (ROC) curve. In addition, statistical analysis of beach closures in comparison with environmental variables is also discussed.
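
    Evaluating such an alarm with an ROC curve amounts to ranking predicted risk against observed exceedances. A self-contained sketch computing the area under the curve via the rank (Mann-Whitney) formula, on made-up scores and labels (not the study's data):

```python
def roc_auc(scores, labels):
    """AUC via the rank formula; tied scores receive average ranks."""
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    ranks = [0.0] * len(scores)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and scores[order[j + 1]] == scores[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    pos = [r for r, lab in zip(ranks, labels) if lab == 1]
    n_pos, n_neg = len(pos), len(labels) - len(pos)
    return (sum(pos) - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Hypothetical alarm scores vs. observed low-water-quality days (1 = exceedance).
scores = [0.9, 0.8, 0.4, 0.7, 0.2, 0.1]
labels = [1,   1,   0,   1,   0,   0]
auc = roc_auc(scores, labels)
```

    AUC of 0.5 means the alarm is no better than chance; values approaching 1.0 indicate the plume model reliably ranks contaminated days above clean ones.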

  5. The association of Nursing Home Compare quality measures with market competition and occupancy rates.

    PubMed

    Castle, Nicholas G; Liu, Darren; Engberg, John

    2008-01-01

    Since 2002, the Centers for Medicare and Medicaid Services have reported quality measures on the Nursing Home Compare Web site. It has been assumed that nursing homes are able to make improvements on these measures. In this study researchers examined nursing homes to see whether they have improved their quality scores, after accounting for regression to the mean. Researchers also examined whether gains varied according to market competition or market occupancy rates. They identified some regression to the mean for the quality measure scores over time; nevertheless, they also determined that some nursing homes had indeed made small improvements in their quality measure scores. As would be predicted based on the market-driven mechanism underlying quality improvements using report cards, the greatest improvements occurred in the most competitive markets and in those with the lowest average occupancy rates. As policies to promote more competition in long-term care proceed, further reducing occupancy rates, additional, albeit small, quality gains will likely be made in the future.

  6. An evaluation of data-driven motion estimation in comparison to the usage of external-surrogates in cardiac SPECT imaging

    PubMed Central

    Mukherjee, Joyeeta Mitra; Hutton, Brian F; Johnson, Karen L; Pretorius, P Hendrik; King, Michael A

    2014-01-01

    Motion estimation methods in single photon emission computed tomography (SPECT) can be classified into methods which depend on just the emission data (data-driven), or those that use some other source of information such as an external surrogate. The surrogate-based methods estimate the motion exhibited externally, which may not correlate exactly with the movement of organs inside the body. The accuracy of data-driven strategies, on the other hand, is affected by the type and timing of motion occurrence during acquisition, the source distribution, and various degrading factors such as attenuation, scatter, and system spatial resolution. The goal of this paper is to investigate the performance of two data-driven motion estimation schemes based on the rigid-body registration of projections of motion-transformed source distributions to the acquired projection data for cardiac SPECT studies. Comparison is also made of six intensity-based registration metrics to an external surrogate-based method. In the data-driven schemes, a partially reconstructed heart is used as the initial source distribution. The partially-reconstructed heart has inaccuracies due to limited-angle artifacts resulting from using only a part of the SPECT projections acquired while the patient maintained the same pose. The performance of different cost functions in quantifying consistency with the SPECT projection data in the data-driven schemes was compared for clinically realistic patient motion occurring as discrete pose changes, one or two times during acquisition. The six intensity-based metrics studied were mean-squared difference (MSD), mutual information (MI), normalized mutual information (NMI), pattern intensity (PI), normalized cross-correlation (NCC) and entropy of the difference (EDI).
    Quantitative and qualitative analysis of the performance is reported using Monte-Carlo simulations of a realistic heart phantom including degradation factors such as attenuation, scatter and system spatial resolution. Further, the visual appearance of motion-corrected images using data-driven motion estimates was compared to that obtained using the external motion-tracking system in patient studies. Pattern intensity and normalized mutual information cost functions were observed to have the best performance in terms of lowest average position error and stability with degradation of image quality of the partial reconstruction in simulations. In all patients, the visual quality of PI-based estimation was either significantly better or comparable to NMI-based estimation. Best visual quality was obtained with PI-based estimation in 1 of the 5 patient studies, and with external-surrogate based correction in 3 out of 5 patients. In the remaining patient study there was little motion and all methods yielded similar visual image quality. PMID:24107647
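
    Two of the simpler metrics in this comparison have closed forms that can be written down directly. A minimal sketch of mean-squared difference (MSD) and normalized cross-correlation (NCC) on hypothetical flattened image patches (illustrative values, not SPECT data):

```python
import math

def msd(a, b):
    """Mean-squared difference: 0 for identical images; lower is better."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def ncc(a, b):
    """Normalized cross-correlation: 1.0 for linearly related images."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    da = [x - ma for x in a]
    db = [y - mb for y in b]
    num = sum(x * y for x, y in zip(da, db))
    den = math.sqrt(sum(x * x for x in da) * sum(y * y for y in db))
    return num / den

# Hypothetical flattened projection patches.
ref = [1.0, 2.0, 4.0, 3.0]
shifted = [2.0, 4.0, 8.0, 6.0]   # same pattern, doubled intensity
```

    The example illustrates why NCC is preferred when intensities drift: doubling the brightness ruins MSD but leaves NCC at 1.0. The information-theoretic metrics (MI, NMI, PI, EDI) need joint histograms and are omitted here.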

  7. Quality: performance improvement, teamwork, information technology and protocols.

    PubMed

    Coleman, Nana E; Pon, Steven

    2013-04-01

    Using the Institute of Medicine framework that outlines the domains of quality, this article considers four key aspects of health care delivery which have the potential to significantly affect the quality of health care within the pediatric intensive care unit. The discussion covers: performance improvement and how existing methods for reporting, review, and analysis of medical error relate to patient care; team composition and workflow; and the impact of information technologies on clinical practice. Also considered is how protocol-driven and standardized practice affects both patients and the fiscal interests of the health care system.

  8. Improved RF Measurements of SRF Cavity Quality Factors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holzbauer, J. P.; Contreras, C.; Pischalnikov, Y.

    SRF cavity quality factors can be accurately measured using RF-power-based techniques only when the cavity is very close to critically coupled. This limitation stems from systematic errors introduced by non-ideal RF components. When the cavity is not close to critically coupled, these systematic effects limit the accuracy of the measurements. Combining the complex base-band envelopes of the cavity RF signals with a trombone in the circuit allows the relative calibration of the RF signals to be extracted from the data and systematic effects to be characterized and suppressed. The improved calibration allows accurate measurements to be made over a much wider range of couplings. Demonstration of these techniques during testing of a single-spoke resonator with a coupling factor near 7 will be presented, along with recommendations for application of these techniques.

  9. Medicine and democracy: The importance of institutional quality in the relationship between health expenditure and health outcomes in the MENA region.

    PubMed

    Bousmah, Marwân-Al-Qays; Ventelou, Bruno; Abu-Zaineh, Mohammad

    2016-08-01

    Evidence suggests that the effect of health expenditure on health outcomes is highly context-specific and may be driven by other factors. We construct a panel dataset of 18 countries from the Middle East and North Africa region for the period 1995-2012. Panel data models are used to estimate the macro-level determinants of health outcomes. The core finding of the paper is that increasing health expenditure leads to improvements in health outcomes only to the extent that the quality of institutions within a country is sufficiently high. The sensitivity of the results is assessed using various measures of health outcomes as well as institutional variables. Overall, it appears that increasing health care expenditure in the MENA region is a necessary but not sufficient condition for improvements in health outcomes. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  10. Association between quality of care and complications after abdominal surgery.

    PubMed

    Bergman, Simon; Deban, Melina; Martelli, Vanessa; Monette, Michèle; Sourial, Nadia; Hamadani, Fadi; Teasdale, Debby; Holcroft, Christina; Zakrzewski, Helena; Fraser, Shannon

    2014-09-01

    Measuring the quality of surgical care is essential to identifying areas of weakness in the delivery of effective surgical care and to improving patient outcomes. Our objectives were to (1) assess the quality of surgical care delivered to adult patients; and (2) determine the association between quality of surgical care and postoperative complications. This retrospective, pilot, cohort study was conducted at a single university-affiliated institution. Using the institution's National Surgical Quality Improvement Program database (2009-2010), 273 consecutive patients ≥18 years of age who underwent elective major abdominal operations were selected. Adherence to 10 process-based quality indicators (QIs) was measured and quantified by calculating a patient quality score (no. of QIs passed/no. of QIs eligible). A pass rate for each individual QI was also calculated. The association between quality of surgical care and postoperative complications was assessed using an incidence rate ratio, which was estimated from a Poisson regression. The mean overall patient quality score was 67.2 ± 14.4% (range, 25-100%). The mean QI pass rate was 65.9 ± 26.1%, which varied widely from 9.6% (oral intake documentation) to 95.6% (prophylactic antibiotics). Poisson regression revealed that as the quality score increased, the incidence of postoperative complications decreased (incidence rate ratio, 0.19; P = .011). A sensitivity analysis revealed that this association was likely driven by the postoperative ambulation QI. Higher quality scores, mainly driven by early ambulation, were associated with fewer postoperative complications. QIs with unacceptably low adherence were identified as targets for future quality improvement initiatives. Copyright © 2014 Mosby, Inc. All rights reserved.

  11. Discharge, water temperature, and selected meteorological data for Vancouver Lake, Vancouver, Washington, water years 2011-13

    USGS Publications Warehouse

    Foreman, James R.; Marshall, Cameron A.; Sheibley, Rich W.

    2014-01-01

    The U.S. Geological Survey partnered with the Vancouver Lake Watershed Partnership in a 2-year intensive study to quantify the movement of water and nutrients through Vancouver Lake in Vancouver, Washington. This report is intended to assist the Vancouver Lake Watershed Partnership in evaluating potential courses of action to mitigate seasonally driven blooms of harmful cyanobacteria and to improve overall water quality of the lake. This report contains stream discharge, lake water temperature, and selected meteorological data for water years 2011, 2012, and 2013 that were used to develop the water and nutrient budgets for the lake.

  12. Benchmarking: your performance measurement and improvement tool.

    PubMed

    Senn, G F

    2000-01-01

    Many respected professional healthcare organizations and societies today are seeking to establish data-driven performance measurement strategies such as benchmarking. Clinicians are, however, resistant to "benchmarking" that is based on financial data alone, concerned that it may be adverse to patients' best interests. Benchmarking of clinical procedures that uses physician codes such as Current Procedural Terminology (CPT) codes has greater credibility with practitioners. Better Performers, organizations that can perform procedures successfully at lower cost and in less time, become the "benchmark" against which other organizations can measure themselves. The Better Performers' strategies can be adopted by other facilities to save time or money while maintaining quality patient care.

  13. Total quality management - It works for aerospace information services

    NASA Technical Reports Server (NTRS)

    Erwin, James; Eberline, Carl; Colquitt, Wanda

    1993-01-01

    Today we are in the midst of information and 'total quality' revolutions. At the NASA STI Program's Center for AeroSpace Information (CASI), we are focused on using continuous improvement techniques to enrich today's services and products and to ensure that tomorrow's technology supports the TQM-based improvement of future STI program products and services. The Continuous Improvements Program at CASI is the foundation for Total Quality Management in products and services. The focus is customer-driven; its goal is to identify processes and procedures that can be improved and new technologies that can be integrated with the processes to gain efficiencies, provide effectiveness, and promote customer satisfaction. This Program seeks to establish quality through an iterative defect-prevention approach that is based on the incorporation of standards and measurements into the processing cycle.

  14. Role of Science at EPA

    EPA Pesticide Factsheets

    Decision-making is driven by research with the highest standards for integrity, peer review, transparency, and ethics. Ongoing positive impacts include reducing pollution, improving air quality, defining exposure pathways, and protecting water sources.

  15. Clinical practice guideline development manual: a quality-driven approach for translating evidence into action.

    PubMed

    Rosenfeld, Richard M; Shiffman, Richard N

    2009-06-01

    Guidelines translate best evidence into best practice. A well-crafted guideline promotes quality by reducing health-care variations, improving diagnostic accuracy, promoting effective therapy, and discouraging ineffective-or potentially harmful-interventions. Despite a plethora of published guidelines, methodology is often poorly defined and varies greatly within and among organizations. This manual describes the principles and practices used successfully by the American Academy of Otolaryngology-Head and Neck Surgery to produce quality-driven, evidence-based guidelines using efficient and transparent methodology for action-ready recommendations with multidisciplinary applicability. The development process, which allows moving from conception to completion in 12 months, emphasizes a logical sequence of key action statements supported by amplifying text, evidence profiles, and recommendation grades that link action to evidence. As clinical practice guidelines become more prominent as a key metric of quality health care, organizations must develop efficient production strategies that balance rigor and pragmatism. Equally important, clinicians must become savvy in understanding what guidelines are-and are not-and how they are best utilized to improve care. The information in this manual should help clinicians and organizations achieve these goals.

  16. Rural vs urban hospital performance in a 'competitive' public health service.

    PubMed

    Garcia-Lacalle, Javier; Martin, Emilio

    2010-09-01

    In some western countries, market-driven reforms to improve efficiency and quality have harmed the performance of some hospitals, occasionally leading to their closure, mostly in rural areas. This paper seeks to explore whether these reforms affect urban and rural hospitals differently in a European health service. Rural and urban hospital performance is compared taking into account their efficiency and perceived quality. The study is focused on the Andalusian Health Service (SAS) in Spain, which has implemented a freedom-of-hospital-choice policy and a reimbursement system based on hospital performance. Data Envelopment Analysis, the Mann-Whitney U test, and Multidimensional Scaling techniques are applied to data for two years, 2003 and 2006. The results show that rural and urban hospitals perform similarly in the efficiency dimension, whereas rural hospitals perform significantly better than urban hospitals in the patient satisfaction dimension. When the two dimensions are considered jointly, some rural hospitals are found to be the best performers. As such, market-driven reforms do not necessarily result in a difference in the performance of rural and urban hospitals. Copyright 2010 Elsevier Ltd. All rights reserved.
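    The Mann-Whitney U test mentioned in the record above is available, for example, in SciPy. The per-hospital satisfaction scores below are invented for illustration and are not the SAS study's data.

```python
from scipy.stats import mannwhitneyu

# Invented patient-satisfaction scores for six rural and six urban
# hospitals (not the study's data).
rural = [78, 82, 85, 88, 90, 84]
urban = [70, 75, 72, 80, 77, 74]

# Two-sided nonparametric test of whether the two samples are drawn
# from the same distribution; no normality assumption is needed.
stat, p_value = mannwhitneyu(rural, urban, alternative="two-sided")
print(stat, p_value)
```

With samples this small and no ties, SciPy computes the exact null distribution, which suits the modest hospital counts a regional health service comparison would involve.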

  17. Seamless service: maintaining momentum.

    PubMed

    Grinstead, N; Timoney, R

    1994-01-01

    Describes the process used by the Mater Infirmorum Hospital in Belfast in 1992-1994 to achieve high-quality care (Seamless Service), motivate staff to deliver it, and measure performance. Aims of the project included focusing the organization on the customer and improving teamwork and motivation at all levels. After comprehensive data collection from GPs, patients, and staff, management forums developed a full TQM strategy to gain support and maintain momentum, including innovative staff events (every staff member was given the opportunity to attend) in which multilevel, multidisciplinary workshops enabled staff to design customer care standards, develop teams, and lead customer-driven change.

  18. Optical probing of high intensity laser interaction with micron-sized cryogenic hydrogen jets

    NASA Astrophysics Data System (ADS)

    Ziegler, Tim; Rehwald, Martin; Obst, Lieselotte; Bernert, Constantin; Brack, Florian-Emanuel; Curry, Chandra B.; Gauthier, Maxence; Glenzer, Siegfried H.; Göde, Sebastian; Kazak, Lev; Kraft, Stephan D.; Kuntzsch, Michael; Loeser, Markus; Metzkes-Ng, Josefine; Rödel, Christian; Schlenvoigt, Hans-Peter; Schramm, Ulrich; Siebold, Mathias; Tiggesbäumker, Josef; Wolter, Steffen; Zeil, Karl

    2018-07-01

    Probing the rapid dynamics of plasma evolution in laser-driven plasma interactions provides deeper understanding of experiments in the context of laser-driven ion acceleration and facilitates the interplay with complementing numerical investigations. Besides the microscopic scales involved, strong plasma (self-)emission, predominantly around the harmonics of the driver laser, often complicates the data analysis. We present the concept and the implementation of a stand-alone probe laser system that is temporally synchronized to the driver laser, providing probing wavelengths beyond the harmonics of the driver laser. The capability of this system is shown during a full-scale laser proton acceleration experiment using renewable cryogenic hydrogen jet targets. For further improvements, we studied the influence of probe color, observation angle of the probe and temporal contrast of the driver laser on the probe image quality.

  19. Mass imbalances in EPANET water-quality simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Michael J.; Janke, Robert; Taxon, Thomas N.

    EPANET is widely employed to simulate water quality in water distribution systems. However, the time-driven simulation approach used to determine concentrations of water-quality constituents provides accurate results, in general, only for small water-quality time steps; use of an adequately short time step may not be feasible. Overly long time steps can yield errors in concentrations and result in situations in which constituent mass is not conserved. Mass may not be conserved even when EPANET gives no errors or warnings. This paper explains how such imbalances can occur and provides examples of such cases; it also presents a preliminary event-driven approach that conserves mass with a water-quality time step that is as long as the hydraulic time step. Results obtained using the current approach converge, or tend to converge, to those obtained using the new approach as the water-quality time step decreases. Improving the water-quality routing algorithm used in EPANET could eliminate mass imbalances and related errors in estimated concentrations.
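    The failure mode described above can be seen in a toy model: push a short constituent pulse through a plug-flow pipe and integrate the outlet concentration at fixed water-quality time steps. This is an illustration of time-step-driven mass imbalance in general, not EPANET's actual routing algorithm.

```python
# Plug-flow pipe: outlet concentration equals inlet concentration delayed
# by the travel time. A 1-second pulse arrives at the outlet during [10, 11) s.
TRAVEL_TIME = 10.0                 # seconds
PULSE_START, PULSE_END = 0.0, 1.0  # seconds
CONC, FLOW = 5.0, 2.0              # mg/L and L/s
injected_mass = CONC * FLOW * (PULSE_END - PULSE_START)  # 10 mg

def outlet_conc(t):
    return CONC if PULSE_START + TRAVEL_TIME <= t < PULSE_END + TRAVEL_TIME else 0.0

def time_driven_mass(dt, t_end=20.0):
    """Rectangle-rule mass integration at a fixed water-quality time step."""
    mass, t = 0.0, 0.0
    while t < t_end:
        mass += outlet_conc(t) * FLOW * dt
        t += dt
    return mass

# Event-driven accounting: credit the parcel's mass exactly when it arrives.
event_driven_mass = injected_mass

print(time_driven_mass(dt=4.0))   # 0.0  - coarse step misses the pulse entirely
print(time_driven_mass(dt=0.25))  # 10.0 - fine step recovers the injected mass
print(event_driven_mass)
```

A water-quality step longer than the pulse's residence at the outlet can step right over the constituent, so the integrated outlet mass no longer matches the injected mass, which is the kind of silent imbalance the paper reports.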

  20. Community Participation in International Development Education Quality Improvement Efforts: Current Paradoxes and Opportunities

    ERIC Educational Resources Information Center

    Kendall, Nancy; Kaunda, Zikani; Friedson-Rideneur, Sophia

    2015-01-01

    International development organizations increasingly use "participatory development" approaches to improve the effectiveness of their programs. Participatory frameworks are commonly limited in scope and funder-driven; these top-down approaches to participation have proven to be both ineffective and, at times, contradictory in their…

  1. Understanding Leadership Paradigms for Improvement in Higher Education

    ERIC Educational Resources Information Center

    Flumerfelt, Shannon; Banachowski, Michael

    2011-01-01

    Purpose: This research article is based on the Baldrige National Quality Program Education Criteria for Performance Excellence's conceptualization of improvement as a dual-cycle/three-element initiative of examining and bettering inputs, processes, and outputs, as driven by measurement, analysis, and knowledge management work. This study isolates a…

  2. A Data-Driven Solution for Performance Improvement

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Marketed as the "Software of the Future," Optimal Engineering Systems P.I. EXPERT(TM) technology offers statistical process control and optimization techniques that are critical to businesses looking to restructure or accelerate operations in order to gain a competitive edge. Kennedy Space Center granted Optimal Engineering Systems the funding and aid necessary to develop a prototype of the process monitoring and improvement software. Completion of this prototype demonstrated that it was possible to integrate traditional statistical quality assurance tools with robust optimization techniques in a user-friendly format that is visually compelling. Using an expert system knowledge base, the software allows the user to determine objectives, capture constraints and out-of-control processes, predict results, and compute optimal process settings.

  3. [Potentials of cooperative quality management initiatives: BQS Institute projects, January 2010 - July 2013].

    PubMed

    Veit, Christof; Bungard, Sven; Hertle, Dagmar; Grothaus, Franz-Josef; Kötting, Joachim; Arnold, Nicolai

    2013-01-01

    Alongside the projects of internal quality management and mandatory quality assurance, there is a variety of quality-driven projects across institutions, initiated and run by various partners, to continuously improve the quality of care. The multiplicity and characteristics of these projects are discussed on the basis of projects run by the BQS Institute between 2010 and 2013. In addition, useful interactions and linking with mandatory quality benchmarking and with internal quality management are discussed. (As supplied by publisher). Copyright © 2013. Published by Elsevier GmbH.

  4. Total quality in acute care hospitals: guidelines for hospital managers.

    PubMed

    Holthof, B

    1991-08-01

    Quality improvement can not focus exclusively on peer review and the scientific evaluation of medical care processes. These essential elements have to be complemented with a focus on individual patient needs and preferences. Only then will hospitals create the competitive advantage needed to survive in an increasingly market-driven hospital industry. Hospital managers can identify these patients' needs by 'living the patient experience' and should then set the hospital's quality objectives according to its target patients and their needs. Excellent quality program design, however, is not sufficient. Successful implementation of a quality improvement program further requires fundamental changes in pivotal jobholders' behavior and mindset and in the supporting organizational design elements.

  5. The Environmental Data Initiative: A broad-use data repository for environmental and ecological data that strives to balance data quality and ease of submission

    NASA Astrophysics Data System (ADS)

    Servilla, M. S.; Brunt, J.; Costa, D.; Gries, C.; Grossman-Clarke, S.; Hanson, P. C.; O'Brien, M.; Smith, C.; Vanderbilt, K.; Waide, R.

    2017-12-01

    In the world of data repositories, there seems to be a never-ending struggle between the generation of high-quality data documentation and the ease of archiving a data product in a repository - the higher the documentation standards, the greater the effort required by the scientist, and the less likely the data will be archived. The Environmental Data Initiative (EDI) attempts to balance the rigor of data documentation against the amount of effort required by a scientist to upload and archive data. As an outgrowth of the LTER Network Information System, the EDI is funded by the US NSF Division of Environmental Biology to support the LTER, LTREB, OBFS, and MSB programs, in addition to providing an open data archive for environmental scientists without a viable archive. EDI uses the PASTA repository software, developed originally by the LTER. PASTA is metadata-driven and documents data with the Ecological Metadata Language (EML), a high-fidelity standard that can describe all types of data in great detail. PASTA incorporates a series of data quality tests to ensure that data are correctly documented with EML, in a process termed "metadata and data congruence"; incongruent data packages are not admitted to the repository. EDI reduces the burden of data documentation on scientists in two ways. First, EDI provides hands-on assistance in data documentation best practices, with tools for generating EML written in R and, under development, in Python. These tools hide the details of EML generation and syntax by providing a more natural and contextual setting for describing data. Second, EDI works closely with community information managers in defining the rules used in PASTA quality tests. Rules deemed too strict can be turned off completely or set to issue only a warning while the community learns how best to handle the situation and improve its documentation practices. Rules can also be added or refined over time to improve the overall quality of archived data.
    The outcomes of the quality tests are stored as part of the data archive in PASTA and are accessible to all users of the EDI data repository. In summary, EDI's metadata support for scientists and its comprehensive set of data quality tests for metadata-data congruence provide an ideal archive for environmental and ecological data.
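    A minimal example of the kind of metadata-data congruence test described above: verify that the columns declared in the metadata actually appear in the data file's header, and vice versa. The dictionary and column names here are hypothetical stand-ins for attributes parsed from an EML document; this is not the PASTA check suite.

```python
import csv
import io

# Stand-in for attribute names parsed from an EML document (hypothetical).
declared_columns = ["site_id", "sample_date", "chlorophyll_ugL"]

# Stand-in for the archived data file.
data_file = io.StringIO(
    "site_id,sample_date,chlorophyll_ugL\n"
    "L01,2017-06-01,3.2\n"
)

header = next(csv.reader(data_file))

# A congruence check flags columns declared but absent, and present but undeclared.
missing = [c for c in declared_columns if c not in header]
undeclared = [c for c in header if c not in declared_columns]

if missing or undeclared:
    print("incongruent:", missing, undeclared)
else:
    print("metadata and data are congruent")
```

Real congruence tests go further (declared types, units, code lists, record counts), but each follows this same pattern of comparing a metadata assertion against the data itself.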

  6. Data collection and compilation for a geodatabase of groundwater, surface-water, water-quality, geophysical, and geologic data, Pecos County Region, Texas, 1930-2011

    USGS Publications Warehouse

    Pearson, Daniel K.; Bumgarner, Johnathan R.; Houston, Natalie A.; Stanton, Gregory P.; Teeple, Andrew; Thomas, Jonathan V.

    2012-01-01

    The U.S. Geological Survey, in cooperation with Middle Pecos Groundwater Conservation District, Pecos County, City of Fort Stockton, Brewster County, and Pecos County Water Control and Improvement District No. 1, compiled groundwater, surface-water, water-quality, geophysical, and geologic data for site locations in the Pecos County region, Texas, and developed a geodatabase to facilitate use of this information. Data were compiled for an approximately 4,700-square-mile area of the Pecos County region, Texas. The geodatabase contains data from 8,242 sampling locations; it was designed to organize and store field-collected geochemical and geophysical data, as well as digital database resources from the U.S. Geological Survey, Middle Pecos Groundwater Conservation District, Texas Water Development Board, Texas Commission on Environmental Quality, and numerous other State and local databases. The geodatabase combines these disparate database resources into a simple data model. Site locations are geospatially enabled and stored in a geodatabase feature class for cartographic visualization and spatial analysis within a Geographic Information System. The sampling locations are related to hydrogeologic information through the use of geodatabase relationship classes. The geodatabase relationship classes provide the ability to perform complex spatial and data-driven queries to explore data stored in the geodatabase.

  7. Applying a Theory-Driven Framework to Guide Quality Improvement Efforts in Nursing Homes: The LOCK Model.

    PubMed

    Mills, Whitney L; Pimentel, Camilla B; Palmer, Jennifer A; Snow, A Lynn; Wewiorski, Nancy J; Allen, Rebecca S; Hartmann, Christine W

    2018-05-08

    Implementing quality improvement (QI) programs in nursing homes continues to encounter significant challenges, despite recognized need. QI approaches provide nursing home staff with opportunities to collaborate on developing and testing strategies for improving care delivery. We present a theory-driven and user-friendly adaptable framework and facilitation package to overcome existing challenges and guide QI efforts in nursing homes. The framework is grounded in the foundational concepts of strengths-based learning, observation, relationship-based teams, efficiency, and organizational learning. We adapted these concepts to QI in the nursing home setting, creating the "LOCK" framework. The LOCK framework is currently being disseminated across the Veterans Health Administration. The LOCK framework has five tenets: (a) Look for the bright spots, (b) Observe, (c) Collaborate in huddles, (d) Keep it bite-sized, and (e) facilitation. Each tenet is described. We also present a case study documenting how a fictional nursing home can implement the LOCK framework as part of a QI effort to improve engagement between staff and residents. The case study describes sample observations, processes, and outcomes. We also discuss practical applications for nursing home staff, the adaptability of LOCK for different QI projects, the specific role of facilitation, and lessons learned. The proposed framework complements national efforts to improve quality of care and quality of life for nursing home residents and may be valuable across long-term care settings and QI project types.

  8. TH-EF-BRA-03: Assessment of Data-Driven Respiratory Motion-Compensation Methods for 4D-CBCT Image Registration and Reconstruction Using Clinical Datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riblett, MJ; Weiss, E; Hugo, GD

    Purpose: To evaluate the performance of a 4D-CBCT registration and reconstruction method that corrects for respiratory motion and enhances image quality under clinically relevant conditions. Methods: Building on previous work, which tested feasibility of a motion-compensation workflow using image datasets superior to clinical acquisitions, this study assesses workflow performance under clinical conditions in terms of image quality improvement. Evaluated workflows utilized a combination of groupwise deformable image registration (DIR) and image reconstruction. Four-dimensional cone beam CT (4D-CBCT) FDK reconstructions were registered to either mean or respiratory phase reference frame images to model respiratory motion. The resulting 4D transformation was used to deform projection data during the FDK backprojection operation to create a motion-compensated reconstruction. To simulate clinically realistic conditions, superior quality projection datasets were sampled using a phase-binned striding method. Tissue interface sharpness (TIS) was defined as the slope of a sigmoid curve fit to the lung-diaphragm boundary or to the carina tissue-airway boundary when no diaphragm was discernable. Image quality improvement was assessed in 19 clinical cases by evaluating mitigation of view-aliasing artifacts, tissue interface sharpness recovery, and noise reduction. Results: For clinical datasets, evaluated average TIS recovery relative to base 4D-CBCT reconstructions was observed to be 87% using fixed-frame registration alone; 87% using fixed-frame with motion-compensated reconstruction; 92% using mean-frame registration alone; and 90% using mean-frame with motion-compensated reconstruction. Soft tissue noise was reduced on average by 43% and 44% for the fixed-frame registration and registration with motion-compensation methods, respectively, and by 40% and 42% for the corresponding mean-frame methods.
Considerable reductions in view aliasing artifacts were observed for each method. Conclusion: Data-driven groupwise registration and motion-compensated reconstruction have the potential to improve the quality of 4D-CBCT images acquired under clinical conditions. For clinical image datasets, the addition of motion compensation after groupwise registration visibly reduced artifact impact. This work was supported by the National Cancer Institute of the National Institutes of Health under Award Number R01CA166119. Hugo and Weiss hold a research agreement with Philips Healthcare and license agreement with Varian Medical Systems. Weiss receives royalties from UpToDate. Christensen receives funds from Roger Koch to support research.
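    Tissue interface sharpness as defined in the record above (the slope of a sigmoid fit to an intensity profile across the boundary) can be estimated with a logistic fit, for example via SciPy. The profile below is synthetic, and the four-parameter logistic used here is one common parameterization, not necessarily the authors' exact model.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, base, height, k, x0):
    """Sigmoid model of an intensity profile across a tissue interface."""
    return base + height / (1.0 + np.exp(-k * (x - x0)))

# Synthetic 1-D intensity profile across a lung-diaphragm-like boundary
# (positions and intensities in arbitrary units).
x = np.linspace(-10.0, 10.0, 201)
true_k = 1.5
y = logistic(x, 20.0, 100.0, true_k, 0.0)

# Fit the sigmoid; the steepness parameter k serves as the sharpness measure.
popt, _ = curve_fit(logistic, x, y, p0=[0.0, 80.0, 1.0, 1.0])
base, height, k, x0 = popt
print(round(k, 3))
```

Comparing the fitted slope before and after motion compensation, as the study does, then quantifies how much interface sharpness the correction recovers.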

  9. Variability in physical and biological exchange among coastal wetlands and their adjacent Great Lakes

    EPA Science Inventory

    Hydrology is a major governor of physically-driven exchange among coastal wetlands and the adjacent Great Lake, whereas fish movement is a major governor of biologically-driven exchange. We use data describing coastal wetland morphology, hydrology, water quality, and fish tissue...

  10. Redesigning inpatient care: Testing the effectiveness of an accountable care team model.

    PubMed

    Kara, Areeba; Johnson, Cynthia S; Nicley, Amy; Niemeier, Michael R; Hui, Siu L

    2015-12-01

    US healthcare underperforms on quality and safety metrics. Inpatient care constitutes an immense opportunity to intervene to improve care. Our objective was to describe a model of inpatient care and measure its impact through a quantitative assessment of the implementation of the new model. The graded implementation of the model allowed us to follow outcomes and measure their association with the dose of the implementation. The setting was inpatient medical and surgical units in a large academic health center. Eight interventions rooted in improving interprofessional collaboration (IPC), enabling data-driven decisions, and providing leadership were implemented. Outcome data from August 2012 to December 2013 were analyzed using generalized linear mixed models for associations with the implementation of the model. Length of stay (LOS) index, case-mix index-adjusted variable direct costs (CMI-adjusted VDC), 30-day readmission rates, overall patient satisfaction scores, and provider satisfaction with the model were measured. The implementation of the model was associated with decreases in LOS index (P < 0.0001) and CMI-adjusted VDC (P = 0.0006). We did not detect improvements in readmission rates or patient satisfaction scores. Most providers (95.8%, n = 92) agreed that the model had improved the quality and safety of the care delivered. Creating an environment and framework in which IPC is fostered, performance data are transparently available, and leadership is provided may improve value on both medical and surgical units. These interventions appear to be well accepted by front-line staff. Readmission rates and patient satisfaction remain challenging. © 2015 Society of Hospital Medicine.

  11. Guide to a Student-Family-School-Community Partnership: Using a Student & Data Driven Process to Improve School Environments & Promote Student Success

    ERIC Educational Resources Information Center

    Burgoa, Carol; Izu, Jo Ann

    2010-01-01

    This guide presents a data-driven, research-based process--referred to as the "school-community forum process"--for increasing youth voice, promoting resilience, strengthening adult-youth connections, and ultimately, for improving schools. It uses a "student listening circle"--a special type of focus group involving eight to…

  12. The Australian Higher Education Quality Assurance Framework: From Improvement-Led to Compliance-Driven

    ERIC Educational Resources Information Center

    Shah, Mahsood; Jarzabkowski, Lucy

    2013-01-01

    The Australian government initiated a review of higher education in 2008. One of the outcomes of the review was the formation of a national regulator, the Tertiary Education Quality and Standards Agency (TEQSA), with responsibilities to: register all higher education providers, accredit the courses of the non self-accrediting providers, assure…

  13. A Correlational Analysis: Electronic Health Records (EHR) and Quality of Care in Critical Access Hospitals

    ERIC Educational Resources Information Center

    Khan, Arshia A.

    2012-01-01

    Driven by the compulsion to improve the evident paucity in quality of care, especially in critical access hospitals in the United States, policy makers, healthcare providers, and administrators have taken the advice of researchers suggesting the integration of technology in healthcare. The Electronic Health Record (EHR) System composed of multiple…

  14. Development of a Reference Information Model and Knowledgebase for Electronic Bloodstream Infection Detection

    PubMed Central

    Borlawsky, Tara; Hota, Bala; Lin, Michael Y.; Khan, Yosef; Young, Jeremy; Santangelo, Jennifer; Stevenson, Kurt B.

    2008-01-01

    The most prevalent hospital-acquired infections in the United States are bloodstream infections (BSIs) associated with the presence of a central venous catheter. There is currently a movement, including national organizations such as the Centers for Medicare and Medicaid Services as well as consumer, quality improvement and patient safety groups, encouraging the standardization of reporting and aggregation of such nosocomial infection data to increase and improve reporting, and enable rate comparisons among healthcare institutions. Domain modeling is a well-known method for designing interoperable processes that take advantage of existing data and legacy systems. We have combined such a model-driven design approach with the use of partitioned clinical and business logic knowledgebases in order to employ a previously validated electronic BSI surveillance algorithm in the context of a multi-center study. PMID:18999213

  15. Social Capital in Data-Driven Community College Reform

    ERIC Educational Resources Information Center

    Kerrigan, Monica Reid

    2015-01-01

    The current rhetoric around using data to improve community college student outcomes with only limited research on data-driven decision-making (DDDM) within postsecondary education compels a more comprehensive understanding of colleges' capacity for using data to inform decisions. Based on an analysis of faculty and administrators' perceptions and…

  16. Improving opioid safety practices in primary care: protocol for the development and evaluation of a multifaceted, theory-informed pilot intervention for healthcare providers

    PubMed Central

    Leece, Pamela; Buchman, Daniel Z; Hamilton, Michael; Timmings, Caitlyn; Shantharam, Yalnee; Moore, Julia; Furlan, Andrea D

    2017-01-01

    Introduction In North America, drug overdose deaths are reaching unprecedented levels, largely driven by increasing prescription opioid-related deaths. Despite the development of several opioid guidelines, prescribing behaviours still contribute to poor patient outcomes and societal harm. Factors at the provider and system level may hinder or facilitate the application of evidence-based guidelines; interventions designed to address such factors are needed. Methods and analysis Using implementation science and behaviour change theory, we have planned the development and evaluation of a comprehensive Opioid Self-Assessment Package, designed to increase adherence to the Canadian Opioid Guideline among family physicians. The intervention uses practical educational and self-assessment tools to provide prescribers with feedback on their current knowledge and practices, and resources to improve their practice. The evaluation approach uses a pretest and post-test design and includes both quantitative and qualitative methods at baseline and 6 months. We will recruit a purposive sample of approximately 10 family physicians in Ontario from diverse practice settings, who currently treat patients with long-term opioid therapy for chronic pain. Quantitative data will be analysed using basic descriptive statistics, and qualitative data will be analysed using the Framework Method. Ethics and dissemination The University Health Network Research Ethics Board approved this study. Dissemination plan includes publications, conference presentations and brief stakeholder reports. This evidence-informed, theory-driven intervention has implications for national application of opioid quality improvement tools in primary care settings. We are engaging experts and end users in advisory and stakeholder roles throughout our project to increase its national relevance, application and sustainability. 
The performance measures could be used as the basis for health system quality improvement indicators to monitor opioid prescribing. Additionally, the methods and approach used in this study could be adapted for other opioid guidelines, or applied to other areas of preventive healthcare and clinical guideline implementation processes. PMID:28446522

  17. Measuring surgical performance: A risky game?

    PubMed

    Kiernan, F; Rahman, F

    2015-08-01

    Interest in performance measurement has been driven by increased demand for better indicators of hospital quality of care. This is due in part to policy makers wishing to benchmark standards of care and implement quality improvements, and also to an increased demand for transparency and accountability. We describe the role of performance measurement, which is not only about quality improvement, but also serves as a guide in allocating resources within health systems, and between health, education, and social welfare systems. As hospital-based healthcare accounts for the greatest costs within the healthcare system, and treats the most severely ill patients, it is no surprise that performance measurement has focused attention on hospital-based care, and in particular on surgery, as an important means of improving quality and accountability. We are particularly concerned about the choice of mortality as an outcome measure in surgery, as this choice assumes that all mortality in surgery is preventable. In reality, as a low-quality indicator of care it risks both gaming and cream-skimming unless accurate risk adjustment exists. Further concerns relate to the public reporting of this outcome measure. As mortality rates are an imperfect measure of quality, the reputation of individual surgeons will be threatened by the public release of these data. Significant effort should be made to communicate the results to the public in an appropriate manner. Copyright © 2015 Royal College of Surgeons of Edinburgh (Scottish charity number SC005317) and Royal College of Surgeons in Ireland. Published by Elsevier Ltd. All rights reserved.

  18. Comparison of the performance of tracer kinetic model-driven registration for dynamic contrast enhanced MRI using different models of contrast enhancement.

    PubMed

    Buonaccorsi, Giovanni A; Roberts, Caleb; Cheung, Sue; Watson, Yvonne; O'Connor, James P B; Davies, Karen; Jackson, Alan; Jayson, Gordon C; Parker, Geoff J M

    2006-09-01

    The quantitative analysis of dynamic contrast-enhanced (DCE) magnetic resonance imaging (MRI) data is subject to model fitting errors caused by motion during the time-series data acquisition. However, the time-varying features that occur as a result of contrast enhancement can confound motion correction techniques based on conventional registration similarity measures. We have therefore developed a heuristic, locally controlled tracer kinetic model-driven registration procedure, in which the model accounts for contrast enhancement, and applied it to the registration of abdominal DCE-MRI data at high temporal resolution. Using severely motion-corrupted data sets that had been excluded from analysis in a clinical trial of an antiangiogenic agent, we compared the results obtained when using different models to drive the tracer kinetic model-driven registration with those obtained when using a conventional registration against the time series mean image volume. Using tracer kinetic model-driven registration, it was possible to improve model fitting by reducing the sum of squared errors but the improvement was only realized when using a model that adequately described the features of the time series data. The registration against the time series mean significantly distorted the time series data, as did tracer kinetic model-driven registration using a simpler model of contrast enhancement. When an appropriate model is used, tracer kinetic model-driven registration influences motion-corrupted model fit parameter estimates and provides significant improvements in localization in three-dimensional parameter maps. This has positive implications for the use of quantitative DCE-MRI for example in clinical trials of antiangiogenic or antivascular agents.

  19. More than Just Test Scores: Leading for Improvement with an Alternative Community-Driven Accountability Metric

    ERIC Educational Resources Information Center

    Spain, Angeline; McMahon, Kelly

    2016-01-01

    In this case, Sharon Rowley, a veteran principal, volunteers to participate in a new community-driven accountability initiative and encounters dilemmas about what it means to be a "data-driven" instructional leader. This case provides an opportunity for aspiring school leaders to explore and apply data-use theory to the work of leading…

  20. Public reporting and the quality of care of German nursing homes.

    PubMed

    Herr, Annika; Nguyen, Thu-Van; Schmitz, Hendrik

    2016-10-01

    Since 2009, German nursing homes have been evaluated regularly by an external institution, with quality report cards published online. We follow recent debates and argue that most of the information in the report cards does not reliably measure quality of care. However, a subset of up to seven measures does. Do these measures that reflect "risk factors" improve over time? Using a sample of more than 3000 German nursing homes with information on two waves, we assume that the introduction of public reporting is an exogenous institutional change and apply before-after estimations to obtain estimates for the relation between public reporting and quality. We find a significant improvement of the identified risk factors. Also, the two employed outcome quality indicators improve significantly. The improvements are driven by nursing homes with low quality in the first evaluation. To the extent that this can be interpreted as evidence that public reporting positively affects the (reported) quality in nursing homes, policy makers should carefully choose indicators reflecting care-sensitive quality. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  1. A Prototype for Content-Rich Decision-Making Support in NOAA using Data as an Asset

    NASA Astrophysics Data System (ADS)

    Austin, M.; Peng, G.

    2015-12-01

    Data producers and data providers do not always collaborate to ensure that the data meet the needs of a broad range of user communities. User needs are not always considered at the beginning of the data production and delivery phases. Often, data experts are required to explain or create custom output so that the data can be used by decision makers. Lack of documentation and quality information can result in poor user acceptance or data misinterpretation. This presentation will describe how new content integration tools have been created by NOAA's National Environmental Satellite, Data, and Information Service (NESDIS) to improve quality throughout the data management lifecycle. The prototype integrates content into a decision-making support tool from NOAA's Observing System Integrated Assessment (NOSIA) Value Tree, NOAA's Data Catalog/Digital Object Identifier (DOI) projects (collection-level metadata), and Data/Stewardship Maturity Matrices (data and stewardship quality rating information). The National Centers for Environmental Information's (NCEI) Global Historical Climatology Network-Monthly (GHCN) dataset is used as a case study to formulate and develop the prototype tool and demonstrate the power of the content-centric approach, in addition to the completeness of metadata elements. This demonstrates the benefits of the prototype tool in both bottom-up and top-down fashion. The prototype tool delivers a standards-based methodology that allows users to determine the quality and value of data that are fit for purpose. It encourages data producers and data providers/stewards to consider users' needs prior to data creation and dissemination, resulting in user-driven data requirements and an increased return on investment.

  2. Circle the Wagons & Bust out the Big Guns! Tame the "Wild West" of Distance Librarianship Using Quality Matters™ Benchmarks

    ERIC Educational Resources Information Center

    Pickens, Kathleen; Witte, Ginna

    2015-01-01

    The Quality Matters™ (QM) Program is utilized by over 700 colleges and universities to ensure that online course design meets standards imperative to student success in a Web-based classroom. Although a faculty-driven peer-review process, QM provides assessment from a student perspective, thereby identifying opportunities for improvement that may…

  3. Addressing the Basics: Academics' View of the Purpose of Higher Education

    ERIC Educational Resources Information Center

    Watty, Kim

    2006-01-01

    A number of changes have occurred in the higher education sector under the auspices of quality and quality improvement. Much of this change has resulted in a compliance-driven environment (more measures, more meetings, more form-filling and less time for the core activities of teaching and research). It is an environment that seeks to assure all…

  4. Evaluating a complex, multi-site, community-based program to improve healthcare quality: the summative research design for the Aligning Forces for Quality initiative.

    PubMed

    Scanlon, Dennis P; Wolf, Laura J; Alexander, Jeffrey A; Christianson, Jon B; Greene, Jessica; Jean-Jacques, Muriel; McHugh, Megan; Shi, Yunfeng; Leitzell, Brigitt; Vanderbrink, Jocelyn M

    2016-08-01

    The Aligning Forces for Quality (AF4Q) initiative was the Robert Wood Johnson Foundation's (RWJF's) signature effort to increase the overall quality of healthcare in targeted communities throughout the country. In addition to sponsoring this 16-site complex program, RWJF funded an independent scientific evaluation to support objective research on the initiative's effectiveness and contributions to basic knowledge in 5 core programmatic areas. The research design, data, and challenges faced during the summative evaluation phase of this near decade-long program are discussed. A descriptive overview of the summative research design and its development for a multi-site, community-based, healthcare quality improvement initiative is provided. The summative research design employed by the evaluation team is discussed. The evaluation team's summative research design involved a data-driven assessment of the effectiveness of the AF4Q program at large, assessments of the impact of AF4Q in the specific programmatic areas, and an assessment of how the AF4Q alliances were positioned for the future at the end of the program. The AF4Q initiative was the largest privately funded community-based healthcare improvement initiative in the United States to date and was implemented at a time of rapid change in national healthcare policy. The implementation of large-scale, multi-site initiatives is becoming an increasingly common approach for addressing problems in healthcare. The summative evaluation research design for the AF4Q initiative, and the lessons learned from its approach, may be valuable to others tasked with evaluating similarly complex community-based initiatives.

  5. Clinical practice guideline development manual: A quality-driven approach for translating evidence into action

    PubMed Central

    Rosenfeld, Richard M.; Shiffman, Richard N.

    2010-01-01

    Background Guidelines translate best evidence into best practice. A well-crafted guideline promotes quality by reducing healthcare variations, improving diagnostic accuracy, promoting effective therapy, and discouraging ineffective – or potentially harmful – interventions. Despite a plethora of published guidelines, methodology is often poorly defined and varies greatly within and among organizations. Purpose This manual describes the principles and practices used successfully by the American Academy of Otolaryngology – Head and Neck Surgery to produce quality-driven, evidence-based guidelines using efficient and transparent methodology for action-ready recommendations with multi-disciplinary applicability. The development process, which allows moving from conception to completion in twelve months, emphasizes a logical sequence of key action statements supported by amplifying text, evidence profiles, and recommendation grades that link action to evidence. Conclusions As clinical practice guidelines become more prominent as a key metric of quality healthcare, organizations must develop efficient production strategies that balance rigor and pragmatism. Equally important, clinicians must become savvy in understanding what guidelines are – and are not – and how they are best utilized to improve care. The information in this manual should help clinicians and organizations achieve these goals. PMID:19464525

  6. A Systematic Review of Published Respondent-Driven Sampling Surveys Collecting Behavioral and Biologic Data.

    PubMed

    Johnston, Lisa G; Hakim, Avi J; Dittrich, Samantha; Burnett, Janet; Kim, Evelyn; White, Richard G

    2016-08-01

    Reporting key details of respondent-driven sampling (RDS) survey implementation and analysis is essential for assessing the quality of RDS surveys. RDS is both a recruitment and an analytic method and, as such, it is important to adequately describe both aspects in publications. We extracted data from peer-reviewed literature published through September 2013 that reported collecting biological specimens using RDS. We identified 151 eligible peer-reviewed articles describing 222 surveys conducted in seven regions throughout the world. Most published surveys reported basic implementation information such as survey city, country, year, population sampled, interview method, and final sample size. However, many surveys did not report essential methodological and analytical information for assessing RDS survey quality, including the number of recruitment sites, seeds at start and end, maximum number of waves, and whether data were adjusted for network size. Understanding the quality of data collection and analysis in RDS is useful for effectively planning public health service delivery and funding priorities.
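One of the quality items the review checks for, adjustment for network size, can be illustrated with the RDS-II (Volz-Heckathorn) estimator, which weights each respondent by the inverse of their reported personal network size (degree). The sketch below is a minimal pure-Python illustration; the respondent data are hypothetical.

```python
def rds_ii_prevalence(respondents):
    """RDS-II prevalence estimate.

    respondents: list of (has_trait: bool, degree: int).
    Each respondent is weighted by 1/degree, since high-degree people
    are over-sampled by chain-referral recruitment.
    """
    total = sum(1.0 / d for _, d in respondents)
    positive = sum(1.0 / d for trait, d in respondents if trait)
    return positive / total

# Hypothetical sample: naive prevalence is 50%, but the trait is
# concentrated in low-degree (harder-to-reach) respondents.
sample = [(True, 2), (True, 10), (False, 5), (False, 20)]
print(round(rds_ii_prevalence(sample), 3))
```

Because the two positive respondents have low degrees, the degree-weighted estimate rises above the naive 50%, which is exactly why surveys should report whether such an adjustment was applied.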

  7. Pharmacist leadership in ICU quality improvement: coordinating spontaneous awakening and breathing trials.

    PubMed

    Stollings, Joanna L; Foss, Julie J; Ely, E Wesley; Ambrose, Anna M; Rice, Todd W; Girard, Timothy D; Wheeler, Arthur P

    2015-08-01

    Coordinating efforts across disciplines in the intensive care unit is a key component of quality improvement (QI) efforts. Spontaneous awakening trials (SATs) and spontaneous breathing trials (SBTs) are considered key components of guidelines, yet unfortunately are often not done or coordinated properly. To determine if a pharmacist-driven awakening and breathing coordination (ABC) QI program would improve compliance (ie, process measures) as compared with the previous protocol, which did not involve pharmacists. The QI program included pharmacist-led education, daily discussion on rounds, and weekly performance reports to staff. Using a pre-QI versus during-QI versus post-QI intervention design, we compared data from 500 control ventilator-days (pre-QI period) versus 580 prospective ventilator-days (during-QI period). We then evaluated the sustainability of the QI program in 216 ventilator-days in the post-QI period. SAT safety screens were performed on only 20% pre-QI patient-days versus 97% of during-QI patient-days (P < 0.001) and 100% of post-QI patient-days (P = 0.25). The rates of passing the SAT safety screen in pre-QI and during-QI periods were 63% versus 78% (P = 0.03) and 81% in the post-QI period (P = 0.86). The rates of SATs among eligible patients on continuous infusions were only 53% in the pre-QI versus 85% in the during-QI (P = 0.0001) and 87% in the post-QI (P = 1) periods. In this QI initiative, a pharmacist-driven, interdisciplinary ABC protocol significantly improved process measures compliance, comparing the pre-QI versus during-QI rates of screening, performing, and coordinating SAT and SBTs, and these results were sustained in the 8-month follow-up period post-QI program. © The Author(s) 2015.
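The pre-QI versus during-QI comparisons above are differences in proportions over ventilator-day counts. A minimal sketch of such a comparison is a pooled two-proportion z-test; the counts below are reconstructed from the reported percentages (20% of 500 pre-QI days versus 97% of 580 during-QI days), and the exact counts and test the authors used are assumptions here.

```python
from math import sqrt, erf

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test with a pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                      # pooled proportion
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Illustrative counts reconstructed from the reported rates.
z, p = two_proportion_z(100, 500, 563, 580)
print(f"z = {z:.1f}, p < 0.001: {p < 0.001}")
```

With a jump from 20% to 97% over samples this large, the z statistic is enormous and the p-value is far below 0.001, consistent with the reported P < 0.001.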

  8. Seamless service: research and action.

    PubMed

    Grinstead, N; Timoney, R

    1994-01-01

    Describes the process used by the Mater Infirmorum Hospital in Belfast in 1992-1994 to achieve high-quality care (Seamless Service) and motivate staff to deliver and measure performance. Aims of the project included focusing the organization on the customer and improving teamwork and motivation at all levels. After comprehensive data collection from GPs, patients and staff, forums developed a full TQM strategy to gain support and maintain momentum, including innovative staff events (every staff member was given the opportunity to attend) where multilevel, multidisciplinary workshops enabled staff to design customer care standards, develop teams and lead customer-driven change.

  9. Improved Quantitative Plant Proteomics via the Combination of Targeted and Untargeted Data Acquisition

    PubMed Central

    Hart-Smith, Gene; Reis, Rodrigo S.; Waterhouse, Peter M.; Wilkins, Marc R.

    2017-01-01

    Quantitative proteomics strategies – which are playing important roles in the expanding field of plant molecular systems biology – are traditionally designated as either hypothesis driven or non-hypothesis driven. Many of these strategies aim to select individual peptide ions for tandem mass spectrometry (MS/MS), and to do this mixed hypothesis driven and non-hypothesis driven approaches are theoretically simple to implement. In-depth investigations into the efficacies of such approaches have, however, yet to be described. In this study, using combined samples of unlabeled and metabolically 15N-labeled Arabidopsis thaliana proteins, we investigate the mixed use of targeted data acquisition (TDA) and data dependent acquisition (DDA) – referred to as TDA/DDA – to facilitate both hypothesis driven and non-hypothesis driven quantitative data collection in individual LC-MS/MS experiments. To investigate TDA/DDA for hypothesis driven data collection, 7 miRNA target proteins of differing size and abundance were targeted using inclusion lists comprised of 1558 m/z values, using 3 different TDA/DDA experimental designs. In samples in which targeted peptide ions were of particularly low abundance (i.e., predominantly only marginally above mass analyser detection limits), TDA/DDA produced statistically significant increases in the number of targeted peptides identified (230 ± 8 versus 80 ± 3 for DDA; p = 1.1 × 10-3) and quantified (35 ± 3 versus 21 ± 2 for DDA; p = 0.038) per experiment relative to the use of DDA only. These expected improvements in hypothesis driven data collection were observed alongside unexpected improvements in non-hypothesis driven data collection. 
Untargeted peptide ions with m/z values matching those in inclusion lists were repeatedly identified and quantified across technical replicate TDA/DDA experiments, resulting in significant increases in the percentages of proteins repeatedly quantified in TDA/DDA experiments only relative to DDA experiments only (33.0 ± 2.6% versus 8.0 ± 2.7%, respectively; p = 0.011). These results were observed together with uncompromised broad-scale MS/MS data collection in TDA/DDA experiments relative to DDA experiments. Using our observations we provide guidelines for TDA/DDA method design for quantitative plant proteomics studies, and suggest that TDA/DDA is a broadly underutilized proteomics data acquisition strategy. PMID:29021799
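The mixed acquisition logic described above, inclusion-list targets first and then intensity-ranked untargeted ions, can be sketched as a precursor-selection routine. The m/z tolerance, slot count, and data layout below are illustrative assumptions, not the authors' implementation.

```python
def select_precursors(survey_scan, inclusion_list, tol=0.01, top_n=10):
    """Pick m/z values to fragment from one survey scan.

    survey_scan: list of (mz, intensity) pairs.
    Ions within `tol` of an inclusion-list m/z are fragmented first (TDA);
    remaining MS/MS slots are filled by intensity (classic top-N DDA).
    """
    def targeted(mz):
        return any(abs(mz - t) <= tol for t in inclusion_list)

    hits = [ion for ion in survey_scan if targeted(ion[0])]
    rest = [ion for ion in survey_scan if not targeted(ion[0])]
    # Targeted ions take priority; both groups are intensity-ordered.
    queue = sorted(hits, key=lambda i: -i[1]) + sorted(rest, key=lambda i: -i[1])
    return [mz for mz, _ in queue[:top_n]]

scan = [(450.23, 1e5), (512.77, 8e6), (450.235, 3e3), (799.41, 2e6)]
targets = [450.23, 601.30]
print(select_precursors(scan, targets, top_n=3))
```

Note how the low-abundance targeted ion at m/z 450.235 displaces a far more intense untargeted ion, which is the mechanism behind the reported gains for low-abundance targets, while unused slots still serve untargeted (DDA) collection.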

  10. Plant breeding and rural development in the United States.

    Treesearch

    KE Woeste; SB Blanche; KA Moldenhauer; CD Nelson

    2010-01-01

    Plant breeders contributed enormously to the agricultural and economic development of the United States. By improving the profitability of farming, plant breeders improved the economic condition of farmers and contributed to the growth and structure of rural communities. In the years since World War II, agriculture and the quality of rural life have been driven by...

  11. Trauma patient discharge and care transition experiences: Identifying opportunities for quality improvement in trauma centres.

    PubMed

    Gotlib Conn, Lesley; Zwaiman, Ashley; DasGupta, Tracey; Hales, Brigette; Watamaniuk, Aaron; Nathens, Avery B

    2018-01-01

    Challenges in delivering quality care are especially salient during hospital discharge and care transitions. Severely injured patients discharged from a trauma centre will go either home, to rehabilitation, or to another acute care hospital with complex management needs. The purpose of this study was to explore the experiences of trauma patients and families treated in a regional academic trauma centre to better understand and improve their discharge and care transition experiences. A qualitative study using inductive thematic analysis was conducted between March and October 2016. Telephone interviews were conducted with trauma patients and/or a family member after discharge from the trauma centre. Data collection and analysis were completed inductively and iteratively, consistent with a qualitative approach. Twenty-four interviews included 19 patients and 7 family members. Participants' experiences drew attention to discharge and transfer processes that either (1) fostered quality discharge or (2) impeded quality discharge. Fostering quality discharge were ward staff preparation efforts, effective care continuity, and adequate emotional support. Impeding quality discharge were perceived pressure to leave the hospital, imposed transfer decisions, and sub-optimal communication and coordination around discharge. Patient-provider communication was viewed to be driven by system, rather than patient, need. Inter-facility information gaps raised concern about receiving facilities' ability to care for injured patients. The quality of trauma patient discharge and transition experiences is undermined by system- and ward-level processes that compete, rather than align, in producing high-quality patient-centred discharge. Local improvement solutions focused on modifiable factors within the trauma centre include patient-oriented discharge education and patient navigation; however, these approaches alone may be insufficient to enhance patient experiences. 
Trauma patients encounter complex barriers to quality discharge that likely require a comprehensive, multimodal intervention. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Predicting the Quality of Pasteurized Vegetables Using Kinetic Models: A Review

    PubMed Central

    Aamir, Muhammad; Ovissipour, Mahmoudreza; Sablani, Shyam S.; Rasco, Barbara

    2013-01-01

    A resurgence in interest examining thermal pasteurization technologies has been driven by demands for “cleaner” labeling and the need of organic and natural foods markets for suitable preventive measures to impede microbial growth and extend shelf life of minimally processed foods and ready-to-eat foods with a concomitant reduction in the use of chemical preservatives. This review describes the effects of thermal pasteurization on vegetable quality attributes including altering flavor and texture to improve consumer acceptability, stabilizing color, improving digestibility, palatability and retaining bioavailability of important nutrients, and bioactive compounds. Here, we provide kinetic parameters for inactivation of viral and bacterial pathogens and their surrogates and marker enzymes used to monitor process effectiveness in a variety of plant food items. Data on thermal processing protocols leading to higher retention and bioactivity are also presented. Thermal inactivation of foodborne viruses and pathogenic bacteria, specifically at lower pasteurization temperatures or via new technologies such as dielectric heating, can lead to greater retention of “fresh-like” properties. PMID:26904594
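The kinetic parameters such reviews tabulate are commonly expressed as D-values (heating time at a reference temperature for a 1-log10 reduction in survivors) and z-values (the temperature change that shifts D tenfold). A minimal sketch of this standard first-order model follows; the parameter values are illustrative and not taken from the review.

```python
def log_reduction(t_min, d_value_min):
    """log10 reduction in survivors after heating t minutes at the
    temperature to which the D-value applies (first-order kinetics)."""
    return t_min / d_value_min

def d_at_temperature(d_ref, t_ref_c, t_c, z_value_c):
    """Adjust a reference D-value to another temperature via the z-value:
    D(T) = D_ref * 10**((T_ref - T) / z)."""
    return d_ref * 10 ** ((t_ref_c - t_c) / z_value_c)

d90 = 2.0                                   # assumed D-value at 90 C, minutes
d95 = d_at_temperature(d90, 90.0, 95.0, z_value_c=10.0)
print(round(d95, 3), log_reduction(6.0, d90))
```

With an assumed z of 10 C, raising the temperature by 5 C cuts the required holding time by a factor of 10**0.5 (about 3.2), which is the quantitative basis for the milder, shorter treatments the review associates with better retention of "fresh-like" properties.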

  13. Contrasting analytical and data-driven frameworks for radiogenomic modeling of normal tissue toxicities in prostate cancer.

    PubMed

    Coates, James; Jeyaseelan, Asha K; Ybarra, Norma; David, Marc; Faria, Sergio; Souhami, Luis; Cury, Fabio; Duclos, Marie; El Naqa, Issam

    2015-04-01

    We explore analytical and data-driven approaches to investigate the integration of genetic variations (single nucleotide polymorphisms [SNPs] and copy number variations [CNVs]) with dosimetric and clinical variables in modeling radiation-induced rectal bleeding (RB) and erectile dysfunction (ED) in prostate cancer patients. Sixty-two patients who underwent curative hypofractionated radiotherapy (66 Gy in 22 fractions) between 2002 and 2010 were retrospectively genotyped for CNV and SNP rs5489 in the xrcc1 DNA repair gene. Fifty-four patients had full dosimetric profiles. Two parallel modeling approaches were compared to assess the risk of severe RB (Grade ≥ 3) and ED (Grade ≥ 1): maximum-likelihood-estimated generalized Lyman-Kutcher-Burman (LKB) modeling and logistic regression. Statistical resampling based on cross-validation was used to evaluate model predictive power and generalizability to unseen data. Integration of the biological variables (xrcc1 CNV and SNP) improved the fit of the RB and ED analytical and data-driven models. Cross-validation of the generalized LKB models yielded increases in classification performance of 27.4% for RB and 14.6% for ED when xrcc1 CNV and SNP were included, respectively. Biological variables added to logistic regression modeling improved classification performance over standard dosimetric models by 33.5% for RB and 21.2% for ED models. As a proof of concept, we demonstrated that the combination of genetic and dosimetric variables can provide significant improvement in normal tissue complication probability (NTCP) prediction using analytical and data-driven approaches. The improvement in prediction performance was more pronounced in the data-driven approaches. Moreover, we have shown that CNVs, in addition to SNPs, may be useful structural genetic variants in predicting radiation toxicities. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
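The LKB model named above maps a dose-volume histogram to a complication probability: the generalized equivalent uniform dose (gEUD) is compared with TD50 and passed through a standard normal CDF. The sketch below is a hedged illustration; the toy dose-volume histogram and the TD50, m, and n values are assumptions, not the study's fitted parameters.

```python
from math import erf, sqrt

def geud(doses, volumes, n):
    """Generalized equivalent uniform dose for a dose-volume histogram.
    doses: Gy per bin; volumes: fractional volumes summing to 1;
    n: volume-effect parameter (a = 1/n)."""
    a = 1.0 / n
    return sum(v * d ** a for d, v in zip(doses, volumes)) ** (1.0 / a)

def lkb_ntcp(doses, volumes, td50, m, n):
    """LKB normal tissue complication probability:
    NTCP = Phi((gEUD - TD50) / (m * TD50))."""
    t = (geud(doses, volumes, n) - td50) / (m * td50)
    return 0.5 * (1 + erf(t / sqrt(2)))          # standard normal CDF

# Toy rectal DVH: 40% of the volume at 60 Gy, 60% at 30 Gy;
# td50/m/n are illustrative, serial-organ-like values.
p = lkb_ntcp([60.0, 30.0], [0.4, 0.6], td50=80.0, m=0.15, n=0.09)
print(f"NTCP = {p:.3f}")
```

A small n makes gEUD track the hottest part of the DVH (serial-organ behavior), which is why escalating the high-dose bin raises the predicted NTCP much faster than changes to the low-dose bin. Extending the model with genetic covariates, as the study does, amounts to letting parameters such as TD50 depend on the biomarker.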

  14. Process evaluation of the data-driven quality improvement in primary care (DQIP) trial: active and less active ingredients of a multi-component complex intervention to reduce high-risk primary care prescribing.

    PubMed

    Grant, Aileen; Dreischulte, Tobias; Guthrie, Bruce

    2017-01-07

    Two to four percent of emergency hospital admissions are caused by preventable adverse drug events. The estimated costs of such avoidable admissions in England were £530 million in 2015. The data-driven quality improvement in primary care (DQIP) intervention was designed to prompt review of patients vulnerable from currently prescribed non-steroidal anti-inflammatory drugs (NSAIDs) and anti-platelets and was found to be effective at reducing this prescribing. A process evaluation was conducted parallel to the trial, and this paper reports the analysis which aimed to explore response to the intervention delivered to clusters in relation to participants' perceptions about which intervention elements were active in changing their practice. Data generation was by in-depth interview with key staff exploring participant's perceptions of the intervention components. Analysis was iterative using the framework technique and drawing on normalisation process theory. All the primary components of the intervention were perceived as active, but at different stages of implementation: financial incentives primarily supported recruitment; education motivated the GPs to initiate implementation; the informatics tool facilitated sustained implementation. Participants perceived the primary components as interdependent. Intervention subcomponents also varied in whether and when they were active. For example, run charts providing feedback of change in prescribing over time were ignored in the informatics tool, but were motivating in some practices in the regular e-mailed newsletter. The high-risk NSAID and anti-platelet prescribing targeted was accepted as important by all interviewees, and this shared understanding was a key wider context underlying intervention effectiveness. This was a novel use of process evaluation data which examined whether and how the individual intervention components were effective from the perspective of the professionals delivering changed care to patients. 
These findings are important for reproducibility and roll-out of the intervention. ClinicalTrials.gov, NCT01425502 .

  15. Quality of life in bipolar disorder: towards a dynamic understanding.

    PubMed

    Morton, E; Murray, G; Michalak, E E; Lam, R W; Beaulieu, S; Sharma, V; Cervantes, P; Parikh, S V; Yatham, L N

    2018-05-01

    Although quality of life (QoL) is receiving increasing attention in bipolar disorder (BD) research and practice, little is known about its naturalistic trajectory. The dual aims of this study were to prospectively investigate: (a) the trajectory of QoL under guideline-driven treatment and (b) the dynamic relationship between mood symptoms and QoL. In total, 362 patients with BD receiving guideline-driven treatment were prospectively followed at 3-month intervals for up to 5 years. Mental (Mental Component Score - MCS) and physical (Physical Component Score - PCS) QoL were measured using the self-report SF-36. Clinician-rated symptom data were recorded for mania and depression. Multilevel modelling was used to analyse MCS and PCS over time, QoL trajectories predicted by time-lagged symptoms, and symptom trajectories predicted by time-lagged QoL. MCS exhibited a positive trajectory, while PCS worsened over time. Investigation of temporal relationships between QoL and symptoms suggested bidirectional effects: earlier depressive symptoms were negatively associated with mental QoL, and earlier manic symptoms were negatively associated with physical QoL. Importantly, earlier MCS and PCS were both negatively associated with downstream symptoms of mania and depression. The present investigation illustrates real-world outcomes for QoL under guideline-driven BD treatment: improvements in mental QoL and decrements in physical QoL were observed. The data permitted investigation of dynamic interactions between QoL and symptoms, generating novel evidence for bidirectional effects and encouraging further research into this important interplay. Investigation of relevant time-varying covariates (e.g. medications) was beyond scope. Future research should investigate possible determinants of QoL and the interplay between symptoms and wellbeing/satisfaction-centric measures of QoL.

  16. Temporal Data-Driven Sleep Scheduling and Spatial Data-Driven Anomaly Detection for Clustered Wireless Sensor Networks

    PubMed Central

    Li, Gang; He, Bin; Huang, Hongwei; Tang, Limin

    2016-01-01

    The spatial–temporal correlation is an important feature of sensor data in wireless sensor networks (WSNs). Most existing works based on the spatial–temporal correlation address one of two problems, redundancy reduction or anomaly detection, and pursue the two separately. In this work, the combination of temporal data-driven sleep scheduling (TDSS) and spatial data-driven anomaly detection is proposed, where TDSS reduces data redundancy. The TDSS model is inspired by transmission control protocol (TCP) congestion control. Based on the long, linear cluster structure of the tunnel monitoring system, cooperative TDSS and spatial data-driven anomaly detection are then proposed. To realize synchronous acquisition within the same ring, so that the situation of every ring can be analyzed, TDSS is implemented cooperatively in the cluster. To maintain the precision of sensor data, spatial data-driven anomaly detection based on spatial correlation and the Kriging method is used to generate an anomaly indicator. The experimental results show that cooperative TDSS can realize non-uniform sensing effectively to reduce energy consumption, and that spatial data-driven anomaly detection is significant for maintaining and improving the precision of sensor data. PMID:27690035
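    The spatial anomaly indicator described above can be sketched as follows: predict each sensor's reading from its neighbors by ordinary kriging and flag readings that deviate beyond a threshold. The covariance model, threshold, and sensor layout below are illustrative assumptions, not values from the paper.

    ```python
    import numpy as np

    def exp_cov(h, sill=1.0, rng=10.0):
        """Exponential covariance model (an assumption; the paper does not specify one)."""
        return sill * np.exp(-np.asarray(h) / rng)

    def ordinary_kriging(coords, values, target):
        """Predict the value at `target` from neighboring sensors via ordinary kriging."""
        n = len(values)
        d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
        # Kriging system: [C 1; 1' 0] [w; mu] = [c0; 1]
        A = np.ones((n + 1, n + 1))
        A[:n, :n] = exp_cov(d)
        A[n, n] = 0.0
        b = np.ones(n + 1)
        b[:n] = exp_cov(np.linalg.norm(coords - target, axis=-1))
        w = np.linalg.solve(A, b)[:n]   # weights sum to 1 by construction
        return float(w @ values)

    def anomaly_indicator(coords, values, i, threshold=3.0):
        """Flag sensor i when its reading deviates from the spatial prediction."""
        mask = np.arange(len(values)) != i
        pred = ordinary_kriging(coords[mask], values[mask], coords[i])
        return abs(values[i] - pred) > threshold

    coords = np.array([[0.0, 0], [5, 0], [10, 0], [15, 0]])  # sensors along a tunnel ring
    values = np.array([20.0, 20.5, 35.0, 21.0])              # sensor 2 looks anomalous
    print(anomaly_indicator(coords, values, 2))              # True for the outlier
    ```

    In a deployed network the covariance parameters would be fitted from historical data rather than fixed by hand.
    
    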

  17. Measuring the value of process improvement initiatives in a preoperative assessment center using time-driven activity-based costing.

    PubMed

    French, Katy E; Albright, Heidi W; Frenzel, John C; Incalcaterra, James R; Rubio, Augustin C; Jones, Jessica F; Feeley, Thomas W

    2013-12-01

    The value and impact of process improvement initiatives are difficult to quantify. We describe the use of time-driven activity-based costing (TDABC) in a clinical setting to quantify the value of process improvements in terms of cost, time and personnel resources. Problem: difficulty in identifying and measuring the cost savings of process improvement initiatives in a Preoperative Assessment Center (PAC). Solution: use TDABC to measure the value of process improvement initiatives that reduce the costs of performing a preoperative assessment while maintaining the quality of the assessment. Methods: apply the principles of TDABC in a PAC to measure the value, from baseline, of two phases of performance improvement initiatives and determine the impact of each implementation in terms of cost, time and efficiency. Through two rounds of performance improvements, we quantified an overall reduction of 33% in time spent by patients and personnel, which resulted in a 46% reduction in the costs of providing care in the center. The performance improvements resulted in a 17% decrease in the total number of full-time equivalents (FTEs) needed to staff the center and a 19% increase in the number of patients assessed in the center. Quality of care, as assessed by the rate of cancellations on the day of surgery, was not adversely impacted by the process improvements. © 2013 Published by Elsevier Inc.
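    The TDABC calculation itself is simple arithmetic: each resource gets a capacity cost rate (cost per period divided by practical capacity in minutes), and an activity's cost is the sum over resources of minutes spent times that rate. A minimal sketch with purely illustrative figures (none of the numbers below come from the study):

    ```python
    # Time-driven activity-based costing: the cost of an activity is each
    # resource's capacity cost rate times the minutes that resource spends.
    def capacity_cost_rate(cost_per_period, practical_capacity_minutes):
        return cost_per_period / practical_capacity_minutes

    def activity_cost(steps):
        """steps: list of (minutes, cost_rate_per_minute) for each resource involved."""
        return sum(minutes * rate for minutes, rate in steps)

    # Hypothetical monthly costs over the same practical capacity in minutes.
    nurse_rate = capacity_cost_rate(8000.0, 9600)
    md_rate = capacity_cost_rate(24000.0, 9600)

    baseline = activity_cost([(30, nurse_rate), (15, md_rate)])  # pre-improvement times
    improved = activity_cost([(20, nurse_rate), (10, md_rate)])  # post-improvement times
    print(f"baseline ${baseline:.2f}, improved ${improved:.2f}, "
          f"saving {100 * (1 - improved / baseline):.0f}%")
    ```

    The study's headline numbers (33% less time, 46% lower cost) fall out of exactly this kind of bookkeeping once the process maps and time estimates are in place.
    
    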

  18. An Exploratory Analysis of Societal Preferences for Research-Driven Quality of Life Improvements in Canada

    ERIC Educational Resources Information Center

    Rudd, Murray A.

    2011-01-01

    Research in the humanities, arts, and social sciences (HASS) tends to have impacts that enhance quality of life (QOL) but that are not amenable to pricing in established markets. If the economic value of "non-market" research impacts is ignored when making the business case for HASS research, society will under-invest in it. My goal in…

  19. Machine Learning and Deep Learning Models to Predict Runoff Water Quantity and Quality

    NASA Astrophysics Data System (ADS)

    Bradford, S. A.; Liang, J.; Li, W.; Murata, T.; Simunek, J.

    2017-12-01

    Contaminants can be rapidly transported at the soil surface by runoff to surface water bodies. Physically-based models, which rest on mathematical descriptions of the main hydrological processes, are key tools for predicting surface water impairment. Alongside physically-based models, data-driven models are becoming increasingly popular for describing the behavior of hydrological and water resources systems, since these models can be used to complement or even replace physically-based models. In this presentation we propose a new data-driven model as an alternative to a physically-based overland flow and transport model. First, we developed a physically-based numerical model to simulate overland flow and contaminant transport (the HYDRUS-1D overland flow module). A large number of numerical simulations were carried out to develop a database containing information about the impact of various input parameters (weather patterns, surface topography, vegetation, soil conditions, contaminants, and best management practices) on runoff water quantity and quality outputs. This database was used to train data-driven models. Three different methods (Neural Networks, Support Vector Machines, and Recurrent Neural Networks) were explored to prepare input-output functional relations. Results demonstrate the ability and limitations of machine learning and deep learning models to predict runoff water quantity and quality.
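    The workflow the abstract describes — run many physically-based simulations, collect inputs and outputs into a database, then fit a data-driven model to that database — can be sketched as follows. The "physical model" here is a toy nonlinear function standing in for HYDRUS-1D, and the surrogate is a minimal one-hidden-layer network trained by gradient descent, a simplified stand-in for the models the authors explored:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Stand-in "physically-based model": a known nonlinear runoff response to
    # rainfall intensity and soil moisture (illustrative only, not HYDRUS-1D).
    def simulate_runoff(rain, moisture):
        return np.maximum(0.0, rain * moisture - 0.2)

    # Build a training database from many simulated scenarios.
    X = rng.uniform(0, 1, size=(2000, 2))
    y = simulate_runoff(X[:, 0], X[:, 1])

    # One-hidden-layer tanh network trained by full-batch gradient descent.
    W1 = rng.normal(0, 0.5, (2, 16)); b1 = np.zeros(16)
    W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)
    lr = 0.1
    for _ in range(2000):
        H = np.tanh(X @ W1 + b1)                      # forward pass
        pred = (H @ W2 + b2).ravel()
        err = pred - y
        # backpropagated gradients of (half) the mean-squared error
        gW2 = H.T @ err[:, None] / len(y); gb2 = err.mean(keepdims=True)
        dH = (err[:, None] @ W2.T) * (1 - H**2) / len(y)
        gW1 = X.T @ dH; gb1 = dH.sum(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

    H = np.tanh(X @ W1 + b1)
    mse = float(np.mean(((H @ W2 + b2).ravel() - y) ** 2))
    print(f"surrogate MSE on the simulated database: {mse:.4f}")
    ```

    Once trained, such a surrogate answers "what-if" queries in microseconds, which is the practical appeal of replacing the physical model for screening-level predictions.
    
    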

  20. Data-driven medicinal chemistry in the era of big data.

    PubMed

    Lusher, Scott J; McGuire, Ross; van Schaik, René C; Nicholson, C David; de Vlieg, Jacob

    2014-07-01

    Science, and the way we undertake research, is changing. The increasing rate of data generation across all scientific disciplines is providing incredible opportunities for data-driven research, with the potential to transform our current practices. The exploitation of so-called 'big data' will enable us to undertake research projects never previously possible but should also stimulate a re-evaluation of all our data practices. Data-driven medicinal chemistry approaches have the potential to improve decision making in drug discovery projects, providing that all researchers embrace the role of 'data scientist' and uncover the meaningful relationships and patterns in available data. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. Data-Driven Decision Making in Out-of-School Time Programs. Part 6 in a Series on Implementing Evidence-Based Practices in Out-of-School Time Programs: The Role of Organization-Level Activities. Research-to-Results Brief. Publication #2009-34

    ERIC Educational Resources Information Center

    Bandy, Tawana; Burkhauser, Mary; Metz, Allison J. R.

    2009-01-01

    Although many program managers look to data to inform decision-making and manage their programs, high-quality program data may not always be available. Yet such data are necessary for effective program implementation. The use of high-quality data facilitates program management, reduces reliance on anecdotal information, and ensures that data are…

  2. IS MANUAL THERAPY A RATIONAL APPROACH TO IMPROVING HEALTH-RELATED QUALITY OF LIFE IN PEOPLE WITH ARTHRITIS?

    PubMed Central

    Cameron, Melainie

    2002-01-01

    Background: People with arthritic disease are advised to participate in gentle exercise on a regular basis and to pursue long-term medication regimes. Alternative therapies are also used by people with arthritis, and may sometimes be recommended by rheumatologists and other medical personnel. Alternative therapies may be divided into two types: active therapies, in which the patient takes a driving role, and passive therapies, in which the therapy cannot proceed unless driven by a therapist. Objective: To review the effectiveness of manual therapy in improving the health-related quality of life (HRQOL) of people with two common arthritic conditions: osteoarthritis and rheumatoid arthritis. Discussion: Massage, and other passive (practitioner-driven) manual therapies, have been anecdotally reported to improve HRQOL in people with arthritis. Many manual therapists consult with patients who have arthritic diseases, receive referrals from rheumatologists, and consider the arthritic diseases to be within their field of practice. Although there is empirical evidence that manual therapy is beneficial for some types of arthritis, its level of effectiveness is under-researched. Medical authorities are reluctant to endorse manual therapies for arthritis due to a lack of scientific evidence demonstrating efficacy, safety, and cost effectiveness. PMID:17987169

  3. Transform coding for space applications

    NASA Technical Reports Server (NTRS)

    Glover, Daniel

    1993-01-01

    Data compression coding requirements for aerospace applications differ somewhat from the compression requirements for entertainment systems. On the one hand, entertainment applications are bit-rate driven, with the goal of getting the best quality possible within a given bandwidth. Science applications are quality driven, with the goal of getting the lowest bit rate for a given level of reconstruction quality. In the past, the required quality level has been nothing less than perfect, allowing only the use of lossless compression methods (if that). With the advent of better, faster, cheaper missions, an opportunity has arisen for lossy data compression methods to find a use in science applications as requirements for perfect-quality reconstruction run into cost constraints. This paper presents a review of the data compression problem from the space application perspective. Transform coding techniques are described and some simple, integer transforms are presented. The application of these transforms to space-based data compression problems is discussed. Integer transforms have an advantage over conventional transforms in computational complexity. Space applications differ from broadcast or entertainment in that it is desirable to have a simple encoder (in space) and tolerate a more complicated decoder (on the ground), rather than vice versa. Energy compaction with the new transforms is compared with the Walsh-Hadamard (WHT), Discrete Cosine (DCT), and Integer Cosine (ICT) transforms.
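    The energy-compaction property the paper evaluates can be demonstrated with a tiny Walsh-Hadamard transform, chosen here because it needs only additions and subtractions, in keeping with the integer-transform theme. The 8-sample signal is illustrative:

    ```python
    import numpy as np

    def wht(x):
        """Fast Walsh-Hadamard transform via the add/subtract butterfly (length a power of 2)."""
        x = np.asarray(x, dtype=float).copy()
        h = 1
        while h < len(x):
            for i in range(0, len(x), h * 2):
                a = x[i:i + h].copy()
                b = x[i + h:i + 2 * h].copy()
                x[i:i + h] = a + b       # sums go in the low half
                x[i + h:i + 2 * h] = a - b  # differences in the high half
            h *= 2
        return x

    # A smooth 8-sample signal: most energy lands in the low-order coefficients,
    # which is what makes transform coding pay off for slowly varying science data.
    signal = np.array([10, 11, 12, 12, 13, 13, 14, 15], dtype=float)
    coeffs = wht(signal)
    energy = coeffs ** 2
    print(coeffs)
    print(f"fraction of energy in the first coefficient: {energy[0] / energy.sum():.3f}")
    ```

    Concentrating energy in a few coefficients means the rest can be coarsely quantized or dropped, which is precisely where the lossy bit-rate savings come from.
    
    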

  4. Managed behavioral healthcare in the private sector.

    PubMed

    Jeffrey, M; Riley, J

    2000-09-01

    Employers, in their search for cost containment and quality improvement, have driven the development of the behavioral health managed care vendor. More specifically, the behavioral health carve-out is an innovation that was developed to respond to employer and, more recently, health plan needs. Now that the product has matured, it is increasingly being asked to justify its existence. Costs have certainly been contained, but improvements in quality have not always been evident. The issues the authors address include: as cost pressures continue, can the industry deliver on its promise to improve care? Will it need to evolve to yet another level, with new or different features?

  5. Data-Driven Instructional Leadership

    ERIC Educational Resources Information Center

    Blink, Rebecca

    2006-01-01

    With real-world examples from actual schools, this book illustrates how to nurture a culture of continuous improvement, meet the needs of individual students, foster an environment of high expectations, and meet the requirements of NCLB. Each component of the Data-Driven Instructional Leadership (DDIS) model represents several branches of…

  6. Identifying Data-Driven Instructional Systems. Research to Practice Brief

    ERIC Educational Resources Information Center

    Lawrence, K. S.

    2016-01-01

    The study summarized in this research to practice brief, "Creating data-driven instructional systems in school: The new instructional leadership," Halverson, R., Grigg, J., Pritchett, R., & Thomas, C. (2015), "Journal of School Leadership," 25. 447-481, investigated whether student outcome improvements were linked to the…

  7. A decision-supported outpatient practice system.

    PubMed Central

    Barrows, R. C.; Allen, B. A.; Smith, K. C.; Arni, V. V.; Sherman, E.

    1996-01-01

    We describe a Decision-supported Outpatient Practice (DOP) system developed and now in use at the Columbia-Presbyterian Medical Center. DOP is an automated ambulatory medical record system that integrates inpatient and ambulatory care data, and incorporates active and passive decision support mechanisms with a view towards improving the quality of primary care. Active decision support occurs in the form of event-driven reminders created within a remote clinical information system with its central data repository and decision support system (DSS). Novel features of DOP include patient-specific health maintenance task lists calculated by the remote DSS, use of a semantically structured controlled medical vocabulary to support clinical results review and provider data entry, and exploitation of an underlying ambulatory data model that provides an explicit record of the evolution of insight regarding patient management. Benefits, challenges, and plans are discussed. PMID:8947774

  8. The Korean Neonatal Network: An Overview

    PubMed Central

    Chang, Yun Sil; Park, Hyun-Young

    2015-01-01

    Currently, in the Republic of Korea, despite the very low overall birth rate, the preterm birth rate and the number of preterm infants are markedly increasing. Neonatal deaths and major complications mostly occur in premature infants, especially very-low-birth-weight infants (VLBWIs). VLBWIs weigh less than 1,500 g at birth and require intensive treatment in a neonatal intensive care unit (NICU). The operation of the Korean Neonatal Network (KNN) officially started on April 15, 2013, by the Korean Society of Neonatology with support from the Korea Centers for Disease Control and Prevention. The KNN is a national multicenter neonatal network based on a prospective web-based registry for VLBWIs. About 2,000 VLBWIs from 60 participating hospital NICUs are registered annually in the KNN. The KNN has built unique systems such as a web-based real-time data display on its website and a site-visit monitoring system for data quality surveillance. The KNN should be maintained and developed further in order to generate appropriate, population-based, data-driven health-care policies; facilitate active multicenter neonatal research, including quality improvement of neonatal care; and ultimately lead to improvement in the prognosis of high-risk newborns and a subsequent reduction in health-care costs through the development of evidence-based neonatal medicine in Korea. PMID:26566355

  9. The Korean Neonatal Network: An Overview.

    PubMed

    Chang, Yun Sil; Park, Hyun-Young; Park, Won Soon

    2015-10-01

    Currently, in the Republic of Korea, despite the very low overall birth rate, the preterm birth rate and the number of preterm infants are markedly increasing. Neonatal deaths and major complications mostly occur in premature infants, especially very-low-birth-weight infants (VLBWIs). VLBWIs weigh less than 1,500 g at birth and require intensive treatment in a neonatal intensive care unit (NICU). The operation of the Korean Neonatal Network (KNN) officially started on April 15, 2013, by the Korean Society of Neonatology with support from the Korea Centers for Disease Control and Prevention. The KNN is a national multicenter neonatal network based on a prospective web-based registry for VLBWIs. About 2,000 VLBWIs from 60 participating hospital NICUs are registered annually in the KNN. The KNN has built unique systems such as a web-based real-time data display on its website and a site-visit monitoring system for data quality surveillance. The KNN should be maintained and developed further in order to generate appropriate, population-based, data-driven health-care policies; facilitate active multicenter neonatal research, including quality improvement of neonatal care; and ultimately lead to improvement in the prognosis of high-risk newborns and a subsequent reduction in health-care costs through the development of evidence-based neonatal medicine in Korea.

  10. An evaluation of the NQF Quality Data Model for representing Electronic Health Record driven phenotyping algorithms.

    PubMed

    Thompson, William K; Rasmussen, Luke V; Pacheco, Jennifer A; Peissig, Peggy L; Denny, Joshua C; Kho, Abel N; Miller, Aaron; Pathak, Jyotishman

    2012-01-01

    The development of Electronic Health Record (EHR)-based phenotype selection algorithms is a non-trivial and highly iterative process involving domain experts and informaticians. To make it easier to port algorithms across institutions, it is desirable to represent them using an unambiguous formal specification language. For this purpose we evaluated the recently developed National Quality Forum (NQF) information model designed for EHR-based quality measures: the Quality Data Model (QDM). We selected 9 phenotyping algorithms that had been previously developed as part of the eMERGE consortium and translated them into QDM format. Our study concluded that the QDM contains several core elements that make it a promising format for EHR-driven phenotyping algorithms for clinical research. However, we also found areas in which the QDM could be usefully extended, such as representing information extracted from clinical text, and the ability to handle algorithms that do not consist of Boolean combinations of criteria.
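    As a concrete illustration of an algorithm that is a Boolean combination of criteria — the kind of logic the QDM represents well — consider a toy type-2-diabetes phenotype. The codes, threshold, and record layout below are entirely hypothetical, not taken from the eMERGE algorithms:

    ```python
    # A phenotyping algorithm expressed as a Boolean combination of EHR criteria.
    # All codes and thresholds are illustrative assumptions for this sketch.
    def has_t2dm(patient):
        dx = "250.00" in patient["icd9"]               # diagnosis code present
        rx = "metformin" in patient["medications"]     # antidiabetic prescription
        lab = patient.get("max_hba1c", 0) >= 6.5       # elevated HbA1c result
        return dx and (rx or lab)                      # Boolean combination

    case = {"icd9": {"250.00"}, "medications": {"metformin"}, "max_hba1c": 7.1}
    control = {"icd9": set(), "medications": set(), "max_hba1c": 5.4}
    print(has_t2dm(case), has_t2dm(control))  # True False
    ```

    The limitations the study notes are visible even here: criteria derived from clinical text, or steps that are not Boolean (e.g., temporal ordering or counts over windows), do not fit this shape without extending the model.
    
    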

  11. Professional Quality of Life of Veterans Affairs Staff and Providers in a Patient-Centered Care Environment.

    PubMed

    Locatelli, Sara M; LaVela, Sherri L

    2015-01-01

    Changes to the work environment prompted by the movement toward patient-centered care have the potential to improve occupational stress among health care workers by improving team-based work activities, collaboration, and employee-driven quality improvement. This study was conducted to examine professional quality of life among providers at patient-centered care pilot facilities. Surveys were conducted with 76 Veterans Affairs employees/providers at facilities piloting patient-centered care interventions, to assess demographics, workplace practices and views (team-based environment, employee voice, quality of communication, and turnover intention), and professional quality of life (compassion satisfaction, burnout, and secondary traumatic stress). Professional quality-of-life subscales were not related to employee position type, age, or gender. Employee voice measures were related to lower burnout and higher compassion satisfaction. In addition, employees who were considering leaving their position showed higher burnout and lower compassion satisfaction scores. None of the work practices showed relationships with secondary traumatic stress.

  12. Impact of ablator thickness and laser drive duration on a platform for supersonic, shockwave-driven hydrodynamic instability experiments

    DOE PAGES

    Wan, W. C.; Malamud, Guy; Shimony, A.; ...

    2016-12-07

    Here, we discuss changes to a target design that improved the quality and consistency of data obtained through a novel experimental platform that enables the study of hydrodynamic instabilities in a compressible regime. The experiment uses a laser to drive a steady, supersonic shockwave over well-characterized initial perturbations. Early experiments were adversely affected by inadequate experimental timescales and, potentially, an unintended secondary shockwave. These issues were addressed by extending the 4 × 10^13 W/cm^2 laser pulse from 19 ns to 28 ns, and by increasing the ablator thickness from 185 µm to 500 µm. We present data demonstrating the performance of the platform.

  13. Impact of ablator thickness and laser drive duration on a platform for supersonic, shockwave-driven hydrodynamic instability experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wan, W. C.; Malamud, Guy; Shimony, A.

    Here, we discuss changes to a target design that improved the quality and consistency of data obtained through a novel experimental platform that enables the study of hydrodynamic instabilities in a compressible regime. The experiment uses a laser to drive a steady, supersonic shockwave over well-characterized initial perturbations. Early experiments were adversely affected by inadequate experimental timescales and, potentially, an unintended secondary shockwave. These issues were addressed by extending the 4 × 10^13 W/cm^2 laser pulse from 19 ns to 28 ns, and by increasing the ablator thickness from 185 µm to 500 µm. We present data demonstrating the performance of the platform.

  14. Smart Health - Potential and Pathways: A Survey

    NASA Astrophysics Data System (ADS)

    Arulananthan, C.; Hanifa, Sabibullah Mohamed

    2017-08-01

    Healthcare is a key field of research in which individuals or groups can engage in self-tracking of any kind of biological, physical, behavioral, or environmental information. Valuable information is hidden within massive health-care data, and the quantity of available unstructured data has been expanding at an exponential rate. Newly emerging disruptive technologies can handle many of the challenges facing data analysis and can extract valuable information via data analytics. Connected wellness in healthcare retrieves patients' physiological, pathological and behavioral parameters through sensors in order to analyze the inner workings of the human body. Disruptive technologies can take health care from a reactive, illness-driven system to a proactive, wellness-driven one. There is a need to strive to create a smart health system that is wellness-driven rather than illness-driven, addressing today's biggest problem in health care. Wellness-driven analytics applications help to promote a healthy living environment, called "Smart Health", and deliver an empowered quality of living. The contributions of this survey reveal and open possible doors (touching uncovered areas) in the line of research on smart health and its computing technologies.

  15. PEPSI-Dock: a detailed data-driven protein-protein interaction potential accelerated by polar Fourier correlation.

    PubMed

    Neveu, Emilie; Ritchie, David W; Popov, Petr; Grudinin, Sergei

    2016-09-01

    Docking prediction algorithms aim to find the native conformation of a complex of proteins from knowledge of their unbound structures. They rely on a combination of sampling and scoring methods, adapted to different scales. Polynomial Expansion of Protein Structures and Interactions for Docking (PEPSI-Dock) improves the accuracy of the first stage of the docking pipeline, which sharpens the final predictions. Indeed, PEPSI-Dock benefits from the precision of a very detailed data-driven model of the binding free energy used with a global and exhaustive rigid-body search space. As well as being accurate, our computations are among the fastest by virtue of the sparse representation of the pre-computed potentials and FFT-accelerated sampling techniques. Overall, this is the first demonstration of an FFT-accelerated docking method coupled with an arbitrary-shaped, distance-dependent interaction potential. First, we present a novel learning process to compute data-driven, distance-dependent pairwise potentials, adapted from our previous method used for rescoring putative protein-protein binding poses. The potential coefficients are learned by combining machine-learning techniques with physically interpretable descriptors. Then, we describe the integration of the deduced potentials into an FFT-accelerated spherical sampling provided by the Hex library. Overall, on a training set of 163 heterodimers, PEPSI-Dock achieves a success rate of 91% mid-quality predictions in the top-10 solutions. On a subset of the protein docking benchmark v5, it achieves 44.4% mid-quality predictions in the top-10 solutions when starting from bound structures and 20.5% when starting from unbound structures. The method runs in 5-15 min on a modern laptop and can easily be extended to other types of interactions. https://team.inria.fr/nano-d/software/PEPSI-Dock sergei.grudinin@inria.fr. © The Author 2016. Published by Oxford University Press. All rights reserved.

  16. Using baldrige performance excellence program approaches in the pursuit of radiation oncology quality care, patient satisfaction, and workforce commitment.

    PubMed

    Sternick, Edward S

    2011-01-01

    The Malcolm Baldrige National Quality Improvement Act was signed into law in 1987 to advance US business competitiveness and economic growth. Administered by the National Institute of Standards and Technology, the Act created the Baldrige National Quality Program, recently renamed the Baldrige Performance Excellence Program. The comprehensive analytical approaches, referred to as the Baldrige Healthcare Criteria, are very well suited to the evaluation and sustainable improvement of radiation oncology management and operations. A multidisciplinary self-assessment approach is used for radiotherapy program evaluation and development in order to generate a fact-based, knowledge-driven system for improving quality of care, increasing patient satisfaction, enhancing leadership effectiveness, building employee engagement, and boosting organizational innovation. This methodology also provides a valuable framework for benchmarking an individual radiation oncology practice's operations and results against guidelines defined by accreditation and professional organizations and regulatory agencies.

  17. Data publication with the structural biology data grid supports live analysis

    DOE PAGES

    Meyer, Peter A.; Socias, Stephanie; Key, Jason; ...

    2016-03-07

    Access to experimental X-ray diffraction image data is fundamental for the validation and reproduction of macromolecular models and indispensable for the development of structural biology processing methods. Here, we established a diffraction data publication and dissemination system, the Structural Biology Data Grid (SBDG; data.sbgrid.org), to preserve primary experimental data sets that support scientific publications. Data sets are accessible to researchers through a community-driven data grid, which facilitates global data access. Our analysis of a pilot collection of crystallographic data sets demonstrates that the information archived by SBDG is sufficient to reprocess data to statistics that meet or exceed the quality of the original published structures. SBDG has extended its services to the entire community and is used to develop support for other types of biomedical data sets. In conclusion, it is anticipated that access to the experimental data sets will enhance the paradigm shift in the community towards a much more dynamic body of continuously improving data analysis.

  18. Data publication with the structural biology data grid supports live analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meyer, Peter A.; Socias, Stephanie; Key, Jason

    Access to experimental X-ray diffraction image data is fundamental for the validation and reproduction of macromolecular models and indispensable for the development of structural biology processing methods. Here, we established a diffraction data publication and dissemination system, the Structural Biology Data Grid (SBDG; data.sbgrid.org), to preserve primary experimental data sets that support scientific publications. Data sets are accessible to researchers through a community-driven data grid, which facilitates global data access. Our analysis of a pilot collection of crystallographic data sets demonstrates that the information archived by SBDG is sufficient to reprocess data to statistics that meet or exceed the quality of the original published structures. SBDG has extended its services to the entire community and is used to develop support for other types of biomedical data sets. In conclusion, it is anticipated that access to the experimental data sets will enhance the paradigm shift in the community towards a much more dynamic body of continuously improving data analysis.

  19. Data publication with the structural biology data grid supports live analysis.

    PubMed

    Meyer, Peter A; Socias, Stephanie; Key, Jason; Ransey, Elizabeth; Tjon, Emily C; Buschiazzo, Alejandro; Lei, Ming; Botka, Chris; Withrow, James; Neau, David; Rajashankar, Kanagalaghatta; Anderson, Karen S; Baxter, Richard H; Blacklow, Stephen C; Boggon, Titus J; Bonvin, Alexandre M J J; Borek, Dominika; Brett, Tom J; Caflisch, Amedeo; Chang, Chung-I; Chazin, Walter J; Corbett, Kevin D; Cosgrove, Michael S; Crosson, Sean; Dhe-Paganon, Sirano; Di Cera, Enrico; Drennan, Catherine L; Eck, Michael J; Eichman, Brandt F; Fan, Qing R; Ferré-D'Amaré, Adrian R; Fromme, J Christopher; Garcia, K Christopher; Gaudet, Rachelle; Gong, Peng; Harrison, Stephen C; Heldwein, Ekaterina E; Jia, Zongchao; Keenan, Robert J; Kruse, Andrew C; Kvansakul, Marc; McLellan, Jason S; Modis, Yorgo; Nam, Yunsun; Otwinowski, Zbyszek; Pai, Emil F; Pereira, Pedro José Barbosa; Petosa, Carlo; Raman, C S; Rapoport, Tom A; Roll-Mecak, Antonina; Rosen, Michael K; Rudenko, Gabby; Schlessinger, Joseph; Schwartz, Thomas U; Shamoo, Yousif; Sondermann, Holger; Tao, Yizhi J; Tolia, Niraj H; Tsodikov, Oleg V; Westover, Kenneth D; Wu, Hao; Foster, Ian; Fraser, James S; Maia, Filipe R N C; Gonen, Tamir; Kirchhausen, Tom; Diederichs, Kay; Crosas, Mercè; Sliz, Piotr

    2016-03-07

    Access to experimental X-ray diffraction image data is fundamental for validation and reproduction of macromolecular models and indispensable for development of structural biology processing methods. Here, we established a diffraction data publication and dissemination system, Structural Biology Data Grid (SBDG; data.sbgrid.org), to preserve primary experimental data sets that support scientific publications. Data sets are accessible to researchers through a community driven data grid, which facilitates global data access. Our analysis of a pilot collection of crystallographic data sets demonstrates that the information archived by SBDG is sufficient to reprocess data to statistics that meet or exceed the quality of the original published structures. SBDG has extended its services to the entire community and is used to develop support for other types of biomedical data sets. It is anticipated that access to the experimental data sets will enhance the paradigm shift in the community towards a much more dynamic body of continuously improving data analysis.

  20. Data publication with the structural biology data grid supports live analysis

    PubMed Central

    Meyer, Peter A.; Socias, Stephanie; Key, Jason; Ransey, Elizabeth; Tjon, Emily C.; Buschiazzo, Alejandro; Lei, Ming; Botka, Chris; Withrow, James; Neau, David; Rajashankar, Kanagalaghatta; Anderson, Karen S.; Baxter, Richard H.; Blacklow, Stephen C.; Boggon, Titus J.; Bonvin, Alexandre M. J. J.; Borek, Dominika; Brett, Tom J.; Caflisch, Amedeo; Chang, Chung-I; Chazin, Walter J.; Corbett, Kevin D.; Cosgrove, Michael S.; Crosson, Sean; Dhe-Paganon, Sirano; Di Cera, Enrico; Drennan, Catherine L.; Eck, Michael J.; Eichman, Brandt F.; Fan, Qing R.; Ferré-D'Amaré, Adrian R.; Christopher Fromme, J.; Garcia, K. Christopher; Gaudet, Rachelle; Gong, Peng; Harrison, Stephen C.; Heldwein, Ekaterina E.; Jia, Zongchao; Keenan, Robert J.; Kruse, Andrew C.; Kvansakul, Marc; McLellan, Jason S.; Modis, Yorgo; Nam, Yunsun; Otwinowski, Zbyszek; Pai, Emil F.; Pereira, Pedro José Barbosa; Petosa, Carlo; Raman, C. S.; Rapoport, Tom A.; Roll-Mecak, Antonina; Rosen, Michael K.; Rudenko, Gabby; Schlessinger, Joseph; Schwartz, Thomas U.; Shamoo, Yousif; Sondermann, Holger; Tao, Yizhi J.; Tolia, Niraj H.; Tsodikov, Oleg V.; Westover, Kenneth D.; Wu, Hao; Foster, Ian; Fraser, James S.; Maia, Filipe R. N C.; Gonen, Tamir; Kirchhausen, Tom; Diederichs, Kay; Crosas, Mercè; Sliz, Piotr

    2016-01-01

    Access to experimental X-ray diffraction image data is fundamental for validation and reproduction of macromolecular models and indispensable for development of structural biology processing methods. Here, we established a diffraction data publication and dissemination system, Structural Biology Data Grid (SBDG; data.sbgrid.org), to preserve primary experimental data sets that support scientific publications. Data sets are accessible to researchers through a community driven data grid, which facilitates global data access. Our analysis of a pilot collection of crystallographic data sets demonstrates that the information archived by SBDG is sufficient to reprocess data to statistics that meet or exceed the quality of the original published structures. SBDG has extended its services to the entire community and is used to develop support for other types of biomedical data sets. It is anticipated that access to the experimental data sets will enhance the paradigm shift in the community towards a much more dynamic body of continuously improving data analysis. PMID:26947396

  1. Implementing for Sustainability: Promoting Use of a Measurement Feedback System for Innovation and Quality Improvement.

    PubMed

    Douglas, Susan; Button, Suzanne; Casey, Susan E

    2016-05-01

    Measurement feedback systems (MFSs) are increasingly recognized as evidence-based treatments for improving mental health outcomes, in addition to being a useful administrative tool for service planning and reporting. Promising research findings have driven practice administrators and policymakers to emphasize the incorporation of outcomes monitoring into electronic health systems. To promote MFS integrity and protect against potentially negative outcomes, it is vital that adoption and implementation be guided by scientifically rigorous yet practical principles. In this point of view, the authors discuss and provide examples of three user-centered and theory-based principles: emphasizing integration with clinical values and workflow, promoting administrative leadership with the 'golden thread' of data-informed decision-making, and facilitating sustainability by encouraging innovation. In our experience, enacting these principles serves to promote sustainable implementation of MFSs in the community while also allowing innovation to occur, which can inform improvements to guide future MFS research.

  2. Lessons Learned from Stakeholder-Driven Modeling in the Western Lake Erie Basin

    NASA Astrophysics Data System (ADS)

    Muenich, R. L.; Read, J.; Vaccaro, L.; Kalcic, M. M.; Scavia, D.

    2017-12-01

    Lake Erie's history includes a great environmental success story. Recognizing the impact of high phosphorus loads from point sources, the 1972 United States-Canada Great Lakes Water Quality Agreement set load reduction targets to reduce algae blooms and hypoxia. The Lake responded quickly to those reductions, and the effort was declared a success. However, since the mid-1990s, Lake Erie's algal blooms and hypoxia have returned, this time with a dominant algae species that produces toxins. The return of the algal blooms and hypoxia is again driven by phosphorus loads, but a major source is now the agriculturally-dominated Maumee River watershed that covers NW Ohio, NE Indiana, and SE Michigan, and the hypoxic extent has been shown to be driven by Maumee River loads plus those from the bi-national, multiple land-use St. Clair - Detroit River system. Stakeholders in the Lake Erie watershed have a long history of engagement with environmental policy, including modeling and monitoring efforts. This talk will focus on the application of interdisciplinary, stakeholder-driven modeling efforts aimed at understanding the primary phosphorus sources and the potential pathways to reduce these sources and the resulting algal blooms and hypoxia in Lake Erie. We will discuss the challenges, such as engaging users with different goals; the benefits to modeling, such as improvements in modeling data; and the new research questions emerging from these modeling efforts that are driven by end-user needs.

  3. Broad phonetic class definition driven by phone confusions

    NASA Astrophysics Data System (ADS)

    Lopes, Carla; Perdigão, Fernando

    2012-12-01

    Intermediate representations between the speech signal and phones may be used to improve discrimination among phones that are often confused. These representations are usually found according to broad phonetic classes, which are defined by a phonetician. This article proposes an alternative data-driven method to generate these classes. Phone confusion information from the analysis of the output of a phone recognition system is used to find clusters at high risk of mutual confusion. A metric is defined to compute the distance between phones. The results, using TIMIT data, show that the proposed confusion-driven phone clustering method is an attractive alternative to the approaches based on human knowledge. A hierarchical classification structure to improve phone recognition is also proposed using a discriminative weight training method. Experiments show improvements in phone recognition on the TIMIT database compared to a baseline system.
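The confusion-driven clustering idea described above can be sketched in a few lines. The distance metric and the greedy single-linkage merge below are illustrative assumptions, not the article's exact formulation, and the toy confusion matrix stands in for real phone recognizer output:

```python
# Sketch of confusion-driven phone clustering (illustrative; the article's
# exact metric and clustering procedure are not reproduced here).
# Phones that absorb a large share of each other's recognition errors are
# considered "close" and merged into one broad phonetic class.

def confusion_distance(conf, i, j):
    """1 minus the symmetrized confusion rate between phones i and j."""
    mutual = conf[i][j] + conf[j][i]
    total = sum(conf[i]) + sum(conf[j])
    return 1.0 - mutual / total if total else 1.0

def cluster_phones(conf, labels, threshold=0.9):
    """Greedy agglomerative clustering: merge the closest pair of clusters
    until no inter-cluster distance falls below the threshold."""
    clusters = [[k] for k in range(len(labels))]
    def dist(a, b):  # single linkage over member phones
        return min(confusion_distance(conf, i, j) for i in a for j in b)
    while len(clusters) > 1:
        d, x, y = min((dist(a, b), x, y)
                      for x, a in enumerate(clusters)
                      for y, b in enumerate(clusters) if x < y)
        if d >= threshold:
            break
        clusters[x] = clusters[x] + clusters[y]
        del clusters[y]
    return [[labels[k] for k in c] for c in clusters]

# Toy confusion matrix: /b/ and /p/ confuse each other often; /s/ rarely.
conf = [[80, 15, 5],   # true /b/
        [18, 77, 5],   # true /p/
        [3,   2, 95]]  # true /s/
print(cluster_phones(conf, ["b", "p", "s"]))  # → [['b', 'p'], ['s']]
```

The threshold plays the role the phonetician's judgment plays in knowledge-based class definitions: lowering it yields more, narrower classes.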

  4. Creating a System for Data-Driven Decision-Making: Applying the Principal-Agent Framework

    ERIC Educational Resources Information Center

    Wohlstetter, Priscilla; Datnow, Amanda; Park, Vicki

    2008-01-01

    The purpose of this article is to improve our understanding of data-driven decision-making strategies that are initiated at the district or system level. We apply principal-agent theory to the analysis of qualitative data gathered in a case study of 4 urban school systems. Our findings suggest educators at the school level need not only systemic…

  5. "Power quality system," a new system of quality management for globalization: towards innovation and competitive advantages.

    PubMed

    Abdul-Rahman, H; Berawi, M A

    Knowledge Management (KM) addresses the critical issues of organizational adoption, survival, and competence in the face of an increasingly changing environment. KM embodies organizational processes that seek a synergistic combination of the data- and information-processing capabilities of information and communication technologies (ICT) and the creative and innovative capacity of human beings to improve ICT. In that role, knowledge management will improve quality management, avoid or minimize the losses and weaknesses that usually come from poor performance, and increase the competitive level of the company and its ability to survive in the global marketplace. To achieve quality, all parties, including the clients, company consultants, contractors, entrepreneurs, suppliers, and the governing bodies (i.e., all involved stakeholders), need to collaborate and commit to achieving quality. The design-based organizations in major business and construction companies have to be quality driven to support healthy growth in today's competitive market. In the march towards Vision 2020 and globalization (i.e., the one-world community), the design-based organizations of many companies need superior quality management and knowledge management to anticipate changes. The implementation of a quality system such as the ISO 9000 Standards, Total Quality Management, or Quality Function Deployment (QFD) focuses the company's resources towards achieving faster and better results in the global market at less cost. To anticipate the needs of the marketplace and clients as the world and technology change, a new system, which we call the Power Quality System (PQS), has been designed. PQS is a combination of information and communication technologies (ICT) and the creative and innovative capacity of human beings to meet the challenges of the new world business and to develop high-quality products.

  6. a Representation-Driven Ontology for Spatial Data Quality Elements, with Orthoimagery as Running Example

    NASA Astrophysics Data System (ADS)

    Hangouët, J.-F.

    2015-08-01

    The many facets of what is encompassed by such an expression as "quality of spatial data" can be considered as a specific domain of reality worthy of formal description, i.e. of ontological abstraction. Various ontologies for data quality elements have already been proposed in literature. Today, the system of quality elements is most generally used and discussed according to the configuration exposed in the "data dictionary for data quality" of international standard ISO 19157. Our communication proposes an alternative view. This is founded on a perspective which focuses on the specificity of spatial data as a product: the representation perspective, where data in the computer are meant to show things of the geographic world and to be interpreted as such. The resulting ontology introduces new elements, the usefulness of which will be illustrated by orthoimagery examples.

  7. INDOOR AIR QUALITY MODEL VERSION 1.0 DOCUMENTATION

    EPA Science Inventory

    The report presents a multiroom model for estimating the impact of various sources on indoor air quality (IAQ). The model is written for use on IBM-PC and compatible microcomputers. It is easy to use with a menu-driven user interface. Data are entered using a fill-in-a-form inter...

  8. Improving Procedure Start Times and Decreasing Delays in Interventional Radiology: A Department's Quality Improvement Initiative.

    PubMed

    Villarreal, Monica C; Rostad, Bradley S; Wright, Richard; Applegate, Kimberly E

    2015-12-01

    To identify and reduce reasons for delays in procedure start times, particularly the first cases of the day, within the interventional radiology (IR) divisions of the Department of Radiology using principles of continuous quality improvement. An interdisciplinary team representative of the IR and preprocedure/postprocedure care area (PPCA) health care personnel, managers, and data analysts was formed. A standardized form was used to document both inpatient and outpatient progress through the PPCA and IR workflow in six rooms and to document reasons for delays. Data generated were used to identify key problem areas, implement improvement interventions, and monitor their effects. Project duration was 6 months. The average number of on-time starts for the first case of the day increased from 23% to 56% (P value < .01). The average number of on-time, scheduled outpatients increased from 30% to 45% (P value < .01). Patients' wait time to arrive at the treatment room once they were ready for their procedure was reduced on average by 10 minutes (P value < .01). Patient care delay duration per 100 patients was reduced from 30.3 to 21.6 hours (29% reduction). The number of patient care delays per 100 patients was reduced from 46.6 to 40.1 (17% reduction). Top reasons for delay included waiting for consent (26% of delay duration) and laboratory tests (12%). Many complex factors contribute to procedure start time delays within an IR practice. A data-driven, patient-centered, interdisciplinary team approach was effective in reducing delays in IR. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.
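Before/after percentages like the on-time start rates above are commonly compared with a two-proportion z-test. The sketch below is illustrative only: the abstract reports percentages and P values but not sample sizes, so the counts (100 cases per period) are hypothetical.

```python
# Two-proportion z-test sketch for an improvement in on-time start rates.
# The sample sizes are assumed (100 per period); the abstract reports only
# the percentages (23% -> 56%) and P < .01.
from statistics import NormalDist

def two_proportion_z(x1, n1, x2, n2):
    """z statistic and two-sided p-value for H0: p1 == p2."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                      # pooled proportion
    se = (p * (1 - p) * (1 / n1 + 1 / n2)) ** 0.5  # pooled standard error
    z = (p2 - p1) / se
    return z, 2 * (1 - NormalDist().cdf(abs(z)))

# Hypothetical counts: 23/100 on-time before, 56/100 after the intervention.
z, p = two_proportion_z(23, 100, 56, 100)
print(round(z, 2), p < 0.01)  # → 4.77 True
```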

  9. Sensor Web Dynamic Measurement Techniques and Adaptive Observing Strategies

    NASA Technical Reports Server (NTRS)

    Talabac, Stephen J.

    2004-01-01

    Sensor Web observing systems may have the potential to significantly improve our ability to monitor, understand, and predict the evolution of rapidly evolving, transient, or variable environmental features and events. This improvement will come about by integrating novel data collection techniques, new or improved instruments, emerging communications technologies and protocols, sensor mark-up languages, and interoperable planning and scheduling systems. In contrast to today's observing systems, "event-driven" sensor webs will synthesize real- or near-real time measurements and information from other platforms and then react by reconfiguring the platforms and instruments to invoke new measurement modes and adaptive observation strategies. Similarly, "model-driven" sensor webs will utilize environmental prediction models to initiate targeted sensor measurements or to use a new observing strategy. The sensor web concept contrasts with today's data collection techniques and observing system operations concepts where independent measurements are made by remote sensing and in situ platforms that do not share, and therefore cannot act upon, potentially useful complementary sensor measurement data and platform state information. This presentation describes NASA's view of event-driven and model-driven Sensor Webs and highlights several research and development activities at the Goddard Space Flight Center.
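The event-driven reconfiguration concept can be sketched as a simple publish-and-react loop. The class names, observing modes, and threshold below are invented for illustration and do not describe any actual NASA system:

```python
# Minimal sketch of an "event-driven" sensor web: when one platform's
# measurement crosses a threshold, the other registered platforms are
# re-tasked with a new observing mode. All names here are illustrative.

class Platform:
    def __init__(self, name, mode="survey"):
        self.name, self.mode = name, mode

    def retask(self, mode):
        self.mode = mode

class SensorWeb:
    def __init__(self, trigger_threshold):
        self.threshold = trigger_threshold
        self.platforms = []

    def register(self, platform):
        self.platforms.append(platform)

    def publish(self, source, value):
        """Share a measurement; react by reconfiguring other platforms."""
        if value > self.threshold:
            for p in self.platforms:
                if p is not source:
                    p.retask("targeted")  # adaptive observation strategy

web = SensorWeb(trigger_threshold=50.0)
a, b = Platform("imager"), Platform("sounder")
web.register(a); web.register(b)
web.publish(a, 72.3)   # anomalous reading triggers re-tasking
print(b.mode)          # → targeted
```

A "model-driven" variant would call `publish` with a prediction from an environmental model rather than a live measurement; the reaction logic is the same.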

  10. Compressing Test and Evaluation by Using Flow Data for Scalable Network Traffic Analysis

    DTIC Science & Technology

    2014-10-01

    test events, quality of service and other key metrics of military systems and networks are evaluated. Network data captured in standard flow formats...mentioned here. The Ozone Widget Framework (Next Century, n.d.) has proven to be very useful. Also, an extensive, clean, and optimized JavaScript ...library for visualizing many types of data can be found in D3–Data Driven Documents (Bostock, 2013). Quality of Service from Flow Two essential metrics of

  11. qPortal: A platform for data-driven biomedical research.

    PubMed

    Mohr, Christopher; Friedrich, Andreas; Wojnar, David; Kenar, Erhan; Polatkan, Aydin Can; Codrea, Marius Cosmin; Czemmel, Stefan; Kohlbacher, Oliver; Nahnsen, Sven

    2018-01-01

    Modern biomedical research aims at drawing biological conclusions from large, highly complex biological datasets. It has become common practice to make extensive use of high-throughput technologies that produce large amounts of heterogeneous data. In addition to ever-improving accuracy, methods are getting faster and cheaper, resulting in a steadily increasing need for scalable data management and easily accessible means of analysis. We present qPortal, a platform providing users with an intuitive way to manage and analyze quantitative biological data. The backend leverages a variety of concepts and technologies, such as relational databases, data stores, data models and means of data transfer, as well as front-end solutions to give users access to data management and easy-to-use analysis options. Users are empowered to conduct their experiments from the experimental design to the visualization of their results through the platform. Here, we illustrate the feature-rich portal by simulating a biomedical study based on publicly available data. We demonstrate the software's strength in supporting the entire project life cycle. The software supports project design and registration, empowers users to do all-digital project management, and finally provides means to perform analysis. We compare our approach to Galaxy, one of the most widely used scientific workflow and analysis platforms in computational biology. Application of both systems to a small case study shows the differences between a data-driven approach (qPortal) and a workflow-driven approach (Galaxy). qPortal, a one-stop-shop solution for biomedical projects, offers up-to-date analysis pipelines, quality control workflows, and visualization tools. Through intensive user interactions, appropriate data models have been developed. These models build the foundation of our biological data management system and provide possibilities to annotate data, query metadata for statistics, and support future re-analysis on high-performance computing systems via coupling of workflow management systems. Integration of project and data management as well as workflow resources in one place presents clear advantages over existing solutions.

  12. Residency Training: Quality improvement projects in neurology residency and fellowship: applying DMAIC methodology.

    PubMed

    Kassardjian, Charles D; Williamson, Michelle L; van Buskirk, Dorothy J; Ernste, Floranne C; Hunderfund, Andrea N Leep

    2015-07-14

    Teaching quality improvement (QI) is a priority for residency and fellowship training programs. However, many medical trainees have had little exposure to QI methods. The purpose of this study is to review a rigorous and simple QI methodology (define, measure, analyze, improve, and control [DMAIC]) and demonstrate its use in a fellow-driven QI project aimed at reducing the number of delayed and canceled muscle biopsies at our institution. DMAIC was utilized. The project aim was to reduce the proportion of delayed muscle biopsies to 10% or less within 24 months. Baseline data were collected for 12 months. These data were analyzed to identify root causes for muscle biopsy delays and cancellations. Interventions were developed to address the most common root causes. Performance was then remeasured for 9 months. Baseline data were collected on 97 of 120 muscle biopsies during 2013. Twenty biopsies (20.6%) were delayed. The most common causes were scheduling too many tests on the same day and lack of fasting. Interventions aimed at patient education and biopsy scheduling were implemented. The effect was to reduce the proportion of delayed biopsies to 6.6% (6/91) over the next 9 months. Familiarity with QI methodologies such as DMAIC is helpful to ensure valid results and conclusions. Utilizing DMAIC, we were able to implement simple changes and significantly reduce the number of delayed muscle biopsies at our institution. © 2015 American Academy of Neurology.
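Using the counts reported in the abstract (20/97 biopsies delayed at baseline, 6/91 after the interventions), the improvement can be checked with a Fisher exact test of the kind often used in the Analyze phase. The choice of test is an assumption here; the abstract does not state which test the authors used.

```python
# Two-sided Fisher exact test (summing tables as extreme or more extreme
# than the observed one) comparing delayed-biopsy rates before (20/97)
# and after (6/91) the intervention. Counts come from the abstract.
from math import comb

def fisher_exact(a, b, c, d):
    """Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    def p_table(x):  # hypergeometric probability of x in the top-left cell
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
    p_obs = p_table(a)
    lo, hi = max(0, row1 - (n - col1)), min(row1, col1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs + 1e-12)

# Rows: before vs after; columns: delayed vs on-time.
p = fisher_exact(20, 77, 6, 85)
print(p < 0.05)  # → True
```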

  13. Designing a Clinical Data Warehouse Architecture to Support Quality Improvement Initiatives.

    PubMed

    Chelico, John D; Wilcox, Adam B; Vawdrey, David K; Kuperman, Gilad J

    2016-01-01

    Clinical data warehouses, initially directed towards clinical research or financial analyses, are evolving to support quality improvement efforts, and must now address the quality improvement life cycle. In addition, data that are needed for quality improvement often do not reside in a single database, requiring easier methods to query data across multiple disparate sources. We created a virtual data warehouse at NewYork-Presbyterian Hospital that allowed us to bring together data from several source systems throughout the organization. We also created a framework to match the maturity of a data request in the quality improvement life cycle to the proper tools needed for each request. As projects progress through the Define, Measure, Analyze, Improve, Control stages of quality improvement, resources are properly matched to the data needs at each step. We describe the analysis and design used to create a robust model for applying clinical data warehousing to quality improvement.
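The "virtual" warehouse idea, querying across disparate sources without first copying everything into one physical store, can be sketched with SQLite's ability to attach multiple databases to one connection. SQLite stands in for the clinical source systems; the table and column names are invented for illustration and are not those of the paper's system:

```python
# Sketch of a virtual data warehouse: attach multiple sources and run a
# single query spanning them. Table/column names are hypothetical.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("ATTACH ':memory:' AS adt")  # second, separate source

# "Lab system" table in the main source.
con.execute("CREATE TABLE main.labs (patient_id INTEGER, a1c REAL)")
con.executemany("INSERT INTO main.labs VALUES (?, ?)",
                [(1, 6.1), (2, 8.4), (3, 7.9)])

# "ADT system" table in the attached source.
con.execute("CREATE TABLE adt.admissions (patient_id INTEGER, unit TEXT)")
con.executemany("INSERT INTO adt.admissions VALUES (?, ?)",
                [(1, "medicine"), (2, "medicine"), (3, "surgery")])

# One query spans both sources, as a virtual warehouse would:
# count patients with elevated A1c per unit.
rows = con.execute("""
    SELECT a.unit, COUNT(*) AS n_high
    FROM main.labs l JOIN adt.admissions a USING (patient_id)
    WHERE l.a1c > 7.0
    GROUP BY a.unit ORDER BY a.unit
""").fetchall()
print(rows)  # → [('medicine', 1), ('surgery', 1)]
```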

  14. Designing a Clinical Data Warehouse Architecture to Support Quality Improvement Initiatives

    PubMed Central

    Chelico, John D.; Wilcox, Adam B.; Vawdrey, David K.; Kuperman, Gilad J.

    2016-01-01

    Clinical data warehouses, initially directed towards clinical research or financial analyses, are evolving to support quality improvement efforts, and must now address the quality improvement life cycle. In addition, data that are needed for quality improvement often do not reside in a single database, requiring easier methods to query data across multiple disparate sources. We created a virtual data warehouse at NewYork-Presbyterian Hospital that allowed us to bring together data from several source systems throughout the organization. We also created a framework to match the maturity of a data request in the quality improvement life cycle to the proper tools needed for each request. As projects progress through the Define, Measure, Analyze, Improve, Control stages of quality improvement, resources are properly matched to the data needs at each step. We describe the analysis and design used to create a robust model for applying clinical data warehousing to quality improvement. PMID:28269833

  15. Data-Driven Hint Generation in Vast Solution Spaces: A Self-Improving Python Programming Tutor

    ERIC Educational Resources Information Center

    Rivers, Kelly; Koedinger, Kenneth R.

    2017-01-01

    To provide personalized help to students who are working on code-writing problems, we introduce a data-driven tutoring system, ITAP (Intelligent Teaching Assistant for Programming). ITAP uses state abstraction, path construction, and state reification to automatically generate personalized hints for students, even when given states that have not…
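A simplified version of data-driven hint generation can be sketched as follows. This is not ITAP's actual algorithm (which uses state abstraction, path construction, and state reification); it merely illustrates the general idea of matching a student's code state against known correct solutions and surfacing the next change:

```python
# Illustrative data-driven hint sketch (NOT ITAP's algorithm): match the
# student's code to the most similar stored correct solution, then offer
# the first differing lines as a next-step hint.
import difflib

def nearest_solution(state, solutions):
    """Pick the stored correct solution most similar to the student state."""
    return max(solutions,
               key=lambda s: difflib.SequenceMatcher(None, state, s).ratio())

def next_step_hint(state, solutions):
    target = nearest_solution(state, solutions)
    matcher = difflib.SequenceMatcher(None, state.splitlines(),
                                      target.splitlines())
    for op, i1, i2, j1, j2 in matcher.get_opcodes():
        if op != "equal":
            return target.splitlines()[j1:j2]  # lines to add or change next
    return []  # state already matches a correct solution

solutions = [
    "def area(r):\n    return 3.14159 * r * r",
    "def area(r):\n    import math\n    return math.pi * r ** 2",
]
student = "def area(r):\n    return 3.14159 * r"
print(next_step_hint(student, solutions))  # → ['    return 3.14159 * r * r']
```

A self-improving system in this spirit would add each newly observed correct submission to `solutions`, so hints improve as more student data accumulates.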

  16. Reconstruction of dynamic image series from undersampled MRI data using data-driven model consistency condition (MOCCO).

    PubMed

    Velikina, Julia V; Samsonov, Alexey A

    2015-11-01

    To accelerate dynamic MR imaging through development of a novel image reconstruction technique using low-rank temporal signal models preestimated from training data. We introduce the model consistency condition (MOCCO) technique, which utilizes temporal models to regularize reconstruction without constraining the solution to be low-rank, as is performed in related techniques. This is achieved by using a data-driven model to design a transform for compressed sensing-type regularization. The enforcement of general compliance with the model without excessively penalizing deviating signal allows recovery of a full-rank solution. Our method was compared with a standard low-rank approach utilizing model-based dimensionality reduction in phantoms and patient examinations for time-resolved contrast-enhanced angiography (CE-MRA) and cardiac CINE imaging. We studied the sensitivity of all methods to rank reduction and temporal subspace modeling errors. MOCCO demonstrated reduced sensitivity to modeling errors compared with the standard approach. Full-rank MOCCO solutions showed significantly improved preservation of temporal fidelity and aliasing/noise suppression in highly accelerated CE-MRA (acceleration up to 27) and cardiac CINE (acceleration up to 15) data. MOCCO overcomes several important deficiencies of previously proposed methods based on pre-estimated temporal models and allows high quality image restoration from highly undersampled CE-MRA and cardiac CINE data. © 2014 Wiley Periodicals, Inc.
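The core MOCCO idea, penalizing only the component of a signal that deviates from a pre-estimated temporal model rather than forcing a low-rank solution, can be illustrated with a toy temporal basis. This sketch is conceptual: it builds the model-consistency transform from training data via an SVD and omits the actual undersampled MRI reconstruction problem entirely.

```python
# Conceptual sketch of the MOCCO regularization transform (not the
# published reconstruction): a low-rank temporal basis U_r is estimated
# from training data, and (I - U_r U_r^T) penalizes only the part of a
# signal that deviates from the model.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 64)

# Training set: smooth enhancement-like curves built from 2 latent shapes.
train = np.stack([a * np.sin(2 * np.pi * t) + b * t
                  for a, b in rng.uniform(0.5, 1.5, size=(50, 2))])

# Data-driven temporal model: leading r left singular vectors.
U, s, Vt = np.linalg.svd(train.T, full_matrices=False)
r = 2
P_residual = np.eye(len(t)) - U[:, :r] @ U[:, :r].T  # model-consistency transform

in_model = 1.2 * np.sin(2 * np.pi * t) + 0.7 * t        # fits the model
deviating = in_model + np.sign(np.sin(20 * np.pi * t))  # adds off-model signal

# Signal inside the model span is barely penalized; deviating signal is.
print(np.linalg.norm(P_residual @ in_model) <
      np.linalg.norm(P_residual @ deviating))  # → True
```

In the actual method this transform regularizes a compressed sensing-type reconstruction, so deviating (full-rank) signal is penalized but not forbidden.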

  17. RECONSTRUCTION OF DYNAMIC IMAGE SERIES FROM UNDERSAMPLED MRI DATA USING DATA-DRIVEN MODEL CONSISTENCY CONDITION (MOCCO)

    PubMed Central

    Velikina, Julia V.; Samsonov, Alexey A.

    2014-01-01

    Purpose To accelerate dynamic MR imaging through development of a novel image reconstruction technique using low-rank temporal signal models pre-estimated from training data. Theory We introduce the MOdel Consistency COndition (MOCCO) technique that utilizes temporal models to regularize the reconstruction without constraining the solution to be low-rank as performed in related techniques. This is achieved by using a data-driven model to design a transform for compressed sensing-type regularization. The enforcement of general compliance with the model without excessively penalizing deviating signal allows recovery of a full-rank solution. Methods Our method was compared to standard low-rank approach utilizing model-based dimensionality reduction in phantoms and patient examinations for time-resolved contrast-enhanced angiography (CE MRA) and cardiac CINE imaging. We studied sensitivity of all methods to rank-reduction and temporal subspace modeling errors. Results MOCCO demonstrated reduced sensitivity to modeling errors compared to the standard approach. Full-rank MOCCO solutions showed significantly improved preservation of temporal fidelity and aliasing/noise suppression in highly accelerated CE MRA (acceleration up to 27) and cardiac CINE (acceleration up to 15) data. Conclusions MOCCO overcomes several important deficiencies of previously proposed methods based on pre-estimated temporal models and allows high quality image restoration from highly undersampled CE-MRA and cardiac CINE data. PMID:25399724

  18. Development of Six Sigma methodology for CNC milling process improvements

    NASA Astrophysics Data System (ADS)

    Ismail, M. N.; Rose, A. N. M.; Mohammed, N. Z.; Rashid, M. F. F. Ab

    2017-10-01

    Quality and productivity play an important role in any organization, especially in manufacturing sectors seeking greater profit, which leads to the success of a company. This paper reports a work improvement project at Kolej Kemahiran Tinggi MARA Kuantan. It involves identifying problems in the production of the “Khufi” product and proposing an effective framework to improve the current situation. Based on observation and data collection on the work-in-progress (WIP) product, the major problem was identified as functional: the parts cannot be assembled properly because a dimension of the product is out of specification. Six Sigma was used as the methodology to study and improve the identified problems. Six Sigma is a highly statistical, data-driven approach to solving complex business problems. It uses a methodical five-phase approach, define, measure, analyze, improve, and control (DMAIC), to understand the process and the variables that affect it so that the process can be optimized. Finally, the root cause of and solution to the “Khufi” production problem were identified and implemented, after which the product successfully met the fitting specification.
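Out-of-specification dimensions like the one described above are typically quantified in the Measure/Analyze phases of DMAIC with process-capability indices. The sketch below computes Cp and Cpk; the measurements and specification limits are made up for illustration and are not from the paper:

```python
# Process-capability indices (Cp, Cpk) of the kind used in DMAIC to judge
# whether a machined dimension stays within specification.
# Data and specification limits below are hypothetical.
from statistics import mean, stdev

def capability(samples, lsl, usl):
    """Cp compares spec width to process spread; Cpk also penalizes a
    process mean that drifts toward one specification limit."""
    mu, sigma = mean(samples), stdev(samples)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Hypothetical milled dimension, specification 25.00 +/- 0.05 mm.
dims = [25.01, 24.99, 25.02, 25.00, 24.98, 25.01, 25.03, 25.00]
cp, cpk = capability(dims, lsl=24.95, usl=25.05)
print(cp > 1.0, cpk > 1.0)  # → True False
```

Here Cp exceeds 1 but Cpk does not: the spread alone fits the spec, yet the drifted mean still produces out-of-specification parts, which is exactly the kind of diagnosis DMAIC's Analyze phase is meant to surface.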

  19. Exemplar pediatric collaborative improvement networks: achieving results.

    PubMed

    Billett, Amy L; Colletti, Richard B; Mandel, Keith E; Miller, Marlene; Muething, Stephen E; Sharek, Paul J; Lannon, Carole M

    2013-06-01

    A number of pediatric collaborative improvement networks have demonstrated improved care and outcomes for children. Regionally, Cincinnati Children's Hospital Medical Center Physician Hospital Organization has sustained key asthma processes, substantially increased the percentage of their asthma population receiving "perfect care," and implemented an innovative pay-for-performance program with a large commercial payor based on asthma performance measures. The California Perinatal Quality Care Collaborative uses its outcomes database to improve care for infants in California NICUs. It has achieved reductions in central line-associated blood stream infections (CLABSI), increased breast-milk feeding rates at hospital discharge, and is now working to improve delivery room management. Solutions for Patient Safety (SPS) has achieved significant improvements in adverse drug events and surgical site infections across all 8 Ohio children's hospitals, with 7700 fewer children harmed and >$11.8 million in avoided costs. SPS is now expanding nationally, aiming to eliminate all events of serious harm at children's hospitals. National collaborative networks include ImproveCareNow, which aims to improve care and outcomes for children with inflammatory bowel disease. Reliable adherence to Model Care Guidelines has produced improved remission rates without using new medications and a significant increase in the proportion of Crohn disease patients not taking prednisone. Data-driven collaboratives of the Children's Hospital Association Quality Transformation Network initially focused on CLABSI in PICUs. By September 2011, they had prevented an estimated 2964 CLABSI, saving 355 lives and $103,722,423. Subsequent improvement efforts include CLABSI reductions in additional settings and populations.

  20. The early effects of Medicare's mandatory hospital pay-for-performance program.

    PubMed

    Ryan, Andrew M; Burgess, James F; Pesko, Michael F; Borden, William B; Dimick, Justin B

    2015-02-01

    To evaluate the impact of hospital value-based purchasing (HVBP) on clinical quality and patient experience during its initial implementation period (July 2011-March 2012). Hospital-level clinical quality and patient experience data from Hospital Compare from up to 5 years before and three quarters after HVBP was initiated. Acute care hospitals were exposed to HVBP by mandate while critical access hospitals and hospitals located in Maryland were not exposed. We performed a difference-in-differences analysis, comparing performance on 12 incentivized clinical process and 8 incentivized patient experience measures between hospitals exposed to the program and a matched comparison group of nonexposed hospitals. We also evaluated whether hospitals that were ultimately exposed to HVBP may have anticipated the program by improving quality in advance of its introduction. Difference-in-differences estimates indicated that hospitals that were exposed to HVBP did not show greater improvement for either the clinical process or patient experience measures during the program's first implementation period. Estimates from our preferred specification showed that HVBP was associated with a 0.51 percentage point reduction in composite quality for the clinical process measures (p > .10, 95 percent CI: -1.37, 0.34) and a 0.30 percentage point reduction in composite quality for the patient experience measures (p > .10, 95 percent CI: -0.79, 0.19). We found some evidence that hospitals improved performance on clinical process measures prior to the start of HVBP, but no evidence of this phenomenon for the patient experience measures. The timing of the financial incentives in HVBP was not associated with improved quality of care. It is unclear whether improvement for the clinical process measures prior to the start of HVBP was driven by the expectation of the program or was the result of other factors. © Health Research and Educational Trust.
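The difference-in-differences estimator used in the study reduces to a simple contrast: the change over time among exposed hospitals minus the change among the matched comparison group. The numbers below are hypothetical, chosen only to echo the sign of the reported point estimate, and are not from the study's data:

```python
# Minimal difference-in-differences sketch: the treatment effect is the
# exposed group's pre/post change minus the comparison group's change.
# All numbers are hypothetical.
def diff_in_diff(exposed_pre, exposed_post, control_pre, control_post):
    return (exposed_post - exposed_pre) - (control_post - control_pre)

# Composite clinical-process quality (percentage points), made-up data:
effect = diff_in_diff(exposed_pre=92.0, exposed_post=93.1,
                      control_pre=91.5, control_post=93.1)
print(round(effect, 2))  # → -0.5 (exposed hospitals improved less)
```

The study's full specification additionally matches hospitals and checks for anticipation effects (pre-program improvement), which this two-by-two contrast cannot capture.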

  1. Evaluating Land-Atmosphere Interactions with the North American Soil Moisture Database

    NASA Astrophysics Data System (ADS)

    Giles, S. M.; Quiring, S. M.; Ford, T.; Chavez, N.; Galvan, J.

    2015-12-01

    The North American Soil Moisture Database (NASMD) is a high-quality observational soil moisture database that was developed to study land-atmosphere interactions. It includes over 1,800 monitoring stations in the United States, Canada and Mexico. Soil moisture data are collected from multiple sources, quality controlled and integrated into an online database (soilmoisture.tamu.edu). The period of record varies substantially, and only a few of these stations have an observation record extending back into the 1990s. Daily soil moisture observations have been quality controlled using the North American Soil Moisture Database QAQC algorithm. The database is designed to facilitate observationally-driven investigations of land-atmosphere interactions, validation of the accuracy of soil moisture simulations in global land surface models, satellite calibration/validation for SMOS and SMAP, and an improved understanding of how soil moisture influences climate on seasonal to interannual timescales. This paper provides some examples of how the NASMD has been utilized to enhance understanding of land-atmosphere interactions in the U.S. Great Plains.
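    The NASMD QAQC algorithm itself is not reproduced here, but automated daily soil moisture checks of this kind typically combine a plausible-range test with a spike test. An illustrative sketch, where the thresholds are assumptions rather than the database's actual rules:

```python
# Illustrative soil moisture QC in the spirit of an automated QAQC pass.
# Thresholds (0-0.6 m3/m3 range, 0.1 m3/m3 daily step) are assumptions.

def qc_flags(series, lo=0.0, hi=0.6, max_step=0.1):
    """Flag values outside a plausible volumetric water content range,
    or jumping more than max_step between consecutive daily values."""
    flags = []
    for i, v in enumerate(series):
        if not (lo <= v <= hi):
            flags.append("range")
        elif i > 0 and abs(v - series[i - 1]) > max_step:
            flags.append("spike")
        else:
            flags.append("ok")
    return flags

daily = [0.25, 0.26, 0.45, 0.46, 0.80]  # hypothetical daily readings
print(qc_flags(daily))  # ['ok', 'ok', 'spike', 'ok', 'range']
```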

  2. Lower extremity EMG-driven modeling of walking with automated adjustment of musculoskeletal geometry

    PubMed Central

    Meyer, Andrew J.; Patten, Carolynn

    2017-01-01

    Neuromusculoskeletal disorders affecting walking ability are often difficult to manage, in part due to limited understanding of how a patient’s lower extremity muscle excitations contribute to the patient’s lower extremity joint moments. To assist in the study of these disorders, researchers have developed electromyography (EMG) driven neuromusculoskeletal models utilizing scaled generic musculoskeletal geometry. While these models can predict individual muscle contributions to lower extremity joint moments during walking, the accuracy of the predictions can be hindered by errors in the scaled geometry. This study presents a novel EMG-driven modeling method that automatically adjusts surrogate representations of the patient’s musculoskeletal geometry to improve prediction of lower extremity joint moments during walking. In addition to commonly adjusted neuromusculoskeletal model parameters, the proposed method adjusts model parameters defining muscle-tendon lengths, velocities, and moment arms. We evaluated our EMG-driven modeling method using data collected from a high-functioning hemiparetic subject walking on an instrumented treadmill at speeds ranging from 0.4 to 0.8 m/s. EMG-driven model parameter values were calibrated to match inverse dynamic moments for five degrees of freedom in each leg while keeping musculoskeletal geometry close to that of an initial scaled musculoskeletal model. We found that our EMG-driven modeling method incorporating automated adjustment of musculoskeletal geometry predicted net joint moments during walking more accurately than did the same method without geometric adjustments. Geometric adjustments improved moment prediction errors by 25% on average and up to 52%, with the largest improvements occurring at the hip. Predicted adjustments to musculoskeletal geometry were comparable to errors reported in the literature between scaled generic geometric models and measurements made from imaging data. 
Our results demonstrate that with appropriate experimental data, joint moment predictions for walking generated by an EMG-driven model can be improved significantly when automated adjustment of musculoskeletal geometry is included in the model calibration process. PMID:28700708
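    The calibration described above adjusts many neuromusculoskeletal and geometric parameters with a full optimizer. As a stripped-down illustration of the underlying idea of fitting model output to inverse-dynamics moments, a single hypothetical moment-arm scale factor can be estimated in closed form by least squares:

```python
# Stripped-down illustration of moment-matching calibration: fit one
# scale factor s minimizing sum (s*m_model - m_id)^2. The paper
# calibrates many parameters with a full optimizer; data are invented.

def fit_moment_arm_scale(model_moments, inverse_dyn_moments):
    """Closed-form least-squares scale: s = sum(m*mid) / sum(m*m)."""
    num = sum(m * mid for m, mid in zip(model_moments, inverse_dyn_moments))
    den = sum(m * m for m in model_moments)
    return num / den

model = [10.0, 20.0, 15.0]   # EMG-driven moments with generic geometry (N*m)
target = [12.0, 24.0, 18.0]  # inverse-dynamics moments (N*m)
print(fit_moment_arm_scale(model, target))  # 1.2
```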

  3. Improving preventive health care in Aboriginal and Torres Strait Islander primary care settings.

    PubMed

    Bailie, Jodie; Matthews, Veronica; Laycock, Alison; Schultz, Rosalie; Burgess, Christopher P; Peiris, David; Larkins, Sarah; Bailie, Ross

    2017-07-14

    Like other colonised populations, Indigenous Australians experience poorer health outcomes than non-Indigenous Australians. Preventable chronic disease is the largest contributor to the health differential between Indigenous and non-Indigenous Australians, but recommended best-practice preventive care is not consistently provided to Indigenous Australians. Significant improvement in health care delivery could be achieved through identifying and minimising evidence-practice gaps. Our objective was to use clinical audit data to create a framework of the priority evidence-practice gaps, strategies to address them, and drivers to support these strategies in the delivery of recommended preventive care. De-identified preventive health clinical audit data from 137 primary health care (PHC) centres in five jurisdictions were analysed (n = 17,108 audited records of well adults with no documented major chronic disease; 367 system assessments; 2005-2014), together with stakeholder survey data relating to interpretation of these data, using a mixed-methods approach (n = 152 responses collated in 2015-16). Stakeholders surveyed included clinicians, managers, policy officers, continuous quality improvement (CQI) facilitators and academics. Priority evidence-practice gaps and associated barriers, enablers and strategies to address the gaps were identified and reported back through two stages of consultation. Further analysis and interpretation of these data were used to develop a framework of strategies and drivers for health service improvement. Stakeholder-identified priorities were: following up abnormal test results; completing cardiovascular risk assessments; timely recording of results; recording enquiries about living conditions, family relationships and substance use; providing support for clients identified with emotional wellbeing risk; enhancing systems to enable team function and continuity of care. 
Drivers identified for improving care in these areas included: strong Indigenous participation in the PHC service; appropriate team structure and function to support preventive care; meaningful use of data to support quality of care and CQI; and corporate support functions and structures. The framework should be useful for guiding development and implementation of barrier-driven, tailored interventions for primary health care service delivery and policy contexts, and for guiding further research. While specific strategies to improve the quality of preventive care need to be tailored to local context, these findings reinforce the requirement for multi-level action across the system. The framework and findings may be useful for similar purposes in other parts of the world, with appropriate attention to context in different locations.

  4. Improving opioid safety practices in primary care: protocol for the development and evaluation of a multifaceted, theory-informed pilot intervention for healthcare providers.

    PubMed

    Leece, Pamela; Buchman, Daniel Z; Hamilton, Michael; Timmings, Caitlyn; Shantharam, Yalnee; Moore, Julia; Furlan, Andrea D

    2017-04-26

    In North America, drug overdose deaths are reaching unprecedented levels, largely driven by increasing prescription opioid-related deaths. Despite the development of several opioid guidelines, prescribing behaviours still contribute to poor patient outcomes and societal harm. Factors at the provider and system level may hinder or facilitate the application of evidence-based guidelines; interventions designed to address such factors are needed. Using implementation science and behaviour change theory, we have planned the development and evaluation of a comprehensive Opioid Self-Assessment Package, designed to increase adherence to the Canadian Opioid Guideline among family physicians. The intervention uses practical educational and self-assessment tools to provide prescribers with feedback on their current knowledge and practices, and resources to improve their practice. The evaluation approach uses a pretest and post-test design and includes both quantitative and qualitative methods at baseline and 6 months. We will recruit a purposive sample of approximately 10 family physicians in Ontario from diverse practice settings, who currently treat patients with long-term opioid therapy for chronic pain. Quantitative data will be analysed using basic descriptive statistics, and qualitative data will be analysed using the Framework Method. The University Health Network Research Ethics Board approved this study. Dissemination plan includes publications, conference presentations and brief stakeholder reports. This evidence-informed, theory-driven intervention has implications for national application of opioid quality improvement tools in primary care settings. We are engaging experts and end users in advisory and stakeholder roles throughout our project to increase its national relevance, application and sustainability. The performance measures could be used as the basis for health system quality improvement indicators to monitor opioid prescribing. 
Additionally, the methods and approach used in this study could be adapted for other opioid guidelines, or applied to other areas of preventive healthcare and clinical guideline implementation processes. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  5. Process-driven selection of information systems for healthcare

    NASA Astrophysics Data System (ADS)

    Mills, Stephen F.; Yeh, Raymond T.; Giroir, Brett P.; Tanik, Murat M.

    1995-05-01

    Integration of networking and data management technologies such as PACS, RIS and HIS into a healthcare enterprise in a clinically acceptable manner is a difficult problem. Data within such a facility are generally managed via a combination of manual hardcopy systems and proprietary, special-purpose data processing systems. Process modeling techniques have been successfully applied to engineering and manufacturing enterprises, but have not generally been applied to service-based enterprises such as healthcare facilities. The use of process modeling techniques can provide guidance for the placement, configuration and usage of PACS and other informatics technologies within the healthcare enterprise, and thus improve the quality of healthcare. Initial process modeling activities conducted within the Pediatric ICU at Children's Medical Center in Dallas, Texas are described. The ongoing development of a full enterprise-level model for the Pediatric ICU is also described.

  6. Assurance of Myeloid Growth Factor Administration in an Infusion Center: Pilot Quality Improvement Initiative.

    PubMed

    Ramirez, Pamela Maree; Peterson, Barry; Holtshopple, Christine; Borja, Kristina; Torres, Vincent; Valdivia-Peppers, Lucille; Harriague, Julio; Joe, Melanie D

    2017-12-01

    Four incident reports involving missed doses of myeloid growth factors (MGFs) triggered the need for an outcome-driven initiative. From March 1, 2015, to February 29, 2016, at University of California Irvine Health Chao Infusion Center, 116 of 3,300 MGF doses were missed (3.52%), including pegfilgrastim, filgrastim, and sargramostim. We hypothesized that with the application of Lean Six Sigma methodology, we would achieve our primary objective of reducing the number of missed MGF doses to < 0.5%. This quality improvement initiative was conducted at Chao Infusion Center as part of a Lean Six Sigma Green Belt Certification Program. Therefore, Lean Six Sigma principles and tools were used throughout each phase of the project. Retrospective and prospective medical record reviews and data analyses were performed to evaluate the extent of the identified problem and impact of the process changes. Improvements included systems applications, practice changes, process modifications, and safety-net procedures. Preintervention, 24 missed doses (20.7%) required patient supportive care measures, resulting in increased hospital costs and decreased quality of care. Postintervention, from June 8, 2016, to August 7, 2016, zero of 489 MGF doses were missed after 2 months of intervention (P < .001). Chao Infusion Center reduced missed doses from 3.52% to 0%, reaching the goal of < 0.5%. The establishment of simplified and standardized processes with safety checks for error prevention increased quality of care. Lean Six Sigma methodology can be applied by other institutions to produce positive outcomes and implement similar practice changes.

  7. Exploring Data-Driven Decision-Making in the Field: How Faculty Use Data and Other Forms of Information to Guide Instructional Decision-Making. WCER Working Paper No. 2014-3

    ERIC Educational Resources Information Center

    Hora, Matthew T.; Bouwma-Gearhart, Jana; Park, Hyoung Joon

    2014-01-01

    A defining characteristic of current U.S. educational policy is the use of data to inform decisions about resource allocation, teacher hiring, and curriculum and instruction. Perhaps the biggest challenge to data-driven decision making (DDDM) is that data use alone does not automatically result in improved teaching and learning. Research indicates…

  8. Improving the Quality of Health Care Services for Adolescents, Globally: A Standards-Driven Approach

    PubMed Central

    Nair, Manisha; Baltag, Valentina; Bose, Krishna; Boschi-Pinto, Cynthia; Lambrechts, Thierry; Mathai, Matthews

    2015-01-01

    Purpose: The World Health Organization (WHO) undertook an extensive and elaborate process to develop eight Global Standards to improve quality of health care services for adolescents. The objectives of this article are to present the Global Standards and their method of development. Methods: The Global Standards were developed through a four-stage process: (1) conducting needs assessment; (2) developing the Global Standards and their criteria; (3) expert consultations; and (4) assessing their usability. Needs assessment involved conducting a meta-review of systematic reviews and two online global surveys in 2013, one with primary health care providers and another with adolescents. The Global Standards were developed based on the needs assessment in conjunction with analysis of 26 national standards from 25 countries. The final document was reviewed by experts from the World Health Organization regional and country offices, governments, academia, nongovernmental organizations, and development partners. The standards were subsequently tested in Benin and in a regional expert consultation of Latin America and Caribbean countries for their usability. Results: The process resulted in the development of eight Global Standards and 79 criteria for measuring them: (1) adolescents' health literacy; (2) community support; (3) appropriate package of services; (4) providers' competencies; (5) facility characteristics; (6) equity and nondiscrimination; (7) data and quality improvement; and (8) adolescents' participation. Conclusions: The eight standards are intended to act as benchmarks against which quality of health care provided to adolescents could be compared. Health care services can use the standards as part of their internal quality assurance mechanisms or as part of an external accreditation process. PMID:26299556

  9. 2010-2011 Performance of the AirNow Satellite Data Processor

    NASA Astrophysics Data System (ADS)

    Pasch, A. N.; DeWinter, J. L.; Haderman, M. D.; van Donkelaar, A.; Martin, R. V.; Szykman, J.; White, J. E.; Dickerson, P.; Zahn, P. H.; Dye, T. S.

    2012-12-01

    The U.S. Environmental Protection Agency's (EPA) AirNow program provides maps of real-time hourly Air Quality Index (AQI) conditions and daily AQI forecasts nationwide (http://www.airnow.gov). The public uses these maps to make health-based decisions. The usefulness of the AirNow air quality maps depends on the accuracy and spatial coverage of air quality measurements. Currently, the maps use only ground-based measurements, which have significant gaps in coverage in some parts of the United States. As a result, contoured AQI levels have high uncertainty in regions far from monitors. To improve the usefulness of air quality maps, scientists at EPA, Dalhousie University, and Sonoma Technology, Inc. have been working in collaboration with the National Aeronautics and Space Administration (NASA) and the National Oceanic and Atmospheric Administration (NOAA) to incorporate satellite-estimated surface PM2.5 concentrations into the maps via the AirNow Satellite Data Processor (ASDP). These satellite estimates are derived using NASA/NOAA satellite aerosol optical depth (AOD) retrievals and GEOS-Chem modeled ratios of surface PM2.5 concentrations to AOD. GEOS-Chem is a three-dimensional chemical transport model for atmospheric composition driven by meteorological input from the Goddard Earth Observing System (GEOS). The ASDP can fuse multiple PM2.5 concentration data sets to generate AQI maps with improved spatial coverage. The goal of ASDP is to provide more detailed AQI information in monitor-sparse locations and augment monitor-dense locations with more information. We will present a statistical analysis for 2010-2011 of the ASDP predictions of PM2.5, focusing on performance at validation sites. In addition, we will present several case studies evaluating the ASDP's performance for multiple regions and seasons, focusing specifically on days when large spatial gradients in AQI and wildfire smoke impact were observed.
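    The satellite estimate described above amounts to scaling the retrieved AOD by a modeled surface-PM2.5-to-AOD ratio. A minimal sketch with hypothetical values:

```python
# Sketch of the satellite surface PM2.5 estimate described above:
# surface PM2.5 ~= satellite AOD x modeled (PM2.5 / AOD) ratio.
# All numbers below are hypothetical, for illustration only.

def surface_pm25(aod, model_pm25, model_aod):
    """Scale retrieved AOD by a GEOS-Chem-style PM2.5/AOD ratio."""
    eta = model_pm25 / model_aod  # ug/m3 per unit AOD
    return aod * eta

# Hypothetical retrieval: AOD = 0.5 where the model simulates
# 12 ug/m3 surface PM2.5 at AOD 0.25, i.e. a ratio of 48.
print(surface_pm25(0.5, 12.0, 0.25))  # 24.0 ug/m3
```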

  10. Can Western quality improvement methods transform the Russian health care system?

    PubMed

    Tillinghast, S J

    1998-05-01

    The Russian health care system largely remains the same system that was in place during the existence of the Soviet Union. It is almost entirely state owned and operated, although ownership and management have devolved from the central government to the oblast (province). The ZdravReform (Health Reform) Program (ZRP) in Russia, which began in 1993, included the goal of improving the quality and cost-effectiveness of the health care system. Work on introducing continuous quality improvement (CQI), evidence-based practice guidelines, and indicators of quality was conducted in 1995-1996. INTRODUCING EVIDENCE-BASED MEDICINE: As a result of the poor quality of Russian-language medical journals and the inability to gain access to the knowledge available in Western medical literature, Russian medical practices have not kept up with the rapid evolution of evidence-based medical practice that has begun transforming Western medicine. A number of evidence-based clinical practice guidelines were translated and disseminated to Russian-speaking physicians working in facilities participating in ZRP in Russia and Central Asia. Given the limitations of existing measures of the quality of care, indicators were developed for participating ambulatory polyclinics in several oblasts in Siberia. Russian physicians responsible for quality of care for their respective oblasts formed a working group to develop the indicators. A clinical information system that would provide automated collection and analysis of the indicator data (as well as additional patient record information) was also developed. CQI activities, entailing a multidisciplinary, participatory team approach, were conducted in four oblasts in western Siberia. Projects addressed the management of community-acquired pneumonia and reduction of length of stay after myocardial infarction (MI). 
One of the oblasts provided an example of a home-grown evidence-based protocol for post-MI care, which was adopted in the other three oblasts. Evidence-based medicine is critically needed to improve the quality of research and publications, medical education, and medical practice. Physicians everywhere are data driven; they change their practices when convinced by good data. The key to successful introduction of evidence-based medicine is understanding the fundamentals of good scientific method as applied to medicine. The Russian health care system's experience in reporting process and outcomes data that resemble our modern indicators to higher authorities can provide the basis for accurate and valid measures of quality. In contrast with American expectations that a significant cultural change in an organization could take years, even with great effort, Russian physicians and other clinicians rapidly assimilated the new concepts of QI and put them to use. More on-site assistance by international medical consultants will still be needed for several years to hasten the process of change and ensure that it does not become stalled.

  11. An urban observatory for quantifying phosphorus and suspended solid loads in combined natural and stormwater conveyances.

    PubMed

    Melcher, Anthony A; Horsburgh, Jeffery S

    2017-06-01

    Water quality in urban streams and stormwater systems is highly dynamic, both spatially and temporally, and can change drastically during storm events. Infrequent grab samples commonly collected for estimating pollutant loadings are insufficient to characterize water quality in many urban water systems. In situ water quality measurements are being used as surrogates for continuous pollutant load estimates; however, relatively few studies have tested the validity of surrogate indicators in urban stormwater conveyances. In this paper, we describe an observatory aimed at demonstrating the infrastructure required for surrogate monitoring in urban water systems and for capturing the dynamic behavior of stormwater-driven pollutant loads. We describe the instrumentation of multiple, autonomous water quality and quantity monitoring sites within an urban observatory. We also describe smart and adaptive sampling procedures implemented to improve data collection for developing surrogate relationships and for capturing the temporal and spatial variability of pollutant loading events in urban watersheds. Results show that the observatory is able to capture short-duration storm events within multiple catchments and, through inter-site communication, sampling efforts can be synchronized across multiple monitoring sites.
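    Once a site-specific surrogate relationship is calibrated, continuous sensor readings can be converted to concentrations and integrated with discharge to estimate event loads. An illustrative sketch; the linear turbidity-to-TSS relationship and all numbers here are hypothetical, not the observatory's calibrations:

```python
# Minimal sketch of surrogate-based pollutant load estimation:
# a hypothetical linear turbidity -> TSS relationship converts
# continuous readings to concentrations, which are integrated
# against discharge over a storm event.

def tss_from_turbidity(ntu, slope=1.8, intercept=2.0):
    """Hypothetical site-calibrated surrogate: TSS (mg/L) from turbidity (NTU)."""
    return slope * ntu + intercept

def event_load_kg(turbidity_ntu, discharge_m3s, dt_s=300):
    """Sum c*Q*dt over an event: (mg/L) * (m3/s) * s = g, then -> kg."""
    grams = sum(tss_from_turbidity(n) * q * dt_s
                for n, q in zip(turbidity_ntu, discharge_m3s))
    return grams / 1000.0

ntu = [10.0, 40.0, 25.0]  # 5-minute turbidity readings during a storm
q = [0.5, 1.2, 0.8]       # concurrent discharge, m3/s
print(event_load_kg(ntu, q))  # ~40.92 kg for this short event
```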

  12. The Impact of Data-Based Science Instruction on Standardized Test Performance

    NASA Astrophysics Data System (ADS)

    Herrington, Tia W.

    Increased teacher accountability efforts have resulted in the use of data to improve student achievement. This study addressed teachers' inconsistent use of data-driven instruction in middle school science. Evidence of the impact of data-based instruction on student achievement and school and district practices has been well documented by researchers. In science, less information has been available on teachers' use of data for classroom instruction. Drawing on data-driven decision making theory, the purpose of this study was to examine whether data-based instruction impacted performance on the science Criterion Referenced Competency Test (CRCT) and to explore the factors that impeded its use by a purposeful sample of 12 science teachers at a data-driven school. The research questions addressed in this study included understanding: (a) the association between student performance on the science portion of the CRCT and data-driven instruction professional development, (b) middle school science teachers' perception of the usefulness of data, and (c) the factors that hindered the use of data for science instruction. This study employed a mixed methods sequential explanatory design. Data collected included 8th grade CRCT data, survey responses, and individual teacher interviews. A chi-square test revealed no improvement in the CRCT scores following the implementation of professional development on data-driven instruction (χ²(1) = 0.183, p = .67). Results from surveys and interviews revealed that teachers used data to inform their instruction and identified time as the major hindrance to its use. Implications for social change include the development of lesson plans that will empower science teachers to deliver data-based instruction and students to achieve identified academic goals.
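    The chi-square comparison reported above can be illustrated for a 2x2 table of outcome counts. The counts below are hypothetical pass/fail tallies, not the study's CRCT data:

```python
# Pearson chi-square test of independence for a 2x2 table, the kind of
# before/after comparison the study describes. Counts are hypothetical.

def chi_square_2x2(a, b, c, d):
    """Chi-square statistic (1 df) for the table [[a, b], [c, d]],
    via the shortcut n*(ad - bc)^2 / (row and column totals)."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical pass/fail counts before vs. after professional development.
stat = chi_square_2x2(60, 40, 63, 37)
print(round(stat, 3))  # small statistic -> no significant difference
```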

  13. Computing and Applying Atomic Regulons to Understand Gene Expression and Regulation

    PubMed Central

    Faria, José P.; Davis, James J.; Edirisinghe, Janaka N.; Taylor, Ronald C.; Weisenhorn, Pamela; Olson, Robert D.; Stevens, Rick L.; Rocha, Miguel; Rocha, Isabel; Best, Aaron A.; DeJongh, Matthew; Tintle, Nathan L.; Parrello, Bruce; Overbeek, Ross; Henry, Christopher S.

    2016-01-01

    Understanding gene function and regulation is essential for the interpretation, prediction, and ultimate design of cell responses to changes in the environment. An important step toward meeting the challenge of understanding gene function and regulation is the identification of sets of genes that are always co-expressed. These gene sets, Atomic Regulons (ARs), represent fundamental units of function within a cell and could be used to associate genes of unknown function with cellular processes and to enable rational genetic engineering of cellular systems. Here, we describe an approach for inferring ARs that leverages large-scale expression data sets, gene context, and functional relationships among genes. We computed ARs for Escherichia coli based on 907 gene expression experiments and compared our results with gene clusters produced by two prevalent data-driven methods: hierarchical clustering and k-means clustering. We compared ARs and purely data-driven gene clusters to the curated set of regulatory interactions for E. coli found in RegulonDB, showing that ARs are more consistent with gold standard regulons than are data-driven gene clusters. We further examined the consistency of ARs and data-driven gene clusters in the context of gene interactions predicted by Context Likelihood of Relatedness (CLR) analysis, finding that the ARs show better agreement with CLR predicted interactions. We determined the impact of increasing amounts of expression data on AR construction and found that while more data improve ARs, it is not necessary to use the full set of gene expression experiments available for E. coli to produce high quality ARs. In order to explore the conservation of co-regulated gene sets across different organisms, we computed ARs for Shewanella oneidensis, Pseudomonas aeruginosa, Thermus thermophilus, and Staphylococcus aureus, each of which represents increasing degrees of phylogenetic distance from E. coli. 
Comparison of the organism-specific ARs showed that the consistency of AR gene membership correlates with phylogenetic distance, but there is clear variability in the regulatory networks of closely related organisms. As large scale expression data sets become increasingly common for model and non-model organisms, comparative analyses of atomic regulons will provide valuable insights into fundamental regulatory modules used across the bacterial domain. PMID:27933038
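    Comparing AR gene membership across organisms comes down to measuring overlap between gene sets. One common overlap measure (an illustrative choice, not necessarily the paper's exact metric) is the Jaccard index:

```python
# Jaccard overlap between two gene sets, as one way to quantify the
# cross-organism AR membership consistency discussed above.
# Gene names below are illustrative placeholders.

def jaccard(set_a, set_b):
    """|A intersect B| / |A union B| for two gene sets."""
    a, b = set(set_a), set(set_b)
    return len(a & b) / len(a | b) if (a or b) else 1.0

ar_ecoli = {"lacZ", "lacY", "lacA"}
ar_other = {"lacZ", "lacY", "melA"}
print(jaccard(ar_ecoli, ar_other))  # 0.5: two shared genes out of four total
```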

  14. Digital control and data acquisition for high-value GTA welding

    NASA Astrophysics Data System (ADS)

    George, T. G.; Franco-Ferreira, E. A.

    Electric power for the Cassini space probe will be provided by radioisotope thermoelectric generators (RTGs) thermally driven by General-Purpose Heat Source (GPHS) modules. Each GPHS module contains four 150-g pellets of 238PuO2, and each of the four pellets is encapsulated within a thin-wall iridium-alloy shell. GTA girth welding of these capsules is performed at Los Alamos National Laboratory (LANL) on an automated, digitally-controlled welding system. Baseline design considerations for system automation and strategies employed to maximize process yield, improve process consistency, and generate required quality assurance information are discussed. Design of the automated girth welding system was driven by a number of factors which militated in favor of precise parametric control and data acquisition. Foremost among these factors was the extraordinary value of the capsule components. In addition, DOE order 5700.6B, which took effect on 23 Sep. 1986, required that all operations adhere to strict levels of process quality assurance. A detailed technical specification for the GPHS welding system was developed on the basis of a joint LANL/Westinghouse Savannah River Company (WSRC) design effort. After a competitive bidding process, Jetline Engineering, Inc., of Irvine, California, was selected as the system manufacturer. During the period over which four identical welding systems were fabricated, very close liaison was maintained between the LANL/WSRC technical representatives and the vendor. The level of rapport was outstanding, and the end result was the 1990 delivery of four systems that met or exceeded all specification requirements.

  15. A status review of photovoltaic power conversion equipment reliability, safety, and quality assurance protocols

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hacke, Peter; Lokanath, Sumanth; Williams, Paul

    Data indicate that the inverter is the element of the photovoltaic plant that has the highest number of service calls and the greatest operation and maintenance cost burden. This paper describes the projects and relevant background needed in developing design qualification standards that would serve to establish a minimum level of reliability, along with a review of photovoltaic inverter quality and safety standards, most of which are in their infancy. We compare stresses and levels for accelerated testing of inverters proposed in the standard drafts, and those proposed by manufacturers and purchasers of inverters. We also review bases for the methods, stress types, and stress levels for durability testing of key inverter components. Many of the test protocols appear to need more comprehensive inclusion of stress factors existing in the natural environment such as wind driven rain, dust, and grid disturbances. Further understanding of how temperature, humidity ingress, and voltage bias affect the inverters and their components is also required. We provide data indicating inconsistent quality of the inverters and the durability of components leading to greater cost for the photovoltaic plant operator. Accordingly, the recommendation for data collection within quality standards for obtaining cost of ownership metrics is made. Design validation testing using realistic operation, environmental, and connection conditions, including under end-use field conditions with feedback for continuous improvement is recommended for inclusion within a quality standard.

  16. A status review of photovoltaic power conversion equipment reliability, safety, and quality assurance protocols

    DOE PAGES

    Hacke, Peter; Lokanath, Sumanth; Williams, Paul; ...

    2017-10-10

    Data indicate that the inverter is the element of the photovoltaic plant that has the highest number of service calls and the greatest operation and maintenance cost burden. This paper describes the projects and relevant background needed in developing design qualification standards that would serve to establish a minimum level of reliability, along with a review of photovoltaic inverter quality and safety standards, most of which are in their infancy. We compare stresses and levels for accelerated testing of inverters proposed in the standard drafts, and those proposed by manufacturers and purchasers of inverters. We also review bases for the methods, stress types, and stress levels for durability testing of key inverter components. Many of the test protocols appear to need more comprehensive inclusion of stress factors existing in the natural environment such as wind driven rain, dust, and grid disturbances. Further understanding of how temperature, humidity ingress, and voltage bias affect the inverters and their components is also required. We provide data indicating inconsistent quality of the inverters and the durability of components leading to greater cost for the photovoltaic plant operator. Accordingly, the recommendation for data collection within quality standards for obtaining cost of ownership metrics is made. Design validation testing using realistic operation, environmental, and connection conditions, including under end-use field conditions with feedback for continuous improvement is recommended for inclusion within a quality standard.

  17. Teaching Reform of Civil Engineering Materials Course Based on Project-Driven Pedagogy

    NASA Astrophysics Data System (ADS)

    Yidong, Xu; Wei, Chen; WeiguoJian, You; Jiansheng, Shen

    2018-05-01

    In view of the scattered experimental projects in practical courses on civil engineering materials, the poor practical ability of students, and the disconnection between practical teaching and theoretical teaching, this paper proposes a practical teaching procedure. Firstly, single experiments are offered, emphasizing improvement of the students’ basic experimental operating ability. Secondly, a comprehensive experiment is offered, in which the overall quality of students can be examined in the form of a project team. To investigate the effect of the teaching reform, a comparative analysis of the students of three grades (2014, 2015 and 2016) majoring in civil engineering was conducted. The result shows that the students’ ability of experimental operation is obviously improved by the project-driven teaching reform. Besides, the students’ ability to analyse and solve problems has also been improved.

  18. CodingQuarry: highly accurate hidden Markov model gene prediction in fungal genomes using RNA-seq transcripts.

    PubMed

    Testa, Alison C; Hane, James K; Ellwood, Simon R; Oliver, Richard P

    2015-03-11

    The impact of gene annotation quality on functional and comparative genomics makes gene prediction an important process, particularly in non-model species, including many fungi. Sets of homologous protein sequences are rarely complete with respect to the fungal species of interest and are often small or unreliable, especially when closely related species have not been sequenced or annotated in detail. In these cases, protein homology-based evidence fails to correctly annotate many genes, or significantly improve ab initio predictions. Generalised hidden Markov models (GHMM) have proven to be invaluable tools in gene annotation and, recently, RNA-seq has emerged as a cost-effective means to significantly improve the quality of automated gene annotation. As these methods do not require sets of homologous proteins, improving gene prediction from these resources is of benefit to fungal researchers. While many pipelines now incorporate RNA-seq data in training GHMMs, there has been relatively little investigation into additionally combining RNA-seq data at the point of prediction, and room for improvement in this area motivates this study. CodingQuarry is a highly accurate, self-training GHMM fungal gene predictor designed to work with assembled, aligned RNA-seq transcripts. RNA-seq data informs annotations both during gene-model training and in prediction. Our approach capitalises on the high quality of fungal transcript assemblies by incorporating predictions made directly from transcript sequences. Correct predictions are made despite transcript assembly problems, including those caused by overlap between the transcripts of adjacent gene loci. Stringent benchmarking against high-confidence annotation subsets showed CodingQuarry predicted 91.3% of Schizosaccharomyces pombe genes and 90.4% of Saccharomyces cerevisiae genes perfectly. These results are 4-5% better than those of AUGUSTUS, the next best performing RNA-seq driven gene predictor tested. 
Comparisons against whole genome Sc. pombe and S. cerevisiae annotations further substantiate a 4-5% improvement in the number of correctly predicted genes. We demonstrate the success of a novel method of incorporating RNA-seq data into GHMM fungal gene prediction. This shows that a high quality annotation can be achieved without relying on protein homology or a training set of genes. CodingQuarry is freely available ( https://sourceforge.net/projects/codingquarry/ ), and suitable for incorporation into genome annotation pipelines.
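The "perfectly predicted" percentages above come from comparing predicted gene structures against a high-confidence reference annotation. As a simplified, hypothetical sketch (not CodingQuarry's actual evaluation code), counting exact exon-structure matches might look like:

```python
def percent_perfect(predicted, reference):
    """Percentage of reference gene models predicted with identical structure.

    Each gene model is represented as a tuple of (start, end) exon
    coordinates; a reference gene counts as "perfect" only when some
    predicted model matches its exon chain exactly.
    """
    predicted_set = set(predicted)
    matched = sum(1 for gene in reference if gene in predicted_set)
    return 100.0 * matched / len(reference)

# Hypothetical two-gene reference; one exon structure is matched exactly.
predicted = [((1, 100), (150, 300)), ((500, 700),)]
reference = [((1, 100), (150, 300)), ((900, 1000),)]
score = percent_perfect(predicted, reference)  # 50.0
```

Real benchmarking must also account for strand, alternative transcripts, and partial or overlapping matches, which this sketch ignores.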

  19. The ACR BI-RADS® Experience: Learning From History

    PubMed Central

    Burnside, Elizabeth S.; Sickles, Edward A.; Bassett, Lawrence W.; Rubin, Daniel L.; Lee, Carol H.; Ikeda, Debra M.; Mendelson, Ellen B.; Wilcox, Pamela A.; Butler, Priscilla F.; D’Orsi, Carl J.

    2011-01-01

    The Breast Imaging Reporting and Data System® (BI-RADS®) initiative, instituted by the ACR, was begun in the late 1980s to address a lack of standardization and uniformity in mammography practice reporting. An important component of the BI-RADS initiative is the lexicon, a dictionary of descriptors of specific imaging features. The BI-RADS lexicon has always been data driven, using descriptors that previously had been shown in the literature to be predictive of benign and malignant disease. Once established, the BI-RADS lexicon provided new opportunities for quality assurance, communication, research, and improved patient care. The history of this lexicon illustrates a series of challenges and instructive successes that provide a valuable guide for other groups that aspire to develop similar lexicons in the future. PMID:19945040

  20. Improving and sustaining delivery of CPT for PTSD in mental health systems: a cluster randomized trial.

    PubMed

    Wiltsey Stirman, Shannon; Finley, Erin P; Shields, Norman; Cook, Joan; Haine-Schlagel, Rachel; Burgess, James F; Dimeff, Linda; Koerner, Kelly; Suvak, Michael; Gutner, Cassidy A; Gagnon, David; Masina, Tasoula; Beristianos, Matthew; Mallard, Kera; Ramirez, Vanessa; Monson, Candice

    2017-03-06

    Large-scale implementation of evidence-based psychotherapies (EBPs) such as cognitive processing therapy (CPT) for posttraumatic stress disorder (PTSD) can have a tremendous impact on mental and physical health, healthcare utilization, and quality of life. While many mental health systems (MHS) have invested heavily in programs to implement EBPs, few eligible patients receive EBPs in routine care settings, and clinicians do not appear to deliver the full treatment protocol to many of their patients. Emerging evidence suggests that when CPT and other EBPs are delivered at low levels of fidelity, clinical outcomes are negatively impacted. Thus, identifying strategies to improve and sustain the delivery of CPT and other EBPs is critical. Existing literature has suggested two competing strategies to promote sustainability. One emphasizes fidelity to the treatment protocol through ongoing consultation and fidelity monitoring. The other focuses on improving the fit and effectiveness of these treatments through appropriate adaptations to the treatment or the clinical setting through a process of data-driven, continuous quality improvement. Neither has been evaluated in terms of impact on sustained implementation. To compare these approaches on the key sustainability outcomes and provide initial guidance on sustainability strategies, we propose a cluster randomized trial with mental health clinics (n = 32) in three diverse MHSs that have implemented CPT. Cohorts of clinicians and clinical managers will participate in 1 year of a fidelity-oriented learning collaborative or 1 year of a continuous quality improvement-oriented learning collaborative. Patient-level PTSD symptom change, CPT fidelity and adaptation, penetration, and clinics' capacity to deliver EBPs will be examined. Survey and interview data will also be collected to investigate multilevel influences on the success of the two learning collaborative strategies. 
This research will be conducted by a team of investigators with expertise in CPT implementation, mixed method research strategies, quality improvement, and implementation science, with input from stakeholders in each participating MHS. It will have broad implications for supporting ongoing delivery of EBPs in mental health and healthcare systems and settings. The resulting products have the potential to significantly improve efforts to ensure ongoing high quality implementation and consumer access to EBPs. NCT02449421 . Registered 02/09/2015.

  1. Merging Economic and Environmental Concerns through Ecopreneurship. Digest Number 98-8.

    ERIC Educational Resources Information Center

    Schuyler, Gwyer

    Ecopreneurs are entrepreneurs whose business efforts are not only driven by profit, but also by a concern for the environment. Ecopreneurship, also known as environmental entrepreneurship and eco-capitalism, is becoming more widespread as a new market-based approach to identifying opportunities for improving environmental quality and capitalizing…

  2. Integrity, standards, and QC-related issues with big data in pre-clinical drug discovery.

    PubMed

    Brothers, John F; Ung, Matthew; Escalante-Chong, Renan; Ross, Jermaine; Zhang, Jenny; Cha, Yoonjeong; Lysaght, Andrew; Funt, Jason; Kusko, Rebecca

    2018-06-01

    The tremendous expansion of data analytics and public and private big datasets presents an important opportunity for pre-clinical drug discovery and development. In the field of life sciences, the growth of genetic, genomic, transcriptomic and proteomic data is partly driven by a rapid decline in experimental costs as biotechnology improves throughput, scalability, and speed. Yet far too many researchers tend to underestimate the challenges and consequences involving data integrity and quality standards. Given the effect of data integrity on scientific interpretation, these issues have significant implications during preclinical drug development. We describe standardized approaches for maximizing the utility of publicly available or privately generated biological data and address some of the common pitfalls. We also discuss the increasing interest to integrate and interpret cross-platform data. Principles outlined here should serve as a useful broad guide for existing analytical practices and pipelines and as a tool for developing additional insights into therapeutics using big data. Copyright © 2018 Elsevier Inc. All rights reserved.

  3. Analyzing the Discourse of Chais Conferences for the Study of Innovation and Learning Technologies via a Data-Driven Approach

    ERIC Educational Resources Information Center

    Silber-Varod, Vered; Eshet-Alkalai, Yoram; Geri, Nitza

    2016-01-01

    The current rapid technological changes confront researchers of learning technologies with the challenge of evaluating them, predicting trends, and improving their adoption and diffusion. This study utilizes a data-driven discourse analysis approach, namely culturomics, to investigate changes over time in the research of learning technologies. The…

  4. Dutch healthcare reform: did it result in performance improvement of health plans? A comparison of consumer experiences over time.

    PubMed

    Hendriks, Michelle; Spreeuwenberg, Peter; Rademakers, Jany; Delnoij, Diana M J

    2009-09-17

    Many countries have introduced elements of managed competition in their healthcare system with the aim of accomplishing more efficient and demand-driven health care. Simultaneously, generating and reporting comparative healthcare information has become an important quality-improvement instrument. We examined whether the introduction of managed competition in the Dutch healthcare system, along with public reporting of quality information, was associated with performance improvement in health plans. Experiences of consumers with their health plan were measured in four consecutive years (2005-2008) using the CQI® health plan instrument 'Experiences with Healthcare and Health Insurer'. Data were available for 13,819 respondents (response = 45%) from 30 health plans in 2005, 8,266 respondents (response = 39%) from 32 health plans in 2006, 8,088 respondents (response = 34%) from 32 health plans in 2007, and 7,183 respondents (response = 31%) from 32 health plans in 2008. We performed multilevel regression analyses with three levels: respondent, health plan and year of measurement. Per year and per quality aspect, we estimated health plan means while adjusting for consumers' age, education and self-reported health status. We tested for linear and quadratic time effects using chi-squares. The overall performance of health plans increased significantly from 2005 to 2008 on four quality aspects. For three other aspects, we found that the overall performance first declined and then increased from 2006 to 2008, but the performance in 2008 was not better than in 2005. The overall performance of health plans did not improve more often for quality aspects that were identified as important areas of improvement in the first year of measurement. On six out of seven aspects, the performance of health plans that scored below average in 2005 increased more than the performance of health plans that scored average and/or above average in that year. 
We found mixed results concerning the effects of managed competition on the performance of health plans. To determine whether managed competition in the healthcare system leads to quality improvement in health plans, it is important to examine whether and for what reasons health plans initiate improvement efforts.

  5. Using science and psychology to improve the dissemination and evaluation of scientific work.

    PubMed

    Buttliere, Brett T

    2014-01-01

    Here I outline some of what science can tell us about the problems in psychological publishing and how best to address those problems. First, the motivation behind questionable research practices is examined (the desire to get ahead or, at least, not fall behind). Next, behavior modification strategies are discussed, pointing out that reward works better than punishment. Humans are utility seekers, and the implementation of current change initiatives is hindered by high initial buy-in costs and insufficient expected utility. Open science tools interested in improving science should team up to increase utility while lowering the cost and risk associated with engagement. The best way to realign individual and group motives will probably be to create one centralized, easy-to-use platform, with a profile, a feed of targeted science stories based upon previous system interaction, a sophisticated (public) discussion section, and impact metrics which use the associated data. These measures encourage high-quality review and other prosocial activities while inhibiting self-serving behavior. Some advantages of centrally digitizing communications are outlined, including ways the data could be used to improve the peer review process. Most generally, it seems that decisions about change design and implementation should be theory- and data-driven.

  6. Using the Principles of F.A.I.R Data to Improve the Measure of Value of Big Data and Big Data Repositories

    NASA Astrophysics Data System (ADS)

    Richards, C. J.; Wyborn, L. A.; Evans, B. J. K.; Wang, J.; Druken, K. A.; Smillie, J.; Pringle, S.

    2017-12-01

    In a data-intensive world, finding the right data can be time-consuming, and the data found may involve compromises on quality and often considerable extra effort to wrangle into shape. This is particularly true as users are exploring new and innovative ways of working with data from different sources and scientific domains. It is recognised that the effort and specialist knowledge required to transform datasets to meet these requirements goes beyond the reasonable remit of a single research project or research community. Instead, government investments in national collaborations like the Australian National University's National Computational Infrastructure (NCI) provide a sustainable way to bring together and transform disparate data collections from a range of disciplines in ways which enable new and innovative analysis and use. With these goals in mind, the NCI established a Data Quality Strategy (DQS) for managing 10PB of reference data collections with a particular focus on improving data use and reuse across scientific domains, making the data suitable for use in a high-end computational and data-intensive environment, and supporting programmatic access for a range of applications. Evaluating how effectively we're achieving these goals and maintaining ongoing funding requires demonstration of the value and impact of these data collections. Standard approaches to measuring data value involve basic measures of 'data usage' or make an attempt to track data to 'research outcomes'. While useful, these measures fail to capture the value of the level of curation or quality assurance in making the data available. To fill this gap, NCI has developed a 3-tiered approach to measuring the return on investment which broadens the concept of value to include improvements in access to and use of the data. 
Key to this approach was integrating the guiding principles of the Force 11 community's F.A.I.R data into the DQS because it provides a community-driven standards-based framework which can be used for metrics. The NCI metrics provide useful information for data users, data custodians as well as data repositories and, most importantly, can be used to demonstrate the return on investment in both quantitative and qualitative terms.

  7. Evaluation of the AirNow Satellite Data Processor for 2010-2012

    NASA Astrophysics Data System (ADS)

    Pasch, A. N.; DeWinter, J. L.; Dye, T.; Haderman, M.; Zahn, P. H.; Szykman, J.; White, J. E.; Dickerson, P.; van Donkelaar, A.; Martin, R.

    2013-12-01

    The U.S. Environmental Protection Agency's (EPA) AirNow program provides the public with real-time and forecasted air quality conditions. Millions of people each day use information from AirNow to protect their health. The AirNow program (http://www.airnow.gov) reports ground-level ozone (O3) and fine particulate matter (PM2.5) with a standardized index called the Air Quality Index (AQI). AirNow aggregates information from over 130 state, local, and federal air quality agencies and provides tools for over 2,000 agency staff responsible for monitoring, forecasting, and communicating local air quality. Each hour, AirNow systems generate thousands of maps and products. The usefulness of the AirNow air quality maps depends on the accuracy and spatial coverage of air quality measurements. Currently, the maps use only ground-based measurements, which have significant gaps in coverage in some parts of the United States. As a result, contoured AQI levels have high uncertainty in regions far from monitors. To improve the usefulness of air quality maps, scientists at EPA, Dalhousie University, and Sonoma Technology, Inc., in collaboration with the National Aeronautics and Space Administration (NASA) and the National Oceanic and Atmospheric Administration (NOAA), have completed a project to incorporate satellite-estimated surface PM2.5 concentrations into the maps via the AirNow Satellite Data Processor (ASDP). These satellite estimates are derived using NASA/NOAA satellite aerosol optical depth (AOD) retrievals and GEOS-Chem modeled ratios of surface PM2.5 concentrations to AOD. GEOS-Chem is a three-dimensional chemical transport model for atmospheric composition driven by meteorological input from the Goddard Earth Observing System (GEOS). The ASDP can fuse multiple PM2.5 concentration data sets to generate AQI maps with improved spatial coverage. 
The goals of ASDP are to provide more detailed AQI information in monitor-sparse locations and to augment monitor-dense locations with more information. The ASDP system uses a weighted-average approach using uncertainty information about each data set. Recent improvements in the estimation of the uncertainty of interpolated ground-based monitor data have allowed for a more complete characterization of the uncertainty of the surface measurements. We will present a statistical analysis for 2010-2012 of the ASDP predictions of PM2.5 focusing on performance at validation sites. In addition, we will present several case studies evaluating the ASDP's performance for multiple regions and seasons, focusing specifically on days when large spatial gradients in AQI and wildfire smoke impacts were observed.
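The weighted-average fusion described above combines monitor-based and satellite-estimated PM2.5 using per-dataset uncertainty. A minimal sketch, assuming inverse-variance weighting (the ASDP's exact weighting scheme is not specified in the abstract):

```python
def fuse_pm25(estimates, uncertainties):
    """Inverse-variance weighted average of PM2.5 estimates (ug/m3).

    Lower-uncertainty data sets (e.g. interpolated ground monitors near a
    site) receive more weight than higher-uncertainty ones (e.g. satellite
    AOD-derived estimates far from any monitor).
    """
    weights = [1.0 / (u * u) for u in uncertainties]
    total = sum(weights)
    return sum(w * x for w, x in zip(weights, estimates)) / total

# Hypothetical inputs: monitor-interpolated 12.0 +/- 2.0 ug/m3 versus
# satellite-derived 18.0 +/- 6.0 ug/m3; the fused value sits nearer the
# lower-uncertainty monitor estimate.
fused = fuse_pm25([12.0, 18.0], [2.0, 6.0])
```

In monitor-sparse regions the interpolation uncertainty grows, shifting weight toward the satellite estimate, which is what lets the fused AQI maps improve coverage there.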

  8. A Theoretically Driven Investigation of the Efficacy of an Immersive Interactive Avatar Rich Virtual Environment in Pre-deployment Nursing Knowledge and Teamwork Skills Training

    DTIC Science & Technology

    2013-05-01

    (Only abstract fragments are recoverable from the source record.) Measures-of-effectiveness data are minimal and often have not been gathered in a rigorous manner. The programs offered differ in instructional pedagogy and instructional quality, and efficacy studies beyond student satisfaction scores have not been conducted.

  9. Nursing staff connect libraries with improving patient care but not with achieving organisational objectives: a grounded theory approach.

    PubMed

    Chamberlain, David; Brook, Richard

    2014-03-01

    Health organisations are often driven by specific targets defined by mission statements, aims and objectives to improve patient care. Health libraries need to demonstrate that they contribute to organisational objectives, but it is not clear how nurses view that contribution. To investigate ward nursing staff motivations, their awareness of ward and organisational objectives; and their attitudes towards the contribution of health library services to improving patient care. Qualitative research using focus group data was combined with content analysis of literature evidence and library statistics (quantitative data). Data were analysed using thematic coding, divided into five group themes: understanding of Trust, Ward and Personal objectives, use of Library, use of other information sources, quality and Issues. Four basic social-psychological processes were then developed. Behaviour indicates low awareness of organisational objectives despite patient-centric motivation. High awareness of library services is shown with some connection made by ward staff between improved knowledge and improved patient care. There was a two-tiered understanding of ward objectives and library services, based on level of seniority. However, evidence-based culture needs to be intrinsic in the organisation before all staff benefit. Libraries can actively engage in this at ward and board level and improve patient care by supporting organisational objectives. © 2014 The author. Health Information and Libraries Journal © 2014 Health Libraries Group.

  10. Quantitative and qualitative assessment of the bovine abortion surveillance system in France.

    PubMed

    Bronner, Anne; Gay, Emilie; Fortané, Nicolas; Palussière, Mathilde; Hendrikx, Pascal; Hénaux, Viviane; Calavas, Didier

    2015-06-01

    Bovine abortion is the main clinical sign of bovine brucellosis, a disease of which France has been declared officially free since 2005. To ensure the early detection of any brucellosis outbreak, event-driven surveillance relies on the mandatory notification of bovine abortions and the brucellosis testing of aborting cows. However, the under-reporting of abortions appears frequent. Our objectives were to assess the aptitude of the bovine abortion surveillance system to detect each and every bovine abortion and to identify factors influencing the system's effectiveness. We evaluated five attributes defined by the U.S. Centers for Disease Control with a method suited to each attribute: (1) data quality was studied quantitatively and qualitatively, as this factor considerably influences data analysis and results; (2) sensitivity and representativeness were estimated using a unilist capture-recapture approach to quantify the surveillance system's effectiveness; (3) acceptability and simplicity were studied through qualitative interviews of actors in the field, given that the surveillance system relies heavily on abortion notifications by farmers and veterinarians. Our analysis showed that (1) data quality was generally satisfactory even though some errors might be due to actors' lack of awareness of the need to collect accurate data; (2) from 2006 to 2011, the mean annual sensitivity - i.e. the proportion of farmers who reported at least one abortion out of all those who detected such events - was around 34%, but was significantly higher in dairy than beef cattle herds (highlighting a lack of representativeness); (3) overall, the system's low sensitivity was related to its low acceptability and lack of simplicity. This study showed that, in contrast to policy-makers, most farmers and veterinarians perceived the risk of a brucellosis outbreak as negligible. 
They did not consider sporadic abortions as a suspected case of brucellosis and usually reported abortions only to identify their cause rather than to reject brucellosis. The system proved too complex, especially for beef cattle farmers, as they may fail to detect aborting cows at pasture or have difficulties catching them for sampling. By investigating critical attributes, our evaluation highlighted the surveillance system's strengths and needed improvements. We believe our comprehensive approach can be used to assess other event-driven surveillance systems. In addition, some of our recommendations on increasing the effectiveness of event-driven brucellosis surveillance may be useful in improving the notification rate for suspected cases of other exotic diseases. Copyright © 2015 Elsevier B.V. All rights reserved.
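The unilist capture-recapture sensitivity estimate described above can be illustrated with Chao's lower-bound estimator, which uses the counts of farmers making exactly one and exactly two notifications; this is a generic sketch of the technique, not necessarily the exact estimator the authors used:

```python
def chao_sensitivity(report_counts, farmers_observed):
    """Unilist capture-recapture sensitivity via Chao's lower-bound estimator.

    report_counts maps k -> number of farmers who made exactly k abortion
    notifications; farmers_observed is the number of distinct farmers who
    reported at least once. The estimated total number of farmers who
    detected abortions is observed + f1^2 / (2 * f2), so sensitivity is
    observed / estimated_total.
    """
    f1 = report_counts.get(1, 0)
    f2 = report_counts.get(2, 0)
    if f2 == 0:
        return 0.0  # estimator undefined: no repeat reporters to anchor it
    estimated_total = farmers_observed + (f1 * f1) / (2.0 * f2)
    return farmers_observed / estimated_total

# Hypothetical counts: 50 single reporters, 25 double reporters,
# 100 reporting farmers in total.
sensitivity = chao_sensitivity({1: 50, 2: 25}, 100)
```

A result around one-third, as in the paper, would mean roughly two of every three farmers who detected an abortion never notified it.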

  11. Next Generation Quality: Assessing the Physician in Clinical History Completeness and Diagnostic Interpretations Using Funnel Plots and Normalized Deviations Plots in 3,854 Prostate Biopsies.

    PubMed

    Bonert, Michael; El-Shinnawy, Ihab; Carvalho, Michael; Williams, Phillip; Salama, Samih; Tang, Damu; Kapoor, Anil

    2017-01-01

    Observational data and funnel plots are routinely used outside of pathology to understand trends and improve performance. We extracted diagnostic rate (DR) information from free-text surgical pathology reports with synoptic elements and assessed whether inter-rater variation and clinical history completeness information useful for continuous quality improvement (CQI) could be obtained. All in-house prostate biopsies in a 6-year period at two large teaching hospitals were extracted and then diagnostically categorized using string matching, fuzzy string matching, and hierarchical pruning. DRs were then stratified by the submitting physicians and pathologists. Funnel plots were created to assess for diagnostic bias. A total of 3,854 prostate biopsies were found and all could be diagnostically classified. Two audits involving the review of 700 reports and a comparison of the synoptic elements with the free-text interpretations suggest a categorization error rate of <1%. Twenty-seven pathologists each read >40 cases and together assessed 3,690 biopsies. There was considerable inter-rater variability and a trend toward more World Health Organization/International Society of Urologic Pathology Grade 1 cancers in older pathologists. Normalized deviations plots, constructed using the median DR and standard error, can elucidate associated over- and under-calls for an individual pathologist in relation to their practice group. Clinical history completeness by submitting medical doctor varied significantly (from 100% to 22%). Free-text data analyses have some limitations; however, they could be used for data-driven CQI in anatomical pathology, and could lead to the next generation in quality of care.
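Funnel plots flag potential diagnostic bias by drawing control limits around the pooled diagnostic rate as a function of case volume, so that high-volume raters are judged against tighter limits. A minimal sketch under a normal approximation to the binomial (the paper's exact limit construction may differ):

```python
import math

def funnel_limits(overall_rate, n, z=1.96):
    """Approximate 95% control limits for a proportion at case volume n.

    Uses a normal approximation to the binomial around the pooled
    (overall) diagnostic rate. A rater whose own rate falls outside the
    limits at their volume is a candidate outlier, not proof of bias.
    """
    se = math.sqrt(overall_rate * (1.0 - overall_rate) / n)
    return (max(0.0, overall_rate - z * se), min(1.0, overall_rate + z * se))

# Hypothetical pooled Grade 1 call rate of 20% evaluated at 100 biopsies.
low, high = funnel_limits(0.2, 100)
```

Each pathologist's DR is then plotted against their case count; the funnel narrows as volume grows, which is what makes the plot fairer than a simple league table of rates.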

  12. Toward a more data-driven supervision of collegiate counseling centers.

    PubMed

    Varlotta, Lori E

    2012-01-01

    Hearing the national call for higher education accountability, the author of this tripartite article urges university administrators to move towards a more data-driven approach to counseling center supervision. Toward that end, the author first examines a key factor--perceived increase in student pathology--that appears to shape budget and staffing decisions in many university centers. Second, she reviews the emerging but conflicting research of clinician-scholars who are trying to empirically verify or refute that perception; their conflicting results suggest that no study alone should be used as the "final word" in evidence-based decision-making. Third, the author delineates the campus-specific data that should be gathered to guide staffing and budgeting decisions on each campus. She concludes by reminding readers that data-driven decisions can and should foster high-quality care that is concurrently efficient, effective, and in sync with the needs of a particular university and student body.

  13. Clinical Practice Guideline Development Manual, Third Edition: a quality-driven approach for translating evidence into action.

    PubMed

    Rosenfeld, Richard M; Shiffman, Richard N; Robertson, Peter

    2013-01-01

    Guidelines translate best evidence into best practice. A well-crafted guideline promotes quality by reducing health care variations, improving diagnostic accuracy, promoting effective therapy, and discouraging ineffective or potentially harmful interventions. Despite a plethora of published guidelines, methodology is often poorly defined and varies greatly within and among organizations. The third edition of this manual describes the principles and practices used successfully by the American Academy of Otolaryngology--Head and Neck Surgery Foundation to produce quality-driven, evidence-based guidelines using efficient and transparent methodology for actionable recommendations with multidisciplinary applicability. The development process emphasizes a logical sequence of key action statements supported by amplifying text, action statement profiles, and recommendation grades linking action to evidence. New material in this edition includes standards for trustworthy guidelines, updated classification of evidence levels, increased patient and public involvement, assessing confidence in the evidence, documenting differences of opinion, expanded discussion of conflict of interest, and use of computerized decision support for crafting actionable recommendations. As clinical practice guidelines become more prominent as a key metric of quality health care, organizations must develop efficient production strategies that balance rigor and pragmatism. Equally important, clinicians must become savvy in understanding what guidelines are (and are not) and how they are best used to improve care. The information in this manual should help clinicians and organizations achieve these goals.

  14. Cluster randomized trial of a multilevel evidence-based quality improvement approach to tailoring VA Patient Aligned Care Teams to the needs of women Veterans.

    PubMed

    Yano, Elizabeth M; Darling, Jill E; Hamilton, Alison B; Canelo, Ismelda; Chuang, Emmeline; Meredith, Lisa S; Rubenstein, Lisa V

    2016-07-19

    The Veterans Health Administration (VA) has undertaken a major initiative to transform care through implementation of Patient Aligned Care Teams (PACTs). Based on the patient-centered medical home (PCMH) concept, PACT aims to improve access, continuity, coordination, and comprehensiveness using team-based care that is patient-driven and patient-centered. However, how VA should adapt PACT to meet the needs of special populations, such as women Veterans (WVs), was not considered in initial implementation guidance. WVs' numerical minority in VA healthcare settings (approximately 7-8 % of users) creates logistical challenges to delivering gender-sensitive comprehensive care. The main goal of this study is to test an evidence-based quality improvement approach (EBQI) to tailoring PACT to meet the needs of WVs, incorporating comprehensive primary care services and gender-specific care in gender-sensitive environments, thereby accelerating achievement of PACT tenets for women (Women's Health (WH)-PACT). EBQI is a systematic approach to developing a multilevel research-clinical partnership that engages senior organizational leaders and local quality improvement (QI) teams in adapting and implementing new care models in the context of prior evidence and local practice conditions, with researchers providing technical support, formative feedback, and practice facilitation. In a 12-site cluster randomized trial, we will evaluate WH-PACT model achievement using patient, provider, staff, and practice surveys, in addition to analyses of secondary administrative and chart-based data. We will explore impacts of receipt of WH-PACT care on quality of chronic disease care and prevention, health status, patient satisfaction and experience of care, provider experience, utilization, and costs. 
Using mixed methods, we will assess pre-post practice contexts; document EBQI activities undertaken in participating facilities and their relationship to provider/staff and team actions/attitudes; document WH-PACT implementation; and examine barriers/facilitators to EBQI-supported WH-PACT implementation through a combination of semi-structured interviews and monthly formative progress narratives and administrative data. Lack of gender-sensitive comprehensive care has demonstrated consequences for the technical quality and ratings of care among WVs and may contribute to decisions to continue use or seek care elsewhere under the US Affordable Care Act. We hypothesize that tailoring PACT implementation through EBQI may improve the experience and quality of care at many levels. ClinicalTrials.gov, NCT02039856.

  15. Adolescent Problematic Social Networking and School Experiences: The Mediating Effects of Sleep Disruptions and Sleep Quality.

    PubMed

    Vernon, Lynette; Barber, Bonnie L; Modecki, Kathryn L

    2015-07-01

    An important developmental task for adolescents is to become increasingly responsible for their own health behaviors. Establishing healthy sleep routines and controlling media use before bedtime are important for adequate, quality sleep so adolescents are alert during the day and perform well at school. Despite the prevalence of adolescent social media use and the large percentage of computers and cell phones in adolescents' bedrooms, no studies to date have investigated the link between problematic adolescent investment in social networking, their sleep practices, and associated experiences at school. A sample of 1,886 students in Australia aged between 12 and 18 years completed self-report data on problematic social networking use, sleep disturbances, sleep quality, and school satisfaction. Structural equation modeling (SEM) substantiated the serial mediation hypothesis: for adolescents, problematic social networking use significantly increased sleep disturbances, which adversely affected perceptions of sleep quality that, in turn, lowered adolescents' appraisals of their school satisfaction. This significant pattern was largely driven by the indirect effect of sleep disturbances. These findings suggest that adolescents are vulnerable to negative consequences from social networking use. Specifically, problematic social networking is associated with poor school experiences, which result from poor sleep habits. Promoting better sleep routines by minimizing sleep disturbances from social media use could improve adolescents' school experiences, enhancing emotional engagement and subjective well-being.
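    The serial mediation chain reported here (problematic social networking -> sleep disturbances -> sleep quality -> school satisfaction) follows the standard product-of-coefficients logic. The sketch below illustrates that logic on simulated data; the path coefficients (0.5, -0.6, 0.4) are made up for illustration, not the study's estimates, and the control paths a full SEM would include are omitted:

    ```python
    import random
    import statistics

    random.seed(0)
    n = 2000

    # Simulated serial-mediation chain X -> M1 -> M2 -> Y; the path
    # coefficients are illustrative assumptions, not the study's results.
    x = [random.gauss(0, 1) for _ in range(n)]           # problematic social networking
    m1 = [0.5 * xi + random.gauss(0, 1) for xi in x]     # sleep disturbances
    m2 = [-0.6 * mi + random.gauss(0, 1) for mi in m1]   # sleep quality
    y = [0.4 * mi + random.gauss(0, 1) for mi in m2]     # school satisfaction

    def slope(pred, outcome):
        """Simple OLS slope of `outcome` on a single predictor."""
        mp, mo = statistics.fmean(pred), statistics.fmean(outcome)
        num = sum((p - mp) * (o - mo) for p, o in zip(pred, outcome))
        den = sum((p - mp) ** 2 for p in pred)
        return num / den

    a = slope(x, m1)      # X  -> M1 path
    d = slope(m1, m2)     # M1 -> M2 path
    b = slope(m2, y)      # M2 -> Y  path
    # Serial indirect effect is the product of the three paths,
    # here roughly 0.5 * -0.6 * 0.4 = -0.12.
    indirect = a * d * b
    ```

    A negative product, as here, matches the abstract's pattern: more problematic use worsens sleep, which in turn lowers school satisfaction.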

  16. An Energy Efficient Adaptive Sampling Algorithm in a Sensor Network for Automated Water Quality Monitoring.

    PubMed

    Shu, Tongxin; Xia, Min; Chen, Jiahong; de Silva, Clarence

    2017-11-05

    Power management is crucial in the monitoring of a remote environment, especially when long-term monitoring is needed. Renewable energy sources such as solar and wind may be harvested to sustain a monitoring system. However, without proper power management, equipment within the monitoring system may become nonfunctional and, as a consequence, the data or events captured during the monitoring process will become inaccurate as well. This paper develops and applies a novel adaptive sampling algorithm for power management in the automated monitoring of the quality of water in an extensive and remote aquatic environment. Based on the data collected online using sensor nodes, a data-driven adaptive sampling algorithm (DDASA) is developed for improving the power efficiency while ensuring the accuracy of sampled data. The developed algorithm is evaluated using two key parameters: dissolved oxygen (DO) and turbidity. It is found that by dynamically changing the sampling frequency, the battery lifetime can be effectively prolonged while maintaining a required level of sampling accuracy. According to the simulation results, compared to a fixed sampling rate, approximately 30.66% of the battery energy can be saved for three months of continuous water quality monitoring. Compared with a traditional adaptive sampling algorithm (ASA) on the same dataset, DDASA achieves around the same Normalized Mean Error (NME) while saving 5.31% more battery energy.
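    The abstract does not specify DDASA's update rule, but the core idea of dynamically changing the sampling frequency can be sketched as follows. Function names, thresholds, and interval bounds are illustrative assumptions, not the paper's parameters:

    ```python
    def next_interval(readings, base_interval=60.0, min_interval=15.0,
                      max_interval=240.0, delta_threshold=0.1):
        """Illustrative data-driven adaptive sampling rule (a sketch,
        not the published DDASA).

        A rapid change between the last two readings (e.g. dissolved
        oxygen in mg/L) halves the sampling interval to capture the
        event; a stable signal doubles it to save battery energy.
        Returns the next sampling interval in seconds.
        """
        if len(readings) < 2:
            return base_interval
        if abs(readings[-1] - readings[-2]) > delta_threshold:
            # Fast change: sample more often (bounded below).
            return max(min_interval, base_interval / 2)
        # Stable signal: back off (bounded above).
        return min(max_interval, base_interval * 2)
    ```

    For example, a jump from 5.0 to 5.5 mg/L DO would shorten the interval to 30 s, while a near-constant reading would stretch it to 120 s, which is where the energy savings come from.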

  17. An Energy Efficient Adaptive Sampling Algorithm in a Sensor Network for Automated Water Quality Monitoring

    PubMed Central

    Shu, Tongxin; Xia, Min; Chen, Jiahong; de Silva, Clarence

    2017-01-01

    Power management is crucial in the monitoring of a remote environment, especially when long-term monitoring is needed. Renewable energy sources such as solar and wind may be harvested to sustain a monitoring system. However, without proper power management, equipment within the monitoring system may become nonfunctional and, as a consequence, the data or events captured during the monitoring process will become inaccurate as well. This paper develops and applies a novel adaptive sampling algorithm for power management in the automated monitoring of the quality of water in an extensive and remote aquatic environment. Based on the data collected online using sensor nodes, a data-driven adaptive sampling algorithm (DDASA) is developed for improving the power efficiency while ensuring the accuracy of sampled data. The developed algorithm is evaluated using two key parameters: dissolved oxygen (DO) and turbidity. It is found that by dynamically changing the sampling frequency, the battery lifetime can be effectively prolonged while maintaining a required level of sampling accuracy. According to the simulation results, compared to a fixed sampling rate, approximately 30.66% of the battery energy can be saved for three months of continuous water quality monitoring. Compared with a traditional adaptive sampling algorithm (ASA) on the same dataset, DDASA achieves around the same Normalized Mean Error (NME) while saving 5.31% more battery energy. PMID:29113087

  18. Pay-for-performance policy and data-driven decision making within nursing homes: a qualitative study.

    PubMed

    Abrahamson, Kathleen; Miech, Edward; Davila, Heather Wood; Mueller, Christine; Cooke, Valerie; Arling, Greg

    2015-05-01

    Health systems globally and within the USA have introduced nursing home pay-for-performance (P4P) programmes in response to the need for improved nursing home quality. Central to the challenge of administering effective P4P is the availability of accurate, timely and clinically appropriate data for decision making. We aimed to explore ways in which data were collected, thought about and used as a result of participation in a P4P programme. Semistructured interviews were conducted with 232 nursing home employees from 70 nursing homes that participated in P4P-sponsored quality improvement (QI) projects. Interview data were analysed to identify themes surrounding collecting, thinking about and using data for QI decision making. The term 'data' appeared 247 times in the interviews, and over 92% of these instances (228/247) were spontaneous references by nursing home staff. Overall, 34% of respondents (79/232) referred directly to 'data' in their interviews. Nursing home leadership more frequently discussed data use than direct care staff. Emergent themes included using data to identify a QI problem, gathering data in new ways at the local level, and measuring outcomes in response to P4P participation. Alterations in data use as a result of policy change were theoretically consistent with the revised version of the Promoting Action on Research Implementation in Health Services framework, which posits that successful implementation is a function of evidence, context and facilitation. Providing a reimbursement context that facilitates the collection and use of reliable local evidence may be an important consideration to others contemplating the adaptation of P4P policies. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  19. Process evaluation of the Data-driven Quality Improvement in Primary Care (DQIP) trial: quantitative examination of variation between practices in recruitment, implementation and effectiveness.

    PubMed

    Dreischulte, Tobias; Grant, Aileen; Hapca, Adrian; Guthrie, Bruce

    2018-01-05

    The cluster randomised trial of the Data-driven Quality Improvement in Primary Care (DQIP) intervention showed that education, informatics and financial incentives for general medical practices to review patients with ongoing high-risk prescribing of non-steroidal anti-inflammatory drugs and antiplatelets reduced the primary end point of high-risk prescribing by 37%, with both ongoing and new high-risk prescribing significantly reduced. This quantitative process evaluation examined practice factors associated with (1) participation in the DQIP trial, (2) review activity (extent and nature of documented reviews) and (3) practice-level effectiveness (relative reductions in the primary end point). Participants were the general practices in Scotland, UK, invited to the DQIP trial: 33 recruited and 32 not recruited. The analyses examined (1) characteristics of recruited versus non-recruited practices; (2) associations of practice characteristics and 'adoption' (self-reported implementation work done by practices) with documented review activity; and (3) associations of practice characteristics, DQIP adoption and review activity with effectiveness. (1) Recruited practices had lower performance in the Quality and Outcomes Framework than those declining participation. (2) Not being an approved general practitioner training practice and higher self-reported adoption were significantly associated with higher review activity. (3) Effectiveness ranged from a relative increase in high-risk prescribing of 24.1% to a relative reduction of 77.2%. High-risk prescribing and DQIP adoption (but not documented review activity) were significantly associated with greater effectiveness in the final multivariate model, explaining 64.0% of variation in effectiveness. Intervention implementation and effectiveness of the DQIP intervention varied substantially between practices. 
    Although the DQIP intervention primarily targeted review of ongoing high-risk prescribing, the finding that self-reported DQIP adoption was a stronger predictor of effectiveness than documented review activity suggests that reducing initiation and/or re-initiation of high-risk prescribing was key to its effectiveness. NCT01425502; Post-results. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  20. An Assessment of CFD Effectiveness for Vortex Flow Simulation to Meet Preliminary Design Needs

    NASA Technical Reports Server (NTRS)

    Raj, P.; Ghaffari, F.; Finley, D. B.

    2003-01-01

    The low-speed flight and transonic maneuvering characteristics of combat air vehicles designed for efficient supersonic flight are significantly affected by the presence of free vortices. At moderate-to-high angles of attack, the flow invariably separates from the leading edges of the swept slender wings, as well as from the forebodies of the air vehicles, and rolls up to form free vortices. The design of military vehicles is heavily driven by the need to simultaneously improve performance and affordability. In order to meet this need, increasing emphasis is being placed on using Modeling & Simulation environments employing the Integrated Product & Process Development (IPPD) concept. The primary focus is on expeditiously providing design teams with high-fidelity data needed to make more informed decisions in the preliminary design stage. Extensive aerodynamic data are needed to support combat air vehicle design. Force and moment data are used to evaluate performance and handling qualities; surface pressures provide inputs for structural design; and flow-field data facilitate system integration. Continuing advances in computational fluid dynamics (CFD) provide an attractive means of generating the desired data in a manner that is responsive to the needs of the preliminary design efforts. The responsiveness is readily characterized as timely delivery of quality data at low cost.

  1. Improving patient satisfaction with pain management using Six Sigma tools.

    PubMed

    DuPree, Erin; Martin, Lisa; Anderson, Rebecca; Kathuria, Navneet; Reich, David; Porter, Carol; Chassin, Mark R

    2009-07-01

    Patient satisfaction as a direct and public measure of quality of care is changing the way hospitals address quality improvement. The feasibility of using the Six Sigma DMAIC (Define, Measure, Analyze, Improve, Control) methodology to improve patient satisfaction as it relates to pain management was evaluated. This project used the DMAIC methodology to improve patients' overall satisfaction with pain management on two inpatient units in an urban academic medical center. Pre- and post-intervention patient surveys were conducted. The DMAIC methodology provided a data-driven structure to determine the optimal improvement strategies, as well as a long-term plan for maintaining any improvements. In addition, the Change Acceleration Process (CAP) was used throughout the project's various DMAIC stages to further the work of the team by creating a shared need to meet the objectives of the project. The proportion of "excellent" ratings for overall satisfaction with pain management increased from 37% to 54%. Both units surpassed the goal of at least 50% of responses in the "excellent" category. Several key drivers of satisfaction with pain management were uncovered in the Analyze phase of the project, and each saw rating increases from the pre-intervention to the post-intervention surveys. Ongoing monitoring by the hospital inpatient satisfaction survey showed that the pain satisfaction score improved in subsequent quarters as compared with the pre-intervention period. The Six Sigma DMAIC methodology can be used successfully to improve patient satisfaction. The project led to measurable improvements in patient satisfaction with pain management, which have endured past the duration of the Six Sigma project. The Control phase of DMAIC allows the improvements to be incorporated into daily operations.

  2. Strength in Numbers: Data-Driven Collaboration May Not Sound Sexy, But it Could Save Your Job

    ERIC Educational Resources Information Center

    Buzzeo, Toni

    2010-01-01

    This article describes a practical, sure-fire way for media specialists to boost student achievement. The method is called data-driven collaboration, and it's a practical, easy-to-use technique in which media specialists and teachers work together to pinpoint kids' instructional needs and improve their essential skills. The author discusses the…

  3. How Instructional Coaches Support Data-Driven Decision Making: Policy Implementation and Effects in Florida Middle Schools

    ERIC Educational Resources Information Center

    Marsh, Julie A.; McCombs, Jennifer Sloan; Martorell, Francisco

    2010-01-01

    This article examines the convergence of two popular school improvement policies: instructional coaching and data-driven decision making (DDDM). Drawing on a mixed methods study of a statewide reading coach program in Florida middle schools, the article examines how coaches support DDDM and how this support relates to student and teacher outcomes.…

  4. An acoustic energy framework for predicting combustion-driven acoustic instabilities in premixed gas-turbines

    NASA Astrophysics Data System (ADS)

    Ibrahim, Zuhair M. A.

    The purpose of this study was to discover and assess student financial services delivered to students enrolled at East Tennessee State University. The research was undertaken for institutional self-improvement. The research explored changes that have occurred in student financial services in the dynamic higher education market. The research revealed universities pursued best practices for the delivery of student financial services through expanded employee knowledge, restructured organizations, and integrated information technologies. The research was conducted during October and November, 2006. The data were gathered from an online student survey of student financial services. The areas researched included: the Bursar office, the Financial Aid office, and online services. The results of the data analysis revealed problems with the students' perceived quality of existing financial services and the additional services students desire. The research focused on student perceptions of the quality of financial services by age and gender classifications and response categories. Although no statistically significant difference was found between the age-gender classifications on the perception of the quality of the financial services studied, the research adds to our understanding of student financial services at East Tennessee State University. Recommendations for continued research included annual surveys of segmented student populations that include ethnicity, age, gender, and educational level. The research would be used for continuous improvement efforts and student relationship management. Additional research was also recommended on employee learning in relation to the institution's mission, goals, and values.

  5. Vendor management: a model for collaboration and quality improvement.

    PubMed

    Friedman, M D; Bailit, M H; Michel, J O

    1995-11-01

    The Massachusetts Medicaid agency, also known as the Division of Medical Assistance, has developed a quality-driven approach for managing its managed care suppliers. Such an approach has, as its foundation, principles of continuous quality improvement (CQI). Suppliers participate in an annual process whereby CQI goals are negotiated between the division and its suppliers. The division then works with suppliers to achieve such goals. A cornerstone of the division's approach is the notion that data can highlight an unlimited number of opportunities for improvement and that pursuit of such opportunities will ultimately result in meaningful improvements in the health status of recipients who are served by the division. The agency's approach involves five key steps: 1) the development of contractual terms and purchasing specifications; 2) the identification of improvement priorities; 3) the negotiation of improvement goals; 4) efforts directed at meeting improvement goals and measurement of success; and 5) collaboration to achieve mutual objectives. Overall, suppliers report many benefits of collaborative participation in CQI activities with the division. Suppliers have enhanced their understanding of the importance of meeting the needs of the customer and have further accrued benefits from discussions with managed care vendors throughout the site regarding benchmarking and CQI efforts. Conversely, suppliers are challenged by the need to balance and allocate resources to meet increasing, and not always consistent, demands from various purchasers, including the division. The division has been challenged in the evolution of its contract management strategy by an uneven level of knowledge among managed care vendors regarding CQI; goal setting and measurement issues; the length of time and level of effort required to develop good relationships with suppliers; and the critical importance of comparable, valid, and timely submission of data. 
Over the last three years, the division has seen a dramatic increase in the responsiveness of managed care suppliers to meet its needs as a purchaser. Specifically, this has been expressed through supplier ability to meet mutually negotiated improvement goals. The division is also pleased that it is beginning to achieve some measurable improvements in outcomes of care.

  6. A shared computer-based problem-oriented patient record for the primary care team.

    PubMed

    Linnarsson, R; Nordgren, K

    1995-01-01

    1. INTRODUCTION. A computer-based patient record (CPR) system, Swedestar, has been developed for use in primary health care. The principal aim of the system is to support continuous quality improvement through improved information handling, improved decision-making, and improved procedures for quality assurance. The Swedestar system has evolved during a ten-year period beginning in 1984. 2. SYSTEM DESIGN. The design philosophy is based on the following key factors: a shared, problem-oriented patient record; structured data entry based on an extensive controlled vocabulary; advanced search and query functions, where the query language has the most important role; integrated decision support for drug prescribing and care protocols and guidelines; integrated procedures for quality assurance. 3. A SHARED PROBLEM-ORIENTED PATIENT RECORD. The core of the CPR system is the problem-oriented patient record. All problems of one patient, recorded by different members of the care team, are displayed on the problem list. Starting from this list, a problem follow-up can be made, one problem at a time or for several problems simultaneously. Thus, it is possible to get an integrated view, across provider categories, of those problems of one patient that belong together. This shared problem-oriented patient record provides an important basis for the primary care team work. 4. INTEGRATED DECISION SUPPORT. The decision support of the system includes a drug prescribing module and a care protocol module. The drug prescribing module is integrated with the patient records and includes an on-line check of the patient's medication list for potential interactions and data-driven reminders concerning major drug problems. Care protocols have been developed for the most common chronic diseases, such as asthma, diabetes, and hypertension. The patient records can be automatically checked according to the care protocols. 5. PRACTICAL EXPERIENCE. 
The Swedestar system has been implemented in a primary care area with 30,000 inhabitants. It is being used by all the primary care team members: 15 general practitioners, 25 district nurses, and 10 physiotherapists. Several years of practical experience of the CPR system shows that it has a positive impact on quality of care on four levels: 1) improved clinical follow-up of individual patients; 2) facilitated follow-up of aggregated data such as practice activity analysis, annual reports, and clinical indicators; 3) automated medical audit; and 4) concurrent audit. Within that primary care area, quality of care has improved substantially in several aspects due to the use of the CPR system [1].

  7. Diagnostic quality driven physiological data collection for personal healthcare.

    PubMed

    Jea, David; Balani, Rahul; Hsu, Ju-Lan; Cho, Dae-Ki; Gerla, Mario; Srivastava, Mani B

    2008-01-01

    We believe that each individual is unique, and that for diagnostic purposes it is necessary to have a distinctive combination of signals and data features that fits the personal health status. It is essential to develop mechanisms for reducing the amount of data that needs to be transferred (to mitigate the troublesome periodic recharging of a device) while maintaining diagnostic accuracy. Thus, the system should not uniformly compress the collected physiological data, but compress data in a personalized fashion that preserves the 'important' signal features for each individual such that the diagnosis can still be made with the required high confidence level. We present a diagnostic quality driven mechanism for remote ECG monitoring, which enables a notation of priorities encoded into the wave segments. The priority is specified by the diagnosis engine or medical experts and is dynamic and individual dependent. The system pre-processes the collected physiological information according to the assigned priority before delivering to the backend server. We demonstrate that the proposed approach provides accurate inference results while effectively compressing the data.
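    The priority-driven pre-processing step might look like the following sketch, in which high-priority wave segments pass through at full resolution and low-priority segments are decimated. The segment representation, priority labels, and decimation factor are assumptions for illustration; the paper's actual encoding is not reproduced here:

    ```python
    def preprocess(segments, keep_every=4):
        """Toy priority-driven compression (a sketch, not the authors'
        mechanism). Each element of `segments` is a (priority, samples)
        pair for one ECG wave segment. 'high'-priority segments (e.g.
        a QRS complex relevant to an arrhythmia diagnosis) are kept at
        full resolution; other segments are decimated to cut the
        transmission cost before upload to the backend server."""
        out = []
        for priority, samples in segments:
            if priority == "high":
                out.append(samples[:])             # full resolution copy
            else:
                out.append(samples[::keep_every])  # keep every 4th sample
        return out
    ```

    Because the priority labels come from the diagnosis engine or medical experts, the same raw recording can be compressed differently for different individuals, which is the personalization the abstract describes.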

  8. Approximate Computing Techniques for Iterative Graph Algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Panyala, Ajay R.; Subasi, Omer; Halappanavar, Mahantesh

    Approximate computing enables processing of large-scale graphs by trading off quality for performance. Approximate computing techniques have become critical not only due to the emergence of parallel architectures but also the availability of large scale datasets enabling data-driven discovery. Using two prototypical graph algorithms, PageRank and community detection, we present several approximate computing heuristics to scale the performance with minimal loss of accuracy. We present several heuristics including loop perforation, data caching, incomplete graph coloring and synchronization, and evaluate their efficiency. We demonstrate performance improvements of up to 83% for PageRank and up to 450x for community detection, with low impact on accuracy for both algorithms. We expect the proposed approximate techniques will enable scalable graph analytics on data of importance to several applications in science and their subsequent adoption to scale similar graph algorithms.
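    One of the heuristics named above, loop perforation, can be illustrated on power-iteration PageRank: some iterations simply skip work (here, the convergence test), trading a little accuracy for speed. This is a minimal sketch of the technique, not the authors' implementation, and the `perforate` knob is an assumption for illustration:

    ```python
    def pagerank(adj, d=0.85, tol=1e-6, max_iter=100, perforate=1):
        """Power-iteration PageRank with a simple loop-perforation knob.

        `adj` maps each node to its list of out-neighbours. With
        `perforate` > 1 the convergence check runs only on every
        `perforate`-th iteration, one cheap example of skipping loop
        work in the approximate-computing spirit of the abstract.
        """
        nodes = list(adj)
        n = len(nodes)
        rank = {v: 1.0 / n for v in nodes}
        for it in range(max_iter):
            new = {v: (1 - d) / n for v in nodes}
            for v in nodes:
                out = adj[v]
                if out:
                    share = d * rank[v] / len(out)
                    for w in out:
                        new[w] += share
                else:
                    # Dangling node: spread its mass uniformly.
                    for w in nodes:
                        new[w] += d * rank[v] / n
            delta = sum(abs(new[v] - rank[v]) for v in nodes)
            rank = new
            if it % perforate == 0 and delta < tol:
                break
        return rank
    ```

    On a symmetric 3-cycle all ranks converge to 1/3, and total rank mass stays at 1 regardless of the perforation setting, so the quality loss here is bounded by how late convergence is detected.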

  9. Review of Potential Wind Tunnel Balance Technologies

    NASA Technical Reports Server (NTRS)

    Burns, Devin E.; Williams, Quincy L.; Phillips, Ben D.; Commo, Sean A.; Ponder, Jonathon D.

    2016-01-01

    This manuscript reviews design, manufacture, materials, sensors, and data acquisition technologies that may benefit wind tunnel balances for the aerospace research community. Current state-of-the-art practices are used as the benchmark to consider advancements driven by researcher and facility needs. Additive manufacturing is highlighted as a promising alternative technology to conventional fabrication and has the potential to reduce both the cost and time required to manufacture force balances. Material alternatives to maraging steels are reviewed. Sensor technologies including piezoresistive, piezoelectric, surface acoustic wave, and fiber optic are compared to traditional foil based gages to highlight unique opportunities and shared challenges for implementation in wind tunnel environments. Finally, data acquisition systems that could be integrated into force balances are highlighted as a way to simplify the user experience and improve data quality. In summary, a rank ordering is provided to support strategic investment in exploring the technologies reviewed in this manuscript.

  10. Structure, Content, Delivery, Service, and Outcomes: Quality e-Learning in Higher Education

    ERIC Educational Resources Information Center

    MacDonald, Colla J.; Thompson, Terrie Lynn

    2005-01-01

    This paper addresses the need for quality e-Learning experiences. We used the Demand-Driven Learning Model (MacDonald, Stodel, Farres, Breithaupt, and Gabriel, 2001) to evaluate an online Masters in Education course. Multiple data collection methods were used to understand the experiences of stakeholders in this case study: the learners, design…

  11. 42 CFR 482.21 - Condition of participation: Quality assessment and performance improvement program.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... quality improvement and patient safety, including the reduction of medical errors, is defined, implemented... address priorities for improved quality of care and patient safety; and that all improvement actions are... incorporate quality indicator data including patient care data, and other relevant data, for example...

  12. 42 CFR 482.21 - Condition of participation: Quality assessment and performance improvement program.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... quality improvement and patient safety, including the reduction of medical errors, is defined, implemented... address priorities for improved quality of care and patient safety; and that all improvement actions are... incorporate quality indicator data including patient care data, and other relevant data, for example...

  13. 42 CFR 482.21 - Condition of participation: Quality assessment and performance improvement program.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... quality improvement and patient safety, including the reduction of medical errors, is defined, implemented... address priorities for improved quality of care and patient safety; and that all improvement actions are... incorporate quality indicator data including patient care data, and other relevant data, for example...

  14. Quality improvement and person-centredness: a participatory mixed methods study to develop the 'always event' concept for primary care.

    PubMed

    Bowie, Paul; McNab, Duncan; Ferguson, Julie; de Wet, Carl; Smith, Gregor; MacLeod, Marion; McKay, John; White, Craig

    2015-04-28

    (1) To ascertain from patients what really matters to them on a personal level of such high importance that it should 'always happen' when they interact with healthcare professionals and staff groups. (2) To critically review existing criteria for selecting 'always events' (AEs) and generate a candidate list of AE examples based on the patient feedback data. Mixed methods study informed by participatory design principles. Convenience samples of patients with a long-term clinical condition in Scottish general practices. 195 patients from 13 general practices were interviewed (n=65) or completed questionnaires (n=130). 4 themes of high importance to patients were identified from which examples of potential 'AEs' (n=8) were generated: (1) emotional support, respect and kindness (eg, "I want all practice team members to show genuine concern for me at all times"); (2) clinical care management (eg, "I want the correct treatment for my problem"); (3) communication and information (eg, "I want the clinician who sees me to know my medical history") and (4) access to, and continuity of, healthcare (eg, "I want to arrange appointments around my family and work commitments"). Each 'AE' was linked to a system process or professional behaviour that could be measured to facilitate improvements in the quality of patient care. This study is the first known attempt to develop the AE concept as a person-centred approach to quality improvement in primary care. Practice managers were able to collect data from patients on what they 'always want' in terms of expectations related to care quality from which a list of AE examples was generated that could potentially be used as patient-driven quality improvement (QI) measures. There is strong implementation potential in the Scottish health service. However, further evaluation of the utility of the method is also necessary. Published by the BMJ Publishing Group Limited. 

  15. Data Analysis and Data-Driven Decision-Making Strategies Implemented by Elementary Teachers in Selected Exited Program Improvement Safe Harbor Schools in Southern California

    ERIC Educational Resources Information Center

    Senger, Karen

    2012-01-01

    Purpose: The purposes of this study were to investigate and describe how elementary teachers in exited Program Improvement-Safe Harbor schools acquire student achievement data through assessments, the strategies and reflections utilized to make sense of the data to improve student achievement, ensure curriculum and instructional goals are aligned,…

  16. Data-driven risk identification in phase III clinical trials using central statistical monitoring.

    PubMed

    Timmermans, Catherine; Venet, David; Burzykowski, Tomasz

    2016-02-01

    Our interest lies in quality control for clinical trials, in the context of risk-based monitoring (RBM). We specifically study the use of central statistical monitoring (CSM) to support RBM. Under an RBM paradigm, we claim that CSM has a key role to play in identifying the "risks to the most critical data elements and processes" that will drive targeted oversight. In order to support this claim, we first examine how to characterize the risks that may affect clinical trials. We then discuss how CSM can be understood as a tool for providing a set of data-driven key risk indicators (KRIs), which help to organize adaptive targeted monitoring. Several case studies are provided where issues in a clinical trial have been identified thanks to targeted investigation after the identification of a risk using CSM. Using CSM to build data-driven KRIs helps to identify different kinds of issues in clinical trials. This ability is directly linked with the exhaustiveness of the CSM approach and its flexibility in the definition of the risks that are searched for when identifying the KRIs. In practice, a CSM assessment of the clinical database seems essential to ensure data quality. The atypical data patterns found in some centers and variables are seen as KRIs under an RBM approach. Targeted monitoring or data management queries can be used to confirm whether the KRIs point to an actual issue or not.
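
    The atypicality scoring that underpins such data-driven KRIs can be illustrated with a toy sketch (not the authors' method; the data, variable, and threshold are invented): each center's mean for a monitored variable is compared against the distribution of all center means, and centers with large deviations are flagged for targeted investigation.

```python
# Toy illustration of a data-driven KRI: flag centers whose mean for one
# variable is atypical relative to all other centers. Not the authors'
# algorithm; data and the 1.0 threshold are invented for illustration.
from statistics import mean, stdev

def center_risk_scores(center_data):
    """Return a |z|-like atypicality score per center for one variable.

    center_data: dict mapping center id -> list of observed values.
    """
    center_means = {c: mean(vals) for c, vals in center_data.items()}
    overall = list(center_means.values())
    mu, sigma = mean(overall), stdev(overall)
    return {c: abs(m - mu) / sigma for c, m in center_means.items()}

# Example: center "C" reports systematically high values.
data = {
    "A": [5.1, 4.9, 5.0, 5.2],
    "B": [4.8, 5.0, 5.1, 4.9],
    "C": [7.9, 8.1, 8.0, 7.8],   # atypical pattern -> key risk indicator
    "D": [5.0, 5.1, 4.9, 5.0],
}
scores = center_risk_scores(data)
flagged = [c for c, s in scores.items() if s > 1.0]
```

    In practice such a score would be computed per variable and per center across the whole clinical database, and flagged centers would receive targeted monitoring visits or data management queries.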

  17. Creating Opportunities for Organizational Leadership (COOL): Creating a culture and curriculum that fosters psychiatric leadership development and quality improvement.

    PubMed

    Dickey, Chandlee; Dismukes, Rodney; Topor, David

    2014-06-01

    The authors describe the Harvard South Shore Psychiatry Residency Training Program curriculum "Creating Opportunities for Organizational Leadership," an innovative, multitiered, resident-driven, outcome-focused set of experiences designed to develop residents' leadership skills in personal leadership, organizational leadership, negotiation, strategic thinking, and systems redesign.

  18. Purpose-Driven Grading

    ERIC Educational Resources Information Center

    Carlson, Jane A. K.; Kimpton, Ann

    2010-01-01

    Allowing students to improve their grade by revising their written work may help students learn to revise, but it gives them no incentive to turn in quality work from the start. This article proposes a way to invert the process, thereby teaching students how to revise, while enforcing a more disciplined approach to good writing. (Contains 3…

  19. Principals' Perceptions of Competition for Students in Milwaukee Schools

    ERIC Educational Resources Information Center

    Kasman, Matthew; Loeb, Susanna

    2013-01-01

    The assertion that choice-driven competition between schools will improve school quality rests on several largely unexamined assumptions. One is that choice increases the competitive pressure experienced by school leaders. A second is that schools will seek to become more effective in response to competitive pressure. In this article, we use…

  20. Creating Environments Conducive for Lifelong Learning

    ERIC Educational Resources Information Center

    Derrick, M. Gail

    2003-01-01

    A technological transformation during the past decade has eliminated the boundaries between formal and informal learning. As people adapt to a knowledge-driven society, a cultural transformation is occurring. Lifelong learning is an essential goal of education as a means to improve the quality of life for an individual, a culture, or a society.…

  1. Water quality assessment with hierarchical cluster analysis based on Mahalanobis distance.

    PubMed

    Du, Xiangjun; Shao, Fengjing; Wu, Shunyao; Zhang, Hanlin; Xu, Si

    2017-07-01

    Water quality assessment is crucial for the assessment of marine eutrophication, prediction of harmful algal blooms, and environmental protection. Previous studies have developed many numeric modeling methods and data-driven approaches for water quality assessment. Cluster analysis, an approach widely used for grouping data, has also been employed. However, there are complex correlations between water quality variables, which play important roles in water quality assessment but have always been overlooked. In this paper, we analyze correlations between water quality variables and propose an alternative method for water quality assessment with hierarchical cluster analysis based on Mahalanobis distance. Further, we cluster water quality data collected from coastal waters of the Bohai Sea and North Yellow Sea of China, and apply the clustering results to evaluate water quality. To evaluate the validity, we also cluster the water quality data with cluster analysis based on Euclidean distance, which is widely adopted in previous studies. The results show that our method is more suitable for water quality assessment with many correlated water quality variables. To our knowledge, it is the first attempt to apply the Mahalanobis distance to coastal water quality assessment.
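
    A minimal sketch of this kind of analysis with SciPy, using simulated measurements (not the paper's Bohai Sea data): pairwise Mahalanobis distances account for the covariance between correlated variables before hierarchical clustering, whereas Euclidean distance treats each variable independently.

```python
# Hierarchical clustering of water-quality samples with Mahalanobis distance.
# Illustrative sketch only: the three simulated variables stand in for
# correlated measurements such as nitrogen, phosphorus, and chlorophyll-a.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# Two simulated water masses (e.g., "clean" vs. "eutrophic").
clean = rng.multivariate_normal([1.0, 0.5, 2.0], 0.05 * np.eye(3), size=20)
eutrophic = rng.multivariate_normal([3.0, 2.5, 6.0], 0.05 * np.eye(3), size=20)
X = np.vstack([clean, eutrophic])

# Mahalanobis distance normalizes by the inverse covariance of the pooled
# data, so correlated variables are not double-counted as with Euclidean.
d = pdist(X, metric="mahalanobis")
Z = linkage(d, method="average")
labels = fcluster(Z, t=2, criterion="maxclust")
```

    Replacing `metric="mahalanobis"` with `metric="euclidean"` reproduces the baseline the paper compares against.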

  2. Social Impact Assessment: The lesser sibling in the South African EIA process?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hildebrandt, L., E-mail: Leandri.hildebrandt@nwu.ac.za; Sandham, L.A., E-mail: luke.sandham@nwu.ac.za

    2014-09-15

    Social Impact Assessment has developed as an integral but neglected component of EIA in South Africa since it became mandatory in 1997, and has therefore been referred to as the “orphan” or “lesser sibling” of EIA, as has SIA in the UK and the US. The aim of this paper is to test this claim by reviewing the quality of a sample of SIA reports, and also to establish whether there has been any improvement in quality following the introduction of revised EIA regulations in 2006. The results confirm that SIA can be called “the lesser sibling” due to the weak grades achieved in the quality review, but also reveal that there has been a slight and consistent improvement in quality, most likely driven by best practice considerations in the absence of prescriptive regulations for SIA. Suggestions and recommendations for addressing observed weaknesses in SIA performance are advanced. - Highlights: • The quality of a sample of SIA reports was evaluated using a review package. • SIA reports received mostly weak grades. • Limited improvement observed from first to second regulatory regime. • Improvements most likely due to best practice considerations.

  3. Bee pollination improves crop quality, shelf life and commercial value.

    PubMed

    Klatt, Björn K; Holzschuh, Andrea; Westphal, Catrin; Clough, Yann; Smit, Inga; Pawelzik, Elke; Tscharntke, Teja

    2014-01-22

    Pollination improves the yield of most crop species and contributes to one-third of global crop production, but comprehensive benefits including crop quality are still unknown. Hence, pollination is underestimated by international policies, which is particularly alarming in times of agricultural intensification and diminishing pollination services. In this study, exclusion experiments with strawberries showed bee pollination to improve fruit quality, quantity and market value compared with wind and self-pollination. Bee-pollinated fruits were heavier, had fewer malformations and reached higher commercial grades. They had increased redness and reduced sugar-acid ratios and were firmer, thus improving the commercially important shelf life. Longer shelf life reduced fruit loss by at least 11%. This accounts for 0.32 billion US$ of the 1.44 billion US$ that bee pollination contributes to the total value of 2.90 billion US$ generated by strawberry sales in the European Union in 2009. The fruit quality and yield effects are driven by the pollination-mediated production of hormonal growth regulators, which occur in several pollination-dependent crops. Thus, our comprehensive findings should be transferable to a wide range of crops and demonstrate bee pollination to be a hitherto underestimated but vital and economically important determinant of fruit quality.

  4. Dense blocks of energetic ions driven by multi-petawatt lasers

    PubMed Central

    Weng, S. M.; Liu, M.; Sheng, Z. M.; Murakami, M.; Chen, M.; Yu, L. L.; Zhang, J.

    2016-01-01

    Laser-driven ion accelerators have the advantages of compact size, high density, and short bunch duration over conventional accelerators. Nevertheless, it is still challenging to simultaneously enhance the yield and quality of laser-driven ion beams for practical applications. Here we propose a scheme to address this challenge via the use of emerging multi-petawatt lasers and a density-modulated target. The density-modulated target permits its ions to be uniformly accelerated as a dense block by laser radiation pressure. In addition, the beam quality of the accelerated ions is remarkably improved by embedding the target in a thick enough substrate, which suppresses hot electron refluxing and thus alleviates plasma heating. Particle-in-cell simulations demonstrate that almost all ions in a solid-density plasma of a few microns can be uniformly accelerated to about 25% of the speed of light by a laser pulse at an intensity around 10²² W/cm². The resulting dense block of energetic ions may drive fusion ignition and more generally create matter with unprecedented high energy density. PMID:26924793

  5. CANCELLED Microwave Ion Source and Beam Injection for an Accelerator-Driven Neutron Source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vainionpaa, J.H.; Gough, R.; Hoff, M.

    2007-02-27

    An over-dense microwave driven ion source capable of producing deuterium (or hydrogen) beams at 100-200 mA/cm{sup 2} and with atomic fraction > 90% was designed and tested with an electrostatic low energy beam transport section (LEBT). This ion source was incorporated into the design of an Accelerator Driven Neutron Source (ADNS). The other key components in the ADNS include a 6 MeV RFQ accelerator, a beam bending and scanning system, and a deuterium gas target. In this design a 40 mA D{sup +} beam is produced from a 6 mm diameter aperture using a 60 kV extraction voltage. The LEBT section consists of 5 electrodes arranged to form 2 Einzel lenses that focus the beam into the RFQ entrance. To create the ECR condition, 2 induction coils are used to create {approx} 875 Gauss on axis inside the source chamber. To prevent HV breakdown in the LEBT a magnetic field clamp is necessary to minimize the field in this region. Matching of the microwave power from the waveguide to the plasma is done by an autotuner. They observed significant improvement of the beam quality after installing a boron nitride liner inside the ion source. The measured emittance data are compared with PBGUNS simulations.

  6. The Use of Lean Six Sigma Methodology in Increasing Capacity of a Chemical Production Facility at DSM.

    PubMed

    Meeuwse, Marco

    2018-03-30

    Lean Six Sigma is an improvement method, combining Lean, which focuses on removing 'waste' from a process, with Six Sigma, which is a data-driven approach, making use of statistical tools. Traditionally it is used to improve the quality of products (reducing defects), or processes (reducing variability). However, it can also be used as a tool to increase the productivity or capacity of a production plant. The Lean Six Sigma methodology is therefore an important pillar of continuous improvement within DSM. In the example shown here a multistep batch process is improved, by analyzing the duration of the relevant process steps, and optimizing the procedures. Process steps were performed in parallel instead of sequential, and some steps were made shorter. The variability was reduced, giving the opportunity to make a tighter planning, and thereby reducing waiting times. Without any investment in new equipment or technical modifications, the productivity of the plant was improved by more than 20%; only by changing procedures and the programming of the process control system.
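
    The cycle-time arithmetic behind such a change can be sketched with hypothetical step durations (all numbers invented for illustration): moving an independent step off the critical path shortens the batch even though no individual step got faster.

```python
# Hypothetical batch-step durations in hours (illustration only, not DSM's
# actual process data).
steps = {"charge": 2.0, "heat": 3.0, "react": 6.0, "sample_qc": 1.5, "drain": 1.0}

# Sequential schedule: every step waits for the previous one to finish.
sequential = sum(steps.values())

# Improved schedule: QC sampling runs in parallel with the reaction, so
# only the longer of the two remains on the critical path.
parallel = (steps["charge"] + steps["heat"]
            + max(steps["react"], steps["sample_qc"]) + steps["drain"])
```

    Here the batch shortens from 13.5 h to 12 h purely by re-sequencing, mirroring the procedural gains, with no investment in new equipment, described in the abstract.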

  7. Task-driven optimization of CT tube current modulation and regularization in model-based iterative reconstruction

    NASA Astrophysics Data System (ADS)

    Gang, Grace J.; Siewerdsen, Jeffrey H.; Webster Stayman, J.

    2017-06-01

    Tube current modulation (TCM) is routinely adopted on diagnostic CT scanners for dose reduction. Conventional TCM strategies are generally designed for filtered-backprojection (FBP) reconstruction to satisfy simple image quality requirements based on noise. This work investigates TCM designs for model-based iterative reconstruction (MBIR) to achieve optimal imaging performance as determined by a task-based image quality metric. Additionally, regularization is an important aspect of MBIR that is jointly optimized with TCM; it includes both the regularization strength, which controls overall smoothness, and the directional weights, which permit control of the isotropy/anisotropy of the local noise and resolution properties. Initial investigations focus on a known imaging task at a single location in the image volume. The framework adopts Fourier and analytical approximations for fast estimation of the local noise power spectrum (NPS) and modulation transfer function (MTF)—each carrying dependencies on TCM and regularization. For the single-location optimization, the local detectability index (d‧) of the specific task was directly adopted as the objective function. A covariance matrix adaptation evolution strategy (CMA-ES) algorithm was employed to identify the optimal combination of imaging parameters. Evaluations of both conventional and task-driven approaches were performed in an abdomen phantom for a mid-frequency discrimination task in the kidney. Among the conventional strategies, the TCM pattern optimal for FBP using a minimum variance criterion yielded a worse task-based performance compared to an unmodulated strategy when applied to MBIR. Moreover, task-driven TCM designs for MBIR were found to have the opposite behavior from conventional designs for FBP, with greater fluence assigned to the less attenuating views of the abdomen and less fluence to the more attenuating lateral views. 
Such TCM patterns exaggerate the intrinsic anisotropy of the MTF and NPS as a result of the data weighting in MBIR. Directional penalty design was found to reinforce the same trend. The task-driven approaches outperform conventional approaches, with the maximum improvement in d‧ of 13% given by the joint optimization of TCM and regularization. This work demonstrates that the TCM optimal for MBIR is distinct from conventional strategies proposed for FBP reconstruction and strategies optimal for FBP are suboptimal and may even reduce performance when applied to MBIR. The task-driven imaging framework offers a promising approach for optimizing acquisition and reconstruction for MBIR that can improve imaging performance and/or dose utilization beyond conventional imaging strategies.
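
    The task-based metric at the center of such frameworks can be sketched numerically. The toy example below (all shapes are assumptions: a Gaussian MTF, a ramp-plus-offset NPS, and a Gaussian mid-frequency task function; not the authors' implementation) evaluates a prewhitening-style detectability index and shows that lowering the NPS, e.g. by assigning more fluence, raises d' for a fixed task and MTF.

```python
# Prewhitening-observer detectability index on a 1D frequency grid:
# d' = sqrt( sum |W_task(f)|^2 * MTF(f)^2 / NPS(f) * df ).
# All spectral shapes below are toy assumptions for illustration.
import numpy as np

def detectability(mtf, nps, w_task, df):
    """Prewhitening-style d' from MTF, NPS, and a task function."""
    return np.sqrt(np.sum(w_task**2 * mtf**2 / nps) * df)

f = np.linspace(0.01, 1.0, 200)            # spatial frequency (arbitrary units)
df = f[1] - f[0]
mtf = np.exp(-(f / 0.5) ** 2)              # toy Gaussian system MTF
w_task = np.exp(-((f - 0.3) / 0.1) ** 2)   # mid-frequency discrimination task
nps_low_dose = 4.0 * f + 0.1               # toy ramp-like noise power spectrum
nps_high_dose = 1.0 * f + 0.025            # 4x more fluence -> NPS / 4

d_low = detectability(mtf, nps_low_dose, w_task, df)
d_high = detectability(mtf, nps_high_dose, w_task, df)
```

    Because the NPS scales inversely with fluence in this toy model, quartering the NPS doubles d', which is the kind of trade-off a task-driven TCM optimizer exploits view by view.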

  8. Who can best influence the quality of teenagers' cars?

    PubMed

    Keall, Michael D; Newstead, Stuart

    2013-01-01

    Because young drivers' vehicles have been found to offer poor occupant protection in many countries, this study sought to identify the most appropriate audience for information and publicity designed to change purchasing preferences to improve these vehicles and resultant injury outcomes. An analysis of New Zealand vehicles crashed by drivers aged 19 years or younger linked to data on the owner of the vehicle was undertaken. Details on the crashed vehicles were merged with licensing information to identify the owner's age group. It was presumed that most vehicles driven by teens but owned by someone aged 30 to 59 would be owned by a parent of the teen. Only 14 percent of vehicles crashed by teens were owned by teens. Generally, older vehicles with poor crashworthiness were provided for the teenage driver, whatever the age group of the owner. However, cars crashed by teens but owned by their parents were on average almost 2 years younger and had better crashworthiness than the teenager-owned vehicles, although their crashworthiness was still poor compared to vehicles driven by older drivers. Evidently, parents are key people in making vehicle purchasing decisions regarding the cars that teenagers drive and should be the main audience for measures to improve the poor secondary safety performance of teenagers' vehicles.

  9. Factors driving diabetes care improvement in a large medical group: ten years of progress.

    PubMed

    Sperl-Hillen, JoAnn M; O'Connor, Patrick J

    2005-08-01

    The purpose of this study was to document trends in diabetes quality of care and coinciding strategies for quality improvement over 10 years in a large medical group. Adults with diagnosed diabetes mellitus were identified each year from 1994 (N = 5610) to 2003 (N = 7650), and internal medical group data quantified improvement trends. Multivariate analysis was used to identify factors that did and did not contribute to improvement trends. Median glycosylated hemoglobin A1C (A1C) levels improved from 8.3% in 1994 to 6.9% in 2003 (P <.001). Mean low-density lipoprotein (LDL) cholesterol measurements improved from 132 mg/dL in 1995 to 97 mg/dL in 2003 (P <.001). Both A1C (P <.01) and LDL improvement (P <.0001) were driven by drug intensification, leadership commitment to diabetes improvement, greater continuity of primary care, participation in local and national diabetes care improvement initiatives, and allocation of multidisciplinary resources at the clinic level to improve diabetes care. Resources were spent on nurse and dietitian educators, active outreach to high-risk patients facilitated by registries, physician opinion leader activities including clinic-based training programs, and financial incentives to primary care clinics. Use of endocrinology referrals was stable throughout the period at about 10% of patients per year, and there were no disease management contracts to outside vendors over the study period. Electronic medical records did not favorably affect glycemic control or lipid control in this setting. This primary care-based system achieved A1C and LDL reductions sufficient to reduce macrovascular and microvascular risk by about 50% according to landmark studies; further risk reduction should be attainable through better blood pressure control. Strategies for diabetes improvement need to be customized to address documented gaps in quality of care, provider prescribing behaviors, and patient characteristics.

  10. Leveraging health information technology to achieve the “triple aim” of healthcare reform

    PubMed Central

    Sood, Harpreet S; Bates, David W

    2015-01-01

    Objective To investigate experiences with leveraging health information technology (HIT) to improve patient care and population health, and reduce healthcare expenditures. Materials and methods In-depth qualitative interviews with federal government employees, health policy, HIT and medico-legal experts, health providers, physicians, purchasers, payers, patient advocates, and vendors from across the United States. Results The authors undertook 47 interviews. There was a widely shared belief that Health Information Technology for Economic and Clinical Health (HITECH) had catalyzed the creation of a digital infrastructure, which was being used in innovative ways to improve quality of care and curtail costs. There were, however, major concerns about the poor usability of electronic health records (EHRs), their limited ability to support multi-disciplinary care, and major difficulties with health information exchange, which undermined efforts to deliver integrated patient-centered care. Proposed strategies for enhancing the benefits of HIT included federal stimulation of competition by mandating vendors to open up their application program interfaces, incenting development of low-cost consumer informatics tools, and promoting Congressional review of the Health Insurance Portability and Accountability Act (HIPAA) to optimize the balance between data privacy and reuse. Many underscored the need to “kick the legs from underneath the fee-for-service model” and replace it with a data-driven reimbursement system that rewards high-quality care. Conclusions The HITECH Act has stimulated unprecedented, multi-stakeholder interest in HIT. Early experiences indicate that the resulting digital infrastructure is being used to improve quality of care and curtail costs. 
Reform efforts are however severely limited by problems with usability, limited interoperability and the persistence of the fee-for-service paradigm—addressing these issues therefore needs to be the federal government’s main policy target. PMID:25882032

  11. The evolution of Rare Pride: using evaluation to drive adaptive management in a biodiversity conservation organization.

    PubMed

    Jenks, Brett; Vaughan, Peter W; Butler, Paul J

    2010-05-01

    Rare Pride is a social marketing program that stimulates human behavior change in order to promote biodiversity conservation in critically threatened regions in developing countries. A series of formal evaluation studies, networking strategies, and evaluative inquiries have driven a 20-year process of adaptive management that has resulted in extensive programmatic changes within Pride. This paper describes the types of evaluation that Rare used to drive adaptive management and the changes it caused in Pride's theory-of-change and programmatic structure. We argue that (a) qualitative data gathered from partners and staff through structured interviews is most effective at identifying problems with current programs and procedures, (b) networking with other organizations is the most effective strategy for learning of new management strategies, and (c) quantitative data gathered through surveys is effective at measuring program impact and quality. Adaptive management has allowed Rare to increase its Pride program from implementing about two campaigns per year in 2001 to more than 40 per year in 2009 while improving program quality and maintaining program impact. Copyright 2009 Elsevier Ltd. All rights reserved.

  12. Total quality management: It works for aerospace information services

    NASA Technical Reports Server (NTRS)

    Erwin, James; Eberline, Carl; Colquitt, Wanda

    1993-01-01

    Today we are in the midst of information and 'total quality' revolutions. At the NASA STI Program's Center for AeroSpace Information (CASI), we are focused on using continuous improvements techniques to enrich today's services and products and to ensure that tomorrow's technology supports the TQM-based improvement of future STI program products and services. The Continuous Improvements Program at CASI is the foundation for Total Quality Management in products and services. The focus is customer-driven; its goal, to identify processes and procedures that can be improved and new technologies that can be integrated with the processes to gain efficiencies, provide effectiveness, and promote customer satisfaction. This Program seeks to establish quality through an iterative defect prevention approach that is based on the incorporation of standards and measurements into the processing cycle. Four projects are described that utilize cross-functional, problem-solving teams for identifying requirements and defining tasks and task standards, management participation, attention to critical processes, and measurable long-term goals. The implementation of these projects provides the customer with measurably improved access to information that is provided through several channels: the NASA STI Database, document requests for microfiche and hardcopy, and the Centralized Help Desk.

  13. Developing community-driven quality improvement initiatives to enhance chronic disease care in Indigenous communities in Canada: the FORGE AHEAD program protocol.

    PubMed

    Naqshbandi Hayward, Mariam; Paquette-Warren, Jann; Harris, Stewart B

    2016-07-26

    Given the dramatic rise and impact of chronic diseases and gaps in care in Indigenous peoples in Canada, a shift from the dominant episodic and responsive healthcare model most common in First Nations communities to one that places emphasis on proactive prevention and chronic disease management is urgently needed. The Transformation of Indigenous Primary Healthcare Delivery (FORGE AHEAD) Program partners with 11 First Nations communities across six provinces in Canada to develop and evaluate community-driven quality improvement (QI) initiatives to enhance chronic disease care. FORGE AHEAD is a 5-year research program (2013-2017) that utilizes a pre-post mixed-methods observational design rooted in participatory research principles to work with communities in developing culturally relevant innovations and improved access to available services. This intensive program incorporates a series of 10 inter-related and progressive program activities designed to foster community-driven initiatives with type 2 diabetes mellitus as the action disease. Preparatory activities include a national community profile survey, best practice and policy literature review, and readiness tool development. Community-level intervention activities include community and clinical readiness consultations, development of a diabetes registry and surveillance system, and QI activities. With a focus on capacity building, all community-level activities are driven by trained community members who champion QI initiatives in their community. Program wrap-up activities include readiness tool validation, cost-analysis and process evaluation. In collaboration with Health Canada and the Aboriginal Diabetes Initiative, scale-up toolkits will be developed in order to build on lessons-learned, tools and methods, and to fuel sustainability and spread of successful innovations. 
The outcomes of this research program, its related costs, and the subsequent policy recommendations will have the potential to significantly affect future policy decisions pertaining to chronic disease care in First Nations communities in Canada. Current ClinicalTrials.gov protocol ID: NCT02234973. Date of registration: July 30, 2014.

  14. External radioactive markers for PET data-driven respiratory gating in positron emission tomography.

    PubMed

    Büther, Florian; Ernst, Iris; Hamill, James; Eich, Hans T; Schober, Otmar; Schäfers, Michael; Schäfers, Klaus P

    2013-04-01

    Respiratory gating is an established approach to overcoming respiration-induced image artefacts in PET. Of special interest in this respect are raw PET data-driven gating methods which do not require additional hardware to acquire respiratory signals during the scan. However, these methods rely heavily on the quality of the acquired PET data (statistical properties, data contrast, etc.). We therefore combined external radioactive markers with data-driven respiratory gating in PET/CT. The feasibility and accuracy of this approach were studied for [(18)F]FDG PET/CT imaging in patients with malignant liver and lung lesions. PET data from 30 patients with abdominal or thoracic [(18)F]FDG-positive lesions (primary tumours or metastases) were included in this prospective study. The patients underwent a 10-min list-mode PET scan with a single bed position following a standard clinical whole-body [(18)F]FDG PET/CT scan. During this scan, one to three radioactive point sources (either (22)Na or (18)F, 50-100 kBq) in a dedicated holder were attached to the patient's abdomen. The list-mode data acquired were retrospectively analysed for respiratory signals using established data-driven gating approaches and additionally by tracking the motion of the point sources in sinogram space. Gated reconstructions were examined qualitatively, in terms of the amount of respiratory displacement and in respect of changes in local image intensity in the gated images. The presence of the external markers did not affect whole-body PET/CT image quality. Tracking of the markers led to characteristic respiratory curves in all patients. Applying these curves for gated reconstructions resulted in images in which motion was well resolved. Quantitatively, the performance of the external marker-based approach was similar to that of the best intrinsic data-driven methods. 
Overall, the gain in measured tumour uptake from the nongated to the gated images indicating successful removal of respiratory motion was correlated with the magnitude of the respiratory displacement of the respective tumour lesion, but not with lesion size. Respiratory information can be assessed from list-mode PET/CT through PET data-derived tracking of external radioactive markers. This information can be successfully applied to respiratory gating to reduce motion-related image blurring. In contrast to other previously described PET data-driven approaches, the external marker approach is independent of tumour uptake and thereby applicable even in patients with poor uptake and small tumours.
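
    The gating step itself, once a respiratory trace has been recovered from the tracked markers, can be sketched as simple amplitude binning (a toy illustration of the general mechanics, not the authors' implementation):

```python
# Toy amplitude-based respiratory gating: divide a marker-position trace
# into equal-count amplitude bins and assign each sample to a gate.
# The trace below is simulated, not patient data.
import math

def amplitude_gates(trace, n_gates):
    """Assign each sample of a respiratory trace to one of n_gates
    amplitude bins with (approximately) equal counts per gate."""
    order = sorted(range(len(trace)), key=lambda i: trace[i])
    gates = [0] * len(trace)
    per_gate = math.ceil(len(trace) / n_gates)
    for rank, idx in enumerate(order):
        gates[idx] = rank // per_gate
    return gates

# Simulated respiratory trace: ~0.25 Hz breathing sampled at 10 Hz.
trace = [math.sin(2 * math.pi * 0.25 * t / 10.0) for t in range(200)]
gates = amplitude_gates(trace, 4)
```

    Each gate then receives only the list-mode events acquired at its amplitude range, so every gated reconstruction sees the anatomy in a near-frozen respiratory state.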

  15. Great Expectations: How Role Expectations and Role Experiences Relate to Perceptions of Group Cohesion.

    PubMed

    Benson, Alex J; Eys, Mark A; Irving, P Gregory

    2016-04-01

    Many athletes experience a discrepancy between the roles they expect to fulfill and the roles they eventually occupy. Drawing from met expectations theory, we applied response surface methodology to examine how role expectations, in relation to role experiences, influence perceptions of group cohesion among Canadian Interuniversity Sport athletes (N = 153). On the basis of data from two time points, as athletes approached and exceeded their role contribution expectations, they reported higher perceptions of task cohesion. Furthermore, as athletes approached and exceeded their social involvement expectations, they reported higher perceptions of social cohesion. These response surface patterns, pertaining to task and social cohesion, were driven by the positive influence of role experiences. On the basis of the interplay between athletes' role experiences and their perception of the group environment, efforts to improve team dynamics may benefit from focusing on improving the quality of role experiences, in conjunction with developing realistic role expectations.

  16. Experiences and Lessons From Polio Eradication Applied to Immunization in 10 Focus Countries of the Polio Endgame Strategic Plan

    PubMed Central

    Mallya, Apoorva; Sandhu, Hardeep; Anya, Blanche-Philomene; Yusuf, Nasir; Ntakibirora, Marcelline; Hasman, Andreas; Fahmy, Kamal; Agbor, John; Corkum, Melissa; Sumaili, Kyandindi; Siddique, Anisur Rahman; Bammeke, Jane; Braka, Fiona; Andriamihantanirina, Rija; Ziao, Antoine-Marie C.; Djumo, Clement; Yapi, Moise Desire; Sosler, Stephen; Eggers, Rudolf

    2017-01-01

    Nine polio areas of expertise were applied to broader immunization and mother, newborn and child health goals in ten focus countries of the Polio Eradication Endgame Strategic Plan: policy & strategy development, planning, management and oversight (accountability framework), implementation & service delivery, monitoring, communications & community engagement, disease surveillance & data analysis, technical quality & capacity building, and partnerships. Although coverage improvements depend on multiple factors and increased coverage cannot be attributed to the use of polio assets alone, 6 out of the 10 focus countries improved coverage in three doses of diphtheria-tetanus-pertussis-containing vaccine between 2013 and 2015. Government leadership, evidence-based programming, country-driven comprehensive operational annual plans, community partnership and strong accountability systems are critical for all programs and polio eradication has illustrated these can be leveraged to increase immunization coverage and equity and enhance global health security in the focus countries. PMID:28838187

  17. Digital control and data acquisition for high-value GTA welding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    George, T.G.; Franco-Ferreira, E.A.

    1993-10-01

    Electric power for the Cassini space probe will be provided by radioisotope thermoelectric generators (RTGs) thermally driven by General-Purpose Heat Source (GPHS) modules. Each GPHS module contains four 150-g pellets of {sup 238}PuO{sub 2}, and each of the four pellets is encapsulated within a thin-wall iridium-alloy shell. GTA girth welding of these capsules is performed at Los Alamos National Laboratory (LANL) on an automated, digitally controlled welding system. This paper discusses baseline design considerations for system automation and the strategies employed to maximize process yield, improve process consistency, and generate the required quality assurance information. Design of the automated girth welding system was driven by a number of factors that called for precise parametric control and data acquisition. Foremost among these factors was the extraordinary value of the capsule components. In addition, DOE Order 5700.6B, which took effect on 23 September 1986, required that all operations adhere to strict levels of process quality assurance. A detailed technical specification for the GPHS welding system was developed on the basis of a joint LANL/Westinghouse Savannah River Company (WSRC) design effort. After a competitive bidding process, Jetline Engineering, Inc., of Irvine, California, was selected as the system manufacturer. During the period over which four identical welding systems were fabricated, very close liaison was maintained between the LANL/WSRC technical representatives and the vendor. The level of rapport was outstanding, and the end result was the 1990 delivery of four systems that met or exceeded all specification requirements.

  18. Improving data quality in the linked open data: a survey

    NASA Astrophysics Data System (ADS)

    Hadhiatma, A.

    2018-03-01

    The Linked Open Data (LOD) is a “web of data”, a different paradigm from the “web of documents” in common use today. However, the huge LOD still suffers from data quality problems such as incompleteness, inconsistency, and inaccuracy. Data quality problems relate to designing effective methods both to manage and to retrieve information at various data quality levels. Based on a review of papers and journals, addressing data quality requires standards for (1) identifying data quality problems, (2) assessing data quality for a given context, and (3) correcting data quality problems. However, most existing methods and strategies for LOD data quality do not take an integrative approach. Hence, based on those standards and an integrative approach, there are opportunities to improve LOD data quality in terms of incompleteness, inaccuracy, and inconsistency with respect to its schema and ontology, namely through ontology refinement. Ontology refinement serves not only to improve data quality but also to enrich the LOD. Therefore, what is needed is (1) a standard for data quality assessment and evaluation that is appropriate to the LOD, and (2) a framework of methods based on statistical relational learning that can improve the correction of data quality problems as well as enrich the LOD.
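    The three standard functions the survey identifies (identify, assess, correct) can be illustrated with a minimal sketch over toy triples; the required predicates, identifiers, and fixes below are invented for illustration and are not from the survey.

    ```python
    # Toy illustration of the three data quality functions: identify problems,
    # assess quality, and correct known problems. Triples are (subject,
    # predicate, object); the completeness rule below is an assumption.
    REQUIRED_PREDICATES = {"name", "birthYear"}

    def identify_problems(triples):
        """Return, per subject, the required predicates it is missing."""
        by_subject = {}
        for s, p, o in triples:
            by_subject.setdefault(s, set()).add(p)
        return {s: sorted(REQUIRED_PREDICATES - preds)
                for s, preds in by_subject.items()
                if REQUIRED_PREDICATES - preds}

    def assess_completeness(triples):
        """Fraction of (subject, required predicate) pairs that are present."""
        subjects = {s for s, _, _ in triples}
        present = sum(1 for _, p, _ in triples if p in REQUIRED_PREDICATES)
        return present / (len(subjects) * len(REQUIRED_PREDICATES))

    def correct(triples, fixes):
        """Apply known corrections (e.g. from a curated source) as new triples."""
        return triples + [(s, p, o) for (s, p), o in fixes.items()]

    triples = [("ex:alice", "name", "Alice"),
               ("ex:alice", "birthYear", "1970"),
               ("ex:bob", "name", "Bob")]        # ex:bob lacks birthYear
    print(identify_problems(triples))            # {'ex:bob': ['birthYear']}
    print(assess_completeness(triples))          # 0.75
    ```

    A real pipeline would run such checks against an RDF store and an ontology, but the shape of the three steps is the same.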

  19. Science-Driven Computing: NERSC's Plan for 2006-2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simon, Horst D.; Kramer, William T.C.; Bailey, David H.

    NERSC has developed a five-year strategic plan focusing on three components: Science-Driven Systems, Science-Driven Services, and Science-Driven Analytics. (1) Science-Driven Systems: balanced introduction of the best new technologies for complete computational systems (computing, storage, networking, visualization, and analysis), coupled with the activities necessary to engage vendors in addressing DOE computational science requirements in their future roadmaps. (2) Science-Driven Services: the entire range of support activities, from high-quality operations and user services to direct scientific support, that enable a broad range of scientists to use NERSC systems effectively in their research. NERSC will concentrate on the resources needed to realize the promise of the new highly scalable architectures for scientific discovery in multidisciplinary computational science projects. (3) Science-Driven Analytics: the architectural and systems enhancements and services required to integrate NERSC's powerful computational and storage resources, providing scientists with new tools to effectively manipulate, visualize, and analyze the huge data sets derived from simulations and experiments.

  20. The evaluation of single-view and multi-view fusion 3D echocardiography using image-driven segmentation and tracking.

    PubMed

    Rajpoot, Kashif; Grau, Vicente; Noble, J Alison; Becher, Harald; Szmigielski, Cezary

    2011-08-01

    Real-time 3D echocardiography (RT3DE) promises a more objective and complete cardiac functional analysis through dynamic 3D image acquisition. Despite several efforts toward automating left ventricle (LV) segmentation and tracking, these remain challenging research problems due to the poor quality of acquired images, which typically suffer from missing anatomical information, speckle noise, and a limited field of view (FOV). Recently, multi-view fusion 3D echocardiography has been introduced, in which multiple conventional single-view RT3DE images are acquired with small probe movements and fused together after alignment. Multi-view fusion improves image quality and anatomical information and extends the FOV. We take this work further by comparing single-view and multi-view fused images in a systematic study. To better illustrate the differences, this work evaluates the image quality and information content of single-view and multi-view fused images using image-driven LV endocardial segmentation and tracking. The image-driven methods were chosen to fully exploit the image quality and anatomical information present in the image, deliberately excluding high-level constraints such as prior shape or motion knowledge from the analysis approaches. Experiments show that multi-view fused images are better suited for LV segmentation and tracking, with relatively more failures and errors observed on single-view images. Copyright © 2011 Elsevier B.V. All rights reserved.

  1. Closing the Loop: Automated Data-Driven Cognitive Model Discoveries Lead to Improved Instruction and Learning Gains

    ERIC Educational Resources Information Center

    Liu, Ran; Koedinger, Kenneth R.

    2017-01-01

    As the use of educational technology becomes more ubiquitous, an enormous amount of learning process data is being produced. Educational data mining seeks to analyze and model these data, with the ultimate goal of improving learning outcomes. The most firmly grounded and rigorous evaluation of an educational data mining discovery is whether it…

  2. Human Papillomavirus Drives Tumor Development Throughout the Head and Neck: Improved Prognosis Is Associated With an Immune Response Largely Restricted to the Oropharynx

    PubMed Central

    Chakravarthy, Ankur; Henderson, Stephen; Thirdborough, Stephen M.; Ottensmeier, Christian H.; Su, Xiaoping; Lechner, Matt; Feber, Andrew; Thomas, Gareth J.

    2016-01-01

    Purpose In squamous cell carcinomas of the head and neck (HNSCC), the increasing incidence of oropharyngeal squamous cell carcinomas (OPSCCs) is attributable to human papillomavirus (HPV) infection. Despite commonly presenting at late stage, HPV-driven OPSCCs are associated with improved prognosis compared with HPV-negative disease. HPV DNA is also detectable in nonoropharyngeal carcinomas (non-OPSCCs), but its pathogenic role and clinical significance are unclear. The objectives of this study were to determine whether HPV plays a causal role in non-OPSCC and to investigate whether HPV confers a survival benefit in these tumors. Methods Meta-analysis was used to build a cross-tissue gene-expression signature for HPV-driven cancer. Classifiers trained by machine-learning approaches were used to predict the HPV status of 520 HNSCCs profiled by The Cancer Genome Atlas project. DNA methylation data were similarly used to classify 464 HNSCCs, and these analyses were integrated with genomic, histopathology, and survival data to permit a comprehensive comparison of HPV transcript-positive OPSCC and non-OPSCC. Results HPV-driven tumors accounted for 4.1% of non-OPSCCs. Regardless of anatomic site, HPV+ HNSCCs shared highly similar gene expression and DNA methylation profiles; nonkeratinizing, basaloid histopathological features; and lack of TP53 or CDKN2A alterations. Improved overall survival, however, was largely restricted to HPV-driven OPSCCs, which were associated with increased levels of tumor-infiltrating lymphocytes compared with HPV-driven non-OPSCCs. Conclusion Our analysis identified a causal role for HPV in transcript-positive non-OPSCCs throughout the head and neck. Notably, however, HPV-driven non-OPSCCs display a distinct immune microenvironment and clinical behavior compared with HPV-driven OPSCCs. PMID:27863190

  3. Continuous Water Quality Monitoring in the Sacramento-San Joaquin Delta to support Ecosystem Science

    NASA Astrophysics Data System (ADS)

    Downing, B. D.; Bergamaschi, B. A.; Pellerin, B. A.; Saraceno, J.; Sauer, M.; Kraus, T. E.; Burau, J. R.; Fujii, R.

    2013-12-01

    Characterizing habitat quality and nutrient availability to food webs is an essential step toward understanding and predicting the success of pelagic organisms in the Sacramento-San Joaquin Delta (Delta). The difficulty is that water quality and nutrient supply change continuously as tidal and wind-driven currents move new water parcels to and from comparatively static geomorphic settings. Understanding the interactions of nutrient cycling, suspended sediment, and plankton dynamics with flow and tidal range, relative to position in the estuary, is critical to predicting and managing bottom-up effects on aquatic habitat in the Delta. Historically, quantifying concentrations and loads in the Delta has relied on water quality data collected at monthly intervals. Current in situ optical sensors for nutrients, dissolved organic matter (DOM), and algal pigments (chlorophyll-a, phycocyanin) allow real-time, high-frequency measurements on time scales of seconds, extending up to years. Such data are essential for characterizing changes in water quality over short and long temporal scales as well as over broader spatial scales. High-frequency water quality data have been collected at key stations in the Delta since 2012. Sensors that continuously measure nitrate, DOM, algal pigments, and turbidity have been co-located at pre-existing Delta flow monitoring stations. Data from the stations are telemetered to USGS data servers. The stations are designed to run autonomously with a monthly service interval, at which sensors are cleaned and checked against calibration standards. The autonomous system is verified against discrete samples taken monthly and, intensively, over periodic ebb-to-flood tidal cycles. Here we present examples of how coupled optical and acoustic data from the sensor network improve our understanding of nutrient and DOM dynamics and fluxes.
The data offer robust quantitative estimates of the concentrations and constituent fluxes needed to investigate biogeochemical processes in tidal reaches of the Delta. The data are available in real time on the web and have proven invaluable for anticipating interactions between nutrient supply and the Delta landscape, supporting continued research on pelagic habitat quality, algal productivity, and food web dynamics.
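    The constituent-flux calculation such coupled concentration and discharge data enable can be sketched in a few lines; all values and units below are invented for illustration (instantaneous flux is concentration times discharge, and load is the time-integral of flux).

    ```python
    import numpy as np

    # Hypothetical high-frequency record: four 15-min samples over one hour.
    t = np.array([0.0, 900.0, 1800.0, 2700.0])          # seconds
    nitrate = np.array([0.50, 0.55, 0.60, 0.55])        # mg/L
    discharge = np.array([100.0, 110.0, 120.0, 110.0])  # m^3/s

    # mg/L x m^3/s = g/s (the factors 1 m^3 = 1000 L and 1000 mg = 1 g cancel)
    flux = nitrate * discharge                          # instantaneous flux, g/s

    # Cumulative load by trapezoidal integration over time, converted to kg.
    load_kg = np.sum((flux[:-1] + flux[1:]) / 2.0 * np.diff(t)) / 1000.0
    print(load_kg)
    ```

    With monthly grab samples instead of 15-min data, the same integral would be forced through a handful of points, which is why high-frequency records give much more robust load estimates.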

  4. Surgical data science: The new knowledge domain

    PubMed Central

    Vedula, S. Swaroop; Hager, Gregory D.

    2017-01-01

    Healthcare in general, and surgery/interventional care in particular, is evolving through rapid advances in technology and increasing complexity of care with the goal of maximizing quality and value of care. While innovations in diagnostic and therapeutic technologies have driven past improvements in quality of surgical care, future transformation in care will be enabled by data. Conventional methodologies, such as registry studies, are limited in their scope for discovery and research, extent and complexity of data, breadth of analytic techniques, and translation or integration of research findings into patient care. We foresee the emergence of Surgical/Interventional Data Science (SDS) as a key element to addressing these limitations and creating a sustainable path toward evidence-based improvement of interventional healthcare pathways. SDS will create tools to measure, model and quantify the pathways or processes within the context of patient health states or outcomes, and use information gained to inform healthcare decisions, guidelines, best practices, policy, and training, thereby improving the safety and quality of healthcare and its value. Data is pervasive throughout the surgical care pathway; thus, SDS can impact various aspects of care including prevention, diagnosis, intervention, or post-operative recovery. Existing literature already provides preliminary results suggesting how a data science approach to surgical decision-making could more accurately predict severe complications using complex data from pre-, intra-, and post-operative contexts, how it could support intra-operative decision-making using both existing knowledge and continuous data streams throughout the surgical care pathway, and how it could enable effective collaboration between human care providers and intelligent technologies. 
In addition, SDS is poised to play a central role in surgical education, for example, through objective assessments, automated virtual coaching, and robot-assisted active learning of surgical skill. However, the potential for transforming surgical care and training through SDS may only be realized through a cultural shift that not only institutionalizes technology to seamlessly capture data but also assimilates individuals with expertise in data science into clinical research teams. Furthermore, collaboration with industry partners from the inception of the discovery process promotes optimal design of data products as well as their efficient translation and commercialization. As surgery continues to evolve through advances in technology that enhance delivery of care, SDS represents a new knowledge domain to engineer surgical care of the future. PMID:28936475

  5. Surgical data science: The new knowledge domain.

    PubMed

    Vedula, S Swaroop; Hager, Gregory D

    2017-04-01

    Healthcare in general, and surgery/interventional care in particular, is evolving through rapid advances in technology and increasing complexity of care with the goal of maximizing quality and value of care. While innovations in diagnostic and therapeutic technologies have driven past improvements in quality of surgical care, future transformation in care will be enabled by data. Conventional methodologies, such as registry studies, are limited in their scope for discovery and research, extent and complexity of data, breadth of analytic techniques, and translation or integration of research findings into patient care. We foresee the emergence of Surgical/Interventional Data Science (SDS) as a key element to addressing these limitations and creating a sustainable path toward evidence-based improvement of interventional healthcare pathways. SDS will create tools to measure, model and quantify the pathways or processes within the context of patient health states or outcomes, and use information gained to inform healthcare decisions, guidelines, best practices, policy, and training, thereby improving the safety and quality of healthcare and its value. Data is pervasive throughout the surgical care pathway; thus, SDS can impact various aspects of care including prevention, diagnosis, intervention, or post-operative recovery. Existing literature already provides preliminary results suggesting how a data science approach to surgical decision-making could more accurately predict severe complications using complex data from pre-, intra-, and post-operative contexts, how it could support intra-operative decision-making using both existing knowledge and continuous data streams throughout the surgical care pathway, and how it could enable effective collaboration between human care providers and intelligent technologies. 
In addition, SDS is poised to play a central role in surgical education, for example, through objective assessments, automated virtual coaching, and robot-assisted active learning of surgical skill. However, the potential for transforming surgical care and training through SDS may only be realized through a cultural shift that not only institutionalizes technology to seamlessly capture data but also assimilates individuals with expertise in data science into clinical research teams. Furthermore, collaboration with industry partners from the inception of the discovery process promotes optimal design of data products as well as their efficient translation and commercialization. As surgery continues to evolve through advances in technology that enhance delivery of care, SDS represents a new knowledge domain to engineer surgical care of the future.

  6. Design-Driven Innovation as Seen in a Worldwide Values-Based Curriculum

    ERIC Educational Resources Information Center

    Hadlock, Camey Andersen; McDonald, Jason K.

    2014-01-01

    While instructional design's technological roots have given it many approaches for process and product improvement, in most cases designers still rely on instructional forms that do not allow them to develop instruction of a quality consistent with that expressed by the field's visionary leaders. As a result, often the teachers and students using…

  7. 42 CFR 416.43 - Conditions for coverage-Quality assessment and performance improvement.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... outcomes, patient safety, and quality of care. (2) Performance improvement activities must track adverse... improves patient safety by using quality indicators or performance measures associated with improved health... incorporate quality indicator data, including patient care and other relevant data regarding services...

  8. 42 CFR 416.43 - Conditions for coverage-Quality assessment and performance improvement.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... outcomes, patient safety, and quality of care. (2) Performance improvement activities must track adverse... improves patient safety by using quality indicators or performance measures associated with improved health... incorporate quality indicator data, including patient care and other relevant data regarding services...

  9. 42 CFR 416.43 - Conditions for coverage-Quality assessment and performance improvement.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... outcomes, patient safety, and quality of care. (2) Performance improvement activities must track adverse... improves patient safety by using quality indicators or performance measures associated with improved health... incorporate quality indicator data, including patient care and other relevant data regarding services...

  10. Methods and dimensions of electronic health record data quality assessment: enabling reuse for clinical research

    PubMed Central

    Weng, Chunhua

    2013-01-01

    Objective To review the methods and dimensions of data quality assessment in the context of electronic health record (EHR) data reuse for research. Materials and methods A review of the clinical research literature discussing data quality assessment methodology for EHR data was performed. Using an iterative process, the aspects of data quality being measured were abstracted and categorized, as well as the methods of assessment used. Results Five dimensions of data quality were identified, which are completeness, correctness, concordance, plausibility, and currency, and seven broad categories of data quality assessment methods: comparison with gold standards, data element agreement, data source agreement, distribution comparison, validity checks, log review, and element presence. Discussion Examination of the methods by which clinical researchers have investigated the quality and suitability of EHR data for research shows that there are fundamental features of data quality, which may be difficult to measure, as well as proxy dimensions. Researchers interested in the reuse of EHR data for clinical research are recommended to consider the adoption of a consistent taxonomy of EHR data quality, to remain aware of the task-dependence of data quality, to integrate work on data quality assessment from other fields, and to adopt systematic, empirically driven, statistically based methods of data quality assessment. Conclusion There is currently little consistency or potential generalizability in the methods used to assess EHR data quality. If the reuse of EHR data for clinical research is to become accepted, researchers should adopt validated, systematic methods of EHR data quality assessment. PMID:22733976
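    Three of the five dimensions the review names (completeness, plausibility, currency) lend themselves to simple validity-check code; the field names, plausible ranges, and records below are illustrative assumptions, not from the review.

    ```python
    from datetime import date

    # Assumed physiologic ranges for the plausibility check (illustrative).
    PLAUSIBLE = {"heart_rate": (20, 250), "weight_kg": (0.5, 400)}

    def completeness(records, field):
        """Fraction of records in which the field is present."""
        return sum(r.get(field) is not None for r in records) / len(records)

    def plausibility(records, field):
        """Fraction of present values that fall in the plausible range."""
        lo, hi = PLAUSIBLE[field]
        vals = [r[field] for r in records if r.get(field) is not None]
        return sum(lo <= v <= hi for v in vals) / len(vals)

    def currency(records, field, as_of, max_age_days):
        """Fraction of present values recorded recently enough for the task."""
        dates = [r[field + "_date"] for r in records if r.get(field) is not None]
        return sum((as_of - d).days <= max_age_days for d in dates) / len(dates)

    records = [
        {"heart_rate": 72,  "heart_rate_date": date(2013, 1, 5)},
        {"heart_rate": 999, "heart_rate_date": date(2012, 6, 1)},  # implausible
        {"heart_rate": None},                                      # incomplete
    ]
    print(completeness(records, "heart_rate"))
    print(plausibility(records, "heart_rate"))
    ```

    As the review stresses, the thresholds are task-dependent: a heart rate recorded six months ago may be current enough for research on chronic disease but not for acute care.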

  11. Data-Driven Decision Making and Its Effects on Leadership Practices and Student Achievement in K-5 Public Elementary Schools in California

    ERIC Educational Resources Information Center

    Ceja, Rafael, Jr.

    2012-01-01

    The enactment of the NCLB Act of 2001 and its legislative mandates for accountability testing throughout the nation brought to the forefront the issue of data-driven decision making. This emphasis on improving education has been spurred due to the alleged failure of the public school system. As a result, the role of administrators has evolved to…

  12. Relative value unit-based compensation incentivization in an academic vascular practice improves productivity with no early adverse impact on quality.

    PubMed

    Awad, Nadia; Caputo, Francis J; Carpenter, Jeffrey P; Alexander, James B; Trani, José L; Lombardi, Joseph V

    2017-02-01

    Given the increased pressure from governmental programs to restructure reimbursements to reflect quality metrics achieved by physicians, review of current reimbursement schemes is necessary to ensure sustainability of the physician's performance while maintaining and ultimately improving patient outcomes. This study reviewed the impact of reimbursement incentives on evidence-based care outcomes within a vascular surgical program at an academic tertiary care center. Data for patients with a confirmed 30-day follow-up for the vascular surgery subset of our institution's National Surgical Quality Improvement Program submission for the years 2013 and 2014 were reviewed. The outcomes reviewed included 30-day mortality, readmission, unplanned returns to the operating room, and all major morbidities. A comparison of both total charges and work relative value units (RVUs) generated was performed before and after changes were made from a salary-based to a productivity-based compensation model. P value analysis was used to determine if there were any statistically significant differences in patient outcomes between the two study years. No statistically significant difference in outcomes of the core measures studied was identified between the two periods. There was a trend toward a lower incidence of respiratory complications, largely driven by a lower incidence in pneumonia between 2013 and 2014. The vascular division had a net increase of 8.2% in total charges and 5.7% in work RVUs after the RVU-based incentivization program was instituted. Revenue-improving measures can improve sustainability of a vascular program without negatively affecting patient care as evidenced by the lack of difference in evidence-based core outcome measures in our study period. Further studies are needed to elucidate the long-term effects of incentivization programs on both patient care and program viability. Copyright © 2016. Published by Elsevier Inc.

  13. Structured data quality reports to improve EHR data quality.

    PubMed

    Taggart, Jane; Liaw, Siaw-Teng; Yu, Hairong

    2015-12-01

    To examine whether a structured data quality report (SDQR) and feedback sessions with practice principals and managers improve the quality of routinely collected data in EHRs. The intervention was conducted in four general practices participating in the Fairfield neighborhood electronic Practice Based Research Network (ePBRN). Data were extracted from their clinical information systems and summarised as an SDQR to guide feedback to practice principals and managers at 0, 4, 8 and 12 months. Data quality (DQ) metrics included completeness, correctness, consistency and duplication of patient records. Information on data recording practices, data quality improvement, and the utility of SDQRs was collected at the feedback sessions at the practices. The main outcome measure was change in the recording of clinical information and the level of compliance with Royal Australian College of General Practitioners (RACGP) targets. Birth date was 100% complete and gender 99% complete at baseline, and these levels were maintained. DQ of all variables measured improved significantly (p<0.01) over 12 months but remained insufficient to comply with RACGP standards. Improvement was greatest for allergies. There was no significant change in duplicate records. SDQRs and feedback sessions support general practitioners and practice managers in focusing on improving the recording of patient information. However, the improved practice DQ was not sufficient to meet RACGP targets. Randomised controlled studies are required to evaluate strategies to improve data quality and any associated improvements in safety and quality of care. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
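    One of the SDQR metrics, duplication of patient records, is commonly computed by flagging records that collide on a normalized identifying key; the key choice and the sample records below are illustrative assumptions, not the study's method.

    ```python
    # Flag probable duplicate patient records by a normalized (name, date of
    # birth) key. Real matching would use more fields and fuzzier comparison.
    def normalize(record):
        return (record["name"].strip().lower(), record["dob"])

    def find_duplicates(records):
        """Return (first_index, duplicate_index) pairs of colliding records."""
        seen, dupes = {}, []
        for i, r in enumerate(records):
            key = normalize(r)
            if key in seen:
                dupes.append((seen[key], i))
            else:
                seen[key] = i
        return dupes

    patients = [
        {"name": "Jane Smith", "dob": "1980-02-01"},
        {"name": " jane smith ", "dob": "1980-02-01"},  # same person, messy entry
        {"name": "John Smith", "dob": "1975-07-09"},
    ]
    print(find_duplicates(patients))   # [(0, 1)]
    ```

    A report like the SDQR would surface the duplicate count per practice over time, which is what the feedback sessions then discuss.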

  14. Real World Data Driven Evolution of Volvo Cars’ Side Impact Protection Systems and their Effectiveness

    PubMed Central

    Jakobsson, Lotta; Lindman, Magdalena; Svanberg, Bo; Carlsson, Henrik

    2010-01-01

    This study analyses the outcome of continuously improved occupant protection over the last two decades for front-seat near-side occupants in side impacts, based on a real-world-driven working process. The effectiveness of four generations of improved side impact protection is calculated from Volvo Cars' statistical accident database of Volvo vehicles in Sweden. Generation I includes vehicles with a new structural and interior concept (SIPS). Generation II includes vehicles with structural improvements and a new chest airbag (SIPSbag). Generation III includes vehicles with further improved SIPS and SIPSbag as well as the new head-protecting Inflatable Curtain (IC) concept. Generation IV includes the most recent vehicles with further improvements of all the systems plus advanced sensors and seat belt pretensioner activation. Compared to baseline vehicles, generation I vehicles reduce MAIS2+ injuries by 54%, generation II by 61%, and generation III by 72%. For generation IV, effectiveness figures cannot be calculated because of the lack of MAIS2+ injuries. Continuously improved performance is also seen when the AIS2+ pelvis, abdomen, chest and head injuries are studied separately. By using the same real-world-driven working process, future improvements, and possibly new passive as well as active safety systems, will be developed with the aim of further improving protection for near-side occupants in side impacts. PMID:21050597
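    The quoted effectiveness figures are relative reductions in MAIS2+ injury risk versus baseline vehicles; a minimal sketch of that calculation, with invented counts (the paper's underlying data are not reproduced here):

    ```python
    # Effectiveness as relative risk reduction: 1 - (generation risk / baseline
    # risk), expressed as a percentage. All counts below are invented.
    def effectiveness(injured, exposed, injured_base, exposed_base):
        risk = injured / exposed
        risk_base = injured_base / exposed_base
        return 100.0 * (1.0 - risk / risk_base)

    # e.g. baseline: 100 MAIS2+ injuries per 1000 occupants; a generation with
    # 46 per 1000 would show the 54% effectiveness reported for generation I.
    print(round(effectiveness(46, 1000, 100, 1000)))   # 54
    ```

    This is also why generation IV's effectiveness could not be computed: with zero MAIS2+ injuries observed, the estimate is unbounded by this formula.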

  15. Using science and psychology to improve the dissemination and evaluation of scientific work

    PubMed Central

    Buttliere, Brett T.

    2014-01-01

    Here I outline some of what science can tell us about the problems in psychological publishing and how best to address those problems. First, the motivation behind questionable research practices is examined (the desire to get ahead or, at least, not fall behind). Next, behavior modification strategies are discussed, pointing out that reward works better than punishment. Humans are utility seekers, and the implementation of current change initiatives is hindered by high initial buy-in costs and insufficient expected utility. Open science tools interested in improving science should team up to increase utility while lowering the cost and risk associated with engagement. The best way to realign individual and group motives will probably be to create one centralized, easy-to-use platform, with a profile, a feed of targeted science stories based upon previous system interaction, a sophisticated (public) discussion section, and impact metrics that use the associated data. These measures encourage high-quality review and other prosocial activities while inhibiting self-serving behavior. Some advantages of centrally digitizing communications are outlined, including ways the data could be used to improve the peer review process. Most generally, it seems that decisions about change design and implementation should be theory- and data-driven. PMID:25191261

  16. Improved frame-based estimation of head motion in PET brain imaging.

    PubMed

    Mukherjee, J M; Lindsay, C; Mukherjee, A; Olivier, P; Shao, L; King, M A; Licho, R

    2016-05-01

    Head motion during PET brain imaging can cause significant degradation of image quality. Several authors have proposed ways to compensate for PET brain motion to restore image quality and improve quantitation. Head restraints can reduce movement but are unreliable; hence the need for alternative strategies such as data-driven motion estimation or external motion tracking. Herein, the authors present a data-driven motion estimation method using a preprocessing technique that allows the use of very short duration frames, thus reducing the intraframe motion problem commonly observed in the multiple-frame acquisition method. The list-mode data for the PET acquisition are uniformly divided into 5-s frames, and images are reconstructed without attenuation correction. Interframe motion is estimated using a 3D multiresolution registration algorithm and subsequently compensated for. For this study, the authors used 8 PET brain studies that used F-18 FDG as the tracer and contained minor or no initial motion. After reconstruction and prior to motion estimation, known motion was introduced into each frame to simulate head motion during a PET acquisition. To investigate the trade-off in motion estimation and compensation with respect to frames of different lengths, the authors summed the 5-s frames to produce 10- and 60-s frames. Summed images generated from the motion-compensated reconstructed frames were then compared to the original PET image reconstruction without motion compensation. The authors found that their method is able to compensate for both gradual and step-like motions using frame times as short as 5 s, with a spatial accuracy of 0.2 mm on average. Complex volunteer motion involving all six degrees of freedom was estimated with lower accuracy (0.3 mm on average) than the other types investigated. Preprocessing of the 5-s images was necessary for successful image registration. 
Since their method utilizes non-attenuation-corrected frames, it is not susceptible to motion introduced between the CT and PET acquisitions. The authors have shown that they can estimate motion for frames with time intervals as short as 5 s using non-attenuation-corrected reconstructed FDG PET brain images. Intraframe motion in 60-s frames degrades accuracy to about 2 mm, depending on the motion type.
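    The frame-based idea can be illustrated with a simplified 1-D sketch (not the authors' implementation): events are binned into short frames, each frame's shift relative to the first is estimated, and frames are shifted back before summing. Intensity-centroid registration stands in here for their 3D multiresolution algorithm, and the step motion is simulated.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def make_frame(center, n=2000, size=64):
        """Histogram of simulated events from a Gaussian 'brain' at `center`."""
        x = rng.normal(center, 3.0, n)
        frame, _ = np.histogram(x, bins=size, range=(0, size))
        return frame.astype(float)

    # Four 5-s frames with a step-like motion halfway through the scan.
    frames = [make_frame(c) for c in (32.0, 32.0, 35.0, 35.0)]

    bins = np.arange(64)
    ref = np.average(bins, weights=frames[0])        # reference centroid
    compensated = []
    for f in frames:
        shift = int(round(np.average(bins, weights=f) - ref))  # est. motion
        compensated.append(np.roll(f, -shift))                 # compensate

    blurred = np.sum(frames, axis=0)        # motion-blurred sum
    sharp = np.sum(compensated, axis=0)     # motion-compensated sum
    print(sharp.max() > blurred.max())      # compensation sharpens the peak
    ```

    The short-frame trade-off the abstract describes shows up directly in such a sketch: shorter frames freeze the motion but contain fewer counts per frame, making each registration noisier, which is why the authors needed a preprocessing step before registering 5-s frames.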

  17. Code Red: The Danger of Data-Driven Instruction

    ERIC Educational Resources Information Center

    Neuman, Susan B.

    2016-01-01

    "Data-driven instruction can distort the way reading is taught, harming the students who need high-quality instruction the most," Susan B. Neuman concludes from her research team's two years of observation in nine low-income New York City schools. She describes how some students are reminded that they are "failures" every day by…

  18. SnoMAP: Pioneering the Path for Clinical Coding to Improve Patient Care.

    PubMed

    Lawley, Michael; Truran, Donna; Hansen, David; Good, Norm; Staib, Andrew; Sullivan, Clair

    2017-01-01

    The increasing demand for healthcare and the static resources available necessitate data-driven improvements in healthcare at large scale. The SnoMAP tool was rapidly developed to provide an automated solution that transforms and maps clinician-entered data into data fit for both administrative and clinical purposes. Accuracy of data mapping was maintained.

  19. 42 CFR 480.143 - QIO involvement in shared health data systems.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... HUMAN SERVICES (CONTINUED) QUALITY IMPROVEMENT ORGANIZATIONS ACQUISITION, PROTECTION, AND DISCLOSURE OF QUALITY IMPROVEMENT ORGANIZATION INFORMATION Utilization and Quality Control Quality Improvement Organizations (QIOs) Disclosure of Confidential Information § 480.143 QIO involvement in shared health data...

  20. 42 CFR 480.143 - QIO involvement in shared health data systems.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... HUMAN SERVICES (CONTINUED) QUALITY IMPROVEMENT ORGANIZATIONS ACQUISITION, PROTECTION, AND DISCLOSURE OF QUALITY IMPROVEMENT ORGANIZATION INFORMATION Utilization and Quality Control Quality Improvement Organizations (QIOs) Disclosure of Confidential Information § 480.143 QIO involvement in shared health data...

  1. 42 CFR 480.143 - QIO involvement in shared health data systems.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... HUMAN SERVICES (CONTINUED) QUALITY IMPROVEMENT ORGANIZATIONS ACQUISITION, PROTECTION, AND DISCLOSURE OF QUALITY IMPROVEMENT ORGANIZATION INFORMATION Utilization and Quality Control Quality Improvement Organizations (QIOs) Disclosure of Confidential Information § 480.143 QIO involvement in shared health data...

  2. Quantifying surgical complexity with machine learning: looking beyond patient factors to improve surgical models.

    PubMed

    Van Esbroeck, Alexander; Rubinfeld, Ilan; Hall, Bruce; Syed, Zeeshan

    2014-11-01

    To investigate the use of machine learning to empirically determine the risk of individual surgical procedures and to improve surgical models with this information. American College of Surgeons National Surgical Quality Improvement Program (ACS NSQIP) data from 2005 to 2009 were used to train support vector machine (SVM) classifiers to learn the relationship between textual constructs in Current Procedural Terminology (CPT) descriptions and mortality, morbidity, Clavien 4 complications, and surgical-site infections (SSI) within 30 days of surgery. The procedural risk scores produced by the SVM classifiers were validated on data from 2010 in univariate and multivariate analyses. The procedural risk scores produced by the SVM classifiers achieved moderate-to-high levels of discrimination in univariate analyses (area under receiver operating characteristic curve: 0.871 for mortality, 0.789 for morbidity, 0.791 for SSI, 0.845 for Clavien 4 complications). Addition of these scores also substantially improved multivariate models comprising patient factors and previously proposed correlates of procedural risk (net reclassification improvement and integrated discrimination improvement: 0.54 and 0.001 for mortality, 0.46 and 0.011 for morbidity, 0.68 and 0.022 for SSI, 0.44 and 0.001 for Clavien 4 complications; P < .05 for all comparisons). Similar improvements were noted in discrimination and calibration for other statistical measures, and in subcohorts of patients undergoing general or vascular surgery. Machine learning provides clinically useful estimates of surgical risk for individual procedures. This information can be measured in an entirely data-driven manner and substantially improves multifactorial models to predict postoperative complications. Copyright © 2014 Elsevier Inc. All rights reserved.
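
The core idea, learning a risk score from words in procedure descriptions, can be sketched with a minimal linear SVM. The procedure texts and outcome labels below are invented for illustration (the paper trains on ACS NSQIP data), and the hinge-loss subgradient trainer is a bare-bones stand-in for a production SVM library.

```python
import numpy as np

# Toy "CPT-like" procedure descriptions with invented high-risk (+1) /
# low-risk (-1) outcome labels -- NOT real NSQIP data.
docs = ["open repair ruptured aneurysm", "laparoscopic appendectomy",
        "emergency exploratory laparotomy", "elective hernia repair",
        "ruptured aneurysm graft", "elective laparoscopic cholecystectomy"]
labels = np.array([1, -1, 1, -1, 1, -1])

# Bag-of-words features from the description text.
vocab = sorted({w for d in docs for w in d.split()})
X = np.array([[d.split().count(w) for w in vocab] for d in docs], float)

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=500):
    """Primal linear SVM via subgradient descent on the hinge loss."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) < 1:        # margin violated: hinge subgradient
                w += lr * (yi * xi - lam * w)
                b += lr * yi
            else:                            # only the regularizer acts
                w -= lr * lam * w
    return w, b

w, b = train_linear_svm(X, labels)
risk_scores = X @ w + b   # analogous to the paper's procedural risk scores
print((np.sign(risk_scores) == labels).all())
```

In the paper these continuous scores are then fed as an additional covariate into multivariate outcome models; here they simply separate the toy training set.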

  3. Guide for Improving NRS Data Quality: Procedures for Data Collection and Training.

    ERIC Educational Resources Information Center

    Condelli, Larry; Castillo, Laura; Seburn, Mary; Deveaux, Jon

    This guide for improving the quality of National Reporting System for Adult Education (NRS) data through improved data collection and training is intended for local providers and state administrators. Chapter 1 explains the guide's purpose, contents, and use and defines the following components of data quality: objectivity; integrity;…

  4. Demonstration of application-driven network slicing and orchestration in optical/packet domains: on-demand vDC expansion for Hadoop MapReduce optimization.

    PubMed

    Kong, Bingxin; Liu, Siqi; Yin, Jie; Li, Shengru; Zhu, Zuqing

    2018-05-28

    Nowadays, it is common for service providers (SPs) to leverage hybrid clouds to improve the quality-of-service (QoS) of their Big Data applications. However, for achieving guaranteed latency and/or bandwidth in its hybrid cloud, an SP might desire to have a virtual datacenter (vDC) network, in which it can manage and manipulate the network connections freely. To address this requirement, we design and implement a network slicing and orchestration (NSO) system that can create and expand vDCs across optical/packet domains on-demand. Considering Hadoop MapReduce (M/R) as the use-case, we describe the proposed architectures of the system's data, control and management planes, and present the operation procedures for creating, expanding, monitoring and managing a vDC for M/R optimization. The proposed NSO system is then realized in a small-scale network testbed that includes four optical/packet domains, and we conduct experiments in it to demonstrate the whole operations of the data, control and management planes. Our experimental results verify that application-driven on-demand vDC expansion across optical/packet domains can be achieved for M/R optimization, and after being provisioned with a vDC, the SP using the NSO system can fully control the vDC network and further optimize the M/R jobs in it with network orchestration.

  5. Building bridges: engaging medical residents in quality improvement and medical leadership.

    PubMed

    Voogt, Judith J; van Rensen, Elizabeth L J; van der Schaaf, Marieke F; Noordegraaf, Mirko; Schneider, Margriet Me

    2016-12-01

    To develop an educational intervention that targets residents' beliefs and attitudes to quality improvement (QI) and leadership in order to demonstrate proactive behaviour. Theory-driven, mixed-methods study including document analysis, interviews, observations and open-ended questionnaires. Six Dutch teaching hospitals. Using expertise from medicine, psychology, organizational and educational sciences, we developed a situated learning programme named Ponder and IMProve (PIMP). The acronym PIMP reflects the original upbeat name in Dutch, Verwonder & Verbeter. It has a modern, positive meaning that relates to improving your current circumstances. In quarterly 1-h sessions residents are challenged to identify daily workplace frustrations and translate them into small-scale QI activities. Organizational awareness, beliefs and attitudes to QI and organizational responsibilities, resident behaviour, barriers and facilitators to successful learning and the programme's potential impact on the organization. Overall, 19 PIMP meetings were held over a period of 3 years. Residents defined 119 PIMP goals, resolved 37 projects and are currently working on another 39 projects. Interviews show that PIMP sessions make residents more aware of the organizational aspects of their daily work. Moreover, residents feel empowered to take up the role of change agent. Facilitators for success include a positive cost-benefit trade-off, a valuable group process and a safe learning environment. This article demonstrates the added value of multidisciplinary theory-driven research for the design, development and evaluation of educational programmes. Residents can be encouraged to develop organizational awareness and reshape their daily frustrations into QI work. © The Author 2016. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  6. Hospital on a page: standardizing data presentation to drive quality improvement.

    PubMed

    Heenan, Michael; DiEmanuele, Michelle; Hayward-Murray, Kathryn; Dadgar, Ladan

    2012-01-01

    Over the past five years, the Credit Valley Hospital (CVH) invested time and financial and human resources into performance measurement systems. In doing so, CVH launched a number of data tools including electronic scorecards and dashboards. However, the processes and accountability structures associated with the tools struggled to gain credibility with clinical and administrative leadership as the performance measurement system was primarily driven by the technology rather than a sound information strategy. Although a corporate-level scorecard was regularly updated, program-related scorecards and other measurement tools were only populated when programs reported to the board, at the time of accreditation or as a result of regulatory requirements. In addition, information contained in data reports was often presented in a manner that did not engage clinical and corporate decision-makers in the key issues of quality, access and sustainability. Following the release of its new strategic plan in 2009, CVH renewed its performance measurement framework and the methods by which it presented data so that the organization's strategic plan could be implemented and measured from the boardroom to the bedside. Long, complex spreadsheets were transformed into strategically designed, easy-to-understand, easy-to-access reports released in a standardized method in terms of format, media, content and timing. The following article describes the method CVH adopted to communicate the organization's performance and the role it played in enhancing the culture of quality and patient safety within the hospital.

  7. Application of Ontologies for Big Earth Data

    NASA Astrophysics Data System (ADS)

    Huang, T.; Chang, G.; Armstrong, E. M.; Boening, C.

    2014-12-01

    Connected data is smarter data! Earth Science research infrastructure must do more than support temporal, geospatial discovery of satellite data. As the Earth Science data archives continue to expand across NASA data centers, the research communities are demanding smarter data services. A successful research infrastructure must be able to present researchers the complete picture, that is, datasets with linked citations, related interdisciplinary data, imageries, current events, social media discussions, and scientific data tools that are relevant to the particular dataset. The popular Semantic Web for Earth and Environmental Terminology (SWEET) suite is a collection of ontologies and concepts designed to improve discovery and application of Earth Science data. The SWEET collection was initially developed to capture the relationships between keywords in the NASA Global Change Master Directory (GCMD). Over the years this popular collection has expanded to cover over 200 ontologies and 6000 concepts to enable scalable classification of Earth system science concepts and space science. This presentation discusses semantic web technologies as an enabling technology for data-intensive science. We will discuss the application of the SWEET ontologies as a critical component in knowledge-driven research infrastructure for several recent projects, including the DARPA Ontological System for Context Artifact and Resources (OSCAR), 2013 NASA ACCESS Virtual Quality Screening Service (VQSS), and the 2013 NASA Sea Level Change Portal (SLCP) projects. The presentation will also discuss the benefits of using semantic web technologies in developing research infrastructure for Big Earth Science Data in an attempt to "accommodate all domains and provide the necessary glue for information to be cross-linked, correlated, and discovered in a semantically rich manner." 
[1] Savas Parastatidis: A platform for all that we know: creating a knowledge-driven research infrastructure. The Fourth Paradigm 2009: 165-172

  8. Dynamically adaptive data-driven simulation of extreme hydrological flows

    NASA Astrophysics Data System (ADS)

    Kumar Jain, Pushkar; Mandli, Kyle; Hoteit, Ibrahim; Knio, Omar; Dawson, Clint

    2018-02-01

    Hydrological hazards such as storm surges, tsunamis, and rainfall-induced flooding are physically complex events that are costly in terms of human life and economic productivity. Many such disasters could be mitigated through improved real-time emergency evacuation and through the development of resilient infrastructure based on knowledge of how systems respond to extreme events. Data-driven computational modeling is a critical technology underpinning these efforts. This investigation focuses on the novel combination of methodologies in forward simulation and data assimilation. The forward geophysical model utilizes adaptive mesh refinement (AMR), a process by which a computational mesh can adapt in time and space based on the current state of a simulation. The forward solution is combined with ensemble-based data assimilation methods, whereby observations from an event are assimilated into the forward simulation to improve the veracity of the solution, or used to invert for uncertain physical parameters. The novelty in our approach is the tight two-way coupling of AMR and ensemble filtering techniques. The technology is tested using actual data from the Chile tsunami event of February 27, 2010. These advances offer the promise of significantly transforming data-driven, real-time modeling of hydrological hazards, with potentially broader applications in other science domains.
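
The ensemble-filtering half of such a coupling can be illustrated with a single stochastic ensemble Kalman filter (EnKF) analysis step. This is a generic textbook sketch on an invented 1-D state, not the authors' system: the AMR forward solver is replaced by a biased, noisy forecast ensemble, and three point observations nudge the ensemble toward the truth.

```python
import numpy as np

rng = np.random.default_rng(1)
n_ens, n_state = 50, 10
truth = np.sin(np.linspace(0, np.pi, n_state))   # toy "sea surface" state

# Forecast ensemble: a common model bias plus member-to-member spread.
ens = truth + 0.5 + rng.normal(0, 0.3, (n_ens, n_state))

obs_idx = [2, 5, 8]                              # observe 3 of 10 points
H = np.zeros((len(obs_idx), n_state))
H[np.arange(len(obs_idx)), obs_idx] = 1.0
obs_err = 0.05
y = H @ truth + rng.normal(0, obs_err, len(obs_idx))

def enkf_update(ens, y, H, obs_err, rng):
    """Stochastic EnKF analysis: X_a = X_f + K (y + eps - H X_f)."""
    Xf = ens - ens.mean(0)                       # ensemble anomalies
    P = Xf.T @ Xf / (len(ens) - 1)               # sample covariance
    S = H @ P @ H.T + obs_err**2 * np.eye(len(y))
    K = P @ H.T @ np.linalg.inv(S)               # Kalman gain
    perturbed = y + rng.normal(0, obs_err, (len(ens), len(y)))
    return ens + (perturbed - ens @ H.T) @ K.T

analysis = enkf_update(ens, y, H, obs_err, rng)
err_f = np.abs(ens.mean(0) - truth)[obs_idx].mean()
err_a = np.abs(analysis.mean(0) - truth)[obs_idx].mean()
print(err_a < err_f)   # assimilation pulls observed points toward the data
```

In the paper's setting the forecast ensemble comes from the AMR shallow-water solver and the observations from tsunami gauges; the update algebra is the same.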

  9. Using Two Different Approaches to Assess Dietary Patterns: Hypothesis-Driven and Data-Driven Analysis.

    PubMed

    Previdelli, Ágatha Nogueira; de Andrade, Samantha Caesar; Fisberg, Regina Mara; Marchioni, Dirce Maria

    2016-09-23

    The use of dietary patterns to assess dietary intake has become increasingly common in nutritional epidemiology studies due to the complexity and multidimensionality of the diet. Currently, two main approaches have been widely used to assess dietary patterns: data-driven and hypothesis-driven analysis. Since the methods explore different angles of dietary intake, using both approaches simultaneously might yield complementary and useful information; thus, we aimed to use both approaches to gain knowledge of adolescents' dietary patterns. Food intake from a cross-sectional survey with 295 adolescents was assessed by 24-h dietary recall (24HR). In hypothesis-driven analysis, based on the American National Cancer Institute method, the usual intake of Brazilian Healthy Eating Index Revised components was estimated. In the data-driven approach, the usual intake of foods/food groups was estimated by the Multiple Source Method. In the results, hypothesis-driven analysis showed low scores for Whole grains, Total vegetables, Total fruit and Whole fruits, while in data-driven analysis fruits and whole grains did not appear in any pattern. High intakes of sodium, fats and sugars were observed in both: hypothesis-driven analysis showed low total scores for the Sodium, Saturated fat and SoFAA (calories from solid fat, alcohol and added sugar) components, while the data-driven approach showed intake of several foods/food groups rich in these nutrients, such as butter/margarine, cookies, chocolate powder, whole milk, cheese, processed meat/cold cuts and candies. In this study, using both approaches at the same time provided consistent and complementary information with regard to assessing overall dietary habits, which will be important for driving public health programs and improving their efficiency in monitoring and evaluating the dietary patterns of populations.

  10. Reducing hospital-acquired heel ulcer rates in an acute care facility: an evaluation of a nurse-driven performance improvement project.

    PubMed

    McElhinny, Mary Louise; Hooper, Christine

    2008-01-01

    A nurse-driven performance improvement project designed to reduce the incidence of hospital-acquired ulcers of the heel in an acute care setting was evaluated. This was a descriptive evaluative study using secondary data analysis. Data were collected in 2004, prior to implementation of the prevention project, and compared to results obtained in 2006, after the project was implemented. Data were collected in a 172-bed, not-for-profit inpatient acute care facility in North Central California. All medical-surgical inpatients aged 18 years and older were included in the samples. Data were collected on 113 inpatients prior to implementation of the project in 2004. Data were also collected on a sample of 124 inpatients in 2006. The prevalence and incidence of heel pressure ulcers were obtained through skin surveys prior to implementation of the prevention program and following its implementation. Results from 2004 were compared to data collected in 2006 after introduction of the Braden Scale for Predicting Pressure Sore Risk. Heel pressure ulcers were staged using the National Pressure Ulcer Advisory Panel (NPUAP) staging system and recommendations provided by the Agency for Health Care Quality Research (AHRQ) clinical practice guidelines. The incidence of hospital-acquired heel pressure ulcers in 2004 was 13.5% (4 of 37 patients). After implementation of the program in 2006, the incidence of hospital-acquired heel pressure ulcers was 13.8% (5 of 36 patients). The intervention did not appear to receive the staff nurse support needed to make the project successful. Factors that influenced the lack of support may have included: (1) the educational method used, (2) lack of organization-approved, evidence-based standardized protocols for prevention and treatment of heel ulcers, and (3) failure of facility management to convey the importance of, as well as their support for, the project.

  11. Data-driven train set crash dynamics simulation

    NASA Astrophysics Data System (ADS)

    Tang, Zhao; Zhu, Yunrui; Nie, Yinyu; Guo, Shihui; Liu, Fengjia; Chang, Jian; Zhang, Jianjun

    2017-02-01

    Traditional finite element (FE) methods are computationally expensive for simulating train crashes. This high computational cost limits their direct application in investigating the dynamic behaviours of an entire train set for crashworthiness design and structural optimisation. On the contrary, multi-body modelling is widely used because of its low computational cost, at the expense of accuracy. In this study, a data-driven train crash modelling method is proposed to improve the performance of a multi-body dynamics simulation of a train set crash without increasing the computational burden. This is achieved by the parallel random forest algorithm, a machine learning approach that extracts useful patterns from force-displacement curves and predicts a force-displacement relation for a given collision condition from a collection of offline FE simulation data on various collision conditions, namely different crash velocities in our analysis. Using the FE simulation results as a benchmark, we compared our method with traditional multi-body modelling methods, and the results show that our data-driven method improves the accuracy over traditional multi-body models in train crash simulation while running at the same level of efficiency.
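
The surrogate-model idea, predicting a force-displacement curve for an unseen crash velocity from a library of offline FE runs, can be sketched as below. The paper uses a parallel random forest; here a much simpler inverse-distance-weighted average over the two nearest library velocities stands in for it, just to show the offline-library-to-online-prediction data flow. The curve shape is invented, not real crash physics.

```python
import numpy as np

def toy_fe_curve(v, disp):
    """Stand-in for an offline FE force-displacement curve at crash speed v."""
    return v * disp * np.exp(-disp)          # invented shape, not real physics

disp = np.linspace(0.0, 5.0, 100)            # displacement samples
library_v = np.array([10.0, 15.0, 20.0, 25.0])   # offline FE crash velocities
library = {v: toy_fe_curve(v, disp) for v in library_v}

def predict_curve(v_query):
    """Predict a curve for an unseen velocity from the offline library."""
    d = np.abs(library_v - v_query)
    nearest = library_v[np.argsort(d)[:2]]   # two closest library runs
    w = 1.0 / (np.abs(nearest - v_query) + 1e-9)
    w /= w.sum()                             # inverse-distance weights
    return sum(wi * library[vi] for wi, vi in zip(w, nearest))

pred = predict_curve(17.5)                   # between the 15 and 20 m/s runs
truth = toy_fe_curve(17.5, disp)
rel_err = np.abs(pred - truth).max() / truth.max()
print(rel_err < 0.01)
```

A random forest regressor would replace `predict_curve` in the paper's pipeline; the predicted curve then drives the contact force in the multi-body simulation.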

  12. Severity of psychiatric and physical problems is associated with lower quality of life in methadone patients in Indonesia.

    PubMed

    Iskandar, Shelly; van Crevel, Reinout; Hidayat, Teddy; Siregar, Ike M P; Achmad, Tri H; van der Ven, Andre J; De Jong, Cor A

    2013-01-01

    The goal of methadone maintenance treatment (MMT) is to reduce harm and to improve patients' quality of life (Qol). However, the Qol is also influenced by other co-occurring disorders. Data regarding the Qol and the co-occurrence of these disorders are lacking in low- and middle-income countries. We therefore describe the prevalence of physical, psychiatric, and drug abuse co-occurring disorders among MMT patients in Indonesia and determine the association between the severity of the co-occurring disorders and the Qol. Data were collected from 112 injection drug abusers (IDUs) attending a MMT program in West Java, Indonesia, using validated questionnaires, medical records and laboratory testing. For comparison, 154 IDUs not enrolled in MMT were recruited by respondent-driven sampling. The most frequent co-occurring disorders were hepatitis C (92%), HIV (77%), benzodiazepine abuse (56%), and anxiety disorders (32%). IDUs in MMT had one (26%), two (47%), or three (27%) co-occurring disorders. Higher severity of psychiatric and physical problems was associated with poorer Qol. IDUs not enrolled in MMT had similar co-occurring problems. The prevalence of co-occurring disorders in IDUs in Indonesia is high, and these disorders influence their Qol. Therefore, comprehensive treatment, especially focusing on the common co-occurring disorders, should be provided in MMT to improve the Qol. Copyright © American Academy of Addiction Psychiatry.

  13. A Decision Fusion Framework for Treatment Recommendation Systems.

    PubMed

    Mei, Jing; Liu, Haifeng; Li, Xiang; Xie, Guotong; Yu, Yiqin

    2015-01-01

    Treatment recommendation is a nontrivial task: it requires not only domain knowledge from evidence-based medicine, but also data insights from descriptive, predictive and prescriptive analysis. A single treatment recommendation system is usually trained or modeled from a limited source (in size or quality). This paper proposes a decision fusion framework, combining both knowledge-driven and data-driven decision engines for treatment recommendation. End users (e.g. using the clinician workstation or mobile apps) can have a comprehensive view of the various engines' opinions, as well as the final decision after fusion. For implementation, we leverage several well-known fusion algorithms, such as decision templates and meta classifiers (e.g. logistic regression and SVM). Using an outcome-driven evaluation metric, we compare the fusion engine with the base engines, and our experimental results show that decision fusion is a promising way towards more valuable treatment recommendation.
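
One of the fusion strategies mentioned, a logistic-regression meta-classifier stacked on the base engines' outputs, can be sketched as follows. The "engines" and outcome labels below are simulated stand-ins: in the paper one engine would be knowledge-driven and the other data-driven, scored against real treatment outcomes.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 400
y = rng.integers(0, 2, n)                             # simulated outcomes

# Two noisy base engines, each emitting a probability correlated with y.
p_rules = np.clip(y + rng.normal(0, 0.45, n), 0, 1)   # "knowledge-driven"
p_model = np.clip(y + rng.normal(0, 0.35, n), 0, 1)   # "data-driven"
X = np.column_stack([p_rules, p_model])

def fit_logistic(X, y, lr=0.5, epochs=300):
    """Meta-classifier: logistic regression by batch gradient descent."""
    Xb = np.column_stack([np.ones(len(X)), X])        # add intercept
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - y) / len(y)
    return w

w = fit_logistic(X, y)
fused = 1.0 / (1.0 + np.exp(-(np.column_stack([np.ones(n), X]) @ w)))

def accuracy(p, y):
    return ((p > 0.5) == y).mean()

acc = {"rules": accuracy(p_rules, y), "model": accuracy(p_model, y),
       "fused": accuracy(fused, y)}
print(acc["fused"] >= acc["rules"])   # fusion should not be worse than a base engine
```

Decision templates or an SVM meta-classifier would slot into the same place as `fit_logistic`; the framework's point is that the fused opinion is reported alongside the base engines' opinions.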

  14. 42 CFR 480.114 - Limitation on data collection.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... (CONTINUED) QUALITY IMPROVEMENT ORGANIZATIONS ACQUISITION, PROTECTION, AND DISCLOSURE OF QUALITY IMPROVEMENT ORGANIZATION INFORMATION Utilization and Quality Control Quality Improvement Organizations (QIOs) Qio Access to Information § 480.114 Limitation on data collection. A QIO or any agent, organization, or institution acting...

  15. 42 CFR 480.114 - Limitation on data collection.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... (CONTINUED) QUALITY IMPROVEMENT ORGANIZATIONS ACQUISITION, PROTECTION, AND DISCLOSURE OF QUALITY IMPROVEMENT ORGANIZATION REVIEW INFORMATION Utilization and Quality Control Quality Improvement Organizations (QIOs) Qio Access to Information § 480.114 Limitation on data collection. A QIO or any agent, organization, or...

  16. 42 CFR 480.114 - Limitation on data collection.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... (CONTINUED) QUALITY IMPROVEMENT ORGANIZATIONS ACQUISITION, PROTECTION, AND DISCLOSURE OF QUALITY IMPROVEMENT ORGANIZATION INFORMATION Utilization and Quality Control Quality Improvement Organizations (QIOs) Qio Access to Information § 480.114 Limitation on data collection. A QIO or any agent, organization, or institution acting...

  17. 42 CFR 480.143 - QIO involvement in shared health data systems.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... HUMAN SERVICES (CONTINUED) QUALITY IMPROVEMENT ORGANIZATIONS ACQUISITION, PROTECTION, AND DISCLOSURE OF QUALITY IMPROVEMENT ORGANIZATION REVIEW INFORMATION Utilization and Quality Control Quality Improvement Organizations (QIOs) Disclosure of Confidential Information § 480.143 QIO involvement in shared health data...

  18. 42 CFR 480.114 - Limitation on data collection.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... (CONTINUED) QUALITY IMPROVEMENT ORGANIZATIONS ACQUISITION, PROTECTION, AND DISCLOSURE OF QUALITY IMPROVEMENT ORGANIZATION INFORMATION Utilization and Quality Control Quality Improvement Organizations (QIOs) Qio Access to Information § 480.114 Limitation on data collection. A QIO or any agent, organization, or institution acting...

  19. 42 CFR 480.143 - QIO involvement in shared health data systems.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... HUMAN SERVICES (CONTINUED) QUALITY IMPROVEMENT ORGANIZATIONS ACQUISITION, PROTECTION, AND DISCLOSURE OF QUALITY IMPROVEMENT ORGANIZATION REVIEW INFORMATION Utilization and Quality Control Quality Improvement Organizations (QIOs) Disclosure of Confidential Information § 480.143 QIO involvement in shared health data...

  20. 42 CFR 480.114 - Limitation on data collection.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... (CONTINUED) QUALITY IMPROVEMENT ORGANIZATIONS ACQUISITION, PROTECTION, AND DISCLOSURE OF QUALITY IMPROVEMENT ORGANIZATION REVIEW INFORMATION Utilization and Quality Control Quality Improvement Organizations (QIOs) Qio Access to Information § 480.114 Limitation on data collection. A QIO or any agent, organization, or...

  1. A system framework of inter-enterprise machining quality control based on fractal theory

    NASA Astrophysics Data System (ADS)

    Zhao, Liping; Qin, Yongtao; Yao, Yiyong; Yan, Peng

    2014-03-01

    In order to meet the quality control requirements of dynamic and complicated product machining processes among enterprises, a system framework of inter-enterprise machining quality control based on fractal theory was proposed. In this system framework, the fractal-specific characteristics of the inter-enterprise machining quality control function were analysed, and the model of inter-enterprise machining quality control was constructed from the nature of fractal structures. Furthermore, the goal-driven strategy of inter-enterprise quality control and the dynamic organisation strategy of inter-enterprise quality improvement were constructed through characteristic analysis of this model. In addition, the architecture of inter-enterprise machining quality control based on fractal theory was established by means of Web services. Finally, a case study for application was presented. The results showed that the proposed method is effective and can provide guidance for quality control and support for product reliability in inter-enterprise machining processes.

  2. Data consistency-driven scatter kernel optimization for x-ray cone-beam CT

    NASA Astrophysics Data System (ADS)

    Kim, Changhwan; Park, Miran; Sung, Younghun; Lee, Jaehak; Choi, Jiyoung; Cho, Seungryong

    2015-08-01

    Accurate and efficient scatter correction is essential for acquisition of high-quality x-ray cone-beam CT (CBCT) images for various applications. This study was conducted to demonstrate the feasibility of using the data consistency condition (DCC) as a criterion for scatter kernel optimization in scatter deconvolution methods in CBCT. Because data consistency in the mid-plane of CBCT is primarily challenged by scatter, we utilized data consistency to confirm the degree of scatter correction and to steer the update in iterative kernel optimization. By means of the parallel-beam DCC via fan-parallel rebinning, we iteratively optimized the scatter kernel parameters, using a particle swarm optimization algorithm for its computational efficiency and excellent convergence. The proposed method was validated by a simulation study using the XCAT numerical phantom and also by experimental studies using the ACS head phantom and the pelvic part of the Rando phantom. The results showed that the proposed method can effectively improve the accuracy of deconvolution-based scatter correction. Quantitative assessments of image quality parameters such as contrast and structural similarity (SSIM) revealed that the optimally selected scatter kernel improves the contrast by up to 99.5%, 94.4%, and 84.4% of the scatter-free values, and the SSIM by up to 96.7%, 90.5%, and 87.8%, in the XCAT study, the ACS head phantom study, and the pelvis phantom study, respectively. The proposed method can achieve accurate and efficient scatter correction from a single cone-beam scan without the need for auxiliary hardware or additional experimentation.
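
The optimization loop, a particle swarm search over scatter-kernel parameters steered by a consistency measure, can be sketched as below. Everything here is an invented stand-in: a 1-D "projection", a two-parameter Gaussian scatter model, and a residual cost in place of the paper's parallel-beam DCC metric. Only the PSO mechanics match the described approach.

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(-5, 5, 101)
primary = np.exp(-x**2)                       # toy scatter-free projection

def scatter(params):
    a, s = params                             # amplitude and width
    return a * np.exp(-x**2 / (2 * s**2))     # toy Gaussian scatter model

true_params = np.array([0.3, 2.0])
measured = primary + scatter(true_params)     # simulated contaminated data

def cost(params):
    """Consistency residual: corrected data should match the primary."""
    corrected = measured - scatter(params)
    return ((corrected - primary) ** 2).sum()

def pso(cost, lo, hi, n_particles=30, iters=60, w=0.7, c1=1.5, c2=1.5):
    """Standard global-best particle swarm optimization."""
    pos = rng.uniform(lo, hi, (n_particles, 2))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.array([cost(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, 1))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([cost(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest

est = pso(cost, lo=np.array([0.0, 0.1]), hi=np.array([1.0, 5.0]))
print(np.abs(est - true_params).max() < 0.1)  # swarm recovers the kernel
```

In the paper the cost function would evaluate the DCC on rebinned fan-parallel data rather than comparing against a known primary, but the swarm update is the same.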

  3. Assessing the impact of continuous quality improvement/total quality management: concept versus implementation.

    PubMed Central

    Shortell, S M; O'Brien, J L; Carman, J M; Foster, R W; Hughes, E F; Boerstler, H; O'Connor, E J

    1995-01-01

    OBJECTIVE: This study examines the relationships among organizational culture, quality improvement processes and selected outcomes for a sample of up to 61 U.S. hospitals. DATA SOURCES AND STUDY SETTING: Primary data were collected from 61 U.S. hospitals (located primarily in the midwest and the west) on measures related to continuous quality improvement/total quality management (CQI/TQM), organizational culture, implementation approaches, and degree of quality improvement implementation based on the Baldrige Award criteria. These data were combined with independently collected data on perceived impact and objective measures of clinical efficiency (i.e., charges and length of stay) for six clinical conditions. STUDY DESIGN: The study involved cross-sectional examination of the named relationships. DATA COLLECTION/EXTRACTION METHODS: Reliable and valid scales for the organizational culture and quality improvement implementation measures were developed based on responses from over 7,000 individuals across the 61 hospitals with an overall completion rate of 72 percent. Independent data on perceived impact were collected from a national survey and independent data on clinical efficiency from a companion study of managed care. PRINCIPAL FINDINGS: A participative, flexible, risk-taking organizational culture was significantly related to quality improvement implementation. Quality improvement implementation, in turn, was positively associated with greater perceived patient outcomes and human resource development. Larger-size hospitals experienced lower clinical efficiency with regard to higher charges and higher length of stay, due in part to having more bureaucratic and hierarchical cultures that serve as a barrier to quality improvement implementation. CONCLUSIONS: What really matters is whether or not a hospital has a culture that supports quality improvement work and an approach that encourages flexible implementation. 
Larger-size hospitals face more difficult challenges in this regard. PMID:7782222

  4. Electronic referrals: what matters to the users.

    PubMed

    Warren, Jim; Gu, Yulong; Day, Karen; White, Sue; Pollock, Malcolm

    2012-01-01

    Between September 2010 and May 2011 we evaluated three implementations of electronic referral (eReferral) systems at Hutt Valley, Northland and Canterbury District Health Boards in New Zealand. Qualitative and quantitative data were gathered through project documentation, database records and stakeholder interviews. This paper reports on the user perspectives based on interviews with 78 clinical, management and operational stakeholders in the three regions. Themes that emerge across the regions are compared and synthesised. Interviews focused on pre-planned domains including quality of referral, ease of use and patient safety, but agendas were adapted progressively to elaborate and triangulate on themes emerging from earlier interviews and to clarify indications from analysis of database records. The eReferral users, including general practitioners, specialists and administrative staff, report benefits in the areas of: (1) availability and transparency of referral-related data; (2) work transformation; (3) improved data quality and (4) the convenience of auto-population from the practice management system into the referral forms. eReferral provides enhanced visibility of referral data and status within the limits of the implementation (which only goes to the hospital door in some cases). Users in all projects indicated the desire to further exploit IT to enhance two-way communication between community and hospital. Reduced administrative handling is a clear work transformation benefit with mixed feedback regarding clinical workload impact. Innovations such as GP eReferral triaging teams illustrate the further potential for workflow transformation. Consistent structure in eReferrals, as well as simple legibility, enhances data quality. Efficiency and completeness is provided by auto-population of forms from system data, but opens issues around data accuracy. All three projects highlight the importance of user involvement in design, implementation and refinement. 
In keeping with this, Canterbury utilises a systematic pathway definition process that brings together GPs and specialists to debate and agree on the local management of a condition. User feedback exposes many opportunities for improving usability. The findings are based on individual experiences recounted by participating stakeholders; the risk of bias is mitigated, however, by triangulation across three distinct implementations of eReferrals. Quantitative follow-up on key outstanding issues, notably the impact of structured eReferral forms on GP time to write a referral, is warranted. Key eReferral users include clinicians on both ends of the referral process as well as the administrative staff. User experience in three eReferral projects has shown that they particularly appreciate improved referral visibility and information quality; promising workflow transformations have been achieved in some places. Auto-population of forms leads to opportunities, and issues, that are prompting further attention to data quality. While the importance of user feedback should be obvious, it is not universal to seek it or to provide resources to effectively follow up with improvements driven by such feedback. To maximise benefits, innovative health IT projects must take an iterative approach guided by ongoing user experience.

  5. Effect of cause-of-death training on agreement between hospital discharge diagnoses and cause of death reported, inpatient hospital deaths, New York City, 2008-2010.

    PubMed

    Ong, Paulina; Gambatese, Melissa; Begier, Elizabeth; Zimmerman, Regina; Soto, Antonio; Madsen, Ann

    2015-01-15

    Accurate cause-of-death reporting is required for mortality data to validly inform public health programming and evaluation. Research demonstrates overreporting of heart disease on New York City death certificates. We describe changes in reported causes of death following a New York City health department training conducted in 2009 to improve accuracy of cause-of-death reporting at 8 hospitals. The objective of our study was to assess the degree to which death certificates citing heart disease as cause of death agreed with hospital discharge data and the degree to which training improved accuracy of reporting. We analyzed 74,373 death certificates for 2008 through 2010 that were linked with hospital discharge records for New York City inpatient deaths and calculated the proportion of discordant deaths, that is, death certificates reporting an underlying cause of heart disease with no corresponding discharge record diagnosis. We also summarized top principal diagnoses among discordant reports and calculated the proportion of inpatient deaths reporting sepsis, a condition underreported in New York City, to assess whether documentation practices changed in response to clarifications made during the intervention. Citywide discordance between death certificates and discharge data decreased from 14.9% in 2008 to 9.6% in 2010 (P < .001), driven by a decrease in discordance at intervention hospitals (20.2% in 2008 to 8.9% in 2010; P < .001). At intervention hospitals, reporting of sepsis increased from 3.7% of inpatient deaths in 2008 to 20.6% in 2010 (P < .001). Overreporting of heart disease as cause of death declined at intervention hospitals, driving a citywide decline, and sepsis reporting practices changed in accordance with health department training. Researchers should consider the effect of overreporting and data-quality changes when analyzing New York City heart disease mortality trends. 
Other vital records jurisdictions should employ similar interventions to improve cause-of-death reporting and use linked discharge data to monitor data quality.
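
The discordance measure described above reduces to a per-cause set comparison between each death certificate's underlying cause and the diagnoses on its linked discharge record. A minimal sketch with a hypothetical record structure (the study's actual ICD-coded linkage is not shown in the abstract):

```python
# Each linked record pairs a death certificate's underlying cause with the
# set of diagnosis codes from the matching hospital discharge record.
def discordance_rate(records, cause="heart_disease"):
    """Proportion of deaths certified with `cause` whose linked discharge
    record lists no corresponding diagnosis (the study's discordance measure)."""
    certified = [r for r in records if r["underlying_cause"] == cause]
    if not certified:
        return 0.0
    discordant = [r for r in certified
                  if cause not in r["discharge_diagnoses"]]
    return len(discordant) / len(certified)

records = [
    {"underlying_cause": "heart_disease",
     "discharge_diagnoses": {"heart_disease", "sepsis"}},
    {"underlying_cause": "heart_disease",
     "discharge_diagnoses": {"pneumonia"}},  # discordant: no heart diagnosis
    {"underlying_cause": "cancer",
     "discharge_diagnoses": {"cancer"}},
]
print(discordance_rate(records))  # → 0.5
```

Trending this rate per hospital and per year, as the study does, is then a matter of grouping the records before applying the same computation.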

  6. ESIP Information Quality Cluster (IQC)

    NASA Technical Reports Server (NTRS)

    Ramapriyan, H. K.; Peng, Ge; Moroni, David F.

    2016-01-01

    The Information Quality Cluster (IQC) within the Federation of Earth Science Information Partners (ESIP) was initially formed in 2011 and has evolved significantly over time. The current objectives of the IQC are to: 1. Actively evaluate community data quality best practices and standards; 2. Improve capture, description, discovery, and usability of information about data quality in Earth science data products; 3. Ensure producers of data products are aware of standards and best practices for conveying data quality, and that data providers (distributors, intermediaries) establish, improve and evolve mechanisms to assist users in discovering and understanding data quality information; and 4. Consistently provide guidance to data managers and stewards on how best to implement data quality standards and best practices to ensure and improve maturity of their data products. The activities of the IQC include: 1. Identification of additional needs for consistently capturing, describing, and conveying quality information through use case studies with broad and diverse applications; 2. Establishing and providing community-wide guidance on roles and responsibilities of key players and stakeholders including users and management; 3. Prototyping of conveying quality information to users in a more consistent, transparent, and digestible manner; 4. Establishing a baseline of standards and best practices for data quality; 5. Evaluating recommendations from NASA's DQWG in a broader context and proposing possible implementations; and 6. Engaging data providers, data managers, and data user communities as resources to improve our standards and best practices. Following the principles of openness of the ESIP Federation, the IQC invites all individuals interested in improving the capture, description, discovery, and usability of information about data quality in Earth science data products to participate in its activities.

  7. Cardiac surgical outcomes improvement led by a physician champion working with a nurse clinical coordinator.

    PubMed

    Stanford, John R; Swaney-Berghoff, Laurie; Recht, Kimberly

    2012-01-01

    Cardiac surgical outcomes improvement in a community hospital was driven by a physician champion working with a nurse clinical coordinator. Specific system improvements implemented were (1) nurse checklists of vital signs, cardiovascular function parameters, and life support appliance operation; (2) use of the EuroSCORE system of preoperative patient risk assessment; (3) monthly morbidity and mortality conferences; and (4) daily patient progress tracking. The hospital received 1 star (bottom 12% of hospitals for quality outcomes) from the Society of Thoracic Surgeons Adult Cardiac Database in 2006 prior to program inception, 2 stars (middle 76% of hospitals for quality outcomes) in 2007 and 2008, and 3 stars (top 12% of hospitals) in 2009. The physician and nurse together combined a strategy for clinical improvement with the cultural practices at the hospital to ensure that system improvements approved at the strategic level were implemented at the point of care. Both strategy and culture must be addressed to ensure patient outcomes improvement.

  8. Simulation and Sensitivity in a Nested Modeling System for South America. Part II: GCM Boundary Forcing.

    NASA Astrophysics Data System (ADS)

    Rojas, Maisa; Seth, Anji

    2003-08-01

    In Part I of this study, the RegCM's ability to simulate circulation and rainfall observed in the two extreme seasons was demonstrated when driven at the lateral boundaries by reanalyzed forcing. Seasonal integrations with the RegCM driven by GCM ensemble-derived lateral boundary forcing demonstrate that the nested model responds well to the SST forcing, by capturing the major features of the circulation and rainfall differences between the two years. The GCM-driven model also improves upon the monthly evolution of rainfall compared with that from the GCM. However, the nested model rainfall simulations for the two seasons are degraded compared with those from the reanalyses-driven RegCM integrations. The poor location of the Atlantic intertropical convergence zone (ITCZ) in the GCM leads to excess rainfall in Nordeste in the nested model. An expanded domain was tested, wherein the RegCM was permitted more internal freedom to respond to SST and regional orographic forcing. Results show that the RegCM is able to improve the location of the ITCZ, and the seasonal evolution of rainfall in Nordeste, the Amazon region, and the southeastern region of Brazil. However, it remains that the limiting factor in the skill of the nested modeling system is the quality of the lateral boundary forcing provided by the global model.

  9. 39 CFR 3050.42 - Proceedings to improve the quality of financial data.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    § 3050.42 Proceedings to improve the quality of financial data. The Commission may, on its own motion or on request of an interested party, initiate proceedings to improve the quality, accuracy, or...

  10. Dynamic Self-adaptive Remote Health Monitoring System for Diabetics

    PubMed Central

    Suh, Myung-kyung; Moin, Tannaz; Woodbridge, Jonathan; Lan, Mars; Ghasemzadeh, Hassan; Bui, Alex; Ahmadi, Sheila; Sarrafzadeh, Majid

    2016-01-01

    Diabetes is the seventh leading cause of death in the United States. In 2010, about 1.9 million new cases of diabetes were diagnosed in people aged 20 years or older. Remote health monitoring systems can help diabetics and their healthcare professionals monitor health-related measurements by providing real-time feedback. However, data-driven methods to dynamically prioritize and generate tasks are not well investigated in remote health monitoring. This paper presents a task optimization technique used in WANDA (Weight and Activity with Blood Pressure and Other Vital Signs), a wireless health project that leverages sensor technology and wireless communication to monitor the health status of patients with diabetes. WANDA applies data analytics in real time to improve the quality of care. The developed algorithm minimizes the number of daily tasks required of diabetic patients using association rules that satisfy a minimum support threshold. Each of the retained tasks maximizes information gain, thereby improving the overall level of care. Experimental results show that the developed algorithm can reduce the number of tasks by up to 28.6% with a minimum support of 0.95 and a minimum confidence of 0.97, with high efficiency. PMID:23366365
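
The abstract does not give WANDA's algorithm in full, but its core ingredient, association rules passing minimum support and confidence thresholds, can be sketched as follows. Task names, the toy transaction log, and the thresholds here are illustrative, not the study's:

```python
from itertools import combinations

# Toy transaction log: each day's set of completed monitoring tasks.
# (Hypothetical task names; the abstract does not list WANDA's actual tasks.)
transactions = [
    {"weight", "bp", "glucose"},
    {"weight", "bp", "glucose"},
    {"weight", "bp"},
    {"weight", "bp", "glucose"},
]

def support(itemset):
    """Fraction of transactions containing every item in `itemset`."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confident_rules(items, min_support=0.5, min_confidence=0.9):
    """Pairwise rules (antecedent, consequent, confidence) meeting both
    thresholds. A task that appears as the consequent of a high-confidence
    rule is predictable from another task, so it is a candidate for removal
    from the patient's daily task list."""
    rules = []
    for a, b in combinations(sorted(items), 2):
        s_ab = support({a, b})
        if s_ab < min_support:
            continue
        for ante, cons in ((a, b), (b, a)):
            conf = s_ab / support({ante})
            if conf >= min_confidence:
                rules.append((ante, cons, round(conf, 2)))
    return rules

print(confident_rules({"weight", "bp", "glucose"}))
```

With this log, "glucose → weight" holds with confidence 1.0, so the weight task could be dropped on days the glucose task is scheduled; the reverse rule fails the 0.9 confidence cutoff.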

  11. Legitimacy, trustee incentives, and board processes: the case of public and private non-profit nursing homes.

    PubMed

    Dewaelheyns, Nico; Eeckloo, Kristof; Van Hulle, Cynthia

    2011-01-01

    Using a unique data set, this study explores how type of ownership (government/private) is related to processes of governance. The findings suggest that the neo-institutional perspective and the self-interest rationale of the agency perspective are helpful in explaining processes of governance in both government- and privately owned non-profit organizations. Due to adverse incentives and the quest for legitimacy, supervising governance bodies within local government-owned non-profit institutions pay relatively less attention to the development of high quality supervising bodies and delegate little to management. Our findings also indicate that governance processes in private institutions are more aligned with the business model and that this alignment is likely driven by a concern to improve decision making. By contrast, our data also suggest that in local government-owned institutions re-election concerns of politicians-trustees are an important force in the governance processes of these institutions. In view of these adverse incentives - in contrast to the case of private organizations - a governance code is unlikely to entail much improvement in government-owned organizations. Copyright © 2010 John Wiley & Sons, Ltd.

  12. In Search of Effective Solutions to Curb Workplace Violence.

    PubMed

    Arnetz, Judith; Lipscomb, Jane; Ogaitis, Joanne

    2017-04-01

    Investigators have applied epidemiological principles to the study of workplace violence, producing results that offer intriguing information to hospitals struggling for a way forward on this issue. In a randomized, controlled trial, the researchers found that a one-time, unit-based intervention can reduce the incidence of violent events, and that the approach offers some lasting effect over time. The intervention consisted of a 45-minute discussion with unit supervisors in which unit-specific data regarding violent incidents in their workplace were shared along with an array of improvement strategies. Unit supervisors then were directed to work with their teams to develop action plans to address violence, although they were free to adopt whatever solutions they deemed best. At six months post-intervention, there was a clear reduction in the incident rate ratios of violent events on the intervention units as compared with control units that did not conduct an intervention. Experts note that the study demonstrates that an effective workplace violence intervention or program must be data-driven and based on principles of continuous quality improvement.

  13. On the Internet of Things, smart cities and the WHO Healthy Cities

    PubMed Central

    2014-01-01

    This article gives a brief overview of the Internet of Things (IoT) for cities, offering examples of IoT-powered 21st century smart cities, including the experience of the Spanish city of Barcelona in implementing its own IoT-driven services to improve the quality of life of its people through measures that promote an eco-friendly, sustainable environment. The potential benefits as well as the challenges associated with IoT for cities are discussed. Much of the 'big data' that are continuously generated by IoT sensors, devices, systems and services are geo-tagged or geo-located. The importance of having robust, intelligent geospatial analytics systems in place to process and make sense of such data in real time cannot therefore be overestimated. The authors argue that IoT-powered smart cities stand better chances of becoming healthier cities. The World Health Organization (WHO) Healthy Cities Network and associated national networks have hundreds of member cities around the world that could benefit from, and harness the power of, IoT to improve the health and well-being of their local populations. PMID:24669838

  14. On the Internet of Things, smart cities and the WHO Healthy Cities.

    PubMed

    Kamel Boulos, Maged N; Al-Shorbaji, Najeeb M

    2014-03-27

    This article gives a brief overview of the Internet of Things (IoT) for cities, offering examples of IoT-powered 21st century smart cities, including the experience of the Spanish city of Barcelona in implementing its own IoT-driven services to improve the quality of life of its people through measures that promote an eco-friendly, sustainable environment. The potential benefits as well as the challenges associated with IoT for cities are discussed. Much of the 'big data' that are continuously generated by IoT sensors, devices, systems and services are geo-tagged or geo-located. The importance of having robust, intelligent geospatial analytics systems in place to process and make sense of such data in real time cannot therefore be overestimated. The authors argue that IoT-powered smart cities stand better chances of becoming healthier cities. The World Health Organization (WHO) Healthy Cities Network and associated national networks have hundreds of member cities around the world that could benefit from, and harness the power of, IoT to improve the health and well-being of their local populations.

  15. A biomechanical modeling-guided simultaneous motion estimation and image reconstruction technique (SMEIR-Bio) for 4D-CBCT reconstruction

    NASA Astrophysics Data System (ADS)

    Huang, Xiaokun; Zhang, You; Wang, Jing

    2018-02-01

    Reconstructing four-dimensional cone-beam computed tomography (4D-CBCT) images directly from respiratory phase-sorted traditional 3D-CBCT projections can capture target motion trajectory, reduce motion artifacts, and reduce imaging dose and time. However, the limited number of projections in each phase after phase-sorting decreases CBCT image quality under traditional reconstruction techniques. To address this problem, we developed a simultaneous motion estimation and image reconstruction (SMEIR) algorithm, an iterative method that can reconstruct higher quality 4D-CBCT images from limited projections using an inter-phase intensity-driven motion model. However, the accuracy of the intensity-driven motion model is limited in regions with fine details whose quality is degraded by the insufficient number of projections, which consequently degrades the reconstructed image quality in corresponding regions. In this study, we developed a new 4D-CBCT reconstruction algorithm by introducing biomechanical modeling into SMEIR (SMEIR-Bio) to boost the accuracy of the motion model in regions with small fine structures. The biomechanical modeling uses tetrahedral meshes to model organs of interest and solves internal organ motion using tissue elasticity parameters and mesh boundary conditions. This physics-driven approach enhances the accuracy of solved motion in the organ's fine-structure regions. This study used 11 lung patient cases to evaluate the performance of SMEIR-Bio, making both qualitative and quantitative comparisons between SMEIR-Bio, SMEIR, and the algebraic reconstruction technique with total variation regularization (ART-TV). The reconstruction results suggest that SMEIR-Bio improves the motion model's accuracy in regions containing small fine details, which consequently enhances the accuracy and quality of the reconstructed 4D-CBCT images.

  16. [Technical background of data collection for parametric observation of total mesorectal excision (TME) in rectal cancer].

    PubMed

    Bláha, M; Hoch, J; Ferko, A; Ryška, A; Hovorková, E

    Improvement in any human activity is preconditioned by inspection of results and feedback used to modify the processes applied. Comparison of experts' experience in the given field is another indispensable part, leading to optimisation and improvement of processes and, optimally, to implementation of standards. For the purpose of objective comparison and assessment of the processes, it is always necessary to describe the processes in a parametric way, to obtain representative data, to assess the achieved results, and to provide unquestionable, data-driven feedback based on such analysis. This may lead to a consensus on the definition of standards in the given area of health care. Total mesorectal excision (TME) is a standard procedure in rectal cancer (C20) surgical treatment. However, the quality of performed procedures varies across health care facilities, which is given, among other factors, by internal processes and surgeons' experience. Assessment of surgical treatment results is therefore of key importance. A pathologist who assesses the resected tissue can provide valuable feedback in this respect. An information system for the parametric assessment of TME performance is described in our article, including the technical background in the form of a multicentre clinical registry and the structure of observed parameters. We consider the proposed system of TME parametric assessment significant for improvement of TME performance, aimed at reducing local recurrences and improving the overall prognosis of patients. Keywords: rectal cancer, total mesorectal excision, parametric data, clinical registries, TME registry.

  17. The swiss neonatal quality cycle, a monitor for clinical performance and tool for quality improvement

    PubMed Central

    2013-01-01

    Background We describe the setup of a neonatal quality improvement tool and list which peer-reviewed requirements it fulfils and which it does not. We report on the effects observed so far, how the units can identify quality improvement potential, and how they can measure the effect of changes made to improve quality. Methods Application of a prospective longitudinal national cohort data collection that uses algorithms to ensure high data quality (i.e. checks for completeness, plausibility and reliability) and to perform data imaging (Plsek's p-charts and standardized mortality or morbidity ratio (SMR) charts). The collected data allow monitoring of a study collective of very low birth-weight infants born from 2009 to 2011 by applying a quality cycle following the steps 'guideline - perform - falsify - reform'. Results 2025 VLBW live-births from 2009 to 2011, representing 96.1% of all VLBW live-births in Switzerland, display a similar mortality rate but better morbidity rates when compared with other networks. Data quality in general is high but subject to improvement in some units. Seven measurements display quality improvement potential in individual units. The methods used fulfil several international recommendations. Conclusions The Quality Cycle of the Swiss Neonatal Network is a helpful instrument to monitor and gradually help improve the quality of care in a region with high quality standards and low statistical discrimination capacity. PMID:24074151
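
As a rough illustration of the Plsek-style p-chart mentioned in the Methods, the 3-sigma control limits for each unit's event proportion follow the standard attribute-chart formula p-bar ± 3·sqrt(p-bar(1−p-bar)/n). The unit counts below are hypothetical, not the network's data:

```python
import math

def p_chart_limits(event_counts, sample_sizes):
    """Center line and per-unit 3-sigma control limits for a proportion
    (p) chart. Unit i contributed sample_sizes[i] infants, of whom
    event_counts[i] experienced the outcome (e.g. a morbidity)."""
    p_bar = sum(event_counts) / sum(sample_sizes)  # pooled center line
    limits = []
    for n in sample_sizes:
        sigma = math.sqrt(p_bar * (1 - p_bar) / n)
        lcl = max(0.0, p_bar - 3 * sigma)  # clamp to the valid [0, 1] range
        ucl = min(1.0, p_bar + 3 * sigma)
        limits.append((lcl, ucl))
    return p_bar, limits

# Hypothetical example: three units with different cohort sizes.
p_bar, limits = p_chart_limits([12, 30, 9], [100, 150, 80])
```

A unit whose observed proportion falls outside its (lcl, ucl) band signals special-cause variation, i.e. quality improvement potential, which is how such charts let units compare themselves against the pooled network rate.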

  18. Twin robotic x-ray system for 2D radiographic and 3D cone-beam CT imaging

    NASA Astrophysics Data System (ADS)

    Fieselmann, Andreas; Steinbrener, Jan; Jerebko, Anna K.; Voigt, Johannes M.; Scholz, Rosemarie; Ritschl, Ludwig; Mertelmeier, Thomas

    2016-03-01

    In this work, we provide an initial characterization of a novel twin robotic X-ray system. This system is equipped with two motor-driven telescopic arms carrying X-ray tube and flat-panel detector, respectively. 2D radiographs and fluoroscopic image sequences can be obtained from different viewing angles. Projection data for 3D cone-beam CT reconstruction can be acquired during simultaneous movement of the arms along dedicated scanning trajectories. We provide an initial evaluation of the 3D image quality based on phantom scans and clinical images. Furthermore, initial evaluation of patient dose is conducted. The results show that the system delivers high image quality for a range of medical applications. In particular, high spatial resolution enables adequate visualization of bone structures. This system allows 3D X-ray scanning of patients in standing and weight-bearing position. It could enable new 2D/3D imaging workflows in musculoskeletal imaging and improve diagnosis of musculoskeletal disorders.

  19. Clinical data interoperability based on archetype transformation.

    PubMed

    Costa, Catalina Martínez; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2011-10-01

    The semantic interoperability between health information systems is a major challenge to improve the quality of clinical practice and patient safety. In recent years many projects have faced this problem and provided solutions based on specific standards and technologies in order to satisfy the needs of a particular scenario. Most of such solutions cannot be easily adapted to new scenarios, thus more global solutions are needed. In this work, we have focused on the semantic interoperability of electronic healthcare records standards based on the dual model architecture and we have developed a solution that has been applied to ISO 13606 and openEHR. The technological infrastructure combines reference models, archetypes and ontologies, with the support of Model-driven Engineering techniques. For this purpose, the interoperability infrastructure developed in previous work by our group has been reused and extended to cover the requirements of data transformation. Copyright © 2011 Elsevier Inc. All rights reserved.

  20. Impact Compaction of a Granular Material

    NASA Astrophysics Data System (ADS)

    Fenton, Gregg; Asay, Blaine; Todd, Steve; Grady, Dennis

    2017-06-01

    The dynamic behavior of granular materials is important to a variety of engineering applications. Although the mechanical behavior of granular materials has been studied extensively for several decades, their dynamic behavior remains poorly understood. High-quality experimental data are needed to improve our general understanding of granular material compaction physics. This paper describes how an instrumented plunger impact system can be used to measure the compaction process for granular materials at high and controlled strain rates, and subsequently used for computational modelling. The experimental technique relies on a gas-gun driven plunger system to generate a compaction wave through a volume of granular material. This volume of material has been redundantly instrumented along the bed length to track the progression of the compaction wave, and the piston displacement is measured with Photon Doppler Velocimetry (PDV). Using the gathered experimental data along with the initial material tap density, a granular material equation of state can be determined.

  1. Team Learning for Healthcare Quality Improvement

    PubMed Central

    Eppstein, Margaret J.; Horbar, Jeffrey D.

    2014-01-01

    In organized healthcare quality improvement collaboratives (QICs), teams of practitioners from different hospitals exchange information on clinical practices with the aim of improving health outcomes at their own institutions. However, what works in one hospital may not work in others with different local contexts because of nonlinear interactions among various demographics, treatments, and practices. In previous studies of collaborations where the goal is collective problem solving, teams of diverse individuals have been shown to outperform teams of similar individuals. However, when the purpose of collaboration is knowledge diffusion in complex environments, it is not clear whether team diversity will help or hinder effective learning. In this paper, we first use an agent-based model of QICs to show that teams comprising similar individuals outperform those with more diverse individuals under nearly all conditions, and that this advantage increases with the complexity of the landscape and level of noise in assessing performance. Examination of data from a network of real hospitals provides encouraging evidence of a high degree of similarity in clinical practices, especially within teams of hospitals engaging in QICs. However, our model also suggests that groups of similar hospitals could benefit from larger teams and more open sharing of details on clinical outcomes than is currently the norm. To facilitate this, we propose a secure virtual collaboration system that would allow hospitals to efficiently identify potentially better practices in use at other institutions similar to theirs without any institution having to sacrifice the privacy of its own data. Our results may also have implications for other types of data-driven diffusive learning such as in personalized medicine and evolutionary search in noisy, complex combinatorial optimization problems. PMID:25360395

  2. Current Evidence on the Association of Dietary Patterns and Bone Health: A Scoping Review

    PubMed Central

    Movassagh, Elham Z

    2017-01-01

    Nutrition is an important modifiable factor that affects bone health. Diet is a complex mixture of nutrients and foods that correlate or interact with each other. Dietary pattern approaches take into account contributions from various aspects of diet. Findings from dietary pattern studies could complement those from single-nutrient and food studies on bone health. In this study we aimed to conduct a scoping review of the literature that assessed the impact of dietary patterns (derived with the use of both a priori and data-driven approaches) on bone outcomes, including bone mineral status, bone biomarkers, osteoporosis, and fracture risk. We retrieved 49 human studies up to June 2016 from the PubMed, Embase, and CINAHL databases. Most of these studies used a data-driven method, especially factor analysis, to derive dietary patterns. Several studies examined adherence to a variety of the a priori dietary indexes, including the Mediterranean diet score, the Healthy Eating Index (HEI), and the Alternative Healthy Eating Index (AHEI). The bone mineral density (BMD) diet score was developed to measure adherence to a dietary pattern beneficial to bone mineral density. Findings revealed a beneficial impact of higher adherence to a “healthy” dietary pattern derived using a data-driven method, the Mediterranean diet, HEI, AHEI, Dietary Diversity Score, Diet Quality Index–International, BMD Diet Score, Healthy Diet Indicator, and Korean Diet Score, on bone. In contrast, the “Western” dietary pattern and those featuring some aspects of an unhealthy diet were associated inversely with bone health. In both a priori and data-driven dietary pattern studies, a dietary pattern that emphasized the intake of fruit, vegetables, whole grains, poultry and fish, nuts and legumes, and low-fat dairy products and de-emphasized the intake of soft drinks, fried foods, meat and processed products, sweets and desserts, and refined grains showed a beneficial impact on bone health. 
Overall, adherence to a healthy dietary pattern consisting of the above-mentioned food groups can improve bone mineral status and decrease osteoporosis and fracture risk. PMID:28096123

  3. Impact of 'stretch' targets for cardiovascular disease management within a local pay-for-performance programme.

    PubMed

    Pape, Utz J; Huckvale, Kit; Car, Josip; Majeed, Azeem; Millett, Christopher

    2015-01-01

    Pay-for-performance programs often aim to improve the management of chronic diseases. We evaluate the impact of a local pay-for-performance programme (QOF+), which financially rewarded more ambitious quality targets ('stretch targets') than those used nationally in the Quality and Outcomes Framework (QOF). We focus on targets for intermediate outcomes in patients with cardiovascular disease and diabetes. A difference-in-difference approach is used to compare practice-level achievements before and after the introduction of the local pay-for-performance program. In addition, we analysed patient-level data on exception reporting and intermediate outcomes utilizing an interrupted time series analysis. The local pay-for-performance program led to significantly higher target achievements (hypertension: p-value <0.001, coronary heart disease: p-values <0.001, diabetes: p-values <0.061, stroke: p-values <0.003). However, the increase was driven by higher rates of exception reporting (hypertension: p-value <0.001, coronary heart disease: p-values <0.03, diabetes: p-values <0.05) in patients with all conditions except for stroke. Exception reporting allows practitioners to exclude patients from target calculations if certain criteria are met, e.g. informed dissent of the patient for treatment. There were no statistically significant improvements in mean blood pressure, cholesterol or HbA1c levels. Thus, achievement of higher payment thresholds in the local pay-for-performance scheme was mainly attributable to increased exception reporting by practices, with no discernible improvements in overall clinical quality. Hence, active monitoring of exception reporting should be considered when setting more ambitious quality targets. More generally, the study suggests a trade-off between additional incentives for better care and monitoring costs.
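
The difference-in-difference estimator used above nets the change in intervention practices against the secular trend observed in comparison practices. A minimal sketch with hypothetical target-achievement rates (the study's actual figures are not given in the abstract):

```python
def diff_in_diff(treat_pre, treat_post, control_pre, control_post):
    """DiD estimate: the treated group's pre/post change net of the
    secular trend measured in the control group over the same period."""
    return (treat_post - treat_pre) - (control_post - control_pre)

# Hypothetical achievement rates (%) before/after QOF+ introduction:
# QOF+ practices rose 62 -> 74 while comparison practices rose 63 -> 66,
# so 9 percentage points of the rise are attributed to the programme.
effect = diff_in_diff(62.0, 74.0, 63.0, 66.0)
print(effect)  # → 9.0
```

The estimator's validity rests on the parallel-trends assumption, which is exactly why the study's finding matters: if the "improvement" is produced by exception reporting rather than care, the headline DiD effect overstates clinical quality gains.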

  4. Women in physics in Bulgaria-Enhancing research

    NASA Astrophysics Data System (ADS)

    Proykova, Ana

    2013-03-01

    Bulgaria currently has a relatively high number of women in top positions at the governmental level, yet the presidents of the important universities and most of the directors of research institutions are male. Gender balance is driven by the need to improve research quality in interdisciplinary fields, where the similarities and differences between men and women in creativity and thought play a crucial role.

  5. Technical Communication--The Need and the Demand of Global World

    ERIC Educational Resources Information Center

    Patel, Dipika S.

    2013-01-01

    The present world is known as a hi-tech world, as it is driven by technology, which is the vehicle for gaining access to this modernized world. However, due to continuous changes taking place in the field of technology, people keep looking for new developments to improve the quality of teaching and learning methodologies. In the fast developing 21st…

  6. PARTNERING TO IMPROVE HUMAN EXPOSURE METHODS

    EPA Science Inventory

    Methods development research is an application-driven scientific area that addresses programmatic needs. The goals are to reduce measurement uncertainties, address data gaps, and improve existing analytical procedures for estimating human exposures. Partnerships have been develop...

  7. Improving oil classification quality from oil spill fingerprint beyond six sigma approach.

    PubMed

    Juahir, Hafizan; Ismail, Azimah; Mohamed, Saiful Bahri; Toriman, Mohd Ekhwan; Kassim, Azlina Md; Zain, Sharifuddin Md; Ahmad, Wan Kamaruzaman Wan; Wah, Wong Kok; Zali, Munirah Abdul; Retnam, Ananthy; Taib, Mohd Zaki Mohd; Mokhtar, Mazlin

    2017-07-15

    This study applies quality engineering to oil spill classification based on oil spill fingerprinting from GC-FID and GC-MS, employing the six-sigma approach. The oil spills were recovered from various water areas of Peninsular Malaysia and Sabah (East Malaysia). Six-sigma methodologies served as the problem-solving framework for classifying oils within the complex mixtures of the oil spill dataset, and linking the six-sigma analysis with quality engineering improved organizational performance in meeting the objectives of environmental forensics. The study reveals that the oil spills are discriminated into four groups, viz. diesel, hydrocarbon fuel oil (HFO), mixture of oil lubricant and fuel oil (MOLFO) and waste oil (WO), according to the similarity of their intrinsic chemical properties. Validation confirmed that the four discriminant components (diesel, HFO, MOLFO and WO) dominate the oil types, with a total variance of 99.51%, ANOVA giving F stat > F critical at the 95% confidence level, and a chi-square goodness-of-fit statistic of 74.87. The results reveal that employing the six-sigma approach in a data-driven problem such as oil spill classification can expedite good decision making. Copyright © 2017. Published by Elsevier Ltd.
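    As an illustration of the kind of discriminant grouping the study reports, here is a minimal nearest-centroid classifier; the fingerprint features, centroid values and query sample are invented for the example and are not the study's GC-FID/GC-MS data:

```python
# Illustrative nearest-centroid assignment of an oil-spill fingerprint to one
# of the four classes reported (diesel, HFO, MOLFO, WO). All numbers are
# hypothetical three-component feature vectors, not real chromatographic data.

def nearest_centroid(sample, centroids):
    """Assign a fingerprint to the class whose centroid is closest
    in Euclidean distance."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(centroids, key=lambda label: dist(sample, centroids[label]))

centroids = {
    "diesel": [0.9, 0.1, 0.2],
    "HFO":    [0.2, 0.8, 0.3],
    "MOLFO":  [0.4, 0.4, 0.9],
    "WO":     [0.1, 0.2, 0.1],
}

label = nearest_centroid([0.85, 0.15, 0.25], centroids)
print(label)  # → diesel
```

    A discriminant analysis as used in the study additionally weights directions by within-class variance; the centroid rule above is the simplest member of that family.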

  8. Template for success: using a resident-designed sign-out template in the handover of patient care.

    PubMed

    Clark, Clancy J; Sindell, Sarah L; Koehler, Richard P

    2011-01-01

    We report our implementation of a standardized handover process in a general surgery residency program. The standardized handover process, sign-out template, method of implementation, and continuous quality improvement process were designed by general surgery residents with the support of faculty and senior hospital administration, using standard work principles and business models of the Virginia Mason Production System and the Toyota Production System. The setting was a nonprofit, tertiary referral teaching hospital; participants were general surgery residents, residency faculty, patient care providers, and hospital administration. After instruction in quality improvement initiatives, a team of general surgery residents designed a sign-out process using an electronic template and standard procedures. The initial implementation phase resulted in 73% compliance. Using resident-driven continuous quality improvement processes, real-time feedback enabled residents to modify and improve this process, eventually attaining 100% compliance and acceptance by residents. The creation of a standardized template and protocol for patient handovers might eliminate communication failures. Encouraging residents to participate in this process can establish the groundwork for successful implementation of a standardized handover process. Integrating a continuous quality-improvement process into such an initiative can promote active participation of busy general surgery residents and lead to successful implementation of standard procedures. Copyright © 2011 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  9. Improving data quality across 3 sub-Saharan African countries using the Consolidated Framework for Implementation Research (CFIR): results from the African Health Initiative.

    PubMed

    Gimbel, Sarah; Mwanza, Moses; Nisingizwe, Marie Paul; Michel, Cathy; Hirschhorn, Lisa

    2017-12-21

    High-quality data are critical to inform, monitor and manage health programs. Over the seven-year African Health Initiative of the Doris Duke Charitable Foundation, three of the five Population Health Implementation and Training (PHIT) partnership projects in Mozambique, Rwanda, and Zambia introduced strategies to improve the quality and evaluation of routinely collected data at the primary health care level, and stimulate their use in evidence-based decision-making. Using the Consolidated Framework for Implementation Research (CFIR) as a guide, this paper: 1) describes and categorizes data quality assessment and improvement activities of the projects, and 2) identifies core intervention components and implementation strategy adaptations introduced to improve data quality in each setting. The CFIR was adapted through a qualitative theme reduction process involving discussions with key informants from each project, who identified two domains and ten constructs most relevant to the study aim of describing and comparing each country's data quality assessment approach and implementation process. Data were collected on each project's data quality improvement strategies, activities implemented, and results via a semi-structured questionnaire with closed and open-ended items administered to health management information systems leads in each country, with complementary data abstraction from project reports. Across the three projects, intervention components that aligned with user priorities and government systems were perceived to be relatively advantageous, and more readily adapted and adopted. Activities that both assessed and improved data quality (including data quality assessments, mentorship and supportive supervision, establishment and/or strengthening of electronic medical record systems), received higher ranking scores from respondents. 
Our findings suggest that, at a minimum, successful data quality improvement efforts should include routine audits linked to ongoing, on-the-job mentoring at the point of service. This pairing of interventions engages health workers in data collection, cleaning, and analysis of real-world data, and thus provides important skills building with on-site mentoring. The effect of these core components is strengthened by performance review meetings that unify multiple health system levels (provincial, district, facility, and community) to assess data quality, highlight areas of weakness, and plan improvements.

  10. 42 CFR 480.144 - Access to QIO data and information.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... SERVICES (CONTINUED) QUALITY IMPROVEMENT ORGANIZATIONS ACQUISITION, PROTECTION, AND DISCLOSURE OF QUALITY IMPROVEMENT ORGANIZATION INFORMATION Utilization and Quality Control Quality Improvement Organizations (QIOs... 42 Public Health 4 2012-10-01 2012-10-01 false Access to QIO data and information. 480.144 Section...

  11. 42 CFR 480.144 - Access to QIO data and information.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... SERVICES (CONTINUED) QUALITY IMPROVEMENT ORGANIZATIONS ACQUISITION, PROTECTION, AND DISCLOSURE OF QUALITY IMPROVEMENT ORGANIZATION REVIEW INFORMATION Utilization and Quality Control Quality Improvement Organizations... 42 Public Health 4 2011-10-01 2011-10-01 false Access to QIO data and information. 480.144 Section...

  12. 42 CFR 480.144 - Access to QIO data and information.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... SERVICES (CONTINUED) QUALITY IMPROVEMENT ORGANIZATIONS ACQUISITION, PROTECTION, AND DISCLOSURE OF QUALITY IMPROVEMENT ORGANIZATION INFORMATION Utilization and Quality Control Quality Improvement Organizations (QIOs... 42 Public Health 4 2013-10-01 2013-10-01 false Access to QIO data and information. 480.144 Section...

  13. 42 CFR 480.144 - Access to QIO data and information.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... SERVICES (CONTINUED) QUALITY IMPROVEMENT ORGANIZATIONS ACQUISITION, PROTECTION, AND DISCLOSURE OF QUALITY IMPROVEMENT ORGANIZATION INFORMATION Utilization and Quality Control Quality Improvement Organizations (QIOs... 42 Public Health 4 2014-10-01 2014-10-01 false Access to QIO data and information. 480.144 Section...

  14. Retrospective cost adaptive Reynolds-averaged Navier-Stokes k-ω model for data-driven unsteady turbulent simulations

    NASA Astrophysics Data System (ADS)

    Li, Zhiyong; Hoagg, Jesse B.; Martin, Alexandre; Bailey, Sean C. C.

    2018-03-01

    This paper presents a data-driven computational model for simulating unsteady turbulent flows where sparse measurement data are available. The model uses the retrospective cost adaptation (RCA) algorithm to automatically adjust the closure coefficients of the Reynolds-averaged Navier-Stokes (RANS) k-ω turbulence equations to improve agreement between the simulated flow and the measurements. The RCA-RANS k-ω model is verified for steady flow using a pipe-flow test case and for unsteady flow using a surface-mounted-cube test case. Measurements used for adaptation of the verification cases are obtained from baseline simulations with known closure coefficients. These verification test cases demonstrate that the RCA-RANS k-ω model can successfully adapt the closure coefficients to improve agreement between the simulated flow field and a set of sparse flow-field measurements. Furthermore, the RCA-RANS k-ω model improves agreement between the simulated flow and the baseline flow at locations at which measurements do not exist. The RCA-RANS k-ω model is also validated with experimental data from two test cases: steady pipe flow, and unsteady flow past a square cylinder. In both test cases, the adaptation improves agreement with experimental data in comparison to the results from a non-adaptive RANS k-ω model that uses the standard values of the k-ω closure coefficients. For the steady pipe flow, adaptation is driven by mean stream-wise velocity measurements at 24 locations along the pipe radius. The RCA-RANS k-ω model reduces the average velocity error at these locations by over 35%. For the unsteady flow over a square cylinder, adaptation is driven by time-varying surface pressure measurements at two locations on the square cylinder. The RCA-RANS k-ω model reduces the average surface-pressure error at these locations by 88.8%.
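    The adaptation loop can be caricatured with a toy least-squares example. This is not the RCA algorithm itself, only a sketch of the general idea of nudging a model coefficient until the model output matches sparse measurements; the toy model, data and learning rate are all invented:

```python
# Schematic of measurement-driven coefficient adaptation (NOT the RCA
# algorithm): a coefficient of a toy model y = c * x is adjusted by gradient
# descent to reduce the mismatch with sparse "measurements", analogous in
# spirit to tuning RANS closure coefficients against flow-field data.

def adapt_coefficient(c, xs, measured, rate=0.01, steps=200):
    """Gradient descent on the mean squared error of the toy model y = c * x."""
    for _ in range(steps):
        grad = sum(2 * (c * x - m) * x for x, m in zip(xs, measured)) / len(xs)
        c -= rate * grad
    return c

xs = [1.0, 2.0, 3.0]        # sparse "measurement" locations
measured = [1.8, 3.9, 6.1]  # hypothetical measurements (consistent with c near 2)

c = adapt_coefficient(0.5, xs, measured)
print(round(c, 3))  # → 1.993, the least-squares optimum
```

    The real RCA-RANS loop differs in that the "model output" comes from a full unsteady CFD solve and the update rule is retrospective-cost based, but the closed loop between simulation error and coefficient update is the same shape.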

  15. Alternative Fuels Data Center: Natural Gas Street Sweepers Improve Air Quality in New York

    Science.gov Websites

  16. A high performing brain-machine interface driven by low-frequency local field potentials alone and together with spikes

    NASA Astrophysics Data System (ADS)

    Stavisky, Sergey D.; Kao, Jonathan C.; Nuyujukian, Paul; Ryu, Stephen I.; Shenoy, Krishna V.

    2015-06-01

    Objective. Brain-machine interfaces (BMIs) seek to enable people with movement disabilities to directly control prosthetic systems with their neural activity. Current high-performance BMIs are driven by action potentials (spikes), but access to this signal often diminishes as sensors degrade over time. Decoding local field potentials (LFPs) as an alternative or complementary BMI control signal may improve performance when there is a paucity of spike signals. To date only a handful of LFP decoding methods have been tested online; there remains a need to test different LFP decoding approaches and improve LFP-driven performance. There has also not been a reported demonstration of a hybrid BMI that decodes kinematics from both LFP and spikes. Here we first evaluate a BMI driven by the local motor potential (LMP), a low-pass filtered time-domain LFP amplitude feature. We then combine decoding of both LMP and spikes to implement a hybrid BMI. Approach. Spikes and LFP were recorded from two macaques implanted with multielectrode arrays in primary and premotor cortex while they performed a reaching task. We then evaluated closed-loop BMI control using biomimetic decoders driven by LMP, spikes, or both signals together. Main results. LMP decoding enabled quick and accurate cursor control which surpassed previously reported LFP BMI performance. Hybrid decoding of both spikes and LMP improved performance when spike signal quality was mediocre to poor. Significance. These findings show that LMP is an effective BMI control signal which requires minimal power to extract and can substitute for or augment impoverished spike signals. Use of this signal may lengthen the useful lifespan of BMIs and is therefore an important step towards clinically viable BMIs.
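    The abstract describes the LMP as a low-pass filtered time-domain LFP amplitude. A sliding-window mean is the crudest such low-pass filter, sketched below; the window length and sample values are arbitrary, and a real decoder would use a properly designed filter and calibrated window:

```python
# Minimal sketch of an LMP-style feature: a sliding-window mean of the raw
# LFP trace acts as a crude low-pass filter, suppressing high-frequency
# content and keeping the slow amplitude envelope the decoder would use.

def moving_average_lmp(lfp, window):
    """Return the sliding-window mean of the LFP samples (valid windows only)."""
    out = []
    for i in range(len(lfp) - window + 1):
        out.append(sum(lfp[i:i + window]) / window)
    return out

# Hypothetical LFP samples (arbitrary units).
lfp = [0.0, 2.0, 4.0, 6.0]
lmp = moving_average_lmp(lfp, window=2)
print(lmp)  # → [1.0, 3.0, 5.0]
```

    The appeal noted in the abstract is that such a feature is cheap to compute compared with spike detection and sorting.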

  17. Evaluation of global water quality - the potential of a data- and model-driven analysis

    NASA Astrophysics Data System (ADS)

    Bärlund, Ilona; Flörke, Martina; Alcamo, Joseph; Völker, Jeanette; Malsy, Marcus; Kaus, Andrew; Reder, Klara; Büttner, Olaf; Katterfeld, Christiane; Dietrich, Désirée; Borchardt, Dietrich

    2016-04-01

    The ongoing socio-economic development presents a new challenge for water quality worldwide, especially in developing and emerging countries. It is estimated that due to population growth and the extension of water supply networks, the amount of waste water will rise sharply. This can lead to an increased risk of surface water quality degradation, if the wastewater is not sufficiently treated. This development has impacts on ecosystems and human health, as well as food security. The United Nations Member States have adopted targets for sustainable development. They include, inter alia, sustainable protection of water quality and sustainable use of water resources. To achieve these goals, appropriate monitoring strategies and the development of indicators for water quality are required. Within the pre-study for a 'World Water Quality Assessment' (WWQA) led by United Nations Environment Programme (UNEP), a methodology for assessing water quality, taking into account the above-mentioned objectives has been developed. The novelty of this methodology is the linked model- and data-driven approach. The focus is on parameters reflecting the key water quality issues, such as increased waste water pollution, salinization or eutrophication. The results from the pre-study show, for example, that already about one seventh of all watercourses in Latin America, Africa and Asia show high organic pollution. This is of central importance for inland fisheries and associated food security. In addition, it could be demonstrated that global water quality databases have large gaps. These must be closed in the future in order to obtain an overall picture of global water quality and to target measures more efficiently. The aim of this presentation is to introduce the methodology developed within the WWQA pre-study and to show selected examples of application in Latin America, Africa and Asia.

  18. Using Data-Driven Model-Brain Mappings to Constrain Formal Models of Cognition

    PubMed Central

    Borst, Jelmer P.; Nijboer, Menno; Taatgen, Niels A.; van Rijn, Hedderik; Anderson, John R.

    2015-01-01

    In this paper we propose a method to create data-driven mappings from components of cognitive models to brain regions. Cognitive models are notoriously hard to evaluate, especially based on behavioral measures alone. Neuroimaging data can provide additional constraints, but this requires a mapping from model components to brain regions. Although such mappings can be based on the experience of the modeler or on a reading of the literature, a formal method is preferred to prevent researcher-based biases. In this paper we used model-based fMRI analysis to create a data-driven model-brain mapping for five modules of the ACT-R cognitive architecture. We then validated this mapping by applying it to two new datasets with associated models. The new mapping was at least as powerful as an existing mapping that was based on the literature, and indicated where the models were supported by the data and where they have to be improved. We conclude that data-driven model-brain mappings can provide strong constraints on cognitive models, and that model-based fMRI is a suitable way to create such mappings. PMID:25747601

  19. Spatiotemporal distribution of algal and nutrient, and their correlations based on long-term monitoring data in Lake Taihu, China

    NASA Astrophysics Data System (ADS)

    Acharya, K.; Li, Y.; Stone, M.; Yu, Z.; Young, M.; Shafer, D. S.; Zhu, J.; Warwick, J. J.

    2009-12-01

    Eutrophication in Lake Taihu - China’s third largest freshwater lake - has led to deterioration of water quality and caused more frequent cyanobacteria blooms at many lake locations in recent years. Eutrophication is thought to be fueled by increased nutrient loading, a consequence of rapid population and economic growth in the region. To understand the spatiotemporal distribution of algal blooms, a database was developed that includes long-term meteorological, hydrological, water quality, and socioeconomic data from the Lake Taihu watershed. The data were collected through various field observations, and augmented with information from local and provincial agencies, and universities. Based on the data, spatiotemporal distributions of, and correlations between, chlorophyll-a (Chl-a), total phosphorus (TP), total nitrogen (TN) and water temperature (WT) were analyzed. Results revealed a high degree of correlation between TP and Chl-a concentrations during warm seasons, with high concentrations of both substances present in the northern and northwest portions of the lake. During winter months, Chl-a concentrations were more strongly correlated with WT. Spatial trends in TP and TN concentrations corresponded to observed nutrient fluxes from adjoining rivers in densely populated areas, demonstrating the influence of watershed pollutant loads on lake water quality. Among important questions to be answered is whether wind-driven resuspension of existing nutrients in sediments in this shallow (< 3 m) lake may cause cyanobacteria blooms to begin. This study identifies other questions, data gaps, and research needs, and provides a foundation for improving lake management strategies.
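    The correlation analyses above rest on the Pearson coefficient, which can be computed as follows; the chlorophyll-a and TP samples are hypothetical, not Lake Taihu measurements:

```python
# Pearson correlation between chlorophyll-a and total phosphorus samples,
# as used for the warm-season Chl-a/TP relationship. Data are hypothetical.
import math

def pearson(x, y):
    """Pearson product-moment correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical warm-season samples: Chl-a (ug/L) and TP (mg/L).
chl = [2.1, 4.0, 5.9, 8.2]
tp = [0.05, 0.09, 0.12, 0.18]
r = pearson(chl, tp)
```

    A value of r near 1 corresponds to the strong warm-season Chl-a/TP coupling the study reports; in winter the same computation against water temperature would dominate.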

  20. Obtaining antibiotics online from within the UK: a cross-sectional study.

    PubMed

    Boyd, Sara Elizabeth; Moore, Luke Stephen Prockter; Gilchrist, Mark; Costelloe, Ceire; Castro-Sánchez, Enrique; Franklin, Bryony Dean; Holmes, Alison Helen

    2017-05-01

    Improved antibiotic stewardship (AS) and reduced prescribing in primary care, with a parallel increase in personal internet use, could lead citizens to obtain antibiotics from alternative sources online. A cross-sectional analysis was performed to: (i) determine the quality and legality of online pharmacies selling antibiotics to the UK public; (ii) describe processes for obtaining antibiotics online from within the UK; and (iii) identify resulting AS and patient safety issues. Searches were conducted for 'buy antibiotics online' using Google and Yahoo. For each search engine, data from the first 10 web sites with unique URL addresses were reviewed. Analysis was conducted on evidence of appropriate pharmacy registration, prescription requirement, whether antibiotic choice was 'prescriber-driven' or 'consumer-driven', and whether specific information was required (allergies, comorbidities, pregnancy) or given (adverse effects) prior to purchase. Twenty unique URL addresses were analysed in detail. Online pharmacies evidencing their location in the UK ( n  = 5; 25%) required a prescription before antibiotic purchase, and were appropriately registered. Online pharmacies unclear about the location they were operating from ( n  = 10; 50%) had variable prescription requirements, and no evidence of appropriate registration. Nine (45%) online pharmacies did not require a prescription prior to purchase. For 16 (80%) online pharmacies, decisions were initially consumer-driven for antibiotic choice, dose and quantity. Wide variation exists among online pharmacies in relation to antibiotic practices, highlighting considerable patient safety and AS issues. Improved education, legislation, regulation and new best practice stewardship guidelines are urgently needed for online antibiotic suppliers. © The Author 2017. Published by Oxford University Press on behalf of the British Society for Antimicrobial Chemotherapy. All rights reserved. 

  1. Does adding clinical data to administrative data improve agreement among hospital quality measures?

    PubMed

    Hanchate, Amresh D; Stolzmann, Kelly L; Rosen, Amy K; Fink, Aaron S; Shwartz, Michael; Ash, Arlene S; Abdulkerim, Hassen; Pugh, Mary Jo V; Shokeen, Priti; Borzecki, Ann

    2017-09-01

    Hospital performance measures based on patient mortality and readmission have shown only modest rates of agreement. We examined whether combining clinical data on laboratory tests and vital signs with administrative data yields measures that agree better with each other, and with other measures of hospital performance, in the nation's largest integrated health care system. We used patient-level administrative and clinical data, and hospital-level data on quality indicators, for 2007-2010 from the Veterans Health Administration (VA). For patients admitted for acute myocardial infarction (AMI), heart failure (HF) and pneumonia we examined changes in hospital performance on 30-d mortality and 30-d readmission rates as a result of adding clinical data to administrative data. We evaluated whether this enhancement yielded improved measures of hospital quality, based on concordance with other hospital quality indicators. For 30-d mortality, data enhancement improved model performance and significantly changed hospital performance profiles; for 30-d readmission, the impact was modest. Concordance between the enhanced measures of both outcomes, and with other hospital quality measures - including Joint Commission process measures, VA Surgical Quality Improvement Program (VASQIP) mortality and morbidity, and case volume - remained poor. Adding laboratory tests and vital signs to measure hospital performance on mortality and readmission did not improve the poor rates of agreement across hospital quality indicators in the VA. Efforts to improve risk adjustment models should continue; however, evidence of validation should precede their use as reliable measures of quality. Published by Elsevier Inc.
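    Agreement between two hospital performance profiles of the kind compared here is commonly quantified with Cohen's kappa, which discounts chance agreement; a minimal sketch, with invented "high"/"low" profile labels rather than the study's data:

```python
# Cohen's kappa: agreement between two categorical ratings beyond chance.
# The two hypothetical "profiles" label six hospitals as high or low performers.

def cohens_kappa(a, b):
    """Kappa = (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(a)
    p_observed = sum(x == y for x, y in zip(a, b)) / n
    categories = set(a) | set(b)
    p_chance = sum((a.count(c) / n) * (b.count(c) / n) for c in categories)
    return (p_observed - p_chance) / (1 - p_chance)

profile_mortality   = ["high", "high", "low", "low", "high", "low"]
profile_readmission = ["high", "low",  "low", "low", "high", "high"]

k = cohens_kappa(profile_mortality, profile_readmission)
print(round(k, 3))  # → 0.333, i.e. only modest agreement beyond chance
```

    Values near zero, as the abstract's "poor concordance" implies, mean the two indicators rank hospitals little better than chance would.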

  2. Modeling and executing electronic health records driven phenotyping algorithms using the NQF Quality Data Model and JBoss® Drools Engine.

    PubMed

    Li, Dingcheng; Endle, Cory M; Murthy, Sahana; Stancl, Craig; Suesse, Dale; Sottara, Davide; Huff, Stanley M; Chute, Christopher G; Pathak, Jyotishman

    2012-01-01

    With increasing adoption of electronic health records (EHRs), the need for formal representations of EHR-driven phenotyping algorithms has been recognized for some time. The recently proposed Quality Data Model (QDM) from the National Quality Forum (NQF) provides an information model and a grammar intended to represent data collected during routine clinical care in EHRs, as well as the basic logic required to represent the algorithmic criteria for phenotype definitions. The QDM is further aligned with Meaningful Use standards to ensure that the clinical data and algorithmic criteria are represented in a consistent, unambiguous and reproducible manner. However, phenotype definitions represented in QDM, while structured, cannot be executed readily on existing EHRs; human interpretation and subsequent implementation are required. To address this need, the current study investigates the open-source JBoss® Drools rules engine for automatic translation of QDM criteria into rules for execution over EHR data. In particular, using the Apache Foundation's Unstructured Information Management Architecture (UIMA) platform, we developed a translator tool for converting QDM-defined phenotyping algorithm criteria into executable Drools rule scripts, and demonstrated their execution on real patient data from Mayo Clinic to identify cases of Coronary Artery Disease and Diabetes. To the best of our knowledge, this is the first study illustrating a framework and an approach for executing phenotyping criteria modeled in QDM using the Drools business rules management system.
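    A much-simplified stand-in for the idea of executable phenotyping criteria (the study generates Drools rule scripts; this sketch uses plain Python predicates, and the diagnosis code and HbA1c threshold are illustrative, not the study's actual criteria):

```python
# Sketch of a phenotyping criterion expressed as an executable rule over
# patient records, mimicking the QDM-criteria-to-rules idea. The ICD-10
# code "E11" and the 6.5% HbA1c cutoff are illustrative example values.

rules = {
    # A patient is a hypothetical "diabetes case" if a type 2 diabetes
    # diagnosis code is present AND HbA1c is at or above threshold.
    "diabetes_case": lambda p: "E11" in p["dx_codes"] and p["hba1c"] >= 6.5,
}

patients = [
    {"id": 1, "dx_codes": {"E11"}, "hba1c": 7.2},
    {"id": 2, "dx_codes": {"I10"}, "hba1c": 5.4},
]

cases = [p["id"] for p in patients if rules["diabetes_case"](p)]
print(cases)  # → [1]
```

    A production rules engine such as Drools adds pattern matching, rule chaining and conflict resolution on top of this basic predicate-over-facts structure.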

  3. Modeling and Executing Electronic Health Records Driven Phenotyping Algorithms using the NQF Quality Data Model and JBoss® Drools Engine

    PubMed Central

    Li, Dingcheng; Endle, Cory M; Murthy, Sahana; Stancl, Craig; Suesse, Dale; Sottara, Davide; Huff, Stanley M.; Chute, Christopher G.; Pathak, Jyotishman

    2012-01-01

    With increasing adoption of electronic health records (EHRs), the need for formal representations of EHR-driven phenotyping algorithms has been recognized for some time. The recently proposed Quality Data Model (QDM) from the National Quality Forum (NQF) provides an information model and a grammar intended to represent data collected during routine clinical care in EHRs, as well as the basic logic required to represent the algorithmic criteria for phenotype definitions. The QDM is further aligned with Meaningful Use standards to ensure that the clinical data and algorithmic criteria are represented in a consistent, unambiguous and reproducible manner. However, phenotype definitions represented in QDM, while structured, cannot be executed readily on existing EHRs; human interpretation and subsequent implementation are required. To address this need, the current study investigates the open-source JBoss® Drools rules engine for automatic translation of QDM criteria into rules for execution over EHR data. In particular, using the Apache Foundation's Unstructured Information Management Architecture (UIMA) platform, we developed a translator tool for converting QDM-defined phenotyping algorithm criteria into executable Drools rule scripts, and demonstrated their execution on real patient data from Mayo Clinic to identify cases of Coronary Artery Disease and Diabetes. To the best of our knowledge, this is the first study illustrating a framework and an approach for executing phenotyping criteria modeled in QDM using the Drools business rules management system. PMID:23304325

  4. Can a pharmacy intervention improve the metabolic risks of mental health patients? Evaluation of a novel collaborative service.

    PubMed

    Maulavizada, Husna; Emmerton, Lynne; Hattingh, Hendrika Laetitia

    2016-04-26

    The pressure on healthcare services worldwide has driven the incorporation of disease state management services within community pharmacies in developed countries. Pharmacists are recognised as the most accessible healthcare professionals, and the incorporation of these services facilitates patient care. In Australia, the opportunity to manage pharmacy patients with mental illness has been underutilised, despite the existence of service models for other chronic conditions. This paper is an independent evaluation of a novel service developed by a community pharmacy in Perth, Western Australia. The service represents collaboration between a nurse practitioner and community pharmacy staff in the management of mental health patients with metabolic risks. We applied practice service standards for Australian community pharmacies to develop an evaluation framework for this novel service, then conducted semi-structured interviews with staff members at the study pharmacy to explore service processes and procedures. Descriptive analysis of the interviews was supplemented with analysis of patients' biometric data, and all data were evaluated against the developed framework. The evaluation framework comprised 13 process, 5 outcome, and 11 quality indicators. Interview data from eight staff members and biometric data from 20 community-dwelling mental health patients taking antipsychotics were evaluated against the framework. Predominantly, patients were managed by the pharmacy's nurse practitioner, with medication management provided by pharmacists. Patients' biometric measurements comprised weight, blood pressure, blood glucose levels, lipid profiles and management of obesity, smoking, hypertension and diabetes. Positive outcomes observed in the patient data included weight loss, smoking cessation, and improved blood pressure, blood glucose and lipid levels. 
    The developed framework allowed effective evaluation of the service and may be applicable to other pharmacy services. The metabolic clinic met key process, quality and outcome indicators, and the positive patient outcomes may assist in securing further funding.

  5. Optimization of classification and regression analysis of four monoclonal antibodies from Raman spectra using collaborative machine learning approach.

    PubMed

    Le, Laetitia Minh Maï; Kégl, Balázs; Gramfort, Alexandre; Marini, Camille; Nguyen, David; Cherti, Mehdi; Tfaili, Sana; Tfayli, Ali; Baillet-Guffroy, Arlette; Prognon, Patrice; Chaminade, Pierre; Caudron, Eric

    2018-07-01

    The use of monoclonal antibodies (mAbs) constitutes one of the most important strategies to treat patients suffering from cancers such as hematological malignancies and solid tumors. These antibodies are prescribed by the physician and prepared by hospital pharmacists, and analytical control ensures the quality of the preparations. The aim of this study was to develop a rapid analytical method for quality control. The method used four mAbs (Infliximab, Bevacizumab, Rituximab and Ramucirumab) at various concentrations and was based on recording Raman data and coupling them to a traditional chemometric and machine learning approach for data analysis. Compared with a conventional linear approach, prediction errors are reduced with a data-driven approach using statistical machine learning methods, in which preprocessing and predictive models are jointly optimized. An additional original aspect of the work involved submitting the problem to a collaborative data challenge platform called Rapid Analytics and Model Prototyping (RAMP), which allowed solutions from about 300 data scientists to be drawn on in collaborative work. Using machine learning, the prediction of the four mAb samples was considerably improved. The best predictive model showed a combined error of 2.4% versus 14.6% for the linear approach. The concentration and classification errors were 5.8% and 0.7%; only three spectra were misclassified out of the 429 spectra of the test set. This large improvement obtained with machine learning techniques was uniform across all molecules but maximal for Bevacizumab, with an 88.3% reduction in combined error (2.1% versus 17.9%). Copyright © 2018 Elsevier B.V. All rights reserved.
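    One common chemometric preprocessing step of the sort such jointly optimized pipelines search over is standard normal variate (SNV) correction, which centers and scales each spectrum to remove intensity offsets; a minimal sketch (the abstract does not state which preprocessing steps the winning model actually used, and the spectrum values are invented):

```python
# Standard normal variate (SNV) correction: each spectrum is centered on its
# own mean and scaled by its own standard deviation, a common Raman
# preprocessing step before classification or concentration regression.
import math

def snv(spectrum):
    """Return the SNV-corrected spectrum: zero mean, unit (population) std."""
    m = sum(spectrum) / len(spectrum)
    sd = math.sqrt(sum((v - m) ** 2 for v in spectrum) / len(spectrum))
    return [(v - m) / sd for v in spectrum]

# Hypothetical raw intensities at three wavenumbers.
corrected = snv([1.0, 2.0, 3.0])
print([round(v, 4) for v in corrected])  # → [-1.2247, 0.0, 1.2247]
```

    After SNV, spectra from the same analyte at different baseline intensities become directly comparable, which typically helps both classification and concentration models.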

  6. A comprehensive method for GNSS data quality determination to improve ionospheric data analysis.

    PubMed

    Kim, Minchan; Seo, Jiwon; Lee, Jiyun

    2014-08-14

    Global Navigation Satellite Systems (GNSS) are now recognized as cost-effective tools for ionospheric studies, providing global coverage through worldwide networks of GNSS stations. While GNSS networks continue to expand to improve the observability of the ionosphere, the amount of poor-quality GNSS observation data is also increasing, and the use of poor-quality GNSS data degrades the accuracy of ionospheric measurements. This paper develops a comprehensive method to determine the quality of GNSS observations for the purpose of ionospheric studies. The algorithms are designed specifically to compute key GNSS data quality parameters which affect the quality of ionospheric products. The quality of data collected from the Continuously Operating Reference Stations (CORS) network in the conterminous United States (CONUS) is analyzed. The resulting quality varies widely from station to station, and the data quality of individual stations persists for an extended time period. When compared to conventional methods, the quality parameters obtained from the proposed method have a stronger correlation with the quality of ionospheric data. The results suggest that a set of data quality parameters, used in combination, can effectively select stations with high-quality GNSS data and improve the performance of ionospheric data analysis.
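
    A weighted combination of data quality parameters for station selection, as the abstract suggests, might look like the following sketch; the parameter names, example values, weights, and thresholds are hypothetical, not the paper's actual metrics.

```python
# Hypothetical per-station quality parameters; the names, values and
# weights below are illustrative, not the paper's actual metrics.
STATIONS = {
    "ZOB1": {"completeness": 0.99, "slip_rate": 0.2, "mp_rms": 0.25},
    "NYBH": {"completeness": 0.97, "slip_rate": 0.8, "mp_rms": 0.35},
    "TXAU": {"completeness": 0.90, "slip_rate": 2.5, "mp_rms": 0.60},
}

WEIGHTS = {"completeness": 0.5, "slip_rate": 0.3, "mp_rms": 0.2}
WORST = {"completeness": 0.8, "slip_rate": 5.0, "mp_rms": 1.0}  # worst acceptable

def quality_score(p):
    """Weighted combination of normalized parameters; higher is better."""
    s = WEIGHTS["completeness"] * max(p["completeness"] - WORST["completeness"], 0.0) / (1 - WORST["completeness"])
    s += WEIGHTS["slip_rate"] * (1 - min(p["slip_rate"] / WORST["slip_rate"], 1.0))
    s += WEIGHTS["mp_rms"] * (1 - min(p["mp_rms"] / WORST["mp_rms"], 1.0))
    return s

# Rank stations and keep only those above a quality threshold.
ranked = sorted(STATIONS, key=lambda k: quality_score(STATIONS[k]), reverse=True)
selected = [k for k in ranked if quality_score(STATIONS[k]) >= 0.7]
```

    The point of combining parameters is that a station can look acceptable on any single metric while still being a poor contributor overall.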

  7. A Comprehensive Method for GNSS Data Quality Determination to Improve Ionospheric Data Analysis

    PubMed Central

    Kim, Minchan; Seo, Jiwon; Lee, Jiyun

    2014-01-01

    Global Navigation Satellite Systems (GNSS) are now recognized as cost-effective tools for ionospheric studies, providing global coverage through worldwide networks of GNSS stations. While GNSS networks continue to expand to improve the observability of the ionosphere, the amount of poor-quality GNSS observation data is also increasing, and the use of poor-quality GNSS data degrades the accuracy of ionospheric measurements. This paper develops a comprehensive method to determine the quality of GNSS observations for the purpose of ionospheric studies. The algorithms are designed specifically to compute key GNSS data quality parameters which affect the quality of ionospheric products. The quality of data collected from the Continuously Operating Reference Stations (CORS) network in the conterminous United States (CONUS) is analyzed. The resulting quality varies widely from station to station, and the data quality of individual stations persists for an extended time period. When compared to conventional methods, the quality parameters obtained from the proposed method have a stronger correlation with the quality of ionospheric data. The results suggest that a set of data quality parameters, used in combination, can effectively select stations with high-quality GNSS data and improve the performance of ionospheric data analysis. PMID:25196005

  8. Experiences and lessons learned from creating a generalized workflow for data publication of field campaign datasets

    NASA Astrophysics Data System (ADS)

    Santhana Vannan, S. K.; Ramachandran, R.; Deb, D.; Beaty, T.; Wright, D.

    2017-12-01

    This paper summarizes the workflow challenges of curating and publishing data produced from disparate data sources and provides a generalized workflow solution to efficiently archive data generated by researchers. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) for biogeochemical dynamics and the Global Hydrology Resource Center (GHRC) DAAC have been collaborating on the development of a generalized workflow solution to efficiently manage the data publication process. The generalized workflow presented here is built on lessons learned from implementations of the workflow system. Data publication consists of the following steps:
    - Accepting the data package from the data providers and ensuring the full integrity of the data files
    - Identifying and addressing data quality issues
    - Assembling standardized, detailed metadata and documentation, including file-level details, processing methodology, and characteristics of data files
    - Setting up data access mechanisms
    - Setting up the data in data tools and services for improved data dissemination and user experience
    - Registering the dataset in online search and discovery catalogues
    - Preserving the data location through Digital Object Identifiers (DOIs)
    We will describe the steps taken to automate the above process and realize efficiencies. The goals of the workflow system are to reduce the time taken to publish a dataset, to increase the quality of documentation and metadata, and to track individual datasets through the data curation process. Utilities developed to achieve these goals will be described. We will also share the metrics-driven value of the workflow system and discuss future steps toward the creation of a common software framework.
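
    The publication steps described above can be sketched as an ordered pipeline with per-dataset status tracking, which is the kind of thing such a workflow system automates; the functions, package structure, and DOI prefix here are hypothetical placeholders.

```python
from dataclasses import dataclass, field

@dataclass
class DataPackage:
    name: str
    files: list
    status_log: list = field(default_factory=list)

def step(name):
    """Decorator: run a curation step and record it in the status log."""
    def decorator(fn):
        def wrapper(pkg):
            fn(pkg)
            pkg.status_log.append(name)   # track each dataset's progress
            return pkg
        return wrapper
    return decorator

@step("integrity-checked")
def check_integrity(pkg):
    assert pkg.files, "empty data package"

@step("quality-reviewed")
def review_quality(pkg):
    pass  # e.g., range checks, format validation

@step("metadata-assembled")
def assemble_metadata(pkg):
    pkg.metadata = {"title": pkg.name, "files": len(pkg.files)}

@step("doi-assigned")
def assign_doi(pkg):
    pkg.doi = f"10.0000/example.{pkg.name}"   # placeholder DOI prefix

PIPELINE = [check_integrity, review_quality, assemble_metadata, assign_doi]

def publish(pkg):
    for stage in PIPELINE:
        stage(pkg)
    return pkg
```

    Keeping the steps as a list makes the ordering explicit and lets the status log answer "where is this dataset in the curation process?" at any time.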

  9. Designing an optimal software intensive system acquisition: A game theoretic approach

    NASA Astrophysics Data System (ADS)

    Buettner, Douglas John

    The development of schedule-constrained software-intensive space systems is challenging. Case study data from national security space programs developed at the U.S. Air Force Space and Missile Systems Center (USAF SMC) provide evidence of a strong desire by contractors to skip or severely reduce software development design and early defect detection methods in these schedule-constrained environments. The research findings suggest recommendations to fully address these issues at numerous levels. However, the observations lead us to investigate modeling and theoretical methods to fundamentally understand what motivated this behavior in the first place. As a result, Madachy's inspection-based system dynamics model is modified to include unit testing and an integration test feedback loop. This Modified Madachy Model (MMM) is used as a tool to investigate the consequences of this behavior on the observed defect dynamics for two remarkably different case study software projects. Latin Hypercube sampling of the MMM with sample distributions for quality-, schedule-, and cost-driven strategies demonstrates that the higher-cost, higher-effort quality-driven strategies provide consistently better schedule performance than the schedule-driven up-front effort-reduction strategies. Game-theoretic reasoning for schedule-driven engineers cutting corners on inspections and unit testing is based on the case study evidence and Austin's agency model to describe the observed phenomena. Game theory concepts are then used to argue that the source of the problem, and hence the solution to developers cutting corners on quality in schedule-driven system acquisitions, ultimately lies with the government. The game theory arguments also lead to the suggestion that a multi-player dynamic Nash bargaining game provides a solution to the observed lack-of-quality game between the government (the acquirer) and "large-corporation" software developers. 
A concluding note argues that this multi-player dynamic Nash bargaining game also solves Freeman Dyson's problem of finding a way to label systems as good or bad.
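
    Latin Hypercube sampling, used above to explore the MMM's strategy distributions, can be implemented generically in a few lines; this unit-cube sampler is a standard textbook sketch, not the dissertation's experimental code.

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, seed=None):
    """Latin Hypercube sample on the unit cube: each axis is divided
    into n_samples equal strata, and every stratum is hit exactly once."""
    rng = np.random.default_rng(seed)
    # one uniformly placed point inside each stratum, per dimension
    pts = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_dims))) / n_samples
    # shuffle the strata independently along every dimension
    for d in range(n_dims):
        rng.shuffle(pts[:, d])
    return pts
```

    To drive a model like the MMM, each column is then mapped through the inverse CDF of the desired input distribution, so the stratification carries over to the model's parameter space.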

  10. Looking good or doing better? Patterns of decoupling in the implementation of clinical directorates.

    PubMed

    Mascia, Daniele; Morandi, Federica; Cicchetti, Americo

    2014-01-01

    Interest in hospital restructuring has risen significantly in recent years. In spite of its potential benefits, organizational restructuring in health care often produces unexpected consequences. Extant research suggests that institutional theory provides a powerful theoretical lens through which hospital restructuring can be described and explained. According to this perspective, the effectiveness of change is strongly related to the extent to which innovative arrangements, tools, or practices are adopted and implemented within hospitals. Whenever these new arrangements require a substantial modification of internal processes and practices, resistance to implementation emerges and organizational change is likely to be neutralized. This study analyzes how hospital organizations engage in decoupling by adopting but not implementing a new organizational model, the clinical directorate. We collected primary data on the diffusion of the clinical directorate model, which was mandated by law in the Italian National Health Service to improve hospital services. We surveyed the adoption and implementation of the clinical directorate model by monitoring the presence of clinical governance tools (measures for the quality improvement of hospital services) within single directorates. In particular, we compared hospitals that adopted the model before (early adopters) or after (late adopters) the mandate was introduced. Hospitals engaged in decoupling by adopting the new arrangement but not implementing internal practices and tools for quality improvement. The introduction of the law significantly affected decoupling, with late-adopter hospitals being less likely to implement the adopted model. The present research shows that changes in quality improvement processes may vary in relation to policy makers' interventions aimed at boosting the adoption of new hospital arrangements. 
Hospital administrators need to be aware of and identify the institutional changes that may be driven by law, so that they can react consistently with policymakers' expectations.

  11. Compound management beyond efficiency.

    PubMed

    Burr, Ian; Winchester, Toby; Keighley, Wilma; Sewing, Andreas

    2009-06-01

    Codeveloping alongside chemistry and in vitro screening, compound management was one of the first areas in research to recognize the need for efficient processes and workflows. Material management groups have centralized, automated, miniaturized and, importantly, found out what not to do with compounds. While driving down cost and improving quality in storage and processing, these groups still face the challenge of interfacing optimally with changing business processes in screening groups and with external vendors, as well as a growing focus on biologicals in many companies. Here we review our strategy to provide a seamless link between compound acquisition and screening operations, and the impact of material management on the quality of downstream processes. Although this is driven in part by new technologies and improved quality control within material management, redefining team structures and roles also drives job satisfaction and motivation in our teams, with a subsequent positive impact on cycle times and customer feedback.

  12. A Focus Group Exploration of Automated Case-Finders to Identify High-Risk Heart Failure Patients Within an Urban Safety Net Hospital.

    PubMed

    Patterson, Mark E; Miranda, Derick; Schuman, Greg; Eaton, Christopher; Smith, Andrew; Silver, Brad

    2016-01-01

    Leveraging "big data" as a means of informing cost-effective care holds potential in triaging high-risk heart failure (HF) patients for interventions within hospitals seeking to reduce 30-day readmissions. We explored providers' beliefs and perceptions about using an electronic health record (EHR)-based tool that uses unstructured clinical notes to risk-stratify high-risk heart failure patients. Six providers from an inpatient HF clinic within an urban safety net hospital were recruited to participate in a semistructured focus group. A facilitator led a discussion on the feasibility and value of using an EHR tool driven by unstructured clinical notes to help identify high-risk patients. Data collected from transcripts were analyzed using a thematic analysis that facilitated drawing conclusions clustered around categories and themes. From six categories emerged two themes: (1) the challenges of finding valid and accurate results, and (2) strategies used to overcome these challenges. Although employing a tool that uses electronic medical record (EMR) unstructured text as the benchmark by which to identify high-risk patients is efficient, choosing appropriate benchmark groups could be challenging given the multiple causes of readmission. Strategies to mitigate these challenges include establishing clear selection criteria to guide benchmark group composition and quality outcome goals for the hospital. Prior to implementing into practice an innovative EMR-based case-finder driven by unstructured clinical notes, providers are advised to: (1) define patient quality outcome goals, (2) establish criteria to guide benchmark selection, and (3) verify the tool's validity and reliability. Achieving consensus on these issues would be necessary for this innovative EHR-based tool to effectively improve clinical decision-making and, in turn, decrease readmissions for high-risk patients.
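
    A deliberately naive sketch of a note-driven case-finder helps make the providers' concerns concrete: the risk terms, patterns, and two-hit threshold below are invented for illustration, and a real tool would rely on validated criteria and proper clinical NLP rather than regular expressions.

```python
import re

# Hypothetical risk categories a note-driven HF case-finder might scan for.
RISK_PATTERNS = {
    "low_ef": r"(ejection fraction|EF)\s*(of\s*)?[0-3]?\d\s*%",
    "diuretic_escalation": r"increas\w+ (dose of )?(furosemide|lasix)",
    "prior_admission": r"readmit|prior admission",
}

def flag_high_risk(note, min_hits=2):
    """Return the matched risk categories; flag the patient when at
    least min_hits distinct categories appear in the note text."""
    hits = [name for name, pat in RISK_PATTERNS.items()
            if re.search(pat, note, flags=re.IGNORECASE)]
    return hits, len(hits) >= min_hits

note = ("Patient readmitted with dyspnea. Echo shows ejection fraction 25%. "
        "Increased dose of furosemide to 80 mg.")
hits, high_risk = flag_high_risk(note)
```

    Even this toy version shows where validity problems enter: each pattern is a benchmark-definition decision, which is exactly why the focus group stressed agreeing on selection criteria before deployment.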

  13. Improving Information Exchange and Coordination amongst Homeland Security Organizations (Briefing Charts)

    DTIC Science & Technology

    2005-06-01

    - need for a user-defined dashboard
    - automated monitoring of web data sources
    - task-driven data aggregation and display
    Working toward automated processing of task, resource, and intelligence updates.

  14. Data Systems and Reports as Active Participants in Data Interpretation

    ERIC Educational Resources Information Center

    Rankin, Jenny Grant

    2016-01-01

    Most data-informed decision-making in education is undermined by flawed interpretations. Educator-driven interventions to improve data use are beneficial but not omnipotent, as data misunderstandings persist at schools and school districts commended for ideal data use support. Meanwhile, most data systems and reports display figures without…

  15. How NASA's Atmospheric Science Data Center (ASDC) is operationally using the Esri ArcGIS Platform to improve data discoverability, accessibility and interoperability to meet the diversifying government, private, public and academic communities' driven requirements.

    NASA Astrophysics Data System (ADS)

    Tisdale, M.

    2016-12-01

    NASA's Atmospheric Science Data Center (ASDC) is operationally using the Esri ArcGIS Platform to improve data discoverability, accessibility and interoperability to meet the diversifying requirements of the government, private, public and academic communities. The ASDC is actively working to provide its mission-essential datasets as ArcGIS Image Services, Open Geospatial Consortium (OGC) Web Mapping Services (WMS), and OGC Web Coverage Services (WCS), leveraging the ArcGIS multidimensional mosaic dataset structure. Science teams and the ASDC are utilizing these services, developing applications using the Web AppBuilder for ArcGIS and the ArcGIS API for JavaScript, and evaluating restructuring their data production and access scripts within the ArcGIS Python Toolbox framework and Geoprocessing service environment. These capabilities yield greater usage and exposure of ASDC data holdings and provide improved geospatial analytical tools for mission-critical understanding in the areas of the earth's radiation budget, clouds, aerosols, and tropospheric chemistry.

  16. From the field and lab to data wrangling and facilitating interagency collaboration- how I ended up in the data science world.

    NASA Astrophysics Data System (ADS)

    Kreft, J.

    2015-12-01

    I work to build systems that make environmental data more accessible and usable for others, a role that I love and, ten years ago, would not have guessed I would play. I transitioned from conducting pure research to learning more about data curation and information science, and eventually to combining knowledge of both the research and data science worlds in my current position at the U.S. Geological Survey Center for Integrated Data Analytics (USGS CIDA). At the USGS, I primarily work on the Water Quality Portal, an interagency tool for providing high-performance, standards-driven access to water quality data, and the USGS Publications Warehouse, which plays a key and ever-expanding role in providing access to USGS publications and their associated data sets. Both projects require an overarching focus on building services to make science data more visible and accessible to users, as well as listening to the needs of the research scientists who are both collecting and using the data, in order to improve the tools whose development I guide. Concepts that I learned in the Data Curation Education Program at the University of Illinois at Urbana-Champaign Graduate School of Library and Information Science were critical to a successful transition from the research world to the data science world. Data curation and data science are playing an ever-larger role in surmounting current and future data challenges at the USGS, and the need for people with interests in both research and data science will continue to grow.

  17. Peer-driven quality improvement among health workers and traditional birth attendants in Sierra Leone: linkages between providers' organizational skills and relationships.

    PubMed

    Higgins-Steele, Ariel; Waller, Kathryn; Fotso, Jean Christophe; Vesel, Linda

    2015-01-01

    Sierra Leone has among the poorest maternal and child health indicators in the world, and investments in public health have been predominantly to increase demand for services, with fewer initiatives targeting supply-side factors that influence health workers' work environment. This paper uses data from the Quality Circles project in a rural district of Sierra Leone to achieve three objectives. First, we examine the effect of the intervention on organizational skills and relationships among coworkers as well as between health workers and traditional birth attendants. Second, we examine whether changes in organizational skills are associated with changes in relationships among and between formal and informal health providers and between health providers and clients. Third, we aim to further understand these changes through the perspectives of health workers and traditional birth attendants. The Quality Circles project was implemented in Kailahun District in the Eastern province of Sierra Leone from August 2011 to June 2013, with adjacent Tonkolili District serving as the control site. Using a mixed-methods approach, the evaluation included a quantitative survey, in-depth interviews and focus group discussions with health workers and traditional birth attendants. Mean values of the variables of interest were compared across sub-populations, and correlation analyses were performed between changes in organizational skills and changes in relationships. The results demonstrate that the Quality Circles intervention had positive effects on organizational skills and relationships. Furthermore, improvements in all organizational skill variables (problem-solving, strategizing and negotiation skills) were strongly associated with a change in the overall relationship variable. 
The Quality Circles approach has the potential to support health workers to improve their organizational skills and relationships, which in turn can contribute to improving the interpersonal dimensions of the quality of care in low-resource contexts. This method brings together peers in a structured process for constructive group work and individual skill development, which are important in low-resource contexts where active participation and resourcefulness of health workers can also contribute to better health service delivery.

  18. Teamwork for Oversight of Processes and Systems (TOPS). Implementation guide for TOPS version 2.0, 10 August 1992

    NASA Technical Reports Server (NTRS)

    Strand, Albert A.; Jackson, Darryl J.

    1992-01-01

    As the nation redefines priorities to deal with a rapidly changing world order, both government and industry require new approaches for oversight of management systems, particularly for high-technology products. Declining defense budgets will lead to significant reductions in government contract management personnel. Concurrently, defense contractors are reducing administrative and overhead staffing to control costs. These combined pressures require bold approaches for the oversight of management systems. In the spring of 1991, the DPRO and TRW created a Process Action Team (PAT) to jointly prepare a Performance Based Management (PBM) system titled Teamwork for Oversight of Processes and Systems (TOPS). The primary goal is implementation of a performance-based management system based on objective data to review critical TRW processes, with an emphasis on continuous improvement. The processes are: Finance and Business Systems, Engineering and Manufacturing Systems, Quality Assurance, and Software Systems. The team established a number of goals: delivery of quality products to contractual terms and conditions; ensuring that TRW management systems meet government guidance and good business practices; use of objective data to measure critical processes; elimination of wasteful or duplicative reviews and audits; emphasis on teamwork, so that all efforts are perceived to add value by both sides and decisions are made by consensus; and synergy and the creation of a strong working trust between TRW and the DPRO. TOPS permits the adjustment of oversight resources when conditions change or when TRW systems' performance indicates that either an increase or a decrease in surveillance is appropriate. Monthly Contractor Performance Assessments (CPAs) are derived from a summary of supporting system-level and process-level ratings obtained from objective process-level data. 
Tiered, objective, data-driven metrics are highly successful in achieving a cooperative and effective method of measuring performance. The teamwork-based culture developed by TOPS proved an unequaled success in removing adversarial relationships and creating an atmosphere of continuous improvement in quality processes at TRW. The new working relationship does not decrease the responsibility or authority of the DPRO to ensure contract compliance, and it permits both parties to work more effectively to improve total quality and reduce cost. By emphasizing teamwork in developing a stronger approach to efficient management of the defense industrial base, TOPS is a singular success.

  19. Teamwork for Oversight of Processes and Systems (TOPS). Implementation guide for TOPS version 2.0, 10 August 1992

    NASA Astrophysics Data System (ADS)

    Strand, Albert A.; Jackson, Darryl J.

    As the nation redefines priorities to deal with a rapidly changing world order, both government and industry require new approaches for oversight of management systems, particularly for high-technology products. Declining defense budgets will lead to significant reductions in government contract management personnel. Concurrently, defense contractors are reducing administrative and overhead staffing to control costs. These combined pressures require bold approaches for the oversight of management systems. In the spring of 1991, the DPRO and TRW created a Process Action Team (PAT) to jointly prepare a Performance Based Management (PBM) system titled Teamwork for Oversight of Processes and Systems (TOPS). The primary goal is implementation of a performance-based management system based on objective data to review critical TRW processes, with an emphasis on continuous improvement. The processes are: Finance and Business Systems, Engineering and Manufacturing Systems, Quality Assurance, and Software Systems. The team established a number of goals: delivery of quality products to contractual terms and conditions; ensuring that TRW management systems meet government guidance and good business practices; use of objective data to measure critical processes; elimination of wasteful or duplicative reviews and audits; emphasis on teamwork, so that all efforts are perceived to add value by both sides and decisions are made by consensus; and synergy and the creation of a strong working trust between TRW and the DPRO. TOPS permits the adjustment of oversight resources when conditions change or when TRW systems' performance indicates that either an increase or a decrease in surveillance is appropriate. Monthly Contractor Performance Assessments (CPAs) are derived from a summary of supporting system-level and process-level ratings obtained from objective process-level data. 
Tiered, objective, data-driven metrics are highly successful in achieving a cooperative and effective method of measuring performance. The teamwork-based culture developed by TOPS proved an unequaled success in removing adversarial relationships and creating an atmosphere of continuous improvement in quality processes at TRW. The new working relationship does not decrease the responsibility or authority of the DPRO to ensure contract compliance, and it permits both parties to work more effectively to improve total quality and reduce cost. By emphasizing teamwork in developing a stronger approach to efficient management of the defense industrial base, TOPS is a singular success.

  20. Feature maps driven no-reference image quality prediction of authentically distorted images

    NASA Astrophysics Data System (ADS)

    Ghadiyaram, Deepti; Bovik, Alan C.

    2015-03-01

    Current blind image quality prediction models rely on benchmark databases comprised of singly and synthetically distorted images, thereby learning image features that are only adequate to predict human perceived visual quality on such inauthentic distortions. However, real world images often contain complex mixtures of multiple distortions. Rather than a) discounting the effect of these mixtures of distortions on an image's perceptual quality and considering only the dominant distortion or b) using features that are only proven to be efficient for singly distorted images, we deeply study the natural scene statistics of authentically distorted images, in different color spaces and transform domains. We propose a feature-maps-driven statistical approach which avoids any latent assumptions about the type of distortion(s) contained in an image, and focuses instead on modeling the remarkable consistencies in the scene statistics of real world images in the absence of distortions. We design a deep belief network that takes model-based statistical image features derived from a very large database of authentically distorted images as input and discovers good feature representations by generalizing over different distortion types, mixtures, and severities, which are later used to learn a regressor for quality prediction. We demonstrate the remarkable competence of our features for improving automatic perceptual quality prediction on a benchmark database and on the newly designed LIVE Authentic Image Quality Challenge Database and show that our approach of combining robust statistical features and the deep belief network dramatically outperforms the state-of-the-art.
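
    A common building block for the model-based natural-scene-statistics features this line of work draws on is the mean-subtracted, contrast-normalized (MSCN) coefficient map; the sketch below uses a simple box window for the local statistics, whereas published models typically use a Gaussian window and far richer feature sets across multiple color spaces and transform domains.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def mscn(img, k=7, c=1.0):
    """Mean-subtracted contrast-normalized coefficients:
    (I - local_mean) / (local_std + c), over a k x k neighborhood."""
    img = img.astype(float)
    pad = k // 2
    padded = np.pad(img, pad, mode="reflect")
    win = sliding_window_view(padded, (k, k))   # shape (H, W, k, k)
    mu = win.mean(axis=(-2, -1))
    sigma = win.std(axis=(-2, -1))
    return (img - mu) / (sigma + c)

# NSS-style features are then summary statistics of the MSCN map
# (e.g., its variance and kurtosis), fed to a learned quality regressor.
```

    For pristine natural images the MSCN coefficients are remarkably regular; distortions perturb that regularity, which is what distortion-agnostic quality models exploit.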

  1. The Structural Consequences of Big Data-Driven Education.

    PubMed

    Zeide, Elana

    2017-06-01

    Educators and commenters who evaluate big data-driven learning environments focus on specific questions: whether automated education platforms improve learning outcomes, invade student privacy, and promote equality. This article puts aside separate unresolved, and perhaps unresolvable, issues regarding the concrete effects of specific technologies. It instead examines how big data-driven tools alter the structure of schools' pedagogical decision-making, and, in doing so, change fundamental aspects of America's education enterprise. Technological mediation and data-driven decision-making have a particularly significant impact in learning environments because the education process primarily consists of dynamic information exchange. In this overview, I highlight three significant structural shifts that accompany school reliance on data-driven instructional platforms that perform core school functions: teaching, assessment, and credentialing. First, virtual learning environments create information technology infrastructures featuring constant data collection, continuous algorithmic assessment, and possibly infinite record retention. This undermines the traditional intellectual privacy and safety of classrooms. Second, these systems displace pedagogical decision-making from educators serving public interests to private, often for-profit, technology providers. They constrain teachers' academic autonomy, obscure student evaluation, and reduce parents' and students' ability to participate or challenge education decision-making. Third, big data-driven tools define what "counts" as education by mapping the concepts, creating the content, determining the metrics, and setting desired learning outcomes of instruction. These shifts cede important decision-making to private entities without public scrutiny or pedagogical examination. In contrast to the public and heated debates that accompany textbook choices, schools often adopt education technologies ad hoc. 
Given education's crucial impact on individual and collective success, educators and policymakers must consider the implications of data-driven education proactively and explicitly.

  2. A data-driven search for semen-related phenotypes in conception delay

    PubMed Central

    Patel, C. J.; Sundaram, R.; Buck Louis, G. M.

    2016-01-01

    Sperm count, morphology, and motility have been reported to be predictive of pregnancy, although with an equivocal basis, prompting some authors to question the prognostic value of semen analysis. To assess the utility of including semen quality data in predicting conception delay, defined as requiring >6 cycles to become pregnant, we utilized novel data-driven analytic techniques in a pre-conception cohort of couples prospectively followed up for time-to-pregnancy. The study cohort comprised 402 (80%) male partners who provided semen samples and had time-to-pregnancy information. Female partners used home pregnancy tests and recorded results in daily journals. Odds ratios (ORs), false discovery rates, and 95% confidence intervals (CIs) for conception delay (time-to-pregnancy > 6 cycles) were estimated for 40 semen quality phenotypes comprising 35 semen quality endpoints and 5 closely related fecundity determinants (body mass index, time of contraception, lipids, cotinine and seminal white blood cells). Both traditional and strict sperm phenotype measures were associated with lower odds of conception delay. Specifically, for an increase in percent morphologically normal spermatozoa using traditional methods, we observed a 40% decrease in the odds of conception delay (OR = 0.6, 95% CI = 0.50, 0.81; p = 0.0003). Similarly, for an increase in strict criteria, we observed a 30% decrease in the odds of conception delay (OR = 0.7, 95% CI = 0.52, 0.83; p = 0.001). On the other hand, an increase in percent coiled-tail spermatozoa was associated with a 40% increase in the odds of conception delay (OR = 1.4, 95% CI = 1.12, 1.75; p = 0.003). However, our findings suggest that semen phenotypes have little predictive value for conception delay (area under the curve of 73%). In a multivariate model containing significant semen factors and traditional risk factors (i.e. age, body mass index, cotinine and ever having fathered a pregnancy), there was a modest improvement in prediction of conception delay (16% increase in area under the curve, p < 0.0002). PMID:27792860
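
    The odds ratios and area-under-the-curve values reported above are, in outline, computed like this; the 2x2 table and score lists below are toy data, not the study's.

```python
def odds_ratio(exposed_cases, exposed_controls, unexposed_cases, unexposed_controls):
    """Cross-product OR from a 2x2 table, e.g. conception delay (cases)
    versus a binary semen phenotype (exposed)."""
    return (exposed_cases * unexposed_controls) / (exposed_controls * unexposed_cases)

def auc(scores_pos, scores_neg):
    """Rank-based AUC (Mann-Whitney): the probability that a randomly
    chosen positive scores above a random negative, ties counted half."""
    wins = sum((p > n) + 0.5 * (p == n) for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Toy example: risk scores for couples with (pos) and without (neg) delay.
pos = [0.9, 0.8, 0.7, 0.4]
neg = [0.6, 0.5, 0.3, 0.2, 0.1]
```

    An AUC of 73%, as reported, means a delayed couple outranks a non-delayed one only about three times in four, which is why the authors call the semen phenotypes' predictive value modest.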

  3. Fostering evidence-based quality improvement for patient-centered medical homes: Initiating local quality councils to transform primary care.

    PubMed

    Stockdale, Susan E; Zuchowski, Jessica; Rubenstein, Lisa V; Sapir, Negar; Yano, Elizabeth M; Altman, Lisa; Fickel, Jacqueline J; McDougall, Skye; Dresselhaus, Timothy; Hamilton, Alison B

    Although the patient-centered medical home endorses quality improvement principles, methods for supporting ongoing, systematic primary care quality improvement have not been evaluated. We introduced primary care quality councils at six Veterans Health Administration sites as an organizational intervention with three key design elements: (a) fostering interdisciplinary quality improvement leadership, (b) establishing a structured quality improvement process, and (c) facilitating organizationally aligned frontline quality improvement innovation. Our evaluation objectives were to (a) assess design element implementation, (b) describe implementation barriers and facilitators, and (c) assess successful quality improvement project completion and spread. We analyzed administrative records and conducted interviews with 85 organizational leaders. We developed and applied criteria for assessing design element implementation using hybrid deductive/inductive analytic techniques. All quality councils implemented interdisciplinary leadership and a structured quality improvement process, and all but one completed at least one quality improvement project and a toolkit for spreading improvements. Quality councils were perceived as most effective when service line leaders had well-functioning interdisciplinary communication. Matching positions within leadership hierarchies with appropriate supportive roles facilitated frontline quality improvement efforts. Two key resources were (a) a dedicated internal facilitator with project management, data collection, and presentation skills and (b) support for preparing customized data reports for identifying and addressing practice level quality issues. Overall, quality councils successfully cultivated interdisciplinary, multilevel primary care quality improvement leadership with accountability mechanisms and generated frontline innovations suitable for spread. 
Practice level performance data and quality improvement project management support were critical. In order to successfully facilitate systematic, sustainable primary care quality improvement, regional and executive health care system leaders should engage interdisciplinary practice level leadership in a priority-setting process that encourages frontline innovation and establish local structures such as quality councils to coordinate quality improvement initiatives, ensure accountability, and promote spread of best practices.

  4. Data-Driven Healthcare: Challenges and Opportunities for Interactive Visualization.

    PubMed

    Gotz, David; Borland, David

    2016-01-01

    The healthcare industry's widespread digitization efforts are reshaping one of the largest sectors of the world's economy. This transformation is enabling systems that promise to use ever-improving data-driven evidence to help doctors make more precise diagnoses, institutions identify at-risk patients for intervention, clinicians develop more personalized treatment plans, and researchers better understand medical outcomes within complex patient populations. Given the scale and complexity of the data required to achieve these goals, advanced data visualization tools have the potential to play a critical role. This article reviews a number of visualization challenges unique to the healthcare discipline.

  5. Use of Structural Equation Modeling to Demonstrate the Differential Impact of Storage and Voiding Lower Urinary Tract Symptoms on Symptom Bother and Quality of Life during Treatment for Lower Urinary Tract Symptoms Associated with Benign Prostatic Hyperplasia.

    PubMed

    McVary, Kevin T; Peterson, Andrew; Donatucci, Craig F; Baygani, Simin; Henneges, Carsten; Clouth, Johannes; Wong, David; Oelke, Matthias

    2016-09-01

    Lower urinary tract symptoms associated with benign prostatic hyperplasia typically respond well to medical therapy. While changes in total I-PSS (International Prostate Symptom Score) are generally accepted as a measure of treatment response, I-PSS storage and voiding subscores may not accurately reflect the influence of symptom improvement on patient bother and quality of life. Structural equation modeling was done to evaluate physiological interrelationships measured by I-PSS storage vs voiding subscore questions, and to measure the magnitude of effects on bother using the BII (Benign Prostatic Hyperplasia Impact Index) and on quality of life using the I-PSS quality of life question. Pooled data from 4 randomized, controlled trials of tadalafil and placebo in 1,462 men with lower urinary tract symptoms/benign prostatic hyperplasia were used to investigate the relationship of storage vs voiding lower urinary tract symptoms on BII and quality of life. The final structural equation model demonstrated a sufficient fit to model interdependence of storage, voiding, bother and quality of life (probability for test of close fit <0.0001). Storage aspects had a twofold greater effect on voiding than voiding aspects had on storage (0.61 vs 0.28, each p <0.0001). The direct effect of storage on bother was twofold greater than that of voiding on bother (0.64 vs 0.29, each p <0.0001). Bother had the largest direct impact on quality of life (-0.83), largely driven by storage lower urinary tract symptoms (p <0.0001). Total I-PSS is a reliable instrument to assess the therapeutic response in lower urinary tract symptoms/benign prostatic hyperplasia cases. However, an improvement in storage lower urinary tract symptoms is mainly responsible for improved bother and quality of life during treatment. Care should be taken when evaluating the accuracy of I-PSS subscores as indicators of the response to medical therapy. Copyright © 2016 American Urological Association Education and Research, Inc. 
Published by Elsevier Inc. All rights reserved.
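
    The mediation arithmetic behind these findings is easy to reproduce: in a path model, an indirect effect is the product of the coefficients along the route. A minimal Python sketch using only the standardized coefficients quoted in the abstract (storage→bother 0.64, voiding→bother 0.29, bother→quality of life -0.83):

```python
# Indirect (mediated) effects in a path model are products of the path
# coefficients along each route. Coefficients below are the standardized
# values reported in the abstract.
paths = {
    ("storage", "bother"): 0.64,   # direct effect of storage symptoms on bother
    ("voiding", "bother"): 0.29,   # direct effect of voiding symptoms on bother
    ("bother", "qol"): -0.83,      # effect of bother on quality of life
}

def indirect_effect(source, mediator, target):
    """Indirect effect of `source` on `target` via `mediator`."""
    return paths[(source, mediator)] * paths[(mediator, target)]

storage_on_qol = indirect_effect("storage", "bother", "qol")
voiding_on_qol = indirect_effect("voiding", "bother", "qol")
```

    The storage route (0.64 × -0.83 ≈ -0.53) depresses quality of life more than twice as strongly as the voiding route (0.29 × -0.83 ≈ -0.24), consistent with the abstract's conclusion that storage symptoms mainly drive bother and quality of life.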

  6. Quality aspects of ex vivo root canal treatments done by undergraduate dental students using four different endodontic treatment systems.

    PubMed

    Jungnickel, Luise; Kruse, Casper; Vaeth, Michael; Kirkevang, Lise-Lotte

    2018-04-01

    To evaluate factors associated with treatment quality of ex vivo root canal treatments performed by undergraduate dental students using different endodontic treatment systems. Four students performed root canal treatment on 80 extracted human teeth using four endodontic treatment systems in designated treatment order following a Latin square design. Lateral seal and length of root canal fillings were radiographically assessed; for lateral seal, a graded visual scale was used. Treatment time was measured separately for access preparation, biomechanical root canal preparation, obturation and for the total procedure. Mishaps were registered. An ANOVA mirroring the Latin square design was performed. Use of machine-driven nickel-titanium systems resulted in overall better quality scores for lateral seal than use of the manual stainless-steel system. Among systems with machine-driven files, scores did not significantly differ. Use of machine-driven instruments resulted in shorter treatment time than manual instrumentation. Machine-driven systems with few files achieved shorter treatment times. With an increasing number of treatments, root canal filling quality increased and treatment time decreased; a learning curve was plotted. No root canal shaping file separated. The use of endodontic treatment systems with machine-driven files led to higher quality lateral seal compared to the manual system. The three contemporary machine-driven systems delivered comparable results regarding quality of root canal fillings; they were safe to use and provided a more efficient workflow than the manual technique. Increasing experience had a positive impact on the quality of root canal fillings while treatment time decreased.
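
    The balancing device used here, a 4×4 Latin square, ensures that each student uses every system exactly once and that every system appears once in each treatment period. A sketch of how such an assignment can be generated by cyclic rotation; the system labels are placeholders, since the abstract does not name the four systems:

```python
def latin_square(items):
    """Build an n x n Latin square by cyclic rotation: each item appears
    exactly once per row (student) and once per column (treatment period)."""
    n = len(items)
    return [[items[(row + col) % n] for col in range(n)] for row in range(n)]

# Hypothetical labels; the abstract does not name the four systems.
systems = ["manual", "rotary_A", "rotary_B", "rotary_C"]
design = latin_square(systems)
for student, order in enumerate(design, start=1):
    print(f"student {student}: {order}")
```

    An ANOVA "mirroring" this design then separates student, period and system effects, so differences in filling quality can be attributed to the systems themselves.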

  7. A Parametric Oscillator Experiment for Undergraduates

    NASA Astrophysics Data System (ADS)

    Huff, Alison; Thompson, Johnathon; Pate, Jacob; Kim, Hannah; Chiao, Raymond; Sharping, Jay

    We describe an upper-division undergraduate-level analytic mechanics experiment or classroom demonstration of a weakly-damped pendulum driven into parametric resonance. Students can derive the equations of motion from first principles and extract key oscillator features, such as quality factor and parametric gain, from experimental data. The apparatus is compact, portable and easily constructed from inexpensive components. Motion control and data acquisition are accomplished using an Arduino micro-controller incorporating a servo motor, laser sensor, and data logger. We record the passage time of the pendulum through its equilibrium position and obtain the maximum speed per oscillation as a function of time. As examples of the interesting physics which the experiment reveals, we present contour plots depicting the energy of the system as functions of driving frequency and modulation depth. We observe the transition to steady state oscillation and compare the experimental oscillation threshold with theoretical expectations. A thorough understanding of this hands-on laboratory exercise provides a foundation for current research in quantum information and opto-mechanics, where damped harmonic motion, quality factor, and parametric amplification are central.
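
    The threshold behavior the students measure can be reproduced numerically. The sketch below integrates a damped pendulum whose stiffness is modulated at twice the natural frequency; when the modulation depth exceeds roughly 2γ/ω₀, the amplitude grows exponentially (parametric resonance). Parameter values are illustrative, not those of the actual apparatus:

```python
import math

# RK4 integration of a damped pendulum with parametric drive:
#   theta'' + gamma*theta' + w0^2 * (1 + eps*cos(2*w0*t)) * sin(theta) = 0
# Modulating the stiffness at twice the natural frequency pumps energy into
# the oscillation once eps exceeds roughly 2*gamma/w0 (parametric threshold).
# Parameters below are illustrative, not those of the apparatus.
w0, gamma, eps = 2 * math.pi, 0.05, 0.2

def deriv(t, theta, omega):
    return omega, -gamma * omega - w0**2 * (1 + eps * math.cos(2 * w0 * t)) * math.sin(theta)

def simulate(t_end=10.0, dt=1e-3, theta0=1e-3):
    """Integrate from a tiny initial displacement; return the peak |theta|."""
    t, theta, omega = 0.0, theta0, 0.0
    peak = abs(theta0)
    while t < t_end:
        k1 = deriv(t, theta, omega)
        k2 = deriv(t + dt/2, theta + dt/2 * k1[0], omega + dt/2 * k1[1])
        k3 = deriv(t + dt/2, theta + dt/2 * k2[0], omega + dt/2 * k2[1])
        k4 = deriv(t + dt, theta + dt * k3[0], omega + dt * k3[1])
        theta += dt/6 * (k1[0] + 2*k2[0] + 2*k3[0] + k4[0])
        omega += dt/6 * (k1[1] + 2*k2[1] + 2*k3[1] + k4[1])
        t += dt
        peak = max(peak, abs(theta))
    return peak

# Above threshold the amplitude grows by an order of magnitude over a few
# seconds; below threshold (smaller eps) it simply decays.
peak_amplitude = simulate()
```

    Sweeping `eps` and the drive frequency in such a simulation reproduces the kind of energy contour plots described in the abstract.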

  8. Revenue, relationships and routines: the social organization of acute myocardial infarction patient transfers in the United States.

    PubMed

    Veinot, Tiffany C; Bosk, Emily A; Unnikrishnan, K P; Iwashyna, Theodore J

    2012-11-01

    Heart attack, or acute myocardial infarction (AMI), is a leading cause of death in the United States (U.S.). The most effective therapy for AMI is rapid revascularization: the mechanical opening of the clogged artery in the heart. Forty-four percent of patients with AMI who are admitted to a non-revascularization hospital in the U.S. are transferred to a hospital with that capacity. Yet, we know little about the process by which community hospitals complete these transfers, and why publicly available hospital quality data plays a small role in community hospitals' choice of transfer destinations. Therefore, we investigated how community hospital staff implement patient transfers and select destinations. We conducted a mixed methods study involving: interviews with staff at three community hospitals (n = 25) in a Midwestern state and analysis of U.S. national Medicare records for 1996-2006. Community hospitals in the U.S., including our field sites, typically had longstanding relationships with one key receiving hospital. Community hospitals addressed the need for rapid AMI patient transfers by routinizing the collective, interhospital work process. Routinization reduced staff uncertainty, coordinated their efforts and conserved their cognitive resources for patient care. While destination selection was nominally a physician role, the decision was routinized, such that staff immediately contacted a "usual" transfer destination upon AMI diagnosis. Transfer destination selection was primarily driven at an institutional level by organizational concerns and bed supply, rather than physician choice or patient preference. Transfer routinization emerged as a form of social order that invoked tradeoffs between process speed and efficiency and patient-centered, quality-driven decision making. We consider the implications of routinization and institutional imperatives for health policy, quality improvement and health informatics interventions. 
Copyright © 2012 Elsevier Ltd. All rights reserved.

  9. Medication Use by U.S. Crewmembers on the International Space Station

    NASA Technical Reports Server (NTRS)

    Wotring, V. E.

    2015-01-01

    This study examined medication use during long-duration spaceflight. Medication records from 24 crewmembers on 20 missions (greater than 30 days duration) were examined for trends in usage rates, efficacy, indication, as well as adverse event qualities, frequencies and severities. No controls were possible in this observational, retrospective analysis of available data; comparisons are made to similar studies of individuals on short-duration spaceflights and submarine deployments. The most frequently used medications were for sleep problems, pain, congestion or allergy. Medication use during spaceflight missions was similar to what is seen in adult ambulatory medicine; one notable exception is that usage of sleep aids was about 10 times higher in spaceflight. There were also two apparent treatment failures in cases of skin rash, raising questions about the efficacy or suitability of the treatments used. Many spaceflight-related medication uses were linked to extravehicular activities and operationally driven schedule changes. The data suggest that sleep and skin rash merit additional study prior to longer space exploration missions. It also seems likely that alterations in schedule-shifting or extravehicular activity suits would reduce the need for many medication uses, preserving resources as well as improving crew quality of life.

  10. Enabling data-driven provenance in NetCDF, via OGC WPS operations. Climate Analysis services use case.

    NASA Astrophysics Data System (ADS)

    Mihajlovski, A.; Spinuso, A.; Plieger, M.; Som de Cerff, W.

    2016-12-01

    Modern climate analysis platforms provide generic and standardized ways of accessing data and processing services, typically supported by a wide range of OGC formats and interfaces. However, the problem of instrumentally tracing the lineage of the transformations occurring on a dataset and its provenance remains an open challenge. It requires standard-driven and interoperable solutions to facilitate understanding and sharing of self-describing data products, fostering collaboration among peers. The CLIPC portal provided us with a real use case in which the need for instrumented provenance management is fundamental. CLIPC provides a single point of access for scientific information on climate change. The data about the physical environment that are used to inform climate change policy and adaptation measures come from several categories: satellite measurements, terrestrial observing systems, model projections and simulations, and re-analyses. This is made possible through the Copernicus Earth Observation Programme for Europe. With a backbone combining WPS and OPeNDAP services, CLIPC has two themes: 1. Harmonized access to climate datasets derived from models, observations and re-analyses; 2. A climate impact tool kit to evaluate, rank and aggregate indicators. The climate impact tool kit is realised with the orchestration of a number of WPS that ingest, normalize and combine NetCDF files. The WPS allowing this specific computation are hosted by the climate4impact portal, which is a more generic climate data-access and processing service. In this context, guaranteeing validation and reproducibility of results is a clearly stated requirement to improve the quality of the results obtained by the combined analysis. Two core contributions are the enabling of a provenance wrapper around WPS services and the enabling of provenance tracing within the NetCDF format, which adopts and extends the W3C PROV model.
To disseminate indicator data and create transformed data products, a standardized provenance, metadata and processing infrastructure is being researched for CLIPC. These efforts will lead towards the provision of tools for further web service processing development and optimisation, opening up possibilities to scale and to administer abstract, user- and data-driven workflows.
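
    For readers unfamiliar with the W3C PROV model referenced above: provenance is recorded as entities (data products), activities (processes), and relations such as `used` and `wasGeneratedBy`. A minimal PROV-JSON-style sketch for one hypothetical WPS normalization step; all identifiers and namespaces below are invented for illustration and are not CLIPC's actual vocabulary:

```python
import json

# Minimal PROV-style record for one hypothetical WPS step that normalizes
# a NetCDF file. Identifiers ("ex:...") are invented; a real system would
# follow the W3C PROV-JSON serialization and the project's own namespaces.
prov_doc = {
    "entity": {
        "ex:input.nc":  {"prov:type": "netcdf", "ex:role": "raw indicator"},
        "ex:output.nc": {"prov:type": "netcdf", "ex:role": "normalized indicator"},
    },
    "activity": {
        "ex:wps-normalize": {"prov:startTime": "2016-01-01T00:00:00Z"},
    },
    "used": {
        "_:u1": {"prov:activity": "ex:wps-normalize", "prov:entity": "ex:input.nc"},
    },
    "wasGeneratedBy": {
        "_:g1": {"prov:entity": "ex:output.nc", "prov:activity": "ex:wps-normalize"},
    },
}
print(json.dumps(prov_doc, indent=2))
```

    Embedding such a record inside the NetCDF file itself (as the abstract describes) makes the product self-describing: the lineage travels with the data.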

  11. NGSmethDB 2017: enhanced methylomes and differential methylation

    PubMed Central

    Lebrón, Ricardo; Gómez-Martín, Cristina; Carpena, Pedro; Bernaola-Galván, Pedro; Barturen, Guillermo; Hackenberg, Michael; Oliver, José L.

    2017-01-01

    The 2017 update of NGSmethDB stores whole genome methylomes generated from short-read data sets obtained by whole-genome bisulfite sequencing (WGBS) technology. To generate high-quality methylomes, stringent quality controls were integrated with third-party software, adding also a two-step mapping process to exploit the advantages of the new genome assembly models. The samples were all profiled under constant parameter settings, thus enabling comparative downstream analyses. Besides a significant increase in the number of samples, NGSmethDB now includes two additional data types, which are a valuable resource for the discovery of methylation epigenetic biomarkers: (i) differentially methylated single cytosines; and (ii) methylation segments (i.e. genome regions of homogeneous methylation). The NGSmethDB back-end is now based on MongoDB, a NoSQL hierarchical database using JSON-formatted documents and dynamic schemas, thus accelerating sample comparative analyses. Besides conventional database dumps, track hubs were implemented, which improve database access, visualization in genome browsers and comparative analyses against third-party annotations. In addition, the database can also be accessed through a RESTful API. Lastly, a Python client and a multiplatform virtual machine allow for program-driven access from the user's desktop. This way, private methylation data can be compared to NGSmethDB without the need to upload them to public servers. Database website: http://bioinfo2.ugr.es/NGSmethDB. PMID:27794041

  12. Effectiveness of User- and Expert-Driven Web-based Hypertension Programs: an RCT.

    PubMed

    Liu, Sam; Brooks, Dina; Thomas, Scott G; Eysenbach, Gunther; Nolan, Robert P

    2018-04-01

    The effectiveness of self-guided Internet-based lifestyle counseling (e-counseling) varies, depending on treatment protocol. Two dominant procedures in e-counseling are expert-driven and user-driven. The influence of these procedures on hypertension management remains unclear. The objective was to assess whether blood pressure improved with expert-driven or user-driven e-counseling over a control intervention in patients with hypertension over a 4-month period. This study used a three-parallel-group, double-blind randomized controlled design. In Toronto, Canada, 128 participants (aged 35-74 years) with hypertension were recruited. Participants were recruited using online and poster advertisements. Data collection took place between June 2012 and June 2014. Data were analyzed from October 2014 to December 2016. Controls received a weekly e-mail newsletter regarding hypertension management. The expert-driven group was prescribed a weekly exercise and diet plan (e.g., increase 1,000 steps/day this week). The user-driven group received a weekly e-mail, which allowed participants to choose their intervention goals (e.g., [1] feel more confident to change my lifestyle, or [2] self-help tips for exercise or a heart-healthy diet). The primary outcome was systolic blood pressure measured at baseline and 4-month follow-up. Secondary outcomes included cholesterol, 10-year Framingham cardiovascular risk, daily steps, and dietary habits. The expert-driven group showed a greater systolic blood pressure decrease than controls at follow-up (expert-driven versus control: -7.5 mmHg, 95% CI= -12.5, -2.6, p=0.01). Systolic blood pressure reduction did not significantly differ between the user- and expert-driven groups. Compared with controls, the expert-driven group also showed significant improvement in pulse pressure, cholesterol, and Framingham risk score. The expert-driven intervention was significantly more effective than both the user-driven and control groups in increasing daily steps and fruit intake. 
It may be advisable to incorporate an expert-driven e-counseling protocol in order to accommodate participants with greater motivation to change their lifestyle behaviors, but more studies are needed. This study is registered at www.clinicaltrials.gov NCT03111836. Copyright © 2018 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.

  13. Analyzing Living Surveys: Visualization Beyond the Data Release

    NASA Astrophysics Data System (ADS)

    Buddelmeijer, H.; Noorishad, P.; Williams, D.; Ivanova, M.; Roerdink, J. B. T. M.; Valentijn, E. A.

    2015-09-01

    Surveys need to provide more than periodic data releases. Science often requires data that is not captured in such releases. This mismatch between the constraints set by a fixed data release and the needs of the scientists is solved in the Astro-WISE information system by extending its request-driven data handling into the analysis domain. This leads to Query-Driven Visualization, where all data handling is automated and scalable by exploiting the strengths of data pulling. Astro-WISE is data-centric: new data creates itself automatically if no suitable existing data can be found to fulfill a request. This approach allows scientists to visualize exactly the data they need, without any manual data management, freeing their time for research. The benefits of query-driven visualization are highlighted by searching for distant quasars in KiDS, a 1500 square degree optical survey. KiDS needs to be treated as a living survey to minimize the time between observation and (spectral) follow-up. The first window of opportunity would be missed if it were necessary to wait for data releases. The results from the default processing pipelines are used for a quick and broad selection of quasar candidates. More precise measurements of source properties can subsequently be requested to downsize the candidate set, requiring partial reprocessing of the images. Finally, the raw and reduced pixels themselves are inspected by eye to rank the final candidate list. The quality of the resulting candidate list and the speed of its creation were only achievable due to query-driven visualization of the living archive.
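
    The data-pulling idea described here, where a request is served from the archive if a matching product exists and otherwise the product derives itself from its dependencies, can be sketched as a memoizing resolver. All names and recipes below are illustrative, not Astro-WISE code:

```python
# Sketch of request-driven ("data pulling") resolution: a request for a
# data product is served from the archive if it already exists; otherwise
# the product creates itself by running its recipe (recursively pulling
# its dependencies) and is stored for future requests.
archive = {}    # persistent store of already-derived products
recipes = {}    # how to derive each product from its dependencies

def register(name, depends_on=(), func=lambda *a: None):
    recipes[name] = (depends_on, func)

def pull(name):
    """Return the named product, deriving (and caching) it on demand."""
    if name in archive:
        return archive[name]
    depends_on, func = recipes[name]
    inputs = [pull(dep) for dep in depends_on]   # recursive data pulling
    archive[name] = func(*inputs)
    return archive[name]

# Illustrative pipeline: raw pixels -> calibrated pixels -> a source flux.
register("raw_pixels", func=lambda: [1.0, 2.0, 3.0])
register("calibrated", ("raw_pixels",), lambda px: [v * 0.5 for v in px])
register("source_flux", ("calibrated",), lambda cal: sum(cal))

flux = pull("source_flux")   # derives the whole chain on first request
```

    A second `pull("source_flux")` returns the archived value without recomputation, which is what makes the approach scale to a living survey.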

  14. Support System to Improve Reading Activity in Parkinson’s Disease and Essential Tremor Patients

    PubMed Central

    Parrales Bravo, Franklin; Del Barrio García, Alberto A.; Gallego de la Sacristana, Mercedes; López Manzanares, Lydia; Vivancos, José; Ayala Rodrigo, José Luis

    2017-01-01

    The use of information and communication technologies (ICTs) to improve the quality of life of people with chronic and degenerative diseases is a topic receiving much attention nowadays. We can observe that new technologies have driven numerous scientific projects in e-Health, encompassing Smart and Mobile Health, in order to address all the matters related to data processing and health. Our work focuses on helping to improve the quality of life of people with Parkinson’s Disease (PD) and Essential Tremor (ET) by means of a low-cost platform that enables them to read books in an easy manner. Our system is composed of two robotic arms and a graphical interface developed for Android platforms. After several tests, our proposal has achieved a 96.5% accuracy for A4 80 gr non-glossy paper. Moreover, our system has outperformed the state-of-the-art platforms considering different types of paper and inclined surfaces. The feedback from ET and PD patients was collected at “La Princesa” University Hospital in Madrid and was used to study the user experience. Several features such as ease of use, speed, correct behavior or confidence were measured via patient feedback, and a high level of satisfaction was awarded to most of them. According to the patients, our system is a promising tool for facilitating the activity of reading. PMID:28467366

  15. Task-Driven Optimization of Fluence Field and Regularization for Model-Based Iterative Reconstruction in Computed Tomography.

    PubMed

    Gang, Grace J; Siewerdsen, Jeffrey H; Stayman, J Webster

    2017-12-01

    This paper presents a joint optimization of dynamic fluence field modulation (FFM) and regularization in quadratic penalized-likelihood reconstruction that maximizes a task-based imaging performance metric. We adopted a task-driven imaging framework for prospective design of the imaging parameters. A maxi-min objective function was adopted to maximize the minimum detectability index (d′) throughout the image. The optimization algorithm alternates between FFM (represented by low-dimensional basis functions) and local regularization (including the regularization strength and directional penalty weights). The task-driven approach was compared with three FFM strategies commonly proposed for FBP reconstruction (as well as a task-driven TCM strategy) for a discrimination task in an abdomen phantom. The task-driven FFM assigned more fluence to less attenuating anteroposterior views and yielded approximately constant fluence behind the object. The optimal regularization was almost uniform throughout the image. Furthermore, the task-driven FFM strategy redistributed fluence across detector elements in order to prescribe more fluence to the more attenuating central region of the phantom. Compared with all other strategies, the task-driven FFM strategy not only improved the minimum d′ by at least 17.8%, but also yielded higher d′ over a large area inside the object. The optimal FFM was highly dependent on the amount of regularization, indicating the importance of a joint optimization. Sample reconstructions of simulated data generally support the performance estimates based on computed d′. The improvements in detectability show the potential of the task-driven imaging framework to improve imaging performance at a fixed dose, or, equivalently, to provide a similar level of performance at reduced dose.

  16. The "I" in QRIS Survey: Collecting Data on Quality Improvement Activities for Early Childhood Education Programs. REL 2017-221

    ERIC Educational Resources Information Center

    Faria, Ann-Marie; Hawkinson, Laura; Metzger, Ivan; Bouacha, Nora; Cantave, Michelle

    2017-01-01

    A quality rating and improvement system (QRIS) is a voluntary state assessment system that uses multidimensional data on early childhood education programs to rate program quality, support quality improvement efforts, and provide information to families about the quality of available early childhood education programs. QRISs have two components:…

  17. Toward a North American Standard for Mobile Data Services

    NASA Technical Reports Server (NTRS)

    Dean, Richard A.; Levesque, Allen H.

    1991-01-01

    The rapid introduction of digital mobile communications systems is an important part of the emerging digital communications scene. These developments pose both a potential problem and a challenge. On one hand, these separate market driven developments can result in an uncontrolled mixture of analog and digital links which inhibit data modem services across the mobile/Public Switched network (PSTN). On the other hand, the near coincidence of schedules for development of some of these systems, i.e., Digital Cellular, Mobile Satellite, Land Mobile Radio, and ISDN, provides an opportunity to address interoperability problems by defining interfaces, control, and service standards that are compatible among these new services. In this paper we address the problem of providing data services interoperation between mobile terminals and data devices on the PSTN. The expected data services include G3 Fax, asynchronous data, and the government's STU-3 secure voice system, and future data services such as ISDN. We address a common architecture and a limited set of issues that are key to interoperable mobile data services. We believe that common mobile data standards will both improve the quality of data service and simplify the systems for manufacturers, data users, and service providers.

  18. Toward a North American standard for mobile data services

    NASA Astrophysics Data System (ADS)

    Dean, Richard A.; Levesque, Allen H.

    1991-09-01

    The rapid introduction of digital mobile communications systems is an important part of the emerging digital communications scene. These developments pose both a potential problem and a challenge. On one hand, these separate market driven developments can result in an uncontrolled mixture of analog and digital links which inhibit data modem services across the mobile/Public Switched network (PSTN). On the other hand, the near coincidence of schedules for development of some of these systems, i.e., Digital Cellular, Mobile Satellite, Land Mobile Radio, and ISDN, provides an opportunity to address interoperability problems by defining interfaces, control, and service standards that are compatible among these new services. In this paper we address the problem of providing data services interoperation between mobile terminals and data devices on the PSTN. The expected data services include G3 Fax, asynchronous data, and the government's STU-3 secure voice system, and future data services such as ISDN. We address a common architecture and a limited set of issues that are key to interoperable mobile data services. We believe that common mobile data standards will both improve the quality of data service and simplify the systems for manufacturers, data users, and service providers.

  19. Data mining and statistical inference in selective laser melting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamath, Chandrika

    Selective laser melting (SLM) is an additive manufacturing process that builds a complex three-dimensional part, layer-by-layer, using a laser beam to fuse fine metal powder together. The design freedom afforded by SLM comes associated with complexity. As the physical phenomena occur over a broad range of length and time scales, the computational cost of modeling the process is high. At the same time, the large number of parameters that control the quality of a part make experiments expensive. In this paper, we describe ways in which we can use data mining and statistical inference techniques to intelligently combine simulations and experiments to build parts with desired properties. We start with a brief summary of prior work in finding process parameters for high-density parts. We then expand on this work to show how we can improve the approach by using feature selection techniques to identify important variables, data-driven surrogate models to reduce computational costs, improved sampling techniques to cover the design space adequately, and uncertainty analysis for statistical inference. Here, our results indicate that techniques from data mining and statistics can complement those from physical modeling to provide greater insight into complex processes such as selective laser melting.
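
    The surrogate-model idea can be illustrated with a toy example: sample a stand-in "expensive simulation" at a few process settings, fit a cheap polynomial model, and then query the model instead of the simulator. The quadratic density curve below is invented for illustration and is not an actual SLM melt-pool model:

```python
import numpy as np

# Toy surrogate modeling: run an "expensive simulation" at a few laser-power
# settings, fit a cheap quadratic surrogate by least squares, then use the
# surrogate to screen many candidate settings. The simulator is a stand-in.
def expensive_simulation(power):
    # Invented stand-in: part density peaks at power = 250 (arbitrary units).
    return 0.99 - 1e-6 * (power - 250.0) ** 2

samples = np.linspace(150.0, 350.0, 9)           # a few sampled settings
density = np.array([expensive_simulation(p) for p in samples])

coeffs = np.polyfit(samples, density, deg=2)     # quadratic surrogate fit
surrogate = np.poly1d(coeffs)

# Screening 2001 candidate settings costs 2001 simulator runs without the
# surrogate, but is essentially free with it.
grid = np.linspace(150.0, 350.0, 2001)
best_power = grid[np.argmax(surrogate(grid))]
```

    In practice the surrogate would be fit to multiple inputs (power, speed, spot size) after feature selection, and uncertainty in the fit would feed the statistical inference step the abstract describes.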

  20. Data mining and statistical inference in selective laser melting

    DOE PAGES

    Kamath, Chandrika

    2016-01-11

    Selective laser melting (SLM) is an additive manufacturing process that builds a complex three-dimensional part, layer-by-layer, using a laser beam to fuse fine metal powder together. The design freedom afforded by SLM comes associated with complexity. As the physical phenomena occur over a broad range of length and time scales, the computational cost of modeling the process is high. At the same time, the large number of parameters that control the quality of a part make experiments expensive. In this paper, we describe ways in which we can use data mining and statistical inference techniques to intelligently combine simulations and experiments to build parts with desired properties. We start with a brief summary of prior work in finding process parameters for high-density parts. We then expand on this work to show how we can improve the approach by using feature selection techniques to identify important variables, data-driven surrogate models to reduce computational costs, improved sampling techniques to cover the design space adequately, and uncertainty analysis for statistical inference. Here, our results indicate that techniques from data mining and statistics can complement those from physical modeling to provide greater insight into complex processes such as selective laser melting.

  1. A Unified Data-Driven Approach for Programming In Situ Analysis and Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aiken, Alex

    The placement and movement of data is becoming the key limiting factor on both performance and energy efficiency of high performance computations. As systems generate more data, it is becoming increasingly difficult to actually move that data elsewhere for post-processing, as the rate of improvements in supporting I/O infrastructure is not keeping pace. Together, these trends are creating a shift in how we think about exascale computations, from a viewpoint that focuses on FLOPS to one that focuses on data and data-centric operations as fundamental to the reasoning about, and optimization of, scientific workflows on extreme-scale architectures. The overarching goal of our effort was the study of a unified data-driven approach for programming applications and in situ analysis and visualization. Our work was to understand the interplay between data-centric programming model requirements at extreme-scale and the overall impact of those requirements on the design, capabilities, flexibility, and implementation details for both applications and the supporting in situ infrastructure. In this context, we made many improvements to the Legion programming system (one of the leading data-centric models today) and demonstrated in situ analyses on real application codes using these improvements.

  2. A regional ionospheric TEC mapping technique over China and adjacent areas on the basis of data assimilation

    NASA Astrophysics Data System (ADS)

    Aa, Ercha; Huang, Wengeng; Yu, Shimei; Liu, Siqing; Shi, Liqin; Gong, Jiancun; Chen, Yanhong; Shen, Hua

    2015-06-01

    In this paper, a regional total electron content (TEC) mapping technique over China and adjacent areas (70°E-140°E and 15°N-55°N) is developed on the basis of a Kalman filter data assimilation scheme driven by Global Navigation Satellite Systems (GNSS) data from the Crustal Movement Observation Network of China and the International GNSS Service. The regional TEC maps can be generated accordingly with a spatial and temporal resolution of 1°×1° and 5 min, respectively. The accuracy and quality of the TEC mapping technique have been validated through comparison with GNSS observations, International Reference Ionosphere model values, global ionosphere maps from the Center for Orbit Determination in Europe, and Massachusetts Institute of Technology Automated Processing of GPS TEC data from the Madrigal database. The verification results indicate that considerable systematic improvements can be obtained when data are assimilated into the background model, which demonstrates the effectiveness of this technique in providing an accurate regional specification of the ionospheric TEC over China and adjacent areas.
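
    The heart of such an assimilation scheme is the Kalman update, which blends the background model value with an observation in proportion to their error variances. A scalar sketch for a single TEC grid cell, with illustrative numbers (TEC units), not values from the paper:

```python
# Scalar Kalman update for one TEC grid cell: the analysis blends the
# background (model) value and the GNSS observation, weighted by the
# inverse of their error variances. Numbers are illustrative.
def kalman_update(background, bg_var, observation, obs_var):
    gain = bg_var / (bg_var + obs_var)       # Kalman gain in [0, 1]
    analysis = background + gain * (observation - background)
    analysis_var = (1.0 - gain) * bg_var     # uncertainty always shrinks
    return analysis, analysis_var

# Background from the ionosphere model vs. a GNSS-derived TEC observation.
analysis, var = kalman_update(background=18.0, bg_var=9.0,
                              observation=24.0, obs_var=3.0)
# The analysis lies between model and observation, pulled toward the more
# trusted (lower-variance) observation, with reduced variance.
```

    Repeating this update over the grid (with spatial error correlations) every 5 minutes yields the assimilated TEC maps the abstract describes.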

  3. Dynameomics: data-driven methods and models for utilizing large-scale protein structure repositories for improving fragment-based loop prediction.

    PubMed

    Rysavy, Steven J; Beck, David A C; Daggett, Valerie

    2014-11-01

    Protein function is intimately linked to protein structure and dynamics, yet experimentally determined structures frequently omit regions within a protein due to indeterminate data, which is often due to protein dynamics. We propose that atomistic molecular dynamics simulations provide a diverse sampling of biologically relevant structures for these missing segments (and beyond) to improve structural modeling and structure prediction. Here we make use of the Dynameomics data warehouse, which contains simulations of representatives of essentially all known protein folds. We developed novel computational methods to efficiently identify, rank and retrieve small peptide structures, or fragments, from this database. We also created a novel data model to analyze and compare large repositories of structural data, such as those contained within the Protein Data Bank and the Dynameomics data warehouse. Our evaluation compares these structural repositories for improving loop predictions and analyzes the utility of our methods and models. Using a standard set of loop structures, containing 510 loops, 30 for each loop length from 4 to 20 residues, we find that the inclusion of Dynameomics structures in fragment-based methods improves the quality of the loop predictions without being dependent on sequence homology. Depending on loop length, ∼25-75% of the best predictions came from the Dynameomics set, resulting in lower main chain root-mean-square deviations for all fragment lengths using the combined fragment library. We also provide specific cases where Dynameomics fragments provide better predictions for NMR loop structures than fragments from crystal structures. Online access to these fragment libraries is available at http://www.dynameomics.org/fragments. © 2014 The Protein Society.
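    Ranking candidate fragments against a target loop conventionally relies on main-chain root-mean-square deviation after optimal superposition. The sketch below implements the standard Kabsch superposition via SVD and a simple RMSD-based ranking; it illustrates the general technique, not the Dynameomics retrieval code, and all names are hypothetical.

```python
import numpy as np

def kabsch_rmsd(P, Q):
    """Minimal RMSD between two (n, 3) coordinate sets after optimal
    rigid-body superposition (Kabsch algorithm)."""
    P = P - P.mean(axis=0)                  # remove translation
    Q = Q - Q.mean(axis=0)
    C = P.T @ Q                             # covariance of the two point sets
    V, S, Wt = np.linalg.svd(C)
    d = np.sign(np.linalg.det(V @ Wt))      # guard against improper rotation
    U = V @ np.diag([1.0, 1.0, d]) @ Wt     # optimal rotation
    diff = P @ U - Q
    return np.sqrt((diff ** 2).sum() / len(P))

def rank_fragments(target, fragments):
    """Return fragment indices sorted by increasing RMSD to the target loop."""
    scores = [kabsch_rmsd(f, target) for f in fragments]
    return sorted(range(len(fragments)), key=scores.__getitem__)
```

    A fragment-based loop predictor would apply such a ranking to candidates drawn from both crystal-structure and simulation-derived libraries, then keep the lowest-RMSD matches.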

  4. Dynameomics: Data-driven methods and models for utilizing large-scale protein structure repositories for improving fragment-based loop prediction

    PubMed Central

    Rysavy, Steven J; Beck, David AC; Daggett, Valerie

    2014-01-01

    Protein function is intimately linked to protein structure and dynamics, yet experimentally determined structures frequently omit regions within a protein due to indeterminate data, which is often due to protein dynamics. We propose that atomistic molecular dynamics simulations provide a diverse sampling of biologically relevant structures for these missing segments (and beyond) to improve structural modeling and structure prediction. Here we make use of the Dynameomics data warehouse, which contains simulations of representatives of essentially all known protein folds. We developed novel computational methods to efficiently identify, rank and retrieve small peptide structures, or fragments, from this database. We also created a novel data model to analyze and compare large repositories of structural data, such as those contained within the Protein Data Bank and the Dynameomics data warehouse. Our evaluation compares these structural repositories for improving loop predictions and analyzes the utility of our methods and models. Using a standard set of loop structures, containing 510 loops, 30 for each loop length from 4 to 20 residues, we find that the inclusion of Dynameomics structures in fragment-based methods improves the quality of the loop predictions without being dependent on sequence homology. Depending on loop length, ∼25–75% of the best predictions came from the Dynameomics set, resulting in lower main chain root-mean-square deviations for all fragment lengths using the combined fragment library. We also provide specific cases where Dynameomics fragments provide better predictions for NMR loop structures than fragments from crystal structures. Online access to these fragment libraries is available at http://www.dynameomics.org/fragments. PMID:25142412

  5. Minimizing variance in Care of Pediatric Blunt Solid Organ Injury through Utilization of a hemodynamic-driven protocol: a multi-institution study.

    PubMed

    Cunningham, Aaron J; Lofberg, Katrine M; Krishnaswami, Sanjay; Butler, Marilyn W; Azarow, Kenneth S; Hamilton, Nicholas A; Fialkowski, Elizabeth A; Bilyeu, Pamela; Ohm, Erika; Burns, Erin C; Hendrickson, Margo; Krishnan, Preetha; Gingalewski, Cynthia; Jafri, Mubeen A

    2017-12-01

    An expedited recovery protocol for management of pediatric blunt solid organ injury (spleen, liver, and kidney) was instituted across two Level 1 Trauma Centers, managed by nine pediatric surgeons within three hospital systems. Data were collected for 18 months on consecutive patients after protocol implementation. Patient demographics (including grade of injury), surgeon compliance, National Surgical Quality Improvement Program (NSQIP) complications, direct hospital cost, length of stay, time in the ICU, phlebotomy, and re-admission were compared to an 18-month control period immediately preceding study initiation. A total of 106 patients were treated (control=55, protocol=51). Demographics were similar among groups, and compliance was 78%. Hospital stay (4.6 vs. 3.5 days, p=0.04), ICU stay (1.9 vs. 1.0 days, p=0.02), and total phlebotomy (7.7 vs. 5.3 draws, p=0.007) were significantly less in the protocol group. A decrease in direct hospital costs was also observed ($11,965 vs. $8795, p=0.09). Complication rates (1.8% vs. 3.9%, p=0.86, no deaths) were similar. An expedited, hemodynamic-driven, pediatric solid organ injury protocol is achievable across hospital systems and surgeons. Through implementation we maintained quality while reducing length of stay, ICU utilization, phlebotomy, and cost. Future protocols should work to further limit resource utilization. Retrospective cohort study. Level II. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Informatics: essential infrastructure for quality assessment and improvement in nursing.

    PubMed Central

    Henry, S B

    1995-01-01

    In recent decades there have been major advances in the creation and implementation of information technologies and in the development of measures of health care quality. The premise of this article is that informatics provides essential infrastructure for quality assessment and improvement in nursing. In this context, the term quality assessment and improvement comprises both short-term processes such as continuous quality improvement (CQI) and long-term outcomes management. This premise is supported by 1) presentation of a historical perspective on quality assessment and improvement; 2) delineation of the types of data required for quality assessment and improvement; and 3) description of the current and potential uses of information technology in the acquisition, storage, transformation, and presentation of quality data, information, and knowledge. PMID:7614118

  7. The effects of swallowing disorders, dysgeusia, oral mucositis and xerostomia on nutritional status, oral intake and weight loss in head and neck cancer patients: A systematic review.

    PubMed

    Bressan, Valentina; Stevanin, Simone; Bianchi, Monica; Aleo, Giuseppe; Bagnasco, Annamaria; Sasso, Loredana

    2016-04-01

    Combined-modality treatment of head and neck cancer is becoming more common, driven by the idea that organ preservation should maintain patient appearance and the function of the organs involved. Even though treatments have improved, they can still be associated with acute and late adverse effects. The aim of this systematic review was to retrieve current data on how swallowing disorders, dysgeusia, oral mucositis, and xerostomia affect nutritional status, oral intake and weight loss in head and neck cancer (HNC) patients. A systematic literature search covered four relevant electronic databases from January 2005 to May 2015. Retrieved papers were categorised and evaluated considering their methodological quality. Two independent reviewers reviewed manuscripts and abstracted data using a standardised form. Quality assessment of the included studies was performed using the Edwards Method Score. Of the 1459 abstracts reviewed, a total of 25 studies were included. The most studied symptom was dysphagia, even though the symptoms were interconnected and affected one another. In most of the selected studies the level of evidence was between 2 and 3, and their quality level was from medium to low. There are limited data about dysgeusia, oral mucositis and xerostomia outcomes available for HNC patients. There is a lack of well-designed clinical trials and multicenter prospective cohort studies; therefore, further research is needed to ascertain which aspects of these symptoms should be measured. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Modulated Modularity Clustering as an Exploratory Tool for Functional Genomic Inference

    PubMed Central

    Stone, Eric A.; Ayroles, Julien F.

    2009-01-01

    In recent years, the advent of high-throughput assays, coupled with their diminishing cost, has facilitated a systems approach to biology. As a consequence, massive amounts of data are currently being generated, requiring efficient methodology aimed at the reduction of scale. Whole-genome transcriptional profiling is a standard component of systems-level analyses, and to reduce scale and improve inference clustering genes is common. Since clustering is often the first step toward generating hypotheses, cluster quality is critical. Conversely, because the validation of cluster-driven hypotheses is indirect, it is critical that quality clusters not be obtained by subjective means. In this paper, we present a new objective-based clustering method and demonstrate that it yields high-quality results. Our method, modulated modularity clustering (MMC), seeks community structure in graphical data. MMC modulates the connection strengths of edges in a weighted graph to maximize an objective function (called modularity) that quantifies community structure. The result of this maximization is a clustering through which tightly-connected groups of vertices emerge. Our application is to systems genetics, and we quantitatively compare MMC both to the hierarchical clustering method most commonly employed and to three popular spectral clustering approaches. We further validate MMC through analyses of human and Drosophila melanogaster expression data, demonstrating that the clusters we obtain are biologically meaningful. We show MMC to be effective and suitable to applications of large scale. In light of these features, we advocate MMC as a standard tool for exploration and hypothesis generation. PMID:19424432
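    The modularity objective that MMC maximizes can be written, for a weighted graph with adjacency matrix A and total edge weight m, as Q = (1/2m) Σ_ij (A_ij − k_i·k_j/2m) δ(c_i, c_j), where k_i is node i's weighted degree and c_i its community. The sketch below evaluates Q for a given partition; it illustrates the objective only, not MMC's modulation of edge strengths or its maximization procedure, and the names are illustrative.

```python
import numpy as np

def modularity(A, labels):
    """Newman-style modularity Q of a partition of a weighted graph.

    A      : symmetric (n, n) weighted adjacency matrix
    labels : labels[i] is the community of node i
    """
    k = A.sum(axis=1)        # weighted degrees
    two_m = k.sum()          # twice the total edge weight
    Q = 0.0
    for i in range(len(A)):
        for j in range(len(A)):
            if labels[i] == labels[j]:
                # observed weight minus the null-model expectation
                Q += A[i, j] - k[i] * k[j] / two_m
    return Q / two_m
```

    A clustering method in this family searches over partitions (and, in MMC's case, over modulated edge weights) for the assignment that maximizes Q; well-separated groups score close to the maximum, while a single all-in-one community scores zero.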

  9. Using Institutional Effectiveness Data To Stimulate Improvement...Getting Data off the Shelf and into the Hands of Stakeholders. AIR 2000 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Hill, Sandy; Willekens, Rene G.

    This paper outlines a successful, stakeholder-driven, continuous improvement process that is currently used at Estrella Mountain Community College, Arizona. The process is designed to address the next step of what to do with institutional effectiveness data after it has been collected and reported. Many institutional effectiveness processes stop…

  10. Assessing the impact of continuous quality improvement/total quality management: concept versus implementation.

    PubMed

    Shortell, S M; O'Brien, J L; Carman, J M; Foster, R W; Hughes, E F; Boerstler, H; O'Connor, E J

    1995-06-01

    This study examines the relationships among organizational culture, quality improvement processes and selected outcomes for a sample of up to 61 U. S. hospitals. Primary data were collected from 61 U. S. hospitals (located primarily in the midwest and the west) on measures related to continuous quality improvement/total quality management (CQI/TQM), organizational culture, implementation approaches, and degree of quality improvement implementation based on the Baldrige Award criteria. These data were combined with independently collected data on perceived impact and objective measures of clinical efficiency (i.e., charges and length of stay) for six clinical conditions. The study involved cross-sectional examination of the named relationships. Reliable and valid scales for the organizational culture and quality improvement implementation measures were developed based on responses from over 7,000 individuals across the 61 hospitals with an overall completion rate of 72 percent. Independent data on perceived impact were collected from a national survey and independent data on clinical efficiency from a companion study of managed care. A participative, flexible, risk-taking organizational culture was significantly related to quality improvement implementation. Quality improvement implementation, in turn, was positively associated with greater perceived patient outcomes and human resource development. Larger-size hospitals experienced lower clinical efficiency with regard to higher charges and higher length of stay, due in part to having more bureaucratic and hierarchical cultures that serve as a barrier to quality improvement implementation. What really matters is whether or not a hospital has a culture that supports quality improvement work and an approach that encourages flexible implementation. Larger-size hospitals face more difficult challenges in this regard.

  11. The University of Florida Department of Surgery: building a stronger tomorrow on yesterday's foundation.

    PubMed

    Behrns, Kevin E; Copeland, Edward M; Howard, Richard J

    2012-01-01

    Established in 1957, the University of Florida Department of Surgery has a solid foundation on which current faculty are driven to build a stronger tomorrow. The department is focused on promoting patient-centered care, expanding its research portfolio to improve techniques and outcomes, and training the surgical leaders of tomorrow. It fosters an environment where faculty, residents, students, and staff challenge long-held traditions with the goal of improving the health of our patients, the quality of our care, and the vitality of our work environment.

  12. Performance Analysis of a Ring Current Model Driven by Global MHD

    NASA Astrophysics Data System (ADS)

    Falasca, A.; Keller, K. A.; Fok, M.; Hesse, M.; Gombosi, T.

    2003-12-01

    Effectively modeling the high-energy particles in Earth's inner magnetosphere has the potential to improve safety in both manned and unmanned spacecraft. One model of this environment is the Fok Ring Current Model. This model can utilize as inputs both solar wind data, and empirical ionospheric electric field and magnetic field models. Alternatively, we have a procedure which allows the model to be driven by outputs from the BATS-R-US global MHD model. By using in-situ satellite data we will compare the predictive capability of this model in its original stand-alone form, to that of the model when driven by the BATS-R-US Global Magnetosphere Model. As a basis for comparison we use the April 2002 and May 2003 storms where suitable LANL geosynchronous data are available.

  13. Integrating Predictive Modeling with Control System Design for Managed Aquifer Recharge and Recovery Applications

    NASA Astrophysics Data System (ADS)

    Drumheller, Z. W.; Regnery, J.; Lee, J. H.; Illangasekare, T. H.; Kitanidis, P. K.; Smits, K. M.

    2014-12-01

    Aquifers around the world show troubling signs of irreversible depletion and seawater intrusion as climate change, population growth, and urbanization lead to reduced natural recharge rates and overuse. Scientists and engineers have begun to re-investigate the technology of managed aquifer recharge and recovery (MAR) as a means to increase the reliability of the diminishing and increasingly variable groundwater supply. MAR systems offer the possibility of naturally increasing groundwater storage while improving the quality of impaired water used for recharge. Unfortunately, MAR systems remain fraught with operational challenges related to the quality and quantity of recharged and recovered water, stemming from a lack of data-driven, real-time control. Our project seeks to ease the operational challenges of MAR facilities through the implementation of active sensor networks, adaptively calibrated flow and transport models, and simulation-based meta-heuristic control optimization methods. The developed system works by continually collecting hydraulic and water quality data from a sensor network embedded within the aquifer. The data are fed into an inversion algorithm, which calibrates the parameters and initial conditions of a predictive flow and transport model. The calibrated model is passed to a meta-heuristic control optimization algorithm (e.g., a genetic algorithm), which executes the simulations and determines the best course of action, i.e., the optimal pumping policy for current aquifer conditions. The optimal pumping policy is then applied manually or autonomously. During operation, sensor data are used to assess the accuracy of the optimal prediction and augment the pumping strategy as needed. At laboratory scale, a small (18"H x 46"L) and an intermediate (6'H x 16'L) two-dimensional synthetic aquifer were constructed and outfitted with sensor networks. 
Data collection and model inversion components were developed and sensor data were validated by analytical measurements.
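    The sense-calibrate-predict-optimize loop described above can be caricatured with a toy one-cell forward model and a random-search stand-in for the meta-heuristic optimizer. Everything here (the head-response coefficients, the function names, and the pumping bounds) is hypothetical and for illustration only; the actual system uses a calibrated flow and transport model and a genetic algorithm.

```python
import random

def predict_head(head0, pump_rate, recharge, steps=24):
    """Toy forward model: hydraulic head of a single aquifer cell after
    `steps` hours of constant pumping and recharge. The coefficients are
    made up; this stands in for the calibrated flow-and-transport model."""
    head = head0
    for _ in range(steps):
        head += 0.1 * recharge - 0.05 * pump_rate
    return head

def optimize_pumping(head0, recharge, target_head, n_candidates=200, seed=0):
    """Random-search stand-in for the meta-heuristic (e.g., genetic
    algorithm) step: choose the pumping rate whose predicted head lands
    closest to the target."""
    rng = random.Random(seed)
    best_rate, best_err = 0.0, float("inf")
    for _ in range(n_candidates):
        rate = rng.uniform(0.0, 10.0)  # candidate pumping rate
        err = abs(predict_head(head0, rate, recharge) - target_head)
        if err < best_err:
            best_rate, best_err = rate, err
    return best_rate
```

    In the full system this loop repeats continuously: new sensor data re-calibrate the forward model via inversion, and the optimizer is re-run to update the pumping policy for the current aquifer conditions.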

  14. Modelling of beef sensory quality for a better prediction of palatability.

    PubMed

    Hocquette, Jean-François; Van Wezemael, Lynn; Chriki, Sghaier; Legrand, Isabelle; Verbeke, Wim; Farmer, Linda; Scollan, Nigel D; Polkinghorne, Rod; Rødbotten, Rune; Allen, Paul; Pethick, David W

    2014-07-01

    Despite efforts by the industry to control the eating quality of beef, there remains a high level of variability in palatability, which is one reason for consumer dissatisfaction. In Europe, there is still no reliable on-line tool to predict beef quality and deliver consistent quality beef to consumers. Beef quality traits depend in part on the physical and chemical properties of the muscles. The determination of these properties (known as muscle profiling) will allow for more informed decisions to be made in the selection of individual muscles for the production of value-added products. Therefore, scientists and professional partners of the ProSafeBeef project have brought together all the data they have accumulated over 20 years. The resulting BIF-Beef (Integrated and Functional Biology of Beef) data warehouse contains available data on animal growth, carcass composition, muscle tissue characteristics and beef quality traits. This database is useful for determining the most important muscle characteristics associated with high tenderness, high flavour or generally high quality. Another, more consumer-driven modelling tool was developed in Australia: the Meat Standards Australia (MSA) grading scheme, which predicts beef quality for each individual muscle × cooking-method combination using various information on the corresponding animals and post-slaughter processing factors. This system also has the potential to detect variability in quality within muscles. The MSA system proved to be effective in predicting beef palatability not only in Australia but also in many other countries. The results of the work conducted in Europe within the ProSafeBeef project indicate that it would be possible to manage a grading system in Europe similar to the MSA system. 
The combination of the different modelling approaches (namely muscle biochemistry and a MSA-like meat grading system adapted to the European market) is a promising area of research to improve the prediction of beef quality. In both approaches, the volume of data available not only provides statistically sound correlations between various factors and beef quality traits but also a better understanding of the variability of beef quality according to various criteria (breed, age, sex, pH, marbling etc.). © 2013 The American Meat Science Association. All rights reserved.

  15. Understanding How Clinician-Patient Relationships and Relational Continuity of Care Affect Recovery from Serious Mental Illness: STARS Study Results

    PubMed Central

    Green, Carla A.; Polen, Michael R.; Janoff, Shannon L.; Castleton, David K.; Wisdom, Jennifer P.; Vuckovic, Nancy; Perrin, Nancy A.; Paulson, Robert I.; Oken, Stuart L.

    2008-01-01

    Objective Recommendations for improving care include increased patient-clinician collaboration, patient empowerment, and greater relational continuity of care. All rely upon good clinician-patient relationships, yet little is known about how relational continuity and clinician-patient relationships interact, or their effects on recovery from mental illness. Methods Individuals (92 women, 85 men) with schizophrenia, schizoaffective disorder, affective psychosis, or bipolar disorder participated in this observational study. Participants completed in-depth interviews detailing personal and mental health histories. Questionnaires included quality of life and recovery assessments and were linked to records of services used. Qualitative analyses yielded a hypothesized model of the effects of relational continuity and clinician-patient relationships on recovery and quality of life, tested using covariance structure modeling. Results Qualitative data showed that positive, trusting relationships with clinicians, developed over time, aid recovery. When “fit” with clinicians was good, long-term relational continuity of care allowed development of close, collaborative relationships, fostered good illness and medication management, and supported patient-directed decisions. Most valued were competent, caring, trustworthy, and trusting clinicians who treated clinical encounters “like friendships,” increasing willingness to seek help and continue care when treatments were not effective and supporting “normal” rather than “mentally ill” identities. Statistical models showed positive relationships between recovery-oriented patient-driven care and satisfaction with clinicians, medication satisfaction, and recovery. Relational continuity indirectly affected quality of life via satisfaction with clinicians; medication satisfaction was associated with fewer symptoms; fewer symptoms were associated with recovery and better quality of life. 
Conclusions Strong clinician-patient relationships, relational continuity, and a caring, collaborative approach facilitate recovery from mental illness and improved quality of life. PMID:18614445

  16. Understanding how clinician-patient relationships and relational continuity of care affect recovery from serious mental illness: STARS study results.

    PubMed

    Green, Carla A; Polen, Michael R; Janoff, Shannon L; Castleton, David K; Wisdom, Jennifer P; Vuckovic, Nancy; Perrin, Nancy A; Paulson, Robert I; Oken, Stuart L

    2008-01-01

    Recommendations for improving care include increased patient-clinician collaboration, patient empowerment, and greater relational continuity of care. All rely upon good clinician-patient relationships, yet little is known about how relational continuity and clinician-patient relationships interact, or their effects on recovery from mental illness. Individuals (92 women, 85 men) with schizophrenia, schizoaffective disorder, affective psychosis, or bipolar disorder participated in this observational study. Participants completed in-depth interviews detailing personal and mental health histories. Questionnaires included quality of life and recovery assessments and were linked to records of services used. Qualitative analyses yielded a hypothesized model of the effects of relational continuity and clinician-patient relationships on recovery and quality of life, tested using covariance structure modeling. Qualitative data showed that positive, trusting relationships with clinicians, developed over time, aid recovery. When "fit" with clinicians was good, long-term relational continuity of care allowed development of close, collaborative relationships, fostered good illness and medication management, and supported patient-directed decisions. Most valued were competent, caring, trustworthy, and trusting clinicians who treated clinical encounters "like friendships," increasing willingness to seek help and continue care when treatments were not effective and supporting "normal" rather than "mentally ill" identities. Statistical models showed positive relationships between recovery-oriented patient-driven care and satisfaction with clinicians, medication satisfaction, and recovery. Relational continuity indirectly affected quality of life via satisfaction with clinicians; medication satisfaction was associated with fewer symptoms; fewer symptoms were associated with recovery and better quality of life. 
Strong clinician-patient relationships, relational continuity, and a caring, collaborative approach facilitate recovery from mental illness and improved quality of life.

  17. Three Research Strategies of Neuroscience and the Future of Legal Imaging Evidence.

    PubMed

    Jun, Jinkwon; Yoo, Soyoung

    2018-01-01

    Neuroscientific imaging evidence (NIE) has become an integral part of the criminal justice system in the United States. However, in most legal cases, NIE is submitted and used only to mitigate penalties because the court does not recognize it as substantial evidence, considering its lack of reliability. Nevertheless, we here discuss how neuroscience is expected to improve the use of NIE in the legal system. For this purpose, we classified the efforts of neuroscientists into three research strategies: cognitive subtraction, the data-driven approach, and the brain-manipulation approach. Cognitive subtraction is outdated and problematic; consequently, the court deemed it to be an inadequate approach in terms of legal evidence in 2012. In contrast, the data-driven and brain-manipulation approaches, both state of the art, have overcome the limitations of cognitive subtraction. The data-driven approach brings data science into the field and is benefiting immensely from the development of research platforms that allow automatized collection, analysis, and sharing of data. This broadens the scale of imaging evidence. The brain-manipulation approach uses high-functioning tools that facilitate non-invasive and precise human brain manipulation. These two approaches are expected to have synergistic effects. Neuroscience has strived to improve the evidential reliability of NIE, with considerable success. With the support of cutting-edge technologies, and the progress of these approaches, the evidential status of NIE will be improved and NIE will become an increasingly important part of legal practice.

  18. Examining the role of implementation quality in school-based prevention using the PATHS curriculum. Promoting Alternative THinking Skills Curriculum.

    PubMed

    Kam, Chi-Ming; Greenberg, Mark T; Walls, Carla T

    2003-03-01

    In order for empirically validated school-based prevention programs to "go to scale," it is important to understand the processes underlying program dissemination. Data collected in effectiveness trials, especially those measuring the quality of program implementation and administrative support, are valuable in explicating important factors influencing implementation. This study describes findings regarding quality of implementation in a recent effectiveness trial conducted in a high-risk, American urban community. This delinquency prevention trial is a locally owned intervention, which used the Promoting Alternative THinking Skills Curriculum as its major program component. The intervention involved 350 first graders in 6 inner-city public schools. Three schools implemented the intervention and the other 3 were comparison schools from the same school district. Although intervention effects were not found for all the intervention schools, the intervention was effective in improving children's emotional competence and reducing their aggression in schools which effectively supported the intervention. This study, utilizing data from the 3 intervention schools (13 classrooms and 164 students), suggested that 2 factors contributed to the success of the intervention: (a) adequate support from school principals and (b) high degree of classroom implementation by teachers. These findings are discussed in light of the theory-driven models in program evaluation that emphasized the importance of the multiple factors influencing the implementation of school-based interventions.

  19. Palliative Care Specialist Consultation Is Associated With Supportive Care Quality in Advanced Cancer.

    PubMed

    Walling, Anne M; Tisnado, Diana; Ettner, Susan L; Asch, Steven M; Dy, Sydney M; Pantoja, Philip; Lee, Martin; Ahluwalia, Sangeeta C; Schreibeis-Baum, Hannah; Malin, Jennifer L; Lorenz, Karl A

    2016-10-01

    Although recent randomized controlled trials support early palliative care for patients with advanced cancer, the specific processes of care associated with these findings and whether these improvements can be replicated in the broader health care system are uncertain. The aim of this study was to evaluate the occurrence of palliative care consultation and its association with specific processes of supportive care in a national cohort of Veterans using the Cancer Quality ASSIST (Assessing Symptoms Side Effects and Indicators of Supportive Treatment) measures. We abstracted data from the medical records of 719 patients diagnosed with advanced lung, colorectal, or pancreatic cancer in 2008 who received care in the Veterans Affairs Health System, covering a period of three years or until death, to evaluate the association of palliative care specialty consultation with the quality of supportive care overall and by domain using a multivariate regression model. All but 54 of the 719 patients died within three years, and 293 received at least one palliative care consult. Patients evaluated by a palliative care specialist at diagnosis scored seven percentage points higher overall (P < 0.001) and 11 percentage points higher (P < 0.001) within the information and care planning domain compared with those without a consult. Early palliative care specialist consultation is associated with better quality of supportive care in three advanced cancers, predominantly driven by improvements in information and care planning. This study supports the effectiveness of early palliative care consultation in three common advanced cancers within the Veterans Affairs Health System and provides a greater understanding of which care processes palliative care teams influence. Published by Elsevier Inc.

  20. Implementation of a hospital-based quality assessment program for rectal cancer.

    PubMed

    Hendren, Samantha; McKeown, Ellen; Morris, Arden M; Wong, Sandra L; Oerline, Mary; Poe, Lyndia; Campbell, Darrell A; Birkmeyer, Nancy J

    2014-05-01

    Quality improvement programs in Europe have had a markedly beneficial effect on the processes and outcomes of rectal cancer care. The quality of rectal cancer care in the United States is not as well understood, and scalable quality improvement programs have not been developed. The purpose of this article is to describe the implementation of a hospital-based quality assessment program for rectal cancer, targeting both community and academic hospitals. We recruited 10 hospitals from a surgical quality improvement organization. Nurse reviewers were trained to abstract rectal cancer data from hospital medical records, and abstracts were assessed for accuracy. We conducted two surveys to assess the training program and limitations of the data abstraction. We validated data completeness and accuracy by comparing hospital medical record and tumor registry data. Nine of 10 hospitals successfully performed abstractions with ≥ 90% accuracy. Experienced nurse reviewers were challenged by the technical details in operative and pathology reports. Although most variables had less than 10% missing data, outpatient testing information was lacking from some hospitals' inpatient records. This implementation project yielded a final quality assessment program consisting of 20 medical records variables and 11 tumor registry variables. An innovative program linking tumor registry data to quality-improvement data for rectal cancer quality assessment was successfully implemented in 10 hospitals. This data platform and training program can serve as a template for other organizations that are interested in assessing and improving the quality of rectal cancer care. Copyright © 2014 by American Society of Clinical Oncology.

  1. Drivers of Dashboard Development (3-D): A Curricular Continuous Quality Improvement Approach.

    PubMed

    Shroyer, A Laurie; Lu, Wei-Hsin; Chandran, Latha

    2016-04-01

    Undergraduate medical education (UME) programs are seeking systematic ways to monitor and manage their educational performance metrics and document their achievement of external goals (e.g., Liaison Committee on Medical Education [LCME] accreditation requirements) and internal objectives (institution-specific metrics). In other continuous quality improvement (CQI) settings, summary dashboard reports have been used to evaluate and improve performance. The Stony Brook University School of Medicine UME leadership team developed and implemented summary dashboard performance reports in 2009 to document LCME standards/criteria compliance, evaluate medical student performance, and identify progress in attaining institutional curricular goals and objectives. Key performance indicators (KPIs) and benchmarks were established and have been routinely monitored as part of the novel Drivers of Dashboard Development (3-D) approach to curricular CQI. The systematic 3-D approach has had positive CQI impacts. Substantial improvements over time have been documented in KPIs including timeliness of clerkship grades, midclerkship feedback, student mistreatment policy awareness, and student satisfaction. Stakeholder feedback indicates that the dashboards have provided useful information guiding data-driven curricular changes, such as integrating clinician-scientists as lecturers in basic science courses to clarify the clinical relevance of specific topics. Gaining stakeholder acceptance of the 3-D approach required clear communication of preestablished targets and annual meetings with department leaders and course/clerkship directors. The 3-D approach may be considered by UME programs as a template for providing faculty and leadership with a CQI framework to establish shared goals, document compliance, report accomplishments, enrich communications, facilitate decisions, and improve performance.

  2. Experiences and Lessons From Polio Eradication Applied to Immunization in 10 Focus Countries of the Polio Endgame Strategic Plan.

    PubMed

    van den Ent, Maya M V X; Mallya, Apoorva; Sandhu, Hardeep; Anya, Blanche-Philomene; Yusuf, Nasir; Ntakibirora, Marcelline; Hasman, Andreas; Fahmy, Kamal; Agbor, John; Corkum, Melissa; Sumaili, Kyandindi; Siddique, Anisur Rahman; Bammeke, Jane; Braka, Fiona; Andriamihantanirina, Rija; Ziao, Antoine-Marie C; Djumo, Clement; Yapi, Moise Desire; Sosler, Stephen; Eggers, Rudolf

    2017-07-01

    Nine polio areas of expertise were applied to broader immunization and maternal, newborn, and child health goals in the 10 focus countries of the Polio Eradication Endgame Strategic Plan: policy and strategy development; planning; management and oversight (accountability framework); implementation and service delivery; monitoring; communications and community engagement; disease surveillance and data analysis; technical quality and capacity building; and partnerships. Although coverage improvements depend on multiple factors, and increased coverage cannot be attributed to the use of polio assets alone, 6 of the 10 focus countries improved coverage with three doses of diphtheria-tetanus-pertussis-containing vaccine between 2013 and 2015. Government leadership, evidence-based programming, country-driven comprehensive annual operational plans, community partnership, and strong accountability systems are critical for all programs, and polio eradication has illustrated that these can be leveraged to increase immunization coverage and equity and enhance global health security in the focus countries. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America.

  3. 77 FR 40404 - Agency Information Collection Activities: Requests for Comments; Clearance of Renewed Approval of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-09

    ... Activities: Requests for Comments; Clearance of Renewed Approval of Information Collection: Advanced... about our intention to request the Office of Management and Budget (OMB) approval to renew an information collection. The Advanced Qualification Program (AQP) incorporates data driven quality control...

  4. The Datafication of Everything - Even Toilets.

    PubMed

    Lun, Kwok-Chan

    2018-04-22

    Health informatics has benefitted from the development of Info-Communications Technology (ICT) over the last fifty years. Advances in ICT in healthcare have now started to spur advances in Data Technology, as hospital information systems, electronic health and medical records, mobile devices, social media and the Internet of Things (IoT) are making a substantial impact on the generation of data. It is timely for healthcare institutions to recognize data as a corporate asset and promote a data-driven culture within the institution. It is both strategic and timely for IMIA, as an international organization in health informatics, to take the lead in promoting a data-driven culture in healthcare organizations. This can be achieved by expanding the terms of reference of its existing Working Group on Data Mining and Big Data Analysis to include (1) data analytics with special reference to healthcare, (2) big data tools and solutions, (3) bridging information technology and data technology and (4) data quality issues and challenges. Georg Thieme Verlag KG Stuttgart.

  5. RAS - Target Identification - Informatics

    Cancer.gov

    The RAS Informatics lab group develops tools to track and analyze “big data” from the RAS Initiative, as well as analyzes data from external projects. By integrating internal and external data, this group helps improve understanding of RAS-driven cancers.

  6. Can commonly-used fan-driven air cleaning technologies improve indoor air quality? A literature review

    NASA Astrophysics Data System (ADS)

    Zhang, Yinping; Mo, Jinhan; Li, Yuguo; Sundell, Jan; Wargocki, Pawel; Zhang, Jensen; Little, John C.; Corsi, Richard; Deng, Qihong; Leung, Michael H. K.; Fang, Lei; Chen, Wenhao; Li, Jinguang; Sun, Yuexia

    2011-08-01

    Air cleaning techniques have been applied worldwide with the goal of improving indoor air quality. The effectiveness of applying these techniques varies widely, and pollutant removal efficiency is usually determined in controlled laboratory environments which may not be realized in practice. Some air cleaners are largely ineffective, and some produce harmful by-products. To summarize what is known regarding the effectiveness of fan-driven air cleaning technologies, a state-of-the-art review of the scientific literature was undertaken by a multidisciplinary panel of experts from Europe, North America, and Asia with expertise in air cleaning, aerosol science, medicine, chemistry and ventilation. The effects on health were not examined. Over 26,000 articles were identified in major literature databases; 400 were selected as being relevant based on their titles and abstracts by the first two authors, who further reduced the number of articles to 160 based on the full texts. These articles were reviewed by the panel using predefined inclusion criteria during their first meeting. Additions were also made by the panel. Of these, 133 articles were finally selected for detailed review. Each article was assessed independently by two members of the panel and then judged by the entire panel during a consensus meeting. During this process 59 articles were deemed conclusive and their results were used for final reporting at their second meeting. The conclusions are that: (1) None of the reviewed technologies was able to effectively remove all indoor pollutants and many were found to generate undesirable by-products during operation. (2) Particle filtration and sorption of gaseous pollutants were among the most effective air cleaning technologies, but there is insufficient information regarding long-term performance and proper maintenance. 
(3) The existing data make it difficult to extract information such as Clean Air Delivery Rate (CADR), which represents a common benchmark for comparing the performance of different air cleaning technologies. (4) To compare and select suitable indoor air cleaning devices, a labeling system accounting for characteristics such as CADR, energy consumption, volume, harmful by-products, and life span is necessary. For that purpose, a standard test room and condition should be built and studied. (5) Although there is evidence that some air cleaning technologies improve indoor air quality, further research is needed before any of them can be confidently recommended for use in indoor environments.
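    The Clean Air Delivery Rate (CADR) benchmark discussed above can be made concrete. In the standard chamber decay test, pollutant concentration is logged over time with the cleaner running and again without it, and CADR is the chamber volume times the difference between the two first-order decay constants. A minimal stdlib-Python sketch of that calculation (function names and the toy numbers are illustrative, not taken from the review):

```python
import math

def decay_rate(times_min, concentrations):
    """First-order decay constant (1/min) from a least-squares fit of
    ln(concentration) against time."""
    logs = [math.log(c) for c in concentrations]
    n = len(times_min)
    t_mean = sum(times_min) / n
    y_mean = sum(logs) / n
    num = sum((t - t_mean) * (y - y_mean) for t, y in zip(times_min, logs))
    den = sum((t - t_mean) ** 2 for t in times_min)
    return -num / den  # decay makes the slope negative; report it as positive

def cadr(volume_m3, k_cleaner_on, k_natural):
    """CADR (m^3/min) = chamber volume x (decay with cleaner on - natural decay)."""
    return volume_m3 * (k_cleaner_on - k_natural)
```

    With perfectly exponential decay the log-linear fit recovers the decay constant exactly; real chamber data would additionally require mixing corrections and uncertainty estimates.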

  7. PconsD: ultra rapid, accurate model quality assessment for protein structure prediction.

    PubMed

    Skwark, Marcin J; Elofsson, Arne

    2013-07-15

    Clustering methods are often needed for accurately assessing the quality of modeled protein structures. A recent blind evaluation of quality assessment methods in CASP10 showed that there is little difference between many methods as far as ranking models and selecting the best model are concerned. When comparing many models, the computational cost of the model comparison can become significant. Here, we present PconsD, a fast, stream-computing method for distance-driven model quality assessment that runs on consumer hardware. PconsD is at least one order of magnitude faster than other methods of comparable accuracy. The source code for PconsD is freely available at http://d.pcons.net/. Supplementary benchmarking data are also available there. arne@bioinfo.se Supplementary data are available at Bioinformatics online.
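    PconsD's stream-computing implementation is not reproduced here, but the consensus idea underlying Pcons-style quality assessment can be sketched: a model is scored by its average structural similarity to all other models in the ensemble, so the model that agrees most with the rest is selected. A toy Python illustration, assuming a precomputed pairwise distance matrix; the 1/(1+d) similarity transform and function names are stand-ins, not PconsD's actual scoring:

```python
def consensus_scores(dist):
    """Estimate per-model quality from a symmetric pairwise distance matrix
    (dist[i][j] = structural distance between models i and j, e.g. 1 - TM-score):
    a model scores highly if it is, on average, similar to the other models."""
    n = len(dist)
    scores = []
    for i in range(n):
        # similarity transform: identical models -> 1.0, distant models -> near 0.0
        sims = [1.0 / (1.0 + dist[i][j]) for j in range(n) if j != i]
        scores.append(sum(sims) / len(sims))
    return scores

def best_model(dist):
    """Index of the model with the highest consensus score."""
    scores = consensus_scores(dist)
    return scores.index(max(scores))
```

    The expensive part in practice is computing the full pairwise distance matrix, which grows quadratically with the number of models; that is the step PconsD accelerates on consumer GPUs.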

  8. Effectiveness and feasibility of assistant push on improvement of chest compression quality: a crossover study.

    PubMed

    Choi, Sung Soo; Yun, Seong-Woo; Lee, Byung Kook; Jeung, Kyung Woon; Song, Kyoung Hwan; Lee, Chang-Hee; Park, Jung Soo; Jeong, Ji Yeon; Shin, Sang Yeol

    2015-03-01

    To improve the quality of chest compression (CC), we developed the assistant-push method, whereby a second rescuer pushes the back of the chest compressor during CC. We investigated the effectiveness and feasibility of assistant push in achieving and maintaining CC quality. This was a randomized crossover trial in which 41 subjects performed both standard CC (single-rescuer group) and CC with instructor-driven assistant push (assistant-push group) in random order. Each session of CC was performed for 2 minutes using a manikin. Subjects were also assigned to both roles of chest compressor and assistant and together performed CC with subject-driven assistant push. Depth of CC, compression-to-recoil ratio, duty cycle, and rate of incomplete recoil were quantified. The mean depth of CC (57.0 [56.0-59.0] vs 55.0 [49.5-57.5], P < .001) was significantly greater, and the compression force (33.8 [29.3-36.4] vs 23.3 [20.4-25.3], P < .001) was stronger, in the assistant-push group. The compression-to-recoil ratio, duty cycle, and rate of incomplete chest recoil were comparable between the 2 groups. The CC depth in the single-rescuer group decreased significantly every 30 seconds, whereas in the assistant-push group it was comparable at the 60- and 90-second time points (P = .004). The subject assistant-push group performed CCs at a depth comparable with that of the instructor assistant-push group. The assistant-push method improved the depth of CC and attenuated its decline, eventually helping maintain adequate CC depth over time. Subjects were able to learn assistant push easily and perform it effectively. Copyright © 2014 Elsevier Inc. All rights reserved.

  9. CONFOLD2: improved contact-driven ab initio protein structure modeling.

    PubMed

    Adhikari, Badri; Cheng, Jianlin

    2018-01-25

    Contact-guided protein structure prediction methods are becoming increasingly successful because of the latest advances in residue-residue contact prediction. To support contact-driven structure prediction, effective tools are needed that can quickly build tertiary structural models of good quality from predicted contacts. We develop an improved contact-driven protein modelling method, CONFOLD2, and study how it may be effectively used for ab initio protein structure prediction with predicted contacts as input. It builds models using various subsets of the input contacts to explore the fold space under the guidance of a soft square energy function, and then clusters the models to obtain the top five. CONFOLD2 obtains an average reconstruction accuracy of 0.57 TM-score for the 150 proteins in the PSICOV contact prediction dataset. When benchmarked on CASP11 contacts predicted using CONSIP2 and CASP12 contacts predicted using Raptor-X, CONFOLD2 achieves a mean TM-score of 0.41 on both datasets. CONFOLD2 can quickly generate the top five structural models for a protein sequence once its secondary structure and contact predictions are at hand. The source code of CONFOLD2 is publicly available at https://github.com/multicom-toolbox/CONFOLD2/.

  10. Accounting for quality in the measurement of hospital performance: evidence from Costa Rica.

    PubMed

    Arocena, Pablo; García-Prado, Ariadna

    2007-07-01

    This paper provides insights into how Costa Rican public hospitals responded to the pressure for increased efficiency and quality introduced by the reforms carried out over the period 1997-2001. To that purpose we compute a generalized output distance function by means of non-parametric mathematical programming to construct a productivity index, which accounts for productivity changes while controlling for quality of care. Our results show an improvement in hospital performance mainly driven by quality increases. The adoption of management contracts seems to have contributed to such enhancement, more notably for small hospitals. Further, productivity growth is primarily due to technical and scale efficiency change rather than technological change. A number of policy implications are drawn from these results. Copyright (c) 2006 John Wiley & Sons, Ltd.

  11. Universal fragment descriptors for predicting properties of inorganic crystals

    NASA Astrophysics Data System (ADS)

    Isayev, Olexandr; Oses, Corey; Toher, Cormac; Gossett, Eric; Curtarolo, Stefano; Tropsha, Alexander

    2017-06-01

    Although historically materials discovery has been driven by a laborious trial-and-error process, knowledge-driven materials design can now be enabled by the rational combination of Machine Learning methods and materials databases. Here, data from the AFLOW repository for ab initio calculations is combined with Quantitative Materials Structure-Property Relationship models to predict important properties: metal/insulator classification, band gap energy, bulk/shear moduli, Debye temperature and heat capacities. The prediction's accuracy compares well with the quality of the training data for virtually any stoichiometric inorganic crystalline material, reciprocating the available thermomechanical experimental data. The universality of the approach is attributed to the construction of the descriptors: Property-Labelled Materials Fragments. The representations require only minimal structural input allowing straightforward implementations of simple heuristic design rules.
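    The paper's Quantitative Materials Structure-Property Relationship models are trained on Property-Labelled Materials Fragments, and reproducing them is out of scope here. As a deliberately simplified stand-in for the descriptor-based prediction step, a k-nearest-neighbor regressor over descriptor vectors illustrates the general idea (all names and the toy data below are hypothetical, not from AFLOW):

```python
import math

def knn_predict(descriptors, properties, query, k=3):
    """Predict a material property as the mean over the k training materials
    whose descriptor vectors lie closest (Euclidean) to the query vector."""
    ranked = sorted(
        (math.dist(d, query), p) for d, p in zip(descriptors, properties)
    )
    return sum(p for _, p in ranked[:k]) / k
```

    The premise, shared with the far more sophisticated models in the paper, is that materials with similar fragment descriptors tend to have similar properties.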

  12. Universal fragment descriptors for predicting properties of inorganic crystals.

    PubMed

    Isayev, Olexandr; Oses, Corey; Toher, Cormac; Gossett, Eric; Curtarolo, Stefano; Tropsha, Alexander

    2017-06-05

    Although historically materials discovery has been driven by a laborious trial-and-error process, knowledge-driven materials design can now be enabled by the rational combination of Machine Learning methods and materials databases. Here, data from the AFLOW repository for ab initio calculations is combined with Quantitative Materials Structure-Property Relationship models to predict important properties: metal/insulator classification, band gap energy, bulk/shear moduli, Debye temperature and heat capacities. The prediction's accuracy compares well with the quality of the training data for virtually any stoichiometric inorganic crystalline material, reciprocating the available thermomechanical experimental data. The universality of the approach is attributed to the construction of the descriptors: Property-Labelled Materials Fragments. The representations require only minimal structural input allowing straightforward implementations of simple heuristic design rules.

  13. Developing Large Scale Explosively Driven Flyer Experiments on Sand

    NASA Astrophysics Data System (ADS)

    Rehagen, Thomas; Kraus, Richard

    2017-06-01

    Measurements of the dynamic behavior of granular materials are of great importance to a variety of scientific and engineering applications, including planetary science, seismology, and construction and destruction. In addition, high quality data are needed to enhance our understanding of granular physics and improve the computational models used to simulate related physical processes. However, since there is a non-negligible grain size associated with these materials, experiments must be of a relatively large scale in order to capture the continuum response of the material and reduce errors associated with the finite grain size. We will present designs for explosively driven flyer experiments to make high accuracy measurements of the Hugoniot of sand (with a grain size of hundreds of microns). To achieve an accuracy of better than a few percent in density, we are developing a platform to measure the Hugoniot of samples several centimeters in thickness. We will present the target designs as well as coupled designs for the explosively launched flyer system. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract No. DE-AC52-07NA27344.

  14. What does theory-driven evaluation add to the analysis of self-reported outcomes of diabetes education? A comparative realist evaluation of a participatory patient education approach.

    PubMed

    Pals, Regitze A S; Olesen, Kasper; Willaing, Ingrid

    2016-06-01

    To explore the effects of the Next Education (NEED) patient education approach in diabetes education. We tested the use of the NEED approach at eight intervention sites (n=193). Six additional sites served as controls (n=58). Data were collected through questionnaires, interviews and observations. We analysed data using descriptive statistics, logistic regression and systematic text condensation. Results from logistic regression demonstrated better overall assessment of education program experiences and enhanced self-reported improvements in maintaining medications correctly among patients from intervention sites, as compared to control sites. Interviews and observations suggested that improvements in health behavior could be explained by mechanisms related to the education setting, including the use of person-centeredness and dialogue. However, similar mechanisms were observed at control sites. Observations suggested that the quality of group dynamics, patients' motivation and educators' ability to facilitate participation in education, supported by the NEED approach, contributed to better results at intervention sites. The use of participatory approaches and, in particular, the NEED patient education approach in group-based diabetes education improved self-management skills and health behavior outcomes among individuals with diabetes. The use of dialogue tools in diabetes education is advised for educators. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  15. Delivering safe and effective test-result communication, management and follow-up: a mixed-methods study protocol.

    PubMed

    Dahm, Maria R; Georgiou, Andrew; Westbrook, Johanna I; Greenfield, David; Horvath, Andrea R; Wakefield, Denis; Li, Ling; Hillman, Ken; Bolton, Patrick; Brown, Anthony; Jones, Graham; Herkes, Robert; Lindeman, Robert; Legg, Michael; Makeham, Meredith; Moses, Daniel; Badmus, Dauda; Campbell, Craig; Hardie, Rae-Anne; Li, Julie; McCaughey, Euan; Sezgin, Gorkem; Thomas, Judith; Wabe, Nasir

    2018-02-15

    The failure to follow up pathology and medical imaging test results poses patient-safety risks which threaten the effectiveness, quality and safety of patient care. The objectives of this project are to: (1) improve the effectiveness and safety of test-result management through the establishment of clear governance processes of communication, responsibility and accountability; (2) harness health information technology (IT) to inform and monitor test-result management; and (3) enhance the contribution of consumers to the establishment of safe and effective test-result management systems. This convergent mixed-methods project triangulates three multistage studies at seven adult hospitals and one paediatric hospital in Australia. Study 1 adopts qualitative research approaches, including semistructured interviews, focus groups and ethnographic observations, to gain a better understanding of test-result communication and management practices in hospitals, and to identify patient-safety risks which require quality-improvement interventions. Study 2 analyses linked sets of routinely collected healthcare data to examine critical test-result thresholds and test-result notification processes. A controlled before-and-after study across three emergency departments will measure the impact of interventions (including the use of IT) developed to improve the safety and quality of test-result communication and management processes. Study 3 adopts a consumer-driven approach, including semistructured interviews and the convening of consumer-reference groups and community forums. The qualitative data will identify mechanisms to enhance the role of consumers in test-management governance processes, and inform the direction of the research and the interpretation of findings. Ethical approval has been granted by the South Eastern Sydney Local Health District Human Research Ethics Committee and Macquarie University. 
Findings will be disseminated in academic, industry and consumer journals, newsletters and conferences. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  16. Performance and Quality Assessment of the Forthcoming Copernicus Marine Service Global Ocean Monitoring and Forecasting Real-Time System

    NASA Astrophysics Data System (ADS)

    Lellouche, J. M.; Le Galloudec, O.; Greiner, E.; Garric, G.; Regnier, C.; Drillet, Y.

    2016-02-01

    Mercator Ocean currently delivers daily real-time services (weekly analyses and daily forecasts) with a global 1/12° high-resolution system. The model component is the NEMO platform, driven at the surface by the IFS ECMWF atmospheric analyses and forecasts. Observations are assimilated by means of a reduced-order Kalman filter with a 3D multivariate modal decomposition of the forecast error. It includes an adaptive error estimate and a localization algorithm. Along-track altimeter data, satellite Sea Surface Temperature and in situ temperature and salinity vertical profiles are jointly assimilated to estimate the initial conditions for numerical ocean forecasting. A 3D-Var scheme provides a correction for the slowly evolving large-scale biases in temperature and salinity. Since May 2015, Mercator Ocean has operated the Copernicus Marine Service (CMS) and is in charge of the global ocean analyses and forecasts at eddy-resolving resolution. In this context, R&D activities have been conducted at Mercator Ocean in recent years to improve the real-time 1/12° global system for the next CMS version in 2016. The ocean/sea-ice model and the assimilation scheme benefit, among others, from the following improvements: large-scale and objective correction of atmospheric quantities with satellite data, a new Mean Dynamic Topography taking into account the latest version of the GOCE geoid, new adaptive tuning of some observational errors, a new quality control on the assimilated temperature and salinity vertical profiles based on dynamic height criteria, assimilation of satellite sea-ice concentration, and new freshwater runoff from ice-sheet melting. This presentation does not focus on the impact of each update, but rather on the overall behavior of the system integrating all updates. This assessment reports on product quality improvements, highlighting the level of performance and the reliability of the new system.

  17. Institutional Oversight of the Graduate Medical Education Enterprise: Development of an Annual Institutional Review

    PubMed Central

    Amedee, Ronald G.; Piazza, Janice C.

    2016-01-01

    Background: The Accreditation Council for Graduate Medical Education (ACGME) fully implemented all aspects of the Next Accreditation System (NAS) on July 1, 2014. In lieu of periodic accreditation site visits of programs and institutions, the NAS requires active, ongoing oversight by the sponsoring institutions (SIs) to maintain accreditation readiness and program quality. Methods: The Ochsner Health System Graduate Medical Education Committee (GMEC) has instituted a process that provides a structured, process-driven improvement approach at the program level, using a Program Evaluation Committee to review key performance data and construct an annual program evaluation for each accredited residency. The Ochsner GMEC evaluates the aggregate program data and creates an Annual Institutional Review (AIR) document that provides direction and focus for ongoing program improvement. This descriptive article reviews the 2014 process and various metrics collected and analyzed to demonstrate the program review and institutional oversight provided by the Ochsner graduate medical education (GME) enterprise. Results: The 2014 AIR provided an overview of performance and quality of the Ochsner GME program for the 2013-2014 academic year with particular attention to program outcomes; resident supervision, responsibilities, evaluation, and compliance with duty‐hour standards; results of the ACGME survey of residents and core faculty; and resident participation in patient safety and quality activities and curriculum. The GMEC identified other relevant institutional performance indicators that are incorporated into the AIR and reflect SI engagement in and contribution to program performance at the individual program and institutional levels. Conclusion: The Ochsner GME office and its program directors are faced with the ever-increasing challenges of today's healthcare environment as well as escalating institutional and program accreditation requirements. 
The overall commitment of this SI to advancing our GME enterprise is clearly evident, and the opportunity for continued improvement resulting from institutional oversight is being realized. PMID:27046412

  18. Leveraging health information technology to achieve the "triple aim" of healthcare reform.

    PubMed

    Sheikh, Aziz; Sood, Harpreet S; Bates, David W

    2015-07-01

    To investigate experiences with leveraging health information technology (HIT) to improve patient care and population health, and reduce healthcare expenditures. In-depth qualitative interviews with federal government employees; health policy, HIT and medico-legal experts; health providers; physicians; purchasers; payers; patient advocates; and vendors from across the United States. The authors undertook 47 interviews. There was a widely shared belief that the Health Information Technology for Economic and Clinical Health (HITECH) Act had catalyzed the creation of a digital infrastructure, which was being used in innovative ways to improve quality of care and curtail costs. There were, however, major concerns about the poor usability of electronic health records (EHRs), their limited ability to support multidisciplinary care, and major difficulties with health information exchange, which undermined efforts to deliver integrated patient-centered care. Proposed strategies for enhancing the benefits of HIT included federal stimulation of competition by mandating that vendors open up their application program interfaces, incentivizing the development of low-cost consumer informatics tools, and promoting Congressional review of the Health Insurance Portability and Accountability Act (HIPAA) to optimize the balance between data privacy and reuse. Many underscored the need to "kick the legs from underneath the fee-for-service model" and replace it with a data-driven reimbursement system that rewards high-quality care. The HITECH Act has stimulated unprecedented, multi-stakeholder interest in HIT. Early experiences indicate that the resulting digital infrastructure is being used to improve quality of care and curtail costs. Reform efforts are, however, severely limited by problems with usability, limited interoperability and the persistence of the fee-for-service paradigm; addressing these issues therefore needs to be the federal government's main policy target. © The Author 2015. 
Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  19. 'Six sigma approach' - an objective strategy in digital assessment of postoperative air leaks: a prospective randomised study.

    PubMed

    Bertolaccini, Luca; Rizzardi, Giovanna; Filice, Mary Jo; Terzi, Alberto

    2011-05-01

    Until now, the only way to report air leaks (ALs) has been with an analogue score, in an inherently subjective manner. The Six Sigma quality improvement methodology is a data-driven approach applicable to evaluating the quality of quantification methods for repetitive procedures. We applied the Six Sigma concept to improve the process of AL evaluation. A digital device for AL measurement (Drentech PALM, Redax S.r.l., Mirandola (MO), Italy) was applied to 49 consecutive patients who underwent pulmonary intervention, compared with a similar population with classical chest drainage. Data recorded were postoperative AL, chest-tube removal days, number of chest roentgenograms, hospital length of stay; device setup time, average time rating AL and patient satisfaction. Bivariable comparisons were made using the Mann-Whitney test, the χ² test and Fisher's exact test. Analysis of quality was conducted using the Six Sigma methodology. There was no significant difference in AL (p=0.075); nonsignificant reductions were observed in postoperative chest X-rays (four vs five) and hospital length of stay (6.5 vs 7.1 days), and a marginally significant difference was found in chest-tube removal days (p=0.056). There were significant differences regarding device setup time (p=0.001), average time rating AL (p=0.001), inter-observer variability (p=0.001) and patient satisfaction (p=0.002). Six Sigma analyses revealed accurate assessment of AL. Continuous digital measurement of AL reduces the variability of the AL score, gives more assurance for tube removal, and reports AL without the apprehension of observer error. Efficiency and effectiveness improved with the use of a digital device. We have noted that the AL curves depict the actual sealing of ALs. The clinical importance of AL curves requires further study. Copyright © 2010 European Association for Cardio-Thoracic Surgery. Published by Elsevier B.V. All rights reserved.
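    For readers unfamiliar with the Six Sigma vocabulary used in this record: process quality is commonly summarized as defects per million opportunities (DPMO) and converted to a sigma level via the inverse normal CDF plus the conventional 1.5-sigma long-term shift. A short sketch, assuming Python 3.8+ for `statistics.NormalDist` (the function names are illustrative):

```python
from statistics import NormalDist  # stdlib, Python 3.8+

def dpmo(defects, units, opportunities_per_unit):
    """Defects per million opportunities."""
    return defects / (units * opportunities_per_unit) * 1_000_000

def sigma_level(dpmo_value):
    """Short-term sigma level: z-score of the process yield plus the
    customary 1.5-sigma long-term shift."""
    process_yield = 1.0 - dpmo_value / 1_000_000
    return NormalDist().inv_cdf(process_yield) + 1.5
```

    For example, 3.4 DPMO corresponds to the canonical "six sigma" level, and 5 defects across 100 procedures with 10 opportunities each gives 5000 DPMO.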

  20. Accountable care organization readiness and academic medical centers.

    PubMed

    Berkowitz, Scott A; Pahira, Jennifer J

    2014-09-01

    As academic medical centers (AMCs) consider becoming accountable care organizations (ACOs) under Medicare, they must assess their readiness for this transition. Of the 253 Medicare ACOs prior to 2014, 51 (20%) are AMCs. Three critical components of ACO readiness are institutional and ACO structure, leadership, and governance; robust information technology and analytic systems; and care coordination and management to improve care delivery and health at the population level. All of these must be viewed through the lens of unique AMC mission-driven goals. There is clear benefit to developing and maintaining a centralized internal leadership when it comes to driving change within an ACO, yet there is also the need for broad stakeholder involvement. Other important structural features are an extensive primary care foundation; concomitant operation of a managed care plan or risk-bearing entity; or maintaining a close relationship with post-acute-care or skilled nursing facilities, which provide valuable expertise in coordinating care across the continuum. ACOs also require comprehensive and integrated data and analytic systems that provide meaningful population data to inform care teams in real time, promote quality improvement, and monitor spending trends. AMCs will require proven care coordination and management strategies within a population health framework and deployment of an innovative workforce. AMC core functions of providing high-quality subspecialty and primary care, generating new knowledge, and training future health care leaders can be well aligned with a transition to an ACO model. Further study of results from Medicare-related ACO programs and commercial ACOs will help define best practices.
