Human Connectome Project Informatics: quality control, database services, and data visualization
Marcus, Daniel S.; Harms, Michael P.; Snyder, Abraham Z.; Jenkinson, Mark; Wilson, J Anthony; Glasser, Matthew F.; Barch, Deanna M.; Archie, Kevin A.; Burgess, Gregory C.; Ramaratnam, Mohana; Hodge, Michael; Horton, William; Herrick, Rick; Olsen, Timothy; McKay, Michael; House, Matthew; Hileman, Michael; Reid, Erin; Harwell, John; Coalson, Timothy; Schindler, Jon; Elam, Jennifer S.; Curtiss, Sandra W.; Van Essen, David C.
2013-01-01
The Human Connectome Project (HCP) has developed protocols, standard operating and quality control procedures, and a suite of informatics tools to enable high throughput data collection, data sharing, automated data processing and analysis, and data mining and visualization. Quality control procedures include methods to maintain data collection consistency over time, to measure head motion, and to establish quantitative modality-specific overall quality assessments. Database services developed as customizations of the XNAT imaging informatics platform support both internal daily operations and open access data sharing. The Connectome Workbench visualization environment enables user interaction with HCP data and is increasingly integrated with the HCP's database services. Here we describe the current state of these procedures and tools and their application in the ongoing HCP study. PMID:23707591
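The head-motion measurement mentioned above is commonly summarized as framewise displacement computed from the six rigid-body realignment parameters. The sketch below is a minimal illustration in the style of Power et al. (2012), not the HCP's actual implementation; it assumes translations in mm, rotations in radians, and a 50 mm head radius for converting rotations to arc length.

```python
import numpy as np

def framewise_displacement(params, head_radius_mm=50.0):
    """Framewise displacement from an (N, 6) array of rigid-body motion
    parameters: three translations (mm) and three rotations (radians).
    Rotations are converted to mm of arc on a sphere approximating the head."""
    params = np.asarray(params, dtype=float)
    diffs = np.abs(np.diff(params, axis=0))   # frame-to-frame deltas
    diffs[:, 3:] *= head_radius_mm            # radians -> mm of arc
    fd = diffs.sum(axis=1)                    # one value per frame transition
    return np.concatenate([[0.0], fd])        # pad so len == N frames

# Example: flag frames exceeding a 0.5 mm threshold (threshold is illustrative)
motion = np.random.default_rng(0).normal(0, 0.1, size=(100, 6))
fd = framewise_displacement(motion)
print((fd > 0.5).sum(), "high-motion frames")
```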
New tools for sculpting cranial implants in a shared haptic augmented reality environment.
Ai, Zhuming; Evenhouse, Ray; Leigh, Jason; Charbel, Fady; Rasmussen, Mary
2006-01-01
New volumetric tools were developed for the design and fabrication of high quality cranial implants from patient CT data. These virtual tools replace time consuming physical sculpting, mold making and casting steps. The implant is designed by medical professionals in tele-immersive collaboration. Virtual clay is added in the virtual defect area on the CT data using the adding tool. With force feedback the modeler can feel the edge of the defect and fill only the space where no bone is present. A carving tool and a smoothing tool are then used to sculpt and refine the implant. To make a physical evaluation, the skull with simulated defect and the implant are fabricated via stereolithography to allow neurosurgeons to evaluate the quality of the implant. Initial tests demonstrate a very high quality fit. These new haptic volumetric sculpting tools are a critical component of a comprehensive tele-immersive system.
Dy, Sydney M; Purnell, Tanjala S
2012-02-01
High-quality provider-patient decision-making is key to quality care for complex conditions. We performed an analysis of key elements relevant to quality and complex, shared medical decision-making. Based on a search of electronic databases, including Medline and the Cochrane Library, as well as relevant articles' reference lists, reviews of tools, and annotated bibliographies, we developed a list of key concepts and applied them to a decision-making example. Key concepts identified included provider competence, trustworthiness, and cultural competence; communication with patients and families; information quality; patient/surrogate competence; and roles and involvement. We applied this concept list to a case example, shared decision-making for live donor kidney transplantation, and identified the likely most important concepts as provider and cultural competence, information quality, and communication with patients and families. This concept list may be useful for conceptualizing the quality of complex shared decision-making and in guiding research in this area. Copyright © 2011 Elsevier Ltd. All rights reserved.
Implementing a user-driven online quality improvement toolkit for cancer care.
Luck, Jeff; York, Laura S; Bowman, Candice; Gale, Randall C; Smith, Nina; Asch, Steven M
2015-05-01
Peer-to-peer collaboration within integrated health systems requires a mechanism for sharing quality improvement lessons. The Veterans Health Administration (VA) developed online compendia of tools linked to specific cancer quality indicators. We evaluated awareness and use of the toolkits, variation across facilities, impact of social marketing, and factors influencing toolkit use. A diffusion of innovations conceptual framework guided the collection of user activity data from the Toolkit Series SharePoint site and an online survey of potential Lung Cancer Care Toolkit users. The VA Toolkit Series site had 5,088 unique visitors in its first 22 months; 5% of users accounted for 40% of page views. Social marketing communications were correlated with site usage. Of survey respondents (n = 355), 54% had visited the site, of whom 24% downloaded at least one tool. Respondents' awareness of the lung cancer quality performance of their facility, and facility participation in quality improvement collaboratives, were positively associated with Toolkit Series site use. Facility-level lung cancer tool implementation varied widely across tool types. The VA Toolkit Series achieved widespread use and a high degree of user engagement, although use varied widely across facilities. The most active users were aware of and active in cancer care quality improvement. Toolkit use seemed to be reinforced by other quality improvement activities. A combination of user-driven tool creation and centralized toolkit development seemed to be effective for leveraging health information technology to spread disease-specific quality improvement tools within an integrated health care system. Copyright © 2015 by American Society of Clinical Oncology.
NASA Technical Reports Server (NTRS)
Falke, Stefan; Husar, Rudolf
2011-01-01
The goal of this REASoN applications and technology project is to deliver and use Earth Science Enterprise (ESE) data and tools in support of air quality management. Its scope falls within the domain of air quality management and aims to develop a federated air quality information sharing network that includes data from NASA, EPA, US states, and others. Project goals were achieved through access to satellite and ground observation data, web services information technology, interoperability standards, and air quality community collaboration. In contributing to a network of NASA ESE data in support of particulate air quality management, the project developed access to distributed data, built Web infrastructure, and created tools for data processing and analysis. The key technologies used in the project include emerging web services for developing self-describing and modular data access and processing tools, and service-oriented architecture for chaining web services together to assemble customized air quality management applications. The technology and tools required for this project were developed within DataFed.net, a shared infrastructure that supports collaborative atmospheric data sharing and processing web services. Much of the collaboration was facilitated through community interactions in the Federation of Earth Science Information Partners (ESIP) Air Quality Workgroup. The main activities during the project that successfully advanced DataFed, enabled air quality applications, and established community-oriented infrastructures were to: develop access to distributed data (surface and satellite); build Web infrastructure to support data access, processing, and analysis; create tools for data processing and analysis; and foster air quality community collaboration and interoperability.
A simple tool for neuroimaging data sharing
Haselgrove, Christian; Poline, Jean-Baptiste; Kennedy, David N.
2014-01-01
Data sharing is becoming increasingly common, but despite encouragement and facilitation by funding agencies, journals, and some research efforts, most neuroimaging data acquired today are still not shared, owing to persistent political, financial, social, and technical barriers. In particular, few technical solutions exist for researchers who are not part of larger efforts with dedicated sharing infrastructures, and social barriers such as the time commitment required to share can keep data from becoming publicly available. We present a system for sharing neuroimaging data, designed to be simple to use and to provide benefit to the data provider. The system consists of a server at the International Neuroinformatics Coordinating Facility (INCF) and user tools for uploading data to the server. The primary design principle for the user tools is ease of use: the user identifies a directory containing Digital Imaging and Communications in Medicine (DICOM) data, provides their INCF Portal authentication, and provides identifiers for the subject and imaging session. The user tool anonymizes the data and sends it to the server. The server then runs quality control routines on the data, and the data and the quality control reports are made public. The user retains control of the data and may change the sharing policy as they need. The result is that in a few minutes of the user's time, DICOM data can be anonymized and made publicly available, and an initial quality control assessment can be performed on the data. The system is currently functional, and user tools and access to the public image database are available at http://xnat.incf.org/. PMID:24904398
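The workflow described here (point the tool at a DICOM directory, anonymize, upload) can be sketched in a few lines. The snippet below is a hypothetical illustration, not the INCF tool's code: the tag subset, file pattern, and function names are assumptions, and a production tool would apply a complete de-identification profile such as DICOM PS3.15 Annex E.

```python
from pathlib import Path
import pydicom

# Tags to blank out; this subset is only illustrative of de-identification.
IDENTIFYING_TAGS = ["PatientName", "PatientID", "PatientBirthDate",
                    "InstitutionName", "ReferringPhysicianName"]

def anonymize_directory(src_dir, dst_dir, subject_label):
    """Copy DICOM files with identifying tags blanked and a study label set."""
    dst = Path(dst_dir)
    dst.mkdir(parents=True, exist_ok=True)
    # Assumes .dcm extensions; real exports may use extensionless files.
    for path in Path(src_dir).rglob("*.dcm"):
        ds = pydicom.dcmread(path)
        for tag in IDENTIFYING_TAGS:
            if tag in ds:
                ds.data_element(tag).value = ""
        ds.PatientID = subject_label   # replace with the study identifier
        ds.save_as(dst / path.name)

anonymize_directory("raw_session", "anon_session", "sub-0001")
```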
Stories from OpenAQ, a Global and Grassroots Open Air Quality Community
NASA Astrophysics Data System (ADS)
Hasenkopf, C. A.; Flasher, J. C.; Veerman, O.; Scalamogna, A.; Silva, D.; Salmon, M.; Buuralda, D.; DeWitt, L. H.
2016-12-01
Air pollution, responsible for more deaths each year than HIV/AIDS and malaria combined, is a global public health crisis. Yet many scientific questions, including those directly relevant for policy, remain unanswered when it comes to the impact of air pollution on health in highly polluted environments. Often, specific solutions to improving air quality are local and sustained through public engagement, policy, and monitoring. Both the overarching science of air quality and health and local solutions rely on access to reliable, timely air quality data. Over the past year, the OpenAQ community has opened up existing disparate air quality data in 24 countries through an open source platform (openaq.org) so that communities around the world can use it to advance science, public engagement, and policy. We will share stories of communities, from Delhi to Ulaanbaatar and from scientists to journalists, using open air quality data from our platform to advance their fight against air inequality. We will share recent research we have conducted on best practices for engaging different communities and building tools that enable the public to fully unleash the power of open air quality data to fight air inequality. The subsequent open-source tools (github.com/openaq) we have developed from this research and our entire data-sharing platform may be of interest to other open data communities.
LC Data QUEST: A Technical Architecture for Community Federated Clinical Data Sharing
Stephens, Kari A.; Lin, Ching-Ping; Baldwin, Laura-Mae; Echo-Hawk, Abigail; Keppel, Gina A.; Buchwald, Dedra; Whitener, Ron J.; Korngiebel, Diane M.; Berg, Alfred O.; Black, Robert A.; Tarczy-Hornoch, Peter
2012-01-01
The University of Washington Institute of Translational Health Sciences is engaged in a project, LC Data QUEST, building data sharing capacity in primary care practices serving rural and tribal populations in the Washington, Wyoming, Alaska, Montana, Idaho region to build research infrastructure. We report on the iterative process of developing the technical architecture for semantically aligning electronic health data in primary care settings across our pilot sites and tools that will facilitate linkages between the research and practice communities. Our architecture emphasizes sustainable technical solutions for addressing data extraction, alignment, quality, and metadata management. The architecture provides immediate benefits to participating partners via a clinical decision support tool and data querying functionality to support local quality improvement efforts. The FInDiT tool catalogues type, quantity, and quality of the data that are available across the LC Data QUEST data sharing architecture. These tools facilitate the bi-directional process of translational research. PMID:22779052
Tools to Promote Shared Decision Making in Serious Illness: A Systematic Review.
Austin, C Adrian; Mohottige, Dinushika; Sudore, Rebecca L; Smith, Alexander K; Hanson, Laura C
2015-07-01
Serious illness impairs function and threatens survival. Patients facing serious illness value shared decision making, yet few decision aids address the needs of this population. To perform a systematic review of evidence about decision aids and other exportable tools that promote shared decision making in serious illness, thereby (1) identifying tools relevant to the treatment decisions of seriously ill patients and their caregivers, (2) evaluating the quality of evidence for these tools, and (3) summarizing their effect on outcomes and accessibility for clinicians. We searched PubMed, CINAHL, and PsycINFO from January 1, 1995, through October 31, 2014, and identified additional studies from reference lists and other systematic reviews. Clinical trials with random or nonrandom controls were included if they tested print, video, or web-based tools for advance care planning (ACP) or decision aids for serious illness. We extracted data on the study population, design, results, and risk for bias using the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) criteria. Each tool was evaluated for its effect on patient outcomes and accessibility. Seventeen randomized clinical trials tested decision tools in serious illness. Nearly all the trials were of moderate or high quality and showed that decision tools improve patient knowledge and awareness of treatment choices. The available tools address ACP, palliative care and goals of care communication, feeding options in dementia, lung transplant in cystic fibrosis, and truth telling in terminal cancer. Five randomized clinical trials provided further evidence that decision tools improve ACP documentation, clinical decisions, and treatment received. Clinicians can access and use evidence-based tools to engage seriously ill patients in shared decision making. This field of research is in an early stage; future research is needed to develop novel decision aids for other serious diagnoses and key decisions. Health care delivery organizations should prioritize the use of currently available tools that are evidence based and effective.
A data management and publication workflow for a large-scale, heterogeneous sensor network.
Jones, Amber Spackman; Horsburgh, Jeffery S; Reeder, Stephanie L; Ramírez, Maurier; Caraballo, Juan
2015-06-01
It is common for hydrology researchers to collect data using in situ sensors at high frequencies, for extended durations, and with spatial distributions that produce data volumes requiring infrastructure for data storage, management, and sharing. The availability and utility of these data in addressing scientific questions related to water availability, water quality, and natural disasters relies on effective cyberinfrastructure that facilitates transformation of raw sensor data into usable data products. It also depends on the ability of researchers to share and access the data in useable formats. In this paper, we describe a data management and publication workflow and software tools for research groups and sites conducting long-term monitoring using in situ sensors. Functionality includes the ability to track monitoring equipment inventory and events related to field maintenance. Linking this information to the observational data is imperative in ensuring the quality of sensor-based data products. We present these tools in the context of a case study for the innovative Urban Transitions and Aridregion Hydrosustainability (iUTAH) sensor network. The iUTAH monitoring network includes sensors at aquatic and terrestrial sites for continuous monitoring of common meteorological variables, snow accumulation and melt, soil moisture, surface water flow, and surface water quality. We present the overall workflow we have developed for effectively transferring data from field monitoring sites to ultimate end-users and describe the software tools we have deployed for storing, managing, and sharing the sensor data. These tools are all open source and available for others to use.
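A core step in transforming raw sensor data into usable products is automated quality control. The following is a minimal sketch of a range-check pass, assuming pandas, a hypothetical logger_export.csv, and placeholder variable bounds; it is not the iUTAH software itself, where bounds would come from sensor specifications and site metadata.

```python
import pandas as pd

# Placeholder bounds; real limits come from sensor specs and site metadata.
RANGES = {"water_temp_C": (-0.5, 40.0), "turbidity_NTU": (0.0, 4000.0)}

def apply_range_checks(df):
    """Add a <variable>_qc column: 'ok', 'out_of_range', or 'missing'."""
    for var, (lo, hi) in RANGES.items():
        qc = pd.Series("ok", index=df.index)
        qc[df[var].isna()] = "missing"
        qc[(df[var] < lo) | (df[var] > hi)] = "out_of_range"
        df[var + "_qc"] = qc
    return df

raw = pd.read_csv("logger_export.csv", parse_dates=["timestamp"])
checked = apply_range_checks(raw)
print(checked.filter(like="_qc").apply(pd.Series.value_counts))
```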
The NGEE Arctic Data Archive -- Portal for Archiving and Distributing Data and Documentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boden, Thomas A; Palanisamy, Giri; Devarakonda, Ranjeet
2014-01-01
The Next-Generation Ecosystem Experiments (NGEE Arctic) project is committed to implementing a rigorous and high-quality data management program. The goal is to implement innovative and cost-effective guidelines and tools for collecting, archiving, and sharing data within the project, the larger scientific community, and the public. The NGEE Arctic web site is the framework for implementing these data management and data sharing tools. The open sharing of NGEE Arctic data among project researchers, the broader scientific community, and the public is critical to meeting the scientific goals and objectives of the NGEE Arctic project and critical to advancing the mission of the Department of Energy (DOE), Office of Science, Biological and Environmental Research (BER) Terrestrial Ecosystem Science (TES) program.
Advantages of Integrative Data Analysis for Developmental Research
ERIC Educational Resources Information Center
Bainter, Sierra A.; Curran, Patrick J.
2015-01-01
Amid recent progress in cognitive development research, high-quality data resources are accumulating, and data sharing and secondary data analysis are becoming increasingly valuable tools. Integrative data analysis (IDA) is an exciting analytical framework that can enhance secondary data analysis in powerful ways. IDA pools item-level data across…
Development of a web-based toolkit to support improvement of care coordination in primary care.
Ganz, David A; Barnard, Jenny M; Smith, Nina Z Y; Miake-Lye, Isomi M; Delevan, Deborah M; Simon, Alissa; Rose, Danielle E; Stockdale, Susan E; Chang, Evelyn T; Noël, Polly H; Finley, Erin P; Lee, Martin L; Zulman, Donna M; Cordasco, Kristina M; Rubenstein, Lisa V
2018-05-23
Promising practices for the coordination of chronic care exist, but how to select and share these practices to support quality improvement within a healthcare system is uncertain. This study describes an approach for selecting high-quality tools for an online care coordination toolkit to be used in Veterans Health Administration (VA) primary care practices. We evaluated tools in three steps: (1) an initial screening to identify tools relevant to care coordination in VA primary care, (2) a two-clinician expert review process assessing tool characteristics (e.g., frequency of problem addressed, linkage to patients' experience of care, effect on practice workflow, and sustainability with existing resources) and assigning each tool a summary rating, and (3) semi-structured interviews with VA patients and frontline clinicians and staff. Of 300 potentially relevant tools identified by searching online resources, 65, 38, and 18 remained after steps one, two, and three, respectively. The 18 tools cover seven topics: managing referrals to specialty care, medication management, patient after-visit summary, patient activation materials, agenda setting, patient pre-visit packet, and provider contact information for patients. The final toolkit provides access to the 18 tools, as well as detailed information about tools' expected benefits and resources required for tool implementation. Future care coordination efforts can benefit from systematically reviewing available tools to identify those that are high quality and relevant.
"CrowdTeaching": Supporting Teacher Collective Intelligence Communities
ERIC Educational Resources Information Center
Recker, Mimi M.; Yuan, Min; Ye, Lei
2013-01-01
The widespread availability of high-quality Web-based tools and content offers new promise and potential for supporting teachers as creators of instructional activities. When coupled with a participatory Web culture and infrastructure, teachers can share their creations as well as leverage from the best that their peers have to offer to support a…
A Review of Shared Decision-Making and Patient Decision Aids in Radiation Oncology.
Woodhouse, Kristina Demas; Tremont, Katie; Vachani, Anil; Schapira, Marilyn M; Vapiwala, Neha; Simone, Charles B; Berman, Abigail T
2017-06-01
Cancer treatment decisions are complex and may be challenging for patients, as multiple treatment options can often be reasonably considered. As a result, decisional support tools have been developed to assist patients in the decision-making process. A commonly used intervention to facilitate shared decision-making is a decision aid, which provides evidence-based outcomes information and guides patients towards choosing the treatment option that best aligns with their preferences and values. To ensure high quality, systematic frameworks and standards have been proposed for the development of an optimal aid for decision making. Studies have examined the impact of these tools on facilitating treatment decisions and improving decision-related outcomes. In radiation oncology, randomized controlled trials have demonstrated that decision aids have the potential to improve patient outcomes, including increased knowledge about treatment options and decreased decisional conflict with decision-making. This article provides an overview of the shared-decision making process and summarizes the development, validation, and implementation of decision aids as patient educational tools in radiation oncology. Finally, this article reviews the findings from decision aid studies in radiation oncology and offers various strategies to effectively implement shared decision-making into clinical practice.
Generating community-built tools for data sharing and analysis in environmental networks
Read, Jordan S.; Gries, Corinna; Read, Emily K.; Klug, Jennifer; Hanson, Paul C.; Hipsey, Matthew R.; Jennings, Eleanor; O'Reilley, Catherine; Winslow, Luke A.; Pierson, Don; McBride, Christopher G.; Hamilton, David
2016-01-01
Rapid data growth in many environmental sectors has necessitated tools to manage and analyze these data. The development of tools often lags behind the proliferation of data, however, which may slow exploratory opportunities and scientific progress. The Global Lake Ecological Observatory Network (GLEON) collaborative model supports an efficient and comprehensive data–analysis–insight life cycle, including implementations of data quality control checks, statistical calculations/derivations, models, and data visualizations. These tools are community-built and openly shared. We discuss the network structure that enables tool development and a culture of sharing, leading to optimized output from limited resources. Specifically, data sharing and a flat collaborative structure encourage the development of tools that enable scientific insights from these data. Here we provide a cross-section of scientific advances derived from global-scale analyses in GLEON. We document enhancements to science capabilities made possible by the development of analytical tools and highlight opportunities to expand this framework to benefit other environmental networks.
Kristensen, Finn Børlum; Lampe, Kristian; Chase, Deborah L; Lee-Robin, Sun Hae; Wild, Claudia; Moharra, Montse; Garrido, Marcial Velasco; Nielsen, Camilla Palmhøj; Røttingen, John-Arne; Neikter, Susanna Allgurin; Bistrup, Marie Louise
2009-12-01
This article presents an overview of the practical methods and tools to support transnational Health Technology Assessment (HTA) that were developed and pilot tested by the European network for HTA (EUnetHTA), which involved a total of sixty-four Partner organizations. The methods differ according to scope and purpose of each of the tools developed. They included, for example, literature reviews, surveys, Delphi and consensus methods, workshops, pilot tests, and internal/public consultation. Practical results include an HTA Core Model and a Handbook on the use of the model, two pilot examples of HTA core information, an HTA Adaptation Toolkit for taking existing reports into new settings, a book about HTA and health policy making in Europe, a newsletter providing structured information about emerging/new technologies, an interactive Web-based tool to share information about monitoring activities for emerging/new technologies, and a Handbook on HTA capacity building for Member States with limited institutionalization of HTA. The tools provide high-quality information and methodological frameworks for HTA that facilitate preparation of HTA documentation, and sharing of information in and across national or regional systems. The tools will be used and further tested by partners in the EUnetHTA Collaboration aiming to (i) help reduce unnecessary duplication of HTA activities, (ii) develop and promote good practice in HTA methods and processes, (iii) share what can be shared, (iv) facilitate local adaptation of HTA information, (v) improve the links between health policy and HTA.
Mold and Indoor Air Quality in Schools
Resources include the Mold and Moisture in Schools webinar; the premier resource on this issue is the Indoor Air Quality Tools for Schools kit.
ARX - A Comprehensive Tool for Anonymizing Biomedical Data
Prasser, Fabian; Kohlmayer, Florian; Lautenschläger, Ronald; Kuhn, Klaus A.
2014-01-01
Collaboration and data sharing have become core elements of biomedical research. Especially when sensitive data from distributed sources are linked, privacy threats have to be considered. Statistical disclosure control allows the protection of sensitive data by introducing fuzziness. Reduction of data quality, however, needs to be balanced against gains in protection. Therefore, tools are needed which provide a good overview of the anonymization process to those responsible for data sharing. These tools require graphical interfaces and the use of intuitive and replicable methods. In addition, extensive testing, documentation and openness to reviews by the community are important. Existing publicly available software is limited in functionality, and often active support is lacking. We present ARX, an anonymization tool that i) implements a wide variety of privacy methods in a highly efficient manner, ii) provides an intuitive cross-platform graphical interface, iii) offers a programming interface for integration into other software systems, and iv) is well documented and actively supported. PMID:25954407
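ARX itself is Java software with its own programming interface; the fragment below does not use ARX's API and only illustrates the underlying idea of statistical disclosure control, a k-anonymity check plus one generalization step, in Python with toy data.

```python
import pandas as pd

def is_k_anonymous(df, quasi_identifiers, k):
    """True if every combination of quasi-identifier values occurs
    in at least k records."""
    return df.groupby(quasi_identifiers).size().min() >= k

def generalize_age(df, bin_width=10):
    """One generalization step: replace exact age with a decade bin."""
    out = df.copy()
    out["age"] = (out["age"] // bin_width * bin_width).astype(str) + "s"
    return out

records = pd.DataFrame({
    "age": [34, 36, 35, 61, 63, 62],
    "zip": ["55401", "55401", "55401", "55402", "55402", "55402"],
    "diagnosis": ["A", "B", "A", "C", "A", "B"],
})
print(is_k_anonymous(records, ["age", "zip"], k=2))                   # False
print(is_k_anonymous(generalize_age(records), ["age", "zip"], k=2))   # True
```

Generalization buys protection at the cost of data quality, which is exactly the trade-off a tool like ARX makes visible to the person responsible for sharing.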
Robertson, Eden G; Wakefield, Claire E; Signorelli, Christina; Cohn, Richard J; Patenaude, Andrea; Foster, Claire; Pettit, Tristan; Fardell, Joanna E
2018-07-01
We conducted a systematic review to identify the strategies that have been recommended in the literature to facilitate shared decision-making regarding enrolment in pediatric oncology clinical trials. We searched seven databases for peer-reviewed literature published 1990-2017. Of 924 articles identified, 17 studies were eligible for the review. We assessed study quality using the 'Mixed Methods Appraisal Tool'. We coded the results and discussions of papers line-by-line using NVivo software. We categorized strategies thematically. Five main themes emerged: 1) decision-making as a process; 2) individuality of the process; 3) information provision; 4) the role of communication; and 5) decision and psychosocial support. Families should have adequate time to make a decision. Healthcare professionals (HCPs) should elicit parents' and patients' preferences for level of information and decision involvement. Information should be clear and provided in multiple modalities. Articles also recommended providing training for healthcare professionals and access to psychosocial support for families. High-quality, individually tailored information, open communication, and psychosocial support appear vital in supporting decision-making regarding enrollment in clinical trials. These data will usefully inform future decision-making interventions/tools to support families making clinical trial decisions. A solid evidence base for effective strategies that facilitate shared decision-making is needed. Copyright © 2018 Elsevier B.V. All rights reserved.
a Standardized Approach to Topographic Data Processing and Workflow Management
NASA Astrophysics Data System (ADS)
Wheaton, J. M.; Bailey, P.; Glenn, N. F.; Hensleigh, J.; Hudak, A. T.; Shrestha, R.; Spaete, L.
2013-12-01
An ever-increasing list of options exists for collecting high resolution topographic data, including airborne LIDAR, terrestrial laser scanners, bathymetric SONAR and structure-from-motion. An equally rich, arguably overwhelming, variety of tools exists with which to organize, quality control, filter, analyze and summarize these data. However, scientists are often left to cobble together their analysis as a series of ad hoc steps, often using custom scripts and one-time processes that are poorly documented and rarely shared with the community. Even when literature-cited software tools are used, the input and output parameters differ from tool to tool. These parameters are rarely archived and the steps performed lost, making the analysis virtually impossible to replicate precisely. What is missing is a coherent, robust framework for combining reliable, well-documented topographic data-processing steps into a workflow that can be repeated and even shared with others. We have taken several popular topographic data processing tools - including point cloud filtering and decimation as well as DEM differencing - and defined a common protocol for passing inputs and outputs between them. This presentation describes a free, public online portal that enables scientists to create custom workflows for processing topographic data using a number of popular topographic processing tools. Users provide the inputs required for each tool and in what sequence they want to combine them. This information is then stored for future reuse (and optionally sharing with others) before the user then downloads a single package that contains all the input and output specifications together with the software tools themselves. The user then launches the included batch file that executes the workflow on their local computer against their topographic data. This ZCloudTools architecture helps standardize, automate and archive topographic data processing. It also represents a forum for discovering and sharing effective topographic processing workflows.
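Among the processing steps named above, DEM differencing is easy to make concrete. The sketch below assumes two co-registered, same-grid rasters; the filenames and the fixed 0.2 m minimum level of detection are placeholders, and a real workflow would propagate survey uncertainty instead of using a flat threshold.

```python
import numpy as np
import rasterio

# Hypothetical co-registered, same-grid DEMs from two survey dates.
with rasterio.open("dem_2012.tif") as src:
    old = src.read(1).astype(float)
    profile = src.profile
with rasterio.open("dem_2013.tif") as src:
    new = src.read(1).astype(float)

dod = new - old   # DEM of difference (elevation change per cell)

# Mask change below a minimum level of detection; a fixed +/-0.2 m
# threshold stands in for a propagated-uncertainty estimate.
min_lod = 0.2
dod[np.abs(dod) < min_lod] = np.nan

profile.update(dtype="float64", nodata=np.nan)
with rasterio.open("dod_thresholded.tif", "w", **profile) as dst:
    dst.write(dod, 1)

print(f"net elevation change: {np.nansum(dod):.1f} (m per cell; multiply by cell area for volume)")
```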
Bergholz, W
2008-11-01
In many high-tech industries, quality management (QM) has enabled improvements of quality by a factor of 100 or more, in combination with significant cost reductions. Compared to this, the application of QM methods in health care is in its initial stages. It is anticipated that stringent process management, embedded in an effective QM system will lead to significant improvements in health care in general and in the German public health service in particular. Process management is an ideal platform for controlling in the health care sector, and it will significantly improve the leverage of controlling to bring down costs. Best practice sharing in industry has led to quantum leap improvements. Process management will enable best practice sharing also in the public health service, in spite of the highly diverse portfolio of services that the public health service offers in different German regions. Finally, it is emphasised that "technical" QM, e.g., on the basis of the ISO 9001 standard is not sufficient to reach excellence. It is necessary to integrate soft factors, such as patient or employee satisfaction, and leadership quality into the system. The EFQM model for excellence can serve as proven tool to reach this goal.
Teach-Discover-Treat (TDT): Collaborative Computational Drug Discovery for Neglected Diseases
Jansen, Johanna M.; Cornell, Wendy; Tseng, Y. Jane; Amaro, Rommie E.
2012-01-01
Teach-Discover-Treat (TDT) is an initiative to promote the development and sharing of computational tools solicited through a competition, with the aim to impact education and collaborative drug discovery for neglected diseases. Collaboration, multidisciplinary integration, and innovation are essential for successful drug discovery. This requires a workforce that is trained in state-of-the-art workflows and equipped with the ability to collaborate on platforms that are accessible and free. The TDT competition solicits high-quality computational workflows for neglected disease targets, using freely available, open access tools. PMID:23085175
Eckels, Josh; Nathe, Cory; Nelson, Elizabeth K; Shoemaker, Sara G; Nostrand, Elizabeth Van; Yates, Nicole L; Ashley, Vicki C; Harris, Linda J; Bollenbeck, Mark; Fong, Youyi; Tomaras, Georgia D; Piehler, Britt
2013-04-30
Immunoassays that employ multiplexed bead arrays produce high information content per sample. Such assays are now frequently used to evaluate humoral responses in clinical trials. Integrated software is needed for the analysis, quality control, and secure sharing of the high volume of data produced by such multiplexed assays. Software that facilitates data exchange and provides flexibility to perform customized analyses (including multiple curve fits and visualizations of assay performance over time) could increase scientists' capacity to use these immunoassays to evaluate human clinical trials. The HIV Vaccine Trials Network and the Statistical Center for HIV/AIDS Research and Prevention collaborated with LabKey Software to enhance the open source LabKey Server platform to facilitate workflows for multiplexed bead assays. This system now supports the management, analysis, quality control, and secure sharing of data from multiplexed immunoassays that leverage Luminex xMAP® technology. These assays may be custom or kit-based. Newly added features enable labs to: (i) import run data from spreadsheets output by Bio-Plex Manager™ software; (ii) customize data processing, curve fits, and algorithms through scripts written in common languages, such as R; (iii) select script-defined calculation options through a graphical user interface; (iv) collect custom metadata for each titration, analyte, run and batch of runs; (v) calculate dose-response curves for titrations; (vi) interpolate unknown concentrations from curves for titrated standards; (vii) flag run data for exclusion from analysis; (viii) track quality control metrics across runs using Levey-Jennings plots; and (ix) automatically flag outliers based on expected values. Existing system features allow researchers to analyze, integrate, visualize, export and securely share their data, as well as to construct custom user interfaces and workflows. Unlike other tools tailored for Luminex immunoassays, LabKey Server allows labs to customize their Luminex analyses using scripting while still presenting users with a single, graphical interface for processing and analyzing data. The LabKey Server system also stands out among Luminex tools for enabling smooth, secure transfer of data, quality control information, and analyses between collaborators. LabKey Server and its Luminex features are freely available as open source software at http://www.labkey.com under the Apache 2.0 license.
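LabKey runs these calculations as R transform scripts; for consistency with the other examples in this section, the sketch below shows the analogous computation in Python: fitting a four-parameter logistic (4PL) standard curve and interpolating an unknown concentration. The standard-curve numbers are synthetic, not from any real assay.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """4-parameter logistic: a = response at zero dose, d = response at
    infinite dose, c = inflection point (EC50), b = slope."""
    return d + (a - d) / (1.0 + (x / c) ** b)

def inverse_four_pl(y, a, b, c, d):
    """Interpolate concentration from a fitted standard curve."""
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

# Titrated standard: known concentrations and measured fluorescence (MFI).
conc = np.array([0.1, 0.4, 1.6, 6.4, 25.6, 102.4])
mfi = np.array([38, 110, 410, 1300, 2900, 3950])

params, _ = curve_fit(four_pl, conc, mfi, p0=[30, 1.0, 5.0, 4200], maxfev=10000)
unknown_mfi = 900.0
print("estimated concentration:", inverse_four_pl(unknown_mfi, *params))
```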
Computerized Decision Aids for Shared Decision Making in Serious Illness: Systematic Review.
Staszewska, Anna; Zaki, Pearl; Lee, Joon
2017-10-06
Shared decision making (SDM) is important in achieving patient-centered care. SDM tools such as decision aids are intended to inform the patient. When used to assist in decision making between treatments, decision aids have been shown to reduce decisional conflict, increase ease of decision making, and increase modification of previous decisions. The purpose of this systematic review is to assess the impact of computerized decision aids on patient-centered outcomes related to SDM for seriously ill patients. PubMed and Scopus databases were searched to identify randomized controlled trials (RCTs) that assessed the impact of computerized decision aids on patient-centered outcomes and SDM in serious illness. Six RCTs were identified and data were extracted on study population, design, and results. Risk of bias was assessed by a modified Cochrane Risk of Bias Tool for Quality Assessment of Randomized Controlled Trials. Six RCTs tested decision tools in varying serious illnesses. Three studies compared different computerized decision aids against each other and a control. All but one study demonstrated improvement in at least one patient-centered outcome. Computerized decision tools may reduce unnecessary treatment in patients with low disease severity in comparison with informational pamphlets. Additionally, electronic health record (EHR) portals may provide the opportunity to manage care from the home for individuals affected by illness. The quality of decision aids is of great importance. Furthermore, satisfaction with the use of tools is associated with increased patient satisfaction and reduced decisional conflict. Finally, patients may benefit from computerized decision tools without the need for increased physician involvement. Most computerized decision aids improved at least one patient-centered outcome. All RCTs identified were at a High Risk of Bias or Unclear Risk of Bias. Effort should be made to improve the quality of RCTs testing SDM aids in serious illness. ©Anna Staszewska, Pearl Zaki, Joon Lee. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 06.10.2017.
NASA Astrophysics Data System (ADS)
Mueller, Daniel L.
1994-03-01
Xerox virtually created the plain paper copier industry; it enjoyed unparalleled growth, and its name became synonymous with copying. However, competition in the 1970s aggressively attacked this attractive growth market and took away market share. An evaluation of the competition told Xerox that its competitors were selling products for what it cost Xerox to make them, that their quality was better, and that their goal was to capture all of Xerox's market share. The fundamental precept that Xerox pursued to meet this competitive threat and recapture market share was the recognition that long-term success is dependent upon total mastery of quality, especially in manufacturing. In turning this precept into reality, Xerox Manufacturing made dramatic improvements in all of its processes and practices, focusing on quality as defined by the customer. Actions to accomplish this result included training all people in basic statistical tools and their applications, the use of employee involvement teams, and continuous quality improvement techniques. These and other actions were successful not only in enabling Xerox to turn the competitive threat and recover market share, but also in winning the Malcolm Baldrige Award for Quality in 1989.
Internal Quality Assurance Benchmarking. ENQA Workshop Report 20
ERIC Educational Resources Information Center
Blackstock, Douglas; Burquel, Nadine; Comet, Nuria; Kajaste, Matti; dos Santos, Sergio Machado; Marcos, Sandra; Moser, Marion; Ponds, Henri; Scheuthle, Harald; Sixto, Luis Carlos Velon
2012-01-01
The Internal Quality Assurance group of ENQA (IQA Group) has been organising a yearly seminar for its members since 2007. The main objective is to share experiences concerning the internal quality assurance of work processes in the participating agencies. The overarching theme of the 2011 seminar was how to use benchmarking as a tool for…
Towards a collaborative, global infrastructure for biodiversity assessment
Guralnick, Robert P; Hill, Andrew W; Lane, Meredith
2007-01-01
Biodiversity data are rapidly becoming available over the Internet in common formats that promote sharing and exchange. Currently, these data are somewhat problematic, primarily with regard to geographic and taxonomic accuracy, for use in ecological research, natural resources management and conservation decision-making. However, web-based georeferencing tools that utilize best practices and gazetteer databases can be employed to improve geographic data. Taxonomic data quality can be improved through web-enabled valid taxon names databases and services, as well as more efficient mechanisms to return systematic research results and taxonomic misidentification rates back to the biodiversity community. Both of these are under construction. A separate but related challenge will be developing web-based visualization and analysis tools for tracking biodiversity change. Our aim was to discuss how such tools, combined with data of enhanced quality, will help transform today's portals to raw biodiversity data into nexuses of collaborative creation and sharing of biodiversity knowledge. PMID:17594421
Methods and Frequency of Sharing of Learning Resources by Medical Students
ERIC Educational Resources Information Center
Judd, Terry; Elliott, Kristine
2017-01-01
University students have ready access to quality learning resources through learning management systems (LMS), online library collections and generic search tools. However, anecdotal evidence suggests they sometimes turn to peer-based sharing rather than sourcing resources directly. We know little about this practice--how common it is, what sort…
Sadigh, Gelareh; Carlos, Ruth C; Krupinski, Elizabeth A; Meltzer, Carolyn C; Duszak, Richard
2017-11-01
The purpose of this article is to review the literature on communicating transparency in health care pricing, both overall and specifically for medical imaging. Focus is also placed on the imperatives and initiatives that will increasingly impact radiologists and their patients. Most Americans seek transparency in health care pricing, yet such discussions occur in fewer than half of patient encounters. Although price transparency tools can help decrease health care spending, most are used infrequently and most lack information about quality. Given the high costs associated with many imaging services, radiologists should be aware of such initiatives to optimize patient engagement and informed shared decision making.
Kunneman, Marleen; Branda, Megan E; Noseworthy, Peter A; Linzer, Mark; Burnett, Bruce; Dick, Sara; Spencer-Bonilla, Gabriela; Fernandez, Cara A; Gorr, Haeshik; Wambua, Mike; Keune, Shelly; Zeballos-Palacios, Claudia; Hargraves, Ian; Shah, Nilay D; Montori, Victor M
2017-09-29
Nonvalvular atrial fibrillation (AF) is a common ongoing health problem that places patients at risk of stroke. Whether and how a patient addresses this risk depends on each patient's goals, context, and values. Consequently, leading cardiovascular societies recommend using shared decision making (SDM) to individualize antithrombotic treatment in patients with AF. The aim of this study is to assess the extent to which the ANTICOAGULATION CHOICE conversation tool promotes high-quality SDM and influences anticoagulation uptake and adherence in patients with AF at risk of stroke. This study protocol describes a multicenter, encounter-level, randomized trial to assess the effect of using the ANTICOAGULATION CHOICE conversation tool in the clinical encounter, compared to usual care. The participating centers include an academic hospital system, a suburban community group practice, and an urban safety net hospital, all in Minnesota, USA. Patients with ongoing nonvalvular AF at risk of stroke (CHA2DS2-VASc score ≥1 in men, or ≥2 in women) will be eligible for participation. We aim to include 999 patients and their clinicians. The primary outcome is the quality of SDM as perceived by participants, and as assessed by a post-encounter survey that ascertains (a) knowledge transfer, (b) concordance of the decision made, (c) quality of communication, and (d) satisfaction with the decision-making process. Recordings of encounters will be reviewed to assess the extent of patient involvement and how participants use the tool (fidelity). Anticoagulant use, choice of agent, and adherence will be drawn from patients' medical and pharmacy records. Strokes and bleeding events will be drawn from patient records. This study will provide a valid and precise measure of the effect of the ANTICOAGULATION CHOICE conversation tool on SDM quality and processes, and on the treatment choices and adherence to therapy among AF patients at risk of stroke. ClinicalTrials.gov, NCT02905032. Registered on 9 September 2016.
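The eligibility criterion above uses the standard CHA2DS2-VASc stroke-risk score, which is a simple additive calculation. A minimal sketch, with the trial's sex-specific thresholds applied:

```python
def cha2ds2_vasc(age, female, chf, hypertension, diabetes,
                 stroke_or_tia, vascular_disease):
    """CHA2DS2-VASc stroke-risk score for nonvalvular AF:
    Congestive heart failure (1), Hypertension (1), Age >=75 (2),
    Diabetes (1), prior Stroke/TIA (2), Vascular disease (1),
    Age 65-74 (1), Sex category female (1)."""
    score = 0
    score += 2 if age >= 75 else (1 if age >= 65 else 0)
    score += 1 if female else 0
    score += 1 if chf else 0
    score += 1 if hypertension else 0
    score += 1 if diabetes else 0
    score += 2 if stroke_or_tia else 0
    score += 1 if vascular_disease else 0
    return score

def eligible(age, female, **risk_factors):
    """Trial eligibility per the protocol: score >=1 in men, >=2 in women."""
    threshold = 2 if female else 1
    return cha2ds2_vasc(age, female, **risk_factors) >= threshold

# A 70-year-old man with hypertension scores 2 (age 65-74 + hypertension).
print(eligible(70, False, chf=False, hypertension=True, diabetes=False,
               stroke_or_tia=False, vascular_disease=False))   # True
```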
Lean management: innovative tools for engaging teams in continuous quality improvement.
Perreault, Lucille; Vaillancourt, Lise; Filion, Catherine; Hadj, Camélia
2014-01-01
Lean management has proven to be a sustainable method to ensure a high level of patient care through innovation and teamwork. It involves a set of six tools that allow for visual management shared among team members. The team focuses their efforts on the improvement of organizational indicators in a standardized and engaging way, resulting in the sustainability of improvements. This article outlines the program's rollout at Montfort Hospital (l'Hôpital Montfort). In only a few months, two pilot units accomplished close to 50 improvements each. In addition, the organizational employee satisfaction questionnaire showed very positive results. Copyright © 2014 Longwoods Publishing.
Implementation of Cyberinfrastructure and Data Management Workflow for a Large-Scale Sensor Network
NASA Astrophysics Data System (ADS)
Jones, A. S.; Horsburgh, J. S.
2014-12-01
Monitoring with in situ environmental sensors and other forms of field-based observation presents many challenges for data management, particularly for large-scale networks consisting of multiple sites, sensors, and personnel. The availability and utility of these data in addressing scientific questions relies on effective cyberinfrastructure that facilitates transformation of raw sensor data into functional data products. It also depends on the ability of researchers to share and access the data in useable formats. In addition to addressing the challenges presented by the quantity of data, monitoring networks need practices to ensure high data quality, including procedures and tools for post processing. Data quality is further enhanced if practitioners are able to track equipment, deployments, calibrations, and other events related to site maintenance and associate these details with observational data. In this presentation we will describe the overall workflow that we have developed for research groups and sites conducting long term monitoring using in situ sensors. Features of the workflow include: software tools to automate the transfer of data from field sites to databases, a Python-based program for data quality control post-processing, a web-based application for online discovery and visualization of data, and a data model and web interface for managing physical infrastructure. By automating the data management workflow, the time from collection to analysis is reduced and sharing and publication is facilitated. The incorporation of metadata standards and descriptions and the use of open-source tools enhances the sustainability and reusability of the data. We will describe the workflow and tools that we have developed in the context of the iUTAH (innovative Urban Transitions and Aridregion Hydrosustainability) monitoring network. The iUTAH network consists of aquatic and climate sensors deployed in three watersheds to monitor Gradients Along Mountain to Urban Transitions (GAMUT). The variety of environmental sensors and the multi-watershed, multi-institutional nature of the network necessitate a well-planned and efficient workflow for acquiring, managing, and sharing sensor data, which should be useful for similar large-scale and long-term networks.
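As a concrete example of the Python-based quality control post-processing mentioned above, the sketch below flags spikes against a rolling median. The window size and threshold are tuning placeholders, not the iUTAH program's actual settings.

```python
import numpy as np
import pandas as pd

def flag_spikes(series, window=9, n_mads=5.0):
    """Flag points far from a rolling median, a simple spike test.
    window and n_mads are placeholders to be tuned per variable."""
    med = series.rolling(window, center=True, min_periods=1).median()
    resid = (series - med).abs()
    mad = resid.rolling(window, center=True, min_periods=1).median()
    return resid > n_mads * (mad + 1e-9)   # epsilon avoids zero-MAD flat spans

ts = pd.Series(np.sin(np.linspace(0, 10, 200)))
ts.iloc[[50, 120]] += 5.0                  # inject two artificial spikes
spikes = flag_spikes(ts)
print("flagged indices:", list(ts.index[spikes]))
```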
MO-F-211-01: Methods for Completing Practice Quality Improvement (PQI).
Johnson, J; Brown, K; Ibbott, G; Pawlicki, T
2012-06-01
Practice Quality Improvement (PQI) is becoming an expected part of routine practice in healthcare as an approach to provide more efficient, effective, and high-quality care. Additionally, as part of the ABR's Maintenance of Certification (MOC) pathway, medical physicists are now expected to complete a PQI project. This session will describe the history behind and benefits of the ABR's MOC program and provide details of quality improvement methods and how to successfully complete a PQI project. PQI methods include various commonly used engineering and management tools. The Plan-Do-Study-Act (PDSA) cycle will be presented as one project planning and implementation tool. Other PQI analysis instruments such as flowcharts, Pareto charts, process control charts, and fishbone diagrams will also be explained with examples. Cause analysis, solution development and implementation, and post-implementation measurement will be presented. Project identification and definition as well as appropriate measurement tool selection will be offered. Methods to choose key quality metrics (key quality indicators) will also be addressed. Several sample PQI projects and templates available through the AAPM and other organizations will be described. At least three examples of completed PQI projects will be shared. Learning objectives: (1) identify and define a PQI project; (2) identify and select measurement methods/techniques for use with the PQI project; (3) describe examples of completed projects. © 2012 American Association of Physicists in Medicine.
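One of the PQI instruments named above, the process control chart, reduces to a small calculation. A minimal sketch of individuals-chart limits from baseline data (the weekly counts are hypothetical):

```python
import numpy as np

def individuals_chart_limits(x):
    """Control limits for an individuals (X) chart using the average
    moving range; 2.66 = 3 / d2 with d2 = 1.128 for subgroups of 2."""
    x = np.asarray(x, dtype=float)
    center = x.mean()
    mr_bar = np.abs(np.diff(x)).mean()   # average moving range
    return center - 2.66 * mr_bar, center, center + 2.66 * mr_bar

# e.g., weekly counts of treatment-plan rework before a PQI change
baseline = [4, 6, 5, 7, 4, 5, 6, 8, 5, 4]
lcl, center, ucl = individuals_chart_limits(baseline)
print(f"LCL={lcl:.2f}  center={center:.2f}  UCL={ucl:.2f}")

new_week = 11
print("special-cause signal:", not lcl <= new_week <= ucl)   # True
```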
Anesthesiology leadership rounding: identifying opportunities for improvement.
Gravenstein, Dietrich; Ford, Susan; Enneking, F Kayser
2012-01-01
Rounding that includes participation of individuals with authority to implement changes has been advocated as important to the transformation of an institution into a high-quality and safe organization. We describe a Department of Anesthesiology's experience with leadership rounding. The Department Chair or other senior faculty designate, a quality coordinator, up to four residents, the ward charge nurse, and patient nurses participated in rounds at bedsides. During a 23-month period, 14 significant opportunities to improve care were identified. Nurses identified 5 of these opportunities, primary team physicians 2, the rounding team 4, and patients or their family members another 3. The anesthesiology service had sole or shared responsibility for 10 improvements. A variety of organizations track specific measures across all phases of the patient experience to gauge quality of care. Chart auditing tools for detecting threats to safety are often used. These measures and tools missed opportunities for improvement that were discovered only through rounding. We conclude that the introduction of leadership rounding by an anesthesiology service can identify opportunities for improving quality that are not captured by conventional efforts.
Achieving total quality through intelligence.
Fuld, L M
1992-02-01
American firms want 'total quality'. The time and money spent by U.S. companies attempting to qualify for the coveted Baldrige Award exemplify corporate America's desire to achieve new quality standards. Corporate intelligence and 'total quality' are inextricably linked. In this article, the authors demonstrate how shared and properly used information can be a powerful tool for elevating quality standards, and how corporate intelligence programmes can provide the information links vital for success in attaining the highest standards of quality.
Innovative Quality-Assurance Strategies for Tuberculosis Surveillance in the United States
Manangan, Lilia Ponce; Tryon, Cheryl; Magee, Elvin; Miramontes, Roque
2012-01-01
Introduction. The Centers for Disease Control and Prevention's (CDC) National Tuberculosis Surveillance System (NTSS) is the national repository of tuberculosis (TB) data in the United States. Jurisdictions report to NTSS through the Report of Verified Case of Tuberculosis (RVCT) form, which transitioned to a web-based system in 2009. Materials and Methods. To improve RVCT data quality, CDC conducted a quality assurance (QA) needs assessment to develop QA strategies. These include QA components (case detection, data accuracy, completeness, timeliness, data security, and confidentiality); sample tools such as the National TB Indicators Project (NTIP) to identify TB case reporting discrepancies; a comprehensive training course; and a resource guide and toolkit. Results and Discussion. During July–September 2011, 73 staff from 34 (57%) of 60 reporting jurisdictions participated in QA training. Participants noted the usefulness of sharing jurisdictions' QA methods; 66 (93%) wrote that the QA tools will be effective for their activities. Several jurisdictions reported implementation of QA tools pertinent to their programs. Data showed a >8% increase in NTSS and NTIP enrollment through Secure Access Management Services, which monitors system usage, from August 2011 to February 2012. Conclusions. Despite challenges imposed by web-based surveillance systems, QA strategies can be developed with innovation and collaboration. These strategies can also be used by other disease programs to ensure high data quality. PMID:22685648
Nemes, Szilard; Rolfson, Ola; Garellick, Göran
2018-02-01
Clinicians considering improvements in health-related quality of life (HRQoL) after total hip replacement (THR) must account for multiple pieces of information. Evidence-based decisions are important to best assess the effect of THR on HRQoL. This work aims at constructing a shared decision-making tool that helps clinicians assess the future benefits of THR by offering predictions of 1-year postoperative HRQoL for THR patients. We used data from the Swedish Hip Arthroplasty Register. Data from 2008 were used as the training set and data from 2009 to 2012 as the validation set. We adopted two approaches. First, we assumed a continuous distribution for the EQ-5D index and modelled the postoperative EQ-5D index with regression models. Second, we modelled the five dimensions of the EQ-5D and weighted the predictions together using the UK Time Trade-Off value set. As predictors, we used the preoperative EQ-5D dimensions and EQ-5D index, EQ visual analogue scale, visual analogue scale pain, Charnley classification, age, gender, body mass index, American Society of Anesthesiologists (ASA) classification, surgical approach and prosthesis type. Additionally, the tested algorithms were combined into a single predictive tool by stacking. The best predictive power was obtained by multivariate adaptive regression splines (R² = 0.158). However, this was not significantly better than the predictive power of linear regression (R² = 0.157). The stacked model had a predictive power of 17%. Successful implementation of a shared decision-making tool that can aid clinicians and patients in understanding expected improvement in HRQoL following THR would require higher predictive power than we achieved. For a shared decision-making tool to succeed, further variables, such as socioeconomics, need to be considered. © 2016 John Wiley & Sons, Ltd.
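The stacking step described above can be sketched briefly. The example below is a minimal illustration rather than the authors' actual pipeline: it uses scikit-learn's StackingRegressor with ordinary least squares and gradient boosting as base learners (gradient boosting stands in for MARS, which scikit-learn does not provide), and the synthetic predictors are hypothetical placeholders for the registry variables.

```python
# Minimal sketch of the stacking idea: base regressors predict the 1-year
# postoperative EQ-5D index and a meta-model combines their predictions.
# Gradient boosting stands in for MARS; features are hypothetical placeholders.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor, StackingRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
# Hypothetical preoperative predictors: EQ-5D index, VAS pain, age, BMI
X = rng.normal(size=(n, 4))
y = 0.4 * X[:, 0] - 0.2 * X[:, 1] + 0.05 * X[:, 2] + rng.normal(scale=0.8, size=n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

stack = StackingRegressor(
    estimators=[
        ("ols", LinearRegression()),
        ("gbm", GradientBoostingRegressor(random_state=0)),
    ],
    final_estimator=LinearRegression(),  # meta-model fit on out-of-fold predictions
)
stack.fit(X_train, y_train)
print("held-out R^2:", round(r2_score(y_test, stack.predict(X_test)), 3))
```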
Choo, Esther K; Ranney, Megan L; Chan, Teresa M; Trueger, N Seth; Walsh, Amy E; Tegtmeyer, Ken; McNamara, Shannon O; Choi, Ricky Y; Carroll, Christopher L
2015-05-01
Twitter is a tool for physicians to increase engagement of learners and the public, share scientific information, crowdsource new ideas, conduct, discuss and challenge emerging research, pursue professional development and continuing medical education, expand networks around specialized topics and provide moral support to colleagues. However, new users or skeptics may well be wary of its potential pitfalls. The aims of this commentary are to discuss the potential advantages of the Twitter platform for dialogue among physicians, to explore the barriers to accurate and high-quality healthcare discourse and, finally, to recommend potential safeguards physicians may employ against these threats in order to participate productively.
The State of Cloud-Based Biospecimen and Biobank Data Management Tools.
Paul, Shonali; Gade, Aditi; Mallipeddi, Sumani
2017-04-01
Biobanks are critical for collecting and managing high-quality biospecimens from donors with appropriate clinical annotation. High-quality human biospecimens and associated data are required to better understand disease processes, so biobanks have become an essential resource for healthcare research and drug discovery. However, collecting and managing huge volumes of data (biospecimens and associated clinical data) necessitates that biobanks use data management solutions that can keep pace with the ever-changing requirements of research. To automate biobank data management, biobanks have been investing in traditional Laboratory Information Management Systems (LIMS). However, biobanks face myriad challenges in acquiring traditional LIMS, which are cost-intensive and often lack the flexibility to accommodate changes in data sources and workflows. Cloud technology is emerging as an alternative that gives small and medium-sized biobanks the opportunity to automate their operations in a cost-effective manner, even without IT personnel. Cloud-based solutions offer heightened security, rapid scalability, and dynamic allocation of services, and they can facilitate collaboration between different research groups through a shared environment on a "pay-as-you-go" basis. These benefits have resulted in the development of cloud-based data management solutions as an alternative to traditional on-premise software, and, after evaluating the advantages, several biobanks have started adopting cloud-based tools. Such tools provide biobanks with easy access to biospecimen data for real-time sharing with clinicians. Another major benefit biobanks realize from cloud-based applications is unlimited data storage with automatic backups that protect against data loss in the face of natural calamities.
ERIC Educational Resources Information Center
Maulana, Ridwan; Helms-Lorenz, Michelle
2016-01-01
Observations and student perceptions are recognised as important tools for examining teaching behaviour, but little is known about whether both perspectives share similar construct representations and how both perspectives link with student academic outcomes. The present study compared the construct representation of preservice teachers' teaching…
Margenthaler, Julie A; Ollila, David W
2016-10-01
Although breast-conserving therapy is considered the preferred treatment for the majority of women with early-stage breast cancer, mastectomy rates in this group remain high. The patient, physician, and systems factors contributing to a decision for mastectomy are complicated. Understanding the individual patient's values and goals when making this decision is paramount to providing a shared decision-making process that will yield the desired outcome. The cornerstones of this discussion include education of the patient, access to decision-aid tools, and time to make an informed decision. However, it is also paramount for the physician to understand that a significant majority of women with an informed and complete understanding of their surgical choices will still prefer mastectomy. The rates of breast conservation versus mastectomy should not be considered a quality measure alone. Rather, the extent by which patients are informed, involved in decision-making, and undergoing treatments that reflect their goals is the true test of quality. Here we explore some of the factors that impact the patient preference for breast conservation versus mastectomy and how shared decision-making can be maximized for patient satisfaction.
Word Travelers: Using Digital Tools to Explore Vocabulary and Develop Independent Learners
ERIC Educational Resources Information Center
Tysseling, Lee Ann
2012-01-01
The Internet is full of tools for vocabulary development, but the quality and usefulness for teachers and students vary greatly. With a traditionalist's respect for word knowledge and an adventurer's spirit for discovering new routes to learning, Lee Ann Tysseling shares an exciting array of technology-assisted resources that can boost students'…
Quality assessment tools add value.
Paul, L
1996-10-01
The rapid evolution of the health care marketplace can be expected to continue as we move closer to the 21st Century. Externally-imposed pressures for cost reduction will increasingly be accompanied by pressure within health care organizations as risk-sharing reimbursement arrangements become more commonplace. Competitive advantage will be available to those organizations that can demonstrate objective value as defined by the cost-quality equation. The tools an organization chooses to perform quality assessment will be an important factor in its ability to demonstrate such value. Traditional quality assurance will in all likelihood continue, but the extent to which quality improvement activities are adopted by the culture of an organization may determine its ability to provide objective evidence of better health status outcomes.
Degrassi, Flori; Sopranzi, Cristina; Leto, Antonella; Amato, Simona; D'Urso, Antonio
2009-01-01
Managing quality in health care whilst ensuring equity is a fundamental aspect of the provision of services by healthcare organizations. Measuring perceived quality of care is an important tool for evaluating the quality of healthcare delivery in that it allows the implementation of corrective actions to meet the healthcare needs of patients. The Rome B (ASL RMB) local health authority adopted the UNI EN 10006:2006 norms as a management tool, thereby introducing the evaluation of customer satisfaction as an opportunity to involve users in the creation of quality healthcare services with and for the citizens. This paper presents the activities implemented and the results achieved with regard to shared and integrated continuous improvement of services.
Indoor Air Quality Tribal Partners Program
IAQ Tribal Partners Program. Empowering champions of healthy IAQ in tribal communities with tools for networking, for sharing innovative and promising programs and practices, and with a reservoir of the best available tribal-specific IAQ information and materials.
Collaborative Manufacturing for Small-Medium Enterprises
NASA Astrophysics Data System (ADS)
Irianto, D.
2016-02-01
Manufacturing systems involve decisions concerning production processes, capacity, planning, and control. In make-to-order (MTO) manufacturing systems, strategic decisions concerning fulfilment of customer requirements, manufacturing cost, and due date of delivery are the most important. In order to accelerate the decision-making process, research is required on the structure of decision making when receiving orders and sequencing activities under limited capacity. An effective decision-making process is typically required by small-medium component and tool makers acting as supporting industries to large industries. On one side, metal small-medium enterprises are expected to produce parts, components or tools (i.e. jigs, fixtures, molds, and dies) with high precision, low cost, and exact delivery time. On the other side, a metal small-medium enterprise may have a weak bargaining position due to aspects such as low production capacity, a limited budget for material procurement, and limited high-precision machines and equipment. Instead of receiving orders exclusively, a small-medium enterprise can collaborate with other small-medium enterprises in order to fulfill the requirements of high quality, low manufacturing cost, and just-in-time delivery. Small-medium enterprises can share their best capabilities to form effective supporting industries. An independent body, such as a community service at a university, can take the role of collaboration manager. The Laboratory of Production Systems at Bandung Institute of Technology has implemented shared manufacturing systems for small-medium enterprise collaboration.
Evaluation of the Iconic Pain Assessment Tool by a heterogeneous group of people in pain
Lalloo, Chitra; Henry, James L
2011-01-01
The Iconic Pain Assessment Tool (IPAT) is a novel web-based instrument for the self-report of pain quality, intensity and location in the form of a permanent diary. Originally designed for people with central poststroke pain, the tool is being adapted for a larger, more diverse patient population. The present study aimed to collect evaluative feedback on the IPAT from a heterogeneous sample of individuals with chronic pain. The specific study aims were to evaluate participant comfort with the tool, including enjoyment, ease of use and comfort with the electronic medium; to assess the perceived value of the tool for communicating pain quality, intensity and location; to gauge participant intent to share their pain diaries with others and to use the tool on a regular basis to track their pain over time; to assess the perceived descriptiveness of the current IPAT icons and the numerical rating scale; and to identify strengths and weaknesses of the tool to refine the existing prototype. Written and verbal feedback from individuals with a variety of chronic pain conditions (n=23) was collected in the context of these objectives. Overall, the IPAT was positively endorsed by this heterogeneous sample of people in pain. The authors concluded that the IPAT is a user-friendly instrument that has the potential to help people express, document and share their personal experience with chronic pain. PMID:21369536
Schroy, Paul C; Mylvaganam, Shamini; Davidson, Peter
2014-02-01
Decision aids for colorectal cancer (CRC) screening have been shown to enable patients to identify a preferred screening option, but the extent to which such tools facilitate shared decision making (SDM) from the perspective of the provider is less well established. Our goal was to elicit provider feedback regarding the impact of a CRC screening decision aid on SDM in the primary care setting. We conducted a cross-sectional survey of primary care providers participating in a clinical trial evaluating the impact of a novel CRC screening decision aid on SDM and adherence, assessing perceptions of the impact of the tool on decision making and implementation issues. Twenty-nine of 42 (71%) eligible providers responded, including 27 internists and two nurse practitioners. The majority (>60%) felt that use of the tool complemented their usual approach, increased patient knowledge, helped patients identify a preferred screening option, improved the quality of decision making, saved time and increased patients' desire to get screened. Respondents were more neutral in their assessment of whether the tool improved the overall quality of the patient visit or patient satisfaction. Fewer than 50% felt that the tool would be easy to implement in their practices or that it would be widely used by their colleagues. Decision aids for CRC screening can improve the quality and efficiency of SDM from the provider perspective, but future use is likely to depend on the extent to which barriers to implementation can be addressed. © 2011 John Wiley & Sons Ltd.
Gonzàlez, J; Gispert, M; Gil, M; Hviid, M; Dourmad, J Y; de Greef, K H; Zimmer, C; Fàbrega, E
2014-12-01
A market conformity tool, based on technological meat quality parameters, was developed within the Q-PorkChains project, to be included in a global sustainability evaluation of pig farming systems. The specific objective of the market conformity tool was to define a scoring system based on the suitability of meat for elaborating the main pork products, according to their market shares based on industry requirements, in different pig farming systems. The tool was based on carcass and meat quality parameters that are commonly used for the assessment of technological quality, provide representative and repeatable data, and are easily measurable: cold carcass weight; lean meat percentage; minimum subcutaneous backfat depth at the m. gluteus medius level; pH at 45 min postmortem and ultimate pH (measured at 24 h postmortem) in m. longissimus lumborum and m. semimembranosus; meat colour; drip losses; and intramuscular fat content in a m. longissimus sample. Five categories of pork products produced at large scale in Europe were considered in the study: fresh meat, cooked products, dry products, specialties and other meat products. For each of the studied farming systems, the technological meat quality requirements, as well as the market shares for each product category within the farming system, were obtained from the literature and personal communications from experts. The tool produced an overall conformity score that discriminated among systems according to how well the achieved carcass and meat quality matched the requirements of the targeted market. To improve feasibility, the tool was simplified by selecting ultimate pH in m. longissimus or semimembranosus, minimum fat thickness measured on the left half carcass over the m. gluteus medius, and intramuscular fat content in a m. longissimus sample as iceberg indicators. The overall suitability scores calculated using the complete and the reduced tools were well correlated and gave similar results. The tool can be considered robust enough to discriminate among different systems, since it was tested across a wide range of them. It can also be used to detect improvement opportunities to enhance the sustainability of pig farming systems. The final objective of the study was achieved, since the market suitability tool could be used in an integrated sustainability analysis of pig farming systems.
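The scoring logic described, matching measured carcass and meat parameters against each product category's requirements and weighting by market share, can be sketched as follows. All acceptance ranges, market shares, and indicator names below are invented placeholders, not the paper's values.

```python
# Illustrative sketch of a market-conformity score: each product category has
# acceptance ranges for a few "iceberg" indicators (ultimate pH, backfat depth,
# intramuscular fat), and category scores are weighted by market share.
# All ranges and shares below are invented placeholders, not the paper's values.

REQUIREMENTS = {
    "fresh meat":   {"ph_u": (5.5, 5.8), "backfat_mm": (10, 20), "imf_pct": (1.5, 3.0)},
    "dry products": {"ph_u": (5.6, 6.0), "backfat_mm": (15, 30), "imf_pct": (2.5, 5.0)},
}
MARKET_SHARES = {"fresh meat": 0.6, "dry products": 0.4}

def category_score(sample: dict, reqs: dict) -> float:
    """Fraction of indicators falling inside the category's acceptance range."""
    hits = sum(lo <= sample[k] <= hi for k, (lo, hi) in reqs.items())
    return hits / len(reqs)

def conformity_score(sample: dict) -> float:
    """Market-share-weighted suitability of one carcass across product categories."""
    return sum(
        MARKET_SHARES[cat] * category_score(sample, reqs)
        for cat, reqs in REQUIREMENTS.items()
    )

carcass = {"ph_u": 5.7, "backfat_mm": 14, "imf_pct": 2.1}
print(f"overall conformity: {conformity_score(carcass):.2f}")
```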
Seeland, Ute; Nauman, Ahmad T; Cornelis, Alissa; Ludwig, Sabine; Dunkel, Mathias; Kararigas, Georgios; Regitz-Zagrosek, Vera
2016-01-01
Sex and Gender Medicine is a novel discipline that provides equitable medical care for society and improves outcomes for both male and female patients. The integration of sex- and gender-specific knowledge into medical curricula is limited by the lack of adequate learning material, systematic teacher training and an innovative communication strategy. We aimed at initiating an e-learning and knowledge-sharing platform for Sex and Gender Medicine, the eGender platform (http://egender.charite.de), to ensure that future doctors and health professionals will have adequate knowledge and communication skills on sex and gender differences in order to make informed decisions for their patients. The web-based eGender knowledge-sharing platform was designed to support the blended-learning pedagogical teaching concept and follows the didactic concept of constructivism. Learning materials developed by Sex and Gender Medicine experts at seven universities were used as the basis for the new learning tools. The content of these tools is patient-centered and provides add-on information on gender-sensitive aspects of diseases. The structural part of eGender was designed and developed using the open-source e-learning platform Moodle. The eGender platform comprises an English and a German version of e-learning modules: one focusing on basic knowledge and seven on specific medical disciplines. Each module consists of several courses corresponding to a disease or symptom complex. Self-organized learning has to be managed by using different learning tools, e.g., texts and audiovisual material, tools for online communication and collaborative work. More than 90 users from Europe registered for the eGender Medicine learning modules. The most frequently accessed module was "Gender Medicine-Basics" and the users favored discussion forums. These e-learning modules fulfill the quality criteria for higher education and are used within the elective Master Module "Gender Medicine-Basics" implemented in the accredited Master of Public Health at Charité-Berlin. The eGender platform is a flexible and user-friendly electronic knowledge-sharing platform providing evidence-based, high-quality learning material to a growing number of registered users. The eGender Medicine learning modules could be key to reforming medical curricula to integrate Sex and Gender Medicine into the education of health professionals.
Grudzen, Corita R; Anderson, Jana R; Carpenter, Christopher R; Hess, Erik P
2016-12-01
Shared decision making in emergency medicine has the potential to improve the quality, safety, and outcomes of emergency department (ED) patients. Given that the ED is the gateway to care for patients with a variety of illnesses and injuries and the safety net for patients otherwise unable to access care, shared decision making in the ED is relevant to numerous disciplines and the interests of the United States (U.S.) public. On May 10, 2016, the 16th annual Academic Emergency Medicine (AEM) consensus conference, "Shared Decision Making: Development of a Policy-Relevant Patient-Centered Research Agenda," was held in New Orleans, Louisiana. During this one-day conference, clinicians, researchers, policy-makers, patient and caregiver representatives, funding agency representatives, trainees, and content experts across many areas of medicine interacted to define high-priority areas for research in 1 of 6 domains: 1) diagnostic testing; 2) policy; 3) dissemination/implementation and education; 4) development and testing of shared decision making approaches and tools in practice; 5) palliative care and geriatrics; and 6) vulnerable populations and limited health literacy. This manuscript describes the current state of shared decision making in the ED, provides an overview of the conference planning process, the aims of the conference, the focus of each respective breakout session, and the roles of patient and caregiver representatives, and gives an overview of the conference agenda. The results of this conference published in this issue of AEM provide an essential summary of the future research priorities for shared decision making to increase quality of care and patient-centered outcomes. © 2016 by the Society for Academic Emergency Medicine.
Christley, Scott; Scarborough, Walter; Salinas, Eddie; Rounds, William H; Toby, Inimary T; Fonner, John M; Levin, Mikhail K; Kim, Min; Mock, Stephen A; Jordan, Christopher; Ostmeyer, Jared; Buntzman, Adam; Rubelt, Florian; Davila, Marco L; Monson, Nancy L; Scheuermann, Richard H; Cowell, Lindsay G
2018-01-01
Recent technological advances in immune repertoire sequencing have created tremendous potential for advancing our understanding of adaptive immune response dynamics in various states of health and disease. Immune repertoire sequencing produces large, highly complex data sets, however, which require specialized methods and software tools for their effective analysis and interpretation. VDJServer is a cloud-based analysis portal for immune repertoire sequence data that provides access to a suite of tools for a complete analysis workflow, including modules for preprocessing and quality control of sequence reads, V(D)J gene segment assignment, repertoire characterization, and repertoire comparison. VDJServer also provides sophisticated visualizations for exploratory analysis. It is accessible through a standard web browser via a graphical user interface designed for use by immunologists, clinicians, and bioinformatics researchers. VDJServer provides a data commons for public sharing of repertoire sequencing data, as well as private sharing of data between users. We describe the main functionality and architecture of VDJServer and demonstrate its capabilities with use cases from cancer immunology and autoimmunity. VDJServer provides a complete analysis suite for human and mouse T-cell and B-cell receptor repertoire sequencing data. The combination of its user-friendly interface and high-performance computing allows large immune repertoire sequencing projects to be analyzed with no programming or software installation required. VDJServer is a web-accessible cloud platform that provides access through a graphical user interface to a data management infrastructure, a collection of analysis tools covering all steps in an analysis, and an infrastructure for sharing data along with workflows, results, and computational provenance. VDJServer is a free, publicly available, and open-source licensed resource.
Elwyn, Glyn; Burstin, Helen; Barry, Michael J; Corry, Maureen P; Durand, Marie Anne; Lessler, Daniel; Saigal, Christopher
2018-04-27
Efforts to implement the use of patient decision aids to stimulate shared decision making are gaining prominence. Patient decision aids have been designed to help patients participate in making specific choices among health care options. Because these tools clearly influence decisions, poor-quality, inaccurate, unbalanced, or misleading tools are a risk to patients. As payer interest in these tools increases, so does the risk that patients will be harmed by the use of tools that are described as patient decision aids yet fail to meet established standards. To address this problem, the National Quality Forum (NQF) in the USA convened a multi-stakeholder expert panel in 2016 to propose national standards for a patient decision aid certification process. In 2017, NQF established an Action Team to foster shared decision making and to call for a national certification process as one recommendation among others to stimulate improvement. A persistent barrier to the setup of a national patient decision aid certification process is the lack of a sustainable financial model to support the work. Copyright © 2018 The Author(s). Published by Elsevier B.V. All rights reserved.
Bitton, Asaf; Ratcliffe, Hannah L; Veillard, Jeremy H; Kress, Daniel H; Barkley, Shannon; Kimball, Meredith; Secci, Federica; Wong, Ethan; Basu, Lopa; Taylor, Chelsea; Bayona, Jaime; Wang, Hong; Lagomarsino, Gina; Hirschhorn, Lisa R
2017-05-01
Primary health care (PHC) has been recognized as a core component of effective health systems since the early part of the twentieth century. However, despite notable progress, there remains a large gap between what individuals and communities need and the quality and effectiveness of care delivered. The Primary Health Care Performance Initiative (PHCPI) was established by an international consortium to catalyze improvements in PHC delivery and outcomes in low- and middle-income countries through better measurement and sharing of effective models and practices. PHCPI has developed a framework to illustrate the relationship between key financing, workforce, and supply inputs and the core primary health care functions of first-contact accessibility, comprehensiveness, coordination, continuity, and person-centeredness. The framework provides guidance for more effective assessment of current strengths and gaps in PHC delivery through a core set of 25 key indicators ("Vital Signs"). Emerging best practices that foster high-performing PHC system development are being codified and shared across low- and high-income countries. These measurement and improvement approaches provide countries and implementers with tools to assess the current state of their PHC delivery system and to identify where cross-country learning can accelerate improvements in PHC quality and effectiveness.
A parallel algorithm for multi-level logic synthesis using the transduction method. M.S. Thesis
NASA Technical Reports Server (NTRS)
Lim, Chieng-Fai
1991-01-01
The Transduction Method has been shown to be a powerful tool in the optimization of multilevel networks. Many tools, such as the SYLON synthesis system (X90), (CM89), (LM90), have been developed based on this method. A parallel implementation of SYLON-XTRANS (XM89) on an eight-processor Encore Multimax shared-memory multiprocessor is presented. It minimizes multilevel networks consisting of simple gates through parallel pruning, gate substitution, gate merging, generalized gate substitution, and gate input reduction. This implementation, called Parallel TRANSduction (PTRANS), also uses partitioning to break up large circuits and performs inter- and intra-partition dynamic load balancing. With this, good speedups and high processor efficiencies are achievable without sacrificing the resulting circuit quality.
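The inter-partition dynamic load balancing described for PTRANS follows a common pattern: split the circuit into partitions and let idle workers pull the next unprocessed partition. A minimal sketch of that pattern follows, with a stub standing in for the actual transduction-method passes; it is an illustration of the scheduling idea, not the thesis implementation.

```python
# Sketch of inter-partition dynamic load balancing: a circuit is split into
# partitions and idle workers pull the next unprocessed partition from a
# shared pool. The "optimization" below is a stub; the real transduction
# passes (pruning, gate substitution, merging) are far more involved.
from concurrent.futures import ProcessPoolExecutor

def optimize_partition(partition: list) -> int:
    """Stand-in for one partition's optimization pass; returns gates removed."""
    # Pretend every third gate in the partition is redundant.
    return len([g for g in partition if g % 3 == 0])

def parallel_optimize(partitions: list, workers: int = 8) -> int:
    removed = 0
    with ProcessPoolExecutor(max_workers=workers) as pool:
        # chunksize=1 hands partitions to workers as they become free, so
        # uneven partition sizes are balanced dynamically across processes.
        for n in pool.map(optimize_partition, partitions, chunksize=1):
            removed += n
    return removed

if __name__ == "__main__":
    # Hypothetical partitions of gate IDs with deliberately uneven sizes.
    parts = [list(range(i * 100, i * 100 + 50 + 30 * (i % 3))) for i in range(16)]
    print("gates removed:", parallel_optimize(parts))
```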
Improving Patient Safety: Improving Communication.
Bittner-Fagan, Heather; Davis, Joshua; Savoy, Margot
2017-12-01
Communication among physicians, staff, and patients is a critical element in patient safety. Effective communication skills can be taught and improved through training and awareness. The practice of family medicine allows for long-term relationships with patients, which affords opportunities for ongoing, high-quality communication. There are many barriers to effective communication, including patient factors, clinician factors, and system factors, but tools and strategies exist to address these barriers, improve communication, and engage patients in their care. Universal precautions for health literacy, appropriate medical interpreters, and shared decision-making are evidence-based tools that improve communication and increase patient safety. Written permission from the American Academy of Family Physicians is required for reproduction of this material in whole or in part in any form or medium.
MiMiR – an integrated platform for microarray data sharing, mining and analysis
Tomlinson, Chris; Thimma, Manjula; Alexandrakis, Stelios; Castillo, Tito; Dennis, Jayne L; Brooks, Anthony; Bradley, Thomas; Turnbull, Carly; Blaveri, Ekaterini; Barton, Geraint; Chiba, Norie; Maratou, Klio; Soutter, Pat; Aitman, Tim; Game, Laurence
2008-01-01
Background Despite considerable efforts within the microarray community for standardising data format, content and description, microarray technologies present major challenges in managing, sharing, analysing and re-using the large amount of data generated locally or internationally. Additionally, it is recognised that inconsistent and low-quality experimental annotation in public data repositories significantly compromises the re-use of microarray data for meta-analysis. MiMiR, the Microarray data Mining Resource, was designed to tackle some of these limitations and challenges. Here we present new software components and enhancements to the original infrastructure that increase accessibility, utility and opportunities for large-scale mining of experimental and clinical data. Results A user-friendly Online Annotation Tool allows researchers to submit detailed experimental information via the web at the time of data generation rather than at the time of publication. This ensures easy access to, and high accuracy of, the meta-data collected. Experiments are programmatically built in the MiMiR database from the submitted information and details are systematically curated and further annotated by a team of trained annotators using a new Curation and Annotation Tool. Clinical information can be annotated and coded with a clinical Data Mapping Tool within an appropriate ethical framework. Users can visualise experimental annotation, assess data quality, and download and share data via a web-based experiment browser called MiMiR Online. All requests to access data in MiMiR are routed through a sophisticated middleware security layer, thereby allowing secure data access and sharing amongst MiMiR registered users prior to publication. Data in MiMiR can be mined and analysed using the integrated EMAAS open-source analysis web portal or via export of data and meta-data into the Rosetta Resolver data analysis package. Conclusion The new MiMiR suite of software enables systematic and effective capture of extensive experimental and clinical information with the highest MIAME score, and secure data sharing prior to publication. MiMiR currently contains more than 150 experiments corresponding to over 3000 hybridisations and supports the Microarray Centre's large microarray user community and two international consortia. The MiMiR flexible and scalable hardware and software architecture enables secure warehousing of thousands of datasets, including clinical studies, from microarray and potentially other -omics technologies. PMID:18801157
Creating a Highly Reliable Neonatal Intensive Care Unit Through Safer Systems of Care.
Panagos, Patoula G; Pearlman, Stephen A
2017-09-01
Neonates requiring intensive care are at high risk for medical errors due to their unique characteristics and high acuity. Designing a safer work environment begins with safe processes. Creating a culture of safety demands the involvement of all organizational levels and an interdisciplinary approach. Adverse events can result from suboptimal communication and lack of a shared mental model. This chapter describes tools to promote better patient safety in the NICU through monitoring adverse events, improving communication and using information technology. Unplanned extubation is an example of a neonatal safety concern that can be reduced by employing quality improvement methodology. Copyright © 2017 Elsevier Inc. All rights reserved.
Hibbard, Judith H; Greene, Jessica; Sofaer, Shoshanna; Firminger, Kirsten; Hirsh, Judith
2012-03-01
Advocates of health reform continue to pursue policies and tools that will make information about comparative costs and resource use available to consumers. Reformers expect that consumers will use the data to choose high-value providers (those who offer higher quality and lower prices) and thus contribute to the broader goal of controlling national health care spending. However, communicating this information effectively is more challenging than it might first appear. For example, consumers are more interested in the quality of health care than in its cost, and many perceive a low-cost provider to be substandard. In this study of 1,421 employees, we examined how different presentations of information affect the likelihood that consumers will make high-value choices. We found that a substantial minority of the respondents shied away from low-cost providers, and even consumers who pay a larger share of their health care costs themselves were likely to equate high cost with high quality. At the same time, we found that presenting cost data alongside easy-to-interpret quality information and highlighting high-value options improved the likelihood that consumers would choose those options. Reporting strategies that follow such a format will help consumers understand that a doctor who provides higher-quality care than other doctors does not necessarily cost more.
Steele Gray, Carolyn; Wodchis, Walter P; Upshur, Ross; Cott, Cheryl; McKinstry, Brian; Mercer, Stewart; Palen, Ted E; Ramsay, Tim; Thavorn, Kednapa
2016-06-24
Older adults experiencing multiple chronic illnesses are at high risk of hospitalization and health decline if they are unable to manage the significant challenges posed by their health conditions. Goal-oriented care approaches can provide better care for these complex patients, but clinicians find the process of ascertaining goals "too complex and too time-consuming," and goals are often not agreed upon between complex patients and their providers. The electronic patient reported outcomes (ePRO) mobile app and portal offers an innovative approach to creating and monitoring goal-oriented patient-care plans to improve patient self-management and shared decision-making between patients and health care providers. The ePRO tool also supports proactive patient monitoring by the patient, caregiver(s), and health care provider. It was developed with and for older adults with complex care needs as a means to improve their quality of life. Our proposed project will evaluate the use, effectiveness, and value for money of the ePRO tool in a 12-month multicenter, randomized controlled trial in Ontario, targeting individuals 65 or over with two or more chronic conditions that require frequent health care visits. Intervention groups using the ePRO tool will be compared with control groups on measures of quality of life, patient experience, and cost-effectiveness. We will also evaluate the implementation of the tool. The project will be funded through the Canadian Institute for Health Research (CIHR) eHealth Innovation Partnerships Program (eHIPP; CIHR-348362). The expected completion date of the study is November 2019. We anticipate our program of work will support improved quality of life and patient self-management, improved patient-centered primary care delivery, and the adoption of goal-oriented care approaches across primary health care systems. We have partnered with family health teams and quality improvement organizations in Ontario to ensure that our research is practical and that findings are shared widely. We will work with our established international network to develop an implementation framework to support continued adaptation and adoption across Canada and internationally.
Coproducing Aboriginal patient journey mapping tools for improved quality and coordination of care.
Kelly, Janet; Dwyer, Judith; Mackean, Tamara; O'Donnell, Kim; Willis, Eileen
2016-12-08
This paper describes the rationale and process for developing a set of Aboriginal patient journey mapping tools with Aboriginal patients, health professionals, support workers, educators and researchers in the Managing Two Worlds Together project between 2008 and 2015. Aboriginal patients and their families from rural and remote areas, and healthcare providers in urban, rural and remote settings, shared their perceptions of the barriers and enablers to quality care in interviews and focus groups, and individual patient journey case studies were documented. Data were thematically analysed. In the absence of suitable existing tools, a new analytical framework and mapping approach was developed. The utility of the tools in other settings was then tested with health professionals, and the tools were further modified for use in quality improvement in health and education settings in South Australia and the Northern Territory. A central set of patient journey mapping tools with flexible adaptations, a workbook, and five sets of case studies describing how staff adapted and used the tools at different sites are available for wider use.
Creating and sharing clinical decision support content with Web 2.0: Issues and examples.
Wright, Adam; Bates, David W; Middleton, Blackford; Hongsermeier, Tonya; Kashyap, Vipul; Thomas, Sean M; Sittig, Dean F
2009-04-01
Clinical decision support is a powerful tool for improving healthcare quality and patient safety. However, developing a comprehensive package of decision support interventions is costly and difficult. If used well, Web 2.0 methods may make it easier and less costly to develop decision support. Web 2.0 is characterized by online communities, open sharing, interactivity and collaboration. Although most previous attempts at sharing clinical decision support content have worked outside of the Web 2.0 framework, several initiatives are beginning to use Web 2.0 to share and collaborate on decision support content. We present case studies of three efforts: the Clinfowiki, a world-accessible wiki for developing decision support content; Partners Healthcare eRooms, web-based tools for developing decision support within a single organization; and Epic Systems Corporation's Community Library, a repository for sharing decision support content for customers of a single clinical system vendor. We evaluate the potential of Web 2.0 technologies to enable collaborative development and sharing of clinical decision support systems through the lens of these three case studies, analyzing technical, legal and organizational issues for developers, consumers and organizers of clinical decision support content in Web 2.0. We believe the case for Web 2.0 as a tool for collaborating on clinical decision support content appears strong, particularly for collaborative content development within an organization.
ERIC Educational Resources Information Center
DeBruin-Parecki, Andrea
2007-01-01
Everyone knows how important it is to read to young children--but it is the "quality" of shared reading that really affects emergent literacy. How well are adults engaging and teaching children as they read together? How well are children listening and responding? The first and only tool to measure the quality of adult and child interactions…
NASA Astrophysics Data System (ADS)
Gries, C.; Read, J. S.; Winslow, L. A.; Hanson, P. C.; Weathers, K. C.
2014-12-01
The Global Lake Ecological Observatory Network (GLEON) is an international community of scientists, educators and citizens with the mission to conduct innovative science by sharing and interpreting high-resolution sensor data to understand, predict and communicate the role and response of lakes in a changing global environment. During its almost ten years of existence and continual growth, GLEON has inspired innovative science and new modeling approaches, and has accumulated extensive experience in the management of streaming, high-resolution, large-volume data. However, a recent 'data task force' identified inhibiting data infrastructure issues, including providing access to data, discovering distributed data, and integrating data into useful data products for scientific research and management. Accordingly, in support of the complete data lifecycle, tools are being developed by the GLEON community and integrated with innovative technology from other groups to improve environmental observations data management in the broader community. Specifically, we will discuss raw data handling with tools developed by the Consortium of Universities for the Advancement of Hydrologic Sciences (CUAHSI; the Observations Data Model and DataLoader), quality control practices using a newly developed R package (sensorQC), data access with HydroDesktop or web services delivering WaterML, data analysis with the R package rLakeAnalyzer, and final storage of the quality-controlled, harmonized and value-added data product in a DataONE member node. Such a data product is then discoverable, accessible for new analyses, and citable in subsequent publications. Leveraging GLEON's organizational structure, community trust, extensive experience, and technological talent, the goal is to develop a design and implementation plan for a data publishing and sharing system that will address not only GLEON's needs but also those of other environmental research communities.
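The quality control step mentioned (the R package sensorQC) is not detailed in the abstract, but the flavor of such automated checks, a gross range test plus a simple spike test on high-frequency sensor data, can be sketched in a few lines. The thresholds below are invented placeholders, not sensorQC's defaults.

```python
# Illustration of the kind of automated checks a sensor-QC step applies to
# high-frequency lake data: a gross range test and a simple spike test.
# Thresholds are invented placeholders; the actual checks live in the R
# package sensorQC mentioned above.
import numpy as np

def qc_flags(values: np.ndarray, valid_range=(0.0, 40.0), spike_sd=3.0):
    """Return an array of flags: 'ok', 'range', or 'spike' per observation."""
    flags = np.full(values.shape, "ok", dtype=object)
    flags[(values < valid_range[0]) | (values > valid_range[1])] = "range"
    diffs = np.abs(np.diff(values, prepend=values[0]))
    spike_cut = diffs.mean() + spike_sd * diffs.std()
    flags[(diffs > spike_cut) & (flags == "ok")] = "spike"
    return flags

temps = np.array([12.1, 12.3, 12.2, 35.9, 12.4, -5.0, 12.5])  # degrees C, hypothetical
for t, f in zip(temps, qc_flags(temps)):
    print(f"{t:6.1f}  {f}")
```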
Wang, Jin; Patel, Vimal; Burns, Daniel; Laycock, John; Pandya, Kinnari; Tsoi, Jennifer; DeSilva, Binodh; Ma, Mark; Lee, Jean
2013-07-01
Regulated bioanalytical laboratories that run ligand-binding assays in support of biotherapeutics development face ever-increasing demand to support more projects with increased efficiency. Laboratory automation is a tool that has the potential to improve both quality and efficiency in a bioanalytical laboratory. The success of laboratory automation requires thoughtful evaluation of program needs and fit-for-purpose strategies, followed by pragmatic implementation plans and continuous user support. In this article, we present the development of fit-for-purpose automation in both total walk-away and flexible modular modes. We share our experience of sustained vendor collaboration and teamwork to educate users and to promote and track the use of automation. The implementation of laboratory automation improves assay performance, data quality, process efficiency and method transfer to CROs in a regulated bioanalytical laboratory environment.
Stewart, Moira; Thind, Amardeep; Terry, Amanda L; Chevendra, Vijaya; Marshall, J Neil
2009-11-01
Electronic medical records (EMRs) are posited as a tool for improving practice, policy and research in primary healthcare. This paper describes the Deliver Primary Healthcare Information (DELPHI) Project at the Department of Family Medicine at the University of Western Ontario, focusing on its development, current status and research potential in order to share experiences with researchers in similar contexts. The project progressed through four stages: (a) participant recruitment, (b) EMR software modification and implementation, (c) database creation and (d) data quality assessment. Currently, the DELPHI database holds more than two years of high-quality, de-identified data from 10 practices, with 30,000 patients and nearly a quarter of a million encounters.
Social Networking Adapted for Distributed Scientific Collaboration
NASA Technical Reports Server (NTRS)
Karimabadi, Homa
2012-01-01
Sci-Share is a social networking site with novel, specially designed feature sets to enable simultaneous remote collaboration and sharing of large data sets among scientists. The site will include not only the standard features found on popular consumer-oriented social networking sites such as Facebook and Myspace, but also a number of powerful tools that extend its functionality to a science collaboration site. A Virtual Observatory is a promising technology for making data from various missions and instruments accessible through a Web browser. Sci-Share augments services provided by Virtual Observatories by enabling distributed collaboration and sharing of downloaded and/or processed data among scientists. This will, in turn, increase science returns from NASA missions. Sci-Share also enables better utilization of NASA's high-performance computing resources by providing an easy and central mechanism to access and share large files in users' space or those saved on mass storage. The most common means of remote scientific collaboration today remains the trio of e-mail for electronic communication, FTP for file sharing, and personalized Web sites for dissemination of papers and research results. Each of these tools has well-known limitations. Sci-Share transforms the social networking paradigm into a scientific collaboration environment by offering powerful tools for cooperative discourse and digital content sharing. Sci-Share differentiates itself by serving as an online repository for users' digital content with the following unique features: a) sharing of any file type, any size, from anywhere; b) creation of projects and groups for controlled sharing; c) a module for sharing files on HPC (high-performance computing) sites; d) universal accessibility of staged files as embedded links on other sites (e.g. Facebook) and tools (e.g. e-mail); e) drag-and-drop transfer of large files, replacing awkward e-mail attachments (and file size limitations); f) enterprise-level data and messaging encryption; and g) an easy-to-use, intuitive workflow.
Quality improvement and practice-based research in neurology using the electronic medical record
Frigerio, Roberta; Kazmi, Nazia; Meyers, Steven L.; Sefa, Meredith; Walters, Shaun A.; Silverstein, Jonathan C.
2015-01-01
We describe quality improvement and practice-based research using the electronic medical record (EMR) in a community health system–based department of neurology. Our care transformation initiative targets 10 neurologic disorders (brain tumors, epilepsy, migraine, memory disorders, mild traumatic brain injury, multiple sclerosis, neuropathy, Parkinson disease, restless legs syndrome, and stroke) and brain health (risk assessments and interventions to prevent Alzheimer disease and related disorders in targeted populations). Our informatics methods include building and implementing structured clinical documentation support tools in the EMR; electronic data capture; enrollment, data quality, and descriptive reports; quality improvement projects; clinical decision support tools; subgroup-based adaptive assignments and pragmatic trials; and DNA biobanking. We are sharing EMR tools and deidentified data with other departments toward the creation of a Neurology Practice-Based Research Network. We discuss practical points to assist other clinical practices to make quality improvements and practice-based research in neurology using the EMR a reality. PMID:26576324
NASA Astrophysics Data System (ADS)
Aufdenkampe, A. K.; Tarboton, D. G.; Horsburgh, J. S.; Mayorga, E.; McFarland, M.; Robbins, A.; Haag, S.; Shokoufandeh, A.; Evans, B. M.; Arscott, D. B.
2017-12-01
The Model My Watershed Web app (https://app.wikiwatershed.org/) and the BiG CZ Data Portal (http://portal.bigcz.org/) are web applications that share a common codebase and a common goal: to deliver high-performance discovery, visualization and analysis of geospatial data in an intuitive user interface in the web browser. Model My Watershed (MMW) was designed as a decision support system for watershed conservation implementation. The BiG CZ Data Portal was designed to provide context and background data for research sites. Users begin by creating an Area of Interest via an automated watershed delineation tool, a free-draw tool, selection of a predefined area such as a county or USGS Hydrologic Unit (HUC), or upload of a custom polygon. Both web apps visualize and provide summary statistics of land use, soil groups, streams, climate and other geospatial information. MMW then allows users to run a watershed model to simulate different scenarios of human impacts on stormwater runoff and water quality. The BiG CZ Data Portal allows users to search for scientific and monitoring data within the Area of Interest, and it also serves as a prototype for the upcoming Monitor My Watershed web app. Both systems integrate with CUAHSI cyberinfrastructure, including visualizing observational data from the CUAHSI Water Data Center and storing user data via CUAHSI HydroShare. Both systems also integrate with the new EnviroDIY Water Quality Data Portal (http://data.envirodiy.org/), a system for crowd-sourcing environmental monitoring data using open-source sensor stations (http://envirodiy.org/mayfly/) and based on the Observations Data Model v2.
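The core "summary statistics for an Area of Interest" operation both apps perform can be illustrated with a small sketch, assuming the land-cover grid and AOI mask have already been rasterized to a common shape; the class codes below are hypothetical, not the categories the real apps use.

```python
# Sketch of the "summarize land use inside an Area of Interest" step described
# above. Assumes the land-cover grid and AOI mask are already rasterized to
# the same shape; class codes are hypothetical placeholders.
import numpy as np

LANDCOVER_NAMES = {1: "forest", 2: "developed", 3: "cropland", 4: "water"}

def landuse_summary(landcover: np.ndarray, aoi_mask: np.ndarray) -> dict:
    """Percentage of each land-cover class among cells inside the AOI."""
    cells = landcover[aoi_mask]
    classes, counts = np.unique(cells, return_counts=True)
    return {
        LANDCOVER_NAMES.get(c, str(c)): round(100 * n / cells.size, 1)
        for c, n in zip(classes, counts)
    }

grid = np.random.default_rng(1).integers(1, 5, size=(200, 200))
mask = np.zeros_like(grid, dtype=bool)
mask[50:150, 60:160] = True  # a hypothetical rectangular AOI
print(landuse_summary(grid, mask))
```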
MO-E-9A-01: Risk Based Quality Management: TG100 In Action
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huq, M; Palta, J; Dunscombe, P
2014-06-15
One of the goals of quality management in radiation therapy is to gain high confidence that patients will receive the prescribed treatment correctly. To accomplish this goal, professional societies such as the American Association of Physicists in Medicine (AAPM) have published many quality assurance (QA), quality control (QC), and quality management (QM) guidance documents. In general, the recommendations provided in these documents have emphasized performing device-specific QA at the expense of process flow and protection of the patient against catastrophic errors. Analyses of radiation therapy incidents find that they are more often caused by flaws in the overall therapy process, from initial consult through final treatment, than by isolated hardware or computer failures detectable by traditional physics QA. This challenge is shared by many intrinsically hazardous industries. Risk assessment tools and analysis techniques have been developed to define, identify, and eliminate known and/or potential failures, problems, or errors from a system, process and/or service before they reach the customer. These include, but are not limited to, process mapping, failure modes and effects analysis (FMEA), fault tree analysis (FTA), and establishment of a quality management program that best avoids the faults and risks identified in the overall process. These tools can be easily adapted to radiation therapy practices because of their simplicity and effectiveness, and they provide efficient ways to enhance the safety and quality of treatment processes. Task Group 100 (TG100) of the AAPM has developed a risk-based quality management program that uses these tools. This session will be devoted to a discussion of these tools and how they can be used in a given radiotherapy clinic to develop a risk-based QM program. Learning Objectives: Learn how to design a process map for a radiotherapy process. Learn how to perform an FMEA analysis for a given process. Learn what fault tree analysis is about. Learn how to design a quality management program based upon the information obtained from process mapping, FMEA and FTA.
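The FMEA step at the heart of this approach ranks failure modes by a risk priority number, conventionally RPN = occurrence × severity × (lack of) detectability, each scored on a 1-10 scale. A minimal sketch follows; the failure modes and scores are invented for illustration, not TG100's worked examples.

```python
# Minimal FMEA ranking in the TG100 spirit: each failure mode gets 1-10 scores
# for occurrence (O), severity (S), and lack of detectability (D), and modes
# are ranked by risk priority number RPN = O * S * D. The failure modes and
# scores below are invented for illustration only.
from dataclasses import dataclass

@dataclass
class FailureMode:
    step: str
    mode: str
    occurrence: int     # 1 (rare) .. 10 (frequent)
    severity: int       # 1 (negligible) .. 10 (catastrophic)
    detectability: int  # 1 (always caught) .. 10 (never caught)

    @property
    def rpn(self) -> int:
        return self.occurrence * self.severity * self.detectability

modes = [
    FailureMode("treatment planning", "wrong CT dataset selected", 2, 9, 4),
    FailureMode("simulation", "laser misalignment", 4, 5, 3),
    FailureMode("delivery", "wrong shift applied", 3, 8, 5),
]

# Highest-RPN modes are the first candidates for added QC checks.
for fm in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"RPN {fm.rpn:4d}  {fm.step}: {fm.mode}")
```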
Basic Information about the Indoor Air Quality Tribal Partners Program
IAQ Tribal Partners Program. This website aims to further empower champions of healthy IAQ in tribal communities with tools for networking, sharing programs and practices, and by serving as a reservoir of the best available tribal-specific IAQ information.
Nichols, B. Nolan; Pohl, Kilian M.
2017-01-01
Accelerating insight into the relation between brain and behavior entails conducting small and large-scale research endeavors that lead to reproducible results. Consensus is emerging between funding agencies, publishers, and the research community that data sharing is a fundamental requirement to ensure all such endeavors foster data reuse and fuel reproducible discoveries. Funding agency and publisher mandates to share data are bolstered by a growing number of data sharing efforts that demonstrate how information technologies can enable meaningful data reuse. Neuroinformatics evaluates scientific needs and develops solutions to facilitate the use of data across the cognitive and neurosciences. For example, electronic data capture and management tools designed to facilitate human neurocognitive research can decrease the setup time of studies, improve quality control, and streamline the process of harmonizing, curating, and sharing data across data repositories. In this article we outline the advantages and disadvantages of adopting software applications that support these features by reviewing the tools available and then presenting two contrasting neuroimaging study scenarios in the context of conducting a cross-sectional and a multisite longitudinal study. PMID:26267019
Powell-Smith, Anna; Goldacre, Ben
2016-01-01
Background: Failure to publish trial results is a prevalent ethical breach with a negative impact on patient care. Audit is an important tool for quality improvement. We set out to produce an online resource that automatically identifies the sponsors with the best and worst records for failing to share trial results. Methods: A tool was produced that identifies all completed trials from clinicaltrials.gov, searches for results in the clinicaltrials.gov registry and on PubMed, and presents summary statistics for each sponsor online. Results: The TrialsTracker tool is now available. Results are consistent with previous publication bias cohort studies using manual searches. The prevalence of missing studies is presented for various classes of sponsor. All code and data are shared. Discussion: We have designed, built, and launched an easily accessible online service, the TrialsTracker, that identifies sponsors who have failed in their duty to make results of clinical trials available, and which can be maintained at low cost. Sponsors who wish to improve their performance metrics in this tool can do so by publishing the results of their trials.
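The per-sponsor summary statistic the tracker presents can be sketched in a few lines. This is a hedged illustration only; the file name and column layout are invented and are not the actual TrialsTracker data format:

```python
# Compute, per sponsor, the share of completed trials with no results
# posted or published. Input columns ("sponsor", "has_results") are
# illustrative assumptions.
import csv
from collections import defaultdict

completed = defaultdict(int)
missing = defaultdict(int)
with open("trials.csv", newline="") as f:
    for row in csv.DictReader(f):
        completed[row["sponsor"]] += 1
        if row["has_results"] == "0":
            missing[row["sponsor"]] += 1

for sponsor in sorted(completed, key=lambda s: missing[s] / completed[s],
                      reverse=True):
    rate = missing[sponsor] / completed[sponsor]
    print(f"{sponsor}: {rate:.0%} of {completed[sponsor]} trials unreported")
```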
The Speckle Toolbox: A Powerful Data Reduction Tool for CCD Astrometry
NASA Astrophysics Data System (ADS)
Harshaw, Richard; Rowe, David; Genet, Russell
2017-01-01
Recent advances in high-speed, low-noise CCD and CMOS cameras, coupled with breakthroughs in data reduction software that runs on desktop PCs, have opened the domain of speckle interferometry and high-accuracy CCD measurements of double stars to amateurs, allowing them to do useful science of high quality. This paper describes how to use a speckle interferometry reduction program, the Speckle Tool Box (STB), to achieve this level of result. For over a year the first author (Harshaw) has been using STB (and its predecessor, Plate Solve 3) to obtain measurements of double stars based on CCD camera technology for pairs that are either too wide (the stars not sharing the same isoplanatic patch, roughly 5 arc-seconds in diameter) or too faint to image in the coherence time required for speckle (usually under 40 ms). This same approach - using speckle reduction software to measure CCD pairs with greater accuracy than possible with lucky imaging - has been used, it turns out, for several years by the U.S. Naval Observatory.
Democratizing molecular diagnostics for the developing world.
Abou Tayoun, Ahmad N; Burchard, Paul R; Malik, Imran; Scherer, Axel; Tsongalis, Gregory J
2014-01-01
Infectious diseases that are largely treatable continue to pose a tremendous burden on the developing world despite the availability of highly potent drugs. The high mortality and morbidity rates of these diseases are largely due to a lack of affordable diagnostics that are accessible to resource-limited areas and that can deliver high-quality results. In fact, modified molecular diagnostics for infectious diseases were rated as the top biotechnology to improve health in developing countries. In this review, we describe the characteristics of accessible molecular diagnostic tools and discuss the challenges associated with implementing such tools at low infrastructure sites. We highlight our experience as part of the "Grand Challenge" project supported by the Gates Foundation for addressing global health inequities and describe issues and solutions associated with developing adequate technologies or molecular assays needed for broad access in the developing world. We believe that sharing this knowledge will facilitate the development of new molecular technologies that are extremely valuable for improving global health.
Patient simulation: a literary synthesis of assessment tools in anesthesiology.
Edler, Alice A; Fanning, Ruth G; Chen, Michael I; Claure, Rebecca; Almazan, Dondee; Struyk, Brain; Seiden, Samuel C
2009-12-20
High-fidelity patient simulation (HFPS) has been hypothesized as a modality for assessing competency of knowledge and skill, but uniform methods for HFPS performance assessment (PA) have not yet been completely achieved. Anesthesiology as a field founded the HFPS discipline and also leads in its PA. This project reviews the types, quality, and designated purpose of HFPS PA tools in anesthesiology. We systematically reviewed anesthesiology literature referenced in PubMed to assess the quality and reliability of available PA tools in HFPS. Of 412 articles identified, 50 met our inclusion criteria. Seventy-seven percent of the studies have been published since 2000; more recent studies demonstrated higher quality. Investigators reported a variety of test construction and validation methods. The most commonly reported test construction methods included "modified Delphi techniques" for item selection, reliability measurement using inter-rater agreement, and intra-class correlations between test items or subtests. Modern test theory, in particular generalizability theory, was used in nine (18%) of the studies. Test score validity has been addressed in multiple investigations and has shown a significant improvement in reporting accuracy. However, the assessment of predictive validity has been low across the majority of studies. Usability and practicality of testing occasions and tools was only anecdotally reported. To more completely comply with the gold standards for PA design, both shared experience of experts and recognition of test construction standards, including reliability and validity measurements, instrument piloting, rater training, and explicit identification of the purpose and proposed use of the assessment tool, are required.
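For readers unfamiliar with the agreement statistics named above, here is a toy sketch using synthetic pass/fail ratings rather than any data from the review:

```python
# Inter-rater agreement for one simulated checklist item scored by two
# raters; kappa corrects raw agreement for chance.
from sklearn.metrics import cohen_kappa_score

rater_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
rater_b = [1, 0, 0, 1, 0, 1, 1, 1, 1, 1]
# 1.0 = perfect agreement, 0 = chance-level agreement
print(cohen_kappa_score(rater_a, rater_b))
```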
Validation of an instrument to measure inter-organisational linkages in general practice.
Amoroso, Cheryl; Proudfoot, Judith; Bubner, Tanya; Jayasinghe, Upali W; Holton, Christine; Winstanley, Julie; Beilby, Justin; Harris, Mark F
2007-12-03
Linkages between general medical practices and external services are important for high-quality chronic disease care. The purpose of this research is to describe the development, evaluation and use of a brief tool that measures the comprehensiveness and quality of a general practice's linkages with external providers for the management of patients with chronic disease. In this study, clinical linkages are defined as the communication, support, and referral arrangements between services for the care and assistance of patients with chronic disease. An interview to measure surgery-level (rather than individual clinician-level) clinical linkages was developed, piloted, reviewed, and evaluated with 97 Australian general practices. Two validated survey instruments were posted to patients, and a survey of locally available services was developed and posted to participating Divisions of General Practice (support organisations). Hypotheses regarding internal validity, association with local services, and patient satisfaction were tested using factor analysis, logistic regression and multilevel regression models. The resulting General Practice Clinical Linkages Interview (GP-CLI) is a nine-item tool with three underlying factors: referral and advice linkages, shared care and care planning linkages, and community access and awareness linkages. Local availability of chronic disease services has no effect on the comprehensiveness of services with which practices link; however, comprehensiveness of clinical linkages has an association with patient assessment of access, receptionist services, and of continuity of care in their general practice. The GP-CLI may be useful to researchers examining comparable health care systems for measuring the comprehensiveness and quality of linkages at a general-practice level with related services, possessing both internal and external validity. The tool can be used with large samples exploring the impact, outcomes, and facilitators of high-quality clinical linkages in general practice.
Cooperative hunting and meat sharing 400–200 kya at Qesem Cave, Israel
Stiner, Mary C.; Barkai, Ran; Gopher, Avi
2009-01-01
Zooarchaeological research at Qesem Cave, Israel demonstrates that large-game hunting was a regular practice by the late Lower Paleolithic period. The 400- to 200,000-year-old fallow deer assemblages from this cave provide early examples of prime-age-focused ungulate hunting, a human predator–prey relationship that has persisted into recent times. The meat diet at Qesem centered on large game and was supplemented with tortoises. These hominins hunted cooperatively, and consumption of the highest quality parts of large prey was delayed until the food could be moved to the cave and processed with the aid of blade cutting tools and fire. Delayed consumption of high-quality body parts implies that the meat was shared with other members of the group. The types of cut marks on upper limb bones indicate simple flesh removal activities only. The Qesem cut marks are both more abundant and more randomly oriented than those observed in Middle and Upper Paleolithic cases in the Levant, suggesting that more (skilled and unskilled) individuals were directly involved in cutting meat from the bones at Qesem Cave. Among recent humans, butchering of large animals normally involves a chain of focused tasks performed by one or just a few persons, and butchering guides many of the formalities of meat distribution and sharing that follow. The results from Qesem Cave raise new hypotheses about possible differences in the mechanics of meat sharing between the late Lower Paleolithic and Middle Paleolithic. PMID:19666542
Shared decision-making – transferring research into practice: the Analytic Hierarchy Process (AHP)
Dolan, James G.
2008-01-01
Objective To illustrate how the Analytic Hierarchy Process (AHP) can be used to promote shared decision-making and enhance clinician-patient communication. Methods Tutorial review. Results The AHP promotes shared decision making by creating a framework that is used to define the decision, summarize the information available, prioritize information needs, elicit preferences and values, and foster meaningful communication among decision stakeholders. Conclusions The AHP and related multi-criteria methods have the potential for improving the quality of clinical decisions and overcoming current barriers to implementing shared decision making in busy clinical settings. Further research is needed to determine the best way to implement these tools and to determine their effectiveness. Practice Implications Many clinical decisions involve preference-based trade-offs between competing risks and benefits. The AHP is a well-developed method that provides a practical approach for improving patient-provider communication, clinical decision-making, and the quality of patient care in these situations. PMID:18760559
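The mechanics behind the method can be sketched briefly. This is a minimal illustration under the usual Saaty conventions, with hypothetical options and judgments rather than the paper's own tutorial material: priority weights come from the principal eigenvector of a pairwise comparison matrix, and a consistency ratio flags incoherent judgments.

```python
import numpy as np

# Pairwise comparisons on the 1-9 scale: A[i, j] = how strongly option i
# is preferred to option j; the matrix is reciprocal by construction.
A = np.array([
    [1.0,   3.0,   5.0],
    [1 / 3, 1.0,   2.0],
    [1 / 5, 1 / 2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = eigvecs[:, k].real
weights /= weights.sum()          # priority weights, sum to 1

n = A.shape[0]
lambda_max = eigvals.real[k]
CI = (lambda_max - n) / (n - 1)   # consistency index
RI = 0.58                         # Saaty's random index for n = 3
CR = CI / RI                      # CR < 0.10 is conventionally acceptable
print(weights, CR)
```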
BiGG Models: A platform for integrating, standardizing and sharing genome-scale models
King, Zachary A.; Lu, Justin; Dräger, Andreas; Miller, Philip; Federowicz, Stephen; Lerman, Joshua A.; Ebrahim, Ali; Palsson, Bernhard O.; Lewis, Nathan E.
2016-01-01
Genome-scale metabolic models are mathematically-structured knowledge bases that can be used to predict metabolic pathway usage and growth phenotypes. Furthermore, they can generate and test hypotheses when integrated with experimental data. To maximize the value of these models, centralized repositories of high-quality models must be established, models must adhere to established standards and model components must be linked to relevant databases. Tools for model visualization further enhance their utility. To meet these needs, we present BiGG Models (http://bigg.ucsd.edu), a completely redesigned Biochemical, Genetic and Genomic knowledge base. BiGG Models contains more than 75 high-quality, manually-curated genome-scale metabolic models. On the website, users can browse, search and visualize models. BiGG Models connects genome-scale models to genome annotations and external databases. Reaction and metabolite identifiers have been standardized across models to conform to community standards and enable rapid comparison across models. Furthermore, BiGG Models provides a comprehensive application programming interface for accessing BiGG Models with modeling and analysis tools. As a resource for highly curated, standardized and accessible models of metabolism, BiGG Models will facilitate diverse systems biology studies and support knowledge-based analysis of diverse experimental data. PMID:26476456
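The programmatic interface mentioned above can be exercised in a few lines. This is a hedged sketch: the /api/v2 endpoint paths and JSON field names are recalled from the public documentation and should be treated as assumptions, not a verified contract.

```python
import requests

BASE = "http://bigg.ucsd.edu/api/v2"   # assumed public API root

# List available genome-scale models
models = requests.get(f"{BASE}/models").json()
print(models["results_count"], "models available")

# Fetch one model's summary (e.g., the E. coli core model)
core = requests.get(f"{BASE}/models/e_coli_core").json()
print(core["organism"], core["reaction_count"], core["metabolite_count"])
```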
Shared decision-making in epilepsy management.
Pickrell, W O; Elwyn, G; Smith, P E M
2015-06-01
Policy makers, clinicians, and patients increasingly recognize the need for greater patient involvement in clinical decision-making. Shared decision-making helps address these concerns by providing a framework for clinicians and patients to make decisions together using the best evidence. Shared decision-making is applicable to situations where several acceptable options exist (clinical equipoise). Such situations occur commonly in epilepsy, for example, in decisions regarding the choice of medication, treatment in pregnancy, and medication withdrawal. A talk model is a way of implementing shared decision-making during consultations, and decision aids are useful tools to assist in the process. Although there is limited evidence available for shared decision-making in epilepsy, there are several benefits of shared decision-making in general including improved decision quality, more informed choices, and better treatment concordance. Copyright © 2015 Elsevier Inc. All rights reserved.
A standard for measuring metadata quality in spectral libraries
NASA Astrophysics Data System (ADS)
Rasaiah, B.; Jones, S. D.; Bellman, C.
2013-12-01
There is an urgent need within the international remote sensing community to establish a metadata standard for field spectroscopy that ensures high-quality, interoperable metadata sets that can be archived and shared efficiently within Earth observation data sharing systems. Metadata are an important component in the cataloguing and analysis of in situ spectroscopy datasets because of their central role in identifying and quantifying the quality and reliability of spectral data and the products derived from them. This paper presents approaches to measuring metadata completeness and quality in spectral libraries to determine the reliability, interoperability, and reusability of a dataset. Explored are quality parameters that meet the unique requirements of in situ spectroscopy datasets across many campaigns. Examined are the challenges presented by ensuring that data creators, owners, and data users maintain a high level of data integrity throughout the lifecycle of a dataset. Issues such as field measurement methods, instrument calibration, and data representativeness are investigated. The proposed metadata standard incorporates expert recommendations that include metadata protocols critical to all campaigns, and those that are restricted to campaigns for specific target measurements. The implications of semantics and syntax for a robust and flexible metadata standard are also considered. Approaches towards an operational and logistically viable implementation of a quality standard are discussed. This paper also proposes a way forward for adapting and enhancing current geospatial metadata standards to the unique requirements of field spectroscopy metadata quality.
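One completeness measure of the kind discussed above is the fraction of required metadata fields populated in a record. The sketch below is a hedged illustration; the required-field list is hypothetical, not the standard proposed in the paper.

```python
# Fraction of required metadata fields populated in one library record.
REQUIRED_FIELDS = [
    "instrument", "calibration_date", "target", "illumination",
    "foreoptic", "units", "operator",
]

def completeness(record: dict) -> float:
    filled = sum(1 for f in REQUIRED_FIELDS if record.get(f) not in (None, ""))
    return filled / len(REQUIRED_FIELDS)

record = {"instrument": "ASD FieldSpec", "target": "leaf canopy",
          "units": "reflectance"}
print(f"completeness: {completeness(record):.0%}")   # 43%
```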
Air Quality and Heart Health: Managing an Emerging ...
Dr. Cascio will share with a broad range of federal agencies the current understanding of the links between air quality and cardiovascular health. The key facts include that air pollution carries a high attributable health burden and that certain well-defined vulnerable subpopulations are at higher risk. At-risk populations include those with heart disease, lung disease, and diabetes, older adults, children, and individuals living in low socioeconomic neighborhoods. There is no established threshold level for safe long-term exposure to air particle pollution, and some of the basic biological mechanisms that account for adverse health effects are now known. This knowledge is giving us insight into how we might mitigate the effects apart from the regulatory efforts to improve overall air quality. Moreover, the work that each state has done to improve air quality has resulted in improved health outcomes, including cardiovascular outcomes, and longer lives. The presentation will address: 1) What do we know? 2) Who are the at-risk populations? 3) What can communities do to reduce risk? 4) What can healthcare professionals do to reduce risk of the at-risk population? And 5) What tools are available to help healthcare professionals and their patients reduce exposure and risk from air pollutants? The talk will feature a description of the Air Quality Index and associated EPA tools and health information that can be used by health care providers to educate their at-risk patients.
Brown, Eric W.; Detter, Chris; Gerner-Smidt, Peter; Gilmour, Matthew W.; Harmsen, Dag; Hendriksen, Rene S.; Hewson, Roger; Heymann, David L.; Johansson, Karin; Ijaz, Kashef; Keim, Paul S.; Koopmans, Marion; Kroneman, Annelies; Wong, Danilo Lo Fo; Lund, Ole; Palm, Daniel; Sawanpanyalert, Pathom; Sobel, Jeremy; Schlundt, Jørgen
2012-01-01
The rapid advancement of genome technologies holds great promise for improving the quality and speed of clinical and public health laboratory investigations and for decreasing their cost. The latest generation of genome DNA sequencers can provide highly detailed and robust information on disease-causing microbes, and in the near future these technologies will be suitable for routine use in national, regional, and global public health laboratories. With additional improvements in instrumentation, these next- or third-generation sequencers are likely to replace conventional culture-based and molecular typing methods to provide point-of-care clinical diagnosis and other essential information for quicker and better treatment of patients. Provided there is free sharing of information by all clinical and public health laboratories, these genomic tools could spawn a global system of linked databases of pathogen genomes that would ensure more efficient detection, prevention, and control of endemic, emerging, and other infectious disease outbreaks worldwide. PMID:23092707
Rosenberg, David M; Horn, Charles C
2016-08-01
Neurophysiology requires an extensive workflow of information analysis routines, which often includes incompatible proprietary software, introducing limitations based on financial costs, transfer of data between platforms, and the ability to share. An ecosystem of free open-source software exists to fill these gaps, including thousands of analysis and plotting packages written in Python and R, which can be implemented in a sharable and reproducible format, such as the Jupyter electronic notebook. This tool chain can largely replace current routines by importing data, producing analyses, and generating publication-quality graphics. An electronic notebook like Jupyter allows these analyses, along with documentation of procedures, to display locally or remotely in an internet browser, which can be saved as an HTML, PDF, or other file format for sharing with team members and the scientific community. The present report illustrates these methods using data from electrophysiological recordings of the musk shrew vagus, a model system to investigate gut-brain communication, for example, in cancer chemotherapy-induced emesis. We show methods for spike sorting (including statistical validation), spike train analysis, and analysis of compound action potentials in notebooks. Raw data and code are available from notebooks in data supplements or from an executable online version, which replicates all analyses without installing software, an implementation of reproducible research. This demonstrates the promise of combining disparate analyses into one platform, along with the ease of sharing this work. In an age of diverse, high-throughput computational workflows, this methodology can increase efficiency, transparency, and the collaborative potential of neurophysiological research. Copyright © 2016 the American Physiological Society.
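A notebook-style cell of the kind described can be sketched with synthetic data (this is not the shrew recordings): threshold-crossing detection is a common first step before the spike sorting the paper covers.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
fs = 20_000                                # sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)
trace = rng.normal(0.0, 1.0, t.size)       # synthetic extracellular noise
trace[rng.choice(t.size, 40, replace=False)] += 8.0   # injected "spikes"

# Robust noise estimate (median absolute deviation) sets the threshold
thresh = 5 * np.median(np.abs(trace)) / 0.6745
crossings = np.flatnonzero((trace[1:] > thresh) & (trace[:-1] <= thresh))

fig, ax = plt.subplots()
ax.plot(t, trace, lw=0.3)
ax.plot(t[crossings + 1], trace[crossings + 1], "r.")
ax.set(xlabel="time (s)", ylabel="amplitude (a.u.)")
fig.savefig("spikes.png", dpi=150)         # renders inline in a notebook
```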
Using the Generic Mapping Tools From Within the MATLAB, Octave and Julia Computing Environments
NASA Astrophysics Data System (ADS)
Luis, J. M. F.; Wessel, P.
2016-12-01
The Generic Mapping Tools (GMT) is a widely used software infrastructure tool set for analyzing and displaying geoscience data. Its power to analyze and process data and produce publication-quality graphics has made it one of several standard processing toolsets used by a large segment of the Earth and Ocean Sciences. GMT's strengths lie in superior publication-quality vector graphics, geodetic-quality map projections, robust data processing algorithms scalable to enormous data sets, and ability to run under all common operating systems. The GMT tool chest offers over 120 modules sharing a common set of command options, file structures, and documentation. GMT modules are command line tools that accept input and write output, and this design allows users to write scripts in which one module's output becomes another module's input, creating highly customized GMT workflows. With the release of GMT 5, these modules are high-level functions with a C API, potentially allowing users access to high-level GMT capabilities from any programmable environment. Many scientists who use GMT also use other computational tools, such as MATLAB® and its clone Octave. We have built a MATLAB/Octave interface on top of the GMT 5 C API. Thus, MATLAB or Octave now has full access to all GMT modules as well as fundamental input/output of GMT data objects via a MEX function. Internally, the GMT/MATLAB C API defines six high-level composite data objects that handle input and output of data via individual GMT modules. These are data tables, grids, text tables (text/data mixed records), color palette tables, raster images (1-4 color bands), and PostScript. The API is responsible for translating between the six GMT objects and the corresponding native MATLAB objects. References to data arrays are passed if transposing of matrices is not required. The GMT and MATLAB/Octave combination is extremely flexible, letting the user harvest the general numerical and graphical capabilities of both systems, and represents a giant step forward in interoperability between GMT and other software packages. We will present examples of the symbiotic benefits of combining these platforms. Two other extensions are also in the works: a nearly finished Julia wrapper and an embryonic Python module. Publication supported by FCT project UID/GEO/50019/2013 - Instituto D. Luiz
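Because GMT modules are ordinary command-line programs, any scripting language can chain them. Here is a minimal sketch assuming a GMT 5 installation on the PATH (the module name and flags are standard GMT 5 usage); the wrapper described above replaces this shell plumbing with direct in-memory calls through the C API.

```python
import subprocess

# Draw coastlines of Europe with pscoast: -R region, -JM Mercator 15 cm
# wide, -Baf automatic frame, -W shoreline pen, -G land fill, -S water
# fill, -P portrait; PostScript goes to stdout.
subprocess.run(
    "gmt pscoast -R-30/30/30/72 -JM15c -Baf"
    " -W0.5p -Gtan -Slightblue -P > europe.ps",
    shell=True, check=True,
)
```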
ERIC Educational Resources Information Center
Research Assessment Management, Inc., Silver Spring, MD.
A quality Head Start facility should provide a physical environment responsive both to the needs of the children and families served and to the needs of staff, volunteers, and community agencies that share space with Head Start. This manual is a tool for Head Start grantees and delegate agencies for assessing existing facilities, making…
Toward Deriving Software Architectures from Quality Attributes
1994-08-01
…environments rely on the notion of a "tool bus" or an explicit shared repository [Wasserman 89] to allow easy integration of tools. … attributed parse tree and symbol table that the compiler creates and annotates during its various phases. This results in a very different software…
Improved de novo genomic assembly for the domestic donkey.
Renaud, Gabriel; Petersen, Bent; Seguin-Orlando, Andaine; Bertelsen, Mads Frost; Waller, Andrew; Newton, Richard; Paillot, Romain; Bryant, Neil; Vaudin, Mark; Librado, Pablo; Orlando, Ludovic
2018-04-01
Donkeys and horses share a common ancestor dating back to about 4 million years ago. Although a high-quality genome assembly at the chromosomal level is available for the horse, current assemblies available for the donkey are limited to moderately sized scaffolds. The absence of a better-quality assembly for the donkey has hampered studies involving the characterization of patterns of genetic variation at the genome-wide scale. These range from the application of genomic tools to selective breeding and conservation to the more fundamental characterization of the genomic loci underlying speciation and domestication. We present a new high-quality donkey genome assembly obtained using the Chicago HiRise assembly technology, providing scaffolds of subchromosomal size. We make use of this new assembly to obtain more accurate measures of heterozygosity for equine species other than the horse, both genome-wide and locally, and to detect runs of homozygosity potentially pertaining to positive selection in domestic donkeys. Finally, this new assembly allowed us to identify fine-scale chromosomal rearrangements between the horse and the donkey that likely played an active role in their divergence and, ultimately, speciation.
SOA-based digital library services and composition in biomedical applications.
Zhao, Xia; Liu, Enjie; Clapworthy, Gordon J; Viceconti, Marco; Testi, Debora
2012-06-01
Carefully collected, high-quality data are crucial in biomedical visualization, and it is important that the user community has ready access to both this data and the high-performance computing resources needed by the complex computational algorithms that will process it. Biological researchers generally require data, tools and algorithms from multiple providers to achieve their goals. This paper illustrates our response to the problems that result from this. The Living Human Digital Library (LHDL) project presented in this paper has taken advantage of Web Services to build a biomedical digital library infrastructure that allows clinicians and researchers not only to preserve, trace and share data resources, but also to collaborate at the data-processing level. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Use of the MATRIXx Integrated Toolkit on the Microwave Anisotropy Probe Attitude Control System
NASA Technical Reports Server (NTRS)
Ward, David K.; Andrews, Stephen F.; McComas, David C.; ODonnell, James R., Jr.
1999-01-01
Recent advances in analytical software tools allow the analysis, simulation, flight code, and documentation of an algorithm to be generated from a single source, all within one integrated analytical design package. NASA's Microwave Anisotropy Probe project has used one such package, Integrated Systems' MATRIXx suite, in the design of the spacecraft's Attitude Control System. The project's experience with the linear analysis, simulation, code generation, and documentation tools will be presented and compared with more traditional development tools. In particular, the quality of the flight software generated will be examined in detail. Finally, lessons learned on each of the tools will be shared.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cox, S.; Benioff, R.
2011-05-01
The Coordinated Low Emissions Assistance Network (CLEAN) is a voluntary network of international practitioners supporting low-emission planning in developing countries. The network seeks to improve quality of support through sharing project information, tools, best practices and lessons, and by fostering harmonized assistance. CLEAN has developed an inventory to track and analyze international technical support and tools for low-carbon planning activities in developing countries. This paper presents a preliminary analysis of the inventory to help identify trends in assistance activities and tools available to support developing countries with low-emission planning.
Rossi, Francesca; Coppo, Monica; Zucchetti, Giulia; Bazzano, Daniela; Ricci, Federica; Vassallo, Elena; Nesi, Francesca; Fagioli, Franca
2016-11-01
Hematopoietic stem cell transplantation (HSCT) is a therapeutic strategy for several oncohematological diseases. It increases survival rates but leads to a high incidence of treatment-related effects. The objective of this paper was to examine the existing literature on physical exercise interventions among pediatric HSCT recipients to explore the most often utilized rehabilitative assessment and treatment tools. Studies published from 2002 to April 1, 2015 were selected: 10 studies were included. A previous literature review has shown that rehabilitation programs have a positive impact on quality of life. Our analysis identified some significant outcome variables and shared intervention areas. © 2016 Wiley Periodicals, Inc.
Guilak, Farshid
2017-03-21
We are currently in one of the most exciting times for science and engineering as we witness unprecedented growth in our computational and experimental capabilities to generate new data and models. To facilitate data and model sharing, and to enhance reproducibility and rigor in biomechanics research, the Journal of Biomechanics has introduced a number of tools for Content Innovation to allow presentation, sharing, and archiving of methods, models, and data in our articles. The tools include an Interactive Plot Viewer, 3D Geometric Shape and Model Viewer, Virtual Microscope, Interactive MATLAB Figure Viewer, and Audioslides. Authors are highly encouraged to make use of these in upcoming journal submissions. Copyright © 2017 Elsevier Ltd. All rights reserved.
Stein, Bradley D; Kogan, Jane N; Mihalyo, Mark J; Schuster, James; Deegan, Patricia E; Sorbero, Mark J; Drake, Robert E
2013-04-01
Healthcare reform emphasizes patient-centered care and shared decision-making. This study examined the impact on psychotropic adherence of a decision support center and computerized tool designed to empower and activate consumers prior to an outpatient medication management visit. Administrative data were used to identify 1,122 Medicaid-enrolled adults receiving psychotropic medication from community mental health centers over a two-year period. Multivariate linear regression models were used to examine whether tool users had higher rates of 180-day medication adherence than non-users. Older clients, Caucasian clients, those without recent hospitalizations, and those who were Medicaid-eligible due to disability had higher rates of 180-day medication adherence. After controlling for sociodemographics, clinical characteristics, baseline adherence, and secular changes over time, using the computerized tool did not affect adherence to psychotropic medications. The computerized decision tool did not affect medication adherence among clients in outpatient mental health clinics. Additional research should clarify the impact of decision-making tools on other important outcomes such as engagement, patient-prescriber communication, quality of care, self-management, and long-term clinical and functional outcomes.
Towards shared patient records: an architecture for using routine data for nationwide research.
Knaup, Petra; Garde, Sebastian; Merzweiler, Angela; Graf, Norbert; Schilling, Freimut; Weber, Ralf; Haux, Reinhold
2006-01-01
Ubiquitous information is currently one of the most challenging slogans in medical informatics research. An adequate architecture for shared electronic patient records is needed which can use data for multiple purposes and which is extensible for new research questions. We introduce eardap as an architecture for using routine data for nationwide clinical research in a multihospital environment. eardap can be characterized as terminology-based. The main advantage of our approach is its extensibility to new items and new research questions. Once the definition of items for a research question is finished, a consistent, corresponding database can be created without any informatics skills. Our experiences in pediatric oncology in Germany have shown the applicability of eardap. The functions of our core system were in routine clinical use in several hospitals. We validated the terminology management system (TMS) and the module generation tool with the basic data set of pediatric oncology. How widely data can be reused depends mainly on the quality of item planning in the TMS. High-quality harmonization will lead to a greater amount of multiply used data. When using eardap, special emphasis is to be placed on interfaces to local hospital information systems and data security issues.
Ancker, Jessica S; Witteman, Holly O; Hafeez, Baria; Provencher, Thierry; Van de Graaf, Mary; Wei, Esther
2015-06-04
A critical problem for patients with chronic conditions who see multiple health care providers is incomplete or inaccurate information, which can contribute to lack of care coordination, low quality of care, and medical errors. As part of a larger project on applications of consumer health information technology (HIT) and barriers to its use, we conducted a semistructured interview study with patients with multiple chronic conditions (MCC) with the objective of exploring their role in managing their personal health information. Semistructured interviews were conducted with patients and providers. Patients were eligible if they had multiple chronic conditions and were in regular care with one of two medical organizations in New York City; health care providers were eligible if they had experience caring for patients with multiple chronic conditions. Analysis was conducted from a grounded theory perspective, and recruitment was concluded when saturation was achieved. A total of 22 patients and 7 providers were interviewed; patients had an average of 3.5 (SD 1.5) chronic conditions and reported having regular relationships with an average of 5 providers. Four major themes arose: (1) Responsibility for managing medical information: some patients perceived information management and sharing as the responsibility of health care providers; others—particularly those who had had bad experiences in the past—took primary responsibility for information sharing; (2) What information should be shared: although privacy concerns did influence some patients' perceptions of sharing of medical data, decisions about what to share were also heavily influenced by their understanding of health and disease and by the degree to which they understood the health care system; (3) Methods and tools varied: those patients who did take an active role in managing their records used a variety of electronic tools, paper tools, and memory; and (4) Information management as invisible work: managing transfers of medical information to solve problems was a tremendous amount of work that was largely unrecognized by the medical establishment. We conclude that personal health information management should be recognized as an additional burden that MCC places upon patients. Effective structural solutions for information sharing, whether institutional ones such as care management or technological ones such as electronic health information exchange, are likely not only to improve the quality of information shared but reduce the burden on patients already weighed down by MCC.
Phillips, Nicole Margaret; Street, Maryann; Haesler, Emily
2016-02-01
Patient participation in healthcare is recognised internationally as essential for consumer-centric, high-quality healthcare delivery. Its measurement as part of continuous quality improvement requires development of agreed standards and measurable indicators. This systematic review sought to identify strategies to measure patient participation in healthcare and to report their reliability and validity. In the context of this review, patient participation was constructed as shared decision-making, acknowledging the patient as having critical knowledge regarding their own health and care needs and promoting self-care/autonomy. Following a comprehensive search, studies reporting reliability or validity of an instrument used in a healthcare setting to measure patient participation, published in English between January 2004 and March 2014 were eligible for inclusion. From an initial search, which identified 1582 studies, 156 studies were retrieved and screened against inclusion criteria. Thirty-three studies reporting 24 patient participation measurement tools met inclusion criteria, and were critically appraised. The majority of studies were descriptive psychometric studies using prospective, cross-sectional designs. Almost all the tools completed by patients, family caregivers, observers or more than one stakeholder focused on aspects of patient-professional communication. Few tools designed for completion by patients or family caregivers provided valid and reliable measures of patient participation. There was low correlation between many of the tools and other measures of patient satisfaction. Few reliable and valid tools for measurement of patient participation in healthcare have been recently developed. Of those reported in this review, the dyadic Observing Patient Involvement in Decision Making (dyadic-OPTION) tool presents the most promise for measuring core components of patient participation. There remains a need for further study into valid, reliable and feasible strategies for measuring patient participation as part of continuous quality improvement. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Godolphin, William
2009-01-01
Shared decision-making has been called the crux of patient-centred care and identified as a key part of change for improved quality and safety in healthcare. However, it rarely happens, is hard to do and is not taught - for many reasons. Talking with patients about options is not embedded in the attitudes or communication skills training of most healthcare professionals. Information tools such as patient decision aids, personal health records and the Internet will help to shift this state, as will policy that drives patient and public involvement in healthcare delivery and training.
Kamadjeu, Raoul
2009-01-01
Background The use of GIS in public health is growing, a consequence of a rapidly evolving technology and increasing accessibility to a wider audience. Google Earth™ (GE) is becoming an important mapping infrastructure for public health. However, generating traditional public health maps for GE is still beyond the reach of most public health professionals. In this paper, we explain, through the example of polio eradication activities in the Democratic Republic of Congo, how we used GE as a planning tool, and we share the methods used to generate public health maps. Results The use of GE improved field operations and resulted in better dispatch of vaccination teams and allocation of resources. It also allowed the creation of high-quality maps for advocacy, training, and understanding the spatiotemporal relationship between all the entities involved in the polio outbreak and response. Conclusion GE has the potential of making mapping available to a new set of public health users in developing countries. High-quality, free satellite imagery and rich features, including Keyhole Markup Language and image overlay, provide a flexible yet powerful platform that sets it apart from traditional GIS tools, and this power is still to be fully harnessed by public health professionals. PMID:19161606
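Keyhole Markup Language files of the kind mentioned above are plain XML and can be generated without a GIS. This minimal sketch (not the authors' workflow; the team name and coordinates are hypothetical) writes a placemark that Google Earth can open:

```python
import xml.etree.ElementTree as ET

kml = ET.Element("kml", xmlns="http://www.opengis.net/kml/2.2")
doc = ET.SubElement(kml, "Document")

pm = ET.SubElement(doc, "Placemark")
ET.SubElement(pm, "name").text = "Vaccination team A"   # hypothetical
point = ET.SubElement(pm, "Point")
# KML coordinate order is longitude,latitude[,altitude]
ET.SubElement(point, "coordinates").text = "15.32,-4.33,0"

ET.ElementTree(kml).write("teams.kml", encoding="UTF-8",
                          xml_declaration=True)
```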
Bouhaddou, Omar; Lincoln, Michael J.; Maulden, Sarah; Murphy, Holli; Warnekar, Pradnya; Nguyen, Viet; Lam, Siew; Brown, Steven H; Frankson, Ferdinand J.; Crandall, Glen; Hughes, Carla; Sigley, Roger; Insley, Marcia; Graham, Gail
2006-01-01
The Veterans Administration (VA) has adopted an ambitious program to standardize its clinical terminology to comply with industry-wide standards. The VA is using commercially available tools and in-house software to create a high-quality reference terminology system. The terminology will be used by current and future applications with no planned disruption to operational systems. The first large customer of the group is the national VA Health Data Repository (HDR). Unique enterprise identifiers are assigned to each standard term, and a rich network of semantic relationships makes the resulting data not only recognizable, but highly computable and reusable in a variety of applications, including decision support and data sharing with partners such as the Department of Defense (DoD). This paper describes the specific methods and approaches that the VA has employed to develop and implement this innovative program in existing information systems. The goal is to share with others our experience with key issues that face our industry as we move toward an electronic health record for every individual. PMID:17238306
Leonard, M; Graham, S; Bonacum, D
2004-10-01
Effective communication and teamwork is essential for the delivery of high quality, safe patient care. Communication failures are an extremely common cause of inadvertent patient harm. The complexity of medical care, coupled with the inherent limitations of human performance, make it critically important that clinicians have standardised communication tools, create an environment in which individuals can speak up and express concerns, and share common "critical language" to alert team members to unsafe situations. All too frequently, effective communication is situation or personality dependent. Other high reliability domains, such as commercial aviation, have shown that the adoption of standardised tools and behaviours is a very effective strategy in enhancing teamwork and reducing risk. We describe our ongoing patient safety implementation using this approach within Kaiser Permanente, a non-profit American healthcare system providing care for 8.3 million patients. We describe specific clinical experience in the application of surgical briefings, properties of high reliability perinatal care, the value of critical event training and simulation, and benefits of a standardised communication process in the care of patients transferred from hospitals to skilled nursing facilities. Additionally, lessons learned as to effective techniques in achieving cultural change, evidence of improving the quality of the work environment, practice transfer strategies, critical success factors, and the evolving methods of demonstrating the benefit of such work are described.
Peer Observation: A Tool for Staff Development or Compliance?
ERIC Educational Resources Information Center
Shortland, Sue
2004-01-01
Peer observation has become a feature of university practice over the last decade, the primary impetus for its introduction being the political drive to raise teaching quality via the development and sharing of 'good practice'. Peer observation within higher education (HE) involves observing colleagues in the classroom and has the further aim of…
Improving and integrating data on invasive species collected by citizen scientists
2010-01-01
Limited resources make it difficult to effectively document, monitor, and control invasive species across large areas, resulting in large gaps in our knowledge of current and future invasion patterns. We surveyed 128 citizen science program coordinators and interviewed 15 of them to evaluate their potential role in filling these gaps. Many programs collect data on invasive species and are willing to contribute these data to public databases. Although resources for education and monitoring are readily available, groups generally lack tools to manage and analyze data. Potential users of these data also retain concerns over data quality. We discuss how to address these concerns about citizen scientist data and programs while preserving the advantages they afford. A unified yet flexible national citizen science program aimed at tracking invasive species location, abundance, and control efforts could be designed using centralized data sharing and management tools. Such a system could meet the needs of multiple stakeholders while allowing efficiencies of scale, greater standardization of methods, and improved data quality testing and sharing. Finally, we present a prototype for such a system (see www.citsci.org).
Kannan, Vaishnavi; Fish, Jason S; Mutz, Jacqueline M; Carrington, Angela R; Lai, Ki; Davis, Lisa S; Youngblood, Josh E; Rauschuber, Mark R; Flores, Kathryn A; Sara, Evan J; Bhat, Deepa G; Willett, DuWayne L
2017-06-14
Creation of a new electronic health record (EHR)-based registry often can be a "one-off" complex endeavor: first developing new EHR data collection and clinical decision support tools, followed by developing registry-specific data extractions from the EHR for analysis. Each development phase typically has its own long development and testing time, leading to a prolonged overall cycle time for delivering one functioning registry with companion reporting into production. The next registry request then starts from scratch. Such an approach will not scale to meet the emerging demand for specialty registries to support population health and value-based care. To determine if the creation of EHR-based specialty registries could be markedly accelerated by employing (a) a finite core set of EHR data collection principles and methods, (b) concurrent engineering of data extraction and data warehouse design using a common dimensional data model for all registries, and (c) agile development methods commonly employed in new product development. We adopted as guiding principles to (a) capture data as a byproduct of care of the patient, (b) reinforce optimal EHR use by clinicians, (c) employ a finite but robust set of EHR data capture tool types, and (d) leverage our existing technology toolkit. Registries were defined by a shared condition (recorded on the Problem List) or a shared exposure to a procedure (recorded on the Surgical History) or to a medication (recorded on the Medication List). Any EHR fields needed - either to determine registry membership or to calculate a registry-associated clinical quality measure (CQM) - were included in the enterprise data warehouse (EDW) shared dimensional data model. Extract-transform-load (ETL) code was written to pull data at defined "grains" from the EHR into the EDW model. All calculated CQM values were stored in a single Fact table in the EDW crossing all registries. Registry-specific dashboards were created in the EHR to display both (a) real-time patient lists of registry patients and (b) EDW-generated CQM data. Agile project management methods were employed, including co-development, lightweight requirements documentation with User Stories and acceptance criteria, and time-boxed iterative development of EHR features in 2-week "sprints" for rapid-cycle feedback and refinement. Using this approach, in calendar year 2015 we developed a total of 43 specialty chronic disease registries, with 111 new EHR data collection and clinical decision support tools, 163 new clinical quality measures, and 30 clinic-specific dashboards reporting on both real-time patient care gaps and summarized and vetted CQM measure performance trends. This study suggests concurrent design of EHR data collection tools and reporting can quickly yield useful EHR structured data for chronic disease registries, and bodes well for efforts to migrate away from manual abstraction. This work also supports the view that in new EHR-based registry development, as in new product development, adopting agile principles and practices can help deliver valued, high-quality features early and often.
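The single-fact-table idea described above can be sketched in miniature: one table of calculated CQM values crossing all registries makes a per-registry dashboard a simple GROUP BY. Table and column names below are illustrative assumptions, not the authors' schema.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE registry (registry_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE cqm_fact (
    registry_id INTEGER REFERENCES registry(registry_id),
    patient_id  INTEGER,
    measure     TEXT,     -- e.g. 'HbA1c checked in last 6 months'
    met         INTEGER,  -- 1 if the patient meets the measure
    measured_on TEXT
);
""")
con.execute("INSERT INTO registry VALUES (1, 'Diabetes')")
con.executemany(
    "INSERT INTO cqm_fact VALUES (1, ?, 'HbA1c checked', ?, '2015-06-30')",
    [(101, 1), (102, 0), (103, 1)],
)
# One fact table crossing registries: dashboard rollup is a GROUP BY
for name, measure, perf in con.execute("""
    SELECT r.name, f.measure, AVG(f.met)
    FROM cqm_fact f JOIN registry r USING (registry_id)
    GROUP BY r.name, f.measure
"""):
    print(name, measure, f"{perf:.0%}")
```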
Barty, Rebecca L; Gagliardi, Kathleen; Owens, Wendy; Lauzon, Deborah; Scheuermann, Sheena; Liu, Yang; Wang, Grace; Pai, Menaka; Heddle, Nancy M
2015-07-01
Benchmarking is a quality improvement tool that compares an organization's performance to that of its peers for selected indicators, to improve practice. Processes to develop evidence-based benchmarks for red blood cell (RBC) outdating in Ontario hospitals, based on RBC hospital disposition data from Canadian Blood Services, have been previously reported. These benchmarks were implemented in 160 hospitals provincewide with a multifaceted approach, which included hospital education, inventory management tools and resources, summaries of best practice recommendations, recognition of high-performing sites, and audit tools on the Transfusion Ontario website (http://transfusionontario.org). In this study we describe the implementation process and the impact of the benchmarking program on RBC outdating. A conceptual framework for continuous quality improvement of a benchmarking program was also developed. The RBC outdating rate for all hospitals trended downward continuously from April 2006 to February 2012, irrespective of hospitals' transfusion rates or their distance from the blood supplier. The highest annual outdating rate was 2.82%, at the beginning of the observation period. Each year brought further reductions, with a nadir outdating rate of 1.02% achieved in 2011. The key elements of the successful benchmarking strategy included dynamic targets, a comprehensive and evidence-based implementation strategy, ongoing information sharing, and a robust data system to track information. The Ontario benchmarking program for RBC outdating resulted in continuous and sustained quality improvement. Our conceptual iterative framework for benchmarking provides a guide for institutions implementing a benchmarking program. © 2015 AABB.
Parallelization of NAS Benchmarks for Shared Memory Multiprocessors
NASA Technical Reports Server (NTRS)
Waheed, Abdul; Yan, Jerry C.; Saini, Subhash (Technical Monitor)
1998-01-01
This paper presents our experiences of parallelizing the sequential implementation of NAS benchmarks using compiler directives on SGI Origin2000 distributed shared memory (DSM) system. Porting existing applications to new high performance parallel and distributed computing platforms is a challenging task. Ideally, a user develops a sequential version of the application, leaving the task of porting to new generations of high performance computing systems to parallelization tools and compilers. Due to the simplicity of programming shared-memory multiprocessors, compiler developers have provided various facilities to allow the users to exploit parallelism. Native compilers on SGI Origin2000 support multiprocessing directives to allow users to exploit loop-level parallelism in their programs. Additionally, supporting tools can accomplish this process automatically and present the results of parallelization to the users. We experimented with these compiler directives and supporting tools by parallelizing sequential implementation of NAS benchmarks. Results reported in this paper indicate that with minimal effort, the performance gain is comparable with the hand-parallelized, carefully optimized, message-passing implementations of the same benchmarks.
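The directive-based approach above annotates independent loops so the compiler can distribute iterations across processors. As a rough conceptual analogue only (the study used SGI multiprocessing directives on the Origin2000, not Python), a loop with no cross-iteration dependencies can be farmed out to a process pool:

```python
from concurrent.futures import ProcessPoolExecutor

def kernel(i: int) -> int:
    # Stand-in for one independent loop iteration (no cross-iteration
    # dependencies) -- the same property a parallelizing directive relies on.
    return (i * i) % 97

if __name__ == "__main__":
    # Sequential version for comparison:
    #   results = [kernel(i) for i in range(100_000)]
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(kernel, range(100_000), chunksize=1_000))
    print(sum(results))
```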
Korst, Lisa M; Aydin, Carolyn E; Signer, Jordana M K; Fink, Arlene
2011-08-01
The development of readiness metrics for organizational participation in health information exchange is critical for monitoring progress toward, and achievement of, successful inter-organizational collaboration. In preparation for the development of a tool to measure readiness for data-sharing, we tested whether organizational capacities known to be related to readiness were associated with successful participation in an American data-sharing collaborative for quality improvement. Cross-sectional design, using an on-line survey of hospitals in a large, mature data-sharing collaborative organized for benchmarking and improvement in nursing care quality. Factor analysis was used to identify salient constructs, and identified factors were analyzed with respect to "successful" participation. "Success" was defined as the incorporation of comparative performance data into the hospital dashboard. The most important factor in predicting success included survey items measuring the strength of organizational leadership in fostering a culture of quality improvement (QI Leadership): (1) presence of a supportive hospital executive; (2) the extent to which a hospital values data; (3) the presence of leaders' vision for how the collaborative advances the hospital's strategic goals; (4) hospital use of the collaborative data to track quality outcomes; and (5) staff recognition of a strong mandate for collaborative participation (α=0.84, correlation with Success 0.68 [P<0.0001]). The data emphasize the importance of hospital QI Leadership in collaboratives that aim to share data for QI or safety purposes. Such metrics should prove useful in the planning and development of this complex form of inter-organizational collaboration. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Korst, Lisa M.; Aydin, Carolyn E.; Signer, Jordana M. K.; Fink, Arlene
2011-01-01
Objective The development of readiness metrics for organizational participation in health information exchange is critical for monitoring progress toward, and achievement of, successful inter-organizational collaboration. In preparation for the development of a tool to measure readiness for data-sharing, we tested whether organizational capacities known to be related to readiness were associated with successful participation in an American data-sharing collaborative for quality improvement. Design Cross-sectional design, using an on-line survey of hospitals in a large, mature data-sharing collaborative organized for benchmarking and improvement in nursing care quality. Measurements Factor analysis was used to identify salient constructs, and identified factors were analyzed with respect to “successful” participation. “Success” was defined as the incorporation of comparative performance data into the hospital dashboard. Results The most important factor in predicting success included survey items measuring the strength of organizational leadership in fostering a culture of quality improvement (QI Leadership): 1) presence of a supportive hospital executive; 2) the extent to which a hospital values data; 3) the presence of leaders’ vision for how the collaborative advances the hospital’s strategic goals; 4) hospital use of the collaborative data to track quality outcomes; and 5) staff recognition of a strong mandate for collaborative participation (α = 0.84, correlation with Success 0.68 [P < 0.0001]). Conclusion The data emphasize the importance of hospital QI Leadership in collaboratives that aim to share data for QI or safety purposes. Such metrics should prove useful in the planning and development of this complex form of inter-organizational collaboration. PMID:21330191
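The internal-consistency figure reported for the QI Leadership factor (α = 0.84) is Cronbach's alpha, computed as α = k/(k−1) · (1 − Σσ²ᵢ/σ²_total) over the k survey items. A small sketch of the computation on toy data (the study itself used hospital survey responses, with factor analysis performed first):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of survey scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Toy data: 6 hospitals answering the 5 QI-Leadership items on a 1-5 scale.
rng = np.random.default_rng(0)
scores = rng.integers(1, 6, size=(6, 5)).astype(float)
print(round(cronbach_alpha(scores), 2))
```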
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morrell, William C.; Birkel, Garrett W.; Forrer, Mark
Although recent advances in synthetic biology allow us to produce biological designs more efficiently than ever, our ability to predict the end result of these designs is still nascent. Predictive models require large amounts of high-quality data to be parametrized and tested, which are not generally available. Here, we present the Experiment Data Depot (EDD), an online tool designed as a repository of experimental data and metadata. EDD provides a convenient way to upload a variety of data types, visualize these data, and export them in a standardized fashion for use with predictive algorithms. In this paper, we describe EDD and showcase its utility for three different use cases: storage of characterized synthetic biology parts, leveraging proteomics data to improve biofuel yield, and the use of extracellular metabolite concentrations to predict intracellular metabolic fluxes.
The Impact of Alternative Payment Models on Oncology Innovation and Patient Care.
Miller, Amy M; Omenn, Gilbert S; Kean, Marcia A
2016-05-15
Oncology care is in a time of major transformation. Scientific discovery is driving breakthroughs in prevention, diagnostics, and treatment, resulting in tremendous gains for patients as the number of cancer survivors continues to grow on an annual basis. At the same time, there is mounting pressure across the healthcare system to contain costs while improving the quality of cancer care. In response to this pressure, private and government payers are increasingly turning to tools such as alternative payment models (APM) and clinical pathways to improve the efficiency of care, inform coverage decisions, and support shared decision-making. As APMs, clinical pathways and other tools are utilized more broadly, it will be critical that these models support the evidence-based use of innovative biomedical advances, including personalized medicine, and deliver patient-centered, high-value care. Clin Cancer Res; 22(10); 2335-41. ©2016 AACR. ©2016 American Association for Cancer Research.
Morrell, William C.; Birkel, Garrett W.; Forrer, Mark; ...
2017-08-21
Although recent advances in synthetic biology allow us to produce biological designs more efficiently than ever, our ability to predict the end result of these designs is still nascent. Predictive models require large amounts of high-quality data to be parametrized and tested, which are not generally available. Here, we present the Experiment Data Depot (EDD), an online tool designed as a repository of experimental data and metadata. EDD provides a convenient way to upload a variety of data types, visualize these data, and export them in a standardized fashion for use with predictive algorithms. In this paper, we describe EDD and showcase its utility for three different use cases: storage of characterized synthetic biology parts, leveraging proteomics data to improve biofuel yield, and the use of extracellular metabolite concentrations to predict intracellular metabolic fluxes.
Morrell, William C; Birkel, Garrett W; Forrer, Mark; Lopez, Teresa; Backman, Tyler W H; Dussault, Michael; Petzold, Christopher J; Baidoo, Edward E K; Costello, Zak; Ando, David; Alonso-Gutierrez, Jorge; George, Kevin W; Mukhopadhyay, Aindrila; Vaino, Ian; Keasling, Jay D; Adams, Paul D; Hillson, Nathan J; Garcia Martin, Hector
2017-12-15
Although recent advances in synthetic biology allow us to produce biological designs more efficiently than ever, our ability to predict the end result of these designs is still nascent. Predictive models require large amounts of high-quality data to be parametrized and tested, which are not generally available. Here, we present the Experiment Data Depot (EDD), an online tool designed as a repository of experimental data and metadata. EDD provides a convenient way to upload a variety of data types, visualize these data, and export them in a standardized fashion for use with predictive algorithms. In this paper, we describe EDD and showcase its utility for three different use cases: storage of characterized synthetic biology parts, leveraging proteomics data to improve biofuel yield, and the use of extracellular metabolite concentrations to predict intracellular metabolic fluxes.
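EDD's workflow as described is upload, visualize, and export in a standardized form. The sketch below assumes a hypothetical REST endpoint layout and bearer-token authentication purely for illustration; the actual EDD API may differ:

```python
import requests

# Hypothetical endpoint paths -- the real EDD API may differ; this only
# sketches the upload-then-export pattern described in the abstract.
BASE = "https://edd.example.org/rest"

def upload_measurements(study_id: str, rows: list[dict], token: str) -> dict:
    """Push a batch of measurement rows into a study."""
    resp = requests.post(f"{BASE}/studies/{study_id}/measurements",
                         json=rows,
                         headers={"Authorization": f"Bearer {token}"})
    resp.raise_for_status()
    return resp.json()

def export_standardized(study_id: str, token: str) -> str:
    """Pull the study back out in a standardized tabular format."""
    resp = requests.get(f"{BASE}/studies/{study_id}/export",
                        params={"format": "csv"},
                        headers={"Authorization": f"Bearer {token}"})
    resp.raise_for_status()
    return resp.text  # table ready for use with predictive algorithms
```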
Aij, Kjeld Harald; Rapsaniotis, Sofia
2017-01-01
As health care organizations face pressures to improve quality and efficiency while reducing costs, leaders are adopting management techniques and tools used in manufacturing and other industries, especially Lean. Successful Lean leaders appear to use a coaching leadership style that shares underlying principles with servant leadership. There is little information about specific similarities and differences between Lean and servant leaderships. We systematically reviewed the literature on Lean leadership, servant leadership, and health care and performed a comparative analysis of attributes using Russell and Stone's leadership framework. We found significant overlap between the two leadership styles, although there were notable differences in origins, philosophy, characteristics and behaviors, and tools. We conclude that both Lean and servant leaderships are promising models that can contribute to the delivery of patient-centered, high-value care. Servant leadership may provide the means to engage and develop employees to become successful Lean leaders in health care organizations.
A low-cost sensing system for cooperative air quality monitoring in urban areas.
Brienza, Simone; Galli, Andrea; Anastasi, Giuseppe; Bruschi, Paolo
2015-05-26
Air quality in urban areas is a very important topic, as it closely affects the health of citizens. Recent studies highlight that exposure to polluted air can increase the incidence of disease and deteriorate quality of life. Hence, it is necessary to develop tools for real-time air quality monitoring, so as to allow appropriate and timely decisions. In this paper, we present uSense, a low-cost cooperative monitoring tool that provides real-time knowledge of the concentrations of polluting gases in various areas of a city. Specifically, users monitor the areas of their interest by deploying low-cost, low-power sensor nodes. In addition, they can share the collected data following a social networking approach. uSense has been tested through in-field experimentation performed in different areas of a city. The results obtained are in line with those provided by the local environmental control authority and show that uSense can be profitably used for air quality monitoring.
De-identification of health records using Anonym: effectiveness and robustness across datasets.
Zuccon, Guido; Kotzur, Daniel; Nguyen, Anthony; Bergheim, Anton
2014-07-01
Evaluate the effectiveness and robustness of Anonym, a tool for de-identifying free-text health records based on conditional random fields classifiers informed by linguistic and lexical features, as well as features extracted by pattern matching techniques. De-identification of personal health information in electronic health records is essential for the sharing and secondary usage of clinical data. De-identification tools that adapt to different sources of clinical data are attractive as they would require minimal intervention to guarantee high effectiveness. The effectiveness and robustness of Anonym are evaluated across multiple datasets, including the widely adopted Integrating Biology and the Bedside (i2b2) dataset, used for evaluation in a de-identification challenge. The datasets used here vary in type of health records, source of data, and their quality, with one of the datasets containing optical character recognition errors. Anonym identifies and removes up to 96.6% of personal health identifiers (recall) with a precision of up to 98.2% on the i2b2 dataset, outperforming the best system proposed in the i2b2 challenge. The effectiveness of Anonym across datasets is found to depend on the amount of information available for training. Findings show that Anonym compares to the best approach from the 2006 i2b2 shared task. It is easy to retrain Anonym with new datasets; if retrained, the system is robust to variations of training size, data type and quality in presence of sufficient training data. Crown Copyright © 2014. Published by Elsevier B.V. All rights reserved.
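Anonym's classifier is a conditional random field informed by linguistic, lexical, and pattern-matching features. The sketch below shows only a pattern-matching layer with illustrative regexes (not the tool's actual patterns); in the real system such matches feed the CRF as features rather than acting as the final de-identifier:

```python
import re

# Illustrative patterns only -- Anonym's core is a CRF classifier; regex
# matches like these supply some of its features.
PHI_PATTERNS = {
    "DATE":  re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
    "PHONE": re.compile(r"\b\d{2,4}[- ]\d{3,4}[- ]\d{3,4}\b"),
    "MRN":   re.compile(r"\bMRN[:\s]*\d{6,10}\b", re.IGNORECASE),
}

def scrub(text: str) -> str:
    """Replace each matched identifier with a category placeholder."""
    for label, pattern in PHI_PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(scrub("Seen 12/03/2014, MRN: 0048291, call 02-9999-1234."))
# Seen [DATE], [MRN], call [PHONE].
```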
Developing Quality Improvement capacity and capability across the Children in Fife partnership.
Morris, Craig; Alexander, Ingrid
2016-01-01
A Project Manager from the Fife Early Years Collaborative facilitated a large-scale Quality Improvement (herein QI) project to build organisational capacity and capability across the Children in Fife partnership through three separate, eight month training cohorts. This 18 month QI project enabled 32 practitioners to increase their skills, knowledge, and experiences in a variety of QI tools including the Model for Improvement which then supported the delivery of high quality improvement projects and improved outcomes for children and families. Essentially growing the confidence and capability of practitioners to deliver sustainable QI. 27 respective improvement projects were delivered, some leading to service redesign, reduced waiting times, increased uptake of health entitlements, and improved accessibility to front-line health services. 13 improvement projects spread or scaled beyond the initial site and informal QI mentoring took place with peers in respective agencies. Multiple PDSA cycles were conducted testing the most efficient and effective support mechanisms during and post training, maintaining regular contact, and utilising social media to share progress and achievements.
Gómez-García, Francisco; Ruano, Juan; Gay-Mimbrera, Jesus; Aguilar-Luque, Macarena; Sanz-Cabanillas, Juan Luis; Alcalde-Mellado, Patricia; Maestre-López, Beatriz; Carmona-Fernández, Pedro Jesús; González-Padilla, Marcelino; García-Nieto, Antonio Vélez; Isla-Tejera, Beatriz
2017-12-01
No gold standard exists to assess methodological quality of systematic reviews (SRs). Although Assessing the Methodological Quality of Systematic Reviews (AMSTAR) is widely accepted for analyzing quality, the ROBIS instrument has recently been developed. This study aimed to compare the capacity of both instruments to capture the quality of SRs concerning psoriasis interventions. Systematic literature searches were undertaken on relevant databases. For each review, methodological quality and bias risk were evaluated using the AMSTAR and ROBIS tools. Descriptive and principal component analyses were conducted to describe similarities and discrepancies between both assessment tools. We classified 139 intervention SRs as displaying high/moderate/low methodological quality and as high/low risk of bias. A high risk of bias was detected for most SRs classified as displaying high or moderate methodological quality by AMSTAR. When comparing ROBIS result profiles, responses to domain 4 signaling questions showed the greatest differences between bias risk assessments, whereas domain 2 items showed the least. When considering SRs published about psoriasis, methodological quality remains suboptimal, and the risk of bias is elevated, even for SRs exhibiting high methodological quality. Furthermore, the AMSTAR and ROBIS tools may be considered as complementary when conducting quality assessment of SRs. Copyright © 2017 Elsevier Inc. All rights reserved.
Johnsen, Bjørn Helge; Westli, Heidi Kristina; Espevik, Roar; Wisborg, Torben; Brattebø, Guttorm
2017-11-10
High-quality team leadership is important for the outcome of medical emergencies. However, the behavioral markers of leadership are not well defined. The present study investigated the effect of the frequency of behavioral markers of shared mental models (SMM) on the quality of medical management. Training video recordings of 27 trauma teams simulating emergencies were analyzed according to the team leader's frequency of shared mental model behavioral markers. The results showed a positive correlation of quality of medical management with leaders sharing information without an explicit demand for the information ("push" of information) and with leaders communicating their situational awareness (SA) and demonstrating implicit supporting behavior. When the sample was separated into higher- versus lower-performing teams, the higher-performing teams had leaders who displayed a greater frequency of "push" of information and communication of SA and supportive behavior. No difference was found for the behavioral marker of team initiative, measured as bringing up suggestions to other team members. The results of this study emphasize the team leader's role in initiating and updating a team's shared mental model. Team leaders should also set expectations for acceptable interaction patterns (e.g., promoting information exchange) and create a team climate that encourages behaviors such as mutual performance monitoring, backup behavior, and adaptability, to enhance SMM.
NASA Astrophysics Data System (ADS)
Pesquer, Lluís; Jirka, Simon; van de Giesen, Nick; Masó, Joan; Stasch, Christoph; Van Nooyen, Ronald; Prat, Ester; Pons, Xavier
2015-04-01
This work describes the strategy of the European Horizon 2020 project WaterInnEU. Its vision is to enhance the exploitation of EU-funded ICT models, tools, protocols and policy briefs related to the water sector and to establish suitable conditions for new market opportunities based on these offerings. The main goals are: • Connect the research results and developments of previous EU-funded activities with the data already available at the European level, and also with the companies that are able to offer products and services based on these tools and data. • Offer an independent marketplace platform, complemented by technical and commercial expertise as a service, allowing users to access the products and services best fitting their priorities, capabilities and procurement processes. One of the pillars of WaterInnEU is to stimulate and prioritize the application of international standards in ICT tools and policy briefs. The standardization of formats, services and processes will allow for harmonized water management across different sectors, fragmented areas and scales (local, regional or international). Several levels of interoperability will be addressed: • Syntactic: connecting systems and tools together. Syntactic interoperability allows client and service tools to automatically discover, access, and process data and information (query and exchange parts of a database) and to connect to each other in process chains. The discovery of water-related data is achieved using metadata cataloguing standards, in particular the one adopted by the INSPIRE directive: the OGC Catalogue Service for the Web (CSW). • Semantic: sharing a pan-European conceptual framework. This is the ability of computer systems to exchange data with unambiguous, shared meaning. The project therefore addresses not only the packaging of data (syntax) but also the simultaneous transmission of the meaning with the data (semantics). This is accomplished by linking each data element to a controlled, shared vocabulary. In Europe, INSPIRE defines a shared vocabulary and its associated links to an ontology; for hydrographical information this can be used as a baseline. • Organizational: harmonizing policy aspects. This level of interoperability deals with the operational methodologies and procedures that organizations use to administer their own data and processing capabilities and to share those capabilities with others. This layer is addressed by the adoption of common policy briefs that facilitate both robust protocols and the flexibility to interact with others. • Data visualization: making data easy to see. The WMS and WMTS standards are the most commonly used geographic information visualization standards for sharing information in web portals. Our solution will incorporate a quality extension of these standards for visualizing data quality as nested layers linked to the different data sets. In the presented approach, the use of standards is twofold: the tools and products should leverage standards wherever possible to ensure interoperability between solution providers, and the platform itself must utilize standards as much as possible, to allow, for example, integration with other systems through open APIs or the description of available items.
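Syntactic interoperability via catalogue standards can be exercised directly: the sketch below queries an INSPIRE-style OGC CSW endpoint for water-related records using the OWSLib library (the endpoint URL is a placeholder):

```python
from owslib.csw import CatalogueServiceWeb
from owslib.fes import PropertyIsLike

# Placeholder URL -- any CSW 2.0.2 endpoint, e.g. a GeoNetwork node, works.
csw = CatalogueServiceWeb("https://csw.example.eu/geonetwork/srv/eng/csw")

# Full-text constraint over all queryable fields.
query = PropertyIsLike("csw:AnyText", "%hydrography%")
csw.getrecords2(constraints=[query], maxrecords=10)

for rec_id, rec in csw.records.items():
    print(rec_id, "-", rec.title)
```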
Embedding learning from adverse incidents: a UK case study.
Eshareturi, Cyril; Serrant, Laura
2017-04-18
Purpose This paper reports on a regionally based UK study uncovering what has worked well in learning from adverse incidents in hospitals. The purpose of this paper is to review the incident investigation methodology used in identifying strengths or weaknesses and explore the use of a database as a tool to embed learning. Design/methodology/approach Documentary examination was conducted of all adverse incidents reported between 1 June 2011 and 30 June 2012 by three UK National Health Service hospitals. One root cause analysis report per adverse incident for each individual hospital was sent to an advisory group for a review. Using terms of reference supplied, the advisory group feedback was analysed using an inductive thematic approach. The emergent themes led to the generation of questions which informed seven in-depth semi-structured interviews. Findings "Time" and "work pressures" were identified as barriers to using adverse incident investigations as tools for quality enhancement. Methodologically, a weakness in approach was that no criteria influenced the techniques which were used in investigating adverse incidents. Regarding the sharing of learning, the use of a database as a tool to embed learning across the region was not supported. Practical implications Softer intelligence from adverse incident investigations could be usefully shared between hospitals through a regional forum. Originality/value The use of a database as a tool to facilitate the sharing of learning from adverse incidents across the health economy is not supported.
Electronic Conferencing Tools for Student Apprenticeship and Perspective Taking.
ERIC Educational Resources Information Center
Bonk, Curtis Jay; And Others
1996-01-01
Discusses three electronic conferencing tools that apprentice novice learners and encourage them to interact and grapple with alternative perspectives. Technical details of each tool are described, along with 1 instance where all 3 technologies were united, resulting in a highly interactive conversation shared by over 30 people at 4 different…
Quality audit--a review of the literature concerning delivery of continence care.
Swaffield, J
1995-09-01
This paper outlines the role of quality audit within the framework of quality assurance, presenting the concurrent and retrospective approaches available. The literature survey provides a review of the limited audit tools available and their application to continence services and care delivery, as well as attempts to produce tools from national and local standard setting. Audit is part of a process; it can involve staff, patients and their relatives and the team of professionals providing care, as well as focusing on organizational and management levels. In an era of market delivery of services there is a need to justify why audit is important to continence advisors and managers. Effectiveness, efficiency and economics may drive the National Health Service, but quality assurance, which includes standards and audit tools, offers the means to ensure the quality of continence services and care to patients; auditing is also required in purchaser/provider contracts for patient services. An overview of published and other projects in auditing continence care and services, and of progress to date, is presented. By outlining and highlighting the audit of continence service delivery and care as a basis on which to build quality assurance programmes, it is hoped that this knowledge will be shared through the setting up of a central audit clearing project.
Becoming-Learner: Coordinates for Mapping the Space and Subject of Nomadic Pedagogy
ERIC Educational Resources Information Center
Fendler, Rachel
2013-01-01
How can the process of "becoming learner" be observed, documented, and shared? What methodology could be used to discuss nomadic qualities of learning mobilities? This article argues in favor of an arts-based research approach, specifically social cartography, as a tool that can encourage young people to reflect on their identity as…
Brushstrokes: Styles and Techniques of Chinese Painting. A Teacher Workshop.
ERIC Educational Resources Information Center
Asian Art Museum of San Francisco, CA.
Brushwork is the essential characteristic of Chinese painting. Ink and brushwork provide the foundation of Chinese pictures, even when color also is used. In the quality of the brushwork the artist captures the spirit resonance, the raison d'etre of a painting. In China, painting and writing developed hand in hand, sharing the same tools and…
Youpi: YOUr processing PIpeline
NASA Astrophysics Data System (ADS)
Monnerville, Mathias; Sémah, Gregory
2012-03-01
Youpi is a portable, easy to use web application providing high level functionalities to perform data reduction on scientific FITS images. Built on top of various open source reduction tools released to the community by TERAPIX (http://terapix.iap.fr), Youpi can help organize data, manage processing jobs on a computer cluster in real time (using Condor) and facilitate teamwork by allowing fine-grain sharing of results and data. Youpi is modular and comes with plugins which perform, from within a browser, various processing tasks such as evaluating the quality of incoming images (using the QualityFITS software package), computing astrometric and photometric solutions (using SCAMP), resampling and co-adding FITS images (using SWarp) and extracting sources and building source catalogues from astronomical images (using SExtractor). Youpi is useful for small to medium-sized data reduction projects; it is free and is published under the GNU General Public License.
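Youpi's plugins wrap command-line TERAPIX tools behind a browser interface. A minimal sketch of the kind of call a source-extraction plugin ultimately issues, assuming SExtractor is installed as `sex` and using placeholder file paths:

```python
import subprocess

def extract_sources(fits_image: str, catalog_out: str) -> None:
    """Run SExtractor on a FITS image and write an ASCII source catalogue."""
    subprocess.run(
        ["sex", fits_image,
         "-c", "default.sex",              # standard SExtractor config file
         "-CATALOG_NAME", catalog_out,
         "-CATALOG_TYPE", "ASCII_HEAD"],
        check=True,
    )

extract_sources("exposure_001.fits", "sources.cat")
```

In Youpi such invocations are queued as Condor jobs on the cluster rather than run inline, with results shared back through the web interface.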
Uomini, Natalie Thaïs; Meyer, Georg Friedrich
2013-01-01
The popular theory that complex tool-making and language co-evolved in the human lineage rests on the hypothesis that both skills share underlying brain processes and systems. However, language and stone tool-making have so far only been studied separately using a range of neuroimaging techniques and diverse paradigms. We present the first-ever study of brain activation that directly compares active Acheulean tool-making and language. Using functional transcranial Doppler ultrasonography (fTCD), we measured brain blood flow lateralization patterns (hemodynamics) in subjects who performed two tasks designed to isolate the planning component of Acheulean stone tool-making and cued word generation as a language task. We show highly correlated hemodynamics in the initial 10 seconds of task execution. Stone tool-making and cued word generation cause common cerebral blood flow lateralization signatures in our participants. This is consistent with a shared neural substrate for prehistoric stone tool-making and language, and is compatible with language evolution theories that posit a co-evolution of language and manual praxis. In turn, our results support the hypothesis that aspects of language might have emerged as early as 1.75 million years ago, with the start of Acheulean technology.
Active Wiki Knowledge Repository
2012-10-01
Capabilities include: tools for consuming data using SPARQL queries or RESTful web services; 'gardening' tools for examining the semantically tagged content in the wiki; high-level language tools; semantic tagging backed by an RDF triple store, with fusion and inference support for collaboration; querying other stores using AW SPARQL queries and rendering templates; and interactively sharing maps and other content using annotation tools to post notes.
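The repository's data-consumption path is SPARQL over an RDF triple store. A minimal sketch with the SPARQLWrapper library (the endpoint URL and predicate IRI are placeholders for the wiki's actual store):

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Placeholder endpoint for the wiki's triple store.
sparql = SPARQLWrapper("https://wiki.example.org/sparql")
sparql.setQuery("""
    SELECT ?page ?tag WHERE {
        ?page <http://example.org/ontology#hasTag> ?tag .
    } LIMIT 20
""")
sparql.setReturnFormat(JSON)

# Each binding row maps variable names to values.
for row in sparql.query().convert()["results"]["bindings"]:
    print(row["page"]["value"], row["tag"]["value"])
```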
Motivating Communities To Go Beyond the Discovery Plateau
NASA Astrophysics Data System (ADS)
Habermann, T.; Kozimor, J.
2014-12-01
Years of emphasizing discovery and minimal metadata requirements have resulted in a culture that accepts that metadata are for discovery and complete metadata are too complex or difficult for researchers to understand and create. Evolving the culture past this "data-discovery plateau" requires a multi-faceted approach that addresses the rational and emotional sides of the problem. On the rational side, scientists know that data and results must be well documented in order to be reproducible, re-usable, and trustworthy. We need tools that script critical moves towards well-described destinations and help identify members of the community that are already leading the way towards those destinations. We need mechanisms that help those leaders share their experiences and examples. On the emotional side, we need to emphasize that high-quality metadata makes data trustworthy, divide the improvement process into digestible pieces and create mechanisms for clearly identifying and rewarding progress. We also need to provide clear opportunities for community members to increase their expertise and to share their skills.
Achieving effective staffing through a shared decision-making approach to open-shift management.
Valentine, Nancy M; Nash, Jan; Hughes, Douglas; Douglas, Kathy
2008-01-01
Managing costs while retaining qualified nurses and finding workforce solutions that ensure the delivery of high-quality patient care is of primary importance to nurse leaders and executive management. Leading healthcare organizations are using open-shift management technology as a strategy to improve staffing effectiveness and the work environment. In many hospitals, open-shift management technology has become an essential workforce management tool, nursing benefit, and recruitment and retention incentive. In this article, the authors discuss how a successful nursing initiative to apply automation to open-shift scheduling and fulfillment across a 3-hospital system had a broad enterprise-wide impact resulting in dramatic improvements in nurse satisfaction, retention, recruitment, and the bottom line.
Science education as a civil right: Urban schools and opportunity-to-learn considerations
NASA Astrophysics Data System (ADS)
Tate, William
2001-11-01
In this article I make the case that urban science education is a civil rights issue and that to effectively address it as such we must shift from arguments for civil rights as shared physical space in schools to demands for high-quality academic preparation that includes the opportunity to learn science. The argument is organized into two sections: first, a review of the school desegregation literature to make the case that urban science education for all is a civil rights issue; and second, an examination and critique of opportunity-to-learn literature, including an analysis of three opportunity-to-learn constructs to illustrate their potential as civil rights tools in science education.
Horwood, Christiane M; Youngleson, Michele S; Moses, Edward; Stern, Amy F; Barker, Pierre M
2015-07-01
Achieving long-term retention in HIV care is an important challenge for HIV management and achieving elimination of mother-to-child transmission. Sustainable, affordable strategies are required to achieve this, including strengthening of community-based interventions. Deployment of community-based health workers (CHWs) can improve health outcomes but there is a need to identify systems to support and maintain high-quality performance. Quality-improvement strategies have been successfully implemented to improve quality and coverage of healthcare in facilities and could provide a framework to support community-based interventions. Four community-based quality-improvement projects from South Africa, Malawi and Mozambique are described. Community-based improvement teams linked to the facility-based health system participated in learning networks (modified Breakthrough Series), and used quality-improvement methods to improve process performance. Teams were guided by trained quality mentors who used local data to help nurses and CHWs identify gaps in service provision and test solutions. Learning network participants gathered at intervals to share progress and identify successful strategies for improvement. CHWs demonstrated understanding of quality-improvement concepts, tools and methods, and implemented quality-improvement projects successfully. Challenges of using quality-improvement approaches in community settings included adapting processes, particularly data reporting, to the education level and first language of community members. Quality-improvement techniques can be implemented by CHWs to improve outcomes in community settings but these approaches require adaptation and additional mentoring support to be successful. More research is required to establish the effectiveness of this approach on processes and outcomes of care.
Ameisen, David; Deroulers, Christophe; Perrier, Valérie; Bouhidel, Fatiha; Battistella, Maxime; Legrès, Luc; Janin, Anne; Bertheau, Philippe; Yunès, Jean-Baptiste
2014-01-01
Since microscopic slides can now be automatically digitized and integrated into the clinical workflow, quality assessment of Whole Slide Images (WSI) has become a crucial issue. We present a no-reference quality assessment method that has been thoroughly tested since 2010 and is under implementation at multiple sites, both public university hospitals and private entities. It is part of the FlexMIm R&D project, which aims to improve the global workflow of digital pathology. For these uses, we have developed two programming libraries, in Java and Python, which can be integrated in various types of WSI acquisition systems, viewers and image analysis tools. Development and testing have been carried out on a MacBook Pro i7 and on a dual-Xeon 2.7GHz server. Libraries implementing the blur assessment method have been developed in Java, Python, PHP5 and MySQL5. For web applications, JavaScript, Ajax, JSON and Sockets were also used, as well as the Google Maps API. Aperio SVS files were converted into the Google Maps format using the VIPS and OpenSlide libraries. We designed the Java library as a Service Provider Interface (SPI), extendable by third parties. Analysis is computed in real time (3 billion pixels per minute). Tests were made on 5000 single images, 200 NDPI WSI, and 100 Aperio SVS WSI converted to the Google Maps format. Applications based on our method and libraries can be used upstream, as a calibration and quality-control tool for WSI acquisition systems, or as tools to reacquire tiles while the WSI is being scanned. They can also be used downstream to reacquire complete slides that fall below the quality threshold for surgical pathology analysis. WSI may also be displayed in a smarter way by sending and displaying the regions of highest quality before other regions. Such quality assessment scores could be integrated as WSI metadata shared in clinical, research or teaching contexts, for a more efficient medical informatics workflow.
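The abstract does not spell out the blur metric itself, so the sketch below uses a common no-reference proxy, the variance of the Laplacian, computed tile by tile; the tile size and threshold are arbitrary here and would need calibration per scanner, and this is not necessarily the authors' exact method:

```python
import numpy as np
from PIL import Image
from scipy import ndimage

def blur_score(tile: np.ndarray) -> float:
    """Variance of the Laplacian: low values suggest a blurred tile."""
    return float(ndimage.laplace(tile.astype(float)).var())

# Score a slide region tile-by-tile (file path and tile size are placeholders).
img = np.asarray(Image.open("wsi_region.png").convert("L"))
ts = 256
for y in range(0, img.shape[0] - ts + 1, ts):
    for x in range(0, img.shape[1] - ts + 1, ts):
        score = blur_score(img[y:y + ts, x:x + ts])
        if score < 50.0:  # threshold would need per-scanner calibration
            print(f"tile ({x},{y}) flagged for reacquisition: {score:.1f}")
```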
Strategies for Increasing the Market Share of Recycled Products—A Games Theory Approach
NASA Astrophysics Data System (ADS)
Batzias, Dimitris F.; Pollalis, Yannis A.
2009-08-01
A methodological framework (including 28 activity stages and 10 decision nodes) has been designed in the form of an algorithmic procedure for developing strategies to increase the market share of recycled products within a game-theory context. A case example is presented, referring to a paper market in which a recycling company (RC) competes with a virgin-raw-material-using company (VC). The strategies of the VC for increasing its market share are: strengthening of (and advertisement based on) its high quality (VC1); its high reliability (VC2); the combination of quality and reliability, with emphasis on the first component (VC3); and the combination of quality and reliability, with emphasis on the second component (VC4). The strategies of the RC for increasing its market share are: proper advertisement based on the low price of the recycled paper it produces, which satisfies minimum quality requirements (RC1); the combination of low price with sensitization of the public to environmental and materials-saving issues, with emphasis on the first component (RC2); and the same combination, with emphasis on the second component (RC3). An analysis of all possible situations for the case example under examination is also presented.
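With the two strategy sets above, the game can be summarized as a payoff matrix. The numbers below are invented for illustration (the framework derives its actual values through its 28 activity stages); the sketch computes RC's conservative maximin choice:

```python
import numpy as np

# Hypothetical payoffs: market-share points gained by RC.
# Rows: RC strategies RC1-RC3; columns: VC strategies VC1-VC4.
rc_payoff = np.array([[3, 2, 1, 2],
                      [4, 3, 2, 2],
                      [2, 4, 3, 1]])

# RC's maximin (conservative) choice: the row with the best worst case.
worst_case = rc_payoff.min(axis=1)
best_row = int(worst_case.argmax())
print(f"RC maximin strategy: RC{best_row + 1}, "
      f"guaranteed gain {worst_case[best_row]}")
# With these toy numbers, RC2 guarantees at least 2 points whatever VC plays.
```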
[A development and evaluation of nursing KMS using QFD in outpatient departments].
Lee, Han Na; Yun, Eun Kyoung
2014-02-01
This study was done to develop and implement the Nursing KMS (knowledge management system) in order to improve knowledge sharing and creation among clinical nurses in outpatient departments. This was methodological research using the 'System Development Life Cycle', consisting of planning, analysis, design, implementation, and evaluation. Quality Function Deployment (QFD) was applied to establish nurse requirements and to identify important design requirements. Participants were 32 nurses, and evaluation data were collected pre- and post-intervention at K Hospital in Seoul, a tertiary hospital with over 1,000 beds. The Nursing KMS was built using a Linux-based operating system, Oracle DBMS, and Java 1.6 web programming tools, and was implemented as a sub-system of the hospital information system. There were statistically significant differences in knowledge sharing, but no statistically significant difference was observed in knowledge creation. In terms of satisfaction with the system, system efficiency ranked first, followed by system convenience, information suitability and information usefulness. The results indicate that use of the Nursing KMS increases nurses' knowledge sharing and can contribute to increased quality of nursing knowledge and provide more opportunities for nurses to gain expertise from knowledge shared among nurses.
Job-Sharing Couples in Academia: Administrative Policies and Practices.
ERIC Educational Resources Information Center
Mikitka, Kathleen Faith
1984-01-01
Examined existing administrative policies and procedures for academic job sharing for married couples in a survey of 12 institutions and 16 administrators. Results suggested growing consideration of job sharing by academic employers and pointed out advantages such as attracting high-quality faculty and extending faculty resources. (JAC)
Complying with physician gain-sharing restrictions.
O'Hare, P K
1998-05-01
Many IDSs are considering implementing gain-sharing programs as a way to motivate their physicians to provide high-quality, cost-effective services. Before embarking on such programs, however, IDSs need to understand the legal requirements associated with such programs to ensure that the gain-sharing arrangement is in compliance with Federal law.
National Fusion Collaboratory: Grid Computing for Simulations and Experiments
NASA Astrophysics Data System (ADS)
Greenwald, Martin
2004-05-01
The National Fusion Collaboratory Project is creating a computational grid designed to advance scientific understanding and innovation in magnetic fusion research by facilitating collaborations, enabling more effective integration of experiments, theory and modeling and allowing more efficient use of experimental facilities. The philosophy of FusionGrid is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as network available services, easily used by the fusion scientist. In such an environment, access to services is stressed rather than portability. By building on a foundation of established computer science toolkits, deployment time can be minimized. These services all share the same basic infrastructure that allows for secure authentication and resource authorization which allows stakeholders to control their own resources such as computers, data and experiments. Code developers can control intellectual property, and fair use of shared resources can be demonstrated and controlled. A key goal is to shield scientific users from the implementation details such that transparency and ease-of-use are maximized. The first FusionGrid service deployed was the TRANSP code, a widely used tool for transport analysis. Tools for run preparation, submission, monitoring and management have been developed and shared among a wide user base. This approach saves user sites from the laborious effort of maintaining such a large and complex code while at the same time reducing the burden on the development team by avoiding the need to support a large number of heterogeneous installations. Shared visualization and A/V tools are being developed and deployed to enhance long-distance collaborations. These include desktop versions of the Access Grid, a highly capable multi-point remote conferencing tool and capabilities for sharing displays and analysis tools over local and wide-area networks.
Codifying Collegiality: Recent Developments in Data Sharing Policy in the Life Sciences
Pham-Kanter, Genevieve; Zinner, Darren E.; Campbell, Eric G.
2014-01-01
Over the last decade, there have been significant changes in data sharing policies and in the data sharing environment faced by life science researchers. Using data from a 2013 survey of over 1600 life science researchers, we analyze the effects of sharing policies of funding agencies and journals. We also examine the effects of new sharing infrastructure and tools (i.e., third party repositories and online supplements). We find that recently enacted data sharing policies and new sharing infrastructure and tools have had a sizable effect on encouraging data sharing. In particular, third party repositories and online supplements as well as data sharing requirements of funding agencies, particularly the NIH and the National Human Genome Research Institute, were perceived by scientists to have had a large effect on facilitating data sharing. In addition, we found a high degree of compliance with these new policies, although noncompliance resulted in few formal or informal sanctions. Despite the overall effectiveness of data sharing policies, some significant gaps remain: about one third of grant reviewers placed no weight on data sharing plans in their reviews, and a similar percentage ignored the requirements of material transfer agreements. These patterns suggest that although most of these new policies have been effective, there is still room for policy improvement. PMID:25259842
BingEO: Enable Distributed Earth Observation Data for Environmental Research
NASA Astrophysics Data System (ADS)
Wu, H.; Yang, C.; Xu, Y.
2010-12-01
Our planet is facing great environmental challenges, including global climate change, environmental vulnerability, extreme poverty, and a shortage of clean, cheap energy. To address these problems, scientists are developing various models to analyze, forecast, and simulate geospatial phenomena in support of critical decision making. These models not only challenge our computing technology but also challenge us to feed their huge demands for earth observation data. Through various policies and programs, the open and free sharing of earth observation data is advocated in earth science. Currently, thousands of data sources are freely available online through open standards such as Web Map Service (WMS), Web Feature Service (WFS), and Web Coverage Service (WCS). Seamless sharing of, and access to, these resources calls for a spatial Cyberinfrastructure (CI) to enable the use of spatial data for the advancement of related applied sciences, including environmental research. Based on the Microsoft Bing Search Engine and Bing Maps, a seamlessly integrated and visual tool is under development to bridge the gap between researchers/educators and earth observation data providers. With this tool, earth science researchers and educators can easily and visually find the best datasets for their research and education. The tool includes a registry and its related supporting module on the server side and an integrated portal as its client. The proposed portal, Bing Earth Observation (BingEO), is based on Bing Search and Bing Maps to: 1) use Bing Search to discover Web Map Service (WMS) resources available over the internet; 2) develop and maintain a registry to manage all available WMS resources and constantly monitor their service quality; 3) allow users to manually register data services; and 4) provide a Bing Maps-based Web application to visualize the data on a high-quality, easy-to-manipulate map platform and enable users to select the best data layers online. Given the amount of observation data already accumulated and still growing, BingEO will allow these resources to be utilized more widely, intensively, efficiently, and economically in earth science applications.
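Goal 2 above, maintaining a registry that monitors WMS service quality, reduces in its simplest form to periodically fetching each service's capabilities document. A sketch using OWSLib (the endpoint URL is a placeholder):

```python
from owslib.wms import WebMapService

def check_wms(url: str) -> dict:
    """Fetch GetCapabilities and return basic health/inventory information."""
    wms = WebMapService(url, version="1.1.1")
    return {
        "title": wms.identification.title,
        "layers": list(wms.contents),   # named layers the service advertises
    }

# A registry would loop over all registered endpoints and log failures.
info = check_wms("https://gis.example.org/wms")
print(info["title"], "-", len(info["layers"]), "layers")
```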
Witteman, Holly O; Hafeez, Baria; Provencher, Thierry; Van de Graaf, Mary; Wei, Esther
2015-01-01
Background A critical problem for patients with chronic conditions who see multiple health care providers is incomplete or inaccurate information, which can contribute to lack of care coordination, low quality of care, and medical errors. Objective As part of a larger project on applications of consumer health information technology (HIT) and barriers to its use, we conducted a semistructured interview study with patients with multiple chronic conditions (MCC) with the objective of exploring their role in managing their personal health information. Methods Semistructured interviews were conducted with patients and providers. Patients were eligible if they had multiple chronic conditions and were in regular care with one of two medical organizations in New York City; health care providers were eligible if they had experience caring for patients with multiple chronic conditions. Analysis was conducted from a grounded theory perspective, and recruitment was concluded when saturation was achieved. Results A total of 22 patients and 7 providers were interviewed; patients had an average of 3.5 (SD 1.5) chronic conditions and reported having regular relationships with an average of 5 providers. Four major themes arose: (1) Responsibility for managing medical information: some patients perceived information management and sharing as the responsibility of health care providers; others—particularly those who had had bad experiences in the past—took primary responsibility for information sharing; (2) What information should be shared: although privacy concerns did influence some patients’ perceptions of sharing of medical data, decisions about what to share were also heavily influenced by their understanding of health and disease and by the degree to which they understood the health care system; (3) Methods and tools varied: those patients who did take an active role in managing their records used a variety of electronic tools, paper tools, and memory; and (4) Information management as invisible work: managing transfers of medical information to solve problems was a tremendous amount of work that was largely unrecognized by the medical establishment. Conclusions We conclude that personal health information management should be recognized as an additional burden that MCC places upon patients. Effective structural solutions for information sharing, whether institutional ones such as care management or technological ones such as electronic health information exchange, are likely not only to improve the quality of information shared but reduce the burden on patients already weighed down by MCC. PMID:26043709
Communicating and visualizing data quality through Web Map Services
NASA Astrophysics Data System (ADS)
Roberts, Charles; Blower, Jon; Maso, Joan; Diaz, Daniel; Griffiths, Guy; Lewis, Jane
2014-05-01
The sharing and visualization of environmental data through OGC Web Map Services is becoming increasingly common. However, information about the quality of data is rarely presented. (In this presentation we consider mostly data uncertainty as a measure of quality, although we acknowledge that many other quality measures are relevant to the geoscience community.) In the context of the GeoViQua project (http://www.geoviqua.org) we have developed conventions and tools for using WMS to deliver data quality information. The "WMS-Q" convention describes how the WMS specification can be used to publish quality information at the level of datasets, variables and individual pixels (samples). WMS-Q requires no extensions to the WMS 1.3.0 specification, being entirely backward-compatible. (An earlier version of WMS-Q was published as OGC Engineering Report 12-160.) To complement the WMS-Q convention, we have also developed extensions to the OGC Symbology Encoding (SE) specification, enabling uncertain geoscience data to be portrayed using a variety of visualization techniques. These include contours, stippling, blackening, whitening, opacity, bivariate colour maps, confidence interval triangles and glyphs. There may also be more extensive applications of these methods beyond the visual representation of uncertainty. In this presentation we will briefly describe the scope of the WMS-Q and "extended SE" specifications and then demonstrate the innovations using open-source software based upon ncWMS (http://ncwms.sf.net). We apply the tools to a variety of datasets including Earth Observation data from the European Space Agency's Climate Change Initiative. The software allows uncertain raster data to be shared through Web Map Services, giving the user fine control over data visualization.
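Because WMS-Q stays backward-compatible with WMS 1.3.0, a quality-aware layer is requested like any other GetMap call, differing only in the advertised layer and style names. The sketch below builds a plain GetMap request; the server URL, layer name, and the uncertainty style name are illustrative assumptions, since a WMS-Q server advertises its actual quality styles in its capabilities document:

```python
import requests

# Standard WMS 1.3.0 GetMap parameters; "stippled-uncertainty" stands in for
# whatever quality-visualization style the server advertises.
params = {
    "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
    "LAYERS": "sst_mean", "STYLES": "stippled-uncertainty",
    "CRS": "CRS:84", "BBOX": "-180,-90,180,90",
    "WIDTH": 1024, "HEIGHT": 512, "FORMAT": "image/png",
}
resp = requests.get("https://ncwms.example.org/wms", params=params)
resp.raise_for_status()

with open("sst_with_uncertainty.png", "wb") as f:
    f.write(resp.content)  # map image with uncertainty rendered in the style
```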
Selen, Arzu; Cruañes, Maria T; Müllertz, Anette; Dickinson, Paul A; Cook, Jack A; Polli, James E; Kesisoglou, Filippos; Crison, John; Johnson, Kevin C; Muirhead, Gordon T; Schofield, Timothy; Tsong, Yi
2010-09-01
A biopharmaceutics and Quality by Design (QbD) conference was held on June 10-12, 2009 in Rockville, Maryland, USA to provide a forum and identify approaches for enhancing product quality for patient benefit. Presentations concerned the current biopharmaceutical toolbox (i.e., in vitro, in silico, pre-clinical, in vivo, and statistical approaches), as well as case studies, and reflections on new paradigms. Plenary and breakout session discussions evaluated the current state and envisioned a future state that more effectively integrates QbD and biopharmaceutics. Breakout groups discussed the following four topics: Integrating Biopharmaceutical Assessment into the QbD Paradigm, Predictive Statistical Tools, Predictive Mechanistic Tools, and Predictive Analytical Tools. Nine priority areas, further described in this report, were identified for advancing integration of biopharmaceutics and support a more fundamentally based, integrated approach to setting product dissolution/release acceptance criteria. Collaboration among a broad range of disciplines and fostering a knowledge sharing environment that places the patient's needs as the focus of drug development, consistent with science- and risk-based spirit of QbD, were identified as key components of the path forward.
Gambrill, E
1999-03-01
Encouraging professionals, both in training and later in practice, to consider practice-related research findings when making important clinical decisions is an ongoing concern. Evidence-Based Medicine (EBM) and the Cochrane Collaboration (CC) provide a source of tools and ideas for doing so, as well as a roster of colleagues who share this interest. Evidence-based medicine involves integrating clinical expertise with the best available external evidence from systematic research, as well as considering the values and expectations of patients/clients. Advantage can be taken of educational formats developed in EBM, such as problem-based learning and critical-appraisal workshops in which participants learn how to ask key answerable questions related to important clinical practice questions (e.g., regarding effectiveness, accuracy of assessment measures, prediction, prevention, and quality of clinical practice guidelines) and to access and critically appraise related research. The Cochrane Collaboration is a world-wide network of centers that prepare, maintain, and disseminate high-quality systematic reviews on the efficacy of healthcare. These databases allow access to evidence related to clinical practice decisions. Forging reciprocal working relationships with those involved in EBM and the CC should contribute to the pursuit of shared goals, such as basing clinical decisions on the best available evidence and involving clients as informed consumers.
Baldassano, Steven N; Brinkmann, Benjamin H; Ung, Hoameng; Blevins, Tyler; Conrad, Erin C; Leyde, Kent; Cook, Mark J; Khambhati, Ankit N; Wagenaar, Joost B; Worrell, Gregory A; Litt, Brian
2017-06-01
There exist significant clinical and basic research needs for accurate, automated seizure detection algorithms. These algorithms have translational potential in responsive neurostimulation devices and in automatic parsing of continuous intracranial electroencephalography data. An important barrier to developing accurate, validated algorithms for seizure detection is limited access to high-quality, expertly annotated seizure data from prolonged recordings. To overcome this, we hosted a kaggle.com competition to crowdsource the development of seizure detection algorithms using intracranial electroencephalography from canines and humans with epilepsy. The top three performing algorithms from the contest were then validated on out-of-sample patient data including standard clinical data and continuous ambulatory human data obtained over several years using the implantable NeuroVista seizure advisory system. Two hundred teams of data scientists from all over the world participated in the kaggle.com competition. The top performing teams submitted highly accurate algorithms with consistent performance in the out-of-sample validation study. The performance of these seizure detection algorithms, achieved using freely available code and data, sets a new reproducible benchmark for personalized seizure detection. We have also shared a 'plug and play' pipeline to allow other researchers to easily use these algorithms on their own datasets. The success of this competition demonstrates how sharing code and high quality data results in the creation of powerful translational tools with significant potential to impact patient care. © The Author (2017). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
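The winning contest algorithms are more elaborate, but a classic baseline for intracranial EEG seizure detection, of the kind such pipelines build on, is windowed line length compared against an adaptive threshold. A self-contained sketch on a synthetic trace (the window length, ratio, and burst parameters are illustrative):

```python
import numpy as np

def line_length(window: np.ndarray) -> float:
    """Sum of absolute sample-to-sample differences, a classic iEEG feature."""
    return float(np.abs(np.diff(window)).sum())

def detect(signal: np.ndarray, fs: int, win_s: float = 2.0, ratio: float = 1.5):
    """Flag windows whose line length exceeds ratio x the median window value."""
    n = int(win_s * fs)
    starts = range(0, len(signal) - n + 1, n)
    lls = np.array([line_length(signal[s:s + n]) for s in starts])
    thresh = ratio * np.median(lls)
    return [s / fs for s, ll in zip(starts, lls) if ll > thresh]

# Toy trace: 60 s of noise with a high-amplitude 20 Hz burst at t = 20-24 s.
fs = 256
t = np.arange(0, 60, 1 / fs)
eeg = np.random.default_rng(1).normal(0, 1, t.size)
eeg[20 * fs:24 * fs] += 8 * np.sin(2 * np.pi * 20 * t[20 * fs:24 * fs])
print(detect(eeg, fs))  # expect flags near t = 20-24 s
```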
Vermont Core Standards and Self-Assessment Tool for Center-Based Early Childhood Programs.
ERIC Educational Resources Information Center
Vermont State Agency of Human Services, Waterbury.
In response to the desire to create for child development services a unified system which shares common standards for quality and respects the diversity and uniqueness of individuals and of programs, a committee of the Early Childhood Work Group collected and compared all the different standards now in force for the early childhood programs in the…
Moulton, Haley; Tosteson, Tor D; Zhao, Wenyan; Pearson, Loretta; Mycek, Kristina; Scherer, Emily; Weinstein, James N; Pearson, Adam; Abdu, William; Schwarz, Susan; Kelly, Michael; McGuire, Kevin; Milam, Alden; Lurie, Jon D
2018-06-05
Prospective evaluation of an informational web-based calculator for communicating estimates of personalized treatment outcomes. To evaluate the usability, effectiveness in communicating benefits and risks, and impact on decision quality of a calculator tool for patients with intervertebral disc herniations, spinal stenosis, and degenerative spondylolisthesis who are deciding between surgical and non-surgical treatments. The decision to have back surgery is preference-sensitive and warrants shared decision-making. However, more patient-specific, individualized tools for presenting clinical evidence on treatment outcomes are needed. Using Spine Patient Outcomes Research Trial (SPORT) data, prediction models were designed and integrated into a web-based calculator tool: http://spinesurgerycalc.dartmouth.edu/calc/. Consumer Reports subscribers with back-related pain were invited to use the calculator via email, and patient participants were recruited to use the calculator in a prospective manner following an initial appointment at participating spine centers. Participants completed questionnaires before and after using the calculator. We randomly assigned previously validated questions that tested knowledge about the treatment options to be asked either before or after viewing the calculator. 1,256 Consumer Reports subscribers and 68 patient participants completed the calculator and questionnaires. Knowledge scores were higher in the post-calculator group compared to the pre-calculator group, indicating that calculator usage successfully informed users. Decisional conflict was lower when measured following calculator use, suggesting the calculator was beneficial in the decision-making process. Participants generally found the tool helpful and easy to use. While the calculator is not a comprehensive decision aid, it does focus on communicating individualized risks and benefits for treatment options. Moreover, it appears to be helpful in achieving the goals of more traditional shared decision-making tools. It not only improved knowledge scores but also improved other aspects of decision quality.
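The SPORT prediction models behind the calculator are not given in the abstract; as a minimal sketch of how such a calculator typically turns a fitted logistic model into a personalized outcome probability, the following uses entirely hypothetical coefficients and predictor names.

    import math

    # Hypothetical coefficients; the actual SPORT model terms are not reproduced here.
    INTERCEPT = -1.2
    COEFS = {"age_over_65": 0.4, "symptom_duration_gt_6mo": 0.7, "smoker": 0.3}

    def predicted_probability(patient):
        """Logistic model: p = 1 / (1 + exp(-(b0 + sum of b_i * x_i)))."""
        z = INTERCEPT + sum(b * patient.get(name, 0) for name, b in COEFS.items())
        return 1.0 / (1.0 + math.exp(-z))

    print(predicted_probability({"age_over_65": 1, "smoker": 1}))  # ~0.38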
Using "get with the guidelines" to improve cardiovascular secondary prevention.
LaBresh, Kenneth A; Gliklich, Richard; Liljestrand, James; Peto, Randolph; Ellrodt, A Gray
2003-10-01
"Get With The Guidelines (GWTG)" was developed and piloted by the American Heart Association (AHA), New England Affiliate; MassPRO, Inc.; and other organizations to reduce the gap in the application of secondary prevention guidelines in hospitalized cardiovascular disease patients. Collaborative learning programs and technology solutions were created for the project. The interactive Web-based patient management tool (PMT) was developed using quality measures derived from the AHA/American College of Cardiology secondary prevention guidelines. It provided data entry, embedded reminders and guideline summaries, and online reports of quality measure performance, including comparisons with the aggregate performance of all hospitals. Multidisciplinary teams from 24 hospitals participated in the 2000-2001 pilot. Four collaborative learning sessions and monthly conference calls supported team interaction. Best-practices sharing and the use of an Internet tool enabled hospitals to change systems and collect data on 1,738 patients. The GWTG program, a template of learning sessions with didactic presentations, best-practices sharing, and collaborative multidisciplinary team meetings supported by the Internet-based data collection and reporting system, can be extended to multiple regions without requiring additional development. Following the completion of the pilot, the AHA adopted GWTG as a national program.
Winter, Alfred; Takabayashi, Katsuhiko; Jahn, Franziska; Kimura, Eizen; Engelbrecht, Rolf; Haux, Reinhold; Honda, Masayuki; Hübner, Ursula H; Inoue, Sozo; Kohl, Christian D; Matsumoto, Takehiro; Matsumura, Yasushi; Miyo, Kengo; Nakashima, Naoki; Prokosch, Hans-Ulrich; Staemmler, Martin
2017-08-07
For more than 30 years, there has been close cooperation between Japanese and German scientists with regard to information systems in health care. Collaboration has been formalized by an agreement between the respective scientific associations. Following this agreement, two joint workshops took place to explore the similarities and differences of electronic health record systems (EHRS) against the background of the two national healthcare systems that share many commonalities. To establish a framework and requirements for the quality of EHRS that may also serve as a basis for comparing different EHRS. Donabedian's three dimensions of quality of medical care were adapted to the outcome, process, and structural quality of EHRS and their management. These quality dimensions were proposed before the first workshop of EHRS experts and enriched during the discussions. The Quality Requirements Framework of EHRS (QRF-EHRS) was defined and complemented by requirements for high quality EHRS. The framework integrates three quality dimensions (outcome, process, and structural quality), three layers of information systems (processes and data, applications, and physical tools) and three dimensions of information management (strategic, tactical, and operational information management). Describing and comparing the quality of EHRS is in fact a multidimensional problem as given by the QRF-EHRS framework. This framework will be utilized to compare Japanese and German EHRS, notably those that were presented at the second workshop.
MilxXplore: a web-based system to explore large imaging datasets.
Bourgeat, P; Dore, V; Villemagne, V L; Rowe, C C; Salvado, O; Fripp, J
2013-01-01
As large-scale medical imaging studies are becoming more common, there is an increasing reliance on automated software to extract quantitative information from these images. As the size of the cohorts keeps increasing with large studies, there is also a need for tools that allow results from automated image processing and analysis to be presented in a way that enables fast and efficient quality checking, tagging and reporting on cases in which automatic processing failed or was problematic. MilxXplore is an open source visualization platform, which provides an interface to navigate and explore imaging data in a web browser, giving the end user the opportunity to perform quality control and reporting in a user-friendly, collaborative and efficient way. Compared to existing software solutions that often provide an overview of the results at the individual subject level, MilxXplore pools the results of individual subjects and time points together, allowing easy and efficient navigation and browsing through the different acquisitions of a subject over time, and comparison of the results against the rest of the population. MilxXplore is fast, flexible and allows remote quality checks of processed imaging data, facilitating data sharing and collaboration across multiple locations, and can be easily integrated into a cloud computing pipeline. With the growing trend of open data and open science, such a tool will become increasingly important to share and publish results of imaging analysis.
Development of a personalized decision aid for breast cancer risk reduction and management.
Ozanne, Elissa M; Howe, Rebecca; Omer, Zehra; Esserman, Laura J
2014-01-14
Breast cancer risk reduction has the potential to decrease the incidence of the disease, yet remains underused. We report on the development of a web-based tool that provides automated risk assessment and personalized decision support designed for collaborative use between patients and clinicians. Under Institutional Review Board approval, we evaluated the decision tool through a patient focus group, usability testing, and provider interviews (including breast specialists, primary care physicians, and genetic counselors). This included demonstrations and data collection at two scientific conferences (2009 International Shared Decision Making Conference, 2009 San Antonio Breast Cancer Symposium). Overall, the evaluations were favorable. The patient focus group evaluations and usability testing (N = 34) provided qualitative feedback about format and design; 88% of these participants found the tool useful and 94% found it easy to use. Of the providers (N = 23), 91% indicated that they would use the tool in their clinical setting. BreastHealthDecisions.org represents a new approach to breast cancer prevention care and a framework for high-quality preventive healthcare. The ability to integrate risk assessment and decision support in real time will allow for informed, value-driven, and patient-centered breast cancer prevention decisions. The tool is being further evaluated in the clinical setting.
Kannan, V; Fish, JS; Mutz, JM; Carrington, AR; Lai, K; Davis, LS; Youngblood, JE; Rauschuber, MR; Flores, KA; Sara, EJ; Bhat, DG; Willett, DL
2017-01-01
Summary. Background: Creation of a new electronic health record (EHR)-based registry often can be a "one-off" complex endeavor: first developing new EHR data collection and clinical decision support tools, followed by developing registry-specific data extractions from the EHR for analysis. Each development phase typically has its own long development and testing time, leading to a prolonged overall cycle time for delivering one functioning registry with companion reporting into production. The next registry request then starts from scratch. Such an approach will not scale to meet the emerging demand for specialty registries to support population health and value-based care. Objective: To determine if the creation of EHR-based specialty registries could be markedly accelerated by employing (a) a finite core set of EHR data collection principles and methods, (b) concurrent engineering of data extraction and data warehouse design using a common dimensional data model for all registries, and (c) agile development methods commonly employed in new product development. Methods: We adopted as guiding principles to (a) capture data as a by-product of care of the patient, (b) reinforce optimal EHR use by clinicians, (c) employ a finite but robust set of EHR data capture tool types, and (d) leverage our existing technology toolkit. Registries were defined by a shared condition (recorded on the Problem List) or a shared exposure to a procedure (recorded on the Surgical History) or to a medication (recorded on the Medication List). Any EHR fields needed—either to determine registry membership or to calculate a registry-associated clinical quality measure (CQM)—were included in the enterprise data warehouse (EDW) shared dimensional data model. Extract-transform-load (ETL) code was written to pull data at defined "grains" from the EHR into the EDW model. All calculated CQM values were stored in a single Fact table in the EDW crossing all registries. Registry-specific dashboards were created in the EHR to display both (a) real-time patient lists of registry patients and (b) EDW-generated CQM data. Agile project management methods were employed, including co-development, lightweight requirements documentation with User Stories and acceptance criteria, and time-boxed iterative development of EHR features in 2-week "sprints" for rapid-cycle feedback and refinement. Results: Using this approach, in calendar year 2015 we developed a total of 43 specialty chronic disease registries, with 111 new EHR data collection and clinical decision support tools, 163 new clinical quality measures, and 30 clinic-specific dashboards reporting on both real-time patient care gaps and summarized and vetted CQM measure performance trends. Conclusions: This study suggests that concurrent design of EHR data collection tools and reporting can quickly yield useful EHR structured data for chronic disease registries, and bodes well for efforts to migrate away from manual abstraction. This work also supports the view that in new EHR-based registry development, as in new product development, adopting agile principles and practices can help deliver valued, high-quality features early and often. PMID:28930362
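A minimal sketch of the shared dimensional model the authors describe: one dimension table per entity and a single fact table holding calculated CQM values across all registries. The table and column names below are illustrative assumptions, not the paper's schema.

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE DimPatient  (patient_id INTEGER PRIMARY KEY, name TEXT);
        CREATE TABLE DimRegistry (registry_id INTEGER PRIMARY KEY, condition TEXT);
        -- One fact table stores every calculated CQM value, crossing all registries.
        CREATE TABLE FactCQM (
            patient_id  INTEGER REFERENCES DimPatient(patient_id),
            registry_id INTEGER REFERENCES DimRegistry(registry_id),
            measure     TEXT,
            value       REAL,
            measured_on TEXT);
    """)
    con.execute("INSERT INTO DimPatient  VALUES (1, 'Test Patient')")
    con.execute("INSERT INTO DimRegistry VALUES (10, 'Diabetes')")
    con.execute("INSERT INTO FactCQM VALUES (1, 10, 'HbA1c at goal', 1.0, '2015-06-01')")
    print(con.execute("SELECT * FROM FactCQM").fetchall())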
Henderson, Zsakeba T; Ernst, Kelly; Simpson, Kathleen Rice; Berns, Scott; Suchdev, Danielle B; Main, Elliott; McCaffrey, Martin; Lee, Karyn; Rouse, Tara Bristol; Olson, Christine K
2018-03-01
State Perinatal Quality Collaboratives (PQCs) are networks of multidisciplinary teams working to improve maternal and infant health outcomes. To address the shared needs across state PQCs and enable collaboration, Centers for Disease Control and Prevention (CDC), in partnership with March of Dimes and perinatal quality improvement experts from across the country, supported the development and launch of the National Network of Perinatal Quality Collaboratives (NNPQC). This process included assessing the status of PQCs in this country and identifying the needs and resources that would be most useful to support PQC development. National representatives from 48 states gathered for the first meeting of the NNPQC to share best practices for making measurable improvements in maternal and infant health. The number of state PQCs has grown considerably over the past decade, with an active PQC or a PQC in development in almost every state. However, PQCs have some common challenges that need to be addressed. After its successful launch, the NNPQC is positioned to ensure that every state PQC has access to key tools and resources that build capacity to actively improve maternal and infant health outcomes and healthcare quality.
Complex adaptive systems: a tool for interpreting responses and behaviours.
Ellis, Beverley
2011-01-01
Quality improvement is a priority for health services worldwide. There are many barriers to implementing change at the locality level, and misinterpreting responses and behaviours can effectively block change. Electronic health records will influence the means by which knowledge and information are generated and sustained among those operating quality improvement programmes. The aim of this work is to explain how complex adaptive system (CAS) theory provides a useful tool and new insight into the responses and behaviours that relate to quality improvement programmes in primary care enabled by informatics. Case studies were conducted in two English localities that participated in the implementation and development of quality improvement programmes. The research strategy included purposefully sampled case studies, conducted within a social constructionist ontological perspective. Responses and behaviours of quality improvement programmes in the two localities include both positive and negative influences associated with a networked model of governance. Pressures of time, resources and workload are common issues, along with the need for education and training about capturing, coding, recording and sharing information held within electronic health records to support various information requirements. Primary care informatics enables information symmetry among those operating quality improvement programmes by making some aspects of care explicit, allowing consensus about quality improvement priorities and implementable solutions.
A patient workflow management system built on guidelines.
Dazzi, L.; Fassino, C.; Saracco, R.; Quaglini, S.; Stefanelli, M.
1997-01-01
To provide high-quality, shared, and distributed medical care, clinical and organizational issues need to be integrated. This work describes a methodology for developing a Patient Workflow Management System, based on a detailed model of both the medical work process and the organizational structure. We assume that the medical work process is represented through clinical practice guidelines, and that an ontological description of the organization is available. Thus, we developed tools 1) to acquire the medical knowledge contained in a guideline, 2) to translate the derived formalized guideline into a computational formalism, namely a Petri net, and 3) to maintain different representation levels. The high-level representation guarantees that the Patient Workflow follows the guideline prescriptions, while the low-level representation takes into account the specific characteristics of the organization and allows allocating resources for managing a specific patient in daily practice. PMID:9357606
Narayanan, Jaishree; Dobrin, Sofia; Choi, Janet; Rubin, Susan; Pham, Anna; Patel, Vimal; Frigerio, Roberta; Maurer, Darryck; Gupta, Payal; Link, Lourdes; Walters, Shaun; Wang, Chi; Ji, Yuan; Maraganore, Demetrius M
2017-01-01
Using the electronic medical record (EMR) to capture structured clinical data at the point of care would be a practical way to support quality improvement and practice-based research in epilepsy. We describe our stepwise process for building structured clinical documentation support tools in the EMR that define best practices in epilepsy, and we describe how we incorporated these toolkits into our clinical workflow. These tools write notes and capture hundreds of fields of data, including scores from several standardized instruments: Generalized Anxiety Disorder-7 items, Neurological Disorders Depression Inventory for Epilepsy, Epworth Sleepiness Scale, Quality of Life in Epilepsy-10 items, Montreal Cognitive Assessment/Short Test of Mental Status, and Medical Research Council Prognostic Index. The tools summarize brain imaging, blood laboratory, and electroencephalography results, and document neuromodulation treatments. The tools provide Best Practices Advisories and other clinical decision support when appropriate. The tools prompt enrollment in a DNA biobanking study. We have thus far enrolled 231 patients for initial visits, are starting our first annual follow-up visits, and provide a brief description of our cohort here. We are sharing these EMR tools and captured data with other epilepsy clinics as part of a Neurology Practice Based Research Network, and are using the tools to conduct pragmatic trials using subgroup-based adaptive designs. © 2016 The Authors. Epilepsia published by Wiley Periodicals, Inc. on behalf of International League Against Epilepsy.
Siedlecki, Sandra L; Albert, Nancy M
This article describes how to assess the interrater reliability and validity of risk assessment tools using easy-to-follow formulas, and provides calculations that demonstrate the principles discussed. Clinical nurse specialists should be able to identify risk assessment tools that provide high-quality interrater reliability and the highest validity for predicting true events of importance to clinical settings. Making best practice recommendations for assessment tool use is critical to high-quality patient care and safe practices that impact patient outcomes and nursing resources. Optimal risk assessment tool selection requires knowledge about interrater reliability and tool validity. The clinical nurse specialist will understand the reliability and validity issues associated with risk assessment tools and be able to evaluate tools using basic calculations. Risk assessment tools are developed to objectively predict quality and safety events and ultimately reduce the risk of event occurrence through preventive interventions. To ensure high-quality tool use, clinical nurse specialists must critically assess tool properties. The better the tool's ability to predict adverse events, the more likely it is that event risk is mitigated. Interrater reliability and validity assessment is a relatively easy skill to master and will result in better decisions when selecting or making recommendations for risk assessment tool use.
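The article's own formulas are not reproduced in the abstract; Cohen's kappa is one widely used interrater reliability statistic of the kind discussed, sketched here from its standard definition (observed agreement corrected for chance agreement). The example ratings are invented.

    def cohens_kappa(rater_a, rater_b):
        """Kappa = (p_observed - p_chance) / (1 - p_chance)."""
        n = len(rater_a)
        assert n == len(rater_b) and n > 0
        categories = set(rater_a) | set(rater_b)
        p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        p_chance = sum((rater_a.count(c) / n) * (rater_b.count(c) / n)
                       for c in categories)
        return (p_observed - p_chance) / (1 - p_chance)

    # Two raters scoring the same 10 patients as at risk (1) or not (0).
    print(cohens_kappa([1, 0, 1, 1, 0, 0, 1, 0, 1, 1],
                       [1, 0, 1, 0, 0, 0, 1, 0, 1, 1]))  # 0.8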
O'Connor, Brian D.; Yuen, Denis; Chung, Vincent; Duncan, Andrew G.; Liu, Xiang Kun; Patricia, Janice; Paten, Benedict; Stein, Lincoln; Ferretti, Vincent
2017-01-01
As genomic datasets continue to grow, the feasibility of downloading data to a local organization and running analysis on a traditional compute environment is becoming increasingly problematic. Current large-scale projects, such as the ICGC PanCancer Analysis of Whole Genomes (PCAWG), the Data Platform for the U.S. Precision Medicine Initiative, and the NIH Big Data to Knowledge Center for Translational Genomics, are using cloud-based infrastructure to both host and perform analysis across large data sets. In PCAWG, over 5,800 whole human genomes were aligned and variant called across 14 cloud and HPC environments; the processed data was then made available on the cloud for further analysis and sharing. If run locally, an operation at this scale would have monopolized a typical academic data centre for many months, and would have presented major challenges for data storage and distribution. However, this scale is increasingly typical for genomics projects and necessitates a rethink of how analytical tools are packaged and moved to the data. For PCAWG, we embraced the use of highly portable Docker images for encapsulating and sharing complex alignment and variant calling workflows across highly variable environments. While successful, this endeavor revealed a limitation in Docker containers, namely the lack of a standardized way to describe and execute the tools encapsulated inside the container. As a result, we created the Dockstore ( https://dockstore.org), a project that brings together Docker images with standardized, machine-readable ways of describing and running the tools contained within. This service greatly improves the sharing and reuse of genomics tools and promotes interoperability with similar projects through emerging web service standards developed by the Global Alliance for Genomics and Health (GA4GH). PMID:28344774
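The abstract's central idea, packaging a tool as a Docker image so it runs identically across environments, can be illustrated with a small wrapper. The image name and command-line arguments below are hypothetical; a Dockstore entry would pair such an image with a machine-readable descriptor of exactly this invocation.

    import subprocess

    def run_containerized_tool(image, args, data_dir):
        """Run an analysis tool packaged as a Docker image, mounting a local
        directory so the container reads inputs and writes outputs without
        any host-side installation of the tool itself."""
        subprocess.run(
            ["docker", "run", "--rm",
             "-v", f"{data_dir}:/data",   # share local data with the container
             image, *args],
            check=True)

    # Hypothetical image and arguments, for illustration only.
    run_containerized_tool("example/variant-caller:1.0",
                           ["--input", "/data/sample.bam", "--out", "/data/calls.vcf"],
                           "/tmp/pcawg-data")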
Interactive web-based identification and visualization of transcript shared sequences.
Azhir, Alaleh; Merino, Louis-Henri; Nauen, David W
2018-05-12
We have developed TraC (Transcript Consensus), a web-based tool for detecting and visualizing shared sequences among two or more mRNA transcripts such as splice variants. Results including exon-exon boundaries are returned in a highly intuitive, data-rich, interactive plot that permits users to explore the similarities and differences of multiple transcript sequences. The online tool (http://labs.pathology.jhu.edu/nauen/trac/) is free to use. The source code is freely available for download (https://github.com/nauenlab/TraC). Copyright © 2018 Elsevier Inc. All rights reserved.
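TraC's own algorithm is not described in the abstract; purely as an illustration of the underlying task, detecting exact blocks shared by two transcript sequences, the following sketch uses Python's standard difflib on toy sequences.

    from difflib import SequenceMatcher

    def shared_blocks(seq_a, seq_b, min_len=8):
        """Exact substrings (e.g., shared exons) common to two transcripts."""
        matcher = SequenceMatcher(None, seq_a, seq_b, autojunk=False)
        return [seq_a[m.a:m.a + m.size]
                for m in matcher.get_matching_blocks() if m.size >= min_len]

    # Two toy splice variants sharing one exon-like block.
    variant_1 = "ATGGCCATTGTAATGGGCCGC" + "TTTTTTTT"
    variant_2 = "GGGGGG" + "ATGGCCATTGTAATGGGCCGC"
    print(shared_blocks(variant_1, variant_2))  # ['ATGGCCATTGTAATGGGCCGC']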
Doherr, Hanna; Christalle, Eva; Kriston, Levente; Härter, Martin; Scholl, Isabelle
2017-01-01
The Shared Decision Making Questionnaire (SDM-Q-9 and SDM-Q-Doc) is a 9-item measure of the decisional process in medical encounters from both patients' and physicians' perspectives. It has good acceptance, feasibility, and reliability. This systematic review aimed to 1) evaluate the use of the SDM-Q-9 and SDM-Q-Doc in intervention studies on shared decision making (SDM) in clinical settings, 2) describe how the SDM-Q-9 and SDM-Q-Doc performed regarding sensitivity to change, and 3) assess the methodological quality of studies and study protocols that use the measure. We conducted a systematic review of studies published between 2010 and October 2015 that evaluated interventions to facilitate SDM. The search strategy comprised three databases (EMBASE, PsycINFO, and Medline), reference tracking, citation tracking, and personal knowledge. Two independent reviewers screened titles and abstracts as well as full texts of potentially relevant records. We extracted the data using a pilot-tested sheet, and we assessed the methodological quality of included studies using the Quality Assessment Tools from the U.S. National Institutes of Health (NIH). Five completed studies and six study protocols fulfilled the inclusion criteria. The measure was used in a variety of health care settings, mainly in Europe, to evaluate several types of interventions. The reported mean sum scores ranged from 42 to 75 on a scale from 0 to 100. In four studies no significant change was detected in the mean differences between main groups. In the fifth study the difference was small. Quality assessment revealed a high risk of bias in four of the five completed studies, while the study protocols received moderate quality ratings. We found a wide range of areas in which the SDM-Q-9 and SDM-Q-Doc were applied. In the future this review may help researchers decide whether the measure fits their purposes. Furthermore, the review revealed risk of bias in previous trials that used the measure, and may help future trials decrease this risk. More research on the measure's sensitivity to change is strongly suggested.
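For readers unfamiliar with the 0-100 scale mentioned above: the SDM-Q-9's published scoring (stated here from the instrument's validation literature as an assumption, not taken from this abstract) sums nine items rated 0-5 and rescales the raw 0-45 sum by 20/9.

    def sdmq9_score(items):
        """Rescale nine 0-5 item ratings to the 0-100 range used in reports.
        Scoring rule assumed from the validation literature, not this review."""
        assert len(items) == 9 and all(0 <= i <= 5 for i in items)
        return sum(items) * 20 / 9

    print(round(sdmq9_score([3, 4, 2, 5, 3, 3, 4, 2, 4]), 1))  # 66.7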
Construction of an advanced software tool for planetary atmospheric modeling
NASA Technical Reports Server (NTRS)
Friedland, Peter; Keller, Richard M.; Mckay, Christopher P.; Sims, Michael H.; Thompson, David E.
1993-01-01
Scientific model-building can be a time intensive and painstaking process, often involving the development of large complex computer programs. Despite the effort involved, scientific models cannot be distributed easily and shared with other scientists. In general, implemented scientific models are complicated, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing, using and sharing models. The proposed tool will include an interactive intelligent graphical interface and a high-level domain-specific modeling language. As a testbed for this research, we propose to develop a software prototype in the domain of planetary atmospheric modeling.
Berger-Fiffy, Jill
2012-01-01
Harvard Vanguard Medical Associates (Harvard Vanguard) decided to develop a Shared Medical Appointment (SMA) program in 2007 for a variety of reasons. The program has launched 86 SMAs in 17 specialties at 12 sites and has exceeded 13,000 patient visits. Currently, the practice offers 54 SMAs and is believed to be the largest program in the country. This article provides an overview of staffing, space and equipment, project planning, promotional materials, training programs, workflow development, and the use of quality improvement (i.e., Lean) tools to monitor the work to be completed and the metrics to date.
Shewade, Hemant Deepak; Vidhubala, E; Subramani, Divyaraj Prabhakar; Lal, Pranay; Bhatt, Neelam; Sundaramoorthi, C; Singh, Rana J; Kumar, Ajay M V
2017-01-01
A large state-wide tobacco survey was conducted using a modified version of the pretested, globally validated Global Adult Tobacco Survey (GATS) questionnaire in 2015-2016 in Tamil Nadu, India. Due to resource constraints, data collection was carried out using paper-based questionnaires (unlike GATS-India, 2009-2010, which used hand-held computer devices), while data entry was done using open access tools. The objective of this paper is to describe the process of data entry and assess its quality assurance and efficiency. In EpiData language, a variable is referred to as a 'field' and a questionnaire (a set of fields) as a 'record'. EpiData software was used for double data entry with adequate checks, followed by validation. TeamViewer was used for remote training and troubleshooting. The EpiData databases (one for each district and each zone in Chennai city) were housed in shared Dropbox folders, which enabled secure sharing of files and automatic back-up. Each database for a district/zone had a separate file for data entry of the household level and individual level questionnaires. The 32,945 surveyed households included 111,363 individuals aged ≥15 years. The average proportion of records with data entry errors for a district/zone in the household level and individual level files was 4% and 24%, respectively. These are errors that would have gone unnoticed if single entry had been used. The median (inter-quartile range) time taken for double data entry of a single household level questionnaire and a single individual level questionnaire was 30 (24, 40) s and 86 (64, 126) s, respectively. Efficient and quality-assured near-real-time data entry in a large sub-national tobacco survey was achieved through innovative, resource-efficient use of open access tools.
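EpiData performs the double-entry validation natively; the sketch below only illustrates the idea behind that step, comparing two independent entries of the same records field by field. The CSV file names and the key column are hypothetical.

    import csv

    def double_entry_discrepancies(file_a, file_b, key="record_id"):
        """Report fields whose values disagree between two independent entries."""
        def load(path):
            with open(path, newline="") as f:
                return {row[key]: row for row in csv.DictReader(f)}
        entry_a, entry_b = load(file_a), load(file_b)
        for rec_id in sorted(set(entry_a) & set(entry_b)):
            for field, value in entry_a[rec_id].items():
                if value != entry_b[rec_id].get(field):
                    print(rec_id, field, value, entry_b[rec_id].get(field))

    # Hypothetical paths to the first and second passes of data entry.
    double_entry_discrepancies("household_entry1.csv", "household_entry2.csv")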
Using Google Earth as an innovative tool for community mapping.
Lefer, Theodore B; Anderson, Matthew R; Fornari, Alice; Lambert, Anastasia; Fletcher, Jason; Baquero, Maria
2008-01-01
Maps are used to track diseases and illustrate the social context of health problems. However, commercial mapping software requires special training. This article illustrates how nonspecialists used Google Earth, a free program, to create community maps. The Bronx, New York, is characterized by high levels of obesity and diabetes. Residents and medical students measured the variety and quality of food and exercise sources around a residency training clinic and a student-run free clinic, using Google Earth to create maps with minimal assistance. Locations were identified using street addresses or simply by pointing to them on a map. Maps can be shared via e-mail, viewed online with Google Earth or Google Maps, and the data can be incorporated into other mapping software.
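Google Earth consumes KML, a small XML dialect, which is part of what makes the approach accessible to nonspecialists. The sketch below writes a one-placemark KML file by hand; the location name and coordinates are illustrative.

    def placemark(name, lon, lat):
        return (f"<Placemark><name>{name}</name>"
                f"<Point><coordinates>{lon},{lat},0</coordinates></Point>"
                f"</Placemark>")

    kml = ('<?xml version="1.0" encoding="UTF-8"?>\n'
           '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>\n'
           + placemark("Corner grocery (fresh produce)", -73.8855, 40.8448) + "\n"
           + "</Document></kml>\n")

    # The resulting file opens directly in Google Earth or Google Maps.
    with open("community_map.kml", "w") as f:
        f.write(kml)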
A vision for end-to-end data services to foster international partnerships through data sharing
NASA Astrophysics Data System (ADS)
Ramamurthy, M.; Yoksas, T.
2009-04-01
Increasingly, the conduct of science requires scientific partnerships and sharing of knowledge, information, and other assets. This is particularly true in our field where the highly coupled Earth system and its many linkages have heightened the importance of collaborations across geographic, disciplinary, and organizational boundaries. The climate system, for example, is far too complex a puzzle to be unraveled by individual investigators or nations. As articulated in the NSF Strategic Plan: FY 2006-2011, "…discovery increasingly requires expertise of individuals from different disciplines, with diverse perspectives, and often from different nations, working together to accommodate the extraordinary complexity of today's science and engineering challenges." The Nobel Prize winning IPCC assessments are a prime example of such an effort. Earth science education is also uniquely suited to drawing connections between the dynamic Earth system and societal issues. Events like the 2004 Indian Ocean tsunami and Hurricane Katrina provide ample evidence of this relevance, as they underscore the importance of timely and interdisciplinary integration and synthesis of data. Our success in addressing such complex problems and advancing geosciences depends on the availability of a state-of-the-art and robust cyberinfrastructure, transparent and timely access to high-quality data from diverse sources, and requisite tools to integrate and use the data effectively, toward creating new knowledge. To that end, Unidata's vision calls for providing comprehensive, well-integrated, and end-to-end data services for the geosciences. These include an array of functions for collecting, finding, and accessing data; data management tools for generating, cataloging, and exchanging metadata; and submitting or publishing, sharing, analyzing, visualizing, and integrating data. When this vision is realized, users — no matter where they are, how they are connected to the Internet, or what computing device they use — will be able to find and access a plethora of geosciences data, experience how all of the aforementioned services work together, and use our tools and services both productively and creatively in their research, education, and other activities. Permit me to elucidate what that vision really means for you by drawing a simple analogy. Most of you are familiar with Amazon and eBay e-commerce sites and content sharing sites like YouTube and Flickr. On the eBay marketplace, people can sell practically anything at any time and buyers can share their experience of purchasing a product or the reputation of a seller. Likewise, at Amazon, thousands of merchants sell their goods and millions of customers not only buy those goods, but provide a review or opinion of the products they buy and share their experiences with the purchase. Similarly, YouTube and Flickr are sites tailored to video- and photo-sharing, respectively, where users can upload their own content and share it with millions of other users, including family and friends. What all these sites have enabled is a sense of a virtual community in which users can search and browse products or content, comment on and rate those products from anywhere, at any time, and via any Internet-enabled device like an iPhone, laptop, or a desktop computer. In essence, these enterprises have fundamentally altered people's buying modes and behavior toward purchases.
I believe that similar approaches, appropriately tailored to meet the needs of the scientific community, can be adopted to provide and share geosciences data in the future. For example, future case-study data access systems, in addition to providing datasets and tools, will provide services that allow users to post commentaries on a weather event, say a hurricane, give feedback on the quality, usefulness and interpretation of the datasets through integrated blogs, forums and wikis, upload and share products they derive, ancillary materials that users might have gathered (such as photos and videos from the storm), and publications and curricular materials they develop, all through a single data portal. In essence, such case study collections will be "living" or dynamic, allowing users to also be contributors as they add value to and grow existing case study collections. At Unidata, our goal is to provide a portfolio of integrated data services toward realizing the vision presented here so that the geosciences community can continue to address societally relevant problems such as weather prediction, atmospheric and oceanic variability, climate change, and the water cycle, and advance scientific discovery.
iDASH: integrating data for analysis, anonymization, and sharing.
Ohno-Machado, Lucila; Bafna, Vineet; Boxwala, Aziz A; Chapman, Brian E; Chapman, Wendy W; Chaudhuri, Kamalika; Day, Michele E; Farcas, Claudiu; Heintzman, Nathaniel D; Jiang, Xiaoqian; Kim, Hyeoneui; Kim, Jihoon; Matheny, Michael E; Resnic, Frederic S; Vinterbo, Staal A
2012-01-01
iDASH (integrating data for analysis, anonymization, and sharing) is the newest National Center for Biomedical Computing funded by the NIH. It focuses on algorithms and tools for sharing data in a privacy-preserving manner. Foundational privacy technology research performed within iDASH is coupled with innovative engineering for collaborative tool development and data-sharing capabilities in a private Health Insurance Portability and Accountability Act (HIPAA)-certified cloud. Driving Biological Projects, which span different biological levels (from molecules to individuals to populations) and focus on various health conditions, help guide research and development within this Center. Furthermore, training and dissemination efforts connect the Center with its stakeholders and educate data owners and data consumers on how to share and use clinical and biological data. Through these various mechanisms, iDASH implements its goal of providing biomedical and behavioral researchers with access to data, software, and a high-performance computing environment, thus enabling them to generate and test new hypotheses.
Color extended visual cryptography using error diffusion.
Kang, InKoo; Arce, Gonzalo R; Lee, Heung-Kyu
2011-01-01
Color visual cryptography (VC) encrypts a color secret message into n color halftone image shares. Previous methods in the literature show good results for black-and-white or gray-scale VC schemes; however, they cannot be applied directly to color shares due to different color structures. Some methods for color visual cryptography are not satisfactory in that they produce either meaningless shares or meaningful shares with low visual quality, leading to suspicion of encryption. This paper introduces the concept of visual information pixel (VIP) synchronization and error diffusion to attain a color visual cryptography encryption method that produces meaningful color shares with high visual quality. VIP synchronization retains the positions of pixels carrying visual information of original images throughout the color channels, and error diffusion generates shares pleasant to human eyes. Comparisons with previous approaches show the superior performance of the new method.
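The paper's VIP synchronization step is specific to its scheme and is not reproduced here, but the error-diffusion half of the method is standard halftoning. Below is a minimal Floyd-Steinberg sketch for a single color channel, with invented input data.

    import numpy as np

    def floyd_steinberg(channel):
        """Binarize one color channel (floats in [0, 1]), diffusing each
        pixel's quantization error onto its not-yet-visited neighbors."""
        img = channel.astype(float).copy()
        h, w = img.shape
        for y in range(h):
            for x in range(w):
                old = img[y, x]
                new = 1.0 if old >= 0.5 else 0.0
                img[y, x] = new
                err = old - new
                if x + 1 < w:               img[y, x + 1]     += err * 7 / 16
                if y + 1 < h and x > 0:     img[y + 1, x - 1] += err * 3 / 16
                if y + 1 < h:               img[y + 1, x]     += err * 5 / 16
                if y + 1 < h and x + 1 < w: img[y + 1, x + 1] += err * 1 / 16
        return img.astype(np.uint8)

    halftone = floyd_steinberg(np.random.default_rng(1).random((64, 64)))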
Aij, Kjeld Harald; Rapsaniotis, Sofia
2017-01-01
As health care organizations face pressures to improve quality and efficiency while reducing costs, leaders are adopting management techniques and tools used in manufacturing and other industries, especially Lean. Successful Lean leaders appear to use a coaching leadership style that shares underlying principles with servant leadership. There is little information about specific similarities and differences between Lean and servant leaderships. We systematically reviewed the literature on Lean leadership, servant leadership, and health care and performed a comparative analysis of attributes using Russell and Stone’s leadership framework. We found significant overlap between the two leadership styles, although there were notable differences in origins, philosophy, characteristics and behaviors, and tools. We conclude that both Lean and servant leaderships are promising models that can contribute to the delivery of patient-centered, high-value care. Servant leadership may provide the means to engage and develop employees to become successful Lean leaders in health care organizations. PMID:29355240
Connor, Carol McDonald
2013-12-01
In this commentary, I make five points: first, that designing observation systems that actually predict students' outcomes is challenging; second, that systems that capture the complex and dynamic nature of the classroom learning environment are more likely to be able to meet this challenge; third, that observation tools are most useful when developed to serve a particular purpose and are put to that purpose; fourth, that technology can help; and fifth, that there are policy implications for valid and reliable classroom observation tools. The two observation systems presented in this special issue represent an important step forward and a move toward policy that promises to make a true difference in what is defined as high-quality and effective teaching, what it looks like in the classroom, and how these practices can be more widely disseminated so that all children, including those attending under-resourced schools, can experience effective instruction, academic success, and the lifelong accomplishment that follows. PsycINFO Database Record (c) 2013 APA, all rights reserved.
Data management and data enrichment for systems biology projects.
Wittig, Ulrike; Rey, Maja; Weidemann, Andreas; Müller, Wolfgang
2017-11-10
Collecting, curating, interlinking, and sharing high-quality data are central to de.NBI-SysBio, the systems biology data management service center within the de.NBI network (German Network for Bioinformatics Infrastructure). The work of the center is guided by the FAIR principles for scientific data management and stewardship. FAIR stands for the four foundational principles Findability, Accessibility, Interoperability, and Reusability, which were established to enhance the ability of machines to automatically find, access, exchange and use data. Within this overview paper we describe three tools (SABIO-RK, Excemplify, SEEK) that exemplify the contribution of de.NBI-SysBio services to FAIR data, models, and experimental methods storage and exchange. The interconnectivity of the tools and the data workflow within systems biology projects will be explained. For many years we have been the German partner in the FAIRDOM initiative (http://fair-dom.org), which aims to establish a European data and model management service facility for systems biology. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Vrolijk, Mark; Ogawa, Takayuki; Camanho, Arthur; Biasutti, Manfredi; Lorenz, David
2018-05-01
As a result of the ever-increasing demand to produce lighter vehicles, more and more advanced high-strength materials are used in the automotive industry. Focusing on sheet metal cold forming processes, these materials require high pressing forces and exhibit large springback after forming. Due to the high pressing forces, deformations occur in the tooling geometry, introducing dimensional inaccuracies in the blank and potentially impacting the final springback behavior. As a result, the tool deformations can affect the final assembly or introduce cosmetic defects. Often several iterations are required in try-out to obtain the required tolerances, with costs going up to as much as 30% of the entire product development cost. To investigate sheet metal part feasibility and quality, CAE tools are widely used in the automotive industry. However, in current practice the influence of the tool deformations on the final part quality is generally neglected, and simulations are carried out with rigid tools to avoid drastically increased calculation times. If the tool deformation is analyzed through simulation, it is normally done at the end of the drawing process, when contact conditions are mapped on the die structure and a static analysis is performed to check the deflections of the tool. But this method does not predict the influence of these deflections on the final quality of the part. In order to take tool deformations into account during drawing simulations, ESI has developed the ability to couple solvers efficiently so that the tool deformations can be included in the drawing simulation in real time, without a large increase in simulation time compared to simulations with rigid tools. In this paper a study is presented which demonstrates the effect of tool deformations on the final part quality.
Gauchotte, Guillaume; Ameisen, David; Boutonnat, Jean; Battistella, Maxime; Copie, Christiane; Garcia, Stéphane; Rigau, Valérie; Galateau-Sallé, Françoise; Terris, Benoit; Vergier, Béatrice; Wendum, Dominique; Bertheau, Philippe
2013-06-01
Building online teaching materials is a highly time- and energy-consuming task for the teachers of a single university. With the help of the Collège des pathologistes, we initiated a French national university network for building mutualized online teaching pathology cases, tests and other pedagogic resources. Nineteen French universities are associated with this project, initially funded by UNF3S (http://www.unf3s.org/). One national e-learning Moodle platform (http://virtual-slides.univ-paris7.fr/moodle/) contains texts, media and URLs pointing toward decentralized virtual slides. The Moodle interface has been explained to the teachers since September 2011 using web-based conferences with screen-sharing. The following contents have been created: 20 clinical cases, several tests with multiple-choice and short-answer questions, and gross examination videos. A survey of 16 teachers and students showed a 94% satisfaction rate, with most of the 16 participants favorable to the development of e-learning in parallel with other courses in the classroom. These tools will be further developed for the different study levels of pathology. In conclusion, these tools offer very interesting perspectives for pathology teaching. The organization of a national inter-university network is a useful way to create and share numerous good-quality pedagogic resources. Copyright © 2013 Elsevier Masson SAS. All rights reserved.
Critically Loaded Hole Technology Pilot Collaborative Test Programme.
1980-11-01
[Extraction fragment of TABLE XI, "Manufacturing details for high and low quality holes selected by the United Kingdom" (columns: HIGH QUALITY, LOW QUALITY). Recoverable entries: Spindle Speed - 270 rpm and 1450 rpm; Feed Rate - Manual (both); Cutting Fluid - Dry (both); Tool Type - Cordia S-18 (both); Pilot Hole - 1/8 inch.]
Thomson, Sarah; Schang, Laura; Chernew, Michael E
2013-04-01
This article reviews efforts in the United States and several other member countries of the Organization for Economic Cooperation and Development to encourage patients, through cost sharing, to use medications, services, and providers that offer better value than other options, an approach known as value-based cost sharing. Among the countries we reviewed, we found that value-based approaches were most commonly applied to drug cost sharing. A few countries, including the United States, employed financial incentives, such as lower copayments, to encourage use of preferred providers or preventive services. Evidence suggests that these efforts can increase patients' use of high-value services, although they may also be associated with high administrative costs and could exacerbate health inequalities among various groups. With careful design, implementation, and evaluation, value-based cost sharing can be an important tool for aligning patient and provider incentives to pursue high-value care.
Song, Ting; Li, Nan; Zarepisheh, Masoud; Li, Yongbao; Gautier, Quentin; Zhou, Linghong; Mell, Loren; Jiang, Steve; Cerviño, Laura
2016-01-01
Intensity-modulated radiation therapy (IMRT) currently plays an important role in radiotherapy, but its treatment plan quality can vary significantly among institutions and planners. Treatment plan quality control (QC) is a necessary component for individual clinics to ensure that patients receive treatments with high therapeutic gain ratios. The voxel-weighting factor-based plan re-optimization mechanism has been shown to explore a larger Pareto surface (solution domain) and therefore increase the possibility of finding an optimal treatment plan. In this study, we incorporated additional modules into an in-house developed voxel-weighting factor-based re-optimization algorithm, which was enhanced as a highly automated and accurate IMRT plan QC tool (TPS-QC tool). After importing an under-assessment plan, the TPS-QC tool was able to generate a QC report within 2 minutes. This QC report contains the plan quality determination as well as information supporting the determination. Finally, the IMRT plan quality can be controlled by approving quality-passed plans and replacing quality-failed plans using the TPS-QC tool. The feasibility and accuracy of the proposed TPS-QC tool were evaluated using 25 clinically approved cervical cancer patient IMRT plans and 5 manually created poor-quality IMRT plans. The results showed high consistency between the QC report quality determinations and the actual plan quality. In the 25 clinically approved cases that the TPS-QC tool identified as passed, a greater difference could be observed for dosimetric endpoints for organs at risk (OAR) than for planning target volume (PTV), implying that better dose sparing could be achieved in OAR than in PTV. In addition, the dose-volume histogram (DVH) curves of the TPS-QC tool re-optimized plans satisfied the dosimetric criteria more frequently than did the under-assessment plans. Furthermore, the criteria for unsatisfied dosimetric endpoints in the 5 poor-quality plans could typically be satisfied when the TPS-QC tool generated re-optimized plans, without sacrificing other dosimetric endpoints. Beyond its feasibility and accuracy, the proposed TPS-QC tool is also user-friendly and easy to operate, both of which are necessary characteristics for clinical use.
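The TPS-QC tool's internal checks are not specified in the abstract; as a simplified illustration of what a DVH-endpoint check involves, the sketch below computes D_v (the minimum dose to the hottest fraction v of a structure's voxels) and compares it against a pass/fail limit. The structure, dose distribution, and criterion are invented for the example.

    import numpy as np

    def dose_at_volume(dose_voxels, volume_fraction):
        """D_v: minimum dose received by the hottest `volume_fraction` of voxels."""
        return float(np.quantile(dose_voxels, 1.0 - volume_fraction))

    def check_endpoints(structures, criteria):
        """Return {structure: (achieved dose, passed?)} for max-dose-type limits."""
        return {name: (round(dose_at_volume(structures[name], vol), 1),
                       dose_at_volume(structures[name], vol) <= limit)
                for name, (vol, limit) in criteria.items()}

    rng = np.random.default_rng(2)
    structures = {"bladder": rng.normal(35, 8, 10_000).clip(min=0)}
    print(check_endpoints(structures, {"bladder": (0.5, 45.0)}))  # D50% <= 45 Gy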
2011-01-01
Background Multiple types of assays allow sensitive detection of virus-specific neutralizing antibodies. For example, the extent of antibody neutralization of HIV-1, SIV and SHIV can be measured in the TZM-bl cell line through the degree of luciferase reporter gene expression after infection. In the past, neutralization curves and titers for this standard assay have been calculated using an Excel macro. Updating all instances of such a macro with new techniques can be unwieldy and introduce non-uniformity across multi-lab teams. Using Excel also poses challenges in centrally storing, sharing and associating raw data files and results. Results We present LabKey Server's NAb tool for organizing, analyzing and securely sharing data, files and results for neutralizing antibody (NAb) assays, including the luciferase-based TZM-bl NAb assay. The customizable tool supports high-throughput experiments and includes a graphical plate template designer, allowing researchers to quickly adapt calculations to new plate layouts. The tool calculates the percent neutralization for each serum dilution based on luminescence measurements, fits a range of neutralization curves to titration results and uses these curves to estimate the neutralizing antibody titers for benchmark dilutions. Results, curve visualizations and raw data files are stored in a database and shared through a secure, web-based interface. NAb results can be integrated with other data sources based on sample identifiers. It is simple to make results public after publication by updating folder security settings. Conclusions Standardized tools for analyzing, archiving and sharing assay results can improve the reproducibility, comparability and reliability of results obtained across many labs. LabKey Server and its NAb tool are freely available as open source software at http://www.labkey.com under the Apache 2.0 license. Many members of the HIV research community can also access the LabKey Server NAb tool without installing the software by using the Atlas Science Portal (https://atlas.scharp.org). Atlas is an installation of LabKey Server. PMID:21619655
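The calculations named above can be made concrete with a short sketch: percent neutralization from luminescence readings relative to virus-only and cells-only controls, followed by a four-parameter logistic fit to the titration. The numbers are invented, and this is not LabKey's own fitting code.

    import numpy as np
    from scipy.optimize import curve_fit

    def percent_neutralization(sample_rlu, virus_only, cells_only):
        """Neutralization relative to the virus and cell control wells."""
        return 100.0 * (virus_only - sample_rlu) / (virus_only - cells_only)

    def four_pl(log_dil, bottom, top, log_ic50, slope):
        """Four-parameter logistic curve over log10(dilution)."""
        return bottom + (top - bottom) / (1 + 10 ** ((log_dil - log_ic50) * slope))

    # Invented titration: serial dilutions and their percent neutralization.
    log_dil = np.log10([20, 60, 180, 540, 1620, 4860])
    neut = np.array([95.0, 90.0, 70.0, 40.0, 15.0, 5.0])
    params, _ = curve_fit(four_pl, log_dil, neut, p0=[0.0, 100.0, 2.5, 1.0])
    print(f"50% neutralizing titer ~ 1:{10 ** params[2]:.0f}")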
UceWeb: a web-based collaborative tool for collecting and sharing quality of life data.
Parimbelli, E; Sacchi, L; Rubrichi, S; Mazzanti, A; Quaglini, S
2015-01-01
This work aims at building a platform where quality-of-life data, namely utility coefficients, can be not only elicited for immediate use but also systematically stored, together with patient profiles, to build a public repository to be further exploited in studies on specific target populations (e.g. cost/utility analyses). We capitalized on utility theory and previous experience to define the set of desirable features such a tool should offer to facilitate sound elicitation of quality of life. A set of visualization tools and algorithms has been developed to this purpose. To make it easily accessible for potential users, the software has been designed as a web application. A pilot validation study has been performed on 20 atrial fibrillation patients. A collaborative platform, UceWeb, has been developed and tested. It implements the standard gamble, time trade-off and rating-scale utility elicitation methods. It allows doctors and patients to choose the mode of interaction that maximizes patients' comfort in answering difficult questions. Every utility elicitation may contribute to the growth of the repository. UceWeb can become a unique source of data allowing researchers both to perform more reliable comparisons among healthcare interventions and to build statistical models that give deeper insight into quality-of-life data.
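The three elicitation methods named above have standard textbook formulations. The sketch below shows them under the usual assumptions (anchors of full health = 1 and death = 0); it is not UceWeb's code.

```python
def time_trade_off(years_in_state, years_traded_for_full_health):
    """TTO: utility = x / t, where the patient is indifferent between
    t years in the health state and x years in full health."""
    return years_traded_for_full_health / years_in_state

def standard_gamble(p_full_health_at_indifference):
    """SG: utility equals the probability p of full health (vs. 1 - p of
    immediate death) at which the patient is indifferent to the sure state."""
    return p_full_health_at_indifference

def rating_scale(mark, worst=0.0, best=100.0):
    """RS: position of the mark on a visual analogue scale, rescaled to 0-1."""
    return (mark - worst) / (best - worst)

# Example: one patient's answers under each method.
print(time_trade_off(10, 8))   # 0.80
print(standard_gamble(0.85))   # 0.85
print(rating_scale(75))        # 0.75
```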
Ved, Ronak; Cobbold, Naomi; Igbagiri, Kueni; Willis, Mark; Leach, Paul; Zaben, Malik
2017-08-01
This study evaluates the quality of information available on the internet for carers of children with epilepsy considering treatment with Vagus Nerve Stimulation (VNS). Selected key phrases were entered into two popular search engines (Google™, Yahoo™). These phrases were: "Vagus nerve stimulator", alone and in combination with "childhood epilepsy", "paediatric epilepsy" and "epilepsy in childhood"; "VNS"; and "VNS epilepsy". The first 50 hits per search were then screened. Of 600 identified sites, duplicated (262), irrelevant (230) and inaccessible (15) results were excluded, leaving 93 websites for evaluation using the DISCERN instrument, a validated online tool for assessing patient information websites. The mean DISCERN score of all analysed websites was 39/80 (49%; SD 13.5). This equates to Fair to borderline Poor global quality (Excellent = 63-80; Good = 51-62; Fair = 39-50; Poor = 27-38; Very poor = 15-26). None of the analysed sites obtained an Excellent quality rating; 13% (12) obtained a Good score, 40% (37) a Fair score, 35% (33) a Poor score, and 12% (11) a Very poor score. The cohort of websites scored particularly poorly on whether reliable, holistic information was presented, for instance provision of reliable sources (28%, SD 18) and discussion of alternative treatments (30%, SD 14). To facilitate patient-centred shared decision-making, high-quality information needs to be available for patients and families considering VNS. This study identifies that such information is difficult to locate on the internet. There is a need to develop focussed and reliable online patient resources for VNS. Copyright © 2017 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.
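The quality bands are stated explicitly in the abstract, so the mapping from a total DISCERN score to a band is mechanical. A small sketch of that mapping:

```python
def discern_band(score):
    """Map a total DISCERN score to the quality bands quoted above."""
    bands = [(63, "Excellent"), (51, "Good"), (39, "Fair"),
             (27, "Poor"), (15, "Very poor")]
    for cutoff, label in bands:
        if score >= cutoff:
            return label
    raise ValueError("score below DISCERN minimum")

for s in [39, 64, 55, 45, 30, 20]:
    print(s, discern_band(s))
# The study's mean of 39/80 sits at the bottom of the Fair band,
# hence the "Fair to borderline Poor" description.
```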
Hot Topic: Empowering Parents with Data
ERIC Educational Resources Information Center
Data for Action, 2011
2011-01-01
Nearly every high-priority item in national, federal, state, and local discussions about education--and policy proposals across the political spectrum--requires high-quality longitudinal data to inform its design, implementation, and evaluation. This factsheet shares Data Quality Campaign's (DQC's) analysis of what "Data for Action 2011: DQC's…
Alyusuf, Raja H; Prasad, Kameshwar; Abdel Satir, Ali M; Abalkhail, Ali A; Arora, Roopa K
2013-01-01
The exponential use of the internet as a learning resource, coupled with the varied quality of many websites, has led to a need to identify websites suitable for teaching purposes. The aim of this study is to develop and validate a tool that evaluates the quality of undergraduate medical educational websites, and to apply it to the field of pathology. A tool was devised through several steps of item generation, reduction, weightage, pilot testing, post-pilot modification of the tool and validation. Tool validation included measurement of inter-observer reliability and generation of criterion-related, construct-related and content-related validity. The validated tool was subsequently tested by applying it to a population of pathology websites. Reliability testing showed high internal consistency (Cronbach's alpha = 0.92), high inter-observer reliability (Pearson's correlation r = 0.88), intraclass correlation coefficient = 0.85 and κ = 0.75. It showed high criterion-related, construct-related and content-related validity. The tool showed moderately high concordance with the gold standard (κ = 0.61); 92.2% sensitivity, 67.8% specificity, 75.6% positive predictive value and 88.9% negative predictive value. The validated tool was applied to 278 websites; 29.9% were rated as recommended, 41.0% as recommended with caution and 29.1% as not recommended. A systematic tool was devised to evaluate the quality of websites for medical educational purposes. The tool was shown to yield reliable and valid inferences through its application to pathology websites.
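The validation statistics above all derive from a 2x2 table of tool rating versus gold standard. The sketch below shows the arithmetic; the cell counts are hypothetical, chosen only so that the computed figures reproduce those reported (the abstract does not give the actual counts).

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, NPV and Cohen's kappa from a 2x2 table."""
    n = tp + fp + fn + tn
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    po = (tp + tn) / n                                            # observed agreement
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2   # chance agreement
    kappa = (po - pe) / (1 - pe)
    return sens, spec, ppv, npv, kappa

# Hypothetical counts consistent with the reported 92.2% / 67.8% / 75.6% / 88.9% / 0.61.
sens, spec, ppv, npv, kappa = diagnostic_metrics(tp=59, fp=19, fn=5, tn=40)
print(f"sens={sens:.1%} spec={spec:.1%} ppv={ppv:.1%} npv={npv:.1%} kappa={kappa:.2f}")
```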
Lasers for industrial production processing: tailored tools with increasing flexibility
NASA Astrophysics Data System (ADS)
Rath, Wolfram
2012-03-01
High-power fiber lasers are the newest generation of diode-pumped solid-state lasers. Due to their all-fiber design they are compact, efficient and robust. Rofin's fiber lasers are available with the highest beam qualities, and the use of different process fiber core sizes additionally enables the user to adapt beam quality, focus size and Rayleigh length to the requirements of the task for best processing results. Multi-mode fibers from 50 μm to 600 μm, with corresponding beam qualities of 2.5 mm·mrad to 25 mm·mrad, are typically used. The integrated beam-switching modules can make the laser power available to four different manufacturing systems, or can share the power between two processing heads for parallel processing. CO2 slab lasers likewise combine high power with either "single-mode" beam quality or higher-order modes. This well-established technique is in use for a large number of industrial applications, processing either metals or non-metallic materials. For many of these applications CO2 lasers remain the best choice of laser source, driven either by the specific requirements of the application or by its cost structure. The technical properties of these lasers are presented, including an overview of the wavelength-driven differences in application results and examples of current industrial practice such as cutting, welding and surface processing, including the flexible use of scanners and classical processing heads.
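The quoted core sizes and beam qualities determine focus spot, divergence and depth of field through standard beam optics: the beam parameter product BPP = w0·θ is conserved through ideal optics, and the Rayleigh length is z_R = w0/θ. This is textbook geometry rather than material from the paper, and the 1:1 imaging assumption below is mine.

```python
def focus_characteristics(core_diameter_um, bpp_mm_mrad, magnification=1.0):
    """Approximate focus for fiber-delivered multimode beams: the fiber core
    is imaged to the focus, and BPP = w0 * theta is conserved."""
    w0_mm = 0.5 * core_diameter_um * 1e-3 * magnification   # focus radius (mm)
    theta_mrad = bpp_mm_mrad / w0_mm                        # half-angle divergence
    rayleigh_mm = w0_mm ** 2 / bpp_mm_mrad * 1e3            # z_R = w0 / theta
    return w0_mm, theta_mrad, rayleigh_mm

# The 50 um / 2.5 mm*mrad and 600 um / 25 mm*mrad pairs quoted above, 1:1 imaging.
for core, bpp in [(50, 2.5), (600, 25)]:
    w0, theta, zr = focus_characteristics(core, bpp)
    print(f"core {core} um: w0 = {w0:.3f} mm, theta = {theta:.0f} mrad, z_R = {zr:.2f} mm")
```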
Simmons, Magenta B; Coates, Dominiek; Batchelor, Samantha; Dimopoulos-Bick, Tara; Howe, Deborah
2017-12-12
Youth participation is central to early intervention policy and quality frameworks. There is good evidence for peer support (individuals with lived experience helping other consumers) and shared decision making (involving consumers in making decisions about their own care) in adult settings. However, youth programs are rarely tested or described in detail. This report aims to fill this gap by describing a consumer focused intervention in an early intervention service. This paper describes the development process, intervention content and implementation challenges of the Choices about Healthcare Options Informed by Client Experiences and Expectations (CHOICE) Pilot Project. This highly novel and innovative project combined both youth peer work and youth shared decision making. Eight peer workers were employed to deliver an online shared decision-making tool at a youth mental health service in New South Wales, Australia. The intervention development involved best practice principles, including international standards and elements of co-design. The implementation of the peer workforce in the service involved a number of targeted strategies designed to support this new service model. However, several implementation challenges were experienced which resulted in critical learning about how best to deliver these types of interventions. Delivering peer work and shared decision making within an early intervention service is feasible, but not without challenges. Providing adequate detail about interventions and implementation strategies fills a critical gap in the literature. Understanding optimal youth involvement strategies assists others to deliver acceptable and effective services to young people who experience mental ill health. © 2017 John Wiley & Sons Australia, Ltd.
[Shared medical decision making in gynaecology].
This, P; Panel, P
2010-02-01
When two or more options can be chosen in medical care, the final decision implies two steps: analysis of the facts, and evaluation of the patient's preferences. Shared Medical Decision-Making is a rational conceptual frame that can be used in such cases. In this paper, we describe the concept, its practical modalities, and the questions raised by its use. In gynaecology, many medical situations involve "preference-sensitive" choices: for example, contraceptive choice, menorrhagia treatment, and the approach to menopause. Some tools from the Shared Medical Decision-Making concept are useful to structure medical consultations, to convey information, and to reveal patients' preferences. Decision aids are used in clinical research settings, but some of them may also be easily used in usual practice, and help physicians to improve both the quality and the traceability of the decisional process. Copyright 2009 Elsevier Masson SAS. All rights reserved.
Guidance for commissioning NHS England dental conscious sedation services: a framework tool.
Howlett, Paul
2014-01-01
Conscious sedation is an integral part of modern-day dental care and should be delivered through a high quality, effective and evidence-based approach. Commissioning of NHS dental services in England is currently under review by NHS England and the National Dental Commissioning Group. This group has identified the management of vulnerable people, including anxious patients, as one of its priorities. The Society for the Advancement of Anaesthesia in Dentistry (SAAD) believes this provides an opportunity to influence the commissioning of NHS conscious sedation services. With this aim in mind, "Guidance for Commissioning NHS England Dental Conscious Sedation Services: A Framework Tool" was developed. This guidance proposes a common approach to the organisation of NHS dental conscious sedation services in England, advocating the provision of Tier 1 and Tier 2 services in all regions. Its ethos is a "hub and spoke" model of service delivery with patient assessment delivered by experienced and well-trained dental sedationists at its core. In line with the recent Francis Report, fundamental standards for all aspects of dental conscious sedation practice are outlined, supported by a robust and predictable quality assurance process. This work has been shared with key stakeholders in NHS England including the Chief Dental Officer and the Head of Primary Care Commissioning.
Optimization of hydrometric monitoring network in urban drainage systems using information theory.
Yazdi, J
2017-10-01
Regular and continuous monitoring of urban runoff in both quality and quantity aspects is of great importance for controlling and managing surface runoff. Due to the considerable costs of establishing new gauges, optimization of the monitoring network is essential. This research proposes an approach for site selection of new discharge stations in urban areas, based on entropy theory in conjunction with multi-objective optimization tools and numerical models. The modeling framework provides an optimal trade-off between the maximum possible information content and the minimum shared information among stations. This approach was applied to the main surface-water collection system in Tehran to determine new optimal monitoring points under the cost considerations. Experimental results on this drainage network show that the obtained cost-effective designs noticeably outperform the consulting engineers' proposal in terms of both information contents and shared information. The research also determined the highly frequent sites at the Pareto front which might be important for decision makers to give a priority for gauge installation on those locations of the network.
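The abstract does not reproduce the study's objective functions, only the trade-off they encode: maximum information content versus minimum shared information among stations. A minimal single-objective sketch of that trade-off on discretized flow series follows; the bin count, toy data, and brute-force search over pairs are simplifying assumptions (the study itself used multi-objective optimization tools and numerical models).

```python
import numpy as np
from itertools import combinations

def entropy(series, bins=10):
    """Shannon entropy (bits) of a discretized discharge series."""
    counts, _ = np.histogram(series, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

def mutual_info(x, y, bins=10):
    """Shared information between two candidate gauge sites."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log2(pxy[nz] / np.outer(px, py)[nz]))

def score(subset, flows, bins=10):
    """Total marginal information minus pairwise shared information."""
    info = sum(entropy(flows[i], bins) for i in subset)
    redundancy = sum(mutual_info(flows[i], flows[j], bins)
                     for i, j in combinations(subset, 2))
    return info - redundancy

# Toy data: simulated flows at 5 candidate sites; pick the best pair.
rng = np.random.default_rng(1)
base = rng.gamma(2.0, 5.0, 1000)
flows = [base + rng.normal(0, s, 1000) for s in (1, 1, 5, 10, 20)]
best = max(combinations(range(5), 2), key=lambda s: score(s, flows))
print("best pair of sites:", best)
```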
Neuroimaging, Genetics, and Clinical Data Sharing in Python Using the CubicWeb Framework
Grigis, Antoine; Goyard, David; Cherbonnier, Robin; Gareau, Thomas; Papadopoulos Orfanos, Dimitri; Chauvat, Nicolas; Di Mascio, Adrien; Schumann, Gunter; Spooren, Will; Murphy, Declan; Frouin, Vincent
2017-01-01
In neurosciences or psychiatry, the emergence of large multi-center population imaging studies raises numerous technological challenges. From distributed data collection, across different institutions and countries, to final data publication service, one must handle the massive, heterogeneous, and complex data from genetics, imaging, demographics, or clinical scores. These data must be both efficiently obtained and downloadable. We present a Python solution, based on the CubicWeb open-source semantic framework, aimed at building population imaging study repositories. In addition, we focus on the tools developed around this framework to overcome the challenges associated with data sharing and collaborative requirements. We describe a set of three highly adaptive web services that transform the CubicWeb framework into a (1) multi-center upload platform, (2) collaborative quality assessment platform, and (3) publication platform endowed with massive-download capabilities. Two major European projects, IMAGEN and EU-AIMS, are currently supported by the described framework. We also present a Python package that enables end users to remotely query neuroimaging, genetics, and clinical data from scripts. PMID:28360851
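CubicWeb repositories are queried in RQL, and the abstract mentions a Python package for remote scripted queries without naming its API. The sketch below therefore just posts an RQL string over HTTPS with `requests`; the base URL, endpoint path, payload format, and response shape are all assumptions, not the project's documented interface.

```python
import requests

# Hypothetical study endpoint; real deployments and auth schemes will differ.
BASE_URL = "https://imaging-study.example.org"

def run_rql(rql, login, password):
    """POST an RQL query to an assumed CubicWeb JSON endpoint."""
    resp = requests.post(
        f"{BASE_URL}/rqlio/1.0",
        json=[[rql, {}]],          # assumed payload: list of (rql, args) pairs
        auth=(login, password),
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

# Example RQL: list subjects with their ages (entity names are illustrative).
rows = run_rql("Any S, A WHERE S is Subject, S age A",
               login="demo", password="demo")
print(rows)
```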
ACLAME: a CLAssification of Mobile genetic Elements, update 2010.
Leplae, Raphaël; Lima-Mendez, Gipsi; Toussaint, Ariane
2010-01-01
The ACLAME database is dedicated to the collection, analysis and classification of sequenced mobile genetic elements (MGEs), in particular phages and plasmids. In addition to providing information on MGE content, classifications are available at various levels of organization. At the gene/protein level, families group similar sequences that are expected to share the same function. Families of four or more proteins are manually assigned a functional annotation using the Gene Ontology and the locally developed ontology MeGO, dedicated to MGEs. At the genome level, evolutionary cohesive modules group sets of protein families shared among MGEs. At the population level, networks display the reticulate evolutionary relationships among MGEs. To increase the coverage of the phage sequence space, ACLAME version 0.4 incorporates 760 high-quality predicted prophages selected from the Prophinder database. Most of the data can be downloaded from the freely accessible ACLAME web site (http://aclame.ulb.ac.be). The BLAST interface for querying the database has been extended and numerous tools for in-depth analysis of the results have been added.
Tweeting and Treating: How Hospitals Use Twitter to Improve Care.
Gomes, Christian; Coustasse, Alberto
2015-01-01
Hospitals that have adopted Twitter primarily use it to share organizational news, provide general health care information, advertise upcoming community events, and foster networking. The purpose of this study was to explore the benefits that Twitter utilization has had in improving quality of care, access to care, patient satisfaction, and community footprint while assessing the barriers to its implementation. The methodology used was a qualitative study with a semistructured interview combined with a literature review, which followed the basic principles of a systematic review. The utilization of Twitter by hospitals suggests that it leads to savings of resources, enhanced employee and patient communication, and expanded patient reach in the community. Savings opportunities are generated by preventing unnecessary office visits, producing billable patient encounters, and eliminating high recruiting costs. Communication is enhanced by sharing organizational content, news, and health promotions via Twitter, which can also be a useful tool during crises. The utilization of Twitter in the hospital setting has been more beneficial than detrimental in its ability to generate opportunities for cost savings, recruiting, communication with employees and patients, and community reach.
Improving care transitions through meaningful use stage 2: continuity of care document.
Murphy, Lyn Stankiewicz; Wilson, Marisa L; Newhouse, Robin P
2013-02-01
In this department, Drs Murphy, Wilson, and Newhouse highlight hot topics in nursing outcomes, research, and evidence-based practice relevant to the nurse administrator. The goal is to discuss the practical implications for nurse leaders in diverse healthcare settings. Content includes evidence-based projects and decision making, locating measurement tools for quality improvement and safety projects, using outcome measures to evaluate quality, practice implications of administrative research, and exemplars of projects that demonstrate innovative approaches to organizational problems. In this article, the authors describe the elements of continuity of care documentation, how sharing information can improve the quality and safety of care transitions, and the implications for nurse executives.
Sensory analysis and consumer acceptance of 140 high-quality extra virgin olive oils.
Valli, Enrico; Bendini, Alessandra; Popp, Martin; Bongartz, Annette
2014-08-01
Sensory analysis is a crucial tool for evaluating the quality of extra virgin olive oils. One aim of such an investigation is to verify if the sensory attributes themselves - which are strictly related to volatile and phenolic compounds - may permit the discrimination of high-quality products obtained by olives of different cultivars and/or grown in various regions. Moreover, a crucial topic is to investigate the interdependency between relevant parameters determining consumer acceptance and objective sensory characteristics evaluated by the panel test. By statistically analysing the sensory results, a grouping - but not discriminatory - effect was shown for some cultivars and some producing areas. The preference map shows that the most appreciated samples by consumers were situated in the direction of the 'ripe fruity' and 'sweet' axis and opposite to the 'bitter' and 'other attributes' (pungent, green fruity, freshly cut grass, green tomato, harmony, persistency) axis. Extra virgin olive oils produced from olives of the same cultivars and grown in the same areas shared similar sensorial attributes. Some differences in terms of expectation and interpretation of sensory characteristics of extra virgin olive oils might be present for consumers and panellists: most of the consumers appear unfamiliar with positive sensorial attributes, such as bitterness and pungency. © 2013 Society of Chemical Industry.
[Development of quality assurance/quality control web system in radiotherapy].
Okamoto, Hiroyuki; Mochizuki, Toshihiko; Yokoyama, Kazutoshi; Wakita, Akihisa; Nakamura, Satoshi; Ueki, Heihachi; Shiozawa, Keiko; Sasaki, Koji; Fuse, Masashi; Abe, Yoshihisa; Itami, Jun
2013-12-01
Our purpose is to develop a QA/QC (quality assurance/quality control) web system using HTML (HyperText Markup Language) and the server-side scripting language PHP (Hypertext Preprocessor), which can be useful as a tool to share information about QA/QC in radiotherapy. The system proposed in this study can be easily built in one's own institute, because HTML can be easily handled. There are two desired functions in a QA/QC web system: (i) to review, through this system, the results of QA/QC for a radiotherapy machine along with the manuals and reports necessary for routinely performing radiotherapy; by disclosing the results, transparency can be maintained; (ii) to present the institute's own QA/QC protocol using pictures and movies for simplicity's sake, which can also be used as an educational tool for junior radiation technologists and medical physicists. By using this system, not only administrators but also all staff involved in radiotherapy can obtain information about the conditions and accuracy of treatment machines through the QA/QC web system.
An Open Source Tool for Game Theoretic Health Data De-Identification.
Prasser, Fabian; Gaupp, James; Wan, Zhiyu; Xia, Weiyi; Vorobeychik, Yevgeniy; Kantarcioglu, Murat; Kuhn, Klaus; Malin, Brad
2017-01-01
Biomedical data continues to grow in quantity and quality, creating new opportunities for research and data-driven applications. To realize these activities at scale, data must be shared beyond its initial point of collection. To maintain privacy, healthcare organizations often de-identify data, but they assume worst-case adversaries, inducing high levels of data corruption. Recently, game theory has been proposed to account for the incentives of data publishers and recipients (who attempt to re-identify patients), but this perspective has been more hypothetical than practical. In this paper, we report on a new game theoretic data publication strategy and its integration into the open source software ARX. We evaluate our implementation with an analysis on the relationship between data transformation, utility, and efficiency for over 30,000 demographic records drawn from the U.S. Census Bureau. The results indicate that our implementation is scalable and can be combined with various data privacy risk and quality measures.
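The ARX integration itself is not sketched here, since its API is not described in the abstract. The fragment below only illustrates the incentive logic named above: the publisher chooses a transformation anticipating a rational adversary who attempts re-identification only when the expected gain exceeds the cost. All payoff numbers and strategy names are hypothetical.

```python
def adversary_attacks(group_size, gain_per_reid, attack_cost):
    """A rational adversary attacks a record only if the expected gain
    (success probability 1/k for a k-anonymous group) exceeds the cost."""
    return gain_per_reid / group_size > attack_cost

def publisher_payoff(groups, utility, benefit_per_record,
                     loss_per_reid, gain_per_reid, attack_cost):
    """Data-sharing benefit, discounted for quality loss, minus expected
    penalties for records a rational adversary would attack."""
    n = sum(groups)
    payoff = benefit_per_record * n * utility
    for k in groups:
        if adversary_attacks(k, gain_per_reid, attack_cost):
            payoff -= loss_per_reid * k * (1.0 / k)   # expected re-identifications
    return payoff

# Candidate generalization strategies: (equivalence-class sizes, data utility).
candidates = {
    "no generalization": ([1] * 100, 1.00),
    "mild":              ([5] * 20, 0.85),
    "strong":            ([25] * 4, 0.55),
}
best = max(candidates, key=lambda name: publisher_payoff(
    *candidates[name], benefit_per_record=10.0,
    loss_per_reid=300.0, gain_per_reid=50.0, attack_cost=5.0))
print("publisher's best strategy:", best)
```

Under these toy numbers the strongest generalization wins: it deters the adversary entirely, so the residual data utility outweighs the expected re-identification losses of the weaker options.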
MilxXplore: a web-based system to explore large imaging datasets
Bourgeat, P; Dore, V; Villemagne, V L; Rowe, C C; Salvado, O; Fripp, J
2013-01-01
Objective As large-scale medical imaging studies are becoming more common, there is an increasing reliance on automated software to extract quantitative information from these images. As the size of the cohorts keeps increasing with large studies, there is a also a need for tools that allow results from automated image processing and analysis to be presented in a way that enables fast and efficient quality checking, tagging and reporting on cases in which automatic processing failed or was problematic. Materials and methods MilxXplore is an open source visualization platform, which provides an interface to navigate and explore imaging data in a web browser, giving the end user the opportunity to perform quality control and reporting in a user friendly, collaborative and efficient way. Discussion Compared to existing software solutions that often provide an overview of the results at the subject's level, MilxXplore pools the results of individual subjects and time points together, allowing easy and efficient navigation and browsing through the different acquisitions of a subject over time, and comparing the results against the rest of the population. Conclusions MilxXplore is fast, flexible and allows remote quality checks of processed imaging data, facilitating data sharing and collaboration across multiple locations, and can be easily integrated into a cloud computing pipeline. With the growing trend of open data and open science, such a tool will become increasingly important to share and publish results of imaging analysis. PMID:23775173
Sciacovelli, Laura; O'Kane, Maurice; Skaik, Younis Abdelwahab; Caciagli, Patrizio; Pellegrini, Cristina; Da Rin, Giorgio; Ivanov, Agnes; Ghys, Timothy; Plebani, Mario
2011-05-01
The adoption of Quality Indicators (QIs) has prompted the development of tools to measure and evaluate the quality and effectiveness of laboratory testing, first in the hospital setting and subsequently in ambulatory and other care settings. While Laboratory Medicine has an important role in the delivery of high-quality care, no consensus exists as yet on the use of QIs focussing on all steps of the laboratory total testing process (TTP), and further research in this area is required. In order to reduce errors in laboratory testing, the IFCC Working Group on "Laboratory Errors and Patient Safety" (WG-LEPS) developed a series of Quality Indicators, specifically designed for clinical laboratories. In the first phase of the project, specific QIs for key processes of the TTP were identified, including all the pre-, intra- and post-analytical steps. The overall aim of the project is to create a common reporting system for clinical laboratories based on standardized data collection, and to define state-of-the-art and Quality Specifications (QSs) for each QI independent of: a) the size of organization and type of activities; b) the complexity of processes undertaken; and c) different degrees of knowledge and ability of the staff. The aim of the present paper is to report the results collected from participating laboratories from February 2008 to December 2009 and to identify preliminary QSs. The results demonstrate that a Model of Quality Indicators managed as an External Quality Assurance Program can serve as a tool to monitor and control the pre-, intra- and post-analytical activities. It might also allow clinical laboratories to identify risks that lead to errors resulting in patient harm, to identify and design practices that eliminate medical errors, to share information and educate clinical and laboratory teams on practices that reduce or prevent errors, and to monitor and evaluate improvement activities.
Davidson, Jaime A; Rosales, Aracely; Shillington, Alicia C; Bailey, Robert A; Kabir, Chris; Umpierrez, Guillermo E
2015-01-01
Purpose To describe the cultural and linguistic adaptation and Spanish translation of an English-language patient decision aid (PDA) for use in supporting shared decision-making in Hispanics/Latinos with type 2 diabetes mellitus (T2DM), a group at a high risk for complications. Patients and methods A steering committee of endocrinologists, a primary care physician, a certified diabetes educator, and a dietician, each with extensive experience in providing care to Hispanics/Latinos was convened to assess a PDA developed for English-speaking patients with T2DM. English content was reviewed for cultural sensitivity and appropriateness for a Hispanic/Latino population. A consensus-building process and iterative version edits incorporated clinician perspectives. The content was adapted to be consistent with traditional Hispanic/Latino cultural communication precepts (eg, avoidance of hostile confrontation; value for warm interaction; respect for authority; value of family support for decisions). The PDA was translated by native-speaking individuals with diabetes expertise. Results The PDA underwent testing during cognitive interviews with ten Spanish-speaking Hispanics/Latinos with T2DM to ensure that the content is reflective of the experience, understanding, and language Hispanic/Latino patients use to describe diabetes and treatment. Content edits were made to assure a literacy level appropriate to the audience, and the PDA was produced for online video dissemination. Conclusion High-quality, well-developed tools to facilitate shared decision-making in populations with limited access to culturally sensitive information can narrow gaps and align care with individual patient preferences. A newly developed PDA is available for shared decision-making that provides culturally appropriate treatment information for inadequately controlled Hispanics/Latinos with T2DM. The impact on the overall health of patients and care management of T2DM requires further study. PMID:25995623
Tagai, Erin K; Miller, Suzanne M; Kutikov, Alexander; Diefenbach, Michael A; Gor, Ronak A; Al-Saleem, Tahseen; Chen, David Y T; Fleszar, Sara; Roy, Gem
2018-01-15
The Gleason scoring system is a key component of a prostate cancer diagnosis, since it indicates disease aggressiveness. It also serves as a risk communication tool that facilitates shared treatment decision-making. However, the system is highly complex and therefore difficult to communicate: factors which have been shown to undermine well-informed and high-quality shared treatment decision-making. To systematically explore prostate cancer patients' understanding of the Gleason scoring system (GSS), we assessed knowledge and perceived importance among men who had completed treatment (N = 50). Patients were administered a survey that assessed patient knowledge and patients' perceived importance of the GSS, as well as demographics, medical factors (e.g., Gleason score at diagnosis), and health literacy. Bivariate analyses were conducted to identify associations with patient knowledge and perceived importance of the GSS. The sample was generally well-educated (48% with a bachelor's degree or higher) and health literate (M = 12.9, SD = 2.2, range = 3-15). Despite this, patient knowledge of the GSS was low (M = 1.8, SD = 1.4, range = 1-4). Patients' understanding of the importance of the GSS was moderate (M = 2.8, SD = 1.0, range = 0-4) and was positively associated with GSS knowledge (p < .01). Additionally, GSS knowledge was negatively associated with years since biopsy (p < .05). Age and health literacy were positively associated with patients' perceived importance of the GSS (p < .05), but not with GSS knowledge. Patient knowledge is thus less than optimal and would benefit from enhanced communication to maximize shared treatment decision-making. Future studies are needed to explore the potential utility of a simplified Gleason grading system and improved patient-provider communication.
2013-01-01
Background Clinicians face challenges in promoting colorectal cancer screening due to multiple competing demands. A decision aid that clarifies patient preferences and improves decision quality can aid shared decision making and be effective at increasing colorectal cancer screening rates. However, exactly how such an intervention improves shared decision making is unclear. This study, funded by the National Cancer Institute, seeks to provide detailed understanding of how an interactive decision aid that elicits patient’s risks and preferences impacts patient-clinician communication and shared decision making, and ultimately colorectal cancer screening adherence. Methods/Design This is a two-armed single-blinded randomized controlled trial with the target of 300 patients per arm. The setting is eleven community and three academic primary care practices in Metro Detroit. Patients are men and women aged between 50 and 75 years who are not up to date on colorectal cancer screening. ColoDATES Web (intervention arm), a decision aid that incorporates interactive personal risk assessment and preference clarification tools, is compared to a non-interactive website that matches ColoDATES Web in content but does not contain interactive tools (control arm). Primary outcomes are patient uptake of colorectal cancer screening; patient decision quality (knowledge, preference clarification, intent); clinician’s degree of shared decision making; and patient-clinician concordance in the screening test chosen. Secondary outcome incorporates a Structural Equation Modeling approach to understand the mechanism of the causal pathway and test the validity of the proposed conceptual model based on Theory of Planned Behavior. Clinicians and those performing the analysis are blinded to arms. Discussion The central hypothesis is that ColoDATES Web will improve colorectal cancer screening adherence through improvement in patient behavioral factors, shared decision making between the patient and the clinician, and concordance between the patient’s and clinician’s preferred colorectal cancer screening test. The results of this study will be among the first to examine the effect of a real-time preference assessment exercise on colorectal cancer screening and mediators, and, in doing so, will shed light on the patient-clinician communication and shared decision making ‘black box’ that currently exists between the delivery of decision aids to patients and subsequent patient behavior. Trial Registration ClinicalTrials.gov ID NCT01514786 PMID:24216139
Open source tools for fluorescent imaging.
Hamilton, Nicholas A
2012-01-01
As microscopy becomes increasingly automated and imaging expands in the spatial and time dimensions, quantitative analysis tools for fluorescent imaging are becoming critical to remove both bottlenecks in throughput as well as fully extract and exploit the information contained in the imaging. In recent years there has been a flurry of activity in the development of bio-image analysis tools and methods with the result that there are now many high-quality, well-documented, and well-supported open source bio-image analysis projects with large user bases that cover essentially every aspect from image capture to publication. These open source solutions are now providing a viable alternative to commercial solutions. More importantly, they are forming an interoperable and interconnected network of tools that allow data and analysis methods to be shared between many of the major projects. Just as researchers build on, transmit, and verify knowledge through publication, open source analysis methods and software are creating a foundation that can be built upon, transmitted, and verified. Here we describe many of the major projects, their capabilities, and features. We also give an overview of the current state of open source software for fluorescent microscopy analysis and the many reasons to use and develop open source methods. Copyright © 2012 Elsevier Inc. All rights reserved.
Traumatic Brain Injury Diffusion Magnetic Resonance Imaging Research Roadmap Development Project
2011-10-01
A promising technology on the horizon is Diffusion Tensor Imaging (DTI), a magnetic resonance imaging (MRI)-based ... in the brain. The potential for DTI to improve our understanding of TBI has not been fully explored, and challenges associated with non-existent ... processing tools, quality control standards, and a shared image repository. The recommendations will be disseminated and pilot tested. A DTI of TBI ...
Collaborative workbench for cyberinfrastructure to accelerate science algorithm development
NASA Astrophysics Data System (ADS)
Ramachandran, R.; Maskey, M.; Kuo, K.; Lynnes, C.
2013-12-01
There are significant untapped resources for information and knowledge creation within the Earth Science community in the form of data, algorithms, services, analysis workflows or scripts, and the related knowledge about these resources. Despite the huge growth in social networking and collaboration platforms, these resources often reside on an investigator's workstation or laboratory and are rarely shared. A major reason for this is that there are very few scientific collaboration platforms, and those that exist typically require the use of a new set of analysis tools and paradigms to leverage the shared infrastructure. As a result, adoption of these collaborative platforms for science research is inhibited by the high cost to an individual scientist of switching from his or her own familiar environment and set of tools to a new environment and tool set. This presentation will describe an ongoing project developing an Earth Science Collaborative Workbench (CWB). The CWB approach will eliminate this barrier by augmenting a scientist's current research environment and tool set to allow him or her to easily share diverse data and algorithms. The CWB will leverage evolving technologies such as commodity computing and social networking to design an architecture for scalable collaboration that will support the emerging vision of an Earth Science Collaboratory. The CWB is being implemented on the robust and open source Eclipse framework and will be compatible with widely used scientific analysis tools such as IDL. The myScience Catalog built into CWB will capture and track metadata and provenance about data and algorithms for the researchers in a non-intrusive manner with minimal overhead. Seamless interfaces to multiple Cloud services will support sharing algorithms, data, and analysis results, as well as access to storage and computer resources. A Community Catalog will track the use of shared science artifacts and manage collaborations among researchers.
Canadian ENGOs in governance of water resources: information needs and monitoring practices.
Kebo, Sasha; Bunch, Martin J
2013-11-01
Water quality monitoring involves a complex set of steps and a variety of approaches. Its goals include understanding of aquatic habitats, informing management and facilitating decision making, and educating citizens. Environmental nongovernmental organizations (ENGOs) are increasingly engaged in water quality monitoring and act as environmental watchdogs and stewards of water resources. These organizations exhibit different monitoring mandates. As government involvement in water quality monitoring continues to decline, it becomes essential that we understand their modi operandi. By doing so, we can enhance efficacy and encourage data sharing and communication. This research examined Canadian ENGOs that collect their own data on water quality with respect to water quality monitoring activities and information needs. This work had a twofold purpose: (1) to enhance knowledge about the Canadian ENGOs operating in the realm of water quality monitoring and (2) to guide and inform development of web-based geographic information systems (GIS) to support water quality monitoring, particularly using benthic macroinvertebrate protocols. A structured telephone survey was administered across 10 Canadian provinces to 21 ENGOs that undertake water quality monitoring. This generated information about barriers and challenges of data sharing, commonly collected metrics, human resources, and perceptions of volunteer-collected data. Results are presented on an aggregate level and among different groups of respondents. Use of geomatics technology was not consistent among respondents, and we found no noteworthy differences between organizations that did and did not use GIS tools. About one third of respondents did not employ computerized systems (including databases and spreadsheets) to support data management, analysis, and sharing. Despite their advantage as a holistic water quality indicator, benthic macroinvertebrates (BMIs) were not widely employed in stream monitoring. Although BMIs are particularly suitable for the purpose of citizen education, few organizations collected this metric, despite having public education and awareness as part of their mandate.
Promoting Continuous Quality Improvement in Online Teaching: The META Model
ERIC Educational Resources Information Center
Dittmar, Eileen; McCracken, Holly
2012-01-01
Experienced e-learning faculty members share strategies for implementing a comprehensive postsecondary faculty development program essential to continuous improvement of instructional skills. The high-impact META Model (centered around Mentoring, Engagement, Technology, and Assessment) promotes information sharing and content creation, and fosters…
Lean Management Systems in Radiology: Elements for Success.
Schultz, Stacy R; Ruter, Royce L; Tibor, Laura C
2016-01-01
This article is a review of the literature on Lean and Lean Management Systems and how they have been implemented in healthcare organizations and particularly in radiology departments. The review focuses on the elements required for a successful implementation of Lean by applying the principles of a Lean Management System instead of a Lean tools-only approach. This review shares the successes and failures from healthcare organizations' efforts to improve the quality and safety of the services they provide. There are a limited number of healthcare organizations in the literature who have shared their experiences and additional research is necessary to determine whether a Lean Management System is a viable alternative to the current management structure in healthcare.
2014-01-01
Background Since microscopic slides can now be automatically digitized and integrated in the clinical workflow, quality assessment of Whole Slide Images (WSI) has become a crucial issue. We present a no-reference quality assessment method that has been thoroughly tested since 2010 and is under implementation in multiple sites, both public university-hospitals and private entities. It is part of the FlexMIm R&D project which aims to improve the global workflow of digital pathology. For these uses, we have developed two programming libraries, in Java and Python, which can be integrated in various types of WSI acquisition systems, viewers and image analysis tools. Methods Development and testing have been carried out on a MacBook Pro i7 and on a bi-Xeon 2.7GHz server. Libraries implementing the blur assessment method have been developed in Java, Python, PHP5 and MySQL5. For web applications, JavaScript, Ajax, JSON and Sockets were also used, as well as the Google Maps API. Aperio SVS files were converted into the Google Maps format using VIPS and Openslide libraries. Results We designed the Java library as a Service Provider Interface (SPI), extendable by third parties. Analysis is computed in real-time (3 billion pixels per minute). Tests were made on 5000 single images, 200 NDPI WSI, 100 Aperio SVS WSI converted to the Google Maps format. Conclusions Applications based on our method and libraries can be used upstream, as calibration and quality control tool for the WSI acquisition systems, or as tools to reacquire tiles while the WSI is being scanned. They can also be used downstream to reacquire the complete slides that are below the quality threshold for surgical pathology analysis. WSI may also be displayed in a smarter way by sending and displaying the regions of highest quality before other regions. Such quality assessment scores could be integrated as WSI's metadata shared in clinical, research or teaching contexts, for a more efficient medical informatics workflow. PMID:25565494
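The blur assessment algorithm itself is not specified in the abstract. A common no-reference sharpness proxy is the variance of the Laplacian, sketched below with NumPy/SciPy on grayscale tiles; the tiling scheme and relative threshold are assumptions for illustration, not the FlexMIm method.

```python
import numpy as np
from scipy.ndimage import laplace, gaussian_filter

def blur_score(tile):
    """Variance of the Laplacian: low values indicate blurry tiles.
    A generic no-reference sharpness proxy, not the FlexMIm metric itself."""
    return laplace(tile.astype(float)).var()

def flag_blurry_tiles(wsi, tile=256, rel_threshold=0.1):
    """Score each tile of a grayscale slide array; flag tiles whose sharpness
    falls below a fraction of the slide-wide median (the 10% cut-off is an
    assumed value for illustration)."""
    scores = {}
    h, w = wsi.shape
    for y in range(0, h - tile + 1, tile):
        for x in range(0, w - tile + 1, tile):
            scores[(y, x)] = blur_score(wsi[y:y + tile, x:x + tile])
    median = np.median(list(scores.values()))
    return [(y, x) for (y, x), s in scores.items() if s < rel_threshold * median]

# Toy slide: sharp noise with one artificially blurred region.
rng = np.random.default_rng(2)
img = rng.integers(0, 255, (1024, 1024)).astype(float)
img[256:512, 256:512] = gaussian_filter(img[256:512, 256:512], sigma=5)
print("tiles to reacquire:", flag_blurry_tiles(img))
```

Such a per-tile score could feed either use named above: rejecting tiles during acquisition, or flagging whole slides whose aggregate score falls below a surgical-pathology threshold.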
Simiyu, Sheillah; Swilling, Mark; Cairncross, Sandy; Rheingans, Richard
2017-01-11
Shared facilities are not recognised as improved sanitation because of maintenance challenges: they can easily become avenues for the spread of disease. There is thus a need to evaluate the quality of shared facilities, especially in informal settlements, where they are commonly used. A shared facility can be equated to a common good whose management depends on the users. If users do not work collectively towards keeping the facility clean, its quality is likely to depreciate through lack of maintenance. This study examined the quality of shared sanitation facilities and used the common pool resource (CPR) management principles to examine the determinants of shared sanitation quality in the informal settlements of Kisumu, Kenya. Using a multiple case study design, the study employed both quantitative and qualitative methods. In both phases, users of shared sanitation facilities were interviewed, while shared sanitation facilities were inspected. Shared sanitation quality was a score which served as the dependent variable in a regression analysis. Interviews during the qualitative stage were aimed at understanding the management practices of shared sanitation users. Qualitative data were analysed thematically following the CPR principles. Shared facilities, most of which were dirty, were shared by an average of eight households, and their quality decreased with an increase in the number of households sharing. The effect of numbers on quality is explained by behaviour reflected in the CPR principles, as it was easier to define boundaries of shared facilities when there were fewer users who cooperated towards improving their shared sanitation facility. Other factors, such as defined management systems, cooperation, collective decision making, and social norms, also played a role in influencing the behaviour of users towards keeping shared facilities clean and functional. Apart from hardware factors, the quality of shared sanitation is largely due to the group behaviour of users. The CPR principles form a crucial lens through which the dynamics of shared sanitation facilities in informal settlements can be understood. Development and policy efforts should incorporate group behaviour, as it determines the quality of shared sanitation facilities.
Calleja-Fernández, Alicia; Pintor-de-la-Maza, Begoña; Vidal-Casariego, Alfonso; Cano-Rodríguez, Isidoro; Ballesteros-Pomar, María D
2016-06-01
Texture-modified diets (TMDs) should fulfil nutritional goals, guarantee homogeneous texture, and meet food safety regulations. The food industry has created texture-modified food (TMF) that meets the TMD requirements of quality and safety for inpatients. The aim was to design and develop a tool that allows the objective selection of foodstuffs for TMDs, ensuring the nutritional requirements and swallowing safety of inpatients in order to improve their quality of life, especially regarding their food satisfaction. An evaluation tool was designed to objectively determine the adequacy of food included in the TMD menus of a hospital. The "Objective Evaluation Tool for Texture-Modified Food" (OET-TMF) consists of seven items that evaluate the food's nutritional quality (energy and protein input), presence of allergens, texture and viscosity, cooking, storage type, shelf life, and patient acceptance. The total score ranges from 0 to 64 and is divided into four categories: high quality, good quality, medium quality, and low quality. Studying four different commercial TMFs contributed to the validation of the tool. All the evaluated products scored between high and good quality. There was a tendency (p = 0.077) towards higher consumption with higher overall product quality as rated by the OET-TMF. The product that scored highest with the tool was the best accepted; the product with the lowest score had the highest rate of refusal. The OET-TMF allows for the objective discrimination of the quality of TMF. In addition, it shows a certain relationship between assessed quality and observed intake.
De Belvis, Antonio Giulio; Specchia, Maria Lucia; Ferriero, Anna Maria; Capizzi, Silvio
2017-01-01
Risk management is a key tool in Clinical Governance. Our project aimed to define, share, apply and measure the impact of tools and methodologies for the continuous improvement of quality of care, especially in relation to the multi-disciplinary and integrated management of the hyperglycemic patient in hospital settings. A training project, coordinated by a scientific board of experts in diabetes and health management and an Expert Meeting with representatives of all the participating centers was launched in 2014. The project involved eight hospitals through the organization of meetings with five managers and 25 speakers, including diabetologists, internists, pharmacists and nurses. The analysis showed a wide variability in the adoption of tools and processes towards a comprehensive and coordinated management of hyperglycemic patients.
Evaluation of the quality of patient information to support informed shared decision-making.
Godolphin, W; Towle, A; McKendry, R
2001-12-01
(a) To find out how much patient information material on display in family physicians' offices refers to management choices, and hence may be useful to support informed and shared decision-making (ISDM) by patients and (b) to evaluate the quality of print information materials exchanged during the consultation, i.e. brought in by patients or given out by family physicians. All print information available for patients and exchanged between physicians and patients was collected in a single complete day of the office practices of 21 family physicians. A published and validated instrument (DISCERN) was used to assess quality. Community office practices in the greater Vancouver area, British Columbia, Canada. The physicians were purposefully recruited by their association with the medical school Department of Family Practice, their interest in providing patients with print information and their representation of a range of practice types and location. The source of the pamphlets and these categories: available in the physicians' offices; exchanged between physician and patient; and produced with the explicit or apparent intent to support evidence-based patient choice. The quality of the print information to support ISDM, as measured by DISCERN and the ease of use and reliability of the DISCERN tool. Fewer than 50% of pamphlets available in these offices fulfilled our minimum criteria for ISDM (mentioned more than one management option). Offices varied widely in the proportion of pamphlets on display that supported ISDM and how particular the physician was in selecting materials. The DISCERN tool is quick, valid and reliable for the evaluation of patient information. The quality of patient information materials used in the consultation and available in these offices was below midpoint on the DISCERN score. Major deficiencies were with respect to the mention of choices, risks, effect of no treatment or uncertainty and reliability (source, evidence-base). Good quality information can be produced; some is available locally.
Eldh, Ann Catrine; Luhr, Kristina; Ehnfors, Margareta
2015-12-01
To report on the development and initial testing of a clinical tool, The Patient Preferences for Patient Participation tool (The 4Ps), which will allow patients to depict, prioritize, and evaluate their participation in health care. While patient participation is vital for high quality health care, a common definition incorporating all stakeholders' experience is pending. In order to support participation in health care, a tool for determining patients' preferences on participation is proposed, including opportunities to evaluate participation while considering patient preferences. Exploratory mixed methods studies informed the development of the tool, and descriptive design guided its initial testing. The 4Ps tool was tested with 21 Swedish researcher experts (REs) and patient experts (PEs) with experience of patient participation. Individual Think Aloud interviews were employed to capture experiences of content, response process, and acceptability. 'The 4Ps' included three sections for the patient to depict, prioritize, and evaluate participation using 12 items corresponding to 'Having Dialogue', 'Sharing Knowledge', 'Planning', and 'Managing Self-care'. The REs and PEs considered 'The 4Ps' comprehensible, and that all items corresponded to the concept of patient participation. The tool was perceived to facilitate patient participation whilst requiring amendments to content and layout. A tool like The 4Ps provides opportunities for patients to depict participation, and thus supports communication and collaboration. Further patient evaluation is needed to understand the conditions for patient participation. While The 4Ps is promising, revision and testing in clinical practice is required. © 2014 John Wiley & Sons Ltd.
Song, Ting; Li, Nan; Zarepisheh, Masoud; Li, Yongbao; Gautier, Quentin; Zhou, Linghong; Mell, Loren; Jiang, Steve; Cerviño, Laura
2016-01-01
Intensity-modulated radiation therapy (IMRT) currently plays an important role in radiotherapy, but its treatment plan quality can vary significantly among institutions and planners. Treatment plan quality control (QC) is a necessary component for individual clinics to ensure that patients receive treatments with high therapeutic gain ratios. The voxel-weighting factor-based plan re-optimization mechanism has been shown to explore a larger Pareto surface (solution domain) and therefore to increase the likelihood of finding an optimal treatment plan. In this study, we incorporated additional modules into an in-house developed voxel weighting factor-based re-optimization algorithm, extending it into a highly automated and accurate IMRT plan QC tool (TPS-QC tool). After importing an under-assessment plan, the TPS-QC tool was able to generate a QC report within 2 minutes. This QC report contains the plan quality determination as well as information supporting the determination. Finally, IMRT plan quality can be controlled by approving quality-passed plans and replacing quality-failed plans using the TPS-QC tool. The feasibility and accuracy of the proposed TPS-QC tool were evaluated using 25 clinically approved cervical cancer patient IMRT plans and 5 manually created poor-quality IMRT plans. The results showed high consistency between the QC report quality determinations and the actual plan quality. In the 25 clinically approved cases that the TPS-QC tool identified as passed, a greater difference could be observed for dosimetric endpoints for organs at risk (OAR) than for planning target volume (PTV), implying that better dose sparing could be achieved in OAR than in PTV. Moreover, the dose-volume histogram (DVH) curves of the TPS-QC tool re-optimized plans satisfied the dosimetric criteria more frequently than did the under-assessment plans, and the criteria for unsatisfied dosimetric endpoints in the 5 poor-quality plans could typically be satisfied when the TPS-QC tool generated re-optimized plans, without sacrificing other dosimetric endpoints. Beyond its feasibility and accuracy, the proposed TPS-QC tool is also user-friendly and easy to operate, both necessary characteristics for clinical use. PMID:26930204
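The abstract does not spell out the voxel-weighting re-optimization mechanism, but the general idea it builds on can be sketched: penalize each voxel's squared deviation from its prescribed dose, then raise the weights of voxels that still violate their limits and re-solve. The Python sketch below is a toy illustration under those assumptions; every name and parameter in it is hypothetical, and it is not the authors' algorithm.

    import numpy as np

    def reoptimize(A, prescription, weights, n_iters=20, boost=1.5):
        """Toy voxel-weighted re-optimization: solve a weighted least-squares
        dose objective, then boost the weights of voxels still over their
        prescribed dose and repeat. Illustrative only, not the TPS-QC tool."""
        x = np.zeros(A.shape[1])                  # beamlet intensities
        for _ in range(n_iters):
            w = np.sqrt(weights)
            # Weighted least squares; nonnegativity enforced by a crude clip.
            x, *_ = np.linalg.lstsq(w[:, None] * A, w * prescription, rcond=None)
            x = np.clip(x, 0.0, None)
            violating = A @ x > prescription      # e.g. OAR voxels over tolerance
            if not violating.any():
                break
            weights = weights.copy()
            weights[violating] *= boost           # steer the search along the Pareto surface
        return x, A @ x

    # Tiny synthetic geometry: 6 voxels, 3 beamlets.
    rng = np.random.default_rng(0)
    A = rng.random((6, 3))
    x, dose = reoptimize(A, prescription=np.ones(6), weights=np.ones(6))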
SECIMTools: a suite of metabolomics data analysis tools.
Kirpich, Alexander S; Ibarra, Miguel; Moskalenko, Oleksandr; Fear, Justin M; Gerken, Joseph; Mi, Xinlei; Ashrafi, Ali; Morse, Alison M; McIntyre, Lauren M
2018-04-20
Metabolomics has the promise to transform the area of personalized medicine with the rapid development of high throughput technology for untargeted analysis of metabolites. Open access, easy-to-use analytic tools that are broadly accessible to the biological community need to be developed. While the technology used in metabolomics varies, most metabolomics studies produce a set of identified features for analysis. Galaxy is an open access platform that enables scientists at all levels to interact with big data. Galaxy promotes reproducibility by saving histories and enabling the sharing of workflows among scientists. SECIMTools (SouthEast Center for Integrated Metabolomics) is a set of Python applications that are available both as standalone tools and wrapped for use in Galaxy. The suite includes a comprehensive set of quality control metrics (retention time window evaluation and various peak evaluation tools), visualization techniques (hierarchical cluster heatmap, principal component analysis, modular modularity clustering), basic statistical analysis methods (partial least squares discriminant analysis, analysis of variance, t-test, Kruskal-Wallis non-parametric test), advanced classification methods (random forest, support vector machines), and advanced variable selection tools (least absolute shrinkage and selection operator (LASSO) and Elastic Net). SECIMTools leverages the Galaxy platform and enables integrated workflows for metabolomics data analysis made from building blocks designed for easy use and interpretability. Standard data formats and a set of utilities allow arbitrary linkages between tools to encourage novel workflow designs. The Galaxy framework enables future data integration for metabolomics studies with other omics data.
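SECIMTools itself packages these steps for Galaxy; outside Galaxy, the same class of first-pass QC and visualization can be approximated in ordinary Python. A minimal sketch, assuming pandas and scikit-learn are available and a tab-separated wide table of features by samples (the file and column names are hypothetical):

    import pandas as pd
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    # Rows = features (metabolites), columns = samples; file name hypothetical.
    wide = pd.read_csv("peak_table.tsv", sep="\t", index_col="feature_id")

    # Basic QC: drop features with > 20% missing values, then impute the rest.
    wide = wide.loc[wide.isna().mean(axis=1) <= 0.20]
    wide = wide.apply(lambda row: row.fillna(row.min() / 2), axis=1)  # half-minimum imputation

    # PCA on samples, a standard first look for batch effects and outliers.
    X = StandardScaler().fit_transform(wide.T.values)
    scores = PCA(n_components=2).fit_transform(X)
    qc = pd.DataFrame(scores, index=wide.columns, columns=["PC1", "PC2"])
    print(qc.head())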
NASA Astrophysics Data System (ADS)
Hasenkopf, C. A.
2017-12-01
Increasingly, open data, open-source projects are unearthing rich datasets and tools previously impossible for more traditional avenues to generate. These projects are possible, in part, because of the emergence of online collaborative and code-sharing tools, decreasing costs of cloud-based services to fetch, store, and serve data, and the increasing interest of individuals to contribute their time and skills to 'open projects.' While such projects have generated palpable enthusiasm from many sectors, many of them face uncharted paths for sustainability, visibility, and acceptance. Our project, OpenAQ, is an example of an open-source, open data community that is currently forging its own uncharted path. OpenAQ is an open air quality data platform that aggregates and universally formats government and research-grade air quality data from 50 countries across the world. To date, we make available more than 76 million air quality (PM2.5, PM10, SO2, NO2, O3, CO and black carbon) data points through an open Application Programming Interface (API) and a user-customizable download interface at https://openaq.org. The goal of the platform is to enable an ecosystem of users to advance air pollution efforts from science to policy to the private sector. The platform is also an open-source project (https://github.com/openaq) and has only been made possible through the coding and data contributions of individuals around the world. In our first two years of existence, we have seen requests for data to our API skyrocket to more than 6 million data points per month, and use cases as varied as ingesting data aggregated from our system into real-time models of wildfires, building open-source statistical packages (e.g. ropenaq and py-openaq) on top of the platform, and creating public-friendly apps and chatbots. We will share a whirlwind tour through our evolution and the many lessons learned so far related to platform structure, community engagement, organizational model type and sustainability.
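As one concrete illustration of the API mentioned above, a query for recent PM2.5 measurements might look like the following. The endpoint path and parameter names are assumptions based on the platform's v2 API, which has since evolved (newer versions require an API key), so treat this as a sketch rather than current usage:

    import requests

    # Endpoint shape and parameters assumed from the v2 API; consult the
    # current OpenAQ documentation before relying on this.
    resp = requests.get(
        "https://api.openaq.org/v2/measurements",
        params={"parameter": "pm25", "country": "US", "limit": 100},
        timeout=30,
    )
    resp.raise_for_status()
    for m in resp.json().get("results", []):
        print(m["location"], m["parameter"], m["value"], m["unit"])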
Climate Science News 2.0 at NSIDC
NASA Astrophysics Data System (ADS)
Leitzell, K.; Meier, W.; Serreze, M. C.; Stroeve, J. C.; Scambos, T. A.
2011-12-01
How does a small science and data center step into new media? We do not have a lot of time to blog daily, maintain multiple social media accounts, monitor comments, or constantly buff our image in the fast-changing world of social media. At the same time, the National Snow and Ice Data Center (NSIDC)'s news announcements and updates on Arctic sea ice reach a huge audience. We have answers to the questions about Arctic climate change that many people are asking, and we want to share that information with people who get their news from non-traditional sources. How can we take advantage of new technologies to help our information reach the largest number of people, without overwhelming our limited resources? So far our approach has been to continue offering innovative, insightful content that in some ways sells itself. We use social media as a tool to share this popular content, emphasizing quality over quantity (we do not tweet every day, but when we do, people listen). We also use social media as a research and "buzz-monitoring" tool to learn more about and interact with our diverse audience. Even before NSIDC ventured onto Twitter and Facebook, people were using these tools to share our content. Social media allowed us to passively enjoy their benefits, as our regular readers shared updates with their friends and colleagues. The news, analysis, and data we provide were unique, and that made them attractive to a broad readership. By dipping a toe into social media, however, we found that we could start sharing our content with more control, and that a little effort goes a long way in spreading the word. In this presentation/poster we will show how NSIDC is using Twitter, Facebook, and the new Icelights Web site to communicate with the public about changing sea ice and climate.
Achieving Quality in e-Learning through Relational Coordination
ERIC Educational Resources Information Center
Margalina, Vasilica Maria; De-Pablos-Heredero, Carmen; Montes-Botella, Jose Luis
2017-01-01
In this research, the relational coordination model has been applied to prove learners' and instructors' high levels of satisfaction in e-learning. According to the model, organizations can obtain better results in terms of satisfaction by providing shared knowledge, shared goals and mutual respect mechanisms, supported by a frequent, timely and…
Proposal for constructing an advanced software tool for planetary atmospheric modeling
NASA Technical Reports Server (NTRS)
Keller, Richard M.; Sims, Michael H.; Podolak, Esther; Mckay, Christopher P.; Thompson, David E.
1990-01-01
Scientific model building can be a time intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing and using models. The proposed tool will include an interactive intelligent graphical interface and a high level, domain specific, modeling language. As a testbed for this research, we propose development of a software prototype in the domain of planetary atmospheric modeling.
Adoption of a Nationwide Shared Medical Record in France: Lessons Learnt after 5 Years of Deployment
Séroussi, Brigitte; Bouaud, Jacques
2016-01-01
Information sharing among health practitioners, either for coordinated or unscheduled care, is necessary to guarantee care quality and patient safety. In most countries, nationwide programs have provided tools to support information sharing, from centralized care records to health information exchange between electronic health records (EHRs). The French personal medical record (DMP) is a centralized patient-controlled record, created according to the opt-in consent model. It contains the documents health practitioners voluntarily push into the DMP from their EHRs. Five years after the launch of the program in December 2010, there were nearly 570,000 DMPs in December 2015, covering only 1.5% of the target population. Reasons for this poor level of adoption are discussed in the perspective of other countries' initiatives. The new French governmental strategy for DMP deployment in 2016 is outlined, with the implementation of measures similar to the US Meaningful Use program. PMID:28269907
Evaluating knowledge transfer practices among construction organization in Malaysia
NASA Astrophysics Data System (ADS)
Zaidi, Mohd Azian; Baharuddin, Mohd Nurfaisal; Bahardin, Nur Fadhilah; Yasin, Mohd Fadzil Mat; Nawi, Mohd Nasrun Mohd; Deraman, Rafikullah
2016-08-01
The aim of this paper is to identify the key dimensions of knowledge transfer that improve construction organization performance. It investigates the effectiveness of the knowledge transfer practices currently adopted by Malaysian construction organizations and examines the relationship between knowledge transfer factors and organizational factors. A survey of 151 respondents across different contractor registration grades was employed for the study. The survey shows that seventeen (17) factors, namely creating shared awareness for information sharing, communication, personal skills, individual attitude, training, organizational culture, information technology, motivation, monitoring and supervision, service quality, information accessibility, information supply, socialization process, knowledge tools, coaching and monitoring, staff briefing, and information sharing, were identified as key dimensions of knowledge transfer success. This finding suggests that by improving each factor, recognition of the whole strategic knowledge transfer process can be increased, thus helping to strengthen Malaysian construction organizations for competitive advantage.
Mesa-Gutiérrez, J C; Bardají, C; Brun, N; Núñez, B; Sánchez, B; Sanvicente, B; Obiols, P; Rigol, S
2012-04-01
New tools from the web are a complete breakthrough in the management of information. The aim of this paper is to present different resources in a friendly way, with apps and examples for the different phases of knowledge management for the paediatric surgeon: search, filtering, reception, classification, sharing, collaborative work and publication. We are witnessing a real revolution in how knowledge and information are managed. Its main characteristics are immediacy, a social component, growing interaction, and ease of use. Every physician has clinical questions, and the Internet gives us more and more resources to make searches easier. Along with them we need electronic resources to filter information of quality and to ease the transfer of knowledge to clinical practice. Cloud computing is under continuous development and makes it possible to share information among different users and computers. The main feature of apps from the Internet is their social component, which makes possible interaction, sharing and collaborative work.
Tools for Communication: Novel infrastructure to address patient-perceived gaps in oncology care.
McMullen, Suzanne; Szabo, Shelagh; Halbert, Ronald J; Lai, Catherine; Parikh, Aparna; Bunce, Mikele; Khoury, Raya; Small, Art; Masaquel, Anthony
2017-04-01
Communication between healthcare providers (HCPs) and patients is integral to high-quality oncology care. The patient and HCP perspectives are needed to identify gaps in care and develop communication tools. This study aimed to understand patient- and HCP-perceived elements of, and gaps in, high-quality care in order to develop novel communication tools to improve care. Qualitative interviews were conducted among 16 patients with cancer and 10 HCPs in the United States. Trained interviewers elicited patients' and HCPs' concerns, views, and perceived needs for communication tools. A thematic analysis was used to identify four quality-of-care domains, depicted in a conceptual model, and two draft communication tools were developed to address identified gaps. No patients reported previously using a communication tool, and gaps in communication regarding treatment aims and education were evident. Two tools were developed to assess patients' life and treatment goals and the importance of ongoing education.
Yi, Ming; Zhao, Yongmei; Jia, Li; He, Mei; Kebebew, Electron; Stephens, Robert M.
2014-01-01
To apply exome-seq-derived variants in the clinical setting, there is an urgent need to identify the best variant caller(s) from a large collection of available options. We have used an Illumina exome-seq dataset as a benchmark, with two validation scenarios (family pedigree information and SNP array data for the same samples) permitting global high-throughput cross-validation, to evaluate the quality of SNP calls derived from several popular variant discovery tools from both the open-source and commercial communities, using a set of designated quality metrics. To the best of our knowledge, this is the first large-scale performance comparison of exome-seq variant discovery tools using high-throughput validation with both Mendelian inheritance checking and SNP array data. This approach allows us to gain insights into the accuracy of SNP calling in an unprecedented way, whereas previously reported comparison studies have only assessed the concordance of these tools without directly assessing the quality of the derived SNPs. More importantly, the main purpose of our study was to establish a reusable procedure that applies high-throughput validation to compare the quality of SNP discovery tools, with a focus on exome-seq, and which can be used to evaluate any forthcoming tool(s) of interest. PMID:24831545
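The Mendelian inheritance check underlying the pedigree validation reduces to a simple rule at biallelic sites: the child's genotype must be composable from one allele of each parent. A minimal sketch, with genotypes encoded as unordered allele pairs (this is the general rule, not the authors' pipeline):

    from itertools import product

    def mendelian_consistent(child, mother, father):
        """True if the child's unordered genotype, e.g. ('A', 'G'), can be
        formed by taking one allele from each parent."""
        return any(sorted(child) == sorted(pair) for pair in product(mother, father))

    # child A/G from mother A/A and father G/G: consistent
    assert mendelian_consistent(("A", "G"), ("A", "A"), ("G", "G"))
    # child G/G from mother A/A: a Mendelian error, so the call is suspect
    assert not mendelian_consistent(("G", "G"), ("A", "A"), ("A", "G"))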
Metadata based management and sharing of distributed biomedical data
Vergara-Niedermayr, Cristobal; Liu, Peiya
2014-01-01
Biomedical research data sharing is becoming increasingly important for researchers to reuse experiments, pool expertise and validate approaches. However, there are many hurdles to data sharing, including the unwillingness to share, the lack of a flexible data model for providing context information, the difficulty of sharing syntactically and semantically consistent data across distributed institutions, and the high cost of providing tools to share the data. SciPort is a web-based collaborative biomedical data sharing platform to support data sharing across distributed organisations. SciPort provides a generic metadata model to flexibly customise and organise the data. To enable convenient data sharing, SciPort provides a central server based sharing architecture with one-click data sharing from a local server. To enable syntactic consistency, SciPort provides collaborative distributed schema management across distributed sites. To enable semantic consistency, SciPort provides semantic tagging through controlled vocabularies. SciPort is lightweight and can be easily deployed for building data sharing communities. PMID:24834105
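The semantic-consistency mechanism the abstract attributes to SciPort, tagging metadata against controlled vocabularies, can be illustrated with a minimal validation sketch; the vocabulary, field names, and function are invented for illustration and are not SciPort's API:

    CONTROLLED_TERMS = {"modality": {"MRI", "CT", "histology"}}  # invented vocabulary

    def validate(record):
        """Flag metadata fields whose values fall outside the shared controlled
        vocabulary: the kind of semantic-consistency check described above."""
        return [
            f"{field}={value!r} not in vocabulary"
            for field, allowed in CONTROLLED_TERMS.items()
            if (value := record.get(field)) is not None and value not in allowed
        ]

    print(validate({"modality": "MRI"}))         # [] -> passes
    print(validate({"modality": "ultrasound"}))  # flagged for curation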
Managing Epilepsy Well: Emerging e-Tools for epilepsy self-management.
Shegog, Ross; Bamps, Yvan A; Patel, Archna; Kakacek, Jody; Escoffery, Cam; Johnson, Erica K; Ilozumba, Ukwuoma O
2013-10-01
The Managing Epilepsy Well (MEW) Network was established in 2007 by the Centers for Disease Control and Prevention Epilepsy Program to expand epilepsy self-management research. The network has employed collaborative research strategies to develop, test, and disseminate evidence-based, community-based, and e-Health interventions (e-Tools) for epilepsy self-management for people with epilepsy, caregivers, and health-care providers. Since its inception, MEW Network collaborators have conducted formative studies (n=7) investigating the potential of e-Health to support epilepsy self-management and intervention studies evaluating e-Tools (n=5). The MEW e-Tools (the MEW website, WebEase, UPLIFT, MINDSET, and PEARLS online training) and affiliated e-Tools (Texting 4 Control) are designed to complement self-management practices in each phase of the epilepsy care continuum. These tools exemplify a concerted research agenda, shared methodological principles and models for epilepsy self-management, and a communal knowledge base for implementing e-Health to improve quality of life for people with epilepsy. © 2013.
Clinical research in a hospital--from the lone rider to teamwork.
Hannisdal, E
1996-01-01
Clinical research of a high international standard is very demanding and requires high-quality clinical data, software, hardware and competence in research design and the statistical treatment of data. Most busy clinicians have little time allocated for clinical research, and this increases the need for a potent infrastructure. This paper describes how the Norwegian Radium Hospital, a specialized cancer hospital, has reorganized the clinical research process. This includes a new department, the Clinical Research Office, which provides the formal framework, a central Diagnosis Registry, clinical databases and multicentre studies. The department assists about 120 users, mainly clinicians. Installation of a network software package with over 10 programs has provided strong internal standardization, reduced costs and saved clinicians a great deal of time. The hospital is building up about 40 diagnosis-specific clinical databases with up to 200 variables registered. These databases are shared by the treatment group and seem to be important tools for quality assurance. We conclude that the clinical research process benefits from a firm infrastructure facilitating teamwork through extensive use of modern information technology. We are now ready for the next phase, which is to work for a better external technical framework for cooperation with other institutions throughout the world.
A tool for assessment of heart failure prescribing quality: A systematic review and meta-analysis.
El Hadidi, Seif; Darweesh, Ebtissam; Byrne, Stephen; Bermingham, Margaret
2018-04-16
Heart failure (HF) guidelines aim to standardise patient care. Internationally, prescribing practice in HF may deviate from guidelines and so a standardised tool is required to assess prescribing quality. A systematic review and meta-analysis were performed to identify a quantitative tool for measuring adherence to HF guidelines and its clinical implications. Eleven electronic databases were searched to include studies reporting a comprehensive tool for measuring adherence to prescribing guidelines in HF patients aged ≥18 years. Qualitative studies or studies measuring prescription rates alone were excluded. Study quality was assessed using the Good ReseArch for Comparative Effectiveness Checklist. In total, 2455 studies were identified. Sixteen eligible full-text articles were included (n = 14 354 patients, mean age 69 ± 8 y). The Guideline Adherence Index (GAI), and its modified versions, was the most frequently cited tool (n = 13). Other tools identified were the Individualised Reconciled Evidence Recommendations, the Composite Heart Failure Performance, and the Heart Failure Scale. The meta-analysis included GAI studies of good-to-high quality. The average GAI-3 was 62%. Compared to low GAI, high GAI patients had a lower mortality rate (7.6% vs 33.9%) and lower rehospitalisation rates (23.5% vs 24.5%); both P ≤ .05. High GAI was associated with reduced risk of mortality (hazard ratio = 0.29, 95% confidence interval 0.06-0.51) and rehospitalisation (hazard ratio = 0.64, 95% confidence interval 0.41-1.00). No tool had been used to improve prescribing quality. The GAI is the most frequently used tool to assess guideline adherence in HF. High GAI is associated with improved HF outcomes. Copyright © 2018 John Wiley & Sons, Ltd.
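The abstract does not define the GAI formula, but as commonly described in the HF literature it scores the share of guideline-recommended drug classes actually prescribed among those indicated and not contraindicated. The sketch below is a simplified reading of that idea, not the published algorithm; the class names are placeholders:

    RECOMMENDED = {"ACEI_or_ARB", "beta_blocker", "MRA"}  # the three GAI-3 classes

    def gai3(prescribed, contraindicated=frozenset()):
        """Simplified Guideline Adherence Index: the share of cornerstone
        classes prescribed among those not contraindicated. An illustrative
        reading of the index, not the published algorithm."""
        eligible = RECOMMENDED - set(contraindicated)
        if not eligible:
            return None  # undefined when no class is indicated
        return 100.0 * len(eligible & set(prescribed)) / len(eligible)

    print(gai3({"ACEI_or_ARB", "beta_blocker"}))            # ~66.7
    print(gai3({"beta_blocker"}, contraindicated={"MRA"}))  # 50.0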
Classification of processes involved in sharing individual participant data from clinical trials
Ohmann, Christian; Canham, Steve; Banzi, Rita; Kuchinke, Wolfgang; Battaglia, Serena
2018-01-01
Background: In recent years, a cultural change in the handling of data from research has resulted in the strong promotion of a culture of openness and increased sharing of data. In the area of clinical trials, sharing of individual participant data involves a complex set of processes and the interaction of many actors and actions. Individual services/tools to support data sharing are available, but what is missing is a detailed, structured and comprehensive list of processes/subprocesses involved and tools/services needed. Methods: Principles and recommendations from a published data sharing consensus document are analysed in detail by a small expert group. Processes/subprocesses involved in data sharing are identified and linked to actors and possible services/tools. Definitions are adapted from the business process model and notation (BPMN) and applied in the analysis. Results: A detailed and comprehensive list of individual processes/subprocesses involved in data sharing, structured according to 9 main processes, is provided. Possible tools/services to support these processes/subprocesses are identified and grouped according to major type of support. Conclusions: The list of individual processes/subprocesses and tools/services identified is a first step towards development of a generic framework or architecture for sharing of data from clinical trials. Such a framework is strongly needed to give an overview of how various actors, research processes and services could form an interoperable system for data sharing. PMID:29623192
BioShaDock: a community driven bioinformatics shared Docker-based tools registry
Moreews, François; Sallou, Olivier; Ménager, Hervé; Le Bras, Yvan; Monjeaud, Cyril; Blanchet, Christophe; Collin, Olivier
2015-01-01
Linux container technologies, as represented by Docker, provide an alternative to complex and time-consuming installation processes needed for scientific software. The ease of deployment and the process isolation they enable, as well as the reproducibility they permit across environments and versions, are among the qualities that make them interesting candidates for the construction of bioinformatic infrastructures, at any scale from single workstations to high throughput computing architectures. The Docker Hub is a public registry which can be used to distribute bioinformatic software as Docker images. However, its lack of curation and its genericity make it difficult for a bioinformatics user to find the most appropriate images needed. BioShaDock is a bioinformatics-focused Docker registry, which provides a local and fully controlled environment to build and publish bioinformatic software as portable Docker images. It provides a number of improvements over the base Docker registry on authentication and permissions management, which enable its integration in existing bioinformatic infrastructures such as computing platforms. The metadata associated with the registered images are domain-centric, including for instance concepts defined in the EDAM ontology, a shared and structured vocabulary of commonly used terms in bioinformatics. The registry also includes user defined tags to facilitate its discovery, as well as a link to the tool description in the ELIXIR registry if one already exists. If it does not, BioShaDock will synchronize with the ELIXIR registry to create a new description there, based on the BioShaDock entry metadata. This link helps users get more information on the tool, such as its EDAM operations and input and output types. This allows integration with the ELIXIR Tools and Data Services Registry, thus providing the appropriate visibility of such images to the bioinformatics community. PMID:26913191
Electronic Tools for Health Information Exchange
2013-01-01
Background As patients experience transitions in care, there is a need to share information between care providers in an accurate and timely manner. With the push towards electronic medical records and other electronic tools (eTools) (and away from paper-based health records) for health information exchange, there remains uncertainty around the impact of eTools as a form of communication. Objective To examine the impact of eTools for health information exchange in the context of care coordination for individuals with chronic disease in the community. Data Sources A literature search was performed on April 26, 2012, using OVID MEDLINE, OVID MEDLINE In-Process and Other Non-Indexed Citations, OVID EMBASE, EBSCO Cumulative Index to Nursing & Allied Health Literature (CINAHL), the Wiley Cochrane Library, and the Centre for Reviews and Dissemination database, for studies published until April 26, 2012 (no start date limit was applied). Review Methods A systematic literature search was conducted, and meta-analysis conducted where appropriate. Outcomes of interest fell into 4 categories: health services utilization, disease-specific clinical outcomes, process-of-care indicators, and measures of efficiency. The quality of the evidence was assessed individually for each outcome. Expert panels were assembled for stakeholder engagement and contextualization. Results Eleven articles were identified (4 randomized controlled trials and 7 observational studies). There was moderate quality evidence of a reduction in hospitalizations, hospital length of stay, and emergency department visits following the implementation of an electronically generated laboratory report with recommendations based on clinical guidelines. The evidence showed no difference in disease-specific outcomes; there was no evidence of a positive impact on process-of-care indicators or measures of efficiency. Limitations A limited body of research specifically examined eTools for health information exchange in the population and setting of interest. This evidence included a combination of study designs and was further limited by heterogeneity in individual technologies and settings in which they were implemented. Conclusions There is evidence that the right eTools in the right environment and context can significantly impact health services utilization. However, the findings from this evidence-based analysis raise doubts about the ability of eTools with care-coordination capabilities to independently improve the quality of outpatient care. While eTools may be able to support and sustain processes, inefficiencies embedded in the health care system may require more than automation alone to resolve. Plain Language Summary Patients with chronic diseases often work with many different health care providers. To ensure smooth transitions from one setting to the next, health care providers must share information and coordinate care effectively. Electronic medical records (eTools) are being used more and more to coordinate patient care, but it is not yet known whether they are more effective than paper-based health records. In this analysis, we reviewed the evidence for the use of eTools to exchange information and coordinate care for people with chronic diseases in the community. There was some evidence that eTools reduced the number of hospital and emergency department visits, as well as patients' length of stay in the hospital, but there was no evidence that eTools improved the overall quality of patient care. PMID:24194799
DOE Office of Scientific and Technical Information (OSTI.GOV)
Malony, Allen D.; Wolf, Felix G.
2014-01-31
The growing number of cores provided by today’s high-end computing systems presents substantial challenges to application developers in their pursuit of parallel efficiency. To find the most effective optimization strategy, application developers need insight into the runtime behavior of their code. The University of Oregon (UO) and the Juelich Supercomputing Centre of Forschungszentrum Juelich (FZJ) develop the performance analysis tools TAU and Scalasca, respectively, which allow high-performance computing (HPC) users to collect and analyze relevant performance data – even at very large scales. TAU and Scalasca are considered among the most advanced parallel performance systems available, and are used extensively across HPC centers in the U.S., Germany, and around the world. The TAU and Scalasca groups share a heritage of parallel performance tool research and partnership throughout the past fifteen years. Indeed, the close interactions of the two groups resulted in a cross-fertilization of tool ideas and technologies that pushed TAU and Scalasca to what they are today. It also produced two performance systems with an increasing degree of functional overlap. While each tool has its specific analysis focus, the tools were implementing measurement infrastructures that were substantially similar. Because each tool provides complementary performance analysis, sharing of measurement results is valuable to provide the user with more facets to understand performance behavior. However, each measurement system was producing performance data in different formats, requiring data interoperability tools to be created. A common measurement and instrumentation system was needed to more closely integrate TAU and Scalasca and to avoid the duplication of development and maintenance effort. The PRIMA (Performance Refactoring of Instrumentation, Measurement, and Analysis) project was proposed over three years ago as a joint international effort between UO and FZJ to accomplish these objectives: (1) refactor TAU and Scalasca performance system components for core code sharing and (2) integrate TAU and Scalasca functionality through data interfaces, formats, and utilities. As presented in this report, the project has completed these goals. In addition to shared technical advances, the groups have worked to engage with users through application performance engineering and tools training. In this regard, the project benefits from the close interactions the teams have with national laboratories in the United States and Germany. We have also sought to enhance our interactions through joint tutorials and outreach. UO has become a member of the Virtual Institute of High-Productivity Supercomputing (VI-HPS) established by the Helmholtz Association of German Research Centres as a center of excellence, focusing on HPC tools for diagnosing programming errors and optimizing performance. UO and FZJ have conducted several VI-HPS training activities together within the past three years.
Sharing Earth Observation Data for Health Management
NASA Astrophysics Data System (ADS)
Cox, E. L., Jr.
2015-12-01
While the global community is struck by pandemics and epidemics from time to time, the ability to fully utilize earth observations and integrate environmental information has been limited - until recently. Mature science understanding now allows new levels of situational awareness when the relevant data are available and shared in a timely and usable manner. Satellite and other remote sensing tools have been used to observe, monitor, assess and predict weather and water impacts for decades. In the last few years much of this has included a focus on the ability to monitor changes on climate scales that suggest changes in the quantity and quality of ecosystem resources, or the "one-health" approach, where trans-disciplinary links between environmental, animal and vegetative health may provide indications of the best ways to manage susceptibility to infectious disease or outbreaks. The scale of impacts and the availability of information from earth observing satellites, airborne platforms, health tracking systems and surveillance networks now offer new integrated tools. This presentation will describe several recent events, such as Superstorm Sandy in the United States and the Ebola outbreak in Africa, where public health and health infrastructure were exposed to environmental hazards, and where lessons learned from disaster response about the ability to share data have been effective in risk reduction.
Spruijt-Metz, Donna; Hekler, Eric; Saranummi, Niilo; Intille, Stephen; Korhonen, Ilkka; Nilsen, Wendy; Rivera, Daniel E; Spring, Bonnie; Michie, Susan; Asch, David A; Sanna, Alberto; Salcedo, Vicente Traver; Kukakfa, Rita; Pavel, Misha
2015-09-01
Adverse and suboptimal health behaviors and habits are responsible for approximately 40% of preventable deaths, in addition to their unfavorable effects on quality of life and economics. Our current understanding of human behavior is largely based on static "snapshots" of human behavior, rather than ongoing, dynamic feedback loops of behavior in response to ever-changing biological, social, personal, and environmental states. This paper first discusses how new technologies (i.e., mobile sensors, smartphones, ubiquitous computing, and cloud-enabled processing/computing) and emerging systems modeling techniques enable the development of new, dynamic, and empirical models of human behavior that could facilitate just-in-time adaptive, scalable interventions. The paper then describes concrete steps to the creation of robust dynamic mathematical models of behavior including: (1) establishing "gold standard" measures, (2) the creation of a behavioral ontology for shared language and understanding tools that enable dynamic theorizing across disciplines, (3) the development of data sharing resources, and (4) facilitating improved sharing of mathematical models and tools to support rapid aggregation of the models. We conclude with the discussion of what might be incorporated into a "knowledge commons," which could help to bring together these disparate activities into a unified system and structure for organizing knowledge about behavior.
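To make "dynamic feedback loops of behavior" concrete, a deliberately toy continuous-time model, not taken from the paper, might let behavior relax toward a baseline while an intervention signal pushes it up:

    from scipy.integrate import solve_ivp

    def behavior_model(t, y, k=0.3, g=1.0):
        """dB/dt = -k*(B - baseline) + g*I(t): behavior B relaxes toward a
        baseline but is pushed up while an intervention I(t) is 'on'.
        Purely illustrative; k, g and the window are hypothetical."""
        baseline = 1.0
        intervention = 1.0 if 5.0 <= t <= 15.0 else 0.0  # a 10-day prompt window
        return [-k * (y[0] - baseline) + g * intervention]

    sol = solve_ivp(behavior_model, t_span=(0, 30), y0=[1.0], max_step=0.1)
    print(round(sol.y[0].max(), 3))  # peak behavior level during the intervention

Fitting the decay and gain parameters to intensively sampled sensor data is exactly the kind of system identification the paper argues these new data streams make possible.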
Low- and High-Text Books Facilitate the Same Amount and Quality of Extratextual Talk
ERIC Educational Resources Information Center
Muhinyi, Amber; Hesketh, Anne
2017-01-01
Recent research suggests that caregiver-child extratextual talk during shared book reading facilitates the development of preschool children's oral language skills. This study investigated the effects of the amount of picturebook text on mother-child extratextual talk during shared book reading. Twenty-four mother-child dyads (children aged…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-08
... other high credit quality, short-term fixed-income or similar securities (including shares of money market funds, bank deposits, bank money market accounts, certain variable rate-demand notes, and ...).
Information of urban morphological features at high resolution is needed to properly model and characterize the meteorological and air quality fields in urban areas. We describe a new project called National Urban Database with Access Portal Tool, (NUDAPT) that addresses this nee...
CARD 2017: expansion and model-centric curation of the comprehensive antibiotic resistance database
Jia, Baofeng; Raphenya, Amogelang R.; Alcock, Brian; Waglechner, Nicholas; Guo, Peiyao; Tsang, Kara K.; Lago, Briony A.; Dave, Biren M.; Pereira, Sheldon; Sharma, Arjun N.; Doshi, Sachin; Courtot, Mélanie; Lo, Raymond; Williams, Laura E.; Frye, Jonathan G.; Elsayegh, Tariq; Sardar, Daim; Westman, Erin L.; Pawlowski, Andrew C.; Johnson, Timothy A.; Brinkman, Fiona S.L.; Wright, Gerard D.; McArthur, Andrew G.
2017-01-01
The Comprehensive Antibiotic Resistance Database (CARD; http://arpcard.mcmaster.ca) is a manually curated resource containing high quality reference data on the molecular basis of antimicrobial resistance (AMR), with an emphasis on the genes, proteins and mutations involved in AMR. CARD is ontologically structured, model centric, and spans the breadth of AMR drug classes and resistance mechanisms, including intrinsic, mutation-driven and acquired resistance. It is built upon the Antibiotic Resistance Ontology (ARO), a custom built, interconnected and hierarchical controlled vocabulary allowing advanced data sharing and organization. Its design allows the development of novel genome analysis tools, such as the Resistance Gene Identifier (RGI) for resistome prediction from raw genome sequence. Recent improvements include extensive curation of additional reference sequences and mutations, development of a unique Model Ontology and accompanying AMR detection models to power sequence analysis, new visualization tools, and expansion of the RGI for detection of emergent AMR threats. CARD curation is updated monthly based on an interplay of manual literature curation, computational text mining, and genome analysis. PMID:27789705
Malenke, J R; Milash, B; Miller, A W; Dearing, M D
2013-07-01
Massively parallel sequencing has enabled the creation of novel, in-depth genetic tools for nonmodel, ecologically important organisms. We present the de novo transcriptome sequencing, analysis and microarray development for a vertebrate herbivore, the woodrat (Neotoma spp.). This genus is of ecological and evolutionary interest, especially with respect to ingestion and hepatic metabolism of potentially toxic plant secondary compounds. We generated a liver transcriptome of the desert woodrat (Neotoma lepida) using the Roche 454 platform. The assembled contigs were well annotated using rodent references (99.7% annotation), and biotransformation function was reflected in the gene ontology. The transcriptome was used to develop a custom microarray (eArray, Agilent). We tested the microarray with three experiments: one across species with similar habitat (thus, dietary) niches, one across species with different habitat niches and one across populations within a species. The resulting one-colour arrays had high technical and biological quality. Probes designed from the woodrat transcriptome performed significantly better than functionally similar probes from the Norway rat (Rattus norvegicus). There were a multitude of expression differences across the woodrat treatments, many of which related to biotransformation processes and activities. The pattern and function of the differences indicate shared ecological pressures, and not merely phylogenetic distance, play an important role in shaping gene expression profiles of woodrat species and populations. The quality and functionality of the woodrat transcriptome and custom microarray suggest these tools will be valuable for expanding the scope of herbivore biology, as well as the exploration of conceptual topics in ecology. © 2013 John Wiley & Sons Ltd.
The quality of instruments to assess the process of shared decision making: A systematic review.
Gärtner, Fania R; Bomhof-Roordink, Hanna; Smith, Ian P; Scholl, Isabelle; Stiggelbout, Anne M; Pieterse, Arwen H
2018-01-01
To inventory instruments assessing the process of shared decision making and appraise their measurement quality, taking into account the methodological quality of their validation studies. In a systematic review we searched seven databases (PubMed, Embase, Emcare, Cochrane, PsycINFO, Web of Science, Academic Search Premier) for studies investigating instruments measuring the process of shared decision making. Per identified instrument, we assessed the level of evidence separately for 10 measurement properties following a three-step procedure: 1) appraisal of the methodological quality using the COnsensus-based Standards for the selection of health status Measurement INstruments (COSMIN) checklist, 2) appraisal of the psychometric quality of the measurement property using three possible quality scores, 3) best-evidence synthesis based on the number of studies, their methodological and psychometric quality, and the direction and consistency of the results. The study protocol was registered at PROSPERO: CRD42015023397. We included 51 articles describing the development and/or evaluation of 40 shared decision-making process instruments: 16 patient questionnaires, 4 provider questionnaires, 18 coding schemes and 2 instruments measuring multiple perspectives. There is an overall lack of evidence for their measurement quality, either because validation is missing or methods are poor. The best-evidence synthesis indicated positive results for a major part of instruments for content validity (50%) and structural validity (53%) if these were evaluated, but negative results for a major part of instruments when inter-rater reliability (47%) and hypotheses testing (59%) were evaluated. Due to the lack of evidence on measurement quality, the choice for the most appropriate instrument can best be based on the instrument's content and characteristics such as the perspective that they assess. We recommend refinement and validation of existing instruments, and the use of COSMIN guidelines to help guarantee high-quality evaluations.
Brown, Alexandra E; Okayasu, Hiromasa; Nzioki, Michael M; Wadood, Mufti Z; Chabot-Couture, Guillaume; Quddus, Arshad; Walker, George; Sutter, Roland W
2014-11-01
Monitoring the quality of supplementary immunization activities (SIAs) is a key tool for polio eradication. Regular monitoring data, however, are often unreliable, showing high coverage levels in virtually all areas, including those with ongoing virus circulation. To address this challenge, lot quality assurance sampling (LQAS) was introduced in 2009 as an additional tool to monitor SIA quality. Now used in 8 countries, LQAS provides a number of programmatic benefits: identifying areas of weak coverage quality with statistical reliability, differentiating areas of varying coverage with greater precision, and allowing for trend analysis of campaign quality. LQAS also accommodates changes to survey format, interpretation thresholds, evaluations of sample size, and data collection through mobile phones to improve timeliness of reporting and allow for visualization of campaign quality. LQAS becomes increasingly important to address remaining gaps in SIA quality and help focus resources on high-risk areas to prevent the continued transmission of wild poliovirus. © Crown copyright 2014.
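The LQAS decision rule itself is plain binomial arithmetic: sample n children and "accept" the lot if at most d are unvaccinated. A sketch of the resulting operating characteristic follows; the n = 60, d = 3 design here is a placeholder, not the polio programme's actual parameters:

    from scipy.stats import binom

    def prob_accept(coverage, n=60, d=3):
        """Probability a lot 'passes' when true coverage is `coverage`:
        P(at most d unvaccinated among n sampled children)."""
        return binom.cdf(d, n, 1.0 - coverage)

    for cov in (0.95, 0.90, 0.80, 0.70):
        print(f"coverage {cov:.0%}: P(pass) = {prob_accept(cov):.3f}")

Choosing n and d fixes the trade-off between falsely failing a well-covered lot and falsely passing a poorly covered one, which is what gives LQAS its statistical reliability.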
High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis
Simonyan, Vahan; Mazumder, Raja
2014-01-01
The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis. PMID:25271953
Approaches to pharmacy benefit management and the impact of consumer cost sharing.
Olson, Bridget M
2003-01-01
Numerous mechanisms have been introduced to deliver prescription drug benefits while controlling pharmaceutical costs. An understanding of the most prominent mechanisms of benefit management is an important step in determining the most effective approach to take in future years. The aims of this review were to illustrate the mechanisms by which managed care has attempted to efficiently and equitably deliver pharmacy benefits and to discuss the impact of such programs, including consumer cost sharing. A review of the literature was conducted using the PreMedline and MEDLINE databases from the years 1966 to 2002, reference lists from relevant articles, and online sources, including news releases, conference materials, and pharmacy benefit management reports. Numerous pharmacy benefit management tools and their impact on utilization, expenditures, and health outcomes are reviewed, including disease state management; utilization management (ie, quantity limitations and prior authorization); drug utilization review; formulary management (ie, open and closed); delivery systems (ie, retail and mail order); and mechanisms for implementing consumer cost sharing (ie, generic incentives, multitiered copayments, and co-insurance). Although there is some evidence to suggest that certain benefit management tools have been successful in reducing health plan expenditures, a more thorough investigation of their potential unintended consequences is needed. Implementing adequate levels of consumer cost sharing is necessary if employers and health plans are to continue offering prescription drug benefits. It is important to remember, however, that quality health care cannot be forfeited for the sake of short-term cost savings.
Shewade, Hemant Deepak; Vidhubala, E; Subramani, Divyaraj Prabhakar; Lal, Pranay; Bhatt, Neelam; Sundaramoorthi, C.; Singh, Rana J.; Kumar, Ajay M. V.
2017-01-01
Background: A large state-wide tobacco survey was conducted using a modified version of the pretested, globally validated Global Adult Tobacco Survey (GATS) questionnaire in 2015–2016 in Tamil Nadu, India. Due to resource constraints, data collection was carried out using paper-based questionnaires (unlike the GATS-India, 2009–2010, which used hand-held computer devices), while data entry was done using open access tools. The objective of this paper is to describe the process of data entry and assess its quality assurance and efficiency. Methods: In EpiData language, a variable is referred to as a 'field' and a questionnaire (a set of fields) as a 'record'. EpiData software was used for double data entry with adequate checks, followed by validation. TeamViewer was used for remote training and troubleshooting. The EpiData databases (one for each district and each zone in Chennai city) were housed in shared Dropbox folders, which enabled secure sharing of files and automatic back-up. Each database for a district/zone had a separate file for data entry of the household level and individual level questionnaires. Results: Of 32,945 households, there were 111,363 individuals aged ≥15 years. The average proportion of records with data entry errors for a district/zone in the household level and individual level files was 4% and 24%, respectively. These are errors that would have gone unnoticed if single entry had been used. The median (inter-quartile range) time taken for double data entry for a single household level and individual level questionnaire was 30 (24, 40) s and 86 (64, 126) s, respectively. Conclusion: Efficient and quality-assured near-real-time data entry in a large sub-national tobacco survey was performed through innovative, resource-efficient use of open access tools. PMID:29092673
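The validation step described here, comparing two independently keyed copies of each record field by field, is easy to picture in code. A minimal Python sketch of that idea follows; it is not EpiData itself, and the file and field names are hypothetical.

```python
# Minimal sketch of the idea behind double data entry validation:
# two independently keyed copies of the same records are compared
# field by field, and any disagreement is flagged for review.
# This is NOT EpiData itself; file and field names are hypothetical.
import csv

def compare_entries(path_a: str, path_b: str, key_field: str = "record_id"):
    """Return a list of (record_id, field, value_a, value_b) discrepancies."""
    def load(path):
        with open(path, newline="", encoding="utf-8") as f:
            return {row[key_field]: row for row in csv.DictReader(f)}

    entries_a, entries_b = load(path_a), load(path_b)
    discrepancies = []
    for rid, row_a in entries_a.items():
        row_b = entries_b.get(rid)
        if row_b is None:
            discrepancies.append((rid, "<missing in second entry>", None, None))
            continue
        for field, value_a in row_a.items():
            if row_b.get(field) != value_a:
                discrepancies.append((rid, field, value_a, row_b.get(field)))
    return discrepancies

# Example: flag mismatches between the two keyings of a household file.
for rid, field, a, b in compare_entries("household_entry1.csv", "household_entry2.csv"):
    print(f"record {rid}: field '{field}' differs ({a!r} vs {b!r})")
```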
Digital plagiarism - The web giveth and the web shall taketh
Barrie, J M; Presti, David E
2000-01-01
Publishing students' and researchers' papers on the World Wide Web (WWW) facilitates the sharing of information within and between academic communities. However, the ease of copying and transporting digital information leaves these authors' ideas open to plagiarism. Using tools such as the Plagiarism.org database, which compares submissions to reports and papers available on the Internet, could discover instances of plagiarism, revolutionize the peer review process, and raise the quality of published research everywhere. PMID:11720925
Measuring quality in services for children with an intellectual disability.
Koornneef, Erik
2006-01-01
To evaluate the application of one particular quality measurement tool, the SERVQUAL instrument, as a potential mechanism to measure quality in services for children with disabilities, staff and family of children with an intellectual disability in two organisations providing specialist therapy and day services completed an adapted SERVQUAL questionnaire. A total of 81 SERVQUAL questionnaires were distributed and 59 were returned (response rate of 73 per cent). The SERVQUAL instrument can be considered a useful diagnostic tool to identify particular strengths and areas for improvement in services for people with disabilities, as it lends itself to the monitoring of the effectiveness of quality improvement initiatives over time. The findings also showed relatively high customer expectations, and the organisations involved in this research are currently not meeting all of these expectations, as significant quality gaps were found in the areas of reliability and responsiveness. The sample size was relatively small, and the measurement of quality using the SERVQUAL instrument remains a challenge due to conceptual and empirical difficulties. The SERVQUAL instrument is probably most attractive to service managers and funding organisations because of its ability to identify gaps in the quality of the service. The tool has been used to measure quality in services for people with disabilities, and this research has shown that it might be an important additional quality measurement tool for such services.
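For readers unfamiliar with SERVQUAL, the instrument's core computation is a gap score: the mean perception rating minus the mean expectation rating per service dimension, with negative gaps (as reported above for reliability and responsiveness) indicating unmet expectations. A small illustrative Python sketch, with made-up scores:

```python
# Illustrative computation of SERVQUAL gap scores (perception minus
# expectation, averaged per dimension). The dimension names follow the
# standard SERVQUAL instrument; the numbers below are made up.
from statistics import mean

DIMENSIONS = ["tangibles", "reliability", "responsiveness", "assurance", "empathy"]

def gap_scores(expectations: dict, perceptions: dict) -> dict:
    """Negative gaps indicate expectations exceeding perceived service."""
    return {d: mean(perceptions[d]) - mean(expectations[d]) for d in DIMENSIONS}

# Hypothetical item scores (1-7 Likert) pooled across respondents:
expect = {"tangibles": [6, 5], "reliability": [7, 7], "responsiveness": [7, 6],
          "assurance": [6, 6], "empathy": [6, 5]}
perceive = {"tangibles": [5, 5], "reliability": [5, 4], "responsiveness": [4, 5],
            "assurance": [6, 5], "empathy": [6, 6]}
print(gap_scores(expect, perceive))  # e.g. reliability gap = -2.5
```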
Mayer, Miguel A; Karampiperis, Pythagoras; Kukurikos, Antonis; Karkaletsis, Vangelis; Stamatakis, Kostas; Villarroel, Dagmar; Leis, Angela
2011-06-01
The number of health-related websites is increasing day-by-day; however, their quality is variable and difficult to assess. Various "trust marks" and filtering portals have been created in order to assist consumers in retrieving quality medical information. Consumers are using search engines as the main tool to get health information; however, the major problem is that the meaning of the web content is not machine-readable in the sense that computers cannot understand words and sentences as humans can. In addition, trust marks are invisible to search engines, thus limiting their usefulness in practice. During the last five years there have been different attempts to use Semantic Web tools to label health-related web resources to help internet users identify trustworthy resources. This paper discusses how Semantic Web technologies can be applied in practice to generate machine-readable labels and display their content, as well as to empower end-users by providing them with the infrastructure for expressing and sharing their opinions on the quality of health-related web resources.
van de Belt, Tom H; Nijmeijer, Hugo; Grim, David; Engelen, Lucien Jlpg; Vreeken, Rinaldo; van Gelder, Marleen Mmj; Laan, Mark Ter
2018-06-02
Cancer patients need high quality information about the disease stage, treatment options and side effects. High quality information can also improve health literacy, shared decision-making and satisfaction. We created patient-specific 3D models of tumours, including surrounding functional areas, and assessed what patients with glioma actually value (or fear) about these models when they are used to educate them about the relation between their tumour and specific brain parts, the surgical procedure, and risks. We carried out an explorative study with adult glioma patients who underwent functional MRI and DTI as part of the pre-operative work-up. All participants received an actual-size 3D model, printed based on fMRI and DTI imaging. Semi-structured interviews were held to identify facilitators and barriers for using the model, and perceived effects. A model was successfully created for all 11 participants. A total of 18 facilitators and 8 barriers were identified. Participants reported that the model improved their understanding of their situation, made it easier to ask their neurosurgeon questions, and supported their decision about the preferred treatment. A perceived barrier for using the 3D model was that it could be emotionally confronting, particularly in an early phase of the disease process. Positive effects were related to psychological domains including coping, learning effects and communication. Patient-specific 3D models are promising and simple tools that could help patients with glioma to better understand their situation, treatment options and risks. They have the potential to improve shared decision-making. Copyright © 2018 The Author(s). Published by Elsevier Inc. All rights reserved.
Lawani, Moulikatou Adouni; Valéra, Béatriz; Fortier-Brochu, Émilie; Légaré, France; Carmichael, Pierre-Hugues; Côté, Luc; Voyer, Philippe; Kröger, Edeltraut; Witteman, Holly; Rodriguez, Charo; Giguere, Anik M C
2017-03-15
Decision support tools build upon comprehensive and timely syntheses of literature. Rapid reviews may allow supporting their development by omitting certain components of traditional systematic reviews. We thus aimed to describe a rapid review approach underlying the development of decision support tools, i.e., five decision boxes (DB) for shared decision-making between seniors living with dementia, their caregivers, and healthcare providers. We included studies based on PICO questions (Participant, Intervention, Comparison, Outcome) describing each of the five specific decisions. We gave priority to higher quality evidence (e.g., systematic reviews). For each DB, we first identified secondary sources of literature, namely, clinical summaries, clinical practice guidelines, and systematic reviews. After an initial extraction, we searched for primary studies in academic databases and grey literature to fill gaps in evidence. We extracted study designs, sample sizes, populations, and probabilities of benefits/harms of the health options. A single reviewer conducted the literature search and study selection. The data extracted by one reviewer were verified by a second experienced reviewer. Two reviewers assessed the quality of the evidence. We converted all probabilities into absolute risks for ease of understanding. Two to five experts validated the content of each DB. We conducted descriptive statistical analyses on the review processes and resources required. The approach allowed screening of a limited number of references (range: 104 to 406/review). For each review, we included 15 to 26 studies, 2 to 10 health options, 11 to 62 health outcomes and we conducted 9 to 47 quality assessments. A team of ten reviewers with varying levels of expertise was supported at specific steps by an information specialist, a biostatistician, and a graphic designer. The time required to complete a rapid review varied from 7 to 31 weeks per review (mean ± SD, 19 ± 10 weeks). Data extraction required the most time (8 ± 6.8 weeks). The average estimated cost of a rapid review was C$11,646 (SD = C$10,914). This approach enabled the development of clinical tools more rapidly than with a traditional systematic review. Future studies should evaluate the applicability of this approach to other teams/tools.
How strong are passwords used to protect personal health information in clinical trials?
El Emam, Khaled; Moreau, Katherine; Jonker, Elizabeth
2011-02-11
Findings and statements about how securely personal health information is managed in clinical research are mixed. The objective of our study was to evaluate the security of practices used to transfer and share sensitive files in clinical trials. Two studies were performed. First, 15 password-protected files that were transmitted by email during regulated Canadian clinical trials were obtained. Commercial password recovery tools were used on these files to try to crack their passwords. Second, interviews with 20 study coordinators were conducted to understand file-sharing practices in clinical trials for files containing personal health information. We were able to crack the passwords for 93% of the files (14/15). Among these, 13 files contained thousands of records with sensitive health information on trial participants. The passwords tended to be relatively weak, using common names of locations, animals, car brands, and obvious numeric sequences. Patient information is commonly shared by email in the context of query resolution. Files containing personal health information are shared by email and, by posting them on shared drives with common passwords, to facilitate collaboration. If files containing sensitive patient information must be transferred by email, mechanisms to encrypt them and to ensure that password strength is high are necessary. More sophisticated collaboration tools are required to allow file sharing without password sharing. We provide recommendations to implement these practices. PMID:21317106
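To see why passwords built from common words and short numeric suffixes fall quickly, a back-of-the-envelope search-space estimate is enough. The Python sketch below is illustrative only; the guess rate is an assumption, and real password-recovery tools do far better than brute force by trying dictionaries of common words first.

```python
# Rough illustration of why weak passwords crack quickly: estimate the
# search space a brute-force tool must cover. Character classes and the
# guess rate are illustrative assumptions, not a formal standard. Note
# that dictionary attacks beat these worst-case numbers by a wide margin
# for passwords based on common words.
import math
import string

def estimated_entropy_bits(password: str) -> float:
    pool = 0
    if any(c in string.ascii_lowercase for c in password):
        pool += 26
    if any(c in string.ascii_uppercase for c in password):
        pool += 26
    if any(c in string.digits for c in password):
        pool += 10
    if any(c in string.punctuation for c in password):
        pool += len(string.punctuation)
    return len(password) * math.log2(pool) if pool else 0.0

for pw in ["toronto1", "Tr0ub4dor&3"]:  # hypothetical examples
    bits = estimated_entropy_bits(pw)
    # At an assumed 10^9 guesses/second, worst-case exhaustive search time:
    days = 2 ** bits / 1e9 / 86400
    print(f"{pw}: ~{bits:.0f} bits, worst-case ~{days:,.1f} days")
```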
NASA Astrophysics Data System (ADS)
Koymans, Mathijs; Langereis, Cor; Pastor-Galán, Daniel; van Hinsbergen, Douwe
2017-04-01
This contribution gives an overview of Paleomagnetism.org (Koymans et al., 2016), an online environment for paleomagnetic analysis. The application is developed in JavaScript and is fully open-sourced. It presents an interactive website in which paleomagnetic data can be interpreted, evaluated, visualized, and shared with others. The application has been available since late 2015 and has since evolved with the addition of a magnetostratigraphic tool, additional input formats, and features that emphasize the link between geomagnetism and tectonics. In the interpretation portal, principal component analysis (Kirschvink et al., 1981) can be applied to visualized demagnetization data (Zijderveld, 1967). Interpreted directions and great circles are combined using the iterative procedure of McFadden and McElhinny (1988). The resulting directions can be further used in the statistics portal or exported as raw tabulated data and high-quality figures. The available tools in the statistics portal cover standard Fisher statistics for directional data and virtual geomagnetic poles (Fisher, 1953; Butler, 1992; Deenen et al., 2011). Other tools include the eigenvector-approach foldtest (Tauxe and Watson, 1994), a bootstrapped reversal test (Tauxe et al., 2009), and the classical reversal test (McFadden and McElhinny, 1990). An implementation exists for the detection and correction of inclination shallowing in sediments (Tauxe and Kent, 2004; Tauxe et al., 2008), and a module to visualize apparent polar wander paths (Torsvik et al., 2012; Kent and Irving, 2010; Besse and Courtillot, 2002) for large continent-bearing plates. A miscellaneous portal offers a set of tools that include a bootstrapped oroclinal test (Pastor-Galán et al., 2016) for assessing possible linear relationships between strike and declination. Another available tool performs a net tectonic rotation analysis (after Morris et al., 1999) that restores a dyke to its paleo-vertical orientation and can be used to determine paleo-spreading directions fundamental to plate reconstructions. Paleomagnetism.org provides an integrated approach for researchers to export and share paleomagnetic data through a common interface. The portals create a custom exportable file that can be distributed and included in public databases. With a publication, this file can be appended and would contain all paleomagnetic data discussed in the publication. The appended file can then be imported into the application by other researchers for reviewing. The accessibility and simplicity with which paleomagnetic data can be interpreted, analyzed, visualized, and shared should make Paleomagnetism.org of interest to the paleomagnetic and tectonic communities.
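For reference, the Fisher (1953) statistics named above reduce to a few standard quantities for N unit direction vectors: the resultant length R, the precision parameter k, and the 95% confidence cone α95. A sketch of the standard forms:

```latex
% Standard Fisher (1953) statistics for N unit direction vectors x_i,
% as commonly used in paleomagnetic software (standard textbook forms):
\[
  R = \left\lVert \sum_{i=1}^{N} \mathbf{x}_i \right\rVert, \qquad
  k \approx \frac{N-1}{N-R}, \qquad
  \alpha_{95} = \arccos\!\left[ 1 - \frac{N-R}{R}
    \left( \left(\tfrac{1}{0.05}\right)^{\frac{1}{N-1}} - 1 \right) \right].
\]
```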
ERIC Educational Resources Information Center
Dadaczynski, Kevin; Paulus, Peter; de Vries, Nanne; de Ruiter, Silvia; Buijs, Goof
2010-01-01
The HEPS Inventory Tool aims to support stakeholders working in school health promotion to promote high quality interventions on healthy eating and physical activity. As a tool it provides a step-by-step approach on how to develop a national or regional inventory of existing school based interventions on healthy eating and physical activity. It…
Optimization of removal function in computer controlled optical surfacing
NASA Astrophysics Data System (ADS)
Chen, Xi; Guo, Peiji; Ren, Jianfeng
2010-10-01
The technical principle of computer controlled optical surfacing (CCOS) and the common methods of optimizing the removal function used in CCOS are introduced in this paper. A new optimization method, time-sharing synthesis of removal functions, is proposed to solve two problems encountered in the planet-motion and translation-rotation modes: removal functions that deviate far from the ideal Gaussian type, and slow convergence of the removal function error. A detailed time-sharing synthesis using six removal functions is discussed. For a given region on the workpiece, six positions are selected as the centers of the removal function; the polishing tool, controlled by the executive system of CCOS, revolves around each center to complete a cycle in proper order. The overall removal function obtained by the time-sharing process is the ratio of the total material removal in the six cycles to the time duration of the six cycles, which depends on the arrangement and distribution of the six removal functions. Simulations of the synthesized overall removal functions under two different modes of motion, i.e., planet motion and translation-rotation, are performed, from which the optimized combination of tool parameters and the distribution of the time-sharing synthesis removal functions are obtained. The evaluation function for the optimization is an approach factor, defined as the ratio of the material removal within the area of half of the polishing tool coverage from the polishing center to the total material removal within the full polishing tool coverage area. After optimization, the removal function obtained by time-sharing synthesis is found to be closer to the ideal Gaussian type than those obtained by traditional methods. The time-sharing synthesis method provides an efficient way to increase the convergence speed of the surface error in CCOS for the fabrication of aspheric optical surfaces, and to reduce the intermediate- and high-frequency error.
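The core arithmetic of the time-sharing synthesis, total material removal over six cycles divided by total dwell time, is simple to simulate. The Python sketch below uses hypothetical Gaussian component functions, centers, and cycle times purely for illustration:

```python
# Numerical sketch of time-sharing synthesis: the overall removal function
# is the total material removed over six polishing cycles divided by the
# total dwell time. Component functions, centers, and times are hypothetical.
import numpy as np

x = np.linspace(-3.0, 3.0, 601)          # 1-D section through the region (a.u.)

def removal(center: float, width: float) -> np.ndarray:
    """Single-cycle removal-rate profile centered at `center` (Gaussian-like)."""
    return np.exp(-((x - center) ** 2) / (2 * width ** 2))

centers = [-1.0, -0.6, -0.2, 0.2, 0.6, 1.0]       # six removal-function centers
times = np.array([1.0, 1.2, 1.5, 1.5, 1.2, 1.0])  # dwell time per cycle

# Overall removal function: sum of (rate * time) over cycles / total time.
total_removal = sum(t * removal(c, 0.8) for c, t in zip(centers, times))
overall = total_removal / times.sum()

# Compare with an ideal Gaussian target to judge the synthesis quality.
ideal = np.exp(-(x ** 2) / (2 * 1.0 ** 2))
print("max |overall - ideal| =", np.abs(overall / overall.max() - ideal).max())
```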
Air Quality and Heart Health: Managing an Emerging Cardiovascular Risk Factor
Dr. Cascio will share with a broad range of federal agencies current understanding of the links between air quality and cardiovascular health. The key facts include that air pollution contributes a high attributable health burden. That certain well-defined vulnerable subpopulat...
Yang, Nathan; Hosseini, Sarah; Mascarella, Marco A; Young, Meredith; Posel, Nancy; Fung, Kevin; Nguyen, Lily H P
2017-05-25
Learners often utilize online resources to supplement formalized curricula, and to appropriately support learning, these resources should be of high quality. Thus, the objectives of this study were to develop and provide validity evidence supporting an assessment tool designed to assess the quality of educational websites in Otolaryngology-Head & Neck Surgery (ORL-HNS), and to identify those that could support effective web-based learning. After a literature review, the Modified Education in Otolaryngology Website (MEOW) assessment tool was designed by a panel of experts based on a previously validated website assessment tool. A search strategy using a Google-based search engine was subsequently used to identify websites. Those that were free of charge and in English were included. Websites were coded for whether their content targeted medical students or residents. Using the MEOW assessment tool, two independent raters scored the websites. Inter-rater and intra-rater reliability were evaluated, and scores were compared with recommendations from a content expert. The MEOW assessment tool included a total of 20 items divided into 8 categories related to authorship, frequency of revision, content accuracy, interactivity, visual presentation, navigability, speed and recommended hyperlinks. A total of 43 of 334 websites identified by the search met the inclusion criteria. The scores generated by the tool appeared to differentiate higher quality websites from lower quality ones: websites that the expert "would recommend" scored 38.4 (out of 56; CI [34.4-42.4]) and "would not recommend" 27.0 (CI [23.2-30.9]). Inter-rater and intra-rater intraclass correlation coefficients were greater than 0.7. Using the MEOW assessment tool, high quality ORL-HNS educational websites were identified.
Distributed geospatial model sharing based on open interoperability standards
Feng, Min; Liu, Shuguang; Euliss, Ned H.; Fang, Yin
2009-01-01
Numerous geospatial computational models have been developed based on sound principles and published in journals or presented at conferences. However, modelers have made few advances in the development of computable modules that facilitate sharing during model development or utilization. Constraints hampering the development of model sharing technology include limitations on computing, storage, and connectivity; traditional stand-alone and closed network systems cannot fully support the sharing and integration of geospatial models. To address this need, we have identified methods for sharing geospatial computational models using Service Oriented Architecture (SOA) techniques and open geospatial standards. The service-oriented model sharing service is accessible using any tools or systems compliant with open geospatial standards, making it possible to utilize vast scientific resources available from around the world to solve highly sophisticated application problems. The methods also allow model services to be empowered by diverse computational devices and technologies, such as portable devices and GRID computing infrastructures. Based on the generic and abstract operations and data structures required by the Web Processing Service (WPS) standard, we developed an interactive interface for model sharing to help reduce interoperability problems in model use. Geospatial computational models are shared as model services, where the computational processes provided by models can be accessed through tools and systems compliant with WPS. We developed a platform to help modelers publish individual models in a simplified and efficient way. Finally, we illustrate our technique using wetland hydrological models we developed for the prairie pothole region of North America.
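Since the shared models are exposed through the standard WPS operations, a client needs nothing more than the specification's key-value-pair requests to discover and inspect them. A minimal Python sketch follows; the endpoint URL and process identifier are hypothetical:

```python
# Minimal sketch of querying a WPS-compliant model service using the
# standard key-value-pair requests defined by the OGC WPS specification.
# The endpoint URL and process identifier below are hypothetical.
import requests

WPS_URL = "http://example.org/wps"  # hypothetical model-sharing endpoint

# 1. Discover the processes (models) the service exposes.
caps = requests.get(WPS_URL, params={
    "service": "WPS", "version": "1.0.0", "request": "GetCapabilities"})
print(caps.text[:200])  # XML listing of offered processes

# 2. Ask for the inputs/outputs of one geospatial model.
desc = requests.get(WPS_URL, params={
    "service": "WPS", "version": "1.0.0", "request": "DescribeProcess",
    "identifier": "wetland_hydro_model"})  # hypothetical identifier
print(desc.text[:200])  # XML schema of the model's parameters
```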
Motivation for Knowledge Sharing by Expert Participants in Company-Hosted Online User Communities
ERIC Educational Resources Information Center
Cheng, Jingli
2014-01-01
Company-hosted online user communities are increasingly popular as firms continue to search for ways to provide their customers with high quality and reliable support in a low cost and scalable way. Yet, empirical understanding of motivations for knowledge sharing in this type of online communities is lacking, especially with regard to an…
Medical faculties educational network: multidimensional quality assessment.
Komenda, Martin; Schwarz, Daniel; Feberová, Jitka; Stípek, Stanislav; Mihál, Vladimír; Dušek, Ladislav
2012-12-01
Today, World Wide Web technology provides many opportunities for the dissemination of electronic learning and teaching content. The MEFANET project (MEdical FAculties NETwork) has initiated international, effective and open cooperation among all Czech and Slovak medical faculties in the field of medical education. This paper introduces the original MEFANET educational web portal platform. Its main aim is to present the unique collaborative environment, which combines the sharing of electronic educational resources with the use of tools for their quality evaluation. It is in fact a complex e-publishing system, which consists of ten standalone portal instances and one central gateway. The fundamental principles of the developed system and the technologies used are reported here, as well as the procedures of a new multidimensional quality assessment. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Integrating multiple data sources in species distribution modeling: A framework for data fusion
Pacifici, Krishna; Reich, Brian J.; Miller, David A.W.; Gardner, Beth; Stauffer, Glenn E.; Singh, Susheela; McKerrow, Alexa; Collazo, Jaime A.
2017-01-01
The last decade has seen a dramatic increase in the use of species distribution models (SDMs) to characterize patterns of species' occurrence and abundance. Efforts to parameterize SDMs often create a tension between the quality and quantity of data available to fit models. Estimation methods that integrate both standardized and non-standardized data types offer a potential solution to the tradeoff between data quality and quantity. Recently several authors have developed approaches for jointly modeling two sources of data (one of high quality and one of lesser quality). We extend their work by allowing for explicit spatial autocorrelation in occurrence and detection error using a Multivariate Conditional Autoregressive (MVCAR) model, and we develop three models that share information in a less direct manner, resulting in more robust performance when the auxiliary data are of lesser quality. We describe these three new approaches ("Shared," "Correlation," "Covariates") for combining data sources and show their use in a case study of the Brown-headed Nuthatch in the Southeastern U.S. and through simulations. All three of the approaches that used the second data source improved out-of-sample predictions relative to a single data source ("Single"). When information in the second data source is of high quality, the Shared model performs best, but the Correlation and Covariates models also perform well. When the information in the second data source is of lesser quality, the Correlation and Covariates models performed better, suggesting they are robust alternatives when little is known about auxiliary data collected opportunistically or through citizen scientists. Methods that allow for both data types to be used will maximize the useful information available for estimating species distributions.
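One schematic way to read such a two-source fusion model (a sketch, not the authors' exact specification) is a latent occupancy state with a spatially autocorrelated random effect feeding two observation models of differing quality:

```latex
% Schematic two-source occupancy fusion (a sketch; not the authors' exact
% specification). A latent state z_s with a CAR spatial effect feeds a
% standardized survey (detection p over J_s visits) and an auxiliary source
% of lesser, unknown quality q:
\[
  z_s \sim \mathrm{Bernoulli}(\psi_s), \qquad
  \operatorname{logit}(\psi_s) = \mathbf{x}_s^{\top}\boldsymbol{\beta} + \eta_s,
  \qquad \boldsymbol{\eta} \sim \mathrm{CAR}(\tau, \rho),
\]
\[
  y_{s}^{\mathrm{std}} \mid z_s \sim \mathrm{Binomial}(J_s,\, z_s\, p), \qquad
  y_{s}^{\mathrm{aux}} \mid z_s \sim \mathrm{Bernoulli}(z_s\, q).
\]
```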
How to write a Critically Appraised Topic: evidence to underpin routine clinical practice.
Callander, J; Anstey, A V; Ingram, J R; Limpens, J; Flohr, C; Spuls, P I
2017-10-01
Critically appraised topics (CATs) are essential tools for busy clinicians who wish to ensure that their daily clinical practice is underpinned by evidence-based medicine. CATs are short summaries of the most up-to-date, high-quality available evidence that is found using thorough structured methods. They can be used to answer specific, patient-orientated questions that arise recurrently in real-life practice. This article provides readers with a detailed guide to performing their own CATs. It is split into four main sections reflecting the four main steps involved in performing a CAT: formulation of a focused question, a search for the most relevant and highest-quality evidence, critical appraisal of the evidence and application of the results back to the patient scenario. As well as helping to improve patient care on an individual basis by answering specific clinical questions that arise, CATs can help spread and share knowledge with colleagues on an international level through publication in the evidence-based dermatology section of the British Journal of Dermatology. © 2017 British Association of Dermatologists.
Janamian, Tina; Upham, Susan J; Crossland, Lisa; Jackson, Claire L
2016-04-18
To conduct a systematic review of the literature to identify existing online primary care quality improvement tools and resources to support organisational improvement related to the seven elements in the Primary Care Practice Improvement Tool (PC-PIT), with the identified tools and resources to progress to a Delphi study for further assessment of relevance and utility. Systematic review of the international published and grey literature. CINAHL, Embase and PubMed databases were searched in March 2014 for articles published between January 2004 and December 2013. GreyNet International and other relevant websites and repositories were also searched in March-April 2014 for documents dated between 1992 and 2012. All citations were imported into a bibliographic database. Published and unpublished tools and resources were included in the review if they were in English, related to primary care quality improvement and addressed any of the seven PC-PIT elements of a high-performing practice. Tools and resources that met the eligibility criteria were then evaluated for their accessibility, relevance, utility and comprehensiveness using a four-criteria appraisal framework. We used a data extraction template to systematically extract information from eligible tools and resources. A content analysis approach was used to explore the tools and resources and collate relevant information: name of the tool or resource, year and country of development, author, name of the organisation that provided access and its URL, accessibility information or problems, overview of each tool or resource and the quality improvement element(s) it addresses. If available, a copy of the tool or resource was downloaded into the bibliographic database, along with supporting evidence (published or unpublished) on its use in primary care. This systematic review identified 53 tools and resources that can potentially be provided as part of a suite of tools and resources to support primary care practices in improving the quality of their practice, to achieve improved health outcomes.
Procurement of Shared Data Instruments for Research Electronic Data Capture (REDCap)
Obeid, Jihad S; McGraw, Catherine A; Minor, Brenda L; Conde, José G; Pawluk, Robert; Lin, Michael; Wang, Janey; Banks, Sean R; Hemphill, Sheree A; Taylor, Rob; Harris, Paul A
2012-01-01
REDCap (Research Electronic Data Capture) is a web-based software solution and tool set that allows biomedical researchers to create secure online forms for data capture, management and analysis with minimal effort and training. The Shared Data Instrument Library (SDIL) is a relatively new component of REDCap that allows sharing of commonly used data collection instruments for immediate study use by research teams. Objectives of the SDIL project include: 1) facilitating reuse of data dictionaries and reducing duplication of effort; 2) promoting the use of validated data collection instruments, data standards and best practices; and 3) promoting research collaboration and data sharing. Instruments submitted to the library are reviewed by a library oversight committee, with rotating membership from multiple institutions, which ensures the quality, relevance and legality of shared instruments. The design allows researchers to download the instruments in a consumable electronic format within the REDCap environment. At the time of this writing, the SDIL contains over 128 data collection instruments. Over 2500 instances of instruments have been downloaded by researchers at multiple institutions. In this paper we describe the library platform, provide detail about the experience gained during the first 25 months of sharing public domain instruments, and provide evidence of the SDIL's impact across the REDCap consortium research community. We postulate that the shared library of instruments reduces the burden of adhering to sound data collection principles while promoting best practices. PMID:23149159
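As one concrete illustration of instrument reuse, a project's data dictionary can be exported in machine-readable form through the standard REDCap API. The sketch below uses placeholder URL and token values; whether a given REDCap installation exposes the API depends on local configuration:

```python
# Sketch of pulling a project's data dictionary (field definitions) through
# the standard REDCap API; the URL and token are placeholders. This shows the
# kind of machine-readable instrument metadata the SDIL lets teams reuse.
import requests

REDCAP_API_URL = "https://redcap.example.edu/api/"  # placeholder
API_TOKEN = "YOUR_PROJECT_TOKEN"                    # placeholder

response = requests.post(REDCAP_API_URL, data={
    "token": API_TOKEN,
    "content": "metadata",   # the project's data dictionary
    "format": "json",
})
response.raise_for_status()
for field in response.json():
    print(field["field_name"], "|", field["field_type"])
```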
MPHASYS: a mouse phenotype analysis system
Calder, R Brent; Beems, Rudolf B; van Steeg, Harry; Mian, I Saira; Lohman, Paul HM; Vijg, Jan
2007-01-01
Background Systematic, high-throughput studies of mouse phenotypes have been hampered by the inability to analyze individual animal data from a multitude of sources in an integrated manner. Studies generally make comparisons at the level of genotype or treatment thereby excluding associations that may be subtle or involve compound phenotypes. Additionally, the lack of integrated, standardized ontologies and methodologies for data exchange has inhibited scientific collaboration and discovery. Results Here we introduce a Mouse Phenotype Analysis System (MPHASYS), a platform for integrating data generated by studies of mouse models of human biology and disease such as aging and cancer. This computational platform is designed to provide a standardized methodology for working with animal data; a framework for data entry, analysis and sharing; and ontologies and methodologies for ensuring accurate data capture. We describe the tools that currently comprise MPHASYS, primarily ones related to mouse pathology, and outline its use in a study of individual animal-specific patterns of multiple pathology in mice harboring a specific germline mutation in the DNA repair and transcription-specific gene Xpd. Conclusion MPHASYS is a system for analyzing multiple data types from individual animals. It provides a framework for developing data analysis applications, and tools for collecting and distributing high-quality data. The software is platform independent and freely available under an open-source license [1]. PMID:17553167
Data sharing in neuroimaging research
Poline, Jean-Baptiste; Breeze, Janis L.; Ghosh, Satrajit; Gorgolewski, Krzysztof; Halchenko, Yaroslav O.; Hanke, Michael; Haselgrove, Christian; Helmer, Karl G.; Keator, David B.; Marcus, Daniel S.; Poldrack, Russell A.; Schwartz, Yannick; Ashburner, John; Kennedy, David N.
2012-01-01
Significant resources around the world have been invested in neuroimaging studies of brain function and disease. Easier access to this large body of work should have profound impact on research in cognitive neuroscience and psychiatry, leading to advances in the diagnosis and treatment of psychiatric and neurological disease. A trend toward increased sharing of neuroimaging data has emerged in recent years. Nevertheless, a number of barriers continue to impede momentum. Many researchers and institutions remain uncertain about how to share data or lack the tools and expertise to participate in data sharing. The use of electronic data capture (EDC) methods for neuroimaging greatly simplifies the task of data collection and has the potential to help standardize many aspects of data sharing. We review here the motivations for sharing neuroimaging data, the current data sharing landscape, and the sociological or technical barriers that still need to be addressed. The INCF Task Force on Neuroimaging Datasharing, in conjunction with several collaborative groups around the world, has started work on several tools to ease and eventually automate the practice of data sharing. It is hoped that such tools will allow researchers to easily share raw, processed, and derived neuroimaging data, with appropriate metadata and provenance records, and will improve the reproducibility of neuroimaging studies. By providing seamless integration of data sharing and analysis tools within a commodity research environment, the Task Force seeks to identify and minimize barriers to data sharing in the field of neuroimaging. PMID:22493576
A suite of R packages for web-enabled modeling and analysis of surface waters
NASA Astrophysics Data System (ADS)
Read, J. S.; Winslow, L. A.; Nüst, D.; De Cicco, L.; Walker, J. I.
2014-12-01
Researchers often create redundant methods for downloading, manipulating, and analyzing data from online resources. Moreover, the reproducibility of science can be hampered by complicated and voluminous data, lack of time for documentation and long-term maintenance of software, and fear of exposing programming skills. The combination of these factors can encourage unshared one-off programmatic solutions instead of openly provided reusable methods. Federal and academic researchers in the water resources and informatics domains have collaborated to address these issues. The result of this collaboration is a suite of modular R packages that can be used independently or as elements in reproducible analytical workflows. These documented and freely available R packages were designed to fill basic needs for the effective use of water data: the retrieval of time-series and spatial data from web resources (dataRetrieval, geoknife), performing quality assurance and quality control checks of these data with robust statistical methods (sensorQC), the creation of useful data derivatives (including physically- and biologically-relevant indices; GDopp, LakeMetabolizer), and the execution and evaluation of models (glmtools, rLakeAnalyzer). Here, we share details and recommendations for the collaborative coding process, and highlight the benefits of an open-source tool development pattern with a popular programming language in the water resources discipline (such as R). We provide examples of reproducible science driven by large volumes of web-available data using these tools, explore benefits of accessing packages as standardized web processing services (WPS) and present a working platform that allows domain experts to publish scientific algorithms in a service-oriented architecture (WPS4R). We assert that in the era of open data, tools that leverage these data should also be freely shared, transparent, and developed in an open innovation environment.
Hage, S J
1991-01-01
"Rapid and tumultuous change in health care as well as business has precipitated a power shift," declares Mr. Hage in this candid discussion of a quality that is both abstract and concrete. Centralized power is no longer the order of the day; in fact, the new stance supports pushing power down into organizations where it can be better used by those closer to the action. The author maintains that effective participants in this new model will learn to share power and respect knowledge as the only tool that wields it.
Quality measurement in physician-staffed emergency medical services: a systematic literature review.
Haugland, Helge; Uleberg, Oddvar; Klepstad, Pål; Krüger, Andreas; Rehn, Marius
2018-05-15
Quality measurement of physician-staffed emergency medical services (P-EMS) is necessary to improve service quality. Knowledge and consensus on this topic are scarce, making quality measurement of P-EMS a high-priority research area. The aim of this review was to identify, describe and evaluate studies of quality measurement in P-EMS. The databases of MEDLINE and Embase were searched initially, followed by a search for included article citations in Scopus. The study eligibility criteria were: (1) articles describing the use of one quality indicator (QI) or more in P-EMS, (2) original manuscripts, (3) articles published from 1 January 1968 until 5 October 2016. The literature search identified 4699 records. 4543 were excluded after reviewing title and abstract. An additional 129 were excluded based on a full-text review. The remaining 27 papers were included in the analysis. Methodological quality was assessed using an adapted critical appraisal tool. The description of used QIs and methods of quality measurement was extracted. Variables describing the involved P-EMSs were extracted as well. In the included papers, a common understanding of which QIs to use in P-EMS did not exist. Fifteen papers used only a single QI. The most widely used QIs were 'Adherence to medical protocols', 'Provision of advanced interventions', 'Response time' and 'Adverse events'. The review demonstrated a lack of shared understanding of which QIs to use in P-EMS. Moreover, papers using only one QI dominated the literature, thus increasing the risk of a narrow perspective in quality measurement. Future quality measurement in P-EMS should rely on a set of consensus-based QIs, ensuring a comprehensive approach to quality measurement.
The Role of Social Media Tools: Accessible Tourism for Disabled Citizens
ERIC Educational Resources Information Center
Altinay, Zehra; Saner, Tulen; Bahçelerli, Nesrin M.; Altinay, Fahriye
2016-01-01
Knowledge sharing becomes important to accomplish digital citizenship. Social media tools become popular to share and diffuse the knowledge in the digitalization. This social media learning and knowledge sharing platforms provides accessibility to the services within societies especially for disabled citizens. This research study aims to evaluate…
Web-based visual analysis for high-throughput genomics
2013-01-01
Background Visualization plays an essential role in genomics research by making it possible to observe correlations and trends in large datasets as well as communicate findings to others. Visual analysis, which combines visualization with analysis tools to enable seamless use of both approaches for scientific investigation, offers a powerful method for performing complex genomic analyses. However, there are numerous challenges that arise when creating rich, interactive Web-based visualizations/visual analysis applications for high-throughput genomics. These challenges include managing data flow from Web server to Web browser, integrating analysis tools and visualizations, and sharing visualizations with colleagues. Results We have created a platform that simplifies the creation of Web-based visualization/visual analysis applications for high-throughput genomics. This platform provides components that make it simple to efficiently query very large datasets, draw common representations of genomic data, integrate with analysis tools, and share or publish fully interactive visualizations. Using this platform, we have created a Circos-style genome-wide viewer, a generic scatter plot for correlation analysis, an interactive phylogenetic tree, a scalable genome browser for next-generation sequencing data, and an application for systematically exploring tool parameter spaces to find good parameter values. All visualizations are interactive and fully customizable. The platform is integrated with the Galaxy (http://galaxyproject.org) genomics workbench, making it easy to integrate new visual applications into Galaxy. Conclusions Visualization and visual analysis play an important role in high-throughput genomics experiments, and approaches are needed to make it easier to create applications for these activities. Our framework provides a foundation for creating Web-based visualizations and integrating them into Galaxy. Finally, the visualizations we have created using the framework are useful tools for high-throughput genomics experiments. PMID:23758618
Improving physician's hand over among oncology staff using standardized communication tool
Alolayan, Ashwaq; Alkaiyat, Mohammad; Ali, Yosra; Alshami, Mona; Al-Surimi, Khaled; Jazieh, Abdul-Rahman
2017-01-01
Cancer patients are frequently admitted to hospital for many reasons. During their hospitalization they are handled by different physicians and other care providers. Maintaining good communication among physicians is essential to assure patient safety and the delivery of quality patient care. Several incidents of miscommunication had been reported at our oncology department due to the lack of a standardized communication tool for patient handover among physicians. Hence, this improvement project aimed to assess the impact of using a standardized communication tool on patient handover and the quality of patient care. A quality improvement team was formed to address the issue of cancer patient handover. We adopted a specific handover tool to be used by physicians, developed based on the well-known and validated communication tool ISBAR (Identify, Situation, Background, Assessment and Recommendation), which contains pertinent information about the patient's condition. The form was to be shared at a specific point in time during the handover process. We monitored physicians' compliance with this tool over 16 weeks through four 'purposive' and 'sequential' Plan-Do-Study-Act (PDSA) cycles, where each PDSA cycle was developed based on the challenges faced, the lessons learned, and the result of the previous cycle. Physicians' compliance rate with the tool improved significantly from 45% (baseline) to 100% after the fourth PDSA cycle. Another process measure was acknowledgment of the handover receipt email at two checkpoints, 8:00–9:00 a.m. and 4:00–5:00 p.m. The project showed that using a standardized handover form as a daily communication method between physicians is a useful and feasible way to improve cancer patient handover, with a positive impact on many aspects of healthcare processes and outcomes. PMID:28174657
Sharing-based social capital associated with harvest production and wealth in the Canadian Arctic
2018-01-01
Social institutions that facilitate sharing and redistribution may help mitigate the impact of resource shocks. In the North American Arctic, traditional food sharing may direct food to those who need it and provide a form of natural insurance against temporal variability in hunting returns within households. Here, network properties that facilitate resource flow (network size, quality, and density) are examined in a country food sharing network comprising 109 Inuit households from a village in Nunavik (Canada), using regressions to investigate the relationships between these network measures and household socioeconomic attributes. The results show that although single women and elders have larger networks, the sharing network is not structured to prioritize sharing towards households with low food availability. Rather, much food sharing appears to be driven by reciprocity between high-harvest households, meaning that poor, low-harvest households tend to have less sharing-based social capital than more affluent, high-harvest households. This suggests that poor, low-harvest households may be more vulnerable to disruptions in the availability of country food. PMID:29529040
NASA Astrophysics Data System (ADS)
Burba, G. G.; Johnson, D.; Velgersdyk, M.; Beaty, K.; Forgione, A.; Begashaw, I.; Allyn, D.
2015-12-01
Significant increases in data generation and computing power in recent years have greatly improved spatial and temporal flux data coverage on multiple scales, from a single station to continental flux networks. At the same time, operating budgets for flux teams and station infrastructure are getting ever more difficult to acquire and sustain. With more stations and networks, larger data flows from each station, and smaller operating budgets, modern tools are needed to effectively and efficiently handle the entire process. This would help maximize time dedicated to answering research questions, and minimize time and expenses spent on data processing, quality control and station management. Cross-sharing the stations with external institutions may also help leverage available funding, increase scientific collaboration, and promote data analyses and publications. FluxSuite, a new advanced tool combining hardware, software and web-service, was developed to address these specific demands. It automates key stages of the flux workflow, minimizes day-to-day site management, and modernizes the handling of data flows: each next-generation station measures all parameters needed for flux computations; a field microcomputer calculates final fully-corrected flux rates in real time, including computation-intensive Fourier transforms, spectra, co-spectra, multiple rotations, stationarity, and footprint; final fluxes, radiation, weather and soil data are merged into a single quality-controlled file; multiple flux stations are linked into an automated, time-synchronized network; the flux network manager, or PI, can see all stations in real time, including fluxes, supporting data, automated reports, and email alerts; the PI can assign rights and allow or restrict access to stations and data, so that selected stations can be shared via rights-managed access internally or with external institutions; and researchers without stations can form "virtual networks" for specific projects by collaborating with PIs from different actual networks. This presentation provides detailed examples of FluxSuite as currently utilized by two large flux networks in China (National Academy of Sciences & Agricultural Academy of Sciences), and by smaller networks with stations in the USA, Germany, Ireland, Malaysia and other locations around the globe.
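Of the computations listed, the co-spectrum is the one most central to flux work, since the covariance (and hence the flux) is its integral over frequency. The following Python sketch with synthetic data illustrates the quantity; it is not FluxSuite's actual implementation:

```python
# Illustrative co-spectrum of vertical wind (w) and gas concentration (c),
# the core quantity behind eddy-covariance fluxes. Synthetic 10 Hz data;
# real stations apply coordinate rotations and QC before this step.
import numpy as np

fs = 10.0                       # sampling rate, Hz
t = np.arange(0, 1800, 1 / fs)  # a 30-minute averaging interval
rng = np.random.default_rng(0)
w = np.sin(2 * np.pi * 0.05 * t) + 0.3 * rng.standard_normal(t.size)
c = np.sin(2 * np.pi * 0.05 * t + 0.5) + 0.3 * rng.standard_normal(t.size)

W = np.fft.rfft(w - w.mean())
C = np.fft.rfft(c - c.mean())
cospec = np.real(W * np.conj(C)) / (t.size * fs)  # one-sided co-spectral density
freqs = np.fft.rfftfreq(t.size, d=1 / fs)

# The covariance (hence the flux) is the integral of the co-spectrum.
df = freqs[1] - freqs[0]
print("cov(w, c) from co-spectrum ~", (2 * cospec[1:]).sum() * df)
print("direct covariance:        ", np.mean((w - w.mean()) * (c - c.mean())))
```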
Electroforming of optical tooling in high-strength Ni-Co alloy
NASA Astrophysics Data System (ADS)
Stein, Berl
2003-05-01
Plastic optics are often mass produced by injection, compression or injection-compression molding. Optical quality molds can be directly machined in appropriate materials (tool steels, electroless nickel, aluminum, etc.), but much greater cost efficiency can be achieved with electroformed mold inserts. Traditionally, electroforming of optical quality mold inserts has been carried out in nickel, a material much softer than tool steels, which, when hardened to 45-50 HRc, usually exhibit high wear resistance and long service life (hundreds of thousands of impressions per mold). Because of their low hardness (< 20 HRc), nickel molds can produce only tens of thousands of parts before they are scrapped due to wear or accidental damage. This drawback has prevented their wider usage in general plastic and optical mold making. Recently, NiCoForm has developed a proprietary Ni-Co electroforming bath combining the high strength and wear resistance of the alloy with the low stress and high replication fidelity typical of pure nickel electroforming. This paper outlines the approach to electroforming optical quality tooling in low-stress, high-strength Ni-Co alloy and presents several examples of electroformed NiColoy mold inserts.
Network Computing Infrastructure to Share Tools and Data in Global Nuclear Energy Partnership
NASA Astrophysics Data System (ADS)
Kim, Guehee; Suzuki, Yoshio; Teshima, Naoya
CCSE/JAEA (Center for Computational Science and e-Systems/Japan Atomic Energy Agency) integrated a prototype system of a network computing infrastructure for sharing tools and data to support the U.S.-Japan collaboration in GNEP (Global Nuclear Energy Partnership). We focused on three technical issues in applying our information process infrastructure: accessibility, security, and usability. In designing the prototype system, we integrated and improved both network and Web technologies. For the accessibility issue, we adopted SSL-VPN (Secure Sockets Layer-Virtual Private Network) technology for access beyond firewalls. For the security issue, we developed an authentication gateway based on the PKI (Public Key Infrastructure) authentication mechanism to strengthen security; we also set fine-grained access control policies for shared tools and data and used a shared-key encryption method to protect tools and data against leakage to third parties. For the usability issue, we chose Web browsers as the user interface and developed a Web application providing functions to support the sharing of tools and data. Using the WebDAV (Web-based Distributed Authoring and Versioning) function, users can manipulate shared tools and data through a Windows-like folder environment. We implemented the prototype system on AEGIS (Atomic Energy Grid Infrastructure), a Grid infrastructure for atomic energy research developed by CCSE/JAEA. The prototype system was applied for trial use in the first period of GNEP.
Cheung, Carol C; D'Arrigo, Corrado; Dietel, Manfred; Francis, Glenn D; Fulton, Regan; Gilks, C Blake; Hall, Jacqueline A; Hornick, Jason L; Ibrahim, Merdol; Marchetti, Antonio; Miller, Keith; van Krieken, J Han; Nielsen, Soren; Swanson, Paul E; Taylor, Clive R; Vyberg, Mogens; Zhou, Xiaoge; Torlakovic, Emina E
2017-04-01
The numbers of diagnostic, prognostic, and predictive immunohistochemistry (IHC) tests are increasing; the implementation and validation of new IHC tests, revalidation of existing tests, as well as the on-going need for daily quality assurance monitoring present significant challenges to clinical laboratories. There is a need for proper quality tools, specifically tissue tools that will enable laboratories to successfully carry out these processes. This paper clarifies, through the lens of laboratory tissue tools, how validation, verification, and revalidation of IHC tests can be performed in order to develop and maintain high quality "fit-for-purpose" IHC testing in the era of precision medicine. This is the final part of the 4-part series "Evolution of Quality Assurance for Clinical Immunohistochemistry in the Era of Precision Medicine."
NASA Astrophysics Data System (ADS)
Rajib, M. A.; Merwade, V.; Song, C.; Zhao, L.; Kim, I. L.; Zhe, S.
2014-12-01
Setting up any hydrologic model requires substantial effort, including compilation of all the data, creation of input files, calibration and validation. Given the effort involved, models for a watershed may be created multiple times by multiple groups or organizations to accomplish different research, educational or policy goals. To reduce this duplication of effort and enable collaboration among different groups or organizations around an already existing hydrology model, a platform is needed where anyone can search for existing models, perform simple scenario analysis and visualize model results. The creator and users of a model on such a platform can then collaborate to accomplish new research or educational objectives. From this perspective, a prototype cyber-infrastructure (CI), called SWATShare, is developed for sharing, running and visualizing Soil and Water Assessment Tool (SWAT) models in an interactive GIS-enabled web environment. Users can utilize SWATShare to publish or upload their own models, search and download existing SWAT models developed by others, and run simulations, including calibration, using high performance resources provided by XSEDE and Cloud. Besides running and sharing, SWATShare hosts a novel spatio-temporal visualization system for SWAT model outputs. At the temporal scale, the system creates time-series plots for all the hydrology and water quality variables available along the reach as well as at the watershed level. At the spatial scale, the system can dynamically generate sub-basin level thematic maps for any variable at any user-defined date or date range, thereby allowing users to run animations or download the data for subsequent analyses. In addition to research, SWATShare can also be used within a classroom setting as an educational tool for modeling and comparing the hydrologic processes under different geographic and climatic settings. SWATShare is publicly available at https://www.water-hub.org/swatshare.
IEDA: Making Small Data BIG Through Interdisciplinary Partnerships Among Long-tail Domains
NASA Astrophysics Data System (ADS)
Lehnert, K. A.; Carbotte, S. M.; Arko, R. A.; Ferrini, V. L.; Hsu, L.; Song, L.; Ghiorso, M. S.; Walker, D. J.
2014-12-01
The Big Data world in the Earth Sciences so far exists primarily for disciplines that generate massive volumes of observational or computed data using large-scale, shared instrumentation such as global sensor networks, satellites, or high-performance computing facilities. These data are typically managed and curated by well-supported community data facilities that also provide the tools for exploring the data through visualization or statistical analysis. In many other domains, especially those where data are primarily acquired by individual investigators or small teams (known as 'Long-tail data'), data are poorly shared and integrated, lacking a community-based data infrastructure that ensures persistent access, quality control, standardization, and integration of data, as well as appropriate tools to fully explore and mine the data within the context of broader Earth Science datasets. IEDA (Integrated Earth Data Applications, www.iedadata.org) is a data facility funded by the US NSF to develop and operate data services that support data stewardship throughout the full life cycle of observational data in the solid earth sciences, with a focus on the data management needs of individual researchers. IEDA builds on a strong foundation of mature disciplinary data systems for marine geology and geophysics, geochemistry, and geochronology. These systems have dramatically advanced data resources in those long-tail Earth science domains. IEDA has strengthened these resources by establishing a consolidated, enterprise-grade infrastructure that is shared by the domain-specific data systems, and implementing joint data curation and data publication services that follow community standards. In recent years, other domain-specific data efforts have partnered with IEDA to take advantage of this infrastructure and improve data services to their respective communities with formal data publication, long-term preservation of data holdings, and better sustainability. IEDA hopes to foster such partnerships with streamlined data services, including user-friendly, single-point interfaces for data submission, discovery, and access across the partner systems to support interdisciplinary science.
Molleman, Toon; van Ginneken, Esther F J C
2015-09-01
Prisons worldwide operate under crowded conditions, in which prisoners are forced to share a cell. Few studies have looked at the relationship between cell sharing and the quality of prison life in Europe. This study aims to fill this gap with a multilevel analysis on the link between cell sharing and quality of prison life, using results from a Dutch prisoner survey. Findings show that cell sharing is associated with lower perceived prison quality, which is partially mediated by reduced quality of staff-prisoner relationships. Cell sharing thus undermines the Dutch penological philosophy, which considers staff-prisoner relationships to be at the heart of prisoner treatment and rehabilitation. It is recommended that prisoners are held in single rather than double cells. © The Author(s) 2014.
Randomization Based Privacy Preserving Categorical Data Analysis
ERIC Educational Resources Information Center
Guo, Ling
2010-01-01
The success of data mining relies on the availability of high quality data. To ensure quality data mining, effective information sharing between organizations becomes a vital requirement in today's society. Since data mining often involves sensitive information of individuals, the public has expressed a deep concern about their privacy.…
Boggan, Joel C; Cheely, George; Shah, Bimal R; Heffelfinger, Randy; Springall, Deanna; Thomas, Samantha M; Zaas, Aimee; Bae, Jonathan
2014-09-01
Systematically engaging residents in large programs in quality improvement (QI) is challenging. To coordinate a shared QI project in a large residency program using an online tool. A web-based QI tool guided residents through a 2-phase evaluation of performance of foot examinations in patients with diabetes. In phase 1, residents completed reviews of health records with online data entry. Residents were then presented with personal performance data relative to peers and were prompted to develop improvement plans. In phase 2, residents again reviewed personal performance. Rates of performance were compared at the program and clinic levels for each phase, with data presented for residents. Acceptability was measured by the number of residents completing each phase. Feasibility was measured by estimated faculty, programmer, and administrator time and costs. Seventy-nine of 86 eligible residents (92%) completed improvement plans and reviewed 1471 patients in phase 1, whereas 68 residents (79%) reviewed 1054 patient charts in phase 2. Rates of performance of examination increased significantly between phases (from 52% to 73% for complete examination, P < .001). Development of the tool required 130 hours of programmer time. Project analysis and management required 6 hours of administrator and faculty time monthly. An online tool developed and implemented for program-wide QI initiatives successfully engaged residents to participate in QI activities. Residents using this tool demonstrated improvement in a selected quality target. This tool could be adapted by other graduate medical education programs or for faculty development.
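The reported phase 1 versus phase 2 difference (52% vs. 73% complete examinations) can be sanity-checked with a two-proportion z-test; the counts below are hypothetical round numbers consistent with the reported chart totals, not the study's raw data.

```python
# Two-proportion z-test on illustrative counts (~52% of 1471, ~73% of 1054).
from statsmodels.stats.proportion import proportions_ztest

successes = [765, 770]
totals = [1471, 1054]
stat, pvalue = proportions_ztest(successes, totals)
print(f"z = {stat:.2f}, p = {pvalue:.2g}")  # p << .001, matching the report
```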
Lavin, Mary Ann; Harper, Ellen; Barr, Nancy
2015-04-14
The electronic health record (EHR) is a documentation tool that yields data useful in enhancing patient safety, evaluating care quality, maximizing efficiency, and measuring staffing needs. Although nurses applaud the EHR, they also indicate dissatisfaction with its design and cumbersome electronic processes. This article describes the views of nurses shared by members of the Nursing Practice Committee of the Missouri Nurses Association; it encourages nurses to share their EHR concerns with Information Technology (IT) staff and vendors and to take their place at the table when nursing-related IT decisions are made. In this article, we describe the experiential-reflective reasoning and action model used to understand staff nurses' perspectives, share committee reflections and recommendations for improving both documentation and documentation technology, and conclude by encouraging nurses to develop their documentation and informatics skills. Nursing issues include medication safety, documentation and standards of practice, and EHR efficiency. IT concerns include interoperability, vendors, innovation, nursing voice, education, and collaboration.
The Climate-G testbed: towards a large scale data sharing environment for climate change
NASA Astrophysics Data System (ADS)
Aloisio, G.; Fiore, S.; Denvil, S.; Petitdidier, M.; Fox, P.; Schwichtenberg, H.; Blower, J.; Barbera, R.
2009-04-01
The Climate-G testbed provides an experimental large-scale data environment for climate change, addressing challenging data and metadata management issues. The main scope of Climate-G is to allow scientists to carry out geographical and cross-institutional climate data discovery, access, visualization and sharing. Climate-G is a multidisciplinary collaboration involving both climate and computer scientists, and it currently involves several partners: Centro Euro-Mediterraneo per i Cambiamenti Climatici (CMCC), Institut Pierre-Simon Laplace (IPSL), Fraunhofer Institut für Algorithmen und Wissenschaftliches Rechnen (SCAI), National Center for Atmospheric Research (NCAR), University of Reading, University of Catania and University of Salento. To perform distributed metadata search and discovery, we adopted a CMCC metadata solution (which provides a high level of scalability, transparency, fault tolerance and autonomy) leveraging both P2P and grid technologies (GRelC Data Access and Integration Service). Moreover, data are available through OPeNDAP/THREDDS services, the Live Access Server and the OGC-compliant Web Map Service, and they can be downloaded, visualized and accessed within the proposed environment through the Climate-G Data Distribution Centre (DDC), the web gateway to the Climate-G digital library. The DDC is a data-grid portal allowing users to easily, securely and transparently perform search/discovery, metadata management, data access, data visualization, etc. Godiva2 (integrated into the DDC) displays 2D maps (and animations) and also exports maps for display on the Google Earth virtual globe. Presently, Climate-G publishes (through the DDC) about 2 TB of data related to the ENSEMBLES project (also including distributed replicas of data) as well as to the IPCC AR4. The main results of the proposed work are: a wide data access/sharing environment for climate change; a P2P/grid metadata approach; a production-level Climate-G DDC; high-quality tools for data visualization; metadata search/discovery across several countries/institutions; and an open environment for climate change data sharing.
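A minimal sketch of the OPeNDAP access pattern that Climate-G exposes through its THREDDS services follows; the dataset URL and variable name are placeholders, not actual Climate-G endpoints.

```python
# Remote subsetting via OPeNDAP: only the requested slice crosses the network.
from netCDF4 import Dataset

url = "http://example.org/thredds/dodsC/ensembles/tas_monthly.nc"  # placeholder
ds = Dataset(url)                 # opens the remote dataset lazily
tas = ds.variables["tas"]         # assumed variable name
print(tas.shape)
slab = tas[0, :, :]               # server-side subsetting of one time step
ds.close()
```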
Chidambaram, Valliammai; Brewster, Philip J.; Jordan, Kristine C.; Hurdle, John F.
2013-01-01
The United States, indeed the world, struggles with a serious obesity epidemic. The costs of this epidemic in terms of healthcare dollar expenditures and human morbidity/mortality are staggering. Surprisingly, clinicians are ill-equipped in general to advise patients on effective, longitudinal weight loss strategies. We argue that one factor hindering clinicians and patients in effective shared decision-making about weight loss is the absence of a metric that can be reasoned about and monitored over time, as clinicians do routinely with, say, serum lipid levels or HgA1C. We propose that a dietary quality measure championed by the USDA and NCI, the HEI-2005/2010, is an ideal metric for this purpose. We describe a new tool, the quality Dietary Information Extraction Tool (qDIET), which is a step toward an automated, self-sustaining process that can link retail grocery purchase data to the appropriate USDA databases to permit the calculation of the HEI-2005/2010. PMID:24551333
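To make the metric concrete, here is a toy sketch of HEI-style density scoring of the sort qDIET would compute from linked purchase data; the standards and values below are illustrative, not the official HEI-2005/2010 cut points.

```python
# HEI-style component scoring: linear credit between a zero-point and a
# full-credit density standard (per 1000 kcal). Numbers are hypothetical.
def component_score(density: float, min_std: float, max_std: float,
                    max_points: float) -> float:
    """Linear score between a zero-point and a full-credit standard."""
    if density <= min_std:
        return 0.0
    if density >= max_std:
        return float(max_points)
    return max_points * (density - min_std) / (max_std - min_std)

# e.g., whole-fruit cups per 1000 kcal against a hypothetical 0.4 standard
print(component_score(density=0.25, min_std=0.0, max_std=0.4, max_points=5))
```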
Enriching the Web of Data with Educational Information Using We-Share
ERIC Educational Resources Information Center
Ruiz-Calleja, Adolfo; Asensio-Pérez, Juan I.; Vega-Gorgojo, Guillermo; Gómez-Sánchez, Eduardo; Bote-Lorenzo, Miguel L.; Alario-Hoyos, Carlos
2017-01-01
This paper presents We-Share, a social annotation application that enables educators to publish and retrieve information about educational ICT tools. As a distinctive characteristic, We-Share provides educators data about educational tools already available on the Web of Data while allowing them to enrich such data with their experience using…
Tool and process for miniature explosive joining of tubes
NASA Technical Reports Server (NTRS)
Bement, Laurence J. (Inventor); Bailey, James W. (Inventor)
1987-01-01
A tool and process to be used in the explosive joining of tubes is disclosed. The tool consists of an initiator, a tool form, and a ribbon explosive. The assembled tool is a compact, storable, and safe device suitable for explosive joining of small, lightweight tubes down to 0.20 inch in diameter. The invention is inserted into either another tube or a tube plate. A shim or standoff between the two surfaces to be welded is necessary. Initiation of the explosive inside the tube results in a high velocity, angular collision between the mating surfaces. This collision creates surface melts and collision bonding wherein electron-sharing linkups are formed.
Examples of Effective Data Sharing in Scientific Publishing
Kitchin, John R.
2015-05-11
Here, we present a perspective on an approach to data sharing in scientific publications we have been developing in our group. The essence of the approach is that data can be embedded in a human-readable and machine-addressable way within the traditional publishing environment. We show this by example for both computational and experimental data. We articulate a need for new authoring tools to facilitate data sharing, and we discuss the tools we have been developing for this purpose. With these tools, data generation, analysis, and manuscript preparation can be deeply integrated, resulting in easier and better data sharing in scientific publications.
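A minimal sketch of the underlying idea is shown below: data embedded in the manuscript source in a form both humans and scripts can read. The comment convention is hypothetical, not the authors' actual tooling.

```python
# Embed machine-addressable data in a document and extract it by script.
import json
import re

manuscript = """
The measured rate constant was 0.42 1/s.
<!-- data: {"rate_constant": {"value": 0.42, "units": "1/s"}} -->
"""

match = re.search(r"<!-- data: (.*?) -->", manuscript, re.S)
data = json.loads(match.group(1))
print(data["rate_constant"]["value"])   # machine-addressable: 0.42
```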
The informatics capability maturity of integrated primary care centres in Australia.
Liaw, Siaw-Teng; Kearns, Rachael; Taggart, Jane; Frank, Oliver; Lane, Riki; Tam, Michael; Dennis, Sarah; Walker, Christine; Russell, Grant; Harris, Mark
2017-09-01
Integrated primary care requires systems and service integration along with financial incentives to promote downward substitution toward a single entry point to care. Integrated Primary Care Centres (IPCCs) aim to improve integration by co-location of health services. Informatics Capability Maturity (ICM) describes how well health organisations collect, manage and share information; manage eHealth technology, implementation, change, data quality and governance; and use "intelligence" to improve care. The aim was to describe associations of ICM with systems and service integration in IPCCs. We conducted a mixed-methods evaluation of IPCCs in metropolitan and rural Australia: an enhanced general practice, four GP Super Clinics, a "HealthOne" (private-public partnership) and a Community Health Centre. Data collection methods included self-assessed ICM, document review, interviews, observations in practice and assessment of electronic health record data. Data were analysed and compared across IPCCs. The IPCCs demonstrated a range of funding models, ownership, leadership, organisation and ICM. Digital tools were used with varying effectiveness to collect, use and share data. Connectivity was problematic, requiring "work-arounds" to communicate and share information. The lack of technical, data and software interoperability standards, clinical coding and secure messaging were barriers to data collection, integration and sharing. Strong leadership and governance were important for successful implementation of robust and secure eHealth systems. Patient engagement with eHealth tools was suboptimal. ICM is positively associated with integration of data, systems and care. Improved ICM requires a health workforce with eHealth competencies; technical, semantic and software standards; adequate privacy and security; and good governance and leadership. Copyright © 2017 Elsevier B.V. All rights reserved.
Artificial intelligence support for scientific model-building
NASA Technical Reports Server (NTRS)
Keller, Richard M.
1992-01-01
Scientific model-building can be a time-intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientific development team to understand. We believe that artificial intelligence techniques can facilitate both the model-building and model-sharing process. In this paper, we give an overview of our effort to build a scientific modeling software tool that aids the scientist in developing and using models. This tool includes an interactive intelligent graphical interface, a high-level domain-specific modeling language, a library of physics equations and experimental datasets, and a suite of data display facilities.
The State of Software for Evolutionary Biology.
Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros
2018-05-01
With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Researchers therefore have to rely on a growing number of increasingly complex software tools. All widely used core tools in the field have grown considerably in terms of the number of features as well as lines of code, and consequently also with respect to software complexity. A topic that has received little attention is the software engineering quality of these widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder, etc.) and Java (BEAST) from the broader area of evolutionary biology that are routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfactory, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high-quality scientific software from scratch. Finally, we also discuss journal and science policy as well as, more importantly, funding issues that need to be addressed to improve software engineering quality and to ensure support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development.
ERIC Educational Resources Information Center
Martin, Sonya N.; Scantlebury, Kathryn
2009-01-01
This paper focuses on content-based and pedagogical instructors' use of cogenerative dialogues to improve instructional practice and to evaluate program effectiveness in a professional development program for high school chemistry teachers. We share our research findings from using cogenerative dialogues as an evaluative tool for general…
Water Quality Projects Summary for the Mid-Columbia and Cumberland River Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, Kevin M.; Witt, Adam M.; Hadjerioua, Boualem
Scheduling and operational control of hydropower systems is accompanied by a keen awareness of the management of water use, environmental effects, and policy, especially within the context of strict water rights policy and generation maximization. This is a multi-objective problem for many hydropower systems, including the Cumberland and Mid-Columbia river systems. Though each of these two systems has distinct operational philosophies, hydrologic characteristics, and system dynamics, they both share a responsibility to effectively manage hydropower and the environment, which requires state-of-the-art improvements in the approaches and applications for water quality modeling. The Department of Energy and Oak Ridge National Laboratory have developed tools for total dissolved gas (TDG) prediction on the Mid-Columbia River and a decision-support system used for hydropower generation and environmental optimization on the Cumberland River. In conjunction with IIHR - Hydroscience & Engineering at The University of Iowa and the University of Colorado's Center for Advanced Decision Support for Water and Environmental Systems (CADSWES), ORNL has managed the development of a TDG predictive methodology at seven dams along the Mid-Columbia River and has enabled the use of this methodology for optimization of operations at these projects with the commercially available software package RiverWare. ORNL has also managed a collaboration with Vanderbilt University and Lipscomb University to develop a state-of-the-art method for reducing high-fidelity water quality modeling results into surrogate models, which can be used effectively within optimization efforts to maximize generation for a reservoir system under environmental and policy constraints. The novel contribution of these efforts is the ability to predict water quality conditions with simplified methodologies at the same level of accuracy as more complex and resource-intensive computing methods. These efforts were designed to integrate well with existing hydropower and reservoir system scheduling models, with runtimes comparable to existing software tools. In addition, the transferability of these tools to other systems is enhanced by the use of simple and easily attainable inputs, straightforward calibration of predictive equation coefficients, and standardized comparison of traditionally familiar outputs.
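A hedged sketch of the surrogate-model idea follows: replace an expensive water-quality model with a cheap fitted function of an operating variable. The quadratic form, variable names, and synthetic "high-fidelity" data are illustrative only, not the ORNL methodology.

```python
# Fit a quadratic surrogate for TDG (%) as a function of spill fraction.
import numpy as np

rng = np.random.default_rng(0)
spill = rng.uniform(0, 100, 200)                 # spill fraction (%)
tdg_highfid = (100 + 0.35 * spill - 0.0012 * spill**2
               + rng.normal(0, 0.5, 200))        # stand-in for model/field output

coeffs = np.polyfit(spill, tdg_highfid, deg=2)   # fit the surrogate
surrogate = np.poly1d(coeffs)
print(surrogate(40.0))                           # instant TDG estimate at 40% spill
```

Once fitted, the surrogate evaluates in microseconds, which is what makes it usable inside a reservoir-system optimizer with runtimes comparable to existing scheduling tools.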
Parish, Sharon J; Nappi, Rossella E; Kingsberg, Sheryl
2018-03-05
This narrative review strives to give healthcare providers (HCPs) who care for menopausal women better tools and skills to initiate discussions with women about menopause and hormone therapy (HT), communicate complex concepts and data, and promote shared decision-making. We review relevant studies on HT, barriers to treatment of menopausal symptoms, and effective communication strategies. We also provide recommendations for communicating with patients about HT based on the medical literature and our own professional experience. Both patient and HCP-related barriers can prevent women from accessing treatment for bothersome symptoms of menopause. Many women and HCPs have a poor understanding of the complex, nuanced data regarding HT. The benefits and risks vary with patient age and time since menopause, duration of use, inclusion of a progestin, and patient medical history. Women may also have fears about potential side effects of HT and feel unable to make informed choices. Strategies for effective patient communication and shared decision-making include use of open-ended questions to elicit patient's concerns and preferences, reflecting back to the patient what the HCP heard, presenting evidence about benefits and risks in language the patient can understand, keeping risks in perspective (eg, provide absolute, and also relative risks) without minimizing them, and making conscious efforts to minimize potential bias. Necessary components for achieving high-quality, shared decisions about HT involve a combination of medical evidence, communication skills, and recognition of patient goals and concerns. Use of such strategies can enhance women's satisfaction with care.
Weiler, Gabriele; Schröder, Christina; Schera, Fatima; Dobkowicz, Matthias; Kiefer, Stephan; Heidtke, Karsten R; Hänold, Stefanie; Nwankwo, Iheanyi; Forgó, Nikolaus; Stanulla, Martin; Eckert, Cornelia; Graf, Norbert
2014-01-01
Biobanks represent key resources for clinico-genomic research and are needed to pave the way to personalised medicine. To achieve this goal, it is crucial that scientists can securely access and share high-quality biomaterial and related data. Therefore, there is a growing interest in integrating biobanks into larger biomedical information and communication technology (ICT) infrastructures. The European project p-medicine is currently building an innovative ICT infrastructure to meet this need. This platform provides tools and services for conducting research and clinical trials in personalised medicine. In this paper, we describe one of its main components, the biobank access framework p-BioSPRE (p-medicine Biospecimen Search and Project Request Engine). This generic framework enables and simplifies access to existing biobanks, allows biobanks to offer their own biomaterial collections to research communities, and supports management of biobank specimens and related clinical data through the ObTiMA Trial Biomaterial Manager. p-BioSPRE takes into consideration all relevant ethical and legal standards, e.g., safeguarding donors' personal rights and enabling biobanks to keep control over the donated material and related data. The framework thus enables secure sharing of biomaterial within open and closed research communities, while flexibly integrating related clinical and omics data. Although the development of the framework is mainly driven by user scenarios from the cancer domain, in this case acute lymphoblastic leukaemia and Wilms tumour, it can be extended to further disease entities. PMID:24567758
Buch, Martin Sandberg; Edwards, Adrian; Eriksson, Tina
2009-01-01
The Maturity Matrix is a group-based formative self-evaluation tool aimed at assessing the degree of organisational development in general practice and providing a starting point for local quality improvement. Earlier studies of the Maturity Matrix have shown that participants find the method a useful way of assessing their practice's organisational development. However, little is known about participants' views on the resulting efforts to implement intended changes. To explore users' perspectives on the Maturity Matrix method, the facilitation process, and drivers and barriers for implementation of intended changes. Observation of two facilitated practice meetings, 17 semi-structured interviews with participating general practitioners (GPs) or their staff, and mapping of reasons for continuing or quitting the project. Setting: general practices in Denmark. Main outcomes: successful change was associated with a clearly identified anchor person within the practice, a shared and regular meeting structure, and an external facilitator who provides support and counselling during the implementation process. Failure to implement change was associated with a high patient-related workload, staff or GP turnover (which seemed to affect small practices more), no clearly identified anchor person or anchor persons who did not do anything, no continuous support from an external facilitator, and no formal commitment to working with agreed changes. Future attempts to improve the impact of the Maturity Matrix, and similar tools for quality improvement, could include: (a) attention to matters of variation caused by practice size, (b) systematic counselling on barriers to implementation and support to structure the change processes, (c) a commitment from participants that goes beyond participation in two-yearly assessments, and (d) an anchor person for each identified goal who takes on the responsibility for improvement in practice.
Evans, Steven T; Stewart, Kevin D; Afdahl, Chris; Patel, Rohan; Newell, Kelcy J
2017-07-14
In this paper, we discuss the optimization and implementation of a high-throughput process development (HTPD) tool that utilizes commercially available microliter-sized column technology for the purification of multiple clinically significant monoclonal antibodies. Chromatographic profiles generated using this optimized tool are shown to overlay with comparable profiles from the conventional bench scale and the clinical manufacturing scale. Further, all measured product quality attributes are comparable across scales for the mAb purifications. In addition to supporting chromatography process development efforts (e.g., optimization screening), the comparable product quality results at all scales make this tool an appropriate scale model to enable purification and product quality comparisons of HTPD bioreactor conditions. The ability to perform up to 8 chromatography purifications in parallel with reduced material requirements per run creates opportunities for gathering more process knowledge in less time. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
How we make cell therapy in Italy.
Montemurro, Tiziana; Viganò, Mariele; Budelli, Silvia; Montelatici, Elisa; Lavazza, Cristiana; Marino, Luigi; Parazzi, Valentina; Lazzari, Lorenza; Giordano, Rosaria
2015-01-01
In the 21st-century scenario, new therapeutic tools are needed to take up the social and medical challenge posed by increasingly frequent degenerative disorders and by the aging of the population. The recent category of advanced therapy medicinal products has been created to comprise cellular, gene therapy, and tissue engineered products, as a new class of drugs. Their manufacture requires the same pharmaceutical framework as for conventional drugs, and this means that industrial, large-scale manufacturing processes have to be adapted to the peculiar characteristics of cell-containing products. Our hospital took up the challenge of this new path in the early 2000s; herein we describe the approach we followed to set up a pharmaceutical-grade facility in a public hospital context, with the aim of sharing the solutions we found to make cell therapy compliant with the requirements for the production and quality control of a high-standard medicinal product.
Practising cloud-based telemedicine in developing countries.
Puustjärvi, Juha; Puustjärvi, Leena
2013-01-01
In industrialised countries, telemedicine has proven to be a valuable tool for enabling access to knowledge and allowing information exchange, showing that it is possible to provide good-quality healthcare to isolated communities. However, there are many barriers to the widespread implementation of telemedicine in rural areas of developing countries, including deficient internet connectivity and a lack of sophisticated peripheral medical devices. Furthermore, developing countries have very high patient-per-doctor ratios. In this paper, we report our work on developing a cloud-based health information system, which promotes telemedicine and patient-centred healthcare by exploiting modern information and communication technologies such as OWL ontologies and SQL triggers. The reason for using cloud technology is twofold. First, cloud service models are easily adaptable for sharing patients' health information, which is of prime importance in patient-centred healthcare as well as in telemedicine. Second, the cloud and the consulting physicians may be located anywhere on the internet.
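A minimal sketch of the SQL-trigger mechanism the abstract mentions follows, using SQLite for self-containment; the table, trigger, and threshold are hypothetical, not the paper's schema.

```python
# A database trigger that raises an alert row when an abnormal vital sign
# is recorded, so downstream telemedicine services can react to it.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE vitals (patient_id INT, systolic INT);
CREATE TABLE alerts (patient_id INT, message TEXT);
CREATE TRIGGER high_bp AFTER INSERT ON vitals
WHEN NEW.systolic > 180
BEGIN
  INSERT INTO alerts VALUES (NEW.patient_id, 'hypertensive reading');
END;
""")
con.execute("INSERT INTO vitals VALUES (1, 190)")
print(con.execute("SELECT * FROM alerts").fetchall())  # [(1, 'hypertensive reading')]
```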
The quality of instruments to assess the process of shared decision making: A systematic review
Bomhof-Roordink, Hanna; Smith, Ian P.; Scholl, Isabelle; Stiggelbout, Anne M.; Pieterse, Arwen H.
2018-01-01
Objective: To inventory instruments assessing the process of shared decision making and appraise their measurement quality, taking into account the methodological quality of their validation studies. Methods: In a systematic review we searched seven databases (PubMed, Embase, Emcare, Cochrane, PsycINFO, Web of Science, Academic Search Premier) for studies investigating instruments measuring the process of shared decision making. Per identified instrument, we assessed the level of evidence separately for 10 measurement properties following a three-step procedure: 1) appraisal of the methodological quality using the COnsensus-based Standards for the selection of health status Measurement INstruments (COSMIN) checklist, 2) appraisal of the psychometric quality of the measurement property using three possible quality scores, 3) best-evidence synthesis based on the number of studies, their methodological and psychometric quality, and the direction and consistency of the results. The study protocol was registered at PROSPERO: CRD42015023397. Results: We included 51 articles describing the development and/or evaluation of 40 shared decision-making process instruments: 16 patient questionnaires, 4 provider questionnaires, 18 coding schemes and 2 instruments measuring multiple perspectives. There is an overall lack of evidence for their measurement quality, either because validation is missing or methods are poor. The best-evidence synthesis indicated positive results for a major part of instruments for content validity (50%) and structural validity (53%) where these were evaluated, but negative results for a major part of instruments where inter-rater reliability (47%) and hypotheses testing (59%) were evaluated. Conclusions: Due to the lack of evidence on measurement quality, the choice of the most appropriate instrument can best be based on the instrument's content and characteristics, such as the perspective that they assess. We recommend refinement and validation of existing instruments, and the use of COSMIN guidelines to help guarantee high-quality evaluations. PMID:29447193
Acoustic Wave Filter Technology-A Review.
Ruppel, Clemens C W
2017-09-01
Today, acoustic filters are the filter technology that meets the performance requirements dictated by the cellular phone standards within the required form factor. Around two billion cellular phones are sold every year, and smartphones account for a very high percentage of these, approximately two-thirds. Smartphones require a very high number of filter functions, ranging from the low double digits up to almost triple digits in the near future. In the frequency range up to 1 GHz, surface acoustic wave (SAW) filters are almost exclusively employed, while in the higher frequency range, bulk acoustic wave (BAW) and SAW filters compete for their shares. Prerequisites for the success of acoustic filters were the availability of high-quality substrates, advanced and highly reproducible fabrication technologies, optimum filter techniques, precise simulation software, and advanced design tools that allow fast and efficient design according to customer specifications. This paper focuses on innovations leading to high-volume applications of intermediate frequency (IF) and radio frequency (RF) acoustic filters, e.g., TV IF filters, IF filters for cellular phones, and SAW/BAW RF filters for the RF front-end of cellular phones.
NASA Astrophysics Data System (ADS)
Sivarami Reddy, N.; Ramamurthy, D. V., Dr.; Prahlada Rao, K., Dr.
2017-08-01
This article addresses the simultaneous scheduling of machines, AGVs and tools in a multi-machine Flexible Manufacturing System (FMS) where machines are allowed to share tools, considering transfer times of jobs and tools between machines, in order to generate optimal sequences that minimize makespan. FMS performance is expected to improve through effective utilization of its resources and proper integration and synchronization of their scheduling. The Symbiotic Organisms Search (SOS) algorithm is a proven, potent alternative for solving optimization problems such as scheduling. The proposed SOS algorithm is first tested on 22 job sets, with makespan as the objective, for scheduling of machines and tools with tool sharing but without transfer times of jobs and tools, and the results are compared with those of existing methods; SOS outperforms them. The same SOS algorithm is then used for simultaneous scheduling of machines, AGVs and tools with tool sharing and transfer times to determine the optimal sequences that minimize makespan.
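For orientation, here is a compact sketch of the three SOS phases (mutualism, commensalism, parasitism) on a toy continuous objective standing in for makespan; the paper applies the same search to discrete scheduling sequences, so this is the algorithmic skeleton only.

```python
# Minimal Symbiotic Organisms Search on f(x) = sum(x^2).
import numpy as np

rng = np.random.default_rng(1)
f = lambda x: np.sum(x**2)            # toy objective standing in for makespan
lo, hi, n, dim = -5.0, 5.0, 20, 4
pop = rng.uniform(lo, hi, (n, dim))
fit = np.array([f(x) for x in pop])

for _ in range(200):
    best = pop[fit.argmin()]
    for i in range(n):
        j = rng.choice([k for k in range(n) if k != i])
        # Mutualism: i and j both move toward the best via their mutual vector
        mv = (pop[i] + pop[j]) / 2
        for idx in (i, j):
            bf = rng.integers(1, 3)   # benefit factor, 1 or 2
            cand = np.clip(pop[idx] + rng.random(dim) * (best - mv * bf), lo, hi)
            if f(cand) < fit[idx]:
                pop[idx], fit[idx] = cand, f(cand)
        # Commensalism: i benefits from j, j is unaffected
        cand = np.clip(pop[i] + rng.uniform(-1, 1, dim) * (best - pop[j]), lo, hi)
        if f(cand) < fit[i]:
            pop[i], fit[i] = cand, f(cand)
        # Parasitism: a mutated copy of i tries to displace j
        parasite = pop[i].copy()
        mask = rng.random(dim) < 0.5
        parasite[mask] = rng.uniform(lo, hi, mask.sum())
        if f(parasite) < fit[j]:
            pop[j], fit[j] = parasite, f(parasite)

print(fit.min())                      # approaches 0
```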
A practical workflow for making anatomical atlases for biological research.
Wan, Yong; Lewis, A Kelsey; Colasanto, Mary; van Langeveld, Mark; Kardon, Gabrielle; Hansen, Charles
2012-01-01
The anatomical atlas has been at the intersection of science and art for centuries. These atlases are essential to biological research, but high-quality atlases are often scarce. Recent advances in imaging technology have made high-quality 3D atlases possible. However, until now there has been a lack of practical workflows using standard tools to generate atlases from images of biological samples. With certain adaptations, CG artists' workflow and tools, traditionally used in the film industry, are practical for building high-quality biological atlases. Researchers have developed a workflow for generating a 3D anatomical atlas using accessible artists' tools. They used this workflow to build a mouse limb atlas for studying the musculoskeletal system's development. This research aims to raise the awareness of using artists' tools in scientific research and promote interdisciplinary collaborations between artists and scientists. This video (http://youtu.be/g61C-nia9ms) demonstrates a workflow for creating an anatomical atlas.
Fernald, Douglas; Hamer, Mika; James, Kathy; Tutt, Brandon; West, David
2015-01-01
Family medicine and internal medicine physicians order diagnostic laboratory tests for nearly one-third of patient encounters in an average week, yet among medical errors in primary care, an estimated 15% to 54% are attributed to laboratory testing processes. From a practice improvement perspective, we (1) describe the need for laboratory testing process quality improvements from the perspective of primary care practices, and (2) describe the approaches and resources needed to implement such improvements in practice. We applied practice observations, process mapping, and interviews with primary care practices in the Shared Networks of Colorado Ambulatory Practices and Partners (SNOCAP)-affiliated practice-based research networks that field-tested a laboratory testing process improvement toolkit in 2013. From the data collected in each of the 22 participating practices, common testing quality issues included, but were not limited to, 3 main testing process steps: laboratory test preparation, test tracking, and patient notification. Three overarching qualitative themes emerged: practices readily acknowledge multiple laboratory testing process problems; practices know that they need help addressing the issues; and practices face challenges in finding patient-centered solutions compatible with practice priorities and available resources. While practices were able to get started with guidance and a toolkit to improve laboratory testing processes, most did not seem able to achieve their quality improvement aims unassisted. Providing specific guidance tools with practice facilitation or other rapid-cycle quality improvement support may be an effective approach to improving common laboratory testing issues in primary care. © Copyright 2015 by the American Board of Family Medicine.
Thermo-hydro-mechanical-chemical processes in fractured-porous media: Benchmarks and examples
NASA Astrophysics Data System (ADS)
Kolditz, O.; Shao, H.; Görke, U.; Kalbacher, T.; Bauer, S.; McDermott, C. I.; Wang, W.
2012-12-01
The book comprises an assembly of benchmarks and examples for porous media mechanics collected over the last twenty years. Analysis of thermo-hydro-mechanical-chemical (THMC) processes is essential to many applications in environmental engineering, such as geological waste deposition, geothermal energy utilisation, carbon capture and storage, water resources management, hydrology, and even climate change. In order to assess the feasibility as well as the safety of geotechnical applications, process-based modelling is the only tool for putting numbers on, i.e. quantifying, future scenarios. This places a huge responsibility on the reliability of computational tools. Benchmarking is an appropriate methodology for verifying the quality of modelling tools based on best practices. Moreover, benchmarking and code comparison foster community efforts. The benchmark book is part of the OpenGeoSys initiative - an open source project to share knowledge and experience in environmental analysis and scientific computation.
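As a minimal illustration of the kind of coupled formulation such benchmarks verify, a standard thermo-hydraulic (TH) pair for saturated porous media can be written as below; this textbook form is illustrative, not reproduced from the book.

```latex
% Hydraulic (Darcy flow) and thermal balances with advective coupling:
\begin{align}
  S_s \frac{\partial h}{\partial t} &= \nabla \cdot \left( K \nabla h \right) + Q_h,
  \qquad \mathbf{q} = -K \nabla h, \\
  (\rho c)_m \frac{\partial T}{\partial t} &= \nabla \cdot \left( \lambda_m \nabla T \right)
  - \rho_w c_w \, \mathbf{q} \cdot \nabla T + Q_T,
\end{align}
% where h is hydraulic head, q the Darcy flux, T temperature, S_s specific
% storage, K hydraulic conductivity, (rho c)_m and lambda_m the bulk heat
% capacity and thermal conductivity, and Q_h, Q_T source terms.
```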
Byrnit, Jill T; Høgh-Olesen, Henrik; Makransky, Guido
2015-08-01
All over the world, humans (Homo sapiens) display resource-sharing behavior, and common patterns of sharing seem to exist across cultures. Humans are not the only primates to share, and observations from the wild have long documented food sharing behavior in our closest phylogenetic relatives, chimpanzees (Pan troglodytes) and bonobos (Pan paniscus). However, few controlled studies have been made in which groups of Pan are introduced to food items that may be shared or monopolized by a first food possessor, and very few studies have examined what happens to these sharing patterns if the food in question is a highly attractive, monopolizable food source. The one study to date to include food quality as the independent variable used different types of food as high- and low-value items, making differences in food divisibility and size potentially confounding factors. It was the aim of the present study to examine the sharing behavior of groups of captive chimpanzees and bonobos when introducing the same type of food (branches) manipulated to be of 2 different degrees of desirability (with or without syrup). Results showed that the large majority of food transfers in both species came about as sharing in which group members were allowed to cofeed or remove food from the stock of the food possessor, and the introduction of high-value food resulted in more sharing, not less. Food sharing behavior differed between species in that chimpanzees displayed significantly more begging behavior than bonobos. Bonobos, instead, engaged in sexual invitations, which the chimpanzees never did. (PsycINFO Database Record (c) 2015 APA, all rights reserved).
Does technology help doctors to access, use and share knowledge?
Bullock, Alison
2014-01-01
Given the power and pervasiveness of technology, this paper considers whether it can help doctors to access, use and share knowledge and thus contribute to their ability to uphold the part of the Hippocratic Oath concerned with respecting 'the hard-won scientific gains of those physicians in whose steps I walk' and sharing 'such knowledge as is mine with those who are to follow'. How technology supports connections between doctors and knowledge is considered by focusing on the use of mobile technology in the workplace and Web 2.0 tools. Sfard's 'acquisition' and 'participation' models are employed to help develop an understanding of what these uses of technology mean for learning and knowledge sharing. The employment of technology is not neutral in its effects. Issues relate to knowledge ownership, information overload, quality control and interpretations attached to the use of mobile devices in the workplace. These issues raise deeper questions about the nature of knowledge and social theory and socio-material research questions about the effect of technology on workplace learning. Although the empirical and theoretical evidence presented shows how technology has clear potential to contribute both to accessing evidence and sharing knowledge, there is need for further research that applies theoretical frameworks to the analysis of the impact of technology on workplace learning. © 2013 John Wiley & Sons Ltd.
Anderson, D A; Bankston, K; Stindt, J L; Weybright, D W
2000-09-01
Today's managed care environment is forcing hospitals to seek new and innovative ways to deliver a seamless continuum of high-quality care and services to defined populations at lower costs. Many are striving to achieve this goal through the implementation of shared governance models that support point-of-service decision making, interdisciplinary partnerships, and the integration of work across clinical settings and along the service delivery continuum. The authors describe the key processes and strategies used to facilitate the design and successful implementation of an interdisciplinary shared governance model at The University Hospital, Cincinnati, Ohio. Implementation costs and initial benefits obtained over a 2-year period also are identified.
Photogrammetry on glaciers: Old and new knowledge
NASA Astrophysics Data System (ADS)
Pfeffer, W. T.; Welty, E.; O'Neel, S.
2014-12-01
In the past few decades terrestrial photogrammetry has become a widely used tool for glaciological research, brought about in part by the proliferation of high-quality, low-cost digital cameras, dramatic increases in the image-processing power of computers, and very innovative progress in image processing, much of which has come from computer vision research and from the computer gaming industry. At present, glaciologists have developed their capacity to gather images much further than their ability to process them. Many researchers have accumulated vast inventories of imagery but have no efficient means to extract the data they desire from them. In many cases these are single-image time series, where the processing limitation lies in the paucity of methods for obtaining 3-dimensional object-space information from measurements in the 2-dimensional image space; in other cases camera pairs have been operated, but no automated means is at hand for conventional stereometric analysis of many thousands of image pairs. Often the processing task is further complicated by weak camera geometry or ground control distribution, either of which will compromise the quality of 3-dimensional object-space solutions. Solutions exist for many of these problems, found sometimes among the latest computer vision results and sometimes buried in decades-old, pre-digital terrestrial photogrammetric literature. Other problems, particularly those arising from poorly constrained or underdetermined camera and ground control geometry, may be unsolvable. Small-scale, ground-based photography and photogrammetry of glaciers has grown over the past few decades in an organic and disorganized fashion, with much duplication of effort and little coordination or sharing of knowledge among researchers. Given the utility of terrestrial photogrammetry, its low cost (if properly developed and implemented), and the substantial value of the information to be had from it, some further effort to share knowledge and methods would be a great benefit for the community. We consider some of the main problems to be solved, and aspects of how optimal knowledge sharing might be accomplished.
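The single-image limitation the abstract describes can be seen in a minimal pinhole-camera sketch: projection maps 3D to 2D, so one image fixes only a ray, not a point. The camera parameters below are arbitrary illustrative values.

```python
# Pinhole projection: any point along the same ray gives the same pixel.
import numpy as np

K = np.array([[1500.0, 0.0, 960.0],    # focal length and principal point (px)
              [0.0, 1500.0, 540.0],
              [0.0, 0.0, 1.0]])

X = np.array([120.0, -40.0, 800.0])    # glacier point in camera coordinates (m)
uvw = K @ X
u, v = uvw[:2] / uvw[2]                # image measurement (px)
print(u, v)

# Depth is unobservable from one view: scaled points project identically,
# which is why weak geometry or missing ground control is fatal.
for s in (0.5, 2.0):
    p = K @ (s * X)
    print(p[:2] / p[2])                # same (u, v)
```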
Collins, Sarah A.; Gazarian, Priscilla; Stade, Diana; McNally, Kelly; Morrison, Conny; Ohashi, Kumiko; Lehmann, Lisa; Dalal, Anuj; Bates, David W.; Dykes, Patricia C.
2014-01-01
Patient- and Family-Centered Care (PFCC) is essential for high-quality care in the critical care and acute-specialty hospital setting. Effective PFCC requires clinicians to form an integrated interprofessional team to collaboratively engage with the patient/family and contribute to a shared patient-centered plan of care. We conducted observations on a critical care and a specialty unit to understand the plan-of-care activities and workflow documentation requirements of nurses and physicians, to inform the development of a shared patient-centered plan of care that supports patient engagement. We identified siloed plan-of-care documentation, with workflow opportunities to converge the nurses' plan of care with the physicians' planned to-do lists and quality and safety checklists. Integration of nurses' and physicians' plan-of-care activities into a shared plan of care is a feasible and valuable step toward interprofessional teams that effectively engage patients in plan-of-care activities. PMID:25954345
Cyberinfrastructure for Open Science at the Montreal Neurological Institute
Das, Samir; Glatard, Tristan; Rogers, Christine; Saigle, John; Paiva, Santiago; MacIntyre, Leigh; Safi-Harab, Mouna; Rousseau, Marc-Etienne; Stirling, Jordan; Khalili-Mahani, Najmeh; MacFarlane, David; Kostopoulos, Penelope; Rioux, Pierre; Madjar, Cecile; Lecours-Boucher, Xavier; Vanamala, Sandeep; Adalat, Reza; Mohaddes, Zia; Fonov, Vladimir S.; Milot, Sylvain; Leppert, Ilana; Degroot, Clotilde; Durcan, Thomas M.; Campbell, Tara; Moreau, Jeremy; Dagher, Alain; Collins, D. Louis; Karamchandani, Jason; Bar-Or, Amit; Fon, Edward A.; Hoge, Rick; Baillet, Sylvain; Rouleau, Guy; Evans, Alan C.
2017-01-01
Data sharing is becoming more of a requirement as technologies mature and as global research and communications diversify. As a result, researchers are looking for practical solutions, not only to enhance scientific collaborations, but also to acquire larger amounts of data, and to access specialized datasets. In many cases, the realities of data acquisition present a significant burden, therefore gaining access to public datasets allows for more robust analyses and broadly enriched data exploration. To answer this demand, the Montreal Neurological Institute has announced its commitment to Open Science, harnessing the power of making both clinical and research data available to the world (Owens, 2016a,b). As such, the LORIS and CBRAIN (Das et al., 2016) platforms have been tasked with the technical challenges specific to the institutional-level implementation of open data sharing, including: comprehensive linking of multimodal data (phenotypic, clinical, neuroimaging, biobanking, genomics, etc.); secure database encryption, specifically designed for institutional and multi-project data sharing, ensuring subject confidentiality (using multi-tiered identifiers); querying capabilities with multiple levels of single-study and institutional permissions, allowing public data sharing for all consented and de-identified subject data; configurable pipelines and flags to facilitate acquisition and analysis, as well as access to High Performance Computing clusters for rapid data processing and sharing of software tools; robust workflows and quality control mechanisms ensuring transparency and consistency in best practices; long-term storage (and web access) of data, reducing loss of institutional data assets; enhanced web-based visualization of imaging, genomic, and phenotypic data, allowing for real-time viewing and manipulation of data from anywhere in the world; and numerous modules for data filtering, summary statistics, and personalized and configurable dashboards. Implementing the vision of Open Science at the Montreal Neurological Institute will be a concerted undertaking that seeks to facilitate data sharing for the global research community. Our goal is to utilize the years of experience in multi-site collaborative research infrastructure to implement the technical requirements to achieve this level of public data sharing in a practical yet robust manner, in support of accelerating scientific discovery. PMID:28111547
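A hedged sketch of the multi-tiered identifier pattern mentioned above follows: a site-level key maps the true identifier to an internal ID, and a second keyed hash produces the public release ID, so no single table links subjects to shared data. This illustrates the pattern only, not the LORIS implementation.

```python
# Two-tier keyed hashing for de-identification; keys are stored separately.
import hashlib
import hmac

def tier_id(identifier: str, key: bytes) -> str:
    return hmac.new(key, identifier.encode(), hashlib.sha256).hexdigest()[:12]

site_key, release_key = b"site-secret", b"release-secret"  # hypothetical keys
internal = tier_id("MRN-0042731", site_key)   # hypothetical medical record number
public = tier_id(internal, release_key)
print(internal, public)
```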
Distance Education for Physicians: Adaptation of a Canadian Experience to Uruguay
ERIC Educational Resources Information Center
Llambi, Laura; Margolis, Alvaro; Toews, John; Dapueto, Juan; Esteves, Elba; Martinez, Elisa; Forster, Thais; Lopez, Antonio; Lockyer, Jocelyn
2008-01-01
Introduction: The production of online high-quality continuing professional development is a complex process that demands familiarity with effective program and content design. Collaboration and sharing across nations would appear to be a reasonable way to improve quality, increase access, and reduce costs. Methods: In this case report, the…
U.S. Geological Survey continuous monitoring workshop—Workshop summary report
Sullivan, Daniel J.; Joiner, John K.; Caslow, Kerry A.; Landers, Mark N.; Pellerin, Brian A.; Rasmussen, Patrick P.; Sheets, Rodney A.
2018-04-20
Executive Summary: The collection of high-frequency (in other words, "continuous") water data has been made easier over the years because of advances in technologies to measure, transmit, store, and query large, temporally dense datasets. Commercially available in-situ sensors and data-collection platforms, together with new techniques for data analysis, provide an opportunity to monitor water quantity and quality at the time scales during which meaningful changes occur. The U.S. Geological Survey (USGS) Continuous Monitoring Workshop was held to build stronger collaboration within the Water Mission Area on the collection, interpretation, and application of continuous monitoring data; to share technical approaches for the collection and management of continuous data that improve consistency and efficiency across the USGS; and to explore techniques and tools for the interpretation of continuous monitoring data that increase its value to cooperators and the public. The workshop was organized into three major themes: Collecting Continuous Data, Understanding and Using Continuous Data, and Observing and Delivering Continuous Data in the Future. Presentations each day covered a variety of related topics, with a special session at the end of each day designed to bring discussion and problem solving to the forefront. The workshop brought together more than 70 USGS scientists and managers from across the Water Mission Area and Water Science Centers. Tools to manage, quality-assure, quality-control, and explore large streams of continuous water data are being developed by the USGS and other organizations and will be critical to making full use of these high-frequency data for research and monitoring. Disseminating continuous monitoring data and findings relevant to critical cooperator and societal issues is central to advancing the USGS networks and mission. Several important outcomes emerged from the presentations and breakout sessions.
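One automated check commonly applied to continuous records is spike detection against a rolling median; the sketch below is a generic illustration with hypothetical thresholds and column names, not USGS policy or tooling.

```python
# Flag values that deviate sharply from a 2-hour rolling median.
import pandas as pd

ts = pd.read_csv("sensor_15min.csv", parse_dates=["datetime"],
                 index_col="datetime")["turbidity_fnu"]
med = ts.rolling("2h").median()
resid = (ts - med).abs()
# Adaptive threshold: 5x the day's typical residual, with a small floor.
spikes = resid > 5.0 * resid.rolling("1d").median().clip(lower=0.1)
print(ts[spikes])          # candidate spikes for manual review
```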
MultispeQ Beta: a tool for large-scale plant phenotyping connected to the open PhotosynQ network
Austic, Greg; Zegarac, Robert; Osei-Bonsu, Isaac; Hoh, Donghee; Chilvers, Martin I.; Roth, Mitchell G.; Bi, Kevin; TerAvest, Dan; Weebadde, Prabode; Kramer, David M.
2016-01-01
Large-scale high-throughput plant phenotyping (sometimes called phenomics) is becoming increasingly important in plant biology and agriculture and is essential to the cutting-edge plant breeding and management approaches needed to meet the food and fuel needs of the next century. Currently, the application of these approaches is severely limited by the availability of appropriate instrumentation and by the ability to communicate experimental protocols, results and analyses. To address these issues, we have developed a low-cost yet sophisticated open-source scientific instrument designed to enable communities of researchers, plant breeders, educators, farmers and citizen scientists to collect high-quality field data on a large scale. The MultispeQ provides field or laboratory measurements of both environmental conditions (light intensity and quality, temperature, humidity, CO2 levels, time and location) and useful plant phenotypes, including photosynthetic parameters - photosystem II quantum yield (ΦII), non-photochemical exciton quenching (NPQ), photosystem II photoinhibition, light-driven proton translocation and thylakoid proton motive force, regulation of the chloroplast ATP synthase and potentially many others - and leaf chlorophyll and other pigments. Plant phenotype data are transmitted from the MultispeQ to mobile devices, laptops or desktop computers together with key metadata that are saved to the PhotosynQ platform (https://photosynq.org), which provides a suite of web-based tools for sharing, visualization, filtering, dissemination and analyses. We present validation experiments comparing MultispeQ results with established platforms and show that it can be usefully deployed in both laboratory and field settings. We present evidence that MultispeQ can be used by communities of researchers to rapidly measure, store and analyse multiple environmental and plant properties, allowing for deeper understanding of the complex interactions between plants and their environment. PMID:27853580
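The two headline photosynthetic parameters above have standard fluorometry definitions (Genty-type ΦII and Stern-Volmer NPQ); the sketch below computes them from raw fluorescence values, with made-up example numbers rather than MultispeQ output.

```python
# PhiII and NPQ from pulse-amplitude fluorescence measurements.
def phi_ii(f_s: float, fm_prime: float) -> float:
    """Quantum yield of photosystem II in the light: (Fm' - Fs) / Fm'."""
    return (fm_prime - f_s) / fm_prime

def npq(fm_dark: float, fm_prime: float) -> float:
    """Non-photochemical quenching relative to dark-adapted Fm: Fm/Fm' - 1."""
    return fm_dark / fm_prime - 1.0

print(phi_ii(f_s=820.0, fm_prime=1450.0))    # ~0.43
print(npq(fm_dark=2100.0, fm_prime=1450.0))  # ~0.45
```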
Tisminetzky, Mayra; Bayliss, Elizabeth A; Magaziner, Jay S; Allore, Heather G; Anzuoni, Kathryn; Boyd, Cynthia M; Gill, Thomas M; Go, Alan S; Greenspan, Susan L; Hanson, Leah R; Hornbrook, Mark C; Kitzman, Dalane W; Larson, Eric B; Naylor, Mary D; Shirley, Benjamin E; Tai-Seale, Ming; Teri, Linda; Tinetti, Mary E; Whitson, Heather E; Gurwitz, Jerry H
2017-07-01
To prioritize research topics relevant to the care of the growing population of older adults with multiple chronic conditions (MCCs). Survey of experts in MCC practice, research, and policy. Topics were derived from white papers, funding announcements, or funded research projects relating to older adults with MCCs. Survey conducted through the Health Care Systems Research Network (HCSRN) and Claude D. Pepper Older Americans Independence Centers (OAICs) Advancing Geriatrics Infrastructure and Network Growth Initiative, a joint endeavor of the HCSRN and OAICs. Individuals affiliated with the HCSRN or OAICs and national MCC experts, including individuals affiliated with funding agencies having MCC-related grant portfolios. A "top box" methodology was used, counting the number of respondents selecting the top response on a 5-point Likert scale and dividing by the total number of responses to calculate a top box percentage for each of 37 topics. The highest-ranked research topics relevant to the health and healthcare of older adults with MCCs were health-related quality of life in older adults with MCCs; development of assessment tools (to assess, e.g., symptom burden, quality of life, function); interactions between medications, disease processes, and health outcomes; disability; implementation of novel (and scalable) models of care; association between clusters of chronic conditions and clinical, financial, and social outcomes; role of caregivers; symptom burden; shared decision-making to enhance care planning; and tools to improve clinical decision-making. Study findings serve to inform the development of a comprehensive research agenda to address the challenges relating to the care of this "high-need, high-cost" population and the healthcare delivery systems responsible for serving it. © 2017, Copyright the Authors Journal compilation © 2017, The American Geriatrics Society.
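The "top box" scoring described above is straightforward to reproduce; the sketch below is a minimal illustration, assuming responses are coded 1-5 on the Likert scale with 5 as the top response. The topic labels and ratings shown are hypothetical, not the survey's data.

```python
# Minimal sketch of "top box" scoring: the share of respondents selecting
# the top response (5 on a 1-5 Likert scale) for each candidate topic.
# The topics and ratings below are hypothetical examples.

def top_box_percentage(responses, top=5):
    """Percentage of respondents selecting the top response."""
    responses = [r for r in responses if r is not None]  # drop missing answers
    if not responses:
        return 0.0
    return 100.0 * sum(1 for r in responses if r == top) / len(responses)

# Hypothetical ratings for two of the 37 candidate topics
ratings = {
    "health-related quality of life": [5, 5, 4, 5, 3, 5, 4],
    "role of caregivers": [4, 3, 5, 4, 2, 4, 5],
}
ranked = sorted(ratings, key=lambda t: top_box_percentage(ratings[t]), reverse=True)
for topic in ranked:
    print(f"{topic}: {top_box_percentage(ratings[topic]):.1f}%")
```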
Alexander, Gregory L; Pasupathy, Kalyan S; Steege, Linsey M; Strecker, E Bradley; Carley, Kathleen M
2014-08-01
The role of nursing home (NH) information technology (IT) in quality improvement has not been clearly established, and its impacts on communication between caregivers and on patient outcomes in these settings deserve further attention. In this research, we describe a mixed-methods approach to explore communication strategies used by healthcare providers for resident skin risk in NHs with high IT sophistication (ITS). The sample included NHs participating in the statewide survey of ITS. We incorporated rigorous observation of 8- and 12-h shifts, and focus groups to identify how NH IT and a range of synchronous and asynchronous tools are used. Social network analysis tools and qualitative analysis were used to analyze data and identify relationships between ITS dimensions and communication interactions between care providers. Two of the nine ITS dimensions (resident care-technological and administrative activities-technological) and total ITS were significantly negatively correlated with the number of unique interactions. The more processes in resident care and administrative activities are supported by technology, the lower the number of observed unique interactions. Additionally, four thematic areas emerged from staff focus groups that demonstrate how important IT is to resident care in these facilities, including providing resident-centered care, teamwork and collaboration, maintaining safety and quality, and using standardized information resources. Our findings in this study confirm prior research that as technology support (resident care and administrative activities) and overall ITS increase, observed interactions between staff members decrease. Conversations during staff interviews focused on how technology facilitated resident-centered care through enhanced information sharing, greater virtual collaboration between team members, and improved care delivery. These results provide evidence for improving the design and implementation of IT in long-term care systems to support communication and associated resident outcomes. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Van der Wees, Philip; Qaseem, Amir; Kaila, Minna; Ollenschlaeger, Guenter; Rosenfeld, Richard
2012-02-09
Clinical practice and public health guidelines are important tools for translating research findings into practice with the aim of assisting health practitioners as well as patients and consumers in health behavior and healthcare decision-making. Numerous programs for guideline development exist around the world, with growing international collaboration to improve their quality. One of the key features in developing trustworthy guidelines is that recommendations should be based on high-quality systematic reviews of the best available evidence. The review process used by guideline developers to identify and grade relevant evidence for developing recommendations should be systematic, transparent and unbiased. In this paper, we provide an overview of current international developments in the field of practice guidelines and methods to develop guidelines, with a specific focus on the role of systematic reviews. The Guidelines International Network (G-I-N) aims to stimulate collaboration between guideline developers and systematic reviewers to optimize the use of available evidence in guideline development and to increase efficiency in the guideline development process. Considering the significant benefit of systematic reviews for the guideline community, the G-I-N Board of Trustees supports the international prospective register of systematic reviews (PROSPERO) initiative. G-I-N also recently launched a Data Extraction Resource (GINDER) to present and share data extracted from individual studies in a standardized template. PROSPERO and GINDER are complementary tools to enhance collaboration between guideline developers and systematic reviewers to allow for alignment of activities and a reduction in duplication of effort.
Micro-optical fabrication by ultraprecision diamond machining and precision molding
NASA Astrophysics Data System (ADS)
Li, Hui; Li, Likai; Naples, Neil J.; Roblee, Jeffrey W.; Yi, Allen Y.
2017-06-01
Ultraprecision diamond machining combined with high-volume molding is becoming a viable process in the optical industry for low-cost manufacturing of affordable, high-precision, high-performance micro-optical components. In this process, high-precision micro-optical molds are first fabricated using ultraprecision single-point diamond machining, followed by high-volume production methods such as compression or injection molding. In the last two decades, there have been steady improvements in ultraprecision machine design and performance, particularly with the introduction of both slow tool and fast tool servo. Today optical molds, including freeform surfaces and microlens arrays, are routinely diamond machined to final finish without post-machining polishing. For consumers, compression molding and injection molding provide efficient, high-quality optics at extremely low cost. In this paper, ultraprecision machine design and machining processes such as slow tool and fast tool servo are first described; then both compression molding and injection molding of polymer optics are discussed. To implement precision optical manufacturing by molding, numerical modeling can be included in the future as a critical part of the manufacturing process to ensure high product quality.
NASA Astrophysics Data System (ADS)
Akhavan Niaki, Farbod
The objective of this research is first to investigate the applicability and advantage of statistical state estimation methods, over deterministic methods, for predicting tool wear in machining nickel-based superalloys, and second to study the effects of cutting tool wear on the quality of the part. Nickel-based superalloys are among the classes of materials known as hard-to-machine alloys. These materials exhibit a unique combination of retained strength at high temperature and high resistance to corrosion and creep. These characteristics make them ideal candidates for harsh environments like the combustion chambers of gas turbines. However, the same characteristics that make nickel-based alloys suitable for aggressive conditions introduce difficulties when machining them. High strength and low thermal conductivity accelerate cutting tool wear and increase the possibility of in-process tool breakage. A blunt tool deteriorates the surface integrity and damages the quality of the machined part by inducing high tensile residual stresses, generating micro-cracks, altering the microstructure, or leaving a poor roughness profile behind. In such cases, the expensive superalloy part would have to be scrapped. The current dominant industrial solution is to sacrifice the productivity rate by replacing the tool in the early stages of its life, or to choose conservative cutting conditions in order to lower the wear rate and preserve workpiece quality. Thus, monitoring the state of the cutting tool and estimating its effects on part quality is a critical task for increasing productivity and profitability in machining superalloys. This work aims first to introduce a probabilistic framework for estimating tool wear in milling and turning of superalloys, and second to study the detrimental effects of the functional state of the cutting tool, in terms of wear and wear rate, on part quality. In the milling operation, the mechanisms of tool failure were first identified and, based on the rapid catastrophic failure of the tool, a Bayesian inference method (Markov Chain Monte Carlo, MCMC) was used for parameter calibration of tool wear using a mechanistic power model. The calibrated model was then used in the state-space probabilistic framework of a Kalman filter to estimate tool flank wear. Furthermore, an on-machine laser measuring system was utilized and fused into the Kalman filter to improve the estimation accuracy. In the turning operation, the behavior of progressive wear was investigated as well. Due to the nonlinear nature of wear in turning, an extended Kalman filter was designed for tracking progressive wear, and the results of the probabilistic method were compared with a deterministic technique, where significant improvement (more than a 60% increase in estimation accuracy) was achieved. To fulfill the second objective of this research, understanding the underlying effects of wear on part quality in cutting nickel-based superalloys, a comprehensive study of surface roughness, dimensional integrity and residual stress was conducted. The estimated results derived from the probabilistic filter were used to find the proper correlations between wear, surface roughness and dimensional integrity, along with a finite element simulation for predicting the residual stress profile under sharp and worn cutting tool conditions.
The output of this research provides essential information on condition monitoring of the tool and its effects on product quality. The low-cost Hall effect sensor used in this work to capture spindle power, in the context of the stochastic filter, can effectively estimate tool wear in both milling and turning operations, while the estimated wear can be used to generate knowledge of the state of workpiece surface integrity. Therefore, the true functionality and efficiency of the tool in superalloy machining can be evaluated without additional high-cost sensing.
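As a rough illustration of the state-estimation approach described above, the following sketch runs a scalar extended Kalman filter that tracks flank wear from spindle-power readings. The wear-growth law, the power-vs-wear measurement model, and every constant here are hypothetical stand-ins, not the calibrated mechanistic model from this work.

```python
# Illustrative scalar extended Kalman filter for flank-wear tracking.
# All models and constants are hypothetical, for demonstration only.
import numpy as np

a, b = 0.02, 8.0            # hypothetical wear-rate parameters (mm/min, 1/mm)
alpha, beta = 4.0, 1.5      # hypothetical power-vs-wear coefficients (kW/mm, kW)
dt, Q, R = 0.5, 1e-6, 0.05  # time step (min), process and measurement noise

def f(w):                   # nonlinear wear growth over one time step
    return w + dt * a * np.exp(b * w)

def h(w):                   # spindle power expected at wear level w
    return alpha * w + beta

w_est, P = 0.0, 0.01        # initial wear estimate (mm) and its variance
for z in [1.6, 1.7, 1.9, 2.2, 2.6]:        # hypothetical power readings (kW)
    # Predict: propagate state and variance through the linearized model
    F = 1.0 + dt * a * b * np.exp(b * w_est)   # df/dw
    w_pred = f(w_est)
    P = F * P * F + Q
    # Update: correct with the power measurement
    H = alpha                                   # dh/dw
    K = P * H / (H * P * H + R)
    w_est = w_pred + K * (z - h(w_pred))
    P = (1.0 - K * H) * P
    print(f"power {z:.2f} kW -> estimated flank wear {w_est:.3f} mm")
```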
Expert consensus on best evaluative practices in community-based rehabilitation.
Grandisson, Marie; Thibeault, Rachel; Hébert, Michèle; Cameron, Debra
2016-01-01
The objective of this study was to generate expert consensus on best evaluative practices for community-based rehabilitation (CBR). This consensus includes key features of the evaluation process and methods, and discussion of whether a shared framework should be used to report findings and, if so, which framework should play this role. A Delphi study with two predefined rounds was conducted. Experts in CBR from a wide range of geographical areas and disciplinary backgrounds were recruited to complete the questionnaires. Both quantitative and qualitative analyses were performed to generate the recommendations for best practices in CBR evaluation. A panel of 42 experts reached consensus on 13 recommendations for best evaluative practices in CBR. In regard to the critical qualities of sound CBR evaluation processes, panellists emphasized that these processes should be inclusive, participatory, empowering and respectful of local cultures and languages. The group agreed that evaluators should consider the use of mixed methods and participatory tools, and should combine indicators from a universal list of CBR indicators with locally generated ones. The group also agreed that a common framework should guide CBR evaluations, and that this framework should be a flexible combination between the CBR Matrix and the CBR Principles. An expert panel reached consensus on key features of best evaluative practices in CBR. Knowledge transfer initiatives are now required to develop guidelines, tools and training opportunities to facilitate CBR program evaluations. CBR evaluation processes should strive to be inclusive, participatory, empowering and respectful of local cultures and languages. CBR evaluators should strongly consider using mixed methods, participatory tools, a combination of indicators generated with the local community and with others from a bank of CBR indicators. CBR evaluations should be situated within a shared, but flexible, framework. This shared framework could combine the CBR Matrix and the CBR Principles.
Reviews of theoretical frameworks: Challenges and judging the quality of theory application.
Hean, Sarah; Anderson, Liz; Green, Chris; John, Carol; Pitt, Richard; O'Halloran, Cath
2016-06-01
Rigorous reviews of available information, from a range of resources, are required to support medical and health educators in their decision-making. The aim of this article is to highlight the importance of a review of theoretical frameworks specifically as a supplement to reviews that focus on a synthesis of the empirical evidence alone. Establishing a shared understanding of theory as a concept is highlighted as a challenge, and some practical strategies for achieving this are presented. This article also introduces the concept of theoretical quality, arguing that a critique of how theory is applied should complement the methodological appraisal of the literature in a review. We illustrate the challenge of establishing a shared meaning of theory through reference to experiences of an ongoing review of this kind conducted in the field of interprofessional education (IPE) and use a high-scoring paper selected in this review to illustrate how theoretical quality can be assessed. In reaching a shared understanding of theory as a concept, practical strategies that promote experiential and practical ways of knowing are required in addition to more propositional ways of sharing knowledge. Concepts of parsimony, testability, operational adequacy and empirical adequacy are explored as criteria that establish theoretical quality. Reviews of theoretical frameworks used in medical education are required to inform educational practice. Review teams should make time and effort to reach a shared understanding of the term theory. Theory reviews, and reviews more widely, should add an assessment of theory application to the protocol of their review method.
Porter, Mark W; Porter, Mark William; Milley, David; Oliveti, Kristyn; Ladd, Allen; O'Hara, Ryan J; Desai, Bimal R; White, Peter S
2008-11-06
Flexible, highly accessible collaboration tools can inherently conflict with controls placed on information sharing by offices charged with privacy protection, compliance, and maintenance of the general business environment. Our implementation of a commercial enterprise wiki within the academic research environment addresses concerns of all involved through the development of a robust user training program, a suite of software customizations that enhance security elements, a robust auditing program, allowance for inter-institutional wiki collaboration, and wiki-specific governance.
Design and Testing of a Tool for Evaluating the Quality of Diabetes Consumer-Information Web Sites
Steinwachs, Donald; Rubin, Haya R
2003-01-01
Background Most existing tools for measuring the quality of Internet health information focus almost exclusively on structural criteria or other proxies for quality information rather than evaluating actual accuracy and comprehensiveness. Objective This research sought to develop a new performance-measurement tool for evaluating the quality of Internet health information, test the validity and reliability of the tool, and assess the variability in diabetes Web site quality. Methods An objective, systematic tool was developed to evaluate Internet diabetes information based on a quality-of-care measurement framework. The principal investigator developed an abstraction tool and trained an external reviewer on its use. The tool included 7 structural measures and 34 performance measures created by using evidence-based practice guidelines and experts' judgments of accuracy and comprehensiveness. Results Substantial variation existed in all categories, with overall scores following a normal distribution and ranging from 15% to 95% (mean was 50% and median was 51%). Lin's concordance correlation coefficient to assess agreement between raters produced a rho of 0.761 (Pearson's r of 0.769), suggesting moderate to high agreement. The average agreement between raters for the performance measures was 0.80. Conclusions Diabetes Web site quality varies widely. Alpha testing of this new tool suggests that it could become a reliable and valid method for evaluating the quality of Internet health sites. Such an instrument could help lay people distinguish between beneficial and misleading information. PMID:14713658
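Lin's concordance correlation coefficient reported above penalizes both poor correlation and systematic offset between raters, unlike Pearson's r. A minimal sketch, using hypothetical paired site scores rather than the study's data, is:

```python
# Minimal sketch of Lin's concordance correlation coefficient (CCC):
# CCC = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2).
# The paired rater scores below are hypothetical.
import numpy as np

def lins_ccc(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxy = np.mean((x - x.mean()) * (y - y.mean()))   # population covariance
    return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

rater_a = [15, 32, 48, 51, 60, 72, 95]   # hypothetical Web site scores (%)
rater_b = [20, 30, 50, 49, 66, 70, 88]
print(f"CCC = {lins_ccc(rater_a, rater_b):.3f}")
print(f"Pearson r = {np.corrcoef(rater_a, rater_b)[0, 1]:.3f}")
```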
A Participative Tool for Sharing, Annotating and Archiving Submarine Video Data
NASA Astrophysics Data System (ADS)
Marcon, Y.; Kottmann, R.; Ratmeyer, V.; Boetius, A.
2016-02-01
Oceans cover more than 70 percent of the Earth's surface and are known to play an essential role in all of the Earth's systems and cycles. However, less than 5 percent of the ocean bottom has been explored, and many aspects of the deep-sea world remain poorly understood. Increasing our ocean literacy is a necessity in order for specialists and non-specialists to better grasp the roles of the ocean in the Earth system, its resources, and the impact of human activities on it. Due to technological advances, deep-sea research produces ever-increasing amounts of scientific video data. However, using such data for science communication and public outreach purposes remains difficult, as tools for accessing and sharing such scientific data are often lacking. Indeed, there is no common solution for the management and analysis of marine video data, which are often scattered across multiple research institutes or working groups, and it is difficult to get an overview of the whereabouts of those data. The VIDLIB Deep-Sea Video Platform is a web-based tool for sharing and annotating time-coded deep-sea video data. VIDLIB provides a participatory way to share and analyze video data. Scientists can share expert knowledge for video analysis without the need to upload or download large video files. VIDLIB also offers streaming capabilities and has potential for participatory science and science communication in that non-specialists can ask questions about what they see and get answers from scientists. Such a tool is highly valuable for scientific public outreach and popular science, as video data are by far the most efficient way to communicate scientific findings to a non-expert public. VIDLIB is being used for studying the impact of deep-sea mining on benthic communities as well as for exploration in polar regions. We will present the structure and workflow of VIDLIB as well as an example of video analysis. VIDLIB (http://vidlib.marum.de) is funded by the EU EUROFLEET project and the Helmholtz Alliance ROBEX.
Shared Care: A Quality Improvement Initiative to Optimize Primary Care Management of Constipation
Vernacchio, Louis; Trudell, Emily; Antonelli, Richard; Nurko, Samuel; Leichtner, Alan M.; Lightdale, Jenifer R.
2015-01-01
BACKGROUND: Pediatric constipation is commonly managed in the primary care setting, where there is much variability in management and specialty referral use. Shared Care is a collaborative quality improvement initiative between Boston Children's Hospital and the Pediatric Physicians' Organization at Children's (PPOC), through which subspecialists provide primary care providers with education, decision-support tools, pre-referral management recommendations, and access to advice. We investigated whether Shared Care reduces referrals and improves adherence to established clinical guidelines. METHODS: We reviewed the primary care management of patients 1 to 18 years old seen by a Boston Children's Hospital gastroenterologist and diagnosed with constipation who were referred from PPOC practices in the 6 months before and after implementation of Shared Care. Charts were assessed for patient factors and key components of management. We also tracked referral rates for all PPOC patients for 29 months before implementation and 19 months after implementation. RESULTS: Fewer active patients in the sample were referred after implementation (61/27 365 [0.22%] vs 90/27 792 [0.36%], P = .003). The duration of pre-referral management increased, and the rate of fecal impaction decreased after implementation. No differences were observed in documentation of key management recommendations. Analysis of medical claims showed no statistically significant change in referrals. CONCLUSIONS: A multifaceted initiative to support primary care management of constipation can alter clinical care, but changes in referral behavior and pre-referral management may be difficult to detect and sustain. Future efforts may benefit from novel approaches to provider engagement and systems integration. PMID:25896837
Raising Virtual Laboratories in Australia onto global platforms
NASA Astrophysics Data System (ADS)
Wyborn, L. A.; Barker, M.; Fraser, R.; Evans, B. J. K.; Moloney, G.; Proctor, R.; Moise, A. F.; Hamish, H.
2016-12-01
Across the globe, Virtual Laboratories (VLs), Science Gateways (SGs), and Virtual Research Environments (VREs) are being developed that enable users who are not co-located to actively work together at various scales to share data, models, tools, software, workflows, best practices, etc. Outcomes range from enabling `long tail' researchers to more easily access specific data collections, to facilitating complex workflows on powerful supercomputers. In Australia, government funding has facilitated the development of a range of VLs through the National eResearch Collaborative Tools and Resources (NeCTAR) program. The VLs provide highly collaborative, research-domain-oriented, integrated software infrastructures that meet user community needs. Twelve VLs have been funded since 2012, including the Virtual Geophysics Laboratory (VGL); Virtual Hazards, Impact and Risk Laboratory (VHIRL); Climate and Weather Science Laboratory (CWSLab); Marine Virtual Laboratory (MarVL); and Biodiversity and Climate Change Virtual Laboratory (BCCVL). These VLs share similar technical challenges, with common issues emerging around the integration of tools and applications and access to data collections via both cloud-based environments and other distributed resources. While each VL began with a focus on a specific research domain, communities of practice have now formed across the VLs around common issues and facilitate the identification of best-practice case studies and new standards. As a result, tools are now being shared where the VLs access data via data services using international standards such as ISO, OGC, and W3C. The sharing of these approaches is starting to facilitate re-usability of infrastructure and is a step towards supporting interdisciplinary research. Whilst the focus of the VLs is Australia-centric, by using standards these environments are able to be extended to analysis of other international datasets. Many VL datasets are subsets of global datasets, so extension to global coverage is a small (and often requested) step. Similarly, most of the tools, software, and other technologies could be shared across infrastructures globally. Therefore, it is now time to better connect the Australian VLs with similar initiatives elsewhere to create international platforms that can contribute to global research challenges.
Code of Federal Regulations, 2010 CFR
2010-01-01
... designated management agency will annually set maximum individual BMP cost-share levels for the project area... offsite water quality, and (2) The matching share requirements would place a burden on the landowner or... shared must have a positive effect on water quality by reducing the amount of agricultural nonpoint...
Code of Federal Regulations, 2011 CFR
2011-01-01
... designated management agency will annually set maximum individual BMP cost-share levels for the project area... offsite water quality, and (2) The matching share requirements would place a burden on the landowner or... shared must have a positive effect on water quality by reducing the amount of agricultural nonpoint...
He Asked Me What!?--Using Shared Online Accounts as Training Tools for Distance Learning Librarians
ERIC Educational Resources Information Center
Robinson, Kelly; Casey, Anne Marie; Citro, Kathleen
2017-01-01
This study explores the idea of creating a knowledge base from shared online accounts to use in training librarians who perform distance reference services. Through a survey, follow-up interviews and a case study, the investigators explored current and potential use of shared online accounts as training tools. This study revealed that the…
NASA Technical Reports Server (NTRS)
Fatig, Michael
1993-01-01
Flight operations and the preparation for it have become increasingly complex as mission complexities increase. Further, the mission model dictates that a significant increase in flight operations activities is upon us. Finally, there is a need for process improvement and economy in the operations arena. It is therefore time that we recognize flight operations as a complex process requiring a defined, structured, life-cycle approach vitally linked to space segment, ground segment, and science operations processes. With this recognition, an FOT Tool Kit was developed, consisting of six major components designed to provide tools to guide flight operations activities throughout the mission life cycle. The major components of the FOT Tool Kit and the concepts behind the flight operations life cycle process as developed at NASA's GSFC for GSFC-based missions are addressed. The Tool Kit is therefore intended to improve the productivity, quality, cost, and schedule performance of flight operations tasks through the use of documented, structured methodologies; knowledge of past lessons learned and upcoming new technology; and reuse and sharing of key products and special application programs made possible through the development of standardized key products and special program directories.
Pathway Tools version 13.0: integrated software for pathway/genome informatics and systems biology
Paley, Suzanne M.; Krummenacker, Markus; Latendresse, Mario; Dale, Joseph M.; Lee, Thomas J.; Kaipa, Pallavi; Gilham, Fred; Spaulding, Aaron; Popescu, Liviu; Altman, Tomer; Paulsen, Ian; Keseler, Ingrid M.; Caspi, Ron
2010-01-01
Pathway Tools is a production-quality software environment for creating a type of model-organism database called a Pathway/Genome Database (PGDB). A PGDB such as EcoCyc integrates the evolving understanding of the genes, proteins, metabolic network and regulatory network of an organism. This article provides an overview of Pathway Tools capabilities. The software performs multiple computational inferences including prediction of metabolic pathways, prediction of metabolic pathway hole fillers and prediction of operons. It enables interactive editing of PGDBs by DB curators. It supports web publishing of PGDBs, and provides a large number of query and visualization tools. The software also supports comparative analyses of PGDBs, and provides several systems biology analyses of PGDBs including reachability analysis of metabolic networks, and interactive tracing of metabolites through a metabolic network. More than 800 PGDBs have been created using Pathway Tools by scientists around the world, many of which are curated DBs for important model organisms. Those PGDBs can be exchanged using a peer-to-peer DB sharing system called the PGDB Registry. PMID:19955237
Magdon-Ismail, Zainab; Benesch, Curtis; Cushman, Jeremy T; Brissette, Ian; Southerland, Andrew M; Brandler, Ethan S; Sozener, Cemal B; Flor, Sue; Hemmitt, Roseanne; Wales, Kathleen; Parrigan, Krystal; Levine, Steven R
2017-07-01
The American Heart Association/American Stroke Association and Department of Health Stroke Coverdell Program convened a stakeholder meeting in upstate NY to develop recommendations to enhance stroke systems for acute large vessel occlusion. Prehospital, hospital, and Department of Health leadership were invited (n=157). Participants provided goals/concerns and developed recommendations for prehospital triage and interfacility transport, rating each using a 3-level impact (A [high], B, and C [low]) and implementation feasibility (1 [high], 2, and 3 [low]) scale. Six weeks later, participants finalized recommendations. Seventy-one stakeholders (45% of invitees) attended. Six themes around goals/concerns emerged: (1) emergency medical services capacity, (2) validated prehospital screening tools, (3) facility capability, (4) triage/transport guidelines, (5) data capture/feedback tools, and (6) facility competition. In response, high-impact (level A) prehospital recommendations, stratified by implementation feasibility, were (1) use of online medical control for triage (6%); (2) regional transportation strategy (31%), standardized emergency medical services checklists (18%), quality metrics (14%), standardized prehospital screening tools (13%), and feedback for performance improvement (7%); and (3) smartphone application algorithm for screening/decision-making (6%) and ambulance-based telemedicine (6%). Level A interfacility transfer recommendations were (1) standardized transfer process (32%)/timing goals (16%)/regionalized systems (11%), performance metrics (11%), image sharing capabilities (7%); (2) provider education (9%) and stroke toolbox (5%); and (3) interfacility telemedicine (7%) and feedback (2%). The methods used and recommendations generated provide models for stroke system enhancement. Implementation may vary based on geographic need/capacity and be contingent on establishing standard care practices. Further research is needed to establish optimal implementation strategies. © 2017 American Heart Association, Inc.
Teno, Joan M; Mor, Vincent; Ward, Nicholas; Roy, Jason; Clarridge, Brian; Wennberg, John E; Fisher, Elliott S
2005-11-01
To compare the quality of end-of-life care of persons dying in regions of differing practice intensity. Mortality follow-back survey. Geographic regions in the highest and lowest deciles of intensive care unit (ICU) use. Bereaved family member or other knowledgeable informants. Unmet needs, concerns, and rating of quality of end-of-life care in five domains (physical comfort and emotional support of the decedent, shared decision-making, treatment of the dying person with respect, providing information and emotional support to family members). Decedents in high- (n=365) and low-intensity (n=413) hospital service areas (HSAs) did not differ in age, sex, education, marital status, leading causes of death, or the degree to which death was expected, but those in the high-intensity ICU HSAs were more likely to be black and to live in nonrural areas. Respondents in high-intensity HSAs were more likely to report that care was of lower quality in each domain, and these differences were statistically significant in three of five domains. Respondents from high-intensity HSAs were more likely to report inadequate emotional support for the decedent (relative risk (RR)=1.2, 95% confidence interval (CI)=1.0-1.4), concerns with shared decision-making (RR=1.8, 95% CI=1.0-2.9), inadequate information about what to expect (RR=1.5, 95% CI=1.3-1.8), and failure to treat the decedent with respect (RR=1.4, 95% CI=1.0-1.9). Overall ratings of the quality of end-of-life care were also significantly lower in high-intensity HSAs. Dying in regions with a higher use of ICU care is not associated with improved perceptions of quality of end-of-life care.
The State of Software for Evolutionary Biology
Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros
2018-01-01
With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software. All widely used core tools in the field have grown considerably in terms of the number of features as well as lines of code and, consequently, also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have potential negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder, etc.) and JAVA (BEAST) from the broader area of evolutionary biology that are being routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfying, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high-quality scientific software from scratch. Finally, we also discuss journal as well as science policy and, more importantly, funding issues that need to be addressed for improving software engineering quality as well as ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development. PMID:29385525
Tools for Local and Distributed Climate Data Access
NASA Astrophysics Data System (ADS)
Schweitzer, R.; O'Brien, K.; Burger, E. F.; Smith, K. M.; Manke, A. B.; Radhakrishnan, A.; Balaji, V.
2017-12-01
Last year we reported on our efforts to adapt existing tools to facilitate model development. During the lifecycle of a Climate Model Intercomparison Project (CMIP), data must be quality controlled before it can be published and studied. Like previous efforts, the upcoming CMIP6 will produce an unprecedented volume of data. For an institution, modelling group or modeller, the volume of data is unmanageable without tools that organize and automate as many processes as possible. Even if a modelling group has tools for data and metadata management, it often falls on individuals to do the initial quality assessment for a model run with bespoke tools. Using individually crafted tools can lead to interruptions when project personnel change and may result in inconsistencies and duplication of effort across groups. This talk will expand on our experiences using available tools (Ferret/PyFerret, the Live Access Server, the GFDL Curator, the GFDL Model Development Database Interface and the THREDDS Data Server) to seamlessly automate the data assembly process to give users "one-click" access to a rich suite of Web-based analysis and comparison tools. On the surface, it appears that this collection of tools is well suited to the task, but our experience of the last year taught us that the data volume and distributed storage add a number of challenges in adapting the tools for this task. Quality control and initial evaluation add their own set of challenges. We will discuss how we addressed the needs of QC researchers by expanding standard tools to include specialized plots, and how we leveraged the configurability of the tools to add specific user-defined analysis operations so they are available to everyone using the system. We also report on our efforts to overcome some of the technical barriers to wide adoption of the tools by providing pre-built containers that are easily deployed in virtual machine and cloud environments. Finally, we will offer some suggestions for added features, configuration options and improved robustness that can make future implementations of similar systems operate faster and more reliably. Solving these challenges for data sets distributed narrowly across networks and storage systems points the way to solving similar problems associated with sharing data distributed across institutions and continents.
Using Feedback from Data Consumers to Capture Quality Information on Environmental Research Data
NASA Astrophysics Data System (ADS)
Devaraju, A.; Klump, J. F.
2015-12-01
Data quality information is essential to facilitate reuse of Earth science data. Recorded quality information must be sufficient for other researchers to select suitable data sets for their analysis and confirm the results and conclusions. In the research data ecosystem, several entities are responsible for data quality. Data producers (researchers and agencies) play a major role in this aspect as they often include validation checks or data cleaning as part of their work. It is possible that quality information is not supplied with published data sets; even when it is available, the descriptions might be incomplete, ambiguous or address only specific quality aspects. Data repositories have built infrastructures to share data, but not all of them assess data quality; they normally provide guidelines for documenting quality information. Some suggest that scholarly and data journals should take a role in ensuring data quality by involving reviewers to assess data sets used in articles and incorporating data quality criteria in the author guidelines. However, this mechanism primarily addresses data sets submitted to journals. We believe that data consumers will complement existing entities in assessing and documenting the quality of published data sets, an approach already adopted in crowd-sourced platforms such as Zooniverse, OpenStreetMap, Wikipedia, Mechanical Turk and Tomnod. This paper presents a framework, designed on open-source tools, to capture and share data users' feedback on the application and assessment of research data. The framework comprises a browser plug-in, a web service and a data model such that feedback can be easily reported, retrieved and searched. The feedback records are also made available as Linked Data to promote integration with other sources on the Web. Vocabularies from Dublin Core and PROV-O are used to clarify the source and attribution of feedback. The application of the framework is illustrated with the CSIRO's Data Access Portal.
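To make the Linked Data idea concrete, the sketch below emits a feedback record as JSON-LD using the Dublin Core and PROV-O vocabularies named above. The property choices, identifiers, and dataset URI are illustrative assumptions, not the framework's actual data model.

```python
# Illustrative shape of a user-feedback record as Linked Data, using the
# Dublin Core (dct) and PROV-O (prov) vocabularies mentioned in the abstract.
# The identifiers, dataset URI, and property choices are hypothetical.
import json

feedback = {
    "@context": {
        "dct": "http://purl.org/dc/terms/",
        "prov": "http://www.w3.org/ns/prov#",
    },
    "@id": "urn:example:feedback/42",
    "@type": "prov:Entity",
    # Dataset being assessed (hypothetical URI)
    "dct:subject": "https://data.example.org/collections/example-dataset",
    "dct:description": "Cloud mask looks unreliable over coastal pixels.",
    "dct:created": "2015-08-01",
    # Attribution of the feedback to its author
    "prov:wasAttributedTo": {"@id": "urn:example:user/jdoe", "@type": "prov:Agent"},
}
print(json.dumps(feedback, indent=2))
```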
Design of Scalable and Effective Earth Science Collaboration Tool
NASA Astrophysics Data System (ADS)
Maskey, M.; Ramachandran, R.; Kuo, K. S.; Lynnes, C.; Niamsuwan, N.; Chidambaram, C.
2014-12-01
Collaborative research is growing rapidly. Many tools, including IDEs, are now beginning to incorporate new collaborative features. Software engineering research has shown the effectiveness of collaborative programming and analysis; in particular, drastic reductions in software development time, resulting in reduced cost, have been highlighted. Recently, we have witnessed the rise of applications that allow users to share their content. Most of these applications scale such collaboration using cloud technologies. Earth science research needs to adopt collaboration technologies to reduce redundancy, cut cost, expand the knowledge base, and scale research experiments. To address these needs, we developed the Earth science collaboration workbench (CWB). CWB provides researchers with various collaboration features by augmenting their existing analysis tools to minimize the learning curve. During the development of the CWB, we found that Earth science collaboration tasks are varied, and we concluded that it is not possible to design a tool that serves all collaboration purposes. We adopted a mix of synchronous and asynchronous sharing methods that can be used to perform collaboration across time and location dimensions. We have used cloud technology for scaling the collaboration. The cloud has been a highly utilized and valuable tool for Earth science researchers. Among other uses, the cloud serves for sharing research results, Earth science data, and virtual machine images, allowing CWB to create and maintain research environments and networks that enhance collaboration between researchers. Furthermore, the collaborative versioning tool Git is integrated into CWB for versioning of science artifacts. In this paper, we present our experience in designing and implementing the CWB. We will also discuss the integration of collaborative code development use cases for data search and discovery using NASA DAAC and simulation of satellite observations using the NASA Earth Observing System Simulation Suite (NEOS3).
Streufert, Ben; Reed, Shelby D; Orlando, Lori A; Taylor, Dean C; Huber, Joel C; Mather, Richard C
2017-03-01
Although surgical management of a first-time anterior shoulder dislocation (FTASD) can reduce the risk of recurrent dislocation, other treatment characteristics, costs, and outcomes are important to patients considering treatment options. While patient preferences, such as those elicited by conjoint analysis, have been shown to be important in medical decision-making, the magnitudes or effects of patient preferences in treating an FTASD are unknown. To test a novel shared decision-making tool after a sustained FTASD. Specifically measured were the following: (1) importance of aspects of operative versus nonoperative treatment, (2) respondents' agreement with results generated by the tool, (3) willingness to share these results with physicians, and (4) association of results with choice of treatment after FTASD. Cross-sectional study; Level of evidence, 3. A tool was designed and tested using members of Amazon Mechanical Turk, an online panel. The tool included an adaptive conjoint analysis exercise, a method to understand individuals' perceived importance of the following attributes of treatment: (1) chance of recurrent dislocation, (2) cost, (3) short-term limits on shoulder motion, (4) limits on participation in high-risk activities, and (5) duration of physical therapy. Respondents then chose between operative and nonoperative treatment for hypothetical shoulder dislocation. Overall, 374 of 501 (75%) respondents met the inclusion criteria, of whom most were young, active males; one-third reported prior dislocation. From the conjoint analysis, the importance of recurrent dislocation and cost of treatment were the most important attributes. A substantial majority agreed with the tool's ability to generate representative preferences and indicated that they would share these preferences with their physician. Importance of recurrence proved significantly predictive of respondents' treatment choices, independent of sex or age; however, activity level was important to previous dislocators. A total of 125 (55%) males and 33 (23%) females chose surgery after FTASD, as did 37% of previous dislocators compared with 45% of nondislocators. When given thorough information about the risks and benefits, respondents had strong preferences for operative treatment after an FTASD. Respondents agreed with the survey results and wanted to share the information with providers. Recurrence was the most important attribute and played a role in decisions about treatment.
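Conjoint analyses like the one above typically derive each attribute's relative importance from the spread of its part-worth utilities: an attribute's importance is its utility range divided by the sum of ranges across attributes. A minimal sketch with hypothetical part-worths (not the study's estimates) follows.

```python
# Minimal sketch of deriving relative attribute importances from conjoint
# part-worth utilities. All part-worth values below are hypothetical.
part_worths = {
    "chance of recurrent dislocation": {"10%": 1.8, "50%": -1.6},
    "cost": {"low": 1.2, "high": -1.4},
    "short-term limits on shoulder motion": {"none": 0.4, "6 weeks": -0.5},
    "limits on high-risk activities": {"none": 0.6, "restricted": -0.6},
    "duration of physical therapy": {"2 weeks": 0.3, "12 weeks": -0.3},
}
# Importance = utility range within attribute / sum of all ranges
ranges = {a: max(u.values()) - min(u.values()) for a, u in part_worths.items()}
total = sum(ranges.values())
for attr, r in sorted(ranges.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{attr}: {100 * r / total:.1f}%")
```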
Study on electroplating technology of diamond tools for machining hard and brittle materials
NASA Astrophysics Data System (ADS)
Cui, Ying; Chen, Jian Hua; Sun, Li Peng; Wang, Yue
2016-10-01
With the development of high-speed cutting, ultra-precision machining, and ultrasonic vibration techniques for processing hard and brittle materials, the requirements on cutting tools are becoming ever higher. As electroplated diamond tools have distinct advantages, such as high adaptability, high durability, long service life and good dimensional stability, these tools are effective and extensively used in grinding hard and brittle materials. In this paper, the coating structure of an electroplated diamond tool is described. The electroplating process flow is presented, and the influence of pretreatment on machining quality is analyzed. Through experimental research, a suitable electrolyte formulation, electroplating process parameters and sanding method were determined. Meanwhile, a drilling experiment on glass-ceramic shows that the electroplating process can effectively improve the cutting performance of diamond tools. This work lays a good foundation for further improving the quality and efficiency of machining hard and brittle materials.
Telemedical applications and grid technology
NASA Astrophysics Data System (ADS)
Graschew, Georgi; Roelofs, Theo A.; Rakowsky, Stefan; Schlag, Peter M.; Kaiser, Silvan; Albayrak, Sahin
2005-11-01
Drawing on experience from previous European telemedicine projects, an open Euro-Mediterranean consortium has proposed the Virtual Euro-Mediterranean Hospital (VEMH) initiative. Providing the same advanced technologies to European and Mediterranean countries should contribute to a better dialogue and integration between them. VEMH aims to facilitate the interconnection of various services through real integration, which must take into account the social, human and cultural dimensions. VEMH will provide a platform consisting of a satellite and terrestrial link for medical e-learning, real-time telemedicine and medical assistance. The methodologies for the VEMH are medical-needs-driven instead of technology-driven. They supply new management tools for virtual medical communities and allow management of clinical outcomes for the implementation of evidence-based medicine. Given the distributed character of the VEMH, Grid technology becomes inevitable for successful deployment of the services. Existing Grid Engines provide the basic computing power needed by today's medical analysis tasks but lack other capabilities needed for the envisioned communication and knowledge-sharing services. When it comes to heterogeneous systems shared by different institutions, the high-level system management areas in particular are still unsupported. Therefore, a Metagrid Engine is needed that provides a superset of functionalities across different Grid Engines and manages strong privacy and quality-of-service constraints at this comprehensive level.
Development of medical writing in India: Past, present and future.
Sharma, Suhasini
2017-01-01
Pharmaceutical medical writing has grown significantly in India in the last couple of decades. It includes preparing regulatory, safety, and publication documents as well as educational and communication material related to health and health-care products. Medical writing requires medical understanding, knowledge of drug development and the regulatory and safety domains, understanding of research methodologies, and awareness of relevant regulations and guidelines. It also requires the ability to analyze, interpret, and present biomedical scientific data in the required format and good writing skills. Medical writing is the fourth most commonly outsourced clinical development activity, and its global demand has steadily increased due to rising cost pressures on the pharmaceutical industry. India has the unique advantages of a large workforce of science graduates and medical professionals trained in English and lower costs, which make it a suitable destination for outsourcing medical writing services. However, the current share of India in global medical writing business is very small. This industry in India faces some real challenges, such as the lack of depth and breadth in domain expertise, inadequate technical writing skills, high attrition rates, and paucity of standardized training programs as well as quality assessment tools. Focusing our time, attention, and resources to address these challenges will help the Indian medical writing industry gain its rightful share in the global medical writing business.
d'Alquen, Daniela; De Boeck, Kris; Bradley, Judy; Vávrová, Věra; Dembski, Birgit; Wagner, Thomas O F; Pfalz, Annette; Hebestreit, Helge
2012-02-06
The European Centres of Reference Network for Cystic Fibrosis (ECORN-CF) established an Internet forum which provides the opportunity for CF patients and other interested people to ask experts questions about CF in their mother language. The objectives of this study were to: 1) develop a detailed quality assessment tool to analyze quality of expert answers, 2) evaluate the intra- and inter-rater agreement of this tool, and 3) explore changes in the quality of expert answers over the time frame of the project. The quality assessment tool was developed by an expert panel. Five experts within the ECORN-CF project used the quality assessment tool to analyze the quality of 108 expert answers published on ECORN-CF from six language zones. 25 expert answers were scored at two time points, one year apart. Quality of answers was also assessed at an early and later period of the project. Individual rater scores and group mean scores were analyzed for each expert answer. A scoring system and training manual were developed analyzing two quality categories of answers: content and formal quality. For content quality, the grades based on group mean scores for all raters showed substantial agreement between two time points, however this was not the case for the grades based on individual rater scores. For formal quality the grades based on group mean scores showed only slight agreement between two time points and there was also poor agreement between time points for the individual grades. The inter-rater agreement for content quality was fair (mean kappa value 0.232 ± 0.036, p < 0.001) while only slight agreement was observed for the grades of the formal quality (mean kappa value 0.105 ± 0.024, p < 0.001). The quality of expert answers was rated high (four language zones) or satisfactory (two language zones) and did not change over time. The quality assessment tool described in this study was feasible and reliable when content quality was assessed by a group of raters. Within ECORN-CF, the tool will help ensure that CF patients all over Europe have equal possibility of access to high quality expert advice on their illness. © 2012 d’Alquen et al; licensee BioMed Central Ltd.
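The kappa statistics reported above measure agreement beyond what chance alone would produce. A minimal sketch of Cohen's kappa for two raters assigning categorical grades, using hypothetical grades rather than ECORN-CF data, is:

```python
# Minimal sketch of Cohen's kappa for two raters:
# kappa = (p_observed - p_expected) / (1 - p_expected).
# The example grades below are hypothetical.
from collections import Counter

def cohens_kappa(r1, r2):
    n = len(r1)
    p_o = sum(a == b for a, b in zip(r1, r2)) / n          # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    cats = set(c1) | set(c2)
    p_e = sum((c1[c] / n) * (c2[c] / n) for c in cats)     # chance agreement
    return (p_o - p_e) / (1 - p_e)

grades_a = ["high", "high", "satisfactory", "high", "poor", "satisfactory"]
grades_b = ["high", "satisfactory", "satisfactory", "high", "poor", "high"]
print(f"kappa = {cohens_kappa(grades_a, grades_b):.3f}")
```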
On the viability of supporting institutional sharing of remote laboratory facilities
NASA Astrophysics Data System (ADS)
Lowe, David; Dang, Bridgette; Daniel, Keith; Murray, Stephen; Lindsay, Euan
2015-11-01
Laboratories are generally regarded as critical to engineering education, and yet educational institutions face significant challenges in developing and maintaining high-quality laboratory facilities. Remote laboratories are increasingly being explored as a partial solution to this challenge, with research showing that - for the right learning outcomes - they can be viable adjuncts or alternatives to conventional hands-on laboratories. One consequential opportunity arising from the inherent support for distributed access is the possibility of cross-institutional shared facilities. While both technical feasibility and pedagogic implications of remote laboratories have been well studied within the literature, the organisational and logistical issues associated with shared facilities have received limited consideration. This paper uses an existing national-scale laboratory sharing initiative, along with a related survey and laboratory sharing data, to analyse a range of factors that can affect engagement in laboratory sharing. The paper also discusses the implications for supporting ongoing laboratory sharing.
Managing Personal and Group Collections of Information
NASA Technical Reports Server (NTRS)
Wolfe, Shawn R.; Wragg, Stephen D.; Chen, James R.; Koga, Dennis (Technical Monitor)
1999-01-01
The internet revolution has dramatically increased the amount of information available to users. Various tools, such as search engines, have been developed to help users find the information they need from this vast repository. Users also need tools to help manipulate the growing amount of useful information they have discovered. Current tools available for this purpose are typically local components of web browsers designed to manage URL bookmarks, and they provide limited functionality for handling high information complexity. To tackle this problem we created DIAMS, an agent-based tool to help users or groups manage their information collections and share them with others. The main features of DIAMS are described here.
NASA Astrophysics Data System (ADS)
Slaughter, A. E.; Permann, C.; Peterson, J. W.; Gaston, D.; Andrs, D.; Miller, J.
2014-12-01
The Idaho National Laboratory (INL)-developed Multiphysics Object Oriented Simulation Environment (MOOSE; www.mooseframework.org), is an open-source, parallel computational framework for enabling the solution of complex, fully implicit multiphysics systems. MOOSE provides a set of computational tools that scientists and engineers can use to create sophisticated multiphysics simulations. Applications built using MOOSE have computed solutions for chemical reaction and transport equations, computational fluid dynamics, solid mechanics, heat conduction, mesoscale materials modeling, geomechanics, and others. To facilitate the coupling of diverse and highly-coupled physical systems, MOOSE employs the Jacobian-free Newton-Krylov (JFNK) method when solving the coupled nonlinear systems of equations arising in multiphysics applications. The MOOSE framework is written in C++, and leverages other high-quality, open-source scientific software packages such as LibMesh, Hypre, and PETSc. MOOSE uses a "hybrid parallel" model which combines both shared memory (thread-based) and distributed memory (MPI-based) parallelism to ensure efficient resource utilization on a wide range of computational hardware. MOOSE-based applications are inherently modular, which allows for simulation expansion (via coupling of additional physics modules) and the creation of multi-scale simulations. Any application developed with MOOSE supports running (in parallel) any other MOOSE-based application. Each application can be developed independently, yet easily communicate with other applications (e.g., conductivity in a slope-scale model could be a constant input, or a complete phase-field micro-structure simulation) without additional code being written. This method of development has proven effective at INL and expedites the development of sophisticated, sustainable, and collaborative simulation tools.
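MOOSE itself is a C++ framework, but the Jacobian-free Newton-Krylov (JFNK) idea the abstract highlights can be sketched in a few lines of Python with SciPy's newton_krylov solver, which never forms an explicit Jacobian. The toy two-equation "multiphysics" system below is invented purely for illustration.

```python
import numpy as np
from scipy.optimize import newton_krylov

def residual(u):
    # Toy coupled nonlinear system standing in for two interacting physics:
    # each equation depends on both unknowns, so the system must be solved
    # simultaneously, as in a fully implicit multiphysics solve.
    return np.array([
        u[0] + 0.5 * u[1] ** 2 - 1.0,
        np.exp(-u[0]) + u[1] - 1.5,
    ])

# JFNK: the Krylov iterations only need Jacobian-vector products, which are
# approximated by finite differences of the residual, so no Jacobian matrix
# is ever assembled.
u = newton_krylov(residual, np.zeros(2), f_tol=1e-10)
print("solution:", u, "max residual:", np.abs(residual(u)).max())
```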
Boggan, Joel C.; Cheely, George; Shah, Bimal R.; Heffelfinger, Randy; Springall, Deanna; Thomas, Samantha M.; Zaas, Aimee; Bae, Jonathan
2014-01-01
Background: Systematically engaging residents in large programs in quality improvement (QI) is challenging. Objective: To coordinate a shared QI project in a large residency program using an online tool. Methods: A web-based QI tool guided residents through a 2-phase evaluation of performance of foot examinations in patients with diabetes. In phase 1, residents completed reviews of health records with online data entry. Residents were then presented with personal performance data relative to peers and were prompted to develop improvement plans. In phase 2, residents again reviewed personal performance. Rates of performance were compared at the program and clinic levels for each phase, with data presented for residents. Acceptability was measured by the number of residents completing each phase. Feasibility was measured by estimated faculty, programmer, and administrator time and costs. Results: Seventy-nine of 86 eligible residents (92%) completed improvement plans and reviewed 1471 patients in phase 1, whereas 68 residents (79%) reviewed 1054 patient charts in phase 2. Rates of performance of examination increased significantly between phases (from 52% to 73% for complete examination, P < .001). Development of the tool required 130 hours of programmer time. Project analysis and management required 6 hours of administrator and faculty time monthly. Conclusions: An online tool developed and implemented for program-wide QI initiatives successfully engaged residents to participate in QI activities. Residents using this tool demonstrated improvement in a selected quality target. This tool could be adapted by other graduate medical education programs or for faculty development. PMID:26279782
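For readers who want to sanity-check the reported jump in complete-exam rates (52% to 73%), a two-proportion z-test sketch follows. The chart counts come from the abstract, but the split of complete exams is approximated from the percentages, so treat the numbers as illustrative.

```python
from statsmodels.stats.proportion import proportions_ztest

n1, n2 = 1471, 1054                                # charts reviewed in phases 1 and 2
successes = [round(0.52 * n1), round(0.73 * n2)]   # approximate complete-exam counts

z, p = proportions_ztest(successes, [n1, n2])
print(f"z = {z:.1f}, p = {p:.2g}")                 # consistent with the reported P < .001
```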
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seed, Ian; James, Paula; Mathieson, John
2013-07-01
With decreasing budgets and increasing pressure to complete cleanup missions as quickly, safely, and cost-effectively as possible, there is significant benefit to be gained from collaboration and joint efforts between organizations facing similar issues. With this in mind, the US Department of Energy (DOE) and the UK Nuclear Decommissioning Authority (NDA) have formally agreed to share information on lessons learned in the development and application of new technologies and approaches to improve the safety, cost effectiveness, and schedule of the cleanup of legacy wastes. To facilitate information exchange, a range of tools and methodologies were established. These included tacit knowledge exchange through facilitated meetings, conference calls, and site visits, as well as explicit knowledge exchange through document sharing and newsletters. A DOE web-based portal has been established to capture these exchanges and add to them via discussion boards. The information exchange is operating at the Government-to-Government strategic level as well as at the Site Contractor level to address both technical and managerial topic areas. This effort has resulted in opening a dialogue and building working relationships. In some areas joint programs of work have been initiated, thus saving resources and enabling the parties to leverage off one another's activities. The potential benefits of high quality information exchange are significant, ranging from cost avoidance through identification of an approach to a problem that has been proven elsewhere, to cost sharing and joint development of a new technology to address a common problem. The benefits in outcomes significantly outweigh the costs of the process. The applicability of the tools and methods, along with the lessons learned regarding some key issues, is of use to any organization that wants to improve value for money. In the waste management marketplace, there are a multitude of challenges being addressed by multiple organizations, and the effective pooling and exchange of knowledge and experience can only benefit all participants and help complete the cleanup mission more quickly and more cost effectively. This paper examines in detail the tools and processes used to promote information exchange and the progress made to date. It also discusses the challenges and issues involved and proposes recommendations to others who are involved in similar activities. (authors)
Unidata's Vision for Providing Comprehensive and End-to-end Data Services
NASA Astrophysics Data System (ADS)
Ramamurthy, M. K.
2009-05-01
This paper presents Unidata's vision for providing comprehensive, well-integrated, and end-to-end data services for the geosciences. These include an array of functions for collecting, finding, and accessing data; data management tools for generating, cataloging, and exchanging metadata; and capabilities for submitting or publishing, sharing, analyzing, visualizing, and integrating data. When this vision is realized, users, no matter where they are or how they are connected to the Internet, will be able to find and access a plethora of geosciences data and use Unidata-provided tools and services both productively and creatively in their research and education. What that vision means for the Unidata community is elucidated by drawing a simple analogy. Most users are familiar with the Amazon and eBay e-commerce sites and with content-sharing sites like YouTube and Flickr. On the eBay marketplace, people can sell practically anything at any time, and buyers can share their experience of purchasing a product or the reputation of a seller. Likewise, at Amazon, thousands of merchants sell their goods, and millions of customers not only buy those goods but provide reviews or opinions of the products they buy and share their experiences as purchasers. Similarly, YouTube and Flickr are sites tailored to video- and photo-sharing, respectively, where users can upload their own content and share it with millions of other users, including family and friends. What all these sites, together with social-networking applications like MySpace and Facebook, have enabled is a sense of a virtual community in which users can search and browse products or content, comment on and rate those products from anywhere, at any time, and via any Internet-enabled device like an iPhone, laptop, or desktop computer. In essence, these enterprises have fundamentally altered people's buying modes and behavior toward purchases. Unidata believes that similar approaches, appropriately tailored to meet the needs of the scientific community, can be adopted to provide and share geosciences data and to enable active collaboration in the future. For example, future case-study data access systems, in addition to providing datasets and tools, will provide services that allow users to provide commentaries on a weather event, say a hurricane, as well as feedback on the quality, usefulness, and interpretation of the datasets through integrated blogs, forums, and wikis, along with uploading and sharing products they derive, ancillary materials they might have gathered (such as photos and videos from the storm), and publications and curricular materials they develop, all through a single data portal. In essence, such case study collections will be "living" or dynamic, allowing users to also be contributors as they add value to and grow existing case study collections.
Toward a System of Total Quality Management: Applying the Deming Approach to the Education Setting.
ERIC Educational Resources Information Center
McLeod, Willis B.; And Others
1992-01-01
Recently, the Petersburg (Virginia) Public Schools have moved away from a highly centralized organizational structure to a Total Quality Management system featuring shared decision making and school-based management practices. The district was guided by Deming's philosophy that all stakeholders should be involved in defining the level of products…
Implementing shared decision making in routine mental health care
Slade, Mike
2017-01-01
Shared decision making (SDM) in mental health care involves clinicians and patients working together to make decisions. The key elements of SDM have been identified, decision support tools have been developed, and SDM has been recommended in mental health at policy level. Yet implementation remains limited. Two justifications are typically advanced in support of SDM. The clinical justification is that SDM leads to improved outcome, yet the available empirical evidence base is inconclusive. The ethical justification is that SDM is a right, but clinicians need to balance the biomedical ethical principles of autonomy and justice with beneficence and non‐maleficence. It is argued that SDM is “polyvalent”, a sociological concept which describes an idea commanding superficial but not deep agreement between disparate stakeholders. Implementing SDM in routine mental health services is as much a cultural as a technical problem. Three challenges are identified: creating widespread access to high‐quality decision support tools; integrating SDM with other recovery‐supporting interventions; and responding to cultural changes as patients develop the normal expectations of citizenship. Two approaches which may inform responses in the mental health system to these cultural changes – social marketing and the hospitality industry – are identified. PMID:28498575
Standardized Automated CO2/H2O Flux Systems for Individual Research Groups and Flux Networks
NASA Astrophysics Data System (ADS)
Burba, George; Begashaw, Israel; Fratini, Gerardo; Griessbaum, Frank; Kathilankal, James; Xu, Liukang; Franz, Daniela; Joseph, Everette; Larmanou, Eric; Miller, Scott; Papale, Dario; Sabbatini, Simone; Sachs, Torsten; Sakai, Ricardo; McDermitt, Dayle
2017-04-01
In recent years, spatial and temporal flux data coverage improved significantly, on multiple scales from a single station to continental networks, due to standardization, automation, and management of data collection, and better handling of the extensive amounts of generated data. With more stations and networks, larger data flows from each station, and smaller operating budgets, modern tools are required to effectively and efficiently handle the entire process. Such tools are needed to maximize time dedicated to authoring publications and answering research questions, and to minimize time and expenses spent on data acquisition, processing, and quality control. Thus, these tools should produce standardized, verifiable datasets and provide a way to cross-share the standardized data with external collaborators to leverage available funding and promote data analyses and publications. LI-COR gas analyzers are widely used in past and present flux networks such as AmeriFlux, ICOS, AsiaFlux, OzFlux, NEON, CarboEurope, and FluxNet-Canada. These analyzers have gone through several major improvements over the past 30 years. In 2016, however, a three-pronged development effort was completed to create an automated flux system which can accept multiple sonic anemometer and datalogger models, compute final and complete fluxes on-site, merge final fluxes with supporting weather, soil, and radiation data, monitor station outputs and send automated alerts to researchers, and allow secure sharing and cross-sharing of station and data access. Two types of these research systems were developed: open-path (LI-7500RS) and enclosed-path (LI-7200RS). Key developments included: • Improvement of gas analyzer performance • Standardization and automation of final flux calculations on-site and in real time • Seamless integration with the latest site management and data sharing tools. In terms of gas analyzer performance, the RS analyzers are based on the established LI-7500/A and LI-7200 models, and the improvements focused on increased stability in the presence of contamination, refined temperature control and compensation, and more accurate fast gas concentration measurements. In terms of flux calculations, improvements focused on automating the on-site flux calculations using EddyPro® software run by a weatherized, fully digital microcomputer, SmartFlux2. In terms of site management and data sharing, the development focused on web-based software, FluxSuite, which allows real-time station monitoring and data access by multiple users. The presentation will describe details of the key developments and will include results from field tests of the RS gas analyzer models in comparison with older models and control reference instruments.
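The final flux such systems compute on-site is, at its core, the covariance of vertical wind and gas concentration fluctuations. Here is a minimal sketch with synthetic 10 Hz data; it is not EddyPro's full processing chain, which also applies corrections such as despiking, coordinate rotation, and spectral corrections.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 10 * 60 * 30                                  # 30 minutes of 10 Hz samples
w = rng.normal(0.0, 0.3, n)                       # vertical wind (m/s), synthetic
c = 16.0 + 0.4 * w + rng.normal(0.0, 0.8, n)      # CO2 density (umol/m3), synthetic

# Eddy-covariance flux: the mean product of the fluctuations, w'c'.
flux = np.mean((w - w.mean()) * (c - c.mean()))
print(f"CO2 flux ~ {flux:.3f} umol m-2 s-1")
```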
42 CFR 480.143 - QIO involvement in shared health data systems.
Code of Federal Regulations, 2013 CFR
2013-10-01
... HUMAN SERVICES (CONTINUED) QUALITY IMPROVEMENT ORGANIZATIONS ACQUISITION, PROTECTION, AND DISCLOSURE OF QUALITY IMPROVEMENT ORGANIZATION INFORMATION Utilization and Quality Control Quality Improvement Organizations (QIOs) Disclosure of Confidential Information § 480.143 QIO involvement in shared health data...
42 CFR 480.143 - QIO involvement in shared health data systems.
Code of Federal Regulations, 2012 CFR
2012-10-01
... HUMAN SERVICES (CONTINUED) QUALITY IMPROVEMENT ORGANIZATIONS ACQUISITION, PROTECTION, AND DISCLOSURE OF QUALITY IMPROVEMENT ORGANIZATION INFORMATION Utilization and Quality Control Quality Improvement Organizations (QIOs) Disclosure of Confidential Information § 480.143 QIO involvement in shared health data...
42 CFR 480.143 - QIO involvement in shared health data systems.
Code of Federal Regulations, 2014 CFR
2014-10-01
... HUMAN SERVICES (CONTINUED) QUALITY IMPROVEMENT ORGANIZATIONS ACQUISITION, PROTECTION, AND DISCLOSURE OF QUALITY IMPROVEMENT ORGANIZATION INFORMATION Utilization and Quality Control Quality Improvement Organizations (QIOs) Disclosure of Confidential Information § 480.143 QIO involvement in shared health data...
Kamp Dush, Claire M.; Taylor, Miles G.
2011-01-01
Using typologies outlined by Gottman and Fitzpatrick as well as institutional and companionate models of marriage, the authors conducted a latent class analysis of marital conflict trajectories using 20 years of data from the Marital Instability Over the Life Course study. Respondents were in one of three groups: high, medium (around the mean), or low conflict. Several factors predicted conflict trajectory group membership; respondents who believed in lifelong marriage and shared decisions equally with their spouse were more likely to report low and less likely to report high conflict. The conflict trajectories were intersected with marital happiness trajectories to examine predictors of high and low quality marriages. A stronger belief in lifelong marriage, shared decision making, and husbands sharing a greater proportion of housework were associated with an increased likelihood of membership in a high happiness, low conflict marriage, and a decreased likelihood of a low marital happiness group. PMID:22328798
Khalifa, Abdulrahman; Meystre, Stéphane
2015-12-01
The 2014 i2b2 natural language processing shared task focused on identifying cardiovascular risk factors such as high blood pressure, high cholesterol levels, obesity, and smoking status, among other factors found in the health records of diabetic patients. In addition, the task involved detecting medications and time information associated with the extracted data. This paper presents the development and evaluation of a natural language processing (NLP) application conceived for this i2b2 shared task. For increased efficiency, the application's main components were adapted from two existing NLP tools implemented in the Apache UIMA framework: Textractor (for dictionary-based lookup) and cTAKES (for preprocessing and smoking status detection). The application achieved a final (micro-averaged) F1-measure of 87.5% on the final evaluation test set. Our approach was mostly based on existing tools adapted with minimal changes and achieved satisfactory performance with limited development effort. Copyright © 2015 Elsevier Inc. All rights reserved.
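The headline metric, micro-averaged F1, pools true positives, false positives, and false negatives over all risk-factor annotations before computing precision and recall. A tiny sketch with invented counts (not the actual i2b2 2014 tallies):

```python
# Hypothetical pooled counts over all extracted annotations.
tp, fp, fn = 875, 65, 185

precision = tp / (tp + fp)
recall = tp / (tp + fn)
micro_f1 = 2 * precision * recall / (precision + recall)
print(f"micro-F1 = {micro_f1:.3f}")   # the system reported 87.5%
```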
Ernst, E J; Speck, P M; Fitzpatrick, J J
2012-01-01
Digital photography is a valuable adjunct to document physical injuries after sexual assault. In order for a digital photograph to have high image quality, there must exist a high level of naturalness. Digital photo documentation has varying degrees of naturalness; however, for a photograph to be natural, specific technical elements for the viewer must be satisfied. No tool was available to rate the naturalness of digital photo documentation of female genital injuries after sexual assault. The Photo Documentation Image Quality Scoring System (PDIQSS) tool was developed to rate technical elements for naturalness. Using this tool, experts evaluated randomly selected digital photographs of female genital injuries captured following sexual assault. Naturalness of female genital injuries following sexual assault was demonstrated when measured in all dimensions.
Measuring the Quality of Early Childhood Programs--Guidelines for Effective Evaluation Tools.
ERIC Educational Resources Information Center
Epstein, Ann S.
2000-01-01
Summarizes what High/Scope discovered to be the critical characteristics of a comprehensive and valid measure of early childhood program quality. Provides suggestions for how the tool can be used, and highlights with examples. Asserts that the guidelines effectively assess efforts of child development, staff development, and soundness of…
A Face-to-Face Professional Development Model to Enhance Teaching of Online Research Strategies
ERIC Educational Resources Information Center
Terrazas-Arellanes, Fatima E.; Knox, Carolyn; Strycker, Lisa A.; Walden, Emily
2016-01-01
To help students navigate the digital environment, teachers not only need access to the right technology tools but they must also engage in pedagogically sound, high-quality professional development. For teachers, quality professional development can mean the difference between merely using technology tools and creating transformative change in…
Sharing information about cancer with one's family is associated with improved quality of life.
Lai, Carlo; Borrelli, Beatrice; Ciurluini, Paola; Aceto, Paola
2017-10-01
The aim of this study was to investigate the association between cancer patients' ability to share information about their illness with their social network and attachment style dimensions, alexithymia, and quality of life. We hypothesised that ability to share information about one's cancer with family, friends, and medical teams would be positively associated with quality of life and secure attachment and negatively associated with alexithymia. Forty-five cancer patients were recruited from the Psycho-oncology Unit of the San Camillo-Forlanini Hospital in Rome. We collected anamnestic data and self-report data on social sharing ability, quality of life, alexithymia, and attachment. Sharing with family (B = 4.66; SE = 1.82; β = .52; SE = 0.20; t(41) = 2.6; P = .0143) was the only predictor of global health status, and attachment security was the only predictor of mean social sharing (B = 0.25; SE = 0.06; β = .63; SE = 0.14; t(41) = 4.4; P < .0001). Encouraging patients to share information about their experience of cancer may help to improve their quality of life. Attachment security seems to promote social sharing. Psychological assessments of cancer patients should cover both ability to share information about one's cancer with family and attachment security. Copyright © 2016 John Wiley & Sons, Ltd.
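The reported association (B = 4.66 for sharing with family predicting global health status) comes from a linear regression. The sketch below shows the general form of such a model on simulated data; it reuses the abstract's slope only to make the example concrete, and everything else is invented.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)
sharing = rng.uniform(1.0, 5.0, 45)                       # hypothetical sharing scores
qol = 40.0 + 4.66 * sharing + rng.normal(0.0, 10.0, 45)   # simulated global health status

X = sm.add_constant(sharing)          # intercept + predictor
fit = sm.OLS(qol, X).fit()
print(fit.params)                     # B (unstandardized coefficients)
print(fit.bse, fit.pvalues)           # SE and P for each coefficient
```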
Cooperative water-resources monitoring in the St. Clair River/Lake St. Clair Basin, Michigan
Rheaume, Stephen J.; Neff, Brian P.; Blumer, Stephen P.
2007-01-01
As part of the Lake St. Clair Regional Monitoring Project, this report describes numerous cooperative water-resources monitoring efforts conducted in the St. Clair River/Lake St. Clair Basin over the last 100 years. Cooperative monitoring is a tool used to observe and record changes in water quantity and quality over time. This report describes cooperative efforts for monitoring streamflows and flood magnitudes, past and present water-quality conditions, significant human-health threats, and flow-regime changes that are the result of changing land use. Water-resources monitoring is a long-term effort that can be made cost-effective by leveraging funds, sharing data, and avoiding duplication of effort. Without long-term cooperative monitoring, future water-resources managers and planners may find it difficult to establish and maintain public supply, recreational, ecological, and esthetic water-quality goals for the St. Clair River/Lake St. Clair Basin.
Anderson, Jane A; Godwin, Kyler M; Saleem, Jason J; Russell, Scott; Robinson, Joshua J; Kimmel, Barbara
2014-12-01
This article reports redesign strategies identified to create a Web-based user-interface for the Self-management TO Prevent (STOP) Stroke Tool. Members of a Stroke Quality Improvement Network (N = 12) viewed a visualization video of a proposed prototype and provided feedback on implementation barriers/facilitators. Stroke-care providers (N = 10) tested the Web-based prototype in think-aloud sessions of simulated clinic visits. Participants' dialogues were coded into themes. Access to comprehensive information and the automated features/systematized processes were the primary accessibility and usability facilitator themes. The need for training, time to complete the tool, and computer-centric care were identified as possible usability barriers. Patient accountability, reminders for best practice, goal-focused care, and communication/counseling themes indicate that the STOP Stroke Tool supports the paradigm of patient-centered care. The STOP Stroke Tool was found to prompt clinicians on secondary stroke-prevention clinical-practice guidelines, facilitate comprehensive documentation of evidence-based care, and support clinicians in providing patient-centered care through the shared decision-making process that occurred while using the action-planning/goal-setting feature of the tool. © The Author(s) 2013.
ERIC Educational Resources Information Center
Li, Rui; Liu, Min
2007-01-01
The purpose of this study is to examine the potential of using computer databases as cognitive tools to share learners' cognitive load and facilitate learning in a multimedia problem-based learning (PBL) environment designed for sixth graders. Two research questions were: (a) can the computer database tool share sixth-graders' cognitive load? and…
42 CFR 480.143 - QIO involvement in shared health data systems.
Code of Federal Regulations, 2011 CFR
2011-10-01
... HUMAN SERVICES (CONTINUED) QUALITY IMPROVEMENT ORGANIZATIONS ACQUISITION, PROTECTION, AND DISCLOSURE OF QUALITY IMPROVEMENT ORGANIZATION REVIEW INFORMATION Utilization and Quality Control Quality Improvement Organizations (QIOs) Disclosure of Confidential Information § 480.143 QIO involvement in shared health data...
42 CFR 480.143 - QIO involvement in shared health data systems.
Code of Federal Regulations, 2010 CFR
2010-10-01
... HUMAN SERVICES (CONTINUED) QUALITY IMPROVEMENT ORGANIZATIONS ACQUISITION, PROTECTION, AND DISCLOSURE OF QUALITY IMPROVEMENT ORGANIZATION REVIEW INFORMATION Utilization and Quality Control Quality Improvement Organizations (QIOs) Disclosure of Confidential Information § 480.143 QIO involvement in shared health data...
Marchant, Carol A; Briggs, Katharine A; Long, Anthony
2008-01-01
ABSTRACT Lhasa Limited is a not-for-profit organization that exists to promote the sharing of data and knowledge in chemistry and the life sciences. It has developed the software tools Derek for Windows, Meteor, and Vitic to facilitate such sharing. Derek for Windows and Meteor are knowledge-based expert systems that predict the toxicity and metabolism of a chemical, respectively. Vitic is a chemically intelligent toxicity database. An overview of each software system is provided along with examples of the sharing of data and knowledge in the context of their development. These examples include illustrations of (1) the use of data entry and editing tools for the sharing of data and knowledge within organizations; (2) the use of proprietary data to develop nonconfidential knowledge that can be shared between organizations; (3) the use of shared expert knowledge to refine predictions; (4) the sharing of proprietary data between organizations through the formation of data-sharing groups; and (5) the use of proprietary data to validate predictions. Sharing of chemical toxicity and metabolism data and knowledge in this way offers a number of benefits including the possibilities of faster scientific progress and reductions in the use of animals in testing. Maximizing the accessibility of data also becomes increasingly crucial as in silico systems move toward the prediction of more complex phenomena for which limited data are available.
Scientific Digital Libraries, Interoperability, and Ontologies
NASA Technical Reports Server (NTRS)
Hughes, J. Steven; Crichton, Daniel J.; Mattmann, Chris A.
2009-01-01
Scientific digital libraries serve complex and evolving research communities. Justifications for the development of scientific digital libraries include the desire to preserve science data and the promises of information interconnectedness, correlative science, and system interoperability. Shared ontologies are fundamental to fulfilling these promises. We present a tool framework, some informal principles, and several case studies where shared ontologies are used to guide the implementation of scientific digital libraries. The tool framework, based on an ontology modeling tool, was configured to develop, manage, and keep shared ontologies relevant within changing domains and to promote the interoperability, interconnectedness, and correlation desired by scientists.
Bowles, Kathryn H; Hanlon, Alexandra; Holland, Diane; Potashnik, Sheryl L; Topaz, Maxim
2014-01-01
Hospital clinicians are overwhelmed with the volume of patients churning through the health care systems. The study purpose was to determine whether alerting case managers about high-risk patients by supplying decision support results in better discharge plans as evidenced by time to first hospital readmission. Four medical units at one urban, university medical center. A quasi-experimental study including a usual care and experimental phase with hospitalized English-speaking patients aged 55 years and older. The intervention included using an evidence-based screening tool, the Discharge Decision Support System (D2S2), that supports clinicians' discharge referral decision making by identifying high-risk patients upon admission who need a referral for post-acute care. The usual care phase included collection of the D2S2 information, but not sharing the information with case managers. The experimental phase included data collection and then sharing the results with the case managers. The study compared time to readmission between index discharge date and 30 and 60 days in patients in both groups (usual care vs. experimental). After sharing the D2S2 results, the percentage of referral or high-risk patients readmitted by 30 and 60 days decreased by 6% and 9%, respectively, representing a 26% relative reduction in readmissions for both periods. Supplying decision support to identify high-risk patients recommended for postacute referral is associated with better discharge plans as evidenced by an increase in time to first hospital readmission. The tool supplies standardized information upon admission allowing more time to work with high-risk admissions.
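"Time to first readmission" outcomes like this are usually summarized with survival methods. A minimal Kaplan-Meier sketch using the lifelines package on simulated 60-day follow-up data (the label and event times are invented, not the study's):

```python
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(3)
# Simulated days to first readmission for high-risk patients, censored at 60 days.
raw = rng.exponential(scale=90.0, size=200)
days = np.minimum(raw, 60.0)
readmitted = raw < 60.0                       # event indicator (True = readmitted)

kmf = KaplanMeierFitter()
kmf.fit(days, event_observed=readmitted, label="high-risk, D2S2 shared")
print(kmf.survival_function_.iloc[-5:])       # readmission-free probability over time
```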
[Citizens' veillance on environmental health through ICT and Genomics].
Tallacchini, Mariachiara; Biggeri, Annibale
2014-01-01
In the last decade three different phenomena have converged: the widespread use of ICT devices to collect and potentially share personal and scientific data and to build networked communities; biobanking for genomics, namely the organized storage of human biological samples and information; and the collaboration between scientists and citizens in creating knowledge (peer production of knowledge) for shared social goals. These different forms of knowledge, technical tools, and skills have merged in community-based scientific, social, and legal initiatives, where scientists and citizens use genetic information and ICT as powerful ways to gain more control over their health and the environment. These activities can no longer be simply qualified as epidemiological research and surveillance. Instead, they can be framed as new forms of citizens' participatory "veillance": an attitude of cognitive, proactive alertness towards the protection of common goods. This paper illustrates two Italian case studies where citizens and scientists, by making use of both ICT and biobanking, have joined together with the goal of protecting environmental health in highly polluted contexts. The status of these initiatives still needs to be defined, with regard both to the validity of the underlying citizen science and to the lack of adequate legal tools for structuring them. However, given their scientific quality and use of sophisticated technologies, these activities cannot be compared to previous experiences, such as those inspired by so-called popular epidemiology. Moreover, their deep commitment to data that are transparent, reliable, and accessible, and to crowdsourced funding mechanisms, allows these experiences to go beyond mere confrontation with institutional knowledge and to represent a potential model of knowledge production for institutional implementation.
ExpressionDB: An open source platform for distributing genome-scale datasets.
Hughes, Laura D; Lewis, Scott A; Hughes, Michael E
2017-01-01
RNA-sequencing (RNA-seq) and microarrays are methods for measuring gene expression across the entire transcriptome. Recent advances have made these techniques practical and affordable for essentially any laboratory with experience in molecular biology. A variety of computational methods have been developed to decrease the amount of bioinformatics expertise necessary to analyze these data. Nevertheless, many barriers persist which discourage new labs from using functional genomics approaches. Since high-quality gene expression studies have enduring value as resources to the entire research community, it is of particular importance that small labs have the capacity to share their analyzed datasets with the research community. Here we introduce ExpressionDB, an open source platform for visualizing RNA-seq and microarray data accommodating virtually any number of different samples. ExpressionDB is based on Shiny, a customizable web application which allows data sharing locally and online with customizable code written in R. ExpressionDB allows intuitive searches based on gene symbols, descriptions, or gene ontology terms, and it includes tools for dynamically filtering results based on expression level, fold change, and false-discovery rates. Built-in visualization tools include heatmaps, volcano plots, and principal component analysis, ensuring streamlined and consistent visualization to all users. All of the scripts for building an ExpressionDB with user-supplied data are freely available on GitHub, and the Creative Commons license allows fully open customization by end-users. We estimate that a demo database can be created in under one hour with minimal programming experience, and that a new database with user-supplied expression data can be completed and online in less than one day.
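ExpressionDB itself is built on R/Shiny, but the volcano-plot-plus-FDR filtering it provides is easy to sketch generically. Below is a Python version on simulated expression results using Benjamini-Hochberg adjustment; the thresholds and all data are illustrative.

```python
import numpy as np
import matplotlib.pyplot as plt
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(5)
log2fc = rng.normal(0.0, 1.5, 5000)             # simulated log2 fold changes
pvals = rng.uniform(0.0, 1.0, 5000)             # simulated p-values

fdr = multipletests(pvals, method="fdr_bh")[1]  # Benjamini-Hochberg q-values
hit = (np.abs(log2fc) > 1.0) & (fdr < 0.05)     # typical volcano-plot thresholds

plt.scatter(log2fc, -np.log10(pvals), s=4, c=np.where(hit, "red", "grey"))
plt.xlabel("log2 fold change")
plt.ylabel("-log10 p-value")
plt.title("Volcano plot (simulated data)")
plt.show()
```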
Sustainable Materials Management: U.S. State Data Measurement Sharing Program
The State Data Measurement Sharing Program (SMP) is an online reporting, information sharing, and measurement tool that allows U.S. states to share a wide range of information about waste, recycling, and composting.
Key elements of high-quality practice organisation in primary health care: a systematic review.
Crossland, Lisa; Janamian, Tina; Jackson, Claire L
2014-08-04
To identify elements that are integral to high-quality practice and determine considerations relating to high-quality practice organisation in primary care. A narrative systematic review of published and grey literature. Electronic databases (PubMed, CINAHL, the Cochrane Library, Embase, Emerald Insight, PsycInfo, the Primary Health Care Research and Information Service website, Google Scholar) were searched in November 2013 and used to identify articles published in English from 2002 to 2013. Reference lists of included articles were searched for relevant unpublished articles and reports. Data were configured at the study level to allow for the inclusion of findings from a broad range of study types. Ten elements were most often included in the existing organisational assessment tools. A further three elements were identified from an inductive thematic analysis of descriptive articles, and were noted as important considerations in effective quality improvement in primary care settings. Although there are some validated tools available to primary care that identify and build quality, most are single-strategy approaches developed outside health care settings. There are currently no validated organisational improvement tools, designed specifically for primary health care, which combine all elements of practice improvement and whose use does not require extensive external facilitation.
Quality and Efficiency Improvement Tools for Every Radiologist.
Kudla, Alexei U; Brook, Olga R
2018-06-01
In an era of value-based medicine, data-driven quality improvement is more important than ever to ensure safe and efficient imaging services. Familiarity with high-value tools enables all radiologists to successfully engage in quality and efficiency improvement. In this article, we review the model for improvement, strategies for measurement, and common practical tools with real-life examples that include Run chart, Control chart (Shewhart chart), Fishbone (Cause-and-Effect or Ishikawa) diagram, Pareto chart, 5 Whys, and Root Cause Analysis. Copyright © 2018 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
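Of the tools listed, the control (Shewhart) chart is the most mechanical to build: plot the measure over time with a center line at the mean and control limits at plus or minus three standard deviations. A minimal sketch with invented report-turnaround data follows; a production individuals chart would normally derive its limits from the moving range instead.

```python
import numpy as np
import matplotlib.pyplot as plt

# Invented daily report-turnaround times (minutes); note the outlier at index 10.
minutes = np.array([41, 38, 45, 40, 39, 44, 47, 42, 36, 43, 58, 40])

center = minutes.mean()
sigma = minutes.std(ddof=1)
ucl, lcl = center + 3 * sigma, center - 3 * sigma   # 3-sigma control limits

plt.plot(minutes, marker="o")
plt.axhline(center, color="k")
plt.axhline(ucl, color="k", linestyle="--")
plt.axhline(lcl, color="k", linestyle="--")
plt.title("Control chart: report turnaround (simulated)")
plt.ylabel("minutes")
plt.show()
```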
Marrin, Katy; Wood, Fiona; Firth, Jill; Kinsey, Katharine; Edwards, Adrian; Brain, Kate E; Newcombe, Robert G; Nye, Alan; Pickles, Timothy; Hawthorne, Kamila; Elwyn, Glyn
2014-04-07
Despite policy interest, an ethical imperative, and evidence of the benefits of patient decision support tools, the adoption of shared decision making (SDM) in day-to-day clinical practice remains slow and is inhibited by barriers that include culture and attitudes, and resource and time pressures. Patient decision support tools often require high levels of health and computer literacy. Option Grids are one-page evidence-based summaries of the available condition-specific treatment options, listing patients' frequently asked questions. They are designed to be brief and accessible enough to support a better dialogue between patients and clinicians during routine consultations. This paper describes a study to assess whether an Option Grid for osteoarthritis of the knee (OA of the knee) facilitates SDM, and explores the use of Option Grids by patients disadvantaged by language or poor health literacy. This will be a stepped wedge exploratory trial involving 72 patients with OA of the knee referred from primary medical care to a specialist musculoskeletal service in Oldham. Six physiotherapists will sequentially join the trial and consult with six patients each using usual care procedures. After a period of brief training in using the Option Grid, the same six physiotherapists will consult with six further patients each, using an Option Grid in the consultation. The primary outcome will be the efficacy of the Option Grid in facilitating SDM, as measured by observational scores using the OPTION scale. Comparisons will be made between patients who received the Option Grid and those who received usual care. A Decision Quality Measure (DQM) will assess quality of decision making. The health literacy of patients will be measured using the REALM-R instrument. Consultations will be observed and audio-recorded. Interviews will be conducted with the physiotherapists, patients, and any interpreters present to explore their views of using the Option Grid. Option Grids offer a potential solution to the barriers to implementing traditional decision aids in routine clinical practice. The study will assess whether Option Grids can facilitate SDM in day-to-day clinical practice and explore their use with patients disadvantaged by language or poor health literacy. Current Controlled Trials ISRCTN94871417.
Pisani, Elizabeth; Botchway, Stella
2017-01-01
Background: Increasingly, biomedical researchers are encouraged or required by research funders and journals to share their data, but there's very little guidance on how to do that equitably and usefully, especially in resource-constrained settings. We performed an in-depth case study of one data sharing pioneer: the WorldWide Antimalarial Resistance Network (WWARN). Methods: The case study included a records review, a quantitative analysis of WWARN-related publications, in-depth interviews with 47 people familiar with WWARN, and a witness seminar involving a sub-set of 11 interviewees. Results: WWARN originally aimed to collate clinical, in vitro, pharmacological and molecular data into linked, open-access databases intended to serve as a public resource to guide antimalarial drug treatment policies. Our study describes how WWARN navigated challenging institutional and academic incentive structures, alongside funders' reluctance to invest in capacity building in malaria-endemic countries, which impeded data sharing. The network increased data contributions by focusing on providing free, online tools to improve the quality and efficiency of data collection, and by inviting collaborative authorship on papers addressing policy-relevant questions that could only be answered through pooled analyses. By July 1, 2016, the database included standardised data from 103 molecular studies and 186 clinical trials, representing 135,000 individual patients. Developing the database took longer and cost more than anticipated, and efforts to increase equity for data contributors are on-going. However, analyses of the pooled data have generated new methods and influenced malaria treatment recommendations globally. Despite not achieving the initial goal of real-time surveillance, WWARN has developed strong data governance and curation tools, which are now being adapted relatively quickly for other diseases. Conclusions: To be useful, data sharing requires investment in long-term infrastructure. To be feasible, it requires new incentive structures that favour the generation of reusable knowledge. PMID:29018840
Low-Cost Air Quality Monitoring Tools: From Research to Practice (A Workshop Summary)
Griswold, William G.; RS, Abhijit; Johnston, Jill E.; Herting, Megan M.; Thorson, Jacob; Collier-Oxandale, Ashley; Hannigan, Michael
2017-01-01
In May 2017, a two-day workshop was held in Los Angeles (California, U.S.A.) to gather practitioners who work with low-cost sensors used to make air quality measurements. The community of practice included individuals from academia, industry, non-profit groups, community-based organizations, and regulatory agencies. The group gathered to share knowledge developed from a variety of pilot projects in hopes of advancing the collective knowledge about how best to use low-cost air quality sensors. Panel discussion topics included: (1) best practices for deployment and calibration of low-cost sensor systems, (2) data standardization efforts and database design, (3) advances in sensor calibration, data management, and data analysis and visualization, and (4) lessons learned from research/community partnerships to encourage purposeful use of sensors and create change/action. Panel discussions summarized knowledge advances and project successes while also highlighting the questions, unresolved issues, and technological limitations that still remain within the low-cost air quality sensor arena. PMID:29143775
High-performance computing — an overview
NASA Astrophysics Data System (ADS)
Marksteiner, Peter
1996-08-01
An overview of high-performance computing (HPC) is given. Different types of computer architectures used in HPC are discussed: vector supercomputers, high-performance RISC processors, various parallel computers like symmetric multiprocessors, workstation clusters, massively parallel processors. Software tools and programming techniques used in HPC are reviewed: vectorizing compilers, optimization and vector tuning, optimization for RISC processors; parallel programming techniques like shared-memory parallelism, message passing and data parallelism; and numerical libraries.
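Of the parallel programming techniques surveyed, message passing is the easiest to demonstrate compactly. A minimal mpi4py sketch follows (run it under an MPI launcher, e.g. `mpiexec -n 2 python demo.py`); the message contents are arbitrary.

```python
# Minimal point-to-point message passing in the style the overview surveys.
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

if rank == 0:
    # Rank 0 sends a small Python object to rank 1.
    comm.send({"step": 1, "payload": [1.0, 2.0, 3.0]}, dest=1, tag=11)
elif rank == 1:
    # Rank 1 blocks until the matching message arrives.
    msg = comm.recv(source=0, tag=11)
    print("rank 1 received:", msg)
```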
Collaborative search in electronic health records.
Zheng, Kai; Mei, Qiaozhu; Hanauer, David A
2011-05-01
A full-text search engine can be a useful tool for augmenting the reuse value of unstructured narrative data stored in electronic health records (EHR). A prominent barrier to the effective utilization of such tools originates from users' lack of search expertise and/or medical-domain knowledge. To mitigate the issue, the authors experimented with a 'collaborative search' feature through a homegrown EHR search engine that allows users to preserve their search knowledge and share it with others. This feature was inspired by the success of many social information-foraging techniques used on the web that leverage users' collective wisdom to improve the quality and efficiency of information retrieval. The authors conducted an empirical evaluation study over a 4-year period. The user sample consisted of 451 academic researchers, medical practitioners, and hospital administrators. The data were analyzed using a social-network analysis to delineate the structure of the user collaboration networks that mediated the diffusion of knowledge of search. The users embraced the concept with considerable enthusiasm. About half of the EHR searches processed by the system (0.44 million) were based on stored search knowledge; 0.16 million utilized shared knowledge made available by other users. The social-network analysis results also suggest that the user-collaboration networks engendered by the collaborative search feature played an instrumental role in enabling the transfer of search knowledge across people and domains. Applying collaborative search, a social information-foraging technique popularly used on the web, may provide the potential to improve the quality and efficiency of information retrieval in healthcare.
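The social-network analysis described here can be approximated with standard graph tooling. A small networkx sketch follows, where a directed edge means "user A reused a search stored by user B"; the users and edges are invented.

```python
import networkx as nx

# Hypothetical knowledge-diffusion edges: reuser -> original search author.
G = nx.DiGraph()
G.add_edges_from([
    ("u1", "u2"), ("u3", "u2"), ("u5", "u2"),  # u2's searches are widely reused
    ("u2", "u4"), ("u6", "u4"),
])

# In-degree centrality flags the "knowledge brokers" whose searches spread.
print(nx.in_degree_centrality(G))
```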
Teachers Connect "with" Technology: Online Tools Build New Pathways to Collaboration
ERIC Educational Resources Information Center
Phillips, Vicki L.; Olson, Lynn
2013-01-01
Teachers, curriculum experts, and other educators work together using online tools developed by the Bill & Melinda Gates Foundation to create high-quality, useful lessons and research-based instructional tools incorporating the Common Core State Standards.
NASA Astrophysics Data System (ADS)
Greene, G.; Kyprianou, M.; Levay, K.; Sienkewicz, M.; Donaldson, T.; Dower, T.; Swam, M.; Bushouse, H.; Greenfield, P.; Kidwell, R.; Wolfe, D.; Gardner, L.; Nieto-Santisteban, M.; Swade, D.; McLean, B.; Abney, F.; Alexov, A.; Binegar, S.; Aloisi, A.; Slowinski, S.; Gousoulin, J.
2015-09-01
The next generation for the Space Telescope Science Institute data management system is gearing up to provide a suite of archive system services supporting the operation of the James Webb Space Telescope. We are now completing the initial stage of integration and testing for the preliminary ground system builds of the JWST Science Operations Center which includes multiple components of the Data Management Subsystem (DMS). The vision for astronomical science and research with the JWST archive introduces both solutions to formal mission requirements and innovation derived from our existing mission systems along with the collective shared experience of our global user community. We are building upon the success of the Hubble Space Telescope archive systems, standards developed by the International Virtual Observatory Alliance, and collaborations with our archive data center partners. In proceeding forward, the “one archive” architectural model presented here is designed to balance the objectives for this new and exciting mission. The STScI JWST archive will deliver high quality calibrated science data products, support multi-mission data discovery and analysis, and provide an infrastructure which supports bridges to highly valued community tools and services.
Fragile Relationships: Japan, High Technology, and U.S. Vital Interests
1990-04-04
costs; better manufacturing techniques; higher quality products; excellent marketing plans; and direct and indirect government support. Akio ... short-term and have not looked to the future, as have the Japanese. The stockholders' demand for large, quick profits often cost market shares and ... grip the country as we use increasing amounts of our wealth to service the national debt. Japan will be our creditor. U.S. market shares, both
Moscucci, Mauro; Share, David; Kline-Rogers, Eva; O'Donnell, Michael; Maxwell-Eward, Ann; Meengs, William L; Clark, Vivian L; Kraft, Phillip; De Franco, Anthony C; Chambers, James L; Patel, Kirit; McGinnity, John G; Eagle, Kim A
2002-10-01
The past decade has been characterized by increased scrutiny of the outcomes of surgical and percutaneous coronary interventions (PCIs). This increased scrutiny has led to the development of regional, state, and national databases for outcome assessment and public reporting. This report describes the initial development of a regional, collaborative cardiovascular consortium and the progress made so far by this collaborative group. In 1997, a group of hospitals in the state of Michigan agreed to create a regional collaborative consortium for the development of a quality improvement program in interventional cardiology. The project included the creation of a comprehensive database of PCIs to be used for risk assessment, feedback on absolute and risk-adjusted outcomes, and sharing of information. To date, information from nearly 20,000 PCIs has been collected. A risk prediction tool for death in the hospital and additional risk prediction tools for other outcomes have been developed from the data collected, and are currently used by the participating centers for risk assessment and quality improvement. As the project enters year 5, the participating centers are deeply engaged in the quality improvement phase, and expansion to a total of 17 hospitals with active PCI programs is in process. In conclusion, the Blue Cross Blue Shield of Michigan Cardiovascular Consortium is an example of a regional collaborative effort to assess and improve quality of care and outcomes, one that overcomes the barriers of traditional market and academic competition.
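Risk prediction tools for in-hospital death after PCI are typically logistic regression models. The sketch below shows the general form on simulated data; the predictors and coefficients are placeholders, not the consortium's actual model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(13)
n = 2000
age = rng.normal(65.0, 10.0, n)               # hypothetical predictor: age (years)
shock = rng.binomial(1, 0.04, n)              # hypothetical predictor: cardiogenic shock
# Simulated in-hospital death, driven mostly by shock in this toy model.
p = 1.0 / (1.0 + np.exp(-(-6.0 + 0.03 * age + 2.5 * shock)))
death = rng.binomial(1, p)

X = np.column_stack([age, shock])
model = LogisticRegression().fit(X, death)
print("odds ratios:", np.exp(model.coef_).round(2))   # per-unit odds ratios
```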
The eTOX data-sharing project to advance in silico drug-induced toxicity prediction.
Cases, Montserrat; Briggs, Katharine; Steger-Hartmann, Thomas; Pognan, François; Marc, Philippe; Kleinöder, Thomas; Schwab, Christof H; Pastor, Manuel; Wichard, Jörg; Sanz, Ferran
2014-11-14
The high-quality in vivo preclinical safety data produced by the pharmaceutical industry during drug development, which follow numerous strict guidelines, are mostly not available in the public domain. These safety data are sometimes published as a condensed summary for the few compounds that reach the market, but the majority of studies are never made public and are often difficult to access in an automated way, sometimes even within the owning company itself. It is evident from many academic and industrial examples that useful data mining and model development require large and representative data sets and careful curation of the collected data. In 2010, under the auspices of the Innovative Medicines Initiative, the eTOX project started with the objective of extracting and sharing preclinical study data from the paper or PDF archives of the toxicology departments of the 13 participating pharmaceutical companies, and using such data to establish a detailed, well-curated database, which could then serve as a source for read-across approaches (early assessment of the potential toxicity of a drug candidate by comparison with compounds of similar structure and/or effects) and for training predictive models. The paper describes the efforts undertaken to enable effective data sharing (intellectual property (IP) protection and the setting up of adequate controlled vocabularies) and to establish the database (currently with over 4000 studies contributed by the pharma companies, corresponding to more than 1400 compounds). In addition, the status of predictive model building and some specific features of the eTOX predictive system (eTOXsys) are presented as decision-support, knowledge-based tools for the drug development process at an early stage.
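Read-across, as described here, starts from structural similarity between a candidate and compounds with known toxicity. A minimal RDKit sketch computing Tanimoto similarity on Morgan fingerprints follows; the two example molecules are arbitrary stand-ins, not eTOX data.

```python
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

candidate = Chem.MolFromSmiles("CC(=O)Oc1ccccc1C(=O)O")   # aspirin, as a stand-in
neighbor = Chem.MolFromSmiles("OC(=O)c1ccccc1O")          # salicylic acid

fp1 = AllChem.GetMorganFingerprintAsBitVect(candidate, radius=2, nBits=2048)
fp2 = AllChem.GetMorganFingerprintAsBitVect(neighbor, radius=2, nBits=2048)

# High similarity suggests the neighbor's known toxicity profile may be
# informative for the candidate (the premise of read-across).
print("Tanimoto similarity:", DataStructs.TanimotoSimilarity(fp1, fp2))
```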
Pediatric faculty and residents’ perspectives on In-Training Evaluation Reports (ITERs)
Patel, Rikin; Drover, Anne; Chafe, Roger
2015-01-01
Background: In-training evaluation reports (ITERs) are used by over 90% of postgraduate medical training programs in Canada for resident assessment. Our study examined the perspectives of faculty and residents in one pediatric program as a means to improve the ITER as an evaluation tool. Method: Two separate focus groups were conducted, one with eight pediatric residents and one with nine clinical faculty within the pediatrics program of Memorial University's Faculty of Medicine, to discuss their perceptions of, and suggestions for improving, the use of ITERs. Results: Residents and faculty shared many similar suggestions for improving the ITER as an evaluation tool. Both the faculty and residents emphasized the importance of written feedback, contextualizing the evaluation, and timely follow-up. The biggest challenge appears to be the discrepancy between the quality of feedback sought by the residents and the faculty members' ability to provide it in a time-effective manner. Other concerns related to the need for better engagement in setting rotation objectives and more direct observation by the faculty member completing the ITER. Conclusions: The ITER is a useful tool in resident evaluations, but addressing a number of issues relating to its actual use could improve the quality of feedback which residents receive. PMID:27004076
Tools for beach health data management, data processing, and predictive model implementation
2013-01-01
This fact sheet describes utilities created for management of recreational waters to provide efficient data management, data aggregation, and predictive modeling, as well as a prototype geographic information system (GIS)-based tool for data visualization and summary. All of these utilities were developed to assist beach managers in making decisions to protect public health. The Environmental Data Discovery and Transformation (EnDDaT) Web service identifies, compiles, and sorts environmental data from a variety of sources that help to define climatic, hydrologic, and hydrodynamic characteristics, including multiple data sources within the U.S. Geological Survey and the National Oceanic and Atmospheric Administration. The Great Lakes Beach Health Database (GLBH-DB) and Web application was designed to provide a flexible input, export, and storage platform for beach water quality and sanitary survey monitoring data to complement beach monitoring programs within the Great Lakes. A real-time predictive modeling strategy was implemented by combining the capabilities of EnDDaT and the GLBH-DB for timely, automated prediction of beach water quality. The GIS-based tool was developed to map beaches based on their physical and biological characteristics, and was shared with multiple partners to provide concepts and information for future Web-accessible beach data outlets.
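To illustrate the predictive-modeling step, here is a minimal sketch of the kind of regression-based nowcast such a data feed can drive. The predictors, data, and advisory threshold are illustrative assumptions, not EnDDaT's or the GLBH-DB's actual configuration.

```python
# A minimal sketch of a beach water quality "nowcast": predict log10 E. coli
# concentration from environmental covariates, then flag an advisory when the
# prediction exceeds a threshold. All values here are illustrative.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 200
turbidity = rng.gamma(2.0, 5.0, n)        # NTU
rain_48h = rng.exponential(10.0, n)       # mm of rain in the previous 48 h
X = np.column_stack([turbidity, rain_48h])
# Synthetic training response: log10 E. coli (CFU/100 mL).
log_ecoli = 1.0 + 0.02 * turbidity + 0.03 * rain_48h + rng.normal(0, 0.3, n)

model = LinearRegression().fit(X, log_ecoli)

today = np.array([[35.0, 35.0]])          # this morning's conditions
predicted = 10 ** model.predict(today)[0]
print(f"Predicted E. coli: {predicted:.0f} CFU/100 mL")
print("Advisory" if predicted > 235 else "No advisory")  # threshold illustrative
```

A production nowcast would be retrained on each beach's historical monitoring data and evaluated against held-out seasons before being used for advisories.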
Support for Taverna workflows in the VPH-Share cloud platform.
Kasztelnik, Marek; Coto, Ernesto; Bubak, Marian; Malawski, Maciej; Nowakowski, Piotr; Arenas, Juan; Saglimbeni, Alfredo; Testi, Debora; Frangi, Alejandro F
2017-07-01
To address the increasing need for collaborative endeavours within the Virtual Physiological Human (VPH) community, the VPH-Share collaborative cloud platform allows researchers to expose and share sequences of complex biomedical processing tasks in the form of computational workflows. The Taverna Workflow System is a very popular tool for orchestrating complex biomedical and bioinformatics processing tasks in the VPH community. This paper describes the VPH-Share components that support the building and execution of Taverna workflows and explains how they interact with other VPH-Share components to improve the capabilities of the VPH-Share platform. Taverna workflow support is delivered by the Atmosphere cloud management platform and the VPH-Share Taverna plugin. These components are explained in detail, along with the two main procedures that were developed to enable this seamless integration: workflow composition and execution. The main results are: 1) seamless integration of VPH-Share with other components and systems; 2) an extended range of different tools for workflows; 3) successful integration of scientific workflows from other VPH projects; and 4) execution speed improvements for medical applications. The presented workflow integration provides VPH-Share users with a wide range of possibilities to compose and execute workflows, such as desktop or online composition, online batch execution, multithreading, and remote execution. The specific advantages of each supported tool are presented, as are the roles of Atmosphere and the VPH-Share plugin within the VPH-Share project. The combination of the VPH-Share plugin and Atmosphere endows the VPH-Share infrastructure with far more flexible, powerful, and usable capabilities for the VPH-Share community. As both components can continue to evolve and improve independently, further improvements are expected. Copyright © 2017 Elsevier B.V. All rights reserved.
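As a rough sketch of what programmatic workflow submission to a platform of this kind can look like, the snippet below uploads a workflow definition and starts a run over HTTP. The endpoint, payload fields, and token are entirely hypothetical placeholders; this is not the actual VPH-Share, Atmosphere, or Taverna Server API.

```python
# Purely illustrative REST-style workflow submission. The service root,
# routes, and response fields are hypothetical, not a real platform API.
import requests

BASE = "https://example.org/api"            # hypothetical service root
headers = {"Authorization": "Bearer <token>"}

# Upload a Taverna workflow definition (.t2flow), then start a batch run.
with open("pipeline.t2flow", "rb") as f:
    wf = requests.post(f"{BASE}/workflows", headers=headers,
                       files={"definition": f}).json()

run = requests.post(f"{BASE}/workflows/{wf['id']}/runs", headers=headers,
                    json={"inputs": {"subject_id": "demo-001"}}).json()
print("Run started:", run["id"])
```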
Muller-Juge, Virginie; Cullati, Stéphane; Blondon, Katherine S.; Hudelson, Patricia; Maître, Fabienne; Vu, Nu V.; Savoldelli, Georges L.; Nendaz, Mathieu R.
2014-01-01
Background Effective teamwork is necessary for optimal patient care. There is insufficient understanding of interactions between physicians and nurses on internal medicine wards. Objective To describe resident physicians’ and nurses’ actual behaviours contributing to teamwork quality in the setting of a simulated internal medicine ward. Methods A volunteer sample of 14 pairs of residents and nurses in internal medicine was asked to manage one non-urgent and one urgent clinical case in a simulated ward, using a high-fidelity manikin. After the simulation, participants attended a stimulated-recall session during which they viewed the videotape of the simulation and explained their actions and perceptions. All simulations were transcribed, coded, and analyzed, using a qualitative method (template analysis). Quality of teamwork was assessed, based on patient management efficiency and presence of shared management goals and of team spirit. Results Most resident-nurse pairs tended to interact in a traditional way, with residents taking the leadership and nurses executing medical prescriptions and assuming their own specific role. They also demonstrated different types of interactions involving shared responsibilities and decision making, constructive suggestions, active communication and listening, and manifestations of positive team building. The presence of a leader in the pair or a truly shared leadership between resident and nurse contributed to teamwork quality only if both members of the pair demonstrated sufficient autonomy. In case of a lack of autonomy of one member, the other member could compensate for it, if his/her own autonomy was sufficiently strong and if there were demonstrations of mutual listening, information sharing, and positive team building. Conclusions Although they often relied on traditional types of interaction, residents and nurses also demonstrated readiness for increased sharing of responsibilities. Interprofessional education should insist on better redefinition of respective roles and reinforce behaviours shown to enhance teamwork quality. PMID:24769672
[Establishing IAQ Metrics and Baseline Measures.] "Indoor Air Quality Tools for Schools" Update #20
ERIC Educational Resources Information Center
US Environmental Protection Agency, 2009
2009-01-01
This issue of "Indoor Air Quality Tools for Schools" Update ("IAQ TfS" Update) contains the following items: (1) News and Events; (2) IAQ Profile: Establishing Your Baseline for Long-Term Success (Feature Article); (3) Insight into Excellence: Belleville Township High School District #201, 2009 Leadership Award Winner; and (4) Have Your Questions…
NASA Astrophysics Data System (ADS)
Vecsey, Luděk; Plomerová, Jaroslava; Jedlička, Petr; Munzarová, Helena; Babuška, Vladislav; AlpArray Working Group
2017-12-01
This paper focuses on major issues related to the data reliability and network performance of 20 broadband (BB) stations of the Czech (CZ) MOBNET (MOBile NETwork) seismic pool within the AlpArray seismic experiments. Currently used high-resolution seismological applications require high-quality data recorded for a sufficiently long time interval at seismological observatories and during the entire time of operation of the temporary stations. In this paper we present new hardware and software tools we have been developing during the last two decades while analysing data from several international passive experiments. The new tools help to assure the high-quality standard of broadband seismic data and eliminate potential errors before supplying data to seismological centres. Special attention is paid to crucial issues like the detection of sensor misorientation, timing problems, interchange of record components and/or their polarity reversal, sensor mass centring, or anomalous channel amplitudes due to, for example, imperfect gain. Thorough data quality control should represent an integral constituent of seismic data recording, preprocessing, and archiving, especially for data from temporary stations in passive seismic experiments. Large international seismic experiments require enormous efforts from scientists from different countries and institutions to gather hundreds of stations to be deployed in the field during a limited time period. In this paper, we demonstrate the beneficial effects of the procedures we have developed for acquiring a reliable large set of high-quality data from each group participating in field experiments. The presented tools can be applied manually or automatically on data from any seismic network.
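Two of the checks mentioned above, anomalous channel amplitudes and polarity reversal, lend themselves to simple automated screening. The sketch below runs both on synthetic traces; the thresholds are illustrative and the MOBNET tools themselves are not reproduced here.

```python
# A minimal sketch of two automated seismic QC checks: flagging channels with
# anomalous amplitudes (e.g., an imperfect gain) and detecting a possible
# polarity reversal against a reference recording. Traces are synthetic.
import numpy as np

rng = np.random.default_rng(2)
signal = rng.normal(0, 1, 10_000)           # common wavefield seen by all stations
channels = {
    "STA1.Z": signal + rng.normal(0, 0.1, signal.size),
    "STA2.Z": 10 * signal,                   # gain error: amplitudes 10x too large
    "STA3.Z": -signal + rng.normal(0, 0.1, signal.size),  # reversed polarity
}

rms = {name: np.sqrt(np.mean(tr ** 2)) for name, tr in channels.items()}
median_rms = np.median(list(rms.values()))
reference = channels["STA1.Z"]

for name, tr in channels.items():
    # Amplitude check: RMS far from the network median suggests a gain problem.
    if not 0.25 < rms[name] / median_rms < 4.0:
        print(f"{name}: anomalous amplitude (RMS ratio {rms[name] / median_rms:.1f})")
    # Polarity check: strong anti-correlation with the reference trace.
    corr = np.corrcoef(tr, reference)[0, 1]
    if corr < -0.8:
        print(f"{name}: possible polarity reversal (corr {corr:.2f})")
```

On real data the same logic would be applied to time-windowed, instrument-corrected traces (for example read with ObsPy), and misorientation checks would compare horizontal components against a nearby reference station.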
Mullinx, Cassandra; Phillips, Scott; Shenk, Kelly; Hearn, Paul; Devereux, Olivia
2009-01-01
The Chesapeake Bay Program (CBP) is attempting to more strategically implement management actions to improve the health of the Nation’s largest estuary. In 2007 the U.S. Geological Survey (USGS) and U.S. Environmental Protection Agency (USEPA) CBP office began a joint effort to develop a suite of Internet-accessible decision-support tools to help meet the needs of CBP partners to improve water quality and habitat conditions in the Chesapeake Bay and its watersheds. An adaptive management framework is being used to provide a structured decision process for information and individual tools needed to implement and assess practices to improve the condition of the Chesapeake Bay ecosystem. The Chesapeake Online Adaptive Support Toolkit (COAST) is a collection of web-based analytical tools and information, organized in an adaptive management framework, intended to aid decisionmakers in protecting and restoring the integrity of the Bay ecosystem. The initial version of COAST is focused on water quality issues. During early and mid-2008, initial ideas for COAST were shared and discussed with various CBP partners and other potential user groups. At these meetings, test cases were selected to help improve understanding of the types of information and analytical functionality that would be most useful for specific partners’ needs. These discussions added considerable knowledge about the nature of decisionmaking for Federal, State, local, and nongovernmental partners. Version 1.0 of COAST, released in early winter of 2008, will be further reviewed to determine improvements needed to address implementation and assessment of water quality practices. Future versions of COAST may address other aspects of ecosystem restoration, including restoration of habitat and living resources and maintaining watershed health.
3-D printing provides a novel approach for standardization and reproducibility of freezing devices
Hu, E; Childress, William; Tiersch, Terrence R.
2017-01-01
Cryopreservation has become an important and accepted tool for long-term germplasm conservation of animals and plants. To protect genetic resources, repositories have been developed with national and international cooperation. For a repository to be effective, the genetic material submitted must be of good quality and comparable to other submissions. However, due to a variety of reasons, including constraints in knowledge and available resources, cryopreservation methods for aquatic species vary widely across user groups, which reduces reproducibility and weakens quality control. Herein we describe a standardizable freezing device produced using 3-dimensional (3-D) printing and introduce the concept of network sharing to achieve aggregate high-throughput cryopreservation for aquatic species. The objectives were to: 1) adapt widely available polystyrene foam products that would be inexpensive, portable, and provide adequate work space; 2) develop a design suitable for 3-D printing that could provide multiple configurations, be inexpensive, and be easy to use; and 3) evaluate various configurations to attain freezing rates suitable for various common cryopreservation containers. Through this approach, identical components can be accessed globally, and we demonstrated that 3-D printers can be used to fabricate parts for standardizable freezing devices yielding relevant and reproducible cooling rates across users. With standardized devices for freezing, methods and samples can harmonize into an aggregated high-throughput pathway not currently available for aquatic species repository development. PMID:28465185
Usability and acceptance evaluation of ACESO: a Web-based breast cancer survivorship tool.
Kapoor, Akshat; Nambisan, Priya
2018-06-01
The specific objective of this research is to design and develop a personalized Web application to support breast cancer survivors after treatment, as they deal with post-treatment challenges such as comorbidities and side effects of treatment. A mixed-methods approach, utilizing a combination of think-aloud analysis, personal interviews, and surveys, was adopted for user acceptance and usability testing among a group of breast cancer survivors. User feedback was gathered on the perceived value of the application, and any user-interface issues that may hinder the overall usability were identified. The application's portability and its capability of organizing the user's entire breast cancer-related medical history, as well as tracking various quality of life indicators, were perceived to be valuable features. The application had high overall usability; however, certain sections of the application were not as intuitive to locate. Visual elements of the website were appreciated; however, the overall experience would benefit from incorporating more sociable elements that provide positive reinforcement and a friendlier experience for the end user. The results of the study showcase the need for more personalized tools and resources to support survivors in self-management. It also demonstrates the ability to integrate breast cancer survivorship care plans from diverse providers and paves the way to add further value-added features in consumer health applications, such as personal decision support. A personal decision support tool can serve as a training tool and resource, providing these patients with pertinent information about the various aspects of their long-term health, while educating them about any related side effects and symptoms. It is hoped that making such tools more accessible could help in engaging survivors to play an active role in managing their health and encourage shared decision-making with their providers.
Sharik 1.0: User Needs and System Requirements for a Web-Based Tool to Support Collaborative Sensemaking
Ghajar-Khosravi, Shadi
2016-05-01
...share the new intelligence items with their peers. In this report, the authors describe Sharik (SHAring Resources, Information, and Knowledge), a Web-based tool that facilitates the...
Griffiths, Alex; Beaussier, Anne-Laure; Demeritt, David; Rothstein, Henry
2017-02-01
The Care Quality Commission (CQC) is responsible for ensuring the quality of the health and social care delivered by more than 30 000 registered providers in England. With only limited resources for conducting on-site inspections, the CQC has used statistical surveillance tools to help it identify which providers it should prioritise for inspection. In the face of planned funding cuts, the CQC plans to put more reliance on statistical surveillance tools to assess risks to quality and prioritise inspections accordingly. To evaluate the ability of the CQC's latest surveillance tool, Intelligent Monitoring (IM), to predict the quality of care provided by National Health Service (NHS) hospital trusts so that those at greatest risk of providing poor-quality care can be identified and targeted for inspection. The predictive ability of the IM tool is evaluated through regression analyses and χ² testing of the relationship between the quantitative risk score generated by the IM tool and the subsequent quality rating awarded following detailed on-site inspection by large expert teams of inspectors. First, the continuous risk scores generated by the CQC's IM statistical surveillance tool cannot predict inspection-based quality ratings of NHS hospital trusts (OR 0.38 (0.14 to 1.05) for Outstanding/Good, OR 0.94 (0.80 to 1.10) for Good/Requires improvement, and OR 0.90 (0.76 to 1.07) for Requires improvement/Inadequate). Second, the risk scores cannot be used more simply to distinguish the trusts performing poorly (those subsequently rated either 'Requires improvement' or 'Inadequate') from the trusts performing well (those subsequently rated either 'Good' or 'Outstanding') (OR 1.07 (0.91 to 1.26)). Classifying CQC's risk bandings 1-3 as high risk and 4-6 as low risk, 11 of the high risk trusts were performing well and 43 of the low risk trusts were performing poorly, resulting in an overall accuracy rate of 47.6%. Third, the risk scores cannot be used even more simply to distinguish the worst performing trusts (those subsequently rated 'Inadequate') from the remaining, better performing trusts (OR 1.11 (0.94 to 1.32)). Classifying CQC's risk banding 1 as high risk and 2-6 as low risk, the highest overall accuracy rate of 72.8% was achieved, but still only 6 of the 13 Inadequate trusts were correctly classified as being high risk. Since the IM statistical surveillance tool cannot predict the outcome of NHS hospital trust inspections, it cannot be used for prioritisation. A new approach to inspection planning is therefore required. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
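The banding-based classification reported above can be reproduced from a 2×2 table. The two off-diagonal cells (11 well-performing trusts flagged high risk; 43 poorly performing trusts flagged low risk) come from the text; the diagonal cells below are assumptions chosen only so that the table reproduces the reported 47.6% overall accuracy, since the abstract does not give them.

```python
# Worked check of the bandings 1-3 vs 4-6 classification. Off-diagonal cells
# are from the abstract; diagonal cells (30, 19) are assumed so the totals
# match the reported 47.6% accuracy.
import numpy as np
from scipy.stats import chi2_contingency

#                 performing poorly   performing well
table = np.array([[30,                11],    # bandings 1-3 (flagged high risk)
                  [43,                19]])   # bandings 4-6 (flagged low risk)

correct = table[0, 0] + table[1, 1]    # poor & flagged high, well & flagged low
accuracy = correct / table.sum()
chi2, p, _, _ = chi2_contingency(table)
print(f"Accuracy: {accuracy:.1%}")     # 47.6%
print(f"Chi-squared: {chi2:.2f}, p = {p:.2f}")
```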
Woods, Cindy; Carlisle, Karen; Larkins, Sarah; Thompson, Sandra Claire; Tsey, Komla; Matthews, Veronica; Bailie, Ross
2017-01-01
Continuous Quality Improvement is a process for raising the quality of primary health care (PHC) across Indigenous PHC services. In addition to clinical auditing using plan, do, study, and act cycles, engaging staff in a process of reflecting on systems to support quality care is vital. The One21seventy Systems Assessment Tool (SAT) supports staff to assess systems performance in terms of five key components. This study examines quantitative and qualitative SAT data from five high-improving Indigenous PHC services in northern Australia to understand the systems used to support quality care. High-improving services selected for the study were determined by calculating quality of care indices for Indigenous health services participating in the Audit and Best Practice in Chronic Disease National Research Partnership. Services that reported continuing high improvement in quality of care delivered across two or more audit tools in three or more audits were selected for the study. Precollected SAT data (from annual team SAT meetings) are presented longitudinally using radar plots for quantitative scores for each component, and content analysis is used to describe strengths and weaknesses of performance in each system component. High-improving services were able to demonstrate strong processes for assessing system performance and consistent improvement in systems to support quality care across components. Key strengths in the quality support systems included an adequate and well-oriented workforce, appropriate health system supports, and engagement with other organizations and the community, while the weaknesses included a lack of service infrastructure; difficulties with staff recruitment, retention, and support; and additional costs. Qualitative data revealed clear voices from health service staff expressing concerns with performance, and subsequent SAT data provided evidence of changes made to address concerns. Learning from the processes and strengths of high-improving services may be useful as we work with services striving to improve the quality of care provided in other areas.
Ruff, Jesley C; Herndon, Jill Boylston; Horton, Roger A; Lynch, Julie; Mathwig, Dawn C; Leonard, Audra; Aravamudhan, Krishna
2017-10-27
Health registries are commonly used in medicine to support public health activities and are increasingly used in quality improvement (QI) initiatives. Illustrations of dental registries and their QI applications are lacking. Within dentistry, caries risk assessment implementation and documentation are vital to optimal patient care. The purpose of this article is to describe the processes used to develop a caries risk assessment registry as a QI initiative to support clinical caries risk assessment, caries prevention, and disease management for children. Developmental steps reflected Agency for Healthcare Research and Quality recommendations for planning QI registries and included engaging "champions," defining the project, identifying registry features, defining performance dashboard indicators, and pilot testing with participant feedback. We followed Standards for Quality Improvement Reporting Excellence guidelines. Registry eligibility comprises patients aged 0-17 years. QI tools include prompts to register eligible patients; decision support tools grounded in evidence-based guidelines; and performance dashboard reports delivered at the provider and aggregated levels at regular intervals. The registry was successfully piloted in two practices, with documented caries risk assessment increasing from 57 percent to 92 percent and positive feedback regarding the potential to improve dental practice patient centeredness, patient engagement and education, and quality of care. The caries risk assessment registry demonstrates how dental registries may be used in QI efforts to promote joint patient and provider engagement, foster shared decision making, and systematically collect patient information to generate timely and actionable data to improve care quality and patient outcomes at the individual and population levels. © 2017 American Association of Public Health Dentistry.
Assessment of SOAP note evaluation tools in colleges and schools of pharmacy.
Sando, Karen R; Skoy, Elizabeth; Bradley, Courtney; Frenzel, Jeanne; Kirwin, Jennifer; Urteaga, Elizabeth
2017-07-01
To describe current methods used to assess SOAP notes in colleges and schools of pharmacy. Members of the American Association of Colleges of Pharmacy Laboratory Instructors Special Interest Group were invited to share assessment tools for SOAP notes. Content of submissions was evaluated to characterize overall qualities and how the tools assessed subjective, objective, assessment, and plan information. Thirty-nine assessment tools from 25 schools were evaluated. Twenty-nine (74%) of the tools were rubrics and ten (26%) were checklists. All rubrics included analytic scoring elements, while two (7%) mixed holistic and analytic scoring elements. The most common rating scale among the rubrics was a four-item scale (35%). Substantial variability existed in how tools evaluated subjective and objective sections. All tools included problem identification in the assessment section. Other assessment items included goals (82%) and rationale (69%). Seventy-seven percent assessed drug therapy; however, only 33% assessed non-drug therapy. Other plan items included education (59%) and follow-up (90%). There is a great deal of variation in the specific elements used to evaluate SOAP notes in colleges and schools of pharmacy. Improved consistency in assessment methods to evaluate SOAP notes may better prepare students to produce standardized documentation when entering practice. Copyright © 2017 Elsevier Inc. All rights reserved.
Executive Skills for Busy School Leaders
ERIC Educational Resources Information Center
Hitch, Chris; Coley, David C.
2010-01-01
This comprehensive and practical handbook offers research-based tools to help you fulfill all of your leadership responsibilities on time and with laser-like focus. The authors also share tips from their combined experiences as elementary, middle, and high school principals. This book provides examples of best practices from the business and…
Beyond the Movie Screen: An Antarctic Adventure
ERIC Educational Resources Information Center
Cajigal, Aris Reynold V.; Chamrat, Suthida; Tippins, Deborah; Mueller, Mike; Thomson, Norman
2011-01-01
Movies depicting science-related issues often capture the attention of today's youth. As an instructional tool, movies can take us beyond the drama and action and thrilling scenes. In this article we share our experiences of using the movie "Eight Below" as a centerpiece for developing high school students' understanding of basic…
xGDBvm: A Web GUI-Driven Workflow for Annotating Eukaryotic Genomes in the Cloud.
Duvick, Jon; Standage, Daniel S; Merchant, Nirav; Brendel, Volker P
2016-04-01
Genome-wide annotation of gene structure requires the integration of numerous computational steps. Currently, annotation is arguably best accomplished through collaboration of bioinformatics and domain experts, with broad community involvement. However, such a collaborative approach is not scalable at today's pace of sequence generation. To address this problem, we developed the xGDBvm software, which uses an intuitive graphical user interface to access a number of common genome analysis and gene structure tools, preconfigured in a self-contained virtual machine image. Once their virtual machine instance is deployed through iPlant's Atmosphere cloud services, users access the xGDBvm workflow via a unified Web interface to manage inputs, set program parameters, configure links to high-performance computing (HPC) resources, view and manage output, apply analysis and editing tools, or access contextual help. The xGDBvm workflow will mask the genome, compute spliced alignments from transcript and/or protein inputs (locally or on a remote HPC cluster), predict gene structures and gene structure quality, and display output in a public or private genome browser complete with accessory tools. Problematic gene predictions are flagged and can be reannotated using the integrated yrGATE annotation tool. xGDBvm can also be configured to append or replace existing data or load precomputed data. Multiple genomes can be annotated and displayed, and outputs can be archived for sharing or backup. xGDBvm can be adapted to a variety of use cases including de novo genome annotation, reannotation, comparison of different annotations, and training or teaching. © 2016 American Society of Plant Biologists. All rights reserved.
CloudMan as a platform for tool, data, and analysis distribution.
Afgan, Enis; Chapman, Brad; Taylor, James
2012-11-27
Cloud computing provides an infrastructure that facilitates large-scale computational analysis in a scalable, democratized fashion. However, in this context it is difficult to ensure sharing of an analysis environment and associated data in a scalable and precisely reproducible way. CloudMan (usecloudman.org) enables individual researchers to easily deploy, customize, and share their entire cloud analysis environment, including data, tools, and configurations. With the enabled customization and sharing of instances, CloudMan can be used as a platform for collaboration. The presented solution improves accessibility of cloud resources, tools, and data to the level of an individual researcher and contributes toward reproducibility and transparency of research solutions.
Okamura, Kyoko; Bernstein, Judith; Fidler, Anne T
2002-01-01
The Internet has become a major source of health information for women, but information placed on the World Wide Web does not routinely undergo a peer review process before dissemination. In this study, we present an analysis of 197 infertility-related Web sites for quality and accountability, using JAMA's minimal core standards for responsible print. Only 2% of the Web sites analyzed met all four recommended standards, and 50.8% failed to report any of the four. Commercial Web sites were more likely to fail to meet minimum standards (71.2%) than those with educational (46.8%) or supportive (29.8%) elements. Web sites with educational and informational components were most common (70.6%), followed by commercial sites (52.8%) and sites that offered a forum for infertility support and activism (28.9%). Internet resources available to infertile patients are at best variable. The current state of infertility-related materials on the World Wide Web offers unprecedented opportunities to improve services to a growing number of e-health users. Because of variations in quality of site content, women's health clinicians must assume responsibility for a new role as information monitor. This study provides assessment tools clinicians can apply and share with clients.
Food management behaviours in food-insecure, lone mother-led families.
Sim, S Meaghan; Glanville, N Theresa; McIntyre, Lynn
2011-01-01
Little is known about how food is managed in households where food resources are scarce. In this study, the household food management behaviours utilized by food-insecure, lone mother-led families from Atlantic Canada were characterized, and relationships among these behaviours and diet quality were examined. Thematic analysis of 24 in-depth interviews from a larger study of mother-led, low-income families was integrated with sociodemographic characteristics, food-insecurity status, and four weekly 24-hour dietary recalls for all household members to yield a family behaviour score (FBS) as a summative measure of food management behaviours, and a healthy plate score (HPS) as a measure of diet quality. Five distinct food management behaviours were identified: authoritative, healthism, sharing, structured, and planning behaviours. An increase in the FBS was associated with a proportional increase in the HPS. Authoritative, healthism, and planning food management behaviours were the strongest predictors of the HPS for all household members (p<0.05). The structured management behaviour was related to the degree of food insecurity. The FBS and HPS tools hold promise as a way to identify food-insecure families at risk of low diet quality. The next phase of this research will validate the use of these tools in the practice setting.
The share of ultra-processed foods determines the overall nutritional quality of diets in Brazil.
Louzada, Maria Laura da Costa; Ricardo, Camila Zancheta; Steele, Euridice Martinez; Levy, Renata Bertazzi; Cannon, Geoffrey; Monteiro, Carlos Augusto
2018-01-01
Objective: To estimate the dietary share of ultra-processed foods and to determine its association with the overall nutritional quality of diets in Brazil. Design: Cross-sectional. Setting: Brazil. Subjects: A representative sample of 32 898 Brazilians aged ≥10 years was studied. Food intake data were collected. We calculated the average dietary content of individual nutrients and compared them across quintiles of energy share of ultra-processed foods. Then we identified nutrient-based dietary patterns, and evaluated the association between quintiles of dietary share of ultra-processed foods and the patterns' scores. The mean per capita daily dietary energy intake was 7933 kJ (1896 kcal), with 58·1 % from unprocessed or minimally processed foods, 10·9 % from processed culinary ingredients, 10·6 % from processed foods and 20·4 % from ultra-processed foods. Consumption of ultra-processed foods was directly associated with high consumption of free sugars and total, saturated and trans fats, and with low consumption of protein, dietary fibre, and most of the assessed vitamins and minerals. Four nutrient-based dietary patterns were identified. 'Healthy pattern 1' carried more protein and micronutrients, and less free sugars. 'Healthy pattern 2' carried more vitamins. 'Healthy pattern 3' carried more dietary fibre and minerals and less free sugars. 'Unhealthy pattern' carried more total, saturated and trans fats, and less dietary fibre. The dietary share of ultra-processed foods was inversely associated with 'healthy pattern 1' (-0·16; 95 % CI -0·17, -0·15) and 'healthy pattern 3' (-0·18; 95 % CI -0·19, -0·17), and directly associated with 'unhealthy pattern' (0·17; 95 % CI 0·15, 0·18). Dietary share of ultra-processed foods determines the overall nutritional quality of diets in Brazil.
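As a minimal sketch of the pattern-extraction step described above, the code below derives a nutrient-based dietary pattern by principal component analysis and correlates its score with the ultra-processed food share. The data are simulated and the method is simplified relative to the paper's own analysis.

```python
# Sketch of nutrient-based dietary pattern scoring. All data are simulated;
# the nutrient set and effect sizes are illustrative, not the study's.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
n = 500
upf_share = rng.uniform(0, 0.6, n)                  # energy share of UPF
nutrients = np.column_stack([
    20 - 10 * upf_share + rng.normal(0, 2, n),      # protein (% energy)
    10 + 15 * upf_share + rng.normal(0, 2, n),      # free sugars (% energy)
    12 - 8 * upf_share + rng.normal(0, 2, n),       # fibre (g/1000 kcal)
])

# First principal component of standardized nutrient intakes = pattern score.
# Note: the sign of a principal component is arbitrary.
scores = PCA(n_components=1).fit_transform(StandardScaler().fit_transform(nutrients))
r = np.corrcoef(upf_share, scores[:, 0])[0, 1]
print(f"Correlation of UPF share with first pattern score: {r:.2f}")
```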
The Future of Catalogers and Cataloging.
ERIC Educational Resources Information Center
Holley, Robert P.
1981-01-01
Future emphasis in cataloging will be on the sharing of high quality bibliographic records through a national network. As original cataloging decreases, catalogers, rather than disappearing, will more likely be managers of the library's bibliographic control system. (Author/RAA)
Using An Online Photo-Sharing Tool (Flickr) to Connect Students During Earth Science Week
NASA Astrophysics Data System (ADS)
Guertin, L. A.
2009-12-01
At the university level, some faculty desire to have their students connect with middle school and high school students for activities and discussions relating to Earth science. Unfortunately, it is not always feasible to coordinate face-to-face meetings of the students, especially when trying to forge connections with schools located at a distance. Therefore, I have turned to an online tool to forge the connections for an Earth science outreach activity - specifically, the use of the photo-sharing tool Flickr, http://www.flickr.com. Flickr is an online photo management and sharing application that allows for the creation of a community with authorized members to contribute images viewable by the general public. For this project, the participating student community included undergraduates from Penn State University, as well as middle school and high school students from Delaware, Michigan, Kentucky, and North Carolina. To frame the project, I decided a theme should be selected for the students. I selected the 2009 Earth Science Week (ESW) photography contest theme, How Climate Shapes My World, as I felt it was important to have the students connect with a nationwide celebration and exploration of this topic. Students were encouraged to consider what the theme meant to them and how to represent that through a photograph. Each student was required to provide a title and description for the photograph contributed to the Flickr group (http://www.flickr.com/groups/earthscienceweek2009). As this Flickr project was only a collaboration and sharing of photos and not a contest, the students were encouraged not only to post their photos in Flickr but also to submit them to the actual ESW contest. The deadline to post the photographs online in Flickr was set for the end of Earth Science Week. The key to the ESW Flickr project was not just the taking and viewing of photos. The Flickr website is designed with the idea of social networking around an image. Flickr facilitated a dialogue that had students talking to each other, focusing on an academic topic. I was excited to be able to show students an academic use of Flickr versus just posting and organizing personal photographs for social networking. After the submission deadline, the Penn State students were required to go back into Flickr and post a comment under each photograph submitted by the middle school and high school students, as well as begin a discussion thread on the overall theme. Overall, the project demonstrated how an online scholarly community can be created to share photos and engage in discussion with student participants in separate locations. Flickr can be effective as an online social networking tool to foster collaboration and innovation in a virtual academic community.
Distribution and Validation of CERES Irradiance Global Data Products Via Web Based Tools
NASA Technical Reports Server (NTRS)
Rutan, David; Mitrescu, Cristian; Doelling, David; Kato, Seiji
2016-01-01
The CERES SYN1deg product provides climate-quality, 3-hourly, globally gridded, and temporally complete maps of top-of-atmosphere, in-atmosphere, and surface fluxes. This product requires efficient release to the public and validation to maintain quality assurance. The CERES team developed web tools for the distribution of both the global gridded products and grid boxes that contain long-term validation sites that maintain high-quality flux observations at the Earth's surface. These are found at: http://ceres.larc.nasa.gov/order_data.php. In this poster we explore the various tools available to users to subset, download, and validate the SYN1deg and Surface-EBAF products against surface observations. We also analyze differences found in long-term records from well-maintained land surface sites, such as the ARM central facility, and from high-quality buoy radiometers, which, due to their isolated nature, cannot be maintained in the same manner as their land-based counterparts.
A scoping review of patient discharge from intensive care: opportunities and tools to improve care.
Stelfox, Henry T; Lane, Dan; Boyd, Jamie M; Taylor, Simon; Perrier, Laure; Straus, Sharon; Zygun, David; Zuege, Danny J
2015-02-01
We conducted a scoping review to systematically review the literature reporting patient discharge from ICUs, identify facilitators and barriers to high-quality care, and describe tools developed to improve care. We searched Medline, Embase, CINAHL, and the Cochrane Central Register of Controlled Trials. Data were extracted on the article type, study details for research articles, patient population, phase of care during discharge, and dimensions of health-care quality. From 8,154 unique publications we included 224 articles. Of these, 131 articles (58%) were original research, predominantly case series (23%) and cohort (16%) studies; 12% were narrative reviews; and 11% were guidelines/policies. Common themes included patient and family needs/experiences (29% of articles) and the importance of complete and accurate information (26%). Facilitators of high-quality care included provider-patient communication (30%), provider-provider communication (25%), and the use of guidelines/policies (29%). Patient and family anxiety (21%) and limited availability of ICU and ward resources (26%) were reported barriers to high-quality care. A total of 47 tools to facilitate patient discharge from the ICU were identified and focused on patient evaluation for discharge (29%), discharge planning and teaching (47%), and optimized discharge summaries (23%). Common themes, facilitators and barriers related to patient and family needs/experiences, communication, and the use of guidelines/policies to standardize patient discharge from ICU transcend the literature. Candidate tools to improve care are available; comparative evaluation is needed prior to broad implementation and could be tested through local quality-improvement programs.
BikeMaps.org: A Global Tool for Collision and Near Miss Mapping.
Nelson, Trisalyn A; Denouden, Taylor; Jestico, Benjamin; Laberee, Karen; Winters, Meghan
2015-01-01
There are many public health benefits to cycling, such as chronic disease reduction and improved air quality. Real and perceived concerns about safety are primary barriers to new ridership. Due to limited forums for official reporting of cycling incidents, lack of comprehensive data is limiting our ability to study cycling safety and conduct surveillance. Our goal is to introduce BikeMaps.org, a new website developed by the authors for crowd-source mapping of cycling collisions and near misses. BikeMaps.org is a global mapping system that allows citizens to map locations of cycling incidents and report on the nature of the event. Attributes collected are designed for spatial modeling research on predictors of safety and risk, and to aid surveillance and planning. Released in October 2014, within 2 months the website had more than 14,000 visitors and mapping in 14 countries. Collisions represent 38% of reports (134/356) and near misses 62% (222/356). In our pilot city, Victoria, Canada, citizens mapped data equivalent to about 1 year of official cycling collision reports within 2 months via BikeMaps.org. Using report completeness as an indicator, early reports indicate that data are of high quality with 50% being fully attributed and another 10% having only one missing attribute. We are advancing this technology, with the development of a mobile App, improved data visualization, real-time altering of hazard reports, and automated open-source tools for data sharing. Researchers and citizens interested in utilizing the BikeMaps.org technology can get involved by encouraging citizen mapping in their region.
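The report-completeness indicator used above as a data-quality proxy is straightforward to compute; here is a small sketch with invented report records (the attribute names and the example proportions are illustrative, not BikeMaps.org's schema or results).

```python
# Completeness of crowd-sourced reports as a simple data-quality indicator:
# share of reports with no missing attributes, and with exactly one missing.
# Records and attribute names are invented for illustration.
reports = [
    {"type": "collision", "injury": "minor", "surface": "wet", "lighting": "day"},
    {"type": "near miss", "injury": None,    "surface": "dry", "lighting": "day"},
    {"type": "collision", "injury": "none",  "surface": None,  "lighting": None},
]

def missing_count(report):
    # Count attributes the reporter left blank.
    return sum(value is None for value in report.values())

full = sum(missing_count(r) == 0 for r in reports) / len(reports)
one_missing = sum(missing_count(r) == 1 for r in reports) / len(reports)
print(f"Fully attributed: {full:.0%}; one attribute missing: {one_missing:.0%}")
```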
Electronic immunization data collection systems: application of an evaluation framework.
Heidebrecht, Christine L; Kwong, Jeffrey C; Finkelstein, Michael; Quan, Sherman D; Pereira, Jennifer A; Quach, Susan; Deeks, Shelley L
2014-01-14
Evaluating the features and performance of health information systems can serve to strengthen the systems themselves as well as to guide other organizations in the process of designing and implementing surveillance tools. We adapted an evaluation framework in order to assess electronic immunization data collection systems, and applied it in two Ontario public health units. The Centers for Disease Control and Prevention's Guidelines for Evaluating Public Health Surveillance Systems are broad in nature and serve as an organizational tool to guide the development of comprehensive evaluation materials. Based on these Guidelines, and informed by other evaluation resources and input from stakeholders in the public health community, we applied an evaluation framework to two examples of immunization data collection and examined several system attributes: simplicity, flexibility, data quality, timeliness, and acceptability. Data collection approaches included key informant interviews, logic and completeness assessments, client surveys, and on-site observations. Both evaluated systems allow high-quality immunization data to be collected, analyzed, and applied in a rapid fashion. However, neither system is currently able to link to other providers' immunization data or provincial data sources, limiting the comprehensiveness of coverage assessments. We recommended that both organizations explore possibilities for external data linkage and collaborate with other jurisdictions to promote a provincial immunization repository or data sharing platform. Electronic systems such as the ones described in this paper allow immunization data to be collected, analyzed, and applied in a rapid fashion, and represent the infostructure required to establish a population-based immunization registry, critical for comprehensively assessing vaccine coverage.
Total Quality Management: A Recipe for Success
1990-04-02
Total Quality Management (TQM) is a high-level Department of Defense (DOD) initiative that is being touted as the primary management tool to force...to create a DOD-wide organizational climate that will stimulate and perpetuate individual productivity-enhancing contributions. Keywords: Quality control; Quality management; TQM.
New educational tools to encourage high-school students' activity in STEM
NASA Astrophysics Data System (ADS)
Mayorova, Vera; Grishko, Dmitriy; Leonov, Victor
2018-01-01
Many students have to choose their future profession during their last years in high school and therefore must choose a university where they will get a proper education. That choice may define their professional life for many years ahead, or possibly for the rest of their lives. Bauman Moscow State Technical University conducts various events to introduce future professions to high-school students. Such activity helps them to pick a specialization in line with their interests and motivates them to study key scientific subjects. The paper focuses on newly developed educational tools to encourage high-school students' interest in STEM disciplines. These tools include laboratory courses developed in the fields of physics, information technology, and mathematics. More than 2000 high-school students have already participated in these experimental courses. These activities are aimed at increasing the quality of learning in STEM disciplines, which will result in better-trained future engineers.
Informatics methods to enable sharing of quantitative imaging research data.
Levy, Mia A; Freymann, John B; Kirby, Justin S; Fedorov, Andriy; Fennessy, Fiona M; Eschrich, Steven A; Berglund, Anders E; Fenstermacher, David A; Tan, Yongqiang; Guo, Xiaotao; Casavant, Thomas L; Brown, Bartley J; Braun, Terry A; Dekker, Andre; Roelofs, Erik; Mountz, James M; Boada, Fernando; Laymon, Charles; Oborski, Matt; Rubin, Daniel L
2012-11-01
The National Cancer Institute Quantitative Imaging Network (QIN) is a collaborative research network whose goal is to share data, algorithms, and research tools to accelerate quantitative imaging research. A challenge is the variability in tools and analysis platforms used in quantitative imaging. Our goal was to understand the extent of this variation and to develop an approach to enable data sharing and to promote reuse of quantitative imaging data in the community. We performed a survey of the current tools in use by the QIN member sites for representation and storage of their QIN research data, including images, image meta-data, and clinical data. We identified existing systems and standards for data sharing and their gaps for the QIN use case. We then proposed a system architecture to enable data sharing and collaborative experimentation within the QIN. There are a variety of tools currently used by each QIN institution. We developed a general information system architecture to support the QIN goals. We also describe the remaining architecture gaps we are developing to enable members to share research images and image meta-data across the network. As a research network, the QIN will stimulate quantitative imaging research by pooling data, algorithms, and research tools. However, there are gaps in current functional requirements that will need to be met by future informatics development. Special attention must be given to the technical requirements needed to translate these methods into the clinical research workflow to enable validation and qualification of these novel imaging biomarkers. Copyright © 2012 Elsevier Inc. All rights reserved.
Internet Interventions for Long-Term Conditions: Patient and Caregiver Quality Criteria
Murray, Elizabeth; Stevenson, Fiona; Gore, Charles; Nazareth, Irwin
2006-01-01
Background Interactive health communication applications (IHCAs) that combine high-quality health information with interactive components, such as self-assessment tools, behavior change support, peer support, or decision support, are likely to benefit people with long-term conditions. IHCAs are now largely Web-based and are becoming known as "Internet interventions." Although there are numerous professionally generated criteria to assess health-related websites, to date there has been scant exploration of patient-generated assessment criteria even though patients and professionals use different criteria for assessing the quality of traditional sources of health information. Objective We aimed to determine patients' and caregivers' requirements of IHCAs for long-term conditions as well as their criteria for assessing the quality of different programs. Methods This was a qualitative study with focus groups. Patients and caregivers managing long-term conditions used three (predominantly Web-based) IHCAs relevant to their condition and subsequently discussed the strengths and weaknesses of the different IHCAs in focus groups. Participants in any one focus group all shared the same long-term condition and viewed the same three IHCAs. Patient and caregiver criteria for IHCAs emerged from the data. Results There were 40 patients and caregivers who participated in 10 focus groups. Participants welcomed the potential of Internet interventions but felt that many were not achieving their full potential. Participants generated detailed and specific quality criteria relating to information content, presentation, interactivity, and trustworthiness, which can be used by developers and purchasers of Internet interventions. Conclusions The user-generated quality criteria reported in this paper should help developers and purchasers provide Internet interventions that better meet user needs. PMID:16954123
Otegui, Javier; Ariño, Arturo H
2012-08-15
In any data quality workflow, data publishers must become aware of issues in their data so these can be corrected. User feedback mechanisms provide one avenue, while global assessments of datasets provide another. To date, there is no publicly available tool that allows both the biodiversity data institutions sharing their data through the Global Biodiversity Information Facility network and the network's potential users to assess datasets as a whole. To help bridge this gap for both publishers and users, we introduce the BIoDiversity DataSets Assessment Tool (BIDDSAT), an online tool that enables selected diagnostic visualizations on the content of data publishers and/or their individual collections. The online application is accessible at http://www.unav.es/unzyec/mzna/biddsat/ and is supported by all major browsers. The source code is licensed under the GNU GPLv3 license (http://www.gnu.org/licenses/gpl-3.0.txt) and is available at https://github.com/jotegui/BIDDSAT.
Bala, Adarsh; Gupta, B M
2010-01-01
This study analyses the research output of India in neurosciences during the period 1999-2008; the analyses cover research growth, rank, global publications' share, citation impact, share of international collaborative papers and major collaborative partner countries, and patterns of research communication in the most productive journals. It also analyses the characteristics of the most productive institutions, authors, and high-cited papers. The publication output and impact of India are also compared with those of China, Brazil, and South Korea. The Scopus citation database was used for retrieving the publication output of India and other countries in neurosciences during 1999-2008. India's global publications' share in neurosciences during the study period was 0.99% (with 4503 papers) and it ranked 21st among the top 26 countries in neurosciences. The average annual publication growth rate was 11.37%, the share of international collaborative papers was 17.34%, and the average citation per paper was 4.21. India was far behind China, Brazil, and South Korea in terms of publication output, citation quality, and share of international collaborative papers in neurosciences. India is far behind other countries with emerging economies in terms of publication output, citation quality, and share of international collaborative papers in neurosciences. There is an urgent need to substantially increase research activity in the field of neurosciences in India.
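For readers who want to see how these headline indicators relate, here is a short worked computation using only the figures reported above; the first-year output is inferred under a constant-growth assumption that the study does not state.

```python
# Worked arithmetic from the reported totals: 4503 papers over 1999-2008,
# 11.37% average annual growth, 4.21 citations per paper. The constant-growth
# assumption and the inferred first-year count are illustrative.
papers_total = 4503
growth = 0.1137                      # average annual publication growth rate

# With constant growth g over 10 years, year-1 output p satisfies
# p * ((1 + g)**10 - 1) / g = total.
p1 = papers_total * growth / ((1 + growth) ** 10 - 1)
print(f"Implied first-year output: {p1:.0f} papers")

citations_total = round(4.21 * papers_total)
print(f"Implied total citations: {citations_total}")
print(f"Citations per paper: {citations_total / papers_total:.2f}")
```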
Simulation as a vehicle for enhancing collaborative practice models.
Jeffries, Pamela R; McNelis, Angela M; Wheeler, Corinne A
2008-12-01
Clinical simulation used in a collaborative practice approach is a powerful tool to prepare health care providers for shared responsibility for patient care. Clinical simulations are being used increasingly in professional curricula to prepare providers for quality practice. Little is known, however, about how these simulations can be used to foster collaborative practice across disciplines. This article provides an overview of what simulation is, what collaborative practice models are, and how to set up a model using simulations. An example of a collaborative practice model is presented, and nursing implications of using a collaborative practice model in simulations are discussed.
Sreedharan, Vipin T; Schultheiss, Sebastian J; Jean, Géraldine; Kahles, André; Bohnert, Regina; Drewe, Philipp; Mudrakarta, Pramod; Görnitz, Nico; Zeller, Georg; Rätsch, Gunnar
2014-05-01
We present Oqtans, an open-source workbench for quantitative transcriptome analysis integrated into Galaxy. Its distinguishing features include customizable computational workflows and a modular pipeline architecture that facilitates comparative assessment of tool and data quality. Oqtans integrates an assortment of machine-learning-powered tools into Galaxy that perform as well as or better than state-of-the-art tools. The implemented tools comprise a complete transcriptome analysis workflow: short-read alignment, transcript identification/quantification, and differential expression analysis. Oqtans and Galaxy facilitate persistent storage, data exchange, and documentation of intermediate results and analysis workflows. We illustrate how Oqtans aids the interpretation of data from different experiments in easy-to-understand use cases. Users can easily create their own workflows and extend Oqtans by integrating specific tools. Oqtans is available as (i) a cloud machine image with a demo instance at cloud.oqtans.org, (ii) a public Galaxy instance at galaxy.cbio.mskcc.org, (iii) a git repository containing all installed software (oqtans.org/git), most of which is also available from (iv) the Galaxy Toolshed, and (v) a share string to use along with Galaxy CloudMan.
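Because Oqtans pipelines are ordinary Galaxy workflows, they can also be driven programmatically. A minimal sketch using the BioBlend Galaxy client follows; the URL, API key, and object IDs are placeholders, and the exact inputs mapping depends on the workflow being run.

    # Sketch: listing and launching a Galaxy workflow (e.g., an Oqtans
    # pipeline) with BioBlend. URL, API key, and object IDs are placeholders.
    from bioblend.galaxy import GalaxyInstance

    gi = GalaxyInstance(url="https://galaxy.example.org", key="YOUR_API_KEY")

    # Enumerate the workflows visible to this account.
    for wf in gi.workflows.get_workflows():
        print(wf["id"], wf["name"])

    # Invoke one workflow into a fresh history; the inputs mapping pairs
    # workflow input steps with datasets already present in Galaxy.
    history = gi.histories.create_history(name="rna-seq-run")
    invocation = gi.workflows.invoke_workflow(
        "WORKFLOW_ID",
        inputs={"0": {"src": "hda", "id": "DATASET_ID"}},
        history_id=history["id"],
    )
    print("invocation state:", invocation["state"])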
From Good to Great: Discussion Starter Tool
ERIC Educational Resources Information Center
Center on Great Teachers and Leaders, 2014
2014-01-01
In the report "From Good to Great: Exemplary Teachers Share Perspectives on Increasing Teacher Effectiveness across the Career Continuum," (See full report in ERIC at ED555657) National and State Teachers of the Year shared their views on what helped them become great teachers. This accompanying "Discussion Starter Tool" builds…
Development and Classroom Implementation of an Environmental Data Creation and Sharing Tool
ERIC Educational Resources Information Center
Brogan, Daniel S.; McDonald, Walter M.; Lohani, Vinod K.; Dymond, Randel L.; Bradner, Aaron J.
2016-01-01
Education is essential for solving the complex water-related challenges facing society. The Learning Enhanced Watershed Assessment System (LEWAS) and the Online Watershed Learning System (OWLS) provide data creation and data sharing infrastructures, respectively, that combine to form an environmental learning tool. This system collects, integrates…
Many of us nowadays invest significant amounts of time in sharing our activities and opinions with friends and family via social networking tools such as Facebook, Twitter or other related websites. However, despite the availability of many platforms for scientists to connect and...
Thinking strategically about capitation.
Boland, P
1997-05-01
All managed care stakeholders--health plan members, employers, providers, community organizations, and government entities--share a common interest in reducing healthcare costs while improving the quality of care health plan members receive. Although capitation is usually thought of primarily as a payment mechanism, it can be a powerful tool that providers and health plans can use to accomplish these strategic objectives and others, such as restoring and maintaining the health of plan members or improving a community's health status. For capitation to work effectively as a strategic tool, its use must be tied to a corporate agenda of partnering with stakeholders to achieve broader strategic goals. Health plans and providers must develop a partnership strategy in which each stakeholder has well-defined roles and responsibilities. The capitation structure must reinforce interdependence, shift the focus from meeting organizational needs to meeting customer needs, and develop risk-driven care strategies.
Schifferdecker, Karen E; Adachi-Mejia, Anna M; Butcher, Rebecca L; O'Connor, Sharon; Li, Zhigang; Bazos, Dorothy A
2016-01-01
Action Learning Collaboratives (ALCs), whereby teams apply quality improvement (QI) tools and methods, have successfully improved patient care delivery and outcomes. We adapted and tested the ALC model as a community-based obesity prevention intervention focused on physical activity and healthy eating. The intervention used QI tools (e.g., progress monitoring) and team-based activities and was implemented in three communities through nine monthly meetings. To assess process and outcomes, we used a longitudinal repeated-measures and mixed-methods triangulation approach with a quasi-experimental design including objective measures at three time points. Most of the 97 participants were female (85.4%), White (93.8%), and non-Hispanic/Latino (95.9%). Average age was 52 years; 28.0% had annual household income of $20,000 or less; and mean body mass index was 35. Through mixed-effects models, we found some physical activity outcomes improved. Other outcomes did not significantly change. Although participants favorably viewed the QI tools, components of the QI process such as sharing goals and data on progress in teams and during meetings were limited. Participants' requests for more education or activities around physical activity and healthy eating, rather than progress monitoring and data sharing required for QI activities, challenged ALC model implementation. An ALC model for community-based obesity prevention may be more effective when applied to preexisting teams in community-based organizations. © 2015 Society for Public Health Education.
Span, Marijke; Hettinga, Marike; Groen-van de Ven, Leontine; Jukema, Jan; Janssen, Ruud; Vernooij-Dassen, Myrra; Eefsting, Jan; Smits, Carolien
2018-06-01
The aim of this study was to gain insight into the participatory design approach of involving people with dementia in the development of the DecideGuide, an interactive web tool facilitating shared decision-making in their care networks. An explanatory case study design was used in developing the DecideGuide. A secondary analysis focused on the data gathered from the participating people with dementia during the development stages: semi-structured interviews (n = 23), four focus group interviews (n = 18), usability tests (n = 3), and a field study (n = 4). Content analysis was applied to the data. Four themes emerged as important regarding the experiences of involving people with dementia in research: valuable feedback on the content and design of the DecideGuide, motivation to participate, perspectives of people with dementia and others about distress related to involvement, and time investment. People with dementia can give essential feedback and, therefore, their contribution is useful and valuable. Meaningful participation of people with dementia takes time, and this should be taken into account. It is important for people with dementia to be able to reciprocate the efforts others make and to feel of significance to others. Implications for Rehabilitation: People with dementia can contribute meaningfully to the content and design of tools, and their perspective is essential for developing useful and user-friendly tools. Participating in research activities may contribute to the social inclusion, empowerment, and quality of life of people with dementia.
Mamlin, Burke W; Tierney, William M
2016-01-01
Healthcare is an information business with expanding use of information and communication technologies (ICTs). Current ICT tools are immature, but a brighter future looms. We examine 7 areas of ICT in healthcare: electronic health records (EHRs), health information exchange (HIE), patient portals, telemedicine, social media, mobile devices and wearable sensors and monitors, and privacy and security. In each of these areas, we examine the current status and future promise, highlighting how each might reach its promise. Steps to better EHRs include a universal programming interface, universal patient identifiers, improved documentation and improved data analysis. HIEs require federal subsidies for sustainability and support from EHR vendors, targeting seamless sharing of EHR data. Patient portals must bring patients into the EHR with better design and training, greater provider engagement and leveraging HIEs. Telemedicine needs sustainable payment models, clear rules of engagement, quality measures and monitoring. Social media needs consensus on rules of engagement for providers, better data mining tools and approaches to counter disinformation. Mobile and wearable devices benefit from a universal programming interface, improved infrastructure, more rigorous research and integration with EHRs and HIEs. Laws for privacy and security need updating to match current technologies, and data stewards should share information on breaches and standardize best practices. ICT tools are evolving quickly in healthcare and require a rational and well-funded national agenda for development, use and assessment. Copyright © 2016 Southern Society for Clinical Investigation. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Sushko, Iurii; Novotarskyi, Sergii; Körner, Robert; Pandey, Anil Kumar; Rupp, Matthias; Teetz, Wolfram; Brandmaier, Stefan; Abdelaziz, Ahmed; Prokopenko, Volodymyr V.; Tanchuk, Vsevolod Y.; Todeschini, Roberto; Varnek, Alexandre; Marcou, Gilles; Ertl, Peter; Potemkin, Vladimir; Grishina, Maria; Gasteiger, Johann; Schwab, Christof; Baskin, Igor I.; Palyulin, Vladimir A.; Radchenko, Eugene V.; Welsh, William J.; Kholodovych, Vladyslav; Chekmarev, Dmitriy; Cherkasov, Artem; Aires-de-Sousa, Joao; Zhang, Qing-You; Bender, Andreas; Nigsch, Florian; Patiny, Luc; Williams, Antony; Tkachenko, Valery; Tetko, Igor V.
2011-06-01
The Online Chemical Modeling Environment (OCHEM) is a web-based platform that aims to automate and simplify the typical steps required for QSAR modeling. The platform consists of two major subsystems: the database of experimental measurements and the modeling framework. A user-contributed database contains a set of tools for easy input, search and modification of thousands of records. The OCHEM database is based on the wiki principle and focuses primarily on the quality and verifiability of the data. The database is tightly integrated with the modeling framework, which supports all the steps required to create a predictive model: data search, calculation and selection of a vast variety of molecular descriptors, application of machine learning methods, validation, analysis of the model and assessment of the applicability domain. As compared to other similar systems, OCHEM is not intended to re-implement the existing tools or models but rather to invite the original authors to contribute their results, make them publicly available, share them with other users and become members of the growing research community. Our intention is to make OCHEM a widely used platform for performing QSPR/QSAR studies online and sharing them with other users on the Web. The ultimate goal of OCHEM is collecting all possible chemoinformatics tools within one simple, reliable and user-friendly resource. OCHEM is free for web users and is available online at http://www.ochem.eu.
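As an illustration of the steps OCHEM automates (descriptor calculation, model fitting, validation), here is a minimal QSAR sketch using RDKit descriptors and a random forest. The molecules and property values are invented, and this is not OCHEM's own code.

    # Sketch of a minimal QSAR pipeline (SMILES -> descriptors -> model ->
    # validation), illustrating the steps OCHEM automates. Data are invented.
    from rdkit import Chem
    from rdkit.Chem import Descriptors
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import cross_val_score

    smiles = ["CCO", "CCCCO", "c1ccccc1", "CC(=O)O", "CCN", "CCCCCC"]
    prop = [0.5, -0.3, -1.6, 0.2, 0.1, -2.3]  # hypothetical measured property

    def featurize(smi):
        mol = Chem.MolFromSmiles(smi)
        return [Descriptors.MolWt(mol), Descriptors.MolLogP(mol),
                Descriptors.TPSA(mol), Descriptors.NumRotatableBonds(mol)]

    X = [featurize(s) for s in smiles]
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    scores = cross_val_score(model, X, prop, cv=3, scoring="r2")  # validation
    print("cross-validated R^2:", scores.mean())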
Ernecoff, Natalie C; Witteman, Holly O; Chon, Kristen; Chen, Yanquan Iris; Buddadhumaruk, Praewpannarai; Chiarchiaro, Jared; Shotsberger, Kaitlin J; Shields, Anne-Marie; Myers, Brad A; Hough, Catherine L; Carson, Shannon S; Lo, Bernard; Matthay, Michael A; Anderson, Wendy G; Peterson, Michael W; Steingrub, Jay S; Arnold, Robert M; White, Douglas B
2016-06-01
Although barriers to shared decision making in intensive care units are well documented, there are currently no easily scaled interventions to overcome these problems. We sought to assess stakeholders' perceptions of the acceptability, usefulness, and design suggestions for a tablet-based tool to support communication and shared decision making in ICUs. We conducted in-depth semi-structured interviews with 58 key stakeholders (30 surrogates and 28 ICU care providers). Interviews explored stakeholders' perceptions about the acceptability of a tablet-based tool to support communication and shared decision making, including the usefulness of modules focused on orienting families to the ICU, educating them about the surrogate's role, completing a question prompt list, eliciting patient values, educating about treatment options, eliciting perceptions about prognosis, and providing psychosocial support resources. The interviewer also elicited stakeholders' design suggestions for such a tool. We used constant comparative methods to identify key themes that arose during the interviews. Overall, 95% (55/58) of participants perceived the proposed tool to be acceptable, with 98% (57/58) of interviewees finding six or more of the seven content domains acceptable. Stakeholders identified several potential benefits of the tool including that it would help families prepare for the surrogate role and for family meetings as well as give surrogates time and a framework to think about the patient's values and treatment options. Key design suggestions included: conceptualize the tool as a supplement to rather than a substitute for surrogate-clinician communication; make the tool flexible with respect to how, where, and when surrogates can access the tool; incorporate interactive exercises; use video and narration to minimize the cognitive load of the intervention; and build an extremely simple user interface to maximize usefulness for individuals with low computer literacy. There is broad support among stakeholders for the use of a tablet-based tool to improve communication and shared decision making in ICUs. Eliciting the perspectives of key stakeholders early in the design process yielded important insights to create a tool tailored to the needs of surrogates and care providers in ICUs. Copyright © 2016 Elsevier Inc. All rights reserved.
WhatsApp Messenger as an Adjunctive Tool for Telemedicine: An Overview
2017-01-01
Background The advent of telemedicine has allowed physicians to deliver medical treatment to patients from a distance. Mobile apps such as WhatsApp Messenger, an instant messaging service, came as a novel concept in all fields of social life, including medicine. The use of instant messaging services has been shown to improve communication within medical teams by providing means for quick teleconsultation, information sharing, and starting treatment as soon as possible. Objective The aim of this study was to perform a comprehensive systematic review of present literature on the use of the WhatsApp Messenger app as an adjunctive health care tool for medical doctors. Methods Searches were performed in PubMed, EMBASE, and the Cochrane Library using the term “whatsapp*” in articles published before January 2016. A bibliography of all relevant original articles that used the WhatsApp Messenger app was created. The level of evidence of each study was determined according to the Oxford Levels of Evidence ranking system produced by the Oxford Centre for Evidence-Based Medicine. The impact and the indications of WhatsApp Messenger are discussed in order to understand the extent to which this app currently functions as an adjunctive tool for telemedicine. Results The database search identified a total of 30 studies in which the term “whatsapp*” was used. Each article’s list of references was evaluated item-by-item. After literature reviews, letters to the editor, and low-quality studies were excluded, a total of 10 studies were found to be eligible for inclusion. Of these studies, 9 had been published in the English language and 1 had been published in Spanish. Five were published by medical doctors. Conclusions The pooled data presents compelling evidence that the WhatsApp Messenger app is a promising system, whether used as a communication tool between health care professionals, as a means of communication between health care professionals and the general public, or as a learning tool for providing health care information to professionals or to the general population. However, high-quality and properly evaluated research is needed, as are improvements in descriptions of the methodology and the study processes. These improvements will allow WhatsApp Messenger to be categorically defined as an effective telemedicine tool in many different fields of health care. PMID:28733273
Evaluation of English Websites on Dental Caries by Using Consumer Evaluation Tools.
Blizniuk, Anastasiya; Furukawa, Sayaka; Ueno, Masayuki; Kawaguchi, Yoko
2016-01-01
To evaluate the quality of patient-oriented online information about dental caries using existing consumer evaluation tools and to judge the efficacy of these tools in quality assessment. The websites for the evaluation were pooled by using two general search engines (Google and Yahoo!). The search terms were: 'dental caries', 'tooth decay' and 'tooth cavity'. Three assessment tools (LIDA, DISCERN and FRES) were used to evaluate the quality of the information in the areas of accessibility, usability, reliability and readability. In total, 77 websites were analysed. The median scores for LIDA accessibility and usability were 45.0 and 8.0, respectively, corresponding to a medium level of quality. The median reliability scores for LIDA (12.0) and DISCERN (20.0) both corresponded to a low level of quality. Readability was high, with a median FRES score of 59.7. The websites on caries had good accessibility, usability and readability, while the reliability of the information was poor. The LIDA instrument was found to be more convenient than DISCERN and can be recommended to lay people for quick quality assessment.
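FRES here is the Flesch Reading Ease Score, a fixed formula over sentence, word, and syllable counts: FRES = 206.835 - 1.015 * (words/sentences) - 84.6 * (syllables/words); scores near 60, like the median reported above, read as plain English. A minimal sketch follows, using a naive vowel-group syllable heuristic (real readability tools use pronunciation dictionaries, so scores will differ slightly).

    # Flesch Reading Ease Score:
    #   206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)
    import re

    def count_syllables(word):
        # Naive heuristic: count groups of consecutive vowels.
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def fres(text):
        sentences = max(1, len(re.findall(r"[.!?]+", text)))
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(count_syllables(w) for w in words)
        return (206.835 - 1.015 * (len(words) / sentences)
                - 84.6 * (syllables / len(words)))

    sample = "Tooth decay damages the enamel. Brushing twice a day helps prevent it."
    print(round(fres(sample), 1))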
Spaner, Donna; Caraiscos, Valerie B; Muystra, Christina; Furman, Margaret Lynn; Zaltz-Dubin, Jodi; Wharton, Marilyn; Whitehead, Katherine
Optimal care for patients in the palliative care setting requires effective clinical teamwork. Communication may be challenging for health-care workers from different disciplines. Daily rounds are one way for clinical teams to share information and develop care plans for patients. The objective of this initiative was to improve the structure and process of daily palliative care rounds by incorporating standardized tools and improved documentation into the meeting. We chose a quality improvement (QI) approach to address this initiative. Our aims were to increase the use of assessment tools when discussing patient care in rounds and to improve the documentation and accessibility of important information, including goals of care, in the health record. This QI initiative used a preintervention and postintervention comparison of the outcome measures of interest. The initiative was tested in a palliative care unit (PCU) over a 22-month period from April 2014 to January 2016. Participants were clinical staff in the PCU. Data collected after the completion of several plan-do-study-act cycles showed increased use and incorporation of the Edmonton Symptom Assessment System and Palliative Performance Scale in patient care discussions, as well as better inclusion of goals of care in the patient plan of care. Our findings demonstrate that the effectiveness of daily palliative care rounds can be improved by incorporating standard assessment tools and changes to the meeting structure that better focus and direct patient care discussions.
Klein, Dawn M; Fix, Gemmae M; Hogan, Timothy P; Simon, Steven R; Nazi, Kim M; Turvey, Carolyn L
2015-08-18
Information sharing between providers is critical for care coordination, especially in health systems such as the United States Department of Veterans Affairs (VA), where many patients also receive care from other health care organizations. Patients can facilitate this sharing by using the Blue Button, an online tool that promotes patients' ability to view, print, and download their health records. The aim of this study was to characterize (1) patients' use of Blue Button, an online information-sharing tool in VA's patient portal, My HealtheVet, (2) information-sharing practices between VA and non-VA providers, and (3) how providers and patients use a printed Blue Button report during a clinical visit. Semistructured qualitative interviews were conducted with 34 VA patients, 10 VA providers, and 9 non-VA providers. Interviews focused on patients' use of Blue Button, information-sharing practices between VA and non-VA providers, and how patients and providers use a printed Blue Button report during a clinical visit. Qualitative themes were identified through iterative rounds of coding starting with an a priori schema based on technology adoption theory. Information sharing between VA and non-VA providers relied primarily on the patient. Patients most commonly used Blue Button to access and share VA laboratory results. Providers recognized the need for improved information sharing, valued the Blue Button printout, and expressed interest in a way to share information electronically across settings. Consumer-oriented technologies such as Blue Button can facilitate patients sharing health information with providers in other health care systems; however, more education is needed to inform patients of this use to facilitate care coordination. Additional research is needed to explore how personal health record documents, such as Blue Button reports, can be easily shared and incorporated into the clinical workflow of providers.
Enhancing and Evaluating Scientific Argumentation in the Inquiry-Oriented College Chemistry Classroom
NASA Astrophysics Data System (ADS)
D'Souza, Annabel Nica
The research presented in chapters 2, 3, and 4 of this dissertation uses a sociocultural and sociohistorical lens, particularly around power, authority of knowledge, and identity formation, to investigate the complexity of engaging in, supporting, and evaluating high-quality argumentation within a college biochemistry inquiry-oriented classroom. Argumentation skills are essential to college and career (National Research Council, 2010) and to a democratic citizenry. Argumentation is central to science teaching and learning (Osborne et al., 2004a) and can deepen content knowledge (Jimenez-Aleixandre et al., 2000; Jimenez-Aleixandre & Pereiro-Munhoz, 2002). When students have opportunities to make claims and support them with evidence and reasoning, they may also increase their problem-solving and critical thinking capacity (Case, 2005; Willingham, 2007). Overall, this has implications for supporting students in becoming increasingly literate in scientific ideas, language, and practices. However, supporting argumentation can be challenging for instructors, particularly in designing learning environments that facilitate and evaluate both the process and the product of student discussions (Duschl & Osborne, 2002). Fostering argumentation is complex and requires explicit modeling and multiple opportunities for dialogic interactions. This dissertation examines how several facets influence argumentation in order to support instructors in implementing and improving argumentation in their inquiry-oriented classrooms. These facets include access to language and use of discursive moves, classroom design, curriculum and instructional activities, and interactional dynamics and power negotiation. The data set for this dissertation is a transcript generated from the audio and video capture of a 7-minute student discussion around a mechanism in the TCA (TriCarboxylic Acid) cycle, as well as student writing and course documents from student portfolios. This dissertation, organized in manuscript style, presents three standalone chapters, each with a specific focus related to the central theme of supporting argumentation, which is the connecting thread. Chapter 2 discusses how power is negotiated during the argumentation process and how interaction dynamics can support or inhibit the quality of argumentation. Chapter 3 provides assessment and evaluation support to instructors who want to guide their students in meeting high quality levels in both the process and product of argumentation. Finally, chapter 4 explores the influence of pedagogical and instructional resources and tools on the quality of argumentation. This includes a discussion of the influence of classroom talk, particularly discursive moves and interactional dynamics, as well as the curriculum and instructional activities and the design features of the learning environment. Each chapter concludes with instructional implications that provide practical guidance in the form of pedagogical activities for instructors. Partial funding for this dissertation was received from a PSC-CUNY Cycle 44 Research Award (66799-00 44). Findings suggest that the classroom design can support collaboration and the dialogic nature of argumentation, and that the curriculum and activities can act as resources for students to share and negotiate multiple perspectives, but that instructors can also influence the process of argumentation by utilizing specific discursive moves, such as telling and revoicing, to promote or inhibit argumentation.
The results, specifically from chapter 4, also suggest that instructors model and share the expected criteria for high-quality components of argumentation. The need for instructors to be aware of the criteria for high levels of quality in each of the argumentation components is a critical implication of this research. These criteria are presented in this dissertation and are derived from a review of multiple findings by argumentation researchers, as well as the scientific community at large. Creating structures and implementing targeted pedagogical strategies that support argumentation can lead students to use the process of argumentation as an empowerment tool to enact agency and negotiate power. This has the potential to sustain the success of science students, create a community of practice, and increase equity and access for all.
The iPlant Collaborative: Cyberinfrastructure for Enabling Data to Discovery for the Life Sciences.
Merchant, Nirav; Lyons, Eric; Goff, Stephen; Vaughn, Matthew; Ware, Doreen; Micklos, David; Antin, Parker
2016-01-01
The iPlant Collaborative provides life science research communities access to comprehensive, scalable, and cohesive computational infrastructure for data management; identity management; collaboration tools; and cloud, high-performance, high-throughput computing. iPlant provides training, learning material, and best practice resources to help all researchers make the best use of their data, expand their computational skill set, and effectively manage their data and computation when working as distributed teams. iPlant's platform permits researchers to easily deposit and share their data and deploy new computational tools and analysis workflows, allowing the broader community to easily use and reuse those data and computational analyses.
Emery, Jon D; Jefford, Michael; King, Madeleine; Hayne, Dickon; Martin, Andrew; Doorey, Juanita; Hyatt, Amelia; Habgood, Emily; Lim, Tee; Hawks, Cynthia; Pirotta, Marie; Trevena, Lyndal; Schofield, Penelope
2017-03-01
To test the feasibility and efficacy of a multifaceted model of shared care for men after completion of treatment for prostate cancer. Men who had completed treatment for low- to moderate-risk prostate cancer within the previous 8 weeks were eligible. Participants were randomized to usual care or shared care. Shared care entailed substituting two hospital visits with three visits in primary care, a survivorship care plan, recall and reminders, and screening for distress and unmet needs. Outcome measures included psychological distress, prostate cancer-specific quality of life, satisfaction and preferences for care and healthcare resource use. A total of 88 men were randomized (shared care n = 45; usual care n = 43). There were no clinically important or statistically significant differences between groups with regard to distress, prostate cancer-specific quality of life or satisfaction with care. At the end of the trial, men in the intervention group were significantly more likely to prefer a shared care model to hospital follow-up than those in the control group (intervention 63% vs control 24%; P<0.001). There was high compliance with prostate-specific antigen monitoring in both groups. The shared care model was cheaper than usual care (shared care AUS$1411; usual care AUS$1728; difference AUS$323 [plausible range AUS$91-554]). Well-structured shared care for men with low- to moderate-risk prostate cancer is feasible and appears to produce clinically similar outcomes to those of standard care, at a lower cost. © 2016 The Authors BJU International © 2016 BJU International Published by John Wiley & Sons Ltd.
Queens Tri-School Confederation 1992-93 Evaluation Report. OREA Report.
ERIC Educational Resources Information Center
Dworkowitz, Barbara
This report presents the evaluation results of the Queens Tri-School Confederation magnet programs in New York City: programs designed to reduce minority-group isolation among high school students in three high schools and simultaneously improve the quality of their education through the sharing of resources and expertise. These programs, which…
Cooperation stimulation strategies for peer-to-peer wireless live video-sharing social networks.
Lin, W Sabrina; Zhao, H Vicky; Liu, K J Ray
2010-07-01
Human behavior analysis in video-sharing social networks is an emerging research area that analyzes the behavior of users who share multimedia content and investigates the impact of human dynamics on video-sharing systems. Users watching live streaming in the same wireless network share the same limited backbone bandwidth to the Internet; thus, they may want to cooperate with each other to obtain better video quality. These users form a wireless live-streaming social network. Every user wishes to watch video at high quality while paying as little cost as possible to help others. This paper focuses on providing incentives for user cooperation. We propose a game-theoretic framework to model user behavior and to analyze the optimal strategies for user cooperation stimulation in wireless live streaming. We first analyze the Pareto optimality and the time-sensitive bargaining equilibrium of the two-person game. We then extend the solution to the multiuser scenario. We also consider potential selfish users' cheating behavior and malicious users' attacking behavior and analyze the performance of the proposed strategies in the presence of cheating users and malicious attackers. Both our analytical and simulation results show that the proposed strategies can effectively stimulate user cooperation, achieve cheat-free and attack-resistant operation, and help provide reliable services for wireless live streaming applications.
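For intuition about the bargaining machinery, here is a generic numeric sketch of a two-user Nash bargaining solution over relayed bandwidth. The utility function and constants are invented for illustration and are not the paper's model.

    # Generic Nash bargaining sketch for two cooperating peers: each user i
    # chooses how much bandwidth x_i to relay for the other; utility is the
    # quality gained from help received minus the cost of help given.
    # Utility shape and constants are invented for illustration only.
    import itertools

    def utility(received, given):
        return 2.0 * (received ** 0.5) - 1.0 * given  # concave gain, linear cost

    disagreement = (utility(0, 0), utility(0, 0))  # payoffs with no cooperation

    grid = [i / 20 for i in range(21)]  # candidate bandwidth shares in [0, 1]
    best, best_product = None, -1.0
    for x1, x2 in itertools.product(grid, repeat=2):
        u1, u2 = utility(x2, x1), utility(x1, x2)
        gain1, gain2 = u1 - disagreement[0], u2 - disagreement[1]
        if gain1 >= 0 and gain2 >= 0 and gain1 * gain2 > best_product:
            best_product, best = gain1 * gain2, (x1, x2)

    # The Nash bargaining solution maximizes the product of utility gains.
    print("bargaining allocation (x1, x2):", best)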
NASA Technical Reports Server (NTRS)
Jones, Jeremy; Grosvenor, Sandy; Wolf, Karl; Li, Connie; Koratkar, Anuradha; Powers, Edward I. (Technical Monitor)
2001-01-01
In the Virtual Observatory (VO), software tools will perform the functions that have traditionally been performed by physical observatories and their instruments. These tools will not be adjuncts to VO functionality but will make up the very core of the VO. Consequently, the tradition of observatory- and system-independent tools serving a small user base is not valid for the VO. For the VO to succeed, we must improve software collaboration and code sharing between projects and groups. A significant goal of the Scientist's Expert Assistant (SEA) project has been promoting effective collaboration and code sharing between groups. During the past three years, the SEA project has been developing prototypes for new observation planning software tools and strategies. Initially funded by the Next Generation Space Telescope, parts of the SEA code have since been adopted by the Space Telescope Science Institute. SEA has also supplied code for SOFIA, the SIRTF (Space Infrared Telescope Facility) planning tools, and the JSky Open Source Java library. The potential benefits of sharing code are clear. The recipient gains functionality for considerably less cost. The provider gains additional developers working with their code. If enough user groups adopt a set of common code and tools, de facto standards can emerge (as demonstrated by the success of the FITS standard). Code sharing also raises a number of challenges related to the management of the code. In this talk, we will review our experiences with SEA - both successes and failures - and offer some lessons learned that may promote further successes in collaboration and re-use.
From Provenance Standards and Tools to Queries and Actionable Provenance
NASA Astrophysics Data System (ADS)
Ludaescher, B.
2017-12-01
The W3C PROV standard provides a minimal core for sharing retrospective provenance information for scientific workflows and scripts. PROV extensions such as DataONE's ProvONE model are necessary for linking runtime observables in retrospective provenance records with conceptual-level prospective provenance information, i.e., workflow (or dataflow) graphs. Runtime provenance recorders, such as DataONE's RunManager for R or noWorkflow for Python, capture retrospective provenance automatically. YesWorkflow (YW) is a toolkit that allows researchers to declare high-level prospective provenance models of scripts via simple inline comments (YW-annotations), revealing the computational modules and dataflow dependencies in the script. By combining and linking both forms of provenance, important queries and use cases can be supported that neither provenance model can afford on its own. We present existing and emerging provenance tools developed for the DataONE and SKOPE (Synthesizing Knowledge of Past Environments) projects. We show how the different tools can be used individually and in combination to model, capture, share, query, and visualize provenance information. We also present challenges and opportunities for making provenance information more immediately actionable for the researchers who create it in the first place. We argue that such a shift towards "provenance-for-self" is necessary to accelerate the creation, sharing, and use of provenance in support of transparent, reproducible computational and data science.
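As an illustration of the YW annotation style (a small hypothetical script, not from the projects above), the comments below declare a prospective dataflow model that YW can extract and render; the block and port names are invented, and the Python itself runs unchanged.

    # Sketch of YesWorkflow inline annotations: @begin/@in/@out/@end comments
    # declare a prospective dataflow graph without altering the script.
    import csv

    # Set-up only: write a tiny input file so the sketch is self-contained.
    with open("raw_readings.csv", "w", newline="") as f:
        csv.writer(f).writerows([["s1", "21.3"], ["s2", "NA"], ["s3", "20.8"]])

    # @begin clean_temperatures
    # @in raw_csv @uri file:raw_readings.csv
    # @out clean_csv @uri file:clean_readings.csv
    with open("raw_readings.csv") as src, \
         open("clean_readings.csv", "w", newline="") as dst:
        writer = csv.writer(dst)
        for row in csv.reader(src):
            if row and row[-1] != "NA":  # drop missing readings
                writer.writerow(row)
    # @end clean_temperatures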
Adding value to laboratory medicine: a professional responsibility.
Beastall, Graham H
2013-01-01
Laboratory medicine is a medical specialty at the centre of healthcare. When used optimally, laboratory medicine generates knowledge that can facilitate patient safety, improve patient outcomes, shorten patient journeys, and lead to more cost-effective healthcare. Optimal use of laboratory medicine relies on dynamic and authoritative leadership outside as well as inside the laboratory. The first responsibility of the head of a clinical laboratory is to ensure the provision of a high-quality service across a wide range of parameters, culminating in laboratory accreditation against an international standard such as ISO 15189. From that essential baseline, the leadership of laboratory medicine at local, national and international level needs to 'add value' to ensure the optimal delivery, use, development and evaluation of the services provided for individuals and for groups of patients. A convenient tool for illustrating added value is the mnemonic 'SCIENCE', which covers seven domains: standardisation and harmonisation; clinical effectiveness; innovation; evidence-based practice; novel applications; cost-effectiveness; and education of others. The assessment of added value in laboratory medicine may be considered against a framework of three dimensions: operational efficiency; patient management; and patient behaviours. The profession and the patient will benefit from sharing examples of adding value to laboratory medicine.
Yu, Yao; Hu, Hao; Bohlender, Ryan J; Hu, Fulan; Chen, Jiun-Sheng; Holt, Carson; Fowler, Jerry; Guthery, Stephen L; Scheet, Paul; Hildebrandt, Michelle A T; Yandell, Mark; Huff, Chad D
2018-04-06
High-throughput sequencing data are increasingly being made available to the research community for secondary analyses, providing new opportunities for large-scale association studies. However, heterogeneity in target capture and sequencing technologies often introduce strong technological stratification biases that overwhelm subtle signals of association in studies of complex traits. Here, we introduce the Cross-Platform Association Toolkit, XPAT, which provides a suite of tools designed to support and conduct large-scale association studies with heterogeneous sequencing datasets. XPAT includes tools to support cross-platform aware variant calling, quality control filtering, gene-based association testing and rare variant effect size estimation. To evaluate the performance of XPAT, we conducted case-control association studies for three diseases, including 783 breast cancer cases, 272 ovarian cancer cases, 205 Crohn disease cases and 3507 shared controls (including 1722 females) using sequencing data from multiple sources. XPAT greatly reduced Type I error inflation in the case-control analyses, while replicating many previously identified disease-gene associations. We also show that association tests conducted with XPAT using cross-platform data have comparable performance to tests using matched platform data. XPAT enables new association studies that combine existing sequencing datasets to identify genetic loci associated with common diseases and other complex traits.
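As an illustration of the class of gene-based test such toolkits support (not XPAT's own statistics), here is a minimal burden-style sketch: rare variants in a gene are collapsed to a per-sample carrier indicator, and carrier counts in cases versus controls are compared with Fisher's exact test. The carrier counts are invented; only the sample sizes echo the breast cancer analysis above.

    # Minimal gene-based burden test sketch: collapse rare variants in a
    # gene to a per-sample carrier indicator, then compare carrier counts
    # in cases vs. controls with Fisher's exact test. Counts are invented.
    from scipy.stats import fisher_exact

    cases, controls = 783, 3507               # sample sizes as reported above
    case_carriers, control_carriers = 19, 24  # hypothetical rare-allele carriers

    table = [
        [case_carriers, cases - case_carriers],
        [control_carriers, controls - control_carriers],
    ]
    odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
    print(f"OR = {odds_ratio:.2f}, p = {p_value:.2e}")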
Cyberlearning for Climate Literacy: Challenges and Opportunities
NASA Astrophysics Data System (ADS)
McCaffrey, M. S.; Buhr, S. M.; Gold, A. U.; Ledley, T. S.; Mooney, M. E.; Niepold, F.
2010-12-01
Cyberlearning tools provide cost- and carbon-efficient avenues for fostering a climate-literate society through online engagement with learners. With climate change education becoming a Presidential Priority in 2009, funding for grants from NSF, NASA and NOAA is leading to a new generation of cyberlearning resources that supplement existing online resources. This paper provides an overview of challenges and opportunities relating to the online delivery of high-quality, often complex climate science by examining several existing and emerging efforts, including the Climate Literacy and Energy Awareness Network (CLEAN), a National Science Digital Library Pathway; the development by CIRES Education and Outreach of the Inspiring Climate Education Excellence (ICEE) online course; TERC's Earth Exploration Toolbook (EET), DataTools, and EarthLab modules; the NOAA Climate Stewards Education Program (CSEP), which utilizes the NSTA E-Learning Center; online efforts by members of the Federation of Earth Science Information Partners (ESIP); UCAR's Climate Discovery program; and the Climate Adaptation, Mitigation e-Learning (CAMeL) project. In addition, we will summarize outcomes of the Cyberlearning for Climate Literacy workshop held in Washington DC in the Fall of 2009 and examine opportunities for teachers to develop and share their own lesson plans based on climate-related web resources that currently lack built-in learning activities, assessments or teaching tips.
The Hierarchical Data Format as a Foundation for Community Data Sharing
NASA Astrophysics Data System (ADS)
Habermann, T.
2017-12-01
Hierarchical Data Format (HDF) formats and libraries have been used by individual researchers and major science programs across many Earth and Space Science disciplines and sectors to provide high-performance information storage and access for several decades. Generic group, dataset, and attribute objects in HDF have been combined in many ways to form domain objects that scientists understand and use. Well-known applications of HDF in the Earth sciences include thousands of global satellite observations and products produced by NASA's Earth Observing System using the HDF-EOS conventions, navigation-quality bathymetry produced as Bathymetric Attributed Grids (BAGs) by the OpenNavigationSurface project and others, seismic wave collections written in the Adaptable Seismic Data Format (ASDF), and many oceanographic and atmospheric products produced using the Climate and Forecast (CF) conventions with the netCDF4 data model and API on top of HDF5. The modus operandi of these communities is to 1) develop a model of the scientific data objects and associated metadata used in a domain, 2) implement that model using HDF, 3) develop software libraries that connect that model to tools, and 4) encourage adoption of those tools in the community. Understanding these domain object implementations and facilitating communication across communities is an important goal of The HDF Group. We will discuss these examples and approaches to community outreach during this session.
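A minimal h5py sketch of how the generic HDF5 objects named above compose into a domain-style layout; the group names, attributes, and "convention" here are invented for illustration and do not correspond to any real community convention.

    # Sketch: composing HDF5 groups, datasets, and attributes into a small
    # domain-style layout with h5py. The layout is invented for illustration.
    import h5py
    import numpy as np

    with h5py.File("swath.h5", "w") as f:
        grid = f.create_group("/science/grids/temperature")
        data = grid.create_dataset("values", data=np.random.rand(180, 360),
                                   compression="gzip")
        data.attrs["units"] = "kelvin"
        data.attrs["_FillValue"] = -9999.0
        f.attrs["conventions"] = "example-convention-1.0"

    with h5py.File("swath.h5", "r") as f:
        d = f["/science/grids/temperature/values"]
        print(d.shape, d.attrs["units"])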
Tupper, Judith B; Gray, Carolyn E; Pearson, Karen B; Coburn, Andrew F
2015-01-01
The "siloed" approach to healthcare delivery contributes to communication challenges and to potential patient harm when patients transfer between settings. This article reports on the evaluation of a demonstration in 10 rural communities to improve the safety of nursing facility (NF) transfers to hospital emergency departments by forming interprofessional teams of hospital, emergency medical service, and NF staff to develop and implement tools and protocols for standardizing critical interfacility communication pathways and information sharing. We worked with each of the 10 teams to document current communication processes and information sharing tools and to design, implement, and evaluate strategies/tools to increase effective communication and sharing of patient information across settings. A mixed methods approach was used to evaluate changes from baseline in documentation of patient information shared across settings during the transfer process. Study findings showed significant improvement in key areas across the three settings, including infection status and baseline mental functioning. Improvement strategies and performance varied across settings; however, accurate and consistent information sharing of advance directives and medication lists remains a challenge. Study results demonstrate that with neutral facilitation and technical support, collaborative interfacility teams can assess and effectively address communication and information sharing problems that threaten patient safety.
Entomological Opportunities and Challenges for Sustainable Viticulture in a Global Market.
Daane, Kent M; Vincent, Charles; Isaacs, Rufus; Ioriatti, Claudio
2018-01-07
Viticulture has experienced dramatic global growth in acreage and value. As the international exchange of goods has increased, so too has the market demand for sustainably produced products. Both elements redefine the entomological challenges posed to viticulture and have stimulated significant advances in arthropod pest control programs. Vineyard managers on all continents are increasingly combating invasive species, resulting in the adoption of novel insecticides, semiochemicals, and molecular tools to support sustainable viticulture. At the local level, vineyard management practices consider factors such as the surrounding natural ecosystem, risk to fish populations, and air quality. Coordinated multinational responses to pest invasion have been highly effective and have, for example, resulted in eradication of the moth Lobesia botrana from California vineyards, a pest found in 2009 and eradicated by 2016. At the global level, the shared pests and solutions for their suppression will play an increasing role in delivering internationally sensitive pest management programs that respond to invasive pests, climate change, novel vector and pathogen relationships, and pesticide restrictions.
Disentangling diatom species complexes: does morphometry suffice?
Borrego-Ramos, María; Olenici, Adriana
2017-01-01
Accurate taxonomic resolution in light microscopy analyses of microalgae is essential to achieve high-quality, comparable results in both floristic analyses and biomonitoring studies. A number of closely related diatom taxa have been found to co-occur within benthic diatom assemblages, sharing many morphological, morphometrical and ecological characteristics. In this contribution, we tested the hypothesis that, where a large sample size (number of individuals) is available, common morphometrical parameters (valve length, width and stria density) are sufficient to achieve a correct identification to the species level. We focused on some common diatom taxa belonging to the genus Gomphonema. More than 400 valves and frustules were photographed in valve view and measured using Fiji software. Several statistical tools (mixture and discriminant analysis, k-means clustering, classification trees, etc.) were explored to test whether morphometry alone, independently of other valve features, leads to correct identifications when compared with identifications made by experts. In view of the results obtained, morphometry-based determination in diatom taxonomy is discouraged. PMID:29250472
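As an illustration of the k-means step on a morphometric table (simulated here, not the study's measurements), the sketch below also shows why strongly overlapping species frustrate purely morphometric clustering.

    # Sketch: k-means on simulated valve morphometry (length, width, stria
    # density) for two heavily overlapping "species", then agreement with
    # the true labels. Data are simulated, not the study's measurements.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    sp1 = rng.normal([28.0, 6.5, 12.0], [3.0, 0.7, 1.2], size=(200, 3))
    sp2 = rng.normal([31.0, 7.0, 12.8], [3.0, 0.7, 1.2], size=(200, 3))
    X = np.vstack([sp1, sp2])
    truth = np.array([0] * 200 + [1] * 200)

    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
    # Account for arbitrary cluster numbering (label switching).
    agreement = max((labels == truth).mean(), (labels != truth).mean())
    print(f"agreement with true species: {agreement:.0%}")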
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bradel, Lauren; Endert, Alexander; Koch, Kristen
2013-08-01
Large, high-resolution vertical displays carry the potential to increase the accuracy of collaborative sensemaking, given correctly designed visual analytics tools. In an exploratory user study using a fictional textual intelligence analysis task, we investigated how users interact with the display to construct spatial schemas and externalize information, as well as how they establish shared and private territories. We investigated the space management strategies of users partitioned by the type of tool philosophy followed (visualization- or text-centric). We classified the types of territorial behavior exhibited in terms of how the users interacted with information on the display (integrated or independent workspaces). Next, we examined how territorial behavior impacted the common ground between the pairs of users. Finally, we offer design suggestions for building future co-located collaborative visual analytics tools specifically for use on large, high-resolution vertical displays.
HPTLC Fingerprint Analysis: A Quality Control for Authentication of Herbal Phytochemicals
NASA Astrophysics Data System (ADS)
Ram, Mauji; Abdin, M. Z.; Khan, M. A.; Jha, Prabhakar
Authentication and consistent quality are basic requirements for Indian traditional medicine (TIM), Chinese traditional herbal medicine (TCHM), and their commercial products, regardless of the kind of research conducted to modernize them. The complexity of TIM and TCHM challenges the current official quality control mode, in which only a few biochemical markers are selected for identification and quantitative assay. Given the many unknown factors in TIM and TCHM, it is impossible and unnecessary to pinpoint qualitatively and quantitatively every single component contained in the herbal drug. The chromatographic fingerprint is a rational option to meet the need for more effective and powerful quality assessment of TIM and TCHM. The optimized chromatographic fingerprint is not only an alternative analytical tool for authentication, but also an approach to express the various patterns of chemical ingredient distribution in herbal drugs and to preserve such a "database" for further multifaceted, sustainable studies. Analytical separation techniques, for example high-performance liquid chromatography (HPLC), gas chromatography (GC), and mass spectrometry (MS), are among the most popular methods of choice for quality control of raw materials and finished herbal products. The fingerprint analysis approach using high-performance thin-layer chromatography (HPTLC) has become a most potent tool for quality control of herbal medicines because of its simplicity and reliability. It can serve as a tool for identification, authentication, and quality control of herbal drugs. In this chapter, attempts are made to expand the use of HPTLC and at the same time create interest among prospective researchers in herbal analysis. The developed method can be used as a quality control tool for rapid authentication of a wide variety of herbal samples. Several examples demonstrate the role of fingerprinting in quality control and assessment.
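One common way to operationalize fingerprint-based quality control is to score a sample's chromatographic profile against a reference profile. A minimal sketch using cosine similarity between intensity vectors follows; the profiles and the acceptance threshold are invented for illustration.

    # Sketch: cosine similarity between a sample HPTLC fingerprint and a
    # reference profile sampled on a shared retention-factor axis. Profiles
    # and the acceptance threshold are invented for illustration.
    import numpy as np

    rf = np.linspace(0.0, 1.0, 200)  # shared retention-factor axis
    reference = (np.exp(-(rf - 0.30) ** 2 / 0.002)
                 + 0.6 * np.exp(-(rf - 0.70) ** 2 / 0.001))
    sample = reference + np.random.default_rng(1).normal(0.0, 0.02, rf.size)

    similarity = np.dot(reference, sample) / (
        np.linalg.norm(reference) * np.linalg.norm(sample))
    verdict = "pass" if similarity > 0.95 else "re-examine against reference"
    print(f"similarity = {similarity:.3f} -> {verdict}")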
CloudMan as a platform for tool, data, and analysis distribution
2012-01-01
Background Cloud computing provides an infrastructure that facilitates large-scale computational analysis in a scalable, democratized fashion. However, in this context it is difficult to ensure sharing of an analysis environment and associated data in a scalable and precisely reproducible way. Results CloudMan (usecloudman.org) enables individual researchers to easily deploy, customize, and share their entire cloud analysis environment, including data, tools, and configurations. Conclusions With the enabled customization and sharing of instances, CloudMan can be used as a platform for collaboration. The presented solution improves accessibility of cloud resources, tools, and data to the level of an individual researcher and contributes toward reproducibility and transparency of research solutions. PMID:23181507
Automatic Generation of OpenMP Directives and Its Application to Computational Fluid Dynamics Codes
NASA Technical Reports Server (NTRS)
Yan, Jerry; Jin, Haoqiang; Frumkin, Michael; Yan, Jerry (Technical Monitor)
2000-01-01
The shared-memory programming model is a very effective way to achieve parallelism on shared-memory parallel computers. As hardware and software technologies have advanced, the performance of parallel programs using compiler directives has improved substantially. The introduction of OpenMP directives, the industry standard for shared-memory programming, has minimized the issue of portability. In this study, we have extended CAPTools, a computer-aided parallelization toolkit, to automatically generate OpenMP-based parallel programs with nominal user assistance. We outline techniques used in the implementation of the tool and discuss the application of this tool to the NAS Parallel Benchmarks and several computational fluid dynamics codes. This work demonstrates the great potential of using the tool to quickly port parallel programs and to achieve good performance that exceeds that of some commercial tools.
Determining customer satisfaction in anatomic pathology.
Zarbo, Richard J
2006-05-01
Measurement of physicians' and patients' satisfaction with laboratory services has become a standard practice in the United States, prompted by national accreditation requirements. Unlike other surveys of hospital-, outpatient care-, or physician-related activities, no ongoing, comprehensive customer satisfaction survey of anatomic pathology services is available for subscription that would allow continual benchmarking against peer laboratories. Pathologists, therefore, must often design their own local assessment tools to determine physician satisfaction in anatomic pathology. To describe satisfaction survey design that would elicit specific information from physician customers about key elements of anatomic pathology services. The author shares his experience in biannually assessing customer satisfaction in anatomic pathology with survey tools designed at the Henry Ford Hospital, Detroit, Mich. Benchmarks for physician satisfaction, opportunities for improvement, and characteristics that correlated with a high level of physician satisfaction were identified nationally from a standardized survey tool used by 94 laboratories in the 2001 College of American Pathologists Q-Probes quality improvement program. In general, physicians are most satisfied with professional diagnostic services and least satisfied with pathology services related to poor communication. A well-designed and conducted customer satisfaction survey is an opportunity for pathologists to periodically educate physician customers about services offered, manage unrealistic expectations, and understand the evolving needs of the physician customer. Armed with current information from physician customers, the pathologist is better able to strategically plan for resources that facilitate performance improvements in anatomic pathology laboratory services that align with evolving clinical needs in health care delivery.
Tolerant (parallel) Programming
NASA Technical Reports Server (NTRS)
DiNucci, David C.; Bailey, David H. (Technical Monitor)
1997-01-01
In order to be truly portable, a program must be tolerant of a wide range of development and execution environments, and a parallel program is just one which must be tolerant of a very wide range. This paper first defines the term "tolerant programming", then describes many layers of tools to accomplish it. The primary focus is on F-Nets, a formal model for expressing computation as a folded partial-ordering of operations, thereby providing an architecture-independent expression of tolerant parallel algorithms. For implementing F-Nets, Cooperative Data Sharing (CDS) is a subroutine package for implementing communication efficiently in a large number of environments (e.g. shared memory and message passing). Software Cabling (SC), a very-high-level graphical programming language for building large F-Nets, possesses many of the features normally expected from today's computer languages (e.g. data abstraction, array operations). Finally, L2^3 is a CASE tool which facilitates the construction, compilation, execution, and debugging of SC programs.
A Rating Tool for Sharing Experiences with Serious Games
ERIC Educational Resources Information Center
Hendrix, Maurice; Backlund, Per; Vampula, Boris
2014-01-01
The potential of Computer Games for non-entertainment purposes, such as education, is well established. A wide variety of games have been developed for the educational market, covering subjects such as mathematics and languages. However, while a growing industry developing educational games exist, the practical uptake in schools is not as high as…
Mission Possible: Measuring Critical Thinking and Problem Solving
ERIC Educational Resources Information Center
Wren, Doug; Cashwell, Amy
2018-01-01
The author describes how Virginia Beach City Public Schools developed a performance assessment that they administer to all 4th graders, 7th graders, and high school students in the district. He describes lessons learned about creating good performance tasks and developing a successful scoring process, as well as sharing tools connected to this…
Consumer Health Informatics: Past, Present, and Future of a Rapidly Evolving Domain.
Demiris, G
2016-05-20
Consumer Health Informatics (CHI) is a rapidly growing domain within the field of biomedical and health informatics. The objective of this paper is to reflect on the past twenty-five years and showcase informatics concepts and applications that led to new models of care and patient empowerment, and to predict future trends and challenges for the next 25 years. We discuss concepts and systems based on a review and analysis of published literature in the consumer health informatics domain in the last 25 years. The field was introduced with the vision that one day patients will be in charge of their own health care using informatics tools and systems. Scientific literature in the field originally focused on ways to assess the quality and validity of available printed health information, only to grow significantly to cover diverse areas such as online communities, social media, and shared decision-making. Concepts such as home telehealth, mHealth, and the quantified-self movement, tools to address transparency of health care organizations, and personal health records and portals provided significant milestones in the field. Consumers are able to actively participate in the decision-making process and to engage in health care processes and decisions. However, challenges such as health literacy and the digital divide have hindered us from maximizing the potential of CHI tools, with a significant portion of underserved populations unable to access and utilize them. At the same time, at a global scale consumer tools can increase access to care for underserved populations in developing countries. The field continues to grow, and emerging movements such as precision medicine and the sharing economy will introduce new opportunities and challenges. PMID:27199196
NASA Technical Reports Server (NTRS)
2014-01-01
Topics covered include: Innovative Software Tools Measure Behavioral Alertness; Miniaturized, Portable Sensors Monitor Metabolic Health; Patient Simulators Train Emergency Caregivers; Solar Refrigerators Store Life-Saving Vaccines; Monitors Enable Medication Management in Patients' Homes; Handheld Diagnostic Device Delivers Quick Medical Readings; Experiments Result in Safer, Spin-Resistant Aircraft; Interfaces Visualize Data for Airline Safety, Efficiency; Data Mining Tools Make Flights Safer, More Efficient; NASA Standards Inform Comfortable Car Seats; Heat Shield Paves the Way for Commercial Space; Air Systems Provide Life Support to Miners; Coatings Preserve Metal, Stone, Tile, and Concrete; Robots Spur Software That Lends a Hand; Cloud-Based Data Sharing Connects Emergency Managers; Catalytic Converters Maintain Air Quality in Mines; NASA-Enhanced Water Bottles Filter Water on the Go; Brainwave Monitoring Software Improves Distracted Minds; Thermal Materials Protect Priceless, Personal Keepsakes; Home Air Purifiers Eradicate Harmful Pathogens; Thermal Materials Drive Professional Apparel Line; Radiant Barriers Save Energy in Buildings; Open Source Initiative Powers Real-Time Data Streams; Shuttle Engine Designs Revolutionize Solar Power; Procedure-Authoring Tool Improves Safety on Oil Rigs; Satellite Data Aid Monitoring of Nation's Forests; Mars Technologies Spawn Durable Wind Turbines; Programs Visualize Earth and Space for Interactive Education; Processor Units Reduce Satellite Construction Costs; Software Accelerates Computing Time for Complex Math; Simulation Tools Prevent Signal Interference on Spacecraft; Software Simplifies the Sharing of Numerical Models; Virtual Machine Language Controls Remote Devices; Micro-Accelerometers Monitor Equipment Health; Reactors Save Energy, Costs for Hydrogen Production; Cameras Monitor Spacecraft Integrity to Prevent Failures; Testing Devices Garner Data on Insulation Performance; Smart Sensors Gather Information for Machine Diagnostics; Oxygen Sensors Monitor Bioreactors and Ensure Health and Safety; Vision Algorithms Catch Defects in Screen Displays; and Deformable Mirrors Capture Exoplanet Data, Reflect Lasers.
Hackett, Christina L; Mulvale, Gillian; Miatello, Ashleigh
2018-04-29
Although high-quality mental health care for children and youth is a goal of many health systems, little is known about the dimensions of quality mental health care from users' perspectives. We engaged young people, caregivers and service providers to share experiences, which shed light on quality dimensions for youth mental health care. Using experience-based co-design (EBCD), we collected qualitative data from young people aged 16-24 with a mental disorder (n = 19), identified caregivers (n = 12) and service providers (n = 14) about their experiences with respect to youth mental health services. Experience data were collected using multiple approaches including interviews, a suite of online and smartphone applications (n = 22), and a co-design event (n = 16) and analysed to extract touch points. These touch points were used to prioritize and co-design a user-driven prototype of a questionnaire to provide feedback to service providers. Reports from young people, caregivers and service providers were used to identify aspects of care quality at eight mental health service contact points: Access to mental health care; Transfer to/from hospital; Intake into hospital; Services provided; Assessment and treatment; Treatment environment; and Caregiver involvement in care. In some cases, low-quality care was harmful to users and their caregivers. Young people co-designed a prototype of a user-driven feedback questionnaire to improve the quality of service experiences, which was supported by service providers and caregivers at the co-design event. By using EBCD to capture in-depth data regarding the experiences of young people, their caregivers and service providers, study participants have begun to establish a baseline for acceptable quality of mental health care for young people. © 2018 The Authors. Health Expectations published by John Wiley & Sons Ltd.
Development of medical writing in India: Past, present and future
Sharma, Suhasini
2017-01-01
Pharmaceutical medical writing has grown significantly in India in the last couple of decades. It includes preparing regulatory, safety, and publication documents as well as educational and communication material related to health and health-care products. Medical writing requires medical understanding, knowledge of drug development and the regulatory and safety domains, understanding of research methodologies, and awareness of relevant regulations and guidelines. It also requires the ability to analyze, interpret, and present biomedical scientific data in the required format, along with good writing skills. Medical writing is the fourth most commonly outsourced clinical development activity, and its global demand has steadily increased due to rising cost pressures on the pharmaceutical industry. India has the unique advantages of a large workforce of science graduates and medical professionals trained in English and lower costs, which make it a suitable destination for outsourcing medical writing services. However, India's current share of the global medical writing business is very small. The industry in India faces some real challenges, such as a lack of depth and breadth in domain expertise, inadequate technical writing skills, high attrition rates, and a paucity of standardized training programs and quality assessment tools. Focusing our time, attention, and resources on addressing these challenges will help the Indian medical writing industry gain its rightful share of the global medical writing business. PMID:28194338
Videos as an Instructional Tool in Pre-Service Science Teacher Education
ERIC Educational Resources Information Center
Sonmez, Duygu; Hakverdi-Can, Meral
2012-01-01
Problem Statement: Student teaching is an integral part of teacher education. While it provides preservice teachers with real classroom experience, it is limited in that it does not provide shared experience. Used as instructional tools, videos provide a shared common experience in a controlled environment to pre-service teachers in…
Change Management Meets Web 2.0
ERIC Educational Resources Information Center
Gale, Doug
2008-01-01
Web 2.0 is the term used to describe a group of web-based creativity, information-sharing, and collaboration tools including wikis, blogs, social networks, and folksonomies. The common thread in all of these tools is twofold: They enable collaboration and information sharing, and their impact on higher education has been dramatic. A recent study…
Appropriating Geometric Series as a Cultural Tool: A Study of Student Collaborative Learning
ERIC Educational Resources Information Center
Carlsen, Martin
2010-01-01
The aim of this article is to illustrate how students, through collaborative small-group problem solving, appropriate the concept of geometric series. Student appropriation of cultural tools is dependent on five sociocultural aspects: involvement in joint activity, shared focus of attention, shared meanings for utterances, transforming actions and…
Social Medicine: Twitter in Healthcare.
Pershad, Yash; Hangge, Patrick T; Albadawi, Hassan; Oklu, Rahmi
2018-05-28
Social media enables the public sharing of information. With the recent emphasis on transparency and the open sharing of information between doctors and patients, the intersection of social media and healthcare is of particular interest. Twitter is currently the most popular form of social media used for healthcare communication; here, we examine the use of Twitter in medicine and specifically explore in what capacity using Twitter to share information on treatments and research has the potential to improve care. The sharing of information on Twitter can create a communicative and collaborative atmosphere for patients, physicians, and researchers and even improve quality of care. However, risks involved with using Twitter for healthcare discourse include high rates of misinformation, difficulties in verifying the credibility of sources, overwhelmingly high volumes of information available on Twitter, concerns about professionalism, and the opportunity cost of using physician time. Ultimately, the use of Twitter in healthcare can allow patients, healthcare professionals, and researchers to be more informed, but specific guidelines for appropriate use are necessary.
Using Option Grids: steps toward shared decision-making for neonatal circumcision.
Fay, Mary; Grande, Stuart W; Donnelly, Kyla; Elwyn, Glyn
2016-02-01
To assess the impact, acceptability and feasibility of a short encounter tool designed to enhance the process of shared decision-making and parental engagement. Using an observer-based measure of shared decision-making, we analyzed video-recordings of clinical encounters, half undertaken before and half after a brief intervention that trained four clinicians to use Option Grids. We also analyzed semi-structured interviews conducted with the clinicians four weeks after their exposure to the intervention. Observer OPTION(5) scores were higher post-intervention, with a mean of 33.9 (SD=23.5) compared to a mean of 16.1 (SD=7.1) pre-intervention, a significant difference of 17.8 (95% CI: 2.4, 33.2). Prior to using the intervention, clinicians used a consent document to frame circumcision as a default practice. Encounters with the Option Grid conferred agency to both parents and clinicians, and facilitated shared decision-making. Clinicians reported recognizing the tool's positive effect on their communication process. Tools such as Option Grids have the potential to make it easier for clinicians to achieve shared decision-making, and encounter tools of this kind have the potential to change practice. More research is needed to test their feasibility in routine practice. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
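As a rough check of the reported statistics, the confidence interval for the difference in OPTION(5) means can be approximately reconstructed from the published means and SDs. The abstract does not report group sizes or the CI method, so the sketch below assumes roughly 11 encounters per arm and Welch's unequal-variance method; both are assumptions for illustration only.

```python
# Back-of-envelope check of the reported effect. Group sizes (n = 11 per
# arm) and Welch's method are assumptions, not values from the abstract.
import math
from scipy import stats

m_pre,  sd_pre,  n_pre  = 16.1, 7.1, 11    # pre-intervention OPTION(5)
m_post, sd_post, n_post = 33.9, 23.5, 11   # post-intervention OPTION(5)

diff = m_post - m_pre                       # 17.8, as reported
v_pre, v_post = sd_pre**2 / n_pre, sd_post**2 / n_post
se = math.sqrt(v_pre + v_post)

# Welch-Satterthwaite degrees of freedom
df = (v_pre + v_post) ** 2 / (v_pre**2 / (n_pre - 1) + v_post**2 / (n_post - 1))
t_crit = stats.t.ppf(0.975, df)

print(f"diff = {diff:.1f}, 95% CI ({diff - t_crit * se:.1f}, {diff + t_crit * se:.1f})")
# ~ (1.6, 34.0) under these assumptions, close to the published (2.4, 33.2)
```

The exact published interval depends on the actual sample sizes and method, which the abstract does not give.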
A comprehensive evaluation of assembly scaffolding tools
2014-01-01
Background Genome assembly is typically a two-stage process: contig assembly followed by the use of paired sequencing reads to join contigs into scaffolds. Scaffolds are usually the focus of reported assembly statistics; longer scaffolds greatly facilitate the use of genome sequences in downstream analyses, and it is appealing to present larger numbers as metrics of assembly performance. However, scaffolds are highly prone to errors, especially when generated using short reads, which can directly result in inflated assembly statistics. Results Here we provide the first independent evaluation of scaffolding tools for second-generation sequencing data. We find large variations in the quality of results depending on the tool and dataset used. Even extremely simple test cases of perfect input, constructed to elucidate the behaviour of each algorithm, produced some surprising results. We further dissect the performance of the scaffolders using real and simulated sequencing data derived from the genomes of Staphylococcus aureus, Rhodobacter sphaeroides, Plasmodium falciparum and Homo sapiens. The results from simulated data are of high quality, with several of the tools producing perfect output. However, at least 10% of joins remain unidentified when using real data. Conclusions The scaffolders vary in their usability, speed and number of correct and missed joins made between contigs. Results from real data highlight opportunities for further improvement of the tools. Overall, SGA, SOPRA and SSPACE generally outperform the other tools on our datasets. However, the quality of the results is highly dependent on the read mapper and genome complexity. PMID:24581555
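To make the joining step concrete, a toy scaffolder (not the algorithm of any tool evaluated in the paper) can treat each read pair whose mates map to different contigs as one vote for joining those contigs, and keep only joins with sufficient support; the names and the threshold below are hypothetical.

```python
# Toy sketch of the contig-joining step: each read pair whose mates map to
# different contigs is one vote for joining them, and only joins with
# enough supporting pairs are kept. Threshold and names are hypothetical.
from collections import Counter

def scaffold_joins(links, min_support=3):
    """links: (contig_a, contig_b) tuples, one per cross-contig read pair."""
    votes = Counter(tuple(sorted(pair)) for pair in links)
    return [pair for pair, n in votes.items() if n >= min_support]

# ctg1-ctg2 is well supported; the single ctg2-ctg3 pair could easily be a
# chimeric or mismapped read, so it is rejected.
evidence = [("ctg1", "ctg2")] * 5 + [("ctg2", "ctg3")]
print(scaffold_joins(evidence))   # [('ctg1', 'ctg2')]
```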
Rubin, Katrine Hass; Friis-Holmberg, Teresa; Hermann, Anne Pernille; Abrahamsen, Bo; Brixen, Kim
2013-08-01
A huge number of risk assessment tools have been developed, but far from all have been validated in external studies, many lack methodological and transparent evidence, and few are integrated into national guidelines. Therefore, we performed a systematic review to provide an overview of existing valid and reliable risk assessment tools for the prediction of osteoporotic fractures. Additionally, we aimed to determine whether the performance of each tool was sufficient for practical use and, last, to examine whether the complexity of the tools influenced their discriminative power. We searched the PubMed, Embase, and Cochrane databases for papers and evaluated these with respect to methodological quality using the Quality Assessment Tool for Diagnostic Accuracy Studies (QUADAS) checklist. A total of 48 tools were identified; 20 had been externally validated; however, only six tools had been tested more than once in a population-based setting with acceptable methodological quality. None of the tools performed consistently better than the others, and simple tools (i.e., the Osteoporosis Self-assessment Tool [OST], Osteoporosis Risk Assessment Instrument [ORAI], and Garvan Fracture Risk Calculator [Garvan]) often did as well as or better than more complex tools (i.e., Simple Calculated Risk Estimation Score [SCORE], WHO Fracture Risk Assessment Tool [FRAX], and Qfracture). No studies determined the effectiveness of tools in selecting patients for therapy and thus improving fracture outcomes. High-quality studies in randomized designs with population-based cohorts with different case mixes are needed. Copyright © 2013 American Society for Bone and Mineral Research.
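The finding that simple tools perform as well as complex ones is easier to appreciate with the OST in view: its commonly quoted form is 0.2 × (weight in kg − age in years), truncated to an integer, with cut-offs near −1 and −4 separating low, intermediate, and high risk. These bands are the ones usually cited for the Asian OSTA variant and should be verified against a primary source before any use. A minimal sketch:

```python
# Minimal sketch of the OST index, showing how simple the better-performing
# tools can be. Formula and cut-offs are the commonly quoted ones; verify
# against a primary source before any use.
import math

def ost_index(weight_kg, age_years):
    return math.trunc(0.2 * (weight_kg - age_years))

def ost_category(index):
    if index > -1:
        return "low risk"
    return "intermediate risk" if index >= -4 else "high risk"

score = ost_index(weight_kg=52, age_years=71)
print(score, ost_category(score))   # -3 intermediate risk
```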
Rupert, Douglas J; Squiers, Linda B; Renaud, Jeanette M; Whitehead, Nedra S; Osborn, Roger J; Furberg, Robert D; Squire, Claudia M; Tzeng, Janice P
2013-08-01
Women with hereditary breast and ovarian cancer syndrome (HBOC) face a higher risk of earlier, more aggressive cancer. Because of HBOC's rarity, screening is recommended only for women with strong cancer family histories. However, most patients do not have an accurate history available and struggle to understand genetic concepts. Cancer in the Family, an online clinical decision support tool, calculated women's HBOC risk and promoted shared patient-provider decisions about screening. A pilot evaluation (n=9 providers, n=48 patients) assessed the tool's impact on knowledge, attitudes, and screening decisions. Patients used the tool before wellness exams and completed three surveys. Providers accessed the tool during exams, completed exam checklists, and completed four surveys. Patients entered complete family histories (67%), calculated personal risk (96%), and shared risk printouts with providers (65%). HBOC knowledge increased dramatically for patients and providers, and many patients (75%) perceived the tool's results as valid. The tool prompted patient-provider discussions about HBOC risk and cancer family history (88%). The tool was effective in increasing knowledge, collecting family history, and sparking patient-provider discussions about HBOC screening. Interactive tools can effectively communicate personalized risk and promote shared decisions, but they are not a substitute for patient-provider discussions. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
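The abstract does not describe how Cancer in the Family computes HBOC risk, so the sketch below is purely hypothetical: a rule-based family-history screen of the general kind such a decision-support tool might run, with illustrative criteria that are not clinical guidance.

```python
# Purely hypothetical sketch of a rule-based family-history screen; the
# criteria are illustrative only and are not clinical guidance, nor the
# actual Cancer in the Family algorithm.
from dataclasses import dataclass

@dataclass
class Relative:
    degree: int            # 1 = parent/sibling/child, 2 = aunt/grandparent, ...
    cancer: str            # "breast", "ovarian", or "none"
    age_at_diagnosis: int  # 0 if no cancer

def flag_for_referral(family):
    """Flag strong family histories for a genetic-counseling discussion."""
    breast = [r for r in family if r.cancer == "breast"]
    ovarian = [r for r in family if r.cancer == "ovarian"]
    early_breast = any(r.age_at_diagnosis < 50 for r in breast)
    first_degree_breast = sum(r.degree == 1 for r in breast)
    return bool(ovarian) or early_breast or first_degree_breast >= 2

family = [Relative(degree=1, cancer="breast", age_at_diagnosis=44),
          Relative(degree=2, cancer="none", age_at_diagnosis=0)]
print(flag_for_referral(family))   # True: early-onset breast cancer in a parent
```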
Rogers, Jess; Manca, Donna; Lang-Robertson, Kelly; Bell, Stephanie; Salvalaggio, Ginetta; Greiver, Michelle; Korownyk, Christina; Klein, Doug; Carroll, June C.; Kahan, Mel; Meuser, Jamie; Buchman, Sandy; Barrett, Rebekah M.; Grunfeld, Eva
2014-01-01
Background The aim of the Building on Existing Tools to Improve Chronic Disease Prevention and Screening in Family Practice (BETTER) randomized controlled trial is to improve the primary prevention of and screening for multiple conditions (diabetes, cardiovascular disease, cancer) and some of the associated lifestyle factors (tobacco use, alcohol overuse, poor nutrition, physical inactivity). In this article, we describe how we harmonized the evidence-based clinical practice guideline recommendations and patient tools to determine the content for the BETTER trial. Methods We identified clinical practice guidelines and tools through a structured literature search; we included both indexed and grey literature. From these guidelines, recommendations were extracted and integrated into knowledge products and outcome measures for use in the BETTER trial. End-users (family physicians, nurse practitioners, nurses and dieticians) were engaged in reviewing the recommendations and tools, as well as tailoring the content to the needs of the BETTER trial and family practice. Results In total, 3–5 high-quality guidelines were identified for each condition; from these, we identified high-grade recommendations for the prevention of and screening for chronic disease. The guideline recommendations were limited by conflicting recommendations, vague wording and different taxonomies for strength of recommendation. There was a lack of quality evidence for manoeuvres to improve the uptake of guidelines among patients with depression. We developed the BETTER clinical algorithms for the implementation plan. Although it was difficult to identify high-quality tools, 180 tools of interest were identified. Interpretation The intervention for the BETTER trial was built by integrating existing guidelines and tools, and working with end-users throughout the process to increase the intervention’s utility for practice. Trial registration: ISRCTN07170460 PMID:25077119
Jointly structuring triadic spaces of meaning and action: book sharing from 3 months on
Rossmanith, Nicole; Costall, Alan; Reichelt, Andreas F.; López, Beatriz; Reddy, Vasudevi
2014-01-01
This study explores the emergence of triadic interactions through the example of book sharing. As part of a naturalistic study, 10 infants were visited in their homes from 3–12 months. We report that (1) book sharing as a form of infant-caregiver-object interaction occurred from as early as 3 months. Using qualitative video analysis at a micro-level adapting methodologies from conversation and interaction analysis, we demonstrate that caregivers and infants practiced book sharing in a highly co-ordinated way, with caregivers carving out interaction units and shaping actions into action arcs and infants actively participating and co-ordinating their attention between mother and object from the beginning. We also (2) sketch a developmental trajectory of book sharing over the first year and show that the quality and dynamics of book sharing interactions underwent considerable change as the ecological situation was transformed in parallel with the infants' development of attention and motor skills. Social book sharing interactions reached an early peak at 6 months with the infants becoming more active in the coordination of attention between caregiver and book. From 7 to 9 months, the infants shifted their interest largely to solitary object exploration, in parallel with newly emerging postural and object manipulation skills, disrupting the social coordination and the cultural frame of book sharing. In the period from 9 to 12 months, social book interactions resurfaced, as infants began to effectively integrate manual object actions within the socially shared activity. In conclusion, to fully understand the development and qualities of triadic cultural activities such as book sharing, we need to look especially at the hitherto overlooked early period from 4 to 6 months, and investigate how shared spaces of meaning and action are structured together in and through interaction, creating the substrate for continuing cooperation and cultural learning. PMID:25540629