Sample records for qualitative error analysis

  1. Qualitative Examination of Children's Naming Skills through Test Adaptations.

    ERIC Educational Resources Information Center

    Fried-Oken, Melanie

    1987-01-01

    The Double Administration Naming Technique assists clinicians in obtaining qualitative information about a client's visual confrontation naming skills through administration of a standard naming test; readministration of the same test; identification of single and double errors; cuing for double naming errors; and qualitative analysis of naming…

  2. FRamework Assessing Notorious Contributing Influences for Error (FRANCIE): Perspective on Taxonomy Development to Support Error Reporting and Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lon N. Haney; David I. Gertman

    2003-04-01

    Beginning in the 1980s, a primary focus of human reliability analysis was the estimation of human error probabilities. However, detailed qualitative modeling with comprehensive representation of contextual variables was often lacking, likely due to the absence of comprehensive error and performance shaping factor taxonomies and the limited data available on observed error rates and their relationship to specific contextual variables. In the mid-1990s, Boeing, America West Airlines, NASA Ames Research Center and INEEL partnered in a NASA-sponsored Advanced Concepts grant to assess the state of the art in human error analysis, identify future needs for human error analysis, and develop an approach addressing these needs. Identified needs included a method to identify and prioritize task and contextual characteristics affecting human reliability, as well as comprehensive taxonomies to support detailed qualitative modeling and to structure meaningful data collection efforts across domains. A result was the development of the FRamework Assessing Notorious Contributing Influences for Error (FRANCIE) with a taxonomy for airline maintenance tasks. The assignment of performance shaping factors to generic errors by experts proved valuable to qualitative modeling. Performance shaping factors and error types from such detailed approaches can be used to structure error reporting schemes. In a recent NASA Advanced Human Support Technology grant, FRANCIE was refined, and two new taxonomies for use on space missions were developed. The development, sharing, and use of error taxonomies, and the refinement of approaches for increased fidelity of qualitative modeling, are offered as a means to help direct useful data collection strategies.

  3. Where Are the Logical Errors in the Theory of Big Bang?

    NASA Astrophysics Data System (ADS)

    Kalanov, Temur Z.

    2015-04-01

    A critical analysis of the foundations of the theory of the Big Bang is proposed. The unity of formal logic and rational dialectics is the methodological basis of the analysis. It is argued that the starting point of the theory of the Big Bang contains three fundamental logical errors. The first error is the assumption that a macroscopic object (having qualitative determinacy) can have an arbitrarily small size and can be in the singular state (i.e., in a state that has no qualitative determinacy). This assumption implies that the transition, (macroscopic object having qualitative determinacy) --> (singular state of matter having no qualitative determinacy), leads to loss of the information contained in the macroscopic object. The second error is the assumption that there exist a void and a boundary between matter and void; but if such a boundary existed, it would mean that the void has dimensions and can be measured. The third error is the assumption that the singular state of matter can make a transition into the normal state without a program of qualitative and quantitative development of the matter and without the controlling influence of another (independent) object. However, these assumptions conflict with practice and, consequently, with formal logic, rational dialectics, and cybernetics. Indeed, from the point of view of cybernetics, the transition, (singular state of the Universe) --> (normal state of the Universe), would be possible only if there were a Managed Object that is outside the Universe and has full, complete, and detailed information about the Universe. Thus, the theory of the Big Bang is a scientific fiction.

  4. Generation 1.5 Written Error Patterns: A Comparative Study

    ERIC Educational Resources Information Center

    Doolan, Stephen M.; Miller, Donald

    2012-01-01

    In an attempt to contribute to existing research on Generation 1.5 students, the current study uses quantitative and qualitative methods to compare error patterns in a corpus of Generation 1.5, L1, and L2 community college student writing. This error analysis provides one important way to determine if error patterns in Generation 1.5 student…

  5. Error analysis of mathematical problems on TIMSS: A case of Indonesian secondary students

    NASA Astrophysics Data System (ADS)

    Priyani, H. A.; Ekawati, R.

    2018-01-01

    Indonesian students' competence in solving mathematical problems is still considered weak, as indicated by the results of international assessments such as TIMSS. This might be caused by the various types of errors made. Hence, this study aimed at identifying students' errors in solving TIMSS mathematical problems on the topic of numbers, which is considered a fundamental concept in mathematics. This study applied descriptive qualitative analysis. The subjects were the three students, out of 34 8th graders, who made the most errors on the test indicators. Data were obtained through a paper-and-pencil test and student interviews. The error analysis indicated that in solving Applying-level problems, students made operational errors. For Reasoning-level problems, three types of errors were made: conceptual errors, operational errors and principle errors. Meanwhile, analysis of the causes of students' errors showed that students did not comprehend the mathematical problems given.

  6. The qualitative problem of major quotation errors, as illustrated by 10 different examples in the headache literature.

    PubMed

    Tfelt-Hansen, Peer

    2015-03-01

    There are two types of errors when references are used in the scientific literature: citation errors and quotation errors, and in reviews these errors have mainly been evaluated quantitatively. Quotation errors are the major problem, and one review reported 6% major quotation errors. The objective of this listing of quotation errors is to illustrate, by qualitative analysis of 10 different major quotation errors, how and possibly why authors misquote references. The author selected for review the first 10 different consecutive major quotation errors encountered in his reading of the headache literature. The characteristics of the 10 quotation errors varied considerably. Thus, in a review of migraine therapy in a very prestigious medical journal, the superiority of a new treatment (sumatriptan) vs an old treatment (aspirin plus metoclopramide) was claimed despite no significant difference for the primary efficacy measure in the trial. One author, in a scientific debate, referred to the lack of dilation of the middle meningeal artery in spontaneous migraine despite the fact that only one migraine attack was studied. The possibilities for creative major quotation errors in the medical literature are most likely infinite. Qualitative evaluations of major quotation errors, such as the present one, will hopefully result in more general awareness of quotation problems in the medical literature. Even if the final responsibility for correct use of quotations lies with the authors, the referees, as the experts with the knowledge needed to spot quotation errors, should be more involved in ensuring correct and fair use of references. Finally, this paper suggests that major misleading quotations, if pointed out by readers of the journal, should, as a rule, be corrected by way of an erratum statement. © 2015 American Headache Society.

  7. A simple, objective analysis scheme for scatterometer data. [Seasat A satellite observation of wind over ocean]

    NASA Technical Reports Server (NTRS)

    Levy, G.; Brown, R. A.

    1986-01-01

    A simple economical objective analysis scheme is devised and tested on real scatterometer data. It is designed to treat dense data such as those of the Seasat A Satellite Scatterometer (SASS) for individual or multiple passes, and preserves subsynoptic scale features. Errors are evaluated with the aid of sampling ('bootstrap') statistical methods. In addition, sensitivity tests have been performed which establish qualitative confidence in calculated fields of divergence and vorticity. The SASS wind algorithm could be improved; however, the data at this point are limited by instrument errors rather than analysis errors. The analysis error is typically negligible in comparison with the instrument error, but amounts to 30 percent of the instrument error in areas of strong wind shear. The scheme is very economical, and thus suitable for large volumes of dense data such as SASS data.
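
    As background, the bootstrap error evaluation mentioned above can be illustrated with a minimal sketch (Python, using invented stand-in data; the residual values and scales are assumptions, not SASS results): resample the analysis residuals with replacement and take the spread of the resampled statistics as the error estimate.

        import numpy as np

        rng = np.random.default_rng(42)

        def bootstrap_se(values, n_resamples=1000):
            """Estimate the standard error of the mean by resampling with replacement."""
            values = np.asarray(values)
            means = np.empty(n_resamples)
            for i in range(n_resamples):
                sample = rng.choice(values, size=values.size, replace=True)
                means[i] = sample.mean()
            return means.std(ddof=1)

        # Hypothetical wind-speed residuals (m/s) at analysis grid points.
        residuals = rng.normal(loc=0.0, scale=1.5, size=200)
        print(f"Bootstrap standard error of the mean residual: {bootstrap_se(residuals):.3f} m/s")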

  8. Slow Learner Errors Analysis in Solving Fractions Problems in Inclusive Junior High School Class

    NASA Astrophysics Data System (ADS)

    Novitasari, N.; Lukito, A.; Ekawati, R.

    2018-01-01

    A slow learner, whose IQ is between 71 and 89, will have difficulties in solving mathematics problems, which often leads to errors. The errors can be analyzed for where they occur and their types. This qualitative descriptive research aims to describe the locations, types, and causes of slow learner errors in solving fraction problems in an inclusive junior high school class. The subject of this research is one slow-learner seventh-grade student, who was selected through direct observation by the researcher and through discussion with the mathematics teacher and the special tutor who supports the slow learner students. Data collection methods used in this study are written tasks and semistructured interviews. The collected data were analyzed with Newman's Error Analysis (NEA). Results show that there are four locations of errors, namely comprehension, transformation, process skills, and encoding errors, and four types of errors: concept, principle, algorithm, and counting errors. The results of this error analysis will help teachers to identify the causes of the errors made by the slow learner.

  9. Soft error evaluation and vulnerability analysis in Xilinx Zynq-7010 system-on chip

    NASA Astrophysics Data System (ADS)

    Du, Xuecheng; He, Chaohui; Liu, Shuhuan; Zhang, Yao; Li, Yonghong; Xiong, Ceng; Tan, Pengkang

    2016-09-01

    Radiation-induced soft errors are an increasingly important threat to the reliability of modern electronic systems. In order to evaluate the system-on-chip's reliability and soft errors, the fault tree analysis method was used in this work. The system fault tree was constructed for the Xilinx Zynq-7010 All Programmable SoC, and the soft error rates of different components in the Zynq-7010 SoC were measured using an americium-241 alpha radiation source. Furthermore, parameters used to evaluate the system's reliability and safety, such as the failure rate, unavailability and mean time to failure (MTTF), were calculated using Isograph Reliability Workbench 11.0. Based on the fault tree analysis of the system-on-chip, the critical blocks and the overall system reliability were evaluated through qualitative and quantitative analysis.
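
    The reliability quantities named in this abstract follow from standard constant-failure-rate formulas. The sketch below (Python) uses invented block names and rates rather than measured Zynq-7010 data, and assumes a series fault tree in which any block failure fails the SoC.

        # Hypothetical constant failure rates (failures per hour) for SoC blocks.
        failure_rates = {
            "processing_system": 2e-7,
            "programmable_logic": 5e-7,
            "on_chip_memory": 1e-7,
        }

        # For a series system, block failure rates add.
        lambda_sys = sum(failure_rates.values())
        mttf_hours = 1.0 / lambda_sys  # mean time to failure

        # Steady-state unavailability of a repairable system: U = lambda / (lambda + mu).
        mu = 1e-2  # assumed repair (recovery) rate per hour
        unavailability = lambda_sys / (lambda_sys + mu)

        print(f"system failure rate: {lambda_sys:.2e} /h")
        print(f"MTTF: {mttf_hours:.3e} h")
        print(f"steady-state unavailability: {unavailability:.2e}")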

  10. A Qualitative Analysis of Imitation Performances of Preschoolers With Down Syndrome.

    PubMed

    Vanvuchelen, Marleen

    2016-05-01

    A number of studies suggest that imitation is a characteristic strength in children with Down Syndrome (DS). The present study aims to discover whether imitation performances are qualitatively phenotypical in DS. Eight preschoolers with DS were matched on chronological, mental, language and imitation age with 8 preschoolers with intellectual disability of undifferentiated etiology (ID-UND). Imitation performances on the Preschool Imitation and Praxis Scale were videotaped for blind scoring on 30 possible errors. Children with DS made fewer production errors (synkinesias, OR 0.3 [0.1-0.7]), but more conceptual errors (substitution, OR 2.5 [1.6-3.9]) compared to children with ID-UND. This finding is in line with the view of a cognitive phenotype in DS, which is characterized by preserved visuospatial and impaired language abilities.

  11. Developing a model for the adequate description of electronic communication in hospitals.

    PubMed

    Saboor, Samrend; Ammenwerth, Elske

    2011-01-01

    Adequate information and communication technology (ICT) systems can help to improve communication in hospitals. Changes to the ICT infrastructure of hospitals must be planned carefully. In order to support comprehensive planning, we presented a classification of 81 common errors of electronic communication at the MIE 2008 congress. Our objective now was to develop a data model that defines specific requirements for an adequate description of electronic communication processes. We first applied the method of explicating qualitative content analysis to the error categorization in order to determine the essential process details, and then applied the method of subsuming qualitative content analysis to the results of the first step. The result is a data model for the adequate description of electronic communication, comprising 61 entities and 91 relationships. The data model comprises and organizes all details that are necessary for the detection of the respective errors. It can either be used to extend the capabilities of existing modeling methods or serve as a basis for the development of a new approach.

  12. Microscopic saw mark analysis: an empirical approach.

    PubMed

    Love, Jennifer C; Derrick, Sharon M; Wiersema, Jason M; Peters, Charles

    2015-01-01

    Microscopic saw mark analysis is a well-published and generally accepted qualitative analytical method. However, little research has focused on identifying and mitigating potential sources of error associated with the method. The presented study proposes the use of classification trees and random forest classifiers as an optimal, statistically sound approach to mitigate the potential for variability error and outcome error in microscopic saw mark analysis. The statistical model was applied to 58 experimental saw marks created with four types of saws. The saw marks were made in fresh human femurs obtained through anatomical gift and were analyzed using a Keyence digital microscope. The statistical approach weighed the variables based on discriminatory value and produced decision trees with an associated outcome error rate of 8.62-17.82%. © 2014 American Academy of Forensic Sciences.
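
    A minimal sketch of the random-forest-with-error-rate idea is shown below (Python/scikit-learn). The feature matrix and labels are randomly generated stand-ins for the 58 saw marks and four saw types; the out-of-bag and cross-validated error rates play the role of the outcome error rate reported in the study.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)

        # Stand-in data: 58 marks, 3 numeric mark features, 4 saw classes.
        X = rng.normal(size=(58, 3))
        y = rng.integers(0, 4, size=58)

        clf = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0)
        clf.fit(X, y)

        # The out-of-bag score gives an internal error estimate; cross-validation
        # provides a second check on the outcome error rate.
        print(f"OOB error rate: {1 - clf.oob_score_:.2%}")
        print(f"5-fold CV error rate: {1 - cross_val_score(clf, X, y, cv=5).mean():.2%}")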

  13. An analysis of pilot error-related aircraft accidents

    NASA Technical Reports Server (NTRS)

    Kowalsky, N. B.; Masters, R. L.; Stone, R. B.; Babcock, G. L.; Rypka, E. W.

    1974-01-01

    A multidisciplinary team approach to pilot error-related U.S. air carrier jet aircraft accident investigation records successfully reclaimed hidden human error information not shown in statistical studies. New analytic techniques were developed and applied to the data to discover and identify multiple elements of commonality and shared characteristics within this group of accidents. Three techniques of analysis were used: critical element analysis, which demonstrated the importance of a subjective qualitative approach to raw accident data and surfaced information heretofore unavailable; cluster analysis, an exploratory research tool that will lead to increased understanding and improved organization of facts, the discovery of new meaning in large data sets, and the generation of explanatory hypotheses; and pattern recognition, by which accidents can be categorized by pattern conformity after critical element identification by cluster analysis.

  14. Errors Analysis of Students in Mathematics Department to Learn Plane Geometry

    NASA Astrophysics Data System (ADS)

    Mirna, M.

    2018-04-01

    This article describes the results of qualitative descriptive research that reveals the locations, types and causes of student errors in answering plane geometry problems at the problem-solving level. Answers from 59 students on three test items showed that student errors ranged from understanding the concepts and principles of geometry itself to errors in applying them to problem solving. The error types consist of concept errors, principle errors and operational errors. The results of reflection with four subjects reveal the causes of the errors: 1) students' learning motivation is very low; 2) in their high school learning experience, geometry was seen as unimportant; 3) the students have very little experience using their reasoning in solving problems; and 4) students' reasoning ability is still very low.

  15. Students’ Errors in Geometry Viewed from Spatial Intelligence

    NASA Astrophysics Data System (ADS)

    Riastuti, N.; Mardiyana, M.; Pramudya, I.

    2017-09-01

    Geometry is one of the difficult materials because students must have the ability to visualize, describe images, draw shapes, and know the kinds of shapes. This study aims to describe student errors, based on Newman's Error Analysis, in solving geometry problems viewed from spatial intelligence. This research uses a descriptive qualitative method with a purposive sampling technique. The data in this research are the results of a geometry test and interviews with 8th graders of a junior high school in Indonesia. The results of this study show that each category of spatial intelligence exhibits a different type of error in solving geometry problems. Errors are mostly made by students with low spatial intelligence because they have deficiencies in visual abilities. Analysis of student errors viewed from spatial intelligence is expected to help students reflect on their solving of geometry problems.

  16. Patients' perception of types of errors in palliative care - results from a qualitative interview study.

    PubMed

    Kiesewetter, Isabel; Schulz, Christian; Bausewein, Claudia; Fountain, Rita; Schmitz, Andrea

    2016-08-11

    Medical errors have been recognized as a relevant public health concern, and research efforts to improve patient safety have increased. In palliative care, however, studies on errors are rare and mainly focus on quantitative measures. We aimed to explore how palliative care patients perceive and think about errors in palliative care and to generate an understanding of patients' perception of errors in that specialty. A semistructured qualitative interview study was conducted with patients who had received at least 1 week of palliative care in an inpatient or outpatient setting. All interviews were transcribed verbatim and analysed according to qualitative content analysis. Twelve patients from two centers were interviewed (7 women, median age 63.5 years, range 22-90 years). Eleven patients suffered from a malignancy. Days in palliative care ranged from 10 to 180 days (median 28 days). 96 categories emerged, which were summed up under 11 umbrella terms: definition, difference, type, cause, consequence, meaning, recognition, handling, prevention, person causing and affected person. A deductive model was developed assigning the umbrella terms to error-theory-based factor levels (definition, type and process-related factors). 23 categories for type of error were identified, including 12 that can be considered palliative care specific. On the level of process-related factors, 3 palliative care specific categories emerged (recognition, meaning and consequence of errors). From the patients' perspective, there are some aspects of errors that can be considered specific to palliative care. As the results of our study suggest, these palliative care-specific aspects seem to be very important from the patients' point of view and should receive further investigation. Moreover, the findings of this study can serve as a guide to further assess single aspects or categories of errors in palliative care in future research.

  17. Obstetric Neuraxial Drug Administration Errors: A Quantitative and Qualitative Analytical Review.

    PubMed

    Patel, Santosh; Loveridge, Robert

    2015-12-01

    Drug administration errors in obstetric neuraxial anesthesia can have devastating consequences. Although fully recognizing that they represent "only the tip of the iceberg," published case reports/series of these errors were reviewed in detail with the aim of estimating the frequency and the nature of these errors. We identified case reports and case series from MEDLINE and performed a quantitative analysis of the involved drugs, error setting, source of error, the observed complications, and any therapeutic interventions. We subsequently performed a qualitative analysis of the human factors involved and proposed modifications to practice. Twenty-nine cases were identified. Various drugs were given in error, but no direct effects on the course of labor, mode of delivery, or neonatal outcome were reported. Four maternal deaths from the accidental intrathecal administration of tranexamic acid were reported, all occurring after delivery of the fetus. A range of hemodynamic and neurologic signs and symptoms were noted, but the most commonly reported complication was the failure of the intended neuraxial anesthetic technique. Several human factors were present; most common factors were drug storage issues and similar drug appearance. Four practice recommendations were identified as being likely to have prevented the errors. The reported errors exposed latent conditions within health care systems. We suggest that the implementation of the following processes may decrease the risk of these types of drug errors: (1) Careful reading of the label on any drug ampule or syringe before the drug is drawn up or injected; (2) labeling all syringes; (3) checking labels with a second person or a device (such as a barcode reader linked to a computer) before the drug is drawn up or administered; and (4) use of non-Luer lock connectors on all epidural/spinal/combined spinal-epidural devices. Further study is required to determine whether routine use of these processes will reduce drug error.

  18. Towards reporting standards for neuropsychological study results: A proposal to minimize communication errors with standardized qualitative descriptors for normalized test scores.

    PubMed

    Schoenberg, Mike R; Rum, Ruba S

    2017-11-01

    Rapid, clear and efficient communication of neuropsychological results is essential to benefit patient care. Errors in communication are a leading cause of medical errors; nevertheless, there remains a lack of consistency in how neuropsychological scores are communicated. A major limitation in the communication of neuropsychological results is the inconsistent use of qualitative descriptors for standardized test scores and the use of vague terminology. A PubMed search from 1 Jan 2007 to 1 Aug 2016 was conducted to identify guidelines or consensus statements for the description and reporting of qualitative terms used to communicate neuropsychological test scores. The review found the use of confusing and overlapping terms to describe various ranges of percentile standardized test scores. In response, we propose a simplified set of qualitative descriptors for normalized test scores (Q-Simple) as a means to reduce errors in communicating test results. The Q-Simple qualitative terms are: 'very superior', 'superior', 'high average', 'average', 'low average', 'borderline' and 'abnormal/impaired'. A case example illustrates the proposed Q-Simple qualitative classification system used to communicate neuropsychological results for neurosurgical planning. The Q-Simple qualitative descriptor system is intended to improve and standardize communication of standardized neuropsychological test scores. Further research is needed to evaluate neuropsychological communication errors. Conveying the clinical implications of neuropsychological results in a manner that minimizes the risk of communication errors is a quintessential component of evidence-based practice. Copyright © 2017 Elsevier B.V. All rights reserved.
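
    The Q-Simple idea reduces to a lookup from a normalized score to one of seven terms. A sketch of such a mapping is shown below (Python); the numeric cutoffs are common Wechsler-style conventions assumed here for illustration, not necessarily the boundaries the paper proposes.

        def q_simple_descriptor(standard_score: float) -> str:
            """Map a standardized score (mean 100, SD 15) to a Q-Simple term.

            Cutoffs are assumed conventional values, not the paper's own.
            """
            bands = [
                (130, "very superior"),
                (120, "superior"),
                (110, "high average"),
                (90, "average"),
                (80, "low average"),
                (70, "borderline"),
            ]
            for cutoff, label in bands:
                if standard_score >= cutoff:
                    return label
            return "abnormal/impaired"

        print(q_simple_descriptor(112))  # high average
        print(q_simple_descriptor(68))   # abnormal/impaired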

  19. Linguistic pattern analysis of misspellings of typically developing writers in grades 1-9.

    PubMed

    Bahr, Ruth Huntley; Silliman, Elaine R.; Berninger, Virginia W.; Dow, Michael

    2012-12-01

    A mixed-methods approach, evaluating triple word-form theory, was used to describe linguistic patterns of misspellings. Spelling errors were taken from narrative and expository writing samples provided by 888 typically developing students in Grades 1-9. Errors were coded by category (phonological, orthographic, and morphological) and specific linguistic feature affected. Grade-level effects were analyzed with trend analysis. Qualitative analyses determined frequent error types and how use of specific linguistic features varied across grades. Phonological, orthographic, and morphological errors were noted across all grades, but orthographic errors predominated. Linear trends revealed developmental shifts in error proportions for the orthographic and morphological categories between Grades 4 and 5. Similar error types were noted across age groups, but the nature of linguistic feature error changed with age. Triple word-form theory was supported. By Grade 1, orthographic errors predominated, and phonological and morphological error patterns were evident. Morphological errors increased in relative frequency in older students, probably due to a combination of word-formation issues and vocabulary growth. These patterns suggest that normal spelling development reflects nonlinear growth and that it takes a long time to develop a robust orthographic lexicon that coordinates phonology, orthography, and morphology and supports word-specific, conventional spelling.

  20. Medication errors as malpractice-a qualitative content analysis of 585 medication errors by nurses in Sweden.

    PubMed

    Björkstén, Karin Sparring; Bergqvist, Monica; Andersén-Karlsson, Eva; Benson, Lina; Ulfvarson, Johanna

    2016-08-24

    Many studies address the prevalence of medication errors, but few address medication errors serious enough to be regarded as malpractice. Other studies have analyzed the individual and system contributory factors leading to a medication error. Nurses have a key role in medication administration, and there are contradictory reports on the nurses' work experience in relation to the risk and type of medication errors. All medication errors where a nurse was held responsible for malpractice (n = 585) during 11 years in Sweden were included. A qualitative content analysis and a classification according to the type and the individual and system contributory factors were made. In order to test for possible differences between nurses' work experience and associations within and between the errors and contributory factors, Fisher's exact test was used, and Cohen's kappa (κ) was calculated to estimate the magnitude and direction of the associations. There were a total of 613 medication errors in the 585 cases, the most common being "Wrong dose" (41%), "Wrong patient" (13%) and "Omission of drug" (12%). In 95% of the cases, an average of 1.4 individual contributory factors was found, the most common being "Negligence, forgetfulness or lack of attentiveness" (68%), "Proper protocol not followed" (25%), "Lack of knowledge" (13%) and "Practice beyond scope" (12%). In 78% of the cases, an average of 1.7 system contributory factors was found, the most common being "Role overload" (36%), "Unclear communication or orders" (30%) and "Lack of adequate access to guidelines or unclear organisational routines" (30%). The errors "Wrong patient due to mix-up of patients" and "Wrong route" and the contributory factors "Lack of knowledge" and "Negligence, forgetfulness or lack of attentiveness" were more common among less experienced nurses. The experienced nurses were more prone to "Practice beyond scope of practice" and to making errors in spite of "Lack of adequate access to guidelines or unclear organisational routines". Medication errors regarded as malpractice in Sweden were of the same character as medication errors worldwide. A complex interplay between individual and system factors often contributed to the errors.
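
    Both statistics used in this study are available off the shelf; a minimal sketch follows (Python/SciPy/scikit-learn, with invented counts and labels). Fisher's exact test operates on a 2x2 contingency table; Cohen's kappa is shown here in its common form, agreement between two sets of categorical codings.

        from scipy.stats import fisher_exact
        from sklearn.metrics import cohen_kappa_score

        # Hypothetical 2x2 table: rows = less/more experienced nurses,
        # columns = error type present/absent (counts are invented).
        table = [[30, 70],
                 [15, 85]]
        odds_ratio, p_value = fisher_exact(table)
        print(f"Fisher's exact test: OR = {odds_ratio:.2f}, p = {p_value:.3f}")

        # Cohen's kappa for agreement between two categorical codings.
        coding_a = [0, 1, 1, 0, 2, 1, 0, 2, 2, 1]
        coding_b = [0, 1, 0, 0, 2, 1, 0, 2, 1, 1]
        print(f"Cohen's kappa: {cohen_kappa_score(coding_a, coding_b):.2f}")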

  1. Perceptions and Attitudes towards Medication Error Reporting in Primary Care Clinics: A Qualitative Study in Malaysia.

    PubMed

    Samsiah, A; Othman, Noordin; Jamshed, Shazia; Hassali, Mohamed Azmi

    2016-01-01

    To explore and understand participants' perceptions and attitudes towards the reporting of medication errors (MEs). A qualitative study using in-depth interviews of 31 healthcare practitioners from nine publicly funded, primary care clinics in three states in peninsular Malaysia was conducted for this study. The participants included family medicine specialists, doctors, pharmacists, pharmacist assistants, nurses and assistant medical officers. The interviews were audiotaped and transcribed verbatim. Analysis of the data was guided by the framework approach. Six themes and 28 codes were identified. Despite the availability of a reporting system, most of the participants agreed that MEs were underreported. The nature of the error plays an important role in determining the reporting. The reporting system, organisational factors, provider factors, reporter's burden and benefit of reporting also were identified. Healthcare practitioners in primary care clinics understood the importance of reporting MEs to improve patient safety. Their perceptions and attitudes towards reporting of MEs were influenced by many factors which affect the decision-making process of whether or not to report. Although the process is complex, it primarily is determined by the severity of the outcome of the errors. The participants voluntarily report the errors if they are familiar with the reporting system, what error to report, when to report and what form to use.

  2. Learning Through Experience: Influence of Formal and Informal Training on Medical Error Disclosure Skills in Residents.

    PubMed

    Wong, Brian M; Coffey, Maitreya; Nousiainen, Markku T; Brydges, Ryan; McDonald-Blumer, Heather; Atkinson, Adelle; Levinson, Wendy; Stroud, Lynfa

    2017-02-01

    Residents' attitudes toward error disclosure have improved over time. It is unclear whether this has been accompanied by improvements in disclosure skills. To measure the disclosure skills of internal medicine (IM), paediatrics, and orthopaedic surgery residents, and to explore resident perceptions of formal versus informal training in preparing them for disclosure in real-world practice. We assessed residents' error disclosure skills using a structured role play with a standardized patient in 2012-2013. We compared disclosure skills across programs using analysis of variance. We conducted a multiple linear regression, including data from a historical cohort of IM residents from 2005, to investigate the influence of predictor variables on performance: training program, cohort year, and prior disclosure training and experience. We conducted a qualitative descriptive analysis of data from semistructured interviews with residents to explore resident perceptions of formal versus informal disclosure training. In a comparison of disclosure skills for 49 residents, there was no difference in overall performance across specialties (4.1 to 4.4 of 5, P  = .19). In regression analysis, only the current cohort was significantly associated with skill: current residents performed better than a historical cohort of 42 IM residents ( P  < .001). Qualitative analysis identified the importance of both formal (workshops, morbidity and mortality rounds) and informal (role modeling, debriefing) activities in preparation for disclosure in real-world practice. Residents across specialties have similar skills in disclosure of errors. Residents identified role modeling and a strong local patient safety culture as key facilitators for disclosure.

  3. Medication errors with electronic prescribing (eP): Two views of the same picture

    PubMed Central

    2010-01-01

    Background Quantitative prospective methods are widely used to evaluate the impact of new technologies such as electronic prescribing (eP) on medication errors. However, they are labour-intensive and it is not always feasible to obtain pre-intervention data. Our objective was to compare the eP medication error picture obtained with retrospective quantitative and qualitative methods. Methods The study was carried out at one English district general hospital approximately two years after implementation of an integrated electronic prescribing, administration and records system. Quantitative: A structured retrospective analysis was carried out of clinical records and medication orders for 75 randomly selected patients admitted to three wards (medicine, surgery and paediatrics) six months after eP implementation. Qualitative: Eight doctors, 6 nurses, 8 pharmacy staff and 4 other staff at senior, middle and junior grades, and 19 adult patients on acute surgical and medical wards were interviewed. Staff interviews explored experiences of developing and working with the system; patient interviews focused on experiences of medicine prescribing and administration on the ward. Interview transcripts were searched systematically for accounts of medication incidents. A classification scheme was developed and applied to the errors identified in the records review. Results The two approaches produced similar pictures of the drug use process. Interviews identified types of error identified in the retrospective notes review plus two eP-specific errors which were not detected by record review. Interview data took less time to collect than record review, and provided rich data on the prescribing process, and reasons for delays or non-administration of medicines, including "once only" orders and "as required" medicines. Conclusions The qualitative approach provided more understanding of processes, and some insights into why medication errors can happen. The method is cost-effective and could be used to supplement information from anonymous error reporting schemes. PMID:20497532

  4. Reevaluating Recovery: Perceived Violations and Preemptive Interventions on Emergency Psychiatry Rounds

    PubMed Central

    Cohen, Trevor; Blatter, Brett; Almeida, Carlos; Patel, Vimla L.

    2007-01-01

    Objective Contemporary error research suggests that the quest to eradicate error is misguided. Error commission, detection, and recovery are an integral part of cognitive work, even at the expert level. In collaborative workspaces, the perception of potential error is directly observable: workers discuss and respond to perceived violations of accepted practice norms. As perceived violations are captured and corrected preemptively, they do not fit Reason’s widely accepted definition of error as “failure to achieve an intended outcome.” However, perceived violations suggest the aversion of potential error, and consequently have implications for error prevention. This research aims to identify and describe perceived violations of the boundaries of accepted procedure in a psychiatric emergency department (PED), and how they are resolved in practice. Design Clinical discourse from fourteen PED patient rounds was audio-recorded. Excerpts from recordings suggesting perceived violations or incidents of miscommunication were extracted and analyzed using qualitative coding methods. The results are interpreted in relation to prior research on vulnerabilities to error in the PED. Results Thirty incidents of perceived violations or miscommunication are identified and analyzed. Of these, only one medication error was formally reported. Other incidents would not have been detected by a retrospective analysis. Conclusions The analysis of perceived violations expands the data available for error analysis beyond occasional reported adverse events. These data are prospective: responses are captured in real time. This analysis supports a set of recommendations to improve the quality of care in the PED and other critical care contexts. PMID:17329728

  5. Avoiding common pitfalls in qualitative data collection and transcription.

    PubMed

    Easton, K L; McComish, J F; Greenberg, R

    2000-09-01

    The subjective nature of qualitative research necessitates scrupulous scientific methods to ensure valid results. Although qualitative methods such as grounded theory, phenomenology, and ethnography yield rich data, consumers of research need to be able to trust the findings reported in such studies. Researchers are responsible for establishing the trustworthiness of qualitative research through a variety of ways. Specific challenges faced in the field can seriously threaten the dependability of the data. However, by minimizing potential errors that can occur when doing fieldwork, researchers can increase the trustworthiness of the study. The purpose of this article is to present three of the pitfalls that can occur in qualitative research during data collection and transcription: equipment failure, environmental hazards, and transcription errors. Specific strategies to minimize the risk for avoidable errors will be discussed.

  6. Chemometric study of Andalusian extra virgin olive oils Raman spectra: Qualitative and quantitative information.

    PubMed

    Sánchez-López, E; Sánchez-Rodríguez, M I; Marinas, A; Marinas, J M; Urbano, F J; Caridad, J M; Moalem, M

    2016-08-15

    Authentication of extra virgin olive oil (EVOO) is an important topic for the olive oil industry. Fraudulent practices in this sector are a major problem affecting both producers and consumers. This study analyzes the capability of FT-Raman spectroscopy combined with chemometric treatments to predict fatty acid contents (quantitative information), using gas chromatography as the reference technique, and to classify diverse EVOOs as a function of harvest year, olive variety, geographical origin and Andalusian PDO (qualitative information). The optimal number of PLS components that summarizes the spectral information was introduced progressively. For the estimation of the fatty acid composition, the lowest error (both in fitting and prediction) corresponded to MUFA, followed by SAFA and PUFA, though such errors were close to zero in all cases. As regards the qualitative variables, discriminant analysis allowed a correct classification of 94.3%, 84.0%, 89.0% and 86.6% of samples for harvest year, olive variety, geographical origin and PDO, respectively. Copyright © 2016 Elsevier B.V. All rights reserved.
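
    The quantitative part of this workflow, PLS regression from spectra to fatty acid fractions, can be sketched as follows (Python/scikit-learn). The spectra and reference values are random stand-ins, and the number of PLS components is a placeholder for the value the authors tuned progressively.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.metrics import mean_squared_error
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(1)

        # Stand-in data: 120 "Raman spectra" (500 points each) and three
        # fatty-acid fractions (SAFA, MUFA, PUFA) as reference values.
        X = rng.normal(size=(120, 500))
        Y = rng.uniform(size=(120, 3))

        X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.25, random_state=1)

        pls = PLSRegression(n_components=10)  # placeholder component count
        pls.fit(X_tr, Y_tr)
        Y_hat = pls.predict(X_te)

        for i, name in enumerate(["SAFA", "MUFA", "PUFA"]):
            rmse = np.sqrt(mean_squared_error(Y_te[:, i], Y_hat[:, i]))
            print(f"{name} prediction RMSE: {rmse:.3f}")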

  7. Linguistic Pattern Analysis of Misspellings of Typically Developing Writers in Grades 1 to 9

    PubMed Central

    Bahr, Ruth Huntley; Silliman, Elaine R.; Berninger, Virginia W.; Dow, Michael

    2012-01-01

    Purpose A mixed methods approach, evaluating triple word form theory, was used to describe linguistic patterns of misspellings. Method Spelling errors were taken from narrative and expository writing samples provided by 888 typically developing students in grades 1–9. Errors were coded by category (phonological, orthographic, and morphological) and specific linguistic feature affected. Grade level effects were analyzed with trend analysis. Qualitative analyses determined frequent error types and how use of specific linguistic features varied across grades. Results Phonological, orthographic, and morphological errors were noted across all grades, but orthographic errors predominated. Linear trends revealed developmental shifts in error proportions for the orthographic and morphological categories between grades 4–5. Similar error types were noted across age groups but the nature of linguistic feature error changed with age. Conclusions Triple word-form theory was supported. By grade 1, orthographic errors predominated and phonological and morphological error patterns were evident. Morphological errors increased in relative frequency in older students, probably due to a combination of word-formation issues and vocabulary growth. These patterns suggest that normal spelling development reflects non-linear growth and that it takes a long time to develop a robust orthographic lexicon that coordinates phonology, orthography, and morphology and supports word-specific, conventional spelling. PMID:22473834

  8. Defining near misses: towards a sharpened definition based on empirical data about error handling processes.

    PubMed

    Kessels-Habraken, Marieke; Van der Schaaf, Tjerk; De Jonge, Jan; Rutte, Christel

    2010-05-01

    Medical errors in health care still occur frequently. Unfortunately, errors cannot be completely prevented and 100% safety can never be achieved. Therefore, in addition to error reduction strategies, health care organisations could also implement strategies that promote timely error detection and correction. Reporting and analysis of so-called near misses - usually defined as incidents without adverse consequences for patients - are necessary to gather information about successful error recovery mechanisms. This study establishes the need for a clearer and more consistent definition of near misses to enable large-scale reporting and analysis in order to obtain such information. Qualitative incident reports and interviews were collected on four units of two Dutch general hospitals. Analysis of the 143 accompanying error handling processes demonstrated that different incident types each provide unique information about error handling. Specifically, error handling processes underlying incidents that did not reach the patient differed significantly from those of incidents that reached the patient, irrespective of harm, because of successful countermeasures that had been taken after error detection. We put forward two possible definitions of near misses and argue that, from a practical point of view, the optimal definition may be contingent on organisational context. Both proposed definitions could yield large-scale reporting of near misses. Subsequent analysis could enable health care organisations to improve the safety and quality of care proactively by (1) eliminating failure factors before real accidents occur, (2) enhancing their ability to intercept errors in time, and (3) improving their safety culture. Copyright 2010 Elsevier Ltd. All rights reserved.

  9. Error behaviors associated with loss of competency in Alzheimer's disease.

    PubMed

    Marson, D C; Annis, S M; McInturff, B; Bartolucci, A; Harrell, L E

    1999-12-10

    To investigate qualitative behavioral changes associated with declining medical decision-making capacity (competency) in patients with AD. Qualitative measures can yield clinical information about functional changes in neurologic disease not available through quantitative measures. Normal older controls (n = 21) and patients with mild and moderate probable AD (n = 72) were compared using a standardized competency measure and neuropsychological measures. A system of 16 qualitative error scores representing conceptual domains of language, executive dysfunction, affective dysfunction, and compensatory responses was used to analyze errors produced on the competency measure. Patterns of errors were examined across groups. Relationships between error behaviors and competency performance were determined, and neurocognitive correlates of specific error behaviors were identified. AD patients demonstrated more miscomprehension, factual confusion, intrusions, incoherent responses, nonresponsive answers, loss of task, and delegation than controls. Errors in the executive domain (loss of task, nonresponsive answer, and loss of detachment) were key predictors of declining competency performance by AD patients. Neuropsychological analyses in the AD group generally confirmed the conceptual domain assignments of the qualitative scores. Loss of task, nonresponsive answers, and loss of detachment were key behavioral changes associated with declining competency of AD patients and with neurocognitive measures of executive dysfunction. These findings support the growing linkage between executive dysfunction and competency loss.

  10. Consistency and convergence for numerical radiation conditions

    NASA Technical Reports Server (NTRS)

    Hagstrom, Thomas

    1990-01-01

    The problem of imposing radiation conditions at artificial boundaries for the numerical simulation of wave propagation is considered. Emphasis is on the behavior and analysis of the error which results from the restriction of the domain. The theory of error estimation is briefly outlined for boundary conditions. Use is made of the asymptotic analysis of propagating wave groups to derive and analyze boundary operators. For dissipative problems this leads to local, accurate conditions, but falls short in the hyperbolic case. A numerical experiment on the solution of the wave equation with cylindrical symmetry is described. A unified presentation of a number of conditions which have been proposed in the literature is given and the time dependence of the error which results from their use is displayed. The results are in qualitative agreement with theoretical considerations. It was found, however, that for this model problem it is particularly difficult to force the error to decay rapidly in time.
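
    For context, a standard example of the kind of local radiation operator analyzed in this literature is the first-order Bayliss-Turkel condition for the cylindrically symmetric wave equation, given here in LaTeX as general background; the abstract does not specify which operator the experiment used:

        \left( \frac{1}{c}\,\frac{\partial}{\partial t}
             + \frac{\partial}{\partial r}
             + \frac{1}{2r} \right) u \,\Big|_{r=R} = 0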

  11. Experiences of and support for nurses as second victims of adverse nursing errors: a qualitative systematic review.

    PubMed

    Cabilan, C J; Kynoch, Kathryn

    2017-09-01

    Second victims are clinicians who have made adverse errors and feel traumatized by the experience. The current published literature on second victims is mainly representative of doctors, hence nurses' experiences are not fully depicted. This systematic review was necessary to understand the second victim experience for nurses, explore the support provided, and recommend appropriate support systems for nurses. To synthesize the best available evidence on nurses' experiences as second victims, and explore their experiences of the support they receive and the support they need. Participants were registered nurses who made adverse errors. The review included studies that described nurses' experiences as second victims and/or the support they received after making adverse errors. All studies conducted in any health care settings worldwide. The qualitative studies included were grounded theory, discourse analysis and phenomenology. A structured search strategy was used to locate all unpublished and published qualitative studies, but was limited to the English language, and published between 1980 and February 2017. The references of studies selected for eligibility screening were hand-searched for additional literature. Eligible studies were assessed by two independent reviewers for methodological quality using a standardized critical appraisal instrument from the Joanna Briggs Institute Qualitative Assessment and Review Instrument (JBI QARI). Themes and narrative statements were extracted from papers included in the review using the standardized data extraction tool from JBI QARI. Data synthesis was conducted using the Joanna Briggs Institute meta-aggregation approach. There were nine qualitative studies included in the review. The narratives of 284 nurses generated a total of 43 findings, which formed 15 categories based on similarity of meaning. Four synthesized findings were generated from the categories: (i) The error brings a considerable emotional burden to the nurse that can last for a long time. In some cases, the error can alter nurses' perspectives and disrupt workplace relations; (ii) The type of support received influences how the nurse will feel about the error. Often nurses choose to speak with colleagues who have had similar experiences. Strategies need to focus on helping them to overcome the negative emotions associated with being a second victim; (iii) After the error, nurses are confronted with the dilemma of disclosure. Disclosure is determined by the following factors: how nurses feel about the error, harm to the patient, the support available to the nurse, and how errors are dealt with in the past; and (iv) Reconciliation is every nurse's endeavor. Predominantly, this is achieved by accepting fallibility, followed by acts of restitution, such as making positive changes in practice and disclosure to attain closure (see "Summary of findings"). Adverse errors were distressing for nurses, but they did not always receive the support they needed from colleagues. The lack of support had a significant impact on nurses' decisions on whether to disclose the error and his/her recovery process. Therefore, a good support system is imperative in alleviating the emotional burden, promoting the disclosure process, and assisting nurses with reconciliation. This review also highlighted research gaps that encompass the characteristics of the support system preferred by nurses, and the scarcity of studies worldwide.

  12. Reproducing American Sign Language sentences: cognitive scaffolding in working memory

    PubMed Central

    Supalla, Ted; Hauser, Peter C.; Bavelier, Daphne

    2014-01-01

    The American Sign Language Sentence Reproduction Test (ASL-SRT) requires the precise reproduction of a series of ASL sentences increasing in complexity and length. Error analyses of such tasks provide insight into working memory and scaffolding processes. Data were collected from three groups expected to differ in fluency: deaf children, deaf adults and hearing adults, all users of ASL. Quantitative (correct/incorrect recall) and qualitative error analyses were performed. Percent correct on the reproduction task supports its sensitivity to fluency, as test performance clearly differed across the three groups studied. A linguistic analysis of errors further documented differing strategies and bias across groups. Subjects' recall projected the affordances and constraints of deep linguistic representations to differing degrees, with subjects resorting to alternate processing strategies when they failed to recall the sentence correctly. A qualitative error analysis allows us to capture generalizations about the relationship between error pattern and the cognitive scaffolding which governs the sentence reproduction process. Highly fluent signers and less-fluent signers share common chokepoints on particular words in sentences. However, they diverge in heuristic strategy. Fluent signers, when they make an error, tend to preserve semantic details while altering morpho-syntactic domains. They produce syntactically correct sentences with equivalent meaning to the to-be-reproduced one, but these are not verbatim reproductions of the original sentence. In contrast, less-fluent signers tend to use a more linear strategy, preserving lexical status and word ordering while omitting local inflections, and occasionally resorting to visuo-motoric imitation. Thus, whereas fluent signers readily use top-down scaffolding in their working memory, less fluent signers fail to do so. Implications for current models of working memory across spoken and signed modalities are considered. PMID:25152744

  13. Perceptions and Attitudes towards Medication Error Reporting in Primary Care Clinics: A Qualitative Study in Malaysia

    PubMed Central

    Samsiah, A.; Othman, Noordin; Jamshed, Shazia; Hassali, Mohamed Azmi

    2016-01-01

    Objective To explore and understand participants’ perceptions and attitudes towards the reporting of medication errors (MEs). Methods A qualitative study using in-depth interviews of 31 healthcare practitioners from nine publicly funded, primary care clinics in three states in peninsular Malaysia was conducted for this study. The participants included family medicine specialists, doctors, pharmacists, pharmacist assistants, nurses and assistant medical officers. The interviews were audiotaped and transcribed verbatim. Analysis of the data was guided by the framework approach. Results Six themes and 28 codes were identified. Despite the availability of a reporting system, most of the participants agreed that MEs were underreported. The nature of the error plays an important role in determining the reporting. The reporting system, organisational factors, provider factors, reporter’s burden and benefit of reporting also were identified. Conclusions Healthcare practitioners in primary care clinics understood the importance of reporting MEs to improve patient safety. Their perceptions and attitudes towards reporting of MEs were influenced by many factors which affect the decision-making process of whether or not to report. Although the process is complex, it primarily is determined by the severity of the outcome of the errors. The participants voluntarily report the errors if they are familiar with the reporting system, what error to report, when to report and what form to use. PMID:27906960

  14. The establishment and external validation of NIR qualitative analysis model for waste polyester-cotton blend fabrics.

    PubMed

    Li, Feng; Li, Wen-Xia; Zhao, Guo-Liang; Tang, Shi-Jun; Li, Xue-Jiao; Wu, Hong-Mei

    2014-10-01

    A series of 354 polyester-cotton blend fabrics was studied by near-infrared spectroscopy (NIRS), and NIR qualitative analysis models for different spectral characteristics were established by the partial least squares (PLS) method combined with a qualitative identification coefficient. There were two types of spectra for dyed polyester-cotton blend fabrics: normal spectra and slash spectra. The slash spectra lose their spectral characteristics, which are affected by the samples' dyes, pigments, matting agents and other chemical additives. The recognition rate was low when a single model was established from the total sample set, so the samples were divided into two sets, a normal spectrum set and a slash spectrum set, and two NIR qualitative analysis models were established respectively. After the models were established, the spectral region, pretreatment methods and number of factors were optimized based on the validation results, improving the robustness and reliability of the models. The results showed that the recognition rate improved greatly when the models were established separately, reaching 99% when the two models were verified by internal validation. RC (correlation coefficient of calibration) values of the normal spectrum model and the slash spectrum model were 0.991 and 0.991, respectively; RP (correlation coefficient of prediction) values were 0.983 and 0.984; SEC (standard error of calibration) values were 0.887 and 0.453; and SEP (standard error of prediction) values were 1.131 and 0.573. A further series of 150 external validation samples was used to verify the normal spectrum and slash spectrum models, and the recognition rates reached 91.33% and 88.00%, respectively. This shows that the NIR qualitative analysis models can be used for identification of polyester-cotton blend fabrics at recycling sites.
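
    The SEC/SEP figures quoted above follow standard chemometric definitions; a small sketch (Python, with invented reference and predicted values) shows one common bias-corrected form of these statistics.

        import numpy as np

        def standard_error(reference, predicted):
            """Root-mean-square of bias-corrected residuals (a common SEC/SEP form)."""
            residuals = np.asarray(predicted) - np.asarray(reference)
            bias = residuals.mean()
            return np.sqrt(np.sum((residuals - bias) ** 2) / (residuals.size - 1))

        # Invented calibration-set values for illustration only.
        ref = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
        pred = np.array([1.1, 1.9, 3.2, 3.8, 5.3, 5.9])

        rc = np.corrcoef(ref, pred)[0, 1]  # correlation coefficient (RC/RP)
        print(f"RC = {rc:.3f}, SEC = {standard_error(ref, pred):.3f}")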

  15. Analysis of organic acids and acylglycines for the diagnosis of related inborn errors of metabolism by GC- and HPLC-MS.

    PubMed

    la Marca, Giancarlo; Rizzo, Cristiano

    2011-01-01

    The analysis of organic acids in urine is commonly included in routine procedures for detecting many inborn errors of metabolism. Many analytical methods allow for both qualitative and quantitative determination of organic acids, mainly in urine but also in plasma, serum, whole blood, amniotic fluid, and cerebrospinal fluid. Liquid-liquid extraction and solid-phase extraction using anion exchange or silica columns are commonly employed approaches for sample treatment. Before analysis can be carried out using gas chromatography-mass spectrometry, organic acids must be converted into more thermally stable, volatile, and chemically inert forms, mainly trimethylsilyl ethers, esters, or methyl esters.

  16. Medication errors in home care: a qualitative focus group study.

    PubMed

    Berland, Astrid; Bentsen, Signe Berit

    2017-11-01

    To explore registered nurses' experiences of medication errors and patient safety in home care. The focus of care for older patients has shifted from institutional care towards a model of home care. Medication errors are common in this situation and can result in patient morbidity and mortality. An exploratory qualitative design with focus group interviews was used. Four focus group interviews were conducted with 20 registered nurses in home care. The data were analysed using content analysis. Five categories were identified as follows: lack of information, lack of competence, reporting medication errors, trade name products vs. generic name products, and improving routines. Medication errors occur frequently in home care and can threaten the safety of patients. Insufficient exchange of information and poor communication between the specialist and home-care health services, and between general practitioners and healthcare workers can lead to medication errors. A lack of competence in healthcare workers can also lead to medication errors. To prevent these, it is important that there should be up-to-date information and communication between healthcare workers during the transfer of patients from specialist to home care. Ensuring competence among healthcare workers with regard to medication is also important. In addition, there should be openness and accurate reporting of medication errors, as well as in setting routines for the preparation, alteration and administration of medicines. To prevent medication errors in home care, up-to-date information and communication between healthcare workers is important when patients are transferred from specialist to home care. It is also important to ensure adequate competence with regard to medication, and that there should be openness when medication errors occur, as well as in setting routines for the preparation, alteration and administration of medications. © 2017 John Wiley & Sons Ltd.

  17. A grid for a precise analysis of daily activities.

    PubMed

    Wojtasik, V; Olivier, C; Lekeu, F; Quittre, A; Adam, S; Salmon, E

    2010-01-01

    Assessment of daily living activities is essential in patients with Alzheimer's disease. Most current tools quantitatively assess overall ability but provide little qualitative information on individual difficulties. Only a few tools allow therapists to evaluate stereotyped activities and record different types of errors. We capitalised on the Kitchen Activity Assessment to design a widely applicable analysis grid that provides both qualitative and quantitative data on activity performance. A cooking activity was videotaped in 15 patients with dementia and assessed according to the different steps in the execution of the task. The evaluations obtained with our grid showed good correlations between raters, between versions of the grid and between sessions. Moreover, the degree of independence obtained with our analysis of the task correlated with the Kitchen Activity Assessment score and with a global score of cognitive functioning. We conclude that assessment of a daily living activity with this analysis grid is reproducible and relatively independent of the therapist, and thus provides quantitative and qualitative information useful for both evaluating and caring for demented patients.

  18. Interactions Between Research and Assessment

    ERIC Educational Resources Information Center

    Rourke, Byron P.

    1976-01-01

    Available from: Journal of Pediatric Psychology, Child Study Center, 1100 N.E. 13th Street, Oklahoma City, Oklahoma 73117. The author reviews some research in the area of the neuropsychology of learning disabilities (LD) with emphasis on the qualitative analysis of spelling errors in disabled spellers and the predictive accuracy of various…

  19. Using Fault Trees to Advance Understanding of Diagnostic Errors.

    PubMed

    Rogith, Deevakar; Iyengar, M Sriram; Singh, Hardeep

    2017-11-01

    Diagnostic errors annually affect at least 5% of adults in the outpatient setting in the United States. Formal analytic techniques are only infrequently used to understand them, in part because of the complexity of diagnostic processes and clinical work flows involved. In this article, diagnostic errors were modeled using fault tree analysis (FTA), a form of root cause analysis that has been successfully used in other high-complexity, high-risk contexts. How factors contributing to diagnostic errors can be systematically modeled by FTA to inform error understanding and error prevention is demonstrated. A team of three experts reviewed 10 published cases of diagnostic error and constructed fault trees. The fault trees were modeled according to currently available conceptual frameworks characterizing diagnostic error. The 10 trees were then synthesized into a single fault tree to identify common contributing factors and pathways leading to diagnostic error. FTA is a visual, structured, deductive approach that depicts the temporal sequence of events and their interactions in a formal logical hierarchy. The visual FTA enables easier understanding of causative processes and cognitive and system factors, as well as rapid identification of common pathways and interactions in a unified fashion. In addition, it enables calculation of empirical estimates for causative pathways. Thus, fault trees might provide a useful framework for both quantitative and qualitative analysis of diagnostic errors. Future directions include establishing validity and reliability by modeling a wider range of error cases, conducting quantitative evaluations, and undertaking deeper exploration of other FTA capabilities. Copyright © 2017 The Joint Commission. Published by Elsevier Inc. All rights reserved.
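
    As a concrete illustration of the FTA mechanics described above (basic events combined through AND/OR gates into a top event, with empirical probability estimates along causative pathways), here is a toy fault tree in code. The event names, probabilities and independence assumption are invented for illustration and are not drawn from the ten reviewed cases.

```python
# Toy fault tree for a diagnostic-error top event. All event names and
# probabilities are invented; real trees would be built from case review.
from dataclasses import dataclass
from typing import List, Union

@dataclass
class Basic:
    name: str
    p: float
    def prob(self) -> float:
        return self.p

@dataclass
class Gate:
    kind: str                          # "AND" or "OR"
    children: List[Union["Gate", Basic]]
    def prob(self) -> float:
        ps = [c.prob() for c in self.children]
        if self.kind == "AND":         # all children occur (independence assumed)
            out = 1.0
            for p in ps:
                out *= p
            return out
        none_occur = 1.0               # OR gate: 1 - P(no child occurs)
        for p in ps:
            none_occur *= (1.0 - p)
        return 1.0 - none_occur

top = Gate("OR", [
    Gate("AND", [Basic("history not elicited", 0.10),
                 Basic("abnormal finding not pursued", 0.05)]),
    Basic("test result lost in workflow", 0.02),
])
print(f"P(top event) = {top.prob():.4f}")
```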

  20. New Statistical Techniques for Evaluating Longitudinal Models.

    ERIC Educational Resources Information Center

    Murray, James R.; Wiley, David E.

    A basic methodological approach in developmental studies is the collection of longitudinal data. Behavioral data can take at least two forms, qualitative (or discrete) and quantitative. Both types are fallible. Measurement errors can occur in quantitative data, and measures of these are based on error variance. Qualitative or discrete data can…

  1. Underlying risk factors for prescribing errors in long-term aged care: a qualitative study.

    PubMed

    Tariq, Amina; Georgiou, Andrew; Raban, Magdalena; Baysari, Melissa Therese; Westbrook, Johanna

    2016-09-01

    To identify system-related risk factors perceived to contribute to prescribing errors in Australian long-term care settings, that is, residential aged care facilities (RACFs). The study used qualitative methods to explore factors that contribute to unsafe prescribing in RACFs. Data were collected at three RACFs in metropolitan Sydney, Australia between May and November 2011. Participants included RACF managers, doctors, pharmacists and RACF staff actively involved in prescribing-related processes. Methods included non-participant observations (74 h), in-depth semistructured interviews (n=25) and artefact analysis. Detailed process activity models were developed for observed prescribing episodes supplemented by triangulated analysis using content analysis methods. System-related factors perceived to increase the risk of prescribing errors in RACFs were classified into three overarching themes: communication systems, team coordination and staff management. Factors associated with communication systems included limited point-of-care access to information, inadequate handovers, information storage across different media (paper, electronic and memory), poor legibility of charts, information double handling, multiple faxing of medication charts and reliance on manual chart reviews. Team factors included lack of established lines of responsibility, inadequate team communication and limited participation of doctors in multidisciplinary initiatives like medication advisory committee meetings. Factors related to staff management and workload included doctors' time constraints and their accessibility, lack of trained RACF staff and high RACF staff turnover. The study highlights several system-related factors including laborious methods for exchanging medication information, which often act together to contribute to prescribing errors. Multiple interventions (eg, technology systems, team communication protocols) are required to support the collaborative nature of RACF prescribing. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  2. Analysis of error type and frequency in apraxia of speech among Portuguese speakers.

    PubMed

    Cera, Maysa Luchesi; Minett, Thaís Soares Cianciarullo; Ortiz, Karin Zazo

    2010-01-01

    Most studies characterizing errors in the speech of patients with apraxia involve the English language. To analyze the types and frequency of errors produced by patients with apraxia of speech whose mother tongue was Brazilian Portuguese. 20 adults with apraxia of speech caused by stroke were assessed. The types of error committed by patients were analyzed both quantitatively and qualitatively, and their frequencies compared. We observed the presence of substitution, omission, trial-and-error, repetition, self-correction, anticipation, addition, reiteration and metathesis, in descending order of frequency. Omission errors were among the most common, whereas addition errors were infrequent. These findings differed from those reported for English-speaking patients, probably owing to differences in the methodologies used for classifying error types; the inclusion of speakers with apraxia secondary to aphasia; and differences between Portuguese and English in syllable onset complexity and its effect on motor control. The frequencies of omission and addition errors observed differed from the frequencies reported for speakers of English.

  3. Non-destructive and fast identification of cotton-polyester blend fabrics by the portable near-infrared spectrometer.

    PubMed

    Li, Wen-xia; Li, Feng; Zhao, Guo-liang; Tang, Shi-jun; Liu, Xiao-ying

    2014-12-01

    A series of 376 cotton-polyester (PET) blend fabrics was studied with a portable near-infrared (NIR) spectrometer. A NIR semi-quantitative-qualitative calibration model was established by the partial least squares (PLS) method combined with a qualitative identification coefficient. In this process, the PLS method for quantitative analysis was used as the correction method, and the qualitative identification coefficient was set from the content of cotton and polyester in the blend fabrics. Cotton-polyester blend fabrics were identified qualitatively by the model and their relative contents were obtained quantitatively, so the model can be used for semi-quantitative identification analysis. In the course of establishing the model, the noise and baseline drift of the spectra were eliminated by the Savitzky-Golay (S-G) derivative. The influence of waveband selection and of different pre-processing methods on the qualitative calibration model was also studied. The major absorption bands of 100% cotton samples were in the 1400~1600 nm region, those of 100% polyester were around 1600~1800 nm, and the absorption intensity increased with increasing content of cotton or polyester. Therefore, the cotton-polyester major absorption region was selected as the base waveband, and the optimal waveband (1100~2500 nm) was found by expanding the waveband in both directions (correlation coefficient 0.6, wave-point number 934). The validation samples were predicted by the calibration model; the results showed that the model evaluation parameters were optimal in the 1100~2500 nm region when the combination of S-G derivative, multiplicative scatter correction (MSC) and mean centering was used as the pre-processing method. The RC (correlation coefficient of calibration) value was 0.978, the RP (correlation coefficient of prediction) value was 0.940, the SEC (standard error of calibration) value was 1.264, the SEP (standard error of prediction) value was 1.590, and the recognition accuracy of the samples reached 93.4%. This shows that cotton-polyester blend fabrics can be identified by the semi-quantitative-qualitative calibration model.
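
    The pre-processing chain named in the abstract (S-G derivative, MSC, mean centering) can be sketched in a few lines: scipy provides the S-G filter, and MSC reduces to a per-spectrum linear regression against the mean spectrum. The window length, polynomial order and stand-in data below are illustrative assumptions, not the paper's settings.

```python
# Sketch of the pre-processing chain: Savitzky-Golay (S-G) first derivative,
# multiplicative scatter correction (MSC), then mean centering.
import numpy as np
from scipy.signal import savgol_filter

def preprocess(spectra: np.ndarray) -> np.ndarray:
    # S-G first derivative along the wavelength axis (window/order illustrative)
    d1 = savgol_filter(spectra, window_length=15, polyorder=2, deriv=1, axis=1)
    # MSC: regress each spectrum on the mean spectrum, then correct
    ref = d1.mean(axis=0)
    corrected = np.empty_like(d1)
    for i, s in enumerate(d1):
        slope, intercept = np.polyfit(ref, s, 1)
        corrected[i] = (s - intercept) / slope
    # mean centering
    return corrected - corrected.mean(axis=0)

spectra = np.random.default_rng(1).normal(size=(376, 934))  # stand-in spectra
print(preprocess(spectra).shape)
```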

  4. Responding to serious medical error in general practice--consequences for the GPs involved: analysis of 75 cases from Germany.

    PubMed

    Fisseni, Gregor; Pentzek, Michael; Abholz, Heinz-Harald

    2008-02-01

    GPs' recollections of their 'most serious errors in treatment' and of the consequences for themselves. Does it make a difference who (else) contributed to the error, or to its discovery or disclosure? Anonymous questionnaire study concerning the 'three most serious errors in your career as a GP'. The participating doctors were given an operational definition of 'serious error'. They applied a special recall technique, using patient-induced associations to bring to mind former 'serious errors'. The recall method and the semi-structured 25-item questionnaire used were developed and piloted by the authors. The items were analysed quantitatively and by qualitative content analysis. General practices in the North Rhine region in Germany: 32 GPs anonymously reported 75 'most serious errors'. In more than half of the cases analysed, other people contributed considerably to the GPs' serious errors. Most of the errors were discovered and disclosed to the patient by doctors: either by the GPs themselves or by colleagues. Many GPs suffered loss of reputation and loss of patients. However, the number of patients staying with their GP clearly exceeded the number leaving, depending on who else contributed to the error, who discovered it and who disclosed it to the patient. The majority of patients still trusted their GP after a serious error, especially if the GP was not the only one who contributed to the error and if the GP played an active role in its discovery and disclosure.

  5. Reducing Misses and Near Misses Related to Multitasking on the Electronic Health Record: Observational Study and Qualitative Analysis

    PubMed Central

    Matta, George Y; Bohsali, Fuad B; Chisolm, Margaret S

    2018-01-01

    Background Clinicians’ use of electronic health record (EHR) systems while multitasking may increase the risk of making errors, but silent EHR system use may lower patient satisfaction. Delaying EHR system use until after patient visits may increase clinicians’ EHR workload, stress, and burnout. Objective We aimed to describe the perspectives of clinicians, educators, administrators, and researchers about misses and near misses that they felt were related to clinician multitasking while using EHR systems. Methods This observational study was a thematic analysis of perspectives elicited from 63 continuing medical education (CME) participants during 2 workshops and 1 interactive lecture about challenges and strategies for relationship-centered communication during clinician EHR system use. The workshop elicited reflection about memorable times when multitasking EHR use was associated with “misses” (errors that were not caught at the time) or “near misses” (mistakes that were caught before leading to errors). We conducted qualitative analysis using an editing analysis style to identify codes and then select representative themes and quotes. Results All workshop participants shared stories of misses or near misses in EHR system ordering and documentation or patient-clinician communication, wondering about “misses we don’t even know about.” Risk factors included the computer’s position, EHR system usability, note content and style, information overload, problematic workflows, systems issues, and provider and patient communication behaviors and expectations. Strategies to reduce multitasking EHR system misses included clinician transparency when needing silent EHR system use (eg, for prescribing), narrating EHR system use, patient activation during EHR system use, adapting visit organization and workflow, improving EHR system design, and improving team support and systems. Conclusions CME participants shared numerous stories of errors and near misses in EHR tasks and communication that they felt related to EHR multitasking. However, they brainstormed diverse strategies for using EHR systems safely while preserving patient relationships. PMID:29410388

  6. Reducing Misses and Near Misses Related to Multitasking on the Electronic Health Record: Observational Study and Qualitative Analysis.

    PubMed

    Ratanawongsa, Neda; Matta, George Y; Bohsali, Fuad B; Chisolm, Margaret S

    2018-02-06

    Clinicians' use of electronic health record (EHR) systems while multitasking may increase the risk of making errors, but silent EHR system use may lower patient satisfaction. Delaying EHR system use until after patient visits may increase clinicians' EHR workload, stress, and burnout. We aimed to describe the perspectives of clinicians, educators, administrators, and researchers about misses and near misses that they felt were related to clinician multitasking while using EHR systems. This observational study was a thematic analysis of perspectives elicited from 63 continuing medical education (CME) participants during 2 workshops and 1 interactive lecture about challenges and strategies for relationship-centered communication during clinician EHR system use. The workshop elicited reflection about memorable times when multitasking EHR use was associated with "misses" (errors that were not caught at the time) or "near misses" (mistakes that were caught before leading to errors). We conducted qualitative analysis using an editing analysis style to identify codes and then select representative themes and quotes. All workshop participants shared stories of misses or near misses in EHR system ordering and documentation or patient-clinician communication, wondering about "misses we don't even know about." Risk factors included the computer's position, EHR system usability, note content and style, information overload, problematic workflows, systems issues, and provider and patient communication behaviors and expectations. Strategies to reduce multitasking EHR system misses included clinician transparency when needing silent EHR system use (eg, for prescribing), narrating EHR system use, patient activation during EHR system use, adapting visit organization and workflow, improving EHR system design, and improving team support and systems. CME participants shared numerous stories of errors and near misses in EHR tasks and communication that they felt related to EHR multitasking. However, they brainstormed diverse strategies for using EHR systems safely while preserving patient relationships. ©Neda Ratanawongsa, George Y Matta, Fuad B Bohsali, Margaret S Chisolm. Originally published in JMIR Human Factors (http://humanfactors.jmir.org), 06.02.2018.

  7. Patterns of technical error among surgical malpractice claims: an analysis of strategies to prevent injury to surgical patients.

    PubMed

    Regenbogen, Scott E; Greenberg, Caprice C; Studdert, David M; Lipsitz, Stuart R; Zinner, Michael J; Gawande, Atul A

    2007-11-01

    To identify the most prevalent patterns of technical errors in surgery, and evaluate commonly recommended interventions in light of these patterns. The majority of surgical adverse events involve technical errors, but little is known about the nature and causes of these events. We examined characteristics of technical errors and common contributing factors among closed surgical malpractice claims. Surgeon reviewers analyzed 444 randomly sampled surgical malpractice claims from four liability insurers. Among 258 claims in which injuries due to error were detected, 52% (n = 133) involved technical errors. These technical errors were further analyzed with a structured review instrument designed by qualitative content analysis. Forty-nine percent of the technical errors caused permanent disability; an additional 16% resulted in death. Two-thirds (65%) of the technical errors were linked to manual error, 9% to errors in judgment, and 26% to both manual and judgment error. A minority of technical errors involved advanced procedures requiring special training ("index operations"; 16%), surgeons inexperienced with the task (14%), or poorly supervised residents (9%). The majority involved experienced surgeons (73%) and occurred in routine, rather than index, operations (84%). Patient-related complexities (including emergencies, difficult or unexpected anatomy, and previous surgery) contributed to 61% of technical errors, and technology or systems failures contributed to 21%. Most technical errors occur in routine operations with experienced surgeons, under conditions of increased patient complexity or systems failure. Commonly recommended interventions, including restricting high-complexity operations to experienced surgeons, additional training for inexperienced surgeons, and stricter supervision of trainees, are likely to address only a minority of technical errors. Surgical safety research should instead focus on improving decision-making and performance in routine operations for complex patients and circumstances.

  8. Physician Preferences to Communicate Neuropsychological Results: Comparison of Qualitative Descriptors and a Proposal to Reduce Communication Errors.

    PubMed

    Schoenberg, Mike R; Osborn, Katie E; Mahone, E Mark; Feigon, Maia; Roth, Robert M; Pliskin, Neil H

    2017-11-08

    Errors in communication are a leading cause of medical errors. A potential source of error in communicating neuropsychological results is confusion in the qualitative descriptors used to describe standardized neuropsychological data. This study sought to evaluate the extent to which medical consumers of neuropsychological assessments believed that results/findings were not clearly communicated. In addition, preference data for a variety of qualitative descriptors commonly used to communicate normative neuropsychological test scores were obtained. Preference data were obtained for five qualitative descriptor systems as part of a larger 36-item internet-based survey of physician satisfaction with neuropsychological services. A new qualitative descriptor system termed the Simplified Qualitative Classification System (Q-Simple) was proposed to reduce the potential for communication errors using seven terms: very superior, superior, high average, average, low average, borderline, and abnormal/impaired. A non-random convenience sample of 605 clinicians identified from four United States academic medical centers from January 1, 2015 through January 7, 2016 was invited to participate. A total of 182 surveys were completed. A minority of clinicians (12.5%) indicated that neuropsychological study results were not clearly communicated. When communicating neuropsychological standardized scores, the two most preferred qualitative descriptor systems were that of Heaton and colleagues (Comprehensive norms for an extended Halstead-Reitan battery: Demographic corrections, research findings, and clinical applications. Odessa, TX: Psychological Assessment Resources) (26%) and the newly proposed Q-Simple system (22%). Initial findings highlight the need to improve and standardize communication of neuropsychological results. These data offer initial guidance for preferred terms to communicate test results and form a foundation for more standardized practice among neuropsychologists. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
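
    For illustration, mapping standardized scores onto the seven Q-Simple terms is a simple banding exercise. The cut-offs below are conventional standard-score bands (mean 100, SD 15) assumed for the sketch; the paper's own boundaries are not given in the abstract.

```python
# Illustrative mapping from standardized scores (mean 100, SD 15) to the seven
# Q-Simple terms. The band edges are conventional examples, not the paper's.
from bisect import bisect_right

BOUNDS = [70, 80, 90, 110, 120, 130]          # assumed band edges
TERMS = ["abnormal/impaired", "borderline", "low average",
         "average", "high average", "superior", "very superior"]

def q_simple(score: float) -> str:
    return TERMS[bisect_right(BOUNDS, score)]

for s in (65, 75, 85, 100, 115, 125, 135):
    print(s, "->", q_simple(s))
```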

  9. [Qualitative evaluation of blood products records in a hospital].

    PubMed

    Lartigue, B; Catillon, E

    2012-02-01

    This study aimed to evaluate the qualitative performance of blood product traceability from paper and electronic medical records in a hospital. The quality of date/time documentation was assessed by detecting chronological errors and inter-source inconsistencies of 20 minutes or more in a random sample of 168 blood products transfused during 2009. A receipt date/time was confirmed in 52% of paper records; a data entry error was found in 25% of paper records and 21% of electronic records. A transfusion date/time was noted in 93% of paper records, with a data entry error in 26% of paper records and 25% of electronic records. The patient medical record held at least one date/time error in 18% and 17% of cases, for receipt and transfusion respectively. Environmental factors (clinical setting, urgency, blood product category) did not contribute to data error rates. Although blood product traceability shows good quantitative results, the quality of the recorded documentation is poor. In our study, data entry errors are similar in electronic and paper records, but the global failure rate is lower in electronic records because omissions are controlled. Copyright © 2011 Elsevier Masson SAS. All rights reserved.
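
    The audit logic described (chronological errors, and discrepancies of 20 minutes or more between paper and electronic sources) is straightforward to sketch; the record fields and timestamps below are invented for illustration.

```python
# Sketch of the study's cross-source check: flag receipt/transfusion date-times
# that disagree by 20 minutes or more between paper and electronic records, or
# that are chronologically impossible. Fields are invented for illustration.
from datetime import datetime, timedelta

TOLERANCE = timedelta(minutes=20)

record = {
    "receipt_paper":      datetime(2009, 5, 4, 10, 5),
    "receipt_electronic": datetime(2009, 5, 4, 10, 40),
    "transfusion_paper":  datetime(2009, 5, 4, 11, 30),
}

def audit(rec):
    findings = []
    delta = abs(rec["receipt_paper"] - rec["receipt_electronic"])
    if delta >= TOLERANCE:
        findings.append(f"inter-source inconsistency on receipt time ({delta})")
    if rec["transfusion_paper"] < rec["receipt_paper"]:
        findings.append("chronological error: transfusion before receipt")
    return findings

print(audit(record))
```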

  10. Agency and Error in Young Adults' Stories of Sexual Decision Making

    ERIC Educational Resources Information Center

    Allen, Katherine R.; Husser, Erica K.; Stone, Dana J.; Jordal, Christian E.

    2008-01-01

    We conducted a qualitative analysis of 148 college students' written comments about themselves as sexual decision makers. Most participants described experiences in which they were actively engaged in decision-making processes of "waiting it out" to "working it out." The four patterns were (a) I am in control, (b) I am experimenting and learning,…

  11. Opening up the black box: an introduction to qualitative research methods in anaesthesia.

    PubMed

    Shelton, C L; Smith, A F; Mort, M

    2014-03-01

    Qualitative research methods are a group of techniques designed to allow the researcher to understand phenomena in their natural setting. A wide range is used, including focus groups, interviews, observation, and discourse analysis techniques, which may be used within research approaches such as grounded theory or ethnography. Qualitative studies in the anaesthetic setting have been used to define excellence in anaesthesia, explore the reasons behind drug errors, investigate the acquisition of expertise and examine incentives for hand-hygiene in the operating theatre. Understanding how and why people act the way they do is essential for the advancement of anaesthetic practice, and rigorous, well-designed qualitative research can generate useful data and important insights. Meticulous social scientific methods, transparency, reproducibility and reflexivity are markers of quality in qualitative research. Tools such as the consolidated criteria for reporting qualitative research checklist and the critical appraisal skills programme are available to help authors, reviewers and readers unfamiliar with qualitative research assess its merits. © 2013 The Association of Anaesthetists of Great Britain and Ireland.

  12. Rapid Quadrupole-Time-of-Flight Mass Spectrometry Method Quantifies Oxygen-Rich Lignin Compound in Complex Mixtures

    NASA Astrophysics Data System (ADS)

    Boes, Kelsey S.; Roberts, Michael S.; Vinueza, Nelson R.

    2018-03-01

    Complex mixture analysis is a costly and time-consuming task facing researchers with foci as varied as food science and fuel analysis. When faced with the task of quantifying oxygen-rich bio-oil molecules in a complex diesel mixture, we asked whether complex mixtures could be qualitatively and quantitatively analyzed on a single mass spectrometer with mid-range resolving power without the use of lengthy separations. To answer this question, we developed and evaluated a quantitation method that eliminated chromatography steps and expanded the use of quadrupole-time-of-flight mass spectrometry from primarily qualitative to quantitative as well. To account for mixture complexity, the method employed an ionization dopant, targeted tandem mass spectrometry, and an internal standard. This combination of three techniques achieved reliable quantitation of oxygen-rich eugenol in diesel from 300 to 2500 ng/mL with sufficient linearity (R2 = 0.97 ± 0.01) and excellent accuracy (percent error = 0% ± 5). To understand the limitations of the method, it was compared to quantitation attained on a triple quadrupole mass spectrometer, the gold standard for quantitation. The triple quadrupole quantified eugenol from 50 to 2500 ng/mL with stronger linearity (R2 = 0.996 ± 0.003) than the quadrupole-time-of-flight and comparable accuracy (percent error = 4% ± 5). This demonstrates that a quadrupole-time-of-flight can be used for not only qualitative analysis but also targeted quantitation of oxygen-rich lignin molecules in complex mixtures without extensive sample preparation. The rapid and cost-effective method presented here offers new possibilities for bio-oil research, including: (1) allowing for bio-oil studies that demand repetitive analysis as process parameters are changed and (2) making this research accessible to more laboratories.

  13. Rapid Quadrupole-Time-of-Flight Mass Spectrometry Method Quantifies Oxygen-Rich Lignin Compound in Complex Mixtures.

    PubMed

    Boes, Kelsey S; Roberts, Michael S; Vinueza, Nelson R

    2018-03-01

    Complex mixture analysis is a costly and time-consuming task facing researchers with foci as varied as food science and fuel analysis. When faced with the task of quantifying oxygen-rich bio-oil molecules in a complex diesel mixture, we asked whether complex mixtures could be qualitatively and quantitatively analyzed on a single mass spectrometer with mid-range resolving power without the use of lengthy separations. To answer this question, we developed and evaluated a quantitation method that eliminated chromatography steps and expanded the use of quadrupole-time-of-flight mass spectrometry from primarily qualitative to quantitative as well. To account for mixture complexity, the method employed an ionization dopant, targeted tandem mass spectrometry, and an internal standard. This combination of three techniques achieved reliable quantitation of oxygen-rich eugenol in diesel from 300 to 2500 ng/mL with sufficient linearity (R2 = 0.97 ± 0.01) and excellent accuracy (percent error = 0% ± 5). To understand the limitations of the method, it was compared to quantitation attained on a triple quadrupole mass spectrometer, the gold standard for quantitation. The triple quadrupole quantified eugenol from 50 to 2500 ng/mL with stronger linearity (R2 = 0.996 ± 0.003) than the quadrupole-time-of-flight and comparable accuracy (percent error = 4% ± 5). This demonstrates that a quadrupole-time-of-flight can be used for not only qualitative analysis but also targeted quantitation of oxygen-rich lignin molecules in complex mixtures without extensive sample preparation. The rapid and cost-effective method presented here offers new possibilities for bio-oil research, including: (1) allowing for bio-oil studies that demand repetitive analysis as process parameters are changed and (2) making this research accessible to more laboratories.
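
    A minimal sketch of the internal-standard calibration underlying the reported figures: fit the analyte/internal-standard response ratio against concentration, then compute linearity (R2) and percent error for a check sample. All numbers below are invented; only the procedure mirrors the abstract.

```python
# Sketch of internal-standard calibration: linear fit of the analyte/IS
# response ratio vs. concentration, with R^2 and percent error for a QC
# sample. All values are invented for illustration.
import numpy as np

conc = np.array([300, 600, 1200, 1800, 2500], dtype=float)   # ng/mL standards
ratio = np.array([0.31, 0.60, 1.18, 1.83, 2.49])             # analyte/IS areas

slope, intercept = np.polyfit(conc, ratio, 1)
pred = slope * conc + intercept
ss_res = ((ratio - pred) ** 2).sum()
ss_tot = ((ratio - ratio.mean()) ** 2).sum()
r2 = 1 - ss_res / ss_tot

check_true, check_ratio = 1000.0, 0.99                       # invented QC sample
check_calc = (check_ratio - intercept) / slope
pct_error = 100 * (check_calc - check_true) / check_true
print(f"R^2 = {r2:.4f}, percent error = {pct_error:+.1f}%")
```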

  14. An Analysis of Java Programming Behaviors, Affect, Perceptions, and Syntax Errors among Low-Achieving, Average, and High-Achieving Novice Programmers

    ERIC Educational Resources Information Center

    Rodrigo, Ma. Mercedes T.; Andallaza, Thor Collin S.; Castro, Francisco Enrique Vicente G.; Armenta, Marc Lester V.; Dy, Thomas T.; Jadud, Matthew C.

    2013-01-01

    In this article we quantitatively and qualitatively analyze a sample of novice programmer compilation log data, exploring whether (or how) low-achieving, average, and high-achieving students vary in their grasp of these introductory concepts. High-achieving students self-reported having the easiest time learning the introductory programming…

  15. Design of RNA splicing analysis null models for post hoc filtering of Drosophila head RNA-Seq data with the splicing analysis kit (Spanki)

    PubMed Central

    2013-01-01

    Background The production of multiple transcript isoforms from one gene is a major source of transcriptome complexity. RNA-Seq experiments, in which transcripts are converted to cDNA and sequenced, allow the resolution and quantification of alternative transcript isoforms. However, methods to analyze splicing are underdeveloped and errors resulting in incorrect splicing calls occur in every experiment. Results We used RNA-Seq data to develop sequencing and aligner error models. By applying these error models to known input from simulations, we found that errors result from false alignment to minor splice motifs and antisense strands, shifted junction positions, paralog joining, and repeat-induced gaps. By using a series of quantitative and qualitative filters, we eliminated diagnosed errors in the simulation, and applied this to RNA-Seq data from Drosophila melanogaster heads. We used high-confidence junction detections to specifically interrogate local splicing differences between transcripts. This method outperformed commonly used RNA-Seq methods to identify known alternative splicing events in the Drosophila sex determination pathway. We describe a flexible software package to perform these tasks called Splicing Analysis Kit (Spanki), available at http://www.cbcb.umd.edu/software/spanki. Conclusions Splice-junction centric analysis of RNA-Seq data provides advantages in specificity for detection of alternative splicing. Our software provides tools to better understand error profiles in RNA-Seq data and improve inference from this new technology. The splice-junction centric approach that this software enables will provide more accurate estimates of differentially regulated splicing than current tools. PMID:24209455

  16. Design of RNA splicing analysis null models for post hoc filtering of Drosophila head RNA-Seq data with the splicing analysis kit (Spanki).

    PubMed

    Sturgill, David; Malone, John H; Sun, Xia; Smith, Harold E; Rabinow, Leonard; Samson, Marie-Laure; Oliver, Brian

    2013-11-09

    The production of multiple transcript isoforms from one gene is a major source of transcriptome complexity. RNA-Seq experiments, in which transcripts are converted to cDNA and sequenced, allow the resolution and quantification of alternative transcript isoforms. However, methods to analyze splicing are underdeveloped and errors resulting in incorrect splicing calls occur in every experiment. We used RNA-Seq data to develop sequencing and aligner error models. By applying these error models to known input from simulations, we found that errors result from false alignment to minor splice motifs and antisense strands, shifted junction positions, paralog joining, and repeat-induced gaps. By using a series of quantitative and qualitative filters, we eliminated diagnosed errors in the simulation, and applied this to RNA-Seq data from Drosophila melanogaster heads. We used high-confidence junction detections to specifically interrogate local splicing differences between transcripts. This method outperformed commonly used RNA-Seq methods to identify known alternative splicing events in the Drosophila sex determination pathway. We describe a flexible software package to perform these tasks called Splicing Analysis Kit (Spanki), available at http://www.cbcb.umd.edu/software/spanki. Splice-junction centric analysis of RNA-Seq data provides advantages in specificity for detection of alternative splicing. Our software provides tools to better understand error profiles in RNA-Seq data and improve inference from this new technology. The splice-junction centric approach that this software enables will provide more accurate estimates of differentially regulated splicing than current tools.
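
    Spanki's specific filters are not enumerated in the abstract; the sketch below shows the general shape of such post hoc junction filtering, combining quantitative criteria (read support, anchor overhang) with a qualitative one (splice-site motif). The thresholds, field names and motif set are illustrative assumptions, not Spanki's actual rules.

```python
# Generic sketch of post hoc splice-junction filtering: drop junctions with
# weak read support, short alignment overhang, or non-canonical motifs.
from dataclasses import dataclass

@dataclass
class Junction:
    chrom: str
    start: int
    end: int
    motif: str         # e.g. "GT-AG"
    coverage: int      # supporting reads
    max_overhang: int  # longest anchor on either side of the junction

CANONICAL = {"GT-AG", "GC-AG", "AT-AC"}

def passes(j: Junction, min_cov: int = 2, min_overhang: int = 8) -> bool:
    if j.coverage < min_cov:
        return False                   # quantitative filter: read support
    if j.max_overhang < min_overhang:
        return False                   # guards against shifted junctions
    return j.motif in CANONICAL        # qualitative filter: splice motif

juncs = [Junction("2L", 100, 900, "GT-AG", 12, 25),
         Junction("2L", 100, 905, "CT-AC", 1, 6)]
print([passes(j) for j in juncs])      # [True, False]
```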

  17. A patient-initiated voluntary online survey of adverse medical events: the perspective of 696 injured patients and families

    PubMed Central

    Southwick, Frederick S; Cranley, Nicole M; Hallisy, Julia A

    2015-01-01

    Background Preventable medical errors continue to be a major cause of death in the USA and throughout the world. Many patients have written about their experiences on websites and in published books. Methods As patients and family members who have experienced medical harm, we have created a nationwide voluntary survey in order to more broadly and systematically capture the perspective of patients and patient families experiencing adverse medical events and have used quantitative and qualitative analysis to summarise the responses of 696 patients and their families. Results Harm was most commonly associated with diagnostic and therapeutic errors, followed by surgical or procedural complications, hospital-associated infections and medication errors, and our quantitative results match those of previous provider-initiated patient surveys. Qualitative analysis of 450 narratives revealed a lack of perceived provider and system accountability, deficient and disrespectful communication and a failure of providers to listen as major themes. The consequences of adverse events included death, post-traumatic stress, financial hardship and permanent disability. These conditions and consequences led to a loss of patients’ trust in both the health system and providers. Patients and family members offered suggestions for preventing future adverse events and emphasised the importance of shared decision-making. Conclusions This large voluntary survey of medical harm highlights the potential efficacy of patient-initiated surveys for providing meaningful feedback and for guiding improvements in patient care. PMID:26092166

  18. Ranking and validation of spallation models for isotopic production cross sections of heavy residua

    NASA Astrophysics Data System (ADS)

    Sharma, Sushil K.; Kamys, Bogusław; Goldenbaum, Frank; Filges, Detlef

    2017-07-01

    The production cross sections of isotopically identified residual nuclei from spallation reactions induced by 136Xe projectiles at 500 AMeV on a hydrogen target were analyzed in a two-step model. The first stage of the reaction was described by the INCL4.6 model of an intranuclear cascade of nucleon-nucleon and pion-nucleon collisions, whereas the second stage was analyzed by means of four different models: ABLA07, GEM2, GEMINI++ and SMM. The quality of the data description was judged quantitatively using two statistical deviation factors: the H-factor and the M-factor. It was found that the present analysis leads to a different ranking of models from that obtained by qualitative inspection of the data reproduction. The disagreement was caused by the sensitivity of the deviation factors to the large statistical errors present in some of the data. A new deviation factor, the A-factor, was proposed, which is not sensitive to the statistical errors of the cross sections. The quantitative ranking of models performed using the A-factor agreed well with the qualitative analysis of the data. It was concluded that using deviation factors weighted by statistical errors may lead to erroneous conclusions when the data cover a large range of values. The quality of data reproduction by the theoretical models is discussed. Some systematic deviations of the theoretical predictions from the experimental results are observed.
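
    The abstract does not reproduce the factor definitions; the forms below are conventional in model-benchmarking practice and are assumed here for illustration, including an A-factor written in the spirit of the paper's proposal (a relative deviation that ignores the statistical errors). They may differ from the paper's exact formulas.

```python
# Sketch of deviation factors of the kind discussed. These are conventional
# assumed forms, not necessarily the paper's exact definitions; all data are
# invented.
import numpy as np

exp = np.array([10.0, 5.0, 1.0, 0.2])   # measured cross sections (mb), invented
err = np.array([1.0, 0.8, 0.3, 0.15])   # their statistical errors, invented
calc = np.array([11.0, 4.0, 1.3, 0.1])  # model predictions, invented

# H-factor: RMS deviation weighted by experimental errors (error-sensitive)
H = np.sqrt(np.mean(((calc - exp) / err) ** 2))

# M-factor: mean absolute log10 deviation (scale-insensitive)
M = np.mean(np.abs(np.log10(calc / exp)))

# A-factor in the spirit of the paper's proposal: a relative deviation that
# does not use the statistical errors at all (assumed form).
A = np.mean(np.abs(calc - exp) / (calc + exp))

print(f"H = {H:.2f}, M = {M:.2f}, A = {A:.2f}")
```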

  19. An error taxonomy system for analysis of haemodialysis incidents.

    PubMed

    Gu, Xiuzhu; Itoh, Kenji; Suzuki, Satoshi

    2014-12-01

    This paper describes the development of a haemodialysis error taxonomy system for analysing incidents and predicting the safety status of a dialysis organisation. The error taxonomy system was developed by adapting to haemodialysis situations an error taxonomy system that assumed no specific specialty. It was applied to 1,909 incident reports collected from two dialysis facilities in Japan. Over 70% of haemodialysis incidents were reported as problems or complications related to the dialyser, circuit, medication and setting of dialysis conditions. Approximately 70% of errors took place immediately before or after the four hours of haemodialysis therapy. The error types most frequently made in the dialysis unit were omission and qualitative errors. Failures or complications classified under staff human factors, communication, task and organisational factors were found in most dialysis incidents. Devices/equipment/materials, medicines and clinical documents were most likely to be involved in errors. Haemodialysis nurses were involved in more incidents related to medicines and documents, whereas dialysis technologists made more errors with devices/equipment/materials. This error taxonomy system can be used not only to investigate incidents and adverse events occurring in the dialysis setting but also to estimate the safety-related status of an organisation, such as its reporting culture. © 2014 European Dialysis and Transplant Nurses Association/European Renal Care Association.

  1. Logical errors on proving theorem

    NASA Astrophysics Data System (ADS)

    Sari, C. K.; Waluyo, M.; Ainur, C. M.; Darmaningsih, E. N.

    2018-01-01

    At the tertiary level, students in mathematics education departments attend abstract courses such as Introduction to Real Analysis, which requires the ability to prove mathematical statements almost all the time. In fact, many students have not mastered this ability adequately. In their Introduction to Real Analysis tests, even though they completed their proofs of theorems, they achieved unsatisfactory scores. They thought that they had succeeded, but their proofs were not valid. In this study, qualitative research was conducted to describe the logical errors that students made in proving the theorem of cluster points. The theorem was given to 54 students. Misconceptions in understanding the definitions of cluster point, limit of a function, and limit of a sequence seem to occur. The habit of using routine symbols might cause these misconceptions. Suggestions for dealing with this condition are described as well.
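
    For reference, the definition whose misreading the study documents can be stated as in a typical Introduction to Real Analysis course; this is the standard formulation, not a quotation from the paper.

```latex
% Standard definition of a cluster point (typical real analysis texts):
% c is a cluster point of A \subseteq \mathbb{R} iff every deleted
% neighbourhood of c contains a point of A.
c \text{ is a cluster point of } A \subseteq \mathbb{R}
\iff \forall \delta > 0 \;\; \exists x \in A : \; 0 < |x - c| < \delta .
```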

  2. Artifact-free dynamic atomic force microscopy reveals monotonic dissipation for a simple confined liquid

    NASA Astrophysics Data System (ADS)

    Kaggwa, G. B.; Kilpatrick, J. I.; Sader, J. E.; Jarvis, S. P.

    2008-07-01

    We present definitive interaction measurements of a simple confined liquid (octamethylcyclotetrasiloxane) using artifact-free frequency modulation atomic force microscopy. We use existing theory to decouple the conservative and dissipative components of the interaction for a known phase offset from resonance (90° phase shift) that has been deliberately introduced into the experiment. Further, we show the qualitative influence of a deliberately introduced phase error on the conservative and dissipative components of the interaction, highlighting that artifacts, such as oscillatory dissipation, can be readily observed when the phase error is not compensated for in the force analysis.

  3. Qualitative fusion technique based on information poor system and its application to factor analysis for vibration of rolling bearings

    NASA Astrophysics Data System (ADS)

    Xia, Xintao; Wang, Zhongyu

    2008-10-01

    For methods that analyse the stability of a system using statistics, the problems of unknown probability distributions and small samples are difficult to resolve. Therefore, a novel method is proposed in this paper to resolve these problems. The method is independent of the probability distribution and is useful for small-sample systems. After rearrangement of the original data series, the order difference and two polynomial membership functions are introduced to estimate the true value, the lower bound and the upper bound of the system using fuzzy-set theory. The empirical distribution function is then investigated to ensure a confidence level above 95%, and the degree of similarity is presented to evaluate the stability of the system. Computer simulation cases investigate stable systems with various probability distributions, unstable systems with linear and periodic systematic errors, and some mixed systems. The method of analysis for system stability is thereby validated.

  4. Barriers and facilitators to recovering from e-prescribing errors in community pharmacies.

    PubMed

    Odukoya, Olufunmilola K; Stone, Jamie A; Chui, Michelle A

    2015-01-01

    To explore barriers and facilitators to recovery from e-prescribing errors in community pharmacies and to explore practical solutions for work system redesign to ensure successful recovery from errors. Cross-sectional qualitative design using direct observations, interviews, and focus groups. Five community pharmacies in Wisconsin. 13 pharmacists and 14 pharmacy technicians. Observational field notes and transcribed interviews and focus groups were subjected to thematic analysis guided by the Systems Engineering Initiative for Patient Safety (SEIPS) work system and patient safety model. Barriers and facilitators to recovering from e-prescription errors in community pharmacies. Organizational factors, such as communication, training, teamwork, and staffing levels, play an important role in recovering from e-prescription errors. Other factors that could positively or negatively affect recovery of e-prescription errors include level of experience, knowledge of the pharmacy personnel, availability or usability of tools and technology, interruptions and time pressure when performing tasks, and noise in the physical environment. The SEIPS model sheds light on key factors that may influence recovery from e-prescribing errors in pharmacies, including the environment, teamwork, communication, technology, tasks, and other organizational variables. To be successful in recovering from e-prescribing errors, pharmacies must provide the appropriate working conditions that support recovery from errors.

  5. How psychotherapists handle treatment errors – an ethical analysis

    PubMed Central

    2013-01-01

    Background Dealing with errors in psychotherapy is challenging, both ethically and practically. There is almost no empirical research on this topic. We aimed (1) to explore psychotherapists’ self-reported ways of dealing with an error made by themselves or by colleagues, and (2) to reconstruct their reasoning according to the two principle-based ethical approaches that are dominant in the ethics discourse of psychotherapy, Beauchamp & Childress (B&C) and Lindsay et al. (L). Methods We conducted semi-structured interviews with 30 psychotherapists (physicians and non-physicians) and analysed the transcripts using qualitative content analysis. Answers were deductively categorized according to the two principle-based ethical approaches. Results Most psychotherapists reported that they preferred to disclose an error to the patient. They justified this by spontaneous intuitions and common values in psychotherapy, rarely using explicit ethical reasoning. The answers were attributed to the following categories in descending order of frequency: 1. Respect for patient autonomy (B&C; L), 2. Non-maleficence (B&C) and Responsibility (L), 3. Integrity (L), 4. Competence (L) and Beneficence (B&C). Conclusions Psychotherapists need specific ethical and communication training to complement and articulate their moral intuitions as a support when disclosing their errors to the patients. Principle-based ethical approaches seem to be useful for clarifying the reasons for disclosure. Further research should help to identify the most effective and acceptable ways of error disclosure in psychotherapy. PMID:24321503

  6. An Empirically Derived Taxonomy of Factors Affecting Physicians' Willingness to Disclose Medical Errors

    PubMed Central

    Kaldjian, Lauris C; Jones, Elizabeth W; Rosenthal, Gary E; Tripp-Reimer, Toni; Hillis, Stephen L

    2006-01-01

    BACKGROUND Physician disclosure of medical errors to institutions, patients, and colleagues is important for patient safety, patient care, and professional education. However, the variables that may facilitate or impede disclosure are diverse and lack conceptual organization. OBJECTIVE To develop an empirically derived, comprehensive taxonomy of factors that affects voluntary disclosure of errors by physicians. DESIGN A mixed-methods study using qualitative data collection (structured literature search and exploratory focus groups), quantitative data transformation (sorting and hierarchical cluster analysis), and validation procedures (confirmatory focus groups and expert review). RESULTS Full-text review of 316 articles identified 91 impeding or facilitating factors affecting physicians' willingness to disclose errors. Exploratory focus groups identified an additional 27 factors. Sorting and hierarchical cluster analysis organized factors into 8 domains. Confirmatory focus groups and expert review relocated 6 factors, removed 2 factors, and modified 4 domain names. The final taxonomy contained 4 domains of facilitating factors (responsibility to patient, responsibility to self, responsibility to profession, responsibility to community), and 4 domains of impeding factors (attitudinal barriers, uncertainties, helplessness, fears and anxieties). CONCLUSIONS A taxonomy of facilitating and impeding factors provides a conceptual framework for a complex field of variables that affects physicians' willingness to disclose errors to institutions, patients, and colleagues. This taxonomy can be used to guide the design of studies to measure the impact of different factors on disclosure, to assist in the design of error-reporting systems, and to inform educational interventions to promote the disclosure of errors to patients. PMID:16918739
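
    The quantitative middle step (sorting followed by hierarchical cluster analysis) can be sketched as follows: expert card sorts of factors become a co-occurrence matrix, whose distances are clustered into domains. The factor names, sorts, cluster count and choice of average linkage below are illustrative assumptions, not the study's data.

```python
# Sketch of the sorting-plus-clustering step: turn card sorts of factors into
# a co-occurrence distance matrix and cluster it hierarchically into domains.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

factors = ["fear of litigation", "duty to patient", "shame",
           "professional norms", "uncertainty about causation"]
# Each sort assigns every factor to a pile (integers); three invented sorters.
sorts = np.array([[0, 1, 0, 1, 2],
                  [0, 1, 0, 1, 0],
                  [2, 1, 2, 1, 0]])

n = len(factors)
co = np.zeros((n, n))
for s in sorts:                       # co-occurrence: how often sorted together
    co += (s[:, None] == s[None, :])
dist = 1.0 - co / len(sorts)          # convert similarity to distance
np.fill_diagonal(dist, 0.0)

Z = linkage(squareform(dist, checks=False), method="average")
print(fcluster(Z, t=2, criterion="maxclust"))   # two broad domains
```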

  7. Identifying the latent failures underpinning medication administration errors: an exploratory study.

    PubMed

    Lawton, Rebecca; Carruthers, Sam; Gardner, Peter; Wright, John; McEachan, Rosie R C

    2012-08-01

    The primary aim of this article was to identify the latent failures that are perceived to underpin medication errors. The study was conducted within three medical wards in a hospital in the United Kingdom. The study employed a cross-sectional qualitative design. Interviews were conducted with 12 nurses and eight managers. Interviews were transcribed and subject to thematic content analysis. A two-step inter-rater comparison tested the reliability of the themes. Ten latent failures were identified based on the analysis of the interviews. These were ward climate, local working environment, workload, human resources, team communication, routine procedures, bed management, written policies and procedures, supervision and leadership, and training. The discussion focuses on ward climate, the most prevalent theme, which is conceptualized here as interacting with failures in the nine other organizational structures and processes. This study is the first of its kind to identify the latent failures perceived to underpin medication errors in a systematic way. The findings can be used as a platform for researchers to test the impact of organization-level patient safety interventions and to design proactive error management tools and incident reporting systems in hospitals. © Health Research and Educational Trust.
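
    The abstract does not name the statistic behind its two-step inter-rater comparison; Cohen's kappa is a common choice for such theme-coding reliability checks and is sketched here with invented codings.

```python
# Illustrative inter-rater reliability check for theme coding. The study's
# exact statistic is not stated in the abstract; Cohen's kappa is one common
# choice. The two raters' theme assignments below are invented.
from sklearn.metrics import cohen_kappa_score

rater_a = ["workload", "ward climate", "training", "workload", "communication"]
rater_b = ["workload", "ward climate", "training", "staffing", "communication"]

kappa = cohen_kappa_score(rater_a, rater_b)
print(f"Cohen's kappa = {kappa:.2f}")   # 1.0 would mean perfect agreement
```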

  8. A philosophical analysis of the general methodology of qualitative research: a critical rationalist perspective.

    PubMed

    Rudnick, Abraham

    2014-09-01

    Philosophical discussion of the general methodology of qualitative research, such as that used in some health research, has been inductivist or relativist to date, ignoring critical rationalism as a philosophical approach with which to discuss the general methodology of qualitative research. This paper presents a discussion of the general methodology of qualitative research from a critical rationalist perspective (inspired by Popper), using as an example mental health research. The widespread endorsement of induction in qualitative research is positivist and is suspect, if not false, particularly in relation to the context of justification (or rather theory testing) as compared to the context of discovery (or rather theory generation). Relativism is riddled with philosophical weaknesses and hence it is suspect if not false too. Theory testing is compatible with qualitative research, contrary to much writing about and in qualitative research, as theory testing involves learning from trial and error, which is part of qualitative research, and which may be the form of learning most conducive to generalization. Generalization involves comparison, which is a fundamental methodological requirement of any type of research (qualitative or other); hence the traditional grounding of quantitative and experimental research in generalization. Comparison--rather than generalization--is necessary for, and hence compatible with, qualitative research; hence, the common opposition to generalization in qualitative research is misdirected, disregarding whether this opposition's claims are true or false. In conclusion, qualitative research, similar to quantitative and experimental research, assumes comparison as a general methodological requirement, which is necessary for health research.

  9. Qualitative Analysis of the Interdisciplinary Interaction between Data Analysis Specialists and Novice Clinical Researchers

    PubMed Central

    Zammar, Guilherme Roberto; Shah, Jatin; Bonilauri Ferreira, Ana Paula; Cofiel, Luciana; Lyles, Kenneth W.; Pietrobon, Ricardo

    2010-01-01

    Background The inherent complexity of statistical methods and clinical phenomena compels researchers with diverse domains of expertise to work in interdisciplinary teams, where none of them has complete knowledge of their counterpart's field. As a result, knowledge exchange may often be characterized by miscommunication leading to misinterpretation, ultimately resulting in errors in research and even clinical practice. Although communication has a central role in interdisciplinary collaboration and miscommunication can have a negative impact on research processes, to the best of our knowledge no study has yet explored how data analysis specialists and clinical researchers communicate over time. Methods/Principal Findings We conducted qualitative analysis of encounters between clinical researchers and data analysis specialists (epidemiologist, clinical epidemiologist, and data mining specialist). These encounters were recorded and systematically analyzed using a grounded theory methodology for extraction of emerging themes, followed by data triangulation and analysis of negative cases for validation. A policy analysis was then performed using a system dynamics methodology, looking for potential interventions to improve this process. Four major emerging themes were found. Definitions using lay language were frequently employed as a way to bridge the language gap between the specialties. Thought experiments presented a series of “what if” situations that helped clarify how the method or information from the other field would behave if exposed to alternative situations, ultimately aiding in explaining their main objective. Metaphors and analogies were used to translate concepts across fields, from the unfamiliar to the familiar. Prolepsis was used to anticipate study outcomes, thus helping specialists understand the current context based on an understanding of their final goal. Conclusion/Significance The communication between clinical researchers and data analysis specialists presents multiple challenges that can lead to errors. PMID:20195374

  10. Errors Made by Elementary Fourth Grade Students When Modelling Word Problems and the Elimination of Those Errors through Scaffolding

    ERIC Educational Resources Information Center

    Ulu, Mustafa

    2017-01-01

    This study aims to identify errors made by primary school students when modelling word problems and to eliminate those errors through scaffolding. A 10-question problem-solving achievement test was used in the research. The qualitative and quantitative designs were utilized together. The study group of the quantitative design comprises 248…

  11. Validating Domains of Patient Contextual Factors Essential to Preventing Contextual Errors: A Qualitative Study Conducted at Chicago Area Veterans Health Administration Sites.

    PubMed

    Binns-Calvey, Amy E; Malhiot, Alex; Kostovich, Carol T; LaVela, Sherri L; Stroupe, Kevin; Gerber, Ben S; Burkhart, Lisa; Weiner, Saul J; Weaver, Frances M

    2017-09-01

    "Patient context" indicates patient circumstances and characteristics or states that are essential to address when planning patient care. Specific patient "contextual factors," if overlooked, result in an inappropriate plan of care, a medical error termed a "contextual error." The myriad contextual factors that constitute patient context have been grouped into broad domains to create a taxonomy of challenges to consider when planning care. This study sought to validate a previously identified list of contextual domains. This qualitative study used directed content analysis. In 2014, 19 Department of Veterans Affairs (VA) providers (84% female) and 49 patients (86% male) from two VA medical centers and four outpatient clinics in the Chicago area participated in semistructured interviews and focus groups. Topics included patient-specific, community, and resource-related factors that affect patients' abilities to manage their care. Transcripts were analyzed with a previously identified list of contextual domains as a framework. Analysis of responses revealed that patients and providers identified the same 10 domains previously published, plus 3 additional ones. Based on comments made by patients and providers, the authors created a revised list of 12 domains from themes that emerged. Six pertain to patient circumstances such as access to care and financial situation, and 6 to patient characteristics/states including skills, abilities, and knowledge. Contextual factors in patients' lives may be essential to address for effective care planning. The rubric developed can serve as a "contextual differential" for clinicians to consider when addressing challenges patients face when planning their care.

  12. A patient-initiated voluntary online survey of adverse medical events: the perspective of 696 injured patients and families.

    PubMed

    Southwick, Frederick S; Cranley, Nicole M; Hallisy, Julia A

    2015-10-01

    Preventable medical errors continue to be a major cause of death in the USA and throughout the world. Many patients have written about their experiences on websites and in published books. As patients and family members who have experienced medical harm, we have created a nationwide voluntary survey in order to more broadly and systematically capture the perspective of patients and patient families experiencing adverse medical events and have used quantitative and qualitative analysis to summarise the responses of 696 patients and their families. Harm was most commonly associated with diagnostic and therapeutic errors, followed by surgical or procedural complications, hospital-associated infections and medication errors, and our quantitative results match those of previous provider-initiated patient surveys. Qualitative analysis of 450 narratives revealed a lack of perceived provider and system accountability, deficient and disrespectful communication and a failure of providers to listen as major themes. The consequences of adverse events included death, post-traumatic stress, financial hardship and permanent disability. These conditions and consequences led to a loss of patients' trust in both the health system and providers. Patients and family members offered suggestions for preventing future adverse events and emphasised the importance of shared decision-making. This large voluntary survey of medical harm highlights the potential efficacy of patient-initiated surveys for providing meaningful feedback and for guiding improvements in patient care.

  13. Beyond Error Patterns: A Sociocultural View of Fraction Comparison Errors in Students with Mathematical Learning Disabilities

    ERIC Educational Resources Information Center

    Lewis, Katherine E.

    2016-01-01

    Although many students struggle with fractions, students with mathematical learning disabilities (MLDs) experience pervasive difficulties because of neurological differences in how they process numerical information. These students make errors that are qualitatively different than their typically achieving and low-achieving peers. This study…

  14. Infant search and object permanence: a meta-analysis of the A-not-B error.

    PubMed

    Wellman, H M; Cross, D; Bartsch, K

    1987-01-01

    Research on Piaget's stage 4 object concept has failed to reveal a clear or consistent pattern of results. Piaget found that 8-12-month-old infants would make perseverative errors; his explanation for this phenomenon was that the infant's concept of the object was contextually dependent on his or her actions. Some studies designed to test Piaget's explanation have replicated Piaget's basic finding, yet many have found no preference for the A location or the B location or an actual preference for the B location. More recently, researchers have attempted to uncover the causes for these results concerning the A-not-B error. Again, however, different studies have yielded different results, and qualitative reviews have failed to yield a consistent explanation for the results of the individual studies. This state of affairs suggests that the phenomenon may simply be too complex to be captured by individual studies varying 1 factor at a time and by reviews based on similar qualitative considerations. Therefore, the current investigation undertook a meta-analysis, a synthesis capturing the quantitative information across the now sizable number of studies. We entered several important factors into the meta-analysis, including the effects of age, the number of A trials, the length of delay between hiding and search, the number of locations, the distances between locations, and the distinctive visual properties of the hiding arrays. Of these, the analysis consistently indicated that age, delay, and number of hiding locations strongly influence infants' search. The pattern of specific findings also yielded new information about infant search. A general characterization of the results is that, at every age, both above-chance and below-chance performance was observed. That is, at each age at least 1 combination of delay and number of locations yielded above-chance A-not-B errors or significant perseverative search. At the same time, at each age at least 1 alternative combination of delay and number of locations yielded below-chance errors and significant above-chance correct performance, that is, significantly accurate search. These 2 findings, appropriately elaborated, allow us to evaluate all extant theories of stage 4 infant search. When this is done, all these extant accounts prove to be incorrect. That is, they are incommensurate with one aspect or another of the pooled findings in the meta-analysis. Therefore, we end by proposing a new account that is consistent with the entire data set.
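
    As a quick illustration of the kind of quantitative pooling a meta-analysis performs, the sketch below combines hypothetical per-study effect sizes with inverse-variance weights. This is a minimal fixed-effect model, not the study's actual (more elaborate) analysis, and all numbers are invented.

      # Minimal fixed-effect meta-analysis sketch (illustrative only; not the
      # authors' actual procedure). Effect sizes y and variances v are hypothetical.
      import numpy as np

      y = np.array([0.42, 0.31, 0.55, 0.12])   # hypothetical per-study effect sizes
      v = np.array([0.04, 0.02, 0.09, 0.03])   # hypothetical sampling variances

      w = 1.0 / v                               # inverse-variance weights
      pooled = np.sum(w * y) / np.sum(w)        # pooled effect estimate
      se = np.sqrt(1.0 / np.sum(w))             # standard error of pooled effect
      z = pooled / se                           # z-statistic against effect = 0

      print(f"pooled effect = {pooled:.3f} +/- {1.96 * se:.3f} (95% CI), z = {z:.2f}")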

  15. Lateral charge transport from heavy-ion tracks in integrated circuit chips

    NASA Technical Reports Server (NTRS)

    Zoutendyk, J. A.; Schwartz, H. R.; Nevill, L. R.

    1988-01-01

    A 256K DRAM has been used to study the lateral transport of charge (electron-hole pairs) induced by direct ionization from heavy-ion tracks in an IC. The qualitative charge transport has been simulated using a two-dimensional numerical code in cylindrical coordinates. The experimental bit-map data clearly show the manifestation of lateral charge transport in the creation of adjacent multiple-bit errors from a single heavy-ion track. The heavy-ion data further demonstrate the occurrence of multiple-bit errors from single ion tracks with sufficient stopping power. The qualitative numerical simulation results suggest that electric-field-funnel-aided (drift) collection accounts for single errors generated by an ion passing through a charge-collecting junction, while multiple errors from a single ion track are due to lateral diffusion of ion-generated charge.

  16. Texture analysis improves level set segmentation of the anterior abdominal wall

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Zhoubing; Allen, Wade M.; Baucom, Rebeccah B.

    2013-12-15

    Purpose: The treatment of ventral hernias (VH) has been a challenging problem for medical care. Repair of these hernias is fraught with failure; recurrence rates ranging from 24% to 43% have been reported, even with the use of biocompatible mesh. Currently, computed tomography (CT) is used to guide intervention through expert, but qualitative, clinical judgments; notably, quantitative metrics based on image processing are not used. The authors propose that image segmentation methods to capture the three-dimensional structure of the abdominal wall and its abnormalities will provide a foundation on which to measure geometric properties of hernias and surrounding tissues and, therefore, to optimize intervention. Methods: In this study with 20 clinically acquired CT scans on postoperative patients, the authors demonstrated a novel approach to geometric classification of the abdominal wall. The authors' approach uses a texture analysis based on Gabor filters to extract feature vectors and follows a fuzzy c-means clustering method to estimate voxelwise probability memberships for eight clusters. The memberships estimated from the texture analysis are helpful to identify anatomical structures with inhomogeneous intensities. The membership was used to guide the level set evolution, as well as to derive an initial start close to the abdominal wall. Results: Segmentation results on abdominal walls were both quantitatively and qualitatively validated with surface errors based on manually labeled ground truth. Using texture, mean surface errors for the outer surface of the abdominal wall were less than 2 mm, with 91% of the outer surface less than 5 mm away from the manual tracings; errors were significantly greater (2-5 mm) for methods that did not use the texture. Conclusions: The authors' approach establishes a baseline for characterizing the abdominal wall for improving VH care. Inherent texture patterns in CT scans are helpful to the tissue classification, and texture analysis can improve the level set segmentation around the abdominal region.
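
    The sketch below illustrates the general texture-clustering idea described above: Gabor magnitude responses serve as per-pixel features feeding a clustering step. KMeans stands in for the authors' fuzzy c-means, and the image and filter parameters are assumptions for illustration only.

      # Gabor texture features + clustering, in the spirit of the approach above.
      # KMeans substitutes for fuzzy c-means; image and parameters are hypothetical.
      import numpy as np
      from skimage import data
      from skimage.filters import gabor
      from sklearn.cluster import KMeans

      # Any grayscale image works; downsampled here to keep the example fast.
      image = data.camera()[::4, ::4].astype(float) / 255.0

      # Build a per-pixel feature vector from Gabor magnitude responses at
      # several frequencies and orientations.
      features = []
      for frequency in (0.1, 0.2, 0.4):
          for theta in (0.0, np.pi / 4, np.pi / 2, 3 * np.pi / 4):
              real, imag = gabor(image, frequency=frequency, theta=theta)
              features.append(np.sqrt(real**2 + imag**2).ravel())
      X = np.stack(features, axis=1)                # (n_pixels, n_features)

      labels = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(X)
      membership_map = labels.reshape(image.shape)  # cluster id per pixel
      print(membership_map.shape, np.unique(labels))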

  17. Missed opportunities for diagnosis: lessons learned from diagnostic errors in primary care.

    PubMed

    Goyder, Clare R; Jones, Caroline H D; Heneghan, Carl J; Thompson, Matthew J

    2015-12-01

    Because of the difficulties inherent in diagnosis in primary care, it is inevitable that diagnostic errors will occur. However, despite the important consequences associated with diagnostic errors and their estimated high prevalence, teaching and research on diagnostic error is a neglected area. To ascertain the key learning points from GPs' experiences of diagnostic errors and approaches to clinical decision making associated with these. Secondary analysis of 36 qualitative interviews with GPs in Oxfordshire, UK. Two datasets of semi-structured interviews were combined. Questions focused on GPs' experiences of diagnosis and diagnostic errors (or near misses) in routine primary care and out of hours. Interviews were audiorecorded, transcribed verbatim, and analysed thematically. Learning points include GPs' reliance on 'pattern recognition' and the failure of this strategy to identify atypical presentations; the importance of considering all potentially serious conditions using a 'restricted rule out' approach; and identifying and acting on a sense of unease. Strategies to help manage uncertainty in primary care were also discussed. Learning from previous examples of diagnostic errors is essential if these events are to be reduced in the future and this should be incorporated into GP training. At a practice level, learning points from experiences of diagnostic errors should be discussed more frequently; and more should be done to integrate these lessons nationally to understand and characterise diagnostic errors.

  18. Family caregiver learning--how family caregivers learn to provide care at the end of life: a qualitative secondary analysis of four datasets.

    PubMed

    Stajduhar, Kelli I; Funk, Laura; Outcalt, Linda

    2013-07-01

    Family caregivers are assuming growing responsibilities in providing care to dying family members. Supporting them is fundamental to ensure quality end-of-life care and to buffer potentially negative outcomes, although family caregivers frequently acknowledge a deficiency of information, knowledge, and skills necessary to assume the tasks involved in this care. The aim of this inquiry was to explore how family caregivers describe learning to provide care to palliative patients. Secondary analysis of data from four qualitative studies (n = 156) with family caregivers of dying people. Data included qualitative interviews with 156 family caregivers of dying people. Family caregivers learn through the following processes: trial and error, actively seeking needed information and guidance, applying knowledge and skills from previous experience, and reflecting on their current experiences. Caregivers generally preferred and appreciated a supported or guided learning process that involved being shown or told by others, usually learning reactively after a crisis. Findings inform areas for future research to identify effective, individualized programs and interventions to support positive learning experiences for family caregivers of dying people.

  19. VARIABLE SELECTION FOR QUALITATIVE INTERACTIONS IN PERSONALIZED MEDICINE WHILE CONTROLLING THE FAMILY-WISE ERROR RATE

    PubMed Central

    Gunter, Lacey; Zhu, Ji; Murphy, Susan

    2012-01-01

    For many years, subset analysis has been a popular topic for the biostatistics and clinical trials literature. In more recent years, the discussion has focused on finding subsets of genomes which play a role in the effect of treatment, often referred to as stratified or personalized medicine. Though highly sought after, methods for detecting subsets with altering treatment effects are limited and lacking in power. In this article we discuss variable selection for qualitative interactions with the aim to discover these critical patient subsets. We propose a new technique designed specifically to find these interaction variables among a large set of variables while still controlling for the number of false discoveries. We compare this new method against standard qualitative interaction tests using simulations and give an example of its use on data from a randomized controlled trial for the treatment of depression. PMID:22023676

  20. Sociotechnical factors influencing unsafe use of hospital information systems: A qualitative study in Malaysian government hospitals.

    PubMed

    Salahuddin, Lizawati; Ismail, Zuraini; Hashim, Ummi Rabaah; Raja Ikram, Raja Rina; Ismail, Nor Haslinda; Naim Mohayat, Mohd Hariz

    2018-03-01

    The objective of this study is to identify factors influencing unsafe use of hospital information systems in Malaysian government hospitals. Semi-structured interviews with 31 medical doctors in three Malaysian government hospitals implementing total hospital information systems were conducted between March and May 2015. A thematic qualitative analysis was performed on the resultant data to deduce the relevant themes. A total of five themes emerged as the factors influencing unsafe use of a hospital information system: (1) knowledge, (2) system quality, (3) task stressor, (4) organization resources, and (5) teamwork. These qualitative findings highlight that factors influencing unsafe use of a hospital information system originate from multidimensional sociotechnical aspects. Unsafe use of a hospital information system could possibly lead to the incidence of errors and thus raises safety risks to the patients. Hence, multiple interventions (e.g. technology systems and teamwork) are required in shaping high-quality hospital information system use.

  1. A preliminary taxonomy of medical errors in family practice

    PubMed Central

    Dovey, S; Meyers, D; Phillips, R; Green, L; Fryer, G; Galliher, J; Kappus, J; Grob, P

    2002-01-01

    Objective: To develop a preliminary taxonomy of primary care medical errors. Design: Qualitative analysis to identify categories of error reported during a randomized controlled trial of computer and paper reporting methods. Setting: The National Network for Family Practice and Primary Care Research. Participants: Family physicians. Main outcome measures: Medical error category, context, and consequence. Results: Forty two physicians made 344 reports: 284 (82.6%) arose from healthcare systems dysfunction; 46 (13.4%) were errors due to gaps in knowledge or skills; and 14 (4.1%) were reports of adverse events, not errors. The main subcategories were: administrative failures (102; 30.9% of errors), investigation failures (82; 24.8%), treatment delivery lapses (76; 23.0%), miscommunication (19; 5.8%), payment systems problems (4; 1.2%), error in the execution of a clinical task (19; 5.8%), wrong treatment decision (14; 4.2%), and wrong diagnosis (13; 3.9%). Most reports were of errors that were recognized and occurred in reporters' practices. Affected patients ranged in age from 8 months to 100 years, were of both sexes, and represented all major US ethnic groups. Almost half the reports were of events which had adverse consequences. Ten errors resulted in patients being admitted to hospital and one patient died. Conclusions: This medical error taxonomy, developed from self-reports of errors observed by family physicians during their routine clinical practice, emphasizes problems in healthcare processes and acknowledges medical errors arising from shortfalls in clinical knowledge and skills. Patient safety strategies with most effect in primary care settings need to be broader than the current focus on medication errors. PMID:12486987

  2. A preliminary taxonomy of medical errors in family practice.

    PubMed

    Dovey, S M; Meyers, D S; Phillips, R L; Green, L A; Fryer, G E; Galliher, J M; Kappus, J; Grob, P

    2002-09-01

    To develop a preliminary taxonomy of primary care medical errors. Qualitative analysis to identify categories of error reported during a randomized controlled trial of computer and paper reporting methods. The National Network for Family Practice and Primary Care Research. Family physicians. Medical error category, context, and consequence. Forty two physicians made 344 reports: 284 (82.6%) arose from healthcare systems dysfunction; 46 (13.4%) were errors due to gaps in knowledge or skills; and 14 (4.1%) were reports of adverse events, not errors. The main subcategories were: administrative failure (102; 30.9% of errors), investigation failures (82; 24.8%), treatment delivery lapses (76; 23.0%), miscommunication (19; 5.8%), payment systems problems (4; 1.2%), error in the execution of a clinical task (19; 5.8%), wrong treatment decision (14; 4.2%), and wrong diagnosis (13; 3.9%). Most reports were of errors that were recognized and occurred in reporters' practices. Affected patients ranged in age from 8 months to 100 years, were of both sexes, and represented all major US ethnic groups. Almost half the reports were of events which had adverse consequences. Ten errors resulted in patients being admitted to hospital and one patient died. This medical error taxonomy, developed from self-reports of errors observed by family physicians during their routine clinical practice, emphasizes problems in healthcare processes and acknowledges medical errors arising from shortfalls in clinical knowledge and skills. Patient safety strategies with most effect in primary care settings need to be broader than the current focus on medication errors.

  3. Medication Timing Errors for Parkinson's Disease: Perspectives Held by Caregivers and People with Parkinson's in New Zealand

    PubMed Central

    Buetow, Stephen; Henshaw, Jenny; Bryant, Linda; O'Sullivan, Deirdre

    2010-01-01

    Background. Common but seldom published are Parkinson's disease (PD) medication errors involving late, extra, or missed doses. These errors can reduce medication effectiveness and the quality of life of people with PD and their caregivers. Objective. To explore lay perspectives of factors contributing to medication timing errors for PD in hospital and community settings. Design and Methods. This qualitative research purposively sampled individuals with PD, or a proxy of their choice, throughout New Zealand during 2008-2009. Data collection involved 20 semistructured, personal interviews by telephone. A general inductive analysis of the data identified core insights consistent with the study objective. Results. Five themes help to account for possible timing adherence errors by people with PD, their caregivers or professionals. The themes are the abrupt withdrawal of PD medication; wrong, vague or misread instructions; devaluation of the lay role in managing PD medications; deficits in professional knowledge and in caring behavior around PD in formal health care settings; and lay forgetfulness. Conclusions. The results add to the limited published research on medication errors in PD and help to confirm anecdotal experience internationally. They indicate opportunities for professionals and lay people to work together to reduce errors in the timing of medication for PD in hospital and community settings. PMID:20975777

  4. Quantitative and qualitative differences in the lexical knowledge of monolingual and bilingual children on the LITMUS-CLT task.

    PubMed

    Altman, Carmit; Goldstein, Tamara; Armon-Lotem, Sharon

    2017-01-01

    While bilingual children follow the same milestones of language acquisition as monolingual children do in learning the syntactic patterns of their second language (L2), their vocabulary size in L2 often lags behind compared to monolinguals. The present study explores the comprehension and production of nouns and verbs in Hebrew, by two groups of 5- to 6-year olds with typical language development: monolingual Hebrew speakers (N = 26), and Russian-Hebrew bilinguals (N = 27). Analyses not only show quantitative gaps between comprehension and production and between nouns and verbs, with a bilingual effect in both, but also a qualitative difference between monolinguals and bilinguals in their production errors: monolinguals' errors reveal knowledge of the language rules despite temporary access difficulties, while bilinguals' errors reflect gaps in their knowledge of Hebrew (L2). The nature of Hebrew as a Semitic language allows one to explore this qualitative difference in the semantic and morphological level.

  5. The Model Analyst’s Toolkit: Scientific Model Development, Analysis, and Validation

    DTIC Science & Technology

    2015-08-20

    …way correlations. For instance, if crime waves are associated with increases in unemployment or drops in police presence, that would be hard to… time lag; a_i, b_j are parameters in a linear combination; ε_1, ε_2 are error terms… selecting a proper representation for the underlying data. A qualitative comparison of GC and DTW methods on World Bank data indicates that both methods…

  6. Analysis of Lard in Lipstick Formulation Using FTIR Spectroscopy and Multivariate Calibration: A Comparison of Three Extraction Methods.

    PubMed

    Waskitho, Dri; Lukitaningsih, Endang; Sudjadi; Rohman, Abdul

    2016-01-01

    Analysis of lard extracted from lipstick formulation containing castor oil has been performed using FTIR spectroscopic method combined with multivariate calibration. Three different extraction methods were compared, namely saponification method followed by liquid/liquid extraction with hexane/dichlorometane/ethanol/water, saponification method followed by liquid/liquid extraction with dichloromethane/ethanol/water, and Bligh & Dyer method using chloroform/methanol/water as extracting solvent. Qualitative and quantitative analysis of lard were performed using principle component (PCA) and partial least square (PLS) analysis, respectively. The results showed that, in all samples prepared by the three extraction methods, PCA was capable of identifying lard at wavelength region of 1200-800 cm -1 with the best result was obtained by Bligh & Dyer method. Furthermore, PLS analysis at the same wavelength region used for qualification showed that Bligh and Dyer was the most suitable extraction method with the highest determination coefficient (R 2 ) and the lowest root mean square error of calibration (RMSEC) as well as root mean square error of prediction (RMSEP) values.
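
    A minimal sketch of the PLS-calibration workflow described above, computing RMSEC and RMSEP on synthetic spectra. All data and parameter choices here are hypothetical stand-ins for the paper's FTIR measurements.

      # PLS calibration with RMSEC/RMSEP, analogous to the workflow above.
      # Spectra are synthetic stand-ins, not the paper's FTIR data.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      n_samples, n_wavenumbers = 60, 200
      concentration = rng.uniform(0, 100, n_samples)   # % lard (synthetic)
      basis = rng.normal(size=n_wavenumbers)
      X = np.outer(concentration, basis) + rng.normal(scale=5.0,
                                                      size=(n_samples, n_wavenumbers))

      X_cal, X_val, y_cal, y_val = train_test_split(X, concentration,
                                                    test_size=0.3, random_state=0)
      pls = PLSRegression(n_components=3).fit(X_cal, y_cal)

      rmsec = np.sqrt(np.mean((pls.predict(X_cal).ravel() - y_cal) ** 2))
      rmsep = np.sqrt(np.mean((pls.predict(X_val).ravel() - y_val) ** 2))
      print(f"RMSEC = {rmsec:.2f}, RMSEP = {rmsep:.2f}, "
            f"R2 = {pls.score(X_val, y_val):.3f}")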

  7. Quality Issues of Court Reporters and Transcriptionists for Qualitative Research

    PubMed Central

    Hennink, Monique; Weber, Mary Beth

    2015-01-01

    Transcription is central to qualitative research, yet few researchers identify the quality of different transcription methods. We described the quality of verbatim transcripts from traditional transcriptionists and court reporters by reviewing 16 transcripts from 8 focus group discussions using four criteria: transcription errors, cost of transcription, time of transcription, and effect on study participants. Transcriptionists made fewer errors and captured colloquial dialogue, and their errors were largely influenced by the quality of the recording. Court reporters made more errors, particularly in the omission of topical content and contextual detail, and were less able to produce a verbatim transcript; however, the potential immediacy of the transcript was advantageous. In terms of cost, shorter group discussions favored a transcriptionist and longer groups a court reporter. Study participants reported no effect by either method of recording. Understanding the benefits and limitations of each method of transcription can help researchers select an appropriate method for each study. PMID:23512435

  8. Measuring a diffusion coefficient by single-particle tracking: statistical analysis of experimental mean squared displacement curves.

    PubMed

    Ernst, Dominique; Köhler, Jürgen

    2013-01-21

    We provide experimental results on the accuracy of diffusion coefficients obtained by a mean squared displacement (MSD) analysis of single-particle trajectories. We have recorded very long trajectories comprising more than 1.5 × 10^5 data points and decomposed these long trajectories into shorter segments, providing us with ensembles of trajectories of variable lengths. This enabled a statistical analysis of the resulting MSD curves as a function of the lengths of the segments. We find that the relative error of the diffusion coefficient can be minimized by taking an optimum number of points into account for fitting the MSD curves, and that this optimum does not depend on the segment length. Yet, the magnitude of the relative error for the diffusion coefficient does, and achieving an accuracy on the order of 10% requires the recording of trajectories with about 1000 data points. Finally, we compare our results with theoretical predictions and find very good qualitative and quantitative agreement between experiment and theory.
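
    The sketch below mimics the study's general approach on a simulated 2-D Brownian trajectory: compute the time-averaged MSD and fit only the first few lag points to estimate the diffusion coefficient. All parameters are synthetic assumptions, not the experimental values.

      # Illustrative MSD analysis of a simulated 2-D Brownian trajectory.
      import numpy as np

      rng = np.random.default_rng(1)
      D_true, dt, n_steps = 0.5, 0.01, 5000          # um^2/s, s, points (synthetic)
      steps = rng.normal(scale=np.sqrt(2 * D_true * dt), size=(n_steps, 2))
      traj = np.cumsum(steps, axis=0)

      def msd(traj, max_lag):
          # time-averaged MSD for lags 1..max_lag
          return np.array([np.mean(np.sum((traj[lag:] - traj[:-lag]) ** 2, axis=1))
                           for lag in range(1, max_lag + 1)])

      n_fit = 4                                      # few-point fit reduces error
      lags = np.arange(1, n_fit + 1) * dt
      slope = np.polyfit(lags, msd(traj, n_fit), 1)[0]
      D_est = slope / 4.0                            # MSD = 4 D t in two dimensions
      print(f"D_true = {D_true}, D_est = {D_est:.3f}")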

  9. Modelling default and likelihood reasoning as probabilistic reasoning

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1990-01-01

    A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals in the same sense as possibility and necessity. To model these four forms probabilistically, a qualitative default probabilistic (QDP) logic and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequent results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlight their approximate nature, the dualities, and the need for complementary reasoning about relevance.

  10. Response cost, reinforcement, and children's Porteus Maze qualitative performance.

    PubMed

    Neenan, D M; Routh, D K

    1986-09-01

    Sixty fourth-grade children were given two different series of the Porteus Maze Test. The first series was given as a baseline, and the second series was administered under one of four different experimental conditions: control, response cost, positive reinforcement, or negative verbal feedback. Response cost and positive reinforcement, but not negative verbal feedback, led to significant decreases in the number of all types of qualitative errors in relation to the control group. The reduction of nontargeted as well as targeted errors provides evidence for the generalized effects of response cost and positive reinforcement.

  11. Model wall and recovery temperature effects on experimental heat transfer data analysis

    NASA Technical Reports Server (NTRS)

    Throckmorton, D. A.; Stone, D. R.

    1974-01-01

    Basic analytical procedures are used to illustrate, both qualitatively and quantitatively, the relative impact upon heat transfer data analysis of certain factors which may affect the accuracy of experimental heat transfer data. Inaccurate knowledge of adiabatic wall conditions results in a corresponding inaccuracy in the measured heat transfer coefficient. The magnitude of the resulting error is extreme for data obtained at wall temperatures approaching the adiabatic condition. High model wall temperatures and wall temperature gradients affect the level and distribution of heat transfer to an experimental model. The significance of each of these factors is examined and its impact upon heat transfer data analysis is assessed.

  12. ACSPRI 2014 4th International Social Science Methodology Conference Report

    DTIC Science & Technology

    2015-04-01

    …Validity, trustworthiness and rigour: quality and the idea of qualitative research. Journal of Advanced Nursing, 304-310. Spencer, L., Ritchie, J.… increasing data quality; the Total Survey Error framework; multi-modal on-line surveying; quality frameworks for assessing qualitative research; and… provided an overview of the current perspectives on causal claims in qualitative research. Three approaches to generating plausible causal…

  13. Near infrared spectroscopy combined with multivariate analysis for monitoring the ethanol precipitation process of fraction I + II + III supernatant in human albumin separation

    NASA Astrophysics Data System (ADS)

    Li, Can; Wang, Fei; Zang, Lixuan; Zang, Hengchang; Alcalà, Manel; Nie, Lei; Wang, Mingyu; Li, Lian

    2017-03-01

    Nowadays, as a powerful process analytical tool, near infrared spectroscopy (NIRS) has been widely applied in process monitoring. In the present work, NIRS combined with multivariate analysis was used to monitor the ethanol precipitation process of fraction I + II + III (FI + II + III) supernatant in human albumin (HA) separation, to achieve qualitative and quantitative monitoring at the same time and assure the product's quality. First, a qualitative model was established by using principal component analysis (PCA) with samples from 6 of 8 normal batches, and evaluated with the remaining 2 normal batches and 3 abnormal batches. The results showed that the first principal component (PC1) score chart could be successfully used for fault detection and diagnosis. Then, two quantitative models were built with 6 of 8 normal batches to determine the content of total protein (TP) and HA separately by using a partial least squares regression (PLS-R) strategy, and the models were validated with the 2 remaining normal batches. The determination coefficient of validation (Rp2), root mean square error of cross validation (RMSECV), root mean square error of prediction (RMSEP) and ratio of performance deviation (RPD) were 0.975, 0.501 g/L, 0.465 g/L and 5.57 for TP, and 0.969, 0.530 g/L, 0.341 g/L and 5.47 for HA, respectively. The results showed that the established models could give a rapid and accurate measurement of the content of TP and HA. These results indicated that NIRS is an effective tool and could be successfully used for simultaneous qualitative and quantitative monitoring of the ethanol precipitation process of FI + II + III supernatant. This research has significant reference value for assuring the quality and improving the recovery ratio of HA at industrial scale by using NIRS.

  14. Near infrared spectroscopy combined with multivariate analysis for monitoring the ethanol precipitation process of fraction I+II+III supernatant in human albumin separation.

    PubMed

    Li, Can; Wang, Fei; Zang, Lixuan; Zang, Hengchang; Alcalà, Manel; Nie, Lei; Wang, Mingyu; Li, Lian

    2017-03-15

    Nowadays, as a powerful process analytical tool, near infrared spectroscopy (NIRS) has been widely applied in process monitoring. In the present work, NIRS combined with multivariate analysis was used to monitor the ethanol precipitation process of fraction I+II+III (FI+II+III) supernatant in human albumin (HA) separation, to achieve qualitative and quantitative monitoring at the same time and assure the product's quality. First, a qualitative model was established by using principal component analysis (PCA) with samples from 6 of 8 normal batches, and evaluated with the remaining 2 normal batches and 3 abnormal batches. The results showed that the first principal component (PC1) score chart could be successfully used for fault detection and diagnosis. Then, two quantitative models were built with 6 of 8 normal batches to determine the content of total protein (TP) and HA separately by using a partial least squares regression (PLS-R) strategy, and the models were validated with the 2 remaining normal batches. The determination coefficient of validation (Rp2), root mean square error of cross validation (RMSECV), root mean square error of prediction (RMSEP) and ratio of performance deviation (RPD) were 0.975, 0.501 g/L, 0.465 g/L and 5.57 for TP, and 0.969, 0.530 g/L, 0.341 g/L and 5.47 for HA, respectively. The results showed that the established models could give a rapid and accurate measurement of the content of TP and HA. These results indicated that NIRS is an effective tool and could be successfully used for simultaneous qualitative and quantitative monitoring of the ethanol precipitation process of FI+II+III supernatant. This research has significant reference value for assuring the quality and improving the recovery ratio of HA at industrial scale by using NIRS.
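
    The validation metrics quoted in the two records above (RMSEP and RPD) can be reproduced from pairs of reference and predicted values, as in this small sketch; the concentrations are hypothetical, not the paper's data.

      # RMSEP and RPD from hypothetical reference/predicted concentrations.
      import numpy as np

      y_ref = np.array([18.2, 21.5, 25.1, 30.4, 34.8, 40.2])   # g/L, hypothetical
      y_pred = np.array([18.6, 21.1, 25.6, 29.9, 35.3, 39.7])  # g/L, hypothetical

      rmsep = np.sqrt(np.mean((y_pred - y_ref) ** 2))
      rpd = np.std(y_ref, ddof=1) / rmsep     # ratio of performance to deviation
      print(f"RMSEP = {rmsep:.3f} g/L, RPD = {rpd:.2f}")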

  15. Visualizing Uncertainty of Point Phenomena by Redesigned Error Ellipses

    NASA Astrophysics Data System (ADS)

    Murphy, Christian E.

    2018-05-01

    Visualizing uncertainty remains one of the great challenges in modern cartography. There is no overarching strategy to display the nature of uncertainty, as an effective and efficient visualization depends, besides on the spatial data feature type, heavily on the type of uncertainty. This work presents a design strategy to visualize uncertainty connected to point features. The error ellipse, well-known from mathematical statistics, is adapted to display the uncertainty of point information originating from spatial generalization. Modified designs of the error ellipse show the potential of quantitative and qualitative symbolization and simultaneous point based uncertainty symbolization. The user can intuitively depict the centers of gravity, the major orientation of the point arrays as well as estimate the extents and possible spatial distributions of multiple point phenomena. The error ellipse represents uncertainty in an intuitive way, particularly suitable for laymen. Furthermore it is shown how applicable an adapted design of the error ellipse is to display the uncertainty of point features originating from incomplete data. The suitability of the error ellipse to display the uncertainty of point information is demonstrated within two showcases: (1) the analysis of formations of association football players, and (2) uncertain positioning of events on maps for the media.
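
    For readers unfamiliar with the underlying statistics, the sketch below computes a conventional 95% error ellipse (center, axes, orientation) from the covariance of a 2-D point cloud; the data and confidence level are arbitrary choices, not the paper's.

      # Covariance-based error ellipse for a 2-D point cloud.
      import numpy as np
      from scipy.stats import chi2

      rng = np.random.default_rng(2)
      points = rng.multivariate_normal([10.0, 5.0],
                                       [[4.0, 1.5], [1.5, 1.0]], size=500)

      center = points.mean(axis=0)
      cov = np.cov(points, rowvar=False)
      eigvals, eigvecs = np.linalg.eigh(cov)          # ascending eigenvalues

      k = chi2.ppf(0.95, df=2)                        # 95% confidence scaling
      width, height = 2 * np.sqrt(k * eigvals[::-1])  # full axes, major first
      angle = np.degrees(np.arctan2(eigvecs[1, -1], eigvecs[0, -1]))

      print(f"center={center.round(2)}, major={width:.2f}, "
            f"minor={height:.2f}, orientation={angle:.1f} deg")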

  16. Identification of spilled oils by NIR spectroscopy technology based on KPCA and LSSVM

    NASA Astrophysics Data System (ADS)

    Tan, Ailing; Bi, Weihong

    2011-08-01

    Oil spills on the sea surface are seen relatively often with the growth of offshore petroleum exploitation and marine transportation. Oil spills are a great threat to the marine environment and the ecosystem, making oil pollution in the ocean an urgent topic in environmental protection. To develop oil spill accident treatment programs and track the source of spilled oils, a novel qualitative identification method combining Kernel Principal Component Analysis (KPCA) and Least Squares Support Vector Machine (LSSVM) was proposed. The proposed method uses a Fourier transform NIR spectrophotometer to collect the NIR spectral data of simulated gasoline, diesel fuel and kerosene oil spill samples and applies pretreatments to the original spectra. We use the KPCA algorithm, an extension of Principal Component Analysis (PCA) using techniques of kernel methods, to extract nonlinear features of the preprocessed spectra. Support Vector Machines (SVM) are a powerful methodology for solving spectral classification tasks in chemometrics. LSSVMs are reformulations of standard SVMs which lead to solving a system of linear equations. An LSSVM multiclass classification model was therefore designed using the Error Correcting Output Code (ECOC) method, borrowing the idea of error-correcting codes used for correcting bit errors in transmission channels. The most common and reliable approach to parameter selection is to decide on parameter ranges and then do a grid search over the parameter space to find the optimal model parameters. To test the proposed method, 375 spilled oil samples of unknown type were selected for study. The optimal model had the best identification capabilities, with an accuracy of 97.8%. Experimental results show that the proposed KPCA plus LSSVM qualitative analysis method of near infrared spectroscopy has good recognition results and could work as a new method for rapid identification of spilled oils.
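
    A rough sketch of the KPCA-plus-classifier pipeline with a grid search, as described above. scikit-learn's SVC (one-vs-one multiclass) stands in for the LSSVM/ECOC model, which scikit-learn does not provide, and the data and parameter ranges are synthetic assumptions.

      # KPCA feature extraction + SVM classification with a grid search.
      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.decomposition import KernelPCA
      from sklearn.model_selection import GridSearchCV, train_test_split
      from sklearn.pipeline import Pipeline
      from sklearn.svm import SVC

      X, y = make_classification(n_samples=375, n_features=100, n_informative=20,
                                 n_classes=3, n_clusters_per_class=1,
                                 random_state=0)
      X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                          random_state=0)

      pipe = Pipeline([("kpca", KernelPCA(n_components=10, kernel="rbf")),
                       ("svm", SVC(kernel="rbf"))])
      grid = GridSearchCV(pipe, {"svm__C": [1, 10, 100],
                                 "svm__gamma": [0.01, 0.1, 1.0]}, cv=5)
      grid.fit(X_train, y_train)
      print(f"test accuracy = {grid.score(X_test, y_test):.3f}")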

  17. Google glass based immunochromatographic diagnostic test analysis

    NASA Astrophysics Data System (ADS)

    Feng, Steve; Caire, Romain; Cortazar, Bingen; Turan, Mehmet; Wong, Andrew; Ozcan, Aydogan

    2015-03-01

    Integration of optical imagers and sensors into recently emerging wearable computational devices allows for simpler and more intuitive methods of integrating biomedical imaging and medical diagnostics tasks into existing infrastructures. Here we demonstrate the ability of one such device, the Google Glass, to perform qualitative and quantitative analysis of immunochromatographic rapid diagnostic tests (RDTs) using a voice-commandable hands-free software-only interface, as an alternative to larger and more bulky desktop or handheld units. Using the built-in camera of Glass to image one or more RDTs (labeled with Quick Response (QR) codes), our Glass software application uploads the captured image and related information (e.g., user name, GPS, etc.) to our servers for remote analysis and storage. After digital analysis of the RDT images, the results are transmitted back to the originating Glass device, and made available through a website in geospatial and tabular representations. We tested this system on qualitative human immunodeficiency virus (HIV) and quantitative prostate-specific antigen (PSA) RDTs. For qualitative HIV tests, we demonstrate successful detection and labeling (i.e., yes/no decisions) for up to 6-fold dilution of HIV samples. For quantitative measurements, we activated and imaged PSA concentrations ranging from 0 to 200 ng/mL and generated calibration curves relating the RDT line intensity values to PSA concentration. By providing automated digitization of both qualitative and quantitative test results, this wearable colorimetric diagnostic test reader platform on Google Glass can reduce operator errors caused by poor training, provide real-time spatiotemporal mapping of test results, and assist with remote monitoring of various biomedical conditions.
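
    A toy version of the quantitative readout described above: fit a calibration curve relating test-line intensity to PSA concentration, then invert it for new readings. The intensity values and the linear model are hypothetical stand-ins for the paper's calibration data.

      # Toy calibration curve: test-line intensity vs. PSA concentration.
      import numpy as np

      psa_ng_ml = np.array([0.0, 10.0, 25.0, 50.0, 100.0, 200.0])   # activated
      intensity = np.array([0.02, 0.11, 0.24, 0.47, 0.90, 1.72])    # hypothetical

      slope, intercept = np.polyfit(psa_ng_ml, intensity, 1)

      def intensity_to_psa(i):
          # invert the fitted line to estimate concentration from a new reading
          return (i - intercept) / slope

      print(f"reading 0.60 -> {intensity_to_psa(0.60):.1f} ng/mL")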

  18. Student Errors in Fractions and Possible Causes of These Errors

    ERIC Educational Resources Information Center

    Aksoy, Nuri Can; Yazlik, Derya Ozlem

    2017-01-01

    In this study, it was aimed to determine the errors and misunderstandings of 5th and 6th grade middle school students in fractions and operations with fractions. For this purpose, the case study model, which is a qualitative research design, was used in the research. In the study, maximum diversity sampling, which is a purposeful sampling method,…

  19. Accommodating Grief on Twitter: An Analysis of Expressions of Grief Among Gang Involved Youth on Twitter Using Qualitative Analysis and Natural Language Processing.

    PubMed

    Patton, Desmond Upton; MacBeth, Jamie; Schoenebeck, Sarita; Shear, Katherine; McKeown, Kathleen

    2018-01-01

    There is a dearth of research investigating youths' experience of grief and mourning after the death of close friends or family. Even less research has explored the question of how youth use social media sites to engage in the grieving process. This study employs qualitative analysis and natural language processing to examine tweets that follow 2 deaths. First, we conducted a close textual read on a sample of tweets by Gakirah Barnes, a gang-involved teenaged girl in Chicago, and members of her Twitter network, over a 19-day period in 2014 during which 2 significant deaths occurred: that of Raason "Lil B" Shaw and Gakirah's own death. We leverage the grief literature to understand the way Gakirah and her peers express thoughts, feelings, and behaviors at the time of these deaths. We also present and explain the rich and complex style of online communication among gang-involved youth, one that has been overlooked in prior research. Next, we overview the natural language processing output for expressions of loss and grief in our data set based on qualitative findings and present an error analysis on its output for grief. We conclude with a call for interdisciplinary research that analyzes online and offline behaviors to help understand physical and emotional violence and other problematic behaviors prevalent among marginalized communities.

  20. Accommodating Grief on Twitter: An Analysis of Expressions of Grief Among Gang Involved Youth on Twitter Using Qualitative Analysis and Natural Language Processing

    PubMed Central

    Patton, Desmond Upton; MacBeth, Jamie; Schoenebeck, Sarita; Shear, Katherine; McKeown, Kathleen

    2018-01-01

    There is a dearth of research investigating youths’ experience of grief and mourning after the death of close friends or family. Even less research has explored the question of how youth use social media sites to engage in the grieving process. This study employs qualitative analysis and natural language processing to examine tweets that follow 2 deaths. First, we conducted a close textual read on a sample of tweets by Gakirah Barnes, a gang-involved teenaged girl in Chicago, and members of her Twitter network, over a 19-day period in 2014 during which 2 significant deaths occurred: that of Raason “Lil B” Shaw and Gakirah’s own death. We leverage the grief literature to understand the way Gakirah and her peers express thoughts, feelings, and behaviors at the time of these deaths. We also present and explain the rich and complex style of online communication among gang-involved youth, one that has been overlooked in prior research. Next, we overview the natural language processing output for expressions of loss and grief in our data set based on qualitative findings and present an error analysis on its output for grief. We conclude with a call for interdisciplinary research that analyzes online and offline behaviors to help understand physical and emotional violence and other problematic behaviors prevalent among marginalized communities. PMID:29636619

  1. Survey and Method for Determination of Trajectory Predictor Requirements

    NASA Technical Reports Server (NTRS)

    Rentas, Tamika L.; Green, Steven M.; Cate, Karen Tung

    2009-01-01

    A survey of air-traffic-management researchers, representing a broad range of automation applications, was conducted to document trajectory-predictor requirements for future decision-support systems. Results indicated that the researchers were unable to articulate a basic set of trajectory-prediction requirements for their automation concepts. Survey responses showed the need to establish a process to help developers determine the trajectory-predictor-performance requirements for their concepts. Two methods for determining trajectory-predictor requirements are introduced. A fast-time simulation method is discussed that captures the sensitivity of a concept to the performance of its trajectory-prediction capability. A characterization method is proposed to provide quicker, yet less precise results, based on analysis and simulation to characterize the trajectory-prediction errors associated with key modeling options for a specific concept. Concept developers can then identify the relative sizes of errors associated with key modeling options, and qualitatively determine which options lead to significant errors. The characterization method is demonstrated for a case study involving future airport surface traffic management automation. Of the top four sources of error, results indicated that the error associated with accelerations to and from turn speeds was unacceptable, the error associated with the turn path model was acceptable, and the error associated with taxi-speed estimation was of concern and needed a higher-fidelity concept simulation to obtain a more precise result.

  2. Vector space methods of photometric analysis. II - Refinement of the MK grid for B stars. III - The two components of ultraviolet reddening

    NASA Technical Reports Server (NTRS)

    Massa, D.

    1980-01-01

    This paper discusses systematic errors which arise from exclusive use of the MK system to determine reddening. It is found that implementation of uvby, beta photometry to refine the qualitative MK grid substantially reduces stellar mismatch error. A working definition of 'identical' uvby, beta types is investigated and the relationship of uvby to B-V color excesses is determined. A comparison is also made of the hydrogen-based uvby, beta types with the MK system based on He and metal lines. A small core correlated effective temperature luminosity error in the MK system for the early B stars is observed, along with a breakdown of the MK luminosity criteria for the late B stars. The second part investigates the wavelength dependence of interstellar extinction in the ultraviolet wavelength range observed with the TD-1 satellite. In this study the sets of identical stars employed to find reddening are determined more precisely than in previous studies and consist only of normal, nonsupergiant stars. Multivariate analysis of variance techniques in an unbiased coordinate system are used for determining the wavelength dependence of reddening.

  3. Psycho-Motor and Error Enabled Simulations Modeling Vulnerable Skills in the Pre-Mastery Phase - Medical Practice Initiative Procedural Skill Decay and Maintenance (MPI-PSD)

    DTIC Science & Technology

    2015-04-01

    …and execution of the Performance Review Tool; organization, coding, and transcribing of collected data; analysis of qualitative survey and quantitative… (University of Wisconsin System, Madison, WI; report date April 2015; annual report prepared for U.S. Army Medical Research and Materiel Command, Fort Detrick, Maryland.)

  4. Magnetic field errors tolerances of Nuclotron booster

    NASA Astrophysics Data System (ADS)

    Butenko, Andrey; Kazinova, Olha; Kostromin, Sergey; Mikhaylov, Vladimir; Tuzikov, Alexey; Khodzhibagiyan, Hamlet

    2018-04-01

    Generation of the magnetic field in the booster synchrotron units for the NICA project is one of the most important conditions for achieving the required parameters and quality accelerator operation. Studies of the linear and nonlinear dynamics of the 197Au31+ ion beam in the booster have been carried out with the MADX program. Analytical estimation of magnetic field error tolerances and numerical computation of the dynamic aperture of the booster DFO magnetic lattice are presented. Closed orbit distortion was evaluated with random errors of the magnetic fields and errors in the layout of booster units.

  5. The vocabulary profile of Slovak children with primary language impairment compared to typically developing Slovak children measured by LITMUS-CLT.

    PubMed

    Kapalková, Svetlana; Slančová, Daniela

    2017-01-01

    This study compared a sample of children with primary language impairment (PLI) and typically developing age-matched children using the crosslinguistic lexical tasks (CLT-SK). We also compared the PLI children with typically developing language-matched younger children who were matched on the basis of receptive vocabulary. Overall, statistical testing showed that the vocabulary of the PLI children was significantly different from the vocabulary of the age-matched children, but not statistically different from the younger children who were matched on the basis of their receptive vocabulary size. Qualitative analysis of the correct answers revealed that the PLI children showed higher rigidity compared to the younger language-matched children who are able to use more synonyms or derivations across word class in naming tasks. Similarly, an examination of the children's naming errors indicated that the language-matched children exhibited more semantic errors, whereas PLI children showed more associative errors.

  6. Exploring behavioural determinants relating to health professional reporting of medication errors: a qualitative study using the Theoretical Domains Framework.

    PubMed

    Alqubaisi, Mai; Tonna, Antonella; Strath, Alison; Stewart, Derek

    2016-07-01

    Effective and efficient medication reporting processes are essential in promoting patient safety. Few qualitative studies have explored reporting of medication errors by health professionals, and none have made reference to behavioural theories. The objective was to describe and understand the behavioural determinants of health professional reporting of medication errors in the United Arab Emirates (UAE). This was a qualitative study comprising face-to-face, semi-structured interviews within three major medical/surgical hospitals of Abu Dhabi, the UAE. Health professionals were sampled purposively in strata of profession and years of experience. The semi-structured interview schedule focused on behavioural determinants around medication error reporting, facilitators, barriers and experiences. The Theoretical Domains Framework (TDF; a framework of theories of behaviour change) was used as a coding framework. Ethical approval was obtained from a UK university and all participating hospital ethics committees. Data saturation was achieved after interviewing ten nurses, ten pharmacists and nine physicians. Whilst it appeared that patient safety and organisational improvement goals and intentions were behavioural determinants which facilitated reporting, there were key determinants which deterred reporting. These included the beliefs of the consequences of reporting (lack of any feedback following reporting and impacting professional reputation, relationships and career progression), emotions (fear and worry) and issues related to the environmental context (time taken to report). These key behavioural determinants which negatively impact error reporting can facilitate the development of an intervention, centring on organisational safety and reporting culture, to enhance reporting effectiveness and efficiency.

  7. Uncorrected and corrected refractive error experiences of Nepalese adults: a qualitative study.

    PubMed

    Kandel, Himal; Khadka, Jyoti; Shrestha, Mohan Krishna; Sharma, Sadhana; Neupane Kandel, Sandhya; Dhungana, Purushottam; Pradhan, Kishore; Nepal, Bhagavat P; Thapa, Suman; Pesudovs, Konrad

    2018-04-01

    The aim of this study was to explore the impact of corrected and uncorrected refractive error (URE) on Nepalese people's quality of life (QoL), and to compare the QoL status between refractive error subgroups. Participants were recruited from Tilganga Institute of Ophthalmology and Dhulikhel Hospital, Nepal. Semi-structured in-depth interviews were conducted with 101 people with refractive error. Thematic analysis was used with matrices produced to compare the occurrence of themes and categories across participants. Themes were identified using an inductive approach. Seven major themes emerged that determined refractive error-specific QoL: activity limitation, inconvenience, health concerns, psycho-social impact, economic impact, general and ocular comfort symptoms, and visual symptoms. Activity limitation, economic impact, and symptoms were the most important themes for the participants with URE, whereas inconvenience associated with wearing glasses was the most important issue for glasses wearers. Similarly, possible side effects or complications were the major concerns for participants wearing contact lenses. In general, refractive surgery addressed the socio-emotional impact of wearing glasses or contact lenses. However, the surgery participants had concerns such as the possibility of having to wear glasses again due to relapse of refractive error. The impact of refractive error on people's QoL is multifaceted. The significance of the identified themes varies by refractive error subgroup. Refractive correction may not always address the QoL impact of URE and often adds unique QoL issues. These study findings also provide content for developing an item-bank for quantitatively measuring refractive error-specific QoL in a developing country setting.

  8. Symbolic-numeric interface: A review

    NASA Technical Reports Server (NTRS)

    Ng, E. W.

    1980-01-01

    A survey of the use of a combination of symbolic and numerical calculations is presented. Symbolic calculations primarily refer to the computer processing of procedures from classical algebra, analysis, and calculus. Numerical calculations refer to both numerical mathematics research and scientific computation. This survey is intended to point out a large number of problem areas where a cooperation of symbolic and numerical methods is likely to bear many fruits. These areas include such classical operations as differentiation and integration, such diverse activities as function approximations and qualitative analysis, and such contemporary topics as finite element calculations and computation complexity. It is contended that other less obvious topics such as the fast Fourier transform, linear algebra, nonlinear analysis and error analysis would also benefit from a synergistic approach.
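
    A tiny example of the symbolic-numeric cooperation the survey advocates: differentiate exactly with SymPy, then compile the result to a fast numerical function with lambdify. The function chosen here is arbitrary.

      # Symbolic differentiation followed by numerical evaluation.
      import numpy as np
      import sympy as sp

      x = sp.symbols("x")
      f = sp.sin(x) * sp.exp(-x**2)
      df = sp.diff(f, x)                      # exact symbolic derivative

      df_num = sp.lambdify(x, df, "numpy")    # compile to a numeric function
      xs = np.linspace(0.0, 2.0, 5)
      print(sp.simplify(df))
      print(df_num(xs))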

  9. Combining FT-IR spectroscopy and multivariate analysis for qualitative and quantitative analysis of the cell wall composition changes during apples development.

    PubMed

    Szymanska-Chargot, M; Chylinska, M; Kruk, B; Zdunek, A

    2015-01-22

    The aim of this work was to quantitatively and qualitatively determine the composition of the cell wall material from apples during development by means of Fourier transform infrared (FT-IR) spectroscopy. The FT-IR region of 1500-800 cm⁻¹, containing characteristic bands for galacturonic acid, hemicellulose and cellulose, was examined using principal component analysis (PCA), k-means clustering and partial least squares (PLS). The samples were differentiated by development stage and cultivar using PCA and k-means clustering. PLS calibration models for galacturonic acid, hemicellulose and cellulose content from FT-IR spectra were developed and validated against the reference data. The PLS models were tested using the root-mean-square errors of cross-validation for the contents of galacturonic acid, hemicellulose and cellulose, which were 8.30 mg/g, 4.08% and 1.74%, respectively. It was proven that FT-IR spectroscopy combined with chemometric methods has potential for fast and reliable determination of the main constituents of fruit cell walls. Copyright © 2014 Elsevier Ltd. All rights reserved.
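
    As a sketch of the calibration step described above, the following Python fragment uses scikit-learn rather than the authors' software; the spectra and galacturonic acid reference values are random placeholders, so only the RMSECV mechanics, not the reported numbers, carry over:

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_predict

      rng = np.random.default_rng(0)
      spectra = rng.normal(size=(40, 350))       # placeholder FT-IR region
      gal_acid = rng.normal(120.0, 15.0, 40)     # placeholder reference, mg/g

      # Build a PLS calibration and score it by the root-mean-square
      # error of cross-validation (RMSECV).
      pls = PLSRegression(n_components=5)
      predicted = cross_val_predict(pls, spectra, gal_acid, cv=10).ravel()
      rmsecv = np.sqrt(np.mean((predicted - gal_acid) ** 2))
      print(f"RMSECV: {rmsecv:.2f} mg/g")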

  10. Calculus Instructors' Responses to Prior Knowledge Errors

    ERIC Educational Resources Information Center

    Talley, Jana Renee

    2009-01-01

    This study investigates the responses to prior knowledge errors that Calculus I instructors make when assessing students. Prior knowledge is operationalized as any skill or understanding that a student needs to successfully navigate through a Calculus I course. A two part qualitative study consisting of student exams and instructor interviews was…

  11. Verb Errors of Bilingual and Monolingual Basic Writers

    ERIC Educational Resources Information Center

    Griswold, Olga

    2017-01-01

    This study analyzed the grammatical control of verbs exercised by 145 monolingual English and Generation 1.5 bilingual developmental writers in narrative essays using quantitative and qualitative methods. Generation 1.5 students made more errors than their monolingual peers in each category investigated, albeit in only 2 categories was the…

  12. Protocol for the PINCER trial: a cluster randomised trial comparing the effectiveness of a pharmacist-led IT-based intervention with simple feedback in reducing rates of clinically important errors in medicines management in general practices

    PubMed Central

    Avery, Anthony J; Rodgers, Sarah; Cantrill, Judith A; Armstrong, Sarah; Elliott, Rachel; Howard, Rachel; Kendrick, Denise; Morris, Caroline J; Murray, Scott A; Prescott, Robin J; Cresswell, Kathrin; Sheikh, Aziz

    2009-01-01

    Background Medication errors are an important cause of morbidity and mortality in primary care. The aims of this study are to determine the effectiveness, cost effectiveness and acceptability of a pharmacist-led information-technology-based complex intervention compared with simple feedback in reducing proportions of patients at risk from potentially hazardous prescribing and medicines management in general (family) practice. Methods Research subject group: "At-risk" patients registered with computerised general practices in two geographical regions in England. Design: Parallel group pragmatic cluster randomised trial. Interventions: Practices will be randomised to either: (i) computer-generated feedback; or (ii) a pharmacist-led intervention comprising computer-generated feedback, educational outreach and dedicated support. Primary outcome measures: The proportion of patients in each practice at six and 12 months post intervention: - with a computer-recorded history of peptic ulcer being prescribed non-selective non-steroidal anti-inflammatory drugs - with a computer-recorded diagnosis of asthma being prescribed beta-blockers - aged 75 years and older receiving long-term prescriptions for angiotensin converting enzyme inhibitors or loop diuretics without a recorded assessment of renal function and electrolytes in the preceding 15 months. Secondary outcome measures: These relate to a number of other examples of potentially hazardous prescribing and medicines management. Economic analysis: An economic evaluation will be done of the cost per error avoided, from the perspective of the UK National Health Service (NHS), comparing the pharmacist-led intervention with simple feedback. Qualitative analysis: A qualitative study will be conducted to explore the views and experiences of health care professionals and NHS managers concerning the interventions, and investigate possible reasons why the interventions prove effective, or conversely prove ineffective. Sample size: 34 practices in each of the two treatment arms would provide at least 80% power (two-tailed alpha of 0.05) to demonstrate a 50% reduction in error rates for each of the three primary outcome measures in the pharmacist-led intervention arm compared with an 11% reduction in the simple feedback arm. Discussion At the time of submission of this article, 72 general practices have been recruited (36 in each arm of the trial) and the interventions have been delivered. Analysis has not yet been undertaken. Trial registration Current controlled trials ISRCTN21785299 PMID:19409095
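
    The protocol's sample-size statement rests on a standard comparison of two proportions. The Python sketch below, using statsmodels, reproduces only the flavour of that calculation: it ignores the clustering (design effect) that a cluster randomised trial must account for, and the 10% baseline error rate is a hypothetical figure, since no baseline is stated here:

      from statsmodels.stats.proportion import proportion_effectsize
      from statsmodels.stats.power import NormalIndPower

      baseline = 0.10                        # hypothetical baseline error rate
      p_feedback = baseline * (1 - 0.11)     # 11% relative reduction
      p_pharmacist = baseline * (1 - 0.50)   # 50% relative reduction

      effect = abs(proportion_effectsize(p_pharmacist, p_feedback))
      n_per_arm = NormalIndPower().solve_power(effect_size=effect, alpha=0.05,
                                               power=0.80, alternative='two-sided')
      print(f"patients needed per arm, ignoring clustering: {n_per_arm:.0f}")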

  13. Cognitive analysis as a way to understand students' problem-solving process in BODMAS rule

    NASA Astrophysics Data System (ADS)

    Ung, Ting Su; Kiong, Paul Lau Ngee; Manaf, Badron bin; Hamdan, Anniza Binti; Khium, Chen Chee

    2017-04-01

    Students tend to make many careless mistakes when solving mathematics problems. To facilitate effective learning, educators have to understand which cognitive processes students use and how these processes help them to solve problems. This paper aims to determine the common mathematics errors made by pre-diploma students who took Intensive Mathematics I (MAT037) in UiTM Sarawak, concentrating on errors in the topic of the BODMAS rule and the mental processes students developed that correspond to these errors. One class of pre-diploma students taking MAT037, taught by the researchers, was selected because the students had performed poorly in SPM mathematics and had likely finished secondary education with many misconceptions. The participants' solution scripts for all tutorials were collected. The study was predominantly qualitative: the solution scripts were content-analysed to identify the common errors committed by the participants and to generate possible mental processes underlying these errors, and selected students were interviewed during the process. The BODMAS errors could be further divided into Numerical Simplification and Powers Simplification, and the erroneous processes could be attributed to the categories of Basic Arithmetic Rules, Negative Numbers and Powers.

  14. Decomposition and correction overlapping peaks of LIBS using an error compensation method combined with curve fitting.

    PubMed

    Tan, Bing; Huang, Min; Zhu, Qibing; Guo, Ya; Qin, Jianwei

    2017-09-01

    The laser induced breakdown spectroscopy (LIBS) technique is an effective method to detect material composition by obtaining the plasma emission spectrum. Overlapping peaks in the spectrum are a fundamental problem in the qualitative and quantitative analysis of LIBS. Based on a curve fitting method, this paper studies an error compensation method to achieve the decomposition and correction of overlapping peaks. The vital step is that the fitting residual is fed back to the overlapping peaks and multiple curve fitting passes are performed to obtain a lower residual result. For the quantitative experiments on Cu, the Cu-Fe overlapping peaks in the range of 321-327 nm obtained from the LIBS spectrum of five different concentrations of CuSO₄·5H₂O solution were decomposed and corrected using the curve fitting and error compensation methods. Compared with the curve fitting method alone, the error compensation reduced the fitting residual by about 18.12-32.64% and improved the correlation by about 0.86-1.82%. Then, the calibration curve between the intensity and concentration of Cu was established. The error compensation method exhibits a higher linear correlation between the intensity and concentration of Cu, and can be applied to the decomposition and correction of overlapping peaks in the LIBS spectrum.
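
    The residual-feedback idea can be made concrete with a small Python sketch. This is not the paper's exact algorithm: it uses a simple iterative peak-stripping loop over two Lorentzian lines on synthetic data as one plausible way of feeding the fitting residual back into the decomposition:

      import numpy as np
      from scipy.optimize import curve_fit

      def lorentzian(x, a, c, w):
          # Lorentzian line with amplitude a, centre c and half-width w.
          return a * w**2 / ((x - c)**2 + w**2)

      def two_peaks(x, a1, c1, w1, a2, c2, w2):
          return lorentzian(x, a1, c1, w1) + lorentzian(x, a2, c2, w2)

      rng = np.random.default_rng(1)
      x = np.linspace(321.0, 327.0, 400)           # nm, as in the Cu-Fe window
      y = two_peaks(x, 1.0, 324.2, 0.30, 0.6, 324.9, 0.25)
      y += rng.normal(0.0, 0.01, x.size)           # synthetic noisy spectrum

      p, _ = curve_fit(two_peaks, x, y, p0=[0.8, 324.0, 0.4, 0.8, 325.2, 0.4])
      for _ in range(5):                           # residual-feedback iterations
          p1, p2 = p[:3], p[3:]
          p1, _ = curve_fit(lorentzian, x, y - lorentzian(x, *p2), p0=p1)
          p2, _ = curve_fit(lorentzian, x, y - lorentzian(x, *p1), p0=p2)
          p = np.concatenate([p1, p2])
      print("final residual norm:", np.linalg.norm(y - two_peaks(x, *p)))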

  15. Error and objectivity: cognitive illusions and qualitative research.

    PubMed

    Paley, John

    2005-07-01

    Psychological research has shown that cognitive illusions, of which visual illusions are just a special case, are systematic and pervasive, raising epistemological questions about how error in all forms of research can be identified and eliminated. The quantitative sciences make use of statistical techniques for this purpose, but it is not clear what the qualitative equivalent is, particularly in view of widespread scepticism about validity and objectivity. I argue that, in the light of cognitive psychology, the 'error question' cannot be dismissed as a positivist obsession, and that the concepts of truth and objectivity are unavoidable. However, they constitute only a 'minimal realism', which does not necessarily bring a commitment to 'absolute' truth, certainty, correspondence, causation, reductionism, or universal laws in its wake. The assumption that it does reflects a misreading of positivism and, ironically, precipitates a 'crisis of legitimation and representation', as described by constructivist authors.

  16. Barriers to Medical Error Reporting for Physicians and Nurses.

    PubMed

    Soydemir, Dilek; Seren Intepeler, Seyda; Mert, Hatice

    2017-10-01

    The purpose of the study was to determine what barriers to error reporting exist for physicians and nurses. The study, of descriptive qualitative design, was conducted with physicians and nurses working at a training and research hospital. In-depth interviews were held with eight physicians and 15 nurses, a total of 23 participants. Physicians and nurses do not choose to report medical errors that they experience or witness. When barriers to error reporting were examined, four main themes emerged: fear, the attitude of administration, barriers related to the system, and the employees' perceptions of error. Identifying the barriers that keep physicians and nurses from reporting errors is important for preventing medical errors.

  17. Analyzing students’ errors on fractions in the number line

    NASA Astrophysics Data System (ADS)

    Widodo, S.; Ikhwanudin, T.

    2018-05-01

    The objective of this study is to identify the types of errors students make when they deal with fractions on the number line. This study used a qualitative descriptive method and involved 31 sixth-grade students at one of the primary schools in Purwakarta, Indonesia. The results show four types of student errors: unit confusion, tick-mark interpretation errors, partitioning and un-partitioning errors, and estimation errors. We recommend that teachers strengthen students' understanding of units when studying fractions, help students understand tick-mark interpretation, remind students of the importance of partitioning and un-partitioning strategies, and teach effective estimation strategies.

  18. Perceptions and receptivity of non-spousal family support: A mixed methods study of psychological distress among older, church-going African American men

    PubMed Central

    Watkins, Daphne C.; Wharton, Tracy; Mitchell, Jamie A.; Matusko, Niki; Kales, Helen

    2016-01-01

    The purpose of this study was to explore the role of non-spousal family support on mental health among older, church-going African American men. The mixed methods objective was to employ a design that used existing qualitative and quantitative data to explore the interpretive context within which social and cultural experiences occur. Qualitative data (n=21) were used to build a conceptual model that was tested using quantitative data (n= 401). Confirmatory factor analysis indicated an inverse association between non-spousal family support and distress. The comparative fit index, Tucker-Lewis fit index, and root mean square error of approximation indicated good model fit. This study offers unique methodological approaches to using existing, complementary data sources to understand the health of African American men. PMID:28943829
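
    The fit indices named above are simple functions of the model and baseline (null-model) chi-square statistics. The Python helper below uses illustrative values, not the study's, to make the conventional cutoffs concrete (CFI/TLI near or above 0.95 and RMSEA below roughly 0.06 are usually read as good fit):

      import math

      def fit_indices(chi2_m, df_m, chi2_0, df_0, n):
          # CFI and TLI compare the fitted model against the baseline model;
          # RMSEA penalises misfit per degree of freedom and sample size.
          cfi = 1 - max(chi2_m - df_m, 0) / max(chi2_0 - df_0, chi2_m - df_m, 1e-12)
          tli = ((chi2_0 / df_0) - (chi2_m / df_m)) / ((chi2_0 / df_0) - 1)
          rmsea = math.sqrt(max(chi2_m - df_m, 0) / (df_m * (n - 1)))
          return cfi, tli, rmsea

      # Hypothetical chi-square values; n = 401 echoes the quantitative sample.
      print(fit_indices(chi2_m=55.2, df_m=40, chi2_0=900.0, df_0=55, n=401))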

  19. Qualitative Dimensions in Scoring the Rey Visual Memory Test of Malingering.

    ERIC Educational Resources Information Center

    Griffin, G. A. Elmer; And Others

    1996-01-01

    A new qualitative scoring system for the Rey Visual Memory Test was tested for its ability to distinguish between malingerers and nonmalingerers. The new system, based on the types of errors made, was able to distinguish between 53 psychiatrically disabled and 64 normal nonmalingerers, and between nonmalingerers and 91 possible malingerers. (SLD)

  20. Possibilities: A Framework for Modeling Students' Deductive Reasoning in Physics

    ERIC Educational Resources Information Center

    Gaffney, Jonathan David Housley

    2010-01-01

    Students often make errors when trying to solve qualitative or conceptual physics problems, and while many successful instructional interventions have been generated to prevent such errors, the process of deduction that students use when solving physics problems has not been thoroughly studied. In an effort to better understand that reasoning…

  1. Endoscopic non-technical skills team training: the next step in quality assurance of endoscopy training.

    PubMed

    Matharoo, Manmeet; Haycock, Adam; Sevdalis, Nick; Thomas-Gibson, Siwan

    2014-12-14

    To investigate whether novel, non-technical skills training for Bowel Cancer Screening (BCS) endoscopy teams enhanced patient safety knowledge and attitudes. A novel endoscopy team training intervention for BCS teams was developed and evaluated as a pre-post intervention study. Four multi-disciplinary BCS teams constituting BCS endoscopist(s), specialist screening practitioners, endoscopy nurses and administrative staff (A) from English BCS training centres participated. No patients were involved in this study. Expert multidisciplinary faculty delivered a single day's training utilising real clinical examples. Pre- and post-course evaluation comprised participants' patient safety awareness, attitudes, and knowledge. Global course evaluations were also collected. Twenty-three participants attended and their patient safety knowledge improved significantly from 43% to 55% (P ≤ 0.001) following the training intervention. 12/41 (29%) of the safety attitude items improved significantly in the areas of perceived patient safety knowledge and awareness. The remaining safety attitude items (perceived influence on patient safety, attitudes towards error management, error management actions and personal views following an error) were unchanged following training. Both qualitative and quantitative global course evaluations were positive: 21/23 (91%) participants strongly agreed/agreed that they were satisfied with the course. Qualitative evaluation included suggestions for mandating such training for endoscopy teams outside BCS and for incorporating team training within wider endoscopy training. Limitations of the study include no measure of increased patient safety in clinical practice following training. A novel comprehensive training package addressing patient safety, non-technical skills and adverse event analysis was successful in improving multi-disciplinary teams' knowledge and safety attitudes.

  2. Age of acquisition and naming performance in Frisian-Dutch bilingual speakers with dementia.

    PubMed

    Veenstra, Wencke S; Huisman, Mark; Miller, Nick

    2014-01-01

    Age of acquisition (AoA) of words is a recognised variable affecting language processing in speakers with and without language disorders. For bi- and multilingual speakers their languages can be differentially affected in neurological illness. Study of language loss in bilingual speakers with dementia has been relatively neglected. We investigated whether AoA of words was associated with level of naming impairment in bilingual speakers with probable Alzheimer's dementia within and across their languages. Twenty-six Frisian-Dutch bilinguals with mild to moderate dementia named 90 pictures in each language, employing items with rated AoA and other word variable measures matched across languages. Quantitative (totals correct) and qualitative (error types and (in)appropriate switching) aspects were measured. Impaired retrieval occurred in Frisian (Language 1) and Dutch (Language 2), with a significant effect of AoA on naming in both languages. Earlier acquired words were better preserved and retrieved. Performance was identical across languages, but better in Dutch when controlling for covariates. However, participants demonstrated more inappropriate code switching within the Frisian test setting. On qualitative analysis, no differences in overall error distribution were found between languages for early or late acquired words. There existed a significantly higher percentage of semantically than visually-related errors. These findings have implications for understanding problems in lexical retrieval among bilingual individuals with dementia and its relation to decline in other cognitive functions which may play a role in inappropriate code switching. We discuss the findings in the light of the close relationship between Frisian and Dutch and the pattern of usage across the life-span.

  3. Endoscopic non-technical skills team training: The next step in quality assurance of endoscopy training

    PubMed Central

    Matharoo, Manmeet; Haycock, Adam; Sevdalis, Nick; Thomas-Gibson, Siwan

    2014-01-01

    AIM: To investigate whether novel, non-technical skills training for Bowel Cancer Screening (BCS) endoscopy teams enhanced patient safety knowledge and attitudes. METHODS: A novel endoscopy team training intervention for BCS teams was developed and evaluated as a pre-post intervention study. Four multi-disciplinary BCS teams constituting BCS endoscopist(s), specialist screening practitioners, endoscopy nurses and administrative staff (A) from English BCS training centres participated. No patients were involved in this study. Expert multidisciplinary faculty delivered a single day’s training utilising real clinical examples. Pre- and post-course evaluation comprised participants’ patient safety awareness, attitudes, and knowledge. Global course evaluations were also collected. RESULTS: Twenty-three participants attended and their patient safety knowledge improved significantly from 43% to 55% (P ≤ 0.001) following the training intervention. 12/41 (29%) of the safety attitude items improved significantly in the areas of perceived patient safety knowledge and awareness. The remaining safety attitude items (perceived influence on patient safety, attitudes towards error management, error management actions and personal views following an error) were unchanged following training. Both qualitative and quantitative global course evaluations were positive: 21/23 (91%) participants strongly agreed/agreed that they were satisfied with the course. Qualitative evaluation included suggestions for mandating such training for endoscopy teams outside BCS and for incorporating team training within wider endoscopy training. Limitations of the study include no measure of increased patient safety in clinical practice following training. CONCLUSION: A novel comprehensive training package addressing patient safety, non-technical skills and adverse event analysis was successful in improving multi-disciplinary teams’ knowledge and safety attitudes. PMID:25516665

  4. Medication management of febrile children: a qualitative study on pharmacy employees' experiences.

    PubMed

    Stakenborg, Jacqueline P G; de Bont, Eefje G P M; Peetoom, Kirsten K B; Nelissen-Vrancken, Marjorie H J M G; Cals, Jochen W L

    2016-10-01

    Background While fever is mostly self-limiting, antibiotic prescription rates for febrile children are high. Although every parent who receives a prescription visits a pharmacy, we have limited insight into pharmacy employees' experiences with these parents. Pharmacy employees do, however, play an important role in ensuring children receive correct dosages and in advising parents on the administration of antibiotics. Objective To describe pharmacists' and pharmacy assistants' experiences with parents contacting a pharmacy for their febrile child, and to identify ways of improving the medication management of these children. Setting Community pharmacies in the Netherlands. Method A qualitative study was conducted comprising four focus group discussions among 24 Dutch pharmacy employees. Analysis was based on the constant comparative technique using open and axial coding. Main outcome measure Pharmacy employees' experiences with parents contacting a pharmacy for their febrile child. Results Three categories were identified: (1) workload and general experience, (2) inconsistent information on antibiotic prescriptions, (3) improving communication and collaboration. Pharmacy employees experienced that dosing errors in antibiotic prescriptions occur frequently and that doctors provide inconsistent information on prescriptions. Consequently, they have to contact doctors, resulting in a higher workload for both stakeholders. They believe this can be improved by providing the indication for antibiotics on prescriptions, especially when deviating from standard dosages. Conclusion Pharmacy employees experience a high number of dosing errors in paediatric antibiotic prescriptions. Providing the indication for antibiotics in febrile children on prescriptions, especially when deviating from standard dosages, can potentially reduce dosage errors and miscommunication between doctors and pharmacy employees.

  5. Nurses' perceptions of multitasking in the emergency department: effective, fun and unproblematic (at least for me) – a qualitative study.

    PubMed

    Forsberg, Helena Hvitfeldt; Muntlin Athlin, Åsa; von Thiele Schwarz, Ulrica

    2015-04-01

    The aim was to understand how multitasking is experienced by registered nurses and how it relates to their everyday practice in the emergency department. Interviews with open-ended questions were conducted with registered nurses (n = 9) working in one of two included emergency departments in Sweden. Data were analyzed using Schilling's structured model for qualitative content analysis. Three core concepts related to multitasking emerged from the interviews: 'multitasking - an attractive prerequisite for ED care'; 'multitasking implies efficiency' and 'multitasking is not stressful'. From these core concepts an additional theme emerged: '… and does not cause errors – at least for me', related to patient safety. This study shows how the patient load and the unreflected multitasking that follows relate to nurses' perceived efficiency and job satisfaction. It also shows that the relationship between multitasking and errors is perceived to be mediated by whom the actor is, and his or her level of experience. Findings from this study add value to the discourse on multitasking and the emergency department context, as few studies go beyond examining the quantitative aspect of interruptions and multitasking and how it is experienced by the staff in their everyday practice. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. The struggle against perceived negligence. A qualitative study of patients' experiences of adverse events in Norwegian hospitals.

    PubMed

    Hågensen, Gunn; Nilsen, Gudrun; Mehus, Grete; Henriksen, Nils

    2018-04-25

    Every year, 14% of patients in Norwegian hospitals experience adverse events, which often have health-damaging consequences. The government, hospital management and health personnel attempt to minimize such events. Limited research on the first-hand experience of the patients affected is available. The aim of this study is to present patients' perspectives of the occurrence of, disclosure of, and healthcare organizations' responses to adverse events. Findings are discussed within a social constructivist framework and with reference to principles of open disclosure policy. This qualitative study with an explorative descriptive design included fifteen in-depth interviews with former patients recruited by the Health and Social Services ombudsmen in the two northernmost counties of Norway. Inclusion criteria were as follows: 1) experience of adverse events in connection with surgical, orthopedic or medical treatment in general hospitals; 2) men and women; 3) aged 20-70; and 4) a minimum of one year since the event occurred. Transcribed audio-recorded interviews were analyzed through qualitative content analysis. The analysis revealed three main topics regarding patients' experiences of adverse events: 1) ignored concerns or signs of complications; 2) lack of responsibility and error correction; and 3) lack of support, loyalty and learning opportunities. Patients had to struggle to demonstrate the error that had occurred and to receive the necessary treatment and monitoring in the aftermath of the events. Patient narratives reveal a lack of openness, care and responsibility in connection with adverse events. Conflicting power structures, attitudes and established procedures may inhibit prevention, learning and patient safety work in spite of major efforts and good intentions. Attitudes in day-to-day patient care and organizational procedures should be challenged to invite patients into open disclosure processes and include them in health and safety work to a greater extent. The study's small sample of self-selected participants limits the generalizability of the findings, and future studies should include a larger number of patients as well as professional perspectives.

  7. Female residents experiencing medical errors in general internal medicine: a qualitative study.

    PubMed

    Mankaka, Cindy Ottiger; Waeber, Gérard; Gachoud, David

    2014-07-10

    Doctors, especially doctors-in-training such as residents, make errors. They have to face the consequences even though today's approach to errors emphasizes systemic factors. Doctors' individual characteristics play a role in how medical errors are experienced and dealt with. The role of gender has previously been examined in a few quantitative studies that have yielded conflicting results. In the present study, we sought to qualitatively explore the experience of female residents with respect to medical errors. In particular, we explored the coping mechanisms displayed after an error. This study took place in the internal medicine department of a Swiss university hospital. Within a phenomenological framework, semi-structured interviews were conducted with eight female residents in general internal medicine. All interviews were audiotaped, fully transcribed, and thereafter analyzed. Seven main themes emerged from the interviews: (1) A perception that there is an insufficient culture of safety and error; (2) The perceived main causes of errors, which included fatigue, work overload, inadequate level of competences in relation to assigned tasks, and dysfunctional communication; (3) Negative feelings in response to errors, which included different forms of psychological distress; (4) Variable attitudes of the hierarchy toward residents involved in an error; (5) Talking about the error, as the core coping mechanism; (6) Defensive and constructive attitudes toward one's own errors; and (7) Gender-specific experiences in relation to errors. Such experiences consisted of (a) perceptions that male residents were more confident and therefore less affected by errors than their female counterparts and (b) perceptions that sexist attitudes among male supervisors can occur and worsen an already painful experience. This study offers an in-depth account of how female residents specifically experience and cope with medical errors. Our interviews with female residents convey the sense that gender possibly influences the experience with errors, including the kind of coping mechanisms displayed. However, we acknowledge that the lack of a direct comparison between female and male participants represents a limitation when aiming to explore the role of gender.

  8. A mixed-methods analysis of patient reviews of hospital care in England: implications for public reporting of health care quality data in the United States.

    PubMed

    Lagu, Tara; Goff, Sarah L; Hannon, Nicholas S; Shatz, Amy; Lindenauer, Peter K

    2013-01-01

    In the United States, patients have limited opportunities to read and write narrative reviews about hospitals. In contrast, the National Health Service (NHS) in England encourages patients to provide feedback to hospitals on its quality-reporting website, NHS Choices. The scope and content of this narrative feedback were studied. All NHS hospitals with more than 10 reviews posted on NHS Choices were included in a cross-sectional mixed-methods (qualitative and quantitative) analysis of patients' reviews of 20 randomly selected hospitals. The final sample consisted of 264 hospitals and 2,640 patient responses to structured questions. All 200 reviews from the 20 randomly selected hospitals were subjected to further quantitative and qualitative analysis. Comments about clinicians and staff were common (179 [90%]) and overwhelmingly positive, with 149 (83%) favorable to workers. In 124 (62%) of the 200 reviews, patients commented on technical aspects of hospital care, including quality of care, injuries, errors, and incorrect medical record or discharge documentation. Perceived medical errors were described in 51 (26%) hospital reviews. Comments about the hospital facility appeared in half (52%) of reviews, describing hospital cleanliness, food, parking, and amenities. Hospitals replied to 56% of the patient reviews. NHS Choices represents the first government-run initiative that enables any patient to provide narrative feedback about hospital care. Reviews appear to cover similar domains to those in existing satisfaction surveys but also include detailed feedback that would be unlikely to be revealed by such surveys. Online narrative reviews can therefore provide useful and complementary information to consumers (patients) and hospitals, particularly when combined with systematically collected patient experience data.

  9. A Case Study on Improving Intensive Care Unit (ICU) Services Reliability: By Using Process Failure Mode and Effects Analysis (PFMEA)

    PubMed Central

    Yousefinezhadi, Taraneh; Jannesar Nobari, Farnaz Attar; Goodari, Faranak Behzadi; Arab, Mohammad

    2016-01-01

    Introduction: In any complex human system, human error is inevitable and cannot be eliminated by blaming wrongdoers. With the aim of improving the reliability of Intensive Care Units (ICUs) in hospitals, this research tries to identify and analyze ICU process failure modes from the standpoint of a systematic approach to errors. Methods: In this descriptive research, data were gathered qualitatively by observations, document reviews, and Focus Group Discussions (FGDs) with the process owners in two selected ICUs in Tehran in 2014. Data analysis, however, was quantitative, based on each failure's Risk Priority Number (RPN) from the Failure Mode and Effects Analysis (FMEA) method. In addition, some causes of failures were analyzed with the qualitative Eindhoven Classification Model (ECM). Results: Through the FMEA methodology, 378 potential failure modes from 180 ICU activities in hospital A and 184 potential failures from 99 ICU activities in hospital B were identified and evaluated. Then, at the 90% reliability level (RPN ≥ 100), a total of 18 failures in hospital A and 42 in hospital B were identified as non-acceptable risks, and their causes were analyzed by ECM. Conclusions: Applying the modified PFMEA to improve the reliability of processes in two selected ICUs in two different kinds of hospitals shows that this method empowers staff to identify, evaluate, prioritize and analyze all potential failure modes, and also makes them eager to identify their causes, recommend corrective actions and even participate in improving the process without feeling blamed by top management. Moreover, by combining FMEA and ECM, team members can easily identify failure causes from a health care perspective. PMID:27157162
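
    The FMEA arithmetic behind the study is compact enough to sketch in Python: each failure mode is scored for Severity, Occurrence and Detectability, the Risk Priority Number (RPN) is their product, and modes with RPN ≥ 100 are flagged as non-acceptable risks. The failure modes and scores below are hypothetical, not taken from the two ICUs:

      # (description, severity, occurrence, detectability), each on a 1-10 scale
      failure_modes = [
          ("wrong infusion rate programmed", 8, 4, 5),
          ("ventilator alarm limits not set", 9, 2, 4),
          ("hand hygiene step skipped",       5, 6, 3),
      ]

      scored = [(name, s * o * d) for name, s, o, d in failure_modes]
      non_acceptable = sorted((f for f in scored if f[1] >= 100),
                              key=lambda f: f[1], reverse=True)
      for name, rpn in non_acceptable:
          print(f"RPN {rpn:4d}  {name}")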

  10. Using incident reports to inform the prevention of medication administration errors.

    PubMed

    Härkänen, Marja; Saano, Susanna; Vehviläinen-Julkunen, Katri

    2017-11-01

    To describe ways of preventing medication administration errors based on reporters' views expressed in medication administration incident reports. Medication administration errors are very common, and nurses play important roles in committing and in preventing such errors. Thus far, incident reporters' perceptions of how to prevent medication administration errors have rarely been analysed. This is a qualitative, descriptive study using an inductive content analysis of the incident reports related to medication administration errors (n = 1012). These free-text descriptions include reporters' views on preventing the reoccurrence of medication administration errors. The data were collected from two hospitals in Finland and pertain to incidents that were reported between 1 January 2013 and 31 December 2014. Reporters' views on preventing medication administration errors were divided into three main categories related to individuals (health professionals), teams and organisations. The following categories related to individuals in preventing medication administration errors were identified: (1) accuracy and preciseness; (2) verification; and (3) following the guidelines, responsibility and attitude towards work. The team categories were as follows: (1) distribution of work; (2) flow of information and cooperation; and (3) documenting and marking the drug information. The categories related to organisation were as follows: (1) work environment; (2) resources; (3) training; (4) guidelines; and (5) development of the work. Health professionals should administer medication with a high moral awareness and an attempt to concentrate on the task. Nonetheless, the system should support health professionals by providing a reasonable work environment and encouraging collaboration among the providers to facilitate the safe administration of medication. Although there are numerous approaches to supporting medication safety, approaches that support the ability of individual health professionals to manage daily medications should be prioritised. © 2017 John Wiley & Sons Ltd.

  11. Ventilator-associated pneumonia: the influence of bacterial resistance, prescription errors, and de-escalation of antimicrobial therapy on mortality rates.

    PubMed

    Souza-Oliveira, Ana Carolina; Cunha, Thúlio Marquez; Passos, Liliane Barbosa da Silva; Lopes, Gustavo Camargo; Gomes, Fabiola Alves; Röder, Denise Von Dolinger de Brito

    2016-01-01

    Ventilator-associated pneumonia is the most prevalent nosocomial infection in intensive care units and is associated with high mortality rates (14-70%). This study evaluated factors influencing mortality of patients with ventilator-associated pneumonia (VAP), including bacterial resistance, prescription errors, and de-escalation of antibiotic therapy. This retrospective study included 120 cases of ventilator-associated pneumonia admitted to the adult intensive care unit of the Federal University of Uberlândia. The chi-square test was used to compare qualitative variables. Student's t-test was used for quantitative variables and multiple logistic regression analysis to identify independent predictors of mortality. De-escalation of antibiotic therapy and resistant bacteria did not influence mortality. Mortality was 4 times and 3 times higher, respectively, in patients who received an inappropriate antibiotic loading dose and in patients whose antibiotic dose was not adjusted for renal function. Multiple logistic regression analysis revealed that incorrect adjustment for renal function was the only independent factor associated with increased mortality. Prescription errors influenced mortality of patients with ventilator-associated pneumonia, underscoring the challenge of proper ventilator-associated pneumonia treatment, which requires continuous reevaluation to ensure that clinical response to therapy meets expectations. Copyright © 2016. Published by Elsevier Editora Ltda.
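
    As a sketch of the kind of multiple logistic regression reported, the Python fragment below uses statsmodels on a hypothetical data frame whose columns merely stand in for the study's variables; with real cohort data the exponentiated coefficients would be the adjusted odds ratios for mortality:

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      df = pd.DataFrame({
          "died":               rng.integers(0, 2, 120),
          "wrong_loading_dose": rng.integers(0, 2, 120),  # inappropriate loading dose
          "no_renal_adjust":    rng.integers(0, 2, 120),  # dose not renally adjusted
          "resistant_organism": rng.integers(0, 2, 120),
      })

      X = sm.add_constant(df[["wrong_loading_dose", "no_renal_adjust",
                              "resistant_organism"]])
      model = sm.Logit(df["died"], X).fit(disp=False)
      print(np.exp(model.params))   # odds ratios for each predictor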

  12. Error in intensive care: psychological repercussions and defense mechanisms among health professionals.

    PubMed

    Laurent, Alexandra; Aubert, Laurence; Chahraoui, Khadija; Bioy, Antoine; Mariage, André; Quenot, Jean-Pierre; Capellier, Gilles

    2014-11-01

    To identify the psychological repercussions of an error on professionals in intensive care and to understand their evolution. To identify the psychological defense mechanisms used by professionals to cope with error. Qualitative study with clinical interviews. We transcribed recordings and analysed the data using an interpretative phenomenological analysis. Two ICUs in the teaching hospitals of Besançon and Dijon (France). Forty professionals in intensive care (20 physicians and 20 nurses). None. We conducted 40 individual semi-structured interviews. The participants were invited to speak about the experience of error in ICU. The interviews were transcribed and analyzed thematically by three experts. In the month following the error, the professionals described feelings of guilt (53.8%) and shame (42.5%). These feelings were associated with anxiety states with rumination (37.5%) and fear for the patient (23%); a loss of confidence (32.5%); an inability to verbalize one's error (22.5%); questioning oneself at a professional level (20%); and anger toward the team (15%). In the long term, the error remains fixed in memory for many of the subjects (80%); on one hand, for 72.5%, it was associated with an increase in vigilance and verifications in their professional practice, and on the other hand, for three professionals, it was associated with a loss of confidence. Finally, three professionals felt guilt which still persisted at the time of the interview. We also observed different defense mechanisms implemented by the professional to fight against the emotional load inherent in the error: verbalization (70%), developing skills and knowledge (43%), rejecting responsibility (32.5%), and avoidance (23%). We also observed a minimization (60%) of the error during the interviews. It is important to take into account the psychological experience of error and the defense mechanisms developed following an error because they appear to determine the professional's capacity to acknowledge and disclose his/her error and to learn from it.

  13. Central Coherence Theory and the Interpretation of Picture Materials: Toward an Assessment of Autistic Spectrum Disorder.

    ERIC Educational Resources Information Center

    Worth, Sarah

    2003-01-01

    This study compared responses of 16 pupils either with or without autistic spectrum disorder (ASD) and matched for gender and verbal ability. Subjects' responses to various pictures were categorized. Results suggested errors made by the two groups differed both quantitatively and qualitatively. Errors made by pupils with ASD were largely…

  14. Exploring the feelings of Iranian women of reproductive age about health care seeking behavior: a qualitative study.

    PubMed

    Morowatisharifabad, Mohammad Ali; Rahimi, Tahereh; Farajkhoda, Tahmineh; Fallahzadeh, Hossein; Mohebi, Siamak

    2018-01-01

    Background: Despite the important role of feelings in health care seeking behavior (HCSB), this subject has not yet been adequately investigated. HCSB-related feelings begin with the onset of disease symptoms and persist in different forms after treatment. The aim of the current study was to explore the feelings that women of reproductive age experience when they seek health care. Methods: In this deductive, qualitative content analysis, participants were selected by purposeful sampling. Semi-structured, in-depth interviews with 17 women of reproductive age and 5 health care staff in Qom, Iran were carried out until data saturation was achieved. Qualitative data were concurrently analyzed by deductive content analysis, using the Health Promotion Model (HPM). The MAXQDA10 software was used to manage qualitative data analysis. Results: Three main categories were drawn from the data to explain the HCSB-related feelings of participants: (1) feeling of inner satisfaction with the treatment, with 2 subcategories, "peace of mind" and "feeling alive"; (2) multiple roles of fear, with 5 subcategories, "fear about the consequences of delay", "fear of having hidden diseases", "fear of unknown experiences", "fear of hearing bad news" and "fear of medical errors"; and (3) uncomfortable feelings, with 3 subcategories, "feeling uneasy when attending a health facility", "feeling embarrassed" and "feeling worthless when dealing with the doctor". Conclusion: This study revealed that the inner feelings of women varied widely, ranging from positive or motivating feelings to negative or inhibitory ones, given their experiences with the formal health care system and the current situation of medical and health services. Highlighting patients' perceived inner satisfaction and reducing fear and uncomfortable feelings by adopting culture-based practical strategies can enhance women's HCSB.

  15. Analysis of Student Errors on Division of Fractions

    NASA Astrophysics Data System (ADS)

    Maelasari, E.; Jupri, A.

    2017-02-01

    This study aims to describe the types of errors students typically make when carrying out division operations on fractions, and to describe the causes of those mistakes. The research used a descriptive qualitative method and involved 22 fifth-grade students at one elementary school in Kuningan, Indonesia. The results showed that students' erroneous answers arose from applying the same procedures to both multiplication and division operations, from confusion when converting mixed fractions to common fractions, and from carelessness in calculation. From the students' written work we found that the learning method used influenced student responses, and that some responses went beyond the researchers' predictions. We conclude that teachers should prepare not only the teaching method but also predictions of students' answers to the problems that will be given during the learning process. This could serve as a reflection for teachers to improve and to achieve the expected learning goals.

  16. Analysis of methods commonly used in biomedicine for treatment versus control comparison of very small samples.

    PubMed

    Ristić-Djurović, Jasna L; Ćirković, Saša; Mladenović, Pavle; Romčević, Nebojša; Trbovich, Alexander M

    2018-04-01

    A rough estimate indicated that the use of samples of size not larger than ten is not uncommon in biomedical research, and that many such studies are limited to strong effects due to sample sizes smaller than six. For data collected from biomedical experiments it is also often unknown whether the mathematical requirements incorporated in the sample comparison methods are satisfied. Computer simulated experiments were used to examine the performance of methods for qualitative sample comparison and its dependence on the effectiveness of exposure, effect intensity, distribution of studied parameter values in the population, and sample size. The Type I and Type II errors, their average, as well as the maximal errors were considered. A sample size of 9 and the t-test method with p = 5% ensured an error smaller than 5% even for weak effects. For sample sizes 6-8 the same method enabled detection of weak effects with errors smaller than 20%. If the sample sizes were 3-5, weak effects could not be detected with an acceptable error; however, the smallest maximal error in the most general case that includes weak effects is achieved by the standard error of the mean method. The increase of sample size from 5 to 9 led to seven times more accurate detection of weak effects. Strong effects were detected regardless of the sample size and method used. The minimal recommended sample size for biomedical experiments is 9. Use of smaller sizes and the method of their comparison should be justified by the objective of the experiment. Copyright © 2018 Elsevier B.V. All rights reserved.
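
    The paper's computer-simulated experiments reduce to a loop that is easy to sketch in Python: repeatedly draw small samples with and without a true effect and count how often the t-test errs. The normal distributions and effect size below are illustrative choices, not the authors' full design:

      import numpy as np
      from scipy.stats import ttest_ind

      def error_rates(n, effect, trials=20_000, alpha=0.05, seed=0):
          rng = np.random.default_rng(seed)
          false_pos = false_neg = 0
          for _ in range(trials):
              control = rng.normal(0.0, 1.0, n)
              same    = rng.normal(0.0, 1.0, n)     # no true effect
              treated = rng.normal(effect, 1.0, n)  # true effect present
              if ttest_ind(control, same).pvalue < alpha:
                  false_pos += 1                    # Type I error
              if ttest_ind(control, treated).pvalue >= alpha:
                  false_neg += 1                    # Type II error
          return false_pos / trials, false_neg / trials

      for n in (3, 6, 9):
          t1, t2 = error_rates(n, effect=1.5)
          print(f"n={n}: Type I ~ {t1:.3f}, Type II ~ {t2:.3f}")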

  17. Qualitative and quantitative assessment of Illumina's forensic STR and SNP kits on MiSeq FGx™.

    PubMed

    Sharma, Vishakha; Chow, Hoi Yan; Siegel, Donald; Wurmbach, Elisa

    2017-01-01

    Massively parallel sequencing (MPS) is a powerful tool transforming DNA analysis in multiple fields ranging from medicine, to environmental science, to evolutionary biology. In forensic applications, MPS offers the ability to significantly increase the discriminatory power of human identification as well as aid in mixture deconvolution. However, before the benefits of any new technology can be employed, its quality, consistency, sensitivity, and specificity must be rigorously evaluated in order to gain a detailed understanding of the technique, including sources of error, error rates, and other restrictions/limitations. This extensive study assessed the performance of Illumina's MiSeq FGx MPS system and ForenSeq™ kit in nine experimental runs including 314 reaction samples. In-depth data analysis evaluated the consequences of different assay conditions on test results. Variables included: sample numbers per run, targets per run, DNA input per sample, and replications. Results are presented as heat maps revealing patterns for each locus. Data analysis focused on read numbers (allele coverage), drop-outs, drop-ins, and sequence analysis. The study revealed that loci with high read numbers performed better and resulted in fewer drop-outs and well balanced heterozygous alleles. Several loci were prone to drop-outs, which led to falsely typed homozygotes and therefore to genotype errors. Sequence analysis of allele drop-in typically revealed a single nucleotide change (deletion, insertion, or substitution). Analyses of sequences, no-template controls, and spurious alleles suggest no contamination during library preparation, pooling, and sequencing, but indicate that sequencing or PCR errors may have occurred due to DNA polymerase infidelities. Finally, we found that utilizing Illumina's FGx System under recommended conditions does not guarantee complete results for all samples tested, including the positive control, and that manual editing was required due to low read numbers and/or allele drop-in. These findings are important for progressing towards implementation of MPS in forensic DNA testing.
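
    Two of the recurring observations, drop-outs at low read numbers and imbalanced heterozygotes, reduce to simple checks on allele read counts. The thresholds and counts in this Python sketch are hypothetical illustrations, not Illumina's analytical settings:

      MIN_READS = 30       # hypothetical locus coverage threshold
      MIN_BALANCE = 0.5    # hypothetical heterozygote balance cutoff

      loci = {             # locus -> (reads for allele 1, reads for allele 2)
          "D3S1358": (412, 388),
          "TH01":    (35, 12),
          "FGA":     (8, 0),
      }

      for locus, (r1, r2) in loci.items():
          total = r1 + r2
          if total < MIN_READS:
              print(f"{locus}: possible drop-out (only {total} reads)")
          elif min(r1, r2) > 0 and min(r1, r2) / max(r1, r2) < MIN_BALANCE:
              print(f"{locus}: imbalanced heterozygote, allele drop-out risk")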

  18. Advanced image fusion algorithms for Gamma Knife treatment planning. Evaluation and proposal for clinical use.

    PubMed

    Apostolou, N; Papazoglou, Th; Koutsouris, D

    2006-01-01

    Image fusion is a process of combining information from multiple sensors. It is a useful tool implemented in the treatment planning programme of Gamma Knife Radiosurgery. In this paper we evaluate advanced image fusion algorithms on the Matlab platform using head images. We develop nine grayscale image fusion methods in Matlab: average, principal component analysis (PCA), discrete wavelet transform (DWT), Laplacian, filter-subtract-decimate (FSD), contrast, gradient, morphological pyramid, and shift-invariant discrete wavelet transform (SIDWT) methods. We test these methods qualitatively and quantitatively. The quantitative criteria we use are the Root Mean Square Error (RMSE), the Mutual Information (MI), the Standard Deviation (STD), the Entropy (H), the Difference Entropy (DH) and the Cross Entropy (CEN). The qualitative criteria are: natural appearance, brilliance contrast, presence of complementary features and enhancement of common features. Finally we make clinically useful suggestions.
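
    Two of the quantitative criteria listed, RMSE and entropy, are straightforward to compute. The Python sketch below (the study itself worked in Matlab) applies them to hypothetical 8-bit grayscale images:

      import numpy as np

      def rmse(reference, fused):
          diff = reference.astype(float) - fused.astype(float)
          return np.sqrt(np.mean(diff ** 2))

      def entropy(image):
          # Shannon entropy of the grey-level histogram, in bits.
          hist, _ = np.histogram(image, bins=256, range=(0, 256))
          p = hist / hist.sum()
          p = p[p > 0]
          return -np.sum(p * np.log2(p))

      rng = np.random.default_rng(0)
      reference = rng.integers(0, 256, (128, 128), dtype=np.uint8)
      noise = rng.integers(-8, 9, (128, 128))
      fused = np.clip(reference + noise, 0, 255).astype(np.uint8)
      print(f"RMSE: {rmse(reference, fused):.2f}, entropy: {entropy(fused):.2f} bits")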

  19. Qualitative analysis of factors affecting adherence to the phenylketonuria diet in adolescents.

    PubMed

    Sharman, Rachael; Mulgrew, Kate; Katsikitis, Mary

    2013-01-01

    Phenylketonuria (PKU) is an inborn error of metabolism that is primarily treated with a severely restricted, low-protein diet to prevent permanent neurological damage. Despite the recognition of the importance of strict dietary adherence in the prevention of intellectual impairment in individuals with PKU, apathy and attrition from diet, especially during adolescence, remain a threat to normal development in this population. This study's aim was to examine adolescents' perception of factors that encourage or inhibit their dietary adherence. This was a qualitative study, with the authors using thematic analysis to interpret the findings. The study was conducted as part of a Metabolic Disorders Association conference. Eight adolescents with PKU were recruited through convenience sampling. A focus group was conducted with the adolescents to gather information about factors that encourage and discourage dietary adherence. Thematic analysis revealed that the adolescents encountered problems explaining the nature and food requirements of their condition to other people. Friends, family, and wanting to maintain "normal" cognitive abilities were identified as factors that encouraged dietary adherence. Adolescents with PKU appear to share several barriers and incentives for maintaining the strict dietary regimen. Considering such perceptions may aid future interventions aiming to reduce diet attrition rates among adolescents.

  20. Multistrip Western blotting: a tool for comparative quantitative analysis of multiple proteins.

    PubMed

    Aksamitiene, Edita; Hoek, Jan B; Kiyatkin, Anatoly

    2015-01-01

    The qualitative and quantitative measurements of protein abundance and modification states are essential in understanding their functions in diverse cellular processes. Typical Western blotting, though sensitive, is prone to produce substantial errors and is not readily adapted to high-throughput technologies. Multistrip Western blotting is a modified immunoblotting procedure based on simultaneous electrophoretic transfer of proteins from multiple strips of polyacrylamide gels to a single membrane sheet. In comparison with the conventional technique, Multistrip Western blotting increases data output per single blotting cycle up to tenfold; allows concurrent measurement of up to nine different total and/or posttranslationally modified proteins from the same sample loading; and substantially improves data accuracy by reducing immunoblotting-derived signal errors. This approach enables statistically reliable comparison of different or repeated sets of data and is therefore advantageous in biomedical diagnostics, systems biology, and cell signaling research.

  1. Multi-template tensor-based morphometry: Application to analysis of Alzheimer's disease

    PubMed Central

    Koikkalainen, Juha; Lötjönen, Jyrki; Thurfjell, Lennart; Rueckert, Daniel; Waldemar, Gunhild; Soininen, Hilkka

    2012-01-01

    In this paper methods for using multiple templates in tensor-based morphometry (TBM) are presented and compared to the conventional single-template approach. TBM analysis requires non-rigid registrations which are often subject to registration errors. When using multiple templates and, therefore, multiple registrations, it can be assumed that the registration errors are averaged and eventually compensated. Four different methods are proposed for multi-template TBM. The methods were evaluated using magnetic resonance (MR) images of healthy controls, patients with stable or progressive mild cognitive impairment (MCI), and patients with Alzheimer's disease (AD) from the ADNI database (N=772). The performance of TBM features in classifying images was evaluated both quantitatively and qualitatively. Classification results show that the multi-template methods are statistically significantly better than the single-template method. The overall classification accuracy was 86.0% for the classification of control and AD subjects, and 72.1% for the classification of stable and progressive MCI subjects. The statistical group-level difference maps produced using multi-template TBM were smoother, formed larger continuous regions, and had larger t-values than the maps obtained with single-template TBM. PMID:21419228

  2. Astronaut Biography Project for Countermeasures of Human Behavior and Performance Risks in Long Duration Space Flights

    NASA Technical Reports Server (NTRS)

    Banks, Akeem

    2012-01-01

    This final report summarizes research that relates to the human behavioral health and performance of astronauts and flight controllers. Literature reviews, data archival analyses, and ground-based analog studies that center on the risks of human space flight are being used to help mitigate human behavior and performance risks on long duration space flights. A qualitative analysis of an astronaut autobiography was completed. An analysis was also conducted on exercise countermeasure publications to show the positive effects of exercise on the risks targeted in this study. The three main risks targeted in this study are the risk of behavioral and psychiatric disorders, the risk of performance errors due to poor team performance, cohesion, and composition, and the risk of performance errors due to sleep deprivation and circadian rhythm disruption. These three risks focus on psychological and physiological aspects of astronauts who venture into space on long duration missions. The purpose of this research is to target these risks in order to help quantify, identify, and mature the countermeasures and technologies required to prevent or mitigate adverse outcomes from exposure to the spaceflight environment.

  3. Error analysis of mathematics students who are taught by using the book of mathematics learning strategy in solving pedagogical problems based on Polya’s four-step approach

    NASA Astrophysics Data System (ADS)

    Halomoan Siregar, Budi; Dewi, Izwita; Andriani, Ade

    2018-03-01

    The purpose of this study is to analyse the types of errors students make in solving pedagogical problems and their causes. The research is qualitative descriptive, conducted with 34 mathematics education students in the 2017 to 2018 academic year. The data were obtained through interviews and tests and analysed in three stages: 1) data reduction, 2) data description, and 3) conclusions. The data were reduced by organizing and classifying them to obtain meaningful information, then presented as simple narrative, graphics, and tables to clearly illustrate the students' errors, from which conclusions were drawn. The results indicate that the students made various errors: 1) they answered something other than what the problem asked because they misunderstood the problem; 2) they failed to plan the learning process based on constructivism, owing to a lack of understanding of how to design the learning; and 3) they chose inappropriate learning tools because they did not understand what kind of learning tool was relevant to use.

  4. Age of acquisition and naming performance in Frisian-Dutch bilingual speakers with dementia

    PubMed Central

    Veenstra, Wencke S.; Huisman, Mark; Miller, Nick

    2014-01-01

    Age of acquisition (AoA) of words is a recognised variable affecting language processing in speakers with and without language disorders. For bi- and multilingual speakers their languages can be differentially affected in neurological illness. Study of language loss in bilingual speakers with dementia has been relatively neglected. Objective We investigated whether AoA of words was associated with level of naming impairment in bilingual speakers with probable Alzheimer's dementia within and across their languages. Methods Twenty-six Frisian-Dutch bilinguals with mild to moderate dementia named 90 pictures in each language, employing items with rated AoA and other word variable measures matched across languages. Quantitative (totals correct) and qualitative (error types and (in)appropriate switching) aspects were measured. Results Impaired retrieval occurred in Frisian (Language 1) and Dutch (Language 2), with a significant effect of AoA on naming in both languages. Earlier acquired words were better preserved and retrieved. Performance was identical across languages, but better in Dutch when controlling for covariates. However, participants demonstrated more inappropriate code switching within the Frisian test setting. On qualitative analysis, no differences in overall error distribution were found between languages for early or late acquired words. There existed a significantly higher percentage of semantically than visually-related errors. Conclusion These findings have implications for understanding problems in lexical retrieval among bilingual individuals with dementia and its relation to decline in other cognitive functions which may play a role in inappropriate code switching. We discuss the findings in the light of the close relationship between Frisian and Dutch and the pattern of usage across the life-span. PMID:29213911

  5. Text Classification for Assisting Moderators in Online Health Communities

    PubMed Central

    Huh, Jina; Yetisgen-Yildiz, Meliha; Pratt, Wanda

    2013-01-01

    Objectives Patients increasingly visit online health communities to get help with managing their health. The large scale of these communities makes it impossible for moderators to engage in all conversations, yet some conversations need their expertise. Our work explores low-cost text classification methods for this new domain of determining whether a thread in an online health forum needs moderators' help. Methods We employed a binary classifier on WebMD's online diabetes community data. To train the classifier, we considered three feature types: (1) word unigrams, (2) sentiment analysis features, and (3) thread length. We applied feature selection methods based on χ2 statistics and undersampling to account for unbalanced data. We then performed a qualitative error analysis to investigate the appropriateness of the gold standard. Results Using sentiment analysis features, feature selection methods, and balanced training data increased the AUC value up to 0.75 and the F1-score up to 0.54, compared to the baseline of using word unigrams with no feature selection methods on unbalanced data (0.65 AUC and 0.40 F1-score). The error analysis uncovered additional reasons why moderators respond to patients' posts. Discussion We showed how feature selection methods and balanced training data can improve overall classification performance, and we present implications of weighing precision versus recall for assisting moderators of online health communities. Our error analysis uncovered social, legal, and ethical issues around addressing community members' needs. We also note challenges in producing a gold standard and discuss potential solutions for addressing them. Conclusion Social media environments provide popular venues in which patients gain health-related information. Our work contributes to understanding scalable solutions for providing moderators' expertise in these large-scale social media environments. PMID:24025513
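
    A minimal sketch of the kind of pipeline the abstract describes, assuming scikit-learn: unigram features, chi-squared feature selection, and random undersampling of the majority class, scored by AUC. The threads and labels below are invented placeholders, not WebMD data.

      import numpy as np
      from sklearn.feature_extraction.text import CountVectorizer
      from sklearn.feature_selection import SelectKBest, chi2
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score
      from sklearn.pipeline import make_pipeline

      # Hypothetical forum threads; label 1 = thread needs a moderator's help.
      threads = ["my sugar is 300 what should i do", "anyone tried this recipe",
                 "insulin dose question urgent", "hello from ohio"]
      labels = np.array([1, 0, 1, 0])

      # Undersample the majority class so training data are balanced.
      rng = np.random.default_rng(0)
      pos, neg = np.where(labels == 1)[0], np.where(labels == 0)[0]
      keep = np.concatenate([pos, rng.choice(neg, size=len(pos), replace=False)])

      # Unigram counts -> chi-squared feature selection -> linear classifier.
      clf = make_pipeline(CountVectorizer(),
                          SelectKBest(chi2, k=5),
                          LogisticRegression())
      clf.fit([threads[i] for i in keep], labels[keep])
      print(roc_auc_score(labels, clf.predict_proba(threads)[:, 1]))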

  6. Protocol for the PINCER trial: a cluster randomised trial comparing the effectiveness of a pharmacist-led IT-based intervention with simple feedback in reducing rates of clinically important errors in medicines management in general practices.

    PubMed

    Avery, Anthony J; Rodgers, Sarah; Cantrill, Judith A; Armstrong, Sarah; Elliott, Rachel; Howard, Rachel; Kendrick, Denise; Morris, Caroline J; Murray, Scott A; Prescott, Robin J; Cresswell, Kathrin; Sheikh, Aziz

    2009-05-01

    Medication errors are an important cause of morbidity and mortality in primary care. The aims of this study are to determine the effectiveness, cost-effectiveness and acceptability of a pharmacist-led information-technology-based complex intervention compared with simple feedback in reducing proportions of patients at risk from potentially hazardous prescribing and medicines management in general (family) practice. RESEARCH SUBJECT GROUP: "At-risk" patients registered with computerised general practices in two geographical regions in England. Parallel group pragmatic cluster randomised trial. Practices will be randomised to either: (i) computer-generated feedback; or (ii) a pharmacist-led intervention comprising computer-generated feedback, educational outreach and dedicated support. PRIMARY OUTCOME MEASURES: The proportion of patients in each practice at six and 12 months post-intervention: - with a computer-recorded history of peptic ulcer being prescribed non-selective non-steroidal anti-inflammatory drugs; - with a computer-recorded diagnosis of asthma being prescribed beta-blockers; - aged 75 years and older receiving long-term prescriptions for angiotensin converting enzyme inhibitors or loop diuretics without a recorded assessment of renal function and electrolytes in the preceding 15 months. SECONDARY OUTCOME MEASURES: These relate to a number of other examples of potentially hazardous prescribing and medicines management. An economic evaluation of the cost per error avoided will be conducted from the perspective of the UK National Health Service (NHS), comparing the pharmacist-led intervention with simple feedback. QUALITATIVE ANALYSIS: A qualitative study will be conducted to explore the views and experiences of health care professionals and NHS managers concerning the interventions, and investigate possible reasons why the interventions prove effective, or conversely prove ineffective. 34 practices in each of the two treatment arms would provide at least 80% power (two-tailed alpha of 0.05) to demonstrate a 50% reduction in error rates for each of the three primary outcome measures in the pharmacist-led intervention arm compared with an 11% reduction in the simple feedback arm. At the time of submission of this article, 72 general practices have been recruited (36 in each arm of the trial) and the interventions have been delivered. Analysis has not yet been undertaken.
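
    A rough sanity check of the flavour of such a sample-size statement is sketched below with statsmodels. It ignores the cluster design entirely (no intracluster correlation or design effect, which the protocol's own calculation must include) and assumes a purely hypothetical 10% baseline error rate, so it illustrates the arithmetic only, not the trial's actual calculation.

      from statsmodels.stats.power import NormalIndPower
      from statsmodels.stats.proportion import proportion_effectsize

      baseline = 0.10                     # hypothetical baseline error rate
      p_feedback = baseline * (1 - 0.11)  # 11% relative reduction (simple feedback)
      p_pharm = baseline * (1 - 0.50)     # 50% relative reduction (pharmacist-led)

      es = abs(proportion_effectsize(p_pharm, p_feedback))
      n = NormalIndPower().solve_power(effect_size=es, alpha=0.05, power=0.80,
                                       alternative='two-sided')
      print(f"about {n:.0f} patients per arm before any clustering adjustment")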

  7. A mathematical approach to beam matching

    PubMed Central

    Manikandan, A; Nandy, M; Gossman, M S; Sureka, C S; Ray, A; Sujatha, N

    2013-01-01

    Objective: This report provides the mathematical commissioning instructions for the evaluation of beam matching between two different linear accelerators. Methods: Test packages were first obtained, including an open beam profile, a wedge beam profile and a depth-dose curve, each from a 10×10 cm² beam. From these plots, a spatial error (SE) and a percentage dose error were introduced to form new plots. These three test package curves and the associated error curves were then differentiated once and twice with respect to position to determine the slope and curvature of each data set. The derivatives, also known as bandwidths, were analysed to determine the level of acceptability for the beam matching test described in this study. Results: The open and wedged beam profiles and the depth-dose curve in the build-up region were determined to match within 1% dose error and 1-mm SE for 71.4% and 70.8% of all points, respectively. For the depth-dose analysis specifically, beam matching was achieved for 96.8% of all points at 1%/1 mm beyond the depth of maximum dose. Conclusion: To quantify the beam matching procedure in any clinic, the user needs merely to generate test packages from their reference linear accelerator. It then follows that if the bandwidths are smooth and continuous across the profile and depth, there is a greater likelihood of beam matching. Differentiated spatial and percentage variation analysis is appropriate, ideal and accurate for this commissioning process. Advances in knowledge: We report a mathematically rigorous formulation for the qualitative evaluation of beam matching between linear accelerators. PMID:23995874
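
    A small numpy sketch of the "bandwidth" idea, on synthetic profiles rather than measured test packages: differentiate each curve once and twice with respect to position, and check point-by-point dose agreement against the 1% criterion.

      import numpy as np

      x = np.linspace(-100, 100, 401)  # off-axis position, mm
      # Synthetic sigmoid-edged profiles; machine B is shifted by 0.5 mm.
      profile_a = 1 / (1 + np.exp((np.abs(x) - 50) / 3))
      profile_b = 1 / (1 + np.exp((np.abs(x + 0.5) - 50) / 3))

      # First and second derivatives ("bandwidths") of each profile.
      slope_a, slope_b = np.gradient(profile_a, x), np.gradient(profile_b, x)
      curv_a, curv_b = np.gradient(slope_a, x), np.gradient(slope_b, x)

      # Fraction of points agreeing within a 1% dose difference.
      dose_ok = np.abs(profile_b - profile_a) <= 0.01
      print(f"{100 * dose_ok.mean():.1f}% of points within 1% dose")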

  8. Assessing the impact of representational and contextual problem features on student use of right-hand rules

    NASA Astrophysics Data System (ADS)

    Kustusch, Mary Bridget

    2016-06-01

    Students in introductory physics struggle with vector algebra and these challenges are often associated with contextual and representational features of the problems. Performance on problems about cross product direction is particularly poor and some research suggests that this may be primarily due to misapplied right-hand rules. However, few studies have had the resolution to explore student use of right-hand rules in detail. This study reviews literature in several disciplines, including spatial cognition, to identify ten contextual and representational problem features that are most likely to influence performance on problems requiring a right-hand rule. Two quantitative measures of performance (correctness and response time) and two qualitative measures (methods used and type of errors made) were used to explore the impact of these problem features on student performance. Quantitative results are consistent with expectations from the literature, but reveal that some features (such as the type of reasoning required and the physical awkwardness of using a right-hand rule) have a greater impact than others (such as whether the vectors are placed together or separate). Additional insight is gained by the qualitative analysis, including identifying sources of difficulty not previously discussed in the literature and revealing that the use of supplemental methods, such as physically rotating the paper, can mitigate errors associated with certain features.

  9. A Case Study of Teacher Responses to a Doubling Error and Difficulty in Learning Equivalent Fractions

    ERIC Educational Resources Information Center

    Ding, Meixia; Li, Xiaobao; Capraro, Mary Margaret; Kulm, Gerald

    2012-01-01

    This study qualitatively explored teachers' responses to doubling errors (e.g., 3/4 x 2 = 6/8) that typically reflect students' difficulties in understanding the "rule" for finding equivalent fractions (e.g., 3/4 x 2/2 = 6/8). Although all teachers claimed to teach for understanding in interviews, their responses varied in terms of effectiveness…

  10. Multiple imputation of missing fMRI data in whole brain analysis

    PubMed Central

    Vaden, Kenneth I.; Gebregziabher, Mulugeta; Kuchinsky, Stefanie E.; Eckert, Mark A.

    2012-01-01

    Whole brain fMRI analyses rarely include the entire brain because of missing data that result from data acquisition limits and susceptibility artifact in particular. This missing data problem is typically addressed by omitting voxels from analysis, which may exclude brain regions of theoretical interest and increase the potential for Type II error at cortical boundaries or Type I error when spatial thresholds are used to establish significance. Imputation could significantly expand statistical map coverage, increase power, and enhance interpretations of fMRI results. We examined multiple imputation for group-level analyses of missing fMRI data using methods that leverage the spatial information in fMRI datasets, for both real and simulated data. Available case analysis, neighbor replacement, and regression-based imputation approaches were compared in a general linear model framework to determine the extent to which these methods quantitatively (effect size) and qualitatively (spatial coverage) increased the sensitivity of group analyses. In both real and simulated data analyses, multiple imputation provided 1) variance that was most similar to estimates for voxels with no missing data, 2) fewer false positive errors in comparison to mean replacement, and 3) fewer false negative errors in comparison to available case analysis. Compared to the standard approach of omitting voxels with missing data, imputation methods increased brain coverage in this study by 35% (from 33,323 to 45,071 voxels). In addition, multiple imputation increased the size of significant clusters by 58% and the number of significant clusters across statistical thresholds, compared to the standard voxel omission approach. While neighbor replacement produced similar results, we recommend multiple imputation because it uses an informed sampling distribution to deal with missing data across subjects that can include neighbor values and other predictors. Multiple imputation is anticipated to be particularly useful for 1) large fMRI data sets with inconsistent missing voxels across subjects and 2) addressing the problem of increased artifact at ultra-high field, which significantly limits the extent of whole brain coverage and interpretation of results. PMID:22500925
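
    A minimal multiple-imputation sketch, assuming scikit-learn's IterativeImputer as a stand-in for the paper's regression-based approach: missing voxel values are imputed m times with posterior draws, and the per-voxel group means are pooled across the completed data sets.

      import numpy as np
      from sklearn.experimental import enable_iterative_imputer  # noqa: F401
      from sklearn.impute import IterativeImputer

      rng = np.random.default_rng(1)
      data = rng.normal(size=(20, 6))               # 20 subjects x 6 voxels
      data[rng.random(data.shape) < 0.15] = np.nan  # simulated missing voxels

      m = 5  # number of imputations
      completed = [IterativeImputer(sample_posterior=True,
                                    random_state=i).fit_transform(data)
                   for i in range(m)]

      # Pool the per-voxel group mean across imputations (Rubin-style).
      means = np.array([c.mean(axis=0) for c in completed])
      pooled_mean = means.mean(axis=0)
      between_var = means.var(axis=0, ddof=1)  # between-imputation variance
      print(pooled_mean.round(2), between_var.round(3))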

  11. Qualitative and quantitative analysis of ochratoxin A contamination in green coffee beans using Fourier transform near infrared spectroscopy.

    PubMed

    Taradolsirithitikul, Panchita; Sirisomboon, Panmanas; Dachoupakan Sirisomboon, Cheewanun

    2017-03-01

    Ochratoxin A (OTA) contamination is highly prevalent in a variety of agricultural products, including the commercially important coffee bean. As such, rapid and accurate detection methods are considered necessary for the identification of OTA in green coffee beans. The goal of this research was to apply Fourier transform near infrared spectroscopy to detect and classify OTA contamination in green coffee beans in both a quantitative and qualitative manner. PLSR models were generated using pretreated spectroscopic data to predict the OTA concentration. The best model displayed a correlation coefficient (r) of 0.814, and a standard error of prediction (SEP) and bias of 1.965 µg kg⁻¹ and 0.358 µg kg⁻¹, respectively. Additionally, a PLS-DA model was also generated, displaying a classification accuracy of 96.83% for a non-OTA contaminated model and 80.95% for an OTA contaminated model, with an overall classification accuracy of 88.89%. The results demonstrate that the developed model could be used for detecting OTA contamination in green coffee beans in either a quantitative or qualitative manner. © 2016 Society of Chemical Industry.
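
    A compact sketch of the PLSR half of this workflow on synthetic spectra (not the paper's FT-NIR data), assuming scikit-learn: fit the regression, then compute bias and SEP on a held-out set, where SEP is the bias-corrected standard deviation of the prediction residuals.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      X = rng.normal(size=(120, 200))  # 120 samples x 200 wavelengths
      y = 3 * X[:, 50] - 2 * X[:, 120] + rng.normal(scale=0.2, size=120)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
      resid = y_te - pls.predict(X_te).ravel()

      bias = resid.mean()
      sep = np.sqrt(((resid - bias) ** 2).sum() / (len(resid) - 1))
      print(f"bias = {bias:.3f}, SEP = {sep:.3f}")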

  12. Avoiding and identifying errors in health technology assessment models: qualitative study and methodological review.

    PubMed

    Chilcott, J; Tappenden, P; Rawdin, A; Johnson, M; Kaltenthaler, E; Paisley, S; Papaioannou, D; Shippam, A

    2010-05-01

    Health policy decisions must be relevant, evidence-based and transparent. Decision-analytic modelling supports this process but its role is reliant on its credibility. Errors in mathematical decision models or simulation exercises are unavoidable but little attention has been paid to processes in model development. Numerous error avoidance/identification strategies could be adopted but it is difficult to evaluate the merits of strategies for improving the credibility of models without first developing an understanding of error types and causes. The study aims to describe the current comprehension of errors in the HTA modelling community and generate a taxonomy of model errors. Four primary objectives are to: (1) describe the current understanding of errors in HTA modelling; (2) understand current processes applied by the technology assessment community for avoiding errors in development, debugging and critically appraising models for errors; (3) use HTA modellers' perceptions of model errors with the wider non-HTA literature to develop a taxonomy of model errors; and (4) explore potential methods and procedures to reduce the occurrence of errors in models. It also describes the model development process as perceived by practitioners working within the HTA community. A methodological review was undertaken using an iterative search methodology. Exploratory searches informed the scope of interviews; later searches focused on issues arising from the interviews. Searches were undertaken in February 2008 and January 2009. In-depth qualitative interviews were performed with 12 HTA modellers from academic and commercial modelling sectors. All qualitative data were analysed using the Framework approach. Descriptive and explanatory accounts were used to interrogate the data within and across themes and subthemes: organisation, roles and communication; the model development process; definition of error; types of model error; strategies for avoiding errors; strategies for identifying errors; and barriers and facilitators. There was no common language in the discussion of modelling errors and there was inconsistency in the perceived boundaries of what constitutes an error. Asked about the definition of model error, there was a tendency for interviewees to exclude matters of judgement from being errors and focus on 'slips' and 'lapses', but discussion of slips and lapses comprised less than 20% of the discussion on types of errors. Interviewees devoted 70% of the discussion to softer elements of the process of defining the decision question and conceptual modelling, mostly the realms of judgement, skills, experience and training. The original focus concerned model errors, but it may be more useful to refer to modelling risks. Several interviewees discussed concepts of validation and verification, with notable consistency in interpretation: verification meaning the process of ensuring that the computer model correctly implemented the intended model, whereas validation means the process of ensuring that a model is fit for purpose. Methodological literature on verification and validation of models makes reference to the Hermeneutic philosophical position, highlighting that the concept of model validation should not be externalized from the decision-makers and the decision-making process. 
Interviewees demonstrated examples of all major error types identified in the literature: errors in the description of the decision problem, in model structure, in use of evidence, in implementation of the model, in operation of the model, and in presentation and understanding of results. The HTA error classifications were compared against existing classifications of model errors in the literature. A range of techniques and processes are currently used to avoid errors in HTA models: engaging with clinical experts, clients and decision-makers to ensure mutual understanding, producing written documentation of the proposed model, explicit conceptual modelling, stepping through skeleton models with experts, ensuring transparency in reporting, adopting standard housekeeping techniques, and ensuring that those parties involved in the model development process have sufficient and relevant training. Clarity and mutual understanding were identified as key issues; however, their current implementation is not framed within an overall strategy for structuring complex problems. Some of the questioning may have biased interviewees' responses, but as all interviewees were represented in the analysis, no rebalancing of the report was deemed necessary. A potential weakness of the literature review was its focus on spreadsheet and program development rather than specifically on model development. It should also be noted that the identified literature concerning programming errors was very narrow despite broad searches being undertaken. Published definitions of overall model validity, comprising conceptual model validation, verification of the computer model, and operational validity of the use of the model in addressing the real-world problem, are consistent with the views expressed by the HTA community and are therefore recommended as the basis for further discussions of model credibility. Such discussions should focus on risks, including errors of implementation, errors in matters of judgement and violations. Discussions of modelling risks should reflect the potentially complex network of cognitive breakdowns that lead to errors in models, and existing research on the cognitive basis of human error should be included in an examination of modelling errors. There is a need to develop a better understanding of the skills requirements for the development, operation and use of HTA models. Interaction between modeller and client in developing mutual understanding of a model establishes that model's significance and its warranty. This highlights that model credibility is the central concern of decision-makers using models, so it is crucial that the concept of model validation should not be externalized from the decision-makers and the decision-making process. Recommendations for future research are studies of verification and validation; the model development process; and identification of modifications to the modelling process with the aim of preventing the occurrence of errors and improving the identification of errors in models.

  13. Situational Interest in Engineering Design Activities

    NASA Astrophysics Data System (ADS)

    Bonderup Dohn, Niels

    2013-08-01

    The aim of the present mixed-methods study was to investigate the task-based situational interest of sixth-grade students (n = 46), between 12 and 14 years old, during an eight-week engineering design programme in a Science & Technology class. Students' interest was investigated by means of a descriptive interpretative analysis of qualitative data from classroom observations and informal interviews, complemented by a self-report survey to validate findings and determine prevalence. The analysis revealed four main sources of interest: designing inventions, trial-and-error experimentation, the achieved functionality of inventions, and collaboration. These sources differ in terms of stimulus factors such as novelty, autonomy (choice), social involvement, self-generation of interest, and task goal orientation. The study shows that design tasks stimulated interest, but only to the extent that students were able to self-regulate their learning strategies.

  14. Briefing and debriefing in the cardiac operating room. Analysis of impact on theatre team attitude and patient safety.

    PubMed

    Papaspyros, Sotiris C; Javangula, Kalyana C; Adluri, Rajeshwara Krishna Prasad; O'Regan, David J

    2010-01-01

    Error in health services delivery has long been recognised as a significant cause of inpatient morbidity and mortality. Root-cause analyses have cited communication failure as one of the contributing factors in adverse events. The formalised fighter-pilot mission brief and debrief formed the basis of the National Aeronautics and Space Administration (NASA) crew resource management (CRM) concept produced in 1979. This is a qualitative analysis of our experience with the briefing-debriefing process applied to cardiac theatres. We instituted a policy of formal operating room (OR) briefing and debriefing in all cardiac theatre sessions, and the first 118 cases were reviewed. A trouble-free operation was noted in only 28 (23.7%) cases, and multiple problems were experienced in 38 (32.2%) cases. A gap was identified in second-order problem solving in relation to instrument repair and maintenance. Theatre team members were interviewed and their comments were subjected to qualitative analysis; the consensus view is that communication has improved. The health industry may benefit from embracing the briefing-debriefing technique as an adjunct to continuous improvement through reflective learning, deliberate practice and immediate feedback. This may be the initial step toward a substantive and sustainable organizational transformation.

  15. Determining the sample size for co-dominant molecular marker-assisted linkage detection for a monogenic qualitative trait by controlling the type-I and type-II errors in a segregating F2 population.

    PubMed

    Hühn, M; Piepho, H P

    2003-03-01

    Tests for linkage are usually performed using the lod score method. A critical question in linkage analyses is the choice of sample size. The appropriate sample size depends on the desired type-I error and power of the test. This paper investigates the exact type-I error and power of the lod score method in a segregating F2 population with co-dominant markers and a qualitative monogenic dominant-recessive trait. For illustration, a disease-resistance trait is considered, where the susceptible allele is recessive. A procedure is suggested for finding the appropriate sample size. It is shown that recessive plants have about twice the information content of dominant plants, so the former should be preferred for linkage detection. In some cases the exact alpha-values for a given nominal alpha may be rather small due to the discrete nature of the sampling distribution in small samples. We show that a gain in power is possible by using exact methods.
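
    The gamete-counting intuition can be illustrated with a short Monte Carlo sketch (a simplification, not the paper's exact derivation): among recessive F2 plants both trait alleles are known, so each of the two gametes directly reveals whether the co-dominant marker allele is recombinant, and a binomial lod score follows. Power and empirical type-I error for a candidate sample size are then estimated by simulation.

      import numpy as np
      from scipy.special import xlogy

      def lod(k, n):
          """Binomial lod score for k recombinant gametes out of n."""
          r_hat = min(k / n, 0.5)
          ll_alt = xlogy(k, r_hat) + xlogy(n - k, 1 - r_hat)
          return (ll_alt - n * np.log(0.5)) / np.log(10)

      def detection_rate(n_plants, r_true, threshold=3.0, reps=20000, seed=0):
          rng = np.random.default_rng(seed)
          n_gam = 2 * n_plants  # two informative gametes per recessive plant
          k = rng.binomial(n_gam, r_true, size=reps)
          return np.mean([lod(ki, n_gam) >= threshold for ki in k])

      print(detection_rate(40, 0.20))  # power at recombination fraction 0.20
      print(detection_rate(40, 0.50))  # empirical type-I error (no linkage)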

  16. Can eye-tracking technology improve situational awareness in paramedic clinical education?

    PubMed

    Williams, Brett; Quested, Andrew; Cooper, Simon

    2013-01-01

    Human factors play a significant part in clinical error. Situational awareness (SA) means being aware of one's surroundings, comprehending the present situation, and being able to predict outcomes. It is a key human skill that, when properly applied, is associated with reducing medical error: eye-tracking technology can be used to provide an objective and qualitative measure of the initial perception component of SA. Feedback from eye-tracking technology can be used to improve the understanding and teaching of SA in clinical contexts, and consequently, has potential for reducing clinician error and the concomitant adverse events.

  17. Evaluation of advanced laparoscopic skills tasks for validity evidence.

    PubMed

    Nepomnayshy, Dmitry; Whitledge, James; Birkett, Richard; Delmonico, Theodore; Ruthazer, Robin; Sillin, Lelan; Seymour, Neal E

    2015-02-01

    Since fundamentals of laparoscopic surgery (FLS) represents a minimum proficiency standard for laparoscopic surgery, more advanced proficiency standards are required to address the needs of current surgical training. We wanted to evaluate the acceptance and discriminative ability of a novel set of skills building on the FLS model that could represent a more advanced proficiency standard: advanced laparoscopic surgery (ALS). Qualitative and quantitative analyses were employed. Quantitative analysis involved comparison of expert (PGY 5+), intermediate (PGY 3-4) and novice (PGY 1-2) surgeons on FLS and ALS tasks. Composite scores included time and errors: standard FLS errors were added to task time to create the composite score. Qualitative analysis involved thematic review of open-ended questions provided to experts participating in the study. Of 48 participants, there were 15 (31%) attendings, 3 (6%) fellows and 30 (63%) residents. By specialty, 54% were general/MIS/bariatric/colorectal (GMBC) and 46% were other (urology and gynecology). There was no difference by experience level in performance on FLS and ALS tasks for the entire cohort. However, in the GMBC subgroup, experts performed better than novices (p = 0.012) and intermediates performed better than novices (p = 0.057) on ALS tasks, with no difference for the same group in FLS performance. The GMBC subgroup also performed significantly better on FLS (p = 0.0035) and ALS (p = 0.0027) than the other subgroup. Thematic analysis revealed that the majority of experts felt that ALS was more realistic, challenging and clinically relevant for specific situations compared to FLS. For GMBC surgeons, we were able to show evidence of validity for a series of advanced laparoscopic tasks and their relationship to surgeon skill level. This study may represent the first step in the development of an advanced laparoscopic skills curriculum. Given the high degree of specialization in surgery, different advanced skills curricula will need to be developed for each specialty.

  18. On Bayesian methods of exploring qualitative interactions for targeted treatment.

    PubMed

    Chen, Wei; Ghosh, Debashis; Raghunathan, Trivellore E; Norkin, Maxim; Sargent, Daniel J; Bepler, Gerold

    2012-12-10

    Providing personalized treatments designed to maximize benefits and minimize harms is of tremendous current medical interest. One problem in this area is the evaluation of the interaction between the treatment and other predictor variables. Treatment effects in subgroups having the same direction but different magnitudes are called quantitative interactions, whereas those having opposite directions in subgroups are called qualitative interactions (QIs). Identifying QIs is challenging because they are rare and usually unknown among many potential biomarkers. Meanwhile, subgroup analysis reduces the power of hypothesis testing, and multiple subgroup analyses inflate the type I error rate. We propose a new Bayesian approach to search for QIs in a multiple regression setting with adaptive decision rules, considering various regression models for the outcome. We illustrate this method in two examples of phase III clinical trials. The algorithm is straightforward and easy to implement using existing software packages, and we provide sample code in Appendix A. Copyright © 2012 John Wiley & Sons, Ltd.
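
    As a plain-language illustration of what a qualitative interaction is (using ordinary least squares, not the authors' Bayesian machinery), the sketch below simulates a treatment that helps one hypothetical biomarker-defined subgroup and harms the other, so the fitted subgroup treatment effects have opposite signs.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(2)
      n = 400
      treat = rng.integers(0, 2, n)
      marker = rng.integers(0, 2, n)             # hypothetical biomarker subgroup
      effect = np.where(marker == 1, 1.0, -0.8)  # opposite-direction true effects
      y = effect * treat + rng.normal(size=n)

      # Fit the treatment effect separately in each subgroup.
      for g in (0, 1):
          idx = marker == g
          fit = sm.OLS(y[idx], sm.add_constant(treat[idx])).fit()
          print(f"subgroup {g}: effect {fit.params[1]:+.2f}, p = {fit.pvalues[1]:.3g}")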

  19. A Technological Innovation to Reduce Prescribing Errors Based on Implementation Intentions: The Acceptability and Feasibility of MyPrescribe.

    PubMed

    Keyworth, Chris; Hart, Jo; Thoong, Hong; Ferguson, Jane; Tully, Mary

    2017-08-01

    Although prescribing of medication in hospitals is rarely an error-free process, prescribers receive little feedback on their mistakes and ways to change future practices. Audit and feedback interventions may be an effective approach to modifying the clinical practice of health professionals, but these may pose logistical challenges when used in hospitals. Moreover, such interventions are often labor intensive. Consequently, there is a need to develop effective and innovative interventions to overcome these challenges and to improve the delivery of feedback on prescribing. Implementation intentions, which have been shown to be effective in changing behavior, link critical situations with an appropriate response; however, these have rarely been used in the context of improving prescribing practices. Semistructured qualitative interviews were conducted to evaluate the acceptability and feasibility of providing feedback on prescribing errors via MyPrescribe, a mobile-compatible website informed by implementation intentions. Data relating to 200 prescribing errors made by 52 junior doctors were collected by 11 hospital pharmacists. These errors were populated into MyPrescribe, where prescribers were able to construct their own personalized action plans. Qualitative interviews with a subsample of 15 junior doctors were used to explore issues regarding feasibility and acceptability of MyPrescribe and their experiences of using implementation intentions to construct prescribing action plans. Framework analysis was used to identify prominent themes, with findings mapped to the behavioral components of the COM-B model (capability, opportunity, motivation, and behavior) to inform the development of future interventions. MyPrescribe was perceived to be effective in providing opportunities for critical reflection on prescribing errors and to complement existing training (such as junior doctors' e-portfolio). The participants were able to provide examples of how they would use "If-Then" plans for patient management. Technology, as opposed to other methods of learning (eg, traditional "paper based" learning), was seen as a positive advancement for continued learning. MyPrescribe was perceived as an acceptable and feasible learning tool for changing prescribing practices, with participants suggesting that it would make an important addition to medical prescribers' training in reflective practice. MyPrescribe is a novel theory-based technological innovation that provides the platform for doctors to create personalized implementation intentions. Applying the COM-B model allows for a more detailed understanding of the perceived mechanisms behind prescribing practices and the ways in which interventions aimed at changing professional practice can be implemented. ©Chris Keyworth, Jo Hart, Hong Thoong, Jane Ferguson, Mary Tully. Originally published in JMIR Human Factors (http://humanfactors.jmir.org), 01.08.2017.

  20. A Technological Innovation to Reduce Prescribing Errors Based on Implementation Intentions: The Acceptability and Feasibility of MyPrescribe

    PubMed Central

    Keyworth, Chris; Hart, Jo; Thoong, Hong; Ferguson, Jane; Tully, Mary

    2017-01-01

    Background Although prescribing of medication in hospitals is rarely an error-free process, prescribers receive little feedback on their mistakes and ways to change future practices. Audit and feedback interventions may be an effective approach to modifying the clinical practice of health professionals, but these may pose logistical challenges when used in hospitals. Moreover, such interventions are often labor intensive. Consequently, there is a need to develop effective and innovative interventions to overcome these challenges and to improve the delivery of feedback on prescribing. Implementation intentions, which have been shown to be effective in changing behavior, link critical situations with an appropriate response; however, these have rarely been used in the context of improving prescribing practices. Objective Semistructured qualitative interviews were conducted to evaluate the acceptability and feasibility of providing feedback on prescribing errors via MyPrescribe, a mobile-compatible website informed by implementation intentions. Methods Data relating to 200 prescribing errors made by 52 junior doctors were collected by 11 hospital pharmacists. These errors were populated into MyPrescribe, where prescribers were able to construct their own personalized action plans. Qualitative interviews with a subsample of 15 junior doctors were used to explore issues regarding feasibility and acceptability of MyPrescribe and their experiences of using implementation intentions to construct prescribing action plans. Framework analysis was used to identify prominent themes, with findings mapped to the behavioral components of the COM-B model (capability, opportunity, motivation, and behavior) to inform the development of future interventions. Results MyPrescribe was perceived to be effective in providing opportunities for critical reflection on prescribing errors and to complement existing training (such as junior doctors’ e-portfolio). The participants were able to provide examples of how they would use “If-Then” plans for patient management. Technology, as opposed to other methods of learning (eg, traditional “paper based” learning), was seen as a positive advancement for continued learning. Conclusions MyPrescribe was perceived as an acceptable and feasible learning tool for changing prescribing practices, with participants suggesting that it would make an important addition to medical prescribers’ training in reflective practice. MyPrescribe is a novel theory-based technological innovation that provides the platform for doctors to create personalized implementation intentions. Applying the COM-B model allows for a more detailed understanding of the perceived mechanisms behind prescribing practices and the ways in which interventions aimed at changing professional practice can be implemented. PMID:28765104

  1. Identification of Conceptual Understanding in Biotechnology Learning

    NASA Astrophysics Data System (ADS)

    Suryanti, E.; Fitriani, A.; Redjeki, S.; Riandi, R.

    2018-04-01

    Research identifying conceptual understanding in the learning of Biotechnology, especially the concept of Genetic Engineering, was conducted. The lesson was carried out through discussion and presentation, mediated by PowerPoint slides containing learning materials with relevant images and videos. This is qualitative research with a one-shot case study (one-group posttest-only) design. Analysis of 44 students' answers shows that only 22% of students understood the concept, 18% lacked understanding of the concept, 57% had misconceptions, and 3% made errors. It can be concluded that most students have misconceptions in learning the concept of Genetic Engineering.

  2. Variability of writing disorders in Wernicke's aphasia underperforming different writing tasks: A single-case study.

    PubMed

    Kozintseva, Elena; Skvortsov, Anatoliy

    2016-03-01

    The aim of our study was to develop views on writing disorders in Wernicke's agraphia by comparing group data with the analysis of a single patient. We show how a single-case study can be useful in obtaining essential results that can be hidden by averaging group data. Analysis of a single patient proved to be important for resolving contradictions between the "holistic" and "elementaristic" paradigms of psychology and for the development of theoretical knowledge, using the example of a writing disorder. The holistic approach was implemented by presenting tasks that differ in the functions writing has served since its appearance in human culture (communicative, mnestic, and regulatory). In spite of the identical composition of the psychological components involved, differences between these tasks were identified when certain types of errors were analyzed in the single subject. The results are discussed in terms of the writing strategy used, which changes the mode of operation of the components involved and leads to qualitative and quantitative changes in writing errors within the syndrome of Wernicke's agraphia. © 2016 The Institute of Psychology, Chinese Academy of Sciences and John Wiley & Sons Australia, Ltd.

  3. Scientific applications of frequency-stabilized laser technology in space

    NASA Technical Reports Server (NTRS)

    Schumaker, Bonny L.

    1990-01-01

    A synoptic investigation of the uses of frequency-stabilized lasers for scientific applications in space is presented. It begins by summarizing the properties of lasers, characterizing their frequency stability, and describing the limitations of, and techniques for achieving, certain levels of frequency stability. Limits to precision set by laser frequency stability for various kinds of measurements are investigated and compared with other sources of error. These other sources include photon-counting statistics, scattered laser light, fluctuations in laser power and in the intensity distribution across the beam, propagation effects, mechanical and thermal noise, and radiation pressure. Methods are explored to improve the sensitivity of laser-based interferometric and range-rate measurements. Several specific types of science experiments that rely on highly precise measurements made with lasers are analyzed, and anticipated errors and overall performance are discussed. Qualitative descriptions are given of a number of other possible science applications involving frequency-stabilized lasers and related laser technology in space. These applications will warrant more careful analysis as the technology develops.

  4. Comparison of a single-view and a double-view aerosol optical depth retrieval algorithm

    NASA Astrophysics Data System (ADS)

    Henderson, Bradley G.; Chylek, Petr

    2003-11-01

    We compare the results of a single-view and a double-view aerosol optical depth (AOD) retrieval algorithm applied to image pairs acquired over NASA Stennis Space Center, Mississippi. The image data were acquired by the Department of Energy's (DOE) Multispectral Thermal Imager (MTI), a pushbroom satellite imager with 15 bands from the visible to the thermal infrared. MTI can acquire imagery in pairs in which the first image is a near-nadir view and the second is off-nadir with a zenith angle of approximately 60°. A total of 15 image pairs were used in the analysis. For a given image pair, AOD retrieval is performed twice: once using a single-view algorithm applied to the near-nadir image, and again using a double-view algorithm. Errors for both retrievals are computed by comparing the results to AERONET AOD measurements obtained at the same time and place. The single-view algorithm showed an RMS error about the mean of 0.076 in AOD units, whereas the double-view algorithm showed a modest improvement with an RMS error of 0.06. The single-view errors show a positive bias, which is presumed to result from the empirical relationship used to determine ground reflectance in the visible. A plot of the AOD error of the double-view algorithm versus time shows a noticeable trend, which is interpreted as a calibration drift. When this trend is removed, the RMS error of the double-view algorithm drops to 0.030. The single-view algorithm qualitatively appears to perform better during the spring and summer, whereas the double-view algorithm seems to be less sensitive to season.
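
    The two error summaries used here are easy to reproduce; the sketch below, on invented numbers, computes the RMS error of retrieved AOD against reference values and then removes a linear-in-time trend, mimicking the calibration-drift correction.

      import numpy as np

      rng = np.random.default_rng(3)
      t = np.linspace(0, 40, 15)                   # months, 15 image pairs
      truth = rng.uniform(0.05, 0.4, size=t.size)  # hypothetical AERONET AOD
      drift = 0.002 * t - 0.04                     # slow calibration drift
      retrieved = truth + drift + rng.normal(scale=0.02, size=t.size)

      err = retrieved - truth
      print(f"RMS error: {np.sqrt(np.mean(err ** 2)):.3f}")

      coeffs = np.polyfit(t, err, 1)               # fit the linear trend
      detrended = err - np.polyval(coeffs, t)
      print(f"RMS after detrending: {np.sqrt(np.mean(detrended ** 2)):.3f}")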

  5. Local Media Influence on Opting-Out from an Exception from Informed Consent Trial

    PubMed Central

    Nelson, Maria J; DeIorio, Nicole M; Schmidt, Terri; Griffiths, Denise; Daya, Mohamud; Haywood, Liana; Zive, Dana; Newgard, Craig D

    2010-01-01

    Objectives News media are used for community education and notification in exception from informed consent clinical trials, yet their effectiveness as an added safeguard in such research remains unknown. We assessed the number of callers requesting opt-out bracelets following each local media report and described the errors and content within each media report. Methods We undertook a descriptive analysis of local media trial coverage (newspaper, television, radio, and weblog) and opt-out requests over a 41-month period at a single site participating in an exception from informed consent out-of-hospital trial. Two non-trial investigators independently assessed forty-one content-based media variables (including background, trial information, graphics, errors, publication information, assessment) using a standardized, semi-qualitative data collection tool. Major errors were considered serious misrepresentation of the trial purpose or protocol, whereas minor errors included misinformation unlikely to mislead the lay reader about the trial. We plotted the temporal relationship between opt-out bracelet requests and media reports. Descriptive information about the news sources and the trial coverage are presented. Results We collected 39 trial-related media reports (33 newspaper, 1 television, 1 radio, and 4 blogs). There were thirteen errors in 9 (23%) publications, 7 of which were major and 6 minor. Of 384 requests for 710 bracelets, 310 requests (80%) occurred within 4 days after trial media coverage. Graphical timeline representation of the data suggested a close association between media reports about the trial and requests for opt-out bracelets. Conclusions Based on results from a single site, local media coverage for an exception from informed consent clinical trial had a substantial portion of errors and appeared closely associated with opt-out requests. PMID:19682770

  6. "A qualitative meta-analysis examining clients' experiences of psychotherapy: A new agenda": Correction to Levitt, Pomerville, and Surace (2016).

    PubMed

    2016-10-01

    Reports an error in "A qualitative meta-analysis examining clients’ experiences of psychotherapy: A new agenda" by Heidi M. Levitt, Andrew Pomerville and Francisco I. Surace (Psychological Bulletin, 2016[Aug], Vol 142[8], 801-830). In the article, the 2nd sentence in the Broadening the Forms of Power When Considering Client–Therapist Differences section, “Indeed, most of the studies (55/66, 83.3%) in these categories focused either on the power differential within the therapeutic relationship (37) or culturally based power differences between therapists and clients (29).” should read: “Indeed, most of the studies (49/59, 83.1%) in these categories focused either on the power differential within the therapeutic relationship (38) or culturally based power differences between therapists and clients (31).” (The following abstract of the original article appeared in record 2016-21269-001.) This article argues that psychotherapy practitioners and researchers should be informed by the substantive body of qualitative evidence that has been gathered to represent clients’ own experiences of therapy. The current meta-analysis examined qualitative research studies analyzing clients’ experiences within adult individual psychotherapy that appeared in English-language journals. This omnibus review integrates research from across psychotherapy approaches and qualitative methods, focusing on the cross-cutting question of how clients experience therapy. It utilized an innovative method in which 67 studies were subjected to a grounded theory meta-analysis in order to develop a hierarchy of data, and then 42 additional studies were added into this hierarchy using a content meta-analytic method, summing to 109 studies in total. Findings highlight the critical psychotherapy experiences for clients, based upon robust findings across these research studies. Process-focused principles for practice are generated that can enrich therapists’ understanding of their clients in key clinical decision-making moments. Based upon these findings, an agenda is suggested in which research is directed toward heightening therapists’ understanding of clients and recognizing them as agents of change within sessions, supporting the client as self-healer paradigm. This research aims to improve therapists’ sensitivity to clients’ experiences and thus can expand therapists’ attunement and intentionality in shaping interventions in accordance with whichever theoretical orientation is in use. The article advocates for the full integration of the qualitative literature in psychotherapy research in which variables are conceptualized in reference to an understanding of clients’ experiences in sessions. PsycINFO Database Record (c) 2016 APA, all rights reserved

  7. Using a Delphi Method to Identify Human Factors Contributing to Nursing Errors.

    PubMed

    Roth, Cheryl; Brewer, Melanie; Wieck, K Lynn

    2017-07-01

    The purpose of this study was to identify human factors associated with nursing errors. Using a Delphi technique, this study collected feedback from a panel of nurse experts (n = 25) on an initial qualitative survey questionnaire, then summarized the results and returned them to the panel for feedback and confirmation. Synthesized factors regarding causes of errors were incorporated into a quantitative Likert-type scale, and the original expert panel participants were queried a second time to validate responses. The list identified 24 items as the most common causes of nursing errors, including swamping and errors made by others that nurses are expected to recognize and fix. The responses provided a consensus top-10 errors list based on means, with heavy workload and fatigue at the top of the list. The Delphi survey established consensus and developed a platform upon which future study of nursing errors can build toward solutions. This list of human factors in nursing errors should serve to stimulate dialogue among nurses about how to prevent errors and improve outcomes. Human and system failures have been the subject of an abundance of research, yet nursing errors continue to occur. © 2016 Wiley Periodicals, Inc.

  8. Multicomponent quantitative spectroscopic analysis without reference substances based on ICA modelling.

    PubMed

    Monakhova, Yulia B; Mushtakova, Svetlana P

    2017-05-01

    A fast and reliable spectroscopic method for multicomponent quantitative analysis of targeted compounds with overlapping signals in complex mixtures has been established. The innovative analytical approach is based on the preliminary chemometric extraction of qualitative and quantitative information from UV-vis and IR spectral profiles of a calibration system using independent component analysis (ICA). Using this quantitative model and the ICA resolution results from spectral profiling of "unknown" model mixtures, the absolute analyte concentrations in multicomponent mixtures and authentic samples were then calculated without reference solutions. Good recoveries, generally between 95% and 105%, were obtained. The method can be applied to any spectroscopic data that obey the Beer-Lambert-Bouguer law. The proposed method was tested on the analysis of vitamins and caffeine in energy drinks and of aromatic hydrocarbons in motor fuel, with about 10% error. The results demonstrate that the proposed method is a promising tool for rapid simultaneous multicomponent analysis in cases of spectral overlap and the absence or inaccessibility of reference materials.
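
    A toy version of the calibration idea, assuming scikit-learn's FastICA rather than the authors' specific ICA algorithm: synthetic overlapping Gaussian bands are mixed with known concentrations, ICA recovers the component spectra, and the recovered per-sample loadings are regressed against the known concentrations so an unknown mixture could then be quantified from its loadings alone.

      import numpy as np
      from sklearn.decomposition import FastICA

      wl = np.linspace(200, 400, 300)       # wavelength grid, nm
      s1 = np.exp(-((wl - 260) / 15) ** 2)  # component 1 "spectrum"
      s2 = np.exp(-((wl - 290) / 20) ** 2)  # overlapping component 2

      rng = np.random.default_rng(4)
      c = rng.uniform(0.1, 1.0, size=(30, 2))  # known calibration concentrations
      mixtures = c @ np.vstack([s1, s2]) + rng.normal(scale=0.005, size=(30, 300))

      # Treat wavelengths as observations so the sources are spectra.
      ica = FastICA(n_components=2, random_state=0)
      spectra_est = ica.fit_transform(mixtures.T)  # (300, 2) recovered spectra
      loadings = ica.mixing_                       # (30, 2) per-sample intensities

      # Linear calibration: concentrations as a function of ICA loadings.
      A = np.column_stack([loadings, np.ones(30)])
      coef, *_ = np.linalg.lstsq(A, c, rcond=None)
      print(coef.round(3))  # slopes per component plus intercepts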

  9. Evaluating the dynamic response of in-flight thrust calculation techniques during throttle transients

    NASA Technical Reports Server (NTRS)

    Ray, Ronald J.

    1994-01-01

    New flight test maneuvers and analysis techniques for evaluating the dynamic response of in-flight thrust models during throttle transients have been developed and validated. The approach is based on the aircraft and engine performance relationship between thrust and drag. Two flight test maneuvers, a throttle step and a throttle frequency sweep, were developed and used in the study. Graphical analysis techniques, including a frequency domain analysis method, were also developed and evaluated. They provide quantitative and qualitative results. Four thrust calculation methods were used to demonstrate and validate the test technique. Flight test applications on two high-performance aircraft confirmed the test methods as valid and accurate. These maneuvers and analysis techniques were easy to implement and use. Flight test results indicate the analysis techniques can identify the combined effects of model error and instrumentation response limitations on the calculated thrust value. The methods developed in this report provide an accurate approach for evaluating, validating, or comparing thrust calculation methods for dynamic flight applications.
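
    One standard way to realize the frequency-domain comparison (the details here are assumptions, not the report's exact method) is to excite with a throttle frequency sweep and estimate the frequency response of the calculated thrust relative to the sweep from cross- and auto-spectral densities.

      import numpy as np
      from scipy import signal

      fs = 50.0                              # assumed sample rate, Hz
      t = np.arange(0, 120, 1 / fs)
      sweep = signal.chirp(t, f0=0.05, f1=2.0, t1=t[-1])  # throttle sweep

      # Stand-in thrust response: first-order lag plus measurement noise.
      b, a = signal.butter(1, 0.5, fs=fs)
      noise = 0.01 * np.random.default_rng(5).normal(size=t.size)
      thrust = signal.lfilter(b, a, sweep) + noise

      # H1 estimator: cross-spectrum over input auto-spectrum.
      f, Pxy = signal.csd(sweep, thrust, fs=fs, nperseg=1024)
      _, Pxx = signal.welch(sweep, fs=fs, nperseg=1024)
      H = Pxy / Pxx
      print(np.abs(H[1:4]).round(3), np.degrees(np.angle(H[1:4])).round(1))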

  10. The Witness-Voting System

    NASA Astrophysics Data System (ADS)

    Gerck, Ed

    We present a new, comprehensive framework to qualitatively improve election outcome trustworthiness, where voting is modeled as an information transfer process. Although voting is deterministic (all ballots are counted), information is treated stochastically using Information Theory. Error considerations, including faults, attacks, and threats by adversaries, are explicitly included. The influence of errors may be corrected to achieve an election outcome error as close to zero as desired (error-free), with a provably optimal design that is applicable to any type of voting, with or without ballots. Sixteen voting system requirements, including functional, performance, environmental and non-functional considerations, are derived and rated, meeting or exceeding current public-election requirements. The voter and the vote are unlinkable (secret ballot) although each is identifiable. The Witness-Voting System (Gerck, 2001) is extended as a conforming implementation of the provably optimal design that is error-free, transparent, simple, scalable, robust, receipt-free, universally-verifiable, 100% voter-verified, and end-to-end audited.

  11. [Establishment of the Mathematical Model for PMI Estimation Using FTIR Spectroscopy and Data Mining Method].

    PubMed

    Wang, L; Qin, X C; Lin, H C; Deng, K F; Luo, Y W; Sun, Q R; Du, Q X; Wang, Z Y; Tuo, Y; Sun, J H

    2018-02-01

    To analyse the relationship between the Fourier transform infrared (FTIR) spectra of rat spleen tissue and the postmortem interval (PMI) for PMI estimation, using FTIR spectroscopy combined with data mining methods. Rats were sacrificed by cervical dislocation and the cadavers were placed at 20 ℃. FTIR spectra of the rats' spleen tissues were measured at different time points. After pretreatment, the data were analysed by data mining methods. The absorption peak intensities of the spleen tissue spectra changed with PMI, while the absorption peak positions were unchanged. Principal component analysis (PCA) showed that the cumulative contribution rate of the first three principal components was 96%, with an obvious clustering tendency for the spectral samples at each time point. Partial least squares discriminant analysis (PLS-DA) and support vector machine classification (SVMC) effectively divided the spectral samples with different PMIs into four categories (0-24 h, 48-72 h, 96-120 h and 144-168 h). The determination coefficient (R²) of the PMI estimation model established by PLS regression analysis was 0.96, and the root mean square error of calibration (RMSEC) and root mean square error of cross-validation (RMSECV) were 9.90 h and 11.39 h, respectively. In the prediction set, R² was 0.97 and the root mean square error of prediction (RMSEP) was 10.49 h. The FTIR spectra of rat spleen tissue can be effectively analysed qualitatively and quantitatively by combining FTIR spectroscopy with data mining methods, and classification and PLS regression models can be established for PMI estimation. Copyright© by the Editorial Department of Journal of Forensic Medicine.
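
    A condensed sketch of the chemometric chain described, on synthetic spectra with scikit-learn standing in for the paper's tools: PCA to inspect the variance structure, then a classifier assigning spectra to the four PMI classes by cross-validation.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      rng = np.random.default_rng(6)
      classes = np.repeat([0, 1, 2, 3], 25)  # 0-24 h ... 144-168 h bins
      base = rng.normal(size=400)            # a common spectral backbone
      X = np.vstack([base + 0.15 * c + rng.normal(scale=0.3, size=400)
                     for c in classes])      # peak intensity drifts with PMI

      pca = PCA(n_components=3).fit(X)
      print("explained variance:", pca.explained_variance_ratio_.round(2))

      clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
      print("CV accuracy:", cross_val_score(clf, X, classes, cv=5).mean())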

  12. Rapid and non-invasive analysis of deoxynivalenol in durum and common wheat by Fourier-Transform Near Infrared (FT-NIR) spectroscopy.

    PubMed

    De Girolamo, A; Lippolis, V; Nordkvist, E; Visconti, A

    2009-06-01

    Fourier transform near-infrared spectroscopy (FT-NIR) was used for rapid and non-invasive analysis of deoxynivalenol (DON) in durum and common wheat. The relevance of using ground wheat samples with a homogeneous particle size distribution to minimize measurement variations and avoid DON segregation among particles of different sizes was established. Calibration models for durum wheat, common wheat and durum + common wheat samples, with particle size <500 µm, were obtained by using partial least squares (PLS) regression with an external validation technique. Values of root mean square error of prediction (RMSEP, 306-379 µg kg⁻¹) were comparable and not too far from values of root mean square error of cross-validation (RMSECV, 470-555 µg kg⁻¹). Coefficients of determination (r²) indicated an "approximate to good" level of prediction of the DON content by FT-NIR spectroscopy in the PLS calibration models (r² = 0.71-0.83), and a "good" discrimination between low and high DON contents in the PLS validation models (r² = 0.58-0.63). A "limited to good" practical utility of the models was ascertained by range error ratio (RER) values higher than 6. A qualitative model, based on 197 calibration samples, was developed to discriminate between blank and naturally contaminated wheat samples by setting a cut-off at 300 µg kg⁻¹ DON to separate the two classes. The model correctly classified 69% of the 65 validation samples, with most misclassified samples (16 of 20) showing DON contamination levels quite close to the cut-off level. These findings suggest that FT-NIR analysis is suitable for the determination of DON in unprocessed wheat at levels far below the maximum permitted limits set by the European Commission.
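
    Two of the figures of merit quoted above are one-liners; a small worked example follows, using the reported upper RMSEP together with an assumed reference range (the abstract does not state the range used) and the 300 µg kg⁻¹ cut-off on invented predictions.

      import numpy as np

      ref_range = 3000.0  # assumed span of reference DON values, ug/kg
      rmsep = 379.0       # upper RMSEP reported above, ug/kg
      print("RER:", round(ref_range / rmsep, 1))  # RER > 6 suggests practical utility

      predicted = np.array([120.0, 280.0, 305.0, 900.0])  # hypothetical predictions
      print("contaminated:", predicted > 300.0)           # 300 ug/kg cut-off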

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bogunovic, Hrvoje; Pozo, Jose Maria; Villa-Uriol, Maria Cruz

    Purpose: To evaluate the suitability of an improved version of an automatic segmentation method based on geodesic active regions (GAR) for segmenting cerebral vasculature with aneurysms from 3D x-ray reconstruction angiography (3DRA) and time of flight magnetic resonance angiography (TOF-MRA) images available in the clinical routine. Methods: Three aspects of the GAR method have been improved: execution time, robustness to variability in imaging protocols, and robustness to variability in image spatial resolutions. The improved GAR was retrospectively evaluated on images from patients containing intracranial aneurysms in the area of the Circle of Willis and imaged with two modalities: 3DRA and TOF-MRA. Images were obtained from two clinical centers, each using different imaging equipment. Evaluation included qualitative and quantitative analyses of the segmentation results on 20 images from 10 patients. The gold standard was built from 660 cross-sections (33 per image) of vessels and aneurysms, manually measured by interventional neuroradiologists. GAR has also been compared to an interactive segmentation method: isointensity surface extraction (ISE). In addition, since patients had been imaged with the two modalities, we performed an intermodality agreement analysis with respect to both the manual measurements and each of the two segmentation methods. Results: Both GAR and ISE differed from the gold standard within acceptable limits compared to the imaging resolution. GAR (ISE) had an average accuracy of 0.20 (0.24) mm for 3DRA and 0.27 (0.30) mm for TOF-MRA, and had a repeatability of 0.05 (0.20) mm. Compared to ISE, GAR had a lower qualitative error in the vessel region and a lower quantitative error in the aneurysm region. The repeatability of GAR was superior to manual measurements and ISE. The intermodality agreement was similar between GAR and the manual measurements. Conclusions: The improved GAR method outperformed ISE qualitatively as well as quantitatively and is suitable for segmenting 3DRA and TOF-MRA images from clinical routine.

  14. Questioning care at the end of life.

    PubMed

    Ruopp, Patricia; Good, Mary-Jo Delvecchio; Lakoma, Matthew; Gadmer, Nina M; Arnold, Robert M; Block, Susan D

    2005-06-01

    The goal of the larger study was to explore physicians' emotional responses to the death of their patients; this study analyzed a subset of physician transcripts to elucidate the construct of questioning care, which emerged from the larger study. To analyze how physicians question care (expressing concern, unease, or uncertainty about treatment decisions and practices, errors, or adverse events) as they attend dying patients. Retrospective interview study of physicians caring for randomly selected deaths on the medical service of a major academic teaching hospital, using qualitative and quantitative measures. SETTING, SUBJECTS: 188 attendings, residents, and interns on the internal medical services of two academic medical centers were part of the larger study. A subsample of 75 physician narratives was selected for qualitative data analysis for this study. Qualitative measures included open-ended questions eliciting physicians' stories of the most recent and a most emotionally powerful patient death they had experienced. Grounded theory was used to analyze physician narratives. Quantitative instruments measured physician attitudes toward end-of-life care and responses to the most recent and most emotional patient death. Physicians question care more frequently in most emotional deaths (42%) than in most recent deaths (34%). Physicians question communication with patients and families and within medical teams, medical judgment and technique, standards of practice, and high-risk treatments, often assigning responsibility for medical management they perceive as inappropriate, futile, overly aggressive, or as mistakes in judgment and technique. Responsibility ranges from the distal (the culture of medicine) to the proximal (personal). Frustration, guilt, and anger are more frequently expressed in these narratives when care is questioned. A typology of questioning care emerged from these physicians' narratives that parallels and reflects recent and classic research on medical error and the culture of medicine. Physicians' questions about care can contribute to designing training experiences for residents and to improving the quality of systems that affect patients' experiences at life's end and physicians' experiences in caring for dying patients.

  15. A Grounded Theory Study of Aircraft Maintenance Technician Decision-Making

    NASA Astrophysics Data System (ADS)

    Norcross, Robert

    Aircraft maintenance technician decision-making and actions have resulted in aircraft system errors causing aircraft incidents and accidents. Aircraft accident investigators and researchers have examined the factors that influence aircraft maintenance technician errors and categorized the types of errors in an attempt to prevent similar occurrences. New aircraft technology introduced to improve aviation safety and efficiency incurs failures for which no information is contained in the aircraft maintenance manuals. According to the Federal Aviation Administration, aircraft maintenance technicians must use only approved aircraft maintenance documents to repair, modify, and service aircraft. This qualitative research used a grounded theory approach to explore the decision-making processes and actions taken by aircraft maintenance technicians when confronted with an aircraft problem not covered in the aircraft maintenance manuals. The target population for the research was Federal Aviation Administration licensed airframe and power plant mechanics from across the United States. Nonprobability purposeful sampling was used to obtain aircraft maintenance technicians with the experience sought in the study problem. The sample population recruitment yielded 19 participants for eight focus group sessions to obtain opinions, perceptions, and experiences related to the study problem. All data collected were entered into the Atlas.ti qualitative analysis software. Themes emerged regarding Aircraft Maintenance Manual content, Aircraft Maintenance Technician experience, and the legal implications of not following Aircraft Maintenance Manuals. Conclusions from this study suggest that Aircraft Maintenance Technician decision-making was influenced by experience, gaps in the Aircraft Maintenance Manuals, reliance on others, awareness of the impact of decisions concerning aircraft airworthiness, management pressures, and legal concerns related to decision-making. Recommendations included an in-depth systematic review of the Aircraft Maintenance Manuals, development of a Federal Aviation Administration approved standardized Aircraft Maintenance Technician decision-making flow diagram, and implementation of risk-based decision-making training. The benefit of this study is to save the airline industry revenue by preventing poor decision-making practices that result in inefficient maintenance actions and aircraft incidents and accidents.

  16. Secondary Students' Perceptions about Learning Qualitative Analysis in Inorganic Chemistry

    NASA Astrophysics Data System (ADS)

    Tan, Kim-Chwee Daniel; Goh, Ngoh-Khang; Chia, Lian-Sai; Treagust, David F.

    2001-02-01

    Grade 10 students in Singapore find qualitative analysis one of the more difficult topics in their external examinations. Fifty-one grade 10 students (15-17 years old) from three schools were interviewed to investigate their perceptions about learning qualitative analysis and the aspects of qualitative analysis they found difficult. The results showed that students found qualitative analysis tedious and difficult to understand, and that they found the practical sessions unrelated to what they learned in class. They also believed that learning qualitative analysis required a great amount of memory work. It is proposed that their difficulties may arise from not knowing explicitly what is required in qualitative analysis, not knowing the content of qualitative analysis, a lack of motivation to understand qualitative analysis, cognitive overloading, and a lack of mastery of the required process skills.

  17. Multiple diagnosis based on photoplethysmography: hematocrit, SpO2, pulse, and respiration

    NASA Astrophysics Data System (ADS)

    Yoon, Gilwon; Lee, Jong Y.; Jeon, Kye Jin; Park, Kun-Kook; Yeo, Hyung S.; Hwang, Hyun T.; Kim, Hong S.; Hwang, In-Duk

    2002-09-01

    Photoplethysmography (PPG) measures pulsatile blood flow in real time and non-invasively. One of the most widely known applications of PPG is the measurement of oxygen saturation in arterial blood (SpO2). In our work, using more wavelengths than are used in a pulse oximeter, an algorithm and instrument have been developed to measure hematocrit, oxygen saturation, pulse and respiratory rates simultaneously. To predict hematocrit, a dedicated algorithm was developed based on scattering by red blood cells, and a protocol for detecting outlier signals is used to increase accuracy and reliability. Digital filtering techniques are used to extract respiratory rate signals. Utilization of wavelengths under 1000 nm, a multi-wavelength LED array chip and digital-oriented electronics enabled us to make a compact device. Our preliminary clinical trials show that the achieved percent error is ±8.2% for hematocrit when tested with 594 persons, R2 for SpO2 fitting is 0.99985 when tested with a Bi-Tek pulse oximeter simulator, and the SpO2 error for the in vivo test is ±2.5% over the range of 75-100%. The error of pulse rates is less than ±5%. We obtained a positive predictive value of 96% for respiratory rates in qualitative analysis.
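
    A minimal sketch of the kind of digital filtering described (Python with NumPy/SciPy; the sampling rate, filter band edges, and synthetic signal are illustrative assumptions, not the authors' algorithm) separates the cardiac and respiratory components of a PPG trace and estimates pulse rate:

        import numpy as np
        from scipy.signal import butter, filtfilt, find_peaks

        fs = 100.0                                  # sampling rate in Hz (assumed)
        t = np.arange(0, 30, 1 / fs)
        # Synthetic PPG: 1.2 Hz cardiac pulsation on a 0.25 Hz respiratory baseline
        ppg = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.sin(2 * np.pi * 0.25 * t)

        # Band-pass 0.7-3.0 Hz to isolate the pulsatile (cardiac) component
        b, a = butter(2, [0.7 / (fs / 2), 3.0 / (fs / 2)], btype="band")
        cardiac = filtfilt(b, a, ppg)
        peaks, _ = find_peaks(cardiac, distance=int(fs * 0.4))
        pulse_bpm = 60.0 * len(peaks) / (t[-1] - t[0])

        # Low-pass below 0.5 Hz to recover the respiratory component
        b, a = butter(2, 0.5 / (fs / 2), btype="low")
        respiration = filtfilt(b, a, ppg)
        print(round(pulse_bpm), "beats/min")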

  18. WE-H-BRC-08: Examining Credentialing Criteria and Poor Performance Indicators for IROC Houston’s Anthropomorphic Head and Neck Phantom

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carson, M; Molineu, A; Taylor, P

    Purpose: To analyze the most recent results of IROC Houston’s anthropomorphic H&N phantom to determine the nature of failing irradiations and the feasibility of altering pass/fail credentialing criteria. Methods: IROC Houston’s H&N phantom, used for IMRT credentialing for NCI-sponsored clinical trials, requires that an institution’s treatment plan must agree with measurement within 7% (TLD doses) and ≥85% pixels must pass 7%/4 mm gamma analysis. 156 phantom irradiations (November 2014 – October 2015) were re-evaluated using tighter criteria: 1) 5% TLD and 5%/4 mm, 2) 5% TLD and 5%/3 mm, 3) 4% TLD and 4%/4 mm, and 4) 3% TLD and 3%/3 mm. Failure/poor performance rates were evaluated with respect to individual film and TLD performance by location in the phantom. Overall poor phantom results were characterized qualitatively as systematic (dosimetric) errors, setup errors/positional shifts, global but non-systematic errors, and errors affecting only a local region. Results: The pass rate for these phantoms using current criteria is 90%. Substituting criteria 1-4 reduces the overall pass rate to 77%, 70%, 63%, and 37%, respectively. Statistical analyses indicated the probability of noise-induced TLD failure at the 5% criterion was <0.5%. Using criteria 1, TLD results were most often the cause of failure (86% failed TLD while 61% failed film), with most failures identified in the primary PTV (77% cases). Other criteria posed similar results. Irradiations that failed from film only were overwhelmingly associated with phantom shifts/setup errors (≥80% cases). Results failing criteria 1 were primarily diagnosed as systematic: 58% of cases. 11% were setup/positioning errors, 8% were global non-systematic errors, and 22% were local errors. Conclusion: This study demonstrates that 5% TLD and 5%/4 mm gamma criteria may be both practically and theoretically achievable. Further work is necessary to diagnose and resolve dosimetric inaccuracy in these trials, particularly for systematic dose errors. This work is funded by NCI Grant CA180803.
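
    A minimal sketch of applying such pass/fail credentialing criteria (Python; the TLD deviations and gamma pass rate below are hypothetical, and in the real re-analysis each tightened TLD tolerance is paired with a specific gamma dose/distance criterion that changes the pixel pass rate):

        import numpy as np

        def passes(tld_dev_pct, gamma_pass_pct, tld_tol_pct, gamma_pix_req=85.0):
            # Pass if every TLD dose agrees with the plan within tld_tol_pct percent
            # and at least gamma_pix_req percent of pixels pass the gamma analysis.
            return bool(np.all(np.abs(tld_dev_pct) <= tld_tol_pct)
                        and gamma_pass_pct >= gamma_pix_req)

        tld_dev = np.array([3.8, 4.6, 2.1, 5.9])   # per-TLD deviation from plan (%)
        gamma_rate = 91.2                          # pixels passing gamma analysis (%)
        for tol in (7.0, 5.0, 4.0, 3.0):           # current and tightened TLD criteria
            print(f"TLD tolerance {tol}%: {'pass' if passes(tld_dev, gamma_rate, tol) else 'fail'}")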

  19. Exploring the Current Landscape of Intravenous Infusion Practices and Errors (ECLIPSE): protocol for a mixed-methods observational study.

    PubMed

    Blandford, Ann; Furniss, Dominic; Lyons, Imogen; Chumbley, Gill; Iacovides, Ioanna; Wei, Li; Cox, Anna; Mayer, Astrid; Schnock, Kumiko; Bates, David Westfall; Dykes, Patricia C; Bell, Helen; Franklin, Bryony Dean

    2016-03-03

    Intravenous medication is essential for many hospital inpatients. However, providing intravenous therapy is complex and errors are common. 'Smart pumps' incorporating dose error reduction software have been widely advocated to reduce error. However, little is known about their effect on patient safety, how they are used or their likely impact. This study will explore the landscape of intravenous medication infusion practices and errors in English hospitals and how smart pumps may relate to the prevalence of medication administration errors. This is a mixed-methods study involving an observational quantitative point prevalence study to determine the frequency and types of errors that occur in the infusion of intravenous medication, and qualitative interviews with hospital staff to better understand infusion practices and the contexts in which errors occur. The study will involve 5 clinical areas (critical care, general medicine, general surgery, paediatrics and oncology), across 14 purposively sampled acute hospitals and 2 paediatric hospitals to cover a range of intravenous infusion practices. Data collectors will compare each infusion running at the time of data collection against the patient's medication orders to identify any discrepancies. The potential clinical importance of errors will be assessed. Quantitative data will be analysed descriptively; interviews will be analysed using thematic analysis. Ethical approval has been obtained from an NHS Research Ethics Committee (14/SC/0290); local approvals will be sought from each participating organisation. Findings will be published in peer-reviewed journals and presented at conferences for academic and health professional audiences. Results will also be fed back to participating organisations to inform local policy, training and procurement. Aggregated findings will inform the debate on costs and benefits of the NHS investing in smart pump technology, and what other changes may need to be made to ensure effectiveness of such an investment.

  20. Risk behaviours for organism transmission in health care delivery-A two month unstructured observational study.

    PubMed

    Lindberg, Maria; Lindberg, Magnus; Skytt, Bernice

    2017-05-01

    Errors in infection control practices put patient safety at risk. The probability of errors can increase when care practices become more multifaceted. It is therefore fundamental to track risk behaviours and potential errors in various care situations. The aim of this study was to describe care situations involving risk behaviours for organism transmission that could lead to subsequent healthcare-associated infections. Unstructured nonparticipant observations were performed at three medical wards. Healthcare personnel (n=27) were shadowed, for 39 h in total, on randomly selected weekdays between 7:30 am and 12 noon. Content analysis was used to inductively categorize activities into tasks and, based on their character, into groups. Risk behaviours for organism transmission were deductively classified into types of errors. A multiple response crosstabs procedure was used to visualize the number and proportion of errors in tasks. One-way ANOVA with Bonferroni post hoc tests was used to determine differences among the three groups of activities. The qualitative findings give an understanding that risk behaviours for organism transmission go beyond the five moments of hand hygiene and also include the handling and placement of materials and equipment. The tasks with the highest percentage of errors were 'personal hygiene', 'elimination' and 'dressing/wound care'. The most common types of errors in all identified tasks were 'hand disinfection', 'glove usage', and 'placement of materials'. Significantly more errors (p<0.0001) were observed the more multifaceted (single, combined or interrupted) the activity was. The numbers and types of errors, as well as the character of activities performed in care situations described in this study, confirm the need to improve current infection control practices. It is fundamental that healthcare personnel practice good hand hygiene; however, effective preventive hygiene is complex in healthcare activities due to the multifaceted care situations, especially when activities are interrupted. A deeper understanding of infection control practices that goes beyond the sense of security offered by hand disinfection and the use of gloves is needed, as materials and surfaces in the care environment might be contaminated and thus pose a risk for organism transmission.

  1. A crowdsourcing workflow for extracting chemical-induced disease relations from free text

    PubMed Central

    Li, Tong Shu; Bravo, Àlex; Furlong, Laura I.; Good, Benjamin M.; Su, Andrew I.

    2016-01-01

    Relations between chemicals and diseases are one of the most queried biomedical interactions. Although expert manual curation is the standard method for extracting these relations from the literature, it is expensive and impractical to apply to large numbers of documents, and therefore alternative methods are required. We describe here a crowdsourcing workflow for extracting chemical-induced disease relations from free text as part of the BioCreative V Chemical Disease Relation challenge. Five non-expert workers on the CrowdFlower platform were shown each potential chemical-induced disease relation highlighted in the original source text and asked to make binary judgments about whether the text supported the relation. Worker responses were aggregated through voting, and relations receiving four or more votes were predicted as true. On the official evaluation dataset of 500 PubMed abstracts, the crowd attained a 0.505 F-score (0.475 precision, 0.540 recall), with a maximum theoretical recall of 0.751 due to errors with named entity recognition. The total crowdsourcing cost was $1290.67 ($2.58 per abstract) and took a total of 7 h. A qualitative error analysis revealed that 46.66% of sampled errors were due to task limitations and gold standard errors, indicating that performance can still be improved. All code and results are publicly available at https://github.com/SuLab/crowd_cid_relex Database URL: https://github.com/SuLab/crowd_cid_relex PMID:27087308
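
    A minimal sketch of the vote-aggregation and scoring scheme described (Python; the example relations and gold set are hypothetical):

        def aggregate(votes, threshold=4):
            # A relation is predicted true when >= threshold of the 5 workers vote yes
            return {rel: sum(v) >= threshold for rel, v in votes.items()}

        def precision_recall_f1(pred, gold):
            positives = {rel for rel, p in pred.items() if p}
            tp = len(positives & gold)
            precision = tp / len(positives) if positives else 0.0
            recall = tp / len(gold) if gold else 0.0
            f1 = (2 * precision * recall / (precision + recall)
                  if precision + recall else 0.0)
            return precision, recall, f1

        # Hypothetical chemical-disease pairs, each with 5 binary worker judgments
        votes = {("nifedipine", "hypotension"): [1, 1, 1, 1, 0],
                 ("aspirin", "nausea"): [1, 0, 1, 0, 0]}
        gold = {("nifedipine", "hypotension")}
        print(precision_recall_f1(aggregate(votes), gold))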

  2. Effect of signal intensity and camera quantization on laser speckle contrast analysis

    PubMed Central

    Song, Lipei; Elson, Daniel S.

    2012-01-01

    Laser speckle contrast analysis (LASCA) is limited to being a qualitative method for the measurement of blood flow and tissue perfusion as it is sensitive to the measurement configuration. The signal intensity is one of the parameters that can affect the contrast values due to the quantization of the signals by the camera and analog-to-digital converter (ADC). In this paper we deduce the theoretical relationship between signal intensity and contrast values based on the probability density function (PDF) of the speckle pattern and simplify it to a rational function. A simple method to correct this contrast error is suggested. The experimental results demonstrate that this relationship can effectively compensate the bias in contrast values induced by the quantized signal intensity and correct for bias induced by signal intensity variations across the field of view. PMID:23304650
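
    For context, speckle contrast is conventionally computed as K = sigma/mean over a sliding window; the sketch below (Python; the window size and synthetic image are illustrative, and the paper's analytical quantization correction itself is not reproduced) shows where camera quantization enters the calculation:

        import numpy as np
        from scipy.ndimage import uniform_filter

        def speckle_contrast(image, window=7):
            # Local contrast K = sigma / mean over a sliding window
            img = image.astype(np.float64)
            mean = uniform_filter(img, window)
            mean_sq = uniform_filter(img ** 2, window)
            var = np.maximum(mean_sq - mean ** 2, 0.0)
            return np.sqrt(var) / np.maximum(mean, 1e-12)

        rng = np.random.default_rng(0)
        raw = rng.exponential(scale=20.0, size=(128, 128))  # fully developed speckle
        # An 8-bit camera quantizes intensity to 256 levels; at low signal levels
        # this quantization biases the measured contrast values.
        quantized = np.clip(np.round(raw), 0, 255).astype(np.uint8)
        print(speckle_contrast(raw).mean(), speckle_contrast(quantized).mean())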

  3. Modelling default and likelihood reasoning as probabilistic

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1990-01-01

    A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals in the same sense as 'possibility' and 'necessity'. To model these four forms probabilistically, a logic QDP and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequence results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlights their approximate nature, the dualities, and the need for complementary reasoning about relevance.

  4. Understanding Research Misconduct: A Comparative Analysis of 120 Cases of Professional Wrongdoing

    PubMed Central

    DuBois, James M.; Anderson, Emily E.; Chibnall, John; Carroll, Kelly; Gibb, Tyler; Ogbuka, Chiji; Rubbelke, Timothy

    2013-01-01

    We analyzed 40 cases of falsification, fabrication, or plagiarism (FFP), comparing them to other types of wrongdoing in research (n = 40) and medicine (n = 40). Fifty-one variables were coded from an average of 29 news or investigative reports per case. Financial incentives, oversight failures, and seniority correlate significantly with more serious instances of FFP. However, most environmental variables were nearly absent from cases of FFP and none were more strongly present in cases of FFP than in other types of wrongdoing. Qualitative data suggest FFP involves thinking errors, poor coping with research pressures, and inadequate oversight. We offer recommendations for education, institutional investigations, policy, and further research. PMID:24028480

  5. Ultra high performance liquid chromatography with ion-trap TOF-MS for the fast characterization of flavonoids in Citrus bergamia juice.

    PubMed

    Sommella, Eduardo; Pepe, Giacomo; Pagano, Francesco; Tenore, Gian Carlo; Dugo, Paola; Manfra, Michele; Campiglia, Pietro

    2013-10-01

    We have developed a fast ultra HPLC with ion-trap TOF-MS method for the analysis of flavonoids in Citrus bergamia juice. With respect to the typical methods for the analysis of these matrices based on conventional HPLC techniques, a tenfold faster separation was attained. The use of a core-shell particle column ensured high resolution within the fast analysis time of only 5 min. Unambiguous determination of flavonoid identity was obtained by the employment of a hybrid ion-trap TOF mass spectrometer with high mass accuracy (average error 1.69 ppm). The system showed good retention time and peak area repeatability, with maximum RSD% values of 0.36 and 3.86, respectively, as well as good linearity (R(2) ≥ 0.99). Our results show that ultra HPLC can be a useful tool for ultra fast qualitative/quantitative analysis of flavonoid compounds in citrus fruit juices.

  6. Irregular analytical errors in diagnostic testing - a novel concept.

    PubMed

    Vogeser, Michael; Seger, Christoph

    2018-02-23

    In laboratory medicine, routine periodic analyses for internal and external quality control measurements interpreted by statistical methods are mandatory for batch clearance. Data analysis of these process-oriented measurements allows for insight into random analytical variation and systematic calibration bias over time. However, in such a setting, any individual sample is not under individual quality control. The quality control measurements act only at the batch level. Quantitative or qualitative data derived for many effects and interferences associated with an individual diagnostic sample can compromise any analyte. It is obvious that a quality-control-sample-based approach to quality assurance is not sensitive to such errors. To address the potential causes and nature of such analytical interference in individual samples more systematically, we suggest the introduction of a new term called the irregular (individual) analytical error. Practically, this term can be applied in any analytical assay that is traceable to a reference measurement system. For an individual sample, an irregular analytical error is defined as an inaccuracy (which is the deviation from a reference measurement procedure result) of a test result that is so large that it cannot be explained by measurement uncertainty of the utilized routine assay operating within the accepted limitations of the associated process quality control measurements. The deviation can be defined as the linear combination of the process measurement uncertainty and the method bias for the reference measurement system. Such errors should be coined irregular analytical errors of the individual sample. The measurement result is compromised either by an irregular effect associated with the individual composition (matrix) of the sample or an individual single sample associated processing error in the analytical process. Currently, the availability of reference measurement procedures is still highly limited, but LC-isotope-dilution mass spectrometry methods are increasingly used for pre-market validation of routine diagnostic assays (these tests also involve substantial sets of clinical validation samples). Based on this definition/terminology, we list recognized causes of irregular analytical error as a risk catalog for clinical chemistry in this article. These issues include reproducible individual analytical errors (e.g. caused by anti-reagent antibodies) and non-reproducible, sporadic errors (e.g. errors due to an incorrect pipetting volume caused by air bubbles in a sample), which can both lead to inaccurate results and risks for patients.
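
    A minimal sketch of the flagging rule implied by this definition (Python; the coverage factor k = 2 and the example numbers are assumptions for illustration, not part of the authors' proposal):

        def is_irregular(x_routine, x_reference, u_process, method_bias, k=2.0):
            # Flag an individual result whose deviation from the reference
            # measurement procedure cannot be explained by the routine assay's
            # process measurement uncertainty (expanded with coverage factor k)
            # combined with the known method bias.
            deviation = abs(x_routine - x_reference)
            explainable = k * u_process + abs(method_bias)
            return deviation > explainable

        # Hypothetical result: routine assay 142, reference procedure 120,
        # process standard uncertainty 4, known method bias 2 (same units)
        print(is_irregular(142.0, 120.0, 4.0, 2.0))  # True -> irregular analytical error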

  7. Qualitative review of usability problems in health information systems for radiology.

    PubMed

    Dias, Camila Rodrigues; Pereira, Marluce Rodrigues; Freire, André Pimenta

    2017-12-01

    Radiology processes are commonly supported by Radiology Information System (RIS), Picture Archiving and Communication System (PACS) and other software for radiology. However, these information technologies can present usability problems that affect the performance of radiologists and physicians, especially considering the complexity of the tasks involved. The purpose of this study was to extract, classify and analyze qualitatively the usability problems in PACS, RIS and other software for radiology. A systematic review was performed to extract usability problems reported in empirical usability studies in the literature. The usability problems were categorized as violations of Nielsen and Molich's usability heuristics. The qualitative analysis indicated the causes and the effects of the identified usability problems. From the 431 papers initially identified, 10 met the study criteria. The analysis of the papers identified 90 instances of usability problems, classified into categories corresponding to established usability heuristics. The five heuristics with the highest number of instances of usability problems were "Flexibility and efficiency of use", "Consistency and standards", "Match between system and the real world", "Recognition rather than recall" and "Help and documentation", respectively. These problems can make the interaction time consuming, causing delays in tasks, dissatisfaction, frustration, preventing users from enjoying all the benefits and functionalities of the system, as well as leading to more errors and difficulties in carrying out clinical analyses. Furthermore, the present paper showed a lack of studies performed on systems for radiology, especially usability evaluations using formal methods of evaluation involving the final users.

  8. Failing to Fix What is Found: Risk Accommodation in the Oil and Gas Industry.

    PubMed

    Stackhouse, Madelynn R D; Stewart, Robert

    2017-01-01

    The present program of research synthesizes the findings from three studies in line with two goals. First, the present research explores how the oil and gas industry is performing at risk mitigation in terms of finding and fixing errors when they occur. Second, the present research explores what factors in the work environment relate to a risk-accommodating environment. Study 1 presents a descriptive evaluation of high-consequence incidents at 34 oil and gas companies over a 12-month period (N = 873), especially in terms of those companies' effectiveness at investigating and fixing errors. The analysis found that most investigations were fair in terms of quality (mean = 75.50%), with a smaller proportion that were weak (mean = 11.40%) or strong (mean = 13.24%). Furthermore, most companies took at least one corrective action for high-consequence incidents, but few of these corrective actions were confirmed as having been completed (mean = 13.77%). In fact, most corrective actions were secondary interim administrative controls (e.g., having a safety meeting) rather than fair or strong controls (e.g., training, engineering elimination). Study 2a found that several environmental factors together explain 56.41% of the variance in safety, including management's disengagement from safety concerns, finding and fixing errors, safety management system effectiveness, training, employee safety, procedures, and a production-over-safety culture. Qualitative results from Study 2b suggest that a compliance-based culture of adhering to liability concerns, out-group blame, and a production-over-safety orientation may all impede safety effectiveness.

  9. Challenges to nurses' efforts of retrieving, documenting, and communicating patient care information.

    PubMed

    Keenan, Gail; Yakel, Elizabeth; Dunn Lopez, Karen; Tschannen, Dana; Ford, Yvonne B

    2013-01-01

    To examine information flow, a vital component of a patient's care and outcomes, in a sample of multiple hospital nursing units to uncover potential sources of error and opportunities for systematic improvement. This was a qualitative study of a sample of eight medical-surgical nursing units from four diverse hospitals in one US state. We conducted direct work observations of nursing staff's communication patterns for entire shifts (8 or 12 h) for a total of 200 h and gathered related documentation artifacts for analyses. Data were coded using qualitative content analysis procedures and then synthesized and organized thematically to characterize current practices. Three major themes emerged from the analyses, which represent serious vulnerabilities in the flow of patient care information during nurse hand-offs and to the entire interdisciplinary team across time and settings. The three themes are: (1) variation in nurse documentation and communication; (2) the absence of a centralized care overview in the patient's electronic health record, ie, easily accessible by the entire care team; and (3) rarity of interdisciplinary communication. The care information flow vulnerabilities are a catalyst for multiple types of serious and undetectable clinical errors. We have two major recommendations to address the gaps: (1) to standardize the format, content, and words used to document core information, such as the plan of care, and make this easily accessible to all team members; (2) to conduct extensive usability testing to ensure that tools in the electronic health record help the disconnected interdisciplinary team members to maintain a shared understanding of the patient's plan.

  10. Challenges to nurses' efforts of retrieving, documenting, and communicating patient care information

    PubMed Central

    Yakel, Elizabeth; Dunn Lopez, Karen; Tschannen, Dana; Ford, Yvonne B

    2013-01-01

    Objective To examine information flow, a vital component of a patient's care and outcomes, in a sample of multiple hospital nursing units to uncover potential sources of error and opportunities for systematic improvement. Design This was a qualitative study of a sample of eight medical–surgical nursing units from four diverse hospitals in one US state. We conducted direct work observations of nursing staff's communication patterns for entire shifts (8 or 12 h) for a total of 200 h and gathered related documentation artifacts for analyses. Data were coded using qualitative content analysis procedures and then synthesized and organized thematically to characterize current practices. Results Three major themes emerged from the analyses, which represent serious vulnerabilities in the flow of patient care information during nurse hand-offs and to the entire interdisciplinary team across time and settings. The three themes are: (1) variation in nurse documentation and communication; (2) the absence of a centralized care overview in the patient's electronic health record, ie, easily accessible by the entire care team; and (3) rarity of interdisciplinary communication. Conclusion The care information flow vulnerabilities are a catalyst for multiple types of serious and undetectable clinical errors. We have two major recommendations to address the gaps: (1) to standardize the format, content, and words used to document core information, such as the plan of care, and make this easily accessible to all team members; (2) to conduct extensive usability testing to ensure that tools in the electronic health record help the disconnected interdisciplinary team members to maintain a shared understanding of the patient's plan. PMID:22822042

  11. An embedded longitudinal multi-faceted qualitative evaluation of a complex cluster randomized controlled trial aiming to reduce clinically important errors in medicines management in general practice.

    PubMed

    Cresswell, Kathrin M; Sadler, Stacey; Rodgers, Sarah; Avery, Anthony; Cantrill, Judith; Murray, Scott A; Sheikh, Aziz

    2012-06-08

    There is a need to shed light on the pathways through which complex interventions mediate their effects in order to enable critical reflection on their transferability. We sought to explore and understand key stakeholder accounts of the acceptability, likely impact and strategies for optimizing and rolling-out a successful pharmacist-led information technology-enabled (PINCER) intervention, which substantially reduced the risk of clinically important errors in medicines management in primary care. Data were collected at two geographical locations in central England through a combination of one-to-one longitudinal semi-structured telephone interviews (one at the beginning of the trial and another when the trial was well underway), relevant documents, and focus group discussions following delivery of the PINCER intervention. Participants included PINCER pharmacists, general practice staff, researchers involved in the running of the trial, and primary care trust staff. PINCER pharmacists were interviewed at three different time-points during the delivery of the PINCER intervention. Analysis was thematic with diffusion of innovation theory providing a theoretical framework. We conducted 52 semi-structured telephone interviews and six focus group discussions with 30 additional participants. In addition, documentary data were collected from six pharmacist diaries, along with notes from four meetings of the PINCER pharmacists and feedback meetings from 34 practices. Key findings that helped to explain the success of the PINCER intervention included the perceived importance of focusing on prescribing errors to all stakeholders, and the credibility and appropriateness of a pharmacist-led intervention to address these shortcomings. Central to this was the face-to-face contact and relationship building between pharmacists and a range of practice staff, and pharmacists' explicitly designated role as a change agent. However, important concerns were identified about the likely sustainability of this new model of delivering care, in the absence of an appropriate support network for pharmacists and career development pathways. This embedded qualitative inquiry has helped to understand the complex organizational and social environment in which the trial was undertaken and the PINCER intervention was delivered. The longitudinal element has given insight into the dynamic changes and developments over time. Medication errors and ways to address these are high on stakeholders' agendas. Our results further indicate that pharmacists were, because of their professional standing and skill-set, able to engage with the complex general practice environment and able to identify and manage many clinically important errors in medicines management. The transferability of the PINCER intervention approach, both in relation to other prescribing errors and to other practices, is likely to be high.

  12. An embedded longitudinal multi-faceted qualitative evaluation of a complex cluster randomized controlled trial aiming to reduce clinically important errors in medicines management in general practice

    PubMed Central

    2012-01-01

    Background There is a need to shed light on the pathways through which complex interventions mediate their effects in order to enable critical reflection on their transferability. We sought to explore and understand key stakeholder accounts of the acceptability, likely impact and strategies for optimizing and rolling-out a successful pharmacist-led information technology-enabled (PINCER) intervention, which substantially reduced the risk of clinically important errors in medicines management in primary care. Methods Data were collected at two geographical locations in central England through a combination of one-to-one longitudinal semi-structured telephone interviews (one at the beginning of the trial and another when the trial was well underway), relevant documents, and focus group discussions following delivery of the PINCER intervention. Participants included PINCER pharmacists, general practice staff, researchers involved in the running of the trial, and primary care trust staff. PINCER pharmacists were interviewed at three different time-points during the delivery of the PINCER intervention. Analysis was thematic with diffusion of innovation theory providing a theoretical framework. Results We conducted 52 semi-structured telephone interviews and six focus group discussions with 30 additional participants. In addition, documentary data were collected from six pharmacist diaries, along with notes from four meetings of the PINCER pharmacists and feedback meetings from 34 practices. Key findings that helped to explain the success of the PINCER intervention included the perceived importance of focusing on prescribing errors to all stakeholders, and the credibility and appropriateness of a pharmacist-led intervention to address these shortcomings. Central to this was the face-to-face contact and relationship building between pharmacists and a range of practice staff, and pharmacists’ explicitly designated role as a change agent. However, important concerns were identified about the likely sustainability of this new model of delivering care, in the absence of an appropriate support network for pharmacists and career development pathways. Conclusions This embedded qualitative inquiry has helped to understand the complex organizational and social environment in which the trial was undertaken and the PINCER intervention was delivered. The longitudinal element has given insight into the dynamic changes and developments over time. Medication errors and ways to address these are high on stakeholders’ agendas. Our results further indicate that pharmacists were, because of their professional standing and skill-set, able to engage with the complex general practice environment and able to identify and manage many clinically important errors in medicines management. The transferability of the PINCER intervention approach, both in relation to other prescribing errors and to other practices, is likely to be high. PMID:22682095

  13. Acoustic-articulatory mapping in vowels by locally weighted regression

    PubMed Central

    McGowan, Richard S.; Berger, Michael A.

    2009-01-01

    A method for mapping between simultaneously measured articulatory and acoustic data is proposed. The method uses principal components analysis on the articulatory and acoustic variables, and mapping between the domains by locally weighted linear regression, or loess [Cleveland, W. S. (1979). J. Am. Stat. Assoc. 74, 829–836]. The latter method permits local variation in the slopes of the linear regression, assuming that the function being approximated is smooth. The methodology is applied to vowels of four speakers in the Wisconsin X-ray Microbeam Speech Production Database, with formant analysis. Results are examined in terms of (1) examples of forward (articulation-to-acoustics) mappings and inverse mappings, (2) distributions of local slopes and constants, (3) examples of correlations among slopes and constants, (4) root-mean-square error, and (5) sensitivity of formant frequencies to articulatory change. It is shown that the results are qualitatively correct and that loess performs better than global regression. The forward mappings show different root-mean-square error properties than the inverse mappings indicating that this method is better suited for the forward mappings than the inverse mappings, at least for the data chosen for the current study. Some preliminary results on sensitivity of the first two formant frequencies to the two most important articulatory principal components are presented. PMID:19813812
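
    A minimal sketch of locally weighted linear regression of the kind used here (Python; one-dimensional with a tricube kernel, in the spirit of Cleveland's loess — the data are synthetic, and the full articulatory-acoustic mapping with principal components is not reproduced):

        import numpy as np

        def loess_predict(x_train, y_train, x_query, span=0.5):
            # For each query point, fit a weighted linear regression to its
            # nearest neighbors, weighting by a tricube kernel of distance.
            x_train = np.asarray(x_train, float)
            y_train = np.asarray(y_train, float)
            k = max(2, int(span * len(x_train)))
            preds = []
            for xq in np.atleast_1d(x_query):
                d = np.abs(x_train - xq)
                idx = np.argsort(d)[:k]
                dmax = d[idx].max() if d[idx].max() > 0 else 1.0
                w = (1 - (d[idx] / dmax) ** 3) ** 3      # tricube weights
                sw = np.sqrt(w)                          # weighted least squares
                X = np.column_stack([np.ones(k), x_train[idx]])
                beta, *_ = np.linalg.lstsq(X * sw[:, None], y_train[idx] * sw, rcond=None)
                preds.append(beta[0] + beta[1] * xq)     # local constant and slope
            return np.array(preds)

        x = np.linspace(-2, 2, 40)          # e.g. an articulatory principal component
        y = 500 + 150 * np.tanh(x)          # e.g. a formant frequency in Hz
        print(loess_predict(x, y, [0.0, 1.0]))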

  14. "… Trial and error …": Speech-language pathologists' perspectives of working with Indigenous Australian adults with acquired communication disorders.

    PubMed

    Cochrane, Frances Clare; Brown, Louise; Siyambalapitiya, Samantha; Plant, Christopher

    2016-10-01

    This study explored speech-language pathologists' (SLPs) perspectives about factors that influence clinical management of Aboriginal and Torres Strait Islander adults with acquired communication disorders (e.g. aphasia, motor speech disorders). Using a qualitative phenomenological approach, seven SLPs working in North Queensland, Australia with experience working with this population participated in semi-structured in-depth interviews. Qualitative content analysis was used to identify categories and overarching themes within the data. Four categories, in relation to barriers and facilitators, were identified from participants' responses: (1) The Practice Context; (2) Working Together; (3) Client Factors; and (4) Speech-Language Pathologist Factors. Three overarching themes were also found to influence effective speech pathology services: (1) Aboriginal and Torres Strait Islander Cultural Practices; (2) Information and Communication; and (3) Time. This study identified many complex and inter-related factors which influenced SLPs' effective clinical management of this caseload. The findings suggest that SLPs should employ a flexible, holistic and collaborative approach in order to facilitate effective clinical management with Aboriginal and Torres Strait Islander people with acquired communication disorders.

  15. Bias analysis applied to Agricultural Health Study publications to estimate non-random sources of uncertainty.

    PubMed

    Lash, Timothy L

    2007-11-26

    The associations of pesticide exposure with disease outcomes are estimated without the benefit of a randomized design. For this reason and others, these studies are susceptible to systematic errors. I analyzed studies of the associations between alachlor and glyphosate exposure and cancer incidence, both derived from the Agricultural Health Study cohort, to quantify the bias and uncertainty potentially attributable to systematic error. For each study, I identified the prominent result and important sources of systematic error that might affect it. I assigned probability distributions to the bias parameters that allow quantification of the bias, drew a value at random from each assigned distribution, and calculated the estimate of effect adjusted for the biases. By repeating the draw and adjustment process over multiple iterations, I generated a frequency distribution of adjusted results, from which I obtained a point estimate and simulation interval. These methods were applied without access to the primary record-level dataset. The conventional estimates of effect associating alachlor and glyphosate exposure with cancer incidence were likely biased away from the null and understated the uncertainty by quantifying only random error. For example, the conventional p-value for a test of trend in the alachlor study equaled 0.02, whereas fewer than 20% of the bias analysis iterations yielded a p-value of 0.02 or lower. Similarly, the conventional fully-adjusted result associating glyphosate exposure with multiple myeloma equaled 2.6 with a 95% confidence interval of 0.7 to 9.4. The frequency distribution generated by the bias analysis yielded a median hazard ratio equal to 1.5 with a 95% simulation interval of 0.4 to 8.9, which was 66% wider than the conventional interval. Bias analysis provides a more complete picture of true uncertainty than conventional frequentist statistical analysis accompanied by a qualitative description of study limitations. The latter approach is likely to lead to overconfidence regarding the potential for causal associations, whereas the former safeguards against such overinterpretations. Furthermore, such analyses, once programmed, allow rapid implementation of alternative assignments of probability distributions to the bias parameters, and so elevate the plane of discussion regarding study bias from characterizing studies as "valid" or "invalid" to a critical and quantitative discussion of sources of uncertainty.
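
    A minimal sketch of the simulation loop described (Python; the lognormal bias distribution and its parameters are placeholders chosen for illustration, not the distributions assigned in the paper):

        import numpy as np

        rng = np.random.default_rng(0)

        def bias_adjusted_interval(conventional_estimate, n_iter=100_000):
            # Probabilistic bias analysis: on each iteration, draw a multiplicative
            # bias factor from its assigned distribution, adjust the conventional
            # estimate, then summarize the adjusted results as a median and a 95%
            # simulation interval.
            bias = rng.lognormal(mean=np.log(1.3), sigma=0.25, size=n_iter)
            adjusted = conventional_estimate / bias
            lo, med, hi = np.percentile(adjusted, [2.5, 50.0, 97.5])
            return med, (lo, hi)

        med, interval = bias_adjusted_interval(2.6)   # conventional hazard ratio
        print(med, interval)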

  16. Columbus safety and reliability

    NASA Astrophysics Data System (ADS)

    Longhurst, F.; Wessels, H.

    1988-10-01

    Analyses carried out to ensure Columbus reliability, availability, and maintainability, and operational and design safety are summarized. Failure modes, effects, and criticality analysis is the main qualitative tool used. The main aspects studied are fault tolerance, hazard consequence control, risk minimization, human error effects, restorability, and safe-life design.

  17. Simulations in nursing practice: toward authentic leadership.

    PubMed

    Shapira-Lishchinsky, Orly

    2014-01-01

    Aim: This study explores nurses' ethical decision-making in team simulations in order to identify the benefits of these simulations for authentic leadership. Background: While previous studies have indicated that team simulations may improve ethics in the workplace by reducing the number of errors, those studies focused mainly on clinical aspects and not on nurses' ethical experiences or on the benefits of authentic leadership. Methods: Fifty nurses from 10 health institutions in central Israel participated in the study. Data about nurses' ethical experiences were collected from 10 teams. Qualitative data analysis based on Grounded Theory was applied, using the atlas.ti 5.0 software package. Findings: Simulation findings suggest four main benefits that reflect the underlying components of authentic leadership: self-awareness, relational transparency, balanced information processing and internalized moral perspective. Conclusions: Team-based simulation as a training tool may lead to authentic leadership among nurses. Implications for nursing management: Nursing management should incorporate team simulations into nursing practice to help resolve power conflicts and to develop authentic leadership in nursing. Consequently, errors will decrease, patients' safety will increase and optimal treatment will be provided.

  18. On the use of drawing tasks in neuropsychological assessment.

    PubMed

    Smith, Alastair D

    2009-03-01

    Drawing tasks have attained a central position in neuropsychological assessment and are considered a rich source of information about the presence (or absence) of cognitive and perceptuo-motor abilities. However, unlike other tests of cognitive impairment, drawing tasks are often administered without reference to normative models of graphic production, and their results are often analyzed qualitatively. I begin this article by delineating the different ways in which drawing errors have been used to indicate particular functional deficits in neurological patients. I then describe models of drawing that have been explicitly based on the errors observed in patient drawings. Finally, the case is made for developing a more sensitive set of metrics in order to quantitatively assess patient performance. By providing a finer grain of analysis to assessment we will not only be better able to characterize the consequences of cognitive dysfunction, but may also be able to more subtly characterize and dissociate patients who would otherwise have been placed in the same broad category of impairment.

  19. [Qualitative analysis of the evaluation indicators and their related parameters of ametropic state].

    PubMed

    Ren, Zeqin

    2016-01-01

    To investigate the theoretical basis and practical limitations of the existing calculation formulas used to evaluate the ametropic state. The evaluation indicators and their calculation parameters for ametropia were analyzed using the reduced schematic model eye, the paraxial imaging principle, and the laws of dimensional analysis. The existing formulas were derived from the reduced object vergence relation between object distance and image distance. Regarding the two measurement indicators of the existing formulas, the diopter was misused for refractive power, and "ametropia degree" is non-standard diction; neither is suitable as an evaluation indicator. The outcomes of the existing formulas, together with their plus-or-minus sign rules, represent refractive corrections rather than refractive errors proper. For refractive errors themselves, there is no suitable evaluation indicator. In the evaluation of the ametropic state, there are fundamental problems in the existing formulas resulting from the reduced object vergence: the measurement indicators and their dimensional units are confused and misused, and the calculated results refer only to refractive corrections. The evaluation indicators for ametropia need to be further discussed.
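
    For reference, the paraxial vergence relation from which such formulas are typically derived (a textbook result, stated here as background rather than as this paper's own derivation) is, in LaTeX notation:

        L' = L + F, \qquad L = \frac{n}{l}, \qquad L' = \frac{n'}{l'}

    where l and l' are the object and image distances in metres, n and n' the refractive indices of object and image space, L and L' the corresponding vergences in dioptres, and F the refractive power in dioptres.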

  20. Employer reasons for failing to report eligible workers’ compensation claims in the BLS survey of occupational injuries and illnesses

    PubMed Central

    Wuellner, Sara E.; Bonauto, David K.

    2016-01-01

    Background: Little research has been done to identify reasons employers fail to report some injuries and illnesses in the Bureau of Labor Statistics Survey of Occupational Injuries and Illnesses (SOII). Methods: We interviewed the 2012 Washington SOII respondents from establishments that had failed to report one or more eligible workers’ compensation claims in the SOII about their reasons for not reporting specific claims. Qualitative content analysis methods were used to identify themes and patterns in the responses. Results: Non-compliance with OSHA recordkeeping or SOII reporting instructions and data entry errors led to unreported claims. Some employers refused to include claims because they did not consider the injury to be work-related, despite workers’ compensation eligibility. Participant responses brought the SOII eligibility of some claims into question. Conclusion: Systematic and non-systematic errors lead to SOII underreporting. Insufficient recordkeeping systems and limited knowledge of reporting requirements are barriers to accurate workplace injury records. PMID:26970051

  1. Processing medical data: a systematic review

    PubMed Central

    2013-01-01

    Background: Medical data recording is one of the basic clinical tools. The Electronic Health Record (EHR) is important for data processing, communication, efficient and effective access to patients' information, confidentiality, and ethical and/or legal issues. Clinical records promote and support communication among service providers and hence raise the quality of healthcare. The quality of records is a reflection of the quality of care patients are offered. Methods: Qualitative analysis was undertaken for this systematic review. We reviewed 40 materials published from 1999 to 2013. We searched for these materials in databases including ovidMEDLINE and ovidEMBASE. Two reviewers independently screened materials on medical data recording, documentation and information processing and communication. Finally, all selected references were summarized, reconciled and compiled as one compatible document. Results: Patients were dying and/or suffering as a result of poor-quality medical records. Electronic health records minimize errors and save unnecessary time and money spent on processing medical data. Conclusion: Many countries have complained of incompleteness, inappropriateness and illegibility of records. Creating awareness of the magnitude of the problem is therefore of paramount importance, as available and correct patient information has great potential to reduce errors and support care. PMID:24107106

  2. Errors in veterinary practice: preliminary lessons for building better veterinary teams.

    PubMed

    Kinnison, T; Guile, D; May, S A

    2015-11-14

    Case studies in two typical UK veterinary practices were undertaken to explore teamwork, including interprofessional working. Each study involved one week of whole team observation based on practice locations (reception, operating theatre), one week of shadowing six focus individuals (veterinary surgeons, veterinary nurses and administrators) and a final week consisting of semistructured interviews regarding teamwork. Errors emerged as a finding of the study. The definition of errors was inclusive, pertaining to inputs or omitted actions with potential adverse outcomes for patients, clients or the practice. The 40 identified instances could be grouped into clinical errors (dosing/drugs, surgical preparation, lack of follow-up), lost item errors and, most frequently, communication errors (records, procedures, missing face-to-face communication, mistakes within face-to-face communication). The qualitative nature of the study allowed the underlying causes of the errors to be explored. In addition to some individual mistakes, system faults were identified as a major cause of errors. Observed examples and interviews demonstrated several challenges to interprofessional teamworking which may cause errors, including: lack of time, part-time staff leading to frequent handovers, branch differences and individual veterinary surgeon work preferences. Lessons are drawn for building better veterinary teams, and implications for disciplinary proceedings are considered.

  3. Muscle Strength and Qualitative Jump-Landing Differences in Male and Female Military Cadets: The Jump-ACL Study

    PubMed Central

    Beutler, Anthony I.; de la Motte, Sarah J.; Marshall, Stephen W.; Padua, Darin A.; Boden, Barry P.

    2009-01-01

    Recent studies have focused on gender differences in movement patterns as risk factors for ACL injury. Understanding intrinsic and extrinsic factors which contribute to movement patterns is critical to ACL injury prevention efforts. Isometric lower-extremity muscular strength, anthropometrics, and jump-landing technique were analyzed for 2,753 cadets (1,046 female, 1,707 male) from the U.S. Air Force, Military and Naval Academies. Jump-landings were evaluated using the Landing Error Scoring System (LESS), a valid qualitative movement screening tool. We hypothesized that distinct anthropometric factors (Q-angle, navicular drop, bodyweight) and muscle strength would predict poor jump-landing technique in males versus females, and that female cadets would have higher scores (more errors) on a qualitative movement screen (LESS) than males. Mean LESS scores were significantly higher in female (5.34 ± 1.51) versus male (4.65 ± 1.69) cadets (p < 0.001). Qualitative movement scores were analyzed using factor analyses, yielding five factors, or “patterns”, contributing to poor landing technique. Females were significantly more likely to have poor technique due to landing with less hip and knee flexion at initial contact (p < 0.001), more knee valgus with wider landing stance (p < 0.001), and less flexion displacement over the entire landing (p < 0.001). Males were more likely to have poor technique due to landing toe-out (p < 0.001), with heels first, and with an asymmetric foot landing (p < 0.001). Many of the identified factor patterns have been previously proposed to contribute to ACL injury risk. However, univariate and multivariate analyses of muscular strength and anthropometric factors did not strongly predict LESS scores for either gender, suggesting that changing an athlete’s alignment, BMI, or muscle strength may not directly improve his or her movement patterns. Key points: Important differences in male and female landing technique can be captured using a qualitative movement screen: the Landing Error Scoring System (LESS). Female cadets were more likely to land with shallow sagittal flexion, wide stance width, and more pronounced knee valgus. Male cadets were more likely to exhibit a heel-strike or asymmetric foot-strike and to land with toe out. Lower extremity muscle strength, Q-angle, and navicular drop do not significantly predict landing movement pattern in male or female cadets. PMID:21132103

  4. Optical Fourier filtering for whole lens assessment of progressive power lenses.

    PubMed

    Spiers, T; Hull, C C

    2000-07-01

    Four binary filter designs for use in an optical Fourier filtering set-up were evaluated when taking quantitative measurements and when qualitatively mapping the power variation of progressive power lenses (PPLs). The binary filters tested were concentric ring, linear grating, grid and "chevron" designs. The chevron filter was considered best for quantitative measurements since it permitted a vernier acuity task to be used for measuring the fringe spacing, significantly reducing errors, and it also gave information on the polarity of the lens power. The linear grating filter was considered best for qualitatively evaluating the power variation. Optical Fourier filtering and a Nidek automatic focimeter were then used to measure the powers in the distance and near portions of five PPLs of differing design. Mean measurement error was 0.04 D with a maximum value of 0.13 D. Good qualitative agreement was found between the iso-cylinder plots provided by the manufacturer and the Fourier filter fringe patterns for the PPLs indicating that optical Fourier filtering provides the ability to map the power distribution across the entire lens aperture without the need for multiple point measurements. Arguments are presented that demonstrate that it should be possible to derive both iso-sphere and iso-cylinder plots from the binary filter patterns.
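
    A numerical analogue of such an optical Fourier filtering set-up (Python; purely illustrative, with an arbitrary grating period and quadratic lens phase rather than the authors' parameters):

        import numpy as np

        def fourier_filter(field, mask):
            # 4f-style filtering: transform to the Fourier plane, apply a binary
            # amplitude mask, and transform back to the image plane.
            F = np.fft.fftshift(np.fft.fft2(field))
            return np.fft.ifft2(np.fft.ifftshift(F * mask))

        n = 256
        y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
        mask = ((x // 8) % 2 == 0).astype(float)        # binary linear-grating filter
        # Hypothetical lens transmission: a quadratic phase (uniform-power element)
        field = np.exp(1j * 2 * np.pi * (x**2 + y**2) / 5.0e4)
        fringes = np.abs(fourier_filter(field, mask)) ** 2
        print(fringes.shape)                            # fringe spacing encodes local power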

  5. Differences between conduction aphasia and Wernicke's aphasia.

    PubMed

    Anzaki, F; Izumi, S

    2001-07-01

    Conduction aphasia and Wernicke's aphasia have been differentiated by the degree of auditory language comprehension. We quantitatively compared the speech sound errors of two conduction aphasia patients and three Wernicke's aphasia patients on various language modality tests. All of the patients were Japanese. The two conduction aphasia patients made "conduites d'approche" errors and phonological paraphasias. The patient with mild Wernicke's aphasia made various kinds of errors. In the patient with severe Wernicke's aphasia, neologism was observed. Phonological paraphasia in the two conduction aphasia patients seemed to occur while the patient searched for the target word. They made more errors in the vowels than in the consonants of target words on the naming and repetition tests, and appeared to search for the target word using the correct consonant phoneme but an incorrect vowel phoneme in the Japanese syllabary table. The Wernicke's aphasia patients, who had severe impairment of auditory comprehension, made more errors in the consonants than in the vowels of target words. In conclusion, the utterances of conduction aphasia and those of Wernicke's aphasia are qualitatively distinct.

  6. Qualitative data analysis: conceptual and practical considerations.

    PubMed

    Liamputtong, Pranee

    2009-08-01

    Qualitative inquiry requires that collected data be organised in a meaningful way, and this is referred to as data analysis. Through analytic processes, researchers turn what can be voluminous data into understandable and insightful analysis. This paper sets out the different approaches that qualitative researchers can use to make sense of their data, including thematic analysis, narrative analysis, discourse analysis and semiotic analysis. I first discuss salient issues in performing qualitative data analysis, then provide some suggestions on different methods of data analysis in qualitative research, and finally discuss the use of computer-assisted data analysis.

  7. Augmenting Qualitative Text Analysis with Natural Language Processing: Methodological Study.

    PubMed

    Guetterman, Timothy C; Chang, Tammy; DeJonckheere, Melissa; Basu, Tanmay; Scruggs, Elizabeth; Vydiswaran, V G Vinod

    2018-06-29

    Qualitative research methods are increasingly being used across disciplines because of their ability to help investigators understand the perspectives of participants in their own words. However, qualitative analysis is a laborious and resource-intensive process. To achieve depth, researchers are limited to smaller sample sizes when analyzing text data. One potential method to address this concern is natural language processing (NLP). Qualitative text analysis involves researchers reading data, assigning code labels, and iteratively developing findings; NLP has the potential to automate part of this process. Unfortunately, little methodological research has been done to compare automatic coding using NLP techniques and qualitative coding, which is critical to establish the viability of NLP as a useful, rigorous analysis procedure. The purpose of this study was to compare the utility of a traditional qualitative text analysis, an NLP analysis, and an augmented approach that combines qualitative and NLP methods. We conducted a 2-arm cross-over experiment to compare qualitative and NLP approaches to analyze data generated through 2 text message (short message service, SMS) survey questions, one about prescription drugs and the other about police interactions, sent to youth aged 14-24 years. We randomly assigned a question to each of the 2 experienced qualitative analysis teams for independent coding and analysis before receiving NLP results. A third team separately conducted NLP analysis of the same 2 questions. We examined the results of our analyses to compare (1) the similarity of findings derived, (2) the quality of inferences generated, and (3) the time spent in analysis. The qualitative-only analysis for the drug question (n=58) yielded 4 major findings, whereas the NLP analysis yielded 3 findings that missed contextual elements. The qualitative and NLP-augmented analysis was the most comprehensive. For the police question (n=68), the qualitative-only analysis yielded 4 primary findings and the NLP-only analysis yielded 4 slightly different findings. Again, the augmented qualitative and NLP analysis was the most comprehensive and produced the highest quality inferences, increasing our depth of understanding (ie, details and frequencies). In terms of time, the NLP-only approach was quicker than the qualitative-only approach for the drug (120 vs 270 minutes) and police (40 vs 270 minutes) questions. An approach beginning with qualitative analysis followed by NLP augmentation took longer than one beginning with NLP for both the drug (450 vs 240 minutes) and police (390 vs 220 minutes) questions. NLP provides both a foundation to code qualitatively more quickly and a method to validate qualitative findings. NLP methods were able to identify major themes found with traditional qualitative analysis but were not useful in identifying nuances. Traditional qualitative text analysis added important details and context.
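
    The abstract does not specify the NLP pipeline the third team used; as a hypothetical illustration of how candidate "codes" can be extracted automatically from short free-text responses, the sketch below combines TF-IDF with non-negative matrix factorization (the response strings are invented).

        # Hypothetical automated theme extraction; not the study's pipeline.
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.decomposition import NMF

        responses = [
            "prescription drugs are too easy to get at school",
            "my doctor explained the prescription carefully",
            "police stopped me for no reason",
            "the officer was polite and explained everything",
        ]
        tfidf = TfidfVectorizer(stop_words="english")
        X = tfidf.fit_transform(responses)

        nmf = NMF(n_components=2, random_state=0)
        weights = nmf.fit_transform(X)              # response-by-topic weights
        terms = tfidf.get_feature_names_out()
        for k, row in enumerate(nmf.components_):   # top terms act as candidate codes
            top = [terms[i] for i in row.argsort()[-4:][::-1]]
            print(f"topic {k}: {top}")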

  8. Testing for qualitative heterogeneity: An application to composite endpoints in survival analysis.

    PubMed

    Oulhaj, Abderrahim; El Ghouch, Anouar; Holman, Rury R

    2017-01-01

    Composite endpoints are frequently used in clinical outcome trials to capture more outcome events, thereby increasing statistical power. A key requirement for a composite endpoint to be meaningful is the absence of so-called qualitative heterogeneity, to ensure a valid overall interpretation of any treatment effect identified. Qualitative heterogeneity occurs when individual components of a composite endpoint exhibit differences in the direction of a treatment effect. In this paper, we develop a general statistical method to test for qualitative heterogeneity, that is, to test whether a given set of parameters shares the same sign. This method is based on the intersection-union principle and, provided that the sample size is large, is valid whatever model is used for parameter estimation. We propose two versions of the testing procedure, one based on random sampling from a Gaussian distribution and another based on bootstrapping. Our work covers both the case of completely observed data and the case where some observations are censored, an important issue in many clinical trials. We evaluated the size and power of the proposed tests by carrying out extensive Monte Carlo simulations for multivariate time-to-event data. The simulations were designed under a variety of conditions on dimensionality, censoring rate, sample size and correlation structure. Our testing procedure showed very good performance in terms of statistical power and type I error. The proposed test was applied to a data set from a single-center, randomized, double-blind controlled trial in the area of Alzheimer's disease.
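
    The intersection-union principle described here reduces to a simple decision rule: conclude that all components share a given sign only if every component test rejects, with the overall p-value taken as the maximum of the component p-values. A minimal sketch under a normal approximation follows; the data are illustrative, and the paper's censored-data and bootstrap machinery is not reproduced.

        # Intersection-union sign test sketch (normal approximation, toy data).
        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(1)
        # hypothetical per-subject effect estimates for 3 endpoint components
        effects = rng.normal(loc=-0.5, scale=1.0, size=(200, 3))

        def component_p(x):
            """One-sided p-value for H0: mean >= 0 vs H1: mean < 0."""
            t = x.mean() / (x.std(ddof=1) / np.sqrt(x.size))
            return norm.cdf(t)

        p_values = [component_p(effects[:, j]) for j in range(effects.shape[1])]
        p_overall = max(p_values)   # reject only if every component test rejects
        print(p_values, p_overall)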

  9. Apology in cases of medical error disclosure: Thoughts based on a preliminary study

    PubMed Central

    Dahan, Sonia; Ducard, Dominique

    2017-01-01

    Background: Disclosing medical errors is considered necessary by patients, ethicists, and health care professionals. The literature insists on the framing of this disclosure and describes the apology as appropriate and necessary. However, this policy seems difficult to put into practice, and few works have explored the function and meaning of the apology. Objective: The aim of this study was to explore the role ascribed to apology in communication between healthcare professionals and patients when disclosing a medical error, and to discuss the findings from a linguistic and philosophical perspective. Methods: Qualitative exploratory study, based on face-to-face semi-structured interviews with seven physicians in a neonatal unit in France, analyzed using discourse analysis. Results: Four themes emerged: the difference between apology in everyday life and in the medical encounter; the place of the apology in the disclosure process, together with explanations, regrets, empathy and ways to avoid repeating the error; the effects of the apology, which allowed the patient-physician relationship undermined by the error to be maintained, responsibility to be accepted, the first steps towards forgiveness to be taken, and a less hierarchical doctor-patient relationship to be created; and ways of expressing apology (“I am sorry”), which reflected regret and empathy more than an explicit apology. Conclusion: This study highlights how the act of apology can be seen as a “language act” as described by the philosophers Austin and Searle, and how it functions both as a technique for making amends following a wrongdoing and as an action undertaken so that neither party loses face, echoing the sociologist Goffman’s interaction theory. This interpretation also accords with the views of Lazare, for whom the function of apology is the restoration of dignity after the humiliation of the error. This approach to the apology illustrates how the meaning and impact of real-life language acts can be clarified by philosophical and sociological ideas. PMID:28759586

  10. Patients and families as teachers: a mixed methods assessment of a collaborative learning model for medical error disclosure and prevention.

    PubMed

    Langer, Thorsten; Martinez, William; Browning, David M; Varrin, Pamela; Sarnoff Lee, Barbara; Bell, Sigall K

    2016-08-01

    Despite growing interest in engaging patients and families (P/F) in patient safety education, little is known about how P/F can best contribute. We assessed the feasibility and acceptability of a patient-teacher medical error disclosure and prevention training model. We developed an educational intervention bringing together interprofessional clinicians with P/F from hospital advisory councils to discuss error disclosure and prevention. Patient focus groups and orientation sessions informed curriculum and assessment design. A pre-post survey with qualitative and quantitative questions was used to assess P/F and clinician experiences and attitudes about collaborative safety education, including participant hopes, fears, perceived value of the learning experience, and challenges. Responses to open-ended questions were coded according to principles of content analysis. P/F and clinicians hoped to learn about each other's perspectives, communication skills and patient empowerment strategies. Before the intervention, both groups worried about power dynamics dampening effective interaction. Clinicians worried that P/F would learn about their fallibility, while P/F were concerned about clinicians' jargon and defensive posturing. Following the workshops, clinicians valued patients' direct feedback, communication strategies for error disclosure and a 'real' learning experience. P/F appreciated clinicians' accountability and insights into how medical errors affect clinicians. Half of the participants found nothing challenging; among the remainder, clinicians cited emotions and the enormity of 'culture change', while P/F commented on medical jargon and a desire for more time. Patients and clinicians found the experience valuable, and recommendations about how to develop a patient-teacher programme in patient safety are provided. An educational paradigm that includes patients as teachers and collaborative learners with clinicians in patient safety is feasible, valued by clinicians and P/F, and promising for P/F-centred medical error disclosure and prevention training.

  11. Apology in cases of medical error disclosure: Thoughts based on a preliminary study.

    PubMed

    Dahan, Sonia; Ducard, Dominique; Caeymaex, Laurence

    2017-01-01

    Disclosing medical errors is considered necessary by patients, ethicists, and health care professionals. The literature insists on the framing of this disclosure and describes the apology as appropriate and necessary. However, this policy seems difficult to put into practice, and few works have explored the function and meaning of the apology. The aim of this study was to explore the role ascribed to apology in communication between healthcare professionals and patients when disclosing a medical error, and to discuss the findings from a linguistic and philosophical perspective. Qualitative exploratory study, based on face-to-face semi-structured interviews with seven physicians in a neonatal unit in France, analyzed using discourse analysis. Four themes emerged: the difference between apology in everyday life and in the medical encounter; the place of the apology in the disclosure process, together with explanations, regrets, empathy and ways to avoid repeating the error; the effects of the apology, which allowed the patient-physician relationship undermined by the error to be maintained, responsibility to be accepted, the first steps towards forgiveness to be taken, and a less hierarchical doctor-patient relationship to be created; and ways of expressing apology ("I am sorry"), which reflected regret and empathy more than an explicit apology. This study highlights how the act of apology can be seen as a "language act" as described by the philosophers Austin and Searle, and how it functions both as a technique for making amends following a wrongdoing and as an action undertaken so that neither party loses face, echoing the sociologist Goffman's interaction theory. This interpretation also accords with the views of Lazare, for whom the function of apology is the restoration of dignity after the humiliation of the error. This approach to the apology illustrates how the meaning and impact of real-life language acts can be clarified by philosophical and sociological ideas.

  12. Understanding Periodicity as a Process with Gestalt Structure.

    ERIC Educational Resources Information Center

    Shama, Gilli

    1998-01-01

    Presents a two-phase investigation of how Israeli students understand the concept of periodicity. Discusses related research with teachers and students (N=895) employing both qualitative and quantitative research methodologies. Concludes that students understand periodicity as a process. Students' errors and preferences are discussed with…

  13. Spectroscopic and Chemometric Analysis of Binary and Ternary Edible Oil Mixtures: Qualitative and Quantitative Study.

    PubMed

    Jović, Ozren; Smolić, Tomislav; Primožič, Ines; Hrenar, Tomica

    2016-04-19

    The aim of this study was to investigate the feasibility of FTIR-ATR spectroscopy coupled with multivariate numerical methodology for qualitative and quantitative analysis of binary and ternary edible oil mixtures. Four pure oils (extra virgin olive oil, high oleic sunflower oil, rapeseed oil, and sunflower oil), as well as their 54 binary and 108 ternary mixtures, were analyzed using FTIR-ATR spectroscopy in combination with principal component and discriminant analysis, partial least-squares regression, and principal component regression. It was found that the composition of all 166 samples can be excellently represented using only the first three principal components, describing 98.29% of total variance in the selected spectral range (3035-2989, 1170-1140, 1120-1100, 1093-1047, and 930-890 cm⁻¹). Factor scores in the 3D space spanned by these three principal components form a tetrahedral-like arrangement: pure oils at the vertices, binary mixtures on the edges, and ternary mixtures on the faces of a tetrahedron. To confirm the validity of the results, several cross-validation methods were applied. Quantitative analysis was performed by minimizing the root-mean-square error of cross-validation with respect to the spectral range, derivative order, and choice of method (partial least-squares or principal component regression), which resulted in excellent predictions for test sets (R² > 0.99 in all cases). Additionally, the experimentally more demanding gas chromatography analysis of fatty acid content was carried out for all specimens, confirming the results obtained by FTIR-ATR coupled with principal component analysis. However, FTIR-ATR provided a considerably better model for prediction of mixture composition than gas chromatography, especially for high oleic sunflower oil.
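
    As a schematic of the chemometric workflow described (PCA for the qualitative mixture structure, PLS with cross-validated RMSE for composition), the sketch below runs on synthetic "spectra"; the oil data, spectral ranges, and preprocessing choices of the study are not reproduced.

        # PCA + PLS chemometrics sketch on synthetic mixture spectra.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict

        rng = np.random.default_rng(2)
        fractions = rng.dirichlet(np.ones(4), size=166)      # mixture compositions
        pure = rng.random((4, 300))                          # stand-in pure-oil spectra
        spectra = fractions @ pure + rng.normal(0, 0.01, (166, 300))

        scores = PCA(n_components=3).fit_transform(spectra)  # qualitative structure
        pred = cross_val_predict(PLSRegression(n_components=3),
                                 spectra, fractions, cv=10)
        rmsecv = np.sqrt(np.mean((pred - fractions) ** 2))   # quantity to minimize
        print(scores.shape, rmsecv)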

  14. Explanation of asymmetric dynamics of human water consumption in arid regions: prospect theory versus expected utility theory

    NASA Astrophysics Data System (ADS)

    Tian, F.; Lu, Y.

    2017-12-01

    Based on socioeconomic and hydrological data from three arid inland basins, together with error analysis, the dynamics of human water consumption (HWC) are shown to be asymmetric: HWC increases rapidly in wet periods but holds steady or decreases only slightly in dry periods. Beyond the qualitative explanation, namely that abundant water availability in wet periods spurs rapid growth in HWC while the now-expanded economy is sustained through over-exploitation in dry periods, two quantitative models are established and tested, based on expected utility theory (EUT) and prospect theory (PT), respectively. EUT states that humans make decisions based on total expected utility, i.e., the sum of the utility of each outcome multiplied by its probability, whereas PT states that the utility function is defined over gains and losses separately and that probabilities are replaced by a probability weighting function.
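
    In standard textbook notation (assumed here; the abstract does not give the paper's exact specification), the two decision models can be written as

        EU = \sum_i p_i \, u(x_i)   (expected utility theory)

        V = \sum_i w(p_i) \, v(x_i), \quad
        v(x) = \begin{cases} x^{\alpha}, & x \ge 0 \\ -\lambda (-x)^{\beta}, & x < 0 \end{cases}, \quad
        w(p) = \frac{p^{\gamma}}{\left( p^{\gamma} + (1-p)^{\gamma} \right)^{1/\gamma}}   (prospect theory)

    where v is the value function defined separately over gains and losses and w is the probability weighting function; the Tversky-Kahneman forms shown are one common parameterization.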

  15. Research on the effects of geometrical and material uncertainties on the band gap of the undulated beam

    NASA Astrophysics Data System (ADS)

    Li, Yi; Xu, Yanlong

    2017-09-01

    Considering uncertain geometrical and material parameters, the lower and upper bounds of the band gap of an undulated beam with periodically arched shape are studied by Monte Carlo Simulation (MCS) and by interval analysis based on the Taylor series. Given random variations of the uncertain variables, scatter plots from the MCS are used to analyze the qualitative sensitivities of the band gap with respect to these uncertainties. We find that the influence of uncertainty in the geometrical parameter on the band gap of the undulated beam is stronger than that of the material parameter; this conclusion is also confirmed by the interval analysis based on the Taylor series. Our methodology offers a strategy for reducing the discrepancy between designed and realized band gaps of periodic structures by improving the accuracy of specifically selected uncertain design variables.
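
    A minimal Monte Carlo sketch of the screening step described, with a placeholder response surface standing in for the undulated-beam band-gap model (all numbers are hypothetical):

        # Monte Carlo sensitivity screening sketch; band_gap is a placeholder
        # response surface, NOT the paper's beam model.
        import numpy as np

        rng = np.random.default_rng(3)

        def band_gap(arch_height, youngs_modulus):
            return 1.0e3 * arch_height + 1.0e-8 * youngs_modulus

        h = rng.normal(5e-3, 2e-4, 10_000)   # geometric parameter (m)
        E = rng.normal(70e9, 2e9, 10_000)    # material parameter (Pa)
        g = band_gap(h, E)

        print("band-gap bounds:", g.min(), g.max())
        print("corr with geometry:", np.corrcoef(h, g)[0, 1])
        print("corr with material:", np.corrcoef(E, g)[0, 1])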

  16. Subthreshold muscle twitches dissociate oscillatory neural signatures of conflicts from errors.

    PubMed

    Cohen, Michael X; van Gaal, Simon

    2014-02-01

    We investigated the neural systems underlying conflict detection and error monitoring during rapid online error correction/monitoring mechanisms. We combined data from four separate cognitive tasks and 64 subjects in which EEG and EMG (muscle activity from the thumb used to respond) were recorded. In typical neuroscience experiments, behavioral responses are classified as "error" or "correct"; however, closer inspection of our data revealed that correct responses were often accompanied by "partial errors" - a muscle twitch of the incorrect hand ("mixed correct trials", ~13% of the trials). We found that these muscle twitches dissociated conflicts from errors in time-frequency domain analyses of the EEG data. In particular, both mixed-correct trials and full error trials were associated with enhanced theta-band power (4-9 Hz) compared to correct trials. However, full errors were additionally associated with power and frontal-parietal synchrony in the delta band. Single-trial robust multiple regression analyses revealed a significant modulation of theta power as a function of partial error correction time, thus linking trial-to-trial fluctuations in power to conflict. Furthermore, single-trial correlation analyses revealed a qualitative dissociation between conflict and error processing, such that mixed correct trials were associated with positive theta-RT correlations whereas full error trials were associated with negative delta-RT correlations. These findings shed new light on the local and global network mechanisms of conflict monitoring and error detection, and their relationship to online action adjustment.

  17. Transmission and storage of medical images with patient information.

    PubMed

    Acharya U, Rajendra; Subbanna Bhat, P; Kumar, Sathish; Min, Lim Choo

    2003-07-01

    Digital watermarking is a technique of hiding specific identification data for copyright authentication. This technique is adapted here for interleaving patient information with medical images, to reduce storage and transmission overheads. The text data is encrypted before interleaving with the images to ensure greater security, and graphical signals are interleaved with the image. Two types of error-control coding techniques are proposed to enhance the reliability of transmission and storage of medical images interleaved with patient information. Transmission and storage scenarios are simulated with and without error-control coding, and a qualitative as well as quantitative interpretation is provided of the reliability enhancement resulting from commonly used error-control codes such as repetition codes and the (7,4) Hamming code.
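
    The (7,4) Hamming code mentioned above encodes 4 data bits into a 7-bit block and corrects any single bit error. The sketch below uses the standard systematic construction; it illustrates the code itself, not the paper's implementation.

        # (7,4) Hamming code: encode 4 data bits, correct a single bit error.
        import numpy as np

        G = np.array([[1, 0, 0, 0, 1, 1, 0],   # generator matrix [I4 | P]
                      [0, 1, 0, 0, 1, 0, 1],
                      [0, 0, 1, 0, 0, 1, 1],
                      [0, 0, 0, 1, 1, 1, 1]])
        H = np.array([[1, 1, 0, 1, 1, 0, 0],   # parity-check matrix [P^T | I3]
                      [1, 0, 1, 1, 0, 1, 0],
                      [0, 1, 1, 1, 0, 0, 1]])

        def encode(bits4):
            return (np.array(bits4) @ G) % 2

        def decode(word7):
            syndrome = (H @ word7) % 2
            if syndrome.any():                  # nonzero syndrome: locate and flip
                col = np.where((H.T == syndrome).all(axis=1))[0][0]
                word7 = word7.copy()
                word7[col] ^= 1
            return word7[:4]                    # systematic: data bits come first

        msg = [1, 0, 1, 1]
        block = encode(msg)
        block[2] ^= 1                           # inject a single bit error
        assert list(decode(block)) == msg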

  18. Negligence, genuine error, and litigation

    PubMed Central

    Sohn, David H

    2013-01-01

    Not all medical injuries are the result of negligence. In fact, most medical injuries are the result either of the inherent risk in the practice of medicine or of system errors, which cannot be prevented simply through fear of disciplinary action. This paper discusses the differences between adverse events, negligence, and system errors; the current medical malpractice tort system in the United States; and current and future solutions, including medical malpractice reform, alternative dispute resolution, health courts, and no-fault compensation systems. The current political environment favors investigation of non-cap tort reform remedies; investment in more rational oversight systems, such as health courts or no-fault systems, may reap both quantitative and qualitative benefits for a less costly and safer health system. PMID:23426783

  19. Topological quantum error correction in the Kitaev honeycomb model

    NASA Astrophysics Data System (ADS)

    Lee, Yi-Chan; Brell, Courtney G.; Flammia, Steven T.

    2017-08-01

    The Kitaev honeycomb model is an approximate topological quantum error correcting code in the same phase as the toric code, but requiring only a 2-body Hamiltonian. As a frustrated spin model, it is well outside the commuting models of topological quantum codes that are typically studied, but its exact solubility makes it more amenable to analysis of effects arising in this noncommutative setting than a generic topologically ordered Hamiltonian. Here we study quantum error correction in the honeycomb model using both analytic and numerical techniques. We first prove explicit exponential bounds on the approximate degeneracy, local indistinguishability, and correctability of the code space. These bounds are tighter than can be achieved using known general properties of topological phases. Our proofs are specialized to the honeycomb model, but some of the methods may nonetheless be of broader interest. Following this, we numerically study noise caused by thermalization processes in the perturbative regime close to the toric code renormalization group fixed point. The appearance of non-topological excitations in this setting has no significant effect on the error correction properties of the honeycomb model in the regimes we study. Although the behavior of this model is found to be qualitatively similar to that of the standard toric code in most regimes, we find numerical evidence of an interesting effect in the low-temperature, finite-size regime where a preferred lattice direction emerges and anyon diffusion is geometrically constrained. We expect this effect to yield an improvement in the scaling of the lifetime with system size as compared to the standard toric code.

  20. Methods for quality-assurance review of water-quality data in New Jersey

    USGS Publications Warehouse

    Brown, G. Allan; Pustay, Edward A.; Gibs, Jacob

    2003-01-01

    Because values that are identified by the program as questionable may or may not be in error, the reviewer looks at both qualitative and quantitative relations between analytes during the period of record and then uses technical judgement to decide whether to accept a questionable value or investigate further. Guidelines for, and the use of regression analysis in, making this decision are described. Instructions are given for requesting that the analyzing laboratory reanalyze a constituent or otherwise verify the reported value. If, upon reanalysis or verification, a value is still questionable, consideration must be given to deleting the value or marking the value in the USGS National Water Information System database as having been reviewed and rejected.

  1. Principles of qualitative analysis in the chromatographic context.

    PubMed

    Valcárcel, M; Cárdenas, S; Simonet, B M; Carrillo-Carrión, C

    2007-07-27

    This article presents the state of the art of qualitative analysis in the framework of chromatographic analysis. After establishing the differences between the two main classes of qualitative analysis (analyte identification and sample classification/qualification), the particularities of instrumental qualitative analysis are commented on. Qualitative chromatographic analysis for sample classification/qualification through the so-called chromatographic fingerprint (for complex samples) or the volatiles profile (through direct coupling of headspace-mass spectrometry using the chromatograph as interface) is discussed. Next, a more technical exposition of qualitative chromatographic information is presented, supported by a variety of representative examples.

  2. Small-Signal Analysis of Autonomous Hybrid Distributed Generation Systems in Presence of Ultracapacitor and Tie-Line Operation

    NASA Astrophysics Data System (ADS)

    Ray, Prakash K.; Mohanty, Soumya R.; Kishor, Nand

    2010-07-01

    This paper presents small-signal analysis of isolated as well as interconnected autonomous hybrid distributed generation systems under sudden variations in load demand, wind speed and solar radiation. The hybrid systems comprise different renewable energy resources, such as wind, photovoltaic (PV), fuel cell (FC) and diesel engine generator (DEG) units, along with energy storage devices such as the flywheel energy storage system (FESS) and battery energy storage system (BESS). Further, ultracapacitors (UC) as an alternative energy storage element and interconnection of the hybrid systems through a tie-line are incorporated for improved performance. A comparative assessment of the frequency deviation profiles of different hybrid systems with different storage combinations is carried out graphically as well as in terms of a performance index (PI), i.e., the integral square error (ISE). Both qualitative and quantitative analyses show that the frequency deviation profiles improve in the presence of ultracapacitors compared with the other energy storage elements.
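
    The integral square error used as the performance index has the standard definition (with e(t) taken here to be the frequency deviation):

        ISE = \int_0^{T} e(t)^2 \, dt \approx \sum_k e_k^2 \, \Delta t

    so large, persistent frequency deviations are penalized quadratically over the simulation window.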

  3. Ethical Sensitivity in Nursing Ethical Leadership: A Content Analysis of Iranian Nurses Experiences

    PubMed Central

    Esmaelzadeh, Fatemeh; Abbaszadeh, Abbas; Borhani, Fariba; Peyrovi, Hamid

    2017-01-01

    Background: Considering that many nursing actions affect other people’s health and life, sensitivity to ethics in nursing practice is highly important for ethical leaders, who serve as role models. Objective: The study aims to explore ethical sensitivity in ethical nursing leaders in Iran. Method: This was a qualitative study based on conventional content analysis, conducted in 2015. Data were collected using deep, semi-structured interviews with 20 Iranian nurses, chosen through purposive sampling, and analyzed using conventional content analysis. In order to increase the accuracy and integrity of the data, Lincoln and Guba's criteria were considered. Results: Fourteen sub-categories and five main categories emerged. The main categories consisted of sensitivity to care, sensitivity to errors, sensitivity to communication, sensitivity in decision making and sensitivity to ethical practice. Conclusion: Ethical sensitivity appears to be a valuable attribute for ethical nurse leaders, having an important effect on various aspects of professional practice and helping the development of ethics in nursing practice. PMID:28584564

  4. Impractical CME programs: Influential parameters in Iran.

    PubMed

    Faghihi, Seyed Aliakbar; Khankeh, Hamid Reza; Hosseini, Seyed Jalil; Soltani Arabshahi, Seyed Kamran; Faghih, Zahra; Shirazi, Mandana

    2017-01-01

    Background: Traditional approaches in Continuing Medical Education (CME) appear to be ineffective in improving patient care, reducing medical errors, or altering physicians' behaviors. However, they are still executed by CME providers and are popular among the majority of physicians. In this study, we aimed to explore the parameters involved in the degree of effectiveness of CME programs in Iran. Methods: In this study, 31 participants, consisting of general practitioners and CME experts and providers, were recruited to take part in in-depth interviews and field observations concerning their experiences with CME. A qualitative paradigm was applied, with qualitative content analysis following a grounded theory data analysis methodology (constant comparative analysis). Results: Based on the participants' experiences, insufficient consistency between training program content and the demands of GPs, together with programs of no practical benefit to physicians and non-comprehensive educational designs, created a negative attitude toward continuing education among physicians. This can be summarized as an unrealistic continuing education program, the main theme here. Conclusion: Impracticable continuing education has created a negative attitude toward CME programs among physicians, so much so that they consider these programs of little importance and attend them without any specific aim, avoiding absenteeism merely to collect credit points. Evidently, promoting CME programs that improve physicians' performance requires factual needs assessment over and above adaptation of content to physicians' performance.

  5. Impractical CME programs: Influential parameters in Iran

    PubMed Central

    Faghihi, Seyed Aliakbar; Khankeh, Hamid Reza; Hosseini, Seyed Jalil; Soltani Arabshahi, Seyed Kamran; Faghih, Zahra; Shirazi, Mandana

    2017-01-01

    Background: Traditional approaches in Continuing Medical Education (CME) appear to be ineffective in improving patients’ care, reducing medical errors, or altering physicians' behaviors. However, they are still executed by CME providers and are popular among the majority of physicians. In this study, we aimed to explore the parameters involved in the degree of effectiveness of CME programs in Iran. Methods: In this study, 31 participants, consisting of general practitioners and CME experts and providers, were recruited to take part in in-depth interviews and field observations concerning their experiences with CME. A qualitative paradigm was applied, with qualitative content analysis following a grounded theory data analysis methodology (constant comparative analysis). Results: Based on the participants’ experiences, insufficient consistency between training program content and the demands of GPs, together with programs of no practical benefit to physicians and non-comprehensive educational designs, created a negative attitude toward continuing education among physicians. This can be summarized as an unrealistic continuing education program, the main theme here. Conclusion: Impracticable continuing education has created a negative attitude toward CME programs among physicians, so much so that they consider these programs of little importance and attend them without any specific aim, avoiding absenteeism merely to collect credit points. Evidently, promoting CME programs that improve physicians’ performance requires factual needs assessment over and above adaptation of content to physicians’ performance. PMID:28638813

  6. Found Poems, Member Checking and Crises of Representation

    ERIC Educational Resources Information Center

    Reilly, Rosemary C.

    2013-01-01

    In order to establish veracity, qualitative researchers frequently rely on member checks to ensure credibility by giving participants opportunities to correct errors, challenge interpretations and assess results; however, member checks are not without drawbacks. This paper describes an innovative approach to conducting member checks. Six members…

  7. Is scanning electron microscopy/energy dispersive X-ray spectrometry (SEM/EDS) quantitative?

    PubMed

    Newbury, Dale E; Ritchie, Nicholas W M

    2013-01-01

    Scanning electron microscopy/energy dispersive X-ray spectrometry (SEM/EDS) is a widely applied elemental microanalysis method capable of identifying and quantifying all elements in the periodic table except H, He, and Li. By following the "k-ratio" (unknown/standard) measurement protocol developed for electron-excited wavelength dispersive spectrometry (WDS), SEM/EDS can achieve accuracy and precision equivalent to WDS at substantially lower electron dose, even when severe X-ray peak overlaps occur, provided sufficient counts are recorded. Achieving this level of performance is now much more practical with the advent of the high-throughput silicon drift detector energy dispersive X-ray spectrometer (SDD-EDS). However, three measurement issues continue to diminish the impact of SEM/EDS: (1) in the qualitative analysis (i.e., element identification) that must precede quantitative analysis, at least some current and many legacy software systems are vulnerable to occasional misidentification of major constituent peaks, with the frequency of misidentifications rising significantly for minor and trace constituents; (2) the use of standardless analysis, which is subject to much broader systematic errors, leads to quantitative results that, while useful, do not have sufficient accuracy to solve critical problems, e.g., determining the formula of a compound; (3) EDS spectrometers have such a large volume of acceptance that apparently credible spectra can be obtained from specimens with complex topography, which introduces uncontrolled geometric factors that modify X-ray generation and propagation, resulting in very large systematic errors, often a factor of ten or more.

  8. Understanding Human Error in Naval Aviation Mishaps.

    PubMed

    Miranda, Andrew T

    2018-04-01

    To better understand the external factors that influence the performance and decisions of aviators involved in Naval aviation mishaps. Mishaps in complex activities, ranging from aviation to nuclear power operations, are often the result of interactions between multiple components within an organization. The Naval aviation mishap database contains relevant information, both in quantitative statistics and qualitative reports, that permits analysis of such interactions to identify how the working atmosphere influences aviator performance and judgment. Results from 95 severe Naval aviation mishaps that occurred from 2011 through 2016 were analyzed using Bayes' theorem, and a content analysis was then performed on a subset of relevant mishap reports. Of the 14 latent factors analyzed, the application of Bayes' theorem identified 6 that impacted specific aspects of aviator behavior during mishaps. Technological environment, misperceptions, and mental awareness impacted basic aviation skills. The remaining 3 factors were used to inform a content analysis of the contextual information within mishap reports. Teamwork failures were the result of plan continuation aggravated by diffused responsibility. Resource limitations and risk management deficiencies impacted judgments made by squadron commanders. The application of Bayes' theorem to historical mishap data revealed the role of latent factors within Naval aviation mishaps; teamwork failures were seen to be considerably damaging to both aviator skill and judgment. Both the methods and findings have direct application for organizations interested in understanding the relationships between external factors and human error, presenting real-world evidence to promote effective safety decisions.
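
    The Bayesian step described presumably takes the generic form of Bayes' theorem (the study's exact conditioning is not given in the abstract):

        P(\text{latent factor} \mid \text{behavior}) =
        \frac{P(\text{behavior} \mid \text{latent factor}) \, P(\text{latent factor})}{P(\text{behavior})}

    with the probabilities estimated from the frequencies of factor and behavior codes in the mishap database.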

  9. Technical Note: Introduction of variance component analysis to setup error analysis in radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matsuo, Yukinori, E-mail: ymatsuo@kuhp.kyoto-u.ac.

    Purpose: The purpose of this technical note is to introduce variance component analysis to the estimation of systematic and random components in setup error of radiotherapy. Methods: Balanced data according to the one-factor random effect model were assumed. Results: Analysis-of-variance (ANOVA)-based computation was applied to estimate the values and their confidence intervals (CIs) for systematic and random errors and the population mean of setup errors. The conventional method overestimates systematic error, especially in hypofractionated settings. The CI for systematic error becomes much wider than that for random error. The ANOVA-based estimation can be extended to a multifactor model considering multiple causes of setup errors (e.g., interpatient, interfraction, and intrafraction). Conclusions: Variance component analysis may lead to novel applications to setup error analysis in radiotherapy.
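
    For balanced one-factor data, the ANOVA estimators reduce to a few lines. The sketch below uses synthetic setup errors with patients as the random factor (not clinical data): the random component is the within-patient mean square, and the systematic component is (MSB - MSW)/n.

        # ANOVA variance-component sketch for balanced setup-error data.
        import numpy as np

        rng = np.random.default_rng(4)
        n_patients, n_fractions = 20, 5
        offsets = rng.normal(0, 2.0, size=(n_patients, 1))  # true systematic SD = 2.0
        errors = offsets + rng.normal(0, 1.0, (n_patients, n_fractions))  # random SD = 1.0

        msb = n_fractions * errors.mean(axis=1).var(ddof=1)  # between-patient mean square
        msw = errors.var(axis=1, ddof=1).mean()              # within-patient mean square

        sigma2_random = msw
        sigma2_systematic = max((msb - msw) / n_fractions, 0.0)
        print(np.sqrt(sigma2_systematic), np.sqrt(sigma2_random))  # ~2.0, ~1.0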

  10. Assessment of diet and physical activity of brazilian schoolchildren: usability testing of a web-based questionnaire.

    PubMed

    da Costa, Filipe Ferreira; Schmoelz, Camilie Pacheco; Davies, Vanessa Fernandes; Di Pietro, Patrícia Faria; Kupek, Emil; de Assis, Maria Alice Altenburg

    2013-08-19

    Information and communication technology (ICT) has been used with increasing frequency for the assessment of diet and physical activity in health surveys, and a number of Web-based questionnaires have been developed for children and adolescents. However, their usability characteristics have scarcely been reported, despite their potential importance for improving the feasibility and validity of ICT-based methods. The objective of this study was to describe the usability evaluation of the Consumo Alimentar e Atividade Física de Escolares (CAAFE) questionnaire (Food Consumption and Physical Activity Questionnaire for schoolchildren), a new Web-based survey tool for the self-assessment of diet and physical activity by schoolchildren. A total of 114 schoolchildren aged 6 to 12 years took part in questionnaire usability testing carried out in computer classrooms at five elementary schools in the city of Florianopolis, Brazil. Schoolchildren used a personal computer (PC) equipped with software for recording the computer screen and the children's speech during usability testing. Quantitative and qualitative analyses took into account objective usability metrics such as error counts and time to complete a task. Data on the main difficulties in accomplishing the task and the level of satisfaction expressed by the children were assessed by the observers using a standardized form and interviews with the children. Descriptive statistics and content analysis were used to summarize both the quantitative and the qualitative aspects of the data obtained. The mean time for completing the questionnaire was 13.7 minutes (SD 3.68). Compared to the children in 2nd or 3rd grades, those in 4th or 5th grades spent less time completing the questionnaire (median 12.4 vs 13.3 minutes, P=.022), asked for help less frequently (median 0 vs 1.0 count, P=.005), had a lower error count (median 2.0 vs 8.0 count, P<.001), and obtained a higher overall performance score (median 73.0 vs 68.0, P=.005). Children with a PC at home spent less time completing the questionnaire (median 12.3 vs 14.9 minutes, P<.001), had a lower overall error count (median 2.0 vs 9.0 count, P=.03), and had a higher performance score (median 72.0 vs 64.0, P=.005) compared to the children without a PC at home. The most common difficulty in completing the questionnaire was using the scroll bar. The majority of children reported a positive evaluation (liked a lot or liked) for the four design elements evaluated. The results of the present study provided feedback to improve the final version of the CAAFE questionnaire. Quantitative data showed minor errors and system failures, while qualitative data indicated that, overall, the children enjoyed the CAAFE questionnaire. Grade levels and PC use must be taken into account in Web-based tools designed for children.
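
    The group comparisons reported (medians with P values) are characteristic of a nonparametric test such as the Mann-Whitney U; assuming that is the test used, a sketch with invented completion times looks like this:

        # Nonparametric comparison sketch; completion times are invented.
        import numpy as np
        from scipy.stats import mannwhitneyu

        rng = np.random.default_rng(5)
        younger = rng.normal(13.3, 3.5, 60)   # minutes, grades 2-3 (hypothetical)
        older = rng.normal(12.4, 3.5, 54)     # minutes, grades 4-5 (hypothetical)

        stat, p = mannwhitneyu(older, younger, alternative="two-sided")
        print(f"median older={np.median(older):.1f}, "
              f"median younger={np.median(younger):.1f}, p={p:.3f}")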

  11. Parent Preferences for Medical Error Disclosure: A Qualitative Study.

    PubMed

    Coffey, Maitreya; Espin, Sherry; Hahmann, Tara; Clairman, Hayyah; Lo, Lisha; Friedman, Jeremy N; Matlow, Anne

    2017-01-01

    According to disclosure guidelines, patients experiencing adverse events due to medical errors should be offered full disclosure, whereas disclosure of near misses is not traditionally expected. This may conflict with parental expectations; surveys reveal most parents expect full disclosure whether or not errors resulted in harm. Protocols regarding whether to include children in these discussions have not been established. This study explores parent preferences around disclosure and views on including children. Fifteen parents of hospitalized children participated in semistructured interviews. Three hypothetical scenarios of different severity were used to initiate discussion. Interviews were audiotaped, transcribed, and coded for emergent themes. Parents uniformly wanted disclosure if harm occurred, although fewer wanted their child informed. For nonharmful errors, most parents wanted disclosure for themselves but few for their children. With respect to including children in disclosure, parents preferred to assess their children's cognitive and emotional readiness to cope with disclosure, wishing to act as a "buffer" between the health care team and their children. Generally, as event severity decreased, they felt that the risks of informing children outweighed the benefits. Parents strongly emphasized needing reassurance of a good final outcome and anticipated difficulty managing their emotions. Parents have mixed expectations regarding disclosure. Although survey studies indicate a stronger desire for disclosure of nonharmful events than is reported for adult patients, this qualitative study revealed a greater degree of hesitation and complexity. Parents have a great need for reassurance and consistently wish to act as a buffer between the health care team and their children.

  12. Simultaneous Qualitative and Quantitative Analyses of Triterpenoids in Ilex pubescens by Ultra-High-Performance Liquid Chromatography Coupled with Quadrupole Time-of-Flight Mass Spectrometry.

    PubMed

    Cao, Di; Wang, Qing; Jin, Jing; Qiu, Maosong; Zhou, Lian; Zhou, Xinghong; Li, Hui; Zhao, Zhongxiang

    2018-03-01

    Ilex pubescens Hook et Arn mainly contains triterpenoids, which possess antithrombotic, anti-inflammatory and analgesic effects. Quantitative and qualitative analyses of the triterpenoids in I. pubescens can be useful for determining the authenticity and quality of raw materials and for guiding its clinical preparation. The objective was to establish a method for rapid and comprehensive analysis of triterpenoids in I. pubescens using ultra-high-performance liquid chromatography coupled to electrospray ionisation and quadrupole time-of-flight mass spectrometry (UPLC-ESI-QTOF-MS), and to apply it in evaluating the contents of nine triterpenoids in the root, root heartwood and root bark of I. pubescens to judge the value of the root bark and avoid wastage. UPLC-ESI-QTOF-MS data from extracts of I. pubescens acquired in negative mode were analysed using PeakView and MasterView software, which provided molecular weight, mass errors, isotope pattern fit and MS/MS fragments for the identification of triterpenoids. The quantification of the nine investigated compounds was accomplished using MultiQuant software. A total of 33 triterpenoids, five phenolic acids, two lignans and a flavonol were characterised in only 14 min. The total content of the nine compounds in the root bark was generally slightly higher than that of the root and root heartwood, which has not been reported before. The developed UPLC-ESI-QTOF-MS method proved rapid and comprehensive for simultaneous qualitative and quantitative analyses of the characteristic triterpenoids in I. pubescens. The results may provide a basis for holistic quality control and metabolic studies of I. pubescens, as well as serve as a reference for the analysis of other Ilex plants.

  13. The importance of including local correlation times in the calculation of inter-proton distances from NMR measurements: ignoring local correlation times leads to significant errors in the conformational analysis of the Glc alpha1-2Glc alpha linkage by NMR spectroscopy.

    PubMed

    Mackeen, Mukram; Almond, Andrew; Cumpstey, Ian; Enis, Seth C; Kupce, Eriks; Butters, Terry D; Fairbanks, Antony J; Dwek, Raymond A; Wormald, Mark R

    2006-06-07

    The experimental determination of oligosaccharide conformations has traditionally used cross-linkage 1H-1H NOE/ROEs. As relatively few NOEs are observed, to provide sufficient conformational constraints this method relies on: accurate quantification of NOE intensities (positive constraints); analysis of absent NOEs (negative constraints); and hence calculation of inter-proton distances using the two-spin approximation. We have compared the results obtained by using 1H 2D NOESY, ROESY and T-ROESY experiments at 500 and 700 MHz to determine the conformation of the terminal Glc alpha1-2Glc alpha linkage in a dodecasaccharide and a related tetrasaccharide. For the tetrasaccharide, the NOESY and ROESY spectra produced the same qualitative pattern of linkage cross-peaks but the quantitative pattern, the relative peak intensities, was different. For the dodecasaccharide, the NOESY and ROESY spectra at 500 MHz produced a different qualitative pattern of linkage cross-peaks, with fewer peaks in the NOESY spectrum. At 700 MHz, the NOESY and ROESY spectra of the dodecasaccharide produced the same qualitative pattern of peaks, but again the relative peak intensities were different. These differences are due to very significant differences in the local correlation times for different proton pairs across this glycosidic linkage. The local correlation time for each proton pair was measured using the ratio of the NOESY and T-ROESY cross-relaxation rates, leaving the NOESY and ROESY as independent data sets for calculating the inter-proton distances. The inter-proton distances calculated including the effects of differences in local correlation times give much more consistent results.

  14. The Influence of Observation Errors on Analysis Error and Forecast Skill Investigated with an Observing System Simulation Experiment

    NASA Technical Reports Server (NTRS)

    Prive, N. C.; Errico, R. M.; Tai, K.-S.

    2013-01-01

    The Global Modeling and Assimilation Office (GMAO) observing system simulation experiment (OSSE) framework is used to explore the response of analysis error and forecast skill to observation quality. In an OSSE, synthetic observations may be created that have much smaller error than real observations, and precisely quantified error may be applied to these synthetic observations. Three experiments are performed in which synthetic observations, with magnitudes of applied observation error varying from zero to twice the estimated realistic error, are ingested into the Goddard Earth Observing System Model (GEOS-5) with Gridpoint Statistical Interpolation (GSI) data assimilation for a one-month period representing July. The analysis increment and observation innovation are strongly impacted by observation error, with much larger variances for increased observation error. The analysis quality is degraded by increased observation error, but the change in root-mean-square error of the analysis state is small relative to the total analysis error. Surprisingly, in the 120-hour forecast, increased observation error yields only a slight decline in forecast skill in the extratropics, and no discernible degradation of forecast skill in the tropics.

  15. Learning a visuomotor rotation: simultaneous visual and proprioceptive information is crucial for visuomotor remapping.

    PubMed

    Shabbott, Britne A; Sainburg, Robert L

    2010-05-01

    Visuomotor adaptation is mediated by errors between intended and sensory-detected arm positions. However, it is not clear whether visual-based errors that are shown during the course of motion lead to qualitatively different or more efficient adaptation than errors shown after movement. For instance, continuous visual feedback mediates online error corrections, which may facilitate or inhibit the adaptation process. We addressed this question by manipulating the timing of visual error information and task instructions during a visuomotor adaptation task. Subjects were exposed to a visuomotor rotation, during which they received continuous visual feedback (CF) of hand position with instructions to correct or not correct online errors, or knowledge-of-results (KR), provided as a static hand-path at the end of each trial. Our results showed that all groups improved performance with practice, and that online error corrections were inconsequential to the adaptation process. However, in contrast to the CF groups, the KR group showed relatively small reductions in mean error with practice, increased inter-trial variability during rotation exposure, and more limited generalization across target distances and workspace. Further, although the KR group showed improved performance with practice, after-effects were minimal when the rotation was removed. These findings suggest that simultaneous visual and proprioceptive information is critical in altering neural representations of visuomotor maps, although delayed error information may elicit compensatory strategies to offset perturbations.

  16. Conducting Qualitative Data Analysis: Managing Dynamic Tensions within

    ERIC Educational Resources Information Center

    Chenail, Ronald J.

    2012-01-01

    In the third of a series of "how-to" essays on conducting qualitative data analysis, Ron Chenail examines the dynamic tensions within the process of qualitative data analysis that qualitative researchers must manage in order to produce credible and creative results. These tensions include (a) the qualities of the data and the qualitative data…

  17. Qualitative Secondary Analysis: A Case Exemplar.

    PubMed

    Tate, Judith Ann; Happ, Mary Beth

    Qualitative secondary analysis (QSA) is the use of qualitative data that was collected by someone else or was collected to answer a different research question. Secondary analysis of qualitative data provides an opportunity to maximize data utility, particularly with difficult-to-reach patient populations. However, qualitative secondary analysis methods require careful consideration and explicit description to best understand, contextualize, and evaluate the research results. In this article, we describe methodologic considerations using a case exemplar to illustrate challenges specific to qualitative secondary analysis and strategies to overcome them.

  18. Dynamic Time Warping compared to established methods for validation of musculoskeletal models.

    PubMed

    Gaspar, Martin; Welke, Bastian; Seehaus, Frank; Hurschler, Christof; Schwarze, Michael

    2017-04-11

    By means of multi-body musculoskeletal simulation, important variables such as internal joint forces and moments can be estimated that cannot be measured directly. Validation can proceed by qualitative or by quantitative methods. Especially when comparing time-dependent signals, many methods do not perform well, and validation is often limited to qualitative approaches. The aim of the present study was to investigate the capabilities of the Dynamic Time Warping (DTW) algorithm for comparing time series, which can quantify phase as well as amplitude errors. We contrast the sensitivity of DTW with other established metrics: the Pearson correlation coefficient, cross-correlation, the metric according to Geers, RMSE and normalized RMSE (nRMSE). This study is based on two data sets, one representing direct validation and the other indirect validation. Direct validation was performed in the context of clinical gait analysis on trans-femoral amputees fitted with a six-component force-moment sensor; measured forces and moments from the amputees' socket prostheses were compared to simulated forces and moments. Indirect validation was performed in the context of surface EMG measurements on a cohort of healthy subjects, with measurements taken of seven muscles of the leg and compared to simulated muscle activations. For direct validation, a positive linear relation was seen between DTW and both RMSE and nRMSE; for indirect validation, a negative linear relation was seen between DTW and both the Pearson correlation and cross-correlation. We propose the DTW algorithm for use in both direct and indirect quantitative validation, as it correlates well with the methods that are most suitable for each task. However, in direct validation it should be used together with methods yielding a dimensional error value, so that results can be interpreted more comprehensibly.
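
    A minimal dynamic programming implementation of DTW shows why it tolerates the phase shifts that inflate point-wise metrics such as RMSE (illustrative code, not the authors'):

        # Dynamic Time Warping: DP over all monotone alignments of two signals.
        import numpy as np

        def dtw_distance(a, b):
            n, m = len(a), len(b)
            D = np.full((n + 1, m + 1), np.inf)
            D[0, 0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    cost = abs(a[i - 1] - b[j - 1])
                    D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
            return D[n, m]

        t = np.linspace(0, 2 * np.pi, 100)
        measured = np.sin(t)
        simulated = np.sin(t - 0.3)               # same shape, phase-shifted
        print(dtw_distance(measured, simulated))  # small despite the phase shift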

  19. Iranian family caregivers' challenges and issues in caring of multiple sclerosis patients: A descriptive explorative qualitative study.

    PubMed

    Masoudi, Reza; Abedi, Heidar Ali; Abedi, Parvin; Mohammadianinejad, Seyed Ehsan

    2014-07-01

    The broad spectrum of problems caused by multiple sclerosis (MS) imposes a heavy responsibility on caregivers in caring for their patients; they therefore encounter many issues and challenges. The purpose of this study was to explore the experiences and challenges of MS family caregivers. A qualitative design, based on a thematic analysis approach, was used to reach the study aim. Data were collected and analyzed concurrently through in-depth unstructured interviews, field notes, and observations with 23 participants (14 family caregivers and 9 MS patients) at two referral centers in Ahvaz, Iran. Three major themes were extracted from the analysis of the transcripts: "emotional exhaustion of caregivers," "uncertain atmosphere of caring," and "insularity care." The first theme consisted of three subthemes: "stressful atmosphere of caring," "conflict and animism," and "continuing distress affecting the caregiver." The second theme consisted of three subthemes: "unstable and complicacy of disease," "caring with trial and error," and "frequent hospitalization of patients." The third theme consisted of two subthemes: "caring gap and disintegration" and "lack of sufficient support." This study will be useful to the healthcare system for managing the challenges of MS patients' family caregivers. Improving the conditions and performance of family caregivers is crucial in order to provide high-quality care to people with MS.

  20. Theory of injection locking and rapid start-up of magnetrons, and effects of manufacturing errors in terahertz traveling wave tubes

    NASA Astrophysics Data System (ADS)

    Pengvanich, Phongphaeth

    In this thesis, several contemporary issues in coherent radiation sources are examined. They include the fast startup and the injection locking of microwave magnetrons, and the effects of random manufacturing errors on the phase and small-signal gain of terahertz traveling wave amplifiers. In response to the rapid startup and low noise magnetron experiments performed at the University of Michigan, which employed periodic azimuthal perturbations in the axial magnetic field, a systematic study of single-particle orbits is performed for a crossed electric and periodic magnetic field. A parametric instability in the orbits, which brings a fraction of the electrons from the cathode toward the anode, is discovered. This offers an explanation of the rapid startup observed in the experiments. A phase-locking model has been constructed from circuit theory to qualitatively explain the various regimes observed in kilowatt magnetron injection-locking experiments performed at the University of Michigan. These experiments utilize two continuous-wave magnetrons, one functioning as an oscillator and the other as a driver. Time- and frequency-domain solutions are developed from the model, allowing investigations into the growth, saturation, and frequency response of the output. The model qualitatively recovers many of the phase-locking frequency characteristics observed in the experiments. Effects of frequency chirp and frequency perturbation on the phase and lockability have also been quantified. Development of traveling wave amplifiers operating at terahertz frequencies is a subject of current interest. The small circuit size has prompted a statistical analysis of the effects of random fabrication errors on the phase and small-signal gain of these amplifiers. The small-signal theory is treated with a continuum model in which the electron beam is monoenergetic. Circuit perturbations that vary randomly along the beam axis are introduced through the dimensionless Pierce parameters describing the beam-wave velocity mismatch (b), the gain parameter (C), and the cold tube circuit loss (d). Our study shows that perturbation in b dominates the other two in terms of power gain and phase shift. Extensive data show that the standard deviation of the output phase is linearly proportional to the standard deviation of the individual perturbations in b, C, and d.

  1. Contribution of stimulus attributes to errors in duration and distance judgments--a developmental study.

    PubMed

    Matsuda, F; Lan, W C; Tanimura, R

    1999-02-01

    In Matsuda's 1996 study, 4- to 11-yr.-old children (N = 133) watched two cars running on two parallel tracks on a CRT display and judged whether their durations and distances were equal and, if not, which was larger. In the present paper, the relative contributions of the four critical stimulus attributes (whether temporal starting points, temporal stopping points, spatial starting points, and spatial stopping points were the same or different between two cars) to the production of errors were quantitatively estimated based on the data for rates of errors obtained by Matsuda. The present analyses made it possible not only to understand numerically the findings about qualitative characteristics of the critical attributes described by Matsuda, but also to add more detailed findings about them.

  2. Crop/weed discrimination using near-infrared reflectance spectroscopy (NIRS)

    NASA Astrophysics Data System (ADS)

    Zhang, Yun; He, Yong

    2006-09-01

    Traditional uniform herbicide application often leaves excessive chemical residues on soil, crop plants, and agricultural produce, imperiling the environment and food safety. Near-infrared reflectance spectroscopy (NIRS) offers a promising means for weed detection and site-specific herbicide application. In the laboratory, a total of 90 samples (30 per species) of detached leaves of two weeds, threeseeded mercury (Acalypha australis L.) and fourleafed duckweed (Marsilea quadrifolia L.), and one crop, soybean (Glycine max), were investigated by NIRS over 325-1075 nm using a field spectroradiometer. Twenty pretreated absorbance samples of each species were exported, and the missing Y variables were assigned independent values for partial least squares (PLS) analysis. In the combined principal component analysis (PCA) over 400-1000 nm, PC1 and PC2 together explained over 91% of the total variance and detected the three plant species with 98.3% accuracy. The full cross-validation results of PLS, i.e., a standard error of prediction (SEP) of 0.247, a correlation coefficient (r) of 0.954, and a root mean square error of prediction (RMSEP) of 0.245, indicated an optimal model for weed identification. Predicting the remaining 10 samples of each species with the PLS model gave a 100% crop/weed detection rate. It can therefore be concluded that PLS is a viable approach for qualitative weed discrimination by NIRS.
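
    As a rough illustration of the workflow described above (PCA inspection followed by PLS regression with assigned class codes as the Y variables), the sketch below uses scikit-learn; the spectra and labels are placeholders, not the study's data:

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.decomposition import PCA

        # X: (n_samples, n_wavelengths) pretreated absorbance spectra;
        # y: integer class codes used as the assigned Y variables
        X = np.random.rand(60, 300)        # placeholder spectra, 3 x 20 samples
        y = np.repeat([1, 2, 3], 20)       # soybean and the two weed species

        pca = PCA(n_components=2).fit(X)
        print("variance explained by PC1+PC2:", pca.explained_variance_ratio_.sum())

        pls = PLSRegression(n_components=5).fit(X, y)
        y_pred = np.clip(np.rint(pls.predict(X).ravel()), 1, 3).astype(int)
        print("resubstitution accuracy (placeholder data):", (y_pred == y).mean())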

  3. Four-dimensional data coupled to alternating weighted residue constraint quadrilinear decomposition model applied to environmental analysis: Determination of polycyclic aromatic hydrocarbons

    NASA Astrophysics Data System (ADS)

    Liu, Tingting; Zhang, Ling; Wang, Shutao; Cui, Yaoyao; Wang, Yutian; Liu, Lingfei; Yang, Zhe

    2018-03-01

    Qualitative and quantitative analysis of polycyclic aromatic hydrocarbons (PAHs) was carried out by three-dimensional fluorescence spectroscopy combined with Alternating Weighted Residue Constraint Quadrilinear Decomposition (AWRCQLD). The experimental subjects were acenaphthene (ANA) and naphthalene (NAP). First, to reduce the redundant information in the three-dimensional fluorescence spectral data, a wavelet transform was used to compress the data during preprocessing. Then, four-dimensional data were constructed from the excitation-emission fluorescence spectra of PAHs at different concentrations. The sample data were obtained in three solvents: methanol, ethanol, and ultra-pure water. The four-dimensional spectral data were analyzed by AWRCQLD, and the recovery rates of the PAHs in the three solvents were obtained and compared. The results showed, on one hand, that PAHs can be measured more accurately from the higher-order data, with a higher recovery rate; on the other hand, AWRCQLD reflects the superiority of the four-dimensional algorithm better than second-order calibration and other third-order calibration algorithms. The recovery rate of ANA was 96.5%-103.3% with a root mean square error of prediction of 0.04 μg/L; the recovery rate of NAP was 96.7%-115.7% with a root mean square error of prediction of 0.06 μg/L.
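
    The wavelet-compression preprocessing step can be sketched as follows; this is an illustration assuming the PyWavelets package, and the wavelet family and decomposition level actually used in the study are not specified in the abstract:

        import numpy as np
        import pywt

        # one excitation-emission matrix (EEM): rows = excitation, cols = emission
        eem = np.random.rand(64, 128)      # placeholder spectrum

        # single-level discrete wavelet transform of each emission spectrum,
        # keeping only the approximation coefficients (about half the points)
        compressed = np.array([pywt.dwt(row, 'db4')[0] for row in eem])
        print(eem.shape, "->", compressed.shape)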

  4. On the error propagation of semi-Lagrange and Fourier methods for advection problems

    PubMed Central

    Einkemmer, Lukas; Ostermann, Alexander

    2015-01-01

    In this paper we study the error propagation of numerical schemes for the advection equation in the case where high precision is desired. The numerical methods considered are based on the fast Fourier transform, polynomial interpolation (semi-Lagrangian methods using a Lagrange or spline interpolation), and a discontinuous Galerkin semi-Lagrangian approach (which is conservative and has to store more than a single value per cell). We demonstrate, by carrying out numerical experiments, that the worst case error estimates given in the literature provide a good explanation for the error propagation of the interpolation-based semi-Lagrangian methods. For the discontinuous Galerkin semi-Lagrangian method, however, we find that the characteristic property of semi-Lagrangian error estimates (namely the fact that the error increases proportionally to the number of time steps) is not observed. We provide an explanation for this behavior and conduct numerical simulations that corroborate the different qualitative features of the error in the two respective types of semi-Lagrangian methods. The method based on the fast Fourier transform is exact but, due to round-off errors, susceptible to a linear increase of the error in the number of time steps. We show how to modify the Cooley–Tukey algorithm in order to obtain an error growth that is proportional to the square root of the number of time steps. Finally, we show, for a simple model, that our conclusions hold true if the advection solver is used as part of a splitting scheme. PMID:25844018
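
    The round-off-driven error growth of the Fourier approach can be reproduced with a toy experiment: advect a periodic signal by many small spectral shifts and compare against a single equivalent shift. A minimal sketch (the parameters are arbitrary, not taken from the paper):

        import numpy as np

        # advect u(x) with speed a on a periodic grid via Fourier phase shifts;
        # n small shifts should equal one shift of n*a*dt, up to round-off
        N, a, dt, steps = 256, 1.0, 1e-3, 10000
        x = np.linspace(0, 2 * np.pi, N, endpoint=False)
        k = np.fft.fftfreq(N, d=2 * np.pi / N) * 2 * np.pi   # integer wavenumbers
        u_hat = np.fft.fft(np.sin(x))

        stepped = u_hat.copy()
        for _ in range(steps):                 # round-off accumulates here
            stepped = stepped * np.exp(-1j * k * a * dt)
        single = u_hat * np.exp(-1j * k * a * dt * steps)

        err = np.abs(np.fft.ifft(stepped) - np.fft.ifft(single)).max()
        print("accumulated round-off error:", err)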

  5. Tests for qualitative treatment-by-centre interaction using a 'pushback' procedure.

    PubMed

    Ciminera, J L; Heyse, J F; Nguyen, H H; Tukey, J W

    1993-06-15

    In multicentre clinical trials using a common protocol, the centres are usually regarded as being a fixed factor, thus allowing any treatment-by-centre interaction to be omitted from the error term for the effect of treatment. However, we feel it necessary to use the treatment-by-centre interaction as the error term if there is substantial evidence that the interaction with centres is qualitative instead of quantitative. To make allowance for the estimated uncertainties of the centre means, we propose choosing a reference value (for example, the median of the ordered array of centre means) and converting the individual centre results into standardized deviations from the reference value. The deviations are then reordered, and the results 'pushed back' by amounts appropriate for the corresponding order statistics in a sample from the relevant distribution. The pushed-back standardized deviations are then restored to the original scale. The appearance of opposite signs among the destandardized values for the various centres is then taken as 'substantial evidence' of qualitative interaction. Procedures are presented using, in any combination: (i) Gaussian, or Student's t-distribution; (ii) order-statistic medians or outward 90 per cent points of the corresponding order statistic distributions; (iii) pooling or grouping and pooling the internally estimated standard deviations of the centre means. The use of the least conservative combination--Student's t, outward 90 per cent points, grouping and pooling--is recommended.
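
    A loose sketch of the Gaussian variant of this procedure is given below. It uses Filliben's order-statistic medians and a simple do-not-cross-zero truncation, which may differ in detail from the published pushback rules; the input values are hypothetical:

        import numpy as np
        from scipy import stats

        def pushback_gaussian(means, ses):
            """Pushed-back centre effects (Gaussian variant, sketch only)."""
            ref = np.median(means)                    # reference value
            z = (means - ref) / ses                   # standardized deviations
            n = len(z)
            order = np.argsort(z)
            # order-statistic medians of a standard normal sample (Filliben)
            p = (np.arange(1, n + 1) - 0.3175) / (n + 0.365)
            p[0], p[-1] = 1 - 0.5 ** (1 / n), 0.5 ** (1 / n)
            m = stats.norm.ppf(p)
            zs = z[order]
            pb = zs - m                               # push back toward zero
            pb[np.sign(pb) != np.sign(zs)] = 0.0      # never push past zero
            out = np.empty(n)
            out[order] = pb
            return ref + out * ses                    # destandardize

        vals = pushback_gaussian(np.array([1.2, 0.8, -0.4, 1.5, 0.9]),
                                 np.array([0.5, 0.4, 0.6, 0.5, 0.45]))
        print(vals, "opposite signs?", vals.min() < 0 < vals.max())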

  6. Qualitative methods: what are they and why use them?

    PubMed Central

    Sofaer, S

    1999-01-01

    OBJECTIVE: To provide an overview of reasons why qualitative methods have been used and can be used in health services and health policy research, to describe a range of specific methods, and to give examples of their application. DATA SOURCES: Classic and contemporary descriptions of the underpinnings and applications of qualitative research methods and studies that have used such methods to examine important health services and health policy issues. PRINCIPAL FINDINGS: Qualitative research methods are valuable in providing rich descriptions of complex phenomena; tracking unique or unexpected events; illuminating the experience and interpretation of events by actors with widely differing stakes and roles; giving voice to those whose views are rarely heard; conducting initial explorations to develop theories and to generate and even test hypotheses; and moving toward explanations. Qualitative and quantitative methods can be complementary, used in sequence or in tandem. The best qualitative research is systematic and rigorous, and it seeks to reduce bias and error and to identify evidence that disconfirms initial or emergent hypotheses. CONCLUSIONS: Qualitative methods have much to contribute to health services and health policy research, especially as such research deals with rapid change and develops a more fully integrated theory base and research agenda. However, the field must build on the best traditions and techniques of qualitative methods and must recognize that special training and experience are essential to the application of these methods. PMID:10591275

  7. Development of a scale of executive functioning for the RBANS.

    PubMed

    Spencer, Robert J; Kitchen Andren, Katherine A; Tolle, Kathryn A

    2018-01-01

    The Repeatable Battery for the Assessment of Neuropsychological Status (RBANS) is a cognitive battery that contains scales of several cognitive abilities, but no scale in the instrument is exclusively dedicated to executive functioning. Although the subtests allow for observation of executive-type errors, each error has a fairly low base rate, and healthy and clinical normative data on the frequency of these types of errors are lacking, making their significance difficult to interpret in isolation. The aim of this project was to create an RBANS executive errors scale (RBANS EE) with items comprising qualitatively dysexecutive errors committed throughout the test. Participants included Veterans referred for outpatient neuropsychological testing. Items were initially selected based on the theoretical literature and were retained based on item-total correlations. The RBANS EE (a percentage calculated by dividing the number of dysexecutive errors by the total number of responses) was moderately related to each of seven established measures of executive functioning and was strongly predictive of a dichotomous classification of executive impairment. Thus, the scale had solid concurrent validity, justifying its use as a supplementary scale. The RBANS EE requires no additional administration time and can provide a quantified measure of otherwise unmeasured aspects of executive functioning.

  8. Conducting Qualitative Data Analysis: Qualitative Data Analysis as a Metaphoric Process

    ERIC Educational Resources Information Center

    Chenail, Ronald J.

    2012-01-01

    In the second of a series of "how-to" essays on conducting qualitative data analysis, Ron Chenail argues the process can best be understood as a metaphoric process. From this orientation he suggests researchers follow Kenneth Burke's notion of metaphor and see qualitative data analysis as the analyst systematically considering the "this-ness" of…

  9. Conducting qualitative research in audiology: a tutorial.

    PubMed

    Knudsen, Line V; Laplante-Lévesque, Ariane; Jones, Lesley; Preminger, Jill E; Nielsen, Claus; Lunner, Thomas; Hickson, Louise; Naylor, Graham; Kramer, Sophia E

    2012-02-01

    Qualitative research methodologies are being used more frequently in audiology, as they allow for a better understanding of the perspectives of people with hearing impairment. This article describes why and how international interdisciplinary qualitative research can be conducted. The paper is based on a literature review and our recent experience conducting an international interdisciplinary qualitative study in audiology. We describe available qualitative methods for sampling, data collection, and analysis, and we discuss the rationale for choosing particular methods. The focus is on four approaches that have all previously been applied to audiologic research: grounded theory, interpretative phenomenological analysis, conversation analysis, and qualitative content analysis. This article provides a review of methodological issues useful for those designing qualitative research projects in audiology or needing assistance in the interpretation of qualitative literature.

  10. The effects of error augmentation on learning to walk on a narrow balance beam.

    PubMed

    Domingo, Antoinette; Ferris, Daniel P

    2010-10-01

    Error augmentation during training has been proposed as a means to facilitate motor learning due to the human nervous system's reliance on performance errors to shape motor commands. We studied the effects of error augmentation on short-term learning of walking on a balance beam to determine whether it had beneficial effects on motor performance. Four groups of able-bodied subjects walked on a treadmill-mounted balance beam (2.5-cm wide) before and after 30 min of training. During training, two groups walked on the beam with a destabilization device that augmented error (Medium and High Destabilization groups). A third group walked on a narrower beam (1.27-cm) to augment error (Narrow). The fourth group practiced walking on the 2.5-cm balance beam (Wide). Subjects in the Wide group had significantly greater improvements after training than the error augmentation groups. The High Destabilization group had significantly less performance gains than the Narrow group in spite of similar failures per minute during training. In a follow-up experiment, a fifth group of subjects (Assisted) practiced with a device that greatly reduced catastrophic errors (i.e., stepping off the beam) but maintained similar pelvic movement variability. Performance gains were significantly greater in the Wide group than the Assisted group, indicating that catastrophic errors were important for short-term learning. We conclude that increasing errors during practice via destabilization and a narrower balance beam did not improve short-term learning of beam walking. In addition, the presence of qualitatively catastrophic errors seems to improve short-term learning of walking balance.

  11. Error Propagation Analysis in the SAE Architecture Analysis and Design Language (AADL) and the EDICT Tool Framework

    NASA Technical Reports Server (NTRS)

    LaValley, Brian W.; Little, Phillip D.; Walter, Chris J.

    2011-01-01

    This report documents the capabilities of the EDICT tools for error modeling and error propagation analysis when operating with models defined in the Architecture Analysis & Design Language (AADL). We discuss our experience using the EDICT error analysis capabilities on a model of the Scalable Processor-Independent Design for Enhanced Reliability (SPIDER) architecture that uses the Reliable Optical Bus (ROBUS). Based on these experiences we draw some initial conclusions about model based design techniques for error modeling and analysis of highly reliable computing architectures.

  12. Muscle strength and qualitative jump-landing differences in male and female military cadets: the JUMP-ACL study.

    PubMed

    Beutler, Ai; de la Motte, Sj; Marshall, Sw; Padua, DA; Boden, Bp

    2009-01-01

    Recent studies have focused on gender differences in movement patterns as risk factors for ACL injury. Understanding intrinsic and extrinsic factors which contribute to movement patterns is critical to ACL injury prevention efforts. Isometric lower-extremity muscular strength, anthropometrics, and jump-landing technique were analyzed for 2,753 cadets (1,046 female, 1,707 male) from the U.S. Air Force, Military and Naval Academies. Jump-landings were evaluated using the Landing Error Scoring System (LESS), a valid qualitative movement screening tool. We hypothesized that distinct anthropometric factors (Q-angle, navicular drop, bodyweight) and muscle strength would predict poor jump-landing technique in males versus females, and that female cadets would have higher scores (more errors) on a qualitative movement screen (LESS) than males. Mean LESS scores were significantly higher in female (5.34 ± 1.51) versus male (4.65 ± 1.69) cadets (P<.001). Qualitative movement scores were analyzed using factor analyses, yielding five factors, or "patterns", contributing to poor landing technique. Females were significantly more likely to have poor technique due to landing with less hip and knee flexion at initial contact (P<.001), more knee valgus with wider landing stance (P<.001), and less flexion displacement over the entire landing (P<.001). Males were more likely to have poor technique due to landing toe-out (P<.001), with heels first, and with an asymmetric foot landing (P<.001). Many of the identified factor patterns have been previously proposed to contribute to ACL injury risk. However, univariate and multivariate analyses of muscular strength and anthropometric factors did not strongly predict LESS scores for either gender, suggesting that changing an athlete's alignment, BMI, or muscle strength may not directly improve his or her movement patterns.

  13. Pediatric Nurses' Perceptions of Medication Safety and Medication Error: A Mixed Methods Study.

    PubMed

    Alomari, Albara; Wilson, Val; Solman, Annette; Bajorek, Beata; Tinsley, Patricia

    2018-06-01

    This study aims to outline the current workplace culture of medication practice in a pediatric medical ward. The objective is to explore the perceptions of nurses in a pediatric clinical setting as to why medication administration errors occur. As nurses have a central role in the medication process, it is essential to explore their perceptions of the factors influencing that process; without this understanding, it is difficult to develop effective prevention strategies aimed at reducing medication administration errors. Previous studies were limited to exploring single, specific aspects of medication safety, and their methods were limited to survey designs, which may yield incomplete or inadequate information. This study is phase 1 of an action research project. Data collection included direct observation of nurses during medication preparation and administration, an audit based on the medication policy and guidelines, and focus groups with nursing staff. A thematic analysis was undertaken by each author independently to analyze the observation notes and focus group transcripts, and simple descriptive statistics were used to analyze the audit data. The study was conducted in a specialized pediatric medical ward. Four key themes were identified from the combined quantitative and qualitative data: (1) understanding medication errors, (2) the busy-ness of nurses, (3) the physical environment, and (4) compliance with medication policy and practice guidelines. Workload, frequent interruptions to process, poor physical environment design, lack of preparation space, and impractical medication policies were identified as barriers to safe medication practice. Overcoming these barriers requires organizations to review medication process policies and to engage nurses more in medication safety research and in designing clinical guidelines for their own practice.

  14. Use of ATR-FTIR spectroscopy coupled with chemometrics for the authentication of avocado oil in ternary mixtures with sunflower and soybean oils.

    PubMed

    Jiménez-Sotelo, Paola; Hernández-Martínez, Maylet; Osorio-Revilla, Guillermo; Meza-Márquez, Ofelia Gabriela; García-Ochoa, Felipe; Gallardo-Velázquez, Tzayhrí

    2016-07-01

    Avocado oil is a high-value nutraceutical oil whose authentication is very important, since the addition of low-cost oils could lower its beneficial properties. Mid-FTIR spectroscopy combined with chemometrics was used to detect and quantify adulteration of avocado oil with sunflower and soybean oils in ternary mixtures. Thirty-seven laboratory-prepared adulterated samples and 20 pure avocado oil samples were evaluated. The adulterant content ranged from 2% to 50% (w/w) in avocado oil. A soft independent modelling of class analogy (SIMCA) model was developed to discriminate between pure and adulterated samples. The model showed recognition and rejection rates of 100% and proper classification in external validation. A partial least squares (PLS) algorithm was used to estimate the percentage of adulteration. The PLS model showed values of R² > 0.9961, standard errors of calibration (SEC) in the range 0.3963-0.7881, standard errors of prediction (SEP, estimated) between 0.6483 and 0.9707, and good prediction performance in external validation. The results showed that mid-FTIR spectroscopy can be an accurate and reliable technique for qualitative and quantitative analysis of avocado oil in ternary mixtures.
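
    The SIMCA classification step can be approximated by fitting a separate PCA model per class and assigning a sample to the class with the smaller reconstruction residual. A minimal sketch with synthetic stand-in spectra (not the study's data or decision thresholds):

        import numpy as np
        from sklearn.decomposition import PCA

        def simca_residual(X_class, x_new, n_components=3):
            """Distance of x_new to one class's PCA model (SIMCA building block)."""
            pca = PCA(n_components=n_components).fit(X_class)
            recon = pca.inverse_transform(pca.transform(x_new[None, :]))[0]
            return np.linalg.norm(x_new - recon)

        rng = np.random.default_rng(0)
        pure = rng.normal(0.0, 1.0, (20, 100))          # stand-in spectra
        adulterated = rng.normal(0.5, 1.0, (20, 100))
        sample = adulterated[0]

        d_pure = simca_residual(pure, sample)
        d_adult = simca_residual(adulterated[1:], sample)
        print("classified as:", "pure" if d_pure < d_adult else "adulterated")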

  15. Deviations from Vegard's law in semiconductor thin films measured with X-ray diffraction and Rutherford backscattering: The Ge1-ySny and Ge1-xSix cases

    NASA Astrophysics Data System (ADS)

    Xu, Chi; Senaratne, Charutha L.; Culbertson, Robert J.; Kouvetakis, John; Menéndez, José

    2017-09-01

    The compositional dependence of the lattice parameter in Ge1-ySny alloys has been determined from combined X-ray diffraction and Rutherford backscattering (RBS) measurements of a large set of epitaxial films with compositions in the 0 < y < 0.14 range. In view of contradictory prior results, a critical analysis of this method has been carried out, with emphasis on nonlinear elasticity corrections and systematic errors in popular RBS simulation codes. The approach followed is validated by showing that measurements of Ge1-xSix films yield a bowing parameter θGeSi = -0.0253(30) Å, in excellent agreement with the classic work by Dismukes. When the same methodology is applied to Ge1-ySny alloy films, it is found that the bowing parameter θGeSn is zero within experimental error, so that the system follows Vegard's law. This is in qualitative agreement with ab initio theory, but the value of the experimental bowing parameter is significantly smaller than the theoretical prediction. Possible reasons for this discrepancy are discussed in detail.
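
    The bowing-parameter analysis amounts to fitting a(y) = (1 - y)·a_Ge + y·a_Sn + θ·y(1 - y) to measured lattice parameters. A minimal least-squares sketch; the endpoint lattice constants are standard literature values, while the compositions and noise level are illustrative:

        import numpy as np

        # model: a(y) = (1 - y) * a_Ge + y * a_Sn + theta * y * (1 - y)
        a_Ge, a_Sn = 5.6579, 6.4892                   # endpoint lattice constants, A
        y = np.array([0.02, 0.05, 0.08, 0.11, 0.14])  # hypothetical compositions
        a_meas = (1 - y) * a_Ge + y * a_Sn            # Vegard's law ...
        a_meas = a_meas + np.random.normal(0, 5e-4, y.size)   # ... plus noise

        # one-parameter linear least squares for the bowing term theta
        w = y * (1 - y)
        resid = a_meas - ((1 - y) * a_Ge + y * a_Sn)
        theta = np.sum(resid * w) / np.sum(w ** 2)
        print("bowing parameter theta =", theta, "A (near 0 => Vegard's law)")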

  16. Expert Intraoperative Judgment and Decision-Making: Defining the Cognitive Competencies for Safe Laparoscopic Cholecystectomy.

    PubMed

    Madani, Amin; Watanabe, Yusuke; Feldman, Liane S; Vassiliou, Melina C; Barkun, Jeffrey S; Fried, Gerald M; Aggarwal, Rajesh

    2015-11-01

    Bile duct injuries from laparoscopic cholecystectomy remain a significant source of morbidity and are often the result of intraoperative errors in perception, judgment, and decision-making. This qualitative study aimed to define and characterize higher-order cognitive competencies required to safely perform a laparoscopic cholecystectomy. Hierarchical and cognitive task analyses for establishing a critical view of safety during laparoscopic cholecystectomy were performed using qualitative methods to map the thoughts and practices that characterize expert performance. Experts with more than 5 years of experience, and who have performed at least 100 laparoscopic cholecystectomies, participated in semi-structured interviews and field observations. Verbal data were transcribed verbatim, supplemented with content from published literature, coded, thematically analyzed using grounded-theory by 2 independent reviewers, and synthesized into a list of items. A conceptual framework was created based on 10 interviews with experts, 9 procedures, and 18 literary sources. Experts included 6 minimally invasive surgeons, 2 hepato-pancreatico-biliary surgeons, and 2 acute care general surgeons (median years in practice, 11 [range 8 to 14]). One hundred eight cognitive elements (35 [32%] related to situation awareness, 47 [44%] involving decision-making, and 26 [24%] action-oriented subtasks) and 75 potential errors were identified and categorized into 6 general themes and 14 procedural tasks. Of the 75 potential errors, root causes were mapped to errors in situation awareness (24 [32%]), decision-making (49 [65%]), or either one (61 [81%]). This study defines the competencies that are essential to establishing a critical view of safety and avoiding bile duct injuries during laparoscopic cholecystectomy. This framework may serve as the basis for instructional design, assessment tools, and quality-control metrics to prevent injuries and promote a culture of patient safety. Copyright © 2015 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  17. "Do You Know What You're Doing?" College Students' Experiences with Male Condoms

    ERIC Educational Resources Information Center

    Yarber, William L.; Graham, Cynthia A.; Sanders, Stephanie A.; Crosby, Richard A.; Butler, Scott M.; Hartzell, Rose M.

    2007-01-01

    Background: Although quantitative assessment of male condom use errors and problems has received increased research attention, few studies have qualitatively examined this sexual health behavior. Purpose: This study examined problems of male condom use as experienced by college men and women at a large, public Midwestern university. Methods:…

  18. Physician Interaction with Electronic Medical Records: A Qualitative Study

    ERIC Educational Resources Information Center

    Noteboom, Cherie Bakker

    2010-01-01

    The integration of EHR (Electronic Health Records) in IT infrastructures supporting organizations enable improved access to and recording of patient data, enhanced ability to make better and more-timely decisions, and improved quality and reduced errors. Despite these benefits, there are mixed results as to the use of EHR. The literature suggests…

  19. Impact of specific language impairment and type of school on different language subsystems.

    PubMed

    Puglisi, Marina Leite; Befi-Lopes, Debora Maria

    2016-01-01

    This study aimed to explore quantitative and qualitative effects of type of school and specific language impairment (SLI) on different language abilities. A total of 204 Brazilian children aged 4 to 6 years participated in the study. Children were selected to form three groups: (1) 63 typically developing children studying in private schools (TDPri); (2) 102 typically developing children studying in state schools (TDSta); and (3) 39 children with SLI studying in state schools (SLISta). All individuals were assessed regarding expressive vocabulary, number morphology, and morphosyntactic comprehension. All language subsystems were vulnerable to both environmental (type of school) and biological (SLI) effects. The relationship between the three language measures was exactly the same for all groups: vocabulary growth correlated with age and with the development of morphological abilities and morphosyntactic comprehension. Children with SLI showed atypical errors in the comprehension test at the age of 4, but presented a pattern of errors that gradually resembled typical development. The effect of type of school was marked by quantitative differences, while the effect of SLI was characterised by both quantitative and qualitative differences.

  20. Transana Qualitative Video and Audio Analysis Software as a Tool for Teaching Intellectual Assessment Skills to Graduate Psychology Students

    ERIC Educational Resources Information Center

    Rush, S. Craig

    2014-01-01

    This article draws on the author's experience using qualitative video and audio analysis, most notably through use of the Transana qualitative video and audio analysis software program, as an alternative method for teaching IQ administration skills to students in a graduate psychology program. Qualitative video and audio analysis may be useful for…

  1. Phonological therapy in jargon aphasia: effects on naming and neologisms.

    PubMed

    Bose, Arpita

    2013-01-01

    Jargon aphasia is one of the most intractable forms of aphasia, with limited recommendations on amelioration of the associated naming difficulties and neologisms. The few naming therapy studies that exist in jargon aphasia have utilized either semantic or phonological approaches, but the results have been equivocal. Moreover, the effect of therapy on the characteristics of neologisms is less explored. This study investigates the effectiveness of a phonological naming therapy (phonological component analysis, PCA) on picture-naming abilities and on quantitative and qualitative changes in neologisms for an individual with jargon aphasia (FF). FF showed evidence of jargon aphasia with severe naming difficulties and produced a very high proportion of neologisms. A single-subject multiple-probe design across behaviours was employed to evaluate the effects of PCA therapy on accuracy for three sets of words. In therapy, a phonological components analysis chart was used to identify five phonological components (i.e., rhymes, first sound, first sound associate, final sound, and number of syllables) for each target word. Generalization effects (change in per cent accuracy and error pattern) were examined by comparing pre- and post-therapy responses on the Philadelphia Naming Test, and these responses were analysed to explore the characteristics of the neologisms. The quantitative change in neologisms was measured by the change in the proportion of neologisms from pre- to post-therapy, and the qualitative change was indexed by the phonological overlap between target and neologism. As a consequence of PCA therapy, FF showed a significant improvement in his ability to name the treated items. His performance in the maintenance and follow-up phases remained comparable with his performance during the therapy phases. Generalization to other naming tasks did not show a change in accuracy, but distinct differences in error pattern (an increase in the proportion of real-word responses and a decrease in the proportion of neologisms) were observed. Notably, the decrease in neologisms occurred with a corresponding trend for an increase in the phonological similarity between the neologisms and the targets. This study demonstrated the effectiveness of a phonological therapy for improving naming abilities and reducing the amount of neologisms in an individual with severe jargon aphasia. The positive outcome of this research is encouraging, as it provides evidence for effective therapies for jargon aphasia and also emphasizes that the quality and quantity of errors may provide a sensitive outcome measure to determine therapy effectiveness, in particular for client groups who are difficult to treat. © 2013 Royal College of Speech and Language Therapists.

  2. Automatic detection of MLC relative position errors for VMAT using the EPID-based picket fence test

    NASA Astrophysics Data System (ADS)

    Christophides, Damianos; Davies, Alex; Fleckney, Mark

    2016-12-01

    Multi-leaf collimators (MLCs) ensure the accurate delivery of treatments requiring complex beam fluences, such as intensity modulated radiotherapy and volumetric modulated arc therapy. The purpose of this work is to automate the detection of MLC relative position errors ≥0.5 mm using electronic portal imaging device-based picket fence tests and to compare the results with the qualitative assessment currently in use. Picket fence tests with and without intentional MLC errors were measured weekly on three Varian linacs. The picket fence images analysed covered a period of 14-20 months, depending on the linac. An algorithm was developed that calculated the MLC error for each leaf pair present in the picket fence images. The baseline error distributions of each linac were characterised over an initial period of 6 months and compared with the intentional MLC errors using statistical metrics. The distributions of the median and the one-sample Kolmogorov-Smirnov test p-value exhibited no overlap between baseline and intentional errors and were used retrospectively to detect MLC errors automatically in routine clinical practice. Agreement was found between the MLC errors detected by the automatic method and the fault reports during clinical use, as well as interventions for MLC repair and calibration. In conclusion, the method presented provides for full automation of MLC quality assurance based on individual linac performance characteristics. The automatic method has been shown to provide early warning of MLC errors that resulted in clinical downtime.
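
    A simplified sketch of the detection logic, assuming per-leaf-pair errors have already been extracted from an EPID image: compare the image's error distribution against the linac's 6-month baseline using the median and a one-sample Kolmogorov-Smirnov test. The thresholds here are hypothetical, not the paper's calibrated values:

        import numpy as np
        from scipy import stats

        def flag_mlc_image(errors, base_mu, base_sigma,
                           median_tol=0.25, p_tol=0.05):
            """Flag one picket-fence image given per-leaf-pair errors (mm)."""
            med = np.median(np.abs(errors))
            # one-sample KS test against the baseline error distribution
            _, p = stats.kstest(errors, 'norm', args=(base_mu, base_sigma))
            return med > median_tol or p < p_tol      # True -> raise an alert

        # a simulated faulty image: leaves systematically offset by 0.6 mm
        print(flag_mlc_image(np.random.normal(0.6, 0.1, 60), 0.0, 0.1))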

  3. Defining and classifying medical error: lessons for patient safety reporting systems.

    PubMed

    Tamuz, M; Thomas, E J; Franchois, K E

    2004-02-01

    It is important for healthcare providers to report safety related events, but little attention has been paid to how the definition and classification of events affects a hospital's ability to learn from its experience. To examine how the definition and classification of safety related events influences key organizational routines for gathering information, allocating incentives, and analyzing event reporting data. In semi-structured interviews, professional staff and administrators in a tertiary care teaching hospital and its pharmacy were asked to describe the existing programs designed to monitor medication safety, including the reporting systems. With a focus primarily on the pharmacy staff, interviews were audio recorded, transcribed, and analyzed using qualitative research methods. Eighty six interviews were conducted, including 36 in the hospital pharmacy. Examples are presented which show that: (1) the definition of an event could lead to under-reporting; (2) the classification of a medication error into alternative categories can influence the perceived incentives and disincentives for incident reporting; (3) event classification can enhance or impede organizational routines for data analysis and learning; and (4) routines that promote organizational learning within the pharmacy can reduce the flow of medication error data to the hospital. These findings from one hospital raise important practical and research questions about how reporting systems are influenced by the definition and classification of safety related events. By understanding more clearly how hospitals define and classify their experience, we may improve our capacity to learn and ultimately improve patient safety.

  4. Qualitative Data Analysis for Health Services Research: Developing Taxonomy, Themes, and Theory

    PubMed Central

    Bradley, Elizabeth H; Curry, Leslie A; Devers, Kelly J

    2007-01-01

    Objective To provide practical strategies for conducting and evaluating analyses of qualitative data applicable for health services researchers. Data Sources and Design We draw on extant qualitative methodological literature to describe practical approaches to qualitative data analysis. Approaches to data analysis vary by discipline and analytic tradition; however, we focus on qualitative data analysis that has as a goal the generation of taxonomy, themes, and theory germane to health services research. Principal Findings We describe an approach to qualitative data analysis that applies the principles of inductive reasoning while also employing predetermined code types to guide data analysis and interpretation. These code types (conceptual, relationship, perspective, participant characteristics, and setting codes) define a structure that is appropriate for generation of taxonomy, themes, and theory. Conceptual codes and subcodes facilitate the development of taxonomies. Relationship and perspective codes facilitate the development of themes and theory. Intersectional analyses with data coded for participant characteristics and setting codes can facilitate comparative analyses. Conclusions Qualitative inquiry can improve the description and explanation of complex, real-world phenomena pertinent to health services research. Greater understanding of the processes of qualitative data analysis can be helpful for health services researchers as they use these methods themselves or collaborate with qualitative researchers from a wide range of disciplines. PMID:17286625

  5. Qualitative data analysis for health services research: developing taxonomy, themes, and theory.

    PubMed

    Bradley, Elizabeth H; Curry, Leslie A; Devers, Kelly J

    2007-08-01

    To provide practical strategies for conducting and evaluating analyses of qualitative data applicable for health services researchers. DATA SOURCES AND DESIGN: We draw on extant qualitative methodological literature to describe practical approaches to qualitative data analysis. Approaches to data analysis vary by discipline and analytic tradition; however, we focus on qualitative data analysis that has as a goal the generation of taxonomy, themes, and theory germane to health services research. We describe an approach to qualitative data analysis that applies the principles of inductive reasoning while also employing predetermined code types to guide data analysis and interpretation. These code types (conceptual, relationship, perspective, participant characteristics, and setting codes) define a structure that is appropriate for generation of taxonomy, themes, and theory. Conceptual codes and subcodes facilitate the development of taxonomies. Relationship and perspective codes facilitate the development of themes and theory. Intersectional analyses with data coded for participant characteristics and setting codes can facilitate comparative analyses. Qualitative inquiry can improve the description and explanation of complex, real-world phenomena pertinent to health services research. Greater understanding of the processes of qualitative data analysis can be helpful for health services researchers as they use these methods themselves or collaborate with qualitative researchers from a wide range of disciplines.

  6. Comparison analysis between filtered back projection and algebraic reconstruction technique on microwave imaging

    NASA Astrophysics Data System (ADS)

    Ramadhan, Rifqi; Prabowo, Rian Gilang; Aprilliyani, Ria; Basari

    2018-02-01

    The number of victims of acute cancers and tumors grows each year, and cancer has become one of the leading causes of human death in the world. Cancer or tumor tissue cells are cells that grow abnormally, taking over and damaging the surrounding tissues. Cancers or tumors often lack definite symptoms in their early stages and can attack tissues deep inside the body, where they are not identifiable by visual human observation. Therefore, an early detection system that is cheap, quick, simple, and portable is essential to anticipate the further development of a cancer or tumor. Among the available modalities, microwave imaging is considered a cheaper, simpler, and more portable method. There are at least two simple image reconstruction algorithms, i.e., Filtered Back Projection (FBP) and the Algebraic Reconstruction Technique (ART), which have been adopted in some common modalities. In this paper, both algorithms are compared by reconstructing the image of an artificial tissue model (i.e., a phantom) that has two different dielectric distributions. We addressed two performance comparisons, namely qualitative and quantitative analysis. Qualitative analysis includes the smoothness of the image and the success in distinguishing dielectric differences by observing the image with human eyesight. Quantitative analysis includes histogram, Structural Similarity Index (SSIM), Mean Squared Error (MSE), and Peak Signal-to-Noise Ratio (PSNR) calculations. As a result, the quantitative parameters of FBP might show better values than those of ART. However, ART is likely more capable of distinguishing two different dielectric values than FBP, owing to its higher contrast and wider grayscale distribution.
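
    The quantitative comparison described above is straightforward to reproduce, for example with scikit-image; the phantom and reconstruction below are random placeholders standing in for the FBP or ART output:

        import numpy as np
        from skimage.metrics import (mean_squared_error,
                                     peak_signal_noise_ratio,
                                     structural_similarity)

        reference = np.random.rand(128, 128)          # placeholder phantom
        recon = reference + np.random.normal(0, 0.05, reference.shape)

        mse = mean_squared_error(reference, recon)
        psnr = peak_signal_noise_ratio(reference, recon, data_range=1.0)
        ssim = structural_similarity(reference, recon, data_range=1.0)
        print(f"MSE={mse:.4f}  PSNR={psnr:.1f} dB  SSIM={ssim:.3f}")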

  7. Influence of rumen protozoa on methane emission in ruminants: a meta-analysis approach.

    PubMed

    Guyader, J; Eugène, M; Nozière, P; Morgavi, D P; Doreau, M; Martin, C

    2014-11-01

    A meta-analysis was conducted to evaluate the effects of protozoa concentration on methane emission from ruminants. A database was built from 59 publications reporting data from 76 in vivo experiments. The experiments included in the database recorded methane production and rumen protozoa concentration measured on the same groups of animals. Quantitative data such as diet chemical composition, rumen fermentation, and microbial parameters, and qualitative information such as methane mitigation strategies, were also collected. In the database, 31% of the experiments reported a concomitant reduction of both protozoa concentration and methane emission (g/kg dry matter intake). Nearly all of these experiments tested lipids as methane mitigation strategies. By contrast, 21% of the experiments reported a variation in methane emission without changes in protozoa numbers, indicating that methanogenesis is also regulated by other mechanisms not involving protozoa. Experiments that used chemical compounds as an antimethanogenic treatment belonged to this group. The relationship between methane emission and protozoa concentration was studied with a variance-covariance model, with experiment as a fixed effect. The experiments included in the analysis had a within-experiment variation of protozoa concentration higher than 5.3 log10 cells/ml, corresponding to the average s.e.m. of the database for this variable. To detect potential interfering factors for the relationship, the influence of several qualitative and quantitative secondary factors was tested. This meta-analysis showed a significant linear relationship between methane emission and protozoa concentration: methane (g/kg dry matter intake) = -30.7 + 8.14 × protozoa (log10 cells/ml), based on 28 experiments (91 treatments), with residual mean square error = 1.94 and adjusted R² = 0.90. The proportion of butyrate in the rumen positively influenced the least squares means of this relationship.
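
    As a worked example of applying the fitted relationship (the input concentration is illustrative):

        # CH4 (g/kg DMI) = -30.7 + 8.14 * protozoa (log10 cells/ml)
        protozoa_log10 = 5.5                       # about 3.2e5 cells/ml
        methane = -30.7 + 8.14 * protozoa_log10    # = 14.07 g CH4 / kg DMI
        print(round(methane, 2))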

  8. Fast classification and compositional analysis of cornstover fractions using Fourier transform near-infrared techniques.

    PubMed

    Philip Ye, X; Liu, Lu; Hayes, Douglas; Womac, Alvin; Hong, Kunlun; Sokhansanj, Shahab

    2008-10-01

    The objectives of this research were to determine the variation of chemical composition across botanical fractions of cornstover, and to probe the potential of Fourier transform near-infrared (FT-NIR) techniques for qualitatively classifying separated cornstover fractions and for quantitatively analyzing the chemical composition of cornstover by developing calibration models that predict composition from FT-NIR spectra. The large variation in cornstover chemical composition needed for the wide calibration ranges of a reliable calibration model was achieved by manually separating the cornstover samples into six botanical fractions, whose chemical compositions were determined by conventional wet chemical analyses; these analyses confirmed that chemical composition varies significantly among botanical fractions of cornstover. In descending order of total saccharide content, the botanical fractions are husk, sheath, pith, rind, leaf, and node. Based on FT-NIR spectra acquired on the biomass, Soft Independent Modeling of Class Analogy (SIMCA) was employed for qualitative classification of cornstover fractions, and partial least squares (PLS) regression was used for quantitative chemical composition analysis. SIMCA successfully classified the botanical fractions of cornstover. The developed PLS model yielded root mean square errors of prediction (RMSEP, %w/w) of 0.92, 1.03, 0.17, 0.27, 0.21, 1.12, and 0.57 for glucan, xylan, galactan, arabinan, mannan, lignin, and ash, respectively. The results show the potential of FT-NIR techniques, in combination with multivariate analysis, to be utilized by biomass feedstock suppliers, bioethanol manufacturers, and bio-power producers to better manage bioenergy feedstocks and enhance bioconversion.

  9. Insufficient Hartree–Fock Exchange in Hybrid DFT Functionals Produces Bent Alkynyl Radical Structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oyeyemi, Victor B.; Keith, John A.; Pavone, Michele

    2012-01-11

    Density functional theory (DFT) is often used to determine the electronic and geometric structures of molecules. While studying alkynyl radicals, we discovered that DFT exchange-correlation (XC) functionals containing less than ~22% Hartree–Fock (HF) exchange led to qualitatively different structures than those predicted from ab initio HF and post-HF calculations or DFT XCs containing 25% or more HF exchange. We attribute this discrepancy to rehybridization at the radical center due to electron delocalization across the triple bonds of the alkynyl groups, which itself is an artifact of self-interaction and delocalization errors. Inclusion of sufficient exact exchange reduces these errors and suppresses this erroneous delocalization; we find that a threshold amount is needed for accurate structure determinations. Finally, below this threshold, significant errors in predicted alkyne thermochemistry emerge as a consequence.

  10. Methodology for cork plank characterization (Quercus suber L.) by near-infrared spectroscopy and image analysis

    NASA Astrophysics Data System (ADS)

    Prades, Cristina; García-Olmo, Juan; Romero-Prieto, Tomás; García de Ceca, José L.; López-Luque, Rafael

    2010-06-01

    The procedures used today to characterize cork plank for the manufacture of cork bottle stoppers continue to be based on a traditional, manual method that is highly subjective. Furthermore, there is no specific legislation regarding cork classification. The objective of this viability study is to assess the potential of near-infrared spectroscopy (NIRS) technology for characterizing cork plank according to the following variables: aspect or visual quality, porosity, moisture and geographical origin. In order to calculate the porosity coefficient, an image analysis program was specifically developed in Visual Basic language for a desktop scanner. A set comprising 170 samples from two geographical areas of Andalusia (Spain) was classified into eight quality classes by visual inspection. Spectra were obtained in the transverse and tangential sections of the cork planks using an NIRSystems 6500 SY II reflectance spectrophotometer. The quantitative calibrations showed cross-validation coefficients of determination of 0.47 for visual quality, 0.69 for porosity and 0.66 for moisture. The results obtained using NIRS technology are promising considering the heterogeneity and variability of a natural product such as cork in spite of the fact that the standard error of cross validation (SECV) in the quantitative analysis is greater than the standard error of laboratory (SEL) for the three variables. The qualitative analysis regarding geographical origin achieved very satisfactory results. Applying these methods in industry will permit quality control procedures to be automated, as well as establishing correlations between the different classification systems currently used in the sector. These methods can be implemented in the cork chain of custody certification and will also provide a certainly more objective tool for assessing the economic value of the product.

  11. How do geometry-related parameters influence the clinical performance of orthodontic mini-implants? A systematic review and meta-analysis.

    PubMed

    Cunha, A C; da Veiga, A M A; Masterson, D; Mattos, C T; Nojima, L I; Nojima, M C G; Maia, L C

    2017-12-01

    The aim of this systematic review and meta-analysis was to investigate how parameters related to geometry influence the clinical performance of orthodontic mini-implants (MIs). Systematic searches were performed in electronic databases, including MEDLINE, Scopus, Web of Science, Virtual Health Library, and the Cochrane Library, and in reference lists, up to March 2016. Eligibility criteria comprised clinical studies involving patients who received MIs for orthodontic anchorage, with data for categories of MI dimension, shape, thread design, and insertion site, evaluated by assessment of primary and secondary stability. Study selection, data extraction, quality assessment, and a meta-analysis were carried out. Twenty-seven studies were included in the qualitative synthesis: five randomized, eight prospective, and 14 retrospective clinical studies. One study with a serious risk of bias was later excluded. Medium and short MIs (1.4-1.9 mm diameter and 5-8 mm length) presented the highest success rates (0.87, 95% CI 0.80-0.92). A maximum insertion torque of 13.28 Ncm (standard error 0.34) was observed for tapered self-drilling MIs in the mandible, whereas cylindrical MIs in the maxilla presented a maximum removal torque of 10.01 Ncm (standard error 0.17). Moderate evidence indicates that the clinical performance of MIs is influenced by implant geometry parameters and is also related to properties of the insertion site. However, further research is necessary to support these associations. Copyright © 2017 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  12. Clinical decision support alert malfunctions: analysis and empirically derived taxonomy.

    PubMed

    Wright, Adam; Ai, Angela; Ash, Joan; Wiesen, Jane F; Hickman, Thu-Trang T; Aaron, Skye; McEvoy, Dustin; Borkowsky, Shane; Dissanayake, Pavithra I; Embi, Peter; Galanter, William; Harper, Jeremy; Kassakian, Steve Z; Ramoni, Rachel; Schreiber, Richard; Sirajuddin, Anwar; Bates, David W; Sittig, Dean F

    2018-05-01

    To develop an empirically derived taxonomy of clinical decision support (CDS) alert malfunctions. We identified CDS alert malfunctions using a mix of qualitative and quantitative methods: (1) site visits with interviews of chief medical informatics officers, CDS developers, clinical leaders, and CDS end users; (2) surveys of chief medical informatics officers; (3) analysis of CDS firing rates; and (4) analysis of CDS overrides. We used a multi-round, manual, iterative card sort to develop a multi-axial, empirically derived taxonomy of CDS malfunctions. We analyzed 68 CDS alert malfunction cases from 14 sites across the United States with diverse electronic health record systems. Four primary axes emerged: the cause of the malfunction, its mode of discovery, when it began, and how it affected rule firing. Build errors, conceptualization errors, and the introduction of new concepts or terms were the most frequent causes. User reports were the predominant mode of discovery. Many malfunctions within our database caused rules to fire for patients for whom they should not have (false positives), but the reverse (false negatives) was also common. Across organizations and electronic health record systems, similar malfunction patterns recurred. Challenges included updates to code sets and values, software issues at the time of system upgrades, difficulties with migration of CDS content between computing environments, and the challenge of correctly conceptualizing and building CDS. CDS alert malfunctions are frequent. The empirically derived taxonomy formalizes the common recurring issues that cause these malfunctions, helping CDS developers anticipate and prevent CDS malfunctions before they occur or detect and resolve them expediently.

  13. African Primary Care Research: Qualitative data analysis and writing results

    PubMed Central

    Govender, Indiran; Ogunbanjo, Gboyega A.; Mash, Bob

    2014-01-01

    This article is part of a series on African primary care research and gives practical guidance on qualitative data analysis and the presentation of qualitative findings. After an overview of qualitative methods and analytical approaches, the article focuses particularly on content analysis, using the framework method as an example. The steps of familiarisation, creating a thematic index, indexing, charting, interpretation and confirmation are described. Key concepts with regard to establishing the quality and trustworthiness of data analysis are described. Finally, an approach to the presentation of qualitative findings is given. PMID:26245437

  14. African Primary Care Research: qualitative data analysis and writing results.

    PubMed

    Mabuza, Langalibalele H; Govender, Indiran; Ogunbanjo, Gboyega A; Mash, Bob

    2014-06-05

    This article is part of a series on African primary care research and gives practical guidance on qualitative data analysis and the presentation of qualitative findings. After an overview of qualitative methods and analytical approaches, the article focuses particularly on content analysis, using the framework method as an example. The steps of familiarisation, creating a thematic index, indexing, charting, interpretation and confirmation are described. Key concepts with regard to establishing the quality and trustworthiness of data analysis are described. Finally, an approach to the presentation of qualitative findings is given.

  15. The Focinator v2-0 - Graphical Interface, Four Channels, Colocalization Analysis and Cell Phase Identification.

    PubMed

    Oeck, Sebastian; Malewicz, Nathalie M; Hurst, Sebastian; Al-Refae, Klaudia; Krysztofiak, Adam; Jendrossek, Verena

    2017-07-01

    The quantitative analysis of foci plays an important role in various cell-biological methods. In the fields of radiation biology and experimental oncology, the effect of ionizing radiation, chemotherapy, or molecularly targeted drugs on DNA damage induction and repair is frequently assessed by analyzing protein clusters or phosphorylated proteins recruited to so-called repair foci at DNA damage sites, involving for example γ-H2A.X, 53BP1, or RAD51. We recently developed "The Focinator" as a reliable and fast tool for automated quantitative and qualitative analysis of nuclei and DNA damage foci. The refined software is now even more user-friendly thanks to a graphical interface and further features. We included an R-script-based mode for automated image opening, file naming, progress monitoring, and error reporting. Consequently, the evaluation no longer requires the attendance of the operator after initial parameter definition. Moreover, the Focinator v2-0 can now perform multi-channel analysis of four channels and evaluate protein-protein colocalization by comparing up to three foci channels. This enables, for example, the quantification of foci in cells of a specific cell-cycle phase.
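
    The underlying foci-quantification idea can be illustrated with a simplified threshold-and-label sketch; this is not the Focinator's ImageJ/R implementation, and the image and size cutoff are placeholders:

        import numpy as np
        from scipy import ndimage
        from skimage.filters import threshold_otsu

        def count_foci(nucleus_img, min_size=4):
            """Count bright foci in one nucleus image (simplified illustration)."""
            mask = nucleus_img > threshold_otsu(nucleus_img)
            labels, n = ndimage.label(mask)
            sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
            return int((sizes >= min_size).sum())  # drop specks below min_size

        print(count_foci(np.random.rand(64, 64)))   # placeholder image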

  16. Parallel realities: exploring poverty dynamics using mixed methods in rural Bangladesh.

    PubMed

    Davis, Peter; Baulch, Bob

    2011-01-01

    This paper explores the implications of using two methodological approaches to study poverty dynamics in rural Bangladesh. Using data from a unique longitudinal study, we show how different methods lead to very different assessments of socio-economic mobility. We suggest five ways of reconciling these differences: considering assets in addition to expenditures, proximity to the poverty line, other aspects of well-being, household division, and qualitative recall errors. Considering assets and proximity to the poverty line along with expenditures resolves three-fifths of the qualitative and quantitative differences. Use of such integrated mixed-methods can therefore improve the reliability of poverty dynamics research.

  17. Reliability of a Qualitative Video Analysis for Running.

    PubMed

    Pipkin, Andrew; Kotecki, Kristy; Hetzel, Scott; Heiderscheit, Bryan

    2016-07-01

    Study Design Reliability study. Background Video analysis of running gait is frequently performed in orthopaedic and sports medicine practices to assess biomechanical factors that may contribute to injury. However, the reliability of a whole-body assessment has not been determined. Objective To determine the intrarater and interrater reliability of the qualitative assessment of specific running kinematics from a 2-dimensional video. Methods Running-gait analysis was performed on videos recorded from 15 individuals (8 male, 7 female) running at a self-selected pace (3.17 ± 0.40 m/s, 8:28 ± 1:04 min/mi) using a high-speed camera (120 frames per second). These videos were independently rated on 2 occasions by 3 experienced physical therapists using a standardized qualitative assessment. Fifteen sagittal and frontal plane kinematic variables were rated on a 3- or 5-point categorical scale at specific events of the gait cycle, including initial contact (n = 3) and midstance (n = 9), or across the full gait cycle (n = 3). The video frame number corresponding to each gait event was also recorded. Intrarater and interrater reliability values were calculated for gait-event detection (intraclass correlation coefficient [ICC] and standard error of measurement [SEM]) and the individual kinematic variables (weighted kappa [κw]). Results Gait-event detection was highly reproducible within raters (ICC = 0.94-1.00; SEM, 0.3-1.0 frames) and between raters (ICC = 0.77-1.00; SEM, 0.4-1.9 frames). Eleven of the 15 kinematic variables demonstrated substantial (κw = 0.60-0.799) or excellent (κw > 0.80) intrarater agreement; the exceptions were foot-to-center-of-mass position (κw = 0.59), forefoot position (κw = 0.58), ankle dorsiflexion at midstance (κw = 0.49), and center-of-mass vertical excursion (κw = 0.36). Interrater agreement for the kinematic measures varied more widely (κw = 0.00-0.85), with 5 variables showing substantial or excellent reliability. Conclusion The qualitative assessment of specific kinematic measures during running can be reliably performed with the use of a high-speed video camera. Detection of specific gait events was highly reproducible, as were common kinematic variables such as rearfoot position, foot-strike pattern, tibial inclination angle, knee flexion angle, and forward trunk lean. Other variables should be used with caution. J Orthop Sports Phys Ther 2016;46(7):556-561. Epub 6 Jun 2016. doi:10.2519/jospt.2016.6280.
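
    The weighted kappa used for the categorical ratings can be computed as below; the rating vectors are invented and the linear weighting scheme is an assumption, since the abstract does not state which weights were used:

    ```python
    from sklearn.metrics import cohen_kappa_score

    # Two raters scoring the same 10 videos on a 3-point categorical scale.
    rater_1 = [0, 1, 2, 1, 0, 2, 2, 1, 0, 1]
    rater_2 = [0, 1, 2, 2, 0, 2, 1, 1, 0, 1]
    print(f"weighted kappa = {cohen_kappa_score(rater_1, rater_2, weights='linear'):.2f}")
    ```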

  18. 42 CFR 431.992 - Corrective action plan.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... CMS, designed to reduce improper payments in each program based on its analysis of the error causes in... State must take the following actions: (1) Data analysis. States must conduct data analysis such as reviewing clusters of errors, general error causes, characteristics, and frequency of errors that are...

  19. 42 CFR 431.992 - Corrective action plan.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... CMS, designed to reduce improper payments in each program based on its analysis of the error causes in... State must take the following actions: (1) Data analysis. States must conduct data analysis such as reviewing clusters of errors, general error causes, characteristics, and frequency of errors that are...

  20. Qualitative Research in Palliative Care: Applications to Clinical Trials Work.

    PubMed

    Lim, Christopher T; Tadmor, Avia; Fujisawa, Daisuke; MacDonald, James J; Gallagher, Emily R; Eusebio, Justin; Jackson, Vicki A; Temel, Jennifer S; Greer, Joseph A; Hagan, Teresa; Park, Elyse R

    2017-08-01

    While vast opportunities for using qualitative methods exist within palliative care research, few studies provide practical advice for researchers and clinicians as a roadmap to identify and utilize such opportunities. To provide palliative care clinicians and researchers with descriptions of qualitative methodology applied to innovative research questions relevant to palliative care research and to define basic concepts in qualitative research. We describe three qualitative projects as exemplars of major concepts in qualitative analysis of early palliative care: (1) a descriptive analysis of clinician documentation in the electronic health record, (2) a thematic content analysis of palliative care clinician focus groups, and (3) a framework analysis of audio-recorded encounters between patients and clinicians as part of a clinical trial. This study provides a foundation for undertaking qualitative research within palliative care and serves as a framework for use by other palliative care researchers interested in qualitative methodologies.

  1. Correction to the paper “A simple model to determine the interrelation between the integral characteristics of Hall thrusters” [Plasma Physics Reports 40, 229 (2014)]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shumilin, V. P.; Shumilin, A. V.; Shumilin, N. V., E-mail: vladimirshumilin@yahoo.com

    2015-11-15

    The paper is devoted to comparison of experimental data with theoretical predictions concerning the dependence of the current of accelerated ions on the operating voltage of a Hall thruster with an anode layer. The error made in the paper published by the authors in Plasma Phys. Rep. 40, 229 (2014) occurred because of a misprint in the Encyclopedia of Low-Temperature Plasma. In the present paper, this error is corrected. It is shown that the simple model proposed in the above-mentioned paper is in qualitative and quantitative agreement with experimental results.

  2. Low-dimensional Representation of Error Covariance

    NASA Technical Reports Server (NTRS)

    Tippett, Michael K.; Cohn, Stephen E.; Todling, Ricardo; Marchesin, Dan

    2000-01-01

    Ensemble and reduced-rank approaches to prediction and assimilation rely on low-dimensional approximations of the estimation error covariances. Here stability properties of the forecast/analysis cycle for linear, time-independent systems are used to identify factors that cause the steady-state analysis error covariance to admit a low-dimensional representation. A useful measure of forecast/analysis cycle stability is the bound matrix, a function of the dynamics, observation operator and assimilation method. Upper and lower estimates for the steady-state analysis error covariance matrix eigenvalues are derived from the bound matrix. The estimates generalize to time-dependent systems. If much of the steady-state analysis error variance is due to a few dominant modes, the leading eigenvectors of the bound matrix approximate those of the steady-state analysis error covariance matrix. The analytical results are illustrated in two numerical examples where the Kalman filter is carried to steady state. The first example uses the dynamics of a generalized advection equation exhibiting nonmodal transient growth. Failure to observe growing modes leads to increased steady-state analysis error variances. Leading eigenvectors of the steady-state analysis error covariance matrix are well approximated by leading eigenvectors of the bound matrix. The second example uses the dynamics of a damped baroclinic wave model. The leading eigenvectors of a lowest-order approximation of the bound matrix are shown to approximate well the leading eigenvectors of the steady-state analysis error covariance matrix.
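
    A minimal numerical sketch of the idea, with assumed linear dynamics and observation operator (not the advection or baroclinic models used in the paper): iterate the forecast/analysis cycle of a Kalman filter to steady state and ask how much analysis error variance the leading modes carry.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, p = 20, 5
    A = rng.standard_normal((n, n))
    M = 0.9 * A / np.max(np.abs(np.linalg.eigvals(A)))  # stable dynamics
    H = np.eye(p, n)                                    # observe first p components
    Q, R = 0.01 * np.eye(n), 0.10 * np.eye(p)           # model/observation error cov.

    P = np.eye(n)                                       # analysis error covariance
    for _ in range(500):                                # forecast/analysis cycle
        Pf = M @ P @ M.T + Q                            # forecast error covariance
        K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)  # Kalman gain
        P = (np.eye(n) - K @ H) @ Pf                    # analysis update

    evals = np.linalg.eigvalsh(P)                       # ascending eigenvalues
    print("variance fraction in 3 leading modes:", evals[-3:].sum() / evals.sum())
    ```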

  3. Error-Analysis for Correctness, Effectiveness, and Composing Procedure.

    ERIC Educational Resources Information Center

    Ewald, Helen Rothschild

    The assumptions underpinning grammatical mistakes can often be detected by looking for patterns of errors in a student's work. Assumptions that negatively influence rhetorical effectiveness can similarly be detected through error analysis. On a smaller scale, error analysis can also reveal assumptions affecting rhetorical choice. Snags in the…

  4. Automatic Error Analysis Using Intervals

    ERIC Educational Resources Information Center

    Rothwell, E. J.; Cloud, M. J.

    2012-01-01

    A technique for automatic error analysis using interval mathematics is introduced. A comparison to standard error propagation methods shows that in cases involving complicated formulas, the interval approach gives comparable error estimates with much less effort. Several examples are considered, and numerical errors are computed using the INTLAB…

  5. Accuracy of Digital vs Conventional Implant Impression Approach: A Three-Dimensional Comparative In Vitro Analysis.

    PubMed

    Basaki, Kinga; Alkumru, Hasan; De Souza, Grace; Finer, Yoav

    To assess the three-dimensional (3D) accuracy and clinical acceptability of implant definitive casts fabricated using a digital impression approach and to compare the results with those of a conventional impression method in a partially edentulous condition. A mandibular reference model was fabricated with implants in the first premolar and molar positions to simulate a patient with bilateral posterior edentulism. Ten implant-level impressions per method were made using either an intraoral scanner with scanning abutments for the digital approach or an open-tray technique and polyvinylsiloxane material for the conventional approach. 3D analysis and comparison of implant location on the resultant definitive casts were performed using a laser scanner and quality-control software. The inter-implant distances and inter-implant angulations for each implant pair were measured for the reference model and for each definitive cast (n = 20 per group); these measurements were compared to calculate the magnitude of error in 3D for each definitive cast. The influence of implant angulation on definitive cast accuracy was evaluated for both digital and conventional approaches. Statistical analysis was performed using a t test (α = .05) for implant position and angulation. Clinical qualitative assessment of accuracy was done via assessment of the passivity of a master verification stent for each implant pair, and significance was analyzed using a chi-square test (α = .05). A 3D error of implant positioning was observed for the two impression techniques vs the reference model, with mean ± standard deviation (SD) errors of 116 ± 94 μm and 56 ± 29 μm for the digital and conventional approaches, respectively (P = .01). In contrast, the inter-implant angulation errors were not significantly different between the two techniques (P = .83). Implant angulation did not have a significant influence on definitive cast accuracy within either technique (P = .64). The verification stent demonstrated acceptable passive fit for 11 out of 20 casts and 18 out of 20 casts for the digital and conventional methods, respectively (P = .01). Definitive casts fabricated using the digital impression approach were less accurate than those fabricated from the conventional impression approach for this simulated clinical scenario. A significant number of definitive casts generated by the digital technique did not meet clinically acceptable accuracy for the fabrication of a multiple implant-supported restoration.
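
    The 3D comparison reduces to elementary vector geometry. A hedged sketch with invented coordinates (not the study's measurements) of the inter-implant distance and angulation errors:

    ```python
    import numpy as np

    def pair_errors(ref_pts, cast_pts, ref_axes, cast_axes):
        """Distance error (input units) and angulation error (degrees)."""
        angle = lambda a, b: np.degrees(np.arccos(np.clip(
            np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)), -1.0, 1.0)))
        dist_err = abs(np.linalg.norm(cast_pts[1] - cast_pts[0])
                       - np.linalg.norm(ref_pts[1] - ref_pts[0]))
        ang_err = abs(angle(*cast_axes) - angle(*ref_axes))
        return dist_err, ang_err

    ref = np.array([[0.0, 0.0, 0.0], [22.0, 0.0, 0.0]])        # mm, reference model
    cast = np.array([[0.0, 0.0, 0.0], [22.05, 0.02, 0.01]])    # mm, definitive cast
    axes_ref = (np.array([0.0, 0, 1]), np.array([0.0, 0, 1]))  # implant long axes
    axes_cast = (np.array([0.0, 0, 1]), np.array([0.01, 0, 1]))
    print(pair_errors(ref, cast, axes_ref, axes_cast))         # (~0.05 mm, ~0.6 deg)
    ```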

  6. The impact of response measurement error on the analysis of designed experiments

    DOE PAGES

    Anderson-Cook, Christine Michaela; Hamada, Michael Scott; Burr, Thomas Lee

    2016-11-01

    This study considers the analysis of designed experiments when there is measurement error in the true response or so-called response measurement error. We consider both additive and multiplicative response measurement errors. Through a simulation study, we investigate the impact of ignoring the response measurement error in the analysis, that is, by using a standard analysis based on t-tests. In addition, we examine the role of repeat measurements in improving the quality of estimation and prediction in the presence of response measurement error. We also study a Bayesian approach that accounts for the response measurement error directly through the specification of the model, and allows including additional information about variability in the analysis. We consider the impact on power, prediction, and optimization. Copyright © 2015 John Wiley & Sons, Ltd.
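
    A hedged simulation sketch of the central point (parameters invented, not the paper's settings): additive response measurement error inflates the residual variance and erodes the power of a standard t-test analysis.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)

    def power(me_sd, n=10, delta=1.0, sims=2000, alpha=0.05):
        hits = 0
        for _ in range(sims):
            a = rng.normal(0.0, 1.0, n) + rng.normal(0.0, me_sd, n)    # control
            b = rng.normal(delta, 1.0, n) + rng.normal(0.0, me_sd, n)  # treatment
            hits += stats.ttest_ind(a, b).pvalue < alpha
        return hits / sims

    for me_sd in (0.0, 0.5, 1.0):   # measurement-error SD vs. unit process SD
        print(me_sd, power(me_sd))  # power drops as measurement error grows
    ```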

  7. The impact of response measurement error on the analysis of designed experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson-Cook, Christine Michaela; Hamada, Michael Scott; Burr, Thomas Lee

    This study considers the analysis of designed experiments when there is measurement error in the true response or so-called response measurement error. We consider both additive and multiplicative response measurement errors. Through a simulation study, we investigate the impact of ignoring the response measurement error in the analysis, that is, by using a standard analysis based on t-tests. In addition, we examine the role of repeat measurements in improving the quality of estimation and prediction in the presence of response measurement error. We also study a Bayesian approach that accounts for the response measurement error directly through the specification of the model, and allows including additional information about variability in the analysis. We consider the impact on power, prediction, and optimization. Copyright © 2015 John Wiley & Sons, Ltd.

  8. Fully automated registration of first-pass myocardial perfusion MRI using independent component analysis.

    PubMed

    Milles, J; van der Geest, R J; Jerosch-Herold, M; Reiber, J H C; Lelieveldt, B P F

    2007-01-01

    This paper presents a novel method for registration of cardiac perfusion MRI. The presented method successfully corrects for breathing motion without any manual interaction, using Independent Component Analysis to extract physiologically relevant features together with their time-intensity behavior. A time-varying reference image mimicking intensity changes in the data of interest is computed based on the results of ICA and used to compute the displacement caused by breathing for each frame. Qualitative and quantitative validation of the method is carried out using 46 clinical-quality, short-axis perfusion MR datasets comprising 100 images each. Validation experiments showed a reduction of the average LV motion from 1.26 ± 0.87 to 0.64 ± 0.46 pixels. Time-intensity curves are also improved after registration, with an average error reduced from 2.65 ± 7.89% to 0.87 ± 3.88% between registered data and the manual gold standard. We conclude that this fully automatic ICA-based method shows excellent accuracy, robustness and computation speed, adequate for use in a clinical environment.
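
    A schematic sketch of the ICA step only (random arrays standing in for a perfusion series; the component count and frame size are assumptions): treating each frame as one observation yields component time courses and spatial maps.

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    frames = np.random.rand(100, 64 * 64)              # 100 frames, flattened 64x64
    ica = FastICA(n_components=5, random_state=0)
    time_courses = ica.fit_transform(frames)           # (100, 5) time-intensity curves
    spatial_maps = ica.components_.reshape(5, 64, 64)  # one map per component
    ```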

  9. Patterns of verbal memory performance in mild cognitive impairment, Alzheimer disease, and normal aging.

    PubMed

    Greenaway, Melanie C; Lacritz, Laura H; Binegar, Dani; Weiner, Myron F; Lipton, Anne; Munro Cullum, C

    2006-06-01

    Individuals with mild cognitive impairment (MCI) typically demonstrate memory loss that falls between normal aging (NA) and Alzheimer disease (AD), but little is known about the pattern of memory dysfunction in MCI. To explore this issue, California Verbal Learning Test (CVLT) performance was examined across groups of MCI, AD, and NA. MCI subjects displayed a pattern of deficits closely resembling that of AD, characterized by reduced learning, rapid forgetting, increased recency recall, elevated intrusion errors, and poor recognition discriminability with increased false-positives. MCI performance was significantly worse than that of controls and better than that of AD patients across memory indices. Although qualitative analysis of CVLT profiles may be useful in individual cases, discriminant function analysis revealed that delayed recall and total learning were the best aspects of learning/memory on the CVLT in differentiating MCI, AD, and NA. These findings support the position that amnestic MCI represents an early point of decline on the continuum of AD that is different from normal aging.

  10. Forensic identification science evidence since Daubert: Part II--judicial reasoning in decisions to exclude forensic identification evidence on grounds of reliability.

    PubMed

    Page, Mark; Taylor, Jane; Blenkin, Matt

    2011-07-01

    Many studies regarding the legal status of forensic science have relied on the U.S. Supreme Court's mandate in Daubert v. Merrell Dow Pharmaceuticals Inc., and its progeny in order to make subsequent recommendations or rebuttals. This paper focuses on a more pragmatic approach to analyzing forensic science's immediate deficiencies by considering a qualitative analysis of actual judicial reasoning where forensic identification evidence has been excluded on reliability grounds since the Daubert precedent. Reliance on general acceptance is becoming insufficient as proof of the admissibility of forensic evidence. The citation of unfounded statistics, error rates and certainties, a failure to document the analytical process or follow standardized procedures, and the existence of observer bias represent some of the concerns that have led to the exclusion or limitation of forensic identification evidence. Analysis of these reasons may serve to refocus forensic practitioners' testimony, resources, and research toward rectifying shortfalls in these areas. © 2011 American Academy of Forensic Sciences.

  11. Blackboard architecture for medical image interpretation

    NASA Astrophysics Data System (ADS)

    Davis, Darryl N.; Taylor, Christopher J.

    1991-06-01

    There is a growing interest in using sophisticated knowledge-based systems for biomedical image interpretation. We present a principled attempt to use artificial intelligence methodologies in interpreting lateral skull x-ray images. Such radiographs are routinely used in cephalometric analysis to provide quantitative measurements useful to clinical orthodontists. Manual and interactive methods of analysis are known to be error prone and previous attempts to automate this analysis typically fail to capture the expertise and adaptability required to cope with the variability in biological structure and image quality. An integrated model-based system has been developed which makes use of a blackboard architecture and multiple knowledge sources. A model definition interface allows quantitative models, of feature appearance and location, to be built from examples as well as more qualitative modelling constructs. Visual task definition and blackboard control modules allow task-specific knowledge sources to act on information available to the blackboard in a hypothesise and test reasoning cycle. Further knowledge-based modules include object selection, location hypothesis, intelligent segmentation, and constraint propagation systems. Alternative solutions to given tasks are permitted.

  12. An Array of Qualitative Data Analysis Tools: A Call for Data Analysis Triangulation

    ERIC Educational Resources Information Center

    Leech, Nancy L.; Onwuegbuzie, Anthony J.

    2007-01-01

    One of the most important steps in the qualitative research process is analysis of data. The purpose of this article is to provide elements for understanding multiple types of qualitative data analysis techniques available and the importance of utilizing more than one type of analysis, thus utilizing data analysis triangulation, in order to…

  13. Temporal lobe stimulation reveals anatomic distinction between auditory naming processes.

    PubMed

    Hamberger, M J; Seidel, W T; Goodman, R R; Perrine, K; McKhann, G M

    2003-05-13

    Language errors induced by cortical stimulation can provide insight into function(s) supported by the area stimulated. The authors observed that some stimulation-induced errors during auditory description naming were characterized by tip-of-the-tongue responses or paraphasic errors, suggesting expressive difficulty, whereas others were qualitatively different, suggesting receptive difficulty. They hypothesized that these two response types reflected disruption at different stages of auditory verbal processing and that these "subprocesses" might be supported by anatomically distinct cortical areas. To explore the topographic distribution of error types in auditory verbal processing. Twenty-one patients requiring left temporal lobe surgery underwent preresection language mapping using direct cortical stimulation. Auditory naming was tested at temporal sites extending from 1 cm from the anterior tip to the parietal operculum. Errors were dichotomized as either "expressive" or "receptive." The topographic distribution of error types was explored. Sites associated with the two error types were topographically distinct from one another. Most receptive sites were located in the middle portion of the superior temporal gyrus (STG), whereas most expressive sites fell outside this region, scattered along lateral temporal and temporoparietal cortex. Results raise clinical questions regarding the inclusion of the STG in temporal lobe epilepsy surgery and suggest that more detailed cortical mapping might enable better prediction of postoperative language decline. From a theoretical perspective, results carry implications regarding the understanding of structure-function relations underlying temporal lobe mediation of auditory language processing.

  14. Continuous quantum error correction for non-Markovian decoherence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oreshkov, Ognyan; Brun, Todd A.; Communication Sciences Institute, University of Southern California, Los Angeles, California 90089

    2007-08-15

    We study the effect of continuous quantum error correction in the case where each qubit in a codeword is subject to a general Hamiltonian interaction with an independent bath. We first consider the scheme in the case of a trivial single-qubit code, which provides useful insights into the workings of continuous error correction and the difference between Markovian and non-Markovian decoherence. We then study the model of a bit-flip code with each qubit coupled to an independent bath qubit and subject to continuous correction, and find its solution. We show that for sufficiently large error-correction rates, the encoded state approximately follows an evolution of the type of a single decohering qubit, but with an effectively decreased coupling constant. The factor by which the coupling constant is decreased scales quadratically with the error-correction rate. This is compared to the case of Markovian noise, where the decoherence rate is effectively decreased by a factor which scales only linearly with the rate of error correction. The quadratic enhancement depends on the existence of a Zeno regime in the Hamiltonian evolution which is absent in purely Markovian dynamics. We analyze the range of validity of this result and identify two relevant time scales. Finally, we extend the result to more general codes and argue that the performance of continuous error correction will exhibit the same qualitative characteristics.
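
    In assumed notation (κ = error-correction rate, S = the suppression factor described above; the symbols are not the paper's), the contrast reads:

    ```latex
    S_{\text{non-Markovian}} \propto \kappa^{2},
    \qquad
    S_{\text{Markovian}} \propto \kappa
    ```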

  15. Erratum: Raman linewidths and rotationally inelastic collision rates in nitrogen [J. Chem. Phys. 98, 257 (1993)]

    NASA Astrophysics Data System (ADS)

    Green, Sheldon

    1993-09-01

    A computer program error led to erroneous results in the titled paper. Corrected generalized IOS cross sections are significantly changed, especially at lower collision energies. These changes tend to cancel in predicted Raman linewidths; there is a systematic increase of 10-15%, changing quantitative, but not qualitative, comparisons with experimental data.

  16. Fast and reliable symplectic integration for planetary system N-body problems

    NASA Astrophysics Data System (ADS)

    Hernandez, David M.

    2016-06-01

    We apply one of the exactly symplectic integrators, which we call HB15, of Hernandez & Bertschinger, along with the Kepler problem solver of Wisdom & Hernandez, to solve planetary system N-body problems. We compare the method to Wisdom-Holman (WH) methods in the MERCURY software package, the MERCURY switching integrator, and others and find HB15 to be the most efficient method or tied for the most efficient method in many cases. Unlike WH, HB15 solved N-body problems exhibiting close encounters with small, acceptable error, although frequent encounters slowed the code. Switching maps like MERCURY change between two methods and are not exactly symplectic. We carry out careful tests on their properties and suggest that they must be used with caution. We then use different integrators to solve a three-body problem consisting of a binary planet orbiting a star. For all tested tolerances and time steps, MERCURY unbinds the binary after 0 to 25 years. However, in the solutions of HB15, a time-symmetric HERMITE code, and a symplectic Yoshida method, the binary remains bound for >1000 years. The methods' solutions are qualitatively different, despite small errors in the first integrals in most cases. Several checks suggest that the qualitative binary behaviour of HB15's solution is correct. The Bulirsch-Stoer and Radau methods in the MERCURY package also unbind the binary before a time of 50 years, suggesting that this dynamical error is due to a MERCURY bug.
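
    A minimal illustration of the property at stake (a generic leapfrog sketch, not HB15 or any MERCURY algorithm): a symplectic integrator keeps the energy error of a Kepler orbit bounded over long integrations.

    ```python
    import numpy as np

    acc = lambda x: -x / np.linalg.norm(x) ** 3        # point-mass gravity, GM = 1

    def leapfrog(x, v, dt, steps):
        for _ in range(steps):
            v = v + 0.5 * dt * acc(x)
            x = x + dt * v
            v = v + 0.5 * dt * acc(x)
        return x, v

    x, v = np.array([1.0, 0.0]), np.array([0.0, 1.0])  # circular-orbit initial data
    E0 = 0.5 * v @ v - 1.0 / np.linalg.norm(x)
    x, v = leapfrog(x, v, dt=1e-2, steps=100_000)
    E = 0.5 * v @ v - 1.0 / np.linalg.norm(x)
    print("relative energy error:", abs((E - E0) / E0))  # stays small, ~O(dt^2)
    ```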

  17. Measurement Error and Equating Error in Power Analysis

    ERIC Educational Resources Information Center

    Phillips, Gary W.; Jiang, Tao

    2016-01-01

    Power analysis is a fundamental prerequisite for conducting scientific research. Without power analysis the researcher has no way of knowing whether the sample size is large enough to detect the effect he or she is looking for. This paper demonstrates how psychometric factors such as measurement error and equating error affect the power of…
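
    A sketch of the mechanism under classical test theory (numbers invented; statsmodels' TTestIndPower is assumed available): measurement error attenuates the observable effect size by the square root of score reliability, which in turn lowers power.

    ```python
    from statsmodels.stats.power import TTestIndPower

    solver = TTestIndPower()
    d_true, n, alpha = 0.5, 64, 0.05
    for reliability in (1.0, 0.9, 0.8, 0.7):      # score reliability of the measure
        d_obs = d_true * reliability ** 0.5       # attenuated observable effect size
        pw = solver.power(effect_size=d_obs, nobs1=n, alpha=alpha)
        print(f"reliability {reliability:.1f}: power {pw:.2f}")
    ```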

  18. Qualitative Data Analysis: A Compendium of Techniques and a Framework for Selection for School Psychology Research and Beyond

    ERIC Educational Resources Information Center

    Leech, Nancy L.; Onwuegbuzie, Anthony J.

    2008-01-01

    Qualitative researchers in school psychology have a multitude of analyses available for data. The purpose of this article is to present several of the most common methods for analyzing qualitative data. Specifically, the authors describe the following 18 qualitative analysis techniques: method of constant comparison analysis, keywords-in-context,…

  19. QUAGOL: a guide for qualitative data analysis.

    PubMed

    Dierckx de Casterlé, Bernadette; Gastmans, Chris; Bryon, Els; Denier, Yvonne

    2012-03-01

    Data analysis is a complex and contested part of the qualitative research process, which has received limited theoretical attention. Researchers are often in need of useful instructions or guidelines on how to analyze the mass of qualitative data, but face a lack of clear guidance for using particular analytic methods. The aim of this paper is to propose and discuss the Qualitative Analysis Guide of Leuven (QUAGOL), a guide that was developed in order to truly capture the rich insights of qualitative interview data. The article describes six major problems researchers often struggle with during the process of qualitative data analysis. Consequently, the QUAGOL is proposed as a guide to facilitate the process of analysis. Challenges that emerged and lessons learned from our own extensive experience with qualitative data analysis within the Grounded Theory Approach, as well as from those of other researchers (as described in the literature), were discussed and recommendations were presented. Strengths and pitfalls of the proposed method were discussed in detail. The Qualitative Analysis Guide of Leuven (QUAGOL) offers a comprehensive method to guide the process of qualitative data analysis. The process consists of two parts, each consisting of five stages. The method is systematic but not rigid. It is characterized by iterative processes of digging deeper, constantly moving between the various stages of the process. As such, it aims to stimulate the researcher's intuition and creativity as much as possible. The QUAGOL guide is a theory- and practice-based guide that supports and facilitates the process of analysis of qualitative interview data. Although the method can facilitate the process of analysis, it cannot guarantee automatic quality. The skills of the researcher and the quality of the research team remain the most crucial components of a successful process of analysis. Additionally, the importance of constantly moving between the various stages throughout the research process cannot be overstated. Copyright © 2011 Elsevier Ltd. All rights reserved.

  20. Ethnographic study of ICT-supported collaborative work routines in general practice.

    PubMed

    Swinglehurst, Deborah; Greenhalgh, Trisha; Myall, Michelle; Russell, Jill

    2010-12-29

    Health informatics research has traditionally been dominated by experimental and quasi-experimental designs. An emerging area of study in organisational sociology is routinisation (how collaborative work practices become business-as-usual). There is growing interest in the use of ethnography and other in-depth qualitative approaches to explore how collaborative work routines are enacted and develop over time, and how electronic patient records (EPRs) are used to support collaborative work practices within organisations. Following Feldman and Pentland, we will use 'the organisational routine' as our unit of analysis. In a sample of four UK general practices, we will collect narratives, ethnographic observations, multi-modal (video and screen capture) data, documents and other artefacts, and analyse these to map and compare the different understandings and enactments of three common routines (repeat prescribing, coding and summarising, and chronic disease surveillance) which span clinical and administrative spaces and which, though 'mundane', have an important bearing on quality and safety of care. In a detailed qualitative analysis informed by sociological theory, we aim to generate insights about how complex collaborative work is achieved through the process of routinisation in healthcare organisations. Our study offers the potential not only to identify potential quality failures (poor performance, errors, failures of coordination) in collaborative work routines but also to reveal the hidden work and workarounds by front-line staff which bridge the model-reality gap in EPR technologies and via which "automated" safety features have an impact in practice.

  1. BeerOz, a set of Matlab routines for the quantitative interpretation of spectrophotometric measurements of metal speciation in solution

    NASA Astrophysics Data System (ADS)

    Brugger, Joël

    2007-02-01

    The modelling of the speciation and mobility of metals under surface and hydrothermal conditions relies on the availability of accurate thermodynamic properties for all relevant minerals, aqueous species, gases and surface species. Spectroscopic techniques obeying the Beer-Lambert law can be used to obtain thermodynamic properties for reactions among aqueous species (e.g., ligand substitution; protonation). BeerOz is a set of Matlab routines designed to perform both qualitative and quantitative analysis of spectroscopic data following the Beer-Lambert law. BeerOz is modular and can be customised for particular experimental strategies or for simultaneous refinement of several datasets obtained using different techniques. Distribution-of-species calculations are performed using an implementation of the EQBRM code, which allows for customised activity coefficient calculations. BeerOz also contains routines to study the n-dimensional solution space, in order to provide realistic estimates of errors and to test for the existence of multiple local minima and correlation between the different refined variables. The paper reviews the physical principles underlying the qualitative and quantitative analysis of spectroscopic data collected on aqueous speciation, in particular for studying successive ligand replacement reactions, and presents the non-linear least-squares algorithm implemented in BeerOz. The discussion is illustrated using UV-Vis spectra collected on acidic Fe(III) solutions containing varying LiCl concentrations, showing the change from the hexaaquo Fe(H2O)6³⁺ complex to the tetrahedral FeCl4⁻ complex.
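
    The quantitative core of such fits is the Beer-Lambert linearity of absorbance in species concentrations. A simplified sketch with synthetic data (two species, unit path length; BeerOz itself additionally refines equilibrium constants and activity models):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    C = rng.uniform(0.0, 1e-3, (20, 2))                # species concentrations, mol/L
    E_true = rng.uniform(1e2, 1e4, (2, 300))           # molar absorptivities, L/mol/cm
    A = C @ E_true + rng.normal(0.0, 1e-4, (20, 300))  # spectra: A = C E + noise
    E_fit, *_ = np.linalg.lstsq(C, A, rcond=None)      # recover absorptivities
    print("max relative error:", np.max(np.abs(E_fit - E_true) / E_true))
    ```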

  2. Fatigue and mental health in Australian rural and regional ambulance personnel.

    PubMed

    Pyper, Zoe; Paterson, Jessica L

    2016-02-01

    Australian ambulance personnel experience stress, fatigue and exposure to traumatic events. These risks have been extensively researched in metropolitan paramedics. However, there has been limited research in rural and regional personnel. Rural and regional ambulance personnel make up a significant proportion of the Australian ambulance workforce and may be exposed to unique stressors. The aim of the current study was to investigate levels of fatigue, stress, and emotional trauma in rural and regional ambulance personnel. A sample of 134 (103 male, 31 female) rural and regional ambulance personnel completed a mixed methods survey assessing fatigue, stress and emotional trauma. Data were analysed using a combination of descriptive analysis and qualitative, deductive analysis that involved data immersion, coding, and categorisation. Participants reported high levels of fatigue and emotional trauma. Qualitative data revealed stressors including community expectations and 'office politics'. Participants also reported negative effects of fatigue including errors in drug administration and falling asleep while driving. The majority of participants reported normal levels of stress. It may be the case that working with known individuals in a community offers some degree of 'protective' impact for stress in rural and regional ambulance personnel. This is one of the first studies to investigate fatigue, stress, and emotional trauma in a rural and regional ambulance population. Results indicate a complex and unique profile of risks and challenges for this critical and understudied community resource. © 2015 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.

  3. Iranian family caregivers’ challenges and issues in caring of multiple sclerosis patients: A descriptive explorative qualitative study

    PubMed Central

    Masoudi, Reza; Abedi, Heidar Ali; Abedi, Parvin; Mohammadianinejad, Seyed Ehsan

    2014-01-01

    Background: The broad spectrum of problems caused by multiple sclerosis (MS) imposes heavy responsibility to caregivers in caring of their patients. Therefore, they encounter many issues and challenges in this situation. The purpose of this study was to explore the experiences and challenges of MS family caregivers. Materials and Methods: A qualitative design, based on a thematic analysis approach, was used to reach the study aim. Data were collected and analyzed concurrently through in-depth unstructured interviews, field notes, and observations that were held with 23 participants (14 family caregivers and 9 MS patients) at two referral centers in Ahvaz, Iran. Findings: Three major themes were extracted from the analysis of the transcripts: “emotional exhaustion of caregivers,” “uncertain atmosphere of caring,” and “insularity care.” The first theme consisted of three subthemes: “stressful atmosphere of caring,” “conflict and animism,” and “continuing distress affecting the caregiver.” The second theme consisted of three subthemes: “unstable and complicacy of disease,” “caring with trial and error,” and “frequent hospitalization of patients,” and the third theme consisted of two subthemes: “caring gap and disintegration” and “lack of sufficient support.” Conclusions: This study will be useful to healthcare system for managing the challenges of MS patients’ family caregivers. Improving the conditions and performance of family caregivers is crucial in order to provide high-quality care to people with MS. PMID:25183985

  4. Ethnographic study of ICT-supported collaborative work routines in general practice

    PubMed Central

    2010-01-01

    Background Health informatics research has traditionally been dominated by experimental and quasi-experimental designs. An emerging area of study in organisational sociology is routinisation (how collaborative work practices become business-as-usual). There is growing interest in the use of ethnography and other in-depth qualitative approaches to explore how collaborative work routines are enacted and develop over time, and how electronic patient records (EPRs) are used to support collaborative work practices within organisations. Methods/design Following Feldman and Pentland, we will use 'the organisational routine' as our unit of analysis. In a sample of four UK general practices, we will collect narratives, ethnographic observations, multi-modal (video and screen capture) data, documents and other artefacts, and analyse these to map and compare the different understandings and enactments of three common routines (repeat prescribing, coding and summarising, and chronic disease surveillance) which span clinical and administrative spaces and which, though 'mundane', have an important bearing on quality and safety of care. In a detailed qualitative analysis informed by sociological theory, we aim to generate insights about how complex collaborative work is achieved through the process of routinisation in healthcare organisations. Discussion Our study offers the potential not only to identify potential quality failures (poor performance, errors, failures of coordination) in collaborative work routines but also to reveal the hidden work and workarounds by front-line staff which bridge the model-reality gap in EPR technologies and via which "automated" safety features have an impact in practice. PMID:21190583

  5. Using Framework Analysis in nursing research: a worked example.

    PubMed

    Ward, Deborah J; Furber, Christine; Tierney, Stephanie; Swallow, Veronica

    2013-11-01

    To demonstrate Framework Analysis using a worked example and to illustrate how criticisms of qualitative data analysis including issues of clarity and transparency can be addressed. Critics of the analysis of qualitative data sometimes cite lack of clarity and transparency about analytical procedures; this can deter nurse researchers from undertaking qualitative studies. Framework Analysis is flexible, systematic, and rigorous, offering clarity, transparency, an audit trail, an option for theme-based and case-based analysis and for readily retrievable data. This paper offers further explanation of the process undertaken which is illustrated with a worked example. Data were collected from 31 nursing students in 2009 using semi-structured interviews. The data collected are not reported directly here but used as a worked example for the five steps of Framework Analysis. Suggestions are provided to guide researchers through essential steps in undertaking Framework Analysis. The benefits and limitations of Framework Analysis are discussed. Nurses increasingly use qualitative research methods and need to use an analysis approach that offers transparency and rigour which Framework Analysis can provide. Nurse researchers may find the detailed critique of Framework Analysis presented in this paper a useful resource when designing and conducting qualitative studies. Qualitative data analysis presents challenges in relation to the volume and complexity of data obtained and the need to present an 'audit trail' for those using the research findings. Framework Analysis is an appropriate, rigorous and systematic method for undertaking qualitative analysis. © 2013 Blackwell Publishing Ltd.

  6. Learn by Yourself: The Self-Learning Tools for Qualitative Analysis Software Packages

    ERIC Educational Resources Information Center

    Freitas, Fábio; Ribeiro, Jaime; Brandão, Catarina; Reis, Luís Paulo; de Souza, Francislê Neri; Costa, António Pedro

    2017-01-01

    Computer Assisted Qualitative Data Analysis Software (CAQDAS) are tools that help researchers to develop qualitative research projects. These software packages help the users with tasks such as transcription analysis, coding and text interpretation, writing and annotation, content search and analysis, recursive abstraction, grounded theory…

  7. Optimizing ChIP-seq peak detectors using visual labels and supervised machine learning

    PubMed Central

    Goerner-Potvin, Patricia; Morin, Andreanne; Shao, Xiaojian; Pastinen, Tomi

    2017-01-01

    Motivation: Many peak detection algorithms have been proposed for ChIP-seq data analysis, but it is not obvious which algorithm and what parameters are optimal for any given dataset. In contrast, regions with and without obvious peaks can be easily labeled by visual inspection of aligned read counts in a genome browser. We propose a supervised machine learning approach for ChIP-seq data analysis, using labels that encode qualitative judgments about which genomic regions contain or do not contain peaks. The main idea is to manually label a small subset of the genome, and then learn a model that makes consistent peak predictions on the rest of the genome. Results: We created 7 new histone mark datasets with 12 826 visually determined labels, and analyzed 3 existing transcription factor datasets. We observed that default peak detection parameters yield high false positive rates, which can be reduced by learning parameters using a relatively small training set of labeled data from the same experiment type. We also observed that labels from different people are highly consistent. Overall, these data indicate that our supervised labeling method is useful for quantitatively training and testing peak detection algorithms. Availability and Implementation: Labeled histone mark data http://cbio.ensmp.fr/~thocking/chip-seq-chunk-db/, R package to compute the label error of predicted peaks https://github.com/tdhock/PeakError Contacts: toby.hocking@mail.mcgill.ca or guil.bourque@mcgill.ca Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27797775

  8. Four-dimensional data coupled to alternating weighted residue constraint quadrilinear decomposition model applied to environmental analysis: Determination of polycyclic aromatic hydrocarbons.

    PubMed

    Liu, Tingting; Zhang, Ling; Wang, Shutao; Cui, Yaoyao; Wang, Yutian; Liu, Lingfei; Yang, Zhe

    2018-03-15

    Qualitative and quantitative analysis of polycyclic aromatic hydrocarbons (PAHs) was carried out by three-dimensional fluorescence spectroscopy combined with Alternating Weighted Residue Constraint Quadrilinear Decomposition (AWRCQLD). The experimental subjects were acenaphthene (ANA) and naphthalene (NAP). First, to reduce the redundancy of the three-dimensional fluorescence spectral data, the wavelet transform was used to compress the data during preprocessing. Then, four-dimensional data were constructed from the excitation-emission fluorescence spectra of PAHs at different concentrations. The sample data were obtained from three solvents: methanol, ethanol and ultra-pure water. The four-dimensional spectral data were analyzed by AWRCQLD, and the recovery rates of the PAHs in the three solvents were obtained and compared. The results showed, on one hand, that PAHs can be measured more accurately from the higher-order data, with higher recovery rates; on the other hand, AWRCQLD better reflects the superiority of the four-dimensional algorithm relative to second-order calibration and other third-order calibration algorithms. The recovery rate of ANA was 96.5%-103.3% and the root mean square error of prediction was 0.04 μg L⁻¹. The recovery rate of NAP was 96.7%-115.7% and the root mean square error of prediction was 0.06 μg L⁻¹. Copyright © 2017 Elsevier B.V. All rights reserved.
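
    AWRCQLD itself is not publicly distributed, but the general multilinear idea can be sketched with a standard CP/PARAFAC decomposition of a four-way array (tensorly assumed; dimensions and rank are illustrative):

    ```python
    import numpy as np
    import tensorly as tl
    from tensorly.decomposition import parafac

    # samples x excitation x emission x solvent; random stand-in data
    X = tl.tensor(np.random.rand(10, 50, 60, 3))
    weights, factors = parafac(X, rank=2, n_iter_max=200)
    concentrations = factors[0]   # sample-mode loadings track analyte concentration
    print(concentrations.shape)   # (10, 2)
    ```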

  9. Optimizing ChIP-seq peak detectors using visual labels and supervised machine learning.

    PubMed

    Hocking, Toby Dylan; Goerner-Potvin, Patricia; Morin, Andreanne; Shao, Xiaojian; Pastinen, Tomi; Bourque, Guillaume

    2017-02-15

    Many peak detection algorithms have been proposed for ChIP-seq data analysis, but it is not obvious which algorithm and what parameters are optimal for any given dataset. In contrast, regions with and without obvious peaks can be easily labeled by visual inspection of aligned read counts in a genome browser. We propose a supervised machine learning approach for ChIP-seq data analysis, using labels that encode qualitative judgments about which genomic regions contain or do not contain peaks. The main idea is to manually label a small subset of the genome, and then learn a model that makes consistent peak predictions on the rest of the genome. We created 7 new histone mark datasets with 12 826 visually determined labels, and analyzed 3 existing transcription factor datasets. We observed that default peak detection parameters yield high false positive rates, which can be reduced by learning parameters using a relatively small training set of labeled data from the same experiment type. We also observed that labels from different people are highly consistent. Overall, these data indicate that our supervised labeling method is useful for quantitatively training and testing peak detection algorithms. Labeled histone mark data http://cbio.ensmp.fr/~thocking/chip-seq-chunk-db/ , R package to compute the label error of predicted peaks https://github.com/tdhock/PeakError. toby.hocking@mail.mcgill.ca or guil.bourque@mcgill.ca. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
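
    The label-error idea can be sketched as follows (a simplification of the PeakError package's rules; here a 'peak' label merely requires some overlapping predicted peak):

    ```python
    def label_error(peaks, labels):
        """peaks: [(start, end)]; labels: [(start, end, 'peak' or 'noPeak')]."""
        errors = 0
        for start, end, kind in labels:
            overlaps = any(ps < end and pe > start for ps, pe in peaks)
            errors += (kind == "peak") != overlaps
        return errors / len(labels)

    print(label_error([(100, 200)], [(50, 250, "peak"), (300, 400, "noPeak")]))  # 0.0
    print(label_error([(350, 380)], [(50, 250, "peak"), (300, 400, "noPeak")]))  # 1.0
    ```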

  10. Latent human error analysis and efficient improvement strategies by fuzzy TOPSIS in aviation maintenance tasks.

    PubMed

    Chiu, Ming-Chuan; Hsieh, Min-Chih

    2016-05-01

    The purposes of this study were to develop a latent human error analysis process, to explore the factors of latent human error in aviation maintenance tasks, and to provide an efficient improvement strategy for addressing those errors. First, we used HFACS and RCA to define the error factors related to aviation maintenance tasks. Fuzzy TOPSIS with four criteria was applied to evaluate the error factors. Results show that 1) adverse physiological states, 2) physical/mental limitations, and 3) coordination, communication, and planning are the factors related to airline maintenance tasks that could be addressed easily and efficiently. This research establishes a new analytic process for investigating latent human error and provides a strategy for analyzing human error using fuzzy TOPSIS. Our analysis process complements shortages in existing methodologies by incorporating improvement efficiency, and it enhances the depth and broadness of human error analysis methodology. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
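
    For orientation, a minimal crisp TOPSIS sketch (the study uses a fuzzy variant; the scores, weights and all-benefit-criteria assumption below are invented):

    ```python
    import numpy as np

    X = np.array([[7.0, 8, 6, 5],    # error factors (rows) scored on criteria (cols)
                  [6.0, 9, 7, 4],
                  [8.0, 6, 8, 6]])
    w = np.array([0.3, 0.3, 0.2, 0.2])
    V = w * X / np.linalg.norm(X, axis=0)        # weighted, vector-normalised matrix
    ideal, anti = V.max(axis=0), V.min(axis=0)   # all criteria treated as benefits
    d_pos = np.linalg.norm(V - ideal, axis=1)    # distance to ideal solution
    d_neg = np.linalg.norm(V - anti, axis=1)     # distance to anti-ideal solution
    print("closeness:", d_neg / (d_pos + d_neg)) # higher = better-ranked factor
    ```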

  11. The Role of Model and Initial Condition Error in Numerical Weather Forecasting Investigated with an Observing System Simulation Experiment

    NASA Technical Reports Server (NTRS)

    Prive, Nikki C.; Errico, Ronald M.

    2013-01-01

    A series of experiments that explore the roles of model and initial condition error in numerical weather prediction are performed using an observing system simulation experiment (OSSE) framework developed at the National Aeronautics and Space Administration Global Modeling and Assimilation Office (NASA/GMAO). The use of an OSSE allows the analysis and forecast errors to be explicitly calculated, and different hypothetical observing networks can be tested with ease. In these experiments, both a full global OSSE framework and an 'identical twin' OSSE setup are utilized to compare the behavior of the data assimilation system and evolution of forecast skill with and without model error. The initial condition error is manipulated by varying the distribution and quality of the observing network and the magnitude of observation errors. The results show that model error has a strong impact on both the quality of the analysis field and the evolution of forecast skill, including both systematic and unsystematic model error components. With a realistic observing network, the analysis state retains a significant quantity of error due to systematic model error. If errors of the analysis state are minimized, model error acts to rapidly degrade forecast skill during the first 24-48 hours of forward integration. In the presence of model error, the impact of observation errors on forecast skill is small, but in the absence of model error, observation errors cause a substantial degradation of the skill of medium range forecasts.

  12. Ethical and professional challenges posed by patients with genetic concerns: a report of focus group discussions with genetic counselors, physicians, and nurses.

    PubMed

    Veach, P M; Bartels, D M; LeRoy, B S

    2001-04-01

    Ninety-seven physicians, nurses, and genetic counselors from four regions within the United States participated in focus groups to identify the types of ethical and professional challenges that arise when their patients have genetic concerns. Responses were taped and transcribed and then analyzed using the Hill et al. (1997, Counsel Psychol 25:517-522) Consensual Qualitative Research method of analysis. Sixteen major ethical and professional domains and 63 subcategories were identified. Major domains are informed consent; withholding information; facing uncertainty; resource allocation; value conflicts, directiveness/nondirectiveness; determining the primary patient; professional identity issues; emotional responses; diversity issues; confidentiality; attaining/maintaining proficiency; professional misconduct; discrimination; colleague error; and documentation. Implications for practitioners who deal with genetic issues and recommendations for additional research are given.

  13. Effects of Correlated Errors on the Analysis of Space Geodetic Data

    NASA Technical Reports Server (NTRS)

    Romero-Wolf, Andres; Jacobs, C. S.

    2011-01-01

    As thermal errors are reduced, instrumental and troposphere correlated errors will become increasingly important. Work in progress shows that troposphere covariance error models improve data analysis results. We expect to see stronger effects with higher data rates. Temperature modeling of delay errors may further reduce temporal correlations in the data.

  14. Using sediment 'fingerprints' to assess sediment-budget errors, north Halawa Valley, Oahu, Hawaii, 1991-92

    USGS Publications Warehouse

    Hill, B.R.; DeCarlo, E.H.; Fuller, C.C.; Wong, M.F.

    1998-01-01

    Reliable estimates of sediment-budget errors are important for interpreting sediment-budget results. Sediment-budget errors are commonly considered equal to sediment-budget imbalances, which may underestimate actual sediment-budget errors if they include compensating positive and negative errors. We modified the sediment 'fingerprinting' approach to qualitatively evaluate compensating errors in an annual (1991) fine (<63 μm) sediment budget for the North Halawa Valley, a mountainous, forested drainage basin on the island of Oahu, Hawaii, during construction of a major highway. We measured concentrations of aeolian quartz and 137Cs in sediment sources and fluvial sediments, and combined concentrations of these aerosols with the sediment budget to construct aerosol budgets. Aerosol concentrations were independent of the sediment budget, hence aerosol budgets were less likely than sediment budgets to include compensating errors. Differences between sediment-budget and aerosol-budget imbalances therefore provide a measure of compensating errors in the sediment budget. The sediment-budget imbalance equalled 25% of the fluvial fine-sediment load. Aerosol-budget imbalances were equal to 19% of the fluvial 137Cs load and 34% of the fluvial quartz load. The reasonably close agreement between sediment- and aerosol-budget imbalances indicates that compensating errors in the sediment budget were not large and that the sediment-budget imbalance is a reliable measure of sediment-budget error. We attribute at least one-third of the 1991 fluvial fine-sediment load to highway construction. Continued monitoring indicated that highway construction produced 90% of the fluvial fine-sediment load during 1992. Erosion of channel margins and attrition of coarse particles provided most of the fine sediment produced by natural processes. Hillslope processes contributed relatively minor amounts of sediment.
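
    The cross-check reduces to comparing imbalance fractions across budgets; a sketch with invented source loads chosen so the fractions mirror those reported above:

    ```python
    def imbalance(source_loads, fluvial_load):
        """Budget imbalance as a fraction of the measured fluvial load."""
        return (fluvial_load - sum(source_loads)) / fluvial_load

    print(imbalance([40.0, 35.0], 100.0))  # sediment budget: 0.25
    print(imbalance([45.0, 36.0], 100.0))  # 137Cs budget:    0.19
    print(imbalance([30.0, 36.0], 100.0))  # quartz budget:   0.34
    ```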

  15. [Qualitative research in health services research - discussion paper, Part 2: Qualitative research in health services research in Germany - an overview].

    PubMed

    Karbach, U; Stamer, M; Holmberg, C; Güthlin, C; Patzelt, C; Meyer, T

    2012-08-01

    This is the second part of a 3-part discussion paper by the working group on "Qualitative Methods" in the German network of health services research (DNVF) that shall contribute to the development of a memorandum concerning qualitative health services research. It aims to depict the different types of qualitative research that are conducted in health services research in Germany. In addition, the authors present a specific set of qualitative data collection and analysis tools to demonstrate the potential of qualitative research for health services research. QUALITATIVE RESEARCH IN HEALTH SERVICES RESEARCH - AN OVERVIEW: To give an overview of the types of qualitative research conducted in German health services research, the abstracts of the 8th German Conference on Health Services Research were filtered to identify qualitative or mixed-methods studies. These were then analysed by looking at the context which was studied, who was studied, the aims of the studies, and what type of methods were used. Those methods that were mentioned most often for data collection and analysis are described in detail. QUALITATIVE RESEARCH AT THE CONFERENCE FOR HEALTH SERVICES RESEARCH 2009: Approximately a fifth of all abstracts (n=74) had a qualitative (n=47) or a mixed-methods approach combining quantitative and qualitative methods (n=27). Research aims included needs assessment (41%), survey development (36%), evaluation (22%), and theorizing (1%). Data collection mostly consisted of one-on-one interviews (n=45) and group discussions (n=29). Qualitative content analysis was named in 35 abstracts, 30 abstracts did not reference their method of analysis. In addition to a quantitative summary of the abstract findings, the diversity of fields addressed by qualitative methods is highlighted. Although drawing conclusions on the use of qualitative methods in German health services research from the analysis of conference abstracts is not possible, the overview we present demonstrates the diversity of methods used for data collection and analysis and showed that a few select methods are extensively used. One of the tasks a memorandum of qualitative health services research should accomplish is to highlight underutilized research methods, which may help to develop the potential of qualitative methodology in German health services research. © Georg Thieme Verlag KG Stuttgart · New York.

  16. Beyond Constant Comparison Qualitative Data Analysis: Using NVivo

    ERIC Educational Resources Information Center

    Leech, Nancy L.; Onwuegbuzie, Anthony J.

    2011-01-01

    The purposes of this paper are to outline seven types of qualitative data analysis techniques, to present step-by-step guidance for conducting these analyses via a computer-assisted qualitative data analysis software program (i.e., NVivo9), and to present screenshots of the data analysis process. Specifically, the following seven analyses are…

  17. Skylab water balance error analysis

    NASA Technical Reports Server (NTRS)

    Leonard, J. I.

    1977-01-01

    Estimates of the precision of the net water balance were obtained for the entire Skylab preflight and inflight phases as well as for the first two weeks of flight. Quantitative estimates of both total sampling errors and instrumentation errors were obtained. It was shown that measurement error is minimal in comparison to biological variability and little can be gained from improvement in analytical accuracy. In addition, a propagation of error analysis demonstrated that total water balance error could be accounted for almost entirely by the errors associated with body mass changes. Errors due to interaction between terms in the water balance equation (covariances) represented less than 10% of the total error. Overall, the analysis provides evidence that daily measurements of body water changes obtained from the indirect balance technique are reasonable, precise, and reliable. The method is not biased toward net retention or loss.
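
    As a rough illustration of the propagation-of-error reasoning above, the sketch below pushes assumed component standard errors and weak correlations (not the Skylab values) through a balance of the form B = intake - urine - evaporation - delta(body mass), then reports the covariance share and the body-mass share of the total variance.

    ```python
    import numpy as np

    # Propagation of error through B = intake - urine - evap - dM.
    # Standard errors and correlations are assumed for illustration only.
    sd = np.array([0.05, 0.04, 0.08, 0.30])  # SE of each term, kg/day (assumed)
    corr = np.full((4, 4), 0.05)             # weak cross-correlations (assumed)
    np.fill_diagonal(corr, 1.0)
    cov = corr * np.outer(sd, sd)

    c = np.array([1.0, -1.0, -1.0, -1.0])    # balance coefficients
    var_total = c @ cov @ c                  # full propagated variance
    var_indep = np.sum(sd ** 2)              # variance ignoring covariances

    print(f"SE of water balance: {np.sqrt(var_total):.3f} kg/day")
    print(f"covariance share of total variance: {1 - var_indep / var_total:.1%}")
    print(f"body-mass term share: {sd[3] ** 2 / var_total:.1%}")
    ```

    With these assumed numbers the covariance share stays well under 10% and the body-mass term dominates the variance, mirroring the pattern the abstract reports.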

  18. Two Strategies for Qualitative Content Analysis: An Intramethod Approach to Triangulation.

    PubMed

    Renz, Susan M; Carrington, Jane M; Badger, Terry A

    2018-04-01

    The overarching aim of qualitative research is to gain an understanding of certain social phenomena. Qualitative research involves the studied use and collection of empirical materials, all to describe moments and meanings in individuals' lives. Data derived from these various materials require a form of analysis of the content, focusing on written or spoken language as communication, to provide context and understanding of the message. Qualitative research often involves the collection of data through extensive interviews, note taking, and tape recording. These methods are time- and labor-intensive. With the advances in computerized text analysis software, the practice of combining methods to analyze qualitative data can assist the researcher in making large data sets more manageable and enhance the trustworthiness of the results. This article will describe a novel process of combining two methods of qualitative data analysis, or intramethod triangulation, as a means to provide a deeper analysis of text.

  19. Causes of medication administration errors in hospitals: a systematic review of quantitative and qualitative evidence.

    PubMed

    Keers, Richard N; Williams, Steven D; Cooke, Jonathan; Ashcroft, Darren M

    2013-11-01

    Underlying systems factors have been seen to be crucial contributors to the occurrence of medication errors. By understanding the causes of these errors, the most appropriate interventions can be designed and implemented to minimise their occurrence. This study aimed to systematically review and appraise empirical evidence relating to the causes of medication administration errors (MAEs) in hospital settings. Nine electronic databases (MEDLINE, EMBASE, International Pharmaceutical Abstracts, ASSIA, PsycINFO, British Nursing Index, CINAHL, Health Management Information Consortium and Social Science Citations Index) were searched between 1985 and May 2013. Inclusion and exclusion criteria were applied to identify eligible publications through title analysis followed by abstract and then full text examination. English language publications reporting empirical data on causes of MAEs were included. Reference lists of included articles and relevant review papers were hand searched for additional studies. Studies were excluded if they did not report data on specific MAEs, used accounts from individuals not directly involved in the MAE concerned or were presented as conference abstracts with insufficient detail. A total of 54 unique studies were included. Causes of MAEs were categorised according to Reason's model of accident causation. Studies were assessed to determine relevance to the research question and how likely the results were to reflect the potential underlying causes of MAEs based on the method(s) used. Slips and lapses were the most commonly reported unsafe acts, followed by knowledge-based mistakes and deliberate violations. Error-provoking conditions influencing administration errors included inadequate written communication (prescriptions, documentation, transcription), problems with medicines supply and storage (pharmacy dispensing errors and ward stock management), high perceived workload, problems with ward-based equipment (access, functionality), patient factors (availability, acuity), staff health status (fatigue, stress) and interruptions/distractions during drug administration. Few studies sought to determine the causes of intravenous MAEs. A number of latent pathway conditions were less well explored, including local working culture and high-level managerial decisions. Causes were often described superficially; this may be related to the use of quantitative surveys and observation methods in many studies, limited use of established error causation frameworks to analyse data and a predominant focus on issues other than the causes of MAEs among studies. As only English language publications were included, some relevant studies may have been missed. Limited evidence from studies included in this systematic review suggests that MAEs are influenced by multiple systems factors, but if and how these arise and interconnect to lead to errors remains to be fully determined. Further research with a theoretical focus is needed to investigate the MAE causation pathway, with an emphasis on ensuring interventions designed to minimise MAEs target recognised underlying causes of errors to maximise their impact.

  20. Spectral Analysis of Forecast Error Investigated with an Observing System Simulation Experiment

    NASA Technical Reports Server (NTRS)

    Prive, N. C.; Errico, Ronald M.

    2015-01-01

    The spectra of analysis and forecast error are examined using the observing system simulation experiment (OSSE) framework developed at the National Aeronautics and Space Administration Global Modeling and Assimilation Office (NASA GMAO). A global numerical weather prediction model, the Goddard Earth Observing System version 5 (GEOS-5) with Gridpoint Statistical Interpolation (GSI) data assimilation, is cycled for two months with once-daily forecasts to 336 hours to generate a control case. Verification of forecast errors using the Nature Run as truth is compared with verification of forecast errors using self-analysis; significant underestimation of forecast errors is seen using self-analysis verification for up to 48 hours. Likewise, self-analysis verification significantly overestimates the error growth rates of the early forecast, as well as mischaracterizing the spatial scales at which the strongest growth occurs. The Nature Run-verified error variances exhibit a complicated progression of growth, particularly for low wave number errors. In a second experiment, cycling of the model and data assimilation over the same period is repeated, but using synthetic observations with different explicitly added observation errors having the same error variances as the control experiment, thus creating a different realization of the control. The forecast errors of the two experiments become more correlated during the early forecast period, with correlations increasing for up to 72 hours before beginning to decrease.
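
    The self-analysis underestimation has a simple mechanism: at short lead times the forecast inherits much of the analysis error, so the forecast-minus-analysis difference is systematically smaller than the forecast-minus-truth difference. A toy illustration (unrelated to the GEOS-5/GSI system; all error scales assumed):

    ```python
    import numpy as np

    # Toy verification experiment: self-analysis vs truth ("Nature Run").
    rng = np.random.default_rng(0)
    n = 100_000
    truth = rng.normal(size=n)
    analysis_err = rng.normal(scale=0.5, size=n)
    analysis = truth + analysis_err
    # An early-lead forecast largely inherits the analysis error.
    forecast = truth + analysis_err + rng.normal(scale=0.3, size=n)

    rmse_truth = np.sqrt(np.mean((forecast - truth) ** 2))
    rmse_self = np.sqrt(np.mean((forecast - analysis) ** 2))
    print(f"RMSE vs truth:         {rmse_truth:.3f}")
    print(f"RMSE vs self-analysis: {rmse_self:.3f}  (underestimate)")
    ```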

  1. Human Factors Process Task Analysis: Liquid Oxygen Pump Acceptance Test Procedure at the Advanced Technology Development Center

    NASA Technical Reports Server (NTRS)

    Diorio, Kimberly A.; Voska, Ned (Technical Monitor)

    2002-01-01

    This viewgraph presentation provides information on Human Factors Process Failure Modes and Effects Analysis (HF PFMEA). HF PFMEA includes the following 10 steps: Describe mission; Define system; Identify human-machine interfaces; List human actions; Identify potential errors; Identify factors that affect error; Determine likelihood of error; Determine potential effects of errors; Evaluate risk; Generate solutions (manage error). The presentation also describes how this analysis was applied to a liquid oxygen pump acceptance test.

  2. Computer-Assisted Analysis of Qualitative Gerontological Research.

    ERIC Educational Resources Information Center

    Hiemstra, Roger; And Others

    1987-01-01

    Asserts that qualitative research has great potential for use in gerontological research. Describes QUALOG, a computer-assisted, qualitative data analysis scheme using logic programming developed at Syracuse University. Reviews development of QUALOG and discusses how QUALOG was used to analyze data from a qualitative study of older adult learners.…

  3. Investigation of the Application of Communicative Language Teaching in the English Language Classroom -- A Case Study on Teachers' Attitudes in Turkey

    ERIC Educational Resources Information Center

    Coskun, Abdullah

    2011-01-01

    This qualitative study aimed to reveal whether teachers' classroom practices overlap with their attitudes towards certain features of Communicative Language Teaching (CLT) such as pair and group-work activities, fluency and accuracy, error correction and the role of the teacher. Before conducting an open-ended questionnaire with two teachers of…

  4. Nurses' experiences and perspectives on medication safety practices: an explorative qualitative study.

    PubMed

    Smeulers, Marian; Onderwater, Astrid T; van Zwieten, Myra C B; Vermeulen, Hester

    2014-04-01

    To explore nurses' experiences with and perspectives on preventing medication administration errors. Insight into nurses' experiences with and perspectives on preventing medication administration errors is important and can be utilised to tailor and implement safety practices. A qualitative interview study of 20 nurses in an academic medical centre was conducted between March and December of 2011. Three themes emerged from this study: (1) nurses' roles and responsibilities in medication safety: aside from safe preparation and administration, the clinical reasoning of nurses is essential for medication safety; (2) nurses' ability to work safely: knowledge of risks and nurses' work circumstances influence their ability to work safely; and (3) nurses' acceptance of safety practices: advantages, feasibility and appropriateness are important incentives for acceptance of a safety practice. Nurses' experiences coincide with the assumption that they are in a pre-eminent position to enable safe medication management; however, their ability to adequately perform this role depends on sufficient knowledge to assess the risks of medication administration and on the circumstances in which they work. Safe medication management requires a learning climate and professional practice environment that enables further development of professional nursing skills and knowledge. © 2014 John Wiley & Sons Ltd.

  5. Methodology Series Module 10: Qualitative Health Research

    PubMed Central

    Setia, Maninder Singh

    2017-01-01

    Although quantitative designs are commonly used in clinical research, some studies require qualitative methods. These designs are different from quantitative methods; thus, researchers should be aware of data collection methods and analyses for qualitative research. Qualitative methods are particularly useful to understand patient experiences with the treatment or new methods of management or to explore issues in detail. These methods are useful in social and behavioral research. In qualitative research, often, the main focus is to understand the issue in detail rather than generalizability; thus, the sampling methods commonly used are purposive sampling; quota sampling; and snowball sampling (for hard to reach groups). Data can be collected using in-depth interviews (IDIs) or focus group discussions (FGDs). IDI is a one-to-one interview with the participant. FGD is a method of group interview or discussion, in which more than one participant is interviewed at the same time and is usually led by a facilitator. The commonly used methods for data analysis are: thematic analysis; grounded theory analysis; and framework analysis. Qualitative data collection and analysis require special expertise. Hence, if the reader plans to conduct qualitative research, they should team up with a qualitative researcher. PMID:28794545

  6. Methodology Series Module 10: Qualitative Health Research.

    PubMed

    Setia, Maninder Singh

    2017-01-01

    Although quantitative designs are commonly used in clinical research, some studies require qualitative methods. These designs are different from quantitative methods; thus, researchers should be aware of data collection methods and analyses for qualitative research. Qualitative methods are particularly useful to understand patient experiences with the treatment or new methods of management or to explore issues in detail. These methods are useful in social and behavioral research. In qualitative research, often, the main focus is to understand the issue in detail rather than generalizability; thus, the sampling methods commonly used are purposive sampling; quota sampling; and snowball sampling (for hard to reach groups). Data can be collected using in-depth interviews (IDIs) or focus group discussions (FGDs). IDI is a one-to-one interview with the participant. FGD is a method of group interview or discussion, in which more than one participant is interviewed at the same time and is usually led by a facilitator. The commonly used methods for data analysis are: thematic analysis; grounded theory analysis; and framework analysis. Qualitative data collection and analysis require special expertise. Hence, if the reader plans to conduct qualitative research, they should team up with a qualitative researcher.

  7. Improving the usefulness of a tool for appraising the quality of qualitative, quantitative and mixed methods studies, the Mixed Methods Appraisal Tool (MMAT).

    PubMed

    Hong, Quan Nha; Gonzalez-Reyes, Araceli; Pluye, Pierre

    2018-06-01

    Systematic reviews combining qualitative, quantitative, and/or mixed methods studies are increasingly popular because of their potential for addressing complex interventions and phenomena, specifically for assessing and improving clinical practice. A major challenge encountered with this type of review is the appraisal of the quality of individual studies given the heterogeneity of the study designs. The Mixed Methods Appraisal Tool (MMAT) was developed to help overcome this challenge. The aim of this study was to explore the usefulness of the MMAT by seeking the views and experiences of researchers who have used it. We conducted a qualitative descriptive study using semistructured interviews with MMAT users. A purposeful sample was drawn from the researchers who had previously contacted the developer of the MMAT, and those who have published a systematic review for which they had used the MMAT. All interviews were transcribed verbatim and analyzed by 2 coders using thematic analysis. Twenty participants from 8 countries were interviewed. Thirteen themes were identified and grouped into the 2 dimensions of usefulness, ie, utility and usability. The themes related to utility concerned the coverage, completeness, flexibility, and other utilities of the tool. Those regarding usability were related to the learnability, efficiency, satisfaction, and errors that could be made due to difficulties understanding or selecting the items to appraise. On the basis of the results of this study, we make several recommendations for improving the MMAT. This will contribute to greater usefulness of the MMAT. © 2018 John Wiley & Sons, Ltd.

  8. Clinical risk management in mental health: a qualitative study of main risks and related organizational management practices.

    PubMed

    Briner, Matthias; Manser, Tanja

    2013-02-04

    A scientific understanding of clinical risk management (CRM) in mental health care is essential for building safer health systems and for improving patient safety. While evidence on patient safety and CRM in physical health care has increased, there is limited research on these issues in mental health care. This qualitative study provides an overview of the most important clinical risks in mental health and related organizational management practices. We conducted in-depth expert interviews with professionals responsible for CRM in psychiatric hospitals. Interviews were transcribed and analyzed applying qualitative content analysis to thematically sort the identified risks. The main concerns for CRM in mental health are a) violence and self-destructive behavior (i.e. protecting patients and staff from other patients, and patients from themselves), b) treatment errors, especially in the process of therapy, and c) risks associated with mental illnesses (e.g. psychosis or depression). This study identified critical differences to CRM in hospitals for physical disorder and challenges specific to CRM in mental health. Firstly, many psychiatric patients do not believe that they are ill and are therefore in hospital against their will. Secondly, staff safety is a much more prominent theme for CRM in mental health care as it is directly related to the specifics of mental illnesses. The current study contributes to the understanding of patient safety and raises awareness for CRM in mental health. The mental health specific overview of central risks and related organizational management practices offers a valuable basis for CRM development in mental health and an addition to CRM in general.

  9. Identifying fallacious arguments in a qualitative study of antipsychotic prescribing in dementia.

    PubMed

    Donyai, Parastou

    2017-10-01

    Dementia can result in cognitive, noncognitive and behavioural symptoms which are difficult to manage. Formal guidelines for the care and management of dementia in the UK state that antipsychotics should only be prescribed where fully justified. This is because inappropriate use, particularly problematic in care-home settings, can produce severe side effects including death. The aim of this study was to explore the use of fallacious arguments in professionals' deliberations about antipsychotic prescribing in dementia in care-home settings. Fallacious arguments have the potential to become unremarkable discourses that construct and validate practices which are counter to guidelines. This qualitative study involved interviews with 28 care-home managers and health professionals involved in caring for patients with dementia. Potentially fallacious arguments were identified using qualitative content analysis and a coding framework constructed from existing explanatory models of fallacious reasoning. Fallacious arguments were identified in a range of explanations and reasons that participants gave in answer to questions about initiating, reducing doses of and stopping antipsychotics in dementia. The dominant fallacy was false dichotomy. Appeals to popularity, tradition, consequence, emotion, or fear, and the slippery slope argument were also identified. Fallacious arguments were often formulated to present convincing cases whereby prescribing antipsychotics or maintaining existing doses (versus not starting medication or reducing the dose, for example) appeared as the only acceptable decision, but this is not always the case. The findings could help health professionals to recognise and mitigate the effect of logic-based errors in decisions about the prescribing of antipsychotics in dementia. © 2016 Royal Pharmaceutical Society.

  10. Patient safety education to change medical students' attitudes and sense of responsibility.

    PubMed

    Roh, Hyerin; Park, Seok Ju; Kim, Taekjoong

    2015-01-01

    This study examined changes in the perceptions and attitudes as well as the sense of individual and collective responsibility in medical students after they received patient safety education. A three-day patient safety curriculum was implemented for third-year medical students shortly before entering their clerkship. Before and after training, we administered a questionnaire, which was analysed quantitatively. Additionally, we asked students to answer questions about their expected behaviours in response to two case vignettes. Their answers were analysed qualitatively. There was improvement in students' concepts of patient safety after training. Before training, they showed good comprehension of the inevitability of error, but most students blamed individuals for errors and expressed a strong sense of individual responsibility. After training, students increasingly attributed errors to system dysfunction and reported more self-confidence in speaking up about colleagues' errors. However, due to the hierarchical culture, students still described difficulties communicating with senior doctors. Patient safety education effectively shifted students' attitudes towards systems-based thinking and increased their sense of collective responsibility. Strategies for improving superior-subordinate communication within a hierarchical culture should be added to the patient safety curriculum.

  11. Robust estimation of adaptive tensors of curvature by tensor voting.

    PubMed

    Tong, Wai-Shun; Tang, Chi-Keung

    2005-03-01

    Although curvature estimation from a given mesh or regularly sampled point set is a well-studied problem, it is still challenging when the input consists of a cloud of unstructured points corrupted by misalignment error and outlier noise. Such input is ubiquitous in computer vision. In this paper, we propose a three-pass tensor voting algorithm to robustly estimate curvature tensors, from which accurate principal curvatures and directions can be calculated. Our quantitative estimation is an improvement over the previous two-pass algorithm, where only qualitative curvature estimation (sign of Gaussian curvature) is performed. To overcome misalignment errors, our improved method automatically corrects input point locations at subvoxel precision, which also rejects outliers that are uncorrectable. To adapt to different scales locally, we define the RadiusHit of a curvature tensor to quantify estimation accuracy and applicability. Our curvature estimation algorithm has been proven with detailed quantitative experiments, performing better in a variety of standard error metrics (percentage error in curvature magnitudes, absolute angle difference in curvature direction) in the presence of a large amount of misalignment noise.
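
    For reference, the two error metrics named above have compact definitions; the sketch below assumes synthetic ground-truth curvature magnitudes and unit-length principal directions rather than the paper's datasets.

    ```python
    import numpy as np

    def percent_error(k_est, k_true):
        """Percentage error in principal-curvature magnitude."""
        return 100.0 * np.abs(k_est - k_true) / np.abs(k_true)

    def angle_error_deg(d_est, d_true):
        """Absolute angle between unit curvature directions (sign-invariant)."""
        cos = np.abs(np.einsum("ij,ij->i", d_est, d_true))
        return np.degrees(np.arccos(np.clip(cos, 0.0, 1.0)))

    # Synthetic check with small perturbations of known values.
    rng = np.random.default_rng(0)
    k_true = rng.uniform(0.5, 2.0, size=5)
    k_est = k_true * (1 + rng.normal(scale=0.02, size=5))
    print(percent_error(k_est, k_true))
    ```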

  12. Climate model biases in seasonality of continental water storage revealed by satellite gravimetry

    USGS Publications Warehouse

    Swenson, Sean; Milly, P.C.D.

    2006-01-01

    Satellite gravimetric observations of monthly changes in continental water storage are compared with outputs from five climate models. All models qualitatively reproduce the global pattern of annual storage amplitude, and the seasonal cycle of global average storage is reproduced well, consistent with earlier studies. However, global average agreements mask systematic model biases in low latitudes. Seasonal extrema of low‐latitude, hemispheric storage generally occur too early in the models, and model‐specific errors in amplitude of the low‐latitude annual variations are substantial. These errors are potentially explicable in terms of neglected or suboptimally parameterized water stores in the land models and precipitation biases in the climate models.

  13. Conducting qualitative research in mental health: Thematic and content analyses.

    PubMed

    Crowe, Marie; Inder, Maree; Porter, Richard

    2015-07-01

    The objective of this paper is to describe two methods of qualitative analysis - thematic analysis and content analysis - and to examine their use in a mental health context. A description of the processes of thematic analysis and content analysis is provided. These processes are then illustrated by conducting two analyses of the same qualitative data. Transcripts of qualitative interviews are analysed using each method to illustrate these processes. The illustration of the processes highlights the different outcomes from the same set of data. Thematic and content analyses are qualitative methods that serve different research purposes. Thematic analysis provides an interpretation of participants' meanings, while content analysis is a direct representation of participants' responses. These methods provide two ways of understanding meanings and experiences and provide important knowledge in a mental health context. © The Royal Australian and New Zealand College of Psychiatrists 2015.

  14. Effect of catalogues coordinate errors of a star onto determination of the physical libration of the Moon from the observations of stars

    NASA Astrophysics Data System (ADS)

    Petrova, Natalia; Kocoulin, Valerii; Nefediev, Yurii

    2016-07-01

    A computer simulation is being carried out at Kazan University for planned projects that will install measuring equipment on the lunar surface to observe the physical libration of the Moon. One such project is ILOM (Japan), in which an optical telescope with a CCD will be placed at a lunar pole, allowing the selenographic coordinates (x and y) of a star to be determined with an accuracy of 1 ms of arc. On the basis of the analytical theory of physical libration we developed a technique for solving the inverse problem of the libration, and we have already shown, for example, that an error of about ɛ seconds in the determined selenographic coordinates does not lead to errors in the libration angles ρ and Iσ larger than 1.414ɛ; libration in longitude is not determined from observations of a polar star (Petrova et al., 2012). The accuracy of the libration angles recovered in the inverse problem also depends on the accuracy of the star coordinates - α and δ - taken from star catalogues, and checking this influence is the task of the present study. For the simulation we developed software that selects the stars falling into the field of view of the lunar telescope during the observation period. Equatorial coordinates of stars were taken from several fundamental catalogues: UCAC2-BSS, Hipparcos, Tycho, FK6 (parts I, III) and the Astronomical Almanac. An analysis of these catalogues with respect to the accuracy of the star coordinates was performed by Nefediev et al., 2013. The largest errors, 20-70 ms of arc, were found in the UCAC2 and Tycho catalogues; the others have errors of about a millisecond of arc. We simulated observations with these errors and obtained the following results. 1. An error in the declination Δδ of a star causes an error of the same order in the libration parameters ρ and Iσ, while the sensitivity of the libration to errors Δα in right ascension is ten times smaller. Fortunately, owing to statistics (30 to 70 stars, depending on the time of observation), this error is reduced by an order of magnitude, i.e. it does not exceed the error of the observed selenographic coordinates. 2. Worse, errors in the catalogue coordinates cause a small but constant shift in ρ and Iσ: when Δα, Δδ ~ 0.01", the shift reaches 0.0025", and there is a trend with a slight but noticeable slope. 3. The effect of an error in the declination of a star is substantially stronger than that of an error in right ascension; perhaps this is characteristic only of polar observations. To reach the required accuracy in the determination of the physical libration, these phenomena must be taken into account when processing the planned observations. References: Nefediev et al., 2013. Uchenye zapiski Kazanskogo universiteta, v. 155, 1, p. 188-194. Petrova, N., Abdulmyanov, T., Hanada, H. Some qualitative manifestations of the physical libration of the Moon by observing stars from the lunar surface. J. Adv. Space Res., 2012a. V. 50, p. 1702-1711.
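
    The quoted bound (a coordinate error ɛ propagates to at most 1.414ɛ, i.e. √2·ɛ, in ρ and Iσ) and the √N gain from averaging over many stars can be checked with one-line arithmetic; the per-star error below is illustrative.

    ```python
    import math

    # Worked check of the error bounds quoted above (illustrative numbers).
    eps = 1.0                          # per-star coordinate error, ms of arc
    worst_case = math.sqrt(2) * eps    # bound on the libration-angle error
    print(f"single-star bound: {worst_case:.3f} ms of arc")
    for n_stars in (30, 70):           # typical star counts per session
        print(f"N={n_stars}: ~{worst_case / math.sqrt(n_stars):.2f} ms of arc after averaging")
    ```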

  15. Patient safety culture and associated factors: A quantitative and qualitative study of healthcare workers' view in Jimma zone Hospitals, Southwest Ethiopia.

    PubMed

    Wami, Sintayehu Daba; Demssie, Amsalu Feleke; Wassie, Molla Mesele; Ahmed, Ansha Nega

    2016-09-20

    Patient safety culture is an important aspect of quality healthcare delivery and is an issue of high concern globally. In the Ethiopian health system little is known, and information is limited in scope, about patient safety culture. Therefore, the aim of this study was to assess the level of patient safety culture and associated factors in Jimma zone hospitals, southwest Ethiopia. A facility-based cross-sectional quantitative study triangulated with qualitative approaches was employed from March to April 30, 2015. A stratified sampling technique was used to select 637 study participants from 4 hospitals. The standardized tool which measures 12 patient safety culture composites was used for data collection. Bivariate and multivariate linear regression analyses were performed using SPSS version 20. Significance was assessed at the 95% CI and p-value < 0.05. A semi-structured in-depth interview guide was used to collect the qualitative data, and content analysis of the interviews was performed. The overall level of patient safety culture was 46.7% (95% CI: 43.0, 51.2). Hours worked per week (β = -0.06, 95% CI: -0.12, -0.001), reporting adverse events (β = 3.34, 95% CI: 2.12, 4.57), good communication (β = 2.78, 95% CI: 2.29, 3.28), teamwork within the hospital (β = 1.91, 95% CI: 1.37, 2.46), level of staffing (β = 1.32, 95% CI: 0.89, 1.75), exchange of feedback about error (β = 1.37, 95% CI: 0.91, 1.83) and participation in patient safety programs (β = 1.3, 95% CI: 0.57, 2.03) were factors significantly associated with patient safety culture. The in-depth interviews indicated incident reporting, resources, healthcare worker attitude and patient involvement as important factors that influence patient safety culture. The overall level of patient safety culture was low. Working hours, level of staffing, teamwork, communication openness, reporting of events and exchange of feedback about error were associated with patient safety culture. Therefore, interventions with a systemic approach that facilitate opportunities for communication openness, cooperation and the exchange of ideas between healthcare workers are needed to improve the level of patient safety culture.

  16. Phenomenology and qualitative research: Amedeo Giorgi's hermetic epistemology.

    PubMed

    Paley, John

    2018-04-11

    Amedeo Giorgi has published a review article devoted to Phenomenology as Qualitative Research: A Critical Analysis of Meaning Attribution. However, anyone reading this article, but unfamiliar with the book, will get a distorted view of what it is about, whom it is addressed to, what it seeks to achieve and how it goes about presenting its arguments. Not mildly distorted, in need of the odd correction here and there, but systematically misrepresented. The article is a study in misreading. Giorgi misreads the book's mise en scène; he misreads its narrative arc; he misreads individual arguments; he misreads short, simple passages; he misreads the philosophy-of-science literature; he misreads his own data; he misreads the title; he misreads the blurb; he misreads the acknowledgements. In addition, there are serious failures of scholarship (ironically, he demonstrates how unacquainted he is with the relevant literature at the very moment he is accusing me of being ill-informed). In this reply, I provide several examples of these errors, but my primary aim is to understand why Giorgi's misreading is as ubiquitous as it is. To this end, I explain his mistakes by reference to the hermetic epistemology within which he is confined. © 2018 John Wiley & Sons Ltd.

  17. Comparative evaluation of the effect of denture cleansers on the surface topography of denture base materials: An in-vitro study

    PubMed Central

    Jeyapalan, Karthigeyan; Kumar, Jaya Krishna; Azhagarasan, N. S.

    2015-01-01

    Aims: The aim was to evaluate and compare the effects of three chemically different, commercially available denture cleansing agents on the surface topography of two different denture base materials. Materials and Methods: Three chemically different denture cleansers (sodium perborate, 1% sodium hypochlorite, 0.2% chlorhexidine gluconate) were used on two denture base materials (acrylic resin and chrome cobalt alloy) and the changes were evaluated at three time intervals (56 h, 120 h, 240 h). Changes from baseline in surface roughness were recorded quantitatively using a surface profilometer, and qualitative surface analyses for all groups were done by scanning electron microscopy (SEM). Statistical Analysis Used: The values obtained were analyzed statistically using one-way ANOVA and the paired t-test. Results: All three denture cleanser solutions showed no statistically significant surface changes on the acrylic resin portions at 56 h, 120 h, and 240 h of immersion. However, on the alloy portion changes were significant at the end of 120 h and 240 h. Conclusion: Of the three denture cleansers used in the study, none produced significant changes on the two denture base materials for short durations of immersion, whereas changes were seen as the immersion periods increased. PMID:26538915

  18. Qualitative research methods in renal medicine: an introduction.

    PubMed

    Bristowe, Katherine; Selman, Lucy; Murtagh, Fliss E M

    2015-09-01

    Qualitative methodologies are becoming increasingly widely used in health research. However, within some specialties, including renal medicine, qualitative approaches remain under-represented in the high-impact factor journals. Qualitative research can be undertaken: (i) as a stand-alone research method, addressing specific research questions; (ii) as part of a mixed methods approach alongside quantitative approaches or (iii) embedded in clinical trials, or during the development of complex interventions. The aim of this paper is to introduce qualitative research, including the rationale for choosing qualitative approaches, and guidance for ensuring quality when undertaking and reporting qualitative research. In addition, we introduce types of qualitative data (observation, interviews and focus groups) as well as some of the most commonly encountered methodological approaches (case studies, ethnography, phenomenology, grounded theory, thematic analysis, framework analysis and content analysis). © The Author 2015. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.

  19. Simple and ultra-fast recognition and quantitation of compounded monoclonal antibodies: Application to flow injection analysis combined to UV spectroscopy and matching method.

    PubMed

    Jaccoulet, E; Schweitzer-Chaput, A; Toussaint, B; Prognon, P; Caudron, E

    2018-09-01

    Compounding of monoclonal antibodies (mAbs) is constantly increasing in hospitals. Quality control (QC) of compounded mAbs, based on quantification and identification, is required to prevent potential errors, and a fast method is needed to manage outpatient chemotherapy administration. A simple and ultra-fast (less than 30 s) method using flow injection analysis associated with a least-squares matching method from the analyzer software was performed and evaluated for the routine hospital QC of three compounded mAbs: bevacizumab, infliximab and rituximab. The method was evaluated through qualitative and quantitative parameters. Preliminary analysis of the UV absorption and second-derivative spectra of the mAbs allowed us to adapt analytical conditions according to the therapeutic range of the mAbs. In terms of quantitative QC, linearity, accuracy and precision were assessed as specified in ICH guidelines. Very satisfactory recovery was achieved and the RSD (%) of the intermediate precision was less than 1.1%. Qualitative analytical parameters were also evaluated in terms of specificity, sensitivity and global precision through a confusion matrix. Results were shown to be concentration- and mAb-dependent, and excellent (100%) specificity and sensitivity were reached within specific concentration ranges. Finally, routine application to "real life" samples (n = 209) from different batches of the three mAbs complied with the specifications of the quality control, i.e. excellent identification (100%) and ±15% of target concentration within the calibration range. The successful use of the combination of second-derivative spectroscopy and a partial least-squares matching method demonstrated the interest of FIA for the ultra-fast QC of mAbs after compounding. Copyright © 2018 Elsevier B.V. All rights reserved.

  20. "It's like texting at the dinner table": A qualitative analysis of the impact of electronic health records on patient-physician interaction in hospitals.

    PubMed

    Pelland, Kimberly D; Baier, Rosa R; Gardner, Rebekah L

    2017-06-30

    BACKGROUND: Electronic health records (EHRs) may reduce medical errors and improve care, but can complicate clinical encounters. OBJECTIVE: To describe hospital-based physicians' perceptions of the impact of EHRs on patient-physician interactions and contrast these findings against office-based physicians' perceptions. METHODS: We performed a qualitative analysis of comments submitted in response to the 2014 Rhode Island Health Information Technology Survey. Office- and hospital-based physicians licensed in Rhode Island, in active practice, and located in Rhode Island or neighboring states completed the survey about their Electronic Health Record use. RESULTS: The survey's response rate was 68.3% and 2,236 (87.1%) respondents had EHRs. Among survey respondents, 27.3% of hospital-based and 37.8% of office-based physicians with EHRs responded to the question about patient interaction. Five main themes emerged for hospital-based physicians, with respondents generally perceiving EHRs as negatively altering patient interactions. We noted the same five themes among office-based physicians, but the rank-order of the top two responses differed by setting: hospital-based physicians commented most frequently that they spend less time with patients because they have to spend more time on computers; office-based physicians commented most frequently on EHRs worsening the quality of their interactions and relationships with patients. CONCLUSIONS: In our analysis of a large sample of physicians, hospital-based physicians generally perceived EHRs as negatively altering patient interactions, although they emphasized different reasons than their office-based counterparts. These findings add to the prior literature, which focuses on outpatient physicians, and can shape interventions to improve how EHRs are used in inpatient settings.

  1. Needs assessment for simulation training in neuroendoscopy: a Canadian national survey.

    PubMed

    Haji, Faizal A; Dubrowski, Adam; Drake, James; de Ribaupierre, Sandrine

    2013-02-01

    In recent years, dramatic changes in surgical education have increased interest in simulation-based training for complex surgical skills. This is particularly true for endoscopic third ventriculostomy (ETV), given the potential for serious intraoperative errors arising from surgical inexperience. However, prior to simulator development, a thorough assessment of training needs is essential to ensure development of educationally relevant platforms. The purpose of this study was to conduct a national needs assessment addressing specific goals of instruction, to guide development of simulation platforms, training curricula, and assessment metrics for ETV. Canadian neurosurgeons performing ETV were invited to participate in a structured online questionnaire regarding the procedural steps for ETV, the frequency and significance of intraoperative errors committed while learning the technique, and simulation training modules of greatest potential educational benefit. Descriptive data analysis was completed for both quantitative and qualitative responses. Thirty-two (55.2%) of 58 surgeons completed the survey. All believed that virtual reality simulation training for ETV would be a valuable addition to clinical training. Selection of ventriculostomy site, navigation within the ventricles, and performance of the ventriculostomy ranked as the most important steps to simulate. Technically inadequate ventriculostomy and inappropriate fenestration site selection were ranked as the most frequent/significant errors. A standard ETV module was thought to be most beneficial for resident training. To inform the development of a simulation-based training program for ETV, the authors have conducted a national needs assessment. The results provide valuable insight to inform key design elements necessary to construct an educationally relevant device and educational program.

  2. New dimension analyses with error analysis for quaking aspen and black spruce

    NASA Technical Reports Server (NTRS)

    Woods, K. D.; Botkin, D. B.; Feiveson, A. H.

    1987-01-01

    Dimension analyses for black spruce in wetland stands and for trembling aspen are reported, including new approaches in error analysis. Biomass estimates for sacrificed trees have standard errors of 1 to 3%; standard errors for leaf areas are 10 to 20%. Bole biomass estimation accounts for most of the error for biomass, while estimation of branch characteristics and area/weight ratios accounts for the leaf area error. Error analysis provides insight for cost-effective design of future analyses. Predictive equations for biomass and leaf area, with empirically derived estimators of prediction error, are given. Systematic prediction errors for small aspen trees and for leaf area of spruce from different site-types suggest a need for different predictive models within species. Predictive equations are compared with published equations; significant differences may be due to species responses to regional or site differences. Proportional contributions of component biomass in aspen change in ways related to tree size and stand development. Spruce maintains comparatively constant proportions with size, but shows changes corresponding to site. This suggests greater morphological plasticity of aspen and the significance for spruce of nutrient conditions.
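
    A hedged sketch of the kind of predictive equation referred to above: an allometric log-log fit of biomass against stem diameter, with the residual standard error on the log scale serving as a crude prediction-error estimator. The data, coefficients, and choice of diameter as the predictor are synthetic assumptions, not the paper's fitted equations.

    ```python
    import numpy as np

    # Synthetic allometry: biomass B ~ a * D^b with lognormal scatter.
    rng = np.random.default_rng(2)
    dbh = rng.uniform(2, 30, size=60)  # stem diameter, cm (synthetic)
    biomass = 0.12 * dbh ** 2.4 * rng.lognormal(sigma=0.15, size=60)

    b, a = np.polyfit(np.log(dbh), np.log(biomass), 1)  # ln(B) = a + b ln(D)
    resid = np.log(biomass) - (a + b * np.log(dbh))
    se = resid.std(ddof=2)             # residual SE on the log scale
    print(f"ln(B) = {a:.3f} + {b:.3f} ln(D), residual SE = {se:.3f}")
    ```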

  3. Addressing the unit of analysis in medical care studies: a systematic review.

    PubMed

    Calhoun, Aaron W; Guyatt, Gordon H; Cabana, Michael D; Lu, Downing; Turner, David A; Valentine, Stacey; Randolph, Adrienne G

    2008-06-01

    We assessed the frequency with which patients are incorrectly used as the unit of analysis among studies of physicians' patient care behavior in articles published in high-impact journals. We surveyed 30 high-impact journals across 6 medical fields for articles susceptible to unit of analysis errors published from 1994 to 2005. Three reviewers independently abstracted articles using previously published criteria to determine the presence of analytic errors. One hundred fourteen susceptible articles were found published in 15 journals; 4 journals published the majority (71 of 114, or 62.3%) of the studies; 40 were intervention studies and 74 were noninterventional studies. The unit of analysis error was present in 19 (48%) of the intervention studies and 31 (42%) of the noninterventional studies (overall error rate 44%). The frequency of the error decreased between 1994-1999 (N = 38; 65% error) and 2000-2005 (N = 76; 33% error) (P = 0.001). Although the frequency of the error in published studies is decreasing, further improvement remains desirable.
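
    The reported rates can be recomputed directly, and the decline between periods approximated with a chi-square test; the per-period error counts are back-calculated from the rounded percentages (65% of 38 ≈ 25; 33% of 76 ≈ 25), so the p-value is approximate.

    ```python
    from scipy.stats import chi2_contingency

    # Error rates as reported (errors, susceptible studies).
    for name, e, n in [("intervention", 19, 40),
                       ("noninterventional", 31, 74),
                       ("overall", 50, 114)]:
        print(f"{name}: {e}/{n} = {100 * e / n:.1f}%")

    # Period comparison; counts inferred from rounded percentages.
    table = [[25, 38 - 25],   # 1994-1999: errors vs no error
             [25, 76 - 25]]   # 2000-2005
    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.2f}, p = {p:.4f}")  # p on the order of 0.001
    ```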

  4. Differential quantitative proteomics of Porphyromonas gingivalis by linear ion trap mass spectrometry: non-label methods comparison, q-values and LOWESS curve fitting

    PubMed Central

    Xia, Qiangwei; Wang, Tiansong; Park, Yoonsuk; Lamont, Richard J.; Hackett, Murray

    2009-01-01

    Differential analysis of whole cell proteomes by mass spectrometry has largely been applied using various forms of stable isotope labeling. While metabolic stable isotope labeling has been the method of choice, it is often not possible to apply such an approach. Four different label-free ways of calculating expression ratios in a classic “two-state” experiment are compared: signal intensity at the peptide level, signal intensity at the protein level, spectral counting at the peptide level, and spectral counting at the protein level. The quantitative data were mined from a dataset of 1245 qualitatively identified proteins, about 56% of the protein encoding open reading frames from Porphyromonas gingivalis, a Gram-negative intracellular pathogen being studied under extracellular and intracellular conditions. Two different control populations were compared against P. gingivalis internalized within a model human target cell line. The q-value statistic, a measure of false discovery rate previously applied to transcription microarrays, was applied to proteomics data. For spectral counting, the most logically consistent estimate of random error came from applying the locally weighted scatter plot smoothing procedure (LOWESS) to the most extreme ratios generated from a control technical replicate, thus setting upper and lower bounds for the region of experimentally observed random error. PMID:19337574
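
    A rough sketch of that LOWESS step on synthetic data: smooth the magnitude of control-replicate log-ratios as a function of abundance (spectral counts), so that treatment/control ratios above the resulting envelope fall outside the region of observed random error. The paper fits the most extreme ratios; taking absolute ratios here is a simplification.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    counts = rng.integers(1, 200, size=500).astype(float)  # spectral counts
    log_ratios = rng.normal(scale=1.0 / np.sqrt(counts))   # replicate log2 ratios

    # LOWESS envelope of ratio magnitude vs abundance.
    envelope = sm.nonparametric.lowess(np.abs(log_ratios), counts,
                                       frac=0.3, return_sorted=True)
    # Column 0: abundance; column 1: approximate random-error bound.
    print(envelope[::100])
    ```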

  5. Chloroplast 2010: A Database for Large-Scale Phenotypic Screening of Arabidopsis Mutants

    PubMed Central

    Lu, Yan; Savage, Linda J.; Larson, Matthew D.; Wilkerson, Curtis G.; Last, Robert L.

    2011-01-01

    Large-scale phenotypic screening presents challenges and opportunities not encountered in typical forward or reverse genetics projects. We describe a modular database and laboratory information management system that was implemented in support of the Chloroplast 2010 Project, an Arabidopsis (Arabidopsis thaliana) reverse genetics phenotypic screen of more than 5,000 mutants (http://bioinfo.bch.msu.edu/2010_LIMS; www.plastid.msu.edu). The software and laboratory work environment were designed to minimize operator error and detect systematic process errors. The database uses Ruby on Rails and Flash technologies to present complex quantitative and qualitative data and pedigree information in a flexible user interface. Examples are presented where the database was used to find opportunities for process changes that improved data quality. We also describe the use of the data-analysis tools to discover mutants defective in enzymes of leucine catabolism (heteromeric mitochondrial 3-methylcrotonyl-coenzyme A carboxylase [At1g03090 and At4g34030] and putative hydroxymethylglutaryl-coenzyme A lyase [At2g26800]) based upon a syndrome of pleiotropic seed amino acid phenotypes that resembles previously described isovaleryl coenzyme A dehydrogenase (At3g45300) mutants. In vitro assay results support the computational annotation of At2g26800 as hydroxymethylglutaryl-coenzyme A lyase. PMID:21224340

  6. Clinical Risk Assessment in Intensive Care Unit

    PubMed Central

    Asefzadeh, Saeed; Yarmohammadian, Mohammad H.; Nikpey, Ahmad; Atighechian, Golrokh

    2013-01-01

    Background: Clinical risk management focuses on improving the quality and safety of health care services by identifying the circumstances and opportunities that put patients at risk of harm and acting to prevent or control those risks. The goal of this study is to identify and assess the failure modes in the ICU of Qazvin's Social Security Hospital (Razi Hospital) through Failure Mode and Effect Analysis (FMEA). Methods: This was a qualitative-quantitative study using Focus Discussion Groups (FDG), performed in Qazvin Province, Iran, during 2011. The study population included all individuals and owners familiar with the process in the ICU. Sampling was purposive, and the FDG members were selected by the researcher. The research instrument was a standard worksheet that has been used by several researchers. Data were analyzed by the FMEA technique. Results: Forty-eight clinical errors and failure modes were identified. The highest risk priority number (RPN) was in respiratory care, "ventilator alarm malfunction (no alarm)", with a score of 288, and the lowest was in gastrointestinal care, "not washing the NG-tube", with a score of 8. Conclusions: Many of the identified errors can be prevented by the group members. Clinical risk assessment and management is the key to delivery of effective health care. PMID:23930171
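
    For context, an RPN in FMEA is the product of severity, occurrence, and detectability scores (commonly 1-10 each). The factor scores below are hypothetical choices that merely reproduce the two extreme RPNs reported above (288 and 8).

    ```python
    # RPN = severity x occurrence x detectability (scores assumed, 1-10 scale).
    def rpn(severity, occurrence, detectability):
        return severity * occurrence * detectability

    failure_modes = {
        "ventilator alarm malfunction (no alarm)": (8, 6, 6),  # -> 288
        "not washing the NG-tube": (2, 2, 2),                  # -> 8
    }
    for mode, scores in sorted(failure_modes.items(),
                               key=lambda kv: rpn(*kv[1]), reverse=True):
        print(f"{mode}: RPN = {rpn(*scores)}")
    ```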

  7. First-order approximation error analysis of Risley-prism-based beam directing system.

    PubMed

    Zhao, Yanyan; Yuan, Yan

    2014-12-01

    To improve the performance of a Risley-prism system for optical detection and measuring applications, it is necessary to be able to determine the direction of the outgoing beam with high accuracy. In previous works, error sources and their impact on the performance of the Risley-prism system have been analyzed, but their numerical approximation accuracy was not high. In addition, previous pointing-error analyses of the Risley-prism system provided results only for the case in which the component errors, prism orientation errors, and assembly errors are known. In this work, a prototype of a Risley-prism system was designed. The first-order approximations of the error analysis were derived and compared with the exact results. The directing errors of a Risley-prism system associated with wedge-angle errors, prism mounting errors, and bearing assembly errors were analyzed based on the exact formula and the first-order approximation. The comparisons indicated that our first-order approximation is accurate. In addition, the combined errors produced by the wedge-angle errors and mounting errors of the two prisms together were derived and in both cases were proved to be the sum of the errors caused by the first and the second prism separately. Based on these results, the system error of our prototype was estimated. The derived formulas can be implemented to evaluate beam directing errors of any Risley-prism beam directing system with a similar configuration.
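
    As a sanity check on what "first-order" buys for a single wedge (not the authors' full two-prism pointing-error model), the sketch below compares the exact deviation at normal incidence, asin(n·sin A) - A, with the thin-prism approximation (n - 1)·A, for an assumed refractive index.

    ```python
    import numpy as np

    n = 1.517  # BK7-like refractive index (assumed)
    for A_deg in (1, 5, 10, 15):  # wedge angles
        A = np.radians(A_deg)
        exact = np.arcsin(n * np.sin(A)) - A  # Snell's law at the exit face
        approx = (n - 1) * A                  # first-order (thin prism)
        print(f"A={A_deg:2d} deg: exact={np.degrees(exact):7.4f} deg, "
              f"approx={np.degrees(approx):7.4f} deg, "
              f"residual={np.degrees(exact - approx) * 3600:8.1f} arcsec")
    ```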

  8. Influence of Tooth Spacing Error on Gears With and Without Profile Modifications

    NASA Technical Reports Server (NTRS)

    Padmasolala, Giri; Lin, Hsiang H.; Oswald, Fred B.

    2000-01-01

    A computer simulation was conducted to investigate the effectiveness of profile modification for reducing dynamic loads in gears with different tooth spacing errors. The simulation examined varying amplitudes of spacing error and differences in the span of teeth over which the error occurs. The modification considered included both linear and parabolic tip relief. The analysis considered spacing error that varies around most of the gear circumference (similar to a typical sinusoidal error pattern) as well as a shorter span of spacing errors that occurs on only a few teeth. The dynamic analysis was performed using a revised version of a NASA gear dynamics code, modified to add tooth spacing errors to the analysis. Results obtained from the investigation show that linear tip relief is more effective in reducing dynamic loads on gears with small spacing errors but parabolic tip relief becomes more effective as the amplitude of spacing error increases. In addition, the parabolic modification is more effective for the more severe error case where the error is spread over a longer span of teeth. The findings of this study can be used to design robust tooth profile modification for improving dynamic performance of gear sets with different tooth spacing errors.

  9. Evaluation of Natural Language Processing (NLP) Systems to Annotate Drug Product Labeling with MedDRA Terminology.

    PubMed

    Ly, Thomas; Pamer, Carol; Dang, Oanh; Brajovic, Sonja; Haider, Shahrukh; Botsis, Taxiarchis; Milward, David; Winter, Andrew; Lu, Susan; Ball, Robert

    2018-05-31

    The FDA Adverse Event Reporting System (FAERS) is a primary data source for identifying unlabeled adverse events (AEs) in a drug or biologic drug product's postmarketing phase. Many AE reports must be reviewed by drug safety experts to identify unlabeled AEs, even if the reported AEs are previously identified, labeled AEs. Integrating the labeling status of drug product AEs into FAERS could increase report triage and review efficiency. Medical Dictionary for Regulatory Activities (MedDRA) is the standard for coding AE terms in FAERS cases. However, drug manufacturers are not required to use MedDRA to describe AEs in product labels. We hypothesized that natural language processing (NLP) tools could assist in automating the extraction and MedDRA mapping of AE terms in drug product labels. We evaluated the performance of three NLP systems (ETHER, I2E, MetaMap) for their ability to extract AE terms from drug labels and translate the terms to MedDRA Preferred Terms (PTs). Pharmacovigilance-based annotation guidelines for extracting AE terms from drug labels were developed for this study. We compared each system's output to MedDRA PT AE lists, manually mapped by FDA pharmacovigilance experts using the guidelines, for ten drug product labels known as the "gold standard AE list" (GSL) dataset. Strict time and configuration conditions were imposed in order to test each system's capabilities under conditions of no human intervention and minimal system configuration. Each NLP system's output was evaluated for precision, recall and F measure in comparison to the GSL. A qualitative error analysis (QEA) was conducted to categorize a random sample of each NLP system's false positive and false negative errors. A total of 417, 278, and 250 false positive errors occurred in the ETHER, I2E, and MetaMap outputs, respectively. A total of 100, 80, and 187 false negative errors occurred in ETHER, I2E, and MetaMap outputs, respectively. Precision ranged from 64% to 77%, recall from 64% to 83% and F measure from 67% to 79%. I2E had the highest precision (77%), recall (83%) and F measure (79%). ETHER had the lowest precision (64%). MetaMap had the lowest recall (64%). The QEA found that the most prevalent false positive errors were context errors such as "Context error/General term", "Context error/Instructions or monitoring parameters", "Context error/Medical history preexisting condition underlying condition risk factor or contraindication", and "Context error/AE manifestations or secondary complication". The most prevalent false negative errors were in the "Incomplete or missed extraction" error category. Missing AE terms were typically due to long terms, or terms containing non-contiguous words which do not correspond exactly to MedDRA synonyms. MedDRA mapping errors were a minority of errors for ETHER and I2E but were the most prevalent false positive errors for MetaMap. The results demonstrate that it may be feasible to use NLP tools to extract and map AE terms to MedDRA PTs. However, the NLP tools we tested would need to be modified or reconfigured to lower the error rates to support their use in a regulatory setting. Tools specific for extracting AE terms from drug labels and mapping the terms to MedDRA PTs may need to be developed to support pharmacovigilance. Conducting research using additional NLP systems on a larger, diverse GSL would also be informative. Copyright © 2018. Published by Elsevier Inc.
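
    The precision, recall, and F measure reported above follow the standard definitions; the counts in this sketch are hypothetical, since the abstract gives FP/FN totals but not true positives.

    ```python
    # Standard evaluation metrics; counts below are hypothetical.
    def precision(tp, fp):
        return tp / (tp + fp)

    def recall(tp, fn):
        return tp / (tp + fn)

    def f_measure(p, r):
        return 2 * p * r / (p + r)

    tp, fp, fn = 400, 120, 90  # hypothetical extraction/mapping counts
    p, r = precision(tp, fp), recall(tp, fn)
    print(f"precision={p:.0%} recall={r:.0%} F={f_measure(p, r):.0%}")
    ```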

  10. Peripheral Quantitative CT (pQCT) Using a Dedicated Extremity Cone-Beam CT Scanner

    PubMed Central

    Muhit, A. A.; Arora, S.; Ogawa, M.; Ding, Y.; Zbijewski, W.; Stayman, J. W.; Thawait, G.; Packard, N.; Senn, R.; Yang, D.; Yorkston, J.; Bingham, C.O.; Means, K.; Carrino, J. A.; Siewerdsen, J. H.

    2014-01-01

    Purpose: We describe the initial assessment of the peripheral quantitative CT (pQCT) imaging capabilities of a cone-beam CT (CBCT) scanner dedicated to musculoskeletal extremity imaging. The aim is to accurately measure and quantify bone and joint morphology using information automatically acquired with each CBCT scan, thereby reducing the need for a separate pQCT exam. Methods: A prototype CBCT scanner providing isotropic, sub-millimeter spatial resolution and soft-tissue contrast resolution comparable or superior to standard multi-detector CT (MDCT) has been developed for extremity imaging, including the capability for weight-bearing exams and multi-mode (radiography, fluoroscopy, and volumetric) imaging. Assessment of pQCT performance included measurement of bone mineral density (BMD), morphometric parameters of subchondral bone architecture, and joint space analysis. Measurements employed phantoms, cadavers, and patients from an ongoing pilot study imaged with the CBCT prototype (at various acquisition, calibration, and reconstruction techniques) in comparison to MDCT (using pQCT protocols for analysis of BMD) and micro-CT (for analysis of subchondral morphometry). Results: The CBCT extremity scanner yielded BMD measurement within ±2–3% error in both phantom studies and cadaver extremity specimens. Subchondral bone architecture (bone volume fraction, trabecular thickness, degree of anisotropy, and structure model index) exhibited good correlation with gold standard micro-CT (error ~5%), surpassing the conventional limitations of spatial resolution in clinical MDCT scanners. Joint space analysis demonstrated the potential for sensitive 3D joint space mapping beyond that of qualitative radiographic scores in application to non-weight-bearing versus weight-bearing lower extremities and assessment of phalangeal joint space integrity in the upper extremities. Conclusion: The CBCT extremity scanner demonstrated promising initial results in accurate pQCT analysis from images acquired with each CBCT scan. Future studies will include improved x-ray scatter correction and image reconstruction techniques to further improve accuracy and to correlate pQCT metrics with known pathology. PMID:25076823

  11. NVivo 8 and consistency in data analysis: reflecting on the use of a qualitative data analysis program.

    PubMed

    Bergin, Michael

    2011-01-01

    Qualitative data analysis is a complex process and demands clear thinking on the part of the analyst. However, a number of deficiencies may obstruct the analyst during the process, leading to inconsistencies. This paper is a reflection on the use of a qualitative data analysis program, NVivo 8, and its usefulness in identifying consistency and inconsistency during the coding process. The author was conducting a large-scale study of providers and users of mental health services in Ireland. He used NVivo 8 to store, code and analyse the data, and this paper reflects some of his observations during the study. The demands placed on the analyst in trying to balance the mechanics of working through a qualitative data analysis program, while simultaneously remaining conscious of the value of all sources, are highlighted. NVivo 8 as a qualitative data analysis program is a challenging but valuable means for advancing the robustness of qualitative research. Pitfalls can be avoided during analysis by running queries as the analyst progresses from tree node to tree node, rather than waiting until data analysis is well advanced.

  12. The Utility of Template Analysis in Qualitative Psychology Research.

    PubMed

    Brooks, Joanna; McCluskey, Serena; Turley, Emma; King, Nigel

    2015-04-03

    Thematic analysis is widely used in qualitative psychology research, and in this article, we present a particular style of thematic analysis known as Template Analysis. We outline the technique and consider its epistemological position, then describe three case studies of research projects which employed Template Analysis to illustrate the diverse ways it can be used. Our first case study illustrates how the technique was employed in data analysis undertaken by a team of researchers in a large-scale qualitative research project. Our second example demonstrates how a qualitative study that set out to build on mainstream theory made use of the a priori themes (themes determined in advance of coding) permitted in Template Analysis. Our final case study shows how Template Analysis can be used from an interpretative phenomenological stance. We highlight the distinctive features of this style of thematic analysis, discuss the kind of research where it may be particularly appropriate, and consider possible limitations of the technique. We conclude that Template Analysis is a flexible form of thematic analysis with real utility in qualitative psychology research.

  13. Gating the holes in the Swiss cheese (part I): Expanding professor Reason's model for patient safety.

    PubMed

    Seshia, Shashi S; Bryan Young, G; Makhinson, Michael; Smith, Preston A; Stobart, Kent; Croskerry, Pat

    2018-02-01

    Although patient safety has improved steadily, harm remains a substantial global challenge. Additionally, safety needs to be ensured not only in hospitals but also across the continuum of care. Better understanding of the complex cognitive factors influencing health care-related decisions and organizational cultures could lead to more rational approaches, and thereby to further improvement. A model integrating the concepts underlying Reason's Swiss cheese theory and the cognitive-affective biases plus cascade could advance the understanding of cognitive-affective processes that underlie decisions and organizational cultures across the continuum of care. Thematic analysis was used, with qualitative information from several sources supporting the argumentation. Complex covert cognitive phenomena underlie decisions influencing health care. In the integrated model, the Swiss cheese slices represent dynamic cognitive-affective (mental) gates: Reason's successive layers of defence. Like firewalls and antivirus programs, cognitive-affective gates normally allow the passage of rational decisions but block or counter unsound ones. Gates can be breached (ie, holes created) at one or more levels of organizations, teams, and individuals, by (1) any element of cognitive-affective biases plus (conflicts of interest and cognitive biases being the best studied) and (2) other potential error-provoking factors. Conversely, flawed decisions can be blocked and consequences minimized; for example, by addressing cognitive biases plus and error-provoking factors, and being constantly mindful. Informed shared decision making is a neglected but critical layer of defence (cognitive-affective gate). The integrated model can be custom tailored to specific situations, and the underlying principles applied to all methods for improving safety. The model may also provide a framework for developing and evaluating strategies to optimize organizational cultures and decisions. The concept is abstract, the model is virtual, and the best supportive evidence is qualitative and indirect. The proposed model may help enhance rational decision making across the continuum of care, thereby improving patient safety globally. © 2017 The Authors. Journal of Evaluation in Clinical Practice published by John Wiley & Sons, Ltd.

  14. Social Autopsy of maternal, neonatal deaths and stillbirths in rural Bangladesh: qualitative exploration of its effect and community acceptance.

    PubMed

    Biswas, Animesh; Rahman, Fazlur; Eriksson, Charli; Halim, Abdul; Dalal, Koustuv

    2016-08-23

    Social Autopsy (SA) is an innovative strategy where a trained facilitator leads community groups through a structured, standardised analysis of the physical, environmental, cultural and social factors contributing to a serious, non-fatal health event or death. The discussion stimulated by the formal process of SA determines the causes and suggests preventative measures that are appropriate and achievable in the community. Here we explored individual experiences of SA, including acceptance and participant learning, and its effect on rural communities in Bangladesh. The present study explored the experiences gained while undertaking SA of maternal and neonatal deaths and stillbirths in rural Bangladesh. Data comprised qualitative assessment of documents, observations, focus group discussions, group discussions, and in-depth interviews, analysed by content and thematic analyses. Each community's maternal and neonatal death was a unique, sad story. SA sessions undertaken by government field-level health workers were well accepted by rural communities. SA had the capability to explore the social reasons behind the medical cause of the death without apportioning blame to any individual or group. SA was a useful instrument to raise awareness and encourage community responses to errors within the society that contributed to the death. People participating in SA showed commitment to future preventative measures and devised their own solutions for the future prevention of maternal and neonatal deaths. SA highlights societal errors and promotes discussion around maternal or newborn death. SA is an effective means to deliver important preventative messages and to sensitise the community to death issues. Importantly, the community itself is enabled to devise future strategies to avert future maternal and neonatal deaths in Bangladesh. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  15. Exploring the impact of forcing error characteristics on physically based snow simulations within a global sensitivity analysis framework

    NASA Astrophysics Data System (ADS)

    Raleigh, M. S.; Lundquist, J. D.; Clark, M. P.

    2015-07-01

    Physically based models provide insights into key hydrologic processes but are associated with uncertainties due to deficiencies in forcing data, model parameters, and model structure. Forcing uncertainty is enhanced in snow-affected catchments, where weather stations are scarce and prone to measurement errors, and meteorological variables exhibit high variability. Hence, there is limited understanding of how forcing error characteristics affect simulations of cold region hydrology and which error characteristics are most important. Here we employ global sensitivity analysis to explore how (1) different error types (i.e., bias, random errors), (2) different error probability distributions, and (3) different error magnitudes influence physically based simulations of four snow variables (snow water equivalent, ablation rates, snow disappearance, and sublimation). We use the Sobol' global sensitivity analysis, which is typically used for model parameters but adapted here for testing model sensitivity to coexisting errors in all forcings. We quantify the Utah Energy Balance model's sensitivity to forcing errors with 1 840 000 Monte Carlo simulations across four sites and five different scenarios. Model outputs were (1) consistently more sensitive to forcing biases than random errors, (2) generally less sensitive to forcing error distributions, and (3) critically sensitive to different forcings depending on the relative magnitude of errors. For typical error magnitudes found in areas with drifting snow, precipitation bias was the most important factor for snow water equivalent, ablation rates, and snow disappearance timing, but other forcings had a more dominant impact when precipitation uncertainty was due solely to gauge undercatch. Additionally, the relative importance of forcing errors depended on the model output of interest. Sensitivity analysis can reveal which forcing error characteristics matter most for hydrologic modeling.
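    The Sobol' analysis described above can be sketched with the open-source SALib package. The snippet below is a hedged illustration only: the toy_swe_model and the bias ranges are hypothetical stand-ins for the Utah Energy Balance model and the paper's error scenarios.

    ```python
    # Minimal Sobol' sensitivity sketch with SALib (not the study's setup).
    # toy_swe_model and the bounds are assumptions for illustration.
    import numpy as np
    from SALib.sample import saltelli
    from SALib.analyze import sobol

    problem = {
        "num_vars": 3,
        "names": ["precip_bias", "temp_bias", "sw_radiation_bias"],
        "bounds": [[-0.5, 0.5], [-3.0, 3.0], [-50.0, 50.0]],  # assumed ranges
    }

    def toy_swe_model(x):
        """Hypothetical response: peak SWE as a function of forcing biases."""
        p, t, sw = x
        return 500.0 * (1.0 + p) - 20.0 * t - 0.5 * sw

    X = saltelli.sample(problem, 1024)            # Saltelli sampling scheme
    Y = np.apply_along_axis(toy_swe_model, 1, X)  # one model run per sample
    Si = sobol.analyze(problem, Y)                # first-order and total indices
    print(Si["S1"], Si["ST"])
    ```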

  16. Effects of measurement errors on psychometric measurements in ergonomics studies: Implications for correlations, ANOVA, linear regression, factor analysis, and linear discriminant analysis.

    PubMed

    Liu, Yan; Salvendy, Gavriel

    2009-05-01

    This paper aims to demonstrate the effects of measurement errors on psychometric measurements in ergonomics studies. A variety of sources can cause random measurement errors in ergonomics studies, and these errors can distort virtually every statistic computed and lead investigators to erroneous conclusions. The effects of measurement errors on the five most widely used statistical analysis tools are discussed and illustrated: correlation; ANOVA; linear regression; factor analysis; linear discriminant analysis. It is shown that measurement errors can greatly attenuate correlations between variables, reduce the statistical power of ANOVA, distort (overestimate, underestimate or even change the sign of) regression coefficients, understate the explanatory contributions of the most important factors in factor analysis, and underestimate the significance of the discriminant function and the discrimination abilities of individual variables in discriminant analysis. The discussion is restricted to subjective scales and survey methods and their reliability estimates. Other methods applied in ergonomics research, such as physical and electrophysiological measurements and chemical and biomedical analysis methods, also have issues of measurement errors, but they are beyond the scope of this paper. As there has been increasing interest in the development and testing of theories in ergonomics research, it has become very important for ergonomics researchers to understand the effects of measurement errors on their experimental results, which the authors believe is critical to research progress in theory development and cumulative knowledge in the ergonomics field.
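    The attenuation effect on correlations noted above follows the classical result r_obs ≈ r_true * sqrt(rel_x * rel_y), where rel denotes reliability. A short simulation (illustrative only, with assumed reliabilities of 0.7) makes the shrinkage visible:

    ```python
    # Illustrative simulation (assumed reliabilities, not the paper's data):
    # random measurement error attenuates an observed correlation toward
    # r_true * sqrt(rel_x * rel_y).
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000
    true_x = rng.normal(size=n)
    true_y = 0.8 * true_x + rng.normal(scale=0.6, size=n)  # true r = 0.8

    err_sd = np.sqrt(1 / 0.7 - 1)        # yields reliability ~0.7 per variable
    obs_x = true_x + rng.normal(scale=err_sd, size=n)
    obs_y = true_y + rng.normal(scale=err_sd, size=n)

    r_true = np.corrcoef(true_x, true_y)[0, 1]
    r_obs = np.corrcoef(obs_x, obs_y)[0, 1]
    print(r_true, r_obs, r_true * 0.7)   # observed r shrinks toward ~0.56
    ```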

  17. The Development of a Web-Based Virtual Environment for Teaching Qualitative Analysis of Structures

    ERIC Educational Resources Information Center

    O'Dwyer, D. W.; Logan-Phelan, T. M.; O'Neill, E. A.

    2007-01-01

    The current paper describes the design and development of a qualitative analysis course and an interactive web-based teaching and assessment tool called VSE (virtual structural environment). The widespread reliance on structural analysis programs requires engineers to be able to verify computer output by carrying out qualitative analyses.…

  18. The Infinitesimal Jackknife with Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Zhang, Guangjian; Preacher, Kristopher J.; Jennrich, Robert I.

    2012-01-01

    The infinitesimal jackknife, a nonparametric method for estimating standard errors, has been used to obtain standard error estimates in covariance structure analysis. In this article, we adapt it for obtaining standard errors for rotated factor loadings and factor correlations in exploratory factor analysis with sample correlation matrices. Both…

  19. [Preliminarily application of content analysis to qualitative nursing data].

    PubMed

    Liang, Shu-Yuan; Chuang, Yeu-Hui; Wu, Shu-Fang

    2012-10-01

    Content analysis is a methodology for objectively and systematically studying the content of communication in various formats. Content analysis in nursing research and nursing education is called qualitative content analysis. Qualitative content analysis is frequently applied to nursing research, as it allows researchers to determine categories inductively and deductively. This article examines qualitative content analysis in nursing research from theoretical and practical perspectives. We first describe how content analysis concepts such as unit of analysis, meaning unit, code, category, and theme are used. Next, we describe the basic steps involved in using content analysis, including data preparation, data familiarization, analysis unit identification, creating tentative coding categories, category refinement, and establishing category integrity. Finally, this paper introduces the concept of content analysis rigor, including dependability, confirmability, credibility, and transferability. This article elucidates the content analysis method in order to help professionals conduct systematic research that generates data that are informative and useful in practical application.

  20. Effects of Tropospheric Spatio-Temporal Correlated Noise on the Analysis of Space Geodetic Data

    NASA Technical Reports Server (NTRS)

    Romero-Wolf, A. F.; Jacobs, C. S.

    2011-01-01

    The standard VLBI analysis models measurement noise as purely thermal errors, modeled according to uncorrelated Gaussian distributions. As the price of recording bits steadily decreases, thermal errors will soon no longer dominate. It is therefore expected that troposphere and instrumentation/clock errors will become increasingly dominant. Given that both of these errors have correlated spectra, properly modeling the error distributions will become more relevant for optimal analysis. This paper discusses the advantages of including the correlations between tropospheric delays using a Kolmogorov spectrum and the frozen flow model pioneered by Treuhaft and Lanyi. We show examples of applying these correlated noise spectra to the weighting of VLBI data analysis.

  1. Quantitative Analysis Tools and Digital Phantoms for Deformable Image Registration Quality Assurance.

    PubMed

    Kim, Haksoo; Park, Samuel B; Monroe, James I; Traughber, Bryan J; Zheng, Yiran; Lo, Simon S; Yao, Min; Mansur, David; Ellis, Rodney; Machtay, Mitchell; Sohn, Jason W

    2015-08-01

    This article proposes quantitative analysis tools and digital phantoms to quantify the intrinsic errors of deformable image registration (DIR) systems and to establish quality assurance (QA) procedures for clinical use of DIR systems, utilizing local and global error analysis methods with clinically realistic digital image phantoms. Landmark-based image registration verifications are suitable only for images with significant feature points. To address this shortfall, we adapted a deformation vector field (DVF) comparison approach with new analysis techniques to quantify the results. Digital image phantoms are derived from data sets of actual patient images (a reference image set, R, and a test image set, T). Image sets from the same patient taken at different times are registered with deformable methods, producing a reference DVFref. Applying DVFref to the original reference image deforms T into a new image R'. The data set (R', T, and DVFref) forms a realistic truth set and therefore can be used to analyze any DIR system and expose intrinsic errors by comparing DVFref and DVFtest. For quantitative error analysis, two methods were used to calculate and delineate the differences between DVFs: (1) a local error analysis tool that displays deformation error magnitudes with color mapping on each image slice, and (2) a global error analysis tool that calculates a deformation error histogram, which describes a cumulative probability function of errors for each anatomical structure. Three digital image phantoms were generated from three patients with head and neck, lung, and liver cancers. The DIR QA procedure was evaluated using the head and neck case. © The Author(s) 2014.
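    The two analysis tools described here reduce to simple array operations. The sketch below (assumed array shapes and random stand-in fields, not the authors' software) computes a voxel-wise deformation-error magnitude map and a cumulative error histogram for one structure:

    ```python
    # Sketch under assumed array shapes (not the authors' tools): a per-voxel
    # deformation-error magnitude map and a cumulative error histogram for one
    # anatomical structure.
    import numpy as np

    def dvf_error_magnitude(dvf_ref, dvf_test):
        """Voxel-wise error magnitude between two DVFs of shape (Z, Y, X, 3)."""
        return np.linalg.norm(dvf_test - dvf_ref, axis=-1)

    def cumulative_error_histogram(err_mag, structure_mask):
        """Cumulative probability of deformation error within one structure."""
        errors = np.sort(err_mag[structure_mask])
        cum_prob = np.arange(1, errors.size + 1) / errors.size
        return errors, cum_prob  # plot to read off, e.g., the 95th-percentile error

    # Toy usage with random fields standing in for DVFref and DVFtest:
    rng = np.random.default_rng(1)
    dvf_ref = rng.normal(size=(16, 64, 64, 3))
    dvf_test = dvf_ref + rng.normal(scale=0.2, size=dvf_ref.shape)
    mask = np.zeros((16, 64, 64), dtype=bool)
    mask[4:12, 16:48, 16:48] = True     # hypothetical structure region
    err = dvf_error_magnitude(dvf_ref, dvf_test)
    errors, cum_prob = cumulative_error_histogram(err, mask)
    ```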

  2. Using secondary analysis of qualitative data of patient experiences of health care to inform health services research and policy.

    PubMed

    Ziebland, Sue; Hunt, Kate

    2014-07-01

    Qualitative research is recognized as an important method for including patients' voices and experiences in health services research and policy-making, yet the considerable potential to analyse existing qualitative data to inform health policy and practice has been little realized. This failure may partly be explained by: a lack of awareness amongst health policy makers of the increasing wealth of qualitative data available; and around 15 years of internal debates among qualitative researchers on the strengths, limitations and validity of re-use of qualitative data. Whilst acknowledging the challenges of qualitative secondary data analysis, we argue that there is a growing imperative to be pragmatic and to undertake analysis of existing qualitative data collections where they have the potential to contribute to health policy formulation. Time pressures are inherent in the policy-making process and in many circumstances it is not possible to seek funding, conduct and analyse new qualitative studies of patients' experiences in time to inform a specific policy. The danger then is that the patient voice, and the experiences of relatives and carers, is either excluded or included in a way that is easily dismissed as 'unrepresentative'. We argue that secondary analysis of qualitative data collections may sometimes be an effective means to enable patient experiences to inform policy decision-making. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.

  3. Analysis of measured data of human body based on error correcting frequency

    NASA Astrophysics Data System (ADS)

    Jin, Aiyan; Peipei, Gao; Shang, Xiaomei

    2014-04-01

    Anthropometry measures all parts of the human body surface, and the measured data form the basis for analysis and study of the human body, for the establishment and modification of garment sizes, and for the formulation and implementation of online clothing stores. In this paper, several groups of measured data were obtained, and the data errors were analyzed by examining error frequencies and applying the analysis of variance method from mathematical statistics. The paper also assesses the accuracy of the measured data, identifies the body parts that are difficult to measure, examines the causes of data errors, and summarizes the key points for minimizing errors. By analyzing the measured data on the basis of error frequency, the paper provides reference material to support the development of the garment industry.

  4. Error Analysis in Mathematics. Technical Report #1012

    ERIC Educational Resources Information Center

    Lai, Cheng-Fei

    2012-01-01

    Error analysis is a method commonly used to identify the cause of student errors when they make consistent mistakes. It is a process of reviewing a student's work and then looking for patterns of misunderstanding. Errors in mathematics can be factual, procedural, or conceptual, and may occur for a number of reasons. Reasons why students make…

  5. Error analysis in stereo vision for location measurement of 3D point

    NASA Astrophysics Data System (ADS)

    Li, Yunting; Zhang, Jun; Tian, Jinwen

    2015-12-01

    Location measurement of a 3D point in stereo vision is subject to different sources of uncertainty that propagate to the final result. Most current methods of error analysis are based on an ideal intersection model, calculating the uncertainty region of the point location by intersecting the two fields of view of a pixel, which may produce loose bounds. Moreover, only a few sources of error, such as pixel error or camera position, are taken into account in the analysis. In this paper we present a straightforward and practical method of estimating the location error that takes most sources of error into account. We consolidate and simplify all the input errors into five parameters via a rotation transformation. We then use the fast midpoint-method algorithm to derive the mathematical relationships between the target point and these parameters, yielding the expectation and covariance matrix of the 3D point location, which together constitute the uncertainty region of the point location. Afterwards, we return to the propagation of the primitive input errors through the stereo system, tracing the whole analysis process from primitive input errors to localization error. Our method has the same level of computational complexity as the state-of-the-art method. Finally, extensive experiments are performed to verify the performance of our method.
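    As a simplified, hedged illustration of the underlying idea (an idealized rectified stereo rig rather than the paper's general midpoint formulation), pixel-level input errors can be propagated through triangulation by Monte Carlo sampling to estimate the mean and covariance of the 3D point:

    ```python
    # Simplified illustration (not the paper's method): propagate assumed
    # pixel noise through rectified-stereo triangulation to get the mean and
    # covariance (uncertainty region) of a 3D point.
    import numpy as np

    f, B = 800.0, 0.12             # assumed focal length (px) and baseline (m)
    xl, y, d = 120.0, -40.0, 16.0  # left-image coordinates and disparity (px)

    def triangulate(xl, y, d):
        Z = f * B / d              # depth from disparity
        return np.array([Z * xl / f, Z * y / f, Z])

    rng = np.random.default_rng(0)
    sigma_px = 0.5                 # assumed pixel measurement noise (std, px)
    samples = np.array([
        triangulate(xl + rng.normal(0, sigma_px),
                    y + rng.normal(0, sigma_px),
                    d + rng.normal(0, sigma_px * np.sqrt(2)))
        for _ in range(20_000)
    ])
    mean_pt = samples.mean(axis=0)
    cov = np.cov(samples.T)        # 3x3 covariance: the uncertainty region
    print(mean_pt, np.sqrt(np.diag(cov)))  # depth error typically dominates
    ```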

  6. Hybrid density-functional calculations of phonons in LaCoO3

    NASA Astrophysics Data System (ADS)

    Gryaznov, Denis; Evarestov, Robert A.; Maier, Joachim

    2010-12-01

    Phonon frequencies at the Γ point in the nonmagnetic rhombohedral phase of LaCoO3 were calculated using density-functional theory with the hybrid exchange-correlation functional PBE0. The calculations involved a comparison of results for two types of basis functions commonly used in ab initio calculations, namely the plane-wave approach and the linear combination of atomic orbitals, as implemented in the VASP and CRYSTAL computer codes, respectively. Good qualitative agreement, and quantitative agreement within an error margin of less than 30%, was observed not only between the two formalisms but also between theoretical and experimental phonon frequencies. Moreover, the correlation between the phonon symmetries in the cubic and rhombohedral phases is discussed in detail on the basis of group-theoretical analysis. It is concluded that the hybrid PBE0 functional is able to correctly predict the phonon properties of LaCoO3.

  7. Quantitative validation of carbon-fiber laminate low velocity impact simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    English, Shawn A.; Briggs, Timothy M.; Nelson, Stacy M.

    Simulations of low velocity impact with a flat cylindrical indenter upon a carbon fiber fabric reinforced polymer laminate are rigorously validated. Comparison of the impact energy absorption between the model and experiment is used as the validation metric. Additionally, non-destructive evaluation, including ultrasonic scans and three-dimensional computed tomography, provides qualitative validation of the models. The simulations include delamination, matrix cracks and fiber breaks. An orthotropic damage and failure constitutive model, capable of predicting progressive damage and failure, is developed and described in conjunction. An ensemble of simulations incorporating model parameter uncertainties is used to predict a response distribution, which is then compared to experimental output using appropriate statistical methods. Lastly, the model form errors are exposed and corrected for use in an additional blind validation analysis. The result is a quantifiable confidence in material characterization and model physics when simulating low velocity impact in structures of interest.

  8. Clinical decision regret among critical care nurses: a qualitative analysis.

    PubMed

    Arslanian-Engoren, Cynthia; Scott, Linda D

    2014-01-01

    Decision regret is a negative cognitive emotion associated with experiences of guilt and situations of interpersonal harm. These negative affective responses may contribute to emotional exhaustion in critical care nurses (CCNs), increased staff turnover rates and high medication error rates. Yet, little is known about clinical decision regret among CCNs or the conditions or situations (e.g., feeling sleepy) that may precipitate its occurrence. To examine decision regret among CCNs, with an emphasis on clinical decisions made when nurses were most sleepy. A content analytic approach was used to examine the narrative descriptions of clinical decisions by CCNs when sleepy. Six decision regret themes emerged that represented deviations in practice or performance behaviors that were attributed to fatigued CCNs. While 157 CCNs disclosed a clinical decision they made at work while sleepy, the prevalence may be underestimated and warrants further investigation. Copyright © 2014 Elsevier Inc. All rights reserved.

  9. A study of photon propagation in free-space based on hybrid radiosity-radiance theorem.

    PubMed

    Chen, Xueli; Gao, Xinbo; Qu, Xiaochao; Liang, Jimin; Wang, Lin; Yang, Da'an; Garofalakis, Anikitos; Ripoll, Jorge; Tian, Jie

    2009-08-31

    Noncontact optical imaging has attracted increasing attention in recent years due to its significant advantages in detection sensitivity, spatial resolution, image quality, and system simplicity compared with contact measurement. However, simulating photon transport in free-space remains an extremely challenging topic owing to the complexity of the optical system. For this purpose, this paper proposes an analytical model for photon propagation in free-space based on the hybrid radiosity-radiance theorem (HRRT). It combines Lambert's cosine law and the radiance theorem to handle the influence of the complicated lens and to simplify the photon transport process in the optical system. The performance of the proposed model is evaluated and validated with numerical simulations and physical experiments. Qualitative comparison results of the flux distribution at the detector are presented. In particular, error analysis demonstrates the feasibility and potential of the proposed model for simulating photon propagation in free-space.

  10. Rapid evaluation and quantitative analysis of thyme, origano and chamomile essential oils by ATR-IR and NIR spectroscopy

    NASA Astrophysics Data System (ADS)

    Schulz, Hartwig; Quilitzsch, Rolf; Krüger, Hans

    2003-12-01

    The essential oils obtained from various chemotypes of thyme, origano and chamomile species were studied by ATR/FT-IR as well as NIR spectroscopy. Application of multivariate statistics (PCA, PLS) in conjunction with analytical reference data leads to very good IR and NIR calibration results. For the main essential oil components (e.g. carvacrol, thymol, γ-terpinene, α-bisabolol and β-farnesene), standard errors are in the range of the applied GC reference method. In most cases the multiple coefficients of determination (R2) are >0.97. Using the IR fingerprint region (900-1400 cm-1), a qualitative discrimination of the individual chemotypes is already possible by visual judgement, without applying any chemometric algorithms. The described rapid and non-destructive methods can be applied in industry to easily monitor the purification, blending and redistillation processes of the mentioned essential oils.
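    A calibration of the kind described can be sketched with scikit-learn's PLS implementation. The spectra and reference values below are synthetic placeholders, not the study's data:

    ```python
    # Hedged sketch of a PLS calibration of NIR spectra against a GC reference
    # value (e.g., thymol content). Synthetic data stand in for real spectra.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(0)
    n_samples, n_wavelengths = 60, 200
    concentration = rng.uniform(10, 60, n_samples)   # "GC reference" values (%)
    spectra = (concentration[:, None] * rng.normal(1, 0.01, (1, n_wavelengths))
               + rng.normal(scale=0.5, size=(n_samples, n_wavelengths)))

    pls = PLSRegression(n_components=5)
    predicted = cross_val_predict(pls, spectra, concentration, cv=10).ravel()
    rmsecv = np.sqrt(np.mean((predicted - concentration) ** 2))
    r2 = np.corrcoef(predicted, concentration)[0, 1] ** 2
    print(f"RMSECV = {rmsecv:.2f}, R^2 = {r2:.3f}")  # cf. R2 > 0.97 above
    ```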

  11. A median-Gaussian filtering framework for Moiré pattern noise removal from X-ray microscopy image.

    PubMed

    Wei, Zhouping; Wang, Jian; Nichol, Helen; Wiebe, Sheldon; Chapman, Dean

    2012-02-01

    Moiré pattern noise in Scanning Transmission X-ray Microscopy (STXM) imaging introduces significant errors in qualitative and quantitative image analysis. Due to the complex origin of the noise, it is difficult to avoid Moiré pattern noise during the image data acquisition stage. In this paper, we introduce a post-processing method for filtering Moiré pattern noise from STXM images. This method includes semi-automatic detection of the spectral peaks in the Fourier amplitude spectrum using a local median filter, and elimination of the spectral noise peaks using a Gaussian notch filter. The proposed median-Gaussian filtering framework gives good results for STXM images whose dimensions are powers of two, provided that parameters such as the threshold, the sizes of the median and Gaussian filters, and the size of the low-frequency window are properly selected. Copyright © 2011 Elsevier Ltd. All rights reserved.
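    A rough sketch of the described pipeline (with assumed parameter choices, not the authors' implementation) detects spectral peaks against a local median background and suppresses them with Gaussian notches:

    ```python
    # Sketch of a median-Gaussian notch filter (assumed parameters, not the
    # authors' code): flag amplitude-spectrum peaks that exceed a local median
    # background, protect the low-frequency window, and notch the peaks out.
    import numpy as np
    from scipy.ndimage import median_filter

    def remove_moire(image, threshold=4.0, median_size=7, notch_sigma=3.0,
                     dc_window=16):
        F = np.fft.fftshift(np.fft.fft2(image))
        amp = np.abs(F)
        background = median_filter(amp, size=median_size)
        peaks = amp > threshold * background          # candidate noise peaks

        # Protect the low-frequency window around DC (image content, not Moiré)
        cy, cx = np.array(image.shape) // 2
        peaks[cy - dc_window:cy + dc_window, cx - dc_window:cx + dc_window] = False

        # Multiplicative mask of Gaussian notches at the peak locations
        yy, xx = np.indices(image.shape)
        mask = np.ones_like(amp)
        for py, px in zip(*np.nonzero(peaks)):
            mask *= 1.0 - np.exp(-((yy - py) ** 2 + (xx - px) ** 2)
                                 / (2 * notch_sigma ** 2))

        return np.real(np.fft.ifft2(np.fft.ifftshift(F * mask)))
    ```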

  12. A new simplex chemometric approach to identify olive oil blends with potentially high traceability.

    PubMed

    Semmar, N; Laroussi-Mezghani, S; Grati-Kamoun, N; Hammami, M; Artaud, J

    2016-10-01

    Olive oil blends (OOBs) are complex matrices combining different cultivars at variable proportions. Although qualitative determination of OOBs has been addressed in several chemometric studies, quantitative evaluation of their contents remains poorly developed because of the difficulty of tracing co-occurring cultivars. Around this question, we recently published an original simplex approach that helps develop predictive models of the proportions of co-occurring cultivars from the chemical profiles of the resulting blends (Semmar & Artaud, 2015). Beyond predictive model construction and validation, this paper presents an extension based on the analysis of prediction errors to statistically define the blends with the highest predictability among all possible blends that can be made by mixing cultivars at different proportions. This provides an interesting way to identify a priori labeled commercial products with potentially high traceability, taking into account the natural chemical variability of the different constitutive cultivars. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Quantitative validation of carbon-fiber laminate low velocity impact simulations

    DOE PAGES

    English, Shawn A.; Briggs, Timothy M.; Nelson, Stacy M.

    2015-09-26

    Simulations of low velocity impact with a flat cylindrical indenter upon a carbon fiber fabric reinforced polymer laminate are rigorously validated. Comparison of the impact energy absorption between the model and experiment is used as the validation metric. Additionally, non-destructive evaluation, including ultrasonic scans and three-dimensional computed tomography, provides qualitative validation of the models. The simulations include delamination, matrix cracks and fiber breaks. An orthotropic damage and failure constitutive model, capable of predicting progressive damage and failure, is developed and described in conjunction. An ensemble of simulations incorporating model parameter uncertainties is used to predict a response distribution, which is then compared to experimental output using appropriate statistical methods. Lastly, the model form errors are exposed and corrected for use in an additional blind validation analysis. The result is a quantifiable confidence in material characterization and model physics when simulating low velocity impact in structures of interest.

  14. Petri-net-based 2D design of DNA walker circuits.

    PubMed

    Gilbert, David; Heiner, Monika; Rohr, Christian

    2018-01-01

    We consider localised DNA computation, where a DNA strand walks along a binary decision graph to compute a binary function. One of the challenges for the design of reliable walker circuits consists in leakage transitions, which occur when a walker jumps into another branch of the decision graph. We automatically identify leakage transitions, which allows for a detailed qualitative and quantitative assessment of circuit designs, design comparison, and design optimisation. The ability to identify leakage transitions is an important step in the process of optimising DNA circuit layouts where the aim is to minimise the computational error inherent in a circuit while minimising the area of the circuit. Our 2D modelling approach of DNA walker circuits relies on coloured stochastic Petri nets which enable functionality, topology and dimensionality all to be integrated in one two-dimensional model. Our modelling and analysis approach can be easily extended to 3-dimensional walker systems.

  15. Impact of tailored feedback in assessment of communication skills for medical students.

    PubMed

    Uhm, Seilin; Lee, Gui H; Jin, Jeong K; Bak, Yong I; Jeoung, Yeon O; Kim, Chan W

    2015-01-01

    Finding effective ways of teaching and assessing communication skills remains a challenging part of medical education. This study aims at exploring the usefulness and effectiveness of providing additional feedback based on qualitative analysis in the assessment of communication skills in undergraduate medical training. We also examined the possibilities of using qualitative analysis to develop tailored strategies for improvement in communication skills training. This study was carried out on medical students (n=87) undergoing their final-year clinical performance examination on communication skills with a standardized patient, by video-recording and transcribing their performances. Video-recordings of 26 students were randomly selected for qualitative analysis, and additional feedback was provided. We assessed the level of acceptance of communication skills scores between the study and nonstudy groups and within the study group, before and after receiving feedback based on qualitative analysis. There was a statistically significant increase in the level of acceptance of feedback after delivering additional feedback using qualitative analysis, with the percentage of agreement with feedback increasing from 15.4 to 80.8% (p<0.001). Incorporating feedback based on qualitative analysis into communication skills assessment gives essential information for medical students to learn and self-reflect, which could potentially lead to improved communication skills. As evident from our study, feedback becomes more meaningful and effective with additional feedback using qualitative analysis.

  16. Impact of tailored feedback in assessment of communication skills for medical students

    PubMed Central

    Uhm, Seilin; Lee, Gui H.; Jin, Jeong K.; Bak, Yong I.; Jeoung, Yeon O.; Kim, Chan W.

    2015-01-01

    Background Finding effective ways of teaching and assessing communication skills remains a challenging part of medical education. This study aims at exploring the usefulness and effectiveness of providing additional feedback based on qualitative analysis in the assessment of communication skills in undergraduate medical training. We also examined the possibilities of using qualitative analysis to develop tailored strategies for improvement in communication skills training. Methods This study was carried out on medical students (n=87) undergoing their final-year clinical performance examination on communication skills with a standardized patient, by video-recording and transcribing their performances. Video-recordings of 26 students were randomly selected for qualitative analysis, and additional feedback was provided. We assessed the level of acceptance of communication skills scores between the study and nonstudy groups and within the study group, before and after receiving feedback based on qualitative analysis. Results There was a statistically significant increase in the level of acceptance of feedback after delivering additional feedback using qualitative analysis, with the percentage of agreement with feedback increasing from 15.4 to 80.8% (p<0.001). Conclusions Incorporating feedback based on qualitative analysis into communication skills assessment gives essential information for medical students to learn and self-reflect, which could potentially lead to improved communication skills. As evident from our study, feedback becomes more meaningful and effective with additional feedback using qualitative analysis. PMID:26154864

  17. Assigning poetry reading as a way of introducing students to qualitative data analysis.

    PubMed

    Raingruber, Bonnie

    2009-08-01

    The aim of the paper is to explain how poetry reading can be used to teach interpretive analysis of qualitative data. A number of studies were located in the nursing literature that focused on using poetry to help students develop empathy for patients, to teach students to reflect on their own practice, and to assist them in developing self-understanding. No studies were found that described the use of poetry reading as a way of teaching the skill of interpretive analysis. There are, however, a number of parallels between the principles of poetry reading and qualitative analysis that suggest that this method of teaching would be successful. International papers published on PubMed, Medline, and CINAHL were reviewed to identify challenges facing educators and ways of teaching the process of qualitative data analysis using poetry reading. Using poetry reading to teach skills of qualitative data analysis helps motivate students, cultivates a reflective mindset, and develops the skill of working as a member of an interpretive group. Framing interpretive work as being like reading poetry helps students pick up more quickly on the art that is a major component of the work. This approach also helps students learn the importance of cultural and contextual particulars as they begin analyzing qualitative data. Using poetry reading to introduce students to the complex skill of qualitative data analysis is an effective pedagogical strategy.

  18. Qualitative environmental health research: an analysis of the literature, 1991-2008.

    PubMed

    Scammell, Madeleine Kangsen

    2011-10-01

    Qualitative research uses nonnumeric data to understand people's opinions, motives, understanding, and beliefs about events or phenomena. In this analysis, I report the use of qualitative methods and data in the study of the relationship between environmental exposures and human health. A primary search for peer-reviewed journal articles dated from 1991 through 2008 included the following three terms: qualitative, environ*, and health. Searches resulted in 3,155 records. Data were extracted and findings of articles analyzed to determine where and by whom qualitative environmental health research is conducted and published, the types of methods and analyses used in qualitative studies of environmental health, and the types of information qualitative data contribute to environmental health. The results highlight a diversity of disciplines and techniques among researchers who used qualitative methods to study environmental health. Nearly all of the studies identified increased scientific understanding of lay perceptions of environmental health exposures. This analysis demonstrates the potential of qualitative data to improve understanding of complex exposure pathways, including the influence of social factors on environmental health, and health outcomes.

  19. An Improved Flame Test for Qualitative Analysis Using a Multichannel UV-Visible Spectrophotometer

    ERIC Educational Resources Information Center

    Blitz, Jonathan P.; Sheeran, Daniel J.; Becker, Thomas L.

    2006-01-01

    Qualitative analysis schemes are used in undergraduate laboratory settings as a way to introduce equilibrium concepts and logical thinking. The main component of all qualitative analysis schemes is a flame test, as the color of light emitted from certain elements is distinctive and a flame photometer or spectrophotometer in each laboratory is…

  20. Use of MRI in Differentiation of Papillary Renal Cell Carcinoma Subtypes: Qualitative and Quantitative Analysis.

    PubMed

    Doshi, Ankur M; Ream, Justin M; Kierans, Andrea S; Bilbily, Matthew; Rusinek, Henry; Huang, William C; Chandarana, Hersh

    2016-03-01

    The purpose of this study was to determine whether qualitative and quantitative MRI feature analysis is useful for differentiating type 1 from type 2 papillary renal cell carcinoma (PRCC). This retrospective study included 21 type 1 and 17 type 2 PRCCs evaluated with preoperative MRI. Two radiologists independently evaluated various qualitative features, including signal intensity, heterogeneity, and margin. For the quantitative analysis, a radiology fellow and a medical student independently drew 3D volumes of interest over the entire tumor on T2-weighted HASTE images, apparent diffusion coefficient parametric maps, and nephrographic phase contrast-enhanced MR images to derive first-order texture metrics. Qualitative and quantitative features were compared between the groups. For both readers, qualitative features with greater frequency in type 2 PRCC included heterogeneous enhancement, indistinct margin, and T2 heterogeneity (all, p < 0.035). Indistinct margins and heterogeneous enhancement were independent predictors (AUC, 0.822). Quantitative analysis revealed that apparent diffusion coefficient, HASTE, and contrast-enhanced entropy were greater in type 2 PRCC (p < 0.05; AUC, 0.682-0.716). A combined quantitative and qualitative model had an AUC of 0.859. Qualitative features within the model had interreader concordance of 84-95%, and the quantitative data had intraclass coefficients of 0.873-0.961. Qualitative and quantitative features can help discriminate between type 1 and type 2 PRCC. Quantitative analysis may capture useful information that complements the qualitative appearance while benefiting from high interobserver agreement.
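    One of the first-order texture metrics reported above, intensity entropy, is straightforward to compute. The sketch below (illustrative only, with synthetic voxel intensities rather than MRI data) shows why a more heterogeneous lesion yields higher entropy:

    ```python
    # Illustrative computation (not the study's software) of a first-order
    # texture metric: Shannon entropy of voxel intensities within a tumor VOI.
    import numpy as np

    def first_order_entropy(voxels, n_bins=64):
        """Shannon entropy (bits) of the intensity histogram inside a VOI."""
        hist, _ = np.histogram(voxels, bins=n_bins)
        p = hist / hist.sum()
        p = p[p > 0]                   # ignore empty bins
        return -np.sum(p * np.log2(p))

    # Toy VOIs: a homogeneous and a heterogeneous lesion (synthetic values)
    rng = np.random.default_rng(0)
    homogeneous = rng.normal(100, 5, size=5000)
    heterogeneous = np.concatenate([rng.normal(80, 20, 2500),
                                    rng.normal(140, 25, 2500)])
    print(first_order_entropy(homogeneous), first_order_entropy(heterogeneous))
    # Higher entropy for the heterogeneous lesion, as reported for type 2 PRCC.
    ```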

  1. The effects of finite mass, adiabaticity, and isothermality in nonlinear plasma wave studies

    NASA Astrophysics Data System (ADS)

    Hellberg, Manfred A.; Verheest, Frank; Mace, Richard L.

    2018-03-01

    The propagation of arbitrary amplitude ion-acoustic solitons is investigated in a plasma containing cool adiabatic positive ions and hot electrons or negative ions. The latter can be described by polytropic pressure-density relations, both with or without the retention of inertial effects. For analytical tractability, the resulting Sagdeev pseudopotential needs to be expressed in terms of the hot negative species density, rather than the electrostatic potential. The inclusion of inertia is found to have no qualitative effect, but yields quantitative differences that vary monotonically with the mass ratio and the polytropic index. This result contrasts with results for analogous problems involving three species, where it was found that inertia could yield significant qualitative differences. Attention is also drawn to the fact that in the literature there are numerous papers in which species are assumed to behave adiabatically, where the isothermal assumption would be more appropriate. Such an assumption leads to quantitative errors and, in some instances, even qualitative gaps for "reverse polarity" solitons.

  2. Fragment-orbital tunneling currents and electronic couplings for analysis of molecular charge-transfer systems.

    PubMed

    Hwang, Sang-Yeon; Kim, Jaewook; Kim, Woo Youn

    2018-04-04

    In theoretical charge-transfer research, calculation of the electronic coupling element is crucial for examining the degree of the electronic donor-acceptor interaction. The tunneling current (TC), representing the magnitudes and directions of electron flow, provides a way of evaluating electronic couplings, along with the ability of visualizing how electrons flow in systems. Here, we applied the TC theory to π-conjugated organic dimer systems, in the form of our fragment-orbital tunneling current (FOTC) method, which uses the frontier molecular-orbitals of system fragments as diabatic states. For a comprehensive test of FOTC, we assessed how reasonable the computed electronic couplings and the corresponding TC densities are for the hole- and electron-transfer databases HAB11 and HAB7. FOTC gave 12.5% mean relative unsigned error with regard to the high-level ab initio reference. The shown performance is comparable with that of fragment-orbital density functional theory, which gave the same error by 20.6% or 13.9% depending on the formulation. In the test of a set of nucleobase π stacks, we showed that the original TC expression is also applicable to nondegenerate cases under the condition that the overlap between the charge distributions of diabatic states is small enough to offset the energy difference. Lastly, we carried out visual analysis on the FOTC densities of thiophene dimers with different intermolecular alignments. The result depicts an intimate topological connection between the system geometry and electron flow. Our work provides quantitative and qualitative grounds for FOTC, showing it to be a versatile tool in characterization of molecular charge-transfer systems.

  3. Land cover mapping of Greater Mesoamerica using MODIS data

    USGS Publications Warehouse

    Giri, Chandra; Jenkins, Clinton N.

    2005-01-01

    A new land cover database of Greater Mesoamerica has been prepared using moderate resolution imaging spectroradiometer (MODIS, 500 m resolution) satellite data. Daily surface reflectance MODIS data and a suite of ancillary data were used in preparing the database by employing a decision tree classification approach. The new land cover data are an improvement over traditional advanced very high resolution radiometer (AVHRR) based land cover data in terms of both spatial and thematic details. The dominant land cover type in Greater Mesoamerica is forest (39%), followed by shrubland (30%) and cropland (22%). Country analysis shows forest as the dominant land cover type in Belize (62%), Costa Rica (52%), Guatemala (53%), Honduras (56%), Nicaragua (53%), and Panama (48%); cropland as the dominant land cover type in El Salvador (60.5%); and shrubland as the dominant land cover type in Mexico (37%). A three-step approach was used to assess the quality of the classified land cover data: (i) qualitative assessment provided good insight in identifying and correcting gross errors; (ii) correlation analysis of MODIS- and Landsat-derived land cover data revealed strong positive association for forest (r2 = 0.88), shrubland (r2 = 0.75), and cropland (r2 = 0.97) but weak positive association for grassland (r2 = 0.26); and (iii) an error matrix generated using unseen training data provided an overall accuracy of 77.3% with a Kappa coefficient of 0.73608. Overall, MODIS 500 m data and the methodology used were found to be quite useful for broad-scale land cover mapping of Greater Mesoamerica.
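    Step (iii) of the quality assessment reduces to standard error-matrix arithmetic. The sketch below (with a made-up confusion matrix, not the study's) computes overall accuracy and the Kappa coefficient:

    ```python
    # Hedged sketch of error-matrix accuracy assessment: overall accuracy and
    # Cohen's kappa, computed for a hypothetical 4-class confusion matrix.
    import numpy as np

    def accuracy_and_kappa(confusion):
        """Overall accuracy and kappa from a square error matrix."""
        n = confusion.sum()
        po = np.trace(confusion) / n                             # observed agreement
        pe = (confusion.sum(0) * confusion.sum(1)).sum() / n**2  # chance agreement
        return po, (po - pe) / (1 - pe)

    # Rows = reference (unseen training data), columns = classified map
    classes = ["forest", "shrubland", "cropland", "grassland"]
    cm = np.array([[80,  6,  3, 1],
                   [ 7, 60,  5, 3],
                   [ 2,  4, 70, 4],
                   [ 3,  5,  6, 40]])
    acc, kappa = accuracy_and_kappa(cm)
    print(f"overall accuracy = {acc:.3f}, kappa = {kappa:.3f}")
    ```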

  4. Does the Consecutive Interpreting Approach enhance medical English communication skills of Japanese-speaking students?

    PubMed

    Iizuka, Hideki; Lefor, Alan K

    2018-04-19

    To determine if the Consecutive Interpreting Approach enhances medical English communication skills of students in a Japanese medical university and to assess this method based on performance and student evaluations.  This is a three-phase study using a mixed-methods design, which starts with four language reproduction activities for 30 medical and 95 nursing students, followed by a quantitative analysis of perfect-match reproduction rates to assess changes over the duration of the study and qualitative error analysis of participants' language reproduction. The final stage included a scored course evaluation and free-form comments to evaluate this approach and to identify effective educational strategies to enhance medical English communication skills. Mean perfect-match reproduction rates of all participants over four reproduction activities differed statistically significantly (repeated measures ANOVA, p<0.0005). The overall perfect-match reproduction rates improved from 75.3 % to 90.1 % for nursing and 89.5 % to 91.6% for medical students. The final achievement levels of nursing and medical students were equivalent (test of equivalence, p<0.05). Details of lexical- and syntactic-level errors were identified. The course evaluation scores were 3.74 (n=30, SD = 0.59) and 3.77 (n=90, SD=0.54) for medical and nursing students respectively. Participants' medical English communication skills are enhanced using this approach. Participants expressed positive feedback regarding this instruction method. This approach may be effective to enhance the language skills of non-native English-speaking students seeking to practice medicine in English speaking countries.

  5. Does the Consecutive Interpreting Approach enhance medical English communication skills of Japanese-speaking students?

    PubMed Central

    Lefor, Alan K.

    2018-01-01

    Objectives To determine if the Consecutive Interpreting Approach enhances medical English communication skills of students in a Japanese medical university and to assess this method based on performance and student evaluations.   Methods  This is a three-phase study using a mixed-methods design, which starts with four language reproduction activities for 30 medical and 95 nursing students, followed by a quantitative analysis of perfect-match reproduction rates to assess changes over the duration of the study and qualitative error analysis of participants' language reproduction. The final stage included a scored course evaluation and free-form comments to evaluate this approach and to identify effective educational strategies to enhance medical English communication skills. Results Mean perfect-match reproduction rates of all participants over four reproduction activities differed statistically significantly (repeated measures ANOVA, p<0.0005). The overall perfect-match reproduction rates improved from 75.3 % to 90.1 % for nursing and 89.5 % to 91.6% for medical students. The final achievement levels of nursing and medical students were equivalent (test of equivalence, p<0.05). Details of lexical- and syntactic-level errors were identified. The course evaluation scores were 3.74 (n=30, SD = 0.59) and 3.77 (n=90, SD=0.54) for medical and nursing students respectively. Conclusions Participants’ medical English communication skills are enhanced using this approach. Participants expressed positive feedback regarding this instruction method. This approach may be effective to enhance the language skills of non-native English-speaking students seeking to practice medicine in English speaking countries. PMID:29677693

  6. Safety and Performance Analysis of the Non-Radar Oceanic/Remote Airspace In-Trail Procedure

    NASA Technical Reports Server (NTRS)

    Carreno, Victor A.; Munoz, Cesar A.

    2007-01-01

    This document presents a safety and performance analysis of the nominal case for the In-Trail Procedure (ITP) in a non-radar oceanic/remote airspace. The analysis estimates the risk of collision between the aircraft performing the ITP and a reference aircraft. The risk of collision is only estimated for the ITP maneuver and it is based on nominal operating conditions. The analysis does not consider human error, communication error conditions, or the normal risk of flight present in current operations. The hazards associated with human error and communication errors are evaluated in an Operational Hazards Analysis presented elsewhere.

  7. Error Analysis of Brailled Instructional Materials Produced by Public School Personnel in Texas

    ERIC Educational Resources Information Center

    Herzberg, Tina

    2010-01-01

    In this study, a detailed error analysis was performed to determine if patterns of errors existed in braille transcriptions. The most frequently occurring errors were the insertion of letters or words that were not contained in the original print material; the incorrect usage of the emphasis indicator; and the incorrect formatting of titles,…

  8. Integrated analysis of error detection and recovery

    NASA Technical Reports Server (NTRS)

    Shin, K. G.; Lee, Y. H.

    1985-01-01

    An integrated modeling and analysis of error detection and recovery is presented. When fault latency and/or error latency exist, the system may suffer from multiple faults or error propagations which seriously deteriorate the fault-tolerant capability. Several detection models that enable analysis of the effect of detection mechanisms on the subsequent error handling operations and the overall system reliability were developed. Following detection of the faulty unit and reconfiguration of the system, the contaminated processes or tasks have to be recovered. The strategies of error recovery employed depend on the detection mechanisms and the available redundancy. Several recovery methods including the rollback recovery are considered. The recovery overhead is evaluated as an index of the capabilities of the detection and reconfiguration mechanisms.

  9. A Simple Exact Error Rate Analysis for DS-CDMA with Arbitrary Pulse Shape in Flat Nakagami Fading

    NASA Astrophysics Data System (ADS)

    Rahman, Mohammad Azizur; Sasaki, Shigenobu; Kikuchi, Hisakazu; Harada, Hiroshi; Kato, Shuzo

    A simple exact error rate analysis is presented for random binary direct sequence code division multiple access (DS-CDMA) considering a general pulse shape and a flat Nakagami fading channel. First, a simple model is developed for the multiple access interference (MAI). Based on this, a simple exact expression for the characteristic function (CF) of the MAI is developed in a straightforward manner. Finally, an exact expression for the error rate is obtained following the CF method of error rate analysis. The exact error rate so obtained can be evaluated much more easily than the only reliable approximate error rate expression currently available, which is based on the Improved Gaussian Approximation (IGA).
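    As general background (not necessarily the paper's exact expression), the CF method rests on a characteristic-function inversion of the Gil-Pelaez type, which gives the error probability directly as the probability that the decision variable Z falls below zero:

    ```latex
    % Gil-Pelaez inversion of the characteristic function \Phi_Z of the
    % decision variable Z (signal plus MAI plus noise, conditioned on fading):
    P_e \;=\; \Pr\{Z < 0\}
         \;=\; \frac{1}{2} \;-\; \frac{1}{\pi}\int_{0}^{\infty}
               \frac{\operatorname{Im}\{\Phi_Z(\omega)\}}{\omega}\,\mathrm{d}\omega .
    ```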

  10. The role of qualitative research in psychological journals.

    PubMed

    Kidd, Sean A

    2002-03-01

    The acceptance of qualitative research in 15 journals published and distributed by the American Psychological Association (APA) was investigated. This investigation included a PsycINFO search using the keyword qualitative, an analysis of 15 APA journals for frequency of qualitative publication, a content analysis of the journal descriptions, and the results of qualitative interviews with 10 of the chief editors of those journals. The results indicate that there exists a substantial amount of interest in the potential contribution of qualitative methods in major psychological journals, although this interest is not ubiquitous, well defined, or communicated. These findings highlight the need for APA to state its position regarding the applicability of qualitative methods in the study of psychology.

  11. Error Analysis: Past, Present, and Future

    ERIC Educational Resources Information Center

    McCloskey, George

    2017-01-01

    This commentary will take an historical perspective on the Kaufman Test of Educational Achievement (KTEA) error analysis, discussing where it started, where it is today, and where it may be headed in the future. In addition, the commentary will compare and contrast the KTEA error analysis procedures that are rooted in psychometric methodology and…

  12. Optical System Error Analysis and Calibration Method of High-Accuracy Star Trackers

    PubMed Central

    Sun, Ting; Xing, Fei; You, Zheng

    2013-01-01

    The star tracker is a high-accuracy attitude measurement device widely used in spacecraft. Its performance depends largely on the precision of the optical system parameters. Therefore, the analysis of optical system parameter errors and a precise calibration model are crucial to the accuracy of the star tracker. To date, research in this field has lacked a systematic and universal analysis. This paper proposes in detail an approach for the synthetic error analysis of the star tracker, without complicated theoretical derivation. This approach can determine the error propagation relationship of the star tracker and can build an error model intuitively and systematically. The analysis results can be used as a foundation and a guide for the optical design, calibration, and compensation of the star tracker. A calibration experiment was designed and conducted. Excellent calibration results were achieved based on the calibration model. To summarize, the error analysis approach and the calibration method are shown to be adequate and precise, and could provide an important guarantee for the design, manufacture, and measurement of high-accuracy star trackers. PMID:23567527

  13. A Posteriori Error Analysis of Two Stage Computation Methods with Application to Efficient Discretization and the Parareal Algorithm.

    PubMed

    Chaudhry, Jehanzeb Hameed; Estep, Don; Tavener, Simon; Carey, Varis; Sandelin, Jeff

    2016-01-01

    We consider numerical methods for initial value problems that employ a two-stage approach consisting of solution on a relatively coarse discretization followed by solution on a relatively fine discretization. Examples include adaptive error control, parallel-in-time solution schemes, and efficient solution of adjoint problems for computing a posteriori error estimates. We describe a general formulation of two-stage computations, then perform a general a posteriori error analysis based on computable residuals and solution of an adjoint problem. The analysis accommodates variations in the two-stage computation and in the formulation of the adjoint problems. We apply the analysis to compute "dual-weighted" a posteriori error estimates, to develop novel algorithms for efficient solution that take into account cancellation of error, and to the Parareal Algorithm. We test the various results using several numerical examples.
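
    As a concrete instance of a two-stage (coarse-then-fine) computation, the sketch below runs the Parareal iteration on the scalar test problem u' = lam*u. The propagators, step counts, and test problem are illustrative assumptions, not the paper's examples.

      # Parareal: U[n+1] <- G(U_new[n]) + F(U_old[n]) - G(U_old[n])
      import numpy as np

      lam, T, N, K = -1.0, 2.0, 10, 5   # decay rate, horizon, slabs, iterations
      dT = T / N

      def G(u, dt):                     # coarse propagator: one Euler step
          return u + dt * lam * u

      def F(u, dt, m=100):              # fine propagator: m Euler substeps
          for _ in range(m):
              u = u + (dt / m) * lam * u
          return u

      U = np.zeros(N + 1); U[0] = 1.0
      for n in range(N):                # initial coarse sweep
          U[n + 1] = G(U[n], dT)

      for k in range(K):                # Parareal corrections
          Fu = np.array([F(U[n], dT) for n in range(N)])
          Gu = np.array([G(U[n], dT) for n in range(N)])
          for n in range(N):            # U[n] here is already the updated value
              U[n + 1] = G(U[n], dT) + Fu[n] - Gu[n]

      print(U[-1], np.exp(lam * T))     # Parareal result vs exact solution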

  14. Conducting Qualitative Data Analysis: Reading Line-by-Line, but Analyzing by Meaningful Qualitative Units

    ERIC Educational Resources Information Center

    Chenail, Ronald J.

    2012-01-01

    In the first of a series of "how-to" essays on conducting qualitative data analysis, Ron Chenail points out the challenges of determining units to analyze qualitatively when dealing with text. He acknowledges that although we may read a document word-by-word or line-by-line, we need to adjust our focus when processing the text for purposes of…

  15. Socializing the human factors analysis and classification system: incorporating social psychological phenomena into a human factors error classification system.

    PubMed

    Paletz, Susannah B F; Bearman, Christopher; Orasanu, Judith; Holbrook, Jon

    2009-08-01

    The presence of social psychological pressures on pilot decision making was assessed using qualitative analyses of critical incident interviews. Social psychological phenomena have long been known to influence attitudes and behavior but have not been highlighted in accident investigation models. Using a critical incident method, 28 pilots who flew in Alaska were interviewed. The participants were asked to describe a situation involving weather when they were pilot in command and found their skills challenged. They were asked to describe the incident in detail but were not explicitly asked to identify social pressures. Pressures were extracted from transcripts in a bottom-up manner and then clustered into themes. Of the 28 pilots, 16 described social psychological pressures on their decision making, specifically, informational social influence, the foot-in-the-door persuasion technique, normalization of deviance, and impression management and self-consistency motives. We believe accident and incident investigations can benefit from explicit inclusion of common social psychological pressures. We recommend specific ways of incorporating these pressures into the Human Factors Analysis and Classification System.

  16. Quantitative Detection of Cracks in Steel Using Eddy Current Pulsed Thermography.

    PubMed

    Shi, Zhanqun; Xu, Xiaoyu; Ma, Jiaojiao; Zhen, Dong; Zhang, Hao

    2018-04-02

    Small cracks are common defects in steel and often lead to catastrophic accidents in industrial applications. Various nondestructive testing methods have been investigated for crack detection; however, most current methods focus on qualitative crack identification and image processing. In this study, eddy current pulsed thermography (ECPT) was applied for quantitative crack detection based on derivative analysis of temperature variation. The effects of the excitation parameters on the temperature variation were analyzed in the simulation study. The crack profile and position are identified in the thermal image based on the Canny edge detection algorithm. Then, one or more trajectories are drawn across the crack profile, and the crack boundary is located from the temperature distribution along them; the slope curve along each trajectory is obtained. Finally, quantitative analysis of the crack sizes was performed by analyzing the features of the slope curves. The experimental verification showed that the crack sizes could be quantitatively detected with errors of less than 1%. Therefore, the proposed ECPT method was demonstrated to be a feasible and effective nondestructive approach for quantitative crack detection.
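
    The slope-curve idea can be sketched on synthetic data: differentiate a temperature trajectory taken across a crack and read the crack boundaries off the slope extrema. The smooth tanh-edged profile below is an assumed stand-in for a real ECPT thermogram, not measured data.

      import numpy as np

      x = np.linspace(0.0, 10.0, 1001)        # position along trajectory, mm
      left, right = 4.0, 4.6                  # "true" crack edges, 0.6 mm apart
      # synthetic surface temperature: elevated plateau over the crack
      T = 20.0 + 2.5 * (np.tanh((x - left) / 0.05) - np.tanh((x - right) / 0.05))

      slope = np.gradient(T, x)               # slope curve along the trajectory
      edge_l = x[np.argmax(slope)]            # steepest rise -> left boundary
      edge_r = x[np.argmin(slope)]            # steepest fall -> right boundary
      print(f"estimated crack width: {edge_r - edge_l:.3f} mm (true 0.600 mm)")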

  17. Integrated and spectral energetics of the GLAS general circulation model

    NASA Technical Reports Server (NTRS)

    Tenenbaum, J.

    1982-01-01

    Integrated and spectral error energetics of the GLAS general circulation model are compared with observations for periods in January 1975, 1976, and 1977. For two cases the model shows significant skill in predicting integrated energetics quantities out to two weeks, and for all three cases, the integrated monthly mean energetics show qualitative improvements over previous versions of the model in eddy kinetic energy and barotropic conversions. Fundamental difficulties remain with leakage of energy to the stratospheric level, particularly above strong initial jet streams associated in part with regions of steep terrain. The spectral error growth study represents the first comparison of general circulation model spectral energetics predictions with the corresponding observational spectra on a day-by-day basis. The major conclusion is that eddy kinetic energy can be correct while significant errors occur in the kinetic energy of wavenumber 3. Both the model and observations show evidence of single wavenumber dominance in eddy kinetic energy and the correlation of spectral kinetic and potential energy.

  18. Analytical and Methodological Issues in the Use of Qualitative Data Analysis Software: A Description of Three Studies.

    ERIC Educational Resources Information Center

    Margerum-Leys, Jon; Kupperman, Jeff; Boyle-Heimann, Kristen

    This paper presents perspectives on the use of data analysis software in the process of qualitative research. These perspectives were gained in the conduct of three qualitative research studies that differed in theoretical frames, areas of interests, and scope. Their common use of a particular data analysis software package allows the exploration…

  19. Distortion Representation of Forecast Errors for Model Skill Assessment and Objective Analysis. Revision 1.12

    NASA Technical Reports Server (NTRS)

    Hoffman, Ross N.; Nehrkorn, Thomas; Grassotti, Christopher

    1997-01-01

    We proposed a novel characterization of errors for numerical weather predictions. In its simplest form we decompose the error into a part attributable to phase errors and a remainder. The phase error is represented in the same fashion as a velocity field and is required to vary slowly and smoothly with position. A general distortion representation allows for the displacement and amplification or bias correction of forecast anomalies. Characterizing and decomposing forecast error in this way has two important applications, which we term the assessment application and the objective analysis application. For the assessment application, our approach results in new objective measures of forecast skill which are more in line with subjective measures of forecast skill and which are useful in validating models and diagnosing their shortcomings. With regard to the objective analysis application, meteorological analysis schemes balance forecast error and observational error to obtain an optimal analysis. Presently, representations of the error covariance matrix used to measure the forecast error are severely limited. For the objective analysis application our approach will improve analyses by providing a more realistic measure of the forecast error. We expect, a priori, that our approach should greatly improve the utility of remotely sensed data which have relatively high horizontal resolution, but which are indirectly related to the conventional atmospheric variables. In this project, we are initially focusing on the assessment application, restricted to a realistic but univariate 2-dimensional situation. Specifically, we study the forecast errors of the sea level pressure (SLP) and 500 hPa geopotential height fields for forecasts of the short and medium range. Since the forecasts are generated by the GEOS (Goddard Earth Observing System) data assimilation system with and without ERS 1 scatterometer data, these preliminary studies serve several purposes. They (1) provide a testbed for the use of the distortion representation of forecast errors, (2) act as one means of validating the GEOS data assimilation system and (3) help to describe the impact of the ERS 1 scatterometer data.

  20. An Analysis of a Finite Element Method for Convection-Diffusion Problems. Part II. A Posteriori Error Estimates and Adaptivity.

    DTIC Science & Technology

    1983-03-01

    An Analysis of a Finite Element Method for Convection-Diffusion Problems. Part II: A Posteriori Error Estimates and Adaptivity, by W. G. Szymczak and I. Babuška.

  1. Systematic text condensation: a strategy for qualitative analysis.

    PubMed

    Malterud, Kirsti

    2012-12-01

    To present background, principles, and procedures for a strategy for qualitative analysis called systematic text condensation and discuss this approach compared with related strategies. Giorgi's psychological phenomenological analysis is the point of departure and inspiration for systematic text condensation. The basic elements of Giorgi's method and the elaboration of these in systematic text condensation are presented, followed by a detailed description of procedures for analysis according to systematic text condensation. Finally, similarities and differences compared with other frequently applied methods for qualitative analysis are identified, as the foundation of a discussion of strengths and limitations of systematic text condensation. Systematic text condensation is a descriptive and explorative method for thematic cross-case analysis of different types of qualitative data, such as interview studies, observational studies, and analysis of written texts. The method represents a pragmatic approach, although inspired by phenomenological ideas, and various theoretical frameworks can be applied. The procedure consists of the following steps: 1) total impression - from chaos to themes; 2) identifying and sorting meaning units - from themes to codes; 3) condensation - from code to meaning; 4) synthesizing - from condensation to descriptions and concepts. Similarities and differences comparing systematic text condensation with other frequently applied qualitative methods regarding thematic analysis, theoretical methodological framework, analysis procedures, and taxonomy are discussed. Systematic text condensation is a strategy for analysis developed from traditions shared by most of the methods for analysis of qualitative data. The method offers the novice researcher a process of intersubjectivity, reflexivity, and feasibility, while maintaining a responsible level of methodological rigour.

  2. Failure analysis and modeling of a multicomputer system. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Subramani, Sujatha Srinivasan

    1990-01-01

    This thesis describes the results of an extensive measurement-based analysis of real error data collected from a 7-machine DEC VaxCluster multicomputer system. In addition to evaluating basic system error and failure characteristics, we develop reward models to analyze the impact of failures and errors on the system. The results show that, although 98 percent of errors in the shared resources recover, they result in 48 percent of all system failures. The analysis of rewards shows that the expected reward rate for the VaxCluster decreases to 0.5 in 100 days for a 3-out-of-7 model, which is well over 100 times that for a 7-out-of-7 model. A comparison of the reward rates for a range of k-out-of-n models indicates that the maximum increase in reward rate (0.25) occurs in going from the 6-out-of-7 model to the 5-out-of-7 model. The analysis also shows that software errors have the lowest reward (0.2 vs. 0.91 for network errors). The large loss in reward rate for software errors is due to the fact that a large proportion (94 percent) of software errors lead to failure. In comparison, the high reward rate for network errors is due to fast recovery from a majority of these errors (median recovery duration is 0 seconds).
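
    A minimal sketch of the k-out-of-n comparison, assuming independent machines with an illustrative availability p (the thesis derives its reward rates from measured error and recovery data instead):

      # Reward rate of a k-out-of-n system: probability that >= k of n
      # machines are up, each independently available with probability p.
      from math import comb

      def reward_rate(n, k, p):
          return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

      p = 0.90                                  # assumed machine availability
      for k in range(7, 0, -1):
          print(f"{k}-out-of-7: reward rate = {reward_rate(7, k, p):.4f}")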

  3. Simultaneous Control of Error Rates in fMRI Data Analysis

    PubMed Central

    Kang, Hakmook; Blume, Jeffrey; Ombao, Hernando; Badre, David

    2015-01-01

    The key idea of statistical hypothesis testing is to fix, and thereby control, the Type I error (false positive) rate across samples of any size. Multiple comparisons inflate the global (family-wise) Type I error rate and the traditional solution to maintaining control of the error rate is to increase the local (comparison-wise) Type II error (false negative) rates. However, in the analysis of human brain imaging data, the number of comparisons is so large that this solution breaks down: the local Type II error rate ends up being so large that scientifically meaningful analysis is precluded. Here we propose a novel solution to this problem: allow the Type I error rate to converge to zero along with the Type II error rate. It works because when the Type I error rate per comparison is very small, the accumulated (global) Type I error rate is also small. This solution is achieved by employing the Likelihood paradigm, which uses likelihood ratios to measure the strength of evidence on a voxel-by-voxel basis. In this paper, we provide theoretical and empirical justification for a likelihood approach to the analysis of human brain imaging data. In addition, we present extensive simulations that show the likelihood approach is viable, leading to ‘cleaner’ looking brain maps and operational superiority (lower average error rates). Finally, we include a case study on cognitive control related activation in the prefrontal cortex of the human brain. PMID:26272730
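
    A small simulation sketches the likelihood paradigm on synthetic "voxels": evidence is measured by a likelihood ratio at each voxel, and a voxel is flagged when the ratio exceeds a benchmark k. The effect size, noise level, and threshold below are illustrative assumptions, not the paper's settings.

      import numpy as np

      rng = np.random.default_rng(0)
      n_vox, n_scans, delta, sigma = 10_000, 30, 0.5, 1.0
      active = rng.random(n_vox) < 0.02            # 2% truly active voxels
      data = rng.normal(0.0, sigma, (n_vox, n_scans))
      data[active] += delta

      xbar = data.mean(axis=1)
      # LR of H1 (mu = delta) vs H0 (mu = 0), Normal data with known sigma:
      lr = np.exp(n_scans / (2 * sigma**2) * (xbar**2 - (xbar - delta) ** 2))

      flagged = lr >= 32                           # strong-evidence benchmark
      print("per-voxel Type I:", np.mean(flagged[~active]))
      print("per-voxel Type II:", np.mean(~flagged[active]))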

  4. Methods of automatic nucleotide-sequence analysis. Multicomponent spectrophotometric analysis of mixtures of nucleic acid components by a least-squares procedure

    PubMed Central

    Lee, Sheila; McMullen, D.; Brown, G. L.; Stokes, A. R.

    1965-01-01

    1. A theoretical analysis of the errors in multicomponent spectrophotometric analysis of nucleoside mixtures, by a least-squares procedure, has been made to obtain an expression for the error coefficient, relating the error in calculated concentration to the error in extinction measurements. 2. The error coefficients, which depend only on the 'library' of spectra used to fit the experimental curves, have been computed for a number of 'libraries' containing the following nucleosides found in s-RNA: adenosine, guanosine, cytidine, uridine, 5-ribosyluracil, 7-methylguanosine, 6-dimethylaminopurine riboside, 6-methylaminopurine riboside and thymine riboside. 3. The error coefficients have been used to determine the best conditions for maximum accuracy in the determination of the compositions of nucleoside mixtures. 4. Experimental determinations of the compositions of nucleoside mixtures have been made and the errors found to be consistent with those predicted by the theoretical analysis. 5. It has been demonstrated that, with certain precautions, the multicomponent spectrophotometric method described is suitable as a basis for automatic nucleotide-composition analysis of oligonucleotides containing nine nucleotides. Used in conjunction with continuous chromatography and flow chemical techniques, this method can be applied to the study of the sequence of s-RNA. PMID:14346087
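
    A minimal numerical sketch of this least-squares procedure, with a random synthetic library standing in for the nucleoside spectra: the error coefficients fall out of the diagonal of (A^T A)^-1, which depends only on the library, as the abstract emphasizes.

      import numpy as np

      rng = np.random.default_rng(1)
      A = rng.random((40, 3))                      # library spectra (columns)
      c_true = np.array([0.5, 0.3, 0.2])           # true concentrations
      E = A @ c_true + rng.normal(0.0, 0.005, 40)  # measured extinctions

      c_hat, *_ = np.linalg.lstsq(A, E, rcond=None)
      # With extinction noise sigma_E, the standard error of component j is
      # sigma_E * sqrt([(A^T A)^-1]_jj), a property of the library alone.
      err_coeff = np.sqrt(np.diag(np.linalg.inv(A.T @ A)))
      print("estimated c:", np.round(c_hat, 4))
      print("error coefficients:", np.round(err_coeff, 4))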

  5. Automatic Estimation of Verified Floating-Point Round-Off Errors via Static Analysis

    NASA Technical Reports Server (NTRS)

    Moscato, Mariano; Titolo, Laura; Dutle, Aaron; Munoz, Cesar A.

    2017-01-01

    This paper introduces a static analysis technique for computing formally verified round-off error bounds of floating-point functional expressions. The technique is based on a denotational semantics that computes a symbolic estimation of floating-point round-off errors along with a proof certificate that ensures its correctness. The symbolic estimation can be evaluated on concrete inputs using rigorous enclosure methods to produce formally verified numerical error bounds. The proposed technique is implemented in the prototype research tool PRECiSA (Program Round-off Error Certifier via Static Analysis) and used in the verification of floating-point programs of interest to NASA.
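
    PRECiSA's denotational semantics are not reproduced here, but the flavor of static round-off estimation can be sketched with the standard model fl(x op y) = (x op y)(1 + d), |d| <= u, propagating a first-order worst-case bound through each operation:

      import numpy as np

      U = np.finfo(np.float64).eps / 2        # unit round-off for binary64

      class Bounded:
          """A real value paired with an absolute round-off error bound."""
          def __init__(self, val, err=0.0):
              self.val, self.err = val, err

          def __add__(self, other):
              v = self.val + other.val        # incoming errors add, plus u|v|
              return Bounded(v, self.err + other.err + U * abs(v))

          def __mul__(self, other):
              v = self.val * other.val        # first-order propagation
              e = abs(self.val) * other.err + abs(other.val) * self.err
              return Bounded(v, e + U * abs(v))

      x, y, z = Bounded(0.1), Bounded(3.0), Bounded(-0.2)
      r = x * y + z                           # bound the round-off of x*y + z
      print(f"value ~ {r.val}, |round-off| <= {r.err:.3e}")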

  6. Comparative study of contrast-enhanced ultrasound qualitative and quantitative analysis for identifying benign and malignant breast tumor lumps.

    PubMed

    Liu, Jian; Gao, Yun-Hua; Li, Ding-Dong; Gao, Yan-Chun; Hou, Ling-Mi; Xie, Ting

    2014-01-01

    To compare the value of contrast-enhanced ultrasound (CEUS) qualitative and quantitative analysis in the identification of breast tumor lumps. Qualitative and quantitative indicators of CEUS for 73 cases of breast tumor lumps were retrospectively analyzed by univariate and multivariate approaches. Logistic regression was applied and ROC curves were drawn for evaluation and comparison. The CEUS qualitative indicator-generated regression equation contained three indicators, namely enhanced homogeneity, diameter line expansion and peak intensity grading, which demonstrated a prediction accuracy for benign and malignant breast tumor lumps of 91.8%; the quantitative indicator-generated regression equation contained only one indicator, namely the relative peak intensity, and its prediction accuracy was 61.5%. The corresponding areas under the ROC curve for qualitative and quantitative analyses were 91.3% and 75.7%, respectively, which exhibited a statistically significant difference by the Z test (P<0.05). The ability of CEUS qualitative analysis to identify breast tumor lumps is better than that of quantitative analysis.
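
    The comparison can be sketched with simulated data: fit logistic regressions on a three-indicator "qualitative" feature set and a one-indicator "quantitative" one, then compare ROC areas. The feature names follow the abstract, but all data and separations below are synthetic assumptions.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(2)
      n = 200
      y = (rng.random(n) < 0.4).astype(int)        # 1 = malignant (simulated)

      # homogeneity, diameter-line expansion, peak intensity grading
      Xq = rng.normal(0.0, 1.0, (n, 3)) + 1.2 * y[:, None]
      # relative peak intensity (assumed weaker separation)
      Xr = rng.normal(0.0, 1.0, (n, 1)) + 0.4 * y[:, None]

      for name, X in [("qualitative", Xq), ("quantitative", Xr)]:
          m = LogisticRegression().fit(X, y)
          print(name, "AUC =", round(roc_auc_score(y, m.predict_proba(X)[:, 1]), 3))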

  7. Clinical risk management in mental health: a qualitative study of main risks and related organizational management practices

    PubMed Central

    2013-01-01

    Background A scientific understanding of clinical risk management (CRM) in mental health care is essential for building safer health systems and for improving patient safety. While evidence on patient safety and CRM in physical health care has increased, there is limited research on these issues in mental health care. This qualitative study provides an overview of the most important clinical risks in mental health and related organizational management practices. Methods We conducted in-depth expert interviews with professionals responsible for CRM in psychiatric hospitals. Interviews were transcribed and analyzed applying qualitative content analysis to thematically sort the identified risks. Results The main concerns for CRM in mental health are a) violence and self-destructive behavior (i.e. protecting patients and staff from other patients, and patients from themselves), b) treatment errors, especially in the process of therapy, and c) risks associated with mental illnesses (e.g. psychosis or depression). This study identified critical differences from CRM in hospitals for physical disorders and challenges specific to CRM in mental health. Firstly, many psychiatric patients do not believe that they are ill and are therefore in hospital against their will. Secondly, staff safety is a much more prominent theme for CRM in mental health care as it is directly related to the specifics of mental illnesses. Conclusions The current study contributes to the understanding of patient safety and raises awareness for CRM in mental health. The mental health specific overview of central risks and related organizational management practices offers a valuable basis for CRM development in mental health and an addition to CRM in general. PMID:23379842

  8. Spurious Solutions Of Nonlinear Differential Equations

    NASA Technical Reports Server (NTRS)

    Yee, H. C.; Sweby, P. K.; Griffiths, D. F.

    1992-01-01

    Report utilizes nonlinear-dynamics approach to investigate possible sources of errors and slow convergence and non-convergence of steady-state numerical solutions when using time-dependent approach for problems containing nonlinear source terms. Emphasizes implications for development of algorithms in CFD and computational sciences in general. Main fundamental conclusion of study is that qualitative features of nonlinear differential equations cannot be adequately represented by finite-difference methods and vice versa.
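
    The phenomenon is easy to reproduce: explicit Euler applied to the logistic equation u' = u(1 - u) reaches the true steady state u = 1 for small time steps, but settles into a spurious period-2 orbit once dt exceeds the linearized stability limit dt = 2. The step sizes below are illustrative.

      import numpy as np

      def euler_tail(dt, u0=0.1, n=2000, tail=4):
          """Last few iterates of u_{k+1} = u_k + dt*u_k*(1 - u_k)."""
          u, out = u0, []
          for k in range(n):
              u = u + dt * u * (1.0 - u)
              if k >= n - tail:
                  out.append(u)
          return np.round(out, 4)

      print("dt = 0.5:", euler_tail(0.5))   # converges to the true root u = 1
      print("dt = 2.2:", euler_tail(2.2))   # spurious period-2 numerical orbit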

  9. Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS)

    NASA Technical Reports Server (NTRS)

    Alexander, Tiffaney Miller

    2017-01-01

    Research results have shown that more than half of aviation, aerospace and aeronautics mishaps/incidents are attributed to human error. As a part of Safety within space exploration ground processing operations, the underlying contributors and causes of human error must be identified and classified in order to manage human error. This research provides a framework and methodology using the Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS), as an analysis tool to identify contributing factors, their impact on human error events, and predict the Human Error probabilities (HEPs) of future occurrences. This research methodology was applied (retrospectively) to six (6) NASA ground processing operations scenarios and thirty (30) years of Launch Vehicle related mishap data. This modifiable framework can be used and followed by other space and similar complex operations.

  10. Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS)

    NASA Technical Reports Server (NTRS)

    Alexander, Tiffaney Miller

    2017-01-01

    Research results have shown that more than half of aviation, aerospace and aeronautics mishaps/incidents are attributed to human error. As a part of Quality within space exploration ground processing operations, the underlying contributors and causes of human error must be identified and classified in order to manage human error. This presentation provides a framework and methodology using the Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS), as an analysis tool to identify contributing factors, their impact on human error events, and predict the Human Error probabilities (HEPs) of future occurrences. This research methodology was applied (retrospectively) to six (6) NASA ground processing operations scenarios and thirty (30) years of Launch Vehicle related mishap data. This modifiable framework can be used and followed by other space and similar complex operations.

  11. Error Analysis and Validation for InSAR Height Measurement Induced by Slant Range

    NASA Astrophysics Data System (ADS)

    Zhang, X.; Li, T.; Fan, W.; Geng, X.

    2018-04-01

    InSAR technique is an important method for large-area DEM extraction. Several factors have significant influence on the accuracy of height measurement. In this research, the effect of slant range error on InSAR height measurement is analyzed and discussed. Based on the theory of InSAR height measurement, the error propagation model was derived assuming no coupling among different factors; it directly characterises the relationship between slant range error and height measurement error. A theory-based analysis in combination with TanDEM-X parameters was then implemented to quantitatively evaluate the influence of slant range error on height measurement. In addition, a simulation validation of the InSAR error model induced by slant range was performed on the basis of the SRTM DEM and TanDEM-X parameters. The spatial distribution characteristics and error propagation rule of InSAR height measurement were further discussed and evaluated.
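
    A heavily simplified sketch of such propagation, assuming the side-looking geometry h = H - R*cos(theta) with the look angle held fixed, so that the height error is |dh/dR| times the slant-range error. The altitude and look angle are TanDEM-X-like orders of magnitude, not mission values.

      import numpy as np

      H = 514e3                      # platform altitude, m (assumed)
      theta = np.deg2rad(35.0)       # look angle (assumed)

      dh_dR = -np.cos(theta)         # sensitivity of height to slant range
      sigma_R = 1.0                  # slant-range error, m
      sigma_h = abs(dh_dR) * sigma_R
      print(f"height error per 1 m of slant-range error: {sigma_h:.3f} m")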

  12. Tolerance analysis of optical telescopes using coherent addition of wavefront errors

    NASA Technical Reports Server (NTRS)

    Davenport, J. W.

    1982-01-01

    A near diffraction-limited telescope requires that tolerance analysis be done on the basis of system wavefront error. One method of analyzing the wavefront error is to represent the wavefront error function in terms of its Zernike polynomial expansion. A Ramsey-Korsch ray trace package, a computer program that simulates the tracing of rays through an optical telescope system, was expanded to include the Zernike polynomial expansion up through the fifth-order spherical term. An option to produce a 3-dimensional plot of the wavefront error function was also included in the Ramsey-Korsch package. Several simulation runs were analyzed to determine the particular set of coefficients in the Zernike expansion that are affected by various errors such as tilt, decenter and despace. A 3-dimensional plot of each error up through the fifth-order spherical term was also included in the study. Tolerance analysis data are presented.
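
    A minimal sketch of the fitting step, assuming unnormalized textbook Zernike terms (piston, tilts, defocus, primary spherical) least-squares fitted to a synthetic wavefront map on the unit pupil; it illustrates the expansion idea only and is not the Ramsey-Korsch package.

      import numpy as np

      n = 101
      y, x = np.mgrid[-1:1:n*1j, -1:1:n*1j]
      r, t = np.hypot(x, y), np.arctan2(y, x)
      mask = r <= 1.0                               # unit pupil

      basis = np.stack([np.ones_like(r),            # piston
                        r * np.cos(t),              # x tilt
                        r * np.sin(t),              # y tilt
                        2 * r**2 - 1,               # defocus
                        6 * r**4 - 6 * r**2 + 1,    # primary spherical
                        ], axis=-1)[mask]

      rng = np.random.default_rng(3)                # synthetic wavefront, waves
      w = basis @ np.array([0.0, 0.05, -0.02, 0.10, 0.03])
      w += rng.normal(0.0, 0.005, w.shape)

      coeffs, *_ = np.linalg.lstsq(basis, w, rcond=None)
      print("recovered Zernike coefficients:", np.round(coeffs, 4))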

  13. Speech recognition training for enhancing written language generation by a traumatic brain injury survivor.

    PubMed

    Manasse, N J; Hux, K; Rankin-Erickson, J L

    2000-11-01

    Impairments in motor functioning, language processing, and cognitive status may impact the written language performance of traumatic brain injury (TBI) survivors. One strategy to minimize the impact of these impairments is to use a speech recognition system. The purpose of this study was to explore the effect of mild dysarthria and mild cognitive-communication deficits secondary to TBI on a 19-year-old survivor's mastery and use of such a system, specifically Dragon NaturallySpeaking. Data included the percentage of the participant's words accurately perceived by the system over time, the participant's accuracy over time in using commands for navigation and error correction, and quantitative and qualitative changes in the participant's written texts generated with and without the use of the speech recognition system. Results showed that Dragon NaturallySpeaking was approximately 80% accurate in perceiving words spoken by the participant, and the participant quickly and easily mastered all navigation and error correction commands presented. Quantitatively, the participant produced a greater amount of text using traditional word processing and a standard keyboard than using the speech recognition system. Minimal qualitative differences appeared between writing samples. Discussion of factors that may have contributed to the obtained results and that may affect the generalization of the findings to other TBI survivors is provided.

  14. Error Pattern Analysis Applied to Technical Writing: An Editor's Guide for Writers.

    ERIC Educational Resources Information Center

    Monagle, E. Brette

    The use of error pattern analysis can reduce the time and money spent on editing and correcting manuscripts. What is required is noting, classifying, and keeping a frequency count of errors. First an editor should take a typical page of writing and circle each error. After the editor has done a sufficiently large number of pages to identify an…

  15. A Study of Reading Errors Using Goodman's Miscue Analysis and Cloze Procedure.

    ERIC Educational Resources Information Center

    Farren, Sean N.

    A study of 11 boys, aged 12 to 14 with low reading ability, was conducted to discover what kinds of errors they made and whether or not differences might exist between error patterns in silent and oral reading. Miscue analysis was used to test oral reading while cloze procedures were used to test silent reading. Errors were categorized according…

  16. Some Deep Structure Manifestations in Second Language Errors of English Voiced and Voiceless "th."

    ERIC Educational Resources Information Center

    Moustafa, Margaret Heiss

    Native speakers of Egyptian Arabic make errors in their pronunciation of English that cannot always be accounted for by a contrastive analysis of Egyptian Arabic and English. This study focuses on three types of errors in the pronunciation of voiced and voiceless "th" made by fluent speakers of English. These errors were noted…

  17. Analyzing human errors in flight mission operations

    NASA Technical Reports Server (NTRS)

    Bruno, Kristin J.; Welz, Linda L.; Barnes, G. Michael; Sherif, Josef

    1993-01-01

    A long-term program is in progress at JPL to reduce cost and risk of flight mission operations through a defect prevention/error management program. The main thrust of this program is to create an environment in which the performance of the total system, both the human operator and the computer system, is optimized. To this end, 1580 Incident Surprise Anomaly reports (ISA's) from 1977-1991 were analyzed from the Voyager and Magellan projects. A Pareto analysis revealed that 38 percent of the errors were classified as human errors. A preliminary cluster analysis based on the Magellan human errors (204 ISA's) is presented here. The resulting clusters described the underlying relationships among the ISA's. Initial models of human error in flight mission operations are presented. Next, the Voyager ISA's will be scored and included in the analysis. Eventually, these relationships will be used to derive a theoretically motivated and empirically validated model of human error in flight mission operations. Ultimately, this analysis will be used to make continuous process improvements to end-user applications and training requirements. This Total Quality Management approach will enable the management and prevention of errors in the future.

  18. Data Analysis & Statistical Methods for Command File Errors

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila; Waggoner, Bruce; Bryant, Larry

    2014-01-01

    This paper explains current work on modeling for managing the risk of command file errors. It is focused on analyzing actual data from a JPL spaceflight mission to build models for evaluating and predicting error rates as a function of several key variables. We constructed a rich dataset by considering the number of errors and the number of files radiated, including the number of commands and blocks in each file, as well as subjective estimates of workload and operational novelty. We have assessed these data using different curve-fitting and distribution-fitting techniques, such as multiple regression analysis and maximum likelihood estimation, to see how much of the variability in the error rates can be explained by these. We have also used goodness-of-fit testing strategies and principal component analysis to further assess our data. Finally, we constructed a model of expected error rates based on what these statistics identified as the critical drivers of the error rate. This model allows project management to evaluate the error rate against a theoretically expected rate as well as anticipate future error rates.
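
    A minimal sketch of the regression step on synthetic data: the covariates mirror those named above (files radiated, commands, workload, novelty), but the data, the linear form, and the coefficients are illustrative assumptions, not the mission's.

      import numpy as np

      rng = np.random.default_rng(4)
      n = 120
      files = rng.poisson(20, n)               # files radiated per period
      commands = rng.poisson(200, n)           # commands across those files
      workload = rng.uniform(0, 1, n)          # subjective workload score
      novelty = rng.uniform(0, 1, n)           # operational novelty score
      rate = 0.01 * files + 0.002 * commands + 1.5 * workload + 2.0 * novelty
      errors = rng.poisson(rate)               # observed error counts

      X = np.column_stack([np.ones(n), files, commands, workload, novelty])
      beta, *_ = np.linalg.lstsq(X, errors, rcond=None)
      pred = X @ beta
      r2 = 1 - ((errors - pred) ** 2).sum() / ((errors - errors.mean()) ** 2).sum()
      print("coefficients:", np.round(beta, 4), " R^2 =", round(r2, 3))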

  19. TU-AB-202-12: A Novel Method to Map Endoscopic Video to CT for Treatment Planning and Toxicity Analysis in Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ingram, W; Yang, J; Beadle, B

    Purpose: Endoscopic examinations are routine procedures for head-and-neck cancer patients. Our goal is to develop a method to map the recorded video to CT, providing valuable information for radiotherapy treatment planning and toxicity analysis. Methods: We map video frames to CT via virtual endoscopic images rendered at the real endoscope’s CT-space coordinates. We developed two complementary methods to find these coordinates by maximizing real-to-virtual image similarity: (1) Endoscope Tracking: moves the virtual endoscope frame-by-frame until the desired frame is reached. Utilizes prior knowledge of endoscope coordinates, but sensitive to local optima. (2) Location Search: moves the virtual endoscope along possible paths through the volume to find the desired frame. More robust, but more computationally expensive. We tested these methods on clay phantoms with embedded markers for point mapping and protruding bolus material for contour mapping, and we assessed them qualitatively on three patient exams. For mapped points we calculated 3D-distance errors, and for mapped contours we calculated mean absolute distances (MAD) from CT contours. Results: In phantoms, Endoscope Tracking had average point error=0.66±0.50cm and average bolus MAD=0.74±0.37cm for the first 80% of each video. After that the virtual endoscope got lost, increasing these values to 4.73±1.69cm and 4.06±0.30cm. Location Search had point error=0.49±0.44cm and MAD=0.53±0.28cm. Point errors were larger where the endoscope viewed the surface at shallow angles (<10 degrees): 1.38±0.62cm and 1.22±0.69cm for Endoscope Tracking and Location Search, respectively. In patients, Endoscope Tracking did not make it past the nasal cavity. However, Location Search found coordinates near the correct location for 70% of test frames. Its performance was best near the epiglottis and in the nasal cavity. Conclusion: Location Search is a robust and accurate technique to map endoscopic video to CT. Endoscope Tracking is sensitive to erratic camera motion and local optima, but could be used in conjunction with anchor points found using Location Search.

  1. Qualitative content analysis in nursing research: concepts, procedures and measures to achieve trustworthiness.

    PubMed

    Graneheim, U H; Lundman, B

    2004-02-01

    Qualitative content analysis as described in published literature shows conflicting opinions and unsolved issues regarding meaning and use of concepts, procedures and interpretation. This paper provides an overview of important concepts (manifest and latent content, unit of analysis, meaning unit, condensation, abstraction, content area, code, category and theme) related to qualitative content analysis; illustrates the use of concepts related to the research procedure; and proposes measures to achieve trustworthiness (credibility, dependability and transferability) throughout the steps of the research procedure. Interpretation in qualitative content analysis is discussed in light of Watzlawick et al.'s [Pragmatics of Human Communication. A Study of Interactional Patterns, Pathologies and Paradoxes. W.W. Norton & Company, New York, London] theory of communication.

  2. "Hopefully This Will All Make Sense at Some Point": Meaning and Performance in Illness Blogs.

    PubMed

    Heilferty, Catherine McGeehin

    To analyze the narratives of illness blogs created by parents of children with cancer. The profound effects of the childhood cancer experience on family members and the turn to the Internet by parents for help in the process are gaining research attention. The qualitative study design involved secondary narrative analysis of 14 illness blogs: 9 by the parents of children with neuroblastoma and 5 by the parents of children with leukemia. Daily blog entries were analyzed as individual units of illness experience expression and in relation to one another to identify thematic and linguistic similarities. The initial analysis of these illness blogs resulted in identification of the quest for balance as a primary theme. Narratives in parents' childhood cancer illness blogs illustrated themes of performance. During this initial analysis, however, the author repeatedly asked, "Why are they writing this? And why publish this?" A second analysis of the data answered these questions of why parents blog about the experience. Narrative analysis resulted in the discovery of 6 main reasons that parents wrote and published the childhood cancer experience online: to report, explain, express, reflect, archive, and advocate. The analysis suggests that incorporation of parent writing may improve family-provider communication, enhance the family-health care professional relationship, enhance safety by preventing medical errors, improve reporting of clinical trial data such as adverse events, and improve satisfaction.

  3. General model for the pointing error analysis of Risley-prism system based on ray direction deviation in light refraction

    NASA Astrophysics Data System (ADS)

    Zhang, Hao; Yuan, Yan; Su, Lijuan; Huang, Fengzhen; Bai, Qing

    2016-09-01

    The Risley-prism-based light beam steering apparatus delivers superior pointing accuracy and is used in imaging LIDAR and imaging microscopes. A general model for pointing error analysis of Risley prisms is proposed in this paper, based on ray direction deviation in light refraction. This model captures incident beam deviation, assembly deflections, and prism rotational error. We first derive the transmission matrices of the model. Then, the independent and cumulative effects of different errors are analyzed through this model. An accuracy study of the model shows that the prediction deviation of pointing error for each error source is less than 4.1×10^-5° when the error amplitude is 0.1°. Detailed analyses indicate that different error sources affect the pointing accuracy to varying degrees, and the major error source is the incident beam deviation. Prism tilt has a relatively large effect on the pointing accuracy when the prism tilts in the principal section. The cumulative effect analyses of multiple errors show that the pointing error can be reduced by tuning the bearing tilt in the same direction. The cumulative effect of rotational error is relatively large when the difference between the two prism rotation angles equals 0 or π, and relatively small when the difference equals π/2. These results suggest that our analysis can help to uncover the error distribution and aid in measurement calibration of Risley-prism systems.
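
    A first-order (thin-prism, small-angle) sketch conveys how a rotational error moves the pointing: each prism deviates the beam by delta = (n - 1)*alpha along its rotation angle, and the pointing is the vector sum. This toy model is far simpler than the paper's exact refraction-based transmission matrices; the glass index and wedge angle are assumptions.

      import numpy as np

      n_glass, alpha = 1.517, np.deg2rad(10.0)   # index and wedge angle (assumed)
      delta = (n_glass - 1.0) * alpha            # single-prism deviation

      def pointing(th1, th2):
          """Small-angle 2D pointing for prism rotation angles th1, th2."""
          return delta * np.array([np.cos(th1) + np.cos(th2),
                                   np.sin(th1) + np.sin(th2)])

      th1, th2 = np.deg2rad(30.0), np.deg2rad(120.0)
      err = np.deg2rad(0.1)                      # rotational error on prism 1
      shift = pointing(th1 + err, th2) - pointing(th1, th2)
      print("pointing shift (deg):", np.rad2deg(shift))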

  4. Progressive statistics for studies in sports medicine and exercise science.

    PubMed

    Hopkins, William G; Marshall, Stephen W; Batterham, Alan M; Hanin, Juri

    2009-01-01

    Statistical guidelines and expert statements are now available to assist in the analysis and reporting of studies in some biomedical disciplines. We present here a more progressive resource for sample-based studies, meta-analyses, and case studies in sports medicine and exercise science. We offer forthright advice on the following controversial or novel issues: using precision of estimation for inferences about population effects in preference to null-hypothesis testing, which is inadequate for assessing clinical or practical importance; justifying sample size via acceptable precision or confidence for clinical decisions rather than via adequate power for statistical significance; showing SD rather than SEM, to better communicate the magnitude of differences in means and nonuniformity of error; avoiding purely nonparametric analyses, which cannot provide inferences about magnitude and are unnecessary; using regression statistics in validity studies, in preference to the impractical and biased limits of agreement; making greater use of qualitative methods to enrich sample-based quantitative projects; and seeking ethics approval for public access to the depersonalized raw data of a study, to address the need for more scrutiny of research and better meta-analyses. Advice on less contentious issues includes the following: using covariates in linear models to adjust for confounders, to account for individual differences, and to identify potential mechanisms of an effect; using log transformation to deal with nonuniformity of effects and error; identifying and deleting outliers; presenting descriptive, effect, and inferential statistics in appropriate formats; and contending with bias arising from problems with sampling, assignment, blinding, measurement error, and researchers' prejudices. This article should advance the field by stimulating debate, promoting innovative approaches, and serving as a useful checklist for authors, reviewers, and editors.

  5. MRI-Based Computed Tomography Metal Artifact Correction Method for Improving Proton Range Calculation Accuracy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Peter C.; Schreibmann, Eduard; Roper, Justin

    2015-03-15

    Purpose: Computed tomography (CT) artifacts can severely degrade dose calculation accuracy in proton therapy. Prompted by the recently increased popularity of magnetic resonance imaging (MRI) in the radiation therapy clinic, we developed an MRI-based CT artifact correction method for improving the accuracy of proton range calculations. Methods and Materials: The proposed method replaces corrupted CT data by mapping CT Hounsfield units (HU) from a nearby artifact-free slice, using a coregistered MRI. MRI and CT volumetric images were registered with use of 3-dimensional (3D) deformable image registration (DIR). The registration was fine-tuned on a slice-by-slice basis by using 2D DIR. Based on the intensity of paired MRI pixel values and HU from an artifact-free slice, we performed a comprehensive analysis to predict the correct HU for the corrupted region. For a proof-of-concept validation, metal artifacts were simulated on a reference data set. Proton range was calculated using reference, artifactual, and corrected images to quantify the reduction in proton range error. The correction method was applied to 4 unique clinical cases. Results: The correction method resulted in substantial artifact reduction, both quantitatively and qualitatively. On respective simulated brain and head and neck CT images, the mean error was reduced from 495 and 370 HU to 108 and 92 HU after correction. Correspondingly, the absolute mean proton range errors of 2.4 cm and 1.7 cm were reduced to less than 2 mm in both cases. Conclusions: Our MRI-based CT artifact correction method can improve CT image quality and proton range calculation accuracy for patients with severe CT artifacts.
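
    The replacement step can be sketched as a regression from MRI intensity to HU learned on an artifact-free slice and applied to the corrupted region. The linear fit and all data below are illustrative assumptions, not the paper's comprehensive analysis.

      import numpy as np

      rng = np.random.default_rng(5)
      # paired pixels from the artifact-free slice: MRI intensity and clean HU
      mri_clean = rng.uniform(0.0, 1.0, 5000)
      hu_clean = -200.0 + 400.0 * mri_clean + rng.normal(0.0, 15.0, 5000)

      a, b = np.polyfit(mri_clean, hu_clean, 1)   # fit HU ~ a*MRI + b

      # corrupted region: only the MRI values are trustworthy there
      mri_corrupt = rng.uniform(0.0, 1.0, 800)
      hu_replacement = a * mri_corrupt + b        # repainted HU values
      print(f"fitted mapping: HU = {a:.1f} * MRI + {b:.1f}")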

  6. [Health and socio-educational needs of the families and children with rare metabolic diseases: Qualitative study in a tertiary hospital].

    PubMed

    Tejada-Ortigosa, Eva María; Flores-Rojas, Katherine; Moreno-Quintana, Laura; Muñoz-Villanueva, María Carmen; Pérez-Navero, Juan Luis; Gil-Campos, Mercedes

    2018-05-28

    Rare diseases are a challenge for public health due to the lack of information on their magnitude. These include inborn errors of metabolism. The objective of this study was to assess the quality of life and the social, health, economic, and educational needs of a group of paediatric patients with inborn errors of metabolism attended to in a hospital. A questionnaire was developed based on needs and expectations, drawing mainly on the Andalusian Plan for Rare Diseases. An analysis was performed on the health, socioeconomic, and educational needs of 65 paediatric patients with inborn errors of metabolism. The respondents reported limited means to cover medication (61%), special diets (86%), and other health services (79%). Just under half of them (43%) believed that the quality of family life had been greatly reduced since the onset of the disease. The main caregiver was the mother in 61.5% of cases, compared to 1.5% of cases in which it was the father. The primary caregivers had to reduce their working hours or give up their job in 77% of cases. Multidisciplinary treatment is hindered by the inability of families to cope with its high cost, as well as by difficult access to these resources. In addition, there is a great impact on the quality of life of patients and their caregivers. Therefore, there is a need to evaluate the results of government health and socio-economic support plans for patients with rare diseases, and to provide a real response to their needs. Copyright © 2018. Published by Elsevier España, S.L.U.

  7. Building a simulation-based crisis resource management course for emergency medicine, phase 1: Results from an interdisciplinary needs assessment survey.

    PubMed

    Hicks, Christopher M; Bandiera, Glen W; Denny, Christopher J

    2008-11-01

    Emergency department (ED) resuscitation requires the coordinated efforts of an interdisciplinary team. Human errors are common and have a negative impact on patient safety. Although crisis resource management (CRM) skills are utilized in other clinical domains, most emergency medicine (EM) caregivers currently receive no formal CRM training. The objectives were to compile and compare attitudes toward CRM training among EM staff physicians, nurses, and residents at two Canadian academic teaching hospitals. Emergency physicians (EPs), residents, and nurses were asked to complete a Web survey that included Likert scales and short answer questions. Focus groups and pilot testing were used to inform survey development. Thematic content analysis was performed on the qualitative data set and compared to quantitative results. The response rate was 75.7% (N = 84). There was strong consensus regarding the importance of core CRM principles (i.e., effective communication, team leadership, resource utilization, problem-solving, situational awareness) in ED resuscitation. Problems with coordinating team actions (58.8%), communication (69.6%), and establishing priorities (41.3%) were among factors implicated in adverse events. Interdisciplinary collaboration (95.1%), efficiency of patient care (83.9%), and decreased medical error (82.6%) were proposed benefits of CRM training. Communication between disciplines is a barrier to effective ED resuscitation for 94.4% of nurses and 59.7% of EPs (p = 0.008). Residents reported a lack of exposure to (64.3%), yet had interest in (96.4%) formal CRM education using human patient simulation. Nurses rate communication as a barrier to teamwork more frequently than physicians. EM residents are keen to learn CRM skills. An opportunity exists to create a novel interdisciplinary CRM curriculum to improve EM team performance and mitigate human error.

  8. Spelling Errors of Dyslexic Children in Bosnian Language with Transparent Orthography

    ERIC Educational Resources Information Center

    Duranovic, Mirela

    2017-01-01

    The purpose of this study was to explore the nature of spelling errors made by children with dyslexia in Bosnian language with transparent orthography. Three main error categories were distinguished: phonological, orthographic, and grammatical errors. An analysis of error type showed 86% of phonological errors, 10% of orthographic errors, and 4%…

  9. Using integrated models to minimize environmentally induced wavefront error in optomechanical design and analysis

    NASA Astrophysics Data System (ADS)

    Genberg, Victor L.; Michels, Gregory J.

    2017-08-01

    The ultimate design goal of an optical system subjected to dynamic loads is to minimize system-level wavefront error (WFE). In random response analysis, system WFE is difficult to predict from finite element results due to the loss of phase information. In the past, the use of system WFE was limited by the difficulty of obtaining a linear optics model. In this paper, an automated method for determining system-level WFE using a linear optics model is presented. An error estimate is included in the analysis output based on fitting errors of mode shapes. The technique is demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.
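
    A minimal sketch of forming system WFE statistics through a linear optics model: if L maps structural modal responses to pupil WFE samples and Q is the modal response covariance from the random analysis, the pupil WFE covariance is L Q L^T. All matrices below are random placeholders for FE and optics outputs, not SigFit data.

      import numpy as np

      rng = np.random.default_rng(6)
      n_modes, n_pupil = 12, 400
      L = rng.normal(0.0, 1e-3, (n_pupil, n_modes))  # linear optics model
      Q = np.diag(rng.uniform(0.5, 2.0, n_modes))    # modal response covariance

      cov_w = L @ Q @ L.T                            # WFE covariance over pupil
      rms_wfe = np.sqrt(np.mean(np.diag(cov_w)))     # spatial-mean RMS WFE
      print(f"system RMS WFE ~ {rms_wfe:.4e} waves")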

  10. Evaluation of image quality and radiation dose by adaptive statistical iterative reconstruction technique level for chest CT examination.

    PubMed

    Hong, Sun Suk; Lee, Jong-Woong; Seo, Jeong Beom; Jung, Jae-Eun; Choi, Jiwon; Kweon, Dae Cheol

    2013-12-01

    The purpose of this research is to determine the adaptive statistical iterative reconstruction (ASIR) level that enables optimal image quality and dose reduction in the chest computed tomography (CT) protocol with ASIR. A chest phantom was scanned at ASIR levels of 0-50 %, and the noise power spectrum (NPS), signal and noise, peak signal-to-noise ratio (PSNR), and root-mean-square error (RMSE) were measured. In addition, the objectivity of the experiment was verified using the American College of Radiology (ACR) phantom. Moreover, on a qualitative basis, five lesions' resolution, latitude, and degree of distortion in the chest phantom were evaluated and their compiled statistics analyzed. The NPS value decreased as the frequency increased. The lowest noise and deviation were found at the 20 % ASIR level (mean 126.15 ± 22.21). In the distortion evaluation, the signal-to-noise ratio and PSNR were highest at the 20 % ASIR level, at 31.0 and 41.52, respectively, while the maximum absolute error and RMSE were lowest, at 11.2 and 16, respectively. In the ACR phantom study, all ASIR levels were within the acceptance limits of the guidelines. The 20 % ASIR level performed best in the qualitative evaluation of the five chest phantom lesions, with a resolution score of 4.3, latitude of 3.47, and degree of distortion of 4.25. The 20 % ASIR level proved best in all experiments: noise, distortion evaluation using ImageJ, and qualitative evaluation of the five chest phantom lesions. Therefore, optimal image quality and reduced radiation dose can be achieved when the 20 % ASIR level is applied in thoracic CT.
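
    For reference, the two distortion metrics used above can be computed as follows on synthetic arrays; the 8-bit peak value of 255 in the PSNR definition is an assumption.

      import numpy as np

      rng = np.random.default_rng(7)
      ref = rng.uniform(0, 255, (128, 128))            # reference image
      test = ref + rng.normal(0.0, 5.0, ref.shape)     # noisy reconstruction

      rmse = np.sqrt(np.mean((ref - test) ** 2))
      psnr = 20 * np.log10(255.0 / rmse)               # peak value 255 assumed
      print(f"RMSE = {rmse:.2f}, PSNR = {psnr:.2f} dB")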

  11. [Character of refractive errors in population study performed by the Area Military Medical Commission in Lodz].

    PubMed

    Nowak, Michał S; Goś, Roman; Smigielski, Janusz

    2008-01-01

    To determine the prevalence of refractive errors in a population. A retrospective review of medical examinations for entry to military service from the Area Military Medical Commission in Lodz. Ophthalmic examinations were performed. We used statistical analysis to review the results. Statistical analysis revealed that refractive errors occurred in 21.68% of the population. The most common refractive error was myopia. 1) The most common ocular diseases are refractive errors, especially myopia (21.68% in total). 2) Refractive surgery and contact lenses should be allowed as possible corrections of refractive errors for military service.

  12. Implementation of an experimental program to investigate the performance characteristics of OMEGA navigation

    NASA Technical Reports Server (NTRS)

    Baxa, E. G., Jr.

    1974-01-01

    A theoretical formulation of differential and composite OMEGA error is presented to establish hypotheses about the functional relationships between various parameters and OMEGA navigational errors. Computer software developed to provide for extensive statistical analysis of the phase data is described. Results from the regression analysis used to conduct parameter sensitivity studies on differential OMEGA error tend to validate the theoretically based hypothesis concerning the relationship between uncorrected differential OMEGA error and receiver separation range and azimuth. Limited results of measurement of receiver repeatability error and line of position measurement error are also presented.

  13. An Analysis of Class II Supplies Requisitions in the Korean Army’s Organizational Supply

    DTIC Science & Technology

    2009-03-26

    Five methods for qualitative research are distinguished: case study, ethnography, phenomenological study, grounded theory, and content analysis. Table 9 provides a brief overview of the five methods.

  14. Evaluation of errors in quantitative determination of asbestos in rock

    NASA Astrophysics Data System (ADS)

    Baietto, Oliviero; Marini, Paola; Vitaliti, Martina

    2016-04-01

    The quantitative determination of the content of asbestos in rock matrices is a complex operation which is susceptible to important errors. The principal methodologies for the analysis are Scanning Electron Microscopy (SEM) and Phase Contrast Optical Microscopy (PCOM). Although PCOM resolution is inferior to that of SEM, PCOM analysis has several advantages, including greater representativeness of the analyzed sample, more effective recognition of chrysotile, and a lower cost. The DIATI LAA internal methodology for PCOM analysis is based on mild grinding of a rock sample, its subdivision into 5-6 grain size classes smaller than 2 mm, and a subsequent microscopic analysis of a portion of each class. PCOM is based on the optical properties of asbestos and of the liquids of known refractive index in which the particles under analysis are immersed. The error evaluation in the analysis of rock samples, contrary to the analysis of airborne filters, cannot be based on a statistical distribution. For airborne filters a Poisson distribution, which theoretically defines the variation in the count of fibers resulting from the observation of analysis fields chosen randomly on the filter, can be applied. The analysis of rock matrices instead cannot lean on any statistical distribution, because the most important object of the analysis is the size of the observed asbestiform fibers and fiber bundles, and the resulting ratio between the weight of the fibrous component and that of the granular one. The error evaluation generally provided by public and private institutions varies between 50 and 150 percent, but there are no specific studies that discuss the origin of the error or that link it to the asbestos content. Our work aims to provide a reliable estimate of the error in relation to the applied methodologies and to the total content of asbestos, especially for values close to the legal limits. The error assessment must be made by repeating the same analysis on the same sample, to estimate both the error on the representativeness of the sample and the error related to the sensitivity of the operator, and so provide a sufficiently reliable uncertainty for the method. We used about 30 natural rock samples with different asbestos contents, performing 3 analyses on each sample to obtain a trend sufficiently representative of the percentage. Furthermore, on one chosen sample we performed 10 repetitions of the analysis to define the error of the methodology more specifically.

  15. Exploring the Phenotype of Phonological Reading Disability as a Function of the Phonological Deficit Severity: Evidence from the Error Analysis Paradigm in Arabic

    ERIC Educational Resources Information Center

    Taha, Haitham; Ibrahim, Raphiq; Khateb, Asaid

    2014-01-01

    The dominant error types were investigated as a function of phonological processing (PP) deficit severity in four groups of impaired readers. For this aim, an error analysis paradigm distinguishing between four error types was used. The findings revealed that the different types of impaired readers were characterized by differing predominant error…

  16. Errors Analysis of Solving Linear Inequalities among the Preparatory Year Students at King Saud University

    ERIC Educational Resources Information Center

    El-khateeb, Mahmoud M. A.

    2016-01-01

    The purpose of this study aims to investigate the errors classes occurred by the Preparatory year students at King Saud University, through analysis student responses to the items of the study test, and to identify the varieties of the common errors and ratios of common errors that occurred in solving inequalities. In the collection of the data,…

  17. A Study on Mutil-Scale Background Error Covariances in 3D-Var Data Assimilation

    NASA Astrophysics Data System (ADS)

    Zhang, Xubin; Tan, Zhe-Min

    2017-04-01

    The construction of background error covariances is a key component of three-dimensional variational (3D-Var) data assimilation. In numerical weather prediction there are background errors at different scales, with interactions among them; however, the influence of these errors and their interactions cannot be represented in background error covariance statistics estimated by the leading methods. It is therefore necessary to construct background error covariances that account for multi-scale interactions among errors. Using the NMC method, this article first estimates the background error covariances at given model-resolution scales. Information about errors whose scales are larger and smaller than the given ones is then introduced, using different nesting techniques, to estimate the corresponding covariances. Comparison of the three background error covariance statistics, each influenced by error information at different scales, reveals that the background error variances increase, particularly at large scales and higher levels, when information about larger-scale errors is introduced through the lateral boundary condition provided by a lower-resolution model. On the other hand, the variances decrease at medium scales at higher levels, while they improve slightly at lower levels in the nested domain, especially at medium and small scales, when information about smaller-scale errors is introduced by nesting a higher-resolution model. In addition, the introduction of information about larger- (smaller-) scale errors leads to larger (smaller) horizontal and vertical correlation scales of the background errors. Regarding the multivariate correlations, the Ekman coupling increases (decreases) when information about larger- (smaller-) scale errors is included, whereas the geostrophic coupling in the free atmosphere weakens in both situations. The three covariances obtained above are each used in a data assimilation and model forecast system, and analysis-forecast cycles over a period of one month are conducted. Comparison of the analyses and forecasts from this system shows that the trends in analysis increments with information about different scale errors introduced are consistent with the trends in the variances and correlations of the background errors. In particular, the introduction of smaller-scale errors leads to larger analysis increments for winds at medium scales at the heights of both the high- and low-level jets, and the analysis increments for both temperature and humidity are greater at the corresponding scales at middle and upper levels. These analysis increments improve the intensity of the jet-convection system, which comprises jets at different levels and the coupling between them associated with latent heat release, and these changes in the analyses contribute to better forecasts of winds and temperature in the corresponding areas. When smaller-scale errors are included, the analysis increments for humidity increase significantly at large scales at lower levels, moistening the southern part of the analyses; this humidification corrects a dry bias there and eventually improves the forecast skill for humidity. Moreover, the inclusion of larger- (smaller-) scale errors benefits the forecast quality of heavy (light) precipitation at large (small) scales, owing to the amplification (diminution) of the intensity and area of the precipitation forecasts, but it tends to overestimate (underestimate) light (heavy) precipitation.
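    For readers unfamiliar with the NMC method used above, the sketch below (Python/NumPy; the forecast arrays are random stand-ins, not output from the study's system) shows the core of the estimate: a sample covariance of differences between pairs of forecasts valid at the same time.

    ```python
    import numpy as np

    # Minimal sketch of the NMC method: estimate the background error
    # covariance B from differences between pairs of forecasts valid at
    # the same time (e.g., 48-h minus 24-h forecasts).
    rng = np.random.default_rng(0)
    n_state, n_pairs = 100, 500
    f48 = rng.standard_normal((n_pairs, n_state))              # stand-in 48-h forecasts
    f24 = f48 + 0.3 * rng.standard_normal((n_pairs, n_state))  # stand-in 24-h forecasts

    d = f48 - f24                  # forecast-difference samples
    d -= d.mean(axis=0)            # remove the mean difference
    B = (d.T @ d) / (n_pairs - 1)  # sample covariance as a proxy for B
    # In practice B is rescaled (a tuned factor) before use in 3D-Var.
    ```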

  18. Why Is Rainfall Error Analysis Requisite for Data Assimilation and Climate Modeling?

    NASA Technical Reports Server (NTRS)

    Hou, Arthur Y.; Zhang, Sara Q.

    2004-01-01

    Given the large temporal and spatial variability of precipitation processes, errors in rainfall observations are difficult to quantify yet crucial to making effective use of rainfall data for improving atmospheric analysis, weather forecasting, and climate modeling. We highlight the need for developing a quantitative understanding of systematic and random errors in precipitation observations by examining explicit examples of how each type of error can affect forecasts and analyses in global data assimilation. We characterize the error information needed from the precipitation measurement community and how it may be used to improve data usage within the general framework of analysis techniques, as well as accuracy requirements from the perspectives of climate modeling and global data assimilation.

  19. An Error Analysis for the Finite Element Method Applied to Convection Diffusion Problems.

    DTIC Science & Technology

    1981-03-01

    Technical Note BN-962: An Error Analysis for the Finite Element Method Applied to Convection Diffusion Problems, by I. Babuška and W. G. Szymczak, March 1981. Institute for Physical Science and Technology, University of Maryland, College Park.

  20. Designing Home-Based Telemedicine Systems for the Geriatric Population: An Empirical Study.

    PubMed

    Narasimha, Shraddhaa; Agnisarman, Sruthy; Chalil Madathil, Kapil; Gramopadhye, Anand; McElligott, James T

    2018-02-01

    Background and Introduction: Telemedicine, the process of providing healthcare remotely using communication devices, has the potential to be useful for the geriatric population when specifically designed for this age group. This study explored the design of four video telemedicine systems currently available and outlined issues with these systems that impact usability among the geriatric population. Based on the results, design suggestions were developed to improve telemedicine systems for this population. Using a between-subjects experimental design, the study considered four telemedicine systems used at the Medical University of South Carolina. The study was conducted at a local retirement home. The participant pool consisted of 40 adults, 60 years or older. The dependent measures used were the mean times for telemedicine session initiation and video session, mean number of errors, post-test satisfaction ratings, the NASA-Task Load Index (NASA-TLX) workload measures, and the IBM Computer Systems Usability Questionnaire measures. Statistically significant differences were found among the telemedicine systems' initiation times. The analysis of the qualitative data revealed several issues, including lengthy e-mail content, icon placement, and chat box design, which affect the usability of these systems for the geriatric population. Human factors-based design modifications, including short, precise e-mail content, appropriately placed icons, and the inclusion of instructions, are recommended to address the issues found in the qualitative study.

  1. Quantitative evaluation of patient-specific quality assurance using online dosimetry system

    NASA Astrophysics Data System (ADS)

    Jung, Jae-Yong; Shin, Young-Ju; Sohn, Seung-Chang; Min, Jung-Whan; Kim, Yon-Lae; Kim, Dong-Su; Choe, Bo-Young; Suh, Tae-Suk

    2018-01-01

    In this study, we investigated the clinical performance of an online dosimetry system (Mobius FX system, MFX) by 1) dosimetric plan verification using gamma passing rates and dose volume metrics and 2) evaluation of error-detection capability using deliberately introduced machine errors. Eighteen volumetric modulated arc therapy (VMAT) plans were studied. To evaluate the clinical performance of the MFX, we used gamma analysis and dose volume histogram (DVH) analysis. In addition, to evaluate the error-detection capability, we used gamma analysis and DVH analysis with three types of deliberately introduced errors (Type 1: gantry angle-independent multi-leaf collimator (MLC) error; Type 2: gantry angle-dependent MLC error; and Type 3: gantry angle error). In the dosimetric verification comparison of the physical dosimetry system (Delta4PT) and the online dosimetry system (MFX), the gamma passing rates of the two dosimetry systems showed very good agreement with the treatment planning system (TPS) calculation. For the average dose difference between the TPS calculation and the MFX measurement, most of the dose metrics showed good agreement within a tolerance of 3%. In the error-detection comparison of the Delta4PT and the MFX, the gamma passing rates of the two dosimetry systems did not meet the 90% acceptance criterion once the magnitude of the introduced error exceeded 2 mm (MLC errors, Types 1 and 2) or 1.5° (gantry angle error, Type 3). For delivery with all error types, the average dose difference of the PTV as a function of error magnitude showed agreement between the TPS calculation and the MFX measurement within 1%. Overall, the results of the online dosimetry system showed very good agreement with those of the physical dosimetry system. Our results suggest that a log file-based online dosimetry system is a very suitable verification tool for accurate and efficient clinical routines for patient-specific quality assurance (QA).
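    The gamma analysis used above combines a dose-difference criterion with a distance-to-agreement (DTA) criterion. The sketch below is a minimal one-dimensional, globally normalized implementation (Python/NumPy); the profiles and the 3%/3 mm criteria are illustrative assumptions, not the paper's settings.

    ```python
    import numpy as np

    def gamma_index(dose_ref, dose_eval, x, dta_mm=3.0, dd_pct=3.0):
        """1D global gamma index of an evaluated profile against a reference."""
        dd = dd_pct / 100.0 * dose_ref.max()   # global dose-difference criterion
        gamma = np.empty_like(dose_ref)
        for i, (xi, di) in enumerate(zip(x, dose_ref)):
            # generalized distance in the combined space/dose metric
            dist2 = ((x - xi) / dta_mm) ** 2 + ((dose_eval - di) / dd) ** 2
            gamma[i] = np.sqrt(dist2.min())
        return gamma

    # toy test: a Gaussian profile delivered with a 2 mm lateral shift
    x = np.linspace(-50.0, 50.0, 501)
    ref = np.exp(-x**2 / 200.0)
    ev = np.exp(-(x - 2.0) ** 2 / 200.0)
    g = gamma_index(ref, ev, x)
    print(f"gamma passing rate: {100.0 * np.mean(g <= 1.0):.1f}%")
    ```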

  2. Feared consequences of panic attacks in panic disorder: a qualitative and quantitative analysis.

    PubMed

    Raffa, Susan D; White, Kamila S; Barlow, David H

    2004-01-01

    Cognitions are hypothesized to play a central role in panic disorder (PD). Previous studies have used questionnaires to assess cognitive content, focusing on prototypical cognitions associated with PD; however, few studies have qualitatively examined cognitions associated with the feared consequences of panic attacks. The purpose of this study was to conduct a qualitative and quantitative analysis of feared consequences of panic attacks. The initial, qualitative analysis resulted in the development of 32 categories of feared consequences. The categories were derived from participant responses to a standardized, semi-structured question (n = 207). Five expert-derived categories were then utilized to quantitatively examine the relationship between cognitions and indicators of PD severity. Cognitions did not predict PD severity; however, correlational analyses indicated some predictive validity to the expert-derived categories. The qualitative analysis identified additional areas of patient-reported concern not included in previous research that may be important in the assessment and treatment of PD.

  3. An investigation of error characteristics and coding performance

    NASA Technical Reports Server (NTRS)

    Ebel, William J.; Ingels, Frank M.

    1993-01-01

    The first year's effort on NASA Grant NAG5-2006 was an investigation to characterize typical errors resulting from the EOS downlink. The analysis methods developed for this effort were used on test data from a March 1992 White Sands Terminal Test. The effectiveness of a concatenated coding scheme with a Reed-Solomon outer code and a convolutional inner code, versus a Reed-Solomon-only code scheme, has been investigated, as well as the effectiveness of a periodic convolutional interleaver in dispersing errors of certain types. The work effort consisted of development of software that allows simulation studies with the appropriate coding schemes plus either simulated data with errors or actual data with errors. The software program is entitled Communication Link Error Analysis (CLEAN) and models downlink errors, forward error correcting schemes, and interleavers.
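    A periodic convolutional interleaver of the kind evaluated above can be sketched in a few lines: branch i delays its symbols by i*delay units, and cycling through the branches disperses a burst of channel errors across the deinterleaved stream. The parameters here are illustrative, not those used in CLEAN.

    ```python
    from collections import deque

    class ConvInterleaver:
        """Minimal periodic (convolutional) interleaver sketch."""

        def __init__(self, rows: int, delay: int):
            # branch i is a FIFO pre-loaded with i*delay zeros
            self.lines = [deque([0] * (i * delay)) for i in range(rows)]
            self.i = 0

        def push(self, symbol):
            line = self.lines[self.i]
            line.append(symbol)                    # enqueue the new symbol
            self.i = (self.i + 1) % len(self.lines)
            return line.popleft()                  # emit the delayed symbol

    il = ConvInterleaver(rows=4, delay=2)
    print([il.push(s) for s in range(16)])  # symbols emerge dispersed in time
    ```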

  4. An error analysis perspective for patient alignment systems.

    PubMed

    Figl, Michael; Kaar, Marcus; Hoffman, Rainer; Kratochwil, Alfred; Hummel, Johann

    2013-09-01

    This paper analyses the effects of error sources which can be found in patient alignment systems. As an example, an ultrasound (US) repositioning system and its transformation chain are assessed. The findings of this concept can also be applied to any navigation system. In a first step, all error sources were identified and where applicable, corresponding target registration errors were computed. By applying error propagation calculations on these commonly used registration/calibration and tracking errors, we were able to analyse the components of the overall error. Furthermore, we defined a special situation where the whole registration chain reduces to the error caused by the tracking system. Additionally, we used a phantom to evaluate the errors arising from the image-to-image registration procedure, depending on the image metric used. We have also discussed how this analysis can be applied to other positioning systems such as Cone Beam CT-based systems or Brainlab's ExacTrac. The estimates found by our error propagation analysis are in good agreement with the numbers found in the phantom study but significantly smaller than results from patient evaluations. We probably underestimated human influences such as the US scan head positioning by the operator and tissue deformation. Rotational errors of the tracking system can multiply these errors, depending on the relative position of tracker and probe. We were able to analyse the components of the overall error of a typical patient positioning system. We consider this to be a contribution to the optimization of the positioning accuracy for computer guidance systems.
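    The propagation of chained registration, calibration, and tracking errors described above can also be approximated by Monte Carlo sampling of the transformation chain. The sketch below (Python/NumPy) uses illustrative error magnitudes and a 50 mm lever arm, not the paper's values; it reproduces the qualitative point that rotational tracking errors are amplified by the tracker-to-target distance.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def small_rotation(sigma_deg):
        """Random rotation about z by a normally distributed small angle."""
        a = np.deg2rad(rng.normal(0.0, sigma_deg))
        c, s = np.cos(a), np.sin(a)
        return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

    target = np.array([50.0, 0.0, 0.0])  # target 50 mm from the tracker origin
    errs = []
    for _ in range(10_000):
        R = small_rotation(0.2) @ small_rotation(0.1)  # tracking + calibration
        t = rng.normal(0.0, 0.3, size=3)               # registration translation
        errs.append(np.linalg.norm(R @ target + t - target))
    print(f"mean target error: {np.mean(errs):.2f} mm")
    ```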

  5. Human Error Analysis in a Permit to Work System: A Case Study in a Chemical Plant

    PubMed Central

    Jahangiri, Mehdi; Hoboubi, Naser; Rostamabadi, Akbar; Keshavarzi, Sareh; Hosseini, Ali Akbar

    2015-01-01

    Background A permit to work (PTW) is a formal written system to control certain types of work which are identified as potentially hazardous. However, human error in PTW processes can lead to an accident. Methods This cross-sectional, descriptive study was conducted to estimate the probability of human errors in PTW processes in a chemical plant in Iran. In the first stage, through interviews with the personnel and study of the plant's procedures, the PTW process was analyzed using the hierarchical task analysis technique. In doing so, PTW was considered as a goal and the detailed tasks to achieve the goal were analyzed. In the next step, the standardized plant analysis risk-human (SPAR-H) reliability analysis method was applied to estimate the human error probability. Results The mean probability of human error in the PTW system was estimated to be 0.11. The highest probability of human error in the PTW process was related to flammable gas testing (50.7%). Conclusion The SPAR-H method applied in this study could analyze and quantify the potential human errors and extract the measures required for reducing the error probabilities in the PTW system. Some suggestions for reducing the likelihood of errors, especially by modifying the performance shaping factors and the dependencies among tasks, are provided. PMID:27014485
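    For context, SPAR-H scales a nominal human error probability (NHEP) by a composite multiplier built from the performance shaping factors (PSFs), with an adjustment that keeps the adjusted probability below 1. The sketch below (Python) shows that published adjustment formula; the NHEP and PSF multipliers are illustrative values, not those of this study.

    ```python
    def spar_h_hep(nominal_hep, psf_multipliers):
        """SPAR-H adjusted human error probability (NUREG/CR-6883 form)."""
        composite = 1.0
        for m in psf_multipliers:
            composite *= m
        return (nominal_hep * composite) / (nominal_hep * (composite - 1.0) + 1.0)

    # illustrative only: nominal diagnosis HEP 0.01 with two negative PSFs
    print(spar_h_hep(0.01, [10.0, 2.0]))  # ~0.17, capped well below 1.0
    ```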

  6. A Phase-Locked Loop Model of the Response of the Postural Control System to Periodic Platform Motion

    PubMed Central

    Schilling, Robert J.; Robinson, Charles J.

    2010-01-01

    A phase-locked loop (PLL) model of the response of the postural control system to periodic platform motion is proposed. The PLL model is based on the hypothesis that quiet standing (QS) postural sway can be characterized as a weak sinusoidal oscillation corrupted with noise. Because the signal-to-noise ratio is quite low, the characteristics of the QS oscillator are not measured directly from the QS sway; instead, they are inferred from the response of the oscillator to periodic motion of the platform. When a sinusoidal stimulus is applied, the QS oscillator changes speed as needed until its frequency matches that of the platform, thus achieving phase lock in a manner consistent with a PLL control mechanism. The PLL model is highly effective in representing the frequency, amplitude, and phase shift of the sinusoidal component of the phase-locked response over a range of platform frequencies and amplitudes. Qualitative analysis of the PLL control mechanism indicates that there is a finite range of frequencies over which phase lock is possible, and that the size of this capture range decreases with decreasing platform amplitude. The PLL model was tested experimentally using nine healthy subjects, and the results reveal good agreement, with a mean phase shift error of 13.7° and a mean amplitude error of 0.8 mm. PMID:20378479

  7. Imaging-based quantification of hepatic fat: methods and clinical applications.

    PubMed

    Ma, Xiaozhou; Holalkere, Nagaraj-Setty; Kambadakone R, Avinash; Mino-Kenudson, Mari; Hahn, Peter F; Sahani, Dushyant V

    2009-01-01

    Fatty liver disease comprises a spectrum of conditions (simple hepatic steatosis, steatohepatitis with inflammatory changes, and end-stage liver disease with fibrosis and cirrhosis). Hepatic steatosis is often associated with diabetes and obesity and may be secondary to alcohol and drug use, toxins, viral infections, and metabolic diseases. Detection and quantification of liver fat have many clinical applications, and early recognition is crucial to institute appropriate management and prevent progression. Histopathologic analysis is the reference standard to detect and quantify fat in the liver, but results are vulnerable to sampling error. Moreover, it can cause morbidity and complications and cannot be repeated often enough to monitor treatment response. Imaging can be repeated regularly and allows assessment of the entire liver, thus avoiding sampling error. Selection of appropriate imaging methods demands understanding of their advantages and limitations and the suitable clinical setting. Ultrasonography is effective for detecting moderate or severe fatty infiltration but is limited by lack of interobserver reliability and intraobserver reproducibility. Computed tomography allows quantitative and qualitative evaluation and is generally highly accurate and reliable; however, the results may be confounded by hepatic parenchymal changes due to cirrhosis or depositional diseases. Magnetic resonance (MR) imaging with appropriate sequences (eg, chemical shift techniques) has similarly high sensitivity, and MR spectroscopy provides unique advantages for some applications. However, both are expensive and too complex to be used to monitor steatosis. (c) RSNA, 2009.

  8. The development and validation of using inertial sensors to monitor postural change in resistance exercise.

    PubMed

    Gleadhill, Sam; Lee, James Bruce; James, Daniel

    2016-05-03

    This research presented and validated a method of assessing postural changes during resistance exercise using inertial sensors. A simple lifting task was broken down into a series of well-defined tasks, which could be examined and measured in a controlled environment. The purpose of this research was to determine whether timing measures obtained from inertial sensor accelerometer outputs are able to provide accurate, quantifiable information about resistance exercise movement patterns. The aim was to complete a timing measure validation of inertial sensor outputs. Eleven participants completed five repetitions of 15 different deadlift variations. Participants were monitored with inertial sensors and an infrared three-dimensional motion capture system. Validation was undertaken using Hopkins' typical error of the estimate, with a Pearson's correlation and a Bland-Altman limits of agreement analysis. Statistical validation measured the timing agreement during deadlifts between the inertial sensor outputs and the motion capture system. Timing validation results demonstrated a Pearson's correlation of 0.9997, with trivial standardised error (0.026) and standardised bias (0.002). Inertial sensors can now be used in practical settings with as much confidence as motion capture systems for accelerometer timing measurements of resistance exercise. This research provides foundations for inertial sensors to be applied for qualitative activity recognition of resistance exercise and safe lifting practices. Copyright © 2016 Elsevier Ltd. All rights reserved.
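    The agreement statistics named above, Pearson's correlation and Bland-Altman limits of agreement, can be computed as in this minimal sketch (Python/NumPy; the timing arrays are invented, not the study's deadlift data).

    ```python
    import numpy as np

    sensor = np.array([1.02, 0.98, 1.21, 1.10, 0.95, 1.30])  # seconds
    mocap  = np.array([1.00, 0.97, 1.20, 1.12, 0.96, 1.28])  # seconds

    r = np.corrcoef(sensor, mocap)[0, 1]   # Pearson's correlation
    diff = sensor - mocap
    bias = diff.mean()                     # mean bias
    half_width = 1.96 * diff.std(ddof=1)   # 95% limits of agreement
    print(f"r={r:.4f}, bias={bias:.4f}, "
          f"LoA=[{bias - half_width:.4f}, {bias + half_width:.4f}]")
    ```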

  9. Methodological challenges in qualitative content analysis: A discussion paper.

    PubMed

    Graneheim, Ulla H; Lindgren, Britt-Marie; Lundman, Berit

    2017-09-01

    This discussion paper aims to map content analysis in the qualitative paradigm and to explore common methodological challenges. We discuss phenomenological descriptions of manifest content and hermeneutical interpretations of latent content. We demonstrate inductive, deductive, and abductive approaches to qualitative content analysis, and elaborate on the level of abstraction and degree of interpretation used in constructing categories, descriptive themes, and themes of meaning. With increased abstraction and interpretation comes an increased challenge to demonstrate the credibility and authenticity of the analysis. A key issue is to show the logic by which categories and themes are abstracted, interpreted, and connected to the aim and to each other. Qualitative content analysis is an autonomous method and can be used at varying levels of abstraction and interpretation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Interpretive focus groups: a participatory method for interpreting and extending secondary analysis of qualitative data.

    PubMed

    Redman-MacLaren, Michelle; Mills, Jane; Tommbe, Rachael

    2014-01-01

    Participatory approaches to qualitative research practice constantly change in response to evolving research environments. Researchers are increasingly encouraged to undertake secondary analysis of qualitative data, despite epistemological and ethical challenges. Interpretive focus groups can be described as a more participative method for groups to analyse qualitative data. To facilitate interpretive focus groups with women in Papua New Guinea to extend analysis of existing qualitative data and co-create new primary data. The purpose of this was to inform a transformational grounded theory and subsequent health promoting action. A two-step approach was used in a grounded theory study about how women experience male circumcision in Papua New Guinea. Participants analysed portions or 'chunks' of existing qualitative data in story circles and built upon this analysis by using the visual research method of storyboarding. New understandings of the data were evoked when women in interpretive focus groups analysed the data 'chunks'. Interpretive focus groups encouraged women to share their personal experiences about male circumcision. The visual method of storyboarding enabled women to draw pictures to represent their experiences. This provided an additional focus for whole-of-group discussions about the research topic. Interpretive focus groups offer opportunity to enhance trustworthiness of findings when researchers undertake secondary analysis of qualitative data. The co-analysis of existing data and co-generation of new data between research participants and researchers informed an emergent transformational grounded theory and subsequent health promoting action.

  11. Interpretive focus groups: a participatory method for interpreting and extending secondary analysis of qualitative data

    PubMed Central

    Redman-MacLaren, Michelle; Mills, Jane; Tommbe, Rachael

    2014-01-01

    Background Participatory approaches to qualitative research practice constantly change in response to evolving research environments. Researchers are increasingly encouraged to undertake secondary analysis of qualitative data, despite epistemological and ethical challenges. Interpretive focus groups can be described as a more participative method for groups to analyse qualitative data. Objective To facilitate interpretive focus groups with women in Papua New Guinea to extend analysis of existing qualitative data and co-create new primary data. The purpose of this was to inform a transformational grounded theory and subsequent health promoting action. Design A two-step approach was used in a grounded theory study about how women experience male circumcision in Papua New Guinea. Participants analysed portions or ‘chunks’ of existing qualitative data in story circles and built upon this analysis by using the visual research method of storyboarding. Results New understandings of the data were evoked when women in interpretive focus groups analysed the data ‘chunks’. Interpretive focus groups encouraged women to share their personal experiences about male circumcision. The visual method of storyboarding enabled women to draw pictures to represent their experiences. This provided an additional focus for whole-of-group discussions about the research topic. Conclusions Interpretive focus groups offer opportunity to enhance trustworthiness of findings when researchers undertake secondary analysis of qualitative data. The co-analysis of existing data and co-generation of new data between research participants and researchers informed an emergent transformational grounded theory and subsequent health promoting action. PMID:25138532

  12. Selective Weighted Least Squares Method for Fourier Transform Infrared Quantitative Analysis.

    PubMed

    Wang, Xin; Li, Yan; Wei, Haoyun; Chen, Xia

    2017-06-01

    Classical least squares (CLS) regression is a popular multivariate statistical method used frequently for quantitative analysis with Fourier transform infrared (FT-IR) spectrometry. Classical least squares provides the best unbiased estimator for uncorrelated residual errors with zero mean and equal variance. However, the noise in FT-IR spectra, which accounts for a large portion of the residual errors, is heteroscedastic. Thus, if this noise with zero mean dominates the residual errors, the weighted least squares (WLS) regression method described in this paper is a better estimator than CLS. However, if bias errors, such as the residual baseline error, are significant, WLS may perform worse than CLS. In this paper, we compare the effects of noise and bias error on quantitative analysis using CLS and WLS. Results indicated that for wavenumbers with low absorbance, the bias error significantly affected the error, such that the performance of CLS is better than that of WLS. However, for wavenumbers with high absorbance, the noise significantly affected the error, and WLS proves to be better than CLS. Thus, we propose a selective weighted least squares (SWLS) regression that processes data at different wavenumbers using either CLS or WLS based on a selection criterion, i.e., absorbance lower or higher than a threshold. The effects of various factors on the optimal threshold value (OTV) for SWLS have been studied through numerical simulations. These studies reported that: (1) the concentration and the analyte type had minimal effect on the OTV; and (2) the major factor that influences the OTV is the ratio between the bias error and the standard deviation of the noise. The last part of this paper is dedicated to the quantitative analysis of methane gas spectra and methane/toluene gas mixture spectra measured using FT-IR spectrometry with CLS, WLS, and SWLS. The standard error of prediction (SEP), bias of prediction (bias), and residual sum of squares of the errors (RSS) from the three quantitative analyses were compared. In the methane gas analysis, SWLS yielded the lowest SEP and RSS among the three methods. In the methane/toluene mixture analysis, a modification of SWLS has been presented to tackle the bias error from other components. SWLS without modification presents the lowest SEP in all cases, but not the lowest bias and RSS. The modified SWLS reduced the bias and showed a lower RSS than CLS, especially for small components.
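    One way to realize the CLS/WLS selection described above is to fold it into a single weighted regression: channels below the absorbance threshold keep unit weight (CLS-like), while channels above it are weighted by inverse noise variance (WLS). This is a minimal sketch under that reading; the matrix names, noise model, and threshold are illustrative assumptions, not the paper's notation.

    ```python
    import numpy as np

    def swls_concentrations(K, a, noise_var, threshold):
        """K: (channels, components) pure-component absorptivities;
        a: measured spectrum; noise_var: per-channel noise variance."""
        w = np.ones_like(a)          # unit weights (CLS-like) by default
        high = a > threshold         # high-absorbance channels get WLS weights
        w[high] = 1.0 / noise_var[high]
        KtW = K.T * w                # weight each channel (column of K.T)
        return np.linalg.solve(KtW @ K, KtW @ a)

    # toy two-component mixture with heteroscedastic (absorbance-driven) noise
    rng = np.random.default_rng(3)
    K = np.abs(rng.standard_normal((200, 2)))
    c_true = np.array([0.7, 0.3])
    noise_var = 1e-4 + 1e-3 * (K @ c_true)
    a = K @ c_true + rng.normal(0.0, np.sqrt(noise_var))
    print(swls_concentrations(K, a, noise_var, threshold=0.5))  # ~[0.7, 0.3]
    ```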

  13. [News items about clinical errors and safety perceptions in hospital patients].

    PubMed

    Mira, José Joaquín; Guilabert, Mercedes; Ortíz, Lidia; Navarro, Isabel María; Pérez-Jover, María Virtudes; Aranaz, Jesús María

    2010-01-01

    To analyze how news items about clinical errors are treated by the press in Spain and their influence on patients. We performed a quantitative and qualitative study. Firstly, news items published between April and November 2007 in six newspapers were analyzed. Secondly, 829 patients from five hospitals in four autonomous regions were surveyed. We analyzed 90 cases generating 128 news items, representing a mean of 16 items per month. In 91 news items (71.1%) the source was checked. In 78 items (60.9%) the author could be identified. The impact of these news items was -4.86 points (95% confidence interval [95%CI]: -4.15-5.57). In 59 cases (57%) the error was attributed to the system, in 27 (21.3%) to health professionals, and in 41 (32.3%) to both. Neither the number of columns (p=0.702), nor the inclusion of a sub-header (p=0.195), nor a complementary image (p=0.9) were found to be related to the effect of the error on safety perceptions. Of the 829 patients, 515 (62.1%; 95%CI: 58.8-65.4%) claimed to have recently seen or heard news about clinical errors in the press, on the radio or on television. The perception of safety decreased when the same person was worried about being the victim of a clinical error and had seen a recent news item about such adverse events (χ²=15.17; p=0.001). Every week news items about clinical errors are published or broadcast. The way in which newspapers report legal claims over alleged medical errors is similar to the way they report judicial sentences for negligence causing irreparable damage or harm. News about errors generates insecurity in patients. It is advisable to create interfaces between journalists and health professionals. Copyright 2009 SESPAS. Published by Elsevier España. All rights reserved.

  14. A qualitative approach for recovering relative depths in dynamic scenes

    NASA Technical Reports Server (NTRS)

    Haynes, S. M.; Jain, R.

    1987-01-01

    This approach to dynamic scene analysis is a qualitative one. It computes relative depths using very general rules. The depths calculated are qualitative in the sense that the only information obtained is which object is in front of which others. The motion is qualitative in the sense that the only required motion data is whether objects are moving toward or away from the camera. Reasoning, which takes into account the temporal character of the data and the scene, is qualitative. This approach to dynamic scene analysis can tolerate imprecise data because in dynamic scenes the data are redundant.

  15. So you want to do research? 3. An introduction to qualitative methods.

    PubMed

    Meadows, Keith A

    2003-10-01

    This article describes some of the key issues in the use of qualitative research methods. Starting with a description of what qualitative research is and outlining some of the distinguishing features between quantitative and qualitative research, examples of the type of setting where qualitative research can be applied are provided. Methods of collecting information through in-depth interviews and group discussions are discussed in some detail, including issues around sampling and recruitment, the use of topic guides and techniques to encourage participants to talk openly. An overview on the analysis of qualitative data discusses aspects on data reduction, display and drawing conclusions from the data. Approaches to ensuring rigour in the collection, analysis and reporting of qualitative research are discussed and the concepts of credibility, transferability, dependability and confirmability are described. Finally, guidelines for the reporting of qualitative research are outlined and the need to write for a particular audience is discussed.

  16. Advancing the study of violence against women using mixed methods: integrating qualitative methods into a quantitative research program.

    PubMed

    Testa, Maria; Livingston, Jennifer A; VanZile-Tamsen, Carol

    2011-02-01

    A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women's sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided.

  17. Evaluation and error apportionment of an ensemble of ...

    EPA Pesticide Factsheets

    Through the comparison of several regional-scale chemistry transport modelling systems that simulate meteorology and air quality over the European and American continents, this study aims at i) apportioning the error to the responsible processes using time-scale analysis, ii) helping to detect the causes of model error, and iii) identifying the processes and scales most urgently requiring dedicated investigation. The analysis is conducted within the framework of the third phase of the Air Quality Model Evaluation International Initiative (AQMEII) and tackles model performance gauging through measurement-to-model comparison, error decomposition, and time series analysis of the models' biases for several fields (ozone, CO, SO2, NO, NO2, PM10, PM2.5, wind speed, and temperature). The operational metrics (magnitude of the error, sign of the bias, associativity) provide an overall sense of model strengths and deficiencies, while apportioning the error to its constituent parts (bias, variance, and covariance) can help to assess the nature and quality of the error. Each of the error components is analysed independently and apportioned to specific processes based on the corresponding timescale (long scale, synoptic, diurnal, and intra-day) using the error apportionment technique devised in the former phases of AQMEII. The application of the error apportionment method to the AQMEII Phase 3 simulations provides several key insights. In addition to reaffirming the strong impact
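    The bias/variance/covariance apportionment mentioned above follows the classical decomposition of the mean square error, MSE = (mean_m - mean_o)^2 + (sigma_m - sigma_o)^2 + 2*sigma_m*sigma_o*(1 - r). A minimal sketch (Python/NumPy; the series are synthetic, not AQMEII model output):

    ```python
    import numpy as np

    def mse_decomposition(m, o):
        """Split mean square error into bias^2, variance, covariance parts."""
        bias2 = (m.mean() - o.mean()) ** 2                     # systematic offset
        var = (m.std(ddof=0) - o.std(ddof=0)) ** 2             # amplitude mismatch
        r = np.corrcoef(m, o)[0, 1]
        cov = 2.0 * m.std(ddof=0) * o.std(ddof=0) * (1.0 - r)  # phase mismatch
        return bias2, var, cov  # the parts sum exactly to mean((m - o)**2)

    t = np.linspace(0.0, 10.0, 200)
    model = np.sin(t) + 0.3          # biased, phase-shifted "model"
    obs = np.sin(t + 0.2)            # "observations"
    print(mse_decomposition(model, obs), np.mean((model - obs) ** 2))
    ```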

  18. Characteristics of qualitative studies in influential journals of general medicine: a critical review.

    PubMed

    Yamazaki, Hiroshi; Slingsby, Brian Taylor; Takahashi, Miyako; Hayashi, Yoko; Sugimori, Hiroki; Nakayama, Takeo

    2009-12-01

    Although qualitative studies have increased since the 1990s, some reports note that relatively few influential journals published them up until 2000. This study critically reviewed the characteristics of qualitative studies published in top tier medical journals since 2000. We assessed full texts of qualitative studies published between 2000 and 2004 in the Annals of Internal Medicine, BMJ, JAMA, Lancet, and New England Journal of Medicine. We found 80 qualitative studies, of which 73 (91%) were published in BMJ. Only 10 studies (13%) combined qualitative and quantitative methods. Sixty-two studies (78%) used only one method of data collection. Interviews dominated the choice of data collection. The median sample size was 36 (range: 9-383). Thirty-three studies (41%) did not specify the type of analysis used but rather described the analytic process in detail. The rest indicated the mode of data analysis, in which the most prevalent methods were the constant comparative method (23%) and the grounded theory approach (22%). Qualitative data analysis software was used by 33 studies (41%). Among influential journals of general medicine, only BMJ consistently published an average of 15 qualitative study reports between 2000 and 2004. These findings lend insight into what qualities and characteristics make a qualitative study worthy of consideration to be published in an influential journal, primarily BMJ.

  19. Attitude Determination Error Analysis System (ADEAS) mathematical specifications document

    NASA Technical Reports Server (NTRS)

    Nicholson, Mark; Markley, F.; Seidewitz, E.

    1988-01-01

    The mathematical specifications of Release 4.0 of the Attitude Determination Error Analysis System (ADEAS), which provides a general-purpose linear error analysis capability for various spacecraft attitude geometries and determination processes, are presented. The analytical basis of the system is presented, and detailed equations are provided for both three-axis-stabilized and spin-stabilized attitude sensor models.

  20. A methodology for translating positional error into measures of attribute error, and combining the two error sources

    Treesearch

    Yohay Carmel; Curtis Flather; Denis Dean

    2006-01-01

    This paper summarizes our efforts to investigate the nature, behavior, and implications of positional error and attribute error in spatiotemporal datasets. Estimating the combined influence of these errors on map analysis has been hindered by the fact that these two error types are traditionally expressed in different units (distance units, and categorical units,...

  1. Hartman Testing of X-Ray Telescopes

    NASA Technical Reports Server (NTRS)

    Saha, Timo T.; Biskasch, Michael; Zhang, William W.

    2013-01-01

    Hartmann testing of x-ray telescopes is a simple test method to retrieve and analyze alignment errors and low-order circumferential errors of x-ray telescopes and their components. A narrow slit is scanned along the circumference of the telescope in front of the mirror and the centroids of the images are calculated. From the centroid data, alignment errors, radius variation errors, and cone-angle variation errors can be calculated. Mean cone angle, mean radial height (average radius), and the focal length of the telescope can also be estimated if the centroid data is measured at multiple focal plane locations. In this paper we present the basic equations that are used in the analysis process. These equations can be applied to full circumference or segmented x-ray telescopes. We use the Optical Surface Analysis Code (OSAC) to model a segmented x-ray telescope and show that the derived equations and accompanying analysis retrieves the alignment errors and low order circumferential errors accurately.
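    The centroiding step on which this analysis rests is the intensity-weighted mean position of each slit image; scanning the slit around the circumference then yields centroid-versus-angle curves from which the alignment and low-order circumferential errors are fitted. A minimal sketch of the centroid computation (Python/NumPy; the synthetic spot stands in for a measured slit image):

    ```python
    import numpy as np

    def centroid(image):
        """Intensity-weighted centroid of a background-subtracted image."""
        y, x = np.indices(image.shape)
        total = image.sum()
        return (x * image).sum() / total, (y * image).sum() / total

    # synthetic Gaussian spot centered at (x, y) = (12.5, 20.0)
    yy, xx = np.indices((40, 40))
    spot = np.exp(-((xx - 12.5) ** 2 + (yy - 20.0) ** 2) / 8.0)
    print(centroid(spot))  # ~(12.5, 20.0)
    ```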

  2. Enhancing the quality and credibility of qualitative analysis.

    PubMed

    Patton, M Q

    1999-12-01

    Varying philosophical and theoretical orientations to qualitative inquiry remind us that issues of quality and credibility intersect with audience and intended research purposes. This overview examines ways of enhancing the quality and credibility of qualitative analysis by dealing with three distinct but related inquiry concerns: rigorous techniques and methods for gathering and analyzing qualitative data, including attention to validity, reliability, and triangulation; the credibility, competence, and perceived trustworthiness of the qualitative researcher; and the philosophical beliefs of evaluation users about such paradigm-based preferences as objectivity versus subjectivity, truth versus perspective, and generalizations versus extrapolations. Although this overview examines some general approaches to issues of credibility and data quality in qualitative analysis, it is important to acknowledge that particular philosophical underpinnings, specific paradigms, and special purposes for qualitative inquiry will typically include additional or substitute criteria for assuring and judging quality, validity, and credibility. Moreover, the context for these considerations has evolved. In early literature on evaluation methods the debate between qualitative and quantitative methodologists was often strident. In recent years the debate has softened. A consensus has gradually emerged that the important challenge is to match appropriately the methods to empirical questions and issues, and not to universally advocate any single methodological approach for all problems.

  3. Uncertainty

    USGS Publications Warehouse

    Hunt, Randall J.

    2012-01-01

    Management decisions will often be directly informed by model predictions. However, we now know there can be no expectation of a single ‘true’ model; thus, model results are uncertain. Understandable reporting of underlying uncertainty provides necessary context to decision-makers, as model results are used for management decisions. This, in turn, forms a mechanism by which groundwater models inform a risk-management framework because uncertainty around a prediction provides the basis for estimating the probability or likelihood of some event occurring. Given that the consequences of management decisions vary, it follows that the extent of and resources devoted to an uncertainty analysis may depend on the consequences. For events with low impact, a qualitative, limited uncertainty analysis may be sufficient for informing a decision. For events with a high impact, on the other hand, the risks might be better assessed and associated decisions made using a more robust and comprehensive uncertainty analysis. The purpose of this chapter is to provide guidance on uncertainty analysis through discussion of concepts and approaches, which can vary from heuristic (i.e. the modeller’s assessment of prediction uncertainty based on trial and error and experience) to a comprehensive, sophisticated, statistics-based uncertainty analysis. Most of the material presented here is taken from Doherty et al. (2010) if not otherwise cited. Although the treatment here is necessarily brief, the reader can find citations for the source material and additional references within this chapter.

  4. Describing qualitative research undertaken with randomised controlled trials in grant proposals: a documentary analysis.

    PubMed

    Drabble, Sarah J; O'Cathain, Alicia; Thomas, Kate J; Rudolph, Anne; Hewison, Jenny

    2014-02-18

    There is growing recognition of the value of conducting qualitative research with trials in health research. It is timely to reflect on how this qualitative research is presented in grant proposals to identify lessons for researchers and research commissioners. As part of a larger study focusing on how to maximise the value of undertaking qualitative research with trials, we undertook a documentary analysis of proposals of funded studies. Using the metaRegister of Controlled Trials (mRCT) database we identified trials funded in the United Kingdom, ongoing between 2001 and 2010, and reporting the use of qualitative research. We requested copies of proposals from lead researchers. We extracted data from the proposals using closed and open questions, analysed using descriptive statistics and content analysis respectively. 2% (89/3812) of trials in the mRCT database described the use of qualitative research undertaken with the trial. From these 89 trials, we received copies of 36 full proposals, of which 32 met our inclusion criteria. 25% used less than a single paragraph to describe the qualitative research. The aims of the qualitative research described in these proposals focused mainly on the intervention or trial conduct. Just over half (56%) of the proposals included an explicit rationale for conducting the qualitative research with the trial, the most frequent being to optimise implementation into clinical practice or to interpret trial findings. Key information about methods, expertise and resources was missing in a large minority of proposals, in particular sample size, type of analysis, and non-personnel resources. 28% specifically stated that qualitative researchers would conduct the qualitative research. Our review of proposals of successfully funded studies identified good practice but also identified limited space given to describing the qualitative research, with an associated lack of attention to the rationale for doing the qualitative research and important methodological details. Acknowledging the space restrictions faced by researchers writing grant proposals, we suggest a starting point for providing practical guidance to help researchers write proposals and research commissioners assess proposals of qualitative research with trials.

  5. Describing qualitative research undertaken with randomised controlled trials in grant proposals: a documentary analysis

    PubMed Central

    2014-01-01

    Background There is growing recognition of the value of conducting qualitative research with trials in health research. It is timely to reflect on how this qualitative research is presented in grant proposals to identify lessons for researchers and research commissioners. As part of a larger study focusing on how to maximise the value of undertaking qualitative research with trials, we undertook a documentary analysis of proposals of funded studies. Methods Using the metaRegister of Controlled Trials (mRCT) database we identified trials funded in the United Kingdom, ongoing between 2001 and 2010, and reporting the use of qualitative research. We requested copies of proposals from lead researchers. We extracted data from the proposals using closed and open questions, analysed using descriptive statistics and content analysis respectively. Results 2% (89/3812) of trials in the mRCT database described the use of qualitative research undertaken with the trial. From these 89 trials, we received copies of 36 full proposals, of which 32 met our inclusion criteria. 25% used less than a single paragraph to describe the qualitative research. The aims of the qualitative research described in these proposals focused mainly on the intervention or trial conduct. Just over half (56%) of the proposals included an explicit rationale for conducting the qualitative research with the trial, the most frequent being to optimise implementation into clinical practice or to interpret trial findings. Key information about methods, expertise and resources was missing in a large minority of proposals, in particular sample size, type of analysis, and non-personnel resources. 28% specifically stated that qualitative researchers would conduct the qualitative research. Conclusions Our review of proposals of successfully funded studies identified good practice but also identified limited space given to describing the qualitative research, with an associated lack of attention to the rationale for doing the qualitative research and important methodological details. Acknowledging the space restrictions faced by researchers writing grant proposals, we suggest a starting point for providing practical guidance to help researchers write proposals and research commissioners assess proposals of qualitative research with trials. PMID:24533771

  6. Nonlinear truncation error analysis of finite difference schemes for the Euler equations

    NASA Technical Reports Server (NTRS)

    Klopfer, G. H.; Mcrae, D. S.

    1983-01-01

    It is pointed out that, in general, dissipative finite difference integration schemes have been found to be quite robust when applied to the Euler equations of gas dynamics. The present investigation considers a modified equation analysis of both implicit and explicit finite difference techniques as applied to the Euler equations. The analysis is used to identify those error terms which contribute most to the observed solution errors. A technique for analytically removing the dominant error terms is demonstrated, resulting in a greatly improved solution for the explicit Lax-Wendroff schemes. It is shown that the nonlinear truncation errors are quite large and distributed quite differently for each of the three conservation equations as applied to a one-dimensional shock tube problem.

  7. Nonparametric Estimation of Standard Errors in Covariance Analysis Using the Infinitesimal Jackknife

    ERIC Educational Resources Information Center

    Jennrich, Robert I.

    2008-01-01

    The infinitesimal jackknife provides a simple general method for estimating standard errors in covariance structure analysis. Beyond its simplicity and generality what makes the infinitesimal jackknife method attractive is that essentially no assumptions are required to produce consistent standard error estimates, not even the requirement that the…

  8. Mark-Up-Based Writing Error Analysis Model in an On-Line Classroom.

    ERIC Educational Resources Information Center

    Feng, Cheng; Yano, Yoneo; Ogata, Hiroaki

    2000-01-01

    Describes a new component called the Writing Error Analysis Model (WEAM) in the CoCoA system for teaching writing composition in Japanese as a foreign language. The WEAM can be used for analyzing learners' morphological errors and selecting appropriate compositions for learners' revising exercises. (Author/VWL)

  9. Exploratory Factor Analysis of Reading, Spelling, and Math Errors

    ERIC Educational Resources Information Center

    O'Brien, Rebecca; Pan, Xingyu; Courville, Troy; Bray, Melissa A.; Breaux, Kristina; Avitia, Maria; Choi, Dowon

    2017-01-01

    Norm-referenced error analysis is useful for understanding individual differences in students' academic skill development and for identifying areas of skill strength and weakness. The purpose of the present study was to identify underlying connections between error categories across five language and math subtests of the Kaufman Test of…

  10. Investigation on coupling error characteristics in angular rate matching based ship deformation measurement approach

    NASA Astrophysics Data System (ADS)

    Yang, Shuai; Wu, Wei; Wang, Xingshu; Xu, Zhiguang

    2018-01-01

    The coupling error in the measurement of ship hull deformation can significantly influence the attitude accuracy of shipborne weapons and equipment. It is therefore important to study the characteristics of the coupling error. In this paper, a comprehensive investigation of the coupling error is reported, which has the potential to support its reduction in the future. Firstly, the causes and characteristics of the coupling error are analyzed theoretically, based on the basic theory of measuring ship deformation. Then, simulations are conducted to verify the correctness of the theoretical analysis. Simulation results show that the cross-correlation between dynamic flexure and ship angular motion leads to the coupling error in measuring ship deformation, and that the coupling error increases with the correlation between them. All simulation results coincide with the theoretical analysis.

  11. Analysis technique for controlling system wavefront error with active/adaptive optics

    NASA Astrophysics Data System (ADS)

    Genberg, Victor L.; Michels, Gregory J.

    2017-08-01

    The ultimate goal of an active mirror system is to control system level wavefront error (WFE). In the past, the use of this technique was limited by the difficulty of obtaining a linear optics model. In this paper, an automated method for controlling system level WFE using a linear optics model is presented. An error estimate is included in the analysis output for both surface error disturbance fitting and actuator influence function fitting. To control adaptive optics, the technique has been extended to write system WFE in state space matrix form. The technique is demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.
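    The control step described above reduces, in the linear regime, to a least-squares solve: given an influence matrix that maps actuator commands to sampled wavefront error, the commands that best cancel a measured disturbance minimize the residual, and that residual is the error estimate reported. The sketch below (Python/NumPy) uses random matrices as stand-ins; it illustrates the generic technique, not SigFit's implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n_samples, n_actuators = 200, 12
    A = rng.standard_normal((n_samples, n_actuators))  # influence functions
    wfe = rng.standard_normal(n_samples)               # measured disturbance

    cmd, *_ = np.linalg.lstsq(A, -wfe, rcond=None)     # best-fit correction
    corrected = wfe + A @ cmd
    print(f"RMS WFE: {wfe.std():.3f} -> {corrected.std():.3f}")
    ```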

  12. MO-FG-202-06: Improving the Performance of Gamma Analysis QA with Radiomics- Based Image Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wootton, L; Nyflot, M; Ford, E

    2016-06-15

    Purpose: The use of gamma analysis for IMRT quality assurance has well-known limitations. Traditionally, a simple thresholding technique is used to evaluate passing criteria. However, like any image, the gamma distribution is rich in information, most of which thresholding discards. We therefore propose a novel method of analyzing gamma images that uses quantitative image features borrowed from radiomics, with the goal of improving error detection. Methods: 368 gamma images were generated from 184 clinical IMRT beams. For each beam, the dose to a phantom was measured with EPID dosimetry and compared to the TPS dose calculated with and without normally distributed (2 mm sigma) errors in MLC positions. The magnitudes of 17 intensity-histogram and size-zone radiomic features were derived from each image. The features that differed most significantly between the image sets were determined with ROC analysis. A linear machine-learning model was trained on these features to classify images as with or without errors on 180 gamma images. The model was then applied to an independent validation set of 188 additional gamma distributions, half with and half without errors. Results: The most significant features for detecting errors were histogram kurtosis (p=0.007) and three size-zone metrics (p<1e-6 for each). The size-zone metrics detected clusters of high gamma-value pixels under mispositioned MLCs. The model applied to the validation set had an AUC of 0.8, compared to 0.56 for traditional gamma analysis with the decision threshold restricted to 98% or less. Conclusion: A radiomics-based image analysis method was developed that is more effective in detecting errors than traditional gamma analysis. Though the pilot study here considers only MLC position errors, radiomics-based methods for other error types are being developed, which may provide better error detection and useful information on the source of detected errors. This work was partially supported by a grant from the Agency for Healthcare Research and Quality, grant number R18 HS022244-01.

  13. Cost-Effectiveness Analysis of an Automated Medication System Implemented in a Danish Hospital Setting.

    PubMed

    Risør, Bettina Wulff; Lisby, Marianne; Sørensen, Jan

    To evaluate the cost-effectiveness of an automated medication system (AMS) implemented in a Danish hospital setting. An economic evaluation was performed alongside a controlled before-and-after effectiveness study with one control ward and one intervention ward. The primary outcome measure was the number of errors in the medication administration process observed prospectively before and after implementation. To determine the difference in proportion of errors after implementation of the AMS, logistic regression was applied with the presence of error(s) as the dependent variable. Time, group, and interaction between time and group were the independent variables. The cost analysis used the hospital perspective with a short-term incremental costing approach. The total 6-month costs with and without the AMS were calculated as well as the incremental costs. The number of avoided administration errors was related to the incremental costs to obtain the cost-effectiveness ratio expressed as the cost per avoided administration error. The AMS resulted in a statistically significant reduction in the proportion of errors in the intervention ward compared with the control ward. The cost analysis showed that the AMS increased the ward's 6-month cost by €16,843. The cost-effectiveness ratio was estimated at €2.01 per avoided administration error, €2.91 per avoided procedural error, and €19.38 per avoided clinical error. The AMS was effective in reducing errors in the medication administration process at a higher overall cost. The cost-effectiveness analysis showed that the AMS was associated with affordable cost-effectiveness rates. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  14. System review: a method for investigating medical errors in healthcare settings.

    PubMed

    Alexander, G L; Stone, T T

    2000-01-01

    System analysis is a process of evaluating objectives, resources, structure, and design of businesses. System analysis can be used by leaders to collaboratively identify breakthrough opportunities to improve system processes. In healthcare systems, system analysis can be used to review medical errors (system occurrences) that may place patients at risk for injury, disability, and/or death. This study utilizes a case management approach to identify medical errors. Utilizing an interdisciplinary approach, a System Review Team was developed to identify trends in system occurrences, facilitate communication, and enhance the quality of patient care by reducing medical errors.

  15. Reliable absolute analog code retrieval approach for 3D measurement

    NASA Astrophysics Data System (ADS)

    Yu, Shuang; Zhang, Jing; Yu, Xiaoyang; Sun, Xiaoming; Wu, Haibin; Chen, Deyun

    2017-11-01

    The wrapped phase of the phase-shifting approach can be unwrapped using Gray code, but both wrapped-phase errors and Gray-code decoding errors can produce period jump errors, which lead to gross measurement error. This paper therefore presents a reliable absolute analog code retrieval approach. The combination of unequal-period Gray code and phase-shifting patterns at high frequencies is used to obtain the high-frequency absolute analog code, and at low frequencies the same unequal-period combination patterns are used to obtain the low-frequency absolute analog code. The difference between the two absolute analog codes is then employed to eliminate period jump errors, yielding a reliable unwrapped result. Error analysis was used to determine the applicable conditions, and the approach was verified both through theoretical analysis and experimentally. Theoretical analysis and experimental results demonstrate that the proposed approach performs reliable analog code unwrapping.
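
    The core idea, using a second absolute code to pin down the fringe order of the first so that small decoding errors cannot become whole-period jumps, can be sketched generically. The Python/numpy sketch below shows the standard dual-frequency form of this idea; it is an assumption for illustration, not the paper's exact algorithm.

      import numpy as np

      def unwrap_dual_frequency(phi_high_wrapped, phi_low_abs, freq_ratio):
          # The low-frequency absolute phase predicts the absolute high-frequency
          # phase; rounding recovers an integer fringe order, suppressing jumps
          predicted = phi_low_abs * freq_ratio
          k = np.round((predicted - phi_high_wrapped) / (2 * np.pi))
          return phi_high_wrapped + 2 * np.pi * k

      rng = np.random.default_rng(1)
      true_abs = np.linspace(0, 16 * np.pi, 1000)                # 8 fringe periods
      phi_high = np.mod(true_abs, 2 * np.pi) + rng.normal(0, 0.05, 1000)
      phi_low = true_abs / 4 + rng.normal(0, 0.05, 1000)         # freq_ratio = 4

      recovered = unwrap_dual_frequency(phi_high, phi_low, 4)
      print(np.abs(recovered - true_abs).max())                  # small residual, no jumps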

  16. Qualitative methods: beyond the cookbook.

    PubMed

    Harding, G; Gantley, M

    1998-02-01

    Qualitative methods appear increasingly in vogue in health services research (HSR). Such research, however, has utilized, often uncritically, a 'cookbook' of methods for data collection, and common-sense principles for data analysis. This paper argues that qualitative HSR benefits from recognizing and drawing upon theoretical principles underlying qualitative data collection and analysis. A distinction is drawn between problem-orientated and theory-orientated research, in order to illustrate how problem-orientated research would benefit from the introduction of theoretical perspectives in order to develop the knowledge base of health services research.

  17. ADVANCING THE STUDY OF VIOLENCE AGAINST WOMEN USING MIXED METHODS: INTEGRATING QUALITATIVE METHODS INTO A QUANTITATIVE RESEARCH PROGRAM

    PubMed Central

    Testa, Maria; Livingston, Jennifer A.; VanZile-Tamsen, Carol

    2011-01-01

    A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women’s sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided. PMID:21307032

  18. Anatomic, clinical, and neuropsychological correlates of spelling errors in primary progressive aphasia.

    PubMed

    Shim, Hyungsub; Hurley, Robert S; Rogalski, Emily; Mesulam, M-Marsel

    2012-07-01

    This study evaluates spelling errors in the three subtypes of primary progressive aphasia (PPA): agrammatic (PPA-G), logopenic (PPA-L), and semantic (PPA-S). Forty-one PPA patients and 36 age-matched healthy controls were administered a test of spelling. The total number and types of errors in spelling to dictation of regular words, exception words, and nonwords were recorded. Error types were classified based on phonetic plausibility. In the first analysis, scores were evaluated by clinical diagnosis. Errors in spelling exception words and phonetically plausible errors were seen in PPA-S. Conversely, PPA-G was associated with errors in nonword spelling and phonetically implausible errors. In the next analysis, spelling scores were correlated with other neuropsychological language test scores. Significant correlations were found between exception word spelling and measures of naming and single word comprehension. Nonword spelling correlated with tests of grammar and repetition. Global language measures, however, did not correlate significantly with spelling scores. Cortical thickness analysis based on MRI showed that atrophy in several language regions of interest was correlated with spelling errors. Atrophy in the left supramarginal gyrus and inferior frontal gyrus (IFG) pars orbitalis correlated with errors in nonword spelling, while thinning in the left temporal pole and fusiform gyrus correlated with errors in exception word spelling. Additionally, phonetically implausible errors in regular word spelling correlated with thinning in the left IFG pars triangularis and pars opercularis. Together, these findings suggest two independent systems for spelling to dictation, one phonetic (phoneme to grapheme conversion) and one lexical (whole word retrieval). Copyright © 2012 Elsevier Ltd. All rights reserved.

  19. Generalized Structured Component Analysis with Uniqueness Terms for Accommodating Measurement Error

    PubMed Central

    Hwang, Heungsun; Takane, Yoshio; Jung, Kwanghee

    2017-01-01

    Generalized structured component analysis (GSCA) is a component-based approach to structural equation modeling (SEM), where latent variables are approximated by weighted composites of indicators. It has no formal mechanism to incorporate errors in indicators, which in turn renders components prone to these errors as well. We propose to extend GSCA to account for errors in indicators explicitly. This extension, called GSCAM, considers both common and unique parts of indicators, as postulated in common factor analysis, and estimates a weighted composite of indicators with their unique parts removed. Adding such unique parts or uniqueness terms serves to account for measurement errors in indicators in a manner similar to common factor analysis. Simulation studies are conducted to compare parameter recovery of GSCAM and existing methods. These methods are also applied to fit a substantively well-established model to real data. PMID:29270146
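
    In common-factor-style notation (symbols assumed here for illustration, not taken from the paper), the extension can be sketched as follows: each indicator splits into a common part and a uniqueness term, and the component is a weighted composite of the indicators with their unique parts removed.

      % LaTeX sketch of the GSCAM decomposition (assumed notation):
      % z_j   : j-th indicator,  u_j : its uniqueness term,
      % gamma : component,       w_j : weight,  lambda_j : loading
      \[
        z_j = \lambda_j \gamma + u_j, \qquad
        \gamma = \sum_j w_j \, (z_j - u_j)
      \]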

  20. Analyse des erreurs et grammaire generative: La syntaxe de l'interrogation en francais (Error Analysis and Generative Grammar: The Syntax of Interrogation in French).

    ERIC Educational Resources Information Center

    Py, Bernard

    A progress report is presented of a study which applies a system of generative grammar to error analysis. The objective of the study was to reconstruct the grammar of students' interlanguage, using a systematic analysis of errors. (Interlanguage refers to the linguistic competence of a student who possesses a relatively systematic body of rules,…

  1. Behind Human Error: Cognitive Systems, Computers and Hindsight

    DTIC Science & Technology

    1994-12-01

    • Organize and/or conduct workshops and conferences. CSERIAC is a Department of Defense Information Analysis Center sponsored by the Defense... Excerpted contents: Process (185); Neutral Observer Criteria (191); Error Analysis as Causal Judgment (193); Error as Information (195); A Fundamental Surprise (195); What is Human... ...Kahnemann, 1974), and in risk analysis (Dougherty and Fragola, 1990). The discussions have continued in a wide variety of forums, including the...

  2. Qualitative Research in Career Development: Content Analysis from 1990 to 2009

    ERIC Educational Resources Information Center

    Stead, Graham B.; Perry, Justin C.; Munka, Linda M.; Bonnett, Heather R.; Shiban, Abbey P.; Care, Esther

    2012-01-01

    A content analysis of 11 journals that published career, vocational, and work-related articles from 1990 to 2009 was conducted. Of 3,279 articles analyzed, 55.9% used quantitative methods and 35.5% were theoretical/conceptual articles. Only 6.3% used qualitative research methods. Among the qualitative empirical studies, standards of academic rigor…

  3. A Grounded Theory of Inductive Qualitative Research Education: Results of a Meta-Data-Analysis

    ERIC Educational Resources Information Center

    Cooper, Robin; Chenail, Ronald J.; Fleming, Stephanie

    2012-01-01

    This paper reports on the first stage of a meta-study conducted by the authors on primary research published during the last thirty years that focused on discovering the experiences of students learning qualitative research. The authors carried out a meta-analysis of the findings of students' experiences learning qualitative research included in…

  4. The Voices of Higher Education Service-Learning Directors: A Qualitative Inductive Analysis

    ERIC Educational Resources Information Center

    Woodard, Kelsey

    2013-01-01

    This research explored issues surrounding service-learning directors (SLDs) within higher education institutions, including who they are, how they became SLDs, and what they experience in the role. Qualitative data were drawn from in-depth interviews of 11 SLDs, as well as review of their vitae. A qualitative inductive analysis was conducted in…

  5. Increasing the trustworthiness of research results: the role of computers in qualitative text analysis

    Treesearch

    Lynne M. Westphal

    2000-01-01

    By using computer packages designed for qualitative data analysis, a researcher can increase the trustworthiness (i.e., validity and reliability) of conclusions drawn from qualitative research results. This paper examines trustworthiness issues and the role of computer software (QSR's NUD*IST) in the context of a current research project investigating the social...

  6. A Simple Card Trick: Teaching Qualitative Data Analysis Using a Deck of Playing Cards

    ERIC Educational Resources Information Center

    Waite, Duncan

    2011-01-01

    Yet today, despite recent welcome additions, relatively little is written about teaching qualitative research. Why is that? This article reports out a relatively simple, yet appealing, pedagogical move, a lesson the author uses to teach qualitative data analysis. Data sorting and categorization, the use of tacit and explicit theory in data…

  7. Expediting the Analysis of Qualitative Data in Evaluation: A Procedure for the Rapid Identification of Themes from Audio Recordings (RITA)

    ERIC Educational Resources Information Center

    Neal, Jennifer Watling; Neal, Zachary P.; VanDyke, Erika; Kornbluh, Mariah

    2015-01-01

    Qualitative data offer advantages to evaluators, including rich information about stakeholders' perspectives and experiences. However, qualitative data analysis is labor-intensive and slow, conflicting with evaluators' needs to provide punctual feedback to their clients. In this method note, we contribute to the literature on rapid evaluation and…

  8. The impact of rotator cuff tendinopathy on proprioception, measuring force sensation.

    PubMed

    Maenhout, Annelies G; Palmans, Tanneke; De Muynck, Martine; De Wilde, Lieven F; Cools, Ann M

    2012-08-01

    The impact of rotator cuff tendinopathy and related impingement on proprioception is not well understood. Numerous quantitative and qualitative changes in shoulder muscles have been shown in patients with rotator cuff tendinopathy. These findings suggest that control of force might be affected. This investigation aims to evaluate force sensation, a submodality of proprioception, in patients with rotator cuff tendinopathy. Thirty-six patients with rotator cuff tendinopathy and 30 matched healthy subjects performed force reproduction tests in isometric external and internal rotation to investigate how accurately they could reproduce a fixed target (50% MVC). Relative error, constant error, and force steadiness were calculated to evaluate, respectively, the magnitude of error made during the test, the direction of this error (overshoot or undershoot), and fluctuations of the produced forces. Patients significantly overshot the target (mean, 6.04% of target) while healthy subjects underestimated the target (mean, -5.76% of target). Relative error and force steadiness were similar in patients with rotator cuff tendinopathy and healthy subjects. Force reproduction tests, as executed in this study, were found to be highly reliable (ICC 0.849 and 0.909). Errors were significantly larger during external rotation tests, compared to internal rotation. Patients overestimated the target during force reproduction tests. This should be taken into account in the rehabilitation of patients with rotator cuff tendinopathy; however, precision of force sensation and steadiness of force exertion remain unaltered. This might indicate that control of muscle force is preserved. Copyright © 2012 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Mosby, Inc. All rights reserved.
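
    For concreteness, the three outcome measures named in this abstract can be computed as in the short Python sketch below. The exact formulas are assumptions (common definitions: constant error as mean signed deviation, relative error as mean absolute deviation, steadiness as the coefficient of variation during a hold), and the numbers are invented.

      import numpy as np

      target = 50.0                                    # 50% MVC target (arbitrary units)
      trials = np.array([53.2, 51.0, 54.1, 52.5])      # reproduced force per trial
      hold = np.array([52.9, 53.4, 53.1, 52.6, 53.0])  # force samples during one hold

      constant_error = np.mean((trials - target) / target) * 100        # signed; >0 = overshoot
      relative_error = np.mean(np.abs(trials - target) / target) * 100  # magnitude only
      steadiness = hold.std() / hold.mean() * 100                       # fluctuation (CV, %)

      print(constant_error, relative_error, steadiness)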

  9. Student Self-Assessment and Faculty Assessment of Performance in an Interprofessional Error Disclosure Simulation Training Program.

    PubMed

    Poirier, Therese I; Pailden, Junvie; Jhala, Ray; Ronald, Katie; Wilhelm, Miranda; Fan, Jingyang

    2017-04-01

    Objectives. To conduct a prospective evaluation of the effectiveness of an error disclosure assessment tool and video recordings to enhance student learning and metacognitive skills while assessing the IPEC competencies. Design. The instruments for assessing performance (planning, communication, process, and team dynamics) in interprofessional error disclosure were developed. Student self-assessments of performance before and after viewing the recordings of their encounters were obtained. Faculty used a similar instrument to conduct real-time assessments. An instrument to assess achievement of the Interprofessional Education Collaborative (IPEC) core competencies was developed. Qualitative data were reviewed to determine student and faculty perceptions of the simulation. Assessment. The interprofessional simulation training involved a total of 233 students (50 dental, 109 nursing, and 74 pharmacy). Use of video recordings made a significant difference in student self-assessment for the communication and process categories of error disclosure. No differences in student self-assessments were noted among the different professions. There were differences among the family member affects for planning and communication for both pre-video and post-video data. There were significant differences between student self-assessment and faculty assessment for all paired comparisons, except communication in student post-video self-assessment. Students' perceptions of achievement of the IPEC core competencies were positive. Conclusion. The use of assessment instruments and video recordings may have enhanced students' metacognitive skills for assessing performance in interprofessional error disclosure. The simulation training was effective in enhancing perceptions of achievement of the IPEC core competencies. This enhanced assessment process appeared to strengthen learning about the skills needed for interprofessional error disclosure.

  10. Diagnosis is a team sport - partnering with allied health professionals to reduce diagnostic errors.

    PubMed

    Thomas, Dana B; Newman-Toker, David E

    2016-06-01

    Diagnostic errors are the most common, most costly, and most catastrophic of medical errors. Interdisciplinary teamwork has been shown to reduce harm from therapeutic errors, but sociocultural barriers may impact the engagement of allied health professionals (AHPs) in the diagnostic process. We present a qualitative case study of the experience at a single institution with the involvement of an AHP in the diagnostic process for acute dizziness and vertigo. We detail five diagnostic error cases in which the input of a physical therapist was central to correct diagnosis. We further describe the evolution of the sociocultural milieu at the institution as it relates to AHP engagement in diagnosis. Five patients with acute vestibular symptoms were initially misdiagnosed by physicians and then correctly diagnosed based on input from a vestibular physical therapist. These included missed labyrinthine concussion and post-traumatic benign paroxysmal positional vertigo (BPPV); BPPV called gastroenteritis; BPPV called stroke; stroke called BPPV; and multiple sclerosis called BPPV. As a consequence of surfacing these diagnostic errors, initial resistance to physical therapy input to aid medical diagnosis has gradually declined, creating a more collaborative environment for 'team diagnosis' of patients with dizziness and vertigo at the institution. Barriers to AHP engagement in 'team diagnosis' include sociocultural norms that establish medical diagnosis as something reserved only for physicians. Drawing attention to the valuable diagnostic contributions of AHPs may help facilitate cultural change. Future studies should seek to measure diagnostic safety culture and then implement proven strategies to break down sociocultural barriers that inhibit effective teamwork and transdisciplinary diagnosis.

  11. Diagnosis is a team sport - partnering with allied health professionals to reduce diagnostic errors: A case study on the role of a vestibular therapist in diagnosing dizziness.

    PubMed

    Thomas, Dana B; Newman-Toker, David E

    2016-06-01

    Diagnostic errors are the most common, most costly, and most catastrophic of medical errors. Interdisciplinary teamwork has been shown to reduce harm from therapeutic errors, but sociocultural barriers may impact the engagement of allied health professionals (AHPs) in the diagnostic process. We present a qualitative case study of the experience at a single institution with the involvement of an AHP in the diagnostic process for acute dizziness and vertigo. We detail five diagnostic error cases in which the input of a physical therapist was central to correct diagnosis. We further describe the evolution of the sociocultural milieu at the institution as it relates to AHP engagement in diagnosis. Five patients with acute vestibular symptoms were initially misdiagnosed by physicians and then correctly diagnosed based on input from a vestibular physical therapist. These included missed labyrinthine concussion and post-traumatic benign paroxysmal positional vertigo (BPPV); BPPV called gastroenteritis; BPPV called stroke; stroke called BPPV; and multiple sclerosis called BPPV. As a consequence of surfacing these diagnostic errors, initial resistance to physical therapy input to aid medical diagnosis has gradually declined, creating a more collaborative environment for 'team diagnosis' of patients with dizziness and vertigo at the institution. Barriers to AHP engagement in 'team diagnosis' include sociocultural norms that establish medical diagnosis as something reserved only for physicians. Drawing attention to the valuable diagnostic contributions of AHPs may help facilitate cultural change. Future studies should seek to measure diagnostic safety culture and then implement proven strategies to break down sociocultural barriers that inhibit effective teamwork and transdisciplinary diagnosis.

  12. Student Self-Assessment and Faculty Assessment of Performance in an Interprofessional Error Disclosure Simulation Training Program

    PubMed Central

    Pailden, Junvie; Jhala, Ray; Ronald, Katie; Wilhelm, Miranda; Fan, Jingyang

    2017-01-01

    Objectives. To conduct a prospective evaluation of the effectiveness of an error disclosure assessment tool and video recordings to enhance student learning and metacognitive skills while assessing the IPEC competencies. Design. The instruments for assessing performance (planning, communication, process, and team dynamics) in interprofessional error disclosure were developed. Student self-assessments of performance before and after viewing the recordings of their encounters were obtained. Faculty used a similar instrument to conduct real-time assessments. An instrument to assess achievement of the Interprofessional Education Collaborative (IPEC) core competencies was developed. Qualitative data were reviewed to determine student and faculty perceptions of the simulation. Assessment. The interprofessional simulation training involved a total of 233 students (50 dental, 109 nursing, and 74 pharmacy). Use of video recordings made a significant difference in student self-assessment for the communication and process categories of error disclosure. No differences in student self-assessments were noted among the different professions. There were differences among the family member affects for planning and communication for both pre-video and post-video data. There were significant differences between student self-assessment and faculty assessment for all paired comparisons, except communication in student post-video self-assessment. Students’ perceptions of achievement of the IPEC core competencies were positive. Conclusion. The use of assessment instruments and video recordings may have enhanced students’ metacognitive skills for assessing performance in interprofessional error disclosure. The simulation training was effective in enhancing perceptions of achievement of the IPEC core competencies. This enhanced assessment process appeared to strengthen learning about the skills needed for interprofessional error disclosure. PMID:28496274

  13. Clustering Methods with Qualitative Data: A Mixed Methods Approach for Prevention Research with Small Samples

    PubMed Central

    Henry, David; Dymnicki, Allison B.; Mohatt, Nathaniel; Allen, James; Kelly, James G.

    2016-01-01

    Qualitative methods potentially add depth to prevention research, but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data, but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-Means clustering, and latent class analysis produced similar levels of accuracy with binary data, and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a “real-world” example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities. PMID:25946969

  14. Clustering Methods with Qualitative Data: a Mixed-Methods Approach for Prevention Research with Small Samples.

    PubMed

    Henry, David; Dymnicki, Allison B; Mohatt, Nathaniel; Allen, James; Kelly, James G

    2015-10-01

    Qualitative methods potentially add depth to prevention research but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed-methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed-methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-means clustering, and latent class analysis produced similar levels of accuracy with binary data and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a "real-world" example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities.
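
    The first study's setup, clustering participants by binary profiles of qualitative codes, is easy to reproduce in outline. Below is a minimal Python sketch (synthetic data; sklearn's K-means and hierarchical clustering as stand-ins for two of the methods compared, with latent class analysis omitted):

      import numpy as np
      from sklearn.cluster import KMeans, AgglomerativeClustering

      rng = np.random.default_rng(2)
      # 50 participants x 12 binary codes, two latent profiles (codes 0-5 vs. 6-11)
      profile_a = rng.binomial(1, 0.8, size=(25, 12)) * (np.arange(12) < 6)
      profile_b = rng.binomial(1, 0.8, size=(25, 12)) * (np.arange(12) >= 6)
      X = np.vstack([profile_a, profile_b]).astype(float)

      km = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
      hc = AgglomerativeClustering(n_clusters=2).fit_predict(X)
      print(km, hc, sep="\n")   # both should separate the two profiles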

  15. Concurrent analysis: towards generalisable qualitative research.

    PubMed

    Snowden, Austyn; Martin, Colin R

    2011-10-01

    This study develops an original method of qualitative analysis coherent with its interpretivist principles. The objective is to increase the likelihood of achieving generalisability and so improve the chance of the findings being translated into practice. Good qualitative research depends on coherent analysis of different types of data. The limitations of existing methodologies are first discussed to justify the need for a novel approach. To illustrate this approach, primary evidence is presented using the new methodology. The primary evidence consists of a constructivist grounded theory of how mental health nurses with prescribing authority integrate prescribing into practice. This theory is built concurrently from interviews, reflective accounts and case study data from the literature. Concurrent analysis. Ten research articles and 13 semi-structured interviews were sampled purposively and then theoretically and analysed concurrently using constructivist grounded theory. A theory of the process of becoming competent in mental health nurse prescribing was generated through this process. This theory was validated by 32 practising mental health nurse prescribers as an accurate representation of their experience. The methodology generated a coherent and generalisable theory. It is therefore claimed that concurrent analysis engenders consistent and iterative treatment of different sources of qualitative data in a manageable manner. This process supports facilitation of the highest standard of qualitative research. Concurrent analysis removes the artificial delineation of relevant literature from other forms of constructed data. This gives researchers clear direction to treat qualitative data consistently raising the chances of generalisability of the findings. Raising the generalisability of qualitative research will increase its chances of informing clinical practice. © 2010 Blackwell Publishing Ltd.

  16. The causes of and factors associated with prescribing errors in hospital inpatients: a systematic review.

    PubMed

    Tully, Mary P; Ashcroft, Darren M; Dornan, Tim; Lewis, Penny J; Taylor, David; Wass, Val

    2009-01-01

    Prescribing errors are common; they result in adverse events and harm to patients, and it is unclear how best to prevent them because recommendations are more often based on surmised rather than empirically collected data. The aim of this systematic review was to identify all informative published evidence concerning the causes of and factors associated with prescribing errors in specialist and non-specialist hospitals, collate it, analyse it qualitatively and synthesize conclusions from it. Seven electronic databases were searched for articles published between 1985 and July 2008. The reference lists of all informative studies were searched for additional citations. To be included, a study had to be of handwritten prescriptions for adult or child inpatients that reported empirically collected data on the causes of or factors associated with errors. Publications in languages other than English and studies that evaluated errors for only one disease, one route of administration or one type of prescribing error were excluded. Seventeen papers reporting 16 studies, selected from 1268 papers identified by the search, were included in the review. Studies from the US and the UK in university-affiliated hospitals predominated (10/16 [62%]). The definition of a prescribing error varied widely and the included studies were highly heterogeneous. Causes were grouped according to Reason's model of accident causation into active failures, error-provoking conditions and latent conditions. The active failure most frequently cited was a mistake due to inadequate knowledge of the drug or the patient. Skills-based slips and memory lapses were also common. Where error-provoking conditions were reported, there was at least one per error. These included lack of training or experience, fatigue, stress, high workload for the prescriber and inadequate communication between healthcare professionals. Latent conditions included reluctance to question senior colleagues and inadequate provision of training. Prescribing errors are often multifactorial, with several active failures and error-provoking conditions often acting together to cause them. In the face of such complexity, solutions addressing a single cause, such as lack of knowledge, are likely to have only limited benefit. Further rigorous study, seeking potential ways of reducing error, needs to be conducted. Multifactorial interventions across many parts of the system are likely to be required.

  17. Accuracy improvement of the H-drive air-levitating wafer inspection stage based on error analysis and compensation

    NASA Astrophysics Data System (ADS)

    Zhang, Fan; Liu, Pinkuan

    2018-04-01

    To improve the inspection precision of the H-drive air-bearing stage for wafer inspection, this paper analyzes and compensates the geometric error of the stage. The relationship between the positioning errors and error sources is initially modeled, and seven error components are identified that are closely related to the inspection accuracy. The factor that most strongly affects the geometric error is identified by error sensitivity analysis. Then, the Spearman rank correlation method is applied to find the correlation between different error components, to guide the accuracy design and error compensation of the stage. Finally, different compensation methods, including the three-error curve interpolation method, the polynomial interpolation method, the Chebyshev polynomial interpolation method, and the B-spline interpolation method, are employed within the full range of the stage, and their results are compared. Simulation and experiment show that the B-spline interpolation method based on the error model has better compensation results. In addition, the research result is valuable for promoting wafer inspection accuracy and will greatly benefit the semiconductor industry.
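
    As a sketch of the winning strategy (model the measured positioning error with a B-spline over calibration points, then subtract the model's prediction from each commanded position), consider the following Python snippet. The calibration data and interface are assumptions for illustration, not the paper's implementation.

      import numpy as np
      from scipy.interpolate import make_interp_spline

      cal_pos = np.linspace(0, 300, 31)   # commanded positions at calibration (mm)
      cal_err = 0.004 * np.sin(cal_pos / 40) + 0.001 * cal_pos / 300  # measured error (mm)

      spline = make_interp_spline(cal_pos, cal_err, k=3)  # cubic B-spline error model

      def compensate(commanded_mm):
          # subtract the predicted geometric error from the commanded position
          return commanded_mm - spline(commanded_mm)

      print(compensate(np.array([10.0, 155.5, 290.0])))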

  18. Error rate information in attention allocation pilot models

    NASA Technical Reports Server (NTRS)

    Faulkner, W. H.; Onstott, E. D.

    1977-01-01

    The Northrop urgency decision pilot model was used in a command tracking task to compare the optimized performance of multiaxis attention allocation pilot models whose urgency functions were (1) based on tracking error alone, and (2) based on both tracking error and error rate. A matrix of system dynamics and command inputs was employed to create both symmetric and asymmetric two-axis compensatory tracking tasks. All tasks were single loop on each axis. Analysis showed that a model that allocates control attention through nonlinear urgency functions using only error information could not match the performance of the full model, whose attention-shifting algorithm included both error and error-rate terms. Subsequent to this analysis, tracking performance predictions for the full model were verified by piloted flight simulation. Complete model and simulation data are presented.
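
    The difference between the two urgency formulations can be shown in toy form: attention goes to the axis with the highest urgency, and including an error-rate term shifts attention to an axis whose error is still small but diverging quickly. The Python sketch below is illustrative only; it is not the Northrop model, and the urgency function and gain are assumptions.

      import numpy as np

      def pick_axis(err, err_rate, use_rate, k=0.5):
          # urgency per axis; the attended axis is the one with maximum urgency
          urgency = np.abs(err) + (k * np.abs(err_rate) if use_rate else 0.0)
          return int(np.argmax(urgency))

      err = np.array([0.4, 0.3])        # axis 0 currently has the larger error...
      err_rate = np.array([0.0, 1.2])   # ...but axis 1 is diverging fast

      print(pick_axis(err, err_rate, use_rate=False))  # 0: error-only urgency
      print(pick_axis(err, err_rate, use_rate=True))   # 1: error + error-rate urgency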

  19. The qualitative orientation in medical education research.

    PubMed

    Cleland, Jennifer Anne

    2017-06-01

    Qualitative research is very important in educational research as it addresses the "how" and "why" research questions and enables deeper understanding of experiences, phenomena and context. Qualitative research allows you to ask questions that cannot be easily put into numbers to understand human experience. Getting at the everyday realities of some social phenomenon and studying important questions as they are really practiced helps extend knowledge and understanding. To do so, you need to understand the philosophical stance of qualitative research and work from this to develop the research question, study design, data collection methods and data analysis. In this article, I provide an overview of the assumptions underlying qualitative research and the role of the researcher in the qualitative process. I then go on to discuss the types of research objectives that are common in qualitative research, then introduce the main qualitative designs, data collection tools, and finally the basics of qualitative analysis. I introduce the criteria by which you can judge the quality of qualitative research. Many classic references are cited in this article, and I urge you to seek out some of this further reading to inform your qualitative research program.

  20. The qualitative orientation in medical education research

    PubMed Central

    2017-01-01

    Qualitative research is very important in educational research as it addresses the “how” and “why” research questions and enables deeper understanding of experiences, phenomena and context. Qualitative research allows you to ask questions that cannot be easily put into numbers to understand human experience. Getting at the everyday realities of some social phenomenon and studying important questions as they are really practiced helps extend knowledge and understanding. To do so, you need to understand the philosophical stance of qualitative research and work from this to develop the research question, study design, data collection methods and data analysis. In this article, I provide an overview of the assumptions underlying qualitative research and the role of the researcher in the qualitative process. I then go on to discuss the types of research objectives that are common in qualitative research, then introduce the main qualitative designs, data collection tools, and finally the basics of qualitative analysis. I introduce the criteria by which you can judge the quality of qualitative research. Many classic references are cited in this article, and I urge you to seek out some of this further reading to inform your qualitative research program. PMID:28597869
