Love, Peter E D; Smith, Jim; Teo, Pauline
2018-05-01
Error management theory is drawn upon to examine how a project-based organization, which took the form of a program alliance, was able to change its established error prevention mindset to one that enacted a learning mindfulness that provided an avenue to curtail its action errors. The program alliance was required to unlearn its existing routines and beliefs to accommodate the practices required to embrace error management. As a result of establishing an error management culture the program alliance was able to create a collective mindfulness that nurtured learning and supported innovation. The findings provide a much-needed context to demonstrate the relevance of error management theory to effectively address rework and safety problems in construction projects. The robust theoretical underpinning that is grounded in practice and presented in this paper provides a mechanism to engender learning from errors, which can be utilized by construction organizations to improve the productivity and performance of their projects. Copyright © 2018 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Henningsen, David Dryden; Henningsen, Mary Lynn Miller
2010-01-01
Research on error management theory indicates that men tend to overestimate women's sexual interest and women underestimate men's interest in committed relationships (Haselton & Buss, 2000). We test the assumptions of the theory in face-to-face, stranger interactions with 111 man-woman dyads. Support for the theory emerges, but potential boundary…
Why hard-nosed executives should care about management theory.
Christensen, Clayton M; Raynor, Michael E
2003-09-01
Theory often gets a bum rap among managers because it's associated with the word "theoretical," which connotes "impractical." But it shouldn't. Because experience is solely about the past, solid theories are the only way managers can plan future actions with any degree of confidence. The key word here is "solid." Gravity is a solid theory. As such, it lets us predict that if we step off a cliff we will fall, without actually having to do so. But business literature is replete with theories that don't seem to work in practice or actually contradict each other. How can a manager tell a good business theory from a bad one? The first step is understanding how good theories are built. They develop in three stages: gathering data, organizing it into categories highlighting significant differences, then making generalizations explaining what causes what, under which circumstances. For instance, professor Ananth Raman and his colleagues collected data showing that bar code-scanning systems generated notoriously inaccurate inventory records. These observations led them to classify the types of errors the scanning systems produced and the types of shops in which those errors most often occurred. Recently, some of Raman's doctoral students have worked as clerks to see exactly what kinds of behavior cause the errors. From this foundation, a solid theory predicting under which circumstances bar code systems work, and don't work, is beginning to emerge. Once we forgo one-size-fits-all explanations and insist that a theory describes the circumstances under which it does and doesn't work, we can bring predictable success to the world of management.
Recognizing and managing errors of cognitive underspecification.
Duthie, Elizabeth A
2014-03-01
James Reason describes cognitive underspecification as incomplete communication that creates a knowledge gap. Errors occur when an information mismatch arises in bridging that gap, with a resulting lack of shared mental models during the communication process. There is a paucity of studies in health care examining this cognitive error and the role it plays in patient harm. The goal of the following case analyses is to facilitate accurate recognition of this error, identify how it contributes to patient harm, and suggest appropriate management strategies. Reason's human error theory is applied in case analyses of errors of cognitive underspecification. Sidney Dekker's theory of human incident investigation is applied to event investigation to facilitate identification of this little-recognized error. Contributory factors leading to errors of cognitive underspecification include workload demands, interruptions, inexperienced practitioners, and lack of a shared mental model. Detecting errors of cognitive underspecification relies on blame-free listening and timely incident investigation. Strategies for interception include two-way interactive communication, standardization of communication processes, and technological support to ensure timely access to documented clinical information. Although errors of cognitive underspecification arise at the sharp end with the care provider, effective management is dependent upon system redesign that mitigates the latent contributory factors. Cognitive underspecification is ubiquitous whenever communication occurs. Accurate identification is essential if effective system redesign is to occur.
Individual differences in political ideology are effects of adaptive error management.
Petersen, Michael Bang; Aarøe, Lene
2014-06-01
We apply error management theory to the analysis of individual differences in the negativity bias and political ideology. Using principles from evolutionary psychology, we propose a coherent theoretical framework for understanding (1) why individuals differ in their political ideology and (2) the conditions under which these individual differences influence and fail to influence the political choices people make.
Error management for musicians: an interdisciplinary conceptual framework
Kruse-Weber, Silke; Parncutt, Richard
2014-01-01
Musicians tend to strive for flawless performance and perfection, avoiding errors at all costs. Dealing with errors while practicing or performing is often frustrating and can lead to anger and despair, which can explain musicians' generally negative attitude toward errors and the tendency to aim for flawless learning in instrumental music education. But even the best performances are rarely error-free, and research in general pedagogy and psychology has shown that errors provide useful information for the learning process. Research in instrumental pedagogy is still neglecting error issues; the benefits of risk management (before the error) and error management (during and after the error) are still underestimated. It follows that dealing with errors is a key aspect of music practice at home, teaching, and performance in public. And yet, to be innovative, or to make their performance extraordinary, musicians need to risk errors. Currently, most music students only acquire the ability to manage errors implicitly – or not at all. A more constructive, creative, and differentiated culture of errors would balance error tolerance and risk-taking against error prevention in ways that enhance music practice and music performance. The teaching environment should lay the foundation for the development of such an approach. In this contribution, we survey recent research in aviation, medicine, economics, psychology, and interdisciplinary decision theory that has demonstrated that specific error-management training can promote metacognitive skills that lead to better adaptive transfer and better performance skills. We summarize how this research can be applied to music, and survey relevant research that is specifically tailored to the needs of musicians, including generic guidelines for risk and error management in music teaching and performance. On this basis, we develop a conceptual framework for risk management that can provide orientation for music education and for musicians at all levels. PMID:25120501
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cappelli, M.; Gadomski, A. M.; Sepiellis, M.
In the field of nuclear power plant (NPP) safety modeling, the perception of the role of socio-cognitive engineering (SCE) is continuously increasing. Today, the focus is especially on the identification of human and organizational decisional errors caused by operators and managers under high-risk conditions, as is evident from analyses of reports on past nuclear incidents. At present, engineering and social safety requirements need to enlarge their domain of interest to include all possible loss-generating events that could be the consequences of an abnormal state of an NPP. Socio-cognitive modeling of Integrated Nuclear Safety Management (INSM) using the TOGA meta-theory was discussed during the ICCAP 2011 Conference. In this paper, more detailed aspects of cognitive decision-making and its possible human errors and organizational vulnerability are presented. The formal TOGA-based network model for cognitive decision-making makes it possible to indicate and analyze the nodes and arcs at which plant operators' and managers' errors may appear. The TOGA multi-level IPK (Information, Preferences, Knowledge) model of abstract intelligent agents (AIAs) is applied. In the NPP context, a super-safety approach is also discussed, taking into consideration unexpected events and managing them from a systemic perspective. As the nature of human errors depends on the specific properties of the decision-maker and the decisional context of operation, a classification of decision-making using IPK is suggested. Several types of initial decision-making situations useful for the diagnosis of NPP operators' and managers' errors are considered. The developed models can be used as a basis for applications to NPP educational or engineering simulators for training NPP executive staff. (authors)
A Case of Error Disclosure: A Communication Privacy Management Analysis
Petronio, Sandra; Helft, Paul R.; Child, Jeffrey T.
2013-01-01
To better understand the process of disclosing medical errors to patients, this research offers a case analysis using Petronio's theoretical frame of Communication Privacy Management (CPM). Given the resistance clinicians often feel about error disclosure, insight into the way clinicians choose to tell patients about mistakes has the potential to address reasons for resistance. Applying the evidence-based CPM theory, developed over the last 35 years and dedicated to studying disclosure phenomena, to disclosing medical mistakes potentially has the ability to reshape thinking about the error disclosure process. Using a composite case representing a surgical mistake, an analysis based on CPM theory is offered to gain insights into conversational routines and disclosure management choices in revealing a medical error. The results of this analysis show that an underlying assumption of health information ownership by the patient and family can be at odds with the way the clinician tends to control disclosure about the error. In addition, the case analysis illustrates that there are embedded patterns of disclosure that emerge out of conversations the clinician has with the patient and the patient's family members. These patterns reflect privacy management decisions on the part of the clinician that affect how the patient is told about the error and the way patients interpret the meaning of the disclosure. These findings suggest the need for a better understanding of how patients manage their private health information in relation to their expectations for the way they see the clinician caring for or controlling their health information about errors. Significance for public health: Much of the mission central to public health sits squarely on the ability to communicate effectively. This case analysis offers an in-depth assessment of how error disclosure is complicated by misunderstandings, assumptions of ownership and control over information, unwittingly followed conversational scripts that convey misleading messages, and the difficulty of regulating privacy boundaries in the stressful circumstances that accompany error disclosures. As a consequence, the potential contribution to public health is the ability to more clearly see the significance of the disclosure process, which has implications for many public health issues. PMID:25170501
System safety management: A new discipline
NASA Technical Reports Server (NTRS)
Pope, W. C.
1971-01-01
Systems theory is discussed in relation to safety management. It is suggested that system safety management, as a new discipline, holds great promise for reducing operating errors, conserving labor resources, avoiding operating costs due to mistakes, and improving managerial techniques. It is pointed out that managerial failures or system breakdowns are the basic reasons for human errors and condition defects. In this respect, a recommendation is made that safety engineers stop visualizing the problem as residing only with the individual (supervisor or employee) and instead see it from the systems point of view.
Nurses' attitude and intention of medication administration error reporting.
Hung, Chang-Chiao; Chu, Tsui-Ping; Lee, Bih-O; Hsiao, Chia-Chi
2016-02-01
The aims of this study were to explore the effects of nurses' attitudes and intentions regarding medication administration error reporting on actual reporting behaviours. Underreporting of medication errors is still a common occurrence. Whether attitudes and intentions towards medication administration error reporting connect to actual reporting behaviours remains unclear. This study used a cross-sectional design with self-administered questionnaires, and the theory of planned behaviour was used as the framework for this study. A total of 596 staff nurses who worked in general wards and intensive care units in a hospital were invited to participate in this study. The researchers used instruments measuring nurses' attitude, nurse managers' and co-workers' attitude, report control, and nurses' intention to predict nurses' actual reporting behaviours. Data were collected from September-November 2013. Path analyses were used to examine the hypothesized model. Of the 596 nurses invited to participate, 548 (92%) completed and returned a valid questionnaire. The findings indicated that nurse managers' and co-workers' attitudes are predictors of nurses' attitudes towards medication administration error reporting. Nurses' attitudes also influenced their intention to report medication administration errors; however, no connection was found between intention and actual reporting behaviour. The findings reflect links among colleague perspectives, nurses' attitudes, and intention to report medication administration errors. The researchers suggest that hospitals should increase nurses' awareness and recognition of error occurrence. Regardless of nurse managers' and co-workers' attitudes towards medication administration error reporting, nurses are likely to report medication administration errors if they detect them. Management of medication administration errors should focus on increasing nurses' awareness and recognition of error occurrence. © 2015 John Wiley & Sons Ltd.
Reliability issues in active control of large flexible space structures
NASA Technical Reports Server (NTRS)
Vandervelde, W. E.
1986-01-01
Efforts in this reporting period were centered on four research tasks: design of failure detection filters for robust performance in the presence of modeling errors, design of generalized parity relations for robust performance in the presence of modeling errors, design of failure-sensitive observers using the geometric system theory of Wonham, and computational techniques for evaluation of the performance of control systems with fault tolerance and redundancy management.
Teaching organization theory for healthcare management: three applied learning methods.
Olden, Peter C
2006-01-01
Organization theory (OT) provides a way of seeing, describing, analyzing, understanding, and improving organizations based on patterns of organizational design and behavior (Daft 2004). It gives managers models, principles, and methods with which to diagnose and fix organization structure, design, and process problems. Health care organizations (HCOs) face serious problems such as fatal medical errors, harmful treatment delays, misuse of scarce nurses, costly inefficiency, and service failures. Some of health care managers' most critical work involves designing and structuring their organizations so their missions, visions, and goals can be achieved, and in some cases so their organizations can survive. Thus, it is imperative that graduate healthcare management programs develop effective approaches for teaching OT to students who will manage HCOs. Guided by principles of education, three applied teaching/learning activities/assignments were created to teach OT in a graduate healthcare management program. These educational methods develop students' competency with OT applied to HCOs. The teaching techniques in this article may be useful to faculty teaching graduate courses in organization theory and related subjects such as leadership, quality, and operations management.
Naik, Aanand Dinkar; Rao, Raghuram; Petersen, Laura Ann
2008-01-01
Diagnostic errors are poorly understood despite being a frequent cause of medical errors. Recent efforts have aimed to advance the "basic science" of diagnostic error prevention by tracing errors to their most basic origins. Although a refined theory of diagnostic error prevention will take years to formulate, we focus on communication breakdown, a major contributor to diagnostic errors and an increasingly recognized preventable factor in medical mishaps. We describe a comprehensive framework that integrates the potential sources of communication breakdowns within the diagnostic process and identifies vulnerable steps in the diagnostic process where various types of communication breakdowns can precipitate error. We then discuss potential information technology-based interventions that may have efficacy in preventing one or more forms of these breakdowns. These possible intervention strategies include using new technologies to enhance communication between health providers and health systems, improve patient involvement, and facilitate management of information in the medical record. PMID:18373151
Where Are the Logical Errors in the Theory of Big Bang?
NASA Astrophysics Data System (ADS)
Kalanov, Temur Z.
2015-04-01
A critical analysis of the foundations of the theory of the Big Bang is proposed. The unity of formal logic and rational dialectics is the methodological basis of the analysis. It is argued that the starting point of the theory of the Big Bang contains three fundamental logical errors. The first error is the assumption that a macroscopic object (having qualitative determinacy) can have an arbitrarily small size and can be in the singular state (i.e., in a state that has no qualitative determinacy). This assumption implies that the transition (macroscopic object having qualitative determinacy) --> (singular state of matter having no qualitative determinacy) leads to loss of the information contained in the macroscopic object. The second error is the assumption that there exist the void and a boundary between matter and void. But if such a boundary existed, it would mean that the void has dimensions and can be measured. The third error is the assumption that the singular state of matter can make a transition into the normal state without the existence of a program of qualitative and quantitative development of the matter and without the controlling influence of another (independent) object. However, these assumptions conflict with practice and, consequently, with formal logic, rational dialectics, and cybernetics. Indeed, from the point of view of cybernetics, the transition (singular state of the Universe) --> (normal state of the Universe) would be possible only if there existed a Managed Object that is outside the Universe and has full, complete, and detailed information about the Universe. Thus, the theory of the Big Bang is a scientific fiction.
Adopting Extensible Business Reporting Language (XBRL): A Grounded Theory
ERIC Educational Resources Information Center
Cruz, Marivic
2010-01-01
In 2007 and 2008, government challenges consisted of error-prone, manually intensive, and inefficient environments for financial reporting. Banking regulators worldwide faced issues with respect to transparency, timeliness, quality, and managing risks associated with accounting opacity. The general problem was the existing reporting standards and…
Formulation of a strategy for monitoring control integrity in critical digital control systems
NASA Technical Reports Server (NTRS)
Belcastro, Celeste M.; Fischl, Robert; Kam, Moshe
1991-01-01
Advanced aircraft will require flight-critical computer systems for stability augmentation as well as guidance and control that must perform reliably in adverse, as well as nominal, operating environments. Digital system upset is a functional error mode that can occur in electromagnetically harsh environments, involves no component damage, can occur simultaneously in all channels of a redundant control computer, and is software dependent. A strategy is presented for dynamic upset detection to be used in the evaluation of critical digital controllers during the design and/or validation phases of development. Critical controllers must be usable in the adverse environments that result from disturbances caused by electromagnetic sources such as lightning, high-intensity radiated fields (HIRF), and nuclear electromagnetic pulses (NEMP). The upset detection strategy presented provides dynamic monitoring of a given control computer for degraded functional integrity that can result from redundancy management errors and control command calculation errors in an electromagnetically harsh operating environment. The use of Kalman filtering, data fusion, and decision theory in monitoring a given digital controller for control calculation errors, redundancy management errors, and control effectiveness errors is discussed.
The control of manual entry accuracy in management/engineering information systems, phase 1
NASA Technical Reports Server (NTRS)
Hays, Daniel; Nocke, Henry; Wilson, Harold; Woo, John, Jr.; Woo, June
1987-01-01
It was shown that clerical personnel can be tested for proofreading performance under simulated industrial conditions. A statistical study showed that errors in proofreading are described by extreme value probability theory. The study also showed that innovative man/machine interfaces can be developed to improve and control accuracy during data entry.
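As a point of reference for the extreme-value claim above, the canonical Type I (Gumbel) law is the usual model for maxima; the abstract does not say which extreme-value form was fitted, so the Gumbel form below is an assumption for illustration:

\[ F(x) = \exp\!\left[-\exp\!\left(-\frac{x-\mu}{\beta}\right)\right], \]

with location mu and scale beta > 0. Under this reading, the largest per-session error counts would follow such a skewed law rather than a normal distribution, which changes how accuracy-control limits should be set.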
Imperfect practice makes perfect: error management training improves transfer of learning.
Dyre, Liv; Tabor, Ann; Ringsted, Charlotte; Tolsgaard, Martin G
2017-02-01
Traditionally, trainees are instructed to practise with as few errors as possible during simulation-based training. However, transfer of learning may improve if trainees are encouraged to commit errors. The aim of this study was to assess the effects of error management instructions compared with error avoidance instructions during simulation-based ultrasound training. Medical students (n = 60) with no prior ultrasound experience were randomised to error management training (EMT) (n = 32) or error avoidance training (EAT) (n = 28). The EMT group was instructed to deliberately make errors during training. The EAT group was instructed to follow the simulator instructions and to commit as few errors as possible. Training consisted of 3 hours of simulation-based ultrasound training focusing on fetal weight estimation. Simulation-based tests were administered before and after training. Transfer tests were performed on real patients 7-10 days after the completion of training. Primary outcomes were transfer test performance scores and diagnostic accuracy. Secondary outcomes included performance scores and diagnostic accuracy during the simulation-based pre- and post-tests. A total of 56 participants completed the study. On the transfer test, EMT group participants attained higher performance scores (mean score: 67.7%, 95% confidence interval [CI]: 62.4-72.9%) than EAT group members (mean score: 51.7%, 95% CI: 45.8-57.6%) (p < 0.001; Cohen's d = 1.1, 95% CI: 0.5-1.7). There was a moderate improvement in diagnostic accuracy in the EMT group compared with the EAT group (16.7%, 95% CI: 10.2-23.3% weight deviation versus 26.6%, 95% CI: 16.5-36.7% weight deviation [p = 0.082; Cohen's d = 0.46, 95% CI: -0.06 to 1.0]). No significant interaction effects between group and performance improvements between the pre- and post-tests were found in either performance scores (p = 0.25) or diagnostic accuracy (p = 0.09). The provision of error management instructions during simulation-based training improves the transfer of learning to the clinical setting compared with error avoidance instructions. Rather than teaching to avoid errors, the use of errors for learning should be explored further in medical education theory and practice. © 2016 John Wiley & Sons Ltd and The Association for the Study of Medical Education.
Ben Natan, Merav; Sharon, Ira; Mahajna, Marlen; Mahajna, Sara
2017-11-01
Medication errors are common among nursing students. Nonetheless, these errors are often underreported. The aims were to examine factors related to nursing students' intention to report medication errors, using the Theory of Planned Behavior, and to examine whether the theory is useful in predicting students' intention to report errors. This study has a descriptive cross-sectional design. The study population was recruited in a university and a large nursing school in central and northern Israel. A convenience sample of 250 nursing students took part in the study. The students completed a self-report questionnaire based on the Theory of Planned Behavior. The findings indicate that students' intention to report medication errors was high. The Theory of Planned Behavior constructs explained 38% of the variance in students' intention to report medication errors. The constructs of behavioral beliefs, subjective norms, and perceived behavioral control were found to affect this intention, with behavioral beliefs the most significant factor. The findings also reveal that students' fear of the reaction to disclosure of the error from superiors and colleagues may deter them from reporting it. Understanding the factors related to reporting medication errors is crucial to designing interventions that foster error reporting. Copyright © 2017 Elsevier Ltd. All rights reserved.
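For readers unfamiliar with the Theory of Planned Behavior, the model such questionnaires operationalize is, schematically (a textbook form with empirically estimated weights, not an equation taken from the abstract):

\[ BI = w_1 A + w_2 SN + w_3 PBC, \]

where BI is behavioral intention (here, intention to report an error), A is attitude toward the behavior, SN is the subjective norm, and PBC is perceived behavioral control; the 38% figure above is the variance in BI jointly explained by these constructs.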
Combinatorial neural codes from a mathematical coding theory perspective.
Curto, Carina; Itskov, Vladimir; Morrison, Katherine; Roth, Zachary; Walker, Judy L
2013-07-01
Shannon's seminal 1948 work gave rise to two distinct areas of research: information theory and mathematical coding theory. While information theory has had a strong influence on theoretical neuroscience, ideas from mathematical coding theory have received considerably less attention. Here we take a new look at combinatorial neural codes from a mathematical coding theory perspective, examining the error correction capabilities of familiar receptive field codes (RF codes). We find, perhaps surprisingly, that the high levels of redundancy present in these codes do not support accurate error correction, although the error-correcting performance of receptive field codes catches up to that of random comparison codes when a small tolerance to error is introduced. However, receptive field codes are good at reflecting distances between represented stimuli, while the random comparison codes are not. We suggest that a compromise in error-correcting capability may be a necessary price to pay for a neural code whose structure serves not only error correction, but must also reflect relationships between stimuli.
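The error-correction capability discussed above rests on the standard coding-theory rule that a code with minimum Hamming distance d corrects t = floor((d-1)/2) errors. A minimal sketch of that rule (toy codewords, not the receptive field codes analyzed in the paper):

    from itertools import combinations

    def min_distance(code):
        """Minimum Hamming distance over all pairs of codewords."""
        return min(sum(a != b for a, b in zip(u, v))
                   for u, v in combinations(code, 2))

    # Toy binary code; codewords chosen only to illustrate the rule.
    code = [(0,0,0,0,0), (1,1,1,0,0), (0,0,1,1,1), (1,1,0,1,1)]
    d = min_distance(code)
    t = (d - 1) // 2  # bit errors guaranteed correctable
    print(d, t)       # prints: 3 1

The paper's point can be phrased in these terms: receptive field codes have high redundancy but poorly separated codewords, so d (and hence t) stays small relative to comparably redundant random codes.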
Mathematical and field analysis of longitudinal reservoir infill
NASA Astrophysics Data System (ADS)
Ke, W. T.; Capart, H.
2016-12-01
In reservoirs, severe problems are caused by infilled sediment deposits. In the long term, sediment accumulation reduces reservoir storage capacity and flood-control benefits. In the short term, sediment deposits affect water-supply intakes and hydroelectricity generation. For reservoir management, it is therefore important to understand the deposition process and to predict sedimentation in the reservoir. To investigate the behavior of sediment deposits, we propose a one-dimensional simplified theory, derived from the Exner equation, to predict the longitudinal sedimentation distribution in idealized reservoirs. The theory models reservoir infill for three scenarios: delta progradation, near-dam bottom deposition, and final infill. These yield three kinds of self-similar analytical solutions for the reservoir bed profiles under different boundary conditions, composed of the error function, the complementary error function, and the imaginary error function, respectively. The theory is also computed by the finite volume method to test the analytical solutions. The theoretical and numerical predictions are in good agreement with a one-dimensional small-scale laboratory experiment. As the theory is simple to apply, with analytical solutions and numerical computation, we propose applications that simulate the long-profile evolution of field reservoirs, focusing on the infill sediment volume that results in the uplift of the near-dam bottom elevation. The field reservoirs introduced here are Wushe Reservoir, Tsengwen Reservoir, and Mudan Reservoir in Taiwan, Lago Dos Bocas in Puerto Rico, and Sakuma Dam in Japan.
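A minimal sketch of how the error-function profiles cited above arise, assuming the standard linearized closure (the paper's specific flux laws and boundary conditions may differ): with bed porosity lambda_p, bed elevation eta, and sediment flux q_s, the Exner equation

\[ (1-\lambda_p)\,\frac{\partial \eta}{\partial t} = -\frac{\partial q_s}{\partial x}, \]

combined with a diffusive flux closure q_s proportional to -d(eta)/dx, reduces to the linear diffusion equation d(eta)/dt = kappa d2(eta)/dx2, whose self-similar solutions on a half-line are error-function profiles such as

\[ \eta(x,t) = \eta_0\,\operatorname{erfc}\!\left(\frac{x}{2\sqrt{\kappa t}}\right), \]

with the erf, erfc, and erfi variants corresponding to the different boundary conditions of the three infill scenarios.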
NASA Technical Reports Server (NTRS)
Mcruer, D. T.; Clement, W. F.; Allen, R. W.
1981-01-01
Human errors tend to be treated in terms of clinical and anecdotal descriptions, from which remedial measures are difficult to derive. Correction of the sources of human error requires an attempt to reconstruct underlying and contributing causes of error from the circumstantial causes cited in official investigative reports. A comprehensive analytical theory of the cause-effect relationships governing propagation of human error is indispensable to a reconstruction of the underlying and contributing causes. A validated analytical theory of the input-output behavior of human operators involving manual control, communication, supervisory, and monitoring tasks which are relevant to aviation, maritime, automotive, and process control operations is highlighted. This theory of behavior, both appropriate and inappropriate, provides an insightful basis for investigating, classifying, and quantifying the needed cause-effect relationships governing propagation of human error.
Silvetti, Massimo; Alexander, William; Verguts, Tom; Brown, Joshua W
2014-10-01
The role of the medial prefrontal cortex (mPFC) and especially the anterior cingulate cortex has been the subject of intense debate for the last decade. A number of theories have been proposed to account for its function. Broadly speaking, some emphasize cognitive control, whereas others emphasize value processing; specific theories concern reward processing, conflict detection, error monitoring, and volatility detection, among others. Here we survey and evaluate them relative to experimental results from neurophysiological, anatomical, and cognitive studies. We argue for a new conceptualization of mPFC, arising from recent computational modeling work. Based on reinforcement learning theory, these new models propose that mPFC is an Actor-Critic system. This system aims to predict future events, including rewards, to evaluate errors in those predictions, and, finally, to implement optimal skeletal-motor and visceromotor commands to obtain reward. This framework provides a comprehensive account of mPFC function, accounting for and predicting empirical results across different levels of analysis, including monkey neurophysiology, human ERP, human neuroimaging, and human behavior. Copyright © 2013 Elsevier Ltd. All rights reserved.
Normal accidents: human error and medical equipment design.
Dain, Steven
2002-01-01
High-risk systems, which are typical of our technologically complex era, include not just nuclear power plants but also hospitals, anesthesia systems, and the practice of medicine and perfusion. In high-risk systems, no matter how effective safety devices are, some types of accidents are inevitable because the system's complexity leads to multiple and unexpected interactions. It is important for healthcare providers to apply a risk assessment and management process to decisions involving new equipment and procedures or staffing matters in order to minimize the residual risks of latent errors, which are amenable to correction because of the large window of opportunity for their detection. This article provides an introduction to basic risk management and error theory principles and examines ways in which they can be applied to reduce and mitigate the inevitable human errors that accompany high-risk systems. The article also discusses "human factor engineering" (HFE), the process used to design equipment/human interfaces in order to mitigate design errors. The HFE process involves interaction between designers and end users to produce a series of continuous refinements that are incorporated into the final product. The article also examines common design problems encountered in the operating room that may predispose operators to commit errors resulting in harm to the patient. While recognizing that errors and accidents are unavoidable, organizations that function within a high-risk system must adopt a "safety culture" that anticipates problems and acts aggressively through an anonymous, "blameless" reporting mechanism to resolve them. We must continuously examine and improve the design of equipment and procedures, personnel, supplies and materials, and the environment in which we work to reduce error and minimize its effects. Healthcare providers must take a leading role in the day-to-day management of the "Perioperative System" and be role models in promoting a culture of safety in their organizations.
Beyond crisis resource management: new frontiers in human factors training for acute care medicine.
Petrosoniak, Andrew; Hicks, Christopher M
2013-12-01
Error is ubiquitous in medicine, particularly during critical events and resuscitation. A significant proportion of adverse events can be attributed to inadequate team-based skills such as communication, leadership, situation awareness and resource utilization. Aviation-based crisis resource management (CRM) training using high-fidelity simulation has been proposed as a strategy to improve team behaviours. This review will address key considerations in CRM training and outline recommendations for the future of human factors education in healthcare. A critical examination of the current literature yields several important considerations to guide the development and implementation of effective simulation-based CRM training. These include defining a priori domain-specific objectives, creating an immersive environment that encourages deliberate practice and transfer-appropriate processing, and the importance of effective team debriefing. Building on research from high-risk industry, we suggest that traditional CRM training may be augmented with new training techniques that promote the development of shared mental models for team and task processes, address the effect of acute stress on team performance, and integrate strategies to improve clinical reasoning and the detection of cognitive errors. The evolution of CRM training involves a 'Triple Threat' approach that integrates mental model theory for team and task processes, training for stressful situations and metacognition and error theory towards a more comprehensive training paradigm, with roots in high-risk industry and cognitive psychology. Further research is required to evaluate the impact of this approach on patient-oriented outcomes.
Reconciling uncertain costs and benefits in bayes nets for invasive species management
Burgman, M.A.; Wintle, B.A.; Thompson, C.A.; Moilanen, A.; Runge, M.C.; Ben-Haim, Y.
2010-01-01
Bayes nets are used increasingly to characterize environmental systems and formalize probabilistic reasoning to support decision making. These networks treat probabilities as exact quantities. Sensitivity analysis can be used to evaluate the importance of assumptions and parameter estimates. Here, we outline an application of info-gap theory to Bayes nets that evaluates the sensitivity of decisions to possibly large errors in the underlying probability estimates and utilities. We apply it to an example of management and eradication of Red Imported Fire Ants in Southern Queensland, Australia and show how changes in management decisions can be justified when uncertainty is considered. © 2009 Society for Risk Analysis.
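For context, the info-gap robustness function applied here can be stated compactly (notation follows Ben-Haim's general formulation; the paper's concrete reward and uncertainty models are richer):

\[ \hat{\alpha}(q, r_c) = \max\left\{\alpha \ge 0 \;:\; \min_{u \in \,\mathcal{U}(\alpha,\tilde{u})} R(q,u) \ge r_c \right\}, \]

the greatest horizon of uncertainty alpha around the nominal estimates u-tilde (here, the Bayes-net probabilities and utilities) within which decision q still guarantees reward of at least r_c. Decisions are then ranked by robustness rather than by nominal expected value, which is how management choices can change once uncertainty is considered.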
NASA Technical Reports Server (NTRS)
Mcruer, D. T.; Clement, W. F.; Allen, R. W.
1980-01-01
Human error, a significant contributing factor in a very high proportion of civil transport, general aviation, and rotorcraft accidents, is investigated. Correction of the sources of human error requires that one attempt to reconstruct underlying and contributing causes of error from the circumstantial causes cited in official investigative reports. A validated analytical theory of the input-output behavior of human operators involving manual control, communication, supervisory, and monitoring tasks which are relevant to aviation operations is presented. This theory of behavior, both appropriate and inappropriate, provides an insightful basis for investigating, classifying, and quantifying the needed cause-effect relationships governing propagation of human error.
Self-Interaction Error in Density Functional Theory: An Appraisal.
Bao, Junwei Lucas; Gagliardi, Laura; Truhlar, Donald G
2018-05-03
Self-interaction error (SIE) is considered to be one of the major sources of error in most approximate exchange-correlation functionals for Kohn-Sham density-functional theory (KS-DFT), and it is large with all local exchange-correlation functionals and with some hybrid functionals. In this work, we consider systems conventionally considered to be dominated by SIE. For these systems, we demonstrate that by using multiconfiguration pair-density functional theory (MC-PDFT), the error of a translated local density-functional approximation is significantly reduced (by a factor of 3) when using an MCSCF density and on-top density, as compared to using KS-DFT with the parent functional; the error in MC-PDFT with local on-top functionals is even lower than the error in some popular KS-DFT hybrid functionals. Density-functional theory, either in MC-PDFT form with local on-top functionals or in KS-DFT form with some functionals having 50% or more nonlocal exchange, has smaller errors for SIE-prone systems than does CASSCF, which has no SIE.
When is an error not a prediction error? An electrophysiological investigation.
Holroyd, Clay B; Krigolson, Olave E; Baker, Robert; Lee, Seung; Gibson, Jessica
2009-03-01
A recent theory holds that the anterior cingulate cortex (ACC) uses reinforcement learning signals conveyed by the midbrain dopamine system to facilitate flexible action selection. According to this position, the impact of reward prediction error signals on ACC modulates the amplitude of a component of the event-related brain potential called the error-related negativity (ERN). The theory predicts that ERN amplitude is monotonically related to the expectedness of the event: It is larger for unexpected outcomes than for expected outcomes. However, a recent failure to confirm this prediction has called the theory into question. In the present article, we investigated this discrepancy in three trial-and-error learning experiments. All three experiments provided support for the theory, but the effect sizes were largest when an optimal response strategy could actually be learned. This observation suggests that ACC utilizes dopamine reward prediction error signals for adaptive decision making when the optimal behavior is, in fact, learnable.
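The prediction-error signal at issue is the standard temporal-difference error of reinforcement learning (a textbook form, not quoted from the abstract):

\[ \delta_t = r_t + \gamma V(s_{t+1}) - V(s_t), \]

where r_t is the obtained reward, gamma the discount factor, and V the learned value function. On the reinforcement-learning account of the ERN, amplitude tracks negative delta_t, i.e., outcomes worse than expected, which is why the theory predicts larger ERNs for unexpected than for expected outcomes.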
Nozari, Nazbanou; Dell, Gary S.; Schwartz, Myrna F.
2011-01-01
Despite the existence of speech errors, verbal communication is successful because speakers can detect (and correct) their errors. The standard theory of speech-error detection, the perceptual-loop account, posits that the comprehension system monitors production output for errors. Such a comprehension-based monitor, however, cannot explain the double dissociation between comprehension and error-detection ability observed in aphasic patients. We propose a new theory of speech-error detection that is instead based on the production process itself. The theory borrows from studies of forced-choice-response tasks the notion that error detection is accomplished by monitoring response conflict via a frontal brain structure, such as the anterior cingulate cortex. We adapt this idea to the two-step model of word production and test the model-derived predictions on a sample of aphasic patients. Our results show a strong correlation between patients' error-detection ability and the model's characterization of their production skills, and no significant correlation between error detection and comprehension measures, thus supporting a production-based monitor generally, and the implemented conflict-based monitor in particular. The successful application of the conflict-based theory to error detection in linguistic as well as non-linguistic domains points to a domain-general monitoring system. PMID:21652015
The Neural Basis of Error Detection: Conflict Monitoring and the Error-Related Negativity
ERIC Educational Resources Information Center
Yeung, Nick; Botvinick, Matthew M.; Cohen, Jonathan D.
2004-01-01
According to a recent theory, anterior cingulate cortex is sensitive to response conflict, the coactivation of mutually incompatible responses. The present research develops this theory to provide a new account of the error-related negativity (ERN), a scalp potential observed following errors. Connectionist simulations of response conflict in an…
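As conventionally operationalized in this literature, response conflict is the Hopfield energy of the response layer (a standard formulation, assumed here rather than quoted from the truncated abstract):

\[ \text{conflict} = -\sum_{i \ne j} a_i\, a_j\, w_{ij}, \]

where a_i are activations of response units and w_ij < 0 links mutually incompatible responses, so conflict is positive only when incompatible responses are coactive. On this account, the ERN reflects the burst of post-response conflict that follows errors, when the executed and the corrective responses are briefly active together.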
A comparison of error bounds for a nonlinear tracking system with detection probability Pd < 1.
Tong, Huisi; Zhang, Hao; Meng, Huadong; Wang, Xiqin
2012-12-14
Error bounds for nonlinear filtering are very important for performance evaluation and sensor management. This paper presents a comparative study of three error bounds for tracking filtering when the detection probability is less than unity. One of these bounds is the random finite set (RFS) bound, which is deduced within the framework of finite set statistics. The others, the information reduction factor (IRF) posterior Cramer-Rao lower bound (PCRLB) and the enumeration method (ENUM) PCRLB, are introduced within the framework of finite vector statistics. In this paper, we deduce two propositions and prove that the RFS bound is equal to the ENUM PCRLB, while it is tighter than the IRF PCRLB, when the target exists from the beginning to the end. Considering the disappearance of existing targets and the appearance of new ones, the RFS bound becomes tighter than both the IRF PCRLB and the ENUM PCRLB over time, by introducing the uncertainty of target existence. The theory is illustrated by two nonlinear tracking applications: ballistic object tracking and bearings-only tracking. The simulation studies confirm the theory and reveal the relationship among the three bounds.
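For reference, the PCRLB machinery being compared can be summarized as follows (a standard form from the tracking literature, assumed rather than quoted from the abstract). The posterior Fisher information J_k obeys the recursion

\[ J_{k+1} = D_k^{22} - D_k^{21}\left(J_k + D_k^{11}\right)^{-1} D_k^{12}, \]

where the D_k blocks collect expected second derivatives of the process and measurement log-likelihoods, and the bound is the inverse of J_k. The IRF approach handles P_d < 1 by scaling the measurement-information contribution with a factor 0 <= q_k <= 1, whereas the ENUM approach enumerates detection/miss sequences and weights each sequence's information by its probability; the paper's propositions relate these constructions to the RFS bound.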
Multiconfiguration Pair-Density Functional Theory Is Free From Delocalization Error.
Bao, Junwei Lucas; Wang, Ying; He, Xiao; Gagliardi, Laura; Truhlar, Donald G
2017-11-16
Delocalization error has been singled out by Yang and co-workers as the dominant error in Kohn-Sham density functional theory (KS-DFT) with conventional approximate functionals. In this Letter, by computing the vertical first ionization energy for well separated He clusters, we show that multiconfiguration pair-density functional theory (MC-PDFT) is free from delocalization error. To put MC-PDFT in perspective, we also compare it with some Kohn-Sham density functionals, including both traditional and modern functionals. Whereas large delocalization errors are almost universal in KS-DFT (the only exception being the very recent corrected functionals of Yang and co-workers), delocalization error is removed by MC-PDFT, which bodes well for its future as a step forward from KS-DFT.
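The logic of the He-cluster probe, stated compactly (our gloss on the test, not text from the abstract): for n identical, well-separated atoms, the exact vertical first ionization energy is size-independent, because the hole localizes on a single atom,

\[ \mathrm{IE}(\mathrm{He}_n) = \mathrm{IE}(\mathrm{He}) \quad \text{for all } n, \]

whereas a functional afflicted by delocalization error smears the hole over all n centers and so predicts an IE that spuriously drifts downward as n grows. A method whose computed IE stays flat in n, as reported here for MC-PDFT, passes the test.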
How we load our data sets with theories and why we do so purposefully.
Rochefort-Maranda, Guillaume
2016-12-01
In this paper, I compare theory-laden perceptions with imputed data sets. The similarities between the two allow me to show how the phenomenon of theory-ladenness can manifest itself in statistical analyses. More importantly, elucidating the differences between them will allow me to broaden the focus of the existing literature on theory-ladenness and to introduce some much-needed nuances. The topic of statistical imputation has received no attention in philosophy of science. Yet, imputed data sets are very similar to theory-laden perceptions, and they are now an integral part of many scientific inferences. Unlike the existence of theory-laden perceptions, that of imputed data sets cannot be challenged or reduced to a manageable source of error. In fact, imputed data sets are created purposefully in order to improve the quality of our inferences. They do not undermine the possibility of scientific knowledge; on the contrary, they are epistemically desirable. Copyright © 2016 Elsevier Ltd. All rights reserved.
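A deliberately simple single-imputation sketch makes the epistemic point concrete (the paper's argument concerns richer imputation models, and the numbers below are invented):

    import numpy as np

    # Toy sample with one missing value (np.nan).
    x = np.array([2.0, 4.0, np.nan, 8.0])

    # Mean imputation: the filled-in value is theory-laden in that it
    # presumes the missing datum resembles the observed ones
    # (a missing-at-random style assumption).
    x_imputed = np.where(np.isnan(x), np.nanmean(x), x)
    print(x_imputed)  # [2.0 4.0 4.666... 8.0]

The imputed 4.666... is not an observation; it is a model output injected into the data set, which is precisely the sense in which imputed data sets are "loaded with theory" on purpose.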
NASA Technical Reports Server (NTRS)
Belcastro, Celeste M.; Fischl, Robert; Kam, Moshe
1992-01-01
This paper presents a strategy for dynamically monitoring digital controllers in the laboratory for susceptibility to electromagnetic disturbances that compromise control integrity. The integrity of digital control systems operating in harsh electromagnetic environments can be compromised by upsets caused by induced transient electrical signals. Digital system upset is a functional error mode that involves no component damage, can occur simultaneously in all channels of a redundant control computer, and is software dependent. The motivation for this work is the need to develop tools and techniques that can be used in the laboratory to validate and/or certify critical aircraft controllers operating in electromagnetically adverse environments that result from lightning, high-intensity radiated fields (HIRF), and nuclear electromagnetic pulses (NEMP). The detection strategy presented in this paper provides dynamic monitoring of a given control computer for degraded functional integrity resulting from redundancy management errors, control calculation errors, and control correctness/effectiveness errors. In particular, this paper discusses the use of Kalman filtering, data fusion, and statistical decision theory in monitoring a given digital controller for control calculation errors.
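A minimal sketch of the innovation-based monitoring idea the abstract describes (illustrative numbers and threshold only; the actual monitor also fuses decisions across redundancy-management and correctness checks):

    import numpy as np

    def innovation_test(z, z_pred, S, threshold):
        """Flag a possible upset when the normalized innovation,
        chi-square distributed under nominal operation, is too large."""
        nu = z - z_pred                              # innovation (residual)
        d2 = (nu.T @ np.linalg.inv(S) @ nu).item()   # squared Mahalanobis distance
        return d2 > threshold, d2

    z      = np.array([[1.3], [0.2]])   # measured controller output
    z_pred = np.array([[1.0], [0.0]])   # Kalman-predicted output
    S      = np.array([[0.05, 0.0],
                       [0.0,  0.05]])   # innovation covariance
    upset, d2 = innovation_test(z, z_pred, S, threshold=5.99)  # chi2(2), 95%
    print(upset, round(d2, 2))          # prints: False 2.6

Under nominal operation the normalized innovation stays below the chi-square threshold; a transient-induced calculation error inflates it, triggering the upset flag.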
Dual processing and diagnostic errors.
Norman, Geoff
2009-09-01
In this paper, I review evidence from two theories in psychology relevant to diagnosis and diagnostic errors. "Dual Process" theories of thinking, frequently mentioned with respect to diagnostic error, propose that categorization decisions can be made with either a fast, unconscious, contextual process called System 1 or a slow, analytical, conscious, and conceptual process, called System 2. Exemplar theories of categorization propose that many category decisions in everyday life are made by unconscious matching to a particular example in memory, and these remain available and retrievable individually. I then review studies of clinical reasoning based on these theories, and show that the two processes are equally effective; System 1, despite its reliance in idiosyncratic, individual experience, is no more prone to cognitive bias or diagnostic error than System 2. Further, I review evidence that instructions directed at encouraging the clinician to explicitly use both strategies can lead to consistent reduction in error rates.
ERIC Educational Resources Information Center
Zhao, Xueyu; Solano-Flores, Guillermo; Qian, Ming
2018-01-01
This article addresses test translation review in international test comparisons. We investigated the applicability of the theory of test translation error--a theory of the multidimensionality and inevitability of test translation error--across source language-target language combinations in the translation of PISA (Programme of International…
NASA Astrophysics Data System (ADS)
Tian, F.; Lu, Y.
2017-12-01
Based on socioeconomic and hydrological data from three arid inland basins and on error analysis, the dynamics of human water consumption (HWC) are shown to be asymmetric: HWC increases rapidly in wet periods but holds steady or decreases only slightly in dry periods. Beyond the qualitative explanation, namely that ample water availability in wet periods spurs rapid growth in HWC, while in dry periods the now-expanded economy is sustained through over-exploitation, two quantitative models are established and tested, based on expected utility theory (EUT) and prospect theory (PT) respectively. EUT states that humans make decisions based on total expected utility, the sum over possible results of the utility function multiplied by the probability of each result, while PT states that the utility function is defined over gains and losses separately and that probability should be replaced by a probability weighting function.
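The two decision models can be stated compactly (textbook forms; the paper's calibration to the hydrological data adds further structure):

\[ EU(a) = \sum_i p_i\, u(x_i), \qquad V(a) = \sum_i w(p_i)\, v(x_i - x_{\text{ref}}), \]

where expected utility theory weights the utility u of each outcome by its probability p_i, while prospect theory evaluates gains and losses relative to a reference point x_ref through a value function v that is steeper for losses and replaces p_i with an inverse-S-shaped weighting function w. The asymmetry above is naturally captured by the PT form: losses of water availability loom larger than equivalent gains.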
Tamuz, Michal; Harrison, Michael I
2006-01-01
Objective: To identify the distinctive contributions of high-reliability theory (HRT) and normal accident theory (NAT) as frameworks for examining five patient safety practices. Data Sources/Study Setting: We reviewed and drew examples from studies of organization theory and health services research. Study Design: After highlighting key differences between HRT and NAT, we applied the frames to five popular safety practices: double-checking medications, crew resource management (CRM), computerized physician order entry (CPOE), incident reporting, and root cause analysis (RCA). Principal Findings: HRT highlights how double checking, which is designed to prevent errors, can undermine mindfulness of risk. NAT emphasizes that social redundancy can diffuse and reduce responsibility for locating mistakes. CRM promotes high-reliability organizations by fostering deference to expertise, rather than rank. However, HRT also suggests that effective CRM depends on fundamental changes in organizational culture. NAT directs attention to an underinvestigated feature of CPOE: it tightens the coupling of the medication ordering process, and tight coupling increases the chances of a rapid and hard-to-contain spread of infrequent, but harmful errors. Conclusions: Each frame can make a valuable contribution to improving patient safety. By applying the HRT and NAT frames, health care researchers and administrators can identify health care settings in which new and existing patient safety interventions are likely to be effective. Furthermore, they can learn how to improve patient safety, not only from analyzing mishaps, but also by studying the organizational consequences of implementing safety measures. PMID:16898984
Interactions of timing and prediction error learning.
Kirkpatrick, Kimberly
2014-01-01
Timing and prediction error learning have historically been treated as independent processes, but growing evidence has indicated that they are not orthogonal. Timing emerges at the earliest time point when conditioned responses are observed, and temporal variables modulate prediction error learning in both simple conditioning and cue competition paradigms. In addition, prediction errors, through changes in reward magnitude or value, alter the timing of behavior. Thus, there appears to be a bi-directional interaction between timing and prediction error learning. Modern theories have attempted to integrate the two processes with mixed success. A neurocomputational approach to theory development is espoused, which draws on neurobiological evidence to guide and constrain computational model development. Heuristics for future model development are presented with the goal of sparking new approaches to theory development in the timing and prediction error fields. Copyright © 2013 Elsevier B.V. All rights reserved.
Nexus: A modular workflow management system for quantum simulation codes
NASA Astrophysics Data System (ADS)
Krogel, Jaron T.
2016-01-01
The management of simulation workflows represents a significant task for the individual computational researcher. Automation of the required tasks involved in simulation work can decrease the overall time to solution and reduce sources of human error. A new simulation workflow management system, Nexus, is presented to address these issues. Nexus is capable of automated job management on workstations and resources at several major supercomputing centers. Its modular design allows many quantum simulation codes to be supported within the same framework. Current support includes quantum Monte Carlo calculations with QMCPACK, density functional theory calculations with Quantum Espresso or VASP, and quantum chemical calculations with GAMESS. Users can compose workflows through a transparent, text-based interface, resembling the input file of a typical simulation code. A usage example is provided to illustrate the process.
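To give a flavor of the text-based interface described above, a hypothetical Nexus-style workflow script is sketched below; the function and parameter names follow the pattern of published Nexus examples but are assumptions here, not verified API:

    #!/usr/bin/env python
    # Hypothetical sketch; names are assumed, not verified against the
    # Nexus distribution.
    from nexus import settings, job, run_project
    from nexus import generate_physical_system, generate_pwscf

    settings(
        pseudo_dir = './pseudopotentials',  # assumed option name
        runs       = './runs',
        machine    = 'ws16',                # 16-core workstation (assumed tag)
        )

    system = generate_physical_system(structure='diamond8.xyz')  # assumed input

    scf = generate_pwscf(                   # DFT step via Quantum Espresso
        identifier = 'scf',
        path       = 'diamond/scf',
        job        = job(cores=16),
        system     = system,
        input_type = 'scf',
        )

    run_project(scf)  # Nexus handles job submission and monitoring

The design point is that the script reads like a simulation input file while Nexus automates the job management behind it.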
Cockpit task management: A preliminary, normative theory
NASA Technical Reports Server (NTRS)
Funk, Ken
1991-01-01
Cockpit task management (CTM) involves the initiation, monitoring, prioritization, and allocation of resources to multiple concurrent tasks, as well as their termination. As aircrews have more tasks to attend to due to reduced crew sizes and the increased complexity of aircraft and the air transportation system, CTM will become a more critical factor in aviation safety. Many aviation accidents and incidents can be satisfactorily explained in terms of CTM errors, and it is likely that more accidents induced by poor CTM practice will occur in the future unless the issue is properly addressed. The first step in understanding and facilitating CTM behavior was the development of a preliminary, normative theory of CTM that identifies several important CTM functions. From this theory, requirements were developed for pilot-vehicle interfaces (PVIs) believed to facilitate CTM. A prototype PVI that improves CTM performance was developed, and a research program is currently under way aimed at developing a better understanding of CTM and at facilitating CTM performance through better equipment and procedures.
Theory of Test Translation Error
ERIC Educational Resources Information Center
Solano-Flores, Guillermo; Backhoff, Eduardo; Contreras-Nino, Luis Angel
2009-01-01
In this article, we present a theory of test translation whose intent is to provide the conceptual foundation for effective, systematic work in the process of test translation and test translation review. According to the theory, translation error is multidimensional; it is not simply the consequence of defective translation but an inevitable fact…
Scherer, Laura D; Yates, J Frank; Baker, S Glenn; Valentine, Kathrene D
2017-06-01
Human judgment often violates normative standards, and virtually no judgment error has received as much attention as the conjunction fallacy. Judgment errors have historically served as evidence for dual-process theories of reasoning, insofar as these errors are assumed to arise from reliance on a fast and intuitive mental process, and are corrected via effortful deliberative reasoning. In the present research, three experiments tested the notion that conjunction errors are reduced by effortful thought. Predictions based on three different dual-process theory perspectives were tested: lax monitoring, override failure, and the Tripartite Model. Results indicated that participants higher in numeracy were less likely to make conjunction errors, but this association only emerged when participants engaged in two-sided reasoning, as opposed to one-sided or no reasoning. Confidence was higher for incorrect as opposed to correct judgments, suggesting that participants were unaware of their errors.
Understanding human management of automation errors
McBride, Sara E.; Rogers, Wendy A.; Fisk, Arthur D.
2013-01-01
Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance. PMID:25383042
Use of graph theory measures to identify errors in record linkage.
Randall, Sean M; Boyd, James H; Ferrante, Anna M; Bauer, Jacqueline K; Semmens, James B
2014-07-01
Ensuring high linkage quality is important in many record linkage applications. Current methods for ensuring quality are manual and resource intensive. This paper seeks to determine the effectiveness of graph theory techniques in identifying record linkage errors. A range of graph theory techniques was applied to two linked datasets, with known truth sets. The ability of graph theory techniques to identify groups containing errors was compared to a widely used threshold setting technique. This methodology shows promise; however, further investigations into graph theory techniques are required. The development of more efficient and effective methods of improving linkage quality will result in higher quality datasets that can be delivered to researchers in shorter timeframes. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
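As an illustration of the general idea (a sketch of one plausible approach, not the authors' method), linked record pairs can be treated as edges of a graph, with each connected component forming one linked group; components whose structure departs from the near-complete clique expected of a true match group can then be flagged for review.

```python
# Sketch: flag suspect linked groups via graph measures (illustrative only).
import networkx as nx

def flag_suspect_groups(linked_pairs, density_cutoff=0.9):
    """Return components whose internal structure suggests linkage errors.

    A correctly linked group of records for one person should be nearly
    complete (most pairs agree), so sparse components -- e.g., two dense
    clusters joined by a single bridging edge -- are candidates for review.
    """
    g = nx.Graph()
    g.add_edges_from(linked_pairs)
    suspects = []
    for component in nx.connected_components(g):
        sub = g.subgraph(component)
        if len(sub) > 2 and nx.density(sub) < density_cutoff:
            suspects.append((set(component), nx.density(sub),
                             list(nx.bridges(sub))))
    return suspects

# Example: records a-b-c form a triangle (mutually consistent), but d hangs
# off the group by a single edge -- a likely false link worth manual review.
pairs = [('a', 'b'), ('b', 'c'), ('a', 'c'), ('c', 'd')]
for nodes, density, bridges in flag_suspect_groups(pairs):
    print(nodes, round(density, 2), bridges)
```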
Space-Time Error Representation and Estimation in Navier-Stokes Calculations
NASA Technical Reports Server (NTRS)
Barth, Timothy J.
2006-01-01
The mathematical framework for a-posteriori error estimation of functionals elucidated by Eriksson et al. [7] and Becker and Rannacher [3] is revisited in a space-time context. Using these theories, a hierarchy of exact and approximate error representation formulas is presented for use in error estimation and mesh adaptivity. Numerical space-time results for simple model problems as well as compressible Navier-Stokes flow at Re = 300 over a 2D circular cylinder are then presented to demonstrate elements of the error representation theory for time-dependent problems.
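For orientation, the dual-weighted-residual identity at the heart of the Becker-Rannacher framework has the following generic shape (standard notation given as background, not quoted from this paper): the error in an output functional J is the discrete residual weighted by the adjoint (dual) solution.

```latex
% Generic dual-weighted-residual error representation (schematic):
J(u) - J(u_h) \;\approx\; \sum_{K \in \mathcal{T}_h} \int_{I \times K}
    R(u_h)\,\bigl(z - z_h\bigr)\,\mathrm{d}x\,\mathrm{d}t
% where R(u_h) is the space-time residual of the discrete solution and z
% the adjoint solution, which weights each local residual by its influence
% on the target functional J; localized terms drive mesh adaptivity.
```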
Hamiltonian lattice field theory: Computer calculations using variational methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zako, Robert L.
1991-12-03
I develop a variational method for systematic numerical computation of physical quantities -- bound state energies and scattering amplitudes -- in quantum field theory. An infinite-volume, continuum theory is approximated by a theory on a finite spatial lattice, which is amenable to numerical computation. I present an algorithm for computing approximate energy eigenvalues and eigenstates in the lattice theory and for bounding the resulting errors. I also show how to select basis states and choose variational parameters in order to minimize errors. The algorithm is based on the Rayleigh-Ritz principle and Kato's generalizations of Temple's formula. The algorithm could be adapted to systems such as atoms and molecules. I show how to compute Green's functions from energy eigenvalues and eigenstates in the lattice theory, and relate these to physical (renormalized) coupling constants, bound state energies and Green's functions. Thus one can compute approximate physical quantities in a lattice theory that approximates a quantum field theory with specified physical coupling constants. I discuss the errors in both approximations. In principle, the errors can be made arbitrarily small by increasing the size of the lattice, decreasing the lattice spacing and computing sufficiently long. Unfortunately, I do not understand the infinite-volume and continuum limits well enough to quantify errors due to the lattice approximation. Thus the method is currently incomplete. I apply the method to real scalar field theories using a Fock basis of free particle states. All needed quantities can be calculated efficiently with this basis. The generalization to more complicated theories is straightforward. I describe a computer implementation of the method and present numerical results for simple quantum mechanical systems.
Apollo experience report: Very high frequency ranging system
NASA Technical Reports Server (NTRS)
Panter, W. C.; Shores, P. W.
1972-01-01
The history of the Apollo very-high-frequency ranging system development program is presented from the program-planning stage through the final-test and flight-evaluation stages. Block diagrams of the equipment are presented, and a description of the theory of operation is outlined. A sample of the distribution of errors measured in the aircraft-flight test program is included. The report is concluded with guidelines or recommendations for the management of development programs having the same general constraints.
Information management in DNA replication modeled by directional, stochastic chains with memory
NASA Astrophysics Data System (ADS)
Arias-Gonzalez, J. Ricardo
2016-11-01
Stochastic chains represent a key variety of phenomena in many branches of science within the context of information theory and thermodynamics. They are typically approached by a sequence of independent events or by a memoryless Markov process. Stochastic chains are of special significance to molecular biology, where genes are conveyed by linear polymers made up of molecular subunits and transferred from DNA to proteins by specialized molecular motors in the presence of errors. Here, we demonstrate that when memory is introduced, the statistics of the chain depend on the mechanism by which objects or symbols are assembled, even in the slow dynamics limit wherein friction can be neglected. To analyze these systems, we introduce a sequence-dependent partition function, investigate its properties, and compare it to the standard normalization defined by the statistical physics of ensembles. We then apply this theory to characterize the enzyme-mediated information transfer involved in DNA replication under real, non-equilibrium conditions, reproducing measured error rates and explaining the typical 100-fold increase in fidelity that is experimentally found when proofreading and editing take place. Our model further predicts that approximately 1 kT has to be consumed to elevate fidelity by one order of magnitude. We anticipate that our results are necessary to interpret configurational order and information management in many molecular systems within biophysics, materials science, communication, and engineering.
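The flavor of the fidelity result can be conveyed with a toy two-stage discrimination model. This is a sketch only: it is not the sequence-dependent partition function developed in the paper, and the free-energy values are illustrative assumptions.

```python
# Toy copying model: incorporation discriminates right/wrong bases by a
# free-energy gap ddG; optional proofreading re-checks with gap ddG_proof.
import math

kT = 1.0  # energies below are in units of kT

def error_rate(ddG, ddG_proof=None):
    """Misincorporation probability under Boltzmann discrimination."""
    eps = 1.0 / (1.0 + math.exp(ddG / kT))  # polymerase selection step
    if ddG_proof is not None:
        # Proofreading preferentially excises wrong bases, so the two
        # stages' discrimination factors multiply.
        eps = eps / (eps + (1.0 - eps) * math.exp(ddG_proof / kT))
    return eps

base = error_rate(ddG=7.0)                    # no proofreading
checked = error_rate(ddG=7.0, ddG_proof=4.6)  # with proofreading
print(f"error without proofreading: {base:.2e}")
print(f"error with proofreading:    {checked:.2e}")
print(f"fidelity gain: {base / checked:.0f}x")  # ~100x; ln(100) ~ 4.6 kT
```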
The Distinctions of False and Fuzzy Memories.
ERIC Educational Resources Information Center
Schooler, Jonathan W.
1998-01-01
Notes that fuzzy-trace theory has been used to understand false memories of children. Demonstrates the irony imbedded in the theory, maintaining that a central implication of fuzzy-trace theory is that some errors characterized as false memories are not really false at all. These errors, when applied to false alarms to related lures, are best…
Action errors, error management, and learning in organizations.
Frese, Michael; Keith, Nina
2015-01-03
Every organization is confronted with errors. Most errors are corrected easily, but some may lead to negative consequences. Organizations often focus on error prevention as a single strategy for dealing with errors. Our review suggests that error prevention needs to be supplemented by error management--an approach directed at effectively dealing with errors after they have occurred, with the goal of minimizing negative and maximizing positive error consequences (examples of the latter are learning and innovations). After defining errors and related concepts, we review research on error-related processes affected by error management (error detection, damage control). Empirical evidence on positive effects of error management in individuals and organizations is then discussed, along with emotional, motivational, cognitive, and behavioral pathways of these effects. Learning from errors is central, but like other positive consequences, learning occurs under certain circumstances--one being the development of a mind-set of acceptance of human error.
Nee, Derek Evan; Kastner, Sabine; Brown, Joshua W
2011-01-01
The last decade has seen considerable discussion regarding a theoretical account of medial prefrontal cortex (mPFC) function with particular focus on the anterior cingulate cortex. The proposed theories have included conflict detection, error likelihood prediction, volatility monitoring, and several distinct theories of error detection. Arguments for and against particular theories often treat mPFC as functionally homogeneous, or at least nearly so, despite some evidence for distinct functional subregions. Here we used functional magnetic resonance imaging (fMRI) to simultaneously contrast multiple effects of error, conflict, and task-switching that have been individually construed in support of various theories. We found overlapping yet functionally distinct subregions of mPFC, with activations related to dominant error, conflict, and task-switching effects successively found along a rostral-ventral to caudal-dorsal gradient within medial prefrontal cortex. Activations in the rostral cingulate zone (RCZ) were strongly correlated with the unexpectedness of outcomes suggesting a role in outcome prediction and preparing control systems to deal with anticipated outcomes. The results as a whole support a resolution of some ongoing debates in that distinct theories may each pertain to corresponding distinct yet overlapping subregions of mPFC. Copyright © 2010 Elsevier Inc. All rights reserved.
Analysis of counterfactual quantum key distribution using error-correcting theory
NASA Astrophysics Data System (ADS)
Li, Yan-Bing
2014-10-01
Counterfactual quantum key distribution is an interesting direction in quantum cryptography and has been realized by some researchers. However, it has been pointed out that it is insecure in the information-theoretic sense when used over a highly lossy channel. In this paper, we revisit its security from an error-correcting theory point of view. The analysis indicates that the security flaw arises because, when the loss rate exceeds 50%, the error rate in the users' raw key pair is as high as it would be under Eve's attack.
Error propagation in energetic carrying capacity models
Pearse, Aaron T.; Stafford, Joshua D.
2014-01-01
Conservation objectives derived from carrying capacity models have been used to inform management of landscapes for wildlife populations. Energetic carrying capacity models are particularly useful in conservation planning for wildlife; these models use estimates of food abundance and energetic requirements of wildlife to target conservation actions. We provide a general method for incorporating a foraging threshold (i.e., the density of food at which foraging becomes unprofitable) when estimating food availability with energetic carrying capacity models. We use a hypothetical example to describe how past methods for adjustment of foraging thresholds biased results of energetic carrying capacity models in certain instances. Adjusting foraging thresholds at the patch level of the species of interest produces results consistent with ecological foraging theory. Two case studies suggest variation in bias that, in certain instances, created large errors in conservation objectives and may have led to inefficient allocation of limited resources. Our results also illustrate how small errors or biases in input parameters, when extrapolated to large spatial extents, propagate through conservation planning and can have negative implications for target populations.
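A minimal sketch of the patch-level threshold adjustment described above follows; all units and numbers are illustrative assumptions, not the authors' data.

```python
# Sketch: energetic carrying capacity with a patch-level foraging threshold.
def energy_days(patches, threshold, energy_density, daily_requirement):
    """Bird-use-days supported, deducting unprofitable food in each patch.

    Applying the threshold patch by patch (rather than once to a landscape
    aggregate) matches foraging theory: food below the threshold density is
    unavailable in *every* patch a forager visits.
    """
    total_kj = 0.0
    for biomass, area in patches:                  # (kg/ha, ha) per patch
        available = max(0.0, biomass - threshold)  # only profitable food
        total_kj += available * area * energy_density
    return total_kj / daily_requirement

patches = [(50.0, 10.0), (8.0, 40.0)]  # one rich patch, one poor patch
print(energy_days(patches, threshold=10.0, energy_density=14000.0,
                  daily_requirement=1200.0))
# The poor patch contributes nothing: its 8 kg/ha sits below the 10 kg/ha
# threshold. Subtracting the threshold from a landscape aggregate instead
# misstates availability whenever biomass varies among patches.
```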
The Importance of Relying on the Manual: Scoring Error Variance in the WISC-IV Vocabulary Subtest
ERIC Educational Resources Information Center
Erdodi, Laszlo A.; Richard, David C. S.; Hopwood, Christopher
2009-01-01
Classical test theory assumes that ability level has no effect on measurement error. Newer test theories, however, argue that the precision of a measurement instrument changes as a function of the examinee's true score. Research has shown that administration errors are common in the Wechsler scales and that subtests requiring subjective scoring…
ERIC Educational Resources Information Center
Tian, Wei; Cai, Li; Thissen, David; Xin, Tao
2013-01-01
In item response theory (IRT) modeling, the item parameter error covariance matrix plays a critical role in statistical inference procedures. When item parameters are estimated using the EM algorithm, the parameter error covariance matrix is not an automatic by-product of item calibration. Cai proposed the use of Supplemented EM algorithm for…
The evolution of Crew Resource Management training in commercial aviation
NASA Technical Reports Server (NTRS)
Helmreich, R. L.; Merritt, A. C.; Wilhelm, J. A.
1999-01-01
In this study, we describe changes in the nature of Crew Resource Management (CRM) training in commercial aviation, including its shift from cockpit to crew resource management. Validation of the impact of CRM is discussed. Limitations of CRM, including a lack of cross-cultural generality, are considered. An overarching framework that stresses error management to increase acceptance of CRM concepts is presented. The error management approach defines behavioral strategies taught in CRM as error countermeasures that are employed to avoid error, to trap errors committed, and to mitigate the consequences of error.
A Phenomenological Study of Nurse Manager Interventions Related to Workplace Bullying.
Skarbek, Anita J; Johnson, Sandra; Dawson, Christina M
2015-10-01
The aim of this study was to acquire nurse managers' perspectives as to the scope of workplace bullying, which interventions were deemed as effective and ineffective, and what environmental characteristics cultivated a healthy, caring work environment. Research has linked workplace bullying among RNs to medical errors, unsafe hospital environments, and negative patient outcomes. Limited research had been conducted with nurse managers to discern their perspectives. Six nurse managers from hospital settings participated in in-depth, semistructured interviews. Ray's theory of bureaucratic caring guided the study. These themes emerged: (a) awareness, (b) scope of the problem, (c) quality of performance, and (d) healthy, caring environment. Findings indicated mandated antibullying programs were not as effective as individual manager interventions. Systems must be in place to hold individuals accountable for their behavior. Communication, collective support, and teamwork are essential to create environments that lead to the delivery of safe, optimum patient care.
Nexus: a modular workflow management system for quantum simulation codes
Krogel, Jaron T.
2015-08-24
The management of simulation workflows is a significant task for the individual computational researcher. Automation of the required tasks involved in simulation work can decrease the overall time to solution and reduce sources of human error. A new simulation workflow management system, Nexus, is presented to address these issues. Nexus is capable of automated job management on workstations and resources at several major supercomputing centers. Its modular design allows many quantum simulation codes to be supported within the same framework. Current support includes quantum Monte Carlo calculations with QMCPACK, density functional theory calculations with Quantum Espresso or VASP, and quantum chemical calculations with GAMESS. Users can compose workflows through a transparent, text-based interface, resembling the input file of a typical simulation code. A usage example is provided to illustrate the process.
Dissociating response conflict and error likelihood in anterior cingulate cortex.
Yeung, Nick; Nieuwenhuis, Sander
2009-11-18
Neuroimaging studies consistently report activity in anterior cingulate cortex (ACC) in conditions of high cognitive demand, leading to the view that ACC plays a crucial role in the control of cognitive processes. According to one prominent theory, the sensitivity of ACC to task difficulty reflects its role in monitoring for the occurrence of competition, or "conflict," between responses to signal the need for increased cognitive control. However, a contrasting theory proposes that ACC is the recipient rather than source of monitoring signals, and that ACC activity observed in relation to task demand reflects the role of this region in learning about the likelihood of errors. Response conflict and error likelihood are typically confounded, making the theories difficult to distinguish empirically. The present research therefore used detailed computational simulations to derive contrasting predictions regarding ACC activity and error rate as a function of response speed. The simulations demonstrated a clear dissociation between conflict and error likelihood: fast response trials are associated with low conflict but high error likelihood, whereas slow response trials show the opposite pattern. Using the N2 component as an index of ACC activity, an EEG study demonstrated that when conflict and error likelihood are dissociated in this way, ACC activity tracks conflict and is negatively correlated with error likelihood. These findings support the conflict-monitoring theory and suggest that, in speeded decision tasks, ACC activity reflects current task demands rather than the retrospective coding of past performance.
Patient safety education at Japanese medical schools: results of a nationwide survey.
Maeda, Shoichi; Kamishiraki, Etsuko; Starkey, Jay
2012-05-10
Patient safety education, including error prevention strategies and management of adverse events, has become a topic of worldwide concern. The importance of patient safety has also been recognized in Japan following two serious medical accidents in 1999. Furthermore, the 2008 revision of the educational curriculum guidelines by the Ministry of Education includes patient safety as part of the core medical curriculum. However, little is known about patient safety education in Japanese medical schools, partly because a comprehensive study has not yet been conducted in this field. We therefore conducted a nationwide survey to clarify the current status of patient safety education at medical schools in Japan. The response rate was 60.0% (n = 48/80). Ninety-eight percent of respondents (n = 47/48) reported integration of patient safety education into their curricula. Thirty-nine percent reported devoting less than five hours to the topic. All schools that teach patient safety reported use of lecture-based teaching methods, while few used alternative methods such as role-playing or in-hospital training. Topics related to medical error theory and the legal ramifications of error are widely taught, while practical topics related to error analysis, such as root cause analysis, are less often covered. Based on responses to our survey, most Japanese medical schools have incorporated the topic of patient safety into their curricula. However, the number of hours devoted to patient safety education is far from sufficient, with forty percent of medical schools devoting five hours or less to it. In addition, most medical schools employ only lecture-based learning, lacking diversity in teaching methods. Although most medical schools cover basic error theory, error analysis is taught at fewer schools. We still need to make improvements to our medical safety curricula. We believe that this study has implications for the rest of the world as a model of what is possible and a sounding board for what topics might be important.
Does a better model yield a better argument? An info-gap analysis
NASA Astrophysics Data System (ADS)
Ben-Haim, Yakov
2017-04-01
Theories, models and computations underlie reasoned argumentation in many areas. The possibility of error in these arguments, though of low probability, may be highly significant when the argument is used in predicting the probability of rare high-consequence events. This implies that the choice of a theory, model or computational method for predicting rare high-consequence events must account for the probability of error in these components. However, error may result from lack of knowledge or surprises of various sorts, and predicting the probability of error is highly uncertain. We show that the putatively best, most innovative and sophisticated argument may not actually have the lowest probability of error. Innovative arguments may entail greater uncertainty than more standard but less sophisticated methods, creating an innovation dilemma in formulating the argument. We employ info-gap decision theory to characterize and support the resolution of this problem and present several examples.
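In standard info-gap notation (given for orientation; not quoted from the paper), the robustness of an argument q is the greatest uncertainty horizon at which its probability of error still meets the acceptable level E_c.

```latex
% Info-gap robustness of argument q (schematic, standard notation):
\hat{h}(q, E_c) \;=\; \max \Bigl\{\, h \ge 0 \;:\;
    \max_{u \in \mathcal{U}(h)} P_{\mathrm{err}}(q, u) \le E_c \Bigr\}
% where U(h) is the set of uncertain deviations up to horizon h. An
% innovation dilemma arises when the sophisticated argument has the lower
% nominal error but the smaller robustness \hat{h}.
```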
Boquet, Albert J; Cohen, Tara N; Cabrera, Jennifer S; Litzinger, Tracy L; Captain, Kevin A; Fabian, Michael A; Miles, Steven G; Shappell, Scott A
2016-09-09
Historically, health care has relied on error management techniques to measure and reduce the occurrence of adverse events. This study proposes an alternative approach for identifying and analyzing hazardous events. Whereas previous research has concentrated on investigating individual flow disruptions, we maintain the industry should focus on threat windows, or the accumulation of these disruptions. This methodology, driven by the broken windows theory, allows us to identify process inefficiencies before they manifest and open the door for the occurrence of errors and adverse events. Medical human factors researchers observed disruptions during 34 trauma cases at a Level II trauma center. Data were collected during resuscitation and imaging and were classified using a human factors taxonomy: Realizing Improved Patient Care Through Human-Centered Operating Room Design for Threat Window Analysis (RIPCHORD-TWA). Of the 576 total disruptions observed, communication issues were the most prevalent (28%), followed by interruptions and coordination issues (24% each). Issues related to layout (16%), usability (5%), and equipment (2%) comprised the remainder of the observations. Disruptions involving communication issues were more prevalent during resuscitation, whereas coordination problems were observed more frequently during imaging. Rather than solely investigating errors and adverse events, we propose conceptualizing the accumulation of disruptions in terms of threat windows as a means to analyze potential threats to the integrity of the trauma care system. This approach allows for the improved identification of system weaknesses or threats, affording us the ability to address these inefficiencies and intervene before errors and adverse events may occur.
Theoretical precision analysis of RFM localization of satellite remote sensing imagery
NASA Astrophysics Data System (ADS)
Zhang, Jianqing; Xv, Biao
2009-11-01
The traditional method for assessing the precision of the Rational Function Model (RFM) uses a large number of check points, computing the mean square error by comparing calculated coordinates with known coordinates. This approach is grounded in probability theory: the mean square error is estimated statistically from a large number of samples, and the estimate can be taken to approach its true value when the sample is large enough. This paper instead approaches the problem from the perspective of survey adjustment, taking the law of error propagation as its theoretical basis and deriving the theoretical precision of RFM localization. SPOT5 three-line-array imagery is then used as experimental data, and the results of the traditional method and the method described in this paper are compared; the comparison confirms the feasibility of the traditional method and answers the question of its theoretical precision from the survey adjustment perspective.
Risk managers, physicians, and disclosure of harmful medical errors.
Loren, David J; Garbutt, Jane; Dunagan, W Claiborne; Bommarito, Kerry M; Ebers, Alison G; Levinson, Wendy; Waterman, Amy D; Fraser, Victoria J; Summy, Elizabeth A; Gallagher, Thomas H
2010-03-01
Physicians are encouraged to disclose medical errors to patients, which often requires close collaboration between physicians and risk managers. An anonymous national survey of 2,988 healthcare facility-based risk managers was conducted between November 2004 and March 2005, and results were compared with those of a previous survey (conducted between July 2003 and March 2004) of 1,311 medical physicians in Washington and Missouri. Both surveys included an error-disclosure scenario for an obvious and a less obvious error with scripted response options. More risk managers than physicians were aware that an error-reporting system was present at their hospital (81% versus 39%, p < .001) and believed that mechanisms to inform physicians about errors in their hospital were adequate (51% versus 17%, p < .001). More risk managers than physicians strongly agreed that serious errors should be disclosed to patients (70% versus 49%, p < .001). Across both error scenarios, risk managers were more likely than physicians to definitely recommend that the error be disclosed (76% versus 50%, p < .001) and to provide full details about how the error would be prevented in the future (62% versus 51%, p < .001). However, physicians were more likely than risk managers to provide a full apology recognizing the harm caused by the error (39% versus 21%, p < .001). Risk managers have more favorable attitudes about disclosing errors to patients compared with physicians but are less supportive of providing a full apology. These differences may create conflicts between risk managers and physicians regarding disclosure. Health care institutions should promote greater collaboration between these two key participants in disclosure conversations.
ERIC Educational Resources Information Center
Jeptarus, Kipsamo E.; Ngene, Patrick K.
2016-01-01
The purpose of this research was to study the Lexico-semantic errors of the Keiyo-speaking standard seven primary school learners of English as a Second Language (ESL) in Keiyo District, Kenya. This study was guided by two related theories: Error Analysis Theory/Approach by Corder (1971) which approaches L2 learning through a detailed analysis of…
Social networks as embedded complex adaptive systems.
Benham-Hutchins, Marge; Clancy, Thomas R
2010-09-01
As systems evolve over time, their natural tendency is to become increasingly more complex. Studies in the field of complex systems have generated new perspectives on management in social organizations such as hospitals. Much of this research appears as a natural extension of the cross-disciplinary field of systems theory. This is the 15th in a series of articles applying complex systems science to the traditional management concepts of planning, organizing, directing, coordinating, and controlling. In this article, the authors discuss healthcare social networks as a hierarchy of embedded complex adaptive systems. The authors further examine the use of social network analysis tools as a means to understand complex communication patterns and reduce medical errors.
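As a simple illustration of the kind of social network analysis the authors discuss (hypothetical care-team data, not from the article), centrality measures can expose staff members through whom communication is funneled.

```python
# Sketch: betweenness centrality on a hypothetical care-team network.
import networkx as nx

# Edges: observed handoff/communication ties on a hypothetical unit.
g = nx.Graph([("RN_A", "RN_B"), ("RN_A", "MD_1"), ("RN_B", "PharmD"),
              ("MD_1", "PharmD"), ("RN_C", "RN_A")])

# High betweenness flags staff through whom information must pass;
# overloaded brokers are potential single points of failure for errors.
for person, score in sorted(nx.betweenness_centrality(g).items(),
                            key=lambda kv: -kv[1]):
    print(f"{person}: {score:.2f}")
```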
Schindler, Simon; Reinhard, Marc-André
2015-01-01
With the present research, we investigated effects of existential threat on veracity judgments. According to several meta-analyses, people judge potentially deceptive messages of other people as true rather than as false (the so-called truth bias). This judgmental bias has been shown to depend on how people weigh the error of judging a true message as a lie (error 1) and the error of judging a lie as a true message (error 2). The weight of these errors has been further shown to be affected by situational variables. Given that research on terror management theory has found evidence that mortality salience (MS) increases sensitivity to compliance with cultural norms, especially when they are of focal attention, we assumed that when the honesty norm is activated, MS affects judgmental error weighing and, consequently, judgmental biases. Specifically, activating the norm of honesty should decrease the weight of error 1 (the error of judging a true message as a lie) and increase the weight of error 2 (the error of judging a lie as a true message) when mortality is salient. In a first study, we found initial evidence for this assumption. Furthermore, the change in error weighing should reduce the truth bias, automatically resulting in better detection accuracy of actual lies and worse accuracy of actual true statements. In two further studies, we manipulated MS and honesty norm activation before participants judged several videos containing actual truths or lies. Results revealed evidence for our prediction. Moreover, in Study 3, the truth bias was increased after MS when group solidarity was previously emphasized. PMID:26388815
Explaining errors in children's questions.
Rowland, Caroline F
2007-07-01
The ability to explain the occurrence of errors in children's speech is an essential component of successful theories of language acquisition. The present study tested some generativist and constructivist predictions about error on the questions produced by ten English-learning children between 2 and 5 years of age. The analyses demonstrated that, as predicted by some generativist theories [e.g. Santelmann, L., Berk, S., Austin, J., Somashekar, S. & Lust, B. (2002). Continuity and development in the acquisition of inversion in yes/no questions: dissociating movement and inflection, Journal of Child Language, 29, 813-842], questions with auxiliary DO attracted higher error rates than those with modal auxiliaries. However, in wh-questions, questions with modals and DO attracted equally high error rates, and these findings could not be explained in terms of problems forming questions with why or negated auxiliaries. It was concluded that the data might be better explained in terms of a constructivist account that suggests that entrenched item-based constructions may be protected from error in children's speech, and that errors occur when children resort to other operations to produce questions [e.g. Dabrowska, E. (2000). From formula to schema: the acquisition of English questions. Cognitive Linguistics, 11, 83-102; Rowland, C. F. & Pine, J. M. (2000). Subject-auxiliary inversion errors and wh-question acquisition: What children do know? Journal of Child Language, 27, 157-181; Tomasello, M. (2003). Constructing a language: A usage-based theory of language acquisition. Cambridge, MA: Harvard University Press]. However, further work on constructivist theory development is required to allow researchers to make predictions about the nature of these operations.
Pilpel, Avital
2007-09-01
This paper is concerned with the role of rational belief change theory in the philosophical understanding of experimental error. Today, philosophers seek insight about error in the investigation of specific experiments, rather than in general theories. Nevertheless, rational belief change theory adds to our understanding of just such cases: R. A. Fisher's criticism of Mendel's experiments being a case in point. After an historical introduction, the main part of this paper investigates Fisher's paper from the point of view of rational belief change theory: what changes of belief about Mendel's experiments Fisher goes through, and with what justification. It leads to surprising insights about what Fisher did right and wrong and, more generally, about the limits of statistical methods in detecting error.
Bootstrap Estimates of Standard Errors in Generalizability Theory
ERIC Educational Resources Information Center
Tong, Ye; Brennan, Robert L.
2007-01-01
Estimating standard errors of estimated variance components has long been a challenging task in generalizability theory. Researchers have speculated about the potential applicability of the bootstrap for obtaining such estimates, but they have identified problems (especially bias) in using the bootstrap. Using Brennan's bias-correcting procedures…
Weeland, Martine M; Nijhof, Karin S; Otten, R; Vermaes, Ignace P R; Buitelaar, Jan K
2017-10-01
This study tests the validity of Beck's cognitive theory and Nolen-Hoeksema's response style theory of depression in adolescents with and without mild-to-borderline intellectual disability (MBID). The relationship between negative cognitive errors (Beck), response styles (Nolen-Hoeksema), and depressive symptoms was examined in 135 adolescents using linear regression. The cognitive error 'underestimation of the ability to cope' was more prevalent among adolescents with MBID than among adolescents with average intelligence. This was the only negative cognitive error that predicted depressive symptoms. There were no differences between groups in the prevalence of the three response styles. In line with the theory, ruminating was positively and problem-solving was negatively related to depressive symptoms. Distractive response styles were not related to depressive symptoms. The relationships between response styles, cognitive errors, and depressive symptoms were similar for both groups. The main premises of both theories of depression are equally applicable to adolescents with and without MBID. The cognitive error 'underestimation of the ability to cope' poses a specific risk factor for developing depression in adolescents with MBID and requires special attention in the treatment and prevention of depression. WHAT THIS PAPER ADDS?: Despite the high prevalence of depression among adolescents with MBID, little is known about the etiology and cognitive processes that play a role in the development of depression in this group. The current paper fills this gap in research by examining the core tenets of two important theories on the etiology of depression (Beck's cognitive theory and Nolen-Hoeksema's response style theory) in a clinical sample of adolescents with and without MBID. This paper demonstrated that the theories are equally applicable to adolescents with MBID as to adolescents with average intellectual ability. However, the cognitive bias 'underestimation of the ability to cope' was the only cognitive error related to depressive symptoms, and was much more prevalent among adolescents with MBID than among adolescents with average intellectual ability. This suggests that underestimating one's coping skills may be a unique risk factor for depression among adolescents with MBID. This knowledge is important in understanding the causes and perpetuating mechanisms of depression in adolescents with MBID, and for the development of prevention and treatment programs for adolescents with MBID. Copyright © 2017 Elsevier Ltd. All rights reserved.
Naveh, Eitan; Katz-Navon, Tal
2014-01-01
To avoid errors and improve patient safety and quality of care, health care organizations need to identify the sources of failures and facilitate implementation of corrective actions. Hence, health care organizations try to collect reports and data about errors by investing enormous resources in reporting systems. However, despite health care organizations' declared goal of increasing the voluntary reporting of errors, and although the Patient Safety and Quality Improvement Act of 2005 (S.544, Public Law 109-41) legalizes efforts to secure reporters from specific liabilities, the problem of underreporting of adverse events by staff members remains. The purpose of the paper is to develop a theory-based model and a set of propositions, grounded in a literature synthesis, to understand the antecedents of staff members' willingness to report errors. The model aims to explore the complex system of considerations employees use when deciding whether to report their errors or be silent about them. The model integrates the influences of three types of organizational climates (psychological safety, psychological contracts, and safety climate) and individual perceptions of the applicability of the organization's procedures, and proposes their mutual influence on willingness to report errors and, as a consequence, patient safety. The model suggests that managers should try to control and influence both the way employees perceive procedure applicability and the organizational context--i.e., psychological safety, no-blame contracts, and safety climate--to increase reporting and improve patient safety.
Managing heteroscedasticity in general linear models.
Rosopa, Patrick J; Schaffer, Meline M; Schroeder, Amber N
2013-09-01
Heteroscedasticity refers to a phenomenon where data violate a statistical assumption. This assumption is known as homoscedasticity. When the homoscedasticity assumption is violated, this can lead to increased Type I error rates or decreased statistical power. Because this can adversely affect substantive conclusions, the failure to detect and manage heteroscedasticity could have serious implications for theory, research, and practice. In addition, heteroscedasticity is not uncommon in the behavioral and social sciences. Thus, in the current article, we synthesize extant literature in applied psychology, econometrics, quantitative psychology, and statistics, and we offer recommendations for researchers and practitioners regarding available procedures for detecting heteroscedasticity and mitigating its effects. In addition to discussing the strengths and weaknesses of various procedures and comparing them in terms of existing simulation results, we describe a 3-step data-analytic process for detecting and managing heteroscedasticity: (a) fitting a model based on theory and saving residuals, (b) the analysis of residuals, and (c) statistical inferences (e.g., hypothesis tests and confidence intervals) involving parameter estimates. We also demonstrate this data-analytic process using an illustrative example. Overall, detecting violations of the homoscedasticity assumption and mitigating its biasing effects can strengthen the validity of inferences from behavioral and social science data.
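A minimal sketch of the three-step process in Python with statsmodels follows; the simulated data and variable names are illustrative assumptions.

```python
# Sketch of the 3-step process: (a) fit and save residuals, (b) analyze
# residuals, (c) base inferences on heteroscedasticity-consistent SEs.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 200)
y = 2.0 + 0.5 * x + rng.normal(0, 0.3 * x)  # error variance grows with x

# Step (a): fit the theory-based model and save residuals.
X = sm.add_constant(x)
fit = sm.OLS(y, X).fit()

# Step (b): analyze residuals -- Breusch-Pagan tests whether squared
# residuals are related to the predictors.
lm_stat, lm_pvalue, f_stat, f_pvalue = het_breuschpagan(fit.resid, X)
print(f"Breusch-Pagan p-value: {lm_pvalue:.4f}")

# Step (c): if heteroscedasticity is detected, use robust (HC) standard
# errors for hypothesis tests and confidence intervals.
robust = sm.OLS(y, X).fit(cov_type="HC3")
print(robust.summary().tables[1])
```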
Lin, Yanli; Moran, Tim P; Schroder, Hans S; Moser, Jason S
2015-10-01
Anxious apprehension/worry is associated with exaggerated error monitoring; however, the precise mechanisms underlying this relationship remain unclear. The current study tested the hypothesis that the worry-error monitoring relationship involves left-lateralized linguistic brain activity by examining the relationship between worry and error monitoring, indexed by the error-related negativity (ERN), as a function of hand of error (Experiment 1) and stimulus orientation (Experiment 2). Results revealed that worry was exclusively related to the ERN on right-handed errors committed by the linguistically dominant left hemisphere. Moreover, the right-hand ERN-worry relationship emerged only when stimuli were presented horizontally (known to activate verbal processes) but not vertically. Together, these findings suggest that the worry-ERN relationship involves left hemisphere verbal processing, elucidating a potential mechanism to explain error monitoring abnormalities in anxiety. Implications for theory and practice are discussed. © 2015 Society for Psychophysiological Research.
Abstract Syntax in Sentence Production: Evidence from Stem-Exchange Errors
ERIC Educational Resources Information Center
Lane, Liane Wardlow; Ferreira, Victor S.
2010-01-01
Three experiments tested theories of syntactic representation by assessing "stem-exchange" errors ("hates the record"[right arrow]"records the hate"). Previous research has shown that in stem exchanges, speakers pronounce intended nouns ("REcord") as verbs ("reCORD"), yielding syntactically well-formed utterances. By "lexically based" theories,…
GY SAMPLING THEORY AND GEOSTATISTICS: ALTERNATE MODELS OF VARIABILITY IN CONTINUOUS MEDIA
In the sampling theory developed by Pierre Gy, sample variability is modeled as the sum of a set of seven discrete error components. The variogram used in geostatistics provides an alternate model in which several of Gy's error components are combined in a continuous mode...
NASA Astrophysics Data System (ADS)
Zhong, Xuemin; Liu, Hongqi; Mao, Xinyong; Li, Bin; He, Songping; Peng, Fangyu
2018-05-01
Large multi-axis propeller-measuring machines have two types of geometric error, position-independent geometric errors (PIGEs) and position-dependent geometric errors (PDGEs), which both have significant effects on the volumetric error of the measuring tool relative to the worktable. This paper focuses on modeling, identifying and compensating for the volumetric error of the measuring machine. A volumetric error model in the base coordinate system is established based on screw theory considering all the geometric errors. In order to fully identify all the geometric error parameters, a new method for systematic measurement and identification is proposed. All the PIGEs of adjacent axes and the six PDGEs of the linear axes are identified with a laser tracker using the proposed model. Finally, a volumetric error compensation strategy is presented and an inverse kinematic solution for compensation is proposed. The final measuring and compensation experiments have further verified the efficiency and effectiveness of the measuring and identification method, indicating that the method can be used in volumetric error compensation for large machine tools.
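In generic product-of-exponentials form (schematic notation, not the paper's exact symbols), such a screw-theory error model composes a small error twist per axis with the nominal kinematics.

```latex
% Schematic screw-theory volumetric error model (generic notation):
T_{\mathrm{actual}} \;=\; \Bigl( \prod_{i=1}^{n}
    e^{\hat{\xi}_i q_i}\, e^{\hat{\varepsilon}_i(q_i)} \Bigr) T_{\mathrm{ref}},
\qquad
\Delta \;=\; T_{\mathrm{actual}}\, T_{\mathrm{nominal}}^{-1}
% where \hat{\xi}_i q_i is the nominal motion of axis i and the error twist
% \hat{\varepsilon}_i(q_i) collects its PIGEs (constant part) and PDGEs
% (position-dependent part); \Delta gives the volumetric error of the
% measuring tool relative to the worktable.
```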
NASA Technical Reports Server (NTRS)
Ortega, J. M.
1984-01-01
Several short summaries of the work performed during this reporting period are presented. Topics discussed in this document include: (1) resilient seeded errors via simple techniques; (2) knowledge representation for engineering design; (3) analysis of faults in a multiversion software experiment; (4) implementation of a parallel programming environment; (5) symbolic execution of concurrent programs; (6) two computer graphics systems for visualization of pressure distribution and convective density particles; (7) design of a source code management system; (8) vectorizing incomplete conjugate gradient on the Cyber 203/205; (9) extensions of domain testing theory; and (10) a performance analyzer for the PISCES system.
Field errors in hybrid insertion devices
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schlueter, R.D.
1995-02-01
Hybrid magnet theory as applied to the error analyses used in the design of Advanced Light Source (ALS) insertion devices is reviewed. Sources of field errors in hybrid insertion devices are discussed.
Intervention strategies for the management of human error
NASA Technical Reports Server (NTRS)
Wiener, Earl L.
1993-01-01
This report examines the management of human error in the cockpit. The principles probably apply as well to other applications in the aviation realm (e.g., air traffic control, dispatch, weather) and to other high-risk systems outside of aviation (e.g., shipping, high-technology medical procedures, military operations, nuclear power production). Management of human error is distinguished from error prevention. It is a more encompassing term, which includes not only the prevention of error but also means of keeping an error, once made, from adversely affecting system output. Such techniques include: traditional human factors engineering, improvement of feedback and feedforward of information from system to crew, 'error-evident' displays which make erroneous input more obvious to the crew, trapping of errors within a system, goal-sharing between humans and machines (also called 'intent-driven' systems), paperwork management, and behaviorally based approaches, including procedures, standardization, checklist design, training, cockpit resource management, etc. Fifteen guidelines for the design and implementation of intervention strategies are included.
ERIC Educational Resources Information Center
Nozari, Nazbanou; Dell, Gary S.; Schwartz, Myrna F.
2011-01-01
Despite the existence of speech errors, verbal communication is successful because speakers can detect (and correct) their errors. The standard theory of speech-error detection, the perceptual-loop account, posits that the comprehension system monitors production output for errors. Such a comprehension-based monitor, however, cannot explain the…
Vuk, Tomislav; Barišić, Marijan; Očić, Tihomir; Mihaljević, Ivanka; Šarlija, Dorotea; Jukić, Irena
2012-01-01
Background. Continuous and efficient error management, including procedures from error detection to their resolution and prevention, is an important part of quality management in blood establishments. At the Croatian Institute of Transfusion Medicine (CITM), error management has been performed systematically since 2003. Materials and methods. Data derived from error management at the CITM during an 8-year period (2003–2010) formed the basis of this study. Throughout the study period, errors were reported to the Department of Quality Assurance. In addition to surveys and the necessary corrective activities, errors were analysed and classified according to the Medical Event Reporting System for Transfusion Medicine (MERS-TM). Results. During the study period, a total of 2,068 errors were recorded, including 1,778 (86.0%) in blood bank activities and 290 (14.0%) in blood transfusion services. As many as 1,744 (84.3%) errors were detected before issue of the product or service. Among the 324 errors identified upon release from the CITM, 163 (50.3%) were detected by customers and reported as complaints. In only five cases was an error detected after blood product transfusion, and none had harmful consequences for the patients. All errors were therefore evaluated as "near miss" and "no harm" events. Fifty-two (2.5%) errors were evaluated as high-risk events. With regard to blood bank activities, the highest proportion of errors occurred in the processes of labelling (27.1%) and blood collection (23.7%). With regard to blood transfusion services, errors related to blood product issuing prevailed (24.5%). Conclusion. This study shows that comprehensive management of errors, including near-miss errors, can generate data on the functioning of transfusion services, which is a precondition for implementation of efficient corrective and preventive actions that will ensure further improvement of the quality and safety of transfusion treatment. PMID:22395352
Defining and Measuring Decision-Making for the Management of Trauma Patients.
Madani, Amin; Gips, Amanda; Razek, Tarek; Deckelbaum, Dan L; Mulder, David S; Grushka, Jeremy R
Effective management of trauma patients is heavily dependent on sound judgment and decision-making. Yet, current methods for training and assessing these advanced cognitive skills are subjective, lack standardization, and are prone to error. This qualitative study aims to define and characterize the cognitive and interpersonal competencies required to optimally manage injured patients. Cognitive and hierarchical task analyses for managing unstable trauma patients were performed using qualitative methods to map the thoughts, behaviors, and practices that characterize expert performance. Trauma team leaders and board-certified trauma surgeons participated in semistructured interviews that were transcribed verbatim. Data were supplemented with content from published literature and prospectively collected field notes from observations of the trauma team during trauma activations. The data were coded and analyzed using grounded theory by 2 independent reviewers. A framework was created based on 14 interviews with experts (lasting 1-2 hours each), 35 field observations (20 [57%] blunt; 15 [43%] penetrating; median Injury Severity Score 20 [13-25]), and 15 literary sources. Experts included 11 trauma surgeons and 3 emergency physicians from 7 Level 1 academic institutions in North America (median years in practice: 12 [8-17]). Twenty-nine competencies were identified, including 17 (59%) related to situation awareness, 6 (21%) involving decision-making, and 6 (21%) requiring interpersonal skills. Of 40 potential errors that were identified, root causes were mapped to errors in situation awareness (20 [50%]), decision-making (10 [25%]), or interpersonal skills (10 [25%]). This study defines cognitive and interpersonal competencies that are essential for the management of trauma patients. This framework may serve as the basis for novel curricula to train and assess decision-making skills, and to develop quality-control metrics to improve team and individual performance. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Bowen, J. Philip; Sorensen, Jennifer B.; Kirschner, Karl N.
2007-01-01
The analysis explains the basis set superposition error (BSSE) and the fragment relaxation involved in calculating interaction energies using various first-principles theories. Correlating the interacting fragments and increasing the size of the basis set can help decrease the BSSE to a great extent.
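For reference, the standard Boys-Bernardi counterpoise scheme usually invoked in this context (general background, not necessarily the article's exact notation) estimates a BSSE-corrected interaction energy as follows.

```latex
% Counterpoise-corrected interaction energy (Boys-Bernardi, schematic):
\Delta E_{\mathrm{int}}^{\mathrm{CP}}
    \;=\; E_{AB}^{AB} \;-\; E_{A}^{AB} \;-\; E_{B}^{AB}
% with subscripts labeling the system and superscripts the basis set, all
% evaluated at the dimer geometry. The BSSE itself is the (non-negative)
% lowering of each fragment energy in the full dimer basis:
\mathrm{BSSE} \;=\; \bigl( E_{A}^{A} - E_{A}^{AB} \bigr)
    \;+\; \bigl( E_{B}^{B} - E_{B}^{AB} \bigr)
```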
Jegathesan, Mithila; Vitberg, Yaffa M; Pusic, Martin V
2016-02-11
Intelligence theory research has illustrated that people hold either "fixed" (intelligence is immutable) or "growth" (intelligence can be improved) mindsets and that these views may affect how people learn throughout their lifetime. Little is known about the mindsets of physicians, and how mindset may affect their lifetime learning and integration of feedback. Our objective was to determine if pediatric physicians are of the "fixed" or "growth" mindset and whether individual mindset affects perception of medical error reporting. We sent an anonymous electronic survey to pediatric residents and attending pediatricians at a tertiary care pediatric hospital. Respondents completed the "Theories of Intelligence Inventory" which classifies individuals on a 6-point scale ranging from 1 (Fixed Mindset) to 6 (Growth Mindset). Subsequent questions collected data on respondents' recall of medical errors by self or others. We received 176/349 responses (50 %). Participants were equally distributed between mindsets with 84 (49 %) classified as "fixed" and 86 (51 %) as "growth". Residents, fellows and attendings did not differ in terms of mindset. Mindset did not correlate with the small number of reported medical errors. There is no dominant theory of intelligence (mindset) amongst pediatric physicians. The distribution is similar to that seen in the general population. Mindset did not correlate with error reports.
Intuitive theories of information: beliefs about the value of redundancy.
Soll, J B
1999-03-01
In many situations, quantity estimates from multiple experts or diagnostic instruments must be collected and combined. Normatively, and all else equal, one should value information sources that are nonredundant, in the sense that correlation in forecast errors should be minimized. Past research on the preference for redundancy has been inconclusive. While some studies have suggested that people correctly place higher value on uncorrelated inputs when collecting estimates, others have shown that people either ignore correlation or, in some cases, even prefer it. The present experiments show that the preference for redundancy depends on one's intuitive theory of information. The most common intuitive theory identified is the Error Tradeoff Model (ETM), which explicitly distinguishes between measurement error and bias. According to ETM, measurement error can only be averaged out by consulting the same source multiple times (normatively false), and bias can only be averaged out by consulting different sources (normatively true). As a result, ETM leads people to prefer redundant estimates when the ratio of measurement error to bias is relatively high. Other participants favored different theories. Some adopted the normative model, while others were reluctant to mathematically average estimates from different sources in any circumstance. In a post hoc analysis, science majors were more likely than others to subscribe to the normative model. While tentative, this result lends insight into how intuitive theories might develop and also has potential ramifications for how statistical concepts such as correlation might best be learned and internalized. Copyright 1999 Academic Press.
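The normative point about redundancy can be stated compactly (a standard statistical result, not quoted from the paper): for n unbiased estimates with common error variance sigma^2 and pairwise error correlation rho, the variance of their average is given below.

```latex
\operatorname{Var}(\bar{x}) \;=\; \frac{\sigma^{2}}{n}\,
    \bigl[\, 1 + (n-1)\rho \,\bigr]
    \;\xrightarrow{\;n \to \infty\;}\; \rho\,\sigma^{2}
% Correlated (redundant) sources leave an irreducible error floor that no
% amount of averaging removes -- the normative reason to prefer
% nonredundant sources.
```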
Towards a Framework for Managing Risk Associated with Technology-Induced Error.
Borycki, Elizabeth M; Kushniruk, Andre W
2017-01-01
Health information technologies (HIT) promised to streamline and modernize healthcare processes. However, a growing body of research has indicated that if such technologies are not designed, implemented, or maintained properly, this may lead to an increased incidence of new types of errors, which the authors have referred to as "technology-induced errors". In this paper, a framework is presented that can be used to manage HIT risk. The framework considers the reduction of technology-induced errors at different stages by managing risks associated with the implementation of HIT. Frameworks that allow health information technology managers to employ proactive and preventative approaches to managing the risks associated with technology-induced errors are critical to improving HIT safety and managing the risk associated with implementing new technologies.
Linguistic pattern analysis of misspellings of typically developing writers in grades 1-9.
Bahr, Ruth Huntley; Silliman, Elaine R; Berninger, Virginia W; Dow, Michael
2012-12-01
A mixed-methods approach, evaluating triple word-form theory, was used to describe linguistic patterns of misspellings. Spelling errors were taken from narrative and expository writing samples provided by 888 typically developing students in Grades 1-9. Errors were coded by category (phonological, orthographic, and morphological) and specific linguistic feature affected. Grade-level effects were analyzed with trend analysis. Qualitative analyses determined frequent error types and how use of specific linguistic features varied across grades. Phonological, orthographic, and morphological errors were noted across all grades, but orthographic errors predominated. Linear trends revealed developmental shifts in error proportions for the orthographic and morphological categories between Grades 4 and 5. Similar error types were noted across age groups, but the nature of linguistic feature error changed with age. Triple word-form theory was supported. By Grade 1, orthographic errors predominated, and phonological and morphological error patterns were evident. Morphological errors increased in relative frequency in older students, probably due to a combination of word-formation issues and vocabulary growth. These patterns suggest that normal spelling development reflects nonlinear growth and that it takes a long time to develop a robust orthographic lexicon that coordinates phonology, orthography, and morphology and supports word-specific, conventional spelling.
On the Lennard-Jones and Devonshire theory for solid state thermodynamics
NASA Astrophysics Data System (ADS)
Lustig, Rolf
2017-06-01
The Lennard-Jones and Devonshire theory is developed into a self-consistent scheme for essentially complete thermodynamic information. The resulting methodology is compared with molecular simulation of the Lennard-Jones system in the face-centred-cubic solid state over an extensive range of state points. The thermal and caloric equations of state are in almost perfect agreement along the entire fluid-solid coexistence lines over more than six orders of magnitude in pressure. For homogeneous densities greater than twice the solid triple point density, the theory is essentially exact for derivatives of the Helmholtz energy. However, the fluid-solid phase equilibria are in disagreement with simulation. It is shown that the theory is in error by an additive constant to the Helmholtz energy A/(NkBT). Empirical inclusion of the error term makes all fluid-solid equilibria indistinguishable from exact results. Some arguments about the origin of the error are given.
Analysis on the dynamic error for optoelectronic scanning coordinate measurement network
NASA Astrophysics Data System (ADS)
Shi, Shendong; Yang, Linghui; Lin, Jiarui; Guo, Siyang; Ren, Yongjie
2018-01-01
Large-scale dynamic three-dimensional coordinate measurement techniques are in strong demand in equipment manufacturing. Noted for its high accuracy, scale expandability, and multitask parallel measurement, the optoelectronic scanning measurement network has attracted close attention. It is widely used in the joining of large components, spacecraft rendezvous and docking simulation, digital shipbuilding, and automated guided vehicle navigation. At present, most research on optoelectronic scanning measurement networks focuses on static measurement capability, and research on dynamic accuracy is insufficient. Limited by the measurement principle, the dynamic error is non-negligible and restricts application. The workshop measurement and positioning system is a representative system that can, in theory, perform dynamic measurement. In this paper we investigate the sources of dynamic error in depth and divide them into two parts: phase error and synchronization error. A dynamic error model is constructed and, based on this theory, a simulation of the dynamic error is carried out. The dynamic error is quantified, and patterns of volatility and periodicity are found; the dynamic error characteristics are shown in detail. These results lay a foundation for further accuracy improvement.
Bayesian learning for spatial filtering in an EEG-based brain-computer interface.
Zhang, Haihong; Yang, Huijuan; Guan, Cuntai
2013-07-01
Spatial filtering for EEG feature extraction and classification is an important tool in brain-computer interface. However, there is generally no established theory that links spatial filtering directly to Bayes classification error. To address this issue, this paper proposes and studies a Bayesian analysis theory for spatial filtering in relation to Bayes error. Following the maximum entropy principle, we introduce a gamma probability model for describing single-trial EEG power features. We then formulate and analyze the theoretical relationship between Bayes classification error and the so-called Rayleigh quotient, which is a function of spatial filters and basically measures the ratio in power features between two classes. This paper also reports our extensive study that examines the theory and its use in classification, using three publicly available EEG data sets and state-of-the-art spatial filtering techniques and various classifiers. Specifically, we validate the positive relationship between Bayes error and Rayleigh quotient in real EEG power features. Finally, we demonstrate that the Bayes error can be practically reduced by applying a new spatial filter with lower Rayleigh quotient.
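The Rayleigh quotient at the heart of this analysis is straightforward to compute. A minimal sketch (our own illustration with synthetic covariances; the variable names and data are assumed, not the paper's) for a single spatial filter w and two-class EEG covariance estimates:

```python
import numpy as np
from scipy.linalg import eigh

def rayleigh_quotient(w, cov_a, cov_b):
    """Ratio of filtered power between class A and class B for spatial filter w."""
    return (w @ cov_a @ w) / (w @ cov_b @ w)

rng = np.random.default_rng(1)
n_channels = 8
# Toy symmetric positive-definite matrices standing in for per-class EEG covariances.
A = rng.standard_normal((n_channels, n_channels))
B = rng.standard_normal((n_channels, n_channels))
cov_a = A @ A.T + np.eye(n_channels)
cov_b = B @ B.T + np.eye(n_channels)

# CSP-style filters solve the generalized eigenproblem cov_a w = lambda * cov_b w;
# each eigenvector's Rayleigh quotient equals its eigenvalue.
eigvals, eigvecs = eigh(cov_a, cov_b)
w = eigvecs[:, 0]                      # filter with the smallest quotient
print(rayleigh_quotient(w, cov_a, cov_b), "==", eigvals[0])
```

Picking filters by extreme eigenvalues is exactly the sense in which lowering the Rayleigh quotient can lower the (estimated) Bayes error.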
Patient safety education at Japanese medical schools: results of a nationwide survey
2012-01-01
Background: Patient safety education, including error prevention strategies and the management of adverse events, has become a topic of worldwide concern. The importance of patient safety has also been recognized in Japan following two serious medical accidents in 1999. Furthermore, the 2008 revision of the educational curriculum guidelines by the Ministry of Education includes patient safety as part of the core medical curriculum. However, little is known about patient safety education in Japanese medical schools, partly because no comprehensive study has yet been conducted in this field. We therefore conducted a nationwide survey to clarify the current status of patient safety education at medical schools in Japan. Results: The response rate was 60.0% (n = 48/80). Ninety-eight percent of respondents (n = 47/48) reported integration of patient safety education into their curricula. Thirty-nine percent reported devoting less than five hours to the topic. All schools that teach patient safety reported using lecture-based teaching methods, while few used alternative methods such as role-playing or in-hospital training. Topics related to medical error theory and the legal ramifications of error are widely taught, while practical topics related to error analysis, such as root cause analysis, are covered less often. Conclusions: Based on the survey responses, most Japanese medical schools have incorporated patient safety into their curricula. However, the number of hours devoted to patient safety education falls short of a sufficient level, with forty percent of medical schools devoting five hours or less to it. In addition, most medical schools employ only lecture-based learning, lacking diversity in teaching methods. Although most medical schools cover basic error theory, error analysis is taught at fewer schools. Medical safety curricula still need improvement. We believe this study has implications for the rest of the world as a model of what is possible and a sounding board for which topics might be important. PMID:22574712
Asymptotic Standard Errors for Item Response Theory True Score Equating of Polytomous Items
ERIC Educational Resources Information Center
Cher Wong, Cheow
2015-01-01
Building on previous works by Lord and Ogasawara for dichotomous items, this article proposes an approach to derive the asymptotic standard errors of item response theory true score equating involving polytomous items, for equivalent and nonequivalent groups of examinees. This analytical approach could be used in place of empirical methods like…
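The article's derivation is only summarized here, but asymptotic standard errors of this kind generically follow from the delta method: if the equated true score is a smooth function f of the estimated item parameters θ̂ with asymptotic covariance matrix Σ, then (a general statement, not the paper's specific result)

```latex
\operatorname{SE}\!\left(f(\hat\theta)\right) \approx \sqrt{\nabla f(\hat\theta)^{\top} \, \Sigma \, \nabla f(\hat\theta)}
```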
The Importance of the Assumption of Uncorrelated Errors in Psychometric Theory
ERIC Educational Resources Information Center
Raykov, Tenko; Marcoulides, George A.; Patelis, Thanos
2015-01-01
A critical discussion of the assumption of uncorrelated errors in classical psychometric theory and its applications is provided. It is pointed out that this assumption is essential for a number of fundamental results and underlies the concept of parallel tests, the Spearman-Brown prophecy formula, and the correction for attenuation formula, as well as…
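For reference, the two classical-test-theory results named above take the standard forms (with ρ_XX' and ρ_YY' the reliabilities and k the lengthening factor):

```latex
\rho_{kk'} = \frac{k\,\rho_{XX'}}{1+(k-1)\,\rho_{XX'}},
\qquad
\rho_{T_X T_Y} = \frac{\rho_{XY}}{\sqrt{\rho_{XX'}\,\rho_{YY'}}}
```

Both derivations require error terms that are uncorrelated with each other and with true scores, which is why the assumption is so consequential.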
ERIC Educational Resources Information Center
van Veen, V.; Holroyd, C.B.; Cohen, J.D.; Stenger, V.A.; Carter, C.S.
2004-01-01
Recent theories of the neural basis of performance monitoring have emphasized a central role for the anterior cingulate cortex (ACC). Replicating an earlier event-related potential (ERP) study, which showed an error feedback negativity that was modeled as having an ACC generator, we used event-related fMRI to investigate whether the ACC would…
Standard errors in forest area
Joseph McCollum
2002-01-01
I trace the development of standard error equations for forest area, beginning with the theory behind double sampling and the variance of a product. The discussion then shifts to the particular problem of forest area, at which point the theory becomes relevant. There are subtle difficulties in deciding which variance-of-a-product equation should be used. The equations...
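As background, Goodman's classical result for the variance of a product (quoted here generically, not as the equation the paper settles on) states that for independent estimates x and y with means μ_x, μ_y and variances σ_x², σ_y²,

```latex
\operatorname{Var}(xy) = \mu_x^2 \sigma_y^2 + \mu_y^2 \sigma_x^2 + \sigma_x^2 \sigma_y^2
```

One common source of discrepancy among candidate equations is that first-order approximations drop the final cross term.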
SPSS and SAS programs for generalizability theory analyses.
Mushquash, Christopher; O'Connor, Brian P
2006-08-01
The identification and reduction of measurement errors is a major challenge in psychological testing. Most investigators rely solely on classical test theory for assessing reliability, whereas most experts have long recommended using generalizability theory instead. One reason for the common neglect of generalizability theory is the absence of analytic facilities for this purpose in popular statistical software packages. This article provides a brief introduction to generalizability theory, describes easy to use SPSS, SAS, and MATLAB programs for conducting the recommended analyses, and provides an illustrative example, using data (N = 329) for the Rosenberg Self-Esteem Scale. Program output includes variance components, relative and absolute errors and generalizability coefficients, coefficients for D studies, and graphs of D study results.
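For orientation, in the simplest single-facet persons-by-items (p × i) design, the coefficients such programs report are built from the estimated variance components roughly as follows (standard G-theory definitions; the notation is ours):

```latex
E\rho^2 = \frac{\sigma^2_p}{\sigma^2_p + \sigma^2_{pi,e}/n_i},
\qquad
\Phi = \frac{\sigma^2_p}{\sigma^2_p + \left(\sigma^2_i + \sigma^2_{pi,e}\right)/n_i}
```

where Eρ² is the generalizability (relative-error) coefficient and Φ the dependability (absolute-error) coefficient for a D study with n_i items.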
Law, Katherine E; Ray, Rebecca D; D'Angelo, Anne-Lise D; Cohen, Elaine R; DiMarco, Shannon M; Linsmeier, Elyse; Wiegmann, Douglas A; Pugh, Carla M
The study aim was to determine whether residents' error management strategies changed across 2 simulated laparoscopic ventral hernia (LVH) repair procedures after receiving feedback on their initial performance. We hypothesize that error detection and recovery strategies would improve during the second procedure without hands-on practice. Retrospective review of participant procedural performances of simulated laparoscopic ventral herniorrhaphy. A total of 3 investigators reviewed procedure videos to identify surgical errors. Errors were deconstructed. Error management events were noted, including error identification and recovery. Residents performed the simulated LVH procedures during a course on advanced laparoscopy. Participants had 30 minutes to complete a LVH procedure. After verbal and simulator feedback, residents returned 24 hours later to perform a different, more difficult simulated LVH repair. Senior (N = 7; postgraduate year 4-5) residents in attendance at the course participated in this study. In the first LVH procedure, residents committed 121 errors (M = 17.14, standard deviation = 4.38). Although the number of errors increased to 146 (M = 20.86, standard deviation = 6.15) during the second procedure, residents progressed further in the second procedure. There was no significant difference in the number of errors committed for both procedures, but errors shifted to the late stage of the second procedure. Residents changed the error types that they attempted to recover (χ²(5) = 24.96, p < 0.001). For the second procedure, recovery attempts increased for action and procedure errors, but decreased for strategy errors. Residents also recovered the most errors in the late stage of the second procedure (p < 0.001). Residents' error management strategies changed between procedures following verbal feedback on their initial performance and feedback from the simulator. Errors and recovery attempts shifted to later steps during the second procedure. This may reflect residents' error management success in the earlier stages, which allowed further progression in the second simulation. Incorporating error recognition and management opportunities into surgical training could help track residents' learning curve and provide detailed, structured feedback on technical and decision-making skills. Copyright © 2016 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
Optimal information transfer in enzymatic networks: A field theoretic formulation
NASA Astrophysics Data System (ADS)
Samanta, Himadri S.; Hinczewski, Michael; Thirumalai, D.
2017-07-01
Signaling in enzymatic networks is typically triggered by environmental fluctuations, resulting in a series of stochastic chemical reactions, leading to corruption of the signal by noise. For example, information flow is initiated by binding of extracellular ligands to receptors, which is transmitted through a cascade involving kinase-phosphatase stochastic chemical reactions. For a class of such networks, we develop a general field-theoretic approach to calculate the error in signal transmission as a function of an appropriate control variable. Application of the theory to a simple push-pull network, a module in the kinase-phosphatase cascade, recovers the exact results for error in signal transmission previously obtained using umbral calculus [Hinczewski and Thirumalai, Phys. Rev. X 4, 041017 (2014), 10.1103/PhysRevX.4.041017]. We illustrate the generality of the theory by studying the minimal errors in noise reduction in a reaction cascade with two connected push-pull modules. Such a cascade behaves as an effective three-species network with a pseudointermediate. In this case, optimal information transfer, resulting in the smallest square of the error between the input and output, occurs with a time delay, which is given by the inverse of the decay rate of the pseudointermediate. Surprisingly, in these examples the minimum error computed using simulations that take nonlinearities and discrete nature of molecules into account coincides with the predictions of a linear theory. In contrast, there are substantial deviations between simulations and predictions of the linear theory in error in signal propagation in an enzymatic push-pull network for a certain range of parameters. Inclusion of second-order perturbative corrections shows that differences between simulations and theoretical predictions are minimized. Our study establishes that a field theoretic formulation of stochastic biological signaling offers a systematic way to understand error propagation in networks of arbitrary complexity.
A stochastic dynamic model for human error analysis in nuclear power plants
NASA Astrophysics Data System (ADS)
Delgado-Loperena, Dharma
Nuclear disasters like Three Mile Island and Chernobyl indicate that human performance is a critical safety issue, sending a clear message about the need to include environmental press and competence aspects in research. This investigation was undertaken to serve as a roadmap for studying human behavior through the formulation of a general solution equation. The theoretical model integrates models from two heretofore-disassociated disciplines (behavior specialists and technical specialists) that have historically studied the nature of error and human behavior independently; it incorporates concepts derived from fractal and chaos theory and suggests re-evaluation of base theory regarding human error. The results of this research were based on a comprehensive analysis of patterns of error, with the omnipresent underlying structure of chaotic systems. The study of patterns led to a dynamic formulation that can serve as a basis for other formulas used to study the consequences of human error. The literature search regarding error highlighted the need to include concepts rooted in chaos theory and strange attractors, heretofore unconsidered by mainstream researchers who investigated human error in nuclear power plants or who employed the ecological model in their work. The study of patterns obtained from simulation of a steam generator tube rupture (SGTR) event provided a direct application to aspects of control room operations in nuclear power plants. In doing so, a conceptual foundation based on understanding the patterns of human error analysis can be gleaned, helping to reduce and prevent undesirable events.
Error Analyses of the North Alabama Lightning Mapping Array (LMA)
NASA Technical Reports Server (NTRS)
Koshak, W. J.; Solokiewicz, R. J.; Blakeslee, R. J.; Goodman, S. J.; Christian, H. J.; Hall, J. M.; Bailey, J. C.; Krider, E. P.; Bateman, M. G.; Boccippio, D. J.
2003-01-01
Two approaches are used to characterize how accurately the North Alabama Lightning Mapping Array (LMA) is able to locate lightning VHF sources in space and in time. The first method uses a Monte Carlo computer simulation to estimate source retrieval errors. The simulation applies a VHF source retrieval algorithm that was recently developed at the NASA-MSFC and that is similar, but not identical to, the standard New Mexico Tech retrieval algorithm. The second method uses a purely theoretical technique (i.e., chi-squared Curvature Matrix theory) to estimate retrieval errors. Both methods assume that the LMA system has an overall rms timing error of 50 ns, but all other possible errors (e.g., multiple sources per retrieval attempt) are neglected. The detailed spatial distributions of retrieval errors are provided. Given that the two methods are completely independent of one another, it is shown that they provide remarkably similar results, except that the chi-squared theory produces larger altitude error estimates than the (more realistic) Monte Carlo simulation.
NASA Technical Reports Server (NTRS)
Jones, B. G.; Planchon, H. P., Jr.
1973-01-01
Work during the period of this report has been in three areas: (1) pressure transducer error analysis, (2) fluctuating velocity and pressure measurements in the NASA Lewis 6-inch diameter quiet jet facility, and (3) measurement analysis. A theory was developed and experimentally verified to quantify the pressure transducer velocity interference error. The theory and supporting experimental evidence show that the errors are a function of the velocity field's turbulent structure. It is shown that near the mixing layer center the errors are negligible. Turbulent velocity and pressure measurements were made in the NASA Lewis quiet jet facility. Some preliminary results are included.
Error and Uncertainty Quantification in the Numerical Simulation of Complex Fluid Flows
NASA Technical Reports Server (NTRS)
Barth, Timothy J.
2010-01-01
The failure of numerical simulation to predict physical reality is often a direct consequence of the compounding effects of numerical error arising from finite-dimensional approximation and physical model uncertainty resulting from inexact knowledge and/or statistical representation. In this topical lecture, we briefly review systematic theories for quantifying numerical errors and restricted forms of model uncertainty occurring in simulations of fluid flow. A goal of this lecture is to elucidate both positive and negative aspects of applying these theories to practical fluid flow problems. Finite-element and finite-volume calculations of subsonic and hypersonic fluid flow are presented to contrast the differing roles of numerical error and model uncertainty for these problems.
Control by model error estimation
NASA Technical Reports Server (NTRS)
Likins, P. W.; Skelton, R. E.
1976-01-01
Modern control theory relies upon the fidelity of the mathematical model of the system. Truncated modes, external disturbances, and parameter errors in linear system models are corrected by augmenting to the original system of equations an 'error system' which is designed to approximate the effects of such model errors. A Chebyshev error system is developed for application to the Large Space Telescope (LST).
Research on effects of phase error in phase-shifting interferometer
NASA Astrophysics Data System (ADS)
Wang, Hongjun; Wang, Zhao; Zhao, Hong; Tian, Ailing; Liu, Bingcai
2007-12-01
In phase-shifting interferometry, the phase-shifting error introduced by the phase shifter is the main factor that directly affects the measurement accuracy of a phase-shifting interferometer. In this paper, the sources and types of phase-shifting error are introduced, and some methods to eliminate these errors are discussed. Based on the theory of phase-shifting interferometry, the effects of phase-shifting error are analyzed in detail. A Liquid Crystal Display (LCD) used as a phase shifter has the advantage that the phase shift can be controlled digitally, without any mechanically moving or rotating element; the phase shift in the measuring system is induced by changing the coded image displayed on the LCD. The LCD's phase modulation characteristic is analyzed theoretically and tested. Based on the Fourier transform, a model of the effect of the phase error originating from the LCD is established for four-step phase-shifting interferometry, and the error range is obtained. To reduce the error, a new error compensation algorithm is put forward: the error is estimated by processing the interferograms, the interferograms are compensated, and the measurement result is obtained from the four compensated phase-shifted interferograms. Theoretical analysis and simulation results demonstrate the feasibility of this approach to improving measurement accuracy.
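For reference, the four-step algorithm that such error models target recovers the wrapped phase from four interferograms captured at nominal shifts of 0, π/2, π, and 3π/2. A textbook sketch with synthetic fringes (not the authors' code):

```python
import numpy as np

def four_step_phase(i1, i2, i3, i4):
    """Wrapped phase from four frames at nominal shifts 0, pi/2, pi, 3*pi/2."""
    return np.arctan2(i4 - i2, i1 - i3)

# Synthetic test: with ideal pi/2 steps, a known phase map is recovered exactly.
x = np.linspace(0, 4 * np.pi, 256)
true_phase = np.angle(np.exp(1j * x))          # wrapped ground truth in (-pi, pi]
frames = [1.0 + 0.5 * np.cos(true_phase + k * np.pi / 2) for k in range(4)]
recovered = four_step_phase(*frames)
print("max abs error (rad):", np.abs(recovered - true_phase).max())

# A constant shifter miscalibration eps (actual shifts k*(pi/2 + eps)) biases the
# result; that is the kind of error a compensation algorithm is meant to remove.
```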
Factors associated with reporting of medication errors by Israeli nurses.
Kagan, Ilya; Barnoy, Sivia
2008-01-01
This study investigated medication error reporting among Israeli nurses, the relationship between nurses' personal views about error reporting, and the impact of the safety culture of the ward and hospital on this reporting. Nurses (n = 201) completed a questionnaire related to different aspects of error reporting (frequency, organizational norms of dealing with errors, and personal views on reporting). The higher the error frequency, the more errors went unreported. If the ward nurse manager corrected errors on the ward, error self-reporting decreased significantly. Ward nurse managers have to provide good role models.
2016-07-01
Reports an error in "Do positive affectivity and boundary preferences matter for work-family enrichment? A study of human service workers" by Laurel A. McNall, Lindsay D. Scott and Jessica M. Nicklin (Journal of Occupational Health Psychology, 2015[Jan], Vol 20[1], 93-104). In the article there was an error in Figure 1. The lower left bubble should read "Boundary Preference Toward Segmentation" instead of "Boundary Preference Toward Integration." (The following abstract of the original article appeared in record 2014-44477-001.) More individuals than ever are managing work and family roles, but relatively little research has been done exploring whether boundary preferences help individuals benefit from multiple role memberships. Drawing on Greenhaus and Powell's (2006) work-family enrichment theory, along with Boundary Theory (Ashforth, Kreiner, & Fugate, 2000) and Conservation of Resources Theory (Hobfoll, 2002), we explore the impact of personal characteristics as enablers of work-family enrichment, and in turn, work outcomes relevant to human service workers: turnover intentions and emotional exhaustion. In a 2-wave study of 161 human service employees, we found that individuals high in positive affectivity were more likely to experience both work-to-family and family-to-work enrichment, whereas those with preferences toward integration were more likely to experience work-to-family enrichment (but not family-to-work enrichment). In turn, work-to-family enrichment (but not family-to-work enrichment) was related to lower turnover intentions and emotional exhaustion. Enrichment served as a mediating mechanism for only some of the hypothesized relationships. Implications for theory and practice are discussed. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Processor register error correction management
Bose, Pradip; Cher, Chen-Yong; Gupta, Meeta S.
2016-12-27
Processor register protection management is disclosed. In embodiments, a method of processor register protection management can include determining a sensitive logical register for executable code generated by a compiler, generating an error-correction table identifying the sensitive logical register, and storing the error-correction table in a memory accessible by a processor. The processor can be configured to generate a duplicate register of the sensitive logical register identified by the error-correction table.
Kim, Myoung Soo
2012-08-01
The purpose of this cross-sectional study was to examine the current status of construction of IT-based medication error prevention systems and the relationships among system construction, medication error management climate, and perception of system use. The participants were 124 patient safety chief managers working for 124 hospitals with over 300 beds in Korea. The characteristics of the participants, the construction status and perception of the systems (electronic pharmacopoeia, electronic drug dosage calculation system, computer-based patient safety reporting, and bar-code system), and the medication error management climate were measured in this study. The data were collected between June and August 2011. Descriptive statistics, partial Pearson correlation, and MANCOVA were used for data analysis. Electronic pharmacopoeias had been constructed in 67.7% of participating hospitals, computer-based patient safety reporting systems in 50.8%, and electronic drug dosage calculation systems in 32.3%. Bar-code systems showed the lowest construction rate, at 16.1% of Korean hospitals. Higher rates of construction of IT-based medication error prevention systems were associated with greater safety and a more positive error management climate. Supportive strategies for improving the perception of IT-based system use would encourage further system construction, and a positive error management climate would then be more easily promoted.
Utilizing measure-based feedback in control-mastery theory: A clinical error.
Snyder, John; Aafjes-van Doorn, Katie
2016-09-01
Clinical errors and ruptures are an inevitable part of clinical practice. Often, therapists are unaware that a clinical error or rupture has occurred, leaving no space for repair and potentially leading to patient dropout and/or less effective treatment. One way to overcome our blind spots is by frequently and systematically collecting measure-based feedback from the patient. Patient feedback measures that focus on the process of psychotherapy, such as the Patient's Experience of Attunement and Responsiveness scale (PEAR), can be used in conjunction with treatment outcome measures such as the Outcome Questionnaire 45.2 (OQ-45.2) to monitor the patient's therapeutic experience and progress. The regular use of these types of measures can aid clinicians in the identification of clinical errors and the associated patient deterioration that might otherwise go unnoticed and unaddressed. The current case study describes an instance of clinical error that occurred during the 2-year treatment of a highly traumatized young woman. The clinical error was identified using measure-based feedback and subsequently understood and addressed from the theoretical standpoint of the control-mastery theory of psychotherapy. An alternative hypothetical response is also presented and explained using control-mastery theory. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Fostering the Intelligent Novice: Learning from Errors with Metacognitive Tutoring
ERIC Educational Resources Information Center
Mathan, Santosh A.; Koedinger, Kenneth R.
2005-01-01
This article explores 2 important aspects of metacognition: (a) how students monitor their ongoing performance to detect and correct errors and (b) how students reflect on those errors to learn from them. Although many instructional theories have advocated providing students with immediate feedback on errors, some researchers have argued that…
A Mechanism for Error Detection in Speeded Response Time Tasks
ERIC Educational Resources Information Center
Holroyd, Clay B.; Yeung, Nick; Coles, Michael G. H.; Cohen, Jonathan D.
2005-01-01
The concept of error detection plays a central role in theories of executive control. In this article, the authors present a mechanism that can rapidly detect errors in speeded response time tasks. This error monitor assigns values to the output of cognitive processes involved in stimulus categorization and response generation and detects errors…
Using video recording to identify management errors in pediatric trauma resuscitation.
Oakley, Ed; Stocker, Sergio; Staubli, Georg; Young, Simon
2006-03-01
To determine the ability of video recording to identify management errors in trauma resuscitation and to compare this method with medical record review. The resuscitation of children who presented to the emergency department of the Royal Children's Hospital between February 19, 2001, and August 18, 2002, for whom the trauma team was activated was video recorded. The tapes were analyzed, and management was compared with Advanced Trauma Life Support guidelines. Deviations from these guidelines were recorded as errors. Fifty video recordings were analyzed independently by 2 reviewers. Medical record review was undertaken for a cohort of the most seriously injured patients, and errors were identified. The errors detected with the 2 methods were compared. Ninety resuscitations were video recorded and analyzed. An average of 5.9 errors per resuscitation was identified with this method (range: 1-12 errors). Twenty-five children (28%) had an injury severity score of >11; there was an average of 2.16 errors per patient in this group. Only 10 (20%) of these errors were detected in the medical record review. Medical record review detected an additional 8 errors that were not evident on the video recordings. Concordance between independent reviewers was high, with 93% agreement. Video recording is more effective than medical record review in detecting management errors in pediatric trauma resuscitation. Management errors in pediatric trauma resuscitation are common and often involve basic resuscitation principles. Resuscitation of the most seriously injured children was associated with fewer errors. Video recording is a useful adjunct to trauma resuscitation auditing.
Prospect theory does not describe the feedback-related negativity value function.
Sambrook, Thomas D; Roser, Matthew; Goslin, Jeremy
2012-12-01
Humans handle uncertainty poorly. Prospect theory accounts for this with a value function in which possible losses are overweighted compared to possible gains, and the marginal utility of rewards decreases with size. fMRI studies have explored the neural basis of this value function. A separate body of research claims that prediction errors are calculated by midbrain dopamine neurons. We investigated whether the prospect theoretic effects shown in behavioral and fMRI studies were present in midbrain prediction error coding by using the feedback-related negativity, an ERP component believed to reflect midbrain prediction errors. Participants' stated satisfaction with outcomes followed prospect theory but their feedback-related negativity did not, instead showing no effect of marginal utility and greater sensitivity to potential gains than losses. Copyright © 2012 Society for Psychophysiological Research.
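The value function at issue is the standard Kahneman-Tversky form (quoted as background; the commonly cited parameter estimates α ≈ β ≈ 0.88 and λ ≈ 2.25 come from the original prospect theory literature, not from this study):

```latex
v(x) =
\begin{cases}
x^{\alpha}, & x \ge 0 \\
-\lambda\,(-x)^{\beta}, & x < 0
\end{cases}
\qquad 0 < \alpha, \beta < 1, \quad \lambda > 1
```

Loss aversion corresponds to λ > 1 and diminishing marginal utility to α, β < 1; the abstract reports that the feedback-related negativity exhibited neither feature.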
On the relationship between aerosol content and errors in telephotometer experiments.
NASA Technical Reports Server (NTRS)
Thomas, R. W. L.
1971-01-01
This paper presents an invariant imbedding theory of multiple scattering phenomena contributing to errors in telephotometer experiments. The theory indicates that there is a simple relationship between the magnitudes of the errors introduced by successive orders of scattering and it is shown that for all optical thicknesses each order can be represented by a coefficient which depends on the field of view of the telescope and the properties of the scattering medium. The verification of the theory and the derivation of the coefficients have been accomplished by a Monte Carlo program. Both monodisperse and polydisperse systems of Mie scatterers have been treated. The results demonstrate that for a given optical thickness the coefficients increase strongly with the mean particle size particularly for the smaller fields of view.
Experiences from the testing of a theory for modelling groundwater flow in heterogeneous media
Christensen, S.; Cooley, R.L.
2002-01-01
Usually, small-scale model error is present in groundwater modelling because the model only represents average system characteristics having the same form as the drift and small-scale variability is neglected. These errors cause the true errors of a regression model to be correlated. Theory and an example show that the errors also contribute to bias in the estimates of model parameters. This bias originates from model nonlinearity. In spite of this bias, predictions of hydraulic head are nearly unbiased if the model intrinsic nonlinearity is small. Individual confidence and prediction intervals are accurate if the t-statistic is multiplied by a correction factor. The correction factor can be computed from the true error second moment matrix, which can be determined when the stochastic properties of the system characteristics are known.
Synthesis of robust nonlinear autopilots using differential game theory
NASA Technical Reports Server (NTRS)
Menon, P. K. A.
1991-01-01
A synthesis technique for handling unmodeled disturbances in nonlinear control law synthesis was advanced using differential game theory. Two types of modeling inaccuracies can be included in the formulation. The first is a bias-type error, while the second is the scale-factor-type error in the control variables. The disturbances were assumed to satisfy an integral inequality constraint. Additionally, it was assumed that they act in such a way as to maximize a quadratic performance index. Expressions for optimal control and worst-case disturbance were then obtained using optimal control theory.
Nurses' role in medication safety.
Choo, Janet; Hutchinson, Alison; Bucknall, Tracey
2010-10-01
To explore the nurse's role in the process of medication management and identify the challenges associated with safe medication management in contemporary clinical practice. Medication errors have been a long-standing factor affecting consumer safety. The nursing profession has been identified as essential to the promotion of patient safety. A review of the literature on medication errors and the role of electronic prescribing in medication errors was undertaken. Medication management requires a multidisciplinary approach, and interdisciplinary communication is essential to reduce medication errors. Information technologies can help to reduce some medication errors through the eradication of transcription and dosing errors. Nurses must play a major role in the design of computerized medication systems to ensure a smooth transition to such a system. The nurses' role in medication management cannot be over-emphasized. This is particularly true when designing a computerized medication system. The adoption of safety measures during decision making that parallel the aviation industry's safety procedures can provide some strategies to prevent medication error. Innovations in information technology offer potential mechanisms to avert adverse events in medication management for nurses. © 2010 The Authors. Journal compilation © 2010 Blackwell Publishing Ltd.
The effectiveness of risk management program on pediatric nurses' medication error.
Dehghan-Nayeri, Nahid; Bayat, Fariba; Salehi, Tahmineh; Faghihzadeh, Soghrat
2013-09-01
Medication therapy is one of the most complex and high-risk clinical processes that nurses deal with. Medication error is the most common type of error that brings about damage and death to patients, especially pediatric ones. However, these errors are preventable. Identifying and preventing undesirable events leading to medication errors are the main risk management activities. The aim of this study was to investigate the effectiveness of a risk management program on the pediatric nurses' medication error rate. This study is a quasi-experimental one with a comparison group. In this study, 200 nurses were recruited from two main pediatric hospitals in Tehran. In the experimental hospital, we applied the risk management program for a period of 6 months. Nurses of the control hospital followed the hospital's routine schedule. A pre- and post-test was performed to measure the frequency of medication error events. SPSS software, t-test, and regression analysis were used for data analysis. After the intervention, the medication error rate of nurses at the experimental hospital was significantly lower (P < 0.001) and the error-reporting rate was higher (P < 0.007) compared to before the intervention and also in comparison to the nurses of the control hospital. Based on the results of this study, and taking into account the high-risk nature of the medical environment, applying quality-control programs such as risk management can effectively prevent the occurrence of undesirable hospital events. Nursing managers can reduce the medication error rate by applying risk management programs. However, this program cannot succeed without nurses' cooperation.
Error management training and simulation education.
Gardner, Aimee; Rich, Michelle
2014-12-01
The integration of simulation into the training of health care professionals provides context for decision making and procedural skills in a high-fidelity environment, without risk to actual patients. It was hypothesised that a novel approach to simulation-based education - error management training - would produce higher performance ratings compared with traditional step-by-step instruction. Radiology technology students were randomly assigned to participate in traditional procedural-based instruction (n = 11) or vicarious error management training (n = 11). All watched an instructional video and discussed how well each incident was handled (traditional instruction group) or identified where the errors were made (vicarious error management training). Students then participated in a 30-minute case-based simulation. Simulations were videotaped for performance analysis. Blinded experts evaluated performance using a predefined evaluation tool created specifically for the scenario. The vicarious error management group scored higher on observer-rated performance (Mean = 9.49) than students in the traditional instruction group (Mean = 9.02; p < 0.01). These findings suggest that incorporating the discussion of errors and how to handle them into the learning session will better equip students when performing hands-on procedures and skills. This pilot study provides preliminary evidence for integrating error management skills into medical curricula and for the design of learning goals in simulation-based education. © 2014 John Wiley & Sons Ltd.
Mirinejad, Hossein; Gaweda, Adam E; Brier, Michael E; Zurada, Jacek M; Inanc, Tamer
2017-09-01
Anemia is a common comorbidity in patients with chronic kidney disease (CKD) and is frequently associated with a decreased physical component of quality of life, as well as adverse cardiovascular events. Current treatment methods for renal anemia are mostly population-based approaches treating individual patients with a one-size-fits-all model. However, FDA recommendations stipulate individualized anemia treatment with precise control of the hemoglobin concentration and minimal drug utilization. In accordance with these recommendations, this work presents an individualized drug dosing approach to anemia management by leveraging the theory of optimal control. A Multiple Receding Horizon Control (MRHC) approach based on the RBF-Galerkin optimization method is proposed for individualized anemia management in CKD patients. Recently developed by the authors, the RBF-Galerkin method uses radial basis function approximation along with Galerkin error projection to solve constrained optimal control problems numerically. The proposed approach is applied to generate optimal dosing recommendations for individual patients. Performance of the proposed approach is compared in silico to that of a population-based anemia management protocol and an individualized multiple model predictive control method for two case scenarios: hemoglobin measurement with and without observational errors. The in silico comparison indicates that the hemoglobin concentration under MRHC shows the least variation among the methods, especially in the presence of measurement errors. In addition, the average achieved hemoglobin level from MRHC is significantly closer to the target hemoglobin than that of the other two methods, according to an analysis of variance (ANOVA) statistical test. Furthermore, drug dosages recommended by MRHC are more stable and accurate and reach the steady-state value notably faster than those generated by the other two methods. The proposed method is highly efficient for the control of the hemoglobin level and provides accurate dosage adjustments in the treatment of CKD anemia. Copyright © 2017 Elsevier B.V. All rights reserved.
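To make the receding-horizon idea concrete, the sketch below closes the loop on a deliberately crude first-order hemoglobin response model: at each measurement it optimizes a dose sequence over a short horizon, administers only the first dose, and re-plans after the next noisy measurement. Every element here (the response model, parameter values, target, bounds, and noise level) is an assumption for illustration; this is not the RBF-Galerkin/MRHC method of the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Assumed toy dynamics: hgb[t+1] = hgb[t] + K*dose[t] - DECAY*(hgb[t] - BASELINE)
K, DECAY, BASELINE, TARGET = 0.05, 0.1, 9.0, 11.0
HORIZON, WEIGHT_DOSE = 6, 0.01          # horizon length and drug-utilization penalty

def simulate(hgb0, doses):
    hgb, traj = hgb0, []
    for d in doses:
        hgb = hgb + K * d - DECAY * (hgb - BASELINE)
        traj.append(hgb)
    return np.array(traj)

def plan_doses(hgb_measured):
    """Solve the horizon problem: track TARGET while penalizing drug use."""
    def cost(doses):
        traj = simulate(hgb_measured, doses)
        return np.sum((traj - TARGET) ** 2) + WEIGHT_DOSE * np.sum(doses ** 2)
    res = minimize(cost, x0=np.full(HORIZON, 20.0),
                   bounds=[(0.0, 100.0)] * HORIZON)
    return res.x

rng = np.random.default_rng(2)
hgb_true = 9.5
for week in range(8):
    measured = hgb_true + rng.normal(0.0, 0.2)   # observational error on Hgb
    dose = plan_doses(measured)[0]               # apply only the first planned dose
    hgb_true = hgb_true + K * dose - DECAY * (hgb_true - BASELINE)
    print(f"week {week}: dose={dose:5.1f}, Hgb={hgb_true:5.2f}")
```

Re-planning at every measurement is what makes the controller robust to the observational errors studied in the paper.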
Error analysis and correction of discrete solutions from finite element codes
NASA Technical Reports Server (NTRS)
Thurston, G. A.; Stein, P. A.; Knight, N. F., Jr.; Reissner, J. E.
1984-01-01
Many structures are an assembly of individual shell components. Therefore, results for stresses and deflections from finite element solutions for each shell component should agree with the equations of shell theory. This paper examines the problem of applying shell theory to the error analysis and the correction of finite element results. The general approach to error analysis and correction is discussed first. Relaxation methods are suggested as one approach to correcting finite element results for all or parts of shell structures. Next, the problem of error analysis of plate structures is examined in more detail. The method of successive approximations is adapted to take discrete finite element solutions and to generate continuous approximate solutions for postbuckled plates. Preliminary numerical results are included.
Spiral-bevel geometry and gear train precision
NASA Technical Reports Server (NTRS)
Litvin, F. L.; Coy, J. J.
1983-01-01
A new approach to the determination of surface principal curvatures and directions is proposed. Direct relationships between the principal curvatures and directions of the tool surface and those of the generated gear surface are obtained; the principal curvatures and directions of the gear-tooth surface are obtained without using the complicated equations of these surfaces. A general theory of the train kinematical errors caused by manufacturing and assembly errors is discussed. Two methods for the determination of the train kinematical errors are worked out: (1) with the aid of a computer, and (2) with an approximate method. Results from noise and vibration measurements conducted on a helicopter transmission are used to illustrate the principles contained in the theory of kinematic errors.
Comment on "Infants' perseverative search errors are induced by pragmatic misinterpretation".
Spencer, John P; Dineva, Evelina; Smith, Linda B
2009-09-25
Topál et al. (Reports, 26 September 2008, p. 1831) proposed that infants' perseverative search errors can be explained by ostensive cues from the experimenter. We use the dynamic field theory to test the proposal that infants encode locations more weakly when social cues are present. Quantitative simulations show that this account explains infants' performance without recourse to the theory of natural pedagogy.
ERIC Educational Resources Information Center
Nicewander, W. Alan
2018-01-01
Spearman's correction for attenuation (measurement error) corrects a correlation coefficient for measurement errors in either or both of two variables, and follows from the assumptions of classical test theory. Spearman's equation removes all measurement error from a correlation coefficient, which translates into "increasing the reliability of…
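A brief worked instance of the correction (the numbers are ours, chosen only to show the mechanics): with an observed correlation of .42 and reliabilities of .70 and .80,

```latex
\rho_{T_X T_Y} = \frac{r_{XY}}{\sqrt{r_{XX'}\, r_{YY'}}} = \frac{.42}{\sqrt{.70 \times .80}} \approx .56
```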
The Relevance of Second Language Acquisition Theory to the Written Error Correction Debate
ERIC Educational Resources Information Center
Polio, Charlene
2012-01-01
The controversies surrounding written error correction can be traced to Truscott (1996) in his polemic against written error correction. He claimed that empirical studies showed that error correction was ineffective and that this was to be expected given "the nature of the correction process" and "the nature of language learning" (p. 328, emphasis…
Correcting quantum errors with entanglement.
Brun, Todd; Devetak, Igor; Hsieh, Min-Hsiu
2006-10-20
We show how entanglement shared between encoder and decoder can simplify the theory of quantum error correction. The entanglement-assisted quantum codes we describe do not require the dual-containing constraint necessary for standard quantum error-correcting codes, thus allowing us to "quantize" all of classical linear coding theory. In particular, efficient modern classical codes that attain the Shannon capacity can be made into entanglement-assisted quantum codes attaining the hashing bound (closely related to the quantum capacity). For systems without large amounts of shared entanglement, these codes can also be used as catalytic codes, in which a small amount of initial entanglement enables quantum communication.
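For orientation, the hashing bound referred to here is usually stated for a Pauli channel that applies I, X, Y, or Z with probabilities p = (p_I, p_X, p_Y, p_Z); the achievable rate is (a standard statement of the bound, not this paper's derivation)

```latex
Q_{\text{hashing}} = 1 - H(\mathbf{p}),
\qquad
H(\mathbf{p}) = -\sum_{i} p_i \log_2 p_i
```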
A predictability study of Lorenz's 28-variable model as a dynamical system
NASA Technical Reports Server (NTRS)
Krishnamurthy, V.
1993-01-01
The dynamics of error growth in a two-layer nonlinear quasi-geostrophic model has been studied to gain an understanding of the mathematical theory of atmospheric predictability. The growth of random errors of varying initial magnitudes has been studied, and the relation between this classical approach and the concepts of the nonlinear dynamical systems theory has been explored. The local and global growths of random errors have been expressed partly in terms of the properties of an error ellipsoid and the Liapunov exponents determined by linear error dynamics. The local growth of small errors is initially governed by several modes of the evolving error ellipsoid but soon becomes dominated by the longest axis. The average global growth of small errors is exponential with a growth rate consistent with the largest Liapunov exponent. The duration of the exponential growth phase depends on the initial magnitude of the errors. The subsequent large errors undergo a nonlinear growth with a steadily decreasing growth rate and attain saturation that defines the limit of predictability. The degree of chaos and the largest Liapunov exponent show considerable variation with change in the forcing, which implies that the time variation in the external forcing can introduce variable character to the predictability.
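The growth-and-saturation behavior described above is conventionally summarized by a logistic error-growth law (a generic formulation consistent with the abstract, not the model's literal equations): with λ₁ the largest Liapunov exponent and E_∞ the saturation level,

```latex
\frac{dE}{dt} = \lambda_1 E \left(1 - \frac{E}{E_\infty}\right),
\qquad
E(t) \approx E_0\, e^{\lambda_1 t} \quad \text{for } E \ll E_\infty
```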
Error Discounting in Probabilistic Category Learning
ERIC Educational Resources Information Center
Craig, Stewart; Lewandowsky, Stephan; Little, Daniel R.
2011-01-01
The assumption in some current theories of probabilistic categorization is that people gradually attenuate their learning in response to unavoidable error. However, existing evidence for this error discounting is sparse and open to alternative interpretations. We report 2 probabilistic-categorization experiments in which we investigated error…
Image defects from surface and alignment errors in grazing incidence telescopes
NASA Technical Reports Server (NTRS)
Saha, Timo T.
1989-01-01
The rigid body motions and low-frequency surface errors of grazing incidence Wolter telescopes are studied. The analysis is based on surface error descriptors proposed by Paul Glenn. In his analysis, the alignment and surface errors are expressed in terms of Legendre-Fourier polynomials. Individual terms in the expansion correspond to rigid body motions (decenter and tilt) and low-spatial-frequency surface errors of the mirrors. With the help of the Legendre-Fourier polynomials and the geometry of grazing incidence telescopes, exact and approximate first-order equations are derived in this paper for the components of the ray intercepts at the image plane. These equations are then used to calculate the sensitivities of Wolter type I and II telescopes to rigid body motions and surface deformations. The rms spot diameters calculated from this theory and from the OSAC ray-tracing code agree very well. The theory also provides a tool to predict how rigid body motions and surface errors of the mirrors compensate each other.
Verduzco-Flores, Sergio O; O'Reilly, Randall C
2015-01-01
We present a cerebellar architecture with two main characteristics. The first one is that complex spikes respond to increases in sensory errors. The second one is that cerebellar modules associate particular contexts where errors have increased in the past with corrective commands that stop the increase in error. We analyze our architecture formally and computationally for the case of reaching in a 3D environment. In the case of motor control, we show that there are synergies of this architecture with the Equilibrium-Point hypothesis, leading to novel ways to solve the motor error and distal learning problems. In particular, the presence of desired equilibrium lengths for muscles provides a way to know when the error is increasing, and which corrections to apply. In the context of Threshold Control Theory and Perceptual Control Theory we show how to extend our model so it implements anticipative corrections in cascade control systems that span from muscle contractions to cognitive operations.
Team safety and innovation by learning from errors in long-term care settings.
Buljac-Samardžić, Martina; van Woerkom, Marianne; Paauwe, Jaap
2012-01-01
Team safety and team innovation are underexplored in the context of long-term care. Understanding the issues requires attention to how teams cope with error. Team managers could have an important role in developing a team's error orientation and managing team membership instabilities. The aim of this study was to examine the impact of team member stability, team coaching, and a team's error orientation on team safety and innovation. A cross-sectional survey method was employed within 2 long-term care organizations. Team members and team managers received a survey that measured safety and innovation. Team members assessed member stability, team coaching, and team error orientation (i.e., problem-solving and blaming approach). The final sample included 933 respondents from 152 teams. Stable teams and teams with managers who take on the role of coach are more likely to adopt a problem-solving approach and less likely to adopt a blaming approach toward errors. Both error orientations are related to team member ratings of safety and innovation, but only the blaming approach is (negatively) related to manager ratings of innovation. Differences between members' and managers' ratings of safety are greater in teams with relatively high scores for the blaming approach and relatively low scores for the problem-solving approach. Team coaching was found to be positively related to innovation, especially in unstable teams. Long-term care organizations that wish to enhance team safety and innovation should encourage a problem-solving approach and discourage a blaming approach. Team managers can play a crucial role in this by coaching team members to see errors as sources of learning and improvement and ensuring that individuals will not be blamed for errors.
Errors and Understanding: The Effects of Error-Management Training on Creative Problem-Solving
ERIC Educational Resources Information Center
Robledo, Issac C.; Hester, Kimberly S.; Peterson, David R.; Barrett, Jamie D.; Day, Eric A.; Hougen, Dean P.; Mumford, Michael D.
2012-01-01
People make errors in their creative problem-solving efforts. The intent of this article was to assess whether error-management training would improve performance on creative problem-solving tasks. Undergraduates were asked to solve an educational leadership problem known to call for creative thought where problem solutions were scored for…
Numerical optimization in Hilbert space using inexact function and gradient evaluations
NASA Technical Reports Server (NTRS)
Carter, Richard G.
1989-01-01
Trust region algorithms provide a robust iterative technique for solving nonconvex unconstrained optimization problems, but in many instances it is prohibitively expensive to compute high accuracy function and gradient values for the method. Of particular interest are inverse and parameter estimation problems, since function and gradient evaluations involve numerically solving large systems of differential equations. A global convergence theory is presented for trust region algorithms in which neither function nor gradient values are known exactly. The theory is formulated in a Hilbert space setting so that it can be applied to variational problems as well as the finite dimensional problems normally seen in trust region literature. The conditions concerning allowable error are remarkably relaxed: in particular, the gradient error condition is automatically satisfied whenever the error is orthogonal to the gradient approximation. A technique for estimating gradient error and improving the approximation is also presented.
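A minimal sketch of the trust-region machinery the abstract builds on, assuming a Cauchy-point step and an identity Hessian model; the ratio test is the piece whose tolerance to inexact function and gradient values Carter's theory analyzes. This is an illustration, not Carter's algorithm.

```python
import numpy as np

def trust_region_minimize(f, grad, x0, delta0=1.0, delta_max=10.0,
                          eta=0.1, tol=1e-8, max_iter=200):
    """Basic trust-region method with a Cauchy-point step.

    f and grad may be inexact (e.g., noisy simulations); the ratio test
    below is what a convergence theory for inexact values must tolerate."""
    x, delta = np.asarray(x0, float), delta0
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        B = np.eye(len(x))                      # crude Hessian model
        # Cauchy point: minimize the quadratic model along -g inside the region
        gBg = g @ B @ g
        tau = 1.0 if gBg <= 0 else min(1.0, np.linalg.norm(g)**3 / (delta * gBg))
        p = -tau * delta * g / np.linalg.norm(g)
        pred = -(g @ p + 0.5 * p @ B @ p)       # predicted decrease
        ared = f(x) - f(x + p)                  # actual decrease (possibly inexact)
        rho = ared / pred if pred > 0 else -1.0
        if rho > eta:                           # accept the step
            x = x + p
        # shrink or grow the region based on model agreement
        delta = 0.25 * delta if rho < 0.25 else \
                min(2 * delta, delta_max) if rho > 0.75 else delta
    return x

# Example: minimize the Rosenbrock function (Cauchy steps converge slowly)
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
g = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                        200*(x[1] - x[0]**2)])
print(trust_region_minimize(f, g, [-1.2, 1.0]))
```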
Bosco, Francesca M; Angeleri, Romina; Sacco, Katiuscia; Bara, Bruno G
2015-01-01
The purpose of this study is to investigate the pragmatic abilities of individuals with traumatic brain injury (TBI). Several studies in the literature have previously reported communicative deficits in individuals with TBI; however, such research has focused principally on communicative deficits in general, without providing an analysis of the errors committed in understanding and expressing communicative acts. Within the theoretical framework of Cognitive Pragmatics theory and the Cooperative principle, we focused on intermediate communicative errors that occur in both the comprehension and the production of various pragmatic phenomena, expressed through both linguistic and extralinguistic communicative modalities. A group of 30 individuals with TBI and a matched control group took part in the experiment. They were presented with a series of videotaped vignettes depicting everyday communicative exchanges, and were tested on the comprehension and production of various kinds of communicative acts (standard communicative act, deceit and irony). The participants' answers were evaluated as correct or incorrect. Incorrect answers were then further evaluated with regard to the presence of different intermediate errors. Individuals with TBI performed worse than control participants on all the tasks investigated when considering correct versus incorrect answers. Furthermore, a series of logistic regression analyses showed that group membership (TBI versus controls) significantly predicted the occurrence of intermediate errors. This result holds in both the comprehension and production tasks, and in both linguistic and extralinguistic modalities. Participants with TBI tend to have difficulty in managing different types of communicative acts, and they make more intermediate errors than the control participants. Intermediate errors concern the comprehension and production of the expression act, the comprehension of the actors' meaning, as well as adherence to the Cooperative principle. © 2014 Royal College of Speech and Language Therapists.
North Alabama Lightning Mapping Array (LMA): VHF Source Retrieval Algorithm and Error Analyses
NASA Technical Reports Server (NTRS)
Koshak, W. J.; Solakiewicz, R. J.; Blakeslee, R. J.; Goodman, S. J.; Christian, H. J.; Hall, J.; Bailey, J.; Krider, E. P.; Bateman, M. G.; Boccippio, D.
2003-01-01
Two approaches are used to characterize how accurately the North Alabama Lightning Mapping Array (LMA) is able to locate lightning VHF sources in space and in time. The first method uses a Monte Carlo computer simulation to estimate source retrieval errors. The simulation applies a VHF source retrieval algorithm that was recently developed at the NASA Marshall Space Flight Center (MSFC) and that is similar, but not identical to, the standard New Mexico Tech retrieval algorithm. The second method uses a purely theoretical technique (i.e., chi-squared Curvature Matrix Theory) to estimate retrieval errors. Both methods assume that the LMA system has an overall rms timing error of 50 ns, but all other possible errors (e.g., multiple sources per retrieval attempt) are neglected. The detailed spatial distributions of retrieval errors are provided. Given that the two methods are completely independent of one another, it is shown that they provide remarkably similar results. However, for many source locations, the Curvature Matrix Theory produces larger altitude error estimates than the (more realistic) Monte Carlo simulation.
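The Monte Carlo approach described here can be sketched as a time-of-arrival least-squares retrieval with 50 ns timing noise. The station layout, source location, and trial count below are hypothetical, and the solver is a generic one rather than the MSFC algorithm.

```python
import numpy as np
from scipy.optimize import least_squares

C = 2.998e8                                   # speed of light, m/s
rng = np.random.default_rng(0)

# Hypothetical station layout (m): illustrative only, not the real LMA network
stations = np.array([[0, 0, 0], [20e3, 0, 100], [0, 25e3, 50],
                     [-18e3, 5e3, 200], [4e3, -22e3, 80], [15e3, 18e3, 150.]])
source = np.array([5e3, 8e3, 7e3])            # true VHF source position (m)
t0 = 0.0                                      # true emission time (s)

def arrival_times(src, t_emit):
    return t_emit + np.linalg.norm(stations - src, axis=1) / C

def residuals(p, t_obs):
    # scale residuals to meters for better numerical conditioning
    return (arrival_times(p[:3], p[3]) - t_obs) * C

errors = []
for _ in range(1000):                         # Monte Carlo trials
    t_obs = arrival_times(source, t0) + rng.normal(0, 50e-9, len(stations))
    fit = least_squares(residuals, x0=[1e3, 1e3, 5e3, 0.0], args=(t_obs,))
    errors.append(fit.x[:3] - source)

print("rms location error (m):", np.sqrt((np.array(errors)**2).sum(axis=1).mean()))
```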
Medical errors in primary care clinics – a cross sectional study
2012-01-01
Background Patient safety is vital in patient care. There is a lack of studies on medical errors in primary care settings. The aim of the study is to determine the extent of diagnostic inaccuracies and management errors in public funded primary care clinics. Methods This was a cross-sectional study conducted in twelve public funded primary care clinics in Malaysia. A total of 1753 medical records were randomly selected in 12 primary care clinics in 2007 and were reviewed by trained family physicians for diagnostic, management and documentation errors, potential errors causing serious harm and likelihood of preventability of such errors. Results The majority of patient encounters (81%) were with medical assistants. Diagnostic errors were present in 3.6% (95% CI: 2.2, 5.0) of medical records and management errors in 53.2% (95% CI: 46.3, 60.2). For management errors, medication errors were present in 41.1% (95% CI: 35.8, 46.4) of records, investigation errors in 21.7% (95% CI: 16.5, 26.8) and decision making errors in 14.5% (95% CI: 10.8, 18.2). A total of 39.9% (95% CI: 33.1, 46.7) of these errors had the potential to cause serious harm. Problems of documentation including illegible handwriting were found in 98.0% (95% CI: 97.0, 99.1) of records. Nearly all errors (93.5%) detected were considered preventable. Conclusions The occurrence of medical errors was high in primary care clinics, particularly with documentation and medication errors. Nearly all were preventable. Remedial interventions addressing completeness of documentation and prescriptions are likely to reduce errors. PMID:23267547
Nair, Vinit; Salmon, J Warren; Kaul, Alan F
2007-12-01
Disease Management (DM) programs have advanced to address costly chronic disease patterns in populations. This is in part due to the programs' significant clinical and economic value, coupled with interest by pharmaceutical manufacturers, managed care organizations, and pharmacy benefit management firms. While cost containment realizations for many such interventions have been less than anticipated, this article explores the potential of integrating medication error risk reduction into DM programs within managed care environments. Medication errors are an emerging, serious problem now gaining attention in US health policy. They represent a failure within population-based health programs because they remain significant cost drivers. Therefore, medication errors should be addressed in an organized fashion, with DM being a worthy candidate for piggybacking such programs to achieve the best synergistic effects.
Cresswell, Kathrin M; Sadler, Stacey; Rodgers, Sarah; Avery, Anthony; Cantrill, Judith; Murray, Scott A; Sheikh, Aziz
2012-06-08
There is a need to shed light on the pathways through which complex interventions mediate their effects in order to enable critical reflection on their transferability. We sought to explore and understand key stakeholder accounts of the acceptability, likely impact and strategies for optimizing and rolling-out a successful pharmacist-led information technology-enabled (PINCER) intervention, which substantially reduced the risk of clinically important errors in medicines management in primary care. Data were collected at two geographical locations in central England through a combination of one-to-one longitudinal semi-structured telephone interviews (one at the beginning of the trial and another when the trial was well underway), relevant documents, and focus group discussions following delivery of the PINCER intervention. Participants included PINCER pharmacists, general practice staff, researchers involved in the running of the trial, and primary care trust staff. PINCER pharmacists were interviewed at three different time-points during the delivery of the PINCER intervention. Analysis was thematic with diffusion of innovation theory providing a theoretical framework. We conducted 52 semi-structured telephone interviews and six focus group discussions with 30 additional participants. In addition, documentary data were collected from six pharmacist diaries, along with notes from four meetings of the PINCER pharmacists and feedback meetings from 34 practices. Key findings that helped to explain the success of the PINCER intervention included the perceived importance of focusing on prescribing errors to all stakeholders, and the credibility and appropriateness of a pharmacist-led intervention to address these shortcomings. Central to this was the face-to-face contact and relationship building between pharmacists and a range of practice staff, and pharmacists' explicitly designated role as a change agent. However, important concerns were identified about the likely sustainability of this new model of delivering care, in the absence of an appropriate support network for pharmacists and career development pathways. This embedded qualitative inquiry has helped to understand the complex organizational and social environment in which the trial was undertaken and the PINCER intervention was delivered. The longitudinal element has given insight into the dynamic changes and developments over time. Medication errors and ways to address these are high on stakeholders' agendas. Our results further indicate that pharmacists were, because of their professional standing and skill-set, able to engage with the complex general practice environment and able to identify and manage many clinically important errors in medicines management. The transferability of the PINCER intervention approach, both in relation to other prescribing errors and to other practices, is likely to be high.
Classifying nursing errors in clinical management within an Australian hospital.
Tran, D T; Johnson, M
2010-12-01
Although many classification systems relating to patient safety exist, no taxonomy was identified that classified nursing errors in clinical management. To develop a classification system for nursing errors relating to clinical management (NECM taxonomy) and to describe contributing factors and patient consequences. We analysed 241 (11%) self-reported incidents relating to clinical management in nursing in a metropolitan hospital. Descriptive analysis of numeric data and content analysis of text data were undertaken to derive the NECM taxonomy, contributing factors and consequences for patients. Clinical management incidents represented 1.63 incidents per 1000 occupied bed days. The four themes of the NECM taxonomy were nursing care process (67%), communication (22%), administrative process (5%), and knowledge and skill (6%). Half of the incidents did not cause any patient harm. Contributing factors (n=111) included the following: patient clinical, social conditions and behaviours (27%); resources (22%); environment and workload (18%); other health professionals (15%); communication (13%); and nurse's knowledge and experience (5%). The NECM taxonomy provides direction to clinicians and managers on areas in clinical management that are most vulnerable to error and, therefore, priorities for system change management. Nurses who wish to classify nursing errors relating to clinical management can use this taxonomy. This study informs further research into risk management behaviour, and self-assessment tools for clinicians. Globally, nurses need to continue to monitor and act upon patient safety issues. © 2010 The Authors. International Nursing Review © 2010 International Council of Nurses.
Wong, Alfred Ka-Shing; Ong, Shu Fen; Matchar, David Bruce; Lie, Desiree; Ng, Reuben; Yoon, Kirsten Eom; Wong, Chek Hooi
2017-10-01
Studies are needed to inform the preparation of community nurses to address patient behavioral and social factors contributing to unnecessary readmissions to hospital. This study uses nurses' input to understand challenges faced during home care, to derive a framework to address the challenges. Semistructured interviews were conducted to saturation with 16 community nurses in Singapore. Interviews were transcribed verbatim and transcripts independently coded for emergent themes. Themes were interpreted using grounded theory. Seven major themes emerged from 16 interviews: strained social relationships, complex care decision-making processes within families, communication barriers, patient or caregiver neglect of health issues, building and maintaining trust, trial-and-error nature of work, and dealing with uncertainty. Community nurses identified uncertainty arising from complexities in social-relational, personal, and organizational factors as a central challenge. Nursing education should focus on navigating and managing uncertainty at the personal, patient, and family levels.
Exchange-Correlation Effects for Noncovalent Interactions in Density Functional Theory.
Otero-de-la-Roza, A; DiLabio, Gino A; Johnson, Erin R
2016-07-12
In this article, we develop an understanding of how errors from exchange-correlation functionals affect the modeling of noncovalent interactions in dispersion-corrected density-functional theory. Computed CCSD(T) reference binding energies for a collection of small-molecule clusters are decomposed via a molecular many-body expansion and are used to benchmark density-functional approximations, including the effect of semilocal approximation, exact-exchange admixture, and range separation. Three sources of error are identified. Repulsion error arises from the choice of semilocal functional approximation. This error affects intermolecular repulsions and is present in all n-body exchange-repulsion energies with a sign that alternates with the order n of the interaction. Delocalization error is independent of the choice of semilocal functional but does depend on the exact exchange fraction. Delocalization error misrepresents the induction energies, leading to overbinding in all induction n-body terms, and underestimates the electrostatic contribution to the 2-body energies. Deformation error affects only monomer relaxation (deformation) energies and behaves similarly to bond-dissociation energy errors. Delocalization and deformation errors affect systems with significant intermolecular orbital interactions (e.g., hydrogen- and halogen-bonded systems), whereas repulsion error is ubiquitous. Many-body errors from the underlying exchange-correlation functional greatly exceed in general the magnitude of the many-body dispersion energy term. A functional built to accurately model noncovalent interactions must contain a dispersion correction, semilocal exchange, and correlation components that minimize the repulsion error independently and must also incorporate exact exchange in such a way that delocalization error is absent.
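The molecular many-body expansion used for the decomposition can be written in a few lines. In the sketch below, `energy` is a hypothetical callable returning the total energy of a sub-cluster; the toy check uses a pairwise-additive mock so all 3-body terms vanish.

```python
from itertools import combinations

def many_body_terms(fragments, energy):
    """Decompose a cluster energy into 1-, 2-, and 3-body terms.

    energy(frozenset_of_fragment_indices) is a user-supplied callable
    (hypothetical here) returning the total energy of that sub-cluster."""
    idx = range(len(fragments))
    e1 = {frozenset([i]): energy(frozenset([i])) for i in idx}
    e2, e3 = {}, {}
    for i, j in combinations(idx, 2):
        key = frozenset([i, j])
        e2[key] = energy(key) - e1[frozenset([i])] - e1[frozenset([j])]
    for i, j, k in combinations(idx, 3):
        key = frozenset([i, j, k])
        pairs = sum(e2[frozenset(p)] for p in combinations([i, j, k], 2))
        monos = sum(e1[frozenset([m])] for m in [i, j, k])
        e3[key] = energy(key) - pairs - monos
    return e1, e2, e3

# Toy check with a pairwise-additive mock energy: all 3-body terms vanish.
mock = lambda s: -1.0 * len(s) - 0.1 * (len(s) * (len(s) - 1) // 2)
e1, e2, e3 = many_body_terms(["A", "B", "C", "D"], mock)
print(max(abs(v) for v in e3.values()))   # ~0
```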
Recovering real money with a contract management system.
Lang, Kevin; Williams, Bethany
2003-12-01
A payment error here, a payment error there--pretty soon, you're talking about real money. Contract and underpayment management systems can pay for themselves by unearthing tiny errors that add up to a lot of lost cash.
Discordance between net analyte signal theory and practical multivariate calibration.
Brown, Christopher D
2004-08-01
Lorber's concept of net analyte signal is reviewed in the context of classical and inverse least-squares approaches to multivariate calibration. It is shown that, in the presence of device measurement error, the classical and inverse calibration procedures have radically different theoretical prediction objectives, and the assertion that the popular inverse least-squares procedures (including partial least squares, principal components regression) approximate Lorber's net analyte signal vector in the limit is disproved. Exact theoretical expressions for the prediction error bias, variance, and mean-squared error are given under general measurement error conditions, which reinforce the very discrepant behavior between these two predictive approaches, and Lorber's net analyte signal theory. Implications for multivariate figures of merit and numerous recently proposed preprocessing treatments involving orthogonal projections are also discussed.
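Lorber's net analyte signal itself has a compact linear-algebra form: the component of the analyte's pure response orthogonal to the subspace spanned by the interferent spectra. A minimal sketch, with made-up spectra:

```python
import numpy as np

def net_analyte_signal(xa, X_interf):
    """Net analyte signal: the part of the analyte's pure spectrum
    orthogonal to the space spanned by the interferent spectra (columns)."""
    P = X_interf @ np.linalg.pinv(X_interf)     # projector onto interferent space
    return (np.eye(len(xa)) - P) @ xa

# Toy example: an analyte overlapping two interferents across 5 channels
xa = np.array([1.0, 0.8, 0.5, 0.2, 0.1])
X = np.array([[0.9, 0.1], [0.7, 0.3], [0.4, 0.5], [0.2, 0.6], [0.1, 0.4]])
nas = net_analyte_signal(xa, X)
print(nas, np.linalg.norm(nas))
```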
Kim, Matthew H.; Marulis, Loren M.; Grammer, Jennie K.; Morrison, Frederick J.; Gehring, William J.
2016-01-01
Motivational beliefs and values influence how children approach challenging activities. The present study explores motivational processes from an expectancy-value theory framework by studying children's mistakes and their responses to them by focusing on two ERP components, the error-related negativity (ERN) and error positivity (Pe). Motivation was assessed using a child-friendly challenge puzzle task and a brief interview measure prior to ERP testing. Data from 50 four- to six-year-old children revealed that greater perceived competence beliefs were related to a larger Pe, while stronger intrinsic task value beliefs were associated with a smaller Pe. Motivation was unrelated to the ERN. Individual differences in early motivational processes may reflect electrophysiological activity related to conscious error awareness. PMID:27898304
A statistical investigation into the stability of iris recognition in diverse population sets
NASA Astrophysics Data System (ADS)
Howard, John J.; Etter, Delores M.
2014-05-01
Iris recognition is increasingly being deployed on population-wide scales for important applications such as border security, social service administration, criminal identification and general population management. The error rates for this incredibly accurate form of biometric identification are established using well known, laboratory quality datasets. However, it has long been acknowledged in biometric theory that not all individuals have the same likelihood of being correctly serviced by a biometric system. Typically, techniques for identifying clients that are likely to experience a false non-match or a false match error are carried out on a per-subject basis. This research makes the novel hypothesis that certain ethnic groups are more or less likely to experience a biometric error. Through established statistical techniques, we demonstrate this hypothesis to be true and document the notable effect that the ethnicity of the client has on iris similarity scores. Understanding the expected impact of ethnic diversity on iris recognition accuracy is crucial to the future success of this technology as it is deployed in areas where the target population consists of clientele from a range of geographic backgrounds, such as border crossings and immigration check points.
Explaining Errors in Children's Questions
ERIC Educational Resources Information Center
Rowland, Caroline F.
2007-01-01
The ability to explain the occurrence of errors in children's speech is an essential component of successful theories of language acquisition. The present study tested some generativist and constructivist predictions about error on the questions produced by ten English-learning children between 2 and 5 years of age. The analyses demonstrated that,…
Identifiability Of Systems With Modeling Errors
NASA Technical Reports Server (NTRS)
Hadaegh, Yadolah "Fred"
1988-01-01
Advances in theory of modeling errors reported. Recent paper on errors in mathematical models of deterministic linear or weakly nonlinear systems. Extends theoretical work described in NPO-16661 and NPO-16785. Presents concrete way of accounting for difference in structure between mathematical model and physical process or system that it represents.
[Quality assurance and quality management in intensive care].
Notz, K; Dubb, R; Kaltwasser, A; Hermes, C; Pfeffer, S
2015-11-01
Treatment success in hospitals, particularly in intensive care units, is directly tied to quality of structure, process, and outcomes. Technological and medical advancements lead to ever more complex treatment situations with highly specialized tasks in intensive care nursing. Quality criteria that can be used to describe and correctly measure those highly complex multiprofessional situations have only been recently developed and put into practice. In this article, it will be shown how quality in multiprofessional teams can be defined and assessed in daily clinical practice. Core aspects are the choice of a nursing theory, quality assurance measures, and quality management. One possible option of quality assurance is the use of standard operating procedures (SOPs). Quality can ultimately only be achieved if professional groups think beyond their boundaries, minimize errors, and establish and live out instructions and SOPs.
Farag, Amany; Blegen, Mary; Gedney-Lose, Amalia; Lose, Daniel; Perkhounkova, Yelena
2017-05-01
Medication errors are one of the most frequently occurring errors in health care settings. The complexity of the ED work environment places patients at risk for medication errors. Most hospitals rely on nurses' voluntary medication error reporting, but these errors are under-reported. The purpose of this study was to examine the relationship among work environment (nurse manager leadership style and safety climate), social capital (warmth and belonging relationships and organizational trust), and nurses' willingness to report medication errors. A cross-sectional descriptive design using a questionnaire with a convenience sample of emergency nurses was used. Data were analyzed using descriptive, correlation, Mann-Whitney U, and Kruskal-Wallis statistics. A total of 71 emergency nurses were included in the study. Emergency nurses' willingness to report errors decreased as the nurses' years of experience increased (r = -0.25, P = .03). Their willingness to report errors increased when they received more feedback about errors (r = 0.25, P = .03) and when their managers used a transactional leadership style (r = 0.28, P = .01). ED nurse managers can modify their leadership style to encourage error reporting. Timely feedback after an error report is particularly important. Engaging experienced nurses to understand error root causes could increase voluntary error reporting. Published by Elsevier Inc.
Plant, Katherine L; Stanton, Neville A
2013-01-01
Schema Theory is intuitively appealing although it has not always received positive press; critics of the approach argue that the concept is too ambiguous and vague and there are inherent difficulties associated with measuring schemata. As such, the term schema can be met with scepticism and wariness. The purpose of this paper is to address the criticisms that have been levelled at Schema Theory by demonstrating how Schema Theory has been utilised in Ergonomics research, particularly in the key areas of situation awareness, naturalistic decision making and error. The future of Schema Theory is also discussed in light of its potential roles as a unifying theory in Ergonomics and in contributing to our understanding of distributed cognition. We conclude that Schema Theory has made a positive contribution to Ergonomics and with continued refinement of methods to infer and represent schemata it is likely that this trend will continue. This paper reviews the contribution that Schema Theory has made to Ergonomics research. The criticisms of the theory are addressed using examples from the areas of situation awareness, decision making and error.
Speech errors of amnesic H.M.: unlike everyday slips-of-the-tongue.
MacKay, Donald G; James, Lori E; Hadley, Christopher B; Fogler, Kethera A
2011-03-01
Three language production studies indicate that amnesic H.M. produces speech errors unlike everyday slips-of-the-tongue. Study 1 was a naturalistic task: H.M. and six controls closely matched for age, education, background and IQ described what makes captioned cartoons funny. Nine judges rated the descriptions blind to speaker identity and gave reliably more negative ratings for coherence, vagueness, comprehensibility, grammaticality, and adequacy of humor-description for H.M. than the controls. Study 2 examined "major errors", a novel type of speech error that is uncorrected and reduces the coherence, grammaticality, accuracy and/or comprehensibility of an utterance. The results indicated that H.M. produced seven types of major errors reliably more often than controls: substitutions, omissions, additions, transpositions, reading errors, free associations, and accuracy errors. These results contradict recent claims that H.M. retains unconscious or implicit language abilities and produces spoken discourse that is "sophisticated," "intact" and "without major errors." Study 3 examined whether three classical types of errors (omissions, additions, and substitutions of words and phrases) differed for H.M. versus controls in basic nature and relative frequency by error type. The results indicated that omissions, and especially multi-word omissions, were relatively more common for H.M. than the controls; and substitutions violated the syntactic class regularity (whereby, e.g., nouns substitute with nouns but not verbs) relatively more often for H.M. than the controls. These results suggest that H.M.'s medial temporal lobe damage impaired his ability to rapidly form new connections between units in the cortex, a process necessary to form complete and coherent internal representations for novel sentence-level plans. In short, different brain mechanisms underlie H.M.'s major errors (which reflect incomplete and incoherent sentence-level plans) versus everyday slips-of-the-tongue (which reflect errors in activating pre-planned units in fully intact sentence-level plans). Implications of the results of Studies 1-3 are discussed for systems theory, binding theory and relational memory theories. Copyright © 2010 Elsevier Srl. All rights reserved.
Nucleation theory - Is replacement free energy needed? [Error analysis of capillary approximation]
NASA Technical Reports Server (NTRS)
Doremus, R. H.
1982-01-01
It has been suggested that the classical theory of nucleation of liquid from its vapor as developed by Volmer and Weber (1926) needs modification with a factor referred to as the replacement free energy and that the capillary approximation underlying the classical theory is in error. Here, the classical nucleation equation is derived from fluctuation theory, Gibbs' result for the reversible work to form a critical nucleus, and the rate of collision of gas molecules with a surface. The capillary approximation is not used in the derivation. The chemical potential of small drops is then considered, and it is shown that the capillary approximation can be derived from thermodynamic equations. The results show that no corrections to Volmer's equation are needed.
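For reference, the classical results under discussion take the following standard textbook form for vapor-to-liquid nucleation of a spherical drop (sigma is the surface tension, v_l the molecular volume of the liquid, S the supersaturation ratio p/p_eq); the prefactor J_0 collects the collision-rate terms:

```latex
% Reversible work to form a critical spherical nucleus, and the nucleation rate
\[
  \Delta G^{*} \;=\; \frac{16\pi\,\sigma^{3} v_{l}^{2}}{3\,\bigl(k_{B}T\,\ln S\bigr)^{2}},
  \qquad
  J \;=\; J_{0}\,\exp\!\left(-\frac{\Delta G^{*}}{k_{B}T}\right).
\]
```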
An Overview of Judgment and Decision Making Research Through the Lens of Fuzzy Trace Theory.
Setton, Roni; Wilhelms, Evan; Weldon, Becky; Chick, Christina; Reyna, Valerie
2014-12-01
We present the basic tenets of fuzzy trace theory, a comprehensive theory of memory, judgment, and decision making that is grounded in research on how information is stored as knowledge, mentally represented, retrieved from storage, and processed. In doing so, we highlight how it is distinguished from traditional models of decision making in that gist reasoning plays a central role. The theory also distinguishes advanced intuition from primitive impulsivity. It predicts that different sorts of errors occur with respect to each component of judgment and decision making: background knowledge, representation, retrieval, and processing. Classic errors in the judgment and decision making literature, such as risky-choice framing and the conjunction fallacy, are accounted for by fuzzy trace theory and new results generated by the theory contradict traditional approaches. We also describe how developmental changes in brain and behavior offer crucial insight into adult cognitive processing. Research investigating brain and behavior in developing and special populations supports fuzzy trace theory's predictions about reliance on gist processing.
Error detection and reduction in blood banking.
Motschman, T L; Moore, S B
1996-12-01
Error management plays a major role in facility process improvement efforts. By detecting and reducing errors, quality and, therefore, patient care improve. It begins with a strong organizational foundation of management attitude with clear, consistent employee direction and appropriate physical facilities. Clearly defined critical processes, critical activities, and SOPs act as the framework for operations as well as active quality monitoring. To ensure that personnel can detect and report errors, they must be trained in both operational duties and error management practices. Use of simulated/intentional errors and incorporation of error detection into competency assessment keeps employees practiced, confident, and diminishes fear of the unknown. Personnel can clearly see that errors are indeed used as opportunities for process improvement and not for punishment. The facility must have a clearly defined and consistently used definition for reportable errors. Reportable errors should include those errors with potentially harmful outcomes as well as those errors that are "upstream," and thus further away from the outcome. A well-written error report consists of who, what, when, where, why/how, and follow-up to the error. Before correction can occur, an investigation to determine the underlying cause of the error should be undertaken. Obviously, the best corrective action is prevention. Correction can occur at five different levels; however, only three of these levels are directed at prevention. Prevention requires a method to collect and analyze data concerning errors. In the authors' facility a functional error classification method and a quality system-based classification have been useful. An active method to search for problems uncovers them further upstream, before they can have disastrous outcomes. In the continual quest for improving processes, an error management program is itself a process that needs improvement, and we must strive to always close the circle of quality assurance. Ultimately, the goal of better patient care will be the reward.
Neutral and charged excitations in carbon fullerenes from first-principles many-body theories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tiago, Murilo L; Kent, Paul R; Hood, Randolph Q.
2008-01-01
We use first-principles many-body theories to investigate the low energy excitations of the carbon fullerenes C_20, C_24, C_50, C_60, C_70, and C_80. Properties are calculated via the GW-Bethe-Salpeter Equation (GW-BSE) and diffusion Quantum Monte Carlo (QMC) methods. At a lower level of theoretical complexity, we also calculate these properties using static and time-dependent density-functional theory. We critically compare these theories and assess their accuracy against available experimental data. The first ionization potentials are consistently well reproduced and are similar for all the fullerenes and methods studied. The electron affinities and first triplet excitation energies show substantial method and geometry dependence. Compared to available experiment, GW-BSE underestimates excitation energies by approximately 0.3 eV while QMC overestimates them by approximately 0.5 eV. We show the GW-BSE errors result primarily from a systematic overestimation of the electron affinities, while the QMC errors likely result from nodal error in both ground and excited state calculations.
Effect of neoclassical toroidal viscosity on error-field penetration thresholds in tokamak plasmas.
Cole, A J; Hegna, C C; Callen, J D
2007-08-10
A model for field-error penetration is developed that includes nonresonant as well as the usual resonant field-error effects. The nonresonant components cause a neoclassical toroidal viscous torque that keeps the plasma rotating at a rate comparable to the ion diamagnetic frequency. The new theory is used to examine resonant error-field penetration threshold scaling in Ohmic tokamak plasmas. Compared to previous theoretical results, we find the plasma is less susceptible to error-field penetration and locking, by a factor that depends on the nonresonant error-field amplitude.
Moments of inclination error distribution computer program
NASA Technical Reports Server (NTRS)
Myler, T. R.
1981-01-01
A FORTRAN coded computer program is described which calculates orbital inclination error statistics using a closed-form solution. This solution uses a data base of trajectory errors from actual flights to predict the orbital inclination error statistics. The Scott flight history data base consists of orbit insertion errors in the trajectory parameters - altitude, velocity, flight path angle, flight azimuth, latitude and longitude. The methods used to generate the error statistics are of general interest since they have other applications. Program theory, user instructions, output definitions, subroutine descriptions and detailed FORTRAN coding information are included.
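The error statistics such a program produces reduce to sample moments. A minimal sketch in Python (the sample values are synthetic placeholders, not Scott flight-history data):

```python
import numpy as np

def moments(errors):
    """First four moments of an error sample: mean, std, skewness, kurtosis."""
    e = np.asarray(errors, float)
    mu, sigma = e.mean(), e.std(ddof=1)
    z = (e - mu) / sigma
    return mu, sigma, (z**3).mean(), (z**4).mean()

# Toy inclination errors (deg) standing in for a flight-history data base
rng = np.random.default_rng(1)
sample = rng.normal(0.0, 0.02, size=40)
print("mean, std, skew, kurtosis:", moments(sample))
```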
Learning from Errors: A Model of Individual Processes
ERIC Educational Resources Information Center
Tulis, Maria; Steuer, Gabriele; Dresel, Markus
2016-01-01
Errors bear the potential to improve knowledge acquisition, provided that learners are able to deal with them in an adaptive and reflexive manner. However, learners experience a host of different--often impeding or maladaptive--emotional and motivational states in the face of academic errors. Research has made few attempts to develop a theory that…
Application of Consider Covariance to the Extended Kalman Filter
NASA Technical Reports Server (NTRS)
Lundberg, John B.
1996-01-01
The extended Kalman filter (EKF) is the basis for many applications of filtering theory to real-time problems where estimates of the state of a dynamical system are to be computed based upon some set of observations. The form of the EKF may vary somewhat from one application to another, but the fundamental principles are typically unchanged among these various applications. As is the case in many filtering applications, models of the dynamical system (differential equations describing the state variables) and models of the relationship between the observations and the state variables are created. These models typically employ a set of constants whose values are established by means of theory or experimental procedure. Since the estimates of the state are formed assuming that the models are perfect, any modeling errors will affect the accuracy of the computed estimates. Note that the modeling errors may be errors of commission (errors in terms included in the model) or omission (errors in terms excluded from the model). Consequently, it becomes imperative when evaluating the performance of real-time filters to evaluate the effect of modeling errors on the estimates of the state.
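One standard way to account for such uncertain model constants is a Schmidt-Kalman ("consider") update, in which the consider parameters are never estimated but their uncertainty inflates the innovation covariance and the covariance bookkeeping. A minimal sketch of one measurement update, with illustrative dimensions:

```python
import numpy as np

def consider_update(x, Pxx, Pxc, Pcc, Hx, Hc, R, z):
    """One Schmidt-Kalman (consider) measurement update: consider parameters
    are never estimated, but their covariance Pcc inflates the innovation
    covariance and feeds the state-covariance update."""
    S = (Hx @ Pxx @ Hx.T + Hx @ Pxc @ Hc.T
         + (Hx @ Pxc @ Hc.T).T + Hc @ Pcc @ Hc.T + R)
    K = (Pxx @ Hx.T + Pxc @ Hc.T) @ np.linalg.inv(S)
    x_new = x + K @ (z - Hx @ x)              # consider params have zero mean
    Pxx_new = Pxx - K @ (Hx @ Pxx + Hc @ Pxc.T)
    Pxc_new = Pxc - K @ (Hx @ Pxc + Hc @ Pcc)
    return x_new, Pxx_new, Pxc_new            # Pcc is left untouched

# Tiny example: 2 states, 1 consider bias, scalar measurement of state 0
x = np.zeros(2)
Pxx, Pxc, Pcc = np.eye(2), np.zeros((2, 1)), np.array([[0.5]])
Hx, Hc, R = np.array([[1.0, 0.0]]), np.array([[1.0]]), np.array([[0.01]])
print(consider_update(x, Pxx, Pxc, Pcc, Hx, Hc, R, z=np.array([0.3])))
```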
DOE Office of Scientific and Technical Information (OSTI.GOV)
Proctor, Timothy; Rudinger, Kenneth; Young, Kevin
Randomized benchmarking (RB) is widely used to measure an error rate of a set of quantum gates, by performing random circuits that would do nothing if the gates were perfect. In the limit of no finite-sampling error, the exponential decay rate of the observable survival probabilities, versus circuit length, yields a single error metric r. For Clifford gates with arbitrary small errors described by process matrices, r was believed to reliably correspond to the mean, over all Clifford gates, of the average gate infidelity between the imperfect gates and their ideal counterparts. We show that this quantity is not a well-defined property of a physical gate set. It depends on the representations used for the imperfect and ideal gates, and the variant typically computed in the literature can differ from r by orders of magnitude. We present new theories of the RB decay that are accurate for all small errors describable by process matrices, and show that the RB decay curve is a simple exponential for all such errors. Here, these theories allow explicit computation of the error rate that RB measures (r), but as far as we can tell it does not correspond to the infidelity of a physically allowed (completely positive) representation of the imperfect gates.
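The RB decay the abstract refers to is typically extracted by fitting survival probabilities to A p^m + B and converting p to r; the sketch below uses synthetic data and assumes the standard single-qubit conversion r = (d - 1)(1 - p)/d with d = 2.

```python
import numpy as np
from scipy.optimize import curve_fit

decay = lambda m, A, B, p: A * p**m + B        # standard RB decay model

# Hypothetical survival probabilities vs. sequence length m
m = np.array([1, 2, 4, 8, 16, 32, 64, 128])
rng = np.random.default_rng(2)
surv = decay(m, 0.5, 0.5, 0.995) + rng.normal(0, 0.005, m.size)

(A, B, p), _ = curve_fit(decay, m, surv, p0=[0.5, 0.5, 0.99])
d = 2                                           # single-qubit Hilbert dimension
r = (d - 1) * (1 - p) / d                       # the RB error-rate metric
print(f"p = {p:.5f}, r = {r:.2e}")
```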
Kim, Matthew H; Marulis, Loren M; Grammer, Jennie K; Morrison, Frederick J; Gehring, William J
2017-03-01
Motivational beliefs and values influence how children approach challenging activities. The current study explored motivational processes from an expectancy-value theory framework by studying children's mistakes and their responses to them by focusing on two event-related potential (ERP) components: the error-related negativity (ERN) and the error positivity (Pe). Motivation was assessed using a child-friendly challenge puzzle task and a brief interview measure prior to ERP testing. Data from 50 4- to 6-year-old children revealed that greater perceived competence beliefs were related to a larger Pe, whereas stronger intrinsic task value beliefs were associated with a smaller Pe. Motivation was unrelated to the ERN. Individual differences in early motivational processes may reflect electrophysiological activity related to conscious error awareness. Copyright © 2016 Elsevier Inc. All rights reserved.
Reward positivity: Reward prediction error or salience prediction error?
Heydari, Sepideh; Holroyd, Clay B
2016-08-01
The reward positivity is a component of the human ERP elicited by feedback stimuli in trial-and-error learning and guessing tasks. A prominent theory holds that the reward positivity reflects a reward prediction error signal that is sensitive to outcome valence, being larger for unexpected positive events relative to unexpected negative events (Holroyd & Coles, 2002). Although the theory has found substantial empirical support, most of these studies have utilized either monetary or performance feedback to test the hypothesis. However, in apparent contradiction to the theory, a recent study found that unexpected physical punishments also elicit the reward positivity (Talmi, Atkinson, & El-Deredy, 2013). The authors of this report argued that the reward positivity reflects a salience prediction error rather than a reward prediction error. To investigate this finding further, in the present study participants navigated a virtual T maze and received feedback on each trial under two conditions. In a reward condition, the feedback indicated that they would either receive a monetary reward or not and in a punishment condition the feedback indicated that they would receive a small shock or not. We found that the feedback stimuli elicited a typical reward positivity in the reward condition and an apparently delayed reward positivity in the punishment condition. Importantly, this signal was more positive to the stimuli that predicted the omission of a possible punishment relative to stimuli that predicted a forthcoming punishment, which is inconsistent with the salience hypothesis. © 2016 Society for Psychophysiological Research.
Transformational leadership in nursing and medication safety education: a discussion paper.
Vaismoradi, Mojtaba; Griffiths, Pauline; Turunen, Hannele; Jordan, Sue
2016-10-01
This paper discusses the application of transformational leadership to the teaching and learning of safe medication management. The prevalence of adverse drug events (ADEs) and medication-related hospitalisations (one hundred thousand each year in the USA) are of concern. This discussion is based on a narrative literature review and scrutiny of international nursing research to synthesise pedagogical strategies for the application of transformational leadership to teaching medication safety. The four elements relating transformational leadership to medication safety education are: 'Idealised influence' or role modelling, both actual and exemplary, 'Inspirational motivation' providing students with commitment to medication safety, 'Intellectual stimulation' encouraging students to value improvement and change, and 'Individualised consideration' of individual students' educational goals, practice development and patient outcomes. The model lends itself to experiential learning and a case-study approach to teaching, offering an opportunity to reduce nursing's theory-practice gap. Transformational leadership for medication safety education is characterised by a focus on the role of nurse educators and mentors in the development of students' abilities, creation of a supportive culture, and enhancement of students' creativity, motivation and ethical behaviour. This will prepare nursing graduates with the competencies necessary to be diligent about medication safety and the prevention of errors. Teaching medication safety through transformational leadership requires the close collaboration of educators, managers and policy makers. Investigation of strategies to reduce medication errors and consequent patient harm should include exploration of the application of transformational leadership to education and its impact on the number and severity of medication errors. © 2016 John Wiley & Sons Ltd.
Demand Forecasting: An Evaluation of DOD's Accuracy Metric and Navy's Procedures
2016-06-01
Keywords: inventory management improvement plan, mean absolute scaled error, lead-time adjusted squared error, forecast accuracy, benchmarking, naïve method.
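Of the metrics named in the keywords, the mean absolute scaled error has a simple closed form: the forecast MAE divided by the in-sample MAE of the one-step naive method. A sketch with toy demand data (the series and split are made up):

```python
import numpy as np

def mase(y_train, y_test, y_pred):
    """Mean Absolute Scaled Error: forecast MAE scaled by the in-sample
    MAE of the one-step naive forecast."""
    naive_mae = np.mean(np.abs(np.diff(y_train)))
    return np.mean(np.abs(y_test - y_pred)) / naive_mae

# Toy demand series: last 4 points held out, naive method as the forecast
y = np.array([12, 15, 14, 16, 18, 17, 19, 21, 20, 22, 23, 25], float)
train, test = y[:8], y[8:]
pred = np.full(4, train[-1])       # flat naive forecast from last observation
print("MASE:", mase(train, test, pred))
```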
Detection of digital FSK using a phase-locked loop
NASA Technical Reports Server (NTRS)
Lindsey, W. C.; Simon, M. K.
1975-01-01
A theory is presented for the design of a digital FSK receiver which employs a phase-locked loop to set up the desired matched filter as the arriving signal frequency switches. The developed mathematical model makes it possible to establish the error probability performance of systems which employ a class of digital FM modulations. The noise mechanism which accounts for decision errors is modeled on the basis of the Meyr distribution and renewal Markov process theory.
Solution of elastic-plastic stress analysis problems by the p-version of the finite element method
NASA Technical Reports Server (NTRS)
Szabo, Barna A.; Actis, Ricardo L.; Holzer, Stefan M.
1993-01-01
The solution of small strain elastic-plastic stress analysis problems by the p-version of the finite element method is discussed. The formulation is based on the deformation theory of plasticity and the displacement method. Practical realization of controlling discretization errors for elastic-plastic problems is the main focus. Numerical examples which include comparisons between the deformation and incremental theories of plasticity under tight control of discretization errors are presented.
Modeling human response errors in synthetic flight simulator domain
NASA Technical Reports Server (NTRS)
Ntuen, Celestine A.
1992-01-01
This paper presents a control theoretic approach to modeling human response errors (HRE) in the flight simulation domain. The human pilot is modeled as a supervisor of a highly automated system. The synthesis uses the theory of optimal control pilot modeling for integrating the pilot's observation error and the error due to the simulation model (experimental error). Methods for solving the HRE problem are suggested. Experimental verification of the models will be tested in a flight quality handling simulation.
Molina, Sergio L; Stodden, David F
2018-04-01
This study examined variability in throwing speed and spatial error to test the prediction of an inverted-U function (i.e., impulse-variability [IV] theory) and the speed-accuracy trade-off. Forty-five 9- to 11-year-old children were instructed to throw at a specified percentage of maximum speed (45%, 65%, 85%, and 100%) and hit the wall target. Results indicated no statistically significant differences in variable error across the target conditions (p = .72), failing to support the inverted-U hypothesis. Spatial accuracy results indicated no statistically significant differences with mean radial error (p = .18), centroid radial error (p = .13), and bivariate variable error (p = .08) also failing to support the speed-accuracy trade-off in overarm throwing. As neither throwing performance variability nor accuracy changed across percentages of maximum speed in this sample of children as well as in a previous adult sample, current policy and practices of practitioners may need to be reevaluated.
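The spatial error measures reported in this study can be computed directly from hit coordinates. The sketch below assumes the common definitions (mean radial error as average distance to the target, centroid radial error as the bias of the hit cluster, bivariate variable error as scatter about the centroid) and uses made-up throw data:

```python
import numpy as np

def spatial_error_measures(hits, target=(0.0, 0.0)):
    """Common throwing-accuracy measures: mean radial error (MRE),
    centroid radial error (CRE), and bivariate variable error (BVE)."""
    pts = np.asarray(hits, float) - np.asarray(target, float)
    mre = np.mean(np.linalg.norm(pts, axis=1))          # avg distance to target
    centroid = pts.mean(axis=0)
    cre = np.linalg.norm(centroid)                      # bias of the hit cluster
    bve = np.sqrt(np.mean(np.sum((pts - centroid)**2, axis=1)))  # scatter
    return mre, cre, bve

throws = [(5.2, -3.1), (-2.0, 4.4), (1.1, 0.5), (3.8, -1.7), (-0.6, 2.2)]
print(spatial_error_measures(throws))   # units: cm from target center
```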
Nikolic, Mark I; Sarter, Nadine B
2007-08-01
To examine operator strategies for diagnosing and recovering from errors and disturbances as well as the impact of automation design and time pressure on these processes. Considerable efforts have been directed at error prevention through training and design. However, because errors cannot be eliminated completely, their detection, diagnosis, and recovery must also be supported. Research has focused almost exclusively on error detection. Little is known about error diagnosis and recovery, especially in the context of event-driven tasks and domains. With a confederate pilot, 12 airline pilots flew a 1-hr simulator scenario that involved three challenging automation-related tasks and events that were likely to produce erroneous actions or assessments. Behavioral data were compared with a canonical path to examine pilots' error and disturbance management strategies. Debriefings were conducted to probe pilots' system knowledge. Pilots seldom followed the canonical path to cope with the scenario events. Detection of a disturbance was often delayed. Diagnostic episodes were rare because of pilots' knowledge gaps and time criticality. In many cases, generic inefficient recovery strategies were observed, and pilots relied on high levels of automation to manage the consequences of an error. Our findings describe and explain the nature and shortcomings of pilots' error management activities. They highlight the need for improved automation training and design to achieve more timely detection, accurate explanation, and effective recovery from errors and disturbances. Our findings can inform the design of tools and techniques that support disturbance management in various complex, event-driven environments.
A general geometric theory of attitude determination from directional sensing
NASA Technical Reports Server (NTRS)
Fang, B. T.
1976-01-01
A general geometric theory of spacecraft attitude determination from external reference direction sensors was presented. Outputs of different sensors are reduced to two kinds of basic directional measurements. Errors in these measurement equations are studied in detail. The partial derivatives of measurements with respect to the spacecraft orbit, the spacecraft attitude, and the error parameters form the basis for all orbit and attitude determination schemes and error analysis programs and are presented in a series of tables. The question of attitude observability is studied with the introduction of a graphical construction which provides a great deal of physical insight. The result is applied to the attitude observability of the IMP-8 spacecraft.
NASA Technical Reports Server (NTRS)
Decker, A. J.
1982-01-01
A theory of fringe localization in rapid-double-exposure, diffuse-illumination holographic interferometry was developed. The theory was then applied to compare holographic measurements with laser anemometer measurements of shock locations in a transonic axial-flow compressor rotor. The computed fringe localization error was found to agree well with the measured localization error. It is shown how the view orientation and the curvature and positional variation of the strength of a shock wave are used to determine the localization error and to minimize it. In particular, it is suggested that the view direction not deviate from tangency at the shock surface by more than 30 degrees.
Norman, Geoffrey R; Monteiro, Sandra D; Sherbino, Jonathan; Ilgen, Jonathan S; Schmidt, Henk G; Mamede, Silvia
2017-01-01
Contemporary theories of clinical reasoning espouse a dual processing model, which consists of a rapid, intuitive component (Type 1) and a slower, logical and analytical component (Type 2). Although the general consensus is that this dual processing model is a valid representation of clinical reasoning, the causes of diagnostic errors remain unclear. Cognitive theories about human memory propose that such errors may arise from both Type 1 and Type 2 reasoning. Errors in Type 1 reasoning may be a consequence of the associative nature of memory, which can lead to cognitive biases. However, the literature indicates that, with increasing expertise (and knowledge), the likelihood of errors decreases. Errors in Type 2 reasoning may result from the limited capacity of working memory, which constrains computational processes. In this article, the authors review the medical literature to answer two substantial questions that arise from this work: (1) To what extent do diagnostic errors originate in Type 1 (intuitive) processes versus in Type 2 (analytical) processes? (2) To what extent are errors a consequence of cognitive biases versus a consequence of knowledge deficits?The literature suggests that both Type 1 and Type 2 processes contribute to errors. Although it is possible to experimentally induce cognitive biases, particularly availability bias, the extent to which these biases actually contribute to diagnostic errors is not well established. Educational strategies directed at the recognition of biases are ineffective in reducing errors; conversely, strategies focused on the reorganization of knowledge to reduce errors have small but consistent benefits.
Radiation-Hardened Solid-State Drive
NASA Technical Reports Server (NTRS)
Sheldon, Douglas J.
2010-01-01
A method is provided for a radiation-hardened (rad-hard) solid-state drive for space mission memory applications by combining rad-hard and commercial off-the-shelf (COTS) non-volatile memories (NVMs) into a hybrid architecture. The architecture is controlled by a rad-hard ASIC (application specific integrated circuit) or a FPGA (field programmable gate array). Specific error handling and data management protocols are developed for use in a rad-hard environment. The rad-hard memories are smaller in overall memory density, but are used to control and manage radiation-induced errors in the main, and much larger density, non-rad-hard COTS memory devices. Small amounts of rad-hard memory are used as error buffers and temporary caches for radiation-induced errors in the large COTS memories. The rad-hard ASIC/FPGA implements a variety of error-handling protocols to manage these radiation-induced errors. The large COTS memory is triplicated for protection, and CRC-based counters are calculated for sub-areas in each COTS NVM array. These counters are stored in the rad-hard non-volatile memory. Through monitoring, rewriting, regeneration, triplication, and long-term storage, radiation-induced errors in the large NV memory are managed. The rad-hard ASIC/FPGA also interfaces with the external computer buses.
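The triplication-plus-CRC scheme can be illustrated with a bitwise 2-of-3 majority vote followed by a CRC check against the value kept in rad-hard memory. This is a simplified sketch of the idea, not the flight protocol:

```python
import zlib

def majority_read(copy_a: bytes, copy_b: bytes, copy_c: bytes, stored_crc: int):
    """Bitwise 2-of-3 majority vote across triplicated COTS copies, then a
    CRC check against a counter kept in rad-hard memory."""
    voted = bytes((a & b) | (a & c) | (b & c)
                  for a, b, c in zip(copy_a, copy_b, copy_c))
    return voted, zlib.crc32(voted) == stored_crc

data = b"payload"
crc = zlib.crc32(data)
corrupted = b"pazload"                       # single-copy upset
voted, ok = majority_read(data, corrupted, data, crc)
print(voted, ok)                             # b'payload' True
```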
The Neural-fuzzy Thermal Error Compensation Controller on CNC Machining Center
NASA Astrophysics Data System (ADS)
Tseng, Pai-Chung; Chen, Shen-Len
Geometric errors and structural thermal deformation are factors that influence the machining accuracy of a Computer Numerical Control (CNC) machining center. Researchers have therefore paid attention to thermal error compensation technologies for CNC machine tools. Some real-time error compensation techniques have been successfully demonstrated in both laboratories and industrial sites, but the compensation results still need to be enhanced. In this research, neural-fuzzy theory is used to derive a thermal prediction model. An IC-type thermometer detects the temperature variation of the heat sources. The thermal drifts are measured online by a touch-triggered probe with a standard bar. A thermal prediction model is then derived by neural-fuzzy theory from the temperature variation and the thermal drifts. A Graphical User Interface (GUI) system providing a user-friendly operation interface is also built with Inprise C++ Builder. The experimental results show that the thermal prediction model developed by the neural-fuzzy methodology can improve machining accuracy from 80 µm to 3 µm. Compared with multi-variable linear regression analysis, the compensation accuracy is improved from ±10 µm to ±3 µm.
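The neural-fuzzy predictor itself is not reproduced here, but the compensation idea can be sketched with an ordinary least-squares stand-in: predict thermal drift from heat-source temperature rises, then subtract the prediction from the commanded position. Sensor count, weights, and noise levels below are invented.

```python
import numpy as np

# Least-squares stand-in for the paper's neural-fuzzy predictor: fit thermal
# drift as a function of heat-source temperature rises, then compensate by
# subtracting the predicted drift. All numbers are synthetic.
rng = np.random.default_rng(0)

temps = rng.uniform(0.0, 15.0, size=(200, 3))        # ΔT at 3 sensors (°C)
true_w = np.array([4.0, 2.5, 1.0])                   # µm of drift per °C
drift = temps @ true_w + rng.normal(0.0, 1.0, 200)   # measured drift (µm)

X = np.column_stack([temps, np.ones(len(temps))])    # linear model + offset
w, *_ = np.linalg.lstsq(X, drift, rcond=None)

residual = drift - X @ w                             # drift left after compensation
print(f"uncompensated drift span: {np.ptp(drift):.1f} µm")
print(f"post-compensation residual: ±{residual.std():.1f} µm")
```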
Jackson, Simon A.; Kleitman, Sabina; Howie, Pauline; Stankov, Lazar
2016-01-01
In this paper, we investigate whether individual differences in performance on heuristic and biases tasks can be explained by cognitive abilities, monitoring confidence, and control thresholds. Current theories explain individual differences in these tasks by the ability to detect errors and override automatic but biased judgments, and deliberative cognitive abilities that help to construct the correct response. Here we retain cognitive abilities but disentangle error detection, proposing that lower monitoring confidence and higher control thresholds promote error checking. Participants (N = 250) completed tasks assessing their fluid reasoning abilities, stable monitoring confidence levels, and the control threshold they impose on their decisions. They also completed seven typical heuristic and biases tasks such as the cognitive reflection test and Resistance to Framing. Using structural equation modeling, we found that individuals with higher reasoning abilities, lower monitoring confidence, and higher control threshold performed significantly and, at times, substantially better on the heuristic and biases tasks. Individuals with higher control thresholds also showed lower preferences for risky alternatives in a gambling task. Furthermore, residual correlations among the heuristic and biases tasks were reduced to null, indicating that cognitive abilities, monitoring confidence, and control thresholds accounted for their shared variance. Implications include the proposal that the capacity to detect errors does not differ between individuals. Rather, individuals might adopt varied strategies that promote error checking to different degrees, regardless of whether they have made a mistake or not. The results support growing evidence that decision-making involves cognitive abilities that construct actions and monitoring and control processes that manage their initiation. PMID:27790170
NASA Astrophysics Data System (ADS)
Cottrell, Paul Edward
There is a lack of research in the area of hedging futures contracts, especially in illiquid or very volatile market conditions. It is important to understand the volatility of the oil and currency markets because reduced fluctuations in these markets could lead to better hedging performance. This study compared different hedging methods by using a hedging error metric, supplementing the Receding Horizon Control and Stochastic Programming (RHCSP) method by utilizing the London Interbank Offered Rate with the Lévy process. The RHCSP hedging method was investigated to determine whether it improved hedging error compared to the Black-Scholes, Leland, and Whalley and Wilmott methods when applied to simulated, oil, and currency futures markets. A modified RHCSP method was also investigated to determine whether it could significantly reduce hedging error under extreme market illiquidity conditions when applied to simulated, oil, and currency futures markets. This quantitative study used chaos theory and emergence for its theoretical foundation. An experimental research method was utilized with a sample size of 506 hedging errors pertaining to historical and simulation data. The historical data spanned January 1, 2005 through December 31, 2012. The modified RHCSP method was found to significantly reduce hedging error for the oil and currency market futures, as assessed by a two-way ANOVA with a t test and post hoc Tukey test. This study promotes positive social change by identifying better risk controls for investment portfolios and illustrating how to benefit from high volatility in markets. Economists, professional investment managers, and independent investors could benefit from the findings of this study.
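For readers who want to see what a "hedging error metric" looks like operationally, here is a generic discrete delta-hedging simulation under geometric Brownian motion, a Black-Scholes-style baseline of the kind the study benchmarks against, not the RHCSP method. All parameter values are invented.

```python
import numpy as np
from math import log, sqrt, exp
from statistics import NormalDist

# Terminal replication error of a discretely delta-hedged short European
# call under GBM; the RMS of this error is one common hedging error metric.
Phi = NormalDist().cdf

def d1(S, K, r, sigma, tau):
    return (log(S / K) + (r + 0.5 * sigma**2) * tau) / (sigma * sqrt(tau))

def call_price(S, K, r, sigma, tau):
    x = d1(S, K, r, sigma, tau)
    return S * Phi(x) - K * exp(-r * tau) * Phi(x - sigma * sqrt(tau))

rng = np.random.default_rng(7)
S0, K, r, sigma, T = 100.0, 100.0, 0.02, 0.3, 0.25
steps, paths = 63, 500
dt = T / steps

errors = []
for _ in range(paths):
    S = S0
    delta = Phi(d1(S, K, r, sigma, T))
    cash = call_price(S, K, r, sigma, T) - delta * S   # premium in, hedge on
    for k in range(1, steps + 1):
        S *= exp((r - 0.5 * sigma**2) * dt + sigma * sqrt(dt) * rng.standard_normal())
        tau = T - k * dt
        new_delta = Phi(d1(S, K, r, sigma, tau)) if tau > 1e-12 else float(S > K)
        cash = cash * exp(r * dt) - (new_delta - delta) * S   # self-financing rebalance
        delta = new_delta
    errors.append(cash + delta * S - max(S - K, 0.0))  # terminal replication error

rms = sqrt(np.mean(np.square(errors)))
print(f"mean error {np.mean(errors):+.4f}, RMS hedging error {rms:.4f}")
```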
Model-based predictions for dopamine.
Langdon, Angela J; Sharpe, Melissa J; Schoenbaum, Geoffrey; Niv, Yael
2018-04-01
Phasic dopamine responses are thought to encode a prediction-error signal consistent with model-free reinforcement learning theories. However, a number of recent findings highlight the influence of model-based computations on dopamine responses, and suggest that dopamine prediction errors reflect more dimensions of an expected outcome than scalar reward value. Here, we review a selection of these recent results and discuss the implications and complications of model-based predictions for computational theories of dopamine and learning. Copyright © 2017. Published by Elsevier Ltd.
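The scalar prediction-error signal at the heart of the model-free account can be written in a few lines; a minimal TD(0) sketch with toy states and rewards:

```python
import numpy as np

# Minimal model-free TD(0) sketch: the scalar reward prediction error
# delta = r + gamma * V(s') - V(s) that phasic dopamine is classically
# thought to encode. States and rewards here are arbitrary toy choices.
gamma, alpha = 0.9, 0.1
V = np.zeros(3)                                        # values of states 0, 1, 2
episode = [(0, 0.0, 1), (1, 0.0, 2), (2, 1.0, None)]   # (s, r, s')

for s, r, s_next in episode:
    v_next = 0.0 if s_next is None else V[s_next]
    delta = r + gamma * v_next - V[s]     # the prediction-error signal
    V[s] += alpha * delta                 # value update driven by delta
    print(f"state {s}: delta = {delta:+.3f}")
```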
Siu, Heidi; Spence Laschinger, Heather K; Finegan, Joan
2008-05-01
The aim of this study was to examine the impact of nurses' perceived professional practice environment on their quality of nursing conflict management approaches and ultimately their perceptions of unit effectiveness from the perspective of Deutsch's theory of constructive conflict management. Rising reports of hostility and conflict among Canadian nurses are a concern to nurses' health and the viability of effective patient care delivery. However, research on the situational factors that influence nurses' ability to apply effective conflict resolution skills that lead to positive results in practice is limited. A nonexperimental, predictive design was used in a sample of 678 registered nurses working in community hospitals within a large metropolitan area in Ontario. The results supported a modified version of the hypothesized model [chi2(1) = 16.25, Goodness of Fit = 0.99, Comparative Fit Index = 0.98, Root-Mean-Square Error of Approximation = 0.15] linking professional practice environment and core self-evaluation to nurses' conflict management and, ultimately, unit effectiveness. Professional practice environment, conflict management, and core self-evaluation explained approximately 46.6% of the variance in unit effectiveness. Positive professional practice environments and high core self-evaluations predicted nurses' constructive conflict management and, in turn, greater unit effectiveness.
NP-hardness of decoding quantum error-correction codes
NASA Astrophysics Data System (ADS)
Hsieh, Min-Hsiu; Le Gall, François
2011-05-01
Although the theory of quantum error correction is intimately related to classical coding theory and, in particular, one can construct quantum error-correction codes (QECCs) from classical codes with the dual-containing property, this does not necessarily imply that the computational complexity of decoding QECCs is the same as that of their classical counterparts. Instead, decoding QECCs can be very different from decoding classical codes due to the degeneracy property. Intuitively, one expects degeneracy to simplify decoding, since two different errors might not, and need not, be distinguished in order to correct them. However, we show that the general quantum decoding problem is NP-hard regardless of whether the quantum codes are degenerate or nondegenerate. This finding implies that no efficient decoding algorithm exists for the general quantum decoding problem and suggests the existence of a quantum cryptosystem based on the hardness of decoding QECCs.
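Decoding is naturally a search problem: find the lightest error consistent with the observed syndrome. For a toy 3-bit repetition code the exhaustive search below is trivial; the paper's result is that no such search can be made efficient for general quantum codes (assuming P ≠ NP).

```python
import itertools
import numpy as np

# Decoding as search: find a minimum-weight error pattern consistent with a
# measured syndrome. For this 3-bit repetition code the search is tiny; for
# general quantum codes the paper shows the problem is NP-hard.
H = np.array([[1, 1, 0],
              [0, 1, 1]])            # parity checks of the repetition code

def decode(syndrome):
    for weight in range(H.shape[1] + 1):          # lightest errors first
        for support in itertools.combinations(range(H.shape[1]), weight):
            e = np.zeros(H.shape[1], dtype=int)
            e[list(support)] = 1
            if np.array_equal(H @ e % 2, syndrome):
                return e
    return None

error = np.array([0, 1, 0])                       # bit flip on qubit 1
syndrome = H @ error % 2
print("syndrome:", syndrome, "-> decoded error:", decode(syndrome))
```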
NASA Astrophysics Data System (ADS)
Melendez, Jordan; Wesolowski, Sarah; Furnstahl, Dick
2017-09-01
Chiral effective field theory (EFT) predictions are necessarily truncated at some order in the EFT expansion, which induces an error that must be quantified for robust statistical comparisons to experiment. A Bayesian model yields posterior probability distribution functions for these errors based on expectations of naturalness encoded in Bayesian priors and the observed order-by-order convergence pattern of the EFT. As a general example of a statistical approach to truncation errors, the model was applied to chiral EFT for neutron-proton scattering using various semi-local potentials of Epelbaum, Krebs, and Meißner (EKM). Here we discuss how our model can learn correlation information from the data and how to perform Bayesian model checking to validate that the EFT is working as advertised. Supported in part by NSF PHY-1614460 and DOE NUCLEI SciDAC DE-SC0008533.
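A schematic version of such a truncation-error model, assuming the first omitted term dominates and its coefficient is drawn from a Gaussian whose scale is estimated from the lower-order coefficients; the numbers are invented, not the EKM analysis:

```python
import numpy as np

# Schematic Bayesian truncation-error estimate: assume the first omitted
# EFT term is c_{k+1} * Q**(k+1) with c_{k+1} ~ N(0, cbar**2), and estimate
# cbar from the coefficients seen at lower orders. Toy values throughout.
Q = 0.3                                   # expansion parameter
coeffs = np.array([1.0, -0.6, 1.4, 0.8])  # extracted c_0 .. c_3 (invented)

cbar = np.sqrt(np.mean(coeffs**2))        # naturalness scale from the data
k = len(coeffs) - 1                       # highest order retained
trunc_sigma = cbar * Q**(k + 1)           # 1-sigma truncation error

print(f"cbar = {cbar:.2f}; 68% truncation band = ±{trunc_sigma:.4f}")
```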
Asteroid orbital error analysis: Theory and application
NASA Technical Reports Server (NTRS)
Muinonen, K.; Bowell, Edward
1992-01-01
We present a rigorous Bayesian theory for asteroid orbital error estimation in which the probability density of the orbital elements is derived from the noise statistics of the observations. For Gaussian noise in a linearized approximation, the probability density is also Gaussian, and the errors of the orbital elements at a given epoch are fully described by the covariance matrix. The law of error propagation can then be applied to calculate past and future positional uncertainty ellipsoids (Cappellari et al. 1976, Yeomans et al. 1987, Whipple et al. 1991). To our knowledge, this is the first time a Bayesian approach has been formulated for orbital element estimation. In contrast to the classical Fisherian school of statistics, the Bayesian school allows a priori information to be formally present in the final estimation. However, Bayesian estimation gives the same results as Fisherian estimation when no a priori information is assumed (Lehtinen 1988, and references therein).
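The linearized propagation step can be sketched directly: the element covariance P maps through the Jacobian J of position with respect to the elements, and the eigen-decomposition of J P Jᵀ gives the uncertainty ellipsoid. J and P below are placeholders, not a fitted asteroid solution.

```python
import numpy as np

# Linearized law of error propagation: element covariance P maps to a
# positional covariance J P J^T, whose eigen-decomposition gives the
# 1-sigma uncertainty ellipsoid. Values are placeholders for illustration.
P = np.diag([1e-8, 4e-8, 2e-9])        # element covariance (toy, 3 elements)
J = np.array([[0.8, 0.1, 0.0],         # d(position)/d(elements) at an epoch
              [0.2, 0.9, 0.1],
              [0.0, 0.3, 1.1]])

P_pos = J @ P @ J.T                    # propagated positional covariance
axes, vecs = np.linalg.eigh(P_pos)
print("1-sigma ellipsoid semi-axes:", np.sqrt(axes))
```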
Examining Impulse-Variability in Kicking.
Chappell, Andrew; Molina, Sergio L; McKibben, Jonathon; Stodden, David F
2016-07-01
This study examined variability in kicking speed and spatial accuracy to test the impulse-variability theory prediction of an inverted-U function and the speed-accuracy trade-off. Twenty-eight 18- to 25-year-old adults kicked a playground ball at various percentages (50-100%) of their maximum speed at a wall target. Speed variability and spatial error were analyzed using repeated-measures ANOVA with built-in polynomial contrasts. Results indicated a significant inverse linear trajectory for speed variability (p < .001, η² = .345), where the 50% and 60% maximum-speed conditions had significantly higher variability than the 100% condition. A significant quadratic fit was found for the spatial error scores of mean radial error (p < .0001, η² = .474) and subject-centroid radial error (p < .0001, η² = .453). Findings suggest variability and accuracy of multijoint, ballistic skill performance may not follow the general principles of impulse-variability theory or the speed-accuracy trade-off.
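The trend tests reduce to comparing polynomial fits across the speed conditions; a toy version with fabricated condition means (not the study's data):

```python
import numpy as np

# Linear vs quadratic trend across %-of-max-speed conditions, mirroring the
# polynomial contrasts in the abstract. Condition means are fabricated.
speed = np.array([50, 60, 70, 80, 90, 100], dtype=float)
error_cm = np.array([42.0, 38.5, 36.0, 35.8, 37.9, 41.5])   # toy means

lin = np.polyfit(speed, error_cm, 1)
quad = np.polyfit(speed, error_cm, 2)
lin_sse = np.sum((np.polyval(lin, speed) - error_cm)**2)
quad_sse = np.sum((np.polyval(quad, speed) - error_cm)**2)
print(f"SSE linear {lin_sse:.2f} vs quadratic {quad_sse:.2f}")
print(f"quadratic coefficient {quad[0]:+.4f} (curvature sign gives the shape)")
```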
Interprofessional communication and medical error: a reframing of research questions and approaches.
Varpio, Lara; Hall, Pippa; Lingard, Lorelei; Schryer, Catherine F
2008-10-01
Progress toward understanding the links between interprofessional communication and issues of medical error has been slow. Recent research proposes that this delay may result from overlooking the complexities involved in interprofessional care. Medical education initiatives in this domain tend to simplify the complexities of team membership fluidity, rotation, and use of communication tools. A new theoretically informed research approach is required to take into account these complexities. To generate such an approach, we review two theories from the social sciences: Activity Theory and Knotworking. Using these perspectives, we propose that research into interprofessional communication and medical error can develop better understandings of (1) how and why medical errors are generated and (2) how and why gaps in team defenses occur. Such complexities will have to be investigated if students and practicing clinicians are to be adequately prepared to work safely in interprofessional teams.
Model error estimation for distributed systems described by elliptic equations
NASA Technical Reports Server (NTRS)
Rodriguez, G.
1983-01-01
A function space approach is used to develop a theory for estimation of the errors inherent in an elliptic partial differential equation model for a distributed parameter system. By establishing knowledge of the inevitable deficiencies in the model, the error estimates provide a foundation for updating the model. The function space solution leads to a specification of a method for computation of the model error estimates and development of model error analysis techniques for comparison between actual and estimated errors. The paper summarizes the model error estimation approach as well as an application arising in the area of modeling for static shape determination of large flexible systems.
Errors, error detection, error correction and hippocampal-region damage: data and theories.
MacKay, Donald G; Johnson, Laura W
2013-11-01
This review and perspective article outlines 15 observational constraints on theories of errors, error detection, and error correction, and their relation to hippocampal-region (HR) damage. The core observations come from 10 studies with H.M., an amnesic with cerebellar and HR damage but virtually no neocortical damage. Three studies examined the detection of errors planted in visual scenes (e.g., a bird flying in a fish bowl in a school classroom) and sentences (e.g., I helped themselves to the birthday cake). In all three experiments, H.M. detected reliably fewer errors than carefully matched memory-normal controls. Other studies examined the detection and correction of self-produced errors, with controls for comprehension of the instructions, impaired visual acuity, temporal factors, motoric slowing, forgetting, excessive memory load, lack of motivation, and deficits in visual scanning or attention. In these studies, H.M. corrected reliably fewer errors than memory-normal and cerebellar controls, and his uncorrected errors in speech, object naming, and reading aloud exhibited two consistent features: omission and anomaly. For example, in sentence production tasks, H.M. omitted one or more words in uncorrected encoding errors that rendered his sentences anomalous (incoherent, incomplete, or ungrammatical) reliably more often than controls. Besides explaining these core findings, the theoretical principles discussed here explain H.M.'s retrograde amnesia for once familiar episodic and semantic information; his anterograde amnesia for novel information; his deficits in visual cognition, sentence comprehension, sentence production, sentence reading, and object naming; and effects of aging on his ability to read isolated low-frequency words aloud. These theoretical principles also explain a wide range of other data on error detection and correction and generate new predictions for future tests. Copyright © 2013 Elsevier Ltd. All rights reserved.
Borycki, E M; Kushniruk, A W; Bellwood, P; Brender, J
2012-01-01
The objective of this paper is to examine the extent, range and scope to which frameworks, models and theories dealing with technology-induced error have arisen in the biomedical and life sciences literature as indexed by Medline®. To better understand the state of work in the area of technology-induced error involving frameworks, models and theories, the authors conducted a search of Medline® using selected key words identified from seminal articles in this research area. Articles were reviewed, and those pertaining to frameworks, models or theories dealing with technology-induced error were further reviewed by two researchers. All articles from Medline® from its inception to April of 2011 were searched using the above outlined strategy. 239 citations were returned. Each of the abstracts for the 239 citations was reviewed by two researchers. Eleven articles met the criteria based on abstract review. These 11 articles were downloaded for further in-depth review. The majority of the articles obtained describe frameworks and models with reference to theories developed in other literatures outside of healthcare. The papers were grouped into several areas. It was found that articles drew mainly from three literatures: 1) the human factors literature (including human-computer interaction and cognition), 2) the organizational behavior/sociotechnical literature, and 3) the software engineering literature. A variety of frameworks and models were found in the biomedical and life sciences literatures. These frameworks and models drew upon and extended frameworks, models and theoretical perspectives that have emerged in other literatures. These frameworks and models are informing an emerging line of research in health and biomedical informatics involving technology-induced errors in healthcare.
A Theory of False Cognitive Expectancies in Airline Pilots
NASA Astrophysics Data System (ADS)
Cortes, Antonio I.
The Theory of False Cognitive Expectancies was developed by studying high-reliability flight operations. Airline pilots depend extensively on cognitive expectancies to perceive, understand, and predict actions and events. Out of 1,363 incident reports submitted by airline pilots to the National Aeronautics and Space Administration Aviation Safety Reporting System over a year's time, 110 reports were found to contain evidence of 127 false cognitive expectancies in pilots. A comprehensive taxonomy was developed with six categories of interest. The dataset of 127 false expectancies was used to initially code tentative taxon values for each category. Intermediate coding through constant comparative analysis completed the taxonomy. The taxonomy was used for the advanced coding of chronological, context-dependent visualizations of expectancy factors, known as strands, which depict the major factors in the creation and propagation of each expectancy. Strands were mapped into common networks to detect highly represented expectancy processes. Theoretical integration established 11 sources of false expectancies, the most common expectancy errors, and the conspicuous factors worthy of future study. The most prevalent source of false cognitive expectancies within the dataset was determined to be unconscious individual modeling based on past events. Integrative analyses also revealed relationships between expectancies and flight deck automation, unresolved discrepancies, and levels of situation awareness. Particularly noteworthy were the findings that false expectancies can combine in three possible permutations to diminish situation awareness, and examples of how false expectancies can be unwittingly transmitted from one person to another. The theory resulting from this research can enhance the error coding process used during aircraft line-oriented safety audits, lays the foundation for developing expectancy management training programs, and will allow researchers to proffer hypotheses for human testing using flight simulators.
Wheel speed management control system for spacecraft
NASA Technical Reports Server (NTRS)
Goodzeit, Neil E. (Inventor); Linder, David M. (Inventor)
1991-01-01
A spacecraft attitude control system uses at least four reaction wheels. In order to minimize reaction wheel speed and therefore power, a wheel speed management system is provided. The management system monitors the wheel speeds and generates a wheel speed error vector. The error vector is integrated, and the error vector and its integral are combined to form a correction vector. The correction vector is summed with the attitude control torque command signals for driving the reaction wheels.
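A literal reading of the described loop is a proportional-integral correction on the wheel-speed error vector, summed with the attitude torque commands; gains, targets, and the toy wheel dynamics below are placeholders, not the patented design.

```python
import numpy as np

# Sketch of the described loop: form a wheel-speed error vector, integrate
# it, blend error and integral into a correction vector, and sum it with
# the attitude-control torque commands. All numbers are arbitrary.
k_p, k_i, dt = 0.5, 0.1, 0.1
target = np.full(4, 200.0)            # desired wheel speeds (rad/s)
speeds = np.array([260.0, 150.0, 230.0, 190.0])
integral = np.zeros(4)

for _ in range(500):
    error = target - speeds           # wheel-speed error vector
    integral += error * dt            # its integral
    correction = k_p * error + k_i * integral
    attitude_cmd = np.zeros(4)        # attitude torques (zero in this demo)
    torque = attitude_cmd + correction
    speeds += torque * dt             # toy wheel dynamics: dω/dt ∝ torque

print("final speeds:", np.round(speeds, 1))   # all converge toward 200.0
```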
Explanatory pluralism: An unrewarding prediction error for free energy theorists.
Colombo, Matteo; Wright, Cory
2017-03-01
Courtesy of its free energy formulation, the hierarchical predictive processing theory of the brain (PTB) is often claimed to be a grand unifying theory. To test this claim, we examine a central case: activity of mesocorticolimbic dopaminergic (DA) systems. After reviewing the three most prominent hypotheses of DA activity-the anhedonia, incentive salience, and reward prediction error hypotheses-we conclude that the evidence currently vindicates explanatory pluralism. This vindication implies that the grand unifying claims of advocates of PTB are unwarranted. More generally, we suggest that the form of scientific progress in the cognitive sciences is unlikely to be a single overarching grand unifying theory. Copyright © 2016 Elsevier Inc. All rights reserved.
[What Surgeons Should Know about Risk Management].
Strametz, R; Tannheimer, M; Rall, M
2017-02-01
Background: The fact that medical treatment is associated with errors has long been recognized. Based on the principle of "first do no harm", numerous efforts have since been made to prevent such errors or limit their impact. However, recent statistics show that these measures do not sufficiently prevent grave mistakes with serious consequences. Preventable mistakes such as wrong-patient or wrong-site surgery still appear frequently in error statistics. Methods: Based on insights from research on human error, and in due consideration of recent legislative regulations in Germany, the authors give an overview of the clinical risk management tools needed to identify risks in surgery, analyse their causes, and determine adequate measures to manage those risks depending on their relevance. The use and limitations of critical incident reporting systems (CIRS), safety checklists and crisis resource management (CRM) are highlighted. The rationale for IT systems to support the risk management process is also addressed. Results/Conclusion: No single risk management tool is effective as a standalone instrument; each unfolds its effect only when embedded in a superordinate risk management system, which integrates tailor-made elements to increase patient safety into the workflows of each organisation. Competence in choosing adequate tools, effective IT systems to support the risk management process, as well as leadership and commitment to constructive handling of human error are crucial components in establishing a safety culture in surgery. Georg Thieme Verlag KG Stuttgart · New York.
Contextual Advantage for State Discrimination
NASA Astrophysics Data System (ADS)
Schmid, David; Spekkens, Robert W.
2018-02-01
Finding quantitative aspects of quantum phenomena which cannot be explained by any classical model has foundational importance for understanding the boundary between classical and quantum theory. It also has practical significance for identifying information processing tasks for which those phenomena provide a quantum advantage. Using the framework of generalized noncontextuality as our notion of classicality, we find one such nonclassical feature within the phenomenology of quantum minimum-error state discrimination. Namely, we identify quantitative limits on the success probability for minimum-error state discrimination in any experiment described by a noncontextual ontological model. These constraints constitute noncontextuality inequalities that are violated by quantum theory, and this violation implies a quantum advantage for state discrimination relative to noncontextual models. Furthermore, our noncontextuality inequalities are robust to noise and are operationally formulated, so that any experimental violation of the inequalities is a witness of contextuality, independently of the validity of quantum theory. Along the way, we introduce new methods for analyzing noncontextuality scenarios and demonstrate a tight connection between our minimum-error state discrimination scenario and a Bell scenario.
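The quantum benchmark in this scenario is the Helstrom bound for minimum-error discrimination of two states; a short computation for a pair of qubit states (the particular states are an arbitrary choice, not taken from the paper):

```python
import numpy as np

# Helstrom bound: the quantum-optimal success probability for discriminating
# two states is P = 1/2 * (1 + || p0*rho0 - p1*rho1 ||_1). The paper's point
# is that noncontextual models cannot reach the quantum value; this snippet
# just evaluates the quantum bound for an arbitrary pair of qubit states.
def helstrom(rho0, rho1, p0=0.5):
    delta = p0 * rho0 - (1 - p0) * rho1
    trace_norm = np.abs(np.linalg.eigvalsh(delta)).sum()
    return 0.5 * (1 + trace_norm)

ket0 = np.array([1.0, 0.0])
ket_plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho0 = np.outer(ket0, ket0)
rho1 = np.outer(ket_plus, ket_plus)

print(f"optimal success probability: {helstrom(rho0, rho1):.4f}")  # ≈ 0.8536
```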
[Risk Management: concepts and chances for public health].
Palm, Stefan; Cardeneo, Margareta; Halber, Marco; Schrappe, Matthias
2002-01-15
Errors are a common problem in medicine and occur as a result of a complex process involving many contributing factors. Medical errors significantly reduce the safety margin for the patient and contribute additional costs in health care delivery. In most cases adverse events cannot be attributed to a single underlying cause. Therefore an effective risk management strategy must follow a system approach, which is based on counting and analysis of near misses. The development of defenses against the undesired effects of errors should be the main focus rather than asking the question "Who blundered?". Analysis of near misses (which in this context can be compared to indicators) offers several methodological advantages as compared to the analysis of errors and adverse events. Risk management is an integral element of quality management.
What Randomized Benchmarking Actually Measures
Proctor, Timothy; Rudinger, Kenneth; Young, Kevin; ...
2017-09-28
Randomized benchmarking (RB) is widely used to measure an error rate of a set of quantum gates, by performing random circuits that would do nothing if the gates were perfect. In the limit of no finite-sampling error, the exponential decay rate of the observable survival probabilities, versus circuit length, yields a single error metric r. For Clifford gates with arbitrary small errors described by process matrices, r was believed to reliably correspond to the mean, over all Clifford gates, of the average gate infidelity between the imperfect gates and their ideal counterparts. We show that this quantity is not a well-defined property of a physical gate set. It depends on the representations used for the imperfect and ideal gates, and the variant typically computed in the literature can differ from r by orders of magnitude. We present new theories of the RB decay that are accurate for all small errors describable by process matrices, and show that the RB decay curve is a simple exponential for all such errors. These theories allow explicit computation of the error rate that RB measures (r), but as far as we can tell it does not correspond to the infidelity of a physically allowed (completely positive) representation of the imperfect gates.
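The standard RB analysis the paper scrutinizes fits survival probabilities to A·p^m + B and reports r = (1 - p)(d - 1)/d; a synthetic-data sketch using scipy's curve_fit:

```python
import numpy as np
from scipy.optimize import curve_fit

# Standard RB analysis: survival probabilities decay as A*p**m + B with
# sequence length m; the single error metric is r = (1-p)*(d-1)/d. The
# "data" below are synthetic stand-ins for measured survival probabilities.
d = 2                                    # single-qubit case
m = np.arange(1, 201, 10)
rng = np.random.default_rng(1)
p_true = 0.995
probs = 0.5 * p_true**m + 0.5 + rng.normal(0, 0.005, m.size)

model = lambda m, A, p, B: A * p**m + B
(A, p, B), _ = curve_fit(model, m, probs, p0=(0.5, 0.99, 0.5))

r = (1 - p) * (d - 1) / d
print(f"fitted p = {p:.5f}, RB number r = {r:.2e}")
```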
Johnstone, Megan-Jane; Kanitsaki, Olga
2006-03-01
Nurses globally are required and expected to report nursing errors. As is clearly demonstrated in the international literature, fulfilling this requirement is not, however, without risks. In this discussion paper, the notion of 'nursing error', the practical and moral importance of defining, distinguishing and disclosing nursing errors and how a distinct definition of 'nursing error' fits with the new 'system approach' to human-error management in health care are critiqued. Drawing on international literature and two key case exemplars from the USA and Australia, arguments are advanced to support the view that although it is 'right' for nurses to report nursing errors, it will be very difficult for them to do so unless a non-punitive approach to nursing-error management is adopted.
NASA Astrophysics Data System (ADS)
Decca, R. S.; Fischbach, E.; Klimchitskaya, G. L.; Krause, D. E.; López, D.; Mostepanenko, V. M.
2003-12-01
We report new constraints on extra-dimensional models and other physics beyond the standard model based on measurements of the Casimir force between two dissimilar metals for separations in the range 0.2–1.2 μm. The Casimir force between a Au-coated sphere and a Cu-coated plate of a microelectromechanical torsional oscillator was measured statically with an absolute error of 0.3 pN. In addition, the Casimir pressure between two parallel plates was determined dynamically with an absolute error of ≈0.6 mPa. Within the limits of experimental and theoretical errors, the results are in agreement with a theory that takes into account the finite conductivity and roughness of the two metals. The level of agreement between experiment and theory was then used to set limits on the predictions of extra-dimensional physics and thermal quantum field theory. It is shown that two theoretical approaches to the thermal Casimir force which predict effects linear in temperature are ruled out by these experiments. Finally, constraints on Yukawa corrections to Newton's law of gravity are strengthened by more than an order of magnitude in the range 56–330 nm.
The Propagation of Errors in Experimental Data Analysis: A Comparison of Pre- and Post-Test Designs
ERIC Educational Resources Information Center
Gorard, Stephen
2013-01-01
Experimental designs involving the randomization of cases to treatment and control groups are powerful and under-used in many areas of social science and social policy. This paper reminds readers of the pre- and post-test, and the post-test only, designs, before explaining briefly how measurement errors propagate according to error theory. The…
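The propagation step the paper reviews is elementary: independent pre- and post-test measurement errors add in variance, so gain scores are noisier than either measurement. A short illustration with invented standard errors:

```python
import numpy as np

# Textbook error propagation for a pre/post design: if pre- and post-test
# measurement errors are independent, the variance of the gain score is the
# sum of the two error variances. Numbers are illustrative.
se_pre, se_post = 3.0, 3.0                 # standard errors of measurement

se_gain = np.sqrt(se_pre**2 + se_post**2)  # SE of (post - pre)
print(f"SE of gain score: {se_gain:.2f}")  # ~4.24, about 41% larger

# A post-test-only comparison of randomized groups avoids compounding the
# pre-test error into every individual gain score.
```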
Enhancing Undergraduate Mathematics Curriculum via Coding Theory and Cryptography
ERIC Educational Resources Information Center
Aydin, Nuh
2009-01-01
The theory of error-correcting codes and cryptography are two relatively recent applications of mathematics to information and communication systems. The mathematical tools used in these fields generally come from algebra, elementary number theory, and combinatorics, including concepts from computational complexity. It is possible to introduce the…
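A classroom-sized example of the kind the author advocates: the Hamming(7,4) code, which corrects any single bit error using only linear algebra over GF(2).

```python
import numpy as np

# Hamming(7,4): encode 4 message bits into 7, flip one bit in transit, and
# correct it by matching the syndrome to a column of H. G and H are in
# standard form, so H @ G.T = 0 (mod 2) by construction.
G = np.array([[1,0,0,0,1,1,0],
              [0,1,0,0,1,0,1],
              [0,0,1,0,0,1,1],
              [0,0,0,1,1,1,1]])
H = np.array([[1,1,0,1,1,0,0],
              [1,0,1,1,0,1,0],
              [0,1,1,1,0,0,1]])

msg = np.array([1, 0, 1, 1])
codeword = msg @ G % 2
received = codeword.copy()
received[2] ^= 1                        # flip one bit in transit

syndrome = H @ received % 2             # nonzero syndrome flags the error
err_pos = next(j for j in range(7) if np.array_equal(H[:, j], syndrome))
received[err_pos] ^= 1                  # correct the flagged position
assert np.array_equal(received, codeword)
print("corrected position:", err_pos)
```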
Abbott, Richard L; Weber, Paul; Kelley, Betsy
2005-12-01
To review the history and current issues surrounding medical professional liability insurance and its relationship to medical error and healthcare risk management. Focused literature review and authors' experience. Medical professional liability insurance issues are reviewed in association with the occurrence of medical error and the role of healthcare risk management. The rising frequency and severity of claims and lawsuits incurred by physicians, as well as escalating defense costs, have dramatically increased over the past several years and have resulted in accelerated efforts to reduce medical errors and control practice risk for physicians. Medical error reduction and improved patient outcomes are closely linked to the goals of the medical risk manager by reducing exposure to adverse medical events. Management of professional liability risk by the physician-led malpractice insurance company not only protects the economic viability of physicians, but also addresses patient safety concerns. Physician-owned malpractice liability insurance companies will continue to be the dominant providers of insurance for practicing physicians and will serve as the primary source for loss prevention and risk management services. To succeed in the marketplace, the emergence and importance of the risk manager and incorporation of risk management principles throughout the professional liability company has become crucial to the financial stability and success of the insurance company. The risk manager provides the necessary advice and support requested by physicians to minimize medical liability risk in their daily practice.
An Introduction to SPEAR (Seismogram Picking Error from Analyst Review)
NASA Astrophysics Data System (ADS)
Zeiler, C. P.; Velasco, A. A.; Anderson, D.; Pingitore, N. E.
2008-12-01
A grassroots initiative began in February of 2008 at the University of Texas at El Paso to understand how seismologists measure earthquakes. The Seismogram Picking Error from Analyst Review (SPEAR) project is designed to be a forum where seismologists can propose, discuss and experimentally test theories on proper procedures to identify and measure seismic phases. We outline the history of seismogram analysis and explore areas of seismogram analysis that still need to be defined. The main concern for SPEAR, at this time, is the impact of picking errors produced by merging earthquake catalogs. Our initial effort has been to establish a common data set for seismologists to pick. The preliminary studies from this data set have shown that significant bias between authors of catalogs may exist. We provide techniques to ensure that these biases can be identified and correctly managed to provide accurate mergers of earthquake measurements. The overall goal of SPEAR is to provide a repository of information to aid seismologists in comparing and sharing measurements. We want to document in the repository and explore all aspects of the picking process, from the basics of learning how to read a seismogram to complex transformations and enhancements of signals. Your participation in SPEAR will aid the seismological community to close the knowledge gaps that exist in seismogram analysis.
ERIC Educational Resources Information Center
Caputi, Peter; Chan, Amy; Jayasuriya, Rohan
2011-01-01
This paper examined the impact of training strategies on the types of errors that novice users make when learning a commonly used spreadsheet application. Fifty participants were assigned to a counterfactual thinking training (CFT) strategy, an error management training strategy, or a combination of both strategies, and completed an easy task…
ERIC Educational Resources Information Center
Göktürk, Söheyda; Bozoglu, Oguzhan; Günçavdi, Gizem
2017-01-01
Purpose: Elements of national and organizational cultures can contribute much to the success of error management in organizations. Accordingly, this study aims to consider how errors were approached in two state university departments in Turkey in relation to their specific organizational and national cultures. Design/methodology/approach: The…
ERIC Educational Resources Information Center
Magno, Carlo
2009-01-01
The present report demonstrates the difference between classical test theory (CTT) and item response theory (IRT) approach using an actual test data for chemistry junior high school students. The CTT and IRT were compared across two samples and two forms of test on their item difficulty, internal consistency, and measurement errors. The specific…
Kim, Myoung-Soo; Kim, Jung-Soon; Jung, In Sook; Kim, Young Hae; Kim, Ho Jung
2007-03-01
The purpose of this study was to develop and evaluate an error reporting promoting program (ERPP) to systematically reduce the incidence rate of nursing errors in the operating room. A non-equivalent control group non-synchronized design was used. Twenty-six operating room nurses from one university hospital in Busan participated in this study. They were stratified into four groups according to their operating room experience and were allocated to the experimental and control groups using a matching method. The Mann-Whitney U test was used to analyze differences in pre- and post-intervention incidence rates of nursing errors between the two groups. The incidence rate of nursing errors decreased significantly in the experimental group, from 28.4% to 15.7%. By domain, the incidence rate decreased significantly in three domains ("compliance of aseptic technique", "management of document", "environmental management") in the experimental group, while it decreased in the control group, which used the ordinary error-reporting method. An error-reporting system makes it possible to share errors and learn from them. The ERPP was effective in reducing errors in recognition-related nursing activities. For more effective error prevention, this program should be applied together with risk management efforts across the whole health care system.
Vaskinn, Anja; Antonsen, Bjørnar T.; Fretland, Ragnhild A.; Dziobek, Isabel; Sundet, Kjetil; Wilberg, Theresa
2015-01-01
Although borderline personality disorder (BPD) and schizophrenia (SZ) are notably different mental disorders, they share problems in social cognition—or understanding the feelings, intentions and thoughts of other people. To date no studies have directly compared the social cognitive abilities of individuals with these two disorders. In this study, the social cognitive subdomain theory of mind was investigated in women with BPD (n = 25), women with SZ (n = 25) and healthy women (n = 25). An ecologically valid video-based measure (Movie for the Assessment of Social Cognition) was used. For the overall score, women with SZ performed markedly below both healthy women and women with BPD, whereas women with BPD did not perform significantly different compared to the healthy control group. A statistically significant error type × group interaction effect indicated that the groups differed with respect to kind of errors. Whereas women with BPD made mostly overmentalizing errors, women with SZ in addition committed undermentalizing errors. Our study suggests different magnitude and pattern of social cognitive problems in BPD and SZ. PMID:26379577
Decision Aids for Multiple-Decision Disease Management as Affected by Weather Input Errors
USDA-ARS?s Scientific Manuscript database
Many disease management decision support systems (DSS) rely, exclusively or in part, on weather inputs to calculate an indicator for disease hazard. Error in the weather inputs, typically due to forecasting, interpolation or estimation from off-site sources, may affect model calculations and manage...
Toward a Middle-Range Theory of Weight Management.
Pickett, Stephanie; Peters, Rosalind M; Jarosz, Patricia A
2014-07-01
The authors of this paper present the middle-range theory of weight management that focuses on cultural, environmental, and psychosocial factors that influence behaviors needed for weight control. The theory of weight management was developed deductively from Orem's theory of self-care, a constituent theory within the broader self-care deficit nursing theory and from research literature. Linkages between the conceptual and middle-range theory concepts are illustrated using a substruction model. The development of the theory of weight management serves to build nursing science by integrating extant nursing theory and empirical knowledge. This theory may help predict weight management in populations at risk for obesity-related disorders. © The Author(s) 2014.
Management of high-risk perioperative systems.
Dain, Steven
2006-06-01
The perioperative system is a complex system that requires people, materials, and processes to come together in a highly ordered and timely manner. However, when working in this high-risk system, even well-organized, knowledgeable, vigilant, and well-intentioned individuals will eventually make errors. All systems need to be evaluated on a continual basis to reduce the risk of errors, make errors more easily recognizable, and provide methods for error mitigation. A simple approach to risk management that may be applied in clinical medicine is discussed.
Toward Joint Hypothesis-Tests Seismic Event Screening Analysis: Ms|mb and Event Depth
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Dale; Selby, Neil
2012-08-14
Well-established theory can be used to combine single-phenomenology hypothesis tests into a multi-phenomenology event screening hypothesis test (Fisher's and Tippett's tests). The standard error commonly used in the Ms:mb event screening hypothesis test is not fully consistent with the physical basis. An improved standard error agrees better with the physical basis: it correctly partitions error so that model error is included as a component of variance, and it correctly reduces station noise variance through network averaging. For the 2009 DPRK test, the commonly used standard error 'rejects' H0 even with a better scaling slope (β = 1, Selby et al.), whereas the improved standard error 'fails to reject' H0.
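Both combination rules are easy to state in code. Fisher's statistic is -2·Σ ln pᵢ, chi-square with 2k degrees of freedom under H0 (for even degrees of freedom the survival function has a closed form); Tippett's test rejects when the smallest p-value falls below 1 - (1 - α)^(1/k). The p-values below are invented for illustration.

```python
import math

# Combining k single-phenomenology screening p-values.
p_values = [0.04, 0.20]            # e.g. an Ms:mb screen and a depth screen
k = len(p_values)

# Fisher's method: X = -2 * sum(ln p_i) ~ chi-square with 2k dof under H0.
# For even dof the survival function is exp(-X/2) * sum_{i<k} (X/2)^i / i!.
X = -2.0 * sum(math.log(p) for p in p_values)
p_fisher = math.exp(-X / 2) * sum((X / 2)**i / math.factorial(i)
                                  for i in range(k))

# Tippett's method: reject if min(p_i) < 1 - (1 - alpha)**(1/k).
alpha = 0.05
tippett_cut = 1 - (1 - alpha)**(1 / k)
print(f"Fisher combined p = {p_fisher:.4f}")
print(f"Tippett: min p = {min(p_values):.3f} vs cutoff {tippett_cut:.4f}")
```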
NASA Astrophysics Data System (ADS)
Wang, Biao; Yu, Xiaofen; Li, Qinzhao; Zheng, Yu
2008-10-01
Addressing the influence of round-grating dividing error, rolling-wheel eccentricity, and surface shape errors, this paper provides a correction method based on the rolling wheel that yields a composite error model incorporating all of the influence factors above, and then corrects the non-circular angle measurement error of the rolling wheel. Software simulation and experiment verify the method; the results indicate that the composite error correction method can improve diameter measurement accuracy based on rolling-wheel theory. It has wide application prospects for measurement accuracy better than 5 μm/m.
Linearizing feedforward/feedback attitude control
NASA Technical Reports Server (NTRS)
Paielli, Russell A.; Bach, Ralph E.
1991-01-01
An approach to attitude control theory is introduced in which a linear form is postulated for the closed-loop rotation error dynamics, and then the exact control law required to realize it is derived. The nonminimal (four-component) quaternion form is used for attitude because it is globally nonsingular, but the minimal (three-component) quaternion form is used for attitude error because it has no nonlinear constraints to prevent the rotational error dynamics from being linearized, and the definition of the attitude error is based on quaternion algebra. This approach produces an attitude control law that linearizes the closed-loop rotational error dynamics exactly, without any attitude singularities, even if the control errors become large.
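The error definition is compact in code: keep the full four-component unit quaternion for attitude, and take the vector part of the quaternion product conj(q_des)·q as the minimal three-component error that the control law then linearizes. The gain and target below are arbitrary, not the paper's law.

```python
import numpy as np

# Four-component quaternions for attitude, three-component vector part of
# q_err = conj(q_des) * q as the minimal attitude error (singularity-free
# for attitude, unconstrained for small errors).
def qmul(a, b):
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def qconj(q):
    return q * np.array([1.0, -1.0, -1.0, -1.0])

q_des = np.array([1.0, 0.0, 0.0, 0.0])            # target attitude
angle = np.deg2rad(10.0)                          # 10° about the x-axis
q = np.array([np.cos(angle/2), np.sin(angle/2), 0.0, 0.0])

q_err = qmul(qconj(q_des), q)
e = q_err[1:]                  # minimal 3-component attitude error
torque = -2.0 * e              # e.g. the proportional term of a control law
print("attitude error vector:", np.round(e, 4), "torque:", torque)
```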
Automated Classification of Phonological Errors in Aphasic Language
Ahuja, Sanjeev B.; Reggia, James A.; Berndt, Rita S.
1984-01-01
Using heuristically-guided state space search, a prototype program has been developed to simulate and classify phonemic errors occurring in the speech of neurologically-impaired patients. Simulations are based on an interchangeable rule/operator set of elementary errors which represent a theory of phonemic processing faults. This work introduces and evaluates a novel approach to error simulation and classification, it provides a prototype simulation tool for neurolinguistic research, and it forms the initial phase of a larger research effort involving computer modelling of neurolinguistic processes.
Archer, Steven M.
2007-01-01
Purpose Ordinary spherocylindrical refractive errors have been recognized as a cause of monocular diplopia for over a century, yet explanation of this phenomenon using geometrical optics has remained problematic. This study tests the hypothesis that the diffraction theory treatment of refractive errors will provide a more satisfactory explanation of monocular diplopia. Methods Diffraction theory calculations were carried out for modulation transfer functions, point spread functions, and line spread functions under conditions of defocus, astigmatism, and mixed spherocylindrical refractive errors. Defocused photographs of inked and projected black lines were made to demonstrate the predicted consequences of the theoretical calculations. Results For certain amounts of defocus, line spread functions resulting from spherical defocus are predicted to have a bimodal intensity distribution that could provide the basis for diplopia with line targets. Multimodal intensity distributions are predicted in point spread functions and provide a basis for diplopia or polyopia of point targets under conditions of astigmatism. The predicted doubling effect is evident in defocused photographs of black lines, but the effect is not as robust as the subjective experience of monocular diplopia. Conclusions Monocular diplopia due to ordinary refractive errors can be predicted from diffraction theory. Higher-order aberrations—such as spherical aberration—are not necessary but may, under some circumstances, enhance the features of monocular diplopia. The physical basis for monocular diplopia is relatively subtle, and enhancement by neural processing is probably needed to account for the robustness of the percept. PMID:18427616
Managing human fallibility in critical aerospace situations
NASA Astrophysics Data System (ADS)
Tew, Larry
2014-11-01
Human fallibility is pervasive in the aerospace industry, with over 50% of errors attributed to human error. Consider the benefits to any organization if those errors were significantly reduced. Aerospace manufacturing involves high-value, high-profile systems with significant complexity and often repetitive build, assembly, and test operations. In spite of extensive analysis, planning, training, and detailed procedures, human factors can cause unexpected errors. Handling such errors involves extensive cause and corrective action analysis and invariably brings schedule slips and cost growth. We will discuss success stories, including those associated with electro-optical systems, where very significant reductions in human fallibility errors were achieved after personnel received adapted and specialized training. In the eyes of company and customer leadership, the steps used to achieve these results led to a major culture change in both the workforce and the supporting management organization. This approach has proven effective in other industries such as medicine, firefighting, law enforcement, and aviation. The roadmap to success and the steps to minimize human error are known. They can be used by any organization willing to accept human fallibility and take a proactive approach to incorporating the steps needed to manage and minimize error.
Analyzing human errors in flight mission operations
NASA Technical Reports Server (NTRS)
Bruno, Kristin J.; Welz, Linda L.; Barnes, G. Michael; Sherif, Josef
1993-01-01
A long-term program is in progress at JPL to reduce cost and risk of flight mission operations through a defect prevention/error management program. The main thrust of this program is to create an environment in which the performance of the total system, both the human operator and the computer system, is optimized. To this end, 1580 Incident Surprise Anomaly reports (ISA's) from 1977-1991 were analyzed from the Voyager and Magellan projects. A Pareto analysis revealed that 38 percent of the errors were classified as human errors. A preliminary cluster analysis based on the Magellan human errors (204 ISA's) is presented here. The resulting clusters described the underlying relationships among the ISA's. Initial models of human error in flight mission operations are presented. Next, the Voyager ISA's will be scored and included in the analysis. Eventually, these relationships will be used to derive a theoretically motivated and empirically validated model of human error in flight mission operations. Ultimately, this analysis will be used to make continuous process improvements to end-user applications and training requirements. This Total Quality Management approach will enable the management and prevention of errors in the future.
Distributed consensus for discrete-time heterogeneous multi-agent systems
NASA Astrophysics Data System (ADS)
Zhao, Huanyu; Fei, Shumin
2018-06-01
This paper studies the consensus problem for a class of discrete-time heterogeneous multi-agent systems. Two kinds of consensus algorithms will be considered. The heterogeneous multi-agent systems considered are converted into equivalent error systems by a model transformation. Then we analyse the consensus problem of the original systems by analysing the stability problem of the error systems. Some sufficient conditions for consensus of heterogeneous multi-agent systems are obtained by applying algebraic graph theory and matrix theory. Simulation examples are presented to show the usefulness of the results.
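The flavor of the analysis can be seen in the simplest case: a single-integrator consensus iteration whose error system e = x - xbar is driven to zero on a connected graph; the paper's heterogeneous, mixed-order dynamics generalize this sketch.

```python
import numpy as np

# Toy discrete-time consensus iteration x(k+1) = x(k) - eps * L x(k) on a
# fixed undirected graph; the error relative to the average converges to
# zero, which is the stability property analysed via the error system.
A = np.array([[0, 1, 0, 1],       # adjacency of a 4-agent ring
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]])
L = np.diag(A.sum(axis=1)) - A    # graph Laplacian
eps = 0.2                         # step size; must satisfy eps < 1/max_degree

x = np.array([4.0, -1.0, 2.5, 0.5])
for _ in range(100):
    x = x - eps * (L @ x)

print("consensus state:", np.round(x, 4))   # all ≈ the initial average 1.5
```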
ERIC Educational Resources Information Center
James, David E.; Schraw, Gregory; Kuch, Fred
2015-01-01
We present an equation, derived from standard statistical theory, that can be used to estimate sampling margin of error for student evaluations of teaching (SETs). We use the equation to examine the effect of sample size, response rates and sample variability on the estimated sampling margin of error, and present results in four tables that allow…
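The derived equation is not reproduced in the truncated abstract; a standard version of the calculation, with the finite-population correction that matters when a large share of a small class responds, looks like this (inputs invented):

```python
import math

# Sampling margin of error for a class of N students where n respond,
# with the finite-population correction. Inputs are illustrative, and the
# paper's exact equation may differ in detail.
def sem_margin(sd, n, N, z=1.96):
    fpc = math.sqrt((N - n) / (N - 1))        # finite-population correction
    return z * (sd / math.sqrt(n)) * fpc

N, n, sd = 40, 25, 0.9                        # class size, responses, SD
print(f"margin of error: ±{sem_margin(sd, n, N):.2f} rating points")
```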
Davis, Stephen Jerome; Hurtado, Josephine; Nguyen, Rosemary; Huynh, Tran; Lindon, Ivan; Hudnall, Cedric; Bork, Sara
2017-01-01
Background: USP <797> regulatory requirements have mandated that pharmacies improve aseptic techniques and cleanliness of the medication preparation areas. In addition, the Institute for Safe Medication Practices (ISMP) recommends that technology and automation be used as much as possible for preparing and verifying compounded sterile products. Objective: To determine the benefits associated with the implementation of the workflow management system, such as reducing medication preparation and delivery errors, reducing quantity and frequency of medication errors, avoiding costs, and enhancing the organization's decision to move toward positive patient identification (PPID). Methods: At Texas Children's Hospital, data were collected and analyzed from January 2014 through August 2014 in the pharmacy areas in which the workflow management system would be implemented. Data were excluded for September 2014 during the workflow management system oral liquid implementation phase. Data were collected and analyzed from October 2014 through June 2015 to determine whether the implementation of the workflow management system reduced the quantity and frequency of reported medication errors. Data collected and analyzed during the study period included the quantity of doses prepared, number of incorrect medication scans, number of doses discontinued from the workflow management system queue, and the number of doses rejected. Data were collected and analyzed to identify patterns of incorrect medication scans, to determine reasons for rejected medication doses, and to determine the reduction in wasted medications. Results: During the 17-month study period, the pharmacy department dispensed 1,506,220 oral liquid and injectable medication doses. From October 2014 through June 2015, the pharmacy department dispensed 826,220 medication doses that were prepared and checked via the workflow management system. Of those 826,220 medication doses, there were 16 reported incorrect volume errors. The error rate after the implementation of the workflow management system averaged 8.4%, which was a 1.6% reduction. After the implementation of the workflow management system, the average number of reported oral liquid medication and injectable medication errors decreased to 0.4 and 0.2 times per week, respectively. Conclusion: The organization was able to achieve its purpose and goal of improving the provision of quality pharmacy care through optimal medication use and safety by reducing medication preparation errors. Error rates decreased and the workflow processes were streamlined, which has led to seamless operations within the pharmacy department. There has been significant cost avoidance and waste reduction and enhanced interdepartmental satisfaction due to the reduction of reported medication errors.
Soft systems thinking and social learning for adaptive management.
Cundill, G; Cumming, G S; Biggs, D; Fabricius, C
2012-02-01
The success of adaptive management in conservation has been questioned and the objective-based management paradigm on which it is based has been heavily criticized. Soft systems thinking and social-learning theory expose errors in the assumption that complex systems can be dispassionately managed by objective observers and highlight the fact that conservation is a social process in which objectives are contested and learning is context dependent. We used these insights to rethink adaptive management in a way that focuses on the social processes involved in management and decision making. Our approach to adaptive management is based on the following assumptions: action toward a common goal is an emergent property of complex social relationships; the introduction of new knowledge, alternative values, and new ways of understanding the world can become a stimulating force for learning, creativity, and change; learning is contextual and is fundamentally about practice; and defining the goal to be addressed is continuous and in principle never ends. We believe five key activities are crucial to defining the goal that is to be addressed in an adaptive-management context and to determining the objectives that are desirable and feasible to the participants: situate the problem in its social and ecological context; raise awareness about alternative views of a problem and encourage enquiry and deconstruction of frames of reference; undertake collaborative actions; and reflect on learning. ©2011 Society for Conservation Biology.
Effective field theory approach to heavy quark fragmentation
Fickinger, Michael; Fleming, Sean; Kim, Chul; ...
2016-11-17
Using an approach based on Soft Collinear Effective Theory (SCET) and Heavy Quark Effective Theory (HQET), we determine the b-quark fragmentation function from electron-positron annihilation data at the Z-boson peak at next-to-next-to leading order with next-to-next-to leading log resummation of DGLAP logarithms, and next-to-next-to-next-to leading log resummation of endpoint logarithms. This analysis improves, by one order, the previous extraction of the b-quark fragmentation function. We find that while the addition of the next order in the calculation does not much shift the extracted form of the fragmentation function, it does reduce theoretical errors, indicating that the expansion is converging. Using an approach based on effective field theory allows us to systematically control theoretical errors. Furthermore, while the fits of theory to data are generally good, the fits seem to be hinting that higher-order corrections from HQET may be needed to explain the b-quark fragmentation function at smaller values of momentum fraction.
The influence of the structure and culture of medical group practices on prescription drug errors.
Kralewski, John E; Dowd, Bryan E; Heaton, Alan; Kaissi, Amer
2005-08-01
This project was designed to identify the magnitude of prescription drug errors in medical group practices and to explore the influence of the practice structure and culture on those error rates. Seventy-eight practices serving an upper Midwest managed care (Care Plus) plan during 2001 were included in the study. Using Care Plus claims data, prescription drug error rates were calculated at the enrollee level and then were aggregated to the group practice that each enrollee selected to provide and manage their care. Practice structure and culture data were obtained from surveys of the practices. Data were analyzed using multivariate regression. Both the culture and the structure of these group practices appear to influence prescription drug error rates. Seeing more patients per clinic hour, more prescriptions per patient, and being cared for in a rural clinic were all strongly associated with more errors. Conversely, having a case manager program is strongly related to fewer errors in all of our analyses. The culture of the practices clearly influences error rates, but the findings are mixed. Practices with cohesive cultures have lower error rates but, contrary to our hypothesis, cultures that value physician autonomy and individuality also have lower error rates than those with a more organizational orientation. Our study supports the contention that there are a substantial number of prescription drug errors in the ambulatory care sector. Even by the strictest definition, there were about 13 errors per 100 prescriptions for Care Plus patients in these group practices during 2001. Our study demonstrates that the structure of medical group practices influences prescription drug error rates. In some cases, this appears to be a direct relationship, such as the effects of having a case manager program on fewer drug errors, but in other cases the effect appears to be indirect through the improvement of drug prescribing practices. An important aspect of this study is that it provides insights into the relationships of the structure and culture of medical group practices and prescription drug errors and provides direction for future research. Research focused on the factors influencing the high error rates in rural areas and how the interaction of practice structural and cultural attributes influence error rates would add important insights into our findings. For medical practice directors, our data show that they should focus on patient care coordination to reduce errors.
NASA Astrophysics Data System (ADS)
Bajaj, Akash; Janet, Jon Paul; Kulik, Heather J.
2017-11-01
The flat-plane condition is the union of two exact constraints in electronic structure theory: (i) energetic piecewise linearity with fractional electron removal or addition and (ii) invariant energetics with change in electron spin in a half-filled orbital. Semi-local density functional theory (DFT) fails to recover the flat plane, exhibiting convex fractional charge errors (FCE) and concave fractional spin errors (FSE) that are related to delocalization and static correlation errors. We previously showed that DFT+U eliminates FCE but now demonstrate that, like other widely employed corrections (i.e., Hartree-Fock exchange), it worsens FSE. To find an alternative strategy, we examine the shape of semi-local DFT deviations from the exact flat plane and find this shape to be remarkably consistent across ions and molecules. We introduce the judiciously modified DFT (jmDFT) approach, wherein corrections are constructed from few-parameter, low-order functional forms that fit the shape of semi-local DFT errors. We select one such physically intuitive form and incorporate it self-consistently to correct semi-local DFT. We demonstrate on model systems that jmDFT represents the first easy-to-implement, no-overhead approach to recovering the flat plane from semi-local DFT.
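For reference, condition (i) is the exact piecewise-linearity constraint on the energy at fractional electron number (standard form, generic notation):

    E(N_0 + \omega) = (1 - \omega)\,E(N_0) + \omega\,E(N_0 + 1), \qquad 0 \le \omega \le 1

Condition (ii) requires the energy to be invariant as the spin of a half-filled orbital is fractionally flipped at fixed electron number. Semi-local functionals violate (i) convexly (FCE) and (ii) concavely (FSE), and that deviation shape is what the jmDFT functional forms are fitted to.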
Bledsoe, Sarah; Van Buskirk, Alex; Falconer, R James; Hollon, Andrew; Hoebing, Wendy; Jokic, Sladan
2018-02-01
This study evaluated the effectiveness of barcode-assisted medication preparation (BCMP) technology in detecting oral liquid dose preparation errors. From June 1, 2013, through May 31, 2014, a total of 178,344 oral doses were processed at Children's Mercy, a 301-bed pediatric hospital, through an automated workflow management system. Doses containing errors detected by the system's barcode scanning system or classified as rejected by the pharmacist were further reviewed. Errors intercepted by the barcode-scanning system were classified as (1) expired product, (2) incorrect drug, (3) incorrect concentration, and (4) technological error. Pharmacist-rejected doses were categorized into 6 categories based on the root cause of the preparation error: (1) expired product, (2) incorrect concentration, (3) incorrect drug, (4) incorrect volume, (5) preparation error, and (6) other. Of the 178,344 doses examined, 3,812 (2.1%) errors were detected by either the barcode-assisted scanning system (1.8%, n = 3,291) or a pharmacist (0.3%, n = 521). The 3,291 errors prevented by the barcode-assisted system were classified most commonly as technological error and incorrect drug, followed by incorrect concentration and expired product. The 521 errors detected by pharmacists were most often classified as incorrect volume, preparation error, expired product, other, incorrect drug, and incorrect concentration. BCMP technology detected errors in 1.8% of pediatric oral liquid medication doses prepared in an automated workflow management system, with errors most commonly attributed to technological problems or incorrect drugs. Pharmacists rejected an additional 0.3% of studied doses. Copyright © 2018 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
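The reported rates follow directly from the counts given in the abstract; a quick arithmetic check in Python:

    # Numbers taken from the abstract above.
    total_doses = 178_344
    barcode_errors = 3_291
    pharmacist_rejected = 521
    print(round(100 * barcode_errors / total_doses, 1))                       # 1.8
    print(round(100 * pharmacist_rejected / total_doses, 1))                  # 0.3
    print(round(100 * (barcode_errors + pharmacist_rejected) / total_doses, 1))  # 2.1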
Error estimation in the neural network solution of ordinary differential equations.
Filici, Cristian
2010-06-01
In this article a method of error estimation for the neural approximation of the solution of an Ordinary Differential Equation is presented. Some examples of the application of the method support the theory presented. Copyright 2010. Published by Elsevier Ltd.
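The abstract does not reproduce the estimator itself; as a hedged illustration of the general idea, the ODE residual of an approximate solution can serve as an a posteriori error indicator (the "network" below is a stand-in function, not a trained model):

    import numpy as np

    # Stand-in for a trained neural approximation of y' = -y, y(0) = 1 on [0, 2]:
    # the exact solution with a small synthetic defect added.
    t = np.linspace(0.0, 2.0, 201)
    y_approx = np.exp(-t) * (1.0 + 1e-3 * np.sin(5.0 * t))

    # A posteriori indicator: the ODE residual r(t) = y' + y of the approximation,
    # integrated over time as a crude measure of accumulated solution error.
    resid = np.gradient(y_approx, t) + y_approx
    err_indicator = np.trapz(np.abs(resid), t)
    print(f"integrated residual: {err_indicator:.2e}")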
Donn, Steven M; McDonnell, William M
2012-01-01
The Institute of Medicine has recommended a change in culture from "name and blame" to patient safety. This will require system redesign to identify and address errors, establish performance standards, and set safety expectations. This approach, however, is at odds with the present medical malpractice (tort) system. The current system is outcomes-based, meaning that health care providers and institutions are often sued despite providing appropriate care. Nevertheless, the focus should remain to provide the safest patient care. Effective peer review may be hindered by the present tort system. Reporting of medical errors is a key piece of peer review and education, and both anonymous reporting and confidential reporting of errors have potential disadvantages. Diagnostic and treatment errors continue to be the leading sources of allegations of malpractice in pediatrics, and the neonatal intensive care unit is uniquely vulnerable. Most errors result from systems failures rather than human error. Risk management can be an effective process to identify, evaluate, and address problems that may injure patients, lead to malpractice claims, and result in financial losses. Risk management identifies risk or potential risk, calculates the probability of an adverse event arising from a risk, estimates the impact of the adverse event, and attempts to control the risk. Implementation of a successful risk management program requires a positive attitude, sufficient knowledge base, and a commitment to improvement. Transparency in the disclosure of medical errors and a strategy of prospective risk management in dealing with medical errors may result in a substantial reduction in medical malpractice lawsuits, lower litigation costs, and a more safety-conscious environment. Thieme Medical Publishers, Inc.
Common errors in multidrug-resistant tuberculosis management.
Monedero, Ignacio; Caminero, Jose A
2014-02-01
Multidrug-resistant tuberculosis (MDR-TB), defined as resistance to at least rifampicin and isoniazid, has an increasing burden and threatens TB control. Diagnosis is limited and usually delayed, while treatment is long-lasting, toxic, and poorly effective. MDR-TB management in scarce-resource settings is demanding; however, it is feasible and extremely necessary. In these settings, cure rates do not usually exceed 60-70%, and MDR-TB management is novel for many TB programs. In this challenging scenario, both clinical and programmatic errors are likely to occur. The majority of these errors may be prevented or alleviated with appropriate and timely training, in addition to uninterrupted procurement of high-quality drugs, updated national guidelines and laws, and an overall improvement in management capacities. Until new tools for diagnosis and shorter, less toxic treatments become available in developing countries, MDR-TB management will remain complex in scarce-resource settings. Focusing special attention on the common errors in diagnosis, regimen design, and especially treatment delivery may benefit patients and programs working with current, outdated tools. The present article is a compilation of typical errors repeatedly observed by the authors in a wide range of countries during technical assistance missions and trainings.
Birken, Sarah A; DiMartino, Lisa D; Kirk, Meredith A; Lee, Shoou-Yih D; McClelland, Mark; Albert, Nancy M
2016-01-04
The theory of middle managers' role in implementing healthcare innovations hypothesized that middle managers influence implementation effectiveness by fulfilling four roles: diffusing information, synthesizing information, mediating between strategy and day-to-day activities, and selling innovation implementation. The theory also suggested several activities in which middle managers might engage to fulfill the four roles. The extent to which the theory aligns with middle managers' experience in practice is unclear. We surveyed middle managers (n = 63) who attended a nursing innovation summit to (1) assess alignment between the theory and middle managers' experience in practice and (2) elaborate on the theory with examples from middle managers' experience overseeing innovation implementation in practice. Middle managers rated all four of the theory's hypothesized roles as "extremely important" but ranked diffusing and synthesizing information as the most important and selling innovation implementation as the least important. They reported engaging in several activities that were consistent with the theory's hypothesized roles and activities, such as diffusing information via meetings and training. They also reported engaging in activities not described in the theory, such as appraising employee performance. Middle managers' experience aligned well with the theory and expanded the definitions of the roles and activities that it hypothesized. Future studies should assess the relationship between the hypothesized roles and the effectiveness with which innovations are implemented in practice. If evidence supports the theory, it should be leveraged to promote the fulfillment of the hypothesized roles among middle managers; doing so may promote innovation implementation.
Modelling Transposition Latencies: Constraints for Theories of Serial Order Memory
ERIC Educational Resources Information Center
Farrell, Simon; Lewandowsky, Stephan
2004-01-01
Several competing theories of short-term memory can explain serial recall performance at a quantitative level. However, most theories to date have not been applied to the accompanying pattern of response latencies, thus ignoring a rich and highly diagnostic aspect of performance. This article explores and tests the error latency predictions of…
Sample Size for Estimation of G and Phi Coefficients in Generalizability Theory
ERIC Educational Resources Information Center
Atilgan, Hakan
2013-01-01
Problem Statement: Reliability, which refers to the degree to which measurement results are free from measurement errors, as well as its estimation, is an important issue in psychometrics. Several methods for estimating reliability have been suggested by various theories in the field of psychometrics. One of these theories is the generalizability…
de Bock, Élodie; Hardouin, Jean-Benoit; Blanchin, Myriam; Le Neel, Tanguy; Kubis, Gildas; Bonnaud-Antignac, Angélique; Dantan, Étienne; Sébille, Véronique
2016-10-01
The objective was to compare classical test theory and Rasch-family models derived from item response theory for the analysis of longitudinal patient-reported outcomes data with possibly informative intermittent missing items. A simulation study was performed in order to assess and compare the performance of classical test theory and the Rasch model in terms of bias, control of the type I error, and power of the test of time effect. The type I error was controlled for both classical test theory and the Rasch model whether data were complete or some items were missing. Both methods were unbiased and displayed similar power with complete data. When items were missing, the Rasch model remained unbiased and displayed higher power than classical test theory. The Rasch model performed better than the classical test theory approach for the analysis of longitudinal patient-reported outcomes with possibly informative intermittent missing items, particularly in terms of power. This study highlights the value of Rasch-based models in clinical research and epidemiology for the analysis of incomplete patient-reported outcomes data. © The Author(s) 2013.
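For reference, the dichotomous Rasch model underlying the comparison has the standard form

    P(X_{ij} = 1 \mid \theta_i, b_j) = \frac{\exp(\theta_i - b_j)}{1 + \exp(\theta_i - b_j)}

where theta_i is person i's latent trait level and b_j is item j's difficulty. Because the model conditions on item parameters, responses to whichever items are present still carry information about theta_i, whereas classical test theory works with sum scores, which is one intuition for why intermittent missing items cost it more power.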
1991-02-01
The theory orients command leadership for the enormous task of managing organizations in an environment fraught with volatility, uncertainty...performance and organizational ethics. [Outline fragment recovered from the report's front matter: A THEORY OF MANAGEMENT: BACKGROUND (Definitions, History); BASIC MANAGEMENT FUNCTIONS (Planning); BEHAVIORAL ASPECTS (Leadership).] ...the best way to manage in their theory of managerial leadership. To them, the 9,9 position on their model "is acknowledged by managers as the
ERIC Educational Resources Information Center
Li, Feifei
2017-01-01
An information-correction method for testlet-based tests is introduced. This method takes advantage of both generalizability theory (GT) and item response theory (IRT). The measurement error for the examinee proficiency parameter is often underestimated when a unidimensional conditional-independence IRT model is specified for a testlet dataset. By…
Analyzing the errors of DFT approximations for compressed water systems
NASA Astrophysics Data System (ADS)
Alfè, D.; Bartók, A. P.; Csányi, G.; Gillan, M. J.
2014-07-01
We report an extensive study of the errors of density functional theory (DFT) approximations for compressed water systems. The approximations studied are based on the widely used PBE and BLYP exchange-correlation functionals, and we characterize their errors before and after correction for 1- and 2-body errors, the corrections being performed using the methods of Gaussian approximation potentials. The errors of the uncorrected and corrected approximations are investigated for two related types of water system: first, the compressed liquid at temperature 420 K and density 1.245 g/cm3 where the experimental pressure is 15 kilobars; second, thermal samples of compressed water clusters from the trimer to the 27-mer. For the liquid, we report four first-principles molecular dynamics simulations, two generated with the uncorrected PBE and BLYP approximations and a further two with their 1- and 2-body corrected counterparts. The errors of the simulations are characterized by comparing with experimental data for the pressure, with neutron-diffraction data for the three radial distribution functions, and with quantum Monte Carlo (QMC) benchmarks for the energies of sets of configurations of the liquid in periodic boundary conditions. The DFT errors of the configuration samples of compressed water clusters are computed using QMC benchmarks. We find that the 2-body and beyond-2-body errors in the liquid are closely related to similar errors exhibited by the clusters. For both the liquid and the clusters, beyond-2-body errors of DFT make a substantial contribution to the overall errors, so that correction for 1- and 2-body errors does not suffice to give a satisfactory description. For BLYP, a recent representation of 3-body energies due to Medders, Babin, and Paesani [J. Chem. Theory Comput. 9, 1103 (2013)] gives a reasonably good way of correcting for beyond-2-body errors, after which the remaining errors are typically 0.5 mEh ≃ 15 meV/monomer for the liquid and the clusters.
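The 1- and 2-body corrections refer to the many-body expansion of the total energy over monomers (standard form):

    E_{\mathrm{tot}} = \sum_i E^{(1)}_i \;+\; \sum_{i<j} E^{(2)}_{ij} \;+\; \sum_{i<j<k} E^{(3)}_{ijk} \;+\; \cdots

Correcting DFT at the 1- and 2-body levels repairs the first two sums against benchmarks; the paper's finding is that the residual beyond-2-body terms still carry substantial DFT error, which the quoted 3-body representation partly removes.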
Identification of dynamic systems, theory and formulation
NASA Technical Reports Server (NTRS)
Maine, R. E.; Iliff, K. W.
1985-01-01
The problem of estimating parameters of dynamic systems is addressed in order to present the theoretical basis of system identification and parameter estimation in a manner that is complete and rigorous, yet understandable with minimal prerequisites. Maximum likelihood and related estimators are highlighted. The approach used requires familiarity with calculus, linear algebra, and probability, but does not require knowledge of stochastic processes or functional analysis. The treatment emphasizes unification of the various areas of estimation; estimation in dynamic systems is treated as a direct outgrowth of static system theory. Topics covered include basic concepts and definitions; numerical optimization methods; probability; statistical estimators; estimation in static systems; stochastic processes; state estimation in dynamic systems; output error, filter error, and equation error methods of parameter estimation in dynamic systems; and the accuracy of the estimates.
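As a hedged sketch of the output-error method listed here (a minimal illustration, not the report's formulation), parameters of a simple dynamic model can be estimated by minimizing the sum of squared output residuals:

    import numpy as np
    from scipy.optimize import minimize

    # Simulated first-order system x' = a*x + b*u with unknown (a, b),
    # measured with additive noise.
    dt, n = 0.01, 500
    u = np.ones(n)

    def simulate(a, b):
        x = np.zeros(n)
        for k in range(n - 1):
            x[k + 1] = x[k] + dt * (a * x[k] + b * u[k])  # Euler integration
        return x

    rng = np.random.default_rng(0)
    z = simulate(-2.0, 1.0) + 0.01 * rng.standard_normal(n)  # noisy measurements

    # Output-error method: pick parameters minimizing the squared output residuals.
    cost = lambda p: np.sum((z - simulate(p[0], p[1])) ** 2)
    print(minimize(cost, x0=[-1.0, 0.5], method="Nelder-Mead").x)  # ~ [-2.0, 1.0]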
The 6th International Conference on Computer Science and Computational Mathematics (ICCSCM 2017)
NASA Astrophysics Data System (ADS)
2017-09-01
The ICCSCM 2017 (The 6th International Conference on Computer Science and Computational Mathematics) has aimed to provide a platform to discuss computer science and mathematics related issues including Algebraic Geometry, Algebraic Topology, Approximation Theory, Calculus of Variations, Category Theory; Homological Algebra, Coding Theory, Combinatorics, Control Theory, Cryptology, Geometry, Difference and Functional Equations, Discrete Mathematics, Dynamical Systems and Ergodic Theory, Field Theory and Polynomials, Fluid Mechanics and Solid Mechanics, Fourier Analysis, Functional Analysis, Functions of a Complex Variable, Fuzzy Mathematics, Game Theory, General Algebraic Systems, Graph Theory, Group Theory and Generalizations, Image Processing, Signal Processing and Tomography, Information Fusion, Integral Equations, Lattices, Algebraic Structures, Linear and Multilinear Algebra; Matrix Theory, Mathematical Biology and Other Natural Sciences, Mathematical Economics and Financial Mathematics, Mathematical Physics, Measure Theory and Integration, Neutrosophic Mathematics, Number Theory, Numerical Analysis, Operations Research, Optimization, Operator Theory, Ordinary and Partial Differential Equations, Potential Theory, Real Functions, Rings and Algebras, Statistical Mechanics, Structure of Matter, Topological Groups, Wavelets and Wavelet Transforms, 3G/4G Network Evolutions, Ad-Hoc, Mobile, Wireless Networks and Mobile Computing, Agent Computing & Multi-Agent Systems, all topics related to Image/Signal Processing, any topics related to Computer Networks, any topics related to ISO SC-27 and SC-17 standards, any topics related to PKI (Public Key Infrastructures), Artificial Intelligence (A.I.) & Pattern/Image Recognition, Authentication/Authorization Issues, Biometric Authentication and Algorithms, CDMA/GSM Communication Protocols, Combinatorics, Graph Theory, and Analysis of Algorithms, Cryptography and Foundations of Computer Security, Data Base (D.B.) Management & Information Retrievals, Data Mining, Web Image Mining, & Applications, Defining Spectrum Rights and Open Spectrum Solutions, E-Commerce, Ubiquitous, RFID Applications, Fingerprint/Hand/Biometrics Recognitions and Technologies, Foundations of High-Performance Computing, IC-card Security, OTP, and Key Management Issues, IDS/Firewall, Anti-Spam Mail, Anti-Virus Issues, Mobile Computing for E-Commerce, Network Security Applications, Neural Networks and Biomedical Simulations, Quality of Service and Communication Protocols, Quantum Computing, Coding, and Error Controls, Satellite and Optical Communication Systems, Theory of Parallel Processing and Distributed Computing, Virtual Visions, 3-D Object Retrievals, & Virtual Simulations, Wireless Access Security, etc. The success of ICCSCM 2017 is reflected in the papers received from authors in several countries around the world, which allowed a highly multinational and multicultural exchange of ideas and experience. The accepted papers of ICCSCM 2017 are published in this book. Please check http://www.iccscm.com for further news. A conference such as ICCSCM 2017 can only become successful through a team effort, so we want to thank the International Technical Committee and the Reviewers for their efforts in the review process as well as their valuable advice. We are thankful to all those who contributed to the success of ICCSCM 2017. The Secretary
Effects of Crew Resource Management Training on Medical Errors in a Simulated Prehospital Setting
ERIC Educational Resources Information Center
Carhart, Elliot D.
2012-01-01
This applied dissertation investigated the effect of crew resource management (CRM) training on medical errors in a simulated prehospital setting. Specific areas addressed by this program included situational awareness, decision making, task management, teamwork, and communication. This study is believed to be the first investigation of CRM…
Environmental cost of using poor decision metrics to prioritize environmental projects.
Pannell, David J; Gibson, Fiona L
2016-04-01
Conservation decision makers commonly use project-scoring metrics that are inconsistent with theory on optimal ranking of projects. As a result, there may often be a loss of environmental benefits. We estimated the magnitudes of these losses for various metrics that deviate from theory in ways that are common in practice. These metrics included cases where relevant variables were omitted from the benefits metric, project costs were omitted, and benefits were calculated using a faulty functional form. We estimated distributions of parameters from 129 environmental projects from Australia, New Zealand, and Italy for which detailed analyses had been completed previously. The cost of using poor prioritization metrics (in terms of lost environmental values) was often high--up to 80% in the scenarios we examined. The cost in percentage terms was greater when the budget was smaller. The most costly errors were omitting information about environmental values (up to 31% loss of environmental values), omitting project costs (up to 35% loss), omitting the effectiveness of management actions (up to 9% loss), and using a weighted-additive decision metric for variables that should be multiplied (up to 23% loss). The latter 3 are errors that occur commonly in real-world decision metrics, in combination often reducing potential benefits from conservation investments by 30-50%. Uncertainty about parameter values also reduced the benefits from investments in conservation projects but often not by as much as faulty prioritization metrics. © 2016 Society for Conservation Biology.
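To make the metric comparison concrete, here is a sketch with invented numbers (not the paper's project data) contrasting a theory-consistent benefit-cost ranking with a weighted-additive score that omits cost:

    # Each project: (name, value V, effectiveness W, success probability P, cost C).
    # All numbers are illustrative only.
    projects = [
        ("A", 100, 0.90, 0.80, 50),
        ("B",  80, 0.50, 0.90, 20),
        ("C",  60, 0.95, 0.95, 40),
    ]

    # Theory-consistent: benefit terms multiply and are divided by cost.
    by_ratio = sorted(projects, key=lambda p: -(p[1] * p[2] * p[3]) / p[4])

    # Faulty but common: additive weighting of variables that should multiply, cost omitted.
    by_additive = sorted(projects, key=lambda p: -(p[1] + p[2] + p[3]))

    print("ratio ranking:   ", [p[0] for p in by_ratio])     # B, A, C
    print("additive ranking:", [p[0] for p in by_additive])  # A, B, C

The two metrics rank the same three projects differently, which under a limited budget is exactly how the environmental losses quantified in the study arise.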
NASA Astrophysics Data System (ADS)
Toroody, Ahmad Bahoo; Abaiee, Mohammad Mahdi; Gholamnia, Reza; Ketabdari, Mohammad Javad
2016-09-01
Owing to the increase in unprecedented accidents with new root causes in almost all operational areas, the importance of risk management has dramatically risen. Risk assessment, one of the most significant aspects of risk management, has a substantial impact on the system-safety level of organizations, industries, and operations. If the causes of all kinds of failure and the interactions between them are considered, effective risk assessment can be highly accurate. A combination of traditional risk assessment approaches and modern scientific probability methods can help in realizing better quantitative risk assessment methods. Most researchers face the problem of minimal field data with respect to the probability and frequency of each failure. Because of this limitation in the availability of epistemic knowledge, it is important to conduct epistemic estimations by applying the Bayesian theory for identifying plausible outcomes. In this paper, we propose an algorithm and demonstrate its application in a case study for a light-weight lifting operation in the Persian Gulf of Iran. First, we identify potential accident scenarios and present them in an event tree format. Next, excluding human error, we use the event tree to roughly estimate the prior probability of other hazard-promoting factors using a minimal amount of field data. We then use the Success Likelihood Index Method (SLIM) to calculate the probability of human error. On the basis of the proposed event tree, we use the Bayesian network of the provided scenarios to compensate for the lack of data. Finally, we determine the resulting probability of each event based on its evidence in the epistemic estimation format by building on two Bayesian network types: the probability of hazard promotion factors and the Bayesian theory. The study results indicate that despite the lack of available information on the operation of floating objects, a satisfactory result can be achieved using epistemic data.
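For reference, the SLIM step mentioned here maps expert ratings to a human error probability through the method's standard calibration relation:

    \mathrm{SLI} = \sum_i w_i\, r_i, \qquad \log_{10}(\mathrm{HEP}) = a\,\mathrm{SLI} + b

where r_i are ratings of the performance-shaping factors, w_i their normalized importance weights, and the constants a and b are calibrated from tasks with known error probabilities.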
NASA Astrophysics Data System (ADS)
Li, Tianxing; Zhou, Junxiang; Deng, Xiaozhong; Li, Jubo; Xing, Chunrong; Su, Jianxin; Wang, Huiliang
2018-07-01
A manufacturing error of a cycloidal gear is the key factor affecting the transmission accuracy of a robot rotary vector (RV) reducer. A methodology is proposed to realize the digitized measurement and data processing of the cycloidal gear manufacturing error based on the gear measuring center, which can quickly and accurately measure and evaluate the manufacturing error of the cycloidal gear by using both the whole tooth profile measurement and a single tooth profile measurement. By analyzing the particularity of the cycloidal profile and its effect on the actual meshing characteristics of the RV transmission, the cycloid profile measurement strategy is planned, and the theoretical profile model and error measurement model of cycloid-pin gear transmission are established. Through the digital processing technology, the theoretical trajectory of the probe and the normal vector of the measured point are calculated. By means of precision measurement principle and error compensation theory, a mathematical model for the accurate calculation and data processing of manufacturing error is constructed, and the actual manufacturing error of the cycloidal gear is obtained by the optimization iterative solution. Finally, the measurement experiment of the cycloidal gear tooth profile is carried out on the gear measuring center and the HEXAGON coordinate measuring machine, respectively. The measurement results verify the correctness and validity of the measurement theory and method. This methodology will provide the basis for the accurate evaluation and the effective control of manufacturing precision of the cycloidal gear in a robot RV reducer.
Liu, Yan; Salvendy, Gavriel
2009-05-01
This paper aims to demonstrate the effects of measurement errors on psychometric measurements in ergonomics studies. A variety of sources can cause random measurement errors in ergonomics studies and these errors can distort virtually every statistic computed and lead investigators to erroneous conclusions. The effects of measurement errors on five most widely used statistical analysis tools have been discussed and illustrated: correlation; ANOVA; linear regression; factor analysis; linear discriminant analysis. It has been shown that measurement errors can greatly attenuate correlations between variables, reduce statistical power of ANOVA, distort (overestimate, underestimate or even change the sign of) regression coefficients, underrate the explanation contributions of the most important factors in factor analysis and depreciate the significance of discriminant function and discrimination abilities of individual variables in discrimination analysis. The discussions will be restricted to subjective scales and survey methods and their reliability estimates. Other methods applied in ergonomics research, such as physical and electrophysiological measurements and chemical and biomedical analysis methods, also have issues of measurement errors, but they are beyond the scope of this paper. As there has been increasing interest in the development and testing of theories in ergonomics research, it has become very important for ergonomics researchers to understand the effects of measurement errors on their experiment results, which the authors believe is very critical to research progress in theory development and cumulative knowledge in the ergonomics field.
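One of the effects described, attenuation of correlations by unreliable measures, has a classical closed form (Spearman's attenuation relation):

    r_{\mathrm{obs}} = r_{\mathrm{true}}\,\sqrt{\rho_{xx}\,\rho_{yy}}

where rho_xx and rho_yy are the reliabilities of the two measures. For example, two scales each with reliability 0.7 attenuate a true correlation of 0.5 to 0.5 * 0.7 = 0.35.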
Poston, Brach; Van Gemmert, Arend W.A.; Sharma, Siddharth; Chakrabarti, Somesh; Zavaremi, Shahrzad H.; Stelmach, George
2013-01-01
The minimum variance theory proposes that motor commands are corrupted by signal-dependent noise and smooth trajectories with low noise levels are selected to minimize endpoint error and endpoint variability. The purpose of the study was to determine the contribution of trajectory smoothness to the endpoint accuracy and endpoint variability of rapid multi-joint arm movements. Young and older adults performed arm movements (4 blocks of 25 trials) as fast and as accurately as possible to a target with the right (dominant) arm. Endpoint accuracy and endpoint variability along with trajectory smoothness and error were quantified for each block of trials. Endpoint error and endpoint variance were greater in older adults compared with young adults, but decreased at a similar rate with practice for the two age groups. The greater endpoint error and endpoint variance exhibited by older adults were primarily due to impairments in movement extent control and not movement direction control. The normalized jerk was similar for the two age groups, but was not strongly associated with endpoint error or endpoint variance for either group. However, endpoint variance was strongly associated with endpoint error for both the young and older adults. Finally, trajectory error was similar for both groups and was weakly associated with endpoint error for the older adults. The findings are not consistent with the predictions of the minimum variance theory, but support and extend previous observations that movement trajectories and endpoints are planned independently. PMID:23584101
Human error in airway facilities.
DOT National Transportation Integrated Search
2001-01-01
This report examines human errors in Airway Facilities (AF) with the intent of preventing these errors from being passed on to the new Operations Control Centers. To effectively manage errors, they first have to be identified. Human factors engin...
Characterizing the impact of model error in hydrologic time series recovery inverse problems
Hansen, Scott K.; He, Jiachuan; Vesselinov, Velimir V.
2017-10-28
Hydrologic models are commonly over-smoothed relative to reality, owing to computational limitations and to the difficulty of obtaining accurate high-resolution information. When used in an inversion context, such models may introduce systematic biases which cannot be encapsulated by an unbiased "observation noise" term of the type assumed by standard regularization theory and typical Bayesian formulations. Despite its importance, model error is difficult to encapsulate systematically and is often neglected. In this paper, model error is considered for an important class of inverse problems that includes interpretation of hydraulic transients and contaminant source history inference: reconstruction of a time series that has been convolved against a transfer function (i.e., impulse response) that is only approximately known. Using established harmonic theory along with two results established here regarding triangular Toeplitz matrices, upper and lower error bounds are derived for the effect of systematic model error on time series recovery for both well-determined and over-determined inverse problems. It is seen that use of additional measurement locations does not improve expected performance in the face of model error. A Monte Carlo study of a realistic hydraulic reconstruction problem is presented, and the lower error bound is seen to be informative about expected behavior. Finally, a possible diagnostic criterion for blind transfer function characterization is also uncovered.
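To illustrate the problem class (a minimal sketch, not the paper's bounds): discrete convolution against an impulse response is multiplication by a lower-triangular Toeplitz matrix, and a small misspecification of that response propagates into the recovered series:

    import numpy as np
    from scipy.linalg import toeplitz

    n = 100
    t = np.arange(n)
    h_true = np.exp(-t / 10.0)   # true impulse response
    h_model = np.exp(-t / 11.0)  # approximately known response (systematic model error)

    # Lower-triangular Toeplitz matrices: column h, first row all zeros past the diagonal.
    T_true = toeplitz(h_true, np.zeros(n))
    T_model = toeplitz(h_model, np.zeros(n))

    f = np.sin(2 * np.pi * t / 50.0) ** 2  # "source history" to be recovered
    y = T_true @ f                         # observed convolved series (noise-free)
    f_hat = np.linalg.solve(T_model, y)    # naive recovery under model error
    print(np.max(np.abs(f_hat - f)))       # reconstruction error driven purely by model error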
ERIC Educational Resources Information Center
Julian, Liam
2009-01-01
In this article, the author talks about George Orwell, his instructive errors, and the manner in which Orwell pierced worthless theory, faced facts and defended decency (with fluctuating success), and largely ignored the tradition of accumulated wisdom that has rendered him a timeless teacher--one whose inadvertent lessons, while infrequently…
Illusory Conjunctions: Does Inattention Really Matter?
ERIC Educational Resources Information Center
Navon, David; Ehrlich, Baruch
1995-01-01
Results of a study with 48 Israeli college students cast doubt on feature integration theory. Subjects searching for a probe in an array of three stimuli in two attention conditions, attention being manipulated by a dual-task requirement, made more conjunction errors than feature errors. (SLD)
GY SAMPLING THEORY IN ENVIRONMENTAL STUDIES 2: SUBSAMPLING ERROR MEASUREMENTS
Sampling can be a significant source of error in the measurement process. The characterization and cleanup of hazardous waste sites require data that meet site-specific levels of acceptable quality if scientifically supportable decisions are to be made. In support of this effort,...
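The record is truncated, but the quantity at issue in Gy's theory, the relative variance of the fundamental (sub)sampling error, has a standard closed form:

    \sigma_{\mathrm{FE}}^2 = \left(\frac{1}{M_S} - \frac{1}{M_L}\right) C\, d^3

where M_S is the sample mass, M_L the lot mass, d the nominal top particle diameter, and C a sampling constant combining shape, granulometric, liberation, and composition factors.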
NASA Astrophysics Data System (ADS)
Zhang, Yunju; Chen, Zhongyi; Guo, Ming; Lin, Shunsheng; Yan, Yinyang
2018-01-01
With the increasing capacity of the power system and the development trend toward larger generating units and higher voltages, dispatching operations are becoming more frequent and complicated, and the probability of operation errors increases. Addressing the lack of anti-error functions, the single scheduling function, and the low working efficiency of technical support systems in regional regulation and integration, this paper proposes an integrated architecture for power-network dispatching anti-error checking based on cloud computing. An integrated error-prevention system covering the Energy Management System (EMS) and the Operation Management System (OMS) has also been constructed. The system architecture has good scalability and adaptability, which can improve computational efficiency, reduce the cost of system operation and maintenance, and enhance the capability of regional regulation and anti-error checking, with broad development prospects.
The Ability of Analysts' Recommendations to Predict Optimistic and Pessimistic Forecasts
Biglari, Vahid; Alfan, Ervina Binti; Ahmad, Rubi Binti; Hajian, Najmeh
2013-01-01
Previous research shows that buy (growth) companies conduct income-increasing earnings management in order to meet forecasts and generate positive forecast errors (FEs). This behavior, however, is not inherent in sell (non-growth) companies. Against this background, this research hypothesizes that since sell companies are pressured to avoid income-increasing earnings management, they are capable, and in fact more inclined, to pursue income-decreasing forecast management (FM) with the purpose of generating positive FEs. Using a sample of 6553 firm-years of companies listed on the NYSE between the years 2005–2010, the study determines that sell companies conduct income-decreasing FM to generate positive FEs. However, the frequency of positive FEs of sell companies does not exceed that of buy companies. Using the efficiency perspective, the study suggests that even though buy and sell companies have immense motivation to avoid negative FEs, they exploit different but efficient strategies, respectively, in order to meet forecasts. Furthermore, the findings illuminate the complexities behind informative and opportunistic forecasts that fall under the efficiency versus opportunistic theories in the literature. PMID:24146741
Error affect inoculation for a complex decision-making task.
Tabernero, Carmen; Wood, Robert E
2009-05-01
Individuals bring knowledge, implicit theories, and goal orientations to group meetings. Group decisions arise out of the exchange of these orientations. This research explores how a trainee's exploratory and deliberate process (an incremental theory and learning goal orientation) impacts the effectiveness of individual and group decision-making processes. The effectiveness of this training program is compared with another program that included error affect inoculation (EAI). Subjects were 40 Spanish policemen in a training course. They were distributed across two training conditions for an individual and group decision-making task. In one condition, individuals received Self-Guided Exploration plus Deliberation Process instructions, which emphasised exploring the options and testing hypotheses. In the other condition, individuals also received instructions based on error affect inoculation, which emphasised positive affective reactions to errors and mistakes when making decisions. Results show that the quality of decisions increases when the groups share their reasoning. The EAI intervention promotes sharing information, flexible initial viewpoints, and improving the quality of group decisions. Implications and future directions are discussed.
Error Discounting in Probabilistic Category Learning
Craig, Stewart; Lewandowsky, Stephan; Little, Daniel R.
2011-01-01
Some current theories of probabilistic categorization assume that people gradually attenuate their learning in response to unavoidable error. However, existing evidence for this error discounting is sparse and open to alternative interpretations. We report two probabilistic-categorization experiments that investigated error discounting by shifting feedback probabilities to new values after different amounts of training. In both experiments, responding gradually became less responsive to errors, and learning was slowed for some time after the feedback shift. Both results are indicative of error discounting. Quantitative modeling of the data revealed that adding a mechanism for error discounting significantly improved the fits of an exemplar-based and a rule-based associative learning model, as well as of a recency-based model of categorization. We conclude that error discounting is an important component of probabilistic learning. PMID:21355666
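A minimal sketch of how error discounting can be modeled (my illustration, not the authors' fitted models): a delta-rule learner whose learning rate is annealed so that updates become progressively less responsive to error:

    import numpy as np

    rng = np.random.default_rng(1)
    p_feedback = 0.7          # probability that feedback favors category A for this cue
    w, eta, decay = 0.5, 0.2, 0.999

    # Delta rule with an annealed learning rate: one simple form of error discounting.
    for trial in range(2000):
        outcome = rng.random() < p_feedback  # probabilistic feedback (True/False)
        w += eta * (outcome - w)             # move the weight toward the feedback
        eta *= decay                         # discount errors as training proceeds
    print(round(w, 3))  # hovers near p_feedback, with shrinking trial-to-trial jitter

After a feedback-probability shift, the small late-training eta is what slows re-learning, the signature reported in the experiments.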
ERIC Educational Resources Information Center
Deegan, William L.; And Others
Japanese management theory was studied to identify specific models for consideration by student personnel administrators. The report is organized into three sections: major components of Japanese management theory, potential implications for student personnel administration, and three models, based on components of Japanese management theory, for…
Simulations in nursing practice: toward authentic leadership.
Shapira-Lishchinsky, Orly
2014-01-01
Aim: This study explores nurses' ethical decision-making in team simulations in order to identify the benefits of these simulations for authentic leadership. Background: While previous studies have indicated that team simulations may improve ethics in the workplace by reducing the number of errors, those studies focused mainly on clinical aspects and not on nurses' ethical experiences or on the benefits of authentic leadership. Methods: Fifty nurses from 10 health institutions in central Israel participated in the study. Data about nurses' ethical experiences were collected from 10 teams. Qualitative data analysis based on Grounded Theory was applied, using the atlas.ti 5.0 software package. Findings: Simulation findings suggest four main benefits that reflect the underlying components of authentic leadership: self-awareness, relational transparency, balanced information processing and internalized moral perspective. Conclusions: Team-based simulation as a training tool may lead to authentic leadership among nurses. Implications for nursing management: Nursing management should incorporate team simulations into nursing practice to help resolve power conflicts and to develop authentic leadership in nursing. Consequently, errors will decrease, patients' safety will increase and optimal treatment will be provided. © 2012 John Wiley & Sons Ltd.
Weaver, Sallie J; Newman-Toker, David E; Rosen, Michael A
2012-01-01
Missed, delayed, or wrong diagnoses can have a severe impact on patients, providers, and the entire health care system. One mechanism implicated in such diagnostic errors is the deterioration of cognitive diagnostic skills that are used rarely or not at all over a prolonged period of time. Existing evidence regarding maintenance of effective cognitive reasoning skills in the clinical education, organizational training, and human factors literatures suggest that continuing education plays a critical role in mitigating and managing diagnostic skill decay. Recent models also underscore the role of system level factors (eg, cognitive decision support tools, just-in-time training opportunities) in supporting clinical reasoning process. The purpose of this manuscript is to offer a multidisciplinary review of cognitive models of clinical decision making skills in order to provide a list of best practices for supporting continuous improvement and maintenance of cognitive diagnostic processes through continuing education. Copyright © 2012 The Alliance for Continuing Education in the Health Professions, the Society for Academic Continuing Medical Education, and the Council on CME, Association for Hospital Medical Education.
A Reduced Dimension Static, Linearized Kalman Filter and Smoother
NASA Technical Reports Server (NTRS)
Fukumori, I.
1995-01-01
An approximate Kalman filter and smoother, based on approximations of the state estimation error covariance matrix, is described. Approximations include a reduction of the effective state dimension, use of a static asymptotic error limit, and a time-invariant linearization of the dynamic model for error integration. The approximations lead to dramatic computational savings in applying estimation theory to large complex systems. Examples of use come from TOPEX/POSEIDON.
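As a toy analogue of the static-limit approximation described here (my sketch, not the TOPEX/POSEIDON implementation), one can compute a steady-state Kalman gain from the asymptotic error covariance of a small linear system:

    import numpy as np
    from scipy.linalg import solve_discrete_are

    # Small linear system: constant-velocity state, position-only measurements.
    A = np.array([[1.0, 0.1], [0.0, 1.0]])  # state transition
    H = np.array([[1.0, 0.0]])              # observation operator
    Q = 0.01 * np.eye(2)                    # process noise covariance
    R = np.array([[0.1]])                   # measurement noise covariance

    # Asymptotic (static) error covariance from the discrete algebraic Riccati equation,
    # then the corresponding time-invariant Kalman gain.
    P = solve_discrete_are(A.T, H.T, Q, R)
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    print(K.ravel())

Freezing the gain at this static limit removes the covariance propagation step, which is the dominant cost when the state dimension is large.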
ERIC Educational Resources Information Center
Rice, Bart F.; Wilde, Carroll O.
It is noted that with the prominence of computers in today's technological society, digital communication systems have become widely used in a variety of applications. Some of the problems that arise in digital communications systems are described. This unit presents the problem of correcting errors in such systems. Error correcting codes are…
Open quantum systems and error correction
NASA Astrophysics Data System (ADS)
Shabani Barzegar, Alireza
Quantum effects can be harnessed to manipulate information in a desired way. Quantum systems designed for this purpose suffer from harmful interactions with their surrounding environment and from inaccuracy in control forces. Engineering methods to combat errors in quantum devices is in high demand. In this thesis, I focus on realistic formulations of quantum error correction methods; a realistic formulation is one that incorporates experimental challenges. The thesis is presented in two parts: open quantum systems and quantum error correction. Chapters 2 and 3 cover the material on open quantum system theory; it is essential to first study a noise process and then to contemplate methods to cancel its effect. In the second chapter, I present the non-completely-positive formulation of quantum maps. Most of these results are published in [Shabani and Lidar, 2009b,a], except a subsection on geometric characterization of the positivity domain of a quantum map. The real-time formulation of the dynamics is the topic of the third chapter: after introducing the concept of the Markovian regime, a new post-Markovian quantum master equation is derived, published in [Shabani and Lidar, 2005a]. The quantum error correction part comprises chapters 4, 5, 6, and 7. In chapter 4, we introduce a generalized theory of decoherence-free subspaces and subsystems (DFSs), which do not require accurate initialization (published in [Shabani and Lidar, 2005b]). In chapter 5, we present a semidefinite-program optimization approach to quantum error correction that yields codes and recovery procedures that are robust against significant variations in the noise channel. Our approach allows us to optimize the encoding, recovery, or both, and is amenable to approximations that significantly improve computational cost while retaining fidelity (see [Kosut et al., 2008] for a published version). Chapter 6 is devoted to a theory of quantum error correction (QEC) that applies to any linear map, in particular maps that are not completely positive (CP); this complements the second chapter and is published in [Shabani and Lidar, 2007]. In chapter 7, the last before the conclusion, a formulation for evaluating the performance of quantum error correcting codes for a general error model is presented, also published in [Shabani, 2005]. In this formulation, the correlation between errors is quantified by a Hamiltonian description of the noise process. In particular, we consider Calderbank-Shor-Steane codes and observe better performance in the presence of correlated errors, depending on the timing of the error recovery.
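For orientation, the algebraic heart of standard QEC theory, which work like this generalizes beyond completely positive maps, is the Knill-Laflamme condition:

    P\, E_i^\dagger E_j\, P = \alpha_{ij}\, P

where P projects onto the code space, {E_i} are the error operators to be corrected, and (alpha_ij) is a Hermitian matrix; a recovery operation exists exactly when this condition holds.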
An organizational approach to understanding patient safety and medical errors.
Kaissi, Amer
2006-01-01
Progress in patient safety, or lack thereof, is a cause for great concern. In this article, we argue that the patient safety movement has failed to reach its goals of eradicating or, at least, significantly reducing errors because of an inappropriate focus on provider and patient-level factors with no real attention to the organizational factors that affect patient safety. We describe an organizational approach to patient safety using different organizational theory perspectives and make several propositions to push patient safety research and practice in a direction that is more likely to improve care processes and outcomes. From a Contingency Theory perspective, we suggest that health care organizations, in general, operate under a misfit between contingencies and structures. This misfit is mainly due to lack of flexibility, cost containment, and lack of regulations, thus explaining the high level of errors committed in these organizations. From an organizational culture perspective, we argue that health care organizations must change their assumptions, beliefs, values, and artifacts to change their culture from a culture of blame to a culture of safety and thus reduce medical errors. From an organizational learning perspective, we discuss how reporting, analyzing, and acting on error information can result in reduced errors in health care organizations.
[Improving blood safety: errors management in transfusion medicine].
Bujandrić, Nevenka; Grujić, Jasmina; Krga-Milanović, Mirjana
2014-01-01
The concept of blood safety includes the entire transfusion chain, starting with the collection of blood from the blood donor and ending with blood transfusion to the patient. The concept involves a quality management system with systematic monitoring of adverse reactions and incidents regarding the blood donor or patient. Monitoring of near-miss errors shows the critical points in the working process and increases transfusion safety. The aim of the study was to present the analysis results of adverse and unexpected events in transfusion practice with a potential risk to the health of blood donors and patients. This one-year retrospective study was based on the collection, analysis, and interpretation of written reports on medical errors in the Blood Transfusion Institute of Vojvodina. Errors were classified according to their type, frequency, and the part of the working process where they occurred. Possible causes and corrective actions were described for each error. The study showed that there were no errors with potential health consequences for the blood donor/patient. Errors with potentially damaging consequences for patients were detected throughout the entire transfusion chain. Most of the errors were identified in the preanalytical phase. The human factor was responsible for the largest number of errors. The error reporting system has an important role in error management and in the reduction of transfusion-related risk of adverse events and incidents. Ongoing analysis reveals the strengths and weaknesses of the entire process and indicates the necessary changes. A large percentage of errors in transfusion medicine can be avoided, and prevention is cost-effective, systematic, and applicable.
Diagnostic decision-making and strategies to improve diagnosis.
Thammasitboon, Satid; Cutrer, William B
2013-10-01
A significant portion of diagnostic errors arises through cognitive errors resulting from inadequate knowledge, faulty data gathering, and/or faulty verification. Experts estimate that 75% of diagnostic failures can be attributed to failures in clinicians' diagnostic thinking. The cognitive processes that underlie the diagnostic thinking of clinicians are complex and intriguing, and it is imperative that clinicians acquire an explicit appreciation and application of different cognitive approaches in order to make better decisions. A dual-process model that unifies many theories of decision-making has emerged as a promising template for understanding how clinicians think and judge efficiently in the diagnostic reasoning process. The identification and implementation of strategies for decreasing or preventing such diagnostic errors has become a growing area of interest and research. Suggested strategies to decrease the incidence of diagnostic error include increasing clinicians' clinical expertise and avoiding inherent cognitive errors. Implementing interventions focused solely on avoiding errors may work effectively for patient safety issues such as medication errors. Addressing cognitive errors, however, requires equal effort on expanding the individual clinician's expertise. Providing cognitive support to clinicians for robust diagnostic decision-making serves as the final strategic target for decreasing diagnostic errors. Clinical guidelines and algorithms offer another method for streamlining decision-making and decreasing the likelihood of cognitive diagnostic errors. Addressing cognitive processing errors is undeniably the most challenging task in reducing diagnostic errors. While many suggested approaches exist, they are mostly based on theories and sciences in cognitive psychology, decision-making, and education. The proposed interventions are primarily suggestions, and very few of them have been tested in actual practice settings. A collaborative research effort is required to effectively address cognitive processing errors. Researchers in various areas, including patient safety/quality improvement, decision-making, and problem solving, must work together to make medical diagnosis more reliable. © 2013 Mosby, Inc. All rights reserved.
Lee, Nam-Ju; Cho, Eunhee; Bakken, Suzanne
2010-03-01
The purposes of this study were to develop a taxonomy for the detection of errors related to hypertension management and to apply the taxonomy retrospectively to the documentation of nurses in Advanced Practice Nurse (APN) training. We developed the Hypertension Diagnosis and Management Error Taxonomy and applied it in a sample of adult patient encounters (N = 15,862) that were documented in a personal digital assistant-based clinical log by registered nurses in APN training. We used Structured Query Language (SQL) queries to retrieve hypertension-related data from the central database. The data were summarized using descriptive statistics. Blood pressure was documented in 77.5% (n = 12,297) of encounters; 21% had high blood pressure values. Missed diagnosis, incomplete diagnosis, and misdiagnosis rates were 63.7%, 6.8%, and 7.5%, respectively. In terms of treatment, the omission rates were 17.9% for essential medications and 69.9% for essential patient teaching. Contraindicated anti-hypertensive medications were documented in 12% of encounters with co-occurring diagnoses of hypertension and asthma. The Hypertension Diagnosis and Management Error Taxonomy was useful for identifying errors based on documentation in a clinical log. The results provide an initial understanding of the nature of errors associated with hypertension diagnosis and management by nurses in APN training. The information gained from this study can contribute to educational interventions that promote APN competencies in the identification and management of hypertension as well as overall patient safety and informatics competencies. Copyright © 2010 Korean Society of Nursing Science. All rights reserved.
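The study's database schema is not published, so the table and column names below (encounters, bp_systolic, dx_htn) are hypothetical; this is a minimal sketch of a missed-diagnosis query of the kind the taxonomy supports:

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE encounters (id INTEGER, bp_systolic INTEGER, "
                "bp_diastolic INTEGER, dx_htn INTEGER)")
    con.executemany("INSERT INTO encounters VALUES (?, ?, ?, ?)",
                    [(1, 150, 95, 0), (2, 120, 80, 0), (3, 160, 100, 1)])

    # Missed-diagnosis candidates: elevated blood pressure documented with no
    # hypertension diagnosis recorded in the same encounter.
    (count,) = con.execute(
        "SELECT COUNT(*) FROM encounters "
        "WHERE (bp_systolic >= 140 OR bp_diastolic >= 90) AND dx_htn = 0").fetchone()
    print(count)  # -> 1 in this toy data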
Strength conditions for the elastic structures with a stress error
NASA Astrophysics Data System (ADS)
Matveev, A. D.
2017-10-01
As is known, constraints (strength conditions) on the safety factor are established for elastic structures and design details of a particular class, e.g., aviation structures: the safety factor values of such structures should lie within a given range. It should be noted that these constraints are set for safety factors corresponding to analytical (exact) solutions of the elasticity problems posed for the structures. Developing analytical solutions for most structures, especially irregularly shaped ones, is associated with great difficulties. Approximate approaches to solving elasticity problems, e.g., the technical theories of deformation of homogeneous and composite plates, beams, and shells, are widely used for a great number of structures. Technical theories based on such hypotheses give rise to approximate (technical) solutions with an irreducible error whose exact value is difficult to determine. In static strength calculations with a narrowly specified range for the safety factor, the application of technical (Strength of Materials) solutions is therefore difficult. However, there are numerical methods for developing approximate solutions of elasticity problems with arbitrarily small errors. In the present paper, adjusted reference (specified) strength conditions are proposed for the structural safety factor corresponding to an approximate solution of the elasticity problem. The stress error estimate is taken into account in the proposed strength conditions. It is shown that, to fulfill the specified strength conditions for the safety factor of a given structure corresponding to an exact solution, adjusted strength conditions for the safety factor corresponding to an approximate solution are required. The stress error estimate, which is the basis for developing the adjusted strength conditions, is determined for the specified strength conditions. Adjusted strength conditions expressed in terms of allowable stresses are also suggested. The adjusted strength conditions make it possible to determine the set of approximate solutions that meet the specified strength conditions. Examples are given of specified strength conditions satisfied using technical (Strength of Materials) solutions, as well as examples of strength conditions satisfied using approximate solutions with a small error.
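One plausible form of the adjustment (my reconstruction, assuming the safety factor varies inversely with the computed stress and the relative stress error is bounded by delta_0 < 1): if the specified conditions for the exact safety factor n are n_1 <= n <= n_2, and the approximate factor n_h satisfies n = n_h(1 + delta) with |delta| <= delta_0, then it suffices to require

    \frac{n_1}{1 - \delta_0} \;\le\; n_h \;\le\; \frac{n_2}{1 + \delta_0}

since the worst-case values of n over the admissible error band then remain inside [n_1, n_2].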
Reducing number entry errors: solving a widespread, serious problem.
Thimbleby, Harold; Cairns, Paul
2010-10-06
Number entry is ubiquitous: it is required in many fields including science, healthcare, education, government, mathematics and finance. People entering numbers can be expected to make errors, but shockingly few systems make any effort to detect, block or otherwise manage errors. Worse, errors may be ignored but processed in arbitrary ways, with unintended results. A standard class of error (defined in the paper) is an 'out by 10 error', which is easily made by miskeying a decimal point or a zero. In safety-critical domains, such as drug delivery, out by 10 errors generally have adverse consequences. Here, we expose the extent of the problem of numeric errors in a very wide range of systems. An analysis of better error management is presented: under reasonable assumptions, we show that the probability of out by 10 errors can be halved by better user interface design. We provide a demonstration user interface to show that the approach is practical. To kill an error is as good a service as, and sometimes even better than, the establishing of a new truth or fact. (Charles Darwin 1879 [2008], p. 229).
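The design direction the paper argues for can be illustrated with two generic interface checks: strict parsing plus a plausibility range. A minimal sketch under those assumptions; the function name and range bounds are illustrative, not the authors' demonstration interface:

import re

def parse_dose(text, low, high):
    """Parse a keyed-in number defensively.

    Rejects malformed strings (e.g. '5..0') instead of silently coercing
    them, and flags values outside a plausible range [low, high], the
    kind of check that intercepts out-by-10 slips.
    """
    if not re.fullmatch(r"\d+(\.\d+)?", text.strip()):
        raise ValueError(f"malformed number: {text!r}")
    value = float(text)
    if not low <= value <= high:
        raise ValueError(f"{value} outside plausible range [{low}, {high}]")
    return value

print(parse_dose("5.0", 0.5, 10))   # ok -> 5.0
# parse_dose("50", 0.5, 10)         # out-by-10 slip: raises ValueError
# parse_dose("5..0", 0.5, 10)       # mis-keyed decimal point: raises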
Error Tracking System is a database used to store & track error notifications sent by users of EPA's web site. ETS is managed by OIC/OEI. OECA's ECHO & OEI Envirofacts use it. Error notifications from EPA's home page under Contact Us also use it.
NASA Technical Reports Server (NTRS)
Litvin, F. L.; Rahman, P.; Goldrich, R. N.
1982-01-01
The geometry of spiral bevel gears and their rational design are studied. The nonconjugate tooth surfaces of spiral bevel gears are, in theory, replaced (or approximated) by conjugate tooth surfaces. These surfaces can be generated either by two conical surfaces or by a conical surface and a surface of revolution. Although these conjugate tooth surfaces are simpler than the actual ones, the determination of their principal curvatures and directions is still a complicated problem. Therefore, a new approach to the solution of these problems is proposed. Direct relationships between the principal curvatures and directions of the tool surface and those of the generated gear surface are obtained. With the aid of these analytical tools, the Hertzian contact problem for conjugate tooth surfaces can be solved. These results are useful in determining the compressive load capacity and surface fatigue life of spiral bevel gears. A general theory of kinematical errors induced by manufacturing and assembly errors is developed. This theory is used to determine the analytical relationship between gear misalignments and kinematical errors. This is important to the study of noise and vibration in geared systems.
A theory for predicting composite laminate warpage resulting from fabrication
NASA Technical Reports Server (NTRS)
Chamis, C. C.
1974-01-01
Linear laminate theory is used with the moment-curvature relationship to derive equations for predicting end deflections due to warpage without solving the coupled fourth-order partial differential equations of the plate. Composite micro- and macro-mechanics are used with laminate theory to assess the contribution of factors such as ply misorientation, fiber migration, and fiber and/or void volume ratio nonuniformity on the laminate warpage. Using these equations, it was found that a 1 deg error in the orientation angle of one ply was sufficient to produce warpage end deflection equal to two laminate thicknesses in a 10 inch by 10 inch laminate made from 8 ply Mod-I/epoxy. Using a sensitivity analysis on the governing parameters, it was found that a 3 deg fiber migration or a void volume ratio of three percent in some plies is sufficient to produce laminate warpage corner deflection equal to several laminate thicknesses. Tabular and graphical data are presented which can be used to identify possible errors contributing to laminate warpage and/or to obtain an a priori assessment when unavoidable errors during fabrication are anticipated.
Stem revenue losses with effective CDM management.
Alwell, Michael
2003-09-01
Effective CDM management not only minimizes revenue losses due to denied claims, but also helps eliminate administrative costs associated with correcting coding errors. Accountability for CDM management should be assigned to a single individual, who ideally reports to the CFO or high-level finance director. If your organization is prone to making billing errors due to CDM deficiencies, you should consider purchasing CDM software to help you manage your CDM.
A Short History of Probability Theory and Its Applications
ERIC Educational Resources Information Center
Debnath, Lokenath; Basu, Kanadpriya
2015-01-01
This paper deals with a brief history of probability theory and its applications to Jacob Bernoulli's famous law of large numbers and theory of errors in observations or measurements. Included are the major contributions of Jacob Bernoulli and Laplace. It is written to pay the tricentennial tribute to Jacob Bernoulli, since the year 2013…
Examination of Different Item Response Theory Models on Tests Composed of Testlets
ERIC Educational Resources Information Center
Kogar, Esin Yilmaz; Kelecioglu, Hülya
2017-01-01
The purpose of this research is to first estimate the item and ability parameters and the standard error values related to those parameters obtained from Unidimensional Item Response Theory (UIRT), bifactor (BIF) and Testlet Response Theory models (TRT) in the tests including testlets, when the number of testlets, number of independent items, and…
Teaching Common Errors in Applying a Procedure.
ERIC Educational Resources Information Center
Marcone, Stephen; Reigeluth, Charles M.
1988-01-01
Discusses study that investigated whether or not the teaching of matched examples and nonexamples in the form of common errors could improve student performance in undergraduate music theory courses. Highlights include hypotheses tested, pretests and posttests, and suggestions for further research with different age groups. (19 references)…
Pourasghar, Faramarz; Tabrizi, Jafar Sadegh; Yarifard, Khadijeh
2016-01-01
Background: Patient safety is one of the most important elements of quality of healthcare. It means preventing any harm to the patients during the medical care process. Objective: This paper introduces a cost-effective tool in which Radio Frequency Identification (RFID) technology is used to identify medical errors in hospital. Methods: The proposed clinical error management system (CEMS) consists of a reader device, a transfer/receiver device, a database and management software. The reader device is wireless and works using radio waves. The reader sends and receives data to/from the database via the transfer/receiver device, which is connected to the computer via a USB port. The database contains data about patients' medication orders. Results: The CEMS has the ability to identify clinical errors before they occur and then warns the care-giver with voice and visual messages to prevent the error. This device reduces errors and thus improves patient safety. Conclusion: A new tool including software and hardware was developed in this study. Application of this tool in clinical settings can help nurses prevent medical errors. It can also be a useful tool for clinical risk management. Using this device can improve patient safety to a considerable extent and thus improve the quality of healthcare. PMID:27147802
Lee, Eunjoo
2016-09-01
This study compared registered nurses' perceptions of safety climate and attitude toward medication error reporting before and after completing a hospital accreditation program. Medication errors are the most prevalent adverse events threatening patient safety; reducing underreporting of medication errors significantly improves patient safety. Safety climate in hospitals may affect medication error reporting. Design: longitudinal, descriptive; data were collected using questionnaires. Setting: a tertiary acute hospital in South Korea undergoing a hospital accreditation program. Participants: nurses, pre- and post-accreditation (217 and 373; response rates 58% and 87%, respectively). Intervention: hospital accreditation program. Main outcome measures: perceived safety climate and attitude toward medication error reporting. The level of safety climate and attitude toward medication error reporting increased significantly following accreditation; however, measures of institutional leadership and management did not improve significantly. Participants' perception of safety climate was positively correlated with their attitude toward medication error reporting; this correlation strengthened following completion of the program. Improving hospitals' safety climate increased nurses' medication error reporting; interventions that help hospital administration and managers to provide more supportive leadership may facilitate safety climate improvement. Hospitals and their units should develop more friendly and intimate working environments that remove nurses' fear of penalties. Administration and managers should support nurses who report their own errors. © The Author 2016. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Teamwork and clinical error reporting among nurses in Korean hospitals.
Hwang, Jee-In; Ahn, Jeonghoon
2015-03-01
To examine levels of teamwork and its relationships with clinical error reporting among Korean hospital nurses. The study employed a cross-sectional survey design. We distributed a questionnaire to 674 nurses in two teaching hospitals in Korea. The questionnaire included items on teamwork and the reporting of clinical errors. We measured teamwork using the Teamwork Perceptions Questionnaire, which has five subscales including team structure, leadership, situation monitoring, mutual support, and communication. Using logistic regression analysis, we determined the relationships between teamwork and error reporting. The response rate was 85.5%. The mean score of teamwork was 3.5 out of 5. At the subscale level, mutual support was rated highest, while leadership was rated lowest. Of the participating nurses, 522 responded that they had experienced at least one clinical error in the last 6 months. Among those, only 53.0% responded that they always or usually reported clinical errors to their managers and/or the patient safety department. Teamwork was significantly associated with better error reporting. Specifically, nurses with a higher team communication score were more likely to report clinical errors to their managers and the patient safety department (odds ratio = 1.82, 95% confidence interval [1.05, 3.14]). Teamwork was rated as moderate and was positively associated with nurses' error reporting performance. Hospital executives and nurse managers should make substantial efforts to enhance teamwork, which will contribute to encouraging the reporting of errors and improving patient safety. Copyright © 2015. Published by Elsevier B.V.
Can Bayesian Theories of Autism Spectrum Disorder Help Improve Clinical Practice?
Haker, Helene; Schneebeli, Maya; Stephan, Klaas Enno
2016-01-01
Diagnosis and individualized treatment of autism spectrum disorder (ASD) represent major problems for contemporary psychiatry. Tackling these problems requires guidance by a pathophysiological theory. In this paper, we consider recent theories that re-conceptualize ASD from a "Bayesian brain" perspective, which posit that the core abnormality of ASD resides in perceptual aberrations due to an imbalance in the precision of prediction errors (sensory noise) relative to the precision of predictions (prior beliefs). This results in percepts that are dominated by sensory inputs and less guided by top-down regularization and shifts the perceptual focus to detailed aspects of the environment with difficulties in extracting meaning. While these Bayesian theories have inspired ongoing empirical studies, their clinical implications have not yet been carved out. Here, we consider how this Bayesian perspective on disease mechanisms in ASD might contribute to improving clinical care for affected individuals. Specifically, we describe a computational strategy, based on generative (e.g., hierarchical Bayesian) models of behavioral and functional neuroimaging data, for establishing diagnostic tests. These tests could provide estimates of specific cognitive processes underlying ASD and delineate pathophysiological mechanisms with concrete treatment targets. Written with a clinical audience in mind, this article outlines how the development of computational diagnostics applicable to behavioral and functional neuroimaging data in routine clinical practice could not only fundamentally alter our concept of ASD but eventually also transform the clinical management of this disorder.
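The precision-imbalance idea has a textbook Gaussian illustration: the posterior mean is a precision-weighted average of prior and sensory estimates. A toy sketch of that standard result, not the clinical models proposed in the paper:

def posterior(mu_prior, pi_prior, mu_sensory, pi_sensory):
    """Precision-weighted combination of a Gaussian prior and likelihood.

    The posterior mean is pulled toward whichever source has the higher
    precision (inverse variance); a toy illustration of the Bayesian-brain
    account, not a clinical model.
    """
    pi_post = pi_prior + pi_sensory
    mu_post = (pi_prior * mu_prior + pi_sensory * mu_sensory) / pi_post
    return mu_post, pi_post

# Typical weighting: prior and input contribute comparably.
print(posterior(0.0, 1.0, 2.0, 1.0))   # mean 1.0
# ASD-like imbalance: weak prior precision -> the percept tracks the input.
print(posterior(0.0, 0.1, 2.0, 1.0))   # mean about 1.82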
Sum-rule corrections: a route to error cancellations in correlation matrix renormalisation theory
NASA Astrophysics Data System (ADS)
Liu, C.; Liu, J.; Yao, Y. X.; Wang, C. Z.; Ho, K. M.
2017-03-01
We recently proposed the correlation matrix renormalisation (CMR) theory to efficiently and accurately calculate ground state total energy of molecular systems, based on the Gutzwiller variational wavefunction (GWF) to treat the electronic correlation effects. To help reduce numerical complications and better adapt the CMR to infinite lattice systems, we need to further refine the way to minimise the error originating from the approximations in the theory. This conference proceeding reports our recent progress on this key issue: we obtained a simple analytical functional form for the one-electron renormalisation factors, and introduced a novel sum-rule correction for a more accurate description of the intersite electron correlations. Benchmark calculations are performed on a set of molecules to show the reasonable accuracy of the method.
Phenotypic Graphs and Evolution Unfold the Standard Genetic Code as the Optimal
NASA Astrophysics Data System (ADS)
Zamudio, Gabriel S.; José, Marco V.
2018-03-01
In this work, we explicitly consider the evolution of the Standard Genetic Code (SGC) by assuming two evolutionary stages, to wit, the primeval RNY code and two intermediate codes in between. We used network theory and graph theory to measure the connectivity of each phenotypic graph. The connectivity values are compared to the values of the codes under different randomization scenarios. An error-correcting optimal code is one in which the algebraic connectivity is minimized. We show that the SGC is optimal in regard to its robustness and error-tolerance when compared to all random codes under different assumptions.
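Algebraic connectivity is the second-smallest eigenvalue of the graph Laplacian, computable directly from an adjacency matrix. A short sketch with toy graphs; the adjacency matrices are illustrative, not the paper's codon graphs:

import numpy as np

def algebraic_connectivity(adjacency):
    """Second-smallest eigenvalue of the graph Laplacian L = D - A."""
    A = np.asarray(adjacency, dtype=float)
    L = np.diag(A.sum(axis=1)) - A
    eigenvalues = np.sort(np.linalg.eigvalsh(L))
    return eigenvalues[1]

# A 4-node path is less tightly connected than a 4-node cycle, so its
# algebraic connectivity is smaller (about 0.586 versus 2.0).
path  = [[0,1,0,0],[1,0,1,0],[0,1,0,1],[0,0,1,0]]
cycle = [[0,1,0,1],[1,0,1,0],[0,1,0,1],[1,0,1,0]]
print(algebraic_connectivity(path), algebraic_connectivity(cycle))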
NASA Astrophysics Data System (ADS)
Waeldele, F.
1983-01-01
The influence of sample shape deviations on the measurement uncertainties and the optimization of computer-aided coordinate measurement were investigated for a circle and a cylinder. Using the complete error propagation law in matrix form, the parameter uncertainties are calculated, taking the correlation between the measurement points into account. Theoretical investigations show that the measuring points have to be equidistantly distributed and that, for a cylindrical body, a measuring-point distribution along a cross section is better than along a helical line. The theoretically obtained expressions for calculating the uncertainties prove to be a good basis for estimation. The simple error theory is not satisfactory for estimation. The complete statistical data analysis theory helps to avoid aggravating measurement errors and to adjust the number of measuring points to the required measuring uncertainty.
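The matrix form of the error propagation law has a compact least-squares illustration for the circle case. A sketch assuming a linearised (Kåsa-style) circle fit with independent noise; with correlated measurement points, as the paper stresses, the identity weight would be replaced by the full point covariance:

import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 12, endpoint=False)  # equidistant points
x = 5 + 2 * np.cos(t) + rng.normal(0, 0.01, t.size)
y = 3 + 2 * np.sin(t) + rng.normal(0, 0.01, t.size)

# Linearised circle fit: x^2 + y^2 = 2a x + 2b y + c, unknowns (a, b, c).
X = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
z = x**2 + y**2
theta, res, *_ = np.linalg.lstsq(X, z, rcond=None)
a, b, c = theta
radius = np.sqrt(c + a**2 + b**2)

# Parameter covariance via error propagation.  With correlated points one
# would use (X^T W^-1 X)^-1 for a full covariance W; here W = s^2 I.
s2 = res[0] / (len(z) - 3)
cov = s2 * np.linalg.inv(X.T @ X)
print((a, b, radius), np.sqrt(np.diag(cov)))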
A Just Culture Approach to Managing Medication Errors.
Rogers, Erin; Griffin, Emily; Carnie, William; Melucci, Joseph; Weber, Robert J
2017-04-01
Medication errors continue to be a concern of health care providers and the public, in particular how to prevent harm from medication mistakes. Many health care workers are afraid to report errors for fear of retribution including the loss of professional licensure and even imprisonment. Most health care workers are silent, instead of admitting their mistake and discussing it openly with peers. This can result in further patient harm if the system causing the mistake is not identified and fixed; thus self-denial may have a negative impact on patient care outcomes. As a result, pharmacy leaders, in collaboration with others, must put systems in place that serve to prevent medication errors while promoting a "Just Culture" way of managing performance and outcomes. This culture must exist across disciplines and departments. Pharmacy leaders need to understand how to classify behaviors associated with errors, set realistic expectations, instill values for staff, and promote accountability within the workplace. This article reviews the concept of Just Culture and provides ways that pharmacy directors can use this concept to manage the degree of error in patient-centered pharmacy services.
A Quantum Theoretical Explanation for Probability Judgment Errors
ERIC Educational Resources Information Center
Busemeyer, Jerome R.; Pothos, Emmanuel M.; Franco, Riccardo; Trueblood, Jennifer S.
2011-01-01
A quantum probability model is introduced and used to explain human probability judgment errors including the conjunction and disjunction fallacies, averaging effects, unpacking effects, and order effects on inference. On the one hand, quantum theory is similar to other categorization and memory models of cognition in that it relies on vector…
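The core mechanism, probabilities from projections onto non-commuting subspaces, can be shown in two dimensions: judging A then B yields a different probability than B then A. A toy numerical sketch, not the authors' fitted model:

import numpy as np

psi = np.array([1.0, 0.0])                       # initial belief state
P_A = np.array([[1.0, 0.0], [0.0, 0.0]])         # projector for event A
theta = np.pi / 3                                # event B's basis is rotated
v = np.array([np.cos(theta), np.sin(theta)])
P_B = np.outer(v, v)                             # projector for event B

# Sequential judgments as successive projections; squared norms give
# probabilities, and the order of projection matters.
p_A_then_B = np.linalg.norm(P_B @ P_A @ psi) ** 2
p_B_then_A = np.linalg.norm(P_A @ P_B @ psi) ** 2
print(p_A_then_B, p_B_then_A)                    # 0.25 vs 0.0625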
Textbook Error: Short Circuiting on Electrochemical Cell
ERIC Educational Resources Information Center
Bonicamp, Judith M.; Clark, Roy W.
2007-01-01
Short circuiting an electrochemical cell is an unreported but persistent error in electrochemistry textbooks. It is suggested that diagrams depicting a cell delivering usable current to a load be postponed, that the theory of open-circuit galvanic cells be explained, and that the voltages from the tables of standard reduction potentials be calculated and…
Sciacovelli, Laura; O'Kane, Maurice; Skaik, Younis Abdelwahab; Caciagli, Patrizio; Pellegrini, Cristina; Da Rin, Giorgio; Ivanov, Agnes; Ghys, Timothy; Plebani, Mario
2011-05-01
The adoption of Quality Indicators (QIs) has prompted the development of tools to measure and evaluate the quality and effectiveness of laboratory testing, first in the hospital setting and subsequently in ambulatory and other care settings. While Laboratory Medicine has an important role in the delivery of high-quality care, no consensus exists as yet on the use of QIs focussing on all steps of the laboratory total testing process (TTP), and further research in this area is required. In order to reduce errors in laboratory testing, the IFCC Working Group on "Laboratory Errors and Patient Safety" (WG-LEPS) developed a series of Quality Indicators, specifically designed for clinical laboratories. In the first phase of the project, specific QIs for key processes of the TTP were identified, including all the pre-, intra- and post-analytic steps. The overall aim of the project is to create a common reporting system for clinical laboratories based on standardized data collection, and to define state-of-the-art and Quality Specifications (QSs) for each QI independent of: a) the size of organization and type of activities; b) the complexity of processes undertaken; and c) different degree of knowledge and ability of the staff. The aim of the present paper is to report the results collected from participating laboratories from February 2008 to December 2009 and to identify preliminary QSs. The results demonstrate that a Model of Quality Indicators managed as an External Quality Assurance Program can serve as a tool to monitor and control the pre-, intra- and post-analytical activities. It might also allow clinical laboratories to identify risks that lead to errors resulting in patient harm: identification and design of practices that eliminate medical errors; the sharing of information and education of clinical and laboratory teams on practices that reduce or prevent errors; the monitoring and evaluation of improvement activities.
Hydrograph simulation models of the Hillsborough and Alafia Rivers, Florida: a preliminary report
Turner, James F.
1972-01-01
Mathematical (digital) models that simulate flood hydrographs from rainfall records have been developed for the following gaging stations in the Hillsborough and Alafia River basins of west-central Florida: Hillsborough River near Tampa, Alafia River at Lithia, and North Prong Alafia River near Keysville. These models, which were developed from historical streamflow and rainfall records, are based on rainfall-runoff and unit-hydrograph procedures involving an arbitrary separation of the flood hydrograph. These models assume the flood hydrograph to be composed of only two flow components: direct (storm) runoff and base flow. Expressions describing these two flow components are derived from streamflow and rainfall records and are combined analytically to form algorithms (models), which are programmed for processing on a digital computing system. Most Hillsborough and Alafia River flood discharges can be simulated with expected relative errors less than or equal to 30 percent and flood peaks can be simulated with average relative errors less than 15 percent. Because of the inadequate rainfall network that is used in obtaining input data for the North Prong Alafia River model, simulated peaks are frequently in error by more than 40 percent, particularly for storms having highly variable areal rainfall distribution. Simulation errors are the result of rainfall sample errors and, to a lesser extent, model inadequacy. Data errors associated with the determination of mean basin precipitation are the result of the small number and poor areal distribution of rainfall stations available for use in the study. Model inadequacy, however, is attributed to the basic underlying theory, particularly the rainfall-runoff relation. These models broaden and enhance existing water-management capabilities within these basins by allowing the establishment and implementation of programs providing for continued development in these areas. Specifically, the models serve not only as a basis for forecasting floods, but also for simulating hydrologic information needed in flood-plain mapping and delineating and evaluating alternative flood control and abatement plans.
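The two-component structure (direct runoff from a rainfall/unit-hydrograph convolution plus base flow) can be sketched in a few lines. All numbers below are illustrative, not calibrated to the Hillsborough or Alafia basins:

import numpy as np

# Direct (storm) runoff as the convolution of effective rainfall with a
# unit hydrograph, plus a base-flow component.
effective_rainfall = np.array([0.0, 0.4, 1.1, 0.6, 0.1])  # inches per period
unit_hydrograph = np.array([100, 350, 300, 150, 100.0])   # cfs per inch
base_flow = 120.0                                         # cfs

direct_runoff = np.convolve(effective_rainfall, unit_hydrograph)
simulated = base_flow + direct_runoff
print(simulated.round(1), "peak:", simulated.max())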
Tolerance assignment in optical design
NASA Astrophysics Data System (ADS)
Youngworth, Richard Neil
2002-09-01
Tolerance assignment is necessary in any engineering endeavor because fabricated systems---due to the stochastic nature of manufacturing and assembly processes---necessarily deviate from the nominal design. This thesis addresses the problem of optical tolerancing. The work can logically be split into three different components that all play an essential role. The first part addresses the modeling of manufacturing errors in contemporary fabrication and assembly methods. The second component is derived from the design aspect---the development of a cost-based tolerancing procedure. The third part addresses the modeling of image quality in an efficient manner that is conducive to the tolerance assignment process. The purpose of the first component, modeling manufacturing errors, is twofold---to determine the most critical tolerancing parameters and to understand better the effects of fabrication errors. Specifically, mid-spatial-frequency errors, typically introduced in sub-aperture grinding and polishing fabrication processes, are modeled. The implication is that improving process control and understanding better the effects of the errors makes the task of tolerance assignment more manageable. Conventional tolerancing methods do not directly incorporate cost. Consequently, tolerancing approaches tend to focus more on image quality. The goal of the second part of the thesis is to develop cost-based tolerancing procedures that facilitate optimum system fabrication by generating the loosest acceptable tolerances. This work has the potential to impact a wide range of optical designs. The third element, efficient modeling of image quality, is directly related to the cost-based optical tolerancing method. Cost-based tolerancing requires efficient and accurate modeling of the effects of errors on the performance of optical systems. Thus it is important to be able to compute the gradient and the Hessian, with respect to the parameters that need to be toleranced, of the figure of merit that measures the image quality of a system. An algebraic method for computing the gradient and the Hessian is developed using perturbation theory.
NASA Astrophysics Data System (ADS)
Kesler, Steven R.
The lifting line theory was first developed by Prandtl and was used primarily in the analysis of airplane wings. Though the theory is about one hundred years old, it is still used in initial calculations to find the lift of a wing. The question that guided this thesis was, "How close does Prandtl's lifting line theory predict the thrust of a propeller?" In order to answer this question, an experiment was designed that measured the thrust of a propeller at different speeds, and the measured thrust was compared to what the theory predicted. A walnut wood ultralight propeller measuring 1.30 meters (51 inches) from tip to tip was chosen for the experiment. In this thesis, Prandtl's lifting line theory was modified to account for the incoming velocity varying with the radial position of the airfoil, and a working code was developed based on the modified equation. A testing rig was built that allowed the propeller to be rotated at high speeds while measuring the thrust. During testing, the rotational speed of the propeller ranged from 13 to 43 rotations per second, and the measured thrust ranged from 16 to 33 newtons. The test data were then compared to the theoretical results obtained from the lifting line code. A plot in Chapter 5 (the results section) shows the theoretical vs. actual thrust for different rotational speeds. The theory over-predicted the actual thrust of the propeller: the error was 36% at low speeds, 84% at low to moderate speeds, and up to 195% at high speeds. Different reasons for these errors are discussed.
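A toy blade-element estimate reproduces the ingredient the thesis adds to the lifting line equation, the radially varying incoming speed W(r) = sqrt(V^2 + (omega r)^2). The sketch below ignores induced inflow and tip losses, one reason such estimates overpredict thrust; all geometry values are illustrative, not the walnut propeller's:

import numpy as np

def toy_thrust(v_inf, rps, blades=2, radius=0.65, chord=0.05,
               pitch_deg=15.0, rho=1.225, n_elems=50):
    """Toy blade-element thrust with radially varying inflow.

    Captures the W(r) variation the thesis builds into the lifting line
    equation; ignores induced inflow and tip losses, and uses the
    thin-airfoil lift slope, so it will overpredict.
    """
    omega = 2 * np.pi * rps
    r = np.linspace(0.15 * radius, radius, n_elems)
    dr = r[1] - r[0]
    W = np.hypot(v_inf, omega * r)               # local incoming speed
    phi = np.arctan2(v_inf, omega * r)           # inflow angle
    alpha = np.radians(pitch_deg) - phi          # local angle of attack
    cl = 2 * np.pi * alpha                       # thin-airfoil lift slope
    dT = 0.5 * rho * W**2 * chord * cl * np.cos(phi) * dr
    return blades * dT.sum()

for rps in (13, 28, 43):
    print(rps, "rev/s ->", round(toy_thrust(v_inf=2.0, rps=rps), 1), "N")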
A Grounded Theory Study of Aircraft Maintenance Technician Decision-Making
NASA Astrophysics Data System (ADS)
Norcross, Robert
Aircraft maintenance technician decision-making and actions have resulted in aircraft system errors causing aircraft incidents and accidents. Aircraft accident investigators and researchers have examined the factors that influence aircraft maintenance technician errors and categorized the types of errors in an attempt to prevent similar occurrences. New aircraft technology introduced to improve aviation safety and efficiency incurs failures for which no information is contained in the aircraft maintenance manuals. According to the Federal Aviation Administration, aircraft maintenance technicians must use only approved aircraft maintenance documents to repair, modify, and service aircraft. This qualitative research used a grounded theory approach to explore the decision-making processes and actions taken by aircraft maintenance technicians when confronted with an aircraft problem not covered in the aircraft maintenance manuals. The target population for the research was Federal Aviation Administration licensed airframe and power plant mechanics from across the United States. Nonprobability purposeful sampling was used to obtain aircraft maintenance technicians with the experience sought in the study problem. The sample population recruitment yielded 19 participants for eight focus group sessions to obtain opinions, perceptions, and experiences related to the study problem. All data collected were entered into the ATLAS.ti qualitative analysis software. Themes emerged regarding Aircraft Maintenance Manual content, Aircraft Maintenance Technician experience, and the legal implications of not following Aircraft Maintenance Manuals. Conclusions from this study suggest that Aircraft Maintenance Technician decision-making was influenced by experience, gaps in the Aircraft Maintenance Manuals, reliance on others, awareness of the impact of decisions concerning aircraft airworthiness, management pressures, and legal concerns related to decision-making. Recommendations included an in-depth systematic review of the Aircraft Maintenance Manuals, development of a Federal Aviation Administration approved standardized Aircraft Maintenance Technician decision-making flow diagram, and implementation of risk-based decision-making training. The benefit of this study is to save the airline industry revenue by preventing poor decision-making practices that result in inefficient maintenance actions and aircraft incidents and accidents.
NASA Astrophysics Data System (ADS)
Yang, Shuai; Wu, Wei; Wang, Xingshu; Xu, Zhiguang
2018-01-01
The coupling error in the measurement of ship hull deformation can significantly influence the attitude accuracy of shipborne weapons and equipment. It is therefore important to study the characteristics of the coupling error. In this paper, a comprehensive investigation of the coupling error is reported, which offers a potential route to reducing the coupling error in the future. Firstly, the causes and characteristics of the coupling error are analyzed theoretically, based on the basic theory of measuring ship deformation. Then, simulations are conducted to verify the correctness of the theoretical analysis. Simulation results show that the cross-correlation between dynamic flexure and ship angular motion leads to the coupling error in measuring ship deformation, and the coupling error increases with the correlation value between them. All the simulation results coincide with the theoretical analysis.
Bank, Ilana; Snell, Linda; Bhanji, Farhan
2014-12-01
Improved pediatric crisis resource management (CRM) training is needed in emergency medicine residencies because of the variable nature of exposure to critically ill pediatric patients during training. We created a short, needs-based pediatric CRM simulation workshop with postactivity follow-up to determine retention of CRM knowledge. Our aims were to provide a realistic learning experience for residents and to help the learners recognize common errors in teamwork and improve their perceived abilities to manage ill pediatric patients. Residents participated in a 4-hour objectives-based workshop derived from a formal needs assessment. To quantify their subjective abilities to manage pediatric cases, the residents completed a postworkshop survey (with a retrospective precomponent to assess perceived change). Ability to identify CRM errors was determined via a written assessment of scripted errors in a prerecorded video observed before and 1 month after completion of the workshop. Fifteen of the 16 eligible emergency medicine residents (postgraduate year 1-5) attended the workshop and completed the surveys. There were significant differences in 15 of 16 retrospective pre-to-post survey items using the Wilcoxon rank sum test for non-parametric data. These included ability to be an effective team leader in general (P < 0.008), delegating tasks appropriately (P < 0.009), and ability to ensure closed-loop communication (P < 0.008). There was a significant improvement in identification of CRM errors on the video assessment, from 3 to 7 of the 12 scripted CRM errors (P < 0.006). The pediatric CRM simulation-based workshop improved the residents' self-perceptions of their pediatric CRM abilities and improved their performance on a video assessment task.
Thinly disguised contempt: a barrier to excellence.
Brown-Stewart, P
1987-04-01
Many elements in contemporary leadership and management convey contempt for employees. "Thinly disguised contempt," a concept introduced by Peters and Austin in A Passion For Excellence, explains many barriers to the achievement of excellence in corporations across disciplines. Health care executives and managers can learn from the errors of corporate management and avoid replicating these errors in the health care industry.
NASA Astrophysics Data System (ADS)
Wang, Dong; Ding, Hao; Singh, Vijay P.; Shang, Xiaosan; Liu, Dengfeng; Wang, Yuankun; Zeng, Xiankui; Wu, Jichun; Wang, Lachun; Zou, Xinqing
2015-05-01
For scientific and sustainable management of water resources, hydrologic and meteorologic data series often need to be extended. This paper proposes a hybrid approach, named WA-CM (wavelet analysis-cloud model), for data series extension. Wavelet analysis has time-frequency localization features, known as the "mathematics microscope," that can decompose and reconstruct hydrologic and meteorologic series by wavelet transform. The cloud model is a mathematical representation of fuzziness and randomness and has strong robustness for uncertain data. The WA-CM approach first employs the wavelet transform to decompose the measured nonstationary series and then uses the cloud model to develop an extension model for each decomposition layer series. The final extension is obtained by summing the results of extension of each layer. Two kinds of meteorologic and hydrologic data sets with different characteristics and different influence of human activity from six (three pairs) representative stations are used to illustrate the WA-CM approach. The approach is also compared with four other methods: the conventional correlation extension method, the Kendall-Theil robust line method, artificial neural network methods (back propagation, multilayer perceptron, and radial basis function), and the single cloud model method. To evaluate the model performance completely and thoroughly, five measures are used: relative error, mean relative error, standard deviation of relative error, root mean square error, and the Theil inequality coefficient. Results show that the WA-CM approach is effective, feasible, and accurate, and it is found to be better than the other four methods compared. The theory employed and the approach developed here can be applied to extension of data in other areas as well.
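The decompose-extend-sum structure of WA-CM can be sketched with PyWavelets, with a deliberately crude stand-in for the per-layer model. The cloud model itself is not implemented here; a linear trend takes its place purely for illustration:

import numpy as np
import pywt

def mra_layers(series, wavelet="db4", level=3):
    """Split a series into additive layers via wavelet multiresolution."""
    coeffs = pywt.wavedec(series, wavelet, level=level)
    layers = []
    for i in range(len(coeffs)):
        keep = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        layers.append(pywt.waverec(keep, wavelet)[: len(series)])
    return layers  # the layers sum back (approximately) to the series

def extend_layer(layer, steps):
    """Stand-in per-layer extension model (linear trend); the paper fits
    a cloud model here, which this sketch does not implement."""
    t = np.arange(len(layer))
    slope, intercept = np.polyfit(t, layer, 1)
    return intercept + slope * np.arange(len(layer), len(layer) + steps)

series = (np.sin(np.linspace(0, 8 * np.pi, 256))
          + 0.1 * np.random.default_rng(2).normal(size=256))
extension = sum(extend_layer(layer, 12) for layer in mra_layers(series))
print(extension.round(3))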
NASA Astrophysics Data System (ADS)
Pries-Heje, Jan; Baskerville, Richard L.
This paper elaborates a design science approach for management planning anchored to the concept of a management design theory. Unlike the notions of design theories arising from information systems, management design theories can appear as a system of technological rules, much as a system of hypotheses or propositions can embody scientific theories. The paper illustrates this form of management design theories with three grounded cases. These grounded cases include a software process improvement study, a user involvement study, and an organizational change study. Collectively these studies demonstrate how design theories founded on technological rules can not only improve the design of information systems, but that these concepts have great practical value for improving the framing of strategic organizational design decisions about such systems. Each case is either grounded in an empirical sense, that is to say, actual practice, or it is grounded to practices described extensively in the practical literature. Such design theories will help managers more easily approach complex, strategic decisions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schatz, G.C.; Kuppermann, A.
1979-03-15
It is shown that the phase factor i^(J+j+l) omitted in a theory of atom-diatom nonreactive scattering formulated by Schatz and Kuppermann but included in the Choi, Poe, and Tang theory does not lead to errors in the analysis of the (H,H2) system.
[Errors in medicine. Causes, impact and improvement measures to improve patient safety].
Waeschle, R M; Bauer, M; Schmidt, C E
2015-09-01
The guarantee of quality of care and patient safety is of major importance in hospitals, even though increased economic pressure and work intensification are ubiquitously present. Nevertheless, adverse events still occur in 3-4% of hospital stays, and of these 25-50% are estimated to be avoidable. The identification of possible causes of error and the development of measures for the prevention of medical errors are essential for patient safety. The implementation and continuous development of a constructive culture of error tolerance are fundamental. The origins of errors can be differentiated into systemic latent and individual active causes; components of both categories are typically involved when an error occurs. Systemic causes are, for example, outdated structural environments, lack of clinical standards and low personnel density. These causes arise far away from the patient, e.g. in management decisions, and can remain unrecognized for a long time. Individual causes involve, e.g. confirmation bias, fixation error and prospective memory failure. These causes have a direct impact on patient care and can result in immediate injury to patients. Stress, unclear information, complex systems and a lack of professional experience can promote individual causes. Awareness of possible causes of error is a fundamental precondition for establishing appropriate countermeasures. Error prevention should include actions directly affecting the causes of error, including checklists and standard operating procedures (SOP) to avoid fixation and prospective memory failure, and team resource management to improve communication and the generation of collective mental models. Critical incident reporting systems (CIRS) provide the opportunity to learn from previous incidents without resulting in injury to patients. Information technology (IT) support systems, such as the computerized physician order entry system, assist in the prevention of medication errors by providing information on dosage, pharmacological interactions, side effects and contraindications of medications. The major challenge for quality and risk management, for the heads of departments and for the executive board is the implementation and support of the described actions and sustained guidance of the staff involved in the modification management process. The global trigger tool is suitable for improving transparency and objectifying the frequency of medical errors.
Fink, Reinhold F
2016-11-14
We show analytically and numerically that the performance of second order Møller-Plesset (MP) perturbation theory (PT), coupled-cluster (CC) theory, and other perturbation theory approaches can be rationalized by analyzing the wavefunctions of these methods. While rather large deviations for the individual contributions of configurations to the electron correlation energy are found for MP wavefunctions, they profit from an advantageous and robust error cancellation: The absolute contribution to the correlation energy is generally underestimated for the critical excitations with small energy denominators and all other doubly excited configurations where the two excited electrons are coupled to a singlet. This is balanced by an overestimation of the contribution of triplet-coupled double excitations to the correlation energy. The even better performance of spin-component-scaled-MP2 theory is explained by a similar error compensation effect. The wavefunction analysis for the lowest singlet states of H2O, CH2, CO, and Cu+ shows the predicted trends for MP methods, rapid but biased convergence of CC theory as well as the substantial potential of linearized CC, or retaining the excitation-degree (RE)-PT.
Medication errors: definitions and classification
Aronson, Jeffrey K
2009-01-01
To understand medication errors and to identify preventive strategies, we need to classify them and define the terms that describe them. The four main approaches to defining technical terms consider etymology, usage, previous definitions, and the Ramsey–Lewis method (based on an understanding of theory and practice). A medication error is ‘a failure in the treatment process that leads to, or has the potential to lead to, harm to the patient’. Prescribing faults, a subset of medication errors, should be distinguished from prescription errors. A prescribing fault is ‘a failure in the prescribing [decision-making] process that leads to, or has the potential to lead to, harm to the patient’. The converse of this, ‘balanced prescribing’ is ‘the use of a medicine that is appropriate to the patient's condition and, within the limits created by the uncertainty that attends therapeutic decisions, in a dosage regimen that optimizes the balance of benefit to harm’. This excludes all forms of prescribing faults, such as irrational, inappropriate, and ineffective prescribing, underprescribing and overprescribing. A prescription error is ‘a failure in the prescription writing process that results in a wrong instruction about one or more of the normal features of a prescription’. The ‘normal features’ include the identity of the recipient, the identity of the drug, the formulation, dose, route, timing, frequency, and duration of administration. Medication errors can be classified, invoking psychological theory, as knowledge-based mistakes, rule-based mistakes, action-based slips, and memory-based lapses. This classification informs preventive strategies. PMID:19594526
Decision aids for multiple-decision disease management as affected by weather input errors.
Pfender, W F; Gent, D H; Mahaffee, W F; Coop, L B; Fox, A D
2011-06-01
Many disease management decision support systems (DSSs) rely, exclusively or in part, on weather inputs to calculate an indicator for disease hazard. Error in the weather inputs, typically due to forecasting, interpolation, or estimation from off-site sources, may affect model calculations and management decision recommendations. The extent to which errors in weather inputs affect the quality of the final management outcome depends on a number of aspects of the disease management context, including whether management consists of a single dichotomous decision, or of a multi-decision process extending over the cropping season(s). Decision aids for multi-decision disease management typically are based on simple or complex algorithms of weather data which may be accumulated over several days or weeks. It is difficult to quantify accuracy of multi-decision DSSs due to temporally overlapping disease events, existence of more than one solution to optimizing the outcome, opportunities to take later recourse to modify earlier decisions, and the ongoing, complex decision process in which the DSS is only one component. One approach to assessing importance of weather input errors is to conduct an error analysis in which the DSS outcome from high-quality weather data is compared with that from weather data with various levels of bias and/or variance from the original data. We illustrate this analytical approach for two types of DSS, an infection risk index for hop powdery mildew and a simulation model for grass stem rust. Further exploration of analysis methods is needed to address problems associated with assessing uncertainty in multi-decision DSSs.
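The error analysis the authors describe, comparing DSS output from high-quality weather data against output from biased or noisy versions of the same data, can be sketched with a toy risk index. The thresholds and bias levels below are illustrative, not the hop powdery mildew model:

import numpy as np

def risk_index(temperature, humidity):
    """Toy daily infection-risk indicator; thresholds are illustrative."""
    return ((temperature > 18) & (temperature < 27) & (humidity > 0.8)).astype(int)

rng = np.random.default_rng(3)
temp = 20 + 5 * np.sin(np.linspace(0, 6, 120)) + rng.normal(0, 1, 120)
rh = np.clip(0.75 + 0.15 * np.sin(np.linspace(1, 7, 120)), 0, 1)

baseline = risk_index(temp, rh)
for bias in (-2.0, 0.0, 2.0):    # e.g. off-site or forecast temperature bias
    noisy = risk_index(temp + bias + rng.normal(0, 1.5, temp.size), rh)
    disagreement = np.mean(noisy != baseline)
    print(f"bias {bias:+.1f} degC -> decisions changed on {disagreement:.0%} of days")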
Balancing the books - a statistical theory of prospective budgets in Earth System science
NASA Astrophysics Data System (ADS)
O'Kane, J. Philip
An honest declaration of the error in a mass, momentum or energy balance, ɛ, simply raises the question of its acceptability: "At what value of ɛ is the attempted balance to be rejected?" Answering this question requires a reference quantity against which to compare ɛ. This quantity must be a mathematical function of all the data used in making the balance. To deliver this function, a theory grounded in a workable definition of acceptability is essential. A distinction must be drawn between a retrospective balance and a prospective budget in relation to any natural space-filling body. Balances look to the past; budgets look to the future. The theory is built on the application of classical sampling theory to the measurement and closure of a prospective budget. It satisfies R.A. Fisher's "vital requirement that the actual and physical conduct of experiments should govern the statistical procedure of their interpretation". It provides a test, which rejects, or fails to reject, the hypothesis that the closing error on the budget, when realised, was due to sampling error only. By increasing the number of measurements, the discrimination of the test can be improved, controlling both the precision and accuracy of the budget and its components. The cost-effective design of such measurement campaigns is discussed briefly. This analysis may also show when campaigns to close a budget on a particular space-filling body are not worth the effort for either scientific or economic reasons. Other approaches, such as those based on stochastic processes, lack this finality, because they fail to distinguish between different types of error in the mismatch between a set of realisations of the process and the measured data.
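A minimal sketch of such a test under simplifying assumptions (independently sampled budget components with known standard errors, so the closing error is approximately normal under the null); the paper derives the reference quantity from the actual design of the measurement campaign:

import math

def closing_error_test(epsilon, component_ses, alpha=0.05):
    """Test H0: the budget closing error is due to sampling error alone.

    Assumes independently sampled components whose standard errors are
    known, so epsilon ~ N(0, sum of variances) under H0.
    """
    se = math.sqrt(sum(s**2 for s in component_ses))
    z = epsilon / se
    p = math.erfc(abs(z) / math.sqrt(2))   # two-sided normal p-value
    return z, p, p < alpha

# Inflow - outflow - storage change = 14 units; component standard errors:
print(closing_error_test(14.0, [5.0, 6.0, 3.0]))

Adding measurements shrinks the component standard errors and sharpens the test, which is the sense in which campaign design controls the discrimination of the budget.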
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-15
... management of human error in its operations and system safety programs, and the status of PTC implementation... UP's safety management policies and programs associated with human error, operational accident and... Chairman of the Board of Inquiry 2. Introduction of the Board of Inquiry and Technical Panel 3...
Estimating the Imputed Social Cost of Errors of Measurement.
1983-10-01
To estimate the imputed social cost of an error of measurement in the score on a unidimensional test, an asymptotic method, based on item response theory, is developed. (RR-83-33-ONR, Frederic M. Lord.) This research was sponsored in part by the Personnel and Training Research Programs, Psychological…
Data Analysis & Statistical Methods for Command File Errors
NASA Technical Reports Server (NTRS)
Meshkat, Leila; Waggoner, Bruce; Bryant, Larry
2014-01-01
This paper explains current work on modeling for managing the risk of command file errors. It is focused on analyzing actual data from a JPL spaceflight mission to build models for evaluating and predicting error rates as a function of several key variables. We constructed a rich dataset by considering the number of errors and the number of files radiated, including the number of commands and blocks in each file, as well as subjective estimates of workload and operational novelty. We have assessed these data using different curve-fitting and distribution-fitting techniques, such as multiple regression analysis and maximum likelihood estimation, to see how much of the variability in the error rates can be explained by them. We have also used goodness-of-fit testing strategies and principal component analysis to further assess our data. Finally, we constructed a model of expected error rates based on what these statistics bore out as the critical drivers of the error rate. This model allows project management to evaluate the error rate against a theoretically expected rate as well as anticipate future error rates.
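Count data with an exposure term is a natural frame for error rates per file radiated; a sketch using synthetic stand-in data and a Poisson generalized linear model (one plausible choice, not necessarily the fits used on the mission data):

import numpy as np
import statsmodels.api as sm

# Synthetic stand-in for the mission data: error counts per period as a
# function of files radiated, workload, and novelty (all illustrative).
rng = np.random.default_rng(4)
n = 60
files = rng.integers(5, 80, n)
workload = rng.uniform(0, 1, n)
novelty = rng.uniform(0, 1, n)
rate = np.exp(-4.0 + 1.2 * workload + 0.8 * novelty)   # errors per file
errors = rng.poisson(rate * files)

X = sm.add_constant(np.column_stack([workload, novelty]))
model = sm.GLM(errors, X, family=sm.families.Poisson(), exposure=files)
fit = model.fit()
print(fit.params)   # recovers roughly (-4.0, 1.2, 0.8)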
Saunders, B; Houghton, M
1996-01-01
How a problem is understood dictates how it is responded to. In this paper the problem of relapse and alcohol dependence is reconsidered. The existing major relapse paradigm is evaluated against the last two decades of research. It is concluded that the available research strongly questions the notion that relapse is an addiction-specific event. Instead, relapse is probably better understood as a complex, generic, human behaviour, undertaken at times by all of us. Given this, it is possible that mainstream psychological theories, such as decision making and attribution theory, are important in coming to any understanding of the phenomenon of relapse. It is also contended that the investigation of relapse is potentially an error of focus. Such study invites the investigation of those who do not succeed in changing behaviour, as against the study of those who do. For those concerned with the treatment of alcohol dependence, studying the successes may be a more informative process than studying the putative failures. Given the burgeoning of research over the past two decades, the impact on treatment practice is reviewed. It is concluded that relapse prevention and management is very much on the alcohol-intervention agenda. However, the research evidence to date is consistent with the general psychotherapy literature in that doing something appears better than no intervention, but an optimum, effective intervention has yet to be devised.
An Alternative Approach to Analyze Ipsative Data. Revisiting Experiential Learning Theory.
Batista-Foguet, Joan M; Ferrer-Rosell, Berta; Serlavós, Ricard; Coenders, Germà; Boyatzis, Richard E
2015-01-01
The ritualistic use of statistical models regardless of the type of data actually available is a common practice across disciplines, which we dare to call type zero error. Statistical models involve a series of assumptions whose existence is often neglected altogether; this is especially the case with ipsative data. This paper illustrates the consequences of this ritualistic practice within Kolb's Experiential Learning Theory (ELT) operationalized through its Learning Style Inventory (KLSI). We show how, using a methodology well known in other disciplines (compositional data analysis, CODA, and log-ratio transformations), KLSI data can be properly analyzed. In addition, the method has theoretical implications: a third dimension of the KLSI is unveiled, providing room for future research. This third dimension describes an individual's relative preference for learning by prehension rather than by transformation. Using a sample of international MBA students, we relate this dimension to another self-assessment instrument, the Philosophical Orientation Questionnaire (POQ), and to an observer-assessed instrument, the Emotional and Social Competency Inventory (ESCI-U). Both show plausible statistical relationships. An intellectual operating philosophy (IOP) is linked to a preference for prehension, whereas a pragmatic operating philosophy (POP) is linked to transformation. Self-management and social awareness competencies are linked to a learning preference for transforming knowledge, whereas relationship management and cognitive competencies are more related to approaching learning by prehension.
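The CODA step rests on log-ratio transformations; a minimal sketch of the centred log-ratio (clr) transform for one respondent's forced-total scores (the numbers are illustrative, not KLSI data):

import numpy as np

def clr(scores):
    """Centred log-ratio transform for one respondent's ipsative scores.

    Maps a composition (positive scores with a fixed total, as forced
    rankings produce) into real coordinates where the usual statistical
    machinery applies; a sketch of the CODA step, not the authors'
    full analysis.
    """
    x = np.asarray(scores, dtype=float)
    logs = np.log(x)
    return logs - logs.mean()

# Four mode scores that sum to a fixed total (illustrative):
print(clr([30, 25, 25, 20]))   # coordinates sum to zero by construction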
My copilot is a nurse--using crew resource management in the OR.
Powell, Stephen M; Hill, Ruth Kimberly
2006-01-01
Crew resource management (CRM) has been used for more than 20 years in the aviation industry to teach individual error countermeasures by developing nontechnical (ie, cognitive, social) skills based on the observed traits of successful individuals and crews. The health care industry began to investigate aviation CRM after the Institute of Medicine's report, To Err is Human: Building a Safer Health System, recommended that medicine adopt aviation's approach to safety and error management. Initial results of implementing CRM in health care arenas have demonstrated reduced adverse outcomes, reduced errors, reduced length of stay, improved nurse retention, and changed attitudes and behaviors toward teamwork.
Social Errors in Four Cultures: Evidence about Universal Forms of Social Relations.
ERIC Educational Resources Information Center
Fiske, Alan Page
1993-01-01
To test the cross-cultural generality of relational-models theory, 4 studies with 70 adults examined social errors of substitution of persons for Bengali, Korean, Chinese, and Vai (Liberia and Sierra Leone) subjects. In all four cultures, people tend to substitute someone with whom they have the same basic relationship. (SLD)
ERIC Educational Resources Information Center
Du, Yunfei
This paper discusses the impact of sampling error on the construction of confidence intervals around effect sizes. Sampling error affects the location and precision of confidence intervals. Meta-analytic resampling demonstrates that confidence intervals can haphazardly bounce around the true population parameter. Special software with graphical…
An Application of Multivariate Generalizability in Selection of Mathematically Gifted Students
ERIC Educational Resources Information Center
Kim, Sungyeun; Berebitsky, Dan
2016-01-01
This study investigates error sources and the effects of each error source to determine optimal weights of the composite score of teacher recommendation letters and self-introduction letters using multivariate generalizability theory. Data were collected from the science education institute for the gifted attached to the university located within…
Quality Control of an OSCE Using Generalizability Theory and Many-Faceted Rasch Measurement
ERIC Educational Resources Information Center
Iramaneerat, Cherdsak; Yudkowsky, Rachel; Myford, Carol M.; Downing, Steven M.
2008-01-01
An Objective Structured Clinical Examination (OSCE) is an effective method for evaluating competencies. However, scores obtained from an OSCE are vulnerable to many potential measurement errors that cases, items, or standardized patients (SPs) can introduce. Monitoring these sources of errors is an important quality control mechanism to ensure…
Sum-rule corrections: A route to error cancellations in correlation matrix renormalisation theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, C.; Liu, J.; Yao, Y. X.; ...
2017-01-16
Here, we recently proposed the correlation matrix renormalisation (CMR) theory to efficiently and accurately calculate the ground state total energy of molecular systems, based on the Gutzwiller variational wavefunction (GWF) to treat the electronic correlation effects. To help reduce numerical complications and better adapt the CMR to infinite lattice systems, we need to further refine the way to minimise the error originating from the approximations in the theory. This conference proceeding reports our recent progress on this key issue: namely, we obtained a simple analytical functional form for the one-electron renormalisation factors, and introduced a novel sum-rule correction for a more accurate description of the intersite electron correlations. Benchmark calculations are performed on a set of molecules to show the reasonable accuracy of the method.
Verification of floating-point software
NASA Technical Reports Server (NTRS)
Hoover, Doug N.
1990-01-01
Floating point computation presents a number of problems for formal verification. Should one treat the actual details of floating point operations, accept them as imprecisely defined, or ignore round-off error altogether and behave as if floating point operations are perfectly accurate? There is the further problem that a numerical algorithm usually only approximately computes some mathematical function, and we often do not know just how good the approximation is, even in the absence of round-off error. ORA has developed a theory of asymptotic correctness which allows one to verify floating point software with minimal entanglement in these problems. This theory and its implementation in the Ariel C verification system are described. The theory is illustrated using a simple program which finds a zero of a given function by bisection. This paper is presented in viewgraph form.
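The bisection illustration is easy to reproduce. The sketch below is a generic version (not ORA's Ariel program) whose stopping rule reflects the floating-point reality discussed above: iterate until the interval can no longer be split, rather than until an exact zero is found:

```python
def bisect(f, lo, hi):
    """Find an approximate zero of f in [lo, hi], given a sign change.

    The loop stops when the floating-point midpoint stops making
    progress - the interval cannot be split further - which is an
    asymptotic rather than exact notion of correctness.
    """
    if f(lo) * f(hi) > 0:
        raise ValueError("f(lo) and f(hi) must bracket a root")
    while True:
        mid = lo + (hi - lo) / 2.0
        if mid == lo or mid == hi:   # no representable point in between
            return mid
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid

print(bisect(lambda x: x * x - 2.0, 0.0, 2.0))  # ~1.4142135623730951
```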
Fresnel diffraction by spherical obstacles
NASA Technical Reports Server (NTRS)
Hovenac, Edward A.
1989-01-01
Lommel functions were used to solve the Fresnel-Kirchhoff diffraction integral for the case of a spherical obstacle. Comparisons were made between Fresnel diffraction theory and Mie scattering theory. Fresnel theory was then compared to experimental data. Experiment and theory typically deviated from one another by less than 10 percent. A unique experimental setup using mercury spheres suspended in a viscous fluid significantly reduced optical noise. The major source of error was the Gaussian-shaped laser beam.
Aviation accidents and the theory of the situation
NASA Technical Reports Server (NTRS)
Bolman, L.
1980-01-01
Social-psychological factors affecting the performance of flight crews are examined. In particular, a crew member's perceptual-psychological constructs of the flight situation (theories of the situation) are discussed. The skills and willingness of a flight crew to be alert to possible errors in the theory become critical to their effectiveness and their ability to ensure a safe flight. Several major factors that determine the likelihood that a faulty theory will be detected and revised are identified.
Using the theory of small perturbations in performance calculations of the RBMK
DOE Office of Scientific and Technical Information (OSTI.GOV)
Isaev, N.V.; Druzhinin, V.E.; Pogosbekyan, L.R.
The theory of small perturbations in reactor physics is discussed and applied to two-dimensional calculations of the RBMK. The classical theory of small perturbations implies considerable errors in calculations because the perturbations cannot be considered small. The modified theory of small perturbations presented here can be used in atomic power stations for determining reactivity effects and reloading rates of channels in reactors and also for assessing the reactivity storage in control rods.
Measurement error: Implications for diagnosis and discrepancy models of developmental dyslexia.
Cotton, Sue M; Crewther, David P; Crewther, Sheila G
2005-08-01
The diagnosis of developmental dyslexia (DD) is reliant on a discrepancy between intellectual functioning and reading achievement. Discrepancy-based formulae have frequently been employed to establish the significance of the difference between 'intelligence' and 'actual' reading achievement. These formulae, however, often fail to take into consideration test reliability and the error associated with a single test score. This paper provides an illustration of the potential effects that test reliability and measurement error can have on the diagnosis of dyslexia, with particular reference to discrepancy models. The roles of reliability and standard error of measurement (SEM) in classic test theory are also briefly reviewed. This is followed by illustrations of how SEM and test reliability can aid with the interpretation of a simple discrepancy-based formula of DD. It is proposed that a lack of consideration of test theory in the use of discrepancy-based models of DD can lead to misdiagnosis (both false positives and false negatives). Further, misdiagnosis in research samples affects reproducibility and generalizability of findings. This in turn, may explain current inconsistencies in research on the perceptual, sensory, and motor correlates of dyslexia.
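A worked sketch of the classical-test-theory quantities at issue (the standard deviation and reliabilities are illustrative, not taken from the paper):

```python
import math

def sem(sd, reliability):
    """Standard error of measurement under classical test theory."""
    return sd * math.sqrt(1.0 - reliability)

# Hypothetical IQ-style scales with SD = 15.
sem_iq = sem(15.0, 0.95)        # intelligence test
sem_read = sem(15.0, 0.90)      # reading achievement test

# Standard error of the difference between the two scores
# (assuming independent errors).
se_diff = math.sqrt(sem_iq**2 + sem_read**2)
print(round(se_diff, 2))        # ~5.81 points

# A 95% band of roughly +/- 1.96 * 5.81 ~ 11.4 points means a sizeable
# observed discrepancy may still be measurement noise - the false
# positives and false negatives the authors warn about.
```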
NASA Technical Reports Server (NTRS)
O'Bryan, Thomas C; Danforth, Edward C B; Johnston, J Ford
1955-01-01
The magnitude and variation of the static-pressure error for various distances ahead of sharp-nose bodies and open-nose air inlets and for a distance of 1 chord ahead of the wing tip of a swept wing are defined by a combination of experiment and theory. The mechanism of the error is discussed in some detail to show the contributing factors that make up the error. The information presented provides a useful means for choosing a proper location for measurement of static pressure for most purposes.
Multipath induced errors in meteorological Doppler/interferometer location systems
NASA Technical Reports Server (NTRS)
Wallace, R. G.
1984-01-01
One application of an RF interferometer aboard a low-orbiting spacecraft to determine the location of ground-based transmitters is in tracking high-altitude balloons for meteorological studies. A source of error in this application is reflection of the signal from the sea surface. Through propagation and signal analysis, the magnitude of the reflection-induced error in both Doppler frequency measurements and interferometer phase measurements was estimated. The theory of diffuse scattering from random surfaces was applied to obtain the power spectral density of the reflected signal. The processing of the combined direct and reflected signals was then analyzed to find the statistics of the measurement error. It was found that the error varies greatly during the satellite overpass and attains its maximum value at closest approach. The maximum values of interferometer phase error and Doppler frequency error found for the system configuration considered were comparable to thermal noise-induced error.
A theory for predicting composite laminate warpage resulting from fabrication
NASA Technical Reports Server (NTRS)
Chamis, C. C.
1975-01-01
Linear laminate theory is used in conjunction with the moment-curvature relationship to derive equations for predicting end deflections due to warpage without solving the coupled fourth-order partial differential equations of the plate. Using these equations, it is found that a 1 deg error in the orientation angle of one ply is sufficient to produce warpage end deflection equal to two laminate thicknesses in a 10 inch by 10 inch laminate made from 8-ply Mod-I/epoxy. From a sensitivity analysis on the governing parameters, it is found that a 3 deg fiber migration or a void volume ratio of three percent in some plies is sufficient to produce laminate warpage corner deflection equal to several laminate thicknesses. Tabular and graphical data are presented which can be used to identify possible errors contributing to laminate warpage and/or to obtain an a priori assessment when unavoidable errors during fabrication are anticipated.
ecco: An error correcting comparator theory.
Ghirlanda, Stefano
2018-03-08
Building on the work of Ralph Miller and coworkers (Miller and Matzel, 1988; Denniston et al., 2001; Stout and Miller, 2007), I propose a new formalization of the comparator hypothesis that seeks to overcome some shortcomings of existing formalizations. The new model, dubbed ecco for "Error-Correcting COmparisons," retains the comparator process and the learning of CS-CS associations based on contingency. ecco assumes, however, that learning of CS-US associations is driven by total error correction, as first introduced by Rescorla and Wagner (1972). I explore ecco's behavior in acquisition, compound conditioning, blocking, backward blocking, and unovershadowing. In these paradigms, ecco appears capable of avoiding the problems of current comparator models, such as the inability to solve some discriminations and some paradoxical effects of stimulus salience. At the same time, ecco exhibits the retrospective revaluation phenomena that are characteristic of comparator theory. Copyright © 2018 Elsevier B.V. All rights reserved.
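The total-error-correction ingredient that ecco borrows from Rescorla and Wagner (1972) is compact enough to sketch; the snippet below implements the classic rule itself (not ecco's comparator machinery):

```python
def rescorla_wagner(trials, alphas, beta=0.3, lam=1.0):
    """Total-error-correction learning (Rescorla & Wagner, 1972).

    trials: list of (present_CSs, us) pairs, e.g. (("A", "B"), 1).
    alphas: per-CS salience. All present CSs share one prediction
    error - lam*us minus the summed prediction - which is the feature
    ecco adopts for CS-US learning.
    """
    V = {cs: 0.0 for cs in alphas}
    for present, us in trials:
        error = lam * us - sum(V[cs] for cs in present)
        for cs in present:
            V[cs] += alphas[cs] * beta * error
    return V

# Blocking: A+ training first, then AB+; B gains little strength.
trials = [(("A",), 1)] * 20 + [(("A", "B"), 1)] * 20
print(rescorla_wagner(trials, alphas={"A": 0.5, "B": 0.5}))
```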
Human operator response to error-likely situations in complex engineering systems
NASA Technical Reports Server (NTRS)
Morris, Nancy M.; Rouse, William B.
1988-01-01
The causes of human error in complex systems are examined. First, a conceptual framework is provided in which two broad categories of error are discussed: errors of action, or slips, and errors of intention, or mistakes. Conditions in which slips and mistakes might be expected to occur are identified, based on existing theories of human error. Regarding the role of workload, it is hypothesized that workload may act as a catalyst for error. Two experiments are presented in which humans' responses to error-likely situations were examined. Subjects controlled PLANT under a variety of conditions and periodically provided subjective ratings of mental effort. A complex pattern of results was obtained, which was not consistent with predictions. Generally, the results of this research indicate that: (1) humans respond to conditions in which errors might be expected by attempting to reduce the possibility of error, and (2) adaptation to conditions is a potent influence on human behavior in discretionary situations. Subjects' explanations for changes in effort ratings are also explored.
Ajemian, Robert; D’Ausilio, Alessandro; Moorman, Helene; Bizzi, Emilio
2013-01-01
During the process of skill learning, synaptic connections in our brains are modified to form motor memories of learned sensorimotor acts. The more plastic the adult brain is, the easier it is to learn new skills or adapt to neurological injury. However, if the brain is too plastic and the pattern of synaptic connectivity is constantly changing, new memories will overwrite old memories, and learning becomes unstable. This trade-off is known as the stability–plasticity dilemma. Here a theory of sensorimotor learning and memory is developed whereby synaptic strengths are perpetually fluctuating without causing instability in motor memory recall, as long as the underlying neural networks are sufficiently noisy and massively redundant. The theory implies two distinct stages of learning—preasymptotic and postasymptotic—because once the error drops to a level comparable to that of the noise-induced error, further error reduction requires altered network dynamics. A key behavioral prediction derived from this analysis is tested in a visuomotor adaptation experiment, and the resultant learning curves are modeled with a nonstationary neural network. Next, the theory is used to model two-photon microscopy data that show, in animals, high rates of dendritic spine turnover, even in the absence of overt behavioral learning. Finally, the theory predicts enhanced task selectivity in the responses of individual motor cortical neurons as the level of task expertise increases. From these considerations, a unique interpretation of sensorimotor memory is proposed—memories are defined not by fixed patterns of synaptic weights but, rather, by nonstationary synaptic patterns that fluctuate coherently. PMID:24324147
Kourtis, Lampros C; Carter, Dennis R; Beaupre, Gary S
2014-08-01
Three-point bending tests are often used to determine the apparent or effective elastic modulus of long bones. The use of beam theory equations to interpret such tests can result in a substantial underestimation of the true effective modulus. In this study three-dimensional, nonlinear finite element analysis is used to quantify the errors inherent in beam theory and to create plots that can be used to correct the elastic modulus calculated from beam theory. Correction plots are generated for long bones representative of a variety of species commonly used in research studies. For a long bone with dimensions comparable to the mouse femur, the majority of the error in the effective elastic modulus results from deformations to the bone cross section that are not accounted for in the equations from beam theory. In some cases, the effective modulus calculated from beam theory can be less than one-third of the true effective modulus. Errors are larger: (1) for bones having short spans relative to bone length; (2) for bones with thin vs. thick cortices relative to periosteal diameter; and (3) when using a small radius or "knife-edge" geometry for the center loading ram and the outer supports in the three-point testing system. The use of these correction plots will enable researchers to compare results for long bones from different animal strains and to compare results obtained using testing systems that differ with regard to length between the outer supports and the radius used for the loading ram and outer supports.
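For reference, the beam-theory calculation whose bias the correction plots address looks like the sketch below (invented mouse-femur-scale numbers; as the study shows, the result can understate the true effective modulus by a factor of three or more):

```python
import math

def modulus_beam_theory(force_n, span_mm, deflection_mm, d_out_mm, d_in_mm):
    """Effective elastic modulus from three-point bending via beam theory.

    E = F L^3 / (48 delta I), idealizing the cortex as a hollow
    circular section with I = pi/64 (Do^4 - Di^4). Units: N and mm,
    so E comes out in MPa.
    """
    inertia = math.pi / 64.0 * (d_out_mm**4 - d_in_mm**4)
    return force_n * span_mm**3 / (48.0 * deflection_mm * inertia)

# Hypothetical test: 10 N load, 8 mm span, 0.15 mm midspan deflection.
print(round(modulus_beam_theory(10.0, 8.0, 0.15, 1.5, 1.0)), "MPa")
```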
Realtime mitigation of GPS SA errors using Loran-C
NASA Technical Reports Server (NTRS)
Braasch, Soo Y.
1994-01-01
The hybrid use of Loran-C with the Global Positioning System (GPS) was shown to be capable of providing a sole means of enroute air radionavigation. By allowing pilots to fly direct to their destinations, use of this system results in significant time savings and therefore fuel savings as well. However, a major error source limiting the accuracy of GPS is the intentional degradation of the GPS signal known as Selective Availability (SA). SA-induced position errors are highly correlated and far exceed all other error sources (horizontal position error: 100 meters, 95 percent). Realtime mitigation of SA errors from the position solution is highly desirable, and how that can be achieved is discussed. The stability of Loran-C signals is exploited to reduce SA errors. The theory behind this technique is discussed and results using bench and flight data are given.
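One plausible way to exploit the stability of Loran-C against slowly varying SA is a complementary filter; the toy simulation below (invented noise figures, not the author's algorithm) low-passes the GPS-minus-Loran difference to estimate and remove the correlated SA component:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(600.0)                              # one fix per second
sa = 80.0 * np.sin(2.0 * np.pi * t / 300.0)       # slowly varying SA error (m)
truth = 0.5 * t                                   # true along-track position (m)
gps = truth + sa + rng.normal(0.0, 5.0, t.size)   # GPS corrupted by SA
loran = truth + rng.normal(0.0, 30.0, t.size)     # Loran-C: stable but noisier

bias, corrected = 0.0, np.empty_like(gps)
alpha = 0.1                                       # ~10 s filter time constant
for i in range(t.size):
    bias = (1.0 - alpha) * bias + alpha * (gps[i] - loran[i])
    corrected[i] = gps[i] - bias

print(np.std(gps - truth), np.std(corrected - truth))  # error shrinks markedly
```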
Hard sphere perturbation theory for thermodynamics of soft-sphere model liquid
NASA Astrophysics Data System (ADS)
Mon, K. K.
2001-09-01
It is a long-standing consensus in the literature that hard sphere perturbation theory (HSPT) is not accurate for dense soft-sphere model liquids interacting with repulsive r^(-n) pair potentials for small n. In this paper, we show that if the intrinsic error of HSPT for soft-sphere model liquids is accounted for, then this is not completely true. We present results for n = 4, 6, 9, 12 which indicate that even first-order variational HSPT can provide free energy upper bounds to within a few percent at densities near freezing when corrected for the intrinsic error of the HSPT.
Wang, Jingxin; Tian, Jing; Wang, Rong; Benson, Valerie
2013-01-01
We examined performance in the antisaccade task for younger and older adults by comparing latencies and errors in what we defined as high attentional focus (mixed antisaccades and prosaccades in the same block) and low attentional focus (antisaccades and prosaccades in separate blocks) conditions. Shorter saccade latencies for correctly executed eye movements were observed for both groups in mixed, compared to blocked, antisaccade tasks, but antisaccade error rates were higher for older participants across both conditions. The results are discussed in relation to the inhibitory hypothesis, the goal neglect theory and attentional control theory. PMID:23620767
DeLorenzo, Christine; Papademetris, Xenophon; Staib, Lawrence H.; Vives, Kenneth P.; Spencer, Dennis D.; Duncan, James S.
2010-01-01
During neurosurgery, nonrigid brain deformation prevents preoperatively-acquired images from accurately depicting the intraoperative brain. Stereo vision systems can be used to track intraoperative cortical surface deformation and update preoperative brain images in conjunction with a biomechanical model. However, these stereo systems are often plagued with calibration error, which can corrupt the deformation estimation. In order to decouple the effects of camera calibration from the surface deformation estimation, a framework that can solve for disparate and often competing variables is needed. Game theory, which was developed to handle decision making in this type of competitive environment, has been applied to various fields from economics to biology. In this paper, game theory is applied to cortical surface tracking during neocortical epilepsy surgery and used to infer information about the physical processes of brain surface deformation and image acquisition. The method is successfully applied to eight in vivo cases, resulting in an 81% decrease in mean surface displacement error. This includes a case in which some of the initial camera calibration parameters had errors of 70%. Additionally, the advantages of using a game theoretic approach in neocortical epilepsy surgery are clearly demonstrated in its robustness to initial conditions. PMID:20129844
Decision-Making under Risk of Loss in Children
Steelandt, Sophie; Broihanne, Marie-Hélène; Romain, Amélie; Thierry, Bernard; Dufour, Valérie
2013-01-01
In human adults, judgment errors are known to often lead to irrational decision-making in risky contexts. While these errors can affect the accuracy of profit evaluation, they may have once enhanced survival in dangerous contexts following a “better be safe than sorry” rule of thumb. Such a rule can be critical for children, and it could develop early on. Here, we investigated the rationality of choices and the possible occurrence of judgment errors in children aged 3 to 9 years when exposed to a risky trade. Children were allocated with a piece of cookie that they could either keep or risk in exchange of the content of one cup among 6, visible in front of them. In the cups, cookies could be of larger, equal or smaller sizes than the initial allocation. Chances of losing or winning were manipulated by presenting different combinations of cookie sizes in the cups (for example 3 large, 2 equal and 1 small cookie). We investigated the rationality of children's response using the theoretical models of Expected Utility Theory (EUT) and Cumulative Prospect Theory. Children aged 3 to 4 years old were unable to discriminate the profitability of exchanging in the different combinations. From 5 years, children were better at maximizing their benefit in each combination, their decisions were negatively induced by the probability of losing, and they exhibited a framing effect, a judgment error found in adults. Confronting data to the EUT indicated that children aged over 5 were risk-seekers but also revealed inconsistencies in their choices. According to a complementary model, the Cumulative Prospect Theory (CPT), they exhibited loss aversion, a pattern also found in adults. These findings confirm that adult-like judgment errors occur in children, which suggests that they possess a survival value. PMID:23349682
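The two models can be compared on the cookie gamble in a few lines; the sketch below uses the combination quoted in the abstract and, for the CPT side, the textbook loss-aversion parameters of Tversky and Kahneman (1992), ignoring probability weighting for brevity:

```python
# Cups: 3 large (1.5x), 2 equal (1.0x), 1 small (0.5x) the allocated cookie.
outcomes = [1.5] * 3 + [1.0] * 2 + [0.5]

keep = 1.0
ev_exchange = sum(outcomes) / len(outcomes)
print(ev_exchange)               # ~1.167 > 1.0: linear-utility EUT says trade

def cpt_value(x, ref=1.0, lam=2.25, a=0.88):
    """Prospect-theory value of outcome x against reference point ref."""
    d = x - ref
    return d ** a if d >= 0 else -lam * (-d) ** a

v_exchange = sum(cpt_value(x) for x in outcomes) / len(outcomes)
print(round(v_exchange, 3))      # loss aversion shrinks the gamble's appeal
```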
Acquisition Theory and Experimental Design: A Critique of Tomasello and Herron.
ERIC Educational Resources Information Center
Beck, Maria-Luise; Eubank, Lynn
1991-01-01
Caution should be taken in viewing previous research indicating that negative evidence, a special type of error correction to eliminate overgeneralizations, could be crucial to second-language learning, because the underlying theories adopted for that research possibly could be flawed. (10 references) (CB)
Taming wildlife disease: bridging the gap between science and management
Joseph, Maxwell B.; Mihaljevic, Joseph R.; Arellano, Ana Lisette; Kueneman, Jordan G.; Cross, Paul C.; Johnson, Pieter T.J.
2013-01-01
1. Parasites and pathogens of wildlife can threaten biodiversity, infect humans and domestic animals, and cause significant economic losses, providing incentives to manage wildlife diseases. Recent insights from disease ecology have helped transform our understanding of infectious disease dynamics and yielded new strategies to better manage wildlife diseases. Simultaneously, wildlife disease management (WDM) presents opportunities for large-scale empirical tests of disease ecology theory in diverse natural systems. 2. To assess whether the potential complementarity between WDM and disease ecology theory has been realized, we evaluate the extent to which specific concepts in disease ecology theory have been explicitly applied in the peer-reviewed WDM literature. 3. While only half of WDM articles published in the past decade incorporated disease ecology theory, theory has been incorporated with increasing frequency over the past 40 years. Contrary to expectations, articles authored by academics were no more likely to apply disease ecology theory, but articles that explain unsuccessful management often do so in terms of theory. 4. Some theoretical concepts, such as density-dependent transmission, have been commonly applied, whereas emerging concepts such as pathogen evolutionary responses to management, biodiversity–disease relationships and within-host parasite interactions have not yet been fully integrated as management considerations. 5. Synthesis and applications. Theory-based disease management can meet the needs of both academics and managers by testing disease ecology theory and improving disease interventions. Theoretical concepts that have received limited attention to date in wildlife disease management could provide a basis for improving management and advancing disease ecology in the future.
Drach-Zahavy, A; Somech, A; Admi, H; Peterfreund, I; Peker, H; Priente, O
2014-03-01
Attention in the ward should shift from preventing medication administration errors to managing them. Nevertheless, little is known about the practices nursing wards apply to learn from medication administration errors as a means of limiting them. To test the effectiveness of four types of learning practices, namely non-integrated, integrated, supervisory and patchy learning practices, in limiting medication administration errors. Data were collected from a convenience sample of 4 hospitals in Israel by multiple methods (observations and self-report questionnaires) at two time points. The sample included 76 wards (360 nurses). Medication administration error was defined as any deviation from prescribed medication processes and measured by a validated structured observation sheet. Wards' use of medication administration technologies, location of the medication station, and workload were observed; learning practices and demographics were measured by validated questionnaires. Results of the mixed linear model analysis indicated that the use of technology and a quiet location of the medication cabinet were significantly associated with reduced medication administration errors (estimate=.03, p<.05 and estimate=-.17, p<.01, respectively), while workload was significantly linked to inflated medication administration errors (estimate=.04, p<.05). Of the learning practices, supervisory learning was the only practice significantly linked to reduced medication administration errors (estimate=-.04, p<.05). Integrated and patchy learning were significantly linked to higher levels of medication administration errors (estimate=-.03, p<.05 and estimate=-.04, p<.01, respectively). Non-integrated learning was not associated with it (p>.05). How wards manage errors might have implications for medication administration errors beyond the effects of typical individual, organizational and technology risk factors. Head nurses can facilitate learning from errors by "management by walking around" and by monitoring nurses' medication administration behaviors. Copyright © 2013 Elsevier Ltd. All rights reserved.
A signal detection-item response theory model for evaluating neuropsychological measures.
Thomas, Michael L; Brown, Gregory G; Gur, Ruben C; Moore, Tyler M; Patt, Virginie M; Risbrough, Victoria B; Baker, Dewleen G
2018-02-05
Models from signal detection theory are commonly used to score neuropsychological test data, especially tests of recognition memory. Here we show that certain item response theory models can be formulated as signal detection theory models, thus linking two complementary but distinct methodologies. We then use the approach to evaluate the validity (construct representation) of commonly used research measures, demonstrate the impact of conditional error on neuropsychological outcomes, and evaluate measurement bias. Signal detection-item response theory (SD-IRT) models were fitted to recognition memory data for words, faces, and objects. The sample consisted of U.S. Infantry Marines and Navy Corpsmen participating in the Marine Resiliency Study. Data comprised item responses to the Penn Face Memory Test (PFMT; N = 1,338), Penn Word Memory Test (PWMT; N = 1,331), and Visual Object Learning Test (VOLT; N = 1,249), and self-report of past head injury with loss of consciousness. SD-IRT models adequately fitted recognition memory item data across all modalities. Error varied systematically with ability estimates, and distributions of residuals from the regression of memory discrimination onto self-report of past head injury were positively skewed towards regions of larger measurement error. Analyses of differential item functioning revealed little evidence of systematic bias by level of education. SD-IRT models benefit from the measurement rigor of item response theory-which permits the modeling of item difficulty and examinee ability-and from signal detection theory-which provides an interpretive framework encompassing the experimentally validated constructs of memory discrimination and response bias. We used this approach to validate the construct representation of commonly used research measures and to demonstrate how nonoptimized item parameters can lead to erroneous conclusions when interpreting neuropsychological test data. Future work might include the development of computerized adaptive tests and integration with mixture and random-effects models.
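The authors' exact SD-IRT likelihood is not reproduced here, but the two ingredients it fuses are easy to show side by side, as in this minimal sketch:

```python
import math
from scipy.stats import norm

def dprime(hit_rate, fa_rate):
    """Equal-variance signal detection index of memory discrimination."""
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

def irt_2pl(theta, a, b):
    """2PL item response function: P(yes | ability theta, item a, b)."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Hypothetical recognition-memory summary: 80% hits, 20% false alarms.
print(round(dprime(0.80, 0.20), 2))                 # d' ~ 1.68
# Item-level view: responses depend on item parameters as well as ability,
# which is what lets measurement error vary across the ability range.
print(round(irt_2pl(theta=1.0, a=1.2, b=0.0), 2))   # ~0.77
```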
Adaptation to sensory-motor reflex perturbations is blind to the source of errors.
Hudson, Todd E; Landy, Michael S
2012-01-06
In the study of visual-motor control, perhaps the most familiar findings involve adaptation to externally imposed movement errors. Theories of visual-motor adaptation based on optimal information processing suppose that the nervous system identifies the sources of errors to effect the most efficient adaptive response. We report two experiments using a novel perturbation based on stimulating a visually induced reflex in the reaching arm. Unlike adaptation to an external force, our method induces a perturbing reflex within the motor system itself, i.e., perturbing forces are self-generated. This novel method allows a test of the theory that error source information is used to generate an optimal adaptive response. If the self-generated source of the visually induced reflex perturbation is identified, the optimal response will be via reflex gain control. If the source is not identified, a compensatory force should be generated to counteract the reflex. Gain control is the optimal response to reflex perturbation, both because energy cost and movement errors are minimized. Energy is conserved because neither reflex-induced nor compensatory forces are generated. Precision is maximized because endpoint variance is proportional to force production. We find evidence against source-identified adaptation in both experiments, suggesting that sensory-motor information processing is not always optimal.
Gaussian Hypothesis Testing and Quantum Illumination.
Wilde, Mark M; Tomamichel, Marco; Lloyd, Seth; Berta, Mario
2017-09-22
Quantum hypothesis testing is one of the most basic tasks in quantum information theory and has fundamental links with quantum communication and estimation theory. In this paper, we establish a formula that characterizes the decay rate of the minimal type-II error probability in a quantum hypothesis test of two Gaussian states given a fixed constraint on the type-I error probability. This formula is a direct function of the mean vectors and covariance matrices of the quantum Gaussian states in question. We give an application to quantum illumination, which is the task of determining whether there is a low-reflectivity object embedded in a target region with a bright thermal-noise bath. For the asymmetric-error setting, we find that a quantum illumination transmitter can achieve an error probability exponent stronger than a coherent-state transmitter of the same mean photon number, and furthermore, that it requires far fewer trials to do so. This occurs when the background thermal noise is either low or bright, which means that a quantum advantage is even easier to witness than in the symmetric-error setting because it occurs for a larger range of parameters. Going forward from here, we expect our formula to have applications in settings well beyond those considered in this paper, especially to quantum communication tasks involving quantum Gaussian channels.
The famous five factors in teamwork: a case study of fratricide.
Rafferty, Laura A; Stanton, Neville A; Walker, Guy H
2010-10-01
The purpose of this paper is to propose foundations for a theory of errors in teamwork based upon analysis of a case study of fratricide alongside a review of the existing literature. This approach may help to promote a better understanding of interactions within complex systems and help in the formulation of hypotheses and predictions concerning errors in teamwork, particularly incidents of fratricide. It is proposed that a fusion of concepts drawn from error models, with common causal categories taken from teamwork models, could allow for an in-depth exploration of incidents of fratricide. It is argued that such a model has the potential to explore the core causal categories identified as present in an incident of fratricide. This view marks fratricide as a process of errors occurring throughout the military system as a whole, particularly due to problems in teamwork within this complex system. Implications of this viewpoint for the development of a new theory of fratricide are offered. STATEMENT OF RELEVANCE: This article provides an insight into the fusion of existing error and teamwork models for the analysis of an incident of fratricide. Within this paper, a number of commonalities among models of teamwork have been identified allowing for the development of a model.
Research on Spectroscopy, Opacity, and Atmospheres
NASA Technical Reports Server (NTRS)
Kurucz, Robert L.
1996-01-01
I discuss errors in theory and in interpreting observations that are produced by the failure to consider resolution in space, time, and energy. I discuss convection in stellar model atmospheres and in stars. Large errors in abundances are possible, such as the factor-of-ten error in the Li abundance for extreme Population II stars. Finally, I discuss the variation of microturbulent velocity with depth, effective temperature, gravity, and abundance. These variations must be dealt with in computing models and grids and in any type of photometric calibration.
ERIC Educational Resources Information Center
Halpern, Orly; Tobin, Yishai
2008-01-01
"Non-vocalization" (N-V) is a newly described phonological error process in hearing impaired speakers. In N-V the hearing impaired person actually articulates the phoneme but without producing a voice. The result is an error process looking as if it is produced but sounding as if it is omitted. N-V was discovered by video recording the speech of…
NASA Astrophysics Data System (ADS)
Zhao, Chen-Guang; Tan, Jiu-Bin; Liu, Tao
2010-09-01
The mechanism by which a non-polarizing beam splitter (NPBS) with asymmetrical transfer coefficients causes rotation of the polarization direction is explained in principle, and the nonlinear measurement error caused by the NPBS is analyzed based on Jones matrix theory. Theoretical calculations show that the nonlinear error changes periodically, and that the error period and peak values increase with the deviation between the transmissivities of the p-polarization and s-polarization states. When the transmissivity of p-polarization is 53% and that of s-polarization is 48%, the maximum error reaches 2.7 nm. The imperfection of the NPBS is one of the main error sources in a simultaneous phase-shifting polarization interferometer, and its influence cannot be neglected in nanoscale ultra-precision measurement.
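The rotation mechanism can be sketched with elementary Jones calculus; the snippet below is an illustration using the abstract's worst-case transmissivities, not the paper's full nonlinear-error derivation:

```python
import numpy as np

# Amplitude transmission of a non-ideal NPBS (intensity: 53% p, 48% s).
Tp, Ts = 0.53, 0.48
npbs = np.diag([np.sqrt(Tp), np.sqrt(Ts)])

# 45-degree linearly polarized input beam as a Jones vector.
e_in = np.array([1.0, 1.0]) / np.sqrt(2.0)

e_out = npbs @ e_in
angle = np.degrees(np.arctan2(e_out[1], e_out[0]))
print(round(angle - 45.0, 2), "deg rotation of the polarization plane")
```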
NASA Astrophysics Data System (ADS)
Zhou, Yanru; Zhao, Yuxiang; Tian, Hui; Zhang, Dengwei; Huang, Tengchao; Miao, Lijun; Shu, Xiaowu; Che, Shuangliang; Liu, Cheng
2016-12-01
In an axial magnetic field (AMF) perpendicular to the plane of the fiber coil, a polarization-maintaining fiber optic gyro (PM-FOG) exhibits an axial magnetic error. This error is linearly related to the intensity of the AMF, the radius of the fiber coil, and the light wavelength, and is also influenced by the distribution of fiber twist. Once a PM-FOG has been manufactured, the error exhibits only a linear correlation with the AMF. A real-time compensation model is established to eliminate the error, and experimental results show that the axial magnetic error of the PM-FOG decreases from 5.83 to 0.09 deg/h in a 12-G AMF, an 18-dB suppression.
NASA Astrophysics Data System (ADS)
Zhao, Fei; Zhang, Chi; Yang, Guilin; Chen, Chinyin
2016-12-01
This paper presents an online method for estimating cutting error by analyzing internal sensor readings. The internal sensors of the numerical control (NC) machine tool are used to avoid installation problems. A mathematical model is proposed that computes the relative position of the cutting point and the tool center point (TCP) from internal sensor readings, based on the cutting theory of gears. To verify the effectiveness of the proposed model, it was tested in simulations and experiments on the gear generating grinding process. The cutting error of the gear was estimated and the factors that induce cutting error were analyzed. The simulations and experiments verify that the proposed approach is an efficient way to estimate the cutting error of the workpiece during the machining process.
A day in the life of a volunteer incident commander: errors, pressures and mitigating strategies.
Bearman, Christopher; Bremner, Peter A
2013-05-01
To meet an identified gap in the literature this paper investigates the tasks that a volunteer incident commander needs to carry out during an incident, the errors that can be made and the way that errors are managed. In addition, pressure from goal seduction and situation aversion were also examined. Volunteer incident commanders participated in a two-part interview consisting of a critical decision method interview and discussions about a hierarchical task analysis constructed by the authors. A SHERPA analysis was conducted to further identify potential errors. The results identified the key tasks, errors with extreme risk, pressures from strong situations and mitigating strategies for errors and pressures. The errors and pressures provide a basic set of issues that need to be managed by both volunteer incident commanders and fire agencies. The mitigating strategies identified here suggest some ways that this can be done. Copyright © 2012 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Factors contributing to registered nurse medication administration error: a narrative review.
Parry, Angela M; Barriball, K Louise; While, Alison E
2015-01-01
To explore the factors contributing to Registered Nurse medication administration error behaviour. A narrative review. Electronic databases (Cochrane, CINAHL, MEDLINE, BNI, EmBase, and PsycINFO) were searched from 1 January 1999 to 31 December 2012 in the English language. 1127 papers were identified and 26 papers were included in the review. Data were extracted by one reviewer and checked by a second reviewer. A thematic analysis and narrative synthesis of the factors contributing to Registered Nurses' medication administration behaviour. Bandura's (1986) theory of reciprocal determinism was used as an organising framework. This theory proposes that there is a reciprocal interplay between the environment, the person and their behaviour. Medication administration error is an outcome of RN behaviour. The 26 papers reported studies conducted in 4 continents across 11 countries predominantly in North America and Europe, with one multi-national study incorporating 27 countries. Within both the environment and person domain of the reciprocal determinism framework, a number of factors emerged as influencing Registered Nurse medication administration error behaviour. Within the environment domain, two key themes of clinical workload and work setting emerged, and within the person domain the Registered Nurses' characteristics and their lived experience of work emerged as themes. Overall, greater attention has been given to the contribution of the environment domain rather than the person domain as contributing to error, with the literature viewing an error as an event rather than the outcome of behaviour. The interplay between factors that influence behaviour were poorly accounted for within the selected studies. It is proposed that a shift away from error as an event to a focus on the relationships between the person, the environment and Registered Nurse medication administration behaviour is needed to better understand medication administration error. Copyright © 2014 Elsevier Ltd. All rights reserved.
Vaskinn, Anja; Andersson, Stein; Østefjells, Tiril; Andreassen, Ole A; Sundet, Kjetil
2018-06-05
Theory of mind (ToM) can be divided into cognitive and affective ToM, and a distinction can be made between overmentalizing and undermentalizing errors. Research has shown that ToM in schizophrenia is associated with non-social and social cognition, and with clinical symptoms. In this study, we investigate cognitive and clinical predictors of different ToM processes. Ninety-one individuals with schizophrenia participated. ToM was measured with the Movie for the Assessment of Social Cognition (MASC) yielding six scores (total ToM, cognitive ToM, affective ToM, overmentalizing errors, undermentalizing errors and no mentalizing errors). Neurocognition was indexed by a composite score based on the non-social cognitive tests in the MATRICS Consensus Cognitive Battery (MCCB). Emotion perception was measured with Emotion in Biological Motion (EmoBio), a point-light walker task. Clinical symptoms were assessed with the Positive and Negative Syndrome Scale (PANSS). Seventy-one healthy control (HC) participants completed the MASC. Individuals with schizophrenia showed large impairments compared to HC for all MASC scores, except overmentalizing errors. Hierarchical regression analyses with the six different MASC scores as dependent variables revealed that MCCB was a significant predictor of all MASC scores, explaining 8-18% of the variance. EmoBio increased the explained variance significantly, to 17-28%, except for overmentalizing errors. PANSS excited symptoms increased explained variance for total ToM, affective ToM and no mentalizing errors. Both social and non-social cognition were significant predictors of ToM. Overmentalizing was only predicted by non-social cognition. Excited symptoms contributed to overall and affective ToM, and to no mentalizing errors. Copyright © 2018 Elsevier Inc. All rights reserved.
Effect of Static Strains on Diffusion
NASA Technical Reports Server (NTRS)
Girifalco, L. A.; Grimes, H. H.
1961-01-01
A theory is developed that gives the diffusion coefficient in strained systems as an exponential function of the strain. This theory starts with the statistical theory of the atomic jump frequency as developed by Vineyard. The parameter determining the effect of strain on diffusion is related to the changes in the inter-atomic forces with strain. Comparison of the theory with published experimental results for the effect of pressure on diffusion shows that the experiments agree with the form of the theoretical equation in all cases within experimental error.
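As a reading aid, the functional forms at stake can be written out; the strain coefficient \(\alpha\) and activation volume \(\Delta V^{*}\) below are generic symbols from the standard activation-volume formulation, not values from the paper:

```latex
D(\epsilon) = D_0 \, e^{\alpha \epsilon},
\qquad
D(P) = D(0) \, \exp\!\left(-\frac{P \, \Delta V^{*}}{k_{\mathrm{B}} T}\right)
```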
[Description of clinical thinking by the dual-process theory].
Peña G, Luis
2012-06-01
Clinical thinking is a very complex process that can be described by the dual-process theory, it has an intuitive part (that recognizes patterns) and an analytical part (that tests hypotheses). It is vulnerable to cognitive bias that professionals must be aware of, to minimize diagnostic errors.
Conflict Monitoring in Dual Process Theories of Thinking
ERIC Educational Resources Information Center
De Neys, Wim; Glumicic, Tamara
2008-01-01
Popular dual process theories have characterized human thinking as an interplay between an intuitive-heuristic and a demanding-analytic reasoning process. Although monitoring the output of the two systems for conflict is crucial to avoid decision-making errors, there are some widely different views on the efficiency of the process. Kahneman…
Misconduct in the Prosecution of Severe Crimes: Theory and Experimental Test
ERIC Educational Resources Information Center
Lucas, Jeffrey W.; Graif, Corina; Lovaglia, Michael J.
2006-01-01
Prosecutorial misconduct involves the intentional use of illegal or improper methods for attaining convictions against defendants in criminal trials. Previous research documented extensive errors in the prosecution of severe crimes. A theory formulated to explain this phenomenon proposes that in serious cases, increased pressure to convict…
Probability Theory, Not the Very Guide of Life
ERIC Educational Resources Information Center
Juslin, Peter; Nilsson, Hakan; Winman, Anders
2009-01-01
Probability theory has long been taken as the self-evident norm against which to evaluate inductive reasoning, and classical demonstrations of violations of this norm include the conjunction error and base-rate neglect. Many of these phenomena require multiplicative probability integration, whereas people seem more inclined to linear additive…
Error Tendencies in Processing Student Feedback for Instructional Decision Making.
ERIC Educational Resources Information Center
Schermerhorn, John R., Jr.; And Others
1985-01-01
Seeks to assist instructors in recognizing two basic errors that can occur in processing student evaluation data on instructional development efforts; offers a research framework for future investigations of the error tendencies and related issues; and suggests ways in which instructors can confront and manage error tendencies in practice. (MBR)
Monitoring in Language Perception: Mild and Strong Conflicts Elicit Different ERP Patterns
ERIC Educational Resources Information Center
van de Meerendonk, Nan; Kolk, Herman H. J.; Vissers, Constance Th. W. M.; Chwilla, Dorothee J.
2010-01-01
In the language domain, most studies of error monitoring have been devoted to language production. However, in language perception, errors are made as well and we are able to detect them. According to the monitoring theory of language perception, a strong conflict between what is expected and what is observed triggers reanalysis to check for…
Standard Error Estimation of 3PL IRT True Score Equating with an MCMC Method
ERIC Educational Resources Information Center
Liu, Yuming; Schulz, E. Matthew; Yu, Lei
2008-01-01
A Markov chain Monte Carlo (MCMC) method and a bootstrap method were compared in the estimation of standard errors of item response theory (IRT) true score equating. Three test form relationships were examined: parallel, tau-equivalent, and congeneric. Data were simulated based on Reading Comprehension and Vocabulary tests of the Iowa Tests of…
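The bootstrap side of such a comparison is straightforward to sketch. The helper below is a generic person-resampling bootstrap around a user-supplied equating chain; `equate_fn` is a hypothetical placeholder for the full IRT calibration and true-score equating, and the toy stand-in merely demonstrates the mechanics:

```python
import numpy as np

def bootstrap_se(responses, equate_fn, n_boot=200, seed=0):
    """Bootstrap standard errors for an equating function.

    responses: (persons x items) score matrix; equate_fn maps such a
    matrix to a vector of equated scores. Resampling persons with
    replacement and re-running the whole chain gives an SE per score.
    """
    rng = np.random.default_rng(seed)
    n = responses.shape[0]
    reps = [equate_fn(responses[rng.integers(0, n, n)]) for _ in range(n_boot)]
    return np.std(reps, axis=0, ddof=1)

# Toy stand-in for the calibration + equating chain: item means.
toy = (np.random.default_rng(1).random((500, 40)) < 0.6).astype(float)
print(bootstrap_se(toy, lambda r: r.mean(axis=0))[:5])
```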
ERIC Educational Resources Information Center
Worth, Sarah
2003-01-01
This study compared responses of 16 pupils either with or without autistic spectrum disorder (ASD) and matched for gender and verbal ability. Subjects' responses to various pictures were categorized. Results suggested errors made by the two groups differed both quantitatively and qualitatively. Errors made by pupils with ASD were largely…
Crop area estimation based on remotely-sensed data with an accurate but costly subsample
NASA Technical Reports Server (NTRS)
Gunst, R. F.
1983-01-01
Alternatives to sampling-theory stratified and regression estimators of crop production and timber biomass were examined. An alternative estimator which is viewed as especially promising is the errors-in-variables regression estimator. Investigations established the need for caution with this estimator when the ratio of two error variances is not precisely known.
Random Error in Judgment: The Contribution of Encoding and Retrieval Processes
ERIC Educational Resources Information Center
Pleskac, Timothy J.; Dougherty, Michael R.; Rivadeneira, A. Walkyria; Wallsten, Thomas S.
2009-01-01
Theories of confidence judgments have embraced the role random error plays in influencing responses. An important next step is to identify the source(s) of these random effects. To do so, we used the stochastic judgment model (SJM) to distinguish the contribution of encoding and retrieval processes. In particular, we investigated whether dividing…
Standard Errors of Estimated Latent Variable Scores with Estimated Structural Parameters
ERIC Educational Resources Information Center
Hoshino, Takahiro; Shigemasu, Kazuo
2008-01-01
The authors propose a concise formula to evaluate the standard error of the estimated latent variable score when the true values of the structural parameters are not known and must be estimated. The formula can be applied to factor scores in factor analysis or ability parameters in item response theory, without bootstrap or Markov chain Monte…
A Systematic Approach for Identifying Level-1 Error Covariance Structures in Latent Growth Modeling
ERIC Educational Resources Information Center
Ding, Cherng G.; Jane, Ten-Der; Wu, Chiu-Hui; Lin, Hang-Rung; Shen, Chih-Kang
2017-01-01
It has been pointed out in the literature that misspecification of the level-1 error covariance structure in latent growth modeling (LGM) has detrimental impacts on the inferences about growth parameters. Since correct covariance structure is difficult to specify by theory, the identification needs to rely on a specification search, which,…
Senior High School Students' Errors on the Use of Relative Words
ERIC Educational Resources Information Center
Bao, Xiaoli
2015-01-01
Relative clause is one of the most important language points in College English Examination. Teachers have been attaching great importance to the teaching of relative clause, but the outcomes are not satisfactory. Based on Error Analysis theory, this article aims to explore the reasons why senior high school students find it difficult to choose…
Nonlinear growth of zonal flows by secondary instability in general magnetic geometry
Plunk, G. G.; Navarro, A. Banon
2017-02-23
Here we present a theory of the nonlinear growth of zonal flows in magnetized plasma turbulence, by the mechanism of secondary instability. The theory is derived for general magnetic geometry, and is thus applicable to both tokamaks and stellarators. The predicted growth rate is shown to compare favorably with nonlinear gyrokinetic simulations, with the error scaling as expected with the small parameter of the theory.
NASA Astrophysics Data System (ADS)
Kalanov, Temur Z.
2015-04-01
An analysis of the foundations of the theory of negative numbers is proposed. The unity of formal logic and of rational dialectics is the methodological basis of the analysis. The statement of the problem is as follows. As is known, point O in the Cartesian coordinate system XOY determines the position of zero on the scale. The number "zero" belongs to both the scale of positive numbers and the scale of negative numbers. In this case, the following formal-logical contradiction arises: the number 0 is both a positive number and a negative number; or, equivalently, the number 0 is neither a positive number nor a negative number, i.e. the number 0 has no sign. Then the following question arises: Do negative numbers exist in science and practice? A detailed analysis of the problem shows that negative numbers do not exist, because the foundations of the theory of negative numbers are contrary to the formal-logical laws. It is proved that: (a) all numbers have no signs; (b) the concepts "negative number" and "negative sign of number" represent a formal-logical error; (c) the signs "plus" and "minus" are only symbols of mathematical operations. These logical errors determine the essence of the theory of negative numbers: the theory of negative numbers is a false theory.
Hakala, John L; Hung, Joseph C; Mosman, Elton A
2012-09-01
The objective of this project was to ensure correct radiopharmaceutical administration through the use of a bar code system that links patient and drug profiles with on-site information management systems. This new combined system would minimize the amount of manual human manipulation, which has proven to be a primary source of error. The most common reason for dosing errors is improper patient identification when a dose is obtained from the nuclear pharmacy or when a dose is administered. A standardized electronic transfer of information from radiopharmaceutical preparation to injection will further reduce the risk of misadministration. Value stream maps showing the flow of the patient dose information, as well as potential points of human error, were developed. Next, a future-state map was created that included proposed corrections for the most common critical sites of error. Transitioning the current process to the future state will require solutions that address these sites. To optimize the future-state process, a bar code system that links the on-site radiology management system with the nuclear pharmacy management system was proposed. A bar-coded wristband connects the patient directly to the electronic information systems. The bar code-enhanced process linking the patient dose with the electronic information reduces the number of crucial points for human error and provides a framework to ensure that the prepared dose reaches the correct patient. Although the proposed flowchart is designed for a site with an in-house central nuclear pharmacy, much of the framework could be applied by nuclear medicine facilities using unit doses. An electronic connection between information management systems to allow the tracking of a radiopharmaceutical from preparation to administration can be a useful tool in preventing the mistakes that are an unfortunate reality for any facility.
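A minimal sketch of the final bar-code check such a system implies (identifiers and field names are invented for illustration):

```python
def verify_administration(wristband_scan, dose_label_scan, orders):
    """Hypothetical last check before administering a radiopharmaceutical.

    orders maps patient_id -> dose_id dispensed by the nuclear pharmacy.
    Both IDs arrive as bar-code scans, so no manual transcription occurs
    at the point of administration.
    """
    expected = orders.get(wristband_scan)
    if expected is None:
        raise ValueError("No pending order for this patient")
    if expected != dose_label_scan:
        raise ValueError("Dose/patient mismatch - stop and investigate")
    return True

orders = {"PT-0042": "RX-9917"}            # illustrative identifiers
print(verify_administration("PT-0042", "RX-9917", orders))
```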
Managing business compliance using model-driven security management
NASA Astrophysics Data System (ADS)
Lang, Ulrich; Schreiner, Rudolf
Compliance with regulatory and governance standards is rapidly becoming one of the hot topics of information security today. This is because, especially with regulatory compliance, both business and government have to expect large financial and reputational losses if compliance cannot be ensured and demonstrated. One major difficulty of implementing such regulations is caused by the fact that they are captured at a high level of abstraction that is business-centric and not IT-centric. This means that the abstract intent needs to be translated in a trustworthy, traceable way into compliance and security policies that the IT security infrastructure can enforce. Carrying out this mapping process manually is time consuming, maintenance-intensive, costly, and error-prone. Compliance monitoring is also critical in order to be able to demonstrate compliance at any given point in time. The problem is further complicated because of the need for business-driven IT agility, where IT policies and enforcement can change frequently, e.g. in Business Process Modelling (BPM) driven Service Oriented Architecture (SOA). Model Driven Security (MDS) is an innovative technology approach that can solve these problems as an extension of identity and access management (IAM) and authorization management (also called entitlement management). In this paper we will illustrate the theory behind Model Driven Security for compliance, provide an improved and extended architecture, as well as a case study in the healthcare industry using our OpenPMF 2.0 technology.
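As a toy illustration of the model-driven mapping step (this is not OpenPMF's policy language), a single business-level statement can be compiled mechanically into low-level enforcement rules:

```python
# One abstract, business-centric policy statement (invented schema).
abstract_policy = {"resource": "patient_record",
                   "allowed_roles": ["attending_physician", "ward_nurse"],
                   "purpose": "treatment"}

def compile_policy(policy):
    """Generate concrete allow-rules from one business-level statement,
    keeping the mapping automatic, repeatable, and traceable."""
    return [f"ALLOW role={role} action=read "
            f"resource={policy['resource']} purpose={policy['purpose']}"
            for role in policy["allowed_roles"]]

for rule in compile_policy(abstract_policy):
    print(rule)
```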
Error Analysis and Validation for Insar Height Measurement Induced by Slant Range
NASA Astrophysics Data System (ADS)
Zhang, X.; Li, T.; Fan, W.; Geng, X.
2018-04-01
The InSAR technique is an important method for large-area DEM extraction. Several factors have a significant influence on the accuracy of height measurement. In this research, the effect of slant range measurement on InSAR height measurement was analysed and discussed. Based on the theory of InSAR height measurement, the error propagation model was derived assuming no coupling among different factors; it directly characterises the relationship between slant range error and height measurement error. A theory-based analysis in combination with TanDEM-X parameters was then implemented to quantitatively evaluate the influence of slant range error on height measurement. In addition, a simulation validation of the InSAR error model induced by slant range was performed on the basis of SRTM DEM and TanDEM-X parameters. The spatial distribution characteristics and error propagation rule of InSAR height measurement were further discussed and evaluated.
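For a rough sense of what such a propagation model looks like, assume the standard InSAR height geometry (platform height H, slant range r, look angle θ); the paper's full model may include further coupling terms:

```latex
% Minimal sketch, assuming the standard flat-geometry InSAR height equation.
\begin{align}
  h &= H - r\cos\theta, &
  \frac{\partial h}{\partial r} &= -\cos\theta
  \quad\Longrightarrow\quad
  \sigma_h = \lvert\cos\theta\rvert\,\sigma_r .
\end{align}
```

Under this simplification, a slant range error maps into a height error scaled by the cosine of the look angle, which is the kind of direct relationship the derived error propagation model characterises.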
Management Theories and Broadcasting: A Handbook.
ERIC Educational Resources Information Center
Craig, J. Robert; Hindmarsh, Wayne A.
Contemporary management and motivation theories, as applied to the business of broadcasting, are the focus of the first section of this paper. It deals with the types and reactions of employees in broadcasting stations in relation to 11 motivational theories: (1) Theories X and Y, (2) Immaturity-Maturity Theory, (3) V Theory, (4) Z Theory,…
A Study of Student Publications Adviser Management Style (Theory X, Theory Y).
ERIC Educational Resources Information Center
Ames, Steve
Members of the National Council of College Publications Advisers were surveyed to determine the predominant management style within which the advisers worked. The choices of management style were restrictive-Theory X and permissive-Theory Y. Respondents were encouraged to give general examples, cite specific incidents, and include clippings from…
Models of Organization and Governance at the Community College.
ERIC Educational Resources Information Center
Silverman, Michael
In order to provide the best management model for the effective and efficient operation of community colleges, it is useful to look briefly at management theories. The three principal theories in use in corporate management are: (1) theory X, involving an autocratic supervisor allowing for minimal group influence; (2) theory Y, in which…
Clinical review: Medication errors in critical care
Moyen, Eric; Camiré, Eric; Stelfox, Henry Thomas
2008-01-01
Medication errors in critical care are frequent, serious, and predictable. Critically ill patients are prescribed twice as many medications as patients outside of the intensive care unit (ICU) and nearly all will suffer a potentially life-threatening error at some point during their stay. The aim of this article is to provide a basic review of medication errors in the ICU, identify risk factors for medication errors, and suggest strategies to prevent errors and manage their consequences. PMID:18373883
Estimating Bias Error Distributions
NASA Technical Reports Server (NTRS)
Liu, Tian-Shu; Finley, Tom D.
2001-01-01
This paper formulates the general methodology for estimating the bias error distribution of a device in a measuring domain from less accurate measurements when a minimal number of standard values (typically two values) are available. A new perspective is that the bias error distribution can be found as a solution of an intrinsic functional equation in a domain. Based on this theory, the scaling- and translation-based methods for determining the bias error distribution are developed. These methods are virtually applicable to any device as long as the bias error distribution of the device can be sufficiently described by a power series (a polynomial) or a Fourier series in a domain. These methods have been validated through computational simulations and laboratory calibration experiments for a number of different devices.
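The simplest instance of the idea is the case in which exactly two standard values pin down a two-parameter (linear) bias polynomial. This is a hedged sketch, not the paper's scaling/translation machinery, and all numbers are assumed:

```python
# Minimal sketch: with two standards (known-true values), a linear
# polynomial bias b(x) = a0 + a1*x is determined exactly.

import numpy as np

# Two standards: true values and what the device reads at them (assumed data).
x_true = np.array([0.0, 10.0])
x_read = np.array([0.12, 10.34])

# Fit a0 + a1*x through the two observed biases.
bias_at_standards = x_read - x_true
a1, a0 = np.polyfit(x_true, bias_at_standards, deg=1)

def correct(reading: float) -> float:
    """Remove the estimated polynomial bias from a raw device reading."""
    # Invert reading = x + a0 + a1*x  =>  x = (reading - a0) / (1 + a1)
    return (reading - a0) / (1.0 + a1)

print(correct(5.23))  # recovers approximately 5.0, the true mid-scale value
```

Higher-order polynomial or Fourier descriptions of the bias, as in the paper, need correspondingly richer information, which is where the functional-equation formulation comes in.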
Disrupted prediction-error signal in psychosis: evidence for an associative account of delusions
Corlett, P. R.; Murray, G. K.; Honey, G. D.; Aitken, M. R. F.; Shanks, D. R.; Robbins, T.W.; Bullmore, E.T.; Dickinson, A.; Fletcher, P. C.
2012-01-01
Delusions are maladaptive beliefs about the world. Based upon experimental evidence that prediction error—a mismatch between expectancy and outcome—drives belief formation, this study examined the possibility that delusions form because of disrupted prediction-error processing. We used fMRI to determine prediction-error-related brain responses in 12 healthy subjects and 12 individuals (7 males) with delusional beliefs. Frontal cortex responses in the patient group were suggestive of disrupted prediction-error processing. Furthermore, across subjects, the extent of disruption was significantly related to an individual’s propensity to delusion formation. Our results support a neurobiological theory of delusion formation that implicates aberrant prediction-error signalling, disrupted attentional allocation and associative learning in the formation of delusional beliefs. PMID:17690132
Determination and error analysis of emittance and spectral emittance measurements by remote sensing
NASA Technical Reports Server (NTRS)
Dejesusparada, N. (Principal Investigator); Kumar, R.
1977-01-01
The author has identified the following significant results. From the theory of remote sensing of surface temperatures, an equation of the upper bound of absolute error of emittance was determined. It showed that the absolute error decreased with an increase in contact temperature, whereas, it increased with an increase in environmental integrated radiant flux density. Change in emittance had little effect on the absolute error. A plot of the difference between temperature and band radiance temperature vs. emittance was provided for the wavelength intervals: 4.5 to 5.5 microns, 8 to 13.5 microns, and 10.2 to 12.5 microns.
Rathert, Cheryl; May, Douglas R
2007-01-01
Experts continue to decry the lack of progress made in decreasing the alarming frequency of medical errors in health care organizations (Leape, L. L., & Berwick, D. M. (2005). Five years after To Err Is Human: What have we learned? Journal of the American Medical Association, 293(19), 2384-2390). At the same time, other experts are concerned about low job satisfaction and turnover among nurses (Keeping patients safe: Transforming the work environment of nurses. Washington, DC: National Academy Press). Research and theory suggest that a work environment that facilitates patient-centered care should increase patient safety and nurse satisfaction. The present study began with a conceptual model that specifies how work environment variables should be related to both nurse and patient outcomes. Specifically, we proposed that health care work units with climates for patient-centered care should have nurses who are more satisfied with their jobs. Such units should also have higher levels of patient safety, with fewer medication errors. We examined perceptions of nurses from three acute care hospitals in the eastern United States. Nurses who perceived their work units as more patient centered were significantly more satisfied with their jobs than were those whose units were perceived as less patient centered. Those whose work units were more patient centered reported that medication errors occurred less frequently in their units and said that they felt more comfortable reporting errors and near-misses than those in less patient-centered units. Patients and quality leaders continue to call for delivery of patient-centered care. If climates that facilitate such care are also related to improved patient safety and nurse satisfaction, proactive, patient-centered management of the work environment could result in improved patient, employee, and organizational outcomes.
NASA Astrophysics Data System (ADS)
Bao, Chuanchen; Li, Jiakun; Feng, Qibo; Zhang, Bin
2018-07-01
This paper introduces an error-compensation model for our measurement method to measure five motion errors of a rotary axis based on fibre laser collimation. The error-compensation model is established in matrix form using homogeneous coordinate transformation theory. The influences of the installation errors, error crosstalk, and manufacturing errors are analysed. The model is verified by both ZEMAX simulation and measurement experiments. The repeatability values of the radial and axial motion errors are significantly suppressed, by more than 50%, after compensation. The repeatability experiments for the five degrees of freedom motion errors and the comparison experiments for two degrees of freedom motion errors of an indexing table were performed with our measuring device and a standard instrument. The results show that the repeatability values of the angular positioning error ε_z and the tilt motion error around the Y axis ε_y are 1.2″ and 4.4″, and the comparison deviations of the two motion errors are 4.0″ and 4.4″, respectively. The repeatability values of the radial and axial motion errors, δ_y and δ_z, are 1.3 and 0.6 µm, respectively. The repeatability value of the tilt motion error around the X axis ε_x is 3.8″.
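A minimal sketch of the homogeneous-coordinate form such error models take: the small-angle 4×4 perturbation below is the textbook version, not the paper's full model (which also handles installation errors, crosstalk and manufacturing errors), and the numerical inputs simply reuse the repeatability figures quoted above:

```python
# Small-angle homogeneous transform for the motion errors of a rotary axis.

import numpy as np

def error_transform(ex, ey, ez, dx, dy, dz):
    """4x4 homogeneous transform for small angular (rad) and linear errors."""
    return np.array([
        [1.0, -ez,  ey, dx],
        [ ez, 1.0, -ex, dy],
        [-ey,  ex, 1.0, dz],
        [0.0, 0.0, 0.0, 1.0],
    ])

def ideal_rotation(theta):
    """Ideal rotary-axis motion about Z by angle theta (rad)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]])

# Actual motion = ideal rotation composed with the error perturbation.
arcsec = np.deg2rad(1.0 / 3600.0)
theta = np.deg2rad(30.0)
actual = ideal_rotation(theta) @ error_transform(
    ex=3.8 * arcsec, ey=4.4 * arcsec, ez=1.2 * arcsec,
    dx=0.0, dy=1.3e-3, dz=0.6e-3)          # linear errors in mm

# Position error this induces at a point 100 mm from the axis.
point_error = (actual - ideal_rotation(theta)) @ np.array([100.0, 0.0, 0.0, 1.0])
print(point_error[:3])
```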
Nurses' attitudes and perceived barriers to the reporting of medication administration errors.
Yung, Hai-Peng; Yu, Shu; Chu, Chi; Hou, I-Ching; Tang, Fu-In
2016-07-01
(1) To explore nurses' attitudes and perceived barriers to reporting medication administration errors and (2) to understand the characteristics of error reports and nurses' feelings about them. Under-reporting of medication administration errors is a global concern related to the safety of patient care. Understanding nurses' attitudes and perceived barriers to error reporting is the initial step to increasing the reporting rate. A cross-sectional, descriptive survey with a self-administered questionnaire was completed by the nurses of a medical centre hospital in Taiwan. A total of 306 nurses participated in the study. Nurses' attitudes towards medication administration error reporting tended to be positive. The major perceived barrier was fear of the consequences of reporting. The results demonstrated that 88.9% of medication administration errors were reported orally, whereas 19.0% were reported through the hospital internet system. Self-recrimination was a common feeling among nurses after committing a medication administration error. Even if hospital management encourages errors to be reported without recrimination, nurses' attitudes toward medication administration error reporting are not very positive, and fear is the most prominent barrier contributing to under-reporting. Nursing managers should establish anonymous reporting systems and counselling classes to create a secure atmosphere, reduce nurses' fear, and provide incentives to encourage reporting. © 2016 John Wiley & Sons Ltd.
Wu, Ya-Ke; Chu, Nain-Feng
2015-01-01
Overweight and obesity are serious public health and medical problems among children and adults worldwide. Behavioural change demonstrably contributes to weight management programs, and behavioural change-based weight loss programs require a theoretical framework. We review the transtheoretical model and organisational development theory in weight management. The transtheoretical model is an individual-level behaviour theory frequently used for weight management programs. Organisational development theory is a more complicated behaviour theory that applies to behavioural change at the system level. Both theories have their respective strengths and weaknesses. In this manuscript, we introduce the transtheoretical model and organisational development theory in the context of weight loss programs among populations that are overweight or obese. Ultimately, we wish to present a new framework/strategy for weight management by integrating these two theories. Copyright © 2015 Asian Oceanian Association for the Study of Obesity. Published by Elsevier Ltd. All rights reserved.
["Second victim" - error, crises and how to get out of it].
von Laue, N; Schwappach, D; Hochreutener, M
2012-06-01
Medical errors do not only harm patients ("first victims"). Almost all health care professionals become a so-called "second victim" at some point in their career by being involved in a medical error. Studies show that involvement in an error can have a tremendous impact on health care workers, leading to burnout, depression and professional crisis. Moreover, persons involved in errors show a decline in job performance and therefore jeopardize patient safety. Blaming the person is one of the typical psychological reactions after an error has occurred, as attribution theory explains: self-esteem is stabilized when blame can be placed on someone else, a scapegoat. But standing alone makes the emotional situation even worse. A vicious circle can evolve, with tragic effects for the individual and negative implications for patient safety and the health care setting.
Establishing a culture for patient safety - the role of education.
Milligan, Frank J
2007-02-01
This paper argues that the process of making significant moves towards a patient safety culture requires changes in healthcare education. Improvements in patient safety are a shared international priority as too many errors and other forms of unnecessary harm are currently occurring in the process of caring for and treating patients. A description of the patient safety agenda is given followed by a brief analysis of human factors theory and its use in other safety critical industries, most notably aviation. The all too common problem of drug administration errors is used to illustrate the relevance of human factors theory to healthcare education with specific mention made of the Human Factors Analysis and Classification System (HFACS).
ERIC Educational Resources Information Center
Cerveny, Robert P.
This curriculum guide provides an introduction to Management Information Systems (MIS) concepts and techniques for students preparing to develop MISs in professional settings, and to assist in MIS evaluation. According to the guide, students are exposed to concepts drawn from systems theory, information theory, management theory, data base…
Application of social domain of human mind in water management
NASA Astrophysics Data System (ADS)
Piirimäe, Kristjan
2010-05-01
Currently, researchers dispute whether a human reasons domain-generally or domain-specifically (Fiddick, 2004). The theory of several intuitive reasoning programmes in the human mind suggests that the main driver of increased problem-solving ability is the social domain (Byrne & Bates, 2009). This theory leads to the idea of applying the social domain in environmental management as well. More specifically, environmental problems might be presented through social aspects. Cosmides (1989) proposed that the most powerful programme in our social domain might be the 'cheater detection module', a genetically determined mental tool whose dedicated function is to unmask cheaters. She even suggested that only cheater detection can enable logical reasoning. Recently, this idea has found experimental proof and specification (Buchner et al., 2009). From this perspective, a participatory environmental decision support system (EDSS) requires involvement of the representatives of social control, such as environmental agencies and NGOs. These evaluators might effectively discover legal and moral inconsistencies, logical errors and other weaknesses in proposals if they are encouraged to detect cheating. Thus, instead of emphasizing only environmental concerns, the query of an artificial intelligence should emphasize cheating. Following the idea of Cosmides (1989), employing cheater detectors in an EDSS might be the only way to achieve environmental management that applies correct logical reasoning as well as both legislative requirements and conservationist morals. According to our hypothesis, representatives of social control can readily discover legal and moral inconsistencies, logical errors and other weaknesses in environmental management proposals if encouraged to detect cheating. In our social experiment, a draft plan of measures for sustainable management of the Lake Peipsi environment was proposed to representatives of social control, including the Ministry of Environment, other environmental authorities, and NGOs. These people were randomly divided into two working groups and asked to criticize the proposed plan. One group was encouraged to detect cheating behind the plan. Later, a group of independent experts evaluated the criticism of both groups and of each individual. The resulting assessments rated the group of cheater detectors as significantly more adequate decision-supporters. The results confirmed that simulating the 'cheater detection module' of the human mind might improve the performance of an EDSS. The study calls for the development of special methodologies for the stimulation and application of the social domain in water management. References: Buchner, A., Bell, R., Mehl, B., & Musch, J. (2009). No enhanced recognition memory, but better source memory for faces of cheaters. Evolution and Human Behaviour, 30(3), 212-224. Byrne, R., & Bates, L. (2009). Sociality, evolution and cognition. Current Biology, 17(16), R714-R723. Cosmides, L. (1989). The logic of social exchange: Has natural selection shaped how humans reason? Studies with the Wason selection task. Cognition, 31(3), 187-276. Fiddick, L. (2004). Domains of deontic reasoning: Resolving the discrepancy between the cognitive and moral reasoning literatures. The Quarterly Journal of Experimental Psychology, 57A(3), 447-474.
General Contingency Theory of Organizations: An Alternative to Open Systems Theory.
1982-08-01
genetic and mechanical open systems. We have recently proposed a general contingency theory (GCT) of management (Luthans and Stewart, 1977) which promises…developed in response to the need for an integrative theory of management that incorporates the environment (in the open systems sense) and begins to…management and desired performance outcomes. We will show that the GCT matrix can lead to organizational effectiveness. The Theory as a Basis for More…
Data Visualization of Item-Total Correlation by Median Smoothing
ERIC Educational Resources Information Center
Yu, Chong Ho; Douglas, Samantha; Lee, Anna; An, Min
2016-01-01
This paper aims to illustrate how data visualization could be utilized to identify errors prior to modeling, using an example with multi-dimensional item response theory (MIRT). MIRT combines item response theory and factor analysis to identify a psychometric model that investigates two or more latent traits. While it may seem convenient to…
Applying Generalizability Theory To Evaluate Treatment Effect in Single-Subject Research.
ERIC Educational Resources Information Center
Lefebvre, Daniel J.; Suen, Hoi K.
An empirical investigation of methodological issues associated with evaluating treatment effect in single-subject research (SSR) designs is presented. This investigation: (1) conducted a generalizability (G) study to identify the sources of systematic and random measurement error (SRME); (2) used an analytic approach based on G theory to integrate…
ERIC Educational Resources Information Center
Roessger, Kevin M.
2014-01-01
In work-related instrumental learning contexts, the role of reflective activities is unclear. Kolb's experiential learning theory and Mezirow's transformative learning theory predict skill adaptation as an outcome. This prediction was tested by manipulating reflective activities and assessing participants' response and error rates during novel…
Large Sample Confidence Intervals for Item Response Theory Reliability Coefficients
ERIC Educational Resources Information Center
Andersson, Björn; Xin, Tao
2018-01-01
In applications of item response theory (IRT), an estimate of the reliability of the ability estimates or sum scores is often reported. However, analytical expressions for the standard errors of the estimators of the reliability coefficients are not available in the literature and therefore the variability associated with the estimated reliability…
A Survey of Progress in Coding Theory in the Soviet Union. Final Report.
ERIC Educational Resources Information Center
Kautz, William H.; Levitt, Karl N.
The results of a comprehensive technical survey of all published Soviet literature in coding theory and its applications--over 400 papers and books appearing before March 1967--are described in this report. Noteworthy Soviet contributions are discussed, including codes for the noiseless channel, codes that correct asymmetric errors, decoding for…
Opioid errors in inpatient palliative care services: a retrospective review.
Heneka, Nicole; Shaw, Tim; Rowett, Debra; Lapkin, Samuel; Phillips, Jane L
2018-06-01
Opioids are a high-risk medicine frequently used to manage palliative patients' cancer-related pain and other symptoms. Despite the high volume of opioid use in inpatient palliative care services, and the potential for patient harm, few studies have focused on opioid errors in this population. To (i) identify the number of opioid errors reported by inpatient palliative care services, (ii) identify reported opioid error characteristics and (iii) determine the impact of opioid errors on palliative patient outcomes. A 24-month retrospective review of opioid errors reported in three inpatient palliative care services in one Australian state. Of the 55 opioid errors identified, 84% reached the patient. Most errors involved morphine (35%) or hydromorphone (29%). Opioid administration errors accounted for 76% of reported opioid errors, largely due to omitted dose (33%) or wrong dose (24%) errors. Patients were more likely to receive a lower dose of opioid than ordered as a direct result of an opioid error (57%), with errors adversely impacting pain and/or symptom management in 42% of patients. Half (53%) of the affected patients required additional treatment and/or care as a direct consequence of the opioid error. This retrospective review has provided valuable insights into the patterns and impact of opioid errors in inpatient palliative care services. Iatrogenic harm related to opioid underdosing errors contributed to palliative patients' unrelieved pain. Better understanding the factors that contribute to opioid errors and the role of safety culture in the palliative care service context warrants further investigation. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Errors of Measurement, Theory, and Public Policy. William H. Angoff Memorial Lecture Series
ERIC Educational Resources Information Center
Kane, Michael
2010-01-01
The 12th annual William H. Angoff Memorial Lecture was presented by Dr. Michael T. Kane, ETS's (Educational Testing Service) Samuel J. Messick Chair in Test Validity and the former Director of Research at the National Conference of Bar Examiners. Dr. Kane argues that it is important for policymakers to recognize the impact of errors of measurement…
ERIC Educational Resources Information Center
Weaver, Sallie J.; Newman-Toker, David E.; Rosen, Michael A.
2012-01-01
Missed, delayed, or wrong diagnoses can have a severe impact on patients, providers, and the entire health care system. One mechanism implicated in such diagnostic errors is the deterioration of cognitive diagnostic skills that are used rarely or not at all over a prolonged period of time. Existing evidence regarding maintenance of effective…
Quantum error-correcting codes from algebraic geometry codes of Castle type
NASA Astrophysics Data System (ADS)
Munuera, Carlos; Tenório, Wanderson; Torres, Fernando
2016-10-01
We study algebraic geometry codes producing quantum error-correcting codes by the CSS construction. We pay particular attention to the family of Castle codes. We show that many of the examples known in the literature in fact belong to this family of codes. We systematize these constructions by showing the common theory that underlies all of them.
The Reliability and Sources of Error of Using Rubrics-Based Assessment for Student Projects
ERIC Educational Resources Information Center
Menéndez-Varela, José-Luis; Gregori-Giralt, Eva
2018-01-01
Rubrics are widely used in higher education to assess performance in project-based learning environments. To date, the sources of error that may affect their reliability have not been studied in depth. Using generalisability theory as its starting-point, this article analyses the influence of the assessors and the criteria of the rubrics on the…
Error-correcting codes in computer arithmetic.
NASA Technical Reports Server (NTRS)
Massey, J. L.; Garcia, O. N.
1972-01-01
Summary of the most important results so far obtained in the theory of coding for the correction and detection of errors in computer arithmetic. Attempts to satisfy the stringent reliability demands upon the arithmetic unit are considered, and special attention is given to attempts to incorporate redundancy into the numbers themselves which are being processed so that erroneous results can be detected and corrected.
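One classical construction from this literature is the AN code, in which every operand is scaled by a constant A so that addition preserves divisibility by A and a nonzero residue exposes an arithmetic fault. A minimal sketch (A = 3 is only the smallest illustrative choice; the theory the paper summarizes governs how A should really be chosen for a given error model):

```python
# Minimal AN-code sketch: redundancy is carried in the numbers themselves.

A = 3

def encode(x: int) -> int:
    return A * x

def check(coded: int) -> int:
    """Decode a coded result, raising if an error corrupted the residue."""
    if coded % A != 0:
        raise ValueError("arithmetic error detected")
    return coded // A

a, b = encode(17), encode(25)
total = a + b                  # coded addition: 3*17 + 3*25 = 3*(17+25)
assert check(total) == 42

faulty = total ^ 1             # flip a low bit to simulate a hardware fault
try:
    check(faulty)
except ValueError as e:
    print(e)                   # the residue check catches the error
```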
NASA Technical Reports Server (NTRS)
Richards, W. Lance
1996-01-01
Significant strain-gage errors may exist in measurements acquired in transient-temperature environments if conventional correction methods are applied. As heating or cooling rates increase, temperature gradients between the strain-gage sensor and substrate surface increase proportionally. These temperature gradients introduce strain-measurement errors that are currently neglected in both conventional strain-correction theory and practice. Therefore, the conventional correction theory has been modified to account for these errors. A new experimental method has been developed to correct strain-gage measurements acquired in environments experiencing significant temperature transients. The new correction technique has been demonstrated through a series of tests in which strain measurements were acquired for temperature-rise rates ranging from 1 to greater than 100 degrees F/sec. Strain-gage data from these tests have been corrected with both the new and conventional methods and then compared with an analysis. Results show that, for temperature-rise rates greater than 10 degrees F/sec, the strain measurements corrected with the conventional technique produced strain errors that deviated from analysis by as much as 45 percent, whereas results corrected with the new technique were in good agreement with analytical results.
Adaptive management is an approach to natural resource management that emphasizes learning through management where knowledge is incomplete, and when, despite inherent uncertainty, managers and policymakers must act. Unlike a traditional trial-and-error approach, adaptive management…
ERIC Educational Resources Information Center
Lee, Seung Yong; Bates, Paul R.; Murray, Patrick S.; Martin, Wayne L.
2017-01-01
Threat and Error Management (TEM) training, endorsed and recommended by the International Civil Aviation Organisation (ICAO), was mandated in Australia with the aim of improving aviation safety. However, to date, there has been very limited, if any, formal post-implementation review, assessment or evaluation to examine the "after-state"…
Montuno, Michael A; Kohner, Andrew B; Foote, Kelly D; Okun, Michael S
2013-01-01
Deep brain stimulation (DBS) is an effective technique that has been utilized to treat advanced and medication-refractory movement and psychiatric disorders. In order to avoid implanted pulse generator (IPG) failure and consequent adverse symptoms, a better understanding of IPG battery longevity and management is necessary. Existing methods for battery estimation lack the specificity required for clinical incorporation. Technical challenges prevent higher accuracy longevity estimations, and a better approach to managing end of DBS battery life is needed. The literature was reviewed and DBS battery estimators were constructed by the authors and made available on the web at http://mdc.mbi.ufl.edu/surgery/dbs-battery-estimator. A clinical algorithm for management of DBS battery life was constructed. The algorithm takes into account battery estimations and clinical symptoms. Existing methods of DBS battery life estimation utilize an interpolation of averaged current drains to calculate how long a battery will last. Unfortunately, this technique can only provide general approximations. There are inherent errors in this technique, and these errors compound with each iteration of the battery estimation. Some of these errors cannot be accounted for in the estimation process, and some of the errors stem from device variation, battery voltage dependence, battery usage, battery chemistry, impedance fluctuations, interpolation error, usage patterns, and self-discharge. We present web-based battery estimators along with an algorithm for clinical management. We discuss the perils of using a battery estimator without taking into account the clinical picture. Future work will be needed to provide more reliable management of implanted device batteries; however, implementation of a clinical algorithm that accounts for both estimated battery life and for patient symptoms should improve the care of DBS patients. © 2012 International Neuromodulation Society.
Bacon, Dave; Flammia, Steven T
2009-09-18
The difficulty in producing precisely timed and controlled quantum gates is a significant source of error in many physical implementations of quantum computers. Here we introduce a simple universal primitive, adiabatic gate teleportation, which is robust to timing errors and many control errors and maintains a constant energy gap throughout the computation above a degenerate ground state space. This construction allows for geometric robustness based upon the control of two independent qubit interactions. Further, our piecewise adiabatic evolution easily relates to the quantum circuit model, enabling the use of standard methods from fault-tolerance theory for establishing thresholds.
NASA Technical Reports Server (NTRS)
1985-01-01
A mathematical theory for the development of "higher order" software to catch computer mistakes resulted from a Johnson Space Center contract for Apollo spacecraft navigation. Two women who were involved in the project formed Higher Order Software, Inc. to develop and market the system of error analysis and correction. They designed software that is logically error-free and that, in one instance, was found to increase productivity by 600%. USE.IT defines its objectives using AXES -- a user can write in English and the system converts to computer languages. It is employed by several large corporations.
[Discussion on six errors of formulas corresponding to syndromes in using the classic formulas].
Bao, Yan-ju; Hua, Bao-jin
2012-12-01
The theory of formulas corresponding to syndromes is one of the characteristics of the Treatise on Cold Damage and Miscellaneous Diseases (Shanghan Zabing Lun) and one of the main principles in applying classic prescriptions. It is important to achieve effect by following the principle of formulas corresponding to syndromes. However, some medical practitioners find that the actual clinical effect is far less than expected. Six errors in the use of classic prescriptions under the theory of formulas corresponding to syndromes are the most important causes to consider: paying attention only to the local syndromes while neglecting the whole; only to formulas corresponding to syndromes while neglecting the pathogenesis; only to syndromes while neglecting the pulse diagnosis; only to a unilateral prescription while neglecting combined prescriptions; only to classic prescriptions while neglecting modern formulas; and only to the formulas while neglecting the drug dosage. Therefore, in the clinical application of classic prescriptions and the theory of formulas corresponding to syndromes, it is necessary to consider not only the patient's clinical syndromes but also the combination of the main syndrome and its pathogenesis. In addition, comprehensive syndrome differentiation, modern formulas, current prescriptions, combined prescriptions, and drug dosage all contribute to avoiding clinical errors and improving clinical effects.
NASA Astrophysics Data System (ADS)
Zocchi, Fabio E.
2017-10-01
One of the approaches that is being tested for the integration of the mirror modules of the advanced telescope for high-energy astrophysics x-ray mission of the European Space Agency consists in aligning each module on an optical bench operated at an ultraviolet wavelength. The mirror module is illuminated by a plane wave and, in order to overcome diffraction effects, the centroid of the image produced by the module is used as a reference to assess the accuracy of the optical alignment of the mirror module itself. Among other sources of uncertainty, the wave-front error of the plane wave also introduces an error in the position of the centroid, thus affecting the quality of the mirror module alignment. The power spectral density of the position of the point spread function centroid is here derived from the power spectral density of the wave-front error of the plane wave in the framework of the scalar theory of Fourier diffraction. This allows a specification on the quality of the collimator used for generating the plane wave to be defined, starting from the contribution to the error budget allocated for the uncertainty of the centroid position. The theory applies generally whenever Fourier diffraction is a valid approximation, in which case the obtained result is identical to that derived from geometrical optics considerations.
Complexity Theory, School Leadership and Management: Questions for Theory and Practice
ERIC Educational Resources Information Center
Morrison, Keith
2010-01-01
Complexity theory (CT) has had a meteoric rise in management literature and the social sciences. Its fledgling importation into school leadership and management raises several questions and concerns. This article takes one view of CT and argues that, though its key elements have much to offer school leadership and management, caution has to be…
Inference of emission rates from multiple sources using Bayesian probability theory.
Yee, Eugene; Flesch, Thomas K
2010-03-01
The determination of atmospheric emission rates from multiple sources using inversion (regularized least-squares or best-fit technique) is known to be very susceptible to measurement and model errors in the problem, rendering the solution unusable. In this paper, a new perspective is offered for this problem: namely, it is argued that the problem should be addressed as one of inference rather than inversion. Towards this objective, Bayesian probability theory is used to estimate the emission rates from multiple sources. The posterior probability distribution for the emission rates is derived, accounting fully for the measurement errors in the concentration data and the model errors in the dispersion model used to interpret the data. The Bayesian inferential methodology for emission rate recovery is validated against real dispersion data, obtained from a field experiment involving various source-sensor geometries (scenarios) consisting of four synthetic area sources and eight concentration sensors. The recovery of discrete emission rates from three different scenarios obtained using Bayesian inference and singular value decomposition inversion are compared and contrasted.
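For the linear Gaussian case the inferential formulation has a closed form: with a source-receptor model C = Aq + noise and a Gaussian prior on the emission rates q, the posterior is also Gaussian. A minimal sketch under those assumptions (the paper's treatment of model error is more careful, and all data below are synthetic):

```python
# Conjugate Gaussian inference of emission rates from concentration data.

import numpy as np

rng = np.random.default_rng(0)
n_sensors, n_sources = 8, 4
A = rng.uniform(0.1, 1.0, (n_sensors, n_sources))   # dispersion-model kernel
q_true = np.array([2.0, 0.5, 1.0, 3.0])             # "unknown" emission rates
sigma = 0.05                                        # combined error std
c_obs = A @ q_true + rng.normal(0.0, sigma, n_sensors)

prior_var = 10.0 ** 2                               # weak zero-mean Gaussian prior
post_prec = A.T @ A / sigma**2 + np.eye(n_sources) / prior_var
post_cov = np.linalg.inv(post_prec)
post_mean = post_cov @ (A.T @ c_obs) / sigma**2

print(post_mean)                    # posterior estimates of the four rates
print(np.sqrt(np.diag(post_cov)))   # posterior uncertainties
```

Unlike a bare least-squares inversion, the posterior covariance makes explicit how ill-conditioned source-sensor geometries inflate the uncertainty rather than silently producing an unusable point solution.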
Towards Holography via Quantum Source-Channel Codes.
Pastawski, Fernando; Eisert, Jens; Wilming, Henrik
2017-07-14
While originally motivated by quantum computation, quantum error correction (QEC) is currently providing valuable insights into many-body quantum physics, such as topological phases of matter. Furthermore, mounting evidence originating from holography research (AdS/CFT) indicates that QEC should also be pertinent for conformal field theories. With this motivation in mind, we introduce quantum source-channel codes, which combine features of lossy compression and approximate quantum error correction, both of which are predicted in holography. Through a recent construction for approximate recovery maps, we derive guarantees on its erasure decoding performance from calculations of an entropic quantity called conditional mutual information. As an example, we consider Gibbs states of the transverse field Ising model at criticality and provide evidence that they exhibit nontrivial protection from local erasure. This gives rise to the first concrete interpretation of a bona fide conformal field theory as a quantum error correcting code. We argue that quantum source-channel codes are of independent interest beyond holography.
Baker, Travis E; Holroyd, Clay B
2011-04-01
The reinforcement learning theory of the error-related negativity (ERN) holds that the impact of reward signals carried by the midbrain dopamine system modulates activity of the anterior cingulate cortex (ACC), alternatively disinhibiting and inhibiting the ACC following unpredicted error and reward events, respectively. According to a recent formulation of the theory, activity that is intrinsic to the ACC produces a component of the event-related brain potential (ERP) called the N200, and following unpredicted rewards, the N200 is suppressed by extrinsically applied positive dopamine reward signals, resulting in an ERP component called the feedback-ERN (fERN). Here we demonstrate that, despite extensive spatial and temporal overlap between the two ERP components, the functional processes indexed by the N200 (conflict) and the fERN (reward) are dissociable. These results point toward avenues for future investigation. Copyright © 2011 Elsevier B.V. All rights reserved.
Gorban, A N; Mirkes, E M; Zinovyev, A
2016-12-01
Most machine learning approaches have stemmed from applying the principle of minimizing the mean squared distance, based on computationally efficient quadratic optimization methods. However, when faced with high-dimensional and noisy data, quadratic error functionals demonstrate many weaknesses, including high sensitivity to contaminating factors and the curse of dimensionality. Therefore, many recent applications in machine learning have exploited properties of non-quadratic error functionals based on the L_1 norm or even sub-linear potentials corresponding to quasinorms L_p (0 < p < 1)…
NASA Technical Reports Server (NTRS)
Podio, Fernando; Vollrath, William; Williams, Joel; Kobler, Ben; Crouse, Don
1998-01-01
Sophisticated network storage management applications are rapidly evolving to satisfy a market demand for highly reliable data storage systems with large data storage capacities and performance requirements. To preserve a high degree of data integrity, these applications must rely on intelligent data storage devices that can provide reliable indicators of data degradation. Error correction activity generally occurs within storage devices without notification to the host. Early indicators of degradation and media error monitoring and reporting (MEMR) techniques implemented in data storage devices allow network storage management applications to notify system administrators of these events and to take appropriate corrective actions before catastrophic errors occur. Although MEMR techniques have been implemented in data storage devices for many years, until 1996 no MEMR standards existed. In 1996 the American National Standards Institute (ANSI) approved the only known (world-wide) industry standard specifying MEMR techniques to verify stored data on optical disks. This industry standard was developed under the auspices of the Association for Information and Image Management (AIIM). A recently formed AIIM Optical Tape Subcommittee initiated the development of another data integrity standard specifying a set of media error monitoring tools and media error monitoring information (MEMRI) to verify stored data on optical tape media. This paper discusses the need for intelligent storage devices that can provide data integrity metadata, the content of the existing data integrity standard for optical disks, and the content of the MEMRI standard being developed by the AIIM Optical Tape Subcommittee.
The influence of cognitive load on transfer with error prevention training methods: a meta-analysis.
Hutchins, Shaun D; Wickens, Christopher D; Carolan, Thomas F; Cumming, John M
2013-08-01
The objective was to conduct research synthesis for the U.S. Army on the effectiveness of two error prevention training strategies (training wheels and scaffolding) on the transfer of training. Motivated as part of an ongoing program of research on training effectiveness, the current work presents some of the program's research into the effects on transfer of error prevention strategies during training from a cognitive load perspective. Based on cognitive load theory, two training strategies were hypothesized to reduce intrinsic load by supporting learners early in acquisition during schema development. A transfer ratio and Hedges' g were used in the two meta-analyses conducted on transfer studies employing the two training strategies. Moderators relevant to cognitive load theory and specific to the implemented strategies were examined. The transfer ratio was the ratio of treatment transfer performance to control transfer. Hedges' g was used in comparing treatment and control group standardized mean differences. Both effect sizes were analyzed with versions of sample-weighted fixed effect models. Analysis of the training wheels strategy suggests a transfer benefit. The observed benefit was strongest when the training wheels were a worked example coupled with a principle-based prompt. Analysis of the scaffolding data also suggests a transfer benefit for the strategy. Both training wheels and scaffolding demonstrated positive transfer as training strategies. As error prevention techniques, both support the intrinsic-load-reducing implications of cognitive load theory. The findings are applicable to the development of instructional design guidelines in professional skill-based organizations such as the military.
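The two effect sizes named above have standard definitions; here is a short sketch under the usual formulas (the meta-analyses' sample-weighting scheme is not reproduced, and all inputs are invented):

```python
# Transfer ratio and Hedges' g, the two effect sizes used in the synthesis.

import math

def transfer_ratio(treatment_mean: float, control_mean: float) -> float:
    """Ratio of treatment transfer performance to control transfer."""
    return treatment_mean / control_mean

def hedges_g(m1, m2, sd1, sd2, n1, n2):
    """Standardized mean difference with small-sample bias correction."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / pooled_sd
    correction = 1 - 3 / (4 * (n1 + n2) - 9)   # Hedges' small-sample factor
    return d * correction

print(transfer_ratio(78.0, 70.0))                # > 1 indicates positive transfer
print(hedges_g(78.0, 70.0, 12.0, 11.0, 30, 30))  # bias-corrected g
```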
Fujita, Masahiko
2016-03-01
Lesions of the cerebellum result in large errors in movements. The cerebellum adaptively controls the strength and timing of motor command signals depending on the internal and external environments of movements. The present theory describes how the cerebellar cortex can control signals for accurate and timed movements. A model network of the cerebellar Golgi and granule cells is shown to be equivalent to a multiple-input (from mossy fibers) hierarchical neural network with a single hidden layer of threshold units (granule cells) that receive a common recurrent inhibition (from a Golgi cell). The weighted sum of the hidden unit signals (Purkinje cell output) is theoretically analyzed regarding the capability of the network to perform two types of universal function approximation. The hidden units begin firing as the excitatory inputs exceed the recurrent inhibition. This simple threshold feature leads to the first approximation theory, and the network final output can be any continuous function of the multiple inputs. When the input is constant, this output becomes stationary. However, when the recurrent unit activity is triggered to decrease or the recurrent inhibition is triggered to increase through a certain mechanism (metabotropic modulation or extrasynaptic spillover), the network can generate any continuous signals for a prolonged period of change in the activity of recurrent signals, as the second approximation theory shows. By incorporating the cerebellar capability of these two types of approximation into a motor system, in which learning proceeds through repeated movement trials with accompanying corrections, accurate and timed responses for reaching the target can be adaptively acquired. Simple models of motor control can solve the motor-error versus sensory-error problem, as well as the structural aspects of the credit (or error) assignment problem. Two physiological experiments are proposed for examining the delay and trace conditioning of eyelid responses, as well as saccade adaptation, to investigate this novel idea of cerebellar processing. Copyright © 2015 Elsevier Ltd. All rights reserved.
Study on Network Error Analysis and Locating based on Integrated Information Decision System
NASA Astrophysics Data System (ADS)
Yang, F.; Dong, Z. H.
2017-10-01
Integrated information decision system (IIDS) integrates multiple sub-systems developed by many facilities, comprising almost a hundred kinds of software, and provides various services such as email, short messages, drawing and sharing. Because the underlying protocols differ and user standards are not unified, many errors occur during setup, configuration, and operation, which seriously affect usage. Because these errors are varied and may occur in different operation phases, stages, TCP/IP communication protocol layers, and sub-system software, it is necessary to design a network error analysis and locating tool for IIDS to solve the above problems. This paper studies network error analysis and locating based on IIDS, providing strong theoretical and technological support for the running and communication of IIDS.
A Systems Modeling Approach for Risk Management of Command File Errors
NASA Technical Reports Server (NTRS)
Meshkat, Leila
2012-01-01
The main causes of commanding errors are often (but not always) procedural: lack of maturity in the processes, incompleteness of requirements, or lack of compliance with the procedures. Other causes of commanding errors include lack of understanding of system states, inadequate communication, and making hasty changes to standard procedures in response to an unexpected event. In general, it is important to look at the big picture prior to taking corrective actions. In the case of errors traced back to procedures, considering the reliability of the process as a metric during its design may help to reduce risk. This metric is obtained by using data from the nuclear industry regarding human reliability. A structured method for the collection of anomaly data will help the operator think systematically about the anomaly and facilitate risk management. Formal models can be used for risk-based design and risk management. A generic set of models can be customized for a broad range of missions.
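One way to read the proposed metric: model the commanding procedure as serial steps, each with a human-error probability (HEP) drawn from nuclear-industry human-reliability tables, and compare candidate process designs by the product of step successes. A sketch with placeholder step names and HEPs (none of these values come from the paper):

```python
# Process reliability of a serial commanding procedure (hypothetical HEPs).

steps = {
    "generate command file": 3e-3,
    "peer review":           1e-2,
    "simulate/verify":       1e-3,
    "approve and uplink":    1e-3,
}

def process_reliability(step_heps: dict[str, float]) -> float:
    """Probability that the serial procedure completes without a human error."""
    reliability = 1.0
    for hep in step_heps.values():
        reliability *= (1.0 - hep)
    return reliability

print(f"process reliability: {process_reliability(steps):.4f}")
# Comparing this metric across candidate procedure designs supports
# risk-based design before any hasty change to a standard procedure.
```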
Abnormal Error Monitoring in Math-Anxious Individuals: Evidence from Error-Related Brain Potentials
Suárez-Pellicioni, Macarena; Núñez-Peña, María Isabel; Colomé, Àngels
2013-01-01
This study used event-related brain potentials to investigate whether math anxiety is related to abnormal error monitoring processing. Seventeen high math-anxious (HMA) and seventeen low math-anxious (LMA) individuals were presented with a numerical and a classical Stroop task. Groups did not differ in terms of trait or state anxiety. We found enhanced error-related negativity (ERN) in the HMA group when subjects committed an error on the numerical Stroop task, but not on the classical Stroop task. Groups did not differ in terms of the correct-related negativity component (CRN), the error positivity component (Pe), classical behavioral measures or post-error measures. The amplitude of the ERN was negatively related to participants’ math anxiety scores, showing a more negative amplitude as the score increased. Moreover, using standardized low resolution electromagnetic tomography (sLORETA) we found greater activation of the insula in errors on a numerical task as compared to errors in a non-numerical task only for the HMA group. The results were interpreted according to the motivational significance theory of the ERN. PMID:24236212
A modified adjoint-based grid adaptation and error correction method for unstructured grid
NASA Astrophysics Data System (ADS)
Cui, Pengcheng; Li, Bin; Tang, Jing; Chen, Jiangtao; Deng, Youqi
2018-05-01
Grid adaptation is an important strategy to improve the accuracy of output functions (e.g. drag, lift, etc.) in computational fluid dynamics (CFD) analysis and design applications. This paper presents a modified robust grid adaptation and error correction method for reducing simulation errors in integral outputs. The procedure is based on discrete adjoint optimization theory, in which the estimated global error of output functions can be directly related to the local residual error. According to this relationship, the local residual error contribution can be used as an indicator in a grid adaptation strategy designed to generate refined grids for accurately estimating the output functions. This grid adaptation and error correction method is applied to subsonic and supersonic simulations around three-dimensional configurations. Numerical results demonstrate that the grid regions to which the output functions are sensitive are detected and refined after grid adaptation, and the accuracy of the output functions is clearly improved after error correction. The proposed grid adaptation and error correction method compares very favorably in terms of output accuracy and computational efficiency with traditional feature-based grid adaptation.
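The discrete-adjoint relationship such methods build on can be written compactly; the paper's modified indicator may differ in detail, but the standard estimate is:

```latex
% Standard adjoint-weighted residual estimate of output error: the error in
% an output J is approximated by weighting the local residual R_h of the
% coarse solution u_h with the discrete adjoint \psi_h.
\begin{equation}
  J(u) - J(u_h) \;\approx\; -\,\psi_h^{\mathsf{T}}\, R_h(u_h).
\end{equation}
```

Cells where the magnitude of the adjoint-weighted residual is large are flagged for refinement, and the same product supplies the output error correction term.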
Bit-error rate for free-space adaptive optics laser communications.
Tyson, Robert K
2002-04-01
An analysis of adaptive optics compensation for atmospheric-turbulence-induced scintillation is presented with the figure of merit being the laser communications bit-error rate. The formulation covers weak, moderate, and strong turbulence; on-off keying; and amplitude-shift keying, over horizontal propagation paths or on a ground-to-space uplink or downlink. The theory shows that under some circumstances the bit-error rate can be improved by a few orders of magnitude with the addition of adaptive optics to compensate for the scintillation. Low-order compensation (less than 40 Zernike modes) appears to be feasible as well as beneficial for reducing the bit-error rate and increasing the throughput of the communication link.
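The underlying calculation averages the instantaneous on-off-keying BER over the irradiance fading. The following sketch assumes a unit-mean lognormal irradiance (a weak-turbulence model) and treats adaptive optics simply as a reduction of the log-intensity variance; all numbers are illustrative, and the paper's full analysis covers stronger turbulence and other keying schemes:

```python
# Average OOK bit-error rate over lognormal scintillation (Monte Carlo).

import numpy as np
from scipy.special import erfc

def ber_ook(snr):
    """Instantaneous bit-error rate of on-off keying at electrical SNR."""
    return 0.5 * erfc(snr / (2.0 * np.sqrt(2.0)))

def mean_ber_lognormal(snr0, sigma_lnI, n=200_000, seed=1):
    """Average the instantaneous BER over lognormal irradiance fading."""
    rng = np.random.default_rng(seed)
    # Unit-mean lognormal irradiance with log-intensity std sigma_lnI.
    I = rng.lognormal(mean=-0.5 * sigma_lnI**2, sigma=sigma_lnI, size=n)
    return ber_ook(snr0 * I).mean()

print(mean_ber_lognormal(snr0=12.0, sigma_lnI=0.8))   # uncompensated channel
print(mean_ber_lognormal(snr0=12.0, sigma_lnI=0.2))   # residual after AO
```

Shrinking the fading variance, which is what scintillation compensation does, is exactly what drives the orders-of-magnitude BER improvement described above.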
NASA Astrophysics Data System (ADS)
Wang, Jia; Hou, Xi; Wan, Yongjian; Shi, Chunyan
2017-10-01
An optimized method for calculating the error correction capability of a tool influence function (TIF) under given polishing conditions is proposed, based on the smoothing spectral function. The basic mathematical model for this method is established in theory. A set of polishing experimental data obtained with a rigid conformal tool is used to validate the optimized method. The calculated results quantitatively indicate the error correction capability of the TIF for errors of different spatial frequencies under given polishing conditions. A comparative analysis with the previous method shows that the optimized method is simpler in form and achieves results of the same accuracy in less computing time.
Software platform for managing the classification of error- related potentials of observers
NASA Astrophysics Data System (ADS)
Asvestas, P.; Ventouras, E.-C.; Kostopoulos, S.; Sidiropoulos, K.; Korfiatis, V.; Korda, A.; Uzunolglu, A.; Karanasiou, I.; Kalatzis, I.; Matsopoulos, G.
2015-09-01
Human learning is partly based on observation. Electroencephalographic recordings of subjects who perform acts (actors) or observe actors (observers), contain a negative waveform in the Evoked Potentials (EPs) of the actors that commit errors and of observers who observe the error-committing actors. This waveform is called the Error-Related Negativity (ERN). Its detection has applications in the context of Brain-Computer Interfaces. The present work describes a software system developed for managing EPs of observers, with the aim of classifying them into observations of either correct or incorrect actions. It consists of an integrated platform for the storage, management, processing and classification of EPs recorded during error-observation experiments. The system was developed using C# and the following development tools and frameworks: MySQL, .NET Framework, Entity Framework and Emgu CV, for interfacing with the machine learning library of OpenCV. Up to six features can be computed per EP recording per electrode. The user can select among various feature selection algorithms and then proceed to train one of three types of classifiers: Artificial Neural Networks, Support Vector Machines, k-nearest neighbour. Next the classifier can be used for classifying any EP curve that has been inputted to the database.
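The platform itself is C#/.NET with Emgu CV; the following is a hedged Python analog of the pipeline it describes (per-electrode features, feature selection, then one of the supported classifier types), with random data standing in for real observer EP recordings:

```python
# Python analog of the described pipeline; not the authors' implementation.

import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_trials, n_electrodes, n_feats = 120, 8, 6          # up to 6 features/electrode
X = rng.normal(size=(n_trials, n_electrodes * n_feats))
y = rng.integers(0, 2, size=n_trials)  # 0 = correct action observed, 1 = error

clf = make_pipeline(SelectKBest(f_classif, k=10),
                    KNeighborsClassifier(n_neighbors=5))
print(cross_val_score(clf, X, y, cv=5).mean())  # chance-level on random data
```

Swapping the final estimator for a support vector machine or a neural network mirrors the platform's other two classifier options.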
Managerial process improvement: a lean approach to eliminating medication delivery errors.
Hussain, Aftab; Stewart, LaShonda M; Rivers, Patrick A; Munchus, George
2015-01-01
Statistical evidence shows that medication errors are a major cause of injuries that concern all health care organizations. Despite all the efforts to improve the quality of care, the lack of understanding, and the inability of management to design a robust system that strategically targets those factors, is a major cause of distress. The paper aims to discuss these issues. Achieving optimum organizational performance requires two key variables: work process factors and human performance factors. The approach is that healthcare administrators must take both variables into account in designing a strategy to reduce medication errors. However, strategies to combat such phenomena require that managers and administrators understand the key factors causing medication delivery errors. The authors recommend that healthcare organizations implement the Toyota Production System (TPS) combined with human performance improvement (HPI) methodologies to eliminate medication delivery errors in hospitals. Despite all the efforts to improve the quality of care, there continues to be a lack of understanding and an inability of management to design a robust system that strategically targets the factors associated with medication errors. This paper proposes a solution to an ambiguous workflow process using the TPS combined with the HPI system.
New Tools and Methods for Assessing Risk-Management Strategies
2004-03-01
…Theories to evaluate the risks and benefits of various acquisition alternatives and allowed researchers to monitor the process students used to make a…revealed distinct risk-management strategies. …Utility Theories to evaluate the risks and benefits of various acquisition alternatives, and allowed us to monitor the process subjects used to arrive at…
ERIC Educational Resources Information Center
Anderson, Marc H.
2007-01-01
Management educators teaching topics such as motivation and leadership face the challenge of clearly explaining why so many diverse theories exist and why each represents a useful tool worth learning. The large number of "core" theories in these and other management domains often frustrates students, who see the lack of a single, comprehensive…
Error analysis of the Golay3 optical imaging system.
Wu, Quanying; Fan, Junliu; Wu, Feng; Zhao, Jun; Qian, Lin
2013-05-01
We use aberration theory to derive a generalized pupil function of the Golay3 imaging system when astigmatism exists in its submirrors. Theoretical analysis and numerical simulation using ZEMAX show that the point spread function (PSF) and the modulation transfer function (MTF) of the Golay3 sparse aperture system change periodically when there are piston errors. When the peak-valley value of the wavefront (PV(tilt)) due to the tilt error increases from zero to λ, the PSF and the MTF change significantly, and the direction of change is determined by the location of the submirror with the tilt error. When PV(tilt) becomes larger than λ, the PSF and the MTF remain unchanged. We calculate the peak signal-to-noise ratio (PSNR) resulting from the piston and tilt errors according to the Strehl ratio, and show that the PSNR decreases as the errors increase.
Dynamic Pulse Buckling--Theory and Experiment
1983-02-01
5.6 mm. (Values of p sinh pn for r = 0.02 are shown later in Figure 3.11.) Since the nonuniform "error" of the intended uniform velocity in the tests... amplitude of the crests, which, according to the theory, depend mainly on the nonuniformities in the velocity distribution. Figure 3.13 shows two cylinders... nonuniformities in the experimental velocity distributions, which are required for the theory, are unknown. Instead, the number of crests observed is compared
Hickey, Edward J; Nosikova, Yaroslavna; Pham-Hung, Eric; Gritti, Michael; Schwartz, Steven; Caldarone, Christopher A; Redington, Andrew; Van Arsdell, Glen S
2015-02-01
We hypothesized that the National Aeronautics and Space Administration "threat and error" model (which is derived from analyzing >30,000 commercial flights, and explains >90% of crashes) is directly applicable to pediatric cardiac surgery. We implemented a unit-wide performance initiative, whereby every surgical admission constitutes a "flight" and is tracked in real time, with the aim of identifying errors. The first 500 consecutive patients (524 flights) were analyzed, with an emphasis on the relationship between error cycles and permanent harmful outcomes. Among 524 patient flights (risk adjustment for congenital heart surgery category: 1-6; median: 2), 68 (13%) involved residual hemodynamic lesions, 13 (2.5%) permanent end-organ injuries, and 7 deaths (1.3%). Preoperatively, 763 threats were identified in 379 (72%) flights. Only 51% of patient flights (267) were error free. In the remaining 257 flights, 430 errors occurred, most commonly related to proficiency (280; 65%) or judgment (69; 16%). In most flights with errors (173 of 257; 67%), an unintended clinical state resulted, i.e., the error was consequential. In 60% of consequential errors (n = 110; 21% of total), subsequent cycles of additional error/unintended states occurred. Cycles, particularly those containing multiple errors, were very significantly associated with permanent harmful end-states, including residual hemodynamic lesions (P < .0001), end-organ injury (P < .0001), and death (P < .0001). Deaths were almost always preceded by cycles (6 of 7; P < .0001). Human error, if not mitigated, often leads to cycles of error and unintended patient states, which are dangerous and precede the majority of harmful outcomes. Efforts to manage threats and error cycles (through crew resource management techniques) are likely to yield large increases in patient safety. Copyright © 2015. Published by Elsevier Inc.
O'Connell, Emer; Pegler, Joe; Lehane, Elaine; Livingstone, Vicki; McCarthy, Nora; Sahm, Laura J; Tabirca, Sabin; O'Driscoll, Aoife; Corrigan, Mark
2016-01-01
Patient safety requires optimal management of medications. Electronic systems are encouraged to reduce medication errors. Near field communications (NFC) is an emerging technology that may be used to develop novel medication management systems. An NFC-based system was designed to facilitate prescribing, administration and review of medications commonly used on surgical wards. Final year medical, nursing, and pharmacy students were recruited to test the electronic system in a cross-over observational setting on a simulated ward. Medication errors were compared against errors recorded using a paper-based system. A significant difference in the commission of medication errors was seen when NFC and paper-based medication systems were compared. Paper use resulted in a mean of 4.09 errors per prescribing round, while NFC prescribing resulted in a mean of 0.22 errors per simulated prescribing round (P=0.000). Likewise, medication administration errors were reduced from a mean of 2.30 per drug round with the paper system to a mean of 0.80 errors per round using NFC (P<0.015). A mean satisfaction score of 2.30 was reported by users (rated on a seven-point scale with 1 denoting total satisfaction with system use and 7 denoting total dissatisfaction). An NFC-based medication system may be used to effectively reduce medication errors in a simulated ward environment. PMID:28293602
Eriksen, Janus J; Sauer, Stephan P A; Mikkelsen, Kurt V; Jensen, Hans J Aa; Kongsted, Jacob
2012-09-30
We investigate the effect of including a dynamic reaction field at the lowest possible ab initio wave function level of theory, namely the Hartree-Fock (HF) self-consistent field level, within the polarizable embedding (PE) formalism. We formulate HF-based PE within the linear response theory picture, leading to the PE-random-phase approximation (PE-RPA), and bridge the expressions to a second-order polarization propagator approximation (SOPPA) frame such that dynamic reaction field contributions are included at the RPA level in addition to the static response described at the SOPPA level but with HF induced dipole moments. We conduct calculations on para-nitro-aniline and para-nitro-phenolate using said model in addition to dynamic PE-RPA and PE-CAM-B3LYP. We compare the results to recently published PE-CCSD data and demonstrate how the cost-effective SOPPA-based model successfully recovers a great portion of the inherent PE-RPA error when the observable is the solvatochromic shift. We furthermore demonstrate that whenever the change in density resulting from the ground state-excited state electronic transition in the solute is not associated with a significant change in the electric field, dynamic response contributions formulated at the HF level of theory manage to capture the majority of the system response originating from derivative densities. Copyright © 2012 Wiley Periodicals, Inc.
Linking Theory with Practice in Basic Management
ERIC Educational Resources Information Center
Carroll, Archie B.
1974-01-01
Instructors of management in higher education have not been cautious in explaining the relation between practice and theory in their basic courses. The author distinguished between the two in suggesting that management theory is based on observed practices and may or may not have broader application. (AG)
Halting in Single Word Production: A Test of the Perceptual Loop Theory of Speech Monitoring
ERIC Educational Resources Information Center
Slevc, L. Robert; Ferreira, Victor S.
2006-01-01
The "perceptual loop theory" of speech monitoring (Levelt, 1983) claims that inner and overt speech are monitored by the comprehension system, which detects errors by comparing the comprehension of formulated utterances to originally intended utterances. To test the perceptual loop monitor, speakers named pictures and sometimes attempted to halt…
Characterizing Sources of Uncertainty in Item Response Theory Scale Scores
ERIC Educational Resources Information Center
Yang, Ji Seung; Hansen, Mark; Cai, Li
2012-01-01
Traditional estimators of item response theory scale scores ignore uncertainty carried over from the item calibration process, which can lead to incorrect estimates of the standard errors of measurement (SEMs). Here, the authors review a variety of approaches that have been applied to this problem and compare them on the basis of their statistical…
The Impact of Causality on Information-Theoretic Source and Channel Coding Problems
ERIC Educational Resources Information Center
Palaiyanur, Harikrishna R.
2011-01-01
This thesis studies several problems in information theory where the notion of causality comes into play. Causality in information theory refers to the timing of when information is available to parties in a coding system. The first part of the thesis studies the error exponent (or reliability function) for several communication problems over…
ERIC Educational Resources Information Center
Erwin, T. Dary
Rating scales are a typical method for evaluating a student's performance in outcomes assessment. The analysis of the quality of information from rating scales poses special measurement problems when researchers work with faculty in their development. Generalizability measurement theory offers a set of techniques for estimating errors or…
The free-energy principle: a unified brain theory?
Friston, Karl
2010-02-01
A free-energy principle has been proposed recently that accounts for action, perception and learning. This Review looks at some key brain theories in the biological (for example, neural Darwinism) and physical (for example, information theory and optimal control theory) sciences from the free-energy perspective. Crucially, one key theme runs through each of these theories - optimization. Furthermore, if we look closely at what is optimized, the same quantity keeps emerging, namely value (expected reward, expected utility) or its complement, surprise (prediction error, expected cost). This is the quantity that is optimized under the free-energy principle, which suggests that several global brain theories might be unified within a free-energy framework.
Overview of error-tolerant cockpit research
NASA Technical Reports Server (NTRS)
Abbott, Kathy
1990-01-01
The objectives of research in intelligent cockpit aids and intelligent error-tolerant systems are stated. In intelligent cockpit aids research, the objective is to provide increased aid and support to the flight crew of civil transport aircraft through the use of artificial intelligence techniques combined with traditional automation. In intelligent error-tolerant systems, the objective is to develop and evaluate cockpit systems that provide flight crews with safe and effective ways and means to manage aircraft systems, plan and replan flights, and respond to contingencies. A subsystems fault management functional diagram is given. All information is in viewgraph form.
Managing corporate capabilities: theory and industry approaches.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Slavin, Adam M.
2007-02-01
This study characterizes theoretical and industry approaches to organizational capabilities management and ascertains whether there is a distinct "best practice" in this regard. We consider both physical capabilities, such as technical disciplines and infrastructure, and non-physical capabilities such as corporate culture and organizational procedures. We examine Resource-Based Theory (RBT), which is the predominant organizational management theory focused on capabilities. RBT seeks to explain the effect of capabilities on competitiveness, and thus provide a basis for investment/divestment decisions. We then analyze industry approaches described to us in interviews with representatives from Goodyear, 3M, Intel, Ford, NASA, Lockheed Martin, and Boeing. We found diversity amongst the industry capability management approaches. Although all organizations manage capabilities and consider them to some degree in their strategies, no two approaches that we observed were identical. Furthermore, we observed that theory is not a strong driver in this regard. No organization used the term "Resource-Based Theory", nor did any organization mention any other guiding theory or practice from the organizational management literature when explaining their capabilities management approaches. As such, we concluded that there is no single best practice for capabilities management. Nevertheless, we believe that RBT and the diverse industry experiences described herein can provide useful insights to support development of capabilities management approaches.
A Middle-Range Theory for Diabetes Self-management Mastery.
Fearon-Lynch, Jennifer A; Stover, Caitlin M
2015-01-01
Diabetes mellitus is the seventh leading cause of death in America and affects 382 million people worldwide. Individuals with diabetes must manage the complexity of the disease, its treatment, and complications to avert deleterious consequences associated with the illness. However, not all patients with diabetes successfully gain mastery to positively impact self-management. A new middle-range theory is proposed that merges 2 extant theories, theory of mastery and organismic integration theory, to better understand this human response. The theories' philosophical, theoretical, and conceptual perspectives were examined and relational properties synthesized to provide a conceptual representation of the phenomenon of interest.
[Does clinical risk management require a structured conflict management?].
Neumann, Stefan
2015-01-01
A key element of clinical risk management is the analysis of errors causing near misses or patient damage. After analyzing the causes and circumstances, measures for process improvement have to be taken. Process management, human resource development and other established methods are used. If an interpersonal conflict is a contributory factor to the error, there is usually no structured conflict management available that includes selection criteria for the various methods of conflict processing. The European University Viadrina in Frankfurt (Oder) has created a process model for introducing a structured conflict management system which is suitable for hospitals and could fill the gap in the methodological spectrum of clinical risk management. There is initial evidence that structured conflict management reduces staff fluctuation and hidden conflict costs. This article should be understood as an impulse for discussion of the extent to which the range of methods of clinical risk management should be complemented by conflict management.
Development of an FAA-EUROCONTROL technique for the analysis of human error in ATM : final report.
DOT National Transportation Integrated Search
2002-07-01
Human error has been identified as a dominant risk factor in safety-oriented industries such as air traffic control (ATC). However, little is known about the factors leading to human errors in current air traffic management (ATM) systems. The first s...
ERIC Educational Resources Information Center
Cook, Desmond L.
This document, one of a series of reports examining the possible contribution of other disciplines to evaluation methodology, describes the major elements of general systems theory (GST), cybernetics theory (CT) and management control theory (MCT). The author suggests that MCT encapsulates major concerns of evaluation since it reveals that…
NASA Astrophysics Data System (ADS)
Xu, Chunmei; Huang, Fu-yu; Yin, Jian-ling; Chen, Yu-dan; Mao, Shao-juan
2016-10-01
The influence of aberration on the misalignment of an optical system is fully considered, the deficiencies of the Gaussian optics correction method are pointed out, and a correction method for transmission-type misaligned optical systems is proposed based on aberration theory. The variation of single-lens aberration caused by axial displacement is analyzed, and the aberration effect is defined. On this basis, by calculating the lens adjustment induced by the image position error and the magnification error, a misalignment correction formula based on aberration constraints is deduced mathematically. Taking a three-lens collimation system as an example, a test is carried out to validate this method, and its superiority is demonstrated.
A formal theory of feature binding in object perception.
Ashby, F G; Prinzmetal, W; Ivry, R; Maddox, W T
1996-01-01
Visual objects are perceived correctly only if their features are identified and then bound together. Illusory conjunctions result when feature identification is correct but an error occurs during feature binding. A new model is proposed that assumes feature binding errors occur because of uncertainty about the location of visual features. This model accounted for data from 2 new experiments better than a model derived from A. M. Treisman and H. Schmidt's (1982) feature integration theory. The traditional method for detecting the occurrence of true illusory conjunctions is shown to be fundamentally flawed. A reexamination of 2 previous studies provided new insights into the role of attention and location information in object perception and a reinterpretation of the deficits in patients who exhibit attentional disorders.
How accurate are lexile text measures?
Stenner, A Jackson; Burdick, Hal; Sanford, Eleanor E; Burdick, Donald S
2006-01-01
The Lexile Framework for Reading models comprehension as the difference between a reader measure and a text measure. Uncertainty in comprehension rates results from unreliability in reader measures and inaccuracy in text readability measures. Whole-text processing eliminates sampling error in text measures. However, Lexile text measures are imperfect due to misspecification of the Lexile theory. The standard deviation component associated with theory misspecification is estimated at 64L for a standard-length passage (approximately 125 words). A consequence is that standard errors for longer texts (2,500 to 150,000 words) are measured on the Lexile scale with uncertainties in the single digits. Uncertainties in expected comprehension rates are largely due to imprecision in reader ability and not inaccuracies in text readabilities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Podeszwa, Rafal; Department of Physics and Astronomy, University of Delaware, Newark, Delaware 19716; Szalewicz, Krzysztof
2012-04-28
Density-functional theory (DFT) revolutionized the ability of computational quantum mechanics to describe properties of matter and is by far the most often used method. However, all the standard variants of DFT fail to predict intermolecular interaction energies. In recent years, a number of ways to go around this problem have been proposed. We show that some of these approaches can reproduce interaction energies with median errors of only about 5% in the complete range of intermolecular configurations. Such errors are comparable to typical uncertainties of wave-function-based methods in practical applications. Thus, these DFT methods are expected to find broad applications in modelling of condensed phases and of biomolecules.
Hindsight Bias and Developing Theories of Mind
Bernstein, Daniel M.; Atance, Cristina; Meltzoff, Andrew N.; Loftus, Geoffrey R.
2013-01-01
Although hindsight bias (the “I knew it all along” phenomenon) has been documented in adults, its development has not been investigated. This is despite the fact that hindsight bias errors closely resemble the errors children make on theory of mind (ToM) tasks. Two main goals of the present work were to (a) create a battery of hindsight tasks for preschoolers, and (b) assess the relation between children’s performance on these and ToM tasks. In two experiments involving 144 preschoolers, 3-, 4-, and 5-year olds exhibited strong hindsight bias. Performance on hindsight and ToM tasks was significantly correlated independent of age, language ability, and inhibitory control. These findings contribute to a more comprehensive account of perspective taking across the lifespan. PMID:17650144
Diagnostic Reasoning and Cognitive Biases of Nurse Practitioners.
Lawson, Thomas N
2018-04-01
Diagnostic reasoning is often used colloquially to describe the process by which nurse practitioners and physicians come to the correct diagnosis, but a rich definition and description of this process has been lacking in the nursing literature. A literature review was conducted with theoretical sampling seeking conceptual insight into diagnostic reasoning. Four common themes emerged: Cognitive Biases and Debiasing Strategies, the Dual Process Theory, Diagnostic Error, and Patient Harm. Relevant cognitive biases are discussed, followed by debiasing strategies and application of the dual process theory to reduce diagnostic error and harm. The accuracy of diagnostic reasoning of nurse practitioners may be improved by incorporating these items into nurse practitioner education and practice. [J Nurs Educ. 2018;57(4):203-208.]. Copyright 2018, SLACK Incorporated.
Computer discrimination procedures applicable to aerial and ERTS multispectral data
NASA Technical Reports Server (NTRS)
Richardson, A. J.; Torline, R. J.; Allen, W. A.
1970-01-01
Two statistical models are compared in the classification of crops recorded on color aerial photographs. A theory of error ellipses is applied to the pattern recognition problem. An elliptical boundary condition classification model (EBC), useful for recognition of candidate patterns, evolves out of error ellipse theory. The EBC model is compared with the minimum distance to the mean (MDM) classification model in terms of pattern recognition ability. The pattern recognition results of both models are interpreted graphically using scatter diagrams to represent measurement space. Measurement space, for this report, is determined by optical density measurements collected from Kodak Ektachrome Infrared Aero Film 8443 (EIR). The EBC model is shown to be a significant improvement over the MDM model.
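A minimal sketch of the two decision rules compared above may help: minimum distance to the mean (MDM) assigns a pattern to the nearest class mean, while an elliptical boundary condition (EBC) rule, realized here as a per-class Mahalanobis-distance test, can also reject candidate patterns that fall outside every class ellipse. The two-band feature space, class statistics, and threshold below are illustrative assumptions, not values from the paper.

```python
# Hedged sketch of MDM vs. EBC classification (illustrative data only).
import numpy as np

def mdm_classify(x, means):
    """Assign x to the class whose mean is nearest in Euclidean distance."""
    return int(np.argmin([np.linalg.norm(x - m) for m in means]))

def ebc_classify(x, means, covs, threshold):
    """Accept x for the first class whose error ellipse contains it
    (squared Mahalanobis distance below threshold); else reject."""
    for k, (m, S) in enumerate(zip(means, covs)):
        d2 = (x - m) @ np.linalg.inv(S) @ (x - m)
        if d2 <= threshold:
            return k
    return -1  # candidate pattern matches no class ellipse

# Two synthetic crop classes in a 2-band optical-density space.
means = [np.array([0.2, 0.5]), np.array([0.7, 0.3])]
covs = [np.eye(2) * 0.01, np.eye(2) * 0.02]
x = np.array([0.25, 0.45])
# 9.21 = chi-square 99% quantile for 2 degrees of freedom.
print(mdm_classify(x, means), ebc_classify(x, means, covs, threshold=9.21))
```

The ability to return "unrecognized" for candidate patterns outside all ellipses is the practical difference the abstract attributes to the EBC model.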
He, Ding-Xin; Ling, Guang; Guan, Zhi-Hong; Hu, Bin; Liao, Rui-Quan
2018-02-01
This paper focuses on the collective dynamics of multisynchronization among heterogeneous genetic oscillators under a partial impulsive control strategy. The coupled nonidentical genetic oscillators are modeled by differential equations with uncertainties. The definition of multisynchronization is proposed to describe some more general synchronization behaviors in the real world. Considering that each genetic oscillator consists of a large number of biochemical molecules, we design a more manageable impulsive strategy for dynamic networks to achieve multisynchronization. Not all the molecules but only a small fraction of them in each genetic oscillator are controlled at each impulsive instant. Theoretical analysis of multisynchronization is carried out by the control theory approach, and a sufficient condition of the partial impulsive controller for multisynchronization with given error bounds is established. At last, numerical simulations are exploited to demonstrate the effectiveness of our results.
Latent error detection: A golden two hours for detection.
Saward, Justin R E; Stanton, Neville A
2017-03-01
Undetected error in safety critical contexts generates a latent condition that can contribute to a future safety failure. The detection of latent errors post-task completion is observed in naval air engineers using a diary to record work-related latent error detection (LED) events. A systems view is combined with multi-process theories to explore sociotechnical factors associated with LED. Perception of cues in different environments facilitates successful LED, for which the deliberate review of past tasks within two hours of the error occurring, whilst remaining in the same or similar sociotechnical environment to that in which the error occurred, appears most effective. Identified ergonomic interventions offer potential mitigation for latent errors, particularly in simple everyday habitual tasks. It is thought that safety critical organisations should look to engineer further resilience through the application of LED techniques that engage with system cues across the entire sociotechnical environment, rather than relying on consistent human performance. Crown Copyright © 2016. Published by Elsevier Ltd. All rights reserved.
Error Sources in Proccessing LIDAR Based Bridge Inspection
NASA Astrophysics Data System (ADS)
Bian, H.; Chen, S. E.; Liu, W.
2017-09-01
Bridge inspection is a critical task in infrastructure management and is facing unprecedented challenges after a series of bridge failures. Prevailing visual inspection has been insufficient in providing reliable, quantitative bridge information, even though a systematic quality management framework was built to ensure the quality of visual inspection data and to minimize errors during the inspection process. LiDAR-based remote sensing is recommended as an effective tool for overcoming some of the disadvantages of visual inspection. In order to evaluate the potential of applying this technology to bridge inspection, some of the error sources in LiDAR-based bridge inspection are analysed. Scanning angle variance during field data collection and differences in algorithm design during scan data processing were found to introduce errors into inspection results. Beyond studying the error sources, further consideration should be given to improving inspection data quality, and statistical analysis might be employed in the future to evaluate an inspection process that contains a series of uncertain factors. Overall, the development of a reliable bridge inspection system requires not only the improvement of data processing algorithms, but also systematic measures to mitigate possible errors in the entire inspection workflow. If LiDAR or some other technology is to be accepted as a supplement to visual inspection, the current quality management framework will need to be modified or redesigned, and this is as urgent as the refinement of inspection techniques.
ERIC Educational Resources Information Center
Bassett, Jonathan F.
2007-01-01
The author attempts to integrate Terror Management Theory (TMT) and R. W. Firestone's Separation Theory (1984, 1994). Both theories emphasize defense against death anxiety as a key human motive. Whereas TMT focuses extensively on self-esteem and cultural worldview, Firestone posited additional defenses such as gene survival, self-nourishing…
Park, Jong Cook; Kim, Kwang Sig
2012-03-01
The reliability of a test is determined by the characteristics of its items. Item analysis can be carried out with classical test theory or item response theory. The purpose of this study was to compare discrimination indices with item response theory using the Rasch model. Thirty-one 4th-year medical school students participated in a clinical course written examination, which included 22 A-type items and 3 R-type items. The point biserial correlation coefficient (C(pbs)) was compared to the extreme-group method (D), the biserial correlation coefficient (C(bs)), the item-total correlation coefficient (C(it)), and the corrected item-total correlation coefficient (C(cit)). The Rasch model was applied to estimate item difficulty and examinee ability and to calculate item fit statistics using joint maximum likelihood. The explanatory power (r²) of C(pbs) decreased in the following order: C(cit) (1.00), C(it) (0.99), C(bs) (0.94), and D (0.45). The ranges of the difficulty logits and their standard errors were -0.82 to 0.80 and 0.37 to 0.76; the ranges of the ability logits and their standard errors were -3.69 to 3.19 and 0.45 to 1.03. Items 9 and 23 had outfit ≥ 1.3. Students 1, 5, 7, 18, 26, 30, and 32 had fit ≥ 1.3. C(pbs), C(cit), and C(it) are good discrimination parameters. The Rasch model can estimate the item difficulty parameter and the examinee ability parameter with standard errors, and the fit statistics can identify bad items and unpredictable examinee responses.
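Two of the classical-test-theory indices compared above are easy to state concretely. The sketch below computes the point biserial correlation (C(pbs)) and the extreme-group index (D) for made-up item responses; the 27% grouping fraction is a common convention and an assumption here, not necessarily the study's choice.

```python
# Hedged sketch of two discrimination indices on made-up data.
import numpy as np

def point_biserial(item, total):
    """Correlation between a 0/1 item score and the total test score."""
    item, total = np.asarray(item, float), np.asarray(total, float)
    p = item.mean()
    return (total[item == 1].mean() - total[item == 0].mean()) \
        * np.sqrt(p * (1 - p)) / total.std()

def extreme_group_d(item, total, frac=0.27):
    """Difference in item proportion-correct between top and bottom groups."""
    order = np.argsort(total)
    n = max(1, int(len(total) * frac))
    low, high = np.asarray(item)[order[:n]], np.asarray(item)[order[-n:]]
    return high.mean() - low.mean()

item = [1, 0, 1, 1, 0, 1, 0, 1, 1, 0]          # 0/1 scores on one item
total = [20, 12, 18, 22, 10, 19, 14, 21, 17, 11]  # total test scores
print(point_biserial(item, total), extreme_group_d(item, total))
```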
Quantum channels and memory effects
NASA Astrophysics Data System (ADS)
Caruso, Filippo; Giovannetti, Vittorio; Lupo, Cosmo; Mancini, Stefano
2014-10-01
Any physical process can be represented as a quantum channel mapping an initial state to a final state. Hence it can be characterized from the point of view of communication theory, i.e., in terms of its ability to transfer information. Quantum information provides a theoretical framework and the proper mathematical tools to accomplish this. In this context the notion of codes and communication capacities have been introduced by generalizing them from the classical Shannon theory of information transmission and error correction. The underlying assumption of this approach is to consider the channel not as acting on a single system, but on sequences of systems, which, when properly initialized allow one to overcome the noisy effects induced by the physical process under consideration. While most of the work produced so far has been focused on the case in which a given channel transformation acts identically and independently on the various elements of the sequence (memoryless configuration in jargon), correlated error models appear to be a more realistic way to approach the problem. A slightly different, yet conceptually related, notion of correlated errors applies to a single quantum system which evolves continuously in time under the influence of an external disturbance which acts on it in a non-Markovian fashion. This leads to the study of memory effects in quantum channels: a fertile ground where interesting novel phenomena emerge at the intersection of quantum information theory and other branches of physics. A survey is taken of the field of quantum channels theory while also embracing these specific and complex settings.
Schiffino, Felipe L; Zhou, Vivian; Holland, Peter C
2014-02-01
Within most contemporary learning theories, reinforcement prediction error, the difference between the obtained and expected reinforcer value, critically influences associative learning. In some theories, this prediction error determines the momentary effectiveness of the reinforcer itself, such that the same physical event produces more learning when its presentation is surprising than when it is expected. In other theories, prediction error enhances attention to potential cues for that reinforcer by adjusting cue-specific associability parameters, biasing the processing of those stimuli so that they more readily enter into new associations in the future. A unique feature of these latter theories is that such alterations in stimulus associability must be represented in memory in an enduring fashion. Indeed, considerable data indicate that altered associability may be expressed days after its induction. Previous research from our laboratory identified brain circuit elements critical to the enhancement of stimulus associability by the omission of an expected event, and to the subsequent expression of that altered associability in more rapid learning. Here, for the first time, we identified a brain region, the posterior parietal cortex, as a potential site for a memorial representation of altered stimulus associability. In three experiments using rats and a serial prediction task, we found that intact posterior parietal cortex function was essential during the encoding, consolidation, and retrieval of an associability memory enhanced by surprising omissions. We discuss these new results in the context of our previous findings and additional plausible frontoparietal and subcortical networks. © 2013 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
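The associability mechanism described above is commonly formalized in Pearce-Hall-style models. The sketch below is a generic, hedged rendering of that update rule, not the authors' model: a surprising omission (a large absolute prediction error) raises the cue's associability alpha, which then scales future learning. All parameter values are arbitrary.

```python
# Hedged Pearce-Hall-style sketch: omission of an expected reinforcer
# boosts associability (attention), which speeds subsequent learning.
def pearce_hall_step(V, alpha, lam, S=0.5, gamma=0.8):
    """One trial: update associative strength V and associability alpha."""
    prediction_error = lam - V               # lam = obtained reinforcer value
    V_new = V + S * alpha * prediction_error         # learning scaled by alpha
    alpha_new = gamma * abs(prediction_error) + (1 - gamma) * alpha
    return V_new, alpha_new

V, alpha = 0.8, 0.2               # well-trained cue: high V, low associability
V, alpha = pearce_hall_step(V, alpha, lam=0.0)    # surprising omission trial
print(V, alpha)                   # alpha jumps (to ~0.68): faster relearning
```

On this reading, the parietal findings concern where the elevated alpha is stored between its induction and its later expression in learning.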
Social Role Theory and Social Role Valorization for Care Management Practice.
Blakely, Thomas J; Dziadosz, Gregory M
2015-01-01
This article proposes that social role theory (SRT) and social role valorization (SRV) be established as organizing theories for care managers. SRT is a recognized sociological theory that has a distinctive place in care management practice. SRV is an adjunct for SRT that focuses on people who are devalued by being in a negative social position and supports behavior change and movement to a valued social position.
Applying Theory Y to Library Management
ERIC Educational Resources Information Center
Morton, Donald J.
1975-01-01
Reviews the principles of the Theory Y approach, reports upon its coverage in library literature, distinguishes between the concepts of Theory Y and participative management, and discusses how Theory Y's application in a small academic library recommends its use for library operations in general. (Author)
Medication knowledge, certainty, and risk of errors in health care: a cross-sectional study
2011-01-01
Background Medication errors are often involved in reported adverse events. Drug therapy, prescribed by physicians, is mostly carried out by nurses, who are expected to master all aspects of medication. Research has revealed the need for improved knowledge in drug dose calculation, and medication knowledge as a whole is poorly investigated. The purpose of this survey was to study registered nurses' medication knowledge, certainty and estimated risk of errors, and to explore factors associated with good results. Methods Nurses from hospitals and primary health care establishments were invited to carry out a multiple-choice test in pharmacology, drug management and drug dose calculations (score range 0-14). Self-estimated certainty in each answer was recorded, graded from 0 = very uncertain to 3 = very certain. Background characteristics and sense of coping were recorded. Risk of error was estimated by combining knowledge and certainty scores. The results are presented as mean (±SD). Results Two-hundred and three registered nurses participated (including 16 males), aged 42.0 (9.3) years with a working experience of 12.4 (9.2) years. Knowledge scores in pharmacology, drug management and drug dose calculations were 10.3 (1.6), 7.5 (1.6), and 11.2 (2.0), respectively, and certainty scores were 1.8 (0.4), 1.9 (0.5), and 2.0 (0.6), respectively. Fifteen percent of the total answers showed a high risk of error, with 25% in drug management. Independent factors associated with high medication knowledge were working in hospitals (p < 0.001), postgraduate specialization (p = 0.01) and completion of courses in drug management (p < 0.01). Conclusions Medication knowledge was found to be unsatisfactory among practicing nurses, with a significant risk for medication errors. The study revealed a need to improve the nurses' basic knowledge, especially when referring to drug management. PMID:21791106
Risk prediction and aversion by anterior cingulate cortex.
Brown, Joshua W; Braver, Todd S
2007-12-01
The recently proposed error-likelihood hypothesis suggests that anterior cingulate cortex (ACC) and surrounding areas will become active in proportion to the perceived likelihood of an error. The hypothesis was originally derived from a computational model prediction. The same computational model now makes a further prediction that ACC will be sensitive not only to predicted error likelihood, but also to the predicted magnitude of the consequences, should an error occur. The product of error likelihood and predicted error consequence magnitude collectively defines the general "expected risk" of a given behavior in a manner analogous but orthogonal to subjective expected utility theory. New fMRI results from an incentive change signal task now replicate the error-likelihood effect, validate the further predictions of the computational model, and suggest why some segments of the population may fail to show an error-likelihood effect. In particular, error-likelihood effects and expected risk effects in general indicate greater sensitivity to earlier predictors of errors and are seen in risk-averse but not risk-tolerant individuals. Taken together, the results are consistent with an expected risk model of ACC and suggest that ACC may generally contribute to cognitive control by recruiting brain activity to avoid risk.
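Read literally, the "expected risk" quantity above is a product of predicted error likelihood and predicted consequence magnitude; a hedged formalization in our own notation (not taken from the paper) is:

```latex
% Expected risk R of behavior a, summed over possible errors e:
% error likelihood times consequence magnitude (our notation).
R(a) = \sum_{e} P(e \mid a)\,\bigl|C(e)\bigr|
```

Here P(e|a) is the predicted likelihood of error e given behavior a and |C(e)| the predicted magnitude of its consequences, analogous but orthogonal to subjective expected utility.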
Panel positioning error and support mechanism for a 30-m THz radio telescope
NASA Astrophysics Data System (ADS)
Yang, De-Hua; Okoh, Daniel; Zhou, Guo-Hua; Li, Ai-Hua; Li, Guo-Ping; Cheng, Jing-Quan
2011-06-01
A 30-m TeraHertz (THz) radio telescope is proposed to operate at 200 μm with an active primary surface. This paper presents a sensitivity analysis of active-surface panel positioning errors against optical performance in terms of the Strehl ratio. Based on Ruze's surface error theory and using a Monte Carlo simulation, the effects of six rigid panel positioning errors, namely piston, tip, tilt, radial, azimuthal and twist displacements, were directly derived. The optical performance of the telescope was then evaluated using the standard Strehl ratio. We graphically illustrated the various panel error effects by presenting simulations of complete ensembles of full reflector surface errors for the six different rigid panel positioning errors. The sensitivity analysis revealed that the piston and tilt/tip errors are dominant while the other rigid errors are much less important. Furthermore, guided by these results, we conceived of an alternative Master-Slave Concept-based (MSC-based) active surface by implementing a special Series-Parallel Concept-based (SPC-based) hexapod as the active panel support mechanism. A new 30-m active reflector based on the two concepts was demonstrated to achieve correction for all six rigid panel positioning errors in an economically feasible way.
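The Ruze surface-error theory invoked above relates the rms surface error to optical performance; in its common small-error form (our rendering, with symbols not taken from the paper) the Strehl-like efficiency is:

```latex
% Ruze relation: efficiency S versus rms surface error epsilon
% at operating wavelength lambda (standard small-error form).
S \approx \exp\!\left[-\left(\frac{4\pi\varepsilon}{\lambda}\right)^{2}\right]
```

At λ = 200 μm, an rms panel error of 10 μm would give S ≈ exp(-(0.2π)²) ≈ 0.67, which illustrates why micrometre-level panel positioning matters for a THz dish.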
NASA Technical Reports Server (NTRS)
Diorio, Kimberly A.; Voska, Ned (Technical Monitor)
2002-01-01
This viewgraph presentation provides information on Human Factors Process Failure Modes and Effects Analysis (HF PFMEA). HF PFMEA includes the following 10 steps: describe the mission; define the system; identify human-machine interfaces; list human actions; identify potential errors; identify factors that affect error; determine the likelihood of error; determine the potential effects of errors; evaluate the risk; and generate solutions (manage error). The presentation also describes how this analysis was applied to a liquid oxygen pump acceptance test.
Application of hard sphere perturbation theory for thermodynamics of model liquid metals
NASA Astrophysics Data System (ADS)
Mon, K. K.
2001-06-01
Hard sphere perturbation theory (HSPT) has contributed to the fundamental understanding of dense fluids for over 30 years. In recent decades, other techniques have been more popular. In this paper, we argue for the revival of hard sphere perturbation theory for the study of the thermodynamics of dense liquids in general, and liquid metals in particular. The weakness of HSPT is now well understood and can be easily overcome by using a simple, convenient Monte Carlo method to calculate the intrinsic error of the HSPT free energy density. To demonstrate this approach, we consider models of liquid aluminum and sodium. We obtain the intrinsic error of HSPT with the Monte Carlo method. HSPT is shown to provide a lower free-energy upper bound than the one-component plasma (OCP) for alkali metals and polyvalent metals. We are thus able to provide insight into the long-standing observation that an OCP is a better reference system than a hard sphere system for alkali metals.
Mostafaei, Davoud; Barati Marnani, Ahmad; Mosavi Esfahani, Haleh; Estebsari, Fatemeh; Shahzaidi, Shiva; Jamshidi, Ensiyeh; Aghamiri, Seyed Samad
2014-10-01
About one third of unwanted reported medication consequences are due to medication errors, resulting in one-fifth of hospital injuries. The aim of this study was to determine the formal and informal reporting of medication errors by nurses and the importance of factors behind nurses' refusal to report medication errors. The cross-sectional study was done on the nursing staff of Shohada Tajrish Hospital, Tehran, Iran in 2012. The data were gathered through a questionnaire made by the researchers. The questionnaire's face and content validity were confirmed by experts, and test-retest was used to measure its reliability. The data were analyzed by descriptive statistics; we used SPSS for the related statistical analyses. The most important factors in refusal to report medication errors were: lack of a medication error recording and reporting system in the hospital (3.3%), the perceived insignificance of error reports to hospital authorities and lack of appropriate feedback (3.1%), and lack of a clear definition of a medication error (3%). There was both formal and informal reporting of medication errors in this study. Factors pertaining to management in hospitals, as well as the fear of the consequences of reporting, are two broad fields among the factors that make nurses not report their medication errors. In this regard, providing enough education to nurses, boosting job security for nurses, management support and revising related processes and definitions are some factors that can help decrease medication errors and increase their reporting when they occur.
A new approach to flow simulation using hybrid models
NASA Astrophysics Data System (ADS)
Solgi, Abazar; Zarei, Heidar; Nourani, Vahid; Bahmani, Ramin
2017-11-01
The necessity of flow prediction in rivers for proper water resource management, together with the need to determine inflows to dam reservoirs, design efficient flood warning systems, and so forth, has always led water researchers to seek models with fast response and low error. In recent years, the development of artificial neural networks and wavelet theory, and the use of combinations of models, has helped researchers estimate river flow better and better. In this study, daily and monthly scales were used to simulate the flow of the Gamasiyab River, Nahavand, Iran. The first simulation was done using two types of models, ANN and ANFIS. Then, using wavelet theory, the input signals of the parameters were decomposed into sub-signals, which were fed into the ANN and ANFIS to obtain the hybrid WANN and WANFIS models. In this study, in addition to precipitation and flow, the parameters of temperature and evaporation were used to analyze their effects on the simulation. The results showed that using the wavelet transform improved the performance of the models on both the monthly and daily scales; however, it had a greater effect on the monthly scale, and WANFIS was the best model.
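The hybrid idea (decompose inputs with a discrete wavelet transform, feed the sub-signals to a network) can be sketched as below. This is a hedged toy analogue of WANN, not the paper's model: the data are synthetic, and the wavelet ("db4"), decomposition level, and network size are illustrative assumptions.

```python
# Hedged WANN-style sketch: wavelet sub-signals of an input series feed an ANN.
import numpy as np
import pywt
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
rain = rng.gamma(2.0, 2.0, 512)                      # synthetic rainfall series
flow = np.convolve(rain, np.exp(-np.arange(10) / 3.0), mode="same")  # toy flow

coeffs = pywt.wavedec(rain, "db4", level=3)          # [cA3, cD3, cD2, cD1]

def subsignal(coeffs, keep, n, wavelet="db4"):
    """Reconstruct the time-domain sub-signal of one coefficient set."""
    parts = [c if i == keep else np.zeros_like(c) for i, c in enumerate(coeffs)]
    return pywt.waverec(parts, wavelet)[:n]

X = np.column_stack([subsignal(coeffs, i, len(rain))
                     for i in range(len(coeffs))])   # one column per sub-signal

model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
model.fit(X[:400], flow[:400])                       # train on the first part
print(model.score(X[400:], flow[400:]))              # R^2 on the held-out part
```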
NASA Astrophysics Data System (ADS)
Wills, John M.; Mattsson, Ann E.
2012-02-01
Density functional theory (DFT) provides a formally predictive base for equation of state properties. Available approximations to the exchange/correlation functional provide accurate predictions for many materials in the periodic table. For heavy materials, however, DFT calculations using available functionals fail to provide quantitative predictions, and often fail to be even qualitative. This deficiency is due both to the lack of the appropriate confinement physics in the exchange/correlation functional and to approximations used to evaluate the underlying equations. In order to assess and develop accurate functionals, it is essential to eliminate all other sources of error. In this talk we describe an efficient first-principles electronic structure method based on the Dirac equation and compare the results obtained with this method with other methods generally used. Implications for the high-pressure equation of state of relativistic materials are demonstrated in application to Ce and the light actinides. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
An evolutionary account of vigilance in grief
White, Claire; Fessler, Daniel M T
2018-01-01
Abstract Grief is characterized by a number of cardinal cognitive symptoms, including preoccupation with thoughts of the deceased and vigilance toward indications that the deceased is in the environment. Compared with emotional symptoms, little attention has been paid to the ultimate function of vigilance in grief. Drawing on signal-detection theory, we propose that the ultimate function of vigilance is to facilitate the reunification (where possible) with a viable relationship partner following separation. Preoccupation with thoughts about the missing person creates the cognitive conditions necessary to maintain a low baseline threshold for the detection of the agent—any information associated with the agent is highly salient, and attention is correspondingly readily deployed toward such cues. These patterns are adaptive in cases of an absent but living partner, but maladaptive in cases of the death of a partner. That they occur in the latter likely reflects the intersection of error-management considerations and the kludge-like configuration of the mind. We discuss results from two previous studies designed to test predictions concerning input conditions and individual differences based on this account, and consider the implications of these findings for mainstream bereavement theories and practices. PMID:29492265
NASA Astrophysics Data System (ADS)
Kästner, K.; Hoitink, A. J. F.; Torfs, P. J. J. F.; Vermeulen, B.; Ningsih, N. S.; Pramulya, M.
2018-02-01
River discharge has to be monitored reliably for effective water management. As river discharge cannot be measured directly, it is usually inferred from the water level. This practice is unreliable at places where the relation between water level and flow velocity is ambiguous. In such a case, the continuous measurement of the flow velocity can improve the discharge prediction. The emergence of horizontal acoustic Doppler current profilers (HADCPs) has made it possible to continuously measure the flow velocity. However, the profiling range of HADCPs is limited, so that a single instrument can only partially cover a wide cross section. The total discharge still has to be determined with a model. While the limitations of rating curves are well understood, there is not yet a comprehensive theory to assess the accuracy of discharge predicted from velocity measurements. Such a theory is necessary to discriminate which factors influence the measurements, and to improve instrument deployment as well as discharge prediction. This paper presents a generic method to assess the uncertainty of discharge predicted from range-limited velocity profiles. The theory shows that a major source of error is the variation of the ratio between the local and cross-section-averaged velocity. This variation is large near the banks, where HADCPs are usually deployed and can limit the advantage gained from the velocity measurement. We apply our theory at two gauging stations situated in the Kapuas River, Indonesia. We find that at one of the two stations the index velocity does not outperform a simple rating curve.
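The index-velocity approach the paper evaluates is conventionally written as follows; this is a hedged sketch in our own notation, and the paper's exact formulation may differ:

```latex
% Discharge Q from stage h and an HADCP index velocity u_index:
% wetted area times a regressed cross-section-averaged velocity.
Q = A(h)\,\bar{u}, \qquad \bar{u} = \alpha + \beta\,u_{\mathrm{index}}
```

The error source highlighted above is then the variability of the ratio between the locally profiled velocity and the cross-section average, which is largest near the banks where HADCPs are typically mounted.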
NASA Astrophysics Data System (ADS)
Shaw, Jeremy A.; Daescu, Dacian N.
2017-08-01
This article presents the mathematical framework to evaluate the sensitivity of a forecast error aspect to the input parameters of a weak-constraint four-dimensional variational data assimilation system (w4D-Var DAS), extending the established theory from strong-constraint 4D-Var. Emphasis is placed on the derivation of the equations for evaluating the forecast sensitivity to parameters in the DAS representation of the model error statistics, including bias, standard deviation, and correlation structure. A novel adjoint-based procedure for adaptive tuning of the specified model error covariance matrix is introduced. Results from numerical convergence tests establish the validity of the model error sensitivity equations. Preliminary experiments providing a proof-of-concept are performed using the Lorenz multi-scale model to illustrate the theoretical concepts and potential benefits for practical applications.
Ten years of preanalytical monitoring and control: Synthetic Balanced Score Card Indicator
López-Garrigós, Maite; Flores, Emilio; Santo-Quiles, Ana; Gutierrez, Mercedes; Lugo, Javier; Lillo, Rosa; Leiva-Salinas, Carlos
2015-01-01
Introduction Preanalytical control and monitoring continue to be an important issue for clinical laboratory professionals. The aim of the study was to evaluate a monitoring system for preanalytical errors involving samples unsuitable for analysis, based on different indicators; to compare such indicators across different phlebotomy centres; and finally to evaluate a single synthetic preanalytical indicator that may be included in the balanced scorecard management system (BSC). Materials and methods We collected individual and global preanalytical errors in haematology, coagulation, chemistry, and urine sample analysis. We also analyzed a synthetic indicator that represents the sum of all types of preanalytical errors, expressed as a sigma level. We studied the evolution of those indicators over time and compared indicator results using comparison of proportions and Chi-square tests. Results There was a decrease in the number of errors over the years (P < 0.001). This pattern was confirmed in primary care patients, inpatients and outpatients. In blood samples, the fewest errors occurred in outpatients, followed by inpatients. Conclusion We present a practical and effective methodology to monitor preanalytical errors due to unsuitable samples. The synthetic indicator summarizes overall preanalytical sample errors and can be used as part of the BSC management system. PMID:25672466
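One common convention for expressing an error proportion as a sigma level (an inverse-normal quantile plus the customary 1.5-sigma long-term shift) is sketched below; whether the authors applied this exact convention is an assumption on our part.

```python
# Hedged sketch: converting a preanalytical error proportion to a sigma level.
from scipy.stats import norm

def sigma_level(error_proportion, shift=1.5):
    """Sigma level for a given proportion of unsuitable (defective) samples."""
    return norm.ppf(1.0 - error_proportion) + shift

print(round(sigma_level(0.0024), 2))  # e.g. 0.24% rejected samples -> ~4.32
```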
Managing Errors to Reduce Accidents in High Consequence Networked Information Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ganter, J.H.
1999-02-01
Computers have always helped to amplify and propagate errors made by people. The emergence of Networked Information Systems (NISs), which allow people and systems to quickly interact worldwide, has made understanding and minimizing human error more critical. This paper applies concepts from system safety to analyze how hazards (from hackers to power disruptions) penetrate NIS defenses (e.g., firewalls and operating systems) to cause accidents. Such events usually result from both active, easily identified failures and more subtle latent conditions that have resided in the system for long periods. Both active failures and latent conditions result from human errors. We classify these into several types (slips, lapses, mistakes, etc.) and provide NIS examples of how they occur. Next we examine error minimization throughout the NIS lifecycle, from design through operation to reengineering. At each stage, steps can be taken to minimize the occurrence and effects of human errors. These include defensive design philosophies, architectural patterns to guide developers, and collaborative design that incorporates operational experiences and surprises into design efforts. We conclude by looking at three aspects of NISs that will cause continuing challenges in error and accident management: immaturity of the industry, limited risk perception, and resource tradeoffs.
Error correcting coding-theory for structured light illumination systems
NASA Astrophysics Data System (ADS)
Porras-Aguilar, Rosario; Falaggis, Konstantinos; Ramos-Garcia, Ruben
2017-06-01
Intensity discrete structured light illumination systems project a series of patterns to estimate the absolute fringe order using only the temporal grey-level sequence at each pixel. This work proposes the use of error-correcting codes for pixel-wise correction of measurement errors. The use of an error-correcting code is advantageous in many ways: it reduces the effect of random intensity noise, it corrects outliers near the border of the fringe commonly present when using intensity discrete patterns, and it provides robustness in case of severe measurement errors (even for burst errors where whole frames are lost). The latter aspect is particularly interesting in environments with varying ambient light as well as in safety-critical applications such as monitoring deformations of components in nuclear power plants, where high reliability is ensured even in the case of short measurement disruptions. A special form of burst error is the so-called salt-and-pepper noise, which can largely be removed with error-correcting codes using only the information of a given pixel. The performance of this technique is evaluated using both simulations and experiments.
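As a concrete, hedged illustration of pixel-wise correction (the abstract does not specify this particular code), a Hamming(7,4) decoder corrects any single flipped bit in a 7-bit temporal grey-level sequence observed at one pixel:

```python
# Hedged sketch: Hamming(7,4) single-bit correction of one pixel's bit sequence.
import numpy as np

# Parity-check matrix; column j encodes the binary value of position j+1,
# so the syndrome directly names the error position (0 = no error).
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def hamming74_correct(bits):
    """Return the 7-bit word with a single-bit error (if any) corrected."""
    syndrome = (H @ bits) % 2
    pos = int(syndrome[0] + 2 * syndrome[1] + 4 * syndrome[2])
    fixed = bits.copy()
    if pos:
        fixed[pos - 1] ^= 1     # flip the erroneous bit back
    return fixed

received = np.array([1, 0, 1, 1, 0, 1, 0])   # one grey-level bit misread
print(hamming74_correct(received))
```

Burst errors spanning several frames would need longer codes with interleaving, which is consistent with the abstract's remark about lost frames.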
Strategy of restraining ripple error on surface for optical fabrication.
Wang, Tan; Cheng, Haobo; Feng, Yunpeng; Tam, Honyuen
2014-09-10
The influence of the ripple error on high imaging quality is effectively reduced by restraining the ripple height. In this paper, a method based on the process parameters and the surface error distribution is designed to suppress the ripple height. The generating mechanism of the ripple error is analyzed using polishing theory with a uniform removal characteristic. The relation between the processing parameters (removal functions, pitch of path, and dwell time) and the ripple error is discussed through simulations. With these, a strategy for diminishing the error is presented. A final process is designed and demonstrated on K9 work-pieces using the optimized strategy with magnetorheological jet polishing. The form error on the surface is decreased from 0.216λ PV (λ=632.8 nm) and 0.039λ RMS to 0.03λ PV and 0.004λ RMS, and the ripple error is restrained well at the same time: the ripple height is less than 6 nm on the final surface. The results indicate that these strategies are suitable for high-precision optical manufacturing.
Useful measures and models for analytical quality management in medical laboratories.
Westgard, James O
2016-02-01
The 2014 Milan Conference "Defining analytical performance goals 15 years after the Stockholm Conference" initiated a new discussion of issues concerning goals for precision, trueness or bias, total analytical error (TAE), and measurement uncertainty (MU). Goal-setting models are critical for analytical quality management, along with error models, quality-assessment models, quality-planning models, as well as comprehensive models for quality management systems. There are also critical underlying issues, such as an emphasis on MU to the possible exclusion of TAE and a corresponding preference for separate precision and bias goals instead of a combined total error goal. This opinion recommends careful consideration of the differences in the concepts of accuracy and traceability and the appropriateness of different measures, particularly TAE as a measure of accuracy and MU as a measure of traceability. TAE is essential to manage quality within a medical laboratory and MU and trueness are essential to achieve comparability of results across laboratories. With this perspective, laboratory scientists can better understand the many measures and models needed for analytical quality management and assess their usefulness for practical applications in medical laboratories.
77 FR 59686 - Sunshine Act Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-28
... prevention of errors through robust system design, deployment, and operation. The second panel will focus on the responses to errors and malfunctions and managing crises in real-time. For further information...
ERIC Educational Resources Information Center
Waagen, Christopher L.
William Ouchi's Theory Z, a theory that focuses on the identification of both management and labor with the company's goals, emphasizes communication structures and styles. Ringi is a Japanese procedure for decision making in which all levels of management participate. In Ringi, a manager's task is to communicate. In quality control (Q-C) circles,…
NASA Technical Reports Server (NTRS)
Holmquist, R.
1978-01-01
The random evolutionary hits (REH) theory of evolutionary divergence, originally proposed in 1972, is restated with attention to certain aspects of the theory that have caused confusion. The theory assumes that natural selection and stochastic processes interact, and that natural selection restricts the codon sites that may fix mutations. The predicted total number of fixed nucleotide replacements agrees with data for cytochrome c, alpha-hemoglobin, beta-hemoglobin, and myoglobin. The restatement analyzes the magnitude of possible sources of error and simplifies the calculational methodology by supplying polynomial expressions to replace tables and graphs.
Wayde c. Morse; Troy E. Hall; Linda E. Kruger
2008-01-01
In this article, we examine how issues of scale affect the integration of recreation management with the management of other natural resources on public lands. We present two theories used to address scale issues in ecology and explore how they can improve the two most widely applied recreation-planning frameworks. The theory of patch dynamics and hierarchy theory are...
Awareness of technology-induced errors and processes for identifying and preventing such errors.
Bellwood, Paule; Borycki, Elizabeth M; Kushniruk, Andre W
2015-01-01
There is a need to determine whether organizations working with health information technology are aware of technology-induced errors and how they address and prevent them. The purpose of this study was to: a) determine the degree of technology-induced error awareness in various Canadian healthcare organizations, and b) identify the processes and procedures currently in place to help address, manage, and prevent technology-induced errors. We identified a lack of technology-induced error awareness among participants. Participants also reported a lack of well-defined procedures for reporting technology-induced errors, addressing them when they arise, and preventing them.
Development of a Dependency Theory Toolbox for Database Design.
1987-12-01
Theory for designing and studying relational databases exists in the form of published algorithms and theorems; however, hand-simulating these algorithms is a tedious and error-prone chore. Therefore, a toolbox of dependency-theory algorithms for database design was developed.
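The flavor of the algorithms such a toolbox mechanizes is easy to illustrate. A minimal sketch (not the toolbox's actual code) of attribute closure under a set of functional dependencies, the workhorse routine used to decide whether a dependency X → Y is implied by a given set:

```python
def closure(attrs, fds):
    """Closure of `attrs` under functional dependencies given as (lhs, rhs) pairs."""
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            if lhs <= result and not rhs <= result:
                result |= rhs            # fire the dependency
                changed = True
    return result

fds = [(frozenset("A"), frozenset("B")), (frozenset("B"), frozenset("C"))]
print(closure({"A"}, fds))                     # {'A', 'B', 'C'}
print(frozenset("C") <= closure({"A"}, fds))   # A -> C is implied: True
```

Hand-simulating even this loop on a realistic dependency set is exactly the tedious, error-prone chore the abstract describes.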
Research on the output bit error rate of 2DPSK signal based on stochastic resonance theory
NASA Astrophysics Data System (ADS)
Yan, Daqin; Wang, Fuzhong; Wang, Shuo
2017-12-01
Binary differential phase-shift keying (2DPSK) signals are mainly used for high-speed data transmission. However, the bit error rate of a digital receiver is high in hostile channel environments. In view of this, a novel method based on stochastic resonance (SR) is proposed, which aims to reduce the bit error rate of coherently demodulated 2DPSK signals. Following SR theory, a nonlinear receiver model is established for receiving 2DPSK signals at small signal-to-noise ratios (SNR, between -15 dB and 5 dB) and compared with the conventional demodulation method. The experimental results demonstrate that when the input SNR is in the range of -15 dB to 5 dB, the output bit error rate of the SR-based nonlinear model declines significantly compared to the conventional model; it is reduced by 86.15% when the input SNR equals -7 dB. Meanwhile, the peak value of the output signal spectrum is 4.25 times that of the conventional model. Consequently, the output signal of the system is more easily detected and the accuracy can be greatly improved.
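The mechanism can be sketched with a toy bistable receiver (all parameters assumed for illustration; this is not the paper's model). The received low-SNR samples drive dx/dt = ax − bx³ + u(t), integrated with Euler steps; near resonance, the channel noise itself helps the weak modulation push the state across the potential barrier, so the sign of x tracks the data.

```python
import numpy as np

def sr_receiver(received, a=1.0, b=1.0, dt=0.01):
    """Pass samples through a bistable (double-well) system."""
    x, out = 0.0, []
    for u in received:
        x += dt * (a * x - b * x ** 3 + u)
        out.append(x)
    return np.array(out)

rng = np.random.default_rng(1)
t = np.arange(0, 200, 0.01)
data = np.sign(np.sin(2 * np.pi * 0.05 * t))        # slow +/-1 bit stream
received = 0.3 * data + rng.normal(0, 9.0, t.size)  # deeply negative SNR
x = sr_receiver(received)
print(np.mean(np.sign(x) == data))         # sign of the well state vs. data
print(np.mean(np.sign(received) == data))  # raw thresholding, near chance
```

With parameters in the resonant regime, the well state agrees with the bit stream markedly more often than direct thresholding of the raw samples, which is the effect exploited for 2DPSK demodulation.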
Comparison of Optimal Design Methods in Inverse Problems
Banks, H. T.; Holm, Kathleen; Kappel, Franz
2011-01-01
Typical optimal design methods for inverse or parameter estimation problems choose optimal sampling distributions by minimizing a specific cost function related to the resulting error in parameter estimates. It is hoped that the inverse problem will produce parameter estimates with increased accuracy using data collected according to the optimal sampling distribution. Here we formulate the classical optimal design problem in the context of general optimization problems over distributions of sampling times. We present a new Prohorov metric based theoretical framework that permits one to treat succinctly and rigorously any optimal design criterion based on the Fisher Information Matrix (FIM). A fundamental approximation theory is also included in this framework. A new optimal design, SE-optimal design (standard error optimal design), is then introduced in the context of this framework. We compare this new design criterion with the more traditional D-optimal and E-optimal designs. The optimal sampling distributions from each design are used to compute and compare standard errors; the standard errors for parameters are computed using asymptotic theory or bootstrapping and the optimal mesh. We use three examples to illustrate the ideas: the Verhulst-Pearl logistic population model [13], the standard harmonic oscillator model [13], and a popular glucose regulation model [16, 19, 29]. PMID:21857762
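The computational core of any FIM-based criterion is small. A sketch (with assumed parameter values) for the Verhulst-Pearl logistic model: build the FIM from parameter sensitivities at the sampling times, then score a candidate design by det(FIM) (D-optimality) or by the standard errors from inv(FIM) (the quantity SE-optimal design targets).

```python
import numpy as np

def logistic(t, K, r, x0):
    """Verhulst-Pearl logistic growth curve."""
    return K / (1 + (K / x0 - 1) * np.exp(-r * t))

def fim(times, theta, sigma=1.0, h=1e-6):
    """FIM from central-difference sensitivities at the sampling times."""
    grads = []
    for t in times:
        row = []
        for i in range(len(theta)):
            hi = np.array(theta, float); lo = np.array(theta, float)
            hi[i] += h; lo[i] -= h
            row.append((logistic(t, *hi) - logistic(t, *lo)) / (2 * h))
        grads.append(row)
    G = np.array(grads)
    return G.T @ G / sigma ** 2

theta = (17.5, 0.7, 0.1)                    # K, r, x0 (assumed values)
design = np.linspace(0, 25, 10)             # candidate sampling times
F = fim(design, theta)
print(np.linalg.det(F))                     # D-optimality score
print(np.sqrt(np.diag(np.linalg.inv(F))))   # asymptotic standard errors
```

Comparing designs then amounts to recomputing these quantities as the sampling times are redistributed, which is what the Prohorov-metric framework makes rigorous for distributions of sampling times.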
Bright-Paul, Alexandra; Jarrold, Christopher; Wright, Daniel B; Guillaume, Stephanie
2012-01-01
This study examined whether recalling an event with a co-witness influences children's recall. Individual 3-5-year-olds (n = 48) watched a film with a co-witness. Unbeknown to participants, the co-witness was watching an alternative version of the film. Afterwards the co-witness and the participant answered questions about the film together (public recall), and the degree to which children conformed to the co-witness's alternative version of events was measured. Subsequently, participants were questioned again individually (private recall). Children also completed false belief and inhibitory control tasks. By separating errors made in public and in private, the results indicated that both social conformity (32% of errors) and memory distortion (68% of errors) played a role in co-witness influence. Inhibitory control predicted the likelihood of retracting errors in private, but only for children who failed (r = .66) rather than passed (r = -.10) false belief tasks. The results suggest that children with a theory of mind conform in the company of the co-witness to avoid social embarrassment, while those with a poor theory of mind conform because of an inability to inhibit the co-witness's response. The findings contribute to our understanding of the motivations responsible for co-witness conformity across early childhood.
Chiral extrapolation of the leading hadronic contribution to the muon anomalous magnetic moment
NASA Astrophysics Data System (ADS)
Golterman, Maarten; Maltman, Kim; Peris, Santiago
2017-04-01
A lattice computation of the leading-order hadronic contribution to the muon anomalous magnetic moment can potentially help reduce the error on the Standard Model prediction for this quantity, if sufficient control of all systematic errors affecting such a computation can be achieved. One of these systematic errors is that associated with the extrapolation to the physical pion mass from values on the lattice larger than the physical pion mass. We investigate this extrapolation assuming lattice pion masses in the range of 200 to 400 MeV with the help of two-loop chiral perturbation theory, and we find that such an extrapolation is unlikely to lead to control of this systematic error at the 1% level. This remains true even if various tricks to improve the reliability of the chiral extrapolation employed in the literature are taken into account. In addition, while chiral perturbation theory also predicts the dependence on the pion mass of the leading-order hadronic contribution to the muon anomalous magnetic moment as the chiral limit is approached, this prediction turns out to be of no practical use because the physical pion mass is larger than the muon mass that sets the scale for the onset of this behavior.
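The ansatz dependence at the heart of this problem is easy to see in a schematic extrapolation (all functional forms and numbers below are assumed for illustration, not lattice results): pseudo-data generated at 200-400 MeV pion masses are extrapolated to the physical point with two plausible fit forms, and the spread between the extrapolated values is precisely the systematic error in question.

```python
import numpy as np

msq = np.linspace(0.200, 0.400, 5) ** 2        # pion masses squared (GeV^2)
f = 1.0 + 2.0 * msq + 0.5 * msq * np.log(msq)  # assumed "truth" with a log

poly = np.polyfit(msq, f, 2)                   # ansatz 1: quadratic in m^2
A = np.column_stack([np.ones_like(msq), msq, msq * np.log(msq)])
logfit, *_ = np.linalg.lstsq(A, f, rcond=None) # ansatz 2: includes the log

mphys = 0.135 ** 2                             # physical point (GeV^2)
print(np.polyval(poly, mphys))                 # extrapolation, ansatz 1
print(np.array([1.0, mphys, mphys * np.log(mphys)]) @ logfit)  # ansatz 2
```

Both forms describe the 200-400 MeV points essentially equally well, yet they disagree at the physical pion mass; without data closer to the physical point there is no way to arbitrate between them, which is the paper's conclusion about percent-level control.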
A Support System for Error Correction Questions in Programming Education
ERIC Educational Resources Information Center
Hachisu, Yoshinari; Yoshida, Atsushi
2014-01-01
To support the teaching of debugging skills, we propose a system that generates error-correction questions about programs and checks the correctness of answers. The system generates HTML files for answering questions and CGI programs for checking answers, and learners read and answer the questions in a Web browser. For management of error injection, we have…
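The flavor of such error injection can be sketched in a few lines (hypothetical code, not the authors' system): mutate a correct snippet with a single seeded fault, show the mutant to the learner, and accept an answer only if it passes the checker.

```python
import random

CORRECT = "def mean(xs):\n    return sum(xs) / len(xs)\n"
MUTATIONS = [("/", "*"), ("sum", "max"), ("len(xs)", "len(xs) - 1")]

def make_question(src, seed=0):
    """Inject one seeded fault into a correct snippet."""
    old, new = random.Random(seed).choice(MUTATIONS)
    return src.replace(old, new, 1)

def check_answer(code):
    """Accept the learner's fix only if it passes the acceptance test."""
    ns = {}
    exec(code, ns)
    return ns["mean"]([1, 2, 3]) == 2.0

print(make_question(CORRECT))   # the buggy version shown to the learner
print(check_answer(CORRECT))    # True: the reference solution passes
```

Managing the injected errors (which fault, where, and how it was seeded) is what lets such a system check answers automatically rather than by hand.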
ERIC Educational Resources Information Center
Sass, D. A.; Schmitt, T. A.; Walker, C. M.
2008-01-01
Item response theory (IRT) procedures have been used extensively to study normal latent trait distributions and have been shown to perform well; however, less is known concerning the performance of IRT with non-normal latent trait distributions. This study investigated the degree of latent trait estimation error under normal and non-normal…
ERIC Educational Resources Information Center
Klinger, Don A.; Rogers, W. Todd
2003-01-01
The estimation accuracy of procedures based on classical test score theory and item response theory (generalized partial credit model) was compared for examinations consisting of multiple-choice and extended-response items. Analysis of British Columbia Scholarship Examination results found an error rate of about 10 percent for both methods, with…
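For readers unfamiliar with the IRT side of such comparisons, a minimal sketch (illustrative only; the study above used the generalized partial credit model, while the simpler two-parameter logistic model is shown here) of scoring an examinee by maximum likelihood:

```python
import numpy as np

def p_correct(theta, a, b):
    """2PL probability of a correct response."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def estimate_theta(responses, a, b, grid=np.linspace(-4, 4, 801)):
    """Grid-search maximum-likelihood ability estimate."""
    p = p_correct(grid[:, None], a, b)            # grid points x items
    loglik = (responses * np.log(p)
              + (1 - responses) * np.log(1 - p)).sum(axis=1)
    return grid[np.argmax(loglik)]

a = np.array([1.2, 0.8, 1.5, 1.0])   # item discriminations (assumed)
b = np.array([-1.0, 0.0, 0.5, 1.0])  # item difficulties (assumed)
print(estimate_theta(np.array([1, 1, 0, 0]), a, b))
```

Classical test score theory, by contrast, scores the same pattern as a simple (possibly weighted) sum, so the comparison comes down to which scoring model's errors behave better on mixed item formats.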
Error Monitoring in Speech Production: A Computational Test of the Perceptual Loop Theory.
ERIC Educational Resources Information Center
Hartsuiker, Robert J.; Kolk, Herman H. J.
2001-01-01
Tested whether an elaborated version of the perceptual loop theory (W. Levelt, 1983), together with the main interruption rule, was consistent with existing time course data (E. Blackmer and E. Mitton, 1991; C. Oomen and A. Postma, in press). The study suggests that including an inner loop through the speech comprehension system generates predictions that fit…
Recent advances in coding theory for near error-free communications
NASA Technical Reports Server (NTRS)
Cheung, K.-M.; Deutsch, L. J.; Dolinar, S. J.; Mceliece, R. J.; Pollara, F.; Shahshahani, M.; Swanson, L.
1991-01-01
Channel and source coding theories are discussed. The following subject areas are covered: large constraint length convolutional codes (the Galileo code); decoder design (the big Viterbi decoder); Voyager's and Galileo's data compression scheme; current research in data compression for images; neural networks for soft decoding; neural networks for source decoding; finite-state codes; and fractals for data compression.
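Of the topics listed, the Viterbi decoder is the easiest to miniaturize. A toy sketch (a rate-1/2, constraint-length-3 code with generators 7 and 5 octal; the hardware described above handled far longer constraint lengths) of hard-decision Viterbi decoding:

```python
G = [0b111, 0b101]                  # toy generator polynomials (7, 5 octal)

def encode(bits):
    """Rate-1/2 convolutional encoder."""
    state, out = 0, []
    for b in bits:
        state = ((state << 1) | b) & 0b111
        out += [bin(state & g).count("1") & 1 for g in G]
    return out

def viterbi(received):
    """Hard-decision Viterbi: keep the best path into each trellis state."""
    metrics = {0: (0, [])}          # state -> (path metric, decoded bits)
    for i in range(0, len(received), 2):
        r, nxt = received[i:i + 2], {}
        for state, (m, path) in metrics.items():
            for b in (0, 1):
                s = ((state << 1) | b) & 0b111
                expected = [bin(s & g).count("1") & 1 for g in G]
                cost = m + sum(x != y for x, y in zip(expected, r))
                if s not in nxt or cost < nxt[s][0]:
                    nxt[s] = (cost, path + [b])
        metrics = nxt
    return min(metrics.values())[1]  # survivor with the best metric

msg = [1, 0, 1, 1, 0, 0, 1, 0]
rx = encode(msg)
rx[3] ^= 1                           # one channel bit error
print(viterbi(rx) == msg)            # True: the error is corrected
```

The large-constraint-length codes discussed above push the same search to many thousands of trellis states, which is why a dedicated "big" decoder was worth building.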
ERIC Educational Resources Information Center
Gu, Fei; Skorupski, William P.; Hoyle, Larry; Kingston, Neal M.
2011-01-01
Ramsay-curve item response theory (RC-IRT) is a nonparametric procedure that estimates the latent trait using splines, and no distributional assumption about the latent trait is required. For item parameters of the two-parameter logistic (2-PL), three-parameter logistic (3-PL), and polytomous IRT models, RC-IRT can provide more accurate estimates…
ERIC Educational Resources Information Center
Abry, Tashia; Cash, Anne H.; Bradshaw, Catherine P.
2014-01-01
Generalizability theory (GT) offers a useful framework for estimating the reliability of a measure while accounting for multiple sources of error variance. The purpose of this study was to use GT to examine multiple sources of variance in and the reliability of school-level teacher and high school student behaviors as observed using the tool,…
The cost of misremembering: Inferring the loss function in visual working memory.
Sims, Chris R
2015-03-04
Visual working memory (VWM) is a highly limited storage system. A basic consequence of this fact is that visual memories cannot perfectly encode or represent the veridical structure of the world. However, in natural tasks, some memory errors might be more costly than others. This raises the intriguing possibility that the nature of memory error reflects the costs of committing different kinds of errors. Many existing theories assume that visual memories are noise-corrupted versions of afferent perceptual signals. However, this additive noise assumption oversimplifies the problem. Implicit in the behavioral phenomena of visual working memory is the concept of a loss function: a mathematical entity that describes the relative cost to the organism of making different types of memory errors. An optimally efficient memory system is one that minimizes the expected loss according to a particular loss function, while subject to a constraint on memory capacity. This paper describes a novel theoretical framework for characterizing visual working memory in terms of its implicit loss function. Using inverse decision theory, the empirical loss function is estimated from the results of a standard delayed recall visual memory experiment. These results are compared to the predicted behavior of a visual working memory system that is optimally efficient for a previously identified natural task, gaze correction following saccadic error. Finally, the approach is compared to alternative models of visual working memory, and shown to offer a superior account of the empirical data across a range of experimental datasets. © 2015 ARVO.
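The central object can be made concrete with a toy calculation (assumed forms, not the paper's fitted model): under a loss function L(e) = |e|^p over recall error e, the expected loss of an error distribution, and hence which distribution an efficient memory should prefer, depends strongly on the exponent p that inverse decision theory tries to recover.

```python
import numpy as np

def expected_loss(errors, p):
    """Mean loss under L(e) = |e|**p."""
    return np.mean(np.abs(errors) ** p)

rng = np.random.default_rng(0)
frequent_small = rng.normal(0, 1.0, 10_000)   # many modest errors
rare_large = rng.standard_t(2, 10_000)        # occasional huge errors

for p in (0.5, 1.0, 2.0):
    print(p, expected_loss(frequent_small, p), expected_loss(rare_large, p))
```

Small exponents weight the ubiquitous small errors most, while large exponents are dominated by rare gross errors; which regime best matches human recall data is exactly what the inverse-decision-theory fit estimates.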
Gauvin, Hanna S; De Baene, Wouter; Brass, Marcel; Hartsuiker, Robert J
2016-02-01
To minimize the number of errors in speech, and thereby facilitate communication, speech is monitored before articulation. It is, however, unclear at which level during speech production monitoring takes place, and what mechanisms are used to detect and correct errors. The present study investigated whether internal verbal monitoring takes place through the speech perception system, as proposed by perception-based theories of speech monitoring, or whether mechanisms independent of perception are applied, as proposed by production-based theories of speech monitoring. With the use of fMRI during a tongue twister task we observed that error detection in internal speech during noise-masked overt speech production and error detection in speech perception both recruit the same neural network, which includes pre-supplementary motor area (pre-SMA), dorsal anterior cingulate cortex (dACC), anterior insula (AI), and inferior frontal gyrus (IFG). Although production and perception recruit similar areas, as proposed by perception-based accounts, we did not find activation in superior temporal areas (which are typically associated with speech perception) during internal speech monitoring in speech production as hypothesized by these accounts. On the contrary, results are highly compatible with a domain general approach to speech monitoring, by which internal speech monitoring takes place through detection of conflict between response options, which is subsequently resolved by a domain general executive center (e.g., the ACC). Copyright © 2015 Elsevier Inc. All rights reserved.