Defining the Relationship Between Human Error Classes and Technology Intervention Strategies
NASA Technical Reports Server (NTRS)
Wiegmann, Douglas A.; Rantanen, Esa; Crisp, Vicki K. (Technical Monitor)
2002-01-01
One of the main factors in all aviation accidents is human error. The NASA Aviation Safety Program (AvSP), therefore, has identified several human-factors safety technologies to address this issue. Some technologies directly address human error either by attempting to reduce the occurrence of errors or by mitigating the negative consequences of errors. However, new technologies and system changes may also introduce new error opportunities or even induce different types of errors. Consequently, a thorough understanding of the relationship between error classes and technology "fixes" is crucial for the evaluation of intervention strategies outlined in the AvSP, so that resources can be effectively directed to maximize the benefit to flight safety. The purpose of the present project, therefore, was to examine the repositories of human factors data to identify the possible relationships between different error classes and technology intervention strategies. The first phase of the project, which is summarized here, involved the development of prototype data structures or matrices that map errors onto "fixes" (and vice versa), with the hope of facilitating the development of standards for evaluating safety products. Possible follow-on phases of this project are also discussed. These additional efforts include a thorough and detailed review of the literature to fill in the data matrix and the construction of a complete database and standards checklists.
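The error-to-fix mapping described above can be sketched as a simple two-way lookup structure. The error classes, intervention names, and effect labels below are invented placeholders for illustration, not the actual AvSP taxonomy:

```python
# Prototype error-class x intervention matrix (illustrative placeholders only).
# Rows: error classes; columns: technology "fixes"; entries: hypothesized effect
# ("reduces" occurrence, "mitigates" consequences, or "may-induce" new errors).
MATRIX = {
    "decision error":   {"cockpit alerting": "reduces", "automation": "may-induce"},
    "skill-based slip": {"cockpit alerting": "mitigates", "automation": "reduces"},
    "perceptual error": {"synthetic vision": "reduces"},
}

def fixes_for(error_class):
    """Map an error class onto candidate fixes and their hypothesized effects."""
    return MATRIX.get(error_class, {})

def errors_for(fix):
    """Inverse mapping: which error classes does a given fix touch, and how?"""
    return {err: effects[fix] for err, effects in MATRIX.items() if fix in effects}
```

A complete database would attach evidence citations to each cell, so that a standards checklist could be generated per intervention.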
Defining the Relationship Between Human Error Classes and Technology Intervention Strategies
NASA Technical Reports Server (NTRS)
Wiegmann, Douglas A.; Rantanen, Esa M.
2003-01-01
The modus operandi in addressing human error in aviation systems is predominantly that of technological interventions or fixes. Such interventions exhibit considerable variability both in terms of sophistication and application. Some technological interventions address human error directly while others do so only indirectly. Some attempt to eliminate the occurrence of errors altogether whereas others look to reduce the negative consequences of these errors. In any case, technological interventions add to the complexity of the systems and may interact with other system components in unforeseeable ways and often create opportunities for novel human errors. Consequently, there is a need to develop standards for evaluating the potential safety benefit of each of these intervention products so that resources can be effectively invested to produce the biggest benefit to flight safety as well as to mitigate any adverse ramifications. The purpose of this project was to help define the relationship between human error and technological interventions, with the ultimate goal of developing a set of standards for evaluating or measuring the potential benefits of new human error fixes.
The contributions of human factors on human error in Malaysia aviation maintenance industries
NASA Astrophysics Data System (ADS)
Padil, H.; Said, M. N.; Azizan, A.
2018-05-01
Aviation maintenance is a multitasking activity in which individuals perform varied tasks under constant pressure to meet deadlines as well as challenging work conditions. These situational characteristics, combined with human factors, can lead to various types of human-related errors. The primary objective of this research is to develop a structural relationship model that incorporates human factors, organizational factors, and their impact on human errors in aviation maintenance. Towards that end, a questionnaire was developed and administered to Malaysian aviation maintenance professionals. A Structural Equation Modelling (SEM) approach was used in this study, utilizing AMOS software. Results showed a significant relationship between human factors and human errors in the tested model. Human factors had a partial effect on organizational factors, while organizational factors had a direct and positive impact on human errors. It was also revealed that organizational factors contributed to human errors when coupled with the human factors construct. This study has contributed to the advancement of knowledge on human factors affecting safety, has provided guidelines for improving human factors performance in aviation maintenance activities, and can be used as a reference for improving safety performance in Malaysian aviation maintenance companies.
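As a rough illustration of the kind of path structure tested (not the study's actual model or data), the direct and mediated effects can be sketched with ordinary least squares on synthetic data; a full SEM package would add latent variables and fit indices:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Synthetic data consistent with the reported structure (illustrative only):
# human factors (HF) partially drive organizational factors (OF), and both
# feed into human errors (HE). The coefficients 0.5, 0.4, 0.6 are invented.
HF = rng.normal(size=n)
OF = 0.5 * HF + rng.normal(scale=0.8, size=n)
HE = 0.4 * HF + 0.6 * OF + rng.normal(scale=0.5, size=n)

def ols(y, *xs):
    """Least-squares path coefficients (intercept dropped)."""
    X = np.column_stack([np.ones(len(y)), *xs])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

(b_hf_of,) = ols(OF, HF)              # HF -> OF path (the "partial effect")
b_hf_he, b_of_he = ols(HE, HF, OF)    # direct HF -> HE and OF -> HE paths
indirect = b_hf_of * b_of_he          # effect of HF on HE mediated through OF
```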
Operational Interventions to Maintenance Error
NASA Technical Reports Server (NTRS)
Kanki, Barbara G.; Walter, Diane; Dulchinos, Vicki
1997-01-01
A significant proportion of aviation accidents and incidents are known to be tied to human error. However, research of flight operational errors has shown that so-called pilot error often involves a variety of human factors issues and not a simple lack of individual technical skills. In aircraft maintenance operations, there is similar concern that maintenance errors which may lead to incidents and accidents are related to a large variety of human factors issues. Although maintenance error data and research are limited, industry initiatives involving human factors training in maintenance have become increasingly accepted as one type of maintenance error intervention. Conscientious efforts have been made in re-inventing the "team" concept for maintenance operations and in tailoring programs to fit the needs of technical operations. Nevertheless, there remains a dual challenge: 1) to develop human factors interventions which are directly supported by reliable human error data, and 2) to integrate human factors concepts into the procedures and practices of everyday technical tasks. In this paper, we describe several varieties of human factors interventions and focus on two specific alternatives which target problems related to procedures and practices; namely, 1) structured on-the-job training and 2) procedure re-design. We hope to demonstrate that the key to leveraging the impact of these solutions comes from focused interventions; that is, interventions which are derived from a clear understanding of specific maintenance errors, their operational context and human factors components.
Reduction of Maintenance Error Through Focused Interventions
NASA Technical Reports Server (NTRS)
Kanki, Barbara G.; Walter, Diane; Rosekind, Mark R. (Technical Monitor)
1997-01-01
It is well known that a significant proportion of aviation accidents and incidents are tied to human error. In flight operations, research of operational errors has shown that so-called "pilot error" often involves a variety of human factors issues and not a simple lack of individual technical skills. In aircraft maintenance operations, there is similar concern that maintenance errors which may lead to incidents and accidents are related to a large variety of human factors issues. Although maintenance error data and research are limited, industry initiatives involving human factors training in maintenance have become increasingly accepted as one type of maintenance error intervention. Conscientious efforts have been made in re-inventing the "team" concept for maintenance operations and in tailoring programs to fit the needs of technical operations. Nevertheless, there remains a dual challenge: to develop human factors interventions which are directly supported by reliable human error data, and to integrate human factors concepts into the procedures and practices of everyday technical tasks. In this paper, we describe several varieties of human factors interventions and focus on two specific alternatives which target problems related to procedures and practices; namely, 1) structured on-the-job training and 2) procedure re-design. We hope to demonstrate that the key to leveraging the impact of these solutions comes from focused interventions; that is, interventions which are derived from a clear understanding of specific maintenance errors, their operational context and human factors components.
De Sá Teixeira, Nuno Alexandre
2014-12-01
Given its conspicuous nature, gravity has been acknowledged by several research lines as a prime factor in structuring the spatial perception of one's environment. One such line of enquiry has focused on errors in spatial localization aimed at the vanishing location of moving objects - it has been systematically reported that humans mislocalize spatial positions forward, in the direction of motion (representational momentum) and downward in the direction of gravity (representational gravity). Moreover, spatial localization errors were found to evolve dynamically with time in a pattern congruent with an anticipated trajectory (representational trajectory). The present study attempts to ascertain the degree to which vestibular information plays a role in these phenomena. Human observers performed a spatial localization task while tilted to varying degrees and referring to the vanishing locations of targets moving along several directions. A Fourier decomposition of the obtained spatial localization errors revealed that although spatial errors were increased "downward" mainly along the body's longitudinal axis (idiotropic dominance), the degree of misalignment between the latter and physical gravity modulated the time course of the localization responses. This pattern is surmised to reflect increased uncertainty about the internal model when faced with conflicting cues regarding the perceived "downward" direction.
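The Fourier decomposition applied to the localization errors can be illustrated with a toy version: errors measured across equally spaced motion directions are reduced to a mean plus a first harmonic whose amplitude and phase give the size and direction of the "downward" bias. The numbers below are invented for illustration:

```python
import math, cmath

# Toy localization errors for targets moving in 8 equally spaced directions,
# with an invented downward bias of 1.0 along the -y axis plus a 0.2 offset.
directions = [2 * math.pi * k / 8 for k in range(8)]
bias_axis = -math.pi / 2                     # "down" along the body axis
errors = [0.2 + 1.0 * math.cos(d - bias_axis) for d in directions]

# First Fourier harmonic: its amplitude recovers the bias size and its
# phase recovers the bias direction; the zeroth term is the mean error.
c1 = sum(e * cmath.exp(1j * d) for e, d in zip(errors, directions)) / len(errors)
amplitude = 2 * abs(c1)
phase = cmath.phase(c1)
mean_error = sum(errors) / len(errors)
```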
Garbe, James C.; Vrba, Lukas; Sputova, Klara; ...
2014-10-29
Telomerase reactivation and immortalization are critical for human carcinoma progression. However, little is known about the mechanisms controlling this crucial step, due in part to the paucity of experimentally tractable model systems that can examine human epithelial cell immortalization as it might occur in vivo. We achieved efficient non-clonal immortalization of normal human mammary epithelial cells (HMEC) by directly targeting the 2 main senescence barriers encountered by cultured HMEC. The stress-associated stasis barrier was bypassed using shRNA to p16INK4; replicative senescence due to critically shortened telomeres was bypassed in post-stasis HMEC by c-MYC transduction. Thus, 2 pathologically relevant oncogenic agents are sufficient to immortally transform normal HMEC. The resultant non-clonal immortalized lines exhibited normal karyotypes. Most human carcinomas contain genomically unstable cells, with widespread instability first observed in vivo in pre-malignant stages; in vitro, instability is seen as finite cells with critically shortened telomeres approach replicative senescence. Our results support our hypotheses that: (1) telomere-dysfunction induced genomic instability in pre-malignant finite cells may generate the errors required for telomerase reactivation and immortalization, as well as many additional “passenger” errors carried forward into resulting carcinomas; (2) genomic instability during cancer progression is needed to generate errors that overcome tumor suppressive barriers, but not required per se; bypassing the senescence barriers by direct targeting eliminated a need for genomic errors to generate immortalization. Achieving efficient HMEC immortalization, in the absence of “passenger” genomic errors, should facilitate examination of telomerase regulation during human carcinoma progression, and exploration of agents that could prevent immortalization.
A stochastic dynamic model for human error analysis in nuclear power plants
NASA Astrophysics Data System (ADS)
Delgado-Loperena, Dharma
Nuclear disasters like Three Mile Island and Chernobyl indicate that human performance is a critical safety issue, sending a clear message about the need to include environmental press and competence aspects in research. This investigation was undertaken to serve as a roadmap for studying human behavior through the formulation of a general solution equation. The theoretical model integrates models from two heretofore-disassociated disciplines (behavior specialists and technical specialists) that have historically studied the nature of error and human behavior independently; it incorporates concepts derived from fractal and chaos theory and suggests re-evaluation of base theory regarding human error. The results of this research were based on comprehensive analysis of patterns of error, with the omnipresent underlying structure of chaotic systems. The study of patterns led to a dynamic formulation that can serve as a basis for other formulas used to study the consequences of human error. The literature search regarding error yielded insight into the need to include concepts rooted in chaos theory and strange attractors, heretofore unconsidered by mainstream researchers who investigated human error in nuclear power plants or who employed the ecological model in their work. The study of patterns obtained from a steam generator tube rupture (SGTR) event simulation provided a direct application to aspects of control room operations in nuclear power plants. From this, a conceptual foundation based on understanding the patterns of human error can be gleaned, helping to reduce and prevent undesirable events.
Human Factors Directions for Civil Aviation
NASA Technical Reports Server (NTRS)
Hart, Sandra G.
2002-01-01
Despite considerable progress in understanding human capabilities and limitations, incorporating human factors into aircraft design, operation, and certification, and the emergence of new technologies designed to reduce workload and enhance human performance in the system, most aviation accidents still involve human errors. Such errors occur as a direct or indirect result of untimely, inappropriate, or erroneous actions (or inactions) by apparently well-trained and experienced pilots, controllers, and maintainers. The field of human factors has solved many of the more tractable problems related to simple ergonomics, cockpit layout, symbology, and so on. We have learned much about the relationships between people and machines, but know less about how to form successful partnerships between humans and the information technologies that are beginning to play a central role in aviation. Significant changes envisioned in the structure of the airspace, pilots and controllers' roles and responsibilities, and air/ground technologies will require a similarly significant investment in human factors during the next few decades to ensure the effective integration of pilots, controllers, dispatchers, and maintainers into the new system. Many of the topics that will be addressed are not new because progress in crucial areas, such as eliminating human error, has been slow. A multidisciplinary approach that capitalizes upon human studies and new classes of information, computational models, intelligent analytical tools, and close collaborations with organizations that build, operate, and regulate aviation technology will ensure that the field of human factors meets the challenge.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lon N. Haney; David I. Gertman
2003-04-01
Beginning in the 1980s a primary focus of human reliability analysis was estimation of human error probabilities. However, detailed qualitative modeling with comprehensive representation of contextual variables often was lacking. This was likely due to the lack of comprehensive error and performance shaping factor taxonomies, and the limited data available on observed error rates and their relationship to specific contextual variables. In the mid 90s Boeing, America West Airlines, NASA Ames Research Center and INEEL partnered in a NASA sponsored Advanced Concepts grant to: assess the state of the art in human error analysis, identify future needs for human error analysis, and develop an approach addressing these needs. Identified needs included the need for a method to identify and prioritize task and contextual characteristics affecting human reliability. Other needs identified included developing comprehensive taxonomies to support detailed qualitative modeling and to structure meaningful data collection efforts across domains. A result was the development of the FRamework Assessing Notorious Contributing Influences for Error (FRANCIE) with a taxonomy for airline maintenance tasks. The assignment of performance shaping factors to generic errors by experts proved to be valuable to qualitative modeling. Performance shaping factors and error types from such detailed approaches can be used to structure error reporting schemes. In a recent NASA Advanced Human Support Technology grant FRANCIE was refined, and two new taxonomies for use on space missions were developed. The development, sharing, and use of error taxonomies, and the refinement of approaches for increased fidelity of qualitative modeling is offered as a means to help direct useful data collection strategies.
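The kind of structure a FRANCIE-style taxonomy provides can be sketched as follows; the error types and performance shaping factors (PSFs) below are invented placeholders, not the actual FRANCIE content:

```python
# Illustrative FRANCIE-style structure: generic maintenance error types with
# expert-assigned performance shaping factors (entries are invented).
TAXONOMY = {
    "omitted step":         {"psfs": ["time pressure", "interruption", "poor procedure"]},
    "wrong part installed": {"psfs": ["lighting", "part similarity", "fatigue"]},
    "incomplete torque":    {"psfs": ["tool access", "time pressure"]},
}

def report_fields(error_type):
    """Structure an error report: the PSF list doubles as a reporting checklist."""
    return {"error_type": error_type, "psfs_to_rate": TAXONOMY[error_type]["psfs"]}

def errors_sharing_psf(psf):
    """Group error types by a shared PSF, e.g. to prioritize data collection."""
    return sorted(e for e, v in TAXONOMY.items() if psf in v["psfs"])
```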
Context-dependent sequential effects of target selection for action.
Moher, Jeff; Song, Joo-Hyun
2013-07-11
Humans exhibit variation in behavior from moment to moment even when performing a simple, repetitive task. Errors are typically followed by cautious responses, minimizing subsequent distractor interference. However, less is known about how variation in the execution of an ultimately correct response affects subsequent behavior. We asked participants to reach toward a uniquely colored target presented among distractors and created two categories to describe participants' responses in correct trials based on analyses of movement trajectories; partial errors referred to trials in which observers initially selected a nontarget for action before redirecting the movement and accurately pointing to the target, and direct movements referred to trials in which the target was directly selected for action. We found that latency to initiate a hand movement was shorter in trials following partial errors compared to trials following direct movements. Furthermore, when the target and distractor colors were repeated, movement time and reach movement curvature toward distractors were greater following partial errors compared to direct movements. Finally, when the colors were repeated, partial errors were more frequent than direct movements following partial-error trials, and direct movements were more frequent following direct-movement trials. The dependence of these latter effects on repeated-task context indicates the involvement of higher-level cognitive mechanisms in an integrated attention-action system in which execution of a partial-error or direct-movement response affects memory representations that bias performance in subsequent trials. Altogether, these results demonstrate that whether a nontarget is selected for action or not has a measurable impact on subsequent behavior.
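A toy version of the trajectory-based categorization (a sketch under assumed geometry, not the authors' actual criterion) classifies a correct reach as a partial error when its initial heading points closer to the distractor than to the target:

```python
import math

def heading(p, q):
    """Direction of travel from point p to point q, in radians."""
    return math.atan2(q[1] - p[1], q[0] - p[0])

def classify(trajectory, target, distractor):
    """Label a correct reach by its first recorded movement segment."""
    start, early = trajectory[0], trajectory[1]
    h = heading(start, early)
    to_target = abs(math.remainder(h - heading(start, target), 2 * math.pi))
    to_distr = abs(math.remainder(h - heading(start, distractor), 2 * math.pi))
    return "partial-error" if to_distr < to_target else "direct"

# Invented geometry: target up-right, distractor up-left of the start point.
target, distractor = (1.0, 1.0), (-1.0, 1.0)
direct_trial = classify([(0, 0), (0.1, 0.1)], target, distractor)
partial_trial = classify([(0, 0), (-0.1, 0.1)], target, distractor)
```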
Spatial durbin error model for human development index in Province of Central Java.
NASA Astrophysics Data System (ADS)
Septiawan, A. R.; Handajani, S. S.; Martini, T. S.
2018-05-01
The Human Development Index (HDI) is an indicator used to measure success in building the quality of human life, explaining how people access development outcomes in terms of income, health, and education. Every year, HDI in Central Java has improved. In 2016, HDI in Central Java was 69.98, an increase of 0.49 over the previous year. The objective of this study was to apply the spatial Durbin error model, using queen contiguity spatial weights, to measure HDI in Central Java Province. The spatial Durbin error model is used because it accounts for both spatially dependent errors and spatial dependency in the independent variables. The factors used are life expectancy, mean years of schooling, expected years of schooling, and purchasing power parity. Based on the results of the research, we obtain a spatial Durbin error model for HDI in Central Java in which the influencing factors are life expectancy, mean years of schooling, expected years of schooling, and purchasing power parity.
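The queen contiguity weighting used here can be sketched for a toy grid of regions (the real analysis would use the adjacency of Central Java's districts). The spatial Durbin error model then takes the form y = Xβ + WXθ + u with u = λWu + ε, where W is this weight matrix:

```python
# Row-standardized queen-contiguity weight matrix for a toy rows x cols grid;
# two cells are neighbors if they touch at an edge or a corner.
def queen_weights(rows, cols):
    n = rows * cols
    W = [[0.0] * n for _ in range(n)]
    for r in range(rows):
        for c in range(cols):
            i = r * cols + c
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    if (dr, dc) == (0, 0):
                        continue
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < rows and 0 <= cc < cols:
                        W[i][rr * cols + cc] = 1.0
    # Row-standardize, as is conventional for spatial error/lag models.
    for row in W:
        s = sum(row)
        if s:
            row[:] = [w / s for w in row]
    return W

W = queen_weights(3, 3)   # 9 toy regions on a 3x3 grid
```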
Causal Evidence from Humans for the Role of Mediodorsal Nucleus of the Thalamus in Working Memory.
Peräkylä, Jari; Sun, Lihua; Lehtimäki, Kai; Peltola, Jukka; Öhman, Juha; Möttönen, Timo; Ogawa, Keith H; Hartikainen, Kaisa M
2017-12-01
The mediodorsal nucleus of the thalamus (MD), with its extensive connections to the lateral pFC, has been implicated in human working memory and executive functions. However, this understanding is based solely on indirect evidence from human lesion and imaging studies and animal studies. Direct, causal evidence from humans is missing. To obtain direct evidence for MD's role in humans, we studied patients treated with deep brain stimulation (DBS) for refractory epilepsy. This treatment is thought to prevent the generalization of a seizure by disrupting the functioning of the patient's anterior nuclei of the thalamus (ANT) with high-frequency electric stimulation. This structure is located superior and anterior to MD, and when the DBS lead is implanted in ANT, tip contacts of the lead typically penetrate through ANT into the adjoining MD. To study the role of MD in human executive functions and working memory, we periodically disrupted and recovered MD's function with high-frequency electric stimulation using DBS contacts reaching MD while participants performed a cognitive task engaging several aspects of executive functions. We hypothesized that the efficacy of executive functions, specifically working memory, is impaired when the functioning of MD is perturbed by high-frequency stimulation. Eight participants treated with ANT-DBS for refractory epilepsy performed a computer-based test of executive functions while DBS was repeatedly switched ON and OFF at MD and at the control location (ANT). In comparison to stimulation of the control location, when MD was stimulated, participants committed 2.26 times more errors in general (total errors; OR = 2.26, 95% CI [1.69, 3.01]) and 2.88 times more working memory-related errors specifically (incorrect button presses; OR = 2.88, CI [1.95, 4.24]).
Similarly, participants committed 1.81 times more errors in general (OR = 1.81, CI [1.45, 2.24]) and 2.08 times more working memory-related errors (OR = 2.08, CI [1.57, 2.75]) in comparison to the no-stimulation condition. "Total errors" is a composite score consisting of basic error types and was mostly driven by working memory-related errors. The facts that MD and the control location, ANT, are only a few millimeters apart and that their stimulation produces very different results highlight the location-specific effect of DBS rather than a regionally unspecific general effect. In conclusion, disrupting and recovering MD's function with high-frequency electric stimulation modulated participants' online working memory performance, providing causal, in vivo evidence from humans for the role of MD in human working memory.
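The reported effect sizes are odds ratios. A minimal version of the computation, with a Wald confidence interval and invented counts (not the study's data), looks like this:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% Wald CI from a 2x2 table:
    a, b = errors / correct responses in one condition;
    c, d = errors / correct responses in the other."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Invented counts for illustration only.
or_, lo, hi = odds_ratio_ci(60, 140, 30, 170)
```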
The effect of saccade metrics on the corollary discharge contribution to perceived eye location
Bansal, Sonia; Jayet Bray, Laurence C.; Peterson, Matthew S.
2015-01-01
Corollary discharge (CD) is hypothesized to provide the movement information (direction and amplitude) required to compensate for the saccade-induced disruptions to visual input. Here, we investigated to what extent these conveyed metrics influence perceptual stability in human subjects with a target-displacement detection task. Subjects made saccades to targets located at different amplitudes (4°, 6°, or 8°) and directions (horizontal or vertical). During the saccade, the target disappeared and then reappeared at a shifted location either in the same direction as or opposite to the movement vector. Subjects reported the target displacement direction, and from these reports we determined the perceptual threshold for shift detection and the estimate of target location. Our results indicate that the thresholds for all amplitudes and directions generally scaled with saccade amplitude. Additionally, subjects on average produced hypometric saccades with an estimated CD gain <1. Finally, we examined the contribution of different error signals to perceptual performance: the saccade error (movement-to-movement variability in saccade amplitude) and the visual error (distance between the fovea and the shifted target location). Perceptual judgment was not influenced by the fluctuations in movement amplitude, and performance was largely the same across movement directions for different magnitudes of visual error. Importantly, subjects reported the correct direction of target displacement above chance level for very small visual errors (<0.75°), even when these errors were opposite the target-shift direction. Collectively, these results suggest that the CD-based compensatory mechanisms for visual disruptions are highly accurate and comparable for saccades with different metrics.
Action errors, error management, and learning in organizations.
Frese, Michael; Keith, Nina
2015-01-03
Every organization is confronted with errors. Most errors are corrected easily, but some may lead to negative consequences. Organizations often focus on error prevention as a single strategy for dealing with errors. Our review suggests that error prevention needs to be supplemented by error management, an approach directed at effectively dealing with errors after they have occurred, with the goal of minimizing negative and maximizing positive error consequences (examples of the latter are learning and innovations). After defining errors and related concepts, we review research on error-related processes affected by error management (error detection, damage control). Empirical evidence on positive effects of error management in individuals and organizations is then discussed, along with emotional, motivational, cognitive, and behavioral pathways of these effects. Learning from errors is central, but like other positive consequences, learning occurs under certain circumstances, one being the development of a mind-set of acceptance of human error.
Kuselman, Ilya; Pennecchi, Francesca; Epstein, Malka; Fajgelj, Ales; Ellison, Stephen L R
2014-12-01
Monte Carlo simulation of expert judgments on human errors in a chemical analysis was used for determination of distributions of the error quantification scores (scores of likelihood and severity, and scores of effectiveness of a laboratory quality system in prevention of the errors). The simulation was based on modeling of an expert behavior: confident, reasonably doubting and irresolute expert judgments were taken into account by means of different probability mass functions (pmfs). As a case study, 36 scenarios of human errors which may occur in elemental analysis of geological samples by ICP-MS were examined. Characteristics of the score distributions for three pmfs of an expert behavior were compared. Variability of the scores, as standard deviation of the simulated score values from the distribution mean, was used for assessment of the score robustness. A range of the score values, calculated directly from elicited data and simulated by a Monte Carlo method for different pmfs, was also discussed from the robustness point of view. It was shown that robustness of the scores, obtained in the case study, can be assessed as satisfactory for the quality risk management and improvement of a laboratory quality system against human errors.
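The simulation design can be sketched as follows; the three probability mass functions below are invented stand-ins for the confident, reasonably doubting, and irresolute expert behaviors:

```python
import random, statistics

# Invented pmfs over a 1-5 likelihood scale modeling three expert behaviors.
PMFS = {
    "confident":  [0.00, 0.05, 0.05, 0.10, 0.80],
    "doubting":   [0.05, 0.10, 0.20, 0.35, 0.30],
    "irresolute": [0.20, 0.20, 0.20, 0.20, 0.20],
}

def simulate(pmf, n=20000, seed=1):
    """Draw n scores from a pmf; return mean and standard deviation,
    the latter serving as the robustness measure in the abstract."""
    rng = random.Random(seed)
    draws = rng.choices([1, 2, 3, 4, 5], weights=pmf, k=n)
    return statistics.mean(draws), statistics.stdev(draws)

results = {name: simulate(pmf) for name, pmf in PMFS.items()}
```

As expected, score variability grows as the modeled expert becomes less decisive.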
Skills, rules and knowledge in aircraft maintenance: errors in context
NASA Technical Reports Server (NTRS)
Hobbs, Alan; Williamson, Ann
2002-01-01
Automatic or skill-based behaviour is generally considered to be less prone to error than behaviour directed by conscious control. However, researchers who have applied Rasmussen's skill-rule-knowledge human error framework to accidents and incidents have sometimes found that skill-based errors appear in significant numbers. It is proposed that this is largely a reflection of the opportunities for error which workplaces present and does not indicate that skill-based behaviour is intrinsically unreliable. In the current study, 99 errors reported by 72 aircraft mechanics were examined in the light of a task analysis based on observations of the work of 25 aircraft mechanics. The task analysis identified the opportunities for error presented at various stages of maintenance work packages and by the job as a whole. Once the frequency of each error type was normalized in terms of the opportunities for error, it became apparent that skill-based performance is more reliable than rule-based performance, which is in turn more reliable than knowledge-based performance. The results reinforce the belief that industrial safety interventions designed to reduce errors would best be directed at those aspects of jobs that involve rule- and knowledge-based performance.
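The normalization step is simple but changes the ranking; with invented counts and opportunity figures (not the study's data), skill-based performance that produces the most raw errors can still have the lowest error rate:

```python
# Raw error counts per Rasmussen performance level (invented for illustration).
reported = {"skill": 40, "rule": 35, "knowledge": 24}
# Opportunities for error at each level, e.g. from a task analysis (invented).
opportunities = {"skill": 4000, "rule": 700, "knowledge": 120}

# Normalized error rate = errors per opportunity.
rates = {level: reported[level] / opportunities[level] for level in reported}
ranking = sorted(rates, key=rates.get)   # most reliable (lowest rate) first
```

With these figures the ordering matches the abstract's conclusion: skill-based is most reliable, then rule-based, then knowledge-based.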
Nguyen, Hung P.; Dingwell, Jonathan B.
2012-01-01
Determining how the human nervous system contends with neuro-motor noise is vital to understanding how humans achieve accurate goal-directed movements. Experimentally, people learning skilled tasks tend to reduce variability in distal joint movements more than in proximal joint movements. This suggests that they might be imposing greater control over distal joints than proximal joints. However, the reasons for this remain unclear, largely because it is not experimentally possible to directly manipulate either the noise or the control at each joint independently. Therefore, this study used a 2 degree-of-freedom torque driven arm model to determine how different combinations of noise and/or control independently applied at each joint affected the reaching accuracy and the total work required to make the movement. Signal-dependent noise was simultaneously and independently added to the shoulder and elbow torques to induce endpoint errors during planar reaching. Feedback control was then applied, independently and jointly, at each joint to reduce endpoint error due to the added neuromuscular noise. Movement direction and the inertia distribution along the arm were varied to quantify how these biomechanical variations affected the system performance. Endpoint error and total net work were computed as dependent measures. When each joint was independently subjected to noise in the absence of control, endpoint errors were more sensitive to distal (elbow) noise than to proximal (shoulder) noise for nearly all combinations of reaching direction and inertia ratio. The effects of distal noise on endpoint errors were more pronounced when inertia was distributed more toward the forearm. In contrast, the total net work decreased as mass was shifted to the upper arm for reaching movements in all directions. 
When noise was present at both joints and joint control was implemented, controlling the distal joint alone reduced endpoint errors more than controlling the proximal joint alone for nearly all combinations of reaching direction and inertia ratio. Applying control only at the distal joint was more effective at reducing endpoint errors when more of the mass was more proximally distributed. Likewise, controlling the distal joint alone required less total net work than controlling the proximal joint alone for nearly all combinations of reaching distance and inertia ratio. It is more efficient to reduce endpoint error and energetic cost by selectively applying control to reduce variability in the distal joint than the proximal joint. The reasons for this arise from the biomechanical configuration of the arm itself.
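A static, geometric caricature of the inertia argument (the study itself used a torque-driven dynamic model; the link lengths and inertias below are invented) shows why equal torque noise can hurt more at the elbow: the forearm's smaller inertia converts the same torque into a larger angle change than the shoulder's longer lever arm can offset:

```python
import math

L1, L2 = 0.30, 0.33        # upper arm, forearm lengths (m) - illustrative
I_SH, I_EL = 0.30, 0.06    # effective joint inertias (kg m^2) - illustrative

def endpoint(q1, q2):
    """Planar 2-link forward kinematics."""
    return (L1 * math.cos(q1) + L2 * math.cos(q1 + q2),
            L1 * math.sin(q1) + L2 * math.sin(q1 + q2))

def endpoint_shift(dq1, dq2, q1=0.4, q2=1.2):
    """Endpoint displacement caused by small joint-angle perturbations."""
    x0, y0 = endpoint(q1, q2)
    x1, y1 = endpoint(q1 + dq1, q2 + dq2)
    return math.hypot(x1 - x0, y1 - y0)

tau = 0.01                               # same-size torque noise at either joint
shoulder_err = endpoint_shift(tau / I_SH, 0.0)   # angle change ~ torque / inertia
elbow_err = endpoint_shift(0.0, tau / I_EL)
```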
Nguyen, Hung P; Dingwell, Jonathan B
2012-06-01
Determining how the human nervous system contends with neuro-motor noise is vital to understanding how humans achieve accurate goal-directed movements. Experimentally, people learning skilled tasks tend to reduce variability in distal joint movements more than in proximal joint movements. This suggests that they might be imposing greater control over distal joints than proximal joints. However, the reasons for this remain unclear, largely because it is not experimentally possible to directly manipulate either the noise or the control at each joint independently. Therefore, this study used a 2 degree-of-freedom torque driven arm model to determine how different combinations of noise and/or control independently applied at each joint affected the reaching accuracy and the total work required to make the movement. Signal-dependent noise was simultaneously and independently added to the shoulder and elbow torques to induce endpoint errors during planar reaching. Feedback control was then applied, independently and jointly, at each joint to reduce endpoint error due to the added neuromuscular noise. Movement direction and the inertia distribution along the arm were varied to quantify how these biomechanical variations affected the system performance. Endpoint error and total net work were computed as dependent measures. When each joint was independently subjected to noise in the absence of control, endpoint errors were more sensitive to distal (elbow) noise than to proximal (shoulder) noise for nearly all combinations of reaching direction and inertia ratio. The effects of distal noise on endpoint errors were more pronounced when inertia was distributed more toward the forearm. In contrast, the total net work decreased as mass was shifted to the upper arm for reaching movements in all directions. 
When noise was present at both joints and joint control was implemented, controlling the distal joint alone reduced endpoint errors more than controlling the proximal joint alone for nearly all combinations of reaching direction and inertia ratio. Applying control only at the distal joint was more effective at reducing endpoint errors when more of the mass was more proximally distributed. Likewise, controlling the distal joint alone required less total net work than controlling the proximal joint alone for nearly all combinations of reaching distance and inertia ratio. It is more efficient to reduce endpoint error and energetic cost by selectively applying control to reduce variability in the distal joint than the proximal joint. The reasons for this arise from the biomechanical configuration of the arm itself.
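The signal-dependent noise described in this study can be sketched as multiplicative Gaussian noise whose standard deviation scales with the magnitude of the commanded joint torque. The gain and torque values below are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def add_signal_dependent_noise(torques, noise_gain=0.1, rng=rng):
    """Add signal-dependent (multiplicative) noise: the noise standard
    deviation scales with the magnitude of the commanded torque."""
    torques = np.asarray(torques, dtype=float)
    noise = rng.normal(0.0, noise_gain * np.abs(torques))
    return torques + noise

# Hypothetical commanded shoulder and elbow torques (N*m) during a reach.
commanded = np.array([8.0, 2.0])
noisy = add_signal_dependent_noise(commanded)

# Larger commanded torques receive proportionally larger perturbations,
# so expected deviations grow with signal magnitude.
```

Because the perturbation scales with the signal, joints that must produce larger torques inject more noise into the endpoint, which is one reason joint-specific control allocation matters.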
Mapping the Origins of Time: Scalar Errors in Infant Time Estimation
ERIC Educational Resources Information Center
Addyman, Caspar; Rocha, Sinead; Mareschal, Denis
2014-01-01
Time is central to any understanding of the world. In adults, estimation errors grow linearly with the length of the interval, much faster than would be expected of a clock-like mechanism. Here we present the first direct demonstration that this is also true in human infants. Using an eye-tracking paradigm, we examined 4-, 6-, 10-, and…
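The scalar property described above (estimation errors growing linearly with interval length, rather than with its square root as a clock-like mechanism would predict) can be illustrated with a simple simulation. The Weber fraction of 0.15 and the interval durations are assumed for illustration only:

```python
import numpy as np

rng = np.random.default_rng(1)

def scalar_estimates(interval, weber_fraction=0.15, n=10000, rng=rng):
    """Simulate interval estimates whose standard deviation grows
    linearly with the interval (the scalar property)."""
    return rng.normal(interval, weber_fraction * interval, size=n)

short = scalar_estimates(3.0)   # 3 s interval
long_ = scalar_estimates(6.0)   # 6 s interval
# Doubling the interval roughly doubles the spread of the estimates,
# so the coefficient of variation stays approximately constant.
```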
Ishihara, Hisashi; Ota, Nobuyuki; Asada, Minoru
2017-11-27
It is quite difficult for android robots to replicate the numerous and various types of human facial expressions owing to limitations in terms of space, mechanisms, and materials. This situation could be improved with greater knowledge regarding these expressions and their deformation rules, i.e. by using the biomimetic approach. In a previous study, we investigated 16 facial deformation patterns and found that each facial point moves almost only in its own principal direction and different deformation patterns are created with different combinations of moving lengths. However, the replication errors caused by moving each control point of a face in only their principal direction were not evaluated for each deformation pattern at that time. Therefore, we calculated the replication errors in this study using the second principal component scores of the 16 sets of flow vectors at each point on the face. More than 60% of the errors were within 1 mm, and approximately 90% of them were within 3 mm. The average error was 1.1 mm. These results indicate that robots can replicate the 16 investigated facial expressions with errors within 3 mm and 1 mm for about 90% and 60% of the vectors, respectively, even if each point on the robot face moves in only its own principal direction. This finding seems promising for the development of robots capable of showing various facial expressions because significantly fewer types of movements than previously predicted are necessary.
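The replication-error computation described above (projecting each facial flow vector onto its point's principal direction and measuring the residual) can be sketched as follows. The flow data and principal direction here are synthetic stand-ins, not the study's measurements:

```python
import numpy as np

# Illustrative stand-in data: 16 deformation patterns x 2 (x, y) motion
# components for a single facial control point (values are invented).
rng = np.random.default_rng(2)
principal_dir = np.array([0.9, 0.1])
principal_dir /= np.linalg.norm(principal_dir)
lengths = rng.uniform(-5, 5, size=16)            # moving length per pattern
flows = np.outer(lengths, principal_dir) + rng.normal(0, 0.3, size=(16, 2))

# Project each flow vector onto the point's principal direction and take
# the replication error as the norm of the residual (the part of the
# motion that a single-direction actuator cannot reproduce).
projections = flows @ principal_dir
reconstructed = np.outer(projections, principal_dir)
errors = np.linalg.norm(flows - reconstructed, axis=1)
```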
Zupanc, Christine M; Burgess-Limerick, Robin J; Wallis, Guy
2007-08-01
To investigate error and reaction time consequences of alternating compatible and incompatible steering arrangements during a simulated obstacle avoidance task. Underground coal mine shuttle cars provide an example of a vehicle in which operators are required to alternate between compatible and incompatible steering configurations. This experiment examines the performance of 48 novice participants in a virtual analogy of an underground coal mine shuttle car. Participants were randomly assigned to a compatible condition, an incompatible condition, an alternating condition in which compatibility alternated within and between hands, or an alternating condition in which compatibility alternated between hands. Participants made fewer steering direction errors and made correct steering responses more quickly in the compatible condition. Error rate decreased over time in the incompatible condition. A compatibility effect for both errors and reaction time was also found when the control-response relationship alternated; however, performance improvements over time were not consistent. Isolating compatibility to a hand resulted in reduced error rate and faster reaction time than when compatibility alternated within and between hands. The consequences of alternating control-response relationships are higher error rates and slower responses, at least in the early stages of learning. This research highlights the importance of ensuring consistently compatible human-machine directional control-response relationships.
Position sense at the human elbow joint measured by arm matching or pointing.
Tsay, Anthony; Allen, Trevor J; Proske, Uwe
2016-10-01
Position sense at the human elbow joint has traditionally been measured in blindfolded subjects using a forearm matching task. Here we compare position errors in a matching task with errors generated when the subject uses a pointer to indicate the position of a hidden arm. Evidence from muscle vibration during forearm matching supports a role for muscle spindles in position sense. We have recently shown using vibration, as well as muscle conditioning, which takes advantage of muscle's thixotropic property, that position errors generated in a forearm pointing task were not consistent with a role by muscle spindles. In the present study we have used a form of muscle conditioning, where elbow muscles are co-contracted at the test angle, to further explore differences in position sense measured by matching and pointing. For fourteen subjects, in a matching task where the reference arm had elbow flexor and extensor muscles contracted at the test angle and the indicator arm had its flexors conditioned at 90°, matching errors lay in the direction of flexion by 6.2°. After the same conditioning of the reference arm and extension conditioning of the indicator at 0°, matching errors lay in the direction of extension (5.7°). These errors were consistent with predictions based on a role by muscle spindles in determining forearm matching outcomes. In the pointing task subjects moved a pointer to align it with the perceived position of the hidden arm. After conditioning of the reference arm as before, pointing errors all lay in a more extended direction than the actual position of the arm by 2.9°-7.3°, a distribution not consistent with a role by muscle spindles. We propose that in pointing muscle spindles do not play the major role in signalling limb position that they do in matching, but that other sources of sensory input should be given consideration, including afferents from skin and joint.
NASA Technical Reports Server (NTRS)
1989-01-01
The discovery that human error has caused many more airline crashes than mechanical malfunctions led to an increased emphasis on teamwork and coordination in airline flight training programs. Human factors research at Ames Research Center has produced two crew training programs directed toward more effective operations. Cockpit Resource Management (CRM) defines areas like decision making, workload distribution, communication skills, etc. as essential in addressing human error problems. In 1979, a workshop led to the implementation of the CRM program by United Airlines, and later other airlines. In Line Oriented Flight Training (LOFT), crews fly missions in realistic simulators while instructors induce emergency situations requiring crew coordination. This is followed by a self critique. Ames Research Center continues its involvement with these programs.
ERIC Educational Resources Information Center
Yordanova, Juliana; Albrecht, Bjorn; Uebel, Henrik; Kirov, Roumen; Banaschewski, Tobias; Rothenberger, Aribert; Kolev, Vasil
2011-01-01
The maintenance of stable goal-directed behaviour is a hallmark of conscious executive control in humans. Notably, both correct and error human actions may have a subconscious activation-based determination. One possible source of subconscious interference may be the default mode network that, in contrast to attentional network, manifests…
Model-based color halftoning using direct binary search.
Agar, A Ufuk; Allebach, Jan P
2005-12-01
In this paper, we develop a model-based color halftoning method using the direct binary search (DBS) algorithm. Our method strives to minimize the perceived error between the continuous tone original color image and the color halftone image. We exploit the differences in how the human viewers respond to luminance and chrominance information and use the total squared error in a luminance/chrominance based space as our metric. Starting with an initial halftone, we minimize this error metric using the DBS algorithm. Our method also incorporates a measurement based color printer dot interaction model to prevent the artifacts due to dot overlap and to improve color texture quality. We calibrate our halftoning algorithm to ensure accurate colorant distributions in resulting halftones. We present the color halftones which demonstrate the efficacy of our method.
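The core DBS loop, which greedily toggles halftone pixels and keeps any toggle that lowers the perceived (lowpass-filtered) squared error, can be sketched in a toy one-dimensional, single-channel form. The real method operates on 2-D color images in a luminance/chrominance space with a printer dot-interaction model:

```python
import numpy as np

def perceived_error(cont, half, kernel):
    """Squared error between lowpass-filtered (HVS-modeled) signals."""
    blur = lambda im: np.convolve(im, kernel, mode="same")
    return np.sum((blur(cont) - blur(half)) ** 2)

def direct_binary_search(cont, kernel, sweeps=5):
    """Toy 1-D, single-channel DBS: toggle each binary pixel in turn and
    keep the toggle whenever it lowers the perceived error."""
    half = (cont > 0.5).astype(float)          # initial halftone
    err = perceived_error(cont, half, kernel)
    for _ in range(sweeps):
        improved = False
        for i in range(half.size):
            half[i] = 1.0 - half[i]            # trial toggle
            trial = perceived_error(cont, half, kernel)
            if trial < err:
                err = trial                    # accept the toggle
                improved = True
            else:
                half[i] = 1.0 - half[i]        # revert
        if not improved:
            break                              # local minimum reached
    return half, err

cont = np.linspace(0.0, 1.0, 32)               # continuous-tone ramp
kernel = np.array([0.25, 0.5, 0.25])           # crude lowpass "eye" model
halftone, final_err = direct_binary_search(cont, kernel)
```

Since toggles are only accepted when they reduce the metric, the search monotonically improves on the initial thresholded halftone until it reaches a local minimum.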
Observing human movements helps decoding environmental forces.
Zago, Myrka; La Scaleia, Barbara; Miller, William L; Lacquaniti, Francesco
2011-11-01
Vision of human actions can affect several features of visual motion processing, as well as the motor responses of the observer. Here, we tested the hypothesis that action observation helps decoding environmental forces during the interception of a decelerating target within a brief time window, a task intrinsically very difficult. We employed a factorial design to evaluate the effects of scene orientation (normal or inverted) and target gravity (normal or inverted). Button-press triggered the motion of a bullet, a piston, or a human arm. We found that the timing errors were smaller for upright scenes irrespective of gravity direction in the Bullet group, while the errors were smaller for the standard condition of normal scene and gravity in the Piston group. In the Arm group, instead, performance was better when the directions of scene and target gravity were concordant, irrespective of whether both were upright or inverted. These results suggest that the default viewer-centered reference frame is used with inanimate scenes, such as those of the Bullet and Piston protocols. Instead, the presence of biological movements in animate scenes (as in the Arm protocol) may help processing target kinematics under the ecological conditions of coherence between scene and target gravity directions.
Reliability of drivers in urban intersections.
Gstalter, Herbert; Fastenmeier, Wolfgang
2010-01-01
The concept of human reliability has been widely used in industrial settings by human factors experts to optimise the person-task fit. Reliability is estimated by the probability that a task will successfully be completed by personnel in a given stage of system operation. Human Reliability Analysis (HRA) is a technique used to calculate human error probabilities as the ratio of errors committed to the number of opportunities for that error. To transfer this notion to the measurement of car driver reliability the following components are necessary: a taxonomy of driving tasks, a definition of correct behaviour in each of these tasks, a list of errors as deviations from the correct actions and an adequate observation method to register errors and opportunities for these errors. Use of the SAFE-task analysis procedure recently made it possible to derive driver errors directly from the normative analysis of behavioural requirements. Driver reliability estimates could be used to compare groups of tasks (e.g. different types of intersections with their respective regulations) as well as groups of drivers' or individual drivers' aptitudes. This approach was tested in a field study with 62 drivers of different age groups. The subjects drove an instrumented car and had to complete an urban test route, the main features of which were 18 intersections representing six different driving tasks. The subjects were accompanied by two trained observers who recorded driver errors using standardized observation sheets. Results indicate that error indices often vary between both the age group of drivers and the type of driving task. The highest error indices occurred in the non-signalised intersection tasks and the roundabout, which exactly equals the corresponding ratings of task complexity from the SAFE analysis. A comparison of age groups clearly shows the disadvantage of older drivers, whose error indices in nearly all tasks are significantly higher than those of the other groups. 
The vast majority of these errors could be explained by high task load in the intersections, as they represent difficult tasks. The discussion shows how reliability estimates can be used in a constructive way to propose changes in car design, intersection layout and regulation as well as driver training.
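The HRA error index used above (errors committed divided by opportunities for error) is a one-line computation. The counts below are hypothetical, not the study's data:

```python
def human_error_probability(errors_observed, opportunities):
    """HRA error index: errors committed divided by opportunities."""
    if opportunities <= 0:
        raise ValueError("need at least one opportunity for error")
    return errors_observed / opportunities

# Hypothetical counts for two intersection task types.
roundabout_hep = human_error_probability(18, 120)
signalised_hep = human_error_probability(4, 120)
reliability = 1.0 - roundabout_hep   # probability of successful completion
```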
Differing Air Traffic Controller Responses to Similar Trajectory Prediction Errors
NASA Technical Reports Server (NTRS)
Mercer, Joey; Hunt-Espinosa, Sarah; Bienert, Nancy; Laraway, Sean
2016-01-01
A Human-In-The-Loop simulation was conducted in January of 2013 in the Airspace Operations Laboratory at NASA's Ames Research Center. The simulation airspace included two en route sectors feeding the northwest corner of Atlanta's Terminal Radar Approach Control. The focus of this paper is on how uncertainties in the study's trajectory predictions impacted the controllers' ability to perform their duties. Of particular interest is how the controllers interacted with the delay information displayed in the meter list and data block while managing the arrival flows. Due to wind forecasts with 30-knot over-predictions and 30-knot under-predictions, delay value computations included errors of similar magnitude, albeit in opposite directions. However, when performing their duties in the presence of these errors, did the controllers issue clearances of similar magnitude, albeit in opposite directions?

Advanced automated glass cockpit certification: Being wary of human factors
NASA Technical Reports Server (NTRS)
Amalberti, Rene; Wilbaux, Florence
1994-01-01
This paper presents some facets of the French experience with human factors in the process of certification of advanced automated cockpits. Three types of difficulties are described: first, the difficulties concerning the hotly debated concept of human error and its non-linear relationship to risk of accident; a typology of errors to be taken into account in the certification process is put forward to respond to this issue. Next, the difficulties connected to the basically gradual and evolving nature of pilot expertise on a given type of aircraft, which contrasts with the immediate and definitive style of certifying systems. The last difficulties to be considered are those related to the goals of certification itself on these new aircraft and the status of findings from human factor analyses (in particular, what should be done with disappointing results, how much can the changes induced by human factors investigation economically affect aircraft design, how many errors do we need to accumulate before we revise the system, what should be remedied when human factor problems are discovered at the certification stage: the machine? pilot training? the rules? or everything?). The growth of advanced-automated glass cockpits has forced the international aeronautical community to pay more attention to human factors during the design phase, the certification phase and pilot training. The recent creation of a human factor desk at the DGAC-SFACT (Official French services) is a direct consequence of this. The paper is divided into three parts. Part one debates human error and its relationship with system design and accident risk. Part two describes difficulties connected to the basically gradual and evolving nature of pilot expertise on a given type of aircraft, which contrasts with the immediate and definitive style of certifying systems. Part three focuses on concrete outcomes of human factors for certification purposes.
Disruption of State Estimation in the Human Lateral Cerebellum
Miall, R. Chris; Christensen, Lars O. D; Cain, Owen; Stanley, James
2007-01-01
The cerebellum has been proposed to be a crucial component in the state estimation process that combines information from motor efferent and sensory afferent signals to produce a representation of the current state of the motor system. Such a state estimate of the moving human arm would be expected to be used when the arm is rapidly and skillfully reaching to a target. We now report the effects of transcranial magnetic stimulation (TMS) over the ipsilateral cerebellum as healthy humans were made to interrupt a slow voluntary movement to rapidly reach towards a visually defined target. Errors in the initial direction and in the final finger position of this reach-to-target movement were significantly higher for cerebellar stimulation than they were in control conditions. The average directional errors in the cerebellar TMS condition were consistent with the reaching movements being planned and initiated from an estimated hand position that was 138 ms out of date. We suggest that these results demonstrate that the cerebellum is responsible for estimating the hand position over this time interval and that TMS disrupts this state estimate. PMID:18044990
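The finding that reaches appeared to be planned from a hand-position estimate 138 ms out of date suggests a forward-model computation that normally compensates for sensory delay by extrapolating the delayed position along the current velocity. A minimal sketch, with assumed position and velocity values:

```python
import numpy as np

def extrapolated_state(position, velocity, delay=0.138):
    """Forward-model style state estimate: compensate for sensory delay
    by extrapolating the delayed position along the current velocity."""
    return position + velocity * delay

# Hypothetical hand moving at 0.5 m/s along x; sensed position is 138 ms old.
sensed = np.array([0.10, 0.00])       # metres
velocity = np.array([0.5, 0.0])       # m/s
estimate = extrapolated_state(sensed, velocity)
# Without this compensation, the reach would be planned from a position
# about 0.069 m behind the true hand position.
```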
Preventing medical errors by designing benign failures.
Grout, John R
2003-07-01
One way to successfully reduce medical errors is to design health care systems that are more resistant to the tendencies of human beings to err. One interdisciplinary approach entails creating design changes, mitigating human errors, and making human error irrelevant to outcomes. This approach is intended to facilitate the creation of benign failures, which have been called mistake-proofing devices and forcing functions elsewhere. USING FAULT TREES TO DESIGN FORCING FUNCTIONS: A fault tree is a graphical tool used to understand the relationships that either directly cause or contribute to the cause of a particular failure. A careful analysis of a fault tree enables the analyst to anticipate how the process will behave after the change. EXAMPLE OF AN APPLICATION: A scenario in which a patient is scalded while bathing can serve as an example of how multiple fault trees can be used to design forcing functions. The first fault tree shows the undesirable event--patient scalded while bathing. The second fault tree has a benign event--no water. Adding a scald valve changes the outcome from the undesirable event ("patient scalded while bathing") to the benign event ("no water"). Analysis of fault trees does not ensure or guarantee that changes necessary to eliminate error actually occur. Most mistake-proofing is used to prevent simple errors and to create well-defended processes, but complex errors can also result. The utilization of mistake-proofing or forcing functions can be thought of as changing the logic of a process. Errors that formerly caused undesirable failures can be converted into the causes of benign failures. The use of fault trees can provide a variety of insights into the design of forcing functions that will improve patient safety.
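The fault-tree logic described above can be sketched with independent basic events combined through OR-gates and AND-gates. All probabilities below are hypothetical and chosen only to illustrate the scald example:

```python
# Minimal fault-tree evaluation: OR-gates and AND-gates combine basic-event
# probabilities (assuming independent events).
def or_gate(*probs):
    """P(any input event occurs) = 1 - product of (1 - p)."""
    out = 1.0
    for p in probs:
        out *= (1.0 - p)
    return 1.0 - out

def and_gate(*probs):
    """P(all input events occur) = product of the input probabilities."""
    out = 1.0
    for p in probs:
        out *= p
    return out

# Hypothetical numbers: scalding requires hot water present AND a failure
# to check temperature (operator slip OR broken thermometer).
p_hot_water = 0.2
p_no_check = or_gate(0.05, 0.01)
p_scald = and_gate(p_hot_water, p_no_check)

# A "scald valve" forcing function removes the hot-water branch, so the
# top event becomes the benign failure "no water" instead of a scald.
```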
Colas, Jaron T; Pauli, Wolfgang M; Larsen, Tobias; Tyszka, J Michael; O'Doherty, John P
2017-10-01
Prediction-error signals consistent with formal models of "reinforcement learning" (RL) have repeatedly been found within dopaminergic nuclei of the midbrain and dopaminoceptive areas of the striatum. However, the precise form of the RL algorithms implemented in the human brain is not yet well determined. Here, we created a novel paradigm optimized to dissociate the subtypes of reward-prediction errors that function as the key computational signatures of two distinct classes of RL models-namely, "actor/critic" models and action-value-learning models (e.g., the Q-learning model). The state-value-prediction error (SVPE), which is independent of actions, is a hallmark of the actor/critic architecture, whereas the action-value-prediction error (AVPE) is the distinguishing feature of action-value-learning algorithms. To test for the presence of these prediction-error signals in the brain, we scanned human participants with a high-resolution functional magnetic-resonance imaging (fMRI) protocol optimized to enable measurement of neural activity in the dopaminergic midbrain as well as the striatal areas to which it projects. In keeping with the actor/critic model, the SVPE signal was detected in the substantia nigra. The SVPE was also clearly present in both the ventral striatum and the dorsal striatum. However, alongside these purely state-value-based computations we also found evidence for AVPE signals throughout the striatum. These high-resolution fMRI findings suggest that model-free aspects of reward learning in humans can be explained algorithmically with RL in terms of an actor/critic mechanism operating in parallel with a system for more direct action-value learning.
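The two prediction-error signatures can be written as temporal-difference updates: the SVPE depends only on state values, while the AVPE depends on the value of the chosen action. The learning rate, discount factor, and value estimates below are illustrative:

```python
# Illustrative temporal-difference updates contrasting the critic's
# state-value-prediction error (SVPE) with the action-value-prediction
# error (AVPE) of Q-learning-style action-value learning.
alpha, gamma = 0.1, 0.9   # assumed learning rate and discount factor

def svpe(reward, v_next, v_current):
    """Critic's state-value-prediction error: independent of the action."""
    return reward + gamma * v_next - v_current

def avpe(reward, q_next_max, q_current):
    """Action-value-prediction error for the chosen action."""
    return reward + gamma * q_next_max - q_current

V = {"s0": 0.0, "s1": 0.5}          # state values (critic)
Q = {("s0", "press"): 0.2}          # action values

delta_state = svpe(reward=1.0, v_next=V["s1"], v_current=V["s0"])
V["s0"] += alpha * delta_state      # critic update

delta_action = avpe(reward=1.0, q_next_max=0.5, q_current=Q[("s0", "press")])
Q[("s0", "press")] += alpha * delta_action   # action-value update
```

The paper's fMRI contrast rests on exactly this dissociation: the SVPE is computed without reference to the chosen action, whereas the AVPE is tied to it.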
Pauli, Wolfgang M.; Larsen, Tobias; Tyszka, J. Michael; O’Doherty, John P.
2017-01-01
Prediction-error signals consistent with formal models of “reinforcement learning” (RL) have repeatedly been found within dopaminergic nuclei of the midbrain and dopaminoceptive areas of the striatum. However, the precise form of the RL algorithms implemented in the human brain is not yet well determined. Here, we created a novel paradigm optimized to dissociate the subtypes of reward-prediction errors that function as the key computational signatures of two distinct classes of RL models—namely, “actor/critic” models and action-value-learning models (e.g., the Q-learning model). The state-value-prediction error (SVPE), which is independent of actions, is a hallmark of the actor/critic architecture, whereas the action-value-prediction error (AVPE) is the distinguishing feature of action-value-learning algorithms. To test for the presence of these prediction-error signals in the brain, we scanned human participants with a high-resolution functional magnetic-resonance imaging (fMRI) protocol optimized to enable measurement of neural activity in the dopaminergic midbrain as well as the striatal areas to which it projects. In keeping with the actor/critic model, the SVPE signal was detected in the substantia nigra. The SVPE was also clearly present in both the ventral striatum and the dorsal striatum. However, alongside these purely state-value-based computations we also found evidence for AVPE signals throughout the striatum. These high-resolution fMRI findings suggest that model-free aspects of reward learning in humans can be explained algorithmically with RL in terms of an actor/critic mechanism operating in parallel with a system for more direct action-value learning. PMID:29049406
Hakala, John L; Hung, Joseph C; Mosman, Elton A
2012-09-01
The objective of this project was to ensure correct radiopharmaceutical administration through the use of a bar code system that links patient and drug profiles with on-site information management systems. This new combined system would minimize the amount of manual human manipulation, which has proven to be a primary source of error. The most common reason for dosing errors is improper patient identification when a dose is obtained from the nuclear pharmacy or when a dose is administered. A standardized electronic transfer of information from radiopharmaceutical preparation to injection will further reduce the risk of misadministration. Value stream maps showing the flow of the patient dose information, as well as potential points of human error, were developed. Next, a future-state map was created that included proposed corrections for the most common critical sites of error. Transitioning the current process to the future state will require solutions that address these sites. To optimize the future-state process, a bar code system that links the on-site radiology management system with the nuclear pharmacy management system was proposed. A bar-coded wristband connects the patient directly to the electronic information systems. The bar code-enhanced process linking the patient dose with the electronic information reduces the number of crucial points for human error and provides a framework to ensure that the prepared dose reaches the correct patient. Although the proposed flowchart is designed for a site with an in-house central nuclear pharmacy, much of the framework could be applied by nuclear medicine facilities using unit doses. An electronic connection between information management systems to allow the tracking of a radiopharmaceutical from preparation to administration can be a useful tool in preventing the mistakes that are an unfortunate reality for any facility.
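At the point of administration, the proposed bar-code check reduces to verifying that the scanned wristband ID matches the patient ID in the dose record. A hypothetical sketch (record contents, IDs, and field names are invented):

```python
# Hypothetical nuclear-pharmacy dose records keyed by dose bar code.
pharmacy_records = {
    "DOSE-0421": {"patient_id": "P-1007", "drug": "Tc-99m MDP"},
}

def verify_administration(dose_barcode, wristband_barcode):
    """Return the drug to administer, or raise if the IDs do not match."""
    record = pharmacy_records[dose_barcode]
    if record["patient_id"] != wristband_barcode:
        raise ValueError("patient/dose mismatch: do not administer")
    return record["drug"]

drug = verify_administration("DOSE-0421", "P-1007")   # IDs match
```

Forcing the match electronically removes the manual identification step that the paper identifies as the most common source of dosing error.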
Effective force control by muscle synergies.
Berger, Denise J; d'Avella, Andrea
2014-01-01
Muscle synergies have been proposed as a way for the central nervous system (CNS) to simplify the generation of motor commands and they have been shown to explain a large fraction of the variation in the muscle patterns across a variety of conditions. However, whether human subjects are able to control forces and movements effectively with a small set of synergies has not been tested directly. Here we show that muscle synergies can be used to generate target forces in multiple directions with the same accuracy achieved using individual muscles. We recorded electromyographic (EMG) activity from 13 arm muscles and isometric hand forces during a force reaching task in a virtual environment. From these data we estimated the force associated to each muscle by linear regression and we identified muscle synergies by non-negative matrix factorization. We compared trajectories of a virtual mass displaced by the force estimated using the entire set of recorded EMGs to trajectories obtained using 4-5 muscle synergies. While trajectories were similar, when feedback was provided according to force estimated from recorded EMGs (EMG-control) on average trajectories generated with the synergies were less accurate. However, when feedback was provided according to recorded force (force-control) we did not find significant differences in initial angle error and endpoint error. We then tested whether synergies could be used as effectively as individual muscles to control cursor movement in the force reaching task by providing feedback according to force estimated from the projection of the recorded EMGs into synergy space (synergy-control). Human subjects were able to perform the task immediately after switching from force-control to EMG-control and synergy-control and we found no differences between initial movement direction errors and endpoint errors in all control modes. These results indicate that muscle synergies provide an effective strategy for motor coordination.
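The synergy identification step, non-negative matrix factorization of the muscle-by-sample EMG matrix into synergies and activations, can be sketched with standard multiplicative updates. The synthetic data below stand in for the study's 13-muscle recordings:

```python
import numpy as np

rng = np.random.default_rng(3)

def nmf(M, n_synergies, iters=1000, rng=rng):
    """Non-negative matrix factorization by multiplicative updates:
    approximate M (muscles x samples) as W @ H, where the columns of W
    are muscle synergies and the rows of H their activations."""
    n_muscles, n_samples = M.shape
    W = rng.uniform(0.1, 1.0, (n_muscles, n_synergies))
    H = rng.uniform(0.1, 1.0, (n_synergies, n_samples))
    eps = 1e-9   # guard against division by zero
    for _ in range(iters):
        H *= (W.T @ M) / (W.T @ W @ H + eps)
        W *= (M @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Synthetic "EMG" built from 4 ground-truth synergies over 13 muscles.
W_true = rng.uniform(0, 1, (13, 4))
H_true = rng.uniform(0, 1, (4, 200))
emg = W_true @ H_true
W, H = nmf(emg, n_synergies=4)
residual = np.linalg.norm(emg - W @ H) / np.linalg.norm(emg)
```

On noiseless low-rank data the factorization recovers the muscle patterns almost exactly; real EMG leaves a residual, and the study's question is whether the 4-5 synergy approximation still supports accurate force control.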
Performance Monitoring Applied to System Supervision
Somon, Bertille; Campagne, Aurélie; Delorme, Arnaud; Berberian, Bruno
2017-01-01
Nowadays, automation is present in every aspect of our daily life and has some benefits. Nonetheless, empirical data suggest that traditional automation has many negative performance and safety consequences as it changed task performers into task supervisors. In this context, we propose to use recent insights into the anatomical and neurophysiological substrates of action monitoring in humans, to help further characterize performance monitoring during system supervision. Error monitoring is critical for humans to learn from the consequences of their actions. A wide variety of studies have shown that the error monitoring system is involved not only in our own errors, but also in the errors of others. We hypothesize that the neurobiological correlates of the self-performance monitoring activity can be applied to system supervision. At a larger scale, a better understanding of system supervision may allow its negative effects to be anticipated or even countered. This review is divided into three main parts. First, we assess the neurophysiological correlates of self-performance monitoring and their characteristics during error execution. Then, we extend these results to include performance monitoring and error observation of others or of systems. Finally, we provide further directions in the study of system supervision and assess the limits preventing us from studying a well-known phenomenon: the Out-Of-the-Loop (OOL) performance problem. PMID:28744209
Human Factors and Ergonomics for the Dental Profession.
Ross, Al
2016-09-01
This paper proposes that the science of Human Factors and Ergonomics (HFE) is suitable for wide application in dental education, training and practice to improve safety, quality and efficiency. Three areas of interest are highlighted. First it is proposed that individual and team Non-Technical Skills (NTS), such as communication, leadership and stress management can improve error rates and efficiency of procedures. Secondly, in a physically and technically challenging environment, staff can benefit from ergonomic principles which examine design in supporting safe work. Finally, examination of organizational human factors can help anticipate stressors and plan for flexible responses to multiple, variable demands, and fluctuating resources. Clinical relevance: HFE is an evidence-based approach to reducing error rates and procedural complications, and avoiding problems associated with stress and fatigue. Improved teamwork and organizational planning and efficiency can impact directly on patient outcomes.
Human Error: A Concept Analysis
NASA Technical Reports Server (NTRS)
Hansen, Frederick D.
2007-01-01
Human error is the subject of research in almost every industry and profession of our times. The term is part of our daily language and intuitively understood by most people; however, it would be premature to assume that everyone's understanding of human error is the same. For example, human error is used to describe the outcome or consequence of human action, the causal factor of an accident, deliberate violations, and the actual action taken by a human being. As a result, researchers rarely agree on either a specific definition of human error or how to prevent it. The purpose of this article is to explore the specific concept of human error using Concept Analysis as described by Walker and Avant (1995). The concept of human error is examined as currently used in the literature of a variety of industries and professions. Defining attributes and examples of model, borderline, and contrary cases are described. The antecedents and consequences of human error are also discussed, and a definition of human error is offered.
Fuzzy risk analysis of a modern γ-ray industrial irradiator.
Castiglia, F; Giardina, M
2011-06-01
Fuzzy fault tree analyses were used to investigate accident scenarios that involve radiological exposure to operators working in industrial γ-ray irradiation facilities. The HEART method, a first generation human reliability analysis method, was used to evaluate the probability of adverse human error in these analyses. This technique was modified on the basis of fuzzy set theory to more directly take into account the uncertainties in the error-promoting factors on which the methodology is based. Moreover, with regard to some identified accident scenarios, fuzzy radiological exposure risk, expressed in terms of potential annual death, was evaluated. The calculated fuzzy risks for the examined plant were determined to be well below the reference risk suggested by International Commission on Radiological Protection.
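The underlying HEART calculation that the paper treats with fuzzy sets multiplies a generic-task nominal HEP by an assessed factor for each error-producing condition (EPC). A crisp (non-fuzzy) sketch with hypothetical EPC values:

```python
# Crisp HEART human error probability: the generic-task nominal HEP is
# multiplied, for each EPC, by ((max_multiplier - 1) * assessed_proportion + 1).
# The fuzzy variant in the paper replaces these crisp factors with fuzzy numbers.
def heart_hep(nominal_hep, epcs):
    """epcs: list of (max_multiplier, assessed_proportion_of_affect) pairs."""
    hep = nominal_hep
    for max_multiplier, proportion in epcs:
        hep *= (max_multiplier - 1.0) * proportion + 1.0
    return min(hep, 1.0)   # a probability cannot exceed 1

# Hypothetical assessment: nominal HEP 0.003 with two EPCs, e.g.
# time shortage (x11 max, 40% affect) and inexperience (x3 max, 50% affect).
hep = heart_hep(0.003, [(11.0, 0.4), (3.0, 0.5)])
```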
Transfer Error and Correction Approach in Mobile Network
NASA Astrophysics Data System (ADS)
Xiao-kai, Wu; Yong-jin, Shi; Da-jin, Chen; Bing-he, Ma; Qi-li, Zhou
With the development of information technology and social progress, human demand for information has become increasingly diverse: people want to communicate easily, quickly, and flexibly via voice, data, images, and video, wherever and whenever they choose. Because visual information is direct and vivid, image and video transmission has attracted widespread attention. Although third-generation mobile communication systems have emerged and IP networks have developed rapidly, making video communication a major wireless service, real wireless and IP channels introduce errors, such as errors caused by multipath fading in wireless channels and packet loss in IP networks. Moreover, because channel bandwidth is limited, video data must be heavily compressed, and the compressed bitstream is highly sensitive to channel errors, so such errors cause serious degradation of image quality.
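Error detection on a compressed video packet can be sketched with a CRC-32 checksum: the receiver recomputes the checksum and flags corrupted packets for concealment or retransmission. This is a generic illustration, not a scheme from the paper:

```python
import zlib

def make_packet(payload: bytes) -> bytes:
    """Sender side: append a CRC-32 checksum to the payload."""
    crc = zlib.crc32(payload)
    return payload + crc.to_bytes(4, "big")

def check_packet(packet: bytes):
    """Receiver side: recompute the CRC and compare with the trailer."""
    payload, crc = packet[:-4], int.from_bytes(packet[-4:], "big")
    ok = zlib.crc32(payload) == crc
    return payload, ok

packet = make_packet(b"compressed video slice")
_, ok = check_packet(packet)                        # intact packet

corrupted = bytes([packet[0] ^ 0x01]) + packet[1:]  # single bit error
_, ok_corrupt = check_packet(corrupted)             # detected as bad
```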
Understanding adverse events: human factors.
Reason, J
1995-01-01
(1) Human rather than technical failures now represent the greatest threat to complex and potentially hazardous systems. This includes healthcare systems. (2) Managing the human risks will never be 100% effective. Human fallibility can be moderated, but it cannot be eliminated. (3) Different error types have different underlying mechanisms, occur in different parts of the organisation, and require different methods of risk management. The basic distinctions are between: slips, lapses, trips, and fumbles (execution failures) and mistakes (planning or problem solving failures), with mistakes divided into rule based mistakes and knowledge based mistakes; errors (information-handling problems) and violations (motivational problems); and active versus latent failures. Active failures are committed by those in direct contact with the patient; latent failures arise in organisational and managerial spheres and their adverse effects may take a long time to become evident. (4) Safety-significant errors occur at all levels of the system, not just at the sharp end. Decisions made in the upper echelons of the organisation create the conditions in the workplace that subsequently promote individual errors and violations. Latent failures are present long before an accident and are hence prime candidates for principled risk management. (5) Measures that involve sanctions and exhortations (that is, moralistic measures directed to those at the sharp end) have only very limited effectiveness, especially so in the case of highly trained professionals. (6) Human factors problems are a product of a chain of causes in which the individual psychological factors (that is, momentary inattention, forgetting, etc) are the last and least manageable links. Attentional "capture" (preoccupation or distraction) is a necessary condition for the commission of slips and lapses. Yet its occurrence is almost impossible to predict or control effectively. The same is true of the factors associated with forgetting. States of mind contributing to error are thus extremely difficult to manage; they can happen to the best of people at any time. (7) People do not act in isolation. Their behaviour is shaped by circumstances. The same is true for errors and violations. The likelihood of an unsafe act being committed is heavily influenced by the nature of the task and by the local workplace conditions. These, in turn, are the product of "upstream" organisational factors. Great gains in safety can be achieved through relatively small modifications of equipment and workplaces. (8) Automation and increasingly advanced equipment do not cure human factors problems; they merely relocate them. In contrast, training people to work effectively in teams costs little, but has achieved significant enhancements of human performance in aviation. (9) Effective risk management depends critically on a confidential and preferably anonymous incident monitoring system that records the individual, task, situational, and organisational factors associated with incidents and near misses. (10) Effective risk management means the simultaneous and targeted deployment of limited remedial resources at different levels of the system: the individual or team, the task, the situation, and the organisation as a whole. PMID:10151618
Human Factors In the Joint Typhoon Warning Center Watch Floor
2012-11-01
Report covering 01-10-2010 to 30-03-2011. Only abstract fragments survive: "...between users' information requirements and interpretation process and the JTWC's forecast fields. The language of TCCOR definitions provides one (of...) ...direction error is less than 90°, predicting a position 10 nautical miles (nmi) too close to the current position produces a lower FTE than..."
Li, Wen-Chin; Harris, Don; Yu, Chung-San
2008-03-01
The human factors analysis and classification system (HFACS) is based upon Reason's organizational model of human error. HFACS was developed as an analytical framework for the investigation of the role of human error in aviation accidents, however, there is little empirical work formally describing the relationship between the components in the model. This research analyses 41 civil aviation accidents occurring to aircraft registered in the Republic of China (ROC) between 1999 and 2006 using the HFACS framework. The results show statistically significant relationships between errors at the operational level and organizational inadequacies at both the immediately adjacent level (preconditions for unsafe acts) and higher levels in the organization (unsafe supervision and organizational influences). The pattern of the 'routes to failure' observed in the data from this analysis of civil aircraft accidents show great similarities to that observed in the analysis of military accidents. This research lends further support to Reason's model that suggests that active failures are promoted by latent conditions in the organization. Statistical relationships linking fallible decisions in upper management levels were found to directly affect supervisory practices, thereby creating the psychological preconditions for unsafe acts and hence indirectly impairing the performance of pilots, ultimately leading to accidents.
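The kind of analysis described, testing association between a condition at a higher HFACS level and unsafe acts at the level below across accident reports, can be sketched with a 2x2 chi-square test. The accident counts below are invented for illustration (though the total of 41 matches the sample size of the study).

```python
# Hedged sketch: Pearson chi-square test of association between a
# precondition (present/absent) and an unsafe act (present/absent)
# tallied across accidents. Counts are hypothetical.

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square for the table [[a, b], [c, d]],
    without continuity correction."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# e.g. 15 accidents with both precondition and unsafe act present,
# 3 with the precondition only, 5 with the act only, 18 with neither:
chi2 = chi_square_2x2(15, 3, 5, 18)
significant = chi2 > 3.841  # 5% critical value at 1 degree of freedom
```

A significant statistic here indicates that the unsafe act co-occurs with the higher-level condition more often than chance would predict, which is the "route to failure" pattern the paper reports.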
Fabbretti, G
2010-06-01
Because of its complex nature, surgical pathology practice is prone to error. In this report, we describe our methods for reducing error as much as possible during the pre-analytical and analytical phases. This was achieved by revising procedures and by using computer technology and automation. Most mistakes are the result of human error in the identification and matching of patients and samples. To avoid faulty data interpretation, we employed a new comprehensive computer system that acquires all patient ID information directly from the hospital's database with remote order entry; it also provides labels and request forms via Web, where clinical information is required before sending the sample. Both patient and sample are identified directly and immediately at the site where the surgical procedures are performed. Barcode technology is used to input information at every step, and automation is used for sample blocks and slides to avoid errors that occur when information is recorded or transferred by hand. Quality control checks occur at every step of the process to ensure that none of the steps are left to chance and that no phase is dependent on a single operator. The system also provides statistical analysis of errors so that new strategies can be implemented to avoid repetition. In addition, the staff receives frequent training on avoiding errors and on new developments. The results have been promising, with a very low error rate (0.27%); none of the errors compromised patient health, and all were detected before release of the diagnostic report.
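Two of the safeguards the report describes, a barcode match check at every step and accumulation of mismatch events for later statistical analysis, can be sketched as follows. All identifiers and field names here are hypothetical.

```python
# Illustrative sketch (names hypothetical): each workflow step proceeds
# only if the scanned sample barcode matches the patient record pulled
# from the hospital database; mismatches are logged for the statistical
# error analysis the system provides.

error_log = []

def checkpoint(step, patient_record, scanned_barcode):
    """Allow a workflow step only if the scanned sample matches the patient."""
    ok = patient_record["sample_id"] == scanned_barcode
    if not ok:
        error_log.append({"step": step,
                          "expected": patient_record["sample_id"],
                          "scanned": scanned_barcode})
    return ok

record = {"patient_id": "P-1042", "sample_id": "S-77319"}
proceed_1 = checkpoint("embedding", record, "S-77319")    # match: proceed
proceed_2 = checkpoint("sectioning", record, "S-77320")   # mismatch: blocked
error_rate = len(error_log) / 2                           # 1 of 2 checks failed
```

Because every phase re-checks the match, no single transcription step (or single operator) can silently propagate a wrong patient-sample pairing.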
Human Error In Complex Systems
NASA Technical Reports Server (NTRS)
Morris, Nancy M.; Rouse, William B.
1991-01-01
Report presents results of research aimed at understanding causes of human error in such complex systems as aircraft, nuclear powerplants, and chemical processing plants. Research considered both slips (errors of action) and mistakes (errors of intention), and the influence of workload on them. Results indicated that humans respond to conditions in which errors are expected by attempting to reduce the incidence of errors, and that adaptation to conditions is a potent influence on human behavior in discretionary situations.
NASA Technical Reports Server (NTRS)
Alexander, Tiffaney Miller
2017-01-01
Research results have shown that more than half of aviation, aerospace and aeronautics mishaps/incidents are attributed to human error. As a part of Safety within space exploration ground processing operations, the identification and/or classification of underlying contributors and causes of human error must be identified, in order to manage human error. This research provides a framework and methodology using the Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS), as an analysis tool to identify contributing factors, their impact on human error events, and predict the Human Error probabilities (HEPs) of future occurrences. This research methodology was applied (retrospectively) to six (6) NASA ground processing operations scenarios and thirty (30) years of Launch Vehicle related mishap data. This modifiable framework can be used and followed by other space and similar complex operations.
NASA Technical Reports Server (NTRS)
Alexander, Tiffaney Miller
2017-01-01
Research results have shown that more than half of aviation, aerospace and aeronautics mishaps/incidents are attributed to human error. As a part of Quality within space exploration ground processing operations, the identification and/or classification of underlying contributors and causes of human error must be identified, in order to manage human error. This presentation will provide a framework and methodology using the Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS), as an analysis tool to identify contributing factors, their impact on human error events, and predict the Human Error probabilities (HEPs) of future occurrences. This research methodology was applied (retrospectively) to six (6) NASA ground processing operations scenarios and thirty (30) years of Launch Vehicle related mishap data. This modifiable framework can be used and followed by other space and similar complex operations.
Understanding human management of automation errors
McBride, Sara E.; Rogers, Wendy A.; Fisk, Arthur D.
2013-01-01
Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance. PMID:25383042
Dopamine Modulates Adaptive Prediction Error Coding in the Human Midbrain and Striatum.
Diederen, Kelly M J; Ziauddeen, Hisham; Vestergaard, Martin D; Spencer, Tom; Schultz, Wolfram; Fletcher, Paul C
2017-02-15
Learning to optimally predict rewards requires agents to account for fluctuations in reward value. Recent work suggests that individuals can efficiently learn about variable rewards through adaptation of the learning rate, and coding of prediction errors relative to reward variability. Such adaptive coding has been linked to midbrain dopamine neurons in nonhuman primates, and evidence in support of a similar role of the dopaminergic system in humans is emerging from fMRI data. Here, we sought to investigate the effect of dopaminergic perturbations on adaptive prediction error coding in humans, using a between-subject, placebo-controlled pharmacological fMRI study with a dopaminergic agonist (bromocriptine) and antagonist (sulpiride). Participants performed a previously validated task in which they predicted the magnitude of upcoming rewards drawn from distributions with varying SDs. After each prediction, participants received a reward, yielding trial-by-trial prediction errors. Under placebo, we replicated previous observations of adaptive coding in the midbrain and ventral striatum. Treatment with sulpiride attenuated adaptive coding in both midbrain and ventral striatum, and was associated with a decrease in performance, whereas bromocriptine did not have a significant impact. Although we observed no differential effect of SD on performance between the groups, computational modeling suggested decreased behavioral adaptation in the sulpiride group. These results suggest that normal dopaminergic function is critical for adaptive prediction error coding, a key property of the brain thought to facilitate efficient learning in variable environments. Crucially, these results also offer potential insights for understanding the impact of disrupted dopamine function in mental illness. SIGNIFICANCE STATEMENT To choose optimally, we have to learn what to expect.
Humans dampen learning when there is a great deal of variability in reward outcome, and two brain regions that are modulated by the brain chemical dopamine are sensitive to reward variability. Here, we aimed to directly relate dopamine to learning about variable rewards, and the neural encoding of associated teaching signals. We perturbed dopamine in healthy individuals using dopaminergic medication and asked them to predict variable rewards while we made brain scans. Dopamine perturbations impaired learning and the neural encoding of reward variability, thus establishing a direct link between dopamine and adaptation to reward variability. These results aid our understanding of clinical conditions associated with dopaminergic dysfunction, such as psychosis. Copyright © 2017 Diederen et al.
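The core idea of adaptive prediction error coding, errors expressed relative to reward variability rather than in absolute terms, can be sketched with a simple learner. This is an illustrative toy model under stated assumptions, not the authors' computational model.

```python
# Hedged sketch: a Rescorla-Wagner-style learner whose prediction error
# is scaled by a running estimate of reward SD, so that the same absolute
# surprise is coded as smaller when rewards are more variable.

def adaptive_learner(rewards, alpha=0.2, beta=0.1):
    """Track reward value and variability; return the final value
    estimate and the trial-by-trial scaled (adaptive) errors."""
    v, sd = 0.0, 1.0              # value estimate and running SD estimate
    scaled_errors = []
    for r in rewards:
        delta = r - v             # raw prediction error
        scaled = delta / sd       # error coded relative to variability
        v += alpha * sd * scaled  # value update (equivalent to alpha*delta)
        sd += beta * (abs(delta) - sd)  # track reward variability
        sd = max(sd, 1e-6)
        scaled_errors.append(scaled)
    return v, scaled_errors
```

With constant rewards the value estimate converges to the reward; with high-variability rewards the same raw error yields a smaller scaled error, which is the adaptation that sulpiride attenuated in the study.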
Human operator response to error-likely situations in complex engineering systems
NASA Technical Reports Server (NTRS)
Morris, Nancy M.; Rouse, William B.
1988-01-01
The causes of human error in complex systems are examined. First, a conceptual framework is provided in which two broad categories of error are discussed: errors of action, or slips, and errors of intention, or mistakes. Conditions in which slips and mistakes might be expected to occur are identified, based on existing theories of human error. Regarding the role of workload, it is hypothesized that workload may act as a catalyst for error. Two experiments are presented in which humans' response to error-likely situations were examined. Subjects controlled PLANT under a variety of conditions and periodically provided subjective ratings of mental effort. A complex pattern of results was obtained, which was not consistent with predictions. Generally, the results of this research indicate that: (1) humans respond to conditions in which errors might be expected by attempting to reduce the possibility of error, and (2) adaptation to conditions is a potent influence on human behavior in discretionary situations. Subjects' explanations for changes in effort ratings are also explored.
Human-simulation-based learning to prevent medication error: A systematic review.
Sarfati, Laura; Ranchon, Florence; Vantard, Nicolas; Schwiertz, Vérane; Larbre, Virginie; Parat, Stéphanie; Faudel, Amélie; Rioufol, Catherine
2018-01-31
In the past 2 decades, there has been an increasing interest in simulation-based learning programs to prevent medication error (ME). To improve knowledge, skills, and attitudes in prescribers, nurses, and pharmaceutical staff, these methods enable training without directly involving patients. However, best practices for simulation for healthcare providers are as yet undefined. By analysing the current state of experience in the field, the present review aims to assess whether human simulation in healthcare helps to reduce ME. A systematic review was conducted on Medline from 2000 to June 2015, associating the terms "Patient Simulation," "Medication Errors," and "Simulation Healthcare." Reports of technology-based simulation were excluded, to focus exclusively on human simulation in nontechnical skills learning. Twenty-one studies assessing simulation-based learning programs were selected, focusing on pharmacy, medicine or nursing students, or concerning programs aimed at reducing administration or preparation errors, managing crises, or learning communication skills for healthcare professionals. The studies varied in design, methodology, and assessment criteria. Few demonstrated that simulation was more effective than didactic learning in reducing ME. This review highlights a lack of long-term assessment and real-life extrapolation, with limited scenarios and participant samples. These various experiences, however, help in identifying the key elements required for an effective human simulation-based learning program for ME prevention: ie, scenario design, debriefing, and perception assessment. The performance of these programs depends on their ability to reflect reality and on professional guidance. Properly regulated simulation is a good way to train staff in events that happen only exceptionally, as well as in standard daily activities. 
By integrating human factors, simulation seems to be effective in preventing iatrogenic risk related to ME, if the program is well designed. © 2018 John Wiley & Sons, Ltd.
Human error and human factors engineering in health care.
Welch, D L
1997-01-01
Human error is inevitable. It happens in health care systems as it does in all other complex systems, and no measure of attention, training, dedication, or punishment is going to stop it. The discipline of human factors engineering (HFE) has been dealing with the causes and effects of human error since the 1940's. Originally applied to the design of increasingly complex military aircraft cockpits, HFE has since been effectively applied to the problem of human error in such diverse systems as nuclear power plants, NASA spacecraft, the process control industry, and computer software. Today the health care industry is becoming aware of the costs of human error and is turning to HFE for answers. Just as early experimental psychologists went beyond the label of "pilot error" to explain how the design of cockpits led to air crashes, today's HFE specialists are assisting the health care industry in identifying the causes of significant human errors in medicine and developing ways to eliminate or ameliorate them. This series of articles will explore the nature of human error and how HFE can be applied to reduce the likelihood of errors and mitigate their effects.
Active learning: learning a motor skill without a coach.
Huang, Vincent S; Shadmehr, Reza; Diedrichsen, Jörn
2008-08-01
When we learn a new skill (e.g., golf) without a coach, we are "active learners": we have to choose the specific components of the task on which to train (e.g., iron, driver, putter, etc.). What guides our selection of the training sequence? How do choices that people make compare with choices made by machine learning algorithms that attempt to optimize performance? We asked subjects to learn the novel dynamics of a robotic tool while moving it in four directions. They were instructed to choose their practice directions to maximize their performance in subsequent tests. We found that their choices were strongly influenced by motor errors: subjects tended to immediately repeat an action if that action had produced a large error. This strategy was correlated with better performance on test trials. However, even when participants performed perfectly on a movement, they did not avoid repeating that movement. The probability of repeating an action did not drop below chance even when no errors were observed. This behavior led to suboptimal performance. It also violated a strong prediction of current machine learning algorithms, which solve the active learning problem by choosing a training sequence that will maximally reduce the learner's uncertainty about the task. While we show that these algorithms do not provide an adequate description of human behavior, our results suggest ways to improve human motor learning by helping people choose an optimal training sequence.
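The two training-selection policies contrasted in the abstract, the human-like repeat-after-error strategy and the uncertainty-maximizing choice of machine active-learning algorithms, can be sketched side by side. The function names, error threshold, and four-direction setup are illustrative assumptions.

```python
# Hedged sketch of the two policies the study compares (illustrative only).
import random

def human_like_choice(last_dir, last_error, n_dirs=4, threshold=1.0):
    """Repeat the last action after a large error; otherwise choose at
    random (the abstract notes repeats stayed above chance even after
    error-free movements, which is suboptimal)."""
    if last_error > threshold:
        return last_dir
    return random.randrange(n_dirs)

def active_learning_choice(uncertainty):
    """Machine active learner: train the direction where uncertainty
    about the task is currently largest."""
    return max(range(len(uncertainty)), key=lambda d: uncertainty[d])

# e.g. after a large error in direction 2, a human-like learner repeats it,
# while the algorithm picks whichever direction is least well known.
```

The paper's finding is that the first policy describes subjects' behavior better, even though the second yields better test performance in principle.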
Cheng, Ching-Min; Hwang, Sheue-Ling
2015-03-01
This paper outlines the human error identification (HEI) techniques that currently exist to assess latent human errors. Many formal error identification techniques have existed for years, but few have been validated to cover latent human error analysis in different domains. This study considers many possible error modes and influential factors, including external error modes, internal error modes, psychological error mechanisms, and performance shaping factors, and integrates several execution procedures and frameworks of HEI techniques. The case study in this research was the operational process of changing chemical cylinders in a factory. In addition, the integrated HEI method was used to assess the operational processes and the system's reliability. It was concluded that the integrated method is a valuable aid to develop much safer operational processes and can be used to predict human error rates on critical tasks in the plant. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Effective force control by muscle synergies
Berger, Denise J.; d'Avella, Andrea
2014-01-01
Muscle synergies have been proposed as a way for the central nervous system (CNS) to simplify the generation of motor commands and they have been shown to explain a large fraction of the variation in the muscle patterns across a variety of conditions. However, whether human subjects are able to control forces and movements effectively with a small set of synergies has not been tested directly. Here we show that muscle synergies can be used to generate target forces in multiple directions with the same accuracy achieved using individual muscles. We recorded electromyographic (EMG) activity from 13 arm muscles and isometric hand forces during a force reaching task in a virtual environment. From these data we estimated the force associated to each muscle by linear regression and we identified muscle synergies by non-negative matrix factorization. We compared trajectories of a virtual mass displaced by the force estimated using the entire set of recorded EMGs to trajectories obtained using 4–5 muscle synergies. While trajectories were similar, when feedback was provided according to force estimated from recorded EMGs (EMG-control) on average trajectories generated with the synergies were less accurate. However, when feedback was provided according to recorded force (force-control) we did not find significant differences in initial angle error and endpoint error. We then tested whether synergies could be used as effectively as individual muscles to control cursor movement in the force reaching task by providing feedback according to force estimated from the projection of the recorded EMGs into synergy space (synergy-control). Human subjects were able to perform the task immediately after switching from force-control to EMG-control and synergy-control and we found no differences between initial movement direction errors and endpoint errors in all control modes. These results indicate that muscle synergies provide an effective strategy for motor coordination. PMID:24860489
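The synergy-extraction step described in the abstract (non-negative matrix factorization of the EMG matrix) can be sketched with the classic multiplicative-update algorithm. The synthetic data below are an assumption for illustration; only the dimensions (13 muscles, 4 synergies) echo the study.

```python
# Illustrative sketch (not the study's code): extract muscle synergies
# from a nonnegative EMG matrix via multiplicative-update NMF.
import numpy as np

def nmf(V, n_synergies, n_iter=500, seed=0):
    """Factorize nonnegative V (muscles x samples) as W @ H, where
    W holds synergy weight vectors and H their activation coefficients."""
    rng = np.random.default_rng(seed)
    n_m, n_s = V.shape
    W = rng.random((n_m, n_synergies)) + 1e-6
    H = rng.random((n_synergies, n_s)) + 1e-6
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-12)   # multiplicative updates
        W *= (V @ H.T) / (W @ H @ H.T + 1e-12)   # preserve nonnegativity
    return W, H

# Synthetic check: data built from 4 known synergies is well reconstructed.
rng = np.random.default_rng(1)
W_true = rng.random((13, 4))        # 13 muscles, 4 synergies
H_true = rng.random((4, 200))
V = W_true @ H_true
W, H = nmf(V, 4)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

Projecting recorded EMGs into the column space of W is the "synergy space" used for the synergy-control feedback condition.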
A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems
NASA Technical Reports Server (NTRS)
Hatanaka, Iwao
2000-01-01
The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.
Safety Metrics for Human-Computer Controlled Systems
NASA Technical Reports Server (NTRS)
Leveson, Nancy G; Hatanaka, Iwao
2000-01-01
The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.
Human error in airway facilities.
DOT National Transportation Integrated Search
2001-01-01
This report examines human errors in Airway Facilities (AF) with the intent of preventing these errors from being passed on to the new Operations Control Centers. To effectively manage errors, they first have to be identified. Human factors engin...
Obstetric Neuraxial Drug Administration Errors: A Quantitative and Qualitative Analytical Review.
Patel, Santosh; Loveridge, Robert
2015-12-01
Drug administration errors in obstetric neuraxial anesthesia can have devastating consequences. Although fully recognizing that they represent "only the tip of the iceberg," published case reports/series of these errors were reviewed in detail with the aim of estimating the frequency and the nature of these errors. We identified case reports and case series from MEDLINE and performed a quantitative analysis of the involved drugs, error setting, source of error, the observed complications, and any therapeutic interventions. We subsequently performed a qualitative analysis of the human factors involved and proposed modifications to practice. Twenty-nine cases were identified. Various drugs were given in error, but no direct effects on the course of labor, mode of delivery, or neonatal outcome were reported. Four maternal deaths from the accidental intrathecal administration of tranexamic acid were reported, all occurring after delivery of the fetus. A range of hemodynamic and neurologic signs and symptoms were noted, but the most commonly reported complication was the failure of the intended neuraxial anesthetic technique. Several human factors were present; most common factors were drug storage issues and similar drug appearance. Four practice recommendations were identified as being likely to have prevented the errors. The reported errors exposed latent conditions within health care systems. We suggest that the implementation of the following processes may decrease the risk of these types of drug errors: (1) Careful reading of the label on any drug ampule or syringe before the drug is drawn up or injected; (2) labeling all syringes; (3) checking labels with a second person or a device (such as a barcode reader linked to a computer) before the drug is drawn up or administered; and (4) use of non-Luer lock connectors on all epidural/spinal/combined spinal-epidural devices. 
Further study is required to determine whether routine use of these processes will reduce drug error.
Chiu, Ming-Chuan; Hsieh, Min-Chih
2016-05-01
The purposes of this study were to develop a latent human error analysis process, to explore the factors of latent human error in aviation maintenance tasks, and to provide an efficient improvement strategy for addressing those errors. First, we used HFACS and RCA to define the error factors related to aviation maintenance tasks. Fuzzy TOPSIS with four criteria was applied to evaluate the error factors. Results show that 1) adverse physiological states, 2) physical/mental limitations, and 3) coordination, communication, and planning are the factors related to airline maintenance tasks that could be addressed most easily and efficiently. This research establishes a new analytic process for investigating latent human error and provides a strategy for analyzing human error using fuzzy TOPSIS. Our analysis process addresses gaps in existing methodologies by incorporating improvement efficiency, and it enhances the depth and breadth of human error analysis methodology. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
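The ranking step at the core of fuzzy TOPSIS can be sketched with the crisp TOPSIS procedure it generalizes; the study's fuzzy numbers and actual four criteria are replaced here by invented scores and weights for illustration.

```python
# Sketch of crisp TOPSIS: rank alternatives by relative closeness to the
# ideal solution. All scores, weights, and criteria below are hypothetical.
import numpy as np

def topsis(scores, weights, benefit):
    """scores: alternatives x criteria matrix; weights sum to 1;
    benefit[j] is True when a higher score on criterion j is better.
    Returns the closeness coefficient per alternative (higher = better)."""
    norm = scores / np.linalg.norm(scores, axis=0)   # vector normalization
    v = norm * weights                               # weighted matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)        # distance to ideal
    d_neg = np.linalg.norm(v - anti, axis=1)         # distance to anti-ideal
    return d_neg / (d_pos + d_neg)

# Three hypothetical latent-error factors scored on four criteria.
scores = np.array([[7, 8, 6, 9],
                   [5, 6, 7, 4],
                   [9, 5, 8, 7]], dtype=float)
cc = topsis(scores, np.array([0.3, 0.2, 0.2, 0.3]), np.array([True] * 4))
```

The fuzzy variant carries triangular fuzzy scores through the same normalize-weight-distance pipeline before defuzzifying the closeness coefficient.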
The effects of training on errors of perceived direction in perspective displays
NASA Technical Reports Server (NTRS)
Tharp, Gregory K.; Ellis, Stephen R.
1990-01-01
An experiment was conducted to determine the effects of training on the characteristic direction errors that are observed when subjects estimate exocentric directions on perspective displays. Changes in five subjects' perceptual errors were measured during a training procedure designed to eliminate the error. The training was provided by displaying to each subject both the sign and the direction of his judgment error. The feedback provided by the error display was found to decrease but not eliminate the error. A lookup table model of the source of the error was developed in which the judgment errors were attributed to overestimates of both the pitch and the yaw of the viewing direction used to produce the perspective projection. The model predicts the quantitative characteristics of the data somewhat better than previous models did. A mechanism is proposed for the observed learning, and further tests of the model are suggested.
De Rosario, Helios; Page, Álvaro; Besa, Antonio
2017-09-06
The accurate location of the main axes of rotation (AoR) is a crucial step in many applications of human movement analysis. There are different formal methods to determine the direction and position of the AoR, whose performance varies across studies, depending on the pose and the source of errors. Most methods are based on minimizing squared differences between observed and modelled marker positions or rigid motion parameters, implicitly assuming independent and uncorrelated errors, but the largest error usually results from soft tissue artefacts (STA), which do not have such statistical properties and are not effectively cancelled out by such methods. However, with adequate methods it is possible to assume that STA only account for a small fraction of the observed motion and to obtain explicit formulas through differential analysis that relate STA components to the resulting errors in AoR parameters. In this paper such formulas are derived for three different functional calibration techniques (Geometric Fitting, mean Finite Helical Axis, and SARA), to explain why each technique behaves differently from the others, and to propose strategies to compensate for those errors. These techniques were tested with published data from a sit-to-stand activity, where the true axis was defined using bi-planar fluoroscopy. All the methods were able to estimate the direction of the AoR with an error of less than 5°, whereas there were errors in the location of the axis of 30-40 mm. Such location errors could be reduced to less than 17 mm by the methods based on equations that use rigid motion parameters (mean Finite Helical Axis, SARA) when the translation component was calculated using the three markers nearest to the axis. Copyright © 2017 Elsevier Ltd. All rights reserved.
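One ingredient of the mean finite helical axis technique named above is recovering the axis direction from a measured rotation matrix. A minimal sketch, assuming noise-free rigid rotations about a shared axis; the test motions are invented, not the paper's fluoroscopy data:

```python
import numpy as np

def helical_axis_direction(R):
    """Unit direction of the finite helical axis of rotation matrix R.
    Uses the Rodrigues identity: the skew-symmetric part of R encodes
    2*sin(theta) times the rotation axis."""
    v = np.array([R[2, 1] - R[1, 2],
                  R[0, 2] - R[2, 0],
                  R[1, 0] - R[0, 1]])
    return v / np.linalg.norm(v)

def rot_z(theta):
    """Rotation about the z axis by angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# A mean-FHA estimate averages the axes of many finite rotations;
# here all rotations share the z axis, so the mean is exact.
axes = [helical_axis_direction(rot_z(t)) for t in (0.1, 0.2, 0.3)]
mean_axis = np.mean(axes, axis=0)
print(mean_axis)  # ~ [0, 0, 1]
```

With real marker data the per-frame axes are corrupted by soft tissue artefacts, which is why the paper derives how STA components propagate into the averaged axis.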
Information systems and human error in the lab.
Bissell, Michael G
2004-01-01
Health system costs in clinical laboratories are incurred daily due to human error. Indeed, a major impetus for automating clinical laboratories has always been the opportunity it presents to simultaneously reduce cost and improve quality of operations by decreasing human error. But merely automating these processes is not enough. To the extent that the introduction of these systems results in operators having less practice in dealing with unexpected events, or becoming deskilled in problem-solving, new kinds of error will likely appear. Clinical laboratories could potentially benefit by integrating findings on human error from modern behavioral science into their operations. Fully understanding human error requires a deep understanding of human information processing and cognition. Predicting and preventing negative consequences requires application of this understanding to laboratory operations. Although the occurrence of a particular error at a particular instant cannot be absolutely prevented, human error rates can be reduced. The following principles are key: understand the process of learning in relation to error; understand the origin of errors, since this knowledge can be used to reduce their occurrence; design systems that are forgiving to the operator by absorbing errors, at least for a time; apply the expertise of industrial psychologists on writing operating procedures and instructions in ways that reduce the probability of error, expertise that is hardly ever put to use in the laboratory; and design a feedback mechanism into the system that enables the operator to recognize in real time that an error has occurred.
First-order approximation error analysis of Risley-prism-based beam directing system.
Zhao, Yanyan; Yuan, Yan
2014-12-01
To improve the performance of a Risley-prism system for optical detection and measuring applications, it is necessary to be able to determine the direction of the outgoing beam with high accuracy. In previous works, error sources and their impact on the performance of the Risley-prism system have been analyzed, but their numerical approximation accuracy was not high. In addition, pointing error analyses of the Risley-prism system have provided results only for the case when the component errors, prism orientation errors, and assembly errors are known. In this work, a prototype of a Risley-prism system was designed. The first-order approximations of the error analysis were derived and compared with the exact results. The directing errors of a Risley-prism system associated with wedge-angle errors, prism mounting errors, and bearing assembly errors were analyzed based on the exact formula and the first-order approximation. The comparisons indicated that our first-order approximation is accurate. In addition, the combined errors produced by the wedge-angle errors and mounting errors of the two prisms together were derived and in both cases were proved to be the sum of the errors caused by the first and the second prism separately. Based on these results, the system error of our prototype was estimated. The derived formulas can be implemented to evaluate the beam directing errors of any Risley-prism beam directing system with a similar configuration.
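The flavor of such a first-order comparison can be shown on the simplest case: a single thin wedge with the incoming ray normal to its first face, where the exact deviation is arcsin(n sin α) − α and the first-order approximation is (n − 1)α. The index and wedge angle below are hypothetical, and this is not the paper's full two-prism formula:

```python
from math import asin, sin, radians, degrees

def deviation_exact(n, alpha):
    """Exact deviation of a thin wedge for a ray entering normal to the
    first face, so the internal angle of incidence at exit is alpha."""
    return asin(n * sin(alpha)) - alpha

def deviation_first_order(n, alpha):
    """Small-angle (first-order) approximation: delta ~ (n - 1) * alpha."""
    return (n - 1) * alpha

n, alpha = 1.517, radians(2.0)          # hypothetical wedge: n ~ BK7, 2 deg angle
exact = deviation_exact(n, alpha)
approx = deviation_first_order(n, alpha)
print(degrees(exact), degrees(approx))  # the two agree to roughly 1e-3 degree
```

For small wedge angles the residual is third order in alpha, which is why first-order formulas can be accurate enough for directing-error budgets like the one in the abstract.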
Lessons Learned the Hard Way but Learned Well
ERIC Educational Resources Information Center
Dirksen, Debra J.
2014-01-01
The author spins a tale of how she learned classroom management largely by trial and error and by making a commitment to never give up on her students. Classroom management done well provides the signposts that give students direction and enables them to reach their destination as learners and human beings. Classroom management is one of the most…
NASA Technical Reports Server (NTRS)
Johnson, C. W.; Holloway, C. M.
2007-01-01
Accident reports provide important insights into the causes and contributory factors leading to particular adverse events. In contrast, this paper provides an analysis that extends across the findings presented over ten years of investigations into maritime accidents by both the US National Transportation Safety Board (NTSB) and the Canadian Transportation Safety Board (TSB). The purpose of the study was to assess the comparative frequency of a range of causal factors in the reporting of adverse events. In order to communicate our findings, we introduce J-H graphs as a means of representing the proportion of causes and contributory factors associated with human error, equipment failure, and other high-level classifications in longitudinal studies of accident reports. Our results suggest that the proportion of causal and contributory factors attributable to direct human error may be very much smaller than has been suggested elsewhere in the human factors literature. In contrast, more attention should be paid to wider systemic issues, including the managerial and regulatory context of maritime operations.
NASA Technical Reports Server (NTRS)
Wilmington, R. P.; Klute, Glenn K. (Editor); Carroll, Amy E. (Editor); Stuart, Mark A. (Editor); Poliner, Jeff (Editor); Rajulu, Sudhakar (Editor); Stanush, Julie (Editor)
1992-01-01
Kinematics, the study of motion exclusive of the influences of mass and force, is one of the primary methods used for the analysis of human biomechanical systems as well as other types of mechanical systems. The Anthropometry and Biomechanics Laboratory (ABL) in the Crew Interface Analysis section of the Man-Systems Division performs both human body kinematics and mechanical system kinematics using the Ariel Performance Analysis System (APAS). The APAS supports both analysis of analog signals (e.g., force plate data collection) and digitization and analysis of video data. The current evaluations address several methodology issues concerning the accuracy of the kinematic data collection and analysis used in the ABL. This document describes a series of evaluations performed to gain quantitative data pertaining to position and constant angular velocity movements under several operating conditions. Two-dimensional as well as three-dimensional data collection and analyses were completed in a controlled laboratory environment using typical hardware setups. In addition, an evaluation was performed to quantify the accuracy impact due to a single-axis camera offset. Segment length and positional data exhibited errors within 3 percent when using three-dimensional analysis and yielded errors within 8 percent through two-dimensional analysis (Direct Linear Software). Peak angular velocities displayed errors within 6 percent through three-dimensional analyses and exhibited errors of 12 percent when using two-dimensional analysis (Direct Linear Software). The specific results from this series of evaluations and their impacts on the methodology issues of kinematic data collection and analyses are presented in detail. The accuracy levels observed in these evaluations are also presented.
Dotette: Programmable, high-precision, plug-and-play droplet pipetting.
Fan, Jinzhen; Men, Yongfan; Hao Tseng, Kuo; Ding, Yi; Ding, Yunfeng; Villarreal, Fernando; Tan, Cheemeng; Li, Baoqing; Pan, Tingrui
2018-05-01
Manual micropipettes are the most heavily used liquid handling devices in biological and chemical laboratories; however, they suffer from low precision for volumes under 1 μl and inevitable human errors. For a manual device, the human errors introduced pose potential risks of failed experiments, inaccurate results, and financial costs. Meanwhile, low precision under 1 μl can cause severe quantification errors and high heterogeneity of outcomes, becoming a bottleneck of reaction miniaturization for quantitative research in biochemical labs. Here, we report Dotette, a programmable, plug-and-play microfluidic pipetting device based on nanoliter liquid printing. With automated control, protocols designed on computers can be directly downloaded into Dotette, enabling programmable operation processes. Utilizing continuous nanoliter droplet dispensing, the precision of the volume control has been successfully improved from the traditional 20%-50% to less than 5% in the range of 100 nl to 1000 nl. Such a highly automated, plug-and-play add-on to existing pipetting devices not only improves precise quantification in low-volume liquid handling and reduces chemical consumption but also facilitates and automates a variety of biochemical and biological operations.
Model and experiments to optimize co-adaptation in a simplified myoelectric control system.
Couraud, M; Cattaert, D; Paclet, F; Oudeyer, P Y; de Rugy, A
2018-04-01
To compensate for a limb lost in an amputation, myoelectric prostheses use surface electromyography (EMG) from the remaining muscles to control the prosthesis. Despite considerable progress, myoelectric controls remain markedly different from the way we normally control movements, and require intense user adaptation. To overcome this, our goal is to explore concurrent machine co-adaptation techniques that are developed in the field of brain-machine interface, and that are beginning to be used in myoelectric controls. We combined a simplified myoelectric control with a perturbation for which human adaptation is well characterized and modeled, in order to explore co-adaptation settings in a principled manner. First, we reproduced results obtained in a classical visuomotor rotation paradigm in our simplified myoelectric context, where we rotate the muscle pulling vectors used to reconstruct wrist force from EMG. Then, a model of human adaptation in response to directional error was used to simulate various co-adaptation settings, where perturbations and machine co-adaptation are both applied on muscle pulling vectors. These simulations established that a relatively low gain of machine co-adaptation that minimizes final errors generates slow and incomplete adaptation, while higher gains increase adaptation rate but also errors by amplifying noise. After experimental verification on real subjects, we tested a variable gain that combines the advantages of both, and implemented it with directionally tuned neurons similar to those used to model human adaptation. This enables machine co-adaptation to locally improve myoelectric control, and to absorb more challenging perturbations. The simplified context used here enabled us to explore co-adaptation settings in both simulations and experiments, and to raise important considerations such as the need for a variable gain encoded locally.
The benefits and limits of extending this approach to more complex and functional myoelectric contexts are discussed.
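A toy version of the co-adaptation loop can illustrate the gain trade-off discussed above. The sketch assumes a scalar directional error with simple error-proportional updates for both the human and the machine; the learning rate, perturbation size, and noise level are invented, and the model is far simpler than the paper's directionally tuned one:

```python
import random

def simulate(gain, trials=200, perturb=30.0, learn=0.2, noise=2.0, seed=1):
    """Toy co-adaptation loop: human correction x and machine correction m
    both update in proportion to the signed directional error (degrees)."""
    rng = random.Random(seed)
    x = m = 0.0
    errors = []
    for _ in range(trials):
        e = perturb - x - m + rng.gauss(0.0, noise)
        errors.append(e)
        x += learn * e          # human trial-by-trial adaptation
        m += gain * e           # machine co-adaptation on the same error
    return errors

early = lambda es: sum(abs(e) for e in es[:20]) / 20  # mean early-trial error

for g in (0.0, 0.05, 0.4):
    es = simulate(g)
    print(g, round(early(es), 1))  # higher machine gain -> faster early error reduction
```

Even this linear toy shows why a fixed gain is a compromise, motivating the variable gain the authors test: a high gain early for speed, reduced later so the machine does not chase noise.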
Stochastic Models of Human Errors
NASA Technical Reports Server (NTRS)
Elshamy, Maged; Elliott, Dawn M. (Technical Monitor)
2002-01-01
Humans play an important role in the overall reliability of engineering systems, and accidents and system failures are more often than not traced to human errors. Therefore, in order to have a meaningful system risk analysis, the reliability of the human element must be taken into consideration. Describing the human error process by mathematical models is a key to analyzing contributing factors. The objective of this research effort is to establish stochastic models, substantiated by a sound theoretical foundation, to address the occurrence of human errors in the processing of the Space Shuttle.
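The report's specific models are not detailed in the abstract; as a minimal example of a stochastic error model, a homogeneous Poisson process gives the probability of at least one error in a time window. The rate and shift length below are hypothetical:

```python
from math import exp

def p_at_least_one_error(rate_per_hour, hours):
    """Homogeneous Poisson model of error occurrence:
    P(N >= 1 in time t) = 1 - exp(-lambda * t)."""
    return 1.0 - exp(-rate_per_hour * hours)

# Hypothetical numbers: an error rate of 0.002 per task-hour over an 8-hour shift.
print(round(p_at_least_one_error(0.002, 8), 4))  # 0.0159
```

More realistic models would let the rate vary with workload or fatigue (a non-homogeneous process), but the constant-rate case already supports the kind of risk roll-up the abstract motivates.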
NASA Technical Reports Server (NTRS)
Diorio, Kimberly A.; Voska, Ned (Technical Monitor)
2002-01-01
This viewgraph presentation provides information on Human Factors Process Failure Modes and Effects Analysis (HF PFMEA). HF PFMEA includes the following 10 steps: (1) describe mission; (2) define system; (3) identify human-machine; (4) list human actions; (5) identify potential errors; (6) identify factors that affect error; (7) determine likelihood of error; (8) determine potential effects of errors; (9) evaluate risk; (10) generate solutions (manage error). The presentation also describes how this analysis was applied to a liquid oxygen pump acceptance test.
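Steps 7 through 9 (likelihood, effects, risk) are often operationalized as a likelihood-times-severity score, with mitigation effort directed at the highest-risk actions. A minimal sketch with invented actions and 1-5 scales, not the actual HF PFMEA worksheet:

```python
# Hypothetical human actions with likelihood and severity on 1-5 scales.
errors = [
    {"action": "open wrong valve",       "likelihood": 4, "severity": 5},
    {"action": "skip checklist item",    "likelihood": 3, "severity": 2},
    {"action": "misread pressure gauge", "likelihood": 2, "severity": 4},
]

# Step 9: evaluate risk as likelihood x severity for each potential error.
for e in errors:
    e["risk"] = e["likelihood"] * e["severity"]

# Step 10: generate solutions starting with the highest-risk action.
worst = max(errors, key=lambda e: e["risk"])
print(worst["action"], worst["risk"])  # open wrong valve 20
```

Ranking by the product keeps attention on errors that are both plausible and consequential, rather than on frequency or severity alone.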
NASA Technical Reports Server (NTRS)
Mcruer, D. T.; Clement, W. F.; Allen, R. W.
1981-01-01
Human errors tend to be treated in terms of clinical and anecdotal descriptions, from which remedial measures are difficult to derive. Correction of the sources of human error requires an attempt to reconstruct underlying and contributing causes of error from the circumstantial causes cited in official investigative reports. A comprehensive analytical theory of the cause-effect relationships governing propagation of human error is indispensable to a reconstruction of the underlying and contributing causes. A validated analytical theory of the input-output behavior of human operators involving manual control, communication, supervisory, and monitoring tasks which are relevant to aviation, maritime, automotive, and process control operations is highlighted. This theory of behavior, both appropriate and inappropriate, provides an insightful basis for investigating, classifying, and quantifying the needed cause-effect relationships governing propagation of human error.
NASA Technical Reports Server (NTRS)
Mcruer, D. T.; Clement, W. F.; Allen, R. W.
1980-01-01
Human error, a significant contributing factor in a very high proportion of civil transport, general aviation, and rotorcraft accidents, is investigated. Correction of the sources of human error requires that one attempt to reconstruct underlying and contributing causes of error from the circumstantial causes cited in official investigative reports. A validated analytical theory of the input-output behavior of human operators involving manual control, communication, supervisory, and monitoring tasks which are relevant to aviation operations is presented. This theory of behavior, both appropriate and inappropriate, provides an insightful basis for investigating, classifying, and quantifying the needed cause-effect relationships governing propagation of human error.
Human Reliability and the Cost of Doing Business
NASA Technical Reports Server (NTRS)
DeMott, Diana
2014-01-01
Most businesses recognize that people will make mistakes and assume errors are just part of the cost of doing business, but do they need to be? Companies with high risk, or major consequences, should consider the effect of human error. In a variety of industries, human errors have caused costly failures and workplace injuries: airline mishaps, medical malpractice, errors in the administration of medication, and major oil spills have all been blamed on human error. A technique to mitigate or even eliminate some of these costly human errors is the use of Human Reliability Analysis (HRA). Various methodologies are available to perform Human Reliability Assessments, ranging from identifying the most likely areas for concern to detailed assessments in which human error failure probabilities are calculated. Which methodology to use would be based on a variety of factors, including: 1) how people react and act in different industries, and differing expectations based on industry standards; 2) factors that influence how human errors could occur, such as tasks, tools, environment, workplace, support, training, and procedures; 3) the type and availability of data; and 4) how the industry views risk and reliability influences (types of emergencies, contingencies, and routine tasks versus cost-based concerns). A Human Reliability Assessment should be the first step to reduce, mitigate, or eliminate costly mistakes or catastrophic failures. Using Human Reliability techniques to identify and classify human error risks allows a company more opportunities to mitigate or eliminate these risks and prevent costly failures.
Human Reliability and the Cost of Doing Business
NASA Technical Reports Server (NTRS)
DeMott, D. L.
2014-01-01
Human error cannot be defined unambiguously in advance of it happening; an action often becomes an error only after the fact. The same action can result in a tragic accident in one situation or count as heroic given a more favorable outcome. People often forget that we employ humans in business and industry for their flexibility and capability to change when needed. In complex systems, operations are driven by the specifications of the system and the system structure; people provide the flexibility to make it work. Human error has been reported as being responsible for 60%-80% of failures, accidents, and incidents in high-risk industries. We don't have to accept that all human errors are inevitable. Through the use of some basic techniques, many potential human error events can be addressed. There are actions that can be taken to reduce the risk of human error.
Leach, Julia M; Mancini, Martina; Peterka, Robert J; Hayes, Tamara L; Horak, Fay B
2014-09-29
The Nintendo Wii balance board (WBB) has generated significant interest in its application as a postural control measurement device in both the clinical and (basic, clinical, and rehabilitation) research domains. Although the WBB has been proposed as an alternative to the "gold standard" laboratory-grade force plate, additional research is necessary before the WBB can be considered a valid and reliable center of pressure (CoP) measurement device. In this study, we used the WBB and a laboratory-grade AMTI force plate (AFP) to simultaneously measure the CoP displacement of a controlled dynamic load, which has not been done before. A one-dimensional inverted pendulum was displaced at several different displacement angles and load heights to simulate a variety of postural sway amplitudes and frequencies (<1 Hz). Twelve WBBs were tested to address the issue of inter-device variability. There was a significant effect of sway amplitude, frequency, and direction on the WBB's CoP measurement error, with an increase in error as both sway amplitude and frequency increased and a significantly greater error in the mediolateral (ML) (compared to the anteroposterior (AP)) sway direction. There was no difference in error across the 12 WBBs, supporting low inter-device variability. A linear calibration procedure was then implemented to correct the WBB's CoP signals and reduce measurement error. There was a significant effect of calibration on the WBB's CoP signal accuracy, with a significant reduction in CoP measurement error (quantified by root-mean-squared error) from 2-6 mm (before calibration) to 0.5-2 mm (after calibration). WBB-based CoP signal calibration also significantly reduced the percent error in derived (time-domain) CoP sway measures, from -10.5% (before calibration) to -0.05% (after calibration) (percent errors averaged across all sway measures and in both sway directions).
In this study, we characterized the WBB's CoP measurement error under controlled, dynamic conditions and implemented a linear calibration procedure for WBB CoP signals that is recommended to reduce CoP measurement error and provide more reliable estimates of time-domain CoP measures. Despite our promising results, additional work is necessary to understand how our findings translate to the clinical and rehabilitation research domains. Once the WBB's CoP measurement error is fully characterized in human postural sway (which differs from our simulated postural sway in both amplitude and frequency content), it may be used to measure CoP displacement in situations where lower accuracy and precision is acceptable.
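The linear calibration step can be sketched with synthetic data: fit the reference signal onto the WBB signal by least squares and apply the resulting gain and offset. The gain and offset errors below are invented for illustration, not the WBB's measured characteristics:

```python
import numpy as np

rng = np.random.default_rng(0)
true_cop = np.linspace(-40, 40, 200)                        # mm, simulated sway
wbb_cop = 1.12 * true_cop + 3.0 + rng.normal(0, 0.5, 200)   # hypothetical gain + offset error

def rmse(a, b):
    """Root-mean-squared error between two signals."""
    return float(np.sqrt(np.mean((a - b) ** 2)))

# Linear calibration: least-squares fit of the reference signal onto the WBB signal,
# then apply the fitted gain and offset to correct the WBB output.
gain, offset = np.polyfit(wbb_cop, true_cop, 1)
calibrated = gain * wbb_cop + offset

print(rmse(wbb_cop, true_cop), rmse(calibrated, true_cop))  # error drops sharply
```

This mirrors the study's finding that a simple linear correction removes most of the systematic (gain and offset) component of the WBB's CoP error, leaving only the noise floor.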
Leach, Julia M.; Mancini, Martina; Peterka, Robert J.; Hayes, Tamara L.; Horak, Fay B.
2014-01-01
The Nintendo Wii balance board (WBB) has generated significant interest in its application as a postural control measurement device in both the clinical and (basic, clinical, and rehabilitation) research domains. Although the WBB has been proposed as an alternative to the “gold standard” laboratory-grade force plate, additional research is necessary before the WBB can be considered a valid and reliable center of pressure (CoP) measurement device. In this study, we used the WBB and a laboratory-grade AMTI force plate (AFP) to simultaneously measure the CoP displacement of a controlled dynamic load, which has not been done before. A one-dimensional inverted pendulum was displaced at several different displacement angles and load heights to simulate a variety of postural sway amplitudes and frequencies (<1 Hz). Twelve WBBs were tested to address the issue of inter-device variability. There was a significant effect of sway amplitude, frequency, and direction on the WBB's CoP measurement error, with an increase in error as both sway amplitude and frequency increased and a significantly greater error in the mediolateral (ML) (compared to the anteroposterior (AP)) sway direction. There was no difference in error across the 12 WBBs, supporting low inter-device variability. A linear calibration procedure was then implemented to correct the WBB's CoP signals and reduce measurement error. There was a significant effect of calibration on the WBB's CoP signal accuracy, with a significant reduction in CoP measurement error (quantified by root-mean-squared error) from 2–6 mm (before calibration) to 0.5–2 mm (after calibration). WBB-based CoP signal calibration also significantly reduced the percent error in derived (time-domain) CoP sway measures, from −10.5% (before calibration) to −0.05% (after calibration) (percent errors averaged across all sway measures and in both sway directions).
In this study, we characterized the WBB's CoP measurement error under controlled, dynamic conditions and implemented a linear calibration procedure for WBB CoP signals that is recommended to reduce CoP measurement error and provide more reliable estimates of time-domain CoP measures. Despite our promising results, additional work is necessary to understand how our findings translate to the clinical and rehabilitation research domains. Once the WBB's CoP measurement error is fully characterized in human postural sway (which differs from our simulated postural sway in both amplitude and frequency content), it may be used to measure CoP displacement in situations where lower accuracy and precision is acceptable. PMID:25268919
Human Error Analysis in a Permit to Work System: A Case Study in a Chemical Plant
Jahangiri, Mehdi; Hoboubi, Naser; Rostamabadi, Akbar; Keshavarzi, Sareh; Hosseini, Ali Akbar
2015-01-01
Background: A permit to work (PTW) is a formal written system to control certain types of work which are identified as potentially hazardous. However, human error in PTW processes can lead to an accident. Methods: This cross-sectional, descriptive study was conducted to estimate the probability of human errors in PTW processes in a chemical plant in Iran. In the first stage, through interviewing the personnel and studying the procedure in the plant, the PTW process was analyzed using the hierarchical task analysis technique. In doing so, PTW was considered as a goal and detailed tasks to achieve the goal were analyzed. In the next step, the standardized plant analysis risk-human (SPAR-H) reliability analysis method was applied for estimation of human error probability. Results: The mean probability of human error in the PTW system was estimated to be 0.11. The highest probability of human error in the PTW process was related to flammable gas testing (50.7%). Conclusion: The SPAR-H method applied in this study could analyze and quantify the potential human errors and extract the required measures for reducing the error probabilities in the PTW system. Some suggestions to reduce the likelihood of errors, especially by modifying the performance shaping factors and dependencies among tasks, are provided. PMID:27014485
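SPAR-H quantifies a human error probability (HEP) by scaling a nominal HEP by performance shaping factor (PSF) multipliers, with an adjustment factor applied when three or more PSFs are negative so the result stays below 1.0. The sketch below uses invented PSF values, not the study's; the 0.001 nominal is the SPAR-H nominal HEP for action tasks:

```python
def spar_h_hep(nhep, psf_multipliers):
    """SPAR-H human error probability: the nominal HEP scaled by the product
    of PSF multipliers, with the standard adjustment applied when three or
    more PSFs are negative (multiplier > 1)."""
    composite = 1.0
    for m in psf_multipliers:
        composite *= m
    negative = sum(1 for m in psf_multipliers if m > 1)
    if negative >= 3:
        # Adjustment factor: NHEP * PSF / (NHEP * (PSF - 1) + 1)
        return nhep * composite / (nhep * (composite - 1.0) + 1.0)
    return min(nhep * composite, 1.0)

# Hypothetical action task with poor procedures (x5), high stress (x2), and
# poor ergonomics (x10); the remaining PSFs nominal (x1).
print(spar_h_hep(0.001, [5, 2, 10, 1, 1, 1, 1, 1]))  # ~0.091
```

Summing such task-level HEPs across the hierarchical task analysis is how a study like this one arrives at an overall PTW error probability.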
Effects of noise on the performance of a memory decision response task
NASA Technical Reports Server (NTRS)
Lawton, B. W.
1972-01-01
An investigation has been made to determine the effects of noise on human performance. Fourteen subjects performed a memory-decision-response task in relative quiet and while listening to tape recorded noises. Analysis of the data obtained indicates that performance was degraded in the presence of noise. Significant increases in problem solution times were found for impulsive noise conditions as compared with times found for the no-noise condition. Performance accuracy was also degraded. Significantly more error responses occurred at higher noise levels; a direct or positive relation was found between error responses and noise level experienced by the subjects.
Managing human fallibility in critical aerospace situations
NASA Astrophysics Data System (ADS)
Tew, Larry
2014-11-01
Human fallibility is pervasive in the aerospace industry, with over 50% of failures attributed to human error. Consider the benefits to any organization if those errors were significantly reduced. Aerospace manufacturing involves high-value, high-profile systems with significant complexity and often repetitive build, assembly, and test operations. In spite of extensive analysis, planning, training, and detailed procedures, human factors can cause unexpected errors. Handling such errors involves extensive cause and corrective action analysis and invariably schedule slips and cost growth. We will discuss success stories, including those associated with electro-optical systems, where very significant reductions in human fallibility errors were achieved after receiving adapted and specialized training. In the eyes of company and customer leadership, the steps used to achieve these results led to a major culture change in both the workforce and the supporting management organization. This approach has proven effective in other industries like medicine, firefighting, law enforcement, and aviation. The roadmap to success and the steps to minimize human error are known. They can be used by any organization willing to accept human fallibility and take a proactive approach to incorporate the steps needed to manage and minimize error.
NASA Technical Reports Server (NTRS)
Beutter, Brent R.; Stone, Leland S.
1997-01-01
Although numerous studies have examined the relationship between smooth-pursuit eye movements and motion perception, it remains unresolved whether a common motion-processing system subserves both perception and pursuit. To address this question, we simultaneously recorded perceptual direction judgments and the concomitant smooth eye movement response to a plaid stimulus that we have previously shown generates systematic perceptual errors. We measured the perceptual direction biases psychophysically and the smooth eye-movement direction biases using two methods (standard averaging and oculometric analysis). We found that the perceptual and oculomotor biases were nearly identical, suggesting that pursuit and perception share a critical motion processing stage, perhaps in area MT or MST of extrastriate visual cortex.
NASA Technical Reports Server (NTRS)
Beutter, B. R.; Stone, L. S.
1998-01-01
Although numerous studies have examined the relationship between smooth-pursuit eye movements and motion perception, it remains unresolved whether a common motion-processing system subserves both perception and pursuit. To address this question, we simultaneously recorded perceptual direction judgments and the concomitant smooth eye-movement response to a plaid stimulus that we have previously shown generates systematic perceptual errors. We measured the perceptual direction biases psychophysically and the smooth eye-movement direction biases using two methods (standard averaging and oculometric analysis). We found that the perceptual and oculomotor biases were nearly identical, suggesting that pursuit and perception share a critical motion processing stage, perhaps in area MT or MST of extrastriate visual cortex.
Prediction of human errors by maladaptive changes in event-related brain networks.
Eichele, Tom; Debener, Stefan; Calhoun, Vince D; Specht, Karsten; Engel, Andreas K; Hugdahl, Kenneth; von Cramon, D Yves; Ullsperger, Markus
2008-04-22
Humans engaged in monotonous tasks are susceptible to occasional errors that may lead to serious consequences, but little is known about brain activity patterns preceding errors. Using functional MRI and applying independent component analysis followed by deconvolution of hemodynamic responses, we studied error preceding brain activity on a trial-by-trial basis. We found a set of brain regions in which the temporal evolution of activation predicted performance errors. These maladaptive brain activity changes started to evolve approximately 30 sec before the error. In particular, a coincident decrease of deactivation in default mode regions of the brain, together with a decline of activation in regions associated with maintaining task effort, raised the probability of future errors. Our findings provide insights into the brain network dynamics preceding human performance errors and suggest that monitoring of the identified precursor states may help in avoiding human errors in critical real-world situations.
Steiner, Lisa; Burgess-Limerick, Robin; Porter, William
2014-03-01
The authors examine the pattern of direction errors made during the manipulation of a physical simulation of an underground coal mine bolting machine to assess the directional control-response compatibility relationships associated with the device and to compare these results to data obtained from a virtual simulation of a generic device. Directional errors during the manual control of underground coal roof bolting equipment are associated with serious injuries. Directional control-response relationships have previously been examined using a virtual simulation of a generic device; however, the applicability of these results to a specific physical device may be questioned. Forty-eight participants randomly assigned to different directional control-response relationships manipulated horizontal or vertical control levers to move a simulated bolter arm in three directions (elevation, slew, and sump) as well as to cause a light to become illuminated and to raise or lower a stabilizing jack. Directional errors were recorded during the completion of 240 trials by each participant. Directional error rates increased when the control and response were in opposite directions or when the directions of the control and response were perpendicular. The pattern of direction error rates was consistent with results obtained from a generic device in a virtual environment. Error rates are increased by incompatible directional control-response relationships. Ensuring that the design of equipment controls maintains compatible directional control-response relationships has the potential to reduce errors made in high-risk situations, such as underground coal mining.
Remediating Common Math Errors.
ERIC Educational Resources Information Center
Wagner, Rudolph F.
1981-01-01
Explanations and remediation suggestions for five types of mathematics errors due either to perceptual or cognitive difficulties are given. Error types include directionality problems, mirror writing, visually misperceived signs, diagnosed directionality problems, and mixed process errors. (CL)
Applying lessons learned to enhance human performance and reduce human error for ISS operations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, W.R.
1999-01-01
A major component of reliability, safety, and mission success for space missions is ensuring that the humans involved (flight crew, ground crew, mission control, etc.) perform their tasks and functions as required. This includes compliance with training and procedures during normal conditions, and successful compensation when malfunctions or unexpected conditions occur. A very significant issue that affects human performance in space flight is human error. Human errors can invalidate carefully designed equipment and procedures. If certain errors combine with equipment failures or design flaws, mission failure or loss of life can occur. The control of human error during operation of the International Space Station (ISS) will be critical to the overall success of the program. As experience from Mir operations has shown, human performance plays a vital role in the success or failure of long duration space missions. The Department of Energy's Idaho National Engineering and Environmental Laboratory (INEEL) is developing a systematic approach to enhance human performance and reduce human errors for ISS operations. This approach is based on the systematic identification and evaluation of lessons learned from past space missions such as Mir to enhance the design and operation of ISS. This paper will describe previous INEEL research on human error sponsored by NASA and how it can be applied to enhance human reliability for ISS. © 1999 American Institute of Physics.
Structured methods for identifying and correcting potential human errors in aviation operations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, W.R.
1997-10-01
Human errors have been identified as the source of approximately 60% of the incidents and accidents that occur in commercial aviation. It can be assumed that a very large number of human errors occur in aviation operations, even though in most cases the redundancies and diversities built into the design of aircraft systems prevent the errors from leading to serious consequences. In addition, when it is acknowledged that many system failures have their roots in human errors that occur in the design phase, it becomes apparent that the identification and elimination of potential human errors could significantly decrease the risks of aviation operations. This will become even more critical during the design of advanced automation-based aircraft systems as well as next-generation systems for air traffic management. Structured methods to identify and correct potential human errors in aviation operations have been developed and are currently undergoing testing at the Idaho National Engineering and Environmental Laboratory (INEEL).
On parameters identification of computational models of vibrations during quiet standing of humans
NASA Astrophysics Data System (ADS)
Barauskas, R.; Krušinskienė, R.
2007-12-01
Vibration of the center of pressure (COP) of the human body on the base of support during quiet standing is a widely used clinical measurement, which provides useful information about the physical and health condition of an individual. In this work, vibrations of the COP of a human body in the forward-backward direction during quiet standing are generated using a controlled inverted pendulum (CIP) model with a single degree of freedom (dof), supplied with a proportional-integral-derivative (PID) controller, which represents the behavior of the central nervous system of a human body, and excited by a cumulative disturbance vibration generated within the body by breathing or other physiological processes. The identification of the model and disturbance parameters is an important stage in creating a close-to-reality computational model able to evaluate features of the disturbance. The aim of this study is to present a CIP model parameter identification approach based on the information captured by the time series of the COP signal. The identification procedure is based on the minimization of an error function. The error function is formulated in terms of the time laws of computed and experimentally measured COP vibrations. As an alternative, the error function is formulated in terms of the stabilogram diffusion function (SDF). The minimization of the error functions is carried out by employing methods based on sensitivity functions of the error with respect to model and excitation parameters. The sensitivity functions are obtained by using variational techniques. The inverse dynamics approach has been employed in order to establish the properties of the disturbance time laws ensuring satisfactory coincidence of measured and computed COP vibration laws. The main difficulty of the investigated problem is encountered during the model validation stage, since generally neither the PID controller parameter set nor the disturbance time law is known in advance.
In this work, an error function formulated in terms of the time derivative of the disturbance torque has been proposed in order to obtain the PID controller parameters, as well as the reference time law of the disturbance. The disturbance torque is calculated from experimental data using the inverse dynamics approach. Experiments presented in this study revealed that the disturbance torque vibrations and PID controller parameters identified by the method may be regarded as physiologically feasible in humans. The presented approach may easily be extended to structural models with any number of dof or greater structural complexity.
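The identification scheme described in this abstract can be illustrated with a minimal numerical sketch. Everything below is an illustrative assumption (the body parameters, the PID gains, the white-noise disturbance, and the set of candidate gains); it demonstrates only the error-function-minimization idea, not the authors' actual model, data, or identified values.

```python
# Sketch: identify PID gains of a controlled-inverted-pendulum (CIP) sway model
# by minimizing a squared-error function between simulated and "measured" COP
# traces. All parameter values are illustrative assumptions.
import numpy as np

def simulate_cop(kp, ki, kd, t_end=30.0, dt=0.01):
    """Linearized single-dof inverted pendulum stabilized by a PID controller.
    Returns the sway-angle trace, used here as a proxy for the COP signal."""
    m, h, g, inertia = 70.0, 1.0, 9.81, 70.0   # assumed mass, COM height, inertia
    theta, omega, integ = 0.01, 0.0, 0.0
    rng = np.random.default_rng(1)             # fixed seed: same disturbance
    trace = []                                 # realization for every candidate
    for _ in range(int(t_end / dt)):
        disturbance = rng.normal(0.0, 1.0)     # internal noise (breathing etc.)
        torque = kp * theta + ki * integ + kd * omega
        alpha = (m * g * h * theta - torque + disturbance) / inertia
        omega += alpha * dt                    # explicit Euler integration
        theta += omega * dt
        integ += theta * dt
        trace.append(theta)
    return np.array(trace)

# Error-function-based identification: compare candidate parameter sets against
# a "measured" trace and keep the candidate with the smallest squared error.
measured = simulate_cop(kp=1500.0, ki=50.0, kd=300.0)
candidates = [(1500.0, 50.0, 300.0), (900.0, 50.0, 300.0)]
errors = [np.sum((simulate_cop(*c) - measured) ** 2) for c in candidates]
best = candidates[int(np.argmin(errors))]
```

A real implementation would search a continuous parameter space (the paper uses gradient information from sensitivity functions rather than a discrete candidate list), but the structure of the error function is the same.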
Kraemer, Sara; Carayon, Pascale
2007-03-01
This paper describes human errors and violations of end users and network administrators in computer and information security. This information is summarized in a conceptual framework for examining the human and organizational factors contributing to computer and information security. This framework includes human error taxonomies to describe the work conditions that contribute adversely to computer and information security, i.e., to security vulnerabilities and breaches. The issue of human error and violation in computer and information security was explored through a series of 16 interviews with network administrators and security specialists. The interviews were audio taped, transcribed, and analyzed by coding specific themes in a node structure. The result is an expanded framework that classifies types of human error and identifies specific human and organizational factors that contribute to computer and information security. Network administrators tended to view errors created by end users as more intentional than unintentional, while viewing errors created by network administrators as more unintentional than intentional. Organizational factors, such as communication, security culture, policy, and organizational structure, were the most frequently cited factors associated with computer and information security.
Intervention strategies for the management of human error
NASA Technical Reports Server (NTRS)
Wiener, Earl L.
1993-01-01
This report examines the management of human error in the cockpit. The principles probably apply as well to other applications in the aviation realm (e.g., air traffic control, dispatch, weather, etc.) as well as to other high-risk systems outside of aviation (e.g., shipping, high-technology medical procedures, military operations, nuclear power production). Management of human error is distinguished from error prevention. It is a more encompassing term, which includes not only the prevention of error, but also means of preventing an error, once made, from adversely affecting system output. Such techniques include: traditional human factors engineering, improvement of feedback and feedforward of information from system to crew, 'error-evident' displays which make erroneous input more obvious to the crew, trapping of errors within a system, goal-sharing between humans and machines (also called 'intent-driven' systems), paperwork management, and behaviorally based approaches, including procedures, standardization, checklist design, training, cockpit resource management, etc. Fifteen guidelines for the design and implementation of intervention strategies are included.
Cohen, Michael R; Smetzer, Judy L
2017-01-01
These medication errors have occurred in health care facilities at least once. They will happen again-perhaps where you work. Through education and alertness of personnel and procedural safeguards, they can be avoided. You should consider publishing accounts of errors in your newsletters and/or presenting them at your inservice training programs. Your assistance is required to continue this feature. The reports described here were received through the Institute for Safe Medication Practices (ISMP) Medication Errors Reporting Program. Any reports published by ISMP will be anonymous. Comments are also invited; the writers' names will be published if desired. ISMP may be contacted at the address shown below. Errors, close calls, or hazardous conditions may be reported directly to ISMP through the ISMP Web site (www.ismp.org), by calling 800-FAIL-SAFE, or via e-mail at ismpinfo@ismp.org. ISMP guarantees the confidentiality and security of the information received and respects reporters' wishes as to the level of detail included in publications.
Radiology's Achilles' heel: error and variation in the interpretation of the Röntgen image.
Robinson, P J
1997-11-01
The performance of the human eye and brain has failed to keep pace with the enormous technical progress in the first full century of radiology. Errors and variations in interpretation now represent the weakest aspect of clinical imaging. Those interpretations which differ from the consensus view of a panel of "experts" may be regarded as errors; where experts fail to achieve consensus, differing reports are regarded as "observer variation". Errors arise from poor technique, failures of perception, lack of knowledge and misjudgments. Observer variation is substantial and should be taken into account when different diagnostic methods are compared; in many cases the difference between observers outweighs the difference between techniques. Strategies for reducing error include attention to viewing conditions, training of the observers, availability of previous films and relevant clinical data, dual or multiple reporting, standardization of terminology and report format, and assistance from computers. Digital acquisition and display will probably not affect observer variation but the performance of radiologists, as measured by receiver operating characteristic (ROC) analysis, may be improved by computer-directed search for specific image features. Other current developments show that where image features can be comprehensively described, computer analysis can replace the perception function of the observer, whilst the function of interpretation can in some cases be performed better by artificial neural networks. However, computer-assisted diagnosis is still in its infancy and complete replacement of the human observer is as yet a remote possibility.
Evolution of gossip-based indirect reciprocity on a bipartite network
Giardini, Francesca; Vilone, Daniele
2016-01-01
Cooperation can be supported by indirect reciprocity via reputation. Thanks to gossip, reputations are built and circulated, and humans can identify defectors and ostracise them. However, the evolutionary stability of gossip is allegedly undermined by the fact that it is more error-prone than direct observation, whereas ostracism could be ineffective if the partner selection mechanism is not robust. The aim of this work is to investigate the conditions under which the combination of gossip and ostracism might support cooperation in groups of different sizes. We are also interested in exploring the extent to which errors in transmission might undermine the reliability of gossip as a mechanism for identifying defectors. Our results show that a large quantity of gossip is necessary to support cooperation, and that group structure can mitigate the effects of errors in transmission. PMID:27885256
Method and apparatus for determining weldability of thin sheet metal
Goodwin, Gene M.; Hudson, Joseph D.
1988-01-01
A fixture is provided for testing thin sheet metal specimens to evaluate hot-cracking sensitivity for determining metal weldability on a heat-to-heat basis or through varying welding parameters. A test specimen is stressed in a first direction with a load selectively adjustable over a wide range, and a weld is then passed over the specimen in a direction transverse to the direction of strain to evaluate the hot-cracking characteristics of the sheet metal, which are indicative of the weldability of the metal. The fixture provides evaluations of hot-cracking sensitivity for determining metal weldability in a highly reproducible manner with minimum human error.
ERIC Educational Resources Information Center
Boedigheimer, Dan
2010-01-01
Approximately 70% of aviation accidents are attributable to human error. The greatest opportunity for further improving aviation safety is found in reducing human errors in the cockpit. The purpose of this quasi-experimental, mixed-method research was to evaluate whether there was a difference in pilot attitudes toward reducing human error in the…
Evaluating a medical error taxonomy.
Brixey, Juliana; Johnson, Todd R; Zhang, Jiajie
2002-01-01
Healthcare has been slow in using human factors principles to reduce medical errors. The Center for Devices and Radiological Health (CDRH) recognizes that a lack of attention to human factors during product development may lead to errors that have the potential for patient injury, or even death. In response to the need for reducing medication errors, the National Coordinating Council for Medication Error Reporting and Prevention (NCC MERP) released the NCC MERP taxonomy that provides a standard language for reporting medication errors. This project maps the NCC MERP taxonomy of medication errors to MedWatch medical errors involving infusion pumps. Of particular interest are human factors associated with medical device errors. The NCC MERP taxonomy of medication errors is limited in mapping information from MedWatch because of its focus on the medical device and the format of reporting.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Terezakis, Stephanie A., E-mail: stereza1@jhmi.edu; Harris, Kendra M.; Ford, Eric
Purpose: Systems to ensure patient safety are of critical importance. The electronic incident reporting systems (IRS) of 2 large academic radiation oncology departments were evaluated for events that may be suitable for submission to a national reporting system (NRS). Methods and Materials: All events recorded in the combined IRS were evaluated from 2007 through 2010. Incidents were graded for potential severity using the validated French Nuclear Safety Authority (ASN) 5-point scale. These incidents were categorized into 7 groups: (1) human error, (2) software error, (3) hardware error, (4) error in communication between 2 humans, (5) error at the human-software interface, (6) error at the software-hardware interface, and (7) error at the human-hardware interface. Results: Between the 2 systems, 4407 incidents were reported. Of these events, 1507 (34%) were considered to have the potential for clinical consequences. Of these 1507 events, 149 (10%) were rated as having a potential severity of ≥2. Of these 149 events, the committee determined that 79 (53%) would be submittable to a NRS, the majority of which were related to human error or to the human-software interface. Conclusions: A significant number of incidents were identified in this analysis. The majority of events in this study were related to human error and to the human-software interface, further supporting the need for a NRS to facilitate field-wide learning and system improvement.
Jeon, Hong Jin; Kim, Ji-Hae; Kim, Bin-Na; Park, Seung Jin; Fava, Maurizio; Mischoulon, David; Kang, Eun-Ho; Roh, Sungwon; Lee, Dongsoo
2014-12-01
Human error is defined as an unintended error that is attributable to humans rather than machines, and that is important to avoid to prevent accidents. We aimed to investigate the association between sleep quality and human errors among train drivers. Cross-sectional. Population-based. A sample of 5,480 subjects who were actively working as train drivers were recruited in South Korea. The participants were 4,634 drivers who completed all questionnaires (response rate 84.6%). None. The Pittsburgh Sleep Quality Index (PSQI), the Center for Epidemiologic Studies Depression Scale (CES-D), the Impact of Event Scale-Revised (IES-R), the State-Trait Anxiety Inventory (STAI), and the Korean Occupational Stress Scale (KOSS). Of 4,634 train drivers, 349 (7.5%) showed more than one human error per 5 y. Human errors were associated with poor sleep quality, higher PSQI total scores, short sleep duration at night, and longer sleep latency. Among train drivers with poor sleep quality, those who experienced severe posttraumatic stress showed a significantly higher number of human errors than those without. Multiple logistic regression analysis showed that human errors were significantly associated with poor sleep quality and posttraumatic stress, whereas there were no significant associations with depression, trait and state anxiety, and work stress after adjusting for age, sex, education years, marital status, and career duration. Poor sleep quality was found to be associated with more human errors in train drivers, especially in those who experienced severe posttraumatic stress. © 2014 Associated Professional Sleep Societies, LLC.
Analyzing human errors in flight mission operations
NASA Technical Reports Server (NTRS)
Bruno, Kristin J.; Welz, Linda L.; Barnes, G. Michael; Sherif, Josef
1993-01-01
A long-term program is in progress at JPL to reduce the cost and risk of flight mission operations through a defect prevention/error management program. The main thrust of this program is to create an environment in which the performance of the total system, both the human operator and the computer system, is optimized. To this end, 1580 Incident Surprise Anomaly reports (ISAs) from 1977-1991 were analyzed from the Voyager and Magellan projects. A Pareto analysis revealed that 38 percent of the errors were classified as human errors. A preliminary cluster analysis based on the Magellan human errors (204 ISAs) is presented here. The resulting clusters described the underlying relationships among the ISAs. Initial models of human error in flight mission operations are presented. Next, the Voyager ISAs will be scored and included in the analysis. Eventually, these relationships will be used to derive a theoretically motivated and empirically validated model of human error in flight mission operations. Ultimately, this analysis will be used to make continuous process improvements to end-user applications and training requirements. This Total Quality Management approach will enable the management and prevention of errors in the future.
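The Pareto step described in this abstract can be sketched in a few lines. The category labels and counts below are invented for illustration (only the 38 percent human-error share echoes the figure reported in the abstract); the study itself classified 1,580 real ISA reports.

```python
# Toy Pareto analysis of anomaly reports by cause category.
# Labels and counts are illustrative; only the 38% human-error share
# mirrors the figure reported in the abstract.
from collections import Counter

reports = (["human error"] * 38 + ["software"] * 30 +
           ["hardware"] * 20 + ["procedure"] * 12)

counts = Counter(reports)
ranked = counts.most_common()              # largest contributors first
total = sum(counts.values())
shares = {cause: n / total for cause, n in ranked}
```

The point of a Pareto analysis is simply that the ranked shares concentrate effort on the few categories that account for most of the incidents.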
Acquisition, representation, and transfer of models of visuo-motor error
Zhang, Hang; Kulsa, Mila Kirstie C.; Maloney, Laurence T.
2015-01-01
We examined how human subjects acquire and represent models of visuo-motor error and how they transfer information about visuo-motor error from one task to a closely related one. The experiment consisted of three phases. In the training phase, subjects threw beanbags underhand towards targets displayed on a wall-mounted touch screen. The distribution of their endpoints was a vertically elongated bivariate Gaussian. In the subsequent choice phase, subjects repeatedly chose which of two targets varying in shape and size they would prefer to attempt to hit. Their choices allowed us to investigate their internal models of the visuo-motor error distribution, including the coordinate system in which they represented visuo-motor error. In the transfer phase, subjects repeated the choice phase from a different vantage point, the same distance from the screen but with the throwing direction shifted 45°. From the new vantage point, visuo-motor error was effectively expanded horizontally. We found that subjects incorrectly assumed an isotropic distribution in the choice phase but that the anisotropy they assumed in the transfer phase agreed with an objectively correct transfer. We also found that the coordinate system used in coding two-dimensional visuo-motor error in the choice phase was effectively one-dimensional. PMID:26057549
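The practical consequence of the anisotropy described in this abstract can be made concrete with a small Monte Carlo sketch. The standard deviations and target sizes below are assumed for illustration; the point is only that under a vertically elongated error distribution, a tall-narrow target of equal area is easier to hit than a wide-short one, which is exactly what a subject assuming isotropic error would get wrong.

```python
# Sketch: why anisotropy of visuo-motor error matters for target choice.
# Monte Carlo estimate under an assumed vertically elongated Gaussian;
# all standard deviations and target dimensions are illustrative.
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
x = rng.normal(0.0, 1.0, n)    # horizontal endpoint error (assumed sd = 1)
y = rng.normal(0.0, 2.0, n)    # vertical endpoint error (assumed sd = 2)

def hit_rate(half_w, half_h):
    """Estimated probability of landing inside a centred rectangular target."""
    return np.mean((np.abs(x) <= half_w) & (np.abs(y) <= half_h))

# Two targets of equal area: wide-and-short vs tall-and-narrow.
wide = hit_rate(4.0, 1.0)
tall = hit_rate(1.0, 4.0)
```

Because the vertical spread dominates, the tall-narrow target yields the higher hit rate even though both targets have the same area.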
NASA Technical Reports Server (NTRS)
Landon, Lauren Blackwell; Vessey, William B.; Barrett, Jamie D.
2015-01-01
A team is defined as: "two or more individuals who interact socially and adaptively, have shared or common goals, and hold meaningful task interdependences; it is hierarchically structured and has a limited life span; in it expertise and roles are distributed; and it is embedded within an organization/environmental context that influences and is influenced by ongoing processes and performance outcomes" (Salas, Stagl, Burke, & Goodwin, 2007, p. 189). From the NASA perspective, a team is commonly understood to be a collection of individuals that is assigned to support and achieve a particular mission. Thus, depending on context, this definition can encompass both the spaceflight crew and the individuals and teams in the larger multi-team system who are assigned to support that crew during a mission. The Team Risk outcomes of interest are predominantly performance related, with a secondary emphasis on long-term health; this is somewhat unique in the NASA HRP in that most Risk areas are medically related and primarily focused on long-term health consequences. In many operational environments (e.g., aviation), performance is assessed as the avoidance of errors. However, the research on performance errors is ambiguous. It implies that actions may be dichotomized into "correct" or "incorrect" responses, where incorrect responses or errors are always undesirable. Researchers have argued that this dichotomy is a harmful oversimplification, and it would be more productive to focus on the variability of human performance and how organizations can manage that variability (Hollnagel, Woods, & Leveson, 2006) (Category III). Two problems occur when focusing on performance errors: 1) the errors are infrequent and, therefore, difficult to observe and record; and 2) the errors do not directly correspond to failure. Research reveals that humans are fairly adept at correcting or compensating for performance errors before such errors result in recognizable or recordable failures.
Astronauts are notably adept high performers. Most failures are recorded only when multiple, small errors occur and humans are unable to recognize and correct or compensate for these errors in time to prevent a failure (Dismukes, Berman, & Loukopoulos, 2007) (Category III). More commonly, observers record variability in levels of performance. Some teams commit no observable errors but fail to achieve performance objectives or perform only adequately, while other teams commit some errors but perform spectacularly. Successful performance, therefore, cannot be viewed as simply the absence of errors or the avoidance of failure (Johnson Space Center (JSC) Joint Leadership Team, 2008). While failure is commonly attributed to making a major error, focusing solely on the elimination of error(s) does not significantly reduce the risk of failure. Failure may also occur when performance is simply insufficient or an effort is incapable of adjusting sufficiently to a contextual change (e.g., changing levels of autonomy).
Performance Metrics, Error Modeling, and Uncertainty Quantification
NASA Technical Reports Server (NTRS)
Tian, Yudong; Nearing, Grey S.; Peters-Lidard, Christa D.; Harrison, Kenneth W.; Tang, Ling
2016-01-01
A common set of statistical metrics has been used to summarize the performance of models or measurements, the most widely used being bias, mean square error, and the linear correlation coefficient. They assume linear, additive, Gaussian errors, and they are interdependent, incomplete, and incapable of directly quantifying uncertainty. The authors demonstrate that these metrics can be directly derived from the parameters of a simple linear error model. Since a correct error model captures the full error information, it is argued that the specification of a parametric error model should be an alternative to the metrics-based approach. The error-modeling methodology is applicable to both linear and nonlinear errors, while the metrics are only meaningful for linear errors. In addition, the error model expresses the error structure more naturally and directly quantifies uncertainty. This argument is further explained by highlighting the intrinsic connections between the performance metrics, the error model, and the joint distribution between the data and the reference.
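The central claim of this abstract, that bias, mean square error, and correlation all follow from the parameters of a simple linear error model, can be checked numerically. The sketch below assumes the additive model y = a + b·x + e with Gaussian noise e (the parameter values are arbitrary) and compares the metrics computed from the fitted (a, b, sigma) against the same metrics computed directly from the samples.

```python
# Sketch: deriving bias, MSE, and correlation from a fitted linear error model
# y = a + b*x + e, e ~ N(0, sigma^2). Parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(0)
a, b, sigma = 0.5, 1.2, 0.3                       # "true" error-model parameters
x = rng.normal(10.0, 2.0, 100_000)                # reference (truth)
y = a + b * x + rng.normal(0.0, sigma, x.size)    # measurement / model output

# Fit the linear error model by ordinary least squares.
b_hat, a_hat = np.polyfit(x, y, 1)
sigma_hat = np.std(y - (a_hat + b_hat * x))

# The common metrics follow directly from (a, b, sigma) plus truth statistics:
#   bias = a + (b-1)*mu,  MSE = bias^2 + (b-1)^2*var(x) + sigma^2,
#   corr = b*sd(x) / sqrt(b^2*var(x) + sigma^2).
mu, s2 = x.mean(), x.var()
bias_model = a_hat + (b_hat - 1.0) * mu
mse_model = bias_model**2 + (b_hat - 1.0)**2 * s2 + sigma_hat**2
corr_model = b_hat * np.sqrt(s2) / np.sqrt(b_hat**2 * s2 + sigma_hat**2)

# The same metrics computed directly from the samples.
bias_direct = np.mean(y - x)
mse_direct = np.mean((y - x)**2)
corr_direct = np.corrcoef(x, y)[0, 1]
```

Because ordinary least squares makes the residuals mean-zero and uncorrelated with x, the model-derived and directly computed metrics agree to numerical precision, which is the paper's point: the three parameters carry all the information the three metrics do, and more.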
Human Error and the International Space Station: Challenges and Triumphs in Science Operations
NASA Technical Reports Server (NTRS)
Harris, Samantha S.; Simpson, Beau C.
2016-01-01
Any system with a human component is inherently risky. Studies in human factors and psychology have repeatedly shown that human operators will inevitably make errors, regardless of how well they are trained. Onboard the International Space Station (ISS) where crew time is arguably the most valuable resource, errors by the crew or ground operators can be costly to critical science objectives. Operations experts at the ISS Payload Operations Integration Center (POIC), located at NASA's Marshall Space Flight Center in Huntsville, Alabama, have learned that from payload concept development through execution, there are countless opportunities to introduce errors that can potentially result in costly losses of crew time and science. To effectively address this challenge, we must approach the design, testing, and operation processes with two specific goals in mind. First, a systematic approach to error and human-centered design methodology should be implemented to minimize opportunities for user error. Second, we must assume that human errors will be made and enable rapid identification and recoverability when they occur. While a systematic approach and human-centered development process can go a long way toward eliminating error, the complete exclusion of operator error is not a reasonable expectation. The ISS environment in particular poses challenging conditions, especially for flight controllers and astronauts. Operating a scientific laboratory 250 miles above the Earth is a complicated and dangerous task with high stakes and a steep learning curve. While human error is a reality that may never be fully eliminated, smart implementation of carefully chosen tools and techniques can go a long way toward minimizing risk and increasing the efficiency of NASA's space science operations.
Hickey, Edward J; Nosikova, Yaroslavna; Pham-Hung, Eric; Gritti, Michael; Schwartz, Steven; Caldarone, Christopher A; Redington, Andrew; Van Arsdell, Glen S
2015-02-01
We hypothesized that the National Aeronautics and Space Administration "threat and error" model (which is derived from analyzing >30,000 commercial flights, and explains >90% of crashes) is directly applicable to pediatric cardiac surgery. We implemented a unit-wide performance initiative, whereby every surgical admission constitutes a "flight" and is tracked in real time, with the aim of identifying errors. The first 500 consecutive patients (524 flights) were analyzed, with an emphasis on the relationship between error cycles and permanent harmful outcomes. Among 524 patient flights (risk adjustment for congenital heart surgery category: 1-6; median: 2), 68 (13%) involved residual hemodynamic lesions, 13 (2.5%) permanent end-organ injuries, and 7 deaths (1.3%). Preoperatively, 763 threats were identified in 379 (72%) flights. Only 51% of patient flights (267) were error free. In the remaining 257 flights, 430 errors occurred, most commonly related to proficiency (280, 65%) or judgment (69, 16%). In most flights with errors (173 of 257; 67%), an unintended clinical state resulted, ie, the error was consequential. In 60% of consequential errors (n = 110; 21% of total), subsequent cycles of additional error/unintended states occurred. Cycles, particularly those containing multiple errors, were very significantly associated with permanent harmful end-states, including residual hemodynamic lesions (P < .0001), end-organ injury (P < .0001), and death (P < .0001). Deaths were almost always preceded by cycles (6 of 7; P < .0001). Human error, if not mitigated, often leads to cycles of error and unintended patient states, which are dangerous and precede the majority of harmful outcomes. Efforts to manage threats and error cycles (through crew resource management techniques) are likely to yield large increases in patient safety. Copyright © 2015. Published by Elsevier Inc.
Modeling human response errors in synthetic flight simulator domain
NASA Technical Reports Server (NTRS)
Ntuen, Celestine A.
1992-01-01
This paper presents a control theoretic approach to modeling human response errors (HRE) in the flight simulation domain. The human pilot is modeled as a supervisor of a highly automated system. The synthesis uses the theory of optimal control pilot modeling for integrating the pilot's observation error and the error due to the simulation model (experimental error). Methods for solving the HRE problem are suggested. Experimental verification of the models will be tested in a flight quality handling simulation.
Norman, Geoffrey R; Monteiro, Sandra D; Sherbino, Jonathan; Ilgen, Jonathan S; Schmidt, Henk G; Mamede, Silvia
2017-01-01
Contemporary theories of clinical reasoning espouse a dual processing model, which consists of a rapid, intuitive component (Type 1) and a slower, logical and analytical component (Type 2). Although the general consensus is that this dual processing model is a valid representation of clinical reasoning, the causes of diagnostic errors remain unclear. Cognitive theories about human memory propose that such errors may arise from both Type 1 and Type 2 reasoning. Errors in Type 1 reasoning may be a consequence of the associative nature of memory, which can lead to cognitive biases. However, the literature indicates that, with increasing expertise (and knowledge), the likelihood of errors decreases. Errors in Type 2 reasoning may result from the limited capacity of working memory, which constrains computational processes. In this article, the authors review the medical literature to answer two substantial questions that arise from this work: (1) To what extent do diagnostic errors originate in Type 1 (intuitive) processes versus in Type 2 (analytical) processes? (2) To what extent are errors a consequence of cognitive biases versus a consequence of knowledge deficits? The literature suggests that both Type 1 and Type 2 processes contribute to errors. Although it is possible to experimentally induce cognitive biases, particularly availability bias, the extent to which these biases actually contribute to diagnostic errors is not well established. Educational strategies directed at the recognition of biases are ineffective in reducing errors; conversely, strategies focused on the reorganization of knowledge to reduce errors have small but consistent benefits.
Scheidegger, Rachel; Vinogradov, Elena; Alsop, David C
2011-01-01
Amide proton transfer (APT) imaging has shown promise as an indicator of tissue pH and as a marker for brain tumors. Sources of error in APT measurements include direct water saturation, and magnetization transfer (MT) from membranes and macromolecules. These are typically suppressed by post-processing asymmetry analysis. However, this approach is strongly dependent on B0 homogeneity and can introduce additional errors due to intrinsic MT asymmetry, aliphatic proton features opposite the amide peak, and radiation damping-induced asymmetry. Although several methods exist to correct for B0 inhomogeneity, they substantially increase scan times and do not address errors induced by asymmetry of the z-spectrum. In this paper, a novel saturation scheme - saturation with frequency alternating RF irradiation (SAFARI) - is proposed in combination with a new magnetization transfer ratio (MTR) parameter designed to generate APT images insensitive to direct water saturation and MT, even in the presence of B0 inhomogeneity. The feasibility of the SAFARI technique is demonstrated in phantoms and in the human brain. Experimental results show that SAFARI successfully removes direct water saturation and MT contamination from APT images. It is insensitive to B0 offsets up to 180 Hz without using additional B0 correction, thereby dramatically reducing scanning time. PMID:21608029
Prevention of medication errors: detection and audit.
Montesi, Germana; Lechi, Alessandro
2009-06-01
1. Medication errors have important implications for patient safety, and their identification is a main target in improving clinical practice, in order to prevent adverse events. 2. Error detection is the first crucial step. Approaches to this are likely to be different in research and routine care, and the most suitable must be chosen according to the setting. 3. The major methods for detecting medication errors and associated adverse drug-related events are chart review, computerized monitoring, administrative databases, and claims data, using direct observation, incident reporting, and patient monitoring. All of these methods have both advantages and limitations. 4. Reporting discloses medication errors, can trigger warnings, and encourages the diffusion of a culture of safe practice. Combining and comparing data from various sources increases the reliability of the system. 5. Error prevention can be planned by means of retroactive and proactive tools, such as audit and Failure Mode, Effect, and Criticality Analysis (FMECA). Audit is also an educational activity, which promotes high-quality care; it should be carried out regularly. In an audit cycle we can compare what is actually done against reference standards and put in place corrective actions to improve the performances of individuals and systems. 6. Patient safety must be the first aim in every setting, in order to build safer systems, learning from errors and reducing the human and fiscal costs.
Development and implementation of a human accuracy program in patient foodservice.
Eden, S H; Wood, S M; Ptak, K M
1987-04-01
For many years, industry has utilized the concept of human error rates to monitor and minimize human errors in the production process. A consistent quality-controlled product increases consumer satisfaction and repeat purchase of product. Administrative dietitians have applied the concepts of using human error rates (the number of errors divided by the number of opportunities for error) at four hospitals, with a total bed capacity of 788, within a tertiary-care medical center. Human error rate was used to monitor and evaluate trayline employee performance and to evaluate layout and tasks of trayline stations, in addition to evaluating employees in patient service areas. Long-term employees initially opposed the error rate system with some hostility and resentment, while newer employees accepted the system. All employees now believe that the constant feedback given by supervisors enhances their self-esteem and productivity. Employee error rates are monitored daily and are used to counsel employees when necessary; they are also utilized during annual performance evaluation. Average daily error rates for a facility staffed by new employees decreased from 7% to an acceptable 3%. In a facility staffed by long-term employees, the error rate increased, reflecting improper error documentation. Patient satisfaction surveys reveal that satisfaction with tray accuracy increased from 88% to 92% in the facility staffed by long-term employees and has remained above the 90% standard in the facility staffed by new employees.
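The error-rate definition given above (number of errors divided by number of opportunities for error) is simple enough to sketch directly; the shift numbers below are invented for illustration and are not taken from the study:

```python
def human_error_rate(errors, opportunities):
    """Human error rate = number of errors / number of opportunities for error."""
    if opportunities <= 0:
        raise ValueError("opportunities must be positive")
    return errors / opportunities

# Hypothetical trayline shift: 300 trays assembled, one opportunity
# for error per tray, 21 trays with an error -> the 7% starting rate
# cited in the abstract.
rate = human_error_rate(21, 300)
```

Tracking this single ratio per shift is what lets supervisors compare facilities (7% vs. the 3% target) on a common scale.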
Davare, Marco; Zénon, Alexandre; Desmurget, Michel; Olivier, Etienne
2015-01-01
To reach for an object, we must convert its spatial location into an appropriate motor command, merging movement direction and amplitude. In humans, it has been suggested that this visuo-motor transformation occurs in a dorsomedial parieto-frontal pathway, although the causal contribution of the areas constituting the “reaching circuit” remains unknown. Here we used transcranial magnetic stimulation (TMS) in healthy volunteers to disrupt the function of either the medial intraparietal area (mIPS) or dorsal premotor cortex (PMd), in each hemisphere. The task consisted in performing step-tracking movements with the right wrist towards targets located in different directions and eccentricities; targets were either visible for the whole trial (Target-ON) or flashed for 200 ms (Target-OFF). Left and right mIPS disruption led to errors in the initial direction of movements performed towards contralateral targets. These errors were corrected online in the Target-ON condition but when the target was flashed for 200 ms, mIPS TMS manifested as a larger endpoint spreading. In contrast, left PMd virtual lesions led to higher acceleration and velocity peaks—two parameters typically used to probe the planned movement amplitude—irrespective of the target position, hemifield and presentation condition; in the Target-OFF condition, left PMd TMS induced overshooting and increased the endpoint dispersion along the axis of the target direction. These results indicate that left PMd intervenes in coding amplitude during movement preparation. The critical TMS timings leading to errors in direction and amplitude were different, namely 160–100 ms before movement onset for mIPS and 100–40 ms for left PMd. TMS applied over right PMd had no significant effect. These results demonstrate that, during motor preparation, direction and amplitude of goal-directed movements are processed by different cortical areas, at distinct timings, and according to a specific hemispheric organization. 
PMID:25999837
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abkowitz, M.D.; Abkowitz, S.B.; Lepofsky, M.
1989-04-01
This report examines the extent of human factors effects on the safety of transporting radioactive waste materials. It is seen principally as a scoping effort, to establish whether there is a need for DOE to undertake a more formal approach to studying human factors in radioactive waste transport, and if so, logical directions for that program to follow. Human factors effects are evaluated on driving and loading/transfer operations only. Particular emphasis is placed on the driving function, examining the relationship between human error and safety as it relates to the impairment of driver performance. Although multi-modal in focus, the widespread availability of data and previous literature on truck operations resulted in a primary study focus on the trucking mode from the standpoint of policy development. In addition to the analysis of human factors accident statistics, the report provides relevant background material on several policies that have been instituted or are under consideration, directed at improving human reliability in the transport sector. On the basis of reported findings, preliminary policy areas are identified. 71 refs., 26 figs., 5 tabs.
Sensitivity analysis of dynamic biological systems with time-delays.
Wu, Wu Hsiung; Wang, Feng Sheng; Chang, Maw Shang
2010-10-15
Mathematical modeling has been applied to the study and analysis of complex biological systems for a long time. Some processes in biological systems, such as the gene expression and feedback control in signal transduction networks, involve a time delay. These systems are represented as delay differential equation (DDE) models. Numerical sensitivity analysis of a DDE model by the direct method requires the solutions of model and sensitivity equations with time-delays. The major effort is the computation of the Jacobian matrix when computing the solution of sensitivity equations. The computation of partial derivatives of complex equations either by the analytic method or by symbolic manipulation is time consuming, inconvenient, and prone to introduce human errors. To address this problem, an automatic approach to obtain the derivatives of complex functions efficiently and accurately is necessary. We have proposed an efficient algorithm with an adaptive step size control to compute the solution and dynamic sensitivities of biological systems described by ordinary differential equations (ODEs). The adaptive direct-decoupled algorithm is extended to solve the solution and dynamic sensitivities of time-delay systems described by DDEs. To save the human effort and avoid the human errors in the computation of partial derivatives, an automatic differentiation technique is embedded in the extended algorithm to evaluate the Jacobian matrix. The extended algorithm is implemented and applied to two realistic models with time-delays: the cardiovascular control system and the TNF-α signal transduction network. The results show that the extended algorithm is a good tool for dynamic sensitivity analysis on DDE models with less user intervention. By comparing with direct-coupled methods in theory, the extended algorithm is efficient, accurate, and easy to use for end users without programming background to do dynamic sensitivity analysis on complex biological systems with time-delays.
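The automatic differentiation technique the abstract embeds can be illustrated in miniature with forward-mode dual numbers, which evaluate a function and its exact derivative together and so remove the hand-derivation step where human errors creep in. This is a generic sketch of the technique, not the authors' implementation:

```python
class Dual:
    """A value carrying f(x) in .val and f'(x) in .der (forward-mode AD)."""

    def __init__(self, val, der=0.0):
        self.val, self.der = float(val), float(der)

    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__

    def __sub__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val - o.val, self.der - o.der)

    def __rsub__(self, o):
        return Dual(o).__sub__(self)

    def __mul__(self, o):  # product rule, applied mechanically
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def derivative(f, x):
    """Exact df/dx at x, with no hand-coded partial derivatives."""
    return f(Dual(x, 1.0)).der

# f(x) = x^3 - 4x + 7, so f'(x) = 3x^2 - 4 and f'(2) = 8:
slope = derivative(lambda x: x * x * x - 4 * x + 7, 2.0)
```

Applied row by row to the right-hand side of an ODE/DDE system, the same trick yields the Jacobian entries the sensitivity equations need.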
Jessen, Torben E; Höskuldsson, Agnar T; Bjerrum, Poul J; Verder, Henrik; Sørensen, Lars; Bratholm, Palle S; Christensen, Bo; Jensen, Lene S; Jensen, Maria A B
2014-09-01
Direct measurement of chemical constituents in complex biologic matrices without the use of analyte specific reagents could be a step forward toward the simplification of clinical biochemistry. Problems related to reagents such as production errors, improper handling, and lot-to-lot variations would be eliminated as well as errors occurring during assay execution. We describe and validate a reagent free method for direct measurement of six analytes in human plasma based on Fourier-transform infrared spectroscopy (FTIR). Blood plasma is analyzed without any sample preparation. FTIR spectrum of the raw plasma is recorded in a sampling cuvette specially designed for measurement of aqueous solutions. For each analyte, a mathematical calibration process is performed by a stepwise selection of wavelengths giving the optimal least-squares correlation between the measured FTIR signal and the analyte concentration measured by conventional clinical reference methods. The developed calibration algorithms are subsequently evaluated for their capability to predict the concentration of the six analytes in blinded patient samples. The correlation between the six FTIR methods and corresponding reference methods were 0.87
Kunkel, Maria E; Herkommer, Andrea; Reinehr, Michael; Böckers, Tobias M; Wilke, Hans-Joachim
2011-01-01
The main aim of this study was to provide anatomical data on the heights of the human intervertebral discs for all levels of the thoracic spine by direct and radiographic measurements. Additionally, the heights of the neighboring vertebral bodies were measured, and the prediction of the disc heights based only on the size of the vertebral bodies was investigated. The anterior (ADH), middle (MDH) and posterior heights (PDH) of the discs were measured directly and on radiographs of 72 spine segments from 30 donors (age 57.43 ± 11.27 years). The radiographic measurement error and the reliability of the measurements were calculated. Linear and non-linear regression analyses were employed for investigation of statistical correlations between the heights of the thoracic disc and vertebrae. Radiographic measurements displayed lower repeatability and were shorter than the anatomical ones (approximately 9% for ADH and 37% for PDH). The thickness of the discs varied from 4.5 to 7.2 mm, with the MDH approximately 22.7% greater. The disc heights showed good correlations with the vertebral body heights (R², 0.659–0.835, P-values < 0.005; ANOVA), allowing the generation of 10 prediction equations. New data on thoracic disc morphometry were provided in this study. The generated set of regression equations could be used to predict thoracic disc heights from radiographic measurement of the posterior vertebral body height. For the creation of parameterized models of the human thoracic discs, the use of the prediction equations could eliminate the need for direct measurement on intervertebral discs. Moreover, the error produced by radiographic measurements could be reduced at least for the PDH. PMID:21615399
Reflections on human error - Matters of life and death
NASA Technical Reports Server (NTRS)
Wiener, Earl L.
1989-01-01
The last two decades have witnessed a rapid growth in the introduction of automatic devices into aircraft cockpits, and elsewhere in human-machine systems. This was motivated in part by the assumption that when human functioning is replaced by machine functioning, human error is eliminated. Experience to date shows that this is far from true, and that automation does not replace humans, but changes their role in the system, as well as the types and severity of the errors they make. This altered role may lead to fewer, but more critical errors. Intervention strategies to prevent these errors, or ameliorate their consequences include basic human factors engineering of the interface, enhanced warning and alerting systems, and more intelligent interfaces that understand the strategic intent of the crew and can detect and trap inconsistent or erroneous input before it affects the system.
Human factors engineering approaches to patient identification armband design.
Probst, C Adam; Wolf, Laurie; Bollini, Mara; Xiao, Yan
2016-01-01
The task of patient identification is performed many times each day by nurses and other members of the care team. Armbands are used for both direct verification and barcode scanning during patient identification. Armbands and information layout are critical to reducing patient identification errors and dangerous workarounds. We report the effort at two large, integrated healthcare systems that employed human factors engineering approaches to the information layout design of new patient identification armbands. The different methods used illustrate potential pathways to obtain standardized armbands across healthcare systems that incorporate human factors principles. By extension, how the designs have been adopted provides examples of how to incorporate human factors engineering into key clinical processes. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Color extended visual cryptography using error diffusion.
Kang, InKoo; Arce, Gonzalo R; Lee, Heung-Kyu
2011-01-01
Color visual cryptography (VC) encrypts a color secret message into n color halftone image shares. Previous methods in the literature show good results for black and white or gray scale VC schemes, however, they are not sufficient to be applied directly to color shares due to different color structures. Some methods for color visual cryptography are not satisfactory in terms of producing either meaningless shares or meaningful shares with low visual quality, leading to suspicion of encryption. This paper introduces the concept of visual information pixel (VIP) synchronization and error diffusion to attain a color visual cryptography encryption method that produces meaningful color shares with high visual quality. VIP synchronization retains the positions of pixels carrying visual information of original images throughout the color channels and error diffusion generates shares pleasant to human eyes. Comparisons with previous approaches show the superior performance of the new method.
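Error diffusion itself (independent of the paper's VIP synchronization) is a short algorithm: quantize each pixel, then push the quantization error onto not-yet-visited neighbors so the local average intensity is preserved. A minimal grayscale Floyd-Steinberg sketch, assuming float images in [0, 1]; the paper's method extends this idea per color channel:

```python
import numpy as np

def floyd_steinberg(img):
    """Halftone a grayscale image (floats in [0, 1]) to a binary image.

    Each pixel is thresholded to 0 or 1 and the residual error is
    distributed to the right/lower neighbors with the classic
    7/16, 3/16, 5/16, 1/16 weights.
    """
    out = img.astype(float).copy()
    h, w = out.shape
    for y in range(h):
        for x in range(w):
            old = out[y, x]
            new = 1.0 if old >= 0.5 else 0.0
            out[y, x] = new
            err = old - new
            if x + 1 < w:
                out[y, x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    out[y + 1, x - 1] += err * 3 / 16
                out[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    out[y + 1, x + 1] += err * 1 / 16
    return out

# A flat 40%-gray patch halftones to pure black/white pixels whose
# mean intensity stays close to 0.4:
halftone = floyd_steinberg(np.full((64, 64), 0.4))
```

Preserving local mean intensity while producing only binary pixels is what makes the resulting shares "pleasant to human eyes" in the abstract's terms.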
Human factors process failure modes and effects analysis (HF PFMEA) software tool
NASA Technical Reports Server (NTRS)
Chandler, Faith T. (Inventor); Relvini, Kristine M. (Inventor); Shedd, Nathaneal P. (Inventor); Valentino, William D. (Inventor); Philippart, Monica F. (Inventor); Bessette, Colette I. (Inventor)
2011-01-01
Methods, computer-readable media, and systems for automatically performing Human Factors Process Failure Modes and Effects Analysis for a process are provided. At least one task involved in a process is identified, where the task includes at least one human activity. The human activity is described using at least one verb. A human error potentially resulting from the human activity is automatically identified; the human error is related to the verb used in describing the task. The likelihoods of occurrence, detection, and correction of the human error are identified, as is the severity of its effect. From the likelihood of occurrence and the severity, the risk of potential harm is determined. The risk of potential harm is compared with a risk threshold to identify the appropriateness of corrective measures.
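The scoring flow described above reduces to a few lines of arithmetic. In this sketch the scales, field names, and threshold value are assumptions chosen for illustration, since the abstract does not specify them:

```python
def risk_of_potential_harm(p_occurrence, p_undetected, p_uncorrected, severity):
    """Risk that a human error occurs, escapes detection and correction,
    and causes harm. Probabilities in [0, 1]; severity on an assumed
    1-10 scale (both scales are illustrative, not from the patent)."""
    return p_occurrence * p_undetected * p_uncorrected * severity

def corrective_measures_warranted(risk, threshold=1.0):
    """Compare computed risk with an (assumed) risk threshold."""
    return risk >= threshold

# A fairly frequent, hard-to-spot, severe error crosses the threshold:
risk = risk_of_potential_harm(0.3, 0.5, 0.8, 9)
```

The point of automating this per verb/activity pair is that every human activity in the process gets the same screening, rather than only the ones an analyst remembers to check.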
Development of an FAA-EUROCONTROL technique for the analysis of human error in ATM : final report.
DOT National Transportation Integrated Search
2002-07-01
Human error has been identified as a dominant risk factor in safety-oriented industries such as air traffic control (ATC). However, little is known about the factors leading to human errors in current air traffic management (ATM) systems. The first s...
Human Error: The Stakes Are Raised.
ERIC Educational Resources Information Center
Greenberg, Joel
1980-01-01
Mistakes related to the operation of nuclear power plants and other technologically complex systems are discussed. Recommendations are given for decreasing the chance of human error in the operation of nuclear plants. The causes of the Three Mile Island incident are presented in terms of the human error element. (SA)
Influence of wheelchair front caster wheel on reverse directional stability.
Guo, Songfeng; Cooper, Rory A; Corfman, Tom; Ding, Dan; Grindle, Garrett
2003-01-01
The purpose of this research was to study directional stability during reversing of rear-wheel drive, electric powered wheelchairs (EPW) under different initial front caster orientations. Specifically, the weight distribution differences caused by certain initial caster orientations were examined as a possible mechanism for causing directional instability that could lead to accidents. Directional stability was quantified by measuring the drive direction error of the EPW by a motion analysis system. The ground reaction forces were collected to determine the load on the front casters, as well as back-emf data to attain the speed of the motors. The drive direction error was found to be different for various initial caster orientations. Drive direction error was greatest when both casters were oriented 90 degrees to the left or right, and least when both casters were oriented forward. The results show that drive direction error corresponds to the loading difference on the casters. The data indicates that loading differences may cause asymmetric drag on the casters, which in turn causes unbalanced torque load on the motors. This leads to a difference in motor speed and drive direction error.
Avoiding Human Error in Mission Operations: Cassini Flight Experience
NASA Technical Reports Server (NTRS)
Burk, Thomas A.
2012-01-01
Operating spacecraft is a never-ending challenge and the risk of human error is ever-present. Many missions have been significantly affected by human error on the part of ground controllers. The Cassini mission at Saturn has not been immune to human error, but Cassini operations engineers use tools and follow processes that find and correct most human errors before they reach the spacecraft. What is needed are skilled engineers with good technical knowledge, good interpersonal communications, quality ground software, regular peer reviews, up-to-date procedures, as well as careful attention to detail and the discipline to test and verify all commands that will be sent to the spacecraft. Two areas of special concern are changes to flight software and response to in-flight anomalies. The Cassini team has a lot of practical experience in all these areas and they have found that well-trained engineers with good tools who follow clear procedures can catch most errors before they get into command sequences to be sent to the spacecraft. Finally, having a robust and fault-tolerant spacecraft that allows ground controllers excellent visibility of its condition is the most important way to ensure human error does not compromise the mission.
Enhanced Pedestrian Navigation Based on Course Angle Error Estimation Using Cascaded Kalman Filters
Park, Chan Gook
2018-01-01
An enhanced pedestrian dead reckoning (PDR) based navigation algorithm, which uses two cascaded Kalman filters (TCKF) for the estimation of course angle and navigation errors, is proposed. The proposed algorithm uses a foot-mounted inertial measurement unit (IMU), waist-mounted magnetic sensors, and a zero velocity update (ZUPT) based inertial navigation technique with TCKF. The first stage filter estimates the course angle error of a human, which is closely related to the heading error of the IMU. In order to obtain the course measurements, the filter uses magnetic sensors and a position-trace based course angle. For preventing magnetic disturbance from contaminating the estimation, the magnetic sensors are attached to the waistband. Because the course angle error is mainly due to the heading error of the IMU, and the characteristic error of the heading angle is highly dependent on that of the course angle, the estimated course angle error is used as a measurement for estimating the heading error in the second stage filter. At the second stage, an inertial navigation system-extended Kalman filter-ZUPT (INS-EKF-ZUPT) method is adopted. As the heading error is estimated directly by using course-angle error measurements, the estimation accuracy for the heading and yaw gyro bias can be enhanced, compared with the ZUPT-only case, which eventually enhances the position accuracy more efficiently. The performance enhancements are verified via experiments, and the way-point position error for the proposed method is compared with those for the ZUPT-only case and with other cases that use ZUPT and various types of magnetic heading measurements. The results show that the position errors are reduced by a maximum of 90% compared with the conventional ZUPT based PDR algorithms. PMID:29690539
Brennan, Peter A; Mitchell, David A; Holmes, Simon; Plint, Simon; Parry, David
2016-01-01
Human error is as old as humanity itself and is an appreciable cause of mistakes by both organisations and people. Much of the work related to human factors in causing error has originated from aviation where mistakes can be catastrophic not only for those who contribute to the error, but for passengers as well. The role of human error in medical and surgical incidents, which are often multifactorial, is becoming better understood, and includes both organisational issues (by the employer) and potential human factors (at a personal level). Mistakes as a result of individual human factors and surgical teams should be better recognised and emphasised. Attitudes and acceptance of preoperative briefing has improved since the introduction of the World Health Organization (WHO) surgical checklist. However, this does not address limitations or other safety concerns that are related to performance, such as stress and fatigue, emotional state, hunger, awareness of what is going on (situational awareness), and other factors that could potentially lead to error. Here we attempt to raise awareness of these human factors, and highlight how they can lead to error, and how they can be minimised in our day-to-day practice. Can hospitals move from being "high risk industries" to "high reliability organisations"? Copyright © 2015 The British Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
Modeling congenital disease and inborn errors of development in Drosophila melanogaster
Moulton, Matthew J.; Letsou, Anthea
2016-01-01
Fly models that faithfully recapitulate various aspects of human disease and human health-related biology are being used for research into disease diagnosis and prevention. Established and new genetic strategies in Drosophila have yielded numerous substantial successes in modeling congenital disorders or inborn errors of human development, as well as neurodegenerative disease and cancer. Moreover, although our ability to generate sequence datasets continues to outpace our ability to analyze these datasets, the development of high-throughput analysis platforms in Drosophila has provided access through the bottleneck in the identification of disease gene candidates. In this Review, we describe both the traditional and newer methods that are facilitating the incorporation of Drosophila into the human disease discovery process, with a focus on the models that have enhanced our understanding of human developmental disorders and congenital disease. Enviable features of the Drosophila experimental system, which make it particularly useful in facilitating the much anticipated move from genotype to phenotype (understanding and predicting phenotypes directly from the primary DNA sequence), include its genetic tractability, the low cost for high-throughput discovery, and a genome and underlying biology that are highly evolutionarily conserved. In embracing the fly in the human disease-gene discovery process, we can expect to speed up and reduce the cost of this process, allowing experimental scales that are not feasible and/or would be too costly in higher eukaryotes. PMID:26935104
Challenges of primate embryonic stem cell research.
Bavister, Barry D; Wolf, Don P; Brenner, Carol A
2005-01-01
Embryonic stem (ES) cells hold great promise for treating degenerative diseases, including diabetes, Parkinson's, Alzheimer's, neural degeneration, and cardiomyopathies. This research is controversial to some because producing ES cells requires destroying embryos, which generally means human embryos. However, some of the surplus human embryos available from in vitro fertilization (IVF) clinics may have a high rate of genetic errors and therefore would be unsuitable for ES cell research. Although gross chromosome errors can readily be detected in ES cells, other anomalies such as mitochondrial DNA defects may have gone unrecognized. An insurmountable problem is that there are no human ES cells derived from in vivo-produced embryos to provide normal comparative data. In contrast, some monkey ES cell lines have been produced using in vivo-generated, normal embryos obtained from fertile animals; these can represent a "gold standard" for primate ES cells. In this review, we argue a need for strong research programs using rhesus monkey ES cells, conducted in parallel with studies on human ES and adult stem cells, to derive the maximum information about the biology of normal stem cells and to produce technical protocols for their directed differentiation into safe and functional replacement cells, tissues, and organs. In contrast, ES cell research using only human cell lines is likely to be incomplete, which could hinder research progress, and delay or diminish the effective application of ES cell technology to the treatment of human diseases.
Using a Delphi Method to Identify Human Factors Contributing to Nursing Errors.
Roth, Cheryl; Brewer, Melanie; Wieck, K Lynn
2017-07-01
The purpose of this study was to identify human factors associated with nursing errors. Using a Delphi technique, this study gathered feedback from a panel of nurse experts (n = 25) on an initial qualitative survey questionnaire, followed by summarizing the results with feedback and confirmation. Synthesized factors regarding causes of errors were incorporated into a quantitative Likert-type scale, and the original expert panel participants were queried a second time to validate responses. The list identified 24 items as the most common causes of nursing errors, including swamping and errors made by others that nurses are expected to recognize and fix. The responses provided a consensus top-10 list of errors based on mean scores, with heavy workload and fatigue at the top of the list. The use of the Delphi survey established consensus and developed a platform upon which future study of nursing errors can evolve as a link to future solutions. This list of human factors in nursing errors should serve to stimulate dialogue among nurses about how to prevent errors and improve outcomes. Human and system failures have been the subject of an abundance of research, yet nursing errors continue to occur. © 2016 Wiley Periodicals, Inc.
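The consensus step of a Delphi study like the one above, where experts rate candidate error causes on a Likert scale and the causes are ranked by mean score, can be sketched as follows. The cause names and ratings here are illustrative placeholders, not the study's data.

```python
# Sketch of Delphi consensus ranking: each expert rates candidate causes of
# nursing error on a 1-5 Likert scale (5 = most frequently causal), and the
# panel's consensus list is the causes sorted by mean rating.
import statistics

# Hypothetical ratings: cause -> one score per expert (assumed data).
ratings = {
    "heavy workload": [5, 5, 4, 5],
    "fatigue":        [4, 5, 5, 4],
    "distraction":    [3, 4, 3, 3],
    "swamping":       [4, 3, 4, 3],
}

def consensus_ranking(ratings):
    """Rank causes by mean Likert score, highest first."""
    means = {cause: statistics.mean(scores) for cause, scores in ratings.items()}
    return sorted(means, key=means.get, reverse=True)

top_causes = consensus_ranking(ratings)
```

With these made-up scores the ranking puts heavy workload and fatigue first, mirroring the ordering the study reports.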
NASA Technical Reports Server (NTRS)
Dwyer Cianciolo, Alicia; Powell, Richard W.
2017-01-01
Precision landing on Mars is a challenge. All Mars lander missions prior to the 2012 Mars Science Laboratory (MSL) had landing location uncertainty ellipses on the order of hundreds of kilometers. Sending humans to the surface of Mars will likely require multiple landers delivered in close proximity, which will in turn require orders of magnitude improvement in landing accuracy. MSL was the first Mars mission to use an Apollo-derived bank angle guidance to reduce the size of the landing ellipse. It utilized commanded bank angle magnitude to control total range and bank angle reversals to control cross range. A shortcoming of this bank angle guidance is that the open loop phase of flight created by use of bank reversals increases targeting errors. This paper presents a comparison of entry, descent and landing performance for a vehicle with a low lift-to-drag ratio using both bank angle control and an alternative guidance called Direct Force Control (DFC). DFC eliminates the open loop flight errors by directly controlling two forces independently, lift and side force. This permits independent control of down range and cross range. Performance results, evaluated using the Program to Optimize Simulated Trajectories (POST2), including propellant use and landing accuracy, are presented.
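The difference between the two guidance concepts above can be illustrated with a toy force model; the function names and numbers are illustrative, and real entry guidance is far more involved. Bank angle guidance modulates a single angle, so the vertical and lateral lift components are coupled; DFC commands two forces independently.

```python
import math

def bank_angle_forces(lift, bank_deg):
    """Bank-angle control: one command (the bank angle) couples the
    vertical and lateral force components; cross range is steered only by
    the sign of the bank (reversals), which opens the guidance loop."""
    phi = math.radians(bank_deg)
    return lift * math.cos(phi), lift * math.sin(phi)

def direct_force_control(lift_cmd, side_cmd, lift_max, side_max):
    """DFC sketch: lift and side force are commanded independently
    (subject to assumed actuator limits), so down range and cross range
    can be controlled without bank reversals."""
    clamp = lambda x, m: max(-m, min(m, x))
    return clamp(lift_cmd, lift_max), clamp(side_cmd, side_max)
```

For example, banking a 100 N lift vector to 60 degrees leaves only 50 N for vertical (down-range) control, while the DFC sketch can saturate lift and still hold an arbitrary side-force command.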
The force synergy of human digits in static and dynamic cylindrical grasps.
Kuo, Li-Chieh; Chen, Shih-Wei; Lin, Chien-Ju; Lin, Wei-Jr; Lin, Sheng-Che; Su, Fong-Chin
2013-01-01
This study explores the force synergy of human digits in both static and dynamic cylindrical grasping conditions. The patterns of digit force distribution, error compensation, and the relationships among digit forces are examined to quantify the synergetic patterns and coordination of multi-finger movements. This study recruited 24 healthy participants to perform cylindrical grasps using a glass simulator under normal grasping and one-finger restricted conditions. Parameters such as the grasping force, patterns of digit force distribution, and the force coefficient of variation are determined. Correlation coefficients and principal component analysis (PCA) are used to estimate the synergy strength under the dynamic grasping condition. Specific distribution patterns of digit forces are identified for various conditions. The compensation of adjacent fingers for the force in the normal direction of an absent finger agrees with the principle of error compensation. For digit forces in anti-gravity directions, the distribution patterns vary significantly by participant. The forces exerted by the thumb are closely related to those exerted by other fingers under all conditions. The index-middle and middle-ring finger pairs demonstrate a significant relationship. The PCA results show that the normal forces of digits are highly coordinated. This study reveals that normal force synergy exists under both static and dynamic cylindrical grasping conditions.
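The PCA step used above to quantify force synergy can be sketched on simulated digit-force data (not the study's measurements): if the normal forces of the five digits co-vary under a shared grip drive, the first principal component captures most of the variance, indicating strong synergy.

```python
# PCA via SVD on simulated five-digit normal forces driven by one shared
# grip-force signal plus small independent noise (assumed data).
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 500)
common_drive = 5 + np.sin(t)                       # shared grip-force drive
shares = np.array([0.35, 0.25, 0.20, 0.12, 0.08])  # thumb..little finger
forces = np.outer(common_drive, shares) + 0.01 * rng.standard_normal((500, 5))

centered = forces - forces.mean(axis=0)
# Singular values of the centered data give the principal-component variances.
_, s, _ = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / np.sum(s**2)                    # variance explained per PC
```

A first component explaining nearly all variance is the signature of the "highly coordinated" normal forces the abstract describes.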
The effects of center of rotation errors on cardiac SPECT imaging
NASA Astrophysics Data System (ADS)
Bai, Chuanyong; Shao, Ling; Ye, Jinghan; Durbin, M.
2003-10-01
In SPECT imaging, center of rotation (COR) errors lead to the misalignment of projection data and can potentially degrade the quality of the reconstructed images. In this work, we study the effects of COR errors on cardiac SPECT imaging using simulation, point source, cardiac phantom, and patient studies. For simulation studies, we generate projection data using a uniform MCAT phantom first without modeling any physical effects (NPH), then with the modeling of detector response effect (DR) alone. We then corrupt the projection data with simulated sinusoid and step COR errors. For other studies, we introduce sinusoid COR errors to projection data acquired on SPECT systems. An OSEM algorithm is used for image reconstruction without detector response correction, but with nonuniform attenuation correction when needed. The simulation studies show that, when COR errors increase from 0 to 0.96 cm: 1) sinusoid COR errors in axial direction lead to intensity decrease in the inferoapical region; 2) step COR errors in axial direction lead to intensity decrease in the distal anterior region. The intensity decrease is more severe in images reconstructed from projection data with NPH than with DR; and 3) the effects of COR errors in transaxial direction seem to be insignificant. In other studies, COR errors slightly degrade point source resolution; COR errors of 0.64 cm or above introduce visible but insignificant nonuniformity in the images of uniform cardiac phantom; COR errors up to 0.96 cm in transaxial direction affect the lesion-to-background contrast (LBC) insignificantly in the images of cardiac phantom with defects, and COR errors up to 0.64 cm in axial direction only slightly decrease the LBC. For the patient studies with COR errors up to 0.96 cm, images have the same diagnostic/prognostic values as those without COR errors. 
This work suggests that COR errors of up to 0.64 cm are not likely to change the clinical applications of cardiac SPECT imaging when an iterative reconstruction algorithm is used without detector response correction.
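The way a sinusoidal COR error corrupts projection data can be sketched with synthetic data (the paper's own simulations used an MCAT phantom): each projection at gantry angle theta is shifted laterally by amp * sin(theta), mimicking a periodic detector misalignment.

```python
import numpy as np

def apply_sinusoid_cor_error(projections, angles_deg, amp_bins):
    """Shift each 1-D projection by a whole number of detector bins given
    by amp_bins * sin(angle); np.roll is a crude stand-in for the
    sub-bin interpolation a real simulation would use."""
    out = np.empty_like(projections)
    for i, ang in enumerate(angles_deg):
        shift = int(round(amp_bins * np.sin(np.radians(ang))))
        out[i] = np.roll(projections[i], shift)
    return out

angles = np.arange(0, 360, 3)        # 120 views over a full rotation
proj = np.zeros((len(angles), 64))
proj[:, 30:34] = 1.0                 # a simple bar source in every view
corrupted = apply_sinusoid_cor_error(proj, angles, amp_bins=2)
```

Views near 0 and 180 degrees are untouched while views near 90 degrees are shifted the full amplitude, which is why the resulting misalignment is direction-dependent rather than uniform.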
Redundancy reduction explains the expansion of visual direction space around the cardinal axes.
Perrone, John A; Liston, Dorion B
2015-06-01
Motion direction discrimination in humans is worse for oblique directions than for the cardinal directions (the oblique effect). For some unknown reason, the human visual system makes systematic errors in the estimation of particular motion directions; a direction displacement near a cardinal axis appears larger than it really is, whereas the same displacement near an oblique axis appears smaller. Although the perceptual effects are robust and are clearly measurable in smooth pursuit eye movements, all attempts to identify the neural underpinnings of the oblique effect have failed. Here we show that a model of image velocity estimation based on the known properties of neurons in primary visual cortex (V1) and the middle temporal (MT) visual area of the primate brain produces the oblique effect. We also provide an explanation for the unusual asymmetric patterns of inhibition that have been found surrounding MT neurons. These patterns are consistent with a mechanism within the visual system that prevents redundant velocity signals from being passed on to the next motion-integration stage (dorsal medial superior temporal area, MSTd). We show that model redundancy-reduction mechanisms within the MT-MSTd pathway produce the oblique effect. Copyright © 2015 Elsevier Ltd. All rights reserved.
Analysis of MMU FDIR expert system
NASA Technical Reports Server (NTRS)
Landauer, Christopher
1990-01-01
This paper describes the analysis of a rulebase for fault diagnosis, isolation, and recovery for NASA's Manned Maneuvering Unit (MMU). The MMU is used by a human astronaut to move around a spacecraft in space. To provide maneuverability, there are several thrusters oriented in various directions, and hand-controlled devices for useful groups of them. The rulebase describes some error detection procedures, and corrective actions that can be applied in a few cases. The approach taken in this paper is to treat rulebases as symbolic objects and to compute correctness and 'reasonableness' criteria that use the statistical distribution of various syntactic structures within the rulebase. The criteria should identify awkward situations and otherwise signal anomalies that may be errors. The rulebase analysis algorithms are derived from mathematical and computational criteria that implement certain principles developed for rulebase evaluation: Consistency, Completeness, Irredundancy, Connectivity, and Distribution. Several errors were detected in the delivered rulebase. Some of these errors were easily fixed; others could not be fixed with the available information. A geometric model of the thruster arrangement is needed to show how to correct certain other distribution anomalies that are in fact errors. The investigations reported here were partially supported by The Aerospace Corporation's Sponsored Research Program.
Performance Support Tools for Space Medical Operations
NASA Technical Reports Server (NTRS)
Byrne, Vicky; Schmid, Josef; Barshi, Immanuel
2010-01-01
Early Constellation space missions are expected to have medical capabilities similar to those currently on board the Space Shuttle and International Space Station (ISS). Flight surgeons on the ground in Mission Control will direct the Crew Medical Officer (CMO) during medical situations. If the crew is unable to communicate with the ground, the CMO will carry out medical procedures without the aid of a flight surgeon. In these situations, use of performance support tools can reduce errors and time to perform emergency medical tasks. The research presented here is part of the Human Factors in Training Directed Research Project of the Space Human Factors Engineering Project under the Space Human Factors and Habitability Element of the Human Research Program. This is a joint project consisting of human factors teams from the Johnson Space Center (JSC) and the Ames Research Center (ARC). Work on medical training has been conducted in collaboration with the Medical Training Group at JSC and with Wyle that provides medical training to crew members, biomedical engineers (BMEs), and flight surgeons under the Bioastronautics contract. Human factors personnel at Johnson Space Center have investigated medical performance support tools for CMOs and flight surgeons.
Analysis of measured data of human body based on error correcting frequency
NASA Astrophysics Data System (ADS)
Jin, Aiyan; Peipei, Gao; Shang, Xiaomei
2014-04-01
Anthropometry is the measurement of the human body surface, and the measured data form the basis for analysis and study of the human body, for the establishment and modification of garment sizes, and for the design and operation of online clothing stores. In this paper, several groups of measured data are collected, and measurement errors are analysed by examining error frequencies and applying the analysis of variance method from mathematical statistics. The paper also determines the accuracy of the measured data and the difficulty of measuring particular parts of the body, examines the causes of data errors, and summarises key points for minimising errors. By analysing measured data on the basis of error frequency, the paper provides reference material to support the development of the garment industry.
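The error-frequency-plus-ANOVA analysis described above can be sketched with made-up repeat-measurement errors (in cm) for three body parts; the study's actual data are not reproduced here. A one-way ANOVA F statistic tests whether mean error differs across measured parts.

```python
# One-way ANOVA F statistic, computed from first principles, applied to
# hypothetical measurement errors for three body parts.
import statistics

def one_way_anova_F(groups):
    """Return the one-way ANOVA F statistic for a list of sample groups."""
    k = len(groups)                                   # number of groups
    n = sum(len(g) for g in groups)                   # total observations
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (statistics.mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - statistics.mean(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical repeat-measurement errors (cm) for three body parts.
waist  = [0.4, 0.5, 0.6, 0.5]
chest  = [0.2, 0.3, 0.2, 0.3]
height = [0.1, 0.1, 0.2, 0.1]
F = one_way_anova_F([waist, chest, height])
```

A large F here would indicate that some body parts (e.g. the waist, with its ill-defined landmarks) are systematically harder to measure than others.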
Model and experiments to optimize co-adaptation in a simplified myoelectric control system
NASA Astrophysics Data System (ADS)
Couraud, M.; Cattaert, D.; Paclet, F.; Oudeyer, P. Y.; de Rugy, A.
2018-04-01
Objective. To compensate for a limb lost to amputation, myoelectric prostheses use surface electromyography (EMG) from the remaining muscles to control the prosthesis. Despite considerable progress, myoelectric controls remain markedly different from the way we normally control movements, and require intense user adaptation. To overcome this, our goal is to explore concurrent machine co-adaptation techniques that were developed in the field of brain-machine interfaces and that are beginning to be used in myoelectric control. Approach. We combined a simplified myoelectric control with a perturbation for which human adaptation is well characterized and modeled, in order to explore co-adaptation settings in a principled manner. Results. First, we reproduced results obtained in a classical visuomotor rotation paradigm in our simplified myoelectric context, where we rotate the muscle pulling vectors used to reconstruct wrist force from EMG. Then, a model of human adaptation in response to directional error was used to simulate various co-adaptation settings, in which perturbations and machine co-adaptation are both applied to the muscle pulling vectors. These simulations established that a relatively low machine co-adaptation gain, which minimizes final errors, generates slow and incomplete adaptation, while higher gains increase the adaptation rate but also the errors, by amplifying noise. After experimental verification on real subjects, we tested a variable gain that combines the advantages of both, and implemented it with directionally tuned neurons similar to those used to model human adaptation. This enables machine co-adaptation to locally improve myoelectric control and to absorb more challenging perturbations. Significance. The simplified context used here enabled us to explore co-adaptation settings in both simulations and experiments, and to raise important considerations such as the need for a variable gain encoded locally.
The benefits and limits of extending this approach to more complex and functional myoelectric contexts are discussed.
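The gain trade-off reported above can be reproduced in a deliberately minimal scalar model (an illustration, not the authors' simulator): the machine corrects a perturbation p by updating its estimate with a gain applied to the noisy directional error.

```python
# Toy co-adaptation loop: error = perturbation - machine estimate + noise;
# the machine updates its estimate by gain(t) * error. Low fixed gain is
# slow, high fixed gain amplifies noise, a decaying gain combines both.
import random

def simulate(gain_schedule, p=30.0, noise_sd=2.0, trials=100, seed=1):
    rng = random.Random(seed)
    m, errors = 0.0, []
    for t in range(trials):
        err = (p - m) + rng.gauss(0.0, noise_sd)   # noisy directional error
        errors.append(err)
        m += gain_schedule(t) * err                # machine co-adaptation step
    return errors

low  = simulate(lambda t: 0.05)                     # slow, incomplete early on
high = simulate(lambda t: 0.9)                      # fast but noise-amplifying
varg = simulate(lambda t: 0.9 if t < 10 else 0.05)  # high early, low late
```

With the noise switched off the model behaves analytically: the residual error contracts by (1 - gain) each trial, so a 0.05 gain still leaves a visible residual after 100 trials while a 0.9 gain has long since converged.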
Ananda, Guruprasad; Hile, Suzanne E.; Breski, Amanda; Wang, Yanli; Kelkar, Yogeshwar; Makova, Kateryna D.; Eckert, Kristin A.
2014-01-01
Interruptions of microsatellite sequences impact genome evolution and can alter disease manifestation. However, human polymorphism levels at interrupted microsatellites (iMSs) are not known at a genome-wide scale, and the pathways for gaining interruptions are poorly understood. Using the 1000 Genomes Phase-1 variant call set, we interrogated mono-, di-, tri-, and tetranucleotide repeats up to 10 units in length. We detected ∼26,000–40,000 iMSs within each of four human population groups (African, European, East Asian, and American). We identified population-specific iMSs within exonic regions, and discovered that known disease-associated iMSs contain alleles present at differing frequencies among the populations. By analyzing longer microsatellites in primate genomes, we demonstrate that single interruptions result in a genome-wide average two- to six-fold reduction in microsatellite mutability, as compared with perfect microsatellites. Centrally located interruptions lowered mutability dramatically, by two to three orders of magnitude. Using a biochemical approach, we tested directly whether the mutability of a specific iMS is lower because of decreased DNA polymerase strand slippage errors. Modeling the adenomatous polyposis coli tumor suppressor gene sequence, we observed that a single base substitution interruption reduced strand slippage error rates five- to 50-fold, relative to a perfect repeat, during synthesis by DNA polymerases α, β, or η. Computationally, we demonstrate that iMSs arise primarily by base substitution mutations within individual human genomes. Our biochemical survey of human DNA polymerase α, β, δ, κ, and η error rates within certain microsatellites suggests that interruptions are created most frequently by low fidelity polymerases. Our combined computational and biochemical results demonstrate that iMSs are abundant in human genomes and are sources of population-specific genetic variation that may affect genome stability. 
The genome-wide identification of iMSs in human populations presented here has important implications for current models describing the impact of microsatellite polymorphisms on gene expression. PMID:25033203
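The basic object of study above, an interrupted microsatellite (a run of a repeat unit broken by a single base substitution), can be illustrated with a toy scanner; this is a sketch, not the pipeline used with the 1000 Genomes call set.

```python
# Detect a single-base interruption inside a run of `unit` repeats.
def find_interruption(seq, unit):
    """Return the index of a single interrupting base inside a run of
    `unit` repeats, or None if the run is perfect (or has more than one
    interruption, which this toy scanner does not handle)."""
    k = len(unit)
    for i in range(len(seq)):
        expected = unit[i % k]
        if seq[i] != expected:
            # A true iMS keeps the repeat frame on both sides of the break.
            rest_ok = all(seq[j] == unit[j % k] for j in range(i + 1, len(seq)))
            return i if rest_ok else None
    return None

perfect = "CACACACACA"        # (CA)5, a perfect microsatellite
interrupted = "CACATACACA"    # (CA)2, then a T substituted for C, then (CA)2+
```

The interruption in the second sequence sits mid-run, the configuration the authors found to lower mutability most dramatically, since it splits the repeat into two shorter slippage-prone tracts.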
Westendorff, Stephanie; Kuang, Shenbing; Taghizadeh, Bahareh; Donchin, Opher; Gail, Alexander
2015-04-01
Different error signals can induce sensorimotor adaptation during visually guided reaching, possibly evoking different neural adaptation mechanisms. Here we investigate reach adaptation induced by visual target errors without perturbing the actual or sensed hand position. We analyzed the spatial generalization of adaptation to target error to compare it with other known generalization patterns and simulated our results with a neural network model trained to minimize target error independent of prediction errors. Subjects reached to different peripheral visual targets and had to adapt to a sudden fixed-amplitude displacement ("jump") consistently occurring for only one of the reach targets. Subjects simultaneously had to perform contralateral unperturbed saccades, which rendered the reach target jump unnoticeable. As a result, subjects adapted by gradually decreasing reach errors and showed negative aftereffects for the perturbed reach target. Reach errors generalized to unperturbed targets according to a translational rather than rotational generalization pattern, but locally, not globally. More importantly, reach errors generalized asymmetrically with a skewed generalization function in the direction of the target jump. Our neural network model reproduced the skewed generalization after adaptation to target jump without having been explicitly trained to produce a specific generalization pattern. Our combined psychophysical and simulation results suggest that target jump adaptation in reaching can be explained by gradual updating of spatial motor goal representations in sensorimotor association networks, independent of learning induced by a prediction-error about the hand position. The simulations make testable predictions about the underlying changes in the tuning of sensorimotor neurons during target jump adaptation. Copyright © 2015 the American Physiological Society.
Yordanova, Juliana; Albrecht, Björn; Uebel, Henrik; Kirov, Roumen; Banaschewski, Tobias; Rothenberger, Aribert; Kolev, Vasil
2011-06-01
The maintenance of stable goal-directed behaviour is a hallmark of conscious executive control in humans. Notably, both correct and erroneous human actions may have a subconscious, activation-based determination. One possible source of subconscious interference is the default mode network which, in contrast to the attentional network, manifests intrinsic oscillations at very low (<0.1 Hz) frequencies. In the present study, we analyse the time dynamics of performance accuracy to search for multisecond periodic fluctuations of error occurrence. Attentional lapses in attention deficit/hyperactivity disorder are proposed to originate from interference from intrinsically oscillating networks. Identifying periodic error fluctuations with a frequency below 0.1 Hz in patients with attention deficit/hyperactivity disorder would provide behavioural evidence for such interference. Performance was monitored during a visual flanker task in 92 children (7- to 16-year-olds), 47 with attention deficit/hyperactivity disorder, combined type, and 45 healthy controls. Using an original approach, the time distribution of error occurrence was analysed in the frequency and time-frequency domains in order to detect rhythmic periodicity. The major results demonstrate that, in both patients and controls, error behaviour was characterized by multisecond rhythmic fluctuations with a period of ∼12 s, appearing with a delay after transition to the task. Only in attention deficit/hyperactivity disorder was there an additional 'pathological' oscillation of error generation, which determined periodic drops of performance accuracy every 20-30 s. Thus, in patients, periodic error fluctuations were modulated by two independent oscillatory patterns. The findings demonstrate that: (i) attentive behaviour of children is determined by multisecond regularities; and (ii) a unique additional periodicity guides performance fluctuations in patients.
These observations may re-conceptualize the understanding of attentive behaviour beyond the executive top-down control and may reveal new origins of psychopathological behaviours in attention deficit/hyperactivity disorder.
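The frequency-domain step described above, turning the sequence of trial outcomes into a binary error time series and looking for a spectral peak near the reported ~12 s period (~0.083 Hz), can be sketched on a synthetic series; the error rates and sampling here are assumptions, not the study's data.

```python
# Synthetic 10-minute error record sampled once per second: error
# probability oscillates with a 12 s period, so the power spectrum of the
# demeaned binary series should peak near 1/12 Hz.
import numpy as np

t = np.arange(600)                                  # 600 s at 1 sample/s
rate = 0.15 * (1 + np.sin(2 * np.pi * t / 12.0))    # errors bunch every ~12 s
rng = np.random.default_rng(42)
errors = (rng.random(600) < rate).astype(float)     # 1 = error on that trial

x = errors - errors.mean()                          # remove DC before the FFT
power = np.abs(np.fft.rfft(x)) ** 2
freqs = np.fft.rfftfreq(len(x), d=1.0)
peak_freq = freqs[np.argmax(power)]
```

The same machinery, applied in sliding windows, gives the time-frequency picture needed to see a periodicity that only "appears with a delay after transition to the task".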
Tailoring a Human Reliability Analysis to Your Industry Needs
NASA Technical Reports Server (NTRS)
DeMott, D. L.
2016-01-01
Companies at risk of accidents caused by human error that result in catastrophic consequences include: airline industry mishaps, medical malpractice, medication mistakes, aerospace failures, major oil spills, transportation mishaps, power production failures and manufacturing facility incidents. Human Reliability Assessment (HRA) is used to analyze the inherent risk of human behavior or actions introducing errors into the operation of a system or process. These assessments can be used to identify where errors are most likely to arise and the potential risks involved if they do occur. Using the basic concepts of HRA, an evolving group of methodologies are used to meet various industry needs. Determining which methodology or combination of techniques will provide a quality human reliability assessment is a key element to developing effective strategies for understanding and dealing with risks caused by human errors. There are a number of concerns and difficulties in "tailoring" a Human Reliability Assessment (HRA) for different industries. Although a variety of HRA methodologies are available to analyze human error events, determining the most appropriate tools to provide the most useful results can depend on industry specific cultures and requirements. Methodology selection may be based on a variety of factors that include: 1) how people act and react in different industries, 2) expectations based on industry standards, 3) factors that influence how the human errors could occur such as tasks, tools, environment, workplace, support, training and procedure, 4) type and availability of data, 5) how the industry views risk & reliability, and 6) types of emergencies, contingencies and routine tasks. Other considerations for methodology selection should be based on what information is needed from the assessment. 
If the principal concern is determining the primary risk factors contributing to potential human error, a more detailed analysis method may be employed; if the requirement is instead to provide a numerical value as part of a probabilistic risk assessment, a different method may be appropriate. Industries in which humans operate large equipment or transport systems (e.g., railroads or airlines) have more need to address the man-machine interface than medical workers administering medications. Human error occurs in every industry; in most cases the consequences are relatively benign and occasionally even beneficial. In cases where the results can have disastrous consequences, the use of Human Reliability techniques to identify and classify the risk of human errors gives a company more opportunities to mitigate or eliminate these types of risks and prevent costly tragedies.
Leonard, Matthew K; Desai, Maansi; Hungate, Dylan; Cai, Ruofan; Singhal, Nilika S; Knowlton, Robert C; Chang, Edward F
2018-05-22
Music and speech are human-specific behaviours that share numerous properties, including the fine motor skills required to produce them. Given these similarities, previous work has suggested that music and speech may at least partially share neural substrates. To date, much of this work has focused on perception, and has not investigated the neural basis of production, particularly in trained musicians. Here, we report two rare cases of musicians undergoing neurosurgical procedures, where it was possible to directly stimulate the left hemisphere cortex during speech and piano/guitar music production tasks. We found that stimulation to left inferior frontal cortex, including pars opercularis and ventral pre-central gyrus, caused slowing and arrest for both speech and music, and note sequence errors for music. Stimulation to posterior superior temporal cortex only caused production errors during speech. These results demonstrate partially dissociable networks underlying speech and music production, with a shared substrate in frontal regions.
Using APEX to Model Anticipated Human Error: Analysis of a GPS Navigational Aid
NASA Technical Reports Server (NTRS)
VanSelst, Mark; Freed, Michael; Shefto, Michael (Technical Monitor)
1997-01-01
The interface development process can be dramatically improved by predicting design-facilitated human error at an early stage in the design process. The approach we advocate is to SIMULATE the behavior of a human agent carrying out tasks with a well-specified user interface, ANALYZE the simulation for instances of human error, and then REFINE the interface or protocol to minimize predicted error. This approach, incorporated into the APEX modeling architecture, differs from past approaches to human simulation in its emphasis on error rather than, e.g., learning rate or speed of response. The APEX model consists of two major components: (1) a powerful action selection component capable of simulating behavior in complex, multiple-task environments; and (2) a resource architecture which constrains cognitive, perceptual, and motor capabilities to within empirically demonstrated limits. The model mimics human errors arising from interactions between limited human resources and elements of the computer interface whose design fails to anticipate those limits. We analyze the design of a hand-held Global Positioning System (GPS) device used for tactical and navigational decisions in small yacht racing. The analysis demonstrates how human system modeling can be an effective design aid, helping to accelerate the process of refining a product (or procedure).
Frequency-specific hippocampal-prefrontal interactions during associative learning
Brincat, Scott L.; Miller, Earl K.
2015-01-01
Much of our knowledge of the world depends on learning associations (e.g., face-name), for which the hippocampus (HPC) and prefrontal cortex (PFC) are critical. HPC-PFC interactions have rarely been studied in monkeys, whose cognitive/mnemonic abilities are akin to humans. Here, we show functional differences and frequency-specific interactions between HPC and PFC of monkeys learning object-pair associations, an animal model of human explicit memory. PFC spiking activity reflected learning in parallel with behavioral performance, while HPC neurons reflected feedback about whether trial-and-error guesses were correct or incorrect. Theta-band HPC-PFC synchrony was stronger after errors, was driven primarily by PFC to HPC directional influences, and decreased with learning. In contrast, alpha/beta-band synchrony was stronger after correct trials, was driven more by HPC, and increased with learning. Rapid object associative learning may occur in PFC, while HPC may guide neocortical plasticity by signaling success or failure via oscillatory synchrony in different frequency bands. PMID:25706471
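The frequency-specific synchrony measure at the heart of the study above can be sketched as coherence between two synthetic signals, an "HPC" and a "PFC" trace that share only a 6 Hz (theta-band) component; real analyses would use multitaper or Welch estimates, and the signal parameters here are assumptions.

```python
# Epoch-averaged coherence between two signals sharing a 6 Hz rhythm with a
# fixed phase lag, plus independent noise; coherence should be near 1 at
# 6 Hz and near 0 at unrelated frequencies (e.g. 20 Hz).
import numpy as np

fs, epoch_len, n_epochs = 200, 200, 100       # 1 s epochs at 200 Hz
rng = np.random.default_rng(7)
t = np.arange(epoch_len) / fs

Sxy = np.zeros(epoch_len // 2 + 1, dtype=complex)
Sxx = np.zeros(epoch_len // 2 + 1)
Syy = np.zeros(epoch_len // 2 + 1)
for _ in range(n_epochs):
    phase = rng.uniform(0, 2 * np.pi)         # rhythm phase varies by epoch
    hpc = np.sin(2 * np.pi * 6 * t + phase) + 0.5 * rng.standard_normal(epoch_len)
    pfc = np.sin(2 * np.pi * 6 * t + phase - 0.8) + 0.5 * rng.standard_normal(epoch_len)
    X, Y = np.fft.rfft(hpc), np.fft.rfft(pfc)
    Sxy += X * np.conj(Y)                     # accumulate cross-spectrum
    Sxx += np.abs(X) ** 2
    Syy += np.abs(Y) ** 2

coherence = np.abs(Sxy) ** 2 / (Sxx * Syy)
freqs = np.fft.rfftfreq(epoch_len, d=1.0 / fs)
```

Comparing such spectra between correct and error trials, and between HPC-led and PFC-led directional estimates, is the kind of contrast the abstract summarizes.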
NASA Technical Reports Server (NTRS)
Barshi, Immanuel
2016-01-01
Multitasking is endemic in modern life and work: drivers talk on cell phones, office workers type while answering phone calls, students do homework while text messaging, nurses prepare injections while responding to doctors' calls, and air traffic controllers direct aircraft in one sector while handling additional traffic in another. Whether in daily life or at work, we are constantly bombarded with multiple, concurrent interruptions and demands, and we have all somehow come to believe in the myth that we can, and in fact are expected to, easily address them all - without any repercussions. However, accumulating scientific evidence now suggests that multitasking increases the probability of human error. This talk presents a set of NASA studies that characterize concurrent demands in one work domain, routine airline cockpit operations, in order to illustrate the ways operational task demands, together with the proclivity to manage them all concurrently, make human performance in this and in any domain vulnerable to potentially serious errors and to accidents.
Lu, Ying-Hao; Lee, Li-Yao; Chen, Ying-Lan; Cheng, Hsing-I; Tsai, Wen-Tsung; Kuo, Chen-Chun; Chen, Chung-Yu; Huang, Yaw-Bin
2017-01-01
We selected iOS in this study as the App operating system, Objective-C as the programming language, and Oracle as the database to develop an App to inspect controlled substances in patient care units. Using a web-enabled smartphone, pharmacist inspection can be performed on site and the inspection results can be directly recorded into the hospital information system (HIS) through the Internet, so human error in data transcription can be minimized and work efficiency and data processing can be improved. This system not only is fast and convenient compared to the conventional paperwork, but also provides data security and accuracy. In addition, there are several features to increase inspection quality: (1) verification of drug appearance, (2) a foolproof mechanism to avoid input errors or omissions, (3) automatic data conversion without human judgment, (4) online alarms for expiry dates, and (5) instant inspection results showing unmet items. This study has successfully turned paper-based medication inspection into inspection using a web-based mobile device. PMID:28286761
NASA Technical Reports Server (NTRS)
Jaeger, R. J.; Agarwal, G. C.; Gottlieb, G. L.
1978-01-01
Subjects can correct their own errors of movement more quickly than they can react to external stimuli by using three general categories of feedback: (1) knowledge of results, primarily visually mediated; (2) proprioceptive or kinaesthetic such as from muscle spindles and joint receptors, and (3) corollary discharge or efference copy within the central nervous system. The effects of these feedbacks on simple reaction time, choice reaction time, and error correction time were studied in four normal human subjects. The movement used was plantarflexion and dorsiflexion of the ankle joint. The feedback loops were modified, by changing the sign of the visual display to alter the subject's perception of results, and by applying vibration at 100 Hz simultaneously to both the agonist and antagonist muscles of the ankle joint. The central processing was interfered with when the subjects were given moderate doses of alcohol (blood alcohol concentration levels of up to 0.07%). Vibration and alcohol increase both the simple and choice reaction times but not the error correction time.
Brzozek, Christopher; Benke, Kurt K; Zeleke, Berihun M; Abramson, Michael J; Benke, Geza
2018-03-26
Uncertainty in experimental studies of exposure to radiation from mobile phones has in the past only been framed within the context of statistical variability. It is now becoming more apparent to researchers that epistemic or reducible uncertainties can also affect the total error in results. These uncertainties are derived from a wide range of sources including human error, such as data transcription, model structure, measurement and linguistic errors in communication. The issue of epistemic uncertainty is reviewed and interpreted in the context of the MoRPhEUS, ExPOSURE and HERMES cohort studies which investigate the effect of radiofrequency electromagnetic radiation from mobile phones on memory performance. Research into this field has found inconsistent results due to limitations from a range of epistemic sources. Potential analytic approaches are suggested based on quantification of epistemic error using Monte Carlo simulation. It is recommended that future studies investigating the relationship between radiofrequency electromagnetic radiation and memory performance pay more attention to treatment of epistemic uncertainties as well as further research into improving exposure assessment. Use of directed acyclic graphs is also encouraged to display the assumed covariate relationship.
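The Monte Carlo quantification of epistemic error suggested in this abstract can be sketched in a few lines. The distributions, magnitudes, and variable names below are invented for illustration and are not drawn from the MoRPhEUS, ExPOSURE, or HERMES data:

```python
import random

def simulate_exposure(n_trials=10_000, seed=42):
    """Monte Carlo sketch: combine statistical (aleatory) variability with
    epistemic error terms in an exposure estimate. All distributions and
    magnitudes here are illustrative assumptions."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_trials):
        true_exposure = rng.gauss(1.0, 0.2)          # aleatory variability
        transcription_bias = rng.uniform(-0.1, 0.1)  # epistemic: data handling
        model_bias = rng.uniform(-0.15, 0.15)        # epistemic: model structure
        totals.append(true_exposure + transcription_bias + model_bias)
    mean = sum(totals) / n_trials
    var = sum((x - mean) ** 2 for x in totals) / (n_trials - 1)
    return mean, var ** 0.5

mean, sd = simulate_exposure()
```

Comparing the simulated spread against the purely statistical component shows how much of the total error budget the epistemic sources contribute.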
Wilf-Miron, R; Lewenhoff, I; Benyamini, Z; Aviram, A
2003-01-01
The development of a medical risk management programme based on the aviation safety approach and its implementation in a large ambulatory healthcare organisation is described. The following key safety principles were applied: (1) errors inevitably occur and usually derive from faulty system design, not from negligence; (2) accident prevention should be an ongoing process based on open and full reporting; (3) major accidents are only the "tip of the iceberg" of processes that indicate possibilities for organisational learning. Reporting physicians were granted immunity, which encouraged open reporting of errors. A telephone "hotline" served the medical staff for direct reporting and receipt of emotional support and medical guidance. Any adverse event which had learning potential was debriefed, while focusing on the human cause of error within a systemic context. Specific recommendations were formulated to rectify processes conducive to error when failures were identified. During the first 5 years of implementation, the aviation safety concept and tools were successfully adapted to ambulatory care, fostering a culture of greater concern for patient safety through risk management while providing support to the medical staff. PMID:12571343
Hooper, Brionny J; O'Hare, David P A
2013-08-01
Human error classification systems theoretically allow researchers to analyze postaccident data in an objective and consistent manner. The Human Factors Analysis and Classification System (HFACS) framework is one such practical analysis tool that has been widely used to classify human error in aviation. The Cognitive Error Taxonomy (CET) is another. It has been postulated that the focus on interrelationships within HFACS can facilitate the identification of the underlying causes of pilot error. The CET provides increased granularity at the level of unsafe acts. The aim was to analyze the influence of factors at higher organizational levels on the unsafe acts of front-line operators and to compare the errors of fixed-wing and rotary-wing operations. This study analyzed 288 aircraft incidents involving human error from an Australasian military organization occurring between 2001 and 2008. Action errors accounted for almost twice (44%) the proportion of rotary wing compared to fixed wing (23%) incidents. Both classificatory systems showed significant relationships between precursor factors such as the physical environment, mental and physiological states, crew resource management, training and personal readiness, and skill-based, but not decision-based, acts. The CET analysis showed different predisposing factors for different aspects of skill-based behaviors. Skill-based errors in military operations are more prevalent in rotary wing incidents and are related to higher level supervisory processes in the organization. The Cognitive Error Taxonomy provides increased granularity to HFACS analyses of unsafe acts.
Parvin, Darius E; McDougle, Samuel D; Taylor, Jordan A; Ivry, Richard B
2018-05-09
Failures to obtain reward can occur from errors in action selection or action execution. Recently, we observed marked differences in choice behavior when the failure to obtain a reward was attributed to errors in action execution compared with errors in action selection (McDougle et al., 2016). Specifically, participants appeared to solve this credit assignment problem by discounting outcomes in which the absence of reward was attributed to errors in action execution. Building on recent evidence indicating relatively direct communication between the cerebellum and basal ganglia, we hypothesized that cerebellar-dependent sensory prediction errors (SPEs), a signal indicating execution failure, could attenuate value updating within a basal ganglia-dependent reinforcement learning system. Here we compared the SPE hypothesis to an alternative, "top-down" hypothesis in which changes in choice behavior reflect participants' sense of agency. In two experiments with male and female human participants, we manipulated the strength of SPEs, along with the participants' sense of agency in the second experiment. The results showed that, whereas the strength of SPE had no effect on choice behavior, participants were much more likely to discount the absence of rewards under conditions in which they believed the reward outcome depended on their ability to produce accurate movements. These results provide strong evidence that SPEs do not directly influence reinforcement learning. Instead, a participant's sense of agency appears to play a significant role in modulating choice behavior when unexpected outcomes can arise from errors in action execution. SIGNIFICANCE STATEMENT When learning from the outcome of actions, the brain faces a credit assignment problem: Failures of reward can be attributed to poor choice selection or poor action execution. Here, we test a specific hypothesis that execution errors are implicitly signaled by cerebellar-based sensory prediction errors. 
We evaluate this hypothesis and compare it with a more "top-down" hypothesis in which the modulation of choice behavior from execution errors reflects participants' sense of agency. We find that sensory prediction errors have no significant effect on reinforcement learning. Instead, instructions influencing participants' belief of causal outcomes appear to be the main factor influencing their choice behavior. Copyright © 2018 the authors.
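The credit-assignment idea described here, discounting value updates when a missed reward is attributed to execution rather than selection failure, can be illustrated with a minimal, hypothetical update rule. The learning rate and discount factor are assumptions for illustration, not parameters from the study:

```python
def update_value(value, reward, outcome_type, alpha=0.2, execution_discount=0.8):
    """Sketch of the credit-assignment idea: reward prediction errors from
    trials attributed to execution failure update the value less than those
    attributed to selection failure. Parameter values are illustrative."""
    rpe = reward - value  # reward prediction error
    gate = (1 - execution_discount) if outcome_type == "execution_error" else 1.0
    return value + alpha * gate * rpe

v = 0.5
v_sel = update_value(v, 0.0, "selection_error")   # full update toward 0
v_exec = update_value(v, 0.0, "execution_error")  # discounted update
```

After an unrewarded trial, the option's value drops much less when the failure is blamed on execution, which mirrors the reported preservation of choice preference.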
The NASA F-15 Intelligent Flight Control Systems: Generation II
NASA Technical Reports Server (NTRS)
Buschbacher, Mark; Bosworth, John
2006-01-01
The Second Generation (Gen II) control system for the F-15 Intelligent Flight Control System (IFCS) program implements direct adaptive neural networks to demonstrate robust tolerance to faults and failures. The direct adaptive tracking controller integrates learning neural networks (NNs) with a dynamic inversion control law. The term direct adaptive is used because the error between the reference model and the aircraft response is being compensated or directly adapted to minimize error without regard to knowing the cause of the error. No parameter estimation is needed for this direct adaptive control system. In the Gen II design, the feedback errors are regulated with a proportional-plus-integral (PI) compensator. This basic compensator is augmented with an online NN that changes the system gains via an error-based adaptation law to improve aircraft performance at all times, including normal flight, system failures, mispredicted behavior, or changes in behavior resulting from damage.
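A scalar toy model, not the actual IFCS control law, can illustrate the idea of a PI compensator augmented by an error-driven adaptive term that requires no parameter estimation. The plant, gains, and adaptation rate below are invented for illustration:

```python
def run_adaptive_pi(steps=2000, dt=0.01):
    """Scalar sketch of a PI compensator augmented by an error-driven
    adaptive term: the adaptation law uses only the tracking error, with
    no regard to the cause of the error. Gains and plant dynamics are
    illustrative, not the F-15 IFCS design."""
    kp, ki, gamma = 2.0, 1.0, 5.0
    a_true = -1.0                    # plant pole (unknown to the controller)
    ref, x, integ, w = 1.0, 0.0, 0.0, 0.0
    for _ in range(steps):
        e = ref - x
        integ += e * dt
        w += gamma * e * dt          # error-based adaptation law
        u = kp * e + ki * integ + w  # PI plus adaptive augmentation
        x += (a_true * x + u) * dt   # Euler step of the plant
    return abs(ref - x)

final_err = run_adaptive_pi()
```

The adaptive term absorbs whatever steady input the plant needs, so the tracking error converges regardless of why the plant's behavior changed.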
Hughes, Charmayne M L; Baber, Chris; Bienkiewicz, Marta; Worthington, Andrew; Hazell, Alexa; Hermsdörfer, Joachim
2015-01-01
Approximately 33% of stroke patients have difficulty performing activities of daily living, often committing errors during the planning and execution of such activities. The objective of this study was to evaluate the ability of the human error identification (HEI) technique SHERPA (Systematic Human Error Reduction and Prediction Approach) to predict errors during the performance of daily activities in stroke patients with left and right hemisphere lesions. Using SHERPA we successfully predicted 36 of the 38 observed errors, with analysis indicating that the proportion of predicted and observed errors was similar for all sub-tasks and severity levels. HEI results were used to develop compensatory cognitive strategies that clinicians could employ to reduce or prevent errors from occurring. This study provides evidence for the reliability and validity of SHERPA in the design of cognitive rehabilitation strategies in stroke populations.
2010-03-15
Based on Reason's "Swiss cheese" model of human error (1990), Figure 1 describes how an accident is likely to occur when all of the errors, or "holes", align. A detailed description of HFACS can be found in Wiegmann and Shappell (2003).
Hypothesis-driven methods to augment human cognition by optimizing cortical oscillations
Horschig, Jörn M.; Zumer, Johanna M.; Bahramisharif, Ali
2014-01-01
Cortical oscillations have been shown to represent fundamental functions of a working brain, e.g., communication, stimulus binding, error monitoring, and inhibition, and are directly linked to behavior. Recent studies intervening with these oscillations have demonstrated effective modulation of both the oscillations and behavior. In this review, we collect evidence in favor of how hypothesis-driven methods can be used to augment cognition by optimizing cortical oscillations. We elaborate their potential usefulness for three target groups: healthy elderly, patients with attention deficit/hyperactivity disorder, and healthy young adults. We discuss the relevance of neuronal oscillations in each group and show how each of them can benefit from the manipulation of functionally-related oscillations. Further, we describe methods for manipulation of neuronal oscillations including direct brain stimulation as well as indirect task alterations. We also discuss practical considerations about the proposed techniques. In conclusion, we propose that insights from neuroscience should guide techniques to augment human cognition, which in turn can provide a better understanding of how the human brain works. PMID:25018706
Probst, C Adam; Carter, Megan; Cadigan, Caton; Dalcour, Cortney; Cassity, Cindy; Quinn, Penny; Williams, Tiana; Montgomery, Donna Cook; Wilder, Claudia; Xiao, Yan
2017-02-01
The aim of this study was to increase nurses' time for direct patient care and improve safety via a novel human factors framework for nursing worksystem improvement. Time available for direct patient care influences outcomes, yet worksystem barriers prevent nurses from spending adequate time at the bedside. A novel human factors framework was developed for worksystem improvement in 3 units at 2 facilities. Objectives included improving nurse efficiency as measured by time-and-motion studies, reducing missing medications and subsequent trips to medication rooms, and improving medication safety. Worksystem improvement resulted in time savings of 16% to 32% per nurse per 12-hour shift. Requests for missing medications dropped from 3.2 to 1.3 per day. Nurse medication room trips were reduced by 30%, and nurse-reported medication errors fell from 1.2 to 0.8 and from 6.3 to 4.0 per month. An innovative human factors framework for nursing worksystem improvement provided practical and high-priority targets for interventions that significantly improved the nursing worksystem.
Examination of soldier target recognition with direct view optics
NASA Astrophysics Data System (ADS)
Long, Frederick H.; Larkin, Gabriella; Bisordi, Danielle; Dorsey, Shauna; Marianucci, Damien; Goss, Lashawnta; Bastawros, Michael; Misiuda, Paul; Rodgers, Glenn; Mazz, John P.
2017-10-01
Target recognition and identification is a problem of great military and scientific importance. To examine the correlation between target recognition and optical magnification, ten U.S. Army soldiers were tasked with identifying letters on targets at 800 and 1300 meters away. Letters were used since they are a standard method for measuring visual acuity. The letters were approximately 90 cm high, which is the size of a well-known rifle. Four direct view optics with angular magnifications of 1.5x, 4x, 6x, and 9x were used. The goal of this approach was to measure actual probabilities for correct target identification. Previous scientific literature suggests that target recognition can be modeled as a linear response problem in angular frequency space using the established values for the contrast sensitivity function for a healthy human eye and the experimentally measured modulation transfer function of the optic. At the 9x magnification, the soldiers could identify the letters with almost no errors (i.e., 97% probability of correct identification). At lower magnification, errors in letter identification were more frequent. The identification errors were not random but occurred most frequently with a few pairs of letters (e.g., O and Q), which is consistent with the literature for letter recognition. In addition, in the small subject sample of ten soldiers, there was considerable variation in the observer recognition capability at 1.5x and a range of 800 meters. This can be directly attributed to the variation in the observer visual acuity.
Meta sequence analysis of human blood peptides and their parent proteins.
Bowden, Peter; Pendrak, Voitek; Zhu, Peihong; Marshall, John G
2010-04-18
Sequence analysis of the blood peptides and their qualities will be key to understanding the mechanisms that contribute to error in LC-ESI-MS/MS. Analysis of peptides and their proteins at the level of sequences is much more direct and informative than the comparison of disparate accession numbers. A portable database of all blood peptide and protein sequences with descriptor fields and gene ontology terms might be useful for designing immunological or MRM assays from human blood. The results of twelve studies of human blood peptides and/or proteins identified by LC-MS/MS and correlated against a disparate array of genetic libraries were parsed and matched to proteins from the human ENSEMBL, SwissProt and RefSeq databases by SQL. The reported peptide and protein sequences were organized into an SQL database with full protein sequences and up to five unique peptides in order of prevalence along with the peptide count for each protein. Structured query language or BLAST was used to acquire descriptive information in current databases. Sampling error at the level of peptides is the largest source of disparity between groups. Chi Square analysis of peptide to protein distributions confirmed the significant agreement between groups on identified proteins. Copyright 2010. Published by Elsevier B.V.
Marshall-Pescini, Sarah; Passalacqua, Chiara; Miletto Petrazzini, Maria Elena; Valsecchi, Paola; Prato-Previde, Emanuela
2012-01-01
Dogs appear to be sensitive to human ostensive communicative cues in a variety of situations, however there is still a measure of controversy as to the way in which these cues influence human-dog interactions. There is evidence for instance that dogs can be led into making evaluation errors in a quantity discrimination task, for example losing their preference for a larger food quantity if a human shows a preference for a smaller one, yet there is, so far, no explanation for this phenomenon. Using a modified version of this task, in the current study we investigated whether non-social, social or communicative cues (alone or in combination) cause dogs to go against their preference for the larger food quantity. Results show that dogs' evaluation errors are indeed caused by a social bias, but, somewhat contrary to previous studies, they highlight the potent effect of stimulus enhancement (handling the target) in influencing the dogs' response. A mild influence on the dog's behaviour was found only when different ostensive cues (and no handling of the target) were used in combination, suggesting their cumulative effect. The discussion addresses possible motives for discrepancies with previous studies suggesting that both the intentionality and the directionality of the action may be important in causing dogs' social biases. PMID:22558150
Animal social networks as substrate for cultural behavioural diversity.
Whitehead, Hal; Lusseau, David
2012-02-07
We used individual-based stochastic models to examine how social structure influences the diversity of socially learned behaviour within a non-human population. For continuous behavioural variables we modelled three forms of dyadic social learning, averaging the behavioural value of the two individuals, random transfer of information from one individual to the other, and directional transfer from the individual with highest behavioural value to the other. Learning had potential error. We also examined the transfer of categorical behaviour between individuals with random directionality and two forms of error, the adoption of a randomly chosen existing behavioural category or the innovation of a new type of behaviour. In populations without social structuring the diversity of culturally transmitted behaviour increased with learning error and population size. When the populations were structured socially either by making individuals members of permanent social units or by giving them overlapping ranges, behavioural diversity increased with network modularity under all scenarios, although the proportional increase varied considerably between continuous and categorical behaviour, with transmission mechanism, and population size. Although functions of the form e^(c1·m − c2) + c3·log(N) predicted the mean increase in diversity with modularity (m) and population size (N), behavioural diversity could be highly unpredictable both between simulations with the same set of parameters, and within runs. Errors in social learning and social structuring generally promote behavioural diversity. Consequently, social learning may be considered to produce culture in populations whose social structure is sufficiently modular. Copyright © 2011 Elsevier Ltd. All rights reserved.
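One scenario from this abstract, dyadic averaging of a continuous behavioural value with copying error in an unstructured population, can be sketched as follows. The population size, step count, and error magnitudes are illustrative assumptions; the published model also covers social units, overlapping ranges, and categorical behaviour:

```python
import random

def behavioural_diversity(n=50, steps=5000, learn_error=0.05, seed=1):
    """Minimal sketch: repeated dyadic social learning by averaging a
    continuous behavioural value, with Gaussian copying error, in an
    unstructured population. Returns the standard deviation of the
    behavioural values as a diversity measure."""
    rng = random.Random(seed)
    vals = [0.0] * n
    for _ in range(steps):
        i, j = rng.sample(range(n), 2)       # pick a dyad at random
        mean = (vals[i] + vals[j]) / 2       # averaging form of learning
        vals[i] = mean + rng.gauss(0, learn_error)
        vals[j] = mean + rng.gauss(0, learn_error)
    mu = sum(vals) / n
    return (sum((v - mu) ** 2 for v in vals) / n) ** 0.5

low = behavioural_diversity(learn_error=0.01)
high = behavioural_diversity(learn_error=0.1)
```

Averaging pulls the population toward consensus while learning error injects variance, so equilibrium diversity scales with the error magnitude, consistent with the abstract's finding that learning error promotes diversity.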
A Quality Improvement Project to Decrease Human Milk Errors in the NICU.
Oza-Frank, Reena; Kachoria, Rashmi; Dail, James; Green, Jasmine; Walls, Krista; McClead, Richard E
2017-02-01
Ensuring safe human milk in the NICU is a complex process with many potential points for error, of which one of the most serious is administration of the wrong milk to the wrong infant. Our objective was to describe a quality improvement initiative that was associated with a reduction in human milk administration errors identified over a 6-year period in a typical, large NICU setting. We employed a quasi-experimental time series quality improvement initiative by using tools from the model for improvement, Six Sigma methodology, and evidence-based interventions. Scanned errors were identified from the human milk barcode medication administration system. Scanned errors of interest were wrong-milk-to-wrong-infant, expired-milk, or preparation errors. The scanned error rate and the impact of additional improvement interventions from 2009 to 2015 were monitored by using statistical process control charts. From 2009 to 2015, the total number of errors scanned declined from 97.1 per 1000 bottles to 10.8. Specifically, the number of expired milk error scans declined from 84.0 per 1000 bottles to 8.9. The number of preparation errors (4.8 per 1000 bottles to 2.2) and wrong-milk-to-wrong-infant errors scanned (8.3 per 1000 bottles to 2.0) also declined. By reducing the number of errors scanned, the number of opportunities for errors also decreased. Interventions that likely had the greatest impact on reducing the number of scanned errors included installation of bedside (versus centralized) scanners and dedicated staff to handle milk. Copyright © 2017 by the American Academy of Pediatrics.
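Error rates of the kind monitored here with statistical process control charts are commonly tracked with a u-chart (a Poisson rate chart). A minimal sketch, using hypothetical counts rather than the hospital's data:

```python
def u_chart_limits(errors, bottles):
    """Sketch of a u-chart: pooled error rate per bottle with 3-sigma
    Poisson control limits per period. Counts below are hypothetical."""
    ubar = sum(errors) / sum(bottles)        # pooled rate per bottle
    limits = []
    for n in bottles:
        half_width = 3 * (ubar / n) ** 0.5   # 3-sigma Poisson limits
        limits.append((max(0.0, ubar - half_width), ubar + half_width))
    return ubar, limits

errors = [97, 60, 35, 20, 14, 11]            # scanned errors per period (hypothetical)
bottles = [1000] * 6                         # bottles scanned per period
ubar, limits = u_chart_limits(errors, bottles)
rates = [e / n for e, n in zip(errors, bottles)]
out_of_control = [r < lo or r > hi for r, (lo, hi) in zip(rates, limits)]
```

Points falling below the lower limit after an intervention are the signal of genuine improvement rather than ordinary period-to-period variation.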
Spiral-bevel geometry and gear train precision
NASA Technical Reports Server (NTRS)
Litvin, F. L.; Coy, J. J.
1983-01-01
A new approach to the determination of surface principal curvatures and directions is proposed. Direct relationships between the principal curvatures and directions of the tool surface and the principal curvatures and directions of the generated gear surface are obtained. The principal curvatures and directions of the gear-tooth surface are obtained without using the complicated equations of these surfaces. A general theory of the train kinematical errors exerted by manufacturing and assembly errors is discussed. Two methods for the determination of the train kinematical errors can be worked out: (1) with the aid of a computer, and (2) with an approximate method. Results from noise and vibration measurements conducted on a helicopter transmission are used to illustrate the principles contained in the theory of kinematic errors.
Human errors and measurement uncertainty
NASA Astrophysics Data System (ADS)
Kuselman, Ilya; Pennecchi, Francesca
2015-04-01
Evaluating the residual risk of human errors in a measurement and testing laboratory, remaining after the error reduction by the laboratory quality system, and quantifying the consequences of this risk for the quality of the measurement/test results are discussed based on expert judgments and Monte Carlo simulations. A procedure for evaluation of the contribution of the residual risk to the measurement uncertainty budget is proposed. Examples are provided using earlier published sets of expert judgments on human errors in pH measurement of groundwater, elemental analysis of geological samples by inductively coupled plasma mass spectrometry, and multi-residue analysis of pesticides in fruits and vegetables. The human error contribution to the measurement uncertainty budget in the examples was not negligible, yet also not dominant. This was assessed as a good risk management result.
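A minimal Monte Carlo sketch of the proposed idea, folding a residual risk of human error into a measurement uncertainty budget, might look like this. The error probability and magnitudes are invented for illustration, not the published expert judgments:

```python
import random

def uncertainty_with_human_error(n=20_000, seed=7):
    """Monte Carlo sketch: with small probability a human error survives
    the laboratory quality system and shifts the result, inflating the
    combined standard uncertainty. All numbers are illustrative
    (a hypothetical pH measurement)."""
    rng = random.Random(seed)
    results = []
    for _ in range(n):
        x = rng.gauss(7.00, 0.02)        # analytical uncertainty component
        if rng.random() < 0.05:          # residual risk of a human error
            x += rng.choice((-1, 1)) * 0.05
        results.append(x)
    mu = sum(results) / n
    return (sum((r - mu) ** 2 for r in results) / (n - 1)) ** 0.5

u_combined = uncertainty_with_human_error()
```

The combined standard uncertainty exceeds the purely analytical 0.02, showing a human-error contribution that is, as in the abstract, not negligible yet not dominant.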
Prediction-error in the context of real social relationships modulates reward system activity.
Poore, Joshua C; Pfeifer, Jennifer H; Berkman, Elliot T; Inagaki, Tristen K; Welborn, Benjamin L; Lieberman, Matthew D
2012-01-01
The human reward system is sensitive to both social (e.g., validation) and non-social rewards (e.g., money) and is likely integral for relationship development and reputation building. However, data is sparse on the question of whether implicit social reward processing meaningfully contributes to explicit social representations such as trust and attachment security in pre-existing relationships. This event-related fMRI experiment examined reward system prediction-error activity in response to a potent social reward-social validation-and this activity's relation to both attachment security and trust in the context of real romantic relationships. During the experiment, participants' expectations for their romantic partners' positive regard of them were confirmed (validated) or violated, in either positive or negative directions. Primary analyses were conducted using predefined regions of interest, the locations of which were taken from previously published research. Results indicate that activity for mid-brain and striatal reward system regions of interest was modulated by social reward expectation violation in ways consistent with prior research on reward prediction-error. Additionally, activity in the striatum during viewing of disconfirmatory information was associated with both increases in post-scan reports of attachment anxiety and decreases in post-scan trust, a finding that follows directly from representational models of attachment and trust.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Callan, J.R.; Kelly, R.T.; Quinn, M.L.
1995-05-01
Remote Afterloading Brachytherapy (RAB) is a medical process used in the treatment of cancer. RAB uses a computer-controlled device to remotely insert and remove radioactive sources close to a target (or tumor) in the body. Some RAB problems affecting the radiation dose to the patient have been reported and attributed to human error. To determine the root cause of human error in the RAB system, a human factors team visited 23 RAB treatment sites in the US. The team observed RAB treatment planning and delivery, interviewed RAB personnel, and performed walk-throughs, during which staff demonstrated the procedures and practices used in performing RAB tasks. Factors leading to human error in the RAB system were identified. The impact of those factors on the performance of RAB was then evaluated and prioritized in terms of safety significance. Finally, the project identified and evaluated alternative approaches for resolving the safety significant problems related to human error.
DOT National Transportation Integrated Search
2001-02-01
The Human Factors Analysis and Classification System (HFACS) is a general human error framework originally developed and tested within the U.S. military as a tool for investigating and analyzing the human causes of aviation accidents. Based upon ...
Virtual hospital--a computer-aided platform to evaluate the sense of direction.
Jiang, Ching-Fen; Li, Yuan-Shyi
2007-01-01
This paper presents a computer-aided platform, named Virtual Hospital (VH), to evaluate the wayfinding ability that is impaired in elderly people with early dementia. The VH takes advantage of virtual reality technology to make evaluation of the sense of direction more convenient and accurate than the conventional approach. A pilot study was carried out to test its feasibility in differentiating the sense of direction between genders. The results, with significant differences in response time (p<0.05) and pointing error (p<0.01) between genders, suggest the potential of the VH for clinical use. Further improvement of the human-machine interface is necessary to make the system easy for geriatric people to use.
Comparison of spatial association approaches for landscape mapping of soil organic carbon stocks
NASA Astrophysics Data System (ADS)
Miller, B. A.; Koszinski, S.; Wehrhan, M.; Sommer, M.
2015-03-01
The distribution of soil organic carbon (SOC) can be variable at small analysis scales, but consideration of its role in regional and global issues demands the mapping of large extents. There are many different strategies for mapping SOC, among which are to model the variables needed to calculate the SOC stock indirectly or to model the SOC stock directly. The purpose of this research is to compare direct and indirect approaches to mapping SOC stocks from rule-based, multiple linear regression models applied at the landscape scale via spatial association. The final products for both strategies are high-resolution maps of SOC stocks (kg m-2), covering an area of 122 km2, with accompanying maps of estimated error. For the direct modelling approach, the estimated error map was based on the internal error estimations from the model rules. For the indirect approach, the estimated error map was produced by spatially combining the error estimates of component models via standard error propagation equations. We compared these two strategies for mapping SOC stocks on the basis of the qualities of the resulting maps as well as the magnitude and distribution of the estimated error. The direct approach produced a map with less spatial variation than the map produced by the indirect approach. The increased spatial variation represented by the indirect approach improved R2 values for the topsoil and subsoil stocks. Although the indirect approach had a lower mean estimated error for the topsoil stock, the mean estimated error for the total SOC stock (topsoil + subsoil) was lower for the direct approach. For these reasons, we recommend the direct approach to modelling SOC stocks be considered a more conservative estimate of the SOC stocks' spatial distribution.
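The indirect strategy's error map rests on combining component-model errors with standard propagation rules. As a minimal sketch (function name and all numbers hypothetical; the paper's component models are rule-based regressions, not reproduced here), the relative errors of independent factors in a product such as stock = bulk density × depth × SOC concentration add in quadrature:

```python
import math

def soc_stock_error(bd, sd_bd, depth, sd_depth, conc, sd_conc):
    """Propagate independent component-model errors through the product
    stock = bulk density * depth * SOC concentration (kg m^-2).
    For a product of independent factors, relative errors add in quadrature."""
    stock = bd * depth * conc
    rel_err = math.sqrt((sd_bd / bd) ** 2
                        + (sd_depth / depth) ** 2
                        + (sd_conc / conc) ** 2)
    return stock, stock * rel_err

# Hypothetical topsoil cell: 1300 kg/m^3 bulk density, 0.3 m depth, 2% SOC
stock, err = soc_stock_error(1300, 100, 0.3, 0.05, 0.02, 0.004)
```

Summing per-factor relative errors in quadrature assumes the component models' errors are independent, which is one reason a combined (indirect) error estimate can differ from a direct model's internal error estimate.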
Comparison of spatial association approaches for landscape mapping of soil organic carbon stocks
NASA Astrophysics Data System (ADS)
Miller, B. A.; Koszinski, S.; Wehrhan, M.; Sommer, M.
2014-11-01
The distribution of soil organic carbon (SOC) can be variable at small analysis scales, but consideration of its role in regional and global issues demands the mapping of large extents. There are many different strategies for mapping SOC, among which are to model the variables needed to calculate the SOC stock indirectly or to model the SOC stock directly. The purpose of this research is to compare direct and indirect approaches to mapping SOC stocks from rule-based, multiple linear regression models applied at the landscape scale via spatial association. The final products for both strategies are high-resolution maps of SOC stocks (kg m-2), covering an area of 122 km2, with accompanying maps of estimated error. For the direct modelling approach, the estimated error map was based on the internal error estimations from the model rules. For the indirect approach, the estimated error map was produced by spatially combining the error estimates of component models via standard error propagation equations. We compared these two strategies for mapping SOC stocks on the basis of the qualities of the resulting maps as well as the magnitude and distribution of the estimated error. The direct approach produced a map with less spatial variation than the map produced by the indirect approach. The increased spatial variation represented by the indirect approach improved R2 values for the topsoil and subsoil stocks. Although the indirect approach had a lower mean estimated error for the topsoil stock, the mean estimated error for the total SOC stock (topsoil + subsoil) was lower for the direct approach. For these reasons, we recommend the direct approach to modelling SOC stocks be considered a more conservative estimate of the SOC stocks' spatial distribution.
NASA Astrophysics Data System (ADS)
Kugeiko, M. M.; Lisenko, S. A.
2008-07-01
An easily automated method for determining the real part of the refractive index of human blood erythrocytes in the range 0.3 to 1.2 μm is proposed. The method is operationally and metrologically reliable and is based on measuring the coefficients of light scattering into the forward and backward hemispheres at two pairs of angles and on the use of multiple regression equations. An engineering solution for constructing a measurement system according to this method is proposed, which makes it possible to minimize calibration errors and the effects of destabilizing factors.
NASA Astrophysics Data System (ADS)
Chen, Yue; Cunningham, Gregory; Henderson, Michael
2016-09-01
This study aims to statistically estimate the errors in local magnetic field directions that are derived from electron directional distributions measured by Los Alamos National Laboratory geosynchronous (LANL GEO) satellites. First, by comparing derived and measured magnetic field directions along the GEO orbit to those calculated from three selected empirical global magnetic field models (including a static Olson and Pfitzer 1977 quiet magnetic field model, a simple dynamic Tsyganenko 1989 model, and a sophisticated dynamic Tsyganenko 2001 storm model), it is shown that the errors in both derived and modeled directions are at least comparable. Second, using a newly developed proxy method as well as comparing results from empirical models, we are able to provide for the first time circumstantial evidence showing that derived magnetic field directions should statistically match the real magnetic directions better, with averaged errors < ~2°, than those from the three empirical models with averaged errors > ~5°. In addition, our results suggest that the errors in derived magnetic field directions do not depend much on magnetospheric activity, in contrast to the empirical field models. Finally, as applications of the above conclusions, we show examples of electron pitch angle distributions observed by LANL GEO and also take the derived magnetic field directions as the real ones so as to test the performance of empirical field models along the GEO orbits, with results suggesting dependence on solar cycles as well as satellite locations. This study demonstrates the validity and value of the method that infers local magnetic field directions from particle spin-resolved distributions.
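The quoted averaged errors reduce to angular separations between direction estimates. A minimal sketch (function name and vectors hypothetical) of computing the error angle between a derived field direction and a model direction:

```python
import math

def angle_between_deg(u, v):
    """Angle in degrees between two 3-D direction vectors, e.g. a field
    direction derived from electron distributions vs. a model prediction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    cos_angle = max(-1.0, min(1.0, dot / (norm_u * norm_v)))  # clamp rounding
    return math.degrees(math.acos(cos_angle))

# A direction tilted 2 degrees from the z axis, echoing the ~2 degree
# averaged errors quoted for derived directions (example values only)
tilted = (0.0, math.sin(math.radians(2.0)), math.cos(math.radians(2.0)))
error_deg = angle_between_deg((0.0, 0.0, 1.0), tilted)
```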
Chen, Yue; Cunningham, Gregory; Henderson, Michael
2016-09-21
Our study aims to statistically estimate the errors in local magnetic field directions that are derived from electron directional distributions measured by Los Alamos National Laboratory geosynchronous (LANL GEO) satellites. First, by comparing derived and measured magnetic field directions along the GEO orbit to those calculated from three selected empirical global magnetic field models (including a static Olson and Pfitzer 1977 quiet magnetic field model, a simple dynamic Tsyganenko 1989 model, and a sophisticated dynamic Tsyganenko 2001 storm model), it is shown that the errors in both derived and modeled directions are at least comparable. Furthermore, using a newly developed proxy method as well as comparing results from empirical models, we are able to provide for the first time circumstantial evidence showing that derived magnetic field directions should statistically match the real magnetic directions better, with averaged errors < ~2°, than those from the three empirical models with averaged errors > ~5°. In addition, our results suggest that the errors in derived magnetic field directions do not depend much on magnetospheric activity, in contrast to the empirical field models. As applications of the above conclusions, we show examples of electron pitch angle distributions observed by LANL GEO and also take the derived magnetic field directions as the real ones so as to test the performance of empirical field models along the GEO orbits, with results suggesting dependence on solar cycles as well as satellite locations. Finally, this study demonstrates the validity and value of the method that infers local magnetic field directions from particle spin-resolved distributions.
NASA Technical Reports Server (NTRS)
Gore, Brian F.
2011-01-01
As automation and advanced technologies are introduced into transport systems ranging from the Next Generation Air Transportation System termed NextGen, to the advanced surface transportation systems as exemplified by the Intelligent Transportation Systems, to future systems designed for space exploration, there is an increased need to validly predict how the future systems will be vulnerable to error given the demands imposed by the assistive technologies. One formalized approach to study the impact of assistive technologies on the human operator in a safe and non-obtrusive manner is through the use of human performance models (HPMs). HPMs play an integral role when complex human-system designs are proposed, developed, and tested. One HPM tool, termed the Man-machine Integration Design and Analysis System (MIDAS), is a NASA Ames Research Center HPM software tool that has been applied to predict human-system performance in various domains since 1986. MIDAS is a dynamic, integrated HPM and simulation environment that facilitates the design, visualization, and computational evaluation of complex man-machine system concepts in simulated operational environments. The paper will discuss a range of aviation-specific applications, including an approach used to model human error for NASA's Aviation Safety Program, and what-if analyses to evaluate flight deck technologies for NextGen operations. The chapter will culminate by raising two challenges for the field of predictive HPMs for complex human-system designs that evaluate assistive technologies: (1) model transparency and (2) model validation.
Yamanari, Masahiro; Nagase, Satoko; Fukuda, Shinichi; Ishii, Kotaro; Tanaka, Ryosuke; Yasui, Takeshi; Oshika, Tetsuro; Miura, Masahiro; Yasuno, Yoshiaki
2014-05-01
The relationship between scleral birefringence and biometric parameters of human eyes in vivo is investigated. Scleral birefringence near the limbus of 21 healthy human eyes was measured using polarization-sensitive optical coherence tomography. Spherical equivalent refractive error, axial eye length, and intraocular pressure (IOP) were measured in all subjects. IOP and scleral birefringence of human eyes in vivo were found to be significantly correlated (r = -0.63, P = 0.002). The slope of linear regression was -2.4 × 10^-2 deg/μm/mmHg. Neither spherical equivalent refractive error nor axial eye length had significant correlations with scleral birefringence. To evaluate the direct influence of IOP on scleral birefringence, scleral birefringence of 16 ex vivo porcine eyes was measured under controlled IOP of 5-60 mmHg. In these ex vivo porcine eyes, the mean linear regression slope between controlled IOP and scleral birefringence was -9.9 × 10^-4 deg/μm/mmHg. In addition, porcine scleral collagen fibers were observed with second-harmonic-generation (SHG) microscopy. SHG images of porcine sclera, measured on the external surface at the superior side to the cornea, showed highly aligned collagen fibers parallel to the limbus. In conclusion, scleral birefringence of healthy human eyes was correlated with IOP, indicating that the ultrastructure of scleral collagen was correlated with IOP. It remains to be shown whether the scleral collagen ultrastructure of human eyes is affected by IOP as a long-term effect.
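The reported associations rest on a Pearson correlation coefficient and a least-squares slope. A minimal sketch of both quantities (synthetic, perfectly linear data using the paper's slope; not the study's actual measurements):

```python
def pearson_and_slope(x, y):
    """Pearson correlation r and least-squares regression slope for paired
    measurements, e.g. IOP (mmHg) vs. scleral birefringence (deg/um)."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    sxy = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sxx = sum((a - mean_x) ** 2 for a in x)
    syy = sum((b - mean_y) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5, sxy / sxx

# Synthetic, noise-free data with slope -2.4e-2 deg/um per mmHg, so r = -1;
# real measurements scatter around the line, giving |r| < 1
iop = [10.0, 12.0, 14.0, 16.0, 18.0]
birefringence = [0.5 - 2.4e-2 * p for p in iop]
r, slope = pearson_and_slope(iop, birefringence)
```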
Object motion computation for the initiation of smooth pursuit eye movements in humans.
Wallace, Julian M; Stone, Leland S; Masson, Guillaume S
2005-04-01
Pursuing an object with smooth eye movements requires an accurate estimate of its two-dimensional (2D) trajectory. This 2D motion computation requires that different local motion measurements are extracted and combined to recover the global object-motion direction and speed. Several combination rules have been proposed, such as vector averaging (VA), intersection of constraints (IOC), or 2D feature tracking (2DFT). To examine this computation, we investigated the time course of smooth pursuit eye movements driven by simple objects of different shapes. For a type II diamond (where the direction of true object motion is dramatically different from the vector average of the one-dimensional edge motions, i.e., VA ≠ IOC = 2DFT), ocular tracking is initiated in the vector average direction. Over a period of less than 300 ms, the eye-tracking direction converges on the true object motion. The reduction of the tracking error starts before the closing of the oculomotor loop. For type I diamonds (where the direction of true object motion is identical to the vector average direction, i.e., VA = IOC = 2DFT), there is no such bias. We quantified this effect by calculating the direction error between responses to types I and II and measuring its maximum value and time constant. At low contrast and high speeds, the initial bias in tracking direction is larger and takes longer to converge onto the actual object-motion direction. This effect is attenuated with the introduction of more 2D information, to the extent that it was totally obliterated with a texture-filled type II diamond. These results suggest a flexible 2D computation for motion integration, which combines all available one-dimensional (edge) and 2D (feature) motion information to refine the estimate of object-motion direction over time.
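The combination rules being contrasted can be made concrete. A minimal sketch (all angles and speeds hypothetical) of the intersection-of-constraints solution and the vector average for two 1D edge measurements:

```python
import math

def ioc(n1, s1, n2, s2):
    """Intersection of constraints: solve v . n1 = s1 and v . n2 = s2
    for the single 2-D velocity consistent with both 1-D edge measurements."""
    det = n1[0] * n2[1] - n1[1] * n2[0]
    vx = (s1 * n2[1] - s2 * n1[1]) / det
    vy = (n1[0] * s2 - n2[0] * s1) / det
    return vx, vy

def direction_deg(v):
    return math.degrees(math.atan2(v[1], v[0]))

# Type II configuration: both edge normals lie on the same side of the true
# rightward motion (1, 0); each 1-D speed is the projection of true motion.
n1 = (math.cos(math.radians(20.0)), math.sin(math.radians(20.0)))
n2 = (math.cos(math.radians(70.0)), math.sin(math.radians(70.0)))
s1, s2 = n1[0], n2[0]               # projections of (1, 0) onto each normal

v_ioc = ioc(n1, s1, n2, s2)         # recovers the true motion exactly
v_va = ((s1 * n1[0] + s2 * n2[0]) / 2,
        (s1 * n1[1] + s2 * n2[1]) / 2)   # biased toward the normals
```

Here the vector average points well off the true direction while IOC recovers it, mirroring the initial pursuit bias and its subsequent convergence.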
Limb position sense, proprioceptive drift and muscle thixotropy at the human elbow joint
Tsay, A; Savage, G; Allen, T J; Proske, U
2014-01-01
These experiments on the human forearm are based on the hypothesis that drift in the perceived position of a limb over time can be explained by receptor adaptation. Limb position sense was measured in 39 blindfolded subjects using a forearm-matching task. A property of muscle, its thixotropy, a contraction history-dependent passive stiffness, was exploited to place muscle receptors of elbow muscles in a defined state. After the arm had been held flexed and elbow flexors contracted, we observed time-dependent changes in the perceived position of the reference arm by an average of 2.8° in the direction of elbow flexion over 30 s (Experiment 1). The direction of the drift reversed after the arm had been extended and elbow extensors contracted, with a mean shift of 3.5° over 30 s in the direction of elbow extension (Experiment 2). The time-dependent changes could be abolished by conditioning elbow flexors and extensors in the reference arm at the test angle, although this led to large position errors during matching (±10°), depending on how the indicator arm had been conditioned (Experiments 3 and 4). When slack was introduced in the elbow muscles of both arms, by shortening muscles after the conditioning contraction, matching errors became small and there was no drift in position sense (Experiments 5 and 6). These experiments argue for a receptor-based mechanism for proprioceptive drift and suggest that to align the two forearms, the brain monitors the difference between the afferent signals from the two arms. PMID:24665096
Compound Stimulus Presentation Does Not Deepen Extinction in Human Causal Learning
Griffiths, Oren; Holmes, Nathan; Westbrook, R. Fred
2017-01-01
Models of associative learning have proposed that cue-outcome learning critically depends on the degree of prediction error encountered during training. Two experiments examined the role of error-driven extinction learning in a human causal learning task. Target cues underwent extinction in the presence of additional cues, which differed in the degree to which they predicted the outcome, thereby manipulating outcome expectancy and, in the absence of any change in reinforcement, prediction error. These prediction error manipulations have each been shown to modulate extinction learning in aversive conditioning studies. While both manipulations resulted in increased prediction error during training, neither enhanced extinction in the present human learning task (one manipulation resulted in less extinction at test). The results are discussed with reference to the types of associations that are regulated by prediction error, the types of error terms involved in their regulation, and how these interact with parameters involved in training. PMID:28232809
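The logic of the manipulation follows error-driven models such as Rescorla-Wagner, in which every presented cue shares a summed prediction error. A minimal sketch (illustrative parameters, not fitted to any data) of why an added excitatory cue should deepen extinction under that account:

```python
def rw_extinction(v_cues, alpha=0.3, beta=0.5, trials=30):
    """Rescorla-Wagner extinction: the outcome is absent (lambda = 0) on
    every trial, and each presented cue is updated by the summed error."""
    v = list(v_cues)
    for _ in range(trials):
        error = 0.0 - sum(v)                       # summed prediction error
        v = [vi + alpha * beta * error for vi in v]
    return v

target_alone = rw_extinction([1.0])[0]             # target extinguished alone
target_compound = rw_extinction([1.0, 1.0])[0]     # with an excitatory cue
```

The added excitatory cue doubles the error term, so the model predicts faster, deeper extinction of the target in compound; the human causal-learning results above did not show this predicted benefit.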
Kim, Jun Sik; Jeong, Byung Yong
2018-05-03
The study aimed to describe the characteristics of occupational injuries among female workers in residential healthcare facilities for the elderly, and to analyze human errors as causes of accidents. From the national industrial accident compensation data, 506 injuries to female workers were analyzed by age and occupation. The results showed that medical service workers were the most prevalent among the injured (54.1%), followed by social welfare workers (20.4%). Among the injured, 55.7% had <1 year of work experience, and 37.9% were ≥60 years old. Slips/falls were the most common type of accident (42.7%), and the proportion injured by slips/falls increases with age. Among human errors, action errors were the primary cause, followed by perception errors and cognition errors. Moreover, the ratios of injuries caused by perception errors and by action errors each increase with age. The findings of this study suggest that there is a need to design workplaces that accommodate the characteristics of older female workers.
Research on the liquid crystal adaptive optics system for human retinal imaging
NASA Astrophysics Data System (ADS)
Zhang, Lei; Tong, Shoufeng; Song, Yansong; Zhao, Xin
2013-12-01
The retina is the only place in the human eye where blood vessels can be observed directly. Many diseases whose early symptoms are not obvious can be diagnosed by observing changes in the distal micro blood vessels. In order to obtain high-resolution human retinal images, an adaptive optics system for correcting the aberrations of the human eye was designed using a Shack-Hartmann wavefront sensor and a Liquid Crystal Spatial Light Modulator (LCLSM). For a subject eye with 8 m-1 (8 D) of myopia, the wavefront error was reduced to 0.084 λ PV and 0.12 λ RMS after adaptive optics (AO) correction, reaching the diffraction limit. The results show that the LCLSM-based AO system can correct the aberrations of the human eye efficiently, allowing otherwise blurred photoreceptor cells to be imaged clearly on a CCD camera.
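The two figures of merit quoted for the correction, peak-to-valley (PV) and root-mean-square (RMS) wavefront error, can be computed directly from a residual wavefront map. A minimal sketch (sample values hypothetical, in units of the wavelength λ):

```python
import math

def pv_and_rms(wavefront):
    """Peak-to-valley and root-mean-square of residual wavefront samples
    (in waves), the standard figures of merit for an AO correction."""
    n = len(wavefront)
    mean = sum(wavefront) / n
    pv = max(wavefront) - min(wavefront)
    rms = math.sqrt(sum((w - mean) ** 2 for w in wavefront) / n)
    return pv, rms

# Hypothetical residual wavefront samples after correction
pv, rms = pv_and_rms([0.0, 0.05, -0.03, 0.02])
```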
Reinhart, Robert M G; Zhu, Julia; Park, Sohee; Woodman, Geoffrey F
2015-07-28
Executive control and flexible adjustment of behavior following errors are essential to adaptive functioning. Loss of adaptive control may be a biomarker of a wide range of neuropsychiatric disorders, particularly in the schizophrenia spectrum. Here, we provide support for the view that oscillatory activity in the frontal cortex underlies adaptive adjustments in cognitive processing following errors. Compared with healthy subjects, patients with schizophrenia exhibited low frequency oscillations with abnormal temporal structure and an absence of synchrony over medial-frontal and lateral-prefrontal cortex following errors. To demonstrate that these abnormal oscillations were the origin of the impaired adaptive control in patients with schizophrenia, we applied noninvasive dc electrical stimulation over the medial-frontal cortex. This noninvasive stimulation descrambled the phase of the low-frequency neural oscillations that synchronize activity across cortical regions. Following stimulation, the behavioral index of adaptive control was improved such that patients were indistinguishable from healthy control subjects. These results provide unique causal evidence for theories of executive control and cortical dysconnectivity in schizophrenia.
Human Factors in Training - Space Medicine Proficiency Training
NASA Technical Reports Server (NTRS)
Connell, Erin; Arsintescu, Lucia
2009-01-01
The early Constellation space missions are expected to have medical capabilities very similar to those currently on the Space Shuttle and International Space Station (ISS). For Crew Exploration Vehicle (CEV) missions to ISS, medical equipment will be located on ISS, and carried into CEV in the event of an emergency. Flight Surgeons (FS) on the ground in Mission Control will be expected to direct the Crew Medical Officer (CMO) during medical situations. If there is a loss of signal and the crew is unable to communicate with the ground, a CMO would be expected to carry out medical procedures without the aid of a FS. In these situations, performance support tools can be used to reduce errors and time to perform emergency medical tasks. Work on medical training has been conducted in collaboration with the Medical Training Group at the Space Life Sciences Directorate and with Wyle Lab, which provides medical training to crew members, Biomedical Engineers (BMEs), and flight surgeons under the JSC Space Life Sciences Directorate's Bioastronautics contract. The space medical training work is part of the Human Factors in Training Directed Research Project (DRP) of the Space Human Factors Engineering (SHFE) Project under the Space Human Factors and Habitability (SHFH) Element of the Human Research Program (HRP). Human factors researchers at Johnson Space Center have recently investigated medical performance support tools for CMOs on-orbit, and FSs on the ground, and researchers at the Ames Research Center performed a literature review on medical errors. The work proposed for FY10 continues to build on this strong collaboration with the Space Medical Training Group and previous research. This abstract focuses on two areas of work involving Performance Support Tools for Space Medical Operations. One area of research, building on activities from FY08, involved the feasibility of just-in-time (JIT) training techniques and concepts for real-time medical procedures.
In Phase 1, preliminary feasibility data was gathered for two types of prototype display technologies: a hand-held PDA, and a Head Mounted Display (HMD). The PDA and HMD were compared while performing a simulated medical procedure using ISS flight-like medical equipment. Based on the outcome of Phase 1, including data on user preferences, further testing was completed using the PDA only. Phase 2 explored a wrist-mounted PDA, and compared it to a paper cue card. For each phase, time to complete procedures, errors, and user satisfaction ratings were captured.
[Risk and risk management in aviation].
Müller, Manfred
2004-10-01
RISK MANAGEMENT: The large proportion of human error in aviation accidents suggested a solution that seemed brilliant at first sight: replace the fallible human being with an "infallible" digitally operating computer. However, even after the introduction of the so-called HITEC airplanes, human error still accounts for 75% of all accidents. Thus, if the computer is ruled out as the ultimate safety system, how else can complex operations involving quick and difficult decisions be controlled? OPTIMIZED TEAM INTERACTION/PARALLEL CONNECTION OF THOUGHT MACHINES: Since a single person is always "highly error-prone", support and control have to be guaranteed by a second person. Two minds working independently create a safety network that cushions human errors more effectively. NON-PUNITIVE ERROR MANAGEMENT: To be able to tackle the actual problems, the open discussion of errors that have occurred must not be endangered by the threat of punishment. It has been shown in the past that progress is primarily achieved by investigating and following up mistakes, failures and catastrophes shortly after they happened. HUMAN FACTOR RESEARCH PROJECT: A comprehensive survey showed the following result: by far the most frequent safety-critical situation (37.8% of all events) consists of the following combination of risk factors: 1. A complication develops. 2. In this situation of increased stress, a human error occurs. 3. The negative effects of the error cannot be corrected or eased because there are deficiencies in team interaction on the flight deck. This means, for example, that a negative social climate acts as a "turbocharger" when a human error occurs. It needs to be pointed out that a negative social climate is not identical with a dispute. In many cases the working climate is burdened without the responsible person even noticing it: a first negative impression, too much or too little respect, contempt, misunderstandings, or unexpressed concerns can considerably reduce the efficiency of a team.
Accuracy of planar reaching movements. I. Independence of direction and extent variability.
Gordon, J; Ghilardi, M F; Ghez, C
1994-01-01
This study examined the variability in movement end points in a task in which human subjects reached to targets in different locations on a horizontal surface. The primary purpose was to determine whether patterns in the variable errors would reveal the nature and origin of the coordinate system in which the movements were planned. Six subjects moved a hand-held cursor on a digitizing tablet. Target and cursor positions were displayed on a computer screen, and vision of the hand and arm was blocked. The screen cursor was blanked during movement to prevent visual corrections. The paths of the movements were straight and thus directions were largely specified at the onset of movement. The velocity profiles were bell-shaped, and peak velocities and accelerations were scaled to target distance, implying that movement extent was also programmed in advance of the movement. The spatial distributions of movement end points were elliptical in shape. The major axes of these ellipses were systematically oriented in the direction of hand movement with respect to its initial position. This was true for both fast and slow movements, as well as for pointing movements involving rotations of the wrist joint. Using principal components analysis to compute the axes of these ellipses, we found that the eccentricity of the elliptical dispersions was uniformly greater for small than for large movements: variability along the axis of movement, representing extent variability, increased markedly but nonlinearly with distance. Variability perpendicular to the direction of movement, which results from directional errors, was generally smaller than extent variability, but it increased in proportion to the extent of the movement. Therefore, directional variability, in angular terms, was constant and independent of distance. 
Because the patterns of variability were similar for both slow and fast movements, as well as for movements involving different joints, we conclude that they result largely from errors in the planning process. We also argue that they cannot be simply explained as consequences of the inertial properties of the limb. Rather they provide evidence for an organizing mechanism that moves the limb along a straight path. We further conclude that reaching movements are planned in a hand-centered coordinate system, with direction and extent of hand movement as the planned parameters. Since the factors which influence directional variability are independent of those that influence extent errors, we propose that these two variables can be separately specified by the brain.
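The principal-components step that recovers the endpoint ellipses can be sketched for 2-D scatter (synthetic points, not the study's data; for a 2x2 covariance matrix the eigen-decomposition has a closed form):

```python
import math

def endpoint_ellipse(points):
    """Principal axes of a 2-D endpoint scatter: eigenvalues of the 2x2
    covariance matrix give the variances along the ellipse's major and
    minor axes, and the eigenvector angle gives its orientation (deg)."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    trace, det = sxx + syy, sxx * syy - sxy ** 2
    disc = math.sqrt(max(0.0, (trace / 2) ** 2 - det))
    major, minor = trace / 2 + disc, trace / 2 - disc
    angle = 0.5 * math.degrees(math.atan2(2 * sxy, sxx - syy))
    return major, minor, angle

# Synthetic end points elongated along the movement (x) direction
major, minor, angle = endpoint_ellipse([(-2, 0), (2, 0), (0, 1), (0, -1)])
```

For scatter elongated along the movement direction, as reported above, the major axis aligns with the initial-position-to-target direction (angle near 0 in this example).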
Competition between learned reward and error outcome predictions in anterior cingulate cortex.
Alexander, William H; Brown, Joshua W
2010-02-15
The anterior cingulate cortex (ACC) is implicated in performance monitoring and cognitive control. Non-human primate studies of ACC show prominent reward signals, but these are elusive in human studies, which instead show mainly conflict and error effects. Here we demonstrate distinct appetitive and aversive activity in human ACC. The error likelihood hypothesis suggests that ACC activity increases in proportion to the likelihood of an error, and ACC is also sensitive to the consequence magnitude of the predicted error. Previous work further showed that error likelihood effects reach a ceiling as the potential consequences of an error increase, possibly due to reductions in the average reward. We explored this issue by independently manipulating reward magnitude of task responses and error likelihood while controlling for potential error consequences in an Incentive Change Signal Task. The fMRI results ruled out a modulatory effect of expected reward on error likelihood effects in favor of a competition effect between expected reward and error likelihood. Dynamic causal modeling showed that error likelihood and expected reward signals are intrinsic to the ACC rather than received from elsewhere. These findings agree with interpretations of ACC activity as signaling both perceptions of risk and predicted reward.
Sollmann, Nico; Tanigawa, Noriko; Tussis, Lorena; Hauck, Theresa; Ille, Sebastian; Maurer, Stefanie; Negwer, Chiara; Zimmer, Claus; Ringel, Florian; Meyer, Bernhard; Krieg, Sandro M
2015-04-01
Knowledge about the cortical representation of semantic processing is mainly derived from functional magnetic resonance imaging (fMRI) or direct cortical stimulation (DCS) studies. Because DCS is regarded as the gold standard for language mapping but can only be used during awake surgery due to its invasive character, repetitive navigated transcranial magnetic stimulation (rTMS), a non-invasive modality that uses a technique similar to DCS, seems highly feasible for investigating semantic processing in the healthy human brain. A total of 100 (50 left-hemispheric and 50 right-hemispheric) rTMS-based language mappings were performed in 50 purely right-handed, healthy volunteers during an object-naming task. All rTMS-induced semantic naming errors were then counted and evaluated systematically. Furthermore, since the distribution of stimulations within both hemispheres varied between individuals and cortical regions stimulated, all elicited errors were standardized and subsequently related to their cortical sites by projecting the mapping results into the cortical parcellation system (CPS). Overall, most left-hemispheric semantic errors were observed after targeting the rTMS to the posterior middle frontal gyrus (pMFG; standardized error rate: 7.3‰), anterior supramarginal gyrus (aSMG; 5.6‰), and ventral postcentral gyrus (vPoG; 5.0‰). In contrast, the highest right-hemispheric error rates occurred after stimulation of the posterior superior temporal gyrus (pSTG; 12.4‰), middle superior temporal gyrus (mSTG; 6.2‰), and anterior supramarginal gyrus (aSMG; 6.2‰). Although error rates were low, the rTMS-based approach to investigating semantic processing during object naming shows convincing results compared to the current literature. Therefore, rTMS seems to be a valuable, safe, and reliable tool for the investigation of semantic processing within the healthy human brain. Copyright © 2015 Elsevier Ltd. All rights reserved.
Rong, Hao; Tian, Jin; Zhao, Tingdi
2016-01-01
In traditional approaches to human reliability assessment (HRA), the definition of error-producing conditions (EPCs) and the supporting guidance make it difficult to include some conditions (especially organizational or managerial ones), so the analysis is incomplete and fails to reflect the temporal trend of human reliability. A method based on system dynamics (SD), which highlights interrelationships among technical and organizational factors that may contribute to human error, is presented to facilitate quantitative estimation of the human error probability (HEP) and related variables as they change over a long period. Taking the 2008 Minuteman III missile accident as a case, the proposed HRA method is applied to assess HEP during missile operations over 50 years by analyzing the interactions among the variables involved in human-related risks; the critical factors are also determined in terms of the impact the variables have on risks in different time periods. The results indicate that both technical and organizational aspects must be addressed to minimize human error in the long run. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
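A system-dynamics HEP model of the kind described above can be sketched as coupled "stocks" integrated over time. Every variable, rate, and coefficient below is a hypothetical illustration, not the paper's calibrated model:

```python
# Illustrative system-dynamics sketch (all variables and rates assumed):
# human error probability (HEP) evolves with organizational-oversight and
# equipment-condition stocks via simple Euler integration over 50 years.

def simulate_hep(years=50, dt=0.1):
    oversight = 1.0      # organizational oversight stock (normalized)
    equipment = 1.0      # technical condition stock (normalized)
    hep_series = []
    t = 0.0
    while t < years:
        # assumed feedback: oversight erodes without reinforcement,
        # equipment degrades with age, maintenance is driven by oversight
        d_oversight = -0.02 * oversight + 0.01
        d_equipment = -0.03 * equipment + 0.02 * oversight
        oversight += d_oversight * dt
        equipment += d_equipment * dt
        # HEP rises as oversight and equipment condition decline
        hep = 0.001 + 0.01 * (1 - oversight) + 0.005 * (1 - equipment)
        hep_series.append((round(t, 1), hep))
        t += dt
    return hep_series

series = simulate_hep()
print(series[0], series[-1])  # HEP drifts upward as the stocks decay
```

The point of such a model is exactly what the abstract emphasizes: HEP is not a static number but a trajectory shaped by organizational as well as technical dynamics.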
Giuliani, Manuel; Mirnig, Nicole; Stollnberger, Gerald; Stadler, Susanne; Buchner, Roland; Tscheligi, Manfred
2015-01-01
Human-robot interactions are often affected by error situations that are caused by either the robot or the human. Therefore, robots would profit from the ability to recognize when error situations occur. We investigated the verbal and non-verbal social signals that humans show when error situations occur in human-robot interaction experiments. For that, we analyzed 201 videos of five human-robot interaction user studies with varying tasks from four independent projects. The analysis shows that there are two types of error situations: social norm violations and technical failures. Social norm violations are situations in which the robot does not adhere to the underlying social script of the interaction. Technical failures are caused by technical shortcomings of the robot. The results of the video analysis show that the study participants use many head movements and very few gestures, but they often smile when in an error situation with the robot. Another result is that the participants sometimes stop moving at the beginning of error situations. We also found that the participants talked more in the case of social norm violations and less during technical failures. Finally, the participants use fewer non-verbal social signals (for example smiling, nodding, and head shaking) when they are interacting with the robot alone and no experimenter or other human is present. The results suggest that participants do not see the robot as a social interaction partner with comparable communication skills. Our findings have implications for builders and evaluators of human-robot interaction systems. The builders need to consider including modules for recognition and classification of head movements in the robot's input channels. The evaluators need to make sure that the presence of an experimenter does not skew the results of their user studies.
Air-ground information transfer in the National Airspace System
NASA Technical Reports Server (NTRS)
Lee, Alfred T.; Lozito, Sandra
1989-01-01
This paper reviews NASA's Aviation Safety Reporting System incident data for a two-year period in order to identify the frequency of air-ground information transfer errors and the factors associated with their occurrence. Of the more than 14,000 primary reports received during the 1985 and 1986 reporting period, one out of four reports concerned problems of information transfer between aircraft and ATC. Approximately half of these errors were associated directly or indirectly with aircraft deviations from assigned heading or altitude. The majority of incidents cited some human-system problem such as workload, cockpit distractions, etc., as the primary contributing factor. Improvements in air-ground information transfer using existing and future (e.g., data link) technology are proposed centering on the development and application of user-centered information management principles.
Proprioception Is Robust under External Forces
Kuling, Irene A.; Brenner, Eli; Smeets, Jeroen B. J.
2013-01-01
Information from cutaneous, muscle and joint receptors is combined with efferent information to create a reliable percept of the configuration of our body (proprioception). We exposed the hand to several horizontal force fields to examine whether external forces influence this percept. In an end-point task subjects reached visually presented positions with their unseen hand. In a vector reproduction task, subjects had to judge a distance and direction visually and reproduce the corresponding vector by moving the unseen hand. We found systematic individual errors in the reproduction of the end-points and vectors, but these errors did not vary systematically with the force fields. This suggests that human proprioception accounts for external forces applied to the hand when sensing the position of the hand in the horizontal plane. PMID:24019959
Wiegmann, D A; Shappell, S A
2001-11-01
The Human Factors Analysis and Classification System (HFACS) is a general human error framework originally developed and tested within the U.S. military as a tool for investigating and analyzing the human causes of aviation accidents. Based on Reason's (1990) model of latent and active failures, HFACS addresses human error at all levels of the system, including the condition of aircrew and organizational factors. The purpose of the present study was to assess the utility of the HFACS framework as an error analysis and classification tool outside the military. The HFACS framework was used to analyze human error data associated with aircrew-related commercial aviation accidents that occurred between January 1990 and December 1996 using database records maintained by the NTSB and the FAA. Investigators were able to reliably accommodate all the human causal factors associated with the commercial aviation accidents examined in this study using the HFACS system. In addition, the classification of data using HFACS highlighted several critical safety issues in need of intervention research. These results demonstrate that the HFACS framework can be a viable tool for use within the civil aviation arena. However, additional research is needed to examine its applicability to areas outside the flight deck, such as aircraft maintenance and air traffic control domains.
To Err Is Human; To Structurally Prime from Errors Is Also Human
ERIC Educational Resources Information Center
Slevc, L. Robert; Ferreira, Victor S.
2013-01-01
Natural language contains disfluencies and errors. Do listeners simply discard information that was clearly produced in error, or can erroneous material persist to affect subsequent processing? Two experiments explored this question using a structural priming paradigm. Speakers described dative-eliciting pictures after hearing prime sentences that…
Human factors analysis and classification system-HFACS.
DOT National Transportation Integrated Search
2000-02-01
Human error has been implicated in 70 to 80% of all civil and military aviation accidents. Yet, most accident : reporting systems are not designed around any theoretical framework of human error. As a result, most : accident databases are not conduci...
Technical approaches for measurement of human errors
NASA Technical Reports Server (NTRS)
Clement, W. F.; Heffley, R. K.; Jewell, W. F.; Mcruer, D. T.
1980-01-01
Human error is a significant contributing factor in a very high proportion of civil transport, general aviation, and rotorcraft accidents. The technical details of a variety of proven approaches for the measurement of human errors in the context of the national airspace system are presented. Unobtrusive measurements suitable for cockpit operations and procedures in part- or full-mission simulation are emphasized. Procedure-centered, system-performance-centered, and human-operator-centered measurements are discussed as they apply to the manual control, communication, supervisory, and monitoring tasks relevant to aviation operations.
Comparison of direct and heterodyne detection optical intersatellite communication links
NASA Technical Reports Server (NTRS)
Chen, C. C.; Gardner, C. S.
1987-01-01
The performance of direct and heterodyne detection optical intersatellite communication links is evaluated and compared. It is shown that the performance of optical links is very sensitive to pointing and tracking errors at the transmitter and receiver. In the presence of random pointing and tracking errors, optimal antenna gains exist that minimize the required transmitter power. In addition to limiting the antenna gains, random pointing and tracking errors also impose a power penalty on the link budget. This penalty is between 1.6 and 3 dB for a direct detection QPPM link, and 3 to 5 dB for a heterodyne QFSK system. For heterodyne systems, carrier phase noise is another major source of performance degradation that must be considered; in contrast, the loss due to synchronization error is small. The link budgets for direct and heterodyne detection systems are evaluated. It is shown that, for systems with large pointing and tracking errors, the link budget is dominated by the spatial tracking error, and the direct detection system shows superior performance because it is less sensitive to that error. On the other hand, for systems with small pointing and tracking jitters, the antenna gains are in general limited by the launch cost, and suboptimal antenna gains are often used in practice; in this case, the heterodyne system has a slightly higher power margin because of its higher receiver sensitivity.
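The trade-off behind the optimal antenna gain can be illustrated numerically with a simplified Gaussian-beam pointing-loss model. The exp(-G·σ²) loss factor is an assumed textbook approximation, not the paper's exact analysis: raising the gain narrows the beam until jitter losses dominate.

```python
import math

# Toy link-budget sketch: with RMS pointing jitter sigma (radians), the mean
# pointing loss for a Gaussian beam is approximated as exp(-G * sigma**2),
# so the effective gain G_eff = G * exp(-G * sigma**2) is maximized near
# G = 1 / sigma**2 (assumed simplified model).

def effective_gain_db(gain, sigma):
    g_eff = gain * math.exp(-gain * sigma ** 2)
    return 10 * math.log10(g_eff)

sigma = 1e-6  # 1 microradian RMS jitter (assumed value)
gains = [10 ** (x / 10) for x in range(100, 131)]  # sweep 100 dB to 130 dB
best = max(gains, key=lambda g: effective_gain_db(g, sigma))
print(10 * math.log10(best))  # near 10*log10(1/sigma**2) = 120 dB
```

Pushing the gain past this optimum costs more in expected pointing loss than it buys in aperture gain, which is why random jitter both caps the useful antenna gain and imposes a residual power penalty.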
Völker, Martin; Fiederer, Lukas D J; Berberich, Sofie; Hammer, Jiří; Behncke, Joos; Kršek, Pavel; Tomášek, Martin; Marusič, Petr; Reinacher, Peter C; Coenen, Volker A; Helias, Moritz; Schulze-Bonhage, Andreas; Burgard, Wolfram; Ball, Tonio
2018-06-01
Error detection in motor behavior is a fundamental cognitive function heavily relying on local cortical information processing. Neural activity in the high-gamma frequency band (HGB) closely reflects such local cortical processing, but little is known about its role in error processing, particularly in the healthy human brain. Here we characterize the error-related response of the human brain based on data obtained with noninvasive EEG optimized for HGB mapping in 31 healthy subjects (15 females, 16 males), and additional intracranial EEG data from 9 epilepsy patients (4 females, 5 males). Our findings reveal a multiscale picture of the global and local dynamics of error-related HGB activity in the human brain. On the global level as reflected in the noninvasive EEG, the error-related response started with an early component dominated by anterior brain regions, followed by a shift to parietal regions, and a subsequent phase characterized by sustained parietal HGB activity. This phase lasted for more than 1 s after the error onset. On the local level reflected in the intracranial EEG, a cascade of both transient and sustained error-related responses involved an even more extended network, spanning beyond frontal and parietal regions to the insula and the hippocampus. HGB mapping appeared especially well suited to investigate late, sustained components of the error response, possibly linked to downstream functional stages such as error-related learning and behavioral adaptation. Our findings establish the basic spatio-temporal properties of HGB activity as a neural correlate of error processing, complementing traditional error-related potential studies. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Zhang, Hao; Yuan, Yan; Su, Lijuan; Huang, Fengzhen; Bai, Qing
2016-09-01
The Risley-prism-based light beam steering apparatus delivers superior pointing accuracy and is used in imaging LIDAR and imaging microscopes. A general model for pointing error analysis of Risley prisms, based on ray-direction deviation in light refraction, is proposed in this paper. The model captures incident-beam deviation, assembly deflections, and prism rotational error. We first derive the transmission matrices of the model, and then analyze the independent and cumulative effects of the different errors through it. An accuracy study of the model shows that the prediction deviation of the pointing error is less than 4.1×10⁻⁵° for each error source when the error amplitude is 0.1°. Detailed analyses indicate that different error sources affect the pointing accuracy to varying degrees, and that the major error source is incident-beam deviation. Prism tilt has a relatively large effect on pointing accuracy when the prism tilts in the principal section. Cumulative-effect analyses of multiple errors show that the pointing error can be reduced by tuning the bearing tilt in the same direction. The cumulative effect of rotational error is relatively large when the difference between the two prism rotation angles equals 0 or π, and relatively small when the difference equals π/2. These results can help uncover the error distribution and aid in measurement calibration of Risley-prism systems.
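The ray-direction deviation at a single refracting surface, the basic ingredient of a model like the one above, can be sketched with the vector form of Snell's law. The wedge angle, refractive index, and geometry below are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

def refract(d, n_surf, n1, n2):
    """Refract unit ray direction d at a surface with unit normal n_surf
    (oriented against the incoming ray), from index n1 into index n2."""
    d = d / np.linalg.norm(d)
    n_surf = n_surf / np.linalg.norm(n_surf)
    r = n1 / n2
    cos_i = -np.dot(n_surf, d)
    sin_t2 = r ** 2 * (1.0 - cos_i ** 2)
    if sin_t2 > 1.0:
        raise ValueError("total internal reflection")
    # Vector form of Snell's law
    return r * d + (r * cos_i - np.sqrt(1.0 - sin_t2)) * n_surf

# A ray along +z hitting a prism face tilted 10 degrees (assumed wedge angle):
alpha = np.radians(10.0)
d_in = np.array([0.0, 0.0, 1.0])
normal = np.array([0.0, np.sin(alpha), -np.cos(alpha)])
d_out = refract(d_in, normal, 1.0, 1.517)  # air into BK7-like glass (assumed)
dev = np.degrees(np.arccos(d_out[2]))      # deviation from the original axis
print(dev)
```

Chaining this operation through the four surfaces of a prism pair, with small perturbations of each normal for assembly and rotation errors, is the matrix-style bookkeeping the abstract describes.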
A Novel Design for Drug-Drug Interaction Alerts Improves Prescribing Efficiency.
Russ, Alissa L; Chen, Siying; Melton, Brittany L; Johnson, Elizabette G; Spina, Jeffrey R; Weiner, Michael; Zillich, Alan J
2015-09-01
Drug-drug interactions (DDIs) are common in clinical care and pose serious risks for patients. Electronic health records display DDI alerts that can influence prescribers, but the interface design of DDI alerts has largely been unstudied. In this study, the objective was to apply human factors engineering principles to alert design. It was hypothesized that redesigned DDI alerts would significantly improve prescribers' efficiency and reduce prescribing errors. In a counterbalanced, crossover study with prescribers, two DDI alert designs were evaluated. Department of Veterans Affairs (VA) prescribers were video recorded as they completed fictitious patient scenarios, which included DDI alerts of varying severity. Efficiency was measured from time-stamped recordings. Prescribing errors were evaluated against predefined criteria. Efficiency and prescribing errors were analyzed with the Wilcoxon signed-rank test. Other usability data were collected on the adequacy of alert content, prescribers' use of the DDI monograph, and alert navigation. Twenty prescribers completed patient scenarios for both designs. Prescribers resolved redesigned alerts in about half the time (redesign: 52 seconds versus original design: 97 seconds; p<.001). Prescribing errors were not significantly different between the two designs. Usability results indicate that DDI alerts might be enhanced by facilitating easier access to laboratory data and dosing information and by allowing prescribers to cancel either interacting medication directly from the alert. Results also suggest that neither design provided adequate information for decision making via the primary interface. Applying human factors principles to DDI alerts improved overall efficiency. Aspects of DDI alert design that could be further enhanced prior to implementation were also identified.
Forces associated with pneumatic power screwdriver operation: statics and dynamics.
Lin, Jia-Hua; Radwin, Robert G; Fronczak, Frank J; Richard, Terry G
2003-10-10
The statics and dynamics of pneumatic power screwdriver operation were investigated in the context of predicting forces acting against the human operator. A static force model based on tool geometry, mass, orientation in space, feed force, torque build-up, and stall torque is described. Three common power hand tool shapes are considered: pistol grip, right angle, and in-line. The static model estimates the handle force needed to support a power nutrunner when it acts against the tightened fastener with a constant torque. A system of equations for the static force and moment equilibrium conditions is established, and the resultant handle force (resolved in orthogonal directions) is calculated in matrix form. A dynamic model is formulated to describe pneumatic motor torque build-up characteristics as a function of threaded-fastener joint hardness. Six pneumatic tools were tested to validate the deterministic model. The average torque prediction error was 6.6% (SD = 5.4%) and the average handle force prediction error was 6.7% (SD = 6.4%) for a medium-soft threaded fastener joint; for a hard joint, the corresponding errors were 5.2% (SD = 5.3%) and 3.6% (SD = 3.2%). Use of these equations for estimating handle forces based on passive mechanical elements representing the human operator is also described. Together, these models should be useful for considering handle force in the selection and design of power screwdrivers, particularly for minimizing handle forces to prevent injuries and work-related musculoskeletal disorders.
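As a rough sketch of the matrix formulation, planar equilibrium equations for a hypothetical pistol-grip tool can be assembled and solved with NumPy. All numbers and the simplified geometry are assumptions for illustration, not the paper's validated model:

```python
import numpy as np

# Planar static equilibrium sketch (illustrative values, simplified geometry):
# solve for the handle reaction given feed force, tool weight, and torque.
g = 9.81
mass = 1.8            # tool mass, kg (assumed)
feed = 50.0           # feed force along the spindle axis, N (assumed)
torque = 4.0          # reaction torque about the spindle axis, N*m (assumed)
r_handle = 0.15       # handle moment arm from the spindle axis, m (assumed)

# Unknowns: handle force components Hx, Hz and grip moment Mg.
# Equations: sum Fx = 0, sum Fz = 0, sum M = 0 about the spindle.
A = np.array([
    [1.0, 0.0, 0.0],        # Hx balances the feed force
    [0.0, 1.0, 0.0],        # Hz balances the tool weight
    [0.0, r_handle, 1.0],   # handle moment arm plus grip moment balance torque
])
b = np.array([feed, mass * g, torque])
Hx, Hz, Mg = np.linalg.solve(A, b)
print(Hx, Hz, Mg)
```

The full model adds tool orientation and shape-specific geometry, but the structure is the same: a linear system A·x = b whose solution is the resultant handle force resolved in orthogonal directions.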
Methods and apparatus for reducing peak wind turbine loads
Moroz, Emilian Mieczyslaw
2007-02-13
A method for reducing peak loads of wind turbines in a changing wind environment includes measuring or estimating an instantaneous wind speed and direction at the wind turbine and determining a yaw error of the wind turbine relative to the measured instantaneous wind direction. The method further includes comparing the yaw error to a yaw error trigger that has different values at different wind speeds and shutting down the wind turbine when the yaw error exceeds the yaw error trigger corresponding to the measured or estimated instantaneous wind speed.
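The shutdown logic described in the patent abstract can be sketched as a simple lookup against a speed-dependent trigger. The trigger table values below are invented for illustration:

```python
# Sketch of a speed-dependent yaw-error trigger: the allowable yaw error
# shrinks as wind speed rises (table values are hypothetical).

def yaw_trigger_deg(wind_speed):
    """Return the allowable yaw error (degrees) at a given wind speed (m/s)."""
    table = [(5.0, 45.0), (10.0, 30.0), (15.0, 20.0), (25.0, 10.0)]
    for speed, limit in table:
        if wind_speed <= speed:
            return limit
    return 0.0  # above cut-out: any yaw error triggers shutdown

def should_shut_down(yaw_error_deg, wind_speed):
    return abs(yaw_error_deg) > yaw_trigger_deg(wind_speed)

print(should_shut_down(25.0, 8.0))   # within the 30-degree limit at 8 m/s
print(should_shut_down(25.0, 22.0))  # exceeds the 10-degree limit at 22 m/s
```

The key design point is that the same yaw error is tolerable in light wind but load-critical in strong wind, hence the trigger must be a function of measured or estimated instantaneous wind speed rather than a constant.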
Zeleke, Berihun M.; Abramson, Michael J.; Benke, Geza
2018-01-01
Uncertainty in experimental studies of exposure to radiation from mobile phones has in the past only been framed within the context of statistical variability. It is now becoming more apparent to researchers that epistemic or reducible uncertainties can also affect the total error in results. These uncertainties are derived from a wide range of sources including human error, such as data transcription, model structure, measurement and linguistic errors in communication. The issue of epistemic uncertainty is reviewed and interpreted in the context of the MoRPhEUS, ExPOSURE and HERMES cohort studies which investigate the effect of radiofrequency electromagnetic radiation from mobile phones on memory performance. Research into this field has found inconsistent results due to limitations from a range of epistemic sources. Potential analytic approaches are suggested based on quantification of epistemic error using Monte Carlo simulation. It is recommended that future studies investigating the relationship between radiofrequency electromagnetic radiation and memory performance pay more attention to treatment of epistemic uncertainties as well as further research into improving exposure assessment. Use of directed acyclic graphs is also encouraged to display the assumed covariate relationship. PMID:29587425
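The suggested Monte Carlo quantification of epistemic error might look like the following sketch, where an assumed measurement-error SD and a hypothetical transcription-error rate perturb a nominal exposure value:

```python
import random

# Monte Carlo sketch of epistemic error in exposure assessment (all error
# models are assumptions for illustration): perturb a nominal exposure by a
# measurement-error term and an occasional transcription error, then inspect
# the spread of the resulting estimates.

def monte_carlo_exposure(true_exposure=100.0, n=10_000, seed=1):
    rng = random.Random(seed)
    estimates = []
    for _ in range(n):
        x = rng.gauss(true_exposure, 10.0)  # measurement error (assumed SD)
        if rng.random() < 0.02:             # 2% transcription-error rate
            x *= 10.0                       # e.g., a misplaced decimal point
        estimates.append(x)
    mean = sum(estimates) / n
    var = sum((e - mean) ** 2 for e in estimates) / (n - 1)
    return mean, var ** 0.5

mean, sd = monte_carlo_exposure()
print(mean, sd)
```

Even a 2% rate of gross transcription errors biases the mean upward and inflates the spread far beyond the purely statistical measurement variability, which is the abstract's argument for treating epistemic sources explicitly.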
Mentoring Human Performance - 12480
DOE Office of Scientific and Technical Information (OSTI.GOV)
Geis, John A.; Haugen, Christian N.
2012-07-01
Although the positive effects of implementing a human performance approach to operations can be hard to quantify, many organizations and industry areas are finding tangible benefits to such a program. Recently, a unique mentoring program was established and implemented focusing on improving the performance of managers, supervisors, and work crews, using the principles of Human Performance Improvement (HPI). The goal of this mentoring was to affect behaviors and habits that reliably implement the principles of HPI to ensure continuous improvement in implementation of an Integrated Safety Management System (ISMS) within a Conduct of Operations framework. Mentors engaged with personnel in a one-on-one, or one-on-many, dialogue focused on what behaviors were observed, what factors underlay the behaviors, and what changes in behavior could prevent errors or events and improve performance. A senior management sponsor was essential to gain broad management support. A clear charter and management plan describing the goals, objectives, methodology, and expected outcomes was established. Mentors were carefully selected with senior management endorsement. Mentors were assigned to projects and work teams based on three criteria: 1) knowledge of the work scope; 2) experience in similar project areas; and 3) the perceived level of trust they would have with project management, supervision, and work teams. This program was restructured significantly when the American Reinvestment and Recovery Act (ARRA) and the associated funding came to an end. The restructuring was based on an understanding of the observations, attributed successes, and identified shortfalls, and on the consolidation of those lessons. Mentoring the application of proven methods for improving human performance was shown to be effective at increasing success in day-to-day activities and at increasing the confidence and skill level of supervisors.
While mentoring program effectiveness is difficult to measure, and return on investment is difficult to quantify, especially in complex and large organizations where causal factors are hard to correlate directly, the evidence presented by Sidney Dekker, James Reason, and others who study human factors asserts that managing and reducing error is possible. Employment of key behaviors (HPI techniques and skills) can be shown to have a significant impact on error rates. Our mentoring program demonstrated reduced error rates and corresponding improvements in safety and production. Improved behaviors are the result of providing a culture with consistent, clear expectations from leadership, and processes and methods applied consistently to error prevention. Mentoring, as envisioned and executed in this program, was effective in helping shift organizational culture and improving safety and production. (authors)
Data entry errors and design for model-based tight glycemic control in critical care.
Ward, Logan; Steel, James; Le Compte, Aaron; Evans, Alicia; Tan, Chia-Siong; Penning, Sophie; Shaw, Geoffrey M; Desaive, Thomas; Chase, J Geoffrey
2012-01-01
Tight glycemic control (TGC) has shown benefits but has been difficult to achieve consistently. Model-based methods and computerized protocols offer the opportunity to improve TGC quality but require human data entry, particularly of blood glucose (BG) values, which can be significantly prone to error. This study presents the design and optimization of data entry methods to minimize error for a computerized and model-based TGC method prior to pilot clinical trials. To minimize data entry error, two tests were carried out to optimize a method with errors less than the 5%-plus reported in other studies. Four initial methods were tested on 40 subjects in random order, and the best two were tested more rigorously on 34 subjects. The tests measured entry speed and accuracy. Errors were reported as corrected and uncorrected errors, with their sum comprising the total error rate. The first set of tests used randomly selected values, while the second set used the same values for all subjects to allow comparisons across users and direct assessment of the magnitude of errors. These research tests were approved by the University of Canterbury Ethics Committee. The final data entry method tested reduced errors to less than 1-2%, a 60-80% reduction from reported values. The magnitude of errors was clinically significant, typically around 10.0 mmol/liter or an order of magnitude, but only for extreme values of BG < 2.0 mmol/liter or BG > 15.0-20.0 mmol/liter, both of which could be easily corrected with automated checking of extreme values for safety. The data entry method selected significantly reduced data entry errors in the limited design tests presented, and is in use in a clinical pilot TGC study. The overall approach and testing methods are easily performed and generalizable to other applications and protocols. © 2012 Diabetes Technology Society.
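The corrected/uncorrected error accounting can be expressed in a few lines; the counts below are illustrative, not the study's data:

```python
# Error-rate accounting sketch: corrected and uncorrected entries sum to the
# total error rate (entry counts are hypothetical examples).

def error_rates(entries, corrected, uncorrected):
    total_errors = corrected + uncorrected
    return {
        "corrected_%": 100.0 * corrected / entries,
        "uncorrected_%": 100.0 * uncorrected / entries,
        "total_%": 100.0 * total_errors / entries,
    }

rates = error_rates(entries=500, corrected=6, uncorrected=3)
print(rates)  # a total rate of 1.8%, inside the 1-2% range reported above
```

Separating the two components matters for safety analysis: corrected errors cost only time, while uncorrected errors propagate into the model-based controller's dosing recommendations.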
Energy-efficient human body communication receiver chipset using wideband signaling scheme.
Song, Seong-Jun; Cho, Namjun; Kim, Sunyoung; Yoo, Hoi-Jun
2007-01-01
This paper presents an energy-efficient wideband signaling receiver for communication channels that use the human body as a data transmission medium. The wideband signaling scheme with a direct-coupled interface provides energy-efficient transmission of multimedia data around the human body. The receiver incorporates an analog front end (AFE) exploiting a wideband symmetric triggering technique and an all-digital clock and data recovery (CDR) circuit with a quadratic sampling technique. The AFE operates at a 10-Mb/s data rate with an input sensitivity of -27 dBm and an operational bandwidth of 200 MHz. The CDR recovers clock and data at 2 Mb/s with a bit error rate of 10(-7). The receiver chipset consumes only 5 mW from a 1-V supply, thereby achieving a bit energy of 2.5 nJ/bit.
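The quoted 2.5-nJ/bit figure follows directly from the reported power and data rate, since energy per bit is simply power divided by throughput:

```python
# Bit-energy check from the figures quoted in the abstract.
power_w = 5e-3        # 5 mW total receiver power
data_rate_bps = 2e6   # 2 Mb/s recovered data rate
bit_energy_j = power_w / data_rate_bps
print(bit_energy_j)   # about 2.5e-9 J/bit, i.e., 2.5 nJ/bit
```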
Development of an errorable car-following driver model
NASA Astrophysics Data System (ADS)
Yang, H.-H.; Peng, H.
2010-06-01
An errorable car-following driver model is presented in this paper. An errorable driver model is one that emulates a human driver's functions and can generate both nominal (error-free) and devious (with error) behaviours. This model was developed for the evaluation and design of active safety systems. The car-following data used for developing and validating the model were obtained from a large-scale naturalistic driving database. The stochastic car-following behaviour was first analysed and modelled as a random process. Three error-inducing behaviours were then introduced. First, human perceptual limitation was studied and implemented. Distraction due to non-driving tasks was then identified based on statistical analysis of the driving data. Finally, the time delay of human drivers was estimated through a recursive least-squares identification process. By including these three error-inducing behaviours, rear-end collisions with the lead vehicle could occur. The simulated crash rate was found to be similar to, though somewhat higher than, that reported in traffic statistics.
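Recursive least squares (RLS), named above as the method for estimating driver time delay, can be sketched on a toy first-order model. The model, forgetting factor, and noise level are assumptions for illustration, not the paper's driver model:

```python
import numpy as np

# RLS identification sketch: recover the two coefficients of a known
# first-order system y[k] = a*y[k-1] + b*u[k-1] + noise from data.

def rls_identify(phi_rows, y, lam=0.99):
    """Recursive least squares with forgetting factor lam (assumed 0.99)."""
    n = phi_rows.shape[1]
    theta = np.zeros(n)
    P = np.eye(n) * 1000.0
    for phi, yk in zip(phi_rows, y):
        K = P @ phi / (lam + phi @ P @ phi)       # gain vector
        theta = theta + K * (yk - phi @ theta)    # parameter update
        P = (P - np.outer(K, phi @ P)) / lam      # covariance update
    return theta

rng = np.random.default_rng(0)
a_true, b_true = 0.8, 0.5
u = rng.normal(size=500)
y = np.zeros(500)
for k in range(1, 500):
    y[k] = a_true * y[k - 1] + b_true * u[k - 1] + 0.01 * rng.normal()

phi = np.column_stack([y[:-1], u[:-1]])
theta = rls_identify(phi, y[1:])
print(theta)  # estimates close to [0.8, 0.5]
```

In the driver-model setting, the regressors would instead be delayed observations of the lead vehicle, and the identified lag yields the driver's effective reaction time.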
An extensometer for global measurement of bone strain suitable for use in vivo in humans
NASA Technical Reports Server (NTRS)
Perusek, G. P.; Davis, B. L.; Sferra, J. J.; Courtney, A. C.; D'Andrea, S. E.
2001-01-01
An axial extensometer able to measure global bone strain magnitudes and rates encountered during physiological activity, and suitable for use in vivo in human subjects, is described. The extensometer uses paired capacitive sensors mounted to intraosseous pins and allows measurement of strain due to bending in the plane of the extensometer as well as uniaxial compression or tension. Data are presented for validation of the device against a surface-mounted strain gage in an acrylic specimen under dynamic four-point bending, with square-wave and sinusoidal loading inputs up to 1500 με and 20 Hz, representative of physiological strain magnitudes and frequencies. Pearson's correlation coefficient (r) between extensometer and strain gage ranged from 0.960 to 0.999. Mean differences between extensometer and strain gage ranged up to 15.3 με. Errors in the extensometer output were directly proportional to the degree of bending in the specimen; however, these errors were predictable and less than 1 με for the loading regime studied. The device is capable of tracking strain rates in excess of 90,000 με/s.
Lost in Translation: the Case for Integrated Testing
NASA Technical Reports Server (NTRS)
Young, Aaron
2017-01-01
The building of a spacecraft is complex and often involves multiple suppliers and companies that have their own designs and processes. Standards have been developed across the industries to reduce the chances of critical flight errors at the system level, but the spacecraft is still vulnerable to the introduction of critical errors during integration of these systems. Critical errors can occur at any time during the process, and in many cases human reliability analysis (HRA) identifies human error as a risk driver. Most programs have a test plan in place that is intended to catch these errors, but it is not uncommon for schedule and cost stress to result in less testing than initially planned. Therefore, integrated testing, or "testing as you fly," is essential as a final check on the design and assembly to catch any errors prior to the mission. This presentation outlines the unique benefits of integrated testing in catching critical flight errors that can otherwise go undetected, discusses HRA methods used to identify opportunities for human error, and reviews lessons learned and challenges over ownership of testing.
Human factors in aircraft incidents - Results of a 7-year study (Andre Allard Memorial Lecture)
NASA Technical Reports Server (NTRS)
Billings, C. E.; Reynard, W. D.
1984-01-01
It is pointed out that nearly all fatal aircraft accidents are preventable, and that most such accidents are due to human error. The present discussion is concerned with the results of a seven-year study of the data collected by the NASA Aviation Safety Reporting System (ASRS). The Aviation Safety Reporting System was designed to stimulate as large a flow as possible of information regarding errors and operational problems in the conduct of air operations. It was implemented in April, 1976. In the following 7.5 years, 35,000 reports have been received from pilots, controllers, and the armed forces. Human errors are found in more than 80 percent of these reports. Attention is given to the types of events reported, possible causal factors in incidents, the relationship of incidents and accidents, and sources of error in the data. ASRS reports include sufficient detail to permit authorities to institute changes in the national aviation system designed to minimize the likelihood of human error, and to insulate the system against the effects of errors.
Human factors in surgery: from Three Mile Island to the operating room.
D'Addessi, Alessandro; Bongiovanni, Luca; Volpe, Andrea; Pinto, Francesco; Bassi, PierFrancesco
2009-01-01
Human factors is a discipline that encompasses the science of understanding the properties of human capability, the application of this understanding to the design and development of systems and services, and the art of ensuring their successful application within a program. The field of human factors traces its origins to the Second World War, but Three Mile Island has been the best example of how groups of people react and make decisions under stress: this nuclear accident was exacerbated by wrong decisions made because the operators were overwhelmed with irrelevant, misleading or incorrect information. Errors and their nature are the same in all human activities. The predisposition for error is so intrinsic to human nature that, scientifically, it is best considered as inherently biologic. The causes of error in medical care may not be easily generalized. Surgery differs in important ways: most errors occur in the operating room and are technical in nature. Commonly, surgical error has been thought of as the consequence of a lack of skill or ability, the result of thoughtless actions. Moreover, the operating theatre has a unique set of team dynamics: professionals from multiple disciplines are required to work in a closely coordinated fashion. This complex environment provides multiple opportunities for unclear communication, clashing motivations, and errors arising not from technical incompetence but from poor interpersonal skills. Surgeons will have to work closely with human factors specialists in future studies. By improving processes already in place in many operating rooms, safety will be enhanced and quality increased.
Julian, B.R.; Evans, J.R.; Pritchard, M.J.; Foulger, G.R.
2000-01-01
Some computer programs based on the Aki-Christofferson-Husebye (ACH) method of teleseismic tomography contain an error caused by identifying local grid directions with azimuths on the spherical Earth. This error, which is most severe at high latitudes, introduces systematic errors into computed ray paths and distorts inferred Earth models. It is best dealt with by explicitly correcting for the difference between true and grid directions. Methods for computing these directions are presented in this article and are likely to be useful in many other kinds of regional geophysical studies that use Cartesian coordinates and flat-earth approximations.
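The correction the abstract describes can be illustrated with a short sketch (not taken from the ACH programs themselves): for a conformal grid with a central meridian, the angle between grid north and true north (the meridian convergence) is approximately the longitude offset scaled by the sine of the latitude, and a grid bearing is corrected by adding that angle. The function names and the transverse-Mercator-style approximation are illustrative assumptions.

```python
import math

def meridian_convergence(lon, lat, lon0):
    """Approximate angle (deg) between grid north and true north for a
    conformal grid with central meridian lon0: gamma ~ (lon - lon0) * sin(lat).
    All angles in degrees; transverse-Mercator-style approximation."""
    return (lon - lon0) * math.sin(math.radians(lat))

def grid_to_true_azimuth(grid_az, lon, lat, lon0):
    """Correct a grid bearing to a geographic (true) azimuth."""
    return (grid_az + meridian_convergence(lon, lat, lon0)) % 360.0
```

At 70 degrees latitude and 10 degrees from the central meridian the correction is already about 9.4 degrees, consistent with the observation that the error is most severe at high latitudes.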
2012-07-01
[Abstract fragment: only reference-list excerpts survive from this record. They concern inertial navigation with body-worn sensors, citing a quaternion-based extended Kalman filter for orientation estimation (Sabatini) and a foot-mounted InterSense IMU using zero-velocity updates and an extended Kalman filter for position estimation (Foxlin).]
The OPL Access Control Policy Language
NASA Astrophysics Data System (ADS)
Alm, Christopher; Wolf, Ruben; Posegga, Joachim
Existing policy languages suffer from a limited ability to directly and elegantly express high-level access control principles such as history-based separation of duty [22], binding of duty [26], context constraints [24], Chinese wall properties [10], and obligations [20]. It is often difficult to extend a language to retrofit these features once they are required, or it becomes necessary to use complicated and complex language constructs to express such concepts. The latter, however, is cumbersome and error-prone for humans dealing with policy administration.
The Human Factors Analysis and Classification System : HFACS : final report.
DOT National Transportation Integrated Search
2000-02-01
Human error has been implicated in 70 to 80% of all civil and military aviation accidents. Yet, most accident reporting systems are not designed around any theoretical framework of human error. As a result, most accident databases are not conducive t...
Kaneko, Takaaki; Tomonaga, Masaki
2014-06-01
Humans are often unaware of how they control their limb movements. People pay attention to their own motor movements only when their usual motor routines encounter errors. Yet little is known about the extent to which voluntary actions rely on automatic control and when automatic control shifts to deliberate control in nonhuman primates. In this study, we demonstrate that chimpanzees and humans showed similar limb motor adjustment in response to feedback error during reaching actions, whereas attentional allocation inferred from gaze behavior differed. We found that humans shifted attention to their own motor kinematics as errors were induced in motor trajectory feedback, regardless of whether the errors actually disrupted the achievement of their action goals. In contrast, chimpanzees shifted attention to motor execution only when errors actually interfered with achieving a planned action goal. These results indicate that the species differed in their criteria for shifting from automatic to deliberate control of motor actions. It is widely accepted that sophisticated motor repertoires have evolved in humans. Our results suggest that the deliberate monitoring of one's own motor kinematics may have evolved in the human lineage. Copyright © 2014 Elsevier B.V. All rights reserved.
Modeling the directivity of parametric loudspeaker
NASA Astrophysics Data System (ADS)
Shi, Chuang; Gan, Woon-Seng
2012-09-01
The emerging applications of the parametric loudspeaker, such as 3D audio, demand accurate directivity control at the audible frequency (i.e. the difference frequency). Though delay-and-sum beamforming has proven adequate for adjusting the steering angles of the parametric loudspeaker, accurate prediction of the mainlobe and sidelobes remains a challenging problem. This is mainly because of the approximations used to derive the directivity of the difference frequency from the directivity of the primary frequency, and the mismatches between the theoretical and measured directivities caused by system errors incurred at different stages of the implementation. In this paper, we propose a directivity model of the parametric loudspeaker. The model consists of two tuning vectors corresponding to the spacing error and the weight error for the primary frequency, and adopts a modified form of the product directivity principle for the difference frequency to further improve the modeling accuracy.
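As a rough illustration of the modeling ingredients named above (not the authors' model), the sketch below computes a delay-and-sum directivity for an ultrasonic line array at two primary frequencies and estimates the difference-frequency beam with the classical, unmodified product directivity principle. The array geometry and frequencies are arbitrary assumptions.

```python
import numpy as np

def array_factor(theta, n, d, freq, c=343.0, steer=0.0):
    """Normalized delay-and-sum directivity magnitude of an n-element line
    array with spacing d (m) at frequency freq (Hz), steered to angle steer."""
    k = 2.0 * np.pi * freq / c
    pos = (np.arange(n) - (n - 1) / 2.0) * d          # element positions
    phase = k * pos[:, None] * (np.sin(theta)[None, :] - np.sin(steer))
    af = np.abs(np.exp(1j * phase).sum(axis=0))
    return af / af.max()

theta = np.linspace(-np.pi / 2, np.pi / 2, 721)
d1 = array_factor(theta, 8, 0.005, 40000.0)   # primary frequency f1 = 40 kHz
d2 = array_factor(theta, 8, 0.005, 41000.0)   # primary frequency f2 = 41 kHz
diff = d1 * d2   # product directivity estimate for the 1 kHz difference frequency
```

Spacing and weight errors of the kind the model captures would perturb `pos` and introduce non-uniform element weights, shifting the predicted mainlobe and sidelobes away from this idealized pattern.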
Aoki, Hirofumi; Ohno, Ryuzo; Yamaguchi, Takao
2005-01-01
In a virtual weightless environment, subjects' orientation skills were studied to examine what kind of cognitive errors people make when they moved through the interior space of virtual space stations and what kind of visual information effectively decreases those errors. Subjects wearing a head-mounted display moved from one end to the other end in space station-like routes constructed of rectangular and cubical modules, and did Pointing and Modeling tasks. In Experiment 1, configurations of the routes were changed with such variables as the number of bends, the number of embedding planes, and the number of planes with respect to the body posture. The results indicated that spatial orientation ability was relevant to the variables and that orientational errors were explained by two causes. One of these was that the place, the direction, and the sequence of turns were incorrect. The other was that subjects did not recognize the rotation of the frame of reference, especially when they turned in pitch direction rather than in yaw. In Experiment 2, the effect of the interior design was examined by testing three design settings. Wall colors that showed the allocentric frame of reference and the different interior design of vertical and horizontal modules were effective; however, there was a limit to the effectiveness in complicated configurations. © 2005 Published by Elsevier Ltd.
NASA Astrophysics Data System (ADS)
Mundermann, Lars; Mundermann, Annegret; Chaudhari, Ajit M.; Andriacchi, Thomas P.
2005-01-01
Anthropometric parameters are fundamental for a wide variety of applications in biomechanics, anthropology, medicine and sports. Recent technological advancements provide methods for constructing 3D surfaces directly. Of these new technologies, visual hull construction may be the most cost-effective yet sufficiently accurate method. However, the conditions influencing the accuracy of anthropometric measurements based on visual hull reconstruction are unknown. The purpose of this study was to evaluate the conditions that influence the accuracy of 3D shape-from-silhouette reconstruction of body segments dependent on number of cameras, camera resolution and object contours. The results demonstrate that the visual hulls lacked accuracy in concave regions and narrow spaces, but setups with a high number of cameras reconstructed a human form with an average accuracy of 1.0 mm. In general, setups with less than 8 cameras yielded largely inaccurate visual hull constructions, while setups with 16 and more cameras provided good volume estimations. Body segment volumes were obtained with an average error of 10% at a 640x480 resolution using 8 cameras. Changes in resolution did not significantly affect the average error. However, substantial decreases in error were observed with increasing number of cameras (33.3% using 4 cameras; 10.5% using 8 cameras; 4.1% using 16 cameras; 1.2% using 64 cameras).
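A minimal 2-D analogue of shape-from-silhouette reconstruction (illustrative only, not the authors' pipeline) shows the key limitation reported above: silhouette carving can never remove concave regions, so the hull area overestimates the true area, and adding cameras tightens, but does not eliminate, the bound.

```python
import numpy as np

def visual_hull_area(mask, angles, n_bins=200):
    """2-D shape-from-silhouette: a grid point is kept if its projection
    falls inside the object's silhouette for every viewing direction."""
    ys, xs = np.nonzero(mask)
    pts = np.stack([xs, ys], axis=1).astype(float)
    h, w = mask.shape
    gy, gx = np.mgrid[0:h, 0:w]
    grid = np.stack([gx.ravel(), gy.ravel()], axis=1).astype(float)
    keep = np.ones(len(grid), dtype=bool)
    lo, hi = -np.hypot(h, w), np.hypot(h, w)
    for a in angles:
        d = np.array([np.cos(a), np.sin(a)])
        sil = np.histogram(pts @ d, bins=n_bins, range=(lo, hi))[0] > 0
        idx = np.clip(((grid @ d - lo) / (hi - lo) * n_bins).astype(int), 0, n_bins - 1)
        keep &= sil[idx]
    return int(keep.sum())

mask = np.zeros((40, 40), dtype=bool)
mask[10:30, 10:30] = True        # a 20 x 20 square object...
mask[18:30, 18:22] = False       # ...with a concave notch cut into one side
hull4 = visual_hull_area(mask, np.linspace(0, np.pi, 4, endpoint=False))
hull16 = visual_hull_area(mask, np.linspace(0, np.pi, 16, endpoint=False))
true_area = int(mask.sum())      # the hull can only overestimate this
```

The notch survives carving under any number of views, mirroring the paper's finding that visual hulls lack accuracy in concave regions and narrow spaces even as more cameras improve the overall volume estimate.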
Preventable Medical Errors Driven Modeling of Medical Best Practice Guidance Systems.
Ou, Andrew Y-Z; Jiang, Yu; Wu, Po-Liang; Sha, Lui; Berlin, Richard B
2017-01-01
In a medical environment such as an Intensive Care Unit, there are many possible sources of error, and one important source is the effect of human intellectual tasks. When designing an interactive healthcare system such as a medical Cyber-Physical-Human System (CPHSystem), it is important to consider whether the system design can mitigate the errors caused by these tasks. In this paper, we first introduce five categories of generic human intellectual tasks, where tasks in each category may lead to potential medical errors. Then, we present an integrated modeling framework to model a medical CPHSystem and use UPPAAL as the foundation to integrate and verify the whole medical CPHSystem design model. With a verified and comprehensive model capturing the effects of human intellectual tasks, we can design a more accurate and acceptable system. We use a cardiac arrest resuscitation guidance and navigation system (CAR-GNSystem) for such medical CPHSystem modeling. Experimental results show that the CPHSystem models help determine system design flaws and can mitigate the potential medical errors caused by the human intellectual tasks.
Cognitive science and the law.
Busey, Thomas A; Loftus, Geoffrey R
2007-03-01
Numerous innocent people have been sent to jail based directly or indirectly on normal, but flawed, human perception, memory and decision making. Current cognitive-science research addresses the issues that are directly relevant to the connection between normal cognitive functioning and such judicial errors, and suggests means by which the false-conviction rate could be reduced. Here, we illustrate how this can be achieved by reviewing recent work in two related areas: eyewitness testimony and fingerprint analysis. We articulate problems in these areas with reference to specific legal cases and demonstrate how recent findings can be used to address them. We also discuss how researchers can translate their conclusions into language and ideas that can influence and improve the legal system.
The role of the insula in intuitive expert bug detection in computer code: an fMRI study.
Castelhano, Joao; Duarte, Isabel C; Ferreira, Carlos; Duraes, Joao; Madeira, Henrique; Castelo-Branco, Miguel
2018-05-09
Software programming is a complex and relatively recent human activity, involving the integration of mathematical, recursive thinking and language processing. The neural correlates of this recent human activity are still poorly understood. Error monitoring during this type of task, requiring the integration of language, logical symbol manipulation and other mathematical skills, is particularly challenging. We therefore aimed to investigate the neural correlates of decision-making during source code understanding and mental manipulation in professional participants with high expertise. The present fMRI study directly addressed error monitoring during source code comprehension, expert bug detection and decision-making. We used C code, which triggers the same sort of processing irrespective of the native language of the programmer. We discovered a distinct role for the insula in bug monitoring and detection and a novel connectivity pattern that goes beyond the expected activation pattern evoked by source code understanding in semantic language and mathematical processing regions. Importantly, insula activity levels were critically related to the quality of error detection, involving intuition, as signalled by reported initial bug suspicion, prior to final decision and bug detection. Activity in this salience network (SN) region evoked by bug suspicion was predictive of bug detection precision, suggesting that it encodes the quality of the behavioral evidence. Connectivity analysis provided evidence for top-down circuit "reutilization" stemming from anterior cingulate cortex (BA32), a core region in the SN that evolved for complex error monitoring such as required for this type of recent human activity. Cingulate (BA32) and anterolateral (BA10) frontal regions causally modulated decision processes in the insula, which in turn was related to activity of math processing regions in early parietal cortex. 
In other words, brain regions that arose earlier in evolution for other functions seem to be reutilized in a top-down manner for a new complex function, in a manner analogous to that described for other cultural creations such as reading and literacy.
Scientific Impacts of Wind Direction Errors
NASA Technical Reports Server (NTRS)
Liu, W. Timothy; Kim, Seung-Bum; Lee, Tong; Song, Y. Tony; Tang, Wen-Qing; Atlas, Robert
2004-01-01
An assessment was made of the scientific impact of random errors in wind direction (less than 45 deg) retrieved from space-based observations under weak-wind (less than 7 m/s) conditions; such weak winds cover most of the tropical, sub-tropical, and coastal oceans. Introduction of these errors into the semi-daily winds causes, on average, 5% changes in the yearly mean Ekman and Sverdrup volume transports computed directly from the winds. These poleward movements of water are the main mechanisms for redistributing heat from the warmer tropical region to the colder high-latitude regions, and they are the major manifestations of the ocean's role in modifying Earth's climate. Simulation by an ocean general circulation model shows that the wind errors introduce a 5% error in the meridional heat transport at tropical latitudes. The simulation also shows that the erroneous winds cause a pile-up of warm surface water in the eastern tropical Pacific, similar to conditions during an El Nino episode. Similar wind-direction errors cause significant changes in sea-surface temperature and sea-level patterns in a coastal model simulation. Previous studies have shown that assimilation of scatterometer winds improves 3-5 day weather forecasts in the Southern Hemisphere; when directional information below 7 m/s was withheld, approximately 40% of that improvement was lost.
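The sensitivity described above can be sketched with the standard Ekman transport relation, M_y = -tau_x / (rho f): rotating the wind stress vector by a directional error changes the meridional transport by a factor of 1 - cos(error). The constants and the 30-degree example below are illustrative, not taken from the study.

```python
import math

RHO_AIR, RHO_SEA, CD = 1.22, 1025.0, 1.3e-3   # illustrative constants
OMEGA = 7.2921e-5                              # Earth rotation rate (rad/s)

def ekman_meridional_transport(u10, wind_dir_deg, lat_deg):
    """Meridional Ekman volume transport per unit width (m^2/s) driven by a
    10-m wind of speed u10, blowing wind_dir_deg from east (math convention)."""
    tau = RHO_AIR * CD * u10 ** 2                       # wind stress magnitude
    taux = tau * math.cos(math.radians(wind_dir_deg))   # zonal stress component
    f = 2.0 * OMEGA * math.sin(math.radians(lat_deg))   # Coriolis parameter
    return -taux / (RHO_SEA * f)

v0 = ekman_meridional_transport(7.0, 0.0, 15.0)    # error-free zonal wind
v1 = ekman_meridional_transport(7.0, 30.0, 15.0)   # 30-degree direction error
rel_change = abs(v1 - v0) / abs(v0)                # = 1 - cos(30 deg), ~13%
```

Even a modest directional error thus maps directly into a percent-level change in the wind-driven transport, which is why weak-wind direction retrievals matter for the heat-transport estimates discussed above.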
Barsingerhorn, A D; Boonstra, F N; Goossens, H H L M
2017-02-01
Current stereo eye-tracking methods model the cornea as a sphere with one refractive surface. However, the human cornea is slightly aspheric and has two refractive surfaces. Here we used ray-tracing and the Navarro eye model to study how these optical properties affect the accuracy of different stereo eye-tracking methods. We found that pupil size, gaze direction and head position all influence the reconstruction of gaze. The resulting gaze errors amount to ±1.0 degrees at best. This shows that stereo eye-tracking may be an option if reliable calibration is not possible, but the applied eye model should account for the actual optics of the cornea.
A model of the human supervisor
NASA Technical Reports Server (NTRS)
Kok, J. J.; Vanwijk, R. A.
1977-01-01
A general model of the human supervisor's behavior is given. Submechanisms of the model include the observer/reconstructor, the decision maker, and the controller. A set of hypotheses is postulated for the relations between the task variables and the parameters of the different submechanisms of the model. Verification of the model hypotheses is considered using variations in the task variables. An approach is suggested for the identification of the model parameters which makes use of a multidimensional error criterion. Each element of this multidimensional criterion corresponds to a certain aspect of the supervisor's behavior and is directly related to a particular part of the model and its parameters. This approach offers good possibilities for an efficient parameter adjustment procedure.
Trial-by-trial adaptation of movements during mental practice under force field.
Anwar, Muhammad Nabeel; Khan, Salman Hameed
2013-01-01
The human nervous system tries to minimize the effect of any external perturbing force by modifying its internal model. These modifications affect the subsequent motor commands generated by the nervous system. Adaptive compensation, along with appropriate modifications of the internal model, helps reduce human movement errors. In the current study, we examined how motor imagery influences trial-to-trial learning in a robot-based adaptation task. Two groups of subjects performed reaching movements, with or without motor imagery, in a velocity-dependent force field. The results show that reaching movements performed with motor imagery exhibit a relatively more focused generalization pattern and a higher learning rate in the training direction.
NASA Astrophysics Data System (ADS)
Coyne, Kevin Anthony
The safe operation of complex systems such as nuclear power plants requires close coordination between the human operators and plant systems. In order to maintain an adequate level of safety following an accident or other off-normal event, the operators often are called upon to perform complex tasks during dynamic situations with incomplete information. The safety of such complex systems can be greatly improved if the conditions that could lead operators to make poor decisions and commit erroneous actions during these situations can be predicted and mitigated. The primary goal of this research project was the development and validation of a cognitive model capable of simulating nuclear plant operator decision-making during accident conditions. Dynamic probabilistic risk assessment methods can improve the prediction of human error events by providing rich contextual information and an explicit consideration of feedback arising from man-machine interactions. The Accident Dynamics Simulator paired with the Information, Decision, and Action in a Crew context cognitive model (ADS-IDAC) shows promise for predicting situational contexts that might lead to human error events, particularly knowledge driven errors of commission. ADS-IDAC generates a discrete dynamic event tree (DDET) by applying simple branching rules that reflect variations in crew responses to plant events and system status changes. Branches can be generated to simulate slow or fast procedure execution speed, skipping of procedure steps, reliance on memorized information, activation of mental beliefs, variations in control inputs, and equipment failures. Complex operator mental models of plant behavior that guide crew actions can be represented within the ADS-IDAC mental belief framework and used to identify situational contexts that may lead to human error events. This research increased the capabilities of ADS-IDAC in several key areas. 
The ADS-IDAC computer code was improved to support additional branching events and provide a better representation of the IDAC cognitive model. An operator decision-making engine capable of responding to dynamic changes in situational context was implemented. The IDAC human performance model was fully integrated with a detailed nuclear plant model in order to realistically simulate plant accident scenarios. Finally, the improved ADS-IDAC model was calibrated, validated, and updated using actual nuclear plant crew performance data. This research led to the following general conclusions: (1) A relatively small number of branching rules are capable of efficiently capturing a wide spectrum of crew-to-crew variabilities. (2) Compared to traditional static risk assessment methods, ADS-IDAC can provide a more realistic and integrated assessment of human error events by directly determining the effect of operator behaviors on plant thermal hydraulic parameters. (3) The ADS-IDAC approach provides an efficient framework for capturing actual operator performance data such as timing of operator actions, mental models, and decision-making activities.
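The branching idea behind a discrete dynamic event tree can be sketched in a few lines (the rule names below are illustrative, not the actual ADS-IDAC branching set): each decision point multiplies the scenario space by the number of admissible crew responses, which is why a small rule set can still span a wide spectrum of crew-to-crew variability.

```python
from itertools import product

# Hypothetical branching rules (illustrative, not the real ADS-IDAC set):
# at each decision point the simulated crew may work a procedure step at
# one of three speeds and may or may not skip the step entirely.
BRANCH_RULES = {
    "procedure_speed": ["fast", "nominal", "slow"],
    "step_skipped": [False, True],
}

def expand_ddet(n_decision_points):
    """Enumerate every crew-response branch of a discrete dynamic event tree."""
    options_per_point = list(product(*BRANCH_RULES.values()))
    return list(product(options_per_point, repeat=n_decision_points))

branches = expand_ddet(2)   # 6 options per point -> 6**2 = 36 scenario branches
```

In a full simulator each branch would drive the coupled plant model forward, so feedback from plant thermal-hydraulic parameters prunes or reweights branches rather than enumerating them blindly as this toy does.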
Péter, András; Topál, József; Miklósi, Ádám; Pongrácz, Péter
2016-04-01
Performance in object search tasks is not only influenced by the subjects' object permanence ability. For example, ostensive cues of the human manipulating the target markedly affect dogs' choices. However, the interference between the target's location and the spatial cues of the human hiding the object is still unknown. In a five-location visible displacement task, the experimental groups differed in the hiding route of the experimenter. In the 'direct' condition he moved straight towards the actual location, hid the object and returned to the dog. In the 'indirect' conditions, he additionally walked behind each screen before returning. The two 'indirect' conditions differed from each other in that the human either visited the previously baited locations before (proactive interference) or after (retroactive interference) hiding the object. In the 'indirect' groups, dogs' performance was significantly lower than in the 'direct' group, demonstrating that for dogs, in an ostensive context, spatial cues of the hider are as important as the observed location of the target. Based on their incorrect choices, dogs were most attracted to the previously baited locations that the human visited after hiding the object in the actual trial. This underlines the importance of retroactive interference in multiple choice tasks. Copyright © 2016 Elsevier B.V. All rights reserved.
Generality of a congruity effect in judgements of relative order.
Liu, Yang S; Chan, Michelle; Caplan, Jeremy B
2014-10-01
The judgement of relative order (JOR) procedure is used to investigate serial-order memory. When response times were measured, the wording of the instructions (whether the earlier or the later item was designated as the target) reversed the direction of search in subspan lists (Chan, Ross, Earle, & Caplan, Psychonomic Bulletin & Review, 16(5), 945-951, 2009). If a similar congruity effect applied to above-span lists and, furthermore, with error rate as the measure, this could suggest how to model order memory across scales. Participants performed JORs on lists of nouns (Experiment 1: list lengths = 4, 6, 8, 10) or consonants (Experiment 2: list lengths = 4, 8). In addition to the usual distance, primacy, and recency effects, instructions interacted with the serial position of the later probe in both experiments, not only in response time but also in error rate, suggesting that availability, not just accessibility, is affected by instructions. The congruity effect challenges current memory models. We fitted Hacker's (Journal of Experimental Psychology: Human Learning and Memory, 6(6), 651-675, 1980) self-terminating search model to our data and found that a switch in search direction could explain the congruity effect for short lists, but not longer lists. This suggests that JORs may need to be understood via direct-access models, adapted to produce a congruity effect, or a mix of mechanisms.
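Hacker's self-terminating search model, which the authors fitted, can be sketched as a scan from one end of the list that stops at the first probe item encountered; reversing the scan direction with the instructions reproduces the congruity pattern in this toy form (the list items and probes below are made up).

```python
def jor_scan_steps(items, probe_a, probe_b, direction="forward"):
    """Self-terminating scan: number of items inspected before the first of
    the two probe items is encountered (a proxy for response time)."""
    order = items if direction == "forward" else list(reversed(items))
    for steps, item in enumerate(order, start=1):
        if item in (probe_a, probe_b):
            return steps
    raise ValueError("probes not found in list")

study_list = ["cat", "dog", "fox", "owl"]                    # made-up study list
fwd = jor_scan_steps(study_list, "dog", "owl", "forward")    # encounters "dog" first
bwd = jor_scan_steps(study_list, "dog", "owl", "backward")   # encounters "owl" first
```

The same probe pair yields different inspection counts under the two scan directions, which is the mechanism the authors found sufficient for short lists but not for longer ones.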
Modeling human tracking error in several different anti-tank systems
NASA Technical Reports Server (NTRS)
Kleinman, D. L.
1981-01-01
An optimal control model for generating time histories of human tracking errors in antitank systems is outlined. Monte Carlo simulations of human operator responses for three Army antitank systems are compared. System/manipulator dependent data comparisons reflecting human operator limitations in perceiving displayed quantities and executing intended control motions are presented. Motor noise parameters are also discussed.
An examination of an adapter method for measuring the vibration transmitted to the human arms.
Xu, Xueyan S; Dong, Ren G; Welcome, Daniel E; Warren, Christopher; McDowell, Thomas W
2015-09-01
The objective of this study is to evaluate an adapter method for measuring the vibration on the human arms. Four instrumented adapters with different weights were used to measure the vibration transmitted to the wrist, forearm, and upper arm of each subject. Each adapter was attached at each location on the subjects using an elastic cloth wrap. Two laser vibrometers were also used to measure the transmitted vibration at each location to evaluate the validity of the adapter method. The apparent mass at the palm of the hand along the forearm direction was also measured to enhance the evaluation. This study found that the adapter and laser-measured transmissibility spectra were comparable with some systematic differences. While increasing the adapter mass reduced the resonant frequency at the measurement location, increasing the tightness of the adapter attachment increased the resonant frequency. However, the use of lightweight (≤15 g) adapters under medium attachment tightness did not change the basic trends of the transmissibility spectrum. The resonant features observed in the transmissibility spectra were also correlated with those observed in the apparent mass spectra. Because the local coordinate systems of the adapters may be significantly misaligned relative to the global coordinates of the vibration test systems, large errors were observed for the adapter-measured transmissibility in some individual orthogonal directions. This study, however, also demonstrated that the misalignment issue can be resolved by either using the total vibration transmissibility or by measuring the misalignment angles to correct the errors. Therefore, the adapter method is acceptable for understanding the basic characteristics of the vibration transmission in the human arms, and the adapter-measured data are acceptable for approximately modeling the system.
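The misalignment fix described in the last sentences rests on a simple fact: per-axis components change under a rotation of the sensor's coordinate frame, but the total (vector) magnitude does not. A toy 2-D sketch with made-up numbers:

```python
import math

def rotate2d(vec, deg):
    """Rotate a 2-D vector; models an adapter frame misaligned by deg degrees."""
    t = math.radians(deg)
    x, y = vec
    return (x * math.cos(t) - y * math.sin(t), x * math.sin(t) + y * math.cos(t))

true_acc = (3.0, 4.0)                 # accelerations along the global axes
measured = rotate2d(true_acc, 25.0)   # what a 25-degree-misaligned adapter reports

axis_error = abs(measured[0] - true_acc[0])  # large error in a single axis
total_true = math.hypot(*true_acc)           # vector magnitude of the true signal
total_meas = math.hypot(*measured)           # identical: rotation-invariant
```

This is why reporting total transmissibility sidesteps the misalignment problem, while recovering the per-axis values requires measuring the misalignment angles and rotating the data back.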
Vavilov, A Iu; Viter, V I
2007-01-01
Mathematical aspects of the errors inherent in modern thermometric models of postmortem cooling of the human body are considered. The main diagnostic sites used for thermometry are analyzed with the aim of minimizing these errors. The authors propose practical recommendations for reducing errors in the determination of the time since death (the postmortem interval).
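For illustration only (the paper's actual thermometric models are not given here, and forensic practice uses more elaborate formulations such as Henssge's double-exponential method), a single-exponential Newtonian-cooling estimate of the postmortem interval looks like this; the cooling constant k and initial temperature t0 are arbitrary assumptions:

```python
import math

def hours_since_death(t_body, t_ambient, k=0.12, t0=37.2):
    """Single-exponential (Newtonian) cooling estimate of the postmortem
    interval in hours. k (1/h) and t0 (deg C) are illustrative assumptions."""
    if not t_ambient < t_body < t0:
        raise ValueError("temperatures outside the model's range")
    return math.log((t0 - t_ambient) / (t_body - t_ambient)) / k

pmi = hours_since_death(t_body=30.0, t_ambient=20.0)   # ~4.5 h under these assumptions
```

Because the estimate depends logarithmically on the measured temperatures, small measurement errors at late postmortem times (when the body nears ambient temperature) translate into large errors in the inferred interval, which motivates the careful choice of measurement site discussed above.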
Smiley, A M
1990-10-01
In February of 1986 a head-on collision occurred between a freight train and a passenger train in western Canada killing 23 people and causing over $30 million of damage. A Commission of Inquiry appointed by the Canadian government concluded that human error was the major reason for the collision. This report discusses the factors contributing to the human error: mainly poor work-rest schedules, the monotonous nature of the train driving task, insufficient information about train movements, and the inadequate backup systems in case of human error.
A Conceptual Framework for Predicting Error in Complex Human-Machine Environments
NASA Technical Reports Server (NTRS)
Freed, Michael; Remington, Roger; Null, Cynthia H. (Technical Monitor)
1998-01-01
We present a Goals, Operators, Methods, and Selection Rules-Model Human Processor (GOMS-MHP) style model-based approach to the problem of predicting human habit capture errors. Habit captures occur when the model fails to allocate limited cognitive resources to retrieve task-relevant information from memory. Lacking the unretrieved information, decision mechanisms act in accordance with implicit default assumptions, resulting in error when relied upon assumptions prove incorrect. The model helps interface designers identify situations in which such failures are especially likely.
Effect of thematic map misclassification on landscape multi-metric assessment.
Kleindl, William J; Powell, Scott L; Hauer, F Richard
2015-06-01
Advancements in remote sensing and computational tools have increased our awareness of large-scale environmental problems, thereby creating a need for monitoring, assessment, and management at these scales. Over the last decade, several watershed and regional multi-metric indices have been developed to assist decision-makers with planning actions of these scales. However, these tools use remote-sensing products that are subject to land-cover misclassification, and these errors are rarely incorporated in the assessment results. Here, we examined the sensitivity of a landscape-scale multi-metric index (MMI) to error from thematic land-cover misclassification and the implications of this uncertainty for resource management decisions. Through a case study, we used a simplified floodplain MMI assessment tool, whose metrics were derived from Landsat thematic maps, to initially provide results that were naive to thematic misclassification error. Using a Monte Carlo simulation model, we then incorporated map misclassification error into our MMI, resulting in four important conclusions: (1) each metric had a different sensitivity to error; (2) within each metric, the bias between the error-naive metric scores and simulated scores that incorporate potential error varied in magnitude and direction depending on the underlying land cover at each assessment site; (3) collectively, when the metrics were combined into a multi-metric index, the effects were attenuated; and (4) the index bias indicated that our naive assessment model may overestimate floodplain condition of sites with limited human impacts and, to a lesser extent, either over- or underestimated floodplain condition of sites with mixed land use.
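The Monte Carlo step described above can be sketched as follows (a toy example, not the authors' floodplain MMI: the confusion matrix, classes, and metric are invented). Each run relabels every mapped pixel according to class-conditional misclassification probabilities and recomputes the metric, yielding a distribution around, and a bias relative to, the error-naive score.

```python
import random

# Illustrative confusion matrix: P(true class | mapped class); invented numbers.
CONFUSION = {
    "forest": {"forest": 0.90, "grass": 0.10},
    "grass":  {"forest": 0.15, "grass": 0.85},
}

def simulate_metric(mapped_pixels, n_runs=2000, seed=42):
    """Monte Carlo distribution of a simple metric (forest fraction)
    under thematic misclassification error."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_runs):
        relabeled = [
            rng.choices(list(CONFUSION[c]), weights=list(CONFUSION[c].values()))[0]
            for c in mapped_pixels
        ]
        results.append(relabeled.count("forest") / len(relabeled))
    return results

pixels = ["forest"] * 70 + ["grass"] * 30   # a toy 100-pixel map
dist = simulate_metric(pixels)
naive_score = 0.70                          # error-naive forest fraction
mean_sim = sum(dist) / len(dist)            # biased relative to the naive score
```

The direction and magnitude of the bias depend on the underlying land cover, mirroring the paper's second conclusion that misclassification can push a metric either above or below its naive value at different sites.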
NASA Technical Reports Server (NTRS)
Dar, M. E.; Jorgensen, T. J.
1995-01-01
Using the radiomimetic drug bleomycin, we have determined the mutagenic potential of DNA strand breaks in the shuttle vector pZ189 in human fibroblasts. The bleomycin treatment conditions used produce strand breaks with 3'-phosphoglycolate termini as > 95% of the detectable dose-dependent lesions. Breaks with this end group represent 50% of the strand break damage produced by ionizing radiation. We report that such strand breaks are mutagenic lesions. The type of mutation produced is largely determined by the type of strand break on the plasmid (i.e. single versus double). Mutagenesis studies with purified DNA forms showed that nicked plasmids (i.e. those containing single-strand breaks) predominantly produce base substitutions, the majority of which are multiples, which presumably originate from error-prone polymerase activity at strand break sites. In contrast, repair of linear plasmids (i.e. those containing double-strand breaks) mainly results in deletions at short direct repeat sequences, indicating the involvement of illegitimate recombination. The data characterize the nature of mutations produced by single- and double-strand breaks in human cells, and suggest that deletions at direct repeats may be a 'signature' mutation for the processing of DNA double-strand breaks.
Rong, Hao; Tian, Jin
2015-05-01
The study contributes to human reliability analysis (HRA) by proposing a method that focuses more on human error causality within a sociotechnical system, illustrating its rationality and feasibility with a case study of a Minuteman (MM) III missile accident. Due to the complexity and dynamics of sociotechnical systems, previous analyses of accidents involving human and organizational factors clearly demonstrated that methods based on a sequential accident model are inadequate for analyzing human error within such systems. System-theoretic accident model and processes (STAMP) was used to develop a universal framework of human error causal analysis. To elaborate the causal relationships and demonstrate the dynamics of human error, system dynamics (SD) modeling was conducted based on the framework. A total of 41 contributing factors, categorized into four types of human error, were identified through the STAMP-based analysis. All factors relate to a broad view of sociotechnical systems and are more comprehensive than the causation presented in the officially issued accident investigation report. Recommendations for both technical and managerial improvements to lower the risk of the accident are proposed. The interdisciplinary approach provides complementary support between system safety and human factors, and the integrated method based on STAMP and the SD model contributes effectively to HRA. The proposed method will be beneficial to HRA, risk assessment, and control of the MM III operating process, as well as other sociotechnical systems. © 2014, Human Factors and Ergonomics Society.
Approximate error conjugate gradient minimization methods
Kallman, Jeffrey S
2013-05-21
In one embodiment, a method includes selecting a subset of rays from a set of all rays to use in an error calculation for a constrained conjugate gradient minimization problem, calculating an approximate error using the subset of rays, and calculating a minimum in a conjugate gradient direction based on the approximate error. In another embodiment, a system includes a processor for executing logic, logic for selecting a subset of rays from a set of all rays to use in an error calculation for a constrained conjugate gradient minimization problem, logic for calculating an approximate error using the subset of rays, and logic for calculating a minimum in a conjugate gradient direction based on the approximate error. In other embodiments, computer program products, methods, and systems are described capable of using approximate error in constrained conjugate gradient minimization problems.
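The idea in the claims above, evaluating the error on a subset of rays and minimizing along conjugate gradient directions, can be illustrated with a small sketch. The toy ray matrix, subset fraction, and Fletcher-Reeves update below are illustrative assumptions, not the patented implementation.

```python
import random

# Illustrative sketch: minimize 0.5 * sum_{i in subset} (A_i . x - b_i)^2,
# where A maps an image x to ray sums and b holds ray measurements, using
# only a randomly selected subset of rays to form the approximate error.

def approx_error_cg(A, b, subset_frac=0.5, n_iters=25, seed=1):
    n_rays, n_vox = len(A), len(A[0])
    n_sub = max(n_vox, int(subset_frac * n_rays))
    rays = random.Random(seed).sample(range(n_rays), n_sub)  # ray subset
    x = [0.0] * n_vox
    d, g_prev = None, None
    for _ in range(n_iters):
        # approximate error: residuals on the chosen ray subset only
        resid = [sum(A[i][j] * x[j] for j in range(n_vox)) - b[i] for i in rays]
        # gradient of the approximate error
        g = [sum(A[i][j] * r for i, r in zip(rays, resid)) for j in range(n_vox)]
        if sum(v * v for v in g) < 1e-18:
            break  # converged on the subset objective
        if d is None:
            d = [-v for v in g]
        else:
            beta = sum(v * v for v in g) / sum(v * v for v in g_prev)  # Fletcher-Reeves
            d = [-v + beta * w for v, w in zip(g, d)]
        # exact minimum along d for the quadratic subset objective
        Ad = [sum(A[i][j] * d[j] for j in range(n_vox)) for i in rays]
        alpha = -sum(r * a for r, a in zip(resid, Ad)) / sum(a * a for a in Ad)
        x = [xj + alpha * dj for xj, dj in zip(x, d)]
        g_prev = g

    return x

# toy 6-ray, 2-voxel system with a known image
A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [1.0, 2.0], [2.0, 1.0], [1.0, -1.0]]
x_true = [2.0, -1.0]
b = [sum(aij * xj for aij, xj in zip(row, x_true)) for row in A]
x_est = approx_error_cg(A, b)  # recovers x_true from the ray subset
```

Because the toy system is consistent, any spanning subset of rays yields the full-data solution; the saving in a real reconstruction comes from evaluating far fewer ray sums per iteration.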
Why GPS makes distances bigger than they are
Ranacher, Peter; Brunauer, Richard; Trutschnig, Wolfgang; Van der Spek, Stefan; Reich, Siegfried
2016-01-01
Global navigation satellite systems such as the Global Positioning System (GPS) are among the most important sensors for movement analysis. GPS is widely used to record the trajectories of vehicles, animals and human beings. However, all GPS movement data are affected by both measurement and interpolation errors. In this article we show that measurement error causes a systematic bias in distances recorded with a GPS; the distance between two points recorded with a GPS is, on average, bigger than the true distance between these points. This systematic ‘overestimation of distance’ becomes relevant if the influence of interpolation error can be neglected, which in practice is the case for movement sampled at high frequencies. We provide a mathematical explanation of this phenomenon and illustrate that it functionally depends on the autocorrelation of GPS measurement error (C). We argue that C can be interpreted as a quality measure for movement data recorded with a GPS. If there is a strong autocorrelation between any two consecutive position estimates, they have very similar errors. This error cancels out when average speed, distance or direction is calculated along the trajectory. Based on our theoretical findings we introduce a novel approach to determine C in real-world GPS movement data sampled at high frequencies. We apply our approach to pedestrian trajectories and car trajectories. We found that the measurement error in the data was strongly spatially and temporally autocorrelated, and we give a quality estimate of the data. Most importantly, our findings are not limited to GPS alone. The systematic bias and its implications are bound to occur in any movement data collected with absolute positioning if interpolation error can be neglected. PMID:27019610
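The systematic bias is easy to reproduce in a toy simulation with uncorrelated position noise; the noise level and coordinates below are made up, not taken from the article.

```python
import math
import random

# Minimal simulation of the bias described above: independent (uncorrelated)
# position noise on two points makes their measured distance larger, on
# average, than the true distance.

def mean_measured_distance(p, q, sigma=1.0, n=20000, seed=7):
    """Average measured 2D distance between points p and q when each
    coordinate is perturbed by independent Gaussian noise of std sigma."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        px, py = p[0] + rng.gauss(0, sigma), p[1] + rng.gauss(0, sigma)
        qx, qy = q[0] + rng.gauss(0, sigma), q[1] + rng.gauss(0, sigma)
        total += math.hypot(px - qx, py - qy)
    return total / n

true_d = 10.0
measured = mean_measured_distance((0.0, 0.0), (true_d, 0.0))
# measured > true_d: the overestimation of distance predicted by the article.
# Strongly autocorrelated errors (near-identical offsets on both points)
# would largely cancel and remove this bias.
```

Summing such segment distances along a densely sampled trajectory accumulates the per-segment bias, which is why the effect matters at high sampling frequencies.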
Reinhart, Robert M G; Zhu, Julia; Park, Sohee; Woodman, Geoffrey F
2015-09-02
Posterror learning, associated with medial-frontal cortical recruitment in healthy subjects, is compromised in neuropsychiatric disorders. Here we report novel evidence for the mechanisms underlying learning dysfunctions in schizophrenia. We show that, by noninvasively passing direct current through human medial-frontal cortex, we could enhance the event-related potential related to learning from mistakes (i.e., the error-related negativity), a putative index of prediction error signaling in the brain. Following this causal manipulation of brain activity, the patients learned a new task at a rate that was indistinguishable from healthy individuals. Moreover, the severity of delusions interacted with the efficacy of the stimulation to improve learning. Our results demonstrate a causal link between disrupted prediction error signaling and inefficient learning in schizophrenia. These findings also demonstrate the feasibility of nonpharmacological interventions to address cognitive deficits in neuropsychiatric disorders. When there is a difference between what we expect to happen and what we actually experience, our brains generate a prediction error signal, so that we can map stimuli to responses and predict outcomes accurately. Theories of schizophrenia implicate abnormal prediction error signaling in the cognitive deficits of the disorder. Here, we combine noninvasive brain stimulation with large-scale electrophysiological recordings to establish a causal link between faulty prediction error signaling and learning deficits in schizophrenia. We show that it is possible to improve learning rate, as well as the neural signature of prediction error signaling, in patients to a level quantitatively indistinguishable from that of healthy subjects. The results provide mechanistic insight into schizophrenia pathophysiology and suggest a future therapy for this condition. Copyright © 2015 the authors 0270-6474/15/3512232-09$15.00/0.
Characterizing the SWOT discharge error budget on the Sacramento River, CA
NASA Astrophysics Data System (ADS)
Yoon, Y.; Durand, M. T.; Minear, J. T.; Smith, L.; Merry, C. J.
2013-12-01
The Surface Water and Ocean Topography (SWOT) mission is an upcoming satellite mission (planned for launch in 2020) that will provide surface-water elevation and surface-water extent globally. One goal of SWOT is the estimation of river discharge directly from SWOT measurements. SWOT discharge uncertainty is due to two sources. First, SWOT cannot directly measure the channel bathymetry and roughness coefficients necessary for discharge calculations; these parameters must be estimated from the measurements or from a priori information. Second, SWOT measurement errors directly impact the discharge estimate accuracy. This study focuses on characterizing parameter and measurement uncertainties for SWOT river discharge estimation. A Bayesian Markov Chain Monte Carlo scheme is used to calculate parameter estimates, given the measurements of river height, slope and width, and mass and momentum constraints. The algorithm is evaluated using both simulated SWOT and AirSWOT (the airborne version of SWOT) observations over seven reaches (about 40 km) of the Sacramento River. The SWOT and AirSWOT observations are simulated by corrupting the 'true' HEC-RAS hydraulic modeling results with the instrument error. This experiment addresses how unknown bathymetry and roughness coefficients affect the accuracy of the river discharge algorithm. From the experiment, the discharge error budget is almost completely dominated by unknown bathymetry and roughness; 81% of the variance error is explained by uncertainties in bathymetry and roughness. Second, we show how errors in the water-surface, slope, and width observations influence the accuracy of discharge estimates. Indeed, there is a significant sensitivity to water-surface, slope, and width errors due to the sensitivity of bathymetry and roughness to measurement errors. Increasing water-surface error above 10 cm leads to a correspondingly sharp increase in bathymetry and roughness errors. Increasing slope error above 1.5 cm/km leads to a significant degradation due to direct error in the discharge estimates. As the width error increases past 20%, the discharge error budget is dominated by the width error. The two experiments above are based on AirSWOT scenarios. In addition, we explore the sensitivity of the algorithm to the SWOT scenarios.
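The Bayesian estimation step can be caricatured with a generic Metropolis sampler; the Gaussian likelihood, flat prior, and data below are toy stand-ins, not the study's hydraulic model or SWOT observations.

```python
import math
import random

# Generic Metropolis sketch of Bayesian parameter estimation: infer an
# unknown scalar parameter (standing in for bathymetry/roughness) from noisy
# observations (standing in for SWOT height measurements).

def metropolis(data, sigma_obs, n_samples=5000, step=0.2, seed=11):
    rng = random.Random(seed)

    def log_post(theta):
        # flat prior; independent Gaussian likelihood centred on theta
        return -sum((y - theta) ** 2 for y in data) / (2 * sigma_obs ** 2)

    theta, samples = 0.0, []
    lp = log_post(theta)
    for _ in range(n_samples):
        prop = theta + rng.gauss(0, step)       # random-walk proposal
        lp_prop = log_post(prop)
        # accept with probability min(1, exp(lp_prop - lp))
        if math.log(rng.random() + 1e-300) < lp_prop - lp:
            theta, lp = prop, lp_prop
        samples.append(theta)
    return samples

obs = [2.9, 3.1, 3.0, 3.2, 2.8]        # noisy "height" observations, truth ~3
post = metropolis(obs, sigma_obs=0.1)
estimate = sum(post[1000:]) / len(post[1000:])  # posterior mean after burn-in
```

In the actual discharge algorithm the likelihood would couple height, slope, and width through mass and momentum constraints, but the accept/reject machinery is the same.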
Qiao-Grider, Ying; Hung, Li-Fang; Kee, Chea-Su; Ramamirtham, Ramkumar; Smith, Earl L
2010-08-23
We analyzed the contribution of individual ocular components to vision-induced ametropias in 210 rhesus monkeys. The primary contribution to refractive-error development came from vitreous chamber depth; a minor contribution from corneal power was also detected. However, there was no systematic relationship between refractive error and anterior chamber depth or between refractive error and any crystalline lens parameter. Our results are in good agreement with previous studies in humans, suggesting that the refractive errors commonly observed in humans are created by vision-dependent mechanisms that are similar to those operating in monkeys. This concordance emphasizes the applicability of rhesus monkeys in refractive-error studies. Copyright 2010 Elsevier Ltd. All rights reserved.
Qiao-Grider, Ying; Hung, Li-Fang; Kee, Chea-su; Ramamirtham, Ramkumar; Smith, Earl L.
2010-01-01
We analyzed the contribution of individual ocular components to vision-induced ametropias in 210 rhesus monkeys. The primary contribution to refractive-error development came from vitreous chamber depth; a minor contribution from corneal power was also detected. However, there was no systematic relationship between refractive error and anterior chamber depth or between refractive error and any crystalline lens parameter. Our results are in good agreement with previous studies in humans, suggesting that the refractive errors commonly observed in humans are created by vision-dependent mechanisms that are similar to those operating in monkeys. This concordance emphasizes the applicability of rhesus monkeys in refractive-error studies. PMID:20600237
Internally-generated error signals in monkey frontal eye field during an inferred motion task
Ferrera, Vincent P.; Barborica, Andrei
2010-01-01
An internal model for predictive saccades in frontal cortex was investigated by recording neurons in monkey frontal eye field during an inferred motion task. Monkeys were trained to make saccades to the extrapolated position of a small moving target that was rendered temporarily invisible and whose trajectory was altered. On roughly two-thirds of the trials, monkeys made multiple saccades while the target was invisible. Primary saccades were correlated with extrapolated target position. Secondary saccades significantly reduced residual errors resulting from imperfect accuracy of the first saccade. These observations suggest that the second saccade was corrective. As there was no visual feedback, corrective saccades could only be driven by an internally generated error signal. Neuronal activity in the frontal eye field was directionally tuned prior to both primary and secondary saccades. Separate subpopulations of cells encoded either saccade direction or direction error prior to the second saccade. These results suggest that FEF neurons encode the error after the first saccade, as well as the direction of the second saccade. Hence, FEF appears to contribute to detecting and correcting movement errors based on internally generated signals. PMID:20810882
Data driven CAN node reliability assessment for manufacturing system
NASA Astrophysics Data System (ADS)
Zhang, Leiming; Yuan, Yong; Lei, Yong
2017-01-01
The reliability of the Controller Area Network (CAN) is critical to the performance and safety of the system. However, direct bus-off time assessment tools are lacking in practice due to inaccessibility of the node information and the complexity of the node interactions upon errors. In order to measure the mean time to bus-off (MTTB) of all the nodes, a novel data driven node bus-off time assessment method for CAN network is proposed by directly using network error information. First, the corresponding network error event sequence for each node is constructed using multiple-layer network error information. Then, the generalized zero inflated Poisson process (GZIP) model is established for each node based on the error event sequence. Finally, the stochastic model is constructed to predict the MTTB of the node. The accelerated case studies with different error injection rates are conducted on a laboratory network to demonstrate the proposed method, where the network errors are generated by a computer controlled error injection system. Experiment results show that the MTTB of nodes predicted by the proposed method agree well with observations in the case studies. The proposed data driven node time to bus-off assessment method for CAN networks can successfully predict the MTTB of nodes by directly using network error event data.
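A highly simplified sketch of MTTB estimation (not the paper's GZIP model): simulate the standard CAN transmit error counter, which increments by 8 on a transmit error, decrements by 1 on success, and puts the node in bus-off at 256, under an assumed per-message error probability. The rates below are illustrative.

```python
import random

# Simplified Monte Carlo estimate of a CAN node's mean time to bus-off
# (MTTB) under independent per-message errors. Real CAN error handling is
# richer (receive counters, error-passive states), so this is only a sketch.

def mean_time_to_bus_off(p_error, msgs_per_s=1000.0, n_runs=200, seed=3):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_runs):
        tec, msgs = 0, 0                 # transmit error counter, messages sent
        while tec < 256:                 # bus-off threshold
            msgs += 1
            if rng.random() < p_error:
                tec += 8                 # transmit error
            else:
                tec = max(0, tec - 1)    # successful transmission
        total += msgs / msgs_per_s
    return total / n_runs

# Higher injected error rates drive the node to bus-off sooner, as in the
# paper's accelerated case studies.
mttb_high = mean_time_to_bus_off(0.20)
mttb_low = mean_time_to_bus_off(0.15)
```

The GZIP model in the paper replaces the independent-error assumption here with a fitted, burst-aware error process, which is what makes its MTTB predictions match observations.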
A novel validation and calibration method for motion capture systems based on micro-triangulation.
Nagymáté, Gergely; Tuchband, Tamás; Kiss, Rita M
2018-06-06
Motion capture systems are widely used to measure human kinematics. Nevertheless, users must consider system errors when evaluating their results. Most validation techniques for these systems are based on relative distance and displacement measurements. In contrast, our study aimed to analyse the absolute volume accuracy of optical motion capture systems by means of engineering surveying reference measurement of the marker coordinates (uncertainty: 0.75 mm). The method is exemplified on an 18-camera OptiTrack Flex13 motion capture system. The absolute accuracy was defined by the root mean square error (RMSE) between the coordinates measured by the camera system and by engineering surveying (micro-triangulation). The original RMSE of 1.82 mm, dominated by scaling error, was reduced to 0.77 mm, while the correlation of errors with distance from the origin fell from 0.855 to 0.209. A simpler but less accurate absolute-accuracy compensation method using a tape measure over large distances was also tested; it yielded scaling compensation similar to that of the surveying method or of direct wand-size compensation using a high-precision 3D scanner. The presented validation methods can be less precise in some respects compared with previous techniques, but they address an error type that has not been and cannot be studied with the previous validation methods. Copyright © 2018 Elsevier Ltd. All rights reserved.
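The two quantities at the heart of the method, the coordinate RMSE against a surveyed reference and a global scale-factor compensation, can be sketched as follows; the marker coordinates are invented for illustration, and the least-squares scale fit is an assumed form of the compensation.

```python
import math

# Sketch: RMSE between measured and reference 3D marker coordinates, and a
# single least-squares scale factor s minimizing ||s * measured - reference||^2.

def rmse(measured, reference):
    """Root mean square 3D error over paired marker coordinates (per marker)."""
    return math.sqrt(sum(
        sum((m - r) ** 2 for m, r in zip(pm, pr))
        for pm, pr in zip(measured, reference)) / len(measured))

def scale_compensate(measured, reference):
    """Apply the least-squares global scale factor to the measured markers."""
    num = sum(m * r for pm, pr in zip(measured, reference)
              for m, r in zip(pm, pr))
    den = sum(m * m for pm in measured for m in pm)
    s = num / den
    return [[s * m for m in pm] for pm in measured]

# invented coordinates (mm) with a pure 0.1% scale error
ref = [[1000.0, 0.0, 0.0], [0.0, 2000.0, 0.0], [0.0, 0.0, 1500.0]]
meas = [[1001.0, 0.0, 0.0], [0.0, 2002.0, 0.0], [0.0, 0.0, 1501.5]]

rmse_raw = rmse(meas, ref)                          # dominated by scale error
rmse_comp = rmse(scale_compensate(meas, ref), ref)  # far smaller after compensation
```

In this toy case the residual after compensation is essentially zero because the error is purely a scale; in real data the residual RMSE (0.77 mm in the study) reflects the remaining non-scale errors.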
Geolocation error tracking of ZY-3 three line cameras
NASA Astrophysics Data System (ADS)
Pan, Hongbo
2017-01-01
The high-accuracy geolocation of high-resolution satellite images (HRSIs) is a key issue for mapping and integrating multi-temporal, multi-sensor images. In this manuscript, we propose a new geometric frame for analysing the geometric error of a stereo HRSI, in which the geolocation error can be divided into three parts: the epipolar direction, cross base direction, and height direction. With this frame, we proved that the height error of three line cameras (TLCs) is independent of nadir images, and that the terrain effect has a limited impact on the geolocation errors. For ZY-3 error sources, the drift error in both the pitch and roll angle and its influence on the geolocation accuracy are analysed. Epipolar and common tie-point constraints are proposed to study the bundle adjustment of HRSIs. Epipolar constraints explain that the relative orientation can reduce the number of compensation parameters in the cross base direction and have a limited impact on the height accuracy. The common tie points adjust the pitch-angle errors to be consistent with each other for TLCs. Therefore, free-net bundle adjustment of a single strip cannot significantly improve the geolocation accuracy. Furthermore, the epipolar and common tie-point constraints cause the error to propagate into the adjacent strip when multiple strips are involved in the bundle adjustment, which results in the same attitude uncertainty throughout the whole block. Two adjacent strips (Orbit 305 and Orbit 381, covering 7 and 12 standard scenes, respectively) and 308 ground control points (GCPs) were used for the experiments. The experiments validate the aforementioned theory. The planimetric and height root mean square errors were 2.09 and 1.28 m, respectively, when two GCPs were settled at the beginning and end of the block.
Poster - 49: Assessment of Synchrony respiratory compensation error for CyberKnife liver treatment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Ming; Cygler,
The goal of this work is to quantify respiratory motion compensation errors for liver tumor patients treated by the CyberKnife system with Synchrony tracking, to identify patients with the smallest tracking errors and to eventually help coach patients' breathing patterns to minimize dose delivery errors. The accuracy of CyberKnife Synchrony respiratory motion compensation was assessed for 37 patients treated for liver lesions by analyzing data from system logfiles. A predictive model is used to modulate the direction of individual beams during dose delivery based on the positions of internally implanted fiducials determined using an orthogonal x-ray imaging system and the current location of LED external markers. For each x-ray pair acquired, system logfiles report the prediction error, the difference between the measured and predicted fiducial positions, and the delivery error, which is an estimate of the statistical error in the model overcoming the latency between x-ray acquisition and robotic repositioning. The total error was calculated at the time of each x-ray pair, across all treatment fractions and patients, giving the average respiratory motion compensation error in three dimensions. The 99th percentile for the total radial error is 3.85 mm, with the highest contribution of 2.79 mm in the superior/inferior (S/I) direction. The absolute mean compensation error is 1.78 mm radially with a 1.27 mm contribution in the S/I direction. Regions of high total error may provide insight into features predicting groups of patients with larger or smaller total errors.
Pointing control using a moving base of support.
Hondzinski, Jan M; Kwon, Taegyong
2009-07-01
The purposes of this study were to determine whether gaze direction provides a control signal for movement direction in a pointing task requiring a step, and to gain insight into previously reported discrepancies in endpoint accuracy when gaze is directed eccentrically. Straight arm pointing movements were performed to real and remembered target locations, either toward or 30 degrees eccentric to gaze direction. Pointing occurred in normal room lighting or darkness while subjects sat, stood still or side-stepped left or right. Trunk rotation contributed 22-65% to gaze orientations when it was not constrained. Error differences for different target locations explained discrepancies among previous experiments. Variable pointing errors were influenced by gaze direction, while mean systematic pointing errors and trunk orientations were influenced by step direction. These data support the use of a control strategy that relies on gaze direction and equilibrium inputs for whole-body goal-directed movements.
Effects of a direct refill program for automated dispensing cabinets on medication-refill errors.
Helmons, Pieter J; Dalton, Ashley J; Daniels, Charles E
2012-10-01
The effects of a direct refill program for automated dispensing cabinets (ADCs) on medication-refill errors were studied. This study was conducted in designated acute care areas of a 386-bed academic medical center. A wholesaler-to-ADC direct refill program, consisting of prepackaged delivery of medications and bar-code-assisted ADC refilling, was implemented in the inpatient pharmacy of the medical center in September 2009. Medication-refill errors in 26 ADCs from the general medicine units, the infant special care unit, the surgical and burn intensive care units, and intermediate units were assessed before and after the implementation of this program. Medication-refill errors were defined as an ADC pocket containing the wrong drug, wrong strength, or wrong dosage form. ADC refill errors decreased by 77%, from 62 errors per 6829 refilled pockets (0.91%) to 8 errors per 3855 refilled pockets (0.21%) (p < 0.0001). The predominant error type detected before the intervention was the incorrect medication (wrong drug, wrong strength, or wrong dosage form) in the ADC pocket. Of the 54 incorrect medications found before the intervention, 38 (70%) were loaded in a multiple-drug drawer. After the implementation of the new refill process, 3 of the 5 incorrect medications were loaded in a multiple-drug drawer. There were 3 instances of expired medications before and only 1 expired medication after implementation of the program. A redesign of the ADC refill process using a wholesaler-to-ADC direct refill program that included delivery of prepackaged medication and bar-code-assisted refill significantly decreased the occurrence of ADC refill errors.
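The reported reduction follows directly from the counts in the abstract; this short check recomputes the before and after rates and the relative reduction.

```python
# Recompute the refill-error rates reported above from the raw counts.

def error_rate(errors, pockets):
    """Fraction of refilled ADC pockets containing a refill error."""
    return errors / pockets

before = error_rate(62, 6829)    # pre-intervention rate, about 0.91%
after = error_rate(8, 3855)      # post-intervention rate, about 0.21%
reduction = 1 - after / before   # relative reduction, about 77%
```

Agreement with the published 0.91%, 0.21%, and 77% figures confirms the arithmetic of the abstract.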
Clover: Compiler directed lightweight soft error resilience
Liu, Qingrui; Lee, Dongyoon; Jung, Changhee; ...
2015-05-01
This paper presents Clover, a compiler directed soft error detection and recovery scheme for lightweight soft error resilience. The compiler carefully generates soft error tolerant code based on idempotent processing without explicit checkpoints. During program execution, Clover relies on a small number of acoustic wave detectors deployed in the processor to identify soft errors by sensing the wave made by a particle strike. To cope with DUE (detected unrecoverable errors) caused by the sensing latency of error detection, Clover leverages a novel selective instruction duplication technique called tail-DMR (dual modular redundancy). Once a soft error is detected by either the sensor or the tail-DMR, Clover takes care of the error as in the case of exception handling. To recover from the error, Clover simply redirects program control to the beginning of the code region where the error is detected. Lastly, the experimental results demonstrate that the average runtime overhead is only 26%, which is a 75% reduction compared to that of the state-of-the-art soft error resilience technique.
Kotasidis, F A; Mehranian, A; Zaidi, H
2016-05-07
Kinetic parameter estimation in dynamic PET suffers from reduced accuracy and precision when parametric maps are estimated using kinetic modelling following image reconstruction of the dynamic data. Direct approaches to parameter estimation attempt to directly estimate the kinetic parameters from the measured dynamic data within a unified framework. Such image reconstruction methods have been shown to generate parametric maps of improved precision and accuracy in dynamic PET. However, due to the interleaving between the tomographic and kinetic modelling steps, any tomographic or kinetic modelling errors in certain regions or frames, tend to spatially or temporally propagate. This results in biased kinetic parameters and thus limits the benefits of such direct methods. Kinetic modelling errors originate from the inability to construct a common single kinetic model for the entire field-of-view, and such errors in erroneously modelled regions could spatially propagate. Adaptive models have been used within 4D image reconstruction to mitigate the problem, though they are complex and difficult to optimize. Tomographic errors in dynamic imaging on the other hand, can originate from involuntary patient motion between dynamic frames, as well as from emission/transmission mismatch. Motion correction schemes can be used, however, if residual errors exist or motion correction is not included in the study protocol, errors in the affected dynamic frames could potentially propagate either temporally, to other frames during the kinetic modelling step or spatially, during the tomographic step. In this work, we demonstrate a new strategy to minimize such error propagation in direct 4D image reconstruction, focusing on the tomographic step rather than the kinetic modelling step, by incorporating time-of-flight (TOF) within a direct 4D reconstruction framework. 
Using ever improving TOF resolutions (580 ps, 440 ps, 300 ps and 160 ps), we demonstrate that direct 4D TOF image reconstruction can substantially prevent kinetic parameter error propagation either from erroneous kinetic modelling, inter-frame motion or emission/transmission mismatch. Furthermore, we demonstrate the benefits of TOF in parameter estimation when conventional post-reconstruction (3D) methods are used and compare the potential improvements to direct 4D methods. Further improvements could possibly be achieved in the future by combining TOF direct 4D image reconstruction with adaptive kinetic models and inter-frame motion correction schemes.
NASA Astrophysics Data System (ADS)
Kotasidis, F. A.; Mehranian, A.; Zaidi, H.
2016-05-01
Kinetic parameter estimation in dynamic PET suffers from reduced accuracy and precision when parametric maps are estimated using kinetic modelling following image reconstruction of the dynamic data. Direct approaches to parameter estimation attempt to directly estimate the kinetic parameters from the measured dynamic data within a unified framework. Such image reconstruction methods have been shown to generate parametric maps of improved precision and accuracy in dynamic PET. However, due to the interleaving between the tomographic and kinetic modelling steps, any tomographic or kinetic modelling errors in certain regions or frames, tend to spatially or temporally propagate. This results in biased kinetic parameters and thus limits the benefits of such direct methods. Kinetic modelling errors originate from the inability to construct a common single kinetic model for the entire field-of-view, and such errors in erroneously modelled regions could spatially propagate. Adaptive models have been used within 4D image reconstruction to mitigate the problem, though they are complex and difficult to optimize. Tomographic errors in dynamic imaging on the other hand, can originate from involuntary patient motion between dynamic frames, as well as from emission/transmission mismatch. Motion correction schemes can be used, however, if residual errors exist or motion correction is not included in the study protocol, errors in the affected dynamic frames could potentially propagate either temporally, to other frames during the kinetic modelling step or spatially, during the tomographic step. In this work, we demonstrate a new strategy to minimize such error propagation in direct 4D image reconstruction, focusing on the tomographic step rather than the kinetic modelling step, by incorporating time-of-flight (TOF) within a direct 4D reconstruction framework. 
Using ever improving TOF resolutions (580 ps, 440 ps, 300 ps and 160 ps), we demonstrate that direct 4D TOF image reconstruction can substantially prevent kinetic parameter error propagation either from erroneous kinetic modelling, inter-frame motion or emission/transmission mismatch. Furthermore, we demonstrate the benefits of TOF in parameter estimation when conventional post-reconstruction (3D) methods are used and compare the potential improvements to direct 4D methods. Further improvements could possibly be achieved in the future by combining TOF direct 4D image reconstruction with adaptive kinetic models and inter-frame motion correction schemes.
Human Error as an Emergent Property of Action Selection and Task Place-Holding.
Tamborello, Franklin P; Trafton, J Gregory
2017-05-01
A computational process model could explain how the dynamic interaction of human cognitive mechanisms produces each of multiple error types. With increasing capability and complexity of technological systems, the potential severity of consequences of human error is magnified. Interruption greatly increases people's error rates, as does the presence of other information to maintain in an active state. The model executed as a software-instantiated Monte Carlo simulation. It drew on theoretical constructs such as associative spreading activation for prospective memory, explicit rehearsal strategies as a deliberate cognitive operation to aid retrospective memory, and decay. The model replicated the 30% effect of interruptions on postcompletion error in Ratwani and Trafton's Stock Trader task, the 45% interaction effect on postcompletion error of working memory capacity and working memory load from Byrne and Bovair's Phaser Task, as well as the 5% perseveration and 3% omission effects of interruption from the UNRAVEL Task. Error classes including perseveration, omission, and postcompletion error fall naturally out of the theory. The model explains post-interruption error in terms of task state representation and priming for recall of subsequent steps. Its performance suggests that task environments providing more cues to current task state will mitigate error caused by interruption. For example, interfaces could provide labeled progress indicators or facilities for operators to quickly write notes about their task states when interrupted.
Karsh, B‐T; Holden, R J; Alper, S J; Or, C K L
2006-01-01
The goal of improving patient safety has led to a number of paradigms for directing improvement efforts. The main paradigms to date have focused on reducing injuries, reducing errors, or improving evidence based practice. In this paper a human factors engineering paradigm is proposed that focuses on designing systems to improve the performance of healthcare professionals and to reduce hazards. Both goals are necessary, but neither is sufficient to improve safety. We suggest that the road to patient and employee safety runs through the healthcare professional who delivers care. To that end, several arguments are provided to show that designing healthcare delivery systems to support healthcare professional performance and hazard reduction should yield significant patient safety benefits. The concepts of human performance and hazard reduction are explained. PMID:17142611
Emulation as an Integrating Principle for Cognition
Colder, Brian
2011-01-01
Emulations, defined as ongoing internal representations of potential actions and the futures those actions are expected to produce, play a critical role in directing human bodily activities. Studies of gross motor behavior, perception, allocation of attention, response to errors, interoception and homeostatic activities, and higher cognitive reasoning suggest that the proper execution of all these functions relies on emulations. Further evidence supports the notion that reinforcement learning in humans is aimed at updating emulations, and that action selection occurs via the advancement of preferred emulations toward realization of their action and environmental prediction. Emulations are hypothesized to exist as distributed active networks of neurons in cortical and sub-cortical structures. This manuscript ties together previously unrelated theories of the role of prediction in different aspects of human information processing to create an integrated framework for cognition. PMID:21660288
NASA Technical Reports Server (NTRS)
Kimes, D. S.; Kerber, A. G.; Sellers, P. J.
1993-01-01
Spatial averaging errors that may occur when creating hemispherical reflectance maps for different cover types using direct nadir techniques to estimate hemispherical reflectance are assessed by comparing the results with those obtained with a knowledge-based system called VEG (Kimes et al., 1991, 1992). Depending on conditions, the hemispherical reflectance errors obtained using VEG were found to be much smaller than those from the direct nadir techniques. Suggestions are made concerning sampling and averaging strategies for creating hemispherical reflectance maps for photosynthetic, carbon cycle, and climate change studies.
Byliński, Hubert; Gębicki, Jacek; Dymerski, Tomasz; Namieśnik, Jacek
2017-07-04
One of the major sources of error in chemical analysis using conventional, established analytical techniques is the possibility of losing part of the analytes during the sample preparation stage. Unfortunately, this stage is required to improve analytical sensitivity and precision. Direct techniques have helped to shorten or even bypass sample preparation, and in this review we comment on some of the new direct techniques that are mass-spectrometry based. The study presents information about measurement techniques using mass spectrometry that allow direct sample analysis, without sample preparation or with only limited pre-concentration steps. The MALDI-MS, PTR-MS, SIFT-MS, and DESI-MS techniques are discussed. These solutions have numerous applications in different fields of human activity due to their interesting properties. The advantages and disadvantages of these techniques, as well as trends in the development of direct analysis using them, are presented.
Simultaneous Control of Error Rates in fMRI Data Analysis
Kang, Hakmook; Blume, Jeffrey; Ombao, Hernando; Badre, David
2015-01-01
The key idea of statistical hypothesis testing is to fix, and thereby control, the Type I error (false positive) rate across samples of any size. Multiple comparisons inflate the global (family-wise) Type I error rate, and the traditional solution to maintaining control of the error rate is to increase the local (comparison-wise) Type II error (false negative) rates. However, in the analysis of human brain imaging data, the number of comparisons is so large that this solution breaks down: the local Type II error rate ends up being so large that scientifically meaningful analysis is precluded. Here we propose a novel solution to this problem: allow the Type I error rate to converge to zero along with the Type II error rate. It works because when the Type I error rate per comparison is very small, the accumulated (global) Type I error rate is also small. This solution is achieved by employing the Likelihood paradigm, which uses likelihood ratios to measure the strength of evidence on a voxel-by-voxel basis. In this paper, we provide theoretical and empirical justification for a likelihood approach to the analysis of human brain imaging data. In addition, we present extensive simulations that show the likelihood approach is viable, leading to ‘cleaner’-looking brain maps and operational superiority (lower average error rate). Finally, we include a case study on cognitive control related activation in the prefrontal cortex of the human brain. PMID:26272730
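As a rough illustration of the voxel-by-voxel likelihood approach (a sketch, not the authors' implementation): for Gaussian noise with known sigma, the likelihood ratio comparing an "active" mean delta against a null mean of zero reduces to an exponent of the difference of squared residuals, and a voxel is flagged when that ratio exceeds a strength-of-evidence benchmark k (k = 8 is a conventional benchmark in the likelihood paradigm; the sample values below are made up).

```python
import math

def likelihood_ratio(samples, delta, sigma=1.0):
    """Likelihood ratio L(mean=delta) / L(mean=0) for i.i.d. Gaussian
    samples with known sigma; the normalizing constants cancel,
    leaving the exponent of the difference of squared residuals."""
    s0 = sum(x * x for x in samples)                 # residuals under H0
    s1 = sum((x - delta) ** 2 for x in samples)      # residuals under H1
    return math.exp((s0 - s1) / (2.0 * sigma ** 2))

# A voxel is flagged as active when the evidence for the alternative
# exceeds the benchmark k.
active = [1.1, 0.9, 1.3, 0.8, 1.0]   # strong signal around delta = 1
null = [0.1, -0.2, 0.05, 0.0]        # noise around 0
k = 8.0
print(likelihood_ratio(active, 1.0) > k)   # True
print(likelihood_ratio(null, 1.0) > k)     # False
```

Note how, unlike a fixed per-comparison alpha, both error rates of this rule shrink as the per-voxel sample size grows, which is the behavior the abstract exploits.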
Direction Dependent Effects In Widefield Wideband Full Stokes Radio Imaging
NASA Astrophysics Data System (ADS)
Jagannathan, Preshanth; Bhatnagar, Sanjay; Rau, Urvashi; Taylor, Russ
2015-01-01
Synthesis imaging in radio astronomy is affected by instrumental and atmospheric effects that introduce direction-dependent gains. The antenna power pattern varies as a function of both time and frequency. When left uncorrected, the broadband, time-varying nature of the antenna power pattern leads to gross errors in full-Stokes imaging and flux estimation. In this poster we explore the errors that arise in image deconvolution when the time and frequency dependence of the antenna power pattern is not accounted for. Simulations were conducted with the wideband full-Stokes power pattern of the Very Large Array (VLA) antennas to demonstrate the level of errors arising from direction-dependent gains. We estimate that these errors will also be significant in wide-band full-polarization mosaic imaging, and algorithms to correct them will be crucial for many upcoming large-area surveys (e.g., the VLASS).
Human error and the search for blame
NASA Technical Reports Server (NTRS)
Denning, Peter J.
1989-01-01
Human error is a frequent topic in discussions about risks in using computer systems. A rational analysis of human error leads through the consideration of mistakes to standards that designers use to avoid mistakes that lead to known breakdowns. The irrational side, however, is more interesting. It conditions people to think that breakdowns are inherently wrong and that there is ultimately someone who is responsible. This leads to a search for someone to blame which diverts attention from: learning from the mistakes; seeing the limitations of current engineering methodology; and improving the discourse of design.
An anthropomorphic phantom for quantitative evaluation of breast MRI.
Freed, Melanie; de Zwart, Jacco A; Loud, Jennifer T; El Khouli, Riham H; Myers, Kyle J; Greene, Mark H; Duyn, Jeff H; Badano, Aldo
2011-02-01
In this study, the authors aim to develop a physical, tissue-mimicking phantom for quantitative evaluation of breast MRI protocols. The objective of this phantom is to address the need for improved standardization in breast MRI and provide a platform for evaluating the influence of image protocol parameters on lesion detection and discrimination. Quantitative comparisons between patient and phantom image properties are presented. The phantom is constructed using a mixture of lard and egg whites, resulting in a random structure with separate adipose- and glandular-mimicking components. T1 and T2 relaxation times of the lard and egg components of the phantom were estimated at 1.5 T from inversion recovery and spin-echo scans, respectively, using maximum-likelihood methods. The image structure was examined quantitatively by calculating and comparing spatial covariance matrices of phantom and patient images. A static, enhancing lesion was introduced by creating a hollow mold with stereolithography and filling it with a gadolinium-doped water solution. Measured phantom relaxation values fall within 2 standard errors of human values from the literature and are reasonably stable over 9 months of testing. Comparison of the covariance matrices of phantom and patient data demonstrates that the phantom and patient data have similar image structure. Their covariance matrices are the same to within error bars in the anterior-posterior direction and to within about two error bars in the right-left direction. The signal from the phantom's adipose-mimicking material can be suppressed using active fat-suppression protocols. A static, enhancing lesion can also be included with the ability to change morphology and contrast agent concentration. The authors have constructed a phantom and demonstrated its ability to mimic human breast images in terms of key physical properties that are relevant to breast MRI. 
This phantom provides a platform for the optimization and standardization of breast MRI imaging protocols for lesion detection and characterization.
Validation of simplified centre of mass models during gait in individuals with chronic stroke.
Huntley, Andrew H; Schinkel-Ivy, Alison; Aqui, Anthony; Mansfield, Avril
2017-10-01
Using a multiple-segment (full-body) kinematic model in clinical gait assessment is difficult when considering obstacles such as time and cost constraints. While simplified gait models have been explored in healthy individuals, no such work to date has been conducted in a stroke population. The aim of this study was to quantify the errors of simplified kinematic models for chronic stroke gait assessment. Sixteen individuals with chronic stroke (>6 months), outfitted with full-body kinematic markers, performed a series of gait trials. Three centre of mass models were computed: (i) a 13-segment whole-body model, (ii) a 3-segment head-trunk-pelvis model, and (iii) a 1-segment pelvis model. Root mean squared error differences were compared between models, along with correlations to measures of stroke severity. Error differences revealed that, while both simplified models were similar in the mediolateral direction, the head-trunk-pelvis model had less error in the anteroposterior direction and the pelvis model had less error in the vertical direction. There was some evidence that the head-trunk-pelvis model error is influenced in the mediolateral direction for individuals with more severe strokes, as a few significant correlations were observed between the head-trunk-pelvis model and measures of stroke severity. These findings demonstrate the utility and robustness of the pelvis model for clinical gait assessment in individuals with chronic stroke. Low error in the mediolateral and vertical directions is especially important when considering potential stability analyses during gait for this population, as lateral stability has been previously linked to fall risk. Copyright © 2017 Elsevier Ltd. All rights reserved.
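The root mean squared error comparison between centre-of-mass trajectories can be sketched as follows (the coordinates are hypothetical, not the study's data):

```python
import math

def rmse(a, b):
    """Root-mean-squared error between two equal-length trajectories
    (one coordinate of the centre-of-mass path, in mm)."""
    assert len(a) == len(b)
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

# Hypothetical vertical CoM coordinates (mm) over five gait samples:
full_body = [0.0, 2.0, 4.0, 2.0, 0.0]    # 13-segment reference model
pelvis_only = [0.0, 2.5, 4.5, 2.5, 0.0]  # 1-segment pelvis model
print(round(rmse(full_body, pelvis_only), 3))   # 0.387
```

Computing this per direction (mediolateral, anteroposterior, vertical) against the full-body reference is what allows the per-direction comparisons the abstract reports.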
Why do adult dogs (Canis familiaris) commit the A-not-B search error?
Sümegi, Zsófia; Kis, Anna; Miklósi, Ádám; Topál, József
2014-02-01
It has been recently reported that adult domestic dogs, like human infants, tend to commit perseverative search errors; that is, they select the previously rewarded empty location in the Piagetian A-not-B search task because of the experimenter's ostensive communicative cues. There is, however, an ongoing debate over whether these findings reveal that dogs can use human ostensive referential communication as a source of information or whether the phenomenon can be accounted for by "more simple" explanations like insufficient attention and learning based on local enhancement. In 2 experiments the authors systematically manipulated the type of human cueing (communicative or noncommunicative) adjacent to the A hiding place during both the A and B trials. Results highlight 3 important aspects of the dogs' A-not-B error: (a) search errors are influenced to a certain extent by dogs' motivation to retrieve the toy object; (b) human communicative and noncommunicative signals have different error-inducing effects; and (c) communicative signals presented at the A hiding place during the B trials but not during the A trials play a crucial role in inducing the A-not-B error, and it can be induced even without demonstrating repeated hiding events at location A. These findings further confirm the notion that the perseverative search error, at least partially, reflects a "ready-to-obey" attitude in the dog rather than insufficient attention and/or working memory.
A Qualitative Model of Human Interaction with Complex Dynamic Systems
NASA Technical Reports Server (NTRS)
Hess, Ronald A.
1987-01-01
A qualitative model describing human interaction with complex dynamic systems is developed. The model is hierarchical in nature and consists of three parts: a behavior generator, an internal model, and a sensory information processor. The behavior generator is responsible for action decomposition, turning higher level goals or missions into physical action at the human-machine interface. The internal model is an internal representation of the environment which the human is assumed to possess and is divided into four submodel categories. The sensory information processor is responsible for sensory composition. All three parts of the model act in consort to allow anticipatory behavior on the part of the human in goal-directed interaction with dynamic systems. Human workload and error are interpreted in this framework, and the familiar example of an automobile commute is used to illustrate the nature of the activity in the three model elements. Finally, with the qualitative model as a guide, verbal protocols from a manned simulation study of a helicopter instrument landing task are analyzed with particular emphasis on the effect of automation on human-machine performance.
Common medial frontal mechanisms of adaptive control in humans and rodents
Frank, Michael J.; Laubach, Mark
2013-01-01
In this report, we describe how common brain networks within the medial frontal cortex facilitate adaptive behavioral control in rodents and humans. We demonstrate that low frequency oscillations below 12 Hz are dramatically modulated after errors in humans over mid-frontal cortex and in rats within prelimbic and anterior cingulate regions of medial frontal cortex. These oscillations were phase-locked between medial frontal cortex and motor areas in both rats and humans. In rats, single neurons that encoded prior behavioral outcomes were phase-coherent with low-frequency field oscillations particularly after errors. Inactivating medial frontal regions in rats led to impaired behavioral adjustments after errors, eliminated the differential expression of low frequency oscillations after errors, and increased low-frequency spike-field coupling within motor cortex. Our results describe a novel mechanism for behavioral adaptation via low-frequency oscillations and elucidate how medial frontal networks synchronize brain activity to guide performance. PMID:24141310
#2 - An Empirical Assessment of Exposure Measurement Error ...
Background:
• Differing degrees of exposure error across pollutants
• Previous focus on quantifying and accounting for exposure error in single-pollutant models
• Examine exposure errors for multiple pollutants and provide insights on the potential for bias and attenuation of effect estimates in single- and bi-pollutant epidemiological models
The National Exposure Research Laboratory (NERL) Human Exposure and Atmospheric Sciences Division (HEASD) conducts research in support of the EPA's mission to protect human health and the environment. HEASD's research program supports Goal 1 (Clean Air) and Goal 4 (Healthy People) of the EPA's strategic plan. More specifically, our division conducts research to characterize the movement of pollutants from the source to contact with humans. Our multidisciplinary research program produces Methods, Measurements, and Models to identify relationships between and characterize processes that link source emissions, environmental concentrations, human exposures, and target-tissue dose. The impact of these tools is improved regulatory programs and policies for the EPA.
Pan, Hong-Wei; Li, Wei; Li, Rong-Guo; Li, Yong; Zhang, Yi; Sun, En-Hua
2018-01-01
Rapid identification and determination of the antibiotic susceptibility profiles of the infectious agents in patients with bloodstream infections are critical steps in choosing an effective targeted antibiotic for treatment. However, there has been minimal effort focused on developing combined methods for the simultaneous direct identification and antibiotic susceptibility determination of bacteria in positive blood cultures. In this study, we constructed a lysis-centrifugation-wash procedure to prepare a bacterial pellet from positive blood cultures, which can be used directly for identification by matrix-assisted laser desorption/ionization-time-of-flight mass spectrometry (MALDI-TOF MS) and antibiotic susceptibility testing by the Vitek 2 system. The method was evaluated using a total of 129 clinical bacteria-positive blood cultures. The whole sample preparation process could be completed in <15 min. The correct rate of direct MALDI-TOF MS identification was 96.49% for gram-negative bacteria and 97.22% for gram-positive bacteria. Vitek 2 antimicrobial susceptibility testing of gram-negative bacteria showed an agreement rate of antimicrobial categories of 96.89% with a minor error, major error, and very major error rate of 2.63, 0.24, and 0.24%, respectively. Category agreement of antimicrobials against gram-positive bacteria was 92.81%, with a minor error, major error, and very major error rate of 4.51, 1.22, and 1.46%, respectively. These results indicated that our direct antibiotic susceptibility analysis method worked well compared to the conventional culture-dependent laboratory method. Overall, this fast, easy, and accurate method can facilitate the direct identification and antibiotic susceptibility testing of bacteria in positive blood cultures.
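A sketch of how the reported category agreement and error rates can be computed from paired susceptibility calls (illustrative data; note that for simplicity all rates below use the total number of isolates as the denominator, whereas formal CLSI calculations use different denominators for major and very major errors):

```python
def error_rates(reference, test):
    """Categorical agreement and error rates for paired antimicrobial
    susceptibility calls ('S', 'I', or 'R').  Standard definitions:
    minor error      - one method reports I, the other S or R;
    major error      - reference S but test R (false resistance);
    very major error - reference R but test S (false susceptibility).
    """
    n = len(reference)
    agree = minor = major = very_major = 0
    for ref, t in zip(reference, test):
        if ref == t:
            agree += 1
        elif 'I' in (ref, t):
            minor += 1
        elif ref == 'S' and t == 'R':
            major += 1
        elif ref == 'R' and t == 'S':
            very_major += 1
    return {'agreement': agree / n, 'minor': minor / n,
            'major': major / n, 'very_major': very_major / n}

ref  = ['S', 'S', 'R', 'R', 'I', 'S', 'R', 'S', 'S', 'S']
test = ['S', 'S', 'R', 'S', 'I', 'R', 'R', 'I', 'S', 'S']
print(error_rates(ref, test))
```

Applied per antimicrobial across the 129 cultures, this is the kind of tabulation behind the agreement and error percentages the abstract reports.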
Alastruey, Jordi; Hunt, Anthony A E; Weinberg, Peter D
2014-01-01
We present a novel analysis of arterial pulse wave propagation that combines traditional wave intensity analysis with identification of Windkessel pressures to account for the effect on the pressure waveform of peripheral wave reflections. Using haemodynamic data measured in vivo in the rabbit or generated numerically in models of human compliant vessels, we show that traditional wave intensity analysis identifies the timing, direction and magnitude of the predominant waves that shape aortic pressure and flow waveforms in systole, but fails to identify the effect of peripheral reflections. These reflections persist for several cardiac cycles and make up most of the pressure waveform, especially in diastole and early systole. Ignoring peripheral reflections leads to an erroneous indication of a reflection-free period in early systole and additional error in the estimates of (i) pulse wave velocity at the ascending aorta given by the PU–loop method (9.5% error) and (ii) transit time to a dominant reflection site calculated from the wave intensity profile (27% error). These errors decreased to 1.3% and 10%, respectively, when accounting for peripheral reflections. Using our new analysis, we investigate the effect of vessel compliance and peripheral resistance on wave intensity, peripheral reflections and reflections originating in previous cardiac cycles. PMID:24132888
de Oliveira Isac Moraes, Gabriel; da Silva, Larissa Meirelles Rodrigues; dos Santos-Neto, Alvaro José; Florenzano, Fábio Herbst; Figueiredo, Eduardo Costa
2013-09-01
A new restricted access molecularly imprinted polymer coated with bovine serum albumin (RAMIP-BSA) was developed, characterized, and used for direct analysis of chlorpromazine in human plasma samples. The RAMIP-BSA was synthesized using chlorpromazine, methacrylic acid, and ethylene glycol dimethacrylate as template, functional monomer, and cross-linker, respectively. Glycerol dimethacrylate and hydroxy methyl methacrylate were used to promote a hydrophilic surface (high density of hydroxyl groups). Afterward, the polymer was coated with BSA using glutaraldehyde as cross-linker, resulting in a protein chemical shield around it. The material was able to eliminate ca. 99% of protein when a 44-mg mL(-1) BSA aqueous solution was passed through it. The RAMIP-BSA was packed in a column and used for direct analysis of chlorpromazine in human plasma samples in an online column switching high-performance liquid chromatography system. The analytical calibration curve was prepared in a pool of human plasma samples with chlorpromazine concentrations ranging from 30 to 350 μg L(-1). The correlation coefficient obtained was 0.995 and the limit of quantification was 30 μg L(-1). Intra-day and inter-day precision and accuracy presented variation coefficients and relative errors lower than 15% and within -15 and 15%, respectively. The sample throughput was 3 h(-1) (sample preparation and chromatographic analysis steps) and the same RAMIP-BSA column was efficiently used for about 90 cycles.
The economics of health care quality and medical errors.
Andel, Charles; Davidow, Stephen L; Hollander, Mark; Moreno, David A
2012-01-01
Hospitals have been looking for ways to improve quality and operational efficiency and cut costs for nearly three decades, using a variety of quality improvement strategies. However, based on recent reports, approximately 200,000 Americans die each year from preventable medical errors, including facility-acquired conditions, and millions more may experience errors. In 2008, medical errors cost the United States $19.5 billion. About 87 percent, or $17 billion, was directly associated with additional medical costs, including ancillary services, prescription drug services, and inpatient and outpatient care, according to a study sponsored by the Society of Actuaries and conducted by Milliman in 2010. Additional costs of $1.4 billion were attributed to increased mortality rates, with $1.1 billion, or 10 million days of lost productivity, from missed work based on short-term disability claims. The authors estimate that the economic impact is much higher, perhaps nearly $1 trillion annually when quality-adjusted life years (QALYs) are applied to those who die. Using the Institute of Medicine's (IOM) estimate of 98,000 deaths due to preventable medical errors annually in its 1998 report, To Err Is Human, and an average of ten lost years of life at $75,000 to $100,000 per year, there is a loss of $73.5 billion to $98 billion in QALYs for those deaths, conservatively. These numbers are much greater than those we cite from studies that explore the direct costs of medical errors. And if the estimate of a recent Health Affairs article is correct, with preventable deaths being ten times the IOM estimate, the cost is $735 billion to $980 billion. Quality care is less expensive care. It is better, more efficient, and by definition, less wasteful. It is the right care, at the right time, every time. It should mean that far fewer patients are harmed or injured. Obviously, quality care is not being delivered consistently throughout U.S. hospitals. Whatever the measure, poor quality is costing payers and society a great deal. However, health care leaders and professionals are focusing on quality and patient safety in ways they never have before because the economics of quality have changed substantially.
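The QALY arithmetic cited above is easy to reproduce:

```python
def qaly_loss(deaths, years_lost=10, value_low=75_000, value_high=100_000):
    """Dollar value of quality-adjusted life years lost, as a
    (low, high) range: deaths x years lost x dollar value per year."""
    return (deaths * years_lost * value_low,
            deaths * years_lost * value_high)

low, high = qaly_loss(98_000)        # IOM estimate
print(low, high)                     # 73.5 and 98 billion
low10, high10 = qaly_loss(980_000)   # ten times the IOM estimate
print(low10, high10)                 # 735 and 980 billion
```

The two ranges match the $73.5-98 billion and $735-980 billion figures quoted in the abstract.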
NASA Astrophysics Data System (ADS)
Bozkurt, Alican; Kose, Kivanc; Fox, Christi A.; Dy, Jennifer; Brooks, Dana H.; Rajadhyaksha, Milind
2016-02-01
Study of the stratum corneum (SC) in human skin is important for research in barrier structure and function, drug delivery, and the water permeability of skin. The optical sectioning and high resolution of reflectance confocal microscopy (RCM) allow visual examination of the SC non-invasively. Here, we present an unsupervised segmentation algorithm that can automatically delineate the thickness of the SC in RCM images of human skin in vivo. We mimic clinicians' visual process by applying a complex wavelet transform over non-overlapping local regions of size 16 × 16 μm, called tiles, and analyzing the textural changes between consecutive tiles in the axial (depth) direction. We use the dual-tree complex wavelet transform (DT-CWT) to represent textural structures in each tile. This transform is almost shift-invariant and directionally selective, which makes it highly efficient for texture representation. Using the DT-CWT, we decompose each tile into 6 directional sub-bands with orientations of ±15°, ±45°, and ±75° and a low-pass band, which is a decimated version of the input. We apply 3 scales of decomposition by recursively transforming the low-pass bands and obtain 18 bands of different directionality at different scales. We then calculate the mean and variance of each band, resulting in a feature vector of 36 entries. Feature vectors obtained for each stack of tiles in the axial direction are then clustered using spectral clustering in order to detect the textural changes in the depth direction. Testing on a set of 15 RCM stacks produced a mean error of 5.45 ± 1.32 μm, compared to the "ground truth" segmentation provided by a clinical expert reader.
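The 36-entry feature vector step can be sketched as follows (a simplified stand-in: the sub-bands are supplied as plain lists of complex coefficients rather than computed by an actual DT-CWT):

```python
def tile_features(subbands):
    """36-entry feature vector for one tile: the mean and variance of
    the coefficient magnitudes of each of the 18 complex sub-bands
    (6 orientations x 3 scales) described in the abstract."""
    assert len(subbands) == 18
    feats = []
    for band in subbands:
        mags = [abs(c) for c in band]              # complex magnitude
        mean = sum(mags) / len(mags)
        var = sum((m - mean) ** 2 for m in mags) / len(mags)
        feats.extend([mean, var])
    return feats

# Hypothetical sub-bands, each a flat list of complex coefficients:
bands = [[complex(i, -i) for i in range(16)] for _ in range(18)]
print(len(tile_features(bands)))   # 36
```

One such vector per tile, stacked along depth, is what the spectral clustering step then operates on.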
Cutting the Cord: Discrimination and Command Responsibility in Autonomous Lethal Weapons
2014-02-13
machine responses to identical stimuli, and it was the job of a third party human “witness” to determine which participant was man and which was...machines may be error free, but there are potential benefits to be gained through autonomy if machines can meet or exceed human performance in...lieu of human operators and reap the benefits that autonomy provides. Human and Machine Error It would be foolish to assert that either humans
Apparatus for microbiological sampling. [including automatic swabbing
NASA Technical Reports Server (NTRS)
Wilkins, J. R.; Mills, S. M. (Inventor)
1974-01-01
An automatic apparatus is described for microbiologically sampling surfaces using a cotton swab, eliminating human error. The apparatus includes a self-powered transport device, such as a motor-driven wheeled cart, which mounts a swabbing motor drive for a crank arm that supports a swab at its free end. The swabbing motor is pivotably mounted, and an actuator rod, movable in response to the cart traveling a predetermined distance, lifts the swab from the surface being sampled and reverses the cart's direction of travel.
NASA Astrophysics Data System (ADS)
Jung, Jae Hong; Jung, Joo-Young; Bae, Sun Hyun; Moon, Seong Kwon; Cho, Kwang Hwan
2016-10-01
The purpose of this study was to compare patient setup deviations for different image-guided protocols (weekly vs. biweekly) used in TomoDirect three-dimensional conformal radiotherapy (TD-3DCRT) for whole-breast radiation therapy (WBRT). A total of 138 defined megavoltage computed tomography (MVCT) image sets from 46 breast cancer cases were divided into two groups based on the imaging acquisition times: weekly or biweekly. The mean error, three-dimensional setup displacement error (3D-error), systematic error (Σ), and random error (σ) were calculated for each group. The 3D-errors were 4.29 ± 1.11 mm and 5.02 ± 1.85 mm for the weekly and biweekly groups, respectively; the biweekly error was 14.6% higher than the weekly error. The systematic errors in the roll angle and the x, y, and z directions were 0.48°, 1.72 mm, 2.18 mm, and 1.85 mm for the weekly protocol and 0.21°, 1.24 mm, 1.39 mm, and 1.85 mm for the biweekly protocol. Random errors in the roll angle and the x, y, and z directions were 25.7%, 40.6%, 40.0%, and 40.8% higher in the biweekly group than in the weekly group. For the x, y, and z directions, the proportions of treatments with setup displacements of less than 5 mm were 98.6%, 91.3%, and 94.2% in the weekly group and 94.2%, 89.9%, and 82.6% in the biweekly group. Moreover, the proportions of roll angles within 0°-1° were 79.7% and 89.9% in the weekly and biweekly groups, respectively. Overall, the evaluation of setup deviations for the two protocols revealed no significant differences (p > 0.05). Reducing the frequency of MVCT imaging could have promising effects on imaging doses and machine times during treatment. However, the biweekly protocol was associated with increased random setup deviations during treatment. We have demonstrated a biweekly protocol of TD-3DCRT for WBRT, and we anticipate that our method may provide an alternative approach for accounting for the uncertainties in the patient setup.
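One common convention for computing the population systematic error Σ and random error σ from daily setup displacements (Σ as the standard deviation of per-patient mean displacements, σ as the root-mean-square of per-patient standard deviations; the displacement values below are made up) can be sketched as:

```python
import math
import statistics as st

def setup_errors(per_patient_shifts):
    """Population systematic (Sigma) and random (sigma) setup errors
    from per-patient lists of daily displacements in one direction
    (mm): Sigma is the SD of the per-patient means and sigma is the
    RMS of the per-patient SDs."""
    means = [st.mean(p) for p in per_patient_shifts]
    sds = [st.stdev(p) for p in per_patient_shifts]
    Sigma = st.stdev(means)
    sigma = math.sqrt(sum(s * s for s in sds) / len(sds))
    return Sigma, sigma

# Three hypothetical patients, three fractions each (mm):
shifts = [[1.0, 2.0, 1.5], [-0.5, 0.5, 0.0], [2.0, 3.0, 2.5]]
Sigma, sigma = setup_errors(shifts)
print(round(Sigma, 2), round(sigma, 2))   # 1.26 0.5
```

Running this per direction (x, y, z) and for the roll angle would yield the per-protocol Σ and σ tables the abstract summarizes.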
Human error identification for laparoscopic surgery: Development of a motion economy perspective.
Al-Hakim, Latif; Sevdalis, Nick; Maiping, Tanaphon; Watanachote, Damrongpan; Sengupta, Shomik; Dissaranan, Charuspong
2015-09-01
This study postulates that traditional human error identification techniques fail to consider motion economy principles and, accordingly, their applicability in operating theatres may be limited. This study addresses this gap in the literature with a dual aim. First, it identifies the principles of motion economy that suit the operative environment and second, it develops a new error mode taxonomy for human error identification techniques which recognises motion economy deficiencies affecting the performance of surgeons and predisposing them to errors. A total of 30 principles of motion economy were developed and categorised into five areas. A hierarchical task analysis was used to break down main tasks of a urological laparoscopic surgery (hand-assisted laparoscopic nephrectomy) to their elements and the new taxonomy was used to identify errors and their root causes resulting from violation of motion economy principles. The approach was prospectively tested in 12 observed laparoscopic surgeries performed by 5 experienced surgeons. A total of 86 errors were identified and linked to the motion economy deficiencies. Results indicate the developed methodology is promising. Our methodology allows error prevention in surgery and the developed set of motion economy principles could be useful for training surgeons on motion economy principles. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Endpoint Accuracy in Manual Control of a Steerable Needle.
van de Berg, Nick J; Dankelman, Jenny; van den Dobbelsteen, John J
2017-02-01
To study the ability of a human operator to manually correct for errors in the needle insertion path without partial withdrawal of the needle by means of an active, tip-articulated steerable needle. The needle is composed of a 1.32-mm outer-diameter cannula, with a flexure joint near the tip, and a retractable stylet. The bending stiffness of the needle resembles that of a 20-gauge hypodermic needle. The needle functionality was evaluated in manual insertions by steering to predefined targets at a lateral displacement of 20 mm from the straight insertion line. Steering tasks were conducted in 5 directions and 2 tissue simulants under image guidance from a camera. The repeatability of instrument actuations was assessed during 100-mm-deep automated insertions with a linear motor. In addition to tip position, tip angles were tracked during the insertions. The targeting error (mean absolute error ± standard deviation) during manual steering to 5 different targets in stiff tissue was 0.5 ± 1.1 mm. This variability in manual tip placement (1.1 mm) was less than the variability among automated insertions (1.4 mm) in the same tissue type. An increased tissue stiffness resulted in an increased lateral tip displacement. The tip angle was directly controlled by the user interface and remained unaffected by the tissue stiffness. This study demonstrates the ability to manually steer needles to predefined target locations under image guidance. Copyright © 2016 SIR. Published by Elsevier Inc. All rights reserved.
Modeling Types of Pedal Applications Using a Driving Simulator.
Wu, Yuqing; Boyle, Linda Ng; McGehee, Daniel; Roe, Cheryl A; Ebe, Kazutoshi; Foley, James
2015-11-01
The aim of this study was to examine variations in drivers' foot behavior and identify factors associated with pedal misapplications. Few studies have focused on the foot behavior while in the vehicle and the mishaps that a driver can encounter during a potentially hazardous situation. A driving simulation study was used to understand how drivers move their right foot toward the pedals. The study included data from 43 drivers as they responded to a series of rapid traffic signal phase changes. Pedal application types were classified as (a) direct hit, (b) hesitated, (c) corrected trajectory, and (d) pedal errors (incorrect trajectories, misses, slips, or pressed both pedals). A mixed-effects multinomial logit model was used to predict the likelihood of one of these pedal applications, and linear mixed models with repeated measures were used to examine the response time and pedal duration given the various experimental conditions (stimuli color and location). Younger drivers had higher probabilities of direct hits when compared to other age groups. Participants tended to have more pedal errors when responding to a red signal or when the signal appeared to be closer. Traffic signal phases and locations were associated with pedal response time and duration. The response time and pedal duration affected the likelihood of being in one of the four pedal application types. Findings from this study suggest that age-related and situational factors may play a role in pedal errors, and the stimuli locations could affect the type of pedal application. © 2015, Human Factors and Ergonomics Society.
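The mixed-effects multinomial logit described above assigns each pedal application to one of the four classes by comparing linear predictors against a reference category. A minimal fixed-effects-only sketch of that probability computation; the coefficients and predictor names below are purely illustrative and are not taken from the study:

```python
import math

# Hypothetical fixed-effect coefficients for three non-reference pedal
# application classes (reference class: "direct_hit"). Values are
# illustrative, not estimates from the study.
COEFS = {
    "hesitated":   {"intercept": -1.0, "red_signal": 0.3, "near_signal": 0.2},
    "corrected":   {"intercept": -1.5, "red_signal": 0.4, "near_signal": 0.3},
    "pedal_error": {"intercept": -2.5, "red_signal": 0.8, "near_signal": 0.6},
}

def class_probabilities(red_signal, near_signal):
    """Multinomial-logit (softmax) probabilities over the four classes."""
    x = {"red_signal": float(red_signal), "near_signal": float(near_signal)}
    # Linear predictor per non-reference class; the reference class gets 0.
    utilities = {"direct_hit": 0.0}
    for cls, b in COEFS.items():
        utilities[cls] = b["intercept"] + sum(b[k] * x[k] for k in x)
    z = sum(math.exp(u) for u in utilities.values())
    return {cls: math.exp(u) / z for cls, u in utilities.items()}
```

With positive coefficients on the red-signal and near-signal indicators, the sketch reproduces the qualitative finding that those conditions shift probability mass away from direct hits and toward pedal errors.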
Lobach, Iryna; Fan, Ruzong; Carroll, Raymond J.
2011-01-01
With the advent of dense single nucleotide polymorphism genotyping, population-based association studies have become the major tools for identifying human disease genes and for fine gene mapping of complex traits. We develop a genotype-based approach for association analysis of case-control studies of gene-environment interactions in the case when environmental factors are measured with error and genotype data are available on multiple genetic markers. To directly use the observed genotype data, we propose two genotype-based models: genotype effect and additive effect models. Our approach offers several advantages. First, the proposed risk functions can directly incorporate the observed genotype data while modeling the linkage disequilibrium information in the regression coefficients, thus eliminating the need to infer haplotype phase. Compared with the haplotype-based approach, an estimating procedure based on the proposed methods can be much simpler and significantly faster. In addition, there is no potential risk due to haplotype phase estimation. Further, by fitting the proposed models, it is possible to analyze the risk alleles/variants of complex diseases, including their dominant or additive effects. To model measurement error, we adopt the pseudo-likelihood method by Lobach et al. [2008]. Performance of the proposed method is examined using simulation experiments. An application of our method is illustrated using a population-based case-control study of association between calcium intake and the risk of colorectal adenoma development. PMID:21031455
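The additive effect model described above can be sketched as a logistic risk function in which a biallelic genotype is coded as a minor-allele count. The coefficients and the gene-environment interaction term below are illustrative placeholders, and the paper's measurement-error correction is omitted:

```python
import math

def additive_genotype_code(genotype):
    """Code a biallelic genotype as the count of minor alleles (0, 1, or 2)."""
    return {"AA": 0, "Aa": 1, "aa": 2}[genotype]

def disease_risk(genotype, environment, b0=-2.0, b_g=0.4, b_e=0.3, b_ge=0.2):
    """Logistic disease risk with an additive genetic effect and a
    gene-environment interaction. All coefficients are invented for
    illustration; the pseudo-likelihood measurement-error correction
    from the paper is not modeled here."""
    g = additive_genotype_code(genotype)
    eta = b0 + b_g * g + b_e * environment + b_ge * g * environment
    return 1.0 / (1.0 + math.exp(-eta))
```

Because the genotype enters as a count, each additional minor allele shifts the log-odds by the same amount, which is what "additive effect" means in this setting.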
Managing human error in aviation.
Helmreich, R L
1997-05-01
Crew resource management (CRM) programs were developed to address team and leadership aspects of piloting modern airplanes. The goal is to reduce errors through team work. Human factors research and social, cognitive, and organizational psychology are used to develop programs tailored for individual airlines. Flight crews study accident case histories, group dynamics, and human error. Simulators provide pilots with the opportunity to solve complex flight problems. CRM in the simulator is called line-oriented flight training (LOFT). In automated cockpits CRM promotes the idea of automation as a crew member. Cultural aspects of aviation include professional, business, and national culture. The aviation CRM model has been adapted for training surgeons and operating room staff in human factors.
Empirical Analysis of Systematic Communication Errors.
1981-09-01
...human components in communication systems. (Systematic errors were defined to be those that occur regularly in human communication links.) ...phase of the human communication process, focusing on the linkage between a specific piece of information (and the receiver) and the transmission ... communication flow. (2) Exchange. Exchange is the next phase in human communication and entails a concerted effort on the part of the sender and receiver to share...
NASA Technical Reports Server (NTRS)
DeMott, Diana
2013-01-01
Compared to equipment designed to perform the same function over and over, humans are just not as reliable. Computers and machines perform the same action in the same way repeatedly getting the same result, unless equipment fails or a human interferes. Humans who are supposed to perform the same actions repeatedly often perform them incorrectly due to a variety of issues including: stress, fatigue, illness, lack of training, distraction, acting at the wrong time, not acting when they should, not following procedures, misinterpreting information or inattention to detail. Why not use robots and automatic controls exclusively if human error is so common? In an emergency or off normal situation that the computer, robotic element, or automatic control system is not designed to respond to, the result is failure unless a human can intervene. The human in the loop may be more likely to cause an error, but is also more likely to catch the error and correct it. When it comes to unexpected situations, or performing multiple tasks outside the defined mission parameters, humans are the only viable alternative. Human Reliability Assessments (HRA) identifies ways to improve human performance and reliability and can lead to improvements in systems designed to interact with humans. Understanding the context of the situation that can lead to human errors, which include taking the wrong action, no action or making bad decisions provides additional information to mitigate risks. With improved human reliability comes reduced risk for the overall operation or project.
Zhang, Xudong
2002-10-01
This work describes a new approach that allows an angle-domain human movement model to generate, via forward kinematics, Cartesian-space human movement representation with otherwise inevitable end-point offset nullified but much of the kinematic authenticity retained. The approach incorporates a rectification procedure that determines the minimum postural angle change at the final frame to correct the end-point offset, and a deformation procedure that deforms the angle profile accordingly to preserve maximum original kinematic authenticity. Two alternative deformation schemes, named amplitude-proportional (AP) and time-proportional (TP) schemes, are proposed and formulated. As an illustration and empirical evaluation, the proposed approach, along with two deformation schemes, was applied to a set of target-directed right-hand reaching movements that had been previously measured and modeled. The evaluation showed that both deformation schemes nullified the final frame end-point offset and significantly reduced time-averaged position errors for the end-point as well as the most distal intermediate joint while causing essentially no change in the remaining joints. A comparison between the two schemes based on time-averaged joint and end-point position errors indicated that overall the TP scheme outperformed the AP scheme. In addition, no statistically significant difference in time-averaged angle error was identified between the raw prediction and either of the deformation schemes, nor between the two schemes themselves, suggesting minimal angle-domain distortion incurred by the deformation.
Fouragnan, Elsa; Retzler, Chris; Philiastides, Marios G
2018-03-25
Learning occurs when an outcome differs from expectations, generating a reward prediction error signal (RPE). The RPE signal has been hypothesized to simultaneously embody the valence of an outcome (better or worse than expected) and its surprise (how far from expectations). Nonetheless, growing evidence suggests that separate representations of the two RPE components exist in the human brain. Meta-analyses provide an opportunity to test this hypothesis and directly probe the extent to which the valence and surprise of the error signal are encoded in separate or overlapping networks. We carried out several meta-analyses on a large set of fMRI studies investigating the neural basis of RPE, locked at decision outcome. We identified two valence learning systems by pooling studies searching for differential neural activity in response to categorical positive-versus-negative outcomes. The first valence network (negative > positive) involved areas regulating alertness and switching behaviours such as the midcingulate cortex, the thalamus and the dorsolateral prefrontal cortex whereas the second valence network (positive > negative) encompassed regions of the human reward circuitry such as the ventral striatum and the ventromedial prefrontal cortex. We also found evidence of a largely distinct surprise-encoding network including the anterior cingulate cortex, anterior insula and dorsal striatum. Together with recent animal and electrophysiological evidence this meta-analysis points to a sequential and distributed encoding of different components of the RPE signal, with potentially distinct functional roles. © 2018 Wiley Periodicals, Inc.
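The hypothesized split of the RPE into valence and surprise amounts to taking the sign and the magnitude of the prediction error; a minimal Rescorla-Wagner-style sketch, with an illustrative learning rate:

```python
def rpe_components(reward, expected):
    """Split a reward prediction error (delta) into its valence (sign)
    and surprise (magnitude) components."""
    delta = reward - expected
    valence = "positive" if delta > 0 else "negative" if delta < 0 else "neutral"
    surprise = abs(delta)
    return delta, valence, surprise

def update_value(expected, reward, alpha=0.1):
    """Rescorla-Wagner value update driven by the signed RPE.
    The learning rate alpha is an illustrative choice."""
    delta = reward - expected
    return expected + alpha * delta
```

The meta-analytic claim is that the brain does not carry delta as a single scalar: valence-sensitive and surprise-sensitive networks would correspond to the two separate return values above.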
Mirzaei Aliabadi, Mostafa; Aghaei, Hamed; Kalatpour, Omid; Soltanian, Ali Reza; SeyedTabib, Maryam
2018-05-18
Mines are a dangerous workplace worldwide with a high accident rate. According to the Statistical Center of Iran, the number of occupational accidents in Iranian mines has increased in recent years. This study determined and explained human and organizational deficiencies influencing Iranian mining accidents. In this study, the data associated with 305 mining accidents were investigated. The data were analyzed based on a systems analysis approach to identify critical deficiencies in organizational influences, unsafe supervision, preconditions for unsafe acts, and workers' unsafe acts. Partial Least Squares Structural Equation Modeling (PLS-SEM) was utilized for modeling the interactions between these deficiencies. It was demonstrated that organizational deficiencies had a direct positive effect on workers' violations (path coefficient=0.16) and workers' errors (path coefficient=0.23). The effect of unsafe supervision on workers' violations and workers' errors was also significant, with path coefficients of 0.14 and 0.20. Likewise, preconditions for unsafe acts also had a significant effect on both workers' violations (path coefficient=0.16) and workers' errors (path coefficient=0.21). Moreover, organizational deficiencies had an indirect positive effect on workers' unsafe acts mediated by unsafe supervision and preconditions for unsafe acts. Among the variables examined in the current study, organizational influences had the strongest impacts on workers' unsafe acts. Organizational deficiencies are the main causes of accidents in mining sectors that affect all other aspects of system safety. For preventing occupational accidents, organizational deficiencies should be modified first.
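In a path model of this kind, an indirect effect is the product of the path coefficients along the mediation chain, and it adds to the direct effect. A small sketch using the reported supervision-to-errors and direct organization-to-errors coefficients; the organization-to-supervision coefficient is a placeholder, since the abstract does not report it:

```python
# Path coefficients: two values are taken from the abstract; the
# organization -> supervision path is an invented placeholder.
PATHS = {
    ("organization", "supervision"): 0.50,  # placeholder, not reported
    ("supervision", "errors"): 0.20,        # reported
    ("organization", "errors"): 0.23,       # reported (direct effect)
}

def indirect_effect(chain, paths):
    """Product of path coefficients along a mediation chain."""
    effect = 1.0
    for src, dst in zip(chain, chain[1:]):
        effect *= paths[(src, dst)]
    return effect

def total_effect(direct, indirect):
    """Total effect = direct effect + indirect (mediated) effect."""
    return direct + indirect
```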
Identifying Human Factors Issues in Aircraft Maintenance Operations
NASA Technical Reports Server (NTRS)
Veinott, Elizabeth S.; Kanki, Barbara G.; Shafto, Michael G. (Technical Monitor)
1995-01-01
Maintenance operations incidents submitted to the Aviation Safety Reporting System (ASRS) between 1986-1992 were systematically analyzed in order to identify issues relevant to human factors and crew coordination. This exploratory analysis involved 95 ASRS reports which represented a wide range of maintenance incidents. The reports were coded and analyzed according to the type of error (e.g, wrong part, procedural error, non-procedural error), contributing factors (e.g., individual, within-team, cross-team, procedure, tools), result of the error (e.g., aircraft damage or not) as well as the operational impact (e.g., aircraft flown to destination, air return, delay at gate). The main findings indicate that procedural errors were most common (48.4%) and that individual and team actions contributed to the errors in more than 50% of the cases. As for operational results, most errors were either corrected after landing at the destination (51.6%) or required the flight crew to stop enroute (29.5%). Interactions among these variables are also discussed. This analysis is a first step toward developing a taxonomy of crew coordination problems in maintenance. By understanding what variables are important and how they are interrelated, we may develop intervention strategies that are better tailored to the human factor issues involved.
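The coding scheme described (error type, contributing factors, operational result) reduces to cross-tabulating categorical codes; a minimal sketch over hypothetical records, not actual ASRS data:

```python
from collections import Counter

# Hypothetical coded incident records in the style described in the
# abstract (invented for illustration, not actual ASRS reports).
incidents = [
    {"error": "procedural", "result": "corrected_at_destination"},
    {"error": "procedural", "result": "enroute_stop"},
    {"error": "non_procedural", "result": "corrected_at_destination"},
    {"error": "wrong_part", "result": "enroute_stop"},
]

def proportions(records, field):
    """Proportion of records in each category of one coded field,
    i.e. the kind of percentage breakdown reported in the abstract."""
    counts = Counter(r[field] for r in records)
    n = len(records)
    return {k: v / n for k, v in counts.items()}
```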
Managing Errors to Reduce Accidents in High Consequence Networked Information Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ganter, J.H.
1999-02-01
Computers have always helped to amplify and propagate errors made by people. The emergence of Networked Information Systems (NISs), which allow people and systems to quickly interact worldwide, has made understanding and minimizing human error more critical. This paper applies concepts from system safety to analyze how hazards (from hackers to power disruptions) penetrate NIS defenses (e.g., firewalls and operating systems) to cause accidents. Such events usually result from both active, easily identified failures and more subtle latent conditions that have resided in the system for long periods. Both active failures and latent conditions result from human errors. We classify these into several types (slips, lapses, mistakes, etc.) and provide NIS examples of how they occur. Next we examine error minimization throughout the NIS lifecycle, from design through operation to reengineering. At each stage, steps can be taken to minimize the occurrence and effects of human errors. These include defensive design philosophies, architectural patterns to guide developers, and collaborative design that incorporates operational experiences and surprises into design efforts. We conclude by looking at three aspects of NISs that will cause continuing challenges in error and accident management: immaturity of the industry, limited risk perception, and resource tradeoffs.
Sarter, Nadine
2008-06-01
The goal of this article is to illustrate the problem-driven, cumulative, and highly interdisciplinary nature of human factors research by providing a brief overview of the work on mode errors on modern flight decks over the past two decades. Mode errors on modern flight decks were first reported in the late 1980s. Poor feedback, inadequate mental models of the automation, and the high degree of coupling and complexity of flight deck systems were identified as main contributors to these breakdowns in human-automation interaction. Various improvements of design, training, and procedures were proposed to address these issues. The author describes when and why the problem of mode errors surfaced, summarizes complementary research activities that helped identify and understand the contributing factors to mode errors, and describes some countermeasures that have been developed in recent years. This brief review illustrates how one particular human factors problem in the aviation domain enabled various disciplines and methodological approaches to contribute to a better understanding of, as well as provide better support for, effective human-automation coordination. Converging operations and interdisciplinary collaboration over an extended period of time are hallmarks of successful human factors research. The reported body of research can serve as a model for future research and as a teaching tool for students in this field of work.
The Swiss cheese model of adverse event occurrence--Closing the holes.
Stein, James E; Heiss, Kurt
2015-12-01
Traditional surgical attitude regarding error and complications has focused on individual failings. Human factors research has brought new and significant insights into the occurrence of error in healthcare, helping us identify systemic problems that injure patients while enhancing individual accountability and teamwork. This article introduces human factors science and its applicability to teamwork, surgical culture, medical error, and individual accountability. Copyright © 2015 Elsevier Inc. All rights reserved.
Behind Human Error: Cognitive Systems, Computers and Hindsight
1994-12-01
...evaluations; organize and/or conduct workshops and conferences. CSERIAC is a Department of Defense Information Analysis Center sponsored by the Defense... Process; Neutral Observer Criteria; Error Analysis as Causal Judgment; Error as Information; A Fundamental Surprise; What is Human... Kahnemann, 1974), and in risk analysis (Dougherty and Fragola, 1990). The discussions have continued in a wide variety of forums, including the
Contributions of the cerebellum and the motor cortex to acquisition and retention of motor memories
Herzfeld, David J.; Pastor, Damien; Haith, Adrian M.; Rossetti, Yves; Shadmehr, Reza; O’Shea, Jacinta
2014-01-01
We investigated the contributions of the cerebellum and the motor cortex (M1) to acquisition and retention of human motor memories in a force field reaching task. We found that anodal transcranial direct current stimulation (tDCS) of the cerebellum, a technique that is thought to increase neuronal excitability, increased the ability to learn from error and form an internal model of the field, while cathodal cerebellar stimulation reduced this error-dependent learning. In addition, cathodal cerebellar stimulation disrupted the ability to respond to error within a reaching movement, reducing the gain of the sensory-motor feedback loop. By contrast, anodal M1 stimulation had no significant effects on these variables. During sham stimulation, early in training the acquired motor memory exhibited rapid decay in error-clamp trials. With further training the rate of decay decreased, suggesting that with training the motor memory was transformed from a labile to a more stable state. Surprisingly, neither cerebellar nor M1 stimulation altered these decay patterns. Participants returned 24 hours later and were re-tested in error-clamp trials without stimulation. The cerebellar group that had learned the task with cathodal stimulation exhibited significantly impaired retention, and retention was not improved by M1 anodal stimulation. In summary, non-invasive cerebellar stimulation resulted in polarity-dependent up- or down-regulation of error-dependent motor learning. In addition, cathodal cerebellar stimulation during acquisition impaired the ability to retain the motor memory overnight. Thus, in the force field task we found a critical role for the cerebellum in both formation of motor memory and its retention. PMID:24816533
Human-computer interaction in multitask situations
NASA Technical Reports Server (NTRS)
Rouse, W. B.
1977-01-01
Human-computer interaction in multitask decisionmaking situations is considered, and it is proposed that humans and computers have overlapping responsibilities. Queueing theory is employed to model this dynamic approach to the allocation of responsibility between human and computer. Results of simulation experiments are used to illustrate the effects of several system variables including number of tasks, mean time between arrivals of action-evoking events, human-computer speed mismatch, probability of computer error, probability of human error, and the level of feedback between human and computer. Current experimental efforts are discussed and the practical issues involved in designing human-computer systems for multitask situations are considered.
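The allocation-of-responsibility idea can be caricatured with a Monte Carlo sketch in which the computer handles each action-evoking event and a human supervisor may catch computer errors while introducing errors of its own. This is an illustration of the interacting error probabilities named in the abstract, not a reconstruction of Rouse's queueing model:

```python
import random

def simulate_allocation(n_tasks, p_computer_error, p_human_error,
                        p_human_catches, seed=0):
    """Monte Carlo sketch of overlapping human-computer responsibility:
    the computer handles each task; when the computer errs, the human
    may catch and correct the error, subject to the human's own error
    probability. Returns the fraction of tasks completed correctly."""
    rng = random.Random(seed)
    correct = 0
    for _ in range(n_tasks):
        if rng.random() >= p_computer_error:
            correct += 1                      # computer handled it cleanly
        elif rng.random() < p_human_catches:
            # human intervenes; succeeds unless the human also errs
            if rng.random() >= p_human_error:
                correct += 1
    return correct / n_tasks
```

Even this toy version shows the qualitative point of the abstract: overall performance depends jointly on both error probabilities and on the feedback (here, the catch probability) between human and computer.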
Cultural background shapes spatial reference frame proclivity
Goeke, Caspar; Kornpetpanee, Suchada; Köster, Moritz; Fernández-Revelles, Andrés B.; Gramann, Klaus; König, Peter
2015-01-01
Spatial navigation is an essential human skill that is influenced by several factors. The present study investigates how gender, age, and cultural background account for differences in reference frame proclivity and performance in a virtual navigation task. Using an online navigation study, we recorded reaction times, error rates (confusion of turning axis), and reference frame proclivity (egocentric vs. allocentric reference frame) of 1823 participants. Reaction times significantly varied with gender and age, but were only marginally influenced by the cultural background of participants. Error rates were in line with these results and exhibited a significant influence of gender and culture, but not age. Participants’ cultural background significantly influenced reference frame selection; the majority of North-Americans preferred an allocentric strategy, while Latin-Americans preferred an egocentric navigation strategy. European and Asian groups were in between these two extremes. Neither the factor of age nor the factor of gender had a direct impact on participants’ navigation strategies. The strong effects of cultural background on navigation strategies without the influence of gender or age underlines the importance of socialized spatial cognitive processes and argues for socio-economic analysis in studies investigating human navigation. PMID:26073656
Quantum Error Correction: Optimal, Robust, or Adaptive? Or, Where is The Quantum Flyball Governor?
NASA Astrophysics Data System (ADS)
Kosut, Robert; Grace, Matthew
2012-02-01
In The Human Use of Human Beings: Cybernetics and Society (1950), Norbert Wiener introduces feedback control in this way: "This control of a machine on the basis of its actual performance rather than its expected performance is known as feedback ... It is the function of control ... to produce a temporary and local reversal of the normal direction of entropy." The classic classroom example of feedback control is the all-mechanical flyball governor used by James Watt in the 18th century to regulate the speed of rotating steam engines. What is it that is so compelling about this apparatus? First, it is easy to understand how it regulates the speed of a rotating steam engine. Secondly, and perhaps more importantly, it is a part of the device itself. A naive observer would not distinguish this mechanical piece from all the rest. So it is natural to ask, where is the all-quantum device which is self-regulating, i.e., the Quantum Flyball Governor? Is the goal of quantum error correction (QEC) to design such a device? Developing the computational and mathematical tools to design this device is the topic of this talk.
Towards automatic Markov reliability modeling of computer architectures
NASA Technical Reports Server (NTRS)
Liceaga, C. A.; Siewiorek, D. P.
1986-01-01
The analysis and evaluation of reliability measures using time-varying Markov models is required for Processor-Memory-Switch (PMS) structures that have competing processes such as standby redundancy and repair, or renewal processes such as transient or intermittent faults. The task of generating these models is tedious and prone to human error due to the large number of states and transitions involved in any reasonable system. Therefore model formulation is a major analysis bottleneck, and model verification is a major validation problem. The general unfamiliarity of computer architects with Markov modeling techniques further increases the necessity of automating the model formulation. This paper presents an overview of the Automated Reliability Modeling (ARM) program, under development at NASA Langley Research Center. ARM will accept as input a description of the PMS interconnection graph, the behavior of the PMS components, the fault-tolerant strategies, and the operational requirements. The output of ARM will be the reliability or availability Markov model formulated for direct use by evaluation programs. The advantages of such an approach are (a) utility to a large class of users, not necessarily expert in reliability analysis, and (b) a lower probability of human error in the computation.
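For the simplest repairable element of a PMS structure, the kind of Markov model ARM would formulate has a well-known closed-form solution; a sketch for the two-state (up/down) case, with illustrative rates:

```python
import math

def steady_state_availability(failure_rate, repair_rate):
    """Steady-state availability of a two-state repairable component
    (up/down Markov model): A = mu / (lambda + mu), where lambda is
    the failure rate and mu the repair rate."""
    return repair_rate / (failure_rate + repair_rate)

def reliability(failure_rate, t):
    """Reliability of the same component without repair over time t:
    R(t) = exp(-lambda * t)."""
    return math.exp(-failure_rate * t)
```

Automating tools like ARM matter because realistic PMS models have far more than two states, where hand derivation of the transition matrix is exactly the error-prone step the abstract describes.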
How to minimize perceptual error and maximize expertise in medical imaging
NASA Astrophysics Data System (ADS)
Kundel, Harold L.
2007-03-01
Visual perception is such an intimate part of human experience that we assume that it is entirely accurate. Yet, perception accounts for about half of the errors made by radiologists using adequate imaging technology. The true incidence of errors that directly affect patient well-being is not known, but it is probably at the lower end of the reported values of 3 to 25%. Errors in screening for lung and breast cancer are somewhat better characterized than errors in routine diagnosis. About 25% of cancers actually recorded on the images are missed, and cancer is falsely reported in about 5% of normal people. Radiologists must strive to decrease error not only because of the potential impact on patient care but also because substantial variation among observers undermines confidence in the reliability of imaging diagnosis. Observer variation also has a major impact on technology evaluation because the variation between observers is frequently greater than the difference in the technologies being evaluated. This has become particularly important in the evaluation of computer-aided diagnosis (CAD). Understanding the basic principles that govern the perception of medical images can provide a rational basis for making recommendations for minimizing perceptual error. It is convenient to organize thinking about perceptual error into five steps: 1) the initial acquisition of the image by the eye-brain (contrast and detail perception); 2) the organization of the retinal image into logical components to produce a literal perception (bottom-up, global, holistic); 3) conversion of the literal perception into a preferred perception by resolving ambiguities in the literal perception (top-down, simulation, synthesis); 4) selective visual scanning to acquire details that update the preferred perception; and 5) application of decision criteria to the preferred perception. The five steps are illustrated with examples from radiology, with suggestions for minimizing error. The role of perceptual learning in the development of expertise is also considered.
Doytchev, Doytchin E; Szwillus, Gerd
2009-11-01
Understanding the reasons for incident and accident occurrence is important for an organization's safety. Different methods have been developed to achieve this goal. To better understand the human behaviour in incident occurrence we propose an analysis concept that combines Fault Tree Analysis (FTA) and Task Analysis (TA). The former method identifies the root causes of an accident/incident, while the latter analyses the way people perform the tasks in their work environment and how they interact with machines or colleagues. These methods were complemented with the use of the Human Error Identification in System Tools (HEIST) methodology and the concept of Performance Shaping Factors (PSF) to deepen the insight into the error modes of an operator's behaviour. HEIST shows the external error modes that caused the human error and the factors that prompted the human to err. To show the validity of the approach, a case study at a Bulgarian Hydro power plant was carried out. An incident - the flooding of the plant's basement - was analysed by combining the aforementioned methods. The case study shows that Task Analysis in combination with other methods can be applied successfully to human error analysis, revealing details about erroneous actions in a realistic situation.
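The FTA side of the combined approach reduces, for independent basic events, to recursive AND/OR probability evaluation; a minimal sketch with a hypothetical tree loosely inspired by the flooding incident (all probabilities invented):

```python
def ft_probability(node):
    """Evaluate the top-event probability of a fault tree assuming
    independent basic events. A node is either a float (a basic-event
    probability) or a tuple ("AND" | "OR", [children])."""
    if isinstance(node, float):
        return node
    gate, children = node
    probs = [ft_probability(c) for c in children]
    if gate == "AND":
        p = 1.0
        for q in probs:
            p *= q              # all children must occur
        return p
    if gate == "OR":
        p = 1.0
        for q in probs:
            p *= (1.0 - q)      # complement: no child occurs
        return 1.0 - p
    raise ValueError(f"unknown gate {gate!r}")

# Hypothetical tree (not from the case study): flooding requires a
# valve fault AND (an operator slip OR a missed alarm).
flooding = ("AND", [0.01, ("OR", [0.1, 0.05])])
```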
Kis, Anna; Gácsi, Márta; Range, Friederike; Virányi, Zsófia
2012-01-01
In this paper, we describe a behaviour pattern similar to the "A-not-B" error found in human infants and young apes in a monkey species, the common marmosets (Callithrix jacchus). In contrast to the classical explanation, recently it has been suggested that the "A-not-B" error committed by human infants is at least partially due to misinterpretation of the hider's ostensively communicated object hiding actions as potential 'teaching' demonstrations during the A trials. We tested whether this so-called Natural Pedagogy hypothesis would account for the A-not-B error that marmosets commit in a standard object permanence task, but found no support for the hypothesis in this species. Alternatively, we present evidence that lower level mechanisms, such as attention and motivation, play an important role in committing the "A-not-B" error in marmosets. We argue that these simple mechanisms might contribute to the effect of undeveloped object representational skills in other species including young non-human primates that commit the A-not-B error.
de Cueto, Marina; Ceballos, Esther; Martinez-Martinez, Luis; Perea, Evelio J.; Pascual, Alvaro
2004-01-01
In order to further decrease the time lapse between initial inoculation of blood culture media and the reporting of results of identification and antimicrobial susceptibility tests for microorganisms causing bacteremia, we performed a prospective study in which specially processed fluid from positive blood culture bottles from Bactec 9240 (Becton Dickinson, Cockeysville, Md.) containing aerobic media were directly inoculated into Vitek 2 system cards (bioMérieux, France). Organism identification and susceptibility results were compared with those obtained from cards inoculated with a standardized bacterial suspension obtained following subculture to agar; 100 consecutive positive monomicrobic blood cultures, consisting of 50 gram-negative rods and 50 gram-positive cocci, were included in the study. For gram-negative organisms, 31 of the 50 (62%) showed complete agreement with the standard method for species identification, while none of the 50 gram-positive cocci were correctly identified by the direct method. For gram-negative rods, there was 50% categorical agreement between the direct and standard methods for all drugs tested. The very major error rate was 2.4%, and the major error rate was 0.6%. The overall error rate for gram-negatives was 6.6%. Complete agreement in clinical categories of all antimicrobial agents evaluated was obtained for 19 of 50 (38%) gram-positive cocci evaluated; the overall error rate was 8.4%, with 2.8% minor errors, 2.4% major errors, and 3.2% very major errors. These findings suggest that the Vitek 2 cards inoculated directly from positive Bactec 9240 bottles do not provide acceptable bacterial identification or susceptibility testing in comparison with corresponding cards tested by a standard method. PMID:15297523
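The very major/major/minor error terminology used above follows the standard convention for comparing a test method's S/I/R category against a reference method; a hedged sketch of that classification (category definitions follow the usual convention, not wording from this study):

```python
def ast_error_rates(pairs):
    """Classify disagreements between a test method and a reference
    method over S/I/R susceptibility categories, per the usual
    convention:
      very major: test calls S, reference calls R (false susceptible)
      major:      test calls R, reference calls S (false resistant)
      minor:      any disagreement involving the I category."""
    n = len(pairs)
    counts = {"agreement": 0, "very_major": 0, "major": 0, "minor": 0}
    for test, ref in pairs:
        if test == ref:
            counts["agreement"] += 1
        elif test == "S" and ref == "R":
            counts["very_major"] += 1
        elif test == "R" and ref == "S":
            counts["major"] += 1
        else:
            counts["minor"] += 1
    return {k: v / n for k, v in counts.items()}
```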
Updating expected action outcome in the medial frontal cortex involves an evaluation of error type.
Maier, Martin E; Steinhauser, Marco
2013-10-02
Forming expectations about the outcome of an action is an important prerequisite for action control and reinforcement learning in the human brain. The medial frontal cortex (MFC) has been shown to play an important role in the representation of outcome expectations, particularly when an update of expected outcome becomes necessary because an error is detected. However, error detection alone is not always sufficient to compute expected outcome because errors can occur in various ways and different types of errors may be associated with different outcomes. In the present study, we therefore investigate whether updating expected outcome in the human MFC is based on an evaluation of error type. Our approach was to consider an electrophysiological correlate of MFC activity on errors, the error-related negativity (Ne/ERN), in a task in which two types of errors could occur. Because the two error types were associated with different amounts of monetary loss, updating expected outcomes on error trials required an evaluation of error type. Our data revealed a pattern of Ne/ERN amplitudes that closely mirrored the amount of monetary loss associated with each error type, suggesting that outcome expectations are updated based on an evaluation of error type. We propose that this is achieved by a proactive evaluation process that anticipates error types by continuously monitoring error sources or by dynamically representing possible response-outcome relations.
Past makes future: role of pFC in prediction.
Fuster, Joaquín M; Bressler, Steven L
2015-04-01
The pFC enables the essential human capacities for predicting future events and preadapting to them. These capacities rest on both the structure and dynamics of the human pFC. Structurally, pFC, together with posterior association cortex, is at the highest hierarchical level of cortical organization, harboring neural networks that represent complex goal-directed actions. Dynamically, pFC is at the highest level of the perception-action cycle, the circular processing loop through the cortex that interfaces the organism with the environment in the pursuit of goals. In its predictive and preadaptive roles, pFC supports cognitive functions that are critical for the temporal organization of future behavior, including planning, attentional set, working memory, decision-making, and error monitoring. These functions have a common future perspective and are dynamically intertwined in goal-directed action. They all utilize the same neural infrastructure: a vast array of widely distributed, overlapping, and interactive cortical networks of personal memory and semantic knowledge, named cognits, which are formed by synaptic reinforcement in learning and memory acquisition. From this cortex-wide reservoir of memory and knowledge, pFC generates purposeful, goal-directed actions that are preadapted to predicted future events.
Krimmel, R.M.
1999-01-01
Net mass balance has been measured since 1958 at South Cascade Glacier using the 'direct method,' i.e., area averages of snow gain and firn and ice loss measured at stakes. Analysis of cartographic vertical photography has allowed measurement of mass balance using the 'geodetic method' in 1970, 1975, 1977, 1979-80, and 1985-97. Water-equivalent change as measured by these nearly independent methods should give similar results. During 1970-97, the direct method shows a cumulative balance of about -15 m, and the geodetic method shows a cumulative balance of about -22 m. The deviation between the two methods is fairly consistent, suggesting no gross errors in either, but rather a cumulative systematic error. It is suspected that the cumulative error lies in the direct method, because the geodetic method is based on a non-changing reference, the bedrock control, whereas the direct method is measured with reference only to the previous year's summer surface. Possible sources of mass loss missing from the direct method are basal melt, internal melt, and ablation on crevasse walls. Possible systematic measurement errors include underestimation of the density of lost material, sinking stakes, and poorly represented areas.
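The systematic bias implied by these figures can be checked with simple arithmetic: a 7 m water-equivalent discrepancy accumulated over the 27 balance years of 1970-97 corresponds to roughly 0.26 m per year of unmeasured loss. A sketch of that back-of-envelope calculation:

```python
# Back-of-envelope check of the systematic bias implied by the abstract
# (the cumulative figures are approximate, as stated in the text).
direct_cum = -15.0    # m w.e., direct method, 1970-97
geodetic_cum = -22.0  # m w.e., geodetic method, 1970-97
years = 1997 - 1970   # 27 balance years

bias_total = geodetic_cum - direct_cum  # -7.0 m w.e. over the record
bias_per_year = bias_total / years      # about -0.26 m w.e. per year
```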
Tilt Error in Cryospheric Surface Radiation Measurements at High Latitudes: A Model Study
NASA Astrophysics Data System (ADS)
Bogren, W.; Kylling, A.; Burkhart, J. F.
2015-12-01
We have evaluated the magnitude and makeup of error in cryospheric radiation observations due to small sensor misalignment in in-situ measurements of solar irradiance. This error is examined through simulation of diffuse and direct irradiance arriving at a detector with a cosine-response foreoptic. Emphasis is placed on assessing total error over the solar shortwave spectrum from 250 nm to 4500 nm, as well as supporting investigation over other relevant shortwave spectral ranges. The total measurement error introduced by sensor tilt is dominated by the direct component. For a typical high latitude albedo measurement with a solar zenith angle of 60°, a sensor tilted by 1, 3, and 5° can respectively introduce up to 2.6, 7.7, and 12.8% error into the measured irradiance and similar errors in the derived albedo. Depending on the daily range of solar azimuth and zenith angles, significant measurement error can persist also in integrated daily irradiance and albedo.
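The dominance of the direct-beam component can be illustrated with a simple worst-case model: if a cosine-response sensor is tilted by an angle β directly toward the sun, the direct irradiance it reports scales as cos(θz − β) instead of cos(θz). The sketch below implements only this worst case and ignores diffuse light, so it slightly overstates the full-spectrum errors reported above:

```python
import math

def worst_case_tilt_error(zenith_deg, tilt_deg):
    """Relative error in measured direct irradiance for a cosine-response
    sensor tilted directly toward the sun (worst case, diffuse ignored)."""
    z, t = math.radians(zenith_deg), math.radians(tilt_deg)
    return math.cos(z - t) / math.cos(z) - 1.0

for tilt in (1, 3, 5):
    err = worst_case_tilt_error(60, tilt)
    print(f"tilt {tilt} deg: {100 * err:.1f}% error")
```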
Simulations in site error estimation for direction finders
NASA Astrophysics Data System (ADS)
López, Raúl E.; Passi, Ranjit M.
1991-08-01
The performance of an algorithm for the recovery of site-specific errors of direction finder (DF) networks is tested under controlled simulated conditions. The simulations show that the algorithm has some inherent shortcomings for the recovery of site errors from the measured azimuth data. These limitations are fundamental to the problem of site error estimation using azimuth information. Several ways for resolving or ameliorating these basic complications are tested by means of simulations. From these it appears that for the effective implementation of the site error determination algorithm, one should design the networks with at least four DFs, improve the alignment of the antennas, and increase the gain of the DFs as much as it is compatible with other operational requirements. The use of a nonzero initial estimate of the site errors when working with data from networks of four or more DFs also improves the accuracy of the site error recovery. Even for networks of three DFs, reasonable site error corrections could be obtained if the antennas could be well aligned.
NASA Technical Reports Server (NTRS)
Lang, Christapher G.; Bey, Kim S. (Technical Monitor)
2002-01-01
This research investigates residual-based a posteriori error estimates for finite element approximations of heat conduction in single-layer and multi-layered materials. The finite element approximation, based upon hierarchical modeling combined with p-version finite elements, is described with specific application to a two-dimensional, steady state, heat-conduction problem. Element error indicators are determined by solving an element equation for the error with the element residual as a source, and a global error estimate in the energy norm is computed by collecting the element contributions. Numerical results of the performance of the error estimate are presented by comparisons to the actual error. Two methods are discussed and compared for approximating the element boundary flux. The equilibrated flux method provides more accurate results for estimating the error than the average flux method. The error estimation is applied to multi-layered materials with a modification to the equilibrated flux method to approximate the discontinuous flux along a boundary at the material interfaces. A directional error indicator is developed which distinguishes between the hierarchical modeling error and the finite element error. Numerical results are presented for single-layered materials which show that the directional indicators accurately determine which contribution to the total error dominates.
Helle, Samuli
2018-03-01
Revealing causal effects from correlative data is very challenging and a contemporary problem in human life history research, owing to the lack of an experimental approach. Problems with causal inference arising from measurement error in independent variables, whether related to inaccurate measurement technique or to the validity of the measurements, seem not to be well known in this field. The aim of this study is to show how structural equation modeling (SEM) with latent variables can be applied to account for measurement error in independent variables when the researcher has recorded several indicators of a hypothesized latent construct. As a simple example of this approach, measurement error in lifetime allocation of resources to reproduction in Finnish preindustrial women is modeled in the context of the survival cost of reproduction. In humans, lifetime energetic resources allocated to reproduction are almost impossible to quantify with precision and, thus, typically used measures of lifetime reproductive effort (e.g., lifetime reproductive success and parity) are likely to be plagued by measurement error. These results are contrasted with those obtained from a traditional regression approach in which the single best proxy of lifetime reproductive effort available in the data is used for inference. As expected, the inability to account for measurement error in women's lifetime reproductive effort resulted in the underestimation of its underlying effect size on post-reproductive survival. This article emphasizes the advantages that the SEM framework can provide in handling measurement error via multiple-indicator latent variables in human life history studies. © 2017 Wiley Periodicals, Inc.
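The attenuation effect described above (classical measurement error biasing a regression slope toward zero) is easy to reproduce in simulation. A hedged numpy sketch with synthetic data, not the study's Finnish records: the slope recovered from the noisy proxy shrinks by the reliability ratio var(x)/(var(x)+var(noise)).

```python
import numpy as np

# Hedged illustration (synthetic data, not the study's): regressing an
# outcome on a noisy proxy of the true predictor biases the slope toward
# zero by the reliability ratio var(x) / (var(x) + var(noise)).
rng = np.random.default_rng(0)
n = 100_000
x = rng.normal(size=n)                     # true (latent) predictor
y = 1.0 * x + rng.normal(size=n)           # outcome; true slope = 1
x_obs = x + rng.normal(scale=1.0, size=n)  # proxy with measurement error

slope_true = np.polyfit(x, y, 1)[0]        # approx 1.0
slope_obs = np.polyfit(x_obs, y, 1)[0]     # attenuated toward 0
reliability = 1.0 / (1.0 + 1.0)            # expected attenuation: 0.5
```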
Safety coaches in radiology: decreasing human error and minimizing patient harm.
Dickerson, Julie M; Koch, Bernadette L; Adams, Janet M; Goodfriend, Martha A; Donnelly, Lane F
2010-09-01
Successful programs to improve patient safety require a component aimed at improving safety culture and environment, resulting in a reduced number of human errors that could lead to patient harm. Safety coaching provides peer accountability. It involves observing for safety behaviors and use of error prevention techniques and provides immediate feedback. For more than a decade, behavior-based safety coaching has been a successful strategy for reducing error within the context of occupational safety in industry. We describe the use of safety coaches in radiology. Safety coaches are an important component of our comprehensive patient safety program.
Error-associated behaviors and error rates for robotic geology
NASA Technical Reports Server (NTRS)
Anderson, Robert C.; Thomas, Geb; Wagner, Jacob; Glasgow, Justin
2004-01-01
This study explores human error as a function of the decision-making process. One of many models for human decision-making is Rasmussen's decision ladder [9]. The decision ladder identifies the multiple tasks and states of knowledge involved in decision-making. The tasks and states of knowledge can be classified by the level of cognitive effort required to make the decision, leading to the skill, rule, and knowledge taxonomy (Rasmussen, 1987). Skill-based decisions require the least cognitive effort and knowledge-based decisions require the greatest cognitive effort. Errors can occur at any of the cognitive levels.
Procedural error monitoring and smart checklists
NASA Technical Reports Server (NTRS)
Palmer, Everett
1990-01-01
Human beings make and usually detect errors routinely. The same mental processes that allow humans to cope with novel problems can also lead to error. Bill Rouse has argued that errors are not inherently bad but their consequences may be. He proposes the development of error-tolerant systems that detect errors and take steps to prevent the consequences of the error from occurring. Research should be done on self and automatic detection of random and unanticipated errors. For self detection, displays should be developed that make the consequences of errors immediately apparent. For example, electronic map displays graphically show the consequences of horizontal flight plan entry errors. Vertical profile displays should be developed to make apparent vertical flight planning errors. Other concepts such as energy circles could also help the crew detect gross flight planning errors. For automatic detection, systems should be developed that can track pilot activity, infer pilot intent and inform the crew of potential errors before their consequences are realized. Systems that perform a reasonableness check on flight plan modifications by checking route length and magnitude of course changes are simple examples. Another example would be a system that checked the aircraft's planned altitude against a data base of world terrain elevations. Information is given in viewgraph form.
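The reasonableness checks mentioned above can be sketched in a few lines. The thresholds and function names here are illustrative, not from any avionics system:

```python
# Hedged sketch of the kind of "reasonableness check" described above:
# flag flight-plan modifications that sharply lengthen the route or
# involve an implausibly large course change. Thresholds are illustrative.

def course_change_deg(prev_heading, new_heading):
    """Smallest absolute angular difference between two headings."""
    diff = abs(new_heading - prev_heading) % 360
    return min(diff, 360 - diff)

def reasonableness_flags(old_length_nm, new_length_nm,
                         prev_heading, new_heading,
                         max_stretch=1.5, max_turn=120):
    """Return a list of warnings for a proposed flight-plan change."""
    flags = []
    if new_length_nm > max_stretch * old_length_nm:
        flags.append("route length increased sharply")
    if course_change_deg(prev_heading, new_heading) > max_turn:
        flags.append("large course change")
    return flags
```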
Comparison of phasing strategies for whole human genomes
Kirkness, Ewen; Schork, Nicholas J.
2018-01-01
Humans are a diploid species that inherit one set of chromosomes paternally and one homologous set of chromosomes maternally. Unfortunately, most human sequencing initiatives ignore this fact in that they do not directly delineate the nucleotide content of the maternal and paternal copies of the 23 chromosomes individuals possess (i.e., they do not ‘phase’ the genome) often because of the costs and complexities of doing so. We compared 11 different widely-used approaches to phasing human genomes using the publicly available ‘Genome-In-A-Bottle’ (GIAB) phased version of the NA12878 genome as a gold standard. The phasing strategies we compared included laboratory-based assays that prepare DNA in unique ways to facilitate phasing as well as purely computational approaches that seek to reconstruct phase information from general sequencing reads and constructs or population-level haplotype frequency information obtained through a reference panel of haplotypes. To assess the performance of the 11 approaches, we used metrics that included, among others, switch error rates, haplotype block lengths, the proportion of fully phase-resolved genes, phasing accuracy and yield between pairs of SNVs. Our comparisons suggest that a hybrid or combined approach that leverages: 1. population-based phasing using the SHAPEIT software suite, 2. either genome-wide sequencing read data or parental genotypes, and 3. a large reference panel of variant and haplotype frequencies, provides a fast and efficient way to produce highly accurate phase-resolved individual human genomes. We found that for population-based approaches, phasing performance is enhanced with the addition of genome-wide read data; e.g., whole genome shotgun and/or RNA sequencing reads. Further, we found that the inclusion of parental genotype data within a population-based phasing strategy can provide as much as a ten-fold reduction in phasing errors. 
We also considered a majority voting scheme for the construction of a consensus haplotype combining multiple predictions for enhanced performance and site coverage. Finally, we also identified DNA sequence signatures associated with the genomic regions harboring phasing switch errors, which included regions of low polymorphism or SNV density. PMID:29621242
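Switch error rate, the headline metric above, can be computed with a short routine: at each heterozygous site, record whether the predicted haplotype pair matches the truth in the same or flipped orientation, then count orientation changes between consecutive sites. A hedged sketch with toy data:

```python
# Hedged sketch of the switch-error metric used to compare phasing
# methods; a simplified version that assumes matching genotypes.

def switch_error_rate(truth, predicted):
    """truth, predicted: lists of (allele_hap1, allele_hap2) per het site."""
    orientations = []
    for (t1, t2), (p1, p2) in zip(truth, predicted):
        if (p1, p2) == (t1, t2):
            orientations.append(0)   # same orientation as truth
        elif (p1, p2) == (t2, t1):
            orientations.append(1)   # flipped orientation
        else:
            raise ValueError("genotype mismatch at a site")
    switches = sum(a != b for a, b in zip(orientations, orientations[1:]))
    return switches / max(len(orientations) - 1, 1)

truth = [("A", "G"), ("C", "T"), ("G", "A"), ("T", "C")]
pred  = [("A", "G"), ("T", "C"), ("A", "G"), ("C", "T")]  # flips once, after site 1
```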
A circadian rhythm in skill-based errors in aviation maintenance.
Hobbs, Alan; Williamson, Ann; Van Dongen, Hans P A
2010-07-01
In workplaces where activity continues around the clock, human error has been observed to exhibit a circadian rhythm, with a characteristic peak in the early hours of the morning. Errors are commonly distinguished by the nature of the underlying cognitive failure, particularly the level of intentionality involved in the erroneous action. The Skill-Rule-Knowledge (SRK) framework of Rasmussen is used widely in the study of industrial errors and accidents. The SRK framework describes three fundamental types of error, according to whether behavior is under the control of practiced sensori-motor skill routines with minimal conscious awareness; is guided by implicit or explicit rules or expertise; or whether the planning of actions requires the conscious application of domain knowledge. Up to now, examinations of circadian patterns of industrial errors have not distinguished between different types of error. Consequently, it is not clear whether all types of error exhibit the same circadian rhythm. A survey was distributed to aircraft maintenance personnel in Australia. Personnel were invited to anonymously report a safety incident and were prompted to describe, in detail, the human involvement (if any) that contributed to it. A total of 402 airline maintenance personnel reported an incident, providing 369 descriptions of human error in which the time of the incident was reported and sufficient detail was available to analyze the error. Errors were categorized using a modified version of the SRK framework, in which errors are categorized as skill-based, rule-based, or knowledge-based, or as procedure violations. An independent check confirmed that the SRK framework had been applied with sufficient consistency and reliability. Skill-based errors were the most common form of error, followed by procedure violations, rule-based errors, and knowledge-based errors. 
The frequency of errors was adjusted for the estimated proportion of workers present at work each hour of the day, and the 24 h pattern of each error type was examined. Skill-based errors exhibited a significant circadian rhythm, being most prevalent in the early hours of the morning. Variation in the frequency of rule-based errors, knowledge-based errors, and procedure violations over the 24 h did not reach statistical significance. The results suggest that during the early hours of the morning, maintenance technicians are at heightened risk of "absent minded" errors involving failures to execute action plans as intended.
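The exposure adjustment described above amounts to dividing each hour's raw error count by the estimated fraction of the workforce on duty at that hour. A minimal sketch with illustrative numbers, not the study's data:

```python
# Hedged sketch of the exposure adjustment described above: raw error
# counts per hour are divided by the estimated proportion of the
# workforce present at that hour, giving a relative error rate.

def adjusted_rates(error_counts, workforce_fraction):
    """error_counts, workforce_fraction: dicts keyed by hour 0-23."""
    return {
        hour: error_counts.get(hour, 0) / workforce_fraction[hour]
        for hour in workforce_fraction
        if workforce_fraction[hour] > 0
    }

counts = {3: 6, 14: 8}        # illustrative counts, not the study's data
staffing = {3: 0.2, 14: 0.8}  # fraction of workforce on duty
rates = adjusted_rates(counts, staffing)
```

After adjustment the early-morning hour (3 a.m.) shows the higher relative rate despite the smaller raw count, which is the pattern the study reports for skill-based errors.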
SU-E-T-195: Gantry Angle Dependency of MLC Leaf Position Error
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ju, S; Hong, C; Kim, M
Purpose: The aim of this study was to investigate the gantry angle dependency of the multileaf collimator (MLC) leaf position error. Methods: An automatic MLC quality assurance system (AutoMLCQA) was developed to evaluate the gantry angle dependency of the MLC leaf position error using an electronic portal imaging device (EPID). To eliminate the EPID position error due to gantry rotation, we designed a reference marker (RM) that could be inserted into the wedge mount. After setting up the EPID, a reference image was taken of the RM using an open field. Next, an EPID-based picket-fence test (PFT) was performed without the RM. These procedures were repeated at 45° intervals of the gantry angle. A total of eight reference images and PFT image sets were analyzed using in-house software. The average MLC leaf position error was calculated at five pickets (-10, -5, 0, 5, and 10 cm) in accordance with general PFT guidelines. This test was carried out for four linear accelerators. Results: The average MLC leaf position errors were within the set criterion of <1 mm (actual errors ranged from -0.7 to 0.8 mm) for all gantry angles, but significant gantry angle dependency was observed in all machines. The error was smaller at a gantry angle of 0° but increased in the positive direction with gantry angle increments in the clockwise direction. The error reached a maximum at a gantry angle of 90° and then gradually decreased until 180°. In the counter-clockwise rotation of the gantry, the same pattern of error was observed, but the error increased in the negative direction. Conclusion: The AutoMLCQA system was useful for evaluating the MLC leaf position error at various gantry angles without the EPID position error. The gantry angle dependency should be considered during MLC leaf position error analysis.
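The per-leaf analysis described above reduces to comparing measured picket positions against the five nominal positions and averaging the deviations. A hedged sketch of that step (the in-house software is not reproduced here):

```python
# Hedged sketch of the picket-fence analysis step described above: for
# each leaf pair, compare measured picket positions with the nominal
# positions and average the deviations.

NOMINAL_PICKETS_CM = (-10.0, -5.0, 0.0, 5.0, 10.0)

def mean_leaf_errors(measured_by_leaf):
    """measured_by_leaf: {leaf_id: [five measured picket positions, cm]}."""
    result = {}
    for leaf, measured in measured_by_leaf.items():
        devs = [m - n for m, n in zip(measured, NOMINAL_PICKETS_CM)]
        result[leaf] = sum(devs) / len(devs)
    return result

measured = {"leaf01": [-10.03, -4.98, 0.02, 5.01, 10.03]}  # illustrative
errors = mean_leaf_errors(measured)  # mean deviation per leaf, in cm
```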
The high accuracy data processing system of laser interferometry signals based on MSP430
NASA Astrophysics Data System (ADS)
Qi, Yong-yue; Lin, Yu-chi; Zhao, Mei-rong
2009-07-01
Single-frequency laser interferometers generally use two orthogonal signals for direction discrimination and electronic subdivision. The interference signals, however, typically suffer from three errors: zero-offset error, unequal amplitudes, and quadrature phase-shift error, all of which seriously degrade subdivision precision. Compensation of these three errors is achieved with the Heydemann error-compensation algorithm. Because the full Heydemann model is computationally expensive, an improved algorithm is proposed that markedly reduces computation time by exploiting the fact that only one data item changes in each fitting iteration. A real-time, dynamic compensation circuit was then designed. With an MSP430 microcontroller at the core of the hardware system, the two error-laden input signals are digitized by an AD7862; after processing with the improved algorithm, two ideal, error-free signals are output through an AD7225. At the same time, the two original signals are converted to square waves and fed to the direction-discrimination circuit, whose output pulses are counted by the microcontroller's timer. From the pulse count and software subdivision, the final result is displayed on an LED. The algorithm and circuit were used to test a laser interferometer with 8-fold optical-path multiplication, and a measuring accuracy of 12-14 nm was achieved.
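The correction step can be sketched as follows, assuming the error parameters (offsets, amplitudes, and quadrature phase error) have already been estimated, e.g., by a prior ellipse fit; the paper's improved real-time fitting on the MSP430 is not reproduced here. The signal model is a common Heydemann-style parameterization and is an assumption of this sketch:

```python
import math

# Hedged sketch of Heydemann-style quadrature correction with known
# error parameters. Assumed signal model:
#   u1 = a1*cos(d) + p        (offset p, amplitude a1)
#   u2 = a2*sin(d + phi) + q  (offset q, amplitude a2, quadrature error phi)

def correct(u1, u2, p, q, a1, a2, phi):
    """Recover the interference phase d from the distorted signals."""
    c = (u1 - p) / a1                                        # ideal cos(d)
    s = ((u2 - q) / a2 - c * math.sin(phi)) / math.cos(phi)  # ideal sin(d)
    return math.atan2(s, c)

# Round trip: distort a known phase, then recover it.
p, q, a1, a2, phi = 0.1, -0.05, 1.2, 0.9, math.radians(3)
d_true = 1.234
u1 = a1 * math.cos(d_true) + p
u2 = a2 * math.sin(d_true + phi) + q
d_rec = correct(u1, u2, p, q, a1, a2, phi)
```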
Tilt error in cryospheric surface radiation measurements at high latitudes: a model study
NASA Astrophysics Data System (ADS)
Bogren, Wiley Steven; Faulkner Burkhart, John; Kylling, Arve
2016-03-01
We have evaluated the magnitude and makeup of error in cryospheric radiation observations due to small sensor misalignment in in situ measurements of solar irradiance. This error is examined through simulation of diffuse and direct irradiance arriving at a detector with a cosine-response fore optic. Emphasis is placed on assessing total error over the solar shortwave spectrum from 250 to 4500 nm, as well as supporting investigation over other relevant shortwave spectral ranges. The total measurement error introduced by sensor tilt is dominated by the direct component. For a typical high-latitude albedo measurement with a solar zenith angle of 60°, a sensor tilted by 1, 3, and 5° can introduce up to 2.7, 8.1, and 13.5% error, respectively, into the measured irradiance and similar errors in the derived albedo. Depending on the daily range of solar azimuth and zenith angles, significant measurement error can persist also in integrated daily irradiance and albedo. Simulations including a cloud layer demonstrate decreasing tilt error with increasing cloud optical depth.
NASA Astrophysics Data System (ADS)
Wu, Heng
2000-10-01
In this thesis, an a posteriori error estimator is presented and employed for solving viscous incompressible flow problems. In an effort to detect local flow features, such as vortices and separation, and to resolve flow details precisely, a velocity angle error estimator eθ, based on the spatial derivative of the velocity direction field, is designed and constructed. The a posteriori error estimator corresponds to the antisymmetric part of the deformation-rate tensor, and it is sensitive to the second derivative of the velocity angle field. Analysis reveals that the velocity angle error estimator is a curvature error estimator whose value reflects the accuracy of streamline curves. It is also found that the velocity angle error estimator contains the nonlinear convective term of the Navier-Stokes equations, and it identifies and computes the direction difference when the convective acceleration direction and the flow velocity direction have a disparity. Through benchmarking computed variables against the analytic solution of Kovasznay flow or the finest grid of cavity flow, it is demonstrated that the velocity angle error estimator performs better than the strain error estimator. The benchmarking work also shows that the computed profile obtained by using eθ achieves the best match with the true θ field and is asymptotic to the true θ variation field, while requiring fewer unknowns. Unstructured grids are adapted by employing local cell division as well as unrefinement of transition cells. Using element classes and node classes efficiently constructs a hierarchical data structure that provides cell and node inter-reference at each adaptive level. Employing element pointers and node pointers dynamically maintains the connection of adjacent elements and adjacent nodes, and thus avoids time-consuming search processes. 
The adaptive scheme is applied to viscous incompressible flow at different Reynolds numbers. It is found that the velocity angle error estimator can detect most flow characteristics and produce dense grids in the regions where flow velocity directions change abruptly. In addition, the eθ estimator distributes the derivative error dilutely over the whole computational domain while still concentrating refinement in regions of high error. Through comparison of the velocity angle error across the interfaces with neighbouring cells, it is verified that the adaptive scheme using eθ provides an optimum mesh that clearly resolves local flow features in a precise way. The adaptive results justify the applicability of the eθ estimator and prove that this error estimator is a valuable adaptive indicator for the automatic refinement of unstructured grids.
Hurford, Amy
2009-05-20
Movement data are frequently collected using Global Positioning System (GPS) receivers, but recorded GPS locations are subject to errors. While past studies have suggested methods to improve location accuracy, mechanistic movement models utilize distributions of turning angles and directional biases and these data present a new challenge in recognizing and reducing the effect of measurement error. I collected locations from a stationary GPS collar, analyzed a probabilistic model and used Monte Carlo simulations to understand how measurement error affects measured turning angles and directional biases. Results from each of the three methods were in complete agreement: measurement error gives rise to a systematic bias where a stationary animal is most likely to be measured as turning 180 degrees or moving towards a fixed point in space. These spurious effects occur in GPS data when the measured distance between locations is <20 meters. Measurement error must be considered as a possible cause of 180 degree turning angles in GPS data. Consequences of failing to account for measurement error are predicting overly tortuous movement, numerous returns to previously visited locations, inaccurately predicting species range, core areas, and the frequency of crossing linear features. By understanding the effect of GPS measurement error, ecologists are able to disregard false signals to more accurately design conservation plans for endangered wildlife.
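The core simulation result (that pure location error makes a stationary animal appear to turn through 180 degrees) can be reproduced in a few lines: consecutive displacement vectors computed from independent position fixes share a common noisy endpoint with opposite sign, so they are negatively correlated. A hedged Monte Carlo sketch:

```python
import math
import random

# Hedged re-creation of the simulation logic described above: a stationary
# collar with isotropic Gaussian position error yields apparent "turning
# angles" clustered near 180 degrees.
random.seed(1)
n = 20_000
xs = [random.gauss(0, 5) for _ in range(n)]  # 5 m location error, x
ys = [random.gauss(0, 5) for _ in range(n)]  # 5 m location error, y

cosines = []
for i in range(n - 2):
    v1 = (xs[i + 1] - xs[i], ys[i + 1] - ys[i])  # apparent step i
    v2 = (xs[i + 2] - xs[i + 1], ys[i + 2] - ys[i + 1])  # apparent step i+1
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    cosines.append(dot / norm)

mean_cos = sum(cosines) / len(cosines)  # negative: apparent reversals
```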
Enge, Martin; Arda, H Efsun; Mignardi, Marco; Beausang, John; Bottino, Rita; Kim, Seung K; Quake, Stephen R
2017-10-05
As organisms age, cells accumulate genetic and epigenetic errors that eventually lead to impaired organ function or catastrophic transformation such as cancer. Because aging reflects a stochastic process of increasing disorder, cells in an organ will be individually affected in different ways, thus rendering bulk analyses of postmitotic adult cells difficult to interpret. Here, we directly measure the effects of aging in human tissue by performing single-cell transcriptome analysis of 2,544 human pancreas cells from eight donors spanning six decades of life. We find that islet endocrine cells from older donors display increased levels of transcriptional noise and potential fate drift. By determining the mutational history of individual cells, we uncover a novel mutational signature in healthy aging endocrine cells. Our results demonstrate the feasibility of using single-cell RNA sequencing (RNA-seq) data from primary cells to derive insights into genetic and transcriptional processes that operate on aging human tissue. Copyright © 2017 Elsevier Inc. All rights reserved.
Diuk, Carlos; Tsai, Karin; Wallis, Jonathan; Botvinick, Matthew; Niv, Yael
2013-03-27
Studies suggest that dopaminergic neurons report a unitary, global reward prediction error signal. However, learning in complex real-life tasks, in particular tasks that show hierarchical structure, requires multiple prediction errors that may coincide in time. We used functional neuroimaging to measure prediction error signals in humans performing such a hierarchical task involving simultaneous, uncorrelated prediction errors. Analysis of signals in a priori anatomical regions of interest in the ventral striatum and the ventral tegmental area indeed evidenced two simultaneous, but separable, prediction error signals corresponding to the two levels of hierarchy in the task. This result suggests that suitably designed tasks may reveal a more intricate pattern of firing in dopaminergic neurons. Moreover, the need for downstream separation of these signals implies possible limitations on the number of different task levels that we can learn about simultaneously.
Initializing a Mesoscale Boundary-Layer Model with Radiosonde Observations
NASA Astrophysics Data System (ADS)
Berri, Guillermo J.; Bertossa, Germán
2018-01-01
A mesoscale boundary-layer model is used to simulate low-level regional wind fields over the La Plata River of South America, a region characterized by a strong daily cycle of land-river surface-temperature contrast and low-level circulations of sea-land breeze type. The initial and boundary conditions are defined from a limited number of local observations and the upper boundary condition is taken from the only radiosonde observations available in the region. The study considers 14 different upper boundary conditions defined from the radiosonde data at standard levels, significant levels, level of the inversion base and interpolated levels at fixed heights, all of them within the first 1500 m. The period of analysis is 1994-2008 during which eight daily observations from 13 weather stations of the region are used to validate the 24-h surface-wind forecast. The model errors are defined as the root-mean-square of relative error in wind-direction frequency distribution and mean wind speed per wind sector. Wind-direction errors are greater than wind-speed errors and show significant dispersion among the different upper boundary conditions, not present in wind speed, revealing a sensitivity to the initialization method. The wind-direction errors show a well-defined daily cycle, not evident in wind speed, with the minimum at noon and the maximum at dusk, but no systematic deterioration with time. The errors grow with the height of the upper boundary condition level, in particular wind direction, and double the errors obtained when the upper boundary condition is defined from the lower levels. The conclusion is that defining the model upper boundary condition from radiosonde data closer to the ground minimizes the low-level wind-field errors throughout the region.
Mental representation of symbols as revealed by vocabulary errors in two bonobos (Pan paniscus).
Lyn, Heidi
2007-10-01
Error analysis has been used in humans to detect implicit representations and categories in language use. The present study utilizes the same technique to report on mental representations and categories in symbol use by two bonobos (Pan paniscus). These bonobos have been shown in published reports to comprehend English at the level of a two-and-a-half-year-old child and to use a keyboard with over 200 visuographic symbols (lexigrams). In this study, vocabulary test errors from over 10 years of data revealed auditory, visual, and spatio-temporal generalizations (errors were more likely to be items that looked like, sounded like, or were frequently associated with the sample item in space or in time), as well as hierarchical and conceptual categorizations. These error data, like those of humans, are a result of spontaneous responding rather than specific training and do not solely depend upon the sample mode (e.g., auditory similarity errors are not universally more frequent with an English sample, nor were visual similarity errors universally more frequent with a photograph sample). However, unlike humans, these bonobos do not make errors based on syntactical confusions (e.g., confusing semantically unrelated nouns), suggesting that they may not separate syntactical and semantic information. These data suggest that apes spontaneously create a complex, hierarchical web of representations when exposed to a symbol system.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-15
... management of human error in its operations and system safety programs, and the status of PTC implementation... UP's safety management policies and programs associated with human error, operational accident and... Chairman of the Board of Inquiry 2. Introduction of the Board of Inquiry and Technical Panel 3...
Intravenous Chemotherapy Compounding Errors in a Follow-Up Pan-Canadian Observational Study.
Gilbert, Rachel E; Kozak, Melissa C; Dobish, Roxanne B; Bourrier, Venetia C; Koke, Paul M; Kukreti, Vishal; Logan, Heather A; Easty, Anthony C; Trbovich, Patricia L
2018-05-01
Intravenous (IV) compounding safety has garnered recent attention as a result of high-profile incidents, awareness efforts from the safety community, and increasingly stringent practice standards. New research with more-sensitive error detection techniques continues to reinforce that error rates with manual IV compounding are unacceptably high. In 2014, our team published an observational study that described three types of previously unrecognized and potentially catastrophic latent chemotherapy preparation errors in Canadian oncology pharmacies that would otherwise be undetectable. We expand on this research and explore whether additional potential human failures are yet to be addressed by practice standards. Field observations were conducted in four cancer center pharmacies in four Canadian provinces from January 2013 to February 2015. Human factors specialists observed and interviewed pharmacy managers, oncology pharmacists, pharmacy technicians, and pharmacy assistants as they carried out their work. Emphasis was on latent errors (potential human failures) that could lead to outcomes such as wrong drug, dose, or diluent. Given the relatively short observational period, no active failures or actual errors were observed. However, 11 latent errors in chemotherapy compounding were identified. In terms of severity, all 11 errors create the potential for a patient to receive the wrong drug or dose, which in the context of cancer care, could lead to death or permanent loss of function. Three of the 11 practices were observed in our previous study, but eight were new. Applicable Canadian and international standards and guidelines do not explicitly address many of the potentially error-prone practices observed. We observed a significant degree of risk for error in manual mixing practice. These latent errors may exist in other regions where manual compounding of IV chemotherapy takes place. 
Continued efforts to advance standards, guidelines, technological innovation, and chemical quality testing are needed.
A GPU-based symmetric non-rigid image registration method in human lung.
Haghighi, Babak; D Ellingwood, Nathan; Yin, Youbing; Hoffman, Eric A; Lin, Ching-Long
2018-03-01
Quantitative computed tomography (QCT) of the lungs plays an increasing role in identifying sub-phenotypes of pathologies previously lumped into broad categories such as chronic obstructive pulmonary disease and asthma. Methods for image matching and linking multiple lung volumes have proven useful in linking structure to function and in the identification of regional longitudinal changes. Here, we seek to improve the accuracy of image matching via the use of a symmetric multi-level non-rigid registration employing an inverse consistent (IC) transformation whereby images are registered both in the forward and reverse directions. To develop the symmetric method, two similarity measures, the sum of squared intensity difference (SSD) and the sum of squared tissue volume difference (SSTVD), were used. The method is based on a novel generic mathematical framework to include forward and backward transformations, simultaneously, eliminating the need to compute the inverse transformation. Two implementations were used to assess the proposed method: a two-dimensional (2-D) implementation using synthetic examples with SSD, and a multi-core CPU and graphics processing unit (GPU) implementation with SSTVD for three-dimensional (3-D) human lung datasets (six normal adults studied at total lung capacity (TLC) and functional residual capacity (FRC)). Success was evaluated in terms of the IC transformation consistency serving to link TLC to FRC. 2-D registration on synthetic images, using both symmetric and non-symmetric SSD methods, and comparison of displacement fields showed that the symmetric method gave a symmetrical grid shape and reduced IC errors, with the mean values of IC errors decreased by 37%. Results for both symmetric and non-symmetric transformations of human datasets showed that the symmetric method gave better results for IC errors in all cases, with mean values of IC errors for the symmetric method lower than the non-symmetric methods using both SSD and SSTVD. 
The GPU version demonstrated an average of 43 times speedup and ~5.2 times speedup over the single-threaded and 12-threaded CPU versions, respectively. Run times with the GPU were as fast as 2 min. The symmetric method improved the inverse consistency, aiding the use of image registration in the QCT-based evaluation of the lung.
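The inverse-consistency (IC) error used as the success criterion above can be sketched as the mean residual of composing the forward and backward transformations; the 2-D scale-and-shift pair below is a toy stand-in for the study's multi-level non-rigid transformations.

```python
import numpy as np

def ic_error(forward, backward, points):
    """Mean inverse-consistency error: distance between x and
    backward(forward(x)), averaged over sample points."""
    mapped = np.array([backward(forward(p)) for p in points])
    return float(np.mean(np.linalg.norm(mapped - points, axis=1)))

# Toy 2-D pair: `bwd` is the exact inverse of `fwd`, so the IC error
# vanishes up to floating-point rounding; an inconsistent pair would not.
fwd = lambda p: 1.2 * p + np.array([3.0, -1.0])
bwd = lambda p: (p - np.array([3.0, -1.0])) / 1.2

pts = np.random.default_rng(0).uniform(-10.0, 10.0, size=(100, 2))
err = ic_error(fwd, bwd, pts)   # ~0 for an exactly invertible pair
```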
Modeling for Military Operational Medicine Scientific and Technical Objectives
2005-09-01
...more accurate measurements and less error in interpreting the measurements, since the sensor units are placed directly under armor; and (3) new material that matches the...
Spacecraft and propulsion technician error
NASA Astrophysics Data System (ADS)
Schultz, Daniel Clyde
Commercial aviation and commercial space similarly launch, fly, and land passenger vehicles. Unlike aviation, the U.S. government has not established maintenance policies for commercial space. This study conducted a mixed methods review of 610 U.S. space launches from 1984 through 2011, which included 31 failures. An analysis of the failure causal factors showed that human error accounted for 76% of those failures, which included workmanship error accounting for 29% of the failures. With the imminent future of commercial space travel, the increased potential for the loss of human life demands that changes be made to the standardized procedures, training, and certification to reduce human error and failure rates. Several recommendations were made by this study to the FAA's Office of Commercial Space Transportation, space launch vehicle operators, and maintenance technician schools in an effort to increase the safety of the space transportation passengers.
ERIC Educational Resources Information Center
Waugh, Rebecca E.
2010-01-01
Simultaneous prompting is an errorless learning strategy designed to reduce the number of errors students make; however, research has shown a disparity in the number of errors students make during instructional versus probe trials. This study directly examined the effects of error correction versus no error correction during probe trials on the…
ERIC Educational Resources Information Center
Waugh, Rebecca E.; Alberto, Paul A.; Fredrick, Laura D.
2011-01-01
Simultaneous prompting is an errorless learning strategy designed to reduce the number of errors students make; however, research has shown a disparity in the number of errors students make during instructional versus probe trials. This study directly examined the effects of error correction versus no error correction during probe trials on the…
Decision Making In A High-Tech World: Automation Bias and Countermeasures
NASA Technical Reports Server (NTRS)
Mosier, Kathleen L.; Skitka, Linda J.; Burdick, Mark R.; Heers, Susan T.; Rosekind, Mark R. (Technical Monitor)
1996-01-01
Automated decision aids and decision support systems have become essential tools in many high-tech environments. In aviation, for example, flight management systems computers not only fly the aircraft, but also calculate fuel efficient paths, detect and diagnose system malfunctions and abnormalities, and recommend or carry out decisions. Air Traffic Controllers will soon be utilizing decision support tools to help them predict and detect potential conflicts and to generate clearances. Other fields as disparate as nuclear power plants and medical diagnostics are similarly becoming more and more automated. Ideally, the combination of human decision maker and automated decision aid should result in a high-performing team, maximizing the advantages of additional cognitive and observational power in the decision-making process. In reality, however, the presence of these aids often short-circuits the way that even very experienced decision makers have traditionally handled tasks and made decisions, and introduces opportunities for new decision heuristics and biases. Results of recent research investigating the use of automated aids have indicated the presence of automation bias, that is, errors made when decision makers rely on automated cues as a heuristic replacement for vigilant information seeking and processing. Automation commission errors, i.e., errors made when decision makers inappropriately follow an automated directive, or automation omission errors, i.e., errors made when humans fail to take action or notice a problem because an automated aid fails to inform them, can result from this tendency. Evidence of the tendency to make automation-related omission and commission errors has been found in pilot self reports, in studies using pilots in flight simulations, and in non-flight decision making contexts with student samples. 
Considerable research has found that increasing social accountability can successfully ameliorate a broad array of cognitive biases and resultant errors. To what extent these effects generalize to performance situations is not yet empirically established. The two studies to be presented represent concurrent efforts, with student and professional pilot samples, to determine the effects of accountability pressures on automation bias and on the verification of the accurate functioning of automated aids. Students (Experiment 1) and commercial pilots (Experiment 2) performed simulated flight tasks using automated aids. In both studies, participants who perceived themselves as accountable for their strategies of interaction with the automation were significantly more likely to verify its correctness, and committed significantly fewer automation-related errors than those who did not report this perception.
42 CFR 1005.23 - Harmless error.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 42 Public Health 5 2012-10-01 2012-10-01 false Harmless error. 1005.23 Section 1005.23 Public Health OFFICE OF INSPECTOR GENERAL-HEALTH CARE, DEPARTMENT OF HEALTH AND HUMAN SERVICES OIG AUTHORITIES APPEALS OF EXCLUSIONS, CIVIL MONEY PENALTIES AND ASSESSMENTS § 1005.23 Harmless error. No error in either...
42 CFR 1005.23 - Harmless error.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 42 Public Health 5 2014-10-01 2014-10-01 false Harmless error. 1005.23 Section 1005.23 Public Health OFFICE OF INSPECTOR GENERAL-HEALTH CARE, DEPARTMENT OF HEALTH AND HUMAN SERVICES OIG AUTHORITIES APPEALS OF EXCLUSIONS, CIVIL MONEY PENALTIES AND ASSESSMENTS § 1005.23 Harmless error. No error in either...
42 CFR 1005.23 - Harmless error.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 42 Public Health 5 2010-10-01 2010-10-01 false Harmless error. 1005.23 Section 1005.23 Public Health OFFICE OF INSPECTOR GENERAL-HEALTH CARE, DEPARTMENT OF HEALTH AND HUMAN SERVICES OIG AUTHORITIES APPEALS OF EXCLUSIONS, CIVIL MONEY PENALTIES AND ASSESSMENTS § 1005.23 Harmless error. No error in either...
42 CFR 1005.23 - Harmless error.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 42 Public Health 5 2013-10-01 2013-10-01 false Harmless error. 1005.23 Section 1005.23 Public Health OFFICE OF INSPECTOR GENERAL-HEALTH CARE, DEPARTMENT OF HEALTH AND HUMAN SERVICES OIG AUTHORITIES APPEALS OF EXCLUSIONS, CIVIL MONEY PENALTIES AND ASSESSMENTS § 1005.23 Harmless error. No error in either...
42 CFR 1005.23 - Harmless error.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 42 Public Health 5 2011-10-01 2011-10-01 false Harmless error. 1005.23 Section 1005.23 Public Health OFFICE OF INSPECTOR GENERAL-HEALTH CARE, DEPARTMENT OF HEALTH AND HUMAN SERVICES OIG AUTHORITIES APPEALS OF EXCLUSIONS, CIVIL MONEY PENALTIES AND ASSESSMENTS § 1005.23 Harmless error. No error in either...
42 CFR 3.552 - Harmless error.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 42 Public Health 1 2010-10-01 2010-10-01 false Harmless error. 3.552 Section 3.552 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL PROVISIONS PATIENT SAFETY ORGANIZATIONS AND PATIENT SAFETY WORK PRODUCT Enforcement Program § 3.552 Harmless error. No error in either the...
45 CFR 98.100 - Error Rate Report.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 45 Public Welfare 1 2013-10-01 2013-10-01 false Error Rate Report. 98.100 Section 98.100 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND Error Rate Reporting § 98.100 Error Rate Report. (a) Applicability—The requirements of this subpart...
45 CFR 98.100 - Error Rate Report.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 45 Public Welfare 1 2014-10-01 2014-10-01 false Error Rate Report. 98.100 Section 98.100 Public Welfare Department of Health and Human Services GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND Error Rate Reporting § 98.100 Error Rate Report. (a) Applicability—The requirements of this subpart...
45 CFR 98.100 - Error Rate Report.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 45 Public Welfare 1 2012-10-01 2012-10-01 false Error Rate Report. 98.100 Section 98.100 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND Error Rate Reporting § 98.100 Error Rate Report. (a) Applicability—The requirements of this subpart...
45 CFR 98.100 - Error Rate Report.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 45 Public Welfare 1 2011-10-01 2011-10-01 false Error Rate Report. 98.100 Section 98.100 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND Error Rate Reporting § 98.100 Error Rate Report. (a) Applicability—The requirements of this subpart...
Defense Mapping Agency (DMA) Raster-to-Vector Analysis
1984-11-30
model) to pinpoint critical deficiencies and understand trade-offs between alternative solutions. This may be exemplified by the allocation of human ...process, prone to errors (i.e., human operator eye/motor control limitations), and its time-consuming nature (as a function of data density). It should...achieved through the facilities of computer interactive graphics. Each error or anomaly is individually identified by a human operator and corrected
Ehrampoosh, Shervin; Dave, Mohit; Kia, Michael A; Rablau, Corneliu; Zadeh, Mehrdad H
2013-01-01
This paper presents an enhanced haptic-enabled master-slave teleoperation system which can be used to provide force feedback to surgeons in minimally invasive surgery (MIS). One of the research goals was to develop a combined-control architecture framework that included both direct force reflection (DFR) and position-error-based (PEB) control strategies. To achieve this goal, it was essential to measure accurately the direct contact forces between deformable bodies and a robotic tool tip. To measure the forces at a surgical tool tip and enhance the performance of the teleoperation system, an optical force sensor was designed, prototyped, and added to a robot manipulator. The enhanced teleoperation architecture was formulated by developing mathematical models for the optical force sensor, the extended slave robot manipulator, and the combined-control strategy. Human factor studies were also conducted to (a) examine experimentally the performance of the enhanced teleoperation system with the optical force sensor, and (b) study human haptic perception during the identification of remote object deformability. The first experiment was carried out to discriminate deformability of objects when human subjects were in direct contact with deformable objects by means of a laparoscopic tool. The control parameters were then tuned based on the results of this experiment using a gain-scheduling method. The second experiment was conducted to study the effectiveness of the force feedback provided through the enhanced teleoperation system. The results show that the force feedback increased the ability of subjects to correctly identify materials of different deformable types. In addition, the virtual force feedback provided by the teleoperation system comes close to the real force feedback experienced in direct MIS. The experimental results provide design guidelines for choosing and validating the control architecture and the optical force sensor.
Shibayama, Yusuke; Arimura, Hidetaka; Hirose, Taka-Aki; Nakamoto, Takahiro; Sasaki, Tomonari; Ohga, Saiji; Matsushita, Norimasa; Umezu, Yoshiyuki; Nakamura, Yasuhiko; Honda, Hiroshi
2017-05-01
The setup errors and organ motion errors pertaining to clinical target volume (CTV) have been considered as two major causes of uncertainties in the determination of the CTV-to-planning target volume (PTV) margins for prostate cancer radiation treatment planning. We based our study on the assumption that interfractional target shape variations are not negligible as another source of uncertainty for the determination of precise CTV-to-PTV margins. Thus, we investigated the interfractional shape variations of CTVs based on a point distribution model (PDM) for prostate cancer radiation therapy. To quantitate the shape variations of CTVs, the PDM was applied to the contours of 4 types of CTV regions (low-risk, intermediate-risk, high-risk CTVs, and prostate plus entire seminal vesicles), which were delineated by considering prostate cancer risk groups on planning computed tomography (CT) and cone beam CT (CBCT) images of 73 fractions of 10 patients. The standard deviations (SDs) of the interfractional random errors for shape variations were obtained from covariance matrices based on the PDMs, which were generated from vertices of triangulated CTV surfaces. The correspondences between CTV surface vertices were determined based on a thin-plate spline robust point matching algorithm. The systematic error for shape variations was defined as the average deviation between surfaces of an average CTV and planning CTVs, and the random error as the average deviation of CTV surface vertices for fractions from an average CTV surface. The means of the SDs of the systematic errors for the four types of CTVs ranged from 1.0 to 2.0 mm along the anterior direction, 1.2 to 2.6 mm along the posterior direction, 1.0 to 2.5 mm along the superior direction, 0.9 to 1.9 mm along the inferior direction, 0.9 to 2.6 mm along the right direction, and 1.0 to 3.0 mm along the left direction. 
Concerning the random errors, the means of the SDs ranged from 0.9 to 1.2 mm along the anterior direction, 1.0 to 1.4 mm along the posterior direction, 0.9 to 1.3 mm along the superior direction, 0.8 to 1.0 mm along the inferior direction, 0.8 to 0.9 mm along the right direction, and 0.8 to 1.0 mm along the left direction. Since the shape variations were not negligible for intermediate and high-risk CTVs, they should be taken into account for the determination of the CTV-to-PTV margins in radiation treatment planning of prostate cancer. © 2017 American Association of Physicists in Medicine.
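Given corresponded surface vertices, the systematic and random shape errors as defined above can be sketched directly; the synthetic vertex arrays below stand in for the thin-plate-spline-matched surfaces, whose construction is beyond this sketch.

```python
import numpy as np

def shape_errors(planning_vertices, fraction_vertices):
    """Systematic and random shape errors from corresponded vertices.

    planning_vertices: (V, 3) planning-CTV surface vertices.
    fraction_vertices: (F, V, 3) matched vertices over F fractions.
    """
    avg_surface = fraction_vertices.mean(axis=0)   # "average CTV" surface
    # Systematic error: mean deviation of the average CTV surface from
    # the planning CTV surface.
    systematic = np.linalg.norm(avg_surface - planning_vertices, axis=1).mean()
    # Random error: mean per-fraction deviation from the average CTV.
    random_err = np.linalg.norm(fraction_vertices - avg_surface, axis=2).mean()
    return float(systematic), float(random_err)

# Synthetic check: five fractions shifted by a fixed 1 mm along x should
# give a 1 mm systematic error and zero random error.
rng = np.random.default_rng(42)
planning = rng.normal(size=(200, 3))
fractions = np.stack([planning + np.array([1.0, 0.0, 0.0])] * 5)
sys_err, rand_err = shape_errors(planning, fractions)
```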
Krigolson, Olav E; Hassall, Cameron D; Handy, Todd C
2014-03-01
Our ability to make decisions is predicated upon our knowledge of the outcomes of the actions available to us. Reinforcement learning theory posits that actions followed by a reward or punishment acquire value through the computation of prediction errors: discrepancies between the predicted and the actual reward. A multitude of neuroimaging studies have demonstrated that rewards and punishments evoke neural responses that appear to reflect reinforcement learning prediction errors [e.g., Krigolson, O. E., Pierce, L. J., Holroyd, C. B., & Tanaka, J. W. Learning to become an expert: Reinforcement learning and the acquisition of perceptual expertise. Journal of Cognitive Neuroscience, 21, 1833-1840, 2009; Bayer, H. M., & Glimcher, P. W. Midbrain dopamine neurons encode a quantitative reward prediction error signal. Neuron, 47, 129-141, 2005; O'Doherty, J. P. Reward representations and reward-related learning in the human brain: Insights from neuroimaging. Current Opinion in Neurobiology, 14, 769-776, 2004; Holroyd, C. B., & Coles, M. G. H. The neural basis of human error processing: Reinforcement learning, dopamine, and the error-related negativity. Psychological Review, 109, 679-709, 2002]. Here, we used the event-related brain potential (ERP) technique to demonstrate that not only do rewards elicit a neural response akin to a prediction error but also that this signal rapidly diminished and propagated to the time of choice presentation with learning. Specifically, in a simple, learnable gambling task, we show that novel rewards elicited a feedback error-related negativity that rapidly decreased in amplitude with learning. Furthermore, we demonstrate the existence of a reward positivity at choice presentation, a previously unreported ERP component that has a similar timing and topography as the feedback error-related negativity that increased in amplitude with learning. 
The pattern of results we observed mirrored the output of a computational model that we implemented to compute reward prediction errors and the changes in amplitude of these prediction errors at the time of choice presentation and reward delivery. Our results provide further support that the computations that underlie human learning and decision-making follow reinforcement learning principles.
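The diminishing prediction error described above follows directly from the standard reinforcement-learning delta rule; the learning rate and reward values in this sketch are illustrative.

```python
def run_trials(true_reward=1.0, alpha=0.2, n_trials=30):
    """Delta-rule value learning: the prediction error at feedback,
    delta = r - V, shrinks as the value estimate V converges on r."""
    V = 0.0                        # learned value of the chosen option
    deltas = []
    for _ in range(n_trials):
        delta = true_reward - V    # prediction error at reward delivery
        V += alpha * delta         # delta-rule update
        deltas.append(delta)
    return V, deltas

V, deltas = run_trials()
# Early prediction errors are large (deltas[0] == 1.0) and late ones near
# zero, mirroring the diminishing feedback signal described above.
```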
Mission operations and command assurance: Flight operations quality improvements
NASA Technical Reports Server (NTRS)
Welz, Linda L.; Bruno, Kristin J.; Kazz, Sheri L.; Potts, Sherrill S.; Witkowski, Mona M.
1994-01-01
Mission Operations and Command Assurance (MO&CA) is a Total Quality Management (TQM) task on JPL projects to instill quality in flight mission operations. From a system engineering view, MO&CA facilitates communication and problem-solving among flight teams and provides continuous process improvement to reduce risk in mission operations by addressing human factors. The MO&CA task has evolved from participating as a member of the spacecraft team, to an independent team reporting directly to flight project management and providing system-level assurance. JPL flight projects have benefited significantly from MO&CA's effort to contain risk and to prevent errors rather than rework them. MO&CA's ability to provide direct transfer of knowledge allows new projects to benefit from previous and ongoing flight experience.
A Method for the Study of Human Factors in Aircraft Operations
NASA Technical Reports Server (NTRS)
Barnhart, W.; Billings, C.; Cooper, G.; Gilstrap, R.; Lauber, J.; Orlady, H.; Puskas, B.; Stephens, W.
1975-01-01
A method for the study of human factors in the aviation environment is described. A conceptual framework is provided within which pilot and other human errors in aircraft operations may be studied with the intent of finding out how, and why, they occurred. An information processing model of human behavior serves as the basis for the acquisition and interpretation of information relating to occurrences which involve human error. A systematic method of collecting such data is presented and discussed. The classification of the data is outlined.
An interactive framework for acquiring vision models of 3-D objects from 2-D images.
Motai, Yuichi; Kak, Avinash
2004-02-01
This paper presents a human-computer interaction (HCI) framework for building vision models of three-dimensional (3-D) objects from their two-dimensional (2-D) images. Our framework is based on two guiding principles of HCI: 1) provide the human with as much visual assistance as possible to help the human make a correct input; and 2) verify each input provided by the human for its consistency with the inputs previously provided. For example, when stereo correspondence information is elicited from a human, his/her job is facilitated by superimposing epipolar lines on the images. Although that reduces the possibility of error in the human-marked correspondences, such errors are not entirely eliminated because there can be multiple candidate points close together for complex objects. For another example, when pose-to-pose correspondence is sought from a human, his/her job is made easier by allowing the human to rotate the partial model constructed in the previous pose in relation to the partial model for the current pose. While this facility reduces the incidence of human-supplied pose-to-pose correspondence errors, such errors cannot be eliminated entirely because of confusion created when multiple candidate features exist close together. Each input provided by the human is therefore checked against the previous inputs by invoking situation-specific constraints. Different types of constraints (and different human-computer interaction protocols) are needed for the extraction of polygonal features and for the extraction of curved features. We will show results on both polygonal objects and objects containing curved features.
Atmospheric refraction effects on baseline error in satellite laser ranging systems
NASA Technical Reports Server (NTRS)
Im, K. E.; Gardner, C. S.
1982-01-01
Because of the mathematical complexities involved in exact analyses of baseline errors, it is not easy to isolate atmospheric refraction effects; however, by making certain simplifying assumptions about the ranging system geometry, relatively simple expressions can be derived which relate the baseline errors directly to the refraction errors. The results indicate that even in the absence of other errors, the baseline error for intercontinental baselines can be more than an order of magnitude larger than the refraction error.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Runxiao, L; Aikun, W; Xiaomei, F
2015-06-15
Purpose: To compare two registration methods in CBCT-guided radiotherapy for cervical carcinoma, analyze the setup errors for each registration method, and determine the margin required to extend the clinical target volume (CTV) to the planning target volume (PTV). Methods: Twenty patients with cervical carcinoma were enrolled. All patients underwent CT simulation in the supine position. The CT images were transferred to the treatment planning system, the CTV, PTV, and organs at risk (OAR) were defined, and the data were transmitted to the XVI workstation. CBCT scans were performed before radiotherapy and registered to the planning CT images using bone and grey-value registration methods. The two methods were compared to obtain left-right (X), superior-inferior (Y), and anterior-posterior (Z) setup errors, and the margin required for CTV to PTV was calculated. Results: Setup errors were unavoidable in postoperative cervical carcinoma irradiation. The setup errors measured by the bone method (systematic ± random) in the X (left-right), Y (superior-inferior), and Z (anterior-posterior) directions were (0.24 ± 3.62), (0.77 ± 5.05), and (0.13 ± 3.89) mm, respectively; the setup errors measured by the grey-value method (systematic ± random) in the X, Y, and Z directions were (0.31 ± 3.93), (0.85 ± 5.16), and (0.21 ± 4.12) mm, respectively. The spatial distribution of setup error was largest in the Y direction. The margins were 4 mm on the X axis, 6 mm on the Y axis, and 4 mm on the Z axis, respectively. The two registration methods gave similar results and both are recommended. Conclusion: Both bone and grey-value registration methods can provide an accurate setup-error estimate. PTV margins of 4 mm, 6 mm, and 4 mm in the X, Y, and Z directions, respectively, are suggested for postoperative radiotherapy for cervical carcinoma.
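The abstract does not state how the margins were derived from the error components, but one widely used recipe, the van Herk formula M = 2.5Σ + 0.7σ, reproduces the reported margins when the bone-registration pairs are read as (Σ, σ); the sketch below shows that arithmetic, not necessarily the study's actual method.

```python
import math

def ctv_to_ptv_margin(Sigma, sigma):
    """van Herk-style recipe: margin = 2.5*Sigma + 0.7*sigma (all mm).
    Whether this is the recipe used in the study is an assumption."""
    return 2.5 * Sigma + 0.7 * sigma

# Bone-registration components reported above, read as (Sigma, sigma):
margins = {axis: ctv_to_ptv_margin(S, s)
           for axis, S, s in [("X", 0.24, 3.62),
                              ("Y", 0.77, 5.05),
                              ("Z", 0.13, 3.89)]}
# Rounding up to whole millimetres gives 4 mm (X), 6 mm (Y), 4 mm (Z),
# matching the margins reported in the abstract.
```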
Wolf, Dwayne A; Drake, Stacy A; Snow, Francine K
2017-12-01
In the course of fulfilling their statutory role, physicians performing medicolegal investigations may recognize clinical colleagues' medical errors. If the error is found to have led directly to the patient's death (missed diagnosis or incorrect diagnosis, for example), then the forensic pathologist has a professional responsibility to include the information in the autopsy report and make sure that the family is appropriately informed. When the error is significant but did not lead directly to the patient's demise, ethical questions may arise regarding the obligations of the medical examiner to disclose the error to the clinicians or to the family. This case depicts the discovery of medical error likely unrelated to the cause of death and describes one possible ethical approach to disclosure derived from an ethical reasoning model addressing ethical principles of respect for persons/autonomy, beneficence, nonmaleficence, and justice.
Development of multiple-eye PIV using mirror array
NASA Astrophysics Data System (ADS)
Maekawa, Akiyoshi; Sakakibara, Jun
2018-06-01
In order to reduce particle image velocimetry measurement error, we manufactured an ellipsoidal polyhedral mirror and placed it between a camera and the flow target to capture n images of identical particles from n (=80 maximum) different directions. The 3D particle positions were determined from the ensemble average of the nC2 intersecting points of the back-projected lines of sight to a particle found in any combination of two of the n images. The method was then applied to a rigid-body rotating flow and a turbulent pipe flow. In the former measurement, bias error and random error fell in a range of ±0.02 pixels and 0.02–0.05 pixels, respectively; additionally, random error decreased in proportion to . In the latter measurement, in which the measured value was compared to direct numerical simulation, bias error was reduced and random error also decreased in proportion to .
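The reconstruction step described above, averaging the intersections of back-projected lines of sight over all pairs of views, can be sketched with the standard closest-point construction for two lines; the camera geometry below is an illustrative assumption.

```python
import itertools
import numpy as np

def midpoint_of_closest_approach(p1, d1, p2, d2):
    """Midpoint of the shortest segment between lines p_i + t*d_i
    (assumes the two lines are not parallel)."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    w = p1 - p2
    b = d1 @ d2
    d, e = d1 @ w, d2 @ w
    denom = 1.0 - b * b            # a = c = 1 after normalisation
    t = (b * e - d) / denom
    s = (e - b * d) / denom
    return 0.5 * ((p1 + t * d1) + (p2 + s * d2))

def triangulate(origins, directions):
    """Average the pairwise line intersections over all C(n, 2) pairs."""
    mids = [midpoint_of_closest_approach(origins[i], directions[i],
                                         origins[j], directions[j])
            for i, j in itertools.combinations(range(len(origins)), 2)]
    return np.mean(mids, axis=0)

# Four illustrative viewpoints, all looking at the same particle:
target = np.array([1.0, 2.0, 3.0])
cams = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0],
                 [0.0, 10.0, 0.0], [0.0, 0.0, 10.0]])
est = triangulate(cams, target - cams)   # recovers `target`
```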
New methodology to reconstruct in 2-D the cuspal enamel of modern human lower molars.
Modesto-Mata, Mario; García-Campos, Cecilia; Martín-Francés, Laura; Martínez de Pinillos, Marina; García-González, Rebeca; Quintino, Yuliet; Canals, Antoni; Lozano, Marina; Dean, M Christopher; Martinón-Torres, María; Bermúdez de Castro, José María
2017-08-01
In recent years, different methodologies have been developed to reconstruct worn teeth. In this article, we propose a new 2-D methodology to reconstruct the worn enamel of lower molars. Our main goals are to reconstruct molars with a high level of accuracy when measuring relevant histological variables and to validate the methodology by calculating the errors associated with the measurements. This methodology is based on polynomial regression equations, and has been validated using two different dental variables: cuspal enamel thickness and crown height of the protoconid. In order to perform the validation process, simulated worn modern human molars were employed. The associated errors of the measurements were also estimated applying methodologies previously proposed by other authors. The mean percentage error estimated in reconstructed molars for these two variables, in comparison with their real values, is -2.17% for the cuspal enamel thickness of the protoconid and -3.18% for the crown height of the protoconid. These errors improve substantially on those of other methodologies, both in interobserver error and in measurement accuracy. The new methodology based on polynomial regressions can be confidently applied to the reconstruction of cuspal enamel of lower molars, as it improves the accuracy of the measurements and reduces the interobserver error. The present study shows that it is important to validate all methodologies in order to know their associated errors. This new methodology can be easily exported to other modern human populations, the human fossil record, and forensic sciences. © 2017 Wiley Periodicals, Inc.
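The general idea, fitting a polynomial to the preserved profile and evaluating it over the worn region, can be sketched as follows; the synthetic profile, noise level, and polynomial degree are illustrative assumptions, not the published regression equations.

```python
import numpy as np

# Synthetic "enamel profile": height y as a function of position x,
# with the cusp tip (x >= 0.7) worn away.
rng = np.random.default_rng(1)
x_full = np.linspace(0.0, 1.0, 50)
true_profile = lambda x: -4.0 * (x - 0.5) ** 2 + 2.0   # illustrative shape

preserved = x_full < 0.7                    # unworn part of the crown
x_obs = x_full[preserved]
y_obs = true_profile(x_obs) + rng.normal(0.0, 0.005, x_obs.size)

# Fit a polynomial to the preserved profile, then evaluate it over the
# worn region to reconstruct the missing cusp.
coeffs = np.polyfit(x_obs, y_obs, deg=2)
y_rec = np.polyval(coeffs, x_full[~preserved])

# Mean percentage error of the reconstruction against the known profile,
# analogous to the validation the abstract reports.
y_true = true_profile(x_full[~preserved])
pct_err = 100.0 * np.mean((y_rec - y_true) / y_true)
```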
Safety and Performance Analysis of the Non-Radar Oceanic/Remote Airspace In-Trail Procedure
NASA Technical Reports Server (NTRS)
Carreno, Victor A.; Munoz, Cesar A.
2007-01-01
This document presents a safety and performance analysis of the nominal case for the In-Trail Procedure (ITP) in a non-radar oceanic/remote airspace. The analysis estimates the risk of collision between the aircraft performing the ITP and a reference aircraft. The risk of collision is only estimated for the ITP maneuver and it is based on nominal operating conditions. The analysis does not consider human error, communication error conditions, or the normal risk of flight present in current operations. The hazards associated with human error and communication errors are evaluated in an Operational Hazards Analysis presented elsewhere.
Measurement-based quantum communication with resource states generated by entanglement purification
NASA Astrophysics Data System (ADS)
Wallnöfer, J.; Dür, W.
2017-01-01
We investigate measurement-based quantum communication with noisy resource states that are generated by entanglement purification. We consider the transmission of encoded information via noisy quantum channels using a measurement-based implementation of encoding, error correction, and decoding. We show that such an approach offers advantages over direct transmission, gate-based error correction, and measurement-based schemes with direct generation of resource states. We analyze the noise structure of resource states generated by entanglement purification and show that a local error model, i.e., noise acting independently on all qubits of the resource state, is a good approximation in general, and provides an exact description for Greenberger-Horne-Zeilinger states. The latter are resources for a measurement-based implementation of error-correction codes for bit-flip or phase-flip errors. This provides an approach to link the recently found very high thresholds for fault-tolerant measurement-based quantum information processing based on local error models for resource states with error thresholds for gate-based computational models.
Pakula, Malgorzata M; Maier, Thorsten J; Vorup-Jensen, Thomas
2017-06-01
Amino acids (AAs) support a broad range of functions in living organisms, including several that affect the immune system. The functions of the immune system are affected when free AAs are depleted or in excess because of external factors, such as starvation, or because of genetic factors, such as inborn errors of metabolism. Areas covered: In this review, we discuss the current insights into how free AAs affect immune responses. When possible, we make comparisons to known disease states resulting from inborn errors of metabolism, in which changed levels of AAs or AA metabolites provide insight into the impact of AAs on the human immune system in vivo. We also explore the literature describing how changes in AA levels might provide pharmaceutical targets for safe immunomodulatory treatment. Expert opinion: The impact of free AAs on the immune system is a neglected topic in most immunology textbooks. That neglect is undeserved, because free AAs have both direct and indirect effects on the immune system. Consistent choices of pre-clinical models and better strategies for creating formulations are required to gain clinical impact.
Homing by path integration when a locomotion trajectory crosses itself.
Yamamoto, Naohide; Meléndez, Jayleen A; Menzies, Derek T
2014-01-01
Path integration is a process with which navigators derive their current position and orientation by integrating self-motion signals along a locomotion trajectory. It has been suggested that path integration becomes disproportionately erroneous when the trajectory crosses itself. However, there is a possibility that this previous finding was confounded by effects of the length of a traveled path and the amount of turns experienced along the path, two factors that are known to affect path integration performance. The present study was designed to investigate whether the crossover of a locomotion trajectory truly increases errors of path integration. In an experiment, blindfolded human navigators were guided along four paths that varied in their lengths and turns, and attempted to walk directly back to the beginning of the paths. Only one of the four paths contained a crossover. Results showed that errors yielded from the path containing the crossover were not always larger than those observed in other paths, and the errors were attributed solely to the effects of longer path lengths or greater degrees of turns. These results demonstrated that path crossover does not always cause significant disruption in path integration processes. Implications of the present findings for models of path integration are discussed.
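The ideal (error-free) path integrator that navigators approximate can be modeled as dead reckoning over the guided segments, with the homing response as the vector back to the start. This is a toy sketch for illustration, not the study's analysis.

```python
import math

def integrate_path(segments):
    """Ideal path integrator: accumulate (turn_deg, distance) segments
    from the origin, initially facing +y. A toy dead-reckoning model."""
    x = y = 0.0
    heading = 90.0  # degrees
    for turn_deg, dist in segments:
        heading += turn_deg
        x += dist * math.cos(math.radians(heading))
        y += dist * math.sin(math.radians(heading))
    return x, y, heading

def homing_response(segments):
    """Distance and bearing of the direct walk back to the start."""
    x, y, _ = integrate_path(segments)
    return math.hypot(x, y), math.degrees(math.atan2(-y, -x))

# A closed square path: the correct homing distance is zero
home_dist, _ = homing_response([(0, 2), (-90, 2), (-90, 2), (-90, 2)])
```

Comparing a participant's actual return vector against this ideal response yields the path-integration error that the experiment measured across paths of different lengths, turns, and crossover structure.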
A cognitive taxonomy of medical errors.
Zhang, Jiajie; Patel, Vimla L; Johnson, Todd R; Shortliffe, Edward H
2004-06-01
Propose a cognitive taxonomy of medical errors at the level of individuals and their interactions with technology. Use cognitive theories of human error and human action to develop the theoretical foundations of the taxonomy, develop the structure of the taxonomy, populate the taxonomy with examples of medical error cases, identify cognitive mechanisms for each category of medical error under the taxonomy, and apply the taxonomy to practical problems. Four criteria were used to evaluate the cognitive taxonomy. The taxonomy should be able (1) to categorize major types of errors at the individual level along cognitive dimensions, (2) to associate each type of error with a specific underlying cognitive mechanism, (3) to describe how and explain why a specific error occurs, and (4) to generate intervention strategies for each type of error. The proposed cognitive taxonomy largely satisfies the four criteria at a theoretical and conceptual level. Theoretically, the proposed cognitive taxonomy provides a method to systematically categorize medical errors at the individual level along cognitive dimensions, leads to a better understanding of the underlying cognitive mechanisms of medical errors, and provides a framework that can guide future studies on medical errors. Practically, it provides guidelines for the development of cognitive interventions to decrease medical errors and a foundation for the development of a medical error reporting system that not only categorizes errors but also identifies problems and helps to generate solutions. To validate this model empirically, we will next be performing systematic experimental studies.
Differential sensitivity to human communication in dogs, wolves, and human infants.
Topál, József; Gergely, György; Erdohegyi, Agnes; Csibra, Gergely; Miklósi, Adám
2009-09-04
Ten-month-old infants persistently search for a hidden object at its initial hiding place even after observing it being hidden at another location. Recent evidence suggests that communicative cues from the experimenter contribute to the emergence of this perseverative search error. We replicated these results with dogs (Canis familiaris), who also commit more search errors in ostensive-communicative (in 75% of the total trials) than in noncommunicative (39%) or nonsocial (17%) hiding contexts. However, comparative investigations suggest that communicative signals serve different functions for dogs and infants, whereas human-reared wolves (Canis lupus) do not show doglike context-dependent differences of search errors. We propose that shared sensitivity to human communicative signals stems from convergent social evolution of the Homo and the Canis genera.
Metrics for Business Process Models
NASA Astrophysics Data System (ADS)
Mendling, Jan
Up until now, there has been little research on why people introduce errors in real-world business process models. In a more general context, Simon [404] points to the limitations of cognitive capabilities and concludes that humans act rationally only to a certain extent. Concerning modeling errors, this argument would imply that human modelers lose track of the interrelations of large and complex models due to their limited cognitive capabilities and introduce errors that they would not insert in a small model. A recent study by Mendling et al. [275] explores to what extent certain complexity metrics of business process models have the potential to serve as error determinants. The authors conclude that complexity indeed appears to have an impact on error probability. Before we can test such a hypothesis in a more general setting, we have to establish an understanding of how we can define determinants that drive error probability and how we can measure them.
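Such a hypothesis about error determinants is naturally expressed as a logistic model of error occurrence against a complexity metric. The sketch below fits one by plain gradient descent on invented data; Mendling et al. [275] of course worked with real process-model samples.

```python
import math

def sigmoid(z):
    # numerically stable logistic function
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    ez = math.exp(z)
    return ez / (1.0 + ez)

def fit_error_model(metric, erroneous, lr=0.5, epochs=5000):
    """One-feature logistic regression fitted by gradient descent:
    P(model contains an error) as a function of a complexity metric.
    Returns a predict(metric_value) function. Toy implementation."""
    n = len(metric)
    mean = sum(metric) / n
    std = (sum((m - mean) ** 2 for m in metric) / n) ** 0.5
    xs = [(m - mean) / std for m in metric]   # standardize for stability
    w = b = 0.0
    for _ in range(epochs):
        gw = gb = 0.0
        for x, y in zip(xs, erroneous):
            d = sigmoid(w * x + b) - y
            gw += d * x
            gb += d
        w -= lr * gw / n
        b -= lr * gb / n
    return lambda m: sigmoid(w * ((m - mean) / std) + b)

# Invented sample: node counts of six process models, 1 = error found
predict = fit_error_model([5, 10, 20, 40, 60, 80], [0, 0, 0, 1, 1, 1])
```

On such data the fitted slope is positive, i.e. larger models get higher predicted error probabilities, which is the direction of effect the cited study reports.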
Tsai, Karin; Wallis, Jonathan; Botvinick, Matthew
2013-01-01
Studies suggest that dopaminergic neurons report a unitary, global reward prediction error signal. However, learning in complex real-life tasks, in particular tasks that show hierarchical structure, requires multiple prediction errors that may coincide in time. We used functional neuroimaging to measure prediction error signals in humans performing such a hierarchical task involving simultaneous, uncorrelated prediction errors. Analysis of signals in a priori anatomical regions of interest in the ventral striatum and the ventral tegmental area indeed evidenced two simultaneous, but separable, prediction error signals corresponding to the two levels of hierarchy in the task. This result suggests that suitably designed tasks may reveal a more intricate pattern of firing in dopaminergic neurons. Moreover, the need for downstream separation of these signals implies possible limitations on the number of different task levels that we can learn about simultaneously. PMID:23536092
Associations between errors and contributing factors in aircraft maintenance
NASA Technical Reports Server (NTRS)
Hobbs, Alan; Williamson, Ann
2003-01-01
In recent years cognitive error models have provided insights into the unsafe acts that lead to many accidents in safety-critical environments. Most models of accident causation are based on the notion that human errors occur in the context of contributing factors. However, there is a lack of published information on possible links between specific errors and contributing factors. A total of 619 safety occurrences involving aircraft maintenance were reported using a self-completed questionnaire. Of these occurrences, 96% were related to the actions of maintenance personnel. The types of errors that were involved, and the contributing factors associated with those actions, were determined. Each type of error was associated with a particular set of contributing factors and with specific occurrence outcomes. Among the associations were links between memory lapses and fatigue and between rule violations and time pressure. Potential applications of this research include assisting with the design of accident prevention strategies, the estimation of human error probabilities, and the monitoring of organizational safety performance.
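Associations of the kind reported here (e.g. memory lapses with fatigue) are typically screened with a chi-square test on a contingency table. A minimal sketch with invented counts:

```python
def chi_square_2x2(a, b, c, d):
    """Chi-square statistic (no continuity correction) for a 2x2
    contingency table: rows = error type present/absent, columns =
    contributing factor reported/not reported. The counts used below
    are invented, not the study's data."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# e.g. memory lapses cross-tabulated against reported fatigue
stat = chi_square_2x2(30, 20, 15, 55)   # compare to 3.84 (df=1, p=.05)
```

A statistic above the critical value flags a candidate error-factor link, which can then inform prevention strategies as described above.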
Neuroanatomical dissociation for taxonomic and thematic knowledge in the human brain
Schwartz, Myrna F.; Kimberg, Daniel Y.; Walker, Grant M.; Brecher, Adelyn; Faseyitan, Olufunsho K.; Dell, Gary S.; Mirman, Daniel; Coslett, H. Branch
2011-01-01
It is thought that semantic memory represents taxonomic information differently from thematic information. This study investigated the neural basis for the taxonomic-thematic distinction in a unique way. We gathered picture-naming errors from 86 individuals with poststroke language impairment (aphasia). Error rates were determined separately for taxonomic errors (“pear” in response to apple) and thematic errors (“worm” in response to apple), and their shared variance was regressed out of each measure. With the segmented lesions normalized to a common template, we carried out voxel-based lesion-symptom mapping on each error type separately. We found that taxonomic errors localized to the left anterior temporal lobe and thematic errors localized to the left temporoparietal junction. This is an indication that the contribution of these regions to semantic memory cleaves along taxonomic-thematic lines. Our findings show that a distinction long recognized in the psychological sciences is grounded in the structure and function of the human brain. PMID:21540329
Huh, Yeamin; Smith, David E.; Feng, Meihau Rose
2014-01-01
Human clearance prediction for small- and macro-molecule drugs was evaluated and compared using various scaling methods and statistical analysis. Human clearance is generally well predicted using single- or multiple-species simple allometry for macro- and small-molecule drugs excreted renally. The prediction error is higher for hepatically eliminated small molecules using single- or multiple-species simple allometry scaling, and it appears that the prediction error is mainly associated with drugs with a low hepatic extraction ratio (Eh). The error in human clearance prediction for hepatically eliminated small molecules was reduced using scaling methods with a correction for maximum life span (MLP) or brain weight (BRW). Human clearance of both small- and macro-molecule drugs is well predicted using the monkey liver blood flow method. Predictions using liver blood flow from other species did not work as well, especially for the small-molecule drugs. PMID:21892879
Random measurement error: Why worry? An example of cardiovascular risk factors.
Brakenhoff, Timo B; van Smeden, Maarten; Visseren, Frank L J; Groenwold, Rolf H H
2018-01-01
With the increased use of data not originally recorded for research, such as routine care data (or 'big data'), measurement error is bound to become an increasingly relevant problem in medical research. A common view among medical researchers on the influence of random measurement error (i.e. classical measurement error) is that its presence leads to some degree of systematic underestimation of studied exposure-outcome relations (i.e. attenuation of the effect estimate). For the common situation where the analysis involves at least one exposure and one confounder, we demonstrate that the direction of effect of random measurement error on the estimated exposure-outcome relations can be difficult to anticipate. Using three example studies on cardiovascular risk factors, we illustrate that random measurement error in the exposure and/or confounder can lead to underestimation as well as overestimation of exposure-outcome relations. We therefore advise medical researchers to refrain from making claims about the direction of effect of measurement error in their manuscripts, unless the appropriate inferential tools are used to study or alleviate the impact of measurement error from the analysis.
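The point that random measurement error can bias an adjusted estimate in either direction is easy to demonstrate by simulation: classical error in the exposure attenuates the estimate, while classical error in the confounder leaves residual confounding and can inflate it. All parameter values below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# True data-generating model: the confounder affects both exposure
# and outcome; the true exposure effect is 1.0 (all values invented)
conf = rng.normal(size=n)
expo = 0.8 * conf + rng.normal(size=n)
outc = 1.0 * expo + 2.0 * conf + rng.normal(size=n)

def adjusted_effect(x, c, y):
    """OLS coefficient of x, adjusting for c."""
    X = np.column_stack([np.ones_like(x), x, c])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

true_est = adjusted_effect(expo, conf, outc)
# Classical error in the exposure attenuates the estimate ...
err_in_exposure = adjusted_effect(expo + rng.normal(size=n), conf, outc)
# ... while classical error in the confounder inflates it here,
# because residual confounding leaks part of the confounder effect
err_in_confounder = adjusted_effect(expo, conf + rng.normal(size=n), outc)
```

The same code with a negative confounder effect would flip the direction of the second bias, which is exactly why the authors warn against blanket claims about attenuation.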
Corrections of clinical chemistry test results in a laboratory information system.
Wang, Sihe; Ho, Virginia
2004-08-01
The recently released reports by the Institute of Medicine, To Err Is Human and Patient Safety, have received national attention because of their focus on the problem of medical errors. Although a small number of studies have reported on errors in general clinical laboratories, there are, to our knowledge, no reported studies that focus on errors in pediatric clinical laboratory testing. Our objectives were to characterize the errors that led to corrections of pediatric clinical chemistry results in the laboratory information system, Misys, and to provide initial data on the errors detected in pediatric clinical chemistry laboratories in order to improve patient safety in pediatric health care. All clinical chemistry staff members were informed of the study and were requested to report in writing when a correction was made in the laboratory information system, Misys. Errors were detected either by the clinicians (the results did not fit the patients' clinical conditions) or by the laboratory technologists (the results were double-checked, and the worksheets were carefully examined twice a day). No incident that was discovered before or during the final validation was included. On each Monday of the study, we generated a report from Misys that listed all of the corrections made during the previous week. We then categorized the corrections according to the types and stages of the incidents that led to the corrections. A total of 187 incidents were detected during the 10-month study, representing a 0.26% error detection rate per requisition. The distribution of the detected incidents included 31 (17%) preanalytic incidents, 46 (25%) analytic incidents, and 110 (59%) postanalytic incidents. The errors related to noninterfaced tests accounted for 50% of the total incidents and for 37% of the affected tests and orderable panels, while the noninterfaced tests and panels accounted for 17% of the total test volume in our laboratory.
This pilot study provided the rate and categories of errors detected in a pediatric clinical chemistry laboratory based on the corrections of results in the laboratory information system. A direct interface of the instruments to the laboratory information system showed that it had favorable effects on reducing laboratory errors.
Application of human reliability analysis to nursing errors in hospitals.
Inoue, Kayoko; Koizumi, Akio
2004-12-01
Adverse events in hospitals, such as in surgery, anesthesia, radiology, intensive care, internal medicine, and pharmacy, are of worldwide concern, and it is important, therefore, to learn from such incidents. There are currently no appropriate tools based on state-of-the-art models available for the analysis of large bodies of medical incident reports. In this study, a new model was developed to facilitate medical error analysis in combination with quantitative risk assessment. This model enables detection of the organizational factors that underlie medical errors and expedites decision making about the necessary actions. Furthermore, it defines medical tasks as module practices and uses a unique coding system to describe incidents. This coding system has seven vectors for error classification: patient category, working shift, module practice, linkage chain (error type, direct threat, and indirect threat), medication, severity, and potential hazard. This mathematical formulation permitted us to derive two parameters: error rates for module practices and weights for the aforementioned seven elements. The error rate of each module practice was calculated by dividing the annual number of incident reports for that module practice by the annual number of times the module practice was performed. The weight of a given element was calculated by summing the incident-report error rates for the element of interest. This model was applied specifically to nursing practices in six hospitals over a year; 5,339 incident reports, with a total of 63,294,144 module practices conducted, were analyzed. Quality assurance (QA) of our model was introduced by checking the records of quantities of practices and the reproducibility of the analysis of medical incident reports. For both items, QA confirmed the legitimacy of our model. Error rates for all module practices were approximately of the order of 10⁻⁴ in all hospitals.
Three major organizational factors were found to underlie medical errors: "violation of rules" with a weight of 826 × 10⁻⁴, "failure of labor management" with a weight of 661 × 10⁻⁴, and "defects in the standardization of nursing practices" with a weight of 495 × 10⁻⁴.
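The two derived parameters can be expressed directly in code; the counts below are invented, not taken from the six hospitals.

```python
def error_rate(annual_incidents, annual_practices):
    """Error rate of a module practice: annual number of incident
    reports divided by the annual count of that module practice."""
    return annual_incidents / annual_practices

def element_weight(rates_of_reports_with_element):
    """Weight of an element: sum of the error rates of the incident
    reports tagged with that element (toy numbers below)."""
    return sum(rates_of_reports_with_element)

# Invented counts: 12 incidents in 150,000 executions of one practice
rate = error_rate(12, 150_000)    # 0.8 x 10^-4, i.e. of order 10^-4
weight = element_weight([rate, 2.0e-4, 1.5e-4])
```

Summing report-level rates per element is what lets a single vector such as "violation of rules" accumulate a weight far larger than any individual practice's error rate.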
In Vivo Evaluation of Wearable Head Impact Sensors.
Wu, Lyndia C; Nangia, Vaibhav; Bui, Kevin; Hammoor, Bradley; Kurt, Mehmet; Hernandez, Fidel; Kuo, Calvin; Camarillo, David B
2016-04-01
Inertial sensors are commonly used to measure human head motion. Some sensors have been tested with dummy or cadaver experiments with mixed results, and methods to evaluate sensors in vivo are lacking. Here we present an in vivo method using high-speed video to test teeth-mounted (mouthguard), soft-tissue-mounted (skin patch), and headgear-mounted (skull cap) sensors during 6-13 g sagittal soccer head impacts. Sensor coupling to the skull was quantified by displacement from an ear-canal reference. Mouthguard displacements were within video measurement error (<1 mm), while the skin patch and skull cap displaced up to 4 and 13 mm from the ear-canal reference, respectively. We used the mouthguard, which had the least displacement from the skull, as the reference to assess 6-degree-of-freedom skin patch and skull cap measurements. Linear and rotational acceleration magnitudes were over-predicted by both the skin patch (120% NRMS error for a_mag, 290% for α_mag) and the skull cap (320% NRMS error for a_mag, 500% for α_mag). Such over-predictions were largely due to out-of-plane motion. To model sensor error, we found that in-plane skin patch linear acceleration in the anterior-posterior direction could be modeled by an underdamped viscoelastic system. In summary, the mouthguard showed tighter skull coupling than the other sensor mounting approaches. Furthermore, the in vivo methods presented are valuable for investigating skull acceleration sensor technologies.
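As a rough sketch of the error metric, NRMS error can be computed as the RMS deviation from the reference normalized by the reference's peak-to-peak range. That normalization is an assumption here; the paper's exact convention may differ.

```python
import numpy as np

def nrms_error(reference, measured):
    """Normalized RMS error in percent: RMS of the sample-wise
    deviation, divided by the reference peak-to-peak range. One
    plausible definition of the NRMS figures quoted in the abstract."""
    reference = np.asarray(reference, dtype=float)
    measured = np.asarray(measured, dtype=float)
    rms = np.sqrt(np.mean((measured - reference) ** 2))
    return 100.0 * rms / (np.max(reference) - np.min(reference))

# A sensor trace that doubles the reference pulse over-predicts it
err = nrms_error([0, 1, 2, 1, 0], [0, 2, 4, 2, 0])
```

Values well above 100%, as reported for the skull cap, mean the sensor's deviation exceeds the entire dynamic range of the reference signal.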
Lilienfeld, Scott O; Ritschel, Lorie A; Lynn, Steven Jay; Cautin, Robin L; Latzman, Robert D
2014-07-01
The past 40 years have generated numerous insights regarding errors in human reasoning. Arguably, clinical practice is the domain of applied psychology in which acknowledging and mitigating these errors is most crucial. We address one such set of errors here, namely, the tendency of some psychologists and other mental health professionals to assume that they can rely on informal clinical observations to infer whether treatments are effective. We delineate four broad, underlying cognitive impediments to accurately evaluating improvement in psychotherapy: naive realism, confirmation bias, illusory causation, and the illusion of control. We then describe 26 causes of spurious therapeutic effectiveness (CSTEs), organized into a taxonomy of three overarching categories: (a) the perception of client change in its actual absence, (b) misinterpretations of actual client change stemming from extratherapeutic factors, and (c) misinterpretations of actual client change stemming from nonspecific treatment factors. These inferential errors can lead clinicians, clients, and researchers to misperceive useless or even harmful psychotherapies as effective. We (a) examine how methodological safeguards help to control for different CSTEs, (b) delineate fruitful directions for research on CSTEs, and (c) consider the implications of CSTEs for everyday clinical practice. An enhanced appreciation of the inferential problems posed by CSTEs may narrow the science-practice gap and foster a heightened appreciation of the need for the methodological safeguards afforded by evidence-based practice. © The Author(s) 2014.
Quesada, Jose Antonio; Nolasco, Andreu; Moncho, Joaquín
2013-01-01
Geocoding is the assignment of geographic coordinates to spatial points, which are often postal addresses. The error made in this process can introduce bias into estimates of spatiotemporal models in epidemiological studies. No studies have been found that measure the error made in applying this process in Spanish cities. The objective is to evaluate the errors, in magnitude and direction, of two free sources (Google and Yahoo) with respect to a GPS reference in two Spanish cities. Thirty addresses were geocoded with the two sources and the GPS in Santa Pola (Alicante) and in the city of Alicante. The distances between the sources and the GPS were calculated in metres (median, 95% CI), globally and according to the status reported by each source. The directionality of the error was evaluated by calculating the location quadrant and applying a chi-square test. The GPS error was evaluated by geocoding 11 addresses twice at a 4-day interval. The overall median for Google-GPS was 23.2 metres (16.0-32.1) for Santa Pola and 21.4 metres (14.9-31.1) for Alicante. The overall median for Yahoo was 136.0 metres (19.2-318.5) for Santa Pola and 23.8 metres (13.6-29.2) for Alicante. Between 73% and 90% of addresses were geocoded with status "exact or interpolated" (minor error), for which Google and Yahoo had a median error between 19 and 23 metres in the two cities. The GPS had a median error of 13.8 metres (6.7-17.8). No error directionality was detected. Google's error is acceptable and stable in the two cities, so it is a reliable source for geocoding addresses in Spain in epidemiological studies.
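The distances between a geocoded point and its GPS reference can be computed with the haversine formula on a spherical Earth, then summarized by the median. A minimal sketch with an invented coordinate pair:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two points given in
    decimal degrees (spherical approximation, R = 6371 km)."""
    r = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def median(xs):
    s = sorted(xs)
    n = len(s)
    return s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])

# Invented example: geocoder result vs. GPS reading 0.0002° apart
d = haversine_m(38.3452, -0.4810, 38.3454, -0.4810)   # ≈ 22.2 m
```

Repeating this per address and taking the median over the 30 addresses reproduces the kind of summary statistic reported above.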
Understanding error generation in fused deposition modeling
NASA Astrophysics Data System (ADS)
Bochmann, Lennart; Bayley, Cindy; Helu, Moneer; Transchel, Robert; Wegener, Konrad; Dornfeld, David
2015-03-01
Additive manufacturing offers completely new possibilities for the manufacturing of parts. The advantages of flexibility and convenience of additive manufacturing have had a significant impact on many industries, and optimizing part quality is crucial for expanding its utilization. This research aims to determine the sources of imprecision in fused deposition modeling (FDM). Process errors in terms of surface quality, accuracy and precision are identified and quantified, and an error-budget approach is used to characterize errors of the machine tool. It was determined that accuracy and precision in the y direction (0.08-0.30 mm) are generally greater than in the x direction (0.12-0.62 mm) and the z direction (0.21-0.57 mm). Furthermore, accuracy and precision tend to decrease at increasing axis positions. The results of this work can be used to identify possible process improvements in the design and control of FDM technology.
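Accuracy and precision figures of this kind can be summarized with a simple per-axis error-budget helper. The definitions used here (mean deviation for accuracy, a ±2σ spread for precision) and the deviation data are assumptions for illustration; the paper's exact conventions may differ.

```python
import statistics

def axis_budget(errors_mm):
    """Per-axis error-budget summary: accuracy as the mean deviation
    from the nominal position, precision as a 2-sigma spread."""
    accuracy = statistics.mean(errors_mm)
    precision = 2 * statistics.stdev(errors_mm)
    return accuracy, precision

# Invented y-axis positioning deviations (mm) from repeated prints
acc, prec = axis_budget([0.10, 0.14, 0.12, 0.08, 0.16])
```

Computing such a budget separately for x, y, and z is what reveals axis-dependent behavior like the y-direction outperforming x and z in the study above.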
NASA Astrophysics Data System (ADS)
Bacha, Tulu
The Goddard Lidar Observatory for Wind (GLOW), a mobile direct-detection Doppler lidar based on molecular backscattering for measurement of wind in the troposphere and lower stratosphere, was operated and its errors characterized. It was operated at the Howard University Beltsville Center for Climate Observation System (BCCOS) side by side with other instruments: the NASA/Langley Research Center Validation Lidar (VALIDAR), a Leosphere WLS70, and other standard wind-sensing instruments. The performance of GLOW is presented for various optical thicknesses of cloud. It was also compared to VALIDAR under various conditions, including clear and cloudy sky. The performance degradation due to the presence of cirrus clouds is quantified by comparing the wind speed error to cloud thickness. The cloud thickness is quantified in terms of aerosol backscatter ratio (ASR) and cloud optical depth (COD), both determined from the Howard University Raman Lidar (HURL) operating at the same station as GLOW. The wind speed error of GLOW was correlated with COD and ASR; the correlation revealed a weak linear relationship. Finally, the wind speed measurements of GLOW were corrected using this quantitative relation. Using ASR reduced the GLOW wind error from 19% to 8% in a thin cirrus cloud and from 58% to 28% in a relatively thick cloud. After correcting for cloud-induced error, the remaining error is due to shot noise and atmospheric variability. Shot-noise error, the statistical random error in the number of backscattered photons detected by the photomultiplier tube (PMT), can only be minimized by averaging a large number of recorded data.
The atmospheric backscatter measured by GLOW along its line-of-sight direction is also used to analyze error due to atmospheric variability within the measurement volume. GLOW scans in five different directions (vertical and at elevation angles of 45° to the north, south, east, and west) to generate wind profiles. The non-uniformity of the atmosphere across these scanning directions is a factor contributing to the measurement error of GLOW. Atmospheric variability in the scanning region leads to differences in the intensity of backscattered signals between scanning directions. Taking the ratio of the north (east) to south (west) signals and comparing the statistical differences yielded a weak linear relation between atmospheric variability and line-of-sight wind speed differences. This relation was used to make a correction that reduced the error by about 50%.
NASA Technical Reports Server (NTRS)
Denton, R.; Sonnerup, B. U. O.; Swisdak, M.; Birn, J.; Drake, J. F.; Heese, M.
2012-01-01
When analyzing data from an array of spacecraft (such as Cluster or MMS) crossing a site of magnetic reconnection, it is desirable to be able to accurately determine the orientation of the reconnection site. If the reconnection is quasi-two-dimensional, there are three key directions: the direction of maximum inhomogeneity (the direction across the reconnection site), the direction of the reconnecting component of the magnetic field, and the direction of rough invariance (the "out of plane" direction). Using simulated spacecraft observations of magnetic reconnection in the geomagnetic tail, we extend our previous tests of the direction-finding method developed by Shi et al. (2005) and the method to determine the structure velocity relative to the spacecraft, Vstr. These methods require data from four proximate spacecraft. We add artificial noise and calibration errors to the simulation fields, and then use the perturbed gradient of the magnetic field B and the perturbed time derivative dB/dt, as described by Denton et al. (2010). Three new simulations are examined: a weakly three-dimensional, i.e., quasi-two-dimensional, MHD simulation without a guide field; a quasi-two-dimensional MHD simulation with a guide field; and a two-dimensional full-dynamics kinetic simulation with inherent noise, so that the apparent minimum gradient was not exactly zero even without added artificial errors. We also examined variations of the spacecraft trajectory for the kinetic simulation. The accuracy of the directions found varied depending on the simulation and spacecraft trajectory, but all the directions could be found to within about 10° in all cases. Various aspects of the method were examined, including how to choose averaging intervals and the best intervals for determining the directions and velocity. For the kinetic simulation, we also investigated in detail how the errors in the inferred gradient directions from the unmodified Shi et al. method (using the unperturbed gradient) depended on the amplitude of the calibration errors. For an accuracy of 3° in the maximum gradient direction, the calibration errors could be as large as 3% of the reconnection magnetic field, while for the same accuracy in the minimum gradient direction, the calibration errors could only be as large as 0.03% of the reconnection magnetic field. These results suggest that the maximum gradient direction can normally be determined by the unmodified Shi et al. method, while the modified method or some other method must be used to accurately determine the minimum gradient direction. The structure velocity was found with magnitude accurate to 2% and direction accurate to within 5%.
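The core of the gradient-based direction finding can be sketched as a least-squares gradient estimate from the four spacecraft followed by an eigen-decomposition of L = G G^T. This is a minimal illustration of the idea behind the Shi et al. (2005) method, not the modified method described above.

```python
import numpy as np

def gradient_tensor(positions, b_fields):
    """Least-squares estimate of the magnetic gradient G[i, j] =
    dB_j/dx_i from four spacecraft positions (4x3) and field
    measurements (4x3); a minimal sketch, not the authors' code."""
    dr = positions - positions.mean(axis=0)
    db = b_fields - b_fields.mean(axis=0)
    G, *_ = np.linalg.lstsq(dr, db, rcond=None)
    return G

def mdd_directions(G):
    """Eigen-decomposition of L = G G^T: eigenvectors give the
    maximum, intermediate, and minimum gradient directions."""
    vals, vecs = np.linalg.eigh(G @ G.T)
    order = vals.argsort()[::-1]              # descending eigenvalues
    return vals[order], vecs[:, order]

# Synthetic check: a field with B_y = 5x has its maximum gradient along x
pos = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
bfld = np.zeros((4, 3))
bfld[:, 1] = 5.0 * pos[:, 0]
vals, vecs = mdd_directions(gradient_tensor(pos, bfld))
```

The near-zero smaller eigenvalues in this toy case illustrate why the minimum gradient direction is so sensitive to calibration errors: any perturbation of comparable size swamps it, while the large maximum-gradient eigenvalue is robust.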
Perceptually lossless fractal image compression
NASA Astrophysics Data System (ADS)
Lin, Huawu; Venetsanopoulos, Anastasios N.
1996-02-01
According to the collage theorem, the encoding distortion for fractal image compression is directly related to the metric used in the encoding process. In this paper, we introduce a perceptually meaningful distortion measure based on the human visual system's nonlinear response to luminance and the visual masking effects. Blackwell's psychophysical raw data on contrast threshold are first interpolated as a function of background luminance and visual angle, and are then used as an error upper bound for perceptually lossless image compression. For a variety of images, experimental results show that the algorithm produces a compression ratio of 8:1 to 10:1 without introducing visual artifacts.
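The threshold-based acceptance test can be sketched as interpolating a contrast-threshold table and rejecting any coded block whose pixel error exceeds the local threshold. The table values and the 1-D (luminance-only) interpolation below are invented placeholders for Blackwell's 2-D data over luminance and visual angle.

```python
import bisect

# Invented stand-in for interpolated contrast-threshold data:
# (background luminance in cd/m^2, threshold contrast)
LUM = [0.1, 1.0, 10.0, 100.0]
THR = [0.20, 0.08, 0.03, 0.02]

def contrast_threshold(lum):
    """Piecewise-linear interpolation of threshold contrast versus
    background luminance (a 1-D slice of the paper's interpolation)."""
    i = bisect.bisect_left(LUM, lum)
    if i == 0:
        return THR[0]
    if i == len(LUM):
        return THR[-1]
    t = (lum - LUM[i - 1]) / (LUM[i] - LUM[i - 1])
    return THR[i - 1] + t * (THR[i] - THR[i - 1])

def perceptually_lossless(orig, coded):
    """Accept a coded block only if every pixel's relative error stays
    below the contrast threshold for its local background luminance."""
    return all(
        abs(c - o) / max(o, 1e-9) <= contrast_threshold(o)
        for o, c in zip(orig, coded)
    )
```

Using the threshold as an error upper bound in the fractal encoder's metric is what lets the coder reach 8:1 to 10:1 compression while keeping each residual below visibility.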
Fifty Years of THERP and Human Reliability Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ronald L. Boring
2012-06-01
In 1962 at a Human Factors Society symposium, Alan Swain presented a paper introducing a Technique for Human Error Rate Prediction (THERP). This was followed in 1963 by a Sandia Laboratories monograph outlining basic human error quantification using THERP and, in 1964, by a special journal edition of Human Factors on quantification of human performance. Throughout the 1960s, Swain and his colleagues focused on collecting human performance data for the Sandia Human Error Rate Bank (SHERB), primarily in connection with supporting the reliability of nuclear weapons assembly in the US. In 1969, Swain met with Jens Rasmussen of Risø National Laboratory and discussed the applicability of THERP to nuclear power applications. By 1975, in WASH-1400, Swain had articulated the use of THERP for nuclear power applications, and the approach was finalized in the watershed publication of NUREG/CR-1278 in 1983. THERP is now 50 years old, and remains the best known and most widely used HRA method. In this paper, the author discusses the history of THERP, based on published reports and personal communication and interviews with Swain. The author also outlines the significance of THERP. The foundations of human reliability analysis are found in THERP: human failure events, task analysis, performance shaping factors, human error probabilities, dependence, event trees, recovery, and pre- and post-initiating events were all introduced in THERP. While THERP is not without its detractors, and it is showing signs of its age in the face of newer technological applications, the longevity of THERP is a testament to its tremendous significance. THERP started the field of human reliability analysis. This paper concludes with a discussion of THERP in the context of newer methods, which can be seen as extensions of or departures from Swain's pioneering work.
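One of the THERP foundations mentioned above, dependence, has a simple quantitative form in NUREG/CR-1278: the conditional error probability for a task, given failure on the preceding task, is obtained from the basic human error probability (BHEP) by one of five published formulas.

```python
# THERP dependence model (NUREG/CR-1278): conditional HEP given failure
# on the immediately preceding task, for each of THERP's five levels.
def conditional_hep(bhep, dependence):
    formulas = {
        "zero":     lambda p: p,                  # tasks independent
        "low":      lambda p: (1 + 19 * p) / 20,
        "moderate": lambda p: (1 + 6 * p) / 7,
        "high":     lambda p: (1 + p) / 2,
        "complete": lambda p: 1.0,                # failure certain to repeat
    }
    return formulas[dependence](bhep)

# Even a nominal BHEP of 0.003 rises sharply once dependence is credited.
print(conditional_hep(0.003, "high"))   # 0.5015
```

This is why dependence modeling dominates many THERP quantifications: under high dependence the conditional HEP is never below 0.5, regardless of how small the BHEP is.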
Effects of Ocular Optics on Perceived Visual Direction and Depth
NASA Astrophysics Data System (ADS)
Ye, Ming
Most studies of human retinal image quality have specifically addressed the issue of image contrast; few have examined the problem of image location. However, one of the most impressive properties of human vision involves the location of objects. We are able to identify object location with great accuracy (less than 5 arcsec). The sensitivity we exhibit for image location indicates that any optical errors, such as refractive error, ocular aberrations, pupil decentration, etc., may have noticeable effects on the perceived visual direction and distance of objects. The most easily observed effect of these optical factors is a binocular depth illusion called chromostereopsis, in which equidistant colored objects appear to lie at different distances. This dissertation covers a series of theoretical and experimental studies that examined the effects of ocular optics on perceived monocular visual direction and binocular chromostereopsis. Theoretical studies included development of an adequate eye model for predicting chromatic aberration, a major ocular aberration, using geometric optics. Also, a wave-optical analysis is used to model the effects of defocus, optical aberrations, the Stiles-Crawford effect (SCE), and pupil location on retinal image profiles. Experimental studies used psychophysical methods such as monocular vernier alignment tests, binocular stereoscopic tests, etc. This dissertation concludes: (1) With a decentered large pupil, the SCE reduces defocused image shifts compared to an eye without the SCE. (2) The blurred image location can be predicted by the centroid of the image profile. (3) Chromostereopsis with small pupils can be precisely accounted for by the interocular difference in monocular transverse chromatic aberration. (4) The SCE also plays an important role in the effect of pupil size on chromostereopsis.
The reduction of chromostereopsis with large pupils can be accurately predicted by the interocular difference in monocular chromatic diplopia which is also reduced with large pupils. This supports the hypothesis that the effect of pupil size on chromostereopsis is due to monocular mechanisms.
Error-free replicative bypass of (6–4) photoproducts by DNA polymerase ζ in mouse and human cells
Yoon, Jung-Hoon; Prakash, Louise; Prakash, Satya
2010-01-01
The ultraviolet (UV)-induced (6–4) pyrimidine–pyrimidone photoproduct [(6–4) PP] confers a large structural distortion in DNA. Here we examine in human cells the roles of translesion synthesis (TLS) DNA polymerases (Pols) in promoting replication through a (6–4) TT photoproduct carried on a duplex plasmid where bidirectional replication initiates from an origin of replication. We show that TLS contributes to a large fraction of lesion bypass and that it is mostly error-free. We find that, whereas Pol η and Pol ι provide alternate pathways for mutagenic TLS, surprisingly, Pol ζ functions independently of these Pols and in a predominantly error-free manner. We verify and extend these observations in mouse cells and conclude that, in human cells, TLS during replication can be markedly error-free even opposite a highly distorting DNA lesion. PMID:20080950
Novel approach to ambulatory assessment of human segmental orientation on a wearable sensor system.
Liu, Kun; Liu, Tao; Shibata, Kyoko; Inoue, Yoshio; Zheng, Rencheng
2009-12-11
A new method using a double-sensor difference based algorithm for analyzing human segment rotational angles in two directions for segmental orientation analysis in three-dimensional (3D) space is presented. A wearable sensor system based only on triaxial accelerometers was developed to obtain the pitch and yaw angles of the thigh segment, with one accelerometer approximating the translational acceleration of the hip joint and two accelerometers measuring the actual accelerations on the thigh. To evaluate the method, the system was first tested on a 2-degrees-of-freedom mechanical arm assembled out of rigid segments and encoders. Then, to estimate the human segmental orientation, the wearable sensor system was tested on the thighs of eight volunteer subjects, who walked in a straight forward line in the workspace of an optical motion analysis system at three self-selected speeds: slow, normal and fast. In the experiment, the subject was assumed to walk straight forward with very little trunk sway, few skin artifacts, and no significant internal/external rotation of the leg. The root mean square (RMS) errors of the thigh segment orientation measurement were between 2.4 degrees and 4.9 degrees during normal gait that had a 45 degrees flexion/extension range of motion. Measurement error was observed to increase with increasing walking speed, probably as a result of increased trunk sway, axial rotation and skin artifacts. The results show that, without integration and switching between different sensors, using only one kind of sensor, the wearable sensor system is suitable for ambulatory analysis of normal gait orientation of the thigh and shank in two directions of the segment-fixed local coordinate system in 3D space. It can then be applied to assess spatio-temporal gait parameters and to monitor the gait function of patients in clinical settings.
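The evaluation step above boils down to an RMS comparison between the accelerometer-derived angle and the optical-reference angle. The signals below are synthetic stand-ins for the paper's measurements, with noise chosen to land near the reported error range.

```python
import numpy as np

def rms_error(estimate, reference):
    """Root-mean-square error between two angle time series (degrees)."""
    return np.sqrt(np.mean((np.asarray(estimate) - np.asarray(reference)) ** 2))

t = np.linspace(0, 2, 200)                   # two gait cycles (s), synthetic
reference = 22.5 * np.sin(2 * np.pi * t)     # ~45 deg flexion/extension range
estimate = reference + np.random.default_rng(0).normal(0.0, 3.0, t.size)
err = rms_error(estimate, reference)         # near the 3-deg noise level
```

With real data the reference would come from the optical motion analysis system and the estimate from the double-sensor difference algorithm; the RMS figure summarizes agreement over the whole gait trial.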
A general geometric theory of attitude determination from directional sensing
NASA Technical Reports Server (NTRS)
Fang, B. T.
1976-01-01
A general geometric theory of spacecraft attitude determination from external reference direction sensors was presented. Outputs of different sensors are reduced to two kinds of basic directional measurements. Errors in these measurement equations are studied in detail. The partial derivatives of measurements with respect to the spacecraft orbit, the spacecraft attitude, and the error parameters form the basis for all orbit and attitude determination schemes and error analysis programs and are presented in a series of tables. The question of attitude observability is studied with the introduction of a graphical construction which provides a great deal of physical insight. The result is applied to the attitude observability of the IMP-8 spacecraft.
A Framework for Modeling Human-Machine Interactions
NASA Technical Reports Server (NTRS)
Shafto, Michael G.; Rosekind, Mark R. (Technical Monitor)
1996-01-01
Modern automated flight-control systems employ a variety of different behaviors, or modes, for managing the flight. While developments in cockpit automation have resulted in workload reduction and economical advantages, they have also given rise to an ill-defined class of human-machine problems, sometimes referred to as 'automation surprises'. Our interest in applying formal methods for describing human-computer interaction stems from our ongoing research on cockpit automation. In this area of aeronautical human factors, there is much concern about how flight crews interact with automated flight-control systems, so that the likelihood of making errors, in particular mode-errors, is minimized and the consequences of such errors are contained. The goal of the ongoing research on formal methods in this context is: (1) to develop a framework for describing human interaction with control systems; (2) to formally categorize such automation surprises; and (3) to develop tests for identification of these categories early in the specification phase of a new human-machine system.
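The kind of formal model this framework calls for can be sketched as a tiny finite-state machine: the machine's mode evolves by its own transition rules, and an "automation surprise" is flagged whenever the machine's mode diverges from the mode the pilot believes is active. The modes and transitions below are invented for illustration, not taken from any real flight-control system.

```python
# Hypothetical mode-transition table: (current mode, event) -> next mode.
TRANSITIONS = {
    ("ALT_HOLD", "select_vs"):          "VERT_SPEED",
    ("VERT_SPEED", "capture_altitude"): "ALT_HOLD",   # automatic capture
}

def step(machine_mode, pilot_mode, event):
    """Advance the machine's mode; flag a surprise if it diverges from
    the pilot's mental model (which is not updated by automatic events)."""
    machine = TRANSITIONS.get((machine_mode, event), machine_mode)
    surprise = machine != pilot_mode
    return machine, surprise

# Pilot selects vertical speed, then the autopilot auto-captures altitude
# without an explicit pilot action: a candidate mode error.
mode, _ = step("ALT_HOLD", "VERT_SPEED", "select_vs")
mode, surprise = step(mode, "VERT_SPEED", "capture_altitude")
```

Enumerating such machine/pilot state pairs is one way to "formally categorize" automation surprises early in the specification phase, as the abstract proposes.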
Inborn Errors of Human JAKs and STATs
Casanova, Jean-Laurent; Holland, Steven M.; Notarangelo, Luigi D.
2012-01-01
Inborn errors of the genes encoding two of the four human JAKs (JAK3 and TYK2) and three of the six human STATs (STAT1, STAT3, and STAT5B) have been described. We review the disorders arising from mutations in these five genes, highlighting the way in which the molecular and cellular pathogenesis of these conditions has been clarified by the discovery of inborn errors of cytokines, hormones, and their receptors, including those interacting with JAKs and STATs. The phenotypic similarities between mice and humans lacking individual JAK-STAT components suggest that the functions of JAKs and STATs are largely conserved in mammals. However, a wide array of phenotypic differences has emerged between mice and humans carrying bi-allelic null alleles of JAK3, TYK2, STAT1, or STAT5B. Moreover, the high level of allelic heterogeneity at the human JAK3, STAT1, and STAT3 loci has revealed highly diverse immunological and clinical phenotypes, which had not been anticipated. PMID:22520845
Inborn errors of human JAKs and STATs.
Casanova, Jean-Laurent; Holland, Steven M; Notarangelo, Luigi D
2012-04-20
Inborn errors of the genes encoding two of the four human JAKs (JAK3 and TYK2) and three of the six human STATs (STAT1, STAT3, and STAT5B) have been described. We review the disorders arising from mutations in these five genes, highlighting the way in which the molecular and cellular pathogenesis of these conditions has been clarified by the discovery of inborn errors of cytokines, hormones, and their receptors, including those interacting with JAKs and STATs. The phenotypic similarities between mice and humans lacking individual JAK-STAT components suggest that the functions of JAKs and STATs are largely conserved in mammals. However, a wide array of phenotypic differences has emerged between mice and humans carrying biallelic null alleles of JAK3, TYK2, STAT1, or STAT5B. Moreover, the high degree of allelic heterogeneity at the human JAK3, TYK2, STAT1, and STAT3 loci has revealed highly diverse immunological and clinical phenotypes, which had not been anticipated. Copyright © 2012 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Ha, Taesung
A probabilistic risk assessment (PRA) was conducted for a loss of coolant accident (LOCA) in the McMaster Nuclear Reactor (MNR). A level 1 PRA was completed including event sequence modeling, system modeling, and quantification. To support the quantification of the accident sequences identified, data analysis using the Bayesian method and human reliability analysis (HRA) using the accident sequence evaluation procedure (ASEP) approach were performed. Since human performance in research reactors is significantly different from that in power reactors, a time-oriented HRA model (reliability physics model) was applied for the human error probability (HEP) estimation of the core relocation. This model is based on two competing random variables: phenomenological time and performance time. The response surface and direct Monte Carlo simulation with Latin hypercube sampling were applied for estimating the phenomenological time, whereas the performance time was obtained from interviews with operators. An appropriate probability distribution for the phenomenological time was assigned by statistical goodness-of-fit tests. The HEP for the core relocation was then estimated from these two competing quantities: phenomenological time and operators' performance time. The sensitivity of each probability distribution in human reliability estimation was investigated. In order to quantify the uncertainty in the predicted HEPs, a Bayesian approach was selected due to its capability of incorporating uncertainties in the model itself and the parameters of that model. The HEP from the current time-oriented model was compared with that from the ASEP approach. Both results were used to evaluate the sensitivity of alternative human reliability modeling for the manual core relocation in the LOCA risk model. This exercise demonstrated the applicability of a reliability physics model supplemented with a Bayesian approach for modeling human reliability, and its potential usefulness for quantifying model uncertainty as a sensitivity analysis in the PRA model.
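The reliability-physics idea of two competing random variables can be sketched by Monte Carlo: an error occurs whenever the operators' performance time exceeds the phenomenological time available. The lognormal parameters below are hypothetical placeholders; the study fitted distributions to simulation results and operator interviews.

```python
import random

def hep_monte_carlo(n=100_000, seed=1):
    """Estimate HEP = P(performance time > phenomenological time)."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        t_phen = rng.lognormvariate(3.4, 0.3)  # available time (min), hypothetical
        t_perf = rng.lognormvariate(2.3, 0.5)  # response time (min), hypothetical
        if t_perf > t_phen:
            failures += 1
    return failures / n

hep = hep_monte_carlo()   # small probability that the response is too slow
```

In the study's Bayesian extension, the distribution parameters themselves would carry uncertainty, so repeated runs over sampled parameters yield an uncertainty distribution on the HEP rather than a single point value.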
Use of modeling to identify vulnerabilities to human error in laparoscopy.
Funk, Kenneth H; Bauer, James D; Doolen, Toni L; Telasha, David; Nicolalde, R Javier; Reeber, Miriam; Yodpijit, Nantakrit; Long, Myra
2010-01-01
This article describes an exercise to investigate the utility of modeling and human factors analysis in understanding surgical processes and their vulnerabilities to medical error. A formal method to identify error vulnerabilities was developed and applied to a test case of Veress needle insertion during closed laparoscopy. A team of 2 surgeons, a medical assistant, and 3 engineers used hierarchical task analysis and Integrated DEFinition language 0 (IDEF0) modeling to create rich models of the processes used in initial port creation. Using terminology from a standardized human performance database, detailed task descriptions were written for 4 tasks executed in the process of inserting the Veress needle. Key terms from the descriptions were used to extract from the database generic errors that could occur. Task descriptions with potential errors were translated back into surgical terminology. Referring to the process models and task descriptions, the team used a modified failure modes and effects analysis (FMEA) to consider each potential error for its probability of occurrence, its consequences if it should occur and be undetected, and its probability of detection. The resulting likely and consequential errors were prioritized for intervention. A literature-based validation study confirmed the significance of the top error vulnerabilities identified using the method. Ongoing work includes design and evaluation of procedures to correct the identified vulnerabilities and improvements to the modeling and vulnerability identification methods. Copyright 2010 AAGL. Published by Elsevier Inc. All rights reserved.
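The modified FMEA step described above reduces to scoring each candidate error and ranking by risk priority number (RPN), the product of occurrence, severity, and (non)detection scores. The example failure modes below are hypothetical, not the study's actual Veress-needle error list.

```python
def prioritize(failure_modes):
    """Rank failure modes by RPN = occurrence x severity x detection,
    highest first. Each value is a (occurrence, severity, detection)
    triple on conventional 1-10 FMEA scales."""
    scored = [(o * s * d, name) for name, (o, s, d) in failure_modes.items()]
    return [name for rpn, name in sorted(scored, reverse=True)]

modes = {                                  # hypothetical illustration
    "needle over-insertion": (4, 9, 6),    # RPN 216
    "wrong insertion angle": (6, 5, 3),    # RPN 90
    "grip slip":             (2, 3, 2),    # RPN 12
}
ranking = prioritize(modes)                # highest-priority error first
```

The top-ranked vulnerabilities are then the ones targeted for intervention design, which is how the study prioritized its "likely and consequential" errors.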
Wyatt, Madison; Nave, Gillian
2017-01-01
We evaluated the use of a commercial flatbed scanner for digitizing photographic plates used for spectroscopy. The scanner has a bed size of 420 mm by 310 mm and a pixel size of about 0.0106 mm. Our tests show that the closest line pairs that can be resolved with the scanner are 0.024 mm apart, only slightly larger than the Nyquist resolution of 0.021 mm expected from the 0.0106 mm pixel size. We measured periodic errors in the scanner using both a calibrated length scale and a photographic plate. We find no noticeable periodic errors in the direction parallel to the linear detector in the scanner, but errors with an amplitude of 0.03 mm to 0.05 mm in the direction perpendicular to the detector. We conclude that large periodic errors in measurements of spectroscopic plates using flatbed scanners can be eliminated by scanning the plates with the dispersion direction parallel to the linear detector, i.e., by placing the plate along the short side of the scanner. PMID:28463262
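Detecting such a periodic error can be sketched by Fourier-analyzing the residuals between scanned and calibrated positions: a dominant spectral peak gives the error's period and amplitude. The injected 0.04 mm error with a 2 mm period below is synthetic, chosen only to resemble the 0.03-0.05 mm errors reported above.

```python
import numpy as np

positions = np.arange(0, 200.0, 0.5)                    # mm, calibrated marks
residuals = 0.04 * np.sin(2 * np.pi * positions / 2.0)  # synthetic periodic error

# Amplitude spectrum of the residuals (normalized so a pure sinusoid of
# amplitude A shows a peak of height A).
spectrum = np.abs(np.fft.rfft(residuals)) / (len(residuals) / 2)
freqs = np.fft.rfftfreq(len(residuals), d=0.5)          # cycles per mm
peak = np.argmax(spectrum[1:]) + 1                      # skip the DC bin
amplitude = spectrum[peak]                              # ~0.04 mm
period = 1.0 / freqs[peak]                              # ~2.0 mm
```

Comparing such spectra for scans along the two bed axes is one way to confirm that the periodic error lives only in the direction perpendicular to the linear detector.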
Milekovic, Tomislav; Ball, Tonio; Schulze-Bonhage, Andreas; Aertsen, Ad; Mehring, Carsten
2013-01-01
Background Brain-machine interfaces (BMIs) can translate the neuronal activity underlying a user's movement intention into movements of an artificial effector. In spite of continuous improvements, errors in movement decoding are still a major problem of current BMI systems. If the difference between the decoded and intended movements becomes noticeable, it may lead to an execution error. Outcome errors, where subjects fail to reach a certain movement goal, are also present during online BMI operation. Detecting such errors can be beneficial for BMI operation: (i) errors can be corrected online after being detected and (ii) the adaptive BMI decoding algorithm can be updated to make fewer errors in the future. Methodology/Principal Findings Here, we show that error events can be detected from human electrocorticography (ECoG) during a continuous task with high precision, given a temporal tolerance of 300–400 milliseconds. We quantified the error detection accuracy and showed that, using only a small subset of 2×2 ECoG electrodes, 82% of the detection information for outcome errors and 74% of the detection information for execution errors available from all ECoG electrodes could be retained. Conclusions/Significance The error detection method presented here could be used to correct errors made during BMI operation or to adapt a BMI algorithm to make fewer errors in the future. Furthermore, our results indicate that a smaller ECoG implant could be used for error detection. Reducing the size of an ECoG electrode implant used for BMI decoding and error detection could significantly reduce the medical risk of implantation. PMID:23383315
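Scoring detection "with a temporal tolerance" can be sketched as follows: a detection counts as a true positive if it falls within the tolerance window of an actual error event, and precision is true positives over all detections. The event times below are synthetic examples, not the study's data.

```python
def detection_precision(true_events, detections, tolerance=0.35):
    """Precision of detections (seconds), matching each detection to at
    most one true event within +/- tolerance."""
    unmatched = list(true_events)
    tp = 0
    for d in detections:
        hit = next((t for t in unmatched if abs(t - d) <= tolerance), None)
        if hit is not None:
            tp += 1
            unmatched.remove(hit)   # each true event matches once
    return tp / len(detections) if detections else 0.0

precision = detection_precision(
    true_events=[2.0, 5.1, 9.4],
    detections=[2.2, 5.0, 7.7, 9.3],
)   # 3 of 4 detections fall within 350 ms of a real error
```

Greedy first-match assignment is one simple choice here; a stricter evaluation could use optimal one-to-one matching, which matters when events are closely spaced.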
Patient safety: honoring advanced directives.
Tice, Martha A
2007-02-01
Healthcare providers typically think of patient safety in the context of preventing iatrogenic injury. Prevention of falls and medication or treatment errors is the typical focus of adverse event analyses. If healthcare providers are committed to honoring the wishes of patients, then perhaps failures to honor advanced directives should be viewed as reportable medical errors.
Progress in the improved lattice calculation of direct CP-violation in the Standard Model
NASA Astrophysics Data System (ADS)
Kelly, Christopher
2018-03-01
We discuss the ongoing effort by the RBC & UKQCD collaborations to improve our lattice calculation of the measure of Standard Model direct CP violation, ɛ', with physical kinematics. We present our progress in decreasing the (dominant) statistical error and discuss other related activities aimed at reducing the systematic errors.
An Introduction to the Sources of Delivery Error for Direct-Fire Ballistic Projectiles
2013-07-01
Ballistic mismatch has also been used to quantify the difference in target impacts using different gun tubes ...the angle between the local “upwards” direction of the gun tube and the vertical direction as defined by gravity. Cant results from the gun tube ...Determining Optimal Tube Shape for Reduction of Jump Error for Tank Fleets Using Fleet Zero. Presented at the 20th International Symposium on Ballistics
Inborn Errors of Fructose Metabolism. What Can We Learn from Them?
Tran, Christel
2017-04-03
Fructose is one of the main sweetening agents in the human diet and its ingestion is increasing globally. Dietary sugar has particular effects on those whose capacity to metabolize fructose is limited. While intolerance to carbohydrates is a frequent finding in children, inborn errors of carbohydrate metabolism are rare conditions. Three inborn errors are known in the pathway of fructose metabolism: (1) essential or benign fructosuria due to fructokinase deficiency; (2) hereditary fructose intolerance; and (3) fructose-1,6-bisphosphatase deficiency. This review focuses on the description of the clinical symptoms and biochemical anomalies in these three inborn errors of metabolism. The potential toxic effects of fructose in healthy humans are also discussed. Studies conducted in patients with inborn errors of fructose metabolism have helped to elucidate fructose metabolism and its potential toxicity in healthy humans. The influence of fructose on the glycolytic pathway and on purine catabolism is the cause of hypoglycemia, lactic acidosis and hyperuricemia. The discovery that fructose-mediated generation of uric acid may have a causal role in diabetes and obesity provided new insights into the pathogenesis of these common diseases.
Inborn Errors of Fructose Metabolism. What Can We Learn from Them?
Tran, Christel
2017-01-01
Fructose is one of the main sweetening agents in the human diet and its ingestion is increasing globally. Dietary sugar has particular effects on those whose capacity to metabolize fructose is limited. While intolerance to carbohydrates is a frequent finding in children, inborn errors of carbohydrate metabolism are rare conditions. Three inborn errors are known in the pathway of fructose metabolism: (1) essential or benign fructosuria due to fructokinase deficiency; (2) hereditary fructose intolerance; and (3) fructose-1,6-bisphosphatase deficiency. This review focuses on the description of the clinical symptoms and biochemical anomalies in these three inborn errors of metabolism. The potential toxic effects of fructose in healthy humans are also discussed. Studies conducted in patients with inborn errors of fructose metabolism have helped to elucidate fructose metabolism and its potential toxicity in healthy humans. The influence of fructose on the glycolytic pathway and on purine catabolism is the cause of hypoglycemia, lactic acidosis and hyperuricemia. The discovery that fructose-mediated generation of uric acid may have a causal role in diabetes and obesity provided new insights into the pathogenesis of these common diseases. PMID:28368361
Ching, Joan M; Williams, Barbara L; Idemoto, Lori M; Blackmore, C Craig
2014-08-01
Virginia Mason Medical Center (Seattle) employed the Lean concept of Jidoka (automation with a human touch) to plan for and deploy bar code medication administration (BCMA) to hospitalized patients. Integrating BCMA technology into the nursing work flow with minimal disruption was accomplished using three steps of Jidoka: (1) assigning work to humans and machines on the basis of their differing abilities, (2) adapting machines to the human work flow, and (3) monitoring the human-machine interaction. The effectiveness of BCMA in both reinforcing safe administration practices and reducing medication errors was measured using the Collaborative Alliance for Nursing Outcomes (CALNOC) Medication Administration Accuracy Quality Study methodology. Trained nurses observed a total of 16,149 medication doses for 3,617 patients in a three-year period. Following BCMA implementation, the number of safe practice violations decreased from 54.8 violations/100 doses (January 2010-September 2011) to 29.0 violations/100 doses (October 2011-December 2012), resulting in an absolute risk reduction of 25.8 violations/100 doses (95% confidence interval [CI]: 23.7, 27.9; p < .001). The number of medication errors decreased from 5.9 errors/100 doses at baseline to 3.0 errors/100 doses after BCMA implementation (absolute risk reduction: 2.9 errors/100 doses [95% CI: 2.2, 3.6; p < .001]). The number of unsafe administration practices (estimate, -5.481; standard error 1.133; p < .001; 95% CI: -7.702, -3.260) also decreased. As more hospitals respond to health information technology meaningful use incentives, thoughtful, methodical, and well-managed approaches to technology deployment are crucial. This work illustrates how Jidoka offers opportunities for a smooth transition to new technology.
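An absolute risk reduction with a confidence interval, like the figures reported above, comes from the standard difference-of-two-proportions calculation. The dose counts below are hypothetical round numbers chosen to reproduce the 5.9 vs. 3.0 errors/100 doses rates, not the study's actual strata.

```python
import math

def arr_with_ci(events1, n1, events2, n2, z=1.96):
    """Absolute risk reduction (per 100 doses) with a Wald 95% CI for the
    difference of two proportions."""
    p1, p2 = events1 / n1, events2 / n2
    arr = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return arr * 100, (arr - z * se) * 100, (arr + z * se) * 100

# Hypothetical split: 472 errors in 8000 doses before vs. 240 in 8000 after,
# i.e. 5.9 vs. 3.0 errors/100 doses.
arr, lo, hi = arr_with_ci(472, 8000, 240, 8000)
```

The Wald interval is the simplest choice; with small event counts a Newcombe or score-based interval would be more appropriate.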
Russ, Alissa L; Zillich, Alan J; Melton, Brittany L; Russell, Scott A; Chen, Siying; Spina, Jeffrey R; Weiner, Michael; Johnson, Elizabette G; Daggy, Joanne K; McManus, M Sue; Hawsey, Jason M; Puleo, Anthony G; Doebbeling, Bradley N; Saleem, Jason J
2014-01-01
Objective To apply human factors engineering principles to improve alert interface design. We hypothesized that incorporating human factors principles into alerts would improve usability, reduce workload for prescribers, and reduce prescribing errors. Materials and methods We performed a scenario-based simulation study using a counterbalanced, crossover design with 20 Veterans Affairs prescribers to compare original versus redesigned alerts. We redesigned drug–allergy, drug–drug interaction, and drug–disease alerts based upon human factors principles. We assessed usability (learnability of redesign, efficiency, satisfaction, and usability errors), perceived workload, and prescribing errors. Results Although prescribers received no training on the design changes, prescribers were able to resolve redesigned alerts more efficiently (median (IQR): 56 (47) s) compared to the original alerts (85 (71) s; p=0.015). In addition, prescribers rated redesigned alerts significantly higher than original alerts across several dimensions of satisfaction. Redesigned alerts led to a modest but significant reduction in workload (p=0.042) and significantly reduced the number of prescribing errors per prescriber (median (range): 2 (1–5) compared to original alerts: 4 (1–7); p=0.024). Discussion Aspects of the redesigned alerts that likely contributed to better prescribing include design modifications that reduced usability-related errors, providing clinical data closer to the point of decision, and displaying alert text in a tabular format. Displaying alert text in a tabular format may help prescribers extract information quickly and thereby increase responsiveness to alerts. Conclusions This simulation study provides evidence that applying human factors design principles to medication alerts can improve usability and prescribing outcomes. PMID:24668841
Russ, Alissa L; Zillich, Alan J; Melton, Brittany L; Russell, Scott A; Chen, Siying; Spina, Jeffrey R; Weiner, Michael; Johnson, Elizabette G; Daggy, Joanne K; McManus, M Sue; Hawsey, Jason M; Puleo, Anthony G; Doebbeling, Bradley N; Saleem, Jason J
2014-10-01
To apply human factors engineering principles to improve alert interface design. We hypothesized that incorporating human factors principles into alerts would improve usability, reduce workload for prescribers, and reduce prescribing errors. We performed a scenario-based simulation study using a counterbalanced, crossover design with 20 Veterans Affairs prescribers to compare original versus redesigned alerts. We redesigned drug-allergy, drug-drug interaction, and drug-disease alerts based upon human factors principles. We assessed usability (learnability of redesign, efficiency, satisfaction, and usability errors), perceived workload, and prescribing errors. Although prescribers received no training on the design changes, prescribers were able to resolve redesigned alerts more efficiently (median (IQR): 56 (47) s) compared to the original alerts (85 (71) s; p=0.015). In addition, prescribers rated redesigned alerts significantly higher than original alerts across several dimensions of satisfaction. Redesigned alerts led to a modest but significant reduction in workload (p=0.042) and significantly reduced the number of prescribing errors per prescriber (median (range): 2 (1-5) compared to original alerts: 4 (1-7); p=0.024). Aspects of the redesigned alerts that likely contributed to better prescribing include design modifications that reduced usability-related errors, providing clinical data closer to the point of decision, and displaying alert text in a tabular format. Displaying alert text in a tabular format may help prescribers extract information quickly and thereby increase responsiveness to alerts. This simulation study provides evidence that applying human factors design principles to medication alerts can improve usability and prescribing outcomes. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Westbrook, J I; Li, L; Raban, M Z; Baysari, M T; Prgomet, M; Georgiou, A; Kim, T; Lake, R; McCullagh, C; Dalla-Pozza, L; Karnon, J; O'Brien, T A; Ambler, G; Day, R; Cowell, C T; Gazarian, M; Worthington, R; Lehmann, C U; White, L; Barbaric, D; Gardo, A; Kelly, M; Kennedy, P
2016-01-01
Introduction Medication errors are the most frequent cause of preventable harm in hospitals. Medication management in paediatric patients is particularly complex, and consequently the potential for harm is greater than in adults. Electronic medication management (eMM) systems are heralded as a highly effective intervention to reduce adverse drug events (ADEs), yet internationally the evidence of their effectiveness in paediatric populations is limited. This study will assess the effectiveness of an eMM system to reduce medication errors, ADEs and length of stay (LOS). The study will also investigate system impact on clinical work processes. Methods and analysis A stepped-wedge cluster randomised controlled trial (SWCRCT) will measure changes pre-eMM and post-eMM system implementation in prescribing and medication administration error (MAE) rates, potential and actual ADEs, and average LOS. In stage 1, 8 wards within the first paediatric hospital will be randomised to receive the eMM system 1 week apart. In stage 2, the second paediatric hospital will randomise implementation of a modified eMM and outcomes will be assessed. Prescribing errors will be identified through record reviews, and MAEs through direct observation of nurses and record reviews. Actual and potential severity will be assigned. Outcomes will be assessed at the patient level using mixed models, taking into account correlation of admissions within wards and multiple admissions for the same patient, with adjustment for potential confounders. Interviews and direct observation of clinicians will investigate the effects of the system on workflow. Data from site 1 will be used to develop improvements in the eMM and implemented at site 2, where the SWCRCT design will be repeated (stage 2). Ethics and dissemination The research has been approved by the Human Research Ethics Committee of the Sydney Children's Hospitals Network and Macquarie University.
Results will be reported through academic journals and seminar and conference presentations. Trial registration number Australian New Zealand Clinical Trials Registry (ANZCTR) 370325. PMID:27797997
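The stage-1 stepped-wedge allocation described above amounts to randomly ordering the 8 wards and crossing each over to the eMM system one week apart. A minimal sketch, with placeholder ward names:

```python
import random

def stepped_wedge_schedule(wards, seed=42):
    """Randomly assign each ward a distinct crossover week (1, 2, ...),
    one ward switching to the intervention per week."""
    rng = random.Random(seed)
    order = wards[:]
    rng.shuffle(order)
    return {ward: week for week, ward in enumerate(order, start=1)}

schedule = stepped_wedge_schedule([f"ward_{i}" for i in range(1, 9)])
# Every ward eventually receives the eMM system; randomization determines
# only the order, so each ward contributes both control and intervention time.
```

This is what makes the stepped-wedge design attractive for system rollouts: no ward is permanently denied the intervention, yet the staggered start times still support a controlled comparison.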
Westbrook, J I; Li, L; Raban, M Z; Baysari, M T; Mumford, V; Prgomet, M; Georgiou, A; Kim, T; Lake, R; McCullagh, C; Dalla-Pozza, L; Karnon, J; O'Brien, T A; Ambler, G; Day, R; Cowell, C T; Gazarian, M; Worthington, R; Lehmann, C U; White, L; Barbaric, D; Gardo, A; Kelly, M; Kennedy, P
2016-10-21
Medication errors are the most frequent cause of preventable harm in hospitals. Medication management in paediatric patients is particularly complex, and consequently the potential for harm is greater than in adults. Electronic medication management (eMM) systems are heralded as a highly effective intervention to reduce adverse drug events (ADEs), yet internationally the evidence of their effectiveness in paediatric populations is limited. This study will assess the effectiveness of an eMM system to reduce medication errors, ADEs and length of stay (LOS). The study will also investigate system impact on clinical work processes. A stepped-wedge cluster randomised controlled trial (SWCRCT) will measure changes pre-eMM and post-eMM system implementation in prescribing and medication administration error (MAE) rates, potential and actual ADEs, and average LOS. In stage 1, 8 wards within the first paediatric hospital will be randomised to receive the eMM system 1 week apart. In stage 2, the second paediatric hospital will randomise implementation of a modified eMM and outcomes will be assessed. Prescribing errors will be identified through record reviews, and MAEs through direct observation of nurses and record reviews. Actual and potential severity will be assigned. Outcomes will be assessed at the patient level using mixed models, taking into account correlation of admissions within wards and multiple admissions for the same patient, with adjustment for potential confounders. Interviews and direct observation of clinicians will investigate the effects of the system on workflow. Data from site 1 will be used to develop improvements in the eMM and implemented at site 2, where the SWCRCT design will be repeated (stage 2). The research has been approved by the Human Research Ethics Committee of the Sydney Children's Hospitals Network and Macquarie University. Results will be reported through academic journals and seminar and conference presentations.
Australian New Zealand Clinical Trials Registry (ANZCTR) 370325.
An Analysis of Computational Errors in the Use of Division Algorithms by Fourth-Grade Students.
ERIC Educational Resources Information Center
Stefanich, Greg P.; Rokusek, Teri
1992-01-01
Presents a study that analyzed errors made by randomly chosen fourth grade students (25 of 57) while using the division algorithm and investigated the effect of remediation on identified systematic errors. Results affirm that error pattern diagnosis and directed remediation lead to new learning and long-term retention. (MDH)
ERIC Educational Resources Information Center
Bowe, Melissa; Sellers, Tyra P.
2018-01-01
The Performance Diagnostic Checklist-Human Services (PDC-HS) has been used to assess variables contributing to undesirable staff performance. In this study, three preschool teachers completed the PDC-HS to identify the factors contributing to four paraprofessionals' inaccurate implementation of error-correction procedures during discrete trial…
The Importance of HRA in Human Space Flight: Understanding the Risks
NASA Technical Reports Server (NTRS)
Hamlin, Teri
2010-01-01
Human performance is critical to crew safety during space missions. Humans interact with hardware and software during ground processing, normal flight, and in response to events. Human interactions with hardware and software can cause Loss of Crew and/or Vehicle (LOCV) through improper actions, or may prevent LOCV through recovery and control actions. Humans have the ability to deal with complex situations and system interactions beyond the capability of machines. Human Reliability Analysis (HRA) is a method used to qualitatively and quantitatively assess the occurrence of human failures that affect availability and reliability of complex systems. Modeling human actions with their corresponding failure probabilities in a Probabilistic Risk Assessment (PRA) provides a more complete picture of system risks and risk contributions. A high-quality HRA can provide valuable information on potential areas for improvement, including training, procedures, human interfaces design, and the need for automation. Modeling human error has always been a challenge in part because performance data is not always readily available. For spaceflight, the challenge is amplified not only because of the small number of participants and limited amount of performance data available, but also due to the lack of definition of the unique factors influencing human performance in space. These factors, called performance shaping factors in HRA terminology, are used in HRA techniques to modify basic human error probabilities in order to capture the context of an analyzed task. Many of the human error modeling techniques were developed within the context of nuclear power plants and therefore the methodologies do not address spaceflight factors such as the effects of microgravity and longer duration missions. This presentation will describe the types of human error risks which have shown up as risk drivers in the Shuttle PRA which may be applicable to commercial space flight. 
As with other large PRAs of complex machines, human error in the Shuttle PRA proved to be an important contributor (12 percent) to LOCV. An existing HRA technique was adapted for use in the Shuttle PRA, but additional guidance and improvements are needed to make the HRA task in space-related PRAs easier and more accurate. Therefore, this presentation will also outline plans for expanding current HRA methodology to more explicitly cover spaceflight performance shaping factors.
Comment on "Differential sensitivity to human communication in dogs, wolves, and human infants".
Fiset, Sylvain
2010-07-09
Topál et al. (Reports, 4 September 2009, p. 1269) reported that dogs' sensitivity to reading and using human signals contributes to the emergence of a spatial perseveration error (the A-not-B error) for locating objects. Here, I argue that the authors' conclusion was biased by two confounding factors: the use of an atypical A-not-B search task and an inadequate nonsocial condition as a control.
NASA: Model development for human factors interfacing
NASA Technical Reports Server (NTRS)
Smith, L. L.
1984-01-01
The results of an intensive literature review in the general topics of human error analysis, stress and job performance, and accident and safety analysis revealed no usable techniques or approaches for analyzing human error in ground or space operations tasks. A task review model is described and proposed to be developed in order to reduce the degree of labor intensiveness in ground and space operations tasks. An extensive number of annotated references are provided.
Hamm, Jordan P.; Dyckman, Kara A.; McDowell, Jennifer E.; Clementz, Brett A.
2012-01-01
Cognitive control is required for correct performance on antisaccade tasks, including the ability to inhibit an externally driven ocular motor response (a saccade to a peripheral stimulus) in favor of an internally driven ocular motor goal (a saccade directed away from a peripheral stimulus). Healthy humans occasionally produce errors during antisaccade tasks, but the mechanisms associated with such failures of cognitive control are uncertain. Most research on cognitive control failures focuses on post-stimulus processing, although a growing body of literature highlights a role of intrinsic brain activity in perceptual and cognitive performance. The current investigation used dense array electroencephalography and distributed source analyses to examine brain oscillations across a wide frequency bandwidth in the period prior to antisaccade cue onset. Results highlight four important aspects of ongoing and preparatory brain activations that differentiate error from correct antisaccade trials: (i) ongoing oscillatory beta (20–30Hz) power in anterior cingulate prior to trial initiation (lower for error trials), (ii) instantaneous phase of ongoing alpha-theta (7Hz) in frontal and occipital cortices immediately before trial initiation (opposite between trial types), (iii) gamma power (35–60Hz) in posterior parietal cortex 100 ms prior to cue onset (greater for error trials), and (iv) phase locking of alpha (5–12Hz) in parietal and occipital cortices immediately prior to cue onset (lower for error trials). These findings extend recently reported effects of pre-trial alpha phase on perception to cognitive control processes, and help identify the cortical generators of such phase effects. PMID:22593071
Lee, Benjamin C; Moody, Jonathan B; Poitrasson-Rivière, Alexis; Melvin, Amanda C; Weinberg, Richard L; Corbett, James R; Ficaro, Edward P; Murthy, Venkatesh L
2018-03-23
Patient motion can lead to misalignment of left ventricular volumes of interest and subsequently inaccurate quantification of myocardial blood flow (MBF) and flow reserve (MFR) from dynamic PET myocardial perfusion images. We aimed to identify the prevalence of patient motion in both blood and tissue phases and analyze the effects of this motion on MBF and MFR estimates. We selected 225 consecutive patients that underwent dynamic stress/rest rubidium-82 chloride ( 82 Rb) PET imaging. Dynamic image series were iteratively reconstructed with 5- to 10-second frame durations over the first 2 minutes for the blood phase and 10 to 80 seconds for the tissue phase. Motion shifts were assessed by 3 physician readers from the dynamic series and analyzed for frequency, magnitude, time, and direction of motion. The effects of this motion isolated in time, direction, and magnitude on global and regional MBF and MFR estimates were evaluated. Flow estimates derived from the motion corrected images were used as the error references. Mild to moderate motion (5-15 mm) was most prominent in the blood phase in 63% and 44% of the stress and rest studies, respectively. This motion was observed with frequencies of 75% in the septal and inferior directions for stress and 44% in the septal direction for rest. Images with blood phase isolated motion had mean global MBF and MFR errors of 2%-5%. Isolating blood phase motion in the inferior direction resulted in mean MBF and MFR errors of 29%-44% in the RCA territory. Flow errors due to tissue phase isolated motion were within 1%. Patient motion was most prevalent in the blood phase and MBF and MFR errors increased most substantially with motion in the inferior direction. Motion correction focused on these motions is needed to reduce MBF and MFR errors.
Designing to Control Flight Crew Errors
NASA Technical Reports Server (NTRS)
Schutte, Paul C.; Willshire, Kelli F.
1997-01-01
It is widely accepted that human error is a major contributing factor in aircraft accidents. There has been a significant amount of research into why these errors occurred, and many reports state that the design of the flight deck can actually predispose humans to err. This research has led to calls for changes in design according to human factors and human-centered principles. The National Aeronautics and Space Administration's (NASA) Langley Research Center has initiated an effort to design a human-centered flight deck from a clean slate (i.e., without the constraints of existing designs). The effort will be based on recent research in human-centered design philosophy and mission management categories. This design will match the human's model of the mission and function of the aircraft to reduce unnatural or non-intuitive interfaces. The product of this effort will be a flight deck design description, including training and procedures, a cross-reference or paper trail back to design hypotheses, and an evaluation of the design. The present paper discusses the philosophy, process, and status of this design effort.
Williams, Camille K; Grierson, Lawrence E M; Carnahan, Heather
2011-08-01
A link between affect and action has been supported by the discovery that threat information is prioritized through an action-centred pathway--the dorsal visual stream. Magnocellular afferents, which originate from the retina and project to dorsal stream structures, are suppressed by exposure to diffuse red light, which diminishes humans' perception of threat-based images. In order to explore the role of colour in the relationship between affect and action, participants donned different pairs of coloured glasses (red, yellow, green, blue and clear) and completed Positive and Negative Affect Scale questionnaires as well as a series of target-directed aiming movements. Analyses of affect scores revealed a significant main effect for affect valence and a significant interaction between colour and valence: perceived positive affect was significantly smaller for the red condition. Kinematic analyses of variable error in the primary movement direction and Pearson correlation analyses between the displacements travelled prior to and following peak velocity indicated reduced accuracy and application of online control processes while wearing red glasses. Variable error of aiming was also positively and significantly correlated with negative affect scores under the red condition. These results suggest that only red light modulates the affect-action link by suppressing magnocellular activity, which disrupts visual processing for movement control. Furthermore, previous research examining the effect of the colour red on psychomotor tasks and perceptual acceleration of threat-based imagery suggest that stimulus-driven motor performance tasks requiring online control may be particularly susceptible to this effect.
Causes and Prevention of Laparoscopic Bile Duct Injuries
Way, Lawrence W.; Stewart, Lygia; Gantert, Walter; Liu, Kingsway; Lee, Crystine M.; Whang, Karen; Hunter, John G.
2003-01-01
Objective To apply human performance concepts in an attempt to understand the causes of and prevent laparoscopic bile duct injury. Summary Background Data Powerful conceptual advances have been made in understanding the nature and limits of human performance. Applying these findings in high-risk activities, such as commercial aviation, has allowed the work environment to be restructured to substantially reduce human error. Methods The authors analyzed 252 laparoscopic bile duct injuries according to the principles of the cognitive science of visual perception, judgment, and human error. The injury distribution was class I, 7%; class II, 22%; class III, 61%; and class IV, 10%. The data included operative radiographs, clinical records, and 22 videotapes of original operations. Results The primary cause of error in 97% of cases was a visual perceptual illusion. Faults in technical skill were present in only 3% of injuries. Knowledge and judgment errors were contributory but not primary. Sixty-four injuries (25%) were recognized at the index operation; the surgeon identified the problem early enough to limit the injury in only 15 (6%). In class III injuries the common duct, erroneously believed to be the cystic duct, was deliberately cut. This stemmed from an illusion of object form due to a specific uncommon configuration of the structures and the heuristic nature (unconscious assumptions) of human visual perception. The videotapes showed the persuasiveness of the illusion, and many operative reports described the operation as routine. Class II injuries resulted from a dissection too close to the common hepatic duct. Fundamentally an illusion, it was contributed to in some instances by working too deep in the triangle of Calot. Conclusions These data show that errors leading to laparoscopic bile duct injuries stem principally from misperception, not errors of skill, knowledge, or judgment. 
The misperception was so compelling that in most cases the surgeon did not recognize a problem. Even when irregularities were identified, corrective feedback did not occur, which is characteristic of human thinking under firmly held assumptions. These findings illustrate the complexity of human error in surgery while simultaneously providing insights. They demonstrate that automatically attributing technical complications to behavioral factors that rely on the assumption of control is likely to be wrong. Finally, this study shows that there are only a few points within laparoscopic cholecystectomy where the complication-causing errors occur, which suggests that focused training to heighten vigilance might be able to decrease the incidence of bile duct injury. PMID:12677139
Advancing Usability Evaluation through Human Reliability Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ronald L. Boring; David I. Gertman
2005-07-01
This paper introduces a novel augmentation to the current heuristic usability evaluation methodology. The SPAR-H human reliability analysis method was developed for categorizing human performance in nuclear power plants. Despite the specialized use of SPAR-H for safety critical scenarios, the method also holds promise for use in commercial off-the-shelf software usability evaluations. The SPAR-H method shares task analysis underpinnings with human-computer interaction, and it can be easily adapted to incorporate usability heuristics as performance shaping factors. By assigning probabilistic modifiers to heuristics, it is possible to arrive at the usability error probability (UEP). This UEP is not a literal probability of error but nonetheless provides a quantitative basis for heuristic evaluation. When combined with a consequence matrix for usability errors, this method affords ready prioritization of usability issues.
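The PSF-based arithmetic described in this abstract can be sketched as follows. The nominal error probability, the heuristic names, and the multipliers are illustrative assumptions, not calibrated SPAR-H table values; the adjustment formula is the standard SPAR-H form that keeps the result a valid probability when the composite PSF is large:

```python
# Sketch of a SPAR-H-style usability error probability (UEP) calculation,
# treating violated usability heuristics as performance shaping factors (PSFs).
# All numeric values below are illustrative, not calibrated.

def usability_error_probability(nominal_hep, psf_multipliers):
    """Combine a nominal human error probability with PSF multipliers.

    Uses the SPAR-H adjustment so the result stays a valid probability
    even when the composite PSF is large:
        UEP = NHEP * PSF / (NHEP * (PSF - 1) + 1)
    """
    composite = 1.0
    for m in psf_multipliers:
        composite *= m
    return nominal_hep * composite / (nominal_hep * (composite - 1.0) + 1.0)

# Hypothetical evaluation: two violated heuristics inflate error likelihood.
psfs = {
    "visibility_of_system_status": 2.0,   # moderately degraded (assumed)
    "error_prevention": 5.0,              # poorly supported (assumed)
}
uep = usability_error_probability(0.01, psfs.values())
```

With a nominal HEP of 0.01 and a composite PSF of 10, the adjusted UEP is about 0.092 rather than a naive 0.1, illustrating how the denominator keeps the estimate bounded.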
Opioid errors in inpatient palliative care services: a retrospective review.
Heneka, Nicole; Shaw, Tim; Rowett, Debra; Lapkin, Samuel; Phillips, Jane L
2018-06-01
Opioids are a high-risk medicine frequently used to manage palliative patients' cancer-related pain and other symptoms. Despite the high volume of opioid use in inpatient palliative care services, and the potential for patient harm, few studies have focused on opioid errors in this population. To (i) identify the number of opioid errors reported by inpatient palliative care services, (ii) identify reported opioid error characteristics and (iii) determine the impact of opioid errors on palliative patient outcomes. A 24-month retrospective review of opioid errors reported in three inpatient palliative care services in one Australian state. Of the 55 opioid errors identified, 84% reached the patient. Most errors involved morphine (35%) or hydromorphone (29%). Opioid administration errors accounted for 76% of reported opioid errors, largely due to omitted dose (33%) or wrong dose (24%) errors. Patients were more likely to receive a lower dose of opioid than ordered as a direct result of an opioid error (57%), with errors adversely impacting pain and/or symptom management in 42% of patients. Half (53%) of the affected patients required additional treatment and/or care as a direct consequence of the opioid error. This retrospective review has provided valuable insights into the patterns and impact of opioid errors in inpatient palliative care services. Iatrogenic harm related to opioid underdosing errors contributed to palliative patients' unrelieved pain. Better understanding the factors that contribute to opioid errors and the role of safety culture in the palliative care service context warrants further investigation.
Őri, Zsolt P
2017-05-01
A mathematical model has been developed to facilitate indirect measurements of difficult to measure variables of the human energy metabolism on a daily basis. The model performs recursive system identification of the parameters of the metabolic model of the human energy metabolism using the law of conservation of energy and principle of indirect calorimetry. Self-adaptive models of the utilized energy intake prediction, macronutrient oxidation rates, and daily body composition changes were created utilizing Kalman filter and the nominal trajectory methods. The accuracy of the models was tested in a simulation study utilizing data from the Minnesota starvation and overfeeding study. With biweekly macronutrient intake measurements, the average prediction error of the utilized carbohydrate intake was -23.2 ± 53.8 kcal/day, fat intake was 11.0 ± 72.3 kcal/day, and protein was 3.7 ± 16.3 kcal/day. The fat and fat-free mass changes were estimated with an error of 0.44 ± 1.16 g/day for fat and -2.6 ± 64.98 g/day for fat-free mass. The daily metabolized macronutrient energy intake and/or daily macronutrient oxidation rate and the daily body composition change from directly measured serial data are optimally predicted with a self-adaptive model with Kalman filter that uses recursive system identification.
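The recursive estimation idea described above can be reduced to a minimal scalar sketch: a Kalman filter tracking one slowly varying quantity (say, daily metabolized energy intake) under a random-walk state model. The noise variances, initial guess, and measurements are made-up illustrative values, not the paper's model:

```python
# Minimal scalar Kalman filter sketch for recursively estimating a slowly
# varying quantity (e.g., daily metabolized energy intake in kcal/day) from
# noisy daily measurements. All variances here are illustrative assumptions.

def kalman_update(x_est, p_est, z, q=25.0, r=100.0):
    """One predict/update step for a random-walk state model.

    x_est, p_est : prior state estimate and its variance
    z            : new noisy measurement
    q, r         : process and measurement noise variances (assumed)
    """
    # Predict: under a random walk the state estimate carries over,
    # but its uncertainty grows by the process noise.
    p_pred = p_est + q
    # Update: blend prediction and measurement via the Kalman gain.
    k = p_pred / (p_pred + r)
    x_new = x_est + k * (z - x_est)
    p_new = (1.0 - k) * p_pred
    return x_new, p_new

x, p = 2000.0, 1000.0          # initial guess (kcal/day) and its variance
for z in [2150.0, 2080.0, 2120.0, 2095.0]:
    x, p = kalman_update(x, p, z)
```

After a few measurements the estimate settles between the prior and the observations while its variance shrinks, which is the behavior the self-adaptive models above rely on.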
Task-oriented display design - Concept and example
NASA Technical Reports Server (NTRS)
Abbott, Terence S.
1989-01-01
The general topic was in the area of display design alternatives for improved man-machine performance. The intent was to define and assess a display design concept oriented toward providing this task-oriented information. The major focus of this concept deals with the processing of data into parameters that are more relevant to the task of the human operator. Closely coupled to this concept of relevant information is the form or manner in which this information is actually presented. Conventional forms of presentation are normally a direct representation of the underlying data. By providing information in a form that is more easily assimilated and understood, a reduction in human error and cognitive workload may be obtained. A description of this proposed concept with a design example is provided. The application for the example was an engine display for a generic, twin-engine civil transport aircraft. The product of this concept was evaluated against a functionally similar, traditional display. The results of this evaluation showed that a task-oriented approach to design is a viable concept with regard to reducing user error and cognitive workload. The goal of this design process, providing task-oriented information to the user, both in content and form, appears to be a feasible mechanism for increasing the overall performance of a man-machine system.
How glitter relates to gold: similarity-dependent reward prediction errors in the human striatum.
Kahnt, Thorsten; Park, Soyoung Q; Burke, Christopher J; Tobler, Philippe N
2012-11-14
Optimal choices benefit from previous learning. However, it is not clear how previously learned stimuli influence behavior to novel but similar stimuli. One possibility is to generalize based on the similarity between learned and current stimuli. Here, we use neuroscientific methods and a novel computational model to inform the question of how stimulus generalization is implemented in the human brain. Behavioral responses during an intradimensional discrimination task showed similarity-dependent generalization. Moreover, a peak shift occurred, i.e., the peak of the behavioral generalization gradient was displaced from the rewarded conditioned stimulus in the direction away from the unrewarded conditioned stimulus. To account for the behavioral responses, we designed a similarity-based reinforcement learning model wherein prediction errors generalize across similar stimuli and update their value. We show that this model predicts a similarity-dependent neural generalization gradient in the striatum as well as changes in responding during extinction. Moreover, across subjects, the width of generalization was negatively correlated with functional connectivity between the striatum and the hippocampus. This result suggests that hippocampus-striatal connections contribute to stimulus-specific value updating by controlling the width of generalization. In summary, our results shed light onto the neurobiology of a fundamental, similarity-dependent learning principle that allows learning the value of stimuli that have never been encountered.
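The similarity-based update the authors describe can be illustrated roughly as follows: the prediction error computed at the trained stimulus is spread to neighboring stimuli through a Gaussian similarity kernel. The one-dimensional stimulus spacing, learning rate, and kernel width are assumptions for illustration, not the paper's fitted parameters:

```python
import math

# Sketch of similarity-dependent reward prediction error learning: the
# prediction error at the trained stimulus updates the values of similar
# stimuli in proportion to a Gaussian similarity kernel. Parameters are
# illustrative assumptions.

def similarity(s_a, s_b, width=1.0):
    """Gaussian similarity between two stimuli on a single dimension."""
    return math.exp(-((s_a - s_b) ** 2) / (2.0 * width ** 2))

def update_values(values, trained, reward, alpha=0.2, width=1.0):
    """Spread the reward prediction error across similar stimuli."""
    delta = reward - values[trained]   # prediction error at the trained stimulus
    for s in values:
        values[s] += alpha * similarity(s, trained, width) * delta
    return values

# Five stimuli along one dimension; stimulus 2 is rewarded, stimulus 4 is not.
v = {s: 0.0 for s in range(5)}
for _ in range(50):
    update_values(v, trained=2, reward=1.0)   # rewarded CS+ trial
    update_values(v, trained=4, reward=0.0)   # unrewarded CS- trial
```

After training, the value gradient is asymmetric: the stimulus on the side away from the unrewarded stimulus ends up higher than its mirror neighbor, which is the qualitative peak-shift effect described in the abstract.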
NASA Astrophysics Data System (ADS)
Rojo, Pilar; Royo, Santiago; Caum, Jesus; Ramírez, Jorge; Madariaga, Ines
2015-02-01
Peripheral refraction, the refractive error present outside the main direction of gaze, has lately attracted interest due to its alleged relationship with the progression of myopia. The ray tracing procedures involved in its calculation need to follow an approach different from those used in conventional ophthalmic lens design, where refractive errors are compensated only in the main direction of gaze. We present a methodology for the evaluation of the peripheral refractive error in ophthalmic lenses, adapting the conventional generalized ray tracing approach to the requirements of the evaluation of peripheral refraction. The nodal point of the eye and a retinal conjugate surface are used to evaluate the three-dimensional distribution of refractive error around the fovea. The proposed approach enables us to calculate the three-dimensional peripheral refraction induced by any ophthalmic lens at any direction of gaze and to personalize the lens design to the requirements of the user. The complete evaluation process is detailed for a given user prescribed with a -5.76D ophthalmic lens for foveal vision, and comparative results are presented for a lens with modified geometry and for over- or undercorrection of the central refractive error. The methodology is also applied to an emmetropic eye to show its application to refractive errors other than myopia.
Autonomous Control Modes and Optimized Path Guidance for Shipboard Landing in High Sea States
2015-11-16
a degraded visual environment, workload during the landing task begins to approach the limits of a human pilot’s capability. It is a similarly...
(Figure residue removed; recoverable captions: Figure 2, Approach Trajectory, with ±4 ft, ±8 ft, and ±12 ft landing-error bounds on the flight path; Figure 5, open-loop system generation, heave and yaw axes.)
The Accuracy of GBM GRB Localizations
NASA Astrophysics Data System (ADS)
Briggs, Michael Stephen; Connaughton, V.; Meegan, C.; Hurley, K.
2010-03-01
We report a study of the accuracy of GBM GRB localizations, analyzing three types of localizations: those produced automatically by the GBM Flight Software on board GBM, those produced automatically with ground software in near real time, and those produced with human guidance. The two types of automatic locations are distributed in near real time via GCN Notices; the human-guided locations are distributed on a timescale of many minutes or hours via GCN Circulars. This work uses a Bayesian analysis that models the distribution of the GBM total location error by comparing GBM locations to more accurate locations obtained with other instruments. Reference locations are obtained from Swift, Super-AGILE, the LAT, and the IPN. We model the GBM total location errors as having systematic errors in addition to the statistical errors and use the Bayesian analysis to constrain the systematic errors.
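The core of such an analysis, combining each quoted statistical error with an unknown systematic error in quadrature and constraining the systematic term with a grid posterior under a flat prior, can be sketched in miniature. The offsets and statistical errors below are invented numbers for illustration, not GBM data, and the Gaussian offset model is a simplifying assumption:

```python
import math

# Toy sketch: constrain a systematic location error given observed offsets
# between one instrument's locations and more accurate reference locations.
# All numbers are invented for illustration.

offsets   = [3.2, 1.8, 4.5, 2.1, 3.9]   # deg, observed total location errors
stat_errs = [1.0, 0.8, 1.2, 0.9, 1.1]   # deg, quoted statistical 1-sigma errors

def log_likelihood(sys_err):
    """Gaussian log-likelihood of the offsets for a given systematic error,
    with statistical and systematic terms combined in quadrature."""
    ll = 0.0
    for d, s in zip(offsets, stat_errs):
        sigma2 = s * s + sys_err * sys_err
        ll += -0.5 * d * d / sigma2 - 0.5 * math.log(2.0 * math.pi * sigma2)
    return ll

# Flat prior on a grid of candidate systematic errors; normalize to a posterior.
grid = [0.1 * i for i in range(1, 101)]          # 0.1 .. 10.0 deg
logp = [log_likelihood(s) for s in grid]
m = max(logp)
post = [math.exp(lp - m) for lp in logp]
total = sum(post)
post = [p / total for p in post]
best = grid[post.index(max(post))]               # posterior mode
```

Because the toy offsets are noticeably larger than the quoted statistical errors, the posterior mode lands well above zero, which is the signature of a real systematic component.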
Servant, Mathieu; White, Corey; Montagnini, Anna; Burle, Borís
2015-07-15
Most decisions that we make build upon multiple streams of sensory evidence and control mechanisms are needed to filter out irrelevant information. Sequential sampling models of perceptual decision making have recently been enriched by attentional mechanisms that weight sensory evidence in a dynamic and goal-directed way. However, the framework retains the longstanding hypothesis that motor activity is engaged only once a decision threshold is reached. To probe latent assumptions of these models, neurophysiological indices are needed. Therefore, we collected behavioral and EMG data in the flanker task, a standard paradigm to investigate decisions about relevance. Although the models captured response time distributions and accuracy data, EMG analyses of response agonist muscles challenged the assumption of independence between decision and motor processes. Those analyses revealed covert incorrect EMG activity ("partial error") in a fraction of trials in which the correct response was finally given, providing intermediate states of evidence accumulation and response activation at the single-trial level. We extended the models by allowing motor activity to occur before a commitment to a choice and demonstrated that the proposed framework captured the rate, latency, and EMG surface of partial errors, along with the speed of the correction process. In return, EMG data provided strong constraints to discriminate between competing models that made similar behavioral predictions. Our study opens new theoretical and methodological avenues for understanding the links among decision making, cognitive control, and motor execution in humans. Sequential sampling models of perceptual decision making assume that sensory information is accumulated until a criterion quantity of evidence is obtained, from where the decision terminates in a choice and motor activity is engaged. 
The very existence of covert incorrect EMG activity ("partial error") during the evidence accumulation process challenges this longstanding assumption. In the present work, we use partial errors to better constrain sequential sampling models at the single-trial level.
Yuan, Fenghua; Lai, Fangfang; Gu, Liya; Zhou, Wen; El Hokayem, Jimmy; Zhang, Yanbin
2009-05-01
Mismatch repair corrects biosynthetic errors generated during DNA replication; its deficiency causes a mutator phenotype and directly underlies hereditary non-polyposis colorectal cancer and sporadic cancers. Because of the remarkably high conservation of the mismatch repair machinery between the budding yeast (Saccharomyces cerevisiae) and humans, the study of mismatch repair in yeast has provided tremendous insights into the mechanisms of this repair pathway in humans. In addition, yeast cells possess an unbeatable advantage over human cells in terms of easy genetic manipulation, the availability of whole-genome deletion strains, and the relatively low cost of setting up the system. Although many components of eukaryotic mismatch repair have been identified, it remains unclear whether additional factors, such as DNA helicase(s) and redundant nuclease(s) besides EXO1, participate in eukaryotic mismatch repair. To facilitate the discovery of novel mismatch repair factors, we developed a straightforward in vitro cell-free repair system. Here, we describe practical protocols for the preparation of yeast cell-free nuclear extracts and DNA mismatch substrates, and the in vitro mismatch repair assay. The validity of the cell-free system was confirmed with a mismatch repair deficient yeast strain (Δmsh2) and a complementation assay with purified yeast MSH2-MSH6.
THERP and HEART integrated methodology for human error assessment
NASA Astrophysics Data System (ADS)
Castiglia, Francesco; Giardina, Mariarosa; Tomarchio, Elio
2015-11-01
THERP and HEART integrated methodology is proposed to investigate accident scenarios that involve operator errors during high-dose-rate (HDR) treatments. The new approach has been modified on the basis of fuzzy set concept with the aim of prioritizing an exhaustive list of erroneous tasks that can lead to patient radiological overexposures. The results allow for the identification of human errors that are necessary to achieve a better understanding of health hazards in the radiotherapy treatment process, so that it can be properly monitored and appropriately managed.
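For context, the classical HEART calculation that such an integrated approach builds on scales a generic task error probability by each error-producing condition (EPC), weighted by the analyst's assessed proportion of affect (APOA). The generic HEP, EPC multipliers, and proportions below are illustrative values, not numbers from the paper or from a HEART table:

```python
# Sketch of the classical (non-fuzzy) HEART human error probability (HEP)
# calculation: HEP = generic_HEP * product over EPCs of ((EPC - 1) * APOA + 1).
# All numeric inputs are illustrative assumptions.

def heart_hep(generic_hep, epcs):
    """epcs: list of (max_multiplier, assessed_proportion_of_affect) pairs."""
    hep = generic_hep
    for multiplier, proportion in epcs:
        hep *= (multiplier - 1.0) * proportion + 1.0
    return min(hep, 1.0)   # clamp: a probability cannot exceed 1

# Hypothetical HDR treatment task with two assumed EPCs:
# time shortage (max x11, 40% affect) and unfamiliarity (max x17, 20% affect).
hep = heart_hep(0.003, [(11.0, 0.4), (17.0, 0.2)])
```

The fuzzy extension described in the abstract would replace the crisp multipliers and proportions with fuzzy numbers, but the underlying combination rule is the same.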
Popa, Laurentiu S.; Hewitt, Angela L.; Ebner, Timothy J.
2012-01-01
The cerebellum has been implicated in processing motor errors required for online control of movement and motor learning. The dominant view is that Purkinje cell complex spike discharge signals motor errors. This study investigated whether errors are encoded in the simple spike discharge of Purkinje cells in monkeys trained to manually track a pseudo-randomly moving target. Four task error signals were evaluated based on cursor movement relative to target movement. Linear regression analyses based on firing residuals ensured that the modulation with a specific error parameter was independent of the other error parameters and kinematics. The results demonstrate that simple spike firing in lobules IV–VI is significantly correlated with position, distance and directional errors. Independent of the error signals, the same Purkinje cells encode kinematics. The strongest error modulation occurs at feedback timing. However, in 72% of cells at least one of the R2 temporal profiles resulting from regressing firing with individual errors exhibit two peak R2 values. For these bimodal profiles, the first peak is at a negative τ (lead) and a second peak at a positive τ (lag), implying that Purkinje cells encode both prediction and feedback about an error. For the majority of the bimodal profiles, the signs of the regression coefficients or preferred directions reverse at the times of the peaks. The sign reversal results in opposing simple spike modulation for the predictive and feedback components. Dual error representations may provide the signals needed to generate sensory prediction errors used to update a forward internal model. PMID:23115173
NASA Astrophysics Data System (ADS)
Huo, Ming-Xia; Li, Ying
2017-12-01
Quantum error correction allows us to reliably process information encoded in quantum error-correcting codes, and efficient error correction benefits from knowledge of the error rates. We propose a protocol for monitoring error rates in real time without interrupting the quantum error correction; no adaptation of the quantum error correction code or its implementation circuit is required. The protocol can be directly applied to the most advanced quantum error correction techniques, e.g. the surface code. A Gaussian-process algorithm is used to estimate and predict error rates based on past error correction data. We find that using these estimated error rates, the probability of error correction failure can be significantly reduced, by a factor that increases with the code distance.
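A minimal numerical sketch of the kind of Gaussian-process regression the protocol relies on, using an assumed RBF kernel and synthetic windowed error-rate observations (not the authors' implementation or parameters):

```python
import numpy as np

# Sketch (assumptions: RBF kernel, Gaussian noise on windowed error-rate
# estimates) of Gaussian-process tracking of a slowly drifting error rate
# from past error-correction rounds.
rng = np.random.default_rng(1)
t = np.arange(0, 50, dtype=float)            # time of each estimation window
true_rate = 0.01 + 0.004 * np.sin(t / 10)    # drifting physical error rate
obs = true_rate + 0.002 * rng.standard_normal(t.size)  # noisy window estimates

def rbf(a, b, length=8.0, var=1e-4):
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

noise = 0.002 ** 2
K = rbf(t, t) + noise * np.eye(t.size)       # kernel matrix + observation noise
t_star = np.array([55.0])                    # predict the next window's rate
k_star = rbf(t, t_star)
alpha = np.linalg.solve(K, obs - obs.mean())
pred = obs.mean() + k_star.T @ alpha         # GP posterior mean at t_star
print(f"predicted error rate: {pred[0]:.4f}")
```

The predicted rate for the upcoming round can then feed the decoder, which is the point of estimating error rates without pausing the correction cycle.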
Patient safety in otolaryngology: a descriptive review.
Danino, Julian; Muzaffar, Jameel; Metcalfe, Chris; Coulson, Chris
2017-03-01
Human evaluation and judgement may include errors that can have disastrous results. Within medicine and healthcare there has been slow progress towards major changes in safety. Healthcare lags behind other specialised industries, such as aviation and nuclear power, where there have been significant improvements in overall safety, especially in reducing the risk of errors. Following several high-profile cases in the USA during the 1990s, a report titled "To Err Is Human: Building a Safer Health System" was published. The report extrapolated that in the USA approximately 50,000 to 100,000 patients may die each year as a result of medical errors. Traditionally, otolaryngology has always been regarded as a "safe specialty". A study in the USA in 2004 inferred that there may be 2600 cases of major morbidity and 165 deaths within the specialty. MEDLINE, via the PubMed interface, was searched for English-language articles published between 2000 and 2012; each search combined two or three of the keywords noted earlier. Limitations are related to several generic topics within patient safety in otolaryngology. Other areas covered have been current relevant topics due to recent interest or new advances in technology. There has been a heightened awareness of patient safety within the healthcare community; it has become a major priority. Focus has shifted from apportioning blame to prevention of errors and implementation of patient safety mechanisms in healthcare delivery. Errors can be divided into those due to action and those due to knowledge or planning. In healthcare there are several factors that may influence adverse events and patient safety. Although technology may improve patient safety, it also introduces new sources of error. The ability to work with people allows for an increase in safety netting, and team working has been shown to have a beneficial effect on patient safety. Any field of work involving human decision-making will always have a risk of error.
Within otolaryngology, although patient safety has evolved along similar themes to other surgical specialties, there are several specific high-risk areas. Medical error is a common problem and its human cost is of immense importance. Steps to reduce such errors require the identification of high-risk practice within a complex healthcare system. The commitment to patient safety and quality improvement in medicine depends on personal responsibility and professional accountability.
Henneman, Elizabeth A; Roche, Joan P; Fisher, Donald L; Cunningham, Helene; Reilly, Cheryl A; Nathanson, Brian H; Henneman, Philip L
2010-02-01
This study examined types of errors that occurred or were recovered in a simulated environment by student nurses. Errors occurred in all four rule-based error categories, and all students committed at least one error. The most frequent errors occurred in the verification category. Another common error was related to physician interactions. The least common errors were related to coordinating information with the patient and family. Our finding that 100% of student subjects committed rule-based errors is cause for concern. To decrease errors and improve safe clinical practice, nurse educators must identify effective strategies that students can use to improve patient surveillance. Copyright 2010 Elsevier Inc. All rights reserved.
Local blur analysis and phase error correction method for fringe projection profilometry systems.
Rao, Li; Da, Feipeng
2018-05-20
We introduce a flexible error correction method for fringe projection profilometry (FPP) systems in the presence of local blur. Local blur caused by global light transport, such as camera defocus, projector defocus, and subsurface scattering, will cause significant systematic errors in FPP systems. Previous methods, which adopt high-frequency patterns to separate the direct and global components, fail when these global light effects occur locally. In this paper, the influence of local blur on phase quality is thoroughly analyzed, and a concise error correction method is proposed to compensate for the phase errors. For defocus, this method can be directly applied. With the aid of spatially varying point spread functions and a local frontal plane assumption, experiments show that the proposed method can effectively alleviate the systematic errors and improve the final reconstruction accuracy in various scenes. For a subsurface scattering scenario, if the translucent object is dominated by multiple scattering, the proposed method can also be applied to correct systematic errors once the bidirectional scattering-surface reflectance distribution function of the object material is measured.
Navigator alignment using radar scan
Doerry, Armin W.; Marquette, Brandeis
2016-04-05
The various technologies presented herein relate to the determination and correction of the heading error of a platform. Knowledge of at least one of a maximum Doppler frequency or a minimum Doppler bandwidth pertaining to a plurality of radar echoes can be utilized to facilitate correction of the heading error, which can occur as a result of component drift. In an ideal situation, the boresight direction of an antenna, or the front of an aircraft, will have associated therewith at least one of a maximum Doppler frequency or a minimum Doppler bandwidth. As the boresight direction of the antenna strays from the direction of travel, at least one of the maximum Doppler frequency or the minimum Doppler bandwidth will shift, either left or right, away from the ideal situation.
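Under the standard Doppler geometry the patent invokes, the echo Doppler frequency peaks when the antenna boresight aligns with the velocity vector, so the angular location of the Doppler maximum estimates the heading error. A hedged sketch with assumed platform parameters:

```python
import numpy as np

# Illustrative sketch (assumed geometry, not the patented implementation):
# for a forward-looking antenna, peak echo Doppler follows
#   f_d(theta) = (2*v/lambda) * cos(theta - heading_error),
# so scanning theta and locating the Doppler maximum recovers the error.
v, lam = 100.0, 0.03            # assumed platform speed (m/s) and wavelength (m)
true_err = np.deg2rad(2.5)      # unknown heading error to recover

theta = np.deg2rad(np.linspace(-10, 10, 201))   # scan angles
fd = (2 * v / lam) * np.cos(theta - true_err)   # peak Doppler per scan angle

# Quadratic fit around the maximum gives sub-sample angular resolution.
i = np.argmax(fd)
a, b, c = np.polyfit(theta[i - 1:i + 2], fd[i - 1:i + 2], 2)
est = -b / (2 * a)              # vertex of the fitted parabola
print(f"estimated heading error: {np.rad2deg(est):.2f} deg")
```

The same vertex-of-a-parabola trick applies when the minimum Doppler bandwidth, rather than the maximum frequency, is the observable.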
Body mass and stature estimation based on the first metatarsal in humans.
De Groote, Isabelle; Humphrey, Louise T
2011-04-01
Archaeological assemblages often lack the complete long bones needed to estimate stature and body mass. The most accurate estimates of body mass and stature are produced using femoral head diameter and femur length. Foot bones including the first metatarsal preserve relatively well in a range of archaeological contexts. In this article we present regression equations using the first metatarsal to estimate femoral head diameter, femoral length, and body mass in a diverse human sample. The skeletal sample comprised 87 individuals (Andamanese, Australasians, Africans, Native Americans, and British). Results show that all first metatarsal measurements correlate moderately to highly (r = 0.62-0.91) with femoral head diameter and length. The proximal articular dorsoplantar diameter is the best single measurement to predict both femoral dimensions. Percent standard errors of the estimate are below 5%. Equations using two metatarsal measurements show a small increase in accuracy. Direct estimations of body mass (calculated from measured femoral head diameter using previously published equations) have an error of just over 7%. No direct stature estimation equations were derived due to the varied linear body proportions represented in the sample. The equations were tested on a sample of 35 individuals from Christ Church Spitalfields. Percentage differences in estimated and measured femoral head diameter and length were less than 1%. This study demonstrates that it is feasible to use the first metatarsal in the estimation of body mass and stature. The equations presented here are particularly useful for assemblages where the long bones are either missing or fragmented, and enable estimation of these fundamental population parameters in poorly preserved assemblages. Copyright © 2011 Wiley-Liss, Inc.
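The regression approach above, fitting a femoral dimension on a metatarsal measurement and reporting the percent standard error of the estimate, can be sketched on synthetic data; the coefficients and sample below are illustrative assumptions, not the study's published equations:

```python
import numpy as np

# Illustrative sketch with synthetic data (assumed values, not the study's
# sample): least-squares estimation of femoral head diameter (FHD, mm) from
# the first metatarsal's proximal articular dorsoplantar diameter (mm), plus
# the percent standard error of the estimate (%SEE) used to judge accuracy.
rng = np.random.default_rng(2)
mt1 = rng.uniform(18, 26, 87)                     # metatarsal measurements (mm)
fhd = 1.6 * mt1 + 8.0 + rng.normal(0, 1.2, 87)    # assumed linear relation

slope, intercept = np.polyfit(mt1, fhd, 1)
pred = slope * mt1 + intercept
# Standard error of the estimate with n - 2 degrees of freedom
see = np.sqrt(np.sum((fhd - pred) ** 2) / (fhd.size - 2))
pct_see = 100 * see / fhd.mean()
print(f"FHD ~ {slope:.2f} * MT1 + {intercept:.2f}, %SEE = {pct_see:.1f}%")
```

Reporting %SEE relative to the mean of the predicted dimension is what lets equations from different skeletal elements be compared on accuracy.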
Normal accidents: human error and medical equipment design.
Dain, Steven
2002-01-01
High-risk systems, which are typical of our technologically complex era, include not just nuclear power plants but also hospitals, anesthesia systems, and the practice of medicine and perfusion. In high-risk systems, no matter how effective safety devices are, some types of accidents are inevitable because the system's complexity leads to multiple and unexpected interactions. It is important for healthcare providers to apply a risk assessment and management process to decisions involving new equipment and procedures or staffing matters in order to minimize the residual risks of latent errors, which are amenable to correction because of the large window of opportunity for their detection. This article provides an introduction to basic risk management and error theory principles and examines ways in which they can be applied to reduce and mitigate the inevitable human errors that accompany high-risk systems. The article also discusses "human factor engineering" (HFE), the process which is used to design equipment/human interfaces in order to mitigate design errors. The HFE process involves interaction between designers and end-users to produce a series of continuous refinements that are incorporated into the final product. The article also examines common design problems encountered in the operating room that may predispose operators to commit errors resulting in harm to the patient. While recognizing that errors and accidents are unavoidable, organizations that function within a high-risk system must adopt a "safety culture" that anticipates problems and acts aggressively through an anonymous, "blameless" reporting mechanism to resolve them. We must continuously examine and improve the design of equipment and procedures, personnel, supplies and materials, and the environment in which we work to reduce error and minimize its effects.
Healthcare providers must take a leading role in the day-to-day management of the "Perioperative System" and be a role model in promoting a culture of safety in their organizations.
The Measurement and Correction of the Periodic Error of the LX200-16 Telescope Driving System
NASA Astrophysics Data System (ADS)
Jeong, Jang Hae; Lee, Young Sam; Lee, Chung Uk
2000-06-01
We examined and corrected the periodic error of the LX200-16 telescope driving system at the Chungbuk National University Campus Observatory. Before correction, the standard deviation of the periodic error in the East-West direction was σ = 7.″2. After correction, we found that the periodic error was reduced to σ = 1.″2.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-15
....gov/acs/www/ or contact the Census Bureau's Social, Economic, and Housing Statistics Division at (301...)... direction; and (2) Sampling Error, which consists of the error that arises from the use of probability sampling to create the sample...
"Apologies" from pathologists: why, when, and how to say "sorry" after committing a medical error.
Dewar, Rajan; Parkash, Vinita; Forrow, Lachlan; Truog, Robert D
2014-05-01
How pathologists communicate an error is complicated by the absence of a direct physician-patient relationship. Using 2 examples, we elaborate on how other physician colleagues routinely play an intermediary role in our day-to-day transactions and in the communication of a pathologist error to the patient. The concept of a "dual-hybrid" mind-set in the intermediary physician and its role in representing the pathologists' viewpoint adequately is considered. In a dual-hybrid mind-set, the intermediary physician can align with the patients' philosophy and like the patient, consider the smallest deviation from norm to be an error. Alternatively, they might embrace the traditional physician philosophy and communicate only those errors that resulted in a clinically inappropriate outcome. Neither may effectively reflect the pathologists' interests. We propose that pathologists develop strategies to communicate errors that include considerations of meeting with the patients directly. Such interactions promote healing for the patient and are relieving to the well-intentioned pathologist.
Correction of a Technical Error in the Golf Swing: Error Amplification Versus Direct Instruction.
Milanese, Chiara; Corte, Stefano; Salvetti, Luca; Cavedon, Valentina; Agostini, Tiziano
2016-01-01
Performance errors drive motor learning for many tasks. The authors' aim was to determine which of two strategies, method of amplification of error (MAE) or direct instruction (DI), would be more beneficial for error correction during a full golfing swing with a driver. Thirty-four golfers were randomly assigned to one of three training conditions (MAE, DI, and control). Participants were tested in a practice session in which each golfer performed 7 pretraining trials, 6 training-intervention trials, and 7 posttraining trials; and a retention test after 1 week. An optoeletronic motion capture system was used to measure the kinematic parameters of each golfer's performance. Results showed that MAE is an effective strategy for correcting the technical errors leading to a rapid improvement in performance. These findings could have practical implications for sport psychology and physical education because, while practice is obviously necessary for improving learning, the efficacy of the learning process is essential in enhancing learners' motivation and sport enjoyment.
E-prescribing errors in community pharmacies: exploring consequences and contributing factors.
Odukoya, Olufunmilola K; Stone, Jamie A; Chui, Michelle A
2014-06-01
To explore types of e-prescribing errors in community pharmacies and their potential consequences, as well as the factors that contribute to e-prescribing errors. Data collection involved performing 45 total hours of direct observations in five pharmacies. Follow-up interviews were conducted with 20 study participants. Transcripts from observations and interviews were subjected to content analysis using NVivo 10. Pharmacy staff detected 75 e-prescription errors during the 45 h observation in pharmacies. The most common e-prescribing errors were wrong drug quantity, wrong dosing directions, wrong duration of therapy, and wrong dosage formulation. Participants estimated that 5 in 100 e-prescriptions have errors. Drug classes that were implicated in e-prescribing errors were antiinfectives, inhalers, ophthalmic, and topical agents. The potential consequences of e-prescribing errors included increased likelihood of the patient receiving incorrect drug therapy, poor disease management for patients, additional work for pharmacy personnel, increased cost for pharmacies and patients, and frustrations for patients and pharmacy staff. Factors that contribute to errors included: technology incompatibility between pharmacy and clinic systems, technology design issues such as use of auto-populate features and dropdown menus, and inadvertently entering incorrect information. Study findings suggest that a wide range of e-prescribing errors is encountered in community pharmacies. Pharmacists and technicians perceive that causes of e-prescribing errors are multidisciplinary and multifactorial, that is to say e-prescribing errors can originate from technology used in prescriber offices and pharmacies. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
E-Prescribing Errors in Community Pharmacies: Exploring Consequences and Contributing Factors
Stone, Jamie A.; Chui, Michelle A.
2014-01-01
Objective To explore types of e-prescribing errors in community pharmacies and their potential consequences, as well as the factors that contribute to e-prescribing errors. Methods Data collection involved performing 45 total hours of direct observations in five pharmacies. Follow-up interviews were conducted with 20 study participants. Transcripts from observations and interviews were subjected to content analysis using NVivo 10. Results Pharmacy staff detected 75 e-prescription errors during the 45 hour observation in pharmacies. The most common e-prescribing errors were wrong drug quantity, wrong dosing directions, wrong duration of therapy, and wrong dosage formulation. Participants estimated that 5 in 100 e-prescriptions have errors. Drug classes that were implicated in e-prescribing errors were antiinfectives, inhalers, ophthalmic, and topical agents. The potential consequences of e-prescribing errors included increased likelihood of the patient receiving incorrect drug therapy, poor disease management for patients, additional work for pharmacy personnel, increased cost for pharmacies and patients, and frustrations for patients and pharmacy staff. Factors that contribute to errors included: technology incompatibility between pharmacy and clinic systems, technology design issues such as use of auto-populate features and dropdown menus, and inadvertently entering incorrect information. Conclusion Study findings suggest that a wide range of e-prescribing errors are encountered in community pharmacies. Pharmacists and technicians perceive that causes of e-prescribing errors are multidisciplinary and multifactorial, that is to say e-prescribing errors can originate from technology used in prescriber offices and pharmacies. PMID:24657055
Clarke, D L; Kong, V Y; Naidoo, L C; Furlong, H; Aldous, C
2013-01-01
Acute surgical patients are particularly vulnerable to human error. The Acute Physiological Support Team (APST) was created with the twin objectives of identifying high-risk acute surgical patients in the general wards and reducing both the incidence and the impact of error on these patients. A number of error taxonomies were used to understand the causes of human error, and a simple risk stratification system was adopted to identify patients who are particularly at risk of error. During the period November 2012-January 2013, a total of 101 surgical patients were cared for by the APST at Edendale Hospital. The average age was forty years. There were 36 females and 65 males, and 66 general surgical patients and 35 trauma patients. Fifty-six patients were referred on the day of their admission. The average length of stay in the APST was four days. Eleven patients were haemodynamically unstable on presentation and twelve were clinically septic. The reasons for referral were sepsis (4), respiratory distress (3), acute kidney injury (AKI) (38), post-operative monitoring (39), pancreatitis (3), ICU down-referral (7), hypoxia (5), low GCS (1), and coagulopathy (1). The mortality rate was 13%. A total of thirty-six patients experienced 56 errors. A total of 143 interventions were initiated by the APST, including institution or adjustment of intravenous fluids (101), blood transfusion (12), antibiotics (9), management of neutropenic sepsis (1), central line insertion (3), optimization of oxygen therapy (7), correction of electrolyte abnormality (8), and correction of coagulopathy (2). CONCLUSION: Our intervention combined current taxonomies of error with a simple risk stratification system and is a variant of the defence-in-depth strategy of error reduction. We effectively identified and corrected a significant number of human errors in high-risk acute surgical patients.
This audit has helped understand the common sources of error in the general surgical wards and will inform on-going error reduction initiatives. Copyright © 2013 Surgical Associates Ltd. Published by Elsevier Ltd. All rights reserved.
It's positive to be negative: Achilles tendon work loops during human locomotion.
Zelik, Karl E; Franz, Jason R
2017-01-01
Ultrasound imaging is increasingly used with motion and force data to quantify tendon dynamics during human movement. Frequently, tendon dynamics are estimated indirectly from muscle fascicle kinematics (by subtracting muscle from muscle-tendon unit length), but there is mounting evidence that this Indirect approach yields implausible tendon work loops. Since tendons are passive viscoelastic structures, when they undergo a loading-unloading cycle they must exhibit a negative work loop (i.e., perform net negative work). However, prior studies using this Indirect approach report large positive work loops, often estimating that tendons return 2-5 J of elastic energy for every 1 J of energy stored. More direct ultrasound estimates of tendon kinematics have emerged that quantify tendon elongations by tracking either the muscle-tendon junction or localized tendon tissue. However, it is unclear if these yield more plausible estimates of tendon dynamics. Our objective was to compute tendon work loops and hysteresis losses using these two Direct tendon kinematics estimates during human walking. We found that Direct estimates generally resulted in negative work loops, with average tendon hysteresis losses of 2-11% at 1.25 m/s and 33-49% at 0.75 m/s (N = 8), corresponding to 0.51-0.98 J of tendon energy returned for every 1 J stored. We interpret this finding to suggest that Direct approaches provide more plausible estimates than the Indirect approach, and may be preferable for understanding tendon energy storage and return. However, the Direct approaches did exhibit speed-dependent trends that are not consistent with isolated, in vitro tendon hysteresis losses of about 5-10%. These trends suggest that Direct estimates also contain some level of error, albeit much smaller than Indirect estimates.
Overall, this study serves to highlight the complexity and difficulty of estimating tendon dynamics non-invasively, and the care that must be taken to interpret biological function from current ultrasound-based estimates.
Homing in humans: A different look.
Bovet, J
1994-08-01
A current model holds that the long-distance homing abilities of free-ranging mammals rest primarily on a strategy of course reversal, based on outward journey information. In this study, I measured the ability to orient toward home in humans displaced under conditions that promote the use of this strategy, namely along an outward route that was direct, and the main bearing of which could be extrapolated by reference to a pre-existing mental map and by visual backup during the outward journey. Even though the individual course estimates obtained did show a certain amount of dispersion and/or error, they were more accurate and less dispersed than in experiments by other authors, where subjects could not use this strategy because they were displaced blindfolded and/or along circuitous routes. Copyright © 1994. Published by Elsevier B.V.
Flexible and inflexible response components: a Stroop study with typewritten output.
Damian, Markus F; Freeman, Norman H
2008-05-01
Two experiments were directed at investigating the relationship between response selection and execution in typewriting, and specifically the extent to which concurrent processing takes place. In a Stroop paradigm adapted from [Logan, G. D., & Zbrodoff, N. J. (1998). Stroop-type interference: Congruity effects in colour naming with typewritten responses. Journal of Experimental Psychology: Human Perception and Performance, 24, 978-992], participants typed the names of colour patches with incongruent, congruent, or neutral distractors presented at various stimulus-onset asynchronies. Experiment 1 showed Stroop interference and facilitation for initial keystroke latencies and errors, contrasting with response durations (a measure of response execution) being unaffected by Stroop manipulation. Experiment 2 showed that all three measures were responsive to time pressure; again, Stroop effects were confined to latencies and errors only. The observation that response duration is both flexible under time pressure and protected from response competition, may imply either that response execution is structurally segregated from earlier processing stages, or that encapsulation develops during the acquisition of typing skills.
Vanishing Point Extraction and Refinement for Robust Camera Calibration
Tsai, Fuan
2017-01-01
This paper describes a flexible camera calibration method using refined vanishing points without prior information. Vanishing points are estimated from human-made features like parallel lines and repeated patterns. With the vanishing points extracted from the three mutually orthogonal directions, the interior and exterior orientation parameters can be further calculated using collinearity condition equations. A vanishing point refinement process is proposed to reduce the uncertainty caused by vanishing point localization errors. The fine-tuning algorithm is based on the divergence of grouped feature points projected onto the reference plane, minimizing the standard deviation of each of the grouped collinear points with an O(1) computational complexity. This paper also presents an automated vanishing point estimation approach based on the cascade Hough transform. The experiment results indicate that the vanishing point refinement process can significantly improve camera calibration parameters and the root mean square error (RMSE) of the constructed 3D model can be reduced by about 30%. PMID:29280966
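One way to express the collinearity criterion underlying the refinement, minimizing the spread of grouped feature points about a line, is the standard deviation of perpendicular distances to a total-least-squares line fit. A simplified sketch (an assumed formulation for illustration, not the paper's exact algorithm):

```python
import numpy as np

# Simplified sketch (assumed formulation): score how collinear a group of
# feature points is by the standard deviation of their perpendicular
# distances to the best-fit line; a vanishing-point refinement can search
# for the VP whose implied projection minimizes this score across groups.
def collinearity_std(pts):
    """Std of perpendicular distances of 2D points to their best-fit line."""
    pts = np.asarray(pts, dtype=float)
    centered = pts - pts.mean(axis=0)
    # The second right singular vector is normal to the total-least-squares line.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    dist = centered @ vt[1]
    return dist.std()

perfect = [(0, 0), (1, 1), (2, 2), (3, 3)]         # exactly collinear
noisy = [(0, 0.1), (1, 0.9), (2, 2.2), (3, 2.8)]   # mildly perturbed
print(collinearity_std(perfect), collinearity_std(noisy))
```

Evaluating such a per-group score is O(1) in the number of candidate refinements once the grouped points are fixed, which matches the complexity claim in the abstract.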
NASA Astrophysics Data System (ADS)
Wi, S.; Freeman, S.; Brown, C.
2017-12-01
This study presents a general approach to developing computational models of human-hydrologic systems where human modification of hydrologic surface processes is significant or dominant. A river basin system is represented by a network of human-hydrologic response units (HHRUs) identified based on locations where river regulation occurs (e.g., reservoir operation and diversions). Natural and human processes in HHRUs are simulated in a holistic framework that integrates component models representing rainfall-runoff, river routing, reservoir operation, flow diversion, and water use processes. We illustrate the approach in a case study of the Cutzamala water system (CWS) in Mexico, a complex inter-basin water transfer system supplying the Mexico City Metropolitan Area (MCMA). The human-hydrologic system model for the CWS (CUTZSIM) is evaluated against streamflow and reservoir storage measurements across the CWS and against the water supplied to the MCMA. The CUTZSIM improves the representation of hydrology and river-operation interaction and, in so doing, advances evaluation of system-wide water management consequences under altered climatic and demand regimes. The integrated modeling framework enables evaluation and simulation of model errors throughout the river basin, including errors in representation of the human component processes. Heretofore, model error evaluation, predictive error intervals, and the resultant improved understanding have been limited to hydrologic processes. The general framework represents an initial step towards fuller understanding and prediction of the many and varied processes that determine the hydrologic fluxes and state variables in real river basins.
A Foundation for Systems Anthropometry: Lumbar/Pelvic Kinematics
1983-02-01
…caused by human error in positioning the cursor of the digitizing board and inaccuracy of the digitizer. (Human error is approximately ±0.02…)
Comment on "Differential sensitivity to human communication in dogs, wolves, and human infants".
Marshall-Pescini, S; Passalacqua, C; Valsecchi, P; Prato-Previde, E
2010-07-09
Topál et al. (Reports, 4 September 2009, p. 1269) showed that dogs, like infants but unlike wolves, make perseverative search errors that can be explained by the use of ostensive cues from the experimenter. We suggest that a simpler learning process, local enhancement, can account for errors made by dogs.
Review of U.S. Army Unmanned Aerial Systems Accident Reports: Analysis of Human Error Contributions
2018-03-20
USAARL Report No. 2018-08: Review of U.S. Army Unmanned Aerial Systems Accident Reports: Analysis of Human Error Contributions, by Kathryn A… Introduction: The success of unmanned aerial systems (UAS) operations relies upon a variety of factors, including, but not limited to…
Impact of human error on lumber yield in rough mills
Urs Buehlmann; R. Edward Thomas; R. Edward Thomas
2002-01-01
Rough sawn, kiln-dried lumber contains characteristics such as knots and bark pockets that are considered by most people to be defects. When using boards to produce furniture components, these defects are removed to produce clear, defect-free parts. Currently, human operators identify and locate the unusable board areas containing defects. Errors in determining a...
NASA Technical Reports Server (NTRS)
Silva-Martinez, Jackelynne; Ellenberger, Richard; Dory, Jonathan
2017-01-01
This project aims to identify poor human factors design decisions that led to error-prone systems, or did not facilitate the flight crew making the right choices; and to verify that NASA is effectively preventing similar incidents from occurring again. This analysis was performed by reviewing significant incidents and close calls in human spaceflight identified by the NASA Johnson Space Center Safety and Mission Assurance Flight Safety Office. The review of incidents shows whether the identified human errors were due to the operational phase (flight crew and ground control) or if they initiated at the design phase (includes manufacturing and test). This classification was performed with the aid of the NASA Human Systems Integration domains. This in-depth analysis resulted in a tool that helps with the human factors classification of significant incidents and close calls in human spaceflight, which can be used to identify human errors at the operational level, and how they were or should be minimized. Current governing documents on human systems integration for both government and commercial crew were reviewed to see if current requirements, processes, training, and standard operating procedures protect the crew and ground control against these issues occurring in the future. Based on the findings, recommendations to target those areas are provided.
Uncorrected refractive errors.
Naidoo, Kovin S; Jaggernath, Jyoti
2012-01-01
Global estimates indicate that more than 2.3 billion people in the world suffer from poor vision due to refractive error, of which 670 million people are considered visually impaired because they do not have access to corrective treatment. Refractive errors, if uncorrected, result in an impaired quality of life for millions of people worldwide, irrespective of their age, sex, and ethnicity. Over the past decade, a series of studies using a survey methodology, referred to as the Refractive Error Study in Children (RESC), were performed in populations with different ethnic origins and cultural settings. These studies confirmed that the prevalence of uncorrected refractive errors is considerably high for children in low- and middle-income countries. Furthermore, uncorrected refractive error has been noted to have extensive social and economic impacts, such as limiting the educational and employment opportunities of economically active persons, healthy individuals, and communities. The key public health challenges presented by uncorrected refractive errors, the leading cause of vision impairment across the world, require urgent attention. To address these issues, it is critical to focus on the development of human resources and sustainable methods of service delivery. This paper discusses three core pillars to addressing the challenges posed by uncorrected refractive errors: Human Resource (HR) Development, Service Development, and Social Entrepreneurship.
ERIC Educational Resources Information Center
Sun, Wei; And Others
1992-01-01
Identifies types and distributions of errors in text produced by optical character recognition (OCR) and proposes a process using machine learning techniques to recognize and correct errors in OCR texts. Results of experiments indicating that this strategy can reduce the human interaction required for error correction are reported. (25 references)
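One common way to automate OCR error correction of the kind described above is a lexicon lookup driven by a weighted edit distance in which known OCR character confusions are cheap to repair. The confusion pairs, costs, and function names below are illustrative assumptions, not details from the study.

```python
# Hypothetical OCR confusion pairs; real systems learn these from data.
CONFUSIONS = {("l", "1"), ("1", "l"), ("O", "0"), ("0", "O")}

def sub_cost(a, b):
    if a == b:
        return 0.0
    return 0.2 if (a, b) in CONFUSIONS else 1.0  # common OCR swaps are cheap

def edit_distance(s, t):
    # standard dynamic-programming edit distance with weighted substitutions
    m, n = len(s), len(t)
    d = [[0.0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        d[i][0] = float(i)
    for j in range(1, n + 1):
        d[0][j] = float(j)
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            d[i][j] = min(d[i - 1][j] + 1.0,          # deletion
                          d[i][j - 1] + 1.0,          # insertion
                          d[i - 1][j - 1] + sub_cost(s[i - 1], t[j - 1]))
    return d[m][n]

def correct(word, lexicon):
    # pick the lexicon entry closest to the raw OCR output
    return min(lexicon, key=lambda w: edit_distance(word, w))
```

For example, the OCR output `"mi1e"` is corrected to `"mile"` because the `1`→`l` substitution costs far less than the character changes needed to reach any other lexicon entry.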
Gleich, Stephen J; Nemergut, Michael E; Stans, Anthony A; Haile, Dawit T; Feigal, Scott A; Heinrich, Angela L; Bosley, Christopher L; Tripathi, Sandeep
2016-08-01
Ineffective and inefficient patient transfer processes can increase the chance of medical errors. Improvements in such processes are high-priority local institutional and national patient safety goals. At our institution, nonintubated postoperative pediatric patients are first admitted to the postanesthesia care unit before transfer to the PICU. This quality improvement project was designed to improve the patient transfer process from the operating room (OR) to the PICU. After direct observation of the baseline process, we introduced a structured, direct OR-PICU transfer process for orthopedic spinal fusion patients. We performed value stream mapping of the process to determine error-prone and inefficient areas. We evaluated primary outcome measures of handoff error reduction and the overall efficiency of patient transfer process time. Staff satisfaction was evaluated as a counterbalance measure. With the introduction of the new direct OR-PICU patient transfer process, the handoff communication error rate improved from 1.9 to 0.3 errors per patient handoff (P = .002). Inefficiency (patient wait time and non-value-creating activity) was reduced from 90 to 32 minutes. Handoff content was improved with fewer information omissions (P < .001). Staff satisfaction significantly improved among nearly all PICU providers. By using quality improvement methodology to design and implement a new direct OR-PICU transfer process with a structured multidisciplinary verbal handoff, we achieved sustained improvements in patient safety and efficiency. Handoff communication was enhanced, with fewer errors and content omissions. The new process improved efficiency, with high staff satisfaction. Copyright © 2016 by the American Academy of Pediatrics.
An Empirical State Error Covariance Matrix for Batch State Estimation
NASA Technical Reports Server (NTRS)
Frisbee, Joseph H., Jr.
2011-01-01
State estimation techniques serve effectively to provide mean state estimates. However, the state error covariance matrices provided as part of these techniques suffer from some degree of lack of confidence in their ability to adequately describe the uncertainty in the estimated states. A specific problem with the traditional form of state error covariance matrices is that they represent only a mapping of the assumed observation error characteristics into the state space. Any errors that arise from other sources (environment modeling, precision, etc.) are not directly represented in a traditional, theoretical state error covariance matrix. Consider that an actual observation contains only measurement error and that an estimated observation contains all other errors, known and unknown. It then follows that a measurement residual (the difference between expected and observed measurements) contains all errors for that measurement. Therefore, a direct and appropriate inclusion of the actual measurement residuals in the state error covariance matrix will result in an empirical state error covariance matrix. This empirical state error covariance matrix will fully account for the error in the state estimate. By way of a literal reinterpretation of the equations involved in the weighted least squares estimation algorithm, it is possible to arrive at an appropriate, and formally correct, empirical state error covariance matrix. The first specific step of the method is to use the average form of the weighted measurement residual variance performance index rather than its usual total weighted residual form. Next it is helpful to interpret the solution to the normal equations as the average of a collection of sample vectors drawn from a hypothetical parent population. From here, using a standard statistical analysis approach, it follows directly how to determine the standard empirical state error covariance matrix.
This matrix will contain the total uncertainty in the state estimate, regardless of the source of the uncertainty. Also, in its most straightforward form, the technique only requires supplemental calculations to be added to existing batch algorithms. The generation of this direct, empirical form of the state error covariance matrix is independent of the dimensionality of the observations. Mixed degrees of freedom for an observation set are allowed. As is the case with any simple, empirical sample variance problem, the presented approach offers an opportunity (at least in the case of weighted least squares) to investigate confidence interval estimates for the error covariance matrix elements. The diagonal or variance terms of the error covariance matrix have a particularly simple form to associate with either a multiple degree of freedom chi-square distribution (more approximate) or with a gamma distribution (less approximate). The off diagonal or covariance terms of the matrix are less clear in their statistical behavior. However, the off diagonal covariance matrix elements still lend themselves to standard confidence interval error analysis. The distributional forms associated with the off diagonal terms are more varied and, perhaps, more approximate than those associated with the diagonal terms. Using a simple weighted least squares sample problem, results obtained through use of the proposed technique are presented. The example consists of a simple, two observer, triangulation problem with range only measurements. Variations of this problem reflect an ideal case (perfect knowledge of the range errors) and a mismodeled case (incorrect knowledge of the range errors).
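The core idea can be illustrated with a minimal weighted least squares sketch: scale the theoretical covariance (AᵀWA)⁻¹ by the average weighted residual variance, so the reported covariance reflects the errors actually observed rather than only the assumed measurement noise. This is a hedged simplification of the approach described above, with all variable names my own.

```python
import numpy as np

def wls_with_empirical_cov(A, b, meas_var):
    """Weighted least squares with a residual-scaled (empirical) covariance.

    A: (m, n) design matrix; b: (m,) observations; meas_var: (m,) assumed variances.
    """
    W = np.diag(1.0 / meas_var)              # weights = inverse assumed variances
    N = A.T @ W @ A                          # normal matrix
    x_hat = np.linalg.solve(N, A.T @ W @ b)  # WLS state estimate
    r = b - A @ x_hat                        # measurement residuals
    P_theory = np.linalg.inv(N)              # maps assumed noise into state space
    scale = (r @ W @ r) / len(b)             # average weighted residual variance
    P_emp = scale * P_theory                 # empirical state error covariance
    return x_hat, P_theory, P_emp
```

If the actual errors exceed the assumed measurement noise (from mismodeling, say), the scale factor exceeds one and the empirical covariance inflates accordingly, while a well-modeled problem leaves it near the theoretical value.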
Recognizing and managing errors of cognitive underspecification.
Duthie, Elizabeth A
2014-03-01
James Reason describes cognitive underspecification as incomplete communication that creates a knowledge gap. Errors occur when an information mismatch occurs in bridging that gap with a resulting lack of shared mental models during the communication process. There is a paucity of studies in health care examining this cognitive error and the role it plays in patient harm. The goal of the following case analyses is to facilitate accurate recognition, identify how it contributes to patient harm, and suggest appropriate management strategies. Reason's human error theory is applied in case analyses of errors of cognitive underspecification. Sidney Dekker's theory of human incident investigation is applied to event investigation to facilitate identification of this little recognized error. Contributory factors leading to errors of cognitive underspecification include workload demands, interruptions, inexperienced practitioners, and lack of a shared mental model. Detecting errors of cognitive underspecification relies on blame-free listening and timely incident investigation. Strategies for interception include two-way interactive communication, standardization of communication processes, and technological support to ensure timely access to documented clinical information. Although errors of cognitive underspecification arise at the sharp end with the care provider, effective management is dependent upon system redesign that mitigates the latent contributory factors. Cognitive underspecification is ubiquitous whenever communication occurs. Accurate identification is essential if effective system redesign is to occur.
Comprehensive analysis of a medication dosing error related to CPOE.
Horsky, Jan; Kuperman, Gilad J; Patel, Vimla L
2005-01-01
This case study of a serious medication error demonstrates the necessity of a comprehensive methodology for the analysis of failures in interaction between humans and information systems. The authors used a novel approach to analyze a dosing error related to computer-based ordering of potassium chloride (KCl). The method included a chronological reconstruction of events and their interdependencies from provider order entry usage logs, semistructured interviews with involved clinicians, and interface usability inspection of the ordering system. Information collected from all sources was compared and evaluated to understand how the error evolved and propagated through the system. In this case, the error was the product of faults in interaction among human and system agents that methods limited in scope to their distinct analytical domains would not identify. The authors characterized errors in several converging aspects of the drug ordering process: confusing on-screen laboratory results review, system usability difficulties, user training problems, and suboptimal clinical system safeguards that all contributed to a serious dosing error. The results of the authors' analysis were used to formulate specific recommendations for interface layout and functionality modifications, suggest new user alerts, propose changes to user training, and address error-prone steps of the KCl ordering process to reduce the risk of future medication dosing errors.
Space-Borne Laser Altimeter Geolocation Error Analysis
NASA Astrophysics Data System (ADS)
Wang, Y.; Fang, J.; Ai, Y.
2018-05-01
This paper reviews the development of space-borne laser altimetry technology over the past 40 years. Taking the ICESat satellite as an example, a rigorous space-borne laser altimeter geolocation model is studied, and an error propagation equation is derived. The influence of the main error sources, such as the platform positioning error, attitude measurement error, pointing angle measurement error and range measurement error, on the geolocation accuracy of the laser spot is analysed through simulation experiments. The reasons for the different influences on geolocation accuracy in different directions are discussed, and to satisfy the accuracy requirement for the laser control points, a design index for each error source is put forward.
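The error-propagation step can be sketched with a first-order (partial-derivative) budget for a simplified near-nadir geometry: each error source contributes its sensitivity times its uncertainty, combined in quadrature. The two-term model and the numbers below are illustrative assumptions, not the paper's rigorous model.

```python
import math

def spot_error(h, theta, sigma_theta, sigma_h):
    """1-sigma ground-position error for a spot at distance d = h * tan(theta).

    h: platform altitude (m); theta: off-nadir pointing angle (rad);
    sigma_theta: pointing/attitude error (rad); sigma_h: range error (m).
    """
    dd_dtheta = h / math.cos(theta) ** 2     # sensitivity to pointing error
    dd_dh = math.tan(theta)                  # sensitivity to range error
    # first-order propagation: combine independent contributions in quadrature
    return math.hypot(dd_dtheta * sigma_theta, dd_dh * sigma_h)

# ICESat-like geometry (assumed values): ~600 km altitude, 0.3 deg off-nadir,
# 1.5 arcsec pointing error, 5 cm range error
sigma = spot_error(600e3, math.radians(0.3), math.radians(1.5 / 3600.0), 0.05)
```

With these assumed values the pointing term dominates (a few metres on the ground), which matches the qualitative conclusion that attitude and pointing errors drive horizontal geolocation accuracy far more than range error does near nadir.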
Covariance analysis for evaluating head trackers
NASA Astrophysics Data System (ADS)
Kang, Donghoon
2017-10-01
Existing methods for evaluating the performance of head trackers usually rely on publicly available face databases, which contain facial images and the ground truths of their corresponding head orientations. However, most of the existing publicly available face databases are constructed by assuming that a frontal head orientation can be determined by compelling the person under examination to look straight ahead at the camera on the first video frame. Since nobody can accurately direct one's head toward the camera, this assumption may be unrealistic. Rather than obtaining estimation errors, we present a method for computing the covariance of estimation error rotations to evaluate the reliability of head trackers. As an uncertainty measure of estimators, the Schatten 2-norm of a square root of error covariance (or the algebraic average of relative error angles) can be used. The merit of the proposed method is that it does not disturb the person under examination by asking him to direct his head toward certain directions. Experimental results using real data validate the usefulness of our method.
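The proposed uncertainty measure can be sketched as follows: represent each estimation-error rotation as a small rotation vector (axis times angle), form the sample second-moment matrix C of those vectors, and report the Schatten 2-norm of C^{1/2}, which equals √trace(C). The data and function names below are hypothetical; this is my reading of the measure, not the authors' code.

```python
import numpy as np

def tracker_uncertainty(error_rotvecs):
    """Schatten 2-norm of the square root of the error covariance.

    error_rotvecs: (N, 3) array of error rotation vectors in radians.
    """
    e = np.asarray(error_rotvecs, dtype=float)
    C = (e.T @ e) / len(e)          # second-moment matrix about zero error
    # ||C^{1/2}||_S2 = sqrt(sum of eigenvalues of C) = sqrt(trace(C))
    return np.sqrt(np.trace(C))

# two hypothetical trackers: tracker B has larger error angles, so a larger score
errs_a = np.radians([[1.0, 0.0, 0.0], [0.0, -1.0, 0.0], [0.5, 0.5, 0.0]])
errs_b = np.radians([[3.0, 1.0, 0.0], [-2.0, 2.0, 1.0], [1.0, -3.0, 2.0]])
```

Because the score depends only on the spread of the error rotations, not on any designated "true frontal" frame, it can be computed without asking the subject to face particular directions, which is the point of the method.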
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-20
...The Food and Drug Administration (FDA or we) is correcting the preamble to a proposed rule that published in the Federal Register of January 16, 2013. That proposed rule would establish science-based minimum standards for the safe growing, harvesting, packing, and holding of produce, meaning fruits and vegetables grown for human consumption. FDA proposed these standards as part of our implementation of the FDA Food Safety Modernization Act. The document published with several technical errors, including some errors in cross references, as well as several errors in reference numbers cited throughout the document. This document corrects those errors. We are also placing a corrected copy of the proposed rule in the docket.
Patient identification using a near-infrared laser scanner
NASA Astrophysics Data System (ADS)
Manit, Jirapong; Bremer, Christina; Schweikard, Achim; Ernst, Floris
2017-03-01
We propose a new biometric approach where the tissue thickness of a person's forehead is used as a biometric feature. Given that the spatial registration of two 3D laser scans of the same human face usually produces a low error value, the principle of point cloud registration and its error metric can be applied to human classification techniques. However, by only considering the spatial error, it is not possible to reliably verify a person's identity. We propose to use a novel near-infrared laser-based head tracking system to determine an additional feature, the tissue thickness, and include this in the error metric. Using MRI as a ground truth, data from the foreheads of 30 subjects was collected from which a 4D reference point cloud was created for each subject. The measurements from the near-infrared system were registered with all reference point clouds using the ICP algorithm. Afterwards, the spatial and tissue thickness errors were extracted, forming a 2D feature space. For all subjects, the lowest feature distance resulted from the registration of a measurement and the reference point cloud of the same person. The combined registration error features yielded two clusters in the feature space, one from the same subject and another from the other subjects. When only the tissue thickness error was considered, these clusters were less distinct but still present. These findings could help to raise safety standards for head and neck cancer patients and lay the foundation for a future human identification technique.
Human factors evaluation of remote afterloading brachytherapy. Volume 2, Function and task analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Callan, J.R.; Gwynne, J.W. III; Kelly, T.T.
1995-05-01
A human factors project on the use of nuclear by-product material to treat cancer using remotely operated afterloaders was undertaken by the Nuclear Regulatory Commission. The purpose of the project was to identify factors that contribute to human error in the system for remote afterloading brachytherapy (RAB). This report documents the findings from the first phase of the project, which involved an extensive function and task analysis of RAB. This analysis identified the functions and tasks in RAB, made preliminary estimates of the likelihood of human error in each task, and determined the skills needed to perform each RAB task. The findings of the function and task analysis served as the foundation for the remainder of the project, which evaluated four major aspects of the RAB system linked to human error: human-system interfaces; procedures and practices; training and qualifications of RAB staff; and organizational practices and policies. At its completion, the project identified and prioritized areas for recommended NRC and industry attention based on all of the evaluations and analyses.
Multiview face detection based on position estimation over multicamera surveillance system
NASA Astrophysics Data System (ADS)
Huang, Ching-chun; Chou, Jay; Shiu, Jia-Hou; Wang, Sheng-Jyh
2012-02-01
In this paper, we propose a multi-view face detection system that locates head positions and indicates the direction of each face in 3-D space over a multi-camera surveillance system. To locate 3-D head positions, conventional methods relied on face detection in 2-D images and projected the face regions back to 3-D space for correspondence. However, the inevitable false face detection and rejection usually degrades the system performance. Instead, our system searches for the heads and face directions over the 3-D space using a sliding cube. Each searched 3-D cube is projected onto the 2-D camera views to determine the existence and direction of human faces. Moreover, a pre-process to estimate the locations of candidate targets is introduced to speed up the search process over the 3-D space. In summary, our proposed method can efficiently fuse multi-camera information and suppress the ambiguity caused by detection errors. Our evaluation shows that the proposed approach can efficiently indicate the head position and face direction on real video sequences even under serious occlusion.
NASA Technical Reports Server (NTRS)
Long, E. R., Jr.
1986-01-01
Effects of specimen preparation on measured values of an acrylic's electromagnetic properties at X-band microwave frequencies, TE sub 1,0 mode, utilizing an automatic network analyzer have been studied. For 1 percent or less error, a gap between the specimen edge and the 0.901-in. wall of the specimen holder was the most significant parameter. The gap had to be less than 0.002 in. The thickness variation and alignment errors in the direction parallel to the 0.901-in. wall were equally second most significant and had to be less than 1 degree. Errors in the measurement of the thickness were third most significant. They had to be less than 3 percent. The following parameters caused errors of 1 percent or less: ratios of specimen-holder thicknesses of more than 15 percent, gaps between the specimen edge and the 0.401-in. wall less than 0.045 in., position errors less than 15 percent, surface roughness, thickness variation in the direction parallel to the 0.401-in. wall less than 35 percent, and specimen alignment in the direction parallel to the 0.401-in. wall less than 5 degrees.
Prediction error induced motor contagions in human behaviors.
Ikegami, Tsuyoshi; Ganesh, Gowrishankar; Takeuchi, Tatsuya; Nakamoto, Hiroki
2018-05-29
Motor contagions refer to implicit effects on one's actions induced by observed actions. Motor contagions are believed to be induced simply by action observation and cause an observer's action to become similar to the action observed. In contrast, here we report a new motor contagion that is induced only when the observation is accompanied by prediction errors - differences between actions one observes and those he/she predicts or expects. In two experiments, one on whole-body baseball pitching and another on simple arm reaching, we show that the observation of the same action induces distinct motor contagions, depending on whether prediction errors are present or not. In the absence of prediction errors, as in previous reports, participants' actions changed to become similar to the observed action, while in the presence of prediction errors, their actions changed to diverge away from it, suggesting distinct effects of action observation and action prediction on human actions. © 2018, Ikegami et al.
NASA Technical Reports Server (NTRS)
Weiss, D. M.
1981-01-01
Error data obtained from two different software development environments are compared. To obtain data that was complete, accurate, and meaningful, a goal-directed data collection methodology was used. Changes made to software were monitored concurrently with its development. Similarities common to both environments are included: (1) the principal error was in the design and implementation of single routines; (2) few errors were the result of changes, required more than one attempt to correct, and resulted in other errors; (3) relatively few errors took more than a day to correct.
How smart is your BEOL? productivity improvement through intelligent automation
NASA Astrophysics Data System (ADS)
Schulz, Kristian; Egodage, Kokila; Tabbone, Gilles; Garetto, Anthony
2017-07-01
The back end of line (BEOL) workflow in the mask shop still has crucial issues throughout all standard steps which are inspection, disposition, photomask repair and verification of repair success. All involved tools are typically run by highly trained operators or engineers who setup jobs and recipes, execute tasks, analyze data and make decisions based on the results. No matter how experienced operators are and how good the systems perform, there is one aspect that always limits the productivity and effectiveness of the operation: the human aspect. Human errors can range from seemingly rather harmless slip-ups to mistakes with serious and direct economic impact including mask rejects, customer returns and line stops in the wafer fab. Even with the introduction of quality control mechanisms that help to reduce these critical but unavoidable faults, they can never be completely eliminated. Therefore the mask shop BEOL cannot run in the most efficient manner as unnecessary time and money are spent on processes that still remain labor intensive. The best way to address this issue is to automate critical segments of the workflow that are prone to human errors. In fact, manufacturing errors can occur for each BEOL step where operators intervene. These processes comprise image evaluation, setting up tool recipes, data handling and all other tedious but required steps. With the help of smart solutions, operators can work more efficiently and dedicate their time to less mundane tasks. Smart solutions connect tools, taking over the data handling and analysis typically performed by operators and engineers. These solutions not only eliminate the human error factor in the manufacturing process but can provide benefits in terms of shorter cycle times, reduced bottlenecks and prediction of an optimized workflow. In addition such software solutions consist of building blocks that seamlessly integrate applications and allow the customers to use tailored solutions.
To accommodate for the variability and complexity in mask shops today, individual workflows can be supported according to the needs of any particular manufacturing line with respect to necessary measurement and production steps. At the same time the efficiency of assets is increased by avoiding unneeded cycle time and waste of resources due to the presence of process steps that are very crucial for a given technology. In this paper we present details of which areas of the BEOL can benefit most from intelligent automation, what solutions exist and the quantification of benefits to a mask shop with full automation by the use of a back end of line model.
Fisher, Charles K.; Mehta, Pankaj
2014-01-01
Human associated microbial communities exert tremendous influence over human health and disease. With modern metagenomic sequencing methods it is now possible to follow the relative abundance of microbes in a community over time. These microbial communities exhibit rich ecological dynamics and an important goal of microbial ecology is to infer the ecological interactions between species directly from sequence data. Any algorithm for inferring ecological interactions must overcome three major obstacles: 1) a correlation between the abundances of two species does not imply that those species are interacting, 2) the sum constraint on the relative abundances obtained from metagenomic studies makes it difficult to infer the parameters in time-series models, and 3) errors due to experimental uncertainty, or mis-assignment of sequencing reads into operational taxonomic units, bias inferences of species interactions due to a statistical problem called "errors-in-variables". Here we introduce an approach, Learning Interactions from MIcrobial Time Series (LIMITS), that overcomes these obstacles. LIMITS uses sparse linear regression with bootstrap aggregation to infer a discrete-time Lotka-Volterra model for microbial dynamics. We tested LIMITS on synthetic data and showed that it could reliably infer the topology of the inter-species ecological interactions. We then used LIMITS to characterize the species interactions in the gut microbiomes of two individuals and found that the interaction networks varied significantly between individuals. Furthermore, we found that the interaction networks of the two individuals are dominated by distinct "keystone species", Bacteroides fragilis and Bacteroides stercoris, that have a disproportionate influence on the structure of the gut microbiome even though they are only found in moderate abundance.
Based on our results, we hypothesize that the abundances of certain keystone species may be responsible for individuality in the human gut microbiome. PMID:25054627
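A toy version of the inference step can be sketched as a bagged linear regression on the log growth rates of the discrete-time Lotka-Volterra model, x_i(t+1) = x_i(t)·exp(r_i + Σ_j A_ij x_j(t)). This is only a loose illustration of the LIMITS idea (the published algorithm adds a sparsity-enforcing forward stepwise search), and all parameter choices below are mine.

```python
import numpy as np

def infer_interactions(x, n_boot=200, seed=0):
    """Infer growth rates r and interaction matrix A from abundances x.

    x: (T, S) array of species abundances over T time points.
    Fits log(x_i(t+1)/x_i(t)) = r_i + sum_j A_ij * x_j(t) by least squares
    on bootstrap resamples of the time points, then takes the median
    (bootstrap aggregation) so unstable coefficients are damped.
    """
    rng = np.random.default_rng(seed)
    T, S = x.shape
    y = np.log(x[1:] / x[:-1])                       # per-species log growth
    X = np.hstack([np.ones((T - 1, 1)), x[:-1]])     # intercept + abundances
    coefs = []
    for _ in range(n_boot):
        idx = rng.integers(0, T - 1, size=T - 1)     # bootstrap time points
        beta, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
        coefs.append(beta)
    beta_bar = np.median(coefs, axis=0)              # bagged coefficients
    r, A = beta_bar[0], beta_bar[1:].T               # A[i, j]: effect of j on i
    return r, A
```

On synthetic data generated from a known two-species model, this recovers the negative self-limitation terms on the diagonal of A, which is the kind of topology check the authors report for LIMITS.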
Pointing error analysis of Risley-prism-based beam steering system.
Zhou, Yuan; Lu, Yafei; Hei, Mo; Liu, Guangcan; Fan, Dapeng
2014-09-01
Based on the vector form Snell's law, ray tracing is performed to quantify the pointing errors of Risley-prism-based beam steering systems, induced by component errors, prism orientation errors, and assembly errors. Case examples are given to elucidate the pointing error distributions in the field of regard and evaluate the allowances of the error sources for a given pointing accuracy. It is found that the assembly errors of the second prism will result in more remarkable pointing errors in contrast with the first one. The pointing errors induced by prism tilt depend on the tilt direction. The allowances of bearing tilt and prism tilt are almost identical if the same pointing accuracy is planned. All conclusions can provide a theoretical foundation for practical works.
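The vector form of Snell's law that underlies such a ray trace can be written down directly. The snippet below is the standard refraction formula, not code from the paper: given a unit incident direction d, a unit surface normal n pointing against d, and the index ratio mu = n1/n2, it returns the unit refracted direction.

```python
import numpy as np

def refract(d, n, mu):
    """Vector Snell's law: refracted unit direction, or None on total
    internal reflection. d: incident direction; n: surface normal
    (oriented against d); mu: ratio of refractive indices n1/n2."""
    d = d / np.linalg.norm(d)
    n = n / np.linalg.norm(n)
    cos_i = -np.dot(n, d)                      # cosine of incidence angle
    sin_t2 = mu**2 * (1.0 - cos_i**2)          # squared sine of refraction angle
    if sin_t2 > 1.0:
        return None                            # total internal reflection
    cos_t = np.sqrt(1.0 - sin_t2)
    return mu * d + (mu * cos_i - cos_t) * n   # unit refracted direction
```

Applying this at each prism face, with the surface normals perturbed by the component, orientation, and assembly errors, is how the pointing-error distributions described above can be traced through the two-prism system.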
DOE Office of Scientific and Technical Information (OSTI.GOV)
J Zwan, B; Central Coast Cancer Centre, Gosford, NSW; Colvill, E
2016-06-15
Purpose: The added complexity of real-time adaptive multi-leaf collimator (MLC) tracking increases the likelihood of undetected MLC delivery errors. In this work we develop and test a system for real-time delivery verification and error detection for MLC tracking radiotherapy using an electronic portal imaging device (EPID). Methods: The delivery verification system relies on acquisition and real-time analysis of transit EPID image frames acquired at 8.41 fps. In-house software was developed to extract the MLC positions from each image frame. Three comparison metrics were used to verify the MLC positions in real-time: (1) field size, (2) field location, and (3) field shape. The delivery verification system was tested for 8 VMAT MLC tracking deliveries (4 prostate and 4 lung) where real patient target motion was reproduced using a Hexamotion motion stage and a Calypso system. Sensitivity and detection delay were quantified for various types of MLC and system errors. Results: For both the prostate and lung test deliveries the MLC-defined field size was measured with an accuracy of 1.25 cm² (1 SD). The field location was measured with an accuracy of 0.6 mm and 0.8 mm (1 SD) for lung and prostate respectively. Field location errors (i.e. tracking in the wrong direction) with a magnitude of 3 mm were detected within 0.4 s of occurrence in the X direction and 0.8 s in the Y direction. Systematic MLC gap errors were detected as small as 3 mm. The method was not found to be sensitive to random MLC errors and individual MLC calibration errors up to 5 mm. Conclusion: EPID imaging may be used for independent real-time verification of MLC trajectories during MLC tracking deliveries. Thresholds have been determined for error detection and the system has been shown to be sensitive to a range of delivery errors.
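The three per-frame comparison metrics can be sketched as simple checks on leaf-pair positions extracted from each EPID frame against the planned aperture. The aperture representation and the threshold values below are illustrative assumptions, not the authors' implementation.

```python
# measured/planned: list of (left, right) MLC leaf-pair positions in cm
def _size(ap):
    return sum(r - l for l, r in ap)                 # total opening per unit leaf width

def _centre(ap):
    return sum((l + r) / 2.0 for l, r in ap) / len(ap)

def verify_frame(measured, planned, size_tol=1.25, loc_tol=0.3, shape_tol=0.3):
    """Return the list of metric names whose tolerance is exceeded."""
    errors = []
    if abs(_size(measured) - _size(planned)) > size_tol:
        errors.append("field size")                  # metric 1: field size
    if abs(_centre(measured) - _centre(planned)) > loc_tol:
        errors.append("field location")              # metric 2: field location
    worst = max(max(abs(ml - pl), abs(mr - pr))
                for (ml, mr), (pl, pr) in zip(measured, planned))
    if worst > shape_tol:
        errors.append("field shape")                 # metric 3: field shape
    return errors
```

A uniform shift of the whole aperture leaves the field size unchanged but trips the location and shape checks, which mirrors why a tracking-in-the-wrong-direction error shows up quickly in the location metric.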
Frequency-domain Green's functions for radar waves in heterogeneous 2.5D media
Ellefsen, K.J.; Croize, D.; Mazzella, A.T.; McKenna, J.R.
2009-01-01
Green's functions for radar waves propagating in heterogeneous 2.5D media might be calculated in the frequency domain using a hybrid method. The model is defined in the Cartesian coordinate system, and its electromagnetic properties might vary in the x- and z-directions, but not in the y-direction. Wave propagation in the x- and z-directions is simulated with the finite-difference method, and wave propagation in the y-direction is simulated with an analytic function. The absorbing boundaries on the finite-difference grid are perfectly matched layers that have been modified to make them compatible with the hybrid method. The accuracy of these numerical Green's functions is assessed by comparing them with independently calculated Green's functions. For a homogeneous model, the magnitude errors range from -4.16% through 0.44%, and the phase errors range from -0.06% through 4.86%. For a layered model, the magnitude errors range from -2.60% through 2.06%, and the phase errors range from -0.49% through 2.73%. These numerical Green's functions might be used for forward modeling and full waveform inversion. © 2009 Society of Exploration Geophysicists. All rights reserved.
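The magnitude and phase comparisons quoted above can be computed as signed percent errors of the numerical Green's function against an independent reference. The normalization convention below is one simple choice; the paper does not spell out its exact formula, so treat this as an assumption.

```python
import numpy as np

def percent_errors(g_num, g_ref):
    """Signed percent magnitude and phase errors between two complex
    frequency-domain Green's functions (arrays of the same shape)."""
    g_num = np.asarray(g_num)
    g_ref = np.asarray(g_ref)
    mag_err = 100.0 * (np.abs(g_num) - np.abs(g_ref)) / np.abs(g_ref)
    phase_err = 100.0 * (np.angle(g_num) - np.angle(g_ref)) / np.angle(g_ref)
    return mag_err, phase_err
```

Note that normalizing the phase error by the reference phase is ill-conditioned near zero phase; a per-frequency unwrapped phase or an absolute phase difference may be preferable in practice.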
Valeri, Linda; Lin, Xihong; VanderWeele, Tyler J.
2014-01-01
Mediation analysis is a popular approach to examine the extent to which the effect of an exposure on an outcome is through an intermediate variable (mediator) and the extent to which the effect is direct. When the mediator is mis-measured the validity of mediation analysis can be severely undermined. In this paper we first study the bias of classical, non-differential measurement error on a continuous mediator in the estimation of direct and indirect causal effects in generalized linear models when the outcome is either continuous or discrete and exposure-mediator interaction may be present. Our theoretical results as well as a numerical study demonstrate that in the presence of non-linearities the bias of naive estimators for direct and indirect effects that ignore measurement error can take unintuitive directions. We then develop methods to correct for measurement error. Three correction approaches using method of moments, regression calibration and SIMEX are compared. We apply the proposed method to the Massachusetts General Hospital lung cancer study to evaluate the effect of genetic variants mediated through smoking on lung cancer risk. PMID:25220625
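Of the three correction approaches mentioned, SIMEX is the easiest to sketch: deliberately re-add synthetic measurement error at increasing levels λ, track how the naive estimate degrades, and extrapolate the trend back to λ = −1 (the error-free case). The snippet below illustrates this for a simple attenuated regression slope, not the paper's mediation estimators; the lambda grid and extrapolant are conventional choices, not taken from the paper.

```python
import numpy as np

def simex_slope(w, y, sigma_u, lambdas=(0.5, 1.0, 1.5, 2.0), n_sim=200, seed=0):
    """SIMEX-corrected slope of y on the true x, observed as w = x + u,
    with known measurement error standard deviation sigma_u."""
    rng = np.random.default_rng(seed)

    def slope(a, b):
        return np.cov(a, b)[0, 1] / np.var(a)

    lams = [0.0] + list(lambdas)
    betas = []
    for lam in lams:
        if lam == 0.0:
            betas.append(slope(w, y))            # naive (attenuated) estimate
        else:
            # simulation step: inflate the error variance by factor (1 + lam)
            sims = [slope(w + np.sqrt(lam) * sigma_u * rng.standard_normal(len(w)), y)
                    for _ in range(n_sim)]
            betas.append(np.mean(sims))
    # extrapolation step: quadratic fit in lambda, evaluated at lambda = -1
    poly = np.polynomial.Polynomial.fit(lams, betas, 2)
    return poly(-1.0)
```

The quadratic extrapolant typically under-corrects slightly, but it moves the estimate much closer to the truth than the naive slope, which is exactly the bias pattern the measurement-error literature describes.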
Synchronizing movements with the metronome: nonlinear error correction and unstable periodic orbits.
Engbert, Ralf; Krampe, Ralf Th; Kurths, Jürgen; Kliegl, Reinhold
2002-02-01
The control of human hand movements is investigated in a simple synchronization task. We propose and analyze a stochastic model based on nonlinear error correction, a mechanism which implies the existence of unstable periodic orbits. This prediction is tested in an experiment with human subjects. We find that our experimental data are in good agreement with numerical simulations of our theoretical model. These results suggest that feedback control of the human motor system shows nonlinear behavior. Copyright 2001 Elsevier Science (USA).
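A toy model in this spirit can be simulated directly: let the tap asynchrony evolve under a saturating (tanh) correction whose local gain exceeds 1, so the perfectly in-phase fixed point is unstable yet the dynamics remain bounded, echoing the unstable periodic orbits the authors predict. The specific map and parameters below are my illustrative stand-in, not the paper's model.

```python
import numpy as np

def simulate(alpha=0.8, gamma=5.0, noise=0.01, n=2000, seed=0):
    """Asynchrony dynamics e(k+1) = e(k) - alpha*tanh(gamma*e(k)) + noise.

    Near zero the map's slope is 1 - alpha*gamma = -3, so the in-phase
    state is unstable; the tanh saturation keeps excursions bounded.
    """
    rng = np.random.default_rng(seed)
    e = np.empty(n)
    e[0] = 0.05                               # small initial asynchrony
    for k in range(n - 1):
        e[k + 1] = e[k] - alpha * np.tanh(gamma * e[k]) + noise * rng.standard_normal()
    return e
```

The trajectory neither converges to zero nor diverges: it sustains a bounded oscillation around the unstable fixed point, which is the qualitative signature nonlinear error correction predicts and a purely linear correction model cannot produce.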
NASA Astrophysics Data System (ADS)
Zhang, Yachu; Zhao, Yuejin; Liu, Ming; Dong, Liquan; Kong, Lingqin; Liu, Lingling
2017-09-01
In contrast to humans, who use only visual information for navigation, many mobile robots use laser scanners and ultrasonic sensors along with vision cameras to navigate. This work proposes a vision-based robot control algorithm based on deep convolutional neural networks. We create a large 15-layer convolutional neural network learning system and achieve advanced recognition performance. Our system is trained end to end to map raw input images to a steering direction in supervised mode. The images in the data sets are collected in a wide variety of weather and lighting conditions. In addition, the data sets are augmented by adding Gaussian noise and salt-and-pepper noise to avoid overfitting. The algorithm is verified by two experiments: line tracking and obstacle avoidance. The line tracking experiment is conducted to track a desired path composed of straight and curved lines. The goal of the obstacle avoidance experiment is to avoid obstacles indoors. Finally, we obtain a 3.29% error rate on the training set and a 5.1% error rate on the test set in the line tracking experiment, and a 1.8% error rate on the training set and less than a 5% error rate on the test set in the obstacle avoidance experiment. During the actual test, the robot can follow the runway centerline outdoors and avoid obstacles in the room accurately. The results confirm the effectiveness of the algorithm and of our improvements to the network structure and training parameters.
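The augmentation step described above is straightforward to sketch: corrupt each training image with Gaussian noise and salt-and-pepper noise so the network does not overfit to clean inputs. The parameter values and function name below are illustrative assumptions, not the authors' settings.

```python
import numpy as np

def augment(img, sigma=0.02, sp_frac=0.01, seed=0):
    """Return a noisy copy of img (float array with values in [0, 1]).

    sigma: Gaussian noise standard deviation;
    sp_frac: fraction of pixels replaced by salt (1.0) or pepper (0.0).
    """
    rng = np.random.default_rng(seed)
    out = img + sigma * rng.standard_normal(img.shape)   # Gaussian noise
    mask = rng.random(img.shape) < sp_frac               # salt-and-pepper sites
    out[mask] = rng.choice([0.0, 1.0], size=int(mask.sum()))
    return np.clip(out, 0.0, 1.0)
```

In a training loop each image would be augmented with a fresh random seed per epoch, so the network sees a different corruption of the same scene every pass.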
Human error in aviation operations
NASA Technical Reports Server (NTRS)
Billings, C. E.; Lauber, J. K.; Cooper, G. E.
1974-01-01
This report is a brief description of research being undertaken by the National Aeronautics and Space Administration. The project is designed to seek out factors in the aviation system which contribute to human error, and to search for ways of minimizing the potential threat posed by these factors. The philosophy and assumptions underlying the study are discussed, together with an outline of the research plan.
Error detection and reduction in blood banking.
Motschman, T L; Moore, S B
1996-12-01
Error management plays a major role in facility process improvement efforts. By detecting and reducing errors, quality and, therefore, patient care improve. Error management begins with a strong organizational foundation of management attitude, with clear, consistent employee direction and appropriate physical facilities. Clearly defined critical processes, critical activities, and SOPs act as the framework for operations as well as for active quality monitoring. To ensure that personnel can detect and report errors, they must be trained in both operational duties and error management practices. Use of simulated/intentional errors and incorporation of error detection into competency assessment keep employees practiced and confident and diminish fear of the unknown. Personnel can clearly see that errors are indeed used as opportunities for process improvement and not for punishment. The facility must have a clearly defined and consistently used definition of reportable errors. Reportable errors should include those with potentially harmful outcomes as well as those that are "upstream," and thus further away from the outcome. A well-written error report consists of who, what, when, where, why/how, and follow-up to the error. Before correction can occur, an investigation should be undertaken to determine the underlying cause of the error. Obviously, the best corrective action is prevention. Correction can occur at five different levels; however, only three of these levels are directed at prevention. Prevention requires a method to collect and analyze data concerning errors. In the authors' facility, a functional error classification method and a quality system-based classification have been useful. An active method to search for problems uncovers them further upstream, before they can have disastrous outcomes.
In the continual quest for improving processes, an error management program is itself a process that needs improvement, and we must strive to always close the circle of quality assurance. Ultimately, the goal of better patient care will be the reward.
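The who/what/when/where/why-how/follow-up structure of a well-written error report can be captured in a simple record type. This sketch is illustrative; the `upstream` flag is a hypothetical addition reflecting the upstream/downstream distinction drawn above:

```python
from dataclasses import dataclass

@dataclass
class ErrorReport:
    """The elements of a well-written error report, as listed above."""
    who: str        # person(s) involved in or discovering the error
    what: str       # description of the error event
    when: str       # date/time of occurrence and detection
    where: str      # location or process step
    why_how: str    # underlying cause found by investigation
    follow_up: str  # corrective/preventive action taken
    upstream: bool = False  # caught before it could affect an outcome?
```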
45 CFR 98.102 - Content of Error Rate Reports.
Code of Federal Regulations, 2013 CFR
2013-10-01
....102 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND Error Rate Reporting § 98.102 Content of Error Rate Reports. (a) Baseline Submission Report... payments by the total dollar amount of child care payments that the State, the District of Columbia or...
45 CFR 98.102 - Content of Error Rate Reports.
Code of Federal Regulations, 2014 CFR
2014-10-01
....102 Public Welfare Department of Health and Human Services GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND Error Rate Reporting § 98.102 Content of Error Rate Reports. (a) Baseline Submission Report... payments by the total dollar amount of child care payments that the State, the District of Columbia or...
45 CFR 98.102 - Content of Error Rate Reports.
Code of Federal Regulations, 2012 CFR
2012-10-01
....102 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND Error Rate Reporting § 98.102 Content of Error Rate Reports. (a) Baseline Submission Report... payments by the total dollar amount of child care payments that the State, the District of Columbia or...
45 CFR 98.102 - Content of Error Rate Reports.
Code of Federal Regulations, 2011 CFR
2011-10-01
....102 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND Error Rate Reporting § 98.102 Content of Error Rate Reports. (a) Baseline Submission Report... payments by the total dollar amount of child care payments that the State, the District of Columbia or...
45 CFR 98.100 - Error Rate Report.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND... rates, which is defined as the percentage of cases with an error (expressed as the total number of cases with an error compared to the total number of cases); the percentage of cases with an improper payment...
Code of Federal Regulations, 2010 CFR
2010-10-01
... Government misrepresentation, inaction, or error. 407.32 Section 407.32 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES MEDICARE PROGRAM SUPPLEMENTARY MEDICAL INSURANCE... enrollment rights because of Federal Government misrepresentation, inaction, or error. If an individual's...
Code of Federal Regulations, 2011 CFR
2011-10-01
... Government misrepresentation, inaction, or error. 407.32 Section 407.32 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES MEDICARE PROGRAM SUPPLEMENTARY MEDICAL INSURANCE... enrollment rights because of Federal Government misrepresentation, inaction, or error. If an individual's...
Can eye-tracking technology improve situational awareness in paramedic clinical education?
Williams, Brett; Quested, Andrew; Cooper, Simon
2013-01-01
Human factors play a significant part in clinical error. Situational awareness (SA) means being aware of one's surroundings, comprehending the present situation, and being able to predict outcomes. It is a key human skill that, when properly applied, is associated with reduced medical error. Eye-tracking technology can be used to provide an objective and quantitative measure of the initial perception component of SA. Feedback from eye-tracking technology can be used to improve the understanding and teaching of SA in clinical contexts and consequently has potential for reducing clinician error and the concomitant adverse events.
A study on airborne integrated display system and human information processing
NASA Technical Reports Server (NTRS)
Mizumoto, K.; Iwamoto, H.; Shimizu, S.; Kuroda, I.
1983-01-01
The cognitive behavior of pilots was examined in an experiment involving mock-ups of an eight-display electronic attitude direction indicator for an airborne integrated display. Displays were presented in digital, analog-digital, and analog formats to experienced pilots. Two tests were run, one involving the speed of memorization in a single exposure and the other comprising two five-second exposures spaced 30 sec apart. Errors increased with the speed of memorization. In terms of response speed, analog information was generally assimilated faster than digital data. Information processing was quantified as 25 bits for the first five-second exposure and 15 bits during the second.
Why Current Statistics of Complementary Alternative Medicine Clinical Trials is Invalid.
Pandolfi, Maurizio; Carreras, Giulia
2018-06-07
It is not sufficiently known that frequentist statistics cannot provide direct information on the probability that the research hypothesis tested is correct. The error resulting from this misunderstanding is compounded when the hypotheses under scrutiny have precarious scientific bases, as those of complementary alternative medicine (CAM) generally do. In such cases, it is mandatory to use inferential methods that take into account the prior probability that the hypothesis tested is true, such as Bayesian statistics. The authors show that, under such circumstances, no real statistical significance can be achieved in CAM clinical trials. In this respect, CAM trials involving human material are also hardly defensible from an ethical viewpoint.
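The effect of the prior can be made concrete with a standard Bayesian calculation (an illustration consistent with the argument above, not a formula from the paper): the probability that a hypothesis is true given a "significant" result depends strongly on its prior probability.

```python
def post_study_probability(prior, alpha=0.05, power=0.8):
    """P(H true | significant result) by Bayes' rule, assuming a test
    with the given false-positive rate (alpha) and power."""
    true_pos = prior * power
    false_pos = (1.0 - prior) * alpha
    return true_pos / (true_pos + false_pos)

# A hypothesis with a precarious 1-in-100 prior remains unlikely even
# after a conventionally significant trial result:
print(round(post_study_probability(0.01), 3))  # ~0.139
```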
Litovsky, Ruth Y.; Godar, Shelly P.
2010-01-01
The precedence effect refers to the fact that humans are able to localize sound in reverberant environments, because the auditory system assigns greater weight to the direct sound (lead) than the later-arriving sound (lag). In this study, absolute sound localization was studied for single source stimuli and for dual source lead-lag stimuli in 4–5 year old children and adults. Lead-lag delays ranged from 5–100 ms. Testing was conducted in free field, with pink noise bursts emitted from loudspeakers positioned on a horizontal arc in the frontal field. Listeners indicated how many sounds were heard and the perceived location of the first- and second-heard sounds. Results suggest that at short delays (up to 10 ms), the lead dominates sound localization strongly at both ages, and localization errors are similar to those with single-source stimuli. At longer delays errors can be large, stemming from over-integration of the lead and lag, interchanging of perceived locations of the first-heard and second-heard sounds due to temporal order confusion, and dominance of the lead over the lag. The errors are greater for children than adults. Results are discussed in the context of maturation of auditory and non-auditory factors. PMID:20968369
Studies in automatic speech recognition and its application in aerospace
NASA Astrophysics Data System (ADS)
Taylor, Michael Robinson
Human communication is characterized in terms of the spectral and temporal dimensions of speech waveforms. Electronic speech recognition strategies based on Dynamic Time Warping and Markov Model algorithms are described and typical digit recognition error rates are tabulated. The application of Direct Voice Input (DVI) as an interface between man and machine is explored within the context of civil and military aerospace programmes. Sources of physical and emotional stress affecting speech production within military high performance aircraft are identified. Experimental results are reported which quantify fundamental frequency and coarse temporal dimensions of male speech as a function of the vibration, linear acceleration and noise levels typical of aerospace environments; preliminary indications of acoustic phonetic variability reported by other researchers are summarized. Connected whole-word pattern recognition error rates are presented for digits spoken under controlled Gz sinusoidal whole-body vibration. Correlations are made between significant increases in recognition error rate and resonance of the abdomen-thorax and head subsystems of the body. The phenomenon of vibrato style speech produced under low frequency whole-body Gz vibration is also examined. Interactive DVI system architectures and avionic data bus integration concepts are outlined together with design procedures for the efficient development of pilot-vehicle command and control protocols.
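Dynamic Time Warping, one of the two recognition strategies named above, can be sketched in its textbook dynamic-programming form (a minimal illustration, not the thesis's implementation):

```python
import numpy as np

def dtw_distance(a, b):
    """Minimum cumulative |a_i - b_j| cost over all monotonic alignments
    of two 1-D feature sequences (e.g. per-frame spectral features)."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]
```

A digit recognizer of this family stores one or more reference templates per digit and returns the digit whose template yields the smallest warped distance to the spoken utterance.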
Assessing tropical rainforest growth traits: Data - Model fusion in the Congo basin and beyond
NASA Astrophysics Data System (ADS)
Pietsch, Stephan
2017-04-01
Virgin forest ecosystems represent the key reference for natural tree growth dynamics. The mosaic cycle concept describes such dynamics as local disequilibria driven by patch-level succession cycles of breakdown, regeneration, juvenescence and old growth. These cycles, however, may involve different traits of light-demanding and shade-tolerant species assemblies. In this work a data-model fusion concept is introduced to assess the differences in growth dynamics of the mosaic cycle of the Western Congolian Lowland Rainforest ecosystem. Field data from 34 forest patches located in an ice age forest refuge, recently pinpointed to the ground and still devoid of direct human impact, constitute the data base. A 3D error assessment procedure versus BGC model simulations for the 34 patches revealed two different growth dynamics, consistent with observed growth traits of pioneer and late-succession species assemblies of the Western Congolian Lowland Rainforest. An application of the same procedure to Central American Pacific rainforests confirms the strength of the 3D error field data-model fusion concept to assess different growth traits of the mosaic cycle of natural forest dynamics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fuld, R.; Cybert, S.
Methods and criteria for performing human factors evaluations of plant systems and procedures are well developed and available. For a design review to produce a positive impact on operations, however, it is not enough to simply document deficiencies and solutions. The results must be presented to management in a clear and compelling form that will direct attention to the heart of a problem and present proposed solutions in terms of explicit, quantified costs and benefits. A proactive program of trip reduction provides an excellent opportunity to accomplish human factors-related upgrades. As an evaluative context, trip reduction imposes a uniform goodness criterion on all situations: the probability of inadvertent plant trip. This in turn means that findings can be compared in terms of a common quantitative reference point: the cost of an inadvertent shutdown. To interpret human factors deficiencies in terms of trip probabilities, the Technique for Human Error Rate Prediction (THERP) can be used. THERP provides an accessible compilation of human reliability data for generic, discrete task elements. Sequences of such values are combined in standard event trees to determine the probability of failure (e.g., trip) for a given evolution. THERP is widely accepted as one of the best available alternatives for assessing human reliability.
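The event-tree combination of task-element error probabilities can be sketched as follows; the HEP values are placeholders, not THERP table entries, and the sketch assumes independent task elements with no recovery or dependence modeling:

```python
def sequence_failure_probability(heps):
    """Probability that an evolution fails, given per-element human
    error probabilities along the success path of a series event tree."""
    p_success = 1.0
    for p in heps:
        p_success *= (1.0 - p)
    return 1.0 - p_success

# Three illustrative task elements in one evolution:
print(sequence_failure_probability([1e-3, 3e-3, 1e-2]))
```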
Review article: the influence of psychology and human factors on education in anesthesiology.
Glavin, Ronnie; Flin, Rhona
2012-02-01
We look at the changing nature of medical education in the developed world with particular reference to those areas of the new curriculum frameworks which have introduced topics from the psychosocial realm. Research in the branch of psychology dealing with human factors has developed a useful body of working knowledge which applies to other industries where humans interact with the complex systems in which they function. Some findings are already being applied to facets of anesthesia performance, including situation awareness, effective teamwork, countermeasures against active errors and latent pathogens, and limitations of human performance. However, existing lessons and practices from industrial or military research may not translate directly into effective strategies for anesthesiologists. Collaborative studies between psychologists and clinicians should continue in order to provide the anesthetic curriculum with an effective body of knowledge for each role of the anesthesiologist. Although individual anesthesiologists have made important contributions in this field, such material has not been formally incorporated into the curricula serving anesthesiologists in the developed world. There is a gap between the human factors psychologists now know and the human factors anesthesiologists need to know. As that gap closes, anesthesiologists may come to think more like human factor psychologists as well as biomedical scientists.
DeSouza, Joseph F X; Ovaysikia, Shima; Pynn, Laura
2012-06-20
The aim of this methods paper is to describe how to implement a neuroimaging technique to examine complementary brain processes engaged by two similar tasks. Participants' behavior during task performance in an fMRI scanner can then be correlated with brain activity using the blood-oxygen-level-dependent (BOLD) signal. We measure behavior so that correct trials, in which the subject performed the task correctly, can be sorted and the brain signals related to correct performance examined. Conversely, if error trials were included in the same analysis as correct trials, we would introduce trials that did not reflect correct performance. In many cases these errors can themselves be correlated with brain activity. We describe two complementary tasks used in our lab to examine the brain during suppression of automatic responses: the Stroop(1) and anti-saccade tasks. The emotional Stroop paradigm instructs participants to report either the superimposed emotional 'word' across the affective faces or the facial 'expressions' of the face stimuli(1,2). When the word and the facial expression refer to different emotions, a conflict arises between what must be said and what is automatically read. The participant has to resolve the conflict between the two simultaneously competing processes of word reading and facial-expression recognition. Our urge to read out a word leads to strong stimulus-response (SR) associations; inhibiting these strong SRs is difficult, and participants are prone to making errors. Overcoming this conflict and directing attention away from the face or the word requires the subject to inhibit bottom-up processes, which typically direct attention to the more salient stimulus.
Similarly, in the anti-saccade task(3,4,5,6), an instruction cue directs attention to a peripheral stimulus location, but the eye movement must be made to the mirror-opposite position. Here again we measure behavior by recording participants' eye movements, which allows the behavioral responses to be sorted into correct and error trials(7) that can then be correlated with brain activity. Neuroimaging thus allows researchers to measure different behaviors on correct and error trials that are indicative of different cognitive processes, and to pinpoint the different neural networks involved.
Advanced Outage and Control Center: Strategies for Nuclear Plant Outage Work Status Capabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gregory Weatherby
The research effort is a part of the Light Water Reactor Sustainability (LWRS) Program. LWRS is a research and development program sponsored by the Department of Energy, performed in close collaboration with industry to provide the technical foundations for licensing and managing the long-term, safe and economical operation of current nuclear power plants. The LWRS Program serves to help the US nuclear industry adopt new technologies and engineering solutions that facilitate the continued safe operation of the plants and extension of the current operating licenses. The Outage Control Center (OCC) Pilot Project was directed at carrying out the applied research for development and piloting of technology designed to enhance safe outage and maintenance operations, improve human performance and reliability, increase overall operational efficiency, and improve plant status control. Plant outage management is a high-priority concern for the nuclear industry from cost and safety perspectives. Unfortunately, many of the underlying technologies supporting outage control are the same as those used in the 1980s. They depend heavily upon large teams of staff, multiple work and coordination locations, and manual administrative actions that require large amounts of paper. Previous work in human reliability analysis suggests that many repetitive tasks, including paperwork tasks, may have a failure rate of 1.0E-3 or higher (Gertman, 1996). With between 10,000 and 45,000 subtasks being performed during an outage (Gomes, 1996), the opportunity for human error of some consequence is a realistic concern. Although a number of factors can make these errors recoverable, reducing and effectively coordinating the sheer number of tasks to be performed, particularly those that are error prone, has the potential to enhance outage efficiency and safety. Additionally, outage management requires precise coordination of work groups that do not always share similar objectives.
Outage managers are concerned with schedule and cost, union workers are concerned with performing work commensurate with their trade, and support functions (safety, quality assurance, radiological controls, etc.) are concerned with performing the work within the plant's controls and procedures. Approaches to outage management should be designed to increase the active participation of work groups and managers in making decisions that close the gap between competing objectives and reduce the potential for error and process inefficiency.
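The scale of the concern follows directly from the figures cited above. A back-of-envelope sketch, assuming the 1.0E-3 per-subtask rate applies uniformly and errors are independent:

```python
from math import exp

def expected_errors(n_subtasks, rate=1.0e-3):
    """Expected number of human errors across an outage's subtasks."""
    return n_subtasks * rate

def prob_at_least_one(n_subtasks, rate=1.0e-3):
    """Poisson approximation to the chance of at least one error."""
    return 1.0 - exp(-n_subtasks * rate)

# For the 10,000 to 45,000 subtasks cited (Gomes, 1996):
print(expected_errors(10_000), expected_errors(45_000))  # 10.0 45.0
```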
Henneman, Elizabeth A
2017-07-01
The Institute of Medicine (now National Academy of Medicine) reports "To Err is Human" and "Crossing the Quality Chasm" made explicit 3 previously unappreciated realities: (1) medical errors are common and result in serious, preventable adverse events; (2) the majority of medical errors result from system rather than human failures; and (3) it would be impossible for any system to prevent all errors. Given these realities, the role of the nurse in the "near miss" process and as the final safety net for the patient is of paramount importance. The nurse's role in patient safety is described from both a systems perspective and a human factors perspective. Critical care nurses use specific strategies to identify, interrupt, and correct medical errors. Strategies to identify errors include knowing the patient, knowing the plan of care, double-checking, and surveillance. Nursing strategies to interrupt errors include offering assistance, clarifying, and verbally interrupting. Nurses correct errors by persevering, being physically present, reviewing/confirming the plan of care, or involving another nurse or physician. Each of these strategies has implications for education, practice, and research. Surveillance is a key nursing strategy for identifying medical errors and reducing adverse events. Eye-tracking technology is a novel approach for evaluating the surveillance process during common, high-risk processes such as blood transfusion and medication administration. Eye tracking has also been used to examine the impact of interruptions to care caused by bedside alarms as well as by other health care personnel. Findings from this safety-related eye-tracking research provide new insight into effective bedside surveillance and interruption management strategies. ©2017 American Association of Critical-Care Nurses.
Westbrook, Johanna I.; Li, Ling; Lehnbom, Elin C.; Baysari, Melissa T.; Braithwaite, Jeffrey; Burke, Rosemary; Conn, Chris; Day, Richard O.
2015-01-01
Objectives To (i) compare medication errors identified at audit and observation with medication incident reports; (ii) identify differences between two hospitals in incident report frequency and medication error rates; (iii) identify prescribing error detection rates by staff. Design Audit of 3291 patient records at two hospitals to identify prescribing errors and evidence of their detection by staff. Medication administration errors were identified from a direct observational study of 180 nurses administering 7451 medications. Severity of errors was classified; those likely to lead to patient harm were categorized as ‘clinically important’. Setting Two major academic teaching hospitals in Sydney, Australia. Main Outcome Measures Rates of medication errors identified from audit and from direct observation were compared with reported medication incident reports. Results A total of 12 567 prescribing errors were identified at audit. Of these, 1.2/1000 errors (95% CI: 0.6–1.8) had incident reports. Clinically important prescribing errors (n = 539) were detected by staff at a rate of 218.9/1000 (95% CI: 184.0–253.8), but only 13.0/1000 (95% CI: 3.4–22.5) were reported. 78.1% (n = 421) of clinically important prescribing errors were not detected. A total of 2043 drug administrations (27.4%; 95% CI: 26.4–28.4%) contained ≥1 error; none had an incident report. Hospital A had a higher frequency of incident reports than Hospital B, but a lower rate of errors at audit. Conclusions Prescribing errors with the potential to cause harm frequently go undetected. Reported incidents do not reflect the profile or underlying rates of medication errors occurring in hospitals. This demonstrates the inaccuracy of using incident frequency to compare patient risk or quality performance within or across hospitals. New approaches, including data mining of electronic clinical information systems, are required to support more effective medication error detection and mitigation.
PMID:25583702
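The reported detection rate and its confidence interval can be reproduced with a standard Wald interval; the count of 118 detected errors is inferred from the reported 218.9/1000 of n = 539 and should be treated as illustrative:

```python
from math import sqrt

def rate_per_1000_with_ci(events, n, z=1.96):
    """Wald 95% confidence interval for a rate per 1000 observations."""
    p = events / n
    half = z * sqrt(p * (1.0 - p) / n)
    return 1000 * p, 1000 * (p - half), 1000 * (p + half)

rate, low, high = rate_per_1000_with_ci(118, 539)
print(round(rate, 1), round(low, 1), round(high, 1))  # 218.9 184.0 253.8
```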
NASA Astrophysics Data System (ADS)
Young, Kenneth C.; Cook, James J. H.; Oduko, Jennifer M.; Bosmans, Hilde
2006-03-01
European Guidelines for quality control in digital mammography specify minimum and achievable standards of image quality in terms of threshold contrast, based on readings of images of the CDMAM test object by human observers. However, this is time-consuming and has large inter-observer error. To overcome these problems a software program (CDCOM) is available to read CDMAM images automatically, but the optimal method of interpreting its output is not defined. This study evaluates methods of determining threshold contrast from the program and compares them to human readings for a variety of mammography systems. The methods considered are (A) simple thresholding, (B) psychometric curve fitting, (C) smoothing and interpolation, and (D) smoothing and psychometric curve fitting. Each method leads to similar threshold contrasts but with different reproducibility. Method (A) had relatively poor reproducibility, with a standard error in threshold contrast of 18.1 +/- 0.7%. This was reduced to 8.4% by using a contrast-detail curve fitting procedure. Method (D) had the best reproducibility, with an error of 6.7%, reducing to 5.1% with curve fitting. A panel of 3 human observers had an error of 4.4%, reduced to 2.9% by curve fitting. All automatic methods led to threshold contrasts lower than those of humans. The ratio of human to program threshold contrasts varied with detail diameter and was 1.50 +/- 0.04 (sem) at 0.1 mm and 1.82 +/- 0.06 at 0.25 mm for method (D). There were good correlations between the threshold contrasts determined by humans and by the automated methods.
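Psychometric curve fitting (methods B and D) can be illustrated with a least-squares grid search over a logistic curve; the 0.25 guess rate and 62.5%-correct threshold level are conventional choices for a 4-alternative task and are assumptions of this sketch, not parameters taken from the paper:

```python
import numpy as np

def psychometric(c, c_t, b):
    """Logistic detection curve rising from a 0.25 guess rate to 1.0;
    the threshold contrast c_t sits at the 62.5%-correct level."""
    return 0.25 + 0.75 / (1.0 + np.exp(-b * (c - c_t)))

def fit_threshold(contrasts, p_detect, ct_grid, b_grid):
    """Least-squares fit by grid search; returns the best threshold."""
    best_err, best_ct = np.inf, None
    for ct in ct_grid:
        for b in b_grid:
            err = np.sum((psychometric(contrasts, ct, b) - p_detect) ** 2)
            if err < best_err:
                best_err, best_ct = err, ct
    return best_ct
```

Fitting a smooth curve to the noisy per-contrast detection fractions, rather than simple thresholding, is what improves the reproducibility figures quoted above.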
Effect of Antenna Pointing Errors on SAR Imaging Considering the Change of the Point Target Location
NASA Astrophysics Data System (ADS)
Zhang, Xin; Liu, Shijie; Yu, Haifeng; Tong, Xiaohua; Huang, Guoman
2018-04-01
For spaceborne spotlight SAR, the antenna is steered by the SAR system with a specific regularity, so shaking of the internal mechanism is inevitable. Moreover, the external environment also affects the stability of the SAR platform. Both cause jitter of the SAR platform attitude. Platform attitude instability introduces antenna pointing errors in both the azimuth and range directions and influences the acquisition of SAR raw data and the ultimate imaging quality. In this paper, the relations between the antenna pointing errors and the three-axis attitude errors are derived; the effects of antenna pointing errors on spotlight SAR imaging of a point target are then analysed based on the paired-echo theory, with the change of the azimuth antenna gain taken into account as the spotlight SAR platform moves ahead. The simulation experiments show that the effects of antenna pointing errors on spotlight SAR imaging depend on the target location: the pointing errors of the antenna beam most severely affect the part of the illuminated scene far from the scene centre in the azimuth direction.
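The mapping from three-axis attitude errors to a net antenna pointing error can be sketched with small rotation matrices; the axis conventions and boresight direction here are illustrative assumptions, not the paper's derivation:

```python
import numpy as np

def rot(axis, angle):
    """Rotation matrix about the x (roll), y (pitch) or z (yaw) axis."""
    c, s = np.cos(angle), np.sin(angle)
    if axis == 'x':
        return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])
    if axis == 'y':
        return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def pointing_error(boresight, roll, pitch, yaw):
    """Angle (rad) between the nominal antenna boresight and the
    boresight perturbed by three-axis attitude errors."""
    R = rot('z', yaw) @ rot('y', pitch) @ rot('x', roll)
    v = R @ boresight
    return np.arccos(np.clip(np.dot(v, boresight), -1.0, 1.0))
```

For small attitude errors the net pointing error is approximately the root-sum-square of the two error components perpendicular to the boresight; a rotation about the boresight itself contributes nothing.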
NASA Technical Reports Server (NTRS)
Bell, Thomas L.; Kundu, Prasun K.; Kummerow, Christian D.; Einaudi, Franco (Technical Monitor)
2000-01-01
Quantitative use of satellite-derived maps of monthly rainfall requires some measure of the accuracy of the satellite estimates. The rainfall estimate for a given map grid box is subject to both remote-sensing error and, in the case of low-orbiting satellites, sampling error due to the limited number of observations of the grid box provided by the satellite. A simple model of rain behavior predicts that root-mean-square (RMS) random error in grid-box averages should depend in a simple way on the local average rain rate, and the predicted behavior has been seen in simulations using surface rain-gauge and radar data. This relationship was examined using satellite SSM/I data obtained over the western equatorial Pacific during TOGA COARE. RMS error inferred directly from SSM/I rainfall estimates was found to be larger than predicted from surface data, and to depend less on local rain rate than predicted. Preliminary examination of TRMM microwave estimates shows better agreement with surface data. A simple method of estimating RMS error in satellite rainfall estimates is suggested, based on quantities that can be computed directly from the satellite data.
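The comparison between satellite-inferred and surface-predicted errors rests on an empirical RMS difference over matched grid boxes; a minimal sketch (the pairing of satellite and surface values is illustrative):

```python
import numpy as np

def rms_difference(satellite, surface):
    """Empirical RMS difference between matched grid-box estimates."""
    d = np.asarray(satellite, dtype=float) - np.asarray(surface, dtype=float)
    return float(np.sqrt(np.mean(d ** 2)))
```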
Jolley, Suzanne; Thompson, Claire; Hurley, James; Medin, Evelina; Butler, Lucy; Bebbington, Paul; Dunn, Graham; Freeman, Daniel; Fowler, David; Kuipers, Elizabeth; Garety, Philippa
2014-01-01
Understanding how people with delusions arrive at false conclusions is central to the refinement of cognitive behavioural interventions. Making hasty decisions based on limited data (‘jumping to conclusions’, JTC) is one potential causal mechanism, but reasoning errors may also result from other processes. In this study, we investigated the correlates of reasoning errors under differing task conditions in 204 participants with schizophrenia spectrum psychosis who completed three probabilistic reasoning tasks. Psychotic symptoms, affect, and IQ were also evaluated. We found that hasty decision makers were more likely to draw false conclusions, but only 37% of their reasoning errors were consistent with the limited data they had gathered. The remainder directly contradicted all the presented evidence. Reasoning errors showed task-dependent associations with IQ, affect, and psychotic symptoms. We conclude that limited data-gathering contributes to false conclusions but is not the only mechanism involved. Delusions may also be maintained by a tendency to disregard evidence. Low IQ and emotional biases may contribute to reasoning errors in more complex situations. Cognitive strategies to reduce reasoning errors should therefore extend beyond encouragement to gather more data, and incorporate interventions focused directly on these difficulties. PMID:24958065
Prasad, Devleena; Das, Pinaki; Saha, Niladri S; Chatterjee, Sanjoy; Achari, Rimpa; Mallick, Indranil
2014-01-01
The aim of this study was to determine whether a less resource-intensive and established offline correction protocol, the No Action Level (NAL) protocol, was as effective as daily online correction of setup deviations in curative high-dose radiotherapy of prostate cancer. A total of 683 daily megavoltage CT (MVCT) or kilovoltage cone-beam CT (kV-CBCT) images of 30 patients with localized prostate cancer treated with intensity modulated radiotherapy were evaluated. Daily image guidance was performed and setup errors in three translational axes were recorded. The NAL protocol was simulated by using the mean shift calculated from the first five fractions as a correction applied to all subsequent treatments. Using the imaging data from the remaining fractions, the daily residual error (RE) was determined. The proportion of fractions in which the RE was greater than 3, 5 and 7 mm was calculated, as was the actual PTV margin that would be required if the offline protocol were followed. Using the NAL protocol reduced the systematic but not the random errors. Corrections made using the NAL protocol resulted in small and acceptable RE in the mediolateral (ML) and superoinferior (SI) directions, with 46/533 (8.1%) and 48/533 (5%) residual shifts above 5 mm. However, residual errors greater than 5 mm in the anteroposterior (AP) direction remained in 181/533 (34%) of fractions. The PTV margins calculated from the residual errors were 5 mm, 5 mm and 13 mm in the ML, SI and AP directions, respectively. Offline correction using the NAL protocol resulted in unacceptably high residual errors in the AP direction, owing to random uncertainties in rectal and bladder filling. Daily online imaging and correction remain the standard image guidance policy for highly conformal radiotherapy of prostate cancer.
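The NAL simulation described above (mean shift of the first five fractions applied as a fixed correction to all later fractions) can be sketched directly; the fractions x axes array layout is an assumption of this illustration:

```python
import numpy as np

def nal_residual_errors(daily_shifts, n_initial=5):
    """Residual errors under the No Action Level protocol: the mean
    setup error of the first n_initial fractions is applied as a fixed
    correction to every subsequent fraction."""
    shifts = np.asarray(daily_shifts, dtype=float)  # fractions x axes
    correction = shifts[:n_initial].mean(axis=0)
    return shifts[n_initial:] - correction
```

This removes the systematic (mean) component estimated from the first fractions but, as the results above show, leaves random day-to-day variation such as rectal and bladder filling uncorrected.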