Determination of the number of ψ' events at BESIII
NASA Astrophysics Data System (ADS)
Ablikim, M.; Achasov, M. N.; Albayrak, O.; Ambrose, D. J.; An, F. F.; An, Q.; Bai, J. Z.; Ban, Y.; Becker, J.; Bennett, J. V.; Berger, N.; Bertani, M.; et al. (BESIII Collaboration)
2013-06-01
The number of ψ' events accumulated by the BESIII experiment from March 3 through April 14, 2009, is determined by counting inclusive hadronic events. The result is 106.41×(1.00±0.81%)×10^6. The uncertainty is dominated by systematics; the statistical error is negligible.
Drought Persistence Errors in Global Climate Models
NASA Astrophysics Data System (ADS)
Moon, H.; Gudmundsson, L.; Seneviratne, S. I.
2018-04-01
The persistence of drought events largely determines the severity of socioeconomic and ecological impacts, but the capability of current global climate models (GCMs) to simulate such events is subject to large uncertainties. In this study, the representation of drought persistence in GCMs is assessed by comparing state-of-the-art GCM simulations to observation-based data sets. To do so, we consider dry-to-dry transition probabilities at monthly and annual scales as estimates of drought persistence, where a dry status is defined as a negative precipitation anomaly. Although there is a substantial spread in the drought persistence bias, most of the simulations show a systematic underestimation of drought persistence at the global scale. Subsequently, we analyze the degree to which (i) inaccurate observations, (ii) differences among models, (iii) internal climate variability, and (iv) uncertainty of the employed statistical methods contribute to the spread in drought persistence errors, using an analysis of variance approach. The results show that at the monthly scale, model uncertainty and observational uncertainty dominate, while the contribution from internal variability is small in most cases. At the annual scale, the spread of the drought persistence error is dominated by the statistical estimation error of drought persistence, indicating that the partitioning of the error is impaired by the limited number of considered time steps. These findings reveal systematic errors in the representation of drought persistence in current GCMs and suggest directions for further model improvement.
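The dry-to-dry transition probability used here as a persistence estimate follows directly from the stated definition (a dry step is one with a negative precipitation anomaly). The sketch below is an illustration of that definition, not the authors' code.

```python
# Minimal sketch: dry-to-dry transition probability as a drought-persistence
# estimate. Assumes a 1-D array of monthly or annual precipitation values;
# "dry" means below the long-term mean (negative anomaly).
import numpy as np

def dry_to_dry_probability(precip):
    """P(dry at t+1 | dry at t) for a precipitation series."""
    anomaly = precip - precip.mean()
    dry = anomaly < 0.0
    dry_now, dry_next = dry[:-1], dry[1:]
    n_dry = dry_now.sum()
    if n_dry == 0:
        return np.nan
    return (dry_now & dry_next).sum() / n_dry

rng = np.random.default_rng(0)
p = dry_to_dry_probability(rng.gamma(2.0, 50.0, size=1320))  # 110 years, monthly
print(p)
```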
Drought Persistence in Models and Observations
NASA Astrophysics Data System (ADS)
Moon, Heewon; Gudmundsson, Lukas; Seneviratne, Sonia
2017-04-01
In the 20th century, many regions of the world experienced drought events that persisted for several years and caused substantial economic and ecological impacts. However, it remains unclear whether there are significant trends in the frequency or severity of these prolonged drought events. In particular, an important issue is linked to systematic biases in the representation of persistent drought events in climate models, which impede analyses related to the detection and attribution of drought trends. This study assesses drought persistence errors in global climate model (GCM) simulations from the 5th phase of the Coupled Model Intercomparison Project (CMIP5) over the period 1901-2010. The model simulations are compared with five gridded observational data products. The analysis focuses on two aspects: the identification of systematic biases in the models, and the partitioning of the spread of the drought-persistence-error into four possible sources of uncertainty: model uncertainty, observation uncertainty, internal climate variability, and the estimation error of drought persistence. We use monthly and yearly dry-to-dry transition probabilities as estimates of drought persistence, with drought conditions defined as negative precipitation anomalies. For both time scales we find that most model simulations consistently underestimated drought persistence, except in a few regions such as India and Eastern South America. Partitioning the spread of the drought-persistence-error shows that at the monthly time scale model uncertainty and observation uncertainty are dominant, while the contribution from internal variability plays only a minor role in most cases. At the yearly scale, the spread of the drought-persistence-error is dominated by the estimation error, indicating that the partitioning is not statistically significant due to the limited number of considered time steps. These findings reveal systematic errors in the representation of drought persistence in current climate models and highlight the main contributors to the uncertainty of the drought-persistence-error. Future analyses will focus on investigating the temporal propagation of drought persistence to better understand the causes of the identified errors in the representation of drought persistence in state-of-the-art climate models.
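The partitioning of the error spread can be illustrated with a simple fixed-effects analysis-of-variance decomposition. This is a schematic sketch under that assumption, not the paper's exact scheme; `err[i, j]` is the (hypothetical) persistence error of model i scored against observational product j.

```python
# Schematic variance decomposition of a model-by-observation error matrix.
import numpy as np

def partition_spread(err):
    grand = err.mean()
    model_means = err.mean(axis=1)   # average over observation products
    obs_means = err.mean(axis=0)     # average over models
    ss_model = err.shape[1] * ((model_means - grand) ** 2).sum()
    ss_obs = err.shape[0] * ((obs_means - grand) ** 2).sum()
    ss_total = ((err - grand) ** 2).sum()
    ss_resid = ss_total - ss_model - ss_obs  # interaction + residual terms
    return {"model": ss_model / ss_total,
            "observation": ss_obs / ss_total,
            "residual": ss_resid / ss_total}

rng = np.random.default_rng(1)
print(partition_spread(rng.normal(-0.05, 0.02, size=(20, 5))))
```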
On the use of Lineal Energy Measurements to Estimate Linear Energy Transfer Spectra
NASA Technical Reports Server (NTRS)
Adams, David A.; Howell, Leonard W., Jr.; Adams, James H., Jr.
2007-01-01
This paper examines the error resulting from using a lineal energy spectrum to represent a linear energy transfer (LET) spectrum for applications in the space radiation environment. Lineal energy and linear energy transfer spectra are compared in three diverse but typical space radiation environments. Different detector geometries are also studied to determine how they affect the error. LET spectra are typically used to compute dose equivalent for radiation hazard estimation and single event effect rates to estimate radiation effects on electronics. The errors in estimates of dose equivalent and single event rates that result from substituting lineal energy spectra for linear energy transfer spectra are examined. It is found that this substitution has little effect on dose equivalent estimates in the quiet-time interplanetary environment, regardless of detector shape. The substitution has more of an effect when the environment is dominated by solar energetic particles or trapped radiation, but even then the errors are minor, especially if a spherical detector is used. For single event estimation, the effect of the substitution can be large if the threshold for the single event effect is near where the linear energy transfer spectrum drops suddenly. It is judged that single event rate estimates made from lineal energy spectra are unreliable, and the use of lineal energy spectra for single event rate estimation should be avoided.
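For context, lineal energy y is the energy deposited in an event divided by the mean chord length of the sensitive volume; for any convex body the mean chord is 4V/S, which is two-thirds of the diameter for a sphere. A minimal sketch of that bookkeeping (an illustration, not the paper's code):

```python
# Lineal energy from event energy depositions for a spherical detector.
# Substituting y for LET is exact only for tracks that fully cross the volume
# at constant stopping power; the paper quantifies the resulting error.
import numpy as np

def mean_chord_sphere(diameter_um):
    # Mean chord of a convex body is 4V/S; for a sphere this is 2/3 * d.
    return (2.0 / 3.0) * diameter_um

def lineal_energy_keV_per_um(energy_depositions_keV, diameter_um):
    return np.asarray(energy_depositions_keV) / mean_chord_sphere(diameter_um)

print(lineal_energy_keV_per_um([10.0, 35.0, 120.0], diameter_um=1.0))
```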
Mood, motivation, and misinformation: aging and affective state influences on memory.
Hess, Thomas M; Popham, Lauren E; Emery, Lisa; Elliott, Tonya
2012-01-01
Normative age differences in memory have typically been attributed to declines in basic cognitive and cortical mechanisms. The present study examined the degree to which dominant everyday affect might also be associated with age-related memory errors using the misinformation paradigm. Younger and older adults viewed a positive and a negative event, and then were exposed to misinformation about each event. Older adults exhibited a higher likelihood than young adults of falsely identifying misinformation as having occurred in the events. Consistent with expectations, strength of the misinformation effect was positively associated with dominant mood, and controlling for mood eliminated any age effects. Also, motivation to engage in complex cognitive activity was negatively associated with susceptibility to misinformation, and susceptibility was stronger for negative than for positive events. We argue that motivational processes underlie all of the observed effects, and that such processes are useful in understanding age differences in memory performance.
Error control techniques for satellite and space communications
NASA Technical Reports Server (NTRS)
Costello, Daniel J., Jr.
1990-01-01
An expurgated upper bound on the event error probability of trellis coded modulation is presented. This bound is used to derive a lower bound on the minimum achievable free Euclidean distance d_free of trellis codes. It is shown that the dominant parameters for both bounds, the expurgated error exponent and the asymptotic d_free growth rate, respectively, can be obtained from the cutoff rate R_0 of the transmission channel by a simple geometric construction, making R_0 the central parameter for finding good trellis codes. Several constellations are optimized with respect to the bounds.
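As an illustration of the cutoff rate's role, R_0 for equiprobable binary signalling takes the textbook form R_0 = 1 - log2(1 + B), where B is the Bhattacharyya parameter (B = exp(-Es/N0) for BPSK on an AWGN channel). The snippet below evaluates this standard special case; the paper's geometric construction for general constellations is not reproduced here.

```python
# Cutoff rate R_0 for BPSK over AWGN (standard textbook expression).
import math

def cutoff_rate_bpsk(es_over_n0_db):
    es_n0 = 10.0 ** (es_over_n0_db / 10.0)
    b = math.exp(-es_n0)              # Bhattacharyya parameter
    return 1.0 - math.log2(1.0 + b)   # bits per channel use

for snr_db in (0.0, 3.0, 6.0):
    print(snr_db, round(cutoff_rate_bpsk(snr_db), 3))
```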
Gravitational wave spectroscopy of binary neutron star merger remnants with mode stacking
NASA Astrophysics Data System (ADS)
Yang, Huan; Paschalidis, Vasileios; Yagi, Kent; Lehner, Luis; Pretorius, Frans; Yunes, Nicolás
2018-01-01
A binary neutron star coalescence event has recently been observed for the first time in gravitational waves, and many more detections are expected once current ground-based detectors begin operating at design sensitivity. As in the case of binary black holes, gravitational waves generated by binary neutron stars consist of inspiral, merger, and postmerger components. Detecting the latter is important because it encodes information about the nuclear equation of state in a regime that cannot be probed prior to merger. The postmerger signal, however, can only be expected to be measurable by current detectors for events closer than roughly ten megaparsecs, which given merger rate estimates implies a low probability of observation within the expected lifetime of these detectors. We carry out Monte Carlo simulations showing that the dominant postmerger signal (the ℓ = m = 2 mode) from individual binary neutron star mergers may not have a good chance of observation even with the most sensitive future ground-based gravitational wave detectors proposed so far (the Einstein Telescope and Cosmic Explorer, for certain equations of state, assuming a full year of operation, the latest merger rates, and a detection threshold corresponding to a signal-to-noise ratio of 5). For this reason, we propose two methods that stack the postmerger signal from multiple binary neutron star observations to boost the postmerger detection probability. The first method follows a commonly used practice of multiplying the Bayes factors of individual events. The second method relies on an assumption that the mode phase can be determined from the inspiral waveform, so that coherent mode stacking of the data from different events becomes possible. We find that both methods significantly improve the chances of detecting the dominant postmerger signal, making a detection very likely after a year of observation with Cosmic Explorer for certain equations of state. We also show that in terms of detection, coherent stacking is more efficient in accumulating confidence for the presence of postmerger oscillations in a signal than the first method. Moreover, assuming the postmerger signal is detected with Cosmic Explorer via stacking, we estimate through a Fisher analysis that the peak frequency can be measured to a statistical error of ~4-20 Hz for certain equations of state. Such an error corresponds to a neutron star radius measurement to within ~15-56 m, a fractional relative error of ~4%, suggesting that systematic errors from theoretical modeling (≳100 m) may dominate the error budget.
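A schematic of the two stacking strategies, with hypothetical per-event numbers (my illustration, not the authors' pipeline): multiplying Bayes factors amounts to summing their logarithms, while coherent stacking adds phase-aligned data so the combined SNR grows as the root of the summed squares.

```python
# Two ways to accumulate evidence across weak postmerger events.
import numpy as np

log_bayes_factors = np.array([0.8, 1.1, 0.5, 0.9])  # hypothetical per-event values
snrs = np.array([2.1, 2.6, 1.8, 2.3])               # hypothetical per-event SNRs

# Method 1: multiply Bayes factors, i.e. sum their logarithms.
combined_log_bf = log_bayes_factors.sum()

# Method 2: coherent mode stacking. If the mode phase is fixed by the
# inspiral, aligned data add in amplitude and SNRs combine in quadrature.
coherent_snr = np.sqrt((snrs ** 2).sum())

print(combined_log_bf, coherent_snr)  # e.g. individual SNRs < 5, stacked ~4.4
```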
SEU System Analysis: Not Just the Sum of All Parts
NASA Technical Reports Server (NTRS)
Berg, Melanie D.; LaBel, Kenneth
2014-01-01
Single event upset (SEU) analysis of complex systems is challenging. Currently, system SEU analysis is performed by component-level partitioning, after which either the most dominant SEU cross sections are used in system error rate calculations, or the partition cross sections are summed to obtain a system error rate. In many cases, system error rates are overestimated because these methods generally overlook system-level derating factors. The problem with overestimating is that it can cause overdesign and consequently negatively affect cost, schedule, functionality, and validation/verification. The scope of this presentation is to discuss the risks involved in the current scheme of SEU analysis for complex systems, and to provide alternative methods for improvement.
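The arithmetic at issue can be made concrete with a toy example (hypothetical partitions and derating factors): summing raw partition rates overestimates the system rate relative to applying system-level derating.

```python
# Summed partition error rates vs. system-level derating (illustrative only).
partitions = [
    # (name, events per device-day from component analysis, derating factor 0..1)
    ("processor", 1.2e-4, 0.30),
    ("memory",    4.0e-4, 0.10),
    ("fpga",      2.5e-4, 0.45),
]

summed_rate = sum(rate for _, rate, _ in partitions)                # overestimate
derated_rate = sum(rate * derate for _, rate, derate in partitions)
print(f"summed: {summed_rate:.2e}, derated: {derated_rate:.2e} errors/day")
```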
First gravitational-wave burst GW150914: MASTER optical follow-up observations
NASA Astrophysics Data System (ADS)
Lipunov, V. M.; Kornilov, V.; Gorbovskoy, E.; Buckley, D. A. H.; Tiurina, N.; Balanutsa, P.; Kuznetsov, A.; Greiner, J.; Vladimirov, V.; Vlasenko, D.; Chazov, V.; Kuvshinov, D.; Gabovich, A.; Potter, S. B.; Kniazev, A.; Crawford, S.; Rebolo Lopez, R.; Serra-Ricart, M.; Israelian, G.; Lodieu, N.; Gress, O.; Budnev, N.; Ivanov, K.; Poleschuk, V.; Yazev, S.; Tlatov, A.; Senik, V.; Yurkov, V.; Dormidontov, D.; Parkhomenko, A.; Sergienko, Yu.; Podesta, R.; Levato, H.; Lopez, C.; Saffe, C.; Podesta, F.; Mallamaci, C.
2017-03-01
The Advanced LIGO observatory recently reported the first direct detection of the gravitational waves (GWs) predicted by Einstein (1916). We report on the first optical observations of the GW source GW150914 error region with the Global MASTER Robotic Net. Among the optical telescopes providing electromagnetic support, MASTER dominated the covered area, reaching an unfiltered magnitude limit of 19.9 mag (5σ). We detected several optical transients, which proved to be unconnected with the GW event. The main contribution to the investigation of the final error box of GW150914 was made by the MASTER-SAAO robotic telescope, which covered 70 per cent of the final GW error box and 90 per cent of the common localization area of the LIGO and Fermi events. Our result is consistent with the conclusion (Abbott et al. 2016a) that the GWs from GW150914 were produced in a binary black hole merger. At the same time, we cannot exclude that MASTER OT J040938.68-541316.9 exploded on 2015 September 14.
Single-Event Effect Performance of a Conductive-Bridge Memory EEPROM
NASA Technical Reports Server (NTRS)
Chen, Dakai; Wilcox, Edward; Berg, Melanie; Kim, Hak; Phan, Anthony; Figueiredo, Marco; Seidleck, Christina; LaBel, Kenneth
2015-01-01
We investigated the heavy ion single-event effect (SEE) susceptibility of the industry's first stand-alone memory based on conductive-bridge memory (CBRAM) technology. The device is available as an electrically erasable programmable read-only memory (EEPROM). We found that single-event functional interrupt (SEFI) is the dominant SEE type for each operational mode (standby, dynamic read, and dynamic write/read). SEFIs occurred even while the device was statically biased in standby mode. Worst-case SEFIs resulted in errors that filled the entire memory space. Power cycling did not always clear the errors, so the corrupted cells had to be reprogrammed in some cases. The device is also vulnerable to bit upsets during dynamic write/read tests, although the frequency of the upsets is relatively low. The linear energy transfer (LET) threshold for cell upset is between 10 and 20 MeV·cm²/mg, with an upper-limit cross section of 1.6 × 10⁻¹¹ cm² per bit (95 percent confidence level) at 10 MeV·cm²/mg. In standby mode, the CBRAM array appears invulnerable to bit upsets.
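For reference, the cross-section bookkeeping behind numbers like these is simple. The sketch below uses illustrative values chosen to land near the quoted limit, not the actual test parameters, and applies the standard Poisson bound of roughly 3 events for a 95% confidence upper limit when zero upsets are observed.

```python
# Per-bit upset cross section from a heavy-ion test run (illustrative values).
upsets = 0
fluence = 1.9e5   # ions/cm^2 at LET = 10 MeV*cm^2/mg (hypothetical)
bits = 1.0e6      # memory bits under test (hypothetical)

if upsets == 0:
    sigma_per_bit = 3.0 / (fluence * bits)   # ~95% CL Poisson upper limit
else:
    sigma_per_bit = upsets / (fluence * bits)
print(f"sigma <= {sigma_per_bit:.1e} cm^2/bit")  # ~1.6e-11 with these inputs
```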
Passive acoustic monitoring to detect spawning in large-bodied catostomids
Straight, Carrie A.; Freeman, Byron J.; Freeman, Mary C.
2014-01-01
Documenting timing, locations, and intensity of spawning can provide valuable information for conservation and management of imperiled fishes. However, deep, turbid or turbulent water, or occurrence of spawning at night, can severely limit direct observations. We have developed and tested the use of passive acoustics to detect distinctive acoustic signatures associated with spawning events of two large-bodied catostomid species (River Redhorse Moxostoma carinatum and Robust Redhorse Moxostoma robustum) in river systems in north Georgia. We deployed a hydrophone with a recording unit at four different locations on four different dates when we could both record and observe spawning activity. Recordings captured 494 spawning events that we acoustically characterized using dominant frequency, 95% frequency, relative power, and duration. We similarly characterized 46 randomly selected ambient river noises. Dominant frequency did not differ between redhorse species and ranged from 172.3 to 14,987.1 Hz. Duration of spawning events ranged from 0.65 to 11.07 s, with River Redhorse having longer durations than Robust Redhorse. Observed spawning events had significantly higher dominant and 95% frequencies than ambient river noises. We additionally tested software designed to automate acoustic detection. The automated detection configurations correctly identified 80–82% of known spawning events, and falsely identified spawns 6–7% of the time when none occurred. These rates were combined over all recordings; rates were more variable among individual recordings. Longer spawning events were more likely to be detected. Combined with sufficient visual observations to ascertain species identities and to estimate detection error rates, passive acoustic recording provides a useful tool for studying spawning frequency of large-bodied fishes that displace gravel during egg deposition, including several species of imperiled catostomids.
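The two spectral metrics used above can be computed from a recording with standard tools; a scipy-based sketch follows (the study used dedicated detection software, so this is only illustrative).

```python
# Dominant frequency and 95% frequency of an acoustic event clip.
import numpy as np
from scipy.signal import welch

def event_metrics(samples, fs):
    freqs, psd = welch(samples, fs=fs, nperseg=1024)
    dominant = freqs[np.argmax(psd)]                 # frequency of peak power
    cumulative = np.cumsum(psd) / psd.sum()
    f95 = freqs[np.searchsorted(cumulative, 0.95)]   # 95% of power lies below
    return dominant, f95

fs = 44100
t = np.arange(0, 2.0, 1.0 / fs)
clip = np.random.default_rng(1).normal(size=t.size)  # stand-in for a recording
print(event_metrics(clip, fs))
```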
Systematic Biases in Parameter Estimation of Binary Black-Hole Mergers
NASA Technical Reports Server (NTRS)
Littenberg, Tyson B.; Baker, John G.; Buonanno, Alessandra; Kelly, Bernard J.
2012-01-01
Parameter estimation of binary-black-hole merger events in gravitational-wave data relies on matched filtering techniques, which, in turn, depend on accurate model waveforms. Here we characterize the systematic biases introduced in measuring astrophysical parameters of binary black holes by applying the currently most accurate effective-one-body templates to simulated data containing non-spinning numerical-relativity waveforms. For advanced ground-based detectors, we find that the systematic biases are well within the statistical error for realistic signal-to-noise ratios (SNR). These biases grow to be comparable to the statistical errors at high signal-to-noise ratios for ground-based instruments (SNR approximately 50) but never dominate the error budget. At the much larger signal-to-noise ratios expected for space-based detectors, these biases will become large compared to the statistical errors but are small enough (at most a few percent in the black-hole masses) that we expect they should not affect broad astrophysical conclusions that may be drawn from the data.
ERM model analysis for adaptation to hydrological model errors
NASA Astrophysics Data System (ADS)
Baymani-Nezhad, M.; Han, D.
2018-05-01
Hydrological conditions change continuously, and these changes introduce errors into flood forecasting models that can lead to unrealistic results. To overcome these difficulties, a concept called model updating has been proposed in hydrological studies. Real-time model updating is one of the challenging processes in the hydrological sciences and has not been entirely solved, owing to a lack of knowledge about the future state of the catchment under study. In flood forecasting, errors propagated from the rainfall-runoff model are regarded as the main source of uncertainty in the forecasting model. Hence, to control the existing errors, several methods have been proposed to update rainfall-runoff models, such as parameter updating, model state updating, and correction of input data. The current study investigates the ability of rainfall-runoff model parameters to cope with three types of error, timing, shape, and volume, which are the common errors in hydrological modelling. The new lumped model, the ERM model, was selected for this study to evaluate whether its parameters can be used in model updating to cope with the stated errors. Investigation of ten events shows that the ERM model parameters can be updated to cope with the errors without the need to recalibrate the model.
NASA Astrophysics Data System (ADS)
Van Weverberg, K.; Morcrette, C. J.; Petch, J.; Klein, S. A.; Ma, H.-Y.; Zhang, C.; Xie, S.; Tang, Q.; Gustafson, W. I.; Qian, Y.; Berg, L. K.; Liu, Y.; Huang, M.; Ahlgrimm, M.; Forbes, R.; Bazile, E.; Roehrig, R.; Cole, J.; Merryfield, W.; Lee, W.-S.; Cheruy, F.; Mellul, L.; Wang, Y.-C.; Johnson, K.; Thieman, M. M.
2018-04-01
Many Numerical Weather Prediction (NWP) and climate models exhibit too warm lower tropospheres near the midlatitude continents. The warm bias has been shown to coincide with important surface radiation biases that likely play a critical role in the inception or the growth of the warm bias. This paper presents an attribution study on the net radiation biases in nine model simulations, performed in the framework of the CAUSES project (Clouds Above the United States and Errors at the Surface). Contributions from deficiencies in the surface properties, clouds, water vapor, and aerosols are quantified, using an array of radiation measurement stations near the Atmospheric Radiation Measurement Southern Great Plains site. Furthermore, an in-depth analysis is shown to attribute the radiation errors to specific cloud regimes. The net surface shortwave radiation is overestimated in all models throughout most of the simulation period. Cloud errors are shown to contribute most to this overestimation, although nonnegligible contributions from the surface albedo exist in most models. Missing deep cloud events and/or simulating deep clouds with too weak cloud radiative effects dominate in the cloud-related radiation errors. Some models have compensating errors between excessive occurrence of deep cloud but largely underestimating their radiative effect, while other models miss deep cloud events altogether. Surprisingly, even the latter models tend to produce too much and too frequent afternoon surface precipitation. This suggests that rather than issues with the triggering of deep convection, cloud radiative deficiencies are related to too weak convective cloud detrainment and too large precipitation efficiencies.
Two Cultures in Modern Science and Technology: For Safety and Validity Does Medicine Have to Update?
Becker, Robert E
2016-01-11
Two different scientific cultures go unreconciled in modern medicine. Each culture accepts that scientific knowledge and technologies are vulnerable to and easily invalidated by the methods and conditions of acquisition, interpretation, and application. How these vulnerabilities are addressed separates the 2 cultures and potentially explains medicine's difficulties eradicating errors. A traditional culture, dominant in medicine, leaves error control in the hands of individual and group investigators and practitioners. A competing modern scientific culture accepts errors as inevitable, pernicious, and pervasive sources of adverse events throughout medical research and patient care, too malignant for individuals or groups to control. Error risks to the validity of scientific knowledge and safety in patient care require system-wide programming able to support a culture in medicine grounded in tested, continually updated, widely promulgated, and uniformly implemented standards of practice for research and patient care. Experiences from successes in other sciences and industries strongly support the need for leadership from the Institute of Medicine's recommended Center for Patient Safety within the Federal Executive branch of government.
A Search for Neutrinos from Fast Radio Bursts with IceCube
NASA Astrophysics Data System (ADS)
Fahey, Samuel; Kheirandish, Ali; Vandenbroucke, Justin; Xu, Donglian
2017-08-01
We present a search for neutrinos in coincidence in time and direction with four fast radio bursts (FRBs) detected by the Parkes and Green Bank radio telescopes during the first year of operation of the complete IceCube Neutrino Observatory (2011 May through 2012 May). The neutrino sample consists of 138,322 muon neutrino candidate events, which are dominated by atmospheric neutrinos and atmospheric muons but also contain an astrophysical neutrino component. Considering only neutrinos detected on the same day as each FRB, zero IceCube events were found to be compatible with the FRB directions within the estimated 99% error radius of the neutrino directions. Based on the non-detection, we present the first upper limits on the neutrino fluence from FRBs.
First- and second-language phonological representations in the mental lexicon.
Sebastian-Gallés, Núria; Rodríguez-Fornells, Antoni; de Diego-Balaguer, Ruth; Díaz, Begoña
2006-08-01
Performance-based studies on the psychological nature of linguistic competence can conceal significant differences in the brain processes that underlie native versus nonnative knowledge of language. Here we report results from the brain activity of highly proficient early bilinguals performing a lexical decision task that illustrate this point. Two groups of Spanish-Catalan early bilinguals (Spanish-dominant and Catalan-dominant) were asked to decide whether a given form was a Catalan word or not. The nonwords were based on real words, with one vowel changed. In the experimental stimuli, the vowel change involved a Catalan-specific contrast that previous research had shown to be difficult for Spanish natives to perceive. In the control stimuli, the vowel switch involved contrasts common to Spanish and Catalan. The results indicated that the groups of bilinguals did not differ in their behavioral and event-related brain potential measurements for the control stimuli; both groups made very few errors and showed a larger N400 component for control nonwords than for control words. However, significant differences were observed for the experimental stimuli across groups: specifically, Spanish-dominant bilinguals showed great difficulty in rejecting experimental nonwords. Indeed, these participants not only showed very high error rates for these stimuli, but also did not show an error-related negativity effect in their erroneous nonword decisions. However, both groups of bilinguals showed a larger correct-related negativity when making correct decisions about the experimental nonwords. The results suggest that although some aspects of a second-language system may show a remarkable lack of plasticity (such as the acquisition of some foreign contrasts), first-language representations seem to be more dynamic in their capacity to adapt and incorporate new information.
The Sensitivity of Adverse Event Cost Estimates to Diagnostic Coding Error
Wardle, Gavin; Wodchis, Walter P; Laporte, Audrey; Anderson, Geoffrey M; Baker, Ross G
2012-01-01
Objective: To examine the impact of diagnostic coding error on estimates of hospital costs attributable to adverse events. Data Sources: Original and reabstracted medical records of 9,670 complex medical and surgical admissions at 11 hospital corporations in Ontario from 2002 to 2004. Patient-specific costs, not including physician payments, were retrieved from the Ontario Case Costing Initiative database. Study Design: Adverse events were identified among the original and reabstracted records using ICD10-CA (Canadian adaptation of ICD10) codes flagged as postadmission complications. Propensity score matching and multivariate regression analysis were used to estimate the cost of the adverse events and to determine the sensitivity of cost estimates to diagnostic coding error. Principal Findings: Estimates of the cost of the adverse events ranged from $16,008 (metabolic derangement) to $30,176 (upper gastrointestinal bleeding). Coding errors caused the total cost attributable to the adverse events to be underestimated by 16 percent. The impact of coding error on adverse event cost estimates was highly variable at the organizational level. Conclusions: Estimates of adverse event costs are highly sensitive to coding error. Adverse event costs may be significantly underestimated if the likelihood of error is ignored. PMID:22091908
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gibson, Adam Paul
The authors present a measurement of the mass of the top quark. The event sample is selected from proton-antiproton collisions, at 1.96 TeV center-of-mass energy, observed with the CDF detector at Fermilab's Tevatron. They consider a 318 pb⁻¹ dataset collected between March 2002 and August 2004. They select events that contain one energetic lepton, large missing transverse energy, exactly four energetic jets, and at least one displaced-vertex b tag. The analysis uses leading-order tt̄ and background matrix elements along with parameterized parton showering to construct event-by-event likelihoods as a function of top quark mass. From the 63 events observed in the 318 pb⁻¹ dataset they extract a top quark mass of 172.0 ± 2.6(stat) ± 3.3(syst) GeV/c² from the joint likelihood. The mean expected statistical uncertainty is 3.2 GeV/c² for m_t = 178 GeV/c² and 3.1 GeV/c² for m_t = 172.5 GeV/c². The systematic error is dominated by the uncertainty of the jet energy scale.
NASA Astrophysics Data System (ADS)
Ran, Youhua; Li, Xin; Jin, Rui; Kang, Jian; Cosh, Michael H.
2017-01-01
Monitoring and estimating grid-mean soil moisture is very important for assessing many hydrological, biological, and biogeochemical processes and for validating remotely sensed surface soil moisture products. Temporal stability analysis (TSA) is a valuable tool for identifying a small number of representative sampling points to estimate the grid-mean soil moisture content. This analysis was evaluated and improved using high-quality surface soil moisture data that were acquired by a wireless sensor network in a high-intensity irrigated agricultural landscape in an arid region of northwestern China. The performance of the TSA was limited in areas where the representative error was dominated by random events, such as irrigation events. This shortcoming can be effectively mitigated by using a stratified TSA (STSA) method, proposed in this paper. In addition, the following methods were proposed for rapidly and efficiently identifying representative sampling points when using TSA. (1) Instantaneous measurements can be used to identify representative sampling points to some extent; however, the error resulting from this method is significant when validating remotely sensed soil moisture products. Thus, additional representative sampling points should be considered to reduce this error. (2) The calibration period can be determined from the time span of the full range of the grid-mean soil moisture content during the monitoring period. (3) The representative error is sensitive to the number of calibration sampling points, especially when only a few representative sampling points are used. Multiple sampling points are recommended to reduce data loss and improve the likelihood of representativeness at two scales.
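For orientation, the core TSA ranking follows the standard mean-relative-difference formulation sketched below; the paper's stratified variant (STSA) adds a stratification step that is not reproduced here.

```python
# Temporal stability analysis: rank sampling points by how closely and how
# stably they track the grid-mean soil moisture (illustrative sketch).
import numpy as np

def tsa_rank(theta):
    """theta: (n_times, n_points) soil moisture matrix."""
    grid_mean = theta.mean(axis=1, keepdims=True)
    rel_diff = (theta - grid_mean) / grid_mean   # relative difference per time
    mrd = rel_diff.mean(axis=0)                  # mean relative difference
    sdrd = rel_diff.std(axis=0)                  # its temporal std deviation
    score = np.hypot(mrd, sdrd)                  # small |MRD| and small SDRD win
    return np.argsort(score)

rng = np.random.default_rng(2)
ranking = tsa_rank(rng.uniform(0.1, 0.4, size=(365, 50)))
print("most representative points:", ranking[:3])
```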
Toward Joint Hypothesis-Tests Seismic Event Screening Analysis: Ms|mb and Event Depth
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Dale; Selby, Neil
2012-08-14
Well-established theory can be used to combine single-phenomenology hypothesis tests into a multi-phenomenology event-screening hypothesis test (Fisher's and Tippett's tests). The standard error commonly used in the Ms:mb event-screening hypothesis test is not fully consistent with its physical basis. An improved standard error agrees better with the physical basis, correctly partitions the error to include model error as a component of variance, and correctly reduces station noise variance through network averaging. For the 2009 DPRK test, the commonly used standard error 'rejects' H0 even with a better scaling slope (β = 1, Selby et al.), whereas the improved standard error 'fails to reject' H0.
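The two combination rules named above have simple textbook forms; the sketch below shows them for independent p-values (the screening application in the paper adds more structure on top of these).

```python
# Fisher's and Tippett's combined hypothesis tests for k independent p-values.
import numpy as np
from scipy.stats import chi2

def fishers_test(p_values):
    """Fisher: X = -2*sum(ln p_i) ~ chi-square with 2k dof under H0."""
    p = np.asarray(p_values)
    stat = -2.0 * np.log(p).sum()
    return chi2.sf(stat, df=2 * p.size)   # combined p-value

def tippetts_test(p_values):
    """Tippett: combined p-value based on the smallest individual p-value."""
    p = np.asarray(p_values)
    return 1.0 - (1.0 - p.min()) ** p.size

print(fishers_test([0.04, 0.20]), tippetts_test([0.04, 0.20]))
```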
Search for gamma-ray events in the BATSE data base
NASA Technical Reports Server (NTRS)
Lewin, Walter
1994-01-01
We find large location errors and error radii in the locations of channel 1 Cygnus X-1 events. These errors and their associated uncertainties are a result of low signal-to-noise ratios (a few sigma) in the two brightest detectors for each event. The untriggered events suffer from similarly low signal-to-noise ratios, and their location errors are expected to be at least as large as those found for Cygnus X-1 with a given signal-to-noise ratio. The statistical error radii are consistent with those found for Cygnus X-1 and with the published estimates. We therefore expect approximately 20 - 30 deg location errors for the untriggered events. Hence, many of the untriggered events occurring within a few months of the triggered activity from SGR 1900 plus 14 are indeed consistent with the SGR source location, although Cygnus X-1 is also a good candidate.
Functional language shift to the right hemisphere in patients with language-eloquent brain tumors.
Krieg, Sandro M; Sollmann, Nico; Hauck, Theresa; Ille, Sebastian; Foerschler, Annette; Meyer, Bernhard; Ringel, Florian
2013-01-01
Language function is mainly located within the left hemisphere of the brain, especially in right-handed subjects. However, functional MRI (fMRI) has demonstrated reorganization of language to the right hemisphere in patients with left-sided perisylvian lesions. Because intracerebral lesions can impair fMRI, this study was designed to investigate human language plasticity with a virtual lesion model using repetitive navigated transcranial magnetic stimulation (rTMS). Fifteen patients with lesions of left-sided language-eloquent brain areas and 50 healthy, purely right-handed participants underwent bilateral rTMS language mapping via an object-naming task. All patients were proven to have left-sided language function during awake surgery. The rTMS-induced language errors were categorized into 6 different error types. The error ratio (induced errors/number of stimulations) was determined for each brain region in both hemispheres. A hemispheric dominance ratio was then defined for each region as the quotient of the error ratios (left/right) of the corresponding areas of the two hemispheres (ratio >1 = left dominant; ratio <1 = right dominant). Patients with language-eloquent lesions showed a statistically significantly lower ratio than healthy participants for "all errors" and "all errors without hesitations", indicating greater participation of the right hemisphere in language function. However, no individual cortical region showed a pronounced difference in language dominance compared with the hemisphere as a whole. This is the first study to show, by means of an anatomically accurate virtual lesion model, that a shift of language function to the non-dominant hemisphere can occur.
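The dominance ratio defined above reduces to a one-line computation (illustrative numbers, not patient data):

```python
# Hemispheric dominance ratio: quotient of left and right error ratios.
def dominance_ratio(errors_left, stims_left, errors_right, stims_right):
    error_ratio_left = errors_left / stims_left
    error_ratio_right = errors_right / stims_right
    return error_ratio_left / error_ratio_right  # >1 left dominant, <1 right

print(dominance_ratio(12, 100, 5, 100))  # 2.4 -> left-dominant region
```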
Elliott, Rachel A; Putman, Koen D; Franklin, Matthew; Annemans, Lieven; Verhaeghe, Nick; Eden, Martin; Hayre, Jasdeep; Rodgers, Sarah; Sheikh, Aziz; Avery, Anthony J
2014-06-01
We recently showed that a pharmacist-led information technology-based intervention (PINCER) was significantly more effective in reducing medication errors in general practices than providing simple feedback on errors, with a cost per error avoided of £79 (US$131). We aimed to estimate the cost effectiveness of the PINCER intervention by combining its effectiveness in error reduction and its intervention costs with the effect of the individual errors on patient outcomes and healthcare costs, to estimate the effect on costs and QALYs. We developed Markov models for each of six medication errors targeted by PINCER. Clinical event probability, treatment pathway, resource use, and costs were extracted from the literature and costing tariffs. A composite probabilistic model combined patient-level error models with practice-level error rates and intervention costs from the trial. Cost per extra QALY and cost-effectiveness acceptability curves were generated from the perspective of NHS England, with a 5-year time horizon. The PINCER intervention generated £2,679 less cost and 0.81 more QALYs per practice [incremental cost-effectiveness ratio (ICER): -£3,037 per QALY] in the deterministic analysis. In the probabilistic analysis, PINCER generated 0.001 extra QALYs per practice compared with simple feedback, at £4.20 less per practice. Despite this extremely small set of differences in costs and outcomes, PINCER dominated simple feedback with a mean ICER of -£3,936 (standard error £2,970). At a ceiling willingness-to-pay of £20,000/QALY, PINCER reaches a 59% probability of being cost effective. PINCER produced marginal health gain at slightly reduced overall cost. Results are uncertain due to the poor quality of the data available to inform the effect of avoiding errors.
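The ICER arithmetic behind these figures is just the cost difference divided by the QALY difference. Note that the abstract's rounded inputs (-£2,679, 0.81 QALYs) give roughly -£3,300/QALY rather than exactly the quoted -£3,037, so the sketch below is illustrative only.

```python
# Incremental cost-effectiveness ratio: extra cost per extra QALY.
def icer(delta_cost, delta_qaly):
    return delta_cost / delta_qaly

# PINCER vs simple feedback, deterministic inputs as rounded in the abstract;
# a negative cost difference with a QALY gain means the intervention dominates.
print(f"ICER = {icer(-2679.0, 0.81):.0f} GBP/QALY")
```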
SRAM Based Re-programmable FPGA for Space Applications
NASA Technical Reports Server (NTRS)
Wang, J. J.; Sun, J. S.; Cronquist, B. E.; McCollum, J. L.; Speers, T. M.; Plants, W. C.; Katz, R. B.
1999-01-01
An SRAM (static random access memory)-based reprogrammable FPGA (field programmable gate array) is investigated for space applications. A new commercial prototype, named the RS family, is used as an example for the investigation. The device is fabricated in a 0.25 µm CMOS technology. Its architecture is reviewed to provide a better understanding of the impact of single event upset (SEU) on the device during operation. The SEU response of the different memories available on the device is evaluated. Heavy ion test data and SPICE simulations are used together to extract the threshold LET (linear energy transfer). Together with the saturation cross-section measurement from the layout, a rate prediction is made for each memory type. SEU in the configuration SRAM is identified as the dominant failure mode and is discussed in detail. The single event transient error in combinational logic is also investigated and simulated with SPICE. SEU mitigation by hardening the memories and employing EDAC (error detection and correction) at the device level is presented. For the configuration SRAM (CSRAM) cell, the trade-off between resistor de-coupling and redundancy hardening techniques is investigated, with interesting results. Preliminary heavy ion test data show no sign of SEL (single event latch-up). With regard to ionizing radiation effects, the measured increase in static leakage current (static I_CC) indicates a device tolerance of approximately 50 krad(Si).
Lobaugh, Lauren M Y; Martin, Lizabeth D; Schleelein, Laura E; Tyler, Donald C; Litman, Ronald S
2017-09-01
Wake Up Safe is a quality improvement initiative of the Society for Pediatric Anesthesia that contains a deidentified registry of serious adverse events occurring in pediatric anesthesia. The aim of this study was to describe and characterize reported medication errors to find common patterns amenable to preventative strategies. In September 2016, we analyzed approximately 6 years' worth of medication error events reported to Wake Up Safe. Medication errors were classified by: (1) medication category; (2) error type by phase of administration: prescribing, preparation, or administration; (3) bolus or infusion error; (4) provider type and level of training; (5) harm as defined by the National Coordinating Council for Medication Error Reporting and Prevention; and (6) perceived preventability. From 2010 to the time of our data analysis in September 2016, 32 institutions had joined and submitted data on 2087 adverse events during 2,316,635 anesthetics. These reports contained details of 276 medication errors, which comprised the third highest category of events behind cardiac and respiratory related events. Medication errors most commonly involved opioids and sedative/hypnotics. When categorized by phase of handling, 30 events occurred during preparation, 67 during prescribing, and 179 during administration. The most common error type was accidental administration of the wrong dose (N = 84), followed by syringe swap (accidental administration of the wrong syringe, N = 49). Fifty-seven (21%) reported medication errors involved medications prepared as infusions as opposed to 1 time bolus administrations. Medication errors were committed by all types of anesthesia providers, most commonly by attendings. Over 80% of reported medication errors reached the patient and more than half of these events caused patient harm. Fifteen events (5%) required a life sustaining intervention. Nearly all cases (97%) were judged to be either likely or certainly preventable. Our findings characterize the most common types of medication errors in pediatric anesthesia practice and provide guidance on future preventative strategies. Many of these errors will be almost entirely preventable with the use of prefilled medication syringes to avoid accidental ampule swap, bar-coding at the point of medication administration to prevent syringe swap and to confirm the proper dose, and 2-person checking of medication infusions for accuracy.
NASA Astrophysics Data System (ADS)
Xia, Zhiye; Xu, Lisheng; Chen, Hongbin; Wang, Yongqian; Liu, Jinbao; Feng, Wenlan
2017-06-01
Extended range forecasting of 10-30 days, which lies between medium-term and climate prediction in terms of timescale, plays a significant role in decision-making processes for the prevention and mitigation of disastrous meteorological events. The sensitivity of initial error, model parameter error, and random error in a nonlinear cross-prediction error (NCPE) model, and their stability over the prediction validity period in 10-30-day extended range forecasting, are analyzed quantitatively. The associated sensitivity of precipitable water, temperature, and geopotential height during cases of heavy rain and hurricanes is also discussed. The results are summarized as follows. First, the initial error and random error interact. When the ratio of random error to initial error is small (10⁻⁶-10⁻²), minor variation in random error cannot significantly change the dynamic features of a chaotic system, and therefore random error has minimal effect on the prediction. When the ratio is in the range of 10⁻¹-2 (i.e., random error dominates), attention should be paid to the random error instead of only the initial error. When the ratio is around 10⁻²-10⁻¹, both influences must be considered. Their mutual effects may bring considerable uncertainty to extended range forecasting, and de-noising is therefore necessary. Second, in terms of model parameter error, the embedding dimension m should be determined from the actual nonlinear time series. When m is small, the dynamic features of a chaotic system cannot be depicted because of the incomplete structure of the attractor. When m is large, prediction indicators can vanish because of the scarcity of phase points in phase space. A method for overcoming the cut-off effect (m > 4) is proposed. Third, for heavy rains, precipitable water is more sensitive to the prediction validity period than temperature or geopotential height; for hurricanes, however, geopotential height is most sensitive, followed by precipitable water.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Terezakis, Stephanie A., E-mail: stereza1@jhmi.edu; Harris, Kendra M.; Ford, Eric
Purpose: Systems to ensure patient safety are of critical importance. The electronic incident reporting systems (IRS) of 2 large academic radiation oncology departments were evaluated for events that may be suitable for submission to a national reporting system (NRS). Methods and Materials: All events recorded in the combined IRS were evaluated from 2007 through 2010. Incidents were graded for potential severity using the validated French Nuclear Safety Authority (ASN) 5-point scale. These incidents were categorized into 7 groups: (1) human error, (2) software error, (3) hardware error, (4) error in communication between 2 humans, (5) error at the human-software interface, (6) error at the software-hardware interface, and (7) error at the human-hardware interface. Results: Between the 2 systems, 4407 incidents were reported. Of these events, 1507 (34%) were considered to have the potential for clinical consequences. Of these 1507 events, 149 (10%) were rated as having a potential severity of ≥2. Of these 149 events, the committee determined that 79 (53%) would be submittable to a NRS, of which the majority were related to human error or to the human-software interface. Conclusions: A significant number of incidents were identified in this analysis. The majority of events in this study were related to human error and to the human-software interface, further supporting the need for a NRS to facilitate field-wide learning and system improvement.
Homaeinezhad, M R; Erfanianmoshiri-Nejad, M; Naseri, H
2014-01-01
The goal of this study is to introduce a simple, standard, and safe procedure to detect and delineate the P and T waves of the electrocardiogram (ECG) signal in real conditions. The proposed method consists of four major steps: (1) a secure QRS detection and delineation algorithm; (2) a pattern recognition algorithm designed to distinguish the various ECG clusters that occur between consecutive R-waves; (3) extraction of a template of the dominant events of each cluster waveform; and (4) application of correlation analysis to automatically delineate the P- and T-waves in noisy conditions. The performance characteristics of the proposed P and T detection-delineation algorithm are evaluated on various ECG signals whose quality is degraded from the best to the worst case based on random-walk noise theory. The method is also applied to the MIT-BIH Arrhythmia and QT databases to compare parts of its performance characteristics with a number of P and T detection-delineation algorithms. The evaluations indicate that for a signal with a low quality value of about 0.6, the proposed method detects the P and T events with a sensitivity of Se = 85% and a positive predictive value of P+ = 89%, respectively. In addition, at the same quality, the average delineation errors associated with those ECG events are 45 and 63 ms, respectively. Stable delineation error, high detection accuracy, and high noise tolerance were the most important aspects considered during development of the proposed method.
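The reported detection metrics follow the standard definitions below (the counts are hypothetical, chosen to reproduce Se = 85% and P+ = 89%).

```python
# Sensitivity (Se) and positive predictive value (P+) from detection counts.
def sensitivity(tp, fn):
    return tp / (tp + fn)   # fraction of true events that were detected

def positive_predictive_value(tp, fp):
    return tp / (tp + fp)   # fraction of detections that were true events

tp, fp, fn = 850, 105, 150
print(f"Se = {sensitivity(tp, fn):.0%}, P+ = {positive_predictive_value(tp, fp):.0%}")
```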
Adverse Drug Events and Medication Errors in African Hospitals: A Systematic Review.
Mekonnen, Alemayehu B; Alhawassi, Tariq M; McLachlan, Andrew J; Brien, Jo-Anne E
2018-03-01
Medication errors and adverse drug events are universal problems contributing to patient harm but the magnitude of these problems in Africa remains unclear. The objective of this study was to systematically investigate the literature on the extent of medication errors and adverse drug events, and the factors contributing to medication errors in African hospitals. We searched PubMed, MEDLINE, EMBASE, Web of Science and Global Health databases from inception to 31 August, 2017 and hand searched the reference lists of included studies. Original research studies of any design published in English that investigated adverse drug events and/or medication errors in any patient population in the hospital setting in Africa were included. Descriptive statistics including median and interquartile range were presented. Fifty-one studies were included; of these, 33 focused on medication errors, 15 on adverse drug events, and three studies focused on medication errors and adverse drug events. These studies were conducted in nine (of the 54) African countries. In any patient population, the median (interquartile range) percentage of patients reported to have experienced any suspected adverse drug event at hospital admission was 8.4% (4.5-20.1%), while adverse drug events causing admission were reported in 2.8% (0.7-6.4%) of patients but it was reported that a median of 43.5% (20.0-47.0%) of the adverse drug events were deemed preventable. Similarly, the median mortality rate attributed to adverse drug events was reported to be 0.1% (interquartile range 0.0-0.3%). The most commonly reported types of medication errors were prescribing errors, occurring in a median of 57.4% (interquartile range 22.8-72.8%) of all prescriptions and a median of 15.5% (interquartile range 7.5-50.6%) of the prescriptions evaluated had dosing problems. Major contributing factors for medication errors reported in these studies were individual practitioner factors (e.g. fatigue and inadequate knowledge/training) and environmental factors, such as workplace distraction and high workload. Medication errors in the African healthcare setting are relatively common, and the impact of adverse drug events is substantial but many are preventable. This review supports the design and implementation of preventative strategies targeting the most likely contributing factors.
NASA Technical Reports Server (NTRS)
Wu, Huan; Adler, Robert F.; Tian, Yudong; Huffman, George J.; Li, Hongyi; Wang, JianJian
2014-01-01
A widely used land surface model, the Variable Infiltration Capacity (VIC) model, is coupled with a newly developed hierarchical dominant river tracing-based runoff-routing model to form the Dominant river tracing-Routing Integrated with VIC Environment (DRIVE) model, which serves as the new core of the real-time Global Flood Monitoring System (GFMS). The GFMS uses real-time satellite-based precipitation to derive flood monitoring parameters for the latitude band 50 deg. N - 50 deg. S at relatively high spatial (approximately 12 km) and temporal (3-hourly) resolution. Examples of model results for recent flood events are computed using the real-time GFMS (http://flood.umd.edu). To evaluate the accuracy of the new GFMS, the DRIVE model is run retrospectively for 15 years using both research-quality and real-time satellite precipitation products. Evaluation results are slightly better for the research-quality input and significantly better for longer-duration events (3-day versus 1-day events). Basins with fewer dams tend to provide lower false alarm ratios. For events longer than three days in areas with few dams, the probability of detection is approximately 0.9 and the false alarm ratio is approximately 0.6. In general, these statistical results are better than those of the previous system. Streamflow was evaluated at 1121 river gauges across the quasi-global domain. Validation using real-time precipitation across the tropics (30 deg. S - 30 deg. N) gives positive daily Nash-Sutcliffe Coefficients for 107 out of 375 (28%) stations, with a mean of 0.19, and for 51% of the same gauges at the monthly scale, with a mean of 0.33. There were poorer results in higher latitudes, probably due to larger errors in the satellite precipitation input.
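The categorical skill scores quoted above (probability of detection, false alarm ratio) and the Nash-Sutcliffe coefficient follow standard definitions; a minimal sketch, with illustrative function names:

```python
import numpy as np

def pod_far(hits, misses, false_alarms):
    """Probability of detection and false alarm ratio from event counts."""
    pod = hits / (hits + misses)
    far = false_alarms / (hits + false_alarms)
    return pod, far

def nash_sutcliffe(simulated, observed):
    """Nash-Sutcliffe efficiency: 1 is perfect; <= 0 is no better than
    predicting the observed mean."""
    sim = np.asarray(simulated, dtype=float)
    obs = np.asarray(observed, dtype=float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)
```

By these definitions, POD of about 0.9 with FAR of about 0.6 means most long-duration flood events are caught, at the cost of roughly three false alarms in every five warnings.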
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Huan; Adler, Robert F.; Tian, Yudong
2014-03-01
A widely used land surface model, the Variable Infiltration Capacity (VIC) model, is coupled with a newly developed hierarchical dominant river tracing-based runoff-routing model to form the Dominant river tracing-Routing Integrated with VIC Environment (DRIVE) model, which serves as the new core of the real-time Global Flood Monitoring System (GFMS). The GFMS uses real-time satellite-based precipitation to derive flood monitoring parameters for the latitude band 50°N–50°S at relatively high spatial (~12 km) and temporal (3 hourly) resolution. Examples of model results for recent flood events are computed using the real-time GFMS (http://flood.umd.edu). To evaluate the accuracy of the new GFMS, the DRIVE model is run retrospectively for 15 years using both research-quality and real-time satellite precipitation products. Evaluation results are slightly better for the research-quality input and significantly better for longer duration events (3 day events versus 1 day events). Basins with fewer dams tend to provide lower false alarm ratios. For events longer than three days in areas with few dams, the probability of detection is ~0.9 and the false alarm ratio is ~0.6. In general, these statistical results are better than those of the previous system. Streamflow was evaluated at 1121 river gauges across the quasi-global domain. Validation using real-time precipitation across the tropics (30°S–30°N) gives positive daily Nash-Sutcliffe Coefficients for 107 out of 375 (28%) stations with a mean of 0.19 and 51% of the same gauges at monthly scale with a mean of 0.33. Finally, there were poorer results in higher latitudes, probably due to larger errors in the satellite precipitation input.
Meurier, C E
2000-07-01
Human errors are common in clinical practice, but they are under-reported. As a result, very little is known of the types, antecedents and consequences of errors in nursing practice. This limits the potential to learn from errors and to make improvement in the quality and safety of nursing care. The aim of this study was to use an Organizational Accident Model to analyse critical incidents of errors in nursing. Twenty registered nurses were invited to produce a critical incident report of an error (which had led to an adverse event or potentially could have led to an adverse event) they had made in their professional practice and to write down their responses to the error using a structured format. Using Reason's Organizational Accident Model, supplemental information was then collected from five of the participants by means of an individual in-depth interview to explore further issues relating to the incidents they had reported. The detailed analysis of one of the incidents is discussed in this paper, demonstrating the effectiveness of this approach in providing insight into the chain of events which may lead to an adverse event. The case study approach using critical incidents of clinical errors was shown to provide relevant information regarding the interaction of organizational factors, local circumstances and active failures (errors) in producing an adverse or potentially adverse event. It is suggested that more use should be made of this approach to understand how errors are made in practice and to take appropriate preventative measures.
Effects of Tropospheric Spatio-Temporal Correlated Noise on the Analysis of Space Geodetic Data
NASA Technical Reports Server (NTRS)
Romero-Wolf, A. F.; Jacobs, C. S.
2011-01-01
The standard VLBI analysis models measurement noise as purely thermal errors modeled according to uncorrelated Gaussian distributions. As the price of recording bits steadily decreases, thermal errors will soon no longer dominate. It is therefore expected that troposphere and instrumentation/clock errors will increasingly dominate. Given that both of these error sources have correlated spectra, properly modeling the error distributions will become more relevant for optimal analysis. This paper will discuss the advantages of including the correlations between tropospheric delays using a Kolmogorov spectrum and the frozen flow model pioneered by Treuhaft and Lanyi. We will show examples of applying these correlated noise spectra to the weighting of VLBI data analysis.
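As a toy illustration of correlated tropospheric weighting, the sketch below builds a covariance matrix from a Kolmogorov structure function D(τ) = C·τ^(5/3) with an assumed saturation scale; the constant, the saturation treatment, and the function name are assumptions for illustration, not the Treuhaft-Lanyi frozen flow model itself.

```python
import numpy as np

def kolmogorov_covariance(times, c=1.0e-6, tau_sat=3600.0):
    """Toy covariance for tropospheric delay noise with structure function
    D(tau) = c * tau**(5/3), saturated at tau_sat seconds. Uses
    C(tau) = var - D(tau)/2, so covariance falls from var at zero lag
    to var/2 (a common-mode floor) at and beyond the saturation scale."""
    t = np.asarray(times, dtype=float)
    tau = np.abs(t[:, None] - t[None, :])
    d = c * np.minimum(tau, tau_sat) ** (5.0 / 3.0)
    var = c * tau_sat ** (5.0 / 3.0)
    return var - 0.5 * d
```

In a generalized least-squares adjustment, a matrix like this would replace the diagonal thermal-noise weights, e.g. by solving against it with `np.linalg.solve(cov, residuals)` instead of dividing by per-point variances.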
Characteristics of Single-Event Upsets in a Fabric Switch (AD8151)
NASA Technical Reports Server (NTRS)
Buchner, Stephen; Carts, Martin A.; McMorrow, Dale; Kim, Hak; Marshall, Paul W.; LaBel, Kenneth A.
2003-01-01
Two types of single event effects - bit errors and single event functional interrupts - were observed during heavy-ion testing of the AD8151 crosspoint switch. Bit errors occurred in bursts, with the average number of bits in a burst depending on both the ion LET and the data rate. A pulsed laser was used to identify the locations on the chip where the bit errors and single event functional interrupts occurred. Bit errors originated in the switches, drivers, and output buffers. Single event functional interrupts occurred when the laser was focused on the second-rank latch containing the data specifying the state of each switch in the 33x17 matrix.
Error reporting in transfusion medicine at a tertiary care centre: a patient safety initiative.
Elhence, Priti; Shenoy, Veena; Verma, Anupam; Sachan, Deepti
2012-11-01
Errors in the transfusion process can compromise patient safety. A study was undertaken at our center to identify errors in the transfusion process and their causes, in order to reduce their occurrence through corrective and preventive actions. All near-miss, no-harm and adverse events reported in the transfusion process during a 1-year study period were recorded, classified and analyzed at a tertiary care teaching hospital in North India. In total, 285 transfusion-related events were reported during the study period. Of these, there were four adverse (1.5%), 10 no-harm (3.5%) and 271 (95%) near-miss events. The incorrect blood component transfusion rate was 1 in 6031 component units. The ABO-incompatible transfusion rate was 1 in 15,077 component units issued (or 1 in 26,200 PRBC units issued), and the rate of acute hemolytic transfusion reaction due to ABO-incompatible transfusion was 1 in 60,309 component units issued. Fifty-three percent of the antecedent near-miss events were bedside events. Patient sample handling errors were the single largest category of errors (n=94, 33%), followed by errors in labeling and in blood component handling and storage in user areas. The actual and near-miss event data obtained through this initiative provided clear evidence about latent defects and critical points in the transfusion process, so that corrective and preventive actions could be taken to reduce errors and improve transfusion safety.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dodds, Nathaniel Anson
2015-08-01
This report briefly summarizes three publications that resulted from a two-year LDRD. The three publications address a recently emerging reliability issue: namely, that low-energy protons (LEPs) can cause single-event effects (SEEs) in highly scaled microelectronics. These publications span from low to high technology readiness levels. In the first, novel experiments were used to prove that proton direct ionization is the dominant mechanism for LEP-induced SEEs. In the second, a simple method was developed to calculate expected on-orbit error rates for LEP effects. This simplification was enabled by creating (and characterizing) an accelerated space-like LEP environment in the laboratory. In the third publication, this new method was applied to many memory circuits from the 20-90 nm technology nodes to study the general importance of LEP effects, in terms of their contribution to the total on-orbit SEE rate.
Functional Language Shift to the Right Hemisphere in Patients with Language-Eloquent Brain Tumors
Krieg, Sandro M.; Sollmann, Nico; Hauck, Theresa; Ille, Sebastian; Foerschler, Annette; Meyer, Bernhard; Ringel, Florian
2013-01-01
Objectives: Language function is mainly located within the left hemisphere of the brain, especially in right-handed subjects. However, functional MRI (fMRI) has demonstrated changes of language organization in patients with left-sided perisylvian lesions to the right hemisphere. Because intracerebral lesions can impair fMRI, this study was designed to investigate human language plasticity with a virtual lesion model using repetitive navigated transcranial magnetic stimulation (rTMS). Experimental design: Fifteen patients with lesions of left-sided language-eloquent brain areas and 50 healthy and purely right-handed participants underwent bilateral rTMS language mapping via an object-naming task. All patients were proven to have left-sided language function during awake surgery. The rTMS-induced language errors were categorized into 6 different error types. The error ratio (induced errors/number of stimulations) was determined for each brain region on both hemispheres. A hemispheric dominance ratio was then defined for each region as the quotient of the error ratio (left/right) of the corresponding area of both hemispheres (ratio >1 = left dominant; ratio <1 = right dominant). Results: Patients with language-eloquent lesions showed a statistically significantly lower ratio than healthy participants concerning “all errors” and “all errors without hesitations”, which indicates a higher participation of the right hemisphere in language function. Yet, there was no cortical region with a pronounced difference in language dominance compared to the whole hemisphere. Conclusions: This is the first study to show, by means of an anatomically accurate virtual lesion model, that a shift of language function to the non-dominant hemisphere can occur. PMID:24069410
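The dominance-ratio arithmetic above is simple enough to state directly; a minimal sketch with illustrative names, not the authors' code:

```python
def hemispheric_dominance_ratio(errors_left, stims_left, errors_right, stims_right):
    """Quotient of left and right error ratios for one cortical region:
    > 1 suggests left dominance, < 1 right dominance (as defined above)."""
    left_error_ratio = errors_left / stims_left      # induced errors / stimulations
    right_error_ratio = errors_right / stims_right
    return left_error_ratio / right_error_ratio
```

For example, 12 errors in 60 left-hemisphere stimulations against 6 errors in 60 right-hemisphere stimulations gives a ratio of 2.0, i.e. clear left dominance for that region.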
Maximizing the Detection Probability of Kilonovae Associated with Gravitational Wave Observations
NASA Astrophysics Data System (ADS)
Chan, Man Leong; Hu, Yi-Ming; Messenger, Chris; Hendry, Martin; Heng, Ik Siong
2017-01-01
Estimates of the source sky location for gravitational wave signals are likely to span areas of up to hundreds of square degrees or more, making it very challenging for most telescopes to search for counterpart signals in the electromagnetic spectrum. To boost the chance of successfully observing such counterparts, we have developed an algorithm that optimizes the number of observing fields and their corresponding time allocations by maximizing the detection probability. As a proof-of-concept demonstration, we optimize follow-up observations targeting kilonovae using telescopes including the CTIO-Dark Energy Camera, Subaru-HyperSuprimeCam, Pan-STARRS, and the Palomar Transient Factory. We consider three simulated gravitational wave events with 90% credible error regions spanning areas from ~30 deg² to ~300 deg². Assuming a source at 200 Mpc, we demonstrate that to obtain a maximum detection probability, there is an optimized number of fields for any particular event that a telescope should observe. To inform future telescope design studies, we present the maximum detection probability and corresponding number of observing fields for a combination of limiting magnitudes and fields of view over a range of parameters. We show that for large gravitational wave error regions, telescope sensitivity rather than field of view is the dominating factor in maximizing the detection probability.
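The paper optimizes the number of fields and their time allocations; as a toy stand-in (not the authors' algorithm), a greedy scheduler under an assumed saturating exposure-time model illustrates why an optimal number of fields emerges: extra time in one field eventually buys less probability than opening a new field.

```python
import numpy as np

def allocate_time(field_probs, n_slots, depth_gain=0.3):
    """Toy greedy scheduler: hand out observing-time slots one at a time to
    whichever field most increases total detection probability, assuming a
    per-field detection probability p * (1 - exp(-g * t))."""
    p = np.asarray(field_probs, dtype=float)   # prob. the source is in each field
    t = np.zeros_like(p)                       # slots assigned so far
    for _ in range(n_slots):
        gain = p * (np.exp(-depth_gain * t) - np.exp(-depth_gain * (t + 1)))
        t[np.argmax(gain)] += 1                # marginal-gain greedy step
    return t, float(np.sum(p * (1 - np.exp(-depth_gain * t))))
```

Because the assumed per-field model is concave in time, the greedy allocation is optimal for this toy objective; the real calculation additionally folds in limiting magnitude, field of view, and the kilonova light curve.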
NASA Astrophysics Data System (ADS)
Pan, X.; Yang, Y.; Liu, Y.; Fan, X.; Shan, L.; Zhang, X.
2018-04-01
Error source analyses are critical for satellite-retrieved surface net radiation (Rn) products. In this study, we evaluate the Rn error sources in the Clouds and the Earth's Radiant Energy System (CERES) project at 43 sites in China from July 2007 to December 2007. The results show that cloud fraction (CF), land surface temperature (LST), atmospheric temperature (AT) and algorithm error dominate the Rn error, with error contributions of -20, 15, 10 and 10 W/m2 in net shortwave (NSW)/longwave (NLW) radiation, respectively. For NSW, the dominant error source is algorithm error (more than 10 W/m2), particularly in spring and summer when cloud is abundant. For NLW, because of the high algorithm sensitivity and the large LST/CF errors, LST and CF are the largest error sources, especially in northern China. AT strongly influences the NLW error in southern China because of the large AT error there. Total precipitable water has a weak influence on the Rn error even though the algorithm is highly sensitive to it. To improve Rn quality, the CF and LST (AT) errors in northern (southern) China should be reduced.
Trait dissociation and commission errors in memory reports of emotional events.
Merckelbach, Harald; Zeles, Gwen; Van Bergen, Saskia; Giesbrecht, Timo
2007-01-01
In 2 studies we examined whether trait dissociation is related to spontaneous commission errors (reports of events that did not occur) in free recall of emotional events. We also explored whether the functional locus of the dissociation-commission link is related to repeated retrieval or shallow encoding. In Experiment 1 participants were exposed to a staged incident and were repeatedly asked to add more information to their written accounts of the event. Dissociation levels were related to commission errors, indicating that people who report many dissociative experiences tend to make more commission errors. However, it was not the case that the overall increase in commission errors over successive retrieval attempts was typical for high dissociative participants. In Experiment 2 participants saw a video fragment of a severe car accident. During the video, half the participants performed a dual task, and the other half did not. Participants performing the dual task made more commission errors than controls, but this effect was not more pronounced in those with high trait dissociation scores. These studies show that there is a link between dissociation and spontaneous commission errors in memory reports of emotional events, but the functional locus of this link remains unclear.
Impact of SST Anomaly Events over the Kuroshio-Oyashio Extension on the "Summer Prediction Barrier"
NASA Astrophysics Data System (ADS)
Wu, Yujie; Duan, Wansuo
2018-04-01
The "summer prediction barrier" (SPB) of SST anomalies (SSTA) over the Kuroshio-Oyashio Extension (KOE) refers to the phenomenon that prediction errors of KOE-SSTA tend to increase rapidly during boreal summer, resulting in large prediction uncertainties. The fast error growth associated with the SPB occurs in the mature-to-decaying transition phase, which is usually during the August-September-October (ASO) season, of the KOE-SSTA events to be predicted. Thus, the role of KOE-SSTA evolutionary characteristics in the transition phase in inducing the SPB is explored by performing perfect model predictability experiments in a coupled model, indicating that the SSTA events with larger mature-to-decaying transition rates (Category-1) favor a greater possibility of yielding a more significant SPB than those events with smaller transition rates (Category-2). The KOE-SSTA events in Category-1 tend to have more significant anomalous Ekman pumping in their transition phase, resulting in larger prediction errors of vertical oceanic temperature advection associated with the SSTA events. Consequently, Category-1 events possess faster error growth and larger prediction errors. In addition, the anomalous Ekman upwelling (downwelling) in the ASO season also causes SSTA cooling (warming), accelerating the transition rates of warm (cold) KOE-SSTA events. Therefore, the SSTA transition rate and error growth rate are both related with the anomalous Ekman pumping of the SSTA events to be predicted in their transition phase. This may explain why the SSTA events transferring more rapidly from the mature to decaying phase tend to have a greater possibility of yielding a more significant SPB.
Martin, George M.
2011-01-01
All phenotypes result from interactions between Nature, Nurture and Chance. The constitutional genome is clearly the dominant factor in explaining the striking differences in the pace and patterns of ageing among species. We are now in a position to reveal salient features underlying these differential modulations, which are likely to be dominated by regulatory domains. By contrast, I shall argue that stochastic events are the major players underlying the surprisingly large intra-specific variations in lifespan and healthspan. I shall review well-established as well as more speculative categories of chance events – somatic mutations, protein synthesis error catastrophe and variegations of gene expression (epigenetic drift), with special emphasis upon the latter. I shall argue that stochastic drifts in variegated gene expression are the major contributors to intra-specific differences in the pace and patterns of ageing within members of the same species. They may be responsible for the quasi-stochastic distributions of major types of geriatric pathologies, including the “big three” of Alzheimer's disease, atherosclerosis and, via the induction of hyperplasia, cancer. They may be responsible for altered stoichiometries of heteromultimeric mitochondrial complexes, potentially leading to such disorders as sarcopenia, nonischemic cardiomyopathy and Parkinson's disease. PMID:21963385
Kreilinger, Alex; Hiebel, Hannah; Müller-Putz, Gernot R
2016-03-01
This work aimed to find and evaluate a new method for detecting errors in continuous brain-computer interface (BCI) applications. Instead of classifying errors on a single-trial basis, the new method was based on multiple events (MEs) analysis to increase the accuracy of error detection. In a BCI-driven car game, based on motor imagery (MI), discrete events were triggered whenever subjects collided with coins and/or barriers. Coins counted as correct events, whereas barriers were errors. This new method, termed ME method, combined and averaged the classification results of single events (SEs) and determined the correctness of MI trials, which consisted of event sequences instead of SEs. The benefit of this method was evaluated in an offline simulation. In an online experiment, the new method was used to detect erroneous MI trials. Such MI trials were discarded and could be repeated by the users. We found that, even with low SE error potential (ErrP) detection rates, feasible accuracies can be achieved when combining MEs to distinguish erroneous from correct MI trials. Online, all subjects reached higher scores with error detection than without, at the cost of longer times needed for completing the game. Findings suggest that ErrP detection may become a reliable tool for monitoring continuous states in BCI applications when combining MEs. This paper demonstrates a novel technique for detecting errors in online continuous BCI applications, which yields promising results even with low single-trial detection rates.
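A minimal sketch of the multiple-events idea, assuming the single-event classifier emits a score in [0, 1] per event (names and threshold are illustrative, not the authors' implementation):

```python
import numpy as np

def classify_trial(event_scores, threshold=0.5):
    """Multiple-events (ME) style decision: average single-event error
    scores within one motor-imagery trial and flag the trial as erroneous
    if the mean exceeds the threshold."""
    mean_score = float(np.mean(event_scores))
    return mean_score > threshold, mean_score
```

Averaging over a sequence of events suppresses single-trial noise, which is how feasible trial-level accuracy can emerge even from weak single-event ErrP detection rates.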
Model assessment using a multi-metric ranking technique
NASA Astrophysics Data System (ADS)
Fitzpatrick, P. J.; Lau, Y.; Alaka, G.; Marks, F.
2017-12-01
Validation comparisons of multiple models present challenges when skill levels are similar, especially in regimes dominated by the climatological mean. Assessing skill separation requires advanced validation metrics that identify adeptness in extreme events while maintaining simplicity for management decisions. Flexibility for operations is also an asset. This work proposes a weighted tally and consolidation technique that ranks results by multiple types of metrics. Variables include absolute error, bias, acceptable absolute error percentages, outlier metrics, model efficiency, Pearson correlation, Kendall's tau, reliability index, multiplicative gross error, and root mean squared differences. Other metrics, such as root mean square difference and rank correlation, were also explored but removed when their information was found to be largely duplicative of other metrics. While equal weights are applied, the weights could be altered to favor preferred metrics. Two examples are shown comparing ocean models' currents and tropical cyclone products, including experimental products. The importance of using magnitude and direction for tropical cyclone track forecasts, instead of distance, along-track, and cross-track errors, is discussed. Tropical cyclone intensity and structure prediction are also assessed. Vector correlations are not included in the ranking process, but were found useful in an independent context and will be briefly reported.
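A minimal sketch of such a weighted tally, assuming each metric has already been computed per model (the function name and the rank-sum consolidation are illustrative):

```python
import numpy as np

def rank_models(metric_table, lower_is_better, weights=None):
    """Rank models on each metric, then consolidate with a weighted sum
    of ranks; lower total rank = better model.
    metric_table: (n_models, n_metrics) array of metric values."""
    scores = np.asarray(metric_table, dtype=float)
    n_models, n_metrics = scores.shape
    w = np.ones(n_metrics) if weights is None else np.asarray(weights, dtype=float)
    ranks = np.empty_like(scores)
    for j in range(n_metrics):
        col = scores[:, j] if lower_is_better[j] else -scores[:, j]
        ranks[:, j] = np.argsort(np.argsort(col)) + 1  # 1 = best on this metric
    totals = ranks @ w
    return np.argsort(totals)  # model indices, best first
```

Ranking before consolidating keeps metrics on incompatible scales (e.g. a bias in knots versus a unitless correlation) from dominating one another, which matches the stated goal of simplicity for management decisions.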
Interlimb Differences in Coordination of Unsupported Reaching Movements
Schaffer, Jacob E.; Sainburg, Robert L.
2017-01-01
Previous research suggests that interlimb differences in coordination associated with handedness might result from specialized control mechanisms that are subserved by different cerebral hemispheres. Based largely on the results of horizontal plane reaching studies, we have proposed that the hemisphere contralateral to the dominant arm is specialized for predictive control of limb dynamics, while the non-dominant hemisphere is specialized for controlling limb impedance. The current study explores interlimb differences in control of 3-D unsupported reaching movements. While the task was presented in the horizontal plane, participants' arms were unsupported and free to move within a range of the vertical axis, which was redundant to the task plane. Results indicated significant dominant arm advantages for both initial direction accuracy and final position accuracy. The dominant arm showed greater excursion along a redundant axis that was perpendicular to the task plane and parallel to gravitational forces. In contrast, the non-dominant arm better impeded motion out of the task plane. Nevertheless, left arm task errors varied substantially more with shoulder rotation excursion than did dominant arm task errors. These findings suggest that the dominant arm controller was able to take advantage of the redundant degrees of freedom of the task, while non-dominant task errors appeared enslaved to motion along the redundant axis. These findings are consistent with a dominant controller that is specialized for intersegmental coordination, and a non-dominant controller that is specialized for impedance control. However, the findings are inconsistent with previously documented conclusions from planar tasks, in which non-dominant control leads to greater final position accuracy. PMID:28344068
First-year Analysis of the Operating Room Black Box Study.
Jung, James J; Jüni, Peter; Lebovic, Gerald; Grantcharov, Teodor
2018-06-18
To characterize intraoperative errors, events, and distractions, and measure technical skills of surgeons in minimally invasive surgery practice. Adverse events in the operating room (OR) are common contributors of morbidity and mortality in surgical patients. Adverse events often occur due to deviations in performance and environmental factors. Although comprehensive intraoperative data analysis and transparent disclosure have been advocated to better understand how to improve surgical safety, they have rarely been done. We conducted a prospective cohort study in 132 consecutive patients undergoing elective laparoscopic general surgery at an academic hospital during the first year after the definite implementation of a multiport data capture system called the OR Black Box to identify intraoperative errors, events, and distractions. Expert analysts characterized intraoperative distractions, errors, and events, and measured trainee involvement as main operator. Technical skills were compared, crude and risk-adjusted, among the attending surgeon and trainees. Auditory distractions occurred a median of 138 times per case [interquartile range (IQR) 96-190]. At least 1 cognitive distraction appeared in 84 cases (64%). Medians of 20 errors (IQR 14-36) and 8 events (IQR 4-12) were identified per case. Both errors and events occurred often in dissection and reconstruction phases of operation. Technical skills of residents were lower than those of the attending surgeon (P = 0.015). During elective laparoscopic operations, frequent intraoperative errors and events, variation in surgeons' technical skills, and a high amount of environmental distractions were identified using the OR Black Box.
NASA Astrophysics Data System (ADS)
Halkides, D. J.; Waliser, Duane E.; Lee, Tong; Menemenlis, Dimitris; Guan, Bin
2015-02-01
Spatial and temporal variation of the processes that determine ocean mixed-layer (ML) temperature (MLT) variability on the timescale of the Madden-Julian Oscillation (MJO) in the Tropical Indian Ocean (TIO) is examined in a heat-conserving ocean state estimate for the years 1993-2011. We introduce a new metric for representing spatial variability in the relative importance of processes. In general, horizontal advection is most important at the Equator. Subsurface processes and surface heat flux are more important away from the Equator, with surface heat flux being the more dominant factor. Analyses at key sites are discussed in the context of local dynamics and the literature. At 0°, 80.5°E, for MLT events > 2 standard deviations, ocean dynamics account for more than two thirds of the net tendency during cooling and warming phases. Zonal advection alone accounts for ~40% of the net tendency. Moderate events (1-2 standard deviations) show more differences between events, and some are dominated by surface heat flux. At 8°S, 67°E in the Seychelles-Chagos Thermocline Ridge (SCTR) area, surface heat flux accounts for ~70% of the tendency during strong cooling and warming phases; subsurface processes linked to ML depth (MLD) deepening (shoaling) during cooling (warming) account for ~30%. MLT is more sensitive to subsurface processes in the SCTR, due to the thin MLD, thin barrier layer and raised thermocline. Results for 8°S, 67°E support assertions by Vialard et al. (2008) that were not previously confirmed, because measurement error prevented budget closure and only a small number of events had been studied. The roles of MLD, barrier layer thickness, and thermocline depth on different timescales are examined.
Abbott, Richard L; Weber, Paul; Kelley, Betsy
2005-12-01
To review the history and current issues surrounding medical professional liability insurance and its relationship to medical error and healthcare risk management. Focused literature review and authors' experience. Medical professional liability insurance issues are reviewed in association with the occurrence of medical error and the role of healthcare risk management. The rising frequency and severity of claims and lawsuits incurred by physicians, as well as escalating defense costs, have dramatically increased over the past several years and have resulted in accelerated efforts to reduce medical errors and control practice risk for physicians. Medical error reduction and improved patient outcomes are closely linked to the goals of the medical risk manager by reducing exposure to adverse medical events. Management of professional liability risk by the physician-led malpractice insurance company not only protects the economic viability of physicians, but also addresses patient safety concerns. Physician-owned malpractice liability insurance companies will continue to be the dominant providers of insurance for practicing physicians and will serve as the primary source for loss prevention and risk management services. To succeed in the marketplace, the emergence and importance of the risk manager and incorporation of risk management principles throughout the professional liability company has become crucial to the financial stability and success of the insurance company. The risk manager provides the necessary advice and support requested by physicians to minimize medical liability risk in their daily practice.
Bilingual language intrusions and other speech errors in Alzheimer's disease.
Gollan, Tamar H; Stasenko, Alena; Li, Chuchu; Salmon, David P
2017-11-01
The current study investigated how Alzheimer's disease (AD) affects the production of speech errors in reading aloud. Twelve Spanish-English bilinguals with AD and 19 matched controls read aloud 8 paragraphs in four conditions: (a) English-only, (b) Spanish-only, (c) English-mixed (mostly English with 6 Spanish words), and (d) Spanish-mixed (mostly Spanish with 6 English words). Reading elicited language intrusions (e.g., saying la instead of the) and several types of within-language errors (e.g., saying their instead of the). Patients produced more intrusions (and self-corrected less often) than controls, particularly when reading non-dominant language paragraphs with switches into the dominant language. Patients also produced more within-language errors than controls, but differences between groups for these were not consistently larger with dominant versus non-dominant language targets. These results illustrate the potential utility of speech errors for the diagnosis of AD, suggest a variety of linguistic and executive control impairments in AD, and reveal the multiple cognitive mechanisms needed to mix languages fluently. The observed pattern of deficits, and the unique sensitivity of intrusions to AD in bilinguals, suggests intact ability to select a default language with contextual support and to rapidly translate and switch languages in production of connected speech, but impaired ability to monitor language membership while regulating inhibitory control. Copyright © 2017 Elsevier Inc. All rights reserved.
Detecting medication errors in the New Zealand pharmacovigilance database: a retrospective analysis.
Kunac, Desireé L; Tatley, Michael V
2011-01-01
Despite the traditional focus being adverse drug reactions (ADRs), pharmacovigilance centres have recently been identified as a potentially rich and important source of medication error data. To identify medication errors in the New Zealand Pharmacovigilance database (Centre for Adverse Reactions Monitoring [CARM]), and to describe the frequency and characteristics of these events. A retrospective analysis of the CARM pharmacovigilance database operated by the New Zealand Pharmacovigilance Centre was undertaken for the year 1 January-31 December 2007. All reports, excluding those relating to vaccines, clinical trials and pharmaceutical company reports, underwent a preventability assessment using predetermined criteria. Those events deemed preventable were subsequently classified to identify the degree of patient harm, type of error, stage of medication use process where the error occurred and origin of the error. A total of 1412 reports met the inclusion criteria and were reviewed, of which 4.3% (61/1412) were deemed preventable. Not all errors resulted in patient harm: 29.5% (18/61) were 'no harm' errors but 65.5% (40/61) of errors were deemed to have been associated with some degree of patient harm (preventable adverse drug events [ADEs]). For 5.0% (3/61) of events, the degree of patient harm was unable to be determined as the patient outcome was unknown. The majority of preventable ADEs (62.5% [25/40]) occurred in adults aged 65 years and older. The medication classes most involved in preventable ADEs were antibacterials for systemic use and anti-inflammatory agents, with gastrointestinal and respiratory system disorders the most common adverse events reported. For both preventable ADEs and 'no harm' events, most errors were incorrect dose and drug therapy monitoring problems consisting of failures in detection of significant drug interactions, past allergies or lack of necessary clinical monitoring. Preventable events were mostly related to the prescribing and administration stages of the medication use process, with the majority of errors 82.0% (50/61) deemed to have originated in the community setting. The CARM pharmacovigilance database includes medication errors, many of which were found to originate in the community setting and reported as ADRs. Error-prone situations were able to be identified, providing greater opportunity to improve patient safety. However, to enhance detection of medication errors by pharmacovigilance centres, reports should be prospectively reviewed for preventability and the reporting form revised to facilitate capture of important information that will provide meaningful insight into the nature of the underlying systems defects that caused the error.
Grantcharov, T P; Bardram, L; Funch-Jensen, P; Rosenberg, J
2003-07-01
The impact of gender and hand dominance on operative performance may be a subject of prejudice among surgeons, reportedly leading to discrimination and lack of professional promotion. However, very little objective evidence is available on the matter. This study was conducted to identify factors that influence surgeons' performance, as measured by a virtual reality computer simulator for laparoscopic surgery. The study included 25 surgical residents who had limited experience with laparoscopic surgery, having performed fewer than 10 laparoscopic cholecystectomies. The participants were registered according to their gender, hand dominance, and experience with computer games. All of the participants performed 10 repetitions of the six tasks on the Minimally Invasive Surgical Trainer-Virtual Reality (MIST-VR) within 1 month. Assessment of laparoscopic skills was based on three parameters measured by the simulator: time, errors, and economy of hand movement. Differences in performance existed between the compared groups. Men completed the tasks in less time than women (p = 0.01, Mann-Whitney test), but there was no statistical difference between the genders in the number of errors and unnecessary movements. Individuals with right-hand dominance performed fewer unnecessary movements (p = 0.045, Mann-Whitney test), and there was a trend toward better results in terms of time and errors among the residents with right-hand dominance than among those with left-hand dominance. Users of computer games made fewer errors than nonusers (p = 0.035, Mann-Whitney test). The study provides objective evidence of a difference in laparoscopic skills between surgeons of differing gender, hand dominance, and computer experience. These results may influence the future development of training programs for laparoscopic surgery. They also pose a challenge to individuals responsible for the selection and training of residents.
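The group comparisons above rely on the Mann-Whitney U test; a minimal sketch with hypothetical completion times (the numbers are invented for illustration, not study data):

```python
from scipy.stats import mannwhitneyu

# Hypothetical task-completion times (s) for two groups on one MIST-VR task.
times_group_a = [52.1, 47.3, 60.8, 44.9, 55.0, 49.7]
times_group_b = [63.4, 58.2, 71.9, 66.0, 54.3, 69.1]

stat, p_value = mannwhitneyu(times_group_a, times_group_b, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.3f}")  # p < 0.05 suggests a group difference
```

The test compares rank distributions rather than means, which suits small samples of skewed timing data like these.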
The use of propagation path corrections to improve regional seismic event location in western China
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steck, L.K.; Cogbill, A.H.; Velasco, A.A.
1999-03-01
In an effort to improve the ability to locate seismic events in western China using only regional data, the authors have developed empirical propagation path corrections (PPCs) and applied such corrections using both traditional location routines and a nonlinear grid-search method. Thus far, the authors have concentrated on corrections to observed P arrival times for shallow events, using travel-time observations available from the USGS EDRs, the ISC catalogs, their own travel-time picks from regional data, and data from other catalogs. They relocate events with the algorithm of Bratt and Bache (1988) from a region encompassing China. For individual stations having sufficient data, they produce a map of the regional travel-time residuals from all well-located teleseismic events. From these maps, interpolated PPC surfaces have been constructed using both surface fitting under tension and modified Bayesian kriging. The latter method offers the advantage of providing well-behaved interpolants, but requires that the authors have adequate error estimates associated with the travel-time residuals. To improve error estimates for kriging and event location, they separate measurement error from modeling error. The modeling error is defined as the travel-time variance of a particular model as a function of distance, while the measurement error is defined as the picking error associated with each phase. They estimate measurement errors for arrivals from the EDRs based on roundoff or truncation, and use signal-to-noise ratios for the travel-time picks from the waveform data set.
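A minimal sketch of the measurement/modeling error separation, collapsing the paper's distance-dependent modeling variance to a single scalar for illustration (the names and the independence assumption are mine, not the authors'):

```python
import numpy as np

def split_errors(residuals, pick_errors):
    """Assuming independent errors, var_total = var_model + mean(var_pick).
    Returns the modeling standard deviation and per-observation weights
    (inverse total variance) suitable for kriging or location."""
    r = np.asarray(residuals, dtype=float)
    s = np.asarray(pick_errors, dtype=float)
    var_model = max(r.var() - np.mean(s ** 2), 0.0)
    weights = 1.0 / (var_model + s ** 2)
    return np.sqrt(var_model), weights
```

Down-weighting arrivals with large picking errors keeps a few poorly timed phases from distorting either the kriged PPC surfaces or the event locations.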
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kolaczkowski, A.M.; Lambright, J.A.; Ferrell, W.L.
This document contains the internal event initiated accident sequence analyses for Peach Bottom, Unit 2; one of the reference plants being examined as part of the NUREG-1150 effort by the Nuclear Regulatory Commission. NUREG-1150 will document the risk of a selected group of nuclear power plants. As part of that work, this report contains the overall core damage frequency estimate for Peach Bottom, Unit 2, and the accompanying plant damage state frequencies. Sensitivity and uncertainty analyses provided additional insights regarding the dominant contributors to the Peach Bottom core damage frequency estimate. The mean core damage frequency at Peach Bottom was calculated to be 8.2E-6. Station blackout type accidents (loss of all ac power) were found to dominate the overall results. Anticipated Transient Without Scram accidents were also found to be non-negligible contributors. The numerical results are largely driven by common mode failure probability estimates and, to some extent, human error. Because of significant data and analysis uncertainties in these two areas (important, for instance, to the most dominant scenario in this study), it is recommended that the results of the uncertainty and sensitivity analyses be considered before any actions are taken based on this analysis.
Multiple levels of bilingual language control: evidence from language intrusions in reading aloud.
Gollan, Tamar H; Schotter, Elizabeth R; Gomez, Joanne; Murillo, Mayra; Rayner, Keith
2014-02-01
Bilinguals rarely produce words in an unintended language. However, we induced such intrusion errors (e.g., saying el instead of he) in 32 Spanish-English bilinguals who read aloud single-language (English or Spanish) and mixed-language (haphazard mix of English and Spanish) paragraphs with English or Spanish word order. These bilinguals produced language intrusions almost exclusively in mixed-language paragraphs, and most often when attempting to produce dominant-language targets (accent-only errors also exhibited reversed language-dominance effects). Most intrusion errors occurred for function words, especially when they were not from the language that determined the word order in the paragraph. Eye movements showed that fixating a word in the nontarget language increased intrusion errors only for function words. Together, these results imply multiple mechanisms of language control, including (a) inhibition of the dominant language at both lexical and sublexical processing levels, (b) special retrieval mechanisms for function words in mixed-language utterances, and (c) attentional monitoring of the target word for its match with the intended language.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Weverberg, K.; Morcrette, C. J.; Petch, J.
Many numerical weather prediction (NWP) and climate models exhibit too-warm lower tropospheres near the mid-latitude continents. This warm bias has been extensively studied before, but evidence about its origin remains inconclusive. Some studies point to deficiencies in the deep convective or low clouds. Other studies found an important contribution from errors in the land surface properties. The warm bias has been shown to coincide with important surface radiation biases that likely play a critical role in the inception or growth of the warm bias. Documenting these radiation errors is hence an important step towards understanding and alleviating the warm bias. This paper presents an attribution study to quantify the net radiation biases in 9 model simulations performed in the framework of the CAUSES project (Clouds Above the United States and Errors at the Surface). Contributions from deficiencies in the surface properties, clouds, integrated water vapor (IWV) and aerosols are quantified using an array of radiation measurement stations near the ARM SGP site. Furthermore, an in-depth analysis is shown to attribute the radiation errors to specific cloud regimes. The net surface SW radiation is overestimated (LW underestimated) in all models throughout most of the simulation period. Cloud errors are shown to contribute most to this overestimation in all but one model, which has a dominant albedo issue. Using a cloud regime analysis, it was shown that missing deep cloud events and/or simulating deep clouds with too-weak cloud-radiative effects account for most of these cloud-related radiation errors. Some models have compensating errors, producing deep cloud too frequently while largely underestimating its radiative effect, while other models miss deep cloud events altogether. Surprisingly, however, even the latter models tend to produce too much and too frequent afternoon surface precipitation. This suggests that rather than issues with the triggering of deep convection, the deep cloud problem in many models could be related to too-weak convective cloud detrainment and too-large precipitation efficiencies. This does not rule out that previously documented issues with the evaporative fraction contribute to the warm bias as well, since the majority of the models underestimate the surface rain rates overall, as they miss the observed large nocturnal precipitation peak.
NASA Astrophysics Data System (ADS)
Li, Jiaqiang; Choutko, Vitaly; Xiao, Liyi
2018-03-01
Based on the collection of error data from the Alpha Magnetic Spectrometer (AMS) Digital Signal Processors (DSPs), on-orbit Single Event Upsets (SEUs) of the DSP program memory are analyzed. The daily error distribution and the time intervals between errors are calculated to evaluate the reliability of the system. The particle density distribution of the International Space Station (ISS) orbit is presented, and the effects from the South Atlantic Anomaly (SAA) and the geomagnetic poles are analyzed. The impact of solar events on the DSP program memory is assessed by combining data analysis and Monte Carlo (MC) simulation. From the analysis and simulation results, it is concluded that the area corresponding to the SAA is the main source of errors on the ISS orbit. Solar events can also cause errors in DSP program memory, but the effect depends on the on-orbit particle density.
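A minimal sketch of the daily-distribution and inter-error-interval bookkeeping described above (names are illustrative; for a memoryless upset process the intervals should be roughly exponential):

```python
import numpy as np

def error_statistics(error_times_s):
    """Daily error counts and mean time between consecutive errors,
    from a list of error timestamps in seconds."""
    t = np.sort(np.asarray(error_times_s, dtype=float))
    intervals = np.diff(t)
    days = (t // 86400).astype(int)
    daily_counts = np.bincount(days - days.min())
    return daily_counts, float(intervals.mean())
```

Departures from an exponential interval distribution, such as clusters of short intervals, would point to localized sources like SAA passes rather than a uniform upset rate.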
ERIC Educational Resources Information Center
Mazur, Elizabeth; Wolchik, Sharlene A.; Virdin, Lynn; Sandler, Irwin N.; West, Stephen G.
1999-01-01
Examined whether children's cognitive biases moderated the impact of stressful divorce-related events on adjustment in 9- to 12-year-olds. Found that endorsing negative cognitive errors for hypothetical divorce events moderated relations between stressful divorce events and self- and maternal reports of internalizing and externalizing symptoms for…
Sandra, Dominiek
2010-01-01
Two experiments and two corpus studies focus on homophone dominance in the spelling of regularly inflected verb forms, the phenomenon that the higher-frequency homophone causes more intrusion errors on the lower-frequency one than vice versa. Experiment 1 was a speeded dictation task focusing on the Dutch imperative, a verb form whose formation rule is poorly known. A clear-cut effect of homophone dominance was found. This effect was equally strong when the target imperative was preceded by another imperative in the same sentence whose pronunciation reflected the spelling rule. Experiment 2 indicated that the effect of homophone dominance cannot be reduced to an effect of recency. Language users cannot discriminate a recently seen verb form when shown the two homophones. Instead, they choose the most frequent spelling pattern. In Corpus Study 1 a Google search on the world wide web revealed a sublexical effect of homophone dominance in the spelling errors on regular past tense forms. Corpus Study 2 demonstrated the validity of the search method. The sublexical effect of homophone dominance, involving units that cut across the stem-suffix boundary, lends itself naturally to a representational model of the connectionist or analogical processing tradition but is hard to reconcile with a rule-based account.
Jared, Debra; O'Donnell, Katrina
2017-02-01
We examined whether highly skilled adult readers activate the meanings of high-frequency words using phonology when reading sentences for meaning. A homophone-error paradigm was used. Sentences were written to fit 1 member of a homophone pair, and then 2 other versions were created in which the homophone was replaced by its mate or a spelling-control word. The error words were all high-frequency words, and the correct homophones were either higher-frequency or low-frequency words; that is, the homophone errors were either the subordinate or the dominant member of the pair. Participants read sentences as their eye movements were tracked. When the high-frequency homophone error words were the subordinate member of the homophone pair, participants had shorter immediate eye-fixation latencies on these words than on matched spelling-control words. In contrast, when the high-frequency homophone error words were the dominant member of the homophone pair, a difference between these words and spelling controls was delayed. These findings provide clear evidence that the meanings of high-frequency words are activated by phonological representations when skilled readers read sentences for meaning. Explanations of the differing patterns of results depending on homophone dominance are discussed.
Estimating Rain Rates from Tipping-Bucket Rain Gauge Measurements
NASA Technical Reports Server (NTRS)
Wang, Jianxin; Fisher, Brad L.; Wolff, David B.
2007-01-01
This paper describes the cubic-spline-based operational system for the generation of the TRMM one-minute rain rate product 2A-56 from tipping-bucket (TB) gauge measurements. Methodological issues associated with applying the cubic spline to TB gauge rain rate estimation are closely examined. A simulated TB gauge from a Joss-Waldvogel (JW) disdrometer is employed to evaluate the effects of time scales and rain event definitions on errors of the rain rate estimation. The comparison between rain rates measured from the JW disdrometer and those estimated from the simulated TB gauge shows good overall agreement; however, the TB gauge suffers sampling problems, resulting in errors in the rain rate estimation. These errors are very sensitive to the time scale of rain rates. One-minute rain rates suffer substantial errors, especially at low rain rates. When one-minute rain rates are averaged to 4-7-minute or longer time scales, the errors reduce dramatically. The rain event duration is very sensitive to the event definition, but the event rain total is rather insensitive, provided that events with less than 1 mm rain totals are excluded. Estimated lower rain rates are sensitive to the event definition, whereas the higher rates are not. The median relative absolute errors are about 22% and 32% for 1-minute TB rain rates higher and lower than 3 mm per hour, respectively. These errors decrease to 5% and 14% when TB rain rates are used at the 7-minute scale. The radar reflectivity-rain rate (Ze-R) distributions drawn from a large amount of 7-minute TB rain rates and radar reflectivity data are mostly insensitive to the event definition.
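A minimal sketch of the cubic-spline step, assuming a typical 0.254 mm (0.01 in) bucket and using scipy; the operational 2A-56 system involves more quality control than shown here.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def one_minute_rates(tip_times_s, tip_mm=0.254):
    """Fit a cubic spline to cumulative tipping-bucket rainfall and
    differentiate it to estimate one-minute rain rates in mm/h."""
    t = np.asarray(tip_times_s, dtype=float)
    cumulative = tip_mm * np.arange(1, len(t) + 1)   # accumulated rain, mm
    spline = CubicSpline(t, cumulative)
    minutes = np.arange(t[0], t[-1], 60.0)
    return spline(minutes, 1) * 3600.0               # d(mm)/d(s) -> mm/h
```

The sampling problem discussed above is visible here: at low rain rates the tips are minutes apart, so the spline's derivative between tips is poorly constrained, and averaging to 4-7-minute scales smooths out exactly that error.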
Corrigendum: Earthquakes triggered by silent slip events on Kīlauea volcano, Hawaii
Segall, Paul; Desmarais, Emily K.; Shelly, David; Miklius, Asta; Cervelli, Peter
2006-01-01
There was a plotting error in Fig. 1 that inadvertently displayed earthquakes for the incorrect time interval. The locations of earthquakes during the two-day-long slow-slip event of January 2005 are shown here in the corrected Fig. 1. Because the incorrect locations were also used in the Coulomb stress-change (CSC) calculation, the error could potentially have biased our interpretation of the depth of the slow-slip event, although in fact it did not. Because nearly all of the earthquakes, both background and triggered, are landward of the slow-slip event and at similar depths (6.5–8.5 km), the impact on the CSC calculations is negligible (Fig. 2; compare with Fig. 4 in the original paper). The error does not alter our conclusion that the triggered events during the January 2005 slow-slip event were located on a subhorizontal plane at a depth of 7.5 ± 1 km. This is therefore the most likely depth of the slow-slip events. We thank Cecily J. Wolfe for pointing out the error in the original Fig. 1.
Bonilla, Manuel G.; Mark, Robert K.; Lienkaemper, James J.
1984-01-01
In order to refine correlations of surface-wave magnitude, fault rupture length at the ground surface, and fault displacement at the surface by including the uncertainties in these variables, the existing data were critically reviewed and a new data base was compiled. Earthquake magnitudes were redetermined as necessary to make them as consistent as possible with the Gutenberg methods and results, which make up much of the data base. Measurement errors were estimated for the three variables for 58 moderate to large shallow-focus earthquakes. Regression analyses were then made utilizing the estimated measurement errors.The regression analysis demonstrates that the relations among the variables magnitude, length, and displacement are stochastic in nature. The stochastic variance, introduced in part by incomplete surface expression of seismogenic faulting, variation in shear modulus, and regional factors, dominates the estimated measurement errors. Thus, it is appropriate to use ordinary least squares for the regression models, rather than regression models based upon an underlying deterministic relation in which the variance results primarily from measurement errors.Significant differences exist in correlations of certain combinations of length, displacement, and magnitude when events are grouped by fault type or by region, including attenuation regions delineated by Evernden and others.Estimates of the magnitude and the standard deviation of the magnitude of a prehistoric or future earthquake associated with a fault can be made by correlating Ms with the logarithms of rupture length, fault displacement, or the product of length and displacement.Fault rupture area could be reliably estimated for about 20 of the events in the data set. Regression of Ms on rupture area did not result in a marked improvement over regressions that did not involve rupture area. Because no subduction-zone earthquakes are included in this study, the reported results do not apply to such zones.
Bonilla, M.G.; Mark, R.K.; Lienkaemper, J.J.
1984-01-01
In order to refine correlations of surface-wave magnitude, fault rupture length at the ground surface, and fault displacement at the surface by including the uncertainties in these variables, the existing data were critically reviewed and a new data base was compiled. Earthquake magnitudes were redetermined as necessary to make them as consistent as possible with the Gutenberg methods and results, which necessarily make up much of the data base. Measurement errors were estimated for the three variables for 58 moderate to large shallow-focus earthquakes. Regression analyses were then made utilizing the estimated measurement errors. The regression analysis demonstrates that the relations among the variables magnitude, length, and displacement are stochastic in nature. The stochastic variance, introduced in part by incomplete surface expression of seismogenic faulting, variation in shear modulus, and regional factors, dominates the estimated measurement errors. Thus, it is appropriate to use ordinary least squares for the regression models, rather than regression models based upon an underlying deterministic relation with the variance resulting from measurement errors. Significant differences exist in correlations of certain combinations of length, displacement, and magnitude when events are grouped by fault type or by region, including attenuation regions delineated by Evernden and others. Subdivision of the data results in too few data for some fault types and regions, and for these only regressions using all of the data as a group are reported. Estimates of the magnitude and the standard deviation of the magnitude of a prehistoric or future earthquake associated with a fault can be made by correlating Ms with the logarithms of rupture length, fault displacement, or the product of length and displacement. Fault rupture area could be reliably estimated for about 20 of the events in the data set. Regression of Ms on rupture area did not result in a marked improvement over regressions that did not involve rupture area. Because no subduction-zone earthquakes are included in this study, the reported results do not apply to such zones.
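Both versions of this study regress magnitude on the logarithms of rupture measures; a minimal ordinary-least-squares sketch in that standard form (the function name and the returned scatter estimate are illustrative):

```python
import numpy as np

def fit_magnitude_length(rupture_length_km, magnitude_ms):
    """Ordinary least squares fit of Ms = a + b * log10(L) for surface
    rupture length L in km; returns intercept, slope, and residual scatter."""
    x = np.log10(np.asarray(rupture_length_km, dtype=float))
    y = np.asarray(magnitude_ms, dtype=float)
    b, a = np.polyfit(x, y, 1)        # polyfit returns slope first, then intercept
    residuals = y - (a + b * x)
    return a, b, residuals.std(ddof=2)
```

The residual scatter matters as much as the fit itself: since the stochastic variance dominates the measurement errors, the standard deviation of a magnitude estimated for a prehistoric or future earthquake comes mostly from this term.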
Model Based Verification of Cyber Range Event Environments
2015-12-10
Damodaran, Suresh K.; MIT Lincoln Laboratory, 244 Wood St., Lexington, MA, USA
... apply model based verification to cyber range event environment configurations, allowing for the early detection of errors in event environment ... Environment Representation (CCER) ontology. We also provide an overview of a methodology to specify verification rules and the corresponding error ...
Rausch, R; MacDonald, K
1997-03-01
We used a protocol consisting of a continuous presentation of stimuli with associated response requests during an intracarotid sodium amobarbital procedure (IAP) to study the effects of hemisphere injected (speech dominant vs. nondominant) and seizure focus (left temporal lobe vs. right temporal lobe) on the pattern of behavioral response errors for three types of visual stimuli (pictures of common objects, words, and abstract forms). Injection of the left speech dominant hemisphere compared to the right nondominant hemisphere increased overall errors and affected the pattern of behavioral errors. The presence of a seizure focus in the contralateral hemisphere increased overall errors, particularly for the right temporal lobe seizure patients, but did not affect the pattern of behavioral errors. Left hemisphere injections disrupted both naming and reading responses at a rate similar to that of matching-to-sample performance. Also, a short-term memory deficit was observed with all three stimuli. Long-term memory testing following the left hemisphere injection indicated that only for pictures of common objects were there fewer errors during the early postinjection period than for the later long-term memory testing. Therefore, despite the inability to respond to picture stimuli, picture items, but not words or forms, could be sufficiently encoded for later recall. In contrast, right hemisphere injections resulted in few errors, with a pattern suggesting a mild general cognitive decrease. A selective weakness in learning unfamiliar forms was found. Our findings indicate that different patterns of behavioral deficits occur following the left vs. right hemisphere injections, with selective patterns specific to stimulus type.
Slow earthquakes in microseism frequency band (0.1-2 Hz) off the Kii peninsula
NASA Astrophysics Data System (ADS)
Kaneko, L.; Ide, S.; Nakano, M.
2017-12-01
Slow earthquakes are divided into deep tectonic tremors, very low frequency (VLF) events, and slow slip events (SSEs), each of which is observed in a different frequency band. Tremors are observed above 2 Hz, and VLF signals are visible mainly at 0.01-0.05 Hz. It has generally been very difficult to find signals of slow underground deformation at frequencies between them, i.e., 0.1-2 Hz, where microseism noise is dominant. However, after a Mw 5.9 plate boundary earthquake off the Kii peninsula on April 1st, 2016, sufficiently large signals have been observed in the microseism band, accompanied by signals from active tremors, VLF events, and SSEs, by the ocean bottom seismometer network DONET maintained by JAMSTEC and NIED. This is the first observation of slow earthquakes in the microseism frequency band. Here we report the detection and location of events in this band, and compare them with the spatial and temporal distributions of ordinary tectonic tremors above 2 Hz and VLF events. We used continuous records of 20 broadband seismometers of DONET from April 1st to 12th. We detected events by calculating arrival time differences between stations using the envelope correlation method of Ide (2010). Unlike ordinary applications, we repeated the analysis for seismograms bandpass-filtered in four separate frequency bands: 0.1-1, 1-2, 2-4, and 4-8 Hz. For each band, we successfully detected events and determined their hypocenter locations. Many VLF events have also been detected in this region in the 0.03-0.05 Hz band, with locations and focal mechanisms determined using the method of Nakano et al. (2008). In the 0.1-1 Hz microseism band, hypocenters were determined mainly on April 10th, when microseism noise was small and signal amplitudes were quite large. In several time windows, events were detected in all four bands and located within the 2-sigma error ellipses, with similar source time functions. Sometimes events were detected in only two or three bands, suggesting wide variation in wave radiation at different frequencies. Although the location errors are not always small enough to confirm the collocation of sources, due to uncertainty in structure, we can confirm that seismic waves are radiated in the microseism band from slow earthquakes, which are considered a continuous, broadband, and complicated phenomenon.
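The envelope correlation detection step can be sketched as follows, under simple assumptions (synthetic traces, an assumed 100 Hz sampling rate, and a fourth-order Butterworth bandpass); this illustrates the Ide (2010) idea rather than the authors' exact implementation.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert, correlate

fs = 100.0  # sampling rate in Hz (assumed)

def envelope(trace, band, fs):
    """Bandpass-filter a trace and return its amplitude envelope."""
    sos = butter(4, band, btype="bandpass", fs=fs, output="sos")
    return np.abs(hilbert(sosfiltfilt(sos, trace)))

def arrival_time_difference(tr1, tr2, band, fs):
    """Lag (s) maximizing the cross-correlation of the two station envelopes."""
    e1, e2 = envelope(tr1, band, fs), envelope(tr2, band, fs)
    e1 -= e1.mean(); e2 -= e2.mean()
    cc = correlate(e1, e2, mode="full")
    lag = np.argmax(cc) - (len(e2) - 1)
    return lag / fs

# Repeat the analysis in the four separated bands used in the study.
rng = np.random.default_rng(0)
tr1, tr2 = rng.standard_normal(6000), rng.standard_normal(6000)  # placeholder records
for band in [(0.1, 1.0), (1.0, 2.0), (2.0, 4.0), (4.0, 8.0)]:
    dt = arrival_time_difference(tr1, tr2, band, fs)
    print(f"band {band} Hz: dt = {dt:+.2f} s")
```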
Spelling Errors of Dyslexic Children in Bosnian Language With Transparent Orthography.
Duranović, Mirela
The purpose of this study was to explore the nature of spelling errors made by children with dyslexia in the Bosnian language, which has a transparent orthography. Three main error categories were distinguished: phonological, orthographic, and grammatical errors. An analysis of error type showed 86% phonological errors, 10% orthographic errors, and 4% grammatical errors. Furthermore, the majority of errors were omissions and substitutions, followed by insertions, omissions of the rules of assimilation by voicing, and errors in the use of suffixes. We can conclude that phonological errors were dominant in children with dyslexia at all grade levels.
Computations of Vertical Displacement Events with Toroidal Asymmetry
NASA Astrophysics Data System (ADS)
Sovinec, C. R.; Bunkers, K. J.
2017-10-01
Nonlinear numerical MHD modeling with the NIMROD code [https://nimrodteam.org] is being developed to investigate asymmetry during vertical displacement events. We start from idealized up/down symmetric tokamak equilibria with small levels of imposed toroidally asymmetric field errors. Vertical displacement results when removing current from one of the two divertor coils. The Eulerian reference-frame modeling uses temperature-dependent resistivity and anisotropic thermal conduction to distinguish the hot plasma region from surrounding cold, low-density conditions. Diffusion through a resistive wall is slow relative to Alfvenic scales but much faster than resistive plasma diffusion. Loss of the initial edge pressure and current distributions leads to a narrow layer of parallel current, which drives low-n modes that may be related to peeling-dominated ELMs. These modes induce toroidal asymmetry in the conduction current, which connects the simulated plasma to the wall. Work supported by the US DOE through Grant Numbers DE-FG02-06ER54850 and DE-FC02-08ER54975.
In-hospital fellow coverage reduces communication errors in the surgical intensive care unit.
Williams, Mallory; Alban, Rodrigo F; Hardy, James P; Oxman, David A; Garcia, Edward R; Hevelone, Nathanael; Frendl, Gyorgy; Rogers, Selwyn O
2014-06-01
Staff coverage strategies of intensive care units (ICUs) impact clinical outcomes. High-intensity staff coverage strategies are associated with lower morbidity and mortality. Accessible clinical expertise, teamwork, and effective communication have all been attributed to the success of this coverage strategy. We evaluated the impact of in-hospital fellow coverage (IHFC) on improving communication of cardiorespiratory events. This was a prospective observational study performed in an academic tertiary care center with high-intensity staff coverage. The main outcome measure was resident-to-fellow communication of cardiorespiratory events during IHFC vs home coverage (HC) periods. Three hundred twelve cardiorespiratory events were collected in 114 surgical ICU patients over 134 study days. Complete data were available for 306 events. One hundred three communication errors occurred. IHFC was associated with significantly better communication of events compared to HC (P<.0001). Residents communicated 89% of events during IHFC vs 51% of events during HC (P<.001). Communication patterns of junior and midlevel residents were similar. Midlevel residents communicated 68% of all on-call events (87% IHFC vs 50% HC, P<.001). Junior residents communicated 66% of events (94% IHFC vs 52% HC, P<.001). Communication errors were lower in all ICUs during IHFC (P<.001). IHFC reduced communication errors. Copyright © 2014 Elsevier Inc. All rights reserved.
Poon, Eric G; Cina, Jennifer L; Churchill, William W; Mitton, Patricia; McCrea, Michelle L; Featherstone, Erica; Keohane, Carol A; Rothschild, Jeffrey M; Bates, David W; Gandhi, Tejal K
2005-01-01
We performed a direct observation pre-post study to evaluate the impact of barcode technology on medication dispensing errors and potential adverse drug events in the pharmacy of a tertiary-academic medical center. We found that barcode technology significantly reduced the rate of target dispensing errors leaving the pharmacy by 85%, from 0.37% to 0.06%. The rate of potential adverse drug events (ADEs) due to dispensing errors was also significantly reduced by 63%, from 0.19% to 0.069%. In a 735-bed hospital where 6 million doses of medications are dispensed per year, this technology is expected to prevent about 13,000 dispensing errors and 6,000 potential ADEs per year. PMID:16779372
Improving Patient Safety With Error Identification in Chemotherapy Orders by Verification Nurses.
Baldwin, Abigail; Rodriguez, Elizabeth S
2016-02-01
The prevalence of medication errors associated with chemotherapy administration is not precisely known. Little evidence exists concerning the extent or nature of errors; however, some evidence demonstrates that errors are related to prescribing. This article demonstrates how the review of chemotherapy orders by a designated nurse known as a verification nurse (VN) at a National Cancer Institute-designated comprehensive cancer center helps to identify prescribing errors that may prevent chemotherapy administration mistakes and improve patient safety in outpatient infusion units. This article will describe the role of the VN and details of the verification process. To identify benefits of the VN role, a retrospective review and analysis of chemotherapy near-miss events from 2009-2014 was performed. A total of 4,282 events related to chemotherapy were entered into the Reporting to Improve Safety and Quality system. A majority of the events were categorized as near-miss events, or those that, because of chance, did not result in patient injury, and were identified at the point of prescribing.
NASA Astrophysics Data System (ADS)
Colins, Karen; Li, Liqian; Liu, Yu
2017-05-01
Mass production of widely used semiconductor digital integrated circuits (ICs) has lowered unit costs to the level of ordinary daily consumables of a few dollars. It is therefore reasonable to contemplate the idea of an engineered system that consumes unshielded low-cost ICs for the purpose of measuring gamma radiation dose. Underlying the idea is the premise of a measurable correlation between an observable property of ICs and radiation dose. Accumulation of radiation-damage-induced state changes or error events is such a property. If correct, the premise could make possible low-cost wide-area radiation dose measurement systems, instantiated as wireless sensor networks (WSNs) with unshielded consumable ICs as nodes, communicating error events to a remote base station. The premise has been investigated quantitatively for the first time in laboratory experiments and related analyses performed at the Canadian Nuclear Laboratories. State changes or error events were recorded in real time during irradiation of samples of ICs of different types in a 60Co gamma cell. From the error-event sequences, empirical distribution functions of dose were generated. The distribution functions were inverted and probabilities scaled by total error events, to yield plots of the relationship between dose and error tallies. Positive correlation was observed, and discrete functional dependence of dose quantiles on error tallies was measured, demonstrating the correctness of the premise. The idea of an engineered system that consumes unshielded low-cost ICs in a WSN, for the purpose of measuring gamma radiation dose over wide areas, is therefore tenable.
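The inversion from error-event sequences to dose quantiles described above amounts to building and then inverting an empirical distribution function; a toy illustration with fabricated dose values follows.

```python
import numpy as np

# Hypothetical cumulative doses (Gy) at which successive error events were
# recorded for one IC sample during irradiation (placeholder values).
event_doses = np.array([12.0, 19.5, 24.1, 30.7, 34.2, 41.8, 45.0, 52.3])
n = len(event_doses)

# Empirical distribution function of dose at error events, with probabilities
# scaled by the total number of error events, as in the study.
tallies = np.arange(1, n + 1)      # error tally after each event
quantiles = tallies / n            # scaled probability

# Inverting the EDF: each error tally maps to a dose quantile, giving the
# dose-vs-error-tally relationship a WSN node could report to a base station.
for k, q, d in zip(tallies, quantiles, event_doses):
    print(f"after {k} error events: dose quantile {q:.2f} reached at {d:.1f} Gy")
```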
Flouri, Eirini; Panourgia, Constantina
2011-06-01
The aim of this study was to test for gender differences in how negative cognitive errors (overgeneralizing, catastrophizing, selective abstraction, and personalizing) mediate the association between adverse life events and adolescents' emotional and behavioural problems (measured with the Strengths and Difficulties Questionnaire). The sample consisted of 202 boys and 227 girls (aged 11-15 years) from three state secondary schools in disadvantaged areas in one county in the South East of England. Control variables were age, ethnicity, special educational needs, exclusion history, family structure, family socio-economic disadvantage, and verbal cognitive ability. Adverse life events were measured with Tiet et al.'s (1998) Adverse Life Events Scale. For both genders, we assumed a pathway from adverse life events to emotional and behavioural problems via cognitive errors. We found no gender differences in life adversity, cognitive errors, total difficulties, peer problems, or hyperactivity. In both boys and girls, even after adjustment for controls, cognitive errors were related to total difficulties and emotional symptoms, and life adversity was related to total difficulties and conduct problems. The life adversity/conduct problems association was not explained by negative cognitive errors in either gender. However, we found gender differences in how adversity and cognitive errors produced hyperactivity and internalizing problems. In particular, life adversity was not related, after adjustment for controls, to hyperactivity in girls and to peer problems and emotional symptoms in boys. Cognitive errors fully mediated the effect of life adversity on hyperactivity in boys and on peer and emotional problems in girls.
[Memorization of Sequences of Movements of the Right and the Left Hand by Right- and Left-Handers].
Bobrova, E V; Bogacheva, I N; Lyakhovetskii, V A; Fabinskaja, A A; Fomina, E V
2015-01-01
We analyzed the errors of right- and left-handers when performing memorized sequences with the left or the right hand during a task that activates positional coding: after 6-10 repetitions the order of movements changed, while the positions remained the same throughout the task. The task was first performed by one ("initial") hand, and then by the other ("continuing") hand; there were two groups of right-handers and two groups of left-handers. It was found that the pattern of errors during task performance by the initial hand is similar in right- and left-handers, both for the dominant and the non-dominant hand. The information about the previous positions after changing the order of elements is used in sequences performed by the subdominant hand but not in sequences performed by the dominant one. After changing the hand, right- and left-handers show different ("non-symmetrical") patterns of errors. Thus, the errors of right- and left-handers are "symmetrical" at the early stages of task performance, while the transfer of this motor skill in right- and left-handers occurs in different ways.
Assiri, Ghadah Asaad; Shebl, Nada Atef; Mahmoud, Mansour Adam; Aloudah, Nouf; Grant, Elizabeth; Aljadhey, Hisham; Sheikh, Aziz
2018-05-05
To investigate the epidemiology of medication errors and error-related adverse events in adults in primary care, ambulatory care and patients' homes. Systematic review. Six international databases were searched for publications between 1 January 2006 and 31 December 2015. Two researchers independently extracted data from eligible studies and assessed their quality using established instruments. Synthesis of data was informed by an appreciation of the medicines management process and the conceptual framework from the International Classification for Patient Safety. 60 studies met the inclusion criteria, of which 53 studies focused on medication errors, 3 on error-related adverse events and 4 on risk factors only. The prevalence of prescribing errors was reported in 46 studies: prevalence estimates ranged widely from 2% to 94%. Inappropriate prescribing was the most common type of error reported. Only one study reported the prevalence of monitoring errors, finding that incomplete therapeutic/safety laboratory-test monitoring occurred in 73% of patients. The incidence of preventable adverse drug events (ADEs) was estimated as 15/1000 person-years, the prevalence of drug-drug interaction-related adverse drug reactions as 7% and the prevalence of preventable ADEs as 0.4%. A number of patient, healthcare professional and medication-related risk factors were identified, including the number of medications used by the patient, increased patient age, the number of comorbidities, use of anticoagulants, cases where more than one physician was involved in patients' care and care being provided by family physicians/general practitioners. A very wide variation in the medication error and error-related adverse event rates is reported in the studies, reflecting heterogeneity in the populations studied, study designs employed and outcomes evaluated. This review has identified important limitations and discrepancies in the methodologies used and gaps in the literature on the epidemiology and outcomes of medication errors in community settings. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Developmental Changes in Error Monitoring: An Event-Related Potential Study
ERIC Educational Resources Information Center
Wiersema, Jan R.; van der Meere, Jacob J.; Roeyers, Herbert
2007-01-01
The aim of the study was to investigate the developmental trajectory of error monitoring. For this purpose, children (age 7-8), young adolescents (age 13-14) and adults (age 23-24) performed a Go/No-Go task and were compared on overt reaction time (RT) performance and on event-related potentials (ERPs), thought to reflect error detection…
NASA Astrophysics Data System (ADS)
Heath, J. T.; Chafer, C. J.; van Ogtrop, F. F.; Bishop, T. F. A.
2014-11-01
Wildfire is a recurring event which has been acknowledged in the literature to impact the hydrological cycle of a catchment. Hence, wildfire may have a significant impact on water yield levels within a catchment. In Australia, studies of the effect of fire on water yield have been limited to obligate seeder vegetation communities. These communities regenerate from seed banks in the ground or within woody fruits and are generally activated by fire. In contrast, the Sydney Basin is dominated by obligate resprouter communities. These communities regenerate from fire-resistant buds found on the plant and are generally found in regions where wildfire is a regular occurrence. The 2001/2002 wildfires in the Sydney Basin provided an opportunity to investigate the impacts of wildfire on water yield in a number of catchments dominated by obligate resprouting communities. The overall aim of this study was to investigate whether there was a difference in water yield post-wildfire. Four burnt subcatchments and three control subcatchments were assessed. A generalized additive model was calibrated using pre-wildfire data and then used to predict post-wildfire water yield from post-wildfire predictor data. The model errors were analysed, and it was found that the errors for all subcatchments showed similar trends for the post-wildfire period. This finding demonstrates that wildfires within the Sydney Basin have no significant medium-term impact on water yield.
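A minimal sketch of this calibrate-then-predict design, assuming the pygam package for the generalized additive model; the predictors, data, and smoother structure are placeholders, not the study's specification.

```python
import numpy as np
from pygam import LinearGAM, s

# Placeholder predictors (e.g., rainfall and potential evapotranspiration)
# with monthly water yield as the response.
rng = np.random.default_rng(1)
X_pre = rng.uniform(0, 200, size=(120, 2))   # pre-wildfire months
y_pre = 0.4 * X_pre[:, 0] - 0.1 * X_pre[:, 1] + rng.normal(0, 5, 120)
X_post = rng.uniform(0, 200, size=(60, 2))   # post-wildfire months
y_post = 0.4 * X_post[:, 0] - 0.1 * X_post[:, 1] + rng.normal(0, 5, 60)

# Calibrate the additive model on pre-wildfire data only.
gam = LinearGAM(s(0) + s(1)).fit(X_pre, y_pre)

# Predict post-wildfire yield and inspect the errors: a persistent bias in
# burnt subcatchments, absent in controls, would indicate a wildfire effect.
errors = y_post - gam.predict(X_post)
print(f"mean post-fire model error: {errors.mean():.2f}")
```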
Hühn, M; Piepho, H P
2003-03-01
Tests for linkage are usually performed using the lod score method. A critical question in linkage analyses is the choice of sample size. The appropriate sample size depends on the desired type-I error and power of the test. This paper investigates the exact type-I error and power of the lod score method in a segregating F2 population with co-dominant markers and a qualitative monogenic dominant-recessive trait. For illustration, a disease-resistance trait is considered, where the susceptible allele is recessive. A procedure is suggested for finding the appropriate sample size. It is shown that recessive plants have about twice the information content of dominant plants, so the former should be preferred for linkage detection. In some cases the exact alpha-values for a given nominal alpha may be rather small due to the discrete nature of the sampling distribution in small samples. We show that a gain in power is possible by using exact methods.
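The flavor of the exact calculation can be conveyed with a simpler design than the paper's F2 analysis: n informative meioses scored as recombinant or not, with the exact type-I error and power of a LOD >= 3 criterion obtained by summing binomial probabilities over the rejection region. All numbers below are illustrative.

```python
import numpy as np
from scipy.stats import binom

def lod(n, r, theta):
    """LOD score for r recombinants among n informative meioses."""
    return r * np.log10(theta) + (n - r) * np.log10(1.0 - theta) - n * np.log10(0.5)

def max_lod(n, r):
    """LOD maximized over theta in [0, 0.5]; the MLE is min(r/n, 0.5)."""
    theta = min(r / n, 0.5)
    if theta == 0.0:
        return n * np.log10(2.0)  # limit of the LOD as theta -> 0 when r = 0
    return lod(n, r, theta)

def exact_alpha_and_power(n, true_theta, threshold=3.0):
    """Exact type-I error and power of the LOD >= threshold test."""
    reject = np.array([max_lod(n, r) >= threshold for r in range(n + 1)])
    pmf_null = binom.pmf(np.arange(n + 1), n, 0.5)
    pmf_alt = binom.pmf(np.arange(n + 1), n, true_theta)
    return pmf_null[reject].sum(), pmf_alt[reject].sum()

# Scan sample sizes; the discrete sampling distribution makes the exact alpha
# much smaller than the nominal level in small samples, as the paper notes.
for n in (20, 30, 40, 60):
    alpha, power = exact_alpha_and_power(n, true_theta=0.1)
    print(f"n={n}: exact alpha={alpha:.2e}, power={power:.2f}")
```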
Stress priming in reading and the selective modulation of lexical and sub-lexical pathways.
Colombo, Lucia; Zevin, Jason
2009-09-29
Four experiments employed a priming methodology to investigate different mechanisms of stress assignment and how they are modulated by lexical and sub-lexical mechanisms in reading aloud in Italian. Lexical stress is unpredictable in Italian, and requires lexical look-up. The most frequent stress pattern (Dominant) is on the penultimate syllable [laVOro (work)], while stress on the antepenultimate syllable [MAcchina (car)] is relatively less frequent (non-Dominant). Word and pseudoword naming responses primed by words with non-dominant stress--which require whole-word knowledge to be read correctly--were compared to those primed by nonwords. The percentage of errors on words and the percentage of dominant stress responses to nonwords were measured. In Experiments 1 and 2, stress errors increased for non-dominant stress words primed by nonwords, as compared to when they were primed by words. The results could be attributed to greater activation of sub-lexical codes, and an associated tendency to assign the dominant stress pattern by default, in the nonword prime condition. Alternatively, they may have been the consequence of prosodic priming, inducing more errors on trials in which the stress pattern of primes and targets was not congruent. The two interpretations were investigated in Experiments 3 and 4. The results overall suggested a limited role of the default metrical pattern in word pronunciation, and showed a clear effect of prosodic priming, but only when the sub-lexical mechanism prevailed.
Errors, near misses and adverse events in the emergency department: what can patients tell us?
Friedman, Steven M; Provan, David; Moore, Shannon; Hanneman, Kate
2008-09-01
We sought to determine whether patients or their families could identify adverse events in the emergency department (ED), to characterize patient reports of errors and to compare patient reports to events recorded by health care providers. This was a prospective cohort study in a quaternary care inner city teaching hospital with approximately 40,000 annual visits. ED patients were recruited for participation in a standardized interview within 24 hours of ED discharge and a follow-up interview 3-7 days after discharge. Responses regarding events were tabulated and compared with physician and nurse notations in the medical record and hospital event reporting system. Of 292 eligible patients, 201 (69%) were interviewed within 24 hours of ED discharge, and 143 (71% of interviewees) underwent a follow-up interview 3-7 days after discharge. Interviewees did not differ from the base ED population in terms of age, sex or language. Analysis of patient interviews identified 10 adverse events (5% incident rate; 95% confidence interval [CI] 2.41%-8.96%), 8 near misses (4% incident rate; 95% CI 1.73%-7.69%) and no medical errors. Of the 10 adverse events, 6 (60%) were characterized as preventable (2 raters; kappa=0.78, standard error [SE] 0.20; 95% CI 0.39-1.00; p=0.01). Adverse events were primarily related to delayed or inadequate analgesia. Only 4 out of 8 (50%) near misses were intercepted by hospital personnel. The secondary interview elicited 2 out of 10 adverse events and 3 out of 8 near misses that had not been identified in the primary interview. No designation (0 out of 10) of an adverse event was recorded in the ED medical record or in the confidential hospital event reporting system. ED patients can identify adverse events affecting their care. Moreover, many of these events are not recorded in the medical record. Engaging patients and their family members in identification of errors may enhance patient safety.
Threat and error management for anesthesiologists: a predictive risk taxonomy
Ruskin, Keith J.; Stiegler, Marjorie P.; Park, Kellie; Guffey, Patrick; Kurup, Viji; Chidester, Thomas
2015-01-01
Purpose of review: Patient care in the operating room is a dynamic interaction that requires cooperation among team members and reliance upon sophisticated technology. Most human factors research in medicine has been focused on analyzing errors and implementing system-wide changes to prevent them from recurring. We describe a set of techniques that has been used successfully by the aviation industry to analyze errors and adverse events and explain how these techniques can be applied to patient care. Recent findings: Threat and error management (TEM) describes adverse events in terms of risks or challenges that are present in an operational environment (threats) and the actions of specific personnel that potentiate or exacerbate those threats (errors). TEM is a technique widely used in aviation, and can be adapted for use in a medical setting to predict high-risk situations and prevent errors in the perioperative period. A threat taxonomy is a novel way of classifying and predicting the hazards that can occur in the operating room. TEM can be used to identify error-producing situations, analyze adverse events, and design training scenarios. Summary: TEM offers a multifaceted strategy for identifying hazards, reducing errors, and training physicians. A threat taxonomy may improve analysis of critical events with subsequent development of specific interventions, and may also serve as a framework for training programs in risk mitigation. PMID:24113268
Error-related negativities elicited by monetary loss and cues that predict loss.
Dunning, Jonathan P; Hajcak, Greg
2007-11-19
Event-related potential studies have reported error-related negativity following both error commission and feedback indicating errors or monetary loss. The present study examined whether error-related negativities could be elicited by a predictive cue presented prior to both the decision and subsequent feedback in a gambling task. Participants were presented with a cue that indicated the probability of reward on the upcoming trial (0, 50, and 100%). Results showed a negative deflection in the event-related potential in response to loss cues compared with win cues; this waveform shared a similar latency and morphology with the traditional feedback error-related negativity.
SEC proton prediction model: verification and analysis.
Balch, C C
1999-06-01
This paper describes a model that has been used at the NOAA Space Environment Center since the early 1970s as a guide for the prediction of solar energetic particle events. The algorithms for proton event probability, peak flux, and rise time are described. The predictions are compared with observations. The current model shows some ability to distinguish between proton event associated flares and flares that are not associated with proton events. The comparisons of predicted and observed peak flux show considerable scatter, with an rms error of almost an order of magnitude. Rise time comparisons also show scatter, with an rms error of approximately 28 h. The model algorithms are analyzed using historical data and improvements are suggested. Implementation of the algorithm modifications reduces the rms error in the log10 of the flux prediction by 21%, and the rise time rms error by 31%. Improvements are also realized in the probability prediction by deriving the conditional climatology for proton event occurrence given flare characteristics.
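The headline verification metric, the rms error of the log10 peak flux, is straightforward to reproduce; a sketch with placeholder predictions and observations follows.

```python
import numpy as np

# Placeholder peak proton fluxes (pfu): model predictions vs. observations.
predicted = np.array([120.0, 15.0, 2300.0, 45.0, 600.0])
observed = np.array([300.0, 8.0, 500.0, 90.0, 250.0])

# rms error in log10(flux); a value near 1.0 corresponds to the roughly
# order-of-magnitude scatter reported for the model.
rms_log_flux = np.sqrt(np.mean((np.log10(predicted) - np.log10(observed)) ** 2))
print(f"rms error in log10(flux): {rms_log_flux:.2f}")
```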
Oh, Eric J; Shepherd, Bryan E; Lumley, Thomas; Shaw, Pamela A
2018-04-15
For time-to-event outcomes, a rich literature exists on the bias introduced by covariate measurement error in regression models, such as the Cox model, and methods of analysis to address this bias. By comparison, less attention has been given to understanding the impact or addressing errors in the failure time outcome. For many diseases, the timing of an event of interest (such as progression-free survival or time to AIDS progression) can be difficult to assess or reliant on self-report and therefore prone to measurement error. For linear models, it is well known that random errors in the outcome variable do not bias regression estimates. With nonlinear models, however, even random error or misclassification can introduce bias into estimated parameters. We compare the performance of 2 common regression models, the Cox and Weibull models, in the setting of measurement error in the failure time outcome. We introduce an extension of the SIMEX method to correct for bias in hazard ratio estimates from the Cox model and discuss other analysis options to address measurement error in the response. A formula to estimate the bias induced into the hazard ratio by classical measurement error in the event time for a log-linear survival model is presented. Detailed numerical studies are presented to examine the performance of the proposed SIMEX method under varying levels and parametric forms of the error in the outcome. We further illustrate the method with observational data on HIV outcomes from the Vanderbilt Comprehensive Care Clinic. Copyright © 2017 John Wiley & Sons, Ltd.
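A compact sketch of the SIMEX idea applied to error in the event time, assuming the lifelines package for the Cox fit; the multiplicative error model, the error variance, the covariate name "x", and the quadratic extrapolation are illustrative choices, not the authors' exact procedure.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

def simex_cox(df, sigma2, lambdas=(0.5, 1.0, 1.5, 2.0), n_sim=50, seed=0):
    """SIMEX for a Cox model with additive error in the log event time.

    df must have columns 'time', 'event', and a covariate 'x'. Remeasurement:
    add extra noise with variance lambda * sigma2, refit, average over
    simulations, then extrapolate the coefficient back to lambda = -1.
    """
    rng = np.random.default_rng(seed)
    naive = CoxPHFitter().fit(df, duration_col="time", event_col="event")
    betas = [naive.params_["x"]]
    for lam in lambdas:
        fits = []
        for _ in range(n_sim):
            noisy = df.copy()
            extra = rng.normal(0.0, np.sqrt(lam * sigma2), len(df))
            noisy["time"] = np.maximum(np.exp(np.log(noisy["time"]) + extra), 1e-6)
            fits.append(CoxPHFitter().fit(noisy, "time", "event").params_["x"])
        betas.append(np.mean(fits))
    # Quadratic extrapolation of beta(lambda) to the error-free point lambda = -1.
    lams = np.concatenate([[0.0], lambdas])
    return np.polyval(np.polyfit(lams, betas, 2), -1.0)
```

A call such as `simex_cox(df, sigma2=0.04)` would return the extrapolated log hazard ratio, to be compared with the naive estimate.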
Purpora, Christina; Blegen, Mary A; Stotts, Nancy A
2015-01-01
To test hypotheses from a model of horizontal violence and the quality and safety of patient care: that horizontal violence (negative behavior among peers) is inversely related to peer relations and quality of care, and positively related to errors and adverse events. Additionally, the associations between horizontal violence, peer relations, quality of care, errors and adverse events, and nurse and work characteristics were determined. A random sample (n = 175) of hospital staff Registered Nurses working in California participated via survey. Bivariate and multivariate analyses tested the study hypotheses. The hypotheses were supported: horizontal violence was inversely related to peer relations and quality of care, and positively related to errors and adverse events. Including peer relations in the analyses altered the relationship between horizontal violence and quality of care, but not between horizontal violence and errors and adverse events. Nurse and hospital characteristics were not related to the other variables. Clinical area contributed significantly to predicting the quality of care, errors, and adverse events, but not peer relationships. Horizontal violence affects peer relationships and the quality and safety of patient care as perceived by participating nurses. Supportive peer relationships are important to mitigate the impact of horizontal violence on quality of care.
Effect of bar-code technology on the safety of medication administration.
Poon, Eric G; Keohane, Carol A; Yoon, Catherine S; Ditmore, Matthew; Bane, Anne; Levtzion-Korach, Osnat; Moniz, Thomas; Rothschild, Jeffrey M; Kachalia, Allen B; Hayes, Judy; Churchill, William W; Lipsitz, Stuart; Whittemore, Anthony D; Bates, David W; Gandhi, Tejal K
2010-05-06
Serious medication errors are common in hospitals and often occur during order transcription or administration of medication. To help prevent such errors, technology has been developed to verify medications by incorporating bar-code verification technology within an electronic medication-administration system (bar-code eMAR). We conducted a before-and-after, quasi-experimental study in an academic medical center that was implementing the bar-code eMAR. We assessed rates of errors in order transcription and medication administration on units before and after implementation of the bar-code eMAR. Errors that involved early or late administration of medications were classified as timing errors and all others as nontiming errors. Two clinicians reviewed the errors to determine their potential to harm patients and classified those that could be harmful as potential adverse drug events. We observed 14,041 medication administrations and reviewed 3082 order transcriptions. Observers noted 776 nontiming errors in medication administration on units that did not use the bar-code eMAR (an 11.5% error rate) versus 495 such errors on units that did use it (a 6.8% error rate)--a 41.4% relative reduction in errors (P<0.001). The rate of potential adverse drug events (other than those associated with timing errors) fell from 3.1% without the use of the bar-code eMAR to 1.6% with its use, representing a 50.8% relative reduction (P<0.001). The rate of timing errors in medication administration fell by 27.3% (P<0.001), but the rate of potential adverse drug events associated with timing errors did not change significantly. Transcription errors occurred at a rate of 6.1% on units that did not use the bar-code eMAR but were completely eliminated on units that did use it. Use of the bar-code eMAR substantially reduced the rate of errors in order transcription and in medication administration as well as potential adverse drug events, although it did not eliminate such errors. Our data show that the bar-code eMAR is an important intervention to improve medication safety. (ClinicalTrials.gov number, NCT00243373.) 2010 Massachusetts Medical Society
Development of an FAA-EUROCONTROL technique for the analysis of human error in ATM : final report.
DOT National Transportation Integrated Search
2002-07-01
Human error has been identified as a dominant risk factor in safety-oriented industries such as air traffic control (ATC). However, little is known about the factors leading to human errors in current air traffic management (ATM) systems. The first s...
Interval sampling methods and measurement error: a computer simulation.
Wirth, Oliver; Slaven, James; Taylor, Matthew A
2014-01-01
A simulation study was conducted to provide a more thorough account of measurement error associated with interval sampling methods. A computer program simulated the application of momentary time sampling, partial-interval recording, and whole-interval recording methods on target events randomly distributed across an observation period. The simulation yielded measures of error for multiple combinations of observation period, interval duration, event duration, and cumulative event duration. The simulations were conducted up to 100 times to yield measures of error variability. Although the present simulation confirmed some previously reported characteristics of interval sampling methods, it also revealed many new findings that pertain to each method's inherent strengths and weaknesses. The analysis and resulting error tables can help guide the selection of the most appropriate sampling method for observation-based behavioral assessments. © Society for the Experimental Analysis of Behavior.
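The simulation logic is easy to reproduce in outline; the sketch below scores randomly placed fixed-duration events with the three interval methods and reports the signed error of each prevalence estimate (all parameter values are placeholders).

```python
import numpy as np

def simulate(method, obs_len=600.0, interval=10.0, n_events=20,
             event_dur=2.0, n_reps=100, seed=0):
    """Mean and SD of (estimate - true prevalence) for one sampling method."""
    rng = np.random.default_rng(seed)
    edges = np.arange(0.0, obs_len + interval, interval)
    errors = []
    for _ in range(n_reps):
        starts = rng.uniform(0.0, obs_len - event_dur, n_events)
        ends = starts + event_dur
        true_prev = n_events * event_dur / obs_len
        scored = []
        for lo, hi in zip(edges[:-1], edges[1:]):
            overlap = np.minimum(ends, hi) - np.maximum(starts, lo)
            occupied = overlap[overlap > 0.0].sum()
            if method == "momentary":   # event ongoing at interval onset?
                scored.append(np.any((starts <= lo) & (ends > lo)))
            elif method == "partial":   # any occurrence within the interval
                scored.append(occupied > 0.0)
            elif method == "whole":     # occupied for the entire interval
                scored.append(occupied >= (hi - lo))
        errors.append(np.mean(scored) - true_prev)
    return np.mean(errors), np.std(errors)

# Partial-interval recording tends to overestimate and whole-interval to
# underestimate prevalence, two of the known properties examined in the study.
for m in ("momentary", "partial", "whole"):
    bias, sd = simulate(m)
    print(f"{m:9s}: mean error {bias:+.3f} (SD {sd:.3f})")
```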
Adverse Drug Events caused by Serious Medication Administration Errors
Sawarkar, Abhivyakti; Keohane, Carol A.; Maviglia, Saverio; Gandhi, Tejal K; Poon, Eric G
2013-01-01
OBJECTIVE To determine how often serious or life-threatening medication administration errors with the potential to cause patient harm (potential adverse drug events) result in actual patient harm (adverse drug events (ADEs)) in the hospital setting. DESIGN Retrospective chart review of clinical events that transpired following observed medication administration errors. BACKGROUND Medication errors are common at the medication administration stage for hospitalized patients. While many of these errors are considered capable of causing patient harm, it is not clear how often patients are actually harmed by these errors. METHODS In a previous study in which 14,041 medication administrations in an acute-care hospital were directly observed, investigators discovered 1271 medication administration errors, of which 133 had the potential to cause serious or life-threatening harm to patients and were considered serious or life-threatening potential ADEs. In the current study, clinical reviewers conducted detailed chart reviews of cases where a serious or life-threatening potential ADE occurred to determine whether an actual ADE developed following the potential ADE. Reviewers further assessed the severity of the ADE and its attribution to the administration error. RESULTS Ten (7.5% [95% C.I. 6.98, 8.01]) actual ADEs resulted from the 133 serious and life-threatening potential ADEs, of which six resulted in significant, three in serious, and one in life-threatening injury. Therefore, 4 (3% [95% C.I. 2.12, 3.6]) serious and life-threatening potential ADEs led to serious or life-threatening ADEs. Half of the ten actual ADEs were caused by dosage or monitoring errors for anti-hypertensives. The life-threatening ADE was caused by an error that was both a transcription and a timing error. CONCLUSION Potential ADEs at the medication administration stage can cause serious patient harm. Given previous estimates of 1.33 serious or life-threatening potential ADEs per 100 medication doses administered, in a hospital where 6 million doses are administered per year, about 4000 preventable ADEs would be attributable to medication administration errors annually. PMID:22791691
Measurement of the bottom hadron lifetime at the Z0 resonance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fujino, Donald Hideo
1992-06-01
We have measured the bottom hadron lifetime from bb̄ events produced at the Z0 resonance. Using the precision vertex detectors of the Mark II detector at the Stanford Linear Collider, we developed an impact parameter tag to identify bottom hadrons. The vertex tracking system resolved impact parameters to 30 μm for high momentum tracks, and 70 μm for tracks with a momentum of 1 GeV. We selected B hadrons with an efficiency of 40% and a sample purity of 80%, by requiring that there be at least two tracks in a single jet that significantly miss the Z0 decay vertex. From a total of 208 hadronic Z0 events collected by the Mark II detector in 1990, we tagged 53 jets, of which 22 came from 11 double-tagged events. The jets opposite the tagged ones, referred to as the "untagged" sample, are rich in B hadrons and unbiased in B decay times. The variable Σδ is the sum of impact parameters from tracks in the jet, and contains vital information on the B decay time. We measured the B lifetime from a one-parameter likelihood fit to the untagged Σδ distribution, obtaining τ_b = 1.53 +0.55/−0.45 ± 0.16 ps, which agrees with the current world average. The first error is statistical and the second is systematic. The systematic error was dominated by uncertainties in the track resolution function. As a check, we also obtained consistent results using the Σδ distribution from the tagged jets and from the entire hadronic sample without any bottom enrichment.
Danielson, Patrick; Yang, Limin; Jin, Suming; Homer, Collin G.; Napton, Darrell
2016-01-01
We developed a method that analyzes the quality of the cultivated cropland class mapped in the USA National Land Cover Database (NLCD) 2006. The method integrates multiple geospatial datasets and a Multi Index Integrated Change Analysis (MIICA) change detection method that captures spectral changes to identify the spatial distribution and magnitude of potential commission and omission errors for the cultivated cropland class in NLCD 2006. The majority of the commission and omission errors in NLCD 2006 are in areas where cultivated cropland is not the most dominant land cover type. The errors are primarily attributed to the less accurate training dataset derived from the National Agricultural Statistics Service Cropland Data Layer dataset. In contrast, error rates are low in areas where cultivated cropland is the dominant land cover. Agreement between model-identified commission errors and independently interpreted reference data was high (79%). Agreement was low (40%) for omission error comparison. The majority of the commission errors in the NLCD 2006 cultivated crops were confused with low-intensity developed classes, while the majority of omission errors were from herbaceous and shrub classes. Some errors were caused by inaccurate land cover change from misclassification in NLCD 2001 and the subsequent land cover post-classification process.
Changes in medical errors after implementation of a handoff program.
Starmer, Amy J; Spector, Nancy D; Srivastava, Rajendu; West, Daniel C; Rosenbluth, Glenn; Allen, April D; Noble, Elizabeth L; Tse, Lisa L; Dalal, Anuj K; Keohane, Carol A; Lipsitz, Stuart R; Rothschild, Jeffrey M; Wien, Matthew F; Yoon, Catherine S; Zigmont, Katherine R; Wilson, Karen M; O'Toole, Jennifer K; Solan, Lauren G; Aylor, Megan; Bismilla, Zia; Coffey, Maitreya; Mahant, Sanjay; Blankenburg, Rebecca L; Destino, Lauren A; Everhart, Jennifer L; Patel, Shilpa J; Bale, James F; Spackman, Jaime B; Stevenson, Adam T; Calaman, Sharon; Cole, F Sessions; Balmer, Dorene F; Hepps, Jennifer H; Lopreiato, Joseph O; Yu, Clifton E; Sectish, Theodore C; Landrigan, Christopher P
2014-11-06
Miscommunications are a leading cause of serious medical errors. Data from multicenter studies assessing programs designed to improve handoff of information about patient care are lacking. We conducted a prospective intervention study of a resident handoff-improvement program in nine hospitals, measuring rates of medical errors, preventable adverse events, and miscommunications, as well as resident workflow. The intervention included a mnemonic to standardize oral and written handoffs, handoff and communication training, a faculty development and observation program, and a sustainability campaign. Error rates were measured through active surveillance. Handoffs were assessed by means of evaluation of printed handoff documents and audio recordings. Workflow was assessed through time-motion observations. The primary outcome had two components: medical errors and preventable adverse events. In 10,740 patient admissions, the medical-error rate decreased by 23% from the preintervention period to the postintervention period (24.5 vs. 18.8 per 100 admissions, P<0.001), and the rate of preventable adverse events decreased by 30% (4.7 vs. 3.3 events per 100 admissions, P<0.001). The rate of nonpreventable adverse events did not change significantly (3.0 and 2.8 events per 100 admissions, P=0.79). Site-level analyses showed significant error reductions at six of nine sites. Across sites, significant increases were observed in the inclusion of all prespecified key elements in written documents and oral communication during handoff (nine written and five oral elements; P<0.001 for all 14 comparisons). There were no significant changes from the preintervention period to the postintervention period in the duration of oral handoffs (2.4 and 2.5 minutes per patient, respectively; P=0.55) or in resident workflow, including patient-family contact and computer time. Implementation of the handoff program was associated with reductions in medical errors and in preventable adverse events and with improvements in communication, without a negative effect on workflow. (Funded by the Office of the Assistant Secretary for Planning and Evaluation, U.S. Department of Health and Human Services, and others.).
NASA Astrophysics Data System (ADS)
Duan, Y.; Wilson, A. M.; Barros, A. P.
2014-10-01
A diagnostic analysis of the space-time structure of error in Quantitative Precipitation Estimates (QPE) from the Precipitation Radar (PR) on the Tropical Rainfall Measurement Mission (TRMM) satellite is presented here in preparation for the Integrated Precipitation and Hydrology Experiment (IPHEx) in 2014. IPHEx is the first NASA ground-validation field campaign after the launch of the Global Precipitation Measurement (GPM) satellite. In anticipation of GPM, a science-grade high-density raingauge network was deployed at mid to high elevations in the Southern Appalachian Mountains, USA since 2007. This network allows for direct comparison between ground-based measurements from raingauges and satellite-based QPE (specifically, PR 2A25 V7 using 5 years of data 2008-2013). Case studies were conducted to characterize the vertical profiles of reflectivity and rain rate retrievals associated with large discrepancies with respect to ground measurements. The spatial and temporal distribution of detection errors (false alarm, FA, and missed detection, MD) and magnitude errors (underestimation, UND, and overestimation, OVR) for stratiform and convective precipitation are examined in detail toward elucidating the physical basis of retrieval error. The diagnostic error analysis reveals that detection errors are linked to persistent stratiform light rainfall in the Southern Appalachians, which explains the high occurrence of FAs throughout the year, as well as the diurnal MD maximum at midday in the cold season (fall and winter), and especially in the inner region. Although UND dominates the magnitude error budget, underestimation of heavy rainfall conditions accounts for less than 20% of the total consistent with regional hydrometeorology. The 2A25 V7 product underestimates low level orographic enhancement of rainfall associated with fog, cap clouds and cloud to cloud feeder-seeder interactions over ridges, and overestimates light rainfall in the valleys by large amounts, though this behavior is strongly conditioned by the coarse spatial resolution (5 km) of the terrain topography mask used to remove ground clutter effects. Precipitation associated with small-scale systems (< 25 km2) and isolated deep convection tends to be underestimated, which we attribute to non-uniform beam-filling effects due to spatial averaging of reflectivity at the PR resolution. Mixed precipitation events (i.e., cold fronts and snow showers) fall into OVR or FA categories, but these are also the types of events for which observations from standard ground-based raingauge networks are more likely subject to measurement uncertainty, that is raingauge underestimation errors due to under-catch and precipitation phase. Overall, the space-time structure of the errors shows strong links among precipitation, envelope orography, landform (ridge-valley contrasts), and local hydrometeorological regime that is strongly modulated by the diurnal cycle, pointing to three major error causes that are inter-related: (1) representation of concurrent vertically and horizontally varying microphysics; (2) non uniform beam filling (NUBF) effects and ambiguity in the detection of bright band position; and (3) spatial resolution and ground clutter correction.
NASA Astrophysics Data System (ADS)
Duan, Y.; Wilson, A. M.; Barros, A. P.
2015-03-01
A diagnostic analysis of the space-time structure of error in quantitative precipitation estimates (QPEs) from the precipitation radar (PR) on the Tropical Rainfall Measurement Mission (TRMM) satellite is presented here in preparation for the Integrated Precipitation and Hydrology Experiment (IPHEx) in 2014. IPHEx is the first NASA ground-validation field campaign after the launch of the Global Precipitation Measurement (GPM) satellite. In anticipation of GPM, a science-grade high-density raingauge network was deployed at mid to high elevations in the southern Appalachian Mountains, USA, since 2007. This network allows for direct comparison between ground-based measurements from raingauges and satellite-based QPE (specifically, PR 2A25 Version 7 using 5 years of data 2008-2013). Case studies were conducted to characterize the vertical profiles of reflectivity and rain rate retrievals associated with large discrepancies with respect to ground measurements. The spatial and temporal distribution of detection errors (false alarm, FA; missed detection, MD) and magnitude errors (underestimation, UND; overestimation, OVR) for stratiform and convective precipitation are examined in detail toward elucidating the physical basis of retrieval error. The diagnostic error analysis reveals that detection errors are linked to persistent stratiform light rainfall in the southern Appalachians, which explains the high occurrence of FAs throughout the year, as well as the diurnal MD maximum at midday in the cold season (fall and winter) and especially in the inner region. Although UND dominates the error budget, underestimation of heavy rainfall conditions accounts for less than 20% of the total, consistent with regional hydrometeorology. The 2A25 V7 product underestimates low-level orographic enhancement of rainfall associated with fog, cap clouds and cloud to cloud feeder-seeder interactions over ridges, and overestimates light rainfall in the valleys by large amounts, though this behavior is strongly conditioned by the coarse spatial resolution (5 km) of the topography mask used to remove ground-clutter effects. Precipitation associated with small-scale systems (< 25 km2) and isolated deep convection tends to be underestimated, which we attribute to non-uniform beam-filling effects due to spatial averaging of reflectivity at the PR resolution. Mixed precipitation events (i.e., cold fronts and snow showers) fall into OVR or FA categories, but these are also the types of events for which observations from standard ground-based raingauge networks are more likely subject to measurement uncertainty, that is raingauge underestimation errors due to undercatch and precipitation phase. Overall, the space-time structure of the errors shows strong links among precipitation, envelope orography, landform (ridge-valley contrasts), and a local hydrometeorological regime that is strongly modulated by the diurnal cycle, pointing to three major error causes that are inter-related: (1) representation of concurrent vertically and horizontally varying microphysics; (2) non-uniform beam filling (NUBF) effects and ambiguity in the detection of bright band position; and (3) spatial resolution and ground-clutter correction.
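The four error categories used in this diagnostic analysis can be reproduced from matched satellite-gauge pairs with a simple classification; the rain/no-rain threshold below is an assumed value, not the one used in the study.

```python
import numpy as np

def classify_errors(satellite, gauge, rain_threshold=0.1):
    """Split satellite-vs-gauge rain-rate pairs (mm/h) into the four classes:
    false alarm (FA), missed detection (MD), underestimation (UND),
    and overestimation (OVR)."""
    sat_rain = satellite >= rain_threshold
    gauge_rain = gauge >= rain_threshold
    return {
        "FA": int(np.sum(sat_rain & ~gauge_rain)),
        "MD": int(np.sum(~sat_rain & gauge_rain)),
        "UND": int(np.sum(sat_rain & gauge_rain & (satellite < gauge))),
        "OVR": int(np.sum(sat_rain & gauge_rain & (satellite > gauge))),
    }

# Placeholder matched PR-pixel / raingauge values.
sat = np.array([0.0, 1.2, 0.0, 3.5, 0.4, 8.0])
gauge = np.array([0.5, 0.0, 0.0, 6.0, 0.4, 2.0])
print(classify_errors(sat, gauge))
```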
Design and Construction of a Vertex Chamber and Measurement of the Average Beta-Hadron Lifetime
NASA Astrophysics Data System (ADS)
Nelson, Harry Norman
Four parameters describe the mixing of the three quark generations in the Standard Model of the weak charged current interaction. These four parameters are experimental inputs to the model. A measurement of the mean lifetime of hadrons containing b-quarks, or B-Hadrons, constrains the magnitudes of two of these parameters. Measurement of the B-Hadron lifetime requires a device that can measure the locations of the stable particles that result from B-Hadron decay. This device must function reliably in an inaccessible location, and survive high radiation levels. We describe the design and construction of such a device, a gaseous drift chamber. Tubes of 6.9 mm diameter, having aluminized mylar walls of 100 μm thickness, are utilized in this Vertex Chamber. It achieves a spatial resolution of 45 μm, and a resolution in extrapolation to the B-Hadron decay location of 87 μm. Its inner layer is 4.6 cm from the e+e− colliding beams. The Vertex Chamber is situated within the MAC detector at PEP. We have analyzed both the 94 pb−1 of integrated luminosity accumulated at √s = 29 GeV with the Vertex Chamber in place as well as the 210 pb−1 accumulated previously. We require a lepton with large momentum transverse to the event thrust axis to obtain a sample of events enriched in B-Hadron decays. The distribution of signed impact parameters of all tracks in these events is used to measure the B-Hadron flight distance, and hence lifetime. The trimmed mean signed impact parameters are 130 ± 19 μm for data accumulated with the Vertex Chamber, and 162 ± 25 μm for the previous data. Together these indicate an average B-Hadron lifetime of τ_b = (1.37 +0.22/−0.19 stat. ± 0.11 sys.) × (1 ± 0.15 sys.) ps. We separate additive and multiplicative systematic errors because the second does not degrade the statistical significance of the difference of the result from 0. If b → c dominates b-quark decay, the corresponding weak mixing matrix element |V_cb| = 0.047 ± 0.006 ± 0.005, where the first error is from this experiment, and the second is theoretical uncertainty. If b → u dominates, |V_ub| = 0.033 ± 0.004 ± 0.12.
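The trimmed-mean statistic quoted above is simple to reproduce; a sketch using scipy, with placeholder impact parameters and an assumed trimming fraction.

```python
import numpy as np
from scipy.stats import trim_mean

# Placeholder signed impact parameters (micrometers) from tracks in
# lepton-tagged jets; the trimming fraction is an assumed choice.
deltas = np.array([210.0, -40.0, 95.0, 480.0, 15.0, -120.0, 330.0, 60.0])

# The trimmed mean discards the chosen fraction from each tail, reducing
# sensitivity to poorly measured tracks before the mean signed impact
# parameter is converted into a B-Hadron flight distance and lifetime.
print(f"trimmed mean: {trim_mean(deltas, proportiontocut=0.1):.1f} um")
```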
ERIC Educational Resources Information Center
Taha, Haitham; Ibrahim, Raphiq; Khateb, Asaid
2014-01-01
The dominant error types were investigated as a function of phonological processing (PP) deficit severity in four groups of impaired readers. For this aim, an error analysis paradigm distinguishing between four error types was used. The findings revealed that the different types of impaired readers were characterized by differing predominant error…
Event-related potentials in response to violations of content and temporal event knowledge.
Drummer, Janna; van der Meer, Elke; Schaadt, Gesa
2016-01-08
Scripts that store knowledge of everyday events are fundamentally important for managing daily routines. Content event knowledge (i.e., knowledge about which events belong to a script) and temporal event knowledge (i.e., knowledge about the chronological order of events in a script) constitute qualitatively different forms of knowledge. However, there is limited information about each distinct process and the time course involved in accessing content and temporal event knowledge. Therefore, we analyzed event-related potentials (ERPs) in response to either correctly presented event sequences or event sequences that contained a content or temporal error. We found an N400, which was followed by a posteriorly distributed P600 in response to content errors in event sequences. By contrast, we did not find an N400 but an anteriorly distributed P600 in response to temporal errors in event sequences. Thus, the N400 seems to be elicited as a response to a general mismatch between an event and the established event model. We assume that the expectancy violation of content event knowledge, as indicated by the N400, induces the collapse of the established event model, a process indicated by the posterior P600. The expectancy violation of temporal event knowledge is assumed to induce an attempt to reorganize the event model in working memory, a process indicated by the frontal P600. Copyright © 2015 Elsevier Ltd. All rights reserved.
Ibrahim, Abdulrasheed; Garba, Ekundayo Stephen; Asuku, Malachy Eneye
2012-01-01
Surgery in sub-Saharan Africa is widely known to be done against a background of poverty and illiteracy, late presentation with complicated pathologies, and a desperate lack of infrastructure. In addition, patient autonomy and self-determination are strongly shaped by cultural practices and religious beliefs. Any of these factors can influence the pattern and disclosure of adverse events and errors. Their impact on the relationships between surgeons and patients, and between health institutions and patients, must be considered, as it may affect disclosure of and response to errors. This article identifies the peculiar socioeconomic and cultural challenges that may hinder disclosure and proposes strategies for instituting disclosure of errors and adverse events in sub-Saharan Africa.
NASA Astrophysics Data System (ADS)
Ogashawara, Igor; Mishra, Deepak R.; Nascimento, Renata F. F.; Alcântara, Enner H.; Kampel, Milton; Stech, Jose L.
2016-12-01
Quasi-analytical algorithms (QAAs) are based on radiative transfer equations and have been used to derive inherent optical properties (IOPs) from the above-surface remote sensing reflectance (Rrs) in aquatic systems in which phytoplankton is the dominant optically active constituent (OAC). However, colored dissolved organic matter (CDOM) and non-algal particles (NAP) can also be dominant OACs in water bodies, and until now a QAA had not been parametrized for these aquatic systems. In this study, we compared the performance of three widely used QAAs in two CDOM-dominated aquatic systems; all three were unsuccessful in retrieving the spectral shape of the IOPs and produced minimum errors of 350% for the total absorption coefficient (a), 39% for the colored dissolved matter absorption coefficient (aCDM), and 7566.33% for the phytoplankton absorption coefficient (aphy). We re-parameterized a QAA for CDOM-dominated waters (hereafter QAACDOM), which was able not only to recover the spectral shape of the OACs' absorption coefficients but also to bring the error magnitude to a reasonable level. The average errors found for the 400-750 nm range were 30.71% and 14.51% for a, 14.89% and 8.95% for aCDM, and 25.90% and 29.76% for aphy in the Funil and Itumbiara reservoirs, Brazil, respectively. Although QAACDOM shows significant promise for retrieving IOPs in CDOM-dominated waters, the results indicate that further tuning is needed in the estimation of a(λ) and aphy(λ). Successful retrieval of the absorption coefficients by QAACDOM would be very useful in monitoring the spatio-temporal variability of IOPs in CDOM-dominated waters.
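The analytical steps common to the QAA family can be sketched compactly; the g0 and g1 constants below are those commonly quoted for QAA version 5, and the Rrs spectrum is a placeholder.

```python
import numpy as np

# Placeholder above-surface remote sensing reflectance spectrum (sr^-1).
wavelengths = np.array([412, 443, 490, 555, 665, 709])
Rrs = np.array([0.0018, 0.0021, 0.0028, 0.0045, 0.0031, 0.0022])

# Step 0: convert above-surface to below-surface reflectance.
rrs = Rrs / (0.52 + 1.7 * Rrs)

# Step 1: u = bb / (a + bb), from the quadratic in rrs
# (g0, g1 as commonly used in QAA v5).
g0, g1 = 0.089, 0.1245
u = (-g0 + np.sqrt(g0**2 + 4.0 * g1 * rrs)) / (2.0 * g1)

# Later steps anchor total absorption at a reference band and partition a(λ)
# into aphy and aCDM; those are the steps a CDOM-oriented re-parameterization
# such as QAACDOM must revise for CDOM-dominated waters.
print(dict(zip(wavelengths.tolist(), np.round(u, 3))))
```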
Convergent models of handedness and brain lateralization
Sainburg, Robert L.
2014-01-01
The pervasive nature of handedness across human history and cultures is a salient consequence of brain lateralization. This paper presents evidence that provides a structure for understanding the motor control processes that give rise to handedness. According to the Dynamic Dominance Model, the left hemisphere (in right-handers) is proficient at processes that predict the effects of body and environmental dynamics, while the right hemisphere is proficient at impedance control processes that can minimize potential errors when faced with unexpected mechanical conditions and can achieve accurate steady-state positions. This model can be viewed as a motor component of the paradigm of brain lateralization proposed by Rogers et al. (MacNeilage et al., 2009), which is based upon evidence from a wide range of behaviors across many vertebrate species. Rogers proposed a left-hemisphere specialization for well-established patterns of behavior performed under familiar environmental conditions, and a right-hemisphere specialization for responding to unforeseen environmental events. The dynamic dominance hypothesis provides a framework for understanding the biology of motor lateralization that is consistent with Rogers' paradigm of brain lateralization. PMID:25339923
Identifying isotropic events using a regional moment tensor inversion
Ford, Sean R.; Dreger, Douglas S.; Walter, William R.
2009-01-17
We calculate the deviatoric and isotropic source components for 17 explosions at the Nevada Test Site, as well as 12 earthquakes and 3 collapses in the surrounding region of the western United States, using a regional time-domain full-waveform inversion for the complete moment tensor. The events separate into specific populations according to their deviation from a pure double-couple and their ratio of isotropic to deviatoric energy. The separation allows for anomalous event identification and discrimination between explosions, earthquakes, and collapses. Confidence regions of the model parameters are estimated from the data misfit by assuming normally distributed parameter values. We investigate the sensitivity of the resolved parameters of an explosion to imperfect Earth models, inaccurate event depths, and data with low signal-to-noise ratio (SNR), assuming a reasonable azimuthal distribution of stations. In the band of interest (0.02–0.10 Hz), the source type calculated from complete moment tensor inversion is insensitive to velocity model perturbations that cause less than a half-cycle shift (<5 s) in arrival-time error if shifting of the waveforms is allowed. The explosion source type is insensitive to an incorrect depth assumption (for a true depth of 1 km), and the goodness of fit of the inversion result cannot be used to resolve the true depth of the explosion. Noise degrades the explosive character of the result; a good fit and an accurate result are obtained when the SNR is greater than 5. We assess the depth and frequency dependence of the resolved explosive moment. As the depth decreases from 1 km to 200 m, the isotropic moment is no longer accurately resolved and is in error by between 50 and 200%. Furthermore, even at the shallowest depth, the resultant moment tensor is dominated by the explosive component when the data have a good SNR.
Medical Errors Reduction Initiative
2009-03-01
…enough data was collected to have any statistical significance or determine impact on latent error in the process of blood transfusion. Bedside… of adverse drug events. JAMA 1995; 274: 35-43. Leape, L.L., Brennan, T.A., & Laird, N.M. (1991) The nature of adverse events in hospitalized… Background: Medical errors are a significant cause of morbidity and mortality among hospitalized patients (Kohn, Corrigan and Donaldson, 2000; Leape, Brennan…
Nikolic, Mark I; Sarter, Nadine B
2007-08-01
To examine operator strategies for diagnosing and recovering from errors and disturbances as well as the impact of automation design and time pressure on these processes. Considerable efforts have been directed at error prevention through training and design. However, because errors cannot be eliminated completely, their detection, diagnosis, and recovery must also be supported. Research has focused almost exclusively on error detection. Little is known about error diagnosis and recovery, especially in the context of event-driven tasks and domains. With a confederate pilot, 12 airline pilots flew a 1-hr simulator scenario that involved three challenging automation-related tasks and events that were likely to produce erroneous actions or assessments. Behavioral data were compared with a canonical path to examine pilots' error and disturbance management strategies. Debriefings were conducted to probe pilots' system knowledge. Pilots seldom followed the canonical path to cope with the scenario events. Detection of a disturbance was often delayed. Diagnostic episodes were rare because of pilots' knowledge gaps and time criticality. In many cases, generic inefficient recovery strategies were observed, and pilots relied on high levels of automation to manage the consequences of an error. Our findings describe and explain the nature and shortcomings of pilots' error management activities. They highlight the need for improved automation training and design to achieve more timely detection, accurate explanation, and effective recovery from errors and disturbances. Our findings can inform the design of tools and techniques that support disturbance management in various complex, event-driven environments.
Challenge and Error: Critical Events and Attention-Related Errors
ERIC Educational Resources Information Center
Cheyne, James Allan; Carriere, Jonathan S. A.; Solman, Grayden J. F.; Smilek, Daniel
2011-01-01
Attention lapses resulting from reactivity to task challenges and their consequences constitute a pervasive factor affecting everyday performance errors and accidents. A bidirectional model of attention lapses (error ⇄ attention-lapse: Cheyne, Solman, Carriere, & Smilek, 2009) argues that errors beget errors by generating attention…
Data driven CAN node reliability assessment for manufacturing system
NASA Astrophysics Data System (ADS)
Zhang, Leiming; Yuan, Yong; Lei, Yong
2017-01-01
The reliability of the Controller Area Network (CAN) is critical to the performance and safety of the system. However, direct bus-off time assessment tools are lacking in practice due to the inaccessibility of node information and the complexity of node interactions upon errors. In order to measure the mean time to bus-off (MTTB) of all the nodes, a novel data-driven node bus-off time assessment method for CAN networks is proposed that directly uses network error information. First, the corresponding network error event sequence for each node is constructed using multiple-layer network error information. Then, a generalized zero-inflated Poisson process (GZIP) model is established for each node based on the error event sequence. Finally, the stochastic model is constructed to predict the MTTB of the node. Accelerated case studies with different error injection rates are conducted on a laboratory network to demonstrate the proposed method, where the network errors are generated by a computer-controlled error injection system. Experiment results show that the MTTB of nodes predicted by the proposed method agrees well with observations in the case studies. The proposed data-driven node time to bus-off assessment method for CAN networks can successfully predict the MTTB of nodes by directly using network error event data.
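The GZIP formulation itself is not given in the abstract, so as a rough illustration of the pipeline — fit a zero-inflated Poisson to per-interval error counts for one node, then estimate MTTB by Monte Carlo simulation of the node's transmit error counter — here is a minimal sketch. The counter rules (+8 per error, −1 per error-free interval, bus-off at 256) follow the CAN error-confinement rules in simplified form; the data, moment-based fit, and time step are assumptions, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_zip(counts):
    """Moment-based fit of a zero-inflated Poisson to error counts.
    Returns (pi, lam): zero-inflation probability and Poisson rate,
    using mean m = (1-pi)*lam and var/mean = 1 + pi*lam."""
    m, v = counts.mean(), counts.var()
    lam = m + v / m - 1.0          # identity: lam = m + pi*lam
    pi = 1.0 - m / lam
    return min(max(pi, 0.0), 1.0), max(lam, 1e-9)

def mttb(pi, lam, dt=0.1, n_runs=200):
    """Monte Carlo mean time to bus-off: the transmit error counter
    rises by 8 per error and falls by 1 per clean interval; bus-off
    is declared at 256 (simplified CAN error-confinement rules)."""
    times = []
    for _ in range(n_runs):
        tec, t = 0, 0.0
        while tec < 256:
            errs = 0 if rng.random() < pi else rng.poisson(lam)
            tec = max(0, tec + 8 * errs - (1 if errs == 0 else 0))
            t += dt
        times.append(t)
    return np.mean(times)

# Hypothetical per-interval error counts observed for one node.
counts = rng.poisson(0.8, size=500) * (rng.random(500) > 0.6)
pi_hat, lam_hat = fit_zip(counts)
print(f"pi={pi_hat:.2f}, lam={lam_hat:.2f}, MTTB ~ {mttb(pi_hat, lam_hat):.1f} s")
```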
Impact of Measurement Error on Synchrophasor Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Yilu; Gracia, Jose R.; Ewing, Paul D.
2015-07-01
Phasor measurement units (PMUs), a type of synchrophasor, are powerful diagnostic tools that can help avert catastrophic failures in the power grid. Because of this, PMU measurement errors are particularly worrisome. This report examines the internal and external factors contributing to PMU phase angle and frequency measurement errors and gives a reasonable explanation for them. It also analyzes the impact of those measurement errors on several synchrophasor applications: event location detection, oscillation detection, islanding detection, and dynamic line rating. The primary finding is that dynamic line rating is the most likely to be influenced by measurement error. Other findings include the possibility of reporting nonoscillatory activity as an oscillation as a result of error, failing to detect oscillations submerged by error, and the unlikely impact of error on event location and islanding detection.
Schulz, Christian M; Burden, Amanda; Posner, Karen L; Mincer, Shawn L; Steadman, Randolph; Wagner, Klaus J; Domino, Karen B
2017-08-01
Situational awareness errors may play an important role in the genesis of patient harm. The authors examined closed anesthesia malpractice claims for death or brain damage to determine the frequency and type of situational awareness errors. Surgical and procedural anesthesia death and brain damage claims in the Anesthesia Closed Claims Project database were analyzed. Situational awareness error was defined as failure to perceive relevant clinical information, failure to comprehend the meaning of available information, or failure to project, anticipate, or plan. Patient and case characteristics, primary damaging events, and anesthesia payments in claims with situational awareness errors were compared to other death and brain damage claims from 2002 to 2013. Anesthesiologist situational awareness errors contributed to death or brain damage in 198 of 266 claims (74%). Respiratory system damaging events were more common in claims with situational awareness errors (56%) than in other claims (21%, P < 0.001). The most common specific respiratory events in error claims were inadequate oxygenation or ventilation (24%), difficult intubation (11%), and aspiration (10%). Payments were made in 85% of situational awareness error claims compared to 46% of other claims (P = 0.001), with no significant difference in payment size. Among the 198 claims with anesthesia situational awareness error, perception errors were most common (42%), whereas comprehension errors (29%) and projection errors (29%) were relatively less common. Situational awareness error definitions were operationalized for reliable application to real-world anesthesia cases. Situational awareness errors may have contributed to catastrophic outcomes in three quarters of recent anesthesia malpractice claims. Situational awareness errors resulting in death or brain damage remain prevalent causes of malpractice claims in the 21st century.
Open Label Extension of ISIS 301012 (Mipomersen) to Treat Familial Hypercholesterolemia
2016-08-01
Lipid Metabolism, Inborn Errors; Hypercholesterolemia, Autosomal Dominant; Hyperlipidemias; Metabolic Diseases; Hyperlipoproteinemia Type II; Metabolism, Inborn Errors; Genetic Diseases, Inborn; Infant, Newborn, Diseases; Metabolic Disorder; Congenital Abnormalities; Hypercholesterolemia; Hyperlipoproteinemias; Dyslipidemias; Lipid Metabolism Disorders
Luu, Phan; Tucker, Don M; Makeig, Scott
2004-08-01
The error-related negativity (ERN) is an event-related potential (ERP) peak occurring between 50 and 100 ms after the commission of a speeded motor response that the subject immediately realizes to be in error. The ERN is believed to index brain processes that monitor action outcomes. Our previous analyses of ERP and EEG data suggested that the ERN is dominated by partial phase-locking of intermittent theta-band EEG activity. In this paper, this possibility is further evaluated. The possibility that the ERN is produced by phase-locking of theta-band EEG activity was examined by analyzing the single-trial EEG traces from a forced-choice speeded response paradigm before and after applying theta-band (4-7 Hz) filtering and by comparing the averaged and single-trial phase-locked (ERP) and non-phase-locked (other) EEG data. Electrical source analyses were used to estimate the brain sources involved in the generation of the ERN. Beginning just before incorrect button presses in a speeded choice response paradigm, midfrontal theta-band activity increased in amplitude and became partially and transiently phase-locked to the subject's motor response, accounting for 57% of ERN peak amplitude. The portion of the theta-EEG activity increase remaining after subtracting the response-locked ERP from each trial was larger and longer lasting after error responses than after correct responses, extending on average 400 ms beyond the ERN peak. Multiple equivalent-dipole source analysis suggested 3 possible equivalent dipole sources of the theta-bandpassed ERN, while the scalp distribution of non-phase-locked theta amplitude suggested the presence of additional frontal theta-EEG sources. These results appear consistent with a body of research that demonstrates a relationship between limbic theta activity and action regulation, including error monitoring and learning.
Hazard Assessment from Storm Tides and Rainfall on a Tidal River Estuary
NASA Technical Reports Server (NTRS)
Orton, P.; Conticello, F.; Cioffi, F.; Hall, T.; Georgas, N.; Lall, U.; Blumberg, A.
2015-01-01
Here, we report on methods and results for a model-based flood hazard assessment we have conducted for the Hudson River from New York City to Troy/Albany at the head of tide. Our recent work showed that neglecting freshwater flows leads to underestimation of peak water levels at up-river sites, and that neglecting stratification (typical with two-dimensional modeling) leads to underestimation all along the Hudson. As a result, we use a three-dimensional hydrodynamic model and merge streamflows and storm tides from tropical and extratropical cyclones (TCs, ETCs), as well as wet extratropical cyclone (WETC) floods (e.g., freshets, rain-on-snow events). We validate the modeled flood levels and quantify error with comparisons to 76 historical events. A Bayesian statistical method is developed for tropical cyclone streamflows using historical data, consisting of the evaluation of (1) the peak discharge and its pdf as a function of TC characteristics, and (2) the temporal trend of the hydrograph as a function of the temporal evolution of the cyclone track, its intensity, and the response characteristics of the specific basin. A k-nearest-neighbors method is employed to determine the hydrograph shape, and out-of-sample validation tests demonstrate the effectiveness of the method. Thus, the combined effects of storm surge and runoff produced by tropical cyclones hitting the New York area can be included in flood hazard assessment. Results for the upper Hudson (Albany) suggest a dominance of WETCs; for the lower Hudson (at New York Harbor), ETCs are dominant at shorter return periods while TCs are more important at longer return periods (over 150 years); and for the middle Hudson (Poughkeepsie) a mix of all three flood event types is important. However, a possible low bias for TC flood levels is inferred from their lower importance in the assessment results versus historical event top-20 lists, and this will be further evaluated as these preliminary methods and results are finalized. Future funded work will quantify the influences of sea level rise and flood adaptation plans (e.g., surge barriers). It would also be valuable to examine how streamflows from tropical cyclones and wet cool-season storms will change, as this factor will dominate at upriver locations.
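The k-nearest-neighbors step is named but not specified in the abstract; a minimal sketch of the usual idea — find historical TCs with the most similar storm characteristics and resample one of their normalized hydrograph shapes — might look like this (the feature set, resampling kernel, and all names are assumptions):

```python
import numpy as np

def knn_hydrograph(tc_features, hist_features, hist_hydrographs, k=5, rng=None):
    """Pick a unit-peak hydrograph shape for a new tropical cyclone by
    k-nearest-neighbors over historical storm characteristics
    (e.g., intensity, track distance, forward speed -- assumed features)."""
    rng = rng or np.random.default_rng()
    # Standardize so no single characteristic dominates the distance.
    mu, sd = hist_features.mean(axis=0), hist_features.std(axis=0)
    d = np.linalg.norm((hist_features - mu) / sd - (tc_features - mu) / sd, axis=1)
    neighbors = np.argsort(d)[:k]
    w = 1.0 / np.arange(1, k + 1)          # rank-inverse resampling kernel
    chosen = rng.choice(neighbors, p=w / w.sum())
    return hist_hydrographs[chosen]

# The returned shape would then be scaled by a peak discharge drawn from
# the Bayesian pdf conditioned on the TC characteristics, per the abstract.
```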
Analyzing temozolomide medication errors: potentially fatal.
Letarte, Nathalie; Gabay, Michael P; Bressler, Linda R; Long, Katie E; Stachnik, Joan M; Villano, J Lee
2014-10-01
The EORTC-NCIC regimen for glioblastoma requires different dosing of temozolomide (TMZ) during radiation and maintenance therapy. This complexity is exacerbated by the availability of multiple TMZ capsule strengths. TMZ is an alkylating agent, and the major toxicity of this class is dose-related myelosuppression. Inadvertent overdose can be fatal. The websites of the Institute for Safe Medication Practices (ISMP) and the Food and Drug Administration (FDA) MedWatch database were reviewed. We searched the MedWatch database for adverse events associated with TMZ and obtained all reports, including hematologic toxicity, submitted from 1 November 1997 to 30 May 2012. The ISMP describes errors with TMZ resulting from the positioning of information on the label of the commercial product: the strength and quantity of capsules on the label were in close proximity to each other, and this has since been changed by the manufacturer. MedWatch identified 45 medication errors. Patient errors were the most common, accounting for 21 (47%) of errors, followed by dispensing errors (13; 29%). Seven reports (16%) were errors in the prescribing of TMZ. Reported outcomes ranged from reversible hematological adverse events (13%) to hospitalization for other adverse events (13%) or death (18%). Four error reports lacked detail and could not be categorized. Although the FDA issued a warning in 2003 regarding fatal medication errors, and the product label warns of overdosing, errors in TMZ dosing occur for various reasons and involve both healthcare professionals and patients. Overdosing errors can be fatal.
NASA Technical Reports Server (NTRS)
Perez, Christopher E.; Berg, Melanie D.; Friendlich, Mark R.
2011-01-01
The motivation for this work is to: (1) accurately characterize digital signal processor (DSP) core single-event effect (SEE) behavior; (2) test DSP cores across a large frequency range and across various input conditions; (3) isolate SEE analysis to the DSP cores alone; (4) interpret SEE analysis in terms of single-event upsets (SEUs) and single-event transients (SETs); and (5) provide flight missions with accurate estimates of DSP core error rates and error signatures.
Automation: Decision Aid or Decision Maker?
NASA Technical Reports Server (NTRS)
Skitka, Linda J.
1998-01-01
This study clarified that automation bias is something unique to automated decision-making contexts, and is not the result of a general tendency toward complacency. By comparing performance on exactly the same events on the same tasks with and without an automated decision aid, we were able to determine that at least the omission-error part of automation bias is due to the unique context created by having an automated decision aid, and is not a phenomenon that would occur even if people were not in an automated context. However, this study also revealed that having an automated decision aid did lead to modestly improved performance across all non-error events. Participants in the non-automated condition responded with 83.68% accuracy, whereas participants in the automated condition responded with 88.67% accuracy, across all events. Automated decision aids clearly led to better overall performance when they were accurate. People performed almost exactly at the level of reliability of the automation (which across events was 88% reliable). However, it is also clear that the presence of less than 100% accurate automated decision aids creates a context in which new kinds of errors in decision making can occur. Participants in the non-automated condition responded with 97% accuracy on the six "error" events, whereas participants in the automated condition had only a 65% accuracy rate when confronted with those same six events. In short, the presence of an AMA can lead to vigilance decrements that can lead to errors in decision making.
Reliability of Memories Protected by Multibit Error Correction Codes Against MBUs
NASA Astrophysics Data System (ADS)
Ming, Zhu; Yi, Xiao Li; Chang, Liu; Wei, Zhang Jian
2011-02-01
As technology scales, more and more memory cells can be placed on a die. Therefore, the probability that a single event induces multiple-bit upsets (MBUs) in adjacent memory cells increases. Generally, multibit error correction codes (MECCs) are effective approaches to mitigate MBUs in memories. In order to evaluate the robustness of protected memories, reliability models have been widely studied. Instead of irradiation experiments, such models can be used to quickly evaluate the reliability of memories early in the design. To build an accurate model, several situations should be considered. First, when MBUs are present in memories, the errors induced by several events may overlap each other, which is more frequent than in the single event upset (SEU) case. Furthermore, radiation experiments show that the probability of MBUs strongly depends on the angle of the radiation event. However, reliability models that consider both the overlap of multiple-bit errors and the angle of the radiation event have not been proposed in the present literature. In this paper, a more accurate model of memories with MECCs is presented. Both the overlap of multiple-bit errors and the angle of the event are considered in the model, which produces a more precise analysis in the calculation of mean time to failure (MTTF) for memory systems under MBUs. In addition, memories with and without scrubbing are analyzed in the proposed model. Finally, we evaluate the reliability of memories under MBUs in Matlab. The simulation results verify the validity of the proposed model.
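The paper's analytic model is not reproduced in the abstract; purely as an illustration of what such a calculation involves, this Monte Carlo sketch estimates the MTTF of a memory whose words are protected by a t-error-correcting code, with events of random multiplicity accumulating and occasionally overlapping in a word, and periodic scrubbing clearing accumulated errors. The event rate, multiplicity mix, and scrub period are assumed parameters, with angle dependence folded into the multiplicity mix.

```python
import numpy as np

rng = np.random.default_rng(1)

def mttf(words=1024, t=2, event_rate=1e-3, scrub=100.0, n_runs=200):
    """Monte Carlo MTTF for an MECC-protected memory under MBUs.
    t: correctable errors per word; scrub: scrubbing period, in the
    same time units as 1/event_rate."""
    times = []
    for _ in range(n_runs):
        errors = np.zeros(words, dtype=int)   # accumulated errors per word
        time, next_scrub = 0.0, scrub
        while True:
            time += rng.exponential(1.0 / event_rate)  # next upset event
            while time > next_scrub:          # scrubbing clears the memory
                errors[:] = 0
                next_scrub += scrub
            w = rng.integers(words)
            # Multiplicity: mostly single-bit, sometimes an MBU hitting
            # 2-4 cells of one word (angle dependence folded in here).
            errors[w] += rng.choice([1, 2, 3, 4], p=[0.7, 0.15, 0.1, 0.05])
            if errors[w] > t:                 # overlap exceeds code capacity
                times.append(time)
                break
    return np.mean(times)

print(f"estimated MTTF ~ {mttf():.0f} time units")
```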
A Linguistic Analysis of Errors in the Compositions of Arba Minch University Students
ERIC Educational Resources Information Center
Tizazu, Yoseph
2014-01-01
This study reports the dominant linguistic errors that occur in the written productions of Arba Minch University (hereafter AMU) students. A sample of paragraphs was collected over two years from students ranging from freshmen to graduating level. The sampled compositions were then coded, described, and explained using the error analysis method. Both…
NASA Astrophysics Data System (ADS)
Sinha, T.; Arumugam, S.
2012-12-01
Seasonal streamflow forecasts contingent on climate forecasts can be effectively utilized to update water management plans and optimize the generation of hydroelectric power. Streamflow in rainfall-runoff-dominated basins critically depends on forecasted precipitation, in contrast to snow-dominated basins, where initial hydrologic conditions (IHCs) are more important. Since precipitation forecasts from Atmosphere-Ocean General Circulation Models are available at coarse scale (~2.8° by 2.8°), spatial and temporal downscaling of such forecasts is required to implement land surface models, which typically run on finer spatial and temporal scales. Consequently, errors from multiple sources are introduced at various stages in predicting seasonal streamflow. Therefore, in this study, we address the following science questions: 1) How do we attribute the errors in monthly streamflow forecasts to various sources - (i) model errors, (ii) spatio-temporal downscaling, (iii) imprecise initial conditions, (iv) no forecasts, and (v) imprecise forecasts? 2) How do monthly streamflow forecast errors propagate with lead time over various seasons? In this study, the Variable Infiltration Capacity (VIC) model is calibrated over the Apalachicola River at Chattahoochee, FL, in the southeastern US and implemented with observed 1/8° daily forcings to estimate reference streamflow during 1981 to 2010. The VIC model is then forced with different schemes under updated IHCs prior to the forecasting period to estimate relative mean square errors due to: a) temporal disaggregation, b) spatial downscaling, c) Reverse Ensemble Streamflow Prediction (imprecise IHCs), d) ESP (no forecasts), and e) ECHAM4.5 precipitation forecasts. Finally, error propagation under the different schemes is analyzed for different lead times over different seasons.
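A minimal sketch of the attribution arithmetic implied here — the mean square error of each forecast scheme relative to the reference VIC simulation, so the error sources can be ranked — with all array and scheme names assumed:

```python
import numpy as np

def relative_mse(reference, scheme_flow):
    """MSE of a scheme's monthly streamflow against the reference run,
    normalized so schemes are comparable across seasons."""
    return np.mean((scheme_flow - reference) ** 2) / np.mean(reference ** 2)

# reference: flow from the VIC run with observed forcings;
# schemes: one run per error source (names are hypothetical).
# ranked = sorted(schemes.items(), key=lambda kv: relative_mse(reference, kv[1]))
```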
Impact of Extended-Duration Shifts on Medical Errors, Adverse Events, and Attentional Failures
Barger, Laura K; Ayas, Najib T; Cade, Brian E; Cronin, John W; Rosner, Bernard; Speizer, Frank E; Czeisler, Charles A
2006-01-01
Background A recent randomized controlled trial in critical-care units revealed that the elimination of extended-duration work shifts (≥24 h) reduces the rates of significant medical errors and polysomnographically recorded attentional failures. This raised the concern that the extended-duration shifts commonly worked by interns may contribute to the risk of medical errors being made, and perhaps to the risk of adverse events more generally. Our current study assessed whether extended-duration shifts worked by interns are associated with significant medical errors, adverse events, and attentional failures in a diverse population of interns across the United States. Methods and Findings We conducted a Web-based survey, across the United States, in which 2,737 residents in their first postgraduate year (interns) completed 17,003 monthly reports. The association between the number of extended-duration shifts worked in the month and the reporting of significant medical errors, preventable adverse events, and attentional failures was assessed using a case-crossover analysis in which each intern acted as his/her own control. Compared to months in which no extended-duration shifts were worked, during months in which between one and four extended-duration shifts and five or more extended-duration shifts were worked, the odds ratios of reporting at least one fatigue-related significant medical error were 3.5 (95% confidence interval [CI], 3.3–3.7) and 7.5 (95% CI, 7.2–7.8), respectively. The respective odds ratios for fatigue-related preventable adverse events, 8.7 (95% CI, 3.4–22) and 7.0 (95% CI, 4.3–11), were also increased. Interns working five or more extended-duration shifts per month reported more attentional failures during lectures, rounds, and clinical activities, including surgery, and reported 300% more fatigue-related preventable adverse events resulting in a fatality. Conclusions In our survey, extended-duration work shifts were associated with an increased risk of significant medical errors, adverse events, and attentional failures in interns across the United States. These results have important public policy implications for postgraduate medical education. PMID:17194188
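The study's odds ratios come from a case-crossover conditional analysis with each intern as his or her own control; purely for orientation, the textbook 2×2 odds ratio and its Wald 95% CI look like this (the counts below are hypothetical, not the study's data):

```python
import math

def odds_ratio(a, b, c, d):
    """OR and Wald 95% CI from a 2x2 table:
    a = months with errors and extended shifts, b = error-free months with shifts,
    c = months with errors, no shifts,          d = error-free months, no shifts."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)      # standard error of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

print(odds_ratio(120, 880, 40, 960))           # hypothetical counts
```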
Demura, Shinichi; Tada, Nobuhiko; Matsuzawa, Jinzaburo; Mikami, Hajime; Ohuchi, Tetsuhiko; Shirane, Hiroya; Nagasawa, Yoshinori; Uchiyama, Masanobu
2006-09-01
This study aimed to reveal the influence of gender, athletic events, and athletic experience on the subjective dominant hand and the dominant hand based on the laterality quotient (LQ), and to examine the validity of the Edinburgh Inventory (Oldfield, 1971). Males and females (n=3,726) living in 7 prefectures in Japan (age: 16-45 yrs) participated in this survey. Analysis was performed on 3,557 separate datasets with high reliability. The reliability of the survey was examined using a test-retest method with 100 people selected randomly from all participants; all of them provided the same answers to each question on both occasions. The influence of gender, event, and experience was examined for the subjective and LQ-based dominant hands. In addition, concordance rates between the subjective dominant hand and the LQ-based dominant hand were examined. Differences in concordance rates between the hands used in the 10 movement questions of the Inventory and the subjective dominant hand were tested using the χ2 test, and frequency differences among items were tested using Ryan's method (multiple comparisons). Significant gender differences were found between rates of the LQ-based dominant hand (males: 94.4%; females: 96.6%) and the subjective dominant hand (males: 91.6%; females: 94.0%), but the difference was only 2.0-4.0%. No significant differences were found among athletic events, between the two groups with different athletic experience, or by gender within each athletic event. The subjective dominant hand almost always agreed with the LQ-based dominant hand (complete concordance rate=0.96, kappa=0.67). Of the 10 question items, answers indicating no experience were found only for the item "Knife (without fork)". The "Toothbrush", "Broom (upper hand)", and "Opening box (lid)" items had significantly lower correspondence with the subjective dominant hand (79.7-87.0%) than the other items (92.1-95.7%). In conclusion, athletic experience appears to have little influence on handedness, although there is a slight gender difference. The subjective dominant hand almost always agrees with the dominant hand based on the Inventory. A more efficient handedness inventory may be constructed by excluding the above 4 items.
Debiasing affective forecasting errors with targeted, but not representative, experience narratives.
Shaffer, Victoria A; Focella, Elizabeth S; Scherer, Laura D; Zikmund-Fisher, Brian J
2016-10-01
To determine whether representative experience narratives (describing a range of possible experiences) or targeted experience narratives (targeting the direction of forecasting bias) can reduce affective forecasting errors, or errors in predictions of experiences. In Study 1, participants (N=366) were surveyed about their experiences with 10 common medical events. Those who had never experienced the event provided ratings of predicted discomfort and those who had experienced the event provided ratings of actual discomfort. Participants making predictions were randomly assigned to either the representative experience narrative condition or the control condition in which they made predictions without reading narratives. In Study 2, participants (N=196) were again surveyed about their experiences with these 10 medical events, but participants making predictions were randomly assigned to either the targeted experience narrative condition or the control condition. Affective forecasting errors were observed in both studies. These forecasting errors were reduced with the use of targeted experience narratives (Study 2) but not representative experience narratives (Study 1). Targeted, but not representative, narratives improved the accuracy of predicted discomfort. Public collections of patient experiences should favor stories that target affective forecasting biases over stories representing the range of possible experiences. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Vuk, Tomislav; Barišić, Marijan; Očić, Tihomir; Mihaljević, Ivanka; Šarlija, Dorotea; Jukić, Irena
2012-01-01
Background. Continuous and efficient error management, including procedures from error detection through resolution and prevention, is an important part of quality management in blood establishments. At the Croatian Institute of Transfusion Medicine (CITM), error management has been performed systematically since 2003. Materials and methods. Data derived from error management at the CITM during an 8-year period (2003–2010) formed the basis of this study. Throughout the study period, errors were reported to the Department of Quality Assurance. In addition to surveys and the necessary corrective activities, errors were analysed and classified according to the Medical Event Reporting System for Transfusion Medicine (MERS-TM). Results. During the study period, a total of 2,068 errors were recorded, including 1,778 (86.0%) in blood bank activities and 290 (14.0%) in blood transfusion services. As many as 1,744 (84.3%) errors were detected before issue of the product or service. Among the 324 errors identified upon release from the CITM, 163 (50.3%) were detected by customers and reported as complaints. In only five cases was an error detected after blood product transfusion, though without any harmful consequences for the patients. All errors were, therefore, evaluated as "near miss" and "no harm" events. Fifty-two (2.5%) errors were evaluated as high-risk events. With regard to blood bank activities, the highest proportion of errors occurred in the processes of labelling (27.1%) and blood collection (23.7%). With regard to blood transfusion services, errors related to blood product issuing prevailed (24.5%). Conclusion. This study shows that comprehensive management of errors, including near-miss errors, can generate data on the functioning of transfusion services, which is a precondition for the implementation of efficient corrective and preventive actions that will ensure further improvement of the quality and safety of transfusion treatment. PMID:22395352
Addressing Angular Single-Event Effects in the Estimation of On-Orbit Error Rates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, David S.; Swift, Gary M.; Wirthlin, Michael J.
2015-12-01
Our study describes the complications that angular direct ionization events introduce into space error rate predictions. In particular, the prevalence of multiple-cell upsets and a breakdown in the application of effective linear energy transfer in modern-scale devices can skew error rates approximated from currently available estimation models. Moreover, this paper highlights the importance of angular testing and proposes a methodology to extend existing error estimation tools to properly consider angular strikes in modern-scale devices. Finally, these techniques are illustrated with test data from a modern 28 nm SRAM-based device.
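For context, the "effective linear energy transfer" whose breakdown the study highlights is the standard cosine rescaling of an angled strike to an equivalent normal-incidence LET:

$$\mathrm{LET}_{\mathrm{eff}} = \frac{\mathrm{LET}_0}{\cos\theta},$$

where θ is the angle of incidence measured from the device normal; the paper's point is that in modern-scale devices this geometric approximation no longer captures angular direct-ionization effects.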
Medium-range Performance of the Global NWP Model
NASA Astrophysics Data System (ADS)
Kim, J.; Jang, T.; Kim, J.; Kim, Y.
2017-12-01
The medium-range performance of the global numerical weather prediction (NWP) model of the Korea Meteorological Administration (KMA) is investigated, based on prediction of the extratropical circulation. The mean square error can be expressed as the sum of the spatial variance of the discrepancy between forecasts and observations and the square of the mean error (ME). Thus, it is important to investigate the ME effect in order to understand model performance. The ME is obtained by subtracting the anomaly from the forecast difference against the real climatology. It is found that the global model suffers from a severe systematic ME in medium-range forecasts. The systematic ME is dominant throughout the entire troposphere in all months, and can explain up to 25% of the root-mean-square error. We also compare the extratropical ME distribution with those from other NWP centers; the models exhibit similar spatial ME structures. It is found that the spatial ME pattern is highly correlated with that of the anomaly, implying that the ME varies with season. For example, the correlation coefficient between ME and anomaly ranges from -0.51 to -0.85 across months. The pattern of the extratropical circulation also has a high correlation with the anomaly. The global model has trouble faithfully simulating extratropical cyclones and blockings in the medium-range forecast; in particular, it struggles to simulate anomalous events. If an anomalous period is chosen for a test-bed experiment, a large error due to the anomaly must therefore be expected.
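In symbols, with f the forecast, o the verifying observation, and an overbar denoting the spatial mean, the decomposition stated above is the standard identity (notation assumed, not taken from the abstract):

$$\mathrm{MSE}=\overline{(f-o)^{2}}=\underbrace{\overline{\big[(f-o)-\overline{f-o}\big]^{2}}}_{\text{spatial variance of the discrepancy}}+\underbrace{\big(\overline{f-o}\big)^{2}}_{\mathrm{ME}^{2}}$$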
Spraker, Matthew B; Fain, Robert; Gopan, Olga; Zeng, Jing; Nyflot, Matthew; Jordan, Loucille; Kane, Gabrielle; Ford, Eric
Incident learning systems (ILSs) are a popular strategy for improving safety in radiation oncology (RO) clinics, but few reports focus on the causes of errors in RO. The goal of this study was to test a causal factor taxonomy developed in 2012 by the American Association of Physicists in Medicine and adopted for use in the RO: Incident Learning System (RO-ILS). Three hundred event reports were randomly selected from an institutional ILS database and Safety in Radiation Oncology (SAFRON), an international ILS. The reports were split into 3 groups of 100 events each: low-risk institutional, high-risk institutional, and SAFRON. Three raters retrospectively analyzed each event for contributing factors using the American Association of Physicists in Medicine taxonomy. No events were described by a single causal factor (median, 7). The causal factor taxonomy was found to be applicable for all events, but 4 causal factors were not described in the taxonomy: linear accelerator failure (n = 3), hardware/equipment failure (n = 2), failure to follow through with a quality improvement intervention (n = 1), and workflow documentation was misleading (n = 1). The most common causal factor categories contributing to events were similar in all event types. The most common specific causal factor to contribute to events was a "slip causing physical error." Poor human factors engineering was the only causal factor found to contribute more frequently to high-risk institutional versus low-risk institutional events. The taxonomy in the study was found to be applicable for all events and may be useful in root cause analyses and future studies. Communication and human behaviors were the most common errors affecting all types of events. Poor human factors engineering was found to specifically contribute to high-risk more than low-risk institutional events, and may represent a strategy for reducing errors in all types of events. Copyright © 2017 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
Parametric Modulation of Error-Related ERP Components by the Magnitude of Visuo-Motor Mismatch
ERIC Educational Resources Information Center
Vocat, Roland; Pourtois, Gilles; Vuilleumier, Patrik
2011-01-01
Errors generate typical brain responses, characterized by two successive event-related potentials (ERPs) following an incorrect action: the error-related negativity (ERN) and the error positivity (Pe). However, it is unclear whether these error-related responses are sensitive to the magnitude of the error, or instead show all-or-none effects. We…
Rosenman's "Serendipity and Scientific Discovery" Revisited: Toward Defining Types of Chance Events.
ERIC Educational Resources Information Center
Diaz de Chumaceiro, Cora L.; Yaber O., Guillermo E.
1994-01-01
The role of serendipity or "chance in all its forms" in scientific discovery is considered. The need to differentiate between purely accidental events and Rothenberg's "articulations of error" when discussing scientific discoveries is stressed. Examples of articulations of errors are noted, including Fleming (penicillin),…
Fine-Scale Event Location and Error Analysis in NET-VISA
NASA Astrophysics Data System (ADS)
Arora, N. S.; Russell, S.
2016-12-01
NET-VISA is a generative probabilistic model for the occurrence of seismic, hydroacoustic, and atmospheric events, and for the propagation of energy from these events through various media and phases before being detected, or misdetected, by IMS stations. It is built on top of the basic station and arrival detection processing at the IDC, and is currently being tested in the IDC network processing pipelines. A key distinguishing feature of NET-VISA is that it is easy to incorporate prior scientific knowledge and historical data into the probabilistic model. The model accounts for both detections and misdetections when forming events, and this allows it to make more accurate event hypotheses. It has been continuously evaluated since 2012, and in each year it achieves roughly a 60% reduction in the number of missed events without increasing the false event rate as compared to the existing GA algorithm. More importantly, the model finds large numbers of events that have been confirmed by regional seismic bulletins but missed by the IDC analysts using the same data. In this work we focus on enhancements to the model to improve location accuracy and error ellipses. We will present a new version of the model that focuses on the fine scale around the event location, and present error ellipses and analyses of recent important events.
NASA Astrophysics Data System (ADS)
Simmons, B. E.
1981-08-01
This report derives equations predicting satellite ephemeris error as a function of measurement errors of space-surveillance sensors. These equations lend themselves to rapid computation with modest computer resources. They are applicable over prediction times such that measurement errors, rather than uncertainties of atmospheric drag and of Earth shape, dominate in producing ephemeris error. This report describes the specialization of these equations underlying the ANSER computer program, SEEM (Satellite Ephemeris Error Model). The intent is that this report be of utility to users of SEEM for interpretive purposes, and to computer programmers who may need a mathematical point of departure for limited generalization of SEEM.
Sediment Dynamics Over a Stable Point bar of the San Pedro River, Southeastern Arizona
NASA Astrophysics Data System (ADS)
Hamblen, J. M.; Conklin, M. H.
2002-12-01
Streams of the Southwest receive enormous inputs of sediment during storm events in the monsoon season due to the high-intensity rainfall and large percentages of exposed soil in the semi-arid landscape. In the Upper San Pedro River, with a watershed area of approximately 3600 square kilometers, particle size ranges from clays to boulders, with large fractions of sand and gravel. This study focuses on the mechanics of scour and fill on a stable point bar. An innovative technique using seven co-located scour chains and liquid-filled, load-cell scour sensors characterized sediment dynamics over the point bar during the monsoon season of July to September 2002. The sensors were set in two transects to document sediment dynamics near the head and toe of the bar. Scour sensors record area-averaged sediment depths, while scour chains measure scour and fill at a point. The average area covered by each scour sensor is 11.1 square meters. Because scour sensors had never been used in a system similar to the San Pedro, one goal of the study was to test their ability to detect changes in sediment load with time in order to determine the extent of scour and fill during monsoonal storms. Because of the predominantly unconsolidated nature of the substrate, it was hypothesized that dune bedforms would develop in events smaller than the 1-year flood. The weak 2002 monsoon season produced only two storms that completely inundated the point bar, both less than the 1-year flood event. The first event, 34 m³/s, produced net deposition in areas where Johnson grass had been present and was now buried. The scour sensor at the lowest elevation, in a depression that serves as a secondary channel during storm events, recorded scour during the rising limb of the hydrograph, followed by pulses we interpret to be the passage of dunes. The second event, although smaller at 28 m³/s, resulted from rain more than 50 km upstream and had a much longer peak and a slowly declining falling limb. During the second flood, several areas with buried vegetation were scoured back to their original bed elevations. Pulses of sediment passed over the sensor in the secondary channel and the sensor in the vegetated zone. Scour sensor measurements agree with data from scour chains (error +/- 3 cm) and surveys (error +/- 0.6 cm) performed before and after the two storm events, within the range of error of each method. All load sensor data were recorded at five-minute intervals; use of a smaller interval could give more detail about the shapes of sediment waves and aid in bedform determination. Results suggest that dune migration is the dominant mechanism for scour and backfill in the point bar setting. Scour sensors, when coupled with surveying and/or scour chains, are a tremendous addition to the geomorphologist's toolbox, allowing unattended real-time measurements of sediment depth with time.
Simulation of rare events in quantum error correction
NASA Astrophysics Data System (ADS)
Bravyi, Sergey; Vargo, Alexander
2013-12-01
We consider the problem of calculating the logical error probability for a stabilizer quantum code subject to random Pauli errors. To access the regime of large code distances, where logical errors are extremely unlikely, we adopt the splitting method widely used in Monte Carlo simulations of rare events and Bennett's acceptance ratio method for estimating the free energy difference between two canonical ensembles. To illustrate the power of these methods in the context of error correction, we calculate the logical error probability P_L for the two-dimensional surface code on a square lattice with a pair of holes for all code distances d ≤ 20 and all error rates p below the fault-tolerance threshold. Our numerical results confirm the expected exponential decay P_L ∼ exp[−α(p)d] and provide a simple fitting formula for the decay rate α(p). Both noiseless and noisy syndrome readout circuits are considered.
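As a minimal illustration of extracting the decay rate α(p) from such data — linear regression of log P_L against code distance d — here is a sketch with made-up values (not the paper's numbers):

```python
import numpy as np

# Hypothetical logical error probabilities at one physical error rate p;
# the claimed scaling is log(P_L) ≈ const − alpha(p) * d.
d = np.array([6, 8, 10, 12, 14, 16, 18, 20])
P_L = np.array([3e-2, 8e-3, 2.2e-3, 6e-4, 1.6e-4, 4.5e-5, 1.2e-5, 3.3e-6])

slope, intercept = np.polyfit(d, np.log(P_L), 1)
print(f"alpha(p) ~ {-slope:.3f}")  # decay rate from the straight-line fit
```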
Ultrahigh Error Threshold for Surface Codes with Biased Noise
NASA Astrophysics Data System (ADS)
Tuckett, David K.; Bartlett, Stephen D.; Flammia, Steven T.
2018-02-01
We show that a simple modification of the surface code can exhibit an enormous gain in the error correction threshold for a noise model in which Pauli Z errors occur more frequently than X or Y errors. Such biased noise, where dephasing dominates, is ubiquitous in many quantum architectures. In the limit of pure dephasing noise we find a threshold of 43.7(1)% using a tensor network decoder proposed by Bravyi, Suchara, and Vargo. The threshold remains surprisingly large in the regime of realistic noise bias ratios, for example 28.2(2)% at a bias of 10. The performance is, in fact, at or near the hashing bound for all values of the bias. The modified surface code still uses only weight-4 stabilizers on a square lattice, but merely requires measuring products of Y instead of Z around the faces, as this doubles the number of useful syndrome bits associated with the dominant Z errors. Our results demonstrate that large efficiency gains can be found by appropriately tailoring codes and decoders to realistic noise models, even under the locality constraints of topological codes.
Troposphere-Stratosphere Connections in Recent Northern Winters in NASA GEOS Assimilated Datasets
NASA Technical Reports Server (NTRS)
Pawson, Steven
2000-01-01
The northern winter stratosphere displays a wide range of interannual variability, much of which is believed to result from the response to the damping of upward-propagating waves. However, there is considerable (and growing) evidence that the stratospheric state can also impact the tropospheric circulation. This issue will be examined using datasets generated by the Data Assimilation Office (DAO) at NASA's Goddard Space Flight Center. Just as the tropospheric circulation in each of these years was dominated by differing synoptic-scale structures, the stratospheric polar vortex also displayed different evolutions. The two extremes are the winter of 1998/1999, when the stratosphere underwent a series of warming events (including two major warmings), and the winter of 1999/2000, which was dominated by a persistent, cold polar vortex, often distorted by a dominant blocking pattern in the troposphere. This study will examine several operational and research-level versions of the DAO's systems. The 70-level TRMM system with a resolution of 2-by-2.5 degrees and the 48-level, 1-by-1-degree "Terra" system were operational in 1998/1999 and 1999/2000, respectively. Research versions of the system used a 48-level, 2-by-2.5-degree configuration, which facilitates studies of the impact of vertical resolution. The study includes checks against independent datasets and error analyses, as well as the main issue of troposphere-stratosphere interactions.
Preventing medical errors by designing benign failures.
Grout, John R
2003-07-01
One way to successfully reduce medical errors is to design health care systems that are more resistant to the tendencies of human beings to err. One interdisciplinary approach entails creating design changes, mitigating human errors, and making human error irrelevant to outcomes. This approach is intended to facilitate the creation of benign failures, which have been called mistake-proofing devices and forcing functions elsewhere. USING FAULT TREES TO DESIGN FORCING FUNCTIONS: A fault tree is a graphical tool used to understand the relationships that either directly cause or contribute to the cause of a particular failure. A careful analysis of a fault tree enables the analyst to anticipate how the process will behave after a change. EXAMPLE OF AN APPLICATION: A scenario in which a patient is scalded while bathing can serve as an example of how multiple fault trees can be used to design forcing functions. The first fault tree shows the undesirable event--patient scalded while bathing. The second fault tree has a benign event--no water. Adding a scald valve changes the outcome from the undesirable event ("patient scalded while bathing") to the benign event ("no water"). Analysis of fault trees does not ensure or guarantee that the changes necessary to eliminate error actually occur. Most mistake-proofing is used to prevent simple errors and to create well-defended processes, but complex errors can also result. The utilization of mistake-proofing or forcing functions can be thought of as changing the logic of a process. Errors that formerly caused undesirable failures can be converted into the causes of benign failures. The use of fault trees can provide a variety of insights into the design of forcing functions that will improve patient safety.
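The article's fault trees are graphical, but the logic is easy to encode; as a toy sketch (the event names follow the scalding example above, while the exact tree structure is assumed), the forcing function converts the same root causes from a harmful top event into a benign one:

```python
def scalded(water_too_hot: bool, patient_in_bath: bool,
            scald_valve_fitted: bool) -> bool:
    """Top event: 'patient scalded while bathing' (AND of its causes).
    The scald valve is the forcing function: overheated water shuts off
    flow, so the system fails benignly ('no water') instead of scalding."""
    water_flows = not (scald_valve_fitted and water_too_hot)
    return water_too_hot and patient_in_bath and water_flows

print(scalded(True, True, scald_valve_fitted=False))  # True  -> harmful failure
print(scalded(True, True, scald_valve_fitted=True))   # False -> benign 'no water'
```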
A New Expanded Mixed Element Method for Convection-Dominated Sobolev Equation
Wang, Jinfeng; Li, Hong; Fang, Zhichao
2014-01-01
We propose and analyze a new expanded mixed element method whose gradient belongs to the simple square-integrable space instead of the classical H(div; Ω) space of Chen's expanded mixed element method. We study the new expanded mixed element method for the convection-dominated Sobolev equation, prove existence and uniqueness of the finite element solution, and introduce a new expanded mixed projection. We derive optimal a priori error estimates in the L²-norm for the scalar unknown u, and a priori error estimates in the (L²)²-norm for its gradient λ and its flux σ. Moreover, we obtain optimal a priori error estimates in the H¹-norm for the scalar unknown u. Finally, we present some numerical results to illustrate the efficiency of the new method. PMID:24701153
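The abstract states the estimates only in words; for degree-k elements and a sufficiently smooth solution, optimal bounds of this kind conventionally take the generic shape below (a sketch of the standard form, not the paper's exact theorem or constants):

$$\|u-u_h\|_{L^2}+\|\lambda-\lambda_h\|_{(L^2)^2}+\|\sigma-\sigma_h\|_{(L^2)^2}\le C\,h^{k+1}$$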
Coherent Doppler Lidar for Boundary Layer Studies and Wind Energy
NASA Astrophysics Data System (ADS)
Choukulkar, Aditya
This thesis outlines the development of a vector retrieval technique, based on data assimilation, for a coherent Doppler lidar (Light Detection and Ranging). A detailed analysis of the Optimal Interpolation (OI) technique for vector retrieval is presented. Through several modifications to the OI technique, it is shown that the modified technique results in significant improvement in velocity retrieval accuracy. These modifications include changes to innovation covariance partitioning, covariance binning, and analysis increment calculation. It is observed that the modified technique makes retrievals with better accuracy, preserves local information better, and compares well with tower measurements. In order to study the error of representativeness and the vector retrieval error, a lidar simulator was constructed. Using the lidar simulator, a thorough sensitivity analysis of the lidar measurement process and vector retrieval was carried out. The error of representativeness as a function of scales of motion, and the sensitivity of vector retrieval to look angle, are quantified. Using the modified OI technique, a study of nocturnal flow in Owens Valley, CA was carried out to identify and understand uncharacteristic events on the night of March 27, 2006. Observations from 1030 UTC to 1230 UTC (0230 to 0430 local time) on March 27, 2006 are presented. Lidar observations show complex and uncharacteristic flows, such as sudden bursts of westerly cross-valley wind mixing with the dominant up-valley wind. Model results from the Coupled Ocean/Atmosphere Mesoscale Prediction System (COAMPS RTM) and other in situ instrumentation are used to corroborate and complement these observations. The modified OI technique is used to identify uncharacteristic and extreme flow events at a wind development site. Estimates of turbulence and shear from this technique are compared to tower measurements. A formulation for equivalent wind speed in the presence of variations in wind speed and direction, combined with shear, is developed and used to determine wind energy content in the presence of turbulence.
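The OI scheme being modified is the standard one; in conventional data-assimilation notation, the analysis increment and gain are (the standard OI equations, not the thesis's modified forms):

$$\mathbf{x}_a = \mathbf{x}_b + \mathbf{K}\left(\mathbf{y} - \mathbf{H}\mathbf{x}_b\right), \qquad \mathbf{K} = \mathbf{B}\mathbf{H}^{\mathsf{T}}\left(\mathbf{H}\mathbf{B}\mathbf{H}^{\mathsf{T}} + \mathbf{R}\right)^{-1},$$

where x_b is the background wind field, y the radial-velocity observations, H the observation operator, and B and R the background and observation error covariances. The modifications to innovation covariance partitioning and covariance binning described above act on B and on the innovation statistics.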
Forecasting the brittle failure of heterogeneous, porous geomaterials
NASA Astrophysics Data System (ADS)
Vasseur, Jérémie; Wadsworth, Fabian; Heap, Michael; Main, Ian; Lavallée, Yan; Dingwell, Donald
2017-04-01
Heterogeneity develops in magmas during ascent and is dominated by the development of crystal and, importantly, bubble populations or pore-network clusters, which grow, interact, localize, coalesce, outgas, and resorb. Pore-scale heterogeneity is also ubiquitous in sedimentary basin fill during diagenesis. As a first step, we construct 3D numerical simulations in which randomly generated, heterogeneous, polydisperse spheres are placed in volumes and permitted to overlap with one another, designed to represent the random growth and interaction of bubbles in a liquid volume. We use these simulated geometries to show that statistical predictions of the inter-bubble lengthscales and of the evolving bubble surface area or cluster densities can be made from fundamental percolation theory. As a second step, we take a range of well-constrained, random, heterogeneous rock samples, including sandstones, andesites, synthetic partially sintered glass bead samples, and intact glass samples, and subject them to a variety of stress loading conditions at a range of temperatures until failure. We record in real time the evolution of the number of acoustic events that precede failure and show that in all scenarios the acoustic event rate accelerates toward failure, consistent with previous findings. Applying tools designed to forecast the failure time from these precursory signals, we constrain the absolute error on the forecast time. We find that for all sample types, the error associated with an accurate forecast of failure scales non-linearly with the lengthscale between the pore clusters in the material. Moreover, using a simple micromechanical model for the deformation of porous elastic bodies, we show that the ratio between the equilibrium sub-critical crack length emanating from the pore clusters and the inter-pore lengthscale provides a scaling for the error on forecast accuracy. Thus, for the first time, we provide a potential quantitative correction for forecasting the failure of the porous brittle solids that build the Earth's crust.
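The forecasting tools alluded to here are, in this literature, usually variants of the failure forecast method, in which a precursor rate (here the acoustic emission event rate) accelerates as an inverse power law of the time to failure t_f (a standard form, assumed rather than quoted from the abstract):

$$\frac{\mathrm{d}\Omega}{\mathrm{d}t} \propto \left(t_f - t\right)^{-\alpha}, \qquad \alpha > 0,$$

so that for α = 1 the inverse event rate falls linearly with time and its extrapolation to zero estimates t_f.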
Diagnosis of NMOS DRAM functional performance as affected by a picosecond dye laser
NASA Technical Reports Server (NTRS)
Kim, Q.; Schwartz, H. R.; Edmonds, L. D.; Zoutendyk, J. A.
1992-01-01
A picosecond pulsed dye laser beam at selected wavelengths was successfully used to simulate heavy-ion single-event effects (SEEs) in n-channel MOS (NMOS) DRAMs. A DRAM was used to develop the test technique because bit-mapping capability and previous heavy-ion upset data were available. The present analysis is the first to establish such a correlation between laser and heavy-ion data for devices, such as the NMOS DRAM, where charge collection is dominated by long-range diffusion, which is controlled by the carrier density at remote distances from a depletion region. In the latter case, penetration depth is an important parameter and is included in the present analysis. A single-pulse picosecond dye laser beam (1.5-micron diameter) focused onto a single cell component can upset a single memory cell; clusters of memory cell upsets (multiple errors) were observed when the laser energy was increased above the threshold energy. The multiple errors were analyzed as a function of the bias voltage and the total energy of a single pulse. A diffusion model to distinguish the multiple upsets from the laser-induced charge agreed well with previously reported heavy-ion data.
Xiao, Yongling; Abrahamowicz, Michal
2010-03-30
We propose two bootstrap-based methods to correct the standard errors (SEs) from Cox's model for within-cluster correlation of right-censored event times. The cluster-bootstrap method resamples, with replacement, only the clusters, whereas the two-step bootstrap method resamples (i) the clusters, and (ii) individuals within each selected cluster, with replacement. In simulations, we evaluate both methods and compare them with the existing robust variance estimator and the shared gamma frailty model, which are available in statistical software packages. We simulate clustered event time data with latent cluster-level random effects, which are ignored in the conventional Cox model. For cluster-level covariates, both proposed bootstrap methods yield accurate SEs and type I error rates, and acceptable coverage rates, regardless of the true random effects distribution, and avoid the serious variance under-estimation of conventional Cox-based standard errors. However, the two-step bootstrap method over-estimates the variance for individual-level covariates. We also apply the proposed bootstrap methods to obtain confidence bands around flexible estimates of time-dependent effects in a real-life analysis of clustered event times.
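A minimal sketch of the cluster-bootstrap SE (resample whole clusters with replacement, refit, take the standard deviation of the coefficient across replicates); `fit_cox` stands in for any Cox fitting routine returning the coefficient of interest and is an assumption, not the authors' code:

```python
import numpy as np
import pandas as pd

def cluster_bootstrap_se(df, cluster_col, fit_cox, n_boot=500, seed=0):
    """SE of a Cox coefficient by resampling clusters with replacement.
    fit_cox(df) -> float must refit the model and return the coefficient."""
    rng = np.random.default_rng(seed)
    clusters = df[cluster_col].unique()
    betas = []
    for _ in range(n_boot):
        picked = rng.choice(clusters, size=len(clusters), replace=True)
        # A cluster drawn twice contributes two copies, as the method requires.
        boot = pd.concat([df[df[cluster_col] == c] for c in picked],
                         ignore_index=True)
        betas.append(fit_cox(boot))
    return np.std(betas, ddof=1)
```

The two-step variant described in the abstract would add a second resampling of individuals within each selected cluster before refitting.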
Wedell, Douglas H; Moro, Rodrigo
2008-04-01
Two experiments used within-subject designs to examine how conjunction errors depend on the use of (1) choice versus estimation tasks, (2) probability versus frequency language, and (3) conjunctions of two likely events versus conjunctions of likely and unlikely events. All problems included a three-option format verified to minimize misinterpretation of the base event. In both experiments, conjunction errors were reduced when likely events were conjoined. Conjunction errors were also reduced for estimations compared with choices, with this reduction greater for likely conjuncts, an interaction effect. Shifting conceptual focus from probabilities to frequencies did not affect conjunction error rates. Analyses of numerical estimates for a subset of the problems provided support for the use of three general models by participants for generating estimates. Strikingly, the order in which the two tasks were carried out did not affect the pattern of results, supporting the idea that the mode of responding strongly determines the mode of thinking about conjunctions and hence the occurrence of the conjunction fallacy. These findings were evaluated in terms of implications for rationality of human judgment and reasoning.
The effectiveness of risk management program on pediatric nurses' medication error.
Dehghan-Nayeri, Nahid; Bayat, Fariba; Salehi, Tahmineh; Faghihzadeh, Soghrat
2013-09-01
Medication therapy is one of the most complex and high-risk clinical processes that nurses deal with. Medication error is the most common type of error that brings about damage and death to patients, especially pediatric ones. However, these errors are preventable. Identifying and preventing undesirable events leading to medication errors are the main risk management activities. The aim of this study was to investigate the effectiveness of a risk management program on the pediatric nurses' medication error rate. This study is a quasi-experimental one with a comparison group. In this study, 200 nurses were recruited from two main pediatric hospitals in Tehran. In the experimental hospital, we applied the risk management program for a period of 6 months. Nurses at the control hospital followed the hospital's routine schedule. A pre- and post-test was performed to measure the frequency of medication error events. SPSS software, t-tests, and regression analysis were used for data analysis. After the intervention, the medication error rate of nurses at the experimental hospital was significantly lower (P < 0.001) and the error-reporting rate was higher (P < 0.007) compared to before the intervention and also in comparison to the nurses of the control hospital. Based on the results of this study, and taking into account the high-risk nature of the medical environment, applying quality-control programs such as risk management can effectively prevent the occurrence of undesirable hospital events. Nursing managers can reduce the medication error rate by applying risk management programs. However, this program cannot succeed without nurses' cooperation.
Matsui, Mié; Sumiyoshi, Tomiki; Yuuki, Hiromi; Kato, Kanade; Kurachi, Masayoshi
2006-08-30
The purpose of this study was to examine event schema, the conceptualization of past experience based on script theory, in Japanese patients with schizophrenia. Subjects comprised 25 patients meeting DSM-IV criteria for schizophrenia and 31 normal individuals who gave informed consent. This experiment used three script tasks measuring free recall, frequency judgment, and sequencing of events encountered when shopping at a supermarket. Patients with schizophrenia performed significantly worse than did control subjects on all tasks. In particular, patients committed more errors when judging the events that "occasionally happen" in the frequency judgment task. On the other hand, these patients judged "seldom occurring" events relatively well. Patients with schizophrenia made more errors than controls in the free recall task. Specifically, patients made more intrusion errors and failed to close scripts. There was a negative correlation between scores on the Scale for the Assessment of Positive Symptoms and performance on the free recall task. The results of the present study suggest that event schemas (semantic structure) in patients with schizophrenia are impaired, which may be associated with positive symptoms and frontal lobe dysfunction.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Naughton, M.J.; Bourke, W.; Browning, G.L.
The convergence of spectral model numerical solutions of the global shallow-water equations is examined as a function of the time step and the spectral truncation. The contributions to the errors due to the spatial and temporal discretizations are separately identified and compared. Numerical convergence experiments are performed with the inviscid equations from smooth (Rossby-Haurwitz wave) and observed (R45 atmospheric analysis) initial conditions, and also with the diffusive shallow-water equations. Results are compared with the forced inviscid shallow-water equations case studied by Browning et al. Reduction of the time discretization error by the removal of fast waves from the solution using initialization is shown. The effects of forcing and diffusion on the convergence are discussed. Time truncation errors are found to dominate when a feature is large scale and well resolved; spatial truncation errors dominate for small-scale features and also for large scales after the small scales have affected them. Possible implications of these results for global atmospheric modeling are discussed. 31 refs., 14 figs., 4 tabs.
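As a toy illustration of isolating the temporal discretization error, the sketch below integrates a single oscillatory mode u' = iωu (a stand-in for a fast wave; parameters are illustrative, not from the paper) with the leapfrog scheme and exhibits the expected second-order convergence as the time step is halved:

```python
# Leapfrog time-truncation error for u' = i*omega*u (illustrative only).
import numpy as np

def leapfrog_error(omega=10.0, T=1.0, dt=1e-3):
    n = int(T / dt)
    u_prev, u = 1.0 + 0j, np.exp(1j * omega * dt)     # exact starting values
    for _ in range(n - 1):
        u_prev, u = u, u_prev + 2j * omega * dt * u   # leapfrog step
    return abs(u - np.exp(1j * omega * T))            # error at t = T

for dt in (1e-2, 5e-3, 2.5e-3):
    print(dt, leapfrog_error(dt=dt))   # error shrinks ~4x per halving: O(dt^2)
```

A faster mode (larger ω) inflates this error for the same dt, which is why removing fast waves via initialization reduces the time discretization error.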
Mechanisms of protein-folding diseases at a glance.
Valastyan, Julie S; Lindquist, Susan
2014-01-01
For a protein to function appropriately, it must first achieve its proper conformation and location within the crowded environment inside the cell. Multiple chaperone systems are required to fold proteins correctly. In addition, degradation pathways participate by destroying improperly folded proteins. The intricacy of this multisystem process provides many opportunities for error. Furthermore, mutations cause misfolded, nonfunctional forms of proteins to accumulate. As a result, many pathological conditions are fundamentally rooted in the protein-folding problem that all cells must solve to maintain their function and integrity. Here, to illustrate the breadth of this phenomenon, we describe five examples of protein-misfolding events that can lead to disease: improper degradation, mislocalization, dominant-negative mutations, structural alterations that establish novel toxic functions, and amyloid accumulation. In each case, we will highlight current therapeutic options for battling such diseases.
Alterations in Error-Related Brain Activity and Post-Error Behavior over Time
ERIC Educational Resources Information Center
Themanson, Jason R.; Rosen, Peter J.; Pontifex, Matthew B.; Hillman, Charles H.; McAuley, Edward
2012-01-01
This study examines the relation between the error-related negativity (ERN) and post-error behavior over time in healthy young adults (N = 61). Event-related brain potentials were collected during two sessions of an identical flanker task. Results indicated changes in ERN and post-error accuracy were related across task sessions, with more…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krishnaswamy, J.; Kalsi, S.; Hsieh, H.
1991-01-01
Magnetic measurements performed on the 12-pole trim magnets are described, including Hall probe measurements to verify the symmetry of the field and rotating-coil measurements to map the multipoles. The rotating-coil measurements were carried out using an HP Dynamic Signal Analyzer. Excited as a quadrupole, the dominant error multipole is the 20th pole; excited as a sextupole, the dominant error multipole is the 18th pole. Reasonable agreement was found between the Hall probe measurements and the rotating-coil measurements. 2 refs., 5 figs.
Benjamin, David M; Pendrak, Robert F
2003-07-01
Clinical pharmacologists are all dedicated to improving the use of medications and decreasing medication errors and adverse drug reactions. However, quality improvement requires that some significant parameters of quality be categorized, measured, and tracked to provide benchmarks to which future data (performance) can be compared. One of the best ways to accumulate data on medication errors and adverse drug reactions is to look at medical malpractice data compiled by the insurance industry. Using data from the PHICO insurance company, PHICO's Closed Claims Data, and PHICO's Event Reporting Trending System (PERTS), this article examines the significance and trends of the claims and events reported between 1996 and 1998. Those who misread history are doomed to repeat the mistakes of the past. From a quality improvement perspective, the categorization of the claims and events is useful for reengineering integrated medication delivery, particularly in a hospital setting, and for redesigning drug administration protocols for low-therapeutic-index medications and "high-risk" drugs. Demonstrable evidence of quality improvement is being required by state laws and by accreditation agencies. The state of Florida requires that quality improvement data be posted quarterly on the Web sites of health care facilities. Other states have followed suit. The insurance industry is concerned with costs, and medication errors cost money. Even excluding the costs of litigation, an adverse drug reaction may cost up to $2500 in hospital resources, and a preventable medication error may cost almost $4700. To monitor costs and assess risk, insurance companies want to know what errors are made and where the system has broken down, permitting the error to occur. Recording and evaluating reliable data on adverse drug events is the first step in improving the quality of pharmacotherapy and increasing patient safety. Cost savings and quality improvement evolve on parallel paths. The PHICO data provide an excellent opportunity to review information that typically would not be in the public domain. The events captured by PHICO are similar to the errors and "high-risk" drugs described in the literature, the U.S. Pharmacopeia's MedMARx Reporting System, and the Sentinel Event reporting system maintained by the Joint Commission on Accreditation of Healthcare Organizations. The information in this report serves to alert clinicians to the possibility of adverse events when treating patients with the reported drugs, thus allowing for greater care in their use and closer monitoring. Moreover, when using high-risk drugs, patients should be well informed of known risks, dosage should be titrated slowly, and therapeutic drug monitoring and laboratory monitoring should be employed to optimize therapy and minimize adverse effects.
NASA Astrophysics Data System (ADS)
Duan, Wansuo; Zhao, Peng
2017-04-01
Within the Zebiak-Cane model, the nonlinear forcing singular vector (NFSV) approach is used to investigate the role of model errors in the "Spring Predictability Barrier" (SPB) phenomenon in ENSO predictions. NFSV-related errors have the largest negative effect on the uncertainties of El Niño predictions. NFSV errors can be classified into two types: the first is characterized by a zonal dipolar pattern of SST anomalies (SSTA), with the western poles centered in the equatorial central-western Pacific exhibiting positive anomalies and the eastern poles in the equatorial eastern Pacific exhibiting negative anomalies; the second is characterized by a pattern almost opposite to the first. The first type of error tends to have the worst effects on El Niño growth-phase predictions, whereas the latter often yields the largest negative effects on decaying-phase predictions. The evolution of prediction errors caused by NFSV-related errors exhibits prominent seasonality, with the fastest error growth in the spring and/or summer seasons; hence, these errors result in a significant SPB related to El Niño events. The linear counterpart of NFSVs, the (linear) forcing singular vector (FSV), induces a less significant SPB because it contains smaller prediction errors. Random errors cannot generate an SPB for El Niño events. These results show that the occurrence of an SPB is related to the spatial patterns of tendency errors, with the NFSV tendency errors causing the most significant SPB for El Niño events. In addition, NFSVs often concentrate their large errors in a few areas within the equatorial eastern and central-western Pacific, which likely represent the areas sensitive to El Niño predictions associated with model errors. Meanwhile, these areas are also consistent with the sensitive areas related to initial errors determined by previous studies. This implies that additional observations in the sensitive areas would not only improve the accuracy of the initial field but also promote the reduction of model errors, greatly improving ENSO forecasts.
A Flight/Ground/Test Event Logging Facility
NASA Technical Reports Server (NTRS)
Dvorak, Daniel
1999-01-01
The onboard control software for spacecraft such as Mars Pathfinder and Cassini is composed of many subsystems, including executive control, navigation, attitude control, imaging, data management, and telecommunications. The software in all of these subsystems needs to be instrumented for several purposes: to report required telemetry data, to report warning and error events, to verify internal behavior during system testing, and to provide ground operators with detailed data when investigating in-flight anomalies. Events can range in importance from purely informational events to major errors. It is desirable to provide a uniform mechanism for reporting such events and controlling their subsequent processing. Because radiation-hardened flight processors are several years behind the speed and memory of their commercial cousins, most subsystems require real-time control, and downlink rates to Earth can be very low from deep space, there are limits to how much of the data can be saved and transmitted. Some kinds of events are more important than others and should therefore be preferentially retained when memory is low. Some faults can cause an event to recur at a high rate, but this must not be allowed to consume the memory pool. Some event occurrences may be of low importance when reported but suddenly become more important when a subsequent error event is reported. Some events may be so low-level that they need not be saved and reported unless specifically requested by ground operators.
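A minimal sketch of the mechanisms described above, with illustrative names and thresholds (this is not the Pathfinder/Cassini flight code): severity-tagged reports go into a bounded pool, recurring event types are throttled, and the least important entry is evicted when memory runs low.

```python
# Illustrative event-logging facility: bounded pool + per-type rate limit.
import time
from collections import deque, defaultdict

class EventLog:
    def __init__(self, capacity=1000, max_rate_per_type=10):
        self.pool = deque()                  # bounded event pool
        self.capacity = capacity
        self.max_rate = max_rate_per_type    # reports per type per second
        self.last = defaultdict(lambda: (0.0, 0))

    def report(self, severity, event_type, message):
        t, count = self.last[event_type]
        now = time.monotonic()
        if now - t < 1.0:
            if count >= self.max_rate:       # throttle a recurring fault
                return
            self.last[event_type] = (t, count + 1)
        else:
            self.last[event_type] = (now, 1)
        if len(self.pool) >= self.capacity:
            # evict the lowest-severity (least important) retained event
            victim = min(range(len(self.pool)), key=lambda i: self.pool[i][0])
            del self.pool[victim]
        self.pool.append((severity, event_type, now, message))
```

The eviction policy implements the "preferentially retain important events" requirement; the throttle implements the "must not consume the memory pool" requirement.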
Design considerations for case series models with exposure onset measurement error.
Mohammed, Sandra M; Dalrymple, Lorien S; Sentürk, Damla; Nguyen, Danh V
2013-02-28
The case series model allows for estimation of the relative incidence of events, such as cardiovascular events, within a pre-specified time window after an exposure, such as an infection. The method requires only cases (individuals with events) and controls for all fixed/time-invariant confounders. The measurement error case series model extends the original case series model to handle imperfect data, where the timing of an infection (exposure) is not known precisely. In this work, we propose a method for power/sample size determination for the measurement error case series model. Extensive simulation studies are used to assess the accuracy of the proposed sample size formulas. We also examine the magnitude of the relative loss of power due to exposure onset measurement error, compared with the ideal situation where the time of exposure is measured precisely. To facilitate the design of case series studies, we provide publicly available web-based tools for determining power/sample size for both the measurement error case series model as well as the standard case series model. Copyright © 2012 John Wiley & Sons, Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erickson, Jason P.; Carlson, Deborah K.; Ortiz, Anne
Accurate location of seismic events is crucial for nuclear explosion monitoring. There are several sources of error in seismic location that must be taken into account to obtain high-confidence results. Most location techniques account for uncertainties in the phase arrival times (measurement error) and the bias of the velocity model (model error), but they do not account for the uncertainty of the velocity model bias. By determining and incorporating this uncertainty in the location algorithm, we seek to improve the accuracy of the calculated locations and uncertainty ellipses. In order to correct for deficiencies in the velocity model, it is necessary to apply station-specific corrections to the predicted arrival times. Both master-event and multiple-event location techniques assume that the station corrections are known perfectly, when in reality there is an uncertainty associated with these corrections. For multiple-event location algorithms that calculate station corrections as part of the inversion, it is possible to determine the variance of the corrections. The variance can then be used to weight the arrivals associated with each station, thereby giving more influence to stations with consistent corrections. We have modified an existing multiple-event location program (based on PMEL; Pavlis and Booker, 1983). We are exploring weighting arrivals with the inverse of the station correction standard deviation, as well as using the conditional probability of the calculated station corrections. This is in addition to the weighting already given to the measurement and modeling error terms. We re-locate a group of mining explosions that occurred at Black Thunder, Wyoming, and compare the results to those generated without accounting for station correction uncertainty.
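The weighting idea reduces to inverse-variance weights on the travel-time residuals; a short sketch under the assumption that per-arrival measurement, model, and station-correction standard deviations have already been estimated (names are illustrative, not the modified PMEL code):

```python
# Inverse-variance weighting of arrival residuals (illustrative sketch).
import numpy as np

def weighted_misfit(residuals, sigma_meas, sigma_model, sigma_corr):
    # Total per-arrival standard deviation combines all three error terms
    total_sd = np.sqrt(sigma_meas**2 + sigma_model**2 + sigma_corr**2)
    w = 1.0 / total_sd                 # consistent stations get more influence
    return np.sum((w * residuals)**2)  # weighted least-squares misfit
```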
Adaptive constructive processes and the future of memory.
Schacter, Daniel L
2012-11-01
Memory serves critical functions in everyday life but is also prone to error. This article examines adaptive constructive processes, which play a functional role in memory and cognition but can also produce distortions, errors, and illusions. The article describes several types of memory errors that are produced by adaptive constructive processes and focuses in particular on the process of imagining or simulating events that might occur in one's personal future. Simulating future events relies on many of the same cognitive and neural processes as remembering past events, which may help to explain why imagination and memory can be easily confused. The article considers both pitfalls and adaptive aspects of future event simulation in the context of research on planning, prediction, problem solving, mind-wandering, prospective and retrospective memory, coping and positivity bias, and the interconnected set of brain regions known as the default network. PsycINFO Database Record (c) 2012 APA, all rights reserved.
Asymmetric Memory Circuit Would Resist Soft Errors
NASA Technical Reports Server (NTRS)
Buehler, Martin G.; Perlman, Marvin
1990-01-01
Some nonlinear error-correcting codes are more efficient in the presence of asymmetry. A combination of circuit-design and coding concepts is expected to make integrated-circuit random-access memories more resistant to "soft" errors (temporary bit errors, also called "single-event upsets", due to ionizing radiation). An integrated circuit of the new type is made deliberately more susceptible to one kind of bit error than to the other, and the associated error-correcting code is adapted to exploit this asymmetry in error probabilities.
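For background on codes that exploit asymmetry, the Berger code is the textbook example: it detects all unidirectional (e.g. 1→0 only) errors by appending a count of the data word's zeros. This is offered as a hedged illustration of the general idea, not as the specific code used in the circuit described above:

```python
# Berger code: detects all unidirectional bit errors (illustrative sketch).
def berger_encode(bits):
    check = format(bits.count('0'), 'b')   # zero count, encoded in binary
    return bits, check

def berger_check(bits, check):
    return format(bits.count('0'), 'b') == check

data, chk = berger_encode('1011001')
assert berger_check(data, chk)
assert not berger_check('1011000', chk)    # a 1->0 flip changes the zero count
```

A 1→0 flip always increases the zero count, and a 0→1 flip always decreases it, so any all-one-direction error pattern is caught; this is why matching the code to a known error asymmetry pays off.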
Error Orientation and Reflection at Work
ERIC Educational Resources Information Center
Hetzner, Stefanie; Gartmeier, Martin; Heid, Helmut; Gruber, Hans
2011-01-01
Reflection on events at work, including errors, is often seen as a means to learn effectively through work. In a cross-sectional field study in the banking sector, we investigated attitudes towards workplace errors (i.e. error orientation) as predictors of reflective activity. We assumed the organisational climate for psychological safety to have a…
When is an error not a prediction error? An electrophysiological investigation.
Holroyd, Clay B; Krigolson, Olave E; Baker, Robert; Lee, Seung; Gibson, Jessica
2009-03-01
A recent theory holds that the anterior cingulate cortex (ACC) uses reinforcement learning signals conveyed by the midbrain dopamine system to facilitate flexible action selection. According to this position, the impact of reward prediction error signals on ACC modulates the amplitude of a component of the event-related brain potential called the error-related negativity (ERN). The theory predicts that ERN amplitude is monotonically related to the expectedness of the event: It is larger for unexpected outcomes than for expected outcomes. However, a recent failure to confirm this prediction has called the theory into question. In the present article, we investigated this discrepancy in three trial-and-error learning experiments. All three experiments provided support for the theory, but the effect sizes were largest when an optimal response strategy could actually be learned. This observation suggests that ACC utilizes dopamine reward prediction error signals for adaptive decision making when the optimal behavior is, in fact, learnable.
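To make "reward prediction error" concrete, the textbook delta-rule update is sketched below (a generic Rescorla-Wagner/TD-style learner, not the authors' ERN model): the error is large for unexpected outcomes and shrinks as outcomes become expected, mirroring the predicted modulation of ERN amplitude.

```python
# Delta-rule prediction error (illustrative; alpha is a learning rate).
V, alpha = 0.0, 0.1           # V: learned value of the expected outcome
for trial in range(100):
    r = 1.0                   # suppose reward is delivered on every trial
    delta = r - V             # prediction error: large early, shrinks later
    V += alpha * delta
print(V, delta)               # V approaches 1.0; delta approaches 0
```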
ERIC Educational Resources Information Center
van Veen, V.; Holroyd, C.B.; Cohen, J.D.; Stenger, V.A.; Carter, C.S.
2004-01-01
Recent theories of the neural basis of performance monitoring have emphasized a central role for the anterior cingulate cortex (ACC). Replicating an earlier event-related potential (ERP) study, which showed an error feedback negativity that was modeled as having an ACC generator, we used event-related fMRI to investigate whether the ACC would…
Effect of random errors in planar PIV data on pressure estimation in vortex dominated flows
NASA Astrophysics Data System (ADS)
McClure, Jeffrey; Yarusevych, Serhiy
2015-11-01
The sensitivity of pressure estimation techniques from Particle Image Velocimetry (PIV) measurements to random errors in measured velocity data is investigated using the flow over a circular cylinder as a test case. Direct numerical simulations are performed for ReD = 100, 300 and 1575, spanning the laminar, transitional, and turbulent wake regimes, respectively. A range of random errors typical for PIV measurements is applied to synthetic PIV data extracted from the numerical results. A parametric study is then performed using a number of common pressure estimation techniques. Optimal temporal and spatial resolutions are derived based on the sensitivity of the estimated pressure fields to the simulated random error in velocity measurements, and the results are compared to an optimization model derived from error propagation theory. It is shown that the reduction in spatial and temporal scales at higher Reynolds numbers leads to notable changes in the optimal pressure evaluation parameters. The effect of smaller-scale wake structures is also quantified. The errors in the estimated pressure fields are shown to depend significantly on the pressure estimation technique employed. The results are used to provide recommendations for the use of pressure and force estimation techniques from experimental PIV measurements in vortex-dominated laminar and turbulent wake flows.
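The underlying error-propagation trade-off can be reproduced with a toy central-difference experiment (illustrative values; a 1D sinusoid with additive Gaussian noise stands in for PIV velocity data): random error in the derivative grows as ε/Δx while truncation error shrinks as Δx², so an intermediate spacing is optimal.

```python
# Noise vs. truncation trade-off for a differentiated noisy signal.
import numpy as np

rng = np.random.default_rng(1)
eps = 1e-3                                    # assumed velocity noise level
for dx in (0.2, 0.05, 0.01, 0.002):
    x = np.arange(0.0, 2 * np.pi, dx)
    u = np.sin(x) + rng.normal(0, eps, x.size)
    dudx = np.gradient(u, dx)                 # central differences
    err = np.sqrt(np.mean((dudx - np.cos(x))**2))
    print(f"dx={dx:6.3f}  rms error={err:.2e}")  # minimum at intermediate dx
```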
Weng, Shenglin; Li, Yiping; Wei, Jin; Du, Wei; Gao, Xiaomeng; Wang, Wencai; Wang, Jianwei; Acharya, Kumud; Luo, Liancong
2018-05-01
The identification of coherent structures is very important in investigating the sediment transport mechanism and controlling eutrophication in shallow lakes. This study analyzed the turbulence characteristics and the sensitivity of quadrant analysis to the threshold level. Simultaneous in situ measurements of velocities and suspended sediment concentration (SSC) were conducted in Lake Taihu with acoustic Doppler velocimeter (ADV) and optical backscatter sensor (OBS) instruments. The results show that increasing the hole size makes the difference between dominant and non-dominant events more distinct. Wind velocity determines the frequency of occurrence of sweep and ejection events, which provide the dominant contributions to the Reynolds stress. An increase in wind velocity enlarges the magnitude of coherent events but has little impact on event frequency at the same hole size. Events occurring within short periods provide large contributions to the momentum flux. Transport and diffusion of sediment are, to a large extent, controlled by the intermittent coherent events.
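A minimal sketch of quadrant analysis with a hole size H, assuming demeaned fluctuation arrays u (streamwise) and w (vertical) from an ADV record; the hole definition follows the common Lu-Willmarth convention and may differ in detail from the study:

```python
# Quadrant analysis of Reynolds-stress contributions with a hole size H.
import numpy as np

def quadrant_fractions(u, w, H=1.0):
    uw = u * w
    # Hole: exclude weak events, |u'w'| <= H * sigma_u * sigma_w
    strong = np.abs(uw) > H * u.std() * w.std()
    quads = {
        'outward (Q1)':  (u > 0) & (w > 0) & strong,
        'ejection (Q2)': (u < 0) & (w > 0) & strong,
        'inward (Q3)':   (u < 0) & (w < 0) & strong,
        'sweep (Q4)':    (u > 0) & (w < 0) & strong,
    }
    # Fractional contribution of each quadrant to the total Reynolds stress
    return {k: uw[m].sum() / uw.sum() for k, m in quads.items()}
```

Raising H shrinks all four counts but sharpens the dominance of ejections and sweeps, which is the sensitivity the abstract describes.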
Event-related potentials for post-error and post-conflict slowing.
Chang, Andrew; Chen, Chien-Chung; Li, Hsin-Hung; Li, Chiang-Shan R
2014-01-01
In a reaction time task, people typically slow down following an error or a conflict, phenomena called post-error slowing (PES) and post-conflict slowing (PCS), respectively. Despite many studies of the cognitive mechanisms, the neural responses underlying PES and PCS continue to be debated. In this study, we combined high-density array EEG and a stop-signal task to examine event-related potentials of PES and PCS in sixteen young adult participants. The results showed that the amplitude of N2 is greater during PES but not PCS. In contrast, the peak latency of N2 is longer for PCS but not PES. Furthermore, error positivity (Pe) but not error-related negativity (ERN) was greater in the stop error trials preceding PES than in non-PES trials, suggesting that PES is related to participants' awareness of the error. Together, these findings extend earlier work on cognitive control by specifying the neural correlates of PES and PCS in the stop-signal task.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arnold, Anthony, E-mail: anthony.arnold@sesiahs.health.nsw.gov.a; Delaney, Geoff P.; Cassapi, Lynette
Purpose: Radiotherapy is a common treatment for cancer patients. Although the incidence of error is low, errors can be severe or affect significant numbers of patients. In addition, errors will often not manifest until long after treatment. This study describes the development of an incident reporting tool that allows categorical analysis and time trend reporting, covering the first 3 years of use. Methods and Materials: A radiotherapy-specific incident analysis system was established. Staff members were encouraged to report actual errors and near-miss events detected at the prescription, simulation, planning, or treatment phases of radiotherapy delivery. Trend reporting was reviewed monthly. Results: Reports were analyzed for the first 3 years of operation (May 2004-2007). A total of 688 reports was received during the study period. The actual error rate was 0.2% per treatment episode. During the study period, the actual error rate decreased significantly from 1% per year to 0.3% per year (p < 0.001), as did the total event report rate (p < 0.0001). There were 3.5 times as many near misses reported as actual errors. Conclusions: This system has allowed real-time analysis of events within a radiation oncology department and has reduced the error rate through a focus on learning and prevention from the near-miss reports. Plans are underway to develop this reporting tool for Australia and New Zealand.
Single-Event Upset Characterization of Common First- and Second-Order All-Digital Phase-Locked Loops
NASA Astrophysics Data System (ADS)
Chen, Y. P.; Massengill, L. W.; Kauppila, J. S.; Bhuva, B. L.; Holman, W. T.; Loveless, T. D.
2017-08-01
The single-event upset (SEU) vulnerability of common first- and second-order all-digital phase-locked loops (ADPLLs) is investigated through field-programmable gate array-based fault injection experiments. SEUs in the highest-order pole of the loop filter and in fraction-based phase detectors (PDs) may result in the worst-case error response, i.e., limit-cycle errors, often requiring a system restart. SEUs in integer-based linear PDs may result in loss-of-lock errors, while SEUs in bang-bang PDs result only in temporary frequency errors. ADPLLs with the same frequency tuning range but fewer bits in the control word exhibit better overall SEU performance.
Concomitant prescribing and dispensing errors at a Brazilian hospital: a descriptive study
Silva, Maria das Dores Graciano; Rosa, Mário Borges; Franklin, Bryony Dean; Reis, Adriano Max Moreira; Anchieta, Lêni Márcia; Mota, Joaquim Antônio César
2011-01-01
OBJECTIVE: To analyze the prevalence and types of prescribing and dispensing errors occurring with high-alert medications and to propose preventive measures to avoid errors with these medications. INTRODUCTION: The prevalence of adverse events in health care has increased, and medication errors are probably the most common cause of these events. Pediatric patients are known to be a high-risk group and are an important target in medication error prevention. METHODS: Observers collected data on prescribing and dispensing errors occurring with high-alert medications for pediatric inpatients in a university hospital. In addition to classifying the types of error that occurred, we identified cases of concomitant prescribing and dispensing errors. RESULTS: One or more prescribing errors, totaling 1,632 errors, were found in 632 (89.6%) of the 705 high-alert medications that were prescribed and dispensed. We also identified at least one dispensing error in each high-alert medication dispensed, totaling 1,707 errors. Among these dispensing errors, 723 (42.4%) content errors occurred concomitantly with the prescribing errors. A subset of dispensing errors may have occurred because of poor prescription quality. The observed concomitancy should be examined carefully because improvements in the prescribing process could potentially prevent these problems. CONCLUSION: The system of drug prescribing and dispensing at the hospital investigated in this study should be improved by incorporating the best practices of medication safety and preventing medication errors. High-alert medications may be used as triggers for improving the safety of the drug-utilization system. PMID:22012039
NASA Astrophysics Data System (ADS)
Yan, H.; Sun, N.; Wigmosta, M. S.; Hou, Z.
2017-12-01
There is a renewed focus on the design of infrastructure resilient to extreme hydrometeorological events. While precipitation-based intensity-duration-frequency (IDF) curves are commonly used as part of infrastructure design, a large percentage of peak runoff events in the snow-dominated regions are caused by snowmelt, particularly during rain-on-snow (ROS) events. In this study, we examined next-generation IDF (NG-IDF) curves with inclusion of snowmelt and ROS events to improve infrastructure design in snow-dominated regions. We compared NG-IDF curves to standard precipitation-based IDF curves for estimates of extreme events at 377 Snowpack Telemetry (SNOTEL) stations across the western United States with at least 30 years of high quality record. We found 38% of the stations were subject to under-design, many with significant underestimation of 100-year extreme events, where the precipitation-based IDF curves can underestimate water potentially available for runoff by as much as 121% due to snowmelt and ROS events. The regions with the greatest potential for under-design were in the Pacific Northwest, the Sierra Nevada, and the Middle and Southern Rockies. We also found the potential for over-design at 27% of the stations, primarily in the Middle Rockies and Arizona mountains. These results demonstrate the need to consider snow processes in development of IDF curves for engineering design procedures in snow-dominated regions.
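One core step behind such curves can be sketched as fitting a generalized extreme value (GEV) distribution to annual maxima and reading off the 100-year level; the snippet below uses made-up illustrative values, not the SNOTEL records, and scipy's genextreme for the fit:

```python
# GEV fit to annual maxima and a 100-year return level (illustrative).
import numpy as np
from scipy.stats import genextreme

# Assumed: one annual maximum of water-available-for-runoff per year
annual_max = np.array([42., 55., 38., 61., 47., 53., 70., 44., 58., 49.,
                       65., 51., 40., 57., 62., 46., 54., 68., 50., 59.])
shape, loc, scale = genextreme.fit(annual_max)
level_100yr = genextreme.ppf(1 - 1/100, shape, loc=loc, scale=scale)
print(f"100-year event: {level_100yr:.1f} (same units as the input)")
```

The NG-IDF idea replaces the precipitation series feeding `annual_max` with snowmelt-plus-rain water available for runoff, which is what shifts the estimated return levels at snow-dominated sites.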
2016-08-01
Lipid Metabolism, Inborn Errors; Hypercholesterolemia, Autosomal Dominant; Hyperlipidemias; Metabolic Diseases; Hyperlipoproteinemia Type II; Metabolism, Inborn Errors; Genetic Diseases, Inborn; Infant, Newborn, Diseases; Metabolic Disorder; Congenital Abnormalities; Hypercholesterolemia; Hyperlipoproteinemias; Dyslipidemias; Lipid Metabolism Disorders
Quality-control of an hourly rainfall dataset and climatology of extremes for the UK.
Blenkinsop, Stephen; Lewis, Elizabeth; Chan, Steven C; Fowler, Hayley J
2017-02-01
Sub-daily rainfall extremes may be associated with flash flooding, particularly in urban areas but, compared with extremes on daily timescales, have been relatively little studied in many regions. This paper describes a new, hourly rainfall dataset for the UK based on ∼1600 rain gauges from three different data sources. This includes tipping bucket rain gauge data from the UK Environment Agency (EA), which has been collected for operational purposes, principally flood forecasting. Significant problems in the use of such data for the analysis of extreme events include the recording of accumulated totals, high frequency bucket tips, rain gauge recording errors and the non-operation of gauges. Given the prospect of an intensification of short-duration rainfall in a warming climate, the identification of such errors is essential if sub-daily datasets are to be used to better understand extreme events. We therefore first describe a series of procedures developed to quality control this new dataset. We then analyse ∼380 gauges with near-complete hourly records for 1992-2011 and map the seasonal climatology of intense rainfall based on UK hourly extremes using annual maxima, n-largest events and fixed threshold approaches. We find that the highest frequencies and intensities of hourly extreme rainfall occur during summer when the usual orographically defined pattern of extreme rainfall is replaced by a weaker, north-south pattern. A strong diurnal cycle in hourly extremes, peaking in late afternoon to early evening, is also identified in summer and, for some areas, in spring. This likely reflects the different mechanisms that generate sub-daily rainfall, with convection dominating during summer. The resulting quality-controlled hourly rainfall dataset will provide considerable value in several contexts, including the development of standard, globally applicable quality-control procedures for sub-daily data, the validation of the new generation of very high-resolution climate models and improved understanding of the drivers of extreme rainfall.
Boluda-Ruiz, Rubén; García-Zambrana, Antonio; Castillo-Vázquez, Carmen; Castillo-Vázquez, Beatriz
2016-10-03
A novel, accurate and useful approximation of the well-known Beckmann distribution is presented here, which is used to model generalized pointing errors in the context of free-space optical (FSO) communication systems. We derive an approximate closed-form probability density function (PDF) for the composite gamma-gamma (GG) atmospheric turbulence with the pointing error model using the proposed approximation of the Beckmann distribution, which is valid for most practical terrestrial FSO links. This approximation takes into account the effect of the beam width, different jitters for the elevation and the horizontal displacement, and the simultaneous effect of nonzero boresight errors for each axis at the receiver plane. Additionally, the proposed approximation allows us to delimit two different FSO scenarios. The first is when atmospheric turbulence is the dominant effect relative to generalized pointing errors, and the second is when the generalized pointing error is the dominant effect relative to atmospheric turbulence. The second FSO scenario has not been studied in depth by the research community. Moreover, the accuracy of the method is measured both visually and quantitatively using curve-fitting metrics. Simulation results are further included to confirm the analytical results.
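The Beckmann radial displacement is easy to generate by Monte Carlo, which is one way to check any approximate closed form; a hedged sketch with illustrative jitter and boresight values (not the paper's derivation):

```python
# Monte-Carlo samples of a Beckmann-distributed pointing error.
import numpy as np

rng = np.random.default_rng(2)
mu_x, mu_y = 0.5, 0.2          # nonzero boresight error per axis (assumed)
s_x, s_y = 0.3, 0.6            # different jitters per axis (assumed)
x = rng.normal(mu_x, s_x, 10**6)
y = rng.normal(mu_y, s_y, 10**6)
r = np.hypot(x, y)             # radial pointing error: Beckmann-distributed
print(r.mean(), r.var())       # moments one could match when fitting a PDF
```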
De Sá Teixeira, Nuno Alexandre
2014-12-01
Given its conspicuous nature, gravity has been acknowledged by several research lines as a prime factor in structuring the spatial perception of one's environment. One such line of enquiry has focused on errors in spatial localization aimed at the vanishing location of moving objects: it has been systematically reported that humans mislocalize spatial positions forward, in the direction of motion (representational momentum), and downward, in the direction of gravity (representational gravity). Moreover, spatial localization errors were found to evolve dynamically with time in a pattern congruent with an anticipated trajectory (representational trajectory). The present study attempts to ascertain the degree to which vestibular information plays a role in these phenomena. Human observers performed a spatial localization task while tilted to varying degrees, referring to the vanishing locations of targets moving along several directions. A Fourier decomposition of the obtained spatial localization errors revealed that, although spatial errors were increased "downward" mainly along the body's longitudinal axis (idiotropic dominance), the degree of misalignment between the latter and physical gravity modulated the time course of the localization responses. This pattern is surmised to reflect increased uncertainty about the internal model when faced with conflicting cues regarding the perceived "downward" direction.
Swarms of repeating long-period earthquakes at Shishaldin Volcano, Alaska, 2001-2004
Petersen, Tanja
2007-01-01
During 2001–2004, a series of four periods of elevated long-period seismic activity, each lasting about 1–2 months, occurred at Shishaldin Volcano, Aleutian Islands, Alaska. The time periods are termed swarms of repeating events, reflecting an abundance of earthquakes with highly similar waveforms that indicate stable, non-destructive sources. These swarms are characterized by increased earthquake amplitudes, although the seismicity rate of one event every 0.5–5 min has remained more or less constant since Shishaldin last erupted in 1999. A method based on waveform cross-correlation is used to identify highly repetitive events, suggestive of spatially distinct source locations. The waveform analysis shows that several different families of similar events co-exist during a given swarm day, but generally only one large family dominates. A network of hydrothermal fractures may explain the events that do not belong to a dominant repeating event group, i.e. multiple sources at different locations exist next to a dominant source. The dominant waveforms exhibit systematic changes throughout each swarm, but some of these waveforms do reappear over the course of 4 years indicating repeatedly activated source locations. The choked flow model provides a plausible trigger mechanism for the repeating events observed at Shishaldin, explaining the gradual changes in waveforms over time by changes in pressure gradient across a constriction within the uppermost part of the conduit. The sustained generation of Shishaldin's long-period events may be attributed to complex dynamics of a multi-fractured hydrothermal system: the pressure gradient within the main conduit may be regulated by temporarily sealing and reopening of parallel flow pathways, by the amount of debris within the main conduit and/or by changing gas influx into the hydrothermal system. The observations suggest that Shishaldin's swarms of repeating events represent time periods during which a dominant source is activated.
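A hedged sketch of the family-identification step, assuming an array of pre-aligned, filtered waveforms and an illustrative similarity threshold (the study's actual procedure may use lag-searched cross-correlation and different cutoffs):

```python
# Group events into families by normalized waveform correlation.
import numpy as np

def correlation_families(waveforms, cc_min=0.8):
    # waveforms: (n_events, n_samples), aligned on the arrival (assumed)
    w = waveforms - waveforms.mean(axis=1, keepdims=True)
    w /= np.linalg.norm(w, axis=1, keepdims=True)
    cc = w @ w.T                              # zero-lag correlation matrix
    labels = -np.ones(len(w), dtype=int)      # -1 = not yet assigned
    for i in range(len(w)):
        if labels[i] < 0:
            mask = (cc[i] > cc_min) & (labels < 0)
            labels[mask] = i                  # seed a family at each new event
    return labels
```

The dominant family on a given swarm day is then simply the most populous label, and its slow waveform drift can be tracked by correlating successive family stacks.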
Forecasting volcanic air pollution in Hawaii: Tests of time series models
NASA Astrophysics Data System (ADS)
Reikard, Gordon
2012-12-01
Volcanic air pollution, known as vog (volcanic smog), has recently become a major issue in the Hawaiian islands. Vog is caused when volcanic gases react with oxygen and water vapor. It consists of a mixture of gases and aerosols, including sulfur dioxide and sulfate particles. The source of the volcanic gases is the continuing eruption of Mount Kilauea. This paper studies the prediction of vog using statistical methods. The data sets include time series for SO2 and SO4 over locations spanning the west, south and southeast coasts of Hawaii, and the city of Hilo. The forecasting models include regressions, neural networks, and a frequency-domain algorithm. The most typical pattern for the SO2 data is for the frequency-domain method to yield the most accurate forecasts over the first few hours and at the 24 h horizon, with the neural net placing second. For the SO4 data, the results are less consistent. At two sites, the neural net generally yields the most accurate forecasts, except at the 1 and 24 h horizons, where the frequency-domain technique wins narrowly. At one site, the neural net and the frequency-domain algorithm yield comparable errors over the first 5 h, after which the neural net dominates. At the remaining site, the frequency-domain method is more accurate over the first 4 h, after which the neural net achieves smaller errors. For all the series, the average errors are well within one standard deviation of the actual data at all horizons. However, the errors also show irregular outliers. In essence, the models capture the central tendency of the data but are less effective in predicting the extreme events.
Maskens, Carolyn; Downie, Helen; Wendt, Alison; Lima, Ana; Merkley, Lisa; Lin, Yulia; Callum, Jeannie
2014-01-01
This report provides a comprehensive analysis of transfusion errors occurring at a large teaching hospital and aims to determine key errors that are threatening transfusion safety, despite implementation of safety measures. Errors were prospectively identified from 2005 to 2010. Error data were coded on a secure online database called the Transfusion Error Surveillance System. Errors were defined as any deviation from established standard operating procedures. Errors were identified by clinical and laboratory staff. Denominator data for volume of activity were used to calculate rates. A total of 15,134 errors were reported with a median number of 215 errors per month (range, 85-334). Overall, 9083 (60%) errors occurred on the transfusion service and 6051 (40%) on the clinical services. In total, 23 errors resulted in patient harm: 21 of these errors occurred on the clinical services and two in the transfusion service. Of the 23 harm events, 21 involved inappropriate use of blood. Errors with no harm were 657 times more common than events that caused harm. The most common high-severity clinical errors were sample labeling (37.5%) and inappropriate ordering of blood (28.8%). The most common high-severity error in the transfusion service was sample accepted despite not meeting acceptance criteria (18.3%). The cost of product and component loss due to errors was $593,337. Errors occurred at every point in the transfusion process, with the greatest potential risk of patient harm resulting from inappropriate ordering of blood products and errors in sample labeling. © 2013 American Association of Blood Banks (CME).
Robust THP Transceiver Designs for Multiuser MIMO Downlink with Imperfect CSIT
NASA Astrophysics Data System (ADS)
Ubaidulla, P.; Chockalingam, A.
2009-12-01
We present robust joint nonlinear transceiver designs for multiuser multiple-input multiple-output (MIMO) downlink in the presence of imperfections in the channel state information at the transmitter (CSIT). The base station (BS) is equipped with multiple transmit antennas, and each user terminal is equipped with one or more receive antennas. The BS employs Tomlinson-Harashima precoding (THP) for interuser interference precancellation at the transmitter. We consider robust transceiver designs that jointly optimize the transmit THP filters and receive filter for two models of CSIT errors. The first model is a stochastic error (SE) model, where the CSIT error is Gaussian-distributed. This model is applicable when the CSIT error is dominated by channel estimation error. In this case, the proposed robust transceiver design seeks to minimize a stochastic function of the sum mean square error (SMSE) under a constraint on the total BS transmit power. We propose an iterative algorithm to solve this problem. The other model we consider is a norm-bounded error (NBE) model, where the CSIT error can be specified by an uncertainty set. This model is applicable when the CSIT error is dominated by quantization errors. In this case, we consider a worst-case design. For this model, we consider robust (i) minimum SMSE, (ii) MSE-constrained, and (iii) MSE-balancing transceiver designs. We propose iterative algorithms to solve these problems, wherein each iteration involves a pair of semidefinite programs (SDPs). Further, we consider an extension of the proposed algorithm to the case with per-antenna power constraints. We evaluate the robustness of the proposed algorithms to imperfections in CSIT through simulation, and show that the proposed robust designs outperform nonrobust designs as well as robust linear transceiver designs reported in the recent literature.
Kaplan, H S
2005-11-01
Safety and reliability in blood transfusion are not static, but are dynamic non-events. Since performance deviations continually occur in complex systems, their detection and correction must be accomplished over and over again. Non-conformance must be detected early enough to allow for recovery or mitigation. Near-miss events afford early detection of possible system weaknesses and provide an early chance at correction. National event reporting systems, both voluntary and involuntary, have begun to include near-miss reporting in their classification schemes, raising awareness for their detection. MERS-TM is a voluntary safety reporting initiative in transfusion. Currently 22 hospitals submit reports anonymously to a central database, which supports analysis of a hospital's own data and that of an aggregate database. The system encourages reporting of near-miss events, where the patient is protected from receiving an unsuitable or incorrect blood component by a planned or unplanned recovery step. MERS-TM data suggest approximately 90% of events are near-misses, with 10% caught after issue but before transfusion. Near-miss reporting may increase total reports ten-fold. The ratio of near-misses to events with harm is 339:1, consistent with other industries' ratio of 300:1, which has been proposed as a measure of reporting in event reporting systems. Use of a risk matrix and an event's relation to protective barriers allow prioritization of these events. Near-misses recovered by planned barriers occur ten times more frequently than unplanned recoveries. A bedside check of the patient's identity against that on the blood component is an essential, final barrier. How the typical two-person check is performed is critical. Even properly done, this check is ineffective against sampling and testing errors. Blood testing at the bedside just prior to transfusion minimizes the risk of such upstream events. However, even with simple and well-designed devices, training may be a critical issue. Sample errors account for more than half of reported events. The most dangerous miscollection is a blood sample passing acceptance with no previous patient results for comparison. Bar code labels or collection of a second sample may counter this upstream vulnerability. Further upstream barriers have been proposed to counter the precariousness of urgent blood sample collection in a changing, unstable situation. One, a linking device, allows safer labeling of tubes away from the bedside; the second, a forcing function, prevents omission of critical patient identification steps. Errors in the blood bank itself account for 15% of errors, with a high potential severity. In one such event, a component incorrectly issued but safely detected prior to transfusion focused attention on multitasking's contribution to laboratory error. In sum, use of near-miss information, by enhancing barriers supporting error prevention and mitigation, increases our capacity to get the right blood to the right patient.
Butler, Troy; Wildey, Timothy
2018-01-01
In this study, we develop a procedure to utilize error estimates for samples of a surrogate model to compute robust upper and lower bounds on estimates of probabilities of events. We show that these error estimates can also be used in an adaptive algorithm to simultaneously reduce the computational cost and increase the accuracy in estimating probabilities of events using computationally expensive high-fidelity models. Specifically, we introduce the notion of reliability of a sample of a surrogate model, and we prove that utilizing the surrogate model for the reliable samples and the high-fidelity model for the unreliable samples gives precisely the same estimate of the probability of the output event as would be obtained by evaluation of the original model for each sample. The adaptive algorithm uses the additional evaluations of the high-fidelity model for the unreliable samples to locally improve the surrogate model near the limit state, which significantly reduces the number of high-fidelity model evaluations as the limit state is resolved. Numerical results based on a recently developed adjoint-based approach for estimating the error in samples of a surrogate are provided to demonstrate (1) the robustness of the bounds on the probability of an event, and (2) that the adaptive enhancement algorithm provides a more accurate estimate of the probability of the QoI event than standard response surface approximation methods at a lower computational cost.
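The reliability idea can be sketched in a few lines, under assumed interfaces: surrogate(x) returns a value and an error estimate, highfid(x) is the expensive model, and the event is {q > q_crit}. All names are illustrative, not the authors' code:

```python
# Hybrid event-probability estimate using surrogate reliability.
def event_probability(samples, surrogate, highfid, q_crit):
    hits = 0
    for x in samples:
        q, err = surrogate(x)
        if abs(q - q_crit) > abs(err):
            # Reliable: the error bound cannot flip the event indicator
            hits += q > q_crit
        else:
            # Unreliable: fall back to the high-fidelity model
            hits += highfid(x) > q_crit
    return hits / len(samples)
```

Because the indicator of a reliable sample provably matches the high-fidelity indicator, this hybrid reproduces the full high-fidelity Monte Carlo estimate while calling the expensive model only near the limit state.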
An ILP based Algorithm for Optimal Customer Selection for Demand Response in SmartGrids
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuppannagari, Sanmukh R.; Kannan, Rajgopal; Prasanna, Viktor K.
Demand Response (DR) events are initiated by utilities during peak demand periods to curtail consumption. They ensure system reliability and minimize the utility's expenditure. Selection of the right customers and strategies is critical for a DR event. An effective DR scheduling algorithm minimizes the curtailment error, which is the absolute difference between the achieved curtailment value and the target. State-of-the-art heuristics exist for customer selection; however, their curtailment errors are unbounded and can be as high as 70%. In this work, we develop an Integer Linear Programming (ILP) formulation for optimally selecting customers and curtailment strategies that minimize the curtailment error during DR events in SmartGrids. We perform experiments on real-world data obtained from the University of Southern California's SmartGrid and show that our algorithm achieves near-exact curtailment values with errors in the range of 10^-7 to 10^-5, which are within the range of numerical errors. We compare our results against the state-of-the-art heuristic being deployed in practice in the USC SmartGrid. We show that for the same set of available customer-strategy pairs, our algorithm performs 10^3 to 10^7 times better in terms of the curtailment errors incurred.
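A toy version of the selection ILP, using the PuLP package and the standard two-variable linearization of the absolute curtailment error; the data, names, and one-strategy-per-customer constraint are illustrative assumptions, not the paper's exact formulation:

```python
# Toy customer/strategy selection ILP minimizing |achieved - target|.
import pulp

# curtail[(customer, strategy)] = predicted curtailment (assumed values)
curtail = {(0, 0): 12.0, (0, 1): 7.5, (1, 0): 9.0, (2, 0): 4.2}
target = 18.0

prob = pulp.LpProblem("dr_selection", pulp.LpMinimize)
x = {k: pulp.LpVariable(f"x_{k[0]}_{k[1]}", cat="Binary") for k in curtail}
over = pulp.LpVariable("over", lowBound=0)
under = pulp.LpVariable("under", lowBound=0)

achieved = pulp.lpSum(curtail[k] * x[k] for k in curtail)
prob += over + under                          # objective: |achieved - target|
prob += achieved - target == over - under     # linearized absolute value
for i in {k[0] for k in curtail}:             # at most one strategy/customer
    prob += pulp.lpSum(x[k] for k in curtail if k[0] == i) <= 1

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print(pulp.value(achieved), pulp.value(over + under))
```

The real formulation operates over many thousands of customer-strategy pairs, but the objective and linearization carry over unchanged.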
NASA Astrophysics Data System (ADS)
Chen, Y.; Xu, X.
2017-12-01
Broadband Lg 1/Q tomographic models in eastern Eurasia are inverted from source- and site-corrected path 1/Q data. The path 1/Q values are measured between stations (or events) by the two-station (TS), reverse two-station (RTS) and reverse two-event (RTE) methods, respectively. Because path 1/Q values are computed using the logarithm of the product of observed spectral ratios and a simplified 1D geometrical spreading correction, they are subject to "modeling errors" dominated by uncompensated 3D structural effects. We found in Chen and Xie [2017] that these errors closely follow a normal distribution after the long-tailed outliers are screened out (similar to teleseismic travel time residuals). We thus rigorously analyze the statistics of these errors collected from repeated samplings of station (and event) pairs from 1.0 to 10.0 Hz and reject about 15% of outliers at each frequency band. The resultant variance of Δ(1/Q) decreases with frequency as 1/f². The 1/Q tomography using screened data is now a stochastic inverse problem whose solutions approximate the means of Gaussian random variables, and the model covariance matrix is that of Gaussian variables with well-known statistical behavior. We adopt a new SVD-based tomographic method to solve for the 2D Q image together with its resolution and covariance matrices. The RTS and RTE methods yield the most reliable 1/Q data, free of source and site effects, but the path coverage is rather sparse due to the very strict recording geometry. The TS method absorbs the effects of non-unit site response ratios into the 1/Q data. The RTS method also yields site responses, which can then be corrected from the path 1/Q of TS to make them also free of site effects. The site-corrected TS data substantially improve path coverage, allowing us to solve for 1/Q tomography up to 6.0 Hz. The model resolution and uncertainty are quantitatively assessed by spread functions (computed from the resolution matrix) and the covariance matrix. The reliably retrieved Q models correlate well with the distinct tectonic blocks featured by the most recent major deformations and vary with frequency. With the 1/Q tomographic model and its covariance matrix, we can formally estimate the uncertainty of any path-specific Lg 1/Q prediction. This new capability significantly benefits source estimation, for which a reliable uncertainty estimate is especially important.
ERIC Educational Resources Information Center
Flouri, Eirini; Panourgia, Constantina
2011-01-01
The aim of this study was to test whether negative cognitive errors (overgeneralizing, catastrophizing, selective abstraction, and personalizing) mediate the moderator effect of non-verbal cognitive ability on the association between adverse life events (life stress) and emotional and behavioral problems in adolescence. The sample consisted of 430…
Time Here, Time There, Time Everywhere: Teaching Young Children Time through Daily Routine
ERIC Educational Resources Information Center
Lee, Joohi; Lee, Joo Ok; Fox, Jill
2009-01-01
According to Piaget, 5- or 6-year-old children gradually acquire the concept of time based on events (Piaget, 1969). In his experiment of investigating children's time concepts, Piaget found that children of these ages were able to place pictures based on sequential events with some errors; the younger children made more errors. The National…
Radiation Tests on 2Gb NAND Flash Memories
NASA Technical Reports Server (NTRS)
Nguyen, Duc N.; Guertin, Steven M.; Patterson, J. D.
2006-01-01
We report on SEE and TID tests of highly scaled Samsung 2-Gbit flash memories. Both in-situ and biased interval irradiations were used to characterize the total accumulated dose failure response. The radiation-induced failures can be categorized as follows: single event upset (SEU) read errors in biased and unbiased modes, write errors, and single-event functional interrupt (SEFI) failures.
Disclosure of adverse events and errors in surgical care: challenges and strategies for improvement.
Lipira, Lauren E; Gallagher, Thomas H
2014-07-01
The disclosure of adverse events to patients, including those caused by medical errors, is a critical part of patient-centered healthcare and a fundamental component of patient safety and quality improvement. Disclosure benefits patients, providers, and healthcare institutions. However, the act of disclosure can be difficult for physicians. Surgeons struggle with disclosure in unique ways compared with other specialties, and disclosure in the surgical setting has specific challenges. The frequency of surgical adverse events along with a dysfunctional tort system, the team structure of surgical staff, and obstacles created inadvertently by existing surgical patient safety initiatives may contribute to an environment not conducive to disclosure. Fortunately, there are multiple strategies to address these barriers. Participation in communication and resolution programs, integration of Just Culture principles, surgical team disclosure planning, refinement of informed consent and morbidity and mortality processes, surgery-specific professional standards, and understanding the complexities of disclosing other clinicians' errors all have the potential to help surgeons provide patients with complete, satisfactory disclosures. Improvement in the regularity and quality of disclosures after surgical adverse events and errors will be key as the field of patient safety continues to advance.
Sandberg, Warren S; Häkkinen, Matti; Egan, Marie; Curran, Paige K; Fairbrother, Pamela; Choquette, Ken; Daily, Bethany; Sarkka, Jukka-Pekka; Rattner, David
2005-09-01
When procedures and processes to assure patient location based on human performance do not work as expected, patients are brought incrementally closer to a possible "wrong patient-wrong procedure" error. We developed a system for automated patient location monitoring and management. Real-time data from an active infrared/radio frequency identification tracking system provides patient location data that are robust and can be compared with an "expected process" model to automatically flag wrong-location events as soon as they occur. The system also generates messages that are automatically sent to process managers via the hospital paging system, thus creating an active alerting function to annunciate errors. We deployed the system to detect and annunciate "patient-in-wrong-OR" events. The system detected all "wrong-operating room (OR)" events, and all "wrong-OR" locations were correctly assigned within 0.50 ± 0.28 minutes (mean ± SD). This corresponded to the measured latency of the tracking system. All wrong-OR events were correctly annunciated via the paging function. This experiment demonstrates that current technology can automatically collect sufficient data to remotely monitor patient flow through a hospital, provide decision support based on predefined rules, and automatically notify stakeholders of errors.
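A toy sketch of the "expected process" comparison at the core of that system; the schedule dictionary, event signature, and paging callback are invented placeholders, not the hospital's actual interfaces.

    # Expected-process model: which OR each patient is booked into (illustrative).
    booked_or = {"patient_17": "OR-3", "patient_22": "OR-1"}

    def on_location_event(patient_id, location, page):
        """Flag a wrong-OR event as soon as a tracked patient enters any OR
        other than the one booked for them, and page the process manager."""
        expected = booked_or.get(patient_id)
        if expected and location.startswith("OR-") and location != expected:
            page(f"WRONG-OR: {patient_id} entered {location}, expected {expected}")

    # Stand-in pager that just prints the alert message.
    on_location_event("patient_17", "OR-5", page=lambda msg: print(msg))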
A mediation skills model to manage disclosure of errors and adverse events to patients.
Liebman, Carol B; Hyman, Chris Stern
2004-01-01
In 2002 Pennsylvania became the first state to impose on hospitals a statutory duty to notify patients in writing of a serious event. If the disclosure conversations are carefully planned, properly executed, and responsive to patients' needs, this new requirement creates possible benefits for both patient safety and litigation risk management. This paper describes a model for accomplishing these goals that encourages health care providers to communicate more effectively with patients following an adverse event or medical error, learn from mistakes, respond to the concerns of patients and families after an adverse event, and arrive at a fair and cost-effective resolution of valid claims.
Methods developed to elucidate nursing related adverse events in Japan.
Yamagishi, Manaho; Kanda, Katsuya; Takemura, Yukie
2003-05-01
Financial resources for quality assurance in Japanese hospitals are limited, and few hospitals have systems for monitoring the quality of nursing services. Recently, however, the necessity of such monitoring has been recognized. This study cost-effectively used adverse event occurrence rates as indicators of the quality of nursing service, and audited methods of collecting data on adverse events to elucidate their approximate true numbers. Data collection was conducted in July, August and November 2000 at a hospital in Tokyo that administered both primary and secondary health care services (281 beds, six wards, average length of stay 23 days). We collected adverse events through incident reports, logs, check-lists, nurse interviews, medication error questionnaires, urine leucocyte tests, patient interviews and medical records. Adverse events included unplanned removals of invasive lines, medication errors, falls, pressure sores, skin deficiencies, physical restraints, and nosocomial infections. After evaluating the time and useful outcomes of each source, it became clear that we could elucidate adverse events most consistently and cost-effectively through incident reports, check-lists, nurse interviews, urine leucocyte tests and medication error questionnaires. This study suggests that many hospitals in Japan could monitor the quality of the nursing service using these sources.
Irradiation setup at the U-120M cyclotron facility
NASA Astrophysics Data System (ADS)
Křížek, F.; Ferencei, J.; Matlocha, T.; Pospíšil, J.; Príbeli, P.; Raskina, V.; Isakov, A.; Štursa, J.; Vaňát, T.; Vysoká, K.
2018-06-01
This paper describes parameters of the proton beams provided by the U-120M cyclotron and the related irradiation setup at the open-access irradiation facility of the Nuclear Physics Institute of the Czech Academy of Sciences. The facility is suitable for testing the radiation hardness of various electronic components. The use of the setup is illustrated by a measurement of the rate of errors caused by Single Event Transients in an SRAM-based Xilinx XC3S200 FPGA. This measurement provides an estimate of the expected occurrence of Single Event Transients. The data suggest that the variation of the Single Event Effect error rate across clock phase shifts is not significant enough for clock-phase alignment with the beam to serve as a fault-mitigation technique.
Mazur, E; Wolchik, S A; Virdin, L; Sandler, I N; West, S G
1999-01-01
This study examined whether children's cognitive appraisal biases moderate the impact of stressful divorce-related events on psychological adjustment in 355 children ages 9 to 12, whose families had experienced divorce within the past 2 years. Multiple regression indicated that endorsement of negative cognitive errors for hypothetical divorce events moderates the relations between stressful divorce events and self- and maternal reports of internalizing and externalizing symptoms, but only for older children. Positive illusions buffer the effects of stressful divorce events on child-reported depression and mother-reported externalizing problems. Implications of these results for theories of stress and coping, as well as for interventions for children of divorced families, are discussed.
Epinephrine Auto-Injector Versus Drawn Up Epinephrine for Anaphylaxis Management: A Scoping Review.
Chime, Nnenna O; Riese, Victoria G; Scherzer, Daniel J; Perretta, Julianne S; McNamara, LeAnn; Rosen, Michael A; Hunt, Elizabeth A
2017-08-01
Anaphylaxis is a life-threatening event. Most clinical symptoms of anaphylaxis can be reversed by prompt intramuscular administration of epinephrine using an auto-injector or epinephrine drawn up in a syringe, and delays and errors may be fatal. The aim of this scoping review is to identify and compare errors associated with use of epinephrine drawn up in a syringe versus epinephrine auto-injectors in order to assist hospitals as they choose which approach minimizes the risk of adverse events for their patients. PubMed, Embase, CINAHL, Web of Science, and the Cochrane Library were searched using terms agreed a priori. We reviewed human and simulation studies reporting errors associated with the use of epinephrine in anaphylaxis. There were multiple screening stages with evolving feedback. Each study was independently assessed by two reviewers for eligibility. Data were extracted using an instrument modeled on that of Zaza et al and grouped into themes. Three main themes were noted: 1) ergonomics, 2) dosing errors, and 3) errors due to route of administration. Significant knowledge gaps in the operation of epinephrine auto-injectors among healthcare providers, patients, and caregivers were identified. For epinephrine in a syringe, there were more frequent reports of incorrect dosing and erroneous IV administration with associated adverse cardiac events. For the epinephrine auto-injector, unintentional administration to the digit was an error reported on multiple occasions. This scoping review highlights knowledge gaps and a diverse set of errors regardless of the approach to epinephrine preparation during management of anaphylaxis. There are more potentially life-threatening errors reported for epinephrine drawn up in a syringe than with the auto-injectors. The impact of these knowledge gaps and potentially fatal errors on patient outcomes, cost, and quality of care is worthy of further investigation.
Accuracy assessment of high-rate GPS measurements for seismology
NASA Astrophysics Data System (ADS)
Elosegui, P.; Davis, J. L.; Ekström, G.
2007-12-01
Analysis of GPS measurements with a controlled laboratory system, built to simulate the ground motions caused by tectonic earthquakes and other transient geophysical signals such as glacial earthquakes, enables us to assess the technique of high-rate GPS. The root-mean-square (rms) position error of this system when undergoing realistic simulated seismic motions is 0.05 mm, with maximum position errors of 0.1 mm, thus providing "ground truth" GPS displacements. We have acquired an extensive set of high-rate GPS measurements while inducing seismic motions on a GPS antenna mounted on this system with a temporal spectrum similar to real seismic events. We found that, for a particular 15-min-long test event, the rms error of the 1-Hz GPS position estimates was 2.5 mm, with maximum position errors of 10 mm, and the error spectrum of the GPS estimates was approximately flicker noise. These results may however represent a best-case scenario since they were obtained over a short (~10 m) baseline, thereby greatly mitigating baseline-dependent errors, and when the number and distribution of satellites on the sky was good. For example, we have determined that the rms error can increase by a factor of 2-3 as the GPS constellation changes throughout the day, with an average value of 3.5 mm for eight identical, hourly-spaced, consecutive test events. The rms error also increases with increasing baseline, as one would expect, with an average rms error for a ~1400 km baseline of 9 mm. We will present an assessment of the accuracy of high-rate GPS based on these measurements, discuss the implications of this study for seismology, and describe new applications in glaciology.
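A small numpy sketch of the two summary statistics used above: the rms of a position-residual series and the log-log slope of its error spectrum (a slope near 0 indicates white noise, near -1 flicker noise). The residuals here are synthetic white noise, not the actual GPS estimates.

    import numpy as np

    rng = np.random.default_rng(1)
    fs = 1.0                                    # 1-Hz GPS position estimates
    resid = rng.normal(scale=2.5e-3, size=900)  # 15 min of residuals (m)
    rms = np.sqrt(np.mean(resid**2))            # ~2.5 mm for this series

    f = np.fft.rfftfreq(resid.size, d=1.0 / fs)[1:]
    psd = (np.abs(np.fft.rfft(resid))**2 / (fs * resid.size))[1:]
    slope = np.polyfit(np.log(f), np.log(psd), 1)[0]  # ~0 here; ~-1 for flicker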
Progress in the improved lattice calculation of direct CP-violation in the Standard Model
NASA Astrophysics Data System (ADS)
Kelly, Christopher
2018-03-01
We discuss the ongoing effort by the RBC & UKQCD collaborations to improve our lattice calculation of the measure of Standard Model direct CP violation, ɛ', with physical kinematics. We present our progress in decreasing the (dominant) statistical error and discuss other related activities aimed at reducing the systematic errors.
Effects of Tropospheric Spatio-Temporal Correlated Noise on the Analysis of Space Geodetic Data
NASA Technical Reports Server (NTRS)
Romero-Wolf, A.; Jacobs, C. S.; Ratcliff, J. T.
2012-01-01
The standard VLBI analysis models the distribution of measurement noise as Gaussian. Because the price of recording bits is steadily decreasing, thermal errors will soon no longer dominate. As a result, it is expected that troposphere and instrumentation/clock errors will increasingly become dominant. Given that both of these errors have correlated spectra, properly modeling the error distributions will become increasingly relevant for optimal analysis. We discuss the advantages of modeling the correlations between tropospheric delays using a Kolmogorov spectrum and the frozen-flow assumption pioneered by Treuhaft and Lanyi. We then apply these correlated noise spectra to the weighting of VLBI data analysis for two case studies: X/Ka-band global astrometry and Earth orientation. In both cases we see improved results when the analyses are weighted with correlated noise models vs. the standard uncorrelated models. The X/Ka astrometric scatter improved by approx. 10% and the systematic Δδ vs. δ slope decreased by approx. 50%. The TEMPO Earth orientation results improved by 17% in baseline transverse and 27% in baseline vertical.
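A compact sketch of the underlying idea, weighting a least-squares fit with a full (correlated) noise covariance instead of a diagonal one; the exponential covariance kernel below is a simple stand-in for the Kolmogorov/frozen-flow troposphere model, and all sizes and values are illustrative.

    import numpy as np

    rng = np.random.default_rng(2)
    t = np.arange(200.0)                        # observation epochs (s)
    A = np.vander(t, 2)                         # design matrix: rate and offset
    C = 4e-6 * np.exp(-np.abs(t[:, None] - t[None, :]) / 20.0)  # correlated noise
    y = A @ np.array([1e-4, 0.01]) + rng.multivariate_normal(np.zeros(t.size), C)

    Ci = np.linalg.inv(C)
    x_gls = np.linalg.solve(A.T @ Ci @ A, A.T @ Ci @ y)  # correlated weighting
    x_ols = np.linalg.lstsq(A, y, rcond=None)[0]         # standard uncorrelated fit

Repeating the comparison over many noise draws shows the correlated weighting yields lower-variance parameter estimates, which is the sense in which the weighted analyses above improve.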
Indels, structural variation, and recombination drive genomic diversity in Plasmodium falciparum
Miles, Alistair; Iqbal, Zamin; Vauterin, Paul; Pearson, Richard; Campino, Susana; Theron, Michel; Gould, Kelda; Mead, Daniel; Drury, Eleanor; O'Brien, John; Ruano Rubio, Valentin; MacInnis, Bronwyn; Mwangi, Jonathan; Samarakoon, Upeka; Ranford-Cartwright, Lisa; Ferdig, Michael; Hayton, Karen; Su, Xin-zhuan; Wellems, Thomas; Rayner, Julian; McVean, Gil; Kwiatkowski, Dominic
2016-01-01
The malaria parasite Plasmodium falciparum has a great capacity for evolutionary adaptation to evade host immunity and develop drug resistance. Current understanding of parasite evolution is impeded by the fact that a large fraction of the genome is either highly repetitive or highly variable and thus difficult to analyze using short-read sequencing technologies. Here, we describe a resource of deep sequencing data on parents and progeny from genetic crosses, which has enabled us to perform the first genome-wide, integrated analysis of SNP, indel and complex polymorphisms, using Mendelian error rates as an indicator of genotypic accuracy. These data reveal that indels are exceptionally abundant, being more common than SNPs and thus the dominant mode of polymorphism within the core genome. We use the high density of SNP and indel markers to analyze patterns of meiotic recombination, confirming a high rate of crossover events and providing the first estimates for the rate of non-crossover events and the length of conversion tracts. We observe several instances of meiotic recombination within copy number variants associated with drug resistance, demonstrating a mechanism whereby fitness costs associated with resistance mutations could be compensated and greater phenotypic plasticity could be acquired. PMID:27531718
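A toy illustration of the Mendelian-error idea used above as a genotypic-accuracy indicator: in a haploid cross such as P. falciparum's, a progeny call is an error whenever it matches neither parental allele at that site. The function and sequences are invented for illustration.

    def mendelian_error_rate(parent1, parent2, progeny):
        """Fraction of callable sites where the progeny allele matches
        neither parent (None marks a missing call and is skipped)."""
        checked = errors = 0
        for a1, a2, p in zip(parent1, parent2, progeny):
            if None in (a1, a2, p):
                continue
            checked += 1
            if p not in (a1, a2):
                errors += 1
        return errors / checked if checked else float("nan")

    p1 = ["A", "C", "G", "T", "A"]
    p2 = ["A", "C", "T", "T", "C"]
    kid = ["A", "C", "G", None, "C"]
    rate = mendelian_error_rate(p1, p2, kid)    # 0.0 over the 4 callable sites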
Poster Presentation: Optical Test of NGST Developmental Mirrors
NASA Technical Reports Server (NTRS)
Hadaway, James B.; Geary, Joseph; Reardon, Patrick; Peters, Bruce; Keidel, John; Chavers, Greg
2000-01-01
An Optical Testing System (OTS) has been developed to measure the figure and radius of curvature of NGST developmental mirrors in the vacuum, cryogenic environment of the X-Ray Calibration Facility (XRCF) at Marshall Space Flight Center (MSFC). The OTS consists of a WaveScope Shack-Hartmann sensor from Adaptive Optics Associates as the main instrument, a Point Diffraction Interferometer (PDI), a Point Spread Function (PSF) imager, an alignment system, a Leica Disto Pro distance measurement instrument, and a laser source palette (632.8 nm wavelength) that is fiber-coupled to the sensor instruments. All of the instruments except the laser source palette are located on a single breadboard known as the Wavefront Sensor Pallet (WSP). The WSP is located on top of a 5-DOF motion system located at the center of curvature of the test mirror. Two PCs are used to control the OTS. The error in the figure measurement is dominated by the WaveScope's measurement error. An analysis using the absolute wavefront gradient error of 1/50 wave P-V (at 0.6328 microns) provided by the manufacturer leads to a total surface figure measurement error of approximately 1/100 wave rms. This easily meets the requirement of 1/10 wave P-V. The error in radius of curvature is dominated by the Leica's absolute measurement error of ±1.5 mm and the focus setting error of ±1.4 mm, giving an overall error of ±2 mm. The OTS is currently being used to test the NGST Mirror System Demonstrators (NMSDs) and the Subscale Beryllium Mirror Demonstrator (SBMD).
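The ±2 mm radius-of-curvature figure follows from combining the two independent terms in quadrature; a one-line check using the values reconstructed above:

    import math

    leica_error_mm = 1.5                 # absolute distance-measurement error
    focus_error_mm = 1.4                 # focus setting error
    total_mm = math.hypot(leica_error_mm, focus_error_mm)   # ~2.05 mm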
Bioethics for clinicians: 23. Disclosure of medical error
Hébert, Philip C.; Levin, Alex V.; Robertson, Gerald
2001-01-01
ADVERSE EVENTS AND MEDICAL ERRORS ARE NOT UNCOMMON. In this article we review the literature on such events and discuss the ethical, legal and practical aspects of whether and how they should be disclosed to patients. Ethics, professional policy and the law, as well as the relevant empirical literature, suggest that timely and candid disclosure should be standard practice. Candour about error may lessen, rather than increase, the medicolegal liability of the health care professionals and may help to alleviate the patient's concerns. Guidelines for disclosure to patients, and their families if necessary, are proposed. PMID:11233873
Atmospheric lifetime of SF5CF3
NASA Astrophysics Data System (ADS)
Takahashi, K.; Nakayama, T.; Matsumi, Y.; Solomon, S.; Gejo, T.; Shigemasa, E.; Wallington, T. J.
2002-08-01
The vacuum ultraviolet (VUV) absorption spectrum of SF5CF3 was measured over the range 106-200 nm. At 121.6 nm, σ(base e) = (7.8 ± 0.6) × 10⁻¹⁸ cm² molecule⁻¹, where the quoted uncertainty includes two standard deviations from the least-squares fit of the Beer-Lambert plot and our estimate of potential systematic errors associated with measurements of the reactant concentrations. The VUV spectrum and literature data for electron attachment and ion-molecule reactions were incorporated into a model of the stratosphere, mesosphere, and lower thermosphere. This information provides better constraints on the atmospheric lifetime and hence on the potential of this highly radiatively active trace gas to influence the climate system. The atmospheric lifetime of SF5CF3 is dominated by dissociative electron attachment and is estimated to be approximately 950 years. Solar proton events could reduce this to a lower limit of 650 years.
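The cross section is the slope of a Beer-Lambert plot, ln(I0/I) = sigma * N * L; a synthetic numpy sketch of that least-squares fit (column densities and scatter are invented, not the measured SF5CF3 data):

    import numpy as np

    rng = np.random.default_rng(3)
    NL = np.linspace(1e15, 1e17, 10)        # column density N*L (molecule cm^-2)
    sigma_true = 7.8e-18                    # cm^2 molecule^-1 at 121.6 nm
    ln_I0_over_I = sigma_true * NL + rng.normal(0.0, 0.02, NL.size)
    sigma_fit, intercept = np.polyfit(NL, ln_I0_over_I, 1)  # slope ~ sigma_true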
Crucial steps to life: From chemical reactions to code using agents.
Witzany, Guenther
2016-02-01
The concepts of the origin of the genetic code and the definitions of life changed dramatically after the RNA world hypothesis. Main narratives in molecular biology and genetics such as the "central dogma," "one gene one protein" and "non-coding DNA is junk" have since been falsified. RNA moved from being a transient intermediate molecule to centre stage. Additionally, the abundance of empirical data concerning non-random genetic change operators, such as the variety of mobile genetic elements, persistent viruses and defectives, does not fit the dominant narrative of replication-error events (mutations) as the main driving force creating genetic novelty and diversity. The reductionistic and mechanistic views on the physico-chemical properties of the genetic code are no longer convincing as appropriate descriptions of the abundance of non-random genetic content operators which are active in natural genetic engineering and natural genome editing. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Marshall, Cheryl J.; Marshall, Paul W.
1999-01-01
This portion of the Short Course is divided into two segments to separately address the two major proton-related effects confronting satellite designers: ionization effects and displacement damage effects. While both of these topics are deeply rooted in "traditional" descriptions of space radiation effects, there are several factors at play to cause renewed concern for satellite systems being designed today. For example, emphasis on Commercial Off-The-Shelf (COTS) technologies in both commercial and government systems increases both Total Ionizing Dose (TID) and Single Event Effect (SEE) concerns. Scaling trends exacerbate the problems, especially with regard to SEEs where protons can dominate soft error rates and even cause destructive failure. In addition, proton-induced displacement damage at fluences encountered in natural space environments can cause degradation in modern bipolar circuitry as well as in many emerging electronic and opto-electronic technologies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herberger, Sarah M.; Boring, Ronald L.
Abstract Objectives: This paper discusses the differences between classical human reliability analysis (HRA) dependence and the full spectrum of probabilistic dependence. Positive influence suggests an error increases the likelihood of subsequent errors, or success increases the likelihood of subsequent success. Currently the typical method for dependence in HRA implements the Technique for Human Error Rate Prediction (THERP) positive dependence equations. This assumes that the dependence between two human failure events varies at discrete levels between zero and complete dependence (as defined by THERP). Dependence in THERP does not consistently span dependence values between 0 and 1. In contrast, probabilistic dependence employs Bayes' Law and addresses a continuous range of dependence. Methods: Using the laws of probability, complete dependence and maximum positive dependence do not always agree. Maximum dependence is when two events overlap to their fullest amount. Maximum negative dependence is the smallest amount that two events can overlap. When the minimum probability of two events overlapping is less than independence, negative dependence occurs. For example, negative dependence is when an operator fails to actuate Pump A, thereby increasing his or her chance of actuating Pump B. The initial error actually increases the chance of subsequent success. Results: Comparing THERP and probability theory yields different results in certain scenarios, with the latter addressing negative dependence. Given that most human failure events are rare, the minimum overlap is typically 0, and when the second event is smaller than the first event the maximum dependence is less than 1, as defined by Bayes' Law. As such, alternative dependence equations are provided along with a look-up table defining the maximum and maximum negative dependence given the probabilities of two events. Conclusions: THERP dependence has been used ubiquitously for decades, and has provided approximations of the dependencies between two events. Since its inception, computational abilities have increased exponentially, and alternative approaches that follow the laws of probability dependence need to be implemented. These new approaches need to consider negative dependence and identify when THERP output is not appropriate.
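A small sketch of the probability bounds at issue, the Fréchet limits on a joint probability given only the two marginals; it reproduces the paper's point that when the second event is rarer than the first, the maximum conditional dependence P(B|A) falls below 1.

    def dependence_bounds(p_a, p_b):
        """Frechet limits on P(A and B) given the marginal probabilities."""
        joint_min = max(0.0, p_a + p_b - 1.0)   # maximum negative dependence
        joint_max = min(p_a, p_b)               # maximum positive dependence
        return joint_min, joint_max

    p_a, p_b = 1e-3, 5e-4                       # two rare human failure events
    jmin, jmax = dependence_bounds(p_a, p_b)    # (0.0, 5e-4)
    max_conditional = jmax / p_a                # P(B|A) = 0.5, so "complete
                                                # dependence" is unattainable here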
Dynamic dominance varies with handedness: reduced interlimb asymmetries in left-handers
Przybyla, Andrzej; Good, David C.; Sainburg, Robert L.
2013-01-01
Our previous studies of interlimb asymmetries during reaching movements have given rise to the dynamic-dominance hypothesis of motor lateralization. This hypothesis proposes that dominant arm control has become optimized for efficient intersegmental coordination, which is often associated with straight and smooth hand-paths, while non-dominant arm control has become optimized for controlling steady-state posture, which has been associated with greater final position accuracy when movements are mechanically perturbed, and often during movements made in the absence of visual feedback. The basis for this model of motor lateralization was derived from studies conducted in right-handed subjects. We now ask whether left-handers show similar proficiencies in coordinating reaching movements. We recruited right- and left-handers (20 per group) to perform reaching movements to three targets, in which intersegmental coordination requirements varied systematically. Our results showed that the dominant arm of both left- and right-handers were well coordinated, as reflected by fairly straight hand-paths and low errors in initial direction. Consistent with our previous studies, the non-dominant arm of right-handers showed substantially greater curvature and large errors in initial direction, most notably to targets that elicited higher intersegmental interactions. While the right, non-dominant, hand-paths of left-handers were slightly more curved than those of the dominant arm, they were also substantially more accurate and better coordinated than the non-dominant arm of right-handers. Our results indicate a similar pattern, but reduced lateralization for intersegmental coordination in left-handers. These findings suggest that left-handers develop more coordinated control of their non-dominant arms than right-handers, possibly due to environmental pressure for right-handed manipulations. PMID:22113487
Partnerships With Aviation: Promoting a Culture of Safety in Health Care.
Skinner, Lori; Tripp, Terrance R; Scouler, David; Pechacek, Judith M
2015-01-01
According to the Institute of Medicine (IOM, 1999, p. 1), "Medical errors can be defined as the failure of a planned action to be completed as intended or the use of a wrong plan to achieve an aim." The current health care culture is disjointed, as evidenced by a lack of consistent reporting standards for all providers; provider licensing pays little attention to errors, and there are no financial incentives to improve safety (IOM, 1999). Many errors in health care are preventable. "Near misses" and adverse events that do occur can offer insight on how to improve practice and prevent future events. The aim of this article is to better understand underreporting of errors in health care, to present a model of change that increases voluntary error reporting, and to discuss the role nurse executives play in creating a culture of safety. This article explores how high reliability organizations such as aviation improve safety through enhanced error reporting, culture change, and teamwork.
Characterizing the SWOT discharge error budget on the Sacramento River, CA
NASA Astrophysics Data System (ADS)
Yoon, Y.; Durand, M. T.; Minear, J. T.; Smith, L.; Merry, C. J.
2013-12-01
The Surface Water and Ocean Topography (SWOT) mission is an upcoming satellite mission (planned for 2020) that will provide surface-water elevation and surface-water extent globally. One goal of SWOT is the estimation of river discharge directly from SWOT measurements. SWOT discharge uncertainty is due to two sources. First, SWOT cannot directly measure the channel bathymetry and roughness coefficient data necessary for discharge calculations; these parameters must be estimated from the measurements or from a priori information. Second, SWOT measurement errors directly impact the discharge estimate accuracy. This study focuses on characterizing parameter and measurement uncertainties for SWOT river discharge estimation. A Bayesian Markov Chain Monte Carlo scheme is used to calculate parameter estimates, given the measurements of river height, slope and width, and mass and momentum constraints. The algorithm is evaluated using simulated observations from both SWOT and AirSWOT (the airborne version of SWOT) over seven reaches (about 40 km) of the Sacramento River. The SWOT and AirSWOT observations are simulated by corrupting the 'true' HEC-RAS hydraulic modeling results with the instrument error. This experiment shows how unknown bathymetry and roughness coefficients affect the accuracy of the river discharge algorithm. In the experiment, the discharge error budget is almost completely dominated by unknown bathymetry and roughness; 81% of the error variance is explained by uncertainties in bathymetry and roughness. Second, we show how errors in water surface, slope, and width observations influence the accuracy of discharge estimates. Indeed, there is a significant sensitivity to water-surface, slope, and width errors due to the sensitivity of bathymetry and roughness to measurement errors. Increasing water-surface error above 10 cm leads to a correspondingly sharper increase of errors in bathymetry and roughness. Increasing slope error above 1.5 cm/km leads to a significant degradation due to direct error in the discharge estimates. As the width error increases past 20%, the discharge error budget is dominated by the width error. The above two experiments are performed for AirSWOT scenarios. In addition, we explore the sensitivity of the algorithm to the SWOT scenarios.
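A toy Metropolis sketch in the spirit of that Bayesian scheme: unknown bathymetry (base cross-sectional area A0) and Manning roughness n are sampled so that the Manning discharge computed independently for each reach agrees, a mass-conservation constraint. Every number below is synthetic, and the real algorithm's priors, flow law, and momentum constraints follow the paper rather than this sketch.

    import numpy as np

    rng = np.random.default_rng(4)
    n_true, A0_true, q_true = 0.03, 300.0, 800.0
    w = rng.uniform(60.0, 100.0, 7)             # reach widths (m)
    dA = rng.uniform(20.0, 40.0, 7)             # height-derived area anomaly (m^2)
    # Back out slopes so the synthetic reaches share one true discharge.
    S = (q_true * n_true * w**(2.0 / 3.0) / (A0_true + dA)**(5.0 / 3.0))**2

    def reach_q(n, A0):
        # Manning, wide rectangular channel: Q = A^(5/3) S^(1/2) / (n w^(2/3))
        a = A0 + dA
        return a**(5.0 / 3.0) * np.sqrt(S) / (n * w**(2.0 / 3.0))

    def log_post(n, A0):
        if not (0.01 < n < 0.1 and 50.0 < A0 < 1000.0):
            return -np.inf                      # flat priors within bounds
        return -0.5 * np.var(np.log(reach_q(n, A0))) / 0.02**2

    theta, lp = np.array([0.05, 150.0]), -np.inf
    samples = []
    for _ in range(20000):                      # Metropolis random walk
        prop = theta + rng.normal(0.0, [0.002, 10.0])
        lp_prop = log_post(*prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        samples.append(theta.copy())

Because n and A0 trade off inside Manning's equation, the chain explores a ridge rather than a point, which mirrors the finding above that unknown bathymetry and roughness dominate the discharge error budget.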
Does a better model yield a better argument? An info-gap analysis
NASA Astrophysics Data System (ADS)
Ben-Haim, Yakov
2017-04-01
Theories, models and computations underlie reasoned argumentation in many areas. The possibility of error in these arguments, though of low probability, may be highly significant when the argument is used in predicting the probability of rare high-consequence events. This implies that the choice of a theory, model or computational method for predicting rare high-consequence events must account for the probability of error in these components. However, error may result from lack of knowledge or surprises of various sorts, and predicting the probability of error is highly uncertain. We show that the putatively best, most innovative and sophisticated argument may not actually have the lowest probability of error. Innovative arguments may entail greater uncertainty than more standard but less sophisticated methods, creating an innovation dilemma in formulating the argument. We employ info-gap decision theory to characterize and support the resolution of this problem and present several examples.
Phillips, R; Bartholomew, L; Dovey, S; Fryer, G; Miyoshi, T; Green, L
2004-01-01
Background: The epidemiology, risks, and outcomes of errors in primary care are poorly understood. Malpractice claims brought for negligent adverse events offer a useful insight into errors in primary care. Methods: Physician Insurers Association of America malpractice claims data (1985–2000) were analyzed for proportions of negligent claims by primary care specialty, setting, severity, health condition, and attributed cause. We also calculated risks of a claim for condition-specific negligent events relative to the prevalence of those conditions in primary care. Results: Of 49 345 primary care claims, 26 126 (53%) were peer reviewed and 5921 (23%) were assessed as negligent; 68% of claims were for negligent events in outpatient settings. No single condition accounted for more than 5% of all negligent claims, but the underlying causes were more clustered with "diagnosis error" making up one third of claims. The ratios of condition-specific negligent event claims relative to the frequency of those conditions in primary care revealed a significantly disproportionate risk for a number of conditions (for example, appendicitis was 25 times more likely to generate a claim for negligence than breast cancer). Conclusions: Claims data identify conditions and processes where primary health care in the United States is prone to go awry. The burden of severe outcomes and death from malpractice claims made against primary care physicians was greater in primary care outpatient settings than in hospitals. Although these data enhance information about error related negligent events in primary care, particularly when combined with other primary care data, there are many operating limitations. PMID:15069219
Multiconfiguration Pair-Density Functional Theory Is Free From Delocalization Error.
Bao, Junwei Lucas; Wang, Ying; He, Xiao; Gagliardi, Laura; Truhlar, Donald G
2017-11-16
Delocalization error has been singled out by Yang and co-workers as the dominant error in Kohn-Sham density functional theory (KS-DFT) with conventional approximate functionals. In this Letter, by computing the vertical first ionization energy for well separated He clusters, we show that multiconfiguration pair-density functional theory (MC-PDFT) is free from delocalization error. To put MC-PDFT in perspective, we also compare it with some Kohn-Sham density functionals, including both traditional and modern functionals. Whereas large delocalization errors are almost universal in KS-DFT (the only exception being the very recent corrected functionals of Yang and co-workers), delocalization error is removed by MC-PDFT, which bodes well for its future as a step forward from KS-DFT.
Induced Seismicity in Greeley, CO: The Effects of Pore Pressure on Seismic Wave Character
NASA Astrophysics Data System (ADS)
Bogolub, K. R.; Holmes, R.; Sheehan, A. F.; Brown, M. R. M.
2017-12-01
Since 2013, a series of injection-induced earthquakes has occurred near Greeley, Colorado, including a Mw 3.2 event in June 2014. With induced seismicity on the rise, it is important to understand injection-induced earthquakes to improve mitigation efforts. In this research, we analyzed seismograms from a local seismic network to see if there are any notable differences in seismic waveform as a result of changes in pore pressure from wastewater injection. Catalogued earthquake events from January-June 2017 that were clearly visible on 4 or more stations in the network were used as template events in a subspace detector. Since the template events were constructed using seismograms from a single event, the subspace detector operated similarly to a matched filter, and detections had very similar waveforms to the template event. Having these detections ultimately helped us identify similar earthquakes, which gave us better-located events for comparison. These detections were then examined and located using a 1D local velocity model. While many of these detections were already catalogued events, we also identified >20 new events by using this detector. Any two events that were matched by the detector, collocated within the error ellipses of both events, and at least a month apart temporally were classified as "event pairs". One challenge of this method is that most of the collocated earthquakes occurred in a very narrow time window, which indicates that the events have a tendency to cluster both spatially and temporally. However, we were able to examine an event pair that fit our spatial proximity criteria and were several months apart (March 3, 2017 and May 8, 2017). We present an examination of propagation velocity and frequency content for these two events specifically to assess if transient changes in pore pressure had any observable influence on these characteristics. Our preliminary results indicate a slight difference in lag time between P-wave and S-wave arrivals (slightly greater for the March event) and frequency content (slightly higher dominant frequencies for the March event). However, more work needs to be done to refine our earthquake locations so we can determine if these observations are caused by a transient change in velocity structure, a difference in location of the two events, or some other mechanism.
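With a single template, the subspace detector reduces to a matched filter; a minimal numpy sketch of that degenerate case (the threshold, template, and data stream are synthetic stand-ins for the Greeley waveforms):

    import numpy as np

    def matched_filter(data, template, threshold=0.7):
        """Slide a normalized cross-correlation over continuous data and
        return (sample index, correlation) for detections above threshold."""
        n = template.size
        t = (template - template.mean()) / (template.std() * n)
        detections = []
        for i in range(data.size - n + 1):
            win = data[i:i + n]
            cc = np.sum(t * (win - win.mean())) / (win.std() + 1e-12)
            if cc > threshold:
                detections.append((i, cc))
        return detections

    rng = np.random.default_rng(5)
    tpl = np.sin(np.linspace(0.0, 6.0 * np.pi, 200)) * np.hanning(200)
    stream = rng.normal(0.0, 0.3, 5000)
    stream[1200:1400] += tpl                 # bury one copy of the template
    hits = matched_filter(stream, tpl)       # detections cluster near sample 1200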
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stenger, Drake C., E-mail: drake.stenger@ars.usda.
Population structure of Homalodisca coagulata Virus-1 (HoCV-1) among and within field-collected insects sampled from a single point in space and time was examined. Polymorphism in complete consensus sequences among single-insect isolates was dominated by synonymous substitutions. The mutant spectrum of the C2 helicase region within each single-insect isolate was unique and dominated by nonsynonymous singletons. Bootstrapping was used to correct the within-isolate nonsynonymous:synonymous arithmetic ratio (N:S) for RT-PCR error, yielding an N:S value ~one log-unit greater than that of consensus sequences. Probability of all possible single-base substitutions for the C2 region predicted N:S values within 95% confidence limits of the corrected within-isolate N:S when the only constraint imposed was viral polymerase error bias for transitions over transversions. These results indicate that bottlenecks coupled with strong negative/purifying selection drive consensus sequences toward neutral sequence space, and that most polymorphism within single-insect isolates is composed of newly-minted mutations sampled prior to selection. -- Highlights: •Sampling protocol minimized differential selection/history among isolates. •Polymorphism among consensus sequences dominated by negative/purifying selection. •Within-isolate N:S ratio corrected for RT-PCR error by bootstrapping. •Within-isolate mutant spectrum dominated by new mutations yet to undergo selection.
ERIC Educational Resources Information Center
Torpey, Dana C.; Hajcak, Greg; Kim, Jiyon; Kujawa, Autumn J.; Dyson, Margaret W.; Olino, Thomas M.; Klein, Daniel N.
2013-01-01
Background: There is increasing interest in error-related brain activity in anxiety disorders. The error-related negativity (ERN) is a negative deflection in the event-related potential approximately 50 [milliseconds] after errors compared to correct responses. Recent studies suggest that the ERN may be a biomarker for anxiety, as it is positively…
Hybrid Transverse Polar Navigation for High-Precision and Long-Term INSs.
Wu, Ruonan; Wu, Qiuping; Han, Fengtian; Zhang, Rong; Hu, Peida; Li, Haixia
2018-05-12
Transverse navigation has been proposed to help inertial navigation systems (INSs) fill the gap in polar navigation ability. However, as the transverse system does not have the ability to navigate globally, a complicated switch between the transverse and the traditional algorithms is necessary when the system moves across the polar circles. To maintain the inner continuity and consistency of the core algorithm, a hybrid transverse polar navigation is proposed in this research based on a combination of Earth-fixed-frame mechanization and transverse-frame outputs. Furthermore, a thorough analysis of kinematic error characteristics, proper damping technology and the corresponding long-term contributions of the main error sources is conducted for high-precision INSs. According to the analytical expressions of the long-term navigation errors in polar areas, a 24-h-period symmetrical oscillation with a slowly divergent amplitude dominates the transverse horizontal position errors, and a first-order drift dominates the transverse azimuth error, which results from the gyro drift coefficients that occur in the corresponding directions. Simulations are conducted to validate the theoretical analysis and the deduced analytical expressions. The results show that the proposed hybrid transverse navigation can ensure the same accuracy and oscillation characteristics in polar areas as the traditional algorithm in low and mid latitude regions.
Language function distribution in left-handers: A navigated transcranial magnetic stimulation study.
Tussis, Lorena; Sollmann, Nico; Boeckh-Behrens, Tobias; Meyer, Bernhard; Krieg, Sandro M
2016-02-01
Recent studies suggest that in left-handers, the right hemisphere (RH) is more involved in language function than in right-handed subjects. Since data on lesion-based approaches are lacking, we aimed to investigate the language distribution of left-handers by repetitive navigated transcranial magnetic stimulation (rTMS). Thus, rTMS was applied to the left hemisphere (LH) and RH in 15 healthy left-handers during an object-naming task, and the resulting naming errors were categorized. We then calculated error rates (ERs = number of errors per number of stimulations) for both hemispheres separately and defined a laterality score as (LH ER - RH ER) / (LH ER + RH ER), abbreviated (L-R)/(L+R). In this context, (L-R)/(L+R)>0 indicates that the LH is dominant, whereas (L-R)/(L+R)<0 shows that the RH is dominant. No significant difference in ERs was found between hemispheres (all errors: mean LH 18.0±11.7%, mean RH 18.1±12.2%, p=0.94; all errors without hesitation: mean LH 12.4±9.8%, mean RH 12.9±10.0%, p=0.65; no responses: mean LH 9.3±9.2%, mean RH 11.5±10.3%, p=0.84). However, a significant difference between the results of (L-R)/(L+R) of left-handers and right-handers (source data of another study) was revealed for all errors (mean 0.01±0.14 vs. 0.19±0.20, p=0.0019) and all errors without hesitation (mean -0.02±0.20 vs. 0.19±0.28, p=0.0051), whereas the comparison for no responses did not show a significant difference (mean: -0.004±0.27 vs. 0.09±0.44, p=0.64). Accordingly, left-handers present a comparatively equal language distribution across both hemispheres, with language dominance being nearly equally distributed between hemispheres in contrast to right-handers. Copyright © 2016 Elsevier Ltd. All rights reserved.
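A one-function sketch of the laterality score defined above, (L-R)/(L+R) on per-hemisphere error rates; the counts in the usage line are invented.

    def laterality(lh_errors, lh_stims, rh_errors, rh_stims):
        """(L-R)/(L+R) on error rates; > 0 means LH dominant, < 0 RH dominant."""
        l = lh_errors / lh_stims
        r = rh_errors / rh_stims
        return (l - r) / (l + r)

    score = laterality(18, 100, 18, 100)   # 0.0: no hemispheric dominance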
Improved Event Location Uncertainty Estimates
2008-06-30
throughout this study. The data set consists of GT0-2 nuclear explosions from the SAIC Nuclear Explosion Database (www.rdss.info, Bahavar et al.)... Reading errors: bias and variance. In this study the SNR dependence of both the delay and the variance of reading errors of first-arriving P waves is analyzed and... ground truth and range of event size. For other datasets we turn to estimates based on double-differences between arrival times of station pairs.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-06
...-AA08 Special Local Regulation for Marine Events; Temporary Change of Dates for Recurring Marine Events... period of two special local regulations for recurring marine events in the Fifth Coast Guard District... errors do not impact the events scheduled for this year, but could cause confusion about future years...
Learning a locomotor task: with or without errors?
Marchal-Crespo, Laura; Schneider, Jasmin; Jaeger, Lukas; Riener, Robert
2014-03-04
Robotic haptic guidance is the most commonly used robotic training strategy to reduce performance errors during training. However, research on motor learning has emphasized that errors are a fundamental neural signal that drives motor adaptation. Thus, researchers have proposed robotic therapy algorithms that amplify movement errors rather than decrease them. However, to date, no study has analyzed with precision which training strategy is most appropriate for learning an especially simple task. In this study, the impact of robotic training strategies that amplify or reduce errors on muscle activation and motor learning of a simple locomotor task was investigated in twenty-two healthy subjects. The experiment was conducted with the MAgnetic Resonance COmpatible Stepper (MARCOS), a special robotic device developed for investigations in the MR scanner. The robot moved the dominant leg passively and the subject was requested to actively synchronize the non-dominant leg to achieve an alternating stepping-like movement. Learning with four different training strategies that reduce or amplify errors was evaluated: (i) Haptic guidance: errors were eliminated by passively moving the limbs; (ii) No guidance: no robot disturbances were presented; (iii) Error amplification: existing errors were amplified with repulsive forces; (iv) Noise disturbance: errors were evoked intentionally with a randomly varying force disturbance on top of the no-guidance strategy. Additionally, the activation of four lower-limb muscles was measured by means of surface electromyography (EMG). Strategies that reduce or do not amplify errors limit muscle activation during training and result in poor learning gains. Adding random disturbing forces during training seems to increase attention, and therefore improve motor learning. Error amplification seems to be the most suitable strategy for initially less skilled subjects, perhaps because subjects could better detect their errors and correct them. Error strategies have great potential to evoke higher muscle activation and provoke better motor learning of simple tasks. Neuroimaging evaluation of brain regions involved in learning can provide valuable information on observed behavioral outcomes related to learning processes. The impact of these strategies on neurological patients needs further investigation.
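A schematic of the four force laws compared above, reduced to a single function of the tracking error; the gains and noise scale are illustrative, not MARCOS parameters.

    import numpy as np

    def training_force(e, strategy, rng, k_guide=50.0, k_amp=20.0, noise=2.0):
        if strategy == "haptic_guidance":
            return -k_guide * e            # push the error toward zero
        if strategy == "no_guidance":
            return 0.0
        if strategy == "error_amplification":
            return k_amp * e               # repulsive force grows the error
        if strategy == "noise_disturbance":
            return noise * rng.normal()    # randomly varying disturbance
        raise ValueError(strategy)

    rng = np.random.default_rng(6)
    f = training_force(e=0.05, strategy="error_amplification", rng=rng)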
2014-01-01
Background The Health Information Technology for Economic and Clinical Health (HITECH) Act subsidizes implementation by hospitals of electronic health records with computerized provider order entry (CPOE), which may reduce patient injuries caused by medication errors (preventable adverse drug events, pADEs). Effects on pADEs have not been rigorously quantified, and effects on medication errors have been variable. The objectives of this analysis were to assess the effectiveness of CPOE at reducing pADEs in hospital-related settings, and examine reasons for heterogeneous effects on medication errors. Methods Articles were identified using MEDLINE, Cochrane Library, Econlit, web-based databases, and bibliographies of previous systematic reviews (September 2013). Eligible studies compared CPOE with paper-order entry in acute care hospitals, and examined diverse pADEs or medication errors. Studies on children or with limited event-detection methods were excluded. Two investigators extracted data on events and factors potentially associated with effectiveness. We used random effects models to pool data. Results Sixteen studies addressing medication errors met pooling criteria; six also addressed pADEs. Thirteen studies used pre-post designs. Compared with paper-order entry, CPOE was associated with half as many pADEs (pooled risk ratio (RR) = 0.47, 95% CI 0.31 to 0.71) and medication errors (RR = 0.46, 95% CI 0.35 to 0.60). Regarding reasons for heterogeneous effects on medication errors, five intervention factors and two contextual factors were sufficiently reported to support subgroup analyses or meta-regression. Differences between commercial versus homegrown systems, presence and sophistication of clinical decision support, hospital-wide versus limited implementation, and US versus non-US studies were not significant, nor was timing of publication. Higher baseline rates of medication errors predicted greater reductions (P < 0.001). Other context and implementation variables were seldom reported. Conclusions In hospital-related settings, implementing CPOE is associated with a greater than 50% decline in pADEs, although the studies used weak designs. Decreases in medication errors are similar and robust to variations in important aspects of intervention design and context. This suggests that CPOE implementation, as subsidized under the HITECH Act, may benefit public health. More detailed reporting of the context and process of implementation could shed light on factors associated with greater effectiveness. PMID:24894078
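A minimal DerSimonian-Laird random-effects pooling of log risk ratios, one common form of the random-effects model the review describes; the per-study numbers are toy values, not the review's data.

    import numpy as np

    rr = np.array([0.4, 0.55, 0.3, 0.7, 0.5])      # per-study risk ratios
    se = np.array([0.25, 0.2, 0.3, 0.15, 0.2])     # SEs of the log risk ratios
    y, v = np.log(rr), se**2

    w = 1.0 / v                                    # fixed-effect weights
    q = np.sum(w * (y - np.sum(w * y) / np.sum(w))**2)
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (y.size - 1)) / c)        # between-study variance
    w_star = 1.0 / (v + tau2)                      # random-effects weights
    pooled_rr = np.exp(np.sum(w_star * y) / np.sum(w_star))
    ci = np.exp(np.log(pooled_rr) + np.array([-1.96, 1.96]) / np.sqrt(np.sum(w_star)))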
Single Event Effect Testing of the Analog Devices ADV212
NASA Technical Reports Server (NTRS)
Wilcox, Ted; Campola, Michael; Kadari, Madhu; Nadendla, Seshagiri R.
2017-01-01
The Analog Devices ADV212 was initially tested for single event effects (SEE) at the Texas A&M University Cyclotron Facility (TAMU) in July of 2013. Testing revealed a sensitivity to device hang-ups classified as single event functional interrupts (SEFI), soft data errors classified as single event upsets (SEU), and, of particular concern, single event latch-ups (SEL). All error types occurred so frequently as to make accurate measurements of the exposure time, and thus total particle fluence, challenging. To mitigate some of the risk posed by single event latch-ups, circuitry was added to the electrical design to detect a high-current event and automatically recycle power and reboot the device. An additional heavy-ion test was scheduled to validate the operation of the recovery circuitry and the continuing functionality of the ADV212 after a substantial number of latch-up events. As a secondary goal, more precise data would be gathered by an improved test method, described in this test report.
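A sketch of the recovery logic that such circuitry implements, expressed in software; the current threshold, polling rate, and hardware hooks are placeholders, not the actual test-bench design.

    import time

    SEL_CURRENT_A = 0.5        # illustrative latch-up current threshold

    def sel_watchdog(read_current, set_power, samples=1000, dwell_s=0.01):
        """Poll supply current; on a high-current (SEL) event, power-cycle
        the device and count the event."""
        sel_count = 0
        for _ in range(samples):
            if read_current() > SEL_CURRENT_A:
                sel_count += 1
                set_power(False)
                time.sleep(0.1)            # let the latch-up clear
                set_power(True)
            time.sleep(dwell_s)
        return sel_count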
Effects of multiple scattering on time- and depth-resolved signals in airborne lidar systems
NASA Technical Reports Server (NTRS)
Punjabi, A.; Venable, D. D.
1986-01-01
A semianalytic Monte Carlo radiative transfer model (SALMON) is employed to probe the effects of multiple-scattering events on the time- and depth-resolved lidar signals from homogeneous aqueous media. The effective total attenuation coefficients in the single-scattering approximation are determined as functions of dimensionless parameters characterizing the lidar system and the medium. Results show that single-scattering events dominate when these parameters are close to their lower bounds and that when their values exceed unity multiple-scattering events dominate.
[Risk Management: concepts and chances for public health].
Palm, Stefan; Cardeneo, Margareta; Halber, Marco; Schrappe, Matthias
2002-01-15
Errors are a common problem in medicine and occur as a result of a complex process involving many contributing factors. Medical errors significantly reduce the safety margin for the patient and contribute additional costs in health care delivery. In most cases adverse events cannot be attributed to a single underlying cause. Therefore an effective risk management strategy must follow a system approach, which is based on counting and analysis of near misses. The development of defenses against the undesired effects of errors should be the main focus rather than asking the question "Who blundered?". Analysis of near misses (which in this context can be compared to indicators) offers several methodological advantages as compared to the analysis of errors and adverse events. Risk management is an integral element of quality management.
Riga, Marina; Vozikis, Athanassios; Pollalis, Yannis; Souliotis, Kyriakos
2015-04-01
The economic crisis in Greece poses the necessity to resolve problems concerning both the spiralling cost and quality assurance in the health system. The detection and analysis of patient adverse events and medical errors are considered crucial elements of this course. The implementation of MERIS embodies a mandatory module, which adopts the trigger-tool methodology for measuring adverse events and medical errors in an intensive care unit (ICU) environment, and a voluntary one with a web-based public reporting methodology. A pilot implementation of MERIS running in a public hospital identified 35 adverse events, with approx. 12 additional hospital days and an extra healthcare cost of €12,000 per adverse event, or about €312,000 per annum for ICU costs only. At the same time, the voluntary module unveiled 510 reports on adverse events submitted by citizens or patients. MERIS has been evaluated as a comprehensive and effective system; it succeeded in detecting the main factors that cause adverse events and discloses severe omissions of the Greek health system. MERIS may be incorporated and run efficiently nationally, adapted to the needs and peculiarities of each hospital or clinic. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Determination of Barometric Altimeter Errors for the Orion Exploration Flight Test-1 Entry
NASA Technical Reports Server (NTRS)
Brown, Denise L.; Bunoz, Jean-Philippe; Gay, Robert
2012-01-01
The Exploration Flight Test 1 (EFT-1) mission is the unmanned flight test for the upcoming Multi-Purpose Crew Vehicle (MPCV). During entry, the EFT-1 vehicle will trigger several Landing and Recovery System (LRS) events, such as parachute deployment, based on on-board altitude information. The primary altitude source is the filtered navigation solution updated with GPS measurement data. The vehicle also has three barometric altimeters that will be used to measure atmospheric pressure during entry. In the event that GPS data are not available during entry, the altitude derived from the barometric altimeter pressure will be used to trigger chute deployment for the drogue and main parachutes. Therefore it is important to understand the impact of error sources on the pressure measured by the barometric altimeters and on the altitude derived from that pressure. The error sources for the barometric altimeters are not independent, and many error sources result in bias in a specific direction. Therefore conventional error budget methods could not be applied. Instead, high-fidelity Monte Carlo simulation was performed and error bounds were determined based on the results of this analysis. Aerodynamic errors were the largest single contributor to the error budget for the barometric altimeters. The large errors drove a change to the altitude trigger setpoint for forward bay cover (FBC) jettison deployment.
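A toy Monte Carlo of the pressure-to-altitude sensitivity in that spirit, using the standard-atmosphere relation h = (T0/L)(1 - (p/P0)^(RL/g)); the bias and noise magnitudes are illustrative, not the EFT-1 error model.

    import numpy as np

    P0, T0, L, g, R = 101325.0, 288.15, 0.0065, 9.80665, 287.053

    def pressure_to_altitude(p):
        return (T0 / L) * (1.0 - (p / P0)**(R * L / g))

    rng = np.random.default_rng(7)
    p_true = 26436.0                        # roughly 10 km altitude
    bias = rng.normal(0.0, 50.0, 10000)     # per-trial sensor bias (Pa)
    noise = rng.normal(0.0, 20.0, 10000)    # measurement noise (Pa)
    alt_err = pressure_to_altitude(p_true + bias + noise) - pressure_to_altitude(p_true)
    bounds = np.percentile(alt_err, [0.135, 99.865])   # 3-sigma-style bounds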
Thorlund, Kristian; Imberger, Georgina; Walsh, Michael; Chu, Rong; Gluud, Christian; Wetterslev, Jørn; Guyatt, Gordon; Devereaux, Philip J.; Thabane, Lehana
2011-01-01
Background Meta-analyses including a limited number of patients and events are prone to yield overestimated intervention effect estimates. While many assume bias is the cause of overestimation, theoretical considerations suggest that random error may be an equally or more frequent cause. The independent impact of random error on meta-analyzed intervention effects has not previously been explored. It has been suggested that surpassing the optimal information size (i.e., the required meta-analysis sample size) provides sufficient protection against overestimation due to random error, but this claim has not yet been validated. Methods We simulated a comprehensive array of meta-analysis scenarios where no intervention effect existed (i.e., relative risk reduction (RRR) = 0%) or where a small but possibly unimportant effect existed (RRR = 10%). We constructed different scenarios by varying the control group risk, the degree of heterogeneity, and the distribution of trial sample sizes. For each scenario, we calculated the probability of observing overestimates of RRR>20% and RRR>30% for each cumulative 500 patients and 50 events. We calculated the cumulative number of patients and events required to reduce the probability of overestimation of intervention effect to 10%, 5%, and 1%. We calculated the optimal information size for each of the simulated scenarios and explored whether meta-analyses that surpassed their optimal information size had sufficient protection against overestimation of intervention effects due to random error. Results The risk of overestimation of intervention effects was usually high when the number of patients and events was small, and this risk decreased exponentially as the number of patients and events increased. The number of patients and events required to limit the risk of overestimation depended considerably on the underlying simulation settings. Surpassing the optimal information size generally provided sufficient protection against overestimation. Conclusions Random errors are a frequent cause of overestimation of intervention effects in meta-analyses. Surpassing the optimal information size will provide sufficient protection against overestimation. PMID:22028777
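A stripped-down version of this simulation design is sketched below: meta-analyses are simulated under a true RRR of 0%, and the probability of observing RRR > 20% is tracked as trials accumulate. The trial count, arm size, and control-group risk are arbitrary choices, and the pooling is a naive event-count ratio rather than the inverse-variance weighting a real meta-analysis would use.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_cumulative_rrr(n_meta=2000, n_trials=20, n_per_arm=100, p_control=0.10):
    """Simulate meta-analyses with a true RRR of 0% and track how often the
    pooled estimate overstates the effect as trials accumulate."""
    over20 = np.zeros(n_trials)
    for _ in range(n_meta):
        ec = rng.binomial(n_per_arm, p_control, n_trials).cumsum()  # control events
        ei = rng.binomial(n_per_arm, p_control, n_trials).cumsum()  # intervention events
        n = n_per_arm * np.arange(1, n_trials + 1)
        rr = (ei / n) / np.maximum(ec / n, 1e-12)   # naive pooled risk ratio
        over20 += (1.0 - rr) > 0.20                 # observed RRR > 20%
    return over20 / n_meta

prob = simulate_cumulative_rrr()
for k in (1, 5, 10, 20):
    print(f"after {k} trials: P(RRR > 20%) = {prob[k-1]:.3f}")
```

Run it and the overestimation probability is largest with one small trial and shrinks as information accrues, which is the qualitative pattern the study reports.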
The Public Understanding of Error in Educational Assessment
ERIC Educational Resources Information Center
Gardner, John
2013-01-01
Evidence from recent research suggests that in the UK the public perception of errors in national examinations is that they are simply mistakes; events that are preventable. This perception predominates over the more sophisticated technical view that errors arise from many sources and create an inevitable variability in assessment outcomes. The…
NASA Astrophysics Data System (ADS)
Chen, R. M.; Diggins, Z. J.; Mahatme, N. N.; Wang, L.; Zhang, E. X.; Chen, Y. P.; Zhang, H.; Liu, Y. N.; Narasimham, B.; Witulski, A. F.; Bhuva, B. L.; Fleetwood, D. M.
2017-08-01
The single-event sensitivity of bulk 40-nm sequential circuits is investigated as a function of temperature and supply voltage. An overall increase in SEU cross section versus temperature is observed at relatively high supply voltages. However, at low supply voltages, there is a threshold temperature beyond which the SEU cross section decreases with further increases in temperature. Single-event transient induced errors in flip-flops also increase versus temperature at relatively high supply voltages and are more sensitive to temperature variation than those caused by single-event upsets.
Low Probability Tail Event Analysis and Mitigation in BPA Control Area: Task One Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Shuai; Makarov, Yuri V.
This is a report for task one of the tail event analysis project for BPA. A tail event refers to a situation in a power system in which unfavorable forecast errors for load and wind are superposed onto fast load and wind ramps, or non-wind generators fall short of scheduled output, so that the imbalance between generation and load becomes very significant. Such events occur infrequently and appear on the tails of the distribution of system power imbalance; hence they are referred to as tail events. This report analyzes what happened during the Electric Reliability Council of Texas (ERCOT) reliability event on February 26, 2008, which was widely reported because of the involvement of wind generation. The objective is to identify sources of the problem, solutions to it, and potential improvements that can be made to the system. Lessons learned from the analysis include the following: (1) Large mismatch between generation and load can be caused by load forecast error, wind forecast error and generation scheduling control error on traditional generators, or a combination of all of the above; (2) The capability of system balancing resources should be evaluated both in capacity (MW) and in ramp rate (MW/min), and be procured accordingly to meet both requirements. The resources need to be able to cover a range corresponding to the variability of load and wind in the system, in addition to other uncertainties; (3) Unexpected ramps caused by load and wind can both become the cause leading to serious issues; (4) A look-ahead tool that evaluates the system balancing requirement during real-time operations and compares it with available system resources would help system operators predict the onset of similar events and plan ahead; and (5) Demand response (only load reduction in the ERCOT event) can effectively reduce load-generation mismatch and terminate frequency deviation in an emergency situation.
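The definition of a tail event given here, forecast errors superposed on ramps and generator shortfalls pushing the generation-load imbalance into the tails of its distribution, can be made concrete with a small Monte Carlo sketch. All magnitudes and probabilities below are invented and do not represent BPA or ERCOT data.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 500_000  # simulated operating intervals

# Illustrative error components (MW): forecast errors for load and wind plus
# occasional generator shortfalls superpose into the system imbalance.
load_err = rng.normal(0.0, 150.0, N)          # load forecast error
wind_err = rng.normal(0.0, 200.0, N)          # wind forecast error
shortfall = rng.binomial(1, 0.01, N) * 400.0  # rare unit falling short of schedule

imbalance = load_err + wind_err + shortfall   # generation-minus-load error

# Tail events: imbalance beyond what balancing reserves can cover.
# (A fuller model would also check the MW/min ramp-rate dimension.)
reserve = 600.0  # MW of deployable balancing capacity (illustrative)
p_tail = np.mean(np.abs(imbalance) > reserve)
print(f"P(|imbalance| > {reserve:.0f} MW) = {p_tail:.5f}")
print(f"99.9th percentile of imbalance: {np.percentile(imbalance, 99.9):.0f} MW")
```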
Garrouste-Orgeas, Maité; Perrin, Marion; Soufir, Lilia; Vesin, Aurélien; Blot, François; Maxime, Virginie; Beuret, Pascal; Troché, Gilles; Klouche, Kada; Argaud, Laurent; Azoulay, Elie; Timsit, Jean-François
2015-02-01
Staff behaviours to optimise patient safety may be influenced by burnout, depression and the strength of the safety culture. We evaluated whether burnout, symptoms of depression and safety culture affected the frequency of medical errors and adverse events (selected using Delphi techniques) in ICUs. Prospective, observational, multicentre (31 ICUs) study from August 2009 to December 2011. Burnout, depression symptoms and safety culture were evaluated using the Maslach Burnout Inventory (MBI), the CES-Depression scale and the Safety Attitudes Questionnaire, respectively. Of 1,988 staff members, 1,534 (77.2%) participated. Frequencies of medical errors and adverse events were 804.5/1,000 and 167.4/1,000 patient-days, respectively. Burnout prevalence was 3% or 40%, depending on the definition (severe emotional exhaustion, depersonalisation and low personal accomplishment; or MBI score greater than -9). Depression symptoms were identified in 62/330 (18.8%) physicians and 188/1,204 (15.6%) nurses/nursing assistants. The median safety culture score was 60.7/100 [56.8-64.7] in physicians and 57.5/100 [52.4-61.9] in nurses/nursing assistants. Depression symptoms were an independent risk factor for medical errors. Burnout was not associated with medical errors. The safety culture score had a limited influence on medical errors. Other independent risk factors for medical errors or adverse events were related to ICU organisation (40% of ICU staff off work on the previous day), staff (specific safety training) and patients (workload). One-on-one training of junior physicians during duties and the existence of a hospital risk-management unit were associated with lower risks. The frequency of selected medical errors in ICUs was high and was increased when staff members had symptoms of depression.
Donn, Steven M; McDonnell, William M
2012-01-01
The Institute of Medicine has recommended a change in culture from "name and blame" to patient safety. This will require system redesign to identify and address errors, establish performance standards, and set safety expectations. This approach, however, is at odds with the present medical malpractice (tort) system. The current system is outcomes-based, meaning that health care providers and institutions are often sued despite providing appropriate care. Nevertheless, the focus should remain on providing the safest patient care. Effective peer review may be hindered by the present tort system. Reporting of medical errors is a key piece of peer review and education, and both anonymous reporting and confidential reporting of errors have potential disadvantages. Diagnostic and treatment errors continue to be the leading sources of allegations of malpractice in pediatrics, and the neonatal intensive care unit is uniquely vulnerable. Most errors result from systems failures rather than human error. Risk management can be an effective process to identify, evaluate, and address problems that may injure patients, lead to malpractice claims, and result in financial losses. Risk management identifies risks or potential risks, calculates the probability of an adverse event arising from a risk, estimates the impact of the adverse event, and attempts to control the risk. Implementation of a successful risk management program requires a positive attitude, a sufficient knowledge base, and a commitment to improvement. Transparency in the disclosure of medical errors and a strategy of prospective risk management in dealing with medical errors may result in a substantial reduction in medical malpractice lawsuits, lower litigation costs, and a more safety-conscious environment. Thieme Medical Publishers, Inc.
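The quantification step described here (the probability of an adverse event times its estimated impact) is the classic risk-exposure calculation. A minimal sketch, with entirely hypothetical neonatal risks and dollar figures:

```python
from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    probability: float  # estimated chance of the adverse event per year
    impact: float       # estimated cost if it occurs (e.g., USD)

    @property
    def exposure(self) -> float:
        # Expected annual loss: the usual probability-times-impact score
        return self.probability * self.impact

# Hypothetical NICU risks, for illustration only.
risks = [
    Risk("medication dosing error", 0.08, 250_000),
    Risk("patient misidentification", 0.02, 400_000),
    Risk("ventilator setting error", 0.04, 600_000),
]

# Prioritize mitigation (the "control the risk" step) by expected loss.
for r in sorted(risks, key=lambda r: r.exposure, reverse=True):
    print(f"{r.name:30s} exposure = ${r.exposure:,.0f}/yr")
```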
Wen, Shiping; Zeng, Zhigang; Chen, Michael Z Q; Huang, Tingwen
2017-10-01
This paper addresses the synchronization of switched delayed neural networks with communication delays via event-triggered control. For synchronizing coupled switched neural networks, we propose a novel event-triggered control law that greatly reduces the number of control updates for synchronization tasks involving embedded microprocessors with limited on-board resources. The control signals are driven by properly defined events, which depend on the measurement errors and the currently sampled states. Using a delay-system method, a novel model of the synchronization error system with delays is proposed that treats the communication delays and the event-triggered control in a unified framework for coupled switched neural networks. Criteria for the event-triggered synchronization analysis and control synthesis of switched neural networks are derived via the Lyapunov-Krasovskii functional method and the free-weighting-matrix approach. A numerical example illustrates the effectiveness of the derived results.
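The core idea, updating the control signal only when the measurement error since the last update grows too large relative to the current state, can be shown in a few lines. This sketch uses a scalar linear plant with a static threshold rule rather than the paper's switched delayed neural networks and Lyapunov-Krasovskii machinery; all gains and thresholds are invented.

```python
import numpy as np

# Hypothetical scalar plant x' = a*x + b*u with event-triggered feedback
# u = -k * x(t_k), updated only when the measurement error grows too large.
a, b, k, sigma, dt, T = 0.5, 1.0, 2.0, 0.3, 1e-3, 10.0

x, x_held = 1.0, 1.0              # current state and last-transmitted state
updates, steps = 0, int(T / dt)
for _ in range(steps):
    e = x_held - x                # measurement error since the last event
    if abs(e) >= sigma * abs(x):  # event-triggering condition
        x_held = x                # transmit the current state; reset the error
        updates += 1
    u = -k * x_held               # control uses the held (event-sampled) state
    x += dt * (a * x + b * u)     # Euler integration of the plant

print(f"control updates: {updates} vs {steps} periodic samples; |x(T)| = {abs(x):.2e}")
```

With these numbers the state still converges, but the controller transmits orders of magnitude fewer updates than a periodically sampled implementation, which is the resource saving the abstract emphasizes.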
Making Residents Part of the Safety Culture: Improving Error Reporting and Reducing Harms.
Fox, Michael D; Bump, Gregory M; Butler, Gabriella A; Chen, Ling-Wan; Buchert, Andrew R
2017-01-30
Reporting medical errors is a focus of the patient safety movement. As frontline physicians, residents are optimally positioned to recognize errors and flaws in systems of care. Previous work highlights the difficulty of engaging residents in identification and/or reduction of medical errors and in integrating these trainees into their institutions' cultures of safety. The authors describe the implementation of a longitudinal, discipline-based, multifaceted curriculum to enhance the reporting of errors by pediatric residents at Children's Hospital of Pittsburgh of University of Pittsburgh Medical Center. The key elements of this curriculum included providing the necessary education to identify medical errors with an emphasis on systems-based causes, modeling of error reporting by faculty, and integrating error reporting and discussion into the residents' daily activities. The authors tracked monthly error reporting rates by residents and other health care professionals, in addition to serious harm event rates at the institution. The interventions resulted in significant increases in error reports filed by residents, from 3.6 to 37.8 per month over 4 years (P < 0.0001). This increase in resident error reporting correlated with a decline in serious harm events, from 15.0 to 8.1 per month over 4 years (P = 0.01). Integrating patient safety into the everyday resident responsibilities encourages frequent reporting and discussion of medical errors and leads to improvements in patient care. Multiple simultaneous interventions are essential to making residents part of the safety culture of their training hospitals.
NASA Astrophysics Data System (ADS)
Thomas, Manu Anna; Devasthale, Abhay
2017-10-01
Characterizing typical meteorological conditions associated with extreme pollution events helps to better understand the role of local meteorology in governing the transport and distribution of pollutants in the atmosphere. The knowledge of their co-variability could further help to evaluate and constrain chemistry transport models. Hence, in this study, we investigate the statistical linkages between extreme nitrogen dioxide (NO2) pollution events and meteorology over Scandinavia using observational and reanalysis data. It is observed that the south-westerly winds dominated during extreme events, accounting for 50-65 % of the total events depending on the season, while the second largest annual occurrence was from south-easterly winds, accounting for 17 % of total events. The specific humidity anomalies showed an influx of warmer and moisture-laden air masses over Scandinavia in the free troposphere. Two distinct modes in the persistency of circulation patterns are observed. The first mode lasts for 1-2 days, dominated by south-easterly winds that prevailed during 78 % of total extreme events in that mode, while the second mode lasted for 3-5 days, dominated by south-westerly winds that prevailed during 86 % of the events. The combined analysis of circulation patterns, their persistency, and associated changes in humidity and clouds suggests that NO2 extreme events over Scandinavia occur mainly due to long-range transport from the southern latitudes.
Error-Induced Learning as a Resource-Adaptive Process in Young and Elderly Individuals
NASA Astrophysics Data System (ADS)
Ferdinand, Nicola K.; Weiten, Anja; Mecklinger, Axel; Kray, Jutta
Thorndike described in his law of effect [44] that actions followed by positive events are more likely to be repeated in the future, whereas actions that are followed by negative outcomes are less likely to be repeated. This implies that behavior is evaluated in the light of its potential consequences, and non-reward events (i.e., errors) must be detected for reinforcement learning to take place. In short, humans have to monitor their performance in order to detect and correct errors, and this allows them to successfully adapt their behavior to changing environmental demands and acquire new behavior, i.e., to learn.
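Read as an algorithm, the law of effect is exactly the error-driven value update used in reinforcement learning: a prediction error (reward minus expectation) is detected and used to adjust the value of the action just taken. A minimal sketch, with invented reward probabilities and learning rate:

```python
import numpy as np

rng = np.random.default_rng(3)

# Two actions with different reward probabilities; non-reward events (errors)
# drive the value estimates down, rewards drive them up (law of effect).
p_reward = np.array([0.8, 0.2])
V = np.zeros(2)      # learned action values
alpha = 0.1          # learning rate
choices = []

for t in range(1000):
    # Softmax action selection: better-valued actions are repeated more often.
    p = np.exp(5 * V) / np.exp(5 * V).sum()
    a = rng.choice(2, p=p)
    r = float(rng.random() < p_reward[a])   # 1 = reward, 0 = error/non-reward
    delta = r - V[a]                        # prediction error: detected mismatch
    V[a] += alpha * delta                   # error-driven value update
    choices.append(a)

print(f"learned values: {V.round(2)}, P(choose better action), last 200 trials:"
      f" {np.mean(np.array(choices[-200:]) == 0):.2f}")
```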
A preliminary taxonomy of medical errors in family practice
Dovey, S; Meyers, D; Phillips, R; Green, L; Fryer, G; Galliher, J; Kappus, J; Grob, P
2002-01-01
Objective: To develop a preliminary taxonomy of primary care medical errors. Design: Qualitative analysis to identify categories of error reported during a randomized controlled trial of computer and paper reporting methods. Setting: The National Network for Family Practice and Primary Care Research. Participants: Family physicians. Main outcome measures: Medical error category, context, and consequence. Results: Forty two physicians made 344 reports: 284 (82.6%) arose from healthcare systems dysfunction; 46 (13.4%) were errors due to gaps in knowledge or skills; and 14 (4.1%) were reports of adverse events, not errors. The main subcategories were: administrative failures (102; 30.9% of errors), investigation failures (82; 24.8%), treatment delivery lapses (76; 23.0%), miscommunication (19; 5.8%), payment systems problems (4; 1.2%), error in the execution of a clinical task (19; 5.8%), wrong treatment decision (14; 4.2%), and wrong diagnosis (13; 3.9%). Most reports were of errors that were recognized and occurred in reporters' practices. Affected patients ranged in age from 8 months to 100 years, were of both sexes, and represented all major US ethnic groups. Almost half the reports were of events which had adverse consequences. Ten errors resulted in patients being admitted to hospital and one patient died. Conclusions: This medical error taxonomy, developed from self-reports of errors observed by family physicians during their routine clinical practice, emphasizes problems in healthcare processes and acknowledges medical errors arising from shortfalls in clinical knowledge and skills. Patient safety strategies with most effect in primary care settings need to be broader than the current focus on medication errors. PMID:12486987
A preliminary taxonomy of medical errors in family practice.
Dovey, S M; Meyers, D S; Phillips, R L; Green, L A; Fryer, G E; Galliher, J M; Kappus, J; Grob, P
2002-09-01
To develop a preliminary taxonomy of primary care medical errors. Qualitative analysis to identify categories of error reported during a randomized controlled trial of computer and paper reporting methods. The National Network for Family Practice and Primary Care Research. Family physicians. Medical error category, context, and consequence. Forty two physicians made 344 reports: 284 (82.6%) arose from healthcare systems dysfunction; 46 (13.4%) were errors due to gaps in knowledge or skills; and 14 (4.1%) were reports of adverse events, not errors. The main subcategories were: administrative failure (102; 30.9% of errors), investigation failures (82; 24.8%), treatment delivery lapses (76; 23.0%), miscommunication (19; 5.8%), payment systems problems (4; 1.2%), error in the execution of a clinical task (19; 5.8%), wrong treatment decision (14; 4.2%), and wrong diagnosis (13; 3.9%). Most reports were of errors that were recognized and occurred in reporters' practices. Affected patients ranged in age from 8 months to 100 years, were of both sexes, and represented all major US ethnic groups. Almost half the reports were of events which had adverse consequences. Ten errors resulted in patients being admitted to hospital and one patient died. This medical error taxonomy, developed from self-reports of errors observed by family physicians during their routine clinical practice, emphasizes problems in healthcare processes and acknowledges medical errors arising from shortfalls in clinical knowledge and skills. Patient safety strategies with most effect in primary care settings need to be broader than the current focus on medication errors.
Appraisals of Negative Divorce Events and Children's Psychological Adjustment.
ERIC Educational Resources Information Center
Mazur, Elizabeth; And Others
Adding to prior literature on adults' and children's appraisals of stressors, this study examined relationships among children's negative cognitive errors regarding hypothetical negative divorce events, positive illusions about those same events, the actual divorce events, and children's post-divorce psychological adjustment. Subjects were 38…
Medication administration error: magnitude and associated factors among nurses in Ethiopia.
Feleke, Senafikish Amsalu; Mulatu, Muluadam Abebe; Yesmaw, Yeshaneh Seyoum
2015-01-01
The significant impact of medication administration errors affects patients in terms of morbidity, mortality, adverse drug events, and increased length of hospital stay. It also increases costs for clinicians and healthcare systems. Assessing the magnitude and associated factors of medication administration error therefore contributes significantly to improving the quality of patient care. The aim of this study was to assess the magnitude and associated factors of medication administration errors among nurses at the Felege Hiwot Referral Hospital inpatient department. A prospective, observation-based, cross-sectional study was conducted from March 24 to April 7, 2014 at the Felege Hiwot Referral Hospital inpatient department. A total of 82 nurses were interviewed using a pre-tested structured questionnaire and observed while administering 360 medications, using a checklist supplemented with a review of medication charts. Data were analyzed using the SPSS version 20 software package, and logistic regression was used to identify factors associated with medication administration error. The incidence of medication administration error was 199 (56.4%). The majority (87.5%) of the medications had documentation errors, followed by technique errors (263; 73.1%) and time errors (193; 53.6%). Variables significantly associated with medication administration error included nurse age of 18-25 years [adjusted odds ratio (AOR) = 2.9, 95% CI (1.65, 6.38)], 26-30 years [AOR = 2.3, 95% CI (1.55, 7.26)] and 31-40 years [AOR = 2.1, 95% CI (1.07, 4.12)], work experience of 10 years or less [AOR = 1.7, 95% CI (1.33, 4.99)], a nurse-to-patient ratio of 7-10 [AOR = 1.6, 95% CI (1.44, 3.19)] or greater than 10 [AOR = 1.5, 95% CI (1.38, 3.89)], interruption of the respondent at the time of medication administration [AOR = 1.5, 95% CI (1.14, 3.21)], night-shift medication administration [AOR = 3.1, 95% CI (1.38, 9.66)] and patient age of less than 18 years [AOR = 2.3, 95% CI (1.17, 4.62)]. In general, medication errors at the administration phase were highly prevalent in Felege Hiwot Referral Hospital. Documentation error was the most dominant type of error observed during the study. Increasing nurses' staffing levels and minimizing distraction and interruptions during medication administration, for example with no-interruption zones and "No-Talk" signage, are recommended to overcome medication administration errors. Retaining experienced nurses to train and supervise inexperienced nurses, with a focus on medication safety, and providing adequate sleep hours for nurses would also help keep medication errors from occurring as frequently as observed in this study.
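The analysis step reported here, logistic regression producing adjusted odds ratios with 95% confidence intervals, looks roughly like the sketch below. The data are synthetic and the predictors are a hypothetical subset of the study's variables; the statsmodels package is assumed to be available.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 360  # one row per observed medication administration (synthetic data)

# Hypothetical binary predictors mirroring some of the study's variables.
night_shift = rng.binomial(1, 0.4, n)
interrupted = rng.binomial(1, 0.3, n)
experience_le10 = rng.binomial(1, 0.6, n)

# Synthetic outcome with built-in effects, for illustration only.
logit = -0.8 + 1.1 * night_shift + 0.4 * interrupted + 0.5 * experience_le10
error = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([night_shift, interrupted, experience_le10]))
fit = sm.Logit(error, X).fit(disp=0)

# Adjusted odds ratios with 95% confidence intervals, as in the abstract.
names = ["const", "night_shift", "interrupted", "experience_le10"]
for name, coef, (lo, hi) in zip(names, fit.params, fit.conf_int()):
    print(f"{name:16s} AOR = {np.exp(coef):.2f}  95% CI ({np.exp(lo):.2f}, {np.exp(hi):.2f})")
```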
Regenbogen, Scott E; Greenberg, Caprice C; Studdert, David M; Lipsitz, Stuart R; Zinner, Michael J; Gawande, Atul A
2007-11-01
To identify the most prevalent patterns of technical errors in surgery, and to evaluate commonly recommended interventions in light of these patterns. The majority of surgical adverse events involve technical errors, but little is known about the nature and causes of these events. We examined characteristics of technical errors and common contributing factors among closed surgical malpractice claims. Surgeon reviewers analyzed 444 randomly sampled surgical malpractice claims from four liability insurers. Among 258 claims in which injuries due to error were detected, 52% (n = 133) involved technical errors. These technical errors were further analyzed with a structured review instrument designed by qualitative content analysis. Forty-nine percent of the technical errors caused permanent disability; an additional 16% resulted in death. Two-thirds (65%) of the technical errors were linked to manual error, 9% to errors in judgment, and 26% to both manual and judgment error. A minority of technical errors involved advanced procedures requiring special training ("index operations"; 16%), surgeons inexperienced with the task (14%), or poorly supervised residents (9%). The majority involved experienced surgeons (73%) and occurred in routine, rather than index, operations (84%). Patient-related complexities, including emergencies, difficult or unexpected anatomy, and previous surgery, contributed to 61% of technical errors, and technology or systems failures contributed to 21%. Most technical errors occur in routine operations with experienced surgeons, under conditions of increased patient complexity or systems failure. Commonly recommended interventions, including restricting high-complexity operations to experienced surgeons, additional training for inexperienced surgeons, and stricter supervision of trainees, are likely to address only a minority of technical errors. Surgical safety research should instead focus on improving decision-making and performance in routine operations for complex patients and circumstances.
Sozda, Christopher N.; Larson, Michael J.; Kaufman, David A.S.; Schmalfuss, Ilona M.; Perlstein, William M.
2011-01-01
Continuous monitoring of one’s performance is invaluable for guiding behavior towards successful goal attainment by identifying deficits and strategically adjusting responses when performance is inadequate. In the present study, we exploited the advantages of event-related functional magnetic resonance imaging (fMRI) to examine brain activity associated with error-related processing after severe traumatic brain injury (sTBI). fMRI and behavioral data were acquired while 10 sTBI participants and 12 neurologically-healthy controls performed a task-switching cued-Stroop task. fMRI data were analyzed using a random-effects whole-brain voxel-wise general linear model and planned linear contrasts. Behaviorally, sTBI patients showed greater error-rate interference than neurologically-normal controls. fMRI data revealed that, compared to controls, sTBI patients showed greater magnitude error-related activation in the anterior cingulate cortex (ACC) and an increase in the overall spatial extent of error-related activation across cortical and subcortical regions. Implications for future research and potential limitations in conducting fMRI research in neurologically-impaired populations are discussed, as well as some potential benefits of employing multimodal imaging (e.g., fMRI and event-related potentials) of cognitive control processes in TBI. PMID:21756946
Sozda, Christopher N; Larson, Michael J; Kaufman, David A S; Schmalfuss, Ilona M; Perlstein, William M
2011-10-01
Continuous monitoring of one's performance is invaluable for guiding behavior towards successful goal attainment by identifying deficits and strategically adjusting responses when performance is inadequate. In the present study, we exploited the advantages of event-related functional magnetic resonance imaging (fMRI) to examine brain activity associated with error-related processing after severe traumatic brain injury (sTBI). fMRI and behavioral data were acquired while 10 sTBI participants and 12 neurologically-healthy controls performed a task-switching cued-Stroop task. fMRI data were analyzed using a random-effects whole-brain voxel-wise general linear model and planned linear contrasts. Behaviorally, sTBI patients showed greater error-rate interference than neurologically-normal controls. fMRI data revealed that, compared to controls, sTBI patients showed greater magnitude error-related activation in the anterior cingulate cortex (ACC) and an increase in the overall spatial extent of error-related activation across cortical and subcortical regions. Implications for future research and potential limitations in conducting fMRI research in neurologically-impaired populations are discussed, as well as some potential benefits of employing multimodal imaging (e.g., fMRI and event-related potentials) of cognitive control processes in TBI. Copyright © 2011 Elsevier B.V. All rights reserved.
Pezzetta, Rachele; Nicolardi, Valentina; Tidoni, Emmanuele; Aglioti, Salvatore Maria
2018-06-06
Detecting errors in one's own actions, and in the actions of others, is a crucial ability for adaptable and flexible behavior. Studies show that specific EEG signatures underpin the monitoring of observed erroneous actions (error-related negativity, error positivity, mid-frontal theta oscillations). However, the majority of action-observation studies used sequences of trials in which erroneous actions were less frequent than correct actions, so it was not possible to disentangle whether the activation of the performance monitoring system was due to an error (a violation of the intended goal) or to a surprise/novelty effect associated with a rare and unexpected event. Combining EEG and immersive virtual reality (IVR-CAVE system), we recorded the neural signal of 25 young adults who observed, from a first-person perspective, simple reach-to-grasp actions performed by an avatar aiming for a glass. Importantly, erroneous actions were more frequent than correct actions. Results showed that the observation of erroneous actions elicits the typical electro-cortical signatures of error monitoring, and therefore the violation of the action goal is still perceived as a salient event. The observation of correct actions elicited stronger alpha suppression, confirming the role of the alpha frequency band in the general orienting response to novel and infrequent stimuli. Our data provide novel evidence that an observed goal error (an action slip) triggers the performance monitoring system even when erroneous actions, which are typically the relevant events, occur more often than correct actions and are therefore not salient because of rarity.
Marquardt, Lynn; Eichele, Heike; Lundervold, Astri J.; Haavik, Jan; Eichele, Tom
2018-01-01
Introduction: Attention-deficit hyperactivity disorder (ADHD) is one of the most frequent neurodevelopmental disorders in children and tends to persist into adulthood. Evidence from neuropsychological, neuroimaging, and electrophysiological studies indicates that alterations of error processing are core symptoms in children and adolescents with ADHD. To test whether adults with ADHD show persisting deficits and compensatory processes, we investigated performance monitoring during stimulus-evaluation and response-selection, with a focus on errors, as well as within-group correlations with symptom scores. Methods: Fifty-five participants (27 ADHD and 28 controls) aged 19–55 years performed a modified flanker task during EEG recording with 64 electrodes, and the ADHD and control groups were compared on measures of behavioral task performance, event-related potentials of performance monitoring (N2, P3), and error processing (ERN, Pe). Adult ADHD Self-Report Scale (ASRS) was used to assess ADHD symptom load. Results: Adults with ADHD showed higher error rates in incompatible trials, and these error rates correlated positively with the ASRS scores. Also, we observed lower P3 amplitudes in incompatible trials, which were inversely correlated with symptom load in the ADHD group. Adults with ADHD also displayed reduced error-related ERN and Pe amplitudes. There were no significant differences in reaction time (RT) and RT variability between the two groups. Conclusion: Our findings show deviations of electrophysiological measures, suggesting reduced effortful engagement of attentional and error-monitoring processes in adults with ADHD. Associations between ADHD symptom scores, event-related potential amplitudes, and poorer task performance in the ADHD group further support this notion. PMID:29706908
Geographically correlated orbit error
NASA Technical Reports Server (NTRS)
Rosborough, G. W.
1989-01-01
The dominant error source in estimating the orbital position of a satellite from ground-based tracking data is the modeling of the Earth's gravity field. The resulting orbit error due to gravity field model errors is predominantly long wavelength in nature. This results in an orbit error signature that is strongly correlated over distances on the size of ocean basins. Anderle and Hoskin (1977) have shown that the orbit error along a given ground track is also correlated to some degree with the orbit error along adjacent ground tracks. This cross-track correlation is verified here and is found to be significant out to nearly 1000 kilometers in the case of TOPEX/POSEIDON when using the GEM-T1 gravity model. Finally, it was determined that even the orbit errors at points where ascending and descending ground traces cross are somewhat correlated. The implication of these various correlations is that the orbit error due to gravity error is geographically correlated. Such correlations have direct implications when using altimetry to recover oceanographic signals.
Sari, A Akbari; Doshmangir, L; Sheldon, T
2010-01-01
Understanding the nature and causes of medical adverse events may help their prevention. This systematic review explores the types, risk factors, and likely causes of preventable adverse events in the hospital sector. MEDLINE (1970-2008), EMBASE, CINAHL (1970-2005) and the reference lists were used to identify the studies and a structured narrative method used to synthesise the data. Operative adverse events were more common but less preventable and diagnostic adverse events less common but more preventable than other adverse events. Preventable adverse events were often associated with more than one contributory factor. The majority of adverse events were linked to individual human error, and a significant proportion of these caused serious patient harm. Equipment failure was involved in a small proportion of adverse events and rarely caused patient harm. The proportion of system failures varied widely ranging from 3% to 85% depending on the data collection and classification methods used. Operative adverse events are more common but less preventable than diagnostic adverse events. Adverse events are usually associated with more than one contributory factor, the majority are linked to individual human error, and a proportion of these with system failure.
Full temperature single event upset characterization of two microprocessor technologies
NASA Technical Reports Server (NTRS)
Nichols, Donald K.; Coss, James R.; Smith, L. S.; Rax, Bernard; Huebner, Mark
1988-01-01
Data for the 9450 I3L bipolar microprocessor and the 80C86 CMOS/epi (vintage 1985) microprocessor are presented, showing single-event soft errors over the full MIL-SPEC temperature range of -55 to +125 °C. These data show for the first time that the soft-error cross sections continue to decrease with decreasing temperature at subzero temperatures. The temperature dependence of the two parts, however, is very different.
Drawing conclusions: The effect of instructions on children's confabulation and fantasy errors.
Macleod, Emily; Gross, Julien; Hayne, Harlene
2016-01-01
Drawing is commonly used in forensic and clinical interviews with children. In these interviews, children are often allowed to draw without specific instructions about the purpose of the drawing materials. Here, we examined whether this practice influenced the accuracy of children's reports. Seventy-four 5- and 6-year-old children were interviewed one to two days after they took part in an interactive event. Some children were given drawing materials to use during the interview. Of these children, some were instructed to draw about the event, and some were given no additional instructions at all. Children who were instructed to draw about the event, or who were interviewed without drawing, made few errors. In contrast, children who drew without being given specific instructions reported more errors that were associated with both confabulation and fantasy. We conclude that, to maximise accuracy during interviews involving drawing, children should be directed to draw specifically about the interview topic.
Yang, Shu-Hui; Jerng, Jih-Shuin; Chen, Li-Chin; Li, Yu-Tsu; Huang, Hsiao-Fang; Wu, Chao-Ling; Chan, Jing-Yuan; Huang, Szu-Fen; Liang, Huey-Wen; Sun, Jui-Sheng
2017-11-03
Intra-hospital transportation (IHT) might compromise patient safety because of the different care settings and the higher demands placed on human operation. Reports on the incidence of IHT-related patient safety events and human failures remain limited. To perform a retrospective analysis of IHT-related events, human failures and unsafe acts. A hospital-wide IHT process and the database of the incident reporting system in a medical centre in Taiwan. All eligible IHT-related patient safety events between January 2010 and December 2015 were included. Incidence rate of IHT-related patient safety events, human failure modes, and types of unsafe acts. There were 206 patient safety events in 2 009 013 IHT sessions (102.5 per 1 000 000 sessions). Most events (n=148, 71.8%) did not involve patient harm, and process events (n=146, 70.9%) were most common. Events at the location of arrival (n=101, 49.0%) were most frequent; this location accounted for 61.0% of events with patient harm and 44.2% of those without harm (p<0.001). Of the events with human failures (n=186), the most common related process step was the preparation of the transportation team (n=91, 48.9%). Contributing unsafe acts included perceptual errors (n=14, 7.5%), decision errors (n=56, 30.1%), skill-based errors (n=48, 25.8%), and non-compliance (n=68, 36.6%). Multivariate analysis showed that human failure in the arrival and hand-off sub-process (OR 4.84, p<0.001) was associated with increased patient harm, whereas the presence of omission (OR 0.12, p<0.001) was associated with less patient harm. This study shows a need to reduce human failures to prevent patient harm during intra-hospital transportation. We suggest that the transportation team pay specific attention to the sub-process at the location of arrival and prevent errors other than omissions. Long-term monitoring of IHT-related events is also warranted. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Peak flood estimation using gene expression programming
NASA Astrophysics Data System (ADS)
Zorn, Conrad R.; Shamseldin, Asaad Y.
2015-12-01
As a case study for the Auckland Region of New Zealand, this paper investigates the potential use of gene-expression programming (GEP) in predicting specific return-period events, in comparison to the established and widely used Regional Flood Estimation (RFE) method. Initially calibrated to 14 gauged sites, the GEP-derived model was further validated against 10- and 100-year flood events, with relative errors of 29% and 18%, respectively. This compares with errors of 48% and 44% for the RFE method on the same flood events. While the effectiveness of GEP in predicting specific return-period events is made apparent, it is argued that the derived equations should be used in conjunction with existing methodologies rather than as a replacement.
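For readers who want to experiment with this class of method: the gplearn package provides tree-based genetic programming for symbolic regression, a close relative of gene-expression programming (GEP evolves linear chromosomes rather than trees, so this is an analogue, not a reimplementation of the paper's model). The catchment descriptors and the target relation below are invented.

```python
import numpy as np
from gplearn.genetic import SymbolicRegressor

rng = np.random.default_rng(5)

# Synthetic catchment descriptors and a made-up flood-peak relation to
# recover; real regional flood estimation would use gauged-site records.
X = np.column_stack([
    rng.uniform(5, 500, 200),     # catchment area, km^2
    rng.uniform(800, 2400, 200),  # mean annual rainfall, mm
    rng.uniform(0.5, 10, 200),    # channel slope, %
])
y = 2.0 * X[:, 0] ** 0.8 * (X[:, 1] / 1000) ** 1.2 + rng.normal(0, 1.0, 200)

est = SymbolicRegressor(
    population_size=2000, generations=20,
    function_set=("add", "sub", "mul", "div", "sqrt", "log"),
    parsimony_coefficient=0.001, random_state=0,
)
est.fit(X, y)
print(est._program)  # the evolved closed-form expression

pred = est.predict(X)
rel_err = np.median(np.abs(pred - y) / np.abs(y)) * 100
print(f"median relative error: {rel_err:.1f}%")
```

The appeal the paper points to is visible here: the output is an explicit equation that a practitioner can inspect and compare against an established regional formula, rather than a black-box predictor.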
[Second victim: Critical incident stress management in clinical medicine].
Schiechtl, B; Hunger, M S; Schwappach, D L; Schmidt, C E; Padosch, S A
2013-09-01
Critical incidents in clinical medicine can have far-reaching consequences for patient health. In cases of severe medical errors they can seriously harm the patient or even lead to death. Involvement in such an event can result in a stress reaction, a so-called acute posttraumatic stress disorder, in the healthcare provider, who becomes the so-called second victim of the adverse event. Psychological distress may not only have a long-lasting impact on the quality of life of the physician or caregiver involved, but may also affect the ability to provide safe patient care in the aftermath of adverse events. A literature review was performed to obtain information on caregiver responses to medical errors and to determine possible supportive strategies to mitigate the negative consequences of an adverse event on the second victim. An internet search and a search in Medline/PubMed for scientific studies were conducted using the key words "second victim", "medical error", "critical incident stress management" (CISM) and "critical incident stress reporting system" (CIRS). Sources from academic medical societies and public institutions that offer crisis management programs were analyzed. The data were sorted by main categories and relevance for hospitals, and the analysis was carried out using descriptive measures. In disaster medicine and aviation navigation services, the implementation of a CISM program is an efficient intervention to help staff recover after a traumatic event and return to normal functioning and behavior. Several other concepts for a clinical crisis management plan were identified. The integration of CISM and CISM-related programs in a clinical setting may provide efficient support in an acute crisis and may help the caregiver deal effectively with future error events and employee safety.
Tsuji-Akimoto, Sachiko; Hamada, Shinsuke; Yabe, Ichiro; Tamura, Itaru; Otsuki, Mika; Kobashi, Syoji; Sasaki, Hidenao
2010-12-01
Loss of communication is a critical problem for advanced amyotrophic lateral sclerosis (ALS) patients. It is mainly caused by severe dysarthria and disability of the dominant hand. However, reports show that about 50% of ALS patients have mild cognitive dysfunction, and there are a considerable number of case reports of Japanese ALS patients with agraphia. To clarify writing disabilities in non-demented ALS patients, eighteen non-demented ALS patients and 16 controls without neurological disorders were examined for frontal cognitive function and writing ability. To assess writing errors statistically, we scored composition ability with an original writing error index (WEI). The ALS and control groups did not differ significantly with regard to age, years of education, or general cognitive level. Two patients could not write a letter because of disability of the dominant hand. The WEI and the results of picture arrangement tests indicated significant impairment in the ALS patients. Auditory comprehension (Western Aphasia Battery; WAB IIC) and kanji dictation also showed mild impairment. Patients' writing errors consisted of both syntactic and letter-writing mistakes. Omission, substitution, displacement, and inappropriate placement of the phonic marks of kana were observed; these features have often been reported in Japanese patients with agraphia resulting from a frontal lobe lesion. The most frequent type of error was an omission of kana; the next most common was a missing subject. Writing errors might be a specific deficit in some non-demented ALS patients.
Hsieh, Shulan; Li, Tzu-Hsien; Tsai, Ling-Ling
2010-04-01
To examine whether monetary incentives attenuate the negative effects of sleep deprivation on cognitive performance in a flanker task that requires higher-level cognitive-control processes, including error monitoring. Twenty-four healthy adults aged 18 to 23 years were randomly divided into 2 subject groups: one received and the other did not receive monetary incentives for performance accuracy. Both subject groups performed a flanker task and underwent electroencephalographic recordings for event-related brain potentials after normal sleep and after 1 night of total sleep deprivation in a within-subject, counterbalanced, repeated-measures study design. Monetary incentives significantly enhanced the response accuracy and reaction time variability under both normal sleep and sleep-deprived conditions, and they reduced the effects of sleep deprivation on the subjective effort level, the amplitude of the error-related negativity (an error-related event-related potential component), and the latency of the P300 (an event-related potential variable related to attention processes). However, monetary incentives could not attenuate the effects of sleep deprivation on any measures of behavior performance, such as the response accuracy, reaction time variability, or posterror accuracy adjustments; nor could they reduce the effects of sleep deprivation on the amplitude of the Pe, another error-related event-related potential component. This study shows that motivation incentives selectively reduce the effects of total sleep deprivation on some brain activities, but they cannot attenuate the effects of sleep deprivation on performance decrements in tasks that require high-level cognitive-control processes. Thus, monetary incentives and sleep deprivation may act through both common and different mechanisms to affect cognitive performance.
Factors that influence the generation of autobiographical memory conjunction errors
Devitt, Aleea L.; Monk-Fromont, Edwin; Schacter, Daniel L.; Addis, Donna Rose
2015-01-01
The constructive nature of memory is generally adaptive, allowing us to efficiently store, process and learn from life events, and simulate future scenarios to prepare ourselves for what may come. However, the cost of a flexibly constructive memory system is the occasional conjunction error, whereby the components of an event are authentic, but the combination of those components is false. Using a novel recombination paradigm, it was demonstrated that details from one autobiographical memory may be incorrectly incorporated into another, forming autobiographical memory conjunction errors that elude typical reality monitoring checks. The factors that contribute to the creation of these conjunction errors were examined across two experiments. Conjunction errors were more likely to occur when the corresponding details were partially rather than fully recombined, likely due to increased plausibility and ease of simulation of partially recombined scenarios. Brief periods of imagination increased conjunction error rates, in line with the imagination inflation effect. Subjective ratings suggest that this inflation is due to similarity of phenomenological experience between conjunction and authentic memories, consistent with a source monitoring perspective. Moreover, objective scoring of memory content indicates that increased perceptual detail may be particularly important for the formation of autobiographical memory conjunction errors. PMID:25611492
Patient Safety in the Context of Neonatal Intensive Care: Research and Educational Opportunities
Raju, Tonse N. K.; Suresh, Gautham; Higgins, Rosemary D.
2012-01-01
Case reports and observational studies continue to report adverse events from medical errors. However, despite considerable attention to patient safety in the popular media, this topic is not a regular component of medical education, and much research needs to be carried out to understand the causes, consequences, and prevention of healthcare-related adverse events during neonatal intensive care. To address the knowledge gaps and to formulate a research and educational agenda in neonatology, the Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD) invited a panel of experts to a workshop in August 2010. Patient safety issues discussed were: the reasons for errors, including systems design, working conditions, and worker fatigue; a need to develop a “culture” of patient safety; the role of electronic medical records, information technology, and simulators in reducing errors; error disclosure practices; medico-legal concerns; and educational needs. Specific neonatology-related topics discussed were: errors during resuscitation, mechanical ventilation, and performance of invasive procedures; medication errors including those associated with milk feedings; diagnostic errors; and misidentification of patients. This article provides an executive summary of the workshop. PMID:21386749
Factors that influence the generation of autobiographical memory conjunction errors.
Devitt, Aleea L; Monk-Fromont, Edwin; Schacter, Daniel L; Addis, Donna Rose
2016-01-01
The constructive nature of memory is generally adaptive, allowing us to efficiently store, process and learn from life events, and simulate future scenarios to prepare ourselves for what may come. However, the cost of a flexibly constructive memory system is the occasional conjunction error, whereby the components of an event are authentic, but the combination of those components is false. Using a novel recombination paradigm, it was demonstrated that details from one autobiographical memory (AM) may be incorrectly incorporated into another, forming AM conjunction errors that elude typical reality monitoring checks. The factors that contribute to the creation of these conjunction errors were examined across two experiments. Conjunction errors were more likely to occur when the corresponding details were partially rather than fully recombined, likely due to increased plausibility and ease of simulation of partially recombined scenarios. Brief periods of imagination increased conjunction error rates, in line with the imagination inflation effect. Subjective ratings suggest that this inflation is due to similarity of phenomenological experience between conjunction and authentic memories, consistent with a source monitoring perspective. Moreover, objective scoring of memory content indicates that increased perceptual detail may be particularly important for the formation of AM conjunction errors.
Fargen, Kyle M; Friedman, William A
2014-01-01
During the last 2 decades, there has been a shift in the U.S. health care system towards improving the quality of health care provided by enhancing patient safety and reducing medical errors. Unfortunately, surgical complications, patient harm events, and malpractice claims remain common in the field of neurosurgery. Many of these events are potentially avoidable. There are an increasing number of publications in the medical literature in which authors address cognitive errors in diagnosis and treatment and strategies for reducing such errors, but these are for the most part absent in the neurosurgical literature. The purpose of this article is to highlight the complexities of medical decision making to a neurosurgical audience, with the hope of providing insight into the biases that lead us towards error and strategies to overcome our innate cognitive deficiencies. To accomplish this goal, we review the current literature on medical errors and just culture, explain the dual process theory of cognition, identify common cognitive errors affecting neurosurgeons in practice, review cognitive debiasing strategies, and finally provide simple methods that can be easily assimilated into neurosurgical practice to improve clinical decision making. Copyright © 2014 Elsevier Inc. All rights reserved.
Kostopoulou, Olga; Delaney, Brendan
2007-04-01
To classify events of actual or potential harm to primary care patients using a multilevel taxonomy of cognitive and system factors. Observational study of patient safety events obtained via a confidential but not anonymous reporting system. Reports were followed up with interviews where necessary. Events were analysed for their causes and contributing factors using causal trees and were classified using the taxonomy. Five general medical practices in the West Midlands were selected to represent a range of sizes and types of patient population. All practice staff were invited to report patient safety events. Main outcome measures were frequencies of clinical types of events reported, cognitive types of error, types of detection and contributing factors; and relationship between types of error, practice size, patient consequences and detection. 78 reports were relevant to patient safety and analysable. They included 21 (27%) adverse events and 50 (64%) near misses. 16.7% (13/71) had serious patient consequences, including one death. 75.7% (59/78) had the potential for serious patient harm. Most reports referred to administrative errors (25.6%, 20/78). 60% (47/78) of the reports contained sufficient information to characterise cognition: "situation assessment and response selection" was involved in 45% (21/47) of these reports and was often linked to serious potential consequences. The most frequent contributing factor was work organisation, identified in 71 events. This included excessive task demands (47%, 37/71) and fragmentation (28%, 22/71). Even though most reported events were near misses, events with serious patient consequences were also reported. Failures in situation assessment and response selection, a cognitive activity that occurs in both clinical and administrative tasks, was related to serious potential harm.
Kostopoulou, Olga; Delaney, Brendan
2007-01-01
Objective To classify events of actual or potential harm to primary care patients using a multilevel taxonomy of cognitive and system factors. Methods Observational study of patient safety events obtained via a confidential but not anonymous reporting system. Reports were followed up with interviews where necessary. Events were analysed for their causes and contributing factors using causal trees and were classified using the taxonomy. Five general medical practices in the West Midlands were selected to represent a range of sizes and types of patient population. All practice staff were invited to report patient safety events. Main outcome measures were frequencies of clinical types of events reported, cognitive types of error, types of detection and contributing factors; and relationship between types of error, practice size, patient consequences and detection. Results 78 reports were relevant to patient safety and analysable. They included 21 (27%) adverse events and 50 (64%) near misses. 16.7% (13/71) had serious patient consequences, including one death. 75.7% (59/78) had the potential for serious patient harm. Most reports referred to administrative errors (25.6%, 20/78). 60% (47/78) of the reports contained sufficient information to characterise cognition: “situation assessment and response selection” was involved in 45% (21/47) of these reports and was often linked to serious potential consequences. The most frequent contributing factor was work organisation, identified in 71 events. This included excessive task demands (47%, 37/71) and fragmentation (28%, 22/71). Conclusions Even though most reported events were near misses, events with serious patient consequences were also reported. Failures in situation assessment and response selection, a cognitive activity that occurs in both clinical and administrative tasks, was related to serious potential harm. PMID:17403753
λ elements for singular problems in CFD: Viscoelastic fluids
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wong, K.K.; Surana, K.S.
1996-10-01
This paper presents a two-dimensional λ element formulation for viscoelastic fluid flow containing point singularities in the flow field. Flows of viscoelastic fluids, even without singularities, are a difficult class of problems at increasing Deborah or Weissenberg number, due to the increased dominance of convective terms and thus increased hyperbolicity. In the present work the equations of fluid motion and the constitutive laws are recast as a first-order system of coupled equations with the use of auxiliary variables. The velocity, pressure and stresses are interpolated using equal-order C^0 λ element approximations. The Least Squares Finite Element Method (LSFEM) is used to construct the integral form (error functional I) corresponding to these equations. The error functional is constructed by taking the integrated sum, over the whole discretization, of the squares of the errors or residuals that result when the element approximation is substituted into these equations. The conditions resulting from the minimization of the error functional are satisfied by using Newton's method with line search. LSFEM has much superior performance when dealing with non-linear and convection-dominated problems.
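The LSFEM recipe described here (recast as a first-order system, interpolate all variables with equal-order C^0 elements, minimize the integrated squared residual) can be demonstrated on a linear one-dimensional model problem. The sketch below is illustrative only: because the model problem is linear, the minimization collapses to a single least-squares solve, whereas the paper's nonlinear viscoelastic system requires Newton's method with line search.

```python
import numpy as np

# Model problem: -u'' = f on [0,1], u(0) = u(1) = 0, recast as the
# first-order system  u' - v = 0,  v' + f = 0,  with u and v both
# interpolated by equal-order C^0 linear elements.
n = 40                           # number of elements
h = 1.0 / n
x = np.linspace(0.0, 1.0, n + 1)
f = lambda s: np.ones_like(s)    # load; exact solution u = x(1-x)/2

# Unknowns: u at interior nodes (Dirichlet ends fixed at 0), then v at all nodes.
nu, nv = n - 1, n + 1
A = np.zeros((2 * n, nu + nv))
b = np.zeros(2 * n)

def ucol(i):                     # column index of u_i, or None at boundary nodes
    return i - 1 if 1 <= i <= n - 1 else None

for e in range(n):               # one-point (midpoint) quadrature per element
    w = np.sqrt(h)               # weight so each row squares to h * residual^2
    r1, r2 = 2 * e, 2 * e + 1
    for i, s in ((e, -1.0), (e + 1, 1.0)):
        if ucol(i) is not None:
            A[r1, ucol(i)] += w * s / h           # u' term of residual 1
        A[r1, nu + i] += -w * 0.5                 # -v term (midpoint average)
        A[r2, nu + i] += w * s / h                # v' term of residual 2
    b[r2] = -w * f(np.array([x[e] + h / 2]))[0]   # move +f to the right side

# Minimizing the quadratic error functional == solving this least-squares system.
sol, *_ = np.linalg.lstsq(A, b, rcond=None)
u = np.concatenate(([0.0], sol[:nu], [0.0]))
print("max |u - exact| =", np.abs(u - x * (1 - x) / 2).max())
```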
Pogue, Brian W; Song, Xiaomei; Tosteson, Tor D; McBride, Troy O; Jiang, Shudong; Paulsen, Keith D
2002-07-01
Near-infrared (NIR) diffuse tomography is an emerging method for imaging the interior of tissues to quantify concentrations of hemoglobin and exogenous chromophores non-invasively in vivo. It often exploits an optical diffusion model-based image reconstruction algorithm to estimate spatial property values from measurements of the light flux at the surface of the tissue. In this study, mean-squared error (MSE) over the image is used to evaluate methods for regularizing the ill-posed inverse image reconstruction problem in NIR tomography. Estimates of image bias and image standard deviation were calculated based upon 100 repeated reconstructions of a test image with randomly distributed noise added to the light flux measurements. It was observed that the bias error dominates at high regularization parameter values while variance dominates as the algorithm is allowed to approach the optimal solution. This optimum does not necessarily correspond to the minimum projection error solution, but typically requires further iteration with a decreasing regularization parameter to reach the lowest image error. Increasing measurement noise causes a need to constrain the minimum regularization parameter to higher values in order to achieve a minimum in the overall image MSE.
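The bias-variance trade-off the authors describe can be reproduced on a toy linear inverse problem: repeat noisy reconstructions at several regularization strengths and decompose the image MSE into squared bias plus variance. Everything below (kernel, phantom, noise level, Tikhonov inversion) is an invented stand-in for the diffusion-model reconstruction.

```python
import numpy as np

rng = np.random.default_rng(6)

# Illustrative 1D linear inverse problem standing in for diffuse tomography:
# a smoothing forward operator (ill-posed) and Tikhonov-regularized inversion.
n = 60
xg = np.linspace(0, 1, n)
K = np.exp(-((xg[:, None] - xg[None, :]) ** 2) / 0.01)  # smoothing kernel
truth = np.exp(-((xg - 0.4) ** 2) / 0.005)              # "absorber" to image
data_clean = K @ truth
noise_sigma = 0.05 * np.abs(data_clean).max()

for lam in (1e-1, 1e-2, 1e-3, 1e-4):
    # 100 repeated reconstructions with fresh noise, as in the study design.
    H = np.linalg.inv(K.T @ K + lam * np.eye(n)) @ K.T  # regularized inverse
    recs = np.array([H @ (data_clean + rng.normal(0, noise_sigma, n))
                     for _ in range(100)])
    bias2 = np.mean((recs.mean(axis=0) - truth) ** 2)   # squared image bias
    var = np.mean(recs.var(axis=0))                     # image variance
    print(f"lambda={lam:.0e}: bias^2={bias2:.4f} var={var:.4f} "
          f"MSE={bias2 + var:.4f}")
```

As in the abstract, the large-lambda end is bias-dominated, the small-lambda end is variance-dominated, and the minimum-MSE solution lies between the two.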
Sepsis in Poland: Why Do We Die?
Rorat, Marta; Jurek, Tomasz
2015-01-01
Objective To investigate the adverse events and potential risk factors in patients who develop sepsis. Subjects and Methods Fifty-five medico-legal opinion forms relating to sepsis cases issued by the Department of Forensic Medicine, Wroclaw, Poland, between 2004 and 2013 were analyzed for medical errors and risk factors for adverse events. Results The most common causes of medical errors were a lack of knowledge of recognition, diagnosis and therapy, as well as disregard of risk. The common risk factors for adverse events were deferral of a diagnostic or therapeutic decision, high-level anxiety of patients or their families about the patient's health, and actively seeking help. The most significant risk factors were communication errors, insufficient medical staff, stereotype-based thinking about diseases and providing easy explanations for serious symptoms. Conclusion The most common cause of adverse events related to sepsis in the Polish health-care system was a lack of knowledge about the symptoms, diagnosis and treatment, as well as the disregard of danger. A possible means of improving safety might be through spreading knowledge and creating medical management algorithms for all health-care workers, especially physicians. PMID:25501966
Disambiguating ventral striatum fMRI-related bold signal during reward prediction in schizophrenia
Morris, R W; Vercammen, A; Lenroot, R; Moore, L; Langton, J M; Short, B; Kulkarni, J; Curtis, J; O'Donnell, M; Weickert, C S; Weickert, T W
2012-01-01
Reward detection, surprise detection and prediction-error signaling have all been proposed as roles for the ventral striatum (vStr). Previous neuroimaging studies of striatal function in schizophrenia have found attenuated neural responses to reward-related prediction errors; however, as prediction errors represent a discrepancy in mesolimbic neural activity between expected and actual events, it is critical to examine responses to both expected and unexpected rewards in conjunction with expected and unexpected reward omissions in order to clarify the nature of ventral striatal dysfunction in schizophrenia. In the present study, healthy adults and people with schizophrenia were tested with a reward-related prediction-error task during functional magnetic resonance imaging to determine whether schizophrenia is associated with altered neural responses in the vStr to rewards, to surprise, to prediction errors, or to all three factors. In healthy adults, we found neural responses in the vStr were correlated more specifically with prediction errors than with surprising events or reward stimuli alone. People with schizophrenia did not display the normal differential activation between expected and unexpected rewards, which was partially due to exaggerated ventral striatal responses to expected rewards (right vStr) but also included blunted responses to unexpected outcomes (left vStr). This finding shows that neural responses, which typically are elicited by surprise, can also occur to well-predicted events in schizophrenia and identifies aberrant activity in the vStr as a key node of dysfunction in the neural circuitry used to differentiate expected and unexpected feedback in schizophrenia. PMID:21709684
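For reference, the reward prediction error that such task designs target is conventionally the discrepancy between the delivered and the expected reward,

```latex
\delta_t \;=\; r_t - \hat{r}_t ,
```

which is positive for unexpected rewards, negative for unexpected reward omissions, and near zero for well-predicted outcomes; this is why both expected and unexpected deliveries and omissions must be sampled to dissociate prediction-error coding from pure reward or surprise coding.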
The importance of matched poloidal spectra to error field correction in DIII-D
Paz-Soldan, Carlos; Lanctot, Matthew J.; Logan, Nikolas C.; ...
2014-07-09
Optimal error field correction (EFC) is thought to be achieved when coupling to the least-stable "dominant" mode of the plasma is nulled at each toroidal mode number (n). The limit of this picture is tested in the DIII-D tokamak by applying superpositions of in- and ex-vessel coil set n = 1 fields calculated to be fully orthogonal to the n = 1 dominant mode. In co-rotating H-mode and low-density Ohmic scenarios the plasma is found to be respectively 7x and 20x less sensitive to the orthogonal field as compared to the in-vessel coil set field. For the scenarios investigated, any geometry of EFC coil can thus recover a strong majority of the detrimental effect introduced by the n = 1 error field. Furthermore, despite low sensitivity to the orthogonal field, its optimization in H-mode is shown to be consistent with minimizing the neoclassical toroidal viscosity torque and not the higher-order n = 1 mode coupling.
Analyzing numerical errors in domain heat transport models using the CVBEM.
Hromadka, T.V.
1987-01-01
Besides providing an exact solution for steady-state heat conduction processes (Laplace-Poisson equations), the CVBEM (complex variable boundary element method) can be used for the numerical error analysis of domain model solutions. For problems where soil-water phase change latent heat effects dominate the thermal regime, heat transport can be approximately modeled as a time-stepped steady-state condition in the thawed and frozen regions, respectively. The CVBEM provides an exact solution of the two-dimensional steady-state heat transport problem, and also provides the error in matching the prescribed boundary conditions by the development of a modeling error distribution or an approximate boundary generation.
Climbing fibers predict movement kinematics and performance errors.
Streng, Martha L; Popa, Laurentiu S; Ebner, Timothy J
2017-09-01
Requisite for understanding cerebellar function is a complete characterization of the signals provided by complex spike (CS) discharge of Purkinje cells, the output neurons of the cerebellar cortex. Numerous studies have provided insights into CS function, with the predominant view being that they are evoked by error events. However, several reports suggest that CSs encode other aspects of movements and do not always respond to errors or unexpected perturbations. Here, we evaluated CS firing during a pseudo-random manual tracking task in the monkey (Macaca mulatta). This task provides extensive coverage of the work space and relative independence of movement parameters, delivering a robust data set to assess the signals that activate climbing fibers. Using reverse correlation, we determined feedforward and feedback CS firing probability maps with position, velocity, and acceleration, as well as position error, a measure of tracking performance. The direction and magnitude of the CS modulation were quantified using linear regression analysis. The major findings are that CSs significantly encode all three kinematic parameters and position error, with acceleration modulation particularly common. The modulation is not related to "events," either for position error or kinematics. Instead, CSs are spatially tuned and provide a linear representation of each parameter evaluated. The CS modulation is largely predictive. Similar analyses show that the simple spike firing is modulated by the same parameters as the CSs. Therefore, CSs carry a broader array of signals than previously described and argue for climbing fiber input having a prominent role in online motor control. NEW & NOTEWORTHY This article demonstrates that complex spike (CS) discharge of cerebellar Purkinje cells encodes multiple parameters of movement, including motor errors and kinematics. The CS firing is not driven by error or kinematic events; instead it provides a linear representation of each parameter. In contrast with the view that CSs carry feedback signals, the CSs are predominantly predictive of upcoming position errors and kinematics. Therefore, climbing fibers carry multiple and predictive signals for online motor control. Copyright © 2017 the American Physiological Society.
Nonlinear truncation error analysis of finite difference schemes for the Euler equations
NASA Technical Reports Server (NTRS)
Klopfer, G. H.; Mcrae, D. S.
1983-01-01
It is pointed out that, in general, dissipative finite difference integration schemes have been found to be quite robust when applied to the Euler equations of gas dynamics. The present investigation considers a modified equation analysis of both implicit and explicit finite difference techniques as applied to the Euler equations. The analysis is used to identify those error terms which contribute most to the observed solution errors. A technique for analytically removing the dominant error terms is demonstrated, resulting in a greatly improved solution for the explicit Lax-Wendroff schemes. It is shown that the nonlinear truncation errors are quite large and distributed quite differently for each of the three conservation equations as applied to a one-dimensional shock tube problem.
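To illustrate the flavor of a modified-equation analysis, the textbook result for the Lax-Wendroff scheme applied to scalar linear advection (a far simpler model than the Euler system analyzed in the paper) is

```latex
u_t + a\,u_x \;=\; \frac{a\,\Delta x^{2}}{6}\left(\nu^{2}-1\right)u_{xxx}
\;+\; O(\Delta x^{3}),
\qquad \nu = \frac{a\,\Delta t}{\Delta x},
```

where the leading dispersive term is the dominant truncation error; identifying such terms explicitly is what makes their analytic removal possible.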
Can we obtain the coefficient of restitution from the sound of a bouncing ball?
NASA Astrophysics Data System (ADS)
Heckel, Michael; Glielmo, Aldo; Gunkelmann, Nina; Pöschel, Thorsten
2016-03-01
The coefficient of restitution may be determined from the sound signal emitted by a sphere bouncing repeatedly off the ground. Although a large number of publications exploit this method, there has so far been no quantitative discussion of the error associated with this type of measurement. Analyzing the main error sources, we find that even tiny deviations of the shape from the perfect sphere may lead to substantial errors that dominate the overall error of the measurement. We therefore conclude that the well-established method of measuring the coefficient of restitution through the emitted sound is applicable only to nearly perfect spheres. For larger falling heights, air drag may also lead to considerable error.
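The underlying measurement principle is simple: between impacts the ball is in free flight for a time 2v/g, so the coefficient of restitution equals the ratio of successive inter-impact intervals in the recording. A minimal sketch (Python; the timestamps are hypothetical values standing in for audio-detected impacts):

```python
import numpy as np

def restitution_from_impact_times(impact_times):
    """Coefficient of restitution from impact times detected in a recording.
    Flight time between bounces is 2*v/g, so e = dt[n+1] / dt[n] for an
    ideal sphere with negligible air drag (the error sources analyzed in
    the paper perturb exactly this idealization)."""
    dt = np.diff(np.asarray(impact_times, dtype=float))  # flight times
    e = dt[1:] / dt[:-1]                                 # one estimate per pair
    return e.mean(), e.std()

# hypothetical impact times (s) extracted from the sound signal
print(restitution_from_impact_times([0.00, 0.90, 1.62, 2.20, 2.66]))
```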
NASA Technical Reports Server (NTRS)
Tuttle, M. E.; Brinson, H. F.
1986-01-01
The impact of slight errors in measured viscoelastic parameters on subsequent long-term viscoelastic predictions is numerically evaluated using the Schapery nonlinear viscoelastic model. Of the seven Schapery parameters, the results indicated that long-term predictions were most sensitive to errors in the power-law parameter n. Although errors in the other parameters were significant as well, errors in n dominated all other factors at long times. The process of selecting an appropriate short-term test cycle so as to ensure an accurate long-term prediction was considered, and a short-term test cycle was selected using material properties typical for T300/5208 graphite-epoxy at 149 C. The process of selection is described, and its individual steps are itemized.
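Why errors in n dominate at long times follows from the power-law form of the creep compliance. A toy sensitivity check (Python; a plain power law stands in for the full seven-parameter Schapery model, and the parameter values are invented, not T300/5208 properties):

```python
import numpy as np

D0, D1, n = 1.0, 0.1, 0.2                 # hypothetical compliance parameters
t = np.logspace(0, 6, 4)                  # times spanning six decades

def compliance(t, D0, D1, n):
    return D0 + D1 * t**n                 # power-law creep compliance D(t)

base = compliance(t, D0, D1, n)
for label, pert in (("n ", compliance(t, D0, D1, 1.05 * n)),
                    ("D1", compliance(t, D0, 1.05 * D1, n))):
    print(f"+5% in {label}: {np.round(100 * (pert / base - 1), 2)} % error")
```

A 5% error in D1 stays bounded for all time, while the same error in n compounds through t^n and keeps growing as the prediction horizon lengthens.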
Lin, Yanli; Moran, Tim P; Schroder, Hans S; Moser, Jason S
2015-10-01
Anxious apprehension/worry is associated with exaggerated error monitoring; however, the precise mechanisms underlying this relationship remain unclear. The current study tested the hypothesis that the worry-error monitoring relationship involves left-lateralized linguistic brain activity by examining the relationship between worry and error monitoring, indexed by the error-related negativity (ERN), as a function of hand of error (Experiment 1) and stimulus orientation (Experiment 2). Results revealed that worry was exclusively related to the ERN on right-handed errors committed by the linguistically dominant left hemisphere. Moreover, the right-hand ERN-worry relationship emerged only when stimuli were presented horizontally (known to activate verbal processes) but not vertically. Together, these findings suggest that the worry-ERN relationship involves left hemisphere verbal processing, elucidating a potential mechanism to explain error monitoring abnormalities in anxiety. Implications for theory and practice are discussed. © 2015 Society for Psychophysiological Research.
Yasui, Takuya; Kaga, Kimitaka; Sakai, Kuniyoshi L
2009-02-01
Using magnetoencephalography (MEG), we report here the hemispheric dominance of the auditory cortex that is selectively modulated by unexpected errors in the lyrics and melody of songs (lyrics and melody deviants), thereby elucidating under which conditions the lateralization of auditory processing changes. In experiment 1 using familiar songs, we found that the dipole strength of responses to the lyrics deviants was left-dominant at 140 ms (M140), whereas that of responses to the melody deviants was right-dominant at 130 ms (M130). In experiment 2 using familiar songs with a constant syllable or pitch, the dipole strength of frequency mismatch negativity elicited by oddballs was left-dominant. There were significant main effects of experiment (1 and 2) for the peak latencies and for the coordinates of the dipoles, indicating that the M140 and M130 were not the frequency mismatch negativity. In experiment 3 using newly memorized songs, the right-dominant M130 was observed only when the presented note was an unexpected one, independent of perceiving unnatural pitch transitions (i.e., perceptual saliency) and of selective attention to the melody of songs. The consistent right-dominance of the M130 between experiments 1 and 3 suggests that the M130 in experiment 1 is due to unexpected notes deviating from well-memorized songs. On the other hand, the left-dominant M140 was elicited by lyrics deviants, suggesting the influence of top-down linguistic information and the memory of the familiar songs. We thus conclude that the left-lateralized M140 and right-lateralized M130 reflect the expectation based on top-down information of language and music, respectively.
Avulsion research using flume experiments and highly accurate, temporally rich SfM datasets
NASA Astrophysics Data System (ADS)
Javernick, L.; Bertoldi, W.; Vitti, A.
2017-12-01
SfM's ability to produce high-quality, large-scale digital elevation models (DEMs) of complicated and rapidly evolving systems has made it a valuable technique for low-budget researchers and practitioners. While SfM has provided valuable datasets that capture single-flood-event DEMs, there is an increasing scientific need to capture higher temporal resolution datasets that can quantify the evolutionary processes instead of pre- and post-flood snapshots. However, the dangerous field conditions during flood events and image-matching challenges (e.g. wind, rain) prevent quality SfM image acquisition. Conversely, flume experiments offer opportunities to document flood events, but achieving DEMs consistent and accurate enough to detect subtle changes in dry and inundated areas remains a challenge for SfM (e.g. parabolic error signatures). This research aimed at investigating the impact of naturally occurring and manipulated avulsions on braided river morphology and on the encroachment of floodplain vegetation, using laboratory experiments. This required DEMs with millimeter accuracy and precision, at a temporal resolution sufficient to capture the processes. SfM was chosen as it offered the most practical method. Through redundant local network design and a meticulous ground control point (GCP) survey with a Leica Total Station in red laser configuration (reported 2 mm accuracy), the SfM residual errors compared to separate ground-truthing data produced mean errors of 1.5 mm (accuracy) and standard deviations of 1.4 mm (precision) without parabolic error signatures. Lighting conditions in the flume were limited to uniform, oblique, and filtered LED strips, which removed glint and thus improved bed elevation mean errors to 4 mm; errors were further reduced by means of an open-source software for refraction correction. The obtained datasets have provided the ability to quantify how small flood events with avulsion can have similar morphologic and vegetation impacts as large flood events without avulsion. Further, this research highlights the potential application of SfM in the laboratory and its ability to document physical and biological processes at greater spatial and temporal resolution. Marie Sklodowska-Curie Individual Fellowship: River-HMV, 656917
Defining and classifying medical error: lessons for patient safety reporting systems.
Tamuz, M; Thomas, E J; Franchois, K E
2004-02-01
It is important for healthcare providers to report safety-related events, but little attention has been paid to how the definition and classification of events affects a hospital's ability to learn from its experience. To examine how the definition and classification of safety-related events influences key organizational routines for gathering information, allocating incentives, and analyzing event reporting data. In semi-structured interviews, professional staff and administrators in a tertiary care teaching hospital and its pharmacy were asked to describe the existing programs designed to monitor medication safety, including the reporting systems. With a focus primarily on the pharmacy staff, interviews were audio recorded, transcribed, and analyzed using qualitative research methods. Eighty-six interviews were conducted, including 36 in the hospital pharmacy. Examples are presented which show that: (1) the definition of an event could lead to under-reporting; (2) the classification of a medication error into alternative categories can influence the perceived incentives and disincentives for incident reporting; (3) event classification can enhance or impede organizational routines for data analysis and learning; and (4) routines that promote organizational learning within the pharmacy can reduce the flow of medication error data to the hospital. These findings from one hospital raise important practical and research questions about how reporting systems are influenced by the definition and classification of safety-related events. By understanding more clearly how hospitals define and classify their experience, we may improve our capacity to learn and ultimately improve patient safety.
Spelling in adolescents with dyslexia: errors and modes of assessment.
Tops, Wim; Callens, Maaike; Bijn, Evi; Brysbaert, Marc
2014-01-01
In this study we focused on the spelling of high-functioning students with dyslexia. We made a detailed classification of the errors in a word and sentence dictation task made by 100 students with dyslexia and 100 matched control students. All participants were in the first year of their bachelor's studies and had Dutch as mother tongue. Three main error categories were distinguished: phonological, orthographic, and grammatical errors (on the basis of morphology and language-specific spelling rules). The results indicated that higher-education students with dyslexia made on average twice as many spelling errors as the controls, with effect sizes of d ≥ 2. When the errors were classified as phonological, orthographic, or grammatical, we found a slight dominance of phonological errors in students with dyslexia. Sentence dictation did not provide more information than word dictation in the correct classification of students with and without dyslexia. © Hammill Institute on Disabilities 2012.
Mirus, Benjamin B.
2015-01-01
Incorporating the influence of soil structure and horizons into parameterizations of distributed surface water/groundwater models remains a challenge. Often, only a single soil unit is employed, and soil-hydraulic properties are assigned based on textural classification, without evaluating the potential impact of these simplifications. This study uses a distributed physics-based model to assess the influence of soil horizons and structure on effective parameterization. This paper tests the viability of two established and widely used hydrogeologic methods for simulating runoff and variably saturated flow through layered soils: (1) accounting for vertical heterogeneity by combining hydrostratigraphic units with contrasting hydraulic properties into homogeneous, anisotropic units and (2) use of established pedotransfer functions based on soil texture alone to estimate water retention and conductivity, without accounting for the influence of pedon structures and hysteresis. The viability of this latter method for capturing the seasonal transition from runoff-dominated to evapotranspiration-dominated regimes is also tested here. For cases tested here, event-based simulations using simplified vertical heterogeneity did not capture the state-dependent anisotropy and complex combinations of runoff generation mechanisms resulting from permeability contrasts in layered hillslopes with complex topography. Continuous simulations using pedotransfer functions that do not account for the influence of soil structure and hysteresis generally over-predicted runoff, leading to propagation of substantial water balance errors. Analysis suggests that identifying a dominant hydropedological unit provides the most acceptable simplification of subsurface layering and that modified pedotransfer functions with steeper soil-water retention curves might adequately capture the influence of soil structure and hysteresis on hydrologic response in headwater catchments.
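The pedotransfer functions in question typically supply parameters for the van Genuchten retention model; a minimal sketch of that curve (Python; the parameter values are illustrative, not those used in the study) shows where the "steeper retention curve" modification enters through n:

```python
import numpy as np

def van_genuchten_theta(psi, theta_r, theta_s, alpha, n):
    """van Genuchten (1980) soil-water retention: volumetric water content
    as a function of suction head psi (positive, in m). Texture-based
    pedotransfer functions estimate theta_r, theta_s, alpha, and n; a
    larger n steepens the curve, as the abstract suggests may be needed
    to capture soil-structure effects."""
    m = 1.0 - 1.0 / n
    Se = (1.0 + (alpha * np.abs(np.asarray(psi, float)))**n) ** (-m)
    return theta_r + (theta_s - theta_r) * Se

# illustrative loam-like parameters; psi in meters of suction head
print(van_genuchten_theta([0.1, 1.0, 10.0], 0.05, 0.45, alpha=3.6, n=1.56))
```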
[Improving blood safety: errors management in transfusion medicine].
Bujandrić, Nevenka; Grujić, Jasmina; Krga-Milanović, Mirjana
2014-01-01
The concept of blood safety includes the entire transfusion chain, starting with the collection of blood from the blood donor and ending with blood transfusion to the patient. The concept involves a quality management system with systematic monitoring of adverse reactions and incidents regarding the blood donor or patient. Monitoring of near-miss errors shows the critical points in the working process and increases transfusion safety. The aim of the study was to present the analysis results of adverse and unexpected events in transfusion practice with a potential risk to the health of blood donors and patients. This one-year retrospective study was based on the collection, analysis and interpretation of written reports on medical errors in the Blood Transfusion Institute of Vojvodina. Errors were classified according to their type, frequency and the part of the working process where they occurred. Possible causes and corrective actions were described for each error. The study showed that there were no errors with potential health consequences for the blood donor/patient. Errors with potentially damaging consequences for patients were detected throughout the entire transfusion chain. Most of the errors were identified in the preanalytical phase. The human factor was responsible for the largest number of errors. An error reporting system has an important role in error management and in the reduction of transfusion-related risk of adverse events and incidents. The ongoing analysis reveals the strengths and weaknesses of the entire process and indicates the necessary changes. Errors in transfusion medicine can be avoided in a large percentage of cases, and prevention is cost-effective, systematic and applicable.
AGILE confirmation of gamma-ray activity from the IceCube-170922A error region
NASA Astrophysics Data System (ADS)
Lucarelli, F.; Piano, G.; Pittori, C.; Verrecchia, F.; Tavani, M.; Bulgarelli, A.; Munar-Adrover, P.; Minervini, G.; Ursi, A.; Vercellone, S.; Donnarumma, I.; Fioretti, V.; Zoli, A.; Striani, E.; Cardillo, M.; Gianotti, F.; Trifoglio, M.; Giuliani, A.; Mereghetti, S.; Caraveo, P.; Perotti, F.; Chen, A.; Argan, A.; Costa, E.; Del Monte, E.; Evangelista, Y.; Feroci, M.; Lazzarotto, F.; Lapshov, I.; Pacciani, L.; Soffitta, P.; Sabatini, S.; Vittorini, V.; Pucella, G.; Rapisarda, M.; Di Cocco, G.; Fuschino, F.; Galli, M.; Labanti, C.; Marisaldi, M.; Pellizzoni, A.; Pilia, M.; Trois, A.; Barbiellini, G.; Vallazza, E.; Longo, F.; Morselli, A.; Picozza, P.; Prest, M.; Lipari, P.; Zanello, D.; Cattaneo, P. W.; Rappoldi, A.; Colafrancesco, S.; Parmiggiani, N.; Ferrari, A.; Paoletti, F.; Antonelli, A.; Giommi, P.; Salotti, L.; Valentini, G.; D'Amico, F.
2017-09-01
Following the IceCube observation of a high-energy neutrino candidate event, IceCube-170922A, at T0 = 17/09/22 20:54:30.43 UT (https://gcn.gsfc.nasa.gov/gcn3/21916.gcn3), and the detection of increased gamma-ray activity from a previously known Fermi-LAT gamma-ray source (3FGL J0509.4+0541) in the IceCube-170922A error region (ATel #10791), we have analysed the AGILE-GRID data acquired in the days before and after the neutrino event T0, searching for significant gamma-ray excess above 100 MeV from a position compatible with the IceCube and Fermi-LAT error regions.
Boquet, Albert J; Cohen, Tara N; Cabrera, Jennifer S; Litzinger, Tracy L; Captain, Kevin A; Fabian, Michael A; Miles, Steven G; Shappell, Scott A
2016-09-09
Historically, health care has relied on error management techniques to measure and reduce the occurrence of adverse events. This study proposes an alternative approach for identifying and analyzing hazardous events. Whereas previous research has concentrated on investigating individual flow disruptions, we maintain the industry should focus on threat windows, or the accumulation of these disruptions. This methodology, driven by the broken windows theory, allows us to identify process inefficiencies before they manifest and open the door for the occurrence of errors and adverse events. Medical human factors researchers observed disruptions during 34 trauma cases at a Level II trauma center. Data were collected during resuscitation and imaging and were classified using a human factors taxonomy: Realizing Improved Patient Care Through Human-Centered Operating Room Design for Threat Window Analysis (RIPCHORD-TWA). Of the 576 total disruptions observed, communication issues were the most prevalent (28%), followed by interruptions and coordination issues (24% each). Issues related to layout (16%), usability (5%), and equipment (2%) comprised the remainder of the observations. Disruptions involving communication issues were more prevalent during resuscitation, whereas coordination problems were observed more frequently during imaging. Rather than solely investigating errors and adverse events, we propose conceptualizing the accumulation of disruptions in terms of threat windows as a means to analyze potential threats to the integrity of the trauma care system. This approach allows for the improved identification of system weaknesses or threats, affording us the ability to address these inefficiencies and intervene before errors and adverse events may occur.
Dossett, Lesly A; Kauffmann, Rondi M; Lee, Jay S; Singh, Harkamal; Lee, M Catherine; Morris, Arden M; Jagsi, Reshma; Quinn, Gwendolyn P; Dimick, Justin B
2018-06-01
Our objective was to determine specialist physicians' attitudes and practices regarding disclosure of pre-referral errors. Physicians are encouraged to disclose their own errors to patients. However, no clear professional norms exist regarding disclosure when physicians discover errors in diagnosis or treatment that occurred at other institutions before referral. We conducted semistructured interviews of cancer specialists from 2 National Cancer Institute-designated Cancer Centers. We purposively sampled specialists by discipline, sex, and experience-level who self-described a >50% reliance on external referrals (n = 30). Thematic analysis of verbatim interview transcripts was performed to determine physician attitudes regarding disclosure of pre-referral medical errors; whether and how physicians disclose these errors; and barriers to providing full disclosure. Participants described their experiences identifying different types of pre-referral errors including errors of diagnosis, staging and treatment resulting in adverse events ranging from decreased quality of life to premature death. The majority of specialists expressed the belief that disclosure provided no benefit to patients, and might unnecessarily add to their anxiety about their diagnoses or prognoses. Specialists had varying practices of disclosure including none, non-verbal, partial, event-dependent, and full disclosure. They identified a number of barriers to disclosure, including medicolegal implications and damage to referral relationships, the profession's reputation, and to patient-physician relationships. Specialist physicians identify pre-referral errors but struggle with whether and how to provide disclosure, even when clinical circumstances force disclosure. Education- or communication-based interventions that overcome barriers to disclosing pre-referral errors warrant development.
Amori, Renee E; Pittas, Anastassios G; Siegel, Richard D; Kumar, Sanjaya; Chen, Jack S; Karnam, Suneel; Golden, Sherita H; Salem, Deeb N
2008-01-01
To describe characteristics of inpatient medical errors involving hypoglycemic medications and their impact on patient care. We conducted a cross-sectional analysis of medical errors and associated adverse events voluntarily reported by hospital employees and staff in 21 nonprofit, nonfederal health-care organizations in the United States that implemented a Web-based electronic error-reporting system (e-ERS) between August 1, 2000, and December 31, 2005. Persons reporting the errors determined the level of impact on patient care. The median duration of e-ERS use was 3.1 years, and 2,598 inpatient error reports involved insulin or orally administered hypoglycemic agents. Nursing staff provided 59% of the reports; physicians reported <2%. Approximately two-thirds of the errors (1,693 of 2,598) reached the patient. Errors that caused temporary harm necessitating major treatment or that caused permanent harm accounted for 1.5% of reports (40 of 2,598). Insulin was involved in 82% of reports, and orally administered hypoglycemic agents were involved in 18% of all reports (473 of 2,598). Sulfonylureas were implicated in 51.8% of reports involving oral hypoglycemic agents (9.4% of all reports). An e-ERS provides an accessible venue for reporting and tracking inpatient medical errors involving glucose-lowering medications. Results are limited by potential underreporting of events, particularly by physicians, and variations in the reporter perception of patient harm.
Development and validation of Aviation Causal Contributors for Error Reporting Systems (ACCERS).
Baker, David P; Krokos, Kelley J
2007-04-01
This investigation sought to develop a reliable and valid classification system for identifying and classifying the underlying causes of pilot errors reported under the Aviation Safety Action Program (ASAP). ASAP is a voluntary safety program that air carriers may establish to study pilot and crew performance on the line. In ASAP programs, similar to the Aviation Safety Reporting System, pilots self-report incidents by filing a short text description of the event. The identification of contributors to errors is critical if organizations are to improve human performance, yet it is difficult for analysts to extract this information from text narratives. A taxonomy was needed that could be used by pilots to classify the causes of errors. After completing a thorough literature review, pilot interviews and a card-sorting task were conducted in Studies 1 and 2 to develop the initial structure of the Aviation Causal Contributors for Event Reporting Systems (ACCERS) taxonomy. The reliability and utility of ACCERS were then tested in Studies 3a and 3b by having pilots independently classify the primary and secondary causes of ASAP reports. The results provided initial evidence for the internal and external validity of ACCERS. Pilots were found to demonstrate adequate levels of agreement with respect to their category classifications. ACCERS appears to be a useful system for studying human error captured under pilot ASAP reports. Future work should focus on how ACCERS is organized and whether it can be used or modified to classify human error in ASAP programs for other aviation-related job categories such as dispatchers. Potential applications of this research include systems in which individuals self-report errors and that attempt to extract and classify the causes of those events.
Medication errors in the emergency department: a systems approach to minimizing risk.
Peth, Howard A
2003-02-01
Adverse drug events caused by medication errors represent a common cause of patient injury in the practice of medicine. Many medication errors are preventable and hence particularly tragic when they occur, often with serious consequences. The enormous increase in the number of available drugs on the market makes it all but impossible for physicians, nurses, and pharmacists to possess the knowledge base necessary for fail-safe medication practice. Indeed, the greatest single systemic factor associated with medication errors is a deficiency in the knowledge requisite to the safe use of drugs. It is vital that physicians, nurses, and pharmacists have at their immediate disposal up-to-date drug references. Patients presenting for care in EDs are usually unfamiliar to their EPs and nurses, and the unique patient factors affecting medication response and toxicity are obscured. An appropriate history, physical examination, and diagnostic workup will assist EPs, nurses, and pharmacists in selecting the safest and most optimum therapeutic regimen for each patient. EDs deliver care "24/7" and are open when valuable information resources, such as hospital pharmacists and previously treating physicians, may not be available for consultation. A systems approach to the complex problem of medication errors will help emergency clinicians eliminate preventable adverse drug events and achieve a goal of a zero-defects system, in which medication errors are a thing of the past. New developments in information technology and the advent of electronic medical records with computerized physician order entry, ward-based clinical pharmacists, and standardized bar codes promise substantial reductions in the incidence of medication errors and adverse drug events. ED patients expect and deserve nothing less than the safest possible emergency medicine service.
NASA Technical Reports Server (NTRS)
Fields, J. M.
1980-01-01
The data from seven surveys of community response to environmental noise are reanalyzed to assess the relative influence of peak noise levels and the numbers of noise events on human response. The surveys do not agree on the value of the tradeoff between the effects of noise level and numbers of events. The value of the tradeoff cannot be confidently specified in any survey because the tradeoff estimate may have a large standard error of estimate and because the tradeoff estimate may be seriously biased by unknown noise measurement errors. Some evidence suggests a decrease in annoyance with very high numbers of noise events but this evidence is not strong enough to lead to the rejection of the conventionally accepted assumption that annoyance is related to a log transformation of the number of noise events.
A Bayesian framework for infrasound location
NASA Astrophysics Data System (ADS)
Modrak, Ryan T.; Arrowsmith, Stephen J.; Anderson, Dale N.
2010-04-01
We develop a framework for the location of infrasound events using back azimuths and infrasonic arrival times from multiple arrays. Bayesian infrasonic source location (BISL), developed here, estimates event location and associated credibility regions. BISL accounts for the unknown source-to-array path or phase by formulating the infrasonic group velocity as random. Differences between observed and predicted source-to-array traveltimes are partitioned into two additive Gaussian sources, measurement error and model error, the second of which accounts for the unknown influence of wind and temperature on the path. By applying the technique to both synthetic tests and ground-truth events, we highlight the complementary nature of back azimuths and arrival times for estimating well-constrained event locations. BISL is an extension of methods developed earlier by Arrowsmith et al. that provided simple bounds on location using a grid-search technique.
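A minimal sketch of the Bayesian grid-search idea (Python; Gaussian time and back-azimuth errors, a fixed group velocity, and a known origin time t0 are all simplifying assumptions here, whereas the published BISL treats the group velocity as random to absorb path/model error):

```python
import numpy as np

def bisl_grid(grid_xy, arrays_xy, t_obs, baz_obs, v_group, sig_t, sig_baz, t0):
    """Log posterior (up to a constant) over candidate epicenters, combining
    arrival-time and back-azimuth misfits as independent Gaussians.
    Coordinates are in km with x east and y north; baz is degrees from north."""
    lp = np.zeros(len(grid_xy))
    for (ax, ay), t, baz in zip(arrays_xy, t_obs, baz_obs):
        dx, dy = grid_xy[:, 0] - ax, grid_xy[:, 1] - ay     # array -> source
        t_pred = t0 + np.hypot(dx, dy) / v_group            # predicted arrival
        baz_pred = np.degrees(np.arctan2(dx, dy)) % 360.0   # predicted back azimuth
        dbaz = (baz - baz_pred + 180.0) % 360.0 - 180.0     # wrapped residual
        lp += -0.5 * ((t - t_pred) / sig_t) ** 2 - 0.5 * (dbaz / sig_baz) ** 2
    return lp

# synthetic test: two arrays observing a source at (50, 50) km
xx, yy = np.meshgrid(np.linspace(0, 100, 101), np.linspace(0, 100, 101))
grid = np.column_stack([xx.ravel(), yy.ravel()])
lp = bisl_grid(grid, [(0.0, 0.0), (100.0, 0.0)], t_obs=[250.0, 250.0],
               baz_obs=[45.0, 315.0], v_group=0.283, sig_t=5.0, sig_baz=2.0, t0=0.0)
print(grid[np.argmax(lp)])   # maximum a posteriori epicenter
```

Exponentiating and normalizing lp yields the posterior from which credibility regions are contoured.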
In-flight performance of pulse-processing system of the ASTRO-H/Hitomi soft x-ray spectrometer
NASA Astrophysics Data System (ADS)
Ishisaki, Yoshitaka; Yamada, Shinya; Seta, Hiromi; Tashiro, Makoto S.; Takeda, Sawako; Terada, Yukikatsu; Kato, Yuka; Tsujimoto, Masahiro; Koyama, Shu; Mitsuda, Kazuhisa; Sawada, Makoto; Boyce, Kevin R.; Chiao, Meng P.; Watanabe, Tomomi; Leutenegger, Maurice A.; Eckart, Megan E.; Porter, Frederick Scott; Kilbourne, Caroline Anne
2018-01-01
We summarize results of the initial in-orbit performance of the pulse shape processor (PSP) of the soft x-ray spectrometer instrument onboard ASTRO-H (Hitomi). Event formats, kinds of telemetry, and the pulse-processing parameters are described, and the parameter settings in orbit are listed. The PSP was powered on 2 days after launch, and the event threshold was lowered in orbit. The PSP functioned properly in orbit, with neither memory errors nor SpaceWire communication errors until the break-up of the spacecraft. Time assignment, electrical crosstalk, and the event screening criteria are studied. It is confirmed that the event processing rate at 100% central processing unit load is ~200 c/s/array, compliant with the requirement on the PSP.
Aronis, Konstantinos N.; Ashikaga, Hiroshi
2018-01-01
Background Conflicting evidence exists on the efficacy of focal impulse and rotor modulation (FIRM) in atrial fibrillation (AF) ablation. A potential explanation is inaccurate rotor localization due to the coexistence of multiple rotors and the relatively large (9–11 mm) inter-electrode distance (IED) of the multi-electrode basket catheter. Methods and results We studied a numerical model of cardiac action potential to reproduce one through seven rotors in a two-dimensional lattice. We estimated rotor location using phase singularity, Shannon entropy and dominant frequency. We then spatially downsampled the time series to create IEDs of 2–30 mm. The error of rotor localization was measured with reference to the dynamics of phase singularity at the original spatial resolution (IED = 1 mm). IED has a significant impact on the error for all the methods. When only one rotor is present, the error increases exponentially as a function of IED. At the clinical IED of 10 mm, the error is 3.8 mm (phase singularity), 3.7 mm (dominant frequency), and 11.8 mm (Shannon entropy). When more than one rotor is present, the error of rotor localization increases 10-fold. The error based on the phase singularity method at the clinical IED of 10 mm ranges from 30.0 mm (two rotors) to 96.1 mm (five rotors). Conclusions The magnitude of error of rotor localization using a clinically available basket catheter in the presence of multiple rotors might be high enough to impact the accuracy of targeting during AF ablation. Improvement of catheter design and development of high-density mapping catheters may improve clinical outcomes of FIRM-guided AF ablation. PMID:28988690
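The downsampling experiment can be sketched with a standard winding-number detector for phase singularities (Python; a synthetic spiral phase map stands in for the cardiac action potential model, and the plaquette method below is a common choice rather than the study's exact pipeline):

```python
import numpy as np

def find_singularities(phase):
    """Phase singularities via the winding number around each 2x2 plaquette:
    the wrapped phase differences around a closed loop sum to +/-2*pi only
    if the loop encloses a singularity (rotor core)."""
    wrap = lambda a: (a + np.pi) % (2 * np.pi) - np.pi
    w = (wrap(phase[:-1, 1:] - phase[:-1, :-1]) +
         wrap(phase[1:, 1:] - phase[:-1, 1:]) +
         wrap(phase[1:, :-1] - phase[1:, 1:]) +
         wrap(phase[:-1, :-1] - phase[1:, :-1])) / (2 * np.pi)
    return np.argwhere(np.abs(w) > 0.5)   # (row, col) of plaquette origin

# synthetic rotor: spiral phase centered off-grid at (53.3, 57.7) mm
y, x = np.mgrid[0:101, 0:101].astype(float)
phase = np.arctan2(y - 53.3, x - 57.7)

for ied in (1, 5, 10):                    # inter-electrode distance, mm
    est = find_singularities(phase[::ied, ::ied]) * ied
    err = np.hypot(est[:, 0] - 53.3, est[:, 1] - 57.7).min()
    print(f"IED = {ied:2d} mm -> localization error {err:.1f} mm")
```

Even for this single, stationary rotor the localization error grows with IED simply because the estimate snaps to the coarser electrode lattice; with multiple interacting rotors the degradation is far larger, as the study quantifies.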
Giraldo, Priscila; Corbella, Josep; Rodrigo, Carmen; Comas, Mercè; Sala, Maria; Castells, Xavier
2016-01-01
To identify opportunities for disclosing information on medical errors in Spain and issuing an apology, as well as the legal-ethical barriers. A cross-sectional study was conducted through a questionnaire sent to health law and bioethics experts (n=46). A total of 39 experts (84.7%) responded that health providers should always disclose adverse events, and 38 experts (82.6%) were in favour of issuing an apology. Thirty experts (65.2%) reported that disclosure of errors would not lead to professional liability. The main opportunity for increasing disclosure was enhancing trust in the physician-patient relationship, and the main barrier was fear of the outcomes of disclosing medical errors. There is broad agreement on the lack of liability following disclosure of, and apology for, adverse events, and on the need to develop a disclosure strategy together with support for physicians. Copyright © 2015 SESPAS. Published by Elsevier Espana. All rights reserved.
Tozbikian, Gary; Gemignani, Mary L; Brogi, Edi
2017-09-01
The consequences of patient identification errors due to specimen mislabeling can be deleterious. We describe two near-miss events involving mislabeled breast specimens from two patients who sought treatment at our institution. In both cases, microscopic review of the slides identified inconsistencies between the histologic findings and patient age, unveiling specimen identification errors. By correlating the clinical information with the microscopic findings, we identified mistakes that had occurred at the time of specimen accessioning at the original laboratories. In both cases, thanks to a timely reassignment of the specimens, the patients suffered no harm. These cases highlight the importance of routine clinical and pathologic correlation as a critical component of quality assurance and patient safety. A review of possible specimen identification errors in the anatomic pathology setting is presented. © 2017 Wiley Periodicals, Inc.
Targeting errors in the ICU: use of a national database.
Kleinpell, Ruth; Thompson, David; Kelso, Lynn; Pronovost, Peter J
2006-12-01
The authors believe that as we move from viewing adverse event reporting systems as punitive, and as the safety culture improves, reporting will likely increase. Voluntary incident reporting systems can be used to improve patient safety in the ICU by identifying broken or inadequate systems that lead to adverse events [26]. Voluntary external reporting systems such as the ICUSRS can be used to target errors and produce evidence-based best-practice measures to improve patient safety in the ICU.
Thomey, Michell L; Collins, Scott L; Friggens, Michael T; Brown, Renee F; Pockman, William T
2014-11-01
For the southwestern United States, climate models project an increase in extreme precipitation events and prolonged dry periods. While most studies emphasize plant functional type response to precipitation variability, it is also important to understand the physiological characteristics of dominant plant species that define plant community composition and, in part, regulate ecosystem response to climate change. We utilized rainout shelters to alter the magnitude and frequency of rainfall and measured the physiological response of the dominant C4 grasses, Bouteloua eriopoda and Bouteloua gracilis. We hypothesized that: (1) the more drought-adapted B. eriopoda would exhibit faster recovery and higher rates of leaf-level photosynthesis (A(net)) than B. gracilis, (2) A(net) would be greater under the higher average soil water content in plots receiving 30-mm rainfall events, (3) co-dominance of B. eriopoda and B. gracilis in the ecotone would lead to intra-specific differences from the performance of each species at the site where it was dominant. Throughout the study, soil moisture explained 40-70% of the variation in A(net). Consequently, differences in rainfall treatments were not evident from intra-specific physiological function without sufficient divergence in soil moisture. Under low frequency, larger rainfall events B. gracilis exhibited improved water status and longer periods of C gain than B. eriopoda. Results from this study indicate that less frequent and larger rainfall events could provide a competitive advantage to B. gracilis and influence species composition across this arid-semiarid grassland ecotone.
Gustavsson, Peter; Förster, Alisa; Hofmeister, Wolfgang; Wincent, Josephine; Zachariadis, Vasilios; Anderlid, Britt-Marie; Nordgren, Ann; Mäkitie, Outi; Wirta, Valtteri; Käller, Max; Vezzi, Francesco; Lupski, James R; Nordenskjöld, Magnus; Lundberg, Elisabeth Syk; Carvalho, Claudia M. B.; Lindstrand, Anna
2016-01-01
Most balanced translocations are thought to result mechanistically from non-homologous end-joining (NHEJ) or, in rare cases of recurrent events, from nonallelic homologous recombination (NAHR). Here, we use low-coverage mate-pair whole-genome sequencing to fine-map rearrangement breakpoint junctions in both phenotypically normal and affected translocation carriers. In total, 46 junctions from 22 carriers of balanced translocations were characterized. Genes were disrupted in 48% of the breakpoints: recessive genes in four normal carriers and known dominant intellectual disability genes in three affected carriers. Finally, seven candidate disease genes were disrupted in five carriers with neurocognitive disabilities (SVOPL, SUSD1, TOX, NCALD, SLC4A10) and one XX-male carrier with Tourette syndrome (LYPD6, GPC5). Breakpoint junction analyses revealed microhomology and small templated insertions in a substantial fraction of the analyzed translocations (17.4%; n=4), an observation that was substantiated by reanalysis of 37 previously published translocation junctions. Microhomology associated with templated insertions is a characteristic seen in the breakpoint junctions of rearrangements mediated by the error-prone replication-based repair mechanisms (RBMs). Our data indicate that a mechanism involving template switching might contribute to the formation of at least 15% of the interchromosomal translocation events. PMID:27862604
Mackiewicz, Dorota; de Oliveira, Paulo Murilo Castro; Moss de Oliveira, Suzana; Cebrat, Stanisław
2013-01-01
Recombination is the main cause of genetic diversity. Thus, errors in this process can lead to chromosomal abnormalities. Recombination events are confined to narrow chromosome regions called hotspots in which characteristic DNA motifs are found. Genomic analyses have shown that both recombination hotspots and DNA motifs are distributed unevenly along human chromosomes and are much more frequent in the subtelomeric regions of chromosomes than in their central parts. Clusters of motifs roughly follow the distribution of recombination hotspots whereas single motifs show a negative correlation with the hotspot distribution. To model the phenomena related to recombination, we carried out computer Monte Carlo simulations of genome evolution. Computer simulations generated uneven distribution of hotspots with their domination in the subtelomeric regions of chromosomes. They also revealed that purifying selection eliminating defective alleles is strong enough to cause such hotspot distribution. After sufficiently long time of simulations, the structure of chromosomes reached a dynamic equilibrium, in which number and global distribution of both hotspots and defective alleles remained statistically unchanged, while their precise positions were shifted. This resembles the dynamic structure of human and chimpanzee genomes, where hotspots change their exact locations but the global distributions of recombination events are very similar. PMID:23776462
Five years of Project META - An all-sky narrow-band radio search for extraterrestrial signals
NASA Technical Reports Server (NTRS)
Horowitz, Paul; Sagan, Carl
1993-01-01
We have conducted a five-year search of the northern sky (declination δ between -30 and +60 deg) for narrow-band radio signals near the 1420 MHz line of neutral hydrogen, and its second harmonic, using an 8.4 × 10^6 channel Fourier spectrometer of 0.05 Hz resolution and 400 kHz instantaneous bandwidth. The observing frequency was corrected both for motions with respect to three astronomical inertial frames, and for the effect of Earth's rotation, which provides a characteristic changing Doppler signature for narrow-band signals of extraterrestrial origin. Among the 6 × 10^13 spectral channels searched, we have found 37 candidate events exceeding the average detection threshold of 1.7 × 10^-23 W/m^2, none of which was detected upon reobservation. The strongest of these appear to be dominated by rare processor errors. However, the strongest signals that survive culling for terrestrial interference lie in or near the Galactic plane. We describe the search and candidate events, and set limits on the prevalence of supercivilizations transmitting Doppler-precompensated beacons at H I or its second harmonic. We conclude with recommendations for future searches, based upon these findings, and a description of our next-generation search system.
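The size of that rotational Doppler signature is worth noting: the centripetal acceleration of an observatory at latitude λ imposes a drift on a fixed inertial-frame narrow-band signal of roughly

```latex
\dot f \;\simeq\; \frac{f\,\Omega_\oplus^{2} R_\oplus \cos\lambda}{c}
\;\approx\; 0.16~\mathrm{Hz\,s^{-1}}
\quad \text{at } f = 1420~\mathrm{MHz},\ \lambda = 0 ,
```

an order-of-magnitude figure (our estimate, not a number quoted in the abstract) showing that in 0.05 Hz channels a genuinely sidereal signal sweeps through many bins within seconds, which is the changing signature the search exploited to reject terrestrial interference.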
Woolf, Steven H.; Kuzel, Anton J.; Dovey, Susan M.; Phillips, Robert L.
2004-01-01
BACKGROUND Notions about the most common errors in medicine currently rest on conjecture and weak epidemiologic evidence. We sought to determine whether cascade analysis is of value in clarifying the epidemiology and causes of errors and whether physician reports are sensitive to the impact of errors on patients. METHODS Eighteen US family physicians participating in a 6-country international study filed 75 anonymous error reports. The narratives were examined to identify the chain of events and the predominant proximal errors. We tabulated the consequences to patients, both reported by physicians and inferred by investigators. RESULTS A chain of errors was documented in 77% of incidents. Although 83% of the errors that ultimately occurred were mistakes in treatment or diagnosis, 2 of 3 were set in motion by errors in communication. Fully 80% of the errors that initiated cascades involved informational or personal miscommunication. Examples of informational miscommunication included communication breakdowns among colleagues and with patients (44%), misinformation in the medical record (21%), mishandling of patients’ requests and messages (18%), inaccessible medical records (12%), and inadequate reminder systems (5%). When asked whether the patient was harmed, physicians answered affirmatively in 43% of cases in which their narratives described harms. Psychological and emotional effects accounted for 17% of physician-reported consequences but 69% of investigator-inferred consequences. CONCLUSIONS Cascade analysis of physicians’ error reports is helpful in understanding the precipitant chain of events, but physicians provide incomplete information about how patients are affected. Miscommunication appears to play an important role in propagating diagnostic and treatment mistakes. PMID:15335130
Huynh, Chi; Wong, Ian C K; Correa-West, Jo; Terry, David; McCarthy, Suzanne
2017-04-01
Since the publication of To Err Is Human: Building a Safer Health System in 1999, there has been much research conducted into the epidemiology, nature and causes of medication errors in children, from prescribing and supply to administration. It is reassuring to see growing evidence of improving medication safety in children; however, based on media reports, it can be seen that serious and fatal medication errors still occur. This critical opinion article examines the problem of medication errors in children and provides recommendations for research, training of healthcare professionals and a culture shift towards dealing with medication errors. There are three factors that we need to consider to unravel what is missing and why fatal medication errors still occur. (1) Who is involved and affected by the medication error? (2) What factors hinder staff and organisations from learning from mistakes? Does the fear of litigation and criminal charges deter healthcare professionals from voluntarily reporting medication errors? (3) What are the educational needs required to prevent medication errors? It is important to educate future healthcare professionals about medication errors and human factors to prevent these from happening. Further research is required to apply aviation's 'black box' principles in healthcare to record and learn from near misses and errors to prevent future events. There is an urgent need for the black box investigations to be published and made public for the benefit of other organisations that may have similar potential risks for adverse events. International sharing of investigations and learning is also needed.
Shifflett, Benjamin; Huang, Rong; Edland, Steven D
2017-01-01
Genotypic association studies are prone to inflated type I error rates if multiple hypothesis testing is performed, e.g., sequentially testing for recessive, multiplicative, and dominant risk. Alternatives to multiple hypothesis testing include the model-independent genotypic χ² test; the efficiency-robust MAX statistic, which corrects for multiple comparisons but with some loss of power; or a single Armitage test for multiplicative trend, which has optimal power when the multiplicative model holds but some loss of power when dominant or recessive models underlie the genetic association. We used Monte Carlo simulations to describe the relative performance of these three approaches under a range of scenarios. All three approaches maintained their nominal type I error rates. The genotypic χ² and MAX statistics were more powerful when testing a strictly recessive genetic effect or when testing a dominant effect when the allele frequency was high. The Armitage test for multiplicative trend was most powerful for the broad range of scenarios where heterozygote risk is intermediate between recessive and dominant risk. Moreover, all tests had limited power to detect recessive genetic risk unless the sample size was large, and conversely all tests were relatively well powered to detect dominant risk. Taken together, these results suggest the general utility of the multiplicative trend test when the underlying genetic model is unknown.
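A minimal implementation of the single-trend alternative (Python; the genotype counts in the demo are hypothetical) shows how the score choice encodes the genetic model:

```python
import math
import numpy as np

def armitage_trend(cases, controls, scores=(0, 1, 2)):
    """Cochran-Armitage trend test on genotype counts (aa, Aa, AA).
    The default scores (0, 1, 2) encode the additive/multiplicative trend;
    (0, 0, 1) and (0, 1, 1) would instead target recessive and dominant
    models, which is where the multiple-testing issue arises."""
    r, c = np.asarray(cases, float), np.asarray(controls, float)
    n = r + c                                       # totals per genotype
    N, R = n.sum(), r.sum()
    s = np.asarray(scores, float)
    u = (r * s).sum() - R * (n * s).sum() / N       # trend statistic
    var = (R * (N - R) / (N * (N - 1))) * ((n * s**2).sum() - (n * s).sum()**2 / N)
    z = u / math.sqrt(var)
    return z, math.erfc(abs(z) / math.sqrt(2.0))    # z-score, two-sided p

# hypothetical counts for genotypes (aa, Aa, AA)
print(armitage_trend(cases=[10, 60, 30], controls=[30, 50, 20]))
```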
Detecting Signatures of GRACE Sensor Errors in Range-Rate Residuals
NASA Astrophysics Data System (ADS)
Goswami, S.; Flury, J.
2016-12-01
Efforts to reach the GRACE baseline accuracy predicted by the early design simulations have been ongoing for a decade. The GRACE error budget is dominated by noise from the sensors, dealiasing models, and modeling errors. GRACE range-rate residuals contain these errors; thus, their analysis provides insight into the individual contributions to the error budget. Hence, we analyze the range-rate residuals with a focus on the contribution of sensor errors due to mis-pointing and poor ranging performance in GRACE solutions. For the analysis of pointing errors, we consider two reprocessed attitude datasets that differ in pointing performance. Range-rate residuals are then computed from each of these datasets and analysed. We further compare the system noise of the four K- and Ka-band frequencies of the two spacecraft with the range-rate residuals. Strong signatures of mis-pointing errors can be seen in the range-rate residuals. Correlations between range frequency noise and range-rate residuals are also seen.
Tilt Error in Cryospheric Surface Radiation Measurements at High Latitudes: A Model Study
NASA Astrophysics Data System (ADS)
Bogren, W.; Kylling, A.; Burkhart, J. F.
2015-12-01
We have evaluated the magnitude and makeup of error in cryospheric radiation observations due to small sensor misalignment in in-situ measurements of solar irradiance. This error is examined through simulation of diffuse and direct irradiance arriving at a detector with a cosine-response foreoptic. Emphasis is placed on assessing total error over the solar shortwave spectrum from 250 nm to 4500 nm, as well as supporting investigation over other relevant shortwave spectral ranges. The total measurement error introduced by sensor tilt is dominated by the direct component. For a typical high latitude albedo measurement with a solar zenith angle of 60°, a sensor tilted by 1, 3, and 5° can respectively introduce up to 2.6, 7.7, and 12.8% error into the measured irradiance and similar errors in the derived albedo. Depending on the daily range of solar azimuth and zenith angles, significant measurement error can persist also in integrated daily irradiance and albedo.
Higher-order ionospheric error at Arecibo, Millstone, and Jicamarca
NASA Astrophysics Data System (ADS)
Matteo, N. A.; Morton, Y. T.
2010-12-01
The ionosphere is a dominant source of Global Positioning System receiver range measurement error. Although dual-frequency receivers can eliminate the first-order ionospheric error, most second- and third-order errors remain in the range measurements. Higher-order ionospheric error is a function of both electron density distribution and the magnetic field vector along the GPS signal propagation path. This paper expands previous efforts by combining incoherent scatter radar (ISR) electron density measurements, the International Reference Ionosphere model, exponential decay extensions of electron densities, the International Geomagnetic Reference Field, and total electron content maps to compute higher-order error at ISRs in Arecibo, Puerto Rico; Jicamarca, Peru; and Millstone Hill, Massachusetts. Diurnal patterns, dependency on signal direction, seasonal variation, and geomagnetic activity dependency are analyzed. Higher-order error is largest at Arecibo with code phase maxima circa 7 cm for low-elevation southern signals. The maximum variation of the error over all angles of arrival is circa 8 cm.
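For orientation on the magnitudes involved, the sketch below evaluates the standard frequency scalings of the ionospheric range error: the first-order term 40.3·TEC/f² and a thin-shell estimate of the second-order term using the Bassiri-Hajj form (7527/2)·c·B∥·TEC/f³. The constant-field value and single-shell geometry are simplifying assumptions, not the paper's method, which integrates ISR electron-density profiles along the signal path; the ~7 cm maxima quoted above arise from full path integration at low elevations and high TEC.

```python
C1 = 40.3                     # m^3 s^-2, first-order coefficient
C2 = 7527 * 2.998e8 / 2.0     # second-order coefficient (Bassiri & Hajj form)

def iono_range_errors(tec, f, b_par=5e-5):
    """tec: slant TEC (electrons/m^2); f: carrier (Hz); b_par: field along path (T)."""
    first = C1 * tec / f**2                  # meters
    second = C2 * b_par * tec / f**3         # meters (phase form; code delay ~ 2x)
    return first, second

f_l1 = 1575.42e6                             # GPS L1 carrier frequency
for tecu in (10, 50, 100):
    e1, e2 = iono_range_errors(tecu * 1e16, f_l1)
    print(f"{tecu:3d} TECU: first-order {e1:5.2f} m, second-order {100 * e2:4.2f} cm")
```

The strong 1/f³ falloff of the second-order term relative to the 1/f² first-order term is why dual-frequency combinations remove almost, but not quite, all of the ionospheric delay.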
Seismic Characterization of the Newberry and Cooper Basin EGS Sites
NASA Astrophysics Data System (ADS)
Templeton, D. C.; Wang, J.; Goebel, M.; Johannesson, G.; Myers, S. C.; Harris, D.; Cladouhos, T. T.
2015-12-01
To aid in the seismic characterization of Engineered Geothermal Systems (EGS), we enhance traditional microearthquake detection and location methodologies at two EGS systems: the Newberry EGS site and the Habanero EGS site in the Cooper Basin of South Australia. We apply the Matched Field Processing (MFP) seismic imaging technique to detect new seismic events using known discrete microearthquake sources. Events identified using MFP typically have smaller magnitudes or occur within the coda of a larger event. Additionally, we apply a Bayesian multiple-event location algorithm, called MicroBayesLoc, to estimate the 95% probability ellipsoids for events with high signal-to-noise ratios (SNR). Such probability ellipsoid information can provide evidence for determining whether a seismic lineation is real or simply within the anticipated error range. At the Newberry EGS site, 235 events were reported in the original catalog. MFP identified 164 additional events (an increase of over 70%). For the relocated events in the Newberry catalog, we can distinguish two distinct seismic swarms that fall outside of one another's 95% probability error ellipsoids. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
Forster, Sarah E; Zirnheld, Patrick; Shekhar, Anantha; Steinhauer, Stuart R; O'Donnell, Brian F; Hetrick, William P
2017-09-01
Signals carried by the mesencephalic dopamine system and conveyed to anterior cingulate cortex are critically implicated in probabilistic reward learning and performance monitoring. A common evaluative mechanism purportedly subserves both functions, giving rise to homologous medial frontal negativities in feedback- and response-locked event-related brain potentials (the feedback-related negativity (FRN) and the error-related negativity (ERN), respectively), reflecting dopamine-dependent prediction error signals to unexpectedly negative events. Consistent with this model, the dopamine receptor antagonist, haloperidol, attenuates the ERN, but effects on FRN have not yet been evaluated. ERN and FRN were recorded during a temporal interval learning task (TILT) following randomized, double-blind administration of haloperidol (3 mg; n = 18), diphenhydramine (an active control for haloperidol; 25 mg; n = 20), or placebo (n = 21) to healthy controls. Centroparietal positivities, the Pe and feedback-locked P300, were also measured and correlations between ERP measures and behavioral indices of learning, overall accuracy, and post-error compensatory behavior were evaluated. We hypothesized that haloperidol would reduce ERN and FRN, but that ERN would uniquely track automatic, error-related performance adjustments, while FRN would be associated with learning and overall accuracy. As predicted, ERN was reduced by haloperidol and in those exhibiting less adaptive post-error performance; however, these effects were limited to ERNs following fast timing errors. In contrast, the FRN was not affected by drug condition, although increased FRN amplitude was associated with improved accuracy. Significant drug effects on centroparietal positivities were also absent. Our results support a functional and neurobiological dissociation between the ERN and FRN.
Single Event Effect Testing of the Micron MT46V128M8
NASA Technical Reports Server (NTRS)
Stansberry, Scott; Campola, Michael; Wilcox, Ted; Seidleck, Christina; Phan, Anthony
2017-01-01
The Micron MT46V128M8 was tested for single event effects (SEE) at the Texas A&M University Cyclotron Facility (TAMU) in June of 2017. Testing revealed a sensitivity to device hang-ups classified as single event functional interrupts (SEFI) and possible soft data errors classified as single event upsets (SEU).
Effect of rain on Ku-band scatterometer wind measurements
NASA Technical Reports Server (NTRS)
Spencer, Michael; Shimada, Masanobu
1991-01-01
The impact of precipitation on scatterometer wind measurements is investigated. A model is developed which includes the effects of rain attenuation, rain backscatter, and storm horizontal structure. Rain attenuation is found to be the dominant error source at low radar incidence angles and high wind speeds. Volume backscatter from the rain-loaded atmosphere, however, is found to dominate for high incidence angles and low wind speeds.
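A toy version of the error budget described above can be written directly from the stated physics: the measured backscatter is the two-way-attenuated surface return plus the rain volume return. All dB values below are invented for illustration; they are not the model coefficients from the paper.

```python
import numpy as np

def measured_sigma0(surface_db, two_way_atten_db, rain_db):
    """Measured backscatter: attenuated surface return + rain volume return."""
    lin = lambda db: 10 ** (db / 10)
    total = lin(surface_db) * lin(-two_way_atten_db) + lin(rain_db)
    return 10 * np.log10(total)

# High wind, low incidence: strong surface return, so attenuation dominates
print(measured_sigma0(-5, 3, -35))    # ~ -8 dB: the 3 dB attenuation shows through
# Low wind, high incidence: weak surface return, so rain backscatter dominates
print(measured_sigma0(-30, 3, -25))   # ~ -24 dB: pulled up toward the rain return
```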
Spatio-temporal error growth in the multi-scale Lorenz'96 model
NASA Astrophysics Data System (ADS)
Herrera, S.; Fernández, J.; Rodríguez, M. A.; Gutiérrez, J. M.
2010-07-01
The influence of multiple spatio-temporal scales on the error growth and predictability of atmospheric flows is analyzed. To this aim, we consider the two-scale Lorenz'96 model and study the interplay of the slow and fast variables in the error growth dynamics. It is shown that when the coupling between slow and fast variables is weak, the slow variables dominate the evolution of fluctuations, whereas in the case of strong coupling the fast variables impose a non-trivial complex error growth pattern on the slow variables with two different regimes, before and after saturation of the fast variables. This complex behavior is analyzed using the recently introduced Mean-Variance Logarithmic (MVL) diagram.
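The two-scale Lorenz'96 system used above is compact enough to reproduce in a few lines. The sketch below integrates the standard formulation with fourth-order Runge-Kutta and runs a twin experiment in which only the slow variables are perturbed, so the printed RMS difference traces the error growth discussed in the abstract; the parameter values (K, J, F, h, c, b) are common textbook choices, not necessarily those of the paper.

```python
import numpy as np

K, J = 8, 32                        # slow variables, fast variables per slow one
F, h, c, b = 20.0, 1.0, 10.0, 10.0  # forcing, coupling, time-scale/amplitude ratios

def tendencies(X, Y):
    """Standard two-scale Lorenz'96 right-hand sides (cyclic indexing)."""
    dX = (np.roll(X, 1) * (np.roll(X, -1) - np.roll(X, 2)) - X + F
          - (h * c / b) * Y.reshape(K, J).sum(axis=1))
    dY = (-c * b * np.roll(Y, -1) * (np.roll(Y, -2) - np.roll(Y, 1))
          - c * Y + (h * c / b) * np.repeat(X, J))
    return dX, dY

def rk4_step(X, Y, dt):
    k1x, k1y = tendencies(X, Y)
    k2x, k2y = tendencies(X + 0.5 * dt * k1x, Y + 0.5 * dt * k1y)
    k3x, k3y = tendencies(X + 0.5 * dt * k2x, Y + 0.5 * dt * k2y)
    k4x, k4y = tendencies(X + dt * k3x, Y + dt * k3y)
    return (X + dt / 6 * (k1x + 2 * k2x + 2 * k3x + k4x),
            Y + dt / 6 * (k1y + 2 * k2y + 2 * k3y + k4y))

rng = np.random.default_rng(0)
X, Y = rng.standard_normal(K), 0.1 * rng.standard_normal(K * J)
dt = 0.001
for _ in range(10000):              # spin-up onto the attractor
    X, Y = rk4_step(X, Y, dt)

# Twin experiment: perturb only the slow variables and watch the error grow
X2, Y2 = X + 1e-8 * rng.standard_normal(K), Y.copy()
for step in range(1, 5001):
    X, Y = rk4_step(X, Y, dt)
    X2, Y2 = rk4_step(X2, Y2, dt)
    if step % 1000 == 0:
        print(f"t = {step * dt:.1f}  slow-variable RMS error "
              f"{np.sqrt(np.mean((X - X2) ** 2)):.3e}")
```

Varying h between weak (e.g. 0.1) and strong (1.0) coupling reproduces the qualitative regime change the abstract describes.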
Barnabe, Christian; Buitrago, Rosio; Bremond, Philippe; Aliaga, Claudia; Salas, Renata; Vidaurre, Pablo; Herrera, Claudia; Cerqueira, Frédérique; Bosseno, Marie-France; Waleckx, Etienne; Breniere, Simone Frédérique
2013-01-01
Trypanosoma cruzi, the causative agent of Chagas disease, is subdivided into six discrete typing units (DTUs; TcI–TcVI), of which TcI is ubiquitous and genetically highly variable. While clonality is the dominant mode of propagation, recombination events play a significant evolutionary role. Recently, foci of wild Triatoma infestans, mainly infected by TcI, have been described in Bolivia. Hence, for the first time, we evaluated the level of genetic exchange within TcI natural potentially panmictic populations (single DTU, host, area and sampling time). Seventy-nine TcI stocks from wild T. infestans, belonging to six populations, were characterized at eight microsatellite loci. For each population, Hardy-Weinberg equilibrium (HWE), linkage disequilibrium (LD), and the presence of repeated multilocus genotypes (MLG) were analyzed using a total of seven statistics to test the null hypothesis of panmixia (H0). For three populations, none of the seven statistics allowed us to reject H0; for another, the small sample size did not allow us to conclude; and for the remaining two, the tests gave contradictory results. Interestingly, apparent panmixia was only observed in very restricted areas, and was not observed when grouping populations separated by only two kilometers or more. Nevertheless, it is worth stressing that for the statistical tests of HWE, in order to minimize the type I error (i.e., incorrect rejection of a true H0), we used the Bonferroni correction (BC), known to considerably increase the type II error (i.e., failure to reject a false H0). For the other tests (LD and MLG), we did not use BC, and the risk of type II error in these cases was acceptable. Thus, these results should be considered a good indicator of the existence of panmixia in the wild environment, but this must be confirmed on larger samples to reduce the risk of type II error. PMID:24312410
Uncertainties in Past and Future Global Water Availability
NASA Astrophysics Data System (ADS)
Sheffield, J.; Kam, J.
2014-12-01
Understanding how water availability changes on inter-annual to decadal time scales and how it may change in the future under climate change are a key part of understanding future stresses on water and food security. Historic evaluations of water availability on regional to global scales are generally based on large-scale model simulations with their associated uncertainties, in particular for long-term changes. Uncertainties are due to model errors and missing processes, parameter uncertainty, and errors in meteorological forcing data. Recent multi-model inter-comparisons and impact studies have highlighted large differences for past reconstructions, due to different simplifying assumptions in the models or the inclusion of physical processes such as CO2 fertilization. Modeling of direct anthropogenic factors such as water and land management also carry large uncertainties in their physical representation and from lack of socio-economic data. Furthermore, there is little understanding of the impact of uncertainties in the meteorological forcings that underpin these historic simulations. Similarly, future changes in water availability are highly uncertain due to climate model diversity, natural variability and scenario uncertainty, each of which dominates at different time scales. In particular, natural climate variability is expected to dominate any externally forced signal over the next several decades. We present results from multi-land surface model simulations of the historic global availability of water in the context of natural variability (droughts) and long-term changes (drying). The simulations take into account the impact of uncertainties in the meteorological forcings and the incorporation of water management in the form of reservoirs and irrigation. The results indicate that model uncertainty is important for short-term drought events, and forcing uncertainty is particularly important for long-term changes, especially uncertainty in precipitation due to reduced gauge density in recent years. We also discuss uncertainties in future projections from these models as driven by bias-corrected and downscaled CMIP5 climate projections, in the context of the balance between climate model robustness and climate model diversity.
Human errors and measurement uncertainty
NASA Astrophysics Data System (ADS)
Kuselman, Ilya; Pennecchi, Francesca
2015-04-01
Evaluating the residual risk of human errors in a measurement and testing laboratory, remaining after the error reduction by the laboratory quality system, and quantifying the consequences of this risk for the quality of the measurement/test results are discussed based on expert judgments and Monte Carlo simulations. A procedure for evaluation of the contribution of the residual risk to the measurement uncertainty budget is proposed. Examples are provided using earlier published sets of expert judgments on human errors in pH measurement of groundwater, elemental analysis of geological samples by inductively coupled plasma mass spectrometry, and multi-residue analysis of pesticides in fruits and vegetables. The human error contribution to the measurement uncertainty budget in the examples was not negligible, yet also not dominant. This was assessed as a good risk management result.
Prevention of medication errors: detection and audit.
Montesi, Germana; Lechi, Alessandro
2009-06-01
1. Medication errors have important implications for patient safety, and their identification is a main target in improving clinical practice, in order to prevent adverse events. 2. Error detection is the first crucial step. Approaches to this are likely to be different in research and routine care, and the most suitable must be chosen according to the setting. 3. The major methods for detecting medication errors and associated adverse drug-related events are chart review, computerized monitoring, administrative databases, and claims data, using direct observation, incident reporting, and patient monitoring. All of these methods have both advantages and limitations. 4. Reporting discloses medication errors, can trigger warnings, and encourages the diffusion of a culture of safe practice. Combining and comparing data from various sources increases the reliability of the system. 5. Error prevention can be planned by means of retroactive and proactive tools, such as audit and Failure Mode, Effect, and Criticality Analysis (FMECA). Audit is also an educational activity, which promotes high-quality care; it should be carried out regularly. In an audit cycle we can compare what is actually done against reference standards and put in place corrective actions to improve the performances of individuals and systems. 6. Patient safety must be the first aim in every setting, in order to build safer systems, learning from errors and reducing the human and fiscal costs.
Multiple-Event Seismic Location Using the Markov-Chain Monte Carlo Technique
NASA Astrophysics Data System (ADS)
Myers, S. C.; Johannesson, G.; Hanley, W.
2005-12-01
We develop a new multiple-event location algorithm (MCMCloc) that utilizes the Markov-Chain Monte Carlo (MCMC) method. Unlike most inverse methods, the MCMC approach produces a suite of solutions, each of which is consistent with observations and prior estimates of data and model uncertainties. Model parameters in MCMCloc consist of event hypocenters and travel-time predictions. Data are arrival time measurements and phase assignments. A posteriori estimates of event locations, path corrections, pick errors, and phase assignments are made through analysis of the posterior suite of acceptable solutions. Prior uncertainty estimates include correlations between travel-time predictions, correlations between measurement errors, the probability of misidentifying one phase for another, and the probability of spurious data. Inclusion of prior constraints on location accuracy allows direct utilization of ground-truth locations or well-constrained location parameters (e.g. from InSAR) that aid in the accuracy of the solution. Implementation of a correlation structure for travel-time predictions allows MCMCloc to operate over arbitrarily large geographic areas. The transition in behavior between a multiple-event locator for tightly clustered events and a single-event locator for solitary events is controlled by the spatial correlation of travel-time predictions. We test the MCMC locator on a regional data set of Nevada Test Site nuclear explosions. Event locations and origin times are known for these events, allowing us to test the features of MCMCloc using a high-quality ground truth data set. Preliminary tests suggest that MCMCloc provides excellent relative locations, often outperforming traditional multiple-event location algorithms, and excellent absolute locations are attained when constraints from one or more ground truth events are included. When phase assignments are switched, we find that MCMCloc properly corrects the error when predicted arrival times are separated by several seconds. In cases where the predicted arrival times are within the combined uncertainty of prediction and measurement errors, MCMCloc determines the probability of one or the other phase assignment and propagates this uncertainty into all model parameters. We find that MCMCloc is a promising method for simultaneously locating large, geographically distributed data sets. Because we incorporate prior knowledge on many parameters, MCMCloc is ideal for combining trusted data with data of unknown reliability. This work was performed under the auspices of the U.S. Department of Energy by the University of California Lawrence Livermore National Laboratory under contract No. W-7405-Eng-48, Contribution UCRL-ABS-215048.
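As a much-reduced illustration of the Bayesian machinery involved, the sketch below runs a Metropolis sampler for a single toy event: a 2-D epicenter and origin time under a constant-velocity travel-time model with Gaussian pick errors. It omits everything that makes MCMCloc distinctive (multiple events, correlated travel-time predictions, phase-assignment probabilities, spurious-data handling), and every numerical value is invented for the example.

```python
import numpy as np

rng = np.random.default_rng(7)
v = 6.0                                      # km/s, assumed constant velocity
stations = rng.uniform(-100, 100, (8, 2))    # station coordinates (km)
true_xy, true_t0 = np.array([12.0, -30.0]), 5.0
picks = (true_t0 + np.linalg.norm(stations - true_xy, axis=1) / v
         + 0.1 * rng.standard_normal(8))     # arrival times with 0.1 s pick noise

def log_like(xy, t0, sigma=0.1):
    pred = t0 + np.linalg.norm(stations - xy, axis=1) / v
    return -0.5 * np.sum(((picks - pred) / sigma) ** 2)

xy, t0 = np.zeros(2), 0.0                    # deliberately poor starting model
ll, samples = log_like(xy, t0), []
for i in range(20000):
    xy_p = xy + 2.0 * rng.standard_normal(2)          # km-scale proposals
    t0_p = t0 + 0.2 * rng.standard_normal()
    ll_p = log_like(xy_p, t0_p)
    if np.log(rng.random()) < ll_p - ll:              # Metropolis acceptance
        xy, t0, ll = xy_p, t0_p, ll_p
    if i >= 5000:                                     # discard burn-in
        samples.append(xy)
samples = np.array(samples)
print("posterior mean (km):", samples.mean(axis=0))
print("posterior std  (km):", samples.std(axis=0))
```

The retained sample cloud is the toy analogue of MCMCloc's "suite of acceptable solutions": its spread, not a single best-fit point, expresses the location uncertainty.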
NASA Technical Reports Server (NTRS)
Tasca, D. M.
1981-01-01
Single event upset phenomena are discussed, taking into account cosmic-ray-induced errors in IIL microprocessors and logic devices, single event upsets in NMOS microprocessors, a prediction model for bipolar RAMs in a high-energy ion/proton environment, the search for neutron-induced hard errors in VLSI structures, soft errors due to protons in the radiation belt, and the use of an ion microbeam to study single event upsets in microcircuits. Basic mechanisms in materials and devices are examined, giving attention to gamma-induced noise in CCDs, the annealing of MOS capacitors, an analysis of photobleaching techniques for the radiation hardening of fiber optic data links, a hardened field insulator, the simulation of radiation damage in solids, and the manufacturing of radiation-resistant optical fibers. Energy deposition and dosimetry are considered along with SGEMP/IEMP, radiation effects in devices, space radiation effects and spacecraft charging, EMP/SREMP, and aspects of fabrication, testing, and hardness assurance.
Hwang, Jee-In; Park, Hyeoun-Ae
2017-12-01
Healthcare professionals' systems thinking is emphasized for patient safety. To report nurses' systems thinking competency, and its relationship with medical error reporting and the occurrence of adverse events. A cross-sectional survey using a previously validated Systems Thinking Scale (STS), was conducted. Nurses from two teaching hospitals were invited to participate in the survey. There were 407 (60.3%) completed surveys. The mean STS score was 54.5 (SD 7.3) out of 80. Nurses with higher STS scores were more likely to report medical errors (odds ratio (OR) = 1.05; 95% confidence interval (CI) = 1.02-1.08) and were less likely to be involved in the occurrence of adverse events (OR = 0.96; 95% CI = 0.93-0.98). Nurses showed moderate systems thinking competency. Systems thinking was a significant factor associated with patient safety. Impact Statement: The findings of this study highlight the importance of enhancing nurses' systems thinking capacity to promote patient safety.
TH-B-BRC-01: How to Identify and Resolve Potential Clinical Errors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Das, I.
2016-06-15
Radiation treatment consists of a chain of events influenced by the quality of machine operation, beam data commissioning, machine calibration, patient-specific data, simulation, treatment planning, imaging and treatment delivery. There is always a chance that the clinical medical physicist may make, or fail to detect, an error in one of these events that may impact the patient's treatment. In the clinical scenario, errors may be systematic and, without peer review, may have low detectability because they are not part of routine QA procedures. During treatment, there may also be machine errors that need attention. External reviews of some of the treatment delivery components by independent reviewers, like IROC, can detect errors, but may not be timely. The goal of this session is to help junior clinical physicists identify potential errors as well as the quality assurance approach of performing a root cause analysis to find and eliminate an error and to continually monitor for errors. A compilation of potential errors will be presented, with examples of the thought process required to spot the error and determine the root cause. Examples may include unusual machine operation, erratic electrometer readings, consistently lower electron output, variation in photon output, body parts inadvertently left in the beam, unusual treatment plans, poor normalization, hot spots, etc. Awareness of the possibility and detection of error in any link of the treatment process chain will help improve the safe and accurate delivery of radiation to patients. Four experts will discuss how to identify errors in four areas of clinical treatment. D. Followill, NIH grant CA 180803.
NASA Astrophysics Data System (ADS)
Zakeri, Zeinab; Azadi, Majid; Ghader, Sarmad
2018-01-01
Satellite radiances and in-situ observations are assimilated through the Weather Research and Forecasting Data Assimilation (WRFDA) system into the Advanced Research WRF (ARW) model over Iran and its neighboring area. A domain-specific background error based on the x and y components of wind speed (UV) as control variables is calculated for the WRFDA system, and sensitivity experiments are carried out to compare the impact of the global background error and the domain-specific background error on precipitation and 2-m temperature forecasts over Iran. Three precipitation events that occurred over the country during January, September and October 2014 are simulated in three different experiments, and the results for precipitation and 2-m temperature are verified against the verifying surface observations. Results show that using the domain-specific background error consistently improves 2-m temperature and 24-h accumulated precipitation forecasts, while the global background error may even degrade the forecasts compared to experiments without data assimilation. The improvement in 2-m temperature is more evident during the first forecast hours and decreases significantly as the forecast length increases.
Quantum error-correction failure distributions: Comparison of coherent and stochastic error models
NASA Astrophysics Data System (ADS)
Barnes, Jeff P.; Trout, Colin J.; Lucarelli, Dennis; Clader, B. D.
2017-06-01
We compare failure distributions of quantum error correction circuits for stochastic errors and coherent errors. We utilize a fully coherent simulation of a fault-tolerant quantum error correcting circuit for a d = 3 Steane and surface code. We find that the output distributions are markedly different for the two error models, showing that no simple mapping between the two error models exists. Coherent errors create very broad and heavy-tailed failure distributions. This suggests that they are susceptible to outlier events and that mean statistics, such as pseudothreshold estimates, may not provide the key figure of merit. This provides further statistical insight into why coherent errors can be so harmful for quantum error correction. These output probability distributions may also provide a useful metric that can be utilized when optimizing quantum error correcting codes and decoding procedures for purely coherent errors.
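The heavy-tail phenomenon can be glimpsed even in a three-qubit repetition code, far smaller than the distance-3 Steane and surface codes simulated in the paper. The sketch below runs statevector trajectories in which every qubit suffers a coherent rotation exp(-iθX) each round, followed by perfect syndrome measurement and majority-vote correction, and compares the per-trajectory logical failure probabilities with a stochastic bit-flip channel of matched strength p = sin²θ; all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
theta = 0.05                        # per-round coherent rotation angle
p = np.sin(theta) ** 2              # matched stochastic flip probability

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
RX = np.cos(theta) * I2 - 1j * np.sin(theta) * X      # exp(-i*theta*X)

def op(q, M):
    """Embed single-qubit operator M on qubit q of 3 (qubit 0 = leftmost)."""
    mats = [I2, I2, I2]
    mats[q] = M
    return np.kron(np.kron(mats[0], mats[1]), mats[2])

U = op(0, RX) @ op(1, RX) @ op(2, RX)
basis = np.arange(8)
bit = lambda q: (basis >> (2 - q)) & 1
syndromes = (bit(0) ^ bit(1)) * 2 + (bit(1) ^ bit(2))  # (Z0Z1, Z1Z2) parities
correct_qubit = {0: None, 2: 0, 3: 1, 1: 2}            # syndrome -> flipped qubit

def coherent_trajectory(rounds=50):
    psi = np.zeros(8, complex)
    psi[0] = 1.0                                       # logical |0> = |000>
    for _ in range(rounds):
        psi = U @ psi
        probs = np.array([np.sum(np.abs(psi[syndromes == s]) ** 2)
                          for s in range(4)])
        s = rng.choice(4, p=probs / probs.sum())       # sample syndrome outcome
        psi = np.where(syndromes == s, psi, 0)
        psi /= np.linalg.norm(psi)
        if correct_qubit[s] is not None:
            psi = op(correct_qubit[s], X) @ psi
    return np.abs(psi[7]) ** 2                         # weight on logical |1>

def stochastic_trajectory(rounds=50):
    logical = 0
    for _ in range(rounds):
        if (rng.random(3) < p).sum() >= 2:             # majority vote fails
            logical ^= 1
    return float(logical)

coh = np.array([coherent_trajectory() for _ in range(2000)])
sto = np.array([stochastic_trajectory() for _ in range(2000)])
print(f"coherent:  mean {coh.mean():.2e}, median {np.median(coh):.2e}, "
      f"99th pct {np.quantile(coh, 0.99):.2e}")
print(f"stochastic: mean failure rate {sto.mean():.2e}")
```

The per-trajectory coherent failure probabilities span orders of magnitude depending on whether and when syndromes fired, which is the qualitative "broad, heavy-tailed" behavior the abstract describes.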
Gálvez, Carlos; Rivera-Cogollos, María Jesus; Galiana-Ivars, María; Bolufer, Sergio; Martínez-Adsuar, Francisco
2015-01-01
The management of surgical and medical intraoperative emergencies is included in the group of high-acuity (high potential severity of an event and high patient impact), low-opportunity (low frequency with which the team is required to manage the event) situations. This combination places the patient in a situation where medical errors can happen more frequently. Although medical errors are ubiquitous and inevitable, we should try to establish the knowledge, skills and attitudes needed for effective team performance and for guiding the management of a critical event as it develops. This strategy would probably reduce the incidence of error and improve decision-making. The approach comes from the management of critical events in the airline industry, applied in the surgical environment through crisis resource management (CRM) principles. CRM seeks to develop all the non-technical skills necessary in a critical situation, and also includes the tools needed to prevent such situations. The purpose of this special issue is to appraise and summarize the design, implementation, and efficacy of simulation-based CRM training programs for a specific surgery such as non-intubated video-assisted thoracoscopic surgery. PMID:26046052
Context and meter enhance long-range planning in music performance
Mathias, Brian; Pfordresher, Peter Q.; Palmer, Caroline
2015-01-01
Neural responses demonstrate evidence of resonance, or oscillation, during the production of periodic auditory events. Music contains periodic auditory events that give rise to a sense of beat, which in turn generates a sense of meter on the basis of multiple periodicities. Metrical hierarchies may aid memory for music by facilitating similarity-based associations among sequence events at different periodic distances that unfold in longer contexts. A fundamental question is how metrical associations arising from a musical context influence memory during music performance. Longer contexts may facilitate metrical associations at higher hierarchical levels more than shorter contexts, a prediction of the range model, a formal model of planning processes in music performance (Palmer and Pfordresher, 2003; Pfordresher et al., 2007). Serial ordering errors, in which intended sequence events are produced in incorrect sequence positions, were measured as skilled pianists performed musical pieces that contained excerpts embedded in long or short musical contexts. Pitch errors arose from metrically similar positions and further sequential distances more often when the excerpt was embedded in long contexts compared to short contexts. Musicians’ keystroke intensities and error rates also revealed influences of metrical hierarchies, which differed for performances in long and short contexts. The range model accounted for contextual effects and provided better fits to empirical findings when metrical associations between sequence events were included. Longer sequence contexts may facilitate planning during sequence production by increasing conceptual similarity between hierarchically associated events. These findings are consistent with the notion that neural oscillations at multiple periodicities may strengthen metrical associations across sequence events during planning. PMID:25628550
Buechel, Eva C; Zhang, Jiao; Morewedge, Carey K
2017-05-01
Affective forecasts are used to anticipate the hedonic impact of future events and decide which events to pursue or avoid. We propose that because affective forecasters are more sensitive to outcome specifications of events than experiencers, the outcome specification values of an event, such as its duration, magnitude, probability, and psychological distance, can be used to predict the direction of affective forecasting errors: whether affective forecasters will overestimate or underestimate its hedonic impact. When specifications are positively correlated with the hedonic impact of an event, forecasters will overestimate the extent to which high specification values will intensify and low specification values will discount its impact. When outcome specifications are negatively correlated with its hedonic impact, forecasters will overestimate the extent to which low specification values will intensify and high specification values will discount its impact. These affective forecasting errors compound additively when multiple specifications are aligned in their impact: In Experiment 1, affective forecasters underestimated the hedonic impact of winning a smaller prize that they expected to win, and they overestimated the hedonic impact of winning a larger prize that they did not expect to win. In Experiment 2, affective forecasters underestimated the hedonic impact of a short unpleasant video about a temporally distant event, and they overestimated the hedonic impact of a long unpleasant video about a temporally near event. Experiments 3A and 3B showed that differences in the affect-richness of forecasted and experienced events underlie these differences in sensitivity to outcome specifications, therefore accounting for both the impact bias and its reversal. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
de Wet, C; Bowie, P
2009-04-01
A multi-method strategy has been proposed to understand and improve the safety of primary care. The trigger tool is a relatively new method that has shown promise in American and secondary healthcare settings. It involves the focused review of a random sample of patient records using a series of "triggers" that alert reviewers to potential errors and previously undetected adverse events. To develop and test a global trigger tool to detect errors and adverse events in primary-care records. Trigger tool development was informed by previous research and content validated by expert opinion. The tool was applied by trained reviewers who worked in pairs to conduct focused audits of 100 randomly selected electronic patient records in each of five urban general practices in central Scotland. Review of 500 records revealed 2251 consultations and 730 triggers. An adverse event was found in 47 records (9.4%), indicating that harm occurred at a rate of one event per 48 consultations. Of these, 27 were judged to be preventable (42%). A further 17 records (3.4%) contained evidence of a potential adverse event. Harm severity was low to moderate for most patients (82.9%). Error and harm rates were higher in those aged ≥60 years, and most were medication-related (59%). The trigger tool was successful in identifying undetected patient harm in primary-care records and may be the most reliable method for achieving this. However, the feasibility of its routine application is open to question. The tool may have greater utility as a research rather than an audit technique. Further testing in larger, representative study samples is required.
NASA Technical Reports Server (NTRS)
Roberts, J. Brent; Clayson, C. A.
2012-01-01
The residual forcing necessary to close the mixed layer temperature budget (MLTB) on seasonal time scales is largest in regions of strongest surface heat flux forcing. Identifying the dominant source of error - surface heat flux error, mixed layer depth estimation, or ocean dynamical forcing - remains a challenge in the eastern tropical oceans, where ocean processes are very active. Improved sub-surface observations are necessary to better constrain the errors. 1. Mixed layer depth evolution is critical to the seasonal evolution of mixed layer temperatures. It determines the inertia of the mixed layer and scales the sensitivity of the MLTB to errors in surface heat flux and ocean dynamical forcing. This role affects the timing of errors in SST prediction. 2. Errors in the MLTB are larger than the historical 10 W m⁻² target accuracy. In some regions, a larger error can be tolerated if the goal is to resolve the seasonal SST cycle.
Transfer Alignment Error Compensator Design Based on Robust State Estimation
NASA Astrophysics Data System (ADS)
Lyou, Joon; Lim, You-Chol
This paper examines the transfer alignment problem of the StrapDown Inertial Navigation System (SDINS), which is subject to the ship's roll and pitch. The major error sources for velocity and attitude matching are the lever arm effect, measurement time delay, and ship-body flexure. To reduce these alignment errors, an error compensation method based on state augmentation and robust state estimation is devised. A linearized error model for the velocity and attitude matching transfer alignment system is derived first by linearizing the nonlinear measurement equation with respect to its time delay and the dominant Y-axis flexure, and by augmenting the delay state and flexure state into the conventional linear state equations. An H∞ filter is then introduced to account for modeling uncertainties in the time delay and the ship-body flexure. The simulation results show that this method considerably decreases the azimuth alignment errors.
Hessian matrix approach for determining error field sensitivity to coil deviations
NASA Astrophysics Data System (ADS)
Zhu, Caoxiang; Hudson, Stuart R.; Lazerson, Samuel A.; Song, Yuntao; Wan, Yuanxi
2018-05-01
The presence of error fields has been shown to degrade plasma confinement and drive instabilities. Error fields can arise from many sources, but are predominantly attributed to deviations in the coil geometry. In this paper, we introduce a Hessian matrix approach for determining error field sensitivity to coil deviations. A primary cost function used for designing stellarator coils, the surface integral of normalized normal field errors, was adopted to evaluate the deviation of the generated magnetic field from the desired magnetic field. The FOCUS code (Zhu et al 2018 Nucl. Fusion 58 016008) is utilized to provide fast and accurate calculations of the Hessian. The sensitivities of error fields to coil displacements are then determined by the eigenvalues of the Hessian matrix. A proof-of-principle example is given on a CNT-like configuration. We anticipate that this new method could provide information to avoid dominant coil misalignments and simplify coil designs for stellarators.
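The idea of ranking coil-deviation directions by Hessian eigenvalues can be demonstrated generically. The sketch below builds a finite-difference Hessian of a stand-in cost function and eigendecomposes it; FOCUS computes the Hessian analytically for the real normal-field cost, so both the cost function and the differencing here are illustrative stand-ins, not the paper's implementation.

```python
import numpy as np

def cost(x):
    """Stand-in for the normalized normal-field error of a coil set."""
    A = np.array([[4.0, 1.0, 0.0],
                  [1.0, 0.5, 0.2],
                  [0.0, 0.2, 0.05]])
    return 0.5 * x @ A @ x + 0.01 * np.sin(x[0] * x[2])

def hessian_fd(f, x, h=1e-4):
    """Central-difference Hessian of f at x."""
    n = x.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = h
            ej = np.zeros(n); ej[j] = h
            H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                       - f(x - ei + ej) + f(x - ei - ej)) / (4 * h * h)
    return 0.5 * (H + H.T)          # symmetrize against round-off

x0 = np.zeros(3)                    # nominal (optimized) coil parameters
vals, vecs = np.linalg.eigh(hessian_fd(cost, x0))
print("eigenvalues (sensitivity spectrum):", vals)
print("most dangerous deviation direction:", vecs[:, -1])
print("best-tolerated deviation direction:", vecs[:, 0])
```

The eigenvector paired with the largest eigenvalue identifies the coil displacement pattern the design tolerates worst, which is the information the abstract proposes to use for setting alignment tolerances.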
Low-Energy Proton Testing Methodology
NASA Technical Reports Server (NTRS)
Pellish, Jonathan A.; Marshall, Paul W.; Heidel, David F.; Schwank, James R.; Shaneyfelt, Marty R.; Xapsos, M.A.; Ladbury, Raymond L.; LaBel, Kenneth A.; Berg, Melanie; Kim, Hak S.;
2009-01-01
Use of low-energy protons and high-energy light ions is becoming necessary to investigate current-generation SEU thresholds. Systematic errors can dominate measurements made with low-energy protons. Range and energy straggling contribute to systematic error. Low-energy proton testing is not a step-and-repeat process. Low-energy protons and high-energy light ions can be used to measure SEU cross section of single sensitive features; important for simulation.
Shi, Lu-Feng; Morozova, Natalia
2012-08-01
Word recognition is a basic component in a comprehensive hearing evaluation, but data are lacking for listeners speaking two languages. This study obtained such data for Russian natives in the US and analysed the data using the perceptual assimilation model (PAM) and speech learning model (SLM). Listeners were randomly presented 200 NU-6 words in quiet. Listeners responded verbally and in writing. Performance was scored on words and phonemes (word-initial consonants, vowels, and word-final consonants). Seven normal-hearing, adult monolingual English natives (NM), 16 English-dominant (ED), and 15 Russian-dominant (RD) Russian natives participated. ED and RD listeners differed significantly in their language background. Consistent with the SLM, NM outperformed ED listeners and ED outperformed RD listeners, whether responses were scored on words or phonemes. NM and ED listeners shared similar phoneme error patterns, whereas RD listeners' errors had unique patterns that could be largely understood via the PAM. RD listeners had particular difficulty differentiating vowel contrasts /i-I/, /æ-ε/, and /ɑ-Λ/, word-initial consonant contrasts /p-h/ and /b-f/, and word-final contrasts /f-v/. Both first-language phonology and second-language learning history affect word and phoneme recognition. Current findings may help clinicians differentiate word recognition errors due to language background from hearing pathologies.
NASA Astrophysics Data System (ADS)
Jia, Chaoqing; Hu, Jun; Chen, Dongyan; Liu, Yurong; Alsaadi, Fuad E.
2018-07-01
In this paper, we discuss the event-triggered resilient filtering problem for a class of time-varying systems subject to stochastic uncertainties and successive packet dropouts. The event-triggered mechanism is employed in the hope of reducing the communication burden and saving network resources. The stochastic uncertainties are considered to describe the modelling errors, and the phenomenon of successive packet dropouts is characterized by a random variable obeying the Bernoulli distribution. The aim of the paper is to provide a resilient event-based filtering approach for the addressed time-varying systems such that, for all stochastic uncertainties, successive packet dropouts and filter gain perturbations, an optimized upper bound on the filtering error covariance is obtained by designing the filter gain. Finally, simulations are provided to demonstrate the effectiveness of the proposed robust optimal filtering strategy.
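To make the event-triggered idea concrete, the sketch below implements a scalar send-on-delta Kalman-type filter: the measurement enters the update only when it has moved beyond a threshold since the last transmission; otherwise the filter coasts on its prediction. This is a generic illustration of the triggering mechanism only; the paper's resilient design (gain perturbations, dropout statistics, covariance upper bounds) is not reproduced, and all parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(5)
a, c, q, r = 0.95, 1.0, 0.04, 0.25   # scalar system and noise parameters (invented)
delta = 0.5                          # send-on-delta trigger threshold

x_true, x_hat, P = 0.0, 0.0, 1.0
y_last, sent = 0.0, 0
for k in range(200):
    x_true = a * x_true + rng.normal(0.0, np.sqrt(q))      # state evolution
    y = c * x_true + rng.normal(0.0, np.sqrt(r))           # measurement
    x_hat, P = a * x_hat, a * a * P + q                    # time update
    if (y - y_last) ** 2 > delta:                          # event: transmit y
        y_last, sent = y, sent + 1
        S = c * P * c + r
        K = P * c / S
        x_hat += K * (y - c * x_hat)                       # measurement update
        P *= (1.0 - K * c)
print(f"transmitted {sent}/200 samples, final estimation error "
      f"{abs(x_true - x_hat):.3f}")
```

Raising delta trades estimation accuracy for fewer transmissions, which is precisely the communication-versus-performance trade-off the abstract targets.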
The association of shift-level nurse staffing with adverse patient events.
Patrician, Patricia A; Loan, Lori; McCarthy, Mary; Fridman, Moshe; Donaldson, Nancy; Bingham, Mona; Brosch, Laura R
2011-02-01
The objective of this study was to demonstrate the association between nurse staffing and adverse events at the shift level. Despite a growing body of research linking nurse staffing and patient outcomes, the relationship of staffing to patient falls and medication errors remains equivocal, possibly due to dependence on aggregated data. Thirteen military hospitals participated in creating a longitudinal nursing outcomes database to monitor nurse staffing, patient falls and medication errors, and other outcomes. Unit types were analyzed separately to stratify patient and nurse staffing characteristics. Bayesian hierarchical logistic regression modeling was used to examine associations between staffing and adverse events. RN skill mix, total nursing care hours, and experience, measured by a proxy variable, were associated with shift-level adverse events. Consideration must be given to nurse staffing and experience levels on every shift.
A Physician-based Voluntary Reporting System for Adverse Events and Medical Errors
Weingart, Saul N; Callanan, Lawrence D; Ship, Amy N; Aronson, Mark D
2001-01-01
OBJECTIVE To create a voluntary reporting method for identifying adverse events (AEs) and potential adverse events (PAEs) among medical inpatients. DESIGN Medical house officers asked their peers about obstacles to care, injuries or extended hospitalizations, and problems with medications that affected their patients. Two independent reviewers coded event narratives for adverse outcomes, responsible parties, preventability, and process problems. We corroborated house officers' reports with hospital incident reports and conducted a retrospective chart review. SETTING The cardiac step-down, oncology, and medical intensive care units of an urban teaching hospital. INTERVENTION Structured confidential interviews by postgraduate year-2 and -3 medical residents of interns during work rounds. MEASUREMENTS AND MAIN RESULTS Respondents reported 88 events over 3 months. AEs occurred among 5 patients (0.5% of admissions) and PAEs among 48 patients (4.9% of admissions). Delayed diagnoses and treatments figured prominently among PAEs (54%). Clinicians were responsible for the greatest number of incidents (55%), followed by workers in the laboratory (11%), radiology (15%), and pharmacy (3%). Respondents identified a variety of problematic processes of care, including problems with diagnosis (16%), therapy (26%), and failure to provide clinical and support services (29%). We corroborated 84% of reported events in the medical record. Participants found voluntary peer reporting of medical errors unobtrusive and agreed that it could be implemented on a regular basis. CONCLUSIONS A physician-based voluntary reporting system for medical errors is feasible and acceptable to front-line clinicians. PMID:11903759
Why leaders don't learn from success.
Gino, Francesca; Pisano, Gary P
2011-04-01
What causes so many companies that once dominated their industries to slide into decline? In this article, two Harvard Business School professors argue that such firms lose their touch because success breeds failure by impeding learning at both the individual and organizational levels. When we succeed, we assume that we know what we are doing, but it could be that we just got lucky. We make what psychologists call fundamental attribution errors, giving too much credit to our talents and strategy and too little to environmental factors and random events. We develop an overconfidence bias, becoming so self-assured that we think we don't need to change anything. We also experience the failure-to-ask-why syndrome and neglect to investigate the causes of good performance. To overcome these three learning impediments, executives should examine successes with the same scrutiny they apply to failures. Companies should implement systematic after-action reviews to understand all the factors that led to a win, and test their theories by conducting experiments even if "it ain't broke."
Crustal deformation at the terminal stage before earthquake occurrence
NASA Astrophysics Data System (ADS)
Chen, C. H.; Meng, G.; Su, X.
2016-12-01
GPS data retrieved from 300 stations in China are used in this work to study stressed areas during earthquake preparation periods. Surface deformation data are derived using the standard method and smoothed with a temporal moving average to mitigate the influence of noise. A statistical method is used to distinguish significant variations in the smoothed data. The spatial distributions of these significant variations show that the diameter of a stressed area preparing an earthquake is about 3500 km for an M6 event. The deformation deduced from the significant variations is strongly related to the slip direction of the fault plane determined from the focal mechanism solutions of earthquakes. Although the causal mechanism of such large, rapidly changing stressed areas is not fully understood, the analytical results suggest that earthquake preparation may be one of the factors dominating the common mode error in GPS studies. Mechanisms and/or numerical models of some pre-earthquake anomalous phenomena may need to be reconsidered in light of this novel observation.
NASA Astrophysics Data System (ADS)
Watanabe, Kenichi; Minniti, Triestino; Kockelmann, Winfried; Dalgliesh, Robert; Burca, Genoveva; Tremsin, Anton S.
2017-07-01
The uncertainties and stability of a neutron-sensitive MCP/Timepix detector operating in event-timing mode for quantitative image analysis at a pulsed neutron source were investigated. The dominant contribution to the uncertainty arises from counting statistics. The contribution of the overlap correction to the uncertainty was concluded to be negligible, based on error propagation, even when the pixel occupation probability exceeds 50%. We additionally took the multiple-counting effect into account when evaluating the counting statistics. Furthermore, the detection efficiency of this detector system changes under relatively high neutron fluxes due to ageing of the microchannel plates. Since this efficiency change is position-dependent, it induces a memory image. The memory effect can be significantly reduced with correction procedures using rate equations describing the permanent gain degradation and the scrubbing effect on the inner surfaces of the MCP pores.
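A standard occupancy ("overlap") correction for counting detectors of this kind assumes at most one recorded event per pixel per pulse, so the measured per-pulse probability p relates to the true mean event rate λ as p = 1 - e^(-λ). The sketch below applies that correction and propagates a binomial counting uncertainty through it; this is a textbook-style reconstruction under those assumptions, not necessarily the exact procedure of the paper.

```python
import numpy as np

def overlap_corrected_rate(counts, triggers):
    """Occupancy correction lam = -ln(1 - p) with binomial error propagation."""
    p = counts / triggers
    lam = -np.log1p(-p)                          # true mean events per pulse
    sigma_p = np.sqrt(p * (1 - p) / triggers)    # binomial uncertainty on p
    sigma_lam = sigma_p / (1 - p)                # |d lam / d p| = 1 / (1 - p)
    return lam, sigma_lam

n_trig = 100000
for p in (0.1, 0.5, 0.7):
    lam, s = overlap_corrected_rate(p * n_trig, n_trig)
    raw_rel = np.sqrt((1 - p) / (p * n_trig))    # raw counting relative error
    print(f"p={p:.1f}: lam={lam:.3f}, rel err {s / lam:.2e} (raw {raw_rel:.2e})")
```

Even at 50% occupancy the correction inflates the relative error only modestly over raw counting statistics, consistent with the abstract's conclusion that its contribution is negligible.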
Tilt error in cryospheric surface radiation measurements at high latitudes: a model study
NASA Astrophysics Data System (ADS)
Bogren, Wiley Steven; Faulkner Burkhart, John; Kylling, Arve
2016-03-01
We have evaluated the magnitude and makeup of error in cryospheric radiation observations due to small sensor misalignment in in situ measurements of solar irradiance. This error is examined through simulation of diffuse and direct irradiance arriving at a detector with a cosine-response fore optic. Emphasis is placed on assessing total error over the solar shortwave spectrum from 250 to 4500 nm, as well as supporting investigation over other relevant shortwave spectral ranges. The total measurement error introduced by sensor tilt is dominated by the direct component. For a typical high-latitude albedo measurement with a solar zenith angle of 60°, a sensor tilted by 1, 3, and 5° can, respectively, introduce up to 2.7, 8.1, and 13.5% error into the measured irradiance, and similar errors in the derived albedo. Depending on the daily range of solar azimuth and zenith angles, significant measurement error can persist also in integrated daily irradiance and albedo. Simulations including a cloud layer demonstrate decreasing tilt error with increasing cloud optical depth.
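The dominance of the direct component makes the worst case easy to check by hand: for a sensor tilted toward the sun, the direct-beam error factor is cos(θ_sz − β)/cos(θ_sz). The sketch below evaluates the general tilt geometry for the abstract's 60° zenith angle; because it neglects the diffuse contribution, it slightly overestimates the published totals (about 3.0/8.9/14.7% versus 2.7/8.1/13.5%).

```python
import numpy as np

def direct_tilt_error(sza_deg, tilt_deg, dphi_deg=0.0):
    """Relative error (%) in measured direct irradiance for a tilted sensor.

    dphi_deg is the azimuth offset between the tilt direction and the sun;
    dphi_deg = 0 is the sun-facing (worst-case) orientation.
    """
    sza, tilt, dphi = np.radians([sza_deg, tilt_deg, dphi_deg])
    cos_inc = (np.cos(sza) * np.cos(tilt)
               + np.sin(sza) * np.sin(tilt) * np.cos(dphi))
    return 100 * (cos_inc / np.cos(sza) - 1)

for tilt in (1, 3, 5):
    print(f"tilt {tilt} deg: {direct_tilt_error(60, tilt):+.1f}% "
          "(direct beam only, sun-facing worst case)")
```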
NASA Astrophysics Data System (ADS)
Langford, B.; Acton, W.; Ammann, C.; Valach, A.; Nemitz, E.
2015-10-01
All eddy-covariance flux measurements are associated with random uncertainties which are a combination of sampling error due to natural variability in turbulence and sensor noise. The former is the principal error for systems where the signal-to-noise ratio of the analyser is high, as is usually the case when measuring fluxes of heat, CO2 or H2O. Where signal is limited, which is often the case for measurements of other trace gases and aerosols, instrument uncertainties dominate. Here, we are applying a consistent approach based on auto- and cross-covariance functions to quantify the total random flux error and the random error due to instrument noise separately. As with previous approaches, the random error quantification assumes that the time lag between wind and concentration measurement is known. However, if combined with commonly used automated methods that identify the individual time lag by looking for the maximum in the cross-covariance function of the two entities, analyser noise additionally leads to a systematic bias in the fluxes. Combining data sets from several analysers and using simulations, we show that the method of time-lag determination becomes increasingly important as the magnitude of the instrument error approaches that of the sampling error. The flux bias can be particularly significant for disjunct data, whereas using a prescribed time lag eliminates these effects (provided the time lag does not fluctuate unduly over time). We also demonstrate that when sampling at higher elevations, where low frequency turbulence dominates and covariance peaks are broader, both the probability and magnitude of bias are magnified. We show that the statistical significance of noisy flux data can be increased (limit of detection can be decreased) by appropriate averaging of individual fluxes, but only if systematic biases are avoided by using a prescribed time lag. Finally, we make recommendations for the analysis and reporting of data with low signal-to-noise and their associated errors.
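The auto- and cross-covariance machinery described above can be sketched compactly. In the code below, the total random flux error is taken as the rms of the cross-covariance function far from the flux peak, and the instrument white-noise variance follows a Lenschow-style extrapolation of the autocovariance to lag zero; the window choices and fitting ranges are assumptions for illustration, not the authors' exact settings.

```python
import numpy as np

def detrend(a):
    return np.asarray(a, dtype=float) - np.mean(a)

def random_flux_error(w, c, dt, lag_s=0.0, tail_s=(60.0, 180.0)):
    """Total random flux error: rms of the cross-covariance tails.

    w: vertical wind (m/s); c: scalar concentration; dt: sample interval (s);
    lag_s: prescribed wind-concentration time lag (s).
    """
    w, c = detrend(w), detrend(c)
    n = w.size
    lags = np.arange(-n + 1, n) * dt
    xcov = np.correlate(c, w, mode="full") / n
    tail = (np.abs(lags - lag_s) > tail_s[0]) & (np.abs(lags - lag_s) < tail_s[1])
    return np.sqrt(np.mean(xcov[tail] ** 2))

def instrument_noise_variance(c, fit_lags=(1, 5)):
    """White-noise variance: autocovariance at lag 0 minus its extrapolation."""
    c = detrend(c)
    n = c.size
    acov = np.array([np.mean(c[:n - k] * c[k:]) for k in range(fit_lags[1] + 1)])
    k = np.arange(fit_lags[0], fit_lags[1] + 1)
    coef = np.polyfit(k, acov[fit_lags[0]:], 1)      # linear fit to lags 1..5
    return acov[0] - np.polyval(coef, 0.0)
```

Consistent with the abstract's recommendation, the time lag `lag_s` is prescribed rather than found by maximizing the noisy cross-covariance, which is what avoids the systematic flux bias the authors describe.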
NASA Astrophysics Data System (ADS)
Langford, B.; Acton, W.; Ammann, C.; Valach, A.; Nemitz, E.
2015-03-01
All eddy-covariance flux measurements are associated with random uncertainties which are a combination of sampling error due to natural variability in turbulence and sensor noise. The former is the principal error for systems where the signal-to-noise ratio of the analyser is high, as is usually the case when measuring fluxes of heat, CO2 or H2O. Where signal is limited, which is often the case for measurements of other trace gases and aerosols, instrument uncertainties dominate. Here we apply a consistent approach based on auto- and cross-covariance functions to quantify the total random flux error and the random error due to instrument noise separately. As with previous approaches, the random error quantification assumes that the time-lag between wind and concentration measurement is known. However, if combined with commonly used automated methods that identify the individual time-lag by looking for the maximum in the cross-covariance function of the two entities, analyser noise additionally leads to a systematic bias in the fluxes. Combining datasets from several analysers and using simulations, we show that the method of time-lag determination becomes increasingly important as the magnitude of the instrument error approaches that of the sampling error. The flux bias can be particularly significant for disjunct data, whereas using a prescribed time-lag eliminates these effects (provided the time-lag does not fluctuate unduly over time). We also demonstrate that when sampling at higher elevations, where low frequency turbulence dominates and covariance peaks are broader, both the probability and magnitude of bias are magnified. We show that the statistical significance of noisy flux data can be increased (limit of detection can be decreased) by appropriate averaging of individual fluxes, but only if systematic biases are avoided by using a prescribed time-lag. Finally, we make recommendations for the analysis and reporting of data with low signal-to-noise and their associated errors.
Orbit determination strategy and results for the Pioneer 10 Jupiter mission
NASA Technical Reports Server (NTRS)
Wong, S. K.; Lubeley, A. J.
1974-01-01
Pioneer 10 is the first Earth-based vehicle to encounter Jupiter and to be occulted by its moon Io. In contributing to the success of the mission, the Orbit Determination Group evaluated the effects of the dominant error sources on the spacecraft's computed orbit and devised an encounter strategy minimizing the effects of these error sources. The encounter results indicated that: (1) errors in the satellite model played a very important role in the accuracy of the computed orbit, (2) the encounter strategy was sound, (3) all mission objectives were met, and (4) the Jupiter-Saturn mission for Pioneer 11 is within the navigation capability.
S-193 scatterometer transfer function analysis for data processing
NASA Technical Reports Server (NTRS)
Johnson, L.
1974-01-01
A mathematical model for converting raw data measurements of the S-193 scatterometer into processed values of the radar scattering coefficient is presented. The argument is based on an approximation derived from the radar equation and the actual operating principles of the S-193 scatterometer hardware. Possible error sources are inaccuracies in the transmitted wavelength, range, antenna illumination integrals, and the instrument itself. The dominant source of error in the calculation of the scattering coefficient is the accuracy of the range. All other factors, with the possible exception of the illumination integral, are not considered to cause significant error in the calculation of the scattering coefficient.
Ille, Sebastian; Kulchytska, Nataliia; Sollmann, Nico; Wittig, Regina; Beurskens, Eva; Butenschoen, Vicki M; Ringel, Florian; Vajkoczy, Peter; Meyer, Bernhard; Picht, Thomas; Krieg, Sandro M
2016-10-01
The resection of left-sided perisylvian brain lesions carries the risk of postoperative aphasia. Because language function can shift between hemispheres in brain tumor patients, preoperative knowledge of the patient's language dominance could be helpful. We therefore investigated hemispheric language dominance by repetitive navigated transcranial magnetic stimulation (rTMS) and surgery-related deficits of language function. We pooled the bicentric language mapping data of 80 patients undergoing the resection of left-sided perisylvian brain lesions in our two university neurosurgical departments. We calculated error rates (ERs; ER = errors per stimulation) for both hemispheres and defined the hemispheric dominance ratio (HDR) as the quotient of the left- and right-sided ERs (HDR > 1 = left-dominant; HDR < 1 = right-dominant). The course of the patients' language function was evaluated and correlated with the preoperative HDR. Only three of 80 patients (4%) presented with permanent surgery-related aphasia, and 24 patients (30%) with transient surgery-related aphasia. The mean HDR (± standard deviation) of patients with new aphasia after five days was significantly higher (1.68±1.07) than the HDR of patients with no new language deficit (1.37±1.08) (p=0.0482). With a predefined cut-off value of 0.5 for HDR, we achieved a sensitivity for predicting new aphasia of 100%. A higher preoperative HDR significantly correlates with an increased risk of transient aphasia. Moreover, the intensive preoperative workup in this study led to a considerably low rate of permanent aphasia. Copyright © 2016 Elsevier Ltd. All rights reserved.
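The HDR arithmetic above is simple enough to encode directly. The toy counts in the sketch below are invented, and the flagging direction (treating HDR at or above the 0.5 cut-off as the sensitive screen) is our reading of the abstract.

```python
def error_rate(errors, stimulations):
    return errors / stimulations

def hemispheric_dominance_ratio(err_l, stim_l, err_r, stim_r):
    """HDR > 1: left-dominant; HDR < 1: right-dominant (per the study)."""
    return error_rate(err_l, stim_l) / error_rate(err_r, stim_r)

# Invented rTMS mapping counts for one patient
hdr = hemispheric_dominance_ratio(err_l=24, stim_l=120, err_r=12, stim_r=110)
flagged = hdr >= 0.5          # cut-off from the study (100% sensitivity)
print(f"HDR = {hdr:.2f}, flagged as at risk: {flagged}")
```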
NASA Astrophysics Data System (ADS)
Cheng, Tianhai; Gu, Xingfa; Wu, Yu; Chen, Hao; Yu, Tao
2013-08-01
Replacing absorbing fine-sized dominated aerosols with spherical aerosol models can potentially result in significant errors in climate models and aerosol remote sensing retrievals. In this paper, the optical properties of absorbing fine-sized dominated aerosols were modeled, taking into account freshly emitted soot particles (agglomerates of primary spherules), aged soot particles (semi-externally mixed with other weakly absorbing aerosols), and coarse aerosol particles (dust particles). The optical properties of the individual fresh and aged soot aggregates are calculated using the superposition T-matrix method. In order to quantify the morphology effect of absorbing aerosol models on aerosol remote sensing retrieval, the ensemble-averaged optical properties of absorbing fine-sized dominated aerosols are calculated based on the size distributions of fine aerosols (fresh and aged soot) and coarse aerosols. The corresponding optical properties of spherical absorbing aerosol models computed with Lorenz-Mie solutions are presented for comparison. The comparison demonstrates that spherical absorbing aerosol models underestimate the absorption of fine-sized dominated aerosol particles. The morphology effect of absorbing fine-sized dominated aerosols on the TOA radiances and polarized radiances is also investigated. It is found that spherical aerosol models overestimate the TOA reflectance and polarized reflectance by approximately a factor of 3 at a wavelength of 0.865 μm. In other words, analyses of satellite reflectance measurements of fine-sized dominated aerosols that use the conventional Mie theory for spherical particles can produce large errors in the retrieved aerosol properties.
Lessons from aviation - the role of checklists in minimally invasive cardiac surgery.
Hussain, S; Adams, C; Cleland, A; Jones, P M; Walsh, G; Kiaii, B
2016-01-01
We describe an adverse event during minimally invasive cardiac surgery that resulted in a multi-disciplinary review of intra-operative errors and the creation of a procedural checklist. This checklist aims to prevent errors of omission and communication failures that result in increased morbidity and mortality. We discuss the application of the aviation - led "threats and errors model" to medical practice and the role of checklists and other strategies aimed at reducing medical errors. © The Author(s) 2015.
Statistical error in simulations of Poisson processes: Example of diffusion in solids
NASA Astrophysics Data System (ADS)
Nilsson, Johan O.; Leetmaa, Mikael; Vekilova, Olga Yu.; Simak, Sergei I.; Skorodumova, Natalia V.
2016-08-01
Simulations of diffusion in solids often produce poor statistics of diffusion events. We present an analytical expression for the statistical error in the ion conductivity obtained in such simulations. The error expression is not restricted to any particular computational method, but is valid for simulations of Poisson processes in general. The analytical error expression is verified numerically for the case of Gd-doped ceria by running a large number of kinetic Monte Carlo calculations.
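The abstract does not reproduce the error expression itself, but the generic Poisson result behind such analyses is that a rate estimated from N counted events carries a relative standard error of about 1/sqrt(N). A small numerical sketch (hypothetical rate and observation time, not the paper's derivation) illustrates this:

```python
# Sketch: for a Poisson process observed for time T, the rate estimate
# lambda_hat = N/T has relative standard error ~ 1/sqrt(N), so quantities
# built from counted diffusion events inherit this error bar.
import numpy as np

rng = np.random.default_rng(0)
true_rate, T, trials = 2.5, 100.0, 10_000   # hypothetical event rate and observation time

counts = rng.poisson(true_rate * T, size=trials)
rate_estimates = counts / T

empirical_rel_err = rate_estimates.std() / rate_estimates.mean()
predicted_rel_err = 1.0 / np.sqrt(true_rate * T)   # 1/sqrt(expected event count)
print(f"empirical {empirical_rel_err:.4f} vs predicted {predicted_rel_err:.4f}")
```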
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gopan, O; Novak, A; Zeng, J
Purpose: Physics pre-treatment plan review is crucial to safe radiation oncology treatments. Studies show that most errors originate in treatment planning, which underscores the importance of physics plan review. As a QA measure the physics review is of fundamental importance and is central to the profession of medical physics. However, little is known about its effectiveness, and more hard data are needed. The purpose of this study was to quantify the effectiveness of physics review with the goal of improving it. Methods: This study analyzed 315 “potentially serious” near-miss incidents within an institutional incident learning system collected over a two-year period. 139 of these originated prior to physics review and were found at the review or after. Incidents were classified as events that: 1) were detected by physics review, 2) could have been detected (but were not), and 3) could not have been detected. Category 1 and 2 events were classified by which specific check (within physics review) detected or could have detected the event. Results: Of the 139 analyzed events, 73/139 (53%) were detected or could have been detected by the physics review, although 42/73 (58%) were not actually detected. 45/73 (62%) of these errors originated in treatment planning, making physics review the first step in the workflow that could detect the error. Two specific physics checks were particularly effective (combined effectiveness of >20%): verifying DRRs (8/73) and verifying isocenter (7/73). Software-based plan checking systems were evaluated and found to have a potential effectiveness of 40%. Given current data structures, software implementations of some tests, such as the isocenter verification check, would be challenging. Conclusion: Physics plan review is a key safety measure and can detect the majority of reported events. However, a majority of the events that potentially could have been detected were NOT detected in this study, indicating the need to improve the performance of physics review.
Fringe-period selection for a multifrequency fringe-projection phase unwrapping method
NASA Astrophysics Data System (ADS)
Zhang, Chunwei; Zhao, Hong; Jiang, Kejian
2016-08-01
The multifrequency fringe-projection phase unwrapping method (MFPPUM) is a typical phase unwrapping algorithm for fringe projection profilometry. It has the advantage of correctly accomplishing phase unwrapping even in the presence of surface discontinuities. If the fringe frequency ratio of the MFPPUM is too large, fringe order error (FOE) may be triggered, and FOE results in phase unwrapping error. It is preferable for the phase unwrapping to remain correct while the fewest sets of lower-frequency fringe patterns are used. To achieve this goal, in this paper a parameter called fringe order inaccuracy (FOI) is defined, the dominant factors which may induce FOE are theoretically analyzed, a method to optimally select the fringe periods for the MFPPUM is proposed with the aid of FOI, and experiments are conducted to investigate the impact of the dominant factors on phase unwrapping and to demonstrate the validity of the proposed method. Some novel phenomena are revealed by these experiments. The proposed method helps to optimally select the fringe periods and to detect phase unwrapping error for the MFPPUM.
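For readers unfamiliar with how a fringe order error arises, the following hedged sketch implements a generic two-frequency temporal phase unwrapping step (not necessarily the paper's exact MFPPUM formulation) and shows how a large frequency ratio amplifies coarse-phase noise into FOEs:

```python
import numpy as np

# Hedged sketch of two-frequency temporal phase unwrapping (a generic scheme
# used to illustrate fringe order errors, not the paper's exact method).
# phi_hi is the wrapped high-frequency phase, Phi_lo the coarse (low-frequency)
# phase, and r = f_hi / f_lo the fringe frequency ratio.
def unwrap_two_freq(phi_hi_wrapped, Phi_lo, r):
    k = np.round((r * Phi_lo - phi_hi_wrapped) / (2 * np.pi))  # fringe order
    return phi_hi_wrapped + 2 * np.pi * k

rng = np.random.default_rng(1)
Phi_true = np.linspace(0, 40 * np.pi, 500)            # hypothetical true phase
r = 8.0                                               # large ratio -> FOE-prone
phi_hi = np.angle(np.exp(1j * Phi_true))              # wrap to (-pi, pi]
Phi_lo = Phi_true / r + rng.normal(0, 0.2, 500)       # noisy coarse phase

Phi_rec = unwrap_two_freq(phi_hi, Phi_lo, r)
foe = np.count_nonzero(np.abs(Phi_rec - Phi_true) > np.pi)
# The rounding argument is perturbed by r*noise/(2*pi), so a larger frequency
# ratio amplifies coarse-phase noise and triggers fringe order errors (FOEs).
print(f"fringe order errors: {foe} of 500 samples")
```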
NASA Astrophysics Data System (ADS)
Wilby, M. J.; Keller, C. U.; Haffert, S.; Korkiakoski, V.; Snik, F.; Pietrow, A. G. M.
2016-07-01
Non-Common Path Errors (NCPEs) are the dominant factor limiting the performance of current astronomical high-contrast imaging instruments. If uncorrected, the resulting quasi-static speckle noise floor limits coronagraph performance to a raw contrast of typically 10^-4, a value which does not improve with increasing integration time. The coronagraphic Modal Wavefront Sensor (cMWS) is a hybrid phase optic which uses holographic PSF copies to supply focal-plane wavefront sensing information directly from the science camera, whilst maintaining a bias-free coronagraphic PSF. This concept has already been successfully implemented on-sky at the William Herschel Telescope (WHT), La Palma, demonstrating both real-time wavefront sensing capability and successful extraction of slowly varying wavefront errors under a dominant and rapidly changing atmospheric speckle foreground. In this work we present an overview of the development of the cMWS and recent first light results obtained using the Leiden EXoplanet Instrument (LEXI), a high-contrast imager and high-dispersion spectrograph pathfinder instrument for the WHT.
Total Dose Effects on Error Rates in Linear Bipolar Systems
NASA Technical Reports Server (NTRS)
Buchner, Stephen; McMorrow, Dale; Bernard, Muriel; Roche, Nicholas; Dusseau, Laurent
2007-01-01
The shapes of single event transients in linear bipolar circuits are distorted by exposure to total ionizing dose radiation. Some transients become broader and others become narrower. Such distortions may affect SET system error rates in a radiation environment. If the transients are broadened by TID, the error rate could increase during the course of a mission, a possibility that has implications for hardness assurance.
Self-Interaction Error in Density Functional Theory: An Appraisal.
Bao, Junwei Lucas; Gagliardi, Laura; Truhlar, Donald G
2018-05-03
Self-interaction error (SIE) is considered to be one of the major sources of error in most approximate exchange-correlation functionals for Kohn-Sham density-functional theory (KS-DFT), and it is large with all local exchange-correlation functionals and with some hybrid functionals. In this work, we consider systems conventionally considered to be dominated by SIE. For these systems, we demonstrate that by using multiconfiguration pair-density functional theory (MC-PDFT), the error of a translated local density-functional approximation is significantly reduced (by a factor of 3) when using an MCSCF density and on-top density, as compared to using KS-DFT with the parent functional; the error in MC-PDFT with local on-top functionals is even lower than the error in some popular KS-DFT hybrid functionals. Density-functional theory, either in MC-PDFT form with local on-top functionals or in KS-DFT form with some functionals having 50% or more nonlocal exchange, has smaller errors for SIE-prone systems than does CASSCF, which has no SIE.
10 CFR 50.73 - Licensee event report system.
Code of Federal Regulations, 2010 CFR
2010-01-01
... plant design; or (2) Normal and expected wear or degradation. (x) Any event that posed an actual threat... discovery of each component or system failure or procedural error. (J) For each human performance related...
Verifying Parentage and Confirming Identity in Blackberry with a Fingerprinting Set
USDA-ARS?s Scientific Manuscript database
Parentage and identity confirmation is an important aspect of clonally propagated outcrossing crops. Potential errors resulting in misidentification include off-type pollination events, labeling errors, or sports of clones. DNA fingerprinting sets are an excellent solution to quickly identify off-type ...
Event-triggered attitude control of spacecraft
NASA Astrophysics Data System (ADS)
Wu, Baolin; Shen, Qiang; Cao, Xibin
2018-02-01
The problem of spacecraft attitude stabilization under limited communication and external disturbances is investigated using an event-triggered control scheme. In the proposed scheme, attitude and control torque information need only be transmitted at discrete triggering times, when a defined measurement error exceeds a state-dependent threshold. The proposed control scheme not only guarantees that spacecraft attitude control errors converge to a small invariant set containing the origin, but also ensures that there is no accumulation of triggering instants. The performance of the proposed control scheme is demonstrated through numerical simulation.
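The triggering rule described above is easy to illustrate on a scalar toy system. The sketch below (a generic example with hypothetical gains, not the paper's attitude dynamics) transmits the state only when the measurement error exceeds a state-dependent threshold; the constant offset in the threshold is what rules out accumulation of triggering instants:

```python
import numpy as np

# Minimal sketch of event-triggered feedback on a scalar plant x' = a*x + u.
# The actuator holds the last transmitted state; a new transmission is
# triggered only when the measurement error e = x - x_last exceeds a
# state-dependent threshold sigma*|x| + eps. All constants are hypothetical.
a, k = 1.0, 3.0                  # unstable plant, feedback u = -k*x_last
sigma, eps = 0.2, 1e-3           # trigger: |e| > sigma*|x| + eps
dt, steps = 1e-3, 20_000

x, x_last, triggers = 1.0, 1.0, 0
for _ in range(steps):
    e = x - x_last
    if abs(e) > sigma * abs(x) + eps:   # event condition -> transmit state
        x_last = x
        triggers += 1
    u = -k * x_last                      # control uses last transmitted state
    x += dt * (a * x + u)                # forward-Euler integration

# The positive eps enforces a minimum inter-event time (no Zeno behaviour),
# and the state converges to a small neighbourhood of the origin.
print(f"final |x| = {abs(x):.4f}, transmissions = {triggers} of {steps} steps")
```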
MS-READ: Quantitative measurement of amino acid incorporation.
Mohler, Kyle; Aerni, Hans-Rudolf; Gassaway, Brandon; Ling, Jiqiang; Ibba, Michael; Rinehart, Jesse
2017-11-01
Ribosomal protein synthesis results in the genetically programmed incorporation of amino acids into a growing polypeptide chain. Faithful amino acid incorporation that accurately reflects the genetic code is critical to the structure and function of proteins as well as overall proteome integrity. Errors in protein synthesis are generally detrimental to cellular processes, yet emerging evidence suggests that proteome diversity generated through mistranslation may be beneficial under certain conditions. Cumulative translational error rates have been determined at the organismal level; however, codon-specific error rates and the spectrum of misincorporation errors from system to system remain largely unexplored. In particular, until recently technical challenges have limited the ability to detect and quantify comparatively rare amino acid misincorporation events, which occur orders of magnitude less frequently than canonical amino acid incorporation events. We now describe a technique for the quantitative analysis of amino acid incorporation that provides the sensitivity necessary to detect mistranslation events during translation of a single codon at frequencies as low as 1 in 10,000 for all 20 proteinogenic amino acids, as well as non-proteinogenic and modified amino acids. This article is part of a Special Issue entitled "Biochemistry of Synthetic Biology - Recent Developments". Guest Editors: Dr. Ilka Heinemann and Dr. Patrick O'Donoghue. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Lang, Christapher G.; Bey, Kim S. (Technical Monitor)
2002-01-01
This research investigates residual-based a posteriori error estimates for finite element approximations of heat conduction in single-layer and multi-layered materials. The finite element approximation, based upon hierarchical modelling combined with p-version finite elements, is described with specific application to a two-dimensional, steady state, heat-conduction problem. Element error indicators are determined by solving an element equation for the error with the element residual as a source, and a global error estimate in the energy norm is computed by collecting the element contributions. Numerical results of the performance of the error estimate are presented by comparisons to the actual error. Two methods are discussed and compared for approximating the element boundary flux. The equilibrated flux method provides more accurate results for estimating the error than the average flux method. The error estimation is applied to multi-layered materials with a modification to the equilibrated flux method to approximate the discontinuous flux along a boundary at the material interfaces. A directional error indicator is developed which distinguishes between the hierarchical modeling error and the finite element error. Numerical results are presented for single-layered materials which show that the directional indicators accurately determine which contribution to the total error dominates.
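The collection step described above, from element indicators to a global energy-norm estimate, amounts to a root-sum-square. A minimal sketch with hypothetical indicator values (the element solves themselves are not reproduced here):

```python
import numpy as np

# Sketch of the bookkeeping step in residual-based a posteriori estimation:
# once an error indicator eta_K has been computed on each element from its
# local residual problem, the global energy-norm estimate is the
# root-sum-square of the element contributions.
eta_K = np.array([0.012, 0.034, 0.009, 0.051, 0.007])  # hypothetical element indicators

eta_global = np.sqrt(np.sum(eta_K**2))
worst = np.argmax(eta_K)   # elements like this one would be targeted for p-refinement
print(f"global estimate = {eta_global:.4f}, dominant element = {worst}")
```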
Erosion during extreme flood events dominates Holocene canyon evolution in northeast Iceland.
Baynes, Edwin R C; Attal, Mikaël; Niedermann, Samuel; Kirstein, Linda A; Dugmore, Andrew J; Naylor, Mark
2015-02-24
Extreme flood events have the potential to cause catastrophic landscape change in short periods of time (10^0 to 10^3 h). However, their impacts are rarely considered in studies of long-term landscape evolution (>10^3 y), because the mechanisms of erosion during such floods are poorly constrained. Here we use topographic analysis and cosmogenic ^3He surface exposure dating of fluvially sculpted surfaces to determine the impact of extreme flood events within the Jökulsárgljúfur canyon (northeast Iceland) and to constrain the mechanisms of bedrock erosion during these events. Surface exposure ages allow identification of three periods of intense canyon cutting about 9 ka ago, 5 ka ago, and 2 ka ago during which multiple large knickpoints retreated large distances (>2 km). During these events, a threshold flow depth was exceeded, leading to the toppling and transportation of basalt lava columns. Despite continuing and comparatively large-scale (500 m^3/s) discharge of sediment-rich glacial meltwater, there is no evidence for a transition to an abrasion-dominated erosion regime since the last erosive event because the vertical knickpoints have not diffused over time. We provide a model for the evolution of the Jökulsárgljúfur canyon through the reconstruction of the river profile and canyon morphology at different stages over the last 9 ka and highlight the dominant role played by extreme flood events in the shaping of this landscape during the Holocene.
Positive events protect children from causal false memories for scripted events.
Melinder, Annika; Toffalini, Enrico; Geccherle, Eleonora; Cornoldi, Cesare
2017-11-01
Adults produce fewer inferential false memories for scripted events when their conclusions are emotionally charged than when they are neutral, but it is not clear whether the same effect is also found in children. In the present study, we examined this issue in a sample of 132 children aged 6-12 years (mean 9 years, 3 months). Participants encoded photographs depicting six script-like events that had a positively, negatively, or neutrally valenced ending. Subsequently, true and false recognition memory of photographs related to the observed scripts was tested as a function of emotionality. Causal errors, a type of false memory thought to stem from inferential processes, were found to be affected by valence: children made fewer causal errors for positive than for neutral or negative events. Hypotheses are proposed on why adults, when administered similar versions of the same paradigm, were found to be protected against inferential false memories not only by positive endings (as for children) but also by negative endings.
NASA Astrophysics Data System (ADS)
Rodas, Claudio; Pulido, Manuel
2017-09-01
A climatological characterization of Rossby wave generation events in the middle atmosphere of the Southern Hemisphere is conducted using 20 years of Modern-Era Retrospective Analysis for Research and Applications (MERRA) reanalysis. An automatic detection technique of wave generation events is developed and applied to MERRA reanalysis. The Rossby wave generation events with wave period of 1.25 to 5.5 days and zonal wave number from one to three dominate the Eliassen-Palm flux divergence around the stratopause at high latitudes in the examined 20 year period. These produce an eastward forcing of the general circulation between May and mid-August in that region. Afterward from mid-August to the final warming date, Rossby wave generation events are still present but the Eliassen-Palm flux divergence in the polar stratopause is dominated by low-frequency Rossby waves that propagate from the troposphere. The Rossby wave generation events are associated with potential vorticity gradient inversion, and so they are a manifestation of the dominant barotropic/baroclinic unstable modes that grow at the cost of smearing the negative meridional gradient of potential vorticity. The most likely region of wave generation is found between 60° and 80°S and at a height of 0.7 hPa, but events were detected from 40 hPa to 0.3 hPa (which is the top of the examined region). The mean number of events per year is 24, and its mean duration is 3.35 days. The event duration follows an exponential distribution.
High Reliability Organizations--Medication Safety.
Yip, Luke; Farmer, Brenna
2015-06-01
High reliability organizations (HROs), such as the aviation industry, successfully engage in high-risk endeavors and have low incidence of adverse events. HROs have a preoccupation with failure and errors. They analyze each event to effect system wide change in an attempt to mitigate the occurrence of similar errors. The healthcare industry can adapt HRO practices, specifically with regard to teamwork and communication. Crew resource management concepts can be adapted to healthcare with the use of certain tools such as checklists and the sterile cockpit to reduce medication errors. HROs also use The Swiss Cheese Model to evaluate risk and look for vulnerabilities in multiple protective barriers, instead of focusing on one failure. This model can be used in medication safety to evaluate medication management in addition to using the teamwork and communication tools of HROs.
Awareness of deficits and error processing after traumatic brain injury.
Larson, Michael J; Perlstein, William M
2009-10-28
Severe traumatic brain injury is frequently associated with alterations in performance monitoring, including reduced awareness of physical and cognitive deficits. We examined the relationship between awareness of deficits and electrophysiological indices of performance monitoring, including the error-related negativity and posterror positivity (Pe) components of the scalp-recorded event-related potential, in 16 traumatic brain injury survivors who completed a Stroop color-naming task while event-related potential measurements were recorded. Awareness of deficits was measured as the discrepancy between patient and significant-other ratings on the Frontal Systems Behavior Scale. The amplitude of the Pe, but not error-related negativity, was reliably associated with decreased awareness of deficits. Results indicate that Pe amplitude may serve as an electrophysiological indicator of awareness of abilities and deficits.
A stochastic dynamic model for human error analysis in nuclear power plants
NASA Astrophysics Data System (ADS)
Delgado-Loperena, Dharma
Nuclear disasters like Three Mile Island and Chernobyl indicate that human performance is a critical safety issue, sending a clear message about the need to include environmental press and competence aspects in research. This investigation was undertaken to serve as a roadmap for studying human behavior through the formulation of a general solution equation. The theoretical model integrates models from two heretofore-disassociated disciplines (behavior specialists and technical specialists) that have historically studied the nature of error and human behavior independently; it incorporates concepts derived from fractal and chaos theory and suggests re-evaluation of base theory regarding human error. The results of this research were based on a comprehensive analysis of patterns of error, with the omnipresent underlying structure of chaotic systems. The study of patterns led to a dynamic formulation that can serve as a basis for other formulas used to study the consequences of human error. The literature search regarding error yielded insight into the need to include concepts rooted in chaos theory and strange attractors, heretofore unconsidered by mainstream researchers who investigated human error in nuclear power plants or those who employed the ecological model in their work. The study of patterns obtained from the simulation of a steam generator tube rupture (SGTR) event provided a direct application to aspects of control room operations in nuclear power plants. In doing so, a conceptual foundation based on the understanding of the patterns of human error can be gleaned, making it possible to reduce and prevent undesirable events.
Disclosing harmful medical errors to patients: tackling three tough cases.
Gallagher, Thomas H; Bell, Sigall K; Smith, Kelly M; Mello, Michelle M; McDonald, Timothy B
2009-09-01
A gap exists between recommendations to disclose errors to patients and current practice. This gap may reflect important, yet unanswered questions about implementing disclosure principles. We explore some of these unanswered questions by presenting three real cases that pose challenging disclosure dilemmas. The first case involves a pancreas transplant that failed due to the pancreas graft being discarded, an error that was not disclosed partly because the family did not ask clarifying questions. Relying on patient or family questions to determine the content of disclosure is problematic. We propose a standard of materiality that can help clinicians to decide what information to disclose. The second case involves a fatal diagnostic error that the patient's widower was unaware had happened. The error was not disclosed out of concern that disclosure would cause the widower more harm than good. This case highlights how institutions can overlook patients' and families' needs following errors and emphasizes that benevolent deception has little role in disclosure. Institutions should consider whether involving neutral third parties could make disclosures more patient centered. The third case presents an intraoperative cardiac arrest due to a large air embolism where uncertainty around the clinical event was high and complicated the disclosure. Uncertainty is common to many medical errors but should not deter open conversations with patients and families about what is and is not known about the event. Continued discussion within the medical profession about applying disclosure principles to real-world cases can help to better meet patients' and families' needs following medical errors.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vasylkivska, Veronika S.; Huerta, Nicolas J.
Determining the spatiotemporal characteristics of natural and induced seismic events holds the opportunity to gain new insights into why these events occur. Linking the seismicity characteristics with other geologic, geographic, natural, or anthropogenic factors could help to identify the causes and suggest mitigation strategies that reduce the risk associated with such events. The nearest-neighbor approach utilized in this work represents a practical first step toward identifying statistically correlated clusters of recorded earthquake events. A detailed study of the Oklahoma earthquake catalog’s inherent errors, empirical model parameters, and model assumptions is presented. We found that the cluster analysis results are stable with respect to empirical parameters (e.g., fractal dimension) but were sensitive to epicenter location errors and seismicity rates. Most critically, we show that the patterns in the distribution of earthquake clusters in Oklahoma are primarily defined by spatial relationships between events. This observation is a stark contrast to California (also known for induced seismicity), where a comparable cluster distribution is defined by both spatial and temporal interactions between events. These results highlight the difficulty in understanding the mechanisms and behavior of induced seismicity but provide insights for future work.
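The abstract does not spell out its distance metric, but nearest-neighbor earthquake cluster studies of this kind commonly use a Zaliapin-style space-time-magnitude distance. The sketch below (synthetic catalog, assumed metric, hypothetical parameters) links each event to its most likely parent:

```python
import numpy as np

# Hedged sketch of a nearest-neighbor event-linkage metric of the kind used in
# such cluster studies (a common Zaliapin-style formulation, not necessarily
# this paper's exact choice): for an earlier event i and a later event j,
# eta_ij = t_ij * r_ij**df * 10**(-b * m_i).
rng = np.random.default_rng(2)
n = 200
t = np.sort(rng.uniform(0, 365, n))        # event times (days), synthetic catalog
xy = rng.uniform(0, 100, (n, 2))           # epicenters (km)
mag = rng.exponential(0.5, n) + 2.0        # magnitudes
b, df = 1.0, 1.6                           # b-value and fractal dimension (empirical)

parent = np.full(n, -1)
for j in range(1, n):
    dt = t[j] - t[:j]                                   # inter-event times (>= 0)
    r = np.linalg.norm(xy[:j] - xy[j], axis=1) + 1e-6   # epicentral distances
    eta = dt * r**df * 10.0 ** (-b * mag[:j])
    parent[j] = np.argmin(eta)             # nearest neighbor = most likely parent

print("example parent links:", parent[1:8])
```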
A decade of Australian methotrexate dosing errors.
Cairns, Rose; Brown, Jared A; Lynch, Ann-Maree; Robinson, Jeff; Wylie, Carol; Buckley, Nicholas A
2016-06-06
Accidental daily dosing of methotrexate can result in life-threatening toxicity. We investigated methotrexate dosing errors reported to the National Coronial Information System (NCIS), the Therapeutic Goods Administration Database of Adverse Event Notifications (TGA DAEN) and Australian Poisons Information Centres (PICs). We conducted a retrospective review of coronial cases in the NCIS (2000-2014) and of reports to the TGA DAEN (2004-2014) and Australian PICs (2004-2015). Cases were included if dosing errors were accidental, with evidence of daily dosing on at least 3 consecutive days. Outcome measures were events per year, dose, consecutive days of methotrexate administration, reasons for the error, and clinical features. Twenty-two deaths linked with methotrexate were identified in the NCIS, including seven cases in which erroneous daily dosing was documented. Methotrexate medication error was listed in ten cases in the DAEN, including two deaths. Australian PIC databases contained 92 cases, with a worrying increase seen during 2014-2015. Reasons for the errors included patient misunderstanding and incorrect packaging of dosette packs by pharmacists. The recorded clinical effects of daily dosing were consistent with those previously reported for methotrexate toxicity. Dosing errors with methotrexate can be lethal and continue to occur despite a number of safety initiatives in the past decade. Further strategies to reduce these preventable harms need to be implemented and evaluated. Recent suggestions include further changes in packet size, mandatory weekly dosing labelling on packaging, improving education, and including alerts in prescribing and dispensing software.
Calculation of cosmic ray induced single event upsets: Program CRUP (Cosmic Ray Upset Program)
NASA Astrophysics Data System (ADS)
Shapiro, P.
1983-09-01
This report documents PROGRAM CRUP, COSMIC RAY UPSET PROGRAM. The computer program calculates cosmic ray induced single-event error rates in microelectronic circuits exposed to several representative cosmic-ray environments.
Real-time monitoring of clinical processes using complex event processing and transition systems.
Meinecke, Sebastian
2014-01-01
Dependencies between tasks in clinical processes are often complex and error-prone. Our aim is to describe a new approach for the automatic derivation of clinical events identified via the behaviour of IT systems using Complex Event Processing. Furthermore we map these events on transition systems to monitor crucial clinical processes in real-time for preventing and detecting erroneous situations.
Pilot error in air carrier accidents: does age matter?
Li, Guohua; Grabowski, Jurek G; Baker, Susan P; Rebok, George W
2006-07-01
The relationship between pilot age and safety performance has been the subject of research and controversy since the "Age 60 Rule" became effective in 1960. This study aimed to examine age-related differences in the prevalence and patterns of pilot error in air carrier accidents. Investigation reports from the National Transportation Safety Board for accidents involving Part 121 operations in the United States between 1983 and 2002 were reviewed to identify pilot error and other contributing factors. Accident circumstances and the presence and type of pilot error were analyzed in relation to pilot age using Chi-square tests. Of the 558 air carrier accidents studied, 25% resulted from turbulence, 21% from mechanical failure, 16% from taxiing events, 13% from loss of control at landing or takeoff, and 25% from other causes. Accidents involving older pilots were more likely to be caused by turbulence, whereas accidents involving younger pilots were more likely to be taxiing events. Pilot error was a contributing factor in 34%, 38%, 35%, and 34% of the accidents involving pilots ages 25-34 yr, 35-44 yr, 45-54 yr, and 55-59 yr, respectively (p = 0.87). The patterns of pilot error were similar across age groups. Overall, 26% of the pilot errors identified were inattentiveness, 22% flawed decisions, 22% mishandled aircraft kinetics, and 11% poor crew interactions. The prevalence and patterns of pilot error in air carrier accidents do not seem to change with pilot age. The lack of association between pilot age and error may be due to the "safe worker effect" resulting from the rigorous selection processes and certification standards for professional pilots.
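As a hedged illustration of the age-group comparison reported above (hypothetical counts, not the study's data), a chi-square test of independence on an error-by-age contingency table can be run with SciPy:

```python
from scipy.stats import chi2_contingency

# Sketch of the kind of test reported above, with hypothetical counts:
# rows are accidents with / without pilot error, columns are pilot age groups.
#                  25-34  35-44  45-54  55-59
error_counts    = [  24,    52,    38,    14]
no_error_counts = [  46,    85,    70,    27]

chi2, p, dof, expected = chi2_contingency([error_counts, no_error_counts])
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.2f}")  # large p -> no age effect
```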
Data-driven models of dominantly-inherited Alzheimer's disease progression.
Oxtoby, Neil P; Young, Alexandra L; Cash, David M; Benzinger, Tammie L S; Fagan, Anne M; Morris, John C; Bateman, Randall J; Fox, Nick C; Schott, Jonathan M; Alexander, Daniel C
2018-05-01
See Li and Donohue (doi:10.1093/brain/awy089) for a scientific commentary on this article. Dominantly-inherited Alzheimer's disease is widely hoped to hold the key to developing interventions for sporadic late onset Alzheimer's disease. We use emerging techniques in generative data-driven disease progression modelling to characterize dominantly-inherited Alzheimer's disease progression with unprecedented resolution, and without relying upon familial estimates of years until symptom onset. We retrospectively analysed biomarker data from the sixth data freeze of the Dominantly Inherited Alzheimer Network observational study, including measures of amyloid proteins and neurofibrillary tangles in the brain, regional brain volumes and cortical thicknesses, brain glucose hypometabolism, and cognitive performance from the Mini-Mental State Examination (all adjusted for age, years of education, sex, and head size, as appropriate). Data included 338 participants with known mutation status (211 mutation carriers in three subtypes: 163 PSEN1, 17 PSEN2, and 31 APP) and a baseline visit (age 19-66; up to four visits each, 1.1 ± 1.9 years in duration; spanning 30 years before, to 21 years after, parental age of symptom onset). We used an event-based model to estimate sequences of biomarker changes from baseline data across disease subtypes (mutation groups), and a differential equation model to estimate biomarker trajectories from longitudinal data (up to 66 mutation carriers, all subtypes combined). The two models concur that biomarker abnormality proceeds as follows: amyloid deposition in cortical then subcortical regions (∼24 ± 11 years before onset); phosphorylated tau (17 ± 8 years), tau and amyloid-β changes in cerebrospinal fluid; neurodegeneration first in the putamen and nucleus accumbens (up to 6 ± 2 years); then cognitive decline (7 ± 6 years), cerebral hypometabolism (4 ± 4 years), and further regional neurodegeneration. Our models predicted symptom onset more accurately than predictions that used familial estimates: root mean squared error of 1.35 years versus 5.54 years. The models reveal hidden detail on dominantly-inherited Alzheimer's disease progression, as well as providing data-driven systems for fine-grained patient staging and prediction of symptom onset with great potential utility in clinical trials.
Decoy-state quantum key distribution with more than three types of photon intensity pulses
NASA Astrophysics Data System (ADS)
Chau, H. F.
2018-04-01
The decoy-state method closes source security loopholes in quantum key distribution (QKD) using a laser source. In this method, accurate estimates of the detection rates of vacuum and single-photon events, plus the error rate of single-photon events, are needed to give a good enough lower bound on the secret key rate. Nonetheless, the current estimation method for these detection and error rates, which uses three types of photon intensities, is accurate only up to about 1% relative error. Here I report an experimentally feasible way that greatly improves these estimates and hence increases the one-way key rate of the BB84 QKD protocol with unbiased basis selection by at least 20% on average in realistic settings. The major tricks are the use of more than three types of photon intensities plus the fact that estimating bounds on the above detection and error rates is numerically stable, even though these bounds are related to the inversion of a matrix with a high condition number.
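A simplified toy version of the decoy-state estimation problem shows why extra intensities help: each measured gain adds one linear constraint on the yields, so more than three intensities over-determine the system. The sketch below uses hypothetical yields and intensities, with a plain least-squares fit as a stand-in for the paper's bound construction:

```python
import numpy as np
from math import factorial
from scipy.optimize import lsq_linear

# Hedged toy version of decoy-state yield estimation. For a Poissonian source
# with mean photon number mu, the gain satisfies
#   Q_mu * e^mu = sum_n Y_n * mu^n / n!,
# so each measured intensity adds one linear constraint on the yields Y_n.
mus = np.array([0.05, 0.1, 0.2, 0.4, 0.6])      # five intensities (more than three)
Y_true = np.array([1e-5, 5e-3, 1.2e-2, 2e-2])   # hypothetical yields Y_0..Y_3

A = np.array([[mu**n / factorial(n) for n in range(len(Y_true))] for mu in mus])
Q = (A @ Y_true) * np.exp(-mus)                 # simulated measured gains

# With 5 equations for 4 unknowns the system is over-determined, which tames
# the ill-conditioned inversion mentioned in the abstract.
res = lsq_linear(A, Q * np.exp(mus), bounds=(0.0, 1.0))
print("estimated Y_0, Y_1:", res.x[0], res.x[1])
```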
Charles, Krista; Cannon, Margaret; Hall, Robert; Coustasse, Alberto
2014-01-01
Computerized provider order entry (CPOE) systems allow physicians to prescribe patient services electronically. In hospitals, CPOE essentially eliminates the need for handwritten paper orders and achieves cost savings through increased efficiency. The purpose of this research study was to examine the benefits of and barriers to CPOE adoption in hospitals to determine the effects on medical errors and adverse drug events (ADEs) and examine cost and savings associated with the implementation of this newly mandated technology. This study followed a methodology using the basic principles of a systematic review and referenced 50 sources. CPOE systems in hospitals were found to be capable of reducing medical errors and ADEs, especially when CPOE systems are bundled with clinical decision support systems designed to alert physicians and other healthcare providers of pending lab or medical errors. However, CPOE systems face major barriers associated with adoption in a hospital system, mainly high implementation costs and physicians' resistance to change.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hensley, Alyssa J. R.; Ghale, Kushal; Rieg, Carolin
2017-01-26
In recent years, the popularity of density functional theory with periodic boundary conditions (DFT) has surged for the design and optimization of functional materials. However, no single DFT exchange–correlation functional currently available gives accurate adsorption energies on transition metals both when bonding to the surface is dominated by strong covalent or ionic bonding and when it has strong contributions from van der Waals interactions (i.e., dispersion forces). Here we present a new, simple method for accurately predicting adsorption energies on transition-metal surfaces based on DFT calculations, using an adaptively weighted sum of energies from the RPBE and optB86b-vdW (or optB88-vdW) density functionals. This method has been benchmarked against a set of 39 reliable experimental energies for adsorption reactions. Our results show that this method has a mean absolute error and root mean squared error relative to experiments of 13.4 and 19.3 kJ/mol, respectively, compared to 20.4 and 26.4 kJ/mol for the BEEF-vdW functional. For systems with large van der Waals contributions, this method decreases these errors to 11.6 and 17.5 kJ/mol. Furthermore, this method provides predictions of adsorption energies both for processes dominated by strong covalent or ionic bonding and for those dominated by dispersion forces that are more accurate than those of any current standard DFT functional alone.
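The abstract names the ingredients (RPBE and optB86b-vdW energies combined with an adaptive weight) but not the weighting formula, so the sketch below is only illustrative, with a hypothetical weight based on an estimated dispersion fraction:

```python
# Hedged sketch of an adaptively weighted energy mix. The weighting rule here
# is hypothetical (the published scheme defines its weight differently):
# lean toward the vdW functional when dispersion dominates the binding.
def weighted_adsorption_energy(e_rpbe: float, e_vdw: float, vdw_fraction: float) -> float:
    """vdw_fraction: estimated share of the binding that is dispersion (0..1)."""
    w = 1.0 - vdw_fraction          # hypothetical adaptive weight
    return w * e_rpbe + (1.0 - w) * e_vdw

# Hypothetical numbers (kJ/mol): a covalently bound case vs. a physisorbed case.
print(weighted_adsorption_energy(e_rpbe=-120.0, e_vdw=-150.0, vdw_fraction=0.1))
print(weighted_adsorption_energy(e_rpbe=-15.0,  e_vdw=-45.0,  vdw_fraction=0.8))
```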
Chang, Soju; Pool, Vitali; O'Connell, Kathryn; Polder, Jacquelyn A; Iskander, John; Sweeney, Colleen; Ball, Robert; Braun, M Miles
2008-01-01
Errors involving the mix-up of tuberculin purified protein derivative (PPD) and vaccines leading to adverse reactions and unnecessary medical management have been reported previously. To determine the frequency of PPD-vaccine mix-ups reported to the US Vaccine Adverse Event Reporting System (VAERS) and the Adverse Event Reporting System (AERS), characterize adverse events and clusters involving mix-ups and describe reported contributory factors. We reviewed AERS reports from 1969 to 2005 and VAERS reports from 1990 to 2005. We defined a mix-up error event as an incident in which a single patient or a cluster of patients inadvertently received vaccine instead of a PPD product or received a PPD product instead of vaccine. We defined a cluster as inadvertent administration of PPD or vaccine products to more than one patient in the same facility within 1 month. Of 115 mix-up events identified, 101 involved inadvertent administration of vaccines instead of PPD. Product confusion involved PPD and multiple vaccines. The annual number of reported mix-ups increased from an average of one event per year in the early 1990s to an average of ten events per year in the early part of this decade. More than 240 adults and children were affected and the majority reported local injection site reactions. Four individuals were hospitalized (all recovered) after receiving the wrong products. Several patients were inappropriately started on tuberculosis prophylaxis as a result of a vaccine local reaction being interpreted as a positive tuberculin skin test. Reported potential contributory factors involved both system factors (e.g. similar packaging) and human errors (e.g. failure to read label before product administration). To prevent PPD-vaccine mix-ups, proper storage, handling and administration of vaccine and PPD products is necessary.
Arrighi, Pieranna; Bonfiglio, Luca; Minichilli, Fabrizio; Cantore, Nicoletta; Carboncini, Maria Chiara; Piccotti, Emily; Rossi, Bruno; Andre, Paolo
2016-01-01
Modulation of frontal midline theta (fmθ) is observed during error commission, but little is known about the role of theta oscillations in correcting motor behaviours. We investigate EEG activity of healthy participants executing a reaching task under variable degrees of prism-induced visuo-motor distortion and visual occlusion of the initial arm trajectory. This task introduces directional errors of different magnitudes. The discrepancy between predicted and actual movement directions (i.e. the error), at the time when visual feedback (hand appearance) became available, elicits a signal that triggers on-line movement correction. Analyses were performed on 25 EEG channels. For each participant, the median value of the angular error of all reaching trials was used to partition the EEG epochs into high- and low-error conditions. We computed event-related spectral perturbations (ERSP) time-locked either to visual feedback or to the onset of movement correction. ERSP time-locked to the onset of visual feedback showed that fmθ increased in the high- but not in the low-error condition with an approximate time lag of 200 ms. Moreover, when single epochs were sorted by the degree of motor error, fmθ started to increase when a certain level of error was exceeded and then scaled with error magnitude. When ERSP were time-locked to the onset of movement correction, the fmθ increase anticipated this event with an approximate time lead of 50 ms. During successive trials, an error reduction was observed which was associated with indices of adaptation (i.e., aftereffects), suggesting the need to explore whether theta oscillations may facilitate learning. To our knowledge this is the first study where the EEG signal recorded during reaching movements was time-locked to the onset of the error visual feedback. This allowed us to conclude that theta oscillations, putatively generated by anterior cingulate cortex activation, are implicated in error processing in semi-naturalistic motor behaviours.
Panel positioning error and support mechanism for a 30-m THz radio telescope
NASA Astrophysics Data System (ADS)
Yang, De-Hua; Okoh, Daniel; Zhou, Guo-Hua; Li, Ai-Hua; Li, Guo-Ping; Cheng, Jing-Quan
2011-06-01
A 30-m TeraHertz (THz) radio telescope is proposed to operate at 200 μm with an active primary surface. This paper presents a sensitivity analysis of active surface panel positioning errors with respect to optical performance in terms of the Strehl ratio. Based on Ruze's surface error theory and using a Monte Carlo simulation, the effects of six rigid panel positioning errors, namely piston, tip, tilt, radial, azimuthal and twist displacements, were directly derived. The optical performance of the telescope was then evaluated using the standard Strehl ratio. We graphically illustrate the various panel error effects by presenting simulations of complete ensembles of full reflector surface errors for the six different rigid panel positioning errors. The sensitivity analysis revealed that the piston and tilt/tip errors are dominant, while the other rigid errors are much less important. Furthermore, guided by these results, we conceived of an alternative Master-Slave Concept-based (MSC-based) active surface by implementing a special Series-Parallel Concept-based (SPC-based) hexapod as the active panel support mechanism. A new 30-m active reflector based on the two concepts was demonstrated to achieve correction for all six rigid panel positioning errors in an economically feasible way.
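Ruze's theory ties an RMS surface error eps to a gain (Strehl-like) factor exp(-(4*pi*eps/lambda)^2); the factor of 4*pi arises because a panel piston d doubles into a 2d path-length error on reflection. A minimal Monte Carlo sketch restricted to piston errors (hypothetical panel count and error level, uniform illumination assumed, not the paper's full six-error model):

```python
import numpy as np

# Simplified Ruze-type Monte Carlo: panels contribute only piston errors and
# every panel is weighted equally. A piston error d on a reflecting panel
# produces a 2*d wavefront error, giving Strehl ~ exp(-(4*pi*eps/lam)**2)
# for a realized RMS surface error eps.
rng = np.random.default_rng(3)
lam = 200e-6                          # 200 um observing wavelength, per the abstract
n_panels, trials = 600, 1000          # hypothetical panel count

eps_rms = 10e-6                       # hypothetical 10 um RMS panel piston error
strehls = []
for _ in range(trials):
    piston = rng.normal(0.0, eps_rms, n_panels)
    eps = piston.std()                # realized RMS surface error
    strehls.append(np.exp(-(4 * np.pi * eps / lam) ** 2))

print(f"mean Strehl = {np.mean(strehls):.3f} at eps_rms = {eps_rms*1e6:.0f} um")
```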
The Swiss cheese model of adverse event occurrence--Closing the holes.
Stein, James E; Heiss, Kurt
2015-12-01
Traditional surgical attitude regarding error and complications has focused on individual failings. Human factors research has brought new and significant insights into the occurrence of error in healthcare, helping us identify systemic problems that injure patients while enhancing individual accountability and teamwork. This article introduces human factors science and its applicability to teamwork, surgical culture, medical error, and individual accountability. Copyright © 2015 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Garcia-Medina, G.; Ozkan-Haller, H. T.; Holman, R. A.; Ruggiero, P.
2016-02-01
Understanding the primary hydrodynamic processes that cause extreme runup events is important for the prediction of dune erosion and coastal flooding. Large runups may be caused by a superposition of physical and environmental conditions, bore-bore capture, infragravity-short wave interaction, and/or swash-backwash interaction. To investigate the conditions leading to these events we combine optical remote sensing observations (Argus) and state-of-the-art phase-resolving numerical modeling (primarily NHWAVE). We evaluate runup time series derived from across-shore transects of pixel intensities at two very different beaches: Agate (Oregon, USA) and Duck (North Carolina, USA). The former is a dissipative beach where the runup is dominated by infragravity energy, whereas the latter is a reflective beach where the runup is dominated by short surface gravity waves. Phase-resolving numerical models are implemented to explore an expanded parameter set and identify the mechanisms that control these large runups. Model results are in good qualitative agreement with observations. We also distinguish unexpected runups, which are defined by having an unexpectedly large excursion distance in comparison to the hourly-to-daily local runup conditions and do not necessarily represent a statistical extreme. These events pose significant safety hazards. We evaluate the relative contribution of the dominant physics to extreme and unexpected runup events.
Peter, Hannes; Hörtnagl, Paul; Reche, Isabel; Sommaruga, Ruben
2014-12-01
The diversity of airborne microorganisms that potentially reach aquatic ecosystems during rain events is poorly explored. Here, we used a culture-independent approach to characterize bacterial assemblages during rain events with and without Saharan dust influence arriving at a high mountain lake in the Austrian Alps. Bacterial assemblage composition differed significantly between samples with and without Saharan dust influence. Although alpha diversity indices were within the same range in both sample categories, rain events with Atlantic or continental origins were dominated by Betaproteobacteria, whereas those with Saharan dust intrusions were dominated by Gammaproteobacteria. The high diversity and evenness observed in all samples suggest that different sources of bacteria contributed to the airborne assemblage collected at the lake shore. During experiments with bacterial assemblages collected during rain events with Saharan dust influence, cell numbers rapidly increased in sterile lake water from initially ~3 × 10^3 cells ml^-1 to 3.6-11.1 × 10^5 cells ml^-1 within 4-5 days, and taxa that were initially rare dominated at the end of the experiment. Our study documents the dispersal of viable bacteria associated with Saharan dust intrusions travelling northwards as far as 47° latitude.
Barriers to success: physical separation optimizes event-file retrieval in shared workspaces.
Klempova, Bibiana; Liepelt, Roman
2017-07-08
Sharing tasks with other persons can simplify our work and life, but seeing and hearing other people's actions may also be very distracting. The joint Simon effect (JSE) is a standard measure of referential response coding when two persons share a Simon task. Sequential modulations of the joint Simon effect (smJSE) are interpreted as a measure of event-file processing containing stimulus information, response information and information about the currently relevant control state in a given social situation. This study tested effects of physical (Experiment 1) and virtual (Experiment 2) separation of shared workspaces on referential coding and event-file processing using a joint Simon task. In Experiment 1, participants performed this task in individual (go-nogo), joint and standard Simon task conditions with and without a transparent curtain (physical separation) placed along the imagined vertical midline of the monitor. In Experiment 2, participants performed the same tasks with and without receiving background music (virtual separation). For response times, physical separation enhanced event-file retrieval, as indicated by an enlarged smJSE in the joint Simon task with the curtain compared to without it (Experiment 1), but did not change referential response coding. In line with this, we also found evidence for enhanced event-file processing through physical separation in the joint Simon task for error rates. Virtual separation impacted neither event-file processing nor referential coding, but generally slowed response times in the joint Simon task. For errors, virtual separation hampered event-file processing in the joint Simon task. For the cognitively more demanding standard two-choice Simon task, we found music to have a degrading effect on event-file retrieval for response times. Our findings suggest that adding a physical separation optimizes event-file processing in shared workspaces, while music seems to lead to a more relaxed task processing mode under shared task conditions. In addition, music had an interfering impact on joint error processing and, more generally, when dealing with a more complex task in isolation.
Restrictions on surgical resident shift length does not impact type of medical errors.
Anderson, Jamie E; Goodman, Laura F; Jensen, Guy W; Salcedo, Edgardo S; Galante, Joseph M
2017-05-15
In 2011, resident duty hours were restricted in an attempt to improve patient safety and resident education. With the goal of reducing fatigue, shorter shift lengths lead to more patient handoffs, raising concerns about adverse effects on patient safety. This study seeks to determine whether differences in duty-hour restrictions influence the types of errors made by residents. This is a nested retrospective cohort study at a surgery department in an academic medical center. During 2013-14, standard 2011 duty hours were in place for residents. In 2014-15, duty-hour restrictions at the study site were relaxed ("flexible") with no restrictions on shift length. We reviewed all morbidity and mortality submissions from July 1, 2013-June 30, 2015 and compared differences in types of errors between these periods. A total of 383 patients experienced adverse events, including 59 deaths (15.4%). Comparing the standard versus flexible periods, there was no difference in mortality (15.7% versus 12.6%, P = 0.479) or complication rates (2.6% versus 2.5%, P = 0.696). There was no difference in types of errors between the periods (P = 0.050-0.808). The largest number of errors was due to cognitive failures (229, 59.6%), whereas the fewest errors were due to team failure (127, 33.2%). By subset, technical errors resulted in the highest number of errors (169, 44.1%). There were no differences in types of errors for cases that were nonelective, occurred at night, or involved residents. Among the adverse events reported in this departmental surgical morbidity and mortality review, there were no differences in types of errors when resident duty hours were less restrictive. Copyright © 2017 Elsevier Inc. All rights reserved.
Patterson, Mark E; Pace, Heather A; Fincham, Jack E
2013-09-01
Although error-reporting systems enable hospitals to accurately track safety climate through the identification of adverse events, these systems may be underused within a work climate of poor communication. The objective of this analysis is to identify the extent to which perceived communication climate among hospital pharmacists impacts medical error reporting rates. This cross-sectional study used survey responses from more than 5000 pharmacists responding to the 2010 Hospital Survey on Patient Safety Culture (HSOPSC). Two composite scores were constructed, for "communication openness" and "feedback and communication about error," respectively. Error reporting frequency was defined from the survey question, "In the past 12 months, how many event reports have you filled out and submitted?" Multivariable logistic regressions were used to estimate the likelihood of medical error reporting conditional upon communication openness or feedback levels, controlling for pharmacist years of experience, hospital geographic region, and ownership status. Pharmacists with higher communication openness scores compared with lower scores were 40% more likely to have filed or submitted a medical error report in the past 12 months (OR, 1.4; 95% CI, 1.1-1.7; P = 0.004). In contrast, pharmacists with higher communication feedback scores were not any more likely than those with lower scores to have filed or submitted a medical error report in the past 12 months (OR, 1.0; 95% CI, 0.8-1.3; P = 0.97). Hospital work climates that encourage pharmacists to communicate freely about problems related to patient safety are conducive to medical error reporting. The presence of feedback infrastructures about error may not be sufficient to induce error-reporting behavior.
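The modeling step described above can be sketched with synthetic data (not the HSOPSC responses): exponentiating a logistic regression coefficient yields the kind of odds ratio reported in the abstract.

```python
import numpy as np
import statsmodels.api as sm

# Sketch with synthetic data: logistic regression of "filed at least one
# event report" on a communication openness score; the exponentiated slope
# is the odds ratio per one-point increase in the score.
rng = np.random.default_rng(4)
n = 2000
openness = rng.normal(3.5, 0.8, n)                       # hypothetical composite score
logit = -2.0 + 0.4 * (openness - 3.5)                    # hypothetical true effect
reported = rng.random(n) < 1 / (1 + np.exp(-logit))

X = sm.add_constant(openness)
fit = sm.Logit(reported.astype(float), X).fit(disp=False)
odds_ratio = np.exp(fit.params[1])
ci_low, ci_high = np.exp(fit.conf_int()[1])
print(f"OR = {odds_ratio:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```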
Modeling and characterization of multipath in global navigation satellite system ranging signals
NASA Astrophysics Data System (ADS)
Weiss, Jan Peter
The Global Positioning System (GPS) provides position, velocity, and time information to users anywhere near the Earth, in real time and regardless of weather conditions. Since the system became operational, improvements in many areas have reduced the systematic errors affecting GPS measurements, such that multipath, defined as any signal taking a path other than the direct one, has become a significant, if not dominant, error source for many applications. This dissertation utilizes several approaches to characterize and model multipath errors in GPS measurements. Multipath errors in GPS ranging signals are characterized for several receiver systems and environments. Experimental P(Y) code multipath data are analyzed for ground stations with multipath levels ranging from minimal to severe, a C-12 turboprop, an F-18 jet, and an aircraft carrier. Comparisons between receivers utilizing single patch antennas and multi-element arrays are also made. In general, the results show significant reductions in multipath with antenna array processing, although large errors can occur even with this kind of equipment. Analysis of airborne platform multipath shows that the errors tend to be small in magnitude because the size of the aircraft limits the geometric delay of multipath signals, and high in frequency because aircraft dynamics cause rapid variations in geometric delay. A comprehensive multipath model is developed and validated. The model integrates 3D structure models, satellite ephemerides, electromagnetic ray-tracing algorithms, and detailed antenna and receiver models to predict multipath errors. Validation is performed by comparing experimental and simulated multipath via overall error statistics, per-satellite time histories, and frequency content analysis. The validation environments include two urban buildings, an F-18, an aircraft carrier, and a rural area where terrain multipath dominates. The validated models are used to identify multipath sources, characterize signal properties, evaluate additional antenna and receiver tracking configurations, and estimate the reflection coefficients of multipath-producing surfaces. Dynamic models for an F-18 landing on an aircraft carrier correlate aircraft dynamics to multipath frequency content; the model also characterizes the separate contributions of multipath due to the aircraft, ship, and ocean to the overall error statistics. Finally, reflection coefficients for multipath produced by terrain are estimated via a least-squares algorithm.
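The geometric delay mentioned above has a simple closed form for the textbook ground-bounce case. The sketch below (standard relations, not the dissertation's ray-tracing model, with hypothetical antenna height and reflection coefficient) shows how the elevation angle sets the extra path and the resulting carrier-phase error:

```python
import numpy as np

# Textbook ground-bounce multipath geometry: for an antenna at height h above
# a horizontal reflector, the reflected signal travels an extra path of
# 2*h*sin(elev); for a relative reflection amplitude alpha the carrier-phase
# error is atan2(alpha*sin(psi), 1 + alpha*cos(psi)), psi = 2*pi*delta/lam.
h = 1.5                      # hypothetical antenna height above reflector (m)
alpha = 0.3                  # hypothetical reflection coefficient
lam = 0.1903                 # GPS L1 carrier wavelength (m)

elev = np.radians(np.linspace(5, 85, 5))
delta = 2 * h * np.sin(elev)                     # geometric delay (m)
psi = 2 * np.pi * delta / lam                    # multipath relative phase (rad)
phase_err = np.arctan2(alpha * np.sin(psi), 1 + alpha * np.cos(psi))

for e, d, pe in zip(np.degrees(elev), delta, phase_err):
    print(f"elev {e:4.1f} deg: extra path {d:5.2f} m, "
          f"carrier error {pe * lam / (2 * np.pi) * 1000:6.1f} mm")
```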
Brain State Before Error Making in Young Patients With Mild Spastic Cerebral Palsy.
Hakkarainen, Elina; Pirilä, Silja; Kaartinen, Jukka; van der Meere, Jaap J
2015-10-01
In the present experiment, children with mild spastic cerebral palsy and a control group carried out a memory recognition task. The key question was whether errors of the patient group are foreshadowed by attention lapses, by weak motor preparation, or by both. Reaction times together with event-related potentials associated with motor preparation (frontal late contingent negative variation), attention (parietal P300), and response evaluation (parietal error-preceding positivity) were investigated in instances where three consecutive correct trials preceded an error. The findings indicated that error responses of the patient group are foreshadowed by weak motor preparation in correct trials directly preceding an error. © The Author(s) 2015.
Using Antelope and Seiscomp in the framework of the Romanian Seismic Network
NASA Astrophysics Data System (ADS)
Marius Craiu, George; Craiu, Andreea; Marmureanu, Alexandru; Neagoe, Cristian
2014-05-01
The National Institute for Earth Physics (NIEP) operates a real-time seismic network designed to monitor the seismic activity on Romanian territory, dominated by the Vrancea intermediate-depth (60-200 km) earthquakes. The NIEP real-time network currently consists of 102 stations and two seismic arrays equipped with different high-quality digitizers (Kinemetrics K2, Quanterra Q330, Quanterra Q330HR, PS6-26, Basalt), broadband and short-period seismometers (CMG3ESP, CMG40T, KS2000, KS54000, CMG3T, STS2, SH-1, S13, Mark L4C, Ranger, GS21, Mark 22) and acceleration sensors (Kinemetrics Episensor). The primary goal of the real-time seismic network is to provide earthquake parameters from many broadband stations with a high dynamic range, for more rapid and accurate computation of the locations and magnitudes of earthquakes. The Seedlink and Antelope program packages handle data acquisition, and a completely automated Antelope seismological system runs at the Data Center in Măgurele for both real-time processing and post-processing. The Antelope real-time system provides automatic event detection, arrival picking, event location, and magnitude calculation. It also provides graphical displays and automatic locations in near real time after a local, regional or teleseismic event has occurred. SeisComP 3 is another automated system run at the NIEP, which provides the following features: data acquisition, data quality control, real-time data exchange and processing, network status monitoring, issuing event alerts, waveform archiving and data distribution, automatic event detection and location, and easy access to relevant information about stations, waveforms, and recent earthquakes. The main goal of this paper is to compare these two data acquisition systems in order to improve their detection capabilities, location accuracy, and magnitude and depth determination, and to reduce the RMS and other location errors.
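Both packages automate event detection of the kind illustrated below: a classic STA/LTA trigger. This numpy-only sketch uses made-up window lengths and a made-up threshold; it stands in for, and is much simpler than, either system's detector.

```python
import numpy as np

def sta_lta(trace, nsta, nlta):
    """Ratio of short-term to long-term average of squared amplitudes."""
    sq = trace ** 2
    csum = np.cumsum(np.concatenate(([0.0], sq)))
    sta = (csum[nsta:] - csum[:-nsta]) / nsta
    lta = (csum[nlta:] - csum[:-nlta]) / nlta
    m = min(len(sta), len(lta))               # align both series at the trace end
    return sta[-m:] / np.maximum(lta[-m:], 1e-12)

rng = np.random.default_rng(7)
trace = rng.standard_normal(6000)                        # synthetic noise record
trace[3000:3300] += 5 * np.sin(np.linspace(0, 60, 300))  # injected "event"
ratio = sta_lta(trace, nsta=50, nlta=1000)
print("first trigger sample:", np.flatnonzero(ratio > 4.0)[:1])  # threshold = 4
```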
NASA Technical Reports Server (NTRS)
Fields, J. M.
1984-01-01
Even though there are surveys in which annoyance decreases as the number of events increases above about 150 a day, the available evidence is not considered strong enough to reject the conventional assumption that reactions are related to the logarithm of the number of events. The data do not make it possible to reject the conventional assumption that the effects of the number of events and the peak noise level are additive. It is found that even when equivalent questionnaire items and definitions of noise events could be used, differences between the surveys' estimates of the effect of the number of events remained large. Three explanations are suggested for inconsistent estimates. The first has to do with errors in specifying the values of noise parameters, the second with the effects of unmeasured acoustical and area characteristics that are correlated with noise level or number, and the third with large sampling errors deriving from community differences in response to noise. It is concluded that significant advances in the knowledge about the effects of the number of noise events can be made only if surveys include large numbers of study areas.
West, Michael E.; Larsen, Christopher F.; Truffer, Martin; O'Neel, Shad; LeBlanc, Laura
2010-01-01
We present a framework for interpreting small glacier seismic events based on data collected near the center of Bering Glacier, Alaska, in spring 2007. We find extremely high microseismicity rates (as many as tens of events per minute) occurring largely within a few kilometers of the receivers. A high-frequency class of seismicity is distinguished by dominant frequencies of 20–35 Hz and impulsive arrivals. A low-frequency class has dominant frequencies of 6–15 Hz, emergent onsets, and longer, more monotonic codas. A bimodal distribution of 160,000 seismic events over two months demonstrates that the classes represent two distinct populations. This is further supported by the presence of hybrid waveforms that contain elements of both event types. The high-low-hybrid paradigm is well established in volcano seismology and is demonstrated by a comparison to earthquakes from Augustine Volcano. We build on these parallels to suggest that fluid-induced resonance is likely responsible for the low-frequency glacier events and that the hybrid glacier events may be caused by the rush of water into newly opening pathways.
Khosla, T.
1977-01-01
The United States of America dominated 58 events in athletics, field and swimming, which between them accounted for 35 per cent of all events in the 1972 Munich Olympiad; these events favour taller individuals. But in 25 per cent of other events, (1) cycling, (2) fencing, (3) gymnastics, (4) judo, (5) weightlifting and (6) Graeco-Roman wrestling, the U.S.A. did not win a single medal. The failure of the U.S.A. to maintain her lead in Munich was largely due to weaknesses in these other events, in many of which the potential medallists can be drawn from the lower half of the height distribution (events 3 to 6). These weaknesses are Russia's strength, and they remained unstrengthened at Montreal. Also, the domination held by the U.S.A. in swimming was seriously challenged by East Germany. The present trends indicate that the U.S.A.'s ranking is likely to slip further, to third position in Moscow 1980. Factors inhibiting the survival of the Olympics are pointed out. PMID:861436
Jahn, Beate; Rochau, Ursula; Kurzthaler, Christina; Paulden, Mike; Kluibenschädl, Martina; Arvandi, Marjan; Kühne, Felicitas; Goehler, Alexander; Krahn, Murray D; Siebert, Uwe
2016-04-01
Breast cancer is the most common malignancy among women in developed countries. We developed a model (the Oncotyrol breast cancer outcomes model) to evaluate the cost-effectiveness of a 21-gene assay when used in combination with Adjuvant! Online to support personalized decisions about the use of adjuvant chemotherapy. The goal of this study was to perform a cross-model validation. The Oncotyrol model evaluates the 21-gene assay by simulating a hypothetical cohort of 50-year-old women over a lifetime horizon using discrete event simulation. Primary model outcomes were life-years, quality-adjusted life-years (QALYs), costs, and incremental cost-effectiveness ratios (ICERs). We followed the International Society for Pharmacoeconomics and Outcomes Research-Society for Medical Decision Making (ISPOR-SMDM) best practice recommendations for validation and compared modeling results of the Oncotyrol model with the state-transition model developed by the Toronto Health Economics and Technology Assessment (THETA) Collaborative. Both models were populated with Canadian THETA model parameters, and outputs were compared. The differences between the models varied among the different validation end points. The smallest relative differences were in costs, and the greatest were in QALYs. All relative differences were less than 1.2%. The cost-effectiveness plane showed that small differences in the model structure can lead to different sets of nondominated test-treatment strategies with different efficiency frontiers. We faced several challenges: distinguishing between differences in outcomes due to different modeling techniques and initial coding errors, defining meaningful differences, and selecting measures and statistics for comparison (means, distributions, multivariate outcomes). Cross-model validation was crucial to identify and correct coding errors and to explain differences in model outcomes. In our comparison, small differences in either QALYs or costs led to changes in ICERs because of changes in the set of dominated and nondominated strategies. © The Author(s) 2015.
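The shifting sets of dominated and nondominated strategies can be illustrated with the standard frontier construction. The sketch below uses synthetic costs and QALYs (not Oncotyrol or THETA outputs): strategies are sorted by cost, strongly and extendedly dominated options are pruned, and ICERs are computed along the efficiency frontier.

```python
strategies = {  # name: (cost in $, effectiveness in QALYs) -- synthetic values
    "no_test":   (20000, 11.10),
    "assay_50+": (23500, 11.15),
    "assay_all": (26000, 11.32),
}

items = sorted(strategies.items(), key=lambda kv: kv[1][0])  # ascending cost
nd = [items[0]]
for name, (c, e) in items[1:]:
    if e > nd[-1][1][1]:           # keep only if more effective than last kept
        nd.append((name, (c, e)))  # otherwise strongly dominated

# Extended dominance: ICERs along the frontier must be strictly increasing.
i = 1
while i < len(nd) - 1:
    (c0, e0), (c1, e1), (c2, e2) = nd[i - 1][1], nd[i][1], nd[i + 1][1]
    if (c1 - c0) / (e1 - e0) >= (c2 - c1) / (e2 - e1):
        del nd[i]                  # extendedly dominated
        i = max(1, i - 1)
    else:
        i += 1

for (n0, (c0, e0)), (n1, (c1, e1)) in zip(nd, nd[1:]):
    print(f"{n1} vs {n0}: ICER = {(c1 - c0) / (e1 - e0):,.0f} $/QALY")
```

With these numbers the middle strategy is extendedly dominated and drops off the frontier, which is exactly the kind of set change the abstract describes driving different ICERs.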
Estimating alarm thresholds and the number of components in mixture distributions
NASA Astrophysics Data System (ADS)
Burr, Tom; Hamada, Michael S.
2012-09-01
Mixtures of probability distributions arise in many nuclear assay and forensic applications, including nuclear weapon detection, neutron multiplicity counting, and in solution monitoring (SM) for nuclear safeguards. SM data is increasingly used to enhance nuclear safeguards in aqueous reprocessing facilities having plutonium in solution form in many tanks. This paper provides background for mixture probability distributions and then focuses on mixtures arising in SM data. SM data can be analyzed by evaluating transfer-mode residuals defined as tank-to-tank transfer differences, and wait-mode residuals defined as changes during non-transfer modes. A previous paper investigated impacts on transfer-mode and wait-mode residuals of event marking errors which arise when the estimated start and/or stop times of tank events such as transfers are somewhat different from the true start and/or stop times. Event marking errors contribute to non-Gaussian behavior and larger variation than predicted on the basis of individual tank calibration studies. This paper illustrates evidence for mixture probability distributions arising from such event marking errors and from effects such as condensation or evaporation during non-transfer modes, and pump carryover during transfer modes. A quantitative assessment of the sample size required to adequately characterize a mixture probability distribution arising in any context is included.
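One standard way to choose the number of components, sketched here on synthetic residuals rather than SM plant data, is to fit candidate Gaussian mixtures and compare BIC:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# Two-component mixture mimicking well-marked vs. mis-marked transfer residuals
x = np.concatenate([rng.normal(0.0, 1.0, 800),
                    rng.normal(4.0, 1.5, 200)]).reshape(-1, 1)

bics = {k: GaussianMixture(n_components=k, random_state=0).fit(x).bic(x)
        for k in range(1, 5)}
print(min(bics, key=bics.get), bics)  # lowest BIC -> supported component count
```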
Xu, Stanley; Clarke, Christina L; Newcomer, Sophia R; Daley, Matthew F; Glanz, Jason M
2018-05-16
Vaccine safety studies are often electronic health record (EHR)-based observational studies. These studies often face significant methodological challenges, including confounding and misclassification of adverse events. Vaccine safety researchers use the self-controlled case series (SCCS) study design to handle confounding and employ medical chart review to ascertain cases that are identified using EHR data. However, for common adverse events, limited resources often make it impossible to adjudicate all adverse events observed in electronic data. In this paper, we considered four approaches for analyzing SCCS data with confirmation rates estimated from an internal validation sample: (1) observed cases, (2) confirmed cases only, (3) known confirmation rate, and (4) multiple imputation (MI). We conducted a simulation study to evaluate these four approaches using type I error rates, percent bias, and empirical power. Our simulation results suggest that when misclassification of adverse events is present, the observed cases, confirmed cases only, and known confirmation rate approaches may inflate the type I error, yield biased point estimates, and affect statistical power. The multiple imputation approach accounts for the uncertainty of confirmation rates estimated from an internal validation sample and yields a proper type I error rate, a largely unbiased point estimate, a proper variance estimate, and adequate statistical power. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
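The multiple-imputation approach can be caricatured in a few lines: draw a confirmation rate from its posterior given the validation sample, impute true case counts, and combine with Rubin's rules. All counts below, and the 1:2.5 person-time ratio, are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n_exp, n_unexp = 120, 300     # EHR-identified events in risk / reference windows
conf, total = 42, 60          # validation sample: 42 of 60 reviewed events confirmed
M = 1000                      # number of imputations

log_irr, within = [], []
for _ in range(M):
    p = rng.beta(conf + 1, total - conf + 1)   # posterior draw of confirmation rate
    a = rng.binomial(n_exp, p)                 # imputed true cases, risk window
    b = rng.binomial(n_unexp, p)               # imputed true cases, reference window
    if a > 0 and b > 0:
        log_irr.append(np.log((a / 1.0) / (b / 2.5)))  # person-time ratio assumed 1:2.5
        within.append(1.0 / a + 1.0 / b)       # approx. variance of a log rate ratio

log_irr = np.array(log_irr)
T = np.mean(within) + (1 + 1 / M) * np.var(log_irr, ddof=1)  # Rubin's rules
print(f"IRR = {np.exp(log_irr.mean()):.2f}, MI SE(log IRR) = {np.sqrt(T):.3f}")
```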
Turillazzi, Emanuela; Neri, Margherita
2014-07-15
The Italian code of medical deontology recently approved stipulates that physicians have the duty to inform the patient of each unwanted event and its causes, and to identify, report and evaluate adverse events and errors. Thus the obligation to supply information continues to widen, in some way extending beyond the doctor-patient relationship to become an essential tool for improving the quality of professional services. The new deontological precepts intersect two areas in which the figure of the physician is paramount. On the one hand is the need for maximum integrity towards the patient, in the name of the doctor's own, and the other's (the patient's) dignity and liberty; on the other is the physician's developing role in the strategies of the health system to achieve efficacy, quality, reliability and efficiency, to reduce errors and adverse events and to manage clinical risk. In Italy, due to guidelines issued by the Ministry of Health and to the new code of medical deontology, the role of physicians becomes a part of a complex strategy of risk management based on a system-focused approach in which increasing transparency regarding adverse outcomes and full disclosure of health-related negative events represent a key factor.
NASA Astrophysics Data System (ADS)
Coyne, Kevin Anthony
The safe operation of complex systems such as nuclear power plants requires close coordination between the human operators and plant systems. In order to maintain an adequate level of safety following an accident or other off-normal event, the operators often are called upon to perform complex tasks during dynamic situations with incomplete information. The safety of such complex systems can be greatly improved if the conditions that could lead operators to make poor decisions and commit erroneous actions during these situations can be predicted and mitigated. The primary goal of this research project was the development and validation of a cognitive model capable of simulating nuclear plant operator decision-making during accident conditions. Dynamic probabilistic risk assessment methods can improve the prediction of human error events by providing rich contextual information and an explicit consideration of feedback arising from man-machine interactions. The Accident Dynamics Simulator paired with the Information, Decision, and Action in a Crew context cognitive model (ADS-IDAC) shows promise for predicting situational contexts that might lead to human error events, particularly knowledge driven errors of commission. ADS-IDAC generates a discrete dynamic event tree (DDET) by applying simple branching rules that reflect variations in crew responses to plant events and system status changes. Branches can be generated to simulate slow or fast procedure execution speed, skipping of procedure steps, reliance on memorized information, activation of mental beliefs, variations in control inputs, and equipment failures. Complex operator mental models of plant behavior that guide crew actions can be represented within the ADS-IDAC mental belief framework and used to identify situational contexts that may lead to human error events. This research increased the capabilities of ADS-IDAC in several key areas. The ADS-IDAC computer code was improved to support additional branching events and provide a better representation of the IDAC cognitive model. An operator decision-making engine capable of responding to dynamic changes in situational context was implemented. The IDAC human performance model was fully integrated with a detailed nuclear plant model in order to realistically simulate plant accident scenarios. Finally, the improved ADS-IDAC model was calibrated, validated, and updated using actual nuclear plant crew performance data. This research led to the following general conclusions: (1) A relatively small number of branching rules are capable of efficiently capturing a wide spectrum of crew-to-crew variabilities. (2) Compared to traditional static risk assessment methods, ADS-IDAC can provide a more realistic and integrated assessment of human error events by directly determining the effect of operator behaviors on plant thermal hydraulic parameters. (3) The ADS-IDAC approach provides an efficient framework for capturing actual operator performance data such as timing of operator actions, mental models, and decision-making activities.
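The branching-rule idea behind the discrete dynamic event tree can be shown independently of ADS-IDAC. The toy sketch below enumerates crew-response branches (labels and probabilities are assumptions, not calibrated values) and ranks the resulting scenario paths:

```python
from itertools import product

# Hypothetical branching rules: (label, probability) at each decision point
rules = [
    [("fast_execution", 0.7), ("slow_execution", 0.3)],
    [("step_followed", 0.9), ("step_skipped", 0.1)],
    [("pump_ok", 0.95), ("pump_fails", 0.05)],
]

paths = []
for combo in product(*rules):          # one scenario per branch combination
    p = 1.0
    for _, prob in combo:
        p *= prob                      # path probability = product of branch probs
    paths.append(([label for label, _ in combo], p))

for labels, p in sorted(paths, key=lambda t: -t[1])[:3]:  # three most likely paths
    print(f"{p:.4f}  " + " -> ".join(labels))
```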
Historical trends and extremes in boreal Alaska river basins
Bennett, Katrina E.; Cannon, Alex J.; Hinzman, Larry
2015-05-12
Climate change will shift the frequency, intensity, duration and persistence of extreme hydroclimate events and have particularly disastrous consequences in vulnerable systems such as the warm permafrost-dominated Interior region of boreal Alaska. This work focuses on recent research results from nonparametric trends and nonstationary generalized extreme value (GEV) analyses at eight Interior Alaskan river basins for the past 50/60 years (1954/64–2013). Trends analysis of maximum and minimum streamflow indicates a strong (>+50%) and statistically significant increase in 11-day flow events during the late fall/winter and during the snowmelt period (late April/mid-May), followed by a significant decrease in the 11-day flow events during the post-snowmelt period (late May and into the summer). The April–May–June seasonal trends show significant decreases in maximum streamflow for snowmelt dominated systems (<–50%) and glacially influenced basins (–24% to –33%). Annual maximum streamflow trends indicate that most systems are experiencing declines, while minimum flow trends are largely increasing. Nonstationary GEV analysis identifies time-dependent changes in the distribution of spring extremes for snowmelt dominated and glacially dominated systems. Temperature in spring influences the glacial and high elevation snowmelt systems and winter precipitation drives changes in the snowmelt dominated basins. The Pacific Decadal Oscillation was associated with changes occurring in snowmelt dominated systems, and the Arctic Oscillation was linked to one lake dominated basin, with half of the basins exhibiting no change in response to climate variability. The paper indicates that broad scale studies examining trend and direction of change should employ multiple methods across various scales and consider regime dependent shifts to identify and understand changes in extreme streamflow within boreal forested watersheds of Alaska.
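A nonstationary GEV of this kind lets the location parameter drift with time and is fitted by maximum likelihood. A minimal sketch on synthetic annual maxima (starting values and parameterization are illustrative, not the study's setup):

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(3)
years = np.arange(60)
x = stats.genextreme.rvs(c=-0.1, loc=100 + 0.5 * years, scale=15,
                         size=years.size, random_state=rng)  # synthetic maxima

def nll(theta):
    """Negative log-likelihood with location linear in time."""
    mu0, mu1, log_sigma, c = theta
    return -stats.genextreme.logpdf(
        x, c=c, loc=mu0 + mu1 * years, scale=np.exp(log_sigma)).sum()

res = optimize.minimize(nll, x0=[x.mean(), 0.0, np.log(x.std()), -0.1],
                        method="Nelder-Mead")
mu0, mu1, log_sigma, c = res.x
print(f"location trend: {mu1:.2f} per year (shape c = {c:.2f})")
```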
MEADERS: Medication Errors and Adverse Drug Event Reporting system.
Zafar, Atif
2007-10-11
The Agency for Healthcare Research and Quality (AHRQ) recently funded the PBRN Resource Center to develop a system for reporting ambulatory medication errors. Our goal was to develop a usable system that practices could use internally to track errors. We initially performed a comprehensive literature review of what is currently available. Then, using a combination of expert panel meetings and iterative development, we designed an instrument for ambulatory medication error reporting and created a reporting system based both in MS Access 2003 and on the web using MS ASP.NET 2.0 technologies.
Scarry, Clara J; Tujague, M Paula
2012-09-01
In conflicts between primate groups, the resource-holding potential (RHP) of competitors is frequently related to group size or male group size, which can remain relatively constant for long periods of time, promoting stable intergroup dominance relationships. Demographic changes in neighboring groups, however, could introduce uncertainty into existing relationships. Among tufted capuchin monkeys (Cebus apella nigritus), dominant male replacement is a relatively infrequent demographic event that can have a profound effect on both the composition and size of the social group. Here, we report such a case and the consequences for home range use and intergroup aggression. Between June 2008 and August 2010, we periodically followed two neighboring groups (Macuco and Rita) in Iguazú National Park, recording daily paths (N = 143) and encounters between the groups (N = 28). We describe the events leading to a change in the male dominance hierarchy in the larger group (Macuco), which resulted in the death or dispersal of all adult males, followed by the succession of a young adult male to the dominant position. This takeover event reduced the numerical advantage in number of males between the two groups, although the ratio of total group sizes remained nearly constant. Following this shift in numerical asymmetry, the degree of escalation of intergroup aggression increased, and we observed reversals in the former intergroup dominance relationship. These changes in behavior during intergroup encounters were associated with changes in the use of overlapping areas. In the 6 months following the takeover, the area of home range overlap doubled, and the formerly dominant group's area of exclusive access was reduced by half. These results suggest that RHP in tufted capuchin monkeys is related to male group size. Furthermore, they highlight the importance of considering rare demographic events in attempts to understand the dynamics of aggression between primate groups. © 2012 Wiley Periodicals, Inc.
Dodd, Lori E; Korn, Edward L; Freidlin, Boris; Gu, Wenjuan; Abrams, Jeffrey S; Bushnell, William D; Canetta, Renzo; Doroshow, James H; Gray, Robert J; Sridhara, Rajeshwari
2013-10-01
Measurement error in time-to-event end points complicates interpretation of treatment effects in clinical trials. Non-differential measurement error is unlikely to produce large bias [1]. When error depends on treatment arm, bias is of greater concern. Blinded-independent central review (BICR) of all images from a trial is commonly undertaken to mitigate differential measurement-error bias that may be present in hazard ratios (HRs) based on local evaluations. Similar BICR and local evaluation HRs may provide reassurance about the treatment effect, but BICR adds considerable time and expense to trials. We describe a BICR audit strategy [2] and apply it to five randomized controlled trials to evaluate its use and to provide practical guidelines. The strategy requires BICR on a subset of study subjects, rather than a complete-case BICR, and makes use of an auxiliary-variable estimator. When the effect size is relatively large, the method provides a substantial reduction in the size of the BICRs. In a trial with 722 participants and a HR of 0.48, an average audit of 28% of the data was needed and always confirmed the treatment effect as assessed by local evaluations. More moderate effect sizes and/or smaller trial sizes required larger proportions of audited images, ranging from 57% to 100% for HRs ranging from 0.55 to 0.77 and sample sizes between 209 and 737. The method is developed for a simple random sample of study subjects. In studies with low event rates, more efficient estimation may result from sampling individuals with events at a higher rate. The proposed strategy can greatly decrease the costs and time associated with BICR, by reducing the number of images undergoing review. The savings will depend on the underlying treatment effect and trial size, with larger treatment effects and larger trials requiring smaller proportions of audited data.
Leading a highly visible hospital through a serious reportable event.
Erickson, Jeanette Ives
2012-03-01
Most preventable adverse events result from systemic causes, not human error. The senior vice president for patient care/chief nurse at a leading hospital recounts the unnecessary death of a patient and the investigation that followed. Citing the critical importance of a "just culture," this case study offers a blueprint for managing a serious reportable event.
Clinical Errors and Medical Negligence
Oyebode, Femi
2013-01-01
This paper discusses the definition, nature and origins of clinical errors including their prevention. The relationship between clinical errors and medical negligence is examined, as are the characteristics of litigants and events that are the source of litigation. The pattern of malpractice claims in different specialties and settings is examined. Among hospitalized patients worldwide, 3–16% suffer injury as a result of medical intervention, the most common being the adverse effects of drugs. The frequency of adverse drug effects appears superficially to be higher in intensive care units and emergency departments, but once rates have been corrected for volume of patients, comorbidity of conditions and number of drugs prescribed, the difference is not significant. It is concluded that probably no more than 1 in 7 adverse events in medicine result in a malpractice claim, and the factors that predict that a patient will resort to litigation include a prior poor relationship with the clinician and the feeling that the patient is not being kept informed. Methods for preventing clinical errors are still in their infancy. The most promising include new technologies such as electronic prescribing systems, diagnostic and clinical decision-making aids and error-resistant systems. PMID:23343656
Analyzing numerical errors in domain heat transport models using the CVBEM.
Hromadka, T.V.
1985-01-01
Besides providing an exact solution for steady-state heat conduction processes (Laplace and Poisson equations), the CVBEM (complex variable boundary element method) can be used for the numerical error analysis of domain model solutions. For problems where soil water phase change latent heat effects dominate the thermal regime, heat transport can be approximately modeled as a time-stepped steady-state condition in the thawed and frozen regions, respectively. The CVBEM provides an exact solution of the two-dimensional steady-state heat transport problem, and also provides the error in matching the prescribed boundary conditions through the development of a modeling error distribution or an approximative boundary generation. This error evaluation can be used to develop highly accurate CVBEM models of the heat transport process, and the resulting model can be used as a test case for evaluating the precision of domain models based on finite elements or finite differences.
Prediction of human errors by maladaptive changes in event-related brain networks.
Eichele, Tom; Debener, Stefan; Calhoun, Vince D; Specht, Karsten; Engel, Andreas K; Hugdahl, Kenneth; von Cramon, D Yves; Ullsperger, Markus
2008-04-22
Humans engaged in monotonous tasks are susceptible to occasional errors that may lead to serious consequences, but little is known about brain activity patterns preceding errors. Using functional MRI and applying independent component analysis followed by deconvolution of hemodynamic responses, we studied error preceding brain activity on a trial-by-trial basis. We found a set of brain regions in which the temporal evolution of activation predicted performance errors. These maladaptive brain activity changes started to evolve approximately 30 sec before the error. In particular, a coincident decrease of deactivation in default mode regions of the brain, together with a decline of activation in regions associated with maintaining task effort, raised the probability of future errors. Our findings provide insights into the brain network dynamics preceding human performance errors and suggest that monitoring of the identified precursor states may help in avoiding human errors in critical real-world situations.
Statistical process control methods allow the analysis and improvement of anesthesia care.
Fasting, Sigurd; Gisvold, Sven E
2003-10-01
Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels, according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%; mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.
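A p-chart reduces to a centerline and 3-sigma binomial limits per period. The sketch below uses invented monthly counts, not the study's 65,170 anesthetics:

```python
import numpy as np

events = np.array([31, 28, 35, 40, 29, 33, 62, 30])  # adverse events per month
n = np.array([1050, 990, 1100, 1200, 980, 1010, 1150, 1005])  # anesthetics per month

p_bar = events.sum() / n.sum()                 # process average (centerline)
sigma = np.sqrt(p_bar * (1 - p_bar) / n)       # per-month binomial standard error
ucl = p_bar + 3 * sigma
lcl = np.clip(p_bar - 3 * sigma, 0, None)      # rates cannot go below zero

rate = events / n
out = (rate > ucl) | (rate < lcl)              # special-cause (unstable) signals
print("months out of control:", np.flatnonzero(out))
```

A month outside the limits indicates an unstable process (special-cause variation), as with the "difficult emergence" series above; months within the limits reflect the predictable normal variation the authors use as a baseline.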
Determination of Barometric Altimeter Errors for the Orion Exploration Flight Test-1 Entry
NASA Technical Reports Server (NTRS)
Brown, Denise L.; Munoz, Jean-Philippe; Gay, Robert
2011-01-01
The EFT-1 mission is the unmanned flight test for the upcoming Multi-Purpose Crew Vehicle (MPCV). During entry, the EFT-1 vehicle will trigger several Landing and Recovery System (LRS) events, such as parachute deployment, based on onboard altitude information. The primary altitude source is the filtered navigation solution updated with GPS measurement data. The vehicle also has three barometric altimeters that will be used to measure atmospheric pressure during entry. In the event that GPS data are not available during entry, the altitude derived from the barometric altimeter pressure will be used to trigger chute deployment for the drogue and main parachutes. It is therefore important to understand the impact of error sources on the pressure measured by the barometric altimeters and on the altitude derived from that pressure. There are four primary error sources impacting the sensed pressure: sensor errors, analog-to-digital conversion errors, aerodynamic errors, and atmosphere modeling errors. This last error source is induced by the conversion from pressure to altitude in the vehicle flight software, which requires an atmosphere model such as the 1976 US Standard Atmosphere. There are several secondary error sources as well, such as waves, tides, and latencies in data transmission. Typically, for error budget calculations it is assumed that all error sources are independent, normally distributed variables. Thus, the initial approach to developing the EFT-1 barometric altimeter altitude error budget was to create an itemized error budget under these assumptions. This budget was to be verified by simulation using high-fidelity models of the vehicle hardware and software. The simulation barometric altimeter model includes hardware error sources and a data-driven model of the aerodynamic errors expected to impact the pressure in the midbay compartment in which the sensors are located. The aerodynamic model includes the pressure difference between the midbay compartment and the free-stream pressure as a function of altitude, oscillations in sensed pressure due to wake effects, and an acoustics model capturing fluctuations in pressure due to motion of the passive vents separating the barometric altimeters from the outside of the vehicle.
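For the troposphere layer of the 1976 US Standard Atmosphere, the pressure-to-altitude inversion the flight software must perform is closed-form. A minimal sketch (constants from the published standard; the function name is ours, not flight code):

```python
import math

P0, T0 = 101325.0, 288.15   # sea-level pressure [Pa] and temperature [K]
L = 0.0065                  # tropospheric lapse rate [K/m]
R, g0, M = 8.31446, 9.80665, 0.0289644  # gas constant, gravity, molar mass of air

def pressure_to_altitude(p_pa: float) -> float:
    """Geopotential altitude [m] for the 0-11 km layer of the 1976 standard."""
    return (T0 / L) * (1.0 - (p_pa / P0) ** (R * L / (g0 * M)))

print(round(pressure_to_altitude(89874.6), 1))  # ~1000 m
```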
From Here to There: Lessons from an Integrative Patient Safety Project in Rural Health Care Settings
2005-05-01
errors and patient falls. The medication errors generally involved one of three issues: incorrect dose, time, or port. Although most of the health...statistics about trends; and the summary of events related to patient safety and medical errors. The interplay among factors... These three domains...the medical staff. We explored these issues further when administering a staff-wide Patient Safety Survey. Responses mirrored the findings that
NASA Astrophysics Data System (ADS)
Wiese, D. N.; McCullough, C. M.
2017-12-01
Studies have shown that both single pair low-low satellite-to-satellite tracking (LL-SST) and dual-pair LL-SST hypothetical future satellite gravimetry missions utilizing improved onboard measurement systems relative to the Gravity Recovery and Climate Experiment (GRACE) will be limited by temporal aliasing errors; that is, the error introduced through deficiencies in models of high frequency mass variations required for the data processing. Here, we probe the spatio-temporal characteristics of temporal aliasing errors to understand their impact on satellite gravity retrievals using high fidelity numerical simulations. We find that while aliasing errors are dominant at long wavelengths and multi-day timescales, improving knowledge of high frequency mass variations at these resolutions translates into only modest improvements (i.e. spatial resolution/accuracy) in the ability to measure temporal gravity variations at monthly timescales. This result highlights the reliance on accurate models of high frequency mass variations for gravity processing, and the difficult nature of reducing temporal aliasing errors and their impact on satellite gravity retrievals.
Hessian matrix approach for determining error field sensitivity to coil deviations.
Zhu, Caoxiang; Hudson, Stuart R.; Lazerson, Samuel A.; ...
2018-03-15
The presence of error fields has been shown to degrade plasma confinement and drive instabilities. Error fields can arise from many sources, but are predominantly attributed to deviations in the coil geometry. In this paper, we introduce a Hessian matrix approach for determining error field sensitivity to coil deviations. A primary cost function used for designing stellarator coils, the surface integral of normalized normal field errors, was adopted to evaluate the deviation of the generated magnetic field from the desired magnetic field. The FOCUS code [Zhu et al., Nucl. Fusion 58(1):016008 (2018)] is utilized to provide fast and accurate calculations of the Hessian. The sensitivities of error fields to coil displacements are then determined by the eigenvalues of the Hessian matrix. A proof-of-principle example is given on a CNT-like configuration. We anticipate that this new method could provide information to avoid dominant coil misalignments and simplify coil designs for stellarators.
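The core numerical step is an eigen-decomposition of the Hessian of the coil cost function. A minimal sketch with a random symmetric stand-in matrix (in practice FOCUS supplies the Hessian):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((12, 12))
H = A @ A.T                        # symmetric positive semi-definite stand-in

vals, vecs = np.linalg.eigh(H)     # ascending eigenvalues, orthonormal eigenvectors
worst = vecs[:, -1]                # coil-deviation direction with largest eigenvalue
print("most sensitive eigenvalue:", vals[-1])
print("corresponding deviation direction:", np.round(worst, 3))
```

The largest-eigenvalue eigenvector identifies the coil-displacement pattern that degrades the error field fastest, which is the misalignment to guard against first.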
Study of run time errors of the ATLAS pixel detector in the 2012 data taking period
NASA Astrophysics Data System (ADS)
Gandrajula, Reddy Pratap
The high-resolution silicon pixel detector is critical to event vertex reconstruction and particle track reconstruction in the ATLAS detector. During pixel data-taking operation, some modules (silicon pixel sensor + front-end chip + module control chip (MCC)) go into an auto-disable state in which they stop sending data for storage; they become operational again after reconfiguration. The source of the problem is not fully understood. One possible source is the occurrence of single event upsets (SEUs) in the MCC. Such a module goes into either a Timeout or a Busy state. This report presents a study of the different types and rates of errors occurring during pixel data-taking operation, including the dependence of the error rate on the pixel detector geometry.
Schwanda, C; Abe, K; Abe, K; Abe, T; Adachi, I; Aihara, H; Akatsu, M; Asano, Y; Aushev, T; Bahinipati, S; Bakich, A M; Ban, Y; Banas, E; Bay, A; Bizjak, I; Bondar, A; Bozek, A; Bracko, M; Browder, T E; Chang, M-C; Chao, Y; Cheon, B G; Choi, Y; Choi, Y K; Chuvikov, A; Cole, S; Danilov, M; Dash, M; Dong, L Y; Drutskoy, A; Eidelman, S; Eiges, V; Gabyshev, N; Gershon, T; Gokhroo, G; Golob, B; Hazumi, M; Higuchi, I; Hinz, L; Hokuue, T; Hoshi, Y; Hou, W-S; Huang, H-C; Iijima, T; Inami, K; Ishikawa, A; Itoh, R; Iwasaki, H; Iwasaki, M; Kang, J H; Kang, J S; Kapusta, P; Katayama, N; Kawai, H; Kichimi, H; Kim, H J; Kinoshita, K; Koppenburg, P; Korpar, S; Krizan, P; Krokovny, P; Kumar, S; Kwon, Y-J; Lange, J S; Leder, G; Lee, S H; Lesiak, T; Li, J; Limosani, A; Lin, S-W; MacNaughton, J; Mandl, F; Matsumoto, T; Matyja, A; Mikami, Y; Mitaroff, W; Miyake, H; Miyata, H; Mori, T; Nagamine, T; Nagasaka, Y; Nakano, E; Nakao, M; Natkaniec, Z; Nishida, S; Nitoh, O; Nozaki, T; Ogawa, S; Ohshima, T; Okabe, T; Okuno, S; Olsen, S L; Onuki, Y; Ostrowicz, W; Ozaki, H; Pakhlov, P; Palka, H; Park, C W; Park, H; Parslow, N; Peak, L S; Piilonen, L E; Sagawa, H; Saitoh, S; Sakai, Y; Sarangi, T R; Schneider, O; Schümann, J; Schwartz, A J; Semenov, S; Senyo, K; Sevior, M E; Shibuya, H; Singh, J B; Soni, N; Stamen, R; Stanic, S; Staric, M; Sumisawa, K; Sumiyoshi, T; Suzuki, S; Tajima, O; Takasaki, F; Tamai, K; Tanaka, M; Teramoto, Y; Tomura, T; Tsukamoto, T; Uehara, S; Uglov, T; Ueno, K; Uno, S; Varner, G; Varvell, K E; Wang, C C; Wang, C H; Yabsley, B D; Yamada, Y; Yamaguchi, A; Yamashita, Y; Yanai, H; Ying, J; Zhang, Z P; Zontar, D; Zürcher, D
2004-09-24
We have searched for the decay B⁺ → ω ℓ⁺ ν (ℓ = e or μ) in 78 fb⁻¹ of Υ(4S) data (85×10⁶ BB̄ events) accumulated with the Belle detector. The final state is fully reconstructed using the ω decay into π⁺π⁻π⁰, combined with detector hermeticity to estimate the neutrino momentum. A signal of 414 ± 125 events is found in the data, corresponding to a branching fraction of (1.3 ± 0.4 ± 0.2 ± 0.3)×10⁻⁴, where the first two errors are statistical and systematic, respectively. The third error reflects the estimated form-factor uncertainty.
A pilot study of the safety implications of Australian nurses' sleep and work hours.
Dorrian, Jillian; Lamond, Nicole; van den Heuvel, Cameron; Pincombe, Jan; Rogers, Ann E; Dawson, Drew
2006-01-01
The frequency and severity of adverse events in Australian healthcare are under increasing scrutiny. A recent state government report identified 31 events involving "death or serious [patient] harm" and 452 "very high risk" incidents. Australia-wide, a previous study identified 2,324 adverse medical events in a single year, with more than half considered preventable. Despite the recognized link between fatigue and error in other industries, to date, few studies of medical errors have assessed the fatigue of the healthcare professionals involved. Nurses work extended and unpredictable hours with a lack of regular breaks and are therefore likely to experience elevated fatigue. Currently, there is very little available information on Australian nurses' sleep or fatigue levels, nor is there any information about whether this affects their performance. This study therefore aims to examine work hours, sleep, fatigue and error occurrence in Australian nurses. Using logbooks, 23 full-time nurses in a metropolitan hospital completed daily recordings for one month (644 days, 377 shifts) of their scheduled and actual work hours, sleep length and quality, sleepiness, and fatigue levels. Frequency and type of nursing errors, near errors, and observed errors (made by others) were recorded. Nurses reported struggling to remain awake during 36% of shifts. Moderate to high levels of stress, physical exhaustion, and mental exhaustion were reported on 23%, 40%, and 36% of shifts, respectively. Extreme drowsiness while driving or cycling home was reported on 45 occasions (11.5%), with three reports of near accidents. Overall, 20 errors, 13 near errors, and 22 observed errors were reported. The perceived potential consequences for the majority of errors were minor; however, 11 errors were associated with moderate and four with potentially severe consequences. Nurses reported that they had trouble falling asleep on 26.8% of days, had frequent arousals on 34.0% of days, and that work-related concerns were either partially or fully responsible for their sleep disruption on 12.5% of occasions. Fourteen of the 23 nurses reported using a sleep aid. The most commonly reported sleep aids were prescription medications (62.7%), followed by alcohol (26.9%). Total sleep duration was significantly shorter on workdays than days off (p < 0.01). In comparison to other workdays, sleep was significantly shorter on days when an error (p < 0.05) or a near error (p < 0.01) was recorded. In contrast, sleep was longer on workdays when someone else's error was recorded (p = 0.08). Logistic regression analysis indicated that sleep duration was a significant predictor of error occurrence (χ² = 6.739, p = 0.009, e^β = 0.727). The findings of this pilot study suggest that Australian nurses experience sleepiness and related physical symptoms at work and during their trip home. Further, a measurable number of errors occur, of various types and severity. Less sleep may lead to an increased likelihood of making an error and, importantly, a decreased likelihood of catching someone else's error. These pilot results suggest that further investigation into the effects of sleep loss in nursing may be necessary for patient safety, from an individual nurse perspective and from a healthcare team perspective.
NASA Astrophysics Data System (ADS)
Yan, Hongxiang; Sun, Ning; Wigmosta, Mark; Skaggs, Richard; Hou, Zhangshuan; Leung, Ruby
2018-02-01
There is a renewed focus on the design of infrastructure resilient to extreme hydrometeorological events. While precipitation-based intensity-duration-frequency (IDF) curves are commonly used as part of infrastructure design, a large percentage of peak runoff events in snow-dominated regions are caused by snowmelt, particularly during rain-on-snow (ROS) events. In these regions, precipitation-based IDF curves may lead to substantial overestimation/underestimation of design basis events and subsequent overdesign/underdesign of infrastructure. To overcome this deficiency, we proposed next-generation IDF (NG-IDF) curves, which characterize the actual water reaching the land surface. We compared NG-IDF curves to standard precipitation-based IDF curves for estimates of extreme events at 376 Snowpack Telemetry (SNOTEL) stations across the western United States that each had at least 30 years of high-quality records. We found standard precipitation-based IDF curves at 45% of the stations were subject to underdesign, many with significant underestimation of 100 year extreme events, for which the precipitation-based IDF curves can underestimate water potentially available for runoff by as much as 125% due to snowmelt and ROS events. The regions with the greatest potential for underdesign were in the Pacific Northwest, the Sierra Nevada Mountains, and the Middle and Southern Rockies. We also found the potential for overdesign at 20% of the stations, primarily in the Middle Rockies and Arizona mountains. These results demonstrate the need to consider snow processes in the development of IDF curves, and they suggest use of the more robust NG-IDF curves for hydrologic design in snow-dominated environments.
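The NG-IDF input quantity is the daily water reaching the land surface: rainfall plus snowmelt, with melt read from day-over-day SWE decreases. A toy sketch on synthetic SNOTEL-like series (the rain/snow partition is ignored here for brevity):

```python
import numpy as np

rng = np.random.default_rng(8)
rain = rng.gamma(0.4, 8.0, size=365)                 # daily rainfall [mm] (toy)
swe = np.concatenate([np.linspace(0, 400, 180),      # toy SWE build-up [mm]...
                      np.linspace(400, 0, 30),       # ...rapid melt-out...
                      np.zeros(155)])                # ...snow-free remainder

melt = np.maximum(0.0, -np.diff(swe, prepend=swe[0]))  # only SWE decreases count
water = rain + melt                                    # water available for runoff

# Compare the two design quantities; NG-IDF curves are built from `water`.
print(f"annual max, water: {water.max():.1f} mm; rain only: {rain.max():.1f} mm")
```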
NASA Astrophysics Data System (ADS)
Ohminato, T.; Kobayashi, T.; Ida, Y.; Fujita, E.
2006-12-01
During the Miyake-jima volcanic activity that started on 26 June 2000, an intense earthquake swarm occurred initially beneath the southwest flank near the summit and gradually migrated west of the island. Volcanic earthquake activity on the island was then reactivated beneath the summit, leading to a summit eruption with a significant summit subsidence on 8 July. We detected numerous small long-period (LP) seismic signals during these activities. Most of them include both 0.2 and 0.4 Hz components, suggesting the existence of a harmonic oscillator. Some of them have a dominant frequency peak at 0.2 Hz (LP1), while others have one at 0.4 Hz (LP2). At the beginning of each waveform of both LP1 and LP2, an impulsive signal with a pulse width of about 2 s is clearly identified. The major axis of the particle motion for the initial impulsive signal is almost horizontal, suggesting a shallow source beneath the summit, while the inclined particle motion for the later phase suggests a deeper source beneath the island. For both LP1 and LP2, we can identify a clear positive correlation between the amplitude of the initial pulse and that of the later phase. We conducted waveform inversions for the LP events assuming a point source and determined the locations and mechanisms simultaneously. We assumed three types of source mechanisms: three single forces, six moment tensor components, and a combination of moment tensor and single forces. We used the AIC to select the optimal solutions. First, we applied the method to the entire waveform, including both the initial pulse and the later phase. The source type with a combination of moment tensor and single force components yields the minimum AIC values for both LP events. However, the spatial distribution of the residual errors tends to have two local minima. Considering the error distribution and the characteristic particle motions, it is likely that the source of the LP event consists of two different parts. We thus divided the LP events into two parts, the initial and the later phases, and applied the same waveform inversion procedure separately to each part of the waveform. The inversion results show that the initial impulsive phase and the later oscillatory phase are well explained by a nearly horizontal single force and a moment solution, respectively. The single force solutions of the initial pulse are positioned at a depth of about 2 km beneath the summit. The single force is oriented initially to the north, and then to the south. On the other hand, the sources of the moment solutions are significantly deeper than the single force solutions. The hypocenter of the later phase of LP1 is located at a depth of 5.5 km in the southern region of the island, while that of LP2 is at 5.1 km beneath the summit. Horizontal oscillations are relatively dominant for both the LP1 and LP2 events. Although the two sources are separated from each other by several kilometers, the positive correlation between the amplitudes of the initial pulse and the later phase strongly suggests that the shallow sources trigger the deeper sources. The source time histories of the six moment tensor components of the later portion of LP1 and LP2 are not in phase. This makes it difficult to extract information on source geometry using the amplitude ratio among moment tensor components in the traditional manner; it may suggest that the source is composed of two independent sources whose oscillations are out of phase.
No rationale for 1 variable per 10 events criterion for binary logistic regression analysis.
van Smeden, Maarten; de Groot, Joris A H; Moons, Karel G M; Collins, Gary S; Altman, Douglas G; Eijkemans, Marinus J C; Reitsma, Johannes B
2016-11-24
Ten events per variable (EPV) is a widely advocated minimal criterion for sample size considerations in logistic regression analysis. Of three previous simulation studies that examined this minimal EPV criterion, only one supports the use of a minimum of 10 EPV. In this paper, we examine the reasons for the substantial differences between these extensive simulation studies. The current study uses Monte Carlo simulations to evaluate small-sample bias, coverage of confidence intervals and mean square error of logit coefficients. Logistic regression models fitted by maximum likelihood and by a modified estimation procedure known as Firth's correction are compared. The results show that, besides EPV, the problems associated with low EPV depend on other factors such as the total sample size. It is also demonstrated that simulation results can be dominated by even a few simulated data sets for which the prediction of the outcome by the covariates is perfect ('separation'). We reveal that different approaches for identifying and handling separation lead to substantially different simulation results. We further show that Firth's correction can be used to improve the accuracy of regression coefficients and alleviate the problems associated with separation. The current evidence supporting EPV rules for binary logistic regression is weak. Given our findings, there is an urgent need for new research to provide guidance on sample size considerations for binary logistic regression analysis.
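The simulation design can be caricatured as follows: generate data at a fixed EPV, fit by maximum likelihood, and inspect the bias of a slope estimate. This is our own simplification, with a crude guard against separated replicates, not the paper's full protocol:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
beta, p_vars, epv = 0.5, 5, 10
n = int(epv * p_vars / 0.5)        # ~50% event fraction -> n for the target EPV

est = []
for _ in range(500):
    X = rng.standard_normal((n, p_vars))
    prob = 1 / (1 + np.exp(-(X @ np.full(p_vars, beta))))
    y = rng.binomial(1, prob)
    if 0 < y.sum() < n:            # skip replicates with an empty outcome class
        try:
            fit = sm.Logit(y, sm.add_constant(X)).fit(disp=False, maxiter=200)
            est.append(fit.params[1])          # first slope coefficient
        except Exception:          # drop separated / non-converged replicates
            pass

print("mean estimate:", np.mean(est), "true:", beta)  # away-from-null bias
```

How the dropped ("separated") replicates are identified and handled is exactly the choice the paper shows can dominate the conclusions.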
Veazey, Lindsay M; Franklin, Erik C; Kelley, Christopher; Rooney, John; Frazer, L Neil; Toonen, Robert J
2016-01-01
Predictive habitat suitability models are powerful tools for cost-effective, statistically robust assessment of the environmental drivers of species distributions. The aim of this study was to develop predictive habitat suitability models for two genera of scleractinian corals (Leptoseris and Montipora) found within the mesophotic zone across the main Hawaiian Islands. The mesophotic zone (30-180 m) is challenging to reach, and therefore historically understudied, because it falls between the maximum limit of SCUBA divers and the minimum typical working depth of submersible vehicles. Here, we implement a logistic regression with rare-events corrections to account for the scarcity of presence observations within the dataset. These corrections reduced the coefficient error and improved overall prediction success (73.6% and 74.3%) for both original regression models. The final models included depth, rugosity, slope, mean current velocity, and wave height as the best environmental covariates for predicting the occurrence of the two genera in the mesophotic zone. Using an objectively selected theta ("presence") threshold, the predicted presence probability values (average of 0.051 for Leptoseris and 0.040 for Montipora) were translated to spatially explicit habitat suitability maps of the main Hawaiian Islands at 25 m grid cell resolution. Our maps are the first of their kind to use extant presence and absence data to examine the habitat preferences of these two dominant mesophotic coral genera across Hawai'i.
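One common rare-events correction, in the spirit of King and Zeng's prior correction, fits on a case-enriched subsample and then adjusts the intercept back to the known prevalence. A generic sketch with synthetic data, not the study's exact estimator:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 20000
X = rng.standard_normal((n, 2))
eta = -4.0 + X @ np.array([1.0, -0.5])
y = rng.binomial(1, 1 / (1 + np.exp(-eta)))          # rare outcome (~2% of rows)

# Case-control style subsample: all presences plus 4 absences per presence
pres, absn = np.flatnonzero(y == 1), np.flatnonzero(y == 0)
keep = np.concatenate([pres, rng.choice(absn, size=4 * pres.size, replace=False)])
fit = sm.Logit(y[keep], sm.add_constant(X[keep])).fit(disp=False)

tau, ybar = y.mean(), y[keep].mean()                 # true vs. sample prevalence
b0 = fit.params[0] - np.log(((1 - tau) / tau) * (ybar / (1 - ybar)))
print(f"corrected intercept: {b0:.2f} (true -4.0, uncorrected {fit.params[0]:.2f})")
```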
Ice-dammed lake drainage evolution at Russell Glacier, west Greenland
NASA Astrophysics Data System (ADS)
Carrivick, Jonathan L.; Tweed, Fiona S.; Ng, Felix; Quincey, Duncan J.; Mallalieu, Joseph; Ingeman-Nielsen, Thomas; Mikkelsen, Andreas B.; Palmer, Steven J.; Yde, Jacob C.; Homer, Rachel; Russell, Andrew J.; Hubbard, Alun
2017-11-01
Glaciological and hydraulic factors that control the timing and mechanisms of glacier lake outburst floods (GLOFs) remain poorly understood. This study used measurements of lake level at fifteen-minute intervals and known lake bathymetry to calculate lake outflow during two GLOF events from the northern margin of Russell Glacier, west Greenland. We used measured ice surface elevation, interpolated subglacial topography and likely conduit geometry to inform a melt enlargement model of the outburst evolution. The model was tuned to best fit the hydrograph's rising limb and timing of peak discharge in both events; it achieved Mean Absolute Errors of < 5%. About one third of the way through the rising limb, conduit melt enlargement became the dominant drainage mechanism. Lake water temperature, which strongly governed the enlargement rate, preconditioned the high peak discharge and short duration of these floods. We hypothesize that both GLOFs were triggered by ice dam flotation, and that localised hydraulic jacking sustained most of their early-stage outflow, explaining the particularly rapid water egress in comparison to that recorded at other ice-marginal lakes. As ice overburden pressure relative to lake water hydraulic head diminished, flow became confined to a subglacial conduit. This study has emphasised the interplay between ice dam thickness and lake level, drainage timing, lake water temperature and, consequently, rising-stage lake outflow and flood evolution.
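To make the melt-enlargement mechanism concrete, here is a deliberately crude toy (not the authors' tuned model): a semicircular conduit carrying Darcy-Weisbach flow grows as frictional dissipation melts its walls. The friction factor, hydraulic gradient, and initial area are all assumed, and creep closure and the lake-temperature term (which the paper finds dominant) are omitted:

```python
import numpy as np

RHO_W, RHO_I, G = 1000.0, 917.0, 9.81   # water/ice density (kg m^-3), gravity
L_FUSION = 3.34e5                       # latent heat of fusion (J kg^-1)
f, grad = 0.1, 0.1                      # friction factor, hydraulic gradient (assumed)

def discharge(S):
    """Darcy-Weisbach discharge through a semicircular conduit of area S."""
    r = np.sqrt(2.0 * S / np.pi)
    R_h = S / ((np.pi + 2.0) * r)            # hydraulic radius
    v = np.sqrt(8.0 * G * R_h * grad / f)    # mean flow velocity
    return v * S

S, dt = 0.5, 60.0                            # initial area (m^2), step (s)
for _ in range(2880):                        # two days of drainage
    Q = discharge(S)
    melt = RHO_W * G * grad * Q / L_FUSION   # ice melted per second per metre
    S += (melt / RHO_I) * dt                 # conduit enlargement
print(f"after 2 days: area {S:.1f} m^2, discharge {discharge(S):.0f} m^3/s")
```

Because discharge scales roughly as S^(5/4), the growth is self-accelerating, which is why the rising limb steepens in the way the abstract describes.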
The effectiveness of pretreatment physics plan review for detecting errors in radiation therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gopan, Olga; Zeng, Jing; Novak, Avrey
Purpose: The pretreatment physics plan review is a standard tool for ensuring treatment quality. Studies have shown that the majority of errors in radiation oncology originate in treatment planning, which underscores the importance of the pretreatment physics plan review. This quality assurance measure is fundamentally important and central to the safety of patients and the quality of care that they receive. However, little is known about its effectiveness. The purpose of this study was to analyze reported incidents to quantify the effectiveness of the pretreatment physics plan review with the goal of improving it. Methods: This study analyzed 522 potentially severe or critical near-miss events within an institutional incident learning system collected over a three-year period. Of these 522 events, 356 originated at a workflow point that was prior to the pretreatment physics plan review. The remaining 166 events originated after the pretreatment physics plan review and were not considered in the study. The applicable 356 events were classified into one of the three categories: (1) events detected by the pretreatment physics plan review, (2) events not detected but "potentially detectable" by the physics review, and (3) events "not detectable" by the physics review. Potentially detectable events were further classified by which specific checks performed during the pretreatment physics plan review detected or could have detected the event. For these events, the associated specific check was also evaluated as to the possibility of automating that check given current data structures. For comparison, a similar analysis was carried out on 81 events from the international SAFRON radiation oncology incident learning system. Results: Of the 356 applicable events from the institutional database, 180/356 (51%) were detected or could have been detected by the pretreatment physics plan review. Of these events, 125 actually passed through the physics review; however, only 38% (47/125) were actually detected at the review. Of the 81 events from the SAFRON database, 66/81 (81%) were potentially detectable by the pretreatment physics plan review. From the institutional database, three specific physics checks were particularly effective at detecting events (combined effectiveness of 38%): verifying the isocenter (39/180), verifying DRRs (17/180), and verifying that the plan matched the prescription (12/180). The most effective checks from the SAFRON database were verifying that the plan matched the prescription (13/66) and verifying the field parameters in the record and verify system against those in the plan (23/66). Software-based plan checking systems, if available, would have potential effectiveness of 29% and 64% at detecting events from the institutional and SAFRON databases, respectively. Conclusions: Pretreatment physics plan review is a key safety measure and can detect a high percentage of errors. However, the majority of errors that potentially could have been detected were not detected in this study, indicating the need to improve the pretreatment physics plan review performance. Suggestions for improvement include the automation of specific physics checks performed during the pretreatment physics plan review and the standardization of the review process.
The effectiveness of pretreatment physics plan review for detecting errors in radiation therapy.
Gopan, Olga; Zeng, Jing; Novak, Avrey; Nyflot, Matthew; Ford, Eric
2016-09-01
The pretreatment physics plan review is a standard tool for ensuring treatment quality. Studies have shown that the majority of errors in radiation oncology originate in treatment planning, which underscores the importance of the pretreatment physics plan review. This quality assurance measure is fundamentally important and central to the safety of patients and the quality of care that they receive. However, little is known about its effectiveness. The purpose of this study was to analyze reported incidents to quantify the effectiveness of the pretreatment physics plan review with the goal of improving it. This study analyzed 522 potentially severe or critical near-miss events within an institutional incident learning system collected over a three-year period. Of these 522 events, 356 originated at a workflow point that was prior to the pretreatment physics plan review. The remaining 166 events originated after the pretreatment physics plan review and were not considered in the study. The applicable 356 events were classified into one of the three categories: (1) events detected by the pretreatment physics plan review, (2) events not detected but "potentially detectable" by the physics review, and (3) events "not detectable" by the physics review. Potentially detectable events were further classified by which specific checks performed during the pretreatment physics plan review detected or could have detected the event. For these events, the associated specific check was also evaluated as to the possibility of automating that check given current data structures. For comparison, a similar analysis was carried out on 81 events from the international SAFRON radiation oncology incident learning system. Of the 356 applicable events from the institutional database, 180/356 (51%) were detected or could have been detected by the pretreatment physics plan review. Of these events, 125 actually passed through the physics review; however, only 38% (47/125) were actually detected at the review. Of the 81 events from the SAFRON database, 66/81 (81%) were potentially detectable by the pretreatment physics plan review. From the institutional database, three specific physics checks were particularly effective at detecting events (combined effectiveness of 38%): verifying the isocenter (39/180), verifying DRRs (17/180), and verifying that the plan matched the prescription (12/180). The most effective checks from the SAFRON database were verifying that the plan matched the prescription (13/66) and verifying the field parameters in the record and verify system against those in the plan (23/66). Software-based plan checking systems, if available, would have potential effectiveness of 29% and 64% at detecting events from the institutional and SAFRON databases, respectively. Pretreatment physics plan review is a key safety measure and can detect a high percentage of errors. However, the majority of errors that potentially could have been detected were not detected in this study, indicating the need to improve the pretreatment physics plan review performance. Suggestions for improvement include the automation of specific physics checks performed during the pretreatment physics plan review and the standardization of the review process.
Eventogram: A Visual Representation of Main Events in Biomedical Signals.
Elgendi, Mohamed
2016-09-22
Biomedical signals carry valuable physiological information, and many researchers have difficulty interpreting and analyzing long-term, one-dimensional, quasi-periodic biomedical signals. Traditionally, biomedical signals are analyzed and visualized using periodogram, spectrogram, and wavelet methods. However, these methods do not offer an informative visualization of the main events within the processed signal. This paper attempts to provide an event-related framework to overcome the drawbacks of the traditional visualization methods and describe the main events within the biomedical signal in terms of duration and morphology. Electrocardiogram and photoplethysmogram signals are used in the analysis to demonstrate the differences between the traditional visualization methods, and their performance is compared against the proposed method, referred to as the "eventogram" in this paper. The proposed method is based on two event-related moving averages that visualize the main time-domain events in the processed biomedical signal. The traditional visualization methods were unable to find dominant events in processed signals, while the eventogram was able to visualize dominant events in signals in terms of duration and morphology. Moreover, eventogram-based detection algorithms succeeded in detecting main events in different biomedical signals with a sensitivity and positive predictivity >95%. The output of the eventogram captured unique patterns and signatures of physiological events, which could be used to visualize and identify abnormal waveforms in any quasi-periodic signal.
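A minimal sketch of the two event-related moving averages idea (the window lengths and the offset beta here are illustrative choices, not the paper's tuned values): a short window tracks the event, a long window tracks the cycle, and 'blocks of interest' appear where the first exceeds the second plus an offset.

```python
import numpy as np

def moving_average(x, w):
    return np.convolve(x, np.ones(w) / w, mode="same")

def blocks_of_interest(sig, w_event, w_cycle, beta=0.08):
    """Candidate event mask: short-window MA exceeds long-window MA + offset."""
    ma_event = moving_average(sig, w_event)   # tracks the event (e.g., QRS)
    ma_cycle = moving_average(sig, w_cycle)   # tracks the cycle (e.g., heartbeat)
    return ma_event > ma_cycle + beta * np.mean(sig)

fs = 100                                      # toy 1 Hz 'beats' sampled at 100 Hz
t = np.arange(0.0, 10.0, 1.0 / fs)
sig = np.clip(np.sin(2 * np.pi * t), 0.0, None) ** 4
mask = blocks_of_interest(sig, w_event=int(0.1 * fs), w_cycle=int(0.6 * fs))
print(f"{mask.mean():.0%} of samples fall inside candidate event blocks")
```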
NASA Astrophysics Data System (ADS)
Possamai, Bianca; Vieira, João P.; Grimm, Alice M.; Garcia, Alexandre M.
2018-03-01
Global climatic phenomena like El Niño events are known to alter hydrological cycles and local abiotic conditions, leading to changes in the structure and dynamics of terrestrial and aquatic biological communities worldwide. Based on a long-term (19 years) standardized sampling of shallow-water estuarine fishes, this study investigated the temporal variability in composition and dominance patterns of trophic guilds in a subtropical estuary (Patos Lagoon estuary, Southern Brazil) and their relationship with local and regional driving forces associated with moderate (2002-2003 and 2009-2010) and very strong (1997-1998 and 2015-2016) El Niño events. Fish species were classified into eight trophic guilds (DTV detritivore, HVP herbivore-phytoplankton, HVM macroalgae herbivore, ISV insectivore, OMN omnivore, PSV piscivore, ZBV zoobenthivore and ZPL zooplanktivore) and their abundances were correlated with environmental factors. Canonical correspondence analysis revealed that less dominant (those comprising < 10% of total abundance) trophic guilds, such as HVP, HVM, ISV, PSV, increased their relative abundance in the estuary during higher rainfall and lower salinity conditions associated with moderate and very strong El Niño events. An opposite pattern was observed for the dominant trophic fish guilds like OMN and, to a lesser extent, DTV and ZPL, which had a greater association with higher values of salinity and water transparency occurring mostly during non-El Niño conditions. In contrast, ZBV's abundance was not correlated with contrasting environmental conditions, but rather had a higher association with samples characterized by intermediate environmental values. Overall, these findings show that moderate and very strong El Niño events did not substantially disrupt the dominance patterns among trophic fish guilds in the estuary. Rather, they increased trophic estuarine diversity by flushing freshwater fishes with distinct feeding habits into the estuary.
Predictive error detection in pianists: a combined ERP and motion capture study
Maidhof, Clemens; Pitkäniemi, Anni; Tervaniemi, Mari
2013-01-01
Performing a piece of music involves the interplay of several cognitive and motor processes and requires extensive training to achieve a high skill level. However, even professional musicians commit errors occasionally. Previous event-related potential (ERP) studies have investigated the neurophysiological correlates of pitch errors during piano performance, and reported pre-error negativity already occurring approximately 70–100 ms before the error had been committed and audible. It was assumed that this pre-error negativity reflects predictive control processes that compare predicted consequences with actual consequences of one's own actions. However, in previous investigations, correct and incorrect pitch events were confounded by their different tempi. In addition, no data about the underlying movements were available. In the present study, we exploratively recorded the ERPs and 3D movement data of pianists' fingers simultaneously while they performed fingering exercises from memory. Results showed a pre-error negativity for incorrect keystrokes when both correct and incorrect keystrokes were performed with comparable tempi. Interestingly, even correct notes immediately preceding erroneous keystrokes elicited a very similar negativity. In addition, we explored the possibility of computing ERPs time-locked to a kinematic landmark in the finger motion trajectories defined by when a finger makes initial contact with the key surface, that is, at the onset of tactile feedback. Results suggest that incorrect notes elicited a small difference after the onset of tactile feedback, whereas correct notes preceding incorrect ones elicited negativity before the onset of tactile feedback. The results tentatively suggest that tactile feedback plays an important role in error-monitoring during piano performance, because the comparison between predicted and actual sensory (tactile) feedback may provide the information necessary for the detection of an upcoming error. PMID:24133428
A crater and its ejecta: An interpretation of Deep Impact
NASA Astrophysics Data System (ADS)
Holsapple, Keith A.; Housen, Kevin R.
2007-03-01
We apply recently updated scaling laws for impact cratering and ejecta to interpret observations of the Deep Impact event. An important question is whether the cratering event was gravity or strength-dominated; the answer gives important clues about the properties of the surface material of Tempel 1. Gravity scaling was assumed in pre-event calculations and has been asserted in initial studies of the mission results. Because the gravity field of Tempel 1 is extremely weak, a gravity-dominated event necessarily implies a surface with essentially zero strength. The conclusion of gravity scaling was based mainly on the interpretation that the impact ejecta plume remained attached to the comet during its evolution. We address that feature here, and conclude that even strength-dominated craters would result in a plume that appeared to remain attached to the surface. We then calculate the plume characteristics from scaling laws for a variety of material types, and for gravity and strength-dominated cases. We find that no model of cratering alone can match the reported observation of plume mass and brightness history. Instead, comet-like acceleration mechanisms such as expanding vapor clouds are required to move the ejected mass to the far field in a few-hour time frame. With such mechanisms, and to within the large uncertainties, either gravity or strength craters can provide the levels of estimated observed mass. Thus, the observations are unlikely to answer the questions about the mechanical nature of the Tempel 1 surface.
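For orientation, the gravity-versus-strength question above is usually framed with point-source pi-group scaling. The sketch below uses the textbook forms with representative constants (K1 and mu, plus all the Tempel 1 numbers, are assumptions for illustration, not the paper's fitted values):

```python
import numpy as np

def cratering_efficiency(a, U, rho, g, Y, K1=0.2, mu=0.41):
    """Crater-to-impactor mass ratio piV in gravity and strength regimes."""
    pi2 = g * a / U**2                        # gravity-scaled size
    pi3 = Y / (rho * U**2)                    # strength-scaled size
    piV_gravity = K1 * pi2 ** (-3.0 * mu / (2.0 + mu))
    piV_strength = K1 * pi3 ** (-3.0 * mu / 2.0)
    return min(piV_gravity, piV_strength)     # the smaller regime controls

# Deep Impact-like numbers, purely illustrative: ~0.2 m equivalent-radius
# impactor at 10.3 km/s into a weak, low-gravity, low-density surface.
print(cratering_efficiency(a=0.2, U=10300.0, rho=400.0, g=3.4e-4, Y=1.0e3))
```

Even a nominal strength of order 1 kPa keeps the toy calculation in the strength regime, which is why the regime question is so sensitive to the assumed surface properties.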
Local blur analysis and phase error correction method for fringe projection profilometry systems.
Rao, Li; Da, Feipeng
2018-05-20
We introduce a flexible error correction method for fringe projection profilometry (FPP) systems in the presence of the local blur phenomenon. Local blur caused by global light transport such as camera defocus, projector defocus, and subsurface scattering will cause significant systematic errors in FPP systems. Previous methods, which adopt high-frequency patterns to separate the direct and global components, fail when the global light phenomenon occurs locally. In this paper, the influence of local blur on phase quality is thoroughly analyzed, and a concise error correction method is proposed to compensate for the phase errors. For the defocus phenomenon, this method can be directly applied. With the aid of spatially varying point spread functions and a local frontal plane assumption, experiments show that the proposed method can effectively alleviate the system errors and improve the final reconstruction accuracy in various scenes. For a subsurface scattering scenario, if the translucent object is dominated by multiple scattering, the proposed method can also be applied to correct systematic errors once the bidirectional scattering-surface reflectance distribution function of the object material is measured.
Umar, Amara; Javaid, Nadeem; Ahmad, Ashfaq; Khan, Zahoor Ali; Qasim, Umar; Alrajeh, Nabil; Hayat, Amir
2015-06-18
Performance enhancement of Underwater Wireless Sensor Networks (UWSNs) in terms of throughput maximization, energy conservation and Bit Error Rate (BER) minimization is a potential research area. However, limited available bandwidth, high propagation delay, highly dynamic network topology, and high error probability lead to performance degradation in these networks. In this regard, many cooperative communication protocols have been developed that investigate either the physical layer or the Medium Access Control (MAC) layer; however, the network layer is still unexplored. More specifically, cooperative routing has not yet been jointly considered with sink mobility. Therefore, this paper aims to enhance network reliability and efficiency via dominating-set-based cooperative routing and sink mobility. The proposed work is validated via simulations, which show the relatively improved performance of the proposed scheme in terms of the selected performance metrics.
Mazor, Kathleen; Roblin, Douglas W; Greene, Sarah M; Fouayzi, Hassan; Gallagher, Thomas H
2016-10-01
Full disclosure of harmful errors to patients, including a statement of regret, an explanation, acceptance of responsibility and commitment to prevent recurrences, is the current standard for physicians in the USA. To examine the extent to which primary care physicians' perceptions of event-level, physician-level and organisation-level factors influence intent to disclose a medical error in challenging situations. Cross-sectional survey containing two hypothetical vignettes: (1) delayed diagnosis of breast cancer, and (2) care coordination breakdown causing a delayed response to patient symptoms. In both cases, multiple physicians shared responsibility for the error, and both involved oncology diagnoses. The study was conducted in the context of the HMO Cancer Research Network Cancer Communication Research Center. Primary care physicians from three integrated healthcare delivery systems located in Washington, Massachusetts and Georgia; responses from 297 participants were included in these analyses. The dependent variable intent to disclose included intent to provide an apology, an explanation, information about the cause and plans for preventing recurrences. Independent variables included event-level factors (responsibility for the event, perceived seriousness of the event, predictions about a lawsuit); physician-level factors (value of patient-centred communication, communication self-efficacy and feelings about practice); organisation-level factors included perceived support for communication and time constraints. A majority of respondents would not fully disclose in either situation. The strongest predictors of disclosure were perceived personal responsibility, perceived seriousness of the event and perceived value of patient-centred communication. These variables were consistently associated with intent to disclose. To make meaningful progress towards improving disclosure, physicians, risk managers, organisational leaders, professional organisations and accreditation bodies need to understand the factors which influence disclosure. Such an understanding is required to inform institutional policies and provider training.
Alfei, Joaquín M.; Ferrer Monti, Roque I.; Molina, Victor A.; Bueno, Adrián M.
2015-01-01
Different mnemonic outcomes have been observed when associative memories are reactivated by CS exposure and followed by amnestics. These outcomes include mere retrieval, destabilization–reconsolidation, a transitional period (which is insensitive to amnestics), and extinction learning. However, little is known about the interaction between initial learning conditions and these outcomes during a reinforced or nonreinforced reactivation. Here we systematically combined temporally specific memories with different reactivation parameters to observe whether these four outcomes are determined by the conditions established during training. First, we validated two training regimens with different temporal expectations about US arrival. Then, using Midazolam (MDZ) as an amnestic agent, fear memories in both learning conditions were submitted to retraining under parameters either identical to or different from the original training. Destabilization (i.e., susceptibility to MDZ) occurred when reactivation was reinforced, provided a temporal prediction error about US arrival occurred. In subsequent experiments, both treatments were systematically reactivated by nonreinforced context exposure of different lengths, which allowed us to explore the interaction between training and reactivation lengths. These results suggest that temporal prediction error and trace dominance determine the extent to which reactivation produces the different outcomes. PMID:26179232
An urban runoff model designed to inform stormwater management decisions.
Beck, Nicole G; Conley, Gary; Kanner, Lisa; Mathias, Margaret
2017-05-15
We present an urban runoff model designed for stormwater managers to quantify the runoff reduction benefits of mitigation actions, with lower input data and user expertise requirements than most commonly used models. The stormwater tool to estimate load reductions (TELR) employs a semi-distributed approach, where landscape characteristics and process representation are spatially lumped within urban catchments on the order of 100 acres (40 ha). Hydrologic computations use a set of metrics that describe a 30-year rainfall distribution, combined with well-tested algorithms for rainfall-runoff transformation and routing, to generate average annual runoff estimates for each catchment. User inputs include the locations and specifications for a range of structural best management practice (BMP) types. The model was tested in a set of urban catchments within the Lake Tahoe Basin of California, USA, where modeled annual flows matched the observed flows within 18% relative error for 5 of the 6 catchments and had good regional performance for a suite of performance metrics. Comparisons with continuous simulation models showed an average of 3% difference from TELR-predicted runoff for a range of hypothetical urban catchments. The model usually identified the dominant BMP outflow components within 5% relative error of event-based measured flow data and simulated the correct proportionality between outflow components. TELR has been implemented as a web-based platform for use by municipal stormwater managers to inform prioritization, report program benefits and meet regulatory reporting requirements (www.swtelr.com).
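TELR's own rainfall-runoff metrics are not reproduced in the abstract, so as a stand-in, here is the kind of well-tested rainfall-runoff transformation such tools commonly build on, the SCS curve-number method (the curve number and storm depths are assumed; units are inches, per the method's convention):

```python
def scs_runoff(p_in, cn):
    """Direct runoff depth (inches) for storm rainfall p_in (inches)."""
    s = 1000.0 / cn - 10.0          # potential maximum retention
    ia = 0.2 * s                    # initial abstraction
    if p_in <= ia:
        return 0.0
    return (p_in - ia) ** 2 / (p_in + 0.8 * s)

# a hypothetical urban catchment (composite CN 85) over a set of storms
total = sum(scs_runoff(p, cn=85) for p in [0.3, 0.8, 1.5, 2.4, 0.5])
print(f"runoff from the storm set: {total:.2f} in")
```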
Starmer, Amy J; Sectish, Theodore C; Simon, Dennis W; Keohane, Carol; McSweeney, Maireade E; Chung, Erica Y; Yoon, Catherine S; Lipsitz, Stuart R; Wassner, Ari J; Harper, Marvin B; Landrigan, Christopher P
2013-12-04
Handoff miscommunications are a leading cause of medical errors. Studies comprehensively assessing handoff improvement programs are lacking. To determine whether introduction of a multifaceted handoff program was associated with reduced rates of medical errors and preventable adverse events, fewer omissions of key data in written handoffs, improved verbal handoffs, and changes in resident-physician workflow. Prospective intervention study of 1255 patient admissions (642 before and 613 after the intervention) involving 84 resident physicians (42 before and 42 after the intervention) from July-September 2009 and November 2009-January 2010 on 2 inpatient units at Boston Children's Hospital. Resident handoff bundle, consisting of standardized communication and handoff training, a verbal mnemonic, and a new team handoff structure. On one unit, a computerized handoff tool linked to the electronic medical record was introduced. The primary outcomes were the rates of medical errors and preventable adverse events measured by daily systematic surveillance. The secondary outcomes were omissions in the printed handoff document and resident time-motion activity. Medical errors decreased from 33.8 per 100 admissions (95% CI, 27.3-40.3) to 18.3 per 100 admissions (95% CI, 14.7-21.9; P < .001), and preventable adverse events decreased from 3.3 per 100 admissions (95% CI, 1.7-4.8) to 1.5 per 100 admissions (95% CI, 0.51-2.4; P = .04) following the intervention. There were fewer omissions of key handoff elements on printed handoff documents, especially on the unit that received the computerized handoff tool (significant reductions of omissions in 11 of 14 categories with the computerized tool; significant reductions in 2 of 14 categories without the computerized tool). Physicians spent a greater percentage of time in a 24-hour period at the patient bedside after the intervention (before: 8.3%; 95% CI, 7.1%-9.8%; after: 10.6%; 95% CI, 9.2%-12.2%; P = .03). The average duration of verbal handoffs per patient did not change. Verbal handoffs were more likely to occur in a quiet location (33.3%; 95% CI, 14.5%-52.2% vs 67.9%; 95% CI, 50.6%-85.2%; P = .03) and in a private location (50.0%; 95% CI, 30%-70% vs 85.7%; 95% CI, 72.8%-98.7%; P = .007) after the intervention. Implementation of a handoff bundle was associated with a significant reduction in medical errors and preventable adverse events among hospitalized children. Improvements in verbal and written handoff processes occurred, and resident workflow did not change adversely.
A predictability study of Lorenz's 28-variable model as a dynamical system
NASA Technical Reports Server (NTRS)
Krishnamurthy, V.
1993-01-01
The dynamics of error growth in a two-layer nonlinear quasi-geostrophic model has been studied to gain an understanding of the mathematical theory of atmospheric predictability. The growth of random errors of varying initial magnitudes has been studied, and the relation between this classical approach and concepts from nonlinear dynamical systems theory has been explored. The local and global growths of random errors have been expressed partly in terms of the properties of an error ellipsoid and the Liapunov exponents determined by linear error dynamics. The local growth of small errors is initially governed by several modes of the evolving error ellipsoid but soon becomes dominated by the longest axis. The average global growth of small errors is exponential with a growth rate consistent with the largest Liapunov exponent. The duration of the exponential growth phase depends on the initial magnitude of the errors. The subsequent large errors undergo a nonlinear growth with a steadily decreasing growth rate and attain saturation that defines the limit of predictability. The degree of chaos and the largest Liapunov exponent show considerable variation with change in the forcing, which implies that time variation in the external forcing can introduce a variable character to the predictability.
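A minimal sketch of how a largest Liapunov exponent is estimated from error growth, using the familiar 3-variable Lorenz system as a stand-in for the 28-variable model (step sizes and iteration counts are arbitrary choices): a tiny perturbation is repeatedly renormalized and the average logarithmic growth rate accumulated.

```python
import numpy as np

def lorenz(x, s=10.0, r=28.0, b=8.0 / 3.0):
    return np.array([s * (x[1] - x[0]),
                     x[0] * (r - x[2]) - x[1],
                     x[0] * x[1] - b * x[2]])

def rk4(x, dt):
    k1 = lorenz(x); k2 = lorenz(x + 0.5 * dt * k1)
    k3 = lorenz(x + 0.5 * dt * k2); k4 = lorenz(x + dt * k3)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

dt, eps, steps = 0.01, 1e-8, 20000
x = np.array([1.0, 1.0, 1.0])
for _ in range(1000):                   # spin-up onto the attractor
    x = rk4(x, dt)
y, total = x + eps * np.array([1.0, 0.0, 0.0]), 0.0
for _ in range(steps):
    x, y = rk4(x, dt), rk4(y, dt)
    err = np.linalg.norm(y - x)
    total += np.log(err / eps)
    y = x + (eps / err) * (y - x)       # renormalize the perturbation
print(f"largest Liapunov exponent ~ {total / (steps * dt):.2f}")  # ~0.9 here
```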
Vegetation and Water Level Changes for the Northeast U.S. During the "8.2 ka Event"
NASA Astrophysics Data System (ADS)
Newby, P. E.; Donnelly, J. P.; Shuman, B.; MacDonald, D.
2006-12-01
Cool conditions, known as the "8.2 ka event", occurred between 8400 and 7900 cal yr B.P. in Greenland, Europe and elsewhere in the North Atlantic. The impact of this brief cool interval on local forests is recorded in radiocarbon-dated, high-resolution pollen stratigraphies for New Long Pond (41°50'N, 70°42'W) and Davis Pond (42°30'N, 73°19'W), Massachusetts. The vegetation response to the event is recorded differently for regions with contrasting soil types. At New Long Pond, the sandy outwash-derived soils are associated with changes in jack/red, white and pitch pine populations, whereas the dominant changes in vegetation for the clay-rich, proglacial-lake-derived soils around Davis Pond are among oak, hemlock, and beech. At both sites, pollen evidence for the "8.2 ka event" may be easily overlooked within the more dominant regional pattern for the Northeast, which shows a shift from dry to moist conditions in conjunction with changes from predominantly white pine to oak with more mesic plant taxa between 9000 and 8000 cal yr B.P. At New Long Pond, the "8.2 ka event" is brief, preceded by a low stand in water level during the early Holocene dominated by white pine pollen. After 9000 cal yr B.P., pitch pine with beech, maple, hop/hornbeam, elm and ash pollen indicate a mixed mesophytic forest. A radiocarbon-dated decrease in loss-on-ignition values at 8400 cal yr B.P., likely related to a drawdown in lake level, distinguishes the "8.2 ka event" and helps highlight subtle shifts in vegetation that favor colder and drier conditions than before the event. Following this brief episode, the pollen data indicate a return to warm and moist conditions until about 5600 years ago. At Davis Pond, increased oak and decreased hemlock pollen abundances, followed by an increase in beech pollen abundance, are evident and may show the dominant regional pollen signature for the "8.2 ka event" in the Northeast. This pattern is also recorded at nearby Berry and North Ponds in western Massachusetts. The appearance of ragweed pollen at both Davis and New Long Pond may indicate perturbations to the vegetation that also relate to the "8.2 ka event".
Zhang, Zhi-Hui; Yang, Guang-Hong
2017-05-01
This paper provides a novel event-triggered fault detection (FD) scheme for discrete-time linear systems. First, an event-triggered interval observer is proposed to generate the upper and lower residuals by taking into account the influence of the disturbances and the event error. Second, the robustness of the residual interval against the disturbances and the sensitivity to faults are improved by introducing l1 and H∞ performances. Third, dilated linear matrix inequalities are used to decouple the Lyapunov matrices from the system matrices. The nonnegative conditions for the estimation error variables are presented with the aid of slack matrix variables. This technique allows considering a more general Lyapunov function. Furthermore, the FD decision scheme is proposed by monitoring whether the zero value belongs to the residual interval. It is shown that the information communication burden is reduced by designing the event-triggering mechanism, while the FD performance can still be guaranteed. Finally, simulation results demonstrate the effectiveness of the proposed method.
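The following toy (not the paper's LMI-based design) illustrates the two ingredients the abstract combines: a send-on-delta event trigger that limits transmissions, and a residual interval that raises a fault alarm only when it excludes zero. The trigger threshold, disturbance bound, and fault size are assumed:

```python
import numpy as np

rng = np.random.default_rng(0)
delta, noise_bound = 0.3, 0.1                   # trigger threshold, |w| bound
y_last, alarms, sent = 0.0, 0, 0
for k in range(200):
    fault = 1.5 if k >= 150 else 0.0            # additive sensor fault
    y = np.sin(0.1 * k) + rng.uniform(-noise_bound, noise_bound) + fault
    if abs(y - y_last) > delta:                 # event-triggering rule
        y_last, sent = y, sent + 1              # transmit only on events
    # residual interval: event error (<= delta) plus disturbance bound
    r = y_last - np.sin(0.1 * k)
    r_lo, r_hi = r - (delta + noise_bound), r + (delta + noise_bound)
    if r_lo > 0 or r_hi < 0:                    # zero outside the interval
        alarms += 1
print(f"transmissions: {sent}/200, alarm samples: {alarms}")
```

Fault-free samples keep zero inside the interval by construction, so alarms concentrate after the fault appears, while far fewer than 200 transmissions occur.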
Vasylkivska, Veronika S.; Huerta, Nicolas J.
2017-06-24
Determining the spatiotemporal characteristics of natural and induced seismic events holds the opportunity to gain new insights into why these events occur. Linking the seismicity characteristics with other geologic, geographic, natural, or anthropogenic factors could help to identify the causes and suggest mitigation strategies that reduce the risk associated with such events. The nearest-neighbor approach utilized in this work represents a practical first step toward identifying statistically correlated clusters of recorded earthquake events. Detailed study of the Oklahoma earthquake catalog's inherent errors, empirical model parameters, and model assumptions is presented. We found that the cluster analysis results are stable with respect to empirical parameters (e.g., fractal dimension) but were sensitive to epicenter location errors and seismicity rates. Most critically, we show that the patterns in the distribution of earthquake clusters in Oklahoma are primarily defined by spatial relationships between events. This observation is a stark contrast to California (also known for induced seismicity) where a comparable cluster distribution is defined by both spatial and temporal interactions between events. These results highlight the difficulty in understanding the mechanisms and behavior of induced seismicity but provide insights for future work.
NASA Astrophysics Data System (ADS)
Vasylkivska, Veronika S.; Huerta, Nicolas J.
2017-07-01
Determining the spatiotemporal characteristics of natural and induced seismic events holds the opportunity to gain new insights into why these events occur. Linking the seismicity characteristics with other geologic, geographic, natural, or anthropogenic factors could help to identify the causes and suggest mitigation strategies that reduce the risk associated with such events. The nearest-neighbor approach utilized in this work represents a practical first step toward identifying statistically correlated clusters of recorded earthquake events. Detailed study of the Oklahoma earthquake catalog's inherent errors, empirical model parameters, and model assumptions is presented. We found that the cluster analysis results are stable with respect to empirical parameters (e.g., fractal dimension) but were sensitive to epicenter location errors and seismicity rates. Most critically, we show that the patterns in the distribution of earthquake clusters in Oklahoma are primarily defined by spatial relationships between events. This observation is a stark contrast to California (also known for induced seismicity) where a comparable cluster distribution is defined by both spatial and temporal interactions between events. These results highlight the difficulty in understanding the mechanisms and behavior of induced seismicity but provide insights for future work.
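A short sketch of the nearest-neighbor event distance commonly used for this kind of cluster analysis (a Zaliapin-style space-time-magnitude metric; the synthetic catalog, b-value, and fractal dimension below are stand-ins, not the Oklahoma catalog):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 300
t = np.sort(rng.uniform(0.0, 1000.0, n))      # event times (days)
xy = rng.uniform(0.0, 100.0, (n, 2))          # epicenters (km)
m = rng.exponential(1.0 / np.log(10.0), n)    # Gutenberg-Richter magnitudes (b=1)

b, df = 1.0, 1.6                              # empirical parameters (assumed)
parent, eta = np.full(n, -1), np.full(n, np.inf)
for j in range(1, n):
    dt = t[j] - t[:j]                                   # waiting times
    dr = np.linalg.norm(xy[:j] - xy[j], axis=1) + 1e-3  # epicentral distances
    d = dt * dr**df * 10.0 ** (-b * m[:j])              # space-time-magnitude metric
    parent[j] = int(np.argmin(d))
    eta[j] = d[parent[j]]

# Small eta marks statistically clustered pairs; a threshold on eta (or on
# its rescaled time and space components) separates clusters from background.
print(f"median nearest-neighbor distance: {np.median(eta[1:]):.3g}")
```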
NASA Astrophysics Data System (ADS)
Zhang, Rong-Hua; Tao, Ling-Jiang; Gao, Chuan
2017-09-01
Large uncertainties exist in real-time predictions of the 2015 El Niño event, which have systematic intensity biases that are strongly model-dependent. It is critically important to characterize those model biases so they can be reduced appropriately. In this study, the conditional nonlinear optimal perturbation (CNOP)-based approach was applied to an intermediate coupled model (ICM) equipped with a four-dimensional variational data assimilation technique. The CNOP-based approach was used to quantify prediction errors that can be attributed to initial conditions (ICs) and model parameters (MPs). Two key MPs were considered in the ICM: one represents the intensity of the thermocline effect, and the other represents the relative coupling intensity between the ocean and atmosphere. Two experiments were performed to illustrate the effects of error corrections, one with a standard simulation and another with an optimized simulation in which errors in the ICs and MPs derived from the CNOP-based approach were optimally corrected. The results indicate that simulations of the 2015 El Niño event can be effectively improved by using CNOP-derived error corrections. In particular, the El Niño intensity in late 2015 was adequately captured when simulations were started from early 2015. Quantitatively, the Niño3.4 SST index simulated in Dec. 2015 increased to 2.8 °C in the optimized simulation, compared with only 1.5 °C in the standard simulation. The feasibility and effectiveness of using the CNOP-based technique to improve ENSO simulations are demonstrated in the context of the 2015 El Niño event. The limitations and further applications are also discussed.
NASA Technical Reports Server (NTRS)
Yang, Shu-Chih; Rienecker, Michele; Keppenne, Christian
2010-01-01
This study investigates the impact of four different ocean analyses on coupled forecasts of the 2006 El Niño event. Forecasts initialized in June 2006 using ocean analyses from an assimilation that uses flow-dependent background error covariances are compared with those using static error covariances that are not flow dependent. The flow-dependent error covariances reflect the error structures related to the background ENSO instability and are generated by the coupled breeding method. The ocean analyses used in this study result from the assimilation of temperature and salinity, with the salinity data available from Argo floats. Of the analyses, the one using information from the coupled bred vectors (BV) replicates the observed equatorial long wave propagation best and exhibits more warming features leading to the 2006 El Niño event. The forecasts initialized from the BV-based analysis agree best with the observations in terms of the growth of the warm anomaly through two warming phases. This better performance is related to the impact of the salinity analysis on the state evolution in the equatorial thermocline. The early warming is traced back to salinity differences in the upper ocean of the equatorial central Pacific, while the second warming, corresponding to the mature phase, is associated with the effect of the salinity assimilation on the depth of the thermocline in the western equatorial Pacific. The series of forecast experiments conducted here show that the structure of the salinity in the initial conditions is important to the forecasts of the extension of the warm pool and the evolution of the 2006 El Niño event.
Prediction skill of rainstorm events over India in the TIGGE weather prediction models
NASA Astrophysics Data System (ADS)
Karuna Sagar, S.; Rajeevan, M.; Vijaya Bhaskara Rao, S.; Mitra, A. K.
2017-12-01
Extreme rainfall events pose a serious threat of leading to severe floods in many countries worldwide. Therefore, advance prediction of their occurrence and spatial distribution is very essential. In this paper, an analysis has been made to assess the skill of numerical weather prediction models in predicting rainstorms over India. Using a gridded daily rainfall data set and objective criteria, 15 rainstorms were identified during the monsoon season (June to September). The analysis was made using three TIGGE (The Observing System Research and Predictability Experiment (THORPEX) Interactive Grand Global Ensemble) models. The models considered are the European Centre for Medium-Range Weather Forecasts (ECMWF), the National Centre for Environmental Prediction (NCEP) and the UK Met Office (UKMO). Verification of the TIGGE models for 43 observed rainstorm days from 15 rainstorm events has been made for the period 2007-2015. The comparison reveals that rainstorm events are predictable up to 5 days in advance, however with a bias in spatial distribution and intensity. Statistical parameters like the mean error (ME) or bias, root mean square error (RMSE) and correlation coefficient (CC) have been computed over the rainstorm region using the multi-model ensemble (MME) mean. The study reveals that the spread is large in ECMWF and UKMO, followed by the NCEP model. Though the ensemble spread is quite small in NCEP, the ensemble member averages are not well predicted. The rank histograms suggest that the forecasts tend toward under-prediction. The modified Contiguous Rain Area (CRA) technique was used to verify the spatial as well as the quantitative skill of the TIGGE models. Overall, the contribution from the displacement and pattern errors to the total RMSE is found to be larger in magnitude. The volume error increases from the 24 hr forecast to the 48 hr forecast in all three models.
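The three verification statistics named above are simple to compute; here is a minimal sketch on stand-in forecast and observation grids (the gamma-distributed 'rainfall' and the bias and noise added to the forecast are purely illustrative):

```python
import numpy as np

def verify(forecast, observed):
    err = forecast - observed
    me = err.mean()                                    # mean error (bias)
    rmse = np.sqrt((err ** 2).mean())                  # root mean square error
    cc = np.corrcoef(forecast.ravel(), observed.ravel())[0, 1]
    return me, rmse, cc

rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 10.0, size=(50, 50))              # rainfall analysis (mm)
fc = obs + rng.normal(2.0, 8.0, size=obs.shape)        # biased, noisy forecast
print("ME=%.2f RMSE=%.2f CC=%.2f" % verify(fc, obs))
```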
A boundary-optimized rejection region test for the two-sample binomial problem.
Gabriel, Erin E; Nason, Martha; Fay, Michael P; Follmann, Dean A
2018-03-30
Testing the equality of 2 proportions for a control group versus a treatment group is a well-researched statistical problem. In some settings, there may be strong historical data that allow one to reliably expect that the control proportion is one, or nearly so. While one-sample tests or comparisons to historical controls could be used, neither can rigorously control the type I error rate in the event the true control rate changes. In this work, we propose an unconditional exact test that exploits the historical information while controlling the type I error rate. We sequentially construct a rejection region by first maximizing the rejection region in the space where all controls have an event, subject to the constraint that our type I error rate does not exceed α for any true event rate; then with any remaining α we maximize the additional rejection region in the space where one control avoids the event, and so on. When the true control event rate is one, our test is the most powerful nonrandomized test for all points in the alternative space. When the true control event rate is nearly one, we demonstrate that our test has equal or higher mean power, averaging over the alternative space, than a variety of well-known tests. For the comparison of 4 controls and 4 treated subjects, our proposed test has higher power than all comparator tests. We demonstrate the properties of our proposed test by simulation and use our method to design a malaria vaccine trial. Published 2017. This article is a U.S. Government work and is in the public domain in the USA.
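The sizing logic behind any unconditional exact test, including the sequential construction described above, is to maximize the null rejection probability over the nuisance parameter. A small sketch for the 4-versus-4 design mentioned in the abstract (the example region is illustrative, not the paper's optimized one):

```python
import numpy as np
from scipy.stats import binom

def exact_size(region, n_c, n_t, grid=np.linspace(1e-6, 1.0, 400)):
    """Worst-case type I error of a rejection region over the nuisance p."""
    worst = 0.0
    for p in grid:                        # H0: both arms share event rate p
        prob = sum(binom.pmf(xc, n_c, p) * binom.pmf(xt, n_t, p)
                   for xc, xt in region)
        worst = max(worst, prob)
    return worst

# 4 vs 4 design: reject when all 4 controls have the event and at most
# one treated subject does; prints roughly 0.023, i.e. a valid 5% test.
region = [(4, 0), (4, 1)]
print(f"worst-case type I error: {exact_size(region, 4, 4):.4f}")
```

The paper's construction then greedily enlarges such a region, starting from outcomes where all controls have the event, until the worst-case size would exceed alpha.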
Effects of extended work shifts and shift work on patient safety, productivity, and employee health.
Keller, Simone M
2009-12-01
It is estimated that 1.3 million health care errors occur each year and that, of those errors, 48,000 to 98,000 result in the deaths of patients (Barger et al., 2006). Errors occur for a variety of reasons, including the effects of extended work hours and shift work. The need for around-the-clock staff coverage has resulted in creative ways to maintain quality patient care, keep health care errors or adverse events to a minimum, and still meet the needs of the organization. One way organizations have attempted to alleviate staff shortages is to create extended work shifts. Instead of the standard 8-hour shift, workers are now working 10, 12, 16, or more hours to provide continuous patient care. Although the literature does support these staffing patterns, it cannot be denied that shifts beyond the traditional 8 hours increase staff fatigue, health care errors, and adverse events and outcomes and decrease alertness and productivity. This article includes a review of current literature on shift work, the definition of shift work, error rates and adverse outcomes related to shift work, health effects on shift workers, shift work effects on older workers, recommended optimal shift length, positive and negative effects of shift work on the shift worker, hazards associated with driving after extended shifts, and implications for occupational health nurses.
Technology utilization to prevent medication errors.
Forni, Allison; Chu, Hanh T; Fanikos, John
2010-01-01
Medication errors have been increasingly recognized as a major cause of iatrogenic illness, and system-wide improvements have been the focus of prevention efforts. Critically ill patients are particularly vulnerable to injury resulting from medication errors because of the severity of illness, the need for high-risk medications with a narrow therapeutic index, and the frequent use of intravenous infusions. Health information technology has been identified as a method to reduce medication errors as well as improve the efficiency and quality of care; however, few studies regarding the impact of health information technology have focused on patients in the intensive care unit. Computerized physician order entry and clinical decision support systems can play a crucial role in decreasing errors in the ordering stage of the medication use process through improving the completeness and legibility of orders, alerting physicians to medication allergies and drug interactions, and providing a means for standardization of practice. Electronic surveillance, reminders and alerts identify patients susceptible to an adverse event, communicate critical changes in a patient's condition, and facilitate timely and appropriate treatment. Bar code technology, intravenous infusion safety systems, and electronic medication administration records can target prevention of errors in medication dispensing and administration where other technologies would not be able to intercept a preventable adverse event. Systems integration and compliance are vital components in the implementation of health information technology and the achievement of a safe medication use process.
Classification and reduction of pilot error
NASA Technical Reports Server (NTRS)
Rogers, W. H.; Logan, A. L.; Boley, G. D.
1989-01-01
Human error is a primary or contributing factor in about two-thirds of commercial aviation accidents worldwide. With the ultimate goal of reducing pilot-error accidents, this contract effort is aimed at understanding the factors underlying error events and reducing the probability of certain types of errors by modifying underlying factors such as flight deck design and procedures. A review of the literature relevant to error classification was conducted. Classification includes categorizing types of errors, the information processing mechanisms and factors underlying them, and identifying factor-mechanism-error relationships. The classification scheme developed by Jens Rasmussen was adopted because it provided a comprehensive yet basic error classification shell or structure that could easily accommodate the addition of details on domain-specific factors. For these purposes, factors specific to the aviation environment were incorporated. Hypotheses concerning the relationship of a small number of underlying factors, information processing mechanisms, and error types identified in the classification scheme were formulated. ASRS data were reviewed and a simulation experiment was performed to evaluate and quantify the hypotheses.
Heritability of refractive error and ocular biometrics: the Genes in Myopia (GEM) twin study.
Dirani, Mohamed; Chamberlain, Matthew; Shekar, Sri N; Islam, Amirul F M; Garoufalis, Pam; Chen, Christine Y; Guymer, Robyn H; Baird, Paul N
2006-11-01
A classic twin study was undertaken to assess the contribution of genes and environment to the development of refractive errors and ocular biometrics in a twin population. A total of 1224 twins (345 monozygotic [MZ] and 267 dizygotic [DZ] twin pairs) aged between 18 and 88 years were examined. All twins completed a questionnaire covering medical history, education, and zygosity. Objective refraction was measured in all twins, and biometric measurements were obtained using partial coherence interferometry. Intrapair correlations for spherical equivalent and ocular biometrics were significantly higher in the MZ than in the DZ twin pairs (P < 0.05) when refraction was considered as a continuous variable. A significant gender difference in the variation of spherical equivalent and ocular biometrics was found (P < 0.05). A genetic model specifying additive, dominant, and unique environmental factors with sex limitation was the best fit for all measured variables. Heritabilities of spherical equivalent of 88% and 75% were found in the men and women, respectively, whereas those of axial length were 94% and 92%, respectively. Additive genetic effects accounted for a greater proportion of the variance in spherical equivalent, whereas the variance in ocular biometrics, particularly axial length, was explained mostly by dominant genetic effects. Genetic factors, both additive and dominant, play a significant role in refractive error (myopia and hypermetropia) as well as in ocular biometrics, particularly axial length. The sex-limitation ADE model (additive genetic, nonadditive genetic, and environmental components) provided the best-fit genetic model for all parameters.
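As a back-of-envelope companion to the twin-correlation logic above: Falconer's formula, and the classic rule of thumb that points toward a dominance (ADE rather than ACE) model. The correlations below are illustrative values in the range such studies report, not the paper's:

```python
def falconer(r_mz, r_dz):
    """Falconer heritability and the classic hint that dominance matters."""
    h2 = 2.0 * (r_mz - r_dz)          # heritability estimate (additive assumptions)
    ade_hint = r_dz < r_mz / 2.0      # r_DZ < r_MZ/2 suggests ADE over ACE
    return h2, ade_hint

h2, dominance = falconer(r_mz=0.85, r_dz=0.40)
print(f"h2 ~ {h2:.2f}; dominance indicated: {dominance}")
```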
Major episodes of geologic change - Correlations, time structure and possible causes
NASA Technical Reports Server (NTRS)
Rampino, Michael R.; Caldeira, Ken
1993-01-01
Published data sets of major geologic events of the past ~250 Myr (extinction events, sea-level lows, continental flood-basalt eruptions, mountain-building events, abrupt changes in sea-floor spreading, ocean-anoxia and black-shale events and the largest evaporite deposits) have been synthesized (with estimated errors). These events show evidence for a statistically significant periodic component with an underlying periodicity, formally equal to 26.6 Myr, and a recent maximum close to the present time. The cycle may not be strictly periodic, but a periodicity of about 30 Myr is robust to probable errors in the dating of the geologic events. The intervals of geologic change seem to involve jumps in sea-floor spreading associated with episodic continental rifting, volcanism, enhanced orogeny, global sea-level changes and fluctuations in climate. The period may represent a purely internal Earth pulsation, but evidence of planetesimal impacts at several extinction boundaries, and a possible underlying cycle of 28-36 Myr in crater ages, suggests that highly energetic impacts may be affecting global tectonics. A cyclic increase in the flux of planetesimals might result from the passage of the Solar System through the central plane of the Milky Way Galaxy - an event with a periodicity and mean phasing similar to that detected in the geologic changes.
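One standard way to test a set of event ages for a shared period is a Rayleigh-type circular statistic scanned over trial periods; the sketch below applies it to synthetic ages built around a 26.6 Myr cycle (all numbers are illustrative, not the paper's data):

```python
import numpy as np

rng = np.random.default_rng(0)
true_period = 26.6
ages = np.sort(np.arange(1, 10) * true_period +
               rng.normal(0.0, 3.0, 9))         # 9 noisy 'event' ages (Myr)

periods = np.linspace(15.0, 40.0, 500)
# Rayleigh statistic: phase coherence of the ages at each trial period
R = [np.abs(np.mean(np.exp(2j * np.pi * ages / P))) for P in periods]
best = periods[int(np.argmax(R))]
print(f"best-fitting period: {best:.1f} Myr (R = {max(R):.2f})")
```

R near 1 means the ages cluster at a common phase of the trial cycle; dating errors of a few Myr broaden but do not destroy the peak, consistent with the robustness claim above.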
Importance of initial and final state effects for azimuthal correlations in p + Pb collisions
Greif, Moritz; Greiner, Carsten; Schenke, Bjorn; ...
2017-11-27
In this work, we investigate the relative importance of initial and final state effects on azimuthal correlations of gluons in low and high multiplicity p+Pb collisions. To achieve this, we couple Yang-Mills dynamics of pre-equilibrium gluon fields (IP-GLASMA) to a perturbative QCD based parton cascade for the final state evolution (BAMPS) on an event-by-event basis. We find that signatures of both the initial state correlations and final state interactions are seen in azimuthal correlation observables, such as v2{2PC}(pT), their strength depending on the event multiplicity and transverse momentum. Initial state correlations dominate v2{2PC}(pT) in low multiplicity events for transverse momenta pT > 2 GeV. Lastly, while final state interactions are dominant in high multiplicity events, initial state correlations affect v2{2PC}(pT) for pT > 2 GeV as well as the pT-integrated v2{2PC}.
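For readers unfamiliar with the observable, v2{2PC} is extracted from two-particle azimuthal correlations; here is a minimal single-event sketch with a built-in elliptic modulation (all parameters illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
v2_true, n_particles = 0.08, 2000

# sample azimuths from dN/dphi ~ 1 + 2*v2*cos(2*phi) by rejection sampling
phi = []
while len(phi) < n_particles:
    cand = rng.uniform(0.0, 2.0 * np.pi)
    if rng.uniform(0.0, 1.0 + 2.0 * v2_true) < 1.0 + 2.0 * v2_true * np.cos(2.0 * cand):
        phi.append(cand)
phi = np.array(phi)

# <cos 2(phi_i - phi_j)> over distinct pairs via the Q-vector shortcut
q2 = np.sum(np.exp(2j * phi))
pairs = (np.abs(q2) ** 2 - n_particles) / (n_particles * (n_particles - 1))
print(f"v2{{2PC}} = {np.sqrt(pairs):.3f} (true value {v2_true})")
```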
ERIC Educational Resources Information Center
Taylor, Matthew A.; Skourides, Andreas; Alvero, Alicia M.
2012-01-01
Interval recording procedures are used by persons who collect data through observation to estimate the cumulative occurrence and nonoccurrence of behavior/events. Although interval recording procedures can increase the efficiency of observational data collection, they can also induce error from the observer. In the present study, 50 observers were…
Patient safety: honoring advanced directives.
Tice, Martha A
2007-02-01
Healthcare providers typically think of patient safety in the context of preventing iatrogenic injury. Prevention of falls and medication or treatment errors is the typical focus of adverse event analyses. If healthcare providers are committed to honoring the wishes of patients, then perhaps failures to honor advanced directives should be viewed as reportable medical errors.
Partial-Interval Estimation of Count: Uncorrected and Poisson-Corrected Error Levels
ERIC Educational Resources Information Center
Yoder, Paul J.; Ledford, Jennifer R.; Harbison, Amy L.; Tapp, Jon T.
2018-01-01
A simulation study that used 3,000 computer-generated event streams with known behavior rates, interval durations, and session durations was conducted to test whether the main and interaction effects of true rate and interval duration affect the error level of uncorrected and Poisson-transformed (i.e., "corrected") count as estimated by…
12 CFR 226.13 - Billing error resolution.
Code of Federal Regulations, 2010 CFR
2010-01-01
... statement of a computational or similar error of an accounting nature that is made by the creditor. (6) A... least 20 days before the end of the billing cycle for which the statement was required. (b) Billing... applicable, within 2 complete billing cycles (but in no event later than 90 days) after receiving a billing...
Dialysis Facility Safety: Processes and Opportunities.
Garrick, Renee; Morey, Rishikesh
2015-01-01
Unintentional human errors are the source of most safety breaches in complex, high-risk environments. The environment of dialysis care is extremely complex. Dialysis patients have unique and changing physiology, and the processes required for their routine care involve numerous open-ended interfaces between providers and an assortment of technologically advanced equipment. Communication errors, both within the dialysis facility and during care transitions, and lapses in compliance with policies and procedures are frequent areas of safety risk. Some events, such as air emboli and needle dislodgments, occur infrequently but are serious risks. Other adverse events include medication errors, patient falls, catheter- and access-related infections, access infiltrations and prolonged bleeding. A robust safety system should evaluate how multiple, sequential errors might align to cause harm. Systems of care can be improved by sharing the results of root cause analyses and "good catches." Failure mode and effects analyses can be used to proactively identify and mitigate areas of highest risk, and methods drawn from cognitive psychology, simulation training, and human factors engineering can be used to advance facility safety. © 2015 Wiley Periodicals, Inc.
Tracking Progress in Improving Diagnosis: A Framework for Defining Undesirable Diagnostic Events.
Olson, Andrew P J; Graber, Mark L; Singh, Hardeep
2018-01-29
Diagnostic error is a prevalent, harmful, and costly phenomenon. Multiple national health care and governmental organizations have recently identified the need to improve diagnostic safety as a high priority. A major barrier, however, is the lack of standardized, reliable methods for measuring diagnostic safety. Given the absence of reliable and valid measures for diagnostic errors, we need methods to help establish some type of baseline diagnostic performance across health systems, as well as to enable researchers and health systems to determine the impact of interventions for improving the diagnostic process. Multiple approaches have been suggested, but none has been widely adopted. We propose a new framework for identifying "undesirable diagnostic events" (UDEs) that health systems, professional organizations, and researchers could further define and develop to enable standardized measurement and reporting related to diagnostic safety. We propose an outline for UDEs that identifies both conditions prone to diagnostic error and the contexts of care in which these errors are likely to occur. Refinement and adoption of this framework across health systems can facilitate standardized measurement and reporting of diagnostic safety.
A logic programming approach to medical errors in imaging.
Rodrigues, Susana; Brandão, Paulo; Nelas, Luís; Neves, José; Alves, Victor
2011-09-01
In 2000, the Institute of Medicine reported disturbing numbers on the scope and impact of medical error in the process of health delivery. A solution to this problem may lie in the adoption of adverse event reporting and learning systems, which can help to identify hazards and risks. It is crucial to apply models that identify the root causes of adverse events and enhance the sharing of knowledge and experience. Progress in efforts to improve patient safety has been frustratingly slow. Some of this lack of progress may be assigned to the lack of systems that take into account the characteristics of the information available about the real world. In our daily lives, we formulate most of our decisions based on incomplete, uncertain and even forbidden or contradictory information. One's knowledge is based less on exact facts and more on hypotheses, perceptions or indications. From the data collected in our adverse event treatment and learning system on medical imaging, and through the use of Extended Logic Programming for knowledge representation and reasoning, together with new methodologies for problem solving based on the notion of agents and multi-agent systems, we intend to generate reports that identify the most relevant causes of error and define improvement strategies, drawing conclusions about the impact, place of occurrence, and form or type of event recorded in the healthcare institutions. The Eindhoven Classification Model was extended and adapted to the medical imaging field and used to classify the root causes of adverse events. Extended Logic Programming was used for knowledge representation with defective information, allowing for the modelling of the universe of discourse in terms of data and knowledge defaults. A systematization of the evolution of the body of knowledge about Quality of Information embedded in the Root Cause Analysis was accomplished. An adverse event reporting and learning system was developed based on the presented approach to medical errors in imaging. The system was deployed in two Portuguese healthcare institutions, with an appealing outcome. It made it possible to verify that the majority of occurrences were concentrated in a few avoidable events. The developed system allowed automatic knowledge extraction, enabling report generation with strategies for the improvement of quality of care. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Solar Energetic Particle Events Observed on Mars with MSL/RAD
NASA Astrophysics Data System (ADS)
Ehresmann, B.; Hassler, D.; Zeitlin, C.; Guo, J.; Wimmer-Schweingruber, R. F.; Appel, J. K.; Boehm, E.; Boettcher, S. I.; Brinza, D. E.; Burmeister, S.; Lohf, H.; Martin-Garcia, C.; Rafkin, S. C.; Posner, A.; Reitz, G.
2016-12-01
The Mars Science Laboratory's Radiation Assessment Detector (MSL/RAD) has been conducting measurements of the ionizing radiation field on the Martian surface since August 2012. While this field is mainly dominated by Galactic Cosmic Rays (GCRs) and their interactions with the atoms in the atmosphere and soil, Solar Energetic Particle (SEP) events can contribute significantly to the radiation environment on short time scales and enhance and dominate, in particular, the Martian surface proton flux. Monitoring and understanding the effects of these SEP events on the radiation environment is of great importance to assess the associated health risks for potential future manned missions to Mars. Furthermore, measurements of the proton spectra during such events aid in the validation of particle transport codes that are used to model the propagation of SEPs through the Martian atmosphere. Comparing the temporal evolution of the SEP event signals detected by MSL/RAD with measurements from other spacecraft can further yield insight into SEP propagation throughout the heliosphere. Here, we present an overview of measurements of the SEP events that have been directly detected on the Martian surface by the MSL/RAD instrument.
RED NOISE VERSUS PLANETARY INTERPRETATIONS IN THE MICROLENSING EVENT OGLE-2013-BLG-0446
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bachelet, E.; Bramich, D. M.; AlSubai, K.
2015-10-20
For all exoplanet candidates, the reliability of a claimed detection needs to be assessed through a careful study of systematic errors in the data, to minimize the false-positive rate. We present a method to investigate such systematics in microlensing data sets, using the microlensing event OGLE-2013-BLG-0446 as a case study. The event was observed from multiple sites around the world, and its high magnification (A_max ∼ 3000) allowed us to investigate the effects of terrestrial and annual parallax. Real-time modeling of the event while it was still ongoing suggested the presence of an extremely low-mass companion (∼3 M_⊕) to the lensing star, leading to substantial follow-up coverage of the light curve. We test and compare different models for the light curve and conclude that the data do not favor the planetary interpretation when systematic errors are taken into account.
Computing in the presence of soft bit errors. [caused by single event upset on spacecraft
NASA Technical Reports Server (NTRS)
Rasmussen, R. D.
1984-01-01
It is shown that single-event upsets (SEUs) due to cosmic rays are a significant source of single-bit errors in spacecraft computers. The physical mechanism of SEU, electron-hole generation by means of Linear Energy Transfer (LET), is discussed with reference made to the results of a study of the environmental effects on computer systems of the Galileo spacecraft. Techniques for making software more tolerant of cosmic ray effects are considered, including: reducing the number of registers used by the software; continuity testing of variables; redundant execution of major procedures for error detection; and encoding state variables to detect single-bit changes. Attention is also given to design modifications which may reduce the cosmic ray exposure of on-board hardware. These modifications include: shielding components operating in LEO; removing low-power Schottky parts; and the use of CMOS diodes. The SEU parameters of different electronic components are listed in a table.
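As an illustration of two of the software techniques listed above -- redundant execution with comparison, and encoding of state variables -- here is a minimal sketch (mine, not the report's; flight code would be written in the processor's native language and would pair each detection with a recovery action):

```python
def parity(word: int) -> int:
    """Even-parity bit of a 32-bit word."""
    p = 0
    for _ in range(32):
        p ^= word & 1
        word >>= 1
    return p

def encode_state(value: int) -> tuple:
    """Store a state variable together with its parity bit."""
    return (value, parity(value))

def state_intact(encoded) -> bool:
    """A single-bit flip in the stored value changes its parity and is caught."""
    value, p = encoded
    return parity(value) == p

def run_redundant(procedure, *args):
    """Execute a procedure twice and compare results (temporal redundancy).
    A transient SEU during one execution shows up as a mismatch."""
    first = procedure(*args)
    second = procedure(*args)
    if first != second:
        raise RuntimeError("SEU suspected: redundant executions disagree")
    return first
```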
NASA Astrophysics Data System (ADS)
Kobayashi, T.; Ohminato, T.; Fujita, E.; Ida, Y.
2002-12-01
The volcanic activity of Miyake-jima started at 18:30 (JST) on June 26, 2000 with large ground deformation and earthquake swarms. The seismic activity started at the southern part of the island. The hypocenter distribution migrated northwestward and moved off the island by early in the morning of June 27. Low frequency (LF) earthquakes with dominant frequencies of 0.2 and 0.4 Hz were first observed in the afternoon of June 27. The LF activity lasted until the first summit eruption on July 8. The Earthquake Research Institute of the University of Tokyo and the National Research Institute for Earth Science and Disaster Prevention deployed 3 CMG-3T and 4 STS-2 broadband seismometers on the island. More than 300 LF earthquakes were detected during the period from June 27 to July 8. Most of the LF events whose dominant frequency is 0.2 Hz occurred before July 1, while LF events with a dominant frequency of 0.4 Hz mainly occurred after July 2. We determine hypocenters of these LF events using the following technique. For each LF event, we assume a source location on a grid point in a homogeneous half-space. A reference station is chosen among all the stations. The cross-correlation coefficients are computed between the waveform of the reference station and those of the other stations. Then, the coefficients for all the stations are summed. In the same manner, summations of the coefficients are computed grid by grid. The grid point that gives the maximum value of the sum of the coefficients is regarded as the best estimate of the source location of the LF event under consideration. The result shows that hypocenters of LF events are spread over the southern to western part of the island and that they migrate from south to west day by day. Hypocenter migrations associated with volcanic activity have often been reported, but usually for short-period events. This is one of the remarkable cases in which a migration of earthquakes with dominant frequencies as low as 0.2 and 0.4 Hz is clearly observed.
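A minimal sketch of the grid-search location procedure described above, under simplifying assumptions of mine (straight rays in a homogeneous half-space, circular shifting in place of proper windowing); it is a paraphrase, not the authors' code:

```python
import numpy as np

def locate_lf_event(waveforms, station_xyz, grid_xyz, v, dt, ref=0):
    """waveforms: (n_sta, n_samp) array; station_xyz: (n_sta, 3) in km;
    grid_xyz: (n_grid, 3) candidate sources in km; v: half-space velocity
    (km/s); dt: sample interval (s). Returns the grid point whose predicted
    moveout maximizes the summed cross-correlation with the reference trace."""
    best_score, best_xyz = -np.inf, None
    for gp in grid_xyz:
        tt = np.linalg.norm(station_xyz - gp, axis=1) / v     # travel times
        lags = np.round((tt - tt[ref]) / dt).astype(int)      # relative delays
        score = 0.0
        for i, w in enumerate(waveforms):
            aligned = np.roll(w, -lags[i])                    # undo moveout
            score += np.corrcoef(waveforms[ref], aligned)[0, 1]
        if score > best_score:
            best_score, best_xyz = score, gp
    return best_xyz, best_score
```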
Fitting Photometry of Blended Microlensing Events
NASA Astrophysics Data System (ADS)
Thomas, Christian L.; Griest, Kim
2006-03-01
We reexamine the usefulness of fitting blended light-curve models to microlensing photometric data. We find agreement with previous workers (e.g., Woźniak & Paczyński) that this is a difficult proposition because of the degeneracy of the blend fraction with other fit parameters. We show that follow-up observations at specific points along the light curve (peak region and wings) of high-magnification events are the most helpful in removing degeneracies. We also show that very small errors in the baseline magnitude can result in problems in measuring the blend fraction, and we study the importance of non-Gaussian errors in the fit results. The biases and skewness in the distribution of the recovered blend fraction are discussed. We also find a new approximation formula relating the blend fraction and the unblended fit parameters to the underlying event duration needed to estimate the microlensing optical depth.
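For reference, the blended point-source point-lens model underlying such fits has the standard Paczyński form below (a sketch with generic parameter names, not the authors' code); the blend fraction enters the flux linearly, which is what makes it trade off against u0 and tE:

```python
import numpy as np

def paczynski_magnification(t, t0, tE, u0):
    """Point-lens magnification; t0: peak time, tE: event duration,
    u0: impact parameter in Einstein radii."""
    u = np.sqrt(u0**2 + ((t - t0) / tE)**2)
    return (u**2 + 2) / (u * np.sqrt(u**2 + 4))

def blended_flux(t, t0, tE, u0, f_blend, F_base):
    """Observed flux: only the source fraction f_blend of the baseline
    flux F_base is magnified; the rest is unlensed blend light."""
    A = paczynski_magnification(t, t0, tE, u0)
    return F_base * (f_blend * A + (1.0 - f_blend))
```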
Earthquake relocation near the Leech River Fault, southern Vancouver Island
NASA Astrophysics Data System (ADS)
Li, G.; Liu, Y.; Regalla, C.
2015-12-01
The Leech River Fault (LRF), a northeast-dipping thrust, extends across the southern tip of Vancouver Island in southwest British Columbia, where the local tectonic regime is dominated by the subduction of the Juan de Fuca plate beneath the North American plate at a present rate of 40-50 mm/year. The British Columbia geologic map (Geoscience Map 2009-1A) shows that this area also contains many crosscutting minor faults in addition to the San Juan Fault north of the LRF. To investigate the seismic evidence for the subsurface structures of these minor faults and for possible hidden active structures in this area, precise earthquake locations are required. In this study, we relocate 941 earthquakes reported in the Canadian National Seismograph Network (CNSN) catalog from 2000 to 2015 within a 100 km x 55 km study area surrounding the LRF. We use the HypoDD [Waldhauser, 2001] double-difference relocation method, combining P/S phase arrivals provided by the CNSN at 169 stations with waveform data having correlation coefficient values greater than 0.7 at 50 common stations and event separations less than 10 km. A total of 900 out of the 931 events satisfy the above relocation criteria. The velocity model used is a 1-D model extracted from the Ramachandran et al. (2005) model. Average relative location errors estimated by the bootstrap method are 546.5 m (horizontal) and 1128.6 m (depth). Absolute errors reported by the SVD method for individual clusters are ~100 m in both dimensions. We select 5 clusters visually according to their epicenters (see figure). Cluster 1 is parallel to the LRF and a thrust, FID #60. Clusters 2 and 3 are bounded by two faults: FID #75, a northeast-dipping thrust marking the southwestern boundary of the Wrangellia terrane, and FID #2, marking the northern boundary. Clusters 4 and 5, to the northeast and northwest of Victoria respectively, however, do not correspond to the surface traces of any mapped faults. The depth profile of Cluster 5 depicts a hidden northeast-dipping structure, while the other clusters illustrate near-vertical structures. Seismicity of Clusters 1 and 3 suggests vertically dipping patterns for FID #60 and FID #2, while Cluster 4 may reveal a hidden vertically dipping structure. It is noteworthy that most events in this area are deeper than 20 km, but the explanation for such deep earthquakes is still unclear.
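For orientation, the datum at the core of HypoDD-style double-difference relocation is sketched below (illustrative only; HypoDD itself uses a layered 1-D velocity model and iterative least squares, not the uniform-velocity stand-in here):

```python
import numpy as np

def dd_residual(t_obs_i, t_obs_j, xyz_i, xyz_j, station_xyz, v):
    """Double-difference residual for event pair (i, j) at one station:
    observed differential arrival time minus the differential travel time
    predicted from trial hypocenters, here with straight rays at speed v."""
    t_pred_i = np.linalg.norm(station_xyz - xyz_i) / v
    t_pred_j = np.linalg.norm(station_xyz - xyz_j) / v
    return (t_obs_i - t_obs_j) - (t_pred_i - t_pred_j)
```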
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kertzscher, Gustavo, E-mail: guke@dtu.dk; Andersen, Claus E., E-mail: clan@dtu.dk; Tanderup, Kari, E-mail: karitand@rm.dk
Purpose: This study presents an adaptive error detection algorithm (AEDA) for real-time in vivo point dosimetry during high dose rate (HDR) or pulsed dose rate (PDR) brachytherapy (BT), where the error identification, in contrast to existing approaches, does not depend on an a priori reconstruction of the dosimeter position. Instead, the treatment is judged based on dose rate comparisons between measurements and calculations of the most viable dosimeter position provided by the AEDA in a data driven approach. As a result, the AEDA compensates for false error cases related to systematic effects of the dosimeter position reconstruction. Given its nearly exclusive dependence on stable dosimeter positioning, the AEDA allows for a substantially simplified and time efficient real-time in vivo BT dosimetry implementation. Methods: In the event of a measured potential treatment error, the AEDA proposes the most viable dosimeter position out of alternatives to the original reconstruction by means of a data driven matching procedure between dose rate distributions. If measured dose rates do not differ significantly from the most viable alternative, the initial error indication may be attributed to a mispositioned or misreconstructed dosimeter (false error). However, if the error declaration persists, no viable dosimeter position can be found to explain the error; hence the discrepancy is more likely to originate from a misplaced or misreconstructed source applicator or from erroneously connected source guide tubes (true error). Results: The AEDA applied to two in vivo dosimetry implementations for pulsed dose rate BT demonstrated that the AEDA correctly described effects responsible for initial error indications. The AEDA was able to correctly identify the major part of all permutations of simulated guide tube swap errors and simulated shifts of individual needles from the original reconstruction. Unidentified errors corresponded to scenarios where the dosimeter position was sufficiently symmetric with respect to error and no-error source position constellations. The AEDA was able to correctly identify all false errors represented by mispositioned dosimeters, contrary to an error detection algorithm relying on the original reconstruction. Conclusions: The study demonstrates that the AEDA error identification during HDR/PDR BT relies on a stable dosimeter position rather than on an accurate dosimeter reconstruction, and the AEDA's capacity to distinguish between true and false error scenarios. The study further shows that the AEDA can offer guidance in decision making in the event of potential errors detected with real-time in vivo point dosimetry.
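The data-driven matching idea can be sketched as follows (my paraphrase under assumed details, such as the 10% agreement threshold and the median misfit statistic; the published AEDA is more elaborate):

```python
import numpy as np

def classify_error(measured, calculated_by_position, tol=0.10):
    """measured: (n_dwell,) measured dose rates; calculated_by_position:
    dict mapping candidate dosimeter position -> (n_dwell,) calculated dose
    rates (all positive); tol: relative agreement threshold (assumed)."""
    best_pos, best_misfit = None, np.inf
    for pos, calc in calculated_by_position.items():
        misfit = np.median(np.abs(measured - calc) / calc)
        if misfit < best_misfit:
            best_pos, best_misfit = pos, misfit
    if best_misfit <= tol:
        # some dosimeter position explains the data: likely a false error
        return "false error (most viable dosimeter position: %s)" % (best_pos,)
    # no candidate position explains the data: likely a true delivery error
    return "true error (no viable dosimeter position explains the data)"
```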
A Nonlinear Adaptive Filter for Gyro Thermal Bias Error Cancellation
NASA Technical Reports Server (NTRS)
Galante, Joseph M.; Sanner, Robert M.
2012-01-01
Deterministic errors in angular rate gyros, such as thermal biases, can have a significant impact on spacecraft attitude knowledge. In particular, thermal biases are often the dominant error source in MEMS gyros after calibration. Filters, such as MEKFs, are commonly used to mitigate the impact of gyro errors and gyro noise on spacecraft closed-loop pointing accuracy, but often have difficulty in rapidly changing thermal environments and can be computationally expensive. In this report an existing nonlinear adaptive filter is used as the basis for a new nonlinear adaptive filter designed to estimate and cancel thermal bias effects. A description of the filter is presented along with an implementation suitable for discrete-time applications. A simulation analysis demonstrates the performance of the filter in the presence of noisy measurements and provides a comparison with existing techniques.
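As a rough illustration of adaptive thermal-bias cancellation of this general kind (a generic gradient-adaptation sketch; the linear-in-temperature bias model, the available reference rate, and the gain value are assumptions of mine, not the report's filter):

```python
import numpy as np

def adaptive_bias_cancel(omega_meas, temp, omega_ref, gamma=1e-3):
    """omega_meas: measured gyro rate; temp: gyro temperature (regressor);
    omega_ref: reference rate from another sensor (e.g., star-tracker
    derived); gamma: adaptation gain (assumed). Models the bias as
    a + b*T and adapts (a, b) by gradient descent on the rate error."""
    theta = np.zeros(2)                       # [a, b] bias model parameters
    out = np.empty_like(omega_meas)
    for k in range(len(omega_meas)):
        phi = np.array([1.0, temp[k]])        # regressor
        bias_hat = theta @ phi
        out[k] = omega_meas[k] - bias_hat     # corrected rate
        err = out[k] - omega_ref[k]           # residual rate error
        theta += gamma * err * phi            # MIT-rule style update
    return out, theta
```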
Nya-Ngatchou, Jean-Jacques; Corl, Dawn; Onstad, Susan; Yin, Tom; Tylee, Tracy; Suhr, Louise; Thompson, Rachel E; Wisse, Brent E
2015-02-01
Hypoglycaemia is associated with morbidity and mortality in critically ill patients, and many hospitals have programmes to minimize hypoglycaemia rates. Recent studies have established the hypoglycaemic patient-day as a key metric and have published benchmark inpatient hypoglycaemia rates on the basis of point-of-care blood glucose data even though these values are prone to measurement errors. A retrospective, cohort study including all patients admitted to Harborview Medical Center Intensive Care Units (ICUs) during 2010 and 2011 was conducted to evaluate a quality improvement programme to reduce inappropriate documentation of point-of-care blood glucose measurement errors. Laboratory Medicine point-of-care blood glucose data and patient charts were reviewed to evaluate all episodes of hypoglycaemia. A quality improvement intervention decreased measurement errors from 31% of hypoglycaemic (<70 mg/dL) patient-days in 2010 to 14% in 2011 (p < 0.001) and decreased the observed hypoglycaemia rate from 4.3% of ICU patient-days to 3.4% (p < 0.001). Hypoglycaemic events were frequently recurrent or prolonged (~40%), and these events are not identified by the hypoglycaemic patient-day metric, which also may be confounded by a large number of very low risk or minimally monitored patient-days. Documentation of point-of-care blood glucose measurement errors likely overestimates ICU hypoglycaemia rates and can be reduced by a quality improvement effort. The currently used hypoglycaemic patient-day metric does not evaluate recurrent or prolonged events that may be more likely to cause patient harm. The monitored patient-day as currently defined may not be the optimal denominator to determine inpatient hypoglycaemic risk. Copyright © 2014 John Wiley & Sons, Ltd.
Hydrologic Design in the Anthropocene
NASA Astrophysics Data System (ADS)
Vogel, R. M.; Farmer, W. H.; Read, L.
2014-12-01
In an era dubbed the Anthropocene, the natural world is being transformed by a myriad of human influences. As anthropogenic impacts permeate hydrologic systems, hydrologists are challenged to fully account for such changes and develop new methods of hydrologic design. Deterministic watershed models (DWMs), which can account for the impacts of changes in land use, climate and infrastructure, are becoming increasingly popular for the design of flood and/or drought protection measures. As with all models that are calibrated to existing datasets, DWMs are subject to model error or uncertainty. In practice, the model error component of DWM predictions is typically ignored, yet DWM simulations which ignore model error produce output which cannot reproduce the statistical properties of the observations they are intended to replicate. In the context of hydrologic design, we demonstrate how ignoring model error can lead to systematic downward bias in flood quantiles, upward bias in drought quantiles and upward bias in water supply yields. By reincorporating model error, we document how DWMs can be used to generate results that mimic actual observations and preserve their statistical behavior. In addition to the use of DWMs for improved predictions in a changing world, improved communication of risk and reliability is also needed. Traditional statements of risk and reliability in hydrologic design have been characterized by return periods, but such statements often assume that the annual probability of experiencing a design event remains constant throughout the project horizon. We document the general impact of nonstationarity on the average return period and reliability in the context of hydrologic design. Our analyses reveal that return periods do not provide meaningful expressions of the likelihood of future hydrologic events. Instead, knowledge of system reliability over future planning horizons can more effectively prepare society and communicate the likelihood of future hydrologic events of interest.
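The return-period point can be made concrete with a small worked example (illustrative numbers only): with a time-varying annual exceedance probability p_t, the reliability over an n-year horizon is the product of (1 - p_t), which a fixed "T-year" label no longer summarizes.

```python
import numpy as np

def reliability(p):
    """Probability of no design-event exceedance over the planning horizon."""
    return np.prod(1.0 - np.asarray(p))

# Stationary case: p = 1/100 every year for 50 years.
p_stat = np.full(50, 0.01)
# Nonstationary case: exceedance probability doubles linearly over 50 years.
p_nonstat = np.linspace(0.01, 0.02, 50)

print(reliability(p_stat))     # ~0.605
print(reliability(p_nonstat))  # ~0.47 -- same "100-year" label in year 1,
                               # but materially lower horizon reliability
```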
Liu, Ying; Hu, Huijing; Jones, Jeffery A; Guo, Zhiqiang; Li, Weifeng; Chen, Xi; Liu, Peng; Liu, Hanjun
2015-08-01
Speakers rapidly adjust their ongoing vocal productions to compensate for errors they hear in their auditory feedback. It is currently unclear what role attention plays in these vocal compensations. This event-related potential (ERP) study examined the influence of selective and divided attention on the vocal and cortical responses to pitch errors heard in the auditory feedback of ongoing vocalisations. During the production of a sustained vowel, participants briefly heard their vocal pitch shifted up two semitones while they actively attended to auditory or visual events (selective attention), or to both auditory and visual events (divided attention), or were not told to attend to either modality (control condition). The behavioural results showed that attending to the pitch perturbations elicited larger vocal compensations than attending to the visual stimuli. Moreover, ERPs were likewise sensitive to the attentional manipulations: P2 responses to pitch perturbations were larger when participants attended to the auditory stimuli compared to when they attended to the visual stimuli, and compared to when they were not explicitly told to attend to either the visual or auditory stimuli. By contrast, dividing attention between the auditory and visual modalities caused suppressed P2 responses relative to all the other conditions and caused enhanced N1 responses relative to the control condition. These findings provide strong evidence for the influence of attention on the mechanisms underlying the auditory-vocal integration in the processing of pitch feedback errors. In addition, selective attention and divided attention appear to modulate the neurobehavioral processing of pitch feedback errors in different ways. © 2015 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
Fong, Allan; Harriott, Nicole; Walters, Donna M; Foley, Hanan; Morrissey, Richard; Ratwani, Raj R
2017-08-01
Many healthcare providers have implemented patient safety event reporting systems to better understand and improve patient safety. Reviewing and analyzing these reports is often time consuming and resource intensive because of both the quantity of reports and the length of the free-text descriptions in the reports. Natural language processing (NLP) experts collaborated with clinical experts on a patient safety committee to assist in the identification and analysis of medication-related patient safety events. Different NLP algorithmic approaches were developed to identify four types of medication-related patient safety events, and the models were compared. Well-performing NLP models were generated to categorize medication-related events into pharmacy delivery delays, dispensing errors, Pyxis discrepancies, and prescriber errors, with receiver operating characteristic areas under the curve of 0.96, 0.87, 0.96, and 0.81, respectively. We also found that modeling the brief description without the resolution text generally improved model performance. These models were integrated into a dashboard visualization to support the patient safety committee review process. We demonstrate the capabilities of various NLP models and the use of two text-inclusion strategies at categorizing medication-related patient safety events. The NLP models and visualization could be used to improve the efficiency of patient safety event data review and analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
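One plausible shape for such a text classifier is sketched below (an assumption on my part -- the paper compared several NLP approaches and does not necessarily use this pipeline; the label names are paraphrased from the four categories above): a single multiclass TF-IDF plus logistic-regression model over the free-text event description.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

LABELS = ["pharmacy_delivery_delay", "dispensing_error",
          "pyxis_discrepancy", "prescriber_error"]

def train_classifier(texts, labels):
    """texts: free-text event descriptions; labels: one of LABELS each.
    Returns a fitted pipeline whose .predict() assigns a category."""
    model = make_pipeline(
        TfidfVectorizer(ngram_range=(1, 2), min_df=2),
        LogisticRegression(max_iter=1000),
    )
    model.fit(texts, labels)
    return model
```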
Wilcox, Kevin R; von Fischer, Joseph C; Muscha, Jennifer M; Petersen, Mark K; Knapp, Alan K
2015-01-01
Intensification of the global hydrological cycle with atmospheric warming is expected to increase interannual variation in precipitation amount and the frequency of extreme precipitation events. Although studies in grasslands have shown sensitivity of aboveground net primary productivity (ANPP) to both precipitation amount and event size, we lack equivalent knowledge for responses of belowground net primary productivity (BNPP) and NPP. We conducted a 2-year experiment in three US Great Plains grasslands--the C4-dominated shortgrass prairie (SGP; low ANPP) and tallgrass prairie (TGP; high ANPP), and the C3-dominated northern mixed grass prairie (NMP; intermediate ANPP)--to test three predictions: (i) both ANPP and BNPP responses to increased precipitation amount would vary inversely with mean annual precipitation (MAP) and site productivity; (ii) increased numbers of extreme rainfall events during high-rainfall years would affect high and low MAP sites differently; and (iii) responses belowground would mirror those aboveground. We increased growing season precipitation by as much as 50% by augmenting natural rainfall via (i) many (11-13) small or (ii) fewer (3-5) large watering events, with the latter coinciding with naturally occurring large storms. Both ANPP and BNPP increased with water addition in the two C4 grasslands, with greater ANPP sensitivity in TGP, but greater BNPP and NPP sensitivity in SGP. ANPP and BNPP did not respond to any rainfall manipulations in the C3-dominated NMP. Consistent with previous studies, fewer larger (extreme) rainfall events increased ANPP relative to many small events in SGP, but event size had no effect in TGP. Neither system responded consistently above- and belowground to event size; consequently, total NPP was insensitive to event size. The diversity of responses observed in these three grassland types underscores the challenge of predicting C-cycle-relevant responses to forecast changes in precipitation regimes, even within relatively homogeneous biomes such as grasslands. © 2014 John Wiley & Sons Ltd.
Acheampong, Franklin; Tetteh, Ashalley Raymond; Anto, Berko Panyin
2016-12-01
This study determined the incidence, types, clinical significance, and potential causes of medication administration errors (MAEs) at the emergency department (ED) of a tertiary health care facility in Ghana. This study used a cross-sectional nonparticipant observational technique. Study participants (nurses) were observed preparing and administering medication at the ED of a 2000-bed tertiary care hospital in Accra, Ghana. The observations were then compared with patients' medication charts, and identified errors were clarified with staff for possible causes. Of the 1332 observations made, involving 338 patients and 49 nurses, 362 had errors, representing 27.2%. However, the error rate excluding "lack of drug availability" fell to 12.8%. Without wrong time error, the error rate was 22.8%. The 2 most frequent error types were omission (n = 281, 77.6%) and wrong time (n = 58, 16%) errors. Omission error was mainly due to unavailability of medicine, 48.9% (n = 177). Although only one of the errors was potentially fatal, 26.7% were definitely clinically severe. The common themes that dominated the probable causes of MAEs were unavailability, staff factors, patient factors, prescription, and communication problems. This study gives credence to similar studies in different settings that MAEs occur frequently in the ED of hospitals. Most of the errors identified were not potentially fatal; however, preventive strategies need to be used to make life-saving processes such as drug administration in such specialized units error-free.
Koppel, Jonathan; Berntsen, Dorthe
2016-01-01
The reminiscence bump has been found for both autobiographical memories and memories of public events. However, there have been few comparisons of the bump across each type of event. In the current study, therefore, we compared the bump for autobiographical memories versus the bump for memories of public events. We did so between-subjects, through two cueing methods administered within-subjects, the cue word method and the important memories method. For word-cued memories, we found a similar bump from ages 5 to 19 for both types of memories. However, the bump was more pronounced for autobiographical memories. For most important memories, we found a bump from ages 20 to 29 in autobiographical memory, but little discernible age pattern for public events. Rather, specific public events (e.g., the Fall of the Berlin Wall) dominated recall, producing a chronological distribution characterised by spikes in citations according to the years these events occurred. Follow-up analyses suggested that the bump in most important autobiographical memories was a function of the cultural life script. Our findings did not yield support for any of the dominant existing accounts of the bump as underlying the bump in word-cued memories.
Bodina, A; Demarchi, A; Castaldi, S
2014-01-01
A web-based incident reporting system (IRS) is a tool allowing healthcare workers to voluntarily and anonymously report adverse events/near misses. In 2010, this system was introduced in a research and teaching hospital in a metropolitan area of northern Italy, in order to detect errors and to learn from failures in care delivery. The aim of this paper is to assess whether and how the IRS has proved to be a valuable tool for managing clinical risk and improving healthcare quality. Adverse events are reported anonymously by staff members using an online template form available on the hospital intranet. We retrospectively reviewed the recorded data for each incident/near miss reported between January 2011 and December 2012. The number of reported incidents/near misses was 521 in 2011 and 442 in 2012. In the two years the admissions were 36,974 and 36,107 respectively. We noticed that nursing staff made more use of the IRS and that reported errors were mostly related to the prescription and administration of medications. Much international literature reports that adverse events and near misses occur in 10% of admissions. Our data are far from that number, which suggests a failure to report adverse events. This consideration, together with the high number of near misses in comparison with errors that actually occurred, leads us to speculate that adverse events with serious consequences for patients are only marginally reported. Probably the lack of strong leadership treating the IRS as an instrument for improving quality, and operators' reluctance to overcome the culture of blame, negatively affect the IRS.
Kupek, Emil
2002-01-01
Background Frequent use of self-reports for investigating recent and past behavior in medical research requires statistical techniques capable of analyzing complex sources of bias associated with this methodology. In particular, although decreasing accuracy in recalling more distant past events is commonplace, the bias due to the differential memory errors resulting from it has rarely been modeled statistically. Methods Covariance structure analysis was used to estimate the recall error of the self-reported number of sexual partners for past periods of varying duration and its implication for the bias. Results The results indicated increasing levels of inaccuracy for reports about the more distant past. Considerable positive bias was found for a small fraction of respondents who reported ten or more partners in the last year, last two years and last five years. This is consistent with the effect of heteroscedastic random error, where the majority of partners had been acquired in the more distant past and therefore were recalled less accurately than the partners acquired more recently to the time of interviewing. Conclusions Memory errors of this type depend on the salience of the events recalled and are likely to be present in many areas of health research based on self-reported behavior. PMID:12435276
Making and monitoring errors based on altered auditory feedback
Pfordresher, Peter Q.; Beasley, Robertson T. E.
2014-01-01
Previous research has demonstrated that altered auditory feedback (AAF) disrupts music performance and causes disruptions in both action planning and the perception of feedback events. It has been proposed that this disruption occurs because of interference within a shared representation for perception and action (Pfordresher, 2006). Studies reported here address this claim from the standpoint of error monitoring. In Experiment 1 participants performed short melodies on a keyboard while hearing no auditory feedback, normal auditory feedback, or alterations to feedback pitch on some subset of events. Participants overestimated error frequency when AAF was present but not for normal feedback. Experiment 2 introduced a concurrent load task to determine whether error monitoring requires executive resources. Although the concurrent task enhanced the effect of AAF, it did not alter participants’ tendency to overestimate errors when AAF was present. A third correlational study addressed whether effects of AAF are reduced for a subset of the population who may lack the kind of perception/action associations that lead to AAF disruption: poor-pitch singers. Effects of manipulations similar to those presented in Experiments 1 and 2 were reduced for these individuals. We propose that these results are consistent with the notion that AAF interference is based on associations between perception and action within a forward internal model of auditory-motor relationships. PMID:25191294
The cost of implementing inpatient bar code medication administration.
Sakowski, Julie Ann; Ketchel, Alan
2013-02-01
To calculate the costs associated with implementing and operating an inpatient bar-code medication administration (BCMA) system in the community hospital setting and to estimate the cost per harmful error prevented. This is a retrospective, observational study. Costs were calculated from the hospital perspective and a cost-consequence analysis was performed to estimate the cost per preventable adverse drug event averted. Costs were collected from financial records and key informant interviews at 4 not-for-profit community hospitals. Costs included direct expenditures on capital, infrastructure, additional personnel, and the opportunity costs of time for existing personnel working on the project. The number of adverse drug events prevented using BCMA was estimated by multiplying the number of doses administered using BCMA by the rate of harmful errors prevented by interventions in response to system warnings. Our previous work found that BCMA identified and intercepted medication errors in 1.1% of doses administered, 9% of which could potentially have resulted in lasting harm. The cost of implementing and operating BCMA, including electronic pharmacy management and drug repackaging, over 5 years is $40,000 (range: $35,600 to $54,600) per BCMA-enabled bed and $2000 (range: $1800 to $2600) per harmful error prevented. BCMA can be an effective and potentially cost-saving tool for preventing the harm and costs associated with medication errors.
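The cost-consequence arithmetic works roughly as follows (a back-of-envelope sketch; the per-bed dose volume is not stated here, so it is back-solved as an assumption):

```python
doses_per_bed_per_year = 4_000   # assumed; back-solved from the $2000 figure
years = 5
cost_per_bed = 40_000            # 5-year implementation + operation ($)

intercept_rate = 0.011           # errors intercepted per dose (1.1%)
harmful_fraction = 0.09          # intercepted errors with potential harm (9%)

prevented = doses_per_bed_per_year * years * intercept_rate * harmful_fraction
print(cost_per_bed / prevented)  # ~2020 $/harmful error, near the reported ~$2000
```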
Peak fitting and integration uncertainties for the Aerodyne Aerosol Mass Spectrometer
NASA Astrophysics Data System (ADS)
Corbin, J. C.; Othman, A.; Haskins, J. D.; Allan, J. D.; Sierau, B.; Worsnop, D. R.; Lohmann, U.; Mensah, A. A.
2015-04-01
The errors inherent in the fitting and integration of the pseudo-Gaussian ion peaks in Aerodyne High-Resolution Aerosol Mass Spectrometers (HR-AMSs) have not previously been addressed as a source of imprecision for these instruments. This manuscript evaluates the significance of these uncertainties and proposes a method for their estimation in routine data analysis. Peak-fitting uncertainties, the most complex source of integration uncertainties, are found to be dominated by errors in m/z calibration. These calibration errors comprise significant amounts of both imprecision and bias, and vary in magnitude from ion to ion. The magnitude of these m/z calibration errors is estimated for an exemplary data set and used to construct a Monte Carlo model, which reproduced well the observed trends in fits to the real data. The empirically constrained model is used to show that the imprecision in the fitted height of isolated peaks scales linearly with the peak height (i.e., as n^1), thus contributing a constant-relative-imprecision term to the overall uncertainty. This constant relative imprecision term dominates the Poisson counting imprecision term (which scales as n^0.5) at high signals. The previous HR-AMS uncertainty model therefore underestimates the overall fitting imprecision. The constant relative imprecision in fitted peak height for isolated peaks in the exemplary data set was estimated as ~4%, and the overall peak-integration imprecision was approximately 5%. We illustrate the importance of this constant relative imprecision term by performing Positive Matrix Factorization (PMF) on a synthetic HR-AMS data set with and without its inclusion. Finally, the ability of an empirically constrained Monte Carlo approach to estimate the fitting imprecision for an arbitrary number of known overlapping peaks is demonstrated. Software is available upon request to estimate these error terms in new data sets.
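The scaling argument can be reproduced with a toy Monte Carlo (parameters here are illustrative, not the paper's empirical values): perturb a Gaussian peak's true centre by an m/z calibration error, fit with the centre constrained to the nominal position, and compare the relative imprecision of the fitted height at low and high signal.

```python
import numpy as np
from scipy.optimize import curve_fit

def peak_fixed_centre(x, h, sigma):
    """Gaussian constrained to the nominal (calibrated) m/z position."""
    return h * np.exp(-0.5 * (x / sigma) ** 2)

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 81)
sigma_true = 0.25

for height in (1e2, 1e4):
    fits = []
    for _ in range(500):
        dmz = rng.normal(0.0, 0.03)           # m/z calibration error
        y = rng.poisson(height * np.exp(-0.5 * ((x - dmz) / sigma_true) ** 2))
        popt, _ = curve_fit(peak_fixed_centre, x, y.astype(float),
                            p0=(height, sigma_true))
        fits.append(popt[0])
    print(height, np.std(fits) / np.mean(fits))
# The relative imprecision plateaus at high signal (calibration-dominated)
# instead of continuing to fall as 1/sqrt(counts).
```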
Low Probability Tail Event Analysis and Mitigation in BPA Control Area: Task 2 Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Shuai; Makarov, Yuri V.; McKinstry, Craig A.
Task report detailing low probability tail event analysis and mitigation in the BPA control area. A tail event refers to the situation in a power system when unfavorable forecast errors of load and wind are superposed onto fast load and wind ramps, or when non-wind generators fall short of scheduled output, causing the imbalance between generation and load to become very significant.
Single event upset susceptibility testing of the Xilinx Virtex II FPGA
NASA Technical Reports Server (NTRS)
Yui, C.; Swift, G.; Carmichael, C.
2002-01-01
Heavy ion testing of the Xilinx Virtex II was conducted on the configuration, block RAM and user flip-flop cells to determine their single event upset susceptibility, using LETs of 1.2 to 60 MeV·cm^2/mg. A software program specifically designed to count errors in the FPGA was used to reveal L1/e values and single-event functional interrupt failures.
Impacts of high resolution model downscaling in coastal regions
NASA Astrophysics Data System (ADS)
Bricheno, Lucy; Wolf, Judith
2013-04-01
With model development and cheaper computational resources, ocean forecasts are becoming readily available, and high-resolution coastal forecasting is now a reality. This can only be achieved, however, by downscaling global or basin-scale products such as the MyOcean reanalyses and forecasts. These model products have resolutions ranging from 1/16 to 1/4 degree, which are often insufficient for coastal scales but can provide initialisation and boundary data. We present applications of downscaling the MyOcean products for use in shelf seas and the nearshore. We will address the question 'Do coastal predictions improve with higher resolution modelling?' with a few focused examples, while also discussing what is meant by an improved result. Increasing resolution appears to be an obvious route to more accurate forecasts in operational coastal models. However, when models resolve finer scales, this may lead to the introduction of high-frequency variability which is not necessarily deterministic. Thus a flow may appear more realistic by generating eddies, but simple statistics like rms error and correlation may become worse because the model variability is not exactly in phase with the observations (Hoffman et al., 1995). By deciding on a specific process to simulate (rather than concentrating on reducing rms error) we can better assess the improvements gained by downscaling. In this work we select two processes which are dominant in our case-study site, Liverpool Bay. Firstly we consider the magnitude and timing of a peak in tide-surge elevations, by separating the event into timing (or displacement) and intensity (or amplitude) errors. The model can thus be evaluated on how well it predicts the timing and magnitude of the surge. The second important characteristic of Liverpool Bay is the position of the freshwater front. To evaluate model performance in this case, the location, sharpness, and temperature difference across the front will be considered. We will show that by using intelligent metrics designed with a physical process in mind, we can learn more about model performance than by considering 'bulk' statistics alone. Hoffman, R. N., Z. Liu, J.-F. Louis and C. Grassotti (1995), 'Distortion representation of forecast errors', Monthly Weather Review, 123, 2758-2770.
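A minimal formulation of the surge-peak metric described above might look like this (my own sketch, not the authors' code): score the modelled event by the displacement and amplitude of its peak separately, rather than by rms error alone.

```python
import numpy as np

def peak_timing_amplitude_errors(t, obs, mod):
    """t: times; obs, mod: observed and modelled surge elevation series.
    Returns (timing error, amplitude error) of the event peak."""
    i_obs, i_mod = np.argmax(obs), np.argmax(mod)
    timing_error = t[i_mod] - t[i_obs]          # + means the model peaks late
    amplitude_error = mod[i_mod] - obs[i_obs]   # + means the model over-predicts
    return timing_error, amplitude_error
```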
Factors contributing to registered nurse medication administration error: a narrative review.
Parry, Angela M; Barriball, K Louise; While, Alison E
2015-01-01
To explore the factors contributing to Registered Nurse medication administration error behaviour. A narrative review. Electronic databases (Cochrane, CINAHL, MEDLINE, BNI, EMBASE, and PsycINFO) were searched from 1 January 1999 to 31 December 2012 in the English language. 1127 papers were identified and 26 papers were included in the review. Data were extracted by one reviewer and checked by a second reviewer. A thematic analysis and narrative synthesis of the factors contributing to Registered Nurses' medication administration behaviour was undertaken. Bandura's (1986) theory of reciprocal determinism was used as an organising framework. This theory proposes that there is a reciprocal interplay between the environment, the person and their behaviour; medication administration error is an outcome of RN behaviour. The 26 papers reported studies conducted on 4 continents across 11 countries, predominantly in North America and Europe, with one multi-national study incorporating 27 countries. Within both the environment and person domains of the reciprocal determinism framework, a number of factors emerged as influencing Registered Nurse medication administration error behaviour. Within the environment domain, two key themes of clinical workload and work setting emerged, and within the person domain the Registered Nurses' characteristics and their lived experience of work emerged as themes. Overall, greater attention has been given to the contribution of the environment domain than the person domain, with the literature viewing an error as an event rather than as the outcome of behaviour. The interplay between factors that influence behaviour was poorly accounted for within the selected studies. It is proposed that a shift away from error as an event towards a focus on the relationships between the person, the environment and Registered Nurse medication administration behaviour is needed to better understand medication administration error. Copyright © 2014 Elsevier Ltd. All rights reserved.
Henneman, Elizabeth A
2017-07-01
The Institute of Medicine (now National Academy of Medicine) reports "To Err is Human" and "Crossing the Chasm" made explicit 3 previously unappreciated realities: (1) Medical errors are common and result in serious, preventable adverse events; (2) The majority of medical errors are the result of system versus human failures; and (3) It would be impossible for any system to prevent all errors. With these realities, the role of the nurse in the "near miss" process and as the final safety net for the patient is of paramount importance. The nurse's role in patient safety is described from both a systems perspective and a human factors perspective. Critical care nurses use specific strategies to identify, interrupt, and correct medical errors. Strategies to identify errors include knowing the patient, knowing the plan of care, double-checking, and surveillance. Nursing strategies to interrupt errors include offering assistance, clarifying, and verbally interrupting. Nurses correct errors by persevering, being physically present, reviewing/confirming the plan of care, or involving another nurse or physician. Each of these strategies has implications for education, practice, and research. Surveillance is a key nursing strategy for identifying medical errors and reducing adverse events. Eye-tracking technology is a novel approach for evaluating the surveillance process during common, high-risk processes such as blood transfusion and medication administration. Eye tracking has also been used to examine the impact of interruptions to care caused by bedside alarms as well as by other health care personnel. Findings from this safety-related eye-tracking research provide new insight into effective bedside surveillance and interruption management strategies. ©2017 American Association of Critical-Care Nurses.
Thomas, Felicity; Signal, Mathew; Harris, Deborah L; Weston, Philip J; Harding, Jane E; Shaw, Geoffrey M; Chase, J Geoffrey
2014-05-01
Neonatal hypoglycemia is common and can cause serious brain injury. Continuous glucose monitoring (CGM) could improve hypoglycemia detection while reducing blood glucose (BG) measurements. Calibration algorithms use BG measurements to convert sensor signals into CGM data. Thus, inaccuracies in calibration BG measurements directly affect CGM values and any metrics calculated from them. The aim was to quantify the effect of timing delays and calibration BG measurement errors on hypoglycemia metrics in newborn infants. Data from 155 babies were used. Two timing and 3 BG meter error models (Abbott Optium Xceed, Roche Accu-Chek Inform II, Nova StatStrip) were created using empirical data. Monte Carlo methods were employed, and each simulation was run 1000 times. Each set of patient data in each simulation had randomly selected timing and/or measurement error added to BG measurements before CGM data were calibrated. The number of hypoglycemic events, duration of hypoglycemia, and hypoglycemic index were then calculated using the CGM data and compared to baseline values. Timing error alone had little effect on hypoglycemia metrics, but measurement error caused substantial variation. Abbott results underreported the number of hypoglycemic events by up to 8, and Roche overreported by up to 4, where the original number reported was 2. Nova results were closest to baseline. Similar trends were observed in the other hypoglycemia metrics. Errors in blood glucose concentration measurements used for calibration of CGM devices can have a clinically important impact on detection of hypoglycemia. If CGM devices are going to be used for assessing hypoglycemia, it is important to understand the impact of these errors on CGM data. © 2014 Diabetes Technology Society.
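The Monte Carlo design can be sketched as follows (the meter error model, one-point linear recalibration, and event counting below are simplifying assumptions of mine; the study used empirical models for three meters and 1000 runs per simulation):

```python
import numpy as np

def hypo_event_count(sensor, cal_idx, cal_bg, meter_cv=0.05,
                     thresh=2.6, rng=None):
    """sensor: raw sensor signal; cal_idx: indices of calibration samples;
    cal_bg: reference BG at those samples (mmol/L); meter_cv: assumed
    relative meter error; thresh: hypoglycaemia threshold (2.6 mmol/L)."""
    if rng is None:
        rng = np.random.default_rng()
    # perturb the calibration BGs with random meter error, then recalibrate
    noisy_bg = cal_bg * (1 + rng.normal(0.0, meter_cv, size=len(cal_bg)))
    gain = np.mean(noisy_bg / sensor[cal_idx])   # simple linear recalibration
    cgm = gain * sensor
    below = (cgm < thresh).astype(int)
    # count contiguous runs below threshold as separate hypoglycaemic events
    return int(np.sum(np.diff(np.concatenate(([0], below))) == 1))
```

Repeating this many times per record and comparing the counts against an error-free baseline reproduces the kind of over/under-reporting comparison described.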
Ajemigbitse, Adetutu A.; Omole, Moses Kayode; Ezike, Nnamdi Chika; Erhun, Wilson O.
2013-01-01
Context: Junior doctors are reported to make most of the prescribing errors in the hospital setting. Aims: The aim of the following study is to determine the knowledge intern doctors have about prescribing errors and circumstances contributing to making them. Settings and Design: A structured questionnaire was distributed to intern doctors in National Hospital Abuja Nigeria. Subjects and Methods: Respondents gave information about their experience with prescribing medicines, the extent to which they agreed with the definition of a clinically meaningful prescribing error and events that constituted such. Their experience with prescribing certain categories of medicines was also sought. Statistical Analysis Used: Data was analyzed with Statistical Package for the Social Sciences (SPSS) software version 17 (SPSS Inc Chicago, Ill, USA). Chi-squared analysis contrasted differences in proportions; P < 0.05 was considered to be statistically significant. Results: The response rate was 90.9% and 27 (90%) had <1 year of prescribing experience. 17 (56.7%) respondents totally agreed with the definition of a clinically meaningful prescribing error. Most common reasons for prescribing mistakes were a failure to check prescriptions with a reference source (14, 25.5%) and failure to check for adverse drug interactions (14, 25.5%). Omitting some essential information such as duration of therapy (13, 20%), patient age (14, 21.5%) and dosage errors (14, 21.5%) were the most common types of prescribing errors made. Respondents considered workload (23, 76.7%), multitasking (19, 63.3%), rushing (18, 60.0%) and tiredness/stress (16, 53.3%) as important factors contributing to prescribing errors. Interns were least confident prescribing antibiotics (12, 25.5%), opioid analgesics (12, 25.5%) cytotoxics (10, 21.3%) and antipsychotics (9, 19.1%) unsupervised. Conclusions: Respondents seemed to have a low awareness of making prescribing errors. Principles of rational prescribing and events that constitute prescribing errors should be taught in the practice setting. PMID:24808682
Late Quaternary glaciation history of monsoon-dominated Dingad basin, central Himalaya, India
NASA Astrophysics Data System (ADS)
Shukla, Tanuj; Mehta, Manish; Jaiswal, Manoj K.; Srivastava, Pradeep; Dobhal, D. P.; Nainwal, H. C.; Singh, Atul K.
2018-02-01
The study presents the Late Quaternary glaciation history of the monsoon-dominated Dokriani Glacier valley, Dingad basin, central Himalaya, India. The basin is tested for the mechanism of landform preservation in the high-relief and abundant-precipitation regimes of the Higher Himalaya. Field geomorphology and remote sensing data, supported by Optically Stimulated Luminescence (OSL) dating, enabled identification of five major glacial events of decreasing magnitude. The oldest glacial stage, Dokriani Glacial Stage I (DGS-I), extended down to ∼8 km (2883 m asl) from the present-day snout (3965 m asl), followed by four other glaciation events, viz. DGS-II, DGS-III, DGS-IV and DGS-V, terminating at ∼3211, 3445, 3648 and ∼3733 m asl respectively. The DGS-I glaciation (∼25-∼22 ka BP) occurred during early Marine Isotope Stage (MIS) 2, characterized as the Last Glacial Maximum (LGM) extension in the valley. Similarly, the DGS-II stage (∼14-∼11 ka BP) represents glaciation during the globally cool and dry Older Dryas and Younger Dryas events. The DGS-III glaciation (∼8 ka BP) coincides with the early Holocene 8.2 ka cooling event, the DGS-IV glaciation (∼4-3.7 ka BP) corresponds to the 4.2 ka cool and drier event, and DGS-V (∼2.7-∼1 ka BP) represents the cool and moist late Holocene glacial advance of the valley. This study suggests that the Dokriani Glacier valley responded to global lowering of temperature and variable precipitation conditions. It also highlights the close correlation between monsoon-dominated valley glaciations and Northern Hemisphere cooling events influenced by North Atlantic climate.
Ensemble Streamflow Prediction in Korea: Past and Future 5 Years
NASA Astrophysics Data System (ADS)
Jeong, D.; Kim, Y.; Lee, J.
2005-05-01
The Ensemble Streamflow Prediction (ESP) approach was first introduced in 2000 by the Hydrology Research Group (HRG) at Seoul National University as an alternative probabilistic forecasting technique for improving the 'Water Supply Outlook' that is issued every month by the Ministry of Construction and Transportation in Korea. That study motivated the Korea Water Resources Corporation (KOWACO) to establish its seasonal probabilistic forecasting system for the 5 major river basins using the ESP approach. In cooperation with the HRG, KOWACO developed monthly optimal multi-reservoir operating systems for the Geum river basin in 2004, which coupled the ESP forecasts with an optimization model using sampling stochastic dynamic programming (SSDP). User interfaces for both ESP and SSDP have also been designed to make the developed computer systems more practical. Projects developing ESP systems for the other 3 major river basins (i.e., the Nakdong, Han and Seomjin river basins) were also completed by the HRG and KOWACO at the end of December 2004. The ESP system has therefore become the most important mid- and long-term streamflow forecasting technique in Korea. In addition to the practical aspects, recent research experience with ESP has raised some concerns about ways of improving the accuracy of ESP in Korea. Jeong and Kim (2002) performed an error analysis of its resulting probabilistic forecasts and found that the modeling error is dominant in the dry season, while the meteorological error is dominant in the flood season. To address the first issue, Kim et al. (2004) tested various combinations and/or combining techniques and showed that the ESP probabilistic accuracy could be improved considerably during the dry season when the hydrologic models were combined and/or corrected. In addition, an attempt was also made to improve the ESP accuracy for the flood season using climate forecast information. This ongoing project handles three types of climate forecast information: (1) the Monthly Industrial Meteorology Information Magazine (MIMIM) of the Korea Meteorological Administration, (2) the Global Data Assimilation Prediction System (GDAPS), and (3) the US National Centers for Environmental Prediction (NCEP). Each of these forecasts is issued in a unique format: (1) MIMIM is a most-probable-event forecast, (2) GDAPS is a single series of deterministic forecasts, and (3) NCEP is an ensemble of deterministic forecasts. Other minor issues include how long the initial conditions influence the ESP accuracy, and how many ESP scenarios are needed to obtain the best accuracy. This presentation also addresses some future research that is needed for ESP in Korea.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Noel, Camille E.; Gutti, VeeraRajesh; Bosch, Walter
Purpose: To quantify the potential impact of the Integrating the Healthcare Enterprise–Radiation Oncology (IHE-RO) Quality Assurance with Plan Veto (QAPV) profile on patient safety in external beam radiation therapy (RT) operations. Methods and Materials: An institutional database of events (errors and near-misses) was used to evaluate the ability of QAPV to prevent clinically observed events. We analyzed reported events that were related to Digital Imaging and Communications in Medicine RT plan parameter inconsistencies between the intended treatment (on the treatment planning system) and the delivered treatment (on the treatment machine). Critical Digital Imaging and Communications in Medicine RT plan parameters were identified. Each event was scored for importance using the Failure Mode and Effects Analysis methodology. Potential error occurrence (frequency), potential event severity, and the probability of detection with and without the theoretical implementation of the QAPV plan comparison check were derived from the collected event data. Failure Mode and Effects Analysis Risk Priority Numbers (RPNs) with and without QAPV were compared to quantify the potential benefit of clinical implementation of QAPV. Results: The implementation of QAPV could reduce the RPN values for 15 of 22 (71%) evaluated parameters, with an overall average reduction in RPN of 68 (range, 0-216). For the 6 high-risk parameters (RPN > 200), the average reduction in RPN value was 163 (range, 108-216). The RPN value reduction for the intermediate-risk (200 > RPN > 100) parameters ranged from 0 to 140. With QAPV, the largest RPN value, for "Beam Meterset," was reduced from 324 to 108. The maximum absolute reduction in RPN value was for Beam Meterset (216, 66.7%), whereas the maximum percentage reduction was for Cumulative Meterset Weight (80, 88.9%). Conclusion: This analysis quantifies the value of the IHE-RO QAPV implementation in clinical workflow. We demonstrate that although QAPV does not provide a comprehensive solution for error prevention in RT, it can have a significant impact on a subset of the most severe clinically observed events.
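For reference, the FMEA arithmetic behind the RPN values quoted in this abstract is the product of severity, occurrence, and detection scores. A minimal sketch follows; the individual S/O/D scores below are assumptions chosen to reproduce the reported Beam Meterset reduction (324 to 108), since the abstract reports only the products.

```python
# Illustrative FMEA arithmetic. The S/O/D scores are assumptions; only the
# RPN products (324 and 108) come from the abstract.
def rpn(severity, occurrence, detection):
    """Failure Mode and Effects Analysis Risk Priority Number."""
    return severity * occurrence * detection

without_qapv = rpn(severity=9, occurrence=4, detection=9)  # 324
with_qapv = rpn(severity=9, occurrence=4, detection=3)     # 108: QAPV makes
                                                           # the error easier to detect
print(without_qapv, with_qapv, without_qapv - with_qapv)   # 324 108 216
```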
Burlison, Jonathan D; Quillivan, Rebecca R; Kath, Lisa M; Zhou, Yinmei; Courtney, Sam C; Cheng, Cheng; Hoffman, James M
2016-11-03
Patient safety events offer opportunities to improve patient care, but, unfortunately, events often go unreported. Although some barriers to event reporting can be reduced with electronic reporting systems, insight into organizational and cultural factors that influence reporting frequency may help hospitals increase reporting rates and improve patient safety. The purpose of this study was to evaluate the associations between dimensions of patient safety culture and perceived reporting practices of safety events of varying severity. We conducted a cross-sectional survey study using previously collected data from the Agency for Healthcare Research and Quality Hospital Survey on Patient Safety Culture as predictors and outcome variables. The dataset included health-care professionals in U.S. hospitals, and data were analyzed using multilevel modeling techniques. Data from 223,412 individuals, 7816 work areas/units, and 967 hospitals were analyzed. Whether examining near miss, no harm, or potential for harm safety events, the dimension feedback about error accounted for the most unique predictive variance in the outcome frequency of events reported. Other significantly associated variables included organizational learning, nonpunitive response to error, and teamwork within units (all P < 0.001). As the perceived severity of the safety event increased, more culture dimensions became significantly associated with voluntary reporting. To increase the likelihood that a patient safety event will be voluntarily reported, our study suggests placing priority on improving event feedback mechanisms and communication of event-related improvements. Focusing efforts on these aspects may be more efficient than other forms of culture change.
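A hedged sketch of the multilevel approach this abstract describes: individuals nested in hospitals, with culture dimensions predicting reported-event frequency. The synthetic data, variable names, and two-level random-intercept structure are illustrative assumptions; the study itself used AHRQ survey dimensions across individuals, work areas/units, and hospitals.

```python
# Sketch of a multilevel (mixed-effects) model: random intercept per
# hospital, fixed effects for two culture dimensions. Data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_hosp, n_per = 40, 50
hospital = np.repeat(np.arange(n_hosp), n_per)
hosp_effect = rng.normal(0, 0.3, n_hosp)[hospital]   # hospital-level variation
feedback = rng.normal(0, 1, n_hosp * n_per)           # "feedback about error"
nonpunitive = rng.normal(0, 1, n_hosp * n_per)        # "nonpunitive response"
freq = 0.5 * feedback + 0.2 * nonpunitive + hosp_effect + rng.normal(0, 1, n_hosp * n_per)
df = pd.DataFrame(dict(hospital=hospital, feedback=feedback,
                       nonpunitive=nonpunitive, freq=freq))

model = smf.mixedlm("freq ~ feedback + nonpunitive", df, groups=df["hospital"])
print(model.fit().summary())
```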
Measuring ionizing radiation in the atmosphere with a new balloon-borne detector
NASA Astrophysics Data System (ADS)
Aplin, K. L.; Briggs, A. A.; Harrison, R. G.; Marlton, G. J.
2017-05-01
Increasing interest in energetic particle effects on weather and climate has motivated development of a miniature scintillator-based detector intended for deployment on meteorological radiosondes or unmanned airborne vehicles. The detector was calibrated with laboratory gamma sources up to 1.3 MeV and known gamma peaks from natural radioactivity up to 2.6 MeV. The specifications of our device, in combination with the performance of similar devices, suggest that it will respond to gamma rays of up to 17 MeV. Laboratory tests show that the detector can measure muons at the surface, and it is also expected to respond to other ionizing radiation including, for example, protons, electrons (>100 keV), and energetic helium nuclei from cosmic rays or during space weather events. Its estimated counting error is ±10%. Recent tests, in which the detector was integrated with a meteorological radiosonde system and carried on a balloon to 25 km altitude, identified the transition region between energetic particles near the surface, which are dominated by terrestrial gamma emissions, and higher-energy particles in the free troposphere.
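The quoted ±10% counting error is consistent with Poisson counting statistics, under the assumption (ours, not necessarily the authors') that count statistics dominate the error budget: the relative error of N counts is 1/√N, so ±10% corresponds to roughly 100 counts per integration period.

```python
# Poisson counting statistics behind a +/-10% counting error (assumption:
# the error is dominated by count statistics, so relative error = 1/sqrt(N)).
import math

def relative_counting_error(counts):
    return 1.0 / math.sqrt(counts)

print(relative_counting_error(100))  # 0.1 -> +/-10% needs ~100 counts per bin
```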
Electromagnetic Methods of Lightning Detection
NASA Astrophysics Data System (ADS)
Rakov, V. A.
2013-11-01
Both cloud-to-ground and cloud lightning discharges involve a number of processes that produce electromagnetic field signatures in different regions of the spectrum. Salient characteristics of measured wideband electric and magnetic fields generated by various lightning processes at distances ranging from tens to a few hundreds of kilometers (when at least the initial part of the signal is essentially radiation and is not influenced by ionospheric reflections) are reviewed. An overview of the various lightning locating techniques, including magnetic direction finding, the time-of-arrival technique, and interferometry, is given. Lightning location on a global scale, when radio-frequency electromagnetic signals are dominated by ionospheric reflections, is also considered. Lightning locating system performance characteristics, including flash and stroke detection efficiencies, percentage of misclassified events, location accuracy, and peak current estimation errors, are discussed. Both cloud and cloud-to-ground flashes are considered. Representative examples of modern lightning locating systems are reviewed. Besides a general characterization of each system, the available information on its performance characteristics is given, with emphasis on results from formal ground-truth studies published in the peer-reviewed literature.
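As one concrete illustration of the time-of-arrival technique mentioned in this review, a stroke location can be recovered by least-squares fitting of arrival times recorded at several sensors. The sketch below is a toy 2-D example with an assumed sensor layout and timing noise, not any operational network's solver.

```python
# Toy time-of-arrival (TOA) lightning location: solve for the 2-D stroke
# position and emission time that best match arrival times at four sensors.
import numpy as np
from scipy.optimize import least_squares

C = 3.0e5  # propagation speed, km/s (speed of light)

sensors = np.array([[0.0, 0.0], [200.0, 0.0], [0.0, 200.0], [200.0, 200.0]])  # km
true_src, true_t0 = np.array([130.0, 75.0]), 0.0
arrivals = true_t0 + np.linalg.norm(sensors - true_src, axis=1) / C
arrivals += np.random.default_rng(1).normal(0, 1e-6, len(sensors))  # ~1 us timing noise

def residuals(p):
    x, y, t0 = p
    dist = np.linalg.norm(sensors - np.array([x, y]), axis=1)
    return arrivals - (t0 + dist / C)

fit = least_squares(residuals, x0=[100.0, 100.0, 0.0])
print("estimated stroke position (km):", fit.x[:2])
```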
Overview of medical errors and adverse events
2012-01-01
Safety is a global concept that encompasses efficiency, security of care, reactivity of caregivers, and satisfaction of patients and relatives. Patient safety has emerged as a major target for healthcare improvement. Quality assurance is a complex task, and patients in the intensive care unit (ICU) are more likely than other hospitalized patients to experience medical errors, due to the complexity of their conditions, the need for urgent interventions, and considerable workload fluctuation. Medication errors are the most common medical errors and can induce adverse events. Two approaches are available for evaluating and improving quality of care: the room-for-improvement model, in which problems are identified, plans are made to resolve them, and the results of the plans are measured; and the monitoring model, in which quality indicators are defined as relevant to potential problems and then monitored periodically. Indicators that reflect structures, processes, or outcomes have been developed by medical societies. Surveillance of these indicators is organized at the hospital or national level. Using a combination of methods improves the results. Errors are caused by combinations of human factors and system factors, and information must be obtained on how people make errors in the ICU environment. Preventive strategies are more likely to be effective if they rely on a system-based approach, in which organizational flaws are remedied, rather than a human-based approach of encouraging people not to make errors. The development of a safety culture in the ICU is crucial to effective prevention and should occur before the evaluation of safety programs, which are more likely to be effective when they involve bundles of measures. PMID:22339769
Sosic-Vasic, Zrinka; Ulrich, Martin; Ruchsow, Martin; Vasic, Nenad; Grön, Georg
2012-01-01
The present study investigated the association between traits of the Five Factor Model of Personality (Neuroticism, Extraversion, Openness to Experience, Agreeableness, and Conscientiousness) and neural correlates of error monitoring obtained from a combined Eriksen-Flanker-Go/NoGo task during event-related functional magnetic resonance imaging in 27 healthy subjects. Individual expressions of personality traits were measured using the NEO-PI-R questionnaire. Conscientiousness correlated positively with error signaling in the left inferior frontal gyrus and adjacent anterior insula (IFG/aI). A second strong positive correlation was observed in the anterior cingulate gyrus (ACC). Neuroticism was negatively correlated with error signaling in the inferior frontal cortex, possibly reflecting the negative inter-correlation between the two scales observed on the behavioral level. Under the present statistical thresholds, no significant results were obtained for the remaining scales. Aligning the personality trait of Conscientiousness with task-accomplishment striving, the correlation in the left IFG/aI possibly reflects inter-individually different involvement whenever task-set-related memory representations are violated by the occurrence of errors. The strong correlations in the ACC may indicate that more conscientious subjects were more strongly affected by these violations of a given task-set, expressed by individually different, negatively valenced signals conveyed by the ACC upon occurrence of an error. The present results illustrate that underlying personality traits should be taken into account when predicting individual responses to errors, and they also lend external validity to the personality trait approach, suggesting that personality constructs reflect more than mere descriptive taxonomies.
Continuous Glucose Monitoring in Newborn Infants
Thomas, Felicity; Signal, Mathew; Harris, Deborah L.; Weston, Philip J.; Harding, Jane E.; Shaw, Geoffrey M.
2014-01-01
Neonatal hypoglycemia is common and can cause serious brain injury. Continuous glucose monitoring (CGM) could improve hypoglycemia detection while reducing blood glucose (BG) measurements. Calibration algorithms use BG measurements to convert sensor signals into CGM data. Thus, inaccuracies in calibration BG measurements directly affect CGM values and any metrics calculated from them. The aim was to quantify the effect of timing delays and calibration BG measurement errors on hypoglycemia metrics in newborn infants. Data from 155 babies were used. Two timing and 3 BG meter error models (Abbott Optium Xceed, Roche Accu-Chek Inform II, Nova Statstrip) were created using empirical data. Monte-Carlo methods were employed, and each simulation was run 1000 times. Each set of patient data in each simulation had randomly selected timing and/or measurement error added to BG measurements before CGM data were calibrated. The number of hypoglycemic events, duration of hypoglycemia, and hypoglycemic index were then calculated using the CGM data and compared to baseline values. Timing error alone had little effect on hypoglycemia metrics, but measurement error caused substantial variation. Abbott results underreported the number of hypoglycemic events by up to 8 and Roche overreported by up to 4, where the original number reported was 2. Nova results were closest to baseline. Similar trends were observed in the other hypoglycemia metrics. Errors in blood glucose concentration measurements used for calibration of CGM devices can have a clinically important impact on detection of hypoglycemia. If CGM devices are going to be used for assessing hypoglycemia, it is important to understand the impact of these errors on CGM data. PMID:24876618
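A minimal Monte-Carlo sketch in the spirit of this study: perturb the calibration BG measurement, recalibrate the sensor trace, and recompute a hypoglycemia metric. The one-point calibration, Gaussian meter-error model, synthetic glucose trace, and threshold are simplifying assumptions; the study used empirical error models for three specific meters.

```python
# Monte-Carlo sketch: effect of calibration BG error on a hypoglycemia metric.
# All models and numbers here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(7)
HYPO = 2.6  # mmol/L threshold (assumed; thresholds vary in neonatal practice)

true_glucose = 3.0 + 0.8 * np.sin(np.linspace(0, 6 * np.pi, 288))  # 24 h, 5-min samples
sensor_signal = 10.0 * true_glucose                                # arbitrary sensor gain

def hypo_minutes(calib_bg_error_sd, n_runs=1000):
    durations = []
    for _ in range(n_runs):
        bg = true_glucose[0] + rng.normal(0, calib_bg_error_sd)  # noisy calibration BG
        gain = bg / sensor_signal[0]                             # one-point calibration
        cgm = gain * sensor_signal
        durations.append(5 * np.sum(cgm < HYPO))                 # minutes below threshold
    return np.mean(durations)

for sd in (0.0, 0.2, 0.4):  # meter error SD in mmol/L
    print(f"meter SD {sd}: mean hypo duration {hypo_minutes(sd):.0f} min")
```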
[Adverse events in general surgery. A prospective analysis of 13,950 consecutive patients].
Rebasa, Pere; Mora, Laura; Vallverdú, Helena; Luna, Alexis; Montmany, Sandra; Romaguera, Andreu; Navarro, Salvador
2011-11-01
Adverse event (AE) rates in General Surgery vary, according to different authors and recording methods, between 2% and 30%. Six years ago we designed a prospective AE recording system to change the patient safety culture in our Department. We present the results of this work after a 6-year follow-up. The AEs, sequelae, and health care errors in a University Hospital surgery department were recorded. An analysis of each recorded incident was performed by a reviewer. The data were entered into a database for rapid access and consultation. The results were routinely presented in Departmental morbidity-mortality sessions. The 13,950 patients suffered a total of 11,254 AEs, which affected 5142 of them (36.9% of admissions). A total of 920 patients were subjected to at least one health care error (6.6% of admissions), meaning that 6.6% of our patients suffered an avoidable AE. The overall mortality at 5 years in our department was 2.72% (380 deaths). An adverse event was implicated in the death of the patient in 180 cases (1.29% of admissions). In 49 cases (0.35% of admissions), mortality could be attributed to an avoidable AE. Over the 6 years, the incidence of errors has tended to decrease. The exhaustive and prospective recording of AEs leads to changes in the patient safety culture of a Surgery Department and helps decrease the incidence of health care errors. Copyright © 2011 AEC. Published by Elsevier Espana. All rights reserved.
Antecedent Synoptic Environments Conducive to North American Polar/Subtropical Jet Superpositions
NASA Astrophysics Data System (ADS)
Winters, A. C.; Keyser, D.; Bosart, L. F.
2017-12-01
The atmosphere often exhibits a three-step pole-to-equator tropopause structure, with each break in the tropopause associated with a jet stream. The polar jet stream (PJ) typically resides in the break between the polar and subtropical tropopause and is positioned atop the strongly baroclinic, tropospheric-deep polar front around 50°N. The subtropical jet stream (STJ) resides in the break between the subtropical and the tropical tropopause and is situated on the poleward edge of the Hadley cell around 30°N. On occasion, the latitudinal separation between the PJ and the STJ can vanish, resulting in a vertical jet superposition. Prior case study work indicates that jet superpositions are often attended by a vigorous transverse vertical circulation that can directly impact the production of extreme weather over North America. Furthermore, this work suggests that there is considerable variability among antecedent environments conducive to the production of jet superpositions. These considerations motivate a comprehensive study to examine the synoptic-dynamic mechanisms that operate within the double-jet environment to produce North American jet superpositions. This study focuses on the identification of North American jet superposition events in the CFSR dataset during November-March 1979-2010. Superposition events will be classified into three characteristic types: "Polar Dominant" events will consist of events during which only the PJ is characterized by a substantial excursion from its climatological latitude band; "Subtropical Dominant" events will consist of events during which only the STJ is characterized by a substantial excursion from its climatological latitude band; and "Hybrid" events will consist of those events characterized by an excursion of both the PJ and STJ from their climatological latitude bands. Following their classification, frequency distributions of jet superpositions will be constructed to highlight the geographical locations most often associated with jet superpositions for each event type. PV inversion and composite analysis will also be performed on each event type in an effort to illustrate the antecedent environments and the dominant synoptic-dynamic mechanisms that favor the production of North American jet superpositions for each event type.
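The three-way event classification this abstract describes reduces to comparing each jet's latitude against its climatological band. Below is a hedged sketch of that logic, with placeholder band edges and a simple equatorward/poleward excursion rule rather than the study's actual criteria.

```python
# Sketch of the jet-superposition event classification. Band edges and the
# excursion rule are placeholder assumptions, not the study's definitions.
def classify_superposition(pj_lat, stj_lat,
                           pj_band=(40.0, 60.0), stj_band=(20.0, 40.0)):
    pj_excursion = pj_lat < pj_band[0]      # PJ displaced equatorward of its band
    stj_excursion = stj_lat > stj_band[1]   # STJ displaced poleward of its band
    if pj_excursion and stj_excursion:
        return "Hybrid"
    if pj_excursion:
        return "Polar Dominant"
    if stj_excursion:
        return "Subtropical Dominant"
    return "No substantial excursion"

print(classify_superposition(pj_lat=35.0, stj_lat=33.0))  # Polar Dominant
```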
Performance Analysis: Work Control Events Identified January - August 2010
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Grange, C E; Freeman, J W; Kerr, C E
2011-01-14
This performance analysis evaluated 24 events that occurred at LLNL from January through August 2010. The analysis identified areas of potential work control process and/or implementation weaknesses and several common underlying causes. Human performance improvement and safety culture factors were analyzed as part of the causal analysis of each event. The collective significance of all events in 2010, as measured by the occurrence reporting significance category and by the proportion of events reported to the DOE ORPS under the "management concerns" reporting criteria, does not appear to have increased. The frequency of reporting in each of the significance categories has not changed in 2010 compared to the previous four years. There is no trend in the significance categories, and there has been no increase in the proportion of occurrences reported in the higher significance category. The frequency of events, 42 reported through August 2010, is also not greater than in previous years and is below the average of 63 occurrences per year at LLNL since 2006. Over the previous four years, an average of 43% of LLNL's reported occurrences were reported as either "management concerns" or "near misses." In 2010, 29% of occurrences were reported as "management concerns" or "near misses," indicating that LLNL is now reporting fewer such occurrences than in the previous four years. From 2008 to the present, LLNL senior management has undertaken a series of initiatives to strengthen the work planning and control system, with the primary objective of improving worker safety. In 2008, the LLNL Deputy Director established the Work Control Integrated Project Team to develop the core requirements and graded elements of an institutional work planning and control system. By the end of that year this system was documented and implementation had begun. In 2009, training of the workforce began, and as of the time of this report more than 50% of authorized Integration Work Sheets (IWS) use the activity-based planning process. In 2010, LSO independently reviewed the work planning and control process and confirmed to the Laboratory that the Integrated Safety Management (ISM) System was implemented. LLNL conducted a cross-directorate management self-assessment of work planning and control and is developing actions to respond to the issues identified. Ongoing efforts to strengthen the work planning and control process and to improve the quality of LLNL work packages are in progress: completion of remaining actions in response to the 2009 DOE Office of Health, Safety, and Security (HSS) evaluation of LLNL's ISM System; scheduling more than 14 work planning and control self-assessments in FY11; continuing to align subcontractor work control with the institutional work planning and control system; and continuing to maintain the electronic IWS application. The 24 events included in this analysis were caused by errors in the first four of the five ISMS functions. The most frequent cause was errors in analyzing the hazards (Function 2). The second most frequent cause was errors occurring when defining the work (Function 1), followed by errors during the performance of work (Function 4). Interestingly, very few errors in developing controls (Function 3) resulted in events.
This leads one to conclude that if improvements are made to defining the scope of work and analyzing the potential hazards, LLNL may reduce the frequency or severity of events. Analysis of the 24 events identified ten common causes. Some events had multiple causes, so 39 causes in total were identified across the 24 events. The most frequent cause was workers, supervisors, or experts believing they understood the work and the hazards when their understanding was incomplete. The second most frequent cause was unclear, incomplete, or confusing documents directing the work. Together, these two causes were cited 17 times and contributed to 13 of the events. All of the events with the cause of "workers, supervisors, or experts believing they understood the work and the hazards but their understanding was incomplete" had this error in the first two ISMS functions: define the work and analyze the hazards. This means that these causes result in the scope of work being ill-defined or the hazard(s) improperly analyzed. Incomplete implementation of these functional steps leads to the hazards not being controlled, and the causes are then manifested in events when the work is conducted. The ability to operate safely relies on accurately defining the scope of work. This review has identified a number of examples of latent organizational weakness in the execution of work control processes.
Assessment of individual hand performance in box trainers compared to virtual reality trainers.
Madan, Atul K; Frantzides, Constantine T; Shervin, Nina; Tebbit, Christopher L
2003-12-01
Training residents in laparoscopic skills is ideally initiated in an inanimate laboratory with both box trainers and virtual reality trainers. Virtual reality trainers have the ability to score individual hand performance, although they are expensive. Here we compared the ability to assess dominant and nondominant hand performance in box trainers with that of virtual reality trainers. Medical students without laparoscopic experience were utilized in this study (n = 16). Each student performed tasks on the LTS 2000, an inanimate box trainer (placing pegs with both hands and transferring pegs from one hand to another), as well as a task on the MIST-VR, a virtual reality trainer (grasping a virtual object and placing it in a virtual receptacle with alternating hands). A surgeon scored students for the inanimate box trainer exercises (time and errors), while the MIST-VR scored students automatically (time, economy of movements, and errors for each hand). Statistical analysis included Pearson correlations. Errors and time for the one-handed tasks on the box trainer did not correlate with errors, time, or economy measured for each hand by the MIST-VR (r = 0.01 to 0.30; P = NS). Total errors on the virtual reality trainer did correlate with errors on transferring pegs (r = 0.61; P < 0.05). Economy and time of both the dominant and nondominant hand on the MIST-VR correlated with time of transferring pegs in the box trainer (r = 0.53 to 0.77; P < 0.05). While individual hand assessment by the box trainer during 2-handed tasks was related to assessment by the virtual reality trainer, individual hand assessment during 1-handed tasks did not correlate with the virtual reality trainer. Virtual reality trainers, such as the MIST-VR, allow assessment of individual hand skills, which may lead to improved laparoscopic skill acquisition. It is difficult to assess individual hand performance with box trainers alone.
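The statistical comparison in this study is a straightforward Pearson correlation across trainees. A minimal sketch with synthetic scores (n = 16 as in the study; the metric names and values are illustrative):

```python
# Pearson correlation between a box-trainer metric and a VR-trainer metric
# across trainees. Scores are synthetic, for illustration only.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(3)
box_transfer_time = rng.normal(120, 20, 16)                     # seconds, synthetic
vr_dominant_time = 0.6 * box_transfer_time + rng.normal(0, 15, 16)

r, p = pearsonr(box_transfer_time, vr_dominant_time)
print(f"r = {r:.2f}, P = {p:.3f}")
```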
NASA Astrophysics Data System (ADS)
Staneva, Joanna; Wahle, Kathrin
2015-04-01
This study addresses the coupling between wind-wave and circulation models, using the example of the German Bight and its coastal area, the Wadden Sea (the area between the barrier islands and the coast). This topic reflects the increased interest in operational oceanography in reducing prediction errors of state estimates at coastal scales. The uncertainties in most presently used models result from the nonlinear feedback between strong tidal currents and wind waves, which can no longer be ignored, particularly in the coastal zone where its role appears dominant. A nested modelling system is used at the Helmholtz-Zentrum Geesthacht to produce reliable nowcasts and short-term forecasts of ocean state variables, including wind waves and hydrodynamics. In this study we present an analysis of wave and hydrographic observations, as well as the results of numerical simulations. The database includes ADCP observations and continuous measurements from data stations. The individual and collective roles of wind, wave, and tidal forcing are quantified. The performance of the forecasting system is illustrated for several extreme events. Effects of ocean waves on coastal circulation and SST simulations are investigated by considering wave-dependent stress and wave-breaking parameterizations during extreme events, e.g., storm Xavier in December 2013. The effect that the circulation exerts on the wind waves is also tested for the coastal areas using different parameterizations. The improved skill resulting from these new developments in the forecasting system, in particular during extreme events, justifies further enhancement of the coastal pre-operational system for the North Sea and German Bight.
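The abstract does not specify which wave-dependent stress parameterization the system uses. Purely as a hedged illustration of the general idea, one common approach makes the surface roughness, and hence the wind stress felt by the circulation model, depend on sea state through a Charnock-type relation with a wave-modulated Charnock parameter:

```python
# Illustrative Charnock-type wave-dependent stress (a generic textbook
# approach, NOT the parameterization used in the system described above).
import math

G, KAPPA = 9.81, 0.4  # gravity (m/s^2), von Karman constant

def wind_stress(u10, charnock=0.018, rho_air=1.225, z=10.0, iterations=20):
    """Fixed-point iteration of friction velocity and roughness length."""
    ustar = 0.03 * u10  # initial guess
    for _ in range(iterations):
        z0 = charnock * ustar**2 / G        # Charnock roughness length
        ustar = KAPPA * u10 / math.log(z / z0)
    return rho_air * ustar**2               # stress in N/m^2

# Young, steep storm waves can be mimicked by a larger Charnock parameter:
print(wind_stress(20.0, charnock=0.018), wind_stress(20.0, charnock=0.03))
```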
Increased Error-Related Negativity (ERN) in Childhood Anxiety Disorders: ERP and Source Localization
ERIC Educational Resources Information Center
Ladouceur, Cecile D.; Dahl, Ronald E.; Birmaher, Boris; Axelson, David A.; Ryan, Neal D.
2006-01-01
Background: In this study we used event-related potentials (ERPs) and source localization analyses to track the time course of neural activity underlying response monitoring in children diagnosed with an anxiety disorder compared to age-matched low-risk normal controls. Methods: High-density ERPs were examined following errors on a flanker task…
Development of Action Monitoring through Adolescence into Adulthood: ERP and Source Localization
ERIC Educational Resources Information Center
Ladouceur, Cecile D.; Dahl, Ronald E.; Carter, Cameron S.
2007-01-01
In this study we examined the development of three action monitoring event-related potentials (ERPs)--the error-related negativity (ERN/Ne), error positivity (P[subscript E]) and the N2--and estimated their neural sources. These ERPs were recorded during a flanker task in the following groups: early adolescents (mean age = 12 years), late…
Building a learning culture and prevention of error - to near miss or not.
Arnold, Anthony
2017-09-01
This editorial provides an insight into learning and prevention of error through near miss event reporting. © 2017 The Author. Journal of Medical Radiation Sciences published by John Wiley & Sons Australia, Ltd on behalf of Australian Society of Medical Imaging and Radiation Therapy and New Zealand Institute of Medical Radiation Technology.
16 CFR 308.7 - Billing and collection for pay-per-call services.
Code of Federal Regulations, 2010 CFR
2010-01-01
... purchase. (vi) A computation error or similar error of an accounting nature on a billing statement of a... before the end of the billing cycle for which the statement was required. (viii) A reflection on a... billing cycles of the billing entity (in no event later than ninety (90) days) after receiving the notice...
Error-Related Negativity and Tic History in Pediatric Obsessive-Compulsive Disorder
ERIC Educational Resources Information Center
Hanna, Gregory L.; Carrasco, Melisa; Harbin, Shannon M.; Nienhuis, Jenna K.; LaRosa, Christina E.; Chen, Poyu; Fitzgerald, Kate D.; Gehring, William J.
2012-01-01
Objective: The error-related negativity (ERN) is a negative deflection in the event-related potential after an incorrect response, which is often increased in patients with obsessive-compulsive disorder (OCD). However, the relation of the ERN to comorbid tic disorders has not been examined in patients with OCD. This study compared ERN amplitudes…
Microcircuit radiation effects databank
NASA Technical Reports Server (NTRS)
1983-01-01
Radiation test data submitted by many testers are collated to serve as a reference for engineers who are concerned with, and have some knowledge of, the effects of the natural radiation environment on microcircuits. Total dose damage information and single event upset cross sections, i.e., the probability of a soft error (bit flip) or of a hard error (latchup), are presented.
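For context, a single event upset cross section, the quantity tabulated in such databanks, is defined as the number of upsets observed divided by the particle fluence delivered during the test. A worked example with illustrative numbers (not databank entries):

```python
# Worked example of a single event upset (SEU) cross section.
# sigma = upsets observed / particle fluence. Numbers are illustrative.
n_upsets = 42
fluence = 1.0e7          # particles / cm^2 delivered during the test
sigma = n_upsets / fluence
print(f"device SEU cross section: {sigma:.2e} cm^2")  # 4.20e-06 cm^2
```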