Sample records for storing human error

  1. Analysis of measured data of human body based on error correcting frequency

    NASA Astrophysics Data System (ADS)

    Jin, Aiyan; Peipei, Gao; Shang, Xiaomei

    2014-04-01

    Anthropometry is the measurement of the surface of the human body; the measured data form the basis for analysis and study of the human body, for the establishment and modification of garment sizes, and for the design and operation of online clothing stores. In this paper, several groups of measured data are collected, and the data error is analyzed by examining the error frequency and applying the analysis-of-variance method from mathematical statistics. The paper also assesses the accuracy of the measured data and the difficulty of measuring the various parts of the human body, investigates the causes of data errors, and summarizes key points for minimizing them. By analyzing the measured data on the basis of error frequency, the paper provides reference material for promoting the development of the garment industry.

  2. Evaluation of lens distortion errors using an underwater camera system for video-based motion analysis

    NASA Technical Reports Server (NTRS)

    Poliner, Jeffrey; Fletcher, Lauren; Klute, Glenn K.

    1994-01-01

    Video-based motion analysis systems are widely employed to study human movement, using computers to capture, store, process, and analyze video data. These data can be collected in any environment where cameras can be located. One of the NASA facilities where human performance research is conducted is the Weightless Environment Training Facility (WETF), a pool of water which simulates zero-gravity with neutral buoyancy. Underwater video collection in the WETF poses some unique problems. This project evaluates the error caused by the lens distortion of the WETF cameras. A grid of points of known dimensions was constructed and videotaped using a video vault underwater system. Recorded images were played back on a VCR and a personal computer grabbed and stored the images on disk. These images were then digitized to give calculated coordinates for the grid points. Errors were calculated as the distance from the known coordinates of the points to the calculated coordinates. It was demonstrated that errors from lens distortion could be as high as 8 percent. By avoiding the outermost regions of a wide-angle lens, the error can be kept smaller.
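
    As a rough illustration of the error metric described above (grid coordinates and field-of-view size are hypothetical), each digitized point can be compared with its known position and the discrepancy expressed as a percentage of the field of view:

      import math

      def distortion_errors(known, measured, field_of_view):
          # Per-point error: Euclidean distance between the known grid
          # coordinates and the coordinates digitized from the video,
          # expressed as a percentage of the field-of-view size.
          return [100.0 * math.dist(k, m) / field_of_view
                  for k, m in zip(known, measured)]

      # Hypothetical 3-point grid in a 100-unit field of view; the corner
      # point shows the largest distortion, as is typical of wide-angle lenses.
      known    = [(10.0, 10.0), (50.0, 50.0), (90.0, 90.0)]
      measured = [(10.3, 9.8), (50.0, 50.1), (94.0, 95.0)]
      for pct in distortion_errors(known, measured, 100.0):
          print(f"error: {pct:.2f}% of field of view")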

  3. ALICE Expert System

    NASA Astrophysics Data System (ADS)

    Ionita, C.; Carena, F.

    2014-06-01

    The ALICE experiment at CERN employs a number of human operators (shifters), who have to make sure that the experiment is always in a state compatible with taking Physics data. Given the complexity of the system and the myriad of errors that can arise, this is not always a trivial task. The aim of this paper is to describe an expert system that is capable of assisting human shifters in the ALICE control room. The system diagnoses potential issues and attempts to make smart recommendations for troubleshooting. At its core, a Prolog engine infers whether a Physics or a technical run can be started based on the current state of the underlying sub-systems. A separate C++ component queries certain SMI objects and stores their state as facts in a Prolog knowledge base. By mining the data stored in different system logs, the expert system can also diagnose errors arising during a run. Currently the system is used by the on-call experts for faster response times, but we expect it to be adopted as a standard tool by regular shifters during the next data taking period.
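
    A minimal sketch, in Python rather than Prolog, of the kind of readiness rule such an engine evaluates; the sub-system names and states are illustrative, not the actual ALICE SMI objects:

      # Facts mirrored from the sub-systems (illustrative names/states).
      facts = {"DAQ": "READY", "TRIGGER": "READY", "HV": "RAMPING"}

      REQUIRED_FOR_PHYSICS = ("DAQ", "TRIGGER", "HV")

      def physics_run_possible(facts):
          # Rule: a Physics run can start only if every required
          # sub-system reports READY.
          return all(facts.get(s) == "READY" for s in REQUIRED_FOR_PHYSICS)

      blocking = [s for s in REQUIRED_FOR_PHYSICS if facts.get(s) != "READY"]
      print("run possible:", physics_run_possible(facts), "| blocked by:", blocking)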

  4. Development of object permanence in food-storing magpies (Pica pica).

    PubMed

    Pollok, B; Prior, H; Güntürkün, O

    2000-06-01

    The development of object permanence was investigated in black-billed magpies (Pica pica), a food-storing passerine bird. The authors tested the hypothesis that food-storing development should be correlated with object-permanence development and that specific stages of object permanence should be achieved before magpies become independent. As predicted, Piagetian Stages 4 and 5 were reached before independence was achieved, and the ability to represent a fully hidden object (Piagetian Stage 4) emerged by the age when magpies begin to retrieve food. Contrary to psittacine birds and humans, but as in dogs and cats, no "A-not-B error" occurred. Although magpies also mastered 5 of 6 invisible displacement tasks, evidence of Piagetian Stage 6 competence was ambiguous.

  5. Adaptive control system for pulsed megawatt klystrons

    DOEpatents

    Bolie, Victor W.

    1992-01-01

    The invention provides an arrangement for reducing waveform errors, such as errors in phase or amplitude, in output pulses produced by pulsed power output devices such as klystrons. An error voltage representing the extent of error still present in the trailing edge of the previous output pulse is generated and used to provide a stored control voltage; the stored control voltage is then applied to the pulsed power output device to limit the extent of error in the leading edge of the next output pulse.
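
    A toy sketch of the pulse-to-pulse correction loop the patent describes; the gain and the error sequence are invented for illustration:

      def next_control_voltage(stored_v, trailing_edge_error, gain=0.5):
          # Fold a fraction of the previous pulse's residual error into
          # the stored correction applied to the next pulse's leading edge.
          return stored_v - gain * trailing_edge_error

      stored_v = 0.0
      for pulse_error in [1.0, 0.5, 0.25, 0.125]:  # residuals shrink as the loop converges
          stored_v = next_control_voltage(stored_v, pulse_error)
          print(f"stored control voltage: {stored_v:+.3f}")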

  6. Method and apparatus for faulty memory utilization

    DOEpatents

    Cher, Chen-Yong; Andrade Costa, Carlos H.; Park, Yoonho; Rosenburg, Bryan S.; Ryu, Kyung D.

    2016-04-19

    A method for faulty memory utilization in a memory system includes: obtaining information regarding memory health status of at least one memory page in the memory system; determining an error tolerance of the memory page when the information regarding memory health status indicates that a failure is predicted to occur in an area of the memory system affecting the memory page; initiating a migration of data stored in the memory page when it is determined that the data stored in the memory page is non-error-tolerant; notifying at least one application regarding a predicted operating system failure and/or a predicted application failure when it is determined that data stored in the memory page is non-error-tolerant and cannot be migrated; and notifying at least one application regarding the memory failure predicted to occur when it is determined that data stored in the memory page is error-tolerant.
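
    A compact sketch of the claimed decision flow; every helper name here (predicted_failure, error_tolerant, and so on) is hypothetical:

      def handle_page(page, predicted_failure, error_tolerant, can_migrate,
                      migrate, notify):
          # Classify the page's data, then migrate it or notify the
          # affected application(s), following the claim's branches.
          if not predicted_failure(page):
              return "no action"
          if error_tolerant(page):
              notify(page, "memory failure predicted; data is error-tolerant")
              return "notified"
          if can_migrate(page):
              migrate(page)
              return "migrated"
          notify(page, "predicted OS/application failure; data cannot be migrated")
          return "notified-critical"

      # Example run with stubbed-in helpers.
      print(handle_page("page-7",
                        predicted_failure=lambda p: True,
                        error_tolerant=lambda p: False,
                        can_migrate=lambda p: True,
                        migrate=lambda p: print("migrating", p),
                        notify=print))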

  7. Error Tolerant Plan Recognition: An Empirical Investigation

    DTIC Science & Technology

    2015-05-01

    structure can differ drastically in semantics. For instance, a plan to travel to a grocery store to buy milk might coincidentally be structurally...algorithm for its ability to tolerate input errors, and that storing and leveraging state information in its plan representation substantially...proposed a novel representation for storing and organizing plans in a plan library, based on action-state pairs and abstract states. It counts the

  8. Reading Ground Water Levels with a Smartphone

    NASA Astrophysics Data System (ADS)

    van Overloop, Peter-Jules

    2015-04-01

    Most ground water levels in the world are measured manually. Employees of water management organizations must visit sites in the field and execute a measurement procedure that requires special tools and training. Once the measurement is done, the value is jotted down in a notebook and later, at the office, entered into a computer system. This procedure is slow and prone to human errors. A new development is the introduction of modern Information and Communication Technology to support this task and make it more efficient. Two innovations are introduced to measure and immediately store ground water levels. The first method combines a measuring tape, which emits a sound and a light the moment it touches the water, with a smartphone app that takes a picture of the tape. Using dedicated pattern-recognition algorithms, the depth is read from the tape and it is verified that the light is on. The second method estimates the depth by emitting a sound from the smartphone into the borehole and recording the waves reflected in the pipe. Both methods use GPS localization of the smartphone to store the depths at the right location in the central database, making the monitoring of ground water levels a real-time process that eliminates human errors.
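
    For the acoustic method, the depth estimate reduces to a time-of-flight calculation; a minimal sketch, assuming sound travels at roughly 343 m/s in the air column of the pipe:

      def depth_from_echo(round_trip_seconds, speed_of_sound=343.0):
          # One-way distance: the pulse travels down the borehole and back.
          return speed_of_sound * round_trip_seconds / 2.0

      print(f"{depth_from_echo(0.035):.2f} m")  # a 35 ms echo is about 6 m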

  9. Error Tracking System

    EPA Pesticide Factsheets

    Error Tracking System is a database used to store & track error notifications sent by users of EPA's web site. ETS is managed by OIC/OEI. OECA's ECHO & OEI's Envirofacts use it, and error notifications submitted from the Contact Us link on EPA's home page also feed into it.

  10. Structured inspection of medications carried and stored by emergency medical services agencies identifies practices that may lead to medication errors.

    PubMed

    Kupas, Douglas F; Shayhorn, Meghan A; Green, Paul; Payton, Thomas F

    2012-01-01

    Medications are essential to emergency medical services (EMS) agencies when providing lifesaving care, but the EMS environment has challenges related to safe medication storage when compared with a hospital setting. We developed a structured process, based on common pharmacy practices, to review medications carried by EMS agencies, to identify situations that may lead to medication error, and to determine some best practices that may reduce potential errors and the risk of patient harm. The objective was to provide a descriptive account of EMS practices related to carrying and storing medications that have the potential for causing a medication administration error or patient harm. Using a structured process for inspection, an emergency medicine pharmacist and emergency physician(s) reviewed the medication carrying and storage practices of all nine advanced life support ambulance agencies within a five-county EMS region. Each medication carried and stored by the EMS agency was inspected for predetermined and spontaneously observed issues that could lead to medication error. These issues were documented and photographed. Two EMS medical directors reviewed each potential error for the risk of producing patient harm and assigned each to a category of high, moderate, or low risk. Because issues of temperature on EMS medications have been addressed elsewhere, this study concentrated on the potential for EMS medication administration errors exclusive of storage temperatures. When reviewing medications carried by the nine EMS agencies, 38 medication safety issues were identified (range 1 to 8 per EMS agency). Of these, 16 were considered to be high risk, 14 moderate risk, and eight low risk for patient harm. Examples of potential issues included carrying expired medications, container-labeling issues, different medications stored in look-alike vials or prefilled syringes in the same compartment, and carrying crystalloid solutions next to solutions premixed with a medication. When reviewing medications stored at the EMS agency stations, eight safety issues were identified (range 0 to 4 per station), including five moderate-risk and three low-risk issues. No agency had any high-risk medication issues related to storage of medication stock in the station. We observed potential medication safety issues related to how medications are carried and stored at all nine EMS agencies in a five-county region. Understanding these issues may assist EMS agencies in reducing the potential for a medication error and risk of patient harm. More research is needed to determine whether following these suggested best practices for carrying medications on EMS vehicles actually reduces errors in medication administration by EMS providers or decreases patient harm.

  11. Obstacle Detection in Indoor Environment for Visually Impaired Using Mobile Camera

    NASA Astrophysics Data System (ADS)

    Rahman, Samiur; Ullah, Sana; Ullah, Sehat

    2018-01-01

    Obstacle detection can improve the mobility as well as the safety of visually impaired people. In this paper, we present a system using a mobile camera for visually impaired people. The proposed algorithm works in indoor environments and uses a very simple technique based on a few pre-stored floor images. All unique floor types in the indoor environment are considered, and a single image is stored for each unique floor type. These floor images serve as reference images. The algorithm acquires an input image frame, selects a region of interest, and scans it for obstacles using the pre-stored floor images. The algorithm compares the present frame with the next frame and computes the mean squared error (MSE) of the two frames. If the MSE is less than a threshold value α, there is no obstacle in the next frame. If the MSE is greater than α, there are two possibilities: either there is an obstacle or the floor type has changed. To check whether the floor has changed, the algorithm computes the MSE between the next frame and all stored floor types. If the minimum of these MSE values is less than α, the floor has changed; otherwise there is an obstacle. The proposed algorithm works in real time, and 96% accuracy has been achieved.
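
    A compact sketch of this decision rule, with invented frame data and threshold:

      def mse(a, b):
          return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

      def classify(prev_frame, next_frame, floor_refs, alpha):
          if mse(prev_frame, next_frame) < alpha:
              return "no obstacle"
          # Large change: either a new floor type or an obstacle.
          if min(mse(next_frame, ref) for ref in floor_refs) < alpha:
              return "floor changed"
          return "obstacle"

      floor_refs = [[0.2] * 16, [0.8] * 16]      # one reference per floor type
      print(classify([0.2] * 16, [0.6] * 16, floor_refs, alpha=0.01))  # obstacle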

  12. Methods and apparatus using commutative error detection values for fault isolation in multiple node computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Almasi, Gheorghe; Blumrich, Matthias Augustin; Chen, Dong

    Methods and apparatus perform fault isolation in multiple node computing systems using commutative error detection values--for example, checksums--to identify and to isolate faulty nodes. When information associated with a reproducible portion of a computer program is injected into a network by a node, a commutative error detection value is calculated. At intervals, node fault detection apparatus associated with the multiple node computer system retrieves commutative error detection values associated with the node and stores them in memory. When the computer program is executed again by the multiple node computer system, new commutative error detection values are created and stored in memory. The node fault detection apparatus identifies faulty nodes by comparing commutative error detection values associated with reproducible portions of the application program generated by a particular node from different runs of the application program. Differences in values indicate a possible faulty node.
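
    A toy sketch of the comparison step, using an additive per-node checksum as the commutative error detection value (node data and run results are invented):

      def additive_checksum(packets, modulus=2**32):
          # Addition is commutative, so packet arrival order cannot
          # change the value for a reproducible program section.
          return sum(packets) % modulus

      run1 = {0: additive_checksum([10, 20, 30]), 1: additive_checksum([7, 7])}
      run2 = {0: additive_checksum([30, 20, 10]), 1: additive_checksum([7, 8])}
      suspect = [n for n in run1 if run1[n] != run2[n]]
      print("possibly faulty nodes:", suspect)   # node 1 differs between runs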

  13. Signature-based store checking buffer

    DOEpatents

    Sridharan, Vilas; Gurumurthi, Sudhanva

    2015-06-02

    A system and method for optimizing redundant output verification are provided. A hardware-based store fingerprint buffer receives multiple instances of output from multiple instances of computation. The store fingerprint buffer generates a signature from the content included in the multiple instances of output. When a barrier is reached, the store fingerprint buffer uses the signature to verify that the content is error-free.
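
    A software sketch of the idea, using a cryptographic hash as a stand-in for the hardware fingerprint (the actual signature function is not specified in the abstract):

      import hashlib

      def signature(stores):
          # Fold every (address, value) store into a single digest, so only
          # signatures, not full outputs, need comparing at the barrier.
          h = hashlib.sha256()
          for addr, value in stores:
              h.update(f"{addr}:{value};".encode())
          return h.hexdigest()

      instance_a = [(0x1000, 42), (0x1008, 7)]   # stores from redundant copy A
      instance_b = [(0x1000, 42), (0x1008, 7)]   # stores from redundant copy B
      print("error-free" if signature(instance_a) == signature(instance_b)
            else "divergence detected")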

  14. Tirilazad mesylate protects stored erythrocytes against osmotic fragility.

    PubMed

    Epps, D E; Knechtel, T J; Bacznskyj, O; Decker, D; Guido, D M; Buxser, S E; Mathews, W R; Buffenbarger, S L; Lutzke, B S; McCall, J M

    1994-12-01

    The hypoosmotic lysis curve of freshly collected human erythrocytes is consistent with a single Gaussian error function with a mean of 46.5 +/- 0.25 mM NaCl and a standard deviation of 5.0 +/- 0.4 mM NaCl. After extended storage of RBCs under standard blood bank conditions, the lysis curve conforms to the sum of two error functions rather than to a single error function with a shifted mean and broadened width. Thus, two distinct sub-populations with different fragilities are present instead of a single, broadly distributed population. One population is identical to the freshly collected erythrocytes, whereas the other population consists of osmotically fragile cells. The rate of generation of the new, osmotically fragile population of cells was used to probe the hypothesis that lipid peroxidation is responsible for the induction of membrane fragility. If so, the antioxidant tirilazad mesylate (U-74,006f) should protect against this degradation of stored erythrocytes. We found that tirilazad mesylate, at 17 microM (1.5 mol% with respect to membrane lecithin), significantly retards the formation of the osmotically fragile RBCs. Concomitantly, the concentration of free hemoglobin which accumulates during storage is markedly reduced by the drug. Since the presence of the drug also decreases the amount of F2-isoprostanes formed during the storage period, an antioxidant mechanism must be operative. These results demonstrate that tirilazad mesylate significantly decreases the number of fragile erythrocytes formed during storage in the blood bank.
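
    The two-population model implied above is a weighted sum of two cumulative Gaussians; a minimal sketch, with the fragile-population parameters invented for illustration:

      from math import erf, sqrt

      def cum_gauss(x, mu, sd):
          # Fraction of one population lysed at NaCl concentration x
          # (lysis increases as the solution becomes more hypoosmotic).
          return 0.5 * (1.0 + erf((mu - x) / (sd * sqrt(2.0))))

      def lysis_fraction(nacl_mm, frac_fragile, mu_fragile=60.0, sd_fragile=5.0):
          normal = cum_gauss(nacl_mm, 46.5, 5.0)   # parameters from the abstract
          fragile = cum_gauss(nacl_mm, mu_fragile, sd_fragile)  # invented values
          return (1.0 - frac_fragile) * normal + frac_fragile * fragile

      for c in (30.0, 46.5, 60.0, 80.0):
          print(f"{c:5.1f} mM NaCl -> lysed fraction {lysis_fraction(c, 0.2):.3f}")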

  15. Memory and the Moses illusion: failures to detect contradictions with stored knowledge yield negative memorial consequences.

    PubMed

    Bottoms, Hayden C; Eslick, Andrea N; Marsh, Elizabeth J

    2010-08-01

    Although contradictions with stored knowledge are common in daily life, people often fail to notice them. For example, in the Moses illusion, participants fail to notice errors in questions such as "How many animals of each kind did Moses take on the Ark?" despite later showing knowledge that the Biblical reference is to Noah, not Moses. We examined whether error prevalence affected participants' ability to detect distortions in questions, and whether this in turn had memorial consequences. Many of the errors were overlooked, but participants were better able to catch them when they were more common. More generally, the failure to detect errors had negative memorial consequences, increasing the likelihood that the errors were used to answer later general knowledge questions. Methodological implications of this finding are discussed, as it suggests that typical analyses likely underestimate the size of the Moses illusion. Overall, answering distorted questions can yield errors in the knowledge base; most importantly, prior knowledge does not protect against these negative memorial consequences.

  16. eCOMPAGT integrates mtDNA: import, validation and export of mitochondrial DNA profiles for population genetics, tumour dynamics and genotype-phenotype association studies.

    PubMed

    Weissensteiner, Hansi; Schönherr, Sebastian; Specht, Günther; Kronenberg, Florian; Brandstätter, Anita

    2010-03-09

    Mitochondrial DNA (mtDNA) is widely used for population genetics, forensic DNA fingerprinting and clinical disease association studies. The recent past has uncovered severe problems with mtDNA genotyping, arising not only from the genotyping method itself, but mainly from the post-lab transcription, storage and reporting of mtDNA genotypes. eCOMPAGT, a system to store, administer and connect phenotype data to all kinds of genotype data, is now enhanced by the possibility of storing mtDNA profiles and allowing their validation, linking to phenotypes and export in numerous formats. mtDNA profiles can be imported from different sequence evaluation programs, compared between evaluations, and their haplogroup affiliations stored. Furthermore, eCOMPAGT has been improved in its transparency (support of MySQL and Oracle), its security aspects (by using database technology) and the option to import, manage and store genotypes derived from various genotyping methods (SNPlex, TaqMan, and STRs). It is an all-in-one software solution designed for project management, laboratory work and the evaluation process. The extended mtDNA version of eCOMPAGT was designed to enable error-free post-laboratory data handling of human mtDNA profiles. This software is suited for small to medium-sized human genetic, forensic and clinical genetic laboratories. The direct support of MySQL and the improved database security options render eCOMPAGT a powerful tool for building an automated workflow architecture for several genotyping methods. eCOMPAGT is freely available at http://dbis-informatik.uibk.ac.at/ecompagt.

  17. eCOMPAGT integrates mtDNA: import, validation and export of mitochondrial DNA profiles for population genetics, tumour dynamics and genotype-phenotype association studies

    PubMed Central

    2010-01-01

    Background Mitochondrial DNA (mtDNA) is widely used for population genetics, forensic DNA fingerprinting and clinical disease association studies. The recent past has uncovered severe problems with mtDNA genotyping, arising not only from the genotyping method itself, but mainly from the post-lab transcription, storage and reporting of mtDNA genotypes. Description eCOMPAGT, a system to store, administer and connect phenotype data to all kinds of genotype data, is now enhanced by the possibility of storing mtDNA profiles and allowing their validation, linking to phenotypes and export in numerous formats. mtDNA profiles can be imported from different sequence evaluation programs, compared between evaluations, and their haplogroup affiliations stored. Furthermore, eCOMPAGT has been improved in its transparency (support of MySQL and Oracle), its security aspects (by using database technology) and the option to import, manage and store genotypes derived from various genotyping methods (SNPlex, TaqMan, and STRs). It is an all-in-one software solution designed for project management, laboratory work and the evaluation process. Conclusions The extended mtDNA version of eCOMPAGT was designed to enable error-free post-laboratory data handling of human mtDNA profiles. This software is suited for small to medium-sized human genetic, forensic and clinical genetic laboratories. The direct support of MySQL and the improved database security options render eCOMPAGT a powerful tool for building an automated workflow architecture for several genotyping methods. eCOMPAGT is freely available at http://dbis-informatik.uibk.ac.at/ecompagt. PMID:20214782

  18. Orthogonal patterns in binary neural networks

    NASA Technical Reports Server (NTRS)

    Baram, Yoram

    1988-01-01

    A binary neural network that stores only mutually orthogonal patterns is shown to converge, when probed by any pattern, to a pattern in the memory space, i.e., the space spanned by the stored patterns. The latter are shown to be the only members of the memory space under a certain coding condition, which allows maximum storage of M = (2N)^0.5 patterns, where N is the number of neurons. The stored patterns are shown to have basins of attraction of radius N/(2M), within which errors are corrected with probability 1 in a single update cycle. When the probe falls outside these regions, the error correction capability can still be increased to 1 by repeatedly running the network with the same probe.

  19. Coagulation Function of Stored Whole Blood is Preserved for 14 Days in Austere Conditions: A ROTEM Feasibility Study During a Norwegian Antipiracy Mission and Comparison to Equal Ratio Reconstituted Blood

    DTIC Science & Technology

    2015-06-24

    mechanical piston movements measured by the ROTEM device. Error messages were recorded in 4 (1.5%) of 267 tests. CWB yielded reproducible ROTEM results... piston movement analysis, error message frequency, and result variability and (2) compare the clotting properties of cold-stored WB obtained from a walking...signed the selection form, which tracked TTD screening and blood grouping results. That same form doubled as a transfusion form and was used to

  20. Human error in hospitals and industrial accidents: current concepts.

    PubMed

    Spencer, F C

    2000-10-01

    Most data concerning errors and accidents are from industrial accidents and airline injuries. General Electric, Alcoa, and Motorola, among others, all have reported complex programs that resulted in a marked reduction in the frequency of worker injuries. In the field of medicine, however, with the outstanding exception of anesthesiology, there is a paucity of information, most reports referring to the 1984 Harvard-New York State Study, more than 16 years ago. This scarcity of information indicates the complexity of the problem. It seems very unlikely that simple exhortation or additional regulations will help because the problem lies principally in the multiple human-machine interfaces that constitute modern medical care. The absence of success stories also indicates that the best methods have to be learned by experience. A liaison with industry should be helpful, although the varieties of human illness are far different from a standardized manufacturing process. Concurrent with the studies of industrial and nuclear accidents, cognitive psychologists have intensively studied how the brain stores and retrieves information. Several concepts have emerged. First, errors are not character defects to be treated by the classic approach of discipline and education, but are byproducts of normal thinking that occur frequently. Second, major accidents are rarely caused by a single error; instead, they are often a combination of chronic system errors, termed latent errors. Identifying and correcting these latent errors should be the principal focus for corrective planning rather than searching for an individual culprit. This nonpunitive concept of errors is a key basis for an effective reporting system, brilliantly demonstrated in aviation with the ASRS system developed more than 25 years ago. The ASRS currently receives more than 30,000 reports annually and is credited with the remarkable increase in the safety of airplane travel. Adverse drug events constitute about 25% of hospital errors. In the future, the combination of new drugs and a vast amount of new information will additionally increase the possibilities for error. Two major advances in recent years have been computerization and the active participation of the pharmacist in dispensing medications. Further investigation of hospital errors should concentrate primarily on latent system errors. Significant system changes will require broad staff participation throughout the hospital. This, in turn, should foster development of an institutional safety culture, rather than the popular attitude that patient safety responsibility is concentrated in the Quality Assurance-Risk Management division. Quality of service and patient safety are closely intertwined.

  1. Web-Based Information Management System for the Investigation, Reporting, and Analysis of Human Error in Naval Aviation Maintenance

    DTIC Science & Technology

    2001-09-01

    groupwork# table (rd int identity primary key, agrp varchar(150), bgrp varchar(150), ard int, brd int) -- declare @trans# table (rd int identity, agg varchar(300...' */ -- Store field and associated fields within it. insert @groupwork#(agrp,bgrp,ard,brd) select a.grp,b.grp,a.rd,b.rd --select a.rd as a_rd,cast(a.grp as...' end +'#rac.'+bgrp, @str1=agrp from @groupwork# where ard<@k /* don't want last field */ order by rd -- Fields set @mgrpsupdate5= ' case when #rac.cgrps

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Batista, Antonio J. N.; Santos, Bruno; Fernandes, Ana

    The data acquisition and control instrumentation cubicles room of the ITER tokamak will be irradiated with neutrons during fusion reactor operation. A Virtex-6 FPGA from Xilinx (XC6VLX365T-1FFG1156C) is used on the ATCA-IO-PROCESSOR board, included in the ITER Catalog of I&C products - Fast Controllers. The Virtex-6 is a re-programmable logic device where the configuration is stored in Static RAM (SRAM), functional data are stored in dedicated Block RAM (BRAM) and functional state logic in Flip-Flops. Single Event Upsets (SEU) due to the ionizing radiation of neutrons cause soft errors: unintended changes (bit-flips) to the values stored in state elements of the FPGA. SEU monitoring and, where possible, soft-error repair were explored in this work. An FPGA built-in Soft Error Mitigation (SEM) controller detects and corrects soft errors in the FPGA configuration memory. Novel SEU sensors with Error Correction Code (ECC) detect and repair the BRAM memories. Proper management of SEU can increase the reliability and availability of control instrumentation hardware for nuclear applications. The results of the tests performed using the SEM controller and the BRAM SEU sensors are presented for a Virtex-6 FPGA (XC6VLX240T-1FFG1156C) irradiated with neutrons from the Portuguese Research Reactor (RPI), a 1 MW nuclear fission reactor operated by IST in the neighborhood of Lisbon. Results show that the proposed SEU mitigation technique is able to repair the majority of the detected SEU errors in the configuration and BRAM memories. (authors)
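
    As a stand-in for the kind of ECC the BRAM SEU sensors rely on (the FPGA's actual code format is not given in the abstract), a Hamming(7,4) round trip shows how a single bit-flip is detected and repaired:

      def hamming74_encode(d):
          # d: 4 data bits; parity bits go at positions 1, 2 and 4.
          d1, d2, d3, d4 = d
          p1 = d1 ^ d2 ^ d4
          p2 = d1 ^ d3 ^ d4
          p3 = d2 ^ d3 ^ d4
          return [p1, p2, d1, p3, d2, d3, d4]

      def hamming74_correct(c):
          s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # parity check over positions 1,3,5,7
          s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # over positions 2,3,6,7
          s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # over positions 4,5,6,7
          syndrome = s1 + 2 * s2 + 4 * s3  # 0 means no error detected
          if syndrome:
              c[syndrome - 1] ^= 1         # flip the erroneous bit back
          return c

      word = hamming74_encode([1, 0, 1, 1])
      word[4] ^= 1                         # simulate an SEU bit-flip
      print(hamming74_correct(word) == hamming74_encode([1, 0, 1, 1]))  # True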

  3. Processor register error correction management

    DOEpatents

    Bose, Pradip; Cher, Chen-Yong; Gupta, Meeta S.

    2016-12-27

    Processor register protection management is disclosed. In embodiments, a method of processor register protection management can include determining a sensitive logical register for executable code generated by a compiler, generating an error-correction table identifying the sensitive logical register, and storing the error-correction table in a memory accessible by a processor. The processor can be configured to generate a duplicate register of the sensitive logical register identified by the error-correction table.

  4. Oxylipins, endocannabinoids, and related compounds in human milk: Levels and effects of storage conditions.

    PubMed

    Wu, Junfang; Gouveia-Figueira, Sandra; Domellöf, Magnus; Zivkovic, Angela M; Nording, Malin L

    2016-01-01

    The presence of fatty acid derived oxylipins, endocannabinoids and related compounds in human milk may be of importance to the infant. Presently, clinically relevant protocols for storing and handling human milk that minimize error and variability in oxylipin and endocannabinoid concentrations are lacking. In this study, we compared the individual and combined effects of the following storage conditions on the stability of these fatty acid metabolites in human milk: state (fresh or frozen), storage temperature (4 °C, -20 °C or -80 °C), and duration (1 day, 1 week or 3 months). Thirteen endocannabinoids and related compounds, as well as 37 oxylipins, were analyzed simultaneously by liquid chromatography coupled to tandem mass spectrometry. Twelve endocannabinoids and related compounds (2-111 nM) and 31 oxylipins (1.2 pM-1242 nM) were detected, with the highest levels being found for 2-arachidonoylglycerol and 17(R)-hydroxydocosahexaenoic acid, respectively. The concentrations of most endocannabinoid-related compounds and oxylipins were dependent on storage condition, and especially storage at 4 °C introduced significant variability. Our findings suggest that human milk samples should be analyzed immediately after, or within one day of, collection (if stored at 4 °C). Storage at -80 °C is required for long-term preservation, and storage at -20 °C is acceptable for no more than one week. These findings provide a protocol for investigating the oxylipin and endocannabinoid metabolome in human milk, useful for future milk-related clinical studies.

  5. On the Maximum Storage Capacity of the Hopfield Model

    PubMed Central

    Folli, Viola; Leonetti, Marco; Ruocco, Giancarlo

    2017-01-01

    Recurrent neural networks (RNN) have traditionally been of great interest for their capacity to store memories. In past years, several works have been devoted to determine the maximum storage capacity of RNN, especially for the case of the Hopfield network, the most popular kind of RNN. Analyzing the thermodynamic limit of the statistical properties of the Hamiltonian corresponding to the Hopfield neural network, it has been shown in the literature that the retrieval errors diverge when the number of stored memory patterns (P) exceeds a fraction (≈ 14%) of the network size N. In this paper, we study the storage performance of a generalized Hopfield model, where the diagonal elements of the connection matrix are allowed to be different from zero. We investigate this model at finite N. We give an analytical expression for the number of retrieval errors and show that, by increasing the number of stored patterns over a certain threshold, the errors start to decrease and reach values below unity for P ≫ N. We demonstrate that the strongest trade-off between efficiency and effectiveness relies on the number of patterns (P) that are stored in the network by appropriately fixing the connection weights. When P ≫ N and the diagonal elements of the adjacency matrix are not forced to be zero, the optimal storage capacity is obtained with a number of stored memories much larger than previously reported. This theory paves the way to the design of RNN with high storage capacity and able to retrieve the desired pattern without distortions. PMID:28119595
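
    A small numerical sketch of Hopfield retrieval with Hebbian weights makes "retrieval error" concrete; sizes are illustrative, and the zero-diagonal constraint that the paper relaxes is the marked line:

      import numpy as np

      rng = np.random.default_rng(0)
      N, P = 100, 10                             # network size, stored patterns
      patterns = rng.choice([-1, 1], size=(P, N))
      W = patterns.T @ patterns / N              # Hebbian connection matrix
      np.fill_diagonal(W, 0.0)                   # classic model; the paper allows nonzero

      probe = patterns[0].copy()
      probe[:5] *= -1                            # corrupt 5 of 100 units
      state = np.where(W @ probe >= 0.0, 1, -1)  # one synchronous update
      print("retrieval errors:", int(np.sum(state != patterns[0])))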

  6. Cariogenic potential of stored human milk--an in-vitro study.

    PubMed

    Hegde, Amitha M; Vikyath, Rani

    2007-01-01

    Human milk samples collected from ten lactating mothers in the K. S. Hegde Medical Hospital, Mangalore were divided into five different parts and stored at different temperatures for varying durations. The pH, buffer capacity and growth of Streptococcus mutans were assessed in each of these samples. There was a fall in the pH of human milk stored at various temperatures. The buffer capacity of human milk increased with duration of storage. There was an increase in the Streptococcus colony count in stored human milk proportional to the duration of storage, and the count increased more rapidly in milk stored at the higher temperatures (0 °C to 4 °C) than in milk stored in the freezer (-19 °C). Milk samples stored at room temperature for 6 hours and in the freezer at -19 °C for 2 weeks were found to be relatively safe.

  7. Decodoku: Quantum error correction as a simple puzzle game

    NASA Astrophysics Data System (ADS)

    Wootton, James

    To build quantum computers, we need to detect and manage any noise that occurs. This will be done using quantum error correction. At the hardware level, QEC is a multipartite system that stores information non-locally. Certain measurements are made which do not disturb the stored information, but which do allow signatures of errors to be detected. Then there is a software problem: how to take these measurement outcomes and determine (a) the errors that caused them, and (b) how to remove their effects. For qubit error correction, the algorithms required to do this are well known. For qudits, however, current methods are far from optimal. We consider the error correction problem of qudit surface codes. At the most basic level, this is a problem that can be expressed in terms of a grid of numbers. Using this fact, we take the inherent problem at the heart of quantum error correction, remove it from its quantum context, and present it in terms of simple grid-based puzzle games. We have developed three versions of these puzzle games, focussing on different aspects of the required algorithms. These have been released as iOS and Android apps, allowing the public to try their hand at developing good algorithms to solve the puzzles. For more information, see www.decodoku.com. Funding from the NCCR QSIT.

  8. The Mathematics of Computer Error.

    ERIC Educational Resources Information Center

    Wood, Eric

    1988-01-01

    Why a computer error occurred is considered by analyzing the binary system and decimal fractions. How the computer stores numbers is then described. Knowledge of the mathematics behind computer operation is important if one wishes to understand and have confidence in the results of computer calculations. (MNS)
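
    The canonical instance of the error discussed: 0.1 has no finite binary expansion, so the value the computer stores is only an approximation:

      print(0.1 + 0.2 == 0.3)        # False: neither side is stored exactly
      print(f"{0.1 + 0.2:.20f}")     # 0.30000000000000004441

      from decimal import Decimal
      print(Decimal(0.1))            # the exact binary value actually stored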

  9. Random access in large-scale DNA data storage.

    PubMed

    Organick, Lee; Ang, Siena Dumas; Chen, Yuan-Jyue; Lopez, Randolph; Yekhanin, Sergey; Makarychev, Konstantin; Racz, Miklos Z; Kamath, Govinda; Gopalan, Parikshit; Nguyen, Bichlien; Takahashi, Christopher N; Newman, Sharon; Parker, Hsing-Yeh; Rashtchian, Cyrus; Stewart, Kendall; Gupta, Gagan; Carlson, Robert; Mulligan, John; Carmean, Douglas; Seelig, Georg; Ceze, Luis; Strauss, Karin

    2018-03-01

    Synthetic DNA is durable and can encode digital data with high density, making it an attractive medium for data storage. However, recovering stored data at large scale currently requires all the DNA in a pool to be sequenced, even if only a subset of the information needs to be extracted. Here, we encode and store 35 distinct files (over 200 MB of data), in more than 13 million DNA oligonucleotides, and show that we can recover each file individually and with no errors, using a random access approach. We design and validate a large library of primers that enable individual recovery of all files stored within the DNA. We also develop an algorithm that greatly reduces the sequencing read coverage required for error-free decoding by maximizing information from all sequence reads. These advances demonstrate a viable, large-scale system for DNA data storage and retrieval.

  10. Evaluation of lens distortion errors in video-based motion analysis

    NASA Technical Reports Server (NTRS)

    Poliner, Jeffrey; Wilmington, Robert; Klute, Glenn K.; Micocci, Angelo

    1993-01-01

    In an effort to study lens distortion errors, a grid of points of known dimensions was constructed and videotaped using a standard and a wide-angle lens. Recorded images were played back on a VCR and stored on a personal computer. Using these stored images, two experiments were conducted. Errors were calculated as the distance between the known coordinates of the points and the calculated coordinates. The purposes of this project were as follows: (1) to develop the methodology to evaluate errors introduced by lens distortion; (2) to quantify and compare errors introduced by use of both a 'standard' and a wide-angle lens; (3) to investigate techniques to minimize lens-induced errors; and (4) to determine the most effective use of calibration points when using a wide-angle lens with a significant amount of distortion. It was seen that when using a wide-angle lens, errors from lens distortion could be as high as 10 percent of the size of the entire field of view. Even with a standard lens, there was a small amount of lens distortion. It was also found that the choice of calibration points influenced the lens distortion error. By properly selecting the calibration points and avoiding the outermost regions of a wide-angle lens, the error from lens distortion can be kept below approximately 0.5 percent with a standard lens and 1.5 percent with a wide-angle lens.

  11. Error Characterization and Mitigation for 16 nm MLC NAND Flash Memory Under Total Ionizing Dose Effect

    NASA Technical Reports Server (NTRS)

    Li, Yue (Inventor); Bruck, Jehoshua (Inventor)

    2018-01-01

    A data device includes a memory having a plurality of memory cells configured to store data values in accordance with a predetermined rank modulation scheme that is optional and a memory controller that receives a current error count from an error decoder of the data device for one or more data operations of the flash memory device and selects an operating mode for data scrubbing in accordance with the received error count and a program cycles count.

  12. Correcting a fundamental error in greenhouse gas accounting related to bioenergy.

    PubMed

    Haberl, Helmut; Sprinz, Detlef; Bonazountas, Marc; Cocco, Pierluigi; Desaubies, Yves; Henze, Mogens; Hertel, Ole; Johnson, Richard K; Kastrup, Ulrike; Laconte, Pierre; Lange, Eckart; Novak, Peter; Paavola, Jouni; Reenberg, Anette; van den Hove, Sybille; Vermeire, Theo; Wadhams, Peter; Searchinger, Timothy

    2012-06-01

    Many international policies encourage a switch from fossil fuels to bioenergy based on the premise that its use would not result in carbon accumulation in the atmosphere. Frequently cited bioenergy goals would at least double the present global human use of plant material, the production of which already requires the dedication of roughly 75% of vegetated lands and more than 70% of water withdrawals. However, burning biomass for energy provision increases the amount of carbon in the air just like burning coal, oil or gas if harvesting the biomass decreases the amount of carbon stored in plants and soils, or reduces carbon sequestration. Neglecting this fact results in an accounting error that could be corrected by considering that only the use of 'additional biomass' - biomass from additional plant growth or biomass that would decompose rapidly if not used for bioenergy - can reduce carbon emissions. Failure to correct this accounting flaw will likely have substantial adverse consequences. The article presents recommendations for correcting greenhouse gas accounts related to bioenergy.

  13. A Cognitive Approach to Brailling Errors

    ERIC Educational Resources Information Center

    Wells-Jensen, Sheri; Schwartz, Aaron; Gosche, Bradley

    2007-01-01

    This article analyzes a corpus of 1,600 brailling errors made by one expert braillist. It presents a testable model of braille writing and shows that the subject braillist stores standard braille contractions as part of the orthographic representation of words, rather than imposing contractions on a serially ordered string of letters. (Contains 1…

  14. Round-off error in long-term orbital integrations using multistep methods

    NASA Technical Reports Server (NTRS)

    Quinlan, Gerald D.

    1994-01-01

    Techniques for reducing roundoff error are compared by testing them on high-order Störmer and symmetric multistep methods. The best technique for most applications is to write the equation in summed, function-evaluation form and to store the coefficients as rational numbers. A larger error reduction can be achieved by writing the equation in backward-difference form and performing some of the additions in extended precision, but this entails a larger central processing unit (CPU) cost.
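
    Compensated (Kahan) summation illustrates the same principle as the extended-precision additions mentioned above, though it is not the paper's exact scheme:

      def naive_sum(values):
          total = 0.0
          for v in values:
              total += v                 # low-order bits can be lost here
          return total

      def kahan_sum(values):
          total, comp = 0.0, 0.0
          for v in values:
              y = v - comp               # apply the stored correction
              t = total + y
              comp = (t - total) - y     # recover the bits just lost
              total = t
          return total

      values = [1e16] + [1.0] * 1000 + [-1e16]     # true sum is 1000.0
      print(naive_sum(values), kahan_sum(values))  # 0.0 versus 1000.0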

  15. An Evaluation Of Holograms In Training And As Job Performance Aids

    NASA Astrophysics Data System (ADS)

    Frey, Allan H.

    1986-08-01

    Experimentation was carried out to evaluate holograms for use in training and as job aids. Holograms were compared against line drawings and photographs as methods of presenting visual information needed to accomplish a number of tasks. The dependent variables were assembly speed and assembly errors with people unstressed, assembly speed and assembly errors with people stressed, the percentage of discovered errors in assemblies, the number of correct assemblies misidentified as erroneous, and information extraction. Holograms generally were as good as or better visual aids than either photographs or line drawings. The use of holograms tends to reduce errors rather than speed assembly time in the assembly tasks used in these experiments. They also enhance the discovery of errors when the subject is attempting to locate assembly errors in a construction. The results of this experimentation suggest that serious consideration should be given to the use of holography in the development of job aids and in training. Besides these advantages for job aids, other advantages we found are that when page-formatted information is stored in man-readable holograms they are still usable when scratched or damaged, even when similarly damaged microfilm is unusable. Holography can also be used to store man- and machine-readable data simultaneously. Such storage would provide simplified backup in the event of machine failure, and it would permit the development of compatible machine and manual systems for job aid applications.

  16. Replacing Voice Input with Technology that Provided Immediate Visual and Audio Feedback to Reduce Employee Errors

    ERIC Educational Resources Information Center

    Goomas, David T.

    2010-01-01

    In this report from the field at two auto parts distribution centers, order selectors picked auto accessories (e.g., fuses, oil caps, tool kits) into industrial plastic totes as part of store orders. Accurately identifying all store order totes via the license plate number was a prerequisite for the warehouse management system (WMS) to track each…

  17. Fine-grained, local maps and coarse, global representations support human spatial working memory.

    PubMed

    Katshu, Mohammad Zia Ul Haq; d'Avossa, Giovanni

    2014-01-01

    While sensory processes are tuned to particular features, such as an object's specific location, color or orientation, visual working memory (vWM) is assumed to store information using representations, which generalize over a feature dimension. Additionally, current vWM models presume that different features or objects are stored independently. On the other hand, configurational effects, when observed, are supposed to mainly reflect encoding strategies. We show that the location of the target, relative to the display center and boundaries, and overall memory load influenced recall precision, indicating that, like sensory processes, capacity limited vWM resources are spatially tuned. When recalling one of three memory items the target distance from the display center was overestimated, similar to the error when only one item was memorized, but its distance from the memory items' average position was underestimated, showing that not only individual memory items' position, but also the global configuration of the memory array may be stored. Finally, presenting the non-target items at recall, consequently providing landmarks and configurational information, improved precision and accuracy of target recall. Similarly, when the non-target items were translated at recall, relative to their position in the initial display, a parallel displacement of the recalled target was observed. These findings suggest that fine-grained spatial information in vWM is represented in local maps whose resolution varies with distance from landmarks, such as the display center, while coarse representations are used to store the memory array configuration. Both these representations are updated at the time of recall.

  18. Fine-Grained, Local Maps and Coarse, Global Representations Support Human Spatial Working Memory

    PubMed Central

    Katshu, Mohammad Zia Ul Haq; d'Avossa, Giovanni

    2014-01-01

    While sensory processes are tuned to particular features, such as an object's specific location, color or orientation, visual working memory (vWM) is assumed to store information using representations, which generalize over a feature dimension. Additionally, current vWM models presume that different features or objects are stored independently. On the other hand, configurational effects, when observed, are supposed to mainly reflect encoding strategies. We show that the location of the target, relative to the display center and boundaries, and overall memory load influenced recall precision, indicating that, like sensory processes, capacity limited vWM resources are spatially tuned. When recalling one of three memory items the target distance from the display center was overestimated, similar to the error when only one item was memorized, but its distance from the memory items' average position was underestimated, showing that not only individual memory items' position, but also the global configuration of the memory array may be stored. Finally, presenting the non-target items at recall, consequently providing landmarks and configurational information, improved precision and accuracy of target recall. Similarly, when the non-target items were translated at recall, relative to their position in the initial display, a parallel displacement of the recalled target was observed. These findings suggest that fine-grained spatial information in vWM is represented in local maps whose resolution varies with distance from landmarks, such as the display center, while coarse representations are used to store the memory array configuration. Both these representations are updated at the time of recall. PMID:25259601

  19. Random and independent sampling of endogenous tryptic peptides from normal human EDTA plasma by liquid chromatography micro electrospray ionization and tandem mass spectrometry.

    PubMed

    Dufresne, Jaimie; Florentinus-Mefailoski, Angelique; Ajambo, Juliet; Ferwa, Ammara; Bowden, Peter; Marshall, John

    2017-01-01

    Normal human EDTA plasma samples were collected on ice, processed ice cold, and stored in a freezer at -80 °C prior to experiments. Plasma test samples from the -80 °C freezer were thawed on ice or intentionally warmed to room temperature. Protein content was measured by CBBR binding and the release of alcohol-soluble amines by the Cd ninhydrin assay. Plasma peptides released over time were collected over C18 for random and independent sampling by liquid chromatography micro electrospray ionization and tandem mass spectrometry (LC-ESI-MS/MS) and correlated with X!TANDEM. Fully tryptic correlation by X!TANDEM returned a similar set of proteins to "no enzyme" correlations but was more computationally efficient. Plasma samples maintained on ice, or on ice with a cocktail of protease inhibitors, showed lower background amounts of plasma peptides compared to samples incubated at room temperature. Regression analysis indicated that warming plasma to room temperature, versus keeping it ice cold, resulted in a ~twofold increase in the frequency of peptide identification over hours to days of incubation at room temperature. The type I error rate of the protein identification from the X!TANDEM algorithm was estimated to be low compared to a null model of computer-generated random MS/MS spectra. The peptides of human plasma were identified and quantified with low error rates by random and independent sampling that revealed thousands of peptides from hundreds of human plasma proteins from endogenous tryptic peptides.

  20. Creating Illusions of Knowledge: Learning Errors that Contradict Prior Knowledge

    ERIC Educational Resources Information Center

    Fazio, Lisa K.; Barber, Sarah J.; Rajaram, Suparna; Ornstein, Peter A.; Marsh, Elizabeth J.

    2013-01-01

    Most people know that the Pacific is the largest ocean on Earth and that Edison invented the light bulb. Our question is whether this knowledge is stable, or if people will incorporate errors into their knowledge bases, even if they have the correct knowledge stored in memory. To test this, we asked participants general-knowledge questions 2 weeks…

  1. Real-time video analysis for retail stores

    NASA Astrophysics Data System (ADS)

    Hassan, Ehtesham; Maurya, Avinash K.

    2015-03-01

    With the advancement in video processing technologies, we can capture subtle human responses in a retail store environment which play a decisive role in store management. In this paper, we present a novel surveillance-video-based analytic system for retail stores targeting localized and global traffic estimates. Developing an intelligent system for human traffic estimation in real-life settings poses a challenging problem because of the variation and noise involved. In this direction, we begin with a novel human tracking system built on an intelligent combination of motion-based and image-level object detection. We demonstrate the initial evaluation of this approach on an available standard dataset, yielding promising results. Exact traffic estimation in a retail store requires correct separation of customers from service providers. We present a role-based human classification framework using a Gaussian mixture model for this task. A novel feature descriptor named graded colour histogram is defined for object representation. Using our role-based human classification and tracking system, we have defined a novel computationally efficient framework for generating two types of analytics, i.e., region-specific people count and dwell-time estimation. This system has been extensively evaluated and tested on four hours of real-life video captured from a retail store.
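
    The role-separation step might look like the following sketch, with an invented three-bin colour feature standing in for the paper's graded colour histogram:

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(1)
      # Invented features: staff uniforms cluster tightly around one
      # colour profile, customer clothing is far more spread out.
      staff     = rng.normal([0.8, 0.1, 0.1], 0.05, size=(50, 3))
      customers = rng.normal([0.3, 0.4, 0.3], 0.15, size=(200, 3))

      gmm = GaussianMixture(n_components=2, random_state=0)
      gmm.fit(np.vstack([staff, customers]))
      # Two unseen detections; they should fall into different components.
      print(gmm.predict([[0.8, 0.1, 0.1], [0.2, 0.5, 0.3]]))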

  2. What Information is Stored in DNA: Does it Contain Digital Error Correcting Codes?

    NASA Astrophysics Data System (ADS)

    Liebovitch, Larry

    1998-03-01

    The longest term correlations in living systems are the information stored in DNA, which reflects the evolutionary history of an organism. The 4 bases (A,T,G,C) encode sequences of amino acids as well as locations of binding sites for proteins that regulate DNA. The fidelity of this important information is maintained by ANALOG error check mechanisms. When a single strand of DNA is replicated, the complementary base is inserted in the new strand. Sometimes the wrong base is inserted and sticks out, disrupting the phosphate backbone. The new base is not yet methylated, so repair enzymes, which slide along the DNA, can tear out the wrong base and replace it with the right one. The bases in DNA form a sequence of 4 different symbols, and so the information is encoded in a DIGITAL form. All the digital codes in our society (ISBN book numbers, UPC product codes, bank account numbers, airline ticket numbers) use error checking codes, where some digits are functions of other digits to maintain the fidelity of transmitted information. Does DNA also utilize a DIGITAL error checking code to maintain the fidelity of its information and increase the accuracy of replication? That is, are some bases in DNA functions of other bases upstream or downstream? This raises the interesting mathematical problem: how does one determine whether some symbols in a sequence of symbols are a function of other symbols? It also bears on the issue of determining algorithmic complexity: what is the function that generates the shortest algorithm for reproducing the symbol sequence? The error checking codes most used in our technology are linear block codes. We developed an efficient method to test for the presence of such codes in DNA. We coded the 4 bases as (0,1,2,3) and used Gaussian elimination, modified for modulus 4, to test if some bases are linear combinations of other bases. We used this method to analyze the base sequence in the genes from the lac operon and cytochrome C. We did not find evidence for such error correcting codes in these genes. However, we analyzed only a small amount of DNA, and if digital error correcting schemes are present in DNA, they may be more subtle than such simple linear block codes. The basic issue we raise here is how information is stored in DNA, and an appreciation that digital symbol sequences, such as DNA, admit of interesting schemes to store and protect the fidelity of their information content. Liebovitch, Tao, Todorov, Levine. 1996. Biophys. J. 71:1539-1544. Supported by NIH grant EY6234.
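
    A brute-force miniature of this test, with toy sequences constructed so that one position is a linear combination of the others (the actual analysis used Gaussian elimination modified for modulus 4; exhaustive search is shown only because it is easy to verify):

      from itertools import product

      BASE = {"A": 0, "T": 1, "G": 2, "C": 3}

      def find_mod4_relation(seqs, target_idx):
          # Is the base at target_idx a fixed linear combination (mod 4)
          # of the other positions, consistently across all sequences?
          others = [i for i in range(len(seqs[0])) if i != target_idx]
          for coeffs in product(range(4), repeat=len(others)):
              if all(sum(c * BASE[s[i]] for c, i in zip(coeffs, others)) % 4
                     == BASE[s[target_idx]] for s in seqs):
                  return dict(zip(others, coeffs))
          return None

      # Toy sequences built so position 2 = (pos0 + 2*pos1) mod 4.
      seqs = ["ATG", "TTC", "GAG", "CAC"]
      print(find_mod4_relation(seqs, target_idx=2))   # {0: 1, 1: 2}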

  3. Array coding for large data memories

    NASA Technical Reports Server (NTRS)

    Tranter, W. H.

    1982-01-01

    It is pointed out that an array code is a convenient method for storing large quantities of data. In a typical application, the array consists of N data words having M symbols in each word. The probability of undetected error is considered, taking into account the three symbol error probabilities of interest, and a formula for determining this probability is given. Attention is given to the possibility of reading data into the array using a digital communication system with symbol error probability p. Two different schemes are found to be of interest. The conducted analysis of array coding shows that the probability of undetected error is very small even for relatively large arrays.

  4. Accounting for stimulus-specific variation in precision reveals a discrete capacity limit in visual working memory

    PubMed Central

    Pratte, Michael S.; Park, Young Eun; Rademaker, Rosanne L.; Tong, Frank

    2016-01-01

    If we view a visual scene that contains many objects, then momentarily close our eyes, some details persist while others seem to fade. Discrete models of visual working memory (VWM) assume that only a few items can be actively maintained in memory, beyond which pure guessing will emerge. Alternatively, continuous resource models assume that all items in a visual scene can be stored with some precision. Distinguishing between these competing models is challenging, however, as resource models that allow for stochastically variable precision (across items and trials) can produce error distributions that resemble random guessing behavior. Here, we evaluated the hypothesis that a major source of variability in VWM performance arises from systematic variation in precision across the stimuli themselves; such stimulus-specific variability can be incorporated into both discrete-capacity and variable-precision resource models. Participants viewed multiple oriented gratings, and then reported the orientation of a cued grating from memory. When modeling the overall distribution of VWM errors, we found that the variable-precision resource model outperformed the discrete model. However, VWM errors revealed a pronounced “oblique effect”, with larger errors for oblique than cardinal orientations. After this source of variability was incorporated into both models, we found that the discrete model provided a better account of VWM errors. Our results demonstrate that variable precision across the stimulus space can lead to an unwarranted advantage for resource models that assume stochastically variable precision. When these deterministic sources are adequately modeled, human working memory performance reveals evidence of a discrete capacity limit. PMID:28004957

  5. Accounting for stimulus-specific variation in precision reveals a discrete capacity limit in visual working memory.

    PubMed

    Pratte, Michael S; Park, Young Eun; Rademaker, Rosanne L; Tong, Frank

    2017-01-01

    If we view a visual scene that contains many objects, then momentarily close our eyes, some details persist while others seem to fade. Discrete models of visual working memory (VWM) assume that only a few items can be actively maintained in memory, beyond which pure guessing will emerge. Alternatively, continuous resource models assume that all items in a visual scene can be stored with some precision. Distinguishing between these competing models is challenging, however, as resource models that allow for stochastically variable precision (across items and trials) can produce error distributions that resemble random guessing behavior. Here, we evaluated the hypothesis that a major source of variability in VWM performance arises from systematic variation in precision across the stimuli themselves; such stimulus-specific variability can be incorporated into both discrete-capacity and variable-precision resource models. Participants viewed multiple oriented gratings, and then reported the orientation of a cued grating from memory. When modeling the overall distribution of VWM errors, we found that the variable-precision resource model outperformed the discrete model. However, VWM errors revealed a pronounced "oblique effect," with larger errors for oblique than cardinal orientations. After this source of variability was incorporated into both models, we found that the discrete model provided a better account of VWM errors. Our results demonstrate that variable precision across the stimulus space can lead to an unwarranted advantage for resource models that assume stochastically variable precision. When these deterministic sources are adequately modeled, human working memory performance reveals evidence of a discrete capacity limit. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
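
    The modeling approach in these two records can be sketched with the standard mixture model from this literature: responses are a mixture of uniform random guesses and a von Mises distribution centred on the target, with the guess rate and concentration fitted by maximum likelihood. This is a simplification; the papers' models additionally let precision vary across stimuli. A minimal sketch on simulated data, with illustrative parameters:

    ```python
    import numpy as np
    from scipy.special import i0  # modified Bessel function of the first kind
    from scipy.optimize import minimize

    def neg_log_likelihood(params, errors):
        """Mixture of uniform guessing and a von Mises centred on the target.
        errors: response - target, in radians on (-pi, pi].
        params: g = guess rate, kappa = concentration (precision)."""
        g, kappa = params
        vm = np.exp(kappa * np.cos(errors)) / (2 * np.pi * i0(kappa))
        uniform = 1.0 / (2 * np.pi)
        return -np.sum(np.log((1 - g) * vm + g * uniform))

    # Simulate one condition: 70% remembered (kappa = 8), 30% guesses.
    rng = np.random.default_rng(1)
    n = 500
    remembered = rng.vonmises(0.0, 8.0, size=n)
    guesses = rng.uniform(-np.pi, np.pi, size=n)
    errors = np.where(rng.random(n) < 0.3, guesses, remembered)

    fit = minimize(neg_log_likelihood, x0=[0.1, 4.0], args=(errors,),
                   bounds=[(1e-3, 0.999), (0.1, 100.0)])
    g_hat, kappa_hat = fit.x
    print(f"guess rate ~ {g_hat:.2f}, concentration ~ {kappa_hat:.1f}")
    ```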

  6. Human Error: A Concept Analysis

    NASA Technical Reports Server (NTRS)

    Hansen, Frederick D.

    2007-01-01

    Human error is the subject of research in almost every industry and profession of our times. This term is part of our daily language and intuitively understood by most people; however, it would be premature to assume that everyone's understanding of human error is the same. For example, human error is used to describe the outcome or consequence of human action, the causal factor of an accident, deliberate violations, and the actual action taken by a human being. As a result, researchers rarely agree on either a specific definition or how to prevent human error. The purpose of this article is to explore the specific concept of human error using Concept Analysis as described by Walker and Avant (1995). The concept of human error is examined as currently used in the literature of a variety of industries and professions. Defining attributes and examples of model, borderline, and contrary cases are described. The antecedents and consequences of human error are also discussed, and a definition of human error is offered.

  7. Preliminary study of injection transients in the TPS storage ring

    NASA Astrophysics Data System (ADS)

    Chen, C. H.; Liu, Y. C.; Y Chen, J.; Chiu, M. S.; Tseng, F. H.; Fann, S.; Liang, C. C.; Huang, C. S.; Y Lee, T.; Y Chen, B.; Tsai, H. J.; Luo, G. H.; Kuo, C. C.

    2017-07-01

    Optimal injection efficiency requires a precise match between the pulsed magnetic fields in the storage ring and the transfer-line extraction in the TPS. However, misalignment errors, hardware output errors, and leakage fields are unavoidable. We study the influence of injection transients on the stored TPS beam and discuss solutions to compensate for them. Related simulations and measurements will be presented.

  8. Naval Ship Database: Database Design, Implementation, and Schema

    DTIC Science & Technology

    2013-09-01

    incoming data. The solution allows database users to store and analyze data collected by navy ships in the Royal Canadian Navy (RCN). The data...understanding RCN jargon and common practices on a typical RCN vessel. This experience led to the development of several error detection methods to...data to be stored in the database. Mr. Massel has also collected data pertaining to day-to-day activities on RCN vessels that has been imported into

  9. Correcting a fundamental error in greenhouse gas accounting related to bioenergy

    PubMed Central

    Haberl, Helmut; Sprinz, Detlef; Bonazountas, Marc; Cocco, Pierluigi; Desaubies, Yves; Henze, Mogens; Hertel, Ole; Johnson, Richard K.; Kastrup, Ulrike; Laconte, Pierre; Lange, Eckart; Novak, Peter; Paavola, Jouni; Reenberg, Anette; van den Hove, Sybille; Vermeire, Theo; Wadhams, Peter; Searchinger, Timothy

    2012-01-01

    Many international policies encourage a switch from fossil fuels to bioenergy based on the premise that its use would not result in carbon accumulation in the atmosphere. Frequently cited bioenergy goals would at least double the present global human use of plant material, the production of which already requires the dedication of roughly 75% of vegetated lands and more than 70% of water withdrawals. However, burning biomass for energy provision increases the amount of carbon in the air just like burning coal, oil or gas if harvesting the biomass decreases the amount of carbon stored in plants and soils, or reduces carbon sequestration. Neglecting this fact results in an accounting error that could be corrected by considering that only the use of ‘additional biomass’ – biomass from additional plant growth or biomass that would decompose rapidly if not used for bioenergy – can reduce carbon emissions. Failure to correct this accounting flaw will likely have substantial adverse consequences. The article presents recommendations for correcting greenhouse gas accounts related to bioenergy. PMID:23576835

  10. Low delay and area efficient soft error correction in arbitration logic

    DOEpatents

    Sugawara, Yutaka

    2013-09-10

    There is provided an arbitration logic device for controlling access to a shared resource. The arbitration logic device comprises at least one storage element, a winner selection logic device, and an error detection logic device. The storage element stores a plurality of requestors' information. The winner selection logic device selects a winner requestor among the requestors based on the requestors' information received from a plurality of requestors. To keep delay low, the winner selection logic device selects the winner requestor without first checking whether there is a soft error in the winner requestor's information.

  11. Parallel pulse processing and data acquisition for high speed, low error flow cytometry

    DOEpatents

    van den Engh, Gerrit J.; Stokdijk, Willem

    1992-01-01

    A digitally synchronized parallel pulse processing and data acquisition system for a flow cytometer has multiple parallel input channels with independent pulse digitization and FIFO storage buffer. A trigger circuit controls the pulse digitization on all channels. After an event has been stored in each FIFO, a bus controller moves the oldest entry from each FIFO buffer onto a common data bus. The trigger circuit generates an ID number for each FIFO entry, which is checked by an error detection circuit. The system has high speed and low error rate.

  12. Optical storage media data integrity studies

    NASA Technical Reports Server (NTRS)

    Podio, Fernando L.

    1994-01-01

    Optical disk-based information systems are being used in private industry and many Federal Government agencies for on-line and long-term storage of large quantities of data. The storage devices that are part of these systems are designed with powerful, but not unlimited, media error correction capabilities. The integrity of data stored on optical disks does not depend only on the life expectancy specifications for the medium. Different factors, including handling and storage conditions, may result in an increase in the size and frequency of medium errors. Monitoring the potential data degradation is crucial, especially for long-term applications. Efforts are being made by the Association for Information and Image Management Technical Committee C21, Storage Devices and Applications, to specify methods for monitoring and reporting to the user medium errors detected by the storage device while writing, reading, or verifying the data stored in that medium. The Computer Systems Laboratory (CSL) of the National Institute of Standards and Technology (NIST) has a leadership role in the development of these standard techniques. In addition, CSL is researching other data integrity issues, including the investigation of error-resilient compression algorithms. NIST has conducted care and handling experiments on optical disk media with the objective of identifying possible causes of degradation. NIST work in data integrity and related standards activities is described.

  13. Generalization in Adaptation to Stable and Unstable Dynamics

    PubMed Central

    Kadiallah, Abdelhamid; Franklin, David W.; Burdet, Etienne

    2012-01-01

    Humans skillfully manipulate objects and tools despite the inherent instability. In order to succeed at these tasks, the sensorimotor control system must build an internal representation of both the force and mechanical impedance. As it is not practical to either learn or store motor commands for every possible future action, the sensorimotor control system generalizes a control strategy for a range of movements based on learning performed over a set of movements. Here, we introduce a computational model for this learning and generalization, which specifies how to learn feedforward muscle activity as a function of the state space. Specifically, by incorporating co-activation as a function of error into the feedback command, we are able to derive an algorithm from a gradient descent minimization of motion error and effort, subject to maintaining a stability margin. This algorithm can be used to learn to coordinate any of a variety of motor primitives such as force fields, muscle synergies, physical models or artificial neural networks. This model for human learning and generalization is able to adapt to both stable and unstable dynamics, and provides a controller for generating efficient adaptive motor behavior in robots. Simulation results exhibit predictions consistent with all experiments on learning of novel dynamics requiring adaptation of force and impedance, and enable us to re-examine some of the previous interpretations of experiments on generalization. PMID:23056191
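
    A single-degree-of-freedom caricature of the learning rule described above, not the paper's full muscle-space model: the feedforward command is updated by a signed error term, an error-magnitude (co-activation) term, and a slow effort-related decay. All gains and the disturbance below are illustrative assumptions.

    ```python
    import numpy as np

    def adapt_feedforward(n_trials=60, a=0.8, b=0.4, c=0.05, disturbance=5.0, seed=0):
        """Caricature of trial-by-trial adaptation of a feedforward command u.
        Per trial: e = disturbance - u (signed motion error), then
          u <- u + a*e            (error correction, shifts the command)
          u <- u + b*abs(e)       (co-activation, grows with error magnitude)
          u <- (1 - c) * u        (effort cost, slow decay)
        u converges near the disturbance while the co-activation contribution
        shrinks as errors shrink, echoing adaptation to unstable dynamics."""
        rng = np.random.default_rng(seed)
        u, trace = 0.0, []
        for _ in range(n_trials):
            e = disturbance - u + rng.normal(0, 0.2)
            u = (1 - c) * (u + a * e + b * abs(e))
            trace.append((e, u))
        return np.array(trace)

    trace = adapt_feedforward()
    print("final error %.3f, final command %.3f" % (trace[-1, 0], trace[-1, 1]))
    ```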

  14. The compression–error trade-off for large gridded data sets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Silver, Jeremy D.; Zender, Charles S.

    The netCDF-4 format is widely used for large gridded scientific data sets and includes several compression methods: lossy linear scaling and the non-lossy deflate and shuffle algorithms. Many multidimensional geoscientific data sets exhibit considerable variation over one or several spatial dimensions (e.g., vertically) with less variation in the remaining dimensions (e.g., horizontally). On such data sets, linear scaling with a single pair of scale and offset parameters often entails considerable loss of precision. We introduce an alternative compression method called "layer-packing" that simultaneously exploits lossy linear scaling and lossless compression. Layer-packing stores arrays (instead of a scalar pair) of scale and offset parameters. An implementation of this method is compared with lossless compression, storing data at fixed relative precision (bit-grooming) and scalar linear packing in terms of compression ratio, accuracy and speed. When viewed as a trade-off between compression and error, layer-packing yields similar results to bit-grooming (storing between 3 and 4 significant figures). Bit-grooming and layer-packing offer significantly better control of precision than scalar linear packing. Relative performance, in terms of compression and errors, of bit-groomed and layer-packed data were strongly predicted by the entropy of the exponent array, and lossless compression was well predicted by entropy of the original data array. Layer-packed data files must be "unpacked" to be readily usable. The compression and precision characteristics make layer-packing a competitive archive format for many scientific data sets.

  15. The compression–error trade-off for large gridded data sets

    DOE PAGES

    Silver, Jeremy D.; Zender, Charles S.

    2017-01-27

    The netCDF-4 format is widely used for large gridded scientific data sets and includes several compression methods: lossy linear scaling and the non-lossy deflate and shuffle algorithms. Many multidimensional geoscientific data sets exhibit considerable variation over one or several spatial dimensions (e.g., vertically) with less variation in the remaining dimensions (e.g., horizontally). On such data sets, linear scaling with a single pair of scale and offset parameters often entails considerable loss of precision. We introduce an alternative compression method called "layer-packing" that simultaneously exploits lossy linear scaling and lossless compression. Layer-packing stores arrays (instead of a scalar pair) of scale and offset parameters. An implementation of this method is compared with lossless compression, storing data at fixed relative precision (bit-grooming) and scalar linear packing in terms of compression ratio, accuracy and speed. When viewed as a trade-off between compression and error, layer-packing yields similar results to bit-grooming (storing between 3 and 4 significant figures). Bit-grooming and layer-packing offer significantly better control of precision than scalar linear packing. Relative performance, in terms of compression and errors, of bit-groomed and layer-packed data were strongly predicted by the entropy of the exponent array, and lossless compression was well predicted by entropy of the original data array. Layer-packed data files must be "unpacked" to be readily usable. The compression and precision characteristics make layer-packing a competitive archive format for many scientific data sets.
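
    A minimal sketch of the layer-packing idea with NumPy, under the simplifying assumption of int16 linear packing with one scale/offset pair per vertical layer (the actual netCDF implementation details differ):

    ```python
    import numpy as np

    def layer_pack(data, axis=0):
        """Pack a float array into int16 with one (scale, offset) pair per
        layer along `axis`, instead of a single scalar pair for the whole
        array. Returns packed integers plus per-layer scales and offsets."""
        data = np.moveaxis(data, axis, 0)
        reduce_axes = tuple(range(1, data.ndim))
        lo = data.min(axis=reduce_axes, keepdims=True)
        hi = data.max(axis=reduce_axes, keepdims=True)
        scale = (hi - lo) / 65534.0      # map each layer onto the int16 range
        scale[scale == 0] = 1.0          # guard against constant layers
        packed = np.round((data - lo) / scale - 32767.0).astype(np.int16)
        return packed, scale, lo

    def layer_unpack(packed, scale, lo, axis=0):
        data = (packed.astype(np.float64) + 32767.0) * scale + lo
        return np.moveaxis(data, 0, axis)

    # Vertical profile spanning several orders of magnitude across layers.
    profile = np.exp(np.linspace(0, 10, 20))[:, None, None]
    field = profile * (1 + 0.01 * np.random.default_rng(0).random((20, 64, 64)))
    packed, scale, lo = layer_pack(field, axis=0)
    err = np.abs(layer_unpack(packed, scale, lo) - field) / field
    print(f"max relative error ~ {err.max():.2e}")
    ```

    Because each layer gets its own scale, a field spanning many orders of magnitude vertically keeps a small relative error everywhere, which a single scalar scale/offset pair cannot achieve.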

  16. An emerging network storage management standard: Media error monitoring and reporting information (MEMRI) - to determine optical tape data integrity

    NASA Technical Reports Server (NTRS)

    Podio, Fernando; Vollrath, William; Williams, Joel; Kobler, Ben; Crouse, Don

    1998-01-01

    Sophisticated network storage management applications are rapidly evolving to satisfy a market demand for highly reliable data storage systems with large data storage capacities and performance requirements. To preserve a high degree of data integrity, these applications must rely on intelligent data storage devices that can provide reliable indicators of data degradation. Error correction activity generally occurs within storage devices without notification to the host. Early indicators of degradation and media error monitoring and reporting (MEMR) techniques implemented in data storage devices allow network storage management applications to notify system administrators of these events and to take appropriate corrective actions before catastrophic errors occur. Although MEMR techniques have been implemented in data storage devices for many years, until 1996 no MEMR standards existed. In 1996 the American National Standards Institute (ANSI) approved the only known (world-wide) industry standard specifying MEMR techniques to verify stored data on optical disks. This industry standard was developed under the auspices of the Association for Information and Image Management (AIIM). A recently formed AIIM Optical Tape Subcommittee initiated the development of another data integrity standard specifying a set of media error monitoring tools and media error monitoring information (MEMRI) to verify stored data on optical tape media. This paper discusses the need for intelligent storage devices that can provide data integrity metadata, the content of the existing data integrity standard for optical disks, and the content of the MEMRI standard being developed by the AIIM Optical Tape Subcommittee.

  17. Creating illusions of knowledge: learning errors that contradict prior knowledge.

    PubMed

    Fazio, Lisa K; Barber, Sarah J; Rajaram, Suparna; Ornstein, Peter A; Marsh, Elizabeth J

    2013-02-01

    Most people know that the Pacific is the largest ocean on Earth and that Edison invented the light bulb. Our question is whether this knowledge is stable, or if people will incorporate errors into their knowledge bases, even if they have the correct knowledge stored in memory. To test this, we asked participants general-knowledge questions 2 weeks before they read stories that contained errors (e.g., "Franklin invented the light bulb"). On a later general-knowledge test, participants reproduced story errors despite previously answering the questions correctly. This misinformation effect was found even for questions that were answered correctly on the initial test with the highest level of confidence. Furthermore, prior knowledge offered no protection against errors entering the knowledge base; the misinformation effect was equivalent for previously known and unknown facts. Errors can enter the knowledge base even when learners have the knowledge necessary to catch the errors. (c) 2013 APA, all rights reserved.

  18. Graph Learning in Knowledge Bases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldberg, Sean; Wang, Daisy Zhe

    The amount of text data has been growing exponentially in recent years, giving rise to automatic information extraction methods that store text annotations in a database. The current state-of-the-art structured prediction methods, however, are likely to contain errors, and it is important to be able to manage the overall uncertainty of the database. On the other hand, the advent of crowdsourcing has enabled humans to aid machine algorithms at scale. As part of this project we introduced pi-CASTLE, a system that optimizes and integrates human and machine computing as applied to a complex structured prediction problem involving conditional random fields (CRFs). We proposed strategies grounded in information theory to select a token subset, formulate questions for the crowd to label, and integrate these labelings back into the database using a method of constrained inference. On both a text segmentation task over academic citations and a named entity recognition task over tweets we showed an order of magnitude improvement in accuracy gain over baseline methods.
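
    The information-theoretic selection step can be sketched as ranking tokens by the entropy of the model's per-token label marginals and sending the most uncertain ones to the crowd. The label set and marginals below are hypothetical; pi-CASTLE's actual selection and constrained-inference machinery is more involved.

    ```python
    import numpy as np

    def select_tokens_for_crowd(marginals, k=3):
        """Rank tokens by the Shannon entropy of the model's per-token label
        marginals (e.g., from a CRF) and return the k most uncertain ones.
        marginals: array of shape (n_tokens, n_labels), rows sum to 1."""
        p = np.clip(marginals, 1e-12, 1.0)
        entropy = -(p * np.log2(p)).sum(axis=1)
        return np.argsort(entropy)[::-1][:k], entropy

    # Toy marginals over labels (TITLE, AUTHOR, VENUE) for five citation tokens.
    marginals = np.array([
        [0.98, 0.01, 0.01],   # confident
        [0.40, 0.35, 0.25],   # ambiguous -> good crowd candidate
        [0.90, 0.05, 0.05],
        [0.34, 0.33, 0.33],   # maximally ambiguous
        [0.70, 0.20, 0.10],
    ])
    picked, H = select_tokens_for_crowd(marginals, k=2)
    print("ask the crowd about tokens:", picked)  # highest-entropy indices: [3 1]
    ```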

  19. Human Error In Complex Systems

    NASA Technical Reports Server (NTRS)

    Morris, Nancy M.; Rouse, William B.

    1991-01-01

    Report presents results of research aimed at understanding causes of human error in such complex systems as aircraft, nuclear powerplants, and chemical processing plants. Research considered both slips (errors of action) and mistakes (errors of intention), and the influence of workload on them. Results indicated that humans respond to conditions in which errors are expected by attempting to reduce the incidence of errors, and that adaptation to conditions is a potent influence on human behavior in discretionary situations.

  20. Time trend of injection drug errors before and after implementation of bar-code verification system.

    PubMed

    Sakushima, Ken; Umeki, Reona; Endoh, Akira; Ito, Yoichi M; Nasuhara, Yasuyuki

    2015-01-01

    Bar-code technology, used for verification of patients and their medication, could prevent medication errors in clinical practice. Retrospective analysis of electronically stored medical error reports was conducted in a university hospital. The number of reported medication errors of injected drugs, including wrong drug administration and administration to the wrong patient, was compared before and after implementation of the bar-code verification system for inpatient care. A total of 2867 error reports associated with injection drugs were extracted. Wrong patient errors decreased significantly after implementation of the bar-code verification system (17.4/year vs. 4.5/year, p < 0.05), although wrong drug errors did not decrease sufficiently (24.2/year vs. 20.3/year). The source of medication errors due to wrong drugs was drug preparation in hospital wards. Bar-code medication administration is effective for prevention of wrong patient errors. However, ordinary bar-code verification systems are limited in their ability to prevent incorrect drug preparation in hospital wards.

  1. Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS)

    NASA Technical Reports Server (NTRS)

    Alexander, Tiffaney Miller

    2017-01-01

    Research results have shown that more than half of aviation, aerospace and aeronautics mishaps/incidents are attributed to human error. As a part of Safety within space exploration ground processing operations, the underlying contributors and causes of human error must be identified and classified in order to manage human error. This research provides a framework and methodology using the Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS) as an analysis tool to identify contributing factors, their impact on human error events, and predict the Human Error Probabilities (HEPs) of future occurrences. This research methodology was applied (retrospectively) to six (6) NASA ground processing operations scenarios and thirty (30) years of Launch Vehicle related mishap data. This modifiable framework can be used and followed by other space and similar complex operations.

  2. Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS)

    NASA Technical Reports Server (NTRS)

    Alexander, Tiffaney Miller

    2017-01-01

    Research results have shown that more than half of aviation, aerospace and aeronautics mishaps/incidents are attributed to human error. As a part of Safety within space exploration ground processing operations, the underlying contributors and causes of human error must be identified and classified in order to manage human error. This research provides a framework and methodology using the Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS) as an analysis tool to identify contributing factors, their impact on human error events, and predict the Human Error Probabilities (HEPs) of future occurrences. This research methodology was applied (retrospectively) to six (6) NASA ground processing operations scenarios and thirty (30) years of Launch Vehicle related mishap data. This modifiable framework can be used and followed by other space and similar complex operations.

  3. Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS)

    NASA Technical Reports Server (NTRS)

    Alexander, Tiffaney Miller

    2017-01-01

    Research results have shown that more than half of aviation, aerospace and aeronautics mishaps/incidents are attributed to human error. As a part of Quality within space exploration ground processing operations, the underlying contributors and causes of human error must be identified and classified in order to manage human error. This presentation will provide a framework and methodology using the Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS) as an analysis tool to identify contributing factors, their impact on human error events, and predict the Human Error Probabilities (HEPs) of future occurrences. This research methodology was applied (retrospectively) to six (6) NASA ground processing operations scenarios and thirty (30) years of Launch Vehicle related mishap data. This modifiable framework can be used and followed by other space and similar complex operations.

  4. Understanding human management of automation errors

    PubMed Central

    McBride, Sara E.; Rogers, Wendy A.; Fisk, Arthur D.

    2013-01-01

    Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance. PMID:25383042

  5. Understanding human management of automation errors.

    PubMed

    McBride, Sara E; Rogers, Wendy A; Fisk, Arthur D

    2014-01-01

    Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance.

  6. Parallel pulse processing and data acquisition for high speed, low error flow cytometry

    DOEpatents

    Engh, G.J. van den; Stokdijk, W.

    1992-09-22

    A digitally synchronized parallel pulse processing and data acquisition system for a flow cytometer has multiple parallel input channels with independent pulse digitization and FIFO storage buffer. A trigger circuit controls the pulse digitization on all channels. After an event has been stored in each FIFO, a bus controller moves the oldest entry from each FIFO buffer onto a common data bus. The trigger circuit generates an ID number for each FIFO entry, which is checked by an error detection circuit. The system has high speed and low error rate. 17 figs.

  7. Adaptive control for accelerators

    DOEpatents

    Eaton, Lawrie E.; Jachim, Stephen P.; Natter, Eckard F.

    1991-01-01

    An adaptive feedforward control loop is provided to stabilize accelerator beam loading of the radio frequency field in an accelerator cavity during successive pulses of the beam into the cavity. A digital signal processor enables an adaptive algorithm to generate a feedforward error correcting signal functionally determined by the feedback error obtained by a beam pulse loading the cavity after the previous correcting signal was applied to the cavity. Each cavity feedforward correcting signal is successively stored in the digital processor and modified by the feedback error resulting from its application to generate the next feedforward error correcting signal. A feedforward error correcting signal is generated by the digital processor in advance of the beam pulse to enable a composite correcting signal and the beam pulse to arrive concurrently at the cavity.
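
    The stored-correction update amounts to an iterative-learning-control loop: the feedforward correction for the next pulse is the stored correction plus a gain times the feedback error the last pulse produced. A toy sketch, with the 'cavity' reduced to a constant beam-loading droop plus noise; all names and parameters are illustrative.

    ```python
    import numpy as np

    def iterative_feedforward(n_pulses=25, gain=0.6, seed=0):
        """Sketch of the adaptive feedforward idea: the correction applied to
        pulse k+1 is the stored correction for pulse k plus a gain times the
        feedback error that pulse k produced. Converges because the
        beam-loading disturbance is repeatable from pulse to pulse."""
        rng = np.random.default_rng(seed)
        beam_loading = 1.0          # unknown repeatable disturbance per pulse
        u = 0.0                     # stored feedforward correction
        errors = []
        for _ in range(n_pulses):
            field_error = beam_loading - u + rng.normal(0, 0.02)
            u += gain * field_error  # update stored correction for next pulse
            errors.append(field_error)
        return errors

    errs = iterative_feedforward()
    print(f"pulse 1 error {errs[0]:+.3f} -> pulse 25 error {errs[-1]:+.3f}")
    ```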

  8. Human operator response to error-likely situations in complex engineering systems

    NASA Technical Reports Server (NTRS)

    Morris, Nancy M.; Rouse, William B.

    1988-01-01

    The causes of human error in complex systems are examined. First, a conceptual framework is provided in which two broad categories of error are discussed: errors of action, or slips, and errors of intention, or mistakes. Conditions in which slips and mistakes might be expected to occur are identified, based on existing theories of human error. Regarding the role of workload, it is hypothesized that workload may act as a catalyst for error. Two experiments are presented in which humans' response to error-likely situations was examined. Subjects controlled PLANT under a variety of conditions and periodically provided subjective ratings of mental effort. A complex pattern of results was obtained, which was not consistent with predictions. Generally, the results of this research indicate that: (1) humans respond to conditions in which errors might be expected by attempting to reduce the possibility of error, and (2) adaptation to conditions is a potent influence on human behavior in discretionary situations. Subjects' explanations for changes in effort ratings are also explored.

  9. 29 CFR 779.388 - Exemption provided for food or beverage service employees.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... bakery or grocery store who handle, prepare or sell food or beverages for human consumption since such... activities for such establishments as “drug stores, department stores, bowling alleys, and the like.” (S..., provided it qualifies as a “retail or service establishment,” may be a drug store, department store...

  10. 29 CFR 779.388 - Exemption provided for food or beverage service employees.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... bakery or grocery store who handle, prepare or sell food or beverages for human consumption since such... activities for such establishments as “drug stores, department stores, bowling alleys, and the like.” (S..., provided it qualifies as a “retail or service establishment,” may be a drug store, department store...

  11. 29 CFR 779.388 - Exemption provided for food or beverage service employees.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... bakery or grocery store who handle, prepare or sell food or beverages for human consumption since such... activities for such establishments as “drug stores, department stores, bowling alleys, and the like.” (S..., provided it qualifies as a “retail or service establishment,” may be a drug store, department store...

  12. 29 CFR 779.388 - Exemption provided for food or beverage service employees.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... bakery or grocery store who handle, prepare or sell food or beverages for human consumption since such... activities for such establishments as “drug stores, department stores, bowling alleys, and the like.” (S..., provided it qualifies as a “retail or service establishment,” may be a drug store, department store...

  13. Human error and human factors engineering in health care.

    PubMed

    Welch, D L

    1997-01-01

    Human error is inevitable. It happens in health care systems as it does in all other complex systems, and no measure of attention, training, dedication, or punishment is going to stop it. The discipline of human factors engineering (HFE) has been dealing with the causes and effects of human error since the 1940's. Originally applied to the design of increasingly complex military aircraft cockpits, HFE has since been effectively applied to the problem of human error in such diverse systems as nuclear power plants, NASA spacecraft, the process control industry, and computer software. Today the health care industry is becoming aware of the costs of human error and is turning to HFE for answers. Just as early experimental psychologists went beyond the label of "pilot error" to explain how the design of cockpits led to air crashes, today's HFE specialists are assisting the health care industry in identifying the causes of significant human errors in medicine and developing ways to eliminate or ameliorate them. This series of articles will explore the nature of human error and how HFE can be applied to reduce the likelihood of errors and mitigate their effects.

  14. Protein kinase C activates non-capacitative calcium entry in human platelets

    PubMed Central

    Rosado, Juan A; Sage, Stewart O

    2000-01-01

    In many non-excitable cells Ca2+ influx is mainly controlled by the filling state of the intracellular Ca2+ stores. It has been suggested that this store-mediated or capacitative Ca2+ entry is brought about by a physical and reversible coupling of the endoplasmic reticulum with the plasma membrane. Here we provide evidence for an additional, non-capacitative Ca2+ entry mechanism in human platelets. Changes in cytosolic Ca2+ and Sr2+ were measured in human platelets loaded with the fluorescent indicator fura-2. Depletion of the internal Ca2+ stores with thapsigargin plus a low concentration of ionomycin stimulated store-mediated cation entry, as demonstrated upon Ca2+ or Sr2+ addition. Subsequent treatment with thrombin stimulated further divalent cation entry in a concentration-dependent manner. Direct activation of protein kinase C (PKC) by phorbol-12-myristate-13-acetate or 1-oleoyl-2-acetyl-sn-glycerol also stimulated divalent cation entry, without evoking the release of Ca2+ from intracellular stores. Cation entry evoked by thrombin or activators of PKC was abolished by the PKC inhibitor Ro-31-8220. Unlike store-mediated Ca2+ entry, jasplakinolide, which reorganises actin filaments into a tight cortical layer adjacent to the plasma membrane, did not inhibit divalent cation influx evoked by thrombin when applied after Ca2+ store depletion, or by activators of PKC. Thrombin also activated Ca2+ entry in platelets in which the release from intracellular stores and store-mediated Ca2+ entry were blocked by xestospongin C. These results indicate that the non-capacitative divalent cation entry pathway is regulated independently of store-mediated entry and does not require coupling of the endoplasmic reticulum and the plasma membrane. These results support the existence of a mechanism for receptor-evoked Ca2+ entry in human platelets that is independent of Ca2+ store depletion. This Ca2+ entry mechanism may be activated by occupation of G-protein-coupled receptors, which activate PKC, or by direct activation of PKC, thus generating non-capacitative Ca2+ entry alongside that evoked following the release of Ca2+ from the intracellular stores. PMID:11080259

  15. Applications of integrated human error identification techniques on the chemical cylinder change task.

    PubMed

    Cheng, Ching-Min; Hwang, Sheue-Ling

    2015-03-01

    This paper outlines the human error identification (HEI) techniques that currently exist to assess latent human errors. Many formal error identification techniques have existed for years, but few have been validated to cover latent human error analysis in different domains. This study considers many possible error modes and influential factors, including external error modes, internal error modes, psychological error mechanisms, and performance shaping factors, and integrates several execution procedures and frameworks of HEI techniques. The case study in this research was the operational process of changing chemical cylinders in a factory. In addition, the integrated HEI method was used to assess the operational processes and the system's reliability. It was concluded that the integrated method is a valuable aid to develop much safer operational processes and can be used to predict human error rates on critical tasks in the plant. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  16. Retrieval Failure Contributes to Gist-Based False Recognition

    PubMed Central

    Guerin, Scott A.; Robbins, Clifford A.; Gilmore, Adrian W.; Schacter, Daniel L.

    2011-01-01

    People often falsely recognize items that are similar to previously encountered items. This robust memory error is referred to as gist-based false recognition. A widely held view is that this error occurs because the details fade rapidly from our memory. Contrary to this view, an initial experiment revealed that, following the same encoding conditions that produce high rates of gist-based false recognition, participants overwhelmingly chose the correct target rather than its related foil when given the option to do so. A second experiment showed that this result is due to increased access to stored details provided by reinstatement of the originally encoded photograph, rather than to increased attention to the details. Collectively, these results suggest that details needed for accurate recognition are, to a large extent, still stored in memory and that a critical factor determining whether false recognition will occur is whether these details can be accessed during retrieval. PMID:22125357

  17. Learning clinical reasoning.

    PubMed

    Pinnock, Ralph; Welch, Paul

    2014-04-01

    Errors in clinical reasoning continue to account for significant morbidity and mortality, despite evidence-based guidelines and improved technology. Experts in clinical reasoning often use unconscious cognitive processes that they are not aware of unless they explain how they are thinking. Understanding the intuitive and analytical thinking processes provides a guide for instruction. How knowledge is stored is critical to expertise in clinical reasoning. Curricula should be designed so that trainees store knowledge in a way that is clinically relevant. Competence in clinical reasoning is acquired by supervised practice with effective feedback. Clinicians must recognise the common errors in clinical reasoning and how to avoid them. Trainees can learn clinical reasoning effectively in everyday practice if teachers provide guidance on the cognitive processes involved in making diagnostic decisions. © 2013 The Authors. Journal of Paediatrics and Child Health © 2013 Paediatrics and Child Health Division (Royal Australasian College of Physicians).

  18. Capacity estimation and verification of quantum channels with arbitrarily correlated errors.

    PubMed

    Pfister, Corsin; Rol, M Adriaan; Mantri, Atul; Tomamichel, Marco; Wehner, Stephanie

    2018-01-02

    The central figure of merit for quantum memories and quantum communication devices is their capacity to store and transmit quantum information. Here, we present a protocol that estimates a lower bound on a channel's quantum capacity, even when there are arbitrarily correlated errors. One application of these protocols is to test the performance of quantum repeaters for transmitting quantum information. Our protocol is easy to implement and comes in two versions. The first estimates the one-shot quantum capacity by preparing and measuring in two different bases, where all involved qubits are used as test qubits. The second verifies on-the-fly that a channel's one-shot quantum capacity exceeds a minimal tolerated value while storing or communicating data. We discuss the performance using simple examples, such as the dephasing channel for which our method is asymptotically optimal. Finally, we apply our method to a superconducting qubit in experiment.
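
    For intuition, a standard hashing-style lower bound (not necessarily the exact estimator of this protocol) certifies roughly 1 - h(e_x) - h(e_z) qubits per channel use from error rates measured in two mutually unbiased bases, where h is the binary entropy. A minimal sketch under that assumption:

    ```python
    import math

    def binary_entropy(p):
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    def capacity_lower_bound(e_x, e_z):
        """Hashing-style lower bound on quantum capacity per qubit from error
        rates measured in two mutually unbiased bases (e.g., Z and X).
        Returns 0 when the bound goes negative (no rate is certified)."""
        return max(0.0, 1.0 - binary_entropy(e_x) - binary_entropy(e_z))

    # Dephasing-dominated memory: few bit flips, some phase flips.
    print(capacity_lower_bound(e_x=0.01, e_z=0.05))  # ~0.63 qubits per use
    ```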

  19. On the capacity of ternary Hebbian networks

    NASA Technical Reports Server (NTRS)

    Baram, Yoram

    1991-01-01

    Networks of ternary neurons storing random vectors over the set {-1, 0, 1} by the so-called Hebbian rule are considered. It is shown that the maximal number of stored patterns that are equilibrium states of the network, with probability tending to one as N tends to infinity, is at least on the order of N^(2 - 1/alpha)/K, where N is the number of neurons, K is the number of nonzero elements in a pattern, and t = alpha*K, with alpha between 1/2 and 1, is the threshold in the neuron function. While, for small K, this bound is similar to that obtained for fully connected binary networks, the number of interneural connections required in the ternary case is considerably smaller. Similar bounds, incorporating error probabilities, are shown to guarantee, in the same probabilistic sense, the correction of errors in the nonzero elements and in the location of these elements.
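
    A small simulation of the storage scheme just described, under illustrative parameters: sparse ternary patterns stored by the Hebbian outer-product rule and recalled by thresholding the local field at t = alpha*K.

    ```python
    import numpy as np

    def store_and_recall(N=400, P=10, K=20, alpha=0.7, seed=0):
        """Hebbian storage of sparse ternary patterns over {-1, 0, +1}.
        Each pattern has K nonzero entries; weights W = sum of outer products;
        recall thresholds the field at t = alpha * K, keeping the sign."""
        rng = np.random.default_rng(seed)
        patterns = np.zeros((P, N))
        for p in range(P):
            idx = rng.choice(N, size=K, replace=False)
            patterns[p, idx] = rng.choice([-1, 1], size=K)
        W = patterns.T @ patterns          # Hebbian outer-product rule
        np.fill_diagonal(W, 0)             # no self-connections
        t = alpha * K
        x = patterns[0]
        h = W @ x                          # local field for each neuron
        recalled = np.where(h > t, 1, np.where(h < -t, -1, 0))
        return (recalled == x).all()

    print("first stored pattern is a fixed point:", store_and_recall())
    ```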

  20. Integrated Power and Attitude Control for a Spacecraft with Flywheels and Control Moment Gyroscopes

    NASA Technical Reports Server (NTRS)

    Roithmayr, Carlos M.; Karlgaard, Christopher D.; Kumar, Renjith R.; Bose, David M.

    2003-01-01

    A law is designed for simultaneous control of the orientation of an Earth-pointing spacecraft, the energy stored by counter-rotating flywheels, and the angular momentum of the flywheels and control moment gyroscopes used together as an integrated set of actuators for attitude control. General nonlinear equations of motion are presented in vector-dyadic form, and used to obtain approximate expressions which are then linearized in preparation for design of control laws that include feedback of flywheel kinetic energy error as a means of compensating for damping exerted by rotor bearings. Two flywheel 'steering laws' are developed such that the torque commanded by the attitude control law is achieved while energy is stored or discharged at the required rate. Using the International Space Station as an example, numerical simulations are performed to demonstrate control about a torque equilibrium attitude and illustrate the benefits of kinetic energy error feedback.

  1. Field assessment of dried Plasmodium falciparum samples for malaria rapid diagnostic test quality control and proficiency testing in Ethiopia.

    PubMed

    Tamiru, Afework; Boulanger, Lucy; Chang, Michelle A; Malone, Joseph L; Aidoo, Michael

    2015-01-21

    Rapid diagnostic tests (RDTs) are now widely used for laboratory confirmation of suspected malaria cases to comply with the World Health Organization recommendation for universal testing before treatment. However, many malaria programmes lack quality control (QC) processes to assess RDT use under field conditions. Prior research showed the feasibility of using the dried tube specimen (DTS) method for preserving Plasmodium falciparum parasites for use as QC samples for RDTs. This study focused on the use of DTS for RDT QC and proficiency testing under field conditions. DTS were prepared using cultured P. falciparum at densities of 500 and 1,000 parasites/μL; 50 μL aliquots of these, along with parasite-negative human blood controls (0 parasites/μL), were air-dried in specimen tubes and reactivity verified after rehydration. The DTS were used in a field study in the Oromia Region of Ethiopia. Replicate DTS samples containing 0, 500 and 1,000 parasites/μL were stored at 4°C at a reference laboratory and at ambient temperatures at two nearby health facilities. At weeks 0, 4, 8, 12, 16, 20, and 24, the DTS were rehydrated and tested on RDTs stored under manufacturer-recommended temperatures at the reference laboratory and on RDTs stored under site-specific conditions at the two health facilities. Reactivity of DTS stored at 4°C at the reference laboratory on RDTs stored at the reference laboratory was considered the gold standard for assessing DTS stability. A proficiency-testing panel consisting of one negative and three positive samples, monitored with a checklist, was administered at weeks 12 and 24. At all seven time points, DTS stored at both the reference laboratory and the health facilities were reactive on RDTs stored under the recommended temperature and under field conditions, and the DTS without malaria parasites were negative. At the reference laboratory and one health facility, a 500 parasites/μL DTS from the proficiency panel was falsely reported as negative at week 24 due to errors in interpreting faint test lines. The DTS method can be used under field conditions to supplement other RDT QC methods and to assess health worker proficiency in Ethiopia and possibly other malaria-endemic countries.

  2. Human error in airway facilities.

    DOT National Transportation Integrated Search

    2001-01-01

    This report examines human errors in Airway Facilities (AF) with the intent of preventing these errors from being : passed on to the new Operations Control Centers. To effectively manage errors, they first have to be identified. : Human factors engin...

  3. Latent human error analysis and efficient improvement strategies by fuzzy TOPSIS in aviation maintenance tasks.

    PubMed

    Chiu, Ming-Chuan; Hsieh, Min-Chih

    2016-05-01

    The purposes of this study were to develop a latent human error analysis process, to explore the factors of latent human error in aviation maintenance tasks, and to provide an efficient improvement strategy for addressing those errors. First, we used HFACS and RCA to define the error factors related to aviation maintenance tasks. Fuzzy TOPSIS with four criteria was applied to evaluate the error factors. Results show that 1) adverse physiological states, 2) physical/mental limitations, and 3) coordination, communication, and planning are the factors related to airline maintenance tasks that could be addressed easily and efficiently. This research establishes a new analytic process for investigating latent human error and provides a strategy for analyzing human error using fuzzy TOPSIS. Our analysis process complements shortages in existing methodologies by incorporating improvement efficiency, and it enhances the depth and broadness of human error analysis methodology. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
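
    For orientation, the ranking backbone of TOPSIS can be sketched in a few lines; the fuzzy variant used in the paper replaces the crisp scores and weights with fuzzy numbers but keeps the same ideal/anti-ideal structure. Scores, weights, and criteria below are illustrative, not the paper's data.

    ```python
    import numpy as np

    def topsis(decision_matrix, weights, benefit):
        """Plain (crisp) TOPSIS: rank alternatives by closeness to the ideal.
        decision_matrix: alternatives x criteria; benefit[j] is True when a
        larger score is better on criterion j."""
        X = np.asarray(decision_matrix, dtype=float)
        R = X / np.linalg.norm(X, axis=0)          # vector normalisation
        V = R * weights                            # weighted normalised matrix
        ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
        worst = np.where(benefit, V.min(axis=0), V.max(axis=0))
        d_pos = np.linalg.norm(V - ideal, axis=1)  # distance to ideal point
        d_neg = np.linalg.norm(V - worst, axis=1)  # distance to anti-ideal
        return d_neg / (d_pos + d_neg)             # closeness coefficient

    # Toy ranking of three latent-error factors on four criteria.
    scores = [[7, 6, 8, 5],
              [9, 7, 6, 8],
              [5, 8, 7, 6]]
    cc = topsis(scores, weights=[0.3, 0.2, 0.3, 0.2], benefit=[True] * 4)
    print("closeness coefficients:", np.round(cc, 3))
    ```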

  4. Mitigating TCP Degradation over Intermittent Link Failures using Intermediate Buffers

    DTIC Science & Technology

    2006-06-01

    signal strength [10]. Preemptive routing in ad hoc networks [10] attempts to predict that a route will fail by looking at the signal power of the...when the error rate is high there are non-optimal back-offs in the Retransmission Timeout. And third, in the high error situation the slow start...network storage follows. In Beck et al. [3], Logistical Networking is outlined as a means of storing data throughout the network. End to end

  5. Effects of syllable structure in aphasic errors: implications for a new model of speech production.

    PubMed

    Romani, Cristina; Galluzzi, Claudia; Bureca, Ivana; Olson, Andrew

    2011-03-01

    Current models of word production assume that words are stored as linear sequences of phonemes which are structured into syllables only at the moment of production. This is because syllable structure is always recoverable from the sequence of phonemes. In contrast, we present theoretical and empirical evidence that syllable structure is lexically represented. Storing syllable structure would have the advantage of making representations more stable and resistant to damage. On the other hand, re-syllabifications affect only a minimal part of phonological representations and occur only in some languages and depending on speech register. Evidence for these claims comes from analyses of aphasic errors which not only respect phonotactic constraints, but also avoid transformations which move the syllabic structure of the word further away from the original structure, even when equating for segmental complexity. This is true across tasks, types of errors, and, crucially, types of patients. The same syllabic effects are shown by apraxic patients and by phonological patients who have more central difficulties in retrieving phonological representations. If syllable structure was only computed after phoneme retrieval, it would have no way to influence the errors of phonological patients. Our results have implications for psycholinguistic and computational models of language as well as for clinical and educational practices. Copyright © 2010 Elsevier Inc. All rights reserved.

  6. Expertise effects in the Moses illusion: detecting contradictions with stored knowledge.

    PubMed

    Cantor, Allison D; Marsh, Elizabeth J

    2017-02-01

    People frequently miss contradictions with stored knowledge; for example, readers often fail to notice any problem with a reference to the Atlantic as the largest ocean. Critically, such effects occur even though participants later demonstrate knowing the Pacific is the largest ocean (the Moses Illusion) [Erickson, T. D., & Mattson, M. E. (1981). From words to meaning: A semantic illusion. Journal of Verbal Learning & Verbal Behavior, 20, 540-551]. We investigated whether such oversights disappear when erroneous references contradict information in one's expert domain, material which likely has been encountered many times and is particularly well-known. Biology and history graduate students monitored for errors while answering biology and history questions containing erroneous presuppositions ("In what US state were the forty-niners searching for oil?"). Expertise helped: participants were less susceptible to the illusion and less likely to later reproduce errors in their expert domain. However, expertise did not eliminate the illusion, even when errors were bolded and underlined, meaning that it was unlikely that people simply skipped over errors. The results support claims that people often use heuristics to judge truth, as opposed to directly retrieving information from memory, likely because such heuristics are adaptive and often lead to the correct answer. Even experts sometimes use such shortcuts, suggesting that overlearned and accessible knowledge does not guarantee retrieval of that information.

  7. Algorithm 699 - A new representation of Patterson's quadrature formulae

    NASA Technical Reports Server (NTRS)

    Krogh, Fred T.; Van Snyder, W.

    1991-01-01

    A method is presented to reduce the number of coefficients necessary to represent Patterson's quadrature formulae. It also reduces the amount of storage necessary for storing function values, and produces slightly smaller error in evaluating the formulae.

  8. Investigating energy-saving potentials in the cloud.

    PubMed

    Lee, Da-Sheng

    2014-02-20

    Collecting webpage messages can serve as a sensor for investigating the energy-saving potential of buildings. Focusing on stores, a cloud sensor system is developed to collect data and determine their energy-saving potential. The owner of a store under investigation must register online and report the store address, area, and the customer ID number on the electric meter. The cloud sensor system automatically surveys the energy usage records by connecting to the power company website and calculating the energy use index (EUI) of the store. Other data include the chain store check, company capital, location price, and the influence of weather conditions on the store; even the exposure frequency of the store under investigation on the web may impact the energy usage collected online. After collecting data from numerous stores, a multi-dimensional data array is constructed to determine energy-saving potential by identifying stores with similarity conditions. Similarity conditions refer to analyzed results indicating that two stores have similar capital, business scale, weather conditions, and exposure frequency on the web. By calculating the EUI difference, or pure technical efficiency, of stores, the energy-saving potential is determined. In this study, a real case study is performed. An 8-dimensional (8D) data array is constructed by surveying web data related to 67 stores. This study then investigated the savings potential of 33 of the stores using site visits and employed the cloud sensor system to determine the saving potential. The case study results show good agreement between the data obtained by the site visits and the cloud investigation, with errors within 4.17%. Among the 33 samples, eight stores have low saving potentials of less than 5%. The developed cloud sensor successfully identifies them as having low saving potential and avoids wasting money on site visits.

  9. Investigating Energy-Saving Potentials in the Cloud

    PubMed Central

    Lee, Da-Sheng

    2014-01-01

    Collecting webpage messages can serve as a sensor for investigating the energy-saving potential of buildings. Focusing on stores, a cloud sensor system is developed to collect data and determine their energy-saving potential. The owner of a store under investigation must register online and report the store address, area, and the customer ID number on the electric meter. The cloud sensor system automatically surveys the energy usage records by connecting to the power company website and calculating the energy use index (EUI) of the store. Other data include the chain store check, company capital, location price, and the influence of weather conditions on the store; even the exposure frequency of the store under investigation on the web may impact the energy usage collected online. After collecting data from numerous stores, a multi-dimensional data array is constructed to determine energy-saving potential by identifying stores with similarity conditions. Similarity conditions refer to analyzed results indicating that two stores have similar capital, business scale, weather conditions, and exposure frequency on the web. By calculating the EUI difference, or pure technical efficiency, of stores, the energy-saving potential is determined. In this study, a real case study is performed. An 8-dimensional (8D) data array is constructed by surveying web data related to 67 stores. This study then investigated the savings potential of 33 of the stores using site visits and employed the cloud sensor system to determine the saving potential. The case study results show good agreement between the data obtained by the site visits and the cloud investigation, with errors within 4.17%. Among the 33 samples, eight stores have low saving potentials of less than 5%. The developed cloud sensor successfully identifies them as having low saving potential and avoids wasting money on site visits. PMID:24561405
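
    The EUI comparison at the heart of both records reduces to a simple normalisation and difference; the paper's "pure technical efficiency" measure is more involved, so the sketch below is only the first-order version, with made-up numbers.

    ```python
    def energy_use_index(annual_kwh, floor_area_m2):
        """EUI: annual electricity use normalised by floor area (kWh/m^2/yr)."""
        return annual_kwh / floor_area_m2

    def saving_potential(eui_store, eui_similar_best):
        """Fractional saving potential relative to the best similar store."""
        return max(0.0, (eui_store - eui_similar_best) / eui_store)

    eui_a = energy_use_index(180_000, 600)   # 300 kWh/m^2/yr
    eui_b = energy_use_index(150_000, 600)   # 250 kWh/m^2/yr, similar store
    print(f"saving potential ~ {saving_potential(eui_a, eui_b):.0%}")  # ~17%
    ```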

  10. Information systems and human error in the lab.

    PubMed

    Bissell, Michael G

    2004-01-01

    Health system costs in clinical laboratories are incurred daily due to human error. Indeed, a major impetus for automating clinical laboratories has always been the opportunity it presents to simultaneously reduce cost and improve quality of operations by decreasing human error. But merely automating these processes is not enough. To the extent that the introduction of these systems results in operators having less practice in dealing with unexpected events, or becoming deskilled in problem-solving, new kinds of error will likely appear. Clinical laboratories could potentially benefit by integrating findings on human error from modern behavioral science into their operations. Fully understanding human error requires a deep understanding of human information processing and cognition. Predicting and preventing negative consequences requires application of this understanding to laboratory operations. Although the occurrence of a particular error at a particular instant cannot be absolutely prevented, human error rates can be reduced. The following principles are key: an understanding of the process of learning in relation to error; understanding the origin of errors, since this knowledge can be used to reduce their occurrence; optimal systems should be forgiving to the operator by absorbing errors, at least for a time; although much is known by industrial psychologists about how to write operating procedures and instructions in ways that reduce the probability of error, this expertise is hardly ever put to use in the laboratory; and a feedback mechanism must be designed into the system that enables the operator to recognize in real time that an error has occurred.

  11. Efficient recovery of proteins from multiple source samples after TRIzol(®) or TRIzol(®)LS RNA extraction and long-term storage.

    PubMed

    Simões, André E S; Pereira, Diane M; Amaral, Joana D; Nunes, Ana F; Gomes, Sofia E; Rodrigues, Pedro M; Lo, Adrian C; D'Hooge, Rudi; Steer, Clifford J; Thibodeau, Stephen N; Borralho, Pedro M; Rodrigues, Cecília M P

    2013-03-15

    Simultaneous isolation of nucleic acids and proteins from a single biological sample facilitates meaningful data interpretation and reduces time, cost and sampling errors. This is particularly relevant for rare human and animal specimens, often scarce, and/or irreplaceable. TRIzol(®) and TRIzol(®)LS are suitable for simultaneous isolation of RNA, DNA and proteins from the same biological sample. These reagents are widely used for RNA and/or DNA isolation, while reports on their use for protein extraction are limited, attributable to technical difficulties in protein solubilisation. TRIzol(®)LS was used for RNA isolation from 284 human colon cancer samples, including normal colon mucosa, tubulovillous adenomas, and colon carcinomas with proficient and deficient mismatch repair system. TRIzol(®) was used for RNA isolation from human colon cancer cells, from brains of a transgenic Alzheimer's disease mouse model, and from cultured mouse cortical neurons. Following RNA extraction, the TRIzol(®)-chloroform fractions from human colon cancer samples and from mouse hippocampus and frontal cortex were stored for 2 years and 3 months, respectively, at -80°C until used for protein isolation. Simple modifications to the TRIzol(®) manufacturer's protocol, including Urea:SDS solubilization and sonication, allowed improved protein recovery yield compared to the TRIzol(®) manufacturer's protocol. Following SDS-PAGE and Ponceau and Coomassie staining, recovered proteins displayed wide molecular weight range and staining pattern comparable to those obtainable with commonly used protein extraction protocols. We also show that nuclear and cytosolic proteins can be easily extracted and detected by immunoblotting, and that posttranslational modifications, such as protein phosphorylation, are detectable in proteins recovered from TRIzol(®)-chloroform fractions stored for up to 2 years at -80°C. We provide a novel approach to improve protein recovery from samples processed for nucleic acid extraction with TRIzol(®) and TRIzol(®)LS compared to the manufacturer's protocol, allowing downstream immunoblotting and evaluation of steady-state relative protein expression levels. The method was validated in large sets of samples from multiple sources, including human colon cancer and brains of a transgenic Alzheimer's disease mouse model, stored in TRIzol(®)-chloroform for up to two years. Collectively, we provide a faster and cheaper alternative to the TRIzol(®) manufacturer's protein extraction protocol, illustrating the high relevance, and wide applicability, of the present protein isolation method for the immunoblot evaluation of steady-state relative protein expression levels in samples from multiple sources, and following prolonged storage.

  12. Capacity and precision in an animal model of visual short-term memory.

    PubMed

    Lara, Antonio H; Wallis, Jonathan D

    2012-03-14

    Temporary storage of information in visual short-term memory (VSTM) is a key component of many complex cognitive abilities. However, it is highly limited in capacity. Understanding the neurophysiological nature of this capacity limit will require a valid animal model of VSTM. We used a multiple-item color change detection task to measure macaque monkeys' VSTM capacity. Subjects' performance deteriorated and reaction times increased as a function of the number of items in memory. Additionally, we measured the precision of the memory representations by varying the distance between sample and test colors. In trials with similar sample and test colors, subjects made more errors compared to trials with highly discriminable colors. We modeled the error distribution as a Gaussian function and used this to estimate the precision of VSTM representations. We found that as the number of items in memory increases, the precision of the representations decreases dramatically. Additionally, we found that focusing attention on one of the objects increases the precision with which that object is stored and degrades the precision of the remaining objects. These results are in line with recent findings in human psychophysics and provide a solid foundation for understanding the neurophysiological nature of the capacity limit of VSTM.
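
    The precision estimate described above can be sketched as follows; the data, bin counts, and parameter values are illustrative stand-ins, not the study's: model the histogram of color-report errors as a Gaussian and take the inverse of the fitted width as precision.

    ```python
    # Minimal sketch of a Gaussian fit to an error distribution (synthetic data).
    import numpy as np
    from scipy.optimize import curve_fit

    def gaussian(x, a, sigma):
        return a * np.exp(-x**2 / (2 * sigma**2))

    # Hypothetical error distances (degrees in color space) binned into a histogram
    errors = np.random.default_rng(0).normal(0, 12.0, size=500)
    counts, edges = np.histogram(errors, bins=30)
    centers = 0.5 * (edges[:-1] + edges[1:])

    (a_hat, sigma_hat), _ = curve_fit(gaussian, centers, counts, p0=[counts.max(), 10.0])
    precision = 1.0 / sigma_hat   # narrower error distribution -> higher precision
    print(f"fitted sigma = {sigma_hat:.1f}, precision = {precision:.3f}")
    ```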

  13. Stochastic Models of Human Errors

    NASA Technical Reports Server (NTRS)

    Elshamy, Maged; Elliott, Dawn M. (Technical Monitor)

    2002-01-01

    Humans play an important role in the overall reliability of engineering systems. More often than not, accidents and system failures are traced to human errors. Therefore, in order to have meaningful system risk analysis, the reliability of the human element must be taken into consideration. Describing the human error process by mathematical models is a key to analyzing contributing factors. Therefore, the objective of this research effort is to establish stochastic models substantiated by sound theoretic foundation to address the occurrence of human errors in the processing of the Space Shuttle.

  14. Operational Interventions to Maintenance Error

    NASA Technical Reports Server (NTRS)

    Kanki, Barbara G.; Walter, Diane; Dulchinos, VIcki

    1997-01-01

    A significant proportion of aviation accidents and incidents are known to be tied to human error. However, research of flight operational errors has shown that so-called pilot error often involves a variety of human factors issues and not a simple lack of individual technical skills. In aircraft maintenance operations, there is similar concern that maintenance errors which may lead to incidents and accidents are related to a large variety of human factors issues. Although maintenance error data and research are limited, industry initiatives involving human factors training in maintenance have become increasingly accepted as one type of maintenance error intervention. Conscientious efforts have been made in re-inventing the "team" concept for maintenance operations and in tailoring programs to fit the needs of technical operations. Nevertheless, there remains a dual challenge: 1) to develop human factors interventions which are directly supported by reliable human error data, and 2) to integrate human factors concepts into the procedures and practices of everyday technical tasks. In this paper, we describe several varieties of human factors interventions and focus on two specific alternatives which target problems related to procedures and practices; namely, 1) structured on-the-job training and 2) procedure re-design. We hope to demonstrate that the key to leveraging the impact of these solutions comes from focused interventions; that is, interventions which are derived from a clear understanding of specific maintenance errors, their operational context and human factors components.

  15. Reduction of Maintenance Error Through Focused Interventions

    NASA Technical Reports Server (NTRS)

    Kanki, Barbara G.; Walter, Diane; Rosekind, Mark R. (Technical Monitor)

    1997-01-01

    It is well known that a significant proportion of aviation accidents and incidents are tied to human error. In flight operations, research of operational errors has shown that so-called "pilot error" often involves a variety of human factors issues and not a simple lack of individual technical skills. In aircraft maintenance operations, there is similar concern that maintenance errors which may lead to incidents and accidents are related to a large variety of human factors issues. Although maintenance error data and research are limited, industry initiatives involving human factors training in maintenance have become increasingly accepted as one type of maintenance error intervention. Conscientious efforts have been made in re-inventing the "team" concept for maintenance operations and in tailoring programs to fit the needs of technical operations. Nevertheless, there remains a dual challenge: to develop human factors interventions which are directly supported by reliable human error data, and to integrate human factors concepts into the procedures and practices of everyday technical tasks. In this paper, we describe several varieties of human factors interventions and focus on two specific alternatives which target problems related to procedures and practices; namely, 1) structured on-the-job training and 2) procedure re-design. We hope to demonstrate that the key to leveraging the impact of these solutions comes from focused interventions; that is, interventions which are derived from a clear understanding of specific maintenance errors, their operational context and human factors components.

  16. All-optical associative memory using photorefractive crystals and a saturable absorber

    NASA Astrophysics Data System (ADS)

    Duelli, Markus; Cudney, Roger S.; Keller, Claude; Guenter, Peter

    1995-07-01

    We report on the investigation of a new configuration of an all-optical associative memory. The images to be recalled associatively are stored in a LiNbO3 crystal via angular multiplexing. Thresholding of the reconstructed reference beams during associative readout is achieved by using a saturable absorber with an intensity-tunable threshold. We demonstrate associative readout and error correction for 10 strongly overlapping black-and-white images. Associative recall and full reconstruction are performed when only 1/500 of the stored image is entered.

  17. Human Factors Process Task Analysis: Liquid Oxygen Pump Acceptance Test Procedure at the Advanced Technology Development Center

    NASA Technical Reports Server (NTRS)

    Diorio, Kimberly A.; Voska, Ned (Technical Monitor)

    2002-01-01

    This viewgraph presentation provides information on Human Factors Process Failure Modes and Effects Analysis (HF PFMEA). HF PFMEA includes the following 10 steps: Describe mission; Define system; Identify human-machine interfaces; List human actions; Identify potential errors; Identify factors that affect error; Determine likelihood of error; Determine potential effects of errors; Evaluate risk; Generate solutions (manage error). The presentation also describes how this analysis was applied to a liquid oxygen pump acceptance test.
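
    Steps 7 through 9 of the flow above (likelihood, effects, risk) lend themselves to a tiny worked illustration. The scales and example entries below are assumptions, not from the presentation; a minimal sketch of likelihood-times-severity risk ranking:

    ```python
    # Hedged sketch of risk scoring for candidate human errors (illustrative data).
    potential_errors = [
        # (human action, potential error, likelihood 1-5, severity 1-5)
        ("connect pump inlet", "cross-connect lines", 2, 5),
        ("record test pressure", "transcribe wrong value", 4, 2),
        ("open supply valve", "open out of sequence", 3, 4),
    ]

    # Rank by risk = likelihood x severity, highest first
    risks = sorted(
        ((action, error, likelihood * severity)
         for action, error, likelihood, severity in potential_errors),
        key=lambda r: r[2], reverse=True,
    )
    for action, error, risk in risks:
        print(f"risk={risk:2d}  {action}: {error}")
    ```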

  18. A theory of human error

    NASA Technical Reports Server (NTRS)

    Mcruer, D. T.; Clement, W. F.; Allen, R. W.

    1981-01-01

    Human errors tend to be treated in terms of clinical and anecdotal descriptions, from which remedial measures are difficult to derive. Correction of the sources of human error requires an attempt to reconstruct underlying and contributing causes of error from the circumstantial causes cited in official investigative reports. A comprehensive analytical theory of the cause-effect relationships governing propagation of human error is indispensable to a reconstruction of the underlying and contributing causes. A validated analytical theory of the input-output behavior of human operators involving manual control, communication, supervisory, and monitoring tasks which are relevant to aviation, maritime, automotive, and process control operations is highlighted. This theory of behavior, both appropriate and inappropriate, provides an insightful basis for investigating, classifying, and quantifying the needed cause-effect relationships governing propagation of human error.

  19. High storage capacity in the Hopfield model with auto-interactions—stability analysis

    NASA Astrophysics Data System (ADS)

    Rocchi, Jacopo; Saad, David; Tantari, Daniele

    2017-11-01

    Recent studies point to the potential storage of a large number of patterns in the celebrated Hopfield associative memory model, well beyond the limits obtained previously. We investigate the properties of new fixed points to discover that they exhibit instabilities for small perturbations and are therefore of limited value as associative memories. Moreover, a large deviations approach also shows that errors introduced to the original patterns induce additional errors and increased corruption with respect to the stored patterns.
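
    A minimal numerical sketch of the setting discussed above, with illustrative sizes (not the paper's experiments): store random patterns with a Hebbian rule that retains the self-couplings (the auto-interactions), perturb a stored pattern, and measure how well it is recovered.

    ```python
    # Hopfield storage/recall sketch; parameters are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(1)
    N, P = 200, 20                              # neurons, stored patterns
    patterns = rng.choice([-1, 1], size=(P, N))
    W = (patterns.T @ patterns) / N             # Hebbian weights; diagonal kept,
                                                # i.e. auto-interactions retained

    def recall(state, steps=20):
        for _ in range(steps):                  # synchronous updates for brevity
            state = np.sign(W @ state)
        return state

    probe = patterns[0].copy()
    flip = rng.choice(N, size=10, replace=False)
    probe[flip] *= -1                           # perturb 5% of the bits
    out = recall(probe)
    overlap = (out @ patterns[0]) / N           # 1.0 = perfect retrieval
    print(f"overlap with stored pattern: {overlap:.2f}")
    ```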

  20. The contributions of human factors on human error in Malaysia aviation maintenance industries

    NASA Astrophysics Data System (ADS)

    Padil, H.; Said, M. N.; Azizan, A.

    2018-05-01

    Aviation maintenance is a multitasking activity in which individuals perform varied tasks under constant pressure to meet deadlines as well as challenging work conditions. These situational characteristics combined with human factors can lead to various types of human-related errors. The primary objective of this research is to develop a structural relationship model that incorporates human factors, organizational factors, and their impact on human errors in aviation maintenance. Towards that end, a questionnaire was developed and administered to Malaysian aviation maintenance professionals. A Structural Equation Modelling (SEM) approach was used in this study, utilizing AMOS software. Results showed a significant relationship between human factors and human errors in the tested model. Human factors had a partial effect on organizational factors, while organizational factors had a direct and positive impact on human errors. It was also revealed that organizational factors contributed to human errors when coupled with the human factors construct. This study has contributed to the advancement of knowledge on human factors affecting safety, has provided guidelines for improving human factors performance relating to aviation maintenance activities, and could be used as a reference for improving safety performance in Malaysian aviation maintenance companies.

  1. A theory of human error

    NASA Technical Reports Server (NTRS)

    Mcruer, D. T.; Clement, W. F.; Allen, R. W.

    1980-01-01

    Human error, a significant contributing factor in a very high proportion of civil transport, general aviation, and rotorcraft accidents is investigated. Correction of the sources of human error requires that one attempt to reconstruct underlying and contributing causes of error from the circumstantial causes cited in official investigative reports. A validated analytical theory of the input-output behavior of human operators involving manual control, communication, supervisory, and monitoring tasks which are relevant to aviation operations is presented. This theory of behavior, both appropriate and inappropriate, provides an insightful basis for investigating, classifying, and quantifying the needed cause-effect relationships governing propagation of human error.

  2. Simultaneous determination of glucose, triglycerides, urea, cholesterol, albumin and total protein in human plasma by Fourier transform infrared spectroscopy: direct clinical biochemistry without reagents.

    PubMed

    Jessen, Torben E; Höskuldsson, Agnar T; Bjerrum, Poul J; Verder, Henrik; Sørensen, Lars; Bratholm, Palle S; Christensen, Bo; Jensen, Lene S; Jensen, Maria A B

    2014-09-01

    Direct measurement of chemical constituents in complex biologic matrices without the use of analyte-specific reagents could be a step toward the simplification of clinical biochemistry. Problems related to reagents such as production errors, improper handling, and lot-to-lot variations would be eliminated, as well as errors occurring during assay execution. We describe and validate a reagent-free method for direct measurement of six analytes in human plasma based on Fourier-transform infrared spectroscopy (FTIR). Blood plasma is analyzed without any sample preparation. The FTIR spectrum of the raw plasma is recorded in a sampling cuvette specially designed for measurement of aqueous solutions. For each analyte, a mathematical calibration process is performed by a stepwise selection of wavelengths giving the optimal least-squares correlation between the measured FTIR signal and the analyte concentration measured by conventional clinical reference methods. The developed calibration algorithms are subsequently evaluated for their capability to predict the concentration of the six analytes in blinded patient samples. The correlation between the six FTIR methods and corresponding reference methods were 0.87
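
    The stepwise calibration idea can be sketched as a greedy forward selection over wavelengths; the synthetic data and loop bounds below are assumptions, not the authors' implementation.

    ```python
    # Greedy forward wavelength selection for a least-squares calibration (sketch).
    import numpy as np

    rng = np.random.default_rng(2)
    n_samples, n_wavelengths = 80, 300
    X = rng.normal(size=(n_samples, n_wavelengths))        # stand-in FTIR absorbances
    true_idx = [40, 125, 260]                              # wavelengths that matter
    y = X[:, true_idx] @ np.array([1.0, -0.5, 0.8]) + 0.05 * rng.normal(size=n_samples)

    selected: list[int] = []
    for _ in range(5):                                     # pick up to 5 wavelengths
        best_j, best_res = None, np.inf
        for j in range(n_wavelengths):
            if j in selected:
                continue
            # Fit y on the candidate wavelength set plus an intercept
            A = np.column_stack([X[:, selected + [j]], np.ones(n_samples)])
            res = np.linalg.lstsq(A, y, rcond=None)[1]
            res = res[0] if res.size else np.inf
            if res < best_res:
                best_j, best_res = j, res
        selected.append(best_j)
    print("selected wavelengths:", sorted(selected))       # recovers 40, 125, 260
    ```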

  3. Human Reliability and the Cost of Doing Business

    NASA Technical Reports Server (NTRS)

    DeMott, Diana

    2014-01-01

    Most businesses recognize that people will make mistakes and assume errors are just part of the cost of doing business, but do they need to be? Companies with high risk, or major consequences, should consider the effect of human error. In a variety of industries, human errors have caused costly failures and workplace injuries: airline mishaps, medical malpractice, errors in the administration of medication, and major oil spills have all been blamed on human error. A technique to mitigate or even eliminate some of these costly human errors is the use of Human Reliability Analysis (HRA). Various methodologies are available to perform Human Reliability Assessments that range from identifying the most likely areas for concern to detailed assessments with human error failure probabilities calculated. Which methodology to use would be based on a variety of factors that include: 1) how people react and act in different industries, and differing expectations based on industry standards, 2) factors that influence how the human errors could occur, such as tasks, tools, environment, workplace, support, training and procedure, 3) the type and availability of data, and 4) how the industry views risk and reliability influences (types of emergencies, contingencies and routine tasks versus cost-based concerns). The Human Reliability Assessment should be the first step to reduce, mitigate or eliminate costly mistakes or catastrophic failures. Using Human Reliability techniques to identify and classify human error risks allows a company more opportunities to mitigate or eliminate these risks and prevent costly failures.

  4. Human Reliability and the Cost of Doing Business

    NASA Technical Reports Server (NTRS)

    DeMott, D. L.

    2014-01-01

    Human error cannot be defined unambiguously in advance of its happening; an action often becomes an error only after the fact. The same action can result in a tragic accident in one situation or a heroic action given a more favorable outcome. People often forget that we employ humans in business and industry for the flexibility and capability to change when needed. In complex systems, operations are driven by the specifications of the system and the system structure. People provide the flexibility to make it work. Human error has been reported as being responsible for 60%-80% of failures, accidents and incidents in high-risk industries. We don't have to accept that all human errors are inevitable. Through the use of some basic techniques, many potential human error events can be addressed. There are actions that can be taken to reduce the risk of human error.

  5. Human Error Analysis in a Permit to Work System: A Case Study in a Chemical Plant

    PubMed Central

    Jahangiri, Mehdi; Hoboubi, Naser; Rostamabadi, Akbar; Keshavarzi, Sareh; Hosseini, Ali Akbar

    2015-01-01

    Background A permit to work (PTW) is a formal written system to control certain types of work which are identified as potentially hazardous. However, human error in PTW processes can lead to an accident. Methods This cross-sectional, descriptive study was conducted to estimate the probability of human errors in PTW processes in a chemical plant in Iran. In the first stage, through interviewing the personnel and studying the procedure in the plant, the PTW process was analyzed using the hierarchical task analysis technique. In doing so, PTW was considered as a goal and detailed tasks to achieve the goal were analyzed. In the next step, the standardized plant analysis risk-human (SPAR-H) reliability analysis method was applied for estimation of human error probability. Results The mean probability of human error in the PTW system was estimated to be 0.11. The highest probability of human error in the PTW process was related to flammable gas testing (50.7%). Conclusion The SPAR-H method applied in this study could analyze and quantify the potential human errors and extract the required measures for reducing the error probabilities in the PTW system. Some suggestions to reduce the likelihood of errors, especially regarding modification of the performance shaping factors and dependencies among tasks, are provided. PMID:27014485
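
    A hedged sketch of a SPAR-H style computation (after NUREG/CR-6883) may clarify how such probabilities are obtained. The nominal HEPs and PSF multipliers below are abbreviated placeholders; a real analysis works through the full SPAR-H worksheets.

    ```python
    # SPAR-H style human error probability (HEP) sketch, not the study's tooling.
    NOMINAL_HEP = {"diagnosis": 1e-2, "action": 1e-3}

    def spar_h_hep(task_type: str, psf_multipliers: list[float]) -> float:
        nhep = NOMINAL_HEP[task_type]
        comp = 1.0
        for m in psf_multipliers:
            comp *= m                                   # composite PSF
        # SPAR-H adjustment when 3+ negative PSFs, to keep the result below 1
        if sum(1 for m in psf_multipliers if m > 1) >= 3:
            return nhep * comp / (nhep * (comp - 1) + 1)
        return min(1.0, nhep * comp)

    # e.g. an action task under high stress (x2), poor ergonomics (x10),
    # and barely adequate time (x10) -- illustrative multipliers:
    print(f"HEP = {spar_h_hep('action', [2, 10, 10]):.3f}")
    ```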

  6. Managing human fallibility in critical aerospace situations

    NASA Astrophysics Data System (ADS)

    Tew, Larry

    2014-11-01

    Human fallibility is pervasive in the aerospace industry, with over 50% of errors attributed to human error. Consider the benefits to any organization if those errors were significantly reduced. Aerospace manufacturing involves high value, high profile systems with significant complexity and often repetitive build, assembly, and test operations. In spite of extensive analysis, planning, training, and detailed procedures, human factors can cause unexpected errors. Handling such errors involves extensive cause and corrective action analysis and invariably causes schedule slips and cost growth. We will discuss success stories, including those associated with electro-optical systems, where very significant reductions in human fallibility errors were achieved after receiving adapted and specialized training. In the eyes of company and customer leadership, the steps used to achieve these results led to a major culture change in both the workforce and the supporting management organization. This approach has proven effective in other industries like medicine, firefighting, law enforcement, and aviation. The roadmap to success and the steps to minimize human error are known. They can be used by any organization willing to accept human fallibility and take a proactive approach to incorporate the steps needed to manage and minimize error.

  7. Prediction of human errors by maladaptive changes in event-related brain networks.

    PubMed

    Eichele, Tom; Debener, Stefan; Calhoun, Vince D; Specht, Karsten; Engel, Andreas K; Hugdahl, Kenneth; von Cramon, D Yves; Ullsperger, Markus

    2008-04-22

    Humans engaged in monotonous tasks are susceptible to occasional errors that may lead to serious consequences, but little is known about brain activity patterns preceding errors. Using functional MRI and applying independent component analysis followed by deconvolution of hemodynamic responses, we studied error-preceding brain activity on a trial-by-trial basis. We found a set of brain regions in which the temporal evolution of activation predicted performance errors. These maladaptive brain activity changes started to evolve approximately 30 sec before the error. In particular, a coincident decrease of deactivation in default mode regions of the brain, together with a decline of activation in regions associated with maintaining task effort, raised the probability of future errors. Our findings provide insights into the brain network dynamics preceding human performance errors and suggest that monitoring of the identified precursor states may help in avoiding human errors in critical real-world situations.
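
    The deconvolution step mentioned above is, at its core, a linear regression of the signal onto time-shifted copies of the event onsets. A minimal FIR sketch on synthetic data, not the study's pipeline:

    ```python
    # FIR deconvolution of an event-related response (illustrative, synthetic data).
    import numpy as np

    rng = np.random.default_rng(3)
    T, L = 400, 16                         # time points, response length to estimate
    onsets = np.zeros(T)
    onsets[rng.choice(T - L, size=25, replace=False)] = 1.0

    true_hrf = np.exp(-0.5 * ((np.arange(L) - 5) / 2.0) ** 2)
    signal = np.convolve(onsets, true_hrf)[:T] + 0.3 * rng.normal(size=T)

    # Design matrix: column k is the onset vector delayed by k samples
    X = np.column_stack([np.roll(onsets, k) for k in range(L)])
    X[:L, :] = np.tril(X[:L, :])           # zero out wrap-around from np.roll
    hrf_est, *_ = np.linalg.lstsq(X, signal, rcond=None)
    print("peak of estimated response at lag", int(np.argmax(hrf_est)))
    ```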

  8. Prediction of human errors by maladaptive changes in event-related brain networks

    PubMed Central

    Eichele, Tom; Debener, Stefan; Calhoun, Vince D.; Specht, Karsten; Engel, Andreas K.; Hugdahl, Kenneth; von Cramon, D. Yves; Ullsperger, Markus

    2008-01-01

    Humans engaged in monotonous tasks are susceptible to occasional errors that may lead to serious consequences, but little is known about brain activity patterns preceding errors. Using functional MRI and applying independent component analysis followed by deconvolution of hemodynamic responses, we studied error-preceding brain activity on a trial-by-trial basis. We found a set of brain regions in which the temporal evolution of activation predicted performance errors. These maladaptive brain activity changes started to evolve ≈30 sec before the error. In particular, a coincident decrease of deactivation in default mode regions of the brain, together with a decline of activation in regions associated with maintaining task effort, raised the probability of future errors. Our findings provide insights into the brain network dynamics preceding human performance errors and suggest that monitoring of the identified precursor states may help in avoiding human errors in critical real-world situations. PMID:18427123

  9. Defining the Relationship Between Human Error Classes and Technology Intervention Strategies

    NASA Technical Reports Server (NTRS)

    Wiegmann, Douglas A.; Rantanen, Eas M.

    2003-01-01

    The modus operandi in addressing human error in aviation systems is predominantly that of technological interventions or fixes. Such interventions exhibit considerable variability both in terms of sophistication and application. Some technological interventions address human error directly while others do so only indirectly. Some attempt to eliminate the occurrence of errors altogether whereas others look to reduce the negative consequences of these errors. In any case, technological interventions add to the complexity of the systems and may interact with other system components in unforeseeable ways and often create opportunities for novel human errors. Consequently, there is a need to develop standards for evaluating the potential safety benefit of each of these intervention products so that resources can be effectively invested to produce the biggest benefit to flight safety as well as to mitigate any adverse ramifications. The purpose of this project was to help define the relationship between human error and technological interventions, with the ultimate goal of developing a set of standards for evaluating or measuring the potential benefits of new human error fixes.

  10. Applying lessons learned to enhance human performance and reduce human error for ISS operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, W.R.

    1999-01-01

    A major component of reliability, safety, and mission success for space missions is ensuring that the humans involved (flight crew, ground crew, mission control, etc.) perform their tasks and functions as required. This includes compliance with training and procedures during normal conditions, and successful compensation when malfunctions or unexpected conditions occur. A very significant issue that affects human performance in space flight is human error. Human errors can invalidate carefully designed equipment and procedures. If certain errors combine with equipment failures or design flaws, mission failure or loss of life can occur. The control of human error during operation of the International Space Station (ISS) will be critical to the overall success of the program. As experience from Mir operations has shown, human performance plays a vital role in the success or failure of long duration space missions. The Department of Energy's Idaho National Engineering and Environmental Laboratory (INEEL) is developing a systematic approach to enhance human performance and reduce human errors for ISS operations. This approach is based on the systematic identification and evaluation of lessons learned from past space missions such as Mir to enhance the design and operation of ISS. This paper will describe previous INEEL research on human error sponsored by NASA and how it can be applied to enhance human reliability for ISS.

  11. Applying lessons learned to enhance human performance and reduce human error for ISS operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, W.R.

    1998-09-01

    A major component of reliability, safety, and mission success for space missions is ensuring that the humans involved (flight crew, ground crew, mission control, etc.) perform their tasks and functions as required. This includes compliance with training and procedures during normal conditions, and successful compensation when malfunctions or unexpected conditions occur. A very significant issue that affects human performance in space flight is human error. Human errors can invalidate carefully designed equipment and procedures. If certain errors combine with equipment failures or design flaws, mission failure or loss of life can occur. The control of human error during operation of the International Space Station (ISS) will be critical to the overall success of the program. As experience from Mir operations has shown, human performance plays a vital role in the success or failure of long duration space missions. The Department of Energy's Idaho National Engineering and Environmental Laboratory (INEEL) is developing a systematic approach to enhance human performance and reduce human errors for ISS operations. This approach is based on the systematic identification and evaluation of lessons learned from past space missions such as Mir to enhance the design and operation of ISS. This paper describes previous INEEL research on human error sponsored by NASA and how it can be applied to enhance human reliability for ISS.

  12. Structured methods for identifying and correcting potential human errors in aviation operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, W.R.

    1997-10-01

    Human errors have been identified as the source of approximately 60% of the incidents and accidents that occur in commercial aviation. It can be assumed that a very large number of human errors occur in aviation operations, even though in most cases the redundancies and diversities built into the design of aircraft systems prevent the errors from leading to serious consequences. In addition, when it is acknowledged that many system failures have their roots in human errors that occur in the design phase, it becomes apparent that the identification and elimination of potential human errors could significantly decrease the risks of aviation operations. This will become even more critical during the design of advanced automation-based aircraft systems as well as next-generation systems for air traffic management. Structured methods to identify and correct potential human errors in aviation operations have been developed and are currently undergoing testing at the Idaho National Engineering and Environmental Laboratory (INEEL).

  13. "Most people are simply not designed to eat pasta": evolutionary explanations for obesity in the low-carbohydrate diet movement.

    PubMed

    Knight, Christine

    2011-09-01

    Low-carbohydrate diets, notably the Atkins Diet, were particularly popular in Britain and North America in the late 1990s and early 2000s. On the basis of a discourse analysis of bestselling low-carbohydrate diet books, I examine and critique genetic and evolutionary explanations for obesity and diabetes as they feature in the low-carbohydrate literature. Low-carbohydrate diet books present two distinct neo-Darwinian explanations of health and body-weight. First, evolutionary nutrition is based on the premise that the human body has adapted to function best on the diet eaten in the Paleolithic era. Second, the thrifty gene theory suggests that feast-or-famine conditions during human evolutionary development naturally selected for people who could store excess energy as body fat for later use. However, the historical narratives and scientific arguments presented in the low-carbohydrate literature are beset with generalisations, inconsistencies and errors. These result, I argue, from the use of the primitive as a discursive "blank slate" onto which to project ideals perceived to be lacking in contemporary industrialised life.

  14. Human errors and violations in computer and information security: the viewpoint of network administrators and security specialists.

    PubMed

    Kraemer, Sara; Carayon, Pascale

    2007-03-01

    This paper describes human errors and violations of end users and network administration in computer and information security. This information is summarized in a conceptual framework for examining the human and organizational factors contributing to computer and information security. This framework includes human error taxonomies to describe the work conditions that contribute adversely to computer and information security, i.e. to security vulnerabilities and breaches. The issue of human error and violation in computer and information security was explored through a series of 16 interviews with network administrators and security specialists. The interviews were audiotaped, transcribed, and analyzed by coding specific themes in a node structure. The result is an expanded framework that classifies types of human error and identifies specific human and organizational factors that contribute to computer and information security. Network administrators tended to view errors created by end users as more intentional than unintentional, while viewing errors created by network administrators as more unintentional than intentional. Organizational factors, such as communication, security culture, policy, and organizational structure, were the most frequently cited factors associated with computer and information security.

  15. Intervention strategies for the management of human error

    NASA Technical Reports Server (NTRS)

    Wiener, Earl L.

    1993-01-01

    This report examines the management of human error in the cockpit. The principles probably apply as well to other applications in the aviation realm (e.g., air traffic control, dispatch, weather, etc.) as well as other high-risk systems outside of aviation (e.g., shipping, high-technology medical procedures, military operations, nuclear power production). Management of human error is distinguished from error prevention. It is a more encompassing term, which includes not only the prevention of error, but also a means of preventing an error, once made, from adversely affecting system output. Such techniques include: traditional human factors engineering, improvement of feedback and feedforward of information from system to crew, 'error-evident' displays which make erroneous input more obvious to the crew, trapping of errors within a system, goal-sharing between humans and machines (also called 'intent-driven' systems), paperwork management, and behaviorally based approaches, including procedures, standardization, checklist design, training, cockpit resource management, etc. Fifteen guidelines for the design and implementation of intervention strategies are included.

  16. Are gestational age, birth weight, and birth length indicators of favorable fetal growth conditions? A structural equation analysis of Filipino infants.

    PubMed

    Bollen, Kenneth A; Noble, Mark D; Adair, Linda S

    2013-07-30

    The fetal origins hypothesis emphasizes the life-long health impacts of prenatal conditions. Birth weight, birth length, and gestational age are indicators of the fetal environment. However, these variables often have missing data and are subject to random and systematic errors caused by delays in measurement, differences in measurement instruments, and human error. With data from the Cebu (Philippines) Longitudinal Health and Nutrition Survey, we use structural equation models to explore random and systematic errors in these birth outcome measures, to analyze how maternal characteristics relate to birth outcomes, and to take account of missing data. We assess whether birth weight, birth length, and gestational age are influenced by a single latent variable that we call favorable fetal growth conditions (FFGC) and, if so, which variable is most closely related to FFGC. We find that a model with FFGC as a latent variable fits as well as a less parsimonious model that has birth weight, birth length, and gestational age as distinct individual variables. We also demonstrate that birth weight is more reliably measured than is gestational age. FFGC was significantly influenced by taller maternal stature, better nutritional stores indexed by maternal arm fat and muscle area during pregnancy, higher birth order, avoidance of smoking, and maternal age 20-35 years. Effects of maternal characteristics on newborn weight, length, and gestational age were largely indirect, operating through FFGC. Copyright © 2013 John Wiley & Sons, Ltd.
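
    In conventional CFA/SEM notation, the single-latent-variable model the abstract describes can be sketched as below; the symbols are generic placeholders, not the authors' exact specification.

    ```latex
    % One latent factor (FFGC) with three indicators and structural predictors:
    \begin{align*}
      \text{birth weight}_i    &= \lambda_w\,\mathrm{FFGC}_i + \varepsilon_{w,i}\\
      \text{birth length}_i    &= \lambda_l\,\mathrm{FFGC}_i + \varepsilon_{l,i}\\
      \text{gestational age}_i &= \lambda_g\,\mathrm{FFGC}_i + \varepsilon_{g,i}\\
      \mathrm{FFGC}_i &= \boldsymbol{\gamma}^{\top}\mathbf{x}_i + \zeta_i
      \quad (\mathbf{x}_i = \text{maternal characteristics})
    \end{align*}
    ```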

  17. Recent study, but not retrieval, of knowledge protects against learning errors.

    PubMed

    Mullet, Hillary G; Umanath, Sharda; Marsh, Elizabeth J

    2014-11-01

    Surprisingly, people incorporate errors into their knowledge bases even when they have the correct knowledge stored in memory (e.g., Fazio, Barber, Rajaram, Ornstein, & Marsh, 2013). We examined whether heightening the accessibility of correct knowledge would protect people from later reproducing misleading information that they encountered in fictional stories. In Experiment 1, participants studied a series of target general knowledge questions and their correct answers either a few minutes (high accessibility of knowledge) or 1 week (low accessibility of knowledge) before exposure to misleading story references. In Experiments 2a and 2b, participants instead retrieved the answers to the target general knowledge questions either a few minutes or 1 week before the rest of the experiment. Reading the relevant knowledge directly before the story-reading phase protected against reproduction of the misleading story answers on a later general knowledge test, but retrieving that same correct information did not. Retrieving stored knowledge from memory might actually enhance the encoding of relevant misinformation.

  18. Simple Pixel Structure Using Video Data Correction Method for Nonuniform Electrical Characteristics of Polycrystalline Silicon Thin-Film Transistors and Differential Aging Phenomenon of Organic Light-Emitting Diodes

    NASA Astrophysics Data System (ADS)

    In, Hai-Jung; Kwon, Oh-Kyong

    2010-03-01

    A simple pixel structure using a video data correction method is proposed to compensate for electrical characteristic variations of driving thin-film transistors (TFTs) and the degradation of organic light-emitting diodes (OLEDs) in active-matrix OLED (AMOLED) displays. The proposed method senses the electrical characteristic variations of TFTs and OLEDs and stores them in external memory. The nonuniform emission current of TFTs and the aging of OLEDs are corrected by modulating video data using the stored data. Experimental results show that the emission current error due to electrical characteristic variation of driving TFTs is in the range from -63.1 to 61.4% without compensation, but is decreased to the range from -1.9 to 1.9% with the proposed correction method. The luminance error due to the degradation of an OLED is less than 1.8% when the proposed correction method is used for a 50% degraded OLED.

  19. Exploring Reactions to Pilot Reliability Certification and Changing Attitudes on the Reduction of Errors

    ERIC Educational Resources Information Center

    Boedigheimer, Dan

    2010-01-01

    Approximately 70% of aviation accidents are attributable to human error. The greatest opportunity for further improving aviation safety is found in reducing human errors in the cockpit. The purpose of this quasi-experimental, mixed-method research was to evaluate whether there was a difference in pilot attitudes toward reducing human error in the…

  20. Evaluating a medical error taxonomy.

    PubMed

    Brixey, Juliana; Johnson, Todd R; Zhang, Jiajie

    2002-01-01

    Healthcare has been slow in using human factors principles to reduce medical errors. The Center for Devices and Radiological Health (CDRH) recognizes that a lack of attention to human factors during product development may lead to errors that have the potential for patient injury, or even death. In response to the need for reducing medication errors, the National Coordinating Council for Medication Errors Reporting and Prevention (NCC MERP) released the NCC MERP taxonomy that provides a standard language for reporting medication errors. This project maps the NCC MERP taxonomy of medication error to MedWatch medical errors involving infusion pumps. Of particular interest are human factors associated with medical device errors. The NCC MERP taxonomy of medication errors is limited in mapping information from MedWatch because of the focus on the medical device and the format of reporting.

  1. Dielectrophoresis enhances the whitening effect of carbamide peroxide on enamel.

    PubMed

    Ivanoff, Chris S; Hottel, Timothy L; Garcia-Godoy, Franklin; Riga, Alan T

    2011-10-01

    To compare the enamel whitening effect of a 20-minute dielectrophoresis enhanced electrochemical delivery to a 20-minute diffusion treatment. Forty freshly extracted human teeth without detectable caries or restoration were stored in distilled water at 4 degrees C and used within 1 month of extraction. Two different bleaching gels (Plus White 5 Minute Speed Whitening Gel and 35% Opalescence PF gel) were tested. The study had two parts: Part 1--Quantitative comparison of hydrogen peroxide (H2O2, HP) absorption--following application of an over-the-counter 35% HP whitening gel (Plus White 5 Minute Speed Whitening Gel) to 30 (n = 30) extracted human teeth by conventional diffusion or dielectrophoresis. The amount of H2O2 that diffused from the dentin was measured by a colorimetric oxidation-reduction reaction kit. HP concentration was measured by UV-Vis spectroscopy at 550 nm. Part 2--HP diffusion in stained teeth--35% carbamide peroxide whitening gel (35% Opalescence PF gel) was applied to 10 extracted human teeth (n = 10) stained by immersion in a black tea solution for 48 hours. The teeth were randomly assigned to the 20-minute dielectrophoresis or diffusion treatment group; whitening was evaluated by a dental spectrophotometer and macro-photography. Part 1: The analysis found significant differences between both groups with relative percent errors of 3% or less (a single outlier had an RPE of 12%). The average absorbance for the dielectrophoresis group in round 1 was 79% greater than the diffusion group. The average absorbance for the dielectrophoresis group in round 2 was 130% greater than the diffusion group. A single-factor ANOVA found a statistically significant difference between the diffusion and dielectrophoresis groups (P = 0.01). Part 2--The average change in Shade Guide Units (SGU) was 0.6 for the diffusion group, well under the error of measurement of 0.82 SGU. The average change in SGU for the dielectrophoresis group was 9, significantly above the error of measurement and 14 times or 1,400% greater than the diffusion group average. A single-factor ANOVA found a statistically significant difference between the diffusion and dielectrophoresis treatment groups (P < 0.001).

  2. An Evaluation of Departmental Radiation Oncology Incident Reports: Anticipating a National Reporting System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Terezakis, Stephanie A., E-mail: stereza1@jhmi.edu; Harris, Kendra M.; Ford, Eric

    Purpose: Systems to ensure patient safety are of critical importance. The electronic incident reporting systems (IRS) of 2 large academic radiation oncology departments were evaluated for events that may be suitable for submission to a national reporting system (NRS). Methods and Materials: All events recorded in the combined IRS were evaluated from 2007 through 2010. Incidents were graded for potential severity using the validated French Nuclear Safety Authority (ASN) 5-point scale. These incidents were categorized into 7 groups: (1) human error, (2) software error, (3) hardware error, (4) error in communication between 2 humans, (5) error at the human-software interface, (6) error at the software-hardware interface, and (7) error at the human-hardware interface. Results: Between the 2 systems, 4407 incidents were reported. Of these events, 1507 (34%) were considered to have the potential for clinical consequences. Of these 1507 events, 149 (10%) were rated as having a potential severity of ≥2. Of these 149 events, the committee determined that 79 (53%) would be submittable to a NRS, the majority of which were related to human error or to the human-software interface. Conclusions: A significant number of incidents were identified in this analysis. The majority of events in this study were related to human error and to the human-software interface, further supporting the need for a NRS to facilitate field-wide learning and system improvement.

  3. Sleep quality, posttraumatic stress, depression, and human errors in train drivers: a population-based nationwide study in South Korea.

    PubMed

    Jeon, Hong Jin; Kim, Ji-Hae; Kim, Bin-Na; Park, Seung Jin; Fava, Maurizio; Mischoulon, David; Kang, Eun-Ho; Roh, Sungwon; Lee, Dongsoo

    2014-12-01

    Human error is defined as an unintended error that is attributable to humans rather than machines, and that is important to avoid to prevent accidents. We aimed to investigate the association between sleep quality and human errors among train drivers in a cross-sectional, population-based study. A sample of 5,480 subjects who were actively working as train drivers were recruited in South Korea; the participants were the 4,634 drivers who completed all questionnaires (response rate 84.6%). There were no interventions. Measurements included the Pittsburgh Sleep Quality Index (PSQI), the Center for Epidemiologic Studies Depression Scale (CES-D), the Impact of Event Scale-Revised (IES-R), the State-Trait Anxiety Inventory (STAI), and the Korean Occupational Stress Scale (KOSS). Of 4,634 train drivers, 349 (7.5%) showed more than one human error per 5 y. Human errors were associated with poor sleep quality, higher PSQI total scores, short sleep duration at night, and longer sleep latency. Among train drivers with poor sleep quality, those who experienced severe posttraumatic stress showed a significantly higher number of human errors than those without. Multiple logistic regression analysis showed that human errors were significantly associated with poor sleep quality and posttraumatic stress, whereas there were no significant associations with depression, trait and state anxiety, and work stress after adjusting for age, sex, education years, marital status, and career duration. Poor sleep quality was found to be associated with more human errors in train drivers, especially in those who experienced severe posttraumatic stress. © 2014 Associated Professional Sleep Societies, LLC.

  4. Augmented burst-error correction for UNICON laser memory. [digital memory

    NASA Technical Reports Server (NTRS)

    Lim, R. S.

    1974-01-01

    A single-burst-error correction system is described for data stored in the UNICON laser memory. In the proposed system, a long fire code with code length n greater than 16,768 bits was used as an outer code to augment an existing inner shorter fire code for burst error corrections. The inner fire code is a (80,64) code shortened from the (630,614) code, and it is used to correct a single-burst-error on a per-word basis with burst length b less than or equal to 6. The outer code, with b less than or equal to 12, would be used to correct a single-burst-error on a per-page basis, where a page consists of 512 32-bit words. In the proposed system, the encoding and error detection processes are implemented by hardware. A minicomputer, currently used as a UNICON memory management processor, is used on a time-demanding basis for error correction. Based upon existing error statistics, this combination of an inner code and an outer code would enable the UNICON system to obtain a very low error rate in spite of flaws affecting the recorded data.
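
    A short arithmetic note may clarify why the shortened inner code keeps its burst-correcting power: shortening a cyclic code removes information positions only, so the parity budget is unchanged.

    ```latex
    % Both forms of the inner code carry the same redundancy: shortening the
    % (630, 614) fire code to (80, 64) drops information bits only, so
    n - k \;=\; 630 - 614 \;=\; 80 - 64 \;=\; 16 \ \text{parity bits per word},
    % which is why the shortened code retains the single-burst correction
    % capability (b <= 6) of the parent code.
    ```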

  5. Analyzing human errors in flight mission operations

    NASA Technical Reports Server (NTRS)

    Bruno, Kristin J.; Welz, Linda L.; Barnes, G. Michael; Sherif, Josef

    1993-01-01

    A long-term program is in progress at JPL to reduce cost and risk of flight mission operations through a defect prevention/error management program. The main thrust of this program is to create an environment in which the performance of the total system, both the human operator and the computer system, is optimized. To this end, 1580 Incident Surprise Anomaly reports (ISA's) from 1977-1991 were analyzed from the Voyager and Magellan projects. A Pareto analysis revealed that 38 percent of the errors were classified as human errors. A preliminary cluster analysis based on the Magellan human errors (204 ISA's) is presented here. The resulting clusters described the underlying relationships among the ISA's. Initial models of human error in flight mission operations are presented. Next, the Voyager ISA's will be scored and included in the analysis. Eventually, these relationships will be used to derive a theoretically motivated and empirically validated model of human error in flight mission operations. Ultimately, this analysis will be used to make continuous process improvements to end-user applications and training requirements. This Total Quality Management approach will enable the management and prevention of errors in the future.

  6. Paperless Grades and Faculty Development.

    ERIC Educational Resources Information Center

    Hardy, James C.; Jones, Dennis; Turner, Sandy

    2003-01-01

    Provides overview of process of switching from paper-based grade reporting to computer-based grading. Authors found that paperless grading decreased number of errors, made student access more immediate, and reduced costs incurred by purchasing and storing grade-scanning sheets. Authors also argue that direct entry grading encourages faculty to…

  7. Scheme for Terminal Guidance Utilizing Acousto-Optic Correlator.

    DTIC Science & Technology

    longitudinally extending acousto-optic device as index of refraction variation pattern signals. Real-time signals corresponding to the scene actually being viewed by the vehicle are propagated across the stored signals, and the results of an acousto-optic correlation are utilized to determine X and Y error

  8. Carrier recovery techniques on satellite mobile channels

    NASA Technical Reports Server (NTRS)

    Vucetic, B.; Du, J.

    1990-01-01

    An analytical method and a stored channel model were used to evaluate error performance of uncoded quadrature phase shift keying (QPSK) and M-ary phase shift keying (MPSK) trellis coded modulation (TCM) over shadowed satellite mobile channels in the presence of phase jitter for various carrier recovery techniques.
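
    The kind of evaluation described above can be illustrated with a small Monte Carlo sketch: uncoded Gray-mapped QPSK over AWGN, with log-normal shadowing samples standing in for a stored channel record. The shadowing spread and Eb/N0 below are assumptions, not the paper's parameters.

    ```python
    # Monte Carlo BER sketch for uncoded QPSK under shadowing (illustrative).
    import numpy as np

    rng = np.random.default_rng(4)
    n_bits = 200_000
    ebn0_db = 8.0

    bits = rng.integers(0, 2, size=n_bits)
    # Gray-mapped QPSK: bit 0 -> +1, bit 1 -> -1 on each quadrature axis
    sym = ((1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])) / np.sqrt(2)

    shadow_db = rng.normal(0.0, 4.0, size=sym.size)     # 4 dB shadowing std (assumed)
    amp = 10 ** (shadow_db / 20)

    ebn0 = 10 ** (ebn0_db / 10)
    noise_std = np.sqrt(1 / (2 * 2 * ebn0))             # 2 bits/symbol, unit Es
    rx = amp * sym + noise_std * (rng.normal(size=sym.size)
                                  + 1j * rng.normal(size=sym.size))

    bhat = np.empty(n_bits, dtype=int)
    bhat[0::2] = (rx.real < 0).astype(int)
    bhat[1::2] = (rx.imag < 0).astype(int)
    print(f"BER at Eb/N0 = {ebn0_db} dB with shadowing: {np.mean(bits != bhat):.4f}")
    ```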

  9. Human Error and the International Space Station: Challenges and Triumphs in Science Operations

    NASA Technical Reports Server (NTRS)

    Harris, Samantha S.; Simpson, Beau C.

    2016-01-01

    Any system with a human component is inherently risky. Studies in human factors and psychology have repeatedly shown that human operators will inevitably make errors, regardless of how well they are trained. Onboard the International Space Station (ISS) where crew time is arguably the most valuable resource, errors by the crew or ground operators can be costly to critical science objectives. Operations experts at the ISS Payload Operations Integration Center (POIC), located at NASA's Marshall Space Flight Center in Huntsville, Alabama, have learned that from payload concept development through execution, there are countless opportunities to introduce errors that can potentially result in costly losses of crew time and science. To effectively address this challenge, we must approach the design, testing, and operation processes with two specific goals in mind. First, a systematic approach to error and human centered design methodology should be implemented to minimize opportunities for user error. Second, we must assume that human errors will be made and enable rapid identification and recoverability when they occur. While a systematic approach and human centered development process can go a long way toward eliminating error, the complete exclusion of operator error is not a reasonable expectation. The ISS environment in particular poses challenging conditions, especially for flight controllers and astronauts. Operating a scientific laboratory 250 miles above the Earth is a complicated and dangerous task with high stakes and a steep learning curve. While human error is a reality that may never be fully eliminated, smart implementation of carefully chosen tools and techniques can go a long way toward minimizing risk and increasing the efficiency of NASA's space science operations.

  10. Modeling human response errors in synthetic flight simulator domain

    NASA Technical Reports Server (NTRS)

    Ntuen, Celestine A.

    1992-01-01

    This paper presents a control theoretic approach to modeling human response errors (HRE) in the flight simulation domain. The human pilot is modeled as a supervisor of a highly automated system. The synthesis uses the theory of optimal control pilot modeling for integrating the pilot's observation error and the error due to the simulation model (experimental error). Methods for solving the HRE problem are suggested. Experimental verification of the models will be tested in a flight quality handling simulation.

  11. Defining the Relationship Between Human Error Classes and Technology Intervention Strategies

    NASA Technical Reports Server (NTRS)

    Wiegmann, Douglas A.; Rantanen, Esa; Crisp, Vicki K. (Technical Monitor)

    2002-01-01

    One of the main factors in all aviation accidents is human error. The NASA Aviation Safety Program (AvSP), therefore, has identified several human-factors safety technologies to address this issue. Some technologies directly address human error either by attempting to reduce the occurrence of errors or by mitigating the negative consequences of errors. However, new technologies and system changes may also introduce new error opportunities or even induce different types of errors. Consequently, a thorough understanding of the relationship between error classes and technology "fixes" is crucial for the evaluation of intervention strategies outlined in the AvSP, so that resources can be effectively directed to maximize the benefit to flight safety. The purpose of the present project, therefore, was to examine the repositories of human factors data to identify the possible relationships between different error classes and technology intervention strategies. The first phase of the project, which is summarized here, involved the development of prototype data structures or matrices that map errors onto "fixes" (and vice versa), with the hope of facilitating the development of standards for evaluating safety products. Possible follow-on phases of this project are also discussed. These additional efforts include a thorough and detailed review of the literature to fill in the data matrix and the construction of a complete database and standards checklists.
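
    The "errors onto fixes" matrix idea can be sketched as a simple two-way mapping; the error classes and interventions below are illustrative placeholders, not the project's actual taxonomy.

    ```python
    # Hypothetical error-class -> intervention matrix and its inverse (sketch).
    error_to_fixes = {
        "skill-based slip":  ["error-evident displays", "input trapping"],
        "decision error":    ["decision aids", "improved feedforward"],
        "perceptual error":  ["enhanced alerting", "display redesign"],
        "routine violation": ["procedure re-design", "training"],
    }

    # Invert the mapping to ask the evaluation question from the fix side:
    fix_to_errors: dict[str, list[str]] = {}
    for err, fixes in error_to_fixes.items():
        for fix in fixes:
            fix_to_errors.setdefault(fix, []).append(err)

    print(fix_to_errors["procedure re-design"])   # -> ['routine violation']
    ```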

  12. The effect of mitomycin C after long-term storage on human Tenon's fibroblast proliferation.

    PubMed

    Hu, D; Chen, P P; Oda, D

    1999-10-01

    To investigate the effect of mitomycin C (MMC) after long-term storage on proliferation of human Tenon's fibroblasts in vitro. Human Tenon's fibroblasts in tissue culture were exposed for 5 minutes to MMC (0.4 mg/mL) that was either freshly prepared or had been stored for as long as 18 months at either 4 degrees C or -20 degrees C. The MTT colorimetric assay was used to determine the inhibition of proliferation as measured indirectly by mitochondrial activity. The inhibition rate was 88% using fresh MMC, and declined to a mean of 73% when using MMC that had been stored for as long as 18 months at 4 degrees C; this decrease was not statistically significant. The mean inhibition for MMC stored at -20 degrees C was 68%, and this was significantly less than inhibition with fresh MMC. Inhibition did not vary significantly with MMC after different storage times. Mitomycin C continues to have strong in vitro antiproliferative effects when stored for as long as 18 months at 4 degrees C or -20 degrees C. A significant decline in potency compared with fresh MMC occurs when MMC is stored at -20 degrees C.

  13. Development and implementation of a human accuracy program in patient foodservice.

    PubMed

    Eden, S H; Wood, S M; Ptak, K M

    1987-04-01

    For many years, industry has utilized the concept of human error rates to monitor and minimize human errors in the production process. A consistent, quality-controlled product increases consumer satisfaction and repeat purchase of the product. Administrative dietitians have applied the concept of human error rates (the number of errors divided by the number of opportunities for error) at four hospitals, with a total bed capacity of 788, within a tertiary-care medical center. Human error rates were used to monitor and evaluate trayline employee performance, to evaluate the layout and tasks of trayline stations, and to evaluate employees in patient service areas. Long-term employees initially opposed the error rate system with some hostility and resentment, while newer employees accepted the system. All employees now believe that the constant feedback given by supervisors enhances their self-esteem and productivity. Employee error rates are monitored daily, used to counsel employees when necessary, and utilized during annual performance evaluation. Average daily error rates for a facility staffed by new employees decreased from 7% to an acceptable 3%. In a facility staffed by long-term employees, the error rate increased, reflecting improper error documentation. Patient satisfaction surveys show that satisfaction with tray accuracy increased from 88% to 92% in the facility staffed by long-term employees and has remained above the 90% standard in the facility staffed by new employees.
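
    The error-rate metric described above reduces to a one-line calculation. A minimal Python sketch, with hypothetical tray counts for illustration:

    ```python
    # Human error rate = number of errors / number of opportunities for error,
    # as defined in the abstract. The tray counts below are hypothetical.

    def human_error_rate(errors: int, opportunities: int) -> float:
        """Return the error rate as a fraction of opportunities."""
        if opportunities <= 0:
            raise ValueError("opportunities must be positive")
        return errors / opportunities

    # Example: 21 tray errors out of 700 tray-assembly opportunities in a day.
    daily_rate = human_error_rate(errors=21, opportunities=700)
    print(f"daily error rate: {daily_rate:.1%}")  # 3.0%, at the acceptable level
    ```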

  14. Reflections on human error - Matters of life and death

    NASA Technical Reports Server (NTRS)

    Wiener, Earl L.

    1989-01-01

    The last two decades have witnessed a rapid growth in the introduction of automatic devices into aircraft cockpits, and elsewhere in human-machine systems. This was motivated in part by the assumption that when human functioning is replaced by machine functioning, human error is eliminated. Experience to date shows that this is far from true, and that automation does not replace humans, but changes their role in the system, as well as the types and severity of the errors they make. This altered role may lead to fewer, but more critical, errors. Intervention strategies to prevent these errors, or ameliorate their consequences, include basic human factors engineering of the interface, enhanced warning and alerting systems, and more intelligent interfaces that understand the strategic intent of the crew and can detect and trap inconsistent or erroneous input before it affects the system.

  15. Endodontic Procedural Errors: Frequency, Type of Error, and the Most Frequently Treated Tooth.

    PubMed

    Yousuf, Waqas; Khan, Moiz; Mehdi, Hasan

    2015-01-01

    Introduction. The aim of this study is to determine the most common endodontically treated tooth and the most common error produced during treatment, and to note the association of particular errors with particular teeth. Material and Methods. Periapical radiographs were taken of all the included teeth and were stored and assessed using DIGORA Optime. Teeth in each group were evaluated for the presence or absence of procedural errors (i.e., overfill, underfill, ledge formation, perforations, apical transportation, and/or instrument separation), and the most frequently treated tooth was also noted. Results. A total of 1748 root canal treated teeth were assessed, out of which 574 (32.8%) contained a procedural error. Of these, 397 (22.7%) were overfilled, 155 (8.9%) were underfilled, 16 (0.9%) had instrument separation, and 7 (0.4%) had apical transportation. The most frequently treated tooth was the right permanent mandibular first molar (11.3%). The least commonly treated teeth were the permanent mandibular third molars (0.1%). Conclusion. Practitioners should take greater care to maintain the accuracy of the working length throughout the procedure, as errors in length accounted for the vast majority of errors, and special care should be taken when working on molars.

  16. A stochastic dynamic model for human error analysis in nuclear power plants

    NASA Astrophysics Data System (ADS)

    Delgado-Loperena, Dharma

    Nuclear disasters like Three Mile Island and Chernobyl indicate that human performance is a critical safety issue, sending a clear message about the need to include environmental press and competence aspects in research. This investigation was undertaken to serve as a roadmap for studying human behavior through the formulation of a general solution equation. The theoretical model integrates models from two heretofore-dissociated disciplines (behavior specialists and technical specialists) that have historically studied the nature of error and human behavior independently, incorporates concepts derived from fractal and chaos theory, and suggests re-evaluation of base theory regarding human error. The results of this research were based on a comprehensive analysis of patterns of error and the omnipresent underlying structure of chaotic systems. The study of patterns led to a dynamic formulation that can serve alongside other formulas used to study the consequences of human error. The literature on error showed the need to include concepts rooted in chaos theory and strange attractors, heretofore unconsidered by mainstream researchers who investigated human error in nuclear power plants or who employed the ecological model in their work. Patterns obtained from a steam generator tube rupture (SGTR) event simulation provided a direct application to aspects of control room operations in nuclear power plants. From this, a conceptual foundation based on understanding the patterns of human error can be gleaned, helping to reduce and prevent undesirable events.

  17. It's positive to be negative: Achilles tendon work loops during human locomotion.

    PubMed

    Zelik, Karl E; Franz, Jason R

    2017-01-01

    Ultrasound imaging is increasingly used with motion and force data to quantify tendon dynamics during human movement. Frequently, tendon dynamics are estimated indirectly from muscle fascicle kinematics (by subtracting muscle from muscle-tendon unit length), but there is mounting evidence that this Indirect approach yields implausible tendon work loops. Since tendons are passive viscoelastic structures, when they undergo a loading-unloading cycle they must exhibit a negative work loop (i.e., perform net negative work). However, prior studies using this Indirect approach report large positive work loops, often estimating that tendons return 2-5 J of elastic energy for every 1 J of energy stored. More direct ultrasound estimates of tendon kinematics have emerged that quantify tendon elongations by tracking either the muscle-tendon junction or localized tendon tissue. However, it is unclear if these yield more plausible estimates of tendon dynamics. Our objective was to compute tendon work loops and hysteresis losses using these two Direct tendon kinematics estimates during human walking. We found that Direct estimates generally resulted in negative work loops, with average tendon hysteresis losses of 2-11% at 1.25 m/s and 33-49% at 0.75 m/s (N = 8), corresponding to 0.51-0.98 J of tendon energy returned for every 1 J stored. We interpret this finding to suggest that Direct approaches provide more plausible estimates than the Indirect approach, and may be preferable for understanding tendon energy storage and return. However, the Direct approaches did exhibit speed-dependent trends that are not consistent with isolated, in vitro tendon hysteresis losses of about 5-10%. These trends suggest that Direct estimates also contain some level of error, albeit much smaller than Indirect estimates. Overall, this study serves to highlight the complexity and difficulty of estimating tendon dynamics non-invasively, and the care that must be taken to interpret biological function from current ultrasound-based estimates.
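
    The work-loop logic above can be made concrete: net tendon work over a loading-unloading cycle is the closed-path integral of force with respect to length, and hysteresis loss is the dissipated fraction of the energy stored. A minimal Python sketch with synthetic force-length data (all values illustrative, not the study's):

    ```python
    import numpy as np

    def _trapz(y, x):
        """Trapezoidal integral of y dx (kept local to avoid NumPy API drift)."""
        y, x = np.asarray(y), np.asarray(x)
        return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

    def net_tendon_work(force, length):
        """Net work done by the tendon over a closed loading-unloading cycle.

        The closed-path integral of F dL equals the dissipated energy (loop
        area), so the work done by a passive tendon is its negative.
        """
        f = np.append(force, force[0])
        l = np.append(length, length[0])
        return -_trapz(f, l)

    # Synthetic cycle: the unloading limb returns slightly less force than the
    # loading limb, producing a physically plausible negative work loop.
    l_load = np.linspace(0.0, 5e-3, 100)              # elongation, m
    f_load = 4000.0 * l_load / 5e-3                   # loading limb, N
    force = np.concatenate([f_load, 0.95 * f_load[::-1]])
    length = np.concatenate([l_load, l_load[::-1]])

    w = net_tendon_work(force, length)                # negative => net energy lost
    stored = _trapz(f_load, l_load)                   # energy stored while loading
    print(f"net loop work: {w:.3f} J, hysteresis loss: {-w / stored:.1%}")
    ```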

  18. Applying axiomatic design to a medication distribution system

    NASA Astrophysics Data System (ADS)

    Raguini, Pepito B.

    As the need to minimize medication errors drives many medical facilities to come up with robust solutions to the most common errors that affect patient safety, these hospitals would be wise to put a concerted effort into finding methodologies that can facilitate an optimized medication distribution system. If a hospital's upper management is looking for an optimization method that is an ideal fit, it is just as important that the right tool be selected for the application at hand. In the present work, we propose the application of Axiomatic Design (AD), a process that focuses on the generation and selection of functional requirements to meet the customer needs for product and/or process design. The appeal of the axiomatic approach is that it provides both a formal design process and a set of technical coefficients for meeting the customer's needs. Thus, AD offers a strategy for the effective integration of people, design methods, design tools, and design data. We therefore apply the AD methodology to medical applications with the main objective of allowing nurses the opportunity to provide cost-effective delivery of medications to inpatients, thereby improving the quality of patient care. The AD methodology will be implemented through the use of focused stores, where medications can be readily stored and conveniently located near patients, as well as a mobile apparatus commonly used by hospitals that can also store medications, the medication cart. Moreover, a robust methodology called the focused store methodology will be introduced and developed for both the uncapacitated and capacitated case studies, which will set up an appropriate AD framework and design problem for a medication distribution case study.

  19. Capacity and precision in an animal model of visual short-term memory

    PubMed Central

    Lara, Antonio H.; Wallis, Jonathan D.

    2013-01-01

    Temporary storage of information in visual short-term memory (VSTM) is a key component of many complex cognitive abilities. However, it is highly limited in capacity. Understanding the neurophysiological nature of this capacity limit will require a valid animal model of VSTM. We used a multiple-item color change detection task to measure macaque monkeys' VSTM capacity. Subjects' performance deteriorated and reaction times increased as a function of the number of items in memory. Additionally, we measured the precision of the memory representations by varying the distance between sample and test colors. In trials with similar sample and test colors, subjects made more errors compared to trials with highly discriminable colors. We modeled the error distribution as a Gaussian function and used this to estimate the precision of VSTM representations. We found that as the number of items in memory increases, the precision of the representations decreases dramatically. We also found that focusing attention on one of the objects increases the precision with which that object is stored while degrading the precision of the remaining objects. These results are in line with recent findings in human psychophysics and provide a solid foundation for understanding the neurophysiological nature of the capacity limit of VSTM. PMID:22419756
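
    The precision estimate used in such studies has a simple computational core: fit a Gaussian to the distribution of response errors and take its spread as an inverse index of precision. A minimal sketch with simulated data (the error magnitudes are illustrative, not the study's values):

    ```python
    # Fit a zero-mean Gaussian to response errors; a wider fitted spread means
    # a less precise memory representation. Data below are simulated.
    import numpy as np

    rng = np.random.default_rng(0)

    def fitted_precision(errors_deg: np.ndarray) -> float:
        """MLE sd of a zero-mean Gaussian fit; precision is its reciprocal."""
        sigma = np.sqrt(np.mean(np.square(errors_deg)))
        return 1.0 / sigma

    # Simulated error distributions for 1 item vs. 4 items held in memory.
    errors_1_item = rng.normal(0.0, 12.0, size=500)   # degrees in color space
    errors_4_items = rng.normal(0.0, 28.0, size=500)  # broader => less precise

    print(f"precision (1 item):  {fitted_precision(errors_1_item):.4f}")
    print(f"precision (4 items): {fitted_precision(errors_4_items):.4f}")
    ```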

  20. Cognitive memory.

    PubMed

    Widrow, Bernard; Aragon, Juan Carlos

    2013-05-01

    Regarding the workings of the human mind, memory and pattern recognition seem to be intertwined. You generally do not have one without the other. Taking inspiration from life experience, a new form of computer memory has been devised. Certain conjectures about human memory are keys to the central idea. The design of a practical and useful "cognitive" memory system is contemplated, a memory system that may also serve as a model for many aspects of human memory. The new memory does not function like a computer memory where specific data is stored in specific numbered registers and retrieval is done by reading the contents of the specified memory register, or done by matching key words as with a document search. Incoming sensory data would be stored at the next available empty memory location, and indeed could be stored redundantly at several empty locations. The stored sensory data would neither have key words nor would it be located in known or specified memory locations. Sensory inputs concerning a single object or subject are stored together as patterns in a single "file folder" or "memory folder". When the contents of the folder are retrieved, sights, sounds, tactile feel, smell, etc., are obtained all at the same time. Retrieval would be initiated by a query or a prompt signal from a current set of sensory inputs or patterns. A search through the memory would be made to locate stored data that correlates with or relates to the prompt input. The search would be done by a retrieval system whose first stage makes use of autoassociative artificial neural networks and whose second stage relies on exhaustive search. Applications of cognitive memory systems have been made to visual aircraft identification, aircraft navigation, and human facial recognition. Concerning human memory, reasons are given why it is unlikely that long-term memory is stored in the synapses of the brain's neural networks. Reasons are given suggesting that long-term memory is stored in DNA or RNA. Neural networks are an important component of the human memory system, and their purpose is for information retrieval, not for information storage. The brain's neural networks are analog devices, subject to drift and unplanned change. Only with constant training is reliable action possible. Good training time is during sleep and while awake and making use of one's memory. A cognitive memory is a learning system. Learning involves storage of patterns or data in a cognitive memory. The learning process for cognitive memory is unsupervised, i.e. autonomous. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. Human factors process failure modes and effects analysis (HF PFMEA) software tool

    NASA Technical Reports Server (NTRS)

    Chandler, Faith T. (Inventor); Relvini, Kristine M. (Inventor); Shedd, Nathaneal P. (Inventor); Valentino, William D. (Inventor); Philippart, Monica F. (Inventor); Bessette, Colette I. (Inventor)

    2011-01-01

    Methods, computer-readable media, and systems for automatically performing Human Factors Process Failure Modes and Effects Analysis for a process are provided. At least one task involved in a process is identified, where the task includes at least one human activity. The human activity is described using at least one verb. A human error potentially resulting from the human activity is automatically identified, where the human error is related to the verb used to describe the task. The likelihoods of occurrence, detection, and correction of the human error are identified, along with the severity of its effect. Together, the likelihood of occurrence and the severity define the risk of potential harm, which is compared with a risk threshold to identify the appropriateness of corrective measures.
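
    A hypothetical sketch of this style of risk screening follows; the scales, probabilities, and threshold are illustrative assumptions, not the patented method:

    ```python
    # FMEA-style screening: combine likelihood and severity scores for a human
    # error mode and compare the result against a risk threshold. All numbers
    # and field names here are invented for illustration.
    from dataclasses import dataclass

    @dataclass
    class HumanErrorMode:
        description: str
        p_occurrence: float   # likelihood the error occurs (0-1)
        p_undetected: float   # likelihood it is NOT detected and corrected (0-1)
        severity: int         # effect severity, 1 (minor) to 10 (critical)

        def risk(self) -> float:
            """Expected severity of an uncorrected occurrence."""
            return self.p_occurrence * self.p_undetected * self.severity

    def needs_corrective_measures(mode: HumanErrorMode,
                                  threshold: float = 0.5) -> bool:
        return mode.risk() >= threshold

    mode = HumanErrorMode("operator selects wrong valve", 0.05, 0.4, 9)
    print(mode.risk(), needs_corrective_measures(mode))  # 0.18 -> below threshold
    ```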

  2. Air Force Academy Homepage

    Science.gov Websites


  3. Development of an FAA-EUROCONTROL technique for the analysis of human error in ATM : final report.

    DOT National Transportation Integrated Search

    2002-07-01

    Human error has been identified as a dominant risk factor in safety-oriented industries such as air traffic control (ATC). However, little is known about the factors leading to human errors in current air traffic management (ATM) systems. The first s...

  4. Human Error: The Stakes Are Raised.

    ERIC Educational Resources Information Center

    Greenberg, Joel

    1980-01-01

    Mistakes related to the operation of nuclear power plants and other technologically complex systems are discussed. Recommendations are given for decreasing the chance of human error in the operation of nuclear plants. The causes of the Three Mile Island incident are presented in terms of the human error element. (SA)

  5. Assessing the potential for raw meat to influence human colonization with Staphylococcus aureus.

    PubMed

    Carrel, Margaret; Zhao, Chang; Thapaliya, Dipendra; Bitterman, Patrick; Kates, Ashley E; Hanson, Blake M; Smith, Tara C

    2017-09-07

    The role of household meat handling and consumption in the transfer of Staphylococcus aureus (S. aureus) from livestock to consumers is not well understood. Examining the similarity of S. aureus colonizing humans and S. aureus in meat from the stores in which those individuals shop can provide insight into the role of meat in human S. aureus colonization. S. aureus isolates were collected from individuals in rural and urban communities in Iowa (n = 3347) and contemporaneously from meat products in stores where participants report purchasing meat (n = 913). The staphylococcal protein A (spa) gene was sequenced for all isolates to determine a spa type. Morisita indices and Permutational Multivariate Analysis of Variance Using Distance Matrices (PERMANOVA) were used to determine the relationship between spa type composition among human samples and meat samples. spa type composition was significantly different between households and meat sampled from their associated grocery stores. spa types found in meat were not significantly different regardless of the store or county in which they were sampled. spa types in people also exhibit high similarity regardless of residential location in urban or rural counties. Such findings suggest meat is not an important source of S. aureus colonization in shoppers.

  6. Avoiding Human Error in Mission Operations: Cassini Flight Experience

    NASA Technical Reports Server (NTRS)

    Burk, Thomas A.

    2012-01-01

    Operating spacecraft is a never-ending challenge and the risk of human error is ever-present. Many missions have been significantly affected by human error on the part of ground controllers. The Cassini mission at Saturn has not been immune to human error, but Cassini operations engineers use tools and follow processes that find and correct most human errors before they reach the spacecraft. What is needed are skilled engineers with good technical knowledge, good interpersonal communications, quality ground software, regular peer reviews, up-to-date procedures, as well as careful attention to detail and the discipline to test and verify all commands that will be sent to the spacecraft. Two areas of special concern are changes to flight software and response to in-flight anomalies. The Cassini team has a lot of practical experience in all these areas and they have found that well-trained engineers with good tools who follow clear procedures can catch most errors before they get into command sequences to be sent to the spacecraft. Finally, having a robust and fault-tolerant spacecraft that allows ground controllers excellent visibility of its condition is the most important way to ensure human error does not compromise the mission.

  7. Good people who try their best can have problems: recognition of human factors and how to minimise error.

    PubMed

    Brennan, Peter A; Mitchell, David A; Holmes, Simon; Plint, Simon; Parry, David

    2016-01-01

    Human error is as old as humanity itself and is an appreciable cause of mistakes by both organisations and people. Much of the work related to human factors in causing error has originated from aviation, where mistakes can be catastrophic not only for those who contribute to the error, but for passengers as well. The role of human error in medical and surgical incidents, which are often multifactorial, is becoming better understood, and includes both organisational issues (by the employer) and potential human factors (at a personal level). Mistakes as a result of individual human factors and surgical teams should be better recognised and emphasised. Attitudes towards, and acceptance of, preoperative briefing have improved since the introduction of the World Health Organization (WHO) surgical checklist. However, this does not address limitations or other safety concerns that are related to performance, such as stress and fatigue, emotional state, hunger, awareness of what is going on (situational awareness), and other factors that could potentially lead to error. Here we attempt to raise awareness of these human factors, highlight how they can lead to error, and show how they can be minimised in our day-to-day practice. Can hospitals move from being "high risk industries" to "high reliability organisations"? Copyright © 2015 The British Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  8. Using a Delphi Method to Identify Human Factors Contributing to Nursing Errors.

    PubMed

    Roth, Cheryl; Brewer, Melanie; Wieck, K Lynn

    2017-07-01

    The purpose of this study was to identify human factors associated with nursing errors. Using a Delphi technique, this study used feedback from a panel of nurse experts (n = 25) on an initial qualitative survey questionnaire followed by summarizing the results with feedback and confirmation. Synthesized factors regarding causes of errors were incorporated into a quantitative Likert-type scale, and the original expert panel participants were queried a second time to validate responses. The list identified 24 items as most common causes of nursing errors, including swamping and errors made by others that nurses are expected to recognize and fix. The responses provided a consensus top 10 errors list based on means with heavy workload and fatigue at the top of the list. The use of the Delphi survey established consensus and developed a platform upon which future study of nursing errors can evolve as a link to future solutions. This list of human factors in nursing errors should serve to stimulate dialogue among nurses about how to prevent errors and improve outcomes. Human and system failures have been the subject of an abundance of research, yet nursing errors continue to occur. © 2016 Wiley Periodicals, Inc.

  9. The stability of human, bovine and avian tuberculin purified protein derivative (PPD).

    PubMed

    Maes, Mailis; Giménez, José Francisco; D'Alessandro, Adriana; De Waard, Jacobus H

    2011-11-15

    Guidelines recommend storing tuberculin purified protein derivative (PPD) refrigerated. However, especially in developing countries, maintaining the product refrigerated under field conditions can be difficult, limiting its use. Here we determine the effect of prolonged exposure to high temperatures on the potency of human, bovine and avian tuberculin PPD. Human, bovine and avian tuberculin PPD were stored for several weeks exposed to temperatures ranging from 37 °C to 100 °C. The potency was evaluated in vivo, in sensitized or naturally infected animals. Most test conditions did not affect the biological activity of the tuberculin PPDs, and only very long and extreme incubations (several days at 100 °C) compromised the potency. Tuberculin PPD is very stable and can be stored or transported for long periods without refrigeration.

  10. Obtaining correct compile results by absorbing mismatches between data types representations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horie, Michihiro; Horii, Hiroshi H.; Kawachiya, Kiyokuni

    Methods and a system are provided. A method includes implementing a function, which a compiler for a first language does not have, using a compiler for a second language. The implementing step includes generating, by the compiler for the first language, a first abstract syntax tree. The implementing step further includes converting, by a converter, the first abstract syntax tree to a second abstract syntax tree of the compiler for the second language using a conversion table from data representation types in the first language to data representation types in the second language. When a compilation error occurs, the implementing step also includes generating a special node for error processing in the second abstract syntax tree and storing an error token in the special node. When unparsing, the implementing step additionally includes outputting the error token, in the form of source code written in the first language.

  11. Obtaining correct compile results by absorbing mismatches between data types representations

    DOEpatents

    Horie, Michihiro; Horii, Hiroshi H.; Kawachiya, Kiyokuni; Takeuchi, Mikio

    2017-03-21

    Methods and a system are provided. A method includes implementing a function, which a compiler for a first language does not have, using a compiler for a second language. The implementing step includes generating, by the compiler for the first language, a first abstract syntax tree. The implementing step further includes converting, by a converter, the first abstract syntax tree to a second abstract syntax tree of the compiler for the second language using a conversion table from data representation types in the first language to data representation types in the second language. When a compilation error occurs, the implementing step also includes generating a special node for error processing in the second abstract syntax tree and storing an error token in the special node. When unparsing, the implementing step additionally includes outputting the error token, in the form of source code written in the first language.

  12. Obtaining correct compile results by absorbing mismatches between data types representations

    DOEpatents

    Horie, Michihiro; Horii, Hiroshi H.; Kawachiya, Kiyokuni; Takeuchi, Mikio

    2017-11-21

    Methods and a system are provided. A method includes implementing a function, which a compiler for a first language does not have, using a compiler for a second language. The implementing step includes generating, by the compiler for the first language, a first abstract syntax tree. The implementing step further includes converting, by a converter, the first abstract syntax tree to a second abstract syntax tree of the compiler for the second language using a conversion table from data representation types in the first language to data representation types in the second language. When a compilation error occurs, the implementing step also includes generating a special node for error processing in the second abstract syntax tree and storing an error token in the special node. When unparsing, the implementing step additionally includes outputting the error token, in the form of source code written in the first language.
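
    The error-token mechanism in the three related records above can be illustrated compactly. The Python sketch below is a loose reconstruction, not the patented implementation; the node kinds, conversion table, and unparse rules are all hypothetical:

    ```python
    # Convert an AST between two hypothetical languages via a conversion table.
    # When a node has no mapping (a compilation error), wrap its source token
    # in a special error node; unparsing emits that token verbatim.
    from dataclasses import dataclass, field

    @dataclass
    class Node:
        kind: str
        children: list = field(default_factory=list)
        token: str = ""  # source text for leaves and error nodes

    # Hypothetical mapping between the two languages' node kinds.
    CONVERSION_TABLE = {"int_literal": "IntConst", "add": "Plus"}

    def convert(node: Node) -> Node:
        """Convert a first-language AST node into the second language's AST."""
        kind = CONVERSION_TABLE.get(node.kind)
        if kind is None:
            # Compilation error: store the offending token in an error node.
            return Node("ErrorNode", token=node.token or node.kind)
        return Node(kind, [convert(c) for c in node.children], node.token)

    def unparse(node: Node) -> str:
        """Emit second-language source; error nodes emit the stored token."""
        if node.kind == "ErrorNode":
            return node.token                  # first-language source, verbatim
        if node.kind == "IntConst":
            return node.token
        if node.kind == "Plus":
            left, right = node.children
            return f"({unparse(left)} + {unparse(right)})"
        return ""

    tree = Node("add", [Node("int_literal", token="1"),
                        Node("lambda_expr", token="x -> x")])  # no table entry
    print(unparse(convert(tree)))  # (1 + x -> x)
    ```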

  13. Biometrics encryption combining palmprint with two-layer error correction codes

    NASA Astrophysics Data System (ADS)

    Li, Hengjian; Qiu, Jian; Dong, Jiwen; Feng, Guang

    2017-07-01

    To bridge the gap between the fuzziness of biometrics and the exactitude of cryptography, a novel biometrics encryption method is proposed based on combining palmprint with two-layer error correction codes. First, the randomly generated original keys are encoded by convolutional and cyclic two-layer coding. The first layer uses a convolutional code to correct burst errors; the second layer uses a cyclic code to correct random errors. Then, the palmprint features are extracted from the palmprint images and fused with the encoded keys by an XOR operation. The resulting information is stored in a smart card. Finally, to extract the original keys, the information in the smart card is XORed with the user's palmprint features and then decoded with the convolutional and cyclic two-layer code. The experimental results and security analysis show that the method can recover the original keys completely. The proposed method is more secure than a single password factor, and has higher accuracy than a single biometric factor.
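
    The XOR-based key binding can be sketched in a few lines. The code below uses a toy repetition code as a stand-in for the paper's convolutional-plus-cyclic layers, so it is only a schematic of the scheme, not the published method:

    ```python
    # Enrollment: XOR an ECC-encoded key with palmprint feature bits and store
    # the result. Recovery: XOR with a fresh (noisy) feature and ECC-decode.
    import numpy as np

    REP = 3  # repetition factor of the toy error-correcting code

    def ecc_encode(key_bits: np.ndarray) -> np.ndarray:
        return np.repeat(key_bits, REP)

    def ecc_decode(code_bits: np.ndarray) -> np.ndarray:
        # Majority vote within each group of REP repeated bits.
        return (code_bits.reshape(-1, REP).sum(axis=1) > REP // 2).astype(np.uint8)

    rng = np.random.default_rng(1)
    key = rng.integers(0, 2, 16, dtype=np.uint8)             # random original key
    feature = rng.integers(0, 2, 16 * REP, dtype=np.uint8)   # palmprint bits

    stored = ecc_encode(key) ^ feature                       # held on the smart card

    # Verification: a noisy re-acquisition of the same palmprint.
    noise = (rng.random(feature.size) < 0.02).astype(np.uint8)
    recovered = ecc_decode(stored ^ (feature ^ noise))
    print("key recovered:", np.array_equal(recovered, key))
    ```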

  14. An affordable cuff-less blood pressure estimation solution.

    PubMed

    Jain, Monika; Kumar, Niranjan; Deb, Sujay

    2016-08-01

    This paper presents a cuff-less hypertension pre-screening device that non-invasively monitors Blood Pressure (BP) and Heart Rate (HR) continuously. The proposed device simultaneously records two clinically significant and highly correlated biomedical signals, viz., the Electrocardiogram (ECG) and Photoplethysmogram (PPG). The device provides a common data acquisition platform that can interface with a PC/laptop, smartphone/tablet, Raspberry Pi, etc. The hardware stores and processes the recorded ECG and PPG in order to extract the real-time BP and HR using a kernel regression approach. The BP and HR estimation error is measured in terms of normalized mean square error, Error Standard Deviation (ESD), and Mean Absolute Error (MAE), with respect to a clinically proven digital BP monitor (OMRON HBP1300). The computed error falls under the maximum allowable error specified by the Association for the Advancement of Medical Instrumentation: MAE < 5 mmHg and ESD < 8 mmHg. The results were also validated using a two-tailed dependent-sample t-test. The proposed device is a portable, low-cost, home- and clinic-based solution for continuous health monitoring.
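
    For reference, the two headline error metrics reduce to a few lines of arithmetic; the readings below are hypothetical:

    ```python
    # Mean absolute error (MAE) and error standard deviation (ESD) of device
    # BP estimates against a reference monitor, checked against the acceptance
    # thresholds quoted in the abstract. Sample readings are invented.
    import numpy as np

    device    = np.array([118.0, 124.0, 131.0, 109.0, 142.0])  # estimated SBP, mmHg
    reference = np.array([121.0, 122.0, 135.0, 112.0, 138.0])  # cuff monitor, mmHg

    err = device - reference
    mae = np.mean(np.abs(err))
    esd = np.std(err, ddof=1)

    print(f"MAE = {mae:.2f} mmHg, ESD = {esd:.2f} mmHg")
    print("within limits:", mae < 5.0 and esd < 8.0)  # MAE < 5, ESD < 8 mmHg
    ```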

  15. FRamework Assessing Notorious Contributing Influences for Error (FRANCIE): Perspective on Taxonomy Development to Support Error Reporting and Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lon N. Haney; David I. Gertman

    2003-04-01

    Beginning in the 1980s a primary focus of human reliability analysis was estimation of human error probabilities. However, detailed qualitative modeling with comprehensive representation of contextual variables often was lacking. This was likely due to the lack of comprehensive error and performance shaping factor taxonomies, and the limited data available on observed error rates and their relationship to specific contextual variables. In the mid 90s Boeing, America West Airlines, NASA Ames Research Center and INEEL partnered in a NASA sponsored Advanced Concepts grant to: assess the state of the art in human error analysis, identify future needs for human error analysis, and develop an approach addressing these needs. Identified needs included the need for a method to identify and prioritize task and contextual characteristics affecting human reliability. Other needs identified included developing comprehensive taxonomies to support detailed qualitative modeling and to structure meaningful data collection efforts across domains. A result was the development of the FRamework Assessing Notorious Contributing Influences for Error (FRANCIE) with a taxonomy for airline maintenance tasks. The assignment of performance shaping factors to generic errors by experts proved to be valuable to qualitative modeling. Performance shaping factors and error types from such detailed approaches can be used to structure error reporting schemes. In a recent NASA Advanced Human Support Technology grant FRANCIE was refined, and two new taxonomies for use on space missions were developed. The development, sharing, and use of error taxonomies, and the refinement of approaches for increased fidelity of qualitative modeling is offered as a means to help direct useful data collection strategies.

  16. Long-Term Stability of Human Genomic and Human Papillomavirus DNA Stored in BD SurePath and Hologic PreservCyt Liquid-Based Cytology Media

    PubMed Central

    Agreda, Patricia M.; Beitman, Gerard H.; Gutierrez, Erin C.; Harris, James M.; Koch, Kristopher R.; LaViers, William D.; Leitch, Sharon V.; Maus, Courtney E.; McMillian, Ray A.; Nussbaumer, William A.; Palmer, Marcus L. R.; Porter, Michael J.; Richart, Gregory A.; Schwab, Ryan J.

    2013-01-01

    We evaluated the effect of storage at 2 to 8°C on the stability of human genomic and human papillomavirus (HPV) DNA stored in BD SurePath and Hologic PreservCyt liquid-based cytology media. DNA retained the ability to be extracted and PCR amplified for more than 2.5 years in both medium types. Prior inability to detect DNA in archived specimens may have been due to failure of the extraction method to isolate DNA from fixed cells. PMID:23678069

  17. Investigation of misfiled cases in the PACS environment and a solution to prevent filing errors for chest radiographs.

    PubMed

    Morishita, Junji; Watanabe, Hideyuki; Katsuragawa, Shigehiko; Oda, Nobuhiro; Sukenobu, Yoshiharu; Okazaki, Hiroko; Nakata, Hajime; Doi, Kunio

    2005-01-01

    The aim of the study was to survey misfiled cases in a picture archiving and communication system environment at two hospitals and to demonstrate the potential usefulness of an automated patient recognition method for posteroanterior chest radiographs based on a template-matching technique designed to prevent filing errors. We surveyed misfiled cases obtained from different modalities in one hospital for 25 months, and misfiled cases of chest radiographs in another hospital for 17 months. For investigating the usefulness of an automated patient recognition and identification method for chest radiographs, a prospective study has been completed in clinical settings at the latter hospital. The total numbers of misfiled cases for different modalities in one hospital and for chest radiographs in another hospital were 327 and 22, respectively. The misfiled cases in the two hospitals were mainly the result of human errors (eg, incorrect manual entries of patient information, incorrect usage of identification cards in which an identification card for the previous patient was used for the next patient's image acquisition). The prospective study indicated the usefulness of the computerized method for discovering misfiled cases with a high performance (ie, an 86.4% correct warning rate for different patients and 1.5% incorrect warning rate for the same patients). We confirmed the occurrence of misfiled cases in the two hospitals. The automated patient recognition and identification method for chest radiographs would be useful in preventing wrong images from being stored in the picture archiving and communication system environment.
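
    The template-matching idea behind the automated patient-identification method can be sketched as a normalized cross-correlation between a new radiograph and the stored prior for the same patient ID; the threshold and images below are illustrative stand-ins for the published technique:

    ```python
    # Correlate a new chest radiograph with the patient's stored prior image
    # and warn when similarity is too low (possible misfiled image).
    import numpy as np

    def normalized_cross_correlation(a: np.ndarray, b: np.ndarray) -> float:
        a = (a - a.mean()) / (a.std() + 1e-12)
        b = (b - b.mean()) / (b.std() + 1e-12)
        return float(np.mean(a * b))

    rng = np.random.default_rng(2)
    prior = rng.random((64, 64))                        # stored image for this ID
    same_patient = prior + 0.1 * rng.random((64, 64))   # new image, same patient
    other_patient = rng.random((64, 64))                # new image, wrong patient

    THRESHOLD = 0.5  # hypothetical decision threshold
    for name, img in [("same", same_patient), ("other", other_patient)]:
        r = normalized_cross_correlation(prior, img)
        print(f"{name}: r = {r:.2f}, warning = {r < THRESHOLD}")
    ```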

  18. Tailoring a Human Reliability Analysis to Your Industry Needs

    NASA Technical Reports Server (NTRS)

    DeMott, D. L.

    2016-01-01

    Human error can have catastrophic consequences across many industries: airline industry mishaps, medical malpractice, medication mistakes, aerospace failures, major oil spills, transportation mishaps, power production failures and manufacturing facility incidents. Human Reliability Assessment (HRA) is used to analyze the inherent risk of human behavior or actions introducing errors into the operation of a system or process. These assessments can be used to identify where errors are most likely to arise and the potential risks involved if they do occur. Using the basic concepts of HRA, an evolving group of methodologies is used to meet various industry needs. Determining which methodology or combination of techniques will provide a quality human reliability assessment is a key element to developing effective strategies for understanding and dealing with risks caused by human errors. There are a number of concerns and difficulties in "tailoring" a Human Reliability Assessment (HRA) for different industries. Although a variety of HRA methodologies are available to analyze human error events, determining the most appropriate tools to provide the most useful results can depend on industry-specific cultures and requirements. Methodology selection may be based on a variety of factors that include: 1) how people act and react in different industries, 2) expectations based on industry standards, 3) factors that influence how the human errors could occur such as tasks, tools, environment, workplace, support, training and procedure, 4) type and availability of data, 5) how the industry views risk & reliability, and 6) types of emergencies, contingencies and routine tasks. Other considerations for methodology selection should be based on what information is needed from the assessment. If the principal concern is determination of the primary risk factors contributing to the potential human error, a more detailed analysis method may be employed, versus a requirement to provide a numerical value as part of a probabilistic risk assessment. Industries involved with humans operating large equipment or transport systems (e.g., railroads or airlines) would have more need to address the man-machine interface than medical workers administering medications. Human error occurs in every industry; in most cases the consequences are relatively benign and occasionally beneficial. In cases where the results can have disastrous consequences, the use of Human Reliability techniques to identify and classify the risk of human errors allows a company more opportunities to mitigate or eliminate these types of risks and prevent costly tragedies.

  19. If It Is Stored in My Memory I Will Surely Retrieve It: Anatomy of a Metacognitive Belief

    ERIC Educational Resources Information Center

    Kornell, Nate

    2015-01-01

    Retrieval failures--moments when a memory will not come to mind--are a universal human experience. Yet many laypeople believe human memory is a reliable storage system in which a stored memory should be accessible. I predicted that people would see retrieval failures as aberrations and predict that fewer retrieval failures would happen in the…

  20. Factors that influence the generation of autobiographical memory conjunction errors

    PubMed Central

    Devitt, Aleea L.; Monk-Fromont, Edwin; Schacter, Daniel L.; Addis, Donna Rose

    2015-01-01

    The constructive nature of memory is generally adaptive, allowing us to efficiently store, process and learn from life events, and simulate future scenarios to prepare ourselves for what may come. However, the cost of a flexibly constructive memory system is the occasional conjunction error, whereby the components of an event are authentic, but the combination of those components is false. Using a novel recombination paradigm, it was demonstrated that details from one autobiographical memory may be incorrectly incorporated into another, forming autobiographical memory conjunction errors that elude typical reality monitoring checks. The factors that contribute to the creation of these conjunction errors were examined across two experiments. Conjunction errors were more likely to occur when the corresponding details were partially rather than fully recombined, likely due to increased plausibility and ease of simulation of partially recombined scenarios. Brief periods of imagination increased conjunction error rates, in line with the imagination inflation effect. Subjective ratings suggest that this inflation is due to similarity of phenomenological experience between conjunction and authentic memories, consistent with a source monitoring perspective. Moreover, objective scoring of memory content indicates that increased perceptual detail may be particularly important for the formation of autobiographical memory conjunction errors. PMID:25611492

  1. Radiation-Hardened Solid-State Drive

    NASA Technical Reports Server (NTRS)

    Sheldon, Douglas J.

    2010-01-01

    A method is provided for a radiation-hardened (rad-hard) solid-state drive for space mission memory applications by combining rad-hard and commercial off-the-shelf (COTS) non-volatile memories (NVMs) into a hybrid architecture. The architecture is controlled by a rad-hard ASIC (application specific integrated circuit) or a FPGA (field programmable gate array). Specific error handling and data management protocols are developed for use in a rad-hard environment. The rad-hard memories are smaller in overall memory density, but are used to control and manage radiation-induced errors in the main, and much larger density, non-rad-hard COTS memory devices. Small amounts of rad-hard memory are used as error buffers and temporary caches for radiation-induced errors in the large COTS memories. The rad-hard ASIC/FPGA implements a variety of error-handling protocols to manage these radiation-induced errors. The large COTS memory is triplicated for protection, and CRC-based counters are calculated for sub-areas in each COTS NVM array. These counters are stored in the rad-hard non-volatile memory. Through monitoring, rewriting, regeneration, triplication, and long-term storage, radiation-induced errors in the large NV memory are managed. The rad-hard ASIC/FPGA also interfaces with the external computer buses.
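
    Two of the protections described, triplication with majority voting and CRC check values held in rad-hard storage, can be sketched as follows; the word sizes, CRC choice, and scrubbing flow are illustrative, not the flight design:

    ```python
    # Bitwise 2-of-3 voting across a triplicated memory region, plus a CRC
    # (kept in rad-hard non-volatile memory) to detect residual corruption.
    import zlib

    def majority_vote(copy_a: bytes, copy_b: bytes, copy_c: bytes) -> bytes:
        """Bitwise 2-of-3 vote across three copies of the same data."""
        return bytes((a & b) | (a & c) | (b & c)
                     for a, b, c in zip(copy_a, copy_b, copy_c))

    def scrub(copies: list[bytes], stored_crc: int) -> bytes:
        """Repair radiation-induced upsets and verify against the stored CRC."""
        voted = majority_vote(*copies)
        if zlib.crc32(voted) != stored_crc:
            raise RuntimeError("uncorrectable corruption in this sub-area")
        return voted

    data = b"telemetry frame 0042"
    crc = zlib.crc32(data)                 # held in rad-hard non-volatile memory
    copies = [bytearray(data) for _ in range(3)]
    copies[1][3] ^= 0x08                   # simulate a single-event upset
    print(scrub([bytes(c) for c in copies], crc) == data)  # True
    ```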

  2. Factors that influence the generation of autobiographical memory conjunction errors.

    PubMed

    Devitt, Aleea L; Monk-Fromont, Edwin; Schacter, Daniel L; Addis, Donna Rose

    2016-01-01

    The constructive nature of memory is generally adaptive, allowing us to efficiently store, process and learn from life events, and simulate future scenarios to prepare ourselves for what may come. However, the cost of a flexibly constructive memory system is the occasional conjunction error, whereby the components of an event are authentic, but the combination of those components is false. Using a novel recombination paradigm, it was demonstrated that details from one autobiographical memory (AM) may be incorrectly incorporated into another, forming AM conjunction errors that elude typical reality monitoring checks. The factors that contribute to the creation of these conjunction errors were examined across two experiments. Conjunction errors were more likely to occur when the corresponding details were partially rather than fully recombined, likely due to increased plausibility and ease of simulation of partially recombined scenarios. Brief periods of imagination increased conjunction error rates, in line with the imagination inflation effect. Subjective ratings suggest that this inflation is due to similarity of phenomenological experience between conjunction and authentic memories, consistent with a source monitoring perspective. Moreover, objective scoring of memory content indicates that increased perceptual detail may be particularly important for the formation of AM conjunction errors.

  3. Using APEX to Model Anticipated Human Error: Analysis of a GPS Navigational Aid

    NASA Technical Reports Server (NTRS)

    VanSelst, Mark; Freed, Michael; Shefto, Michael (Technical Monitor)

    1997-01-01

    The interface development process can be dramatically improved by predicting design-facilitated human error at an early stage in the design process. The approach we advocate is to SIMULATE the behavior of a human agent carrying out tasks with a well-specified user interface, ANALYZE the simulation for instances of human error, and then REFINE the interface or protocol to minimize predicted error. This approach, incorporated into the APEX modeling architecture, differs from past approaches to human simulation in its emphasis on error rather than, e.g., learning rate or speed of response. The APEX model consists of two major components: (1) a powerful action selection component capable of simulating behavior in complex, multiple-task environments; and (2) a resource architecture which constrains cognitive, perceptual, and motor capabilities to within empirically demonstrated limits. The model mimics human errors arising from interactions between limited human resources and elements of the computer interface whose design fails to anticipate those limits. We analyze the design of a hand-held Global Positioning System (GPS) device used for tactical and navigational decisions in small yacht racing. The analysis demonstrates how human system modeling can be an effective design aid, helping to accelerate the process of refining a product (or procedure).

  4. The Effect of Health Information Technology on Hospital Quality of Care

    ERIC Educational Resources Information Center

    Sun, Ruirui

    2016-01-01

    Health Information Technology (Health IT) is designed to store patients' records safely and clearly, to reduce input errors and missing records, and to make communications more efficient. Concerned with the relatively lower adoption rate among US hospitals compared to most developed countries, the Bush Administration set up the Office of…

  5. Single event upset susceptibilities of latchup immune CMOS process programmable gate arrays

    NASA Astrophysics Data System (ADS)

    Koga, R.; Crain, W. R.; Crawford, K. B.; Hansel, S. J.; Lau, D. D.; Tsubota, T. K.

    Single event upsets (SEU) and latchup susceptibilities of complementary metal oxide semiconductor programmable gate arrays (CMOS PPGA's) were measured at the Lawrence Berkeley Laboratory 88-in. cyclotron facility with Xe (603 MeV), Cu (290 MeV), and Ar (180 MeV) ion beams. The PPGA devices tested were those which may be used in space. Most of the SEU measurements were taken with a newly constructed tester called the Bus Access Storage and Comparison System (BASACS) operating via a Macintosh II computer. When BASACS finds that an output does not match a prerecorded pattern, the state of all outputs, position in the test cycle, and other necessary information is transmitted and stored in the Macintosh. The upset rate was kept between 1 and 3 per second. After a sufficient number of errors are stored, the test is stopped and the total fluence of particles and total errors are recorded. The device power supply current was closely monitored to check for occurrence of latchup. Results of the tests are presented, indicating that some of the PPGA's are good candidates for selected space applications.

  6. Utilizing semantic networks to database and retrieve generalized stochastic colored Petri nets

    NASA Technical Reports Server (NTRS)

    Farah, Jeffrey J.; Kelley, Robert B.

    1992-01-01

    Previous work has introduced the Planning Coordinator (PCOORD), a coordinator functioning within the hierarchy of the Intelligent Machine Model. Within the structure of the Planning Coordinator resides the Primitive Structure Database (PSDB), which provides the primitive structures utilized by the Planning Coordinator in establishing error recovery or on-line path plans. This report further explores the Primitive Structure Database and establishes the potential of utilizing semantic networks as a means of efficiently storing and retrieving the Generalized Stochastic Colored Petri Nets from which the error recovery plans are derived.

  7. Determination of stores pointing error due to wing flexibility under flight load

    NASA Technical Reports Server (NTRS)

    Lokos, William A.; Bahm, Catherine M.; Heinle, Robert A.

    1995-01-01

    The in-flight elastic wing twist of a fighter-type aircraft was studied to provide for an improved on-board real-time computed prediction of pointing variations of three wing store stations. This is an important capability to correct sensor pod alignment variation or to establish initial conditions of iron bombs or smart weapons prior to release. The original algorithm was based upon coarse measurements. The electro-optical Flight Deflection Measurement System measured the deformed wing shape in flight under maneuver loads to provide a higher resolution database from which an improved twist prediction algorithm could be developed. The FDMS produced excellent repeatable data. In addition, a NASTRAN finite-element analysis was performed to provide additional elastic deformation data. The FDMS data combined with the NASTRAN analysis indicated that an improved prediction algorithm could be derived by using a different set of aircraft parameters, namely normal acceleration, stores configuration, Mach number, and gross weight.

  8. Exploring human error in military aviation flight safety events using post-incident classification systems.

    PubMed

    Hooper, Brionny J; O'Hare, David P A

    2013-08-01

    Human error classification systems theoretically allow researchers to analyze postaccident data in an objective and consistent manner. The Human Factors Analysis and Classification System (HFACS) framework is one such practical analysis tool that has been widely used to classify human error in aviation. The Cognitive Error Taxonomy (CET) is another. It has been postulated that the focus on interrelationships within HFACS can facilitate the identification of the underlying causes of pilot error. The CET provides increased granularity at the level of unsafe acts. The aim was to analyze the influence of factors at higher organizational levels on the unsafe acts of front-line operators and to compare the errors of fixed-wing and rotary-wing operations. This study analyzed 288 aircraft incidents involving human error from an Australasian military organization occurring between 2001 and 2008. Action errors accounted for almost twice (44%) the proportion of rotary wing compared to fixed wing (23%) incidents. Both classificatory systems showed significant relationships between precursor factors such as the physical environment, mental and physiological states, crew resource management, training and personal readiness, and skill-based, but not decision-based, acts. The CET analysis showed different predisposing factors for different aspects of skill-based behaviors. Skill-based errors in military operations are more prevalent in rotary wing incidents and are related to higher level supervisory processes in the organization. The Cognitive Error Taxonomy provides increased granularity to HFACS analyses of unsafe acts.

  9. Two Equals One: Two Human Actions During Social Interaction Are Grouped as One Unit in Working Memory.

    PubMed

    Ding, Xiaowei; Gao, Zaifeng; Shen, Mowei

    2017-09-01

    Every day, people perceive other people performing interactive actions. Retaining these actions of human agents in working memory (WM) plays a pivotal role in a normal social life. However, whether the semantic knowledge embedded in the interactive actions has a pervasive impact on the storage of the actions in WM remains unknown. In the current study, we investigated two opposing hypotheses: (a) that WM stores the interactions individually (the individual-storage hypothesis) and (b) that WM stores the interactions as chunks (the chunk-storage hypothesis). We required participants to memorize a set of individual actions while ignoring the underlying social interactions. We found that although the social-interaction aspect was task irrelevant, the interactive actions were stored in WM as chunks that were not affected by memory load (Experiments 1 and 2); however, inverting the human actions vertically abolished this chunking effect (Experiment 3). These results suggest that WM automatically and efficiently used semantic knowledge about interactive actions to store them and support the chunk-storage hypothesis.

  10. The application of SHERPA (Systematic Human Error Reduction and Prediction Approach) in the development of compensatory cognitive rehabilitation strategies for stroke patients with left and right brain damage.

    PubMed

    Hughes, Charmayne M L; Baber, Chris; Bienkiewicz, Marta; Worthington, Andrew; Hazell, Alexa; Hermsdörfer, Joachim

    2015-01-01

    Approximately 33% of stroke patients have difficulty performing activities of daily living, often committing errors during the planning and execution of such activities. The objective of this study was to evaluate the ability of the human error identification (HEI) technique SHERPA (Systematic Human Error Reduction and Prediction Approach) to predict errors during the performance of daily activities in stroke patients with left and right hemisphere lesions. Using SHERPA we successfully predicted 36 of the 38 observed errors, with analysis indicating that the proportion of predicted and observed errors was similar for all sub-tasks and severity levels. HEI results were used to develop compensatory cognitive strategies that clinicians could employ to reduce or prevent errors from occurring. This study provides evidence for the reliability and validity of SHERPA in the design of cognitive rehabilitation strategies in stroke populations.

  11. An Analysis of U.S. Army Fratricide Incidents during the Global War on Terror (11 September 2001 to 31 March 2008)

    DTIC Science & Technology

    2010-03-15

    This report classifies U.S. Army fratricide incidents using the Human Factors Analysis and Classification System (HFACS), which is based on Reason's "Swiss cheese" model of human error causation (1990). The model describes how an accident is likely to occur when all of the errors, or "holes," align. A detailed description of HFACS can be found in Wiegmann and Shappell (2003). [Figure 1: The Swiss cheese model of human error causation.]

  12. A Quality Improvement Project to Decrease Human Milk Errors in the NICU.

    PubMed

    Oza-Frank, Reena; Kachoria, Rashmi; Dail, James; Green, Jasmine; Walls, Krista; McClead, Richard E

    2017-02-01

    Ensuring safe human milk in the NICU is a complex process with many potential points for error, of which one of the most serious is administration of the wrong milk to the wrong infant. Our objective was to describe a quality improvement initiative that was associated with a reduction in human milk administration errors identified over a 6-year period in a typical, large NICU setting. We employed a quasi-experimental time series quality improvement initiative by using tools from the model for improvement, Six Sigma methodology, and evidence-based interventions. Scanned errors were identified from the human milk barcode medication administration system. Scanned errors of interest were wrong-milk-to-wrong-infant, expired-milk, or preparation errors. The scanned error rate and the impact of additional improvement interventions from 2009 to 2015 were monitored by using statistical process control charts. From 2009 to 2015, the total number of errors scanned declined from 97.1 per 1000 bottles to 10.8. Specifically, the number of expired milk error scans declined from 84.0 per 1000 bottles to 8.9. The number of preparation errors (4.8 per 1000 bottles to 2.2) and wrong-milk-to-wrong-infant errors scanned (8.3 per 1000 bottles to 2.0) also declined. By reducing the number of errors scanned, the number of opportunities for errors also decreased. Interventions that likely had the greatest impact on reducing the number of scanned errors included installation of bedside (versus centralized) scanners and dedicated staff to handle milk. Copyright © 2017 by the American Academy of Pediatrics.
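
    The monitoring metric in this initiative, scanned errors per 1000 bottles tracked on statistical process control charts, can be sketched as a u-chart calculation; the counts below are hypothetical:

    ```python
    # u-chart for error counts per bottle: center line from pooled data,
    # 3-sigma limits for a subgroup of a given size. Counts are invented.
    import math

    def u_chart_limits(total_errors: int, total_bottles: int, n_bottles: int):
        """Center line and 3-sigma control limits (per bottle) for a subgroup."""
        u_bar = total_errors / total_bottles
        half_width = 3 * math.sqrt(u_bar / n_bottles)
        return u_bar, max(u_bar - half_width, 0.0), u_bar + half_width

    u_bar, lcl, ucl = u_chart_limits(total_errors=540, total_bottles=50_000,
                                     n_bottles=5_000)
    month_rate = 54 / 5_000  # hypothetical month: 54 errors in 5000 bottles

    print(f"center {1000*u_bar:.1f}, LCL {1000*lcl:.1f}, "
          f"UCL {1000*ucl:.1f} per 1000 bottles")
    print("special-cause signal:", not (lcl <= month_rate <= ucl))
    ```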

  13. [Study on freshness evaluation of ice-stored large yellow croaker (Pseudosciaena crocea) using near infrared spectroscopy].

    PubMed

    Liu, Yuan; Chen, Wei-Hua; Hou, Qiao-Juan; Wang, Xi-Chang; Dong, Ruo-Yan; Wu, Hao

    2014-04-01

    Near infrared spectroscopy (NIR) was used in this experiment to evaluate the freshness of ice-stored large yellow croaker (Pseudosciaena crocea) during different storage periods, with TVB-N used as the freshness index. By comparing the correlation coefficients and standard errors of the calibration and validation sets for models built with single and combined pretreatment methods, different modeling methods, and different wavelength regions, the best TVB-N models for ice-stored large yellow croaker sold in the market were established to predict freshness quickly. The best-performing model was obtained by using normalization by closure (Ncl) with 1st derivative (Dbl) and normalization to unit length (Nle) with 1st derivative as the pretreatment methods, partial least squares (PLS) as the modeling method, and the wavelength regions of 5 000-7 144 and 7 404-10 000 cm(-1). The calibration model gave a correlation coefficient of 0.992 with a standard error of calibration of 1.045, and the validation model gave a correlation coefficient of 0.999 with a standard error of prediction of 0.990. This experiment combined several pretreatment methods and selected the best wavelength region, with good results. The approach has good prospects for rapid freshness detection and quality evaluation of large yellow croaker in the market.
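
    The modeling pipeline (pretreatment, wavelength-region selection, PLS regression) can be outlined with scikit-learn; the spectra below are synthetic, and a simple first derivative stands in for the Ncl/Nle-plus-derivative pretreatments reported:

    ```python
    # Pretreat NIR spectra, restrict to the selected wavenumber regions, and
    # fit a PLS regression predicting TVB-N. All data are synthetic placeholders.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(3)
    wavenumbers = np.linspace(4000, 10000, 600)                 # cm^-1
    region = ((wavenumbers >= 5000) & (wavenumbers <= 7144)) | \
             ((wavenumbers >= 7404) & (wavenumbers <= 10000))

    tvbn = rng.uniform(5, 35, size=80)                          # mg N / 100 g
    spectra = np.outer(tvbn, np.exp(-((wavenumbers - 6000) / 800) ** 2))
    spectra += 0.5 * rng.standard_normal(spectra.shape)         # instrument noise

    X = np.gradient(spectra, axis=1)[:, region]                 # 1st-derivative pretreat
    model = PLSRegression(n_components=5).fit(X[:60], tvbn[:60])

    pred = model.predict(X[60:]).ravel()
    r = np.corrcoef(pred, tvbn[60:])[0, 1]
    sep = np.std(pred - tvbn[60:], ddof=1)
    print(f"validation r = {r:.3f}, SEP = {sep:.3f}")
    ```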

  14. Human errors and measurement uncertainty

    NASA Astrophysics Data System (ADS)

    Kuselman, Ilya; Pennecchi, Francesca

    2015-04-01

    Evaluating the residual risk of human errors in a measurement and testing laboratory, remaining after the error reduction by the laboratory quality system, and quantifying the consequences of this risk for the quality of the measurement/test results are discussed based on expert judgments and Monte Carlo simulations. A procedure for evaluation of the contribution of the residual risk to the measurement uncertainty budget is proposed. Examples are provided using earlier published sets of expert judgments on human errors in pH measurement of groundwater, elemental analysis of geological samples by inductively coupled plasma mass spectrometry, and multi-residue analysis of pesticides in fruits and vegetables. The human error contribution to the measurement uncertainty budget in the examples was not negligible, yet also not dominant. This was assessed as a good risk management result.
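
    The Monte Carlo idea is simple to illustrate: simulate measurement results, let a small fraction of runs carry an undetected human-error effect, and compare the spread with and without that residual risk. The probability and bias below are hypothetical, not the published expert judgments.

      # Monte Carlo sketch: residual human-error risk inflating an uncertainty budget.
      import numpy as np

      rng = np.random.default_rng(1)
      n = 100_000
      u_analytical = 0.02                  # standard uncertainty without human error (assumed)
      p_error, bias = 0.01, 0.10           # residual error probability and effect (assumed)

      results = rng.normal(7.0, u_analytical, n)       # e.g. simulated pH measurements
      hits = rng.random(n) < p_error                   # runs where a human error survives QC
      results[hits] += rng.choice([-bias, bias], hits.sum())

      print(f"u without human error: {u_analytical:.4f}")
      print(f"u with residual risk:  {results.std(ddof=1):.4f}")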

  15. Influence of linearly polarized near-infrared irradiation on deformability of human stored erythrocytes.

    PubMed

    Yokoyama, Kozo; Sugiyama, Kazuna

    2003-02-01

    The aim was to investigate the influence of linearly polarized near-infrared irradiation using the Super Lizer™ on the deformability of human erythrocytes. Not only low-power lasers but also linearly polarized near-infrared beams have biostimulation effects on various tissues, and there have been reports of erythrocyte deformability being improved by low-power He-Ne laser irradiation. Human erythrocyte samples stored for three weeks were adjusted to 30% hematocrit, and erythrocyte deformability, expressed as the filter filtration rate, was measured. There was no difference in the filter filtration rate between the non-irradiated control group and the group exposed to 125 mJ/cm(2) at a wavelength of 830 nm. However, the groups exposed to 625 and 1,250 mJ/cm(2) at 830 nm showed higher filter filtration rates than the control group. Linearly polarized near-infrared irradiation at 625-1,250 mJ/cm(2) and a wavelength of 830 nm thus improved the deformability of stored human erythrocytes.

  16. Human factors evaluation of remote afterloading brachytherapy: Human error and critical tasks in remote afterloading brachytherapy and approaches for improved system performance. Volume 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Callan, J.R.; Kelly, R.T.; Quinn, M.L.

    1995-05-01

    Remote Afterloading Brachytherapy (RAB) is a medical process used in the treatment of cancer. RAB uses a computer-controlled device to remotely insert and remove radioactive sources close to a target (or tumor) in the body. Some RAB problems affecting the radiation dose to the patient have been reported and attributed to human error. To determine the root cause of human error in the RAB system, a human factors team visited 23 RAB treatment sites in the US. The team observed RAB treatment planning and delivery, interviewed RAB personnel, and performed walk-throughs, during which staff demonstrated the procedures and practices used in performing RAB tasks. Factors leading to human error in the RAB system were identified. The impact of those factors on the performance of RAB was then evaluated and prioritized in terms of safety significance. Finally, the project identified and evaluated alternative approaches for resolving the safety significant problems related to human error.

  17. Human error analysis of commercial aviation accidents using the human factors analysis and classification system (HFACS)

    DOT National Transportation Integrated Search

    2001-02-01

    The Human Factors Analysis and Classification System (HFACS) is a general human error framework originally developed and tested within the U.S. military as a tool for investigating and analyzing the human causes of aviation accidents. Based upon ...

  18. Use of attribute association error probability estimates to evaluate quality of medical record geocodes.

    PubMed

    Klaus, Christian A; Carrasco, Luis E; Goldberg, Daniel W; Henry, Kevin A; Sherman, Recinda L

    2015-09-15

    The utility of patient attributes associated with the spatiotemporal analysis of medical records lies not just in their values but also the strength of association between them. Estimating the extent to which a hierarchy of conditional probability exists between patient attribute associations such as patient identifying fields, patient and date of diagnosis, and patient and address at diagnosis is fundamental to estimating the strength of association between patient and geocode, and patient and enumeration area. We propose a hierarchy for the attribute associations within medical records that enable spatiotemporal relationships. We also present a set of metrics that store attribute association error probability (AAEP), to estimate error probability for all attribute associations upon which certainty in a patient geocode depends. A series of experiments were undertaken to understand how error estimation could be operationalized within health data and what levels of AAEP in real data reveal themselves using these methods. Specifically, the goals of this evaluation were to (1) assess if the concept of our error assessment techniques could be implemented by a population-based cancer registry; (2) apply the techniques to real data from a large health data agency and characterize the observed levels of AAEP; and (3) demonstrate how detected AAEP might impact spatiotemporal health research. We present an evaluation of AAEP metrics generated for cancer cases in a North Carolina county. We show examples of how we estimated AAEP for selected attribute associations and circumstances. We demonstrate the distribution of AAEP in our case sample across attribute associations, and demonstrate ways in which disease registry specific operations influence the prevalence of AAEP estimates for specific attribute associations. The effort to detect and store estimates of AAEP is worthwhile because of the increase in confidence fostered by the attribute association level approach to the assessment of uncertainty in patient geocodes, relative to existing geocoding related uncertainty metrics.
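
    One way to picture the hierarchy is that a geocode is only as trustworthy as the chain of attribute associations beneath it. A small sketch, assuming independent associations and hypothetical AAEP values (the association names are illustrative, not the paper's metric definitions):

      # Roll per-association error probabilities (AAEP) up into the probability
      # that at least one link in a record's attribute chain is wrong.
      from math import prod

      aaep = {                                   # hypothetical values
          "patient-identity":       0.002,
          "patient-diagnosis_date": 0.010,
          "patient-address":        0.030,
          "address-geocode":        0.050,
      }

      p_chain_ok = prod(1.0 - p for p in aaep.values())
      print(f"P(all associations correct) = {p_chain_ok:.4f}")
      print(f"P(geocode chain in error)   = {1.0 - p_chain_ok:.4f}")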

  19. Hypoxia modulates the purine salvage pathway and decreases red blood cell and supernatant levels of hypoxanthine during refrigerated storage.

    PubMed

    Nemkov, Travis; Sun, Kaiqi; Reisz, Julie A; Song, Anren; Yoshida, Tatsuro; Dunham, Andrew; Wither, Matthew J; Francis, Richard O; Roach, Robert C; Dzieciatkowska, Monika; Rogers, Stephen C; Doctor, Allan; Kriebardis, Anastasios; Antonelou, Marianna; Papassideri, Issidora; Young, Carolyn T; Thomas, Tiffany A; Hansen, Kirk C; Spitalnik, Steven L; Xia, Yang; Zimring, James C; Hod, Eldad A; D'Alessandro, Angelo

    2018-02-01

    Hypoxanthine catabolism in vivo is potentially dangerous as it fuels production of urate and, most importantly, hydrogen peroxide. However, it is unclear whether accumulation of intracellular and supernatant hypoxanthine in stored red blood cell units is clinically relevant for transfused recipients. Leukoreduced red blood cells from glucose-6-phosphate dehydrogenase-normal or -deficient human volunteers were stored in AS-3 under normoxic, hyperoxic, or hypoxic conditions (with oxygen saturation ranging from <3% to >95%). Red blood cells from healthy human volunteers were also collected at sea level or after 1-7 days at high altitude (>5000 m). Finally, C57BL/6J mouse red blood cells were incubated in vitro with 13C1-aspartate or 13C5-adenosine under normoxic or hypoxic conditions, with or without deoxycoformycin, a purine deaminase inhibitor. Metabolomics analyses were performed on human and mouse red blood cells stored for up to 42 or 14 days, respectively, and correlated with 24 h post-transfusion red blood cell recovery. Hypoxanthine increased in stored red blood cell units as a function of oxygen levels. Stored red blood cells from human glucose-6-phosphate dehydrogenase-deficient donors had higher levels of deaminated purines. Hypoxia in vitro and in vivo decreased purine oxidation and enhanced purine salvage reactions in human and mouse red blood cells, which was partly explained by decreased adenosine monophosphate deaminase activity. In addition, hypoxanthine levels negatively correlated with post-transfusion red blood cell recovery in mice and - preliminarily albeit significantly - in humans. In conclusion, hypoxanthine is an in vitro metabolic marker of the red blood cell storage lesion that negatively correlates with post-transfusion recovery in vivo. Storage-dependent hypoxanthine accumulation is ameliorated by hypoxia-induced decreases in purine deamination reaction rates. Copyright© 2018 Ferrata Storti Foundation.

  1. Artificial Intelligence in Space Platforms.

    DTIC Science & Technology

    1984-12-01

    technician would be responsible for filling the data base with DSCS particular information concerning thrusters, 90 b...fault conditions and performing predefined self-preserving (entering a safe-hold state) switching actions. Is capable of storing contingency or...on-board for syntactical errors (parity, sign, logic, time). Uses coding or other self-checking techniques to minimize the effects of internally

  2. LATENT IMAGE FADING IN DOSIMETER FILM EMULSIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Musialowicz, T.; Wysopolski, J.

    Latent image fading in film emulsions produced by Foton for dosimeter purposes is investigated with regard to the influence of time. The decrease in density caused by latent image fading under normal storage conditions and relative humidity of 50 to 80% does not exceed 10% over a year. This corresponds to a dose reading error of up to 20%. (auth)

  3. Batch reporting of forest inventory statistics using the EVALIDator

    Treesearch

    Patrick D. Miles

    2015-01-01

    The EVALIDator Web application, developed in 2007, provides estimates and sampling errors of forest statistics (e.g., forest area, number of trees, tree biomass) from data stored in the Forest Inventory and Analysis database. In response to user demand, new features have been added to the EVALIDator. The most recent additions are 1) the ability to generate multiple...

  4. Fundamental bound on the persistence and capacity of short-term memory stored as graded persistent activity.

    PubMed

    Koyluoglu, Onur Ozan; Pertzov, Yoni; Manohar, Sanjay; Husain, Masud; Fiete, Ila R

    2017-09-07

    It is widely believed that persistent neural activity underlies short-term memory. Yet, as we show, the degradation of information stored directly in such networks behaves differently from human short-term memory performance. We build a more general framework where memory is viewed as a problem of passing information through noisy channels whose degradation characteristics resemble those of persistent activity networks. If the brain first encoded the information appropriately before passing the information into such networks, the information can be stored substantially more faithfully. Within this framework, we derive a fundamental lower-bound on recall precision, which declines with storage duration and number of stored items. We show that human performance, though inconsistent with models involving direct (uncoded) storage in persistent activity networks, can be well-fit by the theoretical bound. This finding is consistent with the view that if the brain stores information in patterns of persistent activity, it might use codes that minimize the effects of noise, motivating the search for such codes in the brain.

  6. Compound Stimulus Presentation Does Not Deepen Extinction in Human Causal Learning

    PubMed Central

    Griffiths, Oren; Holmes, Nathan; Westbrook, R. Fred

    2017-01-01

    Models of associative learning have proposed that cue-outcome learning critically depends on the degree of prediction error encountered during training. Two experiments examined the role of error-driven extinction learning in a human causal learning task. Target cues underwent extinction in the presence of additional cues, which differed in the degree to which they predicted the outcome, thereby manipulating outcome expectancy and, in the absence of any change in reinforcement, prediction error. These prediction error manipulations have each been shown to modulate extinction learning in aversive conditioning studies. While both manipulations resulted in increased prediction error during training, neither enhanced extinction in the present human learning task (one manipulation resulted in less extinction at test). The results are discussed with reference to the types of associations that are regulated by prediction error, the types of error terms involved in their regulation, and how these interact with parameters involved in training. PMID:28232809

  7. Storage of Unfed and Leftover Pasteurized Human Milk.

    PubMed

    Meng, Ting; Perrin, Maryanne T; Allen, Jonathan C; Osborne, Jason; Jones, Frances; Fogleman, April D

    2016-12-01

    To determine the impact of storage on bacterial growth and immunological activity of pasteurized human milk and leftover pasteurized human milk that has been exposed to the microflora in an infant's mouth. Eighteen mother-infant dyads participated in two separate studies. Mother's milk was pasteurized, and each baby was fed 1 to 2 ounces. Pasteurized and leftover pasteurized milk were stored at room (24°C) and refrigerated temperatures (4°C). After storage, milk was analyzed for bacteria, total protein, lysozyme activity, and secretory immunoglobulin A (SIgA) activity. In pasteurized and leftover pasteurized milk stored in the refrigerator for 7 days, total aerobic bacteria do not increase significantly and total protein and bioactive proteins are stable. At room temperature, there is a significant increase in total aerobic bacteria in leftover pasteurized milk during 12 hours of storage (p < 0.01) and a significant decrease in total protein and SIgA activity in pasteurized milk during 12 hours of storage (p = 0.02 and p = 0.03, respectively). When stored in the refrigerator, pasteurized and leftover pasteurized milk may be stored for at least 7 days when considering the variables studied. Caution should be used when storing pasteurized and leftover pasteurized milk at room temperature to prevent an increase in bacterial growth and a decrease in total protein and SIgA activity.

  8. Associations between food environment around schools and professionally measured weight status for middle and high school students.

    PubMed

    Tang, Xuyang; Ohri-Vachaspati, Punam; Abbott, Joshua K; Aggarwal, Rimjhim; Tulloch, David L; Lloyd, Kristen; Yedidia, Michael J

    2014-12-01

    Obesity rates among school-age children remain high. Access to energy-dense foods at home, in schools, in stores, and restaurants around homes and schools is of concern. Research on the relationship between food environment around schools and students' weight status is inconclusive. This study examines the association between weight status of middle and high school students and proximity to a comprehensive set of food outlets around schools. Deidentified nurse-measured heights and weights data were obtained for 12,954 middle and high school students attending 33 public schools in four low-income communities in New Jersey. Geocoded locations of supermarkets, convenience stores, small grocery stores, and limited-service restaurants were obtained from commercial sources. Random-effect regression models with robust standard errors were developed to adjust for unequal variances across schools and clustering of students within schools. Proximity to small grocery stores that offered some healthy options (e.g., five fruits, five vegetables, and low-fat/skim milk) and supermarkets was associated with healthier student weight status. Having a small grocery store within 0.25 mile of school and an additional such store within that radius was associated with a lower BMI z-score (p<0.05). An additional supermarket within 0.25 mile of schools was associated with a lower probability of being overweight/obese (p<0.05). Improving access to healthy food outlets, such as small stores, that offer healthy food options and supermarkets around middle and high schools is a potential strategy for improving weight outcomes among students.
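
    The analytic core, a regression with school-clustered robust standard errors, can be sketched with statsmodels. Everything below is synthetic and illustrative; variable names are assumptions, not the study's dataset.

      # Regress BMI z-score on store proximity with school-clustered robust SEs.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(3)
      n = 500
      df = pd.DataFrame({
          "school": rng.integers(0, 33, n),            # 33 schools, as in the study
          "stores_quarter_mile": rng.poisson(1.2, n),  # small grocery stores within 0.25 mile
      })
      school_effect = rng.normal(0, 0.2, 33)[df["school"]]
      df["bmi_z"] = 0.4 - 0.05 * df["stores_quarter_mile"] + school_effect + rng.normal(0, 1, n)

      model = smf.ols("bmi_z ~ stores_quarter_mile", data=df).fit(
          cov_type="cluster", cov_kwds={"groups": df["school"]})
      print(model.summary().tables[1])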

  10. Force Analysis and Energy Operation of Chaotic System of Permanent-Magnet Synchronous Motor

    NASA Astrophysics Data System (ADS)

    Qi, Guoyuan; Hu, Jianbing

    2017-12-01

    The disadvantage of a nondimensionalized model of a permanent-magnet synchronous motor (PMSM) is identified. The original PMSM model is transformed into a Kolmogorov system to aid dynamic force analysis. The vector field of the PMSM is analogous to a force field comprising four types of torque: inertial, internal, dissipative, and generalized external. Using a feedback viewpoint, the error torque between the external and dissipative torques is identified. A pitchfork bifurcation analysis of the PMSM is performed. Four forms of energy are identified for the system: kinetic, potential, dissipative, and supplied. Physical interpretations of the force decomposition and the energy exchange are given. The Casimir energy is stored energy, and its rate of change is the error power between the dissipative energy and the energy supplied to the motor. Error torque and error power influence the different types of dynamic modes. The Hamiltonian energy and Casimir energy are compared to find the role of each in producing the dynamic modes. A supremum bound for the chaotic attractor is proposed using the error power and a Lagrange multiplier.
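
    Schematically, Kolmogorov-form decompositions of this kind are often written as follows (a sketch of the general form only; the paper's specific matrices are not reproduced here):

      \dot{\mathbf{x}} \;=\; J(\mathbf{x})\,\nabla H(\mathbf{x}) \;-\; \Lambda\,\mathbf{x} \;+\; \mathbf{f}

    where the skew-symmetric term J(x)∇H supplies the inertial and internal torques, -Λx the dissipative torque, and f the generalized external torque, so the error torque discussed above is the mismatch between f and Λx.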

  11. Fault and Error Latency Under Real Workload: an Experimental Study. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Chillarege, Ram

    1986-01-01

    A practical methodology for the study of fault and error latency is demonstrated under a real workload. This is the first study that measures and quantifies the latency under real workload and fills a major gap in the current understanding of workload-failure relationships. The methodology is based on low level data gathered on a VAX 11/780 during the normal workload conditions of the installation. Fault occurrence is simulated on the data, and the error generation and discovery process is reconstructed to determine latency. The analysis proceeds to combine the low level activity data with high level machine performance data to yield a better understanding of the phenomena. A strong relationship exists between latency and workload and that relationship is quantified. The sampling and reconstruction techniques used are also validated. Error latency in the memory where the operating system resides was studied using data on the physical memory access. Fault latency in the paged section of memory was determined using data from physical memory scans. Error latency in the microcontrol store was studied using data on the microcode access and usage.

  12. A cell transportation solution that preserves live circulating tumor cells in patient blood samples.

    PubMed

    Stefansson, Steingrimur; Adams, Daniel L; Ershler, William B; Le, Huyen; Ho, David H

    2016-05-06

    Circulating tumor cells (CTCs) are typically collected into CellSave fixative tubes, which kills the cells but preserves their morphology. Currently, the clinical utility of CTCs is mostly limited to their enumeration. More detailed investigation of CTC biology can be performed on live cells, but obtaining live CTCs is technically challenging, requiring blood collection into biocompatible solutions and rapid isolation, which limits transportation options. To overcome the instability of CTCs, we formulated a sugar-based cell transportation solution (SBTS) that stabilizes cell viability at ambient temperature. In this study we examined the long-term viability of human cancer cell lines, primary cells, and CTCs in human blood samples in the SBTS for transportation purposes. Four cell lines, 5 primary human cells, and purified human PBMCs were tested to determine the viability of cells stored in the transportation solution at ambient temperature for up to 7 days. We then demonstrated viability of MCF-7 cells spiked into normal blood with SBTS and stored for up to 7 days. A pilot study was then run on blood samples from 3 patients with metastatic malignancies stored with or without SBTS for 6 days. CTCs were then purified by Ficoll separation/microfilter isolation and identified using CTC markers. Cell viability was assessed using trypan blue or CellTracker™ live cell stain. Our results suggest that primary/immortalized cell lines stored in SBTS remain ~90% viable for > 72 h. Further, MCF-7 cells spiked into whole blood remain viable when stored with SBTS for up to 7 days. Finally, live CTCs were isolated from cancer patient blood samples kept in SBTS at ambient temperature for 6 days. No CTCs were isolated from blood samples stored without SBTS. In this proof of principle pilot study we show that viability of cell lines is preserved for days using SBTS. Further, this solution can be used to store patient-derived blood samples for eventual isolation of viable CTCs after days of storage. Therefore, we suggest that effective and economical transportation of cancer patient blood samples containing live CTCs can be achieved.

  13. Human errors and occupational injuries of older female workers in the residential healthcare facilities for the elderly.

    PubMed

    Kim, Jun Sik; Jeong, Byung Yong

    2018-05-03

    The study aimed to describe the characteristics of occupational injuries among female workers in residential healthcare facilities for the elderly and to analyze human errors as causes of accidents. From the national industrial accident compensation data, 506 injuries to female workers were analyzed by age and occupation. The results showed that medical service workers were the most prevalent (54.1%), followed by social welfare workers (20.4%). Among the injured, 55.7% had <1 year of work experience, and 37.9% were ≥60 years old. Slips/falls were the most common type of accident (42.7%), and the proportion injured by slips/falls increases with age. Among human errors, action errors were the primary cause, followed by perception errors and cognition errors. Moreover, the proportions of injuries caused by perception errors and action errors each increase with age. The findings of this study suggest a need to design workplaces that accommodate the characteristics of older female workers.

  14. [Risk and risk management in aviation].

    PubMed

    Müller, Manfred

    2004-10-01

    RISK MANAGEMENT: The large proportion of human errors in aviation accidents suggested a solution that seems brilliant at first sight: replace the fallible human being with an "infallible" digitally operating computer. However, even after the introduction of the so-called HITEC airplanes, human error still accounts for 75% of all accidents. Thus, if the computer is ruled out as the ultimate safety system, how else can complex operations involving quick and difficult decisions be controlled? OPTIMIZED TEAM INTERACTION/PARALLEL CONNECTION OF THOUGHT MACHINES: Since a single person is always highly error-prone, support and control have to be guaranteed by a second person. Two minds working independently form a safety network that cushions human errors more efficiently. NON-PUNITIVE ERROR MANAGEMENT: To be able to tackle the actual problems, the open discussion of errors that have occurred must not be endangered by the threat of punishment. It has been shown in the past that progress is achieved primarily by investigating and following up mistakes, failures, and catastrophes shortly after they happen. HUMAN FACTOR RESEARCH PROJECT: A comprehensive survey showed the following result: by far the most frequent safety-critical situation (37.8% of all events) consists of the following combination of risk factors: 1. A complication develops. 2. In this situation of increased stress a human error occurs. 3. The negative effects of the error cannot be corrected or eased because of deficiencies in team interaction on the flight deck. This means, for example, that a negative social climate acts as a "turbocharger" when a human error occurs. It should be pointed out that a negative social climate is not the same as a dispute. In many cases the working climate is burdened without the responsible person even noticing it: a poor first impression, too much or too little respect, contempt, misunderstandings, unvoiced concerns, and the like can considerably reduce the efficiency of a team.

  15. Is visual short-term memory depthful?

    PubMed

    Reeves, Adam; Lei, Quan

    2014-03-01

    Does visual short-term memory (VSTM) depend on depth, as it might be if information was stored in more than one depth layer? Depth is critical in natural viewing and might be expected to affect retention, but whether this is so is currently unknown. Cued partial reports of letter arrays (Sperling, 1960) were measured up to 700 ms after display termination. Adding stereoscopic depth hardly affected VSTM capacity or decay inferred from total errors. The pattern of transposition errors (letters reported from an uncued row) was almost independent of depth and cue delay. We conclude that VSTM is effectively two-dimensional. Copyright © 2014 Elsevier Ltd. All rights reserved.

  16. Online production validation in a HEP environment

    NASA Astrophysics Data System (ADS)

    Harenberg, T.; Kuhl, T.; Lang, N.; Mättig, P.; Sandhoff, M.; Schwanenberger, C.; Volkmer, F.

    2017-03-01

    In high energy physics (HEP) event simulations, petabytes of data are processed and stored, requiring millions of CPU-years. This enormous demand for computing resources is handled by centers distributed worldwide, which form part of the LHC computing grid. The consumption of such a large amount of resources demands efficient simulation production and early detection of potential errors. In this article we present a new monitoring framework for grid environments, which polls a measure of data quality during job execution. This online monitoring facilitates the early detection of configuration errors (especially in simulation parameters) and may thus contribute to significant savings in computing resources.

  17. Lexical morphology and its role in the writing process: evidence from a case of acquired dysgraphia.

    PubMed

    Badecker, W; Hillis, A; Caramazza, A

    1990-06-01

    A case of acquired dysgraphia is presented in which the deficit is attributed to an impairment at the level of the Graphemic Output Buffer. It is argued that this patient's performance can be used to identify the representational character of the processing units that are stored in the Orthographic Output Lexicon. In particular, it is argued that the distribution of spelling errors and the types of lexical items which affect error rates indicate that the lexical representations passed from the lexical output system to the Graphemic Output Buffer correspond to the productive morphemes of the language.

  18. Common data buffer system. [communication with computational equipment utilized in spacecraft operations]

    NASA Technical Reports Server (NTRS)

    Byrne, F. (Inventor)

    1981-01-01

    A high speed common data buffer system is described for providing an interface and communications medium between a plurality of computers utilized in a distributed computer complex forming part of a checkout, command and control system for space vehicles and associated ground support equipment. The system includes the capability for temporarily storing data to be transferred between computers, for transferring a plurality of interrupts between computers, for monitoring and recording these transfers, and for correcting errors incurred in these transfers. Validity checks are made on each transfer and appropriate error notification is given to the computer associated with that transfer.
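
    The validity-check-plus-notification pattern can be sketched in a few lines. This is a generic CRC example under stated assumptions, not the patented buffer design.

      # Per-transfer validity check with error notification back to the sender.
      import zlib

      def send(payload: bytes) -> bytes:
          # Append a CRC32 so the receiving computer can validate the transfer.
          return payload + zlib.crc32(payload).to_bytes(4, "big")

      def receive(frame: bytes) -> bytes:
          payload, crc = frame[:-4], int.from_bytes(frame[-4:], "big")
          if zlib.crc32(payload) != crc:
              raise ValueError("transfer failed validity check; notifying source computer")
          return payload

      frame = bytearray(send(b"ATTITUDE=NOMINAL"))
      frame[3] ^= 0x01                       # simulate a bit error in transit
      try:
          receive(bytes(frame))
      except ValueError as err:
          print("error notification:", err)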

  19. Competition between learned reward and error outcome predictions in anterior cingulate cortex.

    PubMed

    Alexander, William H; Brown, Joshua W

    2010-02-15

    The anterior cingulate cortex (ACC) is implicated in performance monitoring and cognitive control. Non-human primate studies of ACC show prominent reward signals, but these are elusive in human studies, which instead show mainly conflict and error effects. Here we demonstrate distinct appetitive and aversive activity in human ACC. The error likelihood hypothesis suggests that ACC activity increases in proportion to the likelihood of an error, and ACC is also sensitive to the consequence magnitude of the predicted error. Previous work further showed that error likelihood effects reach a ceiling as the potential consequences of an error increase, possibly due to reductions in the average reward. We explored this issue by independently manipulating reward magnitude of task responses and error likelihood while controlling for potential error consequences in an Incentive Change Signal Task. The fMRI results ruled out a modulatory effect of expected reward on error likelihood effects in favor of a competition effect between expected reward and error likelihood. Dynamic causal modeling showed that error likelihood and expected reward signals are intrinsic to the ACC rather than received from elsewhere. These findings agree with interpretations of ACC activity as signaling both perceptions of risk and predicted reward. Copyright 2009 Elsevier Inc. All rights reserved.

  20. On the error propagation of semi-Lagrange and Fourier methods for advection problems

    PubMed Central

    Einkemmer, Lukas; Ostermann, Alexander

    2015-01-01

    In this paper we study the error propagation of numerical schemes for the advection equation in the case where high precision is desired. The numerical methods considered are based on the fast Fourier transform, polynomial interpolation (semi-Lagrangian methods using a Lagrange or spline interpolation), and a discontinuous Galerkin semi-Lagrangian approach (which is conservative and has to store more than a single value per cell). We demonstrate, by carrying out numerical experiments, that the worst case error estimates given in the literature provide a good explanation for the error propagation of the interpolation-based semi-Lagrangian methods. For the discontinuous Galerkin semi-Lagrangian method, however, we find that the characteristic property of semi-Lagrangian error estimates (namely the fact that the error increases proportionally to the number of time steps) is not observed. We provide an explanation for this behavior and conduct numerical simulations that corroborate the different qualitative features of the error in the two respective types of semi-Lagrangian methods. The method based on the fast Fourier transform is exact but, due to round-off errors, susceptible to a linear increase of the error in the number of time steps. We show how to modify the Cooley–Tukey algorithm in order to obtain an error growth that is proportional to the square root of the number of time steps. Finally, we show, for a simple model, that our conclusions hold true if the advection solver is used as part of a splitting scheme. PMID:25844018
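
    The round-off-driven error growth of the Fourier approach is easy to reproduce numerically. The sketch below advects a periodic profile by repeated FFT phase shifts and prints the error at increasing step counts; it illustrates the plain (unmodified) algorithm, not the paper's modified Cooley-Tukey variant.

      # Round-off accumulation when advecting with repeated FFT phase shifts.
      import numpy as np

      n, steps = 256, 4096
      x = np.linspace(0, 2 * np.pi, n, endpoint=False)
      k = np.fft.fftfreq(n, d=2 * np.pi / n) * 2 * np.pi   # integer wavenumbers
      shift = np.exp(-1j * k * 0.1)                        # advection by 0.1 per step

      vhat = np.fft.fft(np.sin(3 * x))
      for s in range(1, steps + 1):
          vhat *= shift
          if s in (64, 512, 4096):
              exact = np.sin(3 * (x - 0.1 * s))
              err = np.max(np.abs(np.fft.ifft(vhat).real - exact))
              print(f"steps={s:5d}  max error={err:.3e}")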

  1. Temporal uncertainty analysis of human errors based on interrelationships among multiple factors: a case of Minuteman III missile accident.

    PubMed

    Rong, Hao; Tian, Jin; Zhao, Tingdi

    2016-01-01

    In traditional approaches to human reliability assessment (HRA), the definition of the error producing conditions (EPCs) and the supporting guidance are such that some conditions (especially organizational or managerial ones) can hardly be included; the analysis is therefore incomplete and does not reflect the temporal trend of human reliability. A method based on system dynamics (SD), which highlights interrelationships among technical and organizational aspects that may contribute to human errors, is presented to facilitate quantitative estimation of the human error probability (HEP) and its related variables as they change over a long period. Taking the Minuteman III missile accident in 2008 as a case, the proposed HRA method is applied to assess the HEP during missile operations over 50 years by analyzing the interactions among the variables involved in human-related risks; the critical factors are also determined in terms of the impact the variables have on risks in different time periods. It is indicated that both technical and organizational aspects should be addressed to minimize human errors in the long run. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
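
    The flavor of the SD approach can be conveyed by a toy stock-and-flow loop in which organizational stocks drift over decades and the HEP responds. All coefficients below are hypothetical illustrations, not the paper's calibrated model.

      # Toy system-dynamics loop: organizational stocks drive HEP over 50 years.
      dt = 0.1
      training, pressure = 1.0, 0.2                 # initial stocks (arbitrary units)
      for step in range(500):                       # 50 years at dt = 0.1
          training += dt * 0.05 * (0.8 - training)  # slow drift toward a lower training norm
          pressure += dt * 0.01 * (1.0 - pressure)  # workload pressure saturating upward
          hep = 0.001 * (1 + 2 * pressure) / max(training, 0.1)
          if step % 100 == 0:                       # print every 10 simulated years
              print(f"year {step * dt:4.0f}: HEP ~ {hep:.4f}")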

  2. Systematic analysis of video data from different human-robot interaction studies: a categorization of social signals during error situations.

    PubMed

    Giuliani, Manuel; Mirnig, Nicole; Stollnberger, Gerald; Stadler, Susanne; Buchner, Roland; Tscheligi, Manfred

    2015-01-01

    Human-robot interactions are often affected by error situations that are caused by either the robot or the human. Therefore, robots would profit from the ability to recognize when error situations occur. We investigated the verbal and non-verbal social signals that humans show when error situations occur in human-robot interaction experiments. For that, we analyzed 201 videos of five human-robot interaction user studies with varying tasks from four independent projects. The analysis shows that there are two types of error situations: social norm violations and technical failures. Social norm violations are situations in which the robot does not adhere to the underlying social script of the interaction. Technical failures are caused by technical shortcomings of the robot. The results of the video analysis show that the study participants use many head movements and very few gestures, but they often smile, when in an error situation with the robot. Another result is that the participants sometimes stop moving at the beginning of error situations. We also found that the participants talked more in the case of social norm violations and less during technical failures. Finally, the participants use fewer non-verbal social signals (for example smiling, nodding, and head shaking) when they are interacting with the robot alone and no experimenter or other human is present. The results suggest that participants do not see the robot as a social interaction partner with comparable communication skills. Our findings have implications for builders and evaluators of human-robot interaction systems. Builders need to consider adding modules for the recognition and classification of head movements to the robot's input channels. Evaluators need to make sure that the presence of an experimenter does not skew the results of their user studies.

  3. Physical implementation of protected qubits

    NASA Astrophysics Data System (ADS)

    Douçot, B.; Ioffe, L. B.

    2012-07-01

    We review the general notion of topological protection of quantum states in spin models and its relation with the ideas of quantum error correction. We show that topological protection can be viewed as a Hamiltonian realization of error correction: for a quantum code for which the minimal number of errors that remain undetected is N, the corresponding Hamiltonian model of the effects of the environment noise appears only in the Nth order of the perturbation theory. We discuss the simplest model Hamiltonians that realize topological protection and their implementation in superconducting arrays. We focus on two dual realizations: in one the protected state is stored in the parity of the Cooper pair number, in the other, in the parity of the flux number. In both cases the superconducting arrays allow a number of fault-tolerant operations that should make the universal quantum computation possible.

  4. Carbon Storage in US Wetlands.

    EPA Science Inventory

    Background/Question/Methods Wetland soils contain some of the highest stores of soil carbon in the biosphere. However, there is little understanding of the quantity and distribution of carbon stored in US wetlands or of the potential effects of human disturbance on these stocks. ...

  5. The distribution and public health consequences of releases of chemicals intended for pool use in 17 states, 2001-2009.

    PubMed

    Anderson, Ayana R; Welles, Wanda Lizak; Drew, James; Orr, Maureen F

    2014-05-01

    To keep swimming pool water clean and clear, consumers purchase, transport, store, use, and dispose of large amounts of potentially hazardous chemicals. Data about incidents due to the use of these chemicals and the resultant public health impacts are limited. The authors analyzed pool chemical release data from 17 states that participated in the Agency for Toxic Substances and Disease Registry's chemical event surveillance system during 2001-2009. In 400 pool chemical incidents, 60% resulted in injuries. Of the 732 injured persons, 67% were members of the public and 50% were under 18 years old. Incidents occurred most frequently in private residences (39%), but incidents with the most injured persons (34%) occurred at recreational facilities. Human error (71.9%) was the most frequent primary contributing factor, followed by equipment failure (22.8%). Interventions designed to mitigate the public health impact associated with pool chemical releases should target both private pool owners and public pool operators.

  6. Human Milk Banking.

    PubMed

    Haiden, Nadja; Ziegler, Ekhard E

    2016-01-01

    Human milk banks play an essential role by providing human milk to infants who would otherwise not be able to receive human milk. The largest group of recipients are premature infants who derive very substantial benefits from it. Human milk protects premature infants from necrotizing enterocolitis and from sepsis, two devastating medical conditions. Milk banks collect, screen, store, process, and distribute human milk. Donating women usually nurse their own infants and have a milk supply that exceeds their own infants' needs. Donor women are carefully selected and are screened for HIV-1, HIV-2, human T-cell leukemia virus 1 and 2, hepatitis B, hepatitis C, and syphilis. In the milk bank, handling, storing, processing, pooling, and bacterial screening follow standardized algorithms. Heat treatment of human milk diminishes anti-infective properties, cellular components, growth factors, and nutrients. However, the beneficial effects of donor milk remain significant and donor milk is still highly preferable in comparison to formula. © 2017 S. Karger AG, Basel.

  7. Introduction to the Microbiological Spoilage of Foods and Beverages

    NASA Astrophysics Data System (ADS)

    Sperber, William H.

    Though direct evidence of ancient food-handling practices is difficult to obtain and examine, it seems safe to assume that over the span of several million years, prehistoric humans struggled to maintain an adequate food supply. Their daily food needed to be hunted or harvested and consumed before it spoiled and became unfit to eat. Freshly killed animals, for example, could not have been kept for very long periods of time. Moreover, many early humans were nomadic, continually searching for food. We can imagine that, with an unreliable food supply, their lives must have often been literally "feast or famine." Yet, our ancestors gradually learned by accident, or by trial and error, simple techniques that could extend the storage time of their food (Block, 1991). Their brain capacity was similar to that of modern humans; therefore, some of them were likely early scientists and technologists. They would have learned that primitive cereal grains, nuts and berries, etc. could be stored in covered vessels to keep them dry and safer from mold spoilage. Animal products could be kept in cool places or dried and smoked over a fire, as the controlled use of fire by humans is thought to have begun about 400,000 years ago. Quite likely, naturally desiccated or fermented foods were also noticed and produced routinely to provide a more stable supply of edible food. Along with the development of agricultural practices for crop and animal production, the "simple" food-handling practices developed during the relatively countless millennia of prehistory paved the way for human civilizations.

  8. Human error analysis of commercial aviation accidents: application of the Human Factors Analysis and Classification system (HFACS).

    PubMed

    Wiegmann, D A; Shappell, S A

    2001-11-01

    The Human Factors Analysis and Classification System (HFACS) is a general human error framework originally developed and tested within the U.S. military as a tool for investigating and analyzing the human causes of aviation accidents. Based on Reason's (1990) model of latent and active failures, HFACS addresses human error at all levels of the system, including the condition of aircrew and organizational factors. The purpose of the present study was to assess the utility of the HFACS framework as an error analysis and classification tool outside the military. The HFACS framework was used to analyze human error data associated with aircrew-related commercial aviation accidents that occurred between January 1990 and December 1996 using database records maintained by the NTSB and the FAA. Investigators were able to reliably accommodate all the human causal factors associated with the commercial aviation accidents examined in this study using the HFACS system. In addition, the classification of data using HFACS highlighted several critical safety issues in need of intervention research. These results demonstrate that the HFACS framework can be a viable tool for use within the civil aviation arena. However, additional research is needed to examine its applicability to areas outside the flight deck, such as aircraft maintenance and air traffic control domains.

  9. To Err Is Human; To Structurally Prime from Errors Is Also Human

    ERIC Educational Resources Information Center

    Slevc, L. Robert; Ferreira, Victor S.

    2013-01-01

    Natural language contains disfluencies and errors. Do listeners simply discard information that was clearly produced in error, or can erroneous material persist to affect subsequent processing? Two experiments explored this question using a structural priming paradigm. Speakers described dative-eliciting pictures after hearing prime sentences that…

  10. Human factors analysis and classification system-HFACS.

    DOT National Transportation Integrated Search

    2000-02-01

    Human error has been implicated in 70 to 80% of all civil and military aviation accidents. Yet, most accident reporting systems are not designed around any theoretical framework of human error. As a result, most accident databases are not conduci...

  11. CME Velocity and Acceleration Error Estimates Using the Bootstrap Method

    NASA Technical Reports Server (NTRS)

    Michalek, Grzegorz; Gopalswamy, Nat; Yashiro, Seiji

    2017-01-01

    The bootstrap method is used to determine errors of basic attributes of coronal mass ejections (CMEs) visually identified in images obtained by the Solar and Heliospheric Observatory (SOHO) mission's Large Angle and Spectrometric Coronagraph (LASCO) instruments. The basic parameters of CMEs are stored, among others, in a database known as the SOHO/LASCO CME catalog and are widely employed for many research studies. The basic attributes of CMEs (e.g. velocity and acceleration) are obtained from manually generated height-time plots. The subjective nature of manual measurements introduces random errors that are difficult to quantify. In many studies the impact of such measurement errors is overlooked. In this study we present a new way to estimate measurement errors in the basic attributes of CMEs. This approach is a computer-intensive method because it requires repeating the original data analysis procedure several times using replicate datasets. This is also commonly called the bootstrap method in the literature. We show that the bootstrap approach can be used to estimate the errors of the basic attributes of CMEs having moderately large numbers of height-time measurements. The velocity errors are in the vast majority of cases small and depend mostly on the number of height-time points measured for a particular event. In the case of acceleration, the errors are significant, and for more than half of all CMEs, they are larger than the acceleration itself.
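
    The bootstrap itself is compact: resample the height-time points with replacement, refit, and take the spread of the fitted slopes as the velocity error. The measurements below are synthetic, not SOHO/LASCO catalog data.

      # Bootstrap error estimate for a CME linear-fit velocity.
      import numpy as np

      rng = np.random.default_rng(2)
      t = np.linspace(0, 3, 12)                         # hours since first observation
      h = 2.0 + 450.0 * t + rng.normal(0, 0.3, t.size)  # synthetic heights (10^3 km)

      slopes = []
      for _ in range(2000):
          idx = rng.integers(0, t.size, t.size)         # bootstrap replicate of the points
          slopes.append(np.polyfit(t[idx], h[idx], 1)[0])

      v, v_err = np.mean(slopes), np.std(slopes, ddof=1)
      print(f"velocity = {v:.1f} +/- {v_err:.1f} (x10^3 km/h)")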

  12. Radiated EMC immunity investigation of common recognition identification platform for medical applications

    NASA Astrophysics Data System (ADS)

    Miranda, Jorge; Cabral, Jorge; Ravelo, Blaise; Wagner, Stefan; Pedersen, Christian F.; Memon, Mukhtiar; Mathiesen, Morten

    2015-01-01

    An innovative e-healthcare platform named the common recognition and identification platform (CRIP) was developed and tested as part of the CareStore project. CareStore and CRIP aim at delivering accurate and safe disease management by minimising human operator errors in hospitals and care facilities. To support this, the CRIP platform features fingerprint biometrics and near field communication (NFC) for user identification, and Bluetooth communication support for a range of telemedicine medical devices adhering to the IEEE 11073 standard. The aim of this study was to evaluate the electromagnetic compatibility (EMC) immunity of the CRIP platform in order to validate it for medical application use. The first prototype of the CRIP was demonstrated to operate as expected by showing the feasibility of user identification, both via NFC and biometrics, and by detecting Bluetooth devices via radio frequency (RF) scanning. The NFC module works in the 13.56 MHz band and the Bluetooth module works in the 2.4 GHz band, according to the IEEE 802.15.1 standard. Standard qualification testing of the CRIP for radiated EMC immunity was performed with respect to the EN 61000-4-3 standard. The immunity tests were conducted under industrial EMC compliance with electric field aggression at levels up to 10 V/m, in both horizontal and vertical polarisations, with the test antenna and the CRIP placed 3 m apart. It was found that the CRIP device complies with the European electromagnetic (EM) radiation immunity requirements.

  13. The ALICE DAQ infoLogger

    NASA Astrophysics Data System (ADS)

    Chapeland, S.; Carena, F.; Carena, W.; Chibante Barroso, V.; Costa, F.; Dénes, E.; Divià, R.; Fuchs, U.; Grigore, A.; Ionita, C.; Delort, C.; Simonetti, G.; Soós, C.; Telesca, A.; Vande Vyvre, P.; Von Haller, B.; Alice Collaboration

    2014-04-01

    ALICE (A Large Ion Collider Experiment) is a heavy-ion experiment studying the physics of strongly interacting matter and the quark-gluon plasma at the CERN LHC (Large Hadron Collider). The ALICE DAQ (Data Acquisition System) is based on a large farm of commodity hardware consisting of more than 600 devices (Linux PCs, storage, network switches). The DAQ reads the data transferred from the detectors through 500 dedicated optical links at an aggregated and sustained rate of up to 10 Gigabytes per second and stores data at up to 2.5 Gigabytes per second. The infoLogger is the log system which centrally collects the messages issued by the thousands of processes running on the DAQ machines. It allows errors to be reported on the fly and keeps a trace of runtime execution for later investigation. More than 500000 messages are stored every day in a MySQL database, in a structured table that tracks 16 indexing fields for each message (e.g. time, host, user, ...). The total amount of logs for 2012 exceeds 75GB of data and 150 million rows. We present in this paper the architecture and implementation of this distributed logging system, consisting of a client programming API, local data collector processes, a central server, and interactive human interfaces. We review the operational experience during the 2012 run, in particular the actions taken to ensure shifters receive manageable and relevant content from the main log stream. Finally, we present the performance of this log system and its future evolution.
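
    The shape of such a structured, indexed log record is easy to sketch. The field names below are assumptions loosely modeled on the description, not the actual infoLogger client API.

      # A structured log record with a few of the 16 indexing fields.
      import json, time

      def log_record(severity: str, message: str, host: str = "daq01", user: str = "daq") -> str:
          record = {
              "timestamp": time.time(),
              "severity": severity,
              "host": host,
              "user": user,
              "message": message,
          }
          return json.dumps(record)   # a local collector would forward this to the central server

      print(log_record("ERROR", "equipment readout timeout on optical link 217"))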

  14. Technical approaches for measurement of human errors

    NASA Technical Reports Server (NTRS)

    Clement, W. F.; Heffley, R. K.; Jewell, W. F.; Mcruer, D. T.

    1980-01-01

    Human error is a significant contributing factor in a very high proportion of civil transport, general aviation, and rotorcraft accidents. The technical details of a variety of proven approaches for the measurement of human errors in the context of the national airspace system are presented. Unobtrusive measurements suitable for cockpit operations and procedures in part- or full-mission simulation are emphasized. Procedure, system performance, and human operator centered measurements are discussed as they apply to the manual control, communication, supervisory, and monitoring tasks which are relevant to aviation operations.

  15. The Emperor’s New Password Manager: Security Analysis of Web-based Password Managers

    DTIC Science & Technology

    2014-07-07

    POST request, LastPass will store h’ as authenticating Alice. Mallory can then use otp’ to log in to LastPass using otp’. Of course, decrypting the...everywhere. [36] M. Rochkind. Security, forms, and error handling. In Expert PHP and MySQL, pages 191–247. Springer, 2013. [37] D. Silver, S. Jana, E

  16. Configuration Manual Polarized Proton Collider at RHIC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alekseev, I.; Allgower, C.; Bai, M.

    2006-01-01

    In this report we present our design to accelerate and store polarized protons in RHIC, with the level of polarization, luminosity, and control of systematic errors required by the approved RHIC spin physics program. We provide an overview of the physics to be studied using RHIC with polarized proton beams, and a brief description of the accelerator systems required for the project.

  17. Retained satellite information influences performance of GPS devices in a forested ecosystem

    Treesearch

    Katie M. Moriarty; Clinton W. Epps

    2015-01-01

    Global Positioning System (GPS) units used in animal telemetry often suffer from nonrandom data loss and location error. GPS units use stored satellite information to estimate locations, including almanac and ephemeris data reflecting satellite positions at weekly and at <4-hr temporal scales, respectively. Using the smallest GPS collars (45–51 g) available for...

  18. Loss tolerant speech decoder for telecommunications

    NASA Technical Reports Server (NTRS)

    Prieto, Jr., Jaime L. (Inventor)

    1999-01-01

    A method and device for extrapolating past signal-history data for insertion into missing data segments in order to conceal digital speech frame errors. The extrapolation method uses past-signal history that is stored in a buffer. The method is implemented with a device that utilizes a finite-impulse response (FIR) multi-layer feed-forward artificial neural network that is trained by back-propagation for one-step extrapolation of speech compression algorithm (SCA) parameters. Once a speech connection has been established, the speech compression algorithm device begins sending encoded speech frames. As the speech frames are received, they are decoded and converted back into speech signal voltages. During the normal decoding process, pre-processing of the required SCA parameters will occur and the results stored in the past-history buffer. If a speech frame is detected to be lost or in error, then extrapolation modules are executed and replacement SCA parameters are generated and sent as the parameters required by the SCA. In this way, the information transfer to the SCA is transparent, and the SCA processing continues as usual. The listener will not normally notice that a speech frame has been lost because of the smooth transition between the last-received, lost, and next-received speech frames.
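
    A minimal sketch of the one-step extrapolation idea the patent abstract describes, under stated assumptions: the window length, hidden-layer size, and toy parameter stream below are illustrative placeholders, not the patented FIR network design.

      # Train a small feed-forward net to predict the next frame's
      # parameter from a buffer of past frames, then use the prediction
      # in place of a "lost" frame. Gradient descent on squared error
      # (constant factors folded into the learning rate).
      import numpy as np

      rng = np.random.default_rng(0)
      WINDOW = 4          # frames of past history fed to the net (assumption)
      H = 8               # hidden units (assumption)

      # Toy "SCA parameter" stream: one slowly varying parameter.
      t = np.arange(200, dtype=float)
      stream = np.sin(0.1 * t)

      # Build (history window -> next value) training pairs.
      X = np.stack([stream[i:i + WINDOW] for i in range(len(stream) - WINDOW)])
      y = stream[WINDOW:]

      # Two-layer net trained by plain gradient descent (backprop by hand).
      W1 = rng.normal(0, 0.5, (WINDOW, H)); b1 = np.zeros(H)
      W2 = rng.normal(0, 0.5, H);           b2 = 0.0
      lr = 0.01
      for _ in range(2000):
          h = np.tanh(X @ W1 + b1)              # hidden layer
          pred = h @ W2 + b2                    # linear output
          err = pred - y
          gW2 = h.T @ err / len(y); gb2 = err.mean()
          gh = np.outer(err, W2) * (1 - h ** 2)
          gW1 = X.T @ gh / len(y); gb1 = gh.mean(axis=0)
          W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

      # Conceal a "lost frame": extrapolate from the history buffer instead.
      buffer = stream[100:100 + WINDOW]
      guess = np.tanh(buffer @ W1 + b1) @ W2 + b2
      print(f"true {stream[100 + WINDOW]:+.3f}  extrapolated {guess:+.3f}")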

  19. Dynamics and Control of Attitude, Power, and Momentum for a Spacecraft Using Flywheels and Control Moment Gyroscopes

    NASA Technical Reports Server (NTRS)

    Roithmayr, Carlos M.; Karlgaard, Christopher D.; Kumar, Renjith R.; Seywald, Hans; Bose, David M.

    2003-01-01

    Several laws are designed for simultaneous control of the orientation of an Earth-pointing spacecraft, the energy stored by counter-rotating flywheels, and the angular momentum of the flywheels and control moment gyroscopes used together as an integrated set of actuators for attitude control. General, nonlinear equations of motion are presented in vector-dyadic form, and used to obtain approximate expressions which are then linearized in preparation for design of control laws that include feedback of flywheel kinetic energy error as a means of compensating for damping exerted by rotor bearings. Two flywheel steering laws are developed such that torque commanded by an attitude control law is achieved while energy is stored or discharged at the required rate. Using the International Space Station as an example, numerical simulations are performed to demonstrate control about a torque equilibrium attitude, and illustrate the benefits of kinetic energy error feedback. Control laws for attitude hold are also developed, and used to show the amount of propellant that can be saved when flywheels assist the CMGs. Nonlinear control laws for large-angle slew maneuvers perform well, but excessive momentum is required to reorient a vehicle like the International Space Station.

  20. Effects of irrelevant sounds on phonological coding in reading comprehension and short-term memory.

    PubMed

    Boyle, R; Coltheart, V

    1996-05-01

    The effects of irrelevant sounds on reading comprehension and short-term memory were studied in two experiments. In Experiment 1, adults judged the acceptability of written sentences during irrelevant speech, accompanied and unaccompanied singing, instrumental music, and in silence. Sentences varied in syntactic complexity: Simple sentences contained a right-branching relative clause (The applause pleased the woman that gave the speech) and syntactically complex sentences included a centre-embedded relative clause (The hay that the farmer stored fed the hungry animals). Unacceptable sentences either sounded acceptable (The dog chased the cat that eight up all his food) or did not (The man praised the child that sight up his spinach). Decision accuracy was impaired by syntactic complexity but not by irrelevant sounds. Phonological coding was indicated by increased errors on unacceptable sentences that sounded correct. These error rates were unaffected by irrelevant sounds. Experiment 2 examined effects of irrelevant sounds on ordered recall of phonologically similar and dissimilar word lists. Phonological similarity impaired recall. Irrelevant speech reduced recall but did not interact with phonological similarity. The results of these experiments question assumptions about the relationship between speech input and phonological coding in reading and the short-term store.

  1. Numerical simulation of ozone concentration profile and flow characteristics in paddy bulks.

    PubMed

    Pandiselvam, Ravi; Chandrasekar, Veerapandian; Thirupathi, Venkatachalam

    2017-08-01

    Ozone has shown the potential to control stored-product insect pests. The high reactivity of ozone leads to special problems when it passes through an organic medium such as stored grain. Thus, there is a need for a simulation study to understand the concentration profile and flow characteristics of ozone in stored paddy bulks as a function of time. Simulation of ozone concentration through the paddy grain bulks was explained by applying the principle of the law of conservation along with a continuity equation. A higher ozone concentration value was observed at regions near the ozone diffuser, whereas a lower concentration value was observed at regions away from the ozone diffuser. The relative error between the experimental and predicted ozone concentration values for the entire bin geometry was less than 42.8%. The simulation model described a non-linear change of ozone concentration in stored paddy bulks. Results of this study provide a valuable source for estimating the parameters needed for effectively designing a storage bin for fumigation of paddy grains in a commercial-scale continuous-flow ozone fumigation system. © 2017 Society of Chemical Industry.
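
    The governing equation is not given in the abstract; a plausible generic form of such a conservation-plus-continuity statement for ozone moving through a grain bulk, stated here as an assumption rather than as the paper's actual model, is

      \frac{\partial C}{\partial t} + \nabla \cdot (\mathbf{u}\,C)
        = \nabla \cdot (D\,\nabla C) - k\,C,

    where C is the ozone concentration, \mathbf{u} the interstitial air velocity, D an effective diffusivity, and k a first-order decay constant for ozone consumed by reaction with the grain. The diffuser-proximity gradient reported above is what such a sink term kC produces: concentration decays with distance as ozone reacts along the flow path.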

  2. Applying different independent component analysis algorithms and support vector regression for IT chain store sales forecasting.

    PubMed

    Dai, Wensheng; Wu, Jui-Yu; Lu, Chi-Jie

    2014-01-01

    Sales forecasting is one of the most important issues in managing information technology (IT) chain store sales, since an IT chain store has many branches. Integrating a feature extraction method with a prediction tool, such as support vector regression (SVR), is a useful approach for constructing an effective sales forecasting scheme. Independent component analysis (ICA) is a novel feature extraction technique and has been widely applied to various forecasting problems. But, up to now, only the basic ICA method (i.e., the temporal ICA model) had been applied to the sales forecasting problem. In this paper, we utilize three different ICA methods, namely spatial ICA (sICA), temporal ICA (tICA), and spatiotemporal ICA (stICA), to extract features from the sales data and compare their performance in sales forecasting for an IT chain store. Experimental results from real sales data show that the sales forecasting scheme integrating stICA and SVR outperforms the comparison models in terms of forecasting error. The stICA is a promising tool for extracting effective features from branch sales data, and the extracted features can improve the prediction performance of SVR for sales forecasting.

  3. Applying Different Independent Component Analysis Algorithms and Support Vector Regression for IT Chain Store Sales Forecasting

    PubMed Central

    Dai, Wensheng

    2014-01-01

    Sales forecasting is one of the most important issues in managing information technology (IT) chain store sales, since an IT chain store has many branches. Integrating a feature extraction method with a prediction tool, such as support vector regression (SVR), is a useful approach for constructing an effective sales forecasting scheme. Independent component analysis (ICA) is a novel feature extraction technique and has been widely applied to various forecasting problems. But, up to now, only the basic ICA method (i.e., the temporal ICA model) had been applied to the sales forecasting problem. In this paper, we utilize three different ICA methods, namely spatial ICA (sICA), temporal ICA (tICA), and spatiotemporal ICA (stICA), to extract features from the sales data and compare their performance in sales forecasting for an IT chain store. Experimental results from real sales data show that the sales forecasting scheme integrating stICA and SVR outperforms the comparison models in terms of forecasting error. The stICA is a promising tool for extracting effective features from branch sales data, and the extracted features can improve the prediction performance of SVR for sales forecasting. PMID:25165740
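
    A minimal sketch of the two-stage scheme the two records above describe, under stated assumptions: scikit-learn's FastICA and SVR stand in for the paper's spatial/temporal/spatiotemporal ICA variants, and the branch sales data are synthetic placeholders.

      # Stage 1: extract independent components from multi-branch sales;
      # Stage 2: train SVR on those features to forecast next-week sales.
      import numpy as np
      from sklearn.decomposition import FastICA
      from sklearn.svm import SVR

      rng = np.random.default_rng(1)
      # Toy data: 120 weeks of cumulative sales for 10 branches.
      weeks, branches = 120, 10
      sales = rng.gamma(2.0, 50.0, (weeks, branches)).cumsum(axis=0)

      # Stage 1: a few independent components as features.
      ica = FastICA(n_components=3, random_state=1)
      features = ica.fit_transform(sales)          # shape (weeks, 3)

      # Stage 2: SVR maps this week's features to next week's total sales.
      target = sales.sum(axis=1)
      X_train, y_train = features[:-1], target[1:]
      model = SVR(kernel="rbf", C=100.0).fit(X_train, y_train)
      print("1-step-ahead forecast:", model.predict(features[-1:]).round(1))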

  4. STED super-resolution microscopy of clinical paraffin-embedded human rectal cancer tissue.

    PubMed

    Ilgen, Peter; Stoldt, Stefan; Conradi, Lena-Christin; Wurm, Christian Andreas; Rüschoff, Josef; Ghadimi, B Michael; Liersch, Torsten; Jakobs, Stefan

    2014-01-01

    Formalin-fixed and paraffin-embedded human tissue resected during cancer surgery is indispensable for diagnostic and therapeutic purposes and represents a vast and largely unexploited resource for research. Optical microscopy of such specimens is curtailed by the diffraction-limited resolution of conventional optical microscopy. To overcome this limitation, we used STED super-resolution microscopy, enabling optical resolution well below the diffraction barrier. We visualized nanoscale protein distributions in sections of well-annotated paraffin-embedded human rectal cancer tissue stored in a clinical repository. Using antisera against several mitochondrial proteins, STED microscopy revealed distinct sub-mitochondrial protein distributions, suggesting a high level of structural preservation. Analysis of human tissues stored for up to 17 years demonstrated that these samples were still amenable to super-resolution microscopy. STED microscopy of sections of HER2-positive rectal adenocarcinoma revealed details in the surface and intracellular HER2 distribution that were blurred in the corresponding conventional images, demonstrating the potential of super-resolution microscopy to explore the thus far largely untapped nanoscale regime in tissues stored in biorepositories.

  5. STED Super-Resolution Microscopy of Clinical Paraffin-Embedded Human Rectal Cancer Tissue

    PubMed Central

    Wurm, Christian Andreas; Rüschoff, Josef; Ghadimi, B. Michael; Liersch, Torsten; Jakobs, Stefan

    2014-01-01

    Formalin-fixed and paraffin-embedded human tissue resected during cancer surgery is indispensable for diagnostic and therapeutic purposes and represents a vast and largely unexploited resource for research. Optical microscopy of such specimens is curtailed by the diffraction-limited resolution of conventional optical microscopy. To overcome this limitation, we used STED super-resolution microscopy, enabling optical resolution well below the diffraction barrier. We visualized nanoscale protein distributions in sections of well-annotated paraffin-embedded human rectal cancer tissue stored in a clinical repository. Using antisera against several mitochondrial proteins, STED microscopy revealed distinct sub-mitochondrial protein distributions, suggesting a high level of structural preservation. Analysis of human tissues stored for up to 17 years demonstrated that these samples were still amenable to super-resolution microscopy. STED microscopy of sections of HER2-positive rectal adenocarcinoma revealed details in the surface and intracellular HER2 distribution that were blurred in the corresponding conventional images, demonstrating the potential of super-resolution microscopy to explore the thus far largely untapped nanoscale regime in tissues stored in biorepositories. PMID:25025184

  6. The dynamics of error processing in the human brain as reflected by high-gamma activity in noninvasive and intracranial EEG.

    PubMed

    Völker, Martin; Fiederer, Lukas D J; Berberich, Sofie; Hammer, Jiří; Behncke, Joos; Kršek, Pavel; Tomášek, Martin; Marusič, Petr; Reinacher, Peter C; Coenen, Volker A; Helias, Moritz; Schulze-Bonhage, Andreas; Burgard, Wolfram; Ball, Tonio

    2018-06-01

    Error detection in motor behavior is a fundamental cognitive function heavily relying on local cortical information processing. Neural activity in the high-gamma frequency band (HGB) closely reflects such local cortical processing, but little is known about its role in error processing, particularly in the healthy human brain. Here we characterize the error-related response of the human brain based on data obtained with noninvasive EEG optimized for HGB mapping in 31 healthy subjects (15 females, 16 males), and additional intracranial EEG data from 9 epilepsy patients (4 females, 5 males). Our findings reveal a multiscale picture of the global and local dynamics of error-related HGB activity in the human brain. On the global level as reflected in the noninvasive EEG, the error-related response started with an early component dominated by anterior brain regions, followed by a shift to parietal regions, and a subsequent phase characterized by sustained parietal HGB activity. This phase lasted for more than 1 s after the error onset. On the local level reflected in the intracranial EEG, a cascade of both transient and sustained error-related responses involved an even more extended network, spanning beyond frontal and parietal regions to the insula and the hippocampus. HGB mapping appeared especially well suited to investigate late, sustained components of the error response, possibly linked to downstream functional stages such as error-related learning and behavioral adaptation. Our findings establish the basic spatio-temporal properties of HGB activity as a neural correlate of error processing, complementing traditional error-related potential studies. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  7. Calcium sources used by post-natal human myoblasts during initial differentiation.

    PubMed

    Arnaudeau, Serge; Holzer, Nicolas; König, Stéphane; Bader, Charles R; Bernheim, Laurent

    2006-08-01

    Increases in cytoplasmic Ca(2+) are crucial for inducing the initial steps of myoblast differentiation that ultimately lead to fusion; yet the mechanisms that produce this elevated Ca(2+) have not been fully resolved. For example, it is still unclear whether the increase comes exclusively from membrane Ca(2+) influx or also from Ca(2+) release from internal stores. To address this, we investigated early differentiation of myoblast clones, each derived from a single post-natal human satellite cell. Initial differentiation was assayed by immunostaining myonuclei for the transcription factor MEF2. When Ca(2+) influx was eliminated by using low external Ca(2+) media, we found that approximately half the clones could still differentiate. Of the clones that required influx of external Ca(2+), most used T-type Ca(2+) channels, but others used store-operated channels as influx-generating mechanisms. On the other hand, clones that differentiated in low external Ca(2+) relied on Ca(2+) release from internal stores through IP(3) receptors. Interestingly, by following clones over time, we observed that some switched their preferred Ca(2+) source: clones that initially used calcium release from internal stores to differentiate later required Ca(2+) influx, and vice versa. In conclusion, we show that human myoblasts can use three alternative mechanisms to increase cytoplasmic Ca(2+) at the onset of the differentiation process: influx through T-type Ca(2+) channels, influx through store-operated channels, and release from internal stores through IP(3) receptors. In addition, we suggest that, probably because Ca(2+) elevation is essential during initial differentiation, myoblasts may be able to select between these alternative Ca(2+) pathways.

  8. Evaluation of digital dental models obtained from dental cone-beam computed tomography scan of alginate impressions

    PubMed Central

    Jiang, Tingting; Lee, Sang-Mi; Hou, Yanan; Chang, Xin

    2016-01-01

    Objective To investigate the dimensional accuracy of digital dental models obtained from dental cone-beam computed tomography (CBCT) scans of alginate impressions according to the time elapsed while the impressions are stored under ambient conditions. Methods Alginate impressions were obtained from 20 adults using 3 different alginate materials: 2 traditional alginate materials (Alginoplast and Cavex Impressional) and 1 extended-pour alginate material (Cavex ColorChange). The impressions were stored under ambient conditions and scanned by CBCT immediately after the impressions were taken, and then at 1-hour intervals for 6 hours. After reconstructing three-dimensional digital dental models, the models were measured and the data were analyzed to determine dimensional changes according to the elapsed time. Changes within the measurement error were regarded as clinically acceptable in this study. Results All measurements showed a decreasing tendency with an increase in the elapsed time after the impressions were taken. Although the extended-pour alginate exhibited a smaller decreasing tendency than the other 2 materials, there were no statistically significant differences between the materials. Changes above the measurement error occurred between the time points of 3 and 4 hours after the impressions were taken. Conclusions The results of this study indicate that digital dental models can be obtained simply from a CBCT scan of alginate impressions without sending them to a remote laboratory. However, when the impressions are not stored under special conditions, they should be scanned immediately, or at least within 2 to 3 hours after the impressions are taken. PMID:27226958

  9. Feasibility study of molecular memory device based on DNA using methylation to store information

    NASA Astrophysics Data System (ADS)

    Jiang, Liming; Qiu, Wanzhi; Al-Dirini, Feras; Hossain, Faruque M.; Evans, Robin; Skafidas, Efstratios

    2016-07-01

    DNA, because of its robustness and dense information storage capability, has been proposed as a potential candidate for next-generation storage media. However, encoding information into the DNA sequence requires molecular synthesis technology, which to date is costly and prone to synthesis errors. Reading the DNA strand information is also complex. Ideally, DNA storage will provide methods for modifying stored information. Here, we conduct a feasibility study investigating the use of the DNA 5-methylcytosine (5mC) methylation state as a molecular memory to store information. We propose a new 1-bit memory device and study, based on the density functional theory and non-equilibrium Green's function method, the feasibility of electrically reading the information. Our results show that changes to methylation states lead to changes in the peak of negative differential resistance which can be used to interrogate memory state. Our work demonstrates a new memory concept based on methylation state which can be beneficial in the design of next generation DNA based molecular electronic memory devices.

  10. Development of an errorable car-following driver model

    NASA Astrophysics Data System (ADS)

    Yang, H.-H.; Peng, H.

    2010-06-01

    An errorable car-following driver model is presented in this paper. An errorable driver model is one that emulates a human driver's functions and can generate both nominal (error-free) and devious (with error) behaviours. This model was developed for the evaluation and design of active safety systems. The car-following data used for developing and validating the model were obtained from a large-scale naturalistic driving database. The stochastic car-following behaviour was first analysed and modelled as a random process. Three error-inducing behaviours were then introduced. First, human perceptual limitation was studied and implemented. Distraction due to non-driving tasks was then identified based on statistical analysis of the driving data. Finally, the time delay of human drivers was estimated through a recursive least-squares identification process. By including these three error-inducing behaviours, rear-end collisions with the lead vehicle could occur. The simulated crash rate was found to be similar to, but somewhat higher than, that reported in traffic statistics.
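
    A minimal sketch of a recursive least-squares (RLS) recursion of the kind the abstract mentions for delay identification, under stated assumptions: the single-gain, fixed-delay response model and all numbers below are illustrative, not the paper's driver model.

      # Fit resp[k] ~ gain * stim[k - d] by scalar RLS for each candidate
      # delay d, and keep the delay whose fit has the least error.
      import numpy as np

      rng = np.random.default_rng(2)
      TRUE_DELAY, TRUE_GAIN = 5, 0.8          # samples, unitless (toy values)

      # Toy data: driver output = gain * delayed stimulus + noise.
      stim = rng.normal(size=300)
      resp = np.zeros_like(stim)
      resp[TRUE_DELAY:] = TRUE_GAIN * stim[:-TRUE_DELAY]
      resp += 0.05 * rng.normal(size=300)

      def rls_fit(x, y, lam=0.99):
          """Fit y[k] ~ theta * x[k] with exponentially weighted RLS."""
          theta, P = 0.0, 1000.0              # parameter estimate, covariance
          sse = 0.0
          for xk, yk in zip(x, y):
              k = P * xk / (lam + xk * P * xk)    # RLS gain
              e = yk - theta * xk                 # prediction error
              theta += k * e
              P = (P - k * xk * P) / lam
              sse += e * e
          return theta, sse

      # Scan candidate delays; keep the one with the smallest error.
      best = min(
          ((d,) + rls_fit(stim[:-d], resp[d:]) for d in range(1, 15)),
          key=lambda t: t[2],
      )
      print(f"estimated delay {best[0]} samples, gain {best[1]:.2f}")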

  11. Lost in Translation: the Case for Integrated Testing

    NASA Technical Reports Server (NTRS)

    Young, Aaron

    2017-01-01

    The building of a spacecraft is complex and often involves multiple suppliers and companies that have their own designs and processes. Standards have been developed across the industries to reduce the chances for critical flight errors at the system level, but the spacecraft is still vulnerable to the introduction of critical errors during integration of these systems. Critical errors can occur at any time during the process, and in many cases human reliability analysis (HRA) identifies human error as a risk driver. Most programs have a test plan in place that is intended to catch these errors, but it is not uncommon for schedule and cost stress to result in less testing than initially planned. Therefore, integrated testing, or "testing as you fly," is essential as a final check on the design and assembly to catch any errors prior to the mission. This presentation will outline the unique benefits of integrated testing in catching critical flight errors that can otherwise go undetected, discuss HRA methods that are used to identify opportunities for human error, and review lessons learned and challenges over ownership of testing.

  12. Human factors in aircraft incidents - Results of a 7-year study (Andre Allard Memorial Lecture)

    NASA Technical Reports Server (NTRS)

    Billings, C. E.; Reynard, W. D.

    1984-01-01

    It is pointed out that nearly all fatal aircraft accidents are preventable, and that most such accidents are due to human error. The present discussion is concerned with the results of a seven-year study of the data collected by the NASA Aviation Safety Reporting System (ASRS). The Aviation Safety Reporting System was designed to stimulate as large a flow as possible of information regarding errors and operational problems in the conduct of air operations. It was implemented in April, 1976. In the following 7.5 years, 35,000 reports have been received from pilots, controllers, and the armed forces. Human errors are found in more than 80 percent of these reports. Attention is given to the types of events reported, possible causal factors in incidents, the relationship of incidents and accidents, and sources of error in the data. ASRS reports include sufficient detail to permit authorities to institute changes in the national aviation system designed to minimize the likelihood of human error, and to insulate the system against the effects of errors.

  13. Human factors in surgery: from Three Mile Island to the operating room.

    PubMed

    D'Addessi, Alessandro; Bongiovanni, Luca; Volpe, Andrea; Pinto, Francesco; Bassi, PierFrancesco

    2009-01-01

    Human factors is a discipline that encompasses the science of understanding the properties of human capability, the application of this understanding to the design and development of systems and services, and the art of ensuring their successful application to a program. The field of human factors traces its origins to the Second World War, but Three Mile Island has been the best example of how groups of people react and make decisions under stress: this nuclear accident was exacerbated by wrong decisions made because the operators were overwhelmed with irrelevant, misleading or incorrect information. Errors and their nature are the same in all human activities. The predisposition for error is so intrinsic to human nature that scientifically it is best considered as inherently biologic. The causes of error in medical care may not be easily generalized. Surgery differs in important ways: most errors occur in the operating room and are technical in nature. Commonly, surgical error has been thought of as the consequence of a lack of skill or ability, the result of thoughtless actions. Moreover, the operating theatre has a unique set of team dynamics: professionals from multiple disciplines are required to work in a closely coordinated fashion. This complex environment provides multiple opportunities for unclear communication, clashing motivations, and errors arising not from technical incompetence but from poor interpersonal skills. Surgeons will have to work closely with human factors specialists in future studies. By improving processes already in place in many operating rooms, safety will be enhanced and quality increased.

  14. Short-Term Storage of Human Spermatozoa in Electrolyte-Free Medium Without Freezing Maintains Sperm Chromatin Integrity Better Than Cryopreservation1

    PubMed Central

    Riel, Jonathan M.; Yamauchi, Yasuhiro; Huang, Thomas T.F.; Grove, John; Ward, Monika A.

    2011-01-01

    Previous attempts to maintain human spermatozoa without freezing were based on short-term storage in component-rich medium and led to fast decline in motility and increased incidence of chromosome breaks. Here we report a new method in which sperm are maintained without freezing in an electrolyte-free medium (EFM) composed of glucose and bovine serum albumin. Human sperm were stored in EFM or human tubal fluid medium (HTFM) or were cryopreserved, and their motility, viability, and DNA integrity were examined at different intervals. Cryopreservation led to significant decline in sperm motility and viability and induced DNA fragmentation. Sperm stored in EFM maintained motility and viability for up to 4 and 7 wk, respectively, much longer than sperm stored in HTFM (<2 and <4 wk, respectively). DNA integrity, assessed with comet assay, was also maintained significantly better in EFM than in HTFM. One-week storage in EFM yielded motility and viability similar to that of cryopreserved sperm, but DNA integrity was significantly higher, resembling that of fresh sperm. After several weeks of storage in EFM, sperm were able to activate oocytes, undergo chromatin remodeling, and form normal zygotic chromosomes after intracytoplasmic sperm injection. This study demonstrated that human spermatozoa can be stored in EFM without freezing for several weeks while maintaining motility, viability, and chromatin integrity and that 1-wk storage in EFM offers better protection of sperm DNA integrity than cryopreservation. Sperm storage in EFM may become a viable option for the physicians working in assisted reproduction technology clinics, which would avoid cryodamage. PMID:21593474

  15. The Human Factors Analysis and Classification System : HFACS : final report.

    DOT National Transportation Integrated Search

    2000-02-01

    Human error has been implicated in 70 to 80% of all civil and military aviation accidents. Yet, most accident reporting systems are not designed around any theoretical framework of human error. As a result, most accident databases are not conducive t...

  16. Differential reliance of chimpanzees and humans on automatic and deliberate control of motor actions.

    PubMed

    Kaneko, Takaaki; Tomonaga, Masaki

    2014-06-01

    Humans are often unaware of how they control their limb motor movements. People pay attention to their own motor movements only when their usual motor routines encounter errors. Yet little is known about the extent to which voluntary actions rely on automatic control and when automatic control shifts to deliberate control in nonhuman primates. In this study, we demonstrate that chimpanzees and humans showed similar limb motor adjustment in response to feedback error during reaching actions, whereas attentional allocation inferred from gaze behavior differed. We found that humans shifted attention to their own motor kinematics as errors were induced in motor trajectory feedback regardless of whether the errors actually disrupted their reaching their action goals. In contrast, chimpanzees shifted attention to motor execution only when errors actually interfered with their achieving a planned action goal. These results indicate that the species differed in their criteria for shifting from automatic to deliberate control of motor actions. It is widely accepted that sophisticated motor repertoires have evolved in humans. Our results suggest that the deliberate monitoring of one's own motor kinematics may have evolved in the human lineage. Copyright © 2014 Elsevier B.V. All rights reserved.

  17. All-optical 10Gb/s ternary-CAM cell for routing look-up table applications.

    PubMed

    Mourgias-Alexandris, George; Vagionas, Christos; Tsakyridis, Apostolos; Maniotis, Pavlos; Pleros, Nikos

    2018-03-19

    We experimentally demonstrate the first all-optical Ternary-Content Addressable Memory (T-CAM) cell that operates at 10Gb/s and comprises two monolithically integrated InP Flip-Flops (FF) and a SOA-MZI optical XOR gate. The two FFs are responsible for storing the data bit and the ternary state 'X', respectively, with the XOR gate used for comparing the stored FF-data and the search bit. The experimental results reveal error-free operation at 10Gb/s for both Write and Ternary Content Addressing of the T-CAM cell, indicating that the proposed optical T-CAM cell could in principle lead to all-optical T-CAM-based Address Look-up memory architectures for high-end routing applications.

  18. Preventable Medical Errors Driven Modeling of Medical Best Practice Guidance Systems.

    PubMed

    Ou, Andrew Y-Z; Jiang, Yu; Wu, Po-Liang; Sha, Lui; Berlin, Richard B

    2017-01-01

    In a medical environment such as an Intensive Care Unit, there are many possible sources of error, and one important source is the effect of human intellectual tasks. When designing an interactive healthcare system such as a medical Cyber-Physical-Human System (CPHSystem), it is important to consider whether the system design can mitigate the errors caused by these tasks. In this paper, we first introduce five categories of generic human intellectual tasks, where tasks in each category may lead to potential medical errors. Then, we present an integrated modeling framework to model a medical CPHSystem and use UPPAAL as the foundation to integrate and verify the whole medical CPHSystem design model. With a verified and comprehensive model capturing the effects of human intellectual tasks, we can design a more accurate and acceptable system. We use a cardiac arrest resuscitation guidance and navigation system (CAR-GNSystem) as the medical CPHSystem modeling case. Experimental results show that the CPHSystem models help determine system design flaws and can mitigate the potential medical errors caused by human intellectual tasks.

  19. Checking Questionable Entry of Personally Identifiable Information Encrypted by One-Way Hash Transformation

    PubMed Central

    Chen, Xianlai; Fann, Yang C; McAuliffe, Matthew; Vismer, David

    2017-01-01

    Background As one of several effective solutions for personal privacy protection, a global unique identifier (GUID) is linked with hash codes that are generated from combinations of personally identifiable information (PII) by a one-way hash algorithm. On the GUID server, no PII is permitted to be stored, and only the GUID and hash codes are allowed. The quality of PII entry is critical to the GUID system. Objective The goal of our study was to explore a method of checking questionable entry of PII in this context without using or sending any portion of the PII while registering a subject. Methods According to the principle of the GUID system, all possible combination patterns of PII fields were analyzed and used to generate hash codes, which were stored on the GUID server. Based on the matching rules of the GUID system, an error-checking algorithm was developed using set theory to check PII entry errors. We selected 200,000 simulated individuals with randomly planted errors to evaluate the proposed algorithm. These errors were placed in the required PII fields or optional PII fields. The performance of the proposed algorithm was also tested in the subject registration system. Results There are 127,700 error-planted subjects, of which 114,464 (89.64%) can still be identified as the previously registered subject, while the remaining 13,236 (10.36%, 13,236/127,700) are treated as new subjects. As expected, 100% of non-identified subjects had errors within the required PII fields. The likelihood that a subject is identified is related to the count and the type of incorrect PII fields. For all identified subjects, their errors can be found by the proposed algorithm. The scope of questionable PII fields is also associated with the count and the type of incorrect PII fields. The best situation is to precisely find the exact incorrect PII fields, and the worst situation is to shrink the questionable scope only to a set of 13 PII fields. In application, the proposed algorithm can give a hint of questionable PII entry and performs as an effective tool. Conclusions The GUID system has high error tolerance and may correctly identify and associate a subject even with a few PII field errors. Correct data entry, especially of the required PII fields, is critical to avoiding false splits. In the context of one-way hash transformation, questionable input of PII may be identified by applying set theory operators based on the hash codes. The count and the type of incorrect PII fields play an important role in identifying a subject and locating questionable PII fields. PMID:28213343

  20. Checking Questionable Entry of Personally Identifiable Information Encrypted by One-Way Hash Transformation.

    PubMed

    Chen, Xianlai; Fann, Yang C; McAuliffe, Matthew; Vismer, David; Yang, Rong

    2017-02-17

    As one of several effective solutions for personal privacy protection, a global unique identifier (GUID) is linked with hash codes that are generated from combinations of personally identifiable information (PII) by a one-way hash algorithm. On the GUID server, no PII is permitted to be stored, and only the GUID and hash codes are allowed. The quality of PII entry is critical to the GUID system. The goal of our study was to explore a method of checking questionable entry of PII in this context without using or sending any portion of the PII while registering a subject. According to the principle of the GUID system, all possible combination patterns of PII fields were analyzed and used to generate hash codes, which were stored on the GUID server. Based on the matching rules of the GUID system, an error-checking algorithm was developed using set theory to check PII entry errors. We selected 200,000 simulated individuals with randomly planted errors to evaluate the proposed algorithm. These errors were placed in the required PII fields or optional PII fields. The performance of the proposed algorithm was also tested in the subject registration system. There are 127,700 error-planted subjects, of which 114,464 (89.64%) can still be identified as the previously registered subject, while the remaining 13,236 (10.36%, 13,236/127,700) are treated as new subjects. As expected, 100% of non-identified subjects had errors within the required PII fields. The likelihood that a subject is identified is related to the count and the type of incorrect PII fields. For all identified subjects, their errors can be found by the proposed algorithm. The scope of questionable PII fields is also associated with the count and the type of incorrect PII fields. The best situation is to precisely find the exact incorrect PII fields, and the worst situation is to shrink the questionable scope only to a set of 13 PII fields. In application, the proposed algorithm can give a hint of questionable PII entry and performs as an effective tool. The GUID system has high error tolerance and may correctly identify and associate a subject even with a few PII field errors. Correct data entry, especially of the required PII fields, is critical to avoiding false splits. In the context of one-way hash transformation, questionable input of PII may be identified by applying set theory operators based on the hash codes. The count and the type of incorrect PII fields play an important role in identifying a subject and locating questionable PII fields. ©Xianlai Chen, Yang C Fann, Matthew McAuliffe, David Vismer, Rong Yang. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 17.02.2017.
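
    A minimal sketch of the matching idea in the two records above, under stated assumptions: the four PII fields, the three-field combination pattern, and SHA-256 are illustrative placeholders, not the GUID system's actual rules.

      # Hash every 3-field combination of a PII record; only hashes are
      # stored. Combinations whose hashes still match after re-entry must
      # contain only correct fields, so set differences narrow down which
      # fields are questionable.
      import hashlib
      from itertools import combinations

      FIELDS = ("first_name", "last_name", "birth_date", "city")  # assumed

      def hash_codes(record):
          """One SHA-256 code per 3-field combination of the record."""
          codes = set()
          for combo in combinations(FIELDS, 3):
              text = "|".join(record[f].strip().lower() for f in combo)
              codes.add((combo, hashlib.sha256(text.encode()).hexdigest()))
          return codes

      stored = hash_codes(
          {"first_name": "Alice", "last_name": "Smith",
           "birth_date": "1980-01-02", "city": "Valencia"})
      entered = hash_codes(
          {"first_name": "Alice", "last_name": "Smyth",   # typo planted
           "birth_date": "1980-01-02", "city": "Valencia"})

      matched = {combo for combo, h in stored & entered}
      suspect = set(FIELDS) - set().union(*matched) if matched else set(FIELDS)
      print("questionable fields:", suspect)   # -> {'last_name'}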

  1. Tiny Turtles Purchased at Pet Stores are a Potential High Risk for Salmonella Human Infection in the Valencian Region, Eastern Spain.

    PubMed

    Marin, Clara; Vega, Santiago; Marco-Jiménez, Francisco

    2016-07-01

    Turtles may be considered unsafe pets, particularly in households with children. This study aimed to assess Salmonella carriage by turtles in pet stores and in private ownership to inform the public of the potential health risk, enabling informed choices around pet selection. During the period between September and October 2013, 24 pet stores and 96 private owners were sampled in the Valencian Region (Eastern Spain). The Salmonella identification procedure was based on ISO 6579:2002 recommendations (Annex D). Salmonella strains were serotyped in accordance with the Kauffman-White-Le-Minor technique. The rate of isolation of Salmonella was very high from pet store samples (75.0% ± 8.8%) and moderate for private owners (29.0% ± 4.6%). Serotyping revealed 18 different serotypes among two Salmonella enterica subspecies: S. enterica subsp. enterica and S. enterica subsp. diarizonae. The most frequently isolated serotypes were Salmonella Typhimurium (39.5%, 17/43) and Salmonella Pomona (9.3%, 4/43). The serotypes identified have previously been reported in turtles, and childhood Salmonella infections have been associated with pet turtle exposure. The present study clearly demonstrates that turtles in pet stores, as well as those kept by private owners, could be a direct or indirect source of a high risk of human Salmonella infection. In addition, pet stores should advise their customers of the potential risks associated with reptile ownership.

  2. Modeling human tracking error in several different anti-tank systems

    NASA Technical Reports Server (NTRS)

    Kleinman, D. L.

    1981-01-01

    An optimal control model for generating time histories of human tracking errors in antitank systems is outlined. Monte Carlo simulations of human operator responses for three Army antitank systems are compared. System/manipulator dependent data comparisons reflecting human operator limitations in perceiving displayed quantities and executing intended control motions are presented. Motor noise parameters are also discussed.

  3. [Using some modern mathematical models of postmortem cooling of the human body for the time of death determination].

    PubMed

    Vavilov, A Iu; Viter, V I

    2007-01-01

    Mathematical aspects of the errors of modern thermometric models of postmortem cooling of the human body are considered. The main diagnostic areas used for thermometry are analyzed with a view to minimizing these errors. The authors propose practical recommendations for reducing errors in the determination of the time since death.

  4. The Hinton train disaster.

    PubMed

    Smiley, A M

    1990-10-01

    In February of 1986 a head-on collision occurred between a freight train and a passenger train in western Canada killing 23 people and causing over $30 million of damage. A Commission of Inquiry appointed by the Canadian government concluded that human error was the major reason for the collision. This report discusses the factors contributing to the human error: mainly poor work-rest schedules, the monotonous nature of the train driving task, insufficient information about train movements, and the inadequate backup systems in case of human error.

  5. A Conceptual Framework for Predicting Error in Complex Human-Machine Environments

    NASA Technical Reports Server (NTRS)

    Freed, Michael; Remington, Roger; Null, Cynthia H. (Technical Monitor)

    1998-01-01

    We present a Goals, Operators, Methods, and Selection Rules-Model Human Processor (GOMS-MHP) style model-based approach to the problem of predicting human habit capture errors. Habit captures occur when the model fails to allocate limited cognitive resources to retrieve task-relevant information from memory. Lacking the unretrieved information, decision mechanisms act in accordance with implicit default assumptions, resulting in error when relied upon assumptions prove incorrect. The model helps interface designers identify situations in which such failures are especially likely.

  6. Climate model biases in seasonality of continental water storage revealed by satellite gravimetry

    USGS Publications Warehouse

    Swenson, Sean; Milly, P.C.D.

    2006-01-01

    Satellite gravimetric observations of monthly changes in continental water storage are compared with outputs from five climate models. All models qualitatively reproduce the global pattern of annual storage amplitude, and the seasonal cycle of global average storage is reproduced well, consistent with earlier studies. However, global average agreements mask systematic model biases in low latitudes. Seasonal extrema of low-latitude, hemispheric storage generally occur too early in the models, and model-specific errors in amplitude of the low-latitude annual variations are substantial. These errors are potentially explicable in terms of neglected or suboptimally parameterized water stores in the land models and precipitation biases in the climate models.

  7. Holonomic surface codes for fault-tolerant quantum computation

    NASA Astrophysics Data System (ADS)

    Zhang, Jiang; Devitt, Simon J.; You, J. Q.; Nori, Franco

    2018-02-01

    Surface codes can protect quantum information stored in qubits from local errors as long as the per-operation error rate is below a certain threshold. Here we propose holonomic surface codes by harnessing the quantum holonomy of the system. In our scheme, the holonomic gates are built via auxiliary qubits rather than the auxiliary levels in multilevel systems used in conventional holonomic quantum computation. The key advantage of our approach is that the auxiliary qubits are in their ground state before and after each gate operation, so they are not involved in the operation cycles of surface codes. This provides an advantageous way to implement surface codes for fault-tolerant quantum computation.

  8. A fingerprint key binding algorithm based on vector quantization and error correction

    NASA Astrophysics Data System (ADS)

    Li, Liang; Wang, Qian; Lv, Ke; He, Ning

    2012-04-01

    In recent years, research on seamlessly combining cryptosystems with biometric technologies, e.g. fingerprint recognition, has been conducted by many researchers. In this paper, we propose an algorithm for binding a fingerprint template with a cryptographic key, so that the key is protected and accessed through fingerprint verification. In order to tolerate the intrinsic fuzziness of varying fingerprints, vector quantization and error correction techniques are introduced to transform the fingerprint template and then bind it with the key, after a process of fingerprint registration and extraction of the global ridge pattern of the fingerprint. The key itself is secure because only its hash value is stored, and it is released only when fingerprint verification succeeds. Experimental results demonstrate the effectiveness of our ideas.
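
    A minimal fuzzy-commitment-style sketch of the binding idea, under stated assumptions: nearest-codeword quantization over a toy random codebook stands in for the paper's vector quantization and error-correction stages, and all sizes are illustrative.

      # Bind a key to a quantized binary template; store only the XOR
      # ("helper data") and a hash of the key, never the key itself.
      import hashlib
      import numpy as np

      rng = np.random.default_rng(3)
      CODEBOOK = rng.integers(0, 2, (16, 32))      # toy VQ codebook

      def quantize(template):
          """Map a noisy binary template to its nearest codebook word."""
          dists = (CODEBOOK != template).sum(axis=1)   # Hamming distances
          return CODEBOOK[dists.argmin()]

      def noisy(word, nflip):
          """Flip nflip random bits, emulating fingerprint variation."""
          out = word.copy()
          out[rng.choice(word.size, nflip, replace=False)] ^= 1
          return out

      true_word = CODEBOOK[7]                      # the "enrolled finger"
      key = rng.integers(0, 2, 32)

      # Enrolment: only helper data and the key's hash are stored.
      helper = key ^ quantize(noisy(true_word, 2))
      key_hash = hashlib.sha256(key.tobytes()).hexdigest()

      # Verification: a fresh, noisier reading of the same finger maps to
      # the same codeword, so XOR with the helper reproduces the key.
      candidate = helper ^ quantize(noisy(true_word, 3))
      released = hashlib.sha256(candidate.tobytes()).hexdigest() == key_hash
      print("key released:", released)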

  9. The role of the collaborative functions of the composite structure of organic and inorganic constituents and their influence on the electrical properties of human bone.

    PubMed

    Kohata, Kazuhiro; Itoh, Soichiro; Horiuchi, Naohiro; Yoshioka, Taro; Yamashita, Kimihiro

    2016-08-12

    The electrical potential generated in bone by collagen displacement has been well documented. However, the role of mineral crystals in bone piezoelectricity has not yet been elucidated. We examined the hypothesis that the composite structure of organic and inorganic constituents, and their collaborative functions, plays an important role in the electrical properties of human bone. The electrical potential and bone structure were evaluated using thermally stimulated depolarization current (TSDC) and micro computed tomography, respectively. After electrical polarization of bone specimens, the stored electrical charge was calculated from TSDC measurements. The CO3/PO4 peak ratio was calculated using attenuated total reflection to compare the carbonate ion content of the bone specimens. The TSDC curve contained 3 peaks, at 100, 300 and 500°C, which were classified into 4 patterns. The CO3/PO4 peak ratio positively correlated with the charges stored at approximately 300°C in the polarized bone. There was a positive correlation between the stored bone charge and the bone mineral density only. It is suggested that the peak at 300°C is attributable to carbonate apatite and that the total bone mass of human bone, not its three-dimensional structure, affects the stored charge.

  10. STAMP-Based HRA Considering Causality Within a Sociotechnical System: A Case of Minuteman III Missile Accident.

    PubMed

    Rong, Hao; Tian, Jin

    2015-05-01

    The study contributes to human reliability analysis (HRA) by proposing a method that focuses on human error causality within a sociotechnical system, illustrating its rationality and feasibility with the case of the Minuteman (MM) III missile accident. Due to the complexity and dynamics within a sociotechnical system, previous analyses of accidents involving human and organizational factors clearly demonstrated that methods using a sequential accident model are inadequate for analyzing human error within a sociotechnical system. System-theoretic accident model and processes (STAMP) was used to develop a universal framework of human error causal analysis. To elaborate the causal relationships and demonstrate the dynamics of human error, system dynamics (SD) modeling was conducted based on the framework. A total of 41 contributing factors, categorized into four types of human error, were identified through the STAMP-based analysis. All factors relate to a broad view of sociotechnical systems and are more comprehensive than the causation presented in the officially issued accident investigation report. Recommendations regarding both technical and managerial improvements for lowering the risk of the accident are proposed. An interdisciplinary approach provides complementary support between system safety and human factors. The integrated method based on STAMP and the SD model contributes effectively to HRA. The proposed method will be beneficial to HRA, risk assessment, and control of the MM III operating process, as well as other sociotechnical systems. © 2014, Human Factors and Ergonomics Society.

  11. ERP evidence for on-line syntactic computations in 2-year-olds.

    PubMed

    Brusini, Perrine; Dehaene-Lambertz, Ghislaine; Dutat, Michel; Goffinet, François; Christophe, Anne

    2016-06-01

    Syntax allows human beings to build an infinite number of sentences from a finite number of words. How this unique, productive power of human language unfolds over the course of language development is still hotly debated. When they listen to sentences comprising newly-learned words, do children generalize from their knowledge of the legal combinations of word categories or do they instead rely on strings of words stored in memory to detect syntactic errors? Using novel words taught in the lab, we recorded Evoked Response Potentials (ERPs) in two-year-olds and adults listening to grammatical and ungrammatical sentences containing syntactic contexts that had not been used during training. In toddlers, the ungrammatical use of words, even when they have been just learned, induced an early left anterior negativity (surfacing 100-400ms after target word onset) followed by a late posterior positivity (surfacing 700-900ms after target word onset) that was not observed in grammatical sentences. This late effect was remarkably similar to the P600 displayed by adults, suggesting that toddlers and adults perform similar syntactic computations. Our results thus show that toddlers build on-line expectations regarding the syntactic category of upcoming words in a sentence. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.

  12. The dorsal stream contribution to phonological retrieval in object naming

    PubMed Central

    Faseyitan, Olufunsho; Kim, Junghoon; Coslett, H. Branch

    2012-01-01

    Meaningful speech, as exemplified in object naming, calls on knowledge of the mappings between word meanings and phonological forms. Phonological errors in naming (e.g. GHOST named as ‘goath’) are commonly seen in persisting post-stroke aphasia and are thought to signal impairment in retrieval of phonological form information. We performed a voxel-based lesion-symptom mapping analysis of 1718 phonological naming errors collected from 106 individuals with diverse profiles of aphasia. Voxels in which lesion status correlated with phonological error rates localized to dorsal stream areas, in keeping with classical and contemporary brain-language models. Within the dorsal stream, the critical voxels were concentrated in premotor cortex, pre- and postcentral gyri and supramarginal gyrus with minimal extension into auditory-related posterior temporal and temporo-parietal cortices. This challenges the popular notion that error-free phonological retrieval requires guidance from sensory traces stored in posterior auditory regions and points instead to sensory-motor processes located further anterior in the dorsal stream. In a separate analysis, we compared the lesion maps for phonological and semantic errors and determined that there was no spatial overlap, demonstrating that the brain segregates phonological and semantic retrieval operations in word production. PMID:23171662

  13. Nature of the refractive errors in rhesus monkeys (Macaca mulatta) with experimentally induced ametropias.

    PubMed

    Qiao-Grider, Ying; Hung, Li-Fang; Kee, Chea-Su; Ramamirtham, Ramkumar; Smith, Earl L

    2010-08-23

    We analyzed the contribution of individual ocular components to vision-induced ametropias in 210 rhesus monkeys. The primary contribution to refractive-error development came from vitreous chamber depth; a minor contribution from corneal power was also detected. However, there was no systematic relationship between refractive error and anterior chamber depth or between refractive error and any crystalline lens parameter. Our results are in good agreement with previous studies in humans, suggesting that the refractive errors commonly observed in humans are created by vision-dependent mechanisms that are similar to those operating in monkeys. This concordance emphasizes the applicability of rhesus monkeys in refractive-error studies. Copyright 2010 Elsevier Ltd. All rights reserved.

  14. Nature of the Refractive Errors in Rhesus Monkeys (Macaca mulatta) with Experimentally Induced Ametropias

    PubMed Central

    Qiao-Grider, Ying; Hung, Li-Fang; Kee, Chea-su; Ramamirtham, Ramkumar; Smith, Earl L.

    2010-01-01

    We analyzed the contribution of individual ocular components to vision-induced ametropias in 210 rhesus monkeys. The primary contribution to refractive-error development came from vitreous chamber depth; a minor contribution from corneal power was also detected. However, there was no systematic relationship between refractive error and anterior chamber depth or between refractive error and any crystalline lens parameter. Our results are in good agreement with previous studies in humans, suggesting that the refractive errors commonly observed in humans are created by vision-dependent mechanisms that are similar to those operating in monkeys. This concordance emphasizes the applicability of rhesus monkeys in refractive-error studies. PMID:20600237

  15. Borax and Octabor Treatment of Stored Swine Manure: Reduction in Hydrogen Sulfide Emissions and Phytotoxicity to Agronomic Crops

    USDA-ARS?s Scientific Manuscript database

    Gaseous emissions from stored manure have become environmental and health issues for humans and animals as the livestock industry becomes specialized and concentrated. Of particular concern is hydrogen sulfide, which is being targeted for regulatory control in concentrated animal farm operations. ...

  16. The Store Challenge

    ERIC Educational Resources Information Center

    Roman, Harry T.

    2014-01-01

    Biomedical and robotic technologies are merging to present a wonderful opportunity to develop artificial limbs and prosthetic devices for humans injured on the job, in the military, or due to disease. In this challenge, students will have the opportunity to design a store or online service that specifically dedicates itself to amputees. Described…

  17. A graph edit dictionary for correcting errors in roof topology graphs reconstructed from point clouds

    NASA Astrophysics Data System (ADS)

    Xiong, B.; Oude Elberink, S.; Vosselman, G.

    2014-07-01

    In the task of 3D building model reconstruction from point clouds we face the problem of recovering a roof topology graph in the presence of noise, small roof faces and low point densities. Errors in roof topology graphs will seriously affect the final modelling results. The aim of this research is to automatically correct these errors. We define the graph correction as a graph-to-graph problem, similar to the spelling correction problem (also called the string-to-string problem). The graph correction is more complex than string correction, as the graphs are 2D while strings are only 1D. We design a strategy based on a dictionary of graph edit operations to automatically identify and correct the errors in the input graph. For each type of error the graph edit dictionary stores a representative erroneous subgraph as well as the corrected version. As an erroneous roof topology graph may contain several errors, a heuristic search is applied to find the optimum sequence of graph edits to correct the errors one by one. The graph edit dictionary can be expanded to include entries needed to cope with errors that were previously not encountered. Experiments show that the dictionary with only fifteen entries already properly corrects one quarter of erroneous graphs in about 4500 buildings, and even half of the erroneous graphs in one test area, achieving as high as a 95% acceptance rate of the reconstructed models.
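
    A minimal sketch of the dictionary-driven correction idea, under stated assumptions: plain edge sets with literal node labels stand in for attributed roof topology graphs, and both dictionary entries below are invented examples rather than the paper's fifteen entries.

      # Each dictionary entry pairs a representative erroneous subgraph
      # with its corrected version; a greedy search applies matching
      # entries one by one until no entry changes the graph.
      def edge(a, b):
          return frozenset((a, b))

      # Hypothetical entries: (erroneous subgraph, corrected subgraph).
      DICTIONARY = [
          # spurious diagonal between two gable faces -> remove it
          ({edge("faceA", "faceB"), edge("faceA", "faceC")},
           {edge("faceA", "faceB")}),
          # missing ridge connection between two hip faces -> insert it
          ({edge("hip1", "ridge")},
           {edge("hip1", "ridge"), edge("hip2", "ridge")}),
      ]

      def correct(graph, max_edits=10):
          """Greedily apply dictionary edits until none changes the graph."""
          graph = set(graph)
          for _ in range(max_edits):
              for bad, good in DICTIONARY:
                  fixed = (graph - bad) | good
                  if bad <= graph and fixed != graph:
                      graph = fixed
                      break
              else:
                  return graph        # no entry changed anything: done
          return graph

      noisy_graph = {edge("faceA", "faceB"), edge("faceA", "faceC"),
                     edge("hip1", "ridge")}
      print(sorted(tuple(sorted(e)) for e in correct(noisy_graph)))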

  18. Aflatoxin Contamination Detected in Nutrient and Anti-Oxidant Rich Edible Stink Bug Stored in Recycled Grain Containers.

    PubMed

    Musundire, Robert; Osuga, Isaac M; Cheseto, Xavier; Irungu, Janet; Torto, Baldwyn

    2016-01-01

    Recently, there has been multi-agency promotion of entomophagy as an environmentally friendly source of food for the ever-increasing human population, especially in developing countries. However, food quality and safety concerns must first be addressed in this context. We addressed these concerns in the present study using the edible stink bug Encosternum delegorguei, which is widely consumed in southern Africa. We analysed for mycotoxins and for health-beneficial compounds including antioxidants, amino acids and essential fatty acids, using liquid chromatography coupled to quadrupole time-of-flight mass spectrometry (LC-Qtof-MS) and coupled gas chromatography (GC)-MS. We also performed proximate analysis to determine nutritional components. We identified the human carcinogenic mycotoxin aflatoxin B1 at low levels in edible stink bugs that were stored in traditionally woven, dung-smeared wooden baskets and in gunny bags previously used to store cereals. However, it was absent in insects stored in clean zip-lock bags. On the other hand, we identified 10 fatty acids, of which 7 are considered essential fatty acids for human nutrition and health; 4 flavonoids; and 12 amino acids, of which two are considered the most limiting amino acids in cereal-based diets. The edible stink bug also contained high crude protein and fat but was a poor source of minerals, except for phosphorus, which was found at relatively high levels. Our results show that the edible stink bug is a nutrient- and antioxidant-rich source of food and health benefits for human consumption. As such, use of better handling and storage methods can help eliminate contamination of the edible stink bug with the carcinogen aflatoxin and ensure its safety as human food.

  19. Aflatoxin Contamination Detected in Nutrient and Anti-Oxidant Rich Edible Stink Bug Stored in Recycled Grain Containers

    PubMed Central

    Musundire, Robert; Osuga, Isaac M.; Cheseto, Xavier; Irungu, Janet; Torto, Baldwyn

    2016-01-01

Recently, there has been multi-agency promotion of entomophagy as an environmentally-friendly source of food for the ever-increasing human population, especially in developing countries. However, food quality and safety concerns must first be addressed in this context. We addressed these concerns in the present study using the edible stink bug Encosternum delegorguei, which is widely consumed in southern Africa. We analysed for mycotoxins and health-beneficial compounds, including antioxidants, amino acids and essential fatty acids, using liquid chromatography coupled to quadrupole time-of-flight mass spectrometry (LC-Qtof-MS) and coupled gas chromatography (GC)-MS. We also performed proximate analysis to determine nutritional components. We identified the human carcinogenic mycotoxin aflatoxin B1 at low levels in edible stink bugs that were stored in traditionally woven, dung-smeared wooden baskets and in gunny bags previously used to store cereals. However, it was absent in insects stored in clean zip-lock bags. On the other hand, we identified 10 fatty acids, of which 7 are considered essential fatty acids for human nutrition and health; 4 flavonoids; and 12 amino acids, of which two are considered the most limiting amino acids in cereal-based diets. The edible stink bug also contained high crude protein and fat but was a poor source of minerals, except for phosphorus, which was found at relatively high levels. Our results show that the edible stink bug is a nutrient- and antioxidant-rich source of food and health benefits for human consumption. As such, use of better handling and storage methods can help eliminate contamination of the edible stink bug with the carcinogen aflatoxin and ensure its safety as human food. PMID:26731419

  20. Effects of freezing on the bactericidal activity of human milk.

    PubMed

    Takci, Sahin; Gulmez, Dolunay; Yigit, Sule; Dogan, Ozlem; Dik, Kezban; Hascelik, Gulsen

    2012-08-01

Freezing has been recommended for long-term storage of human milk. The present study analyzed the bactericidal activity of human milk against Escherichia coli and Pseudomonas aeruginosa and determined the changes in bactericidal activity following freezing at -20°C and -80°C for 1 month and 3 months. Forty-eight milk samples were collected from 48 lactating mothers. Each sample was divided into 10 aliquots. Two of the aliquots were processed immediately and the others were stored at both -20°C and -80°C until analysis after 1 month and 3 months of freezing. All of the fresh milk samples showed bactericidal activity against E coli and P aeruginosa. Freezing at -20°C for 1 month did not cause a statistically significant alteration in bactericidal activity (P > 0.017), whereas storage for 3 months lowered the degree of bactericidal activity against E coli significantly (P < 0.017). Bactericidal activity was preserved when the samples were stored at -80°C. There was no statistically significant difference in the bactericidal activity of human milk against E coli between freezing at -20°C and -80°C for 1 month (P > 0.017); however, when milk was stored for 3 months, -80°C was significantly more protective (P < 0.017). Freezing at -20°C and -80°C for 1 month and 3 months did not cause any significant change in bactericidal activity against P aeruginosa (P > 0.05). Storage by freezing at -80°C, if affordable and available, is more appropriate for preserving the bactericidal capacity of human milk stored for more than 1 month, especially in intensive care settings.

  1. Human Error as an Emergent Property of Action Selection and Task Place-Holding.

    PubMed

    Tamborello, Franklin P; Trafton, J Gregory

    2017-05-01

    A computational process model could explain how the dynamic interaction of human cognitive mechanisms produces each of multiple error types. With increasing capability and complexity of technological systems, the potential severity of consequences of human error is magnified. Interruption greatly increases people's error rates, as does the presence of other information to maintain in an active state. The model executed as a software-instantiated Monte Carlo simulation. It drew on theoretical constructs such as associative spreading activation for prospective memory, explicit rehearsal strategies as a deliberate cognitive operation to aid retrospective memory, and decay. The model replicated the 30% effect of interruptions on postcompletion error in Ratwani and Trafton's Stock Trader task, the 45% interaction effect on postcompletion error of working memory capacity and working memory load from Byrne and Bovair's Phaser Task, as well as the 5% perseveration and 3% omission effects of interruption from the UNRAVEL Task. Error classes including perseveration, omission, and postcompletion error fall naturally out of the theory. The model explains post-interruption error in terms of task state representation and priming for recall of subsequent steps. Its performance suggests that task environments providing more cues to current task state will mitigate error caused by interruption. For example, interfaces could provide labeled progress indicators or facilities for operators to quickly write notes about their task states when interrupted.
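
    A minimal Monte Carlo sketch can make the core mechanism concrete: a primed task-state representation decays over time, so an interruption makes the correct next step lose the retrieval competition more often. The decay equation, noise, and parameter values below are illustrative assumptions, not those of the published model.

    ```python
    import math
    import random

    DECAY = 0.5      # decay rate of priming over time (assumed)
    NOISE = 0.35     # s.d. of Gaussian activation noise (assumed)
    BASELINE = -2.0  # resting activation of a competing wrong step (assumed)

    def correct_step_wins(delay):
        """One retrieval competition after `delay` seconds without rehearsal."""
        correct = -DECAY * math.log(1 + delay) + random.gauss(0, NOISE)
        intruder = BASELINE + random.gauss(0, NOISE)
        return correct > intruder

    def error_rate(delay, trials=100_000):
        return 1 - sum(correct_step_wins(delay) for _ in range(trials)) / trials

    random.seed(1)
    print(f"resuming after 1 s:  {error_rate(1):.2%}")   # priming still strong
    print(f"resuming after 15 s: {error_rate(15):.2%}")  # decayed by interruption
    ```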

  2. Parametric Grid Information in the DOE Knowledge Base: Data Preparation, Storage, and Access

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HIPP,JAMES R.; MOORE,SUSAN G.; MYERS,STEPHEN C.

The parametric grid capability of the Knowledge Base provides an efficient, robust way to store and access interpolatable information which is needed to monitor the Comprehensive Nuclear Test Ban Treaty. To meet both the accuracy and performance requirements of operational monitoring systems, we use a new approach which combines the error estimation of kriging with the speed and robustness of Natural Neighbor Interpolation (NNI). The method involves three basic steps: data preparation (DP), data storage (DS), and data access (DA). The goal of data preparation is to process a set of raw data points to produce a sufficient basis for accurate NNI of value and error estimates in the Data Access step. This basis includes a set of nodes and their connectedness, collectively known as a tessellation, and the corresponding values and errors that map to each node, which we call surfaces. In many cases, the raw data point distribution is not sufficiently dense to guarantee accurate error estimates from the NNI, so the original data set must be densified using a newly developed interpolation technique known as Modified Bayesian Kriging. Once appropriate kriging parameters have been determined by variogram analysis, the optimum basis for NNI is determined in a process we call mesh refinement, which involves iterative kriging, new node insertion, and Delaunay triangle smoothing. The process terminates when an NNI basis has been calculated which will fit the kriged values within a specified tolerance. In the data storage step, the tessellations and surfaces are stored in the Knowledge Base, currently in a binary flat-file format but perhaps in the future in a spatially-indexed database. Finally, in the data access step, a client application makes a request for an interpolated value, which triggers a data fetch from the Knowledge Base through the libKBI interface, a walking-triangle search for the containing triangle, and finally the NNI interpolation.
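
    The mesh-refinement loop can be illustrated in a few lines. Neither Modified Bayesian Kriging nor NNI ships with SciPy, so in this sketch linear interpolation on a Delaunay triangulation stands in for both; the point is the loop structure of evaluating the current basis and inserting a node where the fit is worst.

    ```python
    import numpy as np
    from scipy.interpolate import LinearNDInterpolator
    from scipy.spatial import Delaunay

    def reference(p):                     # stand-in for the kriged surface
        return np.sin(p[:, 0]) * np.cos(p[:, 1])

    rng = np.random.default_rng(0)
    nodes = rng.uniform(0, np.pi, size=(30, 2))     # sparse raw data points
    test = rng.uniform(0, np.pi, size=(2000, 2))    # evaluation points
    tol = 0.02

    while True:
        tess = Delaunay(nodes)                      # the tessellation
        interp = LinearNDInterpolator(tess, reference(nodes))
        err = np.abs(interp(test) - reference(test))  # NaN outside the hull
        if np.nanmax(err) < tol:
            break
        nodes = np.vstack([nodes, test[np.nanargmax(err)]])  # insert node

    print(f"refined basis: {len(nodes)} nodes, max error {np.nanmax(err):.4f}")
    ```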

  3. Simultaneous Control of Error Rates in fMRI Data Analysis

    PubMed Central

    Kang, Hakmook; Blume, Jeffrey; Ombao, Hernando; Badre, David

    2015-01-01

The key idea of statistical hypothesis testing is to fix, and thereby control, the Type I error (false positive) rate across samples of any size. Multiple comparisons inflate the global (family-wise) Type I error rate and the traditional solution to maintaining control of the error rate is to increase the local (comparison-wise) Type II error (false negative) rates. However, in the analysis of human brain imaging data, the number of comparisons is so large that this solution breaks down: the local Type II error rate ends up being so large that scientifically meaningful analysis is precluded. Here we propose a novel solution to this problem: allow the Type I error rate to converge to zero along with the Type II error rate. It works because when the Type I error rate per comparison is very small, the accumulated (or global) Type I error rate is also small. This solution is achieved by employing the Likelihood paradigm, which uses likelihood ratios to measure the strength of evidence on a voxel-by-voxel basis. In this paper, we provide theoretical and empirical justification for a likelihood approach to the analysis of human brain imaging data. In addition, we present extensive simulations that show the likelihood approach is viable, leading to ‘cleaner’-looking brain maps and operational superiority (lower average error rates). Finally, we include a case study on cognitive-control-related activation in the prefrontal cortex of the human brain. PMID:26272730
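
    The voxel-wise likelihood idea can be illustrated with a toy normal model: for each voxel, the evidence for an activation of size delta over no activation is measured by a likelihood ratio, and both error rates shrink as the sample grows. The model, effect size, and evidence threshold k are assumptions of the sketch, not the paper's pipeline.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n, n_voxels, delta, sigma = 50, 10_000, 0.5, 1.0

    truth = rng.random(n_voxels) < 0.02     # 2% of voxels are truly active
    data = rng.normal(np.where(truth, delta, 0.0), sigma, size=(n, n_voxels))
    xbar = data.mean(axis=0)

    # For a normal mean with known sigma:
    # log LR = log L(mu=delta) - log L(mu=0) = n*(xbar*delta - delta**2/2)/sigma**2
    log_lr = n * (xbar * delta - delta**2 / 2) / sigma**2

    k = 8                                   # evidence threshold (assumed)
    flagged = log_lr > np.log(k)
    fp = np.mean(flagged & ~truth)          # voxels wrongly flagged
    fn = np.mean(~flagged & truth)          # active voxels missed
    print(f"false-positive fraction: {fp:.4f}, false-negative fraction: {fn:.4f}")
    ```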

  4. Human error and the search for blame

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1989-01-01

    Human error is a frequent topic in discussions about risks in using computer systems. A rational analysis of human error leads through the consideration of mistakes to standards that designers use to avoid mistakes that lead to known breakdowns. The irrational side, however, is more interesting. It conditions people to think that breakdowns are inherently wrong and that there is ultimately someone who is responsible. This leads to a search for someone to blame which diverts attention from: learning from the mistakes; seeing the limitations of current engineering methodology; and improving the discourse of design.

  5. A method for neighborhood-level surveillance of food purchasing.

    PubMed

    Buckeridge, David L; Charland, Katia; Labban, Alice; Ma, Yu

    2014-12-01

    Added sugar, particularly in carbonated soft drinks (CSDs), represents a considerable proportion of caloric intake in North America. Interventions to decrease the intake of added sugar have been proposed, but monitoring their effectiveness can be difficult due to the costs and limitations of dietary surveys. We developed, assessed the accuracy of, and took an initial step toward validating an indicator of neighborhood-level purchases of CSDs using automatically captured store scanner data in Montreal, Canada, between 2008 and 2010 and census data describing neighborhood socioeconomic characteristics. Our indicator predicted total monthly neighborhood sales based on historical sales and promotions and characteristics of the stores and neighborhoods. The prediction error for monthly sales in sampled stores was low (2.2%), and we demonstrated a negative association between predicted total sales and median personal income. For each $10,000 decrease in median personal income, we observed a fivefold increase in predicted monthly sales of CSDs. This indicator can be used by public health agencies to implement automated systems for neighborhood-level monitoring of an important upstream determinant of health. Future refinement of this indicator is possible to account for factors such as store catchment areas and to incorporate nutritional information about products. © 2014 New York Academy of Sciences.

  6. Egocentric and nonegocentric coding in memory for spatial layout: Evidence from scene recognition

    PubMed Central

    2005-01-01

    Much contemporary research has suggested that memories for spatial layout are stored with a preferred orientation. The present paper examines whether spatial memories are also stored with a preferred viewpoint position. Participants viewed images of an arrangement of objects taken from a single viewpoint, and were subsequently tested on their ability to recognize the arrangement from novel viewpoints that had been translated in either the lateral or depth dimension. Lateral and forward displacements of the viewpoint resulted in increasing response latencies and errors. Backward displacement showed no such effect, nor did lateral translation that resulted in a centered “canonical” view of the arrangement. These results further constrain the specificity of spatial memory, while also providing some evidence that nonegocentric spatial information is coded in memory. PMID:16933759

  7. 21 CFR 21.31 - Records stored by the National Archives and Records Administration.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

Title 21, Food and Drugs, Part 21 (2010-04-01 edition), Section 21.31: Records stored by the National Archives and Records Administration. Food and Drug Administration, Department of Health and Human Services; General; Protection of Privacy; Requirements for Specific Categories of Records...

  8. Performance Feedback Utility in a Small Organization: Effects on Organizational Outcomes and Managerial Decision Processes.

    ERIC Educational Resources Information Center

    Florin-Thuma, Beth C.; Boudreau, John W.

    1987-01-01

    Investigated the frequent but previously untested assertion that utility analysis can improve communication and decision making about human resource management programs by examining a performance feedback intervention in a small fast-food store. Results suggest substantial payoffs from performance feedback, though the store's owner-managers had…

  9. Why do adult dogs (Canis familiaris) commit the A-not-B search error?

    PubMed

    Sümegi, Zsófia; Kis, Anna; Miklósi, Ádám; Topál, József

    2014-02-01

It has been recently reported that adult domestic dogs, like human infants, tend to commit perseverative search errors; that is, they select the previously rewarded empty location in the Piagetian A-not-B search task because of the experimenter's ostensive communicative cues. There is, however, an ongoing debate over whether these findings reveal that dogs can use human ostensive referential communication as a source of information or whether the phenomenon can be accounted for by "more simple" explanations like insufficient attention and learning based on local enhancement. In 2 experiments the authors systematically manipulated the type of human cueing (communicative or noncommunicative) adjacent to the A hiding place during both the A and B trials. Results highlight 3 important aspects of the dogs' A-not-B error: (a) search errors are influenced to a certain extent by dogs' motivation to retrieve the toy object; (b) human communicative and noncommunicative signals have different error-inducing effects; and (c) communicative signals presented at the A hiding place during the B trials but not during the A trials play a crucial role in inducing the A-not-B error, and it can be induced even without demonstrating repeated hiding events at location A. These findings further confirm the notion that the perseverative search error, at least partially, reflects a "ready-to-obey" attitude in the dog rather than insufficient attention and/or working memory.

  10. Episodic-like memory during cache recovery by scrub jays.

    PubMed

    Clayton, N S; Dickinson, A

    1998-09-17

    The recollection of past experiences allows us to recall what a particular event was, and where and when it occurred, a form of memory that is thought to be unique to humans. It is known, however, that food-storing birds remember the spatial location and contents of their caches. Furthermore, food-storing animals adapt their caching and recovery strategies to the perishability of food stores, which suggests that they are sensitive to temporal factors. Here we show that scrub jays (Aphelocoma coerulescens) remember 'when' food items are stored by allowing them to recover perishable 'wax worms' (wax-moth larvae) and non-perishable peanuts which they had previously cached in visuospatially distinct sites. Jays searched preferentially for fresh wax worms, their favoured food, when allowed to recover them shortly after caching. However, they rapidly learned to avoid searching for worms after a longer interval during which the worms had decayed. The recovery preference of jays demonstrates memory of where and when particular food items were cached, thereby fulfilling the behavioural criteria for episodic-like memory in non-human animals.

  11. Three-dimensional integral imaging displays using a quick-response encoded elemental image array: an overview

    NASA Astrophysics Data System (ADS)

    Markman, A.; Javidi, B.

    2016-06-01

Quick-response (QR) codes are barcodes that can store information such as numeric data and hyperlinks. The QR code can be scanned using a QR code reader, such as those built into smartphone devices, revealing the information stored in the code. Moreover, the QR code is robust to noise, rotation, and illumination when scanning, thanks to the error correction built into the QR code design. Integral imaging is an imaging technique used to generate a three-dimensional (3D) scene by combining the information from two-dimensional (2D) elemental images (EIs), each with a different perspective of the scene. Transferring these 2D images in a secure manner can be difficult. In this work, we overview two methods to store and encrypt EIs in multiple QR codes. The first method uses run-length encoding with Huffman coding and the double-random-phase encryption (DRPE) to compress and encrypt an EI. This information is then stored in a QR code. An alternative compression scheme is to perform photon-counting on the EI prior to compression. Photon-counting is a non-linear transformation of data that creates redundant information, thus improving image compression. The compressed data is encrypted using the DRPE. Once information is stored in the QR codes, it is scanned using a smartphone device. The information scanned is decompressed and decrypted, and an EI is recovered. Once all EIs have been recovered, a 3D optical reconstruction is generated.
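
    The DRPE step at the heart of both methods is compact enough to sketch in numpy: one random phase mask in the input plane, a second in the Fourier plane, and decryption with the conjugate keys. The image and key sizes are illustrative; quantizing, compressing, and splitting the complex ciphertext across QR codes would follow as the paper outlines.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    ei = rng.random((64, 64))                   # stand-in elemental image

    phase1 = np.exp(2j * np.pi * rng.random(ei.shape))  # input-plane key
    phase2 = np.exp(2j * np.pi * rng.random(ei.shape))  # Fourier-plane key

    def drpe_encrypt(img):
        return np.fft.ifft2(np.fft.fft2(img * phase1) * phase2)

    def drpe_decrypt(cipher):
        field = np.fft.ifft2(np.fft.fft2(cipher) * np.conj(phase2))
        return field * np.conj(phase1)

    cipher = drpe_encrypt(ei)                   # complex, white-noise-like field
    recovered = np.abs(drpe_decrypt(cipher))
    print(np.allclose(recovered, ei))           # -> True
    ```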

  12. Common medial frontal mechanisms of adaptive control in humans and rodents

    PubMed Central

    Frank, Michael J.; Laubach, Mark

    2013-01-01

    In this report, we describe how common brain networks within the medial frontal cortex facilitate adaptive behavioral control in rodents and humans. We demonstrate that low frequency oscillations below 12 Hz are dramatically modulated after errors in humans over mid-frontal cortex and in rats within prelimbic and anterior cingulate regions of medial frontal cortex. These oscillations were phase-locked between medial frontal cortex and motor areas in both rats and humans. In rats, single neurons that encoded prior behavioral outcomes were phase-coherent with low-frequency field oscillations particularly after errors. Inactivating medial frontal regions in rats led to impaired behavioral adjustments after errors, eliminated the differential expression of low frequency oscillations after errors, and increased low-frequency spike-field coupling within motor cortex. Our results describe a novel mechanism for behavioral adaptation via low-frequency oscillations and elucidate how medial frontal networks synchronize brain activity to guide performance. PMID:24141310

  13. #2 - An Empirical Assessment of Exposure Measurement Error ...

    EPA Pesticide Factsheets

Background: • Differing degrees of exposure error across pollutants. • Previous focus on quantifying and accounting for exposure error in single-pollutant models. • Examine exposure errors for multiple pollutants and provide insights on the potential for bias and attenuation of effect estimates in single- and bi-pollutant epidemiological models. The National Exposure Research Laboratory (NERL) Human Exposure and Atmospheric Sciences Division (HEASD) conducts research in support of the EPA mission to protect human health and the environment. The HEASD research program supports Goal 1 (Clean Air) and Goal 4 (Healthy People) of the EPA strategic plan. More specifically, our division conducts research to characterize the movement of pollutants from the source to contact with humans. Our multidisciplinary research program produces Methods, Measurements, and Models to identify relationships between and characterize processes that link source emissions, environmental concentrations, human exposures, and target-tissue dose. The impact of these tools is improved regulatory programs and policies for EPA.

  14. Tyrosine kinases activate store-mediated Ca2+ entry in human platelets through the reorganization of the actin cytoskeleton.

    PubMed Central

    Rosado, J A; Graves, D; Sage, S O

    2000-01-01

    We have recently reported that store-mediated Ca(2+) entry in platelets is likely to be mediated by a reversible trafficking and coupling of the endoplasmic reticulum with the plasma membrane, a model termed 'secretion-like coupling'. In this model the actin cytoskeleton plays a key regulatory role. Since tyrosine kinases have been shown to be important for Ca(2+) entry in platelets and other cells, we have now investigated the possible involvement of tyrosine kinases in the secretion-like-coupling model. Treatment of platelets with thrombin or thapsigargin induced actin polymerization by a calcium-independent pathway. Methyl 2,5-dihydroxycinnamate, a tyrosine kinase inhibitor, prevented thrombin- or thapsigargin-induced actin polymerization. The effects of tyrosine kinases in store-mediated Ca(2+) entry were found to be entirely dependent on the actin cytoskeleton. PP1, an inhibitor of the Src family of proteins, partially inhibited store-mediated Ca(2+) entry. In addition, depletion of intracellular Ca(2+) stores stimulated cytoskeletal association of the cytoplasmic tyrosine kinase pp60(src), a process that was sensitive to treatment with cytochalasin D and PP1, but not to inhibition of Ras proteins using prenylcysteine analogues. Finally, combined inhibition of both Ras proteins and tyrosine kinases resulted in complete inhibition of Ca(2+) entry, suggesting that these two families of proteins have independent effects in the activation of store-mediated Ca(2+) entry in human platelets. PMID:11023829

  15. A Goal VPN Protection Profile for Protecting Sensitive Information

    DTIC Science & Technology

    2000-07-10

security for the systems in which they are used. Nothing could be further from the truth. There are no perfect security solutions, and no...establishment/termination, failures, and errors); • provide for directly connected (local hard-wire connection) and remote (over the network) interfaces... the TOERU is left unattended procedures such as media encryption or secure storage of the hard drive, will be used to insure the protection of stored

  16. Children Show Heightened Knew-It-All-Along Errors When Learning New Facts about Kinds: Evidence for the Power of Kind Representations in Children's Thinking

    ERIC Educational Resources Information Center

    Sutherland, Shelbie L.; Cimpian, Andrei

    2015-01-01

    Several proposals in the literature on conceptual development converge on the claim that information about "kinds of things" in the world has a privileged status in children's cognition, insofar as it is acquired, manipulated, and stored with surprising ease. Our goal in the present studies (N = 440) was to test a prediction of this…

  17. FAIL-SAFE: Fault Aware IntelLigent Software for Exascale

    DTIC Science & Technology

    2016-06-13

and that these programs can continue to correct solutions. To broaden the impact of this research, we also needed to be able to ameliorate errors...designing an interface between the application and an introspection framework for resilience (IFR) based on the inference engine SHINE; (4) using...the ROSE compiler to translate annotations into reasoning rules for the IFR; and (5) designing a Knowledge/Experience Database, which will store

  18. A Comparison of the Performance of Three Inventory Control Strategies in the Commissary Store Environment.

    DTIC Science & Technology

    1987-12-01

[OCR residue of a simulation source listing and output from the report. The recoverable portion is a performance summary and an analysis-of-variance table for the dependent variable inventory-to-sales ratio: Model: 17 df, sum of squares 5.54073711, mean square 0.32592574, F value 83685.81; Error: 72 df, mean square 0.00029041.]

  19. INDEPENDENT EVALUATION OF THE GAM EX5ALN MINIATURE LINE-NARROWED KRF EXCIMER LASER

    DTIC Science & Technology

    2017-06-01

software included the disabled tabs and buttons that clutter the panels. Information on these panels was not updated correctly (e.g., shots per fill and...total shots are not stored correctly and appear to contain random data, the lock function on the fill page does not update correctly, the time to...fill level after 7 M shots). 7: Shelf-life

  20. Are There Multiple Visual Short-Term Memory Stores?

    PubMed Central

    Sligte, Ilja G.; Scholte, H. Steven; Lamme, Victor A. F.

    2008-01-01

Background: Classic work on visual short-term memory (VSTM) suggests that people store a limited number of items for subsequent report. However, when human observers are cued to shift attention to one item in VSTM during retention, it seems as if there is a much larger representation, which keeps additional items in a more fragile VSTM store. Thus far, it is not clear whether the capacity of this fragile VSTM store indeed exceeds the traditional capacity limits of VSTM. The current experiments address this issue and explore the capacity, stability, and duration of fragile VSTM representations. Methodology/Principal Findings: We presented cues in a change-detection task either just after off-set of the memory array (iconic-cue), 1,000 ms after off-set of the memory array (retro-cue), or after on-set of the probe array (post-cue). We observed three stages in visual information processing: 1) iconic memory with unlimited capacity; 2) a fragile VSTM store lasting four seconds, with a capacity at least a factor of two higher than that of 3) the robust and capacity-limited form of VSTM. Iconic memory seemed to depend on the strength of the positive after-image resulting from the memory display and was virtually absent under conditions of isoluminance or when intervening light masks were presented. This suggests that iconic memory is driven by prolonged retinal activation beyond stimulus duration. Fragile VSTM representations were not affected by light masks, but were completely overwritten by irrelevant pattern masks that spatially overlapped the memory array. Conclusions/Significance: We find that immediately after a stimulus has disappeared from view, subjects can still access information from iconic memory because they can see an after-image of the display. After that period, human observers can still access a substantial, but somewhat more limited, amount of information from a high-capacity but fragile VSTM that is overwritten when new items are presented to the eyes. What is left after that is the traditional VSTM store, with a limit of about four objects. We conclude that human observers store more sustained representations than is evident from standard change-detection tasks and that these representations can be accessed at will. PMID:18301775

  1. Are there multiple visual short-term memory stores?

    PubMed

    Sligte, Ilja G; Scholte, H Steven; Lamme, Victor A F

    2008-02-27

Classic work on visual short-term memory (VSTM) suggests that people store a limited number of items for subsequent report. However, when human observers are cued to shift attention to one item in VSTM during retention, it seems as if there is a much larger representation, which keeps additional items in a more fragile VSTM store. Thus far, it is not clear whether the capacity of this fragile VSTM store indeed exceeds the traditional capacity limits of VSTM. The current experiments address this issue and explore the capacity, stability, and duration of fragile VSTM representations. We presented cues in a change-detection task either just after off-set of the memory array (iconic-cue), 1,000 ms after off-set of the memory array (retro-cue), or after on-set of the probe array (post-cue). We observed three stages in visual information processing: 1) iconic memory with unlimited capacity; 2) a fragile VSTM store lasting four seconds, with a capacity at least a factor of two higher than that of 3) the robust and capacity-limited form of VSTM. Iconic memory seemed to depend on the strength of the positive after-image resulting from the memory display and was virtually absent under conditions of isoluminance or when intervening light masks were presented. This suggests that iconic memory is driven by prolonged retinal activation beyond stimulus duration. Fragile VSTM representations were not affected by light masks, but were completely overwritten by irrelevant pattern masks that spatially overlapped the memory array. We find that immediately after a stimulus has disappeared from view, subjects can still access information from iconic memory because they can see an after-image of the display. After that period, human observers can still access a substantial, but somewhat more limited, amount of information from a high-capacity but fragile VSTM that is overwritten when new items are presented to the eyes. What is left after that is the traditional VSTM store, with a limit of about four objects. We conclude that human observers store more sustained representations than is evident from standard change-detection tasks and that these representations can be accessed at will.

  2. Informatics in radiology: automated structured reporting of imaging findings using the AIM standard and XML.

    PubMed

    Zimmerman, Stefan L; Kim, Woojin; Boonn, William W

    2011-01-01

    Quantitative and descriptive imaging data are a vital component of the radiology report and are frequently of paramount importance to the ordering physician. Unfortunately, current methods of recording these data in the report are both inefficient and error prone. In addition, the free-text, unstructured format of a radiology report makes aggregate analysis of data from multiple reports difficult or even impossible without manual intervention. A structured reporting work flow has been developed that allows quantitative data created at an advanced imaging workstation to be seamlessly integrated into the radiology report with minimal radiologist intervention. As an intermediary step between the workstation and the reporting software, quantitative and descriptive data are converted into an extensible markup language (XML) file in a standardized format specified by the Annotation and Image Markup (AIM) project of the National Institutes of Health Cancer Biomedical Informatics Grid. The AIM standard was created to allow image annotation data to be stored in a uniform machine-readable format. These XML files containing imaging data can also be stored on a local database for data mining and analysis. This structured work flow solution has the potential to improve radiologist efficiency, reduce errors, and facilitate storage of quantitative and descriptive imaging data for research. Copyright © RSNA, 2011.
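
    The shape of such a work flow is easy to sketch: a measurement produced at the workstation is serialized to an XML annotation file that reporting software can later parse. The element and attribute names below are simplified placeholders, not the actual AIM schema, and the study UID is hypothetical.

    ```python
    import xml.etree.ElementTree as ET

    def annotation_to_xml(study_uid, finding, value, unit):
        """Serialize one quantitative imaging finding to a simplified,
        AIM-like XML annotation (placeholder element names)."""
        root = ET.Element("ImageAnnotation", studyInstanceUid=study_uid)
        calc = ET.SubElement(root, "Calculation", description=finding)
        ET.SubElement(calc, "Value", unit=unit).text = str(value)
        return ET.ElementTree(root)

    tree = annotation_to_xml(
        study_uid="1.2.840.99999.1",            # hypothetical UID
        finding="Ascending aorta diameter",
        value=4.1,
        unit="cm",
    )
    tree.write("annotation.xml", xml_declaration=True, encoding="utf-8")
    print(ET.tostring(tree.getroot(), encoding="unicode"))
    ```

    Files of this kind could then be parsed back into the report text and loaded into a local database for the aggregate analysis the article describes.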

  3. Web application for detailed real-time database transaction monitoring for CMS condition data

    NASA Astrophysics Data System (ADS)

    de Gruttola, Michele; Di Guida, Salvatore; Innocente, Vincenzo; Pierro, Antonio

    2012-12-01

In the upcoming LHC era, databases have become an essential part of the experiments collecting data from the LHC, in order to safely store, and consistently retrieve, the wide amount of data produced by different sources. In the CMS experiment at CERN, all this information is stored in ORACLE databases allocated on several servers, both inside and outside the CERN network. In this scenario, the task of monitoring different databases is a crucial database administration issue, since different information may be required depending on different users' tasks, such as data transfer, inspection, planning and security. We present here a web application based on a Python web framework and Python modules for data-mining purposes. To customize the GUI, we record traces of user interactions that are used to build use-case models. In addition, the application detects errors in database transactions (for example, it identifies any mistake made by a user, an application failure, an unexpected network shutdown or a Structured Query Language (SQL) statement error) and provides warning messages from the different users' perspectives. Finally, in order to fulfill the requirements of the CMS experiment community, and to meet new developments in many Web client tools, our application was further developed and new features were deployed.

  4. Integrated modelling of H-mode pedestal and confinement in JET-ILW

    NASA Astrophysics Data System (ADS)

    Saarelma, S.; Challis, C. D.; Garzotti, L.; Frassinetti, L.; Maggi, C. F.; Romanelli, M.; Stokes, C.; Contributors, JET

    2018-01-01

A pedestal prediction model, Europed, is built on the existing EPED1 model by coupling it with a core transport simulation using a Bohm-gyroBohm transport model to self-consistently predict a JET-ILW power scan for hybrid plasmas that display weaker power degradation than the IPB98(y, 2) scaling of the energy confinement time. The weak power degradation is reproduced in the coupled core-pedestal simulation. The coupled core-pedestal model is further tested for a 3.0 MA plasma with the highest stored energy achieved in JET-ILW so far, giving a prediction of the stored plasma energy within the error margins of the measured experimental value. A pedestal density prediction model based on neutral penetration is tested on a JET-ILW database, giving a prediction with an average error of 17% from the experimental data when a parameter taking the fuelling rate into account is added to the model. However, the model fails to reproduce the power dependence of the pedestal density, implying missing transport physics in the model. The future JET-ILW deuterium campaign with increased heating power is predicted to reach a plasma energy of 11 MJ, which would correspond to 11-13 MW of fusion power in an equivalent deuterium-tritium plasma, but with isotope effects on pedestal stability and core transport ignored.

  5. Role of leptin in energy homeostasis in humans

    PubMed Central

    Rosenbaum, Michael; Leibel, Rudolph L

    2015-01-01

The hyperphagia, low sympathetic nervous system tone, and decreased circulating concentrations of bioactive thyroid hormones that are common to states of congenital leptin deficiency and hypoleptinemia following and during weight loss suggest that the major physiological function of leptin is to signal states of negative energy balance and decreased energy stores. In weight-reduced humans, these phenotypes together with pronounced hypometabolism and increased parasympathetic nervous system tone create the optimal circumstance for weight regain. Based on the weight loss induced by leptin administration in states of leptin deficiency (obese) and the observed similarity of phenotypes in congenital and dietary-induced states of hypoleptinemia (reduced obese), it has been suggested that exogenous leptin could potentially be useful in initiating, promoting, and sustaining weight reduction. However, the responses of human beings to exogenous leptin administration are dependent not only on extant energy stores but also on energy balance. Leptin administration to humans at usual weight has little, if any, effect on body weight, while leptin administration during weight loss mitigates hunger, especially if given in supraphysiological doses during severe caloric restriction. Leptin repletion is most effective following weight loss by dietary restriction. In this state of weight stability but reduced energy stores, leptin at least partially reverses many of the metabolic, autonomic, neuroendocrine, and behavioral adaptations that favor weight regain. The major physiological function of leptin is to signal states of negative energy balance and decreased energy stores. Leptin, and pharmacotherapies affecting leptin signaling pathways, are likely to be most useful in sustaining weight loss. PMID:25063755

  6. Cutting the Cord: Discrimination and Command Responsibility in Autonomous Lethal Weapons

    DTIC Science & Technology

    2014-02-13

    machine responses to identical stimuli, and it was the job of a third party human “witness” to determine which participant was man and which was...machines may be error free, but there are potential benefits to be gained through autonomy if machines can meet or exceed human performance in...lieu of human operators and reap the benefits that autonomy provides. Human and Machine Error It would be foolish to assert that either humans

  7. Human error identification for laparoscopic surgery: Development of a motion economy perspective.

    PubMed

    Al-Hakim, Latif; Sevdalis, Nick; Maiping, Tanaphon; Watanachote, Damrongpan; Sengupta, Shomik; Dissaranan, Charuspong

    2015-09-01

This study postulates that traditional human error identification techniques fail to consider motion economy principles and, accordingly, their applicability in operating theatres may be limited. This study addresses this gap in the literature with a dual aim. First, it identifies the principles of motion economy that suit the operative environment and second, it develops a new error mode taxonomy for human error identification techniques which recognises motion economy deficiencies affecting the performance of surgeons and predisposing them to errors. A total of 30 principles of motion economy were developed and categorised into five areas. A hierarchical task analysis was used to break down the main tasks of a urological laparoscopic surgery (hand-assisted laparoscopic nephrectomy) into their elements, and the new taxonomy was used to identify errors and their root causes resulting from violations of motion economy principles. The approach was prospectively tested in 12 observed laparoscopic surgeries performed by 5 experienced surgeons. A total of 86 errors were identified and linked to motion economy deficiencies. The results indicate that the developed methodology is promising. Our methodology allows error prevention in surgery, and the developed set of motion economy principles could be useful for training surgeons on motion economy principles. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  8. Heuristic-driven graph wavelet modeling of complex terrain

    NASA Astrophysics Data System (ADS)

    Cioacǎ, Teodor; Dumitrescu, Bogdan; Stupariu, Mihai-Sorin; Pǎtru-Stupariu, Ileana; Nǎpǎrus, Magdalena; Stoicescu, Ioana; Peringer, Alexander; Buttler, Alexandre; Golay, François

    2015-03-01

We present a novel method for building a multi-resolution representation of large digital surface models. The surface points coincide with the nodes of a planar graph, which can be processed using a critically sampled, invertible lifting scheme. To drive the lazy wavelet node partitioning, we employ an attribute-aware cost function based on the generalized quadric error metric. The resulting algorithm can be applied to multivariate data by storing additional attributes at the graph's nodes. We discuss how the cost computation mechanism can be coupled with the lifting scheme and examine the results by evaluating the root mean square error. The algorithm is experimentally tested using two multivariate LiDAR sets representing terrain surface and vegetation structure with different sampling densities.
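
    The quadric error metric at the core of the cost function can be shown in its classic geometric form (the paper generalizes it to carry extra attributes at the nodes): each plane contributes a rank-one quadric, and the cost of placing a node at v is the accumulated v'Qv.

    ```python
    import numpy as np

    def plane_quadric(n, d):
        """4x4 quadric of the plane n.x + d = 0 (n a unit normal)."""
        p = np.append(n, d)
        return np.outer(p, p)

    def cost(Q, v):
        """Sum of squared distances from v to the accumulated planes."""
        vh = np.append(v, 1.0)          # homogeneous coordinates
        return vh @ Q @ vh

    # Two terrain-like planes meeting at a ridge: z = 0 and z = x.
    Q = plane_quadric(np.array([0.0, 0.0, 1.0]), 0.0)
    Q += plane_quadric(np.array([-1.0, 0.0, 1.0]) / np.sqrt(2), 0.0)

    print(cost(Q, np.array([0.0, 0.0, 0.0])))   # on both planes -> 0.0
    print(cost(Q, np.array([0.0, 0.0, 0.5])))   # off both planes -> 0.375
    ```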

  9. Quantum stopwatch: how to store time in a quantum memory.

    PubMed

    Yang, Yuxiang; Chiribella, Giulio; Hayashi, Masahito

    2018-05-01

    Quantum mechanics imposes a fundamental trade-off between the accuracy of time measurements and the size of the systems used as clocks. When the measurements of different time intervals are combined, the errors due to the finite clock size accumulate, resulting in an overall inaccuracy that grows with the complexity of the set-up. Here, we introduce a method that, in principle, eludes the accumulation of errors by coherently transferring information from a quantum clock to a quantum memory of the smallest possible size. Our method could be used to measure the total duration of a sequence of events with enhanced accuracy, and to reduce the amount of quantum communication needed to stabilize clocks in a quantum network.

  10. Systems Issues Pertaining to Holographic Optical Data Storage in Thick Bacteriorhodopsin Films

    NASA Technical Reports Server (NTRS)

    Downie, John D.; Timucin, Dogan A.; Gary, Charles K.; Oezcan, Meric; Smithey, Daniel T.; Crew, Marshall; Lau, Sonie (Technical Monitor)

    1998-01-01

The optical data storage capacity and raw bit-error-rate achievable with thick photochromic bacteriorhodopsin (BR) films are investigated for sequential recording and read-out of angularly- and shift-multiplexed digital holograms inside a thick blue-membrane D85N BR film. We address the determination of an exposure schedule that produces equal diffraction efficiencies among each of the multiplexed holograms. This exposure schedule is determined by numerical simulations of the holographic recording process within the BR material, and maximizes the total grating strength. We also experimentally measure the shift selectivity and compare the results to theoretical predictions. Finally, we evaluate the bit-error-rate of a single hologram, and of multiple holograms stored within the film.

  11. Choosing the best partition of the output from a large-scale simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Challacombe, Chelsea Jordan; Casleton, Emily Michele

Data partitioning becomes necessary when a large-scale simulation produces more data than can be feasibly stored. The goal is to partition the data, typically so that every element belongs to one and only one partition, and to store summary information about each partition, either a representative value plus an estimate of the error or a distribution. Once the partitions are determined and the summary information stored, the raw data is discarded. This process can be performed in situ, meaning while the simulation is running. When creating the partitions there are many decisions that researchers must make. For instance, how to determine when an adequate number of partitions has been created, how the partitions are created with respect to dividing the data, or how many variables should be considered simultaneously. In addition, decisions must be made on how to summarize the information within each partition. Because of the combinatorial number of possible ways to partition and summarize the data, a method of comparing the different possibilities will help guide researchers into choosing a good partitioning and summarization scheme for their application.
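
    A minimal version of the partition-and-summarize step looks like this: every element lands in exactly one partition, and only a representative value plus an error estimate survives. Equal-width binning over a single variable is an assumed scheme; the paper's subject is precisely how to compare such choices.

    ```python
    import numpy as np

    def summarize(data, n_partitions):
        """Partition `data` into equal-width bins; keep (lo, hi, mean, sd)."""
        edges = np.linspace(data.min(), data.max(), n_partitions + 1)
        ids = np.clip(np.digitize(data, edges) - 1, 0, n_partitions - 1)
        summary = []
        for k in range(n_partitions):
            part = data[ids == k]
            if part.size:
                sd = part.std(ddof=1) if part.size > 1 else 0.0
                summary.append((edges[k], edges[k + 1], part.mean(), sd))
        return summary                  # the raw data can now be discarded

    raw = np.random.default_rng(7).normal(size=1_000_000)  # simulation output
    for lo, hi, mean, sd in summarize(raw, 8):
        print(f"[{lo:+.2f}, {hi:+.2f}): mean={mean:+.3f}, sd={sd:.3f}")
    ```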

  12. Segy-change: The swiss army knife for the SEG-Y files

    NASA Astrophysics Data System (ADS)

    Stanghellini, Giuseppe; Carrara, Gabriela

Data collected during active and passive seismic surveys can be stored in many different, more or less standard, formats. One of the most popular is the SEG-Y format, developed since 1975 to store single-line seismic digital data on tapes, and since evolved to store data on hard disks and other media as well. Unfortunately, files that are claimed to be recorded in the SEG-Y format sometimes cannot be processed using available free or industrial packages. Aiming to solve this impasse, we present segy-change, a pre-processing software program to view, analyze, change and fix errors present in SEG-Y data files. It is written in the C language, can also be used as a software library, and is compatible with most operating systems. Segy-change allows the user to display and optionally change the values inside all parts of a SEG-Y file: the file header, the trace headers and the data blocks. In addition, it allows the user to perform a quality check on the data by plotting the traces. We provide instructions and examples on how to use the software.
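
    The kind of header inspection segy-change performs is straightforward to reproduce: a SEG-Y file opens with a 3200-byte EBCDIC textual header followed by a 400-byte binary header whose fields are big-endian integers. The offsets below follow the published SEG-Y rev 1 layout; the file name is hypothetical.

    ```python
    import struct

    def read_binary_header(path):
        """Read three standard fields of the SEG-Y binary file header."""
        with open(path, "rb") as f:
            f.seek(3200)                      # skip the textual header
            binary = f.read(400)
        interval, = struct.unpack(">h", binary[16:18])   # bytes 3217-3218
        n_samples, = struct.unpack(">h", binary[20:22])  # bytes 3221-3222
        fmt, = struct.unpack(">h", binary[24:26])        # bytes 3225-3226
        return {"sample_interval_us": interval,
                "samples_per_trace": n_samples,
                "data_format_code": fmt}

    # Example (hypothetical file):
    # print(read_binary_header("line42.segy"))
    ```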

  13. Feasibility study of molecular memory device based on DNA using methylation to store information

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Liming; Al-Dirini, Feras; Center for Neural Engineering

DNA, because of its robustness and dense information storage capability, has been proposed as a potential candidate for next-generation storage media. However, encoding information into the DNA sequence requires molecular synthesis technology, which to date is costly and prone to synthesis errors. Reading the DNA strand information is also complex. Ideally, DNA storage will provide methods for modifying stored information. Here, we conduct a feasibility study investigating the use of the DNA 5-methylcytosine (5mC) methylation state as a molecular memory to store information. We propose a new 1-bit memory device and study, based on density functional theory and the non-equilibrium Green's function method, the feasibility of electrically reading the information. Our results show that changes to methylation states lead to changes in the peak of negative differential resistance, which can be used to interrogate the memory state. Our work demonstrates a new memory concept based on methylation state which can be beneficial in the design of next-generation DNA-based molecular electronic memory devices.

  14. A close look at beam aborts with rise times less than 40 ms from the years 2014-2016. Case studies.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drees, A.

    2016-09-14

In an effort to understand the risks of operating RHIC with an additional delay of 40 ms in the abort system, all beam aborts triggered by loss monitors at store from the years 2014, 2015 and 2016 were analyzed, and particularly fast cases selected. The results were presented at the RHIC retreat on Jul 29, 2016. All beam aborts at injection, during the ramp and at flattop but before the “ev-lumi” event were ignored, since the additional delay of 40 ms is proposed for operation at store only. Many (but not all) of the 15 studied cases are of no concern. Cases with high damage potential are rare, but not rare enough. In order to make an added 40 ms during physics store conditions as safe as reasonably possible, additional permit inputs such as 10 Hz BF power supplies, RF storage cavities, power supply error states or BPMs, in addition to significantly reduced loss monitor (LM) thresholds for selected LMs, should be commissioned.

  15. Managing human error in aviation.

    PubMed

    Helmreich, R L

    1997-05-01

    Crew resource management (CRM) programs were developed to address team and leadership aspects of piloting modern airplanes. The goal is to reduce errors through team work. Human factors research and social, cognitive, and organizational psychology are used to develop programs tailored for individual airlines. Flight crews study accident case histories, group dynamics, and human error. Simulators provide pilots with the opportunity to solve complex flight problems. CRM in the simulator is called line-oriented flight training (LOFT). In automated cockpits CRM promotes the idea of automation as a crew member. Cultural aspects of aviation include professional, business, and national culture. The aviation CRM model has been adapted for training surgeons and operating room staff in human factors.

  16. Empirical Analysis of Systematic Communication Errors.

    DTIC Science & Technology

    1981-09-01

human components in communication systems. (Systematic errors were defined to be those that occur regularly in human communication links...phase of the human communication process and focuses on the linkage between a specific piece of information (and the receiver) and the transmission...communication flow. (2) Exchange. Exchange is the next phase in human communication and entails a concerted effort on the part of the sender and receiver to share

  17. HRA Aerospace Challenges

    NASA Technical Reports Server (NTRS)

    DeMott, Diana

    2013-01-01

Compared to equipment designed to perform the same function over and over, humans are just not as reliable. Computers and machines perform the same action in the same way repeatedly, getting the same result, unless the equipment fails or a human interferes. Humans who are supposed to perform the same actions repeatedly often perform them incorrectly due to a variety of issues, including stress, fatigue, illness, lack of training, distraction, acting at the wrong time, not acting when they should, not following procedures, misinterpreting information or inattention to detail. Why not use robots and automatic controls exclusively if human error is so common? In an emergency or off-normal situation that the computer, robotic element, or automatic control system is not designed to respond to, the result is failure unless a human can intervene. The human in the loop may be more likely to cause an error, but is also more likely to catch the error and correct it. When it comes to unexpected situations, or performing multiple tasks outside the defined mission parameters, humans are the only viable alternative. Human Reliability Assessment (HRA) identifies ways to improve human performance and reliability and can lead to improvements in systems designed to interact with humans. Understanding the contexts that can lead to human errors, which include taking the wrong action, taking no action, or making bad decisions, provides additional information to mitigate risks. With improved human reliability comes reduced risk for the overall operation or project.

  18. Identifying Human Factors Issues in Aircraft Maintenance Operations

    NASA Technical Reports Server (NTRS)

    Veinott, Elizabeth S.; Kanki, Barbara G.; Shafto, Michael G. (Technical Monitor)

    1995-01-01

Maintenance operations incidents submitted to the Aviation Safety Reporting System (ASRS) between 1986-1992 were systematically analyzed in order to identify issues relevant to human factors and crew coordination. This exploratory analysis involved 95 ASRS reports which represented a wide range of maintenance incidents. The reports were coded and analyzed according to the type of error (e.g., wrong part, procedural error, non-procedural error), contributing factors (e.g., individual, within-team, cross-team, procedure, tools), result of the error (e.g., aircraft damage or not) as well as the operational impact (e.g., aircraft flown to destination, air return, delay at gate). The main findings indicate that procedural errors were most common (48.4%) and that individual and team actions contributed to the errors in more than 50% of the cases. As for operational results, most errors were either corrected after landing at the destination (51.6%) or required the flight crew to stop enroute (29.5%). Interactions among these variables are also discussed. This analysis is a first step toward developing a taxonomy of crew coordination problems in maintenance. By understanding what variables are important and how they are interrelated, we may develop intervention strategies that are better tailored to the human factors issues involved.

  19. Managing Errors to Reduce Accidents in High Consequence Networked Information Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ganter, J.H.

    1999-02-01

Computers have always helped to amplify and propagate errors made by people. The emergence of Networked Information Systems (NISs), which allow people and systems to quickly interact worldwide, has made understanding and minimizing human error more critical. This paper applies concepts from system safety to analyze how hazards (from hackers to power disruptions) penetrate NIS defenses (e.g., firewalls and operating systems) to cause accidents. Such events usually result from both active, easily identified failures and more subtle latent conditions that have resided in the system for long periods. Both active failures and latent conditions result from human errors. We classify these into several types (slips, lapses, mistakes, etc.) and provide NIS examples of how they occur. Next we examine error minimization throughout the NIS lifecycle, from design through operation to reengineering. At each stage, steps can be taken to minimize the occurrence and effects of human errors. These include defensive design philosophies, architectural patterns to guide developers, and collaborative design that incorporates operational experiences and surprises into design efforts. We conclude by looking at three aspects of NISs that will cause continuing challenges in error and accident management: immaturity of the industry, limited risk perception, and resource tradeoffs.

  20. Investigating mode errors on automated flight decks: illustrating the problem-driven, cumulative, and interdisciplinary nature of human factors research.

    PubMed

    Sarter, Nadine

    2008-06-01

The goal of this article is to illustrate the problem-driven, cumulative, and highly interdisciplinary nature of human factors research by providing a brief overview of the work on mode errors on modern flight decks over the past two decades. Mode errors on modern flight decks were first reported in the late 1980s. Poor feedback, inadequate mental models of the automation, and the high degree of coupling and complexity of flight deck systems were identified as main contributors to these breakdowns in human-automation interaction. Various improvements of design, training, and procedures were proposed to address these issues. The author describes when and why the problem of mode errors surfaced, summarizes complementary research activities that helped identify and understand the contributing factors to mode errors, and describes some countermeasures that have been developed in recent years. This brief review illustrates how one particular human factors problem in the aviation domain enabled various disciplines and methodological approaches to contribute to a better understanding of, as well as provide better support for, effective human-automation coordination. Converging operations and interdisciplinary collaboration over an extended period of time are hallmarks of successful human factors research. The reported body of research can serve as a model for future research and as a teaching tool for students in this field of work.

  1. The Swiss cheese model of adverse event occurrence--Closing the holes.

    PubMed

    Stein, James E; Heiss, Kurt

    2015-12-01

    Traditional surgical attitude regarding error and complications has focused on individual failings. Human factors research has brought new and significant insights into the occurrence of error in healthcare, helping us identify systemic problems that injure patients while enhancing individual accountability and teamwork. This article introduces human factors science and its applicability to teamwork, surgical culture, medical error, and individual accountability. Copyright © 2015 Elsevier Inc. All rights reserved.

  2. Behind Human Error: Cognitive Systems, Computers and Hindsight

    DTIC Science & Technology

    1994-12-01

evaluations • Organize and/or conduct workshops and conferences. CSERIAC is a Department of Defense Information Analysis Center sponsored by the Defense...Process; Neutral Observer Criteria; Error Analysis as Causal Judgment; Error as Information; A Fundamental Surprise; What is Human...Kahnemann, 1974), and in risk analysis (Dougherty and Fragola, 1990). The discussions have continued in a wide variety of forums, including the

  3. Monte Carlo simulation of expert judgments on human errors in chemical analysis--a case study of ICP-MS.

    PubMed

    Kuselman, Ilya; Pennecchi, Francesca; Epstein, Malka; Fajgelj, Ales; Ellison, Stephen L R

    2014-12-01

    Monte Carlo simulation of expert judgments on human errors in a chemical analysis was used for determination of distributions of the error quantification scores (scores of likelihood and severity, and scores of effectiveness of a laboratory quality system in prevention of the errors). The simulation was based on modeling of an expert behavior: confident, reasonably doubting and irresolute expert judgments were taken into account by means of different probability mass functions (pmfs). As a case study, 36 scenarios of human errors which may occur in elemental analysis of geological samples by ICP-MS were examined. Characteristics of the score distributions for three pmfs of an expert behavior were compared. Variability of the scores, as standard deviation of the simulated score values from the distribution mean, was used for assessment of the score robustness. A range of the score values, calculated directly from elicited data and simulated by a Monte Carlo method for different pmfs, was also discussed from the robustness point of view. It was shown that robustness of the scores, obtained in the case study, can be assessed as satisfactory for the quality risk management and improvement of a laboratory quality system against human errors. Copyright © 2014 Elsevier B.V. All rights reserved.
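
    The simulation idea reduces to drawing scores from a pmf chosen to represent an expert's behaviour and examining the resulting distributions. The 1-5 scale and the three pmfs below are illustrative assumptions, not the paper's elicited values; lower spread corresponds to a more robust score.

    ```python
    import numpy as np

    scores = np.arange(1, 6)
    pmfs = {
        "confident":  [0.00, 0.05, 0.10, 0.70, 0.15],  # mass piled on one score
        "doubting":   [0.05, 0.15, 0.25, 0.40, 0.15],  # spread to neighbours
        "irresolute": [0.20, 0.20, 0.20, 0.20, 0.20],  # essentially uniform
    }

    rng = np.random.default_rng(3)
    for behaviour, pmf in pmfs.items():
        draws = rng.choice(scores, size=100_000, p=pmf)
        print(f"{behaviour:>10}: mean={draws.mean():.2f}, "
              f"sd={draws.std(ddof=1):.2f}")
    ```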

  4. Designing a new structure for storing nuclear data: Progress of the Working Party for Evaluation Cooperation subgroup #38

    DOE PAGES

    Mattoon, C. M.; Beck, B. R.

    2015-12-24

    An international effort is underway to design a new structure for storing and using nuclear reaction data, with the goal of eventually replacing the current standard, ENDF-6. This effort, organized by the Working Party for Evaluation Cooperation, was initiated in 2012 and has resulted in a list of requirements and specifications for how the proposed new structure shall perform. The new structure will take advantage of new developments in computational tools, using a nested hierarchy to store data. Here, the structure can be stored in text form (such as an XML file) for human readability and data sharing, or it can be stored in binary to optimize data access. In this paper, we present the progress towards completing the requirements, specifications and implementation of the new structure.
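
    The sketch below illustrates, in Python, the general idea of one nested hierarchy serialized two ways; the element names are hypothetical and do not reproduce the subgroup's actual schema.

        # Hypothetical nested record; NOT the actual structure designed by subgroup 38.
        import pickle
        import xml.etree.ElementTree as ET

        data = {"projectile": "n", "target": "U238",
                "crossSection": {"unit": "b", "points": [(1.0e-3, 2.5), (1.0e-2, 2.1)]}}

        # text form (XML) for human readability and data sharing
        root = ET.Element("reactionSuite", projectile=data["projectile"], target=data["target"])
        xs = ET.SubElement(root, "crossSection", unit=data["crossSection"]["unit"])
        xs.text = " ".join(f"{e:g} {v:g}" for e, v in data["crossSection"]["points"])
        print(ET.tostring(root, encoding="unicode"))

        # binary form to optimize data access (pickle used purely for illustration)
        print(len(pickle.dumps(data)), "bytes in binary form")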

  5. Designing a new structure for storing nuclear data: Progress of the Working Party for Evaluation Cooperation subgroup #38

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mattoon, C. M.; Beck, B. R.

    An international effort is underway to design a new structure for storing and using nuclear reaction data, with the goal of eventually replacing the current standard, ENDF-6. This effort, organized by the Working Party for Evaluation Cooperation, was initiated in 2012 and has resulted in a list of requirements and specifications for how the proposed new structure shall perform. The new structure will take advantage of new developments in computational tools, using a nested hierarchy to store data. Here, the structure can be stored in text form (such as an XML file) for human readability and data sharing, or it can be stored in binary to optimize data access. In this paper, we present the progress towards completing the requirements, specifications and implementation of the new structure.

  6. Human-computer interaction in multitask situations

    NASA Technical Reports Server (NTRS)

    Rouse, W. B.

    1977-01-01

    Human-computer interaction in multitask decisionmaking situations is considered, and it is proposed that humans and computers have overlapping responsibilities. Queueing theory is employed to model this dynamic approach to the allocation of responsibility between human and computer. Results of simulation experiments are used to illustrate the effects of several system variables including number of tasks, mean time between arrivals of action-evoking events, human-computer speed mismatch, probability of computer error, probability of human error, and the level of feedback between human and computer. Current experimental efforts are discussed and the practical issues involved in designing human-computer systems for multitask situations are considered.
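
    The toy simulation below (Python; arrival rates, service times, and error probabilities are invented, and the allocation rule is a simplification rather than Rouse's queueing model) conveys the flavour of the approach: action-evoking events go to the computer when it is idle and overflow to the slower human otherwise, with each server contributing errors at its own rate.

        # Toy human-computer task allocation; parameters are illustrative only.
        import random

        random.seed(0)
        MEAN_ARRIVAL = 5.0          # mean time between action-evoking events
        HUMAN_SERVICE = 4.0         # human service time per task
        COMP_SPEEDUP = 4.0          # human-computer speed mismatch
        P_ERR_COMP, P_ERR_HUMAN = 0.05, 0.01

        t = comp_free = human_free = 0.0
        errors, n = 0, 10_000
        for _ in range(n):
            t += random.expovariate(1.0 / MEAN_ARRIVAL)
            if comp_free <= t:      # computer takes the task if idle
                comp_free = t + HUMAN_SERVICE / COMP_SPEEDUP
                errors += random.random() < P_ERR_COMP
            else:                   # otherwise responsibility passes to the human
                human_free = max(human_free, t) + HUMAN_SERVICE
                errors += random.random() < P_ERR_HUMAN
        print(f"overall error rate: {errors / n:.3%}")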

  7. Phylesystem: a git-based data store for community-curated phylogenetic estimates.

    PubMed

    McTavish, Emily Jane; Hinchliff, Cody E; Allman, James F; Brown, Joseph W; Cranston, Karen A; Holder, Mark T; Rees, Jonathan A; Smith, Stephen A

    2015-09-01

    Phylogenetic estimates from published studies can be archived using general platforms like Dryad (Vision, 2010) or TreeBASE (Sanderson et al., 1994). Such services fulfill a crucial role in ensuring transparency and reproducibility in phylogenetic research. However, digital tree data files often require some editing (e.g. rerooting) to improve the accuracy and reusability of the phylogenetic statements. Furthermore, establishing the mapping between tip labels used in a tree and taxa in a single common taxonomy dramatically improves the ability of other researchers to reuse phylogenetic estimates. As the process of curating a published phylogenetic estimate is not error-free, retaining a full record of the provenance of edits to a tree is crucial for openness, allowing editors to receive credit for their work and making errors introduced during curation easier to correct. Here, we report the development of software infrastructure to support the open curation of phylogenetic data by the community of biologists. The backend of the system provides an interface for the standard database operations of creating, reading, updating and deleting records by making commits to a git repository. The record of the history of edits to a tree is preserved by git's version control features. Hosting this data store on GitHub (http://github.com/) provides open access to the data store using tools familiar to many developers. We have deployed a server running the 'phylesystem-api', which wraps the interactions with git and GitHub. The Open Tree of Life project has also developed and deployed a JavaScript application that uses the phylesystem-api and other web services to enable input and curation of published phylogenetic statements. Source code for the web service layer is available at https://github.com/OpenTreeOfLife/phylesystem-api. The data store can be cloned from: https://github.com/OpenTreeOfLife/phylesystem. A web application that uses the phylesystem web services is deployed at http://tree.opentreeoflife.org/curator. Code for that tool is available from https://github.com/OpenTreeOfLife/opentree. mtholder@gmail.com. © The Author 2015. Published by Oxford University Press.
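
    A bare-bones sketch of the core idea, treating "update a record" as "write a file and commit it" so that version control preserves provenance; the function and repository path below are invented for illustration and are not the phylesystem-api itself.

        # Minimal CRUD-via-git sketch; not the actual phylesystem-api code.
        import json
        import pathlib
        import subprocess

        REPO = pathlib.Path("/tmp/tree-store")   # hypothetical local clone

        def save_study(study_id: str, document: dict, author: str, message: str) -> None:
            """Create or update a study record as a git commit."""
            path = REPO / f"{study_id}.json"
            path.write_text(json.dumps(document, indent=2))
            subprocess.run(["git", "-C", str(REPO), "add", path.name], check=True)
            subprocess.run(["git", "-C", str(REPO), "commit",
                            f"--author={author}", "-m", message], check=True)

        # The full edit history of a record is then just its git log, e.g.:
        # git -C /tmp/tree-store log --oneline ot_123.json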

  8. Food insecurity as a driver of obesity in humans: The insurance hypothesis.

    PubMed

    Nettle, Daniel; Andrews, Clare; Bateson, Melissa

    2017-01-01

    Integrative explanations of why obesity is more prevalent in some sectors of the human population than others are lacking. Here, we outline and evaluate one candidate explanation, the insurance hypothesis (IH). The IH is rooted in adaptive evolutionary thinking: The function of storing fat is to provide a buffer against shortfall in the food supply. Thus, individuals should store more fat when they receive cues that access to food is uncertain. Applied to humans, this implies that an important proximate driver of obesity should be food insecurity rather than food abundance per se. We integrate several distinct lines of theory and evidence that bear on this hypothesis. We present a theoretical model that shows it is optimal to store more fat when food access is uncertain, and we review the experimental literature from non-human animals showing that fat reserves increase when access to food is restricted. We provide a meta-analysis of 125 epidemiological studies of the association between perceived food insecurity and high body weight in humans. There is a robust positive association, but it is restricted to adult women in high-income countries. We explore why this could be in light of the IH and our theoretical model. We conclude that although the IH alone cannot explain the distribution of obesity in the human population, it may represent a very important component of a pluralistic explanation. We also discuss insights it may offer into the developmental origins of obesity, dieting-induced weight gain, and anorexia nervosa.

  9. Protective antifungal activity of essential oils extracted from Buddleja perfoliata and Pelargonium graveolens against fungi isolated from stored grains.

    PubMed

    Juárez, Z N; Bach, H; Sánchez-Arreola, E; Bach, H; Hernández, L R

    2016-05-01

    The chemical composition and antifungal activity of essential oils extracted from Buddleja perfoliata and Pelargonium graveolens were analysed to assess their efficacy as a potential alternative to synthetic chemical fungicides to protect stored grain. Essential oils were obtained by hydrodistillation, while GC-MS was used to characterize the components of these oils. The main components identified from the essential oil of B. perfoliata were cubenol, eudesmol, germacrene D-4-ol and cis-verbenol, whereas (-)-aristolene, β-citronellol and geraniol were identified in P. graveolens. These essential oils were tested against a panel of fungal strains isolated from stored grains. Toxicity of the essential oils was assessed using two models represented by human-derived macrophages and the brine shrimp assay. Moreover, the inflammatory response to the oils was assessed by measuring secretion of the pro-inflammatory cytokines IL-6 and TNF-α using a human-derived macrophage cell line. Results show potent antifungal activity against a collection of fungi, with minimal inhibitory concentrations ranging from 0·3 to 50 μg ml(-1) for both plants. Moderate cytotoxicity was observed, but no inflammatory responses. These oils can be used as an alternative to the synthetic chemical fungicides used to protect stored grains. Synthetic chemical fungicides are used to protect stored grains, but their broad use raises concerns about effects on the environment and human health. The impact of the present report is that the use of essential oils is an eco-friendly alternative for fungal control in postharvest grains with a low impact on the environment. © 2016 The Society for Applied Microbiology.

  10. Combining task analysis and fault tree analysis for accident and incident analysis: a case study from Bulgaria.

    PubMed

    Doytchev, Doytchin E; Szwillus, Gerd

    2009-11-01

    Understanding the reasons for incident and accident occurrence is important for an organization's safety. Different methods have been developed to achieve this goal. To better understand human behaviour in incident occurrence we propose an analysis concept that combines Fault Tree Analysis (FTA) and Task Analysis (TA). The former method identifies the root causes of an accident/incident, while the latter analyses the way people perform the tasks in their work environment and how they interact with machines or colleagues. These methods were complemented with the use of the Human Error Identification in Systems Tool (HEIST) methodology and the concept of Performance Shaping Factors (PSF) to deepen the insight into the error modes of an operator's behaviour. HEIST shows the external error modes that caused the human error and the factors that prompted the human to err. To show the validity of the approach, a case study at a Bulgarian hydro power plant was carried out. An incident - the flooding of the plant's basement - was analysed by combining the afore-mentioned methods. The case study shows that Task Analysis in combination with other methods can be applied successfully to human error analysis, revealing details about erroneous actions in a realistic situation.

  11. Object permanence in adult common marmosets (Callithrix jacchus): not everything is an "A-not-B" error that seems to be one.

    PubMed

    Kis, Anna; Gácsi, Márta; Range, Friederike; Virányi, Zsófia

    2012-01-01

    In this paper, we describe a behaviour pattern similar to the "A-not-B" error found in human infants and young apes in a monkey species, the common marmosets (Callithrix jacchus). In contrast to the classical explanation, recently it has been suggested that the "A-not-B" error committed by human infants is at least partially due to misinterpretation of the hider's ostensively communicated object hiding actions as potential 'teaching' demonstrations during the A trials. We tested whether this so-called Natural Pedagogy hypothesis would account for the A-not-B error that marmosets commit in a standard object permanence task, but found no support for the hypothesis in this species. Alternatively, we present evidence that lower level mechanisms, such as attention and motivation, play an important role in committing the "A-not-B" error in marmosets. We argue that these simple mechanisms might contribute to the effect of undeveloped object representational skills in other species including young non-human primates that commit the A-not-B error.

  12. Human Powered Centrifuge

    NASA Technical Reports Server (NTRS)

    Mulenburg, Gerald M. (Inventor); Vernikos, Joan (Inventor)

    1997-01-01

    A human powered centrifuge has independently established turntable angular velocity and human power input. A control system allows excess input power to be stored as electric energy in a battery or dissipated as heat through a resistor. In a mechanical embodiment, the excess power is dissipated in a friction brake.

  13. Predatory response of Xylocoris flavipes to bruchid pests of stored food legumes

    Treesearch

    Sharlene E. Sing; Richard T. Arbogast

    2008-01-01

    Biological control may provide an affordable and sustainable option for reducing losses to pest Bruchidae in stored food legumes, a crucial source of human dietary protein. Previous investigations have focused primarily on the role of parasitism in bruchid biological control, while the potential of generalist predators has been comparatively unexplored. The true bug...

  14. Updating expected action outcome in the medial frontal cortex involves an evaluation of error type.

    PubMed

    Maier, Martin E; Steinhauser, Marco

    2013-10-02

    Forming expectations about the outcome of an action is an important prerequisite for action control and reinforcement learning in the human brain. The medial frontal cortex (MFC) has been shown to play an important role in the representation of outcome expectations, particularly when an update of expected outcome becomes necessary because an error is detected. However, error detection alone is not always sufficient to compute expected outcome because errors can occur in various ways and different types of errors may be associated with different outcomes. In the present study, we therefore investigate whether updating expected outcome in the human MFC is based on an evaluation of error type. Our approach was to consider an electrophysiological correlate of MFC activity on errors, the error-related negativity (Ne/ERN), in a task in which two types of errors could occur. Because the two error types were associated with different amounts of monetary loss, updating expected outcomes on error trials required an evaluation of error type. Our data revealed a pattern of Ne/ERN amplitudes that closely mirrored the amount of monetary loss associated with each error type, suggesting that outcome expectations are updated based on an evaluation of error type. We propose that this is achieved by a proactive evaluation process that anticipates error types by continuously monitoring error sources or by dynamically representing possible response-outcome relations.

  15. Optimization of RFID network planning using Zigbee and WSN

    NASA Astrophysics Data System (ADS)

    Hasnan, Khalid; Ahmed, Aftab; Badrul-aisham, Bakhsh, Qadir

    2015-05-01

    Everyone wants ease in their life, and radio frequency identification (RFID) wireless technology is used to make life easier. RFID technology increases productivity, accuracy, and convenience in service delivery across the supply chain. It is used in applications such as preventing automobile theft, collecting tolls without stopping, eliminating checkout lines at grocery stores, managing traffic, hospital management, corporate campuses and airports, mobile asset tracking, warehousing, tracking library books, and tracking a wealth of assets in supply chain management. The efficiency of RFID can be enhanced by integrating it with wireless sensor networks (WSN), Zigbee mesh networks, and the Internet of Things (IoT). The proposed system identifies, senses, and provides real-time locating (RTLS) of items in an indoor heterogeneous region. The system gives richer real-time information on objects' characteristics and locations and on environmental parameters such as temperature, noise, and humidity. RTLS reduces human error, optimizes inventory management, and increases productivity and information accuracy in an indoor heterogeneous network. The power consumption and data transmission rate of the system can be minimized through low-power hardware design.

  16. Maggot development during morgue storage and its effect on estimating the post-mortem interval.

    PubMed

    Huntington, Timothy E; Higley, Leon G; Baxendale, Frederick P

    2007-03-01

    When insect evidence is obtained during autopsy, forensic entomologists make decisions regarding the effects of low-temperature (-1 degrees C to 4 degrees C) storage of the body and associated insects when estimating the post-mortem interval (PMI). To determine the effects of storage in a morgue cooler on the temperature of maggot masses, temperatures inside and outside of body bags containing a human cadaver and porcine cadavers (seven replicates) were measured during storage. Temperatures remained significantly higher (p<0.05) inside of the body bags relative to the cooler, and remained at levels sufficient for maggot feeding and development. If the assumption that no insect development takes place during preautopsy refrigeration is made, potential error rates in PMI estimation of 8.6-12.8% occur. The potential for blow fly larvae to undergo significant development while being stored in the morgue is a possibility that forensic entomologists should consider during an investigation involving samples collected from autopsy. Case and experimental evidence also demonstrate that substantial tissue loss can occur from maggot feeding during morgue storage.

  17. A novel automated rat catalepsy bar test system based on a RISC microcontroller.

    PubMed

    Alvarez-Cervera, Fernando J; Villanueva-Toledo, Jairo; Moo-Puc, Rosa E; Heredia-López, Francisco J; Alvarez-Cervera, Margarita; Pineda, Juan C; Góngora-Alfaro, José L

    2005-07-15

    Catalepsy tests performed in rodents treated with drugs that interfere with dopaminergic transmission have been widely used for the screening of drugs with therapeutic potential in the treatment of Parkinson's disease. The basic method for measuring catalepsy intensity is the "standard" bar test. We present here an easy-to-use microcontroller-based automatic system for recording bar test experiments. The design is simple, compact, and low-cost. Recording intervals and total experimental time can be programmed within a wide range of values. The resulting catalepsy times are stored, and up to five simultaneous experiments can be recorded. A standard personal computer interface is included. The automated system also permits the elimination of human error associated with factors such as fatigue, distraction, and data transcription, occurring during manual recording. Furthermore, a uniform criterion for timing the cataleptic condition can be achieved. Correlation values between the results obtained with the automated system and those reported by two independent observers ranged between 0.88 and 0.99 (P<0.0001; three treatments, nine animals, 144 catalepsy time measurements).

  18. Secure and Privacy Enhanced Gait Authentication on Smart Phone

    PubMed Central

    Choi, Deokjai

    2014-01-01

    Smart environments established by the development of mobile technology have brought vast benefits to human beings. However, authentication mechanisms on portable smart devices, particularly conventional biometric-based approaches, still raise security and privacy concerns. These traditional systems are mostly based on pattern recognition and machine learning algorithms, wherein original biometric templates or extracted features are stored in unconcealed form for matching against a new biometric sample in the authentication phase. In this paper, we propose a novel gait-based authentication using a biometric cryptosystem to enhance system security and user privacy on the smart phone. Extracted gait features are used only to biometrically encrypt a cryptographic key, which acts as the authentication factor. Gait signals are acquired using an inertial sensor (accelerometer) in the mobile device, and error correcting codes are adopted to deal with the natural variation of gait measurements. We evaluate our proposed system on a dataset consisting of gait samples of 34 volunteers. We achieved the lowest false acceptance rate (FAR) and false rejection rate (FRR) of 3.92% and 11.76%, respectively, for a key length of 50 bits. PMID:24955403
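
    The sketch below shows a simplified fuzzy-commitment-style construction of the kind the abstract describes, using a repetition code as the error correcting code; the parameters and the code choice are illustrative assumptions, not the authors' exact scheme.

        # Simplified key binding with a repetition code; illustrative parameters only.
        import numpy as np

        rng = np.random.default_rng(7)
        KEY_BITS, REP = 50, 5                      # 50-bit key, each bit repeated 5 times

        key = rng.integers(0, 2, size=KEY_BITS)
        codeword = np.repeat(key, REP)             # repetition-code encoding of the key
        enroll = rng.integers(0, 2, size=KEY_BITS * REP)   # enrolment gait feature bits
        helper = codeword ^ enroll                 # stored helper data; hides both parts

        # a fresh gait sample differs from enrolment in a few bits (here: one bit
        # flipped in 15 of the 50 repetition groups, within the code's capacity)
        fresh = enroll.copy()
        fresh[rng.choice(KEY_BITS, size=15, replace=False) * REP] ^= 1

        noisy = helper ^ fresh                     # equals codeword XOR sample noise
        votes = noisy.reshape(KEY_BITS, REP).sum(axis=1)
        recovered = (votes > REP // 2).astype(int) # majority-decode each key bit
        print("key recovered:", np.array_equal(recovered, key))   # True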

  19. Public health consequences of mercury spills: Hazardous Substances Emergency Events Surveillance system, 1993-1998.

    PubMed Central

    Zeitz, Perri; Orr, Maureen F; Kaye, Wendy E

    2002-01-01

    We analyzed data from states that participated in the Hazardous Substances Emergency Events Surveillance (HSEES) system maintained by the Agency for Toxic Substances and Disease Registry to describe the public health consequences of mercury releases. From 1993 through 1998, HSEES captured 406 events in which mercury was the only substance released. Schools and universities, private residences, and health care facilities were the most frequent locations involved in mercury events, and human error was the contributing factor for most of the releases. Fourteen persons experienced adverse health effects as a result of the releases. An additional 31 persons had documented elevated levels of mercury in the blood. No fatalities resulted. Evacuations were ordered in 90 (22%) of the events, and the length of evacuation ranged from 1 hr to 46 days. Mercury spills have a significant public health impact and economic burden. Some actions that could potentially lessen the consequences of mercury spills are to switch to mercury-free alternatives, train people in the safe handling and disposal of mercury, and keep mercury securely stored when it is necessary to have it on hand. PMID:11836139

  20. Changes of MK medium during storage of human cornea.

    PubMed Central

    Hasany, S M; Basu, P K

    1987-01-01

    By comparing the composition of McCarey-Kaufman (MK) medium before and after corneal storage we attempted to identify specific physiological changes in the medium as predictors of tissue damage. We also tried to determine if hydrocortisone (a lysosomal membrane stabiliser) added to the medium could reduce tissue damage during storage. Corneas (human and rabbit) were stored in the MK medium with and without hydrocortisone for 4 days at 4 degrees C. The water and nitrogen contents of the stored cornea were compared with those of the fresh cornea. The medium was analysed before and after corneal storage to determine the concentrations of glucose, protein, and amino acids as well as pH and osmolarity. Scanning electron microscopy (SEM) was used to estimate the degree of the corneal endothelial cell damage. The nitrogen contents and dry weights of the steroid treated and untreated stored corneas were similar to those of the fresh unstored cornea. The steroid treated cornea contained a lesser amount of water than the untreated cornea. The cornea stored in medium without steroid took up a greater amount of glucose from the medium than the cornea stored in medium with steroid. As compared with their concentrations in the fresh unused medium the concentrations of leucine, lysine, and glycine were lower and that of glutamic acid was higher in both the media used for corneal storage. However, the steroid treated storage medium as compared with the untreated storage medium had a greater reduction in the lowering of leucine, lysine, and glycine, and a lesser reduction in the increase of glutamic acid. Steroid treated medium also had a lesser amount of protein released from the stored cornea. Changes in the pH and osmolarity of the media before and after corneal storage were not remarkable. SEM showed that the endothelial cells of the cornea stored in the medium containing steroid were less damaged than those of the cornea stored in the medium without steroid. PMID:3620430

  1. Storage of Unfed and Leftover Mothers' Own Milk.

    PubMed

    Fogleman, April D; Meng, Ting; Osborne, Jason; Perrin, Maryanne T; Jones, Frances; Allen, Jonathan C

    The objective was to examine the bacteriological and immunological properties of freshly expressed, previously frozen, and leftover mothers' own milk during storage. In the first of two pilot studies, 12 mother-infant dyads participated. The milk studied included freshly expressed unfed and freshly expressed leftover milk. Milk samples were stored at 24°C, 4°C, or -20°C. In the second pilot study, 11 mother-infant dyads participated. The milk studied included milk that had been previously frozen, including previously frozen leftover milk. Milk samples were stored at 24°C and 4°C. After storage in both studies, the milk was analyzed for bacteriological and immunological properties. Bacteriological and immunological characteristics of freshly expressed unfed and freshly expressed leftover milk and previously frozen unfed and previously frozen leftover milk remained stable during storage at 4°C for at least 6 days. The quality of all groups of mothers' milk declined when stored at 24°C for longer than 3 hours. While this study provides evidence that human milk might be safe at longer storage times, storage guidelines should not be revised until more research is performed. This study serves as a call to action for more research on the topic of human milk storage, specifically leftover human milk. The study provides information to inform future study designs on the topic of unpasteurized human milk storage. More research is needed regarding leftover human milk storage with a greater number of participants, determination of the quality of human milk, and the storage of human milk in a real-life setting.

  2. Equalization for a page-oriented optical memory system

    NASA Astrophysics Data System (ADS)

    Trelewicz, Jennifer Q.; Capone, Jeffrey

    1999-11-01

    In this work, a method of decision-feedback equalization is developed for a digital holographic channel that experiences moderate-to-severe imaging errors. Decision feedback is utilized, not only where the channel is well-behaved, but also near the edges of the camera grid that are subject to a high degree of imaging error. In addition to these effects, the channel is worsened by typical problems of holographic channels, including non-uniform illumination, dropouts, and stuck bits. The approach described in this paper builds on established methods for performing trained and blind equalization on time-varying channels. The approach is tested on experimental data sets. On most of these data sets, the method of equalization described in this work delivers at least an order of magnitude improvement in bit-error rate (BER) before error-correction coding (ECC). When ECC is introduced, the approach is able to recover stored data with no errors for many of the tested data sets. Furthermore, a low BER was maintained even over a range of small alignment perturbations in the system. It is believed that this equalization method can allow cost reductions to be made in page-memory systems, by allowing for a larger image area per page or less complex imaging components, without sacrificing the low BER required by data storage applications.
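
    As a generic illustration of decision-feedback equalization (not the paper's holographic-channel implementation: the channel, tap counts, and step size below are invented), the Python sketch trains feedforward and feedback taps by LMS and then counts bit errors.

        # Generic LMS-trained DFE on a toy ISI channel; parameters illustrative.
        import numpy as np

        rng = np.random.default_rng(3)
        N, MU = 20_000, 0.01
        symbols = rng.choice([-1.0, 1.0], size=N)

        # mild intersymbol interference plus noise stands in for imaging errors
        received = np.convolve(symbols, [1.0, 0.4, 0.2])[:N]
        received += 0.05 * rng.normal(size=N)

        ff = np.zeros(4)                  # feedforward taps on received samples
        fb = np.zeros(2)                  # feedback taps on past decisions
        past = np.zeros(2)
        errors = 0
        for n in range(3, N):
            x = received[n-3:n+1][::-1]
            y = ff @ x - fb @ past        # equalizer output
            decision = 1.0 if y >= 0 else -1.0
            err = symbols[n] - y          # trained adaptation: reference known
            ff += MU * err * x
            fb -= MU * err * past
            if n > N // 2:                # count errors only after convergence
                errors += decision != symbols[n]
            past = np.array([decision, past[0]])
        print(f"BER after training: {errors / (N - N // 2 - 1):.2e}")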

  3. Verification of a national water data base using a geographic information system

    USGS Publications Warehouse

    Harrison, H.E.

    1994-01-01

    The National Water Data Exchange (NAWDEX) was developed to assist users of water-resource data in the identification, location, and acquisition of data. The Master Water Data Index (MWDI) of NAWDEX currently indexes the data collected by 423 organizations from nearly 500,000 sites throughout the United States. The utilization of new computer technologies permits the distribution of the MWDI to the public on compact disc. In addition, geographic information systems (GIS) are now available that can store and analyze these data in a spatial format. These recent innovations could increase access and add new capabilities to the MWDI. Before either of these technologies could be employed, however, a quality-assurance check of the MWDI needed to be performed. The MWDI resides on a mainframe computer in a tabular format. It was copied onto a workstation and converted to a GIS format. The GIS was used to identify errors in the MWDI and produce reports that summarized these errors. The summary reports were sent to the responsible contributing agencies along with instructions for submitting their corrections to the NAWDEX Program Office. The MWDI administrator received reports that summarized all of the errors identified. Of the 494,997 sites checked, 93,440 sites had at least one error (18.9 percent error rate).
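
    The headline figure is a straightforward ratio, reproduced here as a quick check:

        # Quick check of the reported 18.9 percent error rate.
        sites_checked, sites_with_errors = 494_997, 93_440
        print(f"{sites_with_errors / sites_checked:.1%}")   # -> 18.9%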

  4. How much swamp are we talking here?: Propagating uncertainty about the area of coastal wetlands into the U.S. greenhouse gas inventory

    NASA Astrophysics Data System (ADS)

    Holmquist, J. R.; Crooks, S.; Windham-Myers, L.; Megonigal, P.; Weller, D.; Lu, M.; Bernal, B.; Byrd, K. B.; Morris, J. T.; Troxler, T.; McCombs, J.; Herold, N.

    2017-12-01

    Stable coastal wetlands can store substantial amounts of carbon (C) that can be released when they are degraded or eroded. The EPA recently incorporated coastal wetland net-storage and emissions within the Agricultural Forested and Other Land Uses category of the U.S. National Greenhouse Gas Inventory (NGGI). This was a seminal analysis, but its quantification of uncertainty needs improvement. We provide a value-added analysis by estimating that uncertainty, focusing initially on the most basic assumption, the area of coastal wetlands. We considered three sources: uncertainty in the areas of vegetation and salinity subclasses, uncertainty in the areas of changing or stable wetlands, and uncertainty in the inland extent of coastal wetlands. The areas of vegetation and salinity subtypes, as well as stable or changing, were estimated from 2006 and 2010 maps derived from Landsat imagery by the Coastal Change Analysis Program (C-CAP). We generated unbiased area estimates and confidence intervals for C-CAP, taking into account mapped area, proportional areas of commission and omission errors, as well as the number of observations. We defined the inland extent of wetlands as all land below the current elevation of twice monthly highest tides. We generated probabilistic inundation maps integrating wetland-specific bias and random error in light-detection and ranging elevation maps, with the spatially explicit random error in tidal surfaces generated from tide gauges. This initial uncertainty analysis will be extended to calculate total propagated uncertainty in the NGGI by including the uncertainties in the amount of C lost from eroded and degraded wetlands, stored annually in stable wetlands, and emitted in the form of methane by tidal freshwater wetlands.

  5. Accounting for measurement error in human life history trade-offs using structural equation modeling.

    PubMed

    Helle, Samuli

    2018-03-01

    Revealing causal effects from correlative data is very challenging and a contemporary problem in human life history research owing to the lack of experimental approach. Problems with causal inference arising from measurement error in independent variables, whether related either to inaccurate measurement technique or validity of measurements, seem not well-known in this field. The aim of this study is to show how structural equation modeling (SEM) with latent variables can be applied to account for measurement error in independent variables when the researcher has recorded several indicators of a hypothesized latent construct. As a simple example of this approach, measurement error in lifetime allocation of resources to reproduction in Finnish preindustrial women is modelled in the context of the survival cost of reproduction. In humans, lifetime energetic resources allocated in reproduction are almost impossible to quantify with precision and, thus, typically used measures of lifetime reproductive effort (e.g., lifetime reproductive success and parity) are likely to be plagued by measurement error. These results are contrasted with those obtained from a traditional regression approach where the single best proxy of lifetime reproductive effort available in the data is used for inference. As expected, the inability to account for measurement error in women's lifetime reproductive effort resulted in the underestimation of its underlying effect size on post-reproductive survival. This article emphasizes the advantages that the SEM framework can provide in handling measurement error via multiple-indicator latent variables in human life history studies. © 2017 Wiley Periodicals, Inc.
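
    The small simulation below (Python; all numbers invented) reproduces the attenuation phenomenon the abstract describes: regressing on a single noisy proxy of a latent construct underestimates the true effect, while combining several indicators, which is the intuition behind SEM latent variables, recovers much of it.

        # Attenuation from measurement error; simulated data, not the study's.
        import numpy as np

        rng = np.random.default_rng(42)
        n, beta = 50_000, -0.5                    # true effect of latent effort
        effort = rng.normal(size=n)               # latent reproductive effort
        survival = beta * effort + rng.normal(size=n)

        def slope(x, y):                           # OLS slope of y on x
            return np.cov(x, y)[0, 1] / np.var(x, ddof=1)

        proxy = effort + rng.normal(size=n)                       # one noisy indicator
        composite = (effort[:, None] + rng.normal(size=(n, 4))).mean(axis=1)

        print(f"true effect        : {beta:.2f}")
        print(f"single noisy proxy : {slope(proxy, survival):.2f}")      # ~ -0.25
        print(f"mean of 4 proxies  : {slope(composite, survival):.2f}")  # ~ -0.40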

  6. Reliability of drivers in urban intersections.

    PubMed

    Gstalter, Herbert; Fastenmeier, Wolfgang

    2010-01-01

    The concept of human reliability has been widely used in industrial settings by human factors experts to optimise the person-task fit. Reliability is estimated by the probability that a task will successfully be completed by personnel in a given stage of system operation. Human Reliability Analysis (HRA) is a technique used to calculate human error probabilities as the ratio of errors committed to the number of opportunities for that error. To transfer this notion to the measurement of car driver reliability the following components are necessary: a taxonomy of driving tasks, a definition of correct behaviour in each of these tasks, a list of errors as deviations from the correct actions and an adequate observation method to register errors and opportunities for these errors. Use of the SAFE-task analysis procedure recently made it possible to derive driver errors directly from the normative analysis of behavioural requirements. Driver reliability estimates could be used to compare groups of tasks (e.g. different types of intersections with their respective regulations) as well as groups of drivers' or individual drivers' aptitudes. This approach was tested in a field study with 62 drivers of different age groups. The subjects drove an instrumented car and had to complete an urban test route, the main features of which were 18 intersections representing six different driving tasks. The subjects were accompanied by two trained observers who recorded driver errors using standardized observation sheets. Results indicate that error indices often vary between both the age group of drivers and the type of driving task. The highest error indices occurred in the non-signalised intersection tasks and the roundabout, which exactly equals the corresponding ratings of task complexity from the SAFE analysis. A comparison of age groups clearly shows the disadvantage of older drivers, whose error indices in nearly all tasks are significantly higher than those of the other groups. The vast majority of these errors could be explained by high task load in the intersections, as they represent difficult tasks. The discussion shows how reliability estimates can be used in a constructive way to propose changes in car design, intersection layout and regulation as well as driver training.
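
    The reliability index itself is just the ratio described above; a minimal worked example with invented counts:

        # HEP = errors committed / opportunities for error (counts invented).
        observed = {
            "signalised intersection": (4, 310),
            "non-signalised intersection": (21, 280),
            "roundabout": (17, 250),
        }
        for task, (errors, opportunities) in observed.items():
            print(f"{task:28s} HEP = {errors / opportunities:.3f}")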

  7. Safety coaches in radiology: decreasing human error and minimizing patient harm.

    PubMed

    Dickerson, Julie M; Koch, Bernadette L; Adams, Janet M; Goodfriend, Martha A; Donnelly, Lane F

    2010-09-01

    Successful programs to improve patient safety require a component aimed at improving safety culture and environment, resulting in a reduced number of human errors that could lead to patient harm. Safety coaching provides peer accountability. It involves observing for safety behaviors and use of error prevention techniques and provides immediate feedback. For more than a decade, behavior-based safety coaching has been a successful strategy for reducing error within the context of occupational safety in industry. We describe the use of safety coaches in radiology. Safety coaches are an important component of our comprehensive patient safety program.

  8. Error-associated behaviors and error rates for robotic geology

    NASA Technical Reports Server (NTRS)

    Anderson, Robert C.; Thomas, Geb; Wagner, Jacob; Glasgow, Justin

    2004-01-01

    This study explores human error as a function of the decision-making process. One of many models for human decision-making is Rasmussen's decision ladder [9]. The decision ladder identifies the multiple tasks and states of knowledge involved in decision-making. The tasks and states of knowledge can be classified by the level of cognitive effort required to make the decision, leading to the skill, rule, and knowledge taxonomy (Rasmussen, 1987). Skill based decisions require the least cognitive effort and knowledge based decisions require the greatest cognitive effort. Errors can occur at any of the cognitive levels.

  9. Procedural error monitoring and smart checklists

    NASA Technical Reports Server (NTRS)

    Palmer, Everett

    1990-01-01

    Human beings make and usually detect errors routinely. The same mental processes that allow humans to cope with novel problems can also lead to error. Bill Rouse has argued that errors are not inherently bad but their consequences may be. He proposes the development of error-tolerant systems that detect errors and take steps to prevent the consequences of the error from occurring. Research should be done on self and automatic detection of random and unanticipated errors. For self detection, displays should be developed that make the consequences of errors immediately apparent. For example, electronic map displays graphically show the consequences of horizontal flight plan entry errors. Vertical profile displays should be developed to make apparent vertical flight planning errors. Other concepts such as energy circles could also help the crew detect gross flight planning errors. For automatic detection, systems should be developed that can track pilot activity, infer pilot intent and inform the crew of potential errors before their consequences are realized. Systems that perform a reasonableness check on flight plan modifications by checking route length and magnitude of course changes are simple examples. Another example would be a system that checked the aircraft's planned altitude against a data base of world terrain elevations. Information is given in viewgraph form.
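
    A toy version of the reasonableness-check idea mentioned above (the thresholds and function are invented for illustration):

        # Flag flight-plan modifications that look implausible; thresholds invented.
        def check_modification(old_len_nm: float, new_len_nm: float,
                               course_change_deg: float) -> list:
            warnings = []
            if new_len_nm > 1.5 * old_len_nm:
                warnings.append(f"route length grew to {new_len_nm / old_len_nm:.0%} "
                                "of the original -- confirm?")
            if abs(course_change_deg) > 90:
                warnings.append(f"course change of {course_change_deg:.0f} deg -- confirm?")
            return warnings

        print(check_modification(410, 980, 135.0))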

  10. STIM1- and Orai1-dependent store-operated calcium entry regulates human myoblast differentiation.

    PubMed

    Darbellay, Basile; Arnaudeau, Serge; König, Stéphane; Jousset, Hélène; Bader, Charles; Demaurex, Nicolas; Bernheim, Laurent

    2009-02-20

    Our previous work on human myoblasts suggested that a hyperpolarization followed by a rise in [Ca(2+)](in) involving store-operated Ca(2+) entry (SOCE) channels induced myoblast differentiation. Advances in the understanding of the SOCE pathway led us to examine more precisely its role in post-natal human myoblast differentiation. We found that SOCE orchestrated by STIM1, the endoplasmic reticulum Ca(2+) sensor activating Orai Ca(2+) channels, is crucial. Silencing STIM1, Orai1, or Orai3 reduced SOCE amplitude and myoblast differentiation, whereas Orai2 knockdown had no effect. Conversely, overexpression of STIM1 with Orai1 increased SOCE and accelerated myoblast differentiation. STIM1 or Orai1 silencing decreased resting [Ca(2+)](in) and intracellular Ca(2+) store content, but correction of these parameters did not rescue myoblast differentiation. Remarkably, SOCE amplitude correlated linearly with the expression of two early markers of myoblast differentiation, MEF2 and myogenin, regardless of the STIM or Orai isoform that was silenced. Unexpectedly, we found that the hyperpolarization also depends on SOCE, placing SOCE upstream of K(+) channel activation in the signaling cascade that controls myoblast differentiation. These findings indicate that STIM1 and Orai1 are key molecules for the induction of human myoblast differentiation.

  11. Surprised at All the Entropy: Hippocampal, Caudate and Midbrain Contributions to Learning from Prediction Errors

    PubMed Central

    Schiffer, Anne-Marike; Ahlheim, Christiane; Wurm, Moritz F.; Schubotz, Ricarda I.

    2012-01-01

    Influential concepts in neuroscientific research cast the brain as a predictive machine that revises its predictions when they are violated by sensory input. This relates to the predictive coding account of perception, but also to learning. Learning from prediction errors has been suggested to take place in the hippocampal memory system as well as in the basal ganglia. The present fMRI study used an action-observation paradigm to investigate the contributions of the hippocampus, caudate nucleus and midbrain dopaminergic system to different types of learning: learning in the absence of prediction errors, learning from prediction errors, and responding to the accumulation of prediction errors in unpredictable stimulus configurations. We conducted analyses of the regions of interest's BOLD responses to these different types of learning, implementing a bootstrapping procedure to correct for false positives. We found both the caudate nucleus and the hippocampus to be activated by perceptual prediction errors. The hippocampal responses seemed to relate to the associative mismatch between a stored representation and current sensory input. Moreover, its response was significantly influenced by the average information, or Shannon entropy, of the stimulus material. In accordance with earlier results, the habenula was activated by perceptual prediction errors. Lastly, we found that the substantia nigra was activated by the novelty of sensory input. In sum, we established that the midbrain dopaminergic system, the hippocampus, and the caudate nucleus were to different degrees significantly involved in the three different types of learning: acquisition of new information, learning from prediction errors and responding to unpredictable stimulus developments. We relate learning from perceptual prediction errors to the concept of predictive coding and related information theoretic accounts. PMID:22570715
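
    Shannon entropy, the "average information" measure the abstract links to hippocampal responses, is easy to state concretely (illustration only, not the study's code):

        # Shannon entropy of a stimulus stream, in bits.
        import math
        from collections import Counter

        def entropy(events):
            counts, total = Counter(events), len(events)
            return -sum((c / total) * math.log2(c / total) for c in counts.values())

        print(entropy("AAAB"))      # predictable stream: ~0.81 bits
        print(entropy("ABCDABCD"))  # four equiprobable events: 2.00 bits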

  12. A New Approach to Standardize Multicenter Studies: Mobile Lab Technology for the German Environmental Specimen Bank

    PubMed Central

    Lermen, Dominik; Schmitt, Daniel; Bartel-Steinbach, Martina; Schröter-Kermani, Christa; Kolossa-Gehring, Marike; von Briesen, Hagen; Zimmermann, Heiko

    2014-01-01

    Technical progress has simplified tasks in lab diagnosis and improved the quality of test results. Errors occurring during the pre-analytical phase have a more negative impact on the quality of test results than errors encountered during the total analytical process. Differences in the infrastructure of sampling sites can strongly influence the quality of samples and thereby of analytical results. Annually, the German Environmental Specimen Bank (ESB) collects, characterizes, and stores blood, plasma, and urine samples of 120–150 volunteers at each of four different sampling sites in Germany. The overarching goal is to investigate the exposure of non-occupationally exposed young adults to environmental pollutants, combining human biomonitoring with questionnaire data. We investigated the requirements of the study and the possibility of realizing a highly standardized sampling procedure on a mobile platform in order to increase the quality of the pre-analytical phase. The results led to the development of a mobile epidemiologic laboratory (epiLab) in the project “Labor der Zukunft” (future’s lab technology). This laboratory includes a 14.7 m2 reception area to record medical history and exposure-relevant behavior, a 21.1 m2 examination room to record dental fillings and for blood withdrawal, and a 15.5 m2 biological safety level 2 laboratory to process and analyze samples on site, including a 2.8 m2 personnel lock and a 3.6 m2 cryofacility to immediately freeze samples. Frozen samples can be transferred to their final destination within the vehicle without breaking the cold chain. To our knowledge, we herewith describe for the first time the implementation of a biological safety level (BSL) 2 lab and an epidemiologic unit on a single mobile platform. Since 2013 we have been collecting up to 15,000 individual human samples annually under highly standardized conditions using the mobile laboratory. Characterized and free of alterations, they are kept ready for retrospective analyses in their final archive, the German ESB. PMID:25141120

  13. A new approach to standardize multicenter studies: mobile lab technology for the German Environmental Specimen Bank.

    PubMed

    Lermen, Dominik; Schmitt, Daniel; Bartel-Steinbach, Martina; Schröter-Kermani, Christa; Kolossa-Gehring, Marike; von Briesen, Hagen; Zimmermann, Heiko

    2014-01-01

    Technical progress has simplified tasks in lab diagnosis and improved the quality of test results. Errors occurring during the pre-analytical phase have a more negative impact on the quality of test results than errors encountered during the total analytical process. Differences in the infrastructure of sampling sites can strongly influence the quality of samples and thereby of analytical results. Annually, the German Environmental Specimen Bank (ESB) collects, characterizes, and stores blood, plasma, and urine samples of 120-150 volunteers at each of four different sampling sites in Germany. The overarching goal is to investigate the exposure of non-occupationally exposed young adults to environmental pollutants, combining human biomonitoring with questionnaire data. We investigated the requirements of the study and the possibility of realizing a highly standardized sampling procedure on a mobile platform in order to increase the quality of the pre-analytical phase. The results led to the development of a mobile epidemiologic laboratory (epiLab) in the project "Labor der Zukunft" (future's lab technology). This laboratory includes a 14.7 m(2) reception area to record medical history and exposure-relevant behavior, a 21.1 m(2) examination room to record dental fillings and for blood withdrawal, and a 15.5 m(2) biological safety level 2 laboratory to process and analyze samples on site, including a 2.8 m(2) personnel lock and a 3.6 m(2) cryofacility to immediately freeze samples. Frozen samples can be transferred to their final destination within the vehicle without breaking the cold chain. To our knowledge, we herewith describe for the first time the implementation of a biological safety level (BSL) 2 lab and an epidemiologic unit on a single mobile platform. Since 2013 we have been collecting up to 15,000 individual human samples annually under highly standardized conditions using the mobile laboratory. Characterized and free of alterations, they are kept ready for retrospective analyses in their final archive, the German ESB.

  14. Vienna Fortran - A Language Specification. Version 1.1

    DTIC Science & Technology

    1992-03-01

    other computer architectures is the fact that the memory is physically distributed among the processors; the time required to access a non-local...datum may be an order of magnitude higher than the time taken to access locally stored data. This has important consequences for program efficiency. In...machine in many aspects. It is tedious, time-consuming and error prone. It has led to particularly slow software development cycles and, in consequence

  15. A circadian rhythm in skill-based errors in aviation maintenance.

    PubMed

    Hobbs, Alan; Williamson, Ann; Van Dongen, Hans P A

    2010-07-01

    In workplaces where activity continues around the clock, human error has been observed to exhibit a circadian rhythm, with a characteristic peak in the early hours of the morning. Errors are commonly distinguished by the nature of the underlying cognitive failure, particularly the level of intentionality involved in the erroneous action. The Skill-Rule-Knowledge (SRK) framework of Rasmussen is used widely in the study of industrial errors and accidents. The SRK framework describes three fundamental types of error, according to whether behavior is under the control of practiced sensori-motor skill routines with minimal conscious awareness; is guided by implicit or explicit rules or expertise; or where the planning of actions requires the conscious application of domain knowledge. Up to now, examinations of circadian patterns of industrial errors have not distinguished between different types of error. Consequently, it is not clear whether all types of error exhibit the same circadian rhythm. A survey was distributed to aircraft maintenance personnel in Australia. Personnel were invited to anonymously report a safety incident and were prompted to describe, in detail, the human involvement (if any) that contributed to it. A total of 402 airline maintenance personnel reported an incident, providing 369 descriptions of human error in which the time of the incident was reported and sufficient detail was available to analyze the error. Errors were categorized using a modified version of the SRK framework, in which errors are classified as skill-based, rule-based, or knowledge-based, or as procedure violations. An independent check confirmed that the SRK framework had been applied with sufficient consistency and reliability. Skill-based errors were the most common form of error, followed by procedure violations, rule-based errors, and knowledge-based errors. The frequency of errors was adjusted for the estimated proportion of workers present at work each hour of the day, and the 24 h pattern of each error type was examined. Skill-based errors exhibited a significant circadian rhythm, being most prevalent in the early hours of the morning. Variation in the frequency of rule-based errors, knowledge-based errors, and procedure violations over the 24 h did not reach statistical significance. The results suggest that during the early hours of the morning, maintenance technicians are at heightened risk of "absent minded" errors involving failures to execute action plans as intended.
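
    The staffing adjustment works as sketched below (all numbers invented): raw hourly error counts are divided by the estimated share of the workforce present in that hour before the 24 h pattern is read off.

        # Staffing-adjusted hourly error frequency; counts and shares invented.
        errors_by_hour = {2: 12, 3: 15, 10: 20, 14: 18}
        staffing_share = {2: 0.10, 3: 0.10, 10: 0.95, 14: 1.00}

        for hour in sorted(errors_by_hour):
            adjusted = errors_by_hour[hour] / staffing_share[hour]
            print(f"{hour:02d}:00  raw={errors_by_hour[hour]:3d}  adjusted={adjusted:6.1f}")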

  16. 49 CFR 229.137 - Sanitation, general requirements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... toilet facility in which human waste falls via gravity to a holding tank where it is stored and...) of this section, that contains and removes human waste by a method that does not conform with the...

  17. 49 CFR 229.137 - Sanitation, general requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... toilet facility in which human waste falls via gravity to a holding tank where it is stored and...) of this section, that contains and removes human waste by a method that does not conform with the...

  18. 49 CFR 229.137 - Sanitation, general requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... toilet facility in which human waste falls via gravity to a holding tank where it is stored and...) of this section, that contains and removes human waste by a method that does not conform with the...

  19. 49 CFR 229.137 - Sanitation, general requirements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... toilet facility in which human waste falls via gravity to a holding tank where it is stored and...) of this section, that contains and removes human waste by a method that does not conform with the...

  20. 49 CFR 229.137 - Sanitation, general requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... toilet facility in which human waste falls via gravity to a holding tank where it is stored and...) of this section, that contains and removes human waste by a method that does not conform with the...

  1. Food insecurity as a driver of obesity in humans: The insurance hypothesis

    PubMed Central

    Nettle, Daniel; Andrews, Clare; Bateson, Melissa

    2016-01-01

    Short abstract: Common sense says that obesity is the consequence of too much food. Adaptive reasoning says something rather different: individuals should store fat when access to food is insecure, to buffer themselves against future shortfall. Applied to humans, this principle suggests that food insecurity should be a risk factor for overweight and obesity. We provide a meta-analysis of the extensive epidemiological literature, finding that food insecurity robustly predicts high body weight, but only amongst women in high-income countries. We discuss the relevance of food insecurity to understanding the global obesity problem.

    Long abstract: Integrative explanations of why obesity is more prevalent in some sectors of the human population than others are lacking. Here, we outline and evaluate one candidate explanation, the insurance hypothesis (IH). The IH is rooted in adaptive evolutionary thinking: the function of storing fat is to provide a buffer against shortfall in the food supply. Thus, individuals should store more fat when they receive cues that access to food is uncertain. Applied to humans, this implies that an important proximate driver of obesity should be food insecurity rather than food abundance per se. We integrate several distinct lines of theory and evidence that bear on this hypothesis. We present a theoretical model that shows it is optimal to store more fat when food access is uncertain, and we review the experimental literature from non-human animals showing that fat reserves increase when access to food is restricted. We provide a meta-analysis of 125 epidemiological studies of the association between perceived food insecurity and high body weight in humans. There is a robust positive association, but it is restricted to adult women in high-income countries. We explore why this could be in light of the IH and our theoretical model. We conclude that whilst the IH alone cannot explain the distribution of obesity in the human population, it may represent a very important component of a pluralistic explanation. We also discuss insights it may offer into the developmental origins of obesity, dieting-induced weight gain, and Anorexia Nervosa. PMID:27464638

  2. Uncertainty in georeferencing current and historic plant locations

    USGS Publications Warehouse

    McEachern, K.; Niessen, K.

    2009-01-01

    With shrinking habitats, weed invasions, and climate change, repeated surveys are becoming increasingly important for rare plant conservation and ecological restoration. We often need to relocate historical sites or provide locations for newly restored sites. Georeferencing is the technique of giving geographic coordinates to the location of a site. Georeferencing has been done historically using verbal descriptions or field maps that accompany voucher collections. New digital technology gives us more exact techniques for mapping and storing location information. Error still exists, however, and even georeferenced locations can be uncertain, especially if error information is not included with the observation. We review the concept of uncertainty in georeferencing and compare several institutional database systems for cataloging error and uncertainty with georeferenced locations. These concepts are widely discussed among geographers, but ecologists and restorationists need to become more aware of issues related to uncertainty to improve our use of spatial information in field studies. © 2009 by the Board of Regents of the University of Wisconsin System.

  3. Analysis of Compression Algorithm in Ground Collision Avoidance Systems (Auto-GCAS)

    NASA Technical Reports Server (NTRS)

    Schmalz, Tyler; Ryan, Jack

    2011-01-01

    Automatic Ground Collision Avoidance Systems (Auto-GCAS) utilize Digital Terrain Elevation Data (DTED) stored onboard a plane to determine potential recovery maneuvers. Because of the current limitations of computer hardware on military airplanes such as the F-22 and F-35, the DTED must be compressed through a lossy technique called binary-tree tip-tilt. The purpose of this study is to determine the accuracy of the compressed data with respect to the original DTED. This study is mainly interested in the magnitude of the error between the two as well as the overall distribution of the errors throughout the DTED. By understanding how the errors of the compression technique are affected by various factors (topography, density of sampling points, sub-sampling techniques, etc.), modifications can be made to the compression technique resulting in better accuracy. This, in turn, would minimize unnecessary activation of Auto-GCAS during flight as well as maximizing its contribution to fighter safety.
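
    The error analysis itself reduces to comparing two elevation grids; a sketch with stand-in arrays (the tip-tilt codec is not reproduced here):

        # Magnitude and distribution of compression error between two DTED grids.
        import numpy as np

        rng = np.random.default_rng(0)
        original = rng.normal(1500, 300, size=(256, 256))        # stand-in terrain (m)
        reconstructed = original + rng.normal(0, 8, size=original.shape)

        err = reconstructed - original
        print(f"max |error| : {np.abs(err).max():6.2f} m")
        print(f"RMS error   : {np.sqrt((err ** 2).mean()):6.2f} m")
        print(f"within 10 m : {(np.abs(err) <= 10).mean():6.2%}")  # error distribution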

  4. Controlled English to facilitate human/machine analytical processing

    NASA Astrophysics Data System (ADS)

    Braines, Dave; Mott, David; Laws, Simon; de Mel, Geeth; Pham, Tien

    2013-06-01

    Controlled English is a human-readable information representation format that is implemented using a restricted subset of the English language, but which is unambiguous and directly accessible by simple machine processes. We have been researching the capabilities of CE in a number of contexts, and exploring the degree to which a flexible and more human-friendly information representation format could aid the intelligence analyst in a multi-agent collaborative operational environment; especially in cases where the agents are a mixture of other human users and machine processes aimed at assisting the human users. CE itself is built upon a formal logic basis, but allows users to easily specify models for a domain of interest in a human-friendly language. In our research we have been developing an experimental component known as the "CE Store" in which CE information can be quickly and flexibly processed and shared between human and machine agents. The CE Store environment contains a number of specialized machine agents for common processing tasks and also supports execution of logical inference rules that can be defined in the same CE language. This paper outlines the basic architecture of this approach, discusses some of the example machine agents that have been developed, and provides some typical examples of the CE language and the way in which it has been used to support complex analytical tasks on synthetic data sources. We highlight the fusion of human and machine processing supported through the use of the CE language and CE Store environment, and show this environment with examples of highly dynamic extensions to the model(s) and integration between different user-defined models in a collaborative setting.

  5. Spatial durbin error model for human development index in Province of Central Java.

    NASA Astrophysics Data System (ADS)

    Septiawan, A. R.; Handajani, S. S.; Martini, T. S.

    2018-05-01

    The Human Development Index (HDI) is an indicator used to measure success in building the quality of human life, reflecting how people access development outcomes in terms of income, health, and education. HDI in Central Java has improved each year; in 2016 it was 69.98%, an increase of 0.49% over the previous year. The objective of this study was to apply the spatial Durbin error model, using queen-contiguity spatial weights, to model HDI in Central Java Province. The spatial Durbin error model is used because it accounts both for spatially dependent errors and for spatial dependency in the independent variables. The factors used are life expectancy, mean years of schooling, expected years of schooling, and purchasing power parity. Based on the results, we obtain a spatial Durbin error model for HDI in Central Java in which the influencing factors are life expectancy, mean years of schooling, expected years of schooling, and purchasing power parity.
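
    For readers unfamiliar with the model class, the standard textbook form of the spatial Durbin error model is shown below; the notation follows the common econometric convention and is not necessarily the paper's own.

    ```latex
    % Spatial Durbin error model (SDEM), standard form:
    % y: vector of HDI values; X: covariates (life expectancy, schooling, PPP);
    % W: queen-contiguity spatial weights matrix.
    \begin{align}
      y &= X\beta + W X\theta + u, \\
      u &= \lambda W u + \varepsilon, \qquad \varepsilon \sim N(0, \sigma^2 I_n).
    \end{align}
    ```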

  6. Hierarchical learning induces two simultaneous, but separable, prediction errors in human basal ganglia.

    PubMed

    Diuk, Carlos; Tsai, Karin; Wallis, Jonathan; Botvinick, Matthew; Niv, Yael

    2013-03-27

    Studies suggest that dopaminergic neurons report a unitary, global reward prediction error signal. However, learning in complex real-life tasks, in particular tasks that show hierarchical structure, requires multiple prediction errors that may coincide in time. We used functional neuroimaging to measure prediction error signals in humans performing such a hierarchical task involving simultaneous, uncorrelated prediction errors. Analysis of signals in a priori anatomical regions of interest in the ventral striatum and the ventral tegmental area indeed evidenced two simultaneous, but separable, prediction error signals corresponding to the two levels of hierarchy in the task. This result suggests that suitably designed tasks may reveal a more intricate pattern of firing in dopaminergic neurons. Moreover, the need for downstream separation of these signals implies possible limitations on the number of different task levels that we can learn about simultaneously.

  7. Mental representation of symbols as revealed by vocabulary errors in two bonobos (Pan paniscus).

    PubMed

    Lyn, Heidi

    2007-10-01

    Error analysis has been used in humans to detect implicit representations and categories in language use. The present study utilizes the same technique to report on mental representations and categories in symbol use by two bonobos (Pan paniscus). These bonobos have been shown in published reports to comprehend English at the level of a two-and-a-half-year-old child and to use a keyboard with over 200 visuographic symbols (lexigrams). In this study, vocabulary test errors from over 10 years of data revealed auditory, visual, and spatio-temporal generalizations (errors were more likely to be items that looked like, sounded like, or were frequently associated with the sample item in space or in time), as well as hierarchical and conceptual categorizations. These error data, like those of humans, are a result of spontaneous responding rather than specific training and do not solely depend upon the sample mode (e.g. auditory similarity errors are not universally more frequent with an English sample, nor were visual similarity errors universally more frequent with a photograph sample). However, unlike humans, these bonobos do not make errors based on syntactical confusions (e.g. confusing semantically unrelated nouns), suggesting that they may not separate syntactical and semantic information. These data suggest that apes spontaneously create a complex, hierarchical web of representations when exposed to a symbol system.

  8. 78 FR 11237 - Public Hearing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-15

    ... management of human error in its operations and system safety programs, and the status of PTC implementation... UP's safety management policies and programs associated with human error, operational accident and... Chairman of the Board of Inquiry 2. Introduction of the Board of Inquiry and Technical Panel 3...

  9. Intravenous Chemotherapy Compounding Errors in a Follow-Up Pan-Canadian Observational Study.

    PubMed

    Gilbert, Rachel E; Kozak, Melissa C; Dobish, Roxanne B; Bourrier, Venetia C; Koke, Paul M; Kukreti, Vishal; Logan, Heather A; Easty, Anthony C; Trbovich, Patricia L

    2018-05-01

    Intravenous (IV) compounding safety has garnered recent attention as a result of high-profile incidents, awareness efforts from the safety community, and increasingly stringent practice standards. New research with more-sensitive error detection techniques continues to reinforce that error rates with manual IV compounding are unacceptably high. In 2014, our team published an observational study that described three types of previously unrecognized and potentially catastrophic latent chemotherapy preparation errors in Canadian oncology pharmacies that would otherwise be undetectable. We expand on this research and explore whether additional potential human failures are yet to be addressed by practice standards. Field observations were conducted in four cancer center pharmacies in four Canadian provinces from January 2013 to February 2015. Human factors specialists observed and interviewed pharmacy managers, oncology pharmacists, pharmacy technicians, and pharmacy assistants as they carried out their work. Emphasis was on latent errors (potential human failures) that could lead to outcomes such as wrong drug, dose, or diluent. Given the relatively short observational period, no active failures or actual errors were observed. However, 11 latent errors in chemotherapy compounding were identified. In terms of severity, all 11 errors create the potential for a patient to receive the wrong drug or dose, which in the context of cancer care, could lead to death or permanent loss of function. Three of the 11 practices were observed in our previous study, but eight were new. Applicable Canadian and international standards and guidelines do not explicitly address many of the potentially error-prone practices observed. We observed a significant degree of risk for error in manual mixing practice. These latent errors may exist in other regions where manual compounding of IV chemotherapy takes place. Continued efforts to advance standards, guidelines, technological innovation, and chemical quality testing are needed.

  10. Poisoning following exposure to chemicals stored in mislabelled or unlabelled containers: a recipe for potential disaster.

    PubMed

    Millard, Yvette C; Slaughter, Robin J; Shieffelbien, Lucy M; Schep, Leo J

    2014-09-26

    To investigate poisoning exposures to chemicals that were unlabelled, mislabelled or not in their original containers in New Zealand over the last 10 years, based on calls to the New Zealand National Poisons Centre (NZNPC). Call data from the NZNPC between 2003 and 2012 were analysed retrospectively. Parameters reviewed included patient age, route and site of exposure, product classification and recommended intervention. Of the 324,411 calls received between 2003 and 2012, 100,465 calls were associated with acute human exposure to chemicals. There were 757 inquiries related to human exposure to mislabelled or unlabelled chemicals, comprising 0.75% of chemical exposures. Adults were involved in 51% of incidents, children under 5 years in 32%, children aged 5-10 years in 10%, and adolescents in 5%. Child exploratory behaviour was responsible for 38% of calls and adult unintentional exposures for 61%. Medical attention was advised in 26% of calls. Inadvertent exposure to toxic products stored in unlabelled or mislabelled containers is a problem for all age groups. Although it represents a small proportion of total calls to the NZNPC, it remains a potential risk for serious poisoning. It is important that chemicals are stored securely, in their original containers, and never in drinking vessels.

  11. Spacecraft and propulsion technician error

    NASA Astrophysics Data System (ADS)

    Schultz, Daniel Clyde

    Commercial aviation and commercial space similarly launch, fly, and land passenger vehicles. Unlike aviation, the U.S. government has not established maintenance policies for commercial space. This study conducted a mixed methods review of 610 U.S. space launches from 1984 through 2011, which included 31 failures. An analysis of the failure causal factors showed that human error accounted for 76% of those failures, which included workmanship error accounting for 29% of the failures. With the imminent future of commercial space travel, the increased potential for the loss of human life demands that changes be made to the standardized procedures, training, and certification to reduce human error and failure rates. Several recommendations were made by this study to the FAA's Office of Commercial Space Transportation, space launch vehicle operators, and maintenance technician schools in an effort to increase the safety of the space transportation passengers.

  12. Continuous H/V spectral ratio analysis of ambient noise: a necessity to understand microzonation results obtained by mobile stations

    NASA Astrophysics Data System (ADS)

    Van Noten, Koen; Lecocq, Thomas

    2016-04-01

    Estimating the resonance frequency (f0) and amplification factor of unconsolidated sediments by H/V spectral ratio (HVSR) analysis of seismic ambient noise has been widely used since Nakamura's proposal in 1989. To measure f0 properly, Nakamura suggested performing microzonation surveys at night, when artificial microtremor is small and does not fully disrupt the ambient seismic noise. As nightly fieldwork is not always a reasonable demand, we propose an alternative workflow of Nakamura's technique to improve the quality of HVSR results obtained by ambient noise measurements of mobile stations during the day. This new workflow includes the automated H/V calculation of continuous seismic data from a stationary or permanent station installed near the microzonation site for as long as the survey lasts, in order to control the error in the HVSR analysis obtained by the mobile stations. In this presentation, we apply this workflow to one year of seismic data at two different case studies: a rural site with a shallow bedrock depth of 30 m, and an urban site (Brussels, capital of Belgium, bedrock depth of 110 m) where human activity is continuous 24 h/day. By means of an automated Python script, the fundamental peak frequency and the H/V amplitude are automatically picked from H/V spectra calculated from 50%-overlapping, 30-minute windows throughout the whole year. Afterwards, the f0 and amplitude picks are averaged per hour and per day for the whole year. In both case studies, the H/V amplitude and the fundamental frequencies vary considerably, with up to ~15% difference between daily and nightly measurements. As bedrock depth is known from boreholes at both sites, we conclude that the nightly picked f0 is the true one. Our results thus suggest that changes in the determined f0 and H/V amplitude are dominantly caused by the human behaviour stored in the ambient seismic noise (e.g. later onset of traffic in a weekend, quiet Sundays, differences between daily and nightly activity, ...). Consequently, performing a continuous HVSR analysis next to a microzonation site allows characterising the deviation of the measured f0 from the true f0 during the period of investigation (in our case, the whole year). Because mobile stations are affected by the same variations stored in the ambient noise, a correction factor can be applied to the calculated f0 of individual measurements during the microzonation survey, and a proper Vs can be estimated. Based on these results, we recommend that microzonation with mobile stations always be accompanied by a stationary seismic station to characterise the ambient noise and control the error.
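
    A minimal sketch of a classical H/V computation for a single noise window, assuming synthetic placeholder data; a production workflow like the one described above would add window selection, Konno-Ohmachi smoothing, and averaging of picks per hour and per day.

    ```python
    import numpy as np

    def hv_ratio(z, n, e, fs, smooth_bins=11):
        """H/V spectral ratio for one three-component ambient noise window."""
        taper = np.hanning(len(z))
        Z = np.abs(np.fft.rfft(z * taper))           # vertical amplitude spectrum
        N = np.abs(np.fft.rfft(n * taper))           # north-south
        E = np.abs(np.fft.rfft(e * taper))           # east-west
        H = np.sqrt(N * E)                           # geometric mean of horizontals
        kernel = np.ones(smooth_bins) / smooth_bins  # crude boxcar stands in for
        H = np.convolve(H, kernel, mode="same")      # Konno-Ohmachi smoothing
        Z = np.convolve(Z, kernel, mode="same")
        freqs = np.fft.rfftfreq(len(z), d=1.0 / fs)
        return freqs, H / np.maximum(Z, 1e-12)

    # Hypothetical usage on one 30-minute window of placeholder noise;
    # f0 is picked as the frequency of the H/V peak in a plausible band.
    fs = 100.0                                       # Hz, assumed sampling rate
    t = np.arange(int(30 * 60 * fs)) / fs
    rng = np.random.default_rng(1)
    z, n, e = (rng.normal(size=t.size) for _ in range(3))
    freqs, hv = hv_ratio(z, n, e, fs)
    band = (freqs > 0.2) & (freqs < 20.0)
    f0 = freqs[band][np.argmax(hv[band])]
    print(f"picked f0 = {f0:.2f} Hz")
    ```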

  13. 42 CFR 1005.23 - Harmless error.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 5 2012-10-01 2012-10-01 false Harmless error. 1005.23 Section 1005.23 Public Health OFFICE OF INSPECTOR GENERAL-HEALTH CARE, DEPARTMENT OF HEALTH AND HUMAN SERVICES OIG AUTHORITIES APPEALS OF EXCLUSIONS, CIVIL MONEY PENALTIES AND ASSESSMENTS § 1005.23 Harmless error. No error in either...

  14. 42 CFR 1005.23 - Harmless error.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 5 2014-10-01 2014-10-01 false Harmless error. 1005.23 Section 1005.23 Public Health OFFICE OF INSPECTOR GENERAL-HEALTH CARE, DEPARTMENT OF HEALTH AND HUMAN SERVICES OIG AUTHORITIES APPEALS OF EXCLUSIONS, CIVIL MONEY PENALTIES AND ASSESSMENTS § 1005.23 Harmless error. No error in either...

  15. 42 CFR 1005.23 - Harmless error.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Harmless error. 1005.23 Section 1005.23 Public Health OFFICE OF INSPECTOR GENERAL-HEALTH CARE, DEPARTMENT OF HEALTH AND HUMAN SERVICES OIG AUTHORITIES APPEALS OF EXCLUSIONS, CIVIL MONEY PENALTIES AND ASSESSMENTS § 1005.23 Harmless error. No error in either...

  16. 42 CFR 1005.23 - Harmless error.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 5 2013-10-01 2013-10-01 false Harmless error. 1005.23 Section 1005.23 Public Health OFFICE OF INSPECTOR GENERAL-HEALTH CARE, DEPARTMENT OF HEALTH AND HUMAN SERVICES OIG AUTHORITIES APPEALS OF EXCLUSIONS, CIVIL MONEY PENALTIES AND ASSESSMENTS § 1005.23 Harmless error. No error in either...

  17. 42 CFR 1005.23 - Harmless error.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Harmless error. 1005.23 Section 1005.23 Public Health OFFICE OF INSPECTOR GENERAL-HEALTH CARE, DEPARTMENT OF HEALTH AND HUMAN SERVICES OIG AUTHORITIES APPEALS OF EXCLUSIONS, CIVIL MONEY PENALTIES AND ASSESSMENTS § 1005.23 Harmless error. No error in either...

  18. 42 CFR 3.552 - Harmless error.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Harmless error. 3.552 Section 3.552 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL PROVISIONS PATIENT SAFETY ORGANIZATIONS AND PATIENT SAFETY WORK PRODUCT Enforcement Program § 3.552 Harmless error. No error in either the...

  19. 45 CFR 98.100 - Error Rate Report.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 1 2013-10-01 2013-10-01 false Error Rate Report. 98.100 Section 98.100 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND Error Rate Reporting § 98.100 Error Rate Report. (a) Applicability—The requirements of this subpart...

  20. 45 CFR 98.100 - Error Rate Report.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 1 2014-10-01 2014-10-01 false Error Rate Report. 98.100 Section 98.100 Public Welfare Department of Health and Human Services GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND Error Rate Reporting § 98.100 Error Rate Report. (a) Applicability—The requirements of this subpart...

  1. 45 CFR 98.100 - Error Rate Report.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 45 Public Welfare 1 2012-10-01 2012-10-01 false Error Rate Report. 98.100 Section 98.100 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND Error Rate Reporting § 98.100 Error Rate Report. (a) Applicability—The requirements of this subpart...

  2. 45 CFR 98.100 - Error Rate Report.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 1 2011-10-01 2011-10-01 false Error Rate Report. 98.100 Section 98.100 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND Error Rate Reporting § 98.100 Error Rate Report. (a) Applicability—The requirements of this subpart...

  3. Defense Mapping Agency (DMA) Raster-to-Vector Analysis

    DTIC Science & Technology

    1984-11-30

    model) to pinpoint critical deficiencies and understand trade-offs between alternative solutions. This may be exemplified by the allocation of human ...process, prone to errors (i.e., human operator eye/motor control limitations), and its time-consuming nature (as a function of data density). It should...achieved through the facilities of computer interactive graphics. Each error or anomaly is individually identified by a human operator and corrected

  4. Parallel structures in human and computer memory

    NASA Astrophysics Data System (ADS)

    Kanerva, Pentti

    1986-08-01

    If we think of our experiences as being recorded continuously on film, then human memory can be compared to a film library that is indexed by the contents of the film strips stored in it. Moreover, approximate retrieval cues suffice to retrieve information stored in this library: We recognize a familiar person in a fuzzy photograph or a familiar tune played on a strange instrument. This paper is about how to construct a computer memory that would allow a computer to recognize patterns and to recall sequences the way humans do. Such a memory is remarkably similar in structure to a conventional computer memory and also to the neural circuits in the cortex of the cerebellum of the human brain. The paper concludes that the frame problem of artificial intelligence could be solved by the use of such a memory if we were able to encode information about the world properly.
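
    The memory Kanerva describes here is in the spirit of his sparse distributed memory. The toy sketch below, with invented sizes and thresholds, shows only its bare mechanics: a pattern written to the counters of many nearby "hard locations" can still be recalled from an approximate (noisy) cue.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    DIM, LOCATIONS, RADIUS = 256, 2000, 112   # bit width, hard locations, activation radius

    addresses = rng.integers(0, 2, size=(LOCATIONS, DIM))  # fixed random hard locations
    counters = np.zeros((LOCATIONS, DIM))                  # bipolar counters per location

    def nearby(x):
        # Activate every hard location within Hamming distance RADIUS of the cue.
        return (addresses != x).sum(axis=1) <= RADIUS

    def write(addr, data):
        counters[nearby(addr)] += 2 * data - 1   # store bits as +/-1 increments

    def read(addr):
        s = counters[nearby(addr)].sum(axis=0)
        return (s > 0).astype(int)               # majority vote over activated locations

    pattern = rng.integers(0, 2, size=DIM)
    write(pattern, pattern)                      # autoassociative store

    noisy = pattern.copy()
    flip = rng.choice(DIM, size=20, replace=False)
    noisy[flip] ^= 1                             # corrupt 20 of 256 bits in the cue
    recovered = read(noisy)
    print("bits recovered correctly:", (recovered == pattern).sum(), "/", DIM)
    ```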

  5. Parallel structures in human and computer memory

    NASA Technical Reports Server (NTRS)

    Kanerva, P.

    1986-01-01

    If one thinks of our experiences as being recorded continuously on film, then human memory can be compared to a film library that is indexed by the contents of the film strips stored in it. Moreover, approximate retrieval cues suffice to retrieve information stored in this library. One recognizes a familiar person in a fuzzy photograph or a familiar tune played on a strange instrument. A computer memory that would allow a computer to recognize patterns and to recall sequences the way humans do is constructed. Such a memory is remarkably similar in structure to a conventional computer memory and also to the neural circuits in the cortex of the cerebellum of the human brain. It is concluded that the frame problem of artificial intelligence could be solved by the use of such a memory if one were able to encode information about the world properly.

  6. A systematic review of patient medication error on self-administering medication at home.

    PubMed

    Mira, José Joaquín; Lorenzo, Susana; Guilabert, Mercedes; Navarro, Isabel; Pérez-Jover, Virtudes

    2015-06-01

    Medication errors have been analyzed as a health professionals' responsibility (due to mistakes in prescription, preparation or dispensing). However, sometimes patients themselves (or their caregivers) make mistakes in the administration of the medication. The epidemiology of patient medication errors (PEs) has been scarcely reviewed in spite of its impact on people, on therapeutic effectiveness and on incremental cost for the health systems. This study reviews and describes the methodological approaches and results of published studies on the frequency, causes and consequences of medication errors committed by patients at home. A review of research articles published between 1990 and 2014 was carried out using MEDLINE, Web-of-Knowledge, Scopus, Tripdatabase and Index Medicus. The frequency of PEs ranged from 19 to 59%. The elderly and preschool populations accounted for more mistakes than other groups. The most common were: incorrect dosage, forgetting, mixing up medications, failing to recall indications and taking out-of-date or inappropriately stored drugs. The majority of these mistakes have no negative consequences. Health literacy, information and communication, and complexity of use of dispensing devices were identified as causes of PEs. Apps and other new technologies offer several opportunities for improving drug safety.

  7. How we learn to make decisions: rapid propagation of reinforcement learning prediction errors in humans.

    PubMed

    Krigolson, Olav E; Hassall, Cameron D; Handy, Todd C

    2014-03-01

    Our ability to make decisions is predicated upon our knowledge of the outcomes of the actions available to us. Reinforcement learning theory posits that actions followed by a reward or punishment acquire value through the computation of prediction errors: discrepancies between the predicted and the actual reward. A multitude of neuroimaging studies have demonstrated that rewards and punishments evoke neural responses that appear to reflect reinforcement learning prediction errors [e.g., Krigolson, O. E., Pierce, L. J., Holroyd, C. B., & Tanaka, J. W. Learning to become an expert: Reinforcement learning and the acquisition of perceptual expertise. Journal of Cognitive Neuroscience, 21, 1833-1840, 2009; Bayer, H. M., & Glimcher, P. W. Midbrain dopamine neurons encode a quantitative reward prediction error signal. Neuron, 47, 129-141, 2005; O'Doherty, J. P. Reward representations and reward-related learning in the human brain: Insights from neuroimaging. Current Opinion in Neurobiology, 14, 769-776, 2004; Holroyd, C. B., & Coles, M. G. H. The neural basis of human error processing: Reinforcement learning, dopamine, and the error-related negativity. Psychological Review, 109, 679-709, 2002]. Here, we used the event-related brain potential (ERP) technique to demonstrate not only that rewards elicit a neural response akin to a prediction error but also that, with learning, this signal rapidly diminishes and propagates to the time of choice presentation. Specifically, in a simple, learnable gambling task, we show that novel rewards elicited a feedback error-related negativity that rapidly decreased in amplitude with learning. Furthermore, we demonstrate the existence of a reward positivity at choice presentation, a previously unreported ERP component with timing and topography similar to the feedback error-related negativity, which increased in amplitude with learning. The pattern of results we observed mirrored the output of a computational model that we implemented to compute reward prediction errors and the changes in amplitude of these prediction errors at the time of choice presentation and reward delivery. Our results provide further evidence that the computations underlying human learning and decision-making follow reinforcement learning principles.
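
    The computational core the authors refer to is a reward prediction error. A minimal Rescorla-Wagner-style sketch, with invented task parameters, shows how the feedback-time error shrinks as the learned value grows, which is the behavior the ERP amplitudes mirrored.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    alpha = 0.2          # learning rate (assumed)
    p_reward = 0.8       # payout probability of the chosen option (assumed)
    V = 0.0              # learned value of the chosen option

    for trial in range(1, 201):
        r = float(rng.random() < p_reward)   # feedback: 1 = reward, 0 = no reward
        delta = r - V                        # prediction error at feedback time
        V += alpha * delta                   # Rescorla-Wagner value update
        if trial in (1, 10, 50, 200):
            # Early on |delta| is large at feedback; as V rises with learning,
            # the feedback-time error shrinks while the value carried by the
            # choice/cue grows, mirroring the ERP pattern described above.
            print(f"trial {trial:3d}: V = {V:.2f}, last delta = {delta:+.2f}")
    ```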

  8. Causal Evidence from Humans for the Role of Mediodorsal Nucleus of the Thalamus in Working Memory.

    PubMed

    Peräkylä, Jari; Sun, Lihua; Lehtimäki, Kai; Peltola, Jukka; Öhman, Juha; Möttönen, Timo; Ogawa, Keith H; Hartikainen, Kaisa M

    2017-12-01

    The mediodorsal nucleus of the thalamus (MD), with its extensive connections to the lateral pFC, has been implicated in human working memory and executive functions. However, this understanding is based solely on indirect evidence from human lesion and imaging studies and animal studies. Direct, causal evidence from humans is missing. To obtain direct evidence for MD's role in humans, we studied patients treated with deep brain stimulation (DBS) for refractory epilepsy. This treatment is thought to prevent the generalization of a seizure by disrupting the functioning of the patient's anterior nuclei of the thalamus (ANT) with high-frequency electric stimulation. This structure is located superior and anterior to MD, and when the DBS lead is implanted in ANT, tip contacts of the lead typically penetrate through ANT into the adjoining MD. To study the role of MD in human executive functions and working memory, we periodically disrupted and recovered MD's function with high-frequency electric stimulation using DBS contacts reaching MD while participants performed a cognitive task engaging several aspects of executive functions. We hypothesized that the efficacy of executive functions, specifically working memory, is impaired when the functioning of MD is perturbed by high-frequency stimulation. Eight participants treated with ANT-DBS for refractory epilepsy performed a computer-based test of executive functions while DBS was repeatedly switched ON and OFF at MD and at the control location (ANT). In comparison to stimulation of the control location, when MD was stimulated, participants committed 2.26 times more errors in general (total errors; OR = 2.26, 95% CI [1.69, 3.01]) and 2.88 times more working memory-related errors specifically (incorrect button presses; OR = 2.88, CI [1.95, 4.24]). Similarly, participants committed 1.81 times more errors in general (OR = 1.81, CI [1.45, 2.24]) and 2.08 times more working memory-related errors (OR = 2.08, CI [1.57, 2.75]) in comparison to the no-stimulation condition. "Total errors" is a composite score consisting of basic error types and was mostly driven by working memory-related errors. The facts that MD and the control location, ANT, are only a few millimeters apart and that their stimulation produces very different results highlight a location-specific effect of DBS rather than a regionally unspecific general effect. In conclusion, disrupting and recovering MD's function with high-frequency electric stimulation modulated participants' online working memory performance, providing causal, in vivo evidence from humans for the role of MD in human working memory.

  9. A Method for the Study of Human Factors in Aircraft Operations

    NASA Technical Reports Server (NTRS)

    Barnhart, W.; Billings, C.; Cooper, G.; Gilstrap, R.; Lauber, J.; Orlady, H.; Puskas, B.; Stephens, W.

    1975-01-01

    A method for the study of human factors in the aviation environment is described. A conceptual framework is provided within which pilot and other human errors in aircraft operations may be studied with the intent of finding out how, and why, they occurred. An information processing model of human behavior serves as the basis for the acquisition and interpretation of information relating to occurrences which involve human error. A systematic method of collecting such data is presented and discussed. The classification of the data is outlined.

  10. An interactive framework for acquiring vision models of 3-D objects from 2-D images.

    PubMed

    Motai, Yuichi; Kak, Avinash

    2004-02-01

    This paper presents a human-computer interaction (HCI) framework for building vision models of three-dimensional (3-D) objects from their two-dimensional (2-D) images. Our framework is based on two guiding principles of HCI: 1) provide the human with as much visual assistance as possible to help the human make a correct input; and 2) verify each input provided by the human for its consistency with the inputs previously provided. For example, when stereo correspondence information is elicited from a human, his/her job is facilitated by superimposing epipolar lines on the images. Although that reduces the possibility of error in the human-marked correspondences, such errors are not entirely eliminated because there can be multiple candidate points close together for complex objects. For another example, when pose-to-pose correspondence is sought from a human, his/her job is made easier by allowing the human to rotate the partial model constructed in the previous pose in relation to the partial model for the current pose. While this facility reduces the incidence of human-supplied pose-to-pose correspondence errors, such errors cannot be eliminated entirely because of confusion created when multiple candidate features exist close together. Each input provided by the human is therefore checked against the previous inputs by invoking situation-specific constraints. Different types of constraints (and different human-computer interaction protocols) are needed for the extraction of polygonal features and for the extraction of curved features. We will show results on both polygonal objects and objects containing curved features.

  11. Experimental Demonstration of Fault-Tolerant State Preparation with Superconducting Qubits.

    PubMed

    Takita, Maika; Cross, Andrew W; Córcoles, A D; Chow, Jerry M; Gambetta, Jay M

    2017-11-03

    Robust quantum computation requires encoding delicate quantum information into degrees of freedom that are hard for the environment to change. Quantum encodings have been demonstrated in many physical systems by observing and correcting storage errors, but applications require not just storing information; we must accurately compute even with faulty operations. The theory of fault-tolerant quantum computing illuminates a way forward by providing a foundation and collection of techniques for limiting the spread of errors. Here we implement one of the smallest quantum codes in a five-qubit superconducting transmon device and demonstrate fault-tolerant state preparation. We characterize the resulting code words through quantum process tomography and study the free evolution of the logical observables. Our results are consistent with fault-tolerant state preparation in a protected qubit subspace.

  12. The NEEDS Data Base Management and Archival Mass Memory System

    NASA Technical Reports Server (NTRS)

    Bailey, G. A.; Bryant, S. B.; Thomas, D. T.; Wagnon, F. W.

    1980-01-01

    A Data Base Management System and an Archival Mass Memory System are being developed that will have a 10^12-bit on-line and a 10^13-bit off-line storage capacity. The integrated system will accept packetized data from the data staging area at 50 Mbps, create a comprehensive directory, provide for file management, record the data, perform error detection and correction, accept user requests, retrieve the requested data files and provide the data to multiple users at a combined rate of 50 Mbps. Stored and replicated data files will have a bit error rate of less than 10^-9 even after ten years of storage. The integrated system will be demonstrated to prove the technology late in 1981.

  13. New methodology to reconstruct in 2-D the cuspal enamel of modern human lower molars.

    PubMed

    Modesto-Mata, Mario; García-Campos, Cecilia; Martín-Francés, Laura; Martínez de Pinillos, Marina; García-González, Rebeca; Quintino, Yuliet; Canals, Antoni; Lozano, Marina; Dean, M Christopher; Martinón-Torres, María; Bermúdez de Castro, José María

    2017-08-01

    In recent years, different methodologies have been developed to reconstruct worn teeth. In this article, we propose a new 2-D methodology to reconstruct the worn enamel of lower molars. Our main goals are to reconstruct molars with a high level of accuracy when measuring relevant histological variables and to validate the methodology by calculating the errors associated with the measurements. The methodology is based on polynomial regression equations and has been validated using two different dental variables: cuspal enamel thickness and crown height of the protoconid. In order to perform the validation process, simulated worn modern human molars were employed. The errors associated with the measurements were also estimated by applying methodologies previously proposed by other authors. The mean percentage error estimated in reconstructed molars for these two variables, in comparison with their real values, is -2.17% for the cuspal enamel thickness of the protoconid and -3.18% for the crown height of the protoconid. This significantly improves on the results of other methodologies, both in interobserver error and in the accuracy of the measurements. The new methodology based on polynomial regressions can be confidently applied to the reconstruction of cuspal enamel of lower molars, as it improves the accuracy of the measurements and reduces the interobserver error. The present study shows that it is important to validate all methodologies in order to know their associated errors. The new methodology can easily be exported to other modern human populations, the human fossil record, and forensic sciences. © 2017 Wiley Periodicals, Inc.
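
    A toy sketch of the underlying idea, with an invented profile: fit a polynomial to the preserved (unworn) enamel outline, extrapolate it across the simulated wear, and report the percentage error of a derived measurement. The paper's actual regression equations and variables are not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    x = np.linspace(0.0, 10.0, 60)                        # mm along the crown profile
    true_profile = 4.0 + 1.2 * x - 0.11 * x ** 2          # hypothetical unworn outline (mm)
    observed = true_profile + rng.normal(0.0, 0.05, x.size)  # measured with small noise

    worn = x > 6.0                                        # pretend the cusp tip is worn away
    coeffs = np.polyfit(x[~worn], observed[~worn], deg=2)    # fit the preserved outline
    reconstructed = np.polyval(coeffs, x)                 # extrapolate across the worn part

    # Percentage error of a derived measurement, e.g. crown height at the tip.
    real_height = true_profile.max()
    recon_height = reconstructed.max()
    pct_error = 100.0 * (recon_height - real_height) / real_height
    print(f"crown-height percentage error: {pct_error:+.2f}%")
    ```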

  14. Safety and Performance Analysis of the Non-Radar Oceanic/Remote Airspace In-Trail Procedure

    NASA Technical Reports Server (NTRS)

    Carreno, Victor A.; Munoz, Cesar A.

    2007-01-01

    This document presents a safety and performance analysis of the nominal case for the In-Trail Procedure (ITP) in a non-radar oceanic/remote airspace. The analysis estimates the risk of collision between the aircraft performing the ITP and a reference aircraft. The risk of collision is only estimated for the ITP maneuver and it is based on nominal operating conditions. The analysis does not consider human error, communication error conditions, or the normal risk of flight present in current operations. The hazards associated with human error and communication errors are evaluated in an Operational Hazards Analysis presented elsewhere.

  15. Human domination of the biosphere: Rapid discharge of the earth-space battery foretells the future of humankind

    PubMed Central

    Schramski, John R.; Gattie, David K.; Brown, James H.

    2015-01-01

    Earth is a chemical battery where, over evolutionary time with a trickle-charge of photosynthesis using solar energy, billions of tons of living biomass were stored in forests and other ecosystems and in vast reserves of fossil fuels. In just the last few hundred years, humans extracted exploitable energy from these living and fossilized biomass fuels to build the modern industrial-technological-informational economy, to grow our population to more than 7 billion, and to transform the biogeochemical cycles and biodiversity of the earth. This rapid discharge of the earth’s store of organic energy fuels the human domination of the biosphere, including conversion of natural habitats to agricultural fields and the resulting loss of native species, emission of carbon dioxide and the resulting climate and sea level change, and use of supplemental nuclear, hydro, wind, and solar energy sources. The laws of thermodynamics governing the trickle-charge and rapid discharge of the earth’s battery are universal and absolute; the earth is only temporarily poised a quantifiable distance from the thermodynamic equilibrium of outer space. Although this distance from equilibrium is comprised of all energy types, most critical for humans is the store of living biomass. With the rapid depletion of this chemical energy, the earth is shifting back toward the inhospitable equilibrium of outer space with fundamental ramifications for the biosphere and humanity. Because there is no substitute or replacement energy for living biomass, the remaining distance from equilibrium that will be required to support human life is unknown. PMID:26178196

  16. Human domination of the biosphere: Rapid discharge of the earth-space battery foretells the future of humankind.

    PubMed

    Schramski, John R; Gattie, David K; Brown, James H

    2015-08-04

    Earth is a chemical battery where, over evolutionary time with a trickle-charge of photosynthesis using solar energy, billions of tons of living biomass were stored in forests and other ecosystems and in vast reserves of fossil fuels. In just the last few hundred years, humans extracted exploitable energy from these living and fossilized biomass fuels to build the modern industrial-technological-informational economy, to grow our population to more than 7 billion, and to transform the biogeochemical cycles and biodiversity of the earth. This rapid discharge of the earth's store of organic energy fuels the human domination of the biosphere, including conversion of natural habitats to agricultural fields and the resulting loss of native species, emission of carbon dioxide and the resulting climate and sea level change, and use of supplemental nuclear, hydro, wind, and solar energy sources. The laws of thermodynamics governing the trickle-charge and rapid discharge of the earth's battery are universal and absolute; the earth is only temporarily poised a quantifiable distance from the thermodynamic equilibrium of outer space. Although this distance from equilibrium is comprised of all energy types, most critical for humans is the store of living biomass. With the rapid depletion of this chemical energy, the earth is shifting back toward the inhospitable equilibrium of outer space with fundamental ramifications for the biosphere and humanity. Because there is no substitute or replacement energy for living biomass, the remaining distance from equilibrium that will be required to support human life is unknown.

  17. A cognitive taxonomy of medical errors.

    PubMed

    Zhang, Jiajie; Patel, Vimla L; Johnson, Todd R; Shortliffe, Edward H

    2004-06-01

    Propose a cognitive taxonomy of medical errors at the level of individuals and their interactions with technology. Use cognitive theories of human error and human action to develop the theoretical foundations of the taxonomy, develop the structure of the taxonomy, populate the taxonomy with examples of medical error cases, identify cognitive mechanisms for each category of medical error under the taxonomy, and apply the taxonomy to practical problems. Four criteria were used to evaluate the cognitive taxonomy. The taxonomy should be able (1) to categorize major types of errors at the individual level along cognitive dimensions, (2) to associate each type of error with a specific underlying cognitive mechanism, (3) to describe how and explain why a specific error occurs, and (4) to generate intervention strategies for each type of error. The proposed cognitive taxonomy largely satisfies the four criteria at a theoretical and conceptual level. Theoretically, the proposed cognitive taxonomy provides a method to systematically categorize medical errors at the individual level along cognitive dimensions, leads to a better understanding of the underlying cognitive mechanisms of medical errors, and provides a framework that can guide future studies on medical errors. Practically, it provides guidelines for the development of cognitive interventions to decrease medical errors, and a foundation for the development of a medical error reporting system that not only categorizes errors but also identifies problems and helps to generate solutions. To validate this model empirically, we will next perform systematic experimental studies.

  18. Differential sensitivity to human communication in dogs, wolves, and human infants.

    PubMed

    Topál, József; Gergely, György; Erdohegyi, Agnes; Csibra, Gergely; Miklósi, Adám

    2009-09-04

    Ten-month-old infants persistently search for a hidden object at its initial hiding place even after observing it being hidden at another location. Recent evidence suggests that communicative cues from the experimenter contribute to the emergence of this perseverative search error. We replicated these results with dogs (Canis familiaris), who also commit more search errors in ostensive-communicative (in 75% of the total trials) than in noncommunicative (39%) or nonsocial (17%) hiding contexts. However, comparative investigations suggest that communicative signals serve different functions for dogs and infants, whereas human-reared wolves (Canis lupus) do not show doglike context-dependent differences of search errors. We propose that shared sensitivity to human communicative signals stems from convergent social evolution of the Homo and the Canis genera.

  19. Metrics for Business Process Models

    NASA Astrophysics Data System (ADS)

    Mendling, Jan

    Up until now, there has been little research on why people introduce errors in real-world business process models. In a more general context, Simon [404] points to the limitations of cognitive capabilities and concludes that humans act rationally only to a certain extent. Concerning modeling errors, this argument would imply that human modelers lose track of the interrelations of large and complex models due to their limited cognitive capabilities and introduce errors that they would not insert in a small model. A recent study by Mendling et al. [275] explores the extent to which certain complexity metrics of business process models have the potential to serve as error determinants. The authors conclude that complexity indeed appears to have an impact on error probability. Before we can test such a hypothesis in a more general setting, we have to establish an understanding of how we can define determinants that drive error probability and how we can measure them.
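
    To make the notion of a complexity metric concrete, the sketch below computes two simple candidates (size and arc density) plus a connector count on an invented toy process graph; these are illustrative stand-ins, not Mendling et al.'s exact metric definitions.

    ```python
    # A tiny invented process graph: nodes, directed arcs, and connector
    # nodes (XOR/AND/OR splits and joins) named by convention here.
    nodes = ["start", "A", "xor1", "B", "C", "xor2", "end"]
    arcs = [("start", "A"), ("A", "xor1"), ("xor1", "B"), ("xor1", "C"),
            ("B", "xor2"), ("C", "xor2"), ("xor2", "end")]
    connectors = [n for n in nodes if n.startswith(("xor", "and", "or"))]

    size = len(nodes)
    density = len(arcs) / (size * (size - 1))   # arcs relative to the possible maximum
    print(f"size = {size}, connectors = {len(connectors)}, density = {density:.3f}")
    # The hypothesis sketched above: models with larger values of such
    # metrics should show a higher empirical probability of containing errors.
    ```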

  20. Hierarchical Learning Induces Two Simultaneous, But Separable, Prediction Errors in Human Basal Ganglia

    PubMed Central

    Diuk, Carlos; Tsai, Karin; Wallis, Jonathan; Botvinick, Matthew; Niv, Yael

    2013-01-01

    Studies suggest that dopaminergic neurons report a unitary, global reward prediction error signal. However, learning in complex real-life tasks, in particular tasks that show hierarchical structure, requires multiple prediction errors that may coincide in time. We used functional neuroimaging to measure prediction error signals in humans performing such a hierarchical task involving simultaneous, uncorrelated prediction errors. Analysis of signals in a priori anatomical regions of interest in the ventral striatum and the ventral tegmental area indeed evidenced two simultaneous, but separable, prediction error signals corresponding to the two levels of hierarchy in the task. This result suggests that suitably designed tasks may reveal a more intricate pattern of firing in dopaminergic neurons. Moreover, the need for downstream separation of these signals implies possible limitations on the number of different task levels that we can learn about simultaneously. PMID:23536092

  1. Associations between errors and contributing factors in aircraft maintenance

    NASA Technical Reports Server (NTRS)

    Hobbs, Alan; Williamson, Ann

    2003-01-01

    In recent years cognitive error models have provided insights into the unsafe acts that lead to many accidents in safety-critical environments. Most models of accident causation are based on the notion that human errors occur in the context of contributing factors. However, there is a lack of published information on possible links between specific errors and contributing factors. A total of 619 safety occurrences involving aircraft maintenance were reported using a self-completed questionnaire. Of these occurrences, 96% were related to the actions of maintenance personnel. The types of errors that were involved, and the contributing factors associated with those actions, were determined. Each type of error was associated with a particular set of contributing factors and with specific occurrence outcomes. Among the associations were links between memory lapses and fatigue and between rule violations and time pressure. Potential applications of this research include assisting with the design of accident prevention strategies, the estimation of human error probabilities, and the monitoring of organizational safety performance.

  2. Neuroanatomical dissociation for taxonomic and thematic knowledge in the human brain

    PubMed Central

    Schwartz, Myrna F.; Kimberg, Daniel Y.; Walker, Grant M.; Brecher, Adelyn; Faseyitan, Olufunsho K.; Dell, Gary S.; Mirman, Daniel; Coslett, H. Branch

    2011-01-01

    It is thought that semantic memory represents taxonomic information differently from thematic information. This study investigated the neural basis for the taxonomic-thematic distinction in a unique way. We gathered picture-naming errors from 86 individuals with poststroke language impairment (aphasia). Error rates were determined separately for taxonomic errors (“pear” in response to apple) and thematic errors (“worm” in response to apple), and their shared variance was regressed out of each measure. With the segmented lesions normalized to a common template, we carried out voxel-based lesion-symptom mapping on each error type separately. We found that taxonomic errors localized to the left anterior temporal lobe and thematic errors localized to the left temporoparietal junction. This is an indication that the contribution of these regions to semantic memory cleaves along taxonomic-thematic lines. Our findings show that a distinction long recognized in the psychological sciences is grounded in the structure and function of the human brain. PMID:21540329

  3. Interspecies scaling and prediction of human clearance: comparison of small- and macro-molecule drugs

    PubMed Central

    Huh, Yeamin; Smith, David E.; Feng, Meihau Rose

    2014-01-01

    Human clearance prediction for small- and macro-molecule drugs was evaluated and compared using various scaling methods and statistical analysis. Human clearance is generally well predicted using single or multiple species simple allometry for macro- and small-molecule drugs excreted renally. The prediction error is higher for hepatically eliminated small molecules using single or multiple species simple allometry scaling, and it appears that the prediction error is mainly associated with drugs with low hepatic extraction ratio (Eh). The error in human clearance prediction for hepatically eliminated small molecules was reduced using scaling methods with a correction of maximum life span (MLP) or brain weight (BRW). Human clearance of both small- and macro-molecule drugs is well predicted using the monkey liver blood flow method. Predictions using liver blood flow from other species did not work as well, especially for the small-molecule drugs. PMID:21892879
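
    A minimal sketch of single-exponent simple allometry under invented species data: fit log clearance against log body weight across species, then extrapolate to a 70-kg human. The MLP and BRW corrections mentioned above would multiply clearance by maximum life span or brain weight before the same fit.

    ```python
    import numpy as np

    # Hypothetical preclinical data (not from the paper): body weights in kg
    # and observed clearances in mL/min for rat, rabbit, monkey, and dog.
    bw = np.array([0.25, 2.5, 5.0, 12.0])
    cl = np.array([1.8, 12.0, 21.0, 44.0])

    # Simple allometry: CL = a * BW^b, fit as a line in log-log space.
    b, log_a = np.polyfit(np.log(bw), np.log(cl), deg=1)
    cl_human = np.exp(log_a) * 70.0 ** b   # extrapolate to a 70-kg human
    print(f"exponent b = {b:.2f}, predicted human CL = {cl_human:.0f} mL/min")
    ```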

  4. Fifty Years of THERP and Human Reliability Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronald L. Boring

    2012-06-01

    In 1962 at a Human Factors Society symposium, Alan Swain presented a paper introducing a Technique for Human Error Rate Prediction (THERP). This was followed in 1963 by a Sandia Laboratories monograph outlining basic human error quantification using THERP and, in 1964, by a special journal edition of Human Factors on quantification of human performance. Throughout the 1960s, Swain and his colleagues focused on collecting human performance data for the Sandia Human Error Rate Bank (SHERB), primarily in connection with supporting the reliability of nuclear weapons assembly in the US. In 1969, Swain met with Jens Rasmussen of Risø National Laboratory and discussed the applicability of THERP to nuclear power applications. By 1975, in WASH-1400, Swain had articulated the use of THERP for nuclear power applications, and the approach was finalized in the watershed publication of NUREG/CR-1278 in 1983. THERP is now 50 years old, and remains the most well known and most widely used HRA method. In this paper, the author discusses the history of THERP, based on published reports and personal communication and interviews with Swain. The author also outlines the significance of THERP. The foundations of human reliability analysis are found in THERP: human failure events, task analysis, performance shaping factors, human error probabilities, dependence, event trees, recovery, and pre- and post-initiating events were all introduced in THERP. While THERP is not without its detractors, and it is showing signs of its age in the face of newer technological applications, the longevity of THERP is a testament to its tremendous significance. THERP started the field of human reliability analysis. This paper concludes with a discussion of THERP in the context of newer methods, which can be seen as extensions of or departures from Swain's pioneering work.
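
    Two THERP building blocks lend themselves to a short sketch: combining human error probabilities (HEPs) along an event-tree branch, and the NUREG/CR-1278 dependence model, which conditions one task's HEP on failure of the preceding task. The numeric HEPs below are illustrative only.

    ```python
    # THERP dependence model (NUREG/CR-1278): conditional HEP for a task
    # given failure on the preceding task, by dependence level.
    DEPENDENCE = {
        "zero":     lambda p: p,
        "low":      lambda p: (1 + 19 * p) / 20,
        "moderate": lambda p: (1 + 6 * p) / 7,
        "high":     lambda p: (1 + p) / 2,
        "complete": lambda p: 1.0,
    }

    def conditional_hep(p, level):
        """HEP for task B given failure on task A, at a given dependence level."""
        return DEPENDENCE[level](p)

    p_a, p_b = 0.003, 0.01    # illustrative basic HEPs for tasks A and B
    for level in ("zero", "low", "high"):
        p_b_given_a = conditional_hep(p_b, level)
        # Probability that both tasks fail on this event-tree branch:
        print(f"{level:>4s}: P(B|A) = {p_b_given_a:.4f}, "
              f"P(A and B) = {p_a * p_b_given_a:.6f}")
    ```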

  5. Error-free replicative bypass of (6–4) photoproducts by DNA polymerase ζ in mouse and human cells

    PubMed Central

    Yoon, Jung-Hoon; Prakash, Louise; Prakash, Satya

    2010-01-01

    The ultraviolet (UV)-induced (6–4) pyrimidine–pyrimidone photoproduct [(6–4) PP] confers a large structural distortion in DNA. Here we examine in human cells the roles of translesion synthesis (TLS) DNA polymerases (Pols) in promoting replication through a (6–4) TT photoproduct carried on a duplex plasmid where bidirectional replication initiates from an origin of replication. We show that TLS contributes to a large fraction of lesion bypass and that it is mostly error-free. We find that, whereas Pol η and Pol ι provide alternate pathways for mutagenic TLS, surprisingly, Pol ζ functions independently of these Pols and in a predominantly error-free manner. We verify and extend these observations in mouse cells and conclude that, in human cells, TLS during replication can be markedly error-free even opposite a highly distorting DNA lesion. PMID:20080950

  6. A Framework for Modeling Human-Machine Interactions

    NASA Technical Reports Server (NTRS)

    Shafto, Michael G.; Rosekind, Mark R. (Technical Monitor)

    1996-01-01

    Modern automated flight-control systems employ a variety of different behaviors, or modes, for managing the flight. While developments in cockpit automation have resulted in workload reduction and economical advantages, they have also given rise to an ill-defined class of human-machine problems, sometimes referred to as 'automation surprises'. Our interest in applying formal methods for describing human-computer interaction stems from our ongoing research on cockpit automation. In this area of aeronautical human factors, there is much concern about how flight crews interact with automated flight-control systems, so that the likelihood of making errors, in particular mode-errors, is minimized and the consequences of such errors are contained. The goal of the ongoing research on formal methods in this context is: (1) to develop a framework for describing human interaction with control systems; (2) to formally categorize such automation surprises; and (3) to develop tests for identification of these categories early in the specification phase of a new human-machine system.

  7. Inborn Errors of Human JAKs and STATs

    PubMed Central

    Casanova, Jean-Laurent; Holland, Steven M.; Notarangelo, Luigi D.

    2012-01-01

    Inborn errors of the genes encoding two of the four human JAKs (JAK3 and TYK2) and three of the six human STATs (STAT1, STAT3, and STAT5B) have been described. We review the disorders arising from mutations in these five genes, highlighting the way in which the molecular and cellular pathogenesis of these conditions has been clarified by the discovery of inborn errors of cytokines, hormones, and their receptors, including those interacting with JAKs and STATs. The phenotypic similarities between mice and humans lacking individual JAK-STAT components suggest that the functions of JAKs and STATs are largely conserved in mammals. However, a wide array of phenotypic differences has emerged between mice and humans carrying bi-allelic null alleles of JAK3, TYK2, STAT1, or STAT5B. Moreover, the high level of allelic heterogeneity at the human JAK3, STAT1, and STAT3 loci has revealed highly diverse immunological and clinical phenotypes, which had not been anticipated. PMID:22520845

  8. Inborn errors of human JAKs and STATs.

    PubMed

    Casanova, Jean-Laurent; Holland, Steven M; Notarangelo, Luigi D

    2012-04-20

    Inborn errors of the genes encoding two of the four human JAKs (JAK3 and TYK2) and three of the six human STATs (STAT1, STAT3, and STAT5B) have been described. We review the disorders arising from mutations in these five genes, highlighting the way in which the molecular and cellular pathogenesis of these conditions has been clarified by the discovery of inborn errors of cytokines, hormones, and their receptors, including those interacting with JAKs and STATs. The phenotypic similarities between mice and humans lacking individual JAK-STAT components suggest that the functions of JAKs and STATs are largely conserved in mammals. However, a wide array of phenotypic differences has emerged between mice and humans carrying biallelic null alleles of JAK3, TYK2, STAT1, or STAT5B. Moreover, the high degree of allelic heterogeneity at the human JAK3, TYK2, STAT1, and STAT3 loci has revealed highly diverse immunological and clinical phenotypes, which had not been anticipated. Copyright © 2012 Elsevier Inc. All rights reserved.

  9. Use of modeling to identify vulnerabilities to human error in laparoscopy.

    PubMed

    Funk, Kenneth H; Bauer, James D; Doolen, Toni L; Telasha, David; Nicolalde, R Javier; Reeber, Miriam; Yodpijit, Nantakrit; Long, Myra

    2010-01-01

    This article describes an exercise to investigate the utility of modeling and human factors analysis in understanding surgical processes and their vulnerabilities to medical error. A formal method to identify error vulnerabilities was developed and applied to a test case of Veress needle insertion during closed laparoscopy. A team of 2 surgeons, a medical assistant, and 3 engineers used hierarchical task analysis and Integrated DEFinition language 0 (IDEF0) modeling to create rich models of the processes used in initial port creation. Using terminology from a standardized human performance database, detailed task descriptions were written for 4 tasks executed in the process of inserting the Veress needle. Key terms from the descriptions were used to extract from the database generic errors that could occur. Task descriptions with potential errors were translated back into surgical terminology. Referring to the process models and task descriptions, the team used a modified failure modes and effects analysis (FMEA) to consider each potential error for its probability of occurrence, its consequences if it should occur and be undetected, and its probability of detection. The resulting likely and consequential errors were prioritized for intervention. A literature-based validation study confirmed the significance of the top error vulnerabilities identified using the method. Ongoing work includes design and evaluation of procedures to correct the identified vulnerabilities and improvements to the modeling and vulnerability identification methods. Copyright 2010 AAGL. Published by Elsevier Inc. All rights reserved.
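
    The prioritization step maps naturally onto a risk priority number (RPN), as in conventional FMEA: each latent error gets ordinal scores for occurrence, severity (consequences if undetected), and detectability, and their product ranks errors for intervention. The failure modes and scores below are invented examples; the paper's own scales are not reproduced.

    ```python
    # (description, occurrence 1-10, severity 1-10, detection 1-10); a higher
    # detection score means harder to detect, per the usual FMEA convention.
    failure_modes = [
        ("needle advanced past intended depth",   4, 9, 7),
        ("insufflation line not checked",         3, 6, 4),
        ("patient position not re-verified",      5, 8, 6),
    ]

    # Rank by RPN = occurrence * severity * detection, highest risk first.
    ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
    for desc, occ, sev, det in ranked:
        print(f"RPN {occ * sev * det:3d}  {desc}")
    ```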

  10. Satellite Gravimetry Applied to Drought Monitoring

    NASA Technical Reports Server (NTRS)

    Rodell, Matthew

    2010-01-01

    Near-surface wetness conditions change rapidly with the weather, which limits their usefulness as drought indicators. Deeper stores of water, including root-zone soil wetness and groundwater, portend longer-term weather trends and climate variations, thus they are well suited for quantifying droughts. However, the existing in situ networks for monitoring these variables suffer from significant discontinuities (short records and spatial undersampling), as well as the inherent human and mechanical errors associated with soil moisture and groundwater observation. Remote sensing is a promising alternative, but standard remote sensors, which measure various wavelengths of light emitted or reflected from Earth's surface and atmosphere, can only directly detect wetness conditions within the first few centimeters of the land's surface. Such sensors include the Advanced Microwave Scanning Radiometer - Earth Observing System (AMSR-E) C-band passive microwave measurement system on the National Aeronautics and Space Administration's (NASA) Aqua satellite, and the combined active and passive L-band microwave system currently under development for NASA's planned Soil Moisture Active Passive (SMAP) satellite mission. These instruments are sensitive to water as deep as the top 2 cm and 5 cm of the soil column, respectively, with the specific depth depending on vegetation cover. Thermal infrared (TIR) imaging has been used to infer water stored in the full root zone, with limitations: auxiliary information including soil grain size is required, the TIR temperature versus soil water content curve becomes flat as wetness increases, and dense vegetation and cloud cover impede measurement. Numerical models of land surface hydrology are another potential solution, but the quality of output from such models is limited by errors in the input data and tradeoffs between model realism and computational efficiency. This chapter is divided into eight sections, the next of which describes the theory behind satellite gravimetry. Following that is a summary of the Gravity Recovery and Climate Experiment (GRACE) mission and how hydrological information is gleaned from its gravity products. The fourth section provides examples of hydrological science enabled by GRACE. The fifth and sixth sections list the challenging aspects of GRACE-derived hydrology data and how they are being overcome, including the use of data assimilation. The seventh section describes recent progress in applying GRACE for drought monitoring, including the development of new soil moisture and drought indicator products, and that is followed by a discussion of future prospects in satellite gravimetry-based drought monitoring.

  11. Rapid Recycling of Ca2+ between IP3-Sensitive Stores and Lysosomes

    PubMed Central

    López Sanjurjo, Cristina I.; Tovey, Stephen C.; Taylor, Colin W.

    2014-01-01

    Inositol 1,4,5-trisphosphate (IP3) evokes release of Ca2+ from the endoplasmic reticulum (ER), but the resulting Ca2+ signals are shaped by interactions with additional intracellular organelles. Bafilomycin A1, which prevents lysosomal Ca2+ uptake by inhibiting H+ pumping into lysosomes, increased the amplitude of the initial Ca2+ signals evoked by carbachol in human embryonic kidney (HEK) cells. Carbachol alone and carbachol in combination with parathyroid hormone (PTH) evoke Ca2+ release from distinct IP3-sensitive Ca2+ stores in HEK cells stably expressing human type 1 PTH receptors. Bafilomycin A1 similarly exaggerated the Ca2+ signals evoked by carbachol or carbachol with PTH, indicating that Ca2+ released from distinct IP3-sensitive Ca2+ stores is sequestered by lysosomes. The Ca2+ signals resulting from store-operated Ca2+ entry, whether evoked by thapsigargin or carbachol, were unaffected by bafilomycin A1. Using Gd3+ (1 mM) to inhibit both Ca2+ entry and Ca2+ extrusion, HEK cells were repetitively stimulated with carbachol to assess the effectiveness of Ca2+ recycling to the ER after IP3-evoked Ca2+ release. Blocking lysosomal Ca2+ uptake with bafilomycin A1 increased the amplitude of each carbachol-evoked Ca2+ signal without affecting the rate of Ca2+ recycling to the ER. This suggests that Ca2+ accumulated by lysosomes is rapidly returned to the ER. We conclude that lysosomes rapidly, reversibly and selectively accumulate the Ca2+ released by IP3 receptors residing within distinct Ca2+ stores, but not the Ca2+ entering cells via receptor-regulated, store-operated Ca2+ entry pathways. PMID:25337829

  12. Rapid recycling of Ca2+ between IP3-sensitive stores and lysosomes.

    PubMed

    López Sanjurjo, Cristina I; Tovey, Stephen C; Taylor, Colin W

    2014-01-01

    Inositol 1,4,5-trisphosphate (IP3) evokes release of Ca2+ from the endoplasmic reticulum (ER), but the resulting Ca2+ signals are shaped by interactions with additional intracellular organelles. Bafilomycin A1, which prevents lysosomal Ca2+ uptake by inhibiting H+ pumping into lysosomes, increased the amplitude of the initial Ca2+ signals evoked by carbachol in human embryonic kidney (HEK) cells. Carbachol alone and carbachol in combination with parathyroid hormone (PTH) evoke Ca2+ release from distinct IP3-sensitive Ca2+ stores in HEK cells stably expressing human type 1 PTH receptors. Bafilomycin A1 similarly exaggerated the Ca2+ signals evoked by carbachol or carbachol with PTH, indicating that Ca2+ released from distinct IP3-sensitive Ca2+ stores is sequestered by lysosomes. The Ca2+ signals resulting from store-operated Ca2+ entry, whether evoked by thapsigargin or carbachol, were unaffected by bafilomycin A1. Using Gd3+ (1 mM) to inhibit both Ca2+ entry and Ca2+ extrusion, HEK cells were repetitively stimulated with carbachol to assess the effectiveness of Ca2+ recycling to the ER after IP3-evoked Ca2+ release. Blocking lysosomal Ca2+ uptake with bafilomycin A1 increased the amplitude of each carbachol-evoked Ca2+ signal without affecting the rate of Ca2+ recycling to the ER. This suggests that Ca2+ accumulated by lysosomes is rapidly returned to the ER. We conclude that lysosomes rapidly, reversibly and selectively accumulate the Ca2+ released by IP3 receptors residing within distinct Ca2+ stores, but not the Ca2+ entering cells via receptor-regulated, store-operated Ca2+ entry pathways.

  13. Detection of Error Related Neuronal Responses Recorded by Electrocorticography in Humans during Continuous Movements

    PubMed Central

    Milekovic, Tomislav; Ball, Tonio; Schulze-Bonhage, Andreas; Aertsen, Ad; Mehring, Carsten

    2013-01-01

    Background Brain-machine interfaces (BMIs) can translate the neuronal activity underlying a user’s movement intention into movements of an artificial effector. In spite of continuous improvements, errors in movement decoding are still a major problem of current BMI systems. If the difference between the decoded and intended movements becomes noticeable, it may lead to an execution error. Outcome errors, where subjects fail to reach a certain movement goal, are also present during online BMI operation. Detecting such errors can be beneficial for BMI operation: (i) errors can be corrected online after being detected and (ii) adaptive BMI decoding algorithm can be updated to make fewer errors in the future. Methodology/Principal Findings Here, we show that error events can be detected from human electrocorticography (ECoG) during a continuous task with high precision, given a temporal tolerance of 300–400 milliseconds. We quantified the error detection accuracy and showed that, using only a small subset of 2×2 ECoG electrodes, 82% of detection information for outcome error and 74% of detection information for execution error available from all ECoG electrodes could be retained. Conclusions/Significance The error detection method presented here could be used to correct errors made during BMI operation or to adapt a BMI algorithm to make fewer errors in the future. Furthermore, our results indicate that smaller ECoG implant could be used for error detection. Reducing the size of an ECoG electrode implant used for BMI decoding and error detection could significantly reduce the medical risk of implantation. PMID:23383315

  14. DA-6034 Induces [Ca(2+)]i Increase in Epithelial Cells.

    PubMed

    Yang, Yu-Mi; Park, Soonhong; Ji, Hyewon; Kim, Tae-Im; Kim, Eung Kweon; Kang, Kyung Koo; Shin, Dong Min

    2014-04-01

    DA-6034, a derivative of the flavonoid eupatilin, has shown potent protective effects on gastric mucosa and induces increases in fluid and glycoprotein secretion in human and rat corneal and conjunctival cells, suggesting that it might be considered as a drug for the treatment of dry eye. However, whether DA-6034 induces Ca(2+) signaling, and its underlying mechanism in epithelial cells, are not known. In the present study, we investigated the mechanisms of action of DA-6034 in Ca(2+) signaling pathways of epithelial cells (conjunctival and corneal cells) from human donor eyes and mouse salivary gland epithelial cells. DA-6034 activated Ca(2+)-activated Cl(-) channels (CaCCs) and increased intracellular calcium concentrations ([Ca(2+)]i) in primary cultured human conjunctival cells. DA-6034 also increased [Ca(2+)]i in mouse salivary gland cells and human corneal epithelial cells. The DA-6034-induced [Ca(2+)]i increase was dependent on Ca(2+) entry from the extracellular space and Ca(2+) release from internal Ca(2+) stores. Interestingly, these effects of DA-6034 were related to ryanodine receptors (RyRs) but not the phospholipase C/inositol 1,4,5-trisphosphate (IP3) pathway or lysosomal Ca(2+) stores. These results suggest that DA-6034 induces Ca(2+) signaling via extracellular Ca(2+) entry and RyR-sensitive Ca(2+) release from internal Ca(2+) stores in epithelial cells.

  15. DNA as a digital information storage device: hope or hype?

    PubMed

    Panda, Darshan; Molla, Kutubuddin Ali; Baig, Mirza Jainul; Swain, Alaka; Behera, Deeptirekha; Dash, Manaswini

    2018-05-01

    The total digital information today amounts to 3.52 × 10^22 bits globally, and at its consistent exponential rate of growth is expected to reach 3 × 10^24 bits by 2040. The data storage density of silicon chips is limited, and magnetic tapes used to maintain large-scale permanent archives begin to deteriorate within 20 years. Since silicon has limited data storage ability and serious drawbacks, such as human health hazards and environmental pollution, researchers across the world are intently searching for an appropriate alternative. Deoxyribonucleic acid (DNA) is an appealing option for such a purpose due to its endurance, a higher degree of compaction, and similarity to the sequential code of 0's and 1's found in a computer. This emerging field of DNA as a means of data storage has the potential to transform science fiction into reality, wherein a device that can fit in our palms could accommodate the information of the entire world; the latest research has revealed that just four grams of DNA could store the annual global digital information. DNA has all the properties to supersede the conventional hard disk, as it is capable of retaining ten times more data, has a thousandfold storage density, and consumes 10^8 times less power to store a similar amount of data. Although DNA has enormous potential as a data storage device of the future, multiple bottlenecks such as exorbitant costs, excruciatingly slow writing and reading mechanisms, and vulnerability to mutations or errors need to be resolved. In this review, we critically analyze the emergence of DNA as a molecular storage device for the future, its ability to address the future digital data crunch, potential challenges in achieving this objective, various current industrial initiatives, and major breakthroughs.
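
    The bits-to-bases similarity mentioned above is easy to sketch. The toy codec below maps each pair of bits to one nucleotide (2 bits per base); practical schemes additionally impose sequence constraints (e.g., avoiding long homopolymers) and add error-correcting codes, so this mapping is an illustrative assumption only.

    ```python
    # Toy binary-to-DNA codec: 2 bits per base, no redundancy or constraints.

    BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
    BASE_TO_BITS = {b: bits for bits, b in BITS_TO_BASE.items()}

    def encode(data: bytes) -> str:
        bits = "".join(f"{byte:08b}" for byte in data)
        return "".join(BITS_TO_BASE[bits[i:i+2]] for i in range(0, len(bits), 2))

    def decode(strand: str) -> bytes:
        bits = "".join(BASE_TO_BITS[base] for base in strand)
        return bytes(int(bits[i:i+8], 2) for i in range(0, len(bits), 8))

    strand = encode(b"hi")
    assert decode(strand) == b"hi"     # round trip recovers the original bytes
    print(strand)                      # 'CGGACGGC' for the two bytes of "hi"
    ```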

  16. Hypoxic remodelling of Ca(2+) stores does not alter human cardiac myofibroblast invasion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riches, K.; Hettiarachchi, N.T.; Porter, K.E.

    2010-12-17

    Research highlights: Bradykinin promotes migration and proliferation of myofibroblasts. Such activity is Ca(2+)-dependent and occurs under hypoxic conditions. Hypoxia increased myofibroblast Ca(2+) stores but not influx evoked by bradykinin. Myofibroblast migration and proliferation were unaffected by hypoxia. -- Abstract: Cardiac fibroblasts are the most abundant cell type in the heart, and play a key role in the maintenance and repair of the myocardium following damage such as myocardial infarction by transforming into a cardiac myofibroblast (CMF) phenotype. Repair occurs through controlled proliferation and migration, which are Ca(2+)-dependent processes, and often requires the cells to operate within a hypoxic environment. Angiotensin converting enzyme (ACE) inhibitors reduce infarct size through the promotion of bradykinin (BK) stability. Although CMF express BK receptors, their activity under the reduced O2 conditions that occur following infarct is entirely unexplored. Using Fura-2 microfluorimetry on primary human CMF, we found that hypoxia significantly increased the mobilisation of Ca(2+) from intracellular stores in response to BK whilst capacitative Ca(2+) entry (CCE) remained unchanged. The enhanced store mobilisation was due to a striking increase in CMF intracellular Ca(2+)-store content under hypoxic conditions. However, BK-induced CMF migration or proliferation was not affected following hypoxic exposure, suggesting that Ca(2+) influx rather than mobilisation is of primary importance in CMF migration and proliferation.

  17. 21 CFR 110.80 - Processes and controls.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... HUMAN CONSUMPTION CURRENT GOOD MANUFACTURING PRACTICE IN MANUFACTURING, PACKING, OR HOLDING HUMAN FOOD Production and Process Controls § 110.80 Processes and controls. All operations in the receiving, inspecting, transporting, segregating, preparing, manufacturing, packaging, and storing of food shall be conducted in...

  18. The Human Resources Management System: Part 1.

    ERIC Educational Resources Information Center

    Ceriello, Vincent R.

    1982-01-01

    Presents a systematic and disciplined approach to planning for the development and implementation of an information system which will collect, store, maintain, and report human resources data. Discusses guidelines, priorities, training requirements, security, auditing, interface with payroll, and personnel reporting. (CT)

  19. Cryopreservation of human blood for alkaline and Fpg-modified comet assay.

    PubMed

    Pu, Xinzhu; Wang, Zemin; Klaunig, James E

    2016-01-01

    The Comet assay is a reproducible and sensitive assay for the detection of DNA damage in eukaryotic cells and tissues. Incorporation of lesion-specific, oxidative DNA damage repair enzymes (for example, Fpg, OGG1 and EndoIII) in the standard alkaline Comet assay procedure allows for the detection and measurement of oxidative DNA damage. The Comet assay using white blood cells (WBC) has proven useful in monitoring DNA damage from environmental agents in humans. However, it is often impractical to perform the Comet assay immediately after blood sampling. Thus, storage of blood samples is required. In this study, we developed and tested a simple storage method for very small amounts of whole blood for the standard and Fpg-modified Comet assay. Whole blood was stored in RPMI 1640 media containing 10% FBS, 10% DMSO and 1 mM deferoxamine at a sample to media ratio of 1:50. Samples were stored at -20 °C and -80 °C for 1, 7, 14 and 28 days. Isolated lymphocytes from the same subjects were also stored under the same conditions for comparison. Direct DNA strand breakage and oxidative DNA damage in WBC and lymphocytes were analyzed using the standard and Fpg-modified alkaline Comet assay and compared with freshly analyzed samples. No significant changes in either direct DNA strand breakage or oxidative DNA damage were seen in WBC and lymphocytes stored at -20 °C for 1 and 7 days compared to fresh samples. However, significant increases in both direct and oxidative DNA damage were seen in samples stored at -20 °C for 14 and 28 days. No changes in direct and oxidative DNA damage were observed in WBC and lymphocytes stored at -80 °C for up to 28 days. These results identify the proper conditions for storing whole blood or isolated lymphocytes to evaluate direct and oxidative DNA damage using the standard and Fpg-modified alkaline Comet assay.

  20. Calcium-activated K(+) channel (K(Ca)3.1) activity during Ca(2+) store depletion and store-operated Ca(2+) entry in human macrophages.

    PubMed

    Gao, Ya-dong; Hanley, Peter J; Rinné, Susanne; Zuzarte, Marylou; Daut, Jurgen

    2010-07-01

    STIM1 'senses' decreases in endoplasmic reticular (ER) luminal Ca(2+) and induces store-operated Ca(2+) (SOC) entry through plasma membrane Orai channels. The Ca(2+)/calmodulin-activated K(+) channel K(Ca)3.1 (previously known as SK4) has been implicated as an 'amplifier' of the Ca(2+)-release activated Ca(2+) (CRAC) current, especially in T lymphocytes. We have previously shown that human macrophages express K(Ca)3.1, and here we used the whole-cell patch-clamp technique to investigate the activity of these channels during Ca(2+) store depletion and store-operated Ca(2+) influx. Using RT-PCR, we found that macrophages express the elementary CRAC channel components Orai1 and STIM1, as well as Orai2, Orai3 and STIM2, but not the putatively STIM1-activated channels TRPC1, TRPC3-7 or TRPV6. In whole-cell configuration, a robust Ca(2+)-induced outwardly rectifying K(+) current inhibited by clotrimazole and augmented by DC-EBIO could be detected, consistent with K(Ca)3.1 channel current (also known as intermediate-conductance IK1). Introduction of extracellular Ca(2+) following Ca(2+) store depletion via P2Y(2) receptors induced a robust charybdotoxin (CTX)- and 2-APB-sensitive outward K(+) current and hyperpolarization. We also found that SOC entry induced by thapsigargin treatment induced CTX-sensitive K(+) current in HEK293 cells transiently expressing K(Ca)3.1. Our data suggest that SOC and K(Ca)3.1 channels are tightly coupled, such that a small Ca(2+) influx current induces a much larger K(Ca)3.1 channel current and hyperpolarization, providing the necessary electrochemical driving force for prolonged Ca(2+) signaling and store repletion. Copyright 2010 Elsevier Ltd. All rights reserved.

  1. Inborn Errors of Fructose Metabolism. What Can We Learn from Them?

    PubMed

    Tran, Christel

    2017-04-03

    Fructose is one of the main sweetening agents in the human diet and its ingestion is increasing globally. Dietary sugar has particular effects on those whose capacity to metabolize fructose is limited. While intolerance to carbohydrates is a frequent finding in children, inborn errors of carbohydrate metabolism are rare conditions. Three inborn errors are known in the pathway of fructose metabolism: (1) essential or benign fructosuria due to fructokinase deficiency; (2) hereditary fructose intolerance; and (3) fructose-1,6-bisphosphatase deficiency. In this review the focus is on the description of the clinical symptoms and biochemical anomalies in the three inborn errors of metabolism. The potential toxic effects of fructose in healthy humans are also discussed. Studies conducted in patients with inborn errors of fructose metabolism have helped to elucidate fructose metabolism and its potential toxicity in healthy humans. The influence of fructose on the glycolytic pathway and on purine catabolism is the cause of hypoglycemia, lactic acidosis and hyperuricemia. The discovery that fructose-mediated generation of uric acid may have a causal role in diabetes and obesity provided new insights into the pathogenesis of these common diseases.

  2. Inborn Errors of Fructose Metabolism. What Can We Learn from Them?

    PubMed Central

    Tran, Christel

    2017-01-01

    Fructose is one of the main sweetening agents in the human diet and its ingestion is increasing globally. Dietary sugar has particular effects on those whose capacity to metabolize fructose is limited. While intolerance to carbohydrates is a frequent finding in children, inborn errors of carbohydrate metabolism are rare conditions. Three inborn errors are known in the pathway of fructose metabolism: (1) essential or benign fructosuria due to fructokinase deficiency; (2) hereditary fructose intolerance; and (3) fructose-1,6-bisphosphatase deficiency. In this review the focus is on the description of the clinical symptoms and biochemical anomalies in the three inborn errors of metabolism. The potential toxic effects of fructose in healthy humans are also discussed. Studies conducted in patients with inborn errors of fructose metabolism have helped to elucidate fructose metabolism and its potential toxicity in healthy humans. The influence of fructose on the glycolytic pathway and on purine catabolism is the cause of hypoglycemia, lactic acidosis and hyperuricemia. The discovery that fructose-mediated generation of uric acid may have a causal role in diabetes and obesity provided new insights into the pathogenesis of these common diseases. PMID:28368361

  3. Using lean "automation with a human touch" to improve medication safety: a step closer to the "perfect dose".

    PubMed

    Ching, Joan M; Williams, Barbara L; Idemoto, Lori M; Blackmore, C Craig

    2014-08-01

    Virginia Mason Medical Center (Seattle) employed the Lean concept of Jidoka (automation with a human touch) to plan for and deploy bar code medication administration (BCMA) to hospitalized patients. Integrating BCMA technology into the nursing workflow with minimal disruption was accomplished using three steps of Jidoka: (1) assigning work to humans and machines on the basis of their differing abilities, (2) adapting machines to the human workflow, and (3) monitoring the human-machine interaction. Effectiveness of BCMA to both reinforce safe administration practices and reduce medication errors was measured using the Collaborative Alliance for Nursing Outcomes (CALNOC) Medication Administration Accuracy Quality Study methodology. Trained nurses observed a total of 16,149 medication doses for 3,617 patients in a three-year period. Following BCMA implementation, the number of safe practice violations decreased from 54.8 violations/100 doses (January 2010-September 2011) to 29.0 violations/100 doses (October 2011-December 2012), resulting in an absolute risk reduction of 25.8 violations/100 doses (95% confidence interval [CI]: 23.7, 27.9, p < .001). The number of medication errors decreased from 5.9 errors/100 doses at baseline to 3.0 errors/100 doses after BCMA implementation (absolute risk reduction: 2.9 errors/100 doses [95% CI: 2.2, 3.6, p < .001]). The number of unsafe administration practices (estimate, -5.481; standard error 1.133; p < .001; 95% CI: -7.702, -3.260) also decreased. As more hospitals respond to health information technology meaningful use incentives, thoughtful, methodical, and well-managed approaches to technology deployment are crucial. This work illustrates how Jidoka offers opportunities for a smooth transition to new technology.
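
    As a plausibility check on the reported effect sizes, the sketch below recomputes the absolute risk reduction for medication errors with a Wald-style interval. The per-period dose counts are not broken out in the abstract, so the counts used here are assumptions; the study itself used the CALNOC methodology.

    ```python
    # Back-of-envelope ARR check; dose counts per period are assumed (~8000 each).
    import math

    def arr_per_100(rate_before, rate_after, n_before, n_after):
        p1, p2 = rate_before / 100, rate_after / 100
        arr = p1 - p2
        # Wald standard error for a difference of two proportions
        se = math.sqrt(p1 * (1 - p1) / n_before + p2 * (1 - p2) / n_after)
        lo, hi = arr - 1.96 * se, arr + 1.96 * se
        return 100 * arr, (100 * lo, 100 * hi)

    # Medication errors: 5.9 -> 3.0 errors/100 doses
    arr, ci = arr_per_100(5.9, 3.0, 8000, 8000)
    print(f"ARR = {arr:.1f}/100 doses, 95% CI ({ci[0]:.1f}, {ci[1]:.1f})")
    # Close to the reported 2.9/100 doses (95% CI: 2.2, 3.6)
    ```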

  4. Applying human factors principles to alert design increases efficiency and reduces prescribing errors in a scenario-based simulation

    PubMed Central

    Russ, Alissa L; Zillich, Alan J; Melton, Brittany L; Russell, Scott A; Chen, Siying; Spina, Jeffrey R; Weiner, Michael; Johnson, Elizabette G; Daggy, Joanne K; McManus, M Sue; Hawsey, Jason M; Puleo, Anthony G; Doebbeling, Bradley N; Saleem, Jason J

    2014-01-01

    Objective To apply human factors engineering principles to improve alert interface design. We hypothesized that incorporating human factors principles into alerts would improve usability, reduce workload for prescribers, and reduce prescribing errors. Materials and methods We performed a scenario-based simulation study using a counterbalanced, crossover design with 20 Veterans Affairs prescribers to compare original versus redesigned alerts. We redesigned drug–allergy, drug–drug interaction, and drug–disease alerts based upon human factors principles. We assessed usability (learnability of redesign, efficiency, satisfaction, and usability errors), perceived workload, and prescribing errors. Results Although prescribers received no training on the design changes, prescribers were able to resolve redesigned alerts more efficiently (median (IQR): 56 (47) s) compared to the original alerts (85 (71) s; p=0.015). In addition, prescribers rated redesigned alerts significantly higher than original alerts across several dimensions of satisfaction. Redesigned alerts led to a modest but significant reduction in workload (p=0.042) and significantly reduced the number of prescribing errors per prescriber (median (range): 2 (1–5) compared to original alerts: 4 (1–7); p=0.024). Discussion Aspects of the redesigned alerts that likely contributed to better prescribing include design modifications that reduced usability-related errors, providing clinical data closer to the point of decision, and displaying alert text in a tabular format. Displaying alert text in a tabular format may help prescribers extract information quickly and thereby increase responsiveness to alerts. Conclusions This simulation study provides evidence that applying human factors design principles to medication alerts can improve usability and prescribing outcomes. PMID:24668841

  5. Applying human factors principles to alert design increases efficiency and reduces prescribing errors in a scenario-based simulation.

    PubMed

    Russ, Alissa L; Zillich, Alan J; Melton, Brittany L; Russell, Scott A; Chen, Siying; Spina, Jeffrey R; Weiner, Michael; Johnson, Elizabette G; Daggy, Joanne K; McManus, M Sue; Hawsey, Jason M; Puleo, Anthony G; Doebbeling, Bradley N; Saleem, Jason J

    2014-10-01

    To apply human factors engineering principles to improve alert interface design. We hypothesized that incorporating human factors principles into alerts would improve usability, reduce workload for prescribers, and reduce prescribing errors. We performed a scenario-based simulation study using a counterbalanced, crossover design with 20 Veterans Affairs prescribers to compare original versus redesigned alerts. We redesigned drug-allergy, drug-drug interaction, and drug-disease alerts based upon human factors principles. We assessed usability (learnability of redesign, efficiency, satisfaction, and usability errors), perceived workload, and prescribing errors. Although prescribers received no training on the design changes, prescribers were able to resolve redesigned alerts more efficiently (median (IQR): 56 (47) s) compared to the original alerts (85 (71) s; p=0.015). In addition, prescribers rated redesigned alerts significantly higher than original alerts across several dimensions of satisfaction. Redesigned alerts led to a modest but significant reduction in workload (p=0.042) and significantly reduced the number of prescribing errors per prescriber (median (range): 2 (1-5) compared to original alerts: 4 (1-7); p=0.024). Aspects of the redesigned alerts that likely contributed to better prescribing include design modifications that reduced usability-related errors, providing clinical data closer to the point of decision, and displaying alert text in a tabular format. Displaying alert text in a tabular format may help prescribers extract information quickly and thereby increase responsiveness to alerts. This simulation study provides evidence that applying human factors design principles to medication alerts can improve usability and prescribing outcomes. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  6. Accounting for model error in Bayesian solutions to hydrogeophysical inverse problems using a local basis approach

    NASA Astrophysics Data System (ADS)

    Irving, J.; Koepke, C.; Elsheikh, A. H.

    2017-12-01

    Bayesian solutions to geophysical and hydrological inverse problems are dependent upon a forward process model linking subsurface parameters to measured data, which is typically assumed to be known perfectly in the inversion procedure. However, in order to make the stochastic solution of the inverse problem computationally tractable using, for example, Markov chain Monte Carlo (MCMC) methods, fast approximations of the forward model are commonly employed. This introduces model error into the problem, which has the potential to significantly bias posterior statistics and hamper data integration efforts if not properly accounted for. Here, we present a new methodology for addressing the issue of model error in Bayesian solutions to hydrogeophysical inverse problems that is geared towards the common case where these errors cannot be effectively characterized globally through some parametric statistical distribution or locally based on interpolation between a small number of computed realizations. Rather than focusing on the construction of a global or local error model, we instead work towards identification of the model-error component of the residual through a projection-based approach. In this regard, pairs of approximate and detailed model runs are stored in a dictionary that grows at a specified rate during the MCMC inversion procedure. At each iteration, a local model-error basis is constructed for the current test set of model parameters using the K-nearest-neighbour entries in the dictionary, which is then used to separate the model error from the other error sources before computing the likelihood of the proposed set of model parameters. We demonstrate the performance of our technique on the inversion of synthetic crosshole ground-penetrating radar traveltime data for three different subsurface parameterizations of varying complexity. The synthetic data are generated using the eikonal equation, whereas a straight-ray forward model is assumed in the inversion procedure. In each case, the developed model-error approach enables us to remove posterior bias and obtain a more realistic characterization of uncertainty.
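
    One way to read the dictionary-and-projection step is sketched below (my interpretation, not the authors' code): pair each stored parameter set with its model-error vector (detailed minus approximate run), build an orthonormal basis from the K nearest entries at each MCMC step, and project the residual onto that basis to separate model error from the remaining error sources.

    ```python
    # Conceptual sketch of a local, KNN-based model-error basis; all data invented.
    import numpy as np

    def model_error_projection(residual, dict_params, dict_errors, theta, k=10):
        """residual: data - approx_model(theta); dict_errors[i] = detailed - approx
        runs stored for dict_params[i]. Returns (model_error_part, remainder)."""
        # K nearest neighbours of theta in parameter space
        d = np.linalg.norm(dict_params - theta, axis=1)
        nearest = np.argsort(d)[:k]
        # Orthonormal basis (via SVD) of the K stored model-error vectors
        U, _, _ = np.linalg.svd(dict_errors[nearest].T, full_matrices=False)
        model_err = U @ (U.T @ residual)   # projection onto the local basis
        return model_err, residual - model_err

    # Toy usage with random stand-ins for dictionary entries
    rng = np.random.default_rng(0)
    params = rng.normal(size=(50, 3))      # 50 stored parameter sets
    errors = rng.normal(size=(50, 20))     # matching model-error vectors
    me, rem = model_error_projection(rng.normal(size=20), params, errors,
                                     theta=np.zeros(3), k=10)
    ```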

  7. Ionic liquid-based reagents improve the stability of midterm fecal sample storage.

    PubMed

    Hao, Lilan; Xia, Zhongkui; Yang, Huanming; Wang, Jian; Han, Mo

    2017-08-01

    Fecal samples are widely used in metagenomic research, which aims to elucidate the relationship between human health and the intestinal microbiota. However, the best conditions for stable and reliable storage and transport of these samples at room temperature are still unknown, as is whether samples stored at room temperature for several days maintain their microbiota composition. Here, we established and tested a preservation method using reagents containing imidazolium- or pyridinium-based ionic liquids. We stored human fecal samples in these reagents for up to 7 days at different temperatures. Subsequently, all samples were sequenced and compared with fresh samples and/or samples treated under other conditions. The 16S rRNA sequencing results suggested that ionic liquid-based reagents could stabilize the composition of the microbiota in fecal samples during a 7-day storage period, particularly when stored at room temperature. Thus, this method may have implications in the storage of fecal samples for metagenomic research. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Storage of Organic and Inorganic Carbon in Human Settlements

    NASA Astrophysics Data System (ADS)

    Churkina, G.

    2009-12-01

    It has been shown that urban areas have a carbon density comparable with that of tropical forests. The carbon density of urban areas may be even higher, because only the density of organic carbon was taken into account. Human settlements store carbon in two forms: organic and inorganic. Carbon is stored in organic form in living biomass, such as trees and grasses, or in artifacts derived from biomass, such as wooden furniture, building structures, paper, and clothes and shoes made from natural materials. Inorganic carbon, or fossil carbon, meanwhile, is primarily stored in objects fabricated by people, like concrete, plastic, asphalt, and bricks. The key difference between organic and inorganic forms of carbon is how they return to the gaseous state. Organic carbon can be returned to the atmosphere without applying additional artificial energy through decomposition of organic matter, whereas energy input such as burning is needed to release inorganic carbon. In this study I compare inorganic with organic carbon storage, and discuss their carbon residence times, decomposition rates, and possible implications for carbon emissions.

  9. In-memory interconnect protocol configuration registers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheng, Kevin Y.; Roberts, David A.

    Systems, apparatuses, and methods for moving the interconnect protocol configuration registers into the main memory space of a node. The region of memory used for storing the interconnect protocol configuration registers may also be made cacheable to reduce the latency of accesses to the interconnect protocol configuration registers. Interconnect protocol configuration registers which are used during a startup routine may be prefetched into the host's cache to make the startup routine more efficient. The interconnect protocol configuration registers for various interconnect protocols may include one or more of device capability tables, memory-side statistics (e.g., to support two-level memory data mapping decisions), advanced memory and interconnect features such as repair resources and routing tables, prefetching hints, error correcting code (ECC) bits, lists of device capabilities, set and store base address, capability, device ID, status, configuration, capabilities, and other settings.
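
    A register map like the one described can be sketched as fixed offsets within a mapped region of main memory. The layout below is hypothetical; the patent text does not specify offsets, widths, or field names.

    ```python
    # Hypothetical register map inside a cacheable main-memory region.
    import struct

    REGISTER_MAP = {          # name: (byte offset, struct format)
        "device_id":    (0x00, "<I"),   # 32-bit device ID
        "capabilities": (0x04, "<Q"),   # 64-bit capability bits
        "status":       (0x0C, "<I"),
        "store_base":   (0x10, "<Q"),   # base address register
        "ecc_bits":     (0x18, "<I"),
    }

    def read_register(region: bytes, name: str):
        """Decode one register from the mapped region by offset and format."""
        offset, fmt = REGISTER_MAP[name]
        return struct.unpack_from(fmt, region, offset)[0]

    region = bytes(64)   # stand-in for the memory-mapped configuration region
    print(hex(read_register(region, "device_id")))
    ```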

  10. A new use of high resolution magnetograms. [solar activity and magnetic flux

    NASA Technical Reports Server (NTRS)

    Baum, P. J.; Bratenahl, A.

    1978-01-01

    Ground-based solar magnetograms are frequently in error by as much as twenty percent and contribute to the poor correlation between magnetic changes and solar flares. High resolution measurement of the magnetic field component normal to the photosphere, measured at photospheric height, can be used to construct a magnetic flux partition function F. The time derivative dF/dt is then an EMF which drives atmospheric currents in reconnecting solar active regions. With a high quality magnetograph, the solar probe can be used to obtain good estimates of F and dF/dt, and thereby the energy stored as induced solar atmospheric currents during quiescent interflare periods. Should a flare occur during a favorable observing period, the present method of analysis should show characteristic signatures in F, dF/dt, and especially in the stored flux computed from dF/dt.
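
    The quantities involved can be illustrated numerically (a toy sketch with invented magnitudes and units, not the authors' procedure): estimate the EMF as the finite-difference derivative of F, and the stored energy as the time integral of EMF times current.

    ```python
    # Toy numerics for EMF = dF/dt and energy ~ integral of EMF * I dt.
    import numpy as np

    t = np.linspace(0.0, 3600.0, 61)          # one hour, sampled every minute (s)
    F = 1e21 + 5e17 * np.sin(t / 1800.0)      # invented flux partition function
    emf = np.gradient(F, t)                   # dF/dt, the driving EMF
    I = 1e11 * np.ones_like(t)                # assumed constant atmospheric current
    power = emf * I

    # Stored energy as the time integral of power (trapezoidal rule)
    energy = np.sum(0.5 * (power[1:] + power[:-1]) * np.diff(t))
    print(f"peak EMF {emf.max():.3e}, stored energy {energy:.3e} (toy units)")
    ```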

  11. Offset-free rail-to-rail derandomizing peak detect-and-hold circuit

    DOEpatents

    DeGeronimo, Gianluigi; O'Connor, Paul; Kandasamy, Anand

    2003-01-01

    A peak detect-and-hold circuit eliminates errors introduced by conventional amplifiers, such as common-mode rejection and input voltage offset. The circuit includes an amplifier, three switches, a transistor, and a capacitor. During a detect-and-hold phase, a hold voltage at a non-inverting input terminal of the amplifier tracks an input voltage signal and, when a peak is reached, the transistor is switched off, thereby storing a peak voltage in the capacitor. During a readout phase, the circuit functions as a unity gain buffer, in which the voltage stored in the capacitor is provided as an output voltage. The circuit is able to sense signals rail-to-rail and can readily be modified to sense positive, negative, or peak-to-peak voltages. Derandomization may be achieved by using a plurality of peak detect-and-hold circuits electrically connected in parallel.
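
    The two phases can be summarized behaviorally (an interpretation of the description above, not the patented circuit itself): during detection the held value tracks the input until the peak, after which the stored maximum is simply buffered out.

    ```python
    # Behavioral model of the detect-and-hold and readout phases.

    class PeakDetectHold:
        def __init__(self):
            self.held = None           # voltage "stored on the capacitor"

        def detect(self, v_in: float) -> float:
            # the capacitor charges only while the input exceeds the held peak
            if self.held is None or v_in > self.held:
                self.held = v_in
            return self.held

        def readout(self) -> float:
            # unity-gain buffer phase: output the stored peak voltage
            return self.held

    pdh = PeakDetectHold()
    for v in [0.1, 0.7, 1.3, 0.9, 0.4]:
        pdh.detect(v)
    print(pdh.readout())   # 1.3, the peak of the input sequence
    ```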

  12. Development of a Clinical Data Warehouse for Hospital Infection Control

    PubMed Central

    Wisniewski, Mary F.; Kieszkowski, Piotr; Zagorski, Brandon M.; Trick, William E.; Sommers, Michael; Weinstein, Robert A.

    2003-01-01

    Existing data stored in a hospital's transactional servers have enormous potential to improve performance measurement and health care quality. Accessing, organizing, and using these data to support research and quality improvement projects are evolving challenges for hospital systems. The authors report development of a clinical data warehouse that they created by importing data from the information systems of three affiliated public hospitals. They describe their methodology; difficulties encountered; responses from administrators, computer specialists, and clinicians; and the steps taken to capture and store patient-level data. The authors provide examples of their use of the clinical data warehouse to monitor antimicrobial resistance, to measure antimicrobial use, to detect hospital-acquired bloodstream infections, to measure the cost of infections, and to detect antimicrobial prescribing errors. In addition, they estimate the amount of time and money saved and the increased precision achieved through the practical application of the data warehouse. PMID:12807807

  13. The Astrocentric Hypothesis: proposed role of astrocytes in consciousness and memory formation.

    PubMed

    Robertson, James M

    2002-01-01

    Consciousness is self-awareness. This process is closely associated with attention and working memory, a special form of short-term memory, which is vital when solving explicit tasks. Edelman has equated consciousness with the "remembered present" to highlight the importance of this form of memory (G.M. Edelman, Bright Air, Brilliant Fire, Basic Books, New York, 1992). The majority of other memories are recollections of past events that are encoded, stored, and brought back into consciousness if appropriate for solving new problems. Encoding prior experiences into memories is based on the salience of each event (A.R. Damasio, Descartes' Error, G.P. Putnam's Sons, New York, 1994; G.M. Edelman, Bright Air, Brilliant Fire, Basic Books, New York, 1992). It is proposed that protoplasmic astrocytes bind attended sensory information into consciousness and store encoded memories. This conclusion is supported by research conducted by gliobiologists over the past 15 years. Copyright 2002 Elsevier Science Ltd.

  14. Evaluating the Performance Diagnostic Checklist-Human Services to Assess Incorrect Error-Correction Procedures by Preschool Paraprofessionals

    ERIC Educational Resources Information Center

    Bowe, Melissa; Sellers, Tyra P.

    2018-01-01

    The Performance Diagnostic Checklist-Human Services (PDC-HS) has been used to assess variables contributing to undesirable staff performance. In this study, three preschool teachers completed the PDC-HS to identify the factors contributing to four paraprofessionals' inaccurate implementation of error-correction procedures during discrete trial…

  15. The Importance of HRA in Human Space Flight: Understanding the Risks

    NASA Technical Reports Server (NTRS)

    Hamlin, Teri

    2010-01-01

    Human performance is critical to crew safety during space missions. Humans interact with hardware and software during ground processing, normal flight, and in response to events. Human interactions with hardware and software can cause Loss of Crew and/or Vehicle (LOCV) through improper actions, or may prevent LOCV through recovery and control actions. Humans have the ability to deal with complex situations and system interactions beyond the capability of machines. Human Reliability Analysis (HRA) is a method used to qualitatively and quantitatively assess the occurrence of human failures that affect availability and reliability of complex systems. Modeling human actions with their corresponding failure probabilities in a Probabilistic Risk Assessment (PRA) provides a more complete picture of system risks and risk contributions. A high-quality HRA can provide valuable information on potential areas for improvement, including training, procedures, human interface design, and the need for automation. Modeling human error has always been a challenge, in part because performance data is not always readily available. For spaceflight, the challenge is amplified not only because of the small number of participants and limited amount of performance data available, but also due to the lack of definition of the unique factors influencing human performance in space. These factors, called performance shaping factors in HRA terminology, are used in HRA techniques to modify basic human error probabilities in order to capture the context of an analyzed task. Many of the human error modeling techniques were developed within the context of nuclear power plants and therefore the methodologies do not address spaceflight factors such as the effects of microgravity and longer duration missions. This presentation will describe the types of human error risks which have shown up as risk drivers in the Shuttle PRA and which may be applicable to commercial space flight. As with other large PRAs of complex machines, human error in the Shuttle PRA proved to be an important contributor (12 percent) to LOCV. An existing HRA technique was adapted for use in the Shuttle PRA, but additional guidance and improvements are needed to make the HRA task in space-related PRAs easier and more accurate. Therefore, this presentation will also outline plans for expanding current HRA methodology to more explicitly cover spaceflight performance shaping factors.

  16. Comment on "Differential sensitivity to human communication in dogs, wolves, and human infants".

    PubMed

    Fiset, Sylvain

    2010-07-09

    Topál et al. (Reports, 4 September 2009, p. 1269) reported that dogs' sensitivity to reading and using human signals contributes to the emergence of a spatial perseveration error (the A-not-B error) for locating objects. Here, I argue that the authors' conclusion was biased by two confounding factors: the use of an atypical A-not-B search task and an inadequate nonsocial condition as a control.

  17. NASA: Model development for human factors interfacing

    NASA Technical Reports Server (NTRS)

    Smith, L. L.

    1984-01-01

    The results of an intensive literature review in the general topics of human error analysis, stress and job performance, and accident and safety analysis revealed no usable techniques or approaches for analyzing human error in ground or space operations tasks. A task review model is described and proposed for development in order to reduce the labor intensiveness of analyzing ground and space operations tasks. An extensive number of annotated references are provided.

  18. Designing to Control Flight Crew Errors

    NASA Technical Reports Server (NTRS)

    Schutte, Paul C.; Willshire, Kelli F.

    1997-01-01

    It is widely accepted that human error is a major contributing factor in aircraft accidents. There has been a significant amount of research into why these errors occur, and many reports state that the design of the flight deck can actually predispose humans to err. This research has led to the call for changes in design according to human factors and human-centered principles. The National Aeronautics and Space Administration's (NASA) Langley Research Center has initiated an effort to design a human-centered flight deck from a clean slate (i.e., without the constraints of existing designs). The effort will be based on recent research in human-centered design philosophy and mission management categories. This design will match the human's model of the mission and function of the aircraft to reduce unnatural or non-intuitive interfaces. The product of this effort will be a flight deck design description, including training and procedures, a cross reference or paper trail back to design hypotheses, and an evaluation of the design. The present paper will discuss the philosophy, process, and status of this design effort.

  19. Causes and Prevention of Laparoscopic Bile Duct Injuries

    PubMed Central

    Way, Lawrence W.; Stewart, Lygia; Gantert, Walter; Liu, Kingsway; Lee, Crystine M.; Whang, Karen; Hunter, John G.

    2003-01-01

    Objective To apply human performance concepts in an attempt to understand the causes of and prevent laparoscopic bile duct injury. Summary Background Data Powerful conceptual advances have been made in understanding the nature and limits of human performance. Applying these findings in high-risk activities, such as commercial aviation, has allowed the work environment to be restructured to substantially reduce human error. Methods The authors analyzed 252 laparoscopic bile duct injuries according to the principles of the cognitive science of visual perception, judgment, and human error. The injury distribution was class I, 7%; class II, 22%; class III, 61%; and class IV, 10%. The data included operative radiographs, clinical records, and 22 videotapes of original operations. Results The primary cause of error in 97% of cases was a visual perceptual illusion. Faults in technical skill were present in only 3% of injuries. Knowledge and judgment errors were contributory but not primary. Sixty-four injuries (25%) were recognized at the index operation; the surgeon identified the problem early enough to limit the injury in only 15 (6%). In class III injuries the common duct, erroneously believed to be the cystic duct, was deliberately cut. This stemmed from an illusion of object form due to a specific uncommon configuration of the structures and the heuristic nature (unconscious assumptions) of human visual perception. The videotapes showed the persuasiveness of the illusion, and many operative reports described the operation as routine. Class II injuries resulted from a dissection too close to the common hepatic duct. Fundamentally an illusion, it was contributed to in some instances by working too deep in the triangle of Calot. Conclusions These data show that errors leading to laparoscopic bile duct injuries stem principally from misperception, not errors of skill, knowledge, or judgment. The misperception was so compelling that in most cases the surgeon did not recognize a problem. Even when irregularities were identified, corrective feedback did not occur, which is characteristic of human thinking under firmly held assumptions. These findings illustrate the complexity of human error in surgery while simultaneously providing insights. They demonstrate that automatically attributing technical complications to behavioral factors that rely on the assumption of control is likely to be wrong. Finally, this study shows that there are only a few points within laparoscopic cholecystectomy where the complication-causing errors occur, which suggests that focused training to heighten vigilance might be able to decrease the incidence of bile duct injury. PMID:12677139

  20. Advancing Usability Evaluation through Human Reliability Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronald L. Boring; David I. Gertman

    2005-07-01

    This paper introduces a novel augmentation to the current heuristic usability evaluation methodology. The SPAR-H human reliability analysis method was developed for categorizing human performance in nuclear power plants. Despite the specialized use of SPAR-H for safety critical scenarios, the method also holds promise for use in commercial off-the-shelf software usability evaluations. The SPAR-H method shares task analysis underpinnings with human-computer interaction, and it can be easily adapted to incorporate usability heuristics as performance shaping factors. By assigning probabilistic modifiers to heuristics, it is possible to arrive at the usability error probability (UEP). This UEP is not a literal probability of error but nonetheless provides a quantitative basis to heuristic evaluation. When combined with a consequence matrix for usability errors, this method affords ready prioritization of usability issues.
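
    The UEP arithmetic suggested above might look like the following sketch, in which each violated heuristic acts as a performance-shaping-factor multiplier on a nominal error probability. The nominal value and multipliers are invented, and the simple cap at 1.0 stands in for SPAR-H's adjustment formula for large composite factors.

    ```python
    # Sketch of heuristics-as-PSFs; all numeric values are assumptions.

    NOMINAL_HEP = 0.01                      # assumed nominal error probability

    HEURISTIC_MULTIPLIERS = {               # invented illustrative values
        "visibility_of_system_status": 2.0,
        "match_system_and_world":      5.0,
        "error_prevention":           10.0,
    }

    def usability_error_probability(violated):
        """Multiply the nominal probability by one factor per violated heuristic."""
        uep = NOMINAL_HEP
        for h in violated:
            uep *= HEURISTIC_MULTIPLIERS[h]
        return min(uep, 1.0)                # crude cap; SPAR-H uses an adjustment

    print(usability_error_probability(["match_system_and_world",
                                       "error_prevention"]))   # 0.5
    ```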

  1. Flight test of an improved solid waste collection system

    NASA Technical Reports Server (NTRS)

    Thornton, W.; Brasseaux, H.; Whitmore, H.

    1991-01-01

    A system for human waste collection is described and evaluated on the basis of a prototype employed for the shuttle flight STS-35. The manually operated version of the unit is designed to collect, compact, and store human waste and cleaning material in replaceable volumes. The system is presented with illustrations and descriptions of the disposable pads that are used to clean the cylinder and occlusive air valves as well as seal the unit. Temporary retention and waste entrainment are provided by the variable airflow in the manual unit tested. The prototype testing indicates that sufficient airflow is achieved at 45 CFM and that the stowage volume (18.7 cu in.) is adequate for storing human waste with minimal logistical support. Higher compaction pressure and the use of a directed airstream are proposed for improving the packing efficiency of the unit.

  2. Autonomous Control Modes and Optimized Path Guidance for Shipboard Landing in High Sea States

    DTIC Science & Technology

    2015-11-16

    a degraded visual environment, workload during the landing task begins to approach the limits of a human pilot's capability. It is a similarly... [Figure residue removed; recoverable captions: Figure 2, approach trajectory with ±4 ft, ±8 ft, and ±12 ft landing-error bounds; Figure 5, open-loop system generation for the heave and yaw axes.]

  3. The Accuracy of GBM GRB Localizations

    NASA Astrophysics Data System (ADS)

    Briggs, Michael Stephen; Connaughton, V.; Meegan, C.; Hurley, K.

    2010-03-01

    We report a study of the accuracy of GBM GRB localizations, analyzing three types of localizations: those produced automatically by the GBM Flight Software on board GBM, those produced automatically with ground software in near real time, and localizations produced with human guidance. The two types of automatic locations are distributed in near real-time via GCN Notices; the human-guided locations are distributed on timescales of many minutes or hours using GCN Circulars. This work uses a Bayesian analysis that models the distribution of the GBM total location error by comparing GBM locations to more accurate locations obtained with other instruments. Reference locations are obtained from Swift, Super-AGILE, the LAT, and with the IPN. We model the GBM total location errors as having systematic errors in addition to the statistical errors and use the Bayesian analysis to constrain the systematic errors.
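
    As a simple illustration of combining error components (an assumption for exposition; the abstract does not state the functional form of its error model), statistical and systematic contributions can be added in quadrature:

    ```python
    # Toy quadrature combination of localization error components; values invented.
    import math

    def total_error(stat_deg: float, sys_deg: float) -> float:
        """Total location error from independent statistical and systematic terms."""
        return math.sqrt(stat_deg**2 + sys_deg**2)

    print(total_error(stat_deg=2.5, sys_deg=3.0))   # ~3.9 degrees
    ```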

  4. Temporal and spatial dynamics underlying capacitative calcium entry in human colonic smooth muscle.

    PubMed

    Kovac, Jason R; Chrones, Tom; Sims, Stephen M

    2008-01-01

    Following smooth muscle excitation and contraction, depletion of intracellular Ca(2+) stores activates capacitative Ca(2+) entry (CCE) to replenish stores and sustain cytoplasmic Ca(2+) (Ca(2+)(i)) elevations. The objectives of the present study were to characterize CCE and the Ca(2+)(i) dynamics underlying human colonic smooth muscle contraction by using tension recordings, fluorescent Ca(2+)-indicator dyes, and patch-clamp electrophysiology. The neurotransmitter acetylcholine (ACh) contracted tissue strips and, in freshly isolated colonic smooth muscle cells (SMCs), caused elevation of Ca(2+)(i) as well as activation of nonselective cation currents. To deplete Ca(2+)(i) stores, the sarcoplasmic reticulum Ca(2+)-ATPase (SERCA) inhibitors thapsigargin and cyclopiazonic acid were added to a Ca(2+)-free bathing solution. Under these conditions, addition of extracellular Ca(2+) (3 mM) elicited increased tension that was inhibited by the cation channel blockers SKF-96365 (10 microM) and lanthanum (100 microM), suggestive of CCE. In a separate series of experiments on isolated SMCs, SERCA inhibition generated a gradual and sustained inward current. When combined with high-speed Ca(2+)-imaging techniques, the CCE-evoked rise of Ca(2+)(i) was associated with inward currents carrying Ca(2+) that were inhibited by SKF-96365. Regional specializations in Ca(2+) influx and handling during CCE were observed. Distinct "hotspot" regions of Ca(2+) rise and plateau were evident in 70% of cells, a feature not previously recognized in smooth muscle. We propose that store-operated Ca(2+) entry occurs in hotspots contributing to localized Ca(2+) elevations in human colonic smooth muscle.

  5. Recombinant human G6PD for quality control and quality assurance of novel point-of-care diagnostics for G6PD deficiency.

    PubMed

    Kahn, Maria; LaRue, Nicole; Zhu, Changcheng; Pal, Sampa; Mo, Jack S; Barrett, Lynn K; Hewitt, Steve N; Dumais, Mitchell; Hemmington, Sandra; Walker, Adrian; Joynson, Jeff; Leader, Brandon T; Van Voorhis, Wesley C; Domingo, Gonzalo J

    2017-01-01

    A large gap for the support of point-of-care testing is the availability of reagents to support quality control (QC) of diagnostic assays along the supply chain from the manufacturer to the end user. While reagents and systems exist to support QC of laboratory screening tests for glucose-6-phosphate dehydrogenase (G6PD) deficiency, they are not configured appropriately to support point-of-care testing. The feasibility of using lyophilized recombinant human G6PD as a QC reagent in novel point-of-care tests for G6PD deficiency is demonstrated. Human recombinant G6PD (r-G6PD) was expressed in Escherichia coli and purified. Aliquots were stored at -80°C. Prior to lyophilization, aliquots were thawed, and three concentrations of r-G6PD (representing normal, intermediate, and deficient clinical G6PD levels) were prepared and mixed with a protective formulation, which protects the enzyme activity against degradation from denaturation during the lyophilization process. Following lyophilization, individual single-use tubes of lyophilized r-G6PD were placed in individual packs with desiccants and stored at five temperatures for one year. An enzyme assay for G6PD activity was used to ascertain the stability of r-G6PD activity while stored at different temperatures. Lyophilized r-G6PD is stable and can be used as a control indicator. Results presented here show that G6PD activity is stable for at least 365 days when stored at -80°C, 4°C, 30°C, and 45°C. When stored at 55°C, enzyme activity was found to be stable only through day 28. Lyophilized r-G6PD enzyme is stable and can be used as a control for point-of-care tests for G6PD deficiency.

  6. Category Accessibility and Recall Accuracy: The Impact of Exposure to Mass Media in Witness Recall Situations.

    ERIC Educational Resources Information Center

    Tamborini, Ron; And Others

    The R.S. Wyer and T.K. Srull model suggests that when humans process information and store it in memory they create construct categories that are somewhat like storage bins. According to this model, when information is placed in these bins, it is stored in the order that it is received or used, with the most recently processed information always…

  7. Method and system for the diagnosis of disease using retinal image content and an archive of diagnosed human patient data

    DOEpatents

    Tobin, Kenneth W; Karnowski, Thomas P; Chaum, Edward

    2013-08-06

    A method for diagnosing diseases having retinal manifestations including retinal pathologies includes the steps of providing a CBIR system including an archive of stored digital retinal photography images and diagnosed patient data corresponding to the retinal photography images, the stored images each indexed in a CBIR database using a plurality of feature vectors, the feature vectors corresponding to distinct descriptive characteristics of the stored images. A query image of the retina of a patient is obtained. Using image processing, regions or structures in the query image are identified. The regions or structures are then described using the plurality of feature vectors. At least one relevant stored image from the archive based on similarity to the regions or structures is retrieved, and an eye disease or a disease having retinal manifestations in the patient is diagnosed based on the diagnosed patient data associated with the relevant stored image(s).
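
    The retrieval step can be sketched as a nearest-neighbor search over feature vectors. The cosine-similarity measure and the random stand-in archive below are assumptions; the patent does not prescribe a particular similarity function.

    ```python
    # Minimal CBIR-style retrieval: rank archived feature vectors by similarity.
    import numpy as np

    def cosine_similarity(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    def retrieve(query_vec, archive_vecs, diagnoses, top_k=3):
        """Return the top_k diagnosed records most similar to the query image."""
        scores = [cosine_similarity(query_vec, v) for v in archive_vecs]
        ranked = np.argsort(scores)[::-1][:top_k]
        return [(diagnoses[i], scores[i]) for i in ranked]

    rng = np.random.default_rng(1)
    archive = rng.random((100, 32))               # 100 stored feature vectors
    labels = [f"case_{i}" for i in range(100)]    # stand-in diagnosed records
    print(retrieve(rng.random(32), archive, labels))
    ```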

  8. An evaluation of memory accuracy in food hoarding marsh tits Poecile palustris--how accurate are they compared to humans?

    PubMed

    Brodin, Anders; Urhan, A Utku

    2013-07-01

    Laboratory studies of scatter-hoarding birds have become a model system for spatial memory studies. Considering that such birds are known to have a good spatial memory, recovery success in lab studies seems low, in parids (titmice and chickadees) typically ranging between 25 and 60% when five seeds are cached in 50-128 available caching sites. Since these birds store many thousands of food items in nature in one autumn, one might expect that they should easily retrieve five seeds in a laboratory where they know the environment with its caching sites in detail. We designed a laboratory set-up to be as similar as possible to previous studies and trained wild-caught marsh tits Poecile palustris to store and retrieve seeds in it. Our results agree closely with earlier studies: around 40% of the first ten looks were correct when the birds had stored five seeds in 100 available sites, both 5 and 24 h after storing. The cumulative success curve suggests high success during the first 15 looks, after which it declines. Humans performed much better: in the first five looks most subjects were 100% correct. We discuss possible reasons for why the birds were not doing better. Copyright © 2013 Elsevier B.V. All rights reserved.
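
    For context, chance performance in this task is easy to work out (my arithmetic, not the authors'): random, non-revisiting search over 100 sites containing 5 caches follows a hypergeometric distribution, so the ~40% observed accuracy is far above the ~5% expected by chance.

    ```python
    # Chance-level baseline for 10 looks among 100 sites with 5 caches.
    from math import comb

    def expected_hits(looks=10, caches=5, sites=100):
        # mean of a hypergeometric(N=sites, K=caches, n=looks) variable
        return looks * caches / sites

    def p_at_least_one(looks=10, caches=5, sites=100):
        # 1 - P(no cache found in 'looks' draws without replacement)
        return 1 - comb(sites - caches, looks) / comb(sites, looks)

    print(expected_hits())     # 0.5 correct looks out of 10 by chance (~5%)
    print(p_at_least_one())    # ~0.42 chance of even a single hit
    ```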

  9. Human anterior prefrontal cortex encodes the 'what' and 'when' of future intentions.

    PubMed

    Momennejad, Ida; Haynes, John-Dylan

    2012-05-15

    On a daily basis we form numerous intentions to perform specific actions. However, we often have to delay the execution of intended actions while engaging in other demanding activities. Previous research has shown that patterns of activity in human prefrontal cortex (PFC) can reveal our current intentions. However, two fundamental questions have remained unresolved: (a) how does the PFC encode information about future tasks while we are busy engaging in other activities, and (b) how does the PFC enable us to commence a stored task at the intended time? Here we investigate how the brain stores and retrieves future intentions during occupied delays, i.e. while a person is busy performing a different task. For this purpose, we conducted a neuroimaging study with a time-based prospective memory paradigm. Using multivariate pattern classification and fMRI we show that during an occupied delay, activity patterns in the anterior PFC encode the content of 'what' subjects intend to do next, and 'when' they intend to do it. Importantly, distinct anterior PFC regions store the 'what' and 'when' components of future intentions during occupied maintenance and self-initiated retrieval. These results show a role for anterior PFC activity patterns in storing future action plans and ensuring their timely retrieval. Copyright © 2012 Elsevier Inc. All rights reserved.

  10. THERP and HEART integrated methodology for human error assessment

    NASA Astrophysics Data System (ADS)

    Castiglia, Francesco; Giardina, Mariarosa; Tomarchio, Elio

    2015-11-01

    An integrated THERP and HEART methodology is proposed to investigate accident scenarios that involve operator errors during high-dose-rate (HDR) treatments. The approach is modified on the basis of fuzzy set concepts with the aim of prioritizing an exhaustive list of erroneous tasks that can lead to patient radiological overexposure. The results allow for the identification of human errors, which is necessary to achieve a better understanding of health hazards in the radiotherapy treatment process so that it can be properly monitored and appropriately managed.
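
    In HEART, a generic-task nominal error probability is multiplied by each applicable error-producing condition (EPC), weighted by its assessed proportion of affect (APOA). A minimal sketch of that arithmetic with hypothetical values for an HDR task, omitting the paper's fuzzy prioritization layer:

    ```python
    def heart_hep(nominal_hep, epcs):
        """HEART human error probability: scale the generic-task nominal
        HEP by each EPC's maximum effect, weighted by its APOA (0..1)."""
        hep = nominal_hep
        for max_effect, apoa in epcs:
            hep *= (max_effect - 1.0) * apoa + 1.0
        return min(hep, 1.0)   # a probability cannot exceed 1

    # Hypothetical HDR-treatment task: nominal HEP 0.003, aggravated by
    # time shortage (max effect x11, 40% assessed) and inexperience
    # (max effect x3, 25% assessed).
    print(heart_hep(0.003, [(11, 0.4), (3, 0.25)]))   # -> ~0.0225
    ```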

  11. MM&T Program to Establish Production Techniques for the Automatic Detection and Qualification of Trace Elements Present in the Production of Microwave Semiconductors.

    DTIC Science & Technology

    1981-03-01

    lots. A single store of partially processed devices may serve as a source for several different product lines. Because the manufacture of microwave...matrix, or react chemically with some of the semiconductor materials. In some cases these element impurities may migrate to an interface inducing... different viscosity, the background intensity varied independently of the signal, a significant error could be introduced. A more effective method

  12. An Electronic Pillbox for Continuous Monitoring of Medication Adherence

    PubMed Central

    Hayes, Tamara L.; Hunt, John M.; Adami, Andre; Kaye, Jeffrey A.

    2010-01-01

    We have developed an instrumented pillbox, called a MedTracker, which allows monitoring of medication adherence on a continuous basis. This device improves on existing systems by providing mobility, frequent and automatic data collection, more detailed information about nonadherence and medication errors, and the familiar interface of a 7-day drug store pillbox. We report on the design of the MedTracker, and on the results of a field trial in 39 homes to evaluate the device. PMID:17946369

  13. Patient safety in otolaryngology: a descriptive review.

    PubMed

    Danino, Julian; Muzaffar, Jameel; Metcalfe, Chris; Coulson, Chris

    2017-03-01

    Human evaluation and judgement may include errors that can have disastrous results. Within medicine and healthcare there has been slow progress towards major changes in safety. Healthcare lags behind other specialised industries, such as aviation and nuclear power, where there have been significant improvements in overall safety, especially in reducing the risk of errors. Following several high-profile cases in the USA during the 1990s, a report titled "To Err Is Human: Building a Safer Health System" was published. The report extrapolated that in the USA approximately 50,000 to 100,000 patients may die each year as a result of medical errors. Traditionally, otolaryngology has been regarded as a "safe specialty". A study in the USA in 2004 inferred that there may be 2600 cases of major morbidity and 165 deaths within the specialty. MEDLINE was searched via the PubMed interface for English-language articles published between 2000 and 2012; each search combined two or three of the keywords noted earlier. Coverage was limited to several generic topics within patient safety in otolaryngology; other areas covered were current topics of recent interest or new advances in technology. There has been a heightened awareness of patient safety within the healthcare community; it has become a major priority. Focus has shifted from apportioning blame to preventing errors and implementing patient safety mechanisms in healthcare delivery. Types of error can be divided into errors of action and errors of knowledge or planning. In healthcare there are several factors that may influence adverse events and patient safety. Although technology may improve patient safety, it also introduces new sources of error. Working with other people increases the opportunities for safety netting, and team working has been shown to have a beneficial effect on patient safety. Any field of work involving human decision-making will always carry a risk of error. Within otolaryngology, patient safety has evolved along themes similar to those of other surgical specialties, but there are several specific high-risk areas. Medical error is a common problem and its human cost is of immense importance. Steps to reduce such errors require the identification of high-risk practice within a complex healthcare system. The commitment to patient safety and quality improvement in medicine depends on personal responsibility and professional accountability.

  14. Preclinical safety evaluation of human platelets treated with antimicrobial peptides in severe combined immunodeficient mice.

    PubMed

    Bosch-Marcé, Marta; Mohan, Ketha V K; Gelderman, Monique P; Ryan, Patricia L; Russek-Cohen, Estelle; Atreya, Chintamani D

    2014-03-01

    Bacterial sepsis is a complication attributed to room temperature (RT)-stored platelets (PLTs) in transfusion medicine. Antimicrobial peptides (AMPs) are emerging as new therapeutic agents against microbes. We had previously demonstrated bactericidal activity of select synthetic AMPs against six types of bacteria in stored PLTs. In this report, we tested these AMPs for their potential antibody response and interference with the recovery and survival of human PLTs in an animal model. Two separate studies were conducted to evaluate the safety of the synthetic AMPs. 1) Two AMPs (PD3 and PD4), derived from thrombin-induced human PLT microbicidal protein, and four arginine-tryptophan (RW) repeat peptides containing two to five repeats (RW2-RW5) were tested in rabbits for potential antibody response. 2) RT-stored human PLTs treated for 2 hours with each of the six AMPs individually or with phosphate-buffered saline (PBS) alone were infused into severe combined immunodeficient (SCID) mice to evaluate their in vivo recovery and survival by flow cytometry. Except for PD3, which showed a weak immune response, none of the peptides induced detectable antibodies in rabbits. Furthermore, none of the six AMPs tested significantly affected the in vivo recovery and survival of human PLTs in SCID mice compared to PBS alone-treated PLTs. The preclinical evaluation studies reported here demonstrate that the selected AMPs did not adversely affect human PLT recovery and survival in the SCID mouse model, supporting further study of AMPs toward addressing the bacterial contamination of PLTs. Published 2013. This article is a U.S. Government work and is in the public domain in the USA.

  15. Error identification and recovery by student nurses using human patient simulation: opportunity to improve patient safety.

    PubMed

    Henneman, Elizabeth A; Roche, Joan P; Fisher, Donald L; Cunningham, Helene; Reilly, Cheryl A; Nathanson, Brian H; Henneman, Philip L

    2010-02-01

    This study examined types of errors that occurred or were recovered in a simulated environment by student nurses. Errors occurred in all four rule-based error categories, and all students committed at least one error. The most frequent errors occurred in the verification category. Another common error was related to physician interactions. The least common errors were related to coordinating information with the patient and family. Our finding that 100% of student subjects committed rule-based errors is cause for concern. To decrease errors and improve safe clinical practice, nurse educators must identify effective strategies that students can use to improve patient surveillance. Copyright 2010 Elsevier Inc. All rights reserved.

  16. Alterations in the host defense properties of human milk following prolonged storage or pasteurization.

    PubMed

    Akinbi, Henry; Meinzen-Derr, Jareen; Auer, Christine; Ma, Yan; Pullum, Derek; Kusano, Ryosuke; Reszka, Krzysztof J; Zimmerly, Kira

    2010-09-01

    Preterm infants are often fed pasteurized donor milk or mother's milk that has been stored frozen for up to 4 weeks. Our objectives were to assess the impact of pasteurization or prolonged storage at -20 degrees C on the immunologic components of human milk and on the capability of the different forms of human milk to support bacterial proliferation. The concentrations and activities of major host defense proteins in the whey fractions of mother's milk stored for 4 weeks at -20 degrees C or of pasteurized human donor milk were compared with those of freshly expressed human milk. Proliferation of bacteria incubated in the 3 forms of human milk was assessed. Relative to freshly expressed human milk, the concentrations of lysozyme, lactoferrin, lactoperoxidase, and secretory immunoglobulin A were reduced 50% to 82% in pasteurized donor milk, and the activities of lysozyme and lactoperoxidase were 74% to 88% lower (P < 0.01). Proliferation of bacterial pathogens in pasteurized donor milk was enhanced 1.8- to 4.6-fold compared with fresh or frozen human milk (P < 0.01). The immunomodulatory proteins in human milk are reduced by pasteurization and, to a lesser extent, by frozen storage, resulting in decreased antibacterial capability. Stringent procedures to minimize bacterial contamination are essential during the handling of pasteurized milk.

  17. Advanced automated glass cockpit certification: Being wary of human factors

    NASA Technical Reports Server (NTRS)

    Amalberti, Rene; Wilbaux, Florence

    1994-01-01

    This paper presents some facets of the French experience with human factors in the certification of advanced automated cockpits. Three types of difficulties are described. First, the difficulties concerning the hotly debated concept of human error and its non-linear relationship to accident risk; a typology of errors to be taken into account in the certification process is put forward to respond to this issue. Next, the difficulties connected to the basically gradual and evolving nature of pilot expertise on a given type of aircraft, which contrasts with the immediate and definitive style of certifying systems. The last difficulties considered are those related to the goals of certification itself for these new aircraft and to the status of findings from human factor analyses (in particular: what should be done with disappointing results; how much can the changes induced by human factors investigation economically affect aircraft design; how many errors do we need to accumulate before we revise the system; and what should be remedied when human factor problems are discovered at the certification stage: the machine, pilot training, the rules, or everything?). The growth of advanced automated glass cockpits has forced the international aeronautical community to pay more attention to human factors during the design phase, the certification phase, and pilot training. The recent creation of a human factors desk at the DGAC-SFACT (the official French services) is a direct consequence of this. The paper is divided into three parts. Part one debates human error and its relationship with system design and accident risk. Part two describes the difficulties connected to the gradual and evolving nature of pilot expertise, which contrasts with the immediate and definitive style of certifying systems. Part three focuses on concrete outcomes of human factors for certification purposes.

  18. Normal accidents: human error and medical equipment design.

    PubMed

    Dain, Steven

    2002-01-01

    High-risk systems, which are typical of our technologically complex era, include not just nuclear power plants but also hospitals, anesthesia systems, and the practice of medicine and perfusion. In high-risk systems, no matter how effective safety devices are, some types of accidents are inevitable because the system's complexity leads to multiple and unexpected interactions. It is important for healthcare providers to apply a risk assessment and management process to decisions involving new equipment and procedures or staffing matters in order to minimize the residual risks of latent errors, which are amenable to correction because of the large window of opportunity for their detection. This article provides an introduction to basic risk management and error theory principles and examines ways in which they can be applied to reduce and mitigate the inevitable human errors that accompany high-risk systems. The article also discusses "human factor engineering" (HFE), the process used to design equipment/human interfaces in order to mitigate design errors. The HFE process involves interaction between designers and end users to produce a series of continuous refinements that are incorporated into the final product. The article also examines common design problems encountered in the operating room that may predispose operators to commit errors resulting in harm to the patient. While recognizing that errors and accidents are unavoidable, organizations that function within a high-risk system must adopt a "safety culture" that anticipates problems and acts aggressively through an anonymous, "blameless" reporting mechanism to resolve them. We must continuously examine and improve the design of equipment and procedures, personnel, supplies and materials, and the environment in which we work to reduce error and minimize its effects. Healthcare providers must take a leading role in the day-to-day management of the "Perioperative System" and be role models in promoting a culture of safety in their organizations.

  19. Heli/SITAN: A Terrain Referenced Navigation algorithm for helicopters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hollowell, J.

    1990-01-01

    Heli/SITAN is a Terrain Referenced Navigation (TRN) algorithm that utilizes radar altimeter ground clearance measurements in combination with a conventional navigation system and a stored digital terrain elevation map to accurately estimate a helicopter's position. Multiple Model Adaptive Estimation (MMAE) techniques are employed using a bank of single state Kalman filters to ensure that reliable position estimates are obtained even in the face of large initial position errors. A real-time implementation of the algorithm was tested aboard a US Army UH-1 helicopter equipped with a Singer-Kearfott Doppler Velocity Sensor (DVS) and a Litton LR-80 strapdown Attitude and Heading Reference System (AHRS). The median radial error of the position fixes provided in real-time by this implementation was less than 50 m for a variety of mission profiles. 6 refs., 7 figs.
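
    A toy illustration of the MMAE idea, assuming the filter bank reduces to re-weighting a set of candidate positions: each hypothesis is scored by the Gaussian likelihood of its radar-altimeter clearance residual. The one-dimensional terrain, sensor values, and noise level are invented for the example.

    ```python
    import numpy as np

    def mmae_update(candidates, weights, terrain_height, baro_alt, clearance_meas, sigma=5.0):
        """Re-weight candidate positions by the likelihood of the
        measured ground clearance given each hypothesized location."""
        predicted = baro_alt - terrain_height(candidates)
        residuals = clearance_meas - predicted
        likelihood = np.exp(-0.5 * (residuals / sigma) ** 2)
        weights = weights * likelihood
        return weights / weights.sum()

    # Toy 1-D terrain profile and three candidate along-track positions (m).
    terrain = lambda x: 100.0 + 20.0 * np.sin(x / 200.0)
    candidates = np.array([400.0, 500.0, 600.0])
    weights = np.ones(3) / 3
    true_pos, baro_alt = 500.0, 300.0
    clearance = baro_alt - terrain(true_pos)       # noiseless, for clarity
    weights = mmae_update(candidates, weights, terrain, baro_alt, clearance)
    print(candidates[np.argmax(weights)])          # -> 500.0
    ```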

  20. Adaptive Correction from Virtually Complex Dynamic Libraries: The Role of Noncovalent Interactions in Structural Selection and Folding.

    PubMed

    Lafuente, Maria; Atcher, Joan; Solà, Jordi; Alfonso, Ignacio

    2015-11-16

    The hierarchical self-assembling of complex molecular systems is dictated by the chemical and structural information stored in their components. This information can be expressed through an adaptive process that determines the structurally fittest assembly under given environmental conditions. We have set up complex disulfide-based dynamic covalent libraries of chemically and topologically diverse pseudopeptidic compounds. We show how the reaction evolves from very complex mixtures at short reaction times to the almost exclusive formation of a major compound, through the establishment of intramolecular noncovalent interactions. Our experiments demonstrate that the systems evolve through error-check and error-correction processes. The nature of these interactions, the importance of the folding and the effects of the environment are also discussed. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. The introduction of an acute physiological support service for surgical patients is an effective error reduction strategy.

    PubMed

    Clarke, D L; Kong, V Y; Naidoo, L C; Furlong, H; Aldous, C

    2013-01-01

    Acute surgical patients are particularly vulnerable to human error. The Acute Physiological Support Team (APST) was created with the twin objectives of identifying high-risk acute surgical patients in the general wards and reducing both the incidence and the impact of error in these patients. A number of error taxonomies were used to understand the causes of human error, and a simple risk stratification system was adopted to identify patients who are particularly at risk of error. During the period November 2012-January 2013, a total of 101 surgical patients were cared for by the APST at Edendale Hospital. The average age was forty years. There were 36 females and 65 males. There were 66 general surgical patients and 35 trauma patients. Fifty-six patients were referred on the day of their admission. The average length of stay in the APST was four days. Eleven patients were haemodynamically unstable on presentation and twelve were clinically septic. The reasons for referral were sepsis (4), respiratory distress (3), acute kidney injury (AKI) (38), post-operative monitoring (39), pancreatitis (3), ICU down-referral (7), hypoxia (5), low GCS (1), and coagulopathy (1). The mortality rate was 13%. A total of thirty-six patients experienced 56 errors. A total of 143 interventions were initiated by the APST, including institution or adjustment of intravenous fluids (101), blood transfusion (12), antibiotics (9), management of neutropenic sepsis (1), central line insertion (3), optimization of oxygen therapy (7), correction of electrolyte abnormality (8), and correction of coagulopathy (2). CONCLUSION: Our intervention combined current taxonomies of error with a simple risk stratification system and is a variant of the defence-in-depth strategy of error reduction. We effectively identified and corrected a significant number of human errors in high-risk acute surgical patients. This audit has helped us understand the common sources of error in the general surgical wards and will inform ongoing error reduction initiatives. Copyright © 2013 Surgical Associates Ltd. Published by Elsevier Ltd. All rights reserved.

  2. A framework for human-hydrologic system model development integrating hydrology and water management: application to the Cutzamala water system in Mexico

    NASA Astrophysics Data System (ADS)

    Wi, S.; Freeman, S.; Brown, C.

    2017-12-01

    This study presents a general approach to developing computational models of human-hydrologic systems in which human modification of hydrologic surface processes is significant or dominant. A river basin system is represented by a network of human-hydrologic response units (HHRUs) identified based on locations where river regulation happens (e.g., reservoir operation and diversions). Natural and human processes in HHRUs are simulated in a holistic framework that integrates component models representing rainfall-runoff, river routing, reservoir operation, flow diversion, and water use processes. We illustrate the approach in a case study of the Cutzamala water system (CWS) in Mexico, a complex inter-basin water transfer system supplying the Mexico City Metropolitan Area (MCMA). The human-hydrologic system model for the CWS (CUTZSIM) is evaluated against streamflow and reservoir storages measured across the CWS and against the water supplied to the MCMA. CUTZSIM improves the representation of hydrology and river-operation interaction and, in so doing, advances evaluation of system-wide water management consequences under altered climatic and demand regimes. The integrated modeling framework enables evaluation and simulation of model errors throughout the river basin, including errors in the representation of the human component processes. Heretofore, model error evaluation, predictive error intervals, and the resultant improved understanding have been limited to hydrologic processes. The general framework represents an initial step towards fuller understanding and prediction of the many and varied processes that determine the hydrologic fluxes and state variables in real river basins.
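
    To make the component-model idea concrete, the toy water balance below shows what a single HHRU-style reservoir node might look like, with inflow, diversion, minimum release, and spill resolved each time step. It is a sketch under assumed operating rules, not the CUTZSIM formulation; names and values are illustrative.

    ```python
    def reservoir_step(storage, inflow, demand, capacity, min_release=0.0):
        """One time-step water balance for a reservoir node: add inflow,
        divert up to demand, make a minimum release, spill above capacity.
        All quantities share the same volume units per time step."""
        storage += inflow
        diversion = min(demand, storage)
        storage -= diversion
        release = min(min_release, storage)
        storage -= release
        spill = max(0.0, storage - capacity)
        storage -= spill
        return storage, diversion, release + spill

    # Toy run: three steps of runoff routed through a single reservoir.
    storage = 50.0
    for runoff in [10.0, 80.0, 5.0]:
        storage, div, rel = reservoir_step(storage, runoff, demand=20.0, capacity=100.0)
        print(f"storage={storage:6.1f}  diversion={div:5.1f}  release={rel:5.1f}")
    ```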

  3. A Foundation for Systems Anthropometry: Lumbar/Pelvic Kinematics

    DTIC Science & Technology

    1983-02-01

    caused by human error in positioning the cursor of the digitizing board and inaccuracy of the digitizer. (Human error is approximately + .02 an and...Milne, J.S. and Lauder, I.J. 1974. "Age Effects in Kyphosis and Lordosis in Adults." Ann. Hum. Biol. 1(3):327-337. Mchr, G.C., Brinkley, J.W., Kazarian

  4. Comment on "Differential sensitivity to human communication in dogs, wolves, and human infants".

    PubMed

    Marshall-Pescini, S; Passalacqua, C; Valsecchi, P; Prato-Previde, E

    2010-07-09

    Topál et al. (Reports, 4 September 2009, p. 1269) showed that dogs, like infants but unlike wolves, make perseverative search errors that can be explained by the use of ostensive cues from the experimenter. We suggest that a simpler learning process, local enhancement, can account for errors made by dogs.

  5. Review of U.S. Army Unmanned Aerial Systems Accident Reports: Analysis of Human Error Contributions

    DTIC Science & Technology

    2018-03-20

    USAARL Report No. 2018-08: Review of U.S. Army Unmanned Aerial Systems Accident Reports: Analysis of Human Error Contributions, by Kathryn A... The success of unmanned aerial systems (UAS) operations relies upon a variety of factors, including, but not limited to

  6. Impact of human error on lumber yield in rough mills

    Treesearch

    Urs Buehlmann; R. Edward Thomas; R. Edward Thomas

    2002-01-01

    Rough sawn, kiln-dried lumber contains characteristics such as knots and bark pockets that are considered by most people to be defects. When using boards to produce furniture components, these defects are removed to produce clear, defect-free parts. Currently, human operators identify and locate the unusable board areas containing defects. Errors in determining a...

  7. Action errors, error management, and learning in organizations.

    PubMed

    Frese, Michael; Keith, Nina

    2015-01-03

    Every organization is confronted with errors. Most errors are corrected easily, but some may lead to negative consequences. Organizations often focus on error prevention as a single strategy for dealing with errors. Our review suggests that error prevention needs to be supplemented by error management--an approach directed at effectively dealing with errors after they have occurred, with the goal of minimizing negative and maximizing positive error consequences (examples of the latter are learning and innovations). After defining errors and related concepts, we review research on error-related processes affected by error management (error detection, damage control). Empirical evidence on positive effects of error management in individuals and organizations is then discussed, along with emotional, motivational, cognitive, and behavioral pathways of these effects. Learning from errors is central, but like other positive consequences, learning occurs under certain circumstances--one being the development of a mind-set of acceptance of human error.

  8. Review of Significant Incidents and Close Calls in Human Spaceflight from a Human Factors Perspective

    NASA Technical Reports Server (NTRS)

    Silva-Martinez, Jackelynne; Ellenberger, Richard; Dory, Jonathan

    2017-01-01

    This project aims to identify poor human factors design decisions that led to error-prone systems or did not facilitate the flight crew making the right choices, and to verify that NASA is effectively preventing similar incidents from occurring again. The analysis was performed by reviewing significant incidents and close calls in human spaceflight identified by the NASA Johnson Space Center Safety and Mission Assurance Flight Safety Office. The review of incidents shows whether the identified human errors arose in the operational phase (flight crew and ground control) or originated in the design phase (including manufacturing and test). This classification was performed with the aid of the NASA Human Systems Integration domains. The in-depth analysis resulted in a tool that helps with the human factors classification of significant incidents and close calls in human spaceflight, which can be used to identify human errors at the operational level and how they were or should be minimized. Current governing documents on human systems integration for both government and commercial crew were reviewed to determine whether current requirements, processes, training, and standard operating procedures protect the crew and ground control against these issues occurring in the future. Based on the findings, recommendations to target those areas are provided.

  9. Uncorrected refractive errors.

    PubMed

    Naidoo, Kovin S; Jaggernath, Jyoti

    2012-01-01

    Global estimates indicate that more than 2.3 billion people in the world suffer from poor vision due to refractive error, of whom 670 million are considered visually impaired because they do not have access to corrective treatment. Refractive errors, if uncorrected, result in an impaired quality of life for millions of people worldwide, irrespective of their age, sex, and ethnicity. Over the past decade, a series of studies using a survey methodology, referred to as Refractive Error Study in Children (RESC), were performed in populations with different ethnic origins and cultural settings. These studies confirmed that the prevalence of uncorrected refractive errors is considerably high among children in low- and middle-income countries. Furthermore, uncorrected refractive error has been noted to have extensive social and economic impacts, such as limiting the educational and employment opportunities of economically active persons, healthy individuals, and communities. The key public health challenges presented by uncorrected refractive errors, the leading cause of vision impairment across the world, require urgent attention. To address these issues, it is critical to focus on the development of human resources and sustainable methods of service delivery. This paper discusses three core pillars for addressing the challenges posed by uncorrected refractive errors: Human Resource (HR) Development, Service Development, and Social Entrepreneurship.

  10. Intelligent OCR Processing.

    ERIC Educational Resources Information Center

    Sun, Wei; And Others

    1992-01-01

    Identifies types and distributions of errors in text produced by optical character recognition (OCR) and proposes a process using machine learning techniques to recognize and correct errors in OCR texts. Results of experiments indicating that this strategy can reduce human interaction required for error correction are reported. (25 references)…

  11. Immortalization of normal human mammary epithelial cells in two steps by direct targeting of senescence barriers does not require gross genomic alterations

    DOE PAGES

    Garbe, James C.; Vrba, Lukas; Sputova, Klara; ...

    2014-10-29

    Telomerase reactivation and immortalization are critical for human carcinoma progression. However, little is known about the mechanisms controlling this crucial step, due in part to the paucity of experimentally tractable model systems that can examine human epithelial cell immortalization as it might occur in vivo. We achieved efficient non-clonal immortalization of normal human mammary epithelial cells (HMEC) by directly targeting the 2 main senescence barriers encountered by cultured HMEC. The stress-associated stasis barrier was bypassed using shRNA to p16INK4; replicative senescence due to critically shortened telomeres was bypassed in post-stasis HMEC by c-MYC transduction. Thus, 2 pathologically relevant oncogenic agents are sufficient to immortally transform normal HMEC. The resultant non-clonal immortalized lines exhibited normal karyotypes. Most human carcinomas contain genomically unstable cells, with widespread instability first observed in vivo in pre-malignant stages; in vitro, instability is seen as finite cells with critically shortened telomeres approach replicative senescence. Our results support our hypotheses that: (1) telomere-dysfunction induced genomic instability in pre-malignant finite cells may generate the errors required for telomerase reactivation and immortalization, as well as many additional “passenger” errors carried forward into resulting carcinomas; (2) genomic instability during cancer progression is needed to generate errors that overcome tumor suppressive barriers, but is not required per se; bypassing the senescence barriers by direct targeting eliminated the need for genomic errors to generate immortalization. Achieving efficient HMEC immortalization, in the absence of “passenger” genomic errors, should facilitate examination of telomerase regulation during human carcinoma progression, and exploration of agents that could prevent immortalization.

  12. Shopper marketing nutrition interventions.

    PubMed

    Payne, Collin R; Niculescu, Mihai; Just, David R; Kelly, Michael P

    2014-09-01

    Grocery stores represent a context in which a majority of people's food purchases occur. Considering the nutrition quality of the population's food intake has dramatically decreased, understanding how to improve food choice in the grocery store is paramount to healthier living. In this work, we detail the type of financial resources from which shoppers could draw (i.e., personal income and benefits from government food assistance programs to low income populations) and explain how these financial resources are allocated in the grocery store (i.e., planned, unplanned, error). Subsequently, we identify a conceptual framework for shopper marketing nutrition interventions that targets unplanned fruit and vegetable purchases (i.e., slack, or willingness to spend minus list items). Targeting slack for fresh fruit and vegetable purchases allows retailers to benefit economically (i.e., fruit and vegetables are higher margin) and allows shoppers to improve their nutrition without increasing their budgets (i.e., budget neutrality). We also provide preliminary evidence of what in-store marketing of fresh fruits and vegetables could entail by modifying grocery carts and grocery floors to provide information of what is common, normal, or appropriate fruit and vegetable purchases. In each example, fresh fruit and vegetable purchases increased and evidence suggested shopper budget neutrality. To provide context for these results, we detail measurement tools that can be used to measure shopper behaviors, purchases, and consumption patterns. Finally, we address theoretical, practical, and policy implications of shopper marketing nutrition interventions. Copyright © 2014 Elsevier Inc. All rights reserved.

  13. The stability of hydrogen ion and specific conductance in filtered wet-deposition samples stored at ambient temperatures

    USGS Publications Warehouse

    Gordon, J.D.; Schroder, L.J.; Morden-Moore, A. L.; Bowersox, V.C.

    1995-01-01

    Separate experiments by the U.S. Geological Survey (USGS) and the Illinois State Water Survey Central Analytical Laboratory (CAL) independently assessed the stability of hydrogen ion and specific conductance in filtered wet-deposition samples stored at ambient temperatures. The USGS experiment represented a test of sample stability under a diverse range of conditions, whereas the CAL experiment was a controlled test of sample stability. In the experiment by the USGS, a statistically significant (α = 0.05) relation between [H+] and time was found for the composited filtered, natural, wet-deposition solution when all reported values are included in the analysis. However, if two outlying pH values most likely representing measurement error are excluded from the analysis, the change in [H+] over time was not statistically significant. In the experiment by the CAL, randomly selected samples were reanalyzed between July 1984 and February 1991. The original analysis and reanalysis pairs revealed that [H+] differences, although very small, were statistically different from zero, whereas specific-conductance differences were not. Nevertheless, the results of the CAL reanalysis project indicate there appears to be no consistent, chemically significant degradation in sample integrity with regard to [H+] and specific conductance while samples are stored at room temperature at the CAL. Based on the results of the CAL and USGS studies, the short-term (45-60 day) stability of [H+] and specific conductance in natural filtered wet-deposition samples that are shipped and stored unchilled at ambient temperatures was satisfactory.

  14. 21 CFR 110.35 - Sanitary operations.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... against contamination of food. (1) Food-contact surfaces used for manufacturing or holding low-moisture... HUMAN CONSUMPTION CURRENT GOOD MANUFACTURING PRACTICE IN MANUFACTURING, PACKING, OR HOLDING HUMAN FOOD... for contamination. Only the following toxic materials may be used or stored in a plant where food is...

  15. Development of a bolus injection system for regional deposition studies of nanoparticles in the human respiratory system

    NASA Astrophysics Data System (ADS)

    Koujalagi, V.; Ramesh, S. L.; Gunarathne, G. P. P.; Semple, S.; Ayres, J. G.

    2009-02-01

    This study presents the work carried out in developing a precision bolus injection system in order to understand the regional deposition of nanoparticles (NP) in human lung. A real-time control system has been developed that is capable of storing graphite NP, assessing human breathing pattern and delivering a bolus of the stored NP at a pre-determined instance of the inhalation phase of breathing. This will form the basis for further development of a system to deliver radioactive nanoparticles to enable 3-dimensional lung imaging using techniques such as positron emission tomography (PET). The system may then be used to better understand the actual regional deposition in human lung, which could validate or challenge the current computational lung models such as that published by the International Commission for Radiation Protection (ICRP-1994). A dose related response to inhaled PM can possibly be shown, which can be used to review the current workplace exposure limits (WELs).

  16. Recognizing and managing errors of cognitive underspecification.

    PubMed

    Duthie, Elizabeth A

    2014-03-01

    James Reason describes cognitive underspecification as incomplete communication that creates a knowledge gap. Errors arise when an information mismatch occurs in bridging that gap, with a resulting lack of shared mental models during the communication process. There is a paucity of studies in health care examining this cognitive error and the role it plays in patient harm. The goal of the following case analyses is to facilitate accurate recognition, identify how the error contributes to patient harm, and suggest appropriate management strategies. Reason's human error theory is applied in case analyses of errors of cognitive underspecification. Sidney Dekker's theory of human incident investigation is applied to event investigation to facilitate identification of this little-recognized error. Contributory factors leading to errors of cognitive underspecification include workload demands, interruptions, inexperienced practitioners, and lack of a shared mental model. Detecting errors of cognitive underspecification relies on blame-free listening and timely incident investigation. Strategies for interception include two-way interactive communication, standardization of communication processes, and technological support to ensure timely access to documented clinical information. Although errors of cognitive underspecification arise at the sharp end with the care provider, effective management depends upon system redesign that mitigates the latent contributory factors. Cognitive underspecification is ubiquitous whenever communication occurs. Accurate identification is essential if effective system redesign is to occur.

  17. Comprehensive analysis of a medication dosing error related to CPOE.

    PubMed

    Horsky, Jan; Kuperman, Gilad J; Patel, Vimla L

    2005-01-01

    This case study of a serious medication error demonstrates the necessity of a comprehensive methodology for the analysis of failures in interaction between humans and information systems. The authors used a novel approach to analyze a dosing error related to computer-based ordering of potassium chloride (KCl). The method included a chronological reconstruction of events and their interdependencies from provider order entry usage logs, semistructured interviews with involved clinicians, and interface usability inspection of the ordering system. Information collected from all sources was compared and evaluated to understand how the error evolved and propagated through the system. In this case, the error was the product of faults in interaction among human and system agents that methods limited in scope to their distinct analytical domains would not identify. The authors characterized errors in several converging aspects of the drug ordering process: confusing on-screen laboratory results review, system usability difficulties, user training problems, and suboptimal clinical system safeguards that all contributed to a serious dosing error. The results of the authors' analysis were used to formulate specific recommendations for interface layout and functionality modifications, suggest new user alerts, propose changes to user training, and address error-prone steps of the KCl ordering process to reduce the risk of future medication dosing errors.

  18. Effect of Frozen Human Epidermis Storage Duration and Cryoprotectant on Barrier Function using Two Model Compounds

    PubMed Central

    Barbero, Ana M.; Frasch, H. Frederick

    2015-01-01

    Skin is commonly stored frozen and then thawed prior to use for in-vitro permeation experiments. Does frozen storage of skin alter its barrier property? Numerous studies have found contradictory answers to this question. In this study, the steady state flux and lag time of diethyl phthalate (DEP) were measured for fresh human skin and skin frozen at −85 °C for 1, 2, 3, 6, 9, 12, and 18 months, with 10% glycerol as cryoprotective agent. No significant differences in steady state flux were found between fresh and previously frozen samples (P = 0.6). For lag time, a significant (P = 0.002) difference was found among all groups but comparisons with fresh skin were not significant. Does glycerol have a cryoprotective effect? The steady state flux and lag time of DEP and caffeine were measured through human skin stored at −85 °C for up to 12 months with and without 10 % glycerol. No significant differences in steady state flux or lag time were found between samples stored with or without glycerol for either DEP or caffeine (P ≥ 0.17). These findings support the use of frozen skin to measure the passive permeation of chemicals in studies unconcerned with viability and metabolism. PMID:26606593

  19. 78 FR 17155 - Standards for the Growing, Harvesting, Packing, and Holding of Produce for Human Consumption...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-20

    ...The Food and Drug Administration (FDA or we) is correcting the preamble to a proposed rule that published in the Federal Register of January 16, 2013. That proposed rule would establish science-based minimum standards for the safe growing, harvesting, packing, and holding of produce, meaning fruits and vegetables grown for human consumption. FDA proposed these standards as part of our implementation of the FDA Food Safety Modernization Act. The document published with several technical errors, including some errors in cross references, as well as several errors in reference numbers cited throughout the document. This document corrects those errors. We are also placing a corrected copy of the proposed rule in the docket.

  20. Patient identification using a near-infrared laser scanner

    NASA Astrophysics Data System (ADS)

    Manit, Jirapong; Bremer, Christina; Schweikard, Achim; Ernst, Floris

    2017-03-01

    We propose a new biometric approach in which the tissue thickness of a person's forehead is used as a biometric feature. Given that the spatial registration of two 3D laser scans of the same human face usually produces a low error value, the principle of point cloud registration and its error metric can be applied to human classification techniques. However, by considering only the spatial error, it is not possible to reliably verify a person's identity. We propose to use a novel near-infrared laser-based head tracking system to determine an additional feature, the tissue thickness, and include this in the error metric. Using MRI as a ground truth, data from the foreheads of 30 subjects were collected, from which a 4D reference point cloud was created for each subject. The measurements from the near-infrared system were registered with all reference point clouds using the ICP algorithm. Afterwards, the spatial and tissue thickness errors were extracted, forming a 2D feature space. For all subjects, the lowest feature distance resulted from the registration of a measurement and the reference point cloud of the same person. The combined registration error features yielded two clusters in the feature space, one from the same subject and another from the other subjects. When only the tissue thickness error was considered, these clusters were less distinct but still present. These findings could help to raise safety standards for head and neck cancer patients and lay the foundation for a future human identification technique.
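
    A schematic of the classification step described above: after registering one scan against each subject's stored reference, the subject with the smallest combined spatial and tissue-thickness registration error is selected. The weighting, error values, and subject names below are hypothetical.

    ```python
    import numpy as np

    def identify(error_pairs, names, w_spatial=1.0, w_tissue=1.0):
        """Given (spatial, tissue-thickness) registration errors for one
        measurement against each reference point cloud, return the subject
        with the smallest weighted combined error."""
        e = np.asarray(error_pairs, dtype=float)
        combined = np.hypot(w_spatial * e[:, 0], w_tissue * e[:, 1])
        best = int(np.argmin(combined))
        return names[best], float(combined[best])

    # Hypothetical ICP results (errors in mm) for one forehead scan
    # registered against three stored 4D reference point clouds.
    pairs = [(0.4, 0.3), (1.8, 2.1), (1.2, 2.6)]
    print(identify(pairs, ["subject_01", "subject_02", "subject_03"]))
    ```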

  1. Human factors evaluation of remote afterloading brachytherapy. Volume 2, Function and task analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Callan, J.R.; Gwynne, J.W. III; Kelly, T.T.

    1995-05-01

    A human factors project on the use of nuclear by-product material to treat cancer using remotely operated afterloaders was undertaken by the Nuclear Regulatory Commission. The purpose of the project was to identify factors that contribute to human error in the system for remote afterloading brachytherapy (RAB). This report documents the findings from the first phase of the project, which involved an extensive function and task analysis of RAB. This analysis identified the functions and tasks in RAB, made preliminary estimates of the likelihood of human error in each task, and determined the skills needed to perform each RAB task. The findings of the function and task analysis served as the foundation for the remainder of the project, which evaluated four major aspects of the RAB system linked to human error: human-system interfaces; procedures and practices; training and qualifications of RAB staff; and organizational practices and policies. At its completion, the project identified and prioritized areas for recommended NRC and industry attention based on all of the evaluations and analyses.

  2. Prediction error induced motor contagions in human behaviors.

    PubMed

    Ikegami, Tsuyoshi; Ganesh, Gowrishankar; Takeuchi, Tatsuya; Nakamoto, Hiroki

    2018-05-29

    Motor contagions refer to implicit effects on one's actions induced by observed actions. Motor contagions are believed to be induced simply by action observation and cause an observer's action to become similar to the action observed. In contrast, here we report a new motor contagion that is induced only when the observation is accompanied by prediction errors - differences between actions one observes and those he/she predicts or expects. In two experiments, one on whole-body baseball pitching and another on simple arm reaching, we show that the observation of the same action induces distinct motor contagions, depending on whether prediction errors are present or not. In the absence of prediction errors, as in previous reports, participants' actions changed to become similar to the observed action, while in the presence of prediction errors, their actions changed to diverge away from it, suggesting distinct effects of action observation and action prediction on human actions. © 2018, Ikegami et al.

  3. Application Analysis and Decision with Dynamic Analysis

    DTIC Science & Technology

    2014-12-01

    pushes the application file and the JSON file containing the metadata from the database. When the 2 files are in place, the consumer thread starts...human analysts and stores it in a database. It would then use some of these data to generate a risk score for the application. However, static analysis...and store them in the primary A2D database for future analysis.

  4. Shining a Light on Black Holes in Keratinocytes.

    PubMed

    Bowman, Shanna L; Marks, Michael S

    2018-03-01

    The mechanisms by which melanins are transferred from melanocytes and stored within keratinocytes to generate skin pigmentation are hotly debated. Correia et al. and Hurbain et al. provide evidence that melanin cores of melanosomes are secreted from melanocytes and taken up and stored within non-degradative membranous organelles in keratinocytes in the basal epidermis of human skin. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  5. STARS 2.0: 2nd-generation open-source archiving and query software

    NASA Astrophysics Data System (ADS)

    Winegar, Tom

    2008-07-01

    The Subaru Telescope is in the process of developing an open-source alternative to the 1st-generation software and databases (STARS 1) used for archiving and query. For STARS 2, we have chosen PHP and Python for scripting and MySQL as the database software. We have collected feedback from staff and observers, and used this feedback to significantly improve the design and functionality of our future archiving and query software. Archiving - We identified two weaknesses in the 1st-generation STARS archiving software: a complex and inflexible table structure, and uncoordinated system administration for our business model: taking pictures from the summit and archiving them in both Hawaii and Japan. We adopted a simplified and normalized table structure with passive keyword collection, and we are designing an archive-to-archive file transfer system that automatically reports real-time status and error conditions and permits error recovery. Query - We identified several weaknesses in the 1st-generation STARS query software: inflexible query tools, poor sharing of calibration data, and no automatic file transfer mechanisms to observers. We are developing improved query tools and sharing of calibration data, and multi-protocol unassisted file transfer mechanisms for observers. In the process, we have redefined a 'query': from an invisible search result that can only be transferred once in-house, with little status and error reporting and no error recovery - to a stored search result that can be monitored, transferred to different locations with multiple protocols, reporting status and error conditions and permitting recovery from errors.
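
    A minimal sketch of the transfer-with-status idea, assuming a plain copy-verify-retry loop; it is not the Subaru implementation, and the back-off policy and size check are illustrative choices.

    ```python
    import shutil
    import time
    from pathlib import Path

    def transfer(src: Path, dst: Path, retries: int = 3, log=print):
        """Copy one archive file to a destination file path, reporting
        status and recovering from transient errors by retrying."""
        for attempt in range(1, retries + 1):
            try:
                log(f"transfer {src.name}: attempt {attempt}")
                shutil.copy2(src, dst)
                if src.stat().st_size != dst.stat().st_size:
                    raise OSError("size mismatch after copy")
                log(f"transfer {src.name}: OK")
                return True
            except OSError as err:
                log(f"transfer {src.name}: error: {err}")
                time.sleep(2 ** attempt)   # back off before retrying
        log(f"transfer {src.name}: FAILED after {retries} attempts")
        return False
    ```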

  6. Morpho-syntactic processing of Arabic plurals after aphasia: dissecting lexical meaning from morpho-syntax within word boundaries.

    PubMed

    Khwaileh, Tariq; Body, Richard; Herbert, Ruth

    2015-01-01

    Within the domain of inflectional morpho-syntax, differential processing of regular and irregular forms has been found in healthy speakers and in aphasia. One view assumes that irregular forms are retrieved as full entities, while regular forms are compiled on-line. An alternative view holds that a single mechanism oversees both regular and irregular forms. Arabic offers an opportunity to study this phenomenon, as Arabic nouns contain a consonantal root, delivering lexical meaning, and a vocalic pattern, delivering syntactic information such as gender and number. The aim of this study is to investigate morpho-syntactic processing of regular (sound) and irregular (broken) Arabic plurals in patients with morpho-syntactic impairment. Three participants with acquired agrammatic aphasia produced plural forms in a picture-naming task. We measured overall response accuracy, then analysed lexical errors and morpho-syntactic errors separately. Error analysis revealed different patterns of morpho-syntactic errors depending on the type of pluralization (sound vs broken). Omissions formed the vast majority of errors in sound plurals, while substitution was the only error mechanism that occurred in broken plurals. The dissociation was statistically significant for retrieval of morpho-syntactic information (vocalic pattern) but not for lexical meaning (consonantal root), suggesting that the participants' selective impairment was an effect of the morpho-syntax of plurals. These results suggest that irregular plural forms are stored, while regular forms are derived. The current findings support findings from other languages and provide a new analysis technique for data from languages with non-concatenative morpho-syntax.

  7. HRD in China.

    ERIC Educational Resources Information Center

    Wigglesworth, David C.

    1981-01-01

    Discusses the Chinese commitment to training and human resource development. Describes a department store, the Chinese Enterprise Management Association, a tool factory, and management development centers. (JOW)

  8. Synchronizing movements with the metronome: nonlinear error correction and unstable periodic orbits.

    PubMed

    Engbert, Ralf; Krampe, Ralf Th; Kurths, Jürgen; Kliegl, Reinhold

    2002-02-01

    The control of human hand movements is investigated in a simple synchronization task. We propose and analyze a stochastic model based on nonlinear error correction; a mechanism which implies the existence of unstable periodic orbits. This prediction is tested in an experiment with human subjects. We find that our experimental data are in good agreement with numerical simulations of our theoretical model. These results suggest that feedback control of the human motor systems shows nonlinear behavior. Copyright 2001 Elsevier Science (USA).
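
    A toy simulation of a stochastic nonlinear error-correction process of the general kind proposed; the particular feedback function (a linear term plus a saturating term) and the parameter values are assumptions for illustration, not the authors' fitted model.

    ```python
    import numpy as np

    def simulate(n=1000, a=0.6, b=0.4, noise_sd=5.0, seed=1):
        """Simulate tap-to-metronome asynchronies e[n] (ms) under
        nonlinear error correction plus timing noise."""
        rng = np.random.default_rng(seed)
        e = np.zeros(n)
        for k in range(n - 1):
            correction = a * e[k] + b * np.tanh(e[k])   # nonlinear feedback
            e[k + 1] = e[k] - correction + rng.normal(0.0, noise_sd)
        return e

    asynchronies = simulate()
    print(f"mean {asynchronies.mean():.2f} ms, sd {asynchronies.std():.2f} ms")
    ```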

  9. Optimizing methods and dodging pitfalls in microbiome research.

    PubMed

    Kim, Dorothy; Hofstaedter, Casey E; Zhao, Chunyu; Mattei, Lisa; Tanes, Ceylan; Clarke, Erik; Lauder, Abigail; Sherrill-Mix, Scott; Chehoud, Christel; Kelsen, Judith; Conrad, Máire; Collman, Ronald G; Baldassano, Robert; Bushman, Frederic D; Bittinger, Kyle

    2017-05-05

    Research on the human microbiome has yielded numerous insights into health and disease, but also has resulted in a wealth of experimental artifacts. Here, we present suggestions for optimizing experimental design and avoiding known pitfalls, organized in the typical order in which studies are carried out. We first review best practices in experimental design and introduce common confounders such as age, diet, antibiotic use, pet ownership, longitudinal instability, and microbial sharing during cohousing in animal studies. Typically, samples will need to be stored, so we provide data on best practices for several sample types. We then discuss design and analysis of positive and negative controls, which should always be run with experimental samples. We introduce a convenient set of non-biological DNA sequences that can be useful as positive controls for high-volume analysis. Careful analysis of negative and positive controls is particularly important in studies of samples with low microbial biomass, where contamination can comprise most or all of a sample. Lastly, we summarize approaches to enhancing experimental robustness by careful control of multiple comparisons and to comparing discovery and validation cohorts. We hope the experimental tactics summarized here will help researchers in this exciting field advance their studies efficiently while avoiding errors.

  10. Response-cue interval effects in extended-runs task switching: memory, or monitoring?

    PubMed

    Altmann, Erik M

    2017-09-26

    This study investigated effects of manipulating the response-cue interval (RCI) in the extended-runs task-switching procedure. In this procedure, a task cue is presented at the start of a run of trials and then withdrawn, such that the task has to be stored in memory to guide performance until the next task cue is presented. The effects of the RCI manipulation were not as predicted by an existing model of memory processes in task switching (Altmann and Gray, Psychol Rev 115:602-639, 2008), suggesting that either the model is incorrect or the RCI manipulation did not have the intended effect. The manipulation did produce a theoretically meaningful pattern, in the form of a main effect on response time that was not accompanied by a similar effect on the error rate. This pattern, which replicated across two experiments, is interpreted here in terms of a process that monitors for the next task cue, with a longer RCI acting as a stronger signal that a cue is about to appear. The results have implications for the human factors of dynamic task environments in which critical events occur unpredictably.

  11. Synthesizer: Expediting synthesis studies from context-free data with information retrieval techniques.

    PubMed

    Gandy, Lisa M; Gumm, Jordan; Fertig, Benjamin; Thessen, Anne; Kennish, Michael J; Chavan, Sameer; Marchionni, Luigi; Xia, Xiaoxin; Shankrit, Shambhavi; Fertig, Elana J

    2017-01-01

    Scientists have unprecedented access to a wide variety of high-quality datasets. These datasets, which are often independently curated, commonly use unstructured spreadsheets to store their data. Standardized annotations are essential to perform synthesis studies across investigators, but are often not used in practice. Therefore, accurately combining records in spreadsheets from differing studies requires tedious and error-prone human curation. These efforts result in a significant time and cost barrier to synthesis research. We propose an information retrieval inspired algorithm, Synthesize, that merges unstructured data automatically based on both column labels and values. Application of the Synthesize algorithm to cancer and ecological datasets had high accuracy (on the order of 85-100%). We further implement Synthesize in an open source web application, Synthesizer (https://github.com/lisagandy/synthesizer). The software accepts input as spreadsheets in comma separated value (CSV) format, visualizes the merged data, and outputs the results as a new spreadsheet. Synthesizer includes an easy to use graphical user interface, which enables the user to finish combining data and obtain perfect accuracy. Future work will allow detection of units to automatically merge continuous data and application of the algorithm to other data formats, including databases.
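
    A much-simplified sketch in the spirit of Synthesize's label matching, using plain string similarity on column labels (the published algorithm also exploits column values); the threshold and example labels are invented.

    ```python
    import difflib

    def match_columns(cols_a, cols_b, threshold=0.5):
        """Pair each column label in cols_a with its most similar label
        in cols_b, keeping pairs whose similarity clears the threshold."""
        sim = lambda a, b: difflib.SequenceMatcher(None, a.lower(), b.lower()).ratio()
        pairs = []
        for a in cols_a:
            best = max(cols_b, key=lambda b: sim(a, b))
            score = sim(a, best)
            if score >= threshold:
                pairs.append((a, best, round(score, 2)))
        return pairs

    print(match_columns(["Site ID", "Salinity (ppt)", "Date"],
                        ["site_id", "salinity", "sample date"]))
    ```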

  13. Tactile perception and working memory in rats and humans

    PubMed Central

    Fassihi, Arash; Akrami, Athena; Esmaeili, Vahid; Diamond, Mathew E.

    2014-01-01

    Primates can store sensory stimulus parameters in working memory for subsequent manipulation, but until now, there has been no demonstration of this capacity in rodents. Here we report tactile working memory in rats. Each stimulus is a vibration, generated as a series of velocity values sampled from a normal distribution. To perform the task, the rat positions its whiskers to receive two such stimuli, “base” and “comparison,” separated by a variable delay. It then judges which stimulus had greater velocity SD. In analogous experiments, humans compare two vibratory stimuli on the fingertip. We demonstrate that the ability of rats to hold base stimulus information (for up to 8 s) and their acuity in assessing stimulus differences overlap the performance demonstrated by humans. This experiment highlights the ability of rats to perceive the statistical structure of vibrations and reveals their previously unknown capacity to store sensory information in working memory. PMID:24449850
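
    The stimulus construction is straightforward to mimic. The following sketch of a single delayed-comparison trial uses hypothetical SD values and sample counts; the actual vibration parameters used with the rats and human subjects are those reported in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def vibration(sd, n_samples=1000):
    """A vibration stimulus: velocity values drawn from a zero-mean normal."""
    return rng.normal(0.0, sd, n_samples)

def trial(sd_base, sd_comparison):
    """One delayed-comparison trial: which stimulus had the greater velocity SD?"""
    base = vibration(sd_base)               # presented first, held in working memory
    comparison = vibration(sd_comparison)   # presented after the variable delay
    return "comparison" if comparison.std() > base.std() else "base"

print(trial(sd_base=1.0, sd_comparison=1.4))
```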

  14. A Fuzzy Query Mechanism for Human Resource Websites

    NASA Astrophysics Data System (ADS)

    Lai, Lien-Fu; Wu, Chao-Chin; Huang, Liang-Tsung; Kuo, Jung-Chih

    Users' preferences often contain imprecision and uncertainty that are difficult for traditional human resource websites to deal with. In this paper, we apply fuzzy logic theory to develop a fuzzy query mechanism for human resource websites. First, a storing mechanism is proposed to store fuzzy data in conventional database management systems without modifying DBMS models. Second, a fuzzy query language is proposed for users to make fuzzy queries on fuzzy databases. A user's fuzzy requirement can be expressed by a fuzzy query consisting of a set of fuzzy conditions. Third, each fuzzy condition is associated with a fuzzy importance to differentiate between fuzzy conditions according to their degrees of importance. Fourth, the fuzzy weighted average is utilized to aggregate all fuzzy conditions based on their degrees of importance and degrees of matching. Through the mutual compensation of all fuzzy conditions, the ordering of query results can be obtained according to the user's preference.
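
    The aggregation scheme can be sketched compactly. The example below assumes triangular membership functions and treats each fuzzy importance as a scalar weight (in the paper the importances are themselves fuzzy, so this is a simplification); the attribute names and numbers are invented.

```python
def triangular(a, b, c):
    """Membership function of a triangular fuzzy set (a, b, c)."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return mu

def fuzzy_weighted_average(conditions, candidate):
    """Aggregate degrees of matching, weighted by degrees of importance.

    conditions: list of (attribute, membership_fn, importance) tuples.
    candidate:  dict mapping attribute -> crisp value.
    """
    num = sum(imp * mu(candidate[attr]) for attr, mu, imp in conditions)
    den = sum(imp for _, _, imp in conditions)
    return num / den if den else 0.0

# A query like "salary around 50k (very important), age around 30 (less so)".
query = [
    ("salary", triangular(40_000, 50_000, 60_000), 0.9),
    ("age",    triangular(25, 30, 35),             0.4),
]
applicants = {
    "A": {"salary": 48_000, "age": 33},
    "B": {"salary": 58_000, "age": 29},
}
ranked = sorted(applicants,
                key=lambda k: fuzzy_weighted_average(query, applicants[k]),
                reverse=True)
print(ranked)  # ordering of results according to the user's preference
```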

  15. Store-operated Ca2+ entry regulates Ca2+-activated chloride channels and eccrine sweat gland function.

    PubMed

    Concepcion, Axel R; Vaeth, Martin; Wagner, Larry E; Eckstein, Miriam; Hecht, Lee; Yang, Jun; Crottes, David; Seidl, Maximilian; Shin, Hyosup P; Weidinger, Carl; Cameron, Scott; Turvey, Stuart E; Issekutz, Thomas; Meyts, Isabelle; Lacruz, Rodrigo S; Cuk, Mario; Yule, David I; Feske, Stefan

    2016-11-01

    Eccrine sweat glands are essential for sweating and thermoregulation in humans. Loss-of-function mutations in the Ca2+ release-activated Ca2+ (CRAC) channel genes ORAI1 and STIM1 abolish store-operated Ca2+ entry (SOCE), and patients with these CRAC channel mutations suffer from anhidrosis and hyperthermia at high ambient temperatures. Here we have shown that CRAC channel-deficient patients and mice with ectodermal tissue-specific deletion of Orai1 (Orai1K14Cre) or Stim1 and Stim2 (Stim1/2K14Cre) failed to sweat despite normal sweat gland development. SOCE was absent in agonist-stimulated sweat glands from Orai1K14Cre and Stim1/2K14Cre mice and human sweat gland cells lacking ORAI1 or STIM1 expression. In Orai1K14Cre mice, abolishment of SOCE was associated with impaired chloride secretion by primary murine sweat glands. In human sweat gland cells, SOCE mediated by ORAI1 was necessary for agonist-induced chloride secretion and activation of the Ca2+-activated chloride channel (CaCC) anoctamin 1 (ANO1, also known as TMEM16A). By contrast, expression of TMEM16A, the water channel aquaporin 5 (AQP5), and other regulators of sweat gland function was normal in the absence of SOCE. Our findings demonstrate that Ca2+ influx via store-operated CRAC channels is essential for CaCC activation, chloride secretion, and sweat production in humans and mice.

  16. Human error in aviation operations

    NASA Technical Reports Server (NTRS)

    Billings, C. E.; Lauber, J. K.; Cooper, G. E.

    1974-01-01

    This report is a brief description of research being undertaken by the National Aeronautics and Space Administration. The project is designed to seek out factors in the aviation system which contribute to human error, and to search for ways of minimizing the potential threat posed by these factors. The philosophy and assumptions underlying the study are discussed, together with an outline of the research plan.

  17. Human Factors Directions for Civil Aviation

    NASA Technical Reports Server (NTRS)

    Hart, Sandra G.

    2002-01-01

    Despite considerable progress in understanding human capabilities and limitations, incorporating human factors into aircraft design, operation, and certification, and the emergence of new technologies designed to reduce workload and enhance human performance in the system, most aviation accidents still involve human errors. Such errors occur as a direct or indirect result of untimely, inappropriate, or erroneous actions (or inactions) by apparently well-trained and experienced pilots, controllers, and maintainers. The field of human factors has solved many of the more tractable problems related to simple ergonomics, cockpit layout, symbology, and so on. We have learned much about the relationships between people and machines, but know less about how to form successful partnerships between humans and the information technologies that are beginning to play a central role in aviation. Significant changes envisioned in the structure of the airspace, pilots' and controllers' roles and responsibilities, and air/ground technologies will require a similarly significant investment in human factors during the next few decades to ensure the effective integration of pilots, controllers, dispatchers, and maintainers into the new system. Many of the topics that will be addressed are not new because progress in crucial areas, such as eliminating human error, has been slow. A multidisciplinary approach that capitalizes upon human studies and new classes of information, computational models, intelligent analytical tools, and close collaborations with organizations that build, operate, and regulate aviation technology will ensure that the field of human factors meets the challenge.

  18. Neural bases of imitation and pantomime in acute stroke patients: distinct streams for praxis.

    PubMed

    Hoeren, Markus; Kümmerer, Dorothee; Bormann, Tobias; Beume, Lena; Ludwig, Vera M; Vry, Magnus-Sebastian; Mader, Irina; Rijntjes, Michel; Kaller, Christoph P; Weiller, Cornelius

    2014-10-01

    Apraxia is a cognitive disorder of skilled movements that characteristically affects the ability to imitate meaningless gestures, or to pantomime the use of tools. Despite substantial research, the neural underpinnings of imitation and pantomime have remained debated. An influential model states that higher motor functions are supported by different processing streams. A dorso-dorsal stream may mediate movements based on physical object properties, like reaching or grasping, whereas skilled tool use or pantomime rely on action representations stored within a ventro-dorsal stream. However, given variable results of past studies, the role of the two streams for imitation of meaningless gestures has remained uncertain, and the importance of the ventro-dorsal stream for pantomime of tool use has been questioned. To clarify the involvement of ventral and dorsal streams in imitation and pantomime, we performed voxel-based lesion-symptom mapping in a sample of 96 consecutive left-hemisphere stroke patients (mean age ± SD, 63.4 ± 14.8 years, 56 male). Patients were examined in the acute phase after ischaemic stroke (after a mean of 5.3, maximum 10 days) to minimize, as far as possible, interference of brain reorganization with reliable lesion-symptom mapping. Patients were asked to imitate 20 meaningless hand and finger postures, and to pantomime the use of 14 common tools depicted as line drawings. Following the distinction between movement engrams and action semantics, pantomime errors were characterized as either movement or content errors, respectively. Whereas movement errors referred to incorrect spatio-temporal features of overall recognizable movements, content errors reflected an inability to associate tools with their prototypical actions. Both imitation and pantomime deficits were associated with lesions within the lateral occipitotemporal cortex, posterior inferior parietal lobule, posterior intraparietal sulcus and superior parietal lobule. However, the areas specifically related to the dorso-dorsal stream, i.e. posterior intraparietal sulcus and superior parietal lobule, were more strongly associated with imitation. Conversely, in contrast to imitation, pantomime deficits were associated with ventro-dorsal regions such as the supramarginal gyrus, as well as brain structures assigned to the ventral stream, such as the extreme capsule. Ventral stream involvement was especially clear for content errors, which were related to anterior temporal damage. However, movement errors were not consistently associated with a specific lesion location. In summary, our results indicate that imitation mainly relies on the dorso-dorsal stream for visuo-motor conversion and on-line movement control. Conversely, pantomime additionally requires ventro-dorsal and ventral streams for access to stored action engrams and retrieval of tool-action relationships. © The Author (2014). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
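
    At its core, voxel-based lesion-symptom mapping is a mass-univariate comparison: at every voxel, patients whose lesions cover that voxel are compared with patients whose lesions spare it. The toy sketch below uses binary lesion masks and an uncorrected two-sample t-test; a real analysis adds multiple-comparison control and covariates such as lesion volume, and the data here are synthetic.

```python
import numpy as np
from scipy import stats

def vlsm(lesion_masks, scores, min_patients=5):
    """Mass-univariate lesion-symptom mapping.

    lesion_masks: (n_patients, n_voxels) binary array, 1 = voxel lesioned.
    scores:       (n_patients,) behavioural scores (e.g. pantomime errors).
    Returns a t-statistic per voxel (NaN where either group is too small).
    """
    n_patients, n_voxels = lesion_masks.shape
    t_map = np.full(n_voxels, np.nan)
    for v in range(n_voxels):
        lesioned = scores[lesion_masks[:, v] == 1]
        spared = scores[lesion_masks[:, v] == 0]
        if len(lesioned) >= min_patients and len(spared) >= min_patients:
            t_map[v] = stats.ttest_ind(lesioned, spared).statistic
    return t_map

rng = np.random.default_rng(1)
masks = rng.integers(0, 2, size=(40, 6))           # 40 patients, 6 voxels
scores = masks[:, 2] * 2.0 + rng.normal(size=40)   # voxel 2 drives the deficit
print(np.round(vlsm(masks, scores), 2))
```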

  19. 45 CFR 98.102 - Content of Error Rate Reports.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ....102 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND Error Rate Reporting § 98.102 Content of Error Rate Reports. (a) Baseline Submission Report... payments by the total dollar amount of child care payments that the State, the District of Columbia or...

  20. 45 CFR 98.102 - Content of Error Rate Reports.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ....102 Public Welfare Department of Health and Human Services GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND Error Rate Reporting § 98.102 Content of Error Rate Reports. (a) Baseline Submission Report... payments by the total dollar amount of child care payments that the State, the District of Columbia or...

  1. 45 CFR 98.102 - Content of Error Rate Reports.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ....102 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND Error Rate Reporting § 98.102 Content of Error Rate Reports. (a) Baseline Submission Report... payments by the total dollar amount of child care payments that the State, the District of Columbia or...

  2. 45 CFR 98.102 - Content of Error Rate Reports.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ....102 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND Error Rate Reporting § 98.102 Content of Error Rate Reports. (a) Baseline Submission Report... payments by the total dollar amount of child care payments that the State, the District of Columbia or...

  3. 45 CFR 98.100 - Error Rate Report.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND... rates, which is defined as the percentage of cases with an error (expressed as the total number of cases with an error compared to the total number of cases); the percentage of cases with an improper payment...

  4. 42 CFR 407.32 - Prejudice to enrollment rights because of Federal Government misrepresentation, inaction, or error.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Government misrepresentation, inaction, or error. 407.32 Section 407.32 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES MEDICARE PROGRAM SUPPLEMENTARY MEDICAL INSURANCE... enrollment rights because of Federal Government misrepresentation, inaction, or error. If an individual's...

  5. 42 CFR 407.32 - Prejudice to enrollment rights because of Federal Government misrepresentation, inaction, or error.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Government misrepresentation, inaction, or error. 407.32 Section 407.32 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES MEDICARE PROGRAM SUPPLEMENTARY MEDICAL INSURANCE... enrollment rights because of Federal Government misrepresentation, inaction, or error. If an individual's...

  6. Can eye-tracking technology improve situational awareness in paramedic clinical education?

    PubMed

    Williams, Brett; Quested, Andrew; Cooper, Simon

    2013-01-01

    Human factors play a significant part in clinical error. Situational awareness (SA) means being aware of one's surroundings, comprehending the present situation, and being able to predict outcomes. It is a key human skill that, when properly applied, is associated with reducing medical error. Eye-tracking technology can be used to provide an objective and qualitative measure of the initial perception component of SA. Feedback from eye-tracking technology can be used to improve the understanding and teaching of SA in clinical contexts and, consequently, has potential for reducing clinician error and the concomitant adverse events.

  7. [INVITED] Luminescent QR codes for smart labelling and sensing

    NASA Astrophysics Data System (ADS)

    Ramalho, João F. C. B.; António, L. C. F.; Correia, S. F. H.; Fu, L. S.; Pinho, A. S.; Brites, C. D. S.; Carlos, L. D.; André, P. S.; Ferreira, R. A. S.

    2018-05-01

    QR (Quick Response) codes are two-dimensional barcodes composed of special geometric patterns of black modules in a white square background that can encode different types of information with high density and robustness, correcting errors and physical damage and thus keeping the stored information protected. Recently, these codes have gained increased attention as they offer a simple physical tool for quick access to Web sites for advertising and social interaction. Challenges include increasing the storage capacity limit, even though they can store approximately 350 times more information than common barcodes, and encoding different types of characters (e.g., numeric, alphanumeric, kanji and kana). In this work, we fabricate luminescent QR codes based on a poly(methyl methacrylate) substrate coated with organic-inorganic hybrid materials doped with trivalent terbium (Tb3+) and europium (Eu3+) ions, demonstrating an increase in storage capacity per unit area by a factor of two through colour multiplexing, when compared to conventional QR codes. A novel methodology to decode the multiplexed QR codes is developed, based on a colour separation threshold where a decision level is calculated through a maximum-likelihood criterion to minimize the error probability of the demultiplexed modules, maximizing the foreseen total storage capacity. Moreover, the thermal dependence of the emission colour coordinates of the Eu3+/Tb3+-based hybrids enables simultaneous QR code colour multiplexing and may be used to sense temperature (reproducibility higher than 93%), opening new fields of application for QR codes as smart labels for sensing.
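
    The decision-level idea is the classic maximum-likelihood threshold between two intensity distributions. The sketch below assumes the "on" and "off" module intensities in one colour channel are approximately Gaussian with known parameters (the numbers are hypothetical, not the paper's measurements); placing the threshold where the two likelihoods cross minimizes the demultiplexing error probability for equal priors.

```python
import numpy as np

def ml_threshold(m0, s0, m1, s1):
    """Intensity level where two Gaussian likelihoods cross (equal priors).

    Classifying 'module off' ~ N(m0, s0) vs 'module on' ~ N(m1, s1) at this
    level minimizes the probability of a demultiplexing error.
    """
    if np.isclose(s0, s1):
        return (m0 + m1) / 2.0
    # Solve log N(x; m0, s0) = log N(x; m1, s1), a quadratic in x.
    a = 1 / s0**2 - 1 / s1**2
    b = -2 * (m0 / s0**2 - m1 / s1**2)
    c = m0**2 / s0**2 - m1**2 / s1**2 + 2 * np.log(s0 / s1)
    roots = np.roots([a, b, c])
    # Keep the crossing that lies between the two means.
    lo, hi = sorted((m0, m1))
    return float(next(r for r in roots if lo <= r <= hi))

def demultiplex(channel_intensity, threshold):
    """1 if the module is 'on' in this colour channel, else 0."""
    return int(channel_intensity > threshold)

t = ml_threshold(m0=40, s0=8, m1=180, s1=20)   # hypothetical channel statistics
print(round(t, 1), demultiplex(120, t))
```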

  8. A Homing Missile Control System to Reduce the Effects of Radome Diffraction

    NASA Technical Reports Server (NTRS)

    Smith, Gerald L.

    1960-01-01

    The problem of radome diffraction in radar-controlled homing missiles at high speeds and high altitudes is considered from the point of view of developing a control system configuration which will alleviate the deleterious effects of the diffraction. It is shown that radome diffraction is in essence a kinematic feedback of body angular velocities which causes the radar to sense large apparent line-of-sight angular velocities. The normal control system cannot distinguish between the erroneous and actual line-of-sight rates, and entirely wrong maneuvers are produced which result in large miss distances. The problem is resolved by adding to the control system a special-purpose computer which utilizes measured body angular velocity to extract from the radar output true line-of-sight information for use in steering the missile. The computer operates on the principle of sampling and storing the radar output at instants when the body angular velocity is low and using this stored information for maneuvering commands. In addition, when the angular velocity is not low the computer determines a radome diffraction compensation which is subtracted from the radar output to reduce the error in the sampled information. Analog simulation results for the proposed control system operating in a coplanar (vertical plane) attack indicate a potential decrease in miss distance to an order of magnitude below that for a conventional system. Effects of glint noise, random target maneuvers, initial heading errors, and missile maneuverability are considered in the investigation.
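
    The computer's sample-and-store logic can be caricatured in a few lines. In the sketch below the body-rate threshold and the radome slope estimate are invented for illustration; the report derives the actual compensation from radome calibration data, and the real system operates on continuous signals rather than discrete calls.

```python
def steering_rate(radar_rate, body_rate, state,
                  threshold=0.02,      # rad/s, hypothetical "low body rate" level
                  radome_slope=0.05):  # assumed radome error per unit body rate
    """Sample-and-hold filtering of the radar line-of-sight rate.

    When body angular velocity is low, radome diffraction error is small,
    so the radar output is sampled and stored directly. When it is not low,
    an estimated radome contribution is subtracted from the raw output to
    reduce the error in the refreshed sample.
    """
    if abs(body_rate) < threshold:
        state["stored"] = radar_rate                            # trust and store
    else:
        state["stored"] = radar_rate - radome_slope * body_rate  # compensated
    return state["stored"]   # line-of-sight rate used for the steering command

state = {"stored": 0.0}
for radar, body in [(0.010, 0.001), (0.080, 0.300), (0.012, 0.002)]:
    print(steering_rate(radar, body, state))
```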

  9. In vivo measurement of bone aluminum in population living in southern Ontario, Canada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, K.; Aslam,; Pejovic-Milic, A.

    The harmful biological effect of excessive aluminum (Al) load in humans has been well documented in the literature. Al stored in bone, for instance due to dialysis treatment or occupational exposure, can interfere with normal bone remodeling, leading to osteodystrophy, osteoarthritis, or osteomalacia. On the other hand, the relationship between chronic Al exposure and the risk of Alzheimer's disease remains controversial. In this work, the feasibility of in vivo neutron activation analysis (IVNAA) for measuring Al levels in the human hand bone, using the thermal neutron capture reaction ²⁷Al(n,γ)²⁸Al, is reported. This noninvasive diagnostic technique employs a high beam current Tandetron accelerator-based neutron source, an irradiation/shielding cavity, a 4π NaI(Tl) detector system, and a new set of hand bone phantoms. The photon spectra of the irradiated phantom closely resemble those collected from the hands of nonexposed healthy subjects. A protocol was developed using the newly developed hand phantoms, which resulted in a minimum detectable limit (MDL) of 0.29 mg Al in the human hand. Using the ratio of Al to Ca as an index of Al levels per unit bone mass, the MDL was determined as 19.5 ± 1.5 µg Al/g Ca, which is within the range of the measured levels of 20-27 µg Al/g Ca [ICRP, Report of the Task Group on Reference Man, Publication 23 (Pergamon, Oxford, 1975)] found in other in vivo and in vitro studies. Following the feasibility studies conducted with phantoms, the diagnostic technique was used to measure Al levels in the hand bones of 20 healthy human subjects. The mean hand bone Al concentration was determined as 27.1 ± 16.1 (±1 SD) µg Al/g Ca. The average standard error (1σ) in the Al/Ca ratio is 14.0 µg Al/g Ca, which corresponds to an average relative error of 50% in the measured levels of Al/Ca. These results were achieved with a dose equivalent of 17.6 mSv to a hand and an effective dose of 14.4 µSv. This effective dose is approximately half of that received in a chest radiograph examination. It is recommended to investigate the use of the bone Al IVNAA diagnostic technique for in vivo measurements of patients with documented overload of Al in bone.

  10. Use of Bacteriophage MS2 as an Internal Control in Viral Reverse Transcription-PCR Assays

    PubMed Central

    Dreier, Jens; Störmer, Melanie; Kleesiek, Knut

    2005-01-01

    Diagnostic systems based on reverse transcription (RT)-PCR are widely used for the detection of viral genomes in different human specimens. The application of internal controls (IC) to monitor each step of nucleic acid amplification is necessary to prevent false-negative results due to inhibition or human error. In this study, we designed various real-time RT-PCRs utilizing the coliphage MS2 replicase gene, which differ in detection format, amplicon size, and efficiency of amplification. These noncompetitive IC assays, using TaqMan, hybridization probe, or duplex scorpion probe techniques, were tested on the LightCycler and Rotorgene systems. In our approach, clinical specimens were spiked with the control virus to monitor the efficiency of the extraction, reverse transcription, and amplification steps. The MS2 RT-PCR assays were applied as an internal control alongside a second target, hepatitis C virus RNA, in duplex PCR for blood donor screening. The 95% detection limit was calculated by probit analysis to be 44.9 copies per PCR (range, 38.4 to 73.4). Routine application of the MS2 IC assays exhibited low variability, and they can be applied in various RT-PCR assays. MS2 phage lysates were obtained under standard laboratory conditions. The quantification of phage and template RNA was performed by plating assays to determine PFU or via real-time RT-PCR. High stability of the MS2 phage preparations stored at −20°C, 4°C, and room temperature was demonstrated. PMID:16145106
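
    The probit calculation of a 95% detection limit has a standard form: regress hit/miss outcomes on log input copies with a probit link, then invert the fitted line at p = 0.95. The sketch below uses made-up dilution data; the study's raw replicate data are not reproduced here.

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

# Hypothetical dilution series: copies per PCR and detected yes/no replicates.
copies   = np.repeat([200, 100, 50, 25, 12, 6], 8)
detected = np.concatenate([np.ones(8), np.ones(8),
                           [1, 1, 1, 1, 1, 1, 1, 0],
                           [1, 1, 1, 0, 1, 0, 1, 0],
                           [1, 0, 0, 1, 0, 0, 1, 0],
                           [0, 0, 1, 0, 0, 0, 0, 0]])

# Probit regression of detection probability on log10(copies).
X = sm.add_constant(np.log10(copies))
fit = sm.Probit(detected, X).fit(disp=False)
b0, b1 = fit.params

# Solve b0 + b1 * log10(c) = Phi^-1(0.95) for c.
lod95 = 10 ** ((norm.ppf(0.95) - b0) / b1)
print(f"95% detection limit: {lod95:.1f} copies per PCR")
```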

  11. Being fat and smart: A comparative analysis of the fat-brain trade-off in mammals.

    PubMed

    Heldstab, Sandra A; van Schaik, Carel P; Isler, Karin

    2016-11-01

    Humans stand out among non-aquatic mammals by having both an extremely large brain and a relatively large amount of body fat. To understand the evolution of this human peculiarity we report a phylogenetic comparative study of 120 mammalian species, including 30 primates, using seasonal variation in adult body mass as a proxy of the tendency to store fat. Species that rely on storing fat to survive lean periods are expected to be less active because of higher costs of locomotion and have increased predation risk due to reduced agility. Because a fat-storage strategy reduces the net cognitive benefit of a large brain without reducing its cost, such species should be less likely to evolve a larger brain than non-fat-storing species. We therefore predict that the two strategies to buffer food shortages (storing body fat and cognitive flexibility) are compensatory, and thus predict negative co-evolution between relative brain size and seasonal variation in body mass. This trade-off is expected to be stronger in predominantly arboreal species than in more terrestrial ones, as the cost of transporting additional adipose depots is higher for climbing than for horizontal locomotion. We did, indeed, find a significant negative correlation between brain size and coefficient of variation (CV) in body mass in both sexes for the subsample of arboreal species, both in all mammals and within primates. In predominantly terrestrial species, in contrast, this correlation was not significant. We therefore suggest that the adoption of habitually terrestrial locomotor habits, accompanied by a reduced reliance on climbing, has allowed, for a primate of our body size, the unique human combination of unusually large brains and unusually large adipose depots. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Effects of Aged Stored Autologous Red Blood Cells on Human Endothelial Function

    PubMed Central

    Kanias, Tamir; Triulzi, Darrel; Donadee, Chenell; Barge, Suchitra; Badlam, Jessica; Jain, Shilpa; Belanger, Andrea M.; Kim-Shapiro, Daniel B.

    2015-01-01

    Rationale: A major abnormality that characterizes the red cell “storage lesion” is increased hemolysis and reduced red cell lifespan after infusion. Low levels of intravascular hemolysis after transfusion of aged stored red cells disrupt nitric oxide (NO) bioavailability, via an accelerated NO scavenging reaction with cell-free plasma hemoglobin. The degree of intravascular hemolysis post-transfusion and effects on endothelial-dependent vasodilation responses to acetylcholine have not been fully characterized in humans. Objectives: To evaluate the effects of blood aged to the limits of Food and Drug Administration–approved storage time on the human microcirculation and endothelial function. Methods: Eighteen healthy individuals donated 1 U of leukopheresed red cells, divided and autologously transfused into the forearm brachial artery 5 and 42 days after blood donation. Blood samples were obtained from stored blood bag supernatants and the antecubital vein of the infusion arm. Forearm blood flow measurements were performed using strain-gauge plethysmography during transfusion, followed by testing of endothelium-dependent blood flow with increasing doses of intraarterial acetylcholine. Measurements and Main Results: We demonstrate that aged stored blood has higher levels of arginase-1 and cell-free plasma hemoglobin. Compared with 5-day blood, the transfusion of 42-day packed red cells decreases acetylcholine-dependent forearm blood flows. Intravascular venous levels of arginase-1 and cell-free plasma hemoglobin increase immediately after red cell transfusion, with more significant increases observed after infusion of 42-day-old blood. Conclusions: We demonstrate that the transfusion of blood at the limits of Food and Drug Administration–approved storage has a significant effect on the forearm circulation and impairs endothelial function. Clinical trial registered with www.clinicaltrials.gov (NCT 01137656) PMID:26222884

  13. Human urine as test material in 1H NMR-based metabonomics: recommendations for sample preparation and storage.

    PubMed

    Lauridsen, Michael; Hansen, Steen H; Jaroszewski, Jerzy W; Cornett, Claus

    2007-02-01

    Metabonomic approaches are believed to have the capability of revolutionizing diagnosis of diseases and assessment of patient conditions after medical interventions. In order to ensure comparability of metabonomic 1H NMR data from different studies, we suggest validated sample preparation guidelines for human urine based on a stability study that evaluates effects of storage time and temperature, freeze-drying, and the presence of preservatives. The results indicated that human urine samples should be stored at or below -25 degrees C, as no changes in the 1H NMR fingerprints have been observed during storage at this temperature for 26 weeks. Formation of acetate, presumably due to microbial contamination, was occasionally observed in samples stored at 4 degrees C without addition of a preservative. Addition of a preserving agent is not mandatory provided that the samples are stored at -25 degrees C. Thus, no differences were observed between 1H NMR spectra of nonpreserved urines and urines with added sodium azide and stored at -25 degrees C, whereas the presence of sodium fluoride caused a shift of especially citrate resonances. Freeze-drying of urine and reconstitution in D2O at pH 7.4 resulted in the disappearance of the creatinine CH2 signal at δ 4.06 due to deuteration. A study evaluating the effects of phosphate buffer concentration on signal variability, together with an assessment of the probability of citrate or creatinine resonances crossing a bucket border (a boundary between adjacent integrated regions), led to the conclusion that a minimum buffer concentration of 0.3 M is adequate for the normal urines used in this study. However, a final buffer concentration of 1 M will be required for very concentrated urines.

  14. Recognizing the Ordinary as Extraordinary: Insight Into the "Way We Work" to Improve Patient Safety Outcomes.

    PubMed

    Henneman, Elizabeth A

    2017-07-01

    The Institute of Medicine (now National Academy of Medicine) reports "To Err is Human" and "Crossing the Quality Chasm" made explicit three previously unappreciated realities: (1) Medical errors are common and result in serious, preventable adverse events; (2) The majority of medical errors are the result of system versus human failures; and (3) It would be impossible for any system to prevent all errors. Given these realities, the role of the nurse in the "near miss" process and as the final safety net for the patient is of paramount importance. The nurse's role in patient safety is described from both a systems perspective and a human factors perspective. Critical care nurses use specific strategies to identify, interrupt, and correct medical errors. Strategies to identify errors include knowing the patient, knowing the plan of care, double-checking, and surveillance. Nursing strategies to interrupt errors include offering assistance, clarifying, and verbally interrupting. Nurses correct errors by persevering, being physically present, reviewing/confirming the plan of care, or involving another nurse or physician. Each of these strategies has implications for education, practice, and research. Surveillance is a key nursing strategy for identifying medical errors and reducing adverse events. Eye-tracking technology is a novel approach for evaluating the surveillance process during common, high-risk processes such as blood transfusion and medication administration. Eye tracking has also been used to examine the impact of interruptions to care caused by bedside alarms as well as by other health care personnel. Findings from this safety-related eye-tracking research provide new insight into effective bedside surveillance and interruption management strategies. ©2017 American Association of Critical-Care Nurses.

  15. Preventing medical errors by designing benign failures.

    PubMed

    Grout, John R

    2003-07-01

    One way to successfully reduce medical errors is to design health care systems that are more resistant to the tendencies of human beings to err. One interdisciplinary approach entails creating design changes, mitigating human errors, and making human error irrelevant to outcomes. This approach is intended to facilitate the creation of benign failures, which have been called mistake-proofing devices and forcing functions elsewhere. USING FAULT TREES TO DESIGN FORCING FUNCTIONS: A fault tree is a graphical tool used to understand the relationships that either directly cause or contribute to the cause of a particular failure. A careful analysis of a fault tree enables the analyst to anticipate how the process will behave after the change. EXAMPLE OF AN APPLICATION: A scenario in which a patient is scalded while bathing can serve as an example of how multiple fault trees can be used to design forcing functions. The first fault tree shows the undesirable event--patient scalded while bathing. The second fault tree has a benign event--no water. Adding a scald valve changes the outcome from the undesirable event ("patient scalded while bathing") to the benign event ("no water"). Analysis of fault trees does not ensure or guarantee that changes necessary to eliminate error actually occur. Most mistake-proofing is used to prevent simple errors and to create well-defended processes, but complex errors can also result. The utilization of mistake-proofing or forcing functions can be thought of as changing the logic of a process. Errors that formerly caused undesirable failures can be converted into the causes of benign failures. The use of fault trees can provide a variety of insights into the design of forcing functions that will improve patient safety.
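
    The scald scenario reduces to two small Boolean fault trees. In the sketch below, the event names and gate structure are invented to mirror the scenario described, not copied from the article's actual trees; the point is that the same root causes now terminate in the benign event "no water" instead of the undesirable one.

```python
def patient_scalded(water_too_hot, patient_in_water, water_flowing):
    """Top event of the original fault tree: an AND gate over its causes."""
    return water_too_hot and patient_in_water and water_flowing

def water_flowing_with_scald_valve(supply_on, water_too_hot, valve_working):
    """With a scald valve installed, hot water forces the benign event 'no water'."""
    return supply_on and not (water_too_hot and valve_working)

# Without the valve: hot water while bathing ends in the undesirable failure.
print(patient_scalded(True, True, water_flowing=True))    # True -> scalded

# With the valve: the same root causes produce the benign failure instead.
flowing = water_flowing_with_scald_valve(supply_on=True, water_too_hot=True,
                                         valve_working=True)
print(patient_scalded(True, True, water_flowing=flowing))  # False -> no water
```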

  16. Colonic Fermentation: A Neglected Topic in Human Physiology Education

    ERIC Educational Resources Information Center

    Valeur, Jorgen; Berstad, Arnold

    2010-01-01

    Human physiology textbooks tend to limit their discussion of colonic functions to those of absorbing water and electrolytes and storing waste material. However, the colon is a highly active metabolic organ, containing an exceedingly complex society of microbes. By means of fermentation, gastrointestinal microbes break down nutrients that cannot be…

  17. Comparison of software and human observers in reading images of the CDMAM test object to assess digital mammography systems

    NASA Astrophysics Data System (ADS)

    Young, Kenneth C.; Cook, James J. H.; Oduko, Jennifer M.; Bosmans, Hilde

    2006-03-01

    European Guidelines for quality control in digital mammography specify minimum and achievable standards of image quality in terms of threshold contrast, based on readings of images of the CDMAM test object by human observers. However this is time-consuming and has large inter-observer error. To overcome these problems a software program (CDCOM) is available to automatically read CDMAM images, but the optimal method of interpreting the output is not defined. This study evaluates methods of determining threshold contrast from the program, and compares these to human readings for a variety of mammography systems. The methods considered are (A) simple thresholding (B) psychometric curve fitting (C) smoothing and interpolation and (D) smoothing and psychometric curve fitting. Each method leads to similar threshold contrasts but with different reproducibility. Method (A) had relatively poor reproducibility with a standard error in threshold contrast of 18.1 ± 0.7%. This was reduced to 8.4% by using a contrast-detail curve fitting procedure. Method (D) had the best reproducibility with an error of 6.7%, reducing to 5.1% with curve fitting. A panel of 3 human observers had an error of 4.4% reduced to 2.9% by curve fitting. All automatic methods led to threshold contrasts that were lower than for humans. The ratio of human to program threshold contrasts varied with detail diameter and was 1.50 ± .04 (sem) at 0.1 mm and 1.82 ± .06 at 0.25 mm for method (D). There were good correlations between the threshold contrast determined by humans and the automated methods.
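
    Method (B)'s psychometric curve fitting can be sketched directly. The example below assumes a four-alternative forced-choice reading (25% guess rate), a logistic form, and a threshold defined at 62.5% correct (common choices for CDMAM scoring, though not necessarily the exact function used in this study), with hypothetical fraction-correct data for one detail diameter.

```python
import numpy as np
from scipy.optimize import curve_fit

def psychometric(contrast, c_t, slope, guess=0.25):
    """Probability correct vs. contrast, rising from the guess rate to 1."""
    return guess + (1 - guess) / (1 + np.exp(-(contrast - c_t) / slope))

# Hypothetical fraction-correct readings for one detail diameter.
contrasts = np.array([0.05, 0.08, 0.13, 0.20, 0.32, 0.50])
p_correct = np.array([0.28, 0.35, 0.55, 0.80, 0.94, 1.00])

# Fit c_t and slope; `guess` stays fixed at the 4-AFC chance level.
(c_t, slope), _ = curve_fit(psychometric, contrasts, p_correct, p0=[0.15, 0.05])

# Threshold contrast: the contrast read 62.5% correct, halfway from guess to 1.
print(f"threshold contrast ~ {c_t:.3f}")
```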

  18. Non-commuting two-local Hamiltonians for quantum error suppression

    NASA Astrophysics Data System (ADS)

    Jiang, Zhang; Rieffel, Eleanor G.

    2017-04-01

    Physical constraints make it challenging to implement and control many-body interactions. For this reason, designing quantum information processes with Hamiltonians consisting of only one- and two-local terms is a worthwhile challenge. Enabling error suppression with two-local Hamiltonians is particularly challenging. A no-go theorem of Marvian and Lidar (Phys Rev Lett 113(26):260504, 2014) demonstrates that, even allowing particles with high Hilbert space dimension, it is impossible to protect quantum information from single-site errors by encoding in the ground subspace of any Hamiltonian containing only commuting two-local terms. Here, we get around this no-go result by encoding in the ground subspace of a Hamiltonian consisting of non-commuting two-local terms arising from the gauge operators of a subsystem code. Specifically, we show how to protect stored quantum information against single-qubit errors using a Hamiltonian consisting of sums of the gauge generators from Bacon-Shor codes (Bacon in Phys Rev A 73(1):012340, 2006) and generalized-Bacon-Shor code (Bravyi in Phys Rev A 83(1):012320, 2011). Our results imply that non-commuting two-local Hamiltonians have more error-suppressing power than commuting two-local Hamiltonians. While far from providing full fault tolerance, this approach improves the robustness achievable in near-term implementable quantum storage and adiabatic quantum computations, reducing the number of higher-order terms required to encode commonly used adiabatic Hamiltonians such as the Ising Hamiltonians common in adiabatic quantum optimization and quantum annealing.
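
    The protecting Hamiltonian is concrete enough to build numerically for the smallest case. The sketch below sums the two-local gauge generators of the [[9,1,3]] Bacon-Shor code on a 3x3 lattice (XX between vertical neighbours and ZZ between horizontal neighbours is one common convention; the paper's construction covers generalized codes as well) and diagonalizes it to exhibit the ground subspace that stores the logical qubit.

```python
import numpy as np

I2 = np.eye(2)
X = np.array([[0., 1.], [1., 0.]])
Z = np.array([[1., 0.], [0., -1.]])

def two_local(op, i, j, n=9):
    """Tensor product acting with `op` on qubits i and j of an n-qubit register."""
    out = np.array([[1.]])
    for k in range(n):
        out = np.kron(out, op if k in (i, j) else I2)
    return out

def idx(row, col):
    """Qubit index on the 3x3 lattice."""
    return 3 * row + col

# Sum of two-local gauge generators: XX along columns, ZZ along rows.
H = np.zeros((2**9, 2**9))
for r in range(3):
    for c in range(3):
        if r < 2:
            H -= two_local(X, idx(r, c), idx(r + 1, c))
        if c < 2:
            H -= two_local(Z, idx(r, c), idx(r, c + 1))

evals = np.linalg.eigvalsh(H)
ground = evals[0]
degeneracy = int(np.sum(evals < ground + 1e-8))
# Anticommuting logical operators (a row of X's, a column of Z's) commute
# with H, so every eigenspace, including the ground space, is even-degenerate.
print(f"ground energy {ground:.4f}, ground-space dimension {degeneracy}, "
      f"gap {evals[degeneracy] - ground:.4f}")
```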

  19. A Multi-Methods Approach to HRA and Human Performance Modeling: A Field Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacques Hugo; David I Gertman

    2012-06-01

    The Advanced Test Reactor (ATR), a research reactor at the Idaho National Laboratory, is primarily designed and used to test materials to be used in other, larger-scale and prototype reactors. The reactor offers various specialized systems and allows certain experiments to be run at their own temperature and pressure. The ATR Canal temporarily stores completed experiments and used fuel. It also has facilities to conduct underwater operations such as experiment examination or removal. In reviewing the ATR safety basis, a number of concerns were identified involving the ATR canal. A brief study identified ergonomic issues involving the manual handling of fuel elements in the canal that may increase the probability of human error and possible unwanted acute physical outcomes to the operator. In response to this concern, a follow-up analysis was conducted that refined the previous HRA scoping analysis by determining the probability of the inadvertent exposure of a fuel element to the air during fuel movement and inspection. The HRA analysis employed the SPAR-H method and was supplemented by information gained from a detailed analysis of the fuel inspection and transfer tasks. This latter analysis included ergonomics, work cycles, task duration, and workload imposed by tool and workplace characteristics, personal protective clothing, and operational practices that have the potential to increase physical and mental workload. Part of this analysis consisted of NASA-TLX analyses, combined with operational sequence analysis, computational human performance analysis (CHPA), and 3D graphical modeling to determine task failures and precursors to such failures that have safety implications. Experience in applying multiple analysis techniques in support of HRA methods is discussed.

  20. Uncorrected refractive errors

    PubMed Central

    Naidoo, Kovin S; Jaggernath, Jyoti

    2012-01-01

    Global estimates indicate that more than 2.3 billion people in the world suffer from poor vision due to refractive error, of whom 670 million are considered visually impaired because they do not have access to corrective treatment. Refractive errors, if uncorrected, result in an impaired quality of life for millions of people worldwide, irrespective of their age, sex and ethnicity. Over the past decade, a series of studies using a survey methodology, referred to as Refractive Error Study in Children (RESC), were performed in populations with different ethnic origins and cultural settings. These studies confirmed that the prevalence of uncorrected refractive errors is considerably high for children in low-and-middle-income countries. Furthermore, uncorrected refractive error has been noted to have extensive social and economic impacts, such as limiting educational and employment opportunities of economically active persons, healthy individuals and communities. The key public health challenges presented by uncorrected refractive errors, the leading cause of vision impairment across the world, require urgent attention. To address these issues, it is critical to focus on the development of human resources and sustainable methods of service delivery. This paper discusses three core pillars to addressing the challenges posed by uncorrected refractive errors: Human Resource (HR) Development, Service Development and Social Entrepreneurship. PMID:22944755
