Sample records for predictive human operator

  1. Operating Comfort Prediction Model of Human-Machine Interface Layout for Cabin Based on GEP.

    PubMed

    Deng, Li; Wang, Guohua; Chen, Bo

    2015-01-01

    In view of the evaluation and decision-making problem of human-machine interface layout design for cabins, an operating comfort prediction model based on GEP (Gene Expression Programming) is proposed, using operating comfort to evaluate layout schemes. Joint angles are used to describe the operating posture of the upper limb and are taken as the independent variables of the posture comfort model. Factor analysis is adopted to reduce the variable dimension: the model's input variables are reduced from 16 joint angles to 4 comfort impact factors, and the output variable is the operating comfort score. A Chinese virtual human body model is built in CATIA software and used to simulate and evaluate the operators' comfort. With 22 groups of evaluation data as training and validation samples, the GEP algorithm is used to obtain the best-fitting function between the joint angles and operating comfort, so that operating comfort can be predicted quantitatively. The prediction results for the human-machine interface layout of a driller control room show that the GEP-based operating comfort prediction model is fast and efficient, has good predictive accuracy, and can improve design efficiency.
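A flavor of the GEP representation named above: candidate comfort functions are encoded as fixed-length strings in Karva notation and decoded breadth-first into expression trees. The following is a minimal sketch of that decoding step; the function set, terminal names, and sample gene are illustrative assumptions, not details from the paper.

```python
# Decode and evaluate a Karva-notation gene, the linear chromosome encoding
# used by Gene Expression Programming (GEP). Illustrative sketch only.
from collections import deque
import operator

FUNCS = {'+': (operator.add, 2), '*': (operator.mul, 2), '-': (operator.sub, 2)}

def decode(gene):
    """Decode a Karva string breadth-first into a (symbol, children) tree."""
    nodes = [(sym, []) for sym in gene]
    queue = deque([nodes[0]])
    i = 1
    while queue:
        sym, children = queue.popleft()
        arity = FUNCS[sym][1] if sym in FUNCS else 0
        for _ in range(arity):
            child = nodes[i]
            i += 1
            children.append(child)
            queue.append(child)
    return nodes[0]            # trailing tail symbols may go unused, as in GEP

def evaluate(node, env):
    sym, children = node
    if sym in FUNCS:
        fn, _ = FUNCS[sym]
        return fn(evaluate(children[0], env), evaluate(children[1], env))
    return env[sym]            # terminal: look up variable value

# Gene "+*abcab": root '+', children '*' and 'a'; '*' takes 'b' and 'c'
tree = decode("+*abcab")
print(evaluate(tree, {'a': 1.0, 'b': 2.0, 'c': 3.0}))  # (2*3)+1 = 7.0
```

In a full GEP run, a population of such genes would be mutated and recombined, with fitness measured against the 22 groups of comfort-evaluation data.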

  2. Operating Comfort Prediction Model of Human-Machine Interface Layout for Cabin Based on GEP

    PubMed Central

    Wang, Guohua; Chen, Bo

    2015-01-01

    In view of the evaluation and decision-making problem of human-machine interface layout design for cabins, an operating comfort prediction model based on GEP (Gene Expression Programming) is proposed, using operating comfort to evaluate layout schemes. Joint angles are used to describe the operating posture of the upper limb and are taken as the independent variables of the posture comfort model. Factor analysis is adopted to reduce the variable dimension: the model's input variables are reduced from 16 joint angles to 4 comfort impact factors, and the output variable is the operating comfort score. A Chinese virtual human body model is built in CATIA software and used to simulate and evaluate the operators' comfort. With 22 groups of evaluation data as training and validation samples, the GEP algorithm is used to obtain the best-fitting function between the joint angles and operating comfort, so that operating comfort can be predicted quantitatively. The prediction results for the human-machine interface layout of a driller control room show that the GEP-based operating comfort prediction model is fast and efficient, has good predictive accuracy, and can improve design efficiency. PMID:26448740

  3. Development, Testing, and Validation of a Model-Based Tool to Predict Operator Responses in Unexpected Workload Transitions

    NASA Technical Reports Server (NTRS)

    Sebok, Angelia; Wickens, Christopher; Sargent, Robert

    2015-01-01

    One human factors challenge is predicting operator performance in novel situations. Approaches such as drawing on relevant previous experience and developing computational models to predict operator performance in complex situations offer potential methods to address this challenge. A few concerns with modeling operator performance are that models need to be realistic and that they need to be tested empirically and validated. In addition, many existing human performance modeling tools are complex and require that an analyst gain significant experience before being able to develop models for meaningful data collection. This paper describes an effort to address these challenges by developing an easy-to-use model-based tool, using models developed from a review of the existing human performance literature and targeted experimental studies, and performing an empirical validation of key model predictions.

  4. Analysis of operational comfort in manual tasks using human force manipulability measure.

    PubMed

    Tanaka, Yoshiyuki; Nishikawa, Kazuo; Yamada, Naoki; Tsuji, Toshio

    2015-01-01

    This paper proposes a scheme for human force manipulability (HFM) based on the use of isometric joint torque properties to simulate the spatial characteristics of human operation forces at an end-point of a limb with feasible magnitudes for a specified limb posture. This is also applied to the evaluation/prediction of operational comfort (OC) when manually operating a human-machine interface. The effectiveness of HFM is investigated through two experiments and computer simulations of humans generating forces by using their upper extremities. Operation force generation with maximum isometric effort can be roughly estimated with an HFM measure computed from information on the arm posture during a maintained posture. The layout of a human-machine interface is then discussed based on the results of operational experiments using an electric gear-shifting system originally developed for robotic devices. The results indicate a strong relationship between the spatial characteristics of the HFM and OC levels when shifting, and the OC is predicted by using a multiple regression model with HFM measures.
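The final step described above, predicting operational comfort (OC) from HFM measures with a regression model, can be sketched in miniature. The paper fits a multiple regression over several HFM measures; the sketch below uses a single predictor and invented data points purely to show the shape of the calculation.

```python
# Least-squares sketch: predicting an operational-comfort (OC) score from one
# hypothetical human-force-manipulability (HFM) measure. Data are invented.
def fit_line(xs, ys):
    """Closed-form simple linear regression: returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

hfm = [0.2, 0.4, 0.6, 0.8]   # hypothetical HFM measure per shift-lever position
oc = [2.1, 3.0, 3.9, 5.0]    # hypothetical rated comfort on a 1-5 scale

slope, intercept = fit_line(hfm, oc)
predict = lambda x: slope * x + intercept
print(round(predict(0.5), 2))  # predicted OC at an intermediate HFM value: 3.5
```

Extending this to several HFM predictors (as in the paper) is a standard multiple-regression fit.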

  5. Computational Models of Human Performance: Validation of Memory and Procedural Representation in Advanced Air/Ground Simulation

    NASA Technical Reports Server (NTRS)

    Corker, Kevin M.; Labacqz, J. Victor (Technical Monitor)

    1997-01-01

    The Man-machine Integration Design and Analysis System (MIDAS), developed under a joint U.S. Army and NASA cooperative agreement, is intended to assist designers of complex human/automation systems in successfully incorporating human performance capabilities and limitations into decision and action support systems. MIDAS is a computational representation of multiple human operators; selected perceptual, cognitive, and physical functions of those operators; and the physical/functional representation of the equipment with which they operate. MIDAS has been used as an integrated predictive framework for the investigation of human/machine systems, particularly in situations with high demands on the operators. We have extended the human performance models to include representation of both human operators and intelligent aiding systems in flight management and air traffic service. The focus of this development is to predict human performance in response to aiding systems developed to identify aircraft conflicts and to assist in shared authority for their resolution. The demands of this application require representation of many intelligent agents sharing world-models, coordinating action/intention, and cooperatively scheduling goals and actions in a somewhat unpredictable world of operations. In recent applications to airborne systems development, MIDAS has demonstrated an ability to predict flight crew decision-making and procedural behavior when interacting with automated flight management systems and Air Traffic Control. In this paper, we describe two enhancements to MIDAS. The first involves the addition of working memory in the form of an articulatory buffer for verbal communication protocols and a visuo-spatial buffer for communications via digital datalink. The second enhancement is a representation of multiple operators working as a team.
This enhanced model was used to predict the performance of human flight crews and their level of compliance with commercial aviation communication procedures. We show how the data produced by MIDAS compares with flight crew performance data from full mission simulations. Finally, we discuss the use of these features to study communication issues connected with aircraft-based separation assurance.
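The working-memory enhancement described above can be illustrated with a toy buffer: a capacity-limited store whose items decay unless refreshed, of the kind used for verbal (articulatory) and datalink (visuo-spatial) communication content. This is an illustrative sketch, not MIDAS code; the capacity, lifetime, and clearance strings are invented.

```python
# Toy capacity-limited, decaying working-memory buffer (illustrative only).
from collections import OrderedDict

class WorkingMemoryBuffer:
    def __init__(self, capacity=4, lifetime=3):
        self.capacity, self.lifetime = capacity, lifetime
        self.items = OrderedDict()          # item -> remaining lifetime (ticks)

    def encode(self, item):
        """Store or rehearse an item; oldest item is displaced when full."""
        if item in self.items:
            del self.items[item]            # rehearsal: refresh and move to back
        elif len(self.items) >= self.capacity:
            self.items.popitem(last=False)  # capacity limit: displace oldest
        self.items[item] = self.lifetime

    def tick(self):
        """One time step: unrehearsed items decay toward loss."""
        for item in list(self.items):
            self.items[item] -= 1
            if self.items[item] <= 0:
                del self.items[item]

    def recall(self, item):
        return item in self.items

buf = WorkingMemoryBuffer(capacity=2)
buf.encode("climb to FL240")
buf.encode("contact 127.05")
buf.encode("squawk 4571")
print(buf.recall("climb to FL240"))  # False: displaced by the capacity limit
```

A model like this is how a simulation can predict dropped or mis-readback clearances under communication load.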

  6. Computational Model of Human and System Dynamics in Free Flight: Studies in Distributed Control Technologies

    NASA Technical Reports Server (NTRS)

    Corker, Kevin M.; Pisanich, Gregory; Lebacqz, J. Victor (Technical Monitor)

    1998-01-01

    This paper presents a set of studies in full mission simulation and the development of a predictive computational model of human performance in control of complex airspace operations. NASA and the FAA have initiated programs of research and development to provide flight crew, airline operations and air traffic managers with automation aids to increase capacity in en route and terminal area to support the goals of safe, flexible, predictable and efficient operations. In support of these developments, we present a computational model to aid design that includes representation of multiple cognitive agents (both human operators and intelligent aiding systems). The demands of air traffic management require representation of many intelligent agents sharing world-models, coordinating action/intention, and scheduling goals and actions in a potentially unpredictable world of operations. The operator-model structure includes attention functions, action priority, and situation assessment. The cognitive model has been expanded to include working memory operations including retrieval from long-term store, and interference. The operator's activity structures have been developed to provide for anticipation (knowledge of the intention and action of remote operators), and to respond to failures of the system and other operators in the system in situation-specific paradigms. System stability and operator actions can be predicted by using the model. The model's predictive accuracy was verified using the full-mission simulation data of commercial flight deck operations with advanced air traffic management techniques.

  7. Implementing Lumberjacks and Black Swans Into Model-Based Tools to Support Human-Automation Interaction.

    PubMed

    Sebok, Angelia; Wickens, Christopher D

    2017-03-01

    The objectives were to (a) implement theoretical perspectives regarding human-automation interaction (HAI) in model-based tools to assist designers in developing systems that support effective performance and (b) conduct validations to assess the ability of the models to predict operator performance. Two key concepts in HAI, the lumberjack analogy and black swan events, have been studied extensively. The lumberjack analogy describes the effects of imperfect automation on operator performance: in routine operations, an increased degree of automation supports performance, but in failure conditions, increased automation results in more significantly impaired performance. Black swans are the rare and unexpected failures of imperfect automation. The lumberjack analogy and black swan concepts have been implemented in three model-based tools that predict operator performance in different systems: a flight management system, a remotely controlled robotic arm, and an environmental process control system. Each modeling effort included a corresponding validation. In one validation, the software tool was used to compare three flight management system designs, which were ranked in the same order as predicted by subject matter experts. The second validation compared model-predicted operator complacency with empirical performance in the same conditions. The third validation compared model-predicted and empirically determined times to detect and repair faults in four automation conditions. The three model-based tools offer useful ways to predict operator performance in complex systems and the effects of different automation designs on that performance.
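The lumberjack trade-off described above can be rendered as a toy numeric model: raising the degree of automation helps routine performance but hurts recovery when the automation fails. The linear coefficients below are invented purely to illustrate the crossover, not taken from the tools' actual performance models.

```python
# Toy rendering of the "lumberjack" effect: higher degree of automation (DOA,
# in [0, 1]) improves routine performance but degrades failure recovery.
def predicted_performance(doa, failure=False):
    if not failure:
        return 0.5 + 0.4 * doa   # routine: more automation, better performance
    return 0.7 - 0.5 * doa       # black-swan failure: more automation, worse recovery

for doa in (0.2, 0.8):
    routine = predicted_performance(doa)
    failed = predicted_performance(doa, failure=True)
    print(f"DOA={doa}: routine={routine:.2f}, failure={failed:.2f}")
```

The crossing slopes are the point: the design that looks best under routine conditions is the most brittle under rare failures, which is why the validations above tested both regimes.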

  8. A Multiple Agent Model of Human Performance in Automated Air Traffic Control and Flight Management Operations

    NASA Technical Reports Server (NTRS)

    Corker, Kevin; Pisanich, Gregory; Condon, Gregory W. (Technical Monitor)

    1995-01-01

    A predictive model of human operator performance (flight crew and air traffic control (ATC)) has been developed and applied in order to evaluate the impact of automation developments in flight management and air traffic control. The model is used to predict the performance of a two-person flight crew and the ATC operators generating and responding to clearances aided by the Center TRACON Automation System (CTAS). The purpose of the modeling is to support evaluation and design of automated aids for flight management and airspace management and to predict required procedural changes, both air and ground, in response to advancing automation in both domains. Additional information is contained in the original extended abstract.

  9. Applications of artificial intelligence in safe human-robot interactions.

    PubMed

    Najmaei, Nima; Kermani, Mehrdad R

    2011-04-01

    The integration of industrial robots into the human workspace presents a set of unique challenges. This paper introduces a new sensory system for modeling, tracking, and predicting human motions within a robot workspace. A reactive control scheme to modify a robot's operations for accommodating the presence of the human within the robot workspace is also presented. To this end, a special class of artificial neural networks, namely, self-organizing maps (SOMs), is employed for obtaining a superquadric-based model of the human. The SOM network receives information of the human's footprints from the sensory system and infers necessary data for rendering the human model. The model is then used in order to assess the danger of the robot operations based on the measured as well as predicted human motions. This is followed by the introduction of a new reactive control scheme that results in the least interferences between the human and robot operations. The approach enables the robot to foresee an upcoming danger and take preventive actions before the danger becomes imminent. Simulation and experimental results are presented in order to validate the effectiveness of the proposed method.
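The self-organizing-map step described above can be sketched in miniature: a small 1-D map of units whose weight vectors organize onto 2-D input data. The "footprint" clusters, unit count, and neighborhood function below are invented for illustration; the real system feeds richer sensor features into a full SOM library.

```python
# Pure-Python sketch of a tiny 1-D self-organizing map (SOM) on 2-D points.
def best_matching_unit(units, x):
    """Index of the unit whose weight vector is closest to input x."""
    return min(range(len(units)),
               key=lambda i: (units[i][0] - x[0]) ** 2 + (units[i][1] - x[1]) ** 2)

def train_som(data, units, epochs=50, lr=0.5):
    for epoch in range(epochs):
        a = lr * (1 - epoch / epochs)        # decaying learning rate
        for x in data:
            bmu = best_matching_unit(units, x)
            for i, u in enumerate(units):
                # simple neighborhood: full pull on BMU, half on its neighbors
                h = 1.0 if i == bmu else (0.5 if abs(i - bmu) == 1 else 0.0)
                u[0] += a * h * (x[0] - u[0])
                u[1] += a * h * (x[1] - u[1])
    return units

# Two invented "footprint" clusters and a deterministic initial grid of units.
data = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (1.0, 1.0), (0.9, 1.0), (1.0, 0.9)]
units = [[0.2, 0.2], [0.4, 0.4], [0.6, 0.6], [0.8, 0.8]]
train_som(data, units)
print(best_matching_unit(units, (0.0, 0.0)) != best_matching_unit(units, (1.0, 1.0)))
```

After training, distinct regions of the input space map to distinct units, which is the property the sensory system exploits to build a superquadric model of the human from footprint data.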

  10. Human Factors Vehicle Displacement Analysis: Engineering In Motion

    NASA Technical Reports Server (NTRS)

    Atencio, Laura Ashley; Reynolds, David; Robertson, Clay

    2010-01-01

    While positioned on the launch pad at the Kennedy Space Center, tall stacked launch vehicles are exposed to the natural environment. Varying directional winds and vortex shedding cause the vehicle to sway in an oscillating motion. The Human Factors team recognizes that vehicle sway may hinder ground crew operations, impact ground system designs, and ultimately affect launch availability. The objective of this study is to physically simulate the predicted oscillation envelopes identified by analysis and to conduct a Human Factors Analysis to assess the ability to carry out essential Upper Stage (US) ground operator tasks under the predicted vehicle motion.

  11. Trust-Based Design of Human-Guided Algorithms

    DTIC Science & Technology

    2007-06-01

    Management Interdepartmental Program in Operations Research, 17 May 2007. Approved by: Laura Major Forest, The Charles Stark Draper Laboratory... 2. Information Analysis: predicting based on data, integrating and managing information, augmenting human operator perception and cognition. 3... allocation of automation by designers and managers. How an operator decides between manual and automatic control of a system is a necessary

  12. Modeling and Evaluating Pilot Performance in NextGen: Review of and Recommendations Regarding Pilot Modeling Efforts, Architectures, and Validation Studies

    NASA Technical Reports Server (NTRS)

    Wickens, Christopher; Sebok, Angelia; Keller, John; Peters, Steve; Small, Ronald; Hutchins, Shaun; Algarin, Liana; Gore, Brian Francis; Hooey, Becky Lee; Foyle, David C.

    2013-01-01

    NextGen operations are associated with a variety of changes to the national airspace system (NAS), including changes to the allocation of roles and responsibilities among operators and automation, the use of new technologies and automation, additional information presented on the flight deck, and the entire concept of operations (ConOps). In the transition to NextGen airspace, aviation and air operations designers need to consider the implications of design or system changes on human performance and the potential for error. To ensure continued safety of the NAS, it will be necessary for researchers to evaluate design concepts and potential NextGen scenarios well before implementation. One approach for such evaluations is through human performance modeling. Human performance models (HPMs) provide effective tools for predicting and evaluating operator performance in systems. HPMs offer significant advantages over empirical, human-in-the-loop testing in that (1) they allow detailed analyses of systems that have not yet been built, (2) they offer great flexibility for extensive data collection, and (3) they do not require experimental participants and thus can offer cost and time savings. HPMs differ in their ability to predict performance and safety with NextGen procedures, equipment, and ConOps. Models also vary in terms of how they approach human performance (e.g., some focus on cognitive processing, others focus on discrete tasks performed by a human, while others consider perceptual processes) and in terms of their associated validation efforts. The objectives of this research effort were to support the Federal Aviation Administration (FAA) in identifying HPMs that are appropriate for predicting pilot performance in NextGen operations, to provide guidance on how to evaluate the quality of different models, and to identify gaps in pilot performance modeling research that could guide future research opportunities.
This research effort is intended to help the FAA evaluate pilot modeling efforts and select the appropriate tools for future modeling efforts to predict pilot performance in NextGen operations.

  13. Application of response surface methodology to maximize the productivity of scalable automated human embryonic stem cell manufacture.

    PubMed

    Ratcliffe, Elizabeth; Hourd, Paul; Guijarro-Leach, Juan; Rayment, Erin; Williams, David J; Thomas, Robert J

    2013-01-01

    Commercial regenerative medicine will require large quantities of clinical-specification human cells. The cost and quality of manufacture is notoriously difficult to control due to highly complex processes with poorly defined tolerances. As a step to overcome this, we aimed to demonstrate the use of 'quality-by-design' tools to define the operating space for economic passage of a scalable human embryonic stem cell production method with minimal cell loss. Design of experiments response surface methodology was applied to generate empirical models to predict optimal operating conditions for a unit of manufacture of a previously developed automatable and scalable human embryonic stem cell production method. Two models were defined to predict cell yield and cell recovery rate postpassage, in terms of the predictor variables of media volume, cell seeding density, media exchange and length of passage. Predicted operating conditions for maximized productivity were successfully validated. Such 'quality-by-design' type approaches to process design and optimization will be essential to reduce the risk of product failure and patient harm, and to build regulatory confidence in cell therapy manufacturing processes.
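The response-surface step described above, predicting optimal operating conditions from a fitted quadratic model, can be sketched as follows. The two coded factors and every coefficient below are invented for illustration; the actual study fitted yield and recovery models over four predictor variables from designed experiments.

```python
# Response-surface sketch: scan a fitted quadratic yield model over the coded
# design space [-1, +1] x [-1, +1] for the predicted optimum. Coefficients invented.
def predicted_yield(x1, x2):
    """Hypothetical quadratic response surface in coded factors
    (x1 ~ seeding density, x2 ~ media volume)."""
    return 100 + 8 * x1 + 5 * x2 - 4 * x1 * x1 - 3 * x2 * x2 + 1.5 * x1 * x2

grid = [i / 10 - 1 for i in range(21)]   # coded factor levels -1.0 .. +1.0
best = max((predicted_yield(a, b), a, b) for a in grid for b in grid)
print(best)  # (best predicted yield, x1*, x2*)
```

Here the unconstrained optimum lies outside the design region, so the scan correctly returns a boundary point; validating such predicted settings experimentally is exactly the confirmation step the paper reports.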

  14. Program Predicts Time Courses of Human/Computer Interactions

    NASA Technical Reports Server (NTRS)

    Vera, Alonso; Howes, Andrew

    2005-01-01

    CPM X is a computer program that predicts sequences of, and amounts of time taken by, routine actions performed by a skilled person performing a task. Unlike programs that simulate the interaction of the person with the task environment, CPM X predicts the time course of events as consequences of encoded constraints on human behavior. The constraints determine which cognitive and environmental processes can occur simultaneously and which have sequential dependencies. The input to CPM X comprises (1) a description of a task and strategy in a hierarchical description language and (2) a description of architectural constraints in the form of rules governing interactions of fundamental cognitive, perceptual, and motor operations. The output of CPM X is a Program Evaluation Review Technique (PERT) chart that presents a schedule of predicted cognitive, motor, and perceptual operators interacting with a task environment. The CPM X program allows direct, a priori prediction of skilled user performance on complex human-machine systems, providing a way to assess critical interfaces before they are deployed in mission contexts.
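The scheduling idea behind CPM X's PERT-chart output can be sketched as earliest-finish scheduling over a dependency graph of cognitive, perceptual, and motor operators: operators without a sequential dependency overlap, and the task's predicted time course is the finish time of the last operator. The operator names and millisecond durations below are invented, not CPM X's actual operator set.

```python
# Earliest-start/earliest-finish scheduling over an operator dependency DAG,
# in the spirit of a PERT chart. Names and durations (ms) are illustrative.
def schedule(tasks):
    """tasks: {name: (duration, [predecessors])} -> {name: (start, finish)}"""
    times = {}
    def finish(name):
        if name not in times:
            dur, preds = tasks[name]
            start = max((finish(p) for p in preds), default=0)
            times[name] = (start, start + dur)
        return times[name][1]
    for name in tasks:
        finish(name)
    return times

tasks = {
    "perceive-stimulus": (100, []),
    "cognitive-decide": (50, ["perceive-stimulus"]),
    "eye-move": (30, ["perceive-stimulus"]),       # overlaps with deciding
    "motor-press-key": (70, ["cognitive-decide", "eye-move"]),
}
times = schedule(tasks)
print(times["motor-press-key"])  # (150, 220): task predicted to finish at 220 ms
```

The sequential-dependency rules encoded in the `tasks` structure are the analogue of CPM X's architectural constraints on which processes may run simultaneously.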

  15. Human Purposive Movement Theory

    DTIC Science & Technology

    2012-03-01

    theory and provides examples of developmental and operational technologies that could use this theory in common settings. SUBJECT TERMS: human activity, prediction of behavior, human algorithms, purposive movement theory

  16. Visualization and simulated surgery of the left ventricle in the virtual pathological heart of the Virtual Physiological Human

    PubMed Central

    McFarlane, N. J. B.; Lin, X.; Zhao, Y.; Clapworthy, G. J.; Dong, F.; Redaelli, A.; Parodi, O.; Testi, D.

    2011-01-01

    Ischaemic heart failure remains a significant health and economic problem worldwide. This paper presents a user-friendly software system that will form a part of the virtual pathological heart of the Virtual Physiological Human (VPH2) project, currently being developed under the European Commission Virtual Physiological Human (VPH) programme. VPH2 is an integrated medicine project, which will create a suite of modelling, simulation and visualization tools for patient-specific prediction and planning in cases of post-ischaemic left ventricular dysfunction. The work presented here describes a three-dimensional interactive visualization for simulating left ventricle restoration surgery, comprising the operations of cutting, stitching and patching, and for simulating the elastic deformation of the ventricle to its post-operative shape. This will supply the quantitative measurements required for the post-operative prediction tools being developed in parallel in the same project. PMID:22670207

  17. Closed loop models for analyzing the effects of simulator characteristics. [digital simulation of human operators

    NASA Technical Reports Server (NTRS)

    Baron, S.; Muralidharan, R.; Kleinman, D. L.

    1978-01-01

    The optimal control model of the human operator is used to develop closed loop models for analyzing the effects of (digital) simulator characteristics on predicted performance and/or workload. Two approaches are considered: the first utilizes a continuous approximation to the discrete simulation in conjunction with the standard optimal control model; the second involves a more exact discrete description of the simulator in a closed loop multirate simulation in which the optimal control model simulates the pilot. Both models predict that simulator characteristics can have significant effects on performance and workload.
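The core effect the models predict, that the simulator's discrete update characteristics change closed-loop performance, can be shown with a far simpler toy than the optimal control model: a human modeled as a pure gain acting on the displayed error, tracking through a first-order plant integrated at the simulator's rate. The gain, plant, and update intervals below are invented solely to illustrate the sensitivity.

```python
# Toy discrete closed-loop tracking: coarser simulator update intervals change
# (here, degrade) tracking performance. Not the optimal control model itself.
def tracking_error(sim_dt, total_time=10.0, gain=3.5):
    """Accumulated squared error for a gain operator in a discrete simulation."""
    n = int(total_time / sim_dt)
    x, err_sq = 1.0, 0.0             # plant state starts off-target
    for _ in range(n):
        u = -gain * x                # operator acts on the displayed state
        x += sim_dt * u              # plant integrated at the simulator rate
        err_sq += x * x * sim_dt
    return err_sq

print(tracking_error(0.05) < tracking_error(0.5))  # finer update -> lower error
```

With this gain, the coarse 0.5 s update drives the loop close to oscillation while the 0.05 s update tracks smoothly, the same qualitative conclusion the closed-loop models reach for digital simulator characteristics.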

  18. Operational forecasting of human-biometeorological conditions

    NASA Astrophysics Data System (ADS)

    Giannaros, T. M.; Lagouvardos, K.; Kotroni, V.; Matzarakis, A.

    2018-03-01

    This paper presents the development of an operational forecasting service focusing on human-biometeorological conditions. The service is based on the coupling of numerical weather prediction models with an advanced human-biometeorological model. Human thermal perception and stress forecasts are issued on a daily basis for Greece, in both point and gridded format. A user-friendly presentation approach is adopted for communicating the forecasts to the public via the World Wide Web. The development of the presented service highlights the feasibility of replacing standard meteorological parameters and/or indices used in operational weather forecasting activities for assessing the thermal environment. This is of particular significance for providing effective, human-biometeorology-oriented warnings for both heat waves and cold outbreaks.
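The post-processing step in such a service, mapping a biometeorological index to a thermal-perception class, can be sketched as a simple threshold lookup. The thresholds below follow the widely used Western/Central-European scale for physiologically equivalent temperature (PET); an operational service would tune them for the local population and may use a different index.

```python
# Map a physiologically equivalent temperature (PET, degrees C) to a
# thermal-perception class, using the common Western/Central-European scale.
THRESHOLDS = [
    (4, "very cold"), (8, "cold"), (13, "cool"), (18, "slightly cool"),
    (23, "comfortable"), (29, "slightly warm"), (35, "warm"), (41, "hot"),
]

def thermal_perception(pet_celsius):
    for upper, label in THRESHOLDS:
        if pet_celsius < upper:
            return label
    return "very hot"

print(thermal_perception(20.0))  # comfortable
print(thermal_perception(43.0))  # very hot
```

A gridded forecast would apply this function point-by-point to PET fields computed from the coupled weather-prediction output.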

  19. A predictive model of nuclear power plant crew decision-making and performance in a dynamic simulation environment

    NASA Astrophysics Data System (ADS)

    Coyne, Kevin Anthony

    The safe operation of complex systems such as nuclear power plants requires close coordination between the human operators and plant systems. In order to maintain an adequate level of safety following an accident or other off-normal event, the operators often are called upon to perform complex tasks during dynamic situations with incomplete information. The safety of such complex systems can be greatly improved if the conditions that could lead operators to make poor decisions and commit erroneous actions during these situations can be predicted and mitigated. The primary goal of this research project was the development and validation of a cognitive model capable of simulating nuclear plant operator decision-making during accident conditions. Dynamic probabilistic risk assessment methods can improve the prediction of human error events by providing rich contextual information and an explicit consideration of feedback arising from man-machine interactions. The Accident Dynamics Simulator paired with the Information, Decision, and Action in a Crew context cognitive model (ADS-IDAC) shows promise for predicting situational contexts that might lead to human error events, particularly knowledge driven errors of commission. ADS-IDAC generates a discrete dynamic event tree (DDET) by applying simple branching rules that reflect variations in crew responses to plant events and system status changes. Branches can be generated to simulate slow or fast procedure execution speed, skipping of procedure steps, reliance on memorized information, activation of mental beliefs, variations in control inputs, and equipment failures. Complex operator mental models of plant behavior that guide crew actions can be represented within the ADS-IDAC mental belief framework and used to identify situational contexts that may lead to human error events. This research increased the capabilities of ADS-IDAC in several key areas. 
The ADS-IDAC computer code was improved to support additional branching events and provide a better representation of the IDAC cognitive model. An operator decision-making engine capable of responding to dynamic changes in situational context was implemented. The IDAC human performance model was fully integrated with a detailed nuclear plant model in order to realistically simulate plant accident scenarios. Finally, the improved ADS-IDAC model was calibrated, validated, and updated using actual nuclear plant crew performance data. This research led to the following general conclusions: (1) A relatively small number of branching rules are capable of efficiently capturing a wide spectrum of crew-to-crew variabilities. (2) Compared to traditional static risk assessment methods, ADS-IDAC can provide a more realistic and integrated assessment of human error events by directly determining the effect of operator behaviors on plant thermal hydraulic parameters. (3) The ADS-IDAC approach provides an efficient framework for capturing actual operator performance data such as timing of operator actions, mental models, and decision-making activities.
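The discrete-dynamic-event-tree generation described above can be sketched as enumerating combinations of simple branching rules, each branch weighted by a probability. The rule names, options, and probabilities below are invented for illustration; ADS-IDAC's actual rules branch dynamically on plant state and crew context rather than as a static product.

```python
# Sketch of discrete dynamic event tree (DDET) generation: branching rules
# spawn alternative crew responses with path probabilities. Values invented.
from itertools import product

BRANCH_RULES = {
    "procedure-speed": [("fast", 0.6), ("slow", 0.4)],
    "step-execution": [("performed", 0.9), ("skipped", 0.1)],
}

def generate_ddet(rules):
    """Enumerate all branch combinations with their path probabilities."""
    names = list(rules)
    paths = []
    for combo in product(*(rules[n] for n in names)):
        prob = 1.0
        for _, p in combo:
            prob *= p
        paths.append((dict(zip(names, (c for c, _ in combo))), prob))
    return paths

paths = generate_ddet(BRANCH_RULES)
print(len(paths), sum(p for _, p in paths))  # 4 branches, probabilities sum to 1
```

Each enumerated path would then drive a separate run of the coupled plant model, which is how operator behavior variations propagate into thermal-hydraulic outcomes.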

  20. Man-Machine Integration Design and Analysis System (MIDAS) v5: Augmentations, Motivations, and Directions for Aeronautics Applications

    NASA Technical Reports Server (NTRS)

    Gore, Brian F.

    2011-01-01

    As automation and advanced technologies are introduced into transport systems, ranging from the Next Generation Air Transportation System (NextGen), to advanced surface transportation systems as exemplified by Intelligent Transportation Systems, to future systems designed for space exploration, there is an increased need to validly predict how the future systems will be vulnerable to error given the demands imposed by the assistive technologies. One formalized approach to study the impact of assistive technologies on the human operator in a safe and non-obtrusive manner is through the use of human performance models (HPMs). HPMs play an integral role when complex human-system designs are proposed, developed, and tested. One HPM tool, the Man-machine Integration Design and Analysis System (MIDAS), is a NASA Ames Research Center HPM software tool that has been applied to predict human-system performance in various domains since 1986. MIDAS is a dynamic, integrated HPM and simulation environment that facilitates the design, visualization, and computational evaluation of complex man-machine system concepts in simulated operational environments. The paper discusses a range of aviation-specific applications, including an approach used to model human error for NASA's Aviation Safety Program and what-if analyses to evaluate flight deck technologies for NextGen operations. The paper culminates by raising two challenges for the field of predictive HPMs for complex human-system designs that evaluate assistive technologies: (1) model transparency and (2) model validation.

  1. Crew workload strategies in advanced cockpits

    NASA Technical Reports Server (NTRS)

    Hart, Sandra G.

    1990-01-01

    Many methods of measuring and predicting operator workload have been developed that provide useful information in the design, evaluation, and operation of complex systems and which aid in developing models of human attention and performance. However, the relationships between such measures, imposed task demands, and measures of performance remain complex and even contradictory. It appears that we have ignored an important factor: people do not passively translate task demands into performance. Rather, they actively manage their time, resources, and effort to achieve an acceptable level of performance while maintaining a comfortable level of workload. While such adaptive, creative, and strategic behaviors are the primary reason that human operators remain an essential component of all advanced man-machine systems, they also result in individual differences in the way people respond to the same task demands and inconsistent relationships among measures. Finally, we are able to measure workload and performance, but interpreting such measures remains difficult; it is still not clear how much workload is too much or too little nor the consequences of suboptimal workload on system performance and the mental, physical, and emotional well-being of the human operators. The rationale and philosophy of a program of research developed to address these issues will be reviewed and contrasted to traditional methods of defining, measuring, and predicting human operator workload. Viewgraphs are given.

  2. Human-robot interaction modeling and simulation of supervisory control and situational awareness during field experimentation with military manned and unmanned ground vehicles

    NASA Astrophysics Data System (ADS)

    Johnson, Tony; Metcalfe, Jason; Brewster, Benjamin; Manteuffel, Christopher; Jaswa, Matthew; Tierney, Terrance

    2010-04-01

    The proliferation of intelligent systems in today's military demands increased focus on the optimization of human-robot interactions. Traditional studies in this domain involve large-scale field tests that require humans to operate semiautomated systems under varying conditions within military-relevant scenarios. However, provided that adequate constraints are employed, modeling and simulation can be a cost-effective alternative and supplement. The current presentation discusses a simulation effort that was executed in parallel with a field test with Soldiers operating military vehicles in an environment that represented key elements of the true operational context. In this study, "constructive" human operators were designed to represent average Soldiers executing supervisory control over an intelligent ground system. The constructive Soldiers were simulated performing the same tasks as those performed by real Soldiers during a directly analogous field test. Exercising the models in a high-fidelity virtual environment provided predictive results that represented actual performance in certain aspects, such as situational awareness, but diverged in others. These findings largely reflected the quality of modeling assumptions used to design behaviors and the quality of information available on which to articulate principles of operation. Ultimately, predictive analyses partially supported expectations, with deficiencies explicable via Soldier surveys, experimenter observations, and previously-identified knowledge gaps.

  3. A Conceptual Framework for Predicting Error in Complex Human-Machine Environments

    NASA Technical Reports Server (NTRS)

    Freed, Michael; Remington, Roger; Null, Cynthia H. (Technical Monitor)

    1998-01-01

    We present a Goals, Operators, Methods, and Selection Rules-Model Human Processor (GOMS-MHP) style model-based approach to the problem of predicting human habit capture errors. Habit captures occur when the model fails to allocate limited cognitive resources to retrieve task-relevant information from memory. Lacking the unretrieved information, decision mechanisms act in accordance with implicit default assumptions, resulting in error when relied upon assumptions prove incorrect. The model helps interface designers identify situations in which such failures are especially likely.

  4. Human Error Assessment and Reduction Technique (HEART) and Human Factors Analysis and Classification System (HFACS)

    NASA Technical Reports Server (NTRS)

    Alexander, Tiffaney Miller

    2017-01-01

    Research results have shown that more than half of aviation, aerospace, and aeronautics mishaps/incidents are attributed to human error. As part of safety within space exploration ground processing operations, the underlying contributors to and causes of human error must be identified and classified in order to manage human error. This research provides a framework and methodology using the Human Error Assessment and Reduction Technique (HEART) and the Human Factors Analysis and Classification System (HFACS) as an analysis tool to identify contributing factors, assess their impact on human error events, and predict the Human Error Probabilities (HEPs) of future occurrences. The methodology was applied retrospectively to six NASA ground processing operations scenarios and thirty years of launch-vehicle-related mishap data. This modifiable framework can be used and followed by other space and similarly complex operations.
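
    The HEART portion of the methodology lends itself to a short sketch. The example below is a minimal illustration: the generic-task nominal HEP and the error-producing-condition (EPC) multipliers are standard textbook HEART values, not figures taken from this study.

```python
# Sketch of the HEART assessed-HEP calculation (illustrative values only).
def heart_hep(nominal_hep, epcs):
    """Assessed HEP = nominal HEP x product of EPC adjustments.

    Each EPC is (max_effect, assessed_proportion_of_affect in [0, 1]);
    its multiplier is (max_effect - 1) * proportion + 1.
    """
    hep = nominal_hep
    for max_effect, proportion in epcs:
        hep *= (max_effect - 1.0) * proportion + 1.0
    return min(hep, 1.0)  # a probability cannot exceed 1

# Example: a fairly simple task performed rapidly (nominal HEP 0.09),
# with shortage of time (x11) judged 40% present and operator
# inexperience (x3) judged 20% present -- about 0.63.
hep = heart_hep(0.09, [(11.0, 0.4), (3.0, 0.2)])
```

    The same multiplicative structure is what allows the framework above to be re-applied retrospectively to recorded mishap scenarios: each identified contributing factor maps to an EPC with an assessed proportion of effect.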

  7. Using Modeling and Simulation to Predict Operator Performance and Automation-Induced Complacency With Robotic Automation: A Case Study and Empirical Validation.

    PubMed

    Wickens, Christopher D; Sebok, Angelia; Li, Huiyang; Sarter, Nadine; Gacy, Andrew M

    2015-09-01

    The aim of this study was to develop and validate a computational model of the automation complacency effect as operators work on a robotic arm task supported by three different degrees of automation. Some computational models of complacency in human-automation interaction exist, but those were formed and validated within the context of fairly simplified monitoring failures. This research extends model validation to a much more complex task, so that system designers can establish, without need for human-in-the-loop (HITL) experimentation, the merits and shortcomings of different automation degrees. We developed a realistic simulation of a space-based robotic arm task that could be carried out with three different levels of trajectory visualization and execution automation support. Using this simulation, we performed HITL testing. Complacency was induced via several trials of correctly performing automation and then was assessed on trials when automation failed. Following a cognitive task analysis of the robotic arm operation, we developed a multicomponent model of the robotic operator and his or her reliance on automation, based in part on visual scanning. The comparison of model predictions with empirical results revealed that the model accurately predicted routine performance and the responses to automation failures after complacency developed. However, the scanning models do not account for all of the attention-allocation effects of complacency. Complacency modeling can provide a useful tool for predicting the effects of different types of imperfect automation. The results from this research suggest that focus should be given to supporting situation awareness in automation development. © 2015, Human Factors and Ergonomics Society.

  8. Visual performance modeling in the human operator simulator

    NASA Technical Reports Server (NTRS)

    Strieb, M. I.

    1979-01-01

    A brief description of the history of the development of the human operator simulator (HOS) model is presented. Features of the HOS micromodels that affect the acquisition of visual performance data are discussed, along with preliminary details on a HOS pilot model designed to predict the results of visual performance workload data obtained through oculometer studies of pilots in real and simulated approaches and landings.

  9. Aeroacoustics of Flight Vehicles: Theory and Practice. Volume 2: Noise Control

    NASA Technical Reports Server (NTRS)

    Hubbard, Harvey H. (Editor)

    1991-01-01

    Flight vehicles and the underlying concepts of noise generation, noise propagation, noise prediction, and noise control are studied. This volume includes those chapters that relate to flight vehicle noise control and operations: human response to aircraft noise; atmospheric propagation; theoretical models for duct acoustic propagation and radiation; design and performance of duct acoustic treatment; jet noise suppression; interior noise; flyover noise measurement and prediction; and quiet aircraft design and operational characteristics.

  10. Statistical modelling of networked human-automation performance using working memory capacity.

    PubMed

    Ahmed, Nisar; de Visser, Ewart; Shaw, Tyler; Mohamed-Ameen, Amira; Campbell, Mark; Parasuraman, Raja

    2014-01-01

    This study examines the challenging problem of modelling the interaction between individual attentional limitations and decision-making performance in networked human-automation system tasks. Analysis of real experimental data from a task involving networked supervision of multiple unmanned aerial vehicles by human participants shows that both task load and network message quality affect performance, but that these effects are modulated by individual differences in working memory (WM) capacity. These insights were used to assess three statistical approaches for modelling and making predictions with real experimental networked supervisory performance data: classical linear regression, non-parametric Gaussian processes and probabilistic Bayesian networks. It is shown that each of these approaches can help designers of networked human-automated systems cope with various uncertainties in order to accommodate future users by linking expected operating conditions and performance from real experimental data to observable cognitive traits like WM capacity. Practitioner Summary: Working memory (WM) capacity helps account for inter-individual variability in operator performance in networked unmanned aerial vehicle supervisory tasks. This is useful for reliable performance prediction near experimental conditions via linear models; robust statistical prediction beyond experimental conditions via Gaussian process models and probabilistic inference about unknown task conditions/WM capacities via Bayesian network models.
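
    Of the three statistical approaches named above, classical linear regression is the simplest to sketch. The example below is illustrative only: the crossed design, variable names, and coefficients are invented, assuming a performance model with a task-load x WM-capacity interaction fitted by ordinary least squares via the normal equations.

```python
# Illustrative OLS fit of performance ~ load + WM capacity + interaction.
def solve(A, b):
    """Gaussian elimination with partial pivoting for a small system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def ols(X, y):
    """Solve the normal equations (X^T X) beta = X^T y."""
    n = len(X[0])
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(n)] for i in range(n)]
    Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(n)]
    return solve(XtX, Xty)

# Synthetic crossed design: a made-up interaction lets high-WM
# operators absorb task load better (noiseless for the demo).
beta_true = [80.0, -8.0, 1.5, 0.6]   # intercept, load, wm, load*wm
X, y = [], []
for load in (1.0, 2.0, 3.0):
    for wm in (4.0, 6.0, 8.0):
        row = [1.0, load, wm, load * wm]
        X.append(row)
        y.append(sum(b * v for b, v in zip(beta_true, row)))
beta_hat = ols(X, y)
```

    The Gaussian-process and Bayesian-network approaches mentioned in the abstract extend this same idea to robust extrapolation and to probabilistic inference over unknown conditions, respectively.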

  11. Adaptive State Predictor Based Human Operator Modeling on Longitudinal and Lateral Control

    NASA Technical Reports Server (NTRS)

    Trujillo, Anna C.; Gregory, Irene M.; Hempley, Lucas E.

    2015-01-01

    Control-theoretic modeling of human operator dynamic behavior in manual control tasks has a long and rich history. In the last two decades, there has been a renewed interest in modeling the human operator, along with significant work on techniques used to identify the pilot model of a given structure. The purpose of this research is to go beyond pilot identification based on collected experimental data and to develop a predictor of pilot behavior. An experiment was conducted to characterize the interactions of the pilot with an adaptive controller compensating for control surface failures. A general linear-in-parameters model structure is used to represent the pilot, and three different estimation methods are explored: a gradient descent estimator (GDE), a least squares estimator with exponential forgetting (LSEEF), and a least squares estimator with bounded-gain forgetting (LSEBGF), each using the experimental data to predict pilot stick input. Previous results found that the GDE and LSEEF methods are fairly accurate in predicting longitudinal stick input from commanded pitch. This paper discusses the accuracy of each of the three methods in predicting both pilot longitudinal and lateral stick input from the flight director's commanded pitch and bank attitudes.
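
    A least squares estimator with exponential forgetting of the kind named above can be sketched as follows. The regressor choice (current and previous commanded pitch) and the synthetic "pilot" response are assumptions for illustration, not the paper's model structure.

```python
import math

# LSEEF-style recursive least squares for a linear-in-parameters
# pilot model u_hat = theta^T phi (illustrative sketch).
class RLSForgetting:
    def __init__(self, dim, lam=0.98, p0=100.0):
        self.lam = lam                     # forgetting factor
        self.theta = [0.0] * dim           # parameter estimate
        self.P = [[p0 if i == j else 0.0 for j in range(dim)]
                  for i in range(dim)]     # covariance, p0 * I

    def update(self, phi, y):
        n = len(phi)
        # Gain K = P phi / (lam + phi^T P phi)
        Pphi = [sum(self.P[i][j] * phi[j] for j in range(n)) for i in range(n)]
        denom = self.lam + sum(phi[i] * Pphi[i] for i in range(n))
        K = [v / denom for v in Pphi]
        err = y - sum(t * p for t, p in zip(self.theta, phi))
        self.theta = [t + k * err for t, k in zip(self.theta, K)]
        # P <- (P - K phi^T P) / lam
        self.P = [[(self.P[i][j] - K[i] * Pphi[j]) / self.lam
                   for j in range(n)] for i in range(n)]
        return err

# Synthetic pilot: stick input responds to current and previous
# commanded pitch (coefficients are made up for the demo).
est = RLSForgetting(dim=2)
prev = 0.0
for t in range(200):
    cmd = math.sin(0.1 * t)
    stick = 0.8 * cmd + 0.3 * prev       # "true" pilot response
    est.update([cmd, prev], stick)
    prev = cmd
```

    With persistently exciting input and no noise, the estimate converges to the generating coefficients; the forgetting factor is what lets such a predictor track a pilot whose behavior changes, e.g., during a control surface failure.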

  12. Human-Robot Interaction in High Vulnerability Domains

    NASA Technical Reports Server (NTRS)

    Gore, Brian F.

    2016-01-01

    Future NASA missions will require successful integration of the human with highly complex systems. Highly complex systems are likely to involve humans, automation, and some level of robotic assistance. These complex environments will require successful integration of the human with automation, with robots, and with human-automation-robot teams to accomplish mission-critical goals. Many challenges exist for the human performing in these types of operational environments with these kinds of systems. Systems must be designed to optimally integrate various levels of inputs and outputs based on the roles and responsibilities of the human, the automation, and the robots, ranging from direct manual control through shared human-robot control to no active human control (i.e., human supervisory control). It is assumed that the human will remain involved at some level. Technologies that vary based on contextual demands and on operator characteristics (workload, situation awareness) will be needed when the human integrates into these systems. Predictive models that estimate the impact of the technologies on system performance and on the human operator are also needed to meet the challenges associated with such future complex human-automation-robot systems in extreme environments.

  13. Preliminary Exploration of Adaptive State Predictor Based Human Operator Modeling

    NASA Technical Reports Server (NTRS)

    Trujillo, Anna C.; Gregory, Irene M.

    2012-01-01

    Control-theoretic modeling of human operator dynamic behavior in manual control tasks has a long and rich history. In the last two decades, there has been a renewed interest in modeling the human operator. There has also been significant work on techniques used to identify the pilot model of a given structure. The purpose of this research is to go beyond pilot identification based on collected experimental data and to develop a predictor of pilot behavior. An experiment was conducted to quantify the effects of changing aircraft dynamics on an operator's ability to track a signal, in order to eventually model a pilot adapting to changing aircraft dynamics. A gradient descent estimator and a least squares estimator with exponential forgetting used these data to predict pilot stick input. The results indicate that individual pilot characteristics and vehicle dynamics did not affect the accuracy of either estimator method in estimating pilot stick input. These methods were also able to predict pilot stick input during changing aircraft dynamics, and they may have the capability to detect a change in a subject due to workload, engagement, etc., or the effects of changes in vehicle dynamics on the pilot.

  14. Designing Flight-Deck Procedures

    NASA Technical Reports Server (NTRS)

    Degani, Asaf; Wiener, L.; Shafto, Mike (Technical Monitor)

    1995-01-01

    A complex human-machine system consists of more than merely one or more human operators and a collection of hardware components. In order to operate a complex system successfully, the human-machine system must be supported by an organizational infrastructure of operating concepts, rules, guidelines, and documents. The coherency of such operating concepts, in terms of consistency and logic, is vitally important for the efficiency and safety of any complex system. In high-risk endeavors such as aircraft operations, space flight, nuclear power production, manufacturing process control, and military operations, it is essential that such support be flawless, as the price of operational error can be high. When operating rules are not adhered to, or the rules are inadequate for the task at hand, not only will the system's goals be thwarted, but there may also be tragic human and material consequences. To ensure safe and predictable operations, support to the operators, in this case flight crews, often comes in the form of standard operating procedures. These provide the crew with step-by-step guidance for carrying out their operations. Standard procedures do indeed promote uniformity, but they do so at the risk of diminishing the role of the human operator. Management, however, must recognize the danger of over-procedurization, which fails to exploit one of the most valuable assets in the system: the intelligent operator who is "on the scene." The alert system designer and operations manager recognize that there cannot be a procedure for everything, and the time will come when the operators of a complex system face a situation for which there is no written procedure. Procedures, whether executed by humans or machines, have their place, but so does human cognition.

  15. Computational Virtual Reality (VR) as a human-computer interface in the operation of telerobotic systems

    NASA Technical Reports Server (NTRS)

    Bejczy, Antal K.

    1995-01-01

    This presentation focuses on the application of computer graphics or 'virtual reality' (VR) techniques as a human-computer interface tool in the operation of telerobotic systems. VR techniques offer very valuable task realization aids for planning, previewing and predicting robotic actions, operator training, and for visual perception of non-visible events like contact forces in robotic tasks. The utility of computer graphics in telerobotic operation can be significantly enhanced by high-fidelity calibration of virtual reality images to actual TV camera images. This calibration will even permit the creation of artificial (synthetic) views of task scenes for which no TV camera views are available.

  16. Man-Machine Interaction Design and Analysis System (MIDAS): Memory Representation and Procedural Implications for Airborne Communication Modalities

    NASA Technical Reports Server (NTRS)

    Corker, Kevin M.; Pisanich, Gregory M.; Lebacqz, Victor (Technical Monitor)

    1996-01-01

    The Man-Machine Interaction Design and Analysis System (MIDAS) has been under development for the past ten years through a joint US Army and NASA cooperative agreement. MIDAS represents multiple human operators and selected perceptual, cognitive, and physical functions of those operators as they interact with simulated systems. MIDAS has been used as an integrated predictive framework for the investigation of human/machine systems, particularly in situations with high demands on the operators. Specific examples include: nuclear power plant crew simulation, military helicopter flight crew response, and police force emergency dispatch. In recent applications to airborne systems development, MIDAS has demonstrated an ability to predict flight crew decision-making and procedural behavior when interacting with automated flight management systems and Air Traffic Control. In this paper we describe two enhancements to MIDAS. The first involves the addition of working memory in the form of an articulatory buffer for verbal communication protocols and a visuo-spatial buffer for communications via digital datalink. The second enhancement is a representation of multiple operators working as a team. This enhanced model was used to predict the performance of human flight crews and their level of compliance with commercial aviation communication procedures. We show how the data produced by MIDAS compares with flight crew performance data from full mission simulations. Finally, we discuss the use of these features to study communications issues connected with aircraft-based separation assurance.

  17. Simulation based efficiency prediction of a Brushless DC drive applied in ventricular assist devices.

    PubMed

    Pohlmann, André; Hameyer, Kay

    2012-01-01

    Ventricular Assist Devices (VADs) are mechanical blood pumps that support the human heart in order to maintain sufficient perfusion of the human body and its organs. During VAD operation, blood damage caused by hemolysis, thrombogenicity, and denaturation has to be avoided. One key parameter causing the blood's denaturation is its temperature, which must not exceed 42 °C. As a temperature rise can be directly linked to the losses occurring in the drive system, this paper introduces an efficiency prediction chain for Brushless DC (BLDC) drives, which are applied in various VAD systems. The presented chain is applied to various core materials and operating ranges, providing a general overview of the loss dependencies.
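
    A loss-based efficiency estimate of the sort such a prediction chain produces might be sketched as follows. The copper-loss and Steinmetz core-loss forms are standard motor-design relations, but every coefficient and operating-point value here is an invented placeholder, not data from the paper.

```python
# Illustrative BLDC loss and efficiency estimate (placeholder numbers).
def copper_loss(r_phase_ohm, i_rms_a):
    """Ohmic loss in the three stator phases: 3 * R * I^2."""
    return 3.0 * r_phase_ohm * i_rms_a ** 2

def iron_loss_steinmetz(k, f_hz, b_peak_t, alpha=1.3, beta=2.0):
    """Steinmetz core-loss estimate; k, alpha, beta depend on the material."""
    return k * f_hz ** alpha * b_peak_t ** beta

def drive_efficiency(p_mech_w, *losses_w):
    """Efficiency = mechanical output / (output + total losses)."""
    return p_mech_w / (p_mech_w + sum(losses_w))

# Placeholder operating point: 10 W of pump power, small-motor losses.
eta = drive_efficiency(10.0,
                       copper_loss(0.5, 1.2),
                       iron_loss_steinmetz(0.001, 400.0, 0.8),
                       0.5)  # friction/windage allowance
```

    Sweeping the core-material constants and the operating point through such a chain is what yields the kind of loss-dependency overview the abstract describes, and the total loss term is what bounds the temperature rise seen by the blood.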

  18. Predicting and Managing Lighting and Visibility for Human Operations in Space

    NASA Technical Reports Server (NTRS)

    Maida, James C.; Peacock, Brian

    2003-01-01

    Lighting is critical to human visual performance. On Earth this problem is well understood, and solutions are well defined and executed. Because the sun rises and sets on average every 45 minutes during Earth orbit, humans working in space must cope with extremely dynamic lighting conditions, varying from very low light to severe glare and contrast. For critical operations, it is essential that lighting conditions be predictable and manageable. Mission planners need to determine whether low-light video cameras are required or whether additional luminaires, or lamps, need to be flown. Crews and flight directors need up-to-date daylight orbit timelines showing the best and worst viewing conditions for sunlight and shadowing. Where applicable and possible, lighting conditions need to be part of crew training. In addition, it is desirable to optimize the quantity and quality of light because of the potential impacts on crew safety, delivery costs, electrical power, and equipment maintainability for both exterior and interior conditions. Addressing these issues, an illumination modeling system has been developed in the Space Human Factors Laboratory at NASA Johnson Space Center. The system is the integration of a physically based ray-tracing package ("Radiance") developed at the Lawrence Berkeley Laboratories, a human-factors-oriented geometric modeling system developed by NASA, and an extensive database of humans and their work environments. Measured and published data have been collected for exterior and interior surface reflectivity; luminaire beam spread distribution, color, and intensity; and video camera light sensitivity, and have been associated with their corresponding geometric models. Selecting an eye point and one or more light sources, including sun and earthshine, a snapshot of the light energy reaching the surfaces or the eye point is computed. This energy map is then used to extract the information needed for useful predictions. Using a validated, comprehensive illumination model integrated with empirically derived data, predictions of lighting and viewing conditions have been successfully used for Shuttle and Space Station planning and assembly operations, balancing the needs for adequate human performance with the utilization of resources. Keywords: modeling, ray tracing, luminaires, reflectivity, luminance, illuminance.
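
    At its simplest, the kind of quantity such a model predicts is point illuminance. The sketch below uses the inverse-square cosine law as a stand-in for the full ray-traced computation; the source intensity and geometry are illustrative values, not mission data.

```python
import math

# Point-source illuminance sketch (inverse-square cosine law).
def illuminance(intensity_cd, distance_m, incidence_deg):
    """E = I * cos(theta) / d^2, in lux for I in candela and d in metres."""
    return (intensity_cd * math.cos(math.radians(incidence_deg))
            / distance_m ** 2)

# e.g. a 100 cd luminaire at 2 m, normal incidence -> 25 lux
e_lux = illuminance(100.0, 2.0, 0.0)
```

    A ray tracer such as Radiance generalizes this by summing direct and inter-reflected contributions over measured surface reflectivities and luminaire distributions.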

  19. A framework for human-hydrologic system model development integrating hydrology and water management: application to the Cutzamala water system in Mexico

    NASA Astrophysics Data System (ADS)

    Wi, S.; Freeman, S.; Brown, C.

    2017-12-01

    This study presents a general approach to developing computational models of human-hydrologic systems where human modification of hydrologic surface processes is significant or dominant. A river basin system is represented by a network of human-hydrologic response units (HHRUs) identified based on locations where river regulation happens (e.g., reservoir operation and diversions). Natural and human processes in HHRUs are simulated in a holistic framework that integrates component models representing rainfall-runoff, river routing, reservoir operation, flow diversion, and water use processes. We illustrate the approach in a case study of the Cutzamala water system (CWS) in Mexico, a complex inter-basin water transfer system supplying the Mexico City Metropolitan Area (MCMA). The human-hydrologic system model for the CWS (CUTZSIM) is evaluated against streamflow and reservoir storages measured across the CWS and against the water supplied to the MCMA. CUTZSIM improves the representation of hydrology and river-operation interaction and, in so doing, advances evaluation of system-wide water management consequences under altered climatic and demand regimes. The integrated modeling framework enables evaluation and simulation of model errors throughout the river basin, including errors in the representation of the human component processes; heretofore, model error evaluation, predictive error intervals, and the resultant improved understanding have been limited to hydrologic processes. The general framework represents an initial step towards fuller understanding and prediction of the many and varied processes that determine the hydrologic fluxes and state variables in real river basins.
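
    One HHRU of the kind described might be sketched as a daily reservoir water balance with a diversion demand and a simple release rule. Every name, rule, and number below is an invented simplification for illustration, not the CUTZSIM formulation.

```python
# Illustrative HHRU: reservoir mass balance with diversion and release.
def simulate_hhru(inflows, capacity, demand, release_frac=0.1, storage0=0.0):
    """Step a daily water balance; returns (storages, releases, diversions).

    Each step: add inflow, meet the diversion demand if water is
    available, release a fixed fraction of storage (the operating
    rule), and spill anything above capacity.
    """
    storage = storage0
    storages, releases, diversions = [], [], []
    for q_in in inflows:
        storage += q_in
        diversion = min(demand, storage)       # diversion to supply
        storage -= diversion
        release = release_frac * storage       # rule-based release
        storage -= release
        spill = max(0.0, storage - capacity)   # spill above capacity
        storage -= spill
        storages.append(storage)
        releases.append(release + spill)
        diversions.append(diversion)
    return storages, releases, diversions
```

    Chaining such units along the river network, with each unit's releases feeding the next unit's inflows, reproduces the network structure the framework describes; mass is conserved exactly at every step, which is what makes basin-wide error accounting possible.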

  20. [Study of the relationship between human quality and reliability].

    PubMed

    Long, S; Wang, C; Wang, Li; Yuan, J; Liu, H; Jiao, X

    1997-02-01

    To clarify the relationship between human quality and reliability, 1925 experiments in 20 subjects were carried out to study the relationship between disposition character, digital memory, graphic memory, multi-reaction time, and education level on the one hand and simulated aircraft operation on the other. Meanwhile, the effects of task difficulty and environmental factors on human reliability were also studied. The results showed that human quality can be predicted and evaluated through experimental methods: the better the human quality, the higher the human reliability.

  1. Forces associated with pneumatic power screwdriver operation: statics and dynamics.

    PubMed

    Lin, Jia-Hua; Radwin, Robert G; Fronczak, Frank J; Richard, Terry G

    2003-10-10

    The statics and dynamics of pneumatic power screwdriver operation were investigated in the context of predicting forces acting against the human operator. A static force model is described in the paper, based on tool geometry, mass, orientation in space, feed force, torque build-up, and stall torque. Three common power hand tool shapes are considered: pistol grip, right angle, and in-line. The static model estimates the handle force needed to support a power nutrunner when it acts against the tightened fastener with a constant torque. A system of equations for static force and moment equilibrium conditions is established, and the resultant handle force (resolved in orthogonal directions) is calculated in matrix form. A dynamic model is formulated to describe pneumatic motor torque build-up characteristics dependent on threaded fastener joint hardness. Six pneumatic tools were tested to validate the deterministic model. The average torque prediction error was 6.6% (SD = 5.4%) and the average handle force prediction error was 6.7% (SD = 6.4%) for a medium-soft threaded fastener joint. The average torque prediction error was 5.2% (SD = 5.3%) and the average handle force prediction error was 3.6% (SD = 3.2%) for a hard threaded fastener joint. Use of these equations for estimating handle forces based on passive mechanical elements representing the human operator is also described. Together, these models should be useful for considering tool handle force in the selection and design of power screwdrivers, particularly for minimizing handle forces in the prevention of injuries and work-related musculoskeletal disorders.
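
    The force and moment equilibrium described above can be sketched in two dimensions: sum the external loads on the tool (weight at the centre of gravity, feed-force reaction at the bit, fastener reaction torque) and back-solve the hand's reaction at the grip. None of the geometry or load values below are the paper's; they are invented for illustration.

```python
# 2D static sketch of the handle-force solution at the grip point.
def handle_reaction(loads, torques):
    """loads: (fx, fy, x, y) external forces at points relative to the grip
    (metres, newtons); torques: external couples about the grip (N*m).
    Returns the hand reaction (Fx, Fy, M) closing sum(F)=0 and sum(M)=0."""
    Fx = -sum(fx for fx, fy, x, y in loads)
    Fy = -sum(fy for fx, fy, x, y in loads)
    # Moment of a planar force about the origin: x*fy - y*fx
    M = -(sum(x * fy - y * fx for fx, fy, x, y in loads) + sum(torques))
    return Fx, Fy, M

# Invented example: 15 N tool weight at the CG, 20 N feed reaction
# at the bit, 5 N*m reaction torque from the tightened fastener.
Fx, Fy, M = handle_reaction(
    loads=[(0.0, -15.0, 0.05, 0.10),   # weight; CG 5 cm forward, 10 cm up
           (-20.0, 0.0, 0.0, 0.20)],   # feed reaction at the bit
    torques=[5.0])
```

    The paper's full model assembles the analogous three-dimensional equilibrium as a matrix system; this planar version shows the same balance with the algebra visible.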

  2. Applications of integrated human error identification techniques on the chemical cylinder change task.

    PubMed

    Cheng, Ching-Min; Hwang, Sheue-Ling

    2015-03-01

    This paper outlines the human error identification (HEI) techniques that currently exist to assess latent human errors. Many formal error identification techniques have existed for years, but few have been validated to cover latent human error analysis in different domains. This study considers many possible error modes and influential factors, including external error modes, internal error modes, psychological error mechanisms, and performance shaping factors, and integrates several execution procedures and frameworks of HEI techniques. The case study in this research was the operational process of changing chemical cylinders in a factory. In addition, the integrated HEI method was used to assess the operational processes and the system's reliability. It was concluded that the integrated method is a valuable aid to develop much safer operational processes and can be used to predict human error rates on critical tasks in the plant. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  3. Electric field prediction for a human body-electric machine system.

    PubMed

    Ioannides, Maria G; Papadopoulos, Peter J; Dimitropoulou, Eugenia

    2004-01-01

    A system consisting of an electric machine and a human body is studied and the resulting electric field is predicted. A 3-phase induction machine operating at full load is modeled considering its geometry, windings, and materials. A human model is also constructed approximating its geometry and the electric properties of tissues. Using the finite element technique the electric field distribution in the human body is determined for a distance of 1 and 5 m from the machine and its effects are studied. Particularly, electric field potential variations are determined at specific points inside the human body and for these points the electric field intensity is computed and compared to the limit values for exposure according to international standards.

  4. Helicopter Acoustics, part 2. [conferences

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Exterior and interior helicopter noise problems are addressed from the physics and engineering as well as the human factors point of view. Noise regulation concepts, human factors and criteria, rotor noise generation and control, design, operations and testing for noise control, helicopter noise prediction, and research tools and measurements are covered.

  5. Developing operator capacity estimates for supervisory control of autonomous vehicles.

    PubMed

    Cummings, M L; Guerlain, Stephanie

    2007-02-01

    This study examined operators' capacity to successfully reallocate highly autonomous in-flight missiles to time-sensitive targets while performing secondary tasks of varying complexity. Regardless of the level of autonomy of unmanned systems, humans will necessarily be involved in mission planning, higher-level operation, and contingency interventions, otherwise known as human supervisory control. As a result, more research is needed that addresses the impact of dynamic decision support systems that support rapid planning and replanning in time-pressured scenarios, particularly on operator workload. A dual-screen simulation that allows a single operator to monitor and control 8, 12, or 16 missiles through high-level replanning was tested with 42 U.S. Navy personnel. The most significant finding was that when attempting to control 16 missiles, participants' performance on three separate objective performance metrics and their situation awareness were significantly degraded. These results mirror studies of air traffic control that demonstrate a similar decline in performance for controllers managing 17 aircraft as compared with those managing only 10 to 11. Moreover, the results suggest that a 70% utilization (percentage busy time) score is a valid threshold for predicting significant performance decay and could be a generalizable metric that can aid in manning predictions. This research is relevant to human supervisory control of networked military and commercial unmanned vehicles in the air, on the ground, and on and under the water.
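
    The 70% utilization threshold lends itself to a simple check. The event-log format below is an assumption for illustration (busy intervals are taken to be non-overlapping); only the percent-busy-time definition comes from the study.

```python
# Utilization ("percent busy time") against a performance-decay threshold.
def utilization(busy_intervals, session_length):
    """Fraction of the session spent busy; intervals are (start, end) times,
    assumed non-overlapping and within the session."""
    busy = sum(end - start for start, end in busy_intervals)
    return busy / session_length

def predict_overload(busy_intervals, session_length, threshold=0.70):
    """True when utilization exceeds the decay threshold from the study."""
    return utilization(busy_intervals, session_length) > threshold

# e.g. 45 busy minutes in a 60-minute session -> 0.75, over threshold
```

    In a manning analysis, such a check would be run per operator over candidate vehicle-to-operator ratios to find the largest ratio that keeps predicted utilization below the threshold.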

  6. Human factors aspects of air traffic control

    NASA Technical Reports Server (NTRS)

    Older, H. J.; Cameron, B. J.

    1972-01-01

    An overview of human factors problems associated with the operation of present and future air traffic control systems is presented. A description is included of those activities and tasks performed by air traffic controllers at each operational position within the present system. Judgemental data obtained from controllers concerning psychological dimensions related to these tasks and activities are also presented. The analysis includes consideration of psychophysiological dimensions of human performance. The role of the human controller in present air traffic control systems and his predicted role in future systems is described, particularly as that role changes as the result of the system's evolution towards a more automated configuration. Special attention is directed towards problems of staffing, training, and system operation. A series of ten specific research and development projects are recommended and suggested work plans for their implementation are included.

  7. Intermittent control: a computational theory of human control.

    PubMed

    Gawthrop, Peter; Loram, Ian; Lakie, Martin; Gollee, Henrik

    2011-02-01

    The paradigm of continuous control using internal models has advanced understanding of human motor control. However, this paradigm ignores some aspects of human control, including intermittent feedback, serial ballistic control, triggered responses and refractory periods. It is shown that event-driven intermittent control provides a framework to explain the behaviour of the human operator under a wider range of conditions than continuous control. Continuous control is included as a special case, but sampling, system matched hold, an intermittent predictor and an event trigger allow serial open-loop trajectories using intermittent feedback. The implementation here may be described as "continuous observation, intermittent action". Beyond explaining unimodal regulation distributions in common with continuous control, these features naturally explain refractoriness and bimodal stabilisation distributions observed in double stimulus tracking experiments and quiet standing, respectively. Moreover, given that human control systems contain significant time delays, a biological-cybernetic rationale favours intermittent over continuous control: intermittent predictive control is computationally less demanding than continuous predictive control. A standard continuous-time predictive control model of the human operator is used as the underlying design method for an event-driven intermittent controller. It is shown that when event thresholds are small and sampling is regular, the intermittent controller can masquerade as the underlying continuous-time controller and thus, under these conditions, the continuous-time and intermittent controller cannot be distinguished. This explains why the intermittent control hypothesis is consistent with the continuous control hypothesis for certain experimental conditions.
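    As a rough illustration of the "continuous observation, intermittent action" idea above, an event-driven controller with a system-matched hold can be sketched on a toy scalar plant. The plant, gains, disturbance, and event threshold here are assumptions for illustration, not the paper's model.

```python
import numpy as np

# Toy event-driven intermittent controller with a system-matched hold.
# Plant, gains, disturbance and event threshold are illustrative choices.

a, b = 1.02, 0.1      # mildly unstable plant: x[k+1] = a*x[k] + b*u[k] + d[k]
K = 5.0               # state-feedback gain (closed loop: a - b*K = 0.52)
delta = 0.05          # event threshold on prediction error

x, xhat = 1.0, 1.0    # true state and the controller's internal prediction
events = 0
for k in range(100):
    u = -K * xhat                                  # open-loop control from the hold
    x = a * x + b * u + 0.02 * np.sin(0.3 * k)     # plant with small disturbance
    xhat = a * xhat + b * u                        # system-matched hold prediction
    if abs(x - xhat) > delta:                      # event: resample the true state
        xhat = x
        events += 1

print(events)          # feedback is used only intermittently over 100 steps
print(round(x, 3))     # state remains regulated near zero
```

    Shrinking `delta` toward zero recovers continuous feedback at every step, matching the paper's point that for small event thresholds the intermittent controller can masquerade as the underlying continuous-time controller.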

  8. Using landscape disturbance and succession models to support forest management

    Treesearch

    Eric J. Gustafson; Brian R. Sturtevant; Anatoly S. Shvidenko; Robert M. Scheller

    2010-01-01

    Managers of forested landscapes must account for multiple, interacting ecological processes operating at broad spatial and temporal scales. These interactions can be of such complexity that predictions of future forest ecosystem states are beyond the analytical capability of the human mind. Landscape disturbance and succession models (LDSM) are predictive and...

  9. Learning-dependent plasticity in human auditory cortex during appetitive operant conditioning.

    PubMed

    Puschmann, Sebastian; Brechmann, André; Thiel, Christiane M

    2013-11-01

    Animal experiments provide evidence that learning to associate an auditory stimulus with a reward causes representational changes in auditory cortex. However, most studies did not investigate the temporal formation of learning-dependent plasticity during the task but rather compared auditory cortex receptive fields before and after conditioning. We here present a functional magnetic resonance imaging study on learning-related plasticity in the human auditory cortex during operant appetitive conditioning. Participants had to learn to associate a specific category of frequency-modulated tones with a reward. Only participants who learned this association developed learning-dependent plasticity in left auditory cortex over the course of the experiment. No differential responses to reward predicting and nonreward predicting tones were found in auditory cortex in nonlearners. In addition, learners showed similar learning-induced differential responses to reward-predicting and nonreward-predicting tones in the ventral tegmental area and the nucleus accumbens, two core regions of the dopaminergic neurotransmitter system. This may indicate a dopaminergic influence on the formation of learning-dependent plasticity in auditory cortex, as it has been suggested by previous animal studies. Copyright © 2012 Wiley Periodicals, Inc.

  10. Improving Human/Autonomous System Teaming Through Linguistic Analysis

    NASA Technical Reports Server (NTRS)

    Meszaros, Erica L.

    2016-01-01

    An area of increasing interest for the next generation of aircraft is autonomy and the integration of increasingly autonomous systems into the national airspace. Such integration requires humans to work closely with autonomous systems, forming human and autonomous agent teams. The intention behind such teaming is that a team composed of both humans and autonomous agents will operate better than homogeneous teams. Procedures exist for licensing pilots to operate in the national airspace system, and current work is being done to define methods for validating the function of autonomous systems; however, there is no method in place for assessing the interaction of these two disparate systems. Moreover, these systems are currently operated primarily by subject matter experts, limiting their use and the benefits of such teams. Providing additional information about the ongoing mission to the operator can lead to increased usability and allow for operation by non-experts. Linguistic analysis of the context of verbal communication provides insight into the intended meaning of commonly heard phrases such as "What's it doing now?" Analyzing the semantic sphere surrounding these common phrases enables the prediction of the operator's intent and allows the interface to supply the operator's desired information.

  11. Real-time reservoir operation considering non-stationary inflow prediction

    NASA Astrophysics Data System (ADS)

    Zhao, J.; Xu, W.; Cai, X.; Wang, Z.

    2011-12-01

    Stationarity of inflow has been a basic assumption in reservoir operation rule design, an assumption now facing challenges due to climate change and human interference. This paper proposes a modeling framework to incorporate non-stationary inflow prediction for optimizing the hedging operation rule of large reservoirs with multiple-year flow regulation capacity. A multi-stage optimization model is formulated, and a solution algorithm based on the optimality conditions is developed to incorporate non-stationary annual inflow prediction through a rolling, dynamic framework that updates the prediction from period to period and adopts the updated prediction in reservoir operation decisions. The prediction model is ARIMA(4,1,0), in which 4 is the autoregressive order, 1 is the order of differencing (removing a linear trend), and 0 is the moving-average order. The modeling framework and solution algorithm are applied to the Miyun reservoir in China, determining a yearly operating schedule for the period from 1996 to 2009, during which there was a significant declining trend in reservoir inflow. Different operation policy scenarios are modeled, including the standard operation policy (SOP, matching the current demand as closely as possible), the hedging rule (i.e., leaving a certain amount of water for the future to avoid a large risk of water deficit) with forecasts from ARIMA (HR-1), and hedging with a perfect forecast (HR-2). Comparing the results of these scenarios with those of the actual reservoir operation (AO), the utility of the reservoir operation under HR-1 is 3.0% lower than under HR-2, but 3.7% higher than under AO and 14.4% higher than under SOP. Note that the utility under AO is 10.3% higher than that under SOP, which shows that a certain level of hedging under some inflow prediction or forecast was used in the real-world operation. Moreover, the impacts of the discount rate and the forecast uncertainty level on the operation are discussed.
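    The ARIMA(4,1,0) structure described above can be sketched directly: difference the series once, fit four autoregressive lags by least squares, and forecast one step ahead. The inflow series below is synthetic (a declining trend plus noise, loosely mimicking the Miyun case), not the study's record.

```python
import numpy as np

# Sketch of an ARIMA(4,1,0) one-step forecast on a synthetic inflow series.

rng = np.random.default_rng(0)
years = np.arange(30)
inflow = 12.0 - 0.15 * years + rng.normal(0, 0.3, 30)   # declining trend + noise

d = np.diff(inflow)          # the "1": one difference removes the linear trend
p = 4                        # the "4": autoregressive order (the MA order is 0)
# Design matrix for d[t] ~ c + phi_1*d[t-1] + ... + phi_4*d[t-4]
X = np.column_stack(
    [np.ones(len(d) - p)] +
    [d[p - i - 1:len(d) - i - 1] for i in range(p)])
y = d[p:]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# One-step-ahead forecast: predict the next difference, then undo differencing.
last = d[-1:-p - 1:-1]       # the most recent differences, newest first
next_diff = coef[0] + coef[1:] @ last
forecast = inflow[-1] + next_diff
print(round(forecast, 2))    # predicted inflow for the next year
```

    In a rolling framework like the one described above, this fit-and-forecast step would be repeated each period as new inflow data arrive.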

  12. Information systems and human error in the lab.

    PubMed

    Bissell, Michael G

    2004-01-01

    Health system costs in clinical laboratories are incurred daily due to human error. Indeed, a major impetus for automating clinical laboratories has always been the opportunity it presents to simultaneously reduce cost and improve quality of operations by decreasing human error. But merely automating these processes is not enough. To the extent that introduction of these systems results in operators having less practice in dealing with unexpected events or becoming deskilled in problem-solving, new kinds of error will likely appear. Clinical laboratories could potentially benefit by integrating findings on human error from modern behavioral science into their operations. Fully understanding human error requires a deep understanding of human information processing and cognition. Predicting and preventing negative consequences requires application of this understanding to laboratory operations. Although the occurrence of a particular error at a particular instant cannot be absolutely prevented, human error rates can be reduced. The following principles are key: an understanding of the process of learning in relation to error; an understanding of the origin of errors, since this knowledge can be used to reduce their occurrence; optimal systems should be forgiving to the operator by absorbing errors, at least for a time; although much is known by industrial psychologists about how to write operating procedures and instructions in ways that reduce the probability of error, this expertise is hardly ever put to use in the laboratory; and a feedback mechanism must be designed into the system that enables the operator to recognize in real time that an error has occurred.

  13. Method for Prediction of the Power Output from Photovoltaic Power Plant under Actual Operating Conditions

    NASA Astrophysics Data System (ADS)

    Obukhov, S. G.; Plotnikov, I. A.; Surzhikova, O. A.; Savkin, K. D.

    2017-04-01

    Solar photovoltaic technology is one of the most rapidly growing renewable sources of electricity that has practical application in various fields of human activity due to its high availability, huge potential and environmental compatibility. The original simulation model of the photovoltaic power plant has been developed to simulate and investigate the plant operating modes under actual operating conditions. The proposed model considers the impact of the external climatic factors on the solar panel energy characteristics that improves accuracy in the power output prediction. The data obtained through the photovoltaic power plant operation simulation enable a well-reasoned choice of the required capacity for storage devices and determination of the rational algorithms to control the energy complex.
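    The abstract does not give the model's equations; as a hedged illustration of how climatic factors typically enter such a power prediction, the sketch below uses a common textbook approximation (an NOCT cell-temperature estimate and a linear power temperature coefficient), with all panel parameters assumed rather than taken from the paper.

```python
# A common simplified PV output model illustrating how irradiance and ambient
# temperature enter the power prediction; a textbook approximation, not the
# original paper's simulation model. All parameters below are assumed.

P_STC = 250.0     # panel rating at standard test conditions, W
GAMMA = -0.0045   # power temperature coefficient, 1/degC (typical silicon)
NOCT = 45.0       # nominal operating cell temperature, degC (typical)

def cell_temperature(g, t_ambient):
    """Estimate cell temperature from irradiance g (W/m^2) and ambient temp."""
    return t_ambient + g * (NOCT - 20.0) / 800.0

def pv_power(g, t_ambient):
    """Predicted DC output of one panel under actual operating conditions."""
    t_cell = cell_temperature(g, t_ambient)
    return P_STC * (g / 1000.0) * (1.0 + GAMMA * (t_cell - 25.0))

# Hot, sunny afternoon: output falls well below the 250 W nameplate rating.
print(round(pv_power(1000.0, 35.0), 1))   # → 203.6
```

    Simulated output of this kind, accumulated over a weather time series, is what supports the storage-sizing choices mentioned above.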

  14. OMV man/system simulation integration: A preliminary analysis and recommendation

    NASA Technical Reports Server (NTRS)

    Rogers, Jon G.

    1988-01-01

    The Orbital Maneuvering Vehicle (OMV) presents a series of challenges to the human operator. Some are unique to the OMV system itself, and are largely due to remote control versus control from the cockpit. Other challenges are not necessarily unique to the OMV, but are characteristic of many man-machine space flight systems. All of these challenges affect the operator's ability to perform his portion of the mission, and could lead to human error which might jeopardize the vehicle, mission, or both. It is imperative to make every effort to design the controls and displays to facilitate the operator's task. The experimental program should address the perceptual, mediational, and motor dimensions of operator performance. With this in mind, a literature review with relevant design considerations was initiated, and a comprehensive outline of control/display parameters was developed. Out of this, a series of questions not answered in the literature was derived which can be converted into experimental protocols for the simulation program. A major task of the aircraft pilot as well as the OMV operator is prediction. Certain display principles have been shown to enhance the pilot's ability to predict. A brief examination of some of these principles in relation to the OMV may be useful.

  15. An integrated approach to rotorcraft human factors research

    NASA Technical Reports Server (NTRS)

    Hart, Sandra G.; Hartzell, E. James; Voorhees, James W.; Bucher, Nancy M.; Shively, R. Jay

    1988-01-01

    As the potential of civil and military helicopters has increased, more complex and demanding missions in increasingly hostile environments have been required. Users, designers, and manufacturers have an urgent need for information about human behavior and function to create systems that take advantage of human capabilities, without overloading them. Because there is a large gap between what is known about human behavior and the information needed to predict pilot workload and performance in the complex missions projected for pilots of advanced helicopters, Army and NASA scientists are actively engaged in Human Factors Research at Ames. The research ranges from laboratory experiments to computational modeling, simulation evaluation, and inflight testing. Information obtained in highly controlled but simpler environments generates predictions which can be tested in more realistic situations. These results are used, in turn, to refine theoretical models, provide the focus for subsequent research, and ensure operational relevance, while maintaining predictive advantages. The advantages and disadvantages of each type of research are described along with examples of experimental results.

  16. Evaluating Nextgen Closely Spaced Parallel Operations Concepts with Validated Human Performance Models: Flight Deck Guidelines

    NASA Technical Reports Server (NTRS)

    Hooey, Becky Lee; Gore, Brian Francis; Mahlstedt, Eric; Foyle, David C.

    2013-01-01

    The objectives of the current research were to develop valid human performance models (HPMs) of approach and land operations; use these models to evaluate the impact of NextGen Closely Spaced Parallel Operations (CSPO) on pilot performance; and draw conclusions regarding flight deck display design and pilot-ATC roles and responsibilities for NextGen CSPO concepts. This document presents guidelines and implications for flight deck display designs and candidate roles and responsibilities. A companion document (Gore, Hooey, Mahlstedt, & Foyle, 2013) provides complete scenario descriptions and results including predictions of pilot workload, visual attention and time to detect off-nominal events.

  17. Deep-reasoning fault diagnosis - An aid and a model

    NASA Technical Reports Server (NTRS)

    Yoon, Wan Chul; Hammer, John M.

    1988-01-01

    The design and evaluation are presented for the knowledge-based assistance of a human operator who must diagnose a novel fault in a dynamic, physical system. A computer aid based on a qualitative model of the system was built to help the operators overcome some of their cognitive limitations. This aid differs from most expert systems in that it operates at several levels of interaction that are believed to be more suitable for deep reasoning. Four aiding approaches, each of which provided unique information to the operator, were evaluated. The aiding features were designed to help the human's causal reasoning about the system in predicting normal system behavior (N aiding), integrating observations into actual system behavior (O aiding), finding discrepancies between the two (O-N aiding), or finding discrepancies between observed behavior and hypothetical behavior (O-HN aiding). Human diagnostic performance was found to improve by almost a factor of two with O aiding and O-N aiding.

  18. Prediction of brain tissue temperature using near-infrared spectroscopy.

    PubMed

    Holper, Lisa; Mitra, Subhabrata; Bale, Gemma; Robertson, Nicola; Tachtsidis, Ilias

    2017-04-01

    Broadband near-infrared spectroscopy (NIRS) can provide an endogenous indicator of tissue temperature based on the temperature dependence of the water absorption spectrum. We describe a first evaluation of the calibration and prediction of brain tissue temperature obtained during hypothermia in newborn piglets (animal dataset) and rewarming in newborn infants (human dataset) based on measured body (rectal) temperature. The calibration using partial least squares regression proved to be a reliable method to predict brain tissue temperature with respect to core body temperature in the wavelength interval of 720 to 880 nm with a strong mean predictive power of R2=0.713±0.157 (animal dataset) and R2=0.798±0.087 (human dataset). In addition, we applied regression receiver operating characteristic curves for the first time to evaluate the temperature prediction, which provided an overall mean error bias between NIRS predicted brain temperature and body temperature of 0.436±0.283°C (animal dataset) and 0.162±0.149°C (human dataset). We discuss main methodological aspects, particularly the well-known aspect of over- versus underestimation between brain and body temperature, which is relevant for potential clinical applications.
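    A minimal sketch of the partial-least-squares calibration idea, assuming synthetic "spectra" in place of real NIRS data; the NIPALS implementation and signal model below are illustrative, not the authors' pipeline.

```python
import numpy as np

# Predict a scalar temperature from many correlated spectral channels using
# a minimal PLS1 (NIPALS) fit. The spectra are synthetic, not NIRS data.

rng = np.random.default_rng(1)
n, n_channels = 40, 20
temp = rng.uniform(33.0, 38.0, n)                 # core temperatures, degC
basis = rng.normal(0.0, 1.0, n_channels)          # temperature signature
spectra = np.outer(temp - temp.mean(), basis)     # temperature-dependent signal
spectra += rng.normal(0.0, 0.5, (n, n_channels))  # measurement noise

def pls1_fit(X, y, ncomp=2):
    """PLS1 via NIPALS; returns regression coefficients for centered data."""
    Xc, yc = X - X.mean(0), y - y.mean()
    W, P, Q = [], [], []
    for _ in range(ncomp):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)            # weight vector
        t = Xc @ w                        # score vector
        tt = t @ t
        p = Xc.T @ t / tt                 # X loading
        q = (yc @ t) / tt                 # y loading
        Xc = Xc - np.outer(t, p)          # deflate before the next component
        yc = yc - q * t
        W.append(w); P.append(p); Q.append(q)
    W, P = np.array(W).T, np.array(P).T
    return W @ np.linalg.inv(P.T @ W) @ np.array(Q)

coef = pls1_fit(spectra, temp)
pred = (spectra - spectra.mean(0)) @ coef + temp.mean()
r2 = 1 - np.sum((temp - pred) ** 2) / np.sum((temp - temp.mean()) ** 2)
print(round(r2, 3))   # in-sample predictive power on this synthetic data
```

    The study's R2 figures were obtained with proper calibration/validation splits; the in-sample fit here only illustrates the mechanics.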

  19. A predictive model of human performance.

    NASA Technical Reports Server (NTRS)

    Walters, R. F.; Carlson, L. D.

    1971-01-01

    An attempt is made to develop a model describing the overall responses of humans to exercise and environmental stresses for prediction of exhaustion vs an individual's physical characteristics. The principal components of the model are a steady state description of circulation and a dynamic description of thermal regulation. The circulatory portion of the system accepts changes in work load and oxygen pressure, while the thermal portion is influenced by external factors of ambient temperature, humidity and air movement, affecting skin blood flow. The operation of the model is discussed and its structural details are given.

  20. NOAA's world-class weather and climate prediction center opens at

    Science.gov Websites

    NOAA's world-class weather and climate prediction center opens. Billions of earth observations from around the world flow into its operations, which cover ocean currents and large-scale rain and snow storms. Investing in this center is an investment in our human capital, serving as a world-class facility.

  1. The Impact of Trajectory Prediction Uncertainty on Air Traffic Controller Performance and Acceptability

    NASA Technical Reports Server (NTRS)

    Mercer, Joey S.; Bienert, Nancy; Gomez, Ashley; Hunt, Sarah; Kraut, Joshua; Martin, Lynne; Morey, Susan; Green, Steven M.; Prevot, Thomas; Wu, Minghong G.

    2013-01-01

    A Human-In-The-Loop air traffic control simulation investigated the impact of uncertainties in trajectory predictions on NextGen Trajectory-Based Operations concepts, seeking to understand when the automation would become unacceptable to controllers or when performance targets could no longer be met. Retired air traffic controllers staffed two en route transition sectors, delivering arrival traffic to the northwest corner-post of Atlanta approach control under time-based metering operations. Using trajectory-based decision-support tools, the participants worked the traffic under varying levels of wind forecast error and aircraft performance model error, impacting the ground automation's ability to make accurate predictions. Results suggest that the controllers were able to maintain high levels of performance despite even the highest levels of trajectory prediction error.

  2. Inductive reasoning about causally transmitted properties.

    PubMed

    Shafto, Patrick; Kemp, Charles; Bonawitz, Elizabeth Baraff; Coley, John D; Tenenbaum, Joshua B

    2008-11-01

    Different intuitive theories constrain and guide inferences in different contexts. Formalizing simple intuitive theories as probabilistic processes operating over structured representations, we present a new computational model of category-based induction about causally transmitted properties. A first experiment demonstrates undergraduates' context-sensitive use of taxonomic and food web knowledge to guide reasoning about causal transmission and shows good qualitative agreement between model predictions and human inferences. A second experiment demonstrates strong quantitative and qualitative fits to inferences about a more complex artificial food web. A third experiment investigates human reasoning about complex novel food webs where species have known taxonomic relations. Results demonstrate a double-dissociation between the predictions of our causal model and a related taxonomic model [Kemp, C., & Tenenbaum, J. B. (2003). Learning domain structures. In Proceedings of the 25th annual conference of the cognitive science society]: the causal model predicts human inferences about diseases but not genes, while the taxonomic model predicts human inferences about genes but not diseases. We contrast our framework with previous models of category-based induction and previous formal instantiations of intuitive theories, and outline challenges in developing a complete model of context-sensitive reasoning.

  3. Base-Rate Neglect as a Function of Base Rates in Probabilistic Contingency Learning

    ERIC Educational Resources Information Center

    Kutzner, Florian; Freytag, Peter; Vogel, Tobias; Fiedler, Klaus

    2008-01-01

    When humans predict criterion events based on probabilistic predictors, they often lend excessive weight to the predictor and insufficient weight to the base rate of the criterion event. In an operant analysis, using a matching-to-sample paradigm, Goodie and Fantino (1996) showed that humans exhibit base-rate neglect when predictors are associated…

  4. A coronagraph for operational space weather prediction

    NASA Astrophysics Data System (ADS)

    Middleton, Kevin F.

    2017-09-01

    Accurate prediction of the arrival of solar wind phenomena, in particular coronal mass ejections (CMEs), at Earth, and possibly elsewhere in the heliosphere, is becoming increasingly important given our ever-increasing reliance on technology. The potentially severe impact on human technological systems of such phenomena is termed space weather. A coronagraph is arguably the instrument that provides the earliest definitive evidence of CME eruption; from a vantage point on or near the Sun-Earth line, a coronagraph can provide near-definitive identification of an Earth-bound CME. Currently, prediction of CME arrival is critically dependent on ageing science coronagraphs whose design and operation were not optimized for space weather services. We describe the early stages of the conceptual design of SCOPE (the Solar Coronagraph for OPErations), optimized to support operational space weather services.

  5. Development of a Model for Human Operator Learning in Continuous Estimation and Control Tasks.

    DTIC Science & Technology

    1983-12-01

    and (3) a " precognitive mode" in 𔄁 17 which the pilot is able to take full advantage of any predictability "" inherent in the external inputs and can...allow application of a partial feedforward strategy; and (3) a " precognitive " mode in which full advantage is taken of any predictability of the

  6. Application of robotic manipulability indices to evaluate thumb performance during smartphone touch operations.

    PubMed

    Endo, Hiroshi

    2015-01-01

    This study examined whether manipulability during smartphone thumb-based touch operations could be predicted by the following robotic manipulability indices: the volume and direction of the 'manipulability ellipsoid' (MEd), both of which evaluate the influence of kinematics on manipulability. Limits of the thumb's range of motion were considered in the MEd to improve predictability. Thumb postures at 25 key target locations were measured in 16 subjects. Though there was no correlation between subjective evaluation and the volume of the MEd, high correlation was obtained when motion range limits were taken into account. These limits changed the size of the MEd and improved the accuracy of the manipulability evaluation. Movement directions associated with higher performance could also be predicted. In conclusion, robotic manipulability indices with motion range limits were considered to be useful measures for quantitatively evaluating human hand operations.
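    The manipulability-ellipsoid quantities referenced above come from the Jacobian of the kinematic chain; for a planar two-link stand-in for the thumb (link lengths and joint angles below are assumed, not measured thumb data), the ellipsoid volume and its principal directions can be computed as follows.

```python
import numpy as np

# Manipulability ellipsoid of a planar two-link chain: the volume is
# Yoshikawa's measure sqrt(det(J J^T)) and the principal directions are the
# left singular vectors of the Jacobian J. Geometry here is illustrative.

def jacobian_2link(l1, l2, q1, q2):
    """Velocity Jacobian of a planar two-link chain at joint angles q1, q2."""
    s1, c1 = np.sin(q1), np.cos(q1)
    s12, c12 = np.sin(q1 + q2), np.cos(q1 + q2)
    return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                     [ l1 * c1 + l2 * c12,  l2 * c12]])

J = jacobian_2link(l1=0.05, l2=0.03, q1=0.4, q2=0.9)   # lengths in metres
U, s, _ = np.linalg.svd(J)

volume = np.sqrt(np.linalg.det(J @ J.T))   # equals the product s[0] * s[1]
easy_direction = U[:, 0]                   # axis of easiest fingertip motion
print(round(volume, 6))
print(np.allclose(volume, s[0] * s[1]))    # the two computations agree
```

    The paper's refinement, incorporating the thumb's range-of-motion limits, would shrink this ellipsoid along directions the joints cannot actually move, which is what improved the correlation with subjective manipulability.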

  7. Piloting Vertical Flight Aircraft: A Conference on Flying Qualities and Human Factors

    NASA Technical Reports Server (NTRS)

    Blanken, Christopher L. (Editor); Whalley, Matthew S. (Editor)

    1993-01-01

    This document contains papers from a specialists' meeting entitled 'Piloting Vertical Flight Aircraft: A Conference on Flying Qualities and Human Factors.' Vertical flight aircraft, including helicopters and a variety of Vertical Takeoff and Landing (VTOL) concepts, place unique requirements on human perception, control, and performance for the conduct of their design missions. The intent of this conference was to examine, for these vehicles, advances in: (1) design of flight control systems for ADS-33C standards; (2) assessment of human factors influences of cockpit displays and operational procedures; (3) development of VTOL design and operational criteria; and (4) development of theoretical methods or models for predicting pilot/vehicle performance and mission suitability. A secondary goal of the conference was to provide an initial venue for enhanced interaction between human factors and handling qualities specialists.

  8. Flare forecasting at the Met Office Space Weather Operations Centre

    NASA Astrophysics Data System (ADS)

    Murray, S. A.; Bingham, S.; Sharpe, M.; Jackson, D. R.

    2017-04-01

    The Met Office Space Weather Operations Centre produces 24/7/365 space weather guidance, alerts, and forecasts to a wide range of government and commercial end-users across the United Kingdom. Solar flare forecasts are one of its products, which are issued multiple times a day in two forms: forecasts for each active region on the solar disk over the next 24 h and full-disk forecasts for the next 4 days. Here the forecasting process is described in detail, as well as first verification of archived forecasts using methods commonly used in operational weather prediction. Real-time verification available for operational flare forecasting use is also described. The influence of human forecasters is highlighted, with human-edited forecasts outperforming original model results and forecasting skill decreasing over longer forecast lead times.

  9. Operational Forecasting and Warning systems for Coastal hazards in Korea

    NASA Astrophysics Data System (ADS)

    Park, Kwang-Soon; Kwon, Jae-Il; Kim, Jin-Ah; Heo, Ki-Young; Jun, Kicheon

    2017-04-01

    Coastal hazards caused by both Mother Nature and humans inflict tremendous social, economic, and environmental damage. To mitigate these damages, many countries run operational forecasting or warning systems. Since 2009, the Korea Operational Oceanographic System (KOOS) has been developed under the lead of the Korea Institute of Ocean Science and Technology (KIOST), and KOOS has been in operation since 2012. KOOS consists of several operational modules of numerical models and real-time observations, and produces basic forecast variables such as winds, tides, waves, currents, temperature, and salinity. Practical application systems include storm surge, oil spill, and search-and-rescue prediction models. In particular, abnormal high waves (swell-like high-height waves) have occurred on the east coast of the Korean peninsula during the winter season, owing to local meteorological conditions over the East Sea, causing property damage and the loss of human lives. In order to improve wave forecast accuracy, even for very local wave characteristics, a numerical wave modeling system using SWAN was established with a data assimilation module using 4D-EnKF, and sensitivity tests have been conducted. For the prediction of severe waves during typhoon periods and to support decision-making on ship evacuation, a high-resolution wave forecasting system has been established and calibrated.

  10. Predicting space telerobotic operator training performance from human spatial ability assessment

    NASA Astrophysics Data System (ADS)

    Liu, Andrew M.; Oman, Charles M.; Galvan, Raquel; Natapoff, Alan

    2013-11-01

    Our goal was to determine whether existing tests of spatial ability can predict an astronaut's qualification test performance after robotic training. Because training astronauts to be qualified robotics operators is so long and expensive, NASA is interested in tools that can predict robotics performance before training begins. Currently, the Astronaut Office does not have a validated tool to predict robotics ability as part of its astronaut selection or training process. Commonly used tests of human spatial ability may provide such a tool. We tested the spatial ability of 50 active astronauts who had completed at least one robotics training course, then used logistic regression models to analyze the correlation between spatial ability test scores and the astronauts' performance in their evaluation test at the end of the training course. The fit of the logistic function to our data is statistically significant for several spatial tests. However, the prediction performance of the logistic model depends on the criterion threshold assumed. To clarify the critical selection issues, we show how the probability of correct classification versus misclassification varies as a function of the mental rotation test criterion level. Since the costs of misclassification are low, the logistic models of spatial ability and robotic performance are reliable enough only to be used to customize regular and remedial training. We suggest several changes in tracking performance throughout robotics training that could improve the range and reliability of predictive models.
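    A hedged sketch of the logistic modeling described above, using synthetic scores and pass/fail labels in place of the astronaut data; the score scale and fitting details are illustrative assumptions.

```python
import numpy as np

# Logistic regression relating a spatial-ability score to the probability of
# passing a qualification test. Data below are synthetic, not astronaut data.

rng = np.random.default_rng(2)
scores = rng.normal(20.0, 4.0, 200)               # e.g. mental rotation scores
true_p = 1 / (1 + np.exp(-(scores - 20.0) / 2.0))
passed = (rng.uniform(size=200) < true_p).astype(float)

mu, sd = scores.mean(), scores.std()              # standardize for stable fitting
z = (scores - mu) / sd

def fit_logistic(x, y, lr=0.1, steps=2000):
    """Fit p(pass) = sigmoid(b0 + b1*x) by gradient ascent on log-likelihood."""
    b0 = b1 = 0.0
    for _ in range(steps):
        p = 1 / (1 + np.exp(-(b0 + b1 * x)))
        b0 += lr * np.mean(y - p)
        b1 += lr * np.mean((y - p) * x)
    return b0, b1

b0, b1 = fit_logistic(z, passed)

def p_pass(score):
    """Predicted probability of passing, for a raw (unstandardized) score."""
    return 1 / (1 + np.exp(-(b0 + b1 * (score - mu) / sd)))

# Classification then depends on the criterion threshold the selector chooses;
# raising it trades false acceptances for false rejections, as discussed above.
print(round(p_pass(26.0), 2), round(p_pass(14.0), 2))
```

    Sweeping a threshold over `p_pass` traces exactly the correct-classification versus misclassification trade-off the abstract describes.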

  11. Predict human body indentation lying on a spring mattress using a neural network approach.

    PubMed

    Zhong, Shilu; Shen, Liming; Zhou, Lijuan; Guan, Zhongwei

    2014-08-01

    This article presents a method to predict and assess the interaction between a human body and a spring mattress. A three-layer artificial neural network model was developed to simulate and predict an indentation curve of the human spine, characterized by the depth of lumbar lordosis and four inclination angles: cervicothoracic, thoracolumbar, lumbosacral, and back-hip (β). By comparing the spinal indentation curves described by the optimal evaluation parameters (depth of lumbar lordosis, cervicothoracic, thoracolumbar, and lumbosacral), a better design of five-zone spring mattresses was obtained, giving individuals effective support for the main parts of the body. Using this approach, an operating process was further introduced, in which appropriate stiffness proportions were proposed to design mattresses for the normal body types of young Chinese women. Finally, case studies were undertaken, which show that the method developed is feasible and practical. © IMechE 2014.

  12. Making an Effort to Feel Positive: Insecure Attachment in Infancy Predicts the Neural Underpinnings of Emotion Regulation in Adulthood

    ERIC Educational Resources Information Center

    Moutsiana, Christina; Fearon, Pasco; Murray, Lynne; Cooper, Peter; Goodyer, Ian; Johnstone, Tom; Halligan, Sarah

    2014-01-01

    Background: Animal research indicates that the neural substrates of emotion regulation may be persistently altered by early environmental exposures. If similar processes operate in human development then this is significant, as the capacity to regulate emotional states is fundamental to human adaptation. Methods: We utilised a 22-year longitudinal…

  13. Prediction of brain tissue temperature using near-infrared spectroscopy

    PubMed Central

    Holper, Lisa; Mitra, Subhabrata; Bale, Gemma; Robertson, Nicola; Tachtsidis, Ilias

    2017-01-01

    Abstract. Broadband near-infrared spectroscopy (NIRS) can provide an endogenous indicator of tissue temperature based on the temperature dependence of the water absorption spectrum. We describe a first evaluation of the calibration and prediction of brain tissue temperature obtained during hypothermia in newborn piglets (animal dataset) and rewarming in newborn infants (human dataset) based on measured body (rectal) temperature. The calibration using partial least squares regression proved to be a reliable method to predict brain tissue temperature with respect to core body temperature in the wavelength interval of 720 to 880 nm with a strong mean predictive power of R2=0.713±0.157 (animal dataset) and R2=0.798±0.087 (human dataset). In addition, we applied regression receiver operating characteristic curves for the first time to evaluate the temperature prediction, which provided an overall mean error bias between NIRS predicted brain temperature and body temperature of 0.436±0.283°C (animal dataset) and 0.162±0.149°C (human dataset). We discuss main methodological aspects, particularly the well-known aspect of over- versus underestimation between brain and body temperature, which is relevant for potential clinical applications. PMID:28630878

  14. Comparison between Surrogate Indexes of Insulin Sensitivity/Resistance and Hyperinsulinemic Euglycemic Glucose Clamps in Rhesus Monkeys

    PubMed Central

    Lee, Ho-Won; Muniyappa, Ranganath; Yan, Xu; Yue, Lilly Q.; Linden, Ellen H.; Chen, Hui; Hansen, Barbara C.

    2011-01-01

    The euglycemic glucose clamp is the reference method for assessing insulin sensitivity in humans and animals. However, clamps are ill-suited for large studies because of extensive requirements for cost, time, labor, and technical expertise. Simple surrogate indexes of insulin sensitivity/resistance including quantitative insulin-sensitivity check index (QUICKI) and homeostasis model assessment (HOMA) have been developed and validated in humans. However, validation studies of QUICKI and HOMA in both rats and mice suggest that differences in metabolic physiology between rodents and humans limit their value in rodents. Rhesus monkeys are a species more similar to humans than rodents. Therefore, in the present study, we evaluated data from 199 glucose clamp studies obtained from a large cohort of 86 monkeys with a broad range of insulin sensitivity. Data were used to evaluate simple surrogate indexes of insulin sensitivity/resistance (QUICKI, HOMA, Log HOMA, 1/HOMA, and 1/Fasting insulin) with respect to linear regression, predictive accuracy using a calibration model, and diagnostic performance using receiver operating characteristic. Most surrogates had modest linear correlations with SIClamp (r ≈ 0.4–0.64) with comparable correlation coefficients. Predictive accuracy determined by calibration model analysis demonstrated better predictive accuracy of QUICKI than HOMA and Log HOMA. Receiver operating characteristic analysis showed equivalent sensitivity and specificity of most surrogate indexes to detect insulin resistance. Thus, unlike in rodents but similar to humans, surrogate indexes of insulin sensitivity/resistance including QUICKI and log HOMA may be reasonable to use in large studies of rhesus monkeys where it may be impractical to conduct glucose clamp studies. PMID:21209021
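    The surrogate indexes named above have standard published definitions (HOMA-IR from fasting glucose in mg/dL and fasting insulin in µU/mL; QUICKI from the base-10 logs of the same quantities). A minimal sketch with an illustrative fasting sample, assuming log HOMA is taken base 10:

```python
import math

def homa_ir(glucose_mgdl: float, insulin_uU_ml: float) -> float:
    """HOMA-IR from fasting glucose (mg/dL) and fasting insulin (uU/mL)."""
    return glucose_mgdl * insulin_uU_ml / 405.0

def quicki(glucose_mgdl: float, insulin_uU_ml: float) -> float:
    """QUICKI = 1 / (log10(fasting insulin) + log10(fasting glucose))."""
    return 1.0 / (math.log10(insulin_uU_ml) + math.log10(glucose_mgdl))

g, i = 90.0, 10.0                       # an illustrative fasting sample
h = homa_ir(g, i)
surrogates = {
    "HOMA": h,
    "Log HOMA": math.log10(h),          # base 10 assumed here
    "1/HOMA": 1.0 / h,
    "QUICKI": quicki(g, i),
    "1/Fasting insulin": 1.0 / i,
}
for name, value in surrogates.items():
    print(f"{name}: {value:.3f}")
```

    Note that QUICKI, log HOMA, and 1/HOMA are monotone transforms of one another given fasting values, which is why the study compares them on calibration and receiver operating characteristic performance rather than raw correlation alone.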

  15. Fusing human and machine skills for remote robotic operations

    NASA Technical Reports Server (NTRS)

    Schenker, Paul S.; Kim, Won S.; Venema, Steven C.; Bejczy, Antal K.

    1991-01-01

    The question of how computer assists can improve teleoperator trajectory tracking during both free and force-constrained motions is addressed. Computer graphics techniques which enable the human operator to both visualize and predict detailed 3D trajectories in real-time are reported. Man-machine interactive control procedures for better management of manipulator contact forces and positioning are also described. It is found that collectively, these novel advanced teleoperations techniques both enhance system performance and significantly reduce control problems long associated with teleoperations under time delay. Ongoing robotic simulations of the 1984 space shuttle Solar Maximum EVA Repair Mission are briefly described.

  16. Modeling strategic behavior in human-automation interaction - Why an 'aid' can (and should) go unused

    NASA Technical Reports Server (NTRS)

    Kirlik, Alex

    1993-01-01

    Task-offload aids (e.g., an autopilot, an 'intelligent' assistant) can be selectively engaged by the human operator to dynamically delegate tasks to automation. Introducing such aids eliminates some task demands but creates new ones associated with programming, engaging, and disengaging the aiding device via an interface. The burdens associated with managing automation can sometimes outweigh the potential benefits of automation for system performance. Aid design parameters and features of the overall multitask context combine to determine whether or not a task-offload aid will effectively support the operator. A modeling and sensitivity analysis approach is presented that identifies effective strategies for human-automation interaction as a function of three task-context parameters and three aid design parameters. The modeling and analysis approaches provide resources for predicting how a well-adapted operator will use a given task-offload aid, and for specifying aid design features that ensure that automation will provide effective operator support in a multitask environment.

  17. Method of Testing and Predicting Failures of Electronic Mechanical Systems

    NASA Technical Reports Server (NTRS)

    Iverson, David L.; Patterson-Hine, Frances A.

    1996-01-01

    A method is disclosed that employs a knowledge base of human expertise comprising a reliability model analysis implemented for diagnostic routines. The reliability analysis comprises digraph models that determine target events created by hardware failures, human actions, and other factors affecting system operation. The reliability analysis contains a wealth of human expertise that is used to build automatic diagnostic routines and provides a knowledge base that can be used to solve other artificial intelligence problems.

  18. Design of control software for the closed ecology experiment facilities (CEEF)

    NASA Astrophysics Data System (ADS)

    Miyajima, H.; Abe, K.; Hirosaki, T.; Ishikawa, Y.

    A habitation experiment using the Closed Ecology Experiment Facilities (CEEF) was started in fiscal 2005; three experiments in which two humans stayed for one week were conducted. Their stays will be extended gradually until fiscal 2009, when an experiment will be launched with two humans staying for four months. The CEEF has an ambitious target of acquiring the technology for an advanced life support system, and the system is being developed based on the technology of conventional plant systems. Especially with respect to supervision and control, the system still has little automation. It has many manually operated parts whose starts and stops are determined by human judgment; there are even several parts requiring off-line measurements, including analyses performed by hand. At present, a CEEF behavioral prediction system (CPS) is being developed as the first stage for controlling such a system. In this CPS, an operator creates an operational schedule after due consideration. However, creating the operational schedule of the complex CEEF is not easy, and it is beyond the operator's capability to fully cope with alterations of the operational schedule that occur during a long-term habitation experiment. Therefore, we are going to develop an automatic schedule-creation function that will be incorporated into the CPS by the beginning of the habitation experiment in fiscal 2009. This function will enable automation of most of the operational schedule that human operators currently set up. In this paper we examine…

  19. Comparative study between cytotoxicity and flowcytometry crossmatches before and after renal transplantation.

    PubMed

    Abdel Rahman, Afaf S; Fahim, Nehal M A; El Sayed, Abeer A; El Hady, Soha A R; Ahmad, Yasser S

    2005-01-01

    Renal transplantation, in most countries, is based on human leukocyte antigen (HLA) matching of the donor kidney with the recipient. Traditional HLA matching is based on defining HLA specificities by antibodies utilizing cytotoxicity crossmatch techniques. Newer techniques have emerged that challenge the accuracy of serological typing and crossmatching. We compared the results of the standard complement-dependent cytotoxicity crossmatch (CDCXM) with the anti-human globulin augmented cytotoxicity crossmatch (AHG-CDC) and the flow cytometry crossmatch (FCXM) for the detection of anti-HLA antibodies in 150 pre-transplant patients. The development of post-transplantation sensitization was screened using these three techniques within two weeks post-operatively and correlated with rejection episodes. Comparison between the results of CDCXM and AHG-CDC in 150 recipients revealed no significant correlation (P>0.05). When comparing these results with those of FCXM in 50 recipients, a significant correlation was shown (P<0.05). Relative to CDCXM, the sensitivity of AHG-CDC was 100%, specificity 97.4%, positive predictive value 92.3%, and negative predictive value 100%. On the other hand, the sensitivity of FCXM was 100%, specificity 76.3%, positive predictive value 57.1%, and negative predictive value 100%. According to the results of CDCXM, AHG-CDC, and FCXM, no difference was detected between pre- and post-transplant anti-HLA sensitization within two weeks after the operation. Patients with a negative cytotoxicity crossmatch (CDCXM and AHG-CDC) and a positive FCXM may have an increased risk of early graft loss, which may represent a relative contraindication to transplantation. Given the important theoretical advantages of FCXM over CDCXM, further testing of its clinical relevance is warranted.
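    The sensitivity/specificity figures reported above come from treating CDCXM as the reference in a 2x2 table. A minimal sketch; the counts below are a reconstruction consistent with the reported FCXM figures in 50 recipients, not a table given in the abstract:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, NPV from a 2x2 comparison table."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "PPV": tp / (tp + fp),
        "NPV": tn / (tn + fn),
    }

# Reconstructed counts: FCXM result vs. CDCXM as reference, 50 recipients
m = diagnostic_metrics(tp=12, fp=9, fn=0, tn=29)
print({k: round(v, 3) for k, v in m.items()})
# sensitivity 1.0, specificity ~0.763, PPV ~0.571, NPV 1.0
```

    The 9 "false positives" here are exactly the FCXM-positive/CDCXM-negative patients the abstract flags as possibly at increased risk of early graft loss.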

  20. Sleep Patterns of Naval Aviation Personnel Conducting Mine Hunting Operations

    DTIC Science & Technology

    2006-09-01

    Solberg, Bennett

    …human performance, resulting in predictable changes not only on the individual level but also on the system as a whole. This descriptive study…

  1. Biological Networks for Predicting Chemical Hepatocarcinogenicity Using Gene Expression Data from Treated Mice and Relevance across Human and Rat Species

    PubMed Central

    Thomas, Reuben; Thomas, Russell S.; Auerbach, Scott S.; Portier, Christopher J.

    2013-01-01

    Background Several groups have employed genomic data from subchronic chemical toxicity studies in rodents (90 days) to derive gene-centric predictors of chronic toxicity and carcinogenicity. Genes are annotated to belong to biological processes or molecular pathways that are mechanistically well understood and are described in public databases. Objectives To develop a molecular pathway-based prediction model of long term hepatocarcinogenicity using 90-day gene expression data and to evaluate the performance of this model with respect to both intra-species, dose-dependent and cross-species predictions. Methods Genome-wide hepatic mRNA expression was retrospectively measured in B6C3F1 mice following subchronic exposure to twenty-six (26) chemicals (10 were positive, 2 equivocal and 14 negative for liver tumors) previously studied by the US National Toxicology Program. Using these data, a pathway-based predictor model for long-term liver cancer risk was derived using random forests. The prediction model was independently validated on test sets associated with liver cancer risk obtained from mice, rats and humans. Results Using 5-fold cross validation, the developed prediction model had reasonable predictive performance with the area under receiver-operator curve (AUC) equal to 0.66. The developed prediction model was then used to extrapolate the results to data associated with rat and human liver cancer. The extrapolated model worked well for both extrapolated species (AUC value of 0.74 for rats and 0.91 for humans). The prediction models implied a balanced interplay between all pathway responses leading to carcinogenicity predictions. Conclusions Pathway-based prediction models estimated from sub-chronic data hold promise for predicting long-term carcinogenicity and also for its ability to extrapolate results across multiple species. PMID:23737943

  2. Biological networks for predicting chemical hepatocarcinogenicity using gene expression data from treated mice and relevance across human and rat species.

    PubMed

    Thomas, Reuben; Thomas, Russell S; Auerbach, Scott S; Portier, Christopher J

    2013-01-01

    Several groups have employed genomic data from subchronic chemical toxicity studies in rodents (90 days) to derive gene-centric predictors of chronic toxicity and carcinogenicity. Genes are annotated to belong to biological processes or molecular pathways that are mechanistically well understood and are described in public databases. To develop a molecular pathway-based prediction model of long term hepatocarcinogenicity using 90-day gene expression data and to evaluate the performance of this model with respect to both intra-species, dose-dependent and cross-species predictions. Genome-wide hepatic mRNA expression was retrospectively measured in B6C3F1 mice following subchronic exposure to twenty-six (26) chemicals (10 were positive, 2 equivocal and 14 negative for liver tumors) previously studied by the US National Toxicology Program. Using these data, a pathway-based predictor model for long-term liver cancer risk was derived using random forests. The prediction model was independently validated on test sets associated with liver cancer risk obtained from mice, rats and humans. Using 5-fold cross validation, the developed prediction model had reasonable predictive performance with the area under receiver-operator curve (AUC) equal to 0.66. The developed prediction model was then used to extrapolate the results to data associated with rat and human liver cancer. The extrapolated model worked well for both extrapolated species (AUC value of 0.74 for rats and 0.91 for humans). The prediction models implied a balanced interplay between all pathway responses leading to carcinogenicity predictions. Pathway-based prediction models estimated from sub-chronic data hold promise for predicting long-term carcinogenicity and also for its ability to extrapolate results across multiple species.
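    The modeling step can be sketched as follows with scikit-learn; the pathway scores are random stand-ins (the real study used pathway-aggregated expression data), and grouping the 2 equivocal chemicals with the negatives is an assumption made here for illustration:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(2)

# Hypothetical stand-in: 26 chemicals x 50 pathway-level response scores,
# with hepatocarcinogens (label 1) shifted on a subset of pathways
y = np.array([1] * 10 + [0] * 16)       # 10 positive; equivocal grouped with negative
X = rng.normal(0, 1, size=(26, 50))
X[y == 1, :5] += 1.5

# 5-fold cross-validated AUC, as in the study's intra-species evaluation
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
auc = cross_val_score(RandomForestClassifier(n_estimators=500, random_state=0),
                      X, y, scoring="roc_auc", cv=cv)
print(round(float(auc.mean()), 2))
```

    Cross-species extrapolation would then amount to applying the fitted forest to pathway scores computed from rat or human data, as the study did to obtain its 0.74 and 0.91 AUC values.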

  3. Predictive Interfaces for Long-Distance Tele-Operations

    NASA Technical Reports Server (NTRS)

    Wheeler, Kevin R.; Martin, Rodney; Allan, Mark B.; Sunspiral, Vytas

    2005-01-01

    We address the development of predictive tele-operator interfaces for humanoid robots with respect to two basic challenges. First, we address automating the transition from fully tele-operated systems towards degrees of autonomy. Second, we develop compensation for the time delay that exists when sending telemetry data from a remote operation point to robots located in low Earth orbit and beyond. Humanoid robots have a great advantage over other robotic platforms for use in space-based construction and maintenance because they can use the same tools as astronauts do. The major disadvantage is that they are difficult to control due to the large number of degrees of freedom, which makes it difficult to synthesize autonomous behaviors using conventional means. We are working with the NASA Johnson Space Center's Robonaut, an anthropomorphic robot with fully articulated hands, arms, and neck. We have trained hidden Markov models that make use of the command data, sensory streams, and other relevant data sources to predict a tele-operator's intent. This allows us to achieve subgoal-level commanding without the use of predefined command dictionaries, and to create subgoal autonomy via sequence generation from generative models. Our method works as a means to incrementally transition from manual tele-operation to semi-autonomous, supervised operation. The multi-agent laboratory experiments conducted by Ambrose et al. have shown that it is feasible to directly tele-operate multiple Robonauts with humans to perform complex tasks such as truss assembly. However, once a time delay is introduced into the system, the rate of tele-operation slows down to mimic a bump-and-wait type of activity. We would like to maintain the same interface to the operator despite time delays. To this end, we are developing an interface that allows us to predict the intentions of the operator while interacting with a 3D virtual representation of the expected state of the robot. The predictive interface anticipates the intention of the operator and then uses this prediction to initiate appropriate subgoal autonomy tasks.
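    The intent-recognition idea, inferring a tele-operator's goal from a stream of commands, can be sketched with the standard HMM forward algorithm; the two intents, observation symbols, and all probabilities below are hypothetical, not Robonaut's actual models:

```python
import numpy as np

# Hypothetical 2-intent model: "reach for tool" vs. "retract arm", observed
# through discretized command symbols (0 = extend, 1 = hold, 2 = retract)
pi = np.array([0.5, 0.5])               # initial intent probabilities
A = np.array([[0.9, 0.1],               # intent transition matrix
              [0.1, 0.9]])
B = np.array([[0.7, 0.2, 0.1],          # P(observation | intent)
              [0.1, 0.2, 0.7]])

def forward_filter(obs):
    """Forward algorithm: P(intent_t | obs_1..t), normalized each step."""
    belief = pi * B[:, obs[0]]
    belief /= belief.sum()
    for o in obs[1:]:
        belief = (A.T @ belief) * B[:, o]
        belief /= belief.sum()
    return belief

stream = [0, 0, 1, 0, 0]                # mostly "extend" commands
belief = forward_filter(stream)
print(belief.argmax())                  # 0 -> "reach for tool" is predicted
```

    Once the belief over intents concentrates, the interface can trigger the corresponding subgoal autonomy behavior rather than waiting on round-trip commands.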

  4. Modeling and prediction of human word search behavior in interactive machine translation

    NASA Astrophysics Data System (ADS)

    Ji, Duo; Yu, Bai; Ma, Bin; Ye, Na

    2017-12-01

    As a computer-aided translation method, interactive machine translation reduces the repetitive and mechanical operations of manual translation through a variety of techniques, improving translation efficiency, and it plays an important role in practical translation work. In this paper, we take the behavior of users frequently searching for words during the translation process as the research object and recast that behavior as a translation-selection problem under the current translation. The paper presents a prediction model that makes comprehensive use of an alignment model, a translation model, and a language model of word-searching behavior. It achieves highly accurate prediction of word-searching behavior and reduces the switching between mouse and keyboard operations during the user's translation process.

  5. The quiet revolution of numerical weather prediction.

    PubMed

    Bauer, Peter; Thorpe, Alan; Brunet, Gilbert

    2015-09-03

    Advances in numerical weather prediction represent a quiet revolution because they have resulted from a steady accumulation of scientific knowledge and technological advances over many years that, with only a few exceptions, have not been associated with the aura of fundamental physics breakthroughs. Nonetheless, the impact of numerical weather prediction is among the greatest of any area of physical science. As a computational problem, global weather prediction is comparable to the simulation of the human brain and of the evolution of the early Universe, and it is performed every day at major operational centres across the world.

  6. Predictive performance models and multiple task performance

    NASA Technical Reports Server (NTRS)

    Wickens, Christopher D.; Larish, Inge; Contorer, Aaron

    1989-01-01

    Five models that predict how performance of multiple tasks will interact in complex task scenarios are discussed. The models are shown in terms of the assumptions they make about human operator divided attention. The different assumptions about attention are then empirically validated in a multitask helicopter flight simulation. It is concluded from this simulation that the most important assumption relates to the coding of demand level of different component tasks.

  7. GT-CATS: Tracking Operator Activities in Complex Systems

    NASA Technical Reports Server (NTRS)

    Callantine, Todd J.; Mitchell, Christine M.; Palmer, Everett A.

    1999-01-01

    Human operators of complex dynamic systems can experience difficulties supervising advanced control automation. One remedy is to develop intelligent aiding systems that can provide operators with context-sensitive advice and reminders. The research reported herein proposes, implements, and evaluates a methodology for activity tracking, a form of intent inferencing that can supply the knowledge required for an intelligent aid by constructing and maintaining a representation of operator activities in real time. The methodology was implemented in the Georgia Tech Crew Activity Tracking System (GT-CATS), which predicts and interprets the actions performed by Boeing 757/767 pilots navigating using autopilot flight modes. This report first describes research on intent inferencing and complex modes of automation. It then provides a detailed description of the GT-CATS methodology, knowledge structures, and processing scheme. The results of an experimental evaluation using airline pilots are given. The results show that GT-CATS was effective in predicting and interpreting pilot actions in real time.

  8. Human Reliability Assessments: Using the Past (Shuttle) to Predict the Future (ORION)

    NASA Technical Reports Server (NTRS)

    Mott, Diana L.; Bigler, Mark A.

    2017-01-01

    NASA uses two HRA (human reliability analysis) methodologies. The first is a simplified method based on how much time is available to complete the action, with consideration given to environmental and personal factors that could influence the human's reliability. This method is expected to provide a conservative value, or placeholder, as a preliminary estimate; the preliminary estimates are used to determine which placeholders need a more detailed assessment. The second methodology is used to develop a more detailed human reliability assessment of the performance of critical human actions. This assessment considers more than the time available; it includes factors such as the importance of the action, the context, environmental factors, potential human stresses, previous experience, training, physical design interfaces, and available procedures/checklists. The more detailed assessment is expected to be more realistic than one based primarily on the time available. When performing an HRA on a system or process that has an operational history, we have information specific to the task based on that history and experience. In the case of a PRA model that is based on a new design and has no operational history, providing a "reasonable" assessment of potential crew actions becomes more problematic. In order to determine what is expected of future operations, judgments from individuals who had relevant experience and were familiar with the systems and processes previously implemented by NASA were used to provide the "best" available data. Personnel from Flight Operations, Flight Directors, Launch Test Directors, Control Room Console Operators, and Astronauts were all interviewed to provide a comprehensive picture of previous NASA operations. Verification of the assumptions and expectations expressed in the assessments will be needed when the procedures, flight rules, and operational requirements are developed and finalized.
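    One common form of a time-based screening estimate (not necessarily the method NASA uses) is a lognormal time-reliability curve, in which the human error probability falls as the ratio of available time to median required time grows; the median time and spread below are illustrative values only:

```python
import math

def non_response_prob(available_s: float, median_s: float, sigma: float = 0.8) -> float:
    """P(crew fails to act in time) under a lognormal time-reliability curve.

    available_s: time available to complete the action (seconds)
    median_s:    median time crews need for it (hypothetical value)
    sigma:       lognormal spread; widened to reflect stress or environment
    """
    z = math.log(available_s / median_s) / sigma
    # standard normal survival function = P(required time > available time)
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# More available time -> lower screening human error probability
for t in (60, 120, 300, 600):
    print(t, round(non_response_prob(t, median_s=120.0), 4))
```

    A curve like this gives the conservative placeholder value; actions whose placeholder dominates the PRA results would then get the detailed, factor-by-factor assessment.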

  9. Intra-graft expression of genes involved in iron homeostasis predicts the development of operational tolerance in human liver transplantation

    PubMed Central

    Bohne, Felix; Martínez-Llordella, Marc; Lozano, Juan-José; Miquel, Rosa; Benítez, Carlos; Londoño, María-Carlota; Manzia, Tommaso-María; Angelico, Roberta; Swinkels, Dorine W.; Tjalsma, Harold; López, Marta; Abraldes, Juan G.; Bonaccorsi-Riani, Eliano; Jaeckel, Elmar; Taubert, Richard; Pirenne, Jacques; Rimola, Antoni; Tisone, Giuseppe; Sánchez-Fueyo, Alberto

    2011-01-01

    Following organ transplantation, lifelong immunosuppressive therapy is required to prevent the host immune system from destroying the allograft. This can cause severe side effects and increased recipient morbidity and mortality. Complete cessation of immunosuppressive drugs has been successfully accomplished in selected transplant recipients, providing proof of principle that operational allograft tolerance is attainable in clinical transplantation. The intra-graft molecular pathways associated with successful drug withdrawal, however, are not well defined. In this study, we analyzed sequential blood and liver tissue samples collected from liver transplant recipients enrolled in a prospective multicenter immunosuppressive drug withdrawal clinical trial. Before initiation of drug withdrawal, operationally tolerant and non-tolerant recipients differed in the intra-graft expression of genes involved in the regulation of iron homeostasis. Furthermore, as compared with non-tolerant recipients, operationally tolerant patients exhibited higher serum levels of hepcidin and ferritin and increased hepatocyte iron deposition. Finally, liver tissue gene expression measurements accurately predicted the outcome of immunosuppressive withdrawal in an independent set of patients. These results point to a critical role for iron metabolism in the regulation of intra-graft alloimmune responses in humans and provide a set of biomarkers to conduct drug-weaning trials in liver transplantation. PMID:22156196

  10. Cloud prediction of protein structure and function with PredictProtein for Debian.

    PubMed

    Kaján, László; Yachdav, Guy; Vicedo, Esmeralda; Steinegger, Martin; Mirdita, Milot; Angermüller, Christof; Böhm, Ariane; Domke, Simon; Ertl, Julia; Mertes, Christian; Reisinger, Eva; Staniewski, Cedric; Rost, Burkhard

    2013-01-01

    We report the release of PredictProtein for the Debian operating system and derivatives, such as Ubuntu, Bio-Linux, and Cloud BioLinux. The PredictProtein suite is available as a standard set of open source Debian packages. The release covers the most popular prediction methods from the Rost Lab, including methods for the prediction of secondary structure and solvent accessibility (profphd), nuclear localization signals (predictnls), and intrinsically disordered regions (norsnet). We also present two case studies that successfully utilize PredictProtein packages for high performance computing in the cloud: the first analyzes protein disorder for whole organisms, and the second analyzes the effect of all possible single sequence variants in protein coding regions of the human genome.

  11. Cloud Prediction of Protein Structure and Function with PredictProtein for Debian

    PubMed Central

    Kaján, László; Yachdav, Guy; Vicedo, Esmeralda; Steinegger, Martin; Mirdita, Milot; Angermüller, Christof; Böhm, Ariane; Domke, Simon; Ertl, Julia; Mertes, Christian; Reisinger, Eva; Rost, Burkhard

    2013-01-01

    We report the release of PredictProtein for the Debian operating system and derivatives, such as Ubuntu, Bio-Linux, and Cloud BioLinux. The PredictProtein suite is available as a standard set of open source Debian packages. The release covers the most popular prediction methods from the Rost Lab, including methods for the prediction of secondary structure and solvent accessibility (profphd), nuclear localization signals (predictnls), and intrinsically disordered regions (norsnet). We also present two case studies that successfully utilize PredictProtein packages for high performance computing in the cloud: the first analyzes protein disorder for whole organisms, and the second analyzes the effect of all possible single sequence variants in protein coding regions of the human genome. PMID:23971032

  12. Human Factors in Aeronautics at NASA

    NASA Technical Reports Server (NTRS)

    Mogford, Richard

    2016-01-01

    This is a briefing to a regularly meeting DoD group called the Human Systems Community of Interest: Mission Effectiveness. I was asked to address human factors in aeronautics at NASA. (Exploration (space) human factors has apparently already been covered.) The briefing describes human factors organizations at NASA Ames and Langley. It then summarizes some aeronautics tasks that involve the application of human factors in the development of specific tools and capabilities. The tasks covered include aircrew checklists, dispatch operations, Playbook, Dynamic Weather Routes, Traffic Aware Strategic Aircrew Requests, and Airplane State Awareness and Prediction Technologies. I mention that most of our aeronautics work involves human factors as embedded in development tasks rather than basic research.

  13. BRI: Cyber Trust and Suspicion

    DTIC Science & Technology

    2017-06-06

    Thrust 1, "A Social, Cultural, and Emotional Basis for Trust and Suspicion: Manipulating Insider Threat in Cyber Intelligence and Operations," addresses cybersecurity with humans in the loop; for 2013, it covers the concepts of predictability…

  14. A continuous function model for path prediction of entities

    NASA Astrophysics Data System (ADS)

    Nanda, S.; Pray, R.

    2007-04-01

    As militaries across the world continue to evolve, the roles of humans in various theatres of operation are being increasingly targeted by military planners for substitution with automation. Forward observation and direction of supporting arms to neutralize threats from dynamic adversaries is one such example. However, contemporary tracking and targeting systems are incapable of serving autonomously for they do not embody the sophisticated algorithms necessary to predict the future positions of adversaries with the accuracy offered by the cognitive and analytical abilities of human operators. The need for these systems to incorporate methods characterizing such intelligence is therefore compelling. In this paper, we present a novel technique to achieve this goal by modeling the path of an entity as a continuous polynomial function of multiple variables expressed as a Taylor series with a finite number of terms. We demonstrate the method for evaluating the coefficient of each term to define this function unambiguously for any given entity, and illustrate its use to determine the entity's position at any point in time in the future.
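    The core idea, a truncated Taylor series being an ordinary polynomial in time, can be sketched by fitting per-axis polynomials to observed positions and evaluating them at a future time; the track below is synthetic:

```python
import numpy as np

# Observed track of an entity: position samples at one-second intervals
t = np.arange(0.0, 10.0)
x = 3.0 * t + 0.5 * 0.4 * t**2          # constant acceleration in x (invented)
y = 20.0 * np.sin(0.1 * t)              # gentle turn in y (invented)

# A truncated Taylor series is a polynomial in t; fit a finite number of terms
deg = 3
cx, cy = np.polyfit(t, x, deg), np.polyfit(t, y, deg)

def predict(t_future):
    """Evaluate the fitted per-axis path polynomials at a future time."""
    return np.polyval(cx, t_future), np.polyval(cy, t_future)

px, py = predict(12.0)                  # extrapolate beyond the observations
print(round(float(px), 2), round(float(py), 2))
```

    Choosing the truncation order trades bias against noise sensitivity, which is where the paper's method for evaluating each coefficient unambiguously comes in.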

  15. Predictive discomfort in single- and combined-axis whole-body vibration considering different seated postures.

    PubMed

    DeShaw, Jonathan; Rahmatalla, Salam

    2014-08-01

    The aim of this study was to develop a predictive discomfort model for single-axis, 3-D, and 6-D combined-axis whole-body vibration of seated occupants considering different postures. Non-neutral postures in seated whole-body vibration play a significant role in the resulting level of perceived discomfort and potential long-term injury. The current international standards address contact points but not postures. The proposed model computes discomfort on the basis of the static deviation of human joints from their neutral positions and how fast humans rotate their joints under vibration. Four seated postures were investigated. For practical application, the coefficients of the predictive discomfort model were calibrated to the Borg scale using psychophysical data from 12 volunteers in different vibration conditions (single-axis random fore-aft, lateral, and vertical, and two magnitudes of 3-D). The model was tested under two magnitudes of 6-D vibration. Significant correlations (R = .93) were found between the predictive discomfort model and the reported discomfort across different postures and vibrations. ISO 2631-1 correlated very well with discomfort (R² = .89) but was not able to predict the effect of posture. Human discomfort in seated whole-body vibration with different non-neutral postures can be closely predicted by a combination of static posture and the angular velocities of the joints. The predictive discomfort model can assist ergonomists and human factors researchers in designing safer environments for seated operators under vibration. The model can be integrated with advanced computer biomechanical models to investigate the complex interaction between posture and vibration.
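    A minimal sketch of such a model, scoring static joint deviation from neutral plus joint angular velocity under vibration; the joints, weights, and the linear form are assumptions for illustration, not the paper's fitted coefficients:

```python
import numpy as np

# Hypothetical per-joint weights; the paper fits its coefficients to
# subjective (Borg-scale) ratings, which these invented values only mimic
W_STATIC = np.array([1.0, 0.8, 0.6])    # trunk, neck, shoulder (assumed joints)
W_DYNAMIC = np.array([0.5, 0.4, 0.3])

def discomfort(joint_dev_deg, joint_vel_deg_s):
    """Static posture deviation plus vibration-driven joint angular velocity.

    joint_dev_deg:   mean deviation of each joint from neutral, degrees
    joint_vel_deg_s: RMS angular velocity of each joint under vibration
    Returns a raw score; a rescaling would map it onto the Borg scale.
    """
    dev = np.abs(np.asarray(joint_dev_deg, dtype=float))
    vel = np.abs(np.asarray(joint_vel_deg_s, dtype=float))
    return float(W_STATIC @ dev + W_DYNAMIC @ vel)

neutral_still = discomfort([2, 1, 3], [0.5, 0.3, 0.2])
twisted_shaken = discomfort([25, 15, 20], [6.0, 4.0, 3.0])
print(neutral_still < twisted_shaken)   # more deviation + motion -> more discomfort
```

    The posture term is what lets a model like this separate two exposures that ISO 2631-1, which weights only the vibration at the contact points, would score identically.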

  16. The workload book: Assessment of operator workload to engineering systems

    NASA Technical Reports Server (NTRS)

    Gopher, D.

    1983-01-01

    The structure and initial work performed toward the creation of a handbook for workload analysis, directed at the operational community of engineers and human factors psychologists, are described. When complete, the handbook will make accessible to such individuals the results of theoretically based research that are of practical interest and utility in the analysis and prediction of operator workload in advanced and existing systems. In addition, the results of a laboratory study focused on the development of a subjective workload rating technique based on psychophysical scaling techniques are described.

  17. Human performance cognitive-behavioral modeling: a benefit for occupational safety.

    PubMed

    Gore, Brian F

    2002-01-01

    Human Performance Modeling (HPM) is a computer-aided job analysis software methodology used to generate predictions of complex human-automation integration and system flow patterns with the goal of improving operator and system safety. The use of HPM tools has recently been increasing due to reductions in computational cost, augmentations in the tools' fidelity, and usefulness in the generated output. An examination of an Air Man-machine Integration Design and Analysis System (Air MIDAS) model evaluating complex human-automation integration currently underway at NASA Ames Research Center will highlight the importance to occupational safety of considering both cognitive and physical aspects of performance when researching human error.

  18. Human performance cognitive-behavioral modeling: a benefit for occupational safety

    NASA Technical Reports Server (NTRS)

    Gore, Brian F.

    2002-01-01

    Human Performance Modeling (HPM) is a computer-aided job analysis software methodology used to generate predictions of complex human-automation integration and system flow patterns with the goal of improving operator and system safety. The use of HPM tools has recently been increasing due to reductions in computational cost, augmentations in the tools' fidelity, and usefulness in the generated output. An examination of an Air Man-machine Integration Design and Analysis System (Air MIDAS) model evaluating complex human-automation integration currently underway at NASA Ames Research Center will highlight the importance to occupational safety of considering both cognitive and physical aspects of performance when researching human error.

  19. Life Sciences Implications of Lunar Surface Operations

    NASA Technical Reports Server (NTRS)

    Chappell, Steven P.; Norcross, Jason R.; Abercromby, Andrew F.; Gernhardt, Michael L.

    2010-01-01

    The purpose of this report is to document preliminary predicted life sciences implications of expected operational concepts for lunar surface extravehicular activity (EVA). Algorithms developed through simulation and testing in lunar analog environments were used to predict crew metabolic rates and ground reaction forces experienced during lunar EVA. Subsequently, the total metabolic energy consumption, the daily bone load stimulus, total oxygen needed, and other variables were calculated and provided to Human Research Program and Exploration Systems Mission Directorate stakeholders. To provide context for the modeling, the report includes an overview of some scenarios that have been considered, along with concise descriptions of the analog testing and the development of the algorithms. This document may be updated to remain current with evolving lunar or other planetary surface operations, assumptions, and concepts, and to provide additional data and analyses collected during the ongoing analog research program.

  20. Trajectory-Based Complexity (TBX): A Modified Aircraft Count to Predict Sector Complexity During Trajectory-Based Operations

    NASA Technical Reports Server (NTRS)

    Prevot, Thomas; Lee, Paul U.

    2011-01-01

    In this paper we introduce a new complexity metric to predict, in real time, sector complexity for trajectory-based operations (TBO), which will be implemented in the Next Generation Air Transportation System (NextGen). Trajectory-Based Complexity (TBX) is a modified aircraft count that can easily be computed and communicated in a TBO environment based upon predictions of aircraft and weather trajectories. TBX is scaled to aircraft count and represents an alternate and additional means to manage air traffic demand and capacity, with more consideration of dynamic factors such as weather, aircraft equipage, or predicted separation violations, as well as static factors such as sector size. We have developed and evaluated TBX in the Airspace Operations Laboratory (AOL) at NASA Ames Research Center during human-in-the-loop studies of trajectory-based concepts since 2009. In this paper we describe the TBX computation in detail and present the underlying algorithm. Next, we describe the specific TBX used in an experiment at NASA's AOL and evaluate the performance of the metric using data collected during a controller-in-the-loop study on trajectory-based operations at different equipage levels. In this study controllers were prompted at regular intervals to rate their current workload on a numeric scale. Comparing these real-time workload ratings to the TBX values predicted for the same time periods, we demonstrate that TBX is a better predictor of workload than aircraft count. Furthermore, we demonstrate that TBX is well suited for complexity management in TBO and can easily be adjusted to future operational concepts.
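    The paper's exact TBX algorithm is not reproduced in the abstract; a minimal sketch of the idea of a "modified aircraft count" (all weights and factor names here are assumptions for illustration) is a weighted sum of dynamic and static penalty terms added to the raw count:

```python
def tbx(aircraft_count, unequipped, predicted_conflicts, weather_cells,
        w_uneq=0.5, w_conf=1.0, w_wx=0.3, sector_scale=1.0):
    """Illustrative trajectory-based complexity: raw aircraft count plus
    hypothetical weighted penalties for dynamic factors (equipage,
    predicted separation violations, weather), scaled for sector size."""
    return sector_scale * (aircraft_count
                           + w_uneq * unequipped
                           + w_conf * predicted_conflicts
                           + w_wx * weather_cells)

# 10 aircraft, 2 unequipped, 1 predicted conflict, 3 weather cells
value = tbx(aircraft_count=10, unequipped=2, predicted_conflicts=1,
            weather_cells=3)  # 10 + 1.0 + 1.0 + 0.9
```

    Because the result stays in units of aircraft count, it can be compared directly against the sector's familiar capacity threshold, which is the property the abstract emphasizes.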

  1. Human Reliability Assessments: Using the Past (Shuttle) to Predict the Future (Orion)

    NASA Technical Reports Server (NTRS)

    DeMott, Diana L.; Bigler, Mark A.

    2017-01-01

    NASA (National Aeronautics and Space Administration) Johnson Space Center (JSC) Safety and Mission Assurance (S&MA) uses two human reliability analysis (HRA) methodologies. The first is a simplified method based on how much time is available to complete the action, with consideration of environmental and personal factors that could influence the human's reliability. This method is expected to provide a conservative value, or placeholder, as a preliminary estimate; this screening value is used to determine which placeholders need a more detailed assessment. The second methodology is used to develop a more detailed human reliability assessment of the performance of critical human actions. This assessment considers more than the time available; it includes factors such as the importance of the action, the context, environmental factors, potential human stresses, previous experience, training, physical design interfaces, available procedures/checklists, and internal human stresses. The more detailed assessment is expected to be more realistic than one based primarily on time available. When performing an HRA on a system or process that has an operational history, information specific to the task is available from that history and experience. In the case of a Probabilistic Risk Assessment (PRA) that is based on a new design and has no operational history, providing a "reasonable" assessment of potential crew actions becomes more challenging. To determine what is expected of future operational parameters, the judgment of individuals with relevant experience who were familiar with the systems and processes previously implemented by NASA was used to provide the "best" available data. Personnel from Flight Operations, Flight Directors, Launch Test Directors, Control Room Console Operators, and Astronauts were all interviewed to provide a comprehensive picture of previous NASA operations. Verification of the assumptions and expectations expressed in the assessments will be needed when the procedures, flight rules, and operational requirements are developed and finalized.

  2. Human Reliability Assessments: Using the Past (Shuttle) to Predict the Future (Orion)

    NASA Technical Reports Server (NTRS)

    DeMott, Diana; Bigler, Mark

    2016-01-01

    NASA (National Aeronautics and Space Administration) Johnson Space Center (JSC) Safety and Mission Assurance (S&MA) uses two human reliability analysis (HRA) methodologies. The first is a simplified method based on how much time is available to complete the action, with consideration of environmental and personal factors that could influence the human's reliability. This method is expected to provide a conservative value, or placeholder, as a preliminary estimate; this screening value is used to determine which placeholders need a more detailed assessment. The second methodology is used to develop a more detailed human reliability assessment of the performance of critical human actions. This assessment considers more than the time available; it includes factors such as the importance of the action, the context, environmental factors, potential human stresses, previous experience, training, physical design interfaces, available procedures/checklists, and internal human stresses. The more detailed assessment is expected to be more realistic than one based primarily on time available. When performing an HRA on a system or process that has an operational history, information specific to the task is available from that history and experience. In the case of a Probabilistic Risk Assessment (PRA) that is based on a new design and has no operational history, providing a "reasonable" assessment of potential crew actions becomes more challenging. In order to determine what is expected of future operational parameters, the judgment of individuals with relevant experience who were familiar with the systems and processes previously implemented by NASA was used to provide the "best" available data. Personnel from Flight Operations, Flight Directors, Launch Test Directors, Control Room Console Operators, and Astronauts were all interviewed to provide a comprehensive picture of previous NASA operations. Verification of the assumptions and expectations expressed in the assessments will be needed when the procedures, flight rules, and operational requirements are developed and finalized.

  3. Optimal control model predictions of system performance and attention allocation and their experimental validation in a display design study

    NASA Technical Reports Server (NTRS)

    Johannsen, G.; Govindaraj, T.

    1980-01-01

    The influence of different types of predictor displays in a longitudinal vertical takeoff and landing (VTOL) hover task is analyzed in a theoretical study. Several cases with differing amounts of predictive and rate information are compared. The optimal control model of the human operator is used to estimate human and system performance in terms of root-mean-square (rms) values and to compute optimized attention allocation. The only part of the model which is varied to predict these data is the observation matrix. Typical cases are selected for a subsequent experimental validation. The rms values as well as eye-movement data are recorded. The results agree favorably with those of the theoretical study in terms of relative differences. Better matching is achieved by revised model input data.

  4. The Value of Biomedical Simulation Environments to Future Human Space Flight Missions

    NASA Technical Reports Server (NTRS)

    Mulugeta, Lealem; Myers, Jerry G.; Lewandowski, Beth; Platts, Steven H.

    2011-01-01

    Mars and NEO missions will expose astronauts to extended durations of reduced gravity, isolation, and higher radiation. These new operational conditions pose health risks that are not well understood and perhaps unanticipated. Advanced computational simulation environments can beneficially augment research to predict, assess, and mitigate potential hazards to astronaut health. The NASA Digital Astronaut Project (DAP), within the NASA Human Research Program, strives to achieve this goal.

  5. Modeling Visual, Vestibular and Oculomotor Interactions in Self-Motion Estimation

    NASA Technical Reports Server (NTRS)

    Perrone, John

    1997-01-01

    A computational model of human self-motion perception has been developed in collaboration with Dr. Leland S. Stone at NASA Ames Research Center. The research included in the grant proposal sought to extend the utility of this model so that it could be used for explaining and predicting human performance in a greater variety of aerospace applications. This extension has been achieved along with physiological validation of the basic operation of the model.

  6. Operational Aspects of Space Radiation

    NASA Technical Reports Server (NTRS)

    1997-01-01

    In this session, Session FA4, the discussion focuses on the following topics: Solar Particle Events and the International Space Station; Radiation Environment on Mir and ISS Orbits During the Solar Cycle; New approach to Radiation Risk Assessment; An Industrial Method to Predict Major Solar Flares for a Better Protection of Human Beings in Space; Description of the Space Radiation Control System for the Russian Segment of ISS; Orbit Selection and Its Impact on Radiation Warning Architecture for a Human Mission to Mars; and Space Nuclear Power - Technology, Policy and Risk Considerations in Human Missions to Mars.

  7. Taxi Time Prediction at Charlotte Airport Using Fast-Time Simulation and Machine Learning Techniques

    NASA Technical Reports Server (NTRS)

    Lee, Hanbong

    2016-01-01

    Accurate taxi time prediction is required for enabling efficient runway scheduling that can increase runway throughput and reduce taxi times and fuel consumption on the airport surface. Currently NASA and American Airlines are jointly developing a decision-support tool called Spot and Runway Departure Advisor (SARDA) that assists airport ramp controllers in making gate pushback decisions and improving the overall efficiency of airport surface traffic. In this presentation, we propose to use Linear Optimized Sequencing (LINOS), a discrete-event fast-time simulation tool, to predict taxi times and provide the estimates to the runway scheduler in real-time airport operations. To assess its prediction accuracy, we also introduce a data-driven analytical method using machine learning techniques. These two taxi time prediction methods are evaluated against actual taxi time data obtained from the SARDA human-in-the-loop (HITL) simulation for Charlotte Douglas International Airport (CLT) using various performance metrics. Based on the taxi time prediction results, we also discuss how prediction accuracy is affected by the operational complexity at this airport and how the fast-time simulation model can be improved before implementing it with an airport scheduling algorithm in a real-time environment.

  8. MIDAS Website. Revised

    NASA Technical Reports Server (NTRS)

    Goodman, Allen; Shively, R. Joy (Technical Monitor)

    1997-01-01

    MIDAS, Man-machine Integration Design and Analysis System, is a unique combination of software tools aimed at reducing design cycle time, supporting quantitative predictions of human-system effectiveness and improving the design of crew stations and their associated operating procedures. This project is supported jointly by the US Army and NASA.

  9. Validating models of target acquisition performance in the dismounted soldier context

    NASA Astrophysics Data System (ADS)

    Glaholt, Mackenzie G.; Wong, Rachel K.; Hollands, Justin G.

    2018-04-01

    The problem of predicting real-world operator performance with digital imaging devices is of great interest within the military and commercial domains. There are several approaches to this problem, including field trials with imaging devices, laboratory experiments using imagery captured from these devices, and models that predict human performance based on imaging device parameters. The modeling approach is desirable, as both field trials and laboratory experiments are costly and time-consuming; however, data from these experiments are required for model validation. Here we considered this problem in the context of dismounted soldiering, for which detection and identification of human targets are essential tasks. Human performance data were obtained for two-alternative detection and identification decisions in a laboratory experiment in which photographs of human targets were presented on a computer monitor and digitally magnified to simulate range-to-target. We then compared the predictions of different performance models within the NV-IPM software package: the Targeting Task Performance (TTP) metric model and the Johnson model. We also introduced a modification to the TTP metric computation that incorporates an additional correction for target angular size. We examined model predictions using NV-IPM default values for a critical model constant, V50, and also considered predictions when this value was optimized to fit the behavioral data. When using default values, certain model versions produced a reasonably close fit to the human performance data in the detection task, while for the identification task all models substantially overestimated performance. When using fitted V50 values the models produced improved predictions, though the slopes of the performance functions were still shallow compared to the behavioral data. These findings are discussed in relation to the models' designs and parameters and the characteristics of the behavioral paradigm.
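    For context, TTP-based models map the resolvable cycles on target V and the task-difficulty constant V50 to a probability through a target transfer probability function; the form below is the version widely published for NVESD-style models, quoted here as background rather than taken from this paper:

```python
def ttp_probability(v, v50):
    """Target transfer probability function as commonly published for
    TTP-based models: P = (V/V50)**E / (1 + (V/V50)**E),
    with E = 1.51 + 0.24 * (V/V50)."""
    r = v / v50
    e = 1.51 + 0.24 * r
    return r**e / (1.0 + r**e)

p_half = ttp_probability(5.0, 5.0)  # V == V50 gives P = 0.5 by construction
```

    This makes visible why fitting V50 shifts the whole performance curve along the range axis while leaving its slope largely fixed, consistent with the shallow-slope mismatch the abstract reports.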

  10. A systems approach to accident causation in mining: an application of the HFACS method.

    PubMed

    Lenné, Michael G; Salmon, Paul M; Liu, Charles C; Trotter, Margaret

    2012-09-01

    This project aimed to provide a greater understanding of the systemic factors involved in mining accidents, and to examine the organisational and supervisory failures that are predictive of sub-standard performance at operator level. A sample of 263 significant mining incidents in Australia across 2007-2008 was analysed using the Human Factors Analysis and Classification System (HFACS). Two human factors specialists independently undertook the analysis. Incidents occurred more frequently in operations concerning the use of surface mobile equipment (38%) and working at heights (21%); however, injury was more frequently associated with electrical operations and with vehicles and machinery. Several HFACS categories appeared frequently: skill-based errors (64%), violations (57%), issues with the physical environment (56%), and organisational processes (65%). Focussing on the overall system, several factors were found to predict the presence of failures in other parts of the system, including planned inappropriate operations and team resource management; inadequate supervision and team resource management; and organisational climate and inadequate supervision. It is recommended that these associations receive greater attention in future attempts to develop accident countermeasures, although other significant associations should not be ignored. In accordance with findings from previous HFACS-based analyses of aviation and medical incidents, efforts to reduce the frequency of unsafe acts or operations should be directed to a few critical HFACS categories at the higher levels: organisational climate, planned inappropriate operations, and inadequate supervision. While remedial strategies are proposed, it is important that future efforts evaluate the utility of the measures proposed in studies of system safety. Copyright © 2011. Published by Elsevier Ltd.

  11. Modeling and control of operator functional state in a unified framework of fuzzy inference petri nets.

    PubMed

    Zhang, Jian-Hua; Xia, Jia-Jun; Garibaldi, Jonathan M; Groumpos, Petros P; Wang, Ru-Bin

    2017-06-01

    In human-machine (HM) hybrid control systems, the human operator and the machine cooperate to achieve the control objectives. To enhance overall HM system performance, the discrete manual control task-load of the operator must be dynamically allocated in accordance with the continuous-time fluctuation of the operator's psychophysiological functional status, the so-called operator functional state (OFS). The behavior of the HM system is hybrid in nature due to the co-existence of a discrete task-load (control) variable and a continuous operator performance (system output) variable. The Petri net is an effective tool for modeling discrete-event systems, but for hybrid systems involving continuous dynamics the Petri net model generally has to be extended. Instead of using different tools to represent the continuous and discrete components of a hybrid system, this paper proposes a fuzzy inference Petri net (FIPN) method to represent the HM hybrid system, comprising a Mamdani-type fuzzy model of OFS and a logical switching controller in a unified framework, in which the task-load level is dynamically reallocated between the operator and the machine based on the model-predicted OFS. Furthermore, the paper uses a multi-model approach to predict operator performance from three electroencephalographic (EEG) input variables (features) via the Wang-Mendel (WM) fuzzy modeling method. The membership function parameters of the fuzzy OFS model for each experimental participant were optimized using the artificial bee colony (ABC) evolutionary algorithm. Three performance indices, RMSE, MRE, and EPR, were computed to evaluate the overall modeling accuracy. Experimental data from six participants were analyzed. The results show that the proposed method (FIPN with adaptive task allocation) yields a lower breakdown rate (from 14.8% to 3.27%) and higher human performance (from 90.30% to 91.99%). The simulation results of the FIPN-based adaptive HM (AHM) system on six experimental participants demonstrate that the FIPN framework provides an effective way to model and regulate/optimize the OFS in HM hybrid systems composed of a continuous-time OFS model and a discrete-event switching controller. Copyright © 2017 Elsevier B.V. All rights reserved.
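    The switching controller itself is not specified in the abstract; a toy sketch of OFS-driven task reallocation (the thresholds, step size, and 0-1 OFS scale are all hypothetical) might look like:

```python
def allocate_task_load(predicted_ofs, current_load,
                       low=0.4, high=0.7, step=1):
    """Illustrative discrete switching rule: shed manual tasks to the
    machine when the model-predicted operator functional state (OFS,
    hypothetical 0-1 scale) degrades, and return tasks when it recovers.
    The band between the thresholds acts as hysteresis to avoid
    rapid back-and-forth reallocation."""
    if predicted_ofs < low:        # degraded OFS: offload one task
        return max(0, current_load - step)
    if predicted_ofs > high:       # spare capacity: take one task back
        return current_load + step
    return current_load            # within the band: no change
```

    In the paper's framework this discrete rule would fire as a Petri net transition whose enabling condition comes from the fuzzy OFS model's continuous output.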

  12. Impact of Pilot Delay and Non-Responsiveness on the Safety Performance of Airborne Separation

    NASA Technical Reports Server (NTRS)

    Consiglio, Maria; Hoadley, Sherwood; Wing, David; Baxley, Brian; Allen, Bonnie Danette

    2008-01-01

    Assessing the safety effects of prediction errors and uncertainty on automation-supported functions in the Next Generation Air Transportation System concept of operations is of foremost importance, particularly for safety-critical functions, such as separation, that involve human decision-making. Whether ground-based or airborne, the automation of separation functions must be designed to account for, and mitigate the impact of, information uncertainty and varying human response. This paper describes an experiment that addresses the potential impact of operator delay when interacting with separation support systems. In this study, we evaluated an airborne separation capability operated by a simulated pilot. The experimental runs are part of the Safety Performance of Airborne Separation (SPAS) experiment suite, which examines the safety implications of prediction errors and system uncertainties on airborne separation assistance systems. Pilot actions required by the airborne separation automation to resolve traffic conflicts were delayed over a wide range, from five to 240 seconds, while a percentage of randomly selected pilots were programmed to completely miss the conflict alerts and therefore take no action. Results indicate that the strategic Airborne Separation Assistance System (ASAS) functions exercised in the experiment can sustain pilot response delays of up to 90 seconds and more, depending on the traffic density. However, when pilots or operators fail to respond to conflict alerts the safety effects are substantial, particularly at higher traffic densities.

  13. Confident Surgical Decision Making in Temporal Lobe Epilepsy by Heterogeneous Classifier Ensembles

    PubMed Central

    Fakhraei, Shobeir; Soltanian-Zadeh, Hamid; Jafari-Khouzani, Kourosh; Elisevich, Kost; Fotouhi, Farshad

    2015-01-01

    In medical domains with low tolerance for invalid predictions, classification confidence is highly important, and traditional performance measures such as overall accuracy cannot provide adequate insight into classification reliability. In this paper, a confident-prediction rate (CPR), which measures the upper limit of confident predictions, is proposed based on receiver operating characteristic (ROC) curves. It is shown that a heterogeneous ensemble of classifiers improves this measure. The ensemble approach has been applied to lateralization of focal epileptogenicity in temporal lobe epilepsy (TLE) and prediction of surgical outcomes. A goal of this study is to reduce the requirement for extraoperative electrocorticography (eECoG), the practice of using electrodes placed directly on the exposed surface of the brain; we show that this goal is achievable with the application of data mining techniques. Furthermore, not all TLE surgical operations result in complete relief from seizures, and it is not always possible for human experts to identify such unsuccessful cases prior to surgery. This study demonstrates the capability of data mining techniques to predict undesirable outcomes for a portion of such cases. PMID:26609547
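    The paper's precise CPR definition is not given in the abstract; one hedged reading, sketched below, picks score cutoffs from the ROC operating points whose empirical error stays within a tolerance and reports the fraction of samples falling in those confident regions (the parameter names and tolerance are assumptions):

```python
import numpy as np

def confident_prediction_rate(scores, labels, max_error=0.05):
    """Illustrative CPR: fraction of samples whose classifier score lands
    in a high- or low-score region where the empirical error rate is at
    most max_error; samples in between are deemed non-confident."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=int)
    thresholds = np.unique(scores)

    t_hi = None                        # confident-positive cutoff
    for t in thresholds:
        sel = scores >= t
        if sel.any() and (labels[sel] == 0).mean() <= max_error:
            t_hi = t
            break
    t_lo = None                        # confident-negative cutoff
    for t in thresholds[::-1]:
        sel = scores <= t
        if sel.any() and (labels[sel] == 1).mean() <= max_error:
            t_lo = t
            break

    confident = np.zeros(scores.shape, dtype=bool)
    if t_hi is not None:
        confident |= scores >= t_hi
    if t_lo is not None:
        confident |= scores <= t_lo
    return confident.mean()

# Perfectly separated scores: every prediction can be made confidently
cpr = confident_prediction_rate([0.1, 0.2, 0.8, 0.9], [0, 0, 1, 1])
```

    On overlapping score distributions the middle band grows and CPR drops, which is the behavior an ensemble is meant to improve by sharpening the score separation.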

  14. Arctic Engineering--Through Human Eyes.

    ERIC Educational Resources Information Center

    Simmonds, W. H. C.

    Adapting technology to people and examining projects through the eyes of those concerned are two ways new technology and engineering can be installed and successfully operated under the adverse conditions of northern Canada and in the face of predicted labor shortages in the 1980's. Adopting a more flexible technology provides the opportunity for…

  15. Detection and Response for Rift Valley fever

    USDA-ARS?s Scientific Manuscript database

    Rift Valley fever is a viral disease that impacts domestic livestock and humans in Africa and the Middle East, and poses a threat to military operations in these areas. We describe a Rift Valley fever Risk Monitoring website, and its ability to predict risk of disease temporally and spatially. We al...

  16. The human dimensions of energy use in buildings: A review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D'Oca, Simona; Hong, Tianzhen; Langevin, Jared

    The “human dimensions” of energy use in buildings refer to the energy-related behaviors of key stakeholders that affect energy use over the building life cycle. Stakeholders include building designers, operators, managers, engineers, occupants, industry, vendors, and policymakers, who directly or indirectly influence the acts of designing, constructing, living, operating, managing, and regulating the built environments, from individual building up to the urban scale. Among factors driving high-performance buildings, human dimensions play a role that is as significant as that of technological advances. However, this factor is not well understood, and, as a result, human dimensions are often ignored or simplified by stakeholders. This work presents a review of the literature on human dimensions of building energy use to assess the state-of-the-art in this topic area. The paper highlights research needs for fully integrating human dimensions into the building design and operation processes with the goal of reducing energy use in buildings while enhancing occupant comfort and productivity. This research focuses on identifying key needs for each stakeholder involved in a building's life cycle and takes an interdisciplinary focus that spans the fields of architecture and engineering design, sociology, data science, energy policy, codes, and standards to provide targeted insights. Greater understanding of the human dimensions of energy use has several potential benefits including reductions in operating cost for building owners; enhanced comfort conditions and productivity for building occupants; more effective building energy management and automation systems for building operators and energy managers; and the integration of more accurate control logic into the next generation of human-in-the-loop technologies. The review concludes by summarizing recommendations for policy makers and industry stakeholders for developing codes, standards, and technologies that can leverage the human dimensions of energy use to reliably predict and achieve energy use reductions in the residential and commercial buildings sectors.

  17. The human dimensions of energy use in buildings: A review

    DOE PAGES

    D'Oca, Simona; Hong, Tianzhen; Langevin, Jared

    2017-08-19

    The “human dimensions” of energy use in buildings refer to the energy-related behaviors of key stakeholders that affect energy use over the building life cycle. Stakeholders include building designers, operators, managers, engineers, occupants, industry, vendors, and policymakers, who directly or indirectly influence the acts of designing, constructing, living, operating, managing, and regulating the built environments, from individual building up to the urban scale. Among factors driving high-performance buildings, human dimensions play a role that is as significant as that of technological advances. However, this factor is not well understood, and, as a result, human dimensions are often ignored or simplified by stakeholders. This work presents a review of the literature on human dimensions of building energy use to assess the state-of-the-art in this topic area. The paper highlights research needs for fully integrating human dimensions into the building design and operation processes with the goal of reducing energy use in buildings while enhancing occupant comfort and productivity. This research focuses on identifying key needs for each stakeholder involved in a building's life cycle and takes an interdisciplinary focus that spans the fields of architecture and engineering design, sociology, data science, energy policy, codes, and standards to provide targeted insights. Greater understanding of the human dimensions of energy use has several potential benefits including reductions in operating cost for building owners; enhanced comfort conditions and productivity for building occupants; more effective building energy management and automation systems for building operators and energy managers; and the integration of more accurate control logic into the next generation of human-in-the-loop technologies. The review concludes by summarizing recommendations for policy makers and industry stakeholders for developing codes, standards, and technologies that can leverage the human dimensions of energy use to reliably predict and achieve energy use reductions in the residential and commercial buildings sectors.

  18. Modeling of the Human - Operator in a Complex System Functioning Under Extreme Conditions

    NASA Astrophysics Data System (ADS)

    Getzov, Peter; Hubenova, Zoia; Yordanov, Dimitar; Popov, Wiliam

    2013-12-01

    Problems related to the operation of sophisticated control systems under extreme conditions are examined, together with the impact of the effectiveness of the operator's activity on the system as a whole. The necessity of creating complex simulation models that reflect operator activity is discussed. The organizational and technical system of an unmanned aviation complex is described as a sophisticated ergatic system. A computer realization of the main subsystems of the algorithmic model of the man as a controlling system is implemented, and specialized software for data processing and analysis is developed. An original computer model of the human as a tracking system has been implemented. A model of the unmanned complex for operator training and for the formation of a mental model in emergency situations, implemented in the Matlab-Simulink environment, has been synthesized. As a unit of the control loop, the pilot (operator) is viewed, in simplified form, as an automatic control system consisting of three main interconnected subsystems: sensory organs (perception sensors); the central nervous system; and executive organs (muscles of the arms, legs, and back). A theoretical data model for predicting the level of operator information load in ergatic systems is proposed; it allows the assessment and prediction of the effectiveness of a real working operator. A simulation model of operator activity during takeoff, based on Petri nets, has been synthesized.

  19. Narrowing the scope of failure prediction using targeted fault load injection

    NASA Astrophysics Data System (ADS)

    Jordan, Paul L.; Peterson, Gilbert L.; Lin, Alan C.; Mendenhall, Michael J.; Sellers, Andrew J.

    2018-05-01

    As society becomes more dependent upon computer systems to perform increasingly critical tasks, ensuring that those systems do not fail becomes increasingly important. Many organizations depend heavily on desktop computers for day-to-day operations. Unfortunately, the software that runs on these computers is written by humans and, as such, is still subject to human error and consequent failure. A natural solution is to use statistical machine learning to predict failure. However, since failure is still a relatively rare event, obtaining labelled training data to train these models is not a trivial task. This work presents new simulated fault-inducing loads that extend the focus of traditional fault injection techniques to predict failure in the Microsoft enterprise authentication service and Apache web server. These new fault loads were successful in creating failure conditions that were identifiable using statistical learning methods, with fewer irrelevant faults being created.

  20. Evolutionary Agent-Based Simulation of the Introduction of New Technologies in Air Traffic Management

    NASA Technical Reports Server (NTRS)

    Yliniemi, Logan; Agogino, Adrian K.; Tumer, Kagan

    2014-01-01

    Accurate simulation of the effects of integrating new technologies into a complex system is critical to the modernization of our antiquated air traffic system, where there exist many layers of interacting procedures, controls, and automation, all designed to cooperate with human operators. Additions of even simple new technologies may result in unexpected emergent behavior due to complex human/machine interactions. One approach is to create high-fidelity human models, drawn from the field of human factors, that can simulate a rich set of behaviors. However, such models are difficult to produce, especially for showing the unexpected emergent behavior that arises when many human operators interact simultaneously within a complex system. Instead of engineering complex human models, we directly model the emergent behavior by evolving goal-directed agents representing human users. Using evolution, we can predict how the agent representing the human user reacts given his/her goals. In this paradigm, each autonomous agent in a system pursues individual goals, and the behavior of the system emerges from the interactions, foreseen or unforeseen, between the agents/actors. We show that this method reflects the integration of new technologies in a historical case, and we apply the same methodology to a possible future technology.

  1. Cognitive issues in autonomous spacecraft-control operations: An investigation of software-mediated decision making in a scaled environment

    NASA Astrophysics Data System (ADS)

    Murphy, Elizabeth Drummond

    As advances in technology are applied in complex, semi-automated domains, human controllers are distanced from the controlled process. This physical and psychological distance may both facilitate and degrade human performance. To investigate cognitive issues in spacecraft ground-control operations, the present experimental research was undertaken. The primary issue concerned the ability of operations analysts who do not monitor operations to make timely, accurate decisions when autonomous software calls for human help. Another key issue involved the potential effects of spatial-visualization ability (SVA) in environments that present data in graphical formats. Hypotheses were derived largely from previous findings and predictions in the literature. Undergraduate psychology students were assigned at random to a monitoring condition or an on-call condition in a scaled environment. The experimental task required subjects to decide on the veracity of a problem diagnosis delivered by a software process on-board a simulated spacecraft. To support decision-making, tabular and graphical data displays presented information on system status. A level of software confidence in the problem diagnosis was displayed, and subjects reported their own level of confidence in their decisions. Contrary to expectations, the performance of on-call subjects did not differ significantly from that of continuous monitors. Analysis yielded a significant interaction of sex and condition: Females in the on-call condition had the lowest mean accuracy. Results included a preference for bar charts over line graphs and faster performance with tables than with line graphs. A significant correlation was found between subjective confidence and decision accuracy. SVA was found to be predictive of accuracy but not speed; and SVA was found to be a stronger predictor of performance for males than for females. 
Low-SVA subjects reported that they relied more on software confidence than did medium- or high-SVA subjects. These and other findings have implications for the design of user interfaces to support human decision-making in on-call situations and to accommodate low-SVA users.

  2. Prediction modeling of physiological responses and human performance in the heat with application to space operations

    NASA Technical Reports Server (NTRS)

    Pandolf, Kent B.; Stroschein, Leander A.; Gonzalez, Richard R.; Sawka, Michael N.

    1994-01-01

    This institute has developed a comprehensive USARIEM heat strain model for predicting physiological responses and soldier performance in the heat which has been programmed for use by hand-held calculators, personal computers, and incorporated into the development of a heat strain decision aid. This model deals directly with five major inputs: the clothing worn, the physical work intensity, the state of heat acclimation, the ambient environment (air temperature, relative humidity, wind speed, and solar load), and the accepted heat casualty level. In addition to predicting rectal temperature, heart rate, and sweat loss given the above inputs, our model predicts the expected physical work/rest cycle, the maximum safe physical work time, the estimated recovery time from maximal physical work, and the drinking water requirements associated with each of these situations. This model provides heat injury risk management guidance based on thermal strain predictions from the user specified environmental conditions, soldier characteristics, clothing worn, and the physical work intensity. If heat transfer values for space operations' clothing are known, NASA can use this prediction model to help avoid undue heat strain in astronauts during space flight.

  3. Ensemble flare forecasting: using numerical weather prediction techniques to improve space weather operations

    NASA Astrophysics Data System (ADS)

    Murray, S.; Guerra, J. A.

    2017-12-01

    One essential component of operational space weather forecasting is the prediction of solar flares. Early flare forecasting work focused on statistical methods based on historical flaring rates, but more complex machine learning methods have been developed in recent years. A multitude of flare forecasting methods are now available; however, it is still unclear which of these methods performs best, and none is substantially better than climatological forecasts. Current operational space weather centres cannot rely on automated methods, and generally use statistical forecasts with a little human intervention. Space weather researchers are increasingly looking towards methods used in terrestrial weather forecasting to improve current techniques. Ensemble forecasting has been used in numerical weather prediction for many years as a way to combine different predictions in order to obtain a more accurate result. It has proved useful in areas such as magnetospheric modelling and coronal mass ejection arrival analysis, but it has not yet been implemented in operational flare forecasting. Here we construct ensemble forecasts for major solar flares by linearly combining the full-disk probabilistic forecasts from a group of operational forecasting methods (ASSA, ASAP, MAG4, MOSWOC, NOAA, and Solar Monitor). Forecasts from each method are weighted by a factor that accounts for the method's ability to predict previous events, and several performance metrics (both probabilistic and categorical) are considered. The results provide space weather forecasters with a set of parameters (combination weights, thresholds) that allow them to select the most appropriate values for constructing the 'best' ensemble forecast probability, according to the performance metric of their choice. In this way, different forecasts can be made to fit different end-user needs.
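    The linear combination described above can be sketched in a few lines. This is a hedged illustration, not the paper's actual scheme: the member names, verification history, and the inverse-Brier-score weighting rule are all stand-ins for whatever skill-based weights the authors fit.

```python
# Sketch: linearly combine probabilistic forecasts, weighting each member
# by its skill (inverse Brier score) on past events. All values are toy data.

def brier_score(forecasts, outcomes):
    """Mean squared difference between forecast probabilities and 0/1 outcomes."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

def skill_weights(history):
    """Weight each member by inverse Brier score over past events, normalized to 1."""
    raw = {name: 1.0 / brier_score(fcasts, outcomes)
           for name, (fcasts, outcomes) in history.items()}
    total = sum(raw.values())
    return {name: w / total for name, w in raw.items()}

def ensemble_probability(todays_forecasts, weights):
    """Linear combination of member probabilities with skill-based weights."""
    return sum(weights[name] * p for name, p in todays_forecasts.items())

# Hypothetical verification history: (past forecast probabilities, flare yes/no)
history = {
    "method_A": ([0.7, 0.2, 0.9], [1, 0, 1]),
    "method_B": ([0.5, 0.5, 0.5], [1, 0, 1]),
}
weights = skill_weights(history)
p = ensemble_probability({"method_A": 0.6, "method_B": 0.4}, weights)
```

    The better-calibrated member ends up dominating the combined probability, which is the intended behavior of skill-weighted ensembles.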

  4. Neuroadaptive technology enables implicit cursor control based on medial prefrontal cortex activity.

    PubMed

    Zander, Thorsten O; Krol, Laurens R; Birbaumer, Niels P; Gramann, Klaus

    2016-12-27

    The effectiveness of today's human-machine interaction is limited by a communication bottleneck as operators are required to translate high-level concepts into a machine-mandated sequence of instructions. In contrast, we demonstrate effective, goal-oriented control of a computer system without any form of explicit communication from the human operator. Instead, the system generated the necessary input itself, based on real-time analysis of brain activity. Specific brain responses were evoked by violating the operators' expectations to varying degrees. The evoked brain activity demonstrated detectable differences reflecting congruency with or deviations from the operators' expectations. Real-time analysis of this activity was used to build a user model of those expectations, thus representing the optimal (expected) state as perceived by the operator. Based on this model, which was continuously updated, the computer automatically adapted itself to the expectations of its operator. Further analyses showed this evoked activity to originate from the medial prefrontal cortex and to exhibit a linear correspondence to the degree of expectation violation. These findings extend our understanding of human predictive coding and provide evidence that the information used to generate the user model is task-specific and reflects goal congruency. This paper demonstrates a form of interaction without any explicit input by the operator, enabling computer systems to become neuroadaptive, that is, to automatically adapt to specific aspects of their operator's mindset. Neuroadaptive technology significantly widens the communication bottleneck and has the potential to fundamentally change the way we interact with technology.

  5. Case Study: Influences of Uncertainties and Traffic Scenario Difficulties in a Human-in-the-Loop Simulation

    NASA Technical Reports Server (NTRS)

    Bienert, Nancy; Mercer, Joey; Homola, Jeffrey; Morey, Susan; Prevot, Thomas

    2014-01-01

    This paper presents a case study of how factors such as wind prediction errors and metering delays can influence controller performance and workload in Human-In-The-Loop simulations. Retired air traffic controllers worked two arrival sectors adjacent to the terminal area. The main tasks were to provide safe air traffic operations and deliver the aircraft to the metering fix within +/- 25 seconds of the scheduled arrival time with the help of provided decision support tools. Analyses explore the potential impact of metering delays and system uncertainties on controller workload and performance. The results suggest that trajectory prediction uncertainties impact safety performance, while metering fix accuracy and workload appear subject to the scenario difficulty.

  6. Improving Grid Resilience through Informed Decision-making (IGRID)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burnham, Laurie; Stamber, Kevin L.; Jeffers, Robert Fredric

    The transformation of the distribution grid from a centralized to decentralized architecture, with bi-directional power and data flows, is made possible by a surge in network intelligence and grid automation. While changes are largely beneficial, the interface between grid operator and automated technologies is not well understood, nor are the benefits and risks of automation. Quantifying and understanding the latter is an important facet of grid resilience that needs to be fully investigated. The work described in this document represents the first empirical study aimed at identifying and mitigating the vulnerabilities posed by automation for a grid that for the foreseeable future will remain a human-in-the-loop critical infrastructure. Our scenario-based methodology enabled us to conduct a series of experimental studies to identify causal relationships between grid-operator performance and automated technologies and to collect measurements of human performance as a function of automation. Our findings, though preliminary, suggest there are predictive patterns in the interplay between human operators and automation, patterns that can inform the rollout of distribution automation and the hiring and training of operators, and contribute in multiple and significant ways to the field of grid resilience.

  7. Combined panel of serum human tissue kallikreins and CA-125 for the detection of epithelial ovarian cancer.

    PubMed

    Koh, Stephen Chee Liang; Huak, Chan Yiong; Lutan, Delfi; Marpuang, Johny; Ketut, Suwiyoga; Budiana, Nyoma Gede; Saleh, Agustria Zainu; Aziz, Mohamad Farid; Winarto, Hariyono; Pradjatmo, Heru; Hoan, Nguyen Khac Han; Thanh, Pham Viet; Choolani, Mahesh

    2012-07-01

    To determine the predictive accuracy of combined panels of serum human tissue kallikreins (hKs) and CA-125 for the detection of epithelial ovarian cancer. Serum specimens collected from 5 Indonesian centers and 1 Vietnamese center were analyzed for CA-125, hK6, and hK10 levels. A total of 375 specimens from patients presenting with ovarian tumors, including 156 benign cysts, 172 epithelial ovarian cancers (stage I/II, n=72; stage III/IV, n=100), 36 germ cell tumors, and 11 borderline tumors, were included in the study analysis. Receiver operating characteristic analyses were performed to determine the cutoffs for age, CA-125, hK6, and hK10. Sensitivity, specificity, and negative and positive predictive values were determined for various combinations of the biomarkers. The levels of hK6 and hK10 were significantly elevated in ovarian cancer cases compared to benign cysts. Combinations of 3 markers, age/CA-125/hK6 or CA-125/hK6/hK10, showed improved specificity (100%) and positive predictive value (100%) for prediction of ovarian cancer, compared to the performance of single markers, which had 80-92% specificity and 74-87% positive predictive value. The four-marker combination age/CA-125/hK6/hK10 also showed 100% specificity and 100% positive predictive value, although it demonstrated low sensitivity (11.9%) and negative predictive value (52.8%). The combination of human tissue kallikreins and CA-125 showed potential for improving prediction of epithelial ovarian cancer in patients presenting with ovarian tumors.
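    The panel metrics reported above (sensitivity, specificity, PPV, NPV) can be computed for any combination rule. The sketch below assumes a simple AND rule (call "cancer" only when every marker exceeds its cutoff), which, as in the study, trades sensitivity for specificity and PPV; the cutoffs and toy samples are illustrative, not the study's data.

```python
# Sketch: confusion-matrix metrics for a multi-marker panel under an AND rule.
# Cutoffs and samples are hypothetical toy values.

def panel_predict(sample, cutoffs):
    """Predict positive only if every marker exceeds its cutoff (AND rule)."""
    return all(sample[m] > c for m, c in cutoffs.items())

def panel_metrics(samples, labels, cutoffs):
    preds = [panel_predict(s, cutoffs) for s in samples]
    tp = sum(1 for p, y in zip(preds, labels) if p and y)
    tn = sum(1 for p, y in zip(preds, labels) if not p and not y)
    fp = sum(1 for p, y in zip(preds, labels) if p and not y)
    fn = sum(1 for p, y in zip(preds, labels) if not p and y)
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp) if tp + fp else float("nan"),
        "npv": tn / (tn + fn),
    }

cutoffs = {"CA125": 35.0, "hK6": 4.0}        # illustrative thresholds
samples = [
    {"CA125": 90.0, "hK6": 6.0},  # cancer, both markers high -> true positive
    {"CA125": 80.0, "hK6": 2.0},  # cancer, hK6 low           -> false negative
    {"CA125": 20.0, "hK6": 3.0},  # benign, both low          -> true negative
    {"CA125": 50.0, "hK6": 3.5},  # benign, CA-125 alone high -> still negative
]
labels = [1, 1, 0, 0]
m = panel_metrics(samples, labels, cutoffs)
```

    Note how the fourth sample shows the mechanism: a single elevated marker no longer triggers a positive call, which raises specificity at the cost of sensitivity.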

  8. Human Behaviour in High Stress Situations in Aerospace Operations Conference Proceedings Held in The Hague, The Netherlands on 24-28 October 1988

    DTIC Science & Technology

    1989-06-01

    [Fragmented OCR excerpts from the proceedings front matter:] Continuously stimulating advances in the aerospace sciences relevant to strengthening the common defence posture; improving the co-operation among member... very stimulating symposium. KI-1: PREDICTION OF PERSONALITY, Harald T. Andersen, M.D., Ph.D., D.Sc., D.Av.Med., Director, RNoAF Institute of Aviation... audio tape recorder which was connected to the aircraft communication system. This recorder provided a continuous auditory record of each mission so that...

  9. Manual control models of industrial management

    NASA Technical Reports Server (NTRS)

    Crossman, E. R. F. W.

    1972-01-01

    The industrial engineer is often required to design and implement control systems and organization for manufacturing and service facilities, to optimize quality, delivery, and yield, and to minimize cost. Despite progress in computer science, most such systems still employ human operators and managers as real-time control elements. Manual control theory should therefore be applicable to at least some aspects of industrial system design and operations. Formulation of adequate model structures is an essential prerequisite to progress in this area, since real-world production systems invariably include multilevel and multiloop control and are implemented by time-shared human effort. A modular structure incorporating certain new types of functional element has been developed; this forms the basis for analysis of an industrial process operation. In this case it appears that managerial controllers operate in a discrete predictive mode based on fast-time modelling, with sampling interval related to plant dynamics. Successive aggregation causes reduced response bandwidth and hence increased sampling interval as a function of level.

  10. DoD-GEIS Rift Valley Fever Monitoring and Prediction System as a Tool for Defense and US Diplomacy

    NASA Technical Reports Server (NTRS)

    Anyamba, Assaf; Tucker, Compton J.; Linthicum, Kenneth J.; Witt, Clara J.; Gaydos, Joel C.; Russell, Kevin L.

    2011-01-01

    Over the last 10 years the Armed Forces Health Surveillance Center's Global Emerging Infections Surveillance and Response System (GEIS), partnering with NASA's Goddard Space Flight Center and USDA's Center for Medical, Agricultural & Veterinary Entomology, established and has operated the Rift Valley fever Monitoring and Prediction System to monitor, predict, and assess the risk of Rift Valley fever outbreaks and other vector-borne diseases over Africa and the Middle East. This system is built on legacy DoD basic research conducted by the Walter Reed Army Institute of Research overseas laboratory (US Army Medical Research Unit-Kenya) and on operational satellite environmental monitoring by NASA GSFC. Over the last 10 years of operation the system has predicted outbreaks of Rift Valley fever in the Horn of Africa, Sudan, South Africa, and Mauritania. The ability to predict an outbreak several months before it occurs provides early warning to protect deployed forces, enhances public health in the countries concerned, and is a valuable tool used by the State Department in US diplomacy. At the international level the system has been used by the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to support their monitoring, surveillance, and response programs in the livestock sector and human health. This project is a successful testament to leveraging the resources of different federal agencies to achieve the objectives of force health protection, health, and diplomacy.

  11. Design of a Multi-mode Flight Deck Decision Support System for Airborne Conflict Management

    NASA Technical Reports Server (NTRS)

    Barhydt, Richard; Krishnamurthy, Karthik

    2004-01-01

    NASA Langley has developed a multi-mode decision support system for pilots operating in a Distributed Air-Ground Traffic Management (DAG-TM) environment. An Autonomous Operations Planner (AOP) assists pilots in performing separation assurance functions, including conflict detection, prevention, and resolution. Ongoing AOP design has been based on a comprehensive human factors analysis and evaluation results from previous human-in-the-loop experiments with airline pilot test subjects. AOP considers complex flight mode interactions and provides flight guidance to pilots consistent with the current aircraft control state. Pilots communicate goals to AOP by setting system preferences and actively probing potential trajectories for conflicts. To minimize training requirements and improve operational use, AOP design leverages existing alerting philosophies, displays, and crew interfaces common on commercial aircraft. Future work will consider trajectory prediction uncertainties, integration with the TCAS collision avoidance system, and will incorporate enhancements based on an upcoming air-ground coordination experiment.

  12. Pilot/vehicle model analysis of visual and motion cue requirements in flight simulation. [helicopter hovering

    NASA Technical Reports Server (NTRS)

    Baron, S.; Lancraft, R.; Zacharias, G.

    1980-01-01

    The optimal control model (OCM) of the human operator is used to predict the effect of simulator characteristics on pilot performance and workload. The piloting task studied is helicopter hover. Among the simulator characteristics considered were (computer generated) visual display resolution, field of view and time delay.

  13. (Toxicological Sciences) High-throughput H295R steroidogenesis assay: utility as an alternative and a statistical approach to characterize effects on steroidogenesis

    EPA Science Inventory

    The U.S. Environmental Protection Agency Endocrine Disruptor Screening Program and the Organization for Economic Co-operation and Development (OECD) have used the human adrenocarcinoma (H295R) cell-based assay to predict chemical perturbation of androgen and estrogen production. ...

  14. Multidimensional Profiling of Task Stress States for Human Factors: A Brief Review.

    PubMed

    Matthews, Gerald

    2016-09-01

    This article advocates multidimensional assessment of task stress in human factors and reviews the use of the Dundee Stress State Questionnaire (DSSQ) for evaluation of systems and operators. Contemporary stress research has progressed from an exclusive focus on environmental stressors to transactional perspectives on the stress process. Performance impacts of stress reflect the operator's dynamic attempts to understand and cope with task demands. Multidimensional stress assessments are necessary to gauge the different forms of system-operator interaction. This review discusses the theoretical and practical use of the DSSQ in evaluating multidimensional patterns of stress response. It presents psychometric evidence for the multidimensional perspective and illustrative profiles of subjective state response to task stressors and environments. Evidence is also presented on stress state correlations with related variables, including personality, stress process measures, psychophysiological response, and objective task performance. Evidence supports the validity of the DSSQ as a task stress measure. Studies of various simulated environments show that different tasks elicit different profiles of stress state response. Operator characteristics such as resilience predict individual differences in state response to stressors. Structural equation modeling may be used to understand performance impacts of stress states. Multidimensional assessment affords insight into the stress process in a variety of human factors contexts. Integrating subjective and psychophysiological assessment is a priority for future research. Stress state measurement contributes to evaluating system design, countermeasures to stress and fatigue, and performance vulnerabilities. It may also support personnel selection and diagnostic monitoring of operators. © 2016, Human Factors and Ergonomics Society.

  15. Models of Human Information Requirements: "When Reasonable Aiding Systems Disagree"

    NASA Technical Reports Server (NTRS)

    Corker, Kevin; Pisanich, Gregory; Shafto, Michael (Technical Monitor)

    1994-01-01

    Aircraft flight management and Air Traffic Control (ATC) automation are under development to maximize the economy of flight and to increase the capacity of the terminal area airspace while maintaining levels of flight safety equal to or better than current system performance. These goals are being realized by the introduction of flight management automation aiding and operations support systems on the flight deck and by new developments of ATC aiding systems that seek to optimize scheduling of aircraft while potentially reducing required separation and accounting for weather and wake vortex turbulence. Aiding systems on both the flight deck and the ground operate through algorithmic functions on models of the aircraft and of the airspace. These models may differ from each other as a result of variations in their models of the immediate environment. The resultant flight operations or ATC commands may differ in their response requirements (e.g. different preferred descent speeds or descent initiation points). The human operators in the system must then interact with the automation to reconcile differences and resolve conflicts. We have developed a model of human performance including cognitive functions (decision-making, rule-based reasoning, procedural interruption recovery and forgetting) that supports analysis of the information requirements for resolution of flight aiding and ATC conflicts. The model represents multiple individuals in the flight crew and in ATC. The model is supported in simulation on a Silicon Graphics' workstation using Allegro Lisp. Design guidelines for aviation automation aiding systems have been developed using the model's specification of information and team procedural requirements. Empirical data on flight deck operations from full-mission flight simulation are provided to support the model's predictions. 
The paper describes the model, its development and implementation, the simulation test of the model predictions, and the empirical validation process. The model and its supporting data provide a generalizable tool that is being expanded to include air/ground compatibility and ATC crew interactions in air traffic management.

  16. A Closed-Loop Model of Operator Visual Attention, Situation Awareness, and Performance Across Automation Mode Transitions.

    PubMed

    Johnson, Aaron W; Duda, Kevin R; Sheridan, Thomas B; Oman, Charles M

    2017-03-01

    This article describes a closed-loop, integrated human-vehicle model designed to help understand the underlying cognitive processes that influenced changes in subject visual attention, mental workload, and situation awareness across control mode transitions in a simulated human-in-the-loop lunar landing experiment. Control mode transitions from autopilot to manual flight may cause total attentional demands to exceed operator capacity. Attentional resources must be reallocated and reprioritized, which can increase the average uncertainty in the operator's estimates of low-priority system states. We define this increase in uncertainty as a reduction in situation awareness. We present a model built upon the optimal control model for state estimation, the crossover model for manual control, and the SEEV (salience, effort, expectancy, value) model for visual attention. We modify the SEEV attention executive to direct visual attention based, in part, on the uncertainty in the operator's estimates of system states. The model was validated using the simulated lunar landing experimental data, demonstrating an average difference in the percentage of attention ≤3.6% for all simulator instruments. The model's predictions of mental workload and situation awareness, measured by task performance and system state uncertainty, also mimicked the experimental data. Our model supports the hypothesis that visual attention is influenced by the uncertainty in system state estimates. Conceptualizing situation awareness around the metric of system state uncertainty is a valuable way for system designers to understand and predict how reallocations in the operator's visual attention during control mode transitions can produce reallocations in situation awareness of certain states.
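    A minimal sketch of how a SEEV-style scoring rule might allocate attention shares across instruments, extended with an uncertainty term as the abstract describes. The additive form, coefficients, and instrument values here are assumptions for illustration, not the authors' fitted model.

```python
# Sketch: SEEV-like attention allocation, where salience, expectancy, and value
# attract attention, access effort deters it, and (per the abstract's extension)
# uncertainty in the operator's state estimate raises an instrument's priority.
# All numbers are hypothetical.

def seev_score(salience, effort, expectancy, value, uncertainty, k=1.0):
    """Additive SEEV-style score, plus a k-weighted uncertainty term."""
    return salience - effort + expectancy * value + k * uncertainty

def attention_shares(instruments):
    """Normalize per-instrument scores into predicted fractions of attention."""
    scores = {name: max(seev_score(**p), 0.0) for name, p in instruments.items()}
    total = sum(scores.values())
    return {name: s / total for name, s in scores.items()}

instruments = {
    "altitude": dict(salience=0.5, effort=0.1, expectancy=0.8, value=0.9,
                     uncertainty=0.6),   # high-priority, poorly-estimated state
    "fuel":     dict(salience=0.2, effort=0.2, expectancy=0.3, value=0.7,
                     uncertainty=0.1),   # low-priority, well-estimated state
}
shares = attention_shares(instruments)
```

    During a mode transition, raising an instrument's `uncertainty` input raises its predicted share, which is the qualitative behavior the model validation above reports.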

  17. LocFuse: human protein-protein interaction prediction via classifier fusion using protein localization information.

    PubMed

    Zahiri, Javad; Mohammad-Noori, Morteza; Ebrahimpour, Reza; Saadat, Samaneh; Bozorgmehr, Joseph H; Goldberg, Tatyana; Masoudi-Nejad, Ali

    2014-12-01

    Protein-protein interaction (PPI) detection is one of the central goals of functional genomics and systems biology. Knowledge about the nature of PPIs can help fill the widening gap between sequence information and functional annotations. Although experimental methods have produced valuable PPI data, they also suffer from significant limitations. Computational PPI prediction methods have therefore attracted tremendous attention. Despite considerable efforts, PPI prediction is still in its infancy in complex multicellular organisms such as humans. Here, we propose a novel ensemble learning method, LocFuse, which is useful in human PPI prediction. This method uses eight different genomic and proteomic features along with four types of different classifiers. The prediction performance of this classifier selection method was found to be considerably better than methods employed hitherto. This confirms the complex nature of the PPI prediction problem and also the necessity of using biological information for classifier fusion. LocFuse is available at: http://lbb.ut.ac.ir/Download/LBBsoft/LocFuse. The results revealed that if we divide proteome space according to the cellular localization of proteins, then the utility of some classifiers in PPI prediction can be improved. Therefore, to predict the interaction for any given protein pair, we can select the most accurate classifier with regard to the cellular localization information. Based on the results, the importance of different features for PPI prediction varies between differently localized proteins; in general, however, our novel features, which were extracted from position-specific scoring matrices (PSSMs), are the most important ones, and the Random Forest (RF) classifier performs best in most cases. LocFuse was developed with a user-friendly graphical interface and is freely available for Linux, Mac OS X, and MS Windows operating systems. Copyright © 2014 Elsevier Inc. All rights reserved.

  18. The Value of Humans in the Operational River Forecasting Enterprise

    NASA Astrophysics Data System (ADS)

    Pagano, T. C.

    2012-04-01

    The extent of human control over operational river forecasts, such as by adjusting model inputs and outputs, varies from nearly completely automated systems to those where forecasts are generated after discussion among a group of experts. Historical and real-time data availability, the complexity of hydrologic processes, forecast user needs, and forecasting institution support/resource availability (e.g. computing power, money for model maintenance) influence the character and effectiveness of operational forecasting systems. Automated data quality algorithms, if used at all, are typically very basic (e.g. checks for impossible values); substantial human effort is devoted to cleaning up forcing data using subjective methods. Similarly, although it is an active research topic, nearly all operational forecasting systems struggle to make quantitative use of Numerical Weather Prediction model-based precipitation forecasts, instead relying on the assessment of meteorologists. Conversely, while there is a strong tradition in meteorology of making raw model outputs available to forecast users via the Internet, this is rarely done in hydrology; operational river forecasters express concerns about exposing users to raw guidance, due to the potential for misinterpretation and misuse. However, this limits the ability of users to build their confidence in operational products through their own value-added analyses. Forecasting agencies also struggle with provenance (i.e. documenting the production process and archiving the pieces that went into creating a forecast), although this is necessary for quantifying the benefits of human involvement in forecasting and for diagnosing weak links in the forecasting chain. In hydrology, the space between model outputs and final operational products is nearly unstudied by the academic community, although some studies exist in other fields such as meteorology.

  19. Experimental methods to validate measures of emotional state and readiness for duty in critical operations.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weston, Louise Marie

    2007-09-01

    A recent report on criticality accidents in nuclear facilities indicates that human error played a major role in a significant number of incidents with serious consequences and that some of these human errors may be related to the emotional state of the individual. A pre-shift test to detect a deleterious emotional state could reduce the occurrence of such errors in critical operations. The effectiveness of pre-shift testing is a challenge because of the need to gather predictive data in a relatively short test period and the potential occurrence of learning effects due to a requirement for frequent testing. This report reviews the different types of reliability and validity methods and testing and statistical analysis procedures to validate measures of emotional state. The ultimate value of a validation study depends upon the percentage of human errors in critical operations that are due to the emotional state of the individual. A review of the literature to identify the most promising predictors of emotional state for this application is highly recommended.

  20. Distinct prediction errors in mesostriatal circuits of the human brain mediate learning about the values of both states and actions: evidence from high-resolution fMRI.

    PubMed

    Colas, Jaron T; Pauli, Wolfgang M; Larsen, Tobias; Tyszka, J Michael; O'Doherty, John P

    2017-10-01

    Prediction-error signals consistent with formal models of "reinforcement learning" (RL) have repeatedly been found within dopaminergic nuclei of the midbrain and dopaminoceptive areas of the striatum. However, the precise form of the RL algorithms implemented in the human brain is not yet well determined. Here, we created a novel paradigm optimized to dissociate the subtypes of reward-prediction errors that function as the key computational signatures of two distinct classes of RL models-namely, "actor/critic" models and action-value-learning models (e.g., the Q-learning model). The state-value-prediction error (SVPE), which is independent of actions, is a hallmark of the actor/critic architecture, whereas the action-value-prediction error (AVPE) is the distinguishing feature of action-value-learning algorithms. To test for the presence of these prediction-error signals in the brain, we scanned human participants with a high-resolution functional magnetic-resonance imaging (fMRI) protocol optimized to enable measurement of neural activity in the dopaminergic midbrain as well as the striatal areas to which it projects. In keeping with the actor/critic model, the SVPE signal was detected in the substantia nigra. The SVPE was also clearly present in both the ventral striatum and the dorsal striatum. However, alongside these purely state-value-based computations we also found evidence for AVPE signals throughout the striatum. These high-resolution fMRI findings suggest that model-free aspects of reward learning in humans can be explained algorithmically with RL in terms of an actor/critic mechanism operating in parallel with a system for more direct action-value learning.
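
    The two prediction-error subtypes contrasted in this record follow directly from their temporal-difference definitions; the states, actions, values, and discount factor below are toy numbers for illustration, not fitted quantities:

```python
GAMMA = 0.9  # discount factor (toy value)

def svpe(V, s, r, s_next):
    # State-value prediction error (actor/critic): independent of the action taken
    return r + GAMMA * V[s_next] - V[s]

def avpe(Q, s, a, r, s_next):
    # Action-value prediction error (Q-learning form): tied to the chosen action
    return r + GAMMA * max(Q[s_next].values()) - Q[s][a]

V = {"s0": 0.5, "s1": 1.0}
Q = {"s0": {"left": 0.2, "right": 0.6}, "s1": {"left": 0.9, "right": 0.4}}
d_state = svpe(V, "s0", 1.0, "s1")            # 1.0 + 0.9*1.0 - 0.5 = 1.4
d_action = avpe(Q, "s0", "right", 1.0, "s1")  # 1.0 + 0.9*0.9 - 0.6 = 1.21
```

    The SVPE ignores which action was chosen, which is what makes it the computational signature of the critic, whereas the AVPE depends on Q(s, a) for the action actually taken.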

  1. Distinct prediction errors in mesostriatal circuits of the human brain mediate learning about the values of both states and actions: evidence from high-resolution fMRI

    PubMed Central

    Pauli, Wolfgang M.; Larsen, Tobias; Tyszka, J. Michael; O’Doherty, John P.

    2017-01-01

    Prediction-error signals consistent with formal models of “reinforcement learning” (RL) have repeatedly been found within dopaminergic nuclei of the midbrain and dopaminoceptive areas of the striatum. However, the precise form of the RL algorithms implemented in the human brain is not yet well determined. Here, we created a novel paradigm optimized to dissociate the subtypes of reward-prediction errors that function as the key computational signatures of two distinct classes of RL models—namely, “actor/critic” models and action-value-learning models (e.g., the Q-learning model). The state-value-prediction error (SVPE), which is independent of actions, is a hallmark of the actor/critic architecture, whereas the action-value-prediction error (AVPE) is the distinguishing feature of action-value-learning algorithms. To test for the presence of these prediction-error signals in the brain, we scanned human participants with a high-resolution functional magnetic-resonance imaging (fMRI) protocol optimized to enable measurement of neural activity in the dopaminergic midbrain as well as the striatal areas to which it projects. In keeping with the actor/critic model, the SVPE signal was detected in the substantia nigra. The SVPE was also clearly present in both the ventral striatum and the dorsal striatum. However, alongside these purely state-value-based computations we also found evidence for AVPE signals throughout the striatum. These high-resolution fMRI findings suggest that model-free aspects of reward learning in humans can be explained algorithmically with RL in terms of an actor/critic mechanism operating in parallel with a system for more direct action-value learning. PMID:29049406

  2. Accurate and dynamic predictive model for better prediction in medicine and healthcare.

    PubMed

    Alanazi, H O; Abdullah, A H; Qureshi, K N; Ismail, A S

    2018-05-01

    Information and communication technologies (ICTs) have changed the trend toward new integrated operations and methods in all fields of life. The health sector has also adopted new technologies to improve its systems and provide better services to customers. Predictive models in health care have likewise been influenced by new technologies to predict different disease outcomes. However, existing predictive models still suffer from limitations in predictive performance. In order to improve predictive model performance, this paper proposed a predictive model that classifies disease predictions into different categories. To demonstrate this model's performance, this paper uses traumatic brain injury (TBI) datasets. TBI is a serious disease worldwide and needs attention due to its severe impact on human life. The proposed model improves the predictive performance for TBI. The TBI data set was developed and approved by neurologists to set its features. The experiment results show that the proposed model achieved significant results in accuracy, sensitivity, and specificity.
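
    The accuracy, sensitivity, and specificity reported for such models reduce to simple ratios over a 2x2 confusion matrix; the counts below are illustrative, not from the TBI dataset:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    # Standard definitions over a 2x2 confusion matrix
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    sensitivity = tp / (tp + fn)   # true-positive rate
    specificity = tn / (tn + fp)   # true-negative rate
    return accuracy, sensitivity, specificity

acc, sens, spec = diagnostic_metrics(tp=40, fp=5, tn=45, fn=10)
```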

  3. Does human capital matter? A meta-analysis of the relationship between human capital and firm performance.

    PubMed

    Crook, T Russell; Todd, Samuel Y; Combs, James G; Woehr, David J; Ketchen, David J

    2011-05-01

    Theory at both the micro and macro level predicts that investments in superior human capital generate better firm-level performance. However, human capital takes time and money to develop or acquire, which potentially offsets its positive benefits. Indeed, extant tests appear equivocal regarding its impact. To clarify what is known, we meta-analyzed effects drawn from 66 studies of the human capital-firm performance relationship and investigated 3 moderators suggested by resource-based theory. We found that human capital relates strongly to performance, especially when the human capital in question is not readily tradable in labor markets and when researchers use operational performance measures that are not subject to profit appropriation. Our results suggest that managers should invest in programs that increase and retain firm-specific human capital.
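
    Pooling effects across studies, as in this meta-analysis, rests on weighting each study by its precision. A minimal fixed-effect sketch is below; the effect sizes and variances are invented, and the published analysis may well have used a different (e.g. random-effects) estimator:

```python
def fixed_effect_mean(effects, variances):
    # Inverse-variance weights: more precise studies count for more
    weights = [1.0 / v for v in variances]
    return sum(w * e for w, e in zip(weights, effects)) / sum(weights)

# Two hypothetical studies of the human capital-performance correlation
pooled = fixed_effect_mean(effects=[0.3, 0.5], variances=[0.01, 0.04])
```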

  4. An active learning approach for rapid characterization of endothelial cells in human tumors.

    PubMed

    Padmanabhan, Raghav K; Somasundar, Vinay H; Griffith, Sandra D; Zhu, Jianliang; Samoyedny, Drew; Tan, Kay See; Hu, Jiahao; Liao, Xuejun; Carin, Lawrence; Yoon, Sam S; Flaherty, Keith T; Dipaola, Robert S; Heitjan, Daniel F; Lal, Priti; Feldman, Michael D; Roysam, Badrinath; Lee, William M F

    2014-01-01

    Currently, no available pathological or molecular measures of tumor angiogenesis predict response to antiangiogenic therapies used in clinical practice. Recognizing that tumor endothelial cells (EC) and EC activation and survival signaling are the direct targets of these therapies, we sought to develop an automated platform for quantifying activity of critical signaling pathways and other biological events in EC of patient tumors by histopathology. Computer image analysis of EC in highly heterogeneous human tumors by a statistical classifier trained using examples selected by human experts performed poorly due to subjectivity and selection bias. We hypothesized that the analysis can be optimized by a more active process to aid experts in identifying informative training examples. To test this hypothesis, we incorporated a novel active learning (AL) algorithm into FARSIGHT image analysis software that aids the expert by seeking out informative examples for the operator to label. The resulting FARSIGHT-AL system identified EC with specificity and sensitivity consistently greater than 0.9 and outperformed traditional supervised classification algorithms. The system modeled individual operator preferences and generated reproducible results. Using the results of EC classification, we also quantified proliferation (Ki67) and activity in important signal transduction pathways (MAP kinase, STAT3) in immunostained human clear cell renal cell carcinoma and other tumors. FARSIGHT-AL enables characterization of EC in conventionally preserved human tumors in a more automated process suitable for testing and validating in clinical trials. The results of our study support a unique opportunity for quantifying angiogenesis in a manner that can now be tested for its ability to identify novel predictive and response biomarkers.
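
    One common way an active-learning algorithm "seeks out informative examples" is uncertainty sampling: query the unlabeled example about which the current classifier is least certain. This is a generic sketch of that step, not the FARSIGHT-AL implementation:

```python
def most_informative(probabilities):
    # Uncertainty sampling: pick the unlabeled example whose predicted
    # positive-class probability is closest to 0.5 (maximum classifier doubt)
    return min(range(len(probabilities)),
               key=lambda i: abs(probabilities[i] - 0.5))

# Predicted P(endothelial cell) for four unlabeled image objects
query_index = most_informative([0.95, 0.48, 0.10, 0.70])
```

    The selected example is shown to the expert for labeling, the classifier is retrained, and the loop repeats.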

  5. A Cognitive System Model for Human/Automation Dynamics in Airspace Management

    NASA Technical Reports Server (NTRS)

    Corker, Kevin M.; Pisanich, Gregory; Lebacqz, J. Victor (Technical Monitor)

    1997-01-01

    NASA has initiated a significant thrust of research and development focused on providing the flight crew and air traffic managers automation aids to increase capacity in en route and terminal area operations through the use of flexible, more fuel-efficient routing, while improving the level of safety in commercial carrier operations. In that system development, definition of cognitive requirements for integrated multi-operator dynamic aiding systems is fundamental. In order to support that cognitive function definition, we have extended the Man Machine Integrated Design and Analysis System (MIDAS) to include representation of multiple cognitive agents (both human operators and intelligent aiding systems) operating aircraft, airline operations centers and air traffic control centers in the evolving airspace. The demands of this application require representation of many intelligent agents sharing world-models, and coordinating action/intention with cooperative scheduling of goals and actions in a potentially unpredictable world of operations. The MIDAS operator models have undergone significant development in order to understand the requirements for operator aiding and the impact of that aiding in the complex nondeterminate system of national airspace operations. The operator model's structure has been modified to include attention functions, action priority, and situation assessment. The cognitive function model has been expanded to include working memory operations including retrieval from long-term store, interference, visual-motor and verbal articulatory loop functions, and time-based losses. The operator's activity structures have been developed to include prioritization and interruption of multiple parallel activities among multiple operators, to provide for anticipation (knowledge of the intention and action of remote operators), and to respond to failures of the system and other operators in the system in situation-specific paradigms. 
The model's internal representation has been modified so that multiple, autonomous sets of equipment will function in a scenario as the single equipment sets do now. In order to support the analysis requirements with multiple items of equipment, it is necessary for equipment to access the state of other equipment objects at initialization time (a radar object may need to access the position and speed of aircraft in its area, for example), and as a function of perception and sensor system interaction. The model has been improved to include multiple world-states as a function of equipment and operator interaction. The model has been used to predict the impact of warning and alert zones in aircraft operation and, more critically, the interaction of flight-deck-based warning mechanisms and air traffic controller action in response to ground-based conflict prediction and alerting systems. In this operation, two operating systems provide alerting to two autonomous but linked sets of operators, whose view of the system and whose dynamics in response are radically different. System stability and operator action were predicted using the MIDAS model.

  6. The Value of Biomedical Simulation Environments to Future Human Space Flight Missions

    NASA Technical Reports Server (NTRS)

    Mulugeta, Lealem; Myers, Jerry G.; Skytland, Nicholas G.; Platts, Steven H.

    2010-01-01

    With the ambitious goals to send manned missions to asteroids and on to Mars, substantial work will be required to ensure the well-being of the men and women who will undertake these difficult missions. Unlike current International Space Station or Shuttle missions, astronauts will be required to endure long-term exposure to higher levels of radiation, isolation, and reduced gravity. These new operating conditions will pose health risks that are currently not well understood and perhaps unanticipated. Therefore, it is essential to develop and apply advanced tools to predict, assess and mitigate potential hazards to astronaut health. NASA's Digital Astronaut Project (DAP) is working to develop and apply computational models of physiologic response to space flight operating conditions over various time periods and environmental circumstances. The collective application and integration of well-vetted models assessing physiology, biomechanics and anatomy is referred to as the Digital Astronaut. The Digital Astronaut simulation environment will serve as a practical working tool for use by NASA in operational activities such as the prediction of biomedical risks and functional capabilities of astronauts. In addition to space flight operating conditions, DAP's work has direct applicability to terrestrial biomedical research by providing virtual environments for hypothesis testing and experiment design, and to reduce animal/human testing. A practical application of the Digital Astronaut to assess pre- and post-flight responses to exercise is illustrated, and the difficulty of matching true physiological responses is discussed.

  7. Prediction and Factor Extraction of Drug Function by Analyzing Medical Records in Developing Countries.

    PubMed

    Hu, Min; Nohara, Yasunobu; Nakamura, Masafumi; Nakashima, Naoki

    2017-01-01

    The World Health Organization has declared Bangladesh one of 58 countries facing an acute Human Resources for Health (HRH) crisis. Artificial intelligence in healthcare has been shown to be successful for diagnostics. Using machine learning to predict pharmaceutical prescriptions may help address the HRH crisis. In this study, we investigate a predictive model by analyzing prescription data of 4,543 subjects in Bangladesh. We predict the function of prescribed drugs, comparing three machine-learning approaches. The approaches predict whether a subject will be prescribed medicine from each of the 21 most frequently prescribed drug functions. Receiver operating characteristic (ROC) analysis was selected to evaluate and assess the prediction models. The results show the drug function with the best prediction performance was oral hypoglycemic drugs, with an average AUC of 0.962. To understand how the variables affect prediction, we conducted factor analysis based on tree-based algorithms and natural language processing techniques.
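
    The AUC that summarizes an ROC curve equals the probability that a randomly chosen positive case outranks a randomly chosen negative one, which gives a compact rank-based implementation; the labels and scores below are illustrative:

```python
def roc_auc(labels, scores):
    # AUC = P(score of a random positive > score of a random negative),
    # counting ties as half a win
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

auc = roc_auc(labels=[1, 0, 1, 0], scores=[0.9, 0.2, 0.6, 0.7])
```

    An AUC of 0.962, as reported for oral hypoglycemic drugs, means the model ranks a true prescription above a non-prescription about 96% of the time.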

  8. MODELING H-ARS USING HEMATOLOGICAL PARAMETERS: A COMPARISON BETWEEN THE NON-HUMAN PRIMATE AND MINIPIG.

    PubMed

    Bolduc, David L; Bünger, Rolf; Moroni, Maria; Blakely, William F

    2016-12-01

    Multiple hematological biomarkers (i.e. complete blood counts and serum chemistry parameters) were used in a multivariate linear-regression fit to create predictive algorithms for estimating the severity of hematopoietic acute radiation syndrome (H-ARS) using two different species: the Göttingen minipig and the non-human primate (NHP, Macaca mulatta). Biomarker data were analyzed prior to irradiation and between 1-60 days (minipig) and 1-30 days (NHP) after irradiation exposures of 1.6-3.5 Gy (minipig) and 6.5 Gy (NHP) 60Co gamma-ray doses at 0.5-0.6 Gy/min and 0.4 Gy/min, respectively. Fitted radiation risk and injury categorization (RRIC) values and RRIC prediction percent accuracies were compared between the two models. Both models estimated H-ARS severity with over 80% overall predictive power and with receiver operating characteristic curve area values of 0.884 and 0.825. These results based on two animal radiation models support the concept of a hematopoietic-based algorithm for predicting the risk of H-ARS in humans. Published by Oxford University Press 2016. This work is written by (a) US Government employee(s) and is in the public domain in the US.
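
    A single-biomarker simplification of the kind of linear-regression severity fit described here, using closed-form ordinary least squares; the biomarker values and severity scores are invented for illustration (the published model is multivariate):

```python
def linfit(x, y):
    # Ordinary least squares for y = a + b*x in closed form
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    return my - b * mx, b

# Hypothetical lymphocyte counts (10^9 cells/L) vs. an H-ARS severity score:
# lower counts after irradiation should map to higher severity.
intercept, slope = linfit([2.0, 1.2, 0.5, 0.3], [0.0, 1.0, 2.0, 3.0])
```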

  9. Acquisition and production of skilled behavior in dynamic decision-making tasks: Modeling strategic behavior in human-automation interaction: Why an aid can (and should) go unused

    NASA Technical Reports Server (NTRS)

    Kirlik, Alex

    1991-01-01

    Advances in computer and control technology offer the opportunity for task-offload aiding in human-machine systems. A task-offload aid (e.g., an autopilot, an intelligent assistant) can be selectively engaged by the human operator to dynamically delegate tasks to an automated system. Successful design and performance prediction in such systems requires knowledge of the factors influencing the strategy the operator develops and uses for managing interaction with the task-offload aid. A model is presented that shows how such strategies can be predicted as a function of three task context properties (frequency and duration of secondary tasks and costs of delaying secondary tasks) and three aid design properties (aid engagement and disengagement times, aid performance relative to human performance). Sensitivity analysis indicates how each of these contextual and design factors affect the optimal aid usage strategy and attainable system performance. The model is applied to understanding human-automation interaction in laboratory experiments on human supervisory control behavior. The laboratory task allowed subjects freedom to determine strategies for using an autopilot in a dynamic, multi-task environment. Modeling results suggested that many subjects may indeed have been acting appropriately by not using the autopilot in the way its designers intended. Although autopilot function was technically sound, this aid was not designed with due regard to the overall task context in which it was placed. These results demonstrate the need for additional research on how people may strategically manage their own resources, as well as those provided by automation, in an effort to keep workload and performance at acceptable levels.
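
    The engage-or-not trade-off driven by the aid design properties can be sketched as an expected-completion-time comparison; the parameter names and the simple additive cost model are assumptions for illustration, not the paper's model:

```python
def should_engage(engage_time, disengage_time, aid_rate, human_rate, task_work):
    # Compare total completion times: doing the task by hand vs. delegating
    # to the aid, counting the overhead of engaging and disengaging it.
    t_human = task_work / human_rate
    t_aid = engage_time + task_work / aid_rate + disengage_time
    return t_aid < t_human

# Even a faster aid is not worth its overhead on a short task.
long_task = should_engage(engage_time=5, disengage_time=5,
                          aid_rate=2.0, human_rate=1.0, task_work=100)
short_task = should_engage(engage_time=5, disengage_time=5,
                           aid_rate=2.0, human_rate=1.0, task_work=10)
```

    Under this toy model, ignoring a technically sound aid on brief tasks is exactly the rational behavior the abstract describes.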

  10. The human operational sex ratio: effects of marriage, concealed ovulation, and menopause on mate competition.

    PubMed

    Marlowe, Frank W; Berbesque, J Colette

    2012-12-01

    Among mammals, male-male competition for sexual access to females frequently involves fighting. Larger body size gives males an advantage in fighting, which explains why males tend to be larger than females in many species, including anthropoid primates. Mitani et al. derived a formula to measure the operational sex ratio (OSR) to reflect the degree of male-male competition using the number of reproductively available males to females who are cycling and capable of conceiving. The OSR should predict the degree of sexual dimorphism in body mass-at least if male-male competition involves much fighting or threatening. Here, we use hunter-gatherer demographic data and the Mitani et al. formula to calculate the human OSR. We show that humans have a much lower degree of body mass sexual dimorphism than is predicted by our OSR. We suggest this is because human competition rarely involves fighting. In human hunter-gatherer societies, differences in the ages of marriage have an impact on competition in that the age of males at first marriage is younger when there is a lower percentage of married men with two or more wives, and older when there is a higher percentage of married men with two or more wives. We discuss the implications of this for females, along with the effects of two key life history traits that influence the OSR, concealed ovulation and menopause. While menopause decreases the number of reproductively available females to males and thus increases male-male competition, concealed ovulation decreases male-male competition. Finally, we discuss the importance of mostly monogamous mate bonds in human evolution. Copyright © 2012 Elsevier Ltd. All rights reserved.
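
    The OSR idea, reproductively available males per currently cycling, conceivable female, can be sketched as follows; collapsing gestation, lactational amenorrhea, and menopause into a single cycling_fraction is a simplification of the Mitani et al. formula:

```python
def operational_sex_ratio(n_males, n_females, cycling_fraction):
    # Only females currently cycling and able to conceive enter the denominator;
    # cycling_fraction discounts gestation, lactational amenorrhea, and menopause.
    return n_males / (n_females * cycling_fraction)

# With equal adult sex ratios, a low cycling fraction sharply inflates the OSR.
ratio = operational_sex_ratio(n_males=100, n_females=100, cycling_fraction=0.25)
```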

  11. Quantifying Astronaut Tasks: Robotic Technology and Future Space Suit Design

    NASA Technical Reports Server (NTRS)

    Newman, Dava

    2003-01-01

    The primary aim of this research effort was to advance the current understanding of astronauts' capabilities and limitations in space-suited EVA by developing models of the constitutive and compatibility relations of a space suit, based on experimental data gained from human test subjects as well as a 12 degree-of-freedom human-sized robot, and utilizing these fundamental relations to estimate a human factors performance metric for space suited EVA work. The three specific objectives are to: 1) Compile a detailed database of torques required to bend the joints of a space suit, using realistic, multi-joint human motions. 2) Develop a mathematical model of the constitutive relations between space suit joint torques and joint angular positions, based on experimental data, and compare other investigators' physics-based models to experimental data. 3) Estimate the work envelope of a space suited astronaut, using the constitutive and compatibility relations of the space suit. The body of work that makes up this report includes experimentation, empirical and physics-based modeling, and model applications. A detailed space suit joint torque-angle database was compiled with a novel experimental approach that used space-suited human test subjects to generate realistic, multi-joint motions and an instrumented robot to measure the torques required to accomplish these motions in a space suit. Based on the experimental data, a mathematical model is developed to predict joint torque from the joint angle history. Two physics-based models of pressurized fabric cylinder bending are compared to experimental data, yielding design insights. The mathematical model is applied to EVA operations in an inverse kinematic analysis coupled to the space suit model to calculate the volume in which space-suited astronauts can work with their hands, demonstrating that operational human factors metrics can be predicted from fundamental space suit information.
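
    One minimal form a torque-angle constitutive model can take is an elastic term plus a direction-dependent hysteresis offset, which captures why suit joints resist differently during flexion and extension; the functional form and constants here are assumptions for illustration, not the fitted model from these experiments:

```python
def suit_joint_torque(theta_deg, dtheta, k=0.8, c=2.0):
    # Elastic term (k * angle) plus a hysteresis offset (c) that flips sign
    # with the direction of motion; k and c are invented constants.
    direction = (dtheta > 0) - (dtheta < 0)
    return k * theta_deg + c * direction

flexing = suit_joint_torque(30.0, dtheta=+1.0)    # moving into the bend
extending = suit_joint_torque(30.0, dtheta=-1.0)  # moving back out
```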

  12. Models for 31-Mode PVDF Energy Harvester for Wearable Applications

    PubMed Central

    Zhao, Jingjing; You, Zheng

    2014-01-01

    Currently, wearable electronics are increasingly widely used, leading to a growing need for portable power supplies. As a clean and renewable power source, a piezoelectric energy harvester can convert mechanical energy into electric energy directly, and a harvester based on polyvinylidene difluoride (PVDF) operating in 31-mode is well suited to harvesting energy from human motion. This paper established a series of theoretical models to predict the performance of a 31-mode PVDF energy harvester. Among them, the energy storage model can predict the collected energy accurately during operation of the harvester. Based on theoretical study and experimental investigation, two approaches to improve the energy harvesting performance have been found. Furthermore, experiment results demonstrate the high accuracies of the models, which are better than 95%. PMID:25114981
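
    In 31-mode, in-plane stress generates charge through the d31 coefficient, and the film's electrode capacitance bounds how much of that charge's energy can be stored. The back-of-envelope estimate below uses ballpark PVDF constants, not the paper's fitted values:

```python
def pvdf_energy_31(d31, stress, area, thickness, permittivity):
    # 31-mode: in-plane stress generates charge on the electrodes through d31
    q = d31 * stress * area                 # generated charge (C)
    c = permittivity * area / thickness     # electrode capacitance (F)
    return q ** 2 / (2 * c)                 # storable electrical energy (J)

# Ballpark PVDF film: d31 ~ 23 pC/N, 1 MPa stress, 1 cm^2 electrodes, 50 um thick
energy = pvdf_energy_31(d31=23e-12, stress=1e6, area=1e-4,
                        thickness=50e-6, permittivity=1.06e-10)
```

    The tens-of-nanojoules-per-cycle scale this yields illustrates why such harvesters target low-power wearable loads.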

  13. Testing and evaluation for astronaut extravehicular activity (EVA) operability.

    PubMed

    Shields, N; King, L C

    1998-09-01

    Because it is the human component that defines space mission success, careful planning is required to ensure that hardware can be operated and maintained by crews on-orbit. Several methods exist to allow researchers and designers to better predict how hardware designs will behave under the harsh environment of low Earth orbit, and whether designs incorporate the necessary features for Extra Vehicular Activity (EVA) operability. Testing under conditions of simulated microgravity can occur during the design concept phase when verifying design operability, during mission training, or concurrently with on-orbit mission operations. The bulk of testing is focused on normal operations, but also includes evaluation of credible mission contingencies or "what would happen if" planning. The astronauts and cosmonauts who fly these space missions are well prepared and trained to survive and be productive in Earth's orbit. The engineers, designers, and training crews involved in space missions subject themselves to Earth based simulation techniques that also expose them to extreme environments. Aircraft falling ten thousand feet, alternating g-loads, underwater testing at 45 foot depth, enclosure in a vacuum chamber and subject to thermal extremes, each carries with it inherent risks to the humans preparing for space missions.

  14. Tutorial on Actual Space Environmental Hazards For Space Systems (Invited)

    NASA Astrophysics Data System (ADS)

    Mazur, J. E.; Fennell, J. F.; Guild, T. B.; O'Brien, T. P.

    2013-12-01

    It has become common in the space science community to conduct research on diverse physical phenomena because they are thought to contribute to space weather. However, satellites contend with only three primary environmental hazards: single event effects, vehicle charging, and total dose, and not every physical phenomenon that occurs in space contributes in substantial ways to create these hazards. One consequence of the mismatch between actual threats and all-encompassing research is the often-described gap between research and operations; another is the creation of forecasts that provide no actionable information for design engineers or spacecraft operators. An example of the latter is the physics of magnetic field emergence on the Sun; the phenomenon is relevant to the formation and launch of coronal mass ejections and is also causally related to the solar energetic particles that may get accelerated in the interplanetary shock. Unfortunately for the research community, the engineering community mitigates the space weather threat (single-event effects from heavy ions above ~50 MeV/nucleon) with a worst-case specification of the environment and not with a prediction. Worst-case definition requires data mining of past events, while predictions involve large-scale systems science from the Sun to the Earth that is compelling for scientists and their funding agencies but not actionable for design or for most operations. Differing priorities among different space-faring organizations only compound the confusion over what science research is relevant. Solar particle impacts on human crews arise mainly from the total ionizing dose from solar protons, so the priority for prediction in the human spaceflight community is much different than in the unmanned satellite community, while both communities refer to the fundamental phenomenon as space weather.
Our goal in this paper is the presentation of a brief tutorial on the primary space environmental phenomena that are relevant to satellite design and operations. The tutorial will help space science researchers to understand the differing priorities of communities that operate in space and to better distinguish the science that is actually needed for the design and operation of all-weather space systems.

  15. The Use of Behavior Models for Predicting Complex Operations

    NASA Technical Reports Server (NTRS)

    Gore, Brian F.

    2010-01-01

    Modeling and simulation (M&S) plays an important role when complex human-system notions are being proposed, developed and tested within the system design process. National Aeronautics and Space Administration (NASA) as an agency uses many different types of M&S approaches for predicting human-system interactions, especially when it is early in the development phase of a conceptual design. NASA Ames Research Center possesses a number of M&S capabilities ranging from airflow, flight path models, aircraft models, scheduling models, human performance models (HPMs), and bioinformatics models among a host of other kinds of M&S capabilities that are used for predicting whether the proposed designs will benefit the specific mission criteria. The Man-Machine Integration Design and Analysis System (MIDAS) is a NASA ARC HPM software tool that integrates many models of human behavior with environment models, equipment models, and procedural / task models. The challenge to model comprehensibility is heightened as the number of models that are integrated and the requisite fidelity of the procedural sets are increased. Model transparency is needed for some of the more complex HPMs to maintain comprehensibility of the integrated model performance. This will be exemplified in a recent MIDAS v5 application model and plans for future model refinements will be presented.

  16. Psychosocial Characteristics of Optimum Performance in Isolated and Confined Environments (ICE)

    NASA Technical Reports Server (NTRS)

    Palinkas, Lawrence A.; Keeton, Kathryn E.; Shea, Camille; Leveton, Lauren B.

    2010-01-01

    The Behavioral Health and Performance (BHP) Element addresses human health risks in the NASA Human Research Program (HRP), including the Risk of Adverse Behavioral Conditions and the Risk of Psychiatric Disorders. BHP supports and conducts research to help characterize and mitigate the Behavioral Medicine risk for exploration missions and, in some instances, current Flight Medical Operations. The BHP Element identified research gaps within the Behavioral Medicine Risk, including Gap BMed6: What psychosocial characteristics predict success in an isolated, confined environment (ICE)? To address this gap, we conducted an extensive and exhaustive literature review to identify the following: 1) psychosocial characteristics that predict success in ICE environments; 2) characteristics that are most malleable; and 3) specific countermeasures that could enhance malleable characteristics.

  17. Systematic genome assessment of B-vitamin biosynthesis suggests co-operation among gut microbes

    PubMed Central

    Magnúsdóttir, Stefanía; Ravcheev, Dmitry; de Crécy-Lagard, Valérie; Thiele, Ines

    2015-01-01

    The human gut microbiota supplies its host with essential nutrients, including B-vitamins. Using the PubSEED platform, we systematically assessed the genomes of 256 common human gut bacteria for the presence of biosynthesis pathways for eight B-vitamins: biotin, cobalamin, folate, niacin, pantothenate, pyridoxine, riboflavin, and thiamin. On the basis of the presence and absence of genome annotations, we predicted that each of the eight vitamins was produced by 40–65% of the 256 human gut microbes. The distribution of synthesis pathways was diverse; some genomes had all eight biosynthesis pathways, whereas others contained no de novo synthesis pathways. We compared our predictions to experimental data from 16 organisms and found 88% of our predictions to be in agreement with published data. In addition, we identified several pairs of organisms whose vitamin synthesis pathway pattern complemented those of other organisms. This analysis suggests that human gut bacteria actively exchange B-vitamins among each other, thereby enabling the survival of organisms that do not synthesize any of these essential cofactors. This result indicates the co-evolution of the gut microbes in the human gut environment. Our work presents the first comprehensive assessment of the B-vitamin synthesis capabilities of the human gut microbiota. We propose that in addition to diet, the gut microbiota is an important source of B-vitamins, and that changes in the gut microbiota composition can severely affect our dietary B-vitamin requirements. PMID:25941533
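
    The pairwise complementarity test described above amounts to asking whether the union of two organisms' biosynthesis-pathway sets covers all eight B-vitamins; the example genomes below are hypothetical, not entries from the 256-genome analysis:

```python
VITAMINS = {"biotin", "cobalamin", "folate", "niacin",
            "pantothenate", "pyridoxine", "riboflavin", "thiamin"}

def complementary(pathways_a, pathways_b):
    # Two organisms complement each other when together they cover all eight
    return (pathways_a | pathways_b) == VITAMINS

organism_a = VITAMINS - {"cobalamin", "biotin"}   # lacks two pathways
organism_b = {"cobalamin", "biotin", "folate"}    # supplies the missing ones
pair_ok = complementary(organism_a, organism_b)
solo_ok = complementary(organism_a, organism_a)
```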

  18. The Fate of Trace Contaminants in a Crewed Spacecraft Cabin Environment

    NASA Technical Reports Server (NTRS)

    Perry, Jay L.; Kayatin, Matthew J.

    2016-01-01

    Trace chemical contaminants produced via equipment offgassing, human metabolic sources, and vehicle operations are removed from the cabin atmosphere by active contamination control equipment and incidental removal by other air quality control equipment. The fate of representative trace contaminants commonly observed in spacecraft cabin atmospheres is explored. Removal mechanisms are described and predictive mass balance techniques are reviewed. Results from the predictive techniques are compared to cabin air quality analysis results. Considerations are discussed for an integrated trace contaminant control architecture suitable for long duration crewed space exploration missions.

  19. Transitioning a Chesapeake Bay Ecological Prediction System to Operations

    NASA Astrophysics Data System (ADS)

    Brown, C.; Green, D. S.; Eco Forecasters

    2011-12-01

    Ecological predictions of the impacts of physical, chemical, biological, and human-induced change on ecosystems and their components encompass a wide range of space and time scales and subject matter. They vary from predicting the occurrence and/or transport of certain species, such as harmful algal blooms, or biogeochemical constituents, such as dissolved oxygen concentrations, to large-scale ecosystem responses and higher trophic levels. The timescales of ecological prediction, including guidance and forecasts, range from nowcasts and short-term forecasts (days), to intraseasonal and interannual outlooks (weeks to months), to decadal and century projections in climate change scenarios. The spatial scales range from small coastal inlets to basin- and global-scale biogeochemical and ecological forecasts. The types of models that have been used include conceptual, empirical, mechanistic, and hybrid approaches. This presentation will identify the challenges and progress toward transitioning experimental model-based ecological prediction into operational guidance and forecasting. Recent efforts are targeting integration of regional ocean, hydrodynamic, and hydrological models and leveraging weather and water service infrastructure to enable the prototyping of an operational ecological forecast capability for the Chesapeake Bay and its tidal tributaries. A pathfinder demonstration predicts the probability of encountering sea nettles (Chrysaora quinquecirrha), a stinging jellyfish. These jellyfish can negatively impact safety and economic activities in the bay, and an impact-based forecast that predicts where and when this biotic nuisance occurs may help management efforts. Bay-wide nowcasts and three-day forecasts of sea nettle probability are generated daily by forcing an empirical habitat model (that predicts the probability of sea nettles) with real-time and 3-day forecasts of sea-surface temperature (SST) and salinity (SSS). In the first demonstration phase, the SST and SSS fields are generated by the Chesapeake Bay Operational Forecast System (CBOFS2), a 3-dimensional hydrodynamic model developed and operated by NOAA's National Ocean Service and run operationally at the National Weather Service National Centers for Environmental Prediction (NCEP). Importantly, this system is readily modified to predict the probability of other important target organisms, such as harmful algal blooms, biogeochemical constituents, such as dissolved oxygen concentration, and water-borne pathogens. Extensions of this initial effort include advancement of a regional coastal ocean modeling testbed and proving ground. Such formal collaboration is intended to accelerate transition to operations and increase confidence in and use of forecast guidance. The outcome will be improved decision making by emergency and resource managers, scientific researchers, and the general public. The presentation will describe partnership plans for this testbed as well as the potential implications for the services and research community.
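
    The empirical habitat model described above maps forecast SST and SSS fields to an encounter probability. A minimal sketch of one plausible form of such a model is a logistic function of temperature and salinity; the coefficients and function form below are invented for illustration and do not reproduce the operational Chesapeake Bay model.

```python
import math

# Illustrative empirical habitat model: probability of encountering
# sea nettles as a logistic function of sea-surface temperature (SST, deg C)
# and sea-surface salinity (SSS, psu). Coefficients are hypothetical.
def sea_nettle_probability(sst_c, sss_psu, b0=-30.0, b_t=1.0, b_s=0.4):
    z = b0 + b_t * sst_c + b_s * sss_psu
    return 1.0 / (1.0 + math.exp(-z))

# Warm, brackish mid-bay conditions give a high encounter probability;
# cooler water drives it toward zero.
p = sea_nettle_probability(sst_c=27.0, sss_psu=12.0)
```

    In an operational setting, a function like this would be evaluated at every grid cell of the hydrodynamic model's forecast fields to produce a bay-wide probability map.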

  20. Explaining neural signals in human visual cortex with an associative learning model.

    PubMed

    Jiang, Jiefeng; Schmajuk, Nestor; Egner, Tobias

    2012-08-01

    "Predictive coding" models posit a key role for associative learning in visual cognition, viewing perceptual inference as a process of matching (learned) top-down predictions (or expectations) against bottom-up sensory evidence. At the neural level, these models propose that each region along the visual processing hierarchy entails one set of processing units encoding predictions of bottom-up input, and another set computing mismatches (prediction error or surprise) between predictions and evidence. This contrasts with traditional views of visual neurons operating purely as bottom-up feature detectors. In support of the predictive coding hypothesis, a recent human neuroimaging study (Egner, Monti, & Summerfield, 2010) showed that neural population responses to expected and unexpected face and house stimuli in the "fusiform face area" (FFA) could be well-described as a summation of hypothetical face-expectation and -surprise signals, but not by feature detector responses. Here, we used computer simulations to test whether these imaging data could be formally explained within the broader framework of a mathematical neural network model of associative learning (Schmajuk, Gray, & Lam, 1996). Results show that FFA responses could be fit very closely by model variables coding for conditional predictions (and their violations) of stimuli that unconditionally activate the FFA. These data document that neural population signals in the ventral visual stream that deviate from classic feature detection responses can formally be explained by associative prediction and surprise signals.

  1. Supercomputers Of The Future

    NASA Technical Reports Server (NTRS)

    Peterson, Victor L.; Kim, John; Holst, Terry L.; Deiwert, George S.; Cooper, David M.; Watson, Andrew B.; Bailey, F. Ron

    1992-01-01

    Report evaluates supercomputer needs of five key disciplines: turbulence physics, aerodynamics, aerothermodynamics, chemistry, and mathematical modeling of human vision. Predicts these fields will require computer speeds greater than 10^18 floating-point operations per second (FLOPS) and memory capacities greater than 10^15 words. Also, new parallel computer architectures and new structured numerical methods will make necessary speed and capacity available.

  2. Two languages, two minds: flexible cognitive processing driven by language of operation.

    PubMed

    Athanasopoulos, Panos; Bylund, Emanuel; Montero-Melis, Guillermo; Damjanovic, Ljubica; Schartner, Alina; Kibbe, Alexandra; Riches, Nick; Thierry, Guillaume

    2015-04-01

    People make sense of objects and events around them by classifying them into identifiable categories. The extent to which language affects this process has been the focus of a long-standing debate: Do different languages cause their speakers to behave differently? Here, we show that fluent German-English bilinguals categorize motion events according to the grammatical constraints of the language in which they operate. First, as predicted from cross-linguistic differences in motion encoding, bilingual participants functioning in a German testing context prefer to match events on the basis of motion completion to a greater extent than do bilingual participants in an English context. Second, when bilingual participants experience verbal interference in English, their categorization behavior is congruent with that predicted for German; when bilingual participants experience verbal interference in German, their categorization becomes congruent with that predicted for English. These findings show that language effects on cognition are context-bound and transient, revealing unprecedented levels of malleability in human cognition. © The Author(s) 2015.

  3. Design and Operational Evaluation of the Traffic Management Advisor at the Ft. Worth Air Route Traffic Control Center

    NASA Technical Reports Server (NTRS)

    Swenson, Harry N.; Vincent, Danny; Tobias, Leonard (Technical Monitor)

    1997-01-01

    NASA and the FAA have designed and developed an automation tool known as the Traffic Management Advisor (TMA). The system was operationally evaluated at the Ft. Worth Air Route Traffic Control Center (ARTCC). The TMA is a time-based strategic planning tool that provides Traffic Management Coordinators and En Route Air Traffic Controllers the ability to efficiently optimize the capacity of a demand-impacted airport. The TMA consists of trajectory prediction, constraint-based runway scheduling, traffic flow visualization, and controller advisories. The TMA was used and operationally evaluated for forty-one rush traffic periods during a one-month period in the summer of 1996. The evaluations included all shifts of air traffic operations as well as periods of inclement weather. Performance data were collected for engineering and human factors analysis and compared with similar operations without the TMA. The engineering data indicate that operations with the TMA show a one- to two-minute per-aircraft delay reduction during rush periods. The human factors data indicate a perceived reduction in en route controller workload as well as an increase in job satisfaction. Upon completion of the evaluation, the TMA became part of normal operations at the Ft. Worth ARTCC.

  4. A Satellite Mortality Study to Support Space Systems Lifetime Prediction

    NASA Technical Reports Server (NTRS)

    Fox, George; Salazar, Ronald; Habib-Agahi, Hamid; Dubos, Gregory

    2013-01-01

    Estimating the operational lifetime of satellites and spacecraft is a complex process. Operational lifetime can differ from mission design lifetime for a variety of reasons. Unexpected mortality can occur due to human errors in design and fabrication, human errors in launch and operations, random anomalies of hardware and software, or satellite function degradation or technology change, leading to unrealized economic or mission return. This study focuses on data collection of public information using, for the first time, a large, publicly available dataset, and on preliminary analysis of satellite lifetimes, both operational lifetime and design lifetime. The objective of this study is to illustrate the relationship of design life to actual lifetime for some representative classes of satellites and spacecraft. First, a Weibull and exponential lifetime analysis comparison is performed on the ratio of mission operating lifetime to design life, accounting for terminated and ongoing missions. Next, a Kaplan-Meier survivor function, standard practice for clinical trials analysis, is estimated from operating lifetime. Bootstrap resampling is used to provide uncertainty estimates of selected survival probabilities. This study highlights the need for more detailed databases and engineering reliability models of satellite lifetime that include satellite systems and subsystems, operations procedures, and environmental characteristics to support the design of complex, multi-generation, long-lived space systems in Earth orbit.
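
    The Kaplan-Meier survivor function mentioned above handles exactly the mix of terminated and still-operating missions described: ongoing missions enter as right-censored observations. A minimal sketch of the estimator, using hypothetical lifetime data rather than the study's satellite dataset:

```python
# Kaplan-Meier survivor function estimate for right-censored data.
# `lifetimes` are mission durations (years); `observed` is True for
# terminated missions and False for missions still operating (censored).
def kaplan_meier(lifetimes, observed):
    event_times = sorted(set(t for t, d in zip(lifetimes, observed) if d))
    surv, s = [], 1.0
    for t in event_times:
        at_risk = sum(1 for x in lifetimes if x >= t)       # still in study at t
        deaths = sum(1 for x, d in zip(lifetimes, observed)  # failures at t
                     if d and x == t)
        s *= 1.0 - deaths / at_risk
        surv.append((t, s))
    return surv

# Hypothetical example: six missions, two still operating (censored).
lifetimes = [2.0, 3.5, 3.5, 5.0, 7.0, 8.0]
observed  = [True, True, False, True, False, True]
curve = kaplan_meier(lifetimes, observed)
```

    The bootstrap uncertainty estimates described in the abstract would resample `(lifetimes, observed)` pairs with replacement and re-run this estimator on each resample.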

  5. In silico prediction of splice-altering single nucleotide variants in the human genome.

    PubMed

    Jian, Xueqiu; Boerwinkle, Eric; Liu, Xiaoming

    2014-12-16

    In silico tools have been developed to predict variants that may have an impact on pre-mRNA splicing. The major limitation of the application of these tools to basic research and clinical practice is the difficulty in interpreting the output. Most tools only predict potential splice sites given a DNA sequence without measuring splicing signal changes caused by a variant. Another limitation is the lack of large-scale evaluation studies of these tools. We compared eight in silico tools on 2959 single nucleotide variants within splicing consensus regions (scSNVs) using receiver operating characteristic analysis. The Position Weight Matrix model and MaxEntScan outperformed other methods. Two ensemble learning methods, adaptive boosting and random forests, were used to construct models that take advantage of individual methods. Both models further improved prediction, with outputs of directly interpretable prediction scores. We applied our ensemble scores to scSNVs from the Catalogue of Somatic Mutations in Cancer database. Analysis showed that predicted splice-altering scSNVs are enriched in recurrent scSNVs and known cancer genes. We pre-computed our ensemble scores for all potential scSNVs across the human genome, providing a whole genome level resource for identifying splice-altering scSNVs discovered from large-scale sequencing studies.
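
    The tool comparison above is based on receiver operating characteristic (ROC) analysis, and the ensemble methods were evaluated by AUC. As a small self-contained sketch, AUC can be computed directly from prediction scores and labels via the rank-sum (Mann-Whitney) identity; the scores and labels below are invented illustrative values, not data from the study:

```python
# AUC (area under the ROC curve) via the Mann-Whitney identity:
# the probability that a randomly chosen positive example scores
# higher than a randomly chosen negative one (ties count as 0.5).
def roc_auc(scores, labels):
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical splice-altering prediction scores and true labels.
scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.1]
labels = [1,   1,   0,   1,   0,   0]
auc = roc_auc(scores, labels)
```

    An AUC of 0.5 corresponds to random guessing and 1.0 to perfect ranking, which is why it is a natural yardstick for comparing tools with differently scaled output scores.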

  6. Computational prediction of virus-human protein-protein interactions using embedding kernelized heterogeneous data.

    PubMed

    Nourani, Esmaeil; Khunjush, Farshad; Durmuş, Saliha

    2016-05-24

    Pathogenic microorganisms exploit host cellular mechanisms and evade host defense mechanisms through molecular pathogen-host interactions (PHIs). Therefore, comprehensive analysis of these PHI networks should be an initial step for developing effective therapeutics against infectious diseases. Computational prediction of PHI data is gaining increasing demand because of the scarcity of experimental data. Prediction of protein-protein interactions (PPIs) within PHI systems can be formulated as a classification problem, which requires knowledge of non-interacting protein pairs. This is a restricting requirement, since we lack datasets that report non-interacting protein pairs. In this study, we formulated the "computational prediction of PHI data" problem using kernel embedding of heterogeneous data. This eliminates the abovementioned requirement and enables us to predict new interactions without randomly labeling protein pairs as non-interacting. Domain-domain associations are used to filter the predicted results, leading to 175 novel PHIs between 170 human proteins and 105 viral proteins. To compare our results with the state-of-the-art studies that use a binary classification formulation, we modified our settings to consider the same formulation. Detailed evaluations are conducted, and our results provide more than 10 percent improvement in accuracy and AUC (area under the receiver operating characteristic curve) in comparison with state-of-the-art methods.

  7. An Evaluation of Operational Airspace Sectorization Integrated System (OASIS) Advisory Tool

    NASA Technical Reports Server (NTRS)

    Lee, Paul U.; Mogford, Richard H.; Bridges, Wayne; Buckley, Nathan; Evans, Mark; Gujral, Vimmy; Lee, Hwasoo; Peknik, Daniel; Preston, William

    2013-01-01

    In January 2013, a human-in-the-loop evaluation of the Operational Airspace Sectorization Integrated System (OASIS) was conducted in the Airspace Operations Laboratory of the Human Systems Integration Division (Code TH) in conjunction with the Aviation Systems Division (Code AF). The development of OASIS is a major activity of the Dynamic Airspace Configuration (DAC) research focus area within the Aeronautics Research Mission Directorate (ARMD) Airspace Systems Program. OASIS is an advisory tool to assist Federal Aviation Administration (FAA) En Route Area Supervisors in their planning of sector combine/decombine operations as well as the opening/closing of Data-side (D-side) control positions. These advisory solutions are tailored to the predicted traffic demand over the next few hours. During the experiment, eight retired FAA personnel served as participants for a part-task evaluation of OASIS functionality, covering the user interface as well as the underlying algorithm. Participants gave positive feedback on both the user interface and the algorithm solutions for airspace configuration, including an excellent average rating of 94 on the tool usability scale. They also suggested various enhancements to the OASIS tool, which will be incorporated into the next tool development cycle for the full-scale human-in-the-loop evaluation to be conducted later this year.

  8. Optimized Algorithms for Prediction Within Robotic Tele-Operative Interfaces

    NASA Technical Reports Server (NTRS)

    Martin, Rodney A.; Wheeler, Kevin R.; Allan, Mark B.; SunSpiral, Vytas

    2010-01-01

    Robonaut, the humanoid robot developed at the Dexterous Robotics Laboratory at NASA Johnson Space Center, serves as a testbed for human-robot collaboration research and development efforts. One of the recent efforts investigates how adjustable autonomy can provide for safe and more effective completion of manipulation-based tasks. A predictive algorithm developed in previous work was deployed as part of a software interface that can be used for long-distance tele-operation. In this work, Hidden Markov Models (HMMs) were trained on data recorded during tele-operation of basic tasks. In this paper we provide the details of this algorithm, how to improve upon the methods via optimization, and also present viable alternatives to the original algorithmic approach. We show that all of the algorithms presented can be optimized to meet the specifications of the metrics shown as being useful for measuring the performance of the predictive methods.
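
    The core computation when using trained HMMs for prediction, as in the abstract above, is scoring an observation sequence against each candidate model. A minimal sketch of the forward algorithm, which computes that likelihood; the model parameters and observation codes below are illustrative, not Robonaut tele-operation data:

```python
# Forward algorithm: likelihood of a discrete observation sequence
# under an HMM with initial distribution pi, transition matrix A,
# and emission matrix B. A sequence is attributed to whichever
# trained model assigns it the highest likelihood.
def hmm_forward(pi, A, B, obs):
    n = len(pi)
    # Initialization: joint probability of state s and first observation.
    alpha = [pi[s] * B[s][obs[0]] for s in range(n)]
    # Induction: fold each subsequent observation into the forward variables.
    for o in obs[1:]:
        alpha = [B[s][o] * sum(alpha[t] * A[t][s] for t in range(n))
                 for s in range(n)]
    return sum(alpha)  # total probability of the observation sequence

# Hypothetical two-state model and a short observation sequence.
pi = [0.6, 0.4]
A = [[0.7, 0.3], [0.4, 0.6]]
B = [[0.5, 0.5], [0.1, 0.9]]
likelihood = hmm_forward(pi, A, B, [0, 1, 1])
```

    For long sequences, a production implementation would work in log space or rescale `alpha` at each step to avoid numerical underflow.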

  9. Environmental Sensing of Expert Knowledge in a Computational Evolution System for Complex Problem Solving in Human Genetics

    NASA Astrophysics Data System (ADS)

    Greene, Casey S.; Hill, Douglas P.; Moore, Jason H.

    The relationship between interindividual variation in our genomes and variation in our susceptibility to common diseases is expected to be complex with multiple interacting genetic factors. A central goal of human genetics is to identify which DNA sequence variations predict disease risk in human populations. Our success in this endeavour will depend critically on the development and implementation of computational intelligence methods that are able to embrace, rather than ignore, the complexity of the genotype to phenotype relationship. To this end, we have developed a computational evolution system (CES) to discover genetic models of disease susceptibility involving complex relationships between DNA sequence variations. The CES approach is hierarchically organized and is capable of evolving operators of any arbitrary complexity. The ability to evolve operators distinguishes this approach from artificial evolution approaches using fixed operators such as mutation and recombination. Our previous studies have shown that a CES that can utilize expert knowledge about the problem in evolved operators significantly outperforms a CES unable to use this knowledge. This environmental sensing of external sources of biological or statistical knowledge is important when the search space is both rugged and large as in the genetic analysis of complex diseases. We show here that the CES is also capable of evolving operators which exploit one of several sources of expert knowledge to solve the problem. This is important for both the discovery of highly fit genetic models and because the particular source of expert knowledge used by evolved operators may provide additional information about the problem itself. This study brings us a step closer to a CES that can solve complex problems in human genetics in addition to discovering genetic models of disease.

  10. Development of Decision Support Formulas for the Prediction of Bladder Outlet Obstruction and Prostatic Surgery in Patients With Lower Urinary Tract Symptom/Benign Prostatic Hyperplasia: Part II, External Validation and Usability Testing of a Smartphone App.

    PubMed

    Choo, Min Soo; Jeong, Seong Jin; Cho, Sung Yong; Yoo, Changwon; Jeong, Chang Wook; Ku, Ja Hyeon; Oh, Seung-June

    2017-04-01

    We aimed to externally validate the prediction model we developed for having bladder outlet obstruction (BOO) and requiring prostatic surgery using 2 independent data sets from tertiary referral centers, and also aimed to validate a mobile app for using this model through usability testing. Formulas and nomograms predicting whether a subject has BOO and needs prostatic surgery were validated with an external validation cohort from Seoul National University Bundang Hospital and Seoul Metropolitan Government-Seoul National University Boramae Medical Center between January 2004 and April 2015. A smartphone-based app was developed, and 8 young urologists were enrolled for usability testing to identify any human factor issues with the app. A total of 642 patients were included in the external validation cohort. No significant differences were found in the baseline characteristics of major parameters between the original cohort (n=1,179) and the external validation cohort, except for the maximal flow rate. Predictions of requiring prostatic surgery in the validation cohort showed a sensitivity of 80.6%, a specificity of 73.2%, a positive predictive value of 49.7%, a negative predictive value of 92.0%, and an area under the receiver operating characteristic curve of 0.84. The calibration plot indicated that the predictions have good correspondence. The decision curve also showed a high net benefit. Similar evaluation results using the external validation cohort were seen in the predictions of having BOO. Overall results of the usability test demonstrated that the app was user-friendly with no major human factor issues. External validation of this newly developed prediction model demonstrated a moderate level of discrimination, adequate calibration, and high net benefit gains for predicting both having BOO and requiring prostatic surgery. In addition, the smartphone app implementing the prediction model was user-friendly with no major human factor issues.

  11. Automatic measurement of voice onset time using discriminative structured prediction.

    PubMed

    Sonderegger, Morgan; Keshet, Joseph

    2012-12-01

    A discriminative large-margin algorithm for automatic measurement of voice onset time (VOT) is described, considered as a case of predicting structured output from speech. Manually labeled data are used to train a function that takes as input a speech segment of an arbitrary length containing a voiceless stop, and outputs its VOT. The function is explicitly trained to minimize the difference between predicted and manually measured VOT; it operates on a set of acoustic feature functions designed based on spectral and temporal cues used by human VOT annotators. The algorithm is applied to initial voiceless stops from four corpora, representing different types of speech. Using several evaluation methods, the algorithm's performance is near human intertranscriber reliability, and compares favorably with previous work. Furthermore, the algorithm's performance is minimally affected by training and testing on different corpora, and remains essentially constant as the amount of training data is reduced to 50-250 manually labeled examples, demonstrating the method's practical applicability to new datasets.

  12. Predicting human decisions in socioeconomic interaction using real-time functional magnetic resonance imaging (rtfMRI)

    NASA Astrophysics Data System (ADS)

    Hollmann, Maurice; Mönch, Tobias; Müller, Charles; Bernarding, Johannes

    2009-02-01

    A major field in cognitive neuroscience investigates neuronal correlates of human decision-making processes [1, 2]. Is it possible to predict a decision before it is actually revealed by the volunteer? In the present study we use a standard paradigm from economic behavioral research that has demonstrated emotional influences on human decision making: the Ultimatum Game (UG). In the UG, two players have the opportunity to split a sum of money. One player is deemed the proposer and the other the responder. The proposer makes an offer as to how this money should be split between the two. The second player can either accept or reject this offer. If it is accepted, the money is split as proposed. If rejected, then neither player receives anything. Here a real-time fMRI system was used to derive the brain activation of the responder. Using a Relevance Vector Machine classifier, it was possible to predict whether the responder would accept or reject an offer. The classification result was presented to the operator 1-2 seconds before the volunteer pressed a button to convey his decision. The classification accuracy reached about 70%, averaged over six subjects.

  13. The Use Of Computational Human Performance Modeling As Task Analysis Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacques Hugo; David Gertman

    2012-07-01

    During a review of the Advanced Test Reactor safety basis at the Idaho National Laboratory, human factors engineers identified ergonomic and human reliability risks involving the inadvertent exposure of a fuel element to the air during manual fuel movement and inspection in the canal. There were clear indications that these risks increased the probability of human error and possible severe physical outcomes to the operator. In response to this concern, a detailed study was conducted to determine the probability of the inadvertent exposure of a fuel element. Due to practical and safety constraints, the task network analysis technique was employed to study the work procedures at the canal. Discrete-event simulation software was used to model the entire procedure as well as the salient physical attributes of the task environment, such as distances walked, the effect of dropped tools, the effect of hazardous body postures, and physical exertion due to strenuous tool handling. The model also allowed analysis of the effect of cognitive processes such as visual perception demands, auditory information, and verbal communication. The model made it possible to obtain reliable predictions of operator performance and workload estimates. It was also found that operator workload as well as the probability of human error in the fuel inspection and transfer task were influenced by the concurrent nature of certain phases of the task and the associated demand on cognitive and physical resources. More importantly, it was possible to determine with reasonable accuracy the stages as well as the physical locations in the fuel handling task where operators would be most at risk of losing their balance and falling into the canal. The model also provided sufficient information for a human reliability analysis, which indicated that the postulated fuel exposure accident was less than credible.

  14. Editorial Commentary: Role of Synovial Biomarkers in Patient Outcomes After Knee Arthroscopy.

    PubMed

    Brand, Jefferson C

    2016-03-01

    Humans are notably poor at predicting event outcomes. In "Correlation of Synovial Fluid Biomarkers With Cartilage Pathology and Associated Outcomes in Knee Arthroscopy," Cuellar, Cuellar, Kirsch, and Strauss show that some synovial fluid biomarkers (20 were sampled for the investigation) may predict operative findings at the time of arthroscopy and patient-reported outcome measures at follow-up. Further research will clarify the role of synovial biomarkers in knee pathology and, hopefully, narrow the choices to one or two pertinent markers that can be used to improve our ability to predict outcomes from arthroscopic knee surgery. Copyright © 2016 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.

  15. Automating CPM-GOMS

    NASA Technical Reports Server (NTRS)

    John, Bonnie; Vera, Alonso; Matessa, Michael; Freed, Michael; Remington, Roger

    2002-01-01

    CPM-GOMS is a modeling method that combines the task decomposition of a GOMS analysis with a model of human resource usage at the level of cognitive, perceptual, and motor operations. CPM-GOMS models have made accurate predictions about skilled user behavior in routine tasks, but developing such models is tedious and error-prone. We describe a process for automatically generating CPM-GOMS models from a hierarchical task decomposition expressed in a cognitive modeling tool called Apex. Resource scheduling in Apex automates the difficult task of interleaving the cognitive, perceptual, and motor resources underlying common task operators (e.g., mouse move-and-click). Apex's UI automatically generates PERT charts, which allow modelers to visualize a model's complex parallel behavior. Because interleaving and visualization are now automated, it is feasible to construct arbitrarily long sequences of behavior. To demonstrate the process, we present a model of automated teller interactions in Apex and discuss implications for user modeling. Of the techniques available to model human users, the Goals, Operators, Methods, and Selection (GOMS) method [6, 21] has been the most widely used, providing accurate, often zero-parameter, predictions of the routine performance of skilled users in a wide range of procedural tasks [6, 13, 15, 27, 28]. GOMS is meant to model routine behavior. The user is assumed to have methods that apply sequences of operators to achieve a goal. Selection rules are applied when there is more than one method to achieve a goal. Many routine tasks lend themselves well to such decomposition. Decomposition produces a representation of the task as a set of nested goal states that include an initial state and a final state. The iterative decomposition into goals and nested subgoals can terminate in primitives of any desired granularity, with the choice of level of detail dependent on the predictions required.
Although GOMS has proven useful in HCI, tools to support the construction of GOMS models have not yet come into general use.

  16. Design and skill assessment of an Operational Forecasting System for currents and sea level variability to the Santos Estuarine System - Brazil

    NASA Astrophysics Data System (ADS)

    Godoi Rezende Costa, C.; Castro, B. M.; Blumberg, A. F.; Leite, J. R. B., Sr.

    2017-12-01

    Santos City is subject to an average of 12 storm tide events per year. Such events bring coastal flooding able to threaten human life and damage coastal infrastructure. Severe events have forced the interruption of ferry boat services and ship traffic through Santos Harbor, causing great impacts to the activities of Santos Port, the largest in South America. Several studies have focused on the hydrodynamics of storm tide events, but only a few of those studies have pursued an operational initiative to predict short-term (< 3 days) sea level variability. The goals of this study are (i) to describe the design of an operational forecasting system built to predict sea surface elevation and currents in the Santos Estuarine System and (ii) to evaluate model performance in simulating observed sea surface elevation. The Santos Operational Forecasting System (SOFS) hydrodynamic module is based on the Stevens Institute Estuarine and Coastal Ocean Model (sECOM). The fully automated SOFS is designed to provide up to a 71 h forecast of sea surface elevations and currents every day. The system automatically collects results from global models to run the SOFS nested into another sECOM-based model for the South Brazil Bight (SBB). Global forecasting results used to force both models come from Mercator Ocean, released by the Copernicus Marine Service, and from the Brazilian developments on the Regional Atmospheric Modeling System (BRAMS) established by the Center for Weather Forecasts and Climate Studies (with Portuguese acronym CPTEC). The complete set of routine tasks takes about 8 hours of run time to finish. SOFS was able to hindcast a severe storm tide event that took place in Santos on August 21-22, 2016. Comparisons with observed sea level provided skills of 0.92 and maximum root mean square errors of 25 cm. The good agreement with observed data shows the potential of the designed system to predict storm tides and to support the protection of both human life and assets.

  17. Validating and Verifying Biomathematical Models of Human Fatigue

    NASA Technical Reports Server (NTRS)

    Martinez, Siera Brooke; Quintero, Luis Ortiz; Flynn-Evans, Erin

    2015-01-01

    Airline pilots experience acute and chronic sleep deprivation, sleep inertia, and circadian desynchrony due to the need to schedule flight operations around the clock. This sleep loss and circadian desynchrony give rise to cognitive impairments, reduced vigilance, and inconsistent performance. Several biomathematical models, based principally on patterns observed in circadian rhythms and homeostatic drive, have been developed to predict a pilot's levels of fatigue or alertness. These models allow the Federal Aviation Administration (FAA) and commercial airlines to make decisions about pilot capabilities and flight schedules. Although these models have been validated in a laboratory setting, they have not been thoroughly tested in operational environments, where uncontrolled factors, such as environmental sleep disrupters, caffeine use, and napping, may impact actual pilot alertness and performance. We will compare the predictions of three prominent biomathematical fatigue models (the McCauley Model, the Harvard Model, and the privately sold SAFTE-FAST Model) to actual measures of alertness and performance. We collected sleep logs, movement and light recordings, psychomotor vigilance task (PVT) performance, and urinary melatonin (a marker of circadian phase) from 44 pilots in a short-haul commercial airline over one month. We will statistically compare the model predictions to lapses on the PVT and to circadian phase. We will calculate the sensitivity and specificity of each model's predictions under different scheduling conditions. Our findings will aid operational decision-makers in determining the reliability of each model under real-world scheduling situations.
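The sensitivity/specificity calculation the study proposes reduces to counting a confusion matrix over binary outcomes. A minimal sketch with invented duty-period data (the threshold that turns a model's fatigue score into a True/False flag is an assumption, not something the abstract specifies):

```python
def sensitivity_specificity(predicted_fatigued, actually_lapsed):
    """Confusion-matrix rates for a binary fatigue prediction.

    predicted_fatigued: model flags (True = model predicts impairment)
    actually_lapsed:    ground truth (True = PVT lapse observed)
    """
    tp = sum(p and a for p, a in zip(predicted_fatigued, actually_lapsed))
    tn = sum((not p) and (not a) for p, a in zip(predicted_fatigued, actually_lapsed))
    fp = sum(p and (not a) for p, a in zip(predicted_fatigued, actually_lapsed))
    fn = sum((not p) and a for p, a in zip(predicted_fatigued, actually_lapsed))
    return tp / (tp + fn), tn / (tn + fp)

# Toy example: 8 duty periods (hypothetical data)
pred  = [True, True, False, False, True, False, True, False]
truth = [True, False, False, False, True, True, True, False]
sens, spec = sensitivity_specificity(pred, truth)
```

A model with high sensitivity rarely misses a fatigued pilot; high specificity means it rarely raises a false alarm. The trade-off between the two is what differs across scheduling conditions.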

  18. Publications in acoustic and noise control from NASA Langley Research Center during 1940-1979.

    NASA Technical Reports Server (NTRS)

    Fryer, B. A. (Compiler)

    1980-01-01

    Reference lists of approximately 900 published Langley Research Center reports in various areas of acoustics and noise control for the period 1940-1979 are presented. Specific topic areas covered include: duct acoustics; propagation and operations; rotating blade noise; jet noise; sonic boom; flow surface interaction noise; structural response/interior noise; human response; and noise prediction.

  19. Cognitive And Neural Sciences Division 1992 Programs

    DTIC Science & Technology

    1992-08-01

    Thalamic short-term plasticity in the auditory system: associative retuning of receptive fields in the ventral medial geniculate body. Behavioral... prediction and enhancement of human performance in training and operational environments. A second goal is to understand the neurobiological constraints and... such complex, structured bodies of knowledge and skill are acquired. Fourth, to provide a precise theory of instruction, founded on cognitive theory

  20. General Formalism of Decision Making Based on Theory of Open Quantum Systems

    NASA Astrophysics Data System (ADS)

    Asano, M.; Ohya, M.; Basieva, I.; Khrennikov, A.

    2013-01-01

    We present a general formalism of decision making based on the theory of open quantum systems. A person (decision maker), say Alice, is considered as a quantum-like system, i.e., a system whose information processing follows the laws of quantum information theory. To make a decision, Alice interacts with a huge mental bath. Depending on the context of decision making, this bath can include her social environment, mass media (TV, newspapers, the Internet), and memory. The dynamics of an ensemble of such Alices is described by the Gorini-Kossakowski-Sudarshan-Lindblad (GKSL) equation. We speculate that in the process of evolution, biosystems (especially human beings) developed such "mental Hamiltonians" and GKSL operators that any solution of the corresponding GKSL equation stabilizes to a diagonal density operator (in the basis of decision making). This limiting density operator describes a population in which all superpositions of possible decisions have already been resolved. In principle, this approach can be used for the prediction of the distribution of possible decisions in human populations.
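The stabilization to a diagonal density operator can be illustrated numerically with the simplest GKSL generator. This toy sketch uses a pure-dephasing Lindblad operator L = sqrt(g)·σz with the Hamiltonian part omitted; that choice, and all the numbers, are assumptions for illustration, not the paper's model.

```python
import numpy as np

# Pure-dephasing GKSL generator with one Lindblad operator L = sqrt(g)*sigma_z:
#   drho/dt = g * (sigma_z @ rho @ sigma_z - rho)
sz = np.diag([1.0, -1.0])
g, dt, steps = 0.5, 0.01, 2000

# Start in an equal superposition of the two possible decisions
psi = np.array([1.0, 1.0]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

# Simple Euler integration of the master equation
for _ in range(steps):
    rho = rho + dt * g * (sz @ rho @ sz - rho)
```

The off-diagonal "interference" terms decay to zero while the diagonal entries (the decision probabilities) are exactly preserved, leaving the classical mixture the abstract describes.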

  1. Comparison of Enzootic Risk Measures for Predicting West Nile Disease, Los Angeles, California, USA, 2004–2010

    PubMed Central

    Kwan, Jennifer L.; Park, Bborie K.; Carpenter, Tim E.; Ngo, Van; Civen, Rachel

    2012-01-01

    In Los Angeles, California, USA, 2 epidemics of West Nile virus (WNV) disease have occurred since WNV was recognized in 2003. To assess which measure of risk was most predictive of human cases, we compared 3 measures: the California Mosquito-Borne Virus Surveillance and Response Plan Assessment, the vector index, and the Dynamic Continuous-Area Space-Time system. A case–crossover study was performed by using symptom onset dates from 384 persons with WNV infection to determine their relative environmental exposure to high-risk conditions as measured by each method. Receiver-operating characteristic plots determined thresholds for each model, and the area under the curve was used to compare methods. We found that the best risk assessment model for human WNV cases included surveillance data from avian, mosquito, and climate sources. PMID:22840314
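The area-under-the-curve comparison used to rank the three risk measures can be sketched via the Mann-Whitney rank formulation of AUC. The risk scores below are invented; only the computation is illustrated.

```python
def roc_auc(scores_pos, scores_neg):
    """AUC as the Mann-Whitney probability that a randomly chosen
    case week scores higher than a randomly chosen non-case week
    (ties count half)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical risk scores in weeks with and without human WNV cases
case_weeks    = [4.1, 3.8, 5.0, 2.9]
no_case_weeks = [1.2, 2.5, 3.0, 0.8, 2.9]
auc = roc_auc(case_weeks, no_case_weeks)
```

An AUC of 0.5 means the measure is no better than chance at separating case weeks from non-case weeks; comparing AUCs across the three surveillance measures is how the best model was identified.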

  2. How robust are distributed systems

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.

    1989-01-01

    A distributed system is made up of large numbers of components operating asynchronously from one another and hence with incomplete and inaccurate views of one another's state. Load fluctuations are common as new tasks arrive and active tasks terminate. Jointly, these aspects make it nearly impossible to arrive at detailed predictions of a system's behavior. For distributed systems to be used successfully in situations demanding the sort of predictable realtime responsiveness that humans cannot provide, it is important that they be robust. The technology of today can too easily be affected by worm programs or by seemingly trivial mechanisms that, for example, can trigger stock market disasters. Inventors of a technology have an obligation to overcome flaws that can exact a human cost. A set of principles for guiding solutions to distributed computing problems is presented.

  3. Modeling reality

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1990-01-01

    Although powerful computers have allowed complex physical and manmade hardware systems to be modeled successfully, we have encountered persistent problems with the reliability of computer models for systems involving human learning, human action, and human organizations. This is not a misfortune; unlike physical and manmade systems, human systems do not operate under a fixed set of laws. The rules governing the actions allowable in the system can be changed without warning at any moment, and can evolve over time. That the governing laws are inherently unpredictable raises serious questions about the reliability of models when applied to human situations. In these domains, computers are better used not for prediction and planning, but for aiding humans. Examples are systems that help humans speculate about possible futures, systems that offer advice about possible actions in a domain, systems that gather information from networks, and systems that track and support work flows in organizations.

  4. Airport noise impact reduction through operations

    NASA Technical Reports Server (NTRS)

    Deloach, R.

    1981-01-01

    The Airport-Noise Levels and Annoyance Model (ALAMO) developed at NASA Langley Research Center comprises a system of computer programs capable of quantifying airport community noise impact in terms of noise level, population distribution, and human subjective response to noise. ALAMO can be used to compare the noise impact of an airport's current operating scenario with the noise impact that would result from some proposed change in airport operations. The relative effectiveness of a number of noise-impact reduction alternatives is assessed for a major midwestern airport. Significant reductions in noise impact are predicted for certain noise abatement strategies, while others are shown to result in relatively little noise relief.

  5. Intelligent systems approach for automated identification of individual control behavior of a human operator

    NASA Astrophysics Data System (ADS)

    Zaychik, Kirill B.

    Acceptable results have been obtained using conventional techniques to model the generic human operator's control behavior. However, little research has been done in an attempt to identify an individual based on his/her control behavior. The main hypothesis investigated in this dissertation is that different operators exhibit different control behavior when performing a given control task. Furthermore, inter-person differences are manifested in the amplitude and frequency content of the non-linear component of the control behavior. Two enhancements to the existing models of the human operator, which allow personalization of the modeled control behavior, are presented in this dissertation. One of the proposed enhancements accounts for the "testing" control signals, which are introduced by an operator for more accurate control of the system and/or to adjust his/her control strategy. Such enhancement uses the Artificial Neural Network (ANN), which can be fine-tuned to model the "testing" control behavior of a given individual. The other model enhancement took the form of an equiripple filter (EF), which conditions the power spectrum of the control signal before it is passed through the plant dynamics block. The filter design technique uses Parks-McClellan algorithm, which allows parameterization of the desired levels of power at certain frequencies. A novel automated parameter identification technique (APID) was developed to facilitate the identification process of the parameters of the selected models of the human operator. APID utilizes a Genetic Algorithm (GA) based optimization engine called the Bit-climbing Algorithm (BCA). Proposed model enhancements were validated using the experimental data obtained at three different sources: the Manual Control Laboratory software experiments, Unmanned Aerial Vehicle simulation, and NASA Langley Research Center Visual Motion Simulator studies. 
Validation analysis involves comparison of the actual and simulated control activity signals. The validation criteria used in this dissertation are based on comparing the Power Spectral Densities of the control signals against those of the Precision model of the human operator. This dissertation also addresses the issue of applying the proposed human operator model augmentation to evaluate the effectiveness of motion feedback when simulating actual pilot control behavior in a flight simulator. The proposed modeling methodology allows for quantitative assessment and prediction of the need for platform motion while performing aircraft/pilot simulation studies.
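The Bit-climbing Algorithm (BCA) the dissertation uses for parameter identification can be sketched as a one-bit-at-a-time hill climb over an encoded parameter vector. The objective function below is a hypothetical stand-in (matching a target bit pattern), not the actual model-fit cost used in the APID technique.

```python
import random

def bit_climb(fitness, n_bits, iters=500, seed=0):
    """Bit-climbing search: flip one randomly chosen bit per iteration
    and keep the flip only if fitness does not get worse."""
    rng = random.Random(seed)
    best = [rng.randint(0, 1) for _ in range(n_bits)]
    best_fit = fitness(best)
    for _ in range(iters):
        i = rng.randrange(n_bits)
        best[i] ^= 1                      # flip one bit
        f = fitness(best)
        if f >= best_fit:
            best_fit = f                  # keep a non-worsening flip
        else:
            best[i] ^= 1                  # revert the flip
    return best, best_fit

# Hypothetical objective: match a target encoded parameter pattern
target = [1, 0, 1, 1, 0, 0, 1, 0]
match_target = lambda bits: sum(b == t for b, t in zip(bits, target))
solution, score = bit_climb(match_target, len(target))
```

In APID the fitness would instead measure how closely the simulated control signal's power spectrum matches the recorded operator's; the bit string encodes the model parameters being identified.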

  6. Predictive Compensator Optimization for Head Tracking Lag in Virtual Environments

    NASA Technical Reports Server (NTRS)

    Adelstein, Barnard D.; Jung, Jae Y.; Ellis, Stephen R.

    2001-01-01

    We examined the perceptual impact of plant noise parameterization for Kalman Filter predictive compensation of time delays intrinsic to head tracked virtual environments (VEs). Subjects were tested in their ability to discriminate between the VE system's minimum latency and conditions in which artificially added latency was then predictively compensated back to the system minimum. Two head tracking predictors were parameterized off-line according to cost functions that minimized prediction errors in (1) rotation, and (2) rotation projected into translational displacement with emphasis on higher frequency human operator noise. These predictors were compared with a parameterization obtained from the VE literature for cost function (1). Results from 12 subjects showed that both parameterization type and amount of compensated latency affected discrimination. Analysis of the head motion used in the parameterizations and the subsequent discriminability results suggest that higher frequency predictor artifacts are contributory cues for discriminating the presence of predictive compensation.
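The predictive compensation described above can be sketched with a constant-velocity Kalman filter that extrapolates the tracked state one latency interval ahead. All parameters here (process/measurement noise, the 50 ms lead, the ramp motion) are illustrative assumptions, not the study's off-line optimized values.

```python
import numpy as np

def kalman_predict_ahead(measurements, dt, lead, q=1e-3, r=1e-2):
    """Constant-velocity Kalman filter over a 1-D head angle; after each
    update, extrapolate the state `lead` seconds ahead to compensate lag."""
    F = np.array([[1.0, dt], [0.0, 1.0]])       # state transition
    H = np.array([[1.0, 0.0]])                  # we observe angle only
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],   # process ("plant") noise
                      [dt**2 / 2, dt]])
    R = np.array([[r]])
    x = np.array([[measurements[0]], [0.0]])
    P = np.eye(2)
    preds = []
    for z in measurements:
        # predict one sample forward
        x = F @ x
        P = F @ P @ F.T + Q
        # update with the new tracker measurement
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
        x = x + K @ (np.array([[z]]) - H @ x)
        P = (np.eye(2) - K @ H) @ P
        # extrapolate ahead by the compensated latency
        preds.append(float(x[0, 0] + lead * x[1, 0]))
    return preds

# Toy ramp motion: head turning at 10 deg/s, 60 Hz tracker, 50 ms latency
dt = 1 / 60
angles = [10.0 * k * dt for k in range(120)]
preds = kalman_predict_ahead(angles, dt, lead=0.05)
```

The plant-noise parameter q is exactly the quantity whose tuning the study examined: larger q makes the predictor more responsive but amplifies the higher-frequency artifacts that subjects could detect.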

  7. Using Apex To Construct CPM-GOMS Models

    NASA Technical Reports Server (NTRS)

    John, Bonnie; Vera, Alonso; Matessa, Michael; Freed, Michael; Remington, Roger

    2006-01-01

    A process for automatically generating computational models of human/computer interactions, as well as graphical and textual representations of those models, has been built on the conceptual foundation of a method known in the art as CPM-GOMS. This method is so named because it combines (1) the task decomposition of an underlying method known in the art as the goals, operators, methods, and selection (GOMS) method with (2) a model of human resource usage at the level of cognitive, perceptual, and motor (CPM) operations. CPM-GOMS models have made accurate predictions about the behaviors of skilled computer users in routine tasks, but heretofore, such models have been generated in a tedious, error-prone manual process. In the present process, CPM-GOMS models are generated automatically from a hierarchical task decomposition expressed by use of a computer program, known as Apex, designed previously to model human behavior in complex, dynamic tasks. An inherent capability of Apex for scheduling of resources automates the difficult task of interleaving the cognitive, perceptual, and motor resources that underlie common task operators (e.g., move and click mouse). The user interface of Apex automatically generates Program Evaluation Review Technique (PERT) charts, which enable modelers to visualize the complex parallel behavior represented by a model. Because interleaving and the generation of displays to aid visualization are automated, it is now feasible to construct arbitrarily long sequences of behaviors. The process was tested by using Apex to create a CPM-GOMS model of a relatively simple human/computer-interaction task and comparing the time predictions of the model with measurements of the times taken by human users in performing the various steps of the task. The task was to withdraw $80 in cash from an automated teller machine (ATM). 
For the test, a Visual Basic mockup of an ATM was created, with a provision for input from (and measurement of the performance of) the user via a mouse. The times predicted by the automatically generated model turned out to approximate the measured times fairly well (see figure). While these results are promising, there is a need for further development of the process. It will also be necessary to test other, more complex models: the actions required of the user in the ATM task are too sequential to involve substantial parallelism and interleaving and, hence, do not serve as an adequate test of the unique strength of CPM-GOMS models to accommodate parallelism and interleaving.
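The scheduling that Apex automates (and that PERT charts visualize) is essentially a critical-path computation over a dependency graph of CPM-level operators: the predicted task time is the finish time of the slowest dependency chain. A minimal sketch with hypothetical operator names and durations:

```python
def critical_path(durations, deps):
    """Longest path through a DAG of operators: the model's predicted
    task time is the finish time of the slowest dependency chain."""
    finish = {}
    def f(op):
        if op not in finish:
            finish[op] = durations[op] + max(
                (f(d) for d in deps.get(op, [])), default=0.0)
        return finish[op]
    return max(f(op) for op in durations)

# Hypothetical CPM-level operators for one ATM step (durations in ms)
durations = {"perceive-prompt": 100, "decide-amount": 200,
             "move-mouse": 300, "click": 100}
deps = {"decide-amount": ["perceive-prompt"],
        "move-mouse": ["perceive-prompt"],   # motor prep overlaps decision
        "click": ["decide-amount", "move-mouse"]}
predicted_ms = critical_path(durations, deps)
```

Because "decide-amount" and "move-mouse" share a predecessor but run on different resources, they overlap; the prediction is driven by the longer of the two chains, which is exactly the interleaving effect the abstract says is hard to produce by hand.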

  8. Analysis and prediction of meal motion by EMG signals

    NASA Astrophysics Data System (ADS)

    Horihata, S.; Iwahara, H.; Yano, K.

    2007-12-01

    The lack of carers for senior citizens and physically handicapped persons in our country has become a huge issue and has created a great need for carer robots. The usual carer robots (many of which have switches or joysticks for their interfaces), however, are neither easy to use nor very popular. Therefore, haptic devices have been adopted as a human-machine interface that enables intuitive operation. At this point, a method is being tested that seeks to prevent a wrong operation from arising from the user's signals. This method matches motions with EMG signals.

  9. Human systems integration in remotely piloted aircraft operations.

    PubMed

    Tvaryanas, Anthony P

    2006-12-01

    The role of humans in remotely piloted aircraft (RPAs) is qualitatively different from that in manned aviation, lessening the applicability of aerospace medicine human factors knowledge derived from traditional cockpits. Aerospace medicine practitioners should expect to be challenged in addressing RPA crewmember performance. Human systems integration (HSI) provides a model for explaining human performance as a function of the domains of: human factors engineering; personnel; training; manpower; environment, safety, and occupational health (ESOH); habitability; and survivability. RPA crewmember performance is being particularly impacted by issues involving the domains of human factors engineering, personnel, training, manpower, ESOH, and habitability. Specific HSI challenges include: 1) changes in large RPA operator selection and training; 2) human factors engineering deficiencies in current RPA ground control station design and their impact on human error, including considerations pertaining to multi-aircraft control; and 3) the combined impact of manpower shortfalls, shiftwork-related fatigue, and degraded crewmember effectiveness. Limited experience and available research make it difficult to qualitatively or quantitatively predict the collective impact of these issues on RPA crewmember performance. Attending to HSI will be critical for the success of current and future RPA crewmembers. Aerospace medicine practitioners working with RPA crewmembers should gain first-hand knowledge of their task environment, while the larger aerospace medicine community needs to address the limited information available on RPA-related aerospace medicine human factors. In the meantime, aeromedical decisions will need to be made based on what is known about other aerospace occupations, realizing this knowledge may have only partial applicability.

  10. Extending Validated Human Performance Models to Explore NextGen Concepts

    NASA Technical Reports Server (NTRS)

    Gore, Brian Francis; Hooey, Becky Lee; Mahlstedt, Eric; Foyle, David C.

    2012-01-01

    To meet expected increases in air traffic demand, NASA and the FAA are researching and developing Next Generation Air Transportation System (NextGen) concepts. NextGen will require substantial increases in the data available to pilots on the flight deck (e.g., weather, wake, and traffic trajectory predictions) to support more precise and closely coordinated operations (e.g., self-separation, RNAV/RNP, and closely spaced parallel operations, CSPOs). These NextGen procedures and operations, along with the pilot's roles and responsibilities, must be designed with consideration of the pilot's capabilities and limitations. Failure to do so will leave the pilots, and thus the entire aviation system, vulnerable to error. A validated Man-machine Integration Design and Analysis System (MIDAS) v5 model was extended to evaluate anticipated changes to flight deck and controller roles and responsibilities in NextGen approach and land operations. Compared to conditions in which the controllers are responsible for separation during the descent-to-land phase of flight, the output from these model predictions suggests that flight deck response time to detect a lead-aircraft blunder will decrease, pilot scans to the navigation display will increase, and workload will increase.

  11. Predicting the Operational Acceptability of Route Advisories

    NASA Technical Reports Server (NTRS)

    Evans, Antony; Lee, Paul

    2017-01-01

    NASA envisions a future Air Traffic Management system that allows safe, efficient growth in global operations, enabled by increasing levels of automation and autonomy. In a safety-critical system, the introduction of increasing automation and autonomy has to be done in stages, making human-system integrated concepts critical for the foreseeable future. One example where this is relevant is tools that generate more efficient flight routings or reroute advisories. If these routes are not operationally acceptable, they will be rejected by human operators, and the associated benefits will not be realized. Operational acceptance is therefore required to enable the increased efficiency and reduced workload benefits associated with these tools. In this paper, the authors develop a predictor of operational acceptability for reroute advisories. Such a capability has applications in tools that identify more efficient routings around weather and congestion and that better meet airline preferences. The capability is based on applying data mining techniques to flight plan amendment data reported by the Federal Aviation Administration and to data on requested reroutes collected from a field trial of the NASA-developed Dynamic Weather Routes tool, which advised efficient route changes to American Airlines dispatchers in 2014. 10-fold cross validation was used for feature, model, and parameter selection, while nested cross validation was used to validate the model. The model performed well in predicting controller acceptance or rejection of a route change, as indicated by the chosen performance metrics. Features identified as relevant to controller acceptance included the historical usage of the advised route, the location of the maneuver start point relative to the boundaries of the airspace sector containing the maneuver start (the maneuver start sector), the reroute deviation from the original flight plan, and the demand level in the maneuver start sector. 
A random forest with forty trees was the best-performing of the five models evaluated in this paper.
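The k-fold splitting machinery underlying the selection and validation scheme described above can be sketched in pure Python. This shows only the fold construction (shuffle, partition into disjoint held-out folds); the model fitting inside each fold, and the outer loop of nested cross validation, are omitted.

```python
import random

def k_fold_indices(n_samples, k=10, seed=42):
    """Shuffle sample indices and partition them into k disjoint test
    folds; each sample is held out exactly once across the k splits."""
    idx = list(range(n_samples))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    for i, test in enumerate(folds):
        train = [j for f, fold in enumerate(folds) if f != i for j in fold]
        yield train, test

# Each of the 10 iterations trains on ~90% of the flights, tests on the rest
splits = list(k_fold_indices(100, k=10))
```

In the nested variant, the inner k-fold loop (run on each outer training set) picks features and hyperparameters, and the outer held-out fold gives an unbiased estimate of the chosen model's performance.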

  12. Procedures in complex systems: the airline cockpit.

    PubMed

    Degani, A; Wiener, E L

    1997-05-01

    In complex human-machine systems, successful operations depend on an elaborate set of procedures which are specified by the operational management of the organization. These procedures indicate to the human operator (in this case the pilot) the manner in which operational management intends to have various tasks done. The intent is to provide guidance to the pilots and to ensure a safe, logical, efficient, and predictable (standardized) means of carrying out the objectives of the job. However, procedures can become a hodge-podge. Inconsistent or illogical procedures may lead to noncompliance by operators. Based on a field study with three major airlines, the authors propose a model for procedure development called the "Four P's": philosophy, policies, procedures, and practices. Using this model as a framework, the authors discuss the intricate issue of designing flight-deck procedures, and propose a conceptual approach for designing any set of procedures. The various factors, both external and internal to the cockpit, that must be considered for procedure design are presented. In particular, the paper addresses the development of procedures for automated cockpits--a decade-long, and highly controversial issue in commercial aviation. Although this paper is based on airline operations, we assume that the principles discussed here are also applicable to other high-risk supervisory control systems, such as space flight, manufacturing process control, nuclear power production, and military operations.

  13. A Cognitive Game Theoretic Analysis of Conflict Alerts in Air Traffic Control

    NASA Technical Reports Server (NTRS)

    Erev, Ido; Gopher, Daniel; Remington, Roger

    1999-01-01

    The current research was motivated by the recommendation made by a joint government/industry committee to introduce a new traffic control system, referred to as Free Flight. This system is designed to use recent new technology to facilitate efficient and safe air transportation. We addressed one of the major difficulties that arise in the design of this and similar multi-agent systems: the adaptive (and slippery) nature of human agents. To facilitate a safe and efficient design of this multi-agent system, designers have to rely on assessments of the expected behavior of the different agents under various scenarios. Whereas the behavior of the computerized agents is predictable, the behavior of the human agents (including air traffic controllers and pilots) is not. Experimental and empirical observations suggest that human agents are likely to adjust their behavior to the design of the system. To see the difficulty that the adaptive nature of human agents creates, assume that a good approximation of the way operators currently behave is available. Given this information, an optimal design can be performed. The problem arises when the human operators learn to adjust their behavior to the new system. Following this adjustment process, the assumptions made by the designer concerning the operators' behavior will no longer be accurate, and the system might reach a suboptimal state. In extreme situations these potential suboptimal states might involve unnecessary risk. That is, the fact that operators learn in an adaptive fashion does not imply that the system will become safer as they gain experience. At least in the context of safety dilemmas, experience can lead to Pareto-deficient risk-taking behavior.

  14. Helicopter Acoustics

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Exterior and interior noise problems are addressed both from the physics and engineering as well as the human factors point of view. The role of technology in closing the gap between what the customers and regulating agencies would like to have and what is available is explored. Noise regulation concepts, design, operations and testing for noise control, helicopter noise prediction, and research tools and measurements are among the topics covered.

  15. GEM: Geospace Environment Modeling

    NASA Astrophysics Data System (ADS)

    Roederer, Juan G.

    Shortly after the beginning of the “space age” with the launching of the first man-made object into terrestrial orbit, geospace assumed a fundamental role as a technological resource for all countries, advanced and developing alike. Today, satellite systems for communications, weather prediction, navigation, and remote sensing of natural resources are supporting, in an essential way, many facets of societal operations. We must expect that this trend will continue; for instance, in perhaps less than 3 decades, transatmospheric transportation will be routine and satellite systems will sustain human colonies in space. The medium in which Earth-orbiting systems operate is hostile. Far from a perfect vacuum, it is made up of high-temperature gas and corpuscular radiation of varying densities and intensities; these solar-activity-controlled variations can reach proportions dangerous to orbital stability, to electronic systems performance, to shuttle and spaceplane reentry, and to the life of humans in orbit. Dramatic examples of solar-activity-induced satellite failures are the unexpected early degradation of the orbit of Skylab due to unusual upper atmosphere heating and the demise of satellite GOES-5, most probably caused by a large injection of energetic electrons from the outer magnetosphere. The need to predict “weather and climate” in geospace is becoming as important as the need to predict weather and climate in the inhospitable regions on Earth into which industrial activity has moved during the last decades, such as the Arctic and some of the arid lands.

  16. A statistical model for water quality predictions from a river discharge using coastal observations

    NASA Astrophysics Data System (ADS)

    Kim, S.; Terrill, E. J.

    2007-12-01

    Understanding and predicting coastal ocean water quality has benefits for reducing human health risks, protecting the environment, and improving local economies which depend on clean beaches. Continuous observations of coastal physical oceanography increase the understanding of the processes which control the fate and transport of a riverine plume which potentially contains high levels of contaminants from the upstream watershed. A data-driven model of the fate and transport of river plume water from the Tijuana River has been developed using surface current observations provided by a network of HF radar operated as part of a local coastal observatory that has been in place since 2002. The model outputs are compared with water quality sampling of shoreline indicator bacteria, and the skill of an alarm for low water quality is evaluated using the receiver operating characteristic (ROC) curve. In addition, statistical analysis of beach closures in comparison with environmental variables is also discussed.
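The skill evaluation described above amounts to choosing an alarm threshold on the model output and reading sensitivity and specificity off the ROC curve. A minimal sketch that scans thresholds and keeps the one maximizing Youden's J (the scores and exceedance labels are invented, and maximizing J is one common criterion, not necessarily the one the study used):

```python
def best_alarm_threshold(scores, exceed_standard):
    """Scan candidate thresholds and keep the one maximizing Youden's
    J = sensitivity + specificity - 1 for the low-water-quality alarm."""
    best_t, best_j = None, -1.0
    for t in sorted(set(scores)):
        tp = sum(s >= t and e for s, e in zip(scores, exceed_standard))
        fn = sum(s < t and e for s, e in zip(scores, exceed_standard))
        tn = sum(s < t and not e for s, e in zip(scores, exceed_standard))
        fp = sum(s >= t and not e for s, e in zip(scores, exceed_standard))
        j = tp / (tp + fn) + tn / (tn + fp) - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

# Hypothetical model outputs (predicted plume exposure at the shoreline)
# paired with whether indicator bacteria exceeded the standard that day
scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2, 0.1]
exceed = [True, True, False, True, False, False, False, False]
threshold, j = best_alarm_threshold(scores, exceed)
```

Sweeping the threshold from high to low traces out the ROC curve itself; the J-maximizing point is the operating point that best balances missed contamination events against false beach closures.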

  17. Differing Air Traffic Controller Responses to Similar Trajectory Prediction Errors

    NASA Technical Reports Server (NTRS)

    Mercer, Joey; Hunt-Espinosa, Sarah; Bienert, Nancy; Laraway, Sean

    2016-01-01

    A Human-In-The-Loop simulation was conducted in January of 2013 in the Airspace Operations Laboratory at NASA's Ames Research Center. The simulation airspace included two en route sectors feeding the northwest corner of Atlanta's Terminal Radar Approach Control. The focus of this paper is on how uncertainties in the study's trajectory predictions impacted the controllers' ability to perform their duties. Of particular interest is how the controllers interacted with the delay information displayed in the meter list and data block while managing the arrival flows. Due to wind forecasts with 30-knot over-predictions and 30-knot under-predictions, delay value computations included errors of similar magnitude, albeit in opposite directions. However, when performing their duties in the presence of these errors, did the controllers issue clearances of similar magnitude, albeit in opposite directions?

  18. The brain and the law.

    PubMed Central

    Chorvat, Terrence; McCabe, Kevin

    2004-01-01

    Much has been written about how law as an institution has developed to solve many problems that human societies face. Inherent in all of these explanations are models of how humans make decisions. This article discusses what current neuroscience research tells us about the mechanisms of human decision making of particular relevance to law. This research indicates that humans are more capable of solving many problems than standard economic models predict, but are also limited in ways those models ignore. This article discusses how law is both shaped by our cognitive processes and also shapes them. The article considers some of the implications of this research for improving our understanding of how our current legal regimes operate and how the law can be structured to take advantage of our neural mechanisms to improve social welfare. PMID:15590613

  19. Postoperative PTH measurement as a predictor of hypocalcaemia after thyroidectomy.

    PubMed

    Proczko-Markuszewska, M; Kobiela, J; Stefaniak, T; Lachiński, A J; Sledziński, Z

    2010-01-01

    Hypocalcaemia after thyroidectomy is the most common postoperative complication, with a reported incidence ranging from 0.5% to as much as 50% of operated patients. Hypoparathyroidism can result from careless or inadequate preparation during the surgical procedure. There is a variety of proposed options for predicting the incidence of hypocalcaemia. The most effective of them are peri-operative and intra-operative measurements of the parathyroid hormone (PTH) level. A prospective study was performed on 100 patients who underwent total thyroidectomy from January 2007 to June 2008. The total calcium level and intact human PTH (iPTH) levels were measured 24 hours before as well as 1 hour and 24 hours after the surgery. The goal of the study was to assess the potential correlation between iPTH levels after the operation and the development of hypocalcaemia, and to assess the possible predictive value of postoperative iPTH levels. We have demonstrated a significant correlation between the early iPTH measurement and the risk of hypocalcaemia. Moreover, a significant correlation between the iPTH level one hour after the operation and the calcium level 24 hours after the operation was demonstrated. Early postoperative assessment of iPTH levels can be used to identify the group of patients at risk of hypocalcaemia after thyroidectomy. Pre-emptive calcium supplementation can lead to the avoidance of complications causing prolonged hospital stay and, most importantly, can prevent severe hypocalcaemia.

  20. The application of heliospheric imaging to space weather operations: Lessons learned from published studies

    NASA Astrophysics Data System (ADS)

    Harrison, Richard A.; Davies, Jackie A.; Biesecker, Doug; Gibbs, Mark

    2017-08-01

    The field of heliospheric imaging has matured significantly over the last 10 years—corresponding, in particular, to the launch of NASA's STEREO mission and the successful operation of the heliospheric imager (HI) instruments thereon. In parallel, this decade has borne witness to a marked increase in concern over the potentially damaging effects of space weather on space and ground-based technological assets, and the corresponding potential threat to human health, such that it is now under serious consideration at governmental level in many countries worldwide. Hence, in a political climate that recognizes the pressing need for enhanced operational space weather monitoring capabilities, stationed most appropriately (it is widely accepted) at the Lagrangian L1 and L5 points, it is timely to assess the value of heliospheric imaging observations in the context of space weather operations. To this end, we review a cross section of the scientific analyses that have exploited heliospheric imagery—particularly from STEREO/HI—and discuss their relevance to operational predictions of, in particular, coronal mass ejection (CME) arrival at Earth and elsewhere. We believe that the potential benefit of heliospheric images to the provision of accurate CME arrival predictions on an operational basis, although as yet not fully realized, is significant and we assert that heliospheric imagery is central to any credible space weather mission, particularly one located at a vantage point off the Sun-Earth line.

  1. Application of modified extended method in CREAM for safety inspector in coal mines

    NASA Astrophysics Data System (ADS)

    Wang, Jinhe; Zhang, Xiaohong; Zeng, Jianchao

    2018-01-01

    Safety inspectors often perform their duties in circumstances that contribute to the occurrence of human failures. This paper therefore aims at quantifying the human failure probability (HFP) of safety inspectors during coal mine operation with the cognitive reliability and error analysis method (CREAM). However, this approach has shortcomings: it lacks consideration of the applicability of the common performance conditions (CPCs), and the subjectivity of evaluating CPC levels weakens the accuracy of the quantitative prediction results. A modified extended method in CREAM that addresses these difficulties with a CPC framework table is proposed, and the proposed methodology is demonstrated by virtue of a coal-mine accident example. The results are expected to be useful in predicting the HFP of safety inspectors and to contribute to the enhancement of coal mine safety.
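
    As a rough illustration of the extended-CREAM style of quantification described above, the sketch below multiplies a nominal cognitive failure probability by the weighting factors of the assessed CPCs. All probabilities and weights are illustrative placeholders, not values from the paper.

```python
# Minimal sketch of an extended-CREAM style calculation: a nominal cognitive
# failure probability (CFP) for a cognitive function is multiplied by the
# weighting factors of the assessed common performance conditions (CPCs).
# All numbers below are illustrative placeholders, not values from the paper.

NOMINAL_CFP = {
    "observation_missed": 7.0e-2,     # illustrative nominal CFP
    "interpretation_error": 1.0e-2,   # illustrative nominal CFP
}

# Assessed CPC level -> weighting factor (illustrative)
CPC_WEIGHTS = {
    "adequacy_of_organisation": 1.0,  # "efficient"
    "working_conditions": 2.0,        # "incompatible"
    "available_time": 0.5,            # "adequate"
}

def human_failure_probability(cognitive_function, cpc_weights):
    """Nominal CFP adjusted by the product of CPC weighting factors, capped at 1.0."""
    cfp = NOMINAL_CFP[cognitive_function]
    for w in cpc_weights.values():
        cfp *= w
    return min(cfp, 1.0)

hfp = human_failure_probability("observation_missed", CPC_WEIGHTS)
print(f"adjusted HFP = {hfp:.3g}")  # 0.07 * 1.0 * 2.0 * 0.5 = 0.07
```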

  2. Dynamic inverse models in human-cyber-physical systems

    NASA Astrophysics Data System (ADS)

    Robinson, Ryan M.; Scobee, Dexter R. R.; Burden, Samuel A.; Sastry, S. Shankar

    2016-05-01

    Human interaction with the physical world is increasingly mediated by automation. This interaction is characterized by dynamic coupling between robotic (i.e. cyber) and neuromechanical (i.e. human) decision-making agents. Guaranteeing performance of such human-cyber-physical systems will require predictive mathematical models of this dynamic coupling. Toward this end, we propose a rapprochement between robotics and neuromechanics premised on the existence of internal forward and inverse models in the human agent. We hypothesize that, in tele-robotic applications of interest, a human operator learns to invert automation dynamics, directly translating from desired task to required control input. By formulating the model inversion problem in the context of a tracking task for a nonlinear control system in control-affine form, we derive criteria for exponential tracking and show that the resulting dynamic inverse model generally renders a portion of the physical system state (i.e., the internal dynamics) unobservable from the human operator's perspective. Under stability conditions, we show that the human can achieve exponential tracking without formulating an estimate of the system's state so long as they possess an accurate model of the system's dynamics. These theoretical results are illustrated using a planar quadrotor example. We then demonstrate that the automation can intervene to improve performance of the tracking task by solving an optimal control problem. Performance is guaranteed to improve under the assumption that the human learns and inverts the dynamic model of the altered system. We conclude with a discussion of practical limitations that may hinder exact dynamic model inversion.
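
    The inverse-model idea can be illustrated on the simplest control-affine system, a double integrator: an operator who knows the dynamics maps the desired trajectory directly to the required input, with feedback terms giving exponential tracking. This is a minimal sketch, not the paper's quadrotor formulation, and the gains are arbitrary.

```python
# Sketch of dynamic inversion for a tracking task, using a double integrator
# (x'' = u) as a stand-in for a control-affine system. An operator with an
# accurate model computes u from the desired trajectory (feedforward) plus
# stabilising error feedback; gains k1, k2 are illustrative.
import math

def simulate(k1=8.0, k2=4.0, dt=1e-3, T=10.0):
    x, v = 0.0, 0.0              # state: position, velocity
    t = 0.0
    errors = []
    while t < T:
        xd = math.sin(t)         # desired position
        vd = math.cos(t)         # desired velocity
        ad = -math.sin(t)        # desired acceleration (feedforward term)
        # inverse-model control: required input computed directly from the task
        u = ad + k1 * (xd - x) + k2 * (vd - v)
        x += v * dt              # Euler integration of the plant
        v += u * dt
        t += dt
        errors.append(abs(xd - x))
    return errors

errors = simulate()
print(f"max tracking error over the final second: {max(errors[-1000:]):.5f}")
```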

  3. Backwards compatible high dynamic range video compression

    NASA Astrophysics Data System (ADS)

    Dolzhenko, Vladimir; Chesnokov, Vyacheslav; Edirisinghe, Eran A.

    2014-02-01

    This paper presents a two-layer CODEC architecture for high dynamic range video compression. The base layer contains the tone-mapped video stream encoded with 8 bits per component, which can be decoded using conventional equipment. The base layer content is optimized for rendering on low dynamic range displays. The enhancement layer contains the image difference, in a perceptually uniform color space, between the inverse-tone-mapped base layer content and the original video stream. Prediction of the high dynamic range content reduces the redundancy in the transmitted data while still preserving highlights and out-of-gamut colors. The perceptually uniform color space enables the use of standard rate-distortion optimization algorithms. We present techniques for efficient implementation and encoding of non-uniform tone mapping operators with low overhead in terms of bitstream size and number of operations. The transform representation is based on a human visual system model and is suitable for global and local tone mapping operators. The compression techniques include predicting the transform parameters from previously decoded frames and from already decoded data for the current frame. Different video compression techniques are compared: backwards compatible and non-backwards compatible, using AVC and HEVC codecs.
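
    The layered encoding can be sketched as follows: an 8-bit tone-mapped base layer, plus a residual between the inverse-tone-mapped base and the original HDR values. A simple global gamma curve stands in for the paper's tone mapping operators, and no perceptual color transform or entropy coding is modeled.

```python
# Sketch of the two-layer idea: the base layer carries tone-mapped 8-bit
# content, and the enhancement layer carries the residual between the
# inverse-tone-mapped base and the original HDR signal. A global gamma
# stands in for the paper's tone-mapping operator.

GAMMA = 2.2

def tone_map(hdr, peak):
    """Linear HDR value -> 8-bit base-layer code value."""
    return round(255 * (hdr / peak) ** (1 / GAMMA))

def inverse_tone_map(code, peak):
    """Predict the HDR value from a decoded base-layer code value."""
    return peak * (code / 255) ** GAMMA

def encode(hdr_pixels, peak):
    base = [tone_map(p, peak) for p in hdr_pixels]
    residual = [p - inverse_tone_map(c, peak) for p, c in zip(hdr_pixels, base)]
    return base, residual

def decode(base, residual, peak):
    return [inverse_tone_map(c, peak) + r for c, r in zip(base, residual)]

pixels = [0.01, 1.0, 120.5, 997.3]          # illustrative HDR luminances
base, res = encode(pixels, peak=1000.0)
assert all(0 <= c <= 255 for c in base)     # decodable by conventional 8-bit equipment
restored = decode(base, res, peak=1000.0)
print(max(abs(a - b) for a, b in zip(pixels, restored)))
```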

  4. Progress in Space Weather Modeling and Observations Needed to Improve the Operational NAIRAS Model Aircraft Radiation Exposure Predictions

    NASA Astrophysics Data System (ADS)

    Mertens, C. J.; Kress, B. T.; Wiltberger, M. J.; Tobiska, W.; Xu, X.

    2011-12-01

    The Nowcast of Atmospheric Ionizing Radiation for Aviation Safety (NAIRAS) is a prototype operational model for predicting commercial aircraft radiation exposure from galactic and solar cosmic rays. NAIRAS predictions are currently streaming live from the project's public website, and the exposure rate nowcast is also available on the SpaceWx smartphone app for iPhone, iPad, and Android. Cosmic rays are the primary source of human exposure to high linear energy transfer radiation at aircraft altitudes, which increases the risk of cancer and other adverse health effects. Thus, the NAIRAS model addresses an important national need with broad societal, public health and economic benefits. The processes responsible for the variability in the solar wind, interplanetary magnetic field, solar energetic particle spectrum, and the dynamical response of the magnetosphere to these space environment inputs, strongly influence the composition and energy distribution of the atmospheric ionizing radiation field. During the development of the NAIRAS model, new science questions were identified that must be addressed in order to obtain a more reliable and robust operational model of atmospheric radiation exposure. Addressing these science questions requires improvements in both space weather modeling and observations. The focus of this talk is to present these science questions, the proposed methodologies for addressing them, and the anticipated improvements to the operational predictions of atmospheric radiation exposure. The overarching goal of this work is to provide a decision support tool for the aviation industry that will enable an optimal balance to be achieved between minimizing health risks to passengers and aircrew while simultaneously minimizing costs to the airline companies.

  5. The validation of a human force model to predict dynamic forces resulting from multi-joint motions

    NASA Technical Reports Server (NTRS)

    Pandya, Abhilash K.; Maida, James C.; Aldridge, Ann M.; Hasson, Scott M.; Woolford, Barbara J.

    1992-01-01

    The development and validation is examined of a dynamic strength model for humans. This model is based on empirical data. The shoulder, elbow, and wrist joints were characterized in terms of maximum isolated torque, or position and velocity, in all rotational planes. This data was reduced by a least squares regression technique into a table of single variable second degree polynomial equations determining torque as a function of position and velocity. The isolated joint torque equations were then used to compute forces resulting from a composite motion, in this case, a ratchet wrench push and pull operation. A comparison of the predicted results of the model with the actual measured values for the composite motion indicates that forces derived from a composite motion of joints (ratcheting) can be predicted from isolated joint measures. Calculated T values comparing model versus measured values for 14 subjects were well within the statistically acceptable limits and regression analysis revealed coefficient of variation between actual and measured to be within 0.72 and 0.80.
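
    The data reduction described above, fitting single-variable second-degree polynomials of torque against position or velocity, amounts to an ordinary least-squares fit. The pure-Python normal-equations solver below uses illustrative, noise-free data rather than the study's strength measurements.

```python
# Sketch of the data reduction described above: fit torque(x) = a + b*x + c*x^2
# to measured (joint angle or velocity, torque) pairs by least squares.
# Pure-Python normal equations; the data are illustrative.

def fit_quadratic(xs, ys):
    """Least-squares coefficients (a, b, c) for y ~ a + b*x + c*x^2."""
    # Normal equations A @ coeffs = rhs for the monomial basis [1, x, x^2].
    S = [sum(x ** k for x in xs) for k in range(5)]
    A = [[S[i + j] for j in range(3)] for i in range(3)]
    rhs = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(3)]
    # Gaussian elimination with partial pivoting.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= f * A[col][c]
            rhs[r] -= f * rhs[col]
    # Back substitution.
    coeffs = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        coeffs[r] = (rhs[r] - sum(A[r][c] * coeffs[c] for c in range(r + 1, 3))) / A[r][r]
    return tuple(coeffs)

# Noise-free data generated from torque = 40 + 2x - 0.5x^2 recover the coefficients:
xs = [0, 15, 30, 45, 60, 75, 90]
ys = [40 + 2 * x - 0.5 * x ** 2 for x in xs]
a, b, c = fit_quadratic(xs, ys)
print(round(a, 4), round(b, 4), round(c, 4))
```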

  6. Towards Engineering Biological Systems in a Broader Context.

    PubMed

    Venturelli, Ophelia S; Egbert, Robert G; Arkin, Adam P

    2016-02-27

    Significant advances have been made in synthetic biology to program information processing capabilities in cells. While these designs can function predictably in controlled laboratory environments, the reliability of these devices in complex, temporally changing environments has not yet been characterized. As human society faces global challenges in agriculture, human health and energy, synthetic biology should develop predictive design principles for biological systems operating in complex environments. Natural biological systems have evolved mechanisms to overcome innumerable and diverse environmental challenges. Evolutionary design rules should be extracted and adapted to engineer stable and predictable ecological function. We highlight examples of natural biological responses spanning the cellular, population and microbial community levels that show promise in synthetic biology contexts. We argue that synthetic circuits embedded in host organisms or designed ecologies informed by suitable measurement of biotic and abiotic environmental parameters could be used as engineering substrates to achieve target functions in complex environments. Successful implementation of these methods will broaden the context in which synthetic biological systems can be applied to solve important problems. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Research Initiatives and Preliminary Results In Automation Design In Airspace Management in Free Flight

    NASA Technical Reports Server (NTRS)

    Corker, Kevin; Lebacqz, J. Victor (Technical Monitor)

    1997-01-01

    NASA and the FAA have entered into a joint venture to explore, define, design and implement a new airspace management operating concept. The fundamental premise of that concept is that technologies and procedures need to be developed for flight deck and ground operations to improve the efficiency, the predictability, the flexibility and the safety of airspace management and operations. To that end, NASA Ames has undertaken an initial development and exploration of "key concepts" in free flight airspace management technology development. Work has been undertaken on human factors issues in automation aiding design, coupled air-ground aiding systems, communication protocols in distributed decision making, and analytic techniques for defining concepts of airspace density and operator cognitive load. This paper reports the progress of these efforts, which are not intended to definitively solve the many evolving issues of design for future ATM systems, but to provide preliminary results to chart the parameters of performance and the topology of the analytic effort required. The preliminary research in provision of cockpit display of traffic information, dynamic density definition, distributed decision making, situation awareness models and human performance models is discussed as it focuses on the theme of "design requirements".

  8. Pathogen-Host Associations and Predicted Range Shifts of Human Monkeypox in Response to Climate Change in Central Africa

    PubMed Central

    Thomassen, Henri A.; Fuller, Trevon; Asefi-Najafabady, Salvi; Shiplacoff, Julia A. G.; Mulembakani, Prime M.; Blumberg, Seth; Johnston, Sara C.; Kisalu, Neville K.; Kinkela, Timothée L.; Fair, Joseph N.; Wolfe, Nathan D.; Shongo, Robert L.; LeBreton, Matthew; Meyer, Hermann; Wright, Linda L.; Muyembe, Jean-Jacques; Buermann, Wolfgang; Okitolonda, Emile; Hensley, Lisa E.; Lloyd-Smith, James O.; Smith, Thomas B.; Rimoin, Anne W.

    2013-01-01

    Climate change is predicted to result in changes in the geographic ranges and local prevalence of infectious diseases, either through direct effects on the pathogen, or indirectly through range shifts in vector and reservoir species. To better understand the occurrence of monkeypox virus (MPXV), an emerging Orthopoxvirus in humans, under contemporary and future climate conditions, we used ecological niche modeling techniques in conjunction with climate and remote-sensing variables. We first created spatially explicit probability distributions of its candidate reservoir species in Africa's Congo Basin. Reservoir species distributions were subsequently used to model current and projected future distributions of human monkeypox (MPX). Results indicate that forest clearing and climate are significant driving factors of the transmission of MPX from wildlife to humans under current climate conditions. Models under contemporary climate conditions performed well, as indicated by high values for the area under the receiver operating characteristic curve (AUC), and tests on spatially randomly and non-randomly omitted test data. Future projections were made using IPCC 4th Assessment climate change scenarios for 2050 and 2080, ranging from more conservative to more aggressive, and representing the potential variation within which range shifts can be expected to occur. Future projections showed range shifts into regions where MPX has not been recorded previously. Increased suitability for MPX was predicted in eastern Democratic Republic of Congo. Models developed here are useful for identifying areas where environmental conditions may become more suitable for human MPX; targeting candidate reservoir species for future screening efforts; and prioritizing regions for future MPX surveillance efforts. PMID:23935820

  9. Real time cancer prediction based on objective tissue compliance measurement in endoscopic surgery.

    PubMed

    Fakhry, Morkos; Bello, Fernando; Hanna, George B

    2014-02-01

    To investigate the feasibility of real time cancer tissue diagnosis intraoperatively based on in vivo tissue compliance measurements obtained by a recently developed laparoscopic smart device. Cancer tissue is stiffer than its normal counterpart. Modern forms of remote surgery such as laparoscopic and robotic surgical techniques diminish direct assessment of this important tissue property. In vivo human tissue compliance of the normal and cancer gastrointestinal tissue is unknown. A Clinical Real Time Tissue Compliance Mapping System (CRTCMS) with a predictive power comparable to the human hand and useable in routine surgical practice has been recently developed. The CRTCMS is employed in the operating theater to collect data from 50 patients undergoing intra-abdominal surgical interventions [40 men, 10 women, aged between 32 and 89 (mean = 66.4, range = 57)]. This includes 10 esophageal and 27 gastric cancer patients. A total of 1212 compliance measurements of normal and cancerous in vivo gastrointestinal tissues were taken. The data were used to calibrate the CRTCMS to predict cancerous tissue in a further 12 patients (3 cancer esophagus and 9 cancer stomach) involving 175 measurements. The system demonstrated a high prediction power to diagnose cancer tissue in real time during routine surgical procedures (sensitivity = 98.7%, specificity = 99%). An in vivo human tissue compliance data bank of the gastrointestinal tract was produced. Real time cancer diagnosis based on in vivo tissue compliance measurements is feasible. The reported data open new avenues in cancer diagnostics, surgical robotics, and development of more realistic surgical simulators.
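
    Sensitivity and specificity, the figures reported above, follow directly from binary predictions against ground truth. The sketch below uses made-up labels, not the study's 175 measurements.

```python
# Sketch of the reported evaluation metrics: sensitivity (true-positive rate)
# and specificity (true-negative rate) of binary cancer calls against ground
# truth. The labels below are illustrative, not the study's data.

def sensitivity_specificity(predicted, actual):
    tp = sum(p and a for p, a in zip(predicted, actual))
    tn = sum(not p and not a for p, a in zip(predicted, actual))
    fp = sum(p and not a for p, a in zip(predicted, actual))
    fn = sum(not p and a for p, a in zip(predicted, actual))
    return tp / (tp + fn), tn / (tn + fp)

actual    = [True, True, True, True, False, False, False, False, False, False]
predicted = [True, True, True, False, False, False, False, False, False, True]
sens, spec = sensitivity_specificity(predicted, actual)
print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")
```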

  10. Physical attraction to reliable, low variability nervous systems: Reaction time variability predicts attractiveness.

    PubMed

    Butler, Emily E; Saville, Christopher W N; Ward, Robert; Ramsey, Richard

    2017-01-01

    The human face cues a range of important fitness information, which guides mate selection towards desirable others. Given humans' high investment in the central nervous system (CNS), cues to CNS function should be especially important in social selection. We tested if facial attractiveness preferences are sensitive to the reliability of human nervous system function. Several decades of research suggest an operational measure for CNS reliability is reaction time variability, which is measured by standard deviation of reaction times across trials. Across two experiments, we show that low reaction time variability is associated with facial attractiveness. Moreover, variability in performance made a unique contribution to attractiveness judgements above and beyond both physical health and sex-typicality judgements, which have previously been associated with perceptions of attractiveness. In a third experiment, we empirically estimated the distribution of attractiveness preferences expected by chance and show that the size and direction of our results in Experiments 1 and 2 are statistically unlikely without reference to reaction time variability. We conclude that an operating characteristic of the human nervous system, reliability of information processing, is signalled to others through facial appearance. Copyright © 2016 Elsevier B.V. All rights reserved.
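
    The operational measure described above, across-trial standard deviation of reaction times, is straightforward to compute; the trial data in this sketch are invented for illustration.

```python
# Sketch of the paper's operational measure of CNS reliability: the standard
# deviation of a participant's reaction times across trials (lower values
# indicate a more reliable nervous system). Trial data are illustrative.
from statistics import stdev

def rt_variability(reaction_times_ms):
    """Across-trial standard deviation of reaction times (ms)."""
    return stdev(reaction_times_ms)

consistent = [402, 398, 405, 401, 399, 403]   # low-variability responder
erratic    = [310, 520, 405, 610, 295, 480]   # high-variability responder

assert rt_variability(consistent) < rt_variability(erratic)
print(f"{rt_variability(consistent):.1f} ms vs {rt_variability(erratic):.1f} ms")
```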

  11. Initial eye movements during face identification are optimal and similar across cultures

    PubMed Central

    Or, Charles C.-F.; Peterson, Matthew F.; Eckstein, Miguel P.

    2015-01-01

    Culture influences not only human high-level cognitive processes but also low-level perceptual operations. Some perceptual operations, such as initial eye movements to faces, are critical for extraction of information supporting evolutionarily important tasks such as face identification. The extent of cultural effects on these crucial perceptual processes is unknown. Here, we report that the first gaze location for face identification was similar across East Asian and Western Caucasian cultural groups: Both fixated a featureless point between the eyes and the nose, with smaller between-group than within-group differences and with a small horizontal difference across cultures (8% of the interocular distance). We also show that individuals of both cultural groups initially fixated at a slightly higher point on Asian faces than on Caucasian faces. The initial fixations were found to be both fundamental in acquiring the majority of information for face identification and optimal, as accuracy deteriorated when observers held their gaze away from their preferred fixations. An ideal observer that integrated facial information with the human visual system's varying spatial resolution across the visual field showed a similar information distribution across faces of both races and predicted initial human fixations. The model consistently replicated the small vertical difference between human fixations to Asian and Caucasian faces but did not predict the small horizontal leftward bias of Caucasian observers. Together, the results suggest that initial eye movements during face identification may be driven by brain mechanisms aimed at maximizing accuracy, and less influenced by culture. The findings increase our understanding of the interplay between the brain's aims to optimally accomplish basic perceptual functions and to respond to sociocultural influences. PMID:26382003

  12. The assessment of ionising radiation impact on the cooling pond freshwater ecosystem non-human biota from the Ignalina NPP operation beginning to shut down and initial decommissioning.

    PubMed

    Mazeika, J; Marciulioniene, D; Nedveckaite, T; Jefanova, O

    2016-01-01

    The radiological doses to non-human biota of freshwater ecosystem in the Ignalina NPP cooling pond - Lake Druksiai were evaluated for several cases including the plant's operation period and initial decommissioning activities, using the ERICA 1.2 code with IAEA SRS-19 models integrated approach and tool. Among the Lake Druksiai freshwater ecosystem reference organisms investigated the highest exposure dose rate was determined for bottom fauna - benthic organisms (mollusc-bivalves, crustaceans, mollusc-gastropods, insect larvae), and among the other reference organisms - for vascular plants. The mean and maximum total dose rate values due to anthropogenic radionuclide ionising radiation impact in all investigated cases were lower than the ERICA screening dose rate value of 10 μGy/h. The main exposure of reference organisms as a result of Ignalina NPP former effluent to Lake Druksiai is due to ionizing radiation of radionuclides (60)Co and (137)Cs, of predicted releases to Lake Druksiai during initial decommissioning period - due to radionuclides (60)Co, (134)Cs and (137)Cs, and as a result of predicted releases to Lake Druksiai from low- and intermediate-level short-lived radioactive waste disposal site in 30-100 year period - due to radionuclides (99)Tc and (3)H. The risk quotient expected values in all investigated cases were <1, and therefore the risk to non-human biota can be considered negligible with the exception of a conservative risk quotient for insect larvae. Radiological protection of non-human biota in Lake Druksiai, the Ignalina NPP cooling pond, is both feasible and acceptable. Copyright © 2015 Elsevier Ltd. All rights reserved.
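
    The screening comparison used above reduces to a risk quotient: the predicted dose rate divided by the 10 μGy/h ERICA screening benchmark, with RQ < 1 read as negligible risk. The dose rates in this sketch are illustrative, not the study's values.

```python
# Sketch of the screening comparison described above: risk quotient (RQ) is
# the predicted dose rate divided by the ERICA screening dose rate of
# 10 uGy/h; RQ < 1 is taken as negligible risk. Dose rates are illustrative.

SCREENING_DOSE_RATE = 10.0  # uGy/h, ERICA screening value

def risk_quotient(dose_rate_ugy_h):
    return dose_rate_ugy_h / SCREENING_DOSE_RATE

doses = {"benthic molluscs": 0.42, "vascular plants": 0.18, "insect larvae": 0.95}
for organism, dose in doses.items():
    rq = risk_quotient(dose)
    print(f"{organism}: RQ = {rq:.3f} ({'negligible' if rq < 1 else 'review'})")
```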

  13. Predictive thresholds for plague in Kazakhstan.

    PubMed

    Davis, Stephen; Begon, Mike; De Bruyn, Luc; Ageyev, Vladimir S; Klassovskiy, Nikolay L; Pole, Sergey B; Viljugrein, Hildegunn; Stenseth, Nils Chr; Leirs, Herwig

    2004-04-30

    In Kazakhstan and elsewhere in central Asia, the bacterium Yersinia pestis circulates in natural populations of gerbils, which are the source of human cases of bubonic plague. Our analysis of field data collected between 1955 and 1996 shows that plague invades, fades out, and reinvades in response to fluctuations in the abundance of its main reservoir host, the great gerbil (Rhombomys opimus). This is a rare empirical example of the two types of abundance thresholds for infectious disease (invasion and persistence) operating in a single wildlife population. We parameterized predictive models that should reduce the costs of plague surveillance in central Asia and thereby encourage its continuance.
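
    The two-threshold dynamic, invasion above one abundance level and fade-out below another, can be sketched as a simple year-by-year rule. The threshold values and the abundance series below are illustrative, not the parameterization fitted in the paper.

```python
# Sketch of the two abundance thresholds described above: infection invades
# in years when host abundance exceeds the invasion threshold, and an
# established infection fades out when abundance drops below the persistence
# threshold. Thresholds and the abundance series are illustrative.

def plague_presence(abundance_by_year, invasion=2.0, persistence=1.0):
    """Predicted plague presence (True/False) for each year."""
    present = False
    history = []
    for a in abundance_by_year:
        if not present and a > invasion:
            present = True          # invasion
        elif present and a < persistence:
            present = False         # fade-out
        history.append(present)
    return history

abundance = [0.5, 1.5, 2.5, 1.8, 1.2, 0.8, 0.6, 2.2]
print(plague_presence(abundance))
# Invades in year 3, persists through years 4-5, fades out in year 6, reinvades in year 8.
```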

  14. Human Operator Interface with FLIR Displays.

    DTIC Science & Technology

    1980-03-01

    model (Ratches, et al., 1976) used to evaluate FLIR system performance. The research...the minimum resolvable temperature (MRT) paradigm to test two modeled FLIR systems. Twelve male subjects with 20/20 uncorrected vision served as...varying levels of size, contrast, noise, and MTF. The test results were compared with the NVL predictive model (Ratches, et al., 1975) used to

  15. Robotics-based synthesis of human motion.

    PubMed

    Khatib, O; Demircan, E; De Sapio, V; Sentis, L; Besier, T; Delp, S

    2009-01-01

    The synthesis of human motion is a complex procedure that involves accurate reconstruction of movement sequences, modeling of musculoskeletal kinematics, dynamics and actuation, and characterization of reliable performance criteria. Many of these processes have much in common with the problems found in robotics research. Task-based methods used in robotics may be leveraged to provide novel musculoskeletal modeling methods and physiologically accurate performance predictions. In this paper, we present (i) a new method for the real-time reconstruction of human motion trajectories using direct marker tracking, (ii) a task-driven muscular effort minimization criterion and (iii) new human performance metrics for dynamic characterization of athletic skills. Dynamic motion reconstruction is achieved through the control of a simulated human model to follow the captured marker trajectories in real-time. The operational space control and real-time simulation provide human dynamics at any configuration of the performance. A new criterion of muscular effort minimization has been introduced to analyze human static postures. Extensive motion capture experiments were conducted to validate the new minimization criterion. Finally, new human performance metrics were introduced to study an athletic skill in detail. These metrics include the effort expenditure and the feasible set of operational space accelerations during the performance of the skill. The dynamic characterization takes into account skeletal kinematics as well as muscle routing kinematics and force generating capacities. The developments draw upon an advanced musculoskeletal modeling platform and a task-oriented framework for the effective integration of biomechanics and robotics methods.

  16. PRESTO low-level waste transport and risk assessment code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Little, C.A.; Fields, D.E.; McDowell-Boyer, L.M.

    1981-01-01

    PRESTO (Prediction of Radiation Effects from Shallow Trench Operations) is a computer code developed under US Environmental Protection Agency (EPA) funding to evaluate possible health effects from shallow land burial trenches. The model is intended to be generic and to assess radionuclide transport, ensuing exposure, and health impact to a static local population for a 1000-y period following the end of burial operations. Human exposure scenarios considered by the model include normal releases (including leaching and operational spillage), human intrusion, and site farming or reclamation. Pathways and processes of transit from the trench to an individual or population include: groundwater transport, overland flow, erosion, surface water dilution, resuspension, atmospheric transport, deposition, inhalation, and ingestion of contaminated beef, milk, crops, and water. Both population doses and individual doses are calculated as well as doses to the intruder and farmer. Cumulative health effects in terms of deaths from cancer are calculated for the population over the thousand-year period using a life-table approach. Data bases are being developed for three extant shallow land burial sites: Barnwell, South Carolina; Beatty, Nevada; and West Valley, New York.

  17. Environmental assessment model for shallow land disposal of low-level radioactive wastes

    NASA Astrophysics Data System (ADS)

    Little, C. A.; Fields, D. E.; Emerson, C. J.; Hiromoto, G.

    1981-09-01

    The PRESTO (Prediction of Radiation Effects from Shallow Trench Operations) computer code developed to evaluate health effects from shallow land burial trenches is described. This generic model assesses radionuclide transport, ensuing exposure, and health impact to a static local population for a 1000 y period following the end of burial operations. Human exposure scenarios considered include normal releases (including leaching and operational spillage), human intrusion, and site farming or reclamation. Pathways and processes of transit from the trench to an individual or population include: groundwater transport, overland flow, erosion, surface water dilution, resuspension, atmospheric transport, deposition, inhalation, and ingestion of contaminated beef, milk, crops, and water. Both population doses and individual doses are calculated as well as doses to the intruder and farmer. Cumulative health effects in terms of deaths from cancer are calculated for the population over the 1000 y period using a life table approach. Data bases for three shallow land burial sites (Barnwell, South Carolina; Beatty, Nevada; and West Valley, New York) are under development. The interim model includes coding for environmental transport through air, surface water, and ground water.

  18. Entertainment-education and recruitment of cornea donors: the role of emotion and issue involvement.

    PubMed

    Bae, Hyuhn-Suhck

    2008-01-01

    This study examined the role of emotional responses and viewers' level of issue involvement in response to an entertainment-education show about cornea donation in order to predict intention to register as cornea donors. Results confirmed that sympathy and empathy responses operated as a catalyst for issue involvement, which emerged as an important intermediary in the persuasion process. Issue involvement also was found to be a common causal antecedent of attitude, subjective norm, and perceived behavioral control, the last two of which predict intentions, unlike attitude, which does not. The revised path model confirmed that involvement directly influences intention. The findings of this study suggest that adding emotion and involvement to the Theory of Planned Behavior (TPB) enhances the explanatory power of the theory in predicting intentions, which indicates the possibility of combining the Elaboration Likelihood Model (ELM) and the TPB in the prediction of human behaviors.

  19. Need for Affect and Attitudes Toward Drugs: The Mediating Role of Values.

    PubMed

    Lins de Holanda Coelho, Gabriel; H P Hanel, Paul; Vilar, Roosevelt; P Monteiro, Renan; Gouveia, Valdiney V; R Maio, Gregory

    2018-05-04

    Human values and affective traits were found to predict attitudes toward the use of different types of drugs (e.g., alcohol, marijuana, and other illegal drugs). In this study (N = 196, M age = 23.09), we aimed to gain a more comprehensive understanding of those predictors of attitudes toward drug use in a mediated structural equation model, providing a better overview of a possible motivational path that leads to such risky behavior. Specifically, we predicted and found that the relations between need for affect and attitudes toward drug use were mediated by excitement values. Also, results showed that excitement values and need for affect positively predicted attitudes toward the use of drugs, whereas normative values predicted them negatively. The pattern of results remained the same when we investigated attitudes toward alcohol, marijuana, or illegal drugs separately. Overall, the findings indicate that emotions operate via excitement and normative values to influence risk behavior.

  20. Prediction of missing common genes for disease pairs using network based module separation on incomplete human interactome.

    PubMed

    Akram, Pakeeza; Liao, Li

    2017-12-06

    Identification of common genes associated with comorbid diseases can be critical in understanding their pathobiological mechanism. This work presents a novel method to predict missing common genes associated with a disease pair. Searching for missing common genes is formulated as an optimization problem that minimizes the network-based module separation between the two subgraphs produced by mapping the genes associated with each disease onto the interactome. Using cross validation on more than 600 disease pairs, our method achieves a significantly higher average receiver operating characteristic (ROC) score of 0.95, compared to a baseline ROC score of 0.60 using randomized data. Prediction of missing common genes aims to complete the gene set associated with a comorbid disease pair for better understanding of biological intervention. It will also be useful for gene-targeted therapeutics related to comorbid diseases. This method can be further considered for prediction of missing edges to complete the subgraph associated with a disease pair.
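
    A hedged sketch of the network-based module separation behind such methods: s_AB = <d_AB> - (<d_AA> + <d_BB>)/2, where each term averages shortest-path distances to the nearest gene of the indicated set, and negative values indicate overlapping disease modules. The toy interactome and gene sets below are invented for illustration.

```python
# Sketch of network-based module separation on an unweighted interactome:
# s_AB = <d_AB> - (<d_AA> + <d_BB>) / 2, with each term a mean shortest-path
# distance to the nearest gene of the indicated set. Toy data, not a real
# interactome.
from collections import deque

def bfs_distances(graph, source):
    """Unweighted shortest-path distances from source to all reachable nodes."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in graph[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def mean_nearest(graph, from_set, to_set, exclude_self=False):
    """Mean distance from each gene in from_set to its nearest gene in to_set."""
    total = 0.0
    for g in from_set:
        dist = bfs_distances(graph, g)
        total += min(dist[t] for t in to_set if not (exclude_self and t == g))
    return total / len(from_set)

def separation(graph, set_a, set_b):
    """s_AB < 0 indicates overlapping disease modules; s_AB > 0, separated ones."""
    n = len(set_a) + len(set_b)
    d_ab = (mean_nearest(graph, set_a, set_b) * len(set_a)
            + mean_nearest(graph, set_b, set_a) * len(set_b)) / n
    d_aa = mean_nearest(graph, set_a, set_a, exclude_self=True)
    d_bb = mean_nearest(graph, set_b, set_b, exclude_self=True)
    return d_ab - (d_aa + d_bb) / 2

# Toy interactome: a triangle of genes joined by a path to a chain of genes.
interactome = {
    "g1": {"g2", "g3"}, "g2": {"g1", "g3"}, "g3": {"g1", "g2", "g4"},
    "g4": {"g3", "g5"}, "g5": {"g4", "g6"}, "g6": {"g5", "g7"}, "g7": {"g6"},
}
print(separation(interactome, {"g1", "g2", "g3"}, {"g5", "g6", "g7"}))  # > 0: separated
print(separation(interactome, {"g1", "g2", "g3"}, {"g3", "g4"}))        # < 0: overlapping
```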

  1. Use of a least absolute shrinkage and selection operator (LASSO) model to selected ion flow tube mass spectrometry (SIFT-MS) analysis of exhaled breath to predict the efficacy of dialysis: a pilot study.

    PubMed

    Wang, Maggie Haitian; Chong, Ka Chun; Storer, Malina; Pickering, John W; Endre, Zoltan H; Lau, Steven Yf; Kwok, Chloe; Lai, Maria; Chung, Hau Yin; Ying Zee, Benny Chung

    2016-09-28

    Selected ion flow tube-mass spectrometry (SIFT-MS) provides rapid, non-invasive measurements of a full-mass scan of volatile compounds in exhaled breath. Although various studies have suggested that breath metabolites may be indicators of human disease status, many of these studies have included few breath samples and large numbers of compounds, limiting their power to detect significant metabolites. This study applied a least absolute shrinkage and selection operator (LASSO) approach to SIFT-MS data from breath samples to preliminarily evaluate the ability of exhaled breath findings to monitor the efficacy of dialysis in hemodialysis patients. A process of model building and validation showed that blood creatinine and urea concentrations could be accurately predicted by LASSO-selected masses. Using various precursors, the LASSO models were able to predict creatinine and urea concentrations with high adjusted R-squared values (>80%). The correlation between actual concentrations and concentrations predicted by the LASSO model (using the precursor H3O+) was high (Pearson correlation coefficient = 0.96). Moreover, use of full mass scan data provided better predictions than compounds from selected ion mode. These findings warrant further investigation in larger patient cohorts. By employing a more powerful statistical approach to predict disease outcomes, breath analysis using SIFT-MS technology could be applicable in the future to daily medical diagnoses.
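    The LASSO selection step described above can be sketched with a minimal coordinate-descent implementation on synthetic data; the feature matrix and response below are simulated stand-ins, not the study's SIFT-MS masses or creatinine measurements. The point of the sketch is that the L1 penalty drives irrelevant coefficients exactly to zero:

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Minimal coordinate-descent LASSO for (1/2n)||y - Xw||^2 + lam*||w||_1."""
    n, p = X.shape
    w = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ w + X[:, j] * w[j]     # partial residual excluding feature j
            rho = X[:, j] @ r / n
            z = (X[:, j] ** 2).sum() / n
            w[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / z  # soft-thresholding
    return w

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))       # 10 hypothetical mass-scan features
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.normal(size=100)
w = lasso_cd(X, y, lam=0.1)
print(np.round(w, 2))  # only the first two coefficients survive the penalty
```

    In the study's setting, the surviving coefficients correspond to the LASSO-selected masses used to predict creatinine and urea concentrations.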

  2. Power hand tool kinetics associated with upper limb injuries in an automobile assembly plant.

    PubMed

    Ku, Chia-Hua; Radwin, Robert G; Karsh, Ben-Tzion

    2007-06-01

    This study investigated the relationship between pneumatic nutrunner handle reactions, workstation characteristics, and the prevalence of upper limb injuries in an automobile assembly plant. Tool properties (geometry, inertial properties, and motor characteristics), fastener properties, orientation relative to the fastener, and the position of the tool operator (horizontal and vertical distances) were measured for 69 workstations using 15 different pneumatic nutrunners. Handle reaction response was predicted using a deterministic mechanical model of the human operator and tool, previously developed in our laboratory, specific to the measured tool, workstation, and job factors. Handle force was a function of target torque, tool geometry and inertial properties, motor speed, work orientation, and joint hardness. The study found that tool target torque was not well correlated with predicted handle reaction force (r=0.495) or displacement (r=0.285). The individual tool, tool shape, and threaded-fastener joint hardness all affected predicted forces and displacements (p<0.05). The average peak handle force and displacement for right-angle tools were twice as great as those for pistol-grip tools. Soft-threaded fastener joints had the greatest average handle forces and displacements. Upper limb injury cases were identified using the plant's OSHA 200 log and personnel records. Predicted handle forces for jobs where injuries were reported were significantly greater than for jobs free of injuries (p<0.05), whereas target torque and predicted handle displacement did not show statistically significant differences. The study concluded that quantification of handle reaction force, rather than target torque alone, is necessary for identifying stressful power hand tool operations and for controlling exposure to forces in manufacturing jobs involving power nutrunners. Therefore, a combination of tool, workstation, and task requirements should be considered.

  3. Understanding and seasonal forecasting of hydrological drought in the Anthropocene

    NASA Astrophysics Data System (ADS)

    Yuan, Xing; Zhang, Miao; Wang, Linying; Zhou, Tian

    2017-11-01

    Hydrological drought is not only caused by natural hydroclimatic variability but can also be directly altered by human interventions including reservoir operation, irrigation, and groundwater exploitation. Understanding and forecasting hydrological drought in the Anthropocene are grand challenges due to the complicated interactions among climate, hydrology, and humans. In this paper, five decades (1961-2010) of naturalized and observed streamflow datasets are used to investigate hydrological drought characteristics in a heavily managed river basin, the Yellow River basin in north China. Human interventions decrease the correlation between hydrological and meteorological droughts and make hydrological drought respond to longer timescales of meteorological drought. Due to large water consumption in the middle and lower reaches, there are 118-262% increases in hydrological drought frequency, up to 8-fold increases in drought severity, 21-99% increases in drought duration, and an earlier drought onset. The non-stationarity due to anthropogenic climate change and human water use generally decreases the correlation between meteorological and hydrological droughts and reduces the effect of human interventions on hydrological drought frequency, while increasing the effect on drought duration and severity. A set of 29-year (1982-2010) hindcasts from an established seasonal hydrological forecasting system is used to assess the forecast skill for hydrological drought. Under naturalized conditions, the climate-model-based approach outperforms the climatology method in predicting the 2001 severe hydrological drought event; based on the 29-year hindcasts, the former has a Brier skill score of 11-26% against the latter for probabilistic hydrological drought forecasting. In the Anthropocene, the skill of both approaches increases due to the dominant influence of human interventions, which has been implicitly incorporated by the hydrological post-processing, while the difference between the two predictions decreases. This suggests that human interventions can outweigh climate variability in hydrological drought forecasting in the Anthropocene, and that the predictability of human interventions needs more attention.

  4. A Survey of Space Robotics

    NASA Technical Reports Server (NTRS)

    Pedersen, L.; Kortenkamp, D.; Wettergreen, D.; Nourbakhsh, I.; Korsmeyer, David (Technical Monitor)

    2003-01-01

    In this paper we summarize a survey conducted by NASA to determine the state of the art in space robotics and to predict future robotic capabilities under either nominal or intensive development effort. The space robotics assessment study examined both in-space operations, including assembly, inspection, and maintenance, and planetary surface operations, such as mobility and exploration. Applications of robotic autonomy and human-robot cooperation were considered. The study group devised a decomposition of robotic capabilities and then suggested metrics to specify the technical challenges associated with each. The conclusion of this paper identifies possible areas in which investment in space robotics could lead to significant advances in important technologies.

  5. Sensitivity of a juvenile subject-specific musculoskeletal model of the ankle joint to the variability of operator-dependent input

    PubMed Central

    Hannah, Iain; Montefiori, Erica; Modenese, Luca; Prinold, Joe; Viceconti, Marco; Mazzà, Claudia

    2017-01-01

    Subject-specific musculoskeletal modelling is especially useful in the study of juvenile and pathological subjects. However, such methodologies typically require a human operator to identify key landmarks from medical imaging data and are thus affected by unavoidable variability in the parameters defined and subsequent model predictions. The aim of this study was to thus quantify the inter- and intra-operator repeatability of a subject-specific modelling methodology developed for the analysis of subjects with juvenile idiopathic arthritis. Three operators each created subject-specific musculoskeletal foot and ankle models via palpation of bony landmarks, adjustment of geometrical muscle points and definition of joint coordinate systems. These models were then fused to a generic Arnold lower limb model for each of three modelled patients. The repeatability of each modelling operation was found to be comparable to those previously reported for the modelling of healthy, adult subjects. However, the inter-operator repeatability of muscle point definition was significantly greater than intra-operator repeatability (p < 0.05) and predicted ankle joint contact forces ranged by up to 24% and 10% of the peak force for the inter- and intra-operator analyses, respectively. Similarly, the maximum inter- and intra-operator variations in muscle force output were 64% and 23% of peak force, respectively. Our results suggest that subject-specific modelling is operator dependent at the foot and ankle, with the definition of muscle geometry the most significant source of output uncertainty. The development of automated procedures to prevent the misplacement of crucial muscle points should therefore be considered a particular priority for those developing subject-specific models. PMID:28427313

  6. Sensitivity of a juvenile subject-specific musculoskeletal model of the ankle joint to the variability of operator-dependent input.

    PubMed

    Hannah, Iain; Montefiori, Erica; Modenese, Luca; Prinold, Joe; Viceconti, Marco; Mazzà, Claudia

    2017-05-01

    Subject-specific musculoskeletal modelling is especially useful in the study of juvenile and pathological subjects. However, such methodologies typically require a human operator to identify key landmarks from medical imaging data and are thus affected by unavoidable variability in the parameters defined and subsequent model predictions. The aim of this study was to thus quantify the inter- and intra-operator repeatability of a subject-specific modelling methodology developed for the analysis of subjects with juvenile idiopathic arthritis. Three operators each created subject-specific musculoskeletal foot and ankle models via palpation of bony landmarks, adjustment of geometrical muscle points and definition of joint coordinate systems. These models were then fused to a generic Arnold lower limb model for each of three modelled patients. The repeatability of each modelling operation was found to be comparable to those previously reported for the modelling of healthy, adult subjects. However, the inter-operator repeatability of muscle point definition was significantly greater than intra-operator repeatability (p < 0.05) and predicted ankle joint contact forces ranged by up to 24% and 10% of the peak force for the inter- and intra-operator analyses, respectively. Similarly, the maximum inter- and intra-operator variations in muscle force output were 64% and 23% of peak force, respectively. Our results suggest that subject-specific modelling is operator dependent at the foot and ankle, with the definition of muscle geometry the most significant source of output uncertainty. The development of automated procedures to prevent the misplacement of crucial muscle points should therefore be considered a particular priority for those developing subject-specific models.

  7. Human Guidance Behavior Decomposition and Modeling

    NASA Astrophysics Data System (ADS)

    Feit, Andrew James

    Trained humans are capable of high performance, adaptable, and robust first-person dynamic motion guidance behavior. This behavior is exhibited in a wide variety of activities such as driving, piloting aircraft, skiing, biking, and many others. Human performance in such activities far exceeds the current capability of autonomous systems in terms of adaptability to new tasks, real-time motion planning, robustness, and trading safety for performance. The present work investigates the structure of human dynamic motion guidance that enables these performance qualities. This work uses a first-person experimental framework that presents a driving task to the subject, measuring control inputs, vehicle motion, and operator visual gaze movement. The resulting data is decomposed into subspace segment clusters that form primitive elements of action-perception interactive behavior. Subspace clusters are defined by both agent-environment system dynamic constraints and operator control strategies. A key contribution of this work is to define transitions between subspace cluster segments, or subgoals, as points where the set of active constraints, either system or operator defined, changes. This definition provides necessary conditions to determine transition points for a given task-environment scenario that allow a solution trajectory to be planned from known behavior elements. In addition, human gaze behavior during this task contains predictive behavior elements, indicating that the identified control modes are internally modeled. Based on these ideas, a generative, autonomous guidance framework is introduced that efficiently generates optimal dynamic motion behavior in new tasks. The new subgoal planning algorithm is shown to generate solutions to certain tasks more quickly than existing approaches currently used in robotics.

  8. A Mechanistic Model of Human Recall of Social Network Structure and Relationship Affect.

    PubMed

    Omodei, Elisa; Brashears, Matthew E; Arenas, Alex

    2017-12-07

    The social brain hypothesis argues that the need to deal with social challenges was key to our evolution of high intelligence. Research with non-human primates as well as experimental and fMRI studies in humans produce results consistent with this claim, leading to an estimate that human primary groups should consist of roughly 150 individuals. Gaps between this prediction and empirical observations can be partially accounted for using "compression heuristics", or schemata that simplify the encoding and recall of social information. However, little is known about the specific algorithmic processes used by humans to store and recall social information. We describe a mechanistic model of human network recall and demonstrate its sufficiency for capturing human recall behavior observed in experimental contexts. We find that human recall is predicated on accurate recall of a small number of high degree network nodes and the application of heuristics for both structural and affective information. This provides new insight into human memory, social network evolution, and demonstrates a novel approach to uncovering human cognitive operations.

  9. Using Empirical Models for Communication Prediction of Spacecraft

    NASA Technical Reports Server (NTRS)

    Quasny, Todd

    2015-01-01

    A viable communication path to a spacecraft is vital for its successful operation. For human spaceflight, a reliable and predictable communication link between the spacecraft and the ground is essential not only for the safety of the vehicle and the success of the mission, but for the safety of the humans on board as well. However, analytical models of these communication links are challenged by unique characteristics of space and the vehicle itself. For example, effects on radio frequency signals during high-energy solar events, or while traveling through a solar array of a spacecraft, can be difficult to model, and thus to predict. This presentation covers the use of empirical methods for communication link predictions, using the International Space Station (ISS) and its associated historical data as the verification platform and test bed. These empirical methods can then be incorporated into communication prediction and automation tools for the ISS in order to better understand the quality of the communication path given a myriad of variables, including solar array positions, line of sight to satellites, position of the sun, and other dynamic structures on the outside of the ISS. The image on the left below shows the current analytical model of one of the communication systems on the ISS. The image on the right shows a rudimentary empirical model of the same system based on historical archived data from the ISS.

  10. Variable responses of human and non-human primate gut microbiomes to a Western diet.

    PubMed

    Amato, Katherine R; Yeoman, Carl J; Cerda, Gabriela; Schmitt, Christopher A; Cramer, Jennifer Danzy; Miller, Margret E Berg; Gomez, Andres; Turner, Trudy R; Wilson, Brenda A; Stumpf, Rebecca M; Nelson, Karen E; White, Bryan A; Knight, Rob; Leigh, Steven R

    2015-11-16

    The human gut microbiota interacts closely with human diet and physiology. To better understand the mechanisms behind this relationship, gut microbiome research relies on complementing human studies with manipulations of animal models, including non-human primates. However, due to unique aspects of human diet and physiology, it is likely that host-gut microbe interactions operate differently in humans and non-human primates. Here, we show that the human microbiome reacts differently to a high-protein, high-fat Western diet than that of a model primate, the African green monkey, or vervet (Chlorocebus aethiops sabaeus). Specifically, humans exhibit increased relative abundance of Firmicutes and reduced relative abundance of Prevotella on a Western diet while vervets show the opposite pattern. Predictive metagenomics demonstrate an increased relative abundance of genes associated with carbohydrate metabolism in the microbiome of only humans consuming a Western diet. These results suggest that the human gut microbiota has unique properties that are a result of changes in human diet and physiology across evolution or that may have contributed to the evolution of human physiology. Therefore, the role of animal models for understanding the relationship between the human gut microbiota and host metabolism must be re-focused.

  11. Prevalidation of a model for predicting acute neutropenia by colony forming unit granulocyte/macrophage (CFU-GM) assay.

    PubMed

    Pessina, A; Albella, B; Bueren, J; Brantom, P; Casati, S; Gribaldo, L; Croera, C; Gagliardi, G; Foti, P; Parchment, R; Parent-Massin, D; Sibiril, Y; Van Den Heuvel, R

    2001-12-01

    This report describes an international prevalidation study conducted to optimise the Standard Operating Procedure (SOP) for detecting myelosuppressive agents by the CFU-GM assay, and to study a model for predicting (by means of this in vitro hematopoietic assay) the acute xenobiotic exposure levels that cause maximum tolerated decreases in absolute neutrophil counts (ANC). In the first phase of the study (Protocol Refinement), two SOPs were assessed, using two cell culture media (Test A, containing GM-CSF; and Test B, containing G-CSF, GM-CSF, IL-3, IL-6 and SCF), and the two tests were applied to CFU-GM cells from both human (bone marrow and umbilical cord blood) and mouse (bone marrow) sources. In the second phase (Protocol Transfer), the SOPs were transferred to four laboratories to verify the linearity of the assay response and its interlaboratory reproducibility. After a further phase (Protocol Performance), dedicated to a training set of six anticancer drugs (adriamycin, flavopiridol, morpholino-doxorubicin, pyrazoloacridine, taxol and topotecan), a model for predicting neutropenia was verified. Results showed that the assay is linear under SOP conditions, and that the in vitro endpoints used by the clinical prediction model of neutropenia are highly reproducible within and between laboratories. Valid tests represented 95% of all tests attempted. The 90% inhibitory concentration (IC90) values from Test A and Test B accurately predicted the human maximum tolerated dose (MTD) for five of six and four of six myelosuppressive anticancer drugs, respectively, that were selected as prototype xenobiotics. As expected, both tests failed to accurately predict the human MTD of a drug that is a likely protoxicant. It is concluded that Test A offers significant cost advantages compared to Test B, without any loss of performance or predictive accuracy. On the basis of these results, we proposed a formal Phase II validation study using the Test A SOP for 16-18 additional xenobiotics that represent the spectrum of haematotoxic potential.

  12. Developmental validation of the IrisPlex system: determination of blue and brown iris colour for forensic intelligence.

    PubMed

    Walsh, Susan; Lindenbergh, Alexander; Zuniga, Sofia B; Sijen, Titia; de Knijff, Peter; Kayser, Manfred; Ballantyne, Kaye N

    2011-11-01

    The IrisPlex system consists of a highly sensitive multiplex genotyping assay together with a statistical prediction model, providing users with the ability to predict blue and brown human eye colour from DNA samples with over 90% precision. This 'DNA intelligence' system is expected to aid police investigations by providing phenotypic information on unknown individuals when conventional DNA profiling is not informative. Falling within the new area of forensic DNA phenotyping, this paper describes the developmental validation of the IrisPlex assay following the Scientific Working Group on DNA Analysis Methods (SWGDAM) guidelines for the application of DNA-based eye colour prediction to forensic casework. The IrisPlex assay produces complete SNP genotypes with only 31 pg of DNA, approximately six human diploid cell equivalents, and is therefore more sensitive than commercial STR kits currently used in forensics. Species testing revealed human and primate specificity for a complete SNP profile. The assay is capable of producing accurate results from simulated casework samples such as blood, semen, saliva, hair, and trace DNA samples, including extremely low quantity samples. Due to its design, it can also produce full profiles with highly degraded samples often found in forensic casework. Concordance testing between three independent laboratories displayed reproducible results of consistent levels on varying types of simulated casework samples. With such high levels of sensitivity, specificity, consistency and reliability, this genotyping assay, as a core part of the IrisPlex system, operates in accordance with SWGDAM guidelines. Furthermore, as we demonstrated previously, the IrisPlex eye colour prediction system provides reliable results without the need for knowledge of the bio-geographic ancestry of the sample donor. Hence, the IrisPlex system, with its model-based prediction probability estimation of blue and brown human eye colour, represents a useful tool for immediate application in accredited forensic laboratories, to be used for forensic intelligence in tracing unknown individuals from crime scene samples. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  13. Artificial neural network approach to predict surgical site infection after free-flap reconstruction in patients receiving surgery for head and neck cancer.

    PubMed

    Kuo, Pao-Jen; Wu, Shao-Chun; Chien, Peng-Chen; Chang, Shu-Shya; Rau, Cheng-Shyuan; Tai, Hsueh-Ling; Peng, Shu-Hui; Lin, Yi-Chun; Chen, Yi-Chun; Hsieh, Hsiao-Yun; Hsieh, Ching-Hua

    2018-03-02

    The aim of this study was to develop an effective surgical site infection (SSI) prediction model for patients receiving free-flap reconstruction after surgery for head and neck cancer using an artificial neural network (ANN), and to compare its predictive power with that of conventional logistic regression (LR). There were 1,836 patients with 1,854 free-flap reconstructions and 438 postoperative SSIs in the dataset for analysis. They were randomly assigned in a 7:3 ratio to a training set and a test set. Based on comprehensive characteristics of patients and diseases in the absence or presence of operative data, prediction of SSI was performed at two time points (pre-operatively and post-operatively) with a feed-forward ANN and the LR models. In addition to the calculated accuracy, sensitivity, and specificity, the predictive performance of the ANN and LR was assessed based on area under the curve (AUC) measures of receiver operating characteristic curves and the Brier score. The ANN had a significantly higher AUC for post-operative prediction (0.892) and for pre-operative prediction (0.808) than LR (both P < 0.0001). In addition, the ANN's post-operative prediction had a significantly higher AUC than its pre-operative prediction (P < 0.0001). With the highest AUC and the lowest Brier score (0.090), the post-operative prediction by the ANN had the highest overall performance in predicting SSI after free-flap reconstruction in patients receiving surgery for head and neck cancer.
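    Of the two performance measures used above, the Brier score is the simplest to state: the mean squared difference between predicted probabilities and observed 0/1 outcomes, with lower values better. A minimal sketch with hypothetical predictions (not the study's data):

```python
def brier_score(y_true, probs):
    """Mean squared error between predicted probabilities and 0/1 outcomes."""
    return sum((p - y) ** 2 for y, p in zip(y_true, probs)) / len(y_true)

# Hypothetical SSI outcomes (1 = infection) and predicted probabilities.
outcomes = [1, 1, 0, 0]
good_model = [0.9, 0.8, 0.2, 0.1]
uninformative = [0.5, 0.5, 0.5, 0.5]

print(round(brier_score(outcomes, good_model), 3))  # 0.025
print(brier_score(outcomes, uninformative))         # 0.25
```

    An always-0.5 predictor scores 0.25, so the study's post-operative Brier score of 0.090 reflects well-calibrated probabilities.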

  14. Mind-sets matter: a meta-analytic review of implicit theories and self-regulation.

    PubMed

    Burnette, Jeni L; O'Boyle, Ernest H; VanEpps, Eric M; Pollack, Jeffrey M; Finkel, Eli J

    2013-05-01

    This review builds on self-control theory (Carver & Scheier, 1998) to develop a theoretical framework for investigating associations of implicit theories with self-regulation. This framework conceptualizes self-regulation in terms of 3 crucial processes: goal setting, goal operating, and goal monitoring. In this meta-analysis, we included articles that reported a quantifiable assessment of implicit theories and at least 1 self-regulatory process or outcome. Using a random-effects approach, meta-analytic results (total unique N = 28,217; k = 113) across diverse achievement domains (68% academic) and populations (age range = 5-42; 10 different nationalities; 58% from United States; 44% female) demonstrated that implicit theories predict distinct self-regulatory processes, which, in turn, predict goal achievement. Incremental theories, which, in contrast to entity theories, are characterized by the belief that human attributes are malleable rather than fixed, significantly predicted goal setting (performance goals, r = -.151; learning goals, r = .187), goal operating (helpless-oriented strategies, r = -.238; mastery-oriented strategies, r = .227), and goal monitoring (negative emotions, r = -.233; expectations, r = .157). The effects for goal setting and goal operating were stronger in the presence (vs. absence) of ego threats such as failure feedback. Discussion emphasizes how the present theoretical analysis merges an implicit theory perspective with self-control theory to advance scholarship and unlock major new directions for basic and applied research.

  15. Threat and error management for anesthesiologists: a predictive risk taxonomy

    PubMed Central

    Ruskin, Keith J.; Stiegler, Marjorie P.; Park, Kellie; Guffey, Patrick; Kurup, Viji; Chidester, Thomas

    2015-01-01

    Purpose of review Patient care in the operating room is a dynamic interaction that requires cooperation among team members and reliance upon sophisticated technology. Most human factors research in medicine has focused on analyzing errors and implementing system-wide changes to prevent them from recurring. We describe a set of techniques that has been used successfully by the aviation industry to analyze errors and adverse events and explain how these techniques can be applied to patient care. Recent findings Threat and error management (TEM) describes adverse events in terms of risks or challenges that are present in an operational environment (threats) and the actions of specific personnel that potentiate or exacerbate those threats (errors). TEM is a technique widely used in aviation, and can be adapted for use in a medical setting to predict high-risk situations and prevent errors in the perioperative period. A threat taxonomy is a novel way of classifying and predicting the hazards that can occur in the operating room. TEM can be used to identify error-producing situations, analyze adverse events, and design training scenarios. Summary TEM offers a multifaceted strategy for identifying hazards, reducing errors, and training physicians. A threat taxonomy may improve analysis of critical events with subsequent development of specific interventions, and may also serve as a framework for training programs in risk mitigation. PMID:24113268

  16. Spacelab Life Sciences 1 - The stepping stone

    NASA Technical Reports Server (NTRS)

    Dalton, B. P.; Leon, H.; Hogan, R.; Clarke, B.; Tollinger, D.

    1988-01-01

    The Spacelab Life Sciences (SLS-1) mission scheduled for launch in March 1990 will study the effects of microgravity on physiological parameters of humans and animals. The data obtained will guide equipment design, performance of activities involving the use of animals, and prediction of human physiological responses during long-term microgravity exposure. The experiments planned for the SLS-1 mission include a particulate-containment demonstration test, integrated rodent experiments, jellyfish experiments, and validation of the small-mass measuring instrument. The design and operation of the Research Animal Holding Facility, General-Purpose Work Station, General-Purpose Transfer Unit, and Animal Enclosure Module are discussed and illustrated with drawings and diagrams.

  17. Towards a National Space Weather Predictive Capability

    NASA Astrophysics Data System (ADS)

    Fox, N. J.; Lindstrom, K. L.; Ryschkewitsch, M. G.; Anderson, B. J.; Gjerloev, J. W.; Merkin, V. G.; Kelly, M. A.; Miller, E. S.; Sitnov, M. I.; Ukhorskiy, A. Y.; Erlandson, R. E.; Barnes, R. J.; Paxton, L. J.; Sotirelis, T.; Stephens, G.; Comberiate, J.

    2014-12-01

    National needs in the area of space weather informational and predictive tools are growing rapidly. Adverse conditions in the space environment can cause disruption of satellite operations, communications, navigation, and electric power distribution grids, leading to a variety of socio-economic losses and impacts on our security. Future space exploration and most modern human endeavors will require major advances in physical understanding and improved transition of space research to operations. At present, only a small fraction of the latest research and development results from NASA, NOAA, NSF and DoD investments are being used to improve space weather forecasting and to develop operational tools. The power of modern research and space weather model development needs to be better utilized to enable comprehensive, timely, and accurate operational space weather tools. The mere production of space weather information is not sufficient to address the needs of those who are affected by space weather. A coordinated effort is required to support research-to-applications transition efforts and to develop the tools required by those who rely on this information. In this presentation we will review datasets, tools and models that have resulted from research by scientists at JHU/APL, and examine how they could be applied to support space weather applications in coordination with other community assets and capabilities.

  18. Sub-seasonal precipitation during the South Asian summer monsoon onset period

    NASA Astrophysics Data System (ADS)

    Takaya, Y.; Yamaguchi, M.

    2017-12-01

    The South Asian summer monsoon (SASM) has a great impact on human activities (e.g., agriculture and health); thus, skillful prediction of the SASM is highly anticipated. In particular, precipitation amount and the timing of the rainy season onset are of great importance for crop planning. This study examines the performance of precipitation prediction during the onset period of the SASM using the WWRP/WCRP sub-seasonal to seasonal prediction project (S2S) dataset. Preliminary verification of ECMWF model reforecasts against the GSMaP precipitation analysis produced by the Japan Aerospace Exploration Agency (JAXA) shows that the predictive skill for precipitation is reasonably high on sub-seasonal time ranges. It is also found that the predictive skill for precipitation over South Asia is relatively higher around the onset period, consistent with our previous finding using the latest JMA seasonal prediction system (JMA/MRI-CPS2). The results suggest that state-of-the-art operational models have the capability to provide useful SASM onset predictions on a sub-seasonal time scale. In the presentation, we will also discuss the inherent potential predictability, the feasibility of predicting the monsoon onset, and relevant processes.

  19. Spot and Runway Departure Advisor (SARDA)

    NASA Technical Reports Server (NTRS)

    Jung, Yoon

    2016-01-01

    Spot and Runway Departure Advisor (SARDA) is a decision support tool that assists airline ramp controllers and ATC tower controllers in managing traffic on the airport surface, significantly improving the efficiency and predictability of surface operations. The core function of the tool is the runway scheduler, which generates an optimal runway sequence and schedule for departure aircraft that minimizes system delay and maximizes runway throughput. The presentation also discusses the latest status of NASA's current surface research through a collaboration with an airline partner, in which a tool is being developed for airline ramp operators to assist departure pushback operations. The presentation describes the concept of the SARDA tool and results from human-in-the-loop simulations conducted in 2012 for Dallas-Ft. Worth International Airport and in 2014 for the Charlotte airport ramp tower.

  20. [The evaluation and prognosis of the psychophysiological status of a human operator].

    PubMed

    Sukhov, A E; Chaĭchenko, G M

    1989-01-01

    In experiments on 56 healthy subjects (18-20 years old), the quality of performance was measured during a compensatory tracking task under work regimes of increasing difficulty. Depending on task difficulty, five groups of subjects were distinguished, each showing optimum working capacity under one of four working conditions: normal, ordinary work, strenuous work, and a modeled stress situation. It was established that changes in the number of significant correlations between the main parameters of the human operator's psychophysiological state reflect the condition of his functional systems. On the basis of the total range of organization values of both the R-R intervals of the ECG and the duration of expiration, the success of the operator's work under complex conditions of activity is predicted.
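
    The predictor described here, counting significant pairwise correlations among psychophysiological parameters, can be sketched in a few lines. This is a minimal illustration, not the study's actual procedure; the parameter names and the critical value r_crit are hypothetical:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

def count_significant_links(series, r_crit=0.5):
    """Count parameter pairs whose |r| exceeds a critical value.

    series: dict mapping a parameter name (e.g. hypothetical "hr" for
    heart rate, "expiration" for expiration duration) to its samples.
    """
    names = list(series)
    count = 0
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            if abs(pearson_r(series[names[i]], series[names[j]])) >= r_crit:
                count += 1
    return count
```

    A change in this count across work regimes would be the signal the abstract describes; the choice of r_crit would in practice come from a significance test at the sample size used.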

  1. Accuracy of simple biochemical tests in identifying liver fibrosis in patients co-infected with human immunodeficiency virus and hepatitis C virus.

    PubMed

    Tural, Cristina; Tor, Jordi; Sanvisens, Arantza; Pérez-Alvarez, Núria; Martínez, Elisenda; Ojanguren, Isabel; García-Samaniego, Javier; Rockstroh, Juergen; Barluenga, Eva; Muga, Robert; Planas, Ramon; Sirera, Guillem; Rey-Joly, Celestino; Clotet, Bonaventura

    2009-03-01

    We assessed the ability of 3 simple biochemical tests to stage liver fibrosis in patients co-infected with human immunodeficiency virus (HIV) and hepatitis C virus (HCV). We analyzed liver biopsy samples from 324 consecutive HIV/HCV-positive patients (72% men; mean age, 38 y; mean CD4+ T-cell count, 548 cells/mm(3)). Scheuer fibrosis scores were as follows: 30% had F0, 22% had F1, 19% had F2, 23% had F3, and 6% had F4. Logistic regression analyses were used to predict the probability of significant (≥F2) or advanced (≥F3) fibrosis, based on numeric scores from the APRI, FORNS, or FIB-4 tests (alone and in combination). Areas under the receiver operating characteristic curves were analyzed to assess diagnostic performance. These analyses indicated that the 3 tests had similar abilities to identify F2 and F3; the performance of APRI, FORNS, and FIB-4 was as follows: ≥F2: 0.72, 0.67, and 0.72, respectively; ≥F3: 0.75, 0.73, and 0.78, respectively. The accuracy of each test in predicting which samples were ≥F3 was significantly higher than for ≥F2 (APRI, FORNS, and FIB-4: ≥F3: 75%, 76%, and 76%, respectively; ≥F2: 66%, 62%, and 68%, respectively). Using the lowest cut-off values for all 3 tests, ≥F3 was ruled out with sensitivity and negative predictive values of 79% to 94% and 87% to 91%, respectively, and 47% to 70% accuracy. Advanced liver fibrosis (≥F3) was identified using the highest cut-off value, with specificity and positive predictive values of 90% to 96% and 63% to 73%, respectively, and 75% to 77% accuracy. Simple biochemical tests accurately predicted liver fibrosis in more than half the HIV/HCV co-infected patients. The absence and presence of liver fibrosis are predicted fairly well using the lowest and highest cut-off levels, respectively.
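
    Two of the indices evaluated here, APRI and FIB-4, are simple closed-form scores computed from routine laboratory values. A minimal sketch of their standard formulas follows; the input values are illustrative, not the study's data, and the cut-off values for ruling fibrosis in or out are study-specific:

```python
import math

def apri(ast, ast_uln, platelets):
    """AST-to-Platelet Ratio Index.

    ast: serum AST (IU/L); ast_uln: upper limit of normal for AST;
    platelets: platelet count in 10^9/L.
    """
    return (ast / ast_uln) * 100 / platelets

def fib4(age, ast, alt, platelets):
    """FIB-4 index: (age x AST) / (platelets x sqrt(ALT)).

    age in years; AST and ALT in IU/L; platelets in 10^9/L.
    """
    return (age * ast) / (platelets * math.sqrt(alt))
```

    For example, apri(80, 40, 200) gives 1.0 and fib4(38, 80, 64, 200) gives 1.9; whether such values rule advanced fibrosis in or out depends on the cohort-specific thresholds reported in the paper.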

  2. Design of experiments to optimize an in vitro cast to predict human nasal drug deposition.

    PubMed

    Shah, Samir A; Dickens, Colin J; Ward, David J; Banaszek, Anna A; George, Chris; Horodnik, Walter

    2014-02-01

    Previous studies showed that nasal spray in vitro tests cannot predict in vivo deposition, pharmacokinetics, or pharmacodynamics. This challenge makes it difficult to assess the deposition achieved with new technologies delivering to the therapeutically beneficial posterior nasal cavity. In this study, we determined the best parameters for using a regionally divided nasal cast to predict deposition. Our study used a model suspension and a design of experiments to produce repeatable deposition results that mimic nasal deposition patterns of nasal suspensions from the literature. The seven-section nylon nasal cast (nozzle locator, nasal vestibule, front turbinate, rear turbinate, olfactory region, nasopharynx, and throat filter) was based on computed tomography images of healthy humans. It was coated with a glycerol/Brij-35 solution to mimic mucus. After the cast was assembled and oriented, airflow was applied and a nasal spray containing a model suspension was actuated. After the cast was disassembled, the drug deposited in each section was assayed by HPLC. The success criteria for optimal settings were based on nine in vivo studies in the literature. The design of experiments included exploratory and half-factorial screening experiments to identify the variables affecting deposition (angles, airflow, and airflow time), optimization experiments, and then repeatability and reproducibility experiments. We found that tilt angle and airflow time after actuation affected deposition the most. The optimized settings were a flow rate of 16 L/min, a post-actuation flow time of 12 sec, a tilt angle of 23°, nozzle angles of 0°, and an actuation speed of 5 cm/sec. Neither cast nor operator caused significant variation in results. We determined cast parameters that produce results resembling those for suspension nasal sprays in the literature. The results were repeatable and unaffected by operator or cast. These nasal spray parameters could be used to assess deposition from new devices or formulations. For human deposition studies using radiolabeled formulations, this cast could show that radiolabel deposition represents drug deposition. Our methods could also be used to optimize settings for other casts.
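
    A half-factorial (fractional factorial) screening design like the one used in this study halves the number of runs by aliasing one factor with the product of the others. A minimal sketch of generating such a design, with hypothetical factor names standing in for the cast's settings:

```python
from itertools import product

def half_factorial(factors):
    """2^(k-1) fractional factorial design: the last factor is aliased
    with the product of the others (defining relation I = AB...K).
    Levels are coded -1 (low) and +1 (high)."""
    k = len(factors)
    runs = []
    for levels in product((-1, +1), repeat=k - 1):
        last = 1
        for v in levels:
            last *= v                      # alias: last = product of the rest
        runs.append(dict(zip(factors, levels + (last,))))
    return runs

# hypothetical factor names for a nasal-cast screening study
runs = half_factorial(["tilt_angle", "nozzle_angle", "flow_rate", "flow_time"])
```

    For four factors this yields 8 runs instead of 16, at the cost of confounding the aliased factor with the three-way interaction; the actual factors and levels screened in the study would differ.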

  3. Facial expression judgments support a socio-relational model, rather than a negativity bias model of political psychology.

    PubMed

    Vigil, Jacob M; Strenth, Chance

    2014-06-01

    Self-reported opinions and judgments may be more rooted in expressive biases than in cognitive processing biases, and ultimately operate within a broader behavioral style for advertising the capacity (versus the trustworthiness) dimension of human reciprocity potential. Our analyses of facial expression judgments of likely voters are consistent with this thesis, and directly contradict one major prediction of the authors' "negativity-bias" model.

  4. Publications in acoustics and noise control from the NASA Langley Research Center during 1940-1976

    NASA Technical Reports Server (NTRS)

    Fryer, B. A. (Compiler)

    1977-01-01

    Reference lists are presented of published research papers in various areas of acoustics and noise control for the period 1940-1976. The references are listed chronologically and are grouped under the following general headings: (1) Duct acoustics; (2) propagation and operations; (3) rotating blade noise; (4) jet noise; (5) sonic boom; (6) flow-surface interaction noise; (7) human response; (8) structural response; (9) prediction; and (10) miscellaneous.

  5. The Structure of Processing Resource Demands in Monitoring Automatic Systems.

    DTIC Science & Technology

    1981-01-01

    Attempts at modelling the human failure detection process have continually focused on normative predictions of optimal operator behavior (Smallwood). Theories of selective attention are reviewed, ranging from Broadbent's filter model (Broadbent, 1957), to Treisman's attenuation model (Treisman, 1964), to Norman's late selection model (Norman, 1968).

  6. An assigned responsibility system for robotic teleoperation control.

    PubMed

    Small, Nicolas; Lee, Kevin; Mann, Graham

    2018-01-01

    This paper proposes an architecture that explores a gap in the spectrum of existing strategies for robot control mode switching in adjustable autonomy. In situations where the environment is reasonably known and/or predictable, pre-planning these control changes could relieve robot operators of the additional task of deciding when and how to switch. Such a strategy provides a clear division of labour between the automation and the human operator(s) before the job even begins, allowing for individual responsibilities to be known ahead of time, limiting confusion and allowing rest breaks to be planned. Assigned Responsibility is a new form of adjustable autonomy-based teleoperation that allows the selective inclusion of automated control elements at key stages of a robot operation plan's execution. Progression through these stages is controlled by automatic goal accomplishment tracking. An implementation is evaluated through engineering tests and a usability study, demonstrating the viability of this approach and offering insight into its potential applications.

  7. Autonomous calibration of single spin qubit operations

    NASA Astrophysics Data System (ADS)

    Frank, Florian; Unden, Thomas; Zoller, Jonathan; Said, Ressa S.; Calarco, Tommaso; Montangero, Simone; Naydenov, Boris; Jelezko, Fedor

    2017-12-01

    Fully autonomous precise control of qubits is crucial for quantum information processing, quantum communication, and quantum sensing applications. It requires the ability to model, predict, and anticipate the quantum dynamics, and to precisely control and calibrate single-qubit operations, with minimal human intervention. Here, we demonstrate single-qubit autonomous calibrations via closed-loop optimisations of electron spin quantum operations in diamond. The operations are examined by quantum state and process tomographic measurements at room temperature, and their performance against systematic errors is iteratively rectified by an optimal pulse engineering algorithm. We achieve an autonomously calibrated fidelity of up to 1.00 on a time scale of minutes for a spin population inversion and up to 0.98 on a time scale of hours for a single-qubit π/2-rotation, within the experimental error of 2%. These results manifest the full potential of versatile quantum technologies.

  8. Human operator response to error-likely situations in complex engineering systems

    NASA Technical Reports Server (NTRS)

    Morris, Nancy M.; Rouse, William B.

    1988-01-01

    The causes of human error in complex systems are examined. First, a conceptual framework is provided in which two broad categories of error are discussed: errors of action, or slips, and errors of intention, or mistakes. Conditions in which slips and mistakes might be expected to occur are identified, based on existing theories of human error. Regarding the role of workload, it is hypothesized that workload may act as a catalyst for error. Two experiments are presented in which humans' responses to error-likely situations were examined. Subjects controlled PLANT under a variety of conditions and periodically provided subjective ratings of mental effort. A complex pattern of results was obtained, which was not consistent with predictions. Generally, the results of this research indicate that: (1) humans respond to conditions in which errors might be expected by attempting to reduce the possibility of error, and (2) adaptation to conditions is a potent influence on human behavior in discretionary situations. Subjects' explanations for changes in effort ratings are also explored.

  9. Artificial neural network approach to predict surgical site infection after free-flap reconstruction in patients receiving surgery for head and neck cancer

    PubMed Central

    Kuo, Pao-Jen; Wu, Shao-Chun; Chien, Peng-Chen; Chang, Shu-Shya; Rau, Cheng-Shyuan; Tai, Hsueh-Ling; Peng, Shu-Hui; Lin, Yi-Chun; Chen, Yi-Chun; Hsieh, Hsiao-Yun; Hsieh, Ching-Hua

    2018-01-01

    Background The aim of this study was to develop an effective surgical site infection (SSI) prediction model in patients receiving free-flap reconstruction after surgery for head and neck cancer using an artificial neural network (ANN), and to compare its predictive power with that of conventional logistic regression (LR). Materials and methods There were 1,836 patients with 1,854 free-flap reconstructions and 438 postoperative SSIs in the dataset for analysis. They were randomly assigned in a ratio of 7:3 into a training set and a test set. Based on comprehensive characteristics of patients and diseases in the absence or presence of operative data, prediction of SSI was performed at two time points (pre-operatively and post-operatively) with a feed-forward ANN and the LR models. In addition to the calculated accuracy, sensitivity, and specificity, the predictive performance of ANN and LR was assessed based on area under the curve (AUC) measures of receiver operating characteristic curves and the Brier score. Results ANN had a significantly higher AUC for post-operative prediction (0.892) and pre-operative prediction (0.808) than LR (both p<0.0001). In addition, the AUC of post-operative prediction by ANN was significantly higher than that of pre-operative prediction (p<0.0001). With the highest AUC and the lowest Brier score (0.090), the post-operative prediction by ANN had the highest overall predictive performance. Conclusion The post-operative prediction by ANN had the highest overall performance in predicting SSI after free-flap reconstruction in patients receiving surgery for head and neck cancer. PMID:29568393
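
    The two evaluation metrics compared in this study, AUC and the Brier score, can both be computed directly from predicted probabilities. A minimal sketch of both (the labels and scores below are illustrative, not the study's data):

```python
def auc(labels, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity:
    the probability that a random positive outranks a random negative,
    with ties counted as half."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def brier(labels, probs):
    """Brier score: mean squared error of predicted probabilities
    against 0/1 outcomes (lower is better)."""
    return sum((p - y) ** 2 for y, p in zip(labels, probs)) / len(labels)
```

    AUC measures discrimination (ranking) only, while the Brier score also rewards calibration, which is why the study reports both.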

  10. Development of a real-time prediction model of driver behavior at intersections using kinematic time series data.

    PubMed

    Tan, Yaoyuan V; Elliott, Michael R; Flannagan, Carol A C

    2017-09-01

    As connected autonomous vehicles (CAVs) enter the fleet, there will be a long period when these vehicles will have to interact with human drivers. One of the challenges for CAVs is that human drivers do not communicate their decisions well. Fortunately, the kinematic behavior of a human-driven vehicle may be a good predictor of driver intent within a short time frame. We analyzed the kinematic time series data (e.g., speed) for a set of drivers making left turns at intersections to predict whether the driver would stop before executing the turn. We used principal components analysis (PCA) to generate independent dimensions that explain the variation in vehicle speed before a turn. These dimensions remained relatively consistent throughout the maneuver, allowing us to compute independent scores on these dimensions for different time windows throughout the approach to the intersection. We then linked these PCA scores to whether a driver would stop before executing a left turn using random intercept Bayesian additive regression trees. Five more road and observable vehicle characteristics were included to enhance prediction. Our model achieved an area under the receiver operating characteristic curve (AUC) of 0.84 at 94 m from the center of an intersection, steadily increasing to 0.90 by 46 m from the center. Copyright © 2017 Elsevier Ltd. All rights reserved.
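
    The PCA step described in this abstract can be sketched with a plain power-iteration implementation that extracts the leading component of a set of speed profiles and projects a new profile onto it. This is a minimal sketch; the toy profiles are illustrative, not the study's data:

```python
def first_pc(profiles, iters=200):
    """First principal component of mean-centred profiles, found by
    power iteration on the sample covariance matrix."""
    n, d = len(profiles), len(profiles[0])
    mean = [sum(p[j] for p in profiles) / n for j in range(d)]
    X = [[p[j] - mean[j] for j in range(d)] for p in profiles]
    # sample covariance matrix (d x d)
    C = [[sum(X[i][a] * X[i][b] for i in range(n)) / (n - 1)
          for b in range(d)] for a in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(C[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v, mean

def score(profile, v, mean):
    """PCA score: project one approach profile onto the component."""
    return sum((p - m) * c for p, m, c in zip(profile, mean, v))
```

    In the study these per-window scores, not the raw speeds, feed the downstream Bayesian additive regression trees classifier.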

  11. An Illumination Modeling System for Human Factors Analyses

    NASA Technical Reports Server (NTRS)

    Huynh, Thong; Maida, James C.; Bond, Robert L. (Technical Monitor)

    2002-01-01

    Seeing is critical to human performance. Lighting is critical for seeing. Therefore, lighting is critical to human performance. This is common sense, and here on Earth it is easily taken for granted. However, on orbit, because the sun rises or sets every 45 minutes on average, humans working in space must cope with extremely dynamic lighting conditions. Contrast conditions, with harsh shadowing and glare, are also severe. The prediction of lighting conditions for critical operations is essential. Crew training can factor lighting into lesson plans when necessary. Mission planners can determine whether low-light video cameras are required or whether additional luminaires need to be flown. Optimization of the quantity and quality of light is needed because of its effects on crew safety, electrical power, and equipment maintainability. To address all of these issues, an illumination modeling system has been developed by the Graphics Research and Analyses Facility (GRAF) and Lighting Environment Test Facility (LETF) in the Space Human Factors Laboratory at NASA Johnson Space Center. The system uses physically based ray tracing software (Radiance) developed at Lawrence Berkeley Laboratories, a human factors oriented geometric modeling system (PLAID) and an extensive database of humans and environments. Material reflectivity properties of major and critical surfaces are measured using a gonio-reflectometer. Luminaires (lights) are measured for beam spread distribution, color, and intensity. Video camera performance is measured for color and light sensitivity. 3D geometric models of humans and the environment are combined with the material and light models to form a system capable of predicting lighting and visibility conditions in space.

  12. A new theoretical approach to the functional meaning of sleep and dreaming in humans based on the maintenance of ‘predictive psychic homeostasis’

    PubMed Central

    Barlow, Peter W.; Baluška, František; Tonin, Paolo; Guescini, Michele; Leo, Giuseppina; Fuxe, Kjell

    2011-01-01

    Different theories have been put forward during the last decade to explain the functional meaning of sleep and dreaming in humans. In the present paper, a new theory is presented which, while taking advantage of these earlier theories, introduces the following new and original aspects:
    • Circadian rhythms relevant to various organs of the body affect the reciprocal interactions which operate to maintain constancy of the internal milieu and thereby also affect the sleep/wakefulness cycle. Particular attention is given to the constancy of natraemia and osmolarity and to the permissive role that the evolution of renal function has had for the evolution of the central nervous system and its integrative actions.
    • The resetting of neuro-endocrine controls at the onset of wakefulness leads to the acquisition of new information and its integration within previously stored memories. This point is dealt with in relation to Moore-Ede’s proposal for the existence of a ’predictive homeostasis’.
    • The concept of ‘psychic homeostasis’ is introduced and is considered one of the most important states, since it is aimed at the well-being, or eudemonia, of the human psyche.
    Sleep and dreaming in humans are discussed as important functions for the maintenance of a newly proposed composite state: that of ‘predictive psychic homeostasis’. On the basis of these assumptions, and in accordance with the available neurobiological data, the present paper puts forward the novel hypothesis that sleep and dreaming play important functions in humans by compensating for psychic allostatic overloads. Hence, both consolatory dreams and disturbing nightmares can be part of the vis medicatrix naturae, the natural healing power, in this case, the state of eudemonia. PMID:22448302

  13. Modified Petri net model sensitivity to workload manipulations

    NASA Technical Reports Server (NTRS)

    White, S. A.; Mackinnon, D. P.; Lyman, J.

    1986-01-01

    Modified Petri Nets (MPNs) are investigated as a workload modeling tool. The results of an exploratory study of the sensitivity of MPNs to workload manipulations in a dual task are described. Petri nets have been used to represent systems with asynchronous, concurrent and parallel activities (Peterson, 1981). These characteristics led some researchers to suggest the use of Petri nets in workload modeling, where concurrent and parallel activities are common. Petri nets are represented by places and transitions. In the workload application, places represent operator activities and transitions represent events. MPNs have been used to formally represent task events and activities of a human operator in a man-machine system. Some descriptive applications demonstrate the usefulness of MPNs in the formal representation of systems. It is the general hypothesis herein that, in addition to descriptive applications, MPNs may be useful for workload estimation and prediction. The results are reported of the first of a series of experiments designed to develop and test an MPN system of workload estimation and prediction. This first experiment is a screening test of the MPN model's general sensitivity to changes in workload. Positive results from this experiment will justify the more complicated analyses and techniques necessary for developing a workload prediction system.
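
    The place/transition mechanics that MPNs build on can be sketched in a few lines. This is a minimal illustration of an ordinary Petri net, not the modified formalism of the study; the place and transition names are hypothetical stand-ins for operator activities and task events:

```python
class PetriNet:
    """Minimal place/transition net: a transition is enabled when every
    input place holds a token; firing consumes one token from each input
    place and produces one in each output place."""

    def __init__(self, marking):
        self.marking = dict(marking)   # place name -> token count
        self.transitions = {}          # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (list(inputs), list(outputs))

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) > 0 for p in inputs)

    def fire(self, name):
        inputs, outputs = self.transitions[name]
        if not self.enabled(name):
            raise ValueError(f"{name} is not enabled")
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# hypothetical workload fragment: an idle operator picks up a waiting task
net = PetriNet({"operator_idle": 1, "task_waiting": 1})
net.add_transition("start_task", ["operator_idle", "task_waiting"], ["task_active"])
```

    In a workload model, concurrent activities are simply tokens in different places; two transitions with disjoint input places can fire independently, which is the property that motivated the Petri-net approach.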

  14. Drug repositioning for enzyme modulator based on human metabolite-likeness.

    PubMed

    Lee, Yoon Hyeok; Choi, Hojae; Park, Seongyong; Lee, Boah; Yi, Gwan-Su

    2017-05-31

    Recently, the metabolite-likeness of the drug space has emerged and has opened a new possibility for exploring human metabolite-like candidates in drug discovery. However, the applicability of metabolite-likeness in drug discovery has been largely unexplored. Moreover, there are no reports on its applications for the repositioning of drugs to possible enzyme modulators, although enzyme-drug relations could be directly inferred from the similarity relationships between enzymes' metabolites and drugs. We constructed a drug-metabolite structural similarity matrix, which contains 1,861 FDA-approved drugs and 1,110 human intermediary metabolites scored with the Tanimoto similarity. To verify the metabolite-likeness measure for drug repositioning, we analyzed 17 known antimetabolite drugs that resemble the innate metabolites of their eleven target enzymes as the gold standard positives. Highly scored drugs were selected as possible modulators of enzymes for their corresponding metabolites. Then, we assessed the performance of metabolite-likeness with a receiver operating characteristic analysis and compared it with other drug-target prediction methods. We set the similarity threshold for drug repositioning candidates of new enzyme modulators based on maximization of Youden's index. We also carried out literature surveys to support the drug repositioning results based on metabolite-likeness. In this paper, we applied metabolite-likeness to repurpose FDA-approved drugs to disease-associated enzyme modulators that resemble human innate metabolites. All antimetabolite drugs were mapped to their 11 known target enzymes with statistically significant similarity values to the corresponding metabolites. The comparison with other drug-target prediction methods showed that metabolite-likeness performed better in predicting enzyme modulators. Drugs scoring higher than the similarity threshold of 0.654 were then selected as possible modulators of enzymes for their corresponding metabolites. In addition, we showed that the drug repositioning results for 10 enzymes were concordant with the literature evidence. This study introduced a method to predict the repositioning of known drugs as possible modulators of disease-associated enzymes using human metabolite-likeness. We demonstrated that this approach works correctly with known antimetabolite drugs and showed that the proposed method has better performance than other drug-target prediction methods in terms of enzyme modulator prediction. As a proof of concept, this study showed how to apply metabolite-likeness to drug repositioning, as well as its potential for further expansion as more disease-associated metabolite-target protein relations are acquired.
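
    The two building blocks of this screening pipeline, Tanimoto similarity between structural fingerprints and threshold selection by maximizing Youden's index, can be sketched directly. The fingerprints and scores below are illustrative; 0.654 is the threshold the authors report for their data, not a general constant:

```python
def tanimoto(a, b):
    """Tanimoto coefficient between two fingerprint bit sets:
    |intersection| / |union|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def youden_threshold(scores, labels):
    """Pick the score cut-off maximizing Youden's J = sensitivity +
    specificity - 1, given gold-standard 0/1 labels."""
    best_t, best_j = None, -1.0
    for t in sorted(set(scores)):
        tp = sum(s >= t and y for s, y in zip(scores, labels))
        fn = sum(s < t and y for s, y in zip(scores, labels))
        tn = sum(s < t and not y for s, y in zip(scores, labels))
        fp = sum(s >= t and not y for s, y in zip(scores, labels))
        j = tp / (tp + fn) + tn / (tn + fp) - 1
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j
```

    In the paper the scores would be drug-metabolite Tanimoto similarities and the labels the known antimetabolite/target relations; drugs scoring above the chosen cut-off become repositioning candidates.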

  15. Role of Educational Strategies for Human Resources in Green Infrastructure Operation and Maintenance

    NASA Astrophysics Data System (ADS)

    Ebrahimi, G.; Thurm, B.; Öberg, G.

    2014-12-01

    Rainwater harvesting and water reuse are receiving increasing attention as they hold the potential to effectively improve water conservation efforts. While many technical solutions have been developed, alternative water systems in built environments face significant challenges in the implementation and operational phases. The aim of this study is to examine obstacles to the implementation of alternative water systems in practice and identify criteria for feasible and sustainable solutions that allow bypassing of the identified obstacles. Interviews were conducted with planners, system designers and operators to find out which factors central actors believe influence successful implementation of such systems. The results were analyzed in light of the literature. The actual performance of the water harvesting and reuse systems in four recently built green buildings in the Province of British Columbia, Canada was analyzed against the predicted outcome, according to the criteria identified in the interviews. It was found that the major obstacle to success is that the practical challenges involved in implementing alternative systems are underestimated. For example, education strategies for operational staff are not developed, and the staff are left floundering. This study highlights the importance of recognizing the need for strategic and directed educational programs for the human resources involved in operating and maintaining rainwater harvesting and water reuse systems.

  16. Signed reward prediction errors drive declarative learning

    PubMed Central

    Naert, Lien; Janssens, Clio; Talsma, Durk; Van Opstal, Filip; Verguts, Tom

    2018-01-01

    Reward prediction errors (RPEs) are thought to drive learning. This has been established in procedural learning (e.g., classical and operant conditioning). However, empirical evidence on whether RPEs drive declarative learning–a quintessentially human form of learning–remains surprisingly absent. We therefore coupled RPEs to the acquisition of Dutch-Swahili word pairs in a declarative learning paradigm. Signed RPEs (SRPEs; “better-than-expected” signals) during declarative learning improved recognition in a follow-up test, with increasingly positive RPEs leading to better recognition. In addition, classic declarative memory mechanisms such as time-on-task failed to explain recognition performance. The beneficial effect of SRPEs on recognition was subsequently affirmed in a replication study with visual stimuli. PMID:29293493

  17. Signed reward prediction errors drive declarative learning.

    PubMed

    De Loof, Esther; Ergo, Kate; Naert, Lien; Janssens, Clio; Talsma, Durk; Van Opstal, Filip; Verguts, Tom

    2018-01-01

    Reward prediction errors (RPEs) are thought to drive learning. This has been established in procedural learning (e.g., classical and operant conditioning). However, empirical evidence on whether RPEs drive declarative learning-a quintessentially human form of learning-remains surprisingly absent. We therefore coupled RPEs to the acquisition of Dutch-Swahili word pairs in a declarative learning paradigm. Signed RPEs (SRPEs; "better-than-expected" signals) during declarative learning improved recognition in a follow-up test, with increasingly positive RPEs leading to better recognition. In addition, classic declarative memory mechanisms such as time-on-task failed to explain recognition performance. The beneficial effect of SRPEs on recognition was subsequently affirmed in a replication study with visual stimuli.

  18. Design of an Adaptive Human-Machine System Based on Dynamical Pattern Recognition of Cognitive Task-Load.

    PubMed

    Zhang, Jianhua; Yin, Zhong; Wang, Rubin

    2017-01-01

    This paper developed a cognitive task-load (CTL) classification algorithm and allocation strategy to sustain the optimal operator CTL levels over time in safety-critical human-machine integrated systems. An adaptive human-machine system is designed based on a non-linear dynamic CTL classifier, which maps a set of electroencephalogram (EEG) and electrocardiogram (ECG) related features to a few CTL classes. The least-squares support vector machine (LSSVM) is used as dynamic pattern classifier. A series of electrophysiological and performance data acquisition experiments were performed on seven volunteer participants under a simulated process control task environment. The participant-specific dynamic LSSVM model is constructed to classify the instantaneous CTL into five classes at each time instant. The initial feature set, comprising 56 EEG and ECG related features, is reduced to a set of 12 salient features (including 11 EEG-related features) by using the locality preserving projection (LPP) technique. An overall correct classification rate of about 80% is achieved for the 5-class CTL classification problem. Then the predicted CTL is used to adaptively allocate the number of process control tasks between operator and computer-based controller. Simulation results showed that the overall performance of the human-machine system can be improved by using the adaptive automation strategy proposed.
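
    The adaptive allocation step can be illustrated with a deliberately simple rule. This is a hypothetical sketch, not the paper's allocation strategy: it only captures the idea that the higher the operator's predicted cognitive task-load (CTL) class, the more of the process control tasks are shifted to the computer-based controller:

```python
def allocate_tasks(ctl_class, total_tasks, max_class=5):
    """Hypothetical allocation rule: map the predicted CTL class
    (1 = lowest load .. max_class = highest load) to the fraction of
    tasks handed to the computer-based controller.

    Returns (tasks kept by the operator, tasks given to the computer).
    """
    frac_auto = (ctl_class - 1) / (max_class - 1)   # 0.0 .. 1.0
    to_computer = round(total_tasks * frac_auto)
    return total_tasks - to_computer, to_computer
```

    A real implementation would re-run the LSSVM classifier at each time instant and reallocate accordingly; the linear mapping here is only one of many possible policies.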

  19. The AFHSC-Division of GEIS Operations Predictive Surveillance Program: a multidisciplinary approach for the early detection and response to disease outbreaks

    PubMed Central

    2011-01-01

    The Armed Forces Health Surveillance Center, Division of Global Emerging Infections Surveillance and Response System Operations (AFHSC-GEIS) initiated a coordinated, multidisciplinary program to link data sets and information derived from eco-climatic remote sensing activities, ecologic niche modeling, arthropod vector, animal disease-host/reservoir, and human disease surveillance for febrile illnesses, into a predictive surveillance program that generates advisories and alerts on emerging infectious disease outbreaks. The program’s ultimate goal is pro-active public health practice through pre-event preparedness, prevention and control, and response decision-making and prioritization. This multidisciplinary program is rooted in over 10 years’ experience in predictive surveillance for Rift Valley fever outbreaks in Eastern Africa. The AFHSC-GEIS Rift Valley fever project is based on the identification and use of disease-emergence critical detection points as reliable signals for increased outbreak risk. The AFHSC-GEIS predictive surveillance program has formalized the Rift Valley fever project into a structured template for extending predictive surveillance capability to other Department of Defense (DoD)-priority vector- and water-borne, and zoonotic diseases and geographic areas. These include leishmaniasis, malaria, and Crimean-Congo and other viral hemorrhagic fevers in Central Asia and Africa, dengue fever in Asia and the Americas, Japanese encephalitis (JE) and chikungunya fever in Asia, and rickettsial and other tick-borne infections in the U.S., Africa and Asia. PMID:21388561

  20. Practical Applications of Cosmic Ray Science: Spacecraft, Aircraft, Ground-Based Computation and Control Systems, and Human Health and Safety

    NASA Technical Reports Server (NTRS)

    Atwell, William; Koontz, Steve; Normand, Eugene

    2012-01-01

    Three twentieth-century technological developments, 1) high-altitude commercial and military aircraft; 2) manned and unmanned spacecraft; and 3) increasingly complex and sensitive solid-state microelectronics systems, have driven an ongoing evolution of basic cosmic ray science into a set of practical engineering tools needed to design, test, and verify the safety and reliability of modern complex technological systems. The effects of primary cosmic ray particles, and of secondary particle showers produced by nuclear reactions with the atmosphere, can determine the design and verification processes (as well as the total dollar cost) for manned and unmanned spacecraft avionics systems. Similar considerations apply to commercial and military aircraft operating at high latitudes and altitudes near the atmospheric Pfotzer maximum. Even ground-based computational and control systems can be negatively affected by secondary particle showers at the Earth's surface, especially if the net target area of the sensitive electronic system components is large. Finally, accumulation of radiation dose from both primary cosmic rays and secondary cosmic-ray-induced particle showers is an important health and safety consideration for commercial or military air crews operating at high altitude/latitude and is also one of the most important factors presently limiting manned space flight operations beyond low-Earth orbit (LEO). In this paper we review the discovery of cosmic ray effects on the performance and reliability of microelectronic systems, as well as on human health, and the development of the engineering and health science tools used to evaluate and mitigate cosmic ray effects in ground-based, atmospheric flight, and space flight environments. Ground test methods applied to microelectronic components and systems are used in combination with radiation transport and reaction codes to predict the performance of microelectronic systems in their operating environments.
Similar radiation transport codes are used to evaluate possible human health effects of cosmic ray exposure; however, the health effects are based on worst-case analysis and extrapolation from a very limited human exposure database combined with some limited experimental animal data. Finally, the limitations on human space operations beyond low-Earth orbit imposed by long-term exposure to galactic cosmic rays are discussed.

  1. A spatial method to calculate small-scale fisheries effort in data poor scenarios.

    PubMed

    Johnson, Andrew Frederick; Moreno-Báez, Marcia; Giron-Nava, Alfredo; Corominas, Julia; Erisman, Brad; Ezcurra, Exequiel; Aburto-Oropeza, Octavio

    2017-01-01

    To gauge the collateral impacts of fishing we must know where fishing boats operate and how much they fish. Although small-scale fisheries land approximately the same amount of fish for human consumption as industrial fleets globally, methods of estimating their fishing effort are comparatively poor. We present an accessible, spatial method of calculating the effort of small-scale fisheries based on two simple measures that are available, or at least easily estimated, in even the most data-poor fisheries: the number of boats and the local coastal human population. We illustrate the method using a small-scale fisheries case study from the Gulf of California, Mexico, and show that our measure of Predicted Fishing Effort (PFE), measured as the number of boats operating in a given area per day adjusted by the number of people in local coastal populations, can accurately predict fisheries landings in the Gulf. Comparing our values of PFE to commercial fishery landings throughout the Gulf also indicates that the current number of small-scale fishing boats in the Gulf is approximately double what is required to land theoretical maximum fish biomass. Our method is fishery-type independent and can be used to quantitatively evaluate the efficacy of growth in small-scale fisheries. This new method provides an important first step towards estimating the fishing effort of small-scale fleets globally.
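
The population-adjusted effort measure can be sketched as follows; the functional form and the reference population are illustrative assumptions, not the paper's fitted model:

```python
# A minimal sketch of the Predicted Fishing Effort (PFE) idea: boats
# operating in an area per day, adjusted by the size of the local coastal
# population. The scaling form and reference_population are illustrative
# assumptions; the paper's actual model is fit to Gulf of California data.

def predicted_fishing_effort(boats_per_day: float, coastal_population: int,
                             reference_population: int = 10_000) -> float:
    """Boat-days scaled by local population relative to a reference size."""
    return boats_per_day * (coastal_population / reference_population)

# Two areas with equal boat counts but different nearby populations:
print(predicted_fishing_effort(50, 20_000))  # 100.0
print(predicted_fishing_effort(50, 5_000))   # 25.0
```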

  2. A Cognitive-System Model for En Route Air Traffic Management

    NASA Technical Reports Server (NTRS)

    Corker, Kevin M.; Pisanich, Gregory; Lebacqz, J. Victor (Technical Monitor)

    1998-01-01

    NASA Ames Research Center has been engaged in the development of advanced air traffic management technologies whose basic form is cognitive aiding systems for air traffic controller and flight deck operations. In the design and evaluation of such systems, the dynamic interaction between the airborne aiding system and the ground-based aiding systems forms a critical coupling for control. The human operator is an integral control element in the system, and the optimal integration of human decision and performance parameters with those of the automation aiding systems offers a significant challenge to cognitive engineering. This paper presents a study in full mission simulation and the development of a predictive computational model of human performance. We have found that this combination of methodologies provides a powerful design-aiding process. We have extended the computational model Man-Machine Integrated Design and Analysis System (MIDAS) to include representation of multiple cognitive agents (both human operators and intelligent aiding systems) operating aircraft, airline operations centers, and air traffic control centers in the evolving airspace. The demands of this application require the representation of many intelligent agents sharing world models and coordinating action/intention with cooperative scheduling of goals and actions in a potentially unpredictable world of operations. The operator's activity structures have been developed to include prioritization and interruption of multiple parallel activities among multiple operators, to provide for anticipation (knowledge of the intention and action of remote operators), and to respond to failures of the system and other operators in the system in situation-specific paradigms. We have exercised this model in a multi-air traffic sector scenario with potential conflict among aircraft at and across sector boundaries. We have modeled the control situation as a multiple closed-loop system.
The inner- and outer-loop alerting structure of air traffic management has many implications that need to be investigated to assure adequate design. First, there are control and stability factors implicit in the design. As the inner-loop response time approaches that of the outer loop, system stability may be compromised in that controllers may be solving a problem whose nature has already been changed by pilot action. Second, information exchange and information presentation for both air and ground must be designed to complement, rather than compete with, each other. Third, the level of individual and shared awareness in trajectory modification and flight conformance needs to be defined. Fourth, the level of required awareness and the performance impact of mixed-fleet operations and failed-mode recovery must be explored.

  3. White matter changes linked to visual recovery after nerve decompression

    PubMed Central

    Paul, David A.; Gaffin-Cahn, Elon; Hintz, Eric B.; Adeclat, Giscard J.; Zhu, Tong; Williams, Zoë R.; Vates, G. Edward; Mahon, Bradford Z.

    2015-01-01

    The relationship between the integrity of white matter tracts and cortical function in the human brain remains poorly understood. Here we use a model of reversible white matter injury, compression of the optic chiasm by tumors of the pituitary gland, to study the structural and functional changes that attend spontaneous recovery of cortical function and visual abilities after surgical tumor removal and subsequent decompression of the nerves. We show that compression of the optic chiasm leads to demyelination of the optic tracts, which reverses as early as 4 weeks after nerve decompression. Furthermore, variability across patients in the severity of demyelination in the optic tracts predicts visual ability and functional activity in early cortical visual areas, and pre-operative measurements of myelination in the optic tracts predict the magnitude of visual recovery after surgery. These data indicate that rapid regeneration of myelin in the human brain is a significant component of the normalization of cortical activity, and ultimately the recovery of sensory and cognitive function, after nerve decompression. More generally, our findings demonstrate the utility of diffusion tensor imaging as an in vivo measure of myelination in the human brain. PMID:25504884

  4. Assessing the Performance of 3 Human Immunodeficiency Virus Incidence Risk Scores in a Cohort of Black and White Men Who Have Sex With Men in the South.

    PubMed

    Jones, Jeb; Hoenigl, Martin; Siegler, Aaron J; Sullivan, Patrick S; Little, Susan; Rosenberg, Eli

    2017-05-01

    Risk scores have been developed to identify men at high risk of human immunodeficiency virus (HIV) seroconversion. These scores can be used to more efficiently allocate public health prevention resources, such as pre-exposure prophylaxis. However, the published scores were developed with data sets that comprise predominantly white men who have sex with men (MSM), collected several years prior, and recruited from a limited geographic area. Thus, it is unclear how well these scores perform in men of different races or ethnicities or men in different geographic regions. We assessed the ability of 3 published scores to predict HIV seroconversion in a cohort of black and white MSM in Atlanta, GA. Questionnaire data from the baseline study visit were used to derive individual scores for each participant. We assessed the discriminatory ability of each risk score to predict HIV seroconversion over 2 years of follow-up. The predictive ability of each score was low among all MSM and lower among black men compared to white men. Each score had lower sensitivity to predict seroconversion among black MSM compared to white MSM, and low area under the receiver operating characteristic curve values indicated poor discriminatory ability. Reliance on the currently available risk scores will result in misclassification of high proportions of MSM, especially black MSM, in terms of HIV risk, leading to missed opportunities for HIV prevention services.
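
The subgroup discrimination check described above boils down to computing the area under the receiver operating characteristic curve (AUC) separately for each group. A self-contained sketch using the rank-sum identity, with toy scores and outcomes (not the study's data):

```python
# AUC via the Mann-Whitney rank-sum identity: the probability that a
# randomly chosen positive case scores higher than a randomly chosen
# negative case (ties count half). Scores and outcomes below are toy data.

def auc(scores, outcomes):
    """AUC = P(score of a seroconverter > score of a non-seroconverter)."""
    pos = [s for s, y in zip(scores, outcomes) if y == 1]
    neg = [s for s, y in zip(scores, outcomes) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: a score that separates outcomes perfectly in one group but
# poorly in the other, mirroring the subgroup gap reported in the abstract.
group_a = auc([9, 8, 7, 3, 2, 1], [1, 1, 1, 0, 0, 0])  # 1.0
group_b = auc([5, 4, 3, 6, 2, 1], [1, 1, 1, 0, 0, 0])
print(group_a, group_b)
```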

  5. Helicopter human factors research

    NASA Technical Reports Server (NTRS)

    Nagel, David C.; Hart, Sandra G.

    1988-01-01

    Helicopter flight is among the most demanding of all human-machine integrations. The inherent manual control complexities of rotorcraft are made even more challenging by the small margin for error created in certain operations, such as nap-of-the-Earth (NOE) flight, by the proximity of the terrain. Accident data recount numerous examples of unintended conflict between helicopters and terrain and attest to the perceptual and control difficulties associated with low altitude flight tasks. Ames Research Center, in cooperation with the U.S. Army Aeroflightdynamics Directorate, has initiated an ambitious research program aimed at increasing safety margins for both civilian and military rotorcraft operations. The program is broad, fundamental, and focused on the development of scientific understandings and technological countermeasures. Research being conducted in several areas is reviewed: workload assessment, prediction, and measure validation; development of advanced displays and effective pilot/automation interfaces; identification of visual cues necessary for low-level, low-visibility flight and modeling of visual flight-path control; and pilot training.

  6. Forecasting municipal solid waste generation using artificial intelligence modelling approaches.

    PubMed

    Abbasi, Maryam; El Hanandeh, Ali

    2016-10-01

    Municipal solid waste (MSW) management is a major concern of local governments seeking to protect human health and the environment and to preserve natural resources. The design and operation of an effective MSW management system requires accurate estimation of future waste generation quantities. The main objective of this study was to develop a model for accurate forecasting of MSW generation that helps waste-related organizations to better design and operate effective MSW management systems. Four intelligent system algorithms, including support vector machine (SVM), adaptive neuro-fuzzy inference system (ANFIS), artificial neural network (ANN) and k-nearest neighbours (kNN), were tested for their ability to predict monthly waste generation in the Logan City Council region in Queensland, Australia. Results showed that artificial intelligence models have good prediction performance and could be successfully applied to establish municipal solid waste forecasting models. Using machine learning algorithms can reliably predict monthly MSW generation by training with waste generation time series. In addition, results suggest that the ANFIS system produced the most accurate forecasts of the peaks while kNN was successful in predicting the monthly averages of waste quantities. Based on the results, the total annual MSW generated in Logan City will reach 9.4 × 10⁷ kg by 2020, while the peak monthly waste will reach 9.37 × 10⁶ kg. Copyright © 2016 Elsevier Ltd. All rights reserved.
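
The kNN forecasting approach mentioned above can be sketched in a few lines: predict next month's value from the historical months whose recent lag pattern is most similar. The lag length, k, and series below are illustrative assumptions, not the study's data:

```python
# k-nearest-neighbours time-series forecast: find the k historical lag
# windows closest (Euclidean distance) to the most recent window, and
# average the value that followed each of them.
import math

def knn_forecast(series, lags=3, k=2):
    pattern = series[-lags:]
    candidates = []
    for i in range(len(series) - lags):
        window = series[i:i + lags]
        dist = math.dist(window, pattern)
        candidates.append((dist, series[i + lags]))
    candidates.sort(key=lambda t: t[0])
    return sum(nxt for _, nxt in candidates[:k]) / k

# Monthly waste (arbitrary units) with a mild seasonal pattern:
history = [10, 12, 15, 11, 10, 13, 15, 11, 10, 12]
print(knn_forecast(history))  # 13.0
```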

  7. Main Pipelines Corrosion Monitoring Device

    NASA Astrophysics Data System (ADS)

    Anatoliy, Bazhenov; Galina, Bondareva; Natalia, Grivennaya; Sergey, Malygin; Mikhail, Goryainov

    2017-01-01

    The aim of the article is to substantiate a technical solution to the problem of monitoring corrosion changes in oil and gas pipelines using an electromagnetic non-destructive testing (NDT) method. Pipeline wall thinning under operating conditions can lead to perforations and leakage of the transported product outside the pipeline, which in most cases endangers human life and the environment. Monitoring of corrosion changes in the pipeline inner wall under operating conditions is complicated because pipelines are mainly made of structural steels whose conductive and magnetic properties impede test signal passage through the entire thickness of the object under study. The technical solution to this problem lies in monitoring the internal corrosion changes in pipes under operating conditions in order to increase the safety of pipelines by automated prediction of when corrosion will reach threshold pre-crash values.

  8. Principles of Automation for Patient Safety in Intensive Care: Learning From Aviation.

    PubMed

    Dominiczak, Jason; Khansa, Lara

    2018-06-01

    The transition away from written documentation and analog methods has opened up the possibility of leveraging data science and analytic techniques to improve health care. High-acuity patients in the ICU can particularly benefit from the implementation of data science techniques and methodologies. The Principles of Automation for Patient Safety in Intensive Care (PAPSIC) framework draws on Billings's principles of human-centered aviation (HCA) automation and helps in identifying the advantages, pitfalls, and unintended consequences of automation in health care. Billings's HCA principles are based on the premise that human operators must remain "in command," so that they are continuously informed and actively involved in all aspects of system operations. In addition, automated systems need to be predictable and simple to train on, learn, and operate; they must be able to monitor the human operators; and every intelligent system element must know the intent of other intelligent system elements. In applying Billings's HCA principles to the ICU setting, PAPSIC has three key characteristics: (1) integration and better interoperability, (2) multidimensional analysis, and (3) enhanced situation awareness. PAPSIC suggests that health care professionals reduce overreliance on automation and implement "cooperative automation," and that vendors reduce mode errors and embrace interoperability. Much can be learned from the aviation industry in automating the ICU. Because it combines "smart" technology with the necessary controls to withstand unintended consequences, PAPSIC could help ensure more informed decision making in the ICU and better patient care. Copyright © 2018 The Joint Commission. Published by Elsevier Inc. All rights reserved.

  9. Translational Modeling in Schizophrenia: Predicting Human Dopamine D2 Receptor Occupancy.

    PubMed

    Johnson, Martin; Kozielska, Magdalena; Pilla Reddy, Venkatesh; Vermeulen, An; Barton, Hugh A; Grimwood, Sarah; de Greef, Rik; Groothuis, Geny M M; Danhof, Meindert; Proost, Johannes H

    2016-04-01

    To assess the ability of a previously developed hybrid physiology-based pharmacokinetic-pharmacodynamic (PBPKPD) model in rats to predict the dopamine D2 receptor occupancy (D2RO) in human striatum following administration of antipsychotic drugs. A hybrid PBPKPD model, previously developed using information on plasma concentrations, brain exposure and D2RO in rats, was used as the basis for the prediction of D2RO in human. The rat pharmacokinetic and brain physiology parameters were substituted with human population pharmacokinetic parameters and human physiological information. To predict the passive transport across the human blood-brain barrier, apparent permeability values were scaled based on rat and human brain endothelial surface area. Active efflux clearance in brain was scaled from rat to human using both human brain endothelial surface area and MDR1 expression. Binding constants at the D2 receptor were scaled based on the differences between in vitro and in vivo systems of the same species. The predictive power of this physiology-based approach was determined by comparing the D2RO predictions with the observed human D2RO of six antipsychotics at clinically relevant doses. Predicted human D2RO was in good agreement with clinically observed D2RO for five antipsychotics. Models using in vitro information predicted human D2RO well for most of the compounds evaluated in this analysis. However, human D2RO was under-predicted for haloperidol. The rat hybrid PBPKPD model structure, integrated with in vitro information and human pharmacokinetic and physiological information, constitutes a scientific basis to predict the time course of D2RO in man.

  10. Approximate numerical abilities and mathematics: Insight from correlational and experimental training studies.

    PubMed

    Hyde, D C; Berteletti, I; Mou, Y

    2016-01-01

    Humans have the ability to nonverbally represent the approximate numerosity of sets of objects. The cognitive system that supports this ability, often referred to as the approximate number system (ANS), is present in early infancy and continues to develop in precision over the life span. It has been proposed that the ANS forms a foundation for uniquely human symbolic number and mathematics learning. Recent work has brought two types of evidence to bear on the relationship between the ANS and human mathematics: correlational studies showing that individual differences in approximate numerical abilities correlate with individual differences in mathematics achievement, and experimental studies showing enhancing effects of nonsymbolic approximate numerical training on exact, symbolic mathematical abilities. At least two accounts can be derived from these empirical data. It may be the case that the ANS and mathematics are related because the cognitive and brain processes responsible for representing numerical quantity in each format overlap (the Representational Overlap Hypothesis), or because of commonalities in the cognitive operations involved in mentally manipulating the representations of each format (the Operational Overlap Hypothesis). The two hypotheses make distinct predictions for future work to test. © 2016 Elsevier B.V. All rights reserved.

  11. Global skin colour prediction from DNA.

    PubMed

    Walsh, Susan; Chaitanya, Lakshmi; Breslin, Krystal; Muralidharan, Charanya; Bronikowska, Agnieszka; Pospiech, Ewelina; Koller, Julia; Kovatsi, Leda; Wollstein, Andreas; Branicki, Wojciech; Liu, Fan; Kayser, Manfred

    2017-07-01

    Human skin colour is highly heritable and externally visible, with relevance in medical, forensic, and anthropological genetics. Although eye and hair colour can already be predicted with high accuracy from small sets of carefully selected DNA markers, knowledge about the genetic predictability of skin colour is limited. Here, we investigate the skin colour predictive value of 77 single-nucleotide polymorphisms (SNPs) from 37 genetic loci previously associated with human pigmentation, using 2025 individuals from 31 global populations. We identified a minimal set of 36 highly informative skin colour predictive SNPs and developed a statistical prediction model capable of skin colour prediction on a global scale. Average cross-validated prediction accuracies, expressed as area under the receiver-operating characteristic curve (AUC) ± standard deviation, were 0.97 ± 0.02 for Light, 0.83 ± 0.11 for Dark, and 0.96 ± 0.03 for Dark-Black. Using a 5-category model, accuracies were 0.74 ± 0.05 for Very Pale, 0.72 ± 0.03 for Pale, 0.73 ± 0.03 for Intermediate, 0.87 ± 0.10 for Dark, and 0.97 ± 0.03 for Dark-Black. A comparative analysis in 194 independent samples from 17 populations demonstrated that our model outperformed a previously proposed 10-SNP classifier approach, with AUCs rising from 0.79 to 0.82 for White, comparable at the intermediate level (0.63 and 0.62, respectively), and a large increase from 0.64 to 0.92 for Black. Overall, this study demonstrates that the chosen DNA markers and prediction model, particularly at the 5-category level, allow skin colour predictions within and between continental regions for the first time, which will serve as a valuable resource for future applications in forensic and anthropological genetics.

  12. Predictive Model for the Design of Zwitterionic Polymer Brushes: A Statistical Design of Experiments Approach.

    PubMed

    Kumar, Ramya; Lahann, Joerg

    2016-07-06

    The performance of polymer interfaces in biology is governed by a wide spectrum of interfacial properties. With the ultimate goal of identifying design parameters for stem cell culture coatings, we developed a statistical model that describes the dependence of brush properties on surface-initiated polymerization (SIP) parameters. Employing a design of experiments (DOE) approach, we identified operating boundaries within which four gel architecture regimes can be realized, including a new regime of associated brushes in thin films. Our statistical model can accurately predict the brush thickness and the degree of intermolecular association of poly[{2-(methacryloyloxy) ethyl} dimethyl-(3-sulfopropyl) ammonium hydroxide] (PMEDSAH), a previously reported synthetic substrate for feeder-free and xeno-free culture of human embryonic stem cells. DOE-based multifunctional predictions offer a powerful quantitative framework for designing polymer interfaces. For example, model predictions can be used to decrease the critical thickness at which the wettability transition occurs by simply increasing the catalyst quantity from 1 to 3 mol %.
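
The DOE-based modelling idea can be illustrated with a minimal two-factor, two-level (2^2) factorial sketch, where the least-squares coefficients of a saturated linear model reduce to contrast averages; the factor meanings and responses are invented for illustration, not the paper's measurements:

```python
# For a 2^2 factorial design with coded settings (-1/+1), the intercept,
# main effects, and interaction of a saturated linear model are simple
# contrast averages of the four responses.

def fit_2x2(y):
    """Coefficients (b0, b1, b2, b12) for a saturated 2^2 design.

    Responses y are ordered by (x1, x2) = (-1,-1), (+1,-1), (-1,+1), (+1,+1).
    """
    b0 = (y[0] + y[1] + y[2] + y[3]) / 4    # grand mean
    b1 = (-y[0] + y[1] - y[2] + y[3]) / 4   # main effect of factor 1
    b2 = (-y[0] - y[1] + y[2] + y[3]) / 4   # main effect of factor 2
    b12 = (y[0] - y[1] - y[2] + y[3]) / 4   # interaction effect
    return b0, b1, b2, b12

# Hypothetical brush-thickness responses (nm) at the four factor settings:
b0, b1, b2, b12 = fit_2x2([20.0, 30.0, 40.0, 60.0])
# Predict thickness at coded settings x1 = 0.5, x2 = 1.0:
print(b0 + b1 * 0.5 + b2 * 1.0 + b12 * 0.5 * 1.0)  # 55.0
```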

  13. Plasticity in the Rat Prefrontal Cortex: Linking Gene Expression and an Operant Learning with a Computational Theory

    PubMed Central

    Rapanelli, Maximiliano; Lew, Sergio Eduardo; Frick, Luciana Romina; Zanutto, Bonifacio Silvano

    2010-01-01

    Plasticity in the medial prefrontal cortex (mPFC) of rodents, or the lateral prefrontal cortex (lPFC) in non-human primates, plays a key role in the neural circuits involved in learning and memory. Several genes, such as brain-derived neurotrophic factor (BDNF), cAMP response element binding protein (CREB), Synapsin I, calcium/calmodulin-dependent protein kinase II (CaMKII), activity-regulated cytoskeleton-associated protein (Arc), c-jun and c-fos, have been related to plasticity processes. We analysed differential expression of plasticity-related genes and immediate early genes in the mPFC of rats while they learned an operant conditioning task. Incompletely and completely trained animals were studied because our computational model predicts distinct events at different learning stages. We measured changes in mRNA levels by real-time RT-PCR during learning; expression of these plasticity-associated markers increased during learning, and the increments began to decline once the task was learned. The plasticity changes in the lPFC during learning predicted by the model matched those of the representative gene BDNF. Herein, using an integrative approach combining a computational model and gene expression, we show for the first time that plasticity in the rat mPFC during learning of an operant conditioning task is higher while the task is being learned than after it has been learned. PMID:20111591

  14. The knowledge-based framework for a nuclear power plant operator advisor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, D.W.; Hajek, B.K.

    1989-01-01

    An important facet in the design, development, and evaluation of aids for complex systems is the identification of the tasks performed by the operator. Operator aids utilizing artificial intelligence, or more specifically knowledge-based systems, require identification of these tasks in the context of a knowledge-based framework. In this context, the operator responses to plant behavior are to monitor and comprehend the state of the plant, identify normal and abnormal plant conditions, diagnose abnormal plant conditions, predict plant response to specific control actions, select the best available control action, implement a feasible control action, monitor system response to the control action, and correct for any inappropriate responses. These tasks have been identified to formulate a knowledge-based framework for an operator advisor under development at Ohio State University that utilizes the generic task methodology proposed by Chandrasekaran. The paper lays the foundation to identify the responses as a knowledge-based set of tasks in accordance with the expected human operator responses during an event. Initial evaluation of the expert system indicates the potential for an operator aid that will improve the operator's ability to respond to both anticipated and unanticipated events.

  15. An Overview of the NASA Aviation Safety Program (AVSP) Systemwide Accident Prevention (SWAP) Human Performance Modeling (HPM) Element

    NASA Technical Reports Server (NTRS)

    Foyle, David C.; Goodman, Allen; Hooley, Becky L.

    2003-01-01

    An overview is provided of the Human Performance Modeling (HPM) element within the NASA Aviation Safety Program (AvSP). Two separate model development tracks for performance modeling of real-world aviation environments are described: the first focuses on the advancement of cognitive modeling tools for system design, while the second centers on a prescriptive engineering model of activity tracking for error detection and analysis. A progressive implementation strategy for both tracks is discussed in which increasingly more complex, safety-relevant applications are undertaken to extend the state-of-the-art, as well as to reveal potential human-system vulnerabilities in the aviation domain. Of particular interest is the ability to predict the precursors to error and to assess potential mitigation strategies associated with the operational use of future flight deck technologies.

  16. Space Environments and Effects Concept: Transitioning Research to Operations and Applications

    NASA Technical Reports Server (NTRS)

    Edwards, David L.; Spann, James; Burns, Howard D.; Schumacher, Dan

    2012-01-01

    The National Aeronautics and Space Administration (NASA) is embarking on a course to expand human presence beyond Low Earth Orbit (LEO) while expanding its mission to explore the solar system. Destinations such as Near Earth Asteroids (NEA), Mars and its moons, and the outer planets are but a few of the mission targets. NASA has established numerous offices specializing in specific space environments disciplines that will serve to enable these missions. To complement these existing discipline offices, a concept focusing on the development of space environments and effects applications is presented. This includes space climate, space weather, and natural and induced space environments. This space environments and effects application is composed of four topic areas: characterization and modeling, engineering effects, prediction and operation, and mitigation and avoidance. These topic areas are briefly described below. Characterization and modeling of space environments will primarily focus on utilization during Program mission concept, planning, and design phases. Engineering effects includes materials testing and flight experiments producing data to be used in mission planning and design phases. Prediction and operation pulls data from existing sources into decision-making tools and empirical data sets to be used during the operational phase of a mission. Mitigation and avoidance will develop techniques and strategies used in the design and operations phases of the mission. The goal of this space environments and effects application is to develop decision-making tools and engineering products to support the mission phases from mission concept through operations by focusing on transitioning research to operations. Products generated by this space environments and effects application are suitable for use in anomaly investigations.
This paper will outline the four topic areas, describe the need, and discuss an organizational structure for this space environments and effects application.

  17. A Near-Term Concept for Trajectory Based Operations with Air/Ground Data Link Communication

    NASA Technical Reports Server (NTRS)

    McNally, David; Mueller, Eric; Thipphavong, David; Paielli, Russell; Cheng, Jinn-Hwei; Lee, Chuhan; Sahlman, Scott; Walton, Joe

    2010-01-01

    An operating concept and required system components for trajectory-based operations with air/ground data link for today's en route and transition airspace is proposed. Controllers are fully responsible for separation as they are today, and no new aircraft equipage is required. Trajectory automation computes integrated solutions to problems like metering, weather avoidance, traffic conflicts, and the desire to find and fly more time/fuel-efficient flight trajectories. A common ground-based system supports all levels of aircraft equipage and performance, including aircraft equipped and not equipped for data link. User interface functions for the radar controller's display make trajectory-based clearance advisories easy to visualize, modify if necessary, and implement. Laboratory simulations (without human operators) were conducted to test integrated operation of selected system components with uncertainty modeling. Results are based on 102 hours of Fort Worth Center traffic recordings involving over 37,000 individual flights. The presence of uncertainty had a marginal effect (5%) on minimum-delay conflict resolution performance, and wind-favorable routes had no effect on detection and resolution metrics. Flight plan amendments and clearances were substantially reduced compared to today's operations. Top-of-descent prediction errors are the largest cause of failure, indicating that better descent predictions are needed to reliably achieve fuel-efficient descent profiles in medium to heavy traffic. Improved conflict detection for climbing flights could enable substantially more continuous climbs to cruise altitude. Unlike today's Conflict Alert, tactical automation must alert when an altitude amendment is entered, but before the aircraft starts the maneuver. In every other failure case, tactical automation prevented losses of separation.
A real-time prototype trajectory-automation system is running now and could be made ready for operational testing at an en route Center in 1-2 years.

  18. Robotic and Human-Tended Collaborative Drilling Automation for Subsurface Exploration

    NASA Technical Reports Server (NTRS)

    Glass, Brian; Cannon, Howard; Stoker, Carol; Davis, Kiel

    2005-01-01

Future in-situ lunar/martian resource utilization and characterization, as well as the scientific search for life on Mars, will require access to the subsurface and hence drilling. Drilling on Earth is hard - an art form more than an engineering discipline. Human operators listen and feel drill string vibrations coming from kilometers underground. Abundant mass and energy make it possible for terrestrial drilling to employ brute-force approaches to failure recovery and system performance issues. Space drilling will require intelligent and autonomous systems for robotic exploration and to support human exploration. Eventual in-situ resource utilization will require deep drilling with probable human-tended operation of large-bore drills, but initial lunar subsurface exploration and near-term ISRU will be accomplished with lightweight, rover-deployable or standalone drills capable of penetrating a few tens of meters in depth. These lightweight exploration drills have a direct counterpart in terrestrial prospecting and ore-body location, and will be designed to operate either human-tended or automated. NASA and industry are now acquiring experience in developing and building low-mass automated planetary prototype drills, with the aim of designing and building a pre-flight lunar prototype targeted for 2011-12 flight opportunities. A successful system will include development of drilling hardware and of automated control software to operate it safely and effectively. This includes control of the drilling hardware; state estimation of the hardware, the lithology being drilled, and the state of the hole; and potentially planning and scheduling software suitable for uncertain situations such as drilling. Given that humans on the Moon or Mars are unlikely to be able to spend protracted EVA periods at a drill site, both human-tended and robotic access to planetary subsurfaces will require some degree of standalone, autonomous drilling capability. 
Human-robotic coordination will be important, either between a robotic drill and humans on Earth, or between a human-tended drill and its visiting crew. The Mars Analog Rio Tinto Experiment (MARTE) is a current project that studies and simulates the remote science operations between an automated drill in Spain and a distant, distributed human science team. The Drilling Automation for Mars Exploration (DAME) project, by contrast, is developing and testing standalone automation at a lunar/martian impact crater analog site in Arctic Canada. The drill hardware in both projects is a hardened, evolved version of the Advanced Deep Drill (ADD) developed by Honeybee Robotics for the Mars Subsurface Program. The current ADD is capable of drilling to 20 m, and the DAME project is developing diagnostic and executive software for hands-off surface operations of the evolved version of this drill. The current drill automation architecture being developed by NASA and tested in 2004-06 at analog sites in the Arctic and Spain will add downhole diagnosis of different strata, bit wear detection, and dynamic replanning capabilities when unexpected failures or drilling conditions are discovered, in conjunction with simulated mission operations and remote science planning. The most important determinant of future lunar and martian drilling automation and staffing requirements will be the actual performance of automated prototype drilling hardware systems in field trials in simulated mission operations. It is difficult to accurately predict the level of automation and human interaction that will be needed for a lunar-deployed drill without first having extensive experience with the robotic control of prototype drill systems under realistic analog field conditions. Drill-specific failure modes and software design flaws will become most apparent at this stage. DAME will develop and test drill automation software and hardware under stressful operating conditions during several planned field campaigns. 
Initial results from summer 2004 tests show seven identified distinct failure modes of the drill, cuttings-removal issues with low-power drilling into permafrost, and successful steps toward executive control and initial automation.

  19. Animal models of addiction

    PubMed Central

    Spanagel, Rainer

    2017-01-01

    In recent years, animal models in psychiatric research have been criticized for their limited translational value to the clinical situation. Failures in clinical trials have thus often been attributed to the lack of predictive power of preclinical animal models. Here, I argue that animal models of voluntary drug intake—under nonoperant and operant conditions—and addiction models based on the Diagnostic and Statistical Manual of Mental Disorders are crucial and informative tools for the identification of pathological mechanisms, target identification, and drug development. These models provide excellent face validity, and it is assumed that the neurochemical and neuroanatomical substrates involved in drug-intake behavior are similar in laboratory rodents and humans. Consequently, animal models of drug consumption and addiction provide predictive validity. This predictive power is best illustrated in alcohol research, in which three approved medications—acamprosate, naltrexone, and nalmefene—were developed by means of animal models and then successfully translated into the clinical situation. PMID:29302222

  20. Predictability of the Arctic sea ice edge

    NASA Astrophysics Data System (ADS)

    Goessling, H. F.; Tietsche, S.; Day, J. J.; Hawkins, E.; Jung, T.

    2016-02-01

    Skillful sea ice forecasts from days to years ahead are becoming increasingly important for the operation and planning of human activities in the Arctic. Here we analyze the potential predictability of the Arctic sea ice edge in six climate models. We introduce the integrated ice-edge error (IIEE), a user-relevant verification metric defined as the area where the forecast and the "truth" disagree on the ice concentration being above or below 15%. The IIEE lends itself to decomposition into an absolute extent error, corresponding to the common sea ice extent error, and a misplacement error. We find that the often-neglected misplacement error makes up more than half of the climatological IIEE. In idealized forecast ensembles initialized on 1 July, the IIEE grows faster than the absolute extent error. This means that the Arctic sea ice edge is less predictable than sea ice extent, particularly in September, with implications for the potential skill of end-user relevant forecasts.
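The IIEE decomposition described in the abstract can be written down directly. A minimal sketch, assuming gridded ice-concentration fields on equal-area cells; the 15% threshold follows the abstract, while the field values are invented for illustration:

```python
# Integrated ice-edge error (IIEE) and its decomposition into an
# absolute extent error (AEE) and a misplacement error (ME = IIEE - AEE).

def iiee_decomposition(forecast, truth, cell_area=1.0, threshold=0.15):
    """Return (IIEE, absolute extent error, misplacement error)."""
    f_ice = [c >= threshold for c in forecast]
    t_ice = [c >= threshold for c in truth]
    # IIEE: total area where forecast and truth disagree on ice/no-ice.
    iiee = cell_area * sum(f != t for f, t in zip(f_ice, t_ice))
    # Absolute extent error: difference of total ice-covered areas.
    aee = cell_area * abs(sum(f_ice) - sum(t_ice))
    # Misplacement error: disagreement not explained by extent alone.
    return iiee, aee, iiee - aee

# Same total extent (2 cells each) but the ice edge is in the wrong
# place, so all of the error is misplacement.
iiee, aee, me = iiee_decomposition(
    forecast=[0.9, 0.0, 0.0, 0.8],
    truth=[0.0, 0.9, 0.0, 0.8])
print(iiee, aee, me)  # → 2.0 0.0 2.0
```

The example makes the abstract's point concrete: a forecast can have zero extent error while the IIEE, driven entirely by misplacement, is large.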

  1. Predicting Operator Execution Times Using CogTool

    NASA Technical Reports Server (NTRS)

    Santiago-Espada, Yamira; Latorella, Kara A.

    2013-01-01

Researchers and developers of NextGen systems can use predictive human performance modeling tools as an initial approach to obtain skilled user performance times analytically, before system testing with users. This paper describes the CogTool models for a two-pilot crew executing two different types of datalink clearance acceptance tasks on two different simulation platforms. The CogTool time estimates for accepting and executing Required Time of Arrival and Interval Management clearances were compared to empirical data observed in video recordings and registered in simulation files. Results indicate no statistically significant difference between the empirical data and the CogTool predictions: a population comparison test found no significant differences between the CogTool estimates and the empirical execution times for any of the four test conditions. We discuss modeling caveats and considerations for applying CogTool to crew performance modeling in advanced cockpit environments.

  2. Predicting human activities in sequences of actions in RGB-D videos

    NASA Astrophysics Data System (ADS)

    Jardim, David; Nunes, Luís; Dias, Miguel

    2017-03-01

    In our daily activities we perform prediction or anticipation when interacting with other humans or with objects. Prediction of human activity by computers has several potential applications: surveillance systems, human-computer interfaces, sports video analysis, human-robot collaboration, games, and health care. We propose a system capable of recognizing and predicting human actions using supervised classifiers trained with automatically labeled data, evaluated on our human activity RGB-D dataset (recorded with a Kinect sensor) and using only the positions of the main skeleton joints to extract features. Conditional random fields (CRFs) have been used before to model the sequential nature of actions in a sequence, but where other approaches try to predict an outcome or to anticipate ahead in time (by seconds), we try to predict the subject's next action. Our results show an activity prediction accuracy of 89.9% using an automatically labeled dataset.
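The paper's CRF models predict the next action in a labeled sequence from skeleton-joint features. As a much simpler stand-in for that "predict the next action" framing, the sketch below learns first-order transition counts between action labels and predicts the most frequent successor; the labels and sequences are invented:

```python
from collections import Counter, defaultdict

def train_transitions(sequences):
    """Count how often each action is followed by each other action."""
    counts = defaultdict(Counter)
    for seq in sequences:
        for cur, nxt in zip(seq, seq[1:]):
            counts[cur][nxt] += 1
    return counts

def predict_next(counts, current):
    """Predict the most frequent successor of the current action."""
    if current not in counts:
        return None  # action never observed with a successor
    return counts[current].most_common(1)[0][0]

seqs = [["reach", "grasp", "lift"],
        ["reach", "grasp", "place"],
        ["reach", "grasp", "lift"]]
model = train_transitions(seqs)
print(predict_next(model, "grasp"))  # → lift (2 of 3 observed successors)
```

A CRF conditions on observation features rather than just the previous label, but the prediction target, the next action in the sequence, is the same.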

  3. Using timing of ice retreat to predict timing of fall freeze-up in the Arctic

    NASA Astrophysics Data System (ADS)

    Stroeve, Julienne C.; Crawford, Alex D.; Stammerjohn, Sharon

    2016-06-01

    Reliable forecasts of the timing of sea ice advance are needed in order to reduce risks associated with operating in the Arctic, as well as for planning around human and environmental emergencies. This study investigates the use of a simple statistical model relating the timing of ice retreat to the timing of ice advance, taking advantage of the inherent predictive power supplied by the seasonal ice-albedo feedback and ocean heat uptake. Results show that using the last retreat date to predict the first advance date is applicable in some regions, such as Baffin Bay and the Laptev and East Siberian seas, where predictive skill is found even after accounting for the long-term trend in both variables. Elsewhere in the Arctic, there is some predictive skill depending on the year (e.g., Kara and Beaufort seas), but none in regions such as the Barents and Bering seas or the Sea of Okhotsk. While there is some suggestion that the relationship is strengthening over time, this may reflect that higher correlations are expected during periods when the underlying trend is strong.
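The detrending step implied by "even after accounting for the long-term trend in both variables" can be sketched as least-squares detrending of both date series before correlating the residuals. All day-of-year values below are invented for illustration:

```python
# Detrended correlation between ice-retreat and ice-advance dates:
# fit a linear trend to each series, subtract it, correlate residuals.

def linfit(x, y):
    """Least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

def detrend(x, y):
    m, b = linfit(x, y)
    return [yi - (m * xi + b) for xi, yi in zip(x, y)]

def corr(u, v):
    """Pearson correlation."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    su = sum((a - mu) ** 2 for a in u) ** 0.5
    sv = sum((b - mv) ** 2 for b in v) ** 0.5
    return cov / (su * sv)

years = list(range(2000, 2010))
retreat = [150, 152, 149, 155, 158, 154, 160, 157, 163, 161]  # day of year
advance = [300, 303, 299, 306, 310, 305, 312, 308, 316, 313]  # day of year
skill = corr(detrend(years, retreat), detrend(years, advance))
print(round(skill, 2))
```

Without detrending, two series that share a strong trend correlate highly even with no year-to-year skill; the residual correlation is what supports actual prediction.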

  4. Molecular Docking for Prediction and Interpretation of Adverse Drug Reactions.

    PubMed

    Luo, Heng; Fokoue-Nkoutche, Achille; Singh, Nalini; Yang, Lun; Hu, Jianying; Zhang, Ping

    2018-05-23

    Adverse drug reactions (ADRs) present a major burden for patients and the healthcare industry. Various computational methods have been developed to predict ADRs for drug molecules. However, many of these methods require experimental or surveillance data and cannot be used when only structural information is available. We collected 1,231 small molecule drugs and 600 human proteins and utilized molecular docking to generate binding features among them. We developed machine learning models that use these docking features to make predictions for 1,533 ADRs. These models obtain an overall area under the receiver operating characteristic curve (AUROC) of 0.843 and an overall area under the precision-recall curve (AUPR) of 0.395, outperforming seven structural fingerprint-based prediction models. Using the method, we predicted skin striae for fluticasone propionate, dermatitis acneiform for mometasone, and decreased libido for irinotecan, as demonstrations. Furthermore, we analyzed the top binding proteins associated with some of the ADRs, which can help to understand and/or generate hypotheses for underlying mechanisms of ADRs. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
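The models above are scored by AUROC (0.843 overall). As a reminder of what that number measures, AUROC equals the probability that a randomly chosen positive (a drug that has the ADR) is scored above a randomly chosen negative, with ties counted as half; the labels and scores below are invented:

```python
# Rank-based AUROC: fraction of positive/negative pairs the model orders
# correctly (ties count 0.5). O(n^2), fine for a demonstration.

def auroc(labels, scores):
    pairs, wins = 0, 0.0
    for lab_p, s_p in zip(labels, scores):
        if lab_p != 1:
            continue
        for lab_n, s_n in zip(labels, scores):
            if lab_n != 0:
                continue
            pairs += 1
            if s_p > s_n:
                wins += 1.0
            elif s_p == s_n:
                wins += 0.5
    return wins / pairs

labels = [1, 1, 0, 0, 0]
scores = [0.9, 0.4, 0.6, 0.3, 0.2]
print(auroc(labels, scores))  # → 5/6 ≈ 0.833
```

With the heavy class imbalance typical of ADR labels, the paper's second metric, AUPR, is the more demanding one, which is why its overall value (0.395) is much lower than the AUROC.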

  5. Pitfalls and Precautions When Using Predicted Failure Data for Quantitative Analysis of Safety Risk for Human Rated Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Hatfield, Glen S.; Hark, Frank; Stott, James

    2016-01-01

    Launch vehicle reliability analysis is largely dependent upon using predicted failure rates from data sources such as MIL-HDBK-217F. Reliability prediction methodologies based on component data do not take into account system integration risks such as those attributable to manufacturing and assembly. These sources often dominate component level risk. While consequence of failure is often understood, using predicted values in a risk model to estimate the probability of occurrence may underestimate the actual risk. Managers and decision makers use the probability of occurrence to influence the determination whether to accept the risk or require a design modification. The actual risk threshold for acceptance may not be fully understood due to the absence of system level test data or operational data. This paper will establish a method and approach to identify the pitfalls and precautions of accepting risk based solely upon predicted failure data. This approach will provide a set of guidelines that may be useful to arrive at a more realistic quantification of risk prior to acceptance by a program.

  6. The phantom robot - Predictive displays for teleoperation with time delay

    NASA Technical Reports Server (NTRS)

    Bejczy, Antal K.; Kim, Won S.; Venema, Steven C.

    1990-01-01

    An enhanced teleoperation technique for time-delayed bilateral teleoperator control is discussed. The control technique selected for time delay is based on the use of a high-fidelity graphics phantom robot that is being controlled in real time (without time delay) against the static task image. Thus, the motion of the phantom robot image on the monitor predicts the motion of the real robot. The real robot's motion will follow the phantom robot's motion on the monitor with the communication time delay implied in the task. Real-time high-fidelity graphics simulation of a PUMA arm is generated and overlaid on the actual camera view of the arm. A simple camera calibration technique is used for calibrated graphics overlay. A preliminary experiment is performed with the predictive display by using a very simple tapping task. The results with this simple task indicate that predictive display enhances the human operator's telemanipulation task performance significantly during free motion when there is a long time delay. It appears, however, that either two-view or stereoscopic predictive displays are necessary for general three-dimensional tasks.

  7. Global climate change and vector-borne diseases

    USGS Publications Warehouse

    Ginsberg, H.S.

    2002-01-01

    Global warming will have different effects on different diseases because of the complex and idiosyncratic interactions between vectors, hosts, and pathogens that influence transmission dynamics of each pathogen. Human activities, including urbanization, rapid global travel, and vector management, have profound effects on disease transmission that can operate on more rapid time scales than does global climate change. The general concern about global warming encouraging the spread of tropical diseases is legitimate, but the effects vary among diseases, and the ecological implications are difficult to predict.

  8. A prospective cohort study of fetal heart rate monitoring: deceleration area is predictive of fetal acidemia.

    PubMed

    Cahill, Alison G; Tuuli, Methodius G; Stout, Molly J; López, Julia D; Macones, George A

    2018-05-01

    Intrapartum electronic fetal monitoring is the most commonly used tool in obstetrics in the United States; however, which electronic fetal monitoring patterns predict acidemia remains unclear. This study was designed to describe the frequency of patterns seen in labor using modern nomenclature, and to test the hypothesis that visually interpreted patterns are associated with acidemia and morbidities in term infants. We further identified patterns prior to delivery, alone or in combination, predictive of acidemia and neonatal morbidity. This was a prospective cohort study of 8580 women from 2010 through 2015. Patients were all consecutive women laboring at ≥37 weeks' gestation with a singleton cephalic fetus. Electronic fetal monitoring patterns during the 120 minutes prior to delivery were interpreted in 10-minute epochs. Interpretation included the category system and individual electronic fetal monitoring patterns per the Eunice Kennedy Shriver National Institute of Child Health and Human Development criteria, as well as novel patterns. The primary outcome was fetal acidemia (umbilical artery pH ≤7.10); neonatal morbidities were also assessed. Final regression models for acidemia adjusted for nulliparity, pregestational diabetes, and advanced maternal age. Areas under the receiver operating characteristic curves were used to assess the test characteristics of individual models for acidemia and neonatal morbidity. Of 8580 women, 149 (1.7%) delivered acidemic infants. Composite neonatal morbidity was diagnosed in 757 (8.8%) neonates within the total cohort. Persistent category I and a 10-minute period of category III were significantly associated with normal pH and acidemia, respectively. 
Total deceleration area was the most discriminative of acidemia (area under the receiver operating characteristic curve, 0.76; 95% confidence interval, 0.72-0.80), and deceleration area with any 10 minutes of tachycardia had the greatest discriminative ability for neonatal morbidity (area under the receiver operating characteristic curve, 0.77; 95% confidence interval, 0.75-0.79). Once the threshold of deceleration area is reached, the number of cesareans that would need to be performed to potentially prevent 1 case of acidemia and morbidity is 5 and 6, respectively. Of the electronic fetal monitoring patterns studied, deceleration area is the most predictive of acidemia, and combined with tachycardia it is the most predictive of significant risk of morbidity. It is important to acknowledge that this study was performed in patients delivering at ≥37 weeks, which may limit generalizability to preterm populations. We also did not use computerized analysis of the electronic fetal monitoring patterns because human visual interpretation was the basis for the Eunice Kennedy Shriver National Institute of Child Health and Human Development categories, and, importantly, it is how electronic fetal monitoring is used clinically. Copyright © 2018 Elsevier Inc. All rights reserved.
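A hypothetical sketch of what "total deceleration area" integrates: the area between the baseline and the fetal heart rate while the trace dips below baseline, accumulated over the monitoring window. The baseline, sampling interval, and trace values below are invented, and the clinical definition is more involved than this simplification:

```python
# Toy "deceleration area": sum of (baseline - FHR) * dt over samples
# where the trace is below baseline, in bpm*minutes.

def deceleration_area(fhr_bpm, baseline_bpm, dt_min=0.25):
    """Area of FHR deficit below baseline (bpm*min), 15-s samples."""
    return sum((baseline_bpm - f) * dt_min
               for f in fhr_bpm if f < baseline_bpm)

trace = [140, 138, 120, 110, 115, 135, 141, 125, 140]  # bpm, 15-s samples
print(deceleration_area(trace, baseline_bpm=140))  # → 24.25
```

The appeal of such a cumulative measure over categorical pattern labels is that both deeper and longer decelerations raise the score, which matches the study's finding that it discriminates acidemia better than individual patterns.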

  9. The role of predictive uncertainty in the operational management of reservoirs

    NASA Astrophysics Data System (ADS)

    Todini, E.

    2014-09-01

    The present work deals with the operational management of multi-purpose reservoirs, whose optimisation-based rules are derived, in the planning phase, via deterministic (linear and nonlinear programming, dynamic programming, etc.) or via stochastic (generally stochastic dynamic programming) approaches. In operation, the resulting deterministic or stochastic optimised operating rules are then triggered based on inflow predictions. In order to fully benefit from predictions, one must avoid using them as direct inputs to the reservoirs, but rather assess the "predictive knowledge" in terms of a predictive probability density to be operationally used in the decision making process for the estimation of expected benefits and/or expected losses. Using a theoretical and extremely simplified case, it will be shown why directly using model forecasts instead of the full predictive density leads to less robust reservoir management decisions. Moreover, the effectiveness and the tangible benefits for using the entire predictive probability density instead of the model predicted values will be demonstrated on the basis of the Lake Como management system, operational since 1997, as well as on the basis of a case study on the lake of Aswan.
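The abstract's core argument, that plugging a point forecast into the decision rule is not the same as minimizing expected loss over the full predictive density, can be shown with a toy asymmetric-loss example. The inflow samples, candidate releases, and loss function below are invented and are not the paper's case study:

```python
import statistics

# Under an asymmetric loss, the release minimizing expected loss over the
# predictive density differs from the release chosen for the point forecast.

def loss(release, inflow):
    surplus = inflow - release
    # Flooding (inflow above the release capacity) is penalized 10x more
    # heavily than releasing more than turns out to be needed.
    return 10.0 * surplus if surplus > 0 else -surplus

inflow_samples = [80, 90, 100, 110, 200]   # skewed predictive density
candidates = range(50, 251, 10)            # candidate releases

def expected_loss(release):
    return sum(loss(release, q) for q in inflow_samples) / len(inflow_samples)

best_density = min(candidates, key=expected_loss)
best_point = min(candidates,
                 key=lambda r: loss(r, statistics.mean(inflow_samples)))
print(best_density, best_point)  # → 200 120
```

The density-based decision hedges against the rare large inflow; the point-forecast decision ignores it entirely, which is exactly the loss of robustness the abstract describes.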

  10. Coupling of Bayesian Networks with GIS for wildfire risk assessment on natural and agricultural areas of the Mediterranean

    NASA Astrophysics Data System (ADS)

    Scherb, Anke; Papakosta, Panagiota; Straub, Daniel

    2014-05-01

    Wildfires cause severe damages to ecosystems, socio-economic assets, and human lives in the Mediterranean. To facilitate coping with wildfire risks, an understanding of the factors influencing wildfire occurrence and behavior (e.g. human activity, weather conditions, topography, fuel loads) and their interaction is of importance, as is the implementation of this knowledge in improved wildfire hazard and risk prediction systems. In this project, a probabilistic wildfire risk prediction model is developed, with integrated fire occurrence and fire propagation probability and potential impact prediction on natural and cultivated areas. Bayesian Networks (BNs) are used to facilitate the probabilistic modeling. The final BN model is a spatial-temporal prediction system at the meso scale (1 km2 spatial and 1 day temporal resolution). The modeled consequences account for potential restoration costs and production losses referred to forests, agriculture, and (semi-) natural areas. BNs and a geographic information system (GIS) are coupled within this project to support a semi-automated BN model parameter learning and the spatial-temporal risk prediction. The coupling also enables the visualization of prediction results by means of daily maps. The BN parameters are learnt for Cyprus with data from 2006-2009. Data from 2010 is used as validation data set. A special focus is put on the performance evaluation of the BN for fire occurrence, which is modeled as a binary classifier and thus could be validated by means of Receiver Operating Characteristic (ROC) curves. With the final best models, AUC values of more than 70% for validation could be achieved, which indicates potential for reliable prediction performance via BN. Maps of selected days in 2010 are shown to illustrate final prediction results. The resulting system can be easily expanded to predict additional expected damages at the mesoscale (e.g. building and infrastructure damages). 
The system can support planning of preventive measures (e.g. state resources allocation for wildfire prevention and preparedness) and assist recuperation plans of damaged areas.

  11. A computer model of the pediatric circulatory system for testing pediatric assist devices.

    PubMed

    Giridharan, Guruprasad A; Koenig, Steven C; Mitchell, Michael; Gartner, Mark; Pantalos, George M

    2007-01-01

    Lumped parameter computer models of the pediatric circulatory systems for 1- and 4-year-olds were developed to predict hemodynamic responses to mechanical circulatory support devices. Model parameters, including resistance, compliance and volume, were adjusted to match hemodynamic pressure and flow waveforms, pressure-volume loops, percent systole, and heart rate of pediatric patients (n = 6) with normal ventricles. Left ventricular failure was modeled by adjusting the time-varying compliance curve of the left heart to produce aortic pressures and cardiac outputs consistent with those observed clinically. Models of pediatric continuous flow (CF) and pulsatile flow (PF) ventricular assist devices (VAD) and intraaortic balloon pump (IABP) were developed and integrated into the heart failure pediatric circulatory system models. Computer simulations were conducted to predict acute hemodynamic responses to PF and CF VAD operating at 50%, 75% and 100% support and 2.5 and 5 ml IABP operating at 1:1 and 1:2 support modes. The computer model of the pediatric circulation matched the human pediatric hemodynamic waveform morphology to within 90% and cardiac function parameters with 95% accuracy. The computer model predicted that the PF VAD and IABP restore aortic pressure pulsatility and variation in end-systolic and end-diastolic volume, but that pulsatility diminishes with increasing CF VAD support.
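The lumped-parameter idea behind such models can be illustrated with the simplest case, a two-element Windkessel: the arterial tree reduced to one compliance C and one peripheral resistance R, driven by a half-sine systolic inflow. All parameter values below are invented for illustration and are not the authors' fitted pediatric model:

```python
import math

# Two-element Windkessel: C * dP/dt = Q_in(t) - P / R, forward Euler.

def simulate(R=1.0, C=1.5, dt=0.001, beats=5, hr=120):
    period = 60.0 / hr              # seconds per beat
    systole = 0.35 * period         # assumed systolic fraction
    p = 60.0                        # initial pressure, mmHg
    pressures = []
    t = 0.0
    while t < beats * period:
        phase = t % period
        # Half-sine inflow during systole, zero during diastole.
        q_in = (300.0 * math.sin(math.pi * phase / systole)
                if phase < systole else 0.0)
        # Pressure rises as inflow fills the compliance, decays through R.
        p += dt * (q_in - p / R) / C
        pressures.append(p)
        t += dt
    return pressures

p = simulate()
last_beat = p[-500:]  # final 0.5 s = one beat at hr=120
print(f"pressure over last beat: "
      f"{min(last_beat):.1f}-{max(last_beat):.1f} mmHg")
```

The authors' models add time-varying ventricular compliance, venous compartments, and device models on top of this same resistance-compliance template.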

  12. "Facing humanness: Facial width-to-height ratio predicts ascriptions of humanity": Correction to Deska, Lloyd, and Hugenberg (2017).

    PubMed

    2018-01-01

    Reports an error in "Facing Humanness: Facial Width-to-Height Ratio Predicts Ascriptions of Humanity" by Jason C. Deska, E. Paige Lloyd and Kurt Hugenberg ( Journal of Personality and Social Psychology , Advanced Online Publication, Aug 28, 2017, np). In the article, there is a data error in the Results section of Study 1c. The fourth sentence of the fourth paragraph should read as follows: High fWHR targets (M= 74.39, SD=18.25) were rated as equivalently evolved as their low fWHR counterparts (M=79.39, SD=15.91). (The following abstract of the original article appeared in record 2017-36694-001.) The ascription of mind to others is central to social cognition. Most research on the ascription of mind has focused on motivated, top-down processes. The current work provides novel evidence that facial width-to-height ratio (fWHR) serves as a bottom-up perceptual signal of humanness. Using a range of well-validated operational definitions of humanness, we provide evidence across 5 studies that target faces with relatively greater fWHR are seen as less than fully human compared with their relatively lower fWHR counterparts. We then present 2 ancillary studies exploring whether the fWHR-to-humanness link is mediated by previously established fWHR-trait links in the literature. Finally, 3 additional studies extend this fWHR-humanness link beyond measurements of humanness, demonstrating that the fWHR-humanness link has consequences for downstream social judgments including the sorts of crimes people are perceived to be guilty of and the social tasks for which they seem helpful. In short, we provide evidence for the hypothesis that individuals with relatively greater facial width-to-height ratio are routinely denied sophisticated, humanlike minds. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  13. Human Behavior & Low Energy Architecture: Linking Environmental Adaptation, Personal Comfort, & Energy Use in the Built Environment

    NASA Astrophysics Data System (ADS)

    Langevin, Jared

    Truly sustainable buildings serve to enrich the daily sensory experience of their human inhabitants while consuming the least amount of energy possible; yet, building occupants and their environmentally adaptive behaviors remain a poorly characterized variable in even the most "green" building design and operation approaches. This deficiency has been linked to gaps between predicted and actual energy use, as well as to eventual problems with occupant discomfort, productivity losses, and health issues. Going forward, better tools are needed for considering the human-building interaction as a key part of energy efficiency strategies that promote good Indoor Environmental Quality (IEQ) in buildings. This dissertation presents the development and implementation of a Human and Building Interaction Toolkit (HABIT), a framework for the integrated simulation of office occupants' thermally adaptive behaviors, IEQ, and building energy use as part of sustainable building design and operation. Development of HABIT begins with an effort to devise more reliable methods for predicting individual occupants' thermal comfort, considered the driving force behind the behaviors of focus for this project. A long-term field study of thermal comfort and behavior is then presented, and the data it generates are used to develop and validate an agent-based behavior simulation model. Key aspects of the agent-based behavior model are described, and its predictive abilities are shown to compare favorably to those of multiple other behavior modeling options. Finally, the agent-based behavior model is linked with whole building energy simulation in EnergyPlus, forming the full HABIT program. The program is used to evaluate the energy and IEQ impacts of several occupant behavior scenarios in the simulation of a case study office building for the Philadelphia climate. 
Results indicate that more efficient local heating/cooling options may be paired with wider set point ranges to yield up to 24/28% HVAC energy savings in the winter/summer while also reducing thermal unacceptability; however, it is shown that the source of energy being saved must be considered in each case, as local heating options end up replacing cheaper, more carbon-friendly gas heating with expensive, emissions-heavy plug load electricity. The dissertation concludes with a summary of key outcomes and suggests how HABIT may be further developed in the future.

  14. Prediction of polycystic ovarian syndrome based on ultrasound findings and clinical parameters.

    PubMed

    Moschos, Elysia; Twickler, Diane M

    2015-03-01

    To determine the accuracy of sonographically diagnosed polycystic ovaries and clinical parameters in predicting polycystic ovarian syndrome. Medical records and ultrasounds of 151 women with sonographically diagnosed polycystic ovaries were reviewed. Sonographic criteria for polycystic ovaries were based on 2003 Rotterdam European Society of Human Reproduction and Embryology/American Society for Reproductive Medicine guidelines: at least one ovary with 12 or more follicles measuring 2-9 mm and/or increased ovarian volume >10 cm³. Clinical variables of age, gravidity, ethnicity, body mass index, and sonographic indication were collected. One hundred thirty-five patients had final outcomes (presence/absence of polycystic ovarian syndrome). Polycystic ovarian syndrome was diagnosed if a patient had at least one other of the following two criteria: oligo/chronic anovulation and/or clinical/biochemical hyperandrogenism. A logistic regression model was constructed using stepwise selection to identify variables significantly associated with polycystic ovarian syndrome (p < .05). The validity of the model was assessed using receiver operating characteristics and Hosmer-Lemeshow χ² analyses. One hundred twenty-eight patients met official sonographic criteria for polycystic ovaries and 115 (89.8%) had polycystic ovarian syndrome (p = .009). Lower gravidity, abnormal bleeding, and body mass index >33 were significant in predicting polycystic ovarian syndrome (receiver operating characteristics curve, c = 0.86). Pain decreased the likelihood of polycystic ovarian syndrome. Polycystic ovaries on ultrasound were sensitive in predicting polycystic ovarian syndrome. Ultrasound, combined with clinical parameters, can be used to generate a predictive index for polycystic ovarian syndrome. © 2014 Wiley Periodicals, Inc.
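Turning a fitted logistic model into the kind of "predictive index" the abstract mentions is mechanical: P = 1 / (1 + exp(-(intercept + Σ coef·feature))). The coefficient and intercept values below are invented placeholders, not the fitted values from the study:

```python
import math

# Logistic predictive index from (hypothetical) fitted coefficients.

def pcos_probability(coefs, intercept, features):
    """Predicted probability from a fitted logistic model."""
    z = intercept + sum(coefs[name] * value
                        for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

coefs = {"low_gravidity": 1.2, "abnormal_bleeding": 0.9,
         "bmi_over_33": 1.1, "pain": -0.8}          # hypothetical weights
patient = {"low_gravidity": 1, "abnormal_bleeding": 1,
           "bmi_over_33": 0, "pain": 0}             # binary indicators
print(round(pcos_probability(coefs, -1.5, patient), 3))  # → 0.646
```

Note the sign convention matches the abstract: the pain feature carries a negative weight, so its presence lowers the predicted probability, while the other three raise it.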

  15. A Model to Predict the Risk of Keratinocyte Carcinomas.

    PubMed

    Whiteman, David C; Thompson, Bridie S; Thrift, Aaron P; Hughes, Maria-Celia; Muranushi, Chiho; Neale, Rachel E; Green, Adele C; Olsen, Catherine M

    2016-06-01

    Basal cell and squamous cell carcinomas of the skin are the commonest cancers in humans, yet no validated tools exist to estimate future risks of developing keratinocyte carcinomas. To develop a prediction tool, we used baseline data from a prospective cohort study (n = 38,726) in Queensland, Australia, and used data linkage to capture all surgically excised keratinocyte carcinomas arising within the cohort. Predictive factors were identified through stepwise logistic regression models. In secondary analyses, we derived separate models within strata of prior skin cancer history, age, and sex. The primary model included terms for 10 items. Factors with the strongest effects were >20 prior skin cancers excised (odds ratio 8.57, 95% confidence interval [95% CI] 6.73-10.91), >50 skin lesions destroyed (odds ratio 3.37, 95% CI 2.85-3.99), age ≥ 70 years (odds ratio 3.47, 95% CI 2.53-4.77), and fair skin color (odds ratio 1.75, 95% CI 1.42-2.15). Discrimination in the validation dataset was high (area under the receiver operator characteristic curve 0.80, 95% CI 0.79-0.81) and the model appeared well calibrated. Among those reporting no prior history of skin cancer, a similar model with 10 factors predicted keratinocyte carcinoma events with reasonable discrimination (area under the receiver operator characteristic curve 0.72, 95% CI 0.70-0.75). Algorithms using self-reported patient data have high accuracy for predicting risks of keratinocyte carcinomas. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  16. Francis Bacon's behavioral psychology.

    PubMed

    MacDonald, Paul S

    2007-01-01

    Francis Bacon offers two accounts of the nature and function of the human mind: one is a medical-physical account of the composition and operation of spirits specific to human beings, the other is a behavioral account of the character and activities of individual persons. The medical-physical account is a run-of-the-mill version of the late Renaissance model of elemental constituents and humoral temperaments. The other, less well-known, behavioral account represents an unusual position in early modern philosophy. This theory espouses a form of behavioral psychology according to which (a) supposed mental properties are "hidden forms" best described in dispositional terms, (b) the true character of an individual can be discovered in his observable behavior, and (c) an "informed" understanding of these properties permits the prediction and control of human behavior. Both of Bacon's theories of human nature fall under his general notion of systematic science: his medical-physical theory of vital spirits is theoretical natural philosophy and his behavioral theory of disposition and expression is operative natural philosophy. Because natural philosophy as a whole is "the inquiry of causes and the production of effects," knowledge of human nature falls under the same two-part definition. It is an inquisition of forms that pertains to the patterns of minute motions in the vital spirits and the production of effects that pertains both to the way these hidden motions produce behavioral effects and to the way in which a skillful agent is able to produce desired effects in other persons' behavior. (c) 2007 Wiley Periodicals, Inc.

  17. Chimeric mice with humanized liver: Application in drug metabolism and pharmacokinetics studies for drug discovery.

    PubMed

    Naritomi, Yoichi; Sanoh, Seigo; Ohta, Shigeru

    2018-02-01

    Predicting human drug metabolism and pharmacokinetics (PK) is key to drug discovery. In particular, it is important to predict human PK, metabolite profiles and drug-drug interactions (DDIs). Various methods have been used for such predictions, including in vitro metabolic studies using human biological samples, such as hepatic microsomes and hepatocytes, and in vivo studies using experimental animals. However, prediction studies using these methods are often inconclusive due to discrepancies between in vitro and in vivo results, and interspecies differences in drug metabolism. Further, the prediction methods have changed from qualitative to quantitative to solve these issues. Chimeric mice with humanized liver have been developed, in which mouse liver cells are mostly replaced with human hepatocytes. Since human drug metabolizing enzymes are expressed in the liver of these mice, they are regarded as suitable models for mimicking the drug metabolism and PK observed in humans; therefore, these mice are useful for predicting human drug metabolism and PK. In this review, we discuss the current state, issues, and future directions of predicting human drug metabolism and PK using chimeric mice with humanized liver in drug discovery. Copyright © 2017 The Japanese Society for the Study of Xenobiotics. Published by Elsevier Ltd. All rights reserved.

  18. Neural Mechanisms for Adaptive Learned Avoidance of Mental Effort.

    PubMed

    Mitsuto Nagase, Asako; Onoda, Keiichi; Clifford Foo, Jerome; Haji, Tomoki; Akaishi, Rei; Yamaguchi, Shuhei; Sakai, Katsuyuki; Morita, Kenji

    2018-02-05

    Humans tend to avoid mental effort. Previous studies have demonstrated this tendency using various demand-selection tasks; participants generally avoid options associated with higher cognitive demand. However, it remains unclear whether humans avoid mental effort adaptively in uncertain and non-stationary environments, and if so, what neural mechanisms underlie this learned avoidance and whether they remain the same irrespective of cognitive-demand types. We addressed these issues by developing novel demand-selection tasks where associations between choice options and cognitive-demand levels change over time, with two variations using mental arithmetic and spatial reasoning problems (29:4 and 18:2 males:females). Most participants showed avoidance, and their choices depended on the demand experienced on multiple preceding trials. We assumed that participants updated the expected cost of mental effort through experience, and fitted their choices by reinforcement learning models, comparing several possibilities. Model-based fMRI analyses revealed that activity in the dorsomedial and lateral frontal cortices was positively correlated with the trial-by-trial expected cost for the chosen option commonly across the different types of cognitive demand, and also revealed a trend of negative correlation in the ventromedial prefrontal cortex. We further identified correlates of cost-prediction-error at time of problem-presentation or answering the problem, the latter of which partially overlapped with or were proximal to the correlates of expected cost at time of choice-cue in the dorsomedial frontal cortex. These results suggest that humans adaptively learn to avoid mental effort, having neural mechanisms to represent expected cost and cost-prediction-error, and the same mechanisms operate for various types of cognitive demand. SIGNIFICANCE STATEMENT In daily life, humans encounter various cognitive demands, and tend to avoid high-demand options. 
However, it remains unclear whether humans avoid mental effort adaptively under dynamically changing environments, and if so, what the underlying neural mechanisms are and whether they operate irrespective of cognitive-demand types. To address these issues, we developed novel tasks, where participants could learn to avoid high-demand options under uncertain and non-stationary environments. Through model-based fMRI analyses, we found regions whose activity was correlated with the expected mental effort cost, or cost-prediction-error, regardless of demand type, overlapping or adjacent in the dorsomedial frontal cortex. This finding contributes to clarifying the mechanisms for cognitive-demand avoidance, and provides empirical building blocks for the emerging computational theory of mental effort. Copyright © 2018 the authors.
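
    The reinforcement-learning account described above can be sketched as a delta-rule update of each option's expected effort cost plus a softmax choice rule that avoids the costlier option. The learning rate, inverse temperature, and demand schedule below are illustrative assumptions, not the paper's fitted parameters:

```python
import math
import random

# Minimal sketch: expected mental effort cost of each option is updated by a
# delta rule from experienced demand, and choices avoid the costlier option
# via a softmax over negative expected cost. Parameters are assumptions.

def simulate(trials=500, alpha=0.2, beta=5.0, seed=1):
    rng = random.Random(seed)
    cost = [0.5, 0.5]           # expected effort cost of options A and B
    true_demand = [0.2, 0.8]    # option B is the high-demand option
    picks = [0, 0]
    for _ in range(trials):
        # softmax: lower expected cost -> higher choice probability
        w = [math.exp(-beta * c) for c in cost]
        choice = 0 if rng.random() < w[0] / (w[0] + w[1]) else 1
        picks[choice] += 1
        experienced = true_demand[choice] + rng.gauss(0, 0.05)
        prediction_error = experienced - cost[choice]  # cost-prediction-error
        cost[choice] += alpha * prediction_error       # delta-rule update
    return cost, picks

cost, picks = simulate()
print(cost, picks)  # low-demand option is learned and chosen more often
```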

  19. Integrating in Silico and in Vitro Approaches To Predict Drug Accessibility to the Central Nervous System.

    PubMed

    Zhang, Yan-Yan; Liu, Houfu; Summerfield, Scott G; Luscombe, Christopher N; Sahi, Jasminder

    2016-05-02

    Estimation of uptake across the blood-brain barrier (BBB) is key to designing central nervous system (CNS) therapeutics. In silico approaches ranging from physicochemical rules to quantitative structure-activity relationship (QSAR) models are utilized to predict potential for CNS penetration of new chemical entities. However, there are still gaps in our knowledge of (1) the relationship between marketed human drug derived CNS-accessible chemical space and preclinical neuropharmacokinetic (neuroPK) data, (2) interpretability of the selected physicochemical descriptors, and (3) correlation of the in vitro human P-glycoprotein (P-gp) efflux ratio (ER) and in vivo rodent unbound brain-to-blood ratio (Kp,uu), as these are assays routinely used to predict clinical CNS exposure, during drug discovery. To close these gaps, we explored the CNS druglike property boundaries of 920 market oral drugs (315 CNS and 605 non-CNS) and 846 compounds (54 CNS drugs and 792 proprietary GlaxoSmithKline compounds) with available rat Kp,uu data. The exact permeability coefficient (Pexact) and P-gp ER were determined for 176 compounds from the rat Kp,uu data set. Receiver operating characteristic curves were performed to evaluate the predictive power of human P-gp ER for rat Kp,uu. Our data demonstrates that simple physicochemical rules (most acidic pKa ≥ 9.5 and TPSA < 100) in combination with P-gp ER < 1.5 provide mechanistic insights for filtering BBB permeable compounds. For comparison, six classification modeling methods were investigated using multiple sets of in silico molecular descriptors. We present a random forest model with excellent predictive power (∼0.75 overall accuracy) using the rat neuroPK data set. We also observed good concordance between the structural interpretation results and physicochemical descriptor importance from the Kp,uu classification QSAR model. 
In summary, we propose a novel, hybrid in silico/in vitro approach and an in silico screening model for the effective development of chemical series with the potential to achieve optimal CNS exposure.
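
    The filtering rule the authors report (most acidic pKa ≥ 9.5, TPSA < 100, P-gp efflux ratio < 1.5) is simple enough to express directly; the compound names and property values below are invented for illustration:

```python
# Sketch of the simple filtering rule reported above: flag a compound as a
# candidate for BBB permeability when its most acidic pKa >= 9.5, its TPSA
# is < 100, and its P-gp efflux ratio is < 1.5. Example values are invented.

def cns_filter(most_acidic_pka, tpsa, pgp_er):
    return most_acidic_pka >= 9.5 and tpsa < 100 and pgp_er < 1.5

compounds = [
    {"name": "cmpd-1", "most_acidic_pka": 10.2, "tpsa": 65.0,  "pgp_er": 0.9},
    {"name": "cmpd-2", "most_acidic_pka": 8.1,  "tpsa": 70.0,  "pgp_er": 1.1},
    {"name": "cmpd-3", "most_acidic_pka": 11.0, "tpsa": 120.0, "pgp_er": 0.8},
]
passing = [c["name"] for c in compounds
           if cns_filter(c["most_acidic_pka"], c["tpsa"], c["pgp_er"])]
print(passing)  # -> ['cmpd-1']
```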

  20. Design and Evaluation of the Terminal Area Precision Scheduling and Spacing System

    NASA Technical Reports Server (NTRS)

    Swenson, Harry N.; Thipphavong, Jane; Sadovsky, Alex; Chen, Liang; Sullivan, Chris; Martin, Lynne

    2011-01-01

    This paper describes the design, development and results from a high fidelity human-in-the-loop simulation of an integrated set of trajectory-based automation tools providing precision scheduling, sequencing and controller merging and spacing functions. These integrated functions are combined into a system called the Terminal Area Precision Scheduling and Spacing (TAPSS) system. It is a strategic and tactical planning tool that provides Traffic Management Coordinators, En Route and Terminal Radar Approach Control air traffic controllers the ability to efficiently optimize the arrival capacity of a demand-impacted airport while simultaneously enabling fuel-efficient descent procedures. The TAPSS system consists of four-dimensional trajectory prediction, arrival runway balancing, aircraft separation constraint-based scheduling, traffic flow visualization and trajectory-based advisories to assist controllers in efficient metering, sequencing and spacing. The TAPSS system was evaluated and compared to today's ATC operation through an extensive series of human-in-the-loop simulations for arrival flows into the Los Angeles International Airport. The test conditions included the variation of aircraft demand from a baseline of today's capacity constrained periods through 5%, 10% and 20% increases. Performance data were collected for engineering and human factor analysis and compared with similar operations both with and without the TAPSS system. The engineering data indicate that operations with the TAPSS system show up to a 10% increase in airport throughput during capacity constrained periods while maintaining fuel-efficient aircraft descent profiles from cruise to landing.

  1. Identification of the feedforward component in manual control with predictable target signals.

    PubMed

    Drop, Frank M; Pool, Daan M; Damveld, Herman J; van Paassen, Marinus M; Mulder, Max

    2013-12-01

    In the manual control of a dynamic system, the human controller (HC) often follows a visible and predictable reference path. Compared with a purely feedback control strategy, performance can be improved by making use of this knowledge of the reference. The operator could effectively introduce feedforward control in conjunction with a feedback path to compensate for errors, as hypothesized in the literature. However, feedforward behavior has never been identified from experimental data, nor have the hypothesized models been validated. This paper investigates human control behavior in pursuit tracking of a predictable reference signal while being perturbed by a quasi-random multisine disturbance signal. An experiment was done in which the relative strengths of the target and disturbance signals were systematically varied. The anticipated changes in control behavior were studied by means of an ARX model analysis and by fitting three parametric HC models: two different feedback models and a combined feedforward and feedback model. The ARX analysis shows that the experiment participants employed control action on both the error and the target signal. The control action on the target was similar to the inverse of the system dynamics. Model fits show that this behavior can be modeled best by the combined feedforward and feedback model.
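
    The identified structure, a feedforward path approximating the inverse of the system dynamics plus feedback on the error, can be illustrated on a toy plant. Here the plant is a pure integrator (an assumption for the sketch, not the experiment's controlled element), so the inverse-dynamics feedforward is simply the target's rate of change:

```python
import math

# Toy illustration of the combined feedforward-plus-feedback structure:
# feedback acts on the tracking error, while the feedforward path applies
# the inverse of the (pure integrator) plant to the predictable target.
# Plant, gains, and the sinusoidal target are assumptions for this sketch.

def track(use_feedforward, steps=200, dt=0.05, kp=2.0):
    y = 0.0
    sq_err = 0.0
    for k in range(steps):
        r_now = math.sin(0.5 * k * dt)        # predictable target signal
        r_next = math.sin(0.5 * (k + 1) * dt)
        e = r_now - y
        sq_err += e * e
        u = kp * e                            # feedback on the error
        if use_feedforward:
            u += (r_next - r_now) / dt        # inverse-dynamics feedforward
        y += u * dt                           # integrator plant
    return math.sqrt(sq_err / steps)          # RMS tracking error

rms_feedback_only = track(False)
rms_with_feedforward = track(True)
print(rms_with_feedforward < rms_feedback_only)  # feedforward reduces error
```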

  2. Predictive modeling of altitude decompression sickness in humans

    NASA Technical Reports Server (NTRS)

    Kenyon, D. J.; Hamilton, R. W., Jr.; Colley, I. A.; Schreiner, H. R.

    1972-01-01

    The coding of data on 2,565 individual human altitude chamber tests is reported as part of a selection procedure designed to eliminate individuals who are highly susceptible to decompression sickness: individual aircrew members were exposed to the pressure equivalent of 37,000 feet and observed for one hour. Many entries refer to subjects who have been tested two or three times. These data contain a substantial body of statistical information important to the understanding of the mechanisms of altitude decompression sickness and for the computation of improved high altitude operating procedures. Appropriate computer formats and encoding procedures were developed and all 2,565 entries have been converted to these formats and stored on magnetic tape. A gas loading file was produced.

  3. High level intelligent control of telerobotics systems

    NASA Technical Reports Server (NTRS)

    Mckee, James

    1988-01-01

    A high level robot command language is proposed for the autonomous mode of an advanced telerobotics system and a predictive display mechanism for the teleoperational mode. It is believed that any such system will involve some mixture of these two modes, since, although artificial intelligence can facilitate significant autonomy, a system that can resort to teleoperation will always have the advantage. The high level command language will allow humans to give the robot instructions in a very natural manner. The robot will then analyze these instructions to infer meaning so that it can translate the task into lower level executable primitives. If, however, the robot is unable to perform the task autonomously, it will switch to the teleoperational mode. The time delay between control movement and actual robot movement has always been a problem in teleoperations. The remote operator may not actually see (via a monitor) the results of their actions for several seconds. A computer generated predictive display system is proposed whereby the operator can see a real-time model of the robot's environment and the delayed video picture on the monitor at the same time.

  4. NEEMO 14: Evaluation of Human Performance for Rover, Cargo Lander, Crew Lander, and Exploration Tasks in Simulated Partial Gravity

    NASA Technical Reports Server (NTRS)

    Chappell, Steven P.; Abercromby, Andrew F.; Gernhardt, Michael L.

    2011-01-01

    The ultimate success of future human space exploration missions is dependent on the ability to perform extravehicular activity (EVA) tasks effectively, efficiently, and safely, whether those tasks represent a nominal mode of operation or a contingency capability. To optimize EVA systems for the best human performance, it is critical to study the effects of varying key factors such as suit center of gravity (CG), suit mass, and gravity level. During the 2-week NASA Extreme Environment Mission Operations (NEEMO) 14 mission, four crewmembers performed a series of EVA tasks under different simulated EVA suit configurations and used full-scale mockups of a Space Exploration Vehicle (SEV) rover and lander. NEEMO is an underwater spaceflight analog that allows a true mission-like operational environment and uses buoyancy effects and added weight to simulate different gravity levels. Quantitative and qualitative data collected during NEEMO 14, as well as from spacesuit tests in parabolic flight and with overhead suspension, are being used to directly inform ongoing hardware and operations concept development of the SEV, exploration EVA systems, and future EVA suits. OBJECTIVE: To compare human performance across different weight and CG configurations. METHODS: Four subjects were weighed out to simulate reduced gravity and wore either a specially designed rig to allow adjustment of CG or a PLSS mockup. Subjects completed tasks including level ambulation, incline/decline ambulation, standing from the kneeling and prone position, picking up objects, shoveling, ladder climbing, incapacitated crewmember handling, and small and large payload transfer. Subjective compensation, exertion, task acceptability, and duration data as well as photo and video were collected. RESULTS: There appear to be interactions between CG, weight, and task. CGs nearest the subject's natural CG are the most predictable in terms of acceptable performance across tasks. 
Future research should focus on understanding the interactions between CG, mass, and subject differences.

  5. Minimizing Human Risk: Human Performance Models in the Human Factors and Behavioral Performance Element

    NASA Technical Reports Server (NTRS)

    Gore, Brian F.

    2017-01-01

    Human space exploration has never been more exciting than it is today. Human presence to outer worlds is becoming a reality as humans are leveraging much of our prior knowledge to the new mission of going to Mars. Exploring the solar system at greater distances from Earth than ever before will possess some unique challenges, which can be overcome thanks to the advances in modeling and simulation technologies. The National Aeronautics and Space Administration (NASA) is at the forefront of exploring our solar system. NASA's Human Research Program (HRP) focuses on discovering the best methods and technologies that support safe and productive human space travel in the extreme and harsh space environment. HRP uses various methods and approaches to answer questions about the impact of long duration missions on the human in space including: gravity's impact on the human body, isolation and confinement on the human, hostile environments' impact on the human, space radiation, and how the distance is likely to impact the human. Predictive models are included in the HRP research portfolio as these models provide valuable insights into human-system operations. This paper will provide an overview of NASA's HRP and will present a number of projects that have used modeling and simulation to provide insights into human-system issues (e.g. automation, habitat design, schedules) in anticipation of space exploration.

  6. Minimizing Human Risk: Human Performance Models in the Space Human Factors and Habitability and Behavioral Health and Performance Elements

    NASA Technical Reports Server (NTRS)

    Gore, Brian F.

    2016-01-01

    Human space exploration has never been more exciting than it is today. Human presence to outer worlds is becoming a reality as humans are leveraging much of our prior knowledge to the new mission of going to Mars. Exploring the solar system at greater distances from Earth than ever before will possess some unique challenges, which can be overcome thanks to the advances in modeling and simulation technologies. The National Aeronautics and Space Administration (NASA) is at the forefront of exploring our solar system. NASA's Human Research Program (HRP) focuses on discovering the best methods and technologies that support safe and productive human space travel in the extreme and harsh space environment. HRP uses various methods and approaches to answer questions about the impact of long duration missions on the human in space including: gravity's impact on the human body, isolation and confinement on the human, hostile environments' impact on the human, space radiation, and how the distance is likely to impact the human. Predictive models are included in the HRP research portfolio as these models provide valuable insights into human-system operations. This paper will provide an overview of NASA's HRP and will present a number of projects that have used modeling and simulation to provide insights into human-system issues (e.g. automation, habitat design, schedules) in anticipation of space exploration.

  7. Predicting the Consequences of Workload Management Strategies with Human Performance Modeling

    NASA Technical Reports Server (NTRS)

    Mitchell, Diane Kuhl; Samma, Charneta

    2011-01-01

    Human performance modelers at the US Army Research Laboratory have developed an approach for establishing Soldier high workload that can be used for analyses of proposed system designs. Their technique includes three key components. To implement the approach in an experiment, the researcher would create two experimental conditions: a baseline and a design alternative. Next they would identify a scenario in which the test participants perform all their representative concurrent interactions with the system. This scenario should include any events that would trigger a different set of goals for the human operators. They would collect workload values during both the baseline and alternative design conditions to see if the alternative increased workload and decreased performance. They have successfully implemented this approach for military vehicle designs using the human performance modeling tool, IMPRINT. Although ARL researchers use IMPRINT to implement their approach, it can be applied to any workload analysis. Researchers using other modeling and simulation tools or conducting experiments or field tests can use the same approach.

  8. Copula based prediction models: an application to an aortic regurgitation study

    PubMed Central

    Kumar, Pranesh; Shoukri, Mohamed M

    2007-01-01

    Background: An important issue in prediction modeling of multivariate data is the measure of dependence structure. The use of Pearson's correlation as a dependence measure has several pitfalls and hence application of regression prediction models based on this correlation may not be an appropriate methodology. As an alternative, a copula based methodology for prediction modeling and an algorithm to simulate data are proposed. Methods: The method consists of introducing copulas as an alternative to the correlation coefficient commonly used as a measure of dependence. An algorithm based on the marginal distributions of random variables is applied to construct the Archimedean copulas. Monte Carlo simulations are carried out to replicate datasets, estimate prediction model parameters and validate them using Lin's concordance measure. Results: We have carried out a correlation-based regression analysis on data from 20 patients aged 17–82 years on pre-operative and post-operative ejection fractions after surgery and estimated the prediction model: Post-operative ejection fraction = - 0.0658 + 0.8403 (Pre-operative ejection fraction); p = 0.0008; 95% confidence interval of the slope coefficient (0.3998, 1.2808). From the exploratory data analysis, it is noted that both the pre-operative and post-operative ejection fractions measurements have slight departures from symmetry and are skewed to the left. It is also noted that the measurements tend to be widely spread and have shorter tails compared to normal distribution. Therefore predictions made from the correlation-based model corresponding to the pre-operative ejection fraction measurements in the lower range may not be accurate. Further it is found that the best approximated marginal distributions of pre-operative and post-operative ejection fractions (using q-q plots) are gamma distributions. 
The copula-based prediction model is estimated as: Post-operative ejection fraction = - 0.0933 + 0.8907 × (Pre-operative ejection fraction); p = 0.00008; 95% confidence interval for slope coefficient (0.4810, 1.3003). The predicted post-operative ejection fractions in the lower range of pre-operative ejection measurements differ considerably between the two models, and the prediction errors of the copula model are smaller. To validate the copula methodology we have re-sampled with replacement fifty independent bootstrap samples and have estimated concordance statistics 0.7722 (p = 0.0224) for the copula model and 0.7237 (p = 0.0604) for the correlation model. The predicted and observed measurements are concordant for both models. The estimates of accuracy components are 0.9233 and 0.8654 for copula and correlation models respectively. Conclusion: Copula-based prediction modeling is demonstrated to be an appropriate alternative to the conventional correlation-based prediction modeling since the correlation-based prediction models are not appropriate to model the dependence in populations with asymmetrical tails. The proposed copula-based prediction model has been validated using the independent bootstrap samples. PMID:17573974
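
    The simulation step of such a copula methodology can be sketched with a Clayton copula, a common Archimedean family, sampled by conditional inversion. The dependence parameter theta below is an arbitrary illustration; the study fitted its own copula and used gamma marginals:

```python
import random

# Sketch of simulating dependent uniform pairs from a Clayton (Archimedean)
# copula by conditional inversion. The sampled uniforms would then be mapped
# through the fitted marginals (the paper found gamma marginals). theta = 2.0
# is an assumption for illustration, not the study's fitted value.

def clayton_pair(theta, rng):
    # valid for theta > 0; larger theta means stronger lower-tail dependence
    u = rng.random()
    t = rng.random()
    v = (u ** -theta * (t ** (-theta / (1.0 + theta)) - 1.0) + 1.0) ** (-1.0 / theta)
    return u, v

def kendall_tau_from_theta(theta):
    # closed form for the Clayton family: tau = theta / (theta + 2)
    return theta / (theta + 2.0)

rng = random.Random(0)
pairs = [clayton_pair(2.0, rng) for _ in range(1000)]
print(kendall_tau_from_theta(2.0))  # -> 0.5
```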

  9. The architecture of human kin detection

    PubMed Central

    Lieberman, Debra; Tooby, John; Cosmides, Leda

    2012-01-01

    Evolved mechanisms for assessing genetic relatedness have been found in many species, but their existence in humans has been a matter of controversy. Here we report three converging lines of evidence, drawn from siblings, that support the hypothesis that kin detection mechanisms exist in humans. These operate by computing, for each familiar individual, a unitary regulatory variable (the kinship index) that corresponds to a pairwise estimate of genetic relatedness between self and other. The cues that the system uses were identified by quantitatively matching individual exposure to potential cues of relatedness to variation in three outputs relevant to the system’s evolved functions: sibling altruism, aversion to personally engaging in sibling incest, and moral opposition to third party sibling incest. As predicted, the kin detection system uses two distinct, ancestrally valid cues to compute relatedness: the familiar other’s perinatal association with the individual’s biological mother, and duration of sibling coresidence. PMID:17301784

  10. A computational visual saliency model based on statistics and machine learning.

    PubMed

    Lin, Ru-Je; Lin, Wei-Song

    2014-08-01

    Identifying the type of stimuli that attracts human visual attention has been an appealing topic for scientists for many years. In particular, marking the salient regions in images is useful for both psychologists and many computer vision applications. In this paper, we propose a computational approach for producing saliency maps using statistics and machine learning methods. Based on four assumptions, three properties (Feature-Prior, Position-Prior, and Feature-Distribution) can be derived and combined by a simple intersection operation to obtain a saliency map. These properties are implemented by a similarity computation, support vector regression (SVR) technique, statistical analysis of training samples, and information theory using low-level features. This technique is able to learn the preferences of human visual behavior while simultaneously considering feature uniqueness. Experimental results show that our approach performs better in predicting human visual attention regions than 12 other models in two test databases. © 2014 ARVO.
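
    The combination step described above, fusing the three property maps by a simple intersection, amounts to an elementwise product followed by rescaling. The 3×3 maps below are toy values standing in for the SVR- and statistics-derived maps:

```python
# Sketch of the final combination step: three per-pixel property maps are
# fused by a simple intersection (elementwise product) and rescaled to
# [0, 1] to give a saliency map. The 3x3 "maps" here are toy values; the
# real model derives them via SVR and statistical analysis of features.

def intersect_maps(*maps):
    h, w = len(maps[0]), len(maps[0][0])
    fused = [[1.0] * w for _ in range(h)]
    for m in maps:
        for i in range(h):
            for j in range(w):
                fused[i][j] *= m[i][j]
    peak = max(max(row) for row in fused)
    if peak > 0:
        fused = [[v / peak for v in row] for row in fused]
    return fused

feature_prior   = [[0.2, 0.9, 0.1], [0.3, 0.8, 0.2], [0.1, 0.4, 0.1]]
position_prior  = [[0.5, 1.0, 0.5], [0.6, 1.0, 0.6], [0.5, 0.9, 0.5]]
feature_distrib = [[0.4, 0.9, 0.3], [0.2, 0.7, 0.4], [0.3, 0.5, 0.2]]

saliency = intersect_maps(feature_prior, position_prior, feature_distrib)
print(saliency[0][1])  # -> 1.0 (the centre-top cell dominates after fusion)
```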

  11. The Elementary Operations of Human Vision Are Not Reducible to Template-Matching

    PubMed Central

    Neri, Peter

    2015-01-01

    It is generally acknowledged that biological vision presents nonlinear characteristics, yet linear filtering accounts of visual processing are ubiquitous. The template-matching operation implemented by the linear-nonlinear cascade (linear filter followed by static nonlinearity) is the most widely adopted computational tool in systems neuroscience. This simple model achieves remarkable explanatory power while retaining analytical tractability, potentially extending its reach to a wide range of systems and levels in sensory processing. The extent of its applicability to human behaviour, however, remains unclear. Because sensory stimuli possess multiple attributes (e.g. position, orientation, size), the issue of applicability may be asked by considering each attribute one at a time in relation to a family of linear-nonlinear models, or by considering all attributes collectively in relation to a specified implementation of the linear-nonlinear cascade. We demonstrate that human visual processing can operate under conditions that are indistinguishable from linear-nonlinear transduction with respect to substantially different stimulus attributes of a uniquely specified target signal with associated behavioural task. However, no specific implementation of a linear-nonlinear cascade is able to account for the entire collection of results across attributes; a satisfactory account at this level requires the introduction of a small gain-control circuit, resulting in a model that no longer belongs to the linear-nonlinear family. Our results inform and constrain efforts at obtaining and interpreting comprehensive characterizations of the human sensory process by demonstrating its inescapably nonlinear nature, even under conditions that have been painstakingly fine-tuned to facilitate template-matching behaviour and to produce results that, at some level of inspection, do conform to linear filtering predictions. 
They also suggest that compliance with linear transduction may be the targeted outcome of carefully crafted nonlinear circuits, rather than default behaviour exhibited by basic components. PMID:26556758

  12. Revolutionary Design for Astronaut Exploration — Beyond the Bio-Suit System

    NASA Astrophysics Data System (ADS)

    Newman, Dava J.; Canina, Marita; Trotti, Guillermo L.

    2007-01-01

    The Bio-Suit System is designed to revolutionize human space exploration by providing enhanced astronaut extravehicular activity (EVA) locomotion and performance based on the concepts of a 'second skin' capability. The novel Bio-Suit concept provides an overall exploration system realized through symbiotic relationships between a suite of advanced technologies, creative design, human modeling and analysis, and new mission operations techniques. By working at the intersection of engineering, design, life sciences and operations, new emergent capabilities and interrelationships result for applications to space missions, medical rehabilitation, and extreme sports activities. In many respects, the Bio-Suit System mimics Nature (biomimetics). For example, the second skin is capable of augmenting our biological skin by providing mechanical counter-pressure. We have designed and tested prototypes that prove mechanical counter-pressure feasibility. The 'epidermis' of our second skin suit is patterned from 3D laser scans that incorporate human skin strain field maps for maximum mobility and natural movements, while requiring minimum energy expenditure for exploration tasks. We provide a technology roadmap for future design, pressure production and technology investments for the Bio-Suit System. Woven into the second skin are active materials to enhance human performance as well as to provide necessary performance metrics (i.e., energy expenditure). Wearable technologies will be embedded throughout the Bio-Suit System to place the explorer in an information-rich environment enabling real-time mission planning, prediction, and visualization. The Bio-Suit System concept augments human capabilities by coupling human and robotic abilities into a hybrid of the two, to the point where the explorer is hardly aware of the boundary between innate human performance and robotic activities.

  13. Predicting red wolf release success in the southeastern United States

    USGS Publications Warehouse

    van Manen, Frank T.; Crawford, Barron A.; Clark, Joseph D.

    2000-01-01

    Although the red wolf (Canis rufus) was once found throughout the southeastern United States, indiscriminate killing and habitat destruction reduced its range to a small section of coastal Texas and Louisiana. Wolves trapped from 1973 to 1980 were taken to establish a captive breeding program that was used to repatriate 2 mainland and 3 island red wolf populations. We collected data from 320 red wolf releases in these areas and classified each as a success or failure based on survival and reproductive criteria, and whether recaptures were necessary to resolve conflicts with humans. We evaluated the relations between release success and conditions at the release sites, characteristics of released wolves, and release procedures. Although <44% of the variation in release success was explained, model performance based on jackknife tests indicated a 72-80% correct prediction rate for the 4 operational models we developed. The models indicated that success was associated with human influences on the landscape and the level of wolf habituation to humans prior to release. We applied the models to 31 prospective areas for wolf repatriation and calculated an index of release success for each area. Decision-makers can use these models to objectively rank prospective release areas and compare strengths and weaknesses of each.

  14. Hypoxic ventilatory sensitivity in men is not reduced by prolonged hyperoxia (Predictive Studies V and VI)

    NASA Technical Reports Server (NTRS)

    Gelfand, R.; Lambertsen, C. J.; Clark, J. M.; Hopkin, E.

    1998-01-01

    Potential adverse effects on the O2-sensing function of the carotid body when its cells are exposed to toxic O2 pressures were assessed during investigations of human organ tolerance to prolonged continuous and intermittent hyperoxia (Predictive Studies V and VI). Isocapnic hypoxic ventilatory responses (HVR) were determined at 1.0 ATA before and after severe hyperoxic exposures: 1) continuous O2 breathing at 1.5, 2.0, and 2.5 ATA for 17.7, 9.0, and 5.7 h and 2) intermittent O2 breathing at 2.0 ATA (30 min O2-30 min normoxia) for 14.3 O2 h within 30-h total time. Postexposure curvature of HVR hyperbolas was not reduced compared with preexposure controls. The hyperbolas were temporarily elevated to higher ventilations than controls due to increments in respiratory frequency that were proportional to O2 exposure time, not O2 pressure. In humans, prolonged hyperoxia does not attenuate the hypoxia-sensing function of the peripheral chemoreceptors, even after exposures that approach limits of human pulmonary and central nervous system O2 tolerance. Current applications of hyperoxia in hyperbaric O2 therapy and in subsea- and aerospace-related operations are guided by and are well within these exposure limits.

  15. Estimating the footprint of pollution on coral reefs with models of species turnover.

    PubMed

    Brown, Christopher J; Hamilton, Richard J

    2018-01-15

    Ecological communities typically change along gradients of human impact, although it is difficult to estimate the footprint of impacts for diffuse threats such as pollution. We developed a joint model (i.e., one that includes multiple species and their interactions with each other and environmental covariates) of benthic habitats on lagoonal coral reefs and used it to infer change in benthic composition along a gradient of distance from logging operations. The model estimated both changes in abundances of benthic groups and their compositional turnover, a type of beta diversity. We used the model to predict the footprint of turbidity impacts from past and recent logging. Benthic communities far from logging were dominated by branching corals, whereas communities close to logging had higher cover of dead coral, massive corals, and soft sediment. Recent impacts were predicted to be small relative to the extensive impacts of past logging because recent logging has occurred far from lagoonal reefs. Our model can be used more generally to estimate the footprint of human impacts on ecosystems and evaluate the benefits of conservation actions for ecosystems. © 2018 Society for Conservation Biology.

  16. Energy-Aware Topology Control Strategy for Human-Centric Wireless Sensor Networks

    PubMed Central

    Meseguer, Roc; Molina, Carlos; Ochoa, Sergio F.; Santos, Rodrigo

    2014-01-01

    The adoption of mobile and ubiquitous solutions that involve participatory or opportunistic sensing increases every day. This situation has highlighted the relevance of optimizing the energy consumption of these solutions, because their operation depends on the devices' battery lifetimes. This article presents a study that intends to understand how the prediction of topology control messages in human-centric wireless sensor networks can be used to help reduce the energy consumption of the participating devices. To that end, five research questions were defined and a simulation-based study was conducted to answer them. The obtained results help identify suitable mobile computing scenarios where the prediction of topology control messages can be used to save energy of the network nodes. They also make it possible to estimate the percentage of energy savings that can be expected, according to the features of the work scenario and the participants' behavior. Designers of mobile collaborative applications that involve participatory or opportunistic sensing can take advantage of these findings to increase the autonomy of their solutions. PMID:24514884

  17. Predicting outcome of Morris water maze test in vascular dementia mouse model with deep learning

    PubMed Central

    Mogi, Masaki; Iwanami, Jun; Min, Li-Juan; Bai, Hui-Yu; Shan, Bao-Shuai; Kukida, Masayoshi; Kan-no, Harumi; Ikeda, Shuntaro; Higaki, Jitsuo; Horiuchi, Masatsugu

    2018-01-01

    The Morris water maze test (MWM) is one of the most popular and established behavioral tests to evaluate rodents' spatial learning ability. The conventional training period is around 5 days, but there is no clear evidence or guidelines about the appropriate duration. In many cases, the final outcome of the MWM seems predictable from the earlier data and their trend. We therefore assumed that if the final result can be predicted with high accuracy, the experimental period could be shortened and the burden on testers reduced. An artificial neural network (ANN) is a useful modeling method for datasets that enables us to obtain an accurate mathematical model. Therefore, we constructed an ANN system to estimate the final outcome in the MWM from the previously obtained 4 days of data in both normal mice and vascular dementia model mice. Ten-week-old male C57BL/6 mice (wild type, WT) were subjected to bilateral common carotid artery stenosis (WT-BCAS) or sham operation (WT-sham). At 6 weeks after surgery, we evaluated their cognitive function with the MWM. Mean escape latency was significantly longer in WT-BCAS than in WT-sham. All data were collected and used as training data and test data for the ANN system. We defined a multilayer perceptron (MLP) as a prediction model using an open-source framework for deep learning, Chainer. After a certain number of updates, we compared the predicted values and actual measured values with test data. A significant correlation coefficient was derived from the updated ANN model in both WT-sham and WT-BCAS. Next, we analyzed the predictive capability of human testers with the same datasets. There was no significant difference in prediction accuracy between the human testers and the ANN models in either WT-sham or WT-BCAS. In conclusion, a deep learning method with an ANN could predict the final outcome in the MWM from 4 days of data with high predictive accuracy in a vascular dementia model. PMID:29415035

  18. Predicting outcome of Morris water maze test in vascular dementia mouse model with deep learning.

    PubMed

    Higaki, Akinori; Mogi, Masaki; Iwanami, Jun; Min, Li-Juan; Bai, Hui-Yu; Shan, Bao-Shuai; Kukida, Masayoshi; Kan-No, Harumi; Ikeda, Shuntaro; Higaki, Jitsuo; Horiuchi, Masatsugu

    2018-01-01

    The Morris water maze test (MWM) is one of the most popular and established behavioral tests to evaluate rodents' spatial learning ability. The conventional training period is around 5 days, but there is no clear evidence or guidelines about the appropriate duration. In many cases, the final outcome of the MWM seems predictable from the earlier data and their trend. We therefore assumed that if the final result can be predicted with high accuracy, the experimental period could be shortened and the burden on testers reduced. An artificial neural network (ANN) is a useful modeling method for datasets that enables us to obtain an accurate mathematical model. Therefore, we constructed an ANN system to estimate the final outcome in the MWM from the previously obtained 4 days of data in both normal mice and vascular dementia model mice. Ten-week-old male C57BL/6 mice (wild type, WT) were subjected to bilateral common carotid artery stenosis (WT-BCAS) or sham operation (WT-sham). At 6 weeks after surgery, we evaluated their cognitive function with the MWM. Mean escape latency was significantly longer in WT-BCAS than in WT-sham. All data were collected and used as training data and test data for the ANN system. We defined a multilayer perceptron (MLP) as a prediction model using an open-source framework for deep learning, Chainer. After a certain number of updates, we compared the predicted values and actual measured values with test data. A significant correlation coefficient was derived from the updated ANN model in both WT-sham and WT-BCAS. Next, we analyzed the predictive capability of human testers with the same datasets. There was no significant difference in prediction accuracy between the human testers and the ANN models in either WT-sham or WT-BCAS. In conclusion, a deep learning method with an ANN could predict the final outcome in the MWM from 4 days of data with high predictive accuracy in a vascular dementia model.
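The study's setup — predicting the day-5 MWM outcome from the first 4 days with a small MLP — can be sketched in a few lines of NumPy. The synthetic "latency" curves and the tiny 4→8→1 network below are invented stand-ins for the real data and the Chainer model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for MWM data: escape latency over 5 days per mouse,
# modeled as exponential decay plus noise (not the study's measurements).
n = 200
start = rng.uniform(40, 60, n)
decay = rng.uniform(0.1, 0.4, n)
days = np.arange(1, 6)
latency = start[:, None] * np.exp(-decay[:, None] * days) + rng.normal(0, 1.0, (n, 5))
X = (latency[:, :4] - latency[:, :4].mean(0)) / latency[:, :4].std(0)  # days 1-4
y = latency[:, 4]                                                      # day-5 target

# Tiny MLP (4 -> 8 -> 1) trained with full-batch gradient descent.
W1 = rng.normal(0, 0.1, (4, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.1, (8, 1)); b2 = np.zeros(1)
lr = 1e-2
for _ in range(3000):
    h = np.tanh(X @ W1 + b1)
    pred = (h @ W2 + b2).ravel()
    # Backpropagation of mean-squared-error gradients.
    g_pred = (2.0 / n) * (pred - y)[:, None]
    gW2 = h.T @ g_pred; gb2 = g_pred.sum(0)
    g_h = (g_pred @ W2.T) * (1.0 - h ** 2)
    gW1 = X.T @ g_h; gb1 = g_h.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

pred = (np.tanh(X @ W1 + b1) @ W2 + b2).ravel()
corr = np.corrcoef(pred, y)[0, 1]
print(round(corr, 2))
```

The correlation coefficient between predicted and held-out final values is the same success criterion the abstract reports; in practice one would of course evaluate on data not used for training.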

  19. Interspecies scaling and prediction of human clearance: comparison of small- and macro-molecule drugs

    PubMed Central

    Huh, Yeamin; Smith, David E.; Feng, Meihau Rose

    2014-01-01

    Human clearance prediction for small- and macro-molecule drugs was evaluated and compared using various scaling methods and statistical analysis. Human clearance is generally well predicted using single- or multiple-species simple allometry for macro- and small-molecule drugs excreted renally. The prediction error is higher for hepatically eliminated small molecules using single- or multiple-species simple allometry scaling, and the error appears to be mainly associated with drugs with a low hepatic extraction ratio (Eh). The error in human clearance prediction for hepatically eliminated small molecules was reduced using scaling methods with a correction for maximum life span (MLP) or brain weight (BRW). Human clearance of both small- and macro-molecule drugs is well predicted using the monkey liver blood flow method. Predictions using liver blood flow from other species did not work as well, especially for the small-molecule drugs. PMID:21892879
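The simple allometry referred to above fits CL = a·W^b across species, i.e. a straight line in log-log space, and extrapolates to a 70 kg human. A minimal sketch with invented preclinical numbers (any resemblance to real drug data is accidental):

```python
import numpy as np

# Hypothetical preclinical clearance data (mL/min) and body weights (kg)
# for mouse, rat, and monkey -- illustrative numbers, not from the study.
weights = np.array([0.02, 0.25, 5.0])
clearances = np.array([1.2, 8.0, 90.0])

# Simple allometry: CL = a * W^b, fitted as a line in log-log space.
b, log_a = np.polyfit(np.log(weights), np.log(clearances), 1)
a = np.exp(log_a)

# Extrapolate to a 70 kg human.
cl_human = a * 70.0 ** b
print(round(b, 2), round(cl_human, 1))
```

The MLP and BRW corrections the abstract mentions replace CL with CL·(maximum life span) or CL·(brain weight) before fitting the same log-log line, which tends to pull down over-predictions for low-extraction, hepatically cleared compounds.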

  20. Validation of Aircraft Noise Prediction Models at Low Levels of Exposure

    NASA Technical Reports Server (NTRS)

    Page, Juliet A.; Hobbs, Christopher M.; Plotkin, Kenneth J.; Stusnick, Eric; Shepherd, Kevin P. (Technical Monitor)

    2000-01-01

    Aircraft noise measurements were made at Denver International Airport for a period of four weeks. Detailed operational information was provided by airline operators which enabled noise levels to be predicted using the FAA's Integrated Noise Model. Several thrust prediction techniques were evaluated. Measured sound exposure levels for departure operations were found to be 4 to 10 dB higher than predicted, depending on the thrust prediction technique employed. Differences between measured and predicted levels are shown to be related to atmospheric conditions present at the aircraft altitude.

  1. Auralization Architectures for NASA's Next Generation Aircraft Noise Prediction Program

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.; Lopes, Leonard V.; Burley, Casey L.; Aumann, Aric R.

    2013-01-01

    Aircraft community noise is a significant concern due to continued growth in air traffic, increasingly stringent environmental goals, and operational limitations imposed by airport authorities. The assessment of human response to noise from future aircraft can only be afforded through laboratory testing using simulated flyover noise. Recent work by the authors demonstrated the ability to auralize predicted flyover noise for a state-of-the-art reference aircraft and a future hybrid wing body aircraft concept. This auralization used source noise predictions from NASA's Aircraft NOise Prediction Program (ANOPP) as input. The results from this process demonstrated that auralization based upon system noise predictions is consistent with, and complementary to, system noise predictions alone. To further develop and validate the auralization process, improvements to the interfaces between the synthesis capability and the system noise tools are required. This paper describes the key elements required for accurate noise synthesis and introduces auralization architectures for use with the next-generation ANOPP (ANOPP2). The architectures are built around a new auralization library and its associated Application Programming Interface (API) that utilize ANOPP2 APIs to access data required for auralization. The architectures are designed to make the process of auralizing flyover noise a common element of system noise prediction.

  2. Solar-Terrestrial Predictions

    NASA Astrophysics Data System (ADS)

    Thompson, R. J.; Cole, D. G.; Wilkinson, P. J.; Shea, M. A.; Smart, D.

    1990-11-01

    Volume 1: The following subject areas are covered: the magnetosphere environment; forecasting magnetically quiet periods; radiation hazards to humans in deep space (a summary with special reference to large solar particle events); solar proton events (review and status); problems of the physics of solar-terrestrial interactions; prediction of solar proton fluxes from x-ray signatures; rhythms in solar activity and the prediction of episodes of large flares; the role of persistence in the 24-hour flare forecast; on the relationship between the observed sunspot number and the number of solar flares; the latitudinal distribution of coronal holes and geomagnetic storms due to coronal holes; and the signatures of flares in the interplanetary medium at 1 AU. Volume 2: The following subject areas are covered: a probability forecast for geomagnetic activity; cost recovery in solar-terrestrial predictions; magnetospheric specification and forecasting models; a geomagnetic forecast and monitoring system for power system operation; some aspects of predicting magnetospheric storms; some similarities in ionospheric disturbance characteristics in equatorial, mid-latitude, and sub-auroral regions; ionospheric support for low-VHF radio transmission; a new approach to prediction of ionospheric storms; a comparison of the total electron content of the ionosphere around L=4 at low sunspot numbers with the IRI model; the French ionospheric radio propagation predictions; behavior of the F2 layer at mid-latitudes; and the design of modern ionosondes.

  3. Prediction of wastewater treatment plants performance based on artificial fish school neural network

    NASA Astrophysics Data System (ADS)

    Zhang, Ruicheng; Li, Chong

    2011-10-01

    A reliable model of a wastewater treatment plant is essential as a tool for predicting its performance and as a basis for controlling the operation of the process. This would minimize operation costs and help assess the stability of the environmental balance. Given the multi-variable, uncertain, non-linear characteristics of the wastewater treatment system, an artificial fish school neural network prediction model is established based on actual operation data from the wastewater treatment system. The model overcomes several disadvantages of the conventional BP neural network. The calculation results show that the predicted values match the measured values well, demonstrating the model's value for simulation and prediction and its ability to optimize operating status. The prediction model provides a simple and practical way to support operation and management in wastewater treatment plants, and has good research and engineering practical value.

  4. PRESTO-II: a low-level waste environmental transport and risk assessment code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fields, D.E.; Emerson, C.J.; Chester, R.O.

    PRESTO-II (Prediction of Radiation Effects from Shallow Trench Operations) is a computer code designed for the evaluation of possible health effects from shallow-land waste-disposal trenches. The model is intended to serve as a non-site-specific screening model for assessing radionuclide transport, ensuing exposure, and health impacts to a static local population for a 1000-year period following the end of disposal operations. Human exposure scenarios considered include normal releases (including leaching and operational spillage), human intrusion, and limited site farming or reclamation. Pathways and processes of transit from the trench to an individual or population include ground-water transport, overland flow, erosion, surface water dilution, suspension, atmospheric transport, deposition, inhalation, external exposure, and ingestion of contaminated beef, milk, crops, and water. Both population doses and individual doses, as well as doses to the intruder and farmer, may be calculated. Cumulative health effects in terms of cancer deaths are calculated for the population over the 1000-year period using a life-table approach. Data are included for three example sites: Barnwell, South Carolina; Beatty, Nevada; and West Valley, New York. A code listing and example input for each of the three sites are included in the appendices to this report.

  5. Dynamic Simulation of Human Gait Model With Predictive Capability.

    PubMed

    Sun, Jinming; Wu, Shaoli; Voglewede, Philip A

    2018-03-01

    In this paper, it is proposed that the central nervous system (CNS) controls human gait using a predictive control approach in conjunction with classical feedback control, rather than exclusively classical feedback control, which acts only on past error. To validate this proposition, a dynamic model of human gait is developed using a novel predictive approach to investigate the principles of the CNS. The model developed includes two parts: a plant model that represents the dynamics of human gait and a controller that represents the CNS. The plant model is a seven-segment, six-joint model that has nine degrees of freedom (DOF). The plant model is validated using data collected from able-bodied human subjects. The proposed controller utilizes model predictive control (MPC). MPC uses an internal model to predict the output in advance, compares the predicted output to the reference, and optimizes the control input so that the predicted error is minimal. To decrease the complexity of the model, two joints are controlled using a proportional-derivative (PD) controller. The developed predictive human gait model is validated by simulating able-bodied human gait. The simulation results show that the developed model is able to simulate kinematic output close to the experimental data.
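The receding-horizon idea described above — predict ahead with an internal model, compare to the reference, apply only the first optimized input, then re-plan — can be sketched on a toy one-dimensional plant. This is only an illustration of MPC, not the paper's nine-DOF gait model; all names and numbers are invented:

```python
import numpy as np

# A toy 1-D "joint angle" plant stands in for the gait model.
def predict(x, u, horizon):
    """Internal model: simple integrator x_{k+1} = x_k + 0.1 * u."""
    traj = []
    for _ in range(horizon):
        x = x + 0.1 * u
        traj.append(x)
    return np.array(traj)

def mpc_step(x, reference, horizon=5, candidates=np.linspace(-1, 1, 21)):
    """Pick the candidate input whose predicted trajectory best tracks the reference."""
    costs = [np.sum((predict(x, u, horizon) - reference) ** 2) for u in candidates]
    return candidates[int(np.argmin(costs))]

# Track a constant reference angle of 0.5 starting from x = 0.
x, ref = 0.0, 0.5
for _ in range(30):
    u = mpc_step(x, ref)
    x = x + 0.1 * u  # apply only the first input, then re-plan (receding horizon)
print(round(x, 2))
```

Because the controller re-plans from the newly measured state at every step, it corrects for disturbances the internal model did not anticipate — the property that distinguishes MPC from an open-loop optimal trajectory and, per the paper's hypothesis, from purely past-error feedback.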

  6. The use of patient factors to improve the prediction of operative duration using laparoscopic cholecystectomy.

    PubMed

    Thiels, Cornelius A; Yu, Denny; Abdelrahman, Amro M; Habermann, Elizabeth B; Hallbeck, Susan; Pasupathy, Kalyan S; Bingener, Juliane

    2017-01-01

    Reliable prediction of operative duration is essential for improving patient and care team satisfaction, optimizing resource utilization and reducing cost. Current operative scheduling systems are unreliable and contribute to costly over- and underestimation of operative time. We hypothesized that the inclusion of patient-specific factors would improve the accuracy in predicting operative duration. We reviewed all elective laparoscopic cholecystectomies performed at a single institution between 01/2007 and 06/2013. Concurrent procedures were excluded. Univariate analysis evaluated the effect of age, gender, BMI, ASA, laboratory values, smoking, and comorbidities on operative duration. Multivariable linear regression models were constructed using the significant factors (p < 0.05). The patient factors model was compared to the traditional surgical scheduling system estimates, which use historical surgeon-specific and procedure-specific operative durations. External validation was done using the ACS-NSQIP database (n = 11,842). A total of 1801 laparoscopic cholecystectomy patients met inclusion criteria. Female sex was associated with reduced operative duration (-7.5 min, p < 0.001 vs. male sex) while increasing BMI (+5.1 min BMI 25-29.9, +6.9 min BMI 30-34.9, +10.4 min BMI 35-39.9, +17.0 min BMI 40+, all p < 0.05 vs. normal BMI), increasing ASA (+7.4 min ASA III, +38.3 min ASA IV, all p < 0.01 vs. ASA I), and elevated liver function tests (+7.9 min, p < 0.01 vs. normal) were predictive of increased operative duration on univariate analysis. A model was then constructed using these predictive factors. The traditional surgical scheduling system was poorly predictive of actual operative duration (R² = 0.001) compared to the patient factors model (R² = 0.08). The model remained predictive on external validation (R² = 0.14). The addition of surgeon as a variable in the institutional model further improved the predictive ability of the model (R² = 0.18). The use of routinely available pre-operative patient factors improves the prediction of operative duration during cholecystectomy.
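As a rough illustration of the comparison this abstract makes, one can contrast a baseline duration estimate with a multivariable least-squares model on patient factors and compare R² values. The data below are synthetic and the effect sizes invented, chosen only to mimic the low-R² setting reported:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic illustration: operative duration (min) driven weakly by
# BMI and ASA class plus large noise, mimicking the study's setting.
n = 500
bmi = rng.normal(30, 5, n)
asa = rng.integers(1, 4, n)  # ASA class 1-3
duration = 60 + 1.0 * (bmi - 30) + 8.0 * (asa - 1) + rng.normal(0, 15, n)

# Total sum of squares around the mean (the mean-only baseline explains none of it).
ss_tot = np.sum((duration - duration.mean()) ** 2)

# Patient-factors model: ordinary least squares on [1, BMI, ASA].
X = np.column_stack([np.ones(n), bmi, asa])
beta, *_ = np.linalg.lstsq(X, duration, rcond=None)
resid = duration - X @ beta
r2 = 1 - np.sum(resid ** 2) / ss_tot
print(round(r2, 2))
```

Even weak patient-factor effects lift R² well above a near-zero baseline, which mirrors the spirit (not the actual values) of the paper's 0.001 vs. 0.08 comparison.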

  7. Towards a National Space Weather Predictive Capability

    NASA Astrophysics Data System (ADS)

    Fox, N. J.; Ryschkewitsch, M. G.; Merkin, V. G.; Stephens, G. K.; Gjerloev, J. W.; Barnes, R. J.; Anderson, B. J.; Paxton, L. J.; Ukhorskiy, A. Y.; Kelly, M. A.; Berger, T. E.; Bonadonna, L. C. M. F.; Hesse, M.; Sharma, S.

    2015-12-01

    National needs in the area of space weather informational and predictive tools are growing rapidly. Adverse conditions in the space environment can cause disruption of satellite operations, communications, navigation, and electric power distribution grids, leading to a variety of socio-economic losses and impacts on our security. Future space exploration and most modern human endeavors will require major advances in physical understanding and improved transition of space research to operations. At present, only a small fraction of the latest research and development results from NASA, NOAA, NSF and DoD investments are being used to improve space weather forecasting and to develop operational tools. The power of modern research and space weather model development needs to be better utilized to enable comprehensive, timely, and accurate operational space weather tools. The mere production of space weather information is not sufficient to address the needs of those who are affected by space weather. A coordinated effort is required to support research-to-applications transition efforts and to develop the tools required by those who rely on this information. In this presentation we will review the space weather system developed for the Van Allen Probes mission, together with other datasets, tools and models that have resulted from research by scientists at JHU/APL. We will look at how these, and results from future missions such as Solar Probe Plus, could be applied to support space weather applications in coordination with other community assets and capabilities.

  8. Incidents Prediction in Road Junctions Using Artificial Neural Networks

    NASA Astrophysics Data System (ADS)

    Hajji, Tarik; Alami Hassani, Aicha; Ouazzani Jamil, Mohammed

    2018-05-01

    The implementation of an incident detection system (IDS) is an indispensable operation in the analysis of road traffic. However, an IDS can in no case replace the classical monitoring system controlled by the human eye. The aim of this work is to increase the probability of detecting and predicting incidents in camera-monitored areas, given that these areas are watched by multiple cameras but few supervisors. Our solution is to use Artificial Neural Networks (ANN) to analyze the trajectories of moving objects in captured images. We first propose a model of the trajectories and their characteristics, then we build a learning database of valid and invalid trajectories, and finally we carry out a comparative study to find the artificial neural network architecture that maximizes the recognition rate of valid and invalid trajectories.

  9. Operator vision aids for space teleoperation assembly and servicing

    NASA Technical Reports Server (NTRS)

    Brooks, Thurston L.; Ince, Ilhan; Lee, Greg

    1992-01-01

    This paper investigates concepts for visual operator aids required for effective telerobotic control. Operator visual aids, as defined here, mean any operational enhancement that improves man-machine control through the visual system. These concepts were derived as part of a study of vision issues for space teleoperation. Extensive literature on teleoperation, robotics, and human factors was surveyed to definitively specify appropriate requirements. This paper presents these visual aids in three general categories of camera/lighting functions, display enhancements, and operator cues. In the area of camera/lighting functions concepts are discussed for: (1) automatic end effector or task tracking; (2) novel camera designs; (3) computer-generated virtual camera views; (4) computer assisted camera/lighting placement; and (5) voice control. In the technology area of display aids, concepts are presented for: (1) zone displays, such as imminent collision or indexing limits; (2) predictive displays for temporal and spatial location; (3) stimulus-response reconciliation displays; (4) graphical display of depth cues such as 2-D symbolic depth, virtual views, and perspective depth; and (5) view enhancements through image processing and symbolic representations. Finally, operator visual cues (e.g., targets) that help identify size, distance, shape, orientation and location are discussed.

  10. Orion Multi-Purpose Crew Vehicle Solving and Mitigating the Two Main Cluster Pendulum Problem

    NASA Technical Reports Server (NTRS)

    Ali, Yasmin; Sommer, Bruce; Troung, Tuan; Anderson, Brian; Madsen, Christopher

    2017-01-01

    The Orion Multi-Purpose Crew Vehicle (MPCV) spacecraft will return humans from beyond Earth orbit, including from Mars, and will be required to land 20,000 pounds of mass safely in the ocean. The parachute system nominally lands under 3 main parachutes, but the system is designed to be fault tolerant and land under 2 main parachutes. During several of the parachute development tests, it was observed that a pendulum, or swinging, motion could develop while the Crew Module (CM) was descending under two parachutes. This pendulum effect had not been previously predicted by modeling. Landing impact analysis showed that the landing loads would double in some places across the spacecraft. The CM structural design limits would be exceeded upon landing if this pendulum motion were to occur. The Orion descent and landing team was faced with potentially millions of dollars in structural modifications and a severe mass increase. A multidisciplinary team was formed to determine root cause, model the pendulum motion, study alternate canopy planforms, and assess alternate operational vehicle controls and operations, providing mitigation options that result in a reliability level deemed safe for human spaceflight. The problem and its solution balance the risk of a known approach against the chance to improve landing performance for the next human-rated spacecraft.

  11. Effects of 60-Hertz electric and magnetic fields on implanted cardiac pacemakers. Final report. [Hazards of power transmission line frequencies]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bridges, J.E.; Frazier, M.J.

    1979-09-01

    The effects of 60-Hz electric and magnetic fields of extra-high voltage (EHV) transmission lines on the performance of implanted cardiac pacemakers were studied by: (1) in vitro bench tests of a total of thirteen cardiac pacemakers; (2) in vivo tests of six implanted cardiac pacemakers in baboons; and (3) non-hazardous skin measurement tests on four humans. Analytical methods were developed to predict the thresholds of body current and electric fields capable of affecting normal pacemaker operation in humans. The field strengths calculated to alter implanted pacemaker performance were compared with the range of maximum electric and magnetic field strengths a human would normally encounter under transmission lines of various voltages. Results indicate that the electric field or body current necessary to alter the normal operation of pacemakers is highly dependent on the type of pacemaker and the location of the implanted electrodes. However, cardiologists have not so far detected harmful effects of pacemaker reversion to the asynchronous mode in current types of pacemakers and with present methods of implantation. Such interferences can be eliminated by using advanced pacemakers less sensitive to 60-Hz voltages or by using implantation lead arrangements less sensitive to body current.

  12. Toward Reduced Aircraft Community Noise Impact Via a Perception-Influenced Design Approach

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.

    2016-01-01

    This is an exciting time for aircraft design. New configurations, including small multi-rotor uncrewed aerial systems, fixed- and tilt-wing distributed electric propulsion aircraft, high-speed rotorcraft, hybrid-electric commercial transports, and low-boom supersonic transports, are being made possible through a host of propulsion and airframe technology developments. The resulting noise signatures may be radically different, both spectrally and temporally, than those of the current fleet. Noise certification metrics currently used in aircraft design do not necessarily reflect these characteristics and therefore may not correlate well with human response. Further, as operations and missions become less airport-centric, e.g., those associated with on-demand mobility or package delivery, vehicles may operate in closer proximity to the population than ever before. Fortunately, a new set of tools are available for assessing human perception during the design process in order to affect the final design in a positive manner. The tool chain utilizes system noise prediction methods coupled with auralization and psychoacoustic testing, making possible the inclusion of human response to noise, along with performance criteria and certification requirements, into the aircraft design process. Several case studies are considered to illustrate how this approach could be used to influence the design of future aircraft.

  13. Man-machine Integration Design and Analysis System (MIDAS) Task Loading Model (TLM) experimental and software detailed design report

    NASA Technical Reports Server (NTRS)

    Staveland, Lowell

    1994-01-01

    This is the experimental and software detailed design report for the prototype task loading model (TLM) developed as part of the man-machine integration design and analysis system (MIDAS), as implemented and tested in phase 6 of the Army-NASA Aircrew/Aircraft Integration (A3I) Program. The A3I program is an exploratory development effort to advance the capabilities and use of computational representations of human performance and behavior in the design, synthesis, and analysis of manned systems. The MIDAS TLM computationally models the demands that designs impose on operators to aid engineers in the conceptual design of aircraft crewstations. This report describes the TLM and the results of a series of experiments run during this phase to test its capabilities as a predictive task demand modeling tool. Specifically, it includes discussions of: the inputs and outputs of the TLM, the theories underlying it, the results of the test experiments, the use of the TLM both as a stand-alone tool and as part of a complete human operator simulation, and a brief introduction to the TLM software design.

  14. The development of a tool to predict team performance.

    PubMed

    Sinclair, M A; Siemieniuch, C E; Haslam, R A; Henshaw, M J D C; Evans, L

    2012-01-01

    The paper describes the development of a tool to predict quantitatively the success of a team when executing a process. The tool was developed for the UK defence industry, though it may be useful in other domains. It is expected to be used by systems engineers in initial stages of systems design, when concepts are still fluid, including the structure of the team(s) which are expected to be operators within the system. It enables answers to be calculated for questions such as "What happens if I reduce team size?" and "Can I reduce the qualifications necessary to execute this process and still achieve the required level of success?". The tool has undergone verification and validation; it predicts fairly well and shows promise. An unexpected finding is that the tool creates a good a priori argument for significant attention to Human Factors Integration in systems projects. The simulations show that if a systems project takes full account of human factors integration (selection, training, process design, interaction design, culture, etc.) then the likelihood of team success will be in excess of 0.95. As the project derogates from this state, the likelihood of team success will drop as low as 0.05. If the team has good internal communications and good individuals in key roles, the likelihood of success rises towards 0.25. Even with a team comprising the best individuals, p(success) will not be greater than 0.35. It is hoped that these results will be useful for human factors professionals involved in systems design. Copyright © 2011 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  15. Integrated Human-Robotic Missions to the Moon and Mars: Mission Operations Design Implications

    NASA Technical Reports Server (NTRS)

    Mishkin, Andrew; Lee, Young; Korth, David; LeBlanc, Troy

    2007-01-01

    For most of the history of space exploration, human and robotic programs have been independent, and have responded to distinct requirements. The NASA Vision for Space Exploration calls for the return of humans to the Moon, and the eventual human exploration of Mars; the complexity of this range of missions will require an unprecedented use of automation and robotics in support of human crews. The challenges of human Mars missions, including roundtrip communications time delays of 6 to 40 minutes, interplanetary transit times of many months, and the need to manage lifecycle costs, will require the evolution of a new mission operations paradigm far less dependent on real-time monitoring and response by an Earthbound operations team. Robotic systems and automation will augment human capability, increase human safety by providing means to perform many tasks without requiring immediate human presence, and enable the transfer of traditional mission control tasks from the ground to crews. Developing and validating the new paradigm and its associated infrastructure may place requirements on operations design for nearer-term lunar missions. The authors, representing both the human and robotic mission operations communities, assess human lunar and Mars mission challenges, and consider how human-robot operations may be integrated to enable efficient joint operations, with the eventual emergence of a unified exploration operations culture.

  16. Integrated Human-Robotic Missions to the Moon and Mars: Mission Operations Design Implications

    NASA Technical Reports Server (NTRS)

    Korth, David; LeBlanc, Troy; Mishkin, Andrew; Lee, Young

    2006-01-01

    For most of the history of space exploration, human and robotic programs have been independent, and have responded to distinct requirements. The NASA Vision for Space Exploration calls for the return of humans to the Moon, and the eventual human exploration of Mars; the complexity of this range of missions will require an unprecedented use of automation and robotics in support of human crews. The challenges of human Mars missions, including roundtrip communications time delays of 6 to 40 minutes, interplanetary transit times of many months, and the need to manage lifecycle costs, will require the evolution of a new mission operations paradigm far less dependent on real-time monitoring and response by an Earthbound operations team. Robotic systems and automation will augment human capability, increase human safety by providing means to perform many tasks without requiring immediate human presence, and enable the transfer of traditional mission control tasks from the ground to crews. Developing and validating the new paradigm and its associated infrastructure may place requirements on operations design for nearer-term lunar missions. The authors, representing both the human and robotic mission operations communities, assess human lunar and Mars mission challenges, and consider how human-robot operations may be integrated to enable efficient joint operations, with the eventual emergence of a unified exploration operations culture.

  17. 77 FR 66082 - NASA Advisory Council; Human Exploration and Operations Committee; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-01

    ... Exploration and Operations Committee; Meeting AGENCY: National Aeronautics and Space Administration. ACTION... Integration --International Space Station Status --Outreach --Human Exploration and Operations Status... Advisory Council Human Exploration and Operations Committee session in the Space Operations Center, Room...

  18. Multiscale Modeling of Virus Structure, Assembly, and Dynamics

    NASA Astrophysics Data System (ADS)

    May, Eric R.; Arora, Karunesh; Mannige, Ranjan V.; Nguyen, Hung D.; Brooks, Charles L.

    Viruses are traditionally considered as infectious agents that attack cells and cause illnesses like AIDS, Influenza, Hepatitis, etc. However, recent advances have illustrated the potential for viruses to play positive roles for human health, instead of causing disease [1, 2]. For example, viruses can be employed for a variety of biomedical and biotechnological applications, including gene therapy[3], drug delivery[4], tumor targeting[5], and medical imaging[6]. Therefore, it is important to understand quantitatively how viruses operate such that they can be engineered in a predictive manner for beneficial roles.

  19. Human Factors for Situation Assessment in Grid Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guttromson, Ross T.; Schur, Anne; Greitzer, Frank L.

    2007-08-08

    Executive Summary Despite advances in technology, power system operators must assimilate overwhelming amounts of data to keep the grid operating. Analyses of recent blackouts have clearly demonstrated the need to enhance the operator’s situation awareness (SA). The long-term objective of this research is to integrate valuable technologies into the grid operator environment that support decision making under normal and abnormal operating conditions and remove non-technical barriers to enable the optimum use of these technologies by individuals working alone and as a team. More specifically, the research aims to identify methods and principles to increase SA of grid operators in the context of system conditions that are representative or common across many operating entities and develop operationally relevant experimental methods for studying technologies and operational practices which contribute to SA. With increasing complexity and interconnectivity of the grid, the scope and complexity of situation awareness have grown. New paradigms are needed to guide research and tool development aimed to enhance and improve operations. In reviewing related research, operating practices, systems, and tools, the present study established a taxonomy that provides a perspective on research and development surrounding power grid situation awareness and clarifies the field of human factors/SA for grid operations. Information sources that we used to identify critical factors underlying SA included interviews with experienced operational personnel, available historical summaries and transcripts of abnormal conditions and outages (e.g., the August 14, 2003 blackout), scientific literature, and operational policies/procedures and other documentation. Our analysis of August 2003 blackout transcripts and interviews adopted a different perspective than previous analyses of this material, and we complemented this analysis with additional interviews.
Based on our analysis and a broad literature review, we advocate a new perspective on SA in terms of sensemaking, also called situated or ecological decision making, where the focus of the investigation is to understand why the decision maker(s) experienced the situation the way they did, or why what they saw made sense to them at the time. This perspective is distinct from the traditional branch of human factors research in the field, which focuses more on ergonomics and the transactional relationship between the human operator and the systems. Consistent with our findings from the literature review, we recognized an over-arching need to focus SA research on issues surrounding the concept of shared knowledge, e.g., awareness of what is happening in adjacent areas as well as one’s own area of responsibility. Major findings were: a) Inadequate communication/information sharing is pervasive. b) Information is available, but not used. Many tools and mechanisms exist for operators to build awareness of the physical grid system, yet the transcripts reveal that they still need to call and exchange information with operators of neighboring areas to improve or validate their SA. The specific types of information that they request are quite predictable and, in most cases, cover information that could be available to both operators and reliability coordinators through readily available displays or other data sources. c) Shared knowledge is required on operations/actions as well as physical status. In an ideal, technologically and organizationally perfect world, every control room and every reliability coordinator may have access to complete data across all regional control areas, and yet there would still be reason for the operators to call each other to gain and improve their SA of power grid operations. d) Situation awareness should be understood as sensemaking and shared knowledge.

  20. Pitfalls and Precautions When Using Predicted Failure Data for Quantitative Analysis of Safety Risk for Human Rated Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Hatfield, Glen S.; Hark, Frank; Stott, James

    2016-01-01

    Launch vehicle reliability analysis is largely dependent upon using predicted failure rates from data sources such as MIL-HDBK-217F. Reliability prediction methodologies based on component data do not take into account risks attributable to manufacturing, assembly, and process controls. These factors often dominate component-level reliability or failure probability. While the consequences of failure are often understood in assessing risk, using predicted values in a risk model to estimate the probability of occurrence will likely underestimate the risk. Managers and decision makers often use the probability of occurrence in determining whether to accept the risk or require a design modification. Due to the absence of system-level test and operational data inherent in aerospace applications, the actual risk threshold for acceptance may not be appropriately characterized for decision-making purposes. This paper will establish a method and approach to identify the pitfalls and precautions of accepting risk based solely upon predicted failure data. This approach will provide a set of guidelines that may be useful to arrive at a more realistic quantification of risk prior to acceptance by a program.

  1. First Steps Toward Incorporating Image Based Diagnostics Into Particle Accelerator Control Systems Using Convolutional Neural Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edelen, A. L.; Biedron, S. G.; Milton, S. V.

    At present, a variety of image-based diagnostics are used in particle accelerator systems. Often, these are viewed by a human operator who then makes appropriate adjustments to the machine. Given recent advances in using convolutional neural networks (CNNs) for image processing, it should be possible to use image diagnostics directly in control routines (NN-based or otherwise). This is especially appealing for non-intercepting diagnostics that could run continuously during beam operation. Here, we show results of a first step toward implementing such a controller: our trained CNN can predict multiple simulated downstream beam parameters at the Fermilab Accelerator Science and Technology (FAST) facility's low energy beamline using simulated virtual cathode laser images, gun phases, and solenoid strengths.
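    The mapping this record describes, from a diagnostic image to a downstream beam parameter, can be sketched minimally without any deep-learning framework: a single hand-written convolution plus a linear readout stands in for the trained CNN. Everything below (the toy image, kernel, weights, and function names) is invented for illustration and is not taken from the FAST setup.

```python
# Hypothetical stand-in for a CNN regressor: one convolution, global average
# pooling, and a linear readout. A real controller would use a trained
# multi-layer network.

def conv2d_valid(img, kernel):
    """2-D 'valid' convolution (cross-correlation, no kernel flip) on nested lists."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(img) - kh + 1):
        row = []
        for j in range(len(img[0]) - kw + 1):
            row.append(sum(img[i + di][j + dj] * kernel[di][dj]
                           for di in range(kh) for dj in range(kw)))
        out.append(row)
    return out

def predict_beam_size(image, kernel, weight, bias):
    """Convolve, globally average the feature map, apply a linear readout."""
    fmap = conv2d_valid(image, kernel)
    mean_act = sum(sum(r) for r in fmap) / (len(fmap) * len(fmap[0]))
    return weight * mean_act + bias

# Toy 4x4 'virtual cathode laser image' and a 2x2 averaging kernel.
image = [[0, 0, 0, 0],
         [0, 4, 4, 0],
         [0, 4, 4, 0],
         [0, 0, 0, 0]]
kernel = [[0.25, 0.25],
          [0.25, 0.25]]
beam_size = predict_beam_size(image, kernel, weight=2.0, bias=1.0)
```

In a real system the kernel, weight, and bias would be learned from simulation or measurement data rather than fixed by hand.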

  2. Forecasting Flare Activity Using Deep Convolutional Neural Networks

    NASA Astrophysics Data System (ADS)

    Hernandez, T.

    2017-12-01

    Current operational flare forecasting relies on human morphological analysis of active regions and the persistence of solar flare activity through time (i.e. that the Sun will continue to do what it is doing right now: flaring or remaining calm). In this talk we present the results of applying deep Convolutional Neural Networks (CNNs) to the problem of solar flare forecasting. CNNs operate by training a set of tunable spatial filters that, in combination with neural layer interconnectivity, allow CNNs to automatically identify significant spatial structures predictive for classification and regression problems. We will start by discussing the applicability and success rate of the approach, the advantages it has over non-automated forecasts, and how mining our trained neural network provides a fresh look into the mechanisms behind magnetic energy storage and release.

  3. A real-time architecture for time-aware agents.

    PubMed

    Prouskas, Konstantinos-Vassileios; Pitt, Jeremy V

    2004-06-01

    This paper describes the specification and implementation of a new three-layer time-aware agent architecture. This architecture is designed for applications and environments where societies of humans and agents play equally active roles, but interact and operate in completely different time frames. The architecture consists of three layers: the April real-time run-time (ART) layer, the time aware layer (TAL), and the application agents layer (AAL). The ART layer forms the underlying real-time agent platform. An original online, real-time, dynamic priority-based scheduling algorithm is described for scheduling the computation time of agent processes, and it is shown that the algorithm's O(n) complexity and scalable performance are sufficient for application in real-time domains. The TAL layer forms an abstraction layer through which human and agent interactions are temporally unified, that is, handled in a common way irrespective of their temporal representation and scale. A novel O(n²) interaction scheduling algorithm is described for predicting and guaranteeing interactions' initiation and completion times. The time-aware predicting component of a workflow management system is also presented as an instance of the AAL layer. The described time-aware architecture addresses two key challenges in enabling agents to be effectively configured and applied in environments where humans and agents play equally active roles. It provides flexibility and adaptability in its real-time mechanisms while placing them under direct agent control, and it temporally unifies human and agent interactions.
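    The abstract does not specify the ART layer's scheduling algorithm, so as a loose illustration of online priority-based scheduling of agent computation time, here is a dependency-free earliest-deadline-first sketch. The task names and numbers are hypothetical; this is not the paper's algorithm.

```python
# Earliest-deadline-first (EDF) sketch: tasks are (name, duration, deadline);
# a heap keyed on deadline gives O(log n) per pop/push, O(n) to drain.

import heapq

def schedule_edf(tasks):
    """Run tasks in earliest-deadline-first order; return (run order, missed deadlines)."""
    heap = [(deadline, name, dur) for name, dur, deadline in tasks]
    heapq.heapify(heap)
    clock, order, missed = 0, [], []
    while heap:
        deadline, name, dur = heapq.heappop(heap)
        clock += dur
        order.append(name)
        if clock > deadline:  # task finished after its deadline
            missed.append(name)
    return order, missed

order, missed = schedule_edf([("agent_msg", 2, 3), ("ui_update", 1, 10),
                              ("heartbeat", 1, 4)])
```

EDF is a standard real-time baseline; the paper's contribution lies in making such scheduling dynamic and agent-controllable, which this sketch does not attempt.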

  4. Predicting inpatient clinical order patterns with probabilistic topic models vs conventional order sets.

    PubMed

    Chen, Jonathan H; Goldstein, Mary K; Asch, Steven M; Mackey, Lester; Altman, Russ B

    2017-05-01

    Build probabilistic topic model representations of hospital admissions processes and compare the ability of such models to predict clinical order patterns as compared to preconstructed order sets. The authors evaluated the first 24 hours of structured electronic health record data for > 10 K inpatients. Drawing an analogy between structured items (e.g., clinical orders) to words in a text document, the authors performed latent Dirichlet allocation probabilistic topic modeling. These topic models use initial clinical information to predict clinical orders for a separate validation set of > 4 K patients. The authors evaluated these topic model-based predictions vs existing human-authored order sets by area under the receiver operating characteristic curve, precision, and recall for subsequent clinical orders. Existing order sets predict clinical orders used within 24 hours with area under the receiver operating characteristic curve 0.81, precision 16%, and recall 35%. This can be improved to 0.90, 24%, and 47% (P < 10−20) by using probabilistic topic models to summarize clinical data into up to 32 topics. Many of these latent topics yield natural clinical interpretations (e.g., "critical care," "pneumonia," "neurologic evaluation"). Existing order sets tend to provide nonspecific, process-oriented aid, with usability limitations impairing more precise, patient-focused support. Algorithmic summarization has the potential to breach this usability barrier by automatically inferring patient context, but with potential tradeoffs in interpretability. Probabilistic topic modeling provides an automated approach to detect thematic trends in patient care and generate decision support content. A potential use case finds related clinical orders for decision support. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association.
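    The evaluation metrics quoted above (area under the ROC curve, precision, recall) can be reproduced on toy data in a few lines of plain Python. The scores and labels below are invented, not the study's data; a score represents how strongly a model (topic model or order set) recommends a clinical order, and the label records whether that order was actually placed within 24 hours.

```python
# AUC via the rank-sum (Mann-Whitney) formulation, plus thresholded
# precision/recall, on made-up order-prediction scores.

def roc_auc(scores, labels):
    """Probability a random positive outscores a random negative (ties count 0.5)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def precision_recall(scores, labels, threshold):
    pred = [s >= threshold for s in scores]
    tp = sum(p and y for p, y in zip(pred, labels))
    fp = sum(p and not y for p, y in zip(pred, labels))
    fn = sum((not p) and y for p, y in zip(pred, labels))
    return tp / (tp + fp), tp / (tp + fn)

scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2, 0.1]  # hypothetical model scores
labels = [1,   1,   0,   1,   0,   1,   0,   0]    # order actually used?
auc = roc_auc(scores, labels)
prec, rec = precision_recall(scores, labels, threshold=0.5)
```

The rank-sum formulation avoids explicitly sweeping thresholds, which is convenient for a small worked example like this.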

  5. Predicting inpatient clinical order patterns with probabilistic topic models vs conventional order sets

    PubMed Central

    Goldstein, Mary K; Asch, Steven M; Mackey, Lester; Altman, Russ B

    2017-01-01

    Objective: Build probabilistic topic model representations of hospital admissions processes and compare the ability of such models to predict clinical order patterns as compared to preconstructed order sets. Materials and Methods: The authors evaluated the first 24 hours of structured electronic health record data for > 10 K inpatients. Drawing an analogy between structured items (e.g., clinical orders) to words in a text document, the authors performed latent Dirichlet allocation probabilistic topic modeling. These topic models use initial clinical information to predict clinical orders for a separate validation set of > 4 K patients. The authors evaluated these topic model-based predictions vs existing human-authored order sets by area under the receiver operating characteristic curve, precision, and recall for subsequent clinical orders. Results: Existing order sets predict clinical orders used within 24 hours with area under the receiver operating characteristic curve 0.81, precision 16%, and recall 35%. This can be improved to 0.90, 24%, and 47% (P < 10−20) by using probabilistic topic models to summarize clinical data into up to 32 topics. Many of these latent topics yield natural clinical interpretations (e.g., “critical care,” “pneumonia,” “neurologic evaluation”). Discussion: Existing order sets tend to provide nonspecific, process-oriented aid, with usability limitations impairing more precise, patient-focused support. Algorithmic summarization has the potential to breach this usability barrier by automatically inferring patient context, but with potential tradeoffs in interpretability. Conclusion: Probabilistic topic modeling provides an automated approach to detect thematic trends in patient care and generate decision support content. A potential use case finds related clinical orders for decision support. PMID:27655861

  6. The Brain's Router: A Cortical Network Model of Serial Processing in the Primate Brain

    PubMed Central

    Zylberberg, Ariel; Fernández Slezak, Diego; Roelfsema, Pieter R.; Dehaene, Stanislas; Sigman, Mariano

    2010-01-01

    The human brain efficiently solves certain operations such as object recognition and categorization through a massively parallel network of dedicated processors. However, human cognition also relies on the ability to perform an arbitrarily large set of tasks by flexibly recombining different processors into a novel chain. This flexibility comes at the cost of a severe slowing down and a seriality of operations (100–500 ms per step). A limit on parallel processing is demonstrated in experimental setups such as the psychological refractory period (PRP) and the attentional blink (AB) in which the processing of an element either significantly delays (PRP) or impedes conscious access (AB) of a second, rapidly presented element. Here we present a spiking-neuron implementation of a cognitive architecture where a large number of local parallel processors assemble together to produce goal-driven behavior. The precise mapping of incoming sensory stimuli onto motor representations relies on a “router” network capable of flexibly interconnecting processors and rapidly changing its configuration from one task to another. Simulations show that, when presented with dual-task stimuli, the network exhibits parallel processing at peripheral sensory levels, a memory buffer capable of keeping the result of sensory processing on hold, and a slow serial performance at the router stage, resulting in a performance bottleneck. The network captures the detailed dynamics of human behavior during dual-task-performance, including both mean RTs and RT distributions, and establishes concrete predictions on neuronal dynamics during dual-task experiments in humans and non-human primates. PMID:20442869

  7. Metal and Metalloid Contaminants in Atmospheric Aerosols from Mining Operations

    PubMed Central

    Csavina, Janae; Landázuri, Andrea; Wonaschütz, Anna; Rine, Kyle; Rheinheimer, Paul; Barbaris, Brian; Conant, William; Sáez, A. Eduardo; Betterton, Eric A.

    2013-01-01

    Mining operations are potential sources of airborne metal and metalloid contaminants through both direct smelter emissions and wind erosion of mine tailings. The warmer, drier conditions predicted for the Southwestern US by climate models may make contaminated atmospheric dust and aerosols increasingly important, with potential deleterious effects on human health and ecology. Fine particulates such as those resulting from smelting operations may disperse more readily into the environment than coarser tailings dust. Fine particles also penetrate more deeply into the human respiratory system, and may become more bioavailable due to their high specific surface area. In this work, we report the size-fractionated chemical characterization of atmospheric aerosols sampled over a period of a year near an active mining and smelting site in Arizona. Aerosols were characterized with a 10-stage (0.054 to 18 μm aerodynamic diameter) multiple orifice uniform deposit impactor (MOUDI), a scanning mobility particle sizer (SMPS), and a total suspended particulate (TSP) collector. The MOUDI results show that arsenic and lead concentrations follow a bimodal distribution, with maxima centered at approximately 0.3 and 7.0 μm diameter. We hypothesize that the sub-micron arsenic and lead are the product of condensation and coagulation of smelting vapors. In the coarse size, contaminants are thought to originate as aeolian dust from mine tailings and other sources. Observations of ultrafine particle number concentration (SMPS) show the highest readings when the wind comes from the general direction of the smelting operations site. PMID:23441050

  8. Validation of the Predictive Value of Modeled Human Chorionic Gonadotrophin Residual Production in Low-Risk Gestational Trophoblastic Neoplasia Patients Treated in NRG Oncology/Gynecologic Oncology Group-174 Phase III Trial.

    PubMed

    You, Benoit; Deng, Wei; Hénin, Emilie; Oza, Amit; Osborne, Raymond

    2016-01-01

    In low-risk gestational trophoblastic neoplasia, chemotherapy effect is monitored and adjusted with serum human chorionic gonadotrophin (hCG) levels. Mathematical modeling of hCG kinetics may allow prediction of methotrexate (MTX) resistance, with the production parameter "hCGres." This approach was evaluated using the GOG-174 (NRG Oncology/Gynecologic Oncology Group-174) trial database, in which weekly MTX (arm 1) was compared with dactinomycin (arm 2). The database (210 patients, including 78 with resistance) was split into 2 sets. A 126-patient training set was initially used to estimate model parameters. Patient hCG kinetics from days 7 to 45 were fit to: [hCG(time)] = hCG7 * exp(-k * time) + hCGres, where hCGres is residual hCG tumor production, hCG7 is the initial hCG level, and k is the elimination rate constant. Receiver operating characteristic (ROC) analyses defined a putative hCGres predictor of resistance. An 84-patient test set was used to assess prediction validity. The hCGres was predictive of outcome in both arms, with no impact of treatment arm on unexplained variability of kinetic parameter estimates. The best hCGres cutoffs to discriminate resistant versus sensitive patients were 7.7 and 74.0 IU/L in arms 1 and 2, respectively. By combining them, 2 predictive groups were defined (ROC area under the curve, 0.82; sensitivity, 93.8%; specificity, 70.5%). The predictive value of hCGres-based groups regarding resistance was reproducible in the test set (ROC area under the curve, 0.81; sensitivity, 88.9%; specificity, 73.1%). Both hCGres and treatment arm were associated with resistance by logistic regression analysis. The early predictive value of the modeled kinetic parameter hCGres regarding resistance seems promising in the GOG-174 study. This is the second positive evaluation of this approach. Prospective validation is warranted.
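    The kinetic model is given explicitly in the abstract: hCG(t) = hCG7 * exp(-k * t) + hCGres. A naive grid-search fit on noiseless synthetic data illustrates how hCGres can, in principle, be recovered from serial hCG measurements; the trial itself used a proper population kinetic analysis, and every number below is invented.

```python
# Sketch of the abstract's kinetic model with a coarse grid-search fit.
# A real analysis would use nonlinear least squares on patient data.

import math

def hcg_model(t, hcg7, k, hcgres):
    """hCG(t) = hCG7 * exp(-k * t) + hCGres (residual tumor production)."""
    return hcg7 * math.exp(-k * t) + hcgres

def fit_hcgres(times, values, hcg7_grid, k_grid, res_grid):
    """Return the (hCG7, k, hCGres) triple minimising squared error."""
    best, best_sse = None, float("inf")
    for h in hcg7_grid:
        for k in k_grid:
            for r in res_grid:
                sse = sum((hcg_model(t, h, k, r) - v) ** 2
                          for t, v in zip(times, values))
                if sse < best_sse:
                    best, best_sse = (h, k, r), sse
    return best

times = [7, 14, 21, 28, 35, 42]          # sampling days, as in the model window
true = (1000.0, 0.2, 8.0)                # hypothetical 'resistant' parameters
values = [hcg_model(t, *true) for t in times]
fitted = fit_hcgres(times, values,
                    hcg7_grid=[500.0, 1000.0, 2000.0],
                    k_grid=[0.1, 0.2, 0.3],
                    res_grid=[0.0, 8.0, 74.0])
```

With a fitted hCGres in hand, the abstract's decision rule reduces to comparing it against an arm-specific cutoff (7.7 or 74.0 IU/L).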

  9. An Investigation of the Combined Effect of Stress, Fatigue and Workload on Human Performance: Position Paper

    NASA Technical Reports Server (NTRS)

    Mock, Jessica

    2005-01-01

    Stress, fatigue, and workload affect worker performance. The NSF reported that 61% of respondents reported losing concentration at work, while 79% occasionally or frequently made errors as a result of being fatigued. Shift work, altered work schedules, long hours of continuous wakefulness, and sleep loss can create sleep and circadian disruptions that degrade waking functions, causing stress and fatigue. A review of the literature found no information linking the combined effects of fatigue, stress, and workload to human performance. This paper will address which occupational factors within stress, fatigue, and workload were identified as occupational contributors to performance changes. The results of this research will be applied to underlying models and algorithms that will help predict performance changes in control room operators.

  10. Predicting physiological capacity of human load carriage - a review.

    PubMed

    Drain, Jace; Billing, Daniel; Neesham-Smith, Daniel; Aisbett, Brad

    2016-01-01

    This review article aims to evaluate a proposed maximum acceptable work duration model for load carriage tasks. It is contended that this concept has particular relevance to physically demanding occupations such as military and firefighting. Personnel in these occupations are often required to perform very physically demanding tasks, over varying time periods, often involving load carriage. Previous research has investigated concepts related to physiological workload limits in occupational settings (e.g. industrial). Evidence suggests, however, that existing (unloaded) workload guidelines are not appropriate for load carriage tasks. The utility of this model warrants further work to enable prediction of load carriage durations across a range of functional workloads for physically demanding occupations. If the maximum duration for which personnel can physiologically sustain a load carriage task could be accurately predicted, commanders and supervisors could better plan for and manage tasks to ensure operational imperatives were met whilst minimising health risks for their workers. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  11. Variant effect prediction tools assessed using independent, functional assay-based datasets: implications for discovery and diagnostics.

    PubMed

    Mahmood, Khalid; Jung, Chol-Hee; Philip, Gayle; Georgeson, Peter; Chung, Jessica; Pope, Bernard J; Park, Daniel J

    2017-05-16

    Genetic variant effect prediction algorithms are used extensively in clinical genomics and research to determine the likely consequences of amino acid substitutions on protein function. It is vital that we better understand their accuracies and limitations because published performance metrics are confounded by serious problems of circularity and error propagation. Here, we derive three independent, functionally determined human mutation datasets, UniFun, BRCA1-DMS and TP53-TA, and employ them, alongside previously described datasets, to assess the pre-eminent variant effect prediction tools. Apparent accuracies of variant effect prediction tools were influenced significantly by the benchmarking dataset. Benchmarking with the assay-determined datasets UniFun and BRCA1-DMS yielded areas under the receiver operating characteristic curves in the modest ranges of 0.52 to 0.63 and 0.54 to 0.75, respectively, considerably lower than observed for other, potentially more conflicted datasets. These results raise concerns about how such algorithms should be employed, particularly in a clinical setting. Contemporary variant effect prediction tools are unlikely to be as accurate at the general prediction of functional impacts on proteins as previously reported. Use of functional assay-based datasets that avoid prior dependencies promises to be valuable for the ongoing development and accurate benchmarking of such tools.

  12. Predicting human age using regional morphometry and inter-regional morphological similarity

    NASA Astrophysics Data System (ADS)

    Wang, Xun-Heng; Li, Lihua

    2016-03-01

    The goal of this study is to predict human age using neuro-metrics derived from structural MRI, and to investigate the relationships between age and predictive neuro-metrics. To this end, a cohort of healthy subjects was recruited from the 1000 Functional Connectomes Project. The ages of the participants ranged from 7 to 83 years (36.17 ± 20.46). The structural MRI for each subject was preprocessed using FreeSurfer, resulting in regional cortical thickness, mean curvature, regional volume, and regional surface area for 148 anatomical parcellations. The individual age was predicted from the combination of regional and inter-regional neuro-metrics. The prediction accuracy is r = 0.835, p < 0.00001, evaluated by the Pearson correlation coefficient between predicted and actual ages. Moreover, LASSO linear regression identified certain predictive features, most of which were inter-regional features. The turning point of the developmental trajectories in the human brain was around 40 years of age based on regional cortical thickness. In conclusion, structural MRI metrics could be potential biomarkers for aging in the human brain. Human age could be successfully predicted from the combination of regional morphometry and inter-regional morphological similarity. The inter-regional measures could be beneficial for investigating the human brain connectome.
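    The reported accuracy (r = 0.835) is a Pearson correlation between predicted and actual ages. A dependency-free implementation of that metric, applied to invented example ages rather than the study's data:

```python
# Pearson correlation coefficient between predicted and actual values,
# as used to score the age-prediction model. Example ages are made up.

import math

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

actual    = [7, 18, 25, 36, 52, 70, 83]   # hypothetical true ages
predicted = [10, 15, 28, 33, 55, 66, 80]  # hypothetical model output
r = pearson_r(actual, predicted)
```

A value of r near 1 means the predictions track age almost linearly; note that Pearson r rewards linear association, so a model with a constant offset in its predictions can still score highly.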

  13. Polybrominated diphenyl ethers (PBDE) in serum: findings from a US cohort of consumers of sport-caught fish.

    PubMed

    Anderson, Henry A; Imm, Pamela; Knobeloch, Lynda; Turyk, Mary; Mathew, John; Buelow, Carol; Persky, Victoria

    2008-09-01

    Polybrominated diphenyl ethers (PBDEs) have been used as flame retardants in foams, fabrics and plastics; they are common contaminants of household air and dust, bioaccumulate in wildlife, and are detectable in human tissues and in fish and animal food products. In the Great Lakes Basin, sport fish consumption has been demonstrated to be an important source of PCB and DDE exposure. PBDEs are present in the same sport fish, but prior to our study the contribution of Great Lakes sport fish consumption to human PBDE body burdens had not been investigated. This study was designed to assess PBDE, PCB and 1,1-bis(4-chlorophenyl)-2,2-dichloroethene (DDE) serum concentrations in an existing cohort of 508 frequent and infrequent consumers of sport-caught fish living in five Great Lakes states. BDE congeners 47 and 99 were identified in the majority of blood samples (98% and 62%, respectively). ΣPBDE levels were positively associated with age, hours spent outdoors, DDE, ΣPCB, years of sport fish consumption, and catfish and shellfish intake, and negatively associated with income and recent weight loss. Other dietary components collected were not predictive of measured ΣPBDE levels. In multivariate models, ΣPBDE levels were positively associated with age, years consuming sport fish, shellfish meals, and computer use, and negatively associated with recent weight loss. Having ΣPBDE levels in the highest quintile was independently associated with older age, male gender, consumption of catfish and shellfish, computer use and spending less time indoors. ΣPCB and DDE were strongly associated, suggesting common exposure routes. The association between ΣPBDE and ΣPCB or DDE was much weaker, and modeling suggested more diverse PBDE sources with few identified multi-contaminant shared exposure routes. In our cohort, Great Lakes sport fish consumption does not contribute strongly to PBDE exposure.

  14. Assessing human rights impacts in corporate development projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salcito, Kendyl, E-mail: kendyl.salcito@unibas.ch; University of Basel, P.O. Box, CH-4003 Basel; NomoGaia, 1900 Wazee Street, Suite 303, Denver, CO 80202

    Human rights impact assessment (HRIA) is a process for systematically identifying, predicting and responding to the potential impact on human rights of a business operation, capital project, government policy or trade agreement. Traditionally, it has been conducted as a desktop exercise to predict the effects of trade agreements and government policies on individuals and communities. In line with a growing call for multinational corporations to ensure they do not violate human rights in their activities, HRIA is increasingly incorporated into the standard suite of corporate development project impact assessments. In this context, the policy world's non-structured, desk-based approaches to HRIA are insufficient. Although a number of corporations have commissioned and conducted HRIA, no broadly accepted and validated assessment tool is currently available. The lack of standardisation has complicated efforts to evaluate the effectiveness of HRIA as a risk mitigation tool, and has caused confusion in the corporate world regarding company duties. Hence, clarification is needed. The objectives of this paper are (i) to describe an HRIA methodology, (ii) to provide a rationale for its components and design, and (iii) to illustrate implementation of HRIA using the methodology in two selected corporate development projects—a uranium mine in Malawi and a tree farm in Tanzania. We found that as a prognostic tool, HRIA could examine potential positive and negative human rights impacts and provide effective recommendations for mitigation. However, longer-term monitoring revealed that recommendations were unevenly implemented, dependent on market conditions and personnel movements. This instability in the approach to human rights suggests a need for on-going monitoring and surveillance.
    -- Highlights:
    • We developed a novel methodology for corporate human rights impact assessment.
    • We piloted the methodology on two corporate projects—a mine and a plantation.
    • Human rights impact assessment exposed impacts not foreseen in ESIA.
    • Corporations adopted the majority of findings, but not necessarily immediately.
    • Methodological advancements are expected for monitoring processes.

  15. A Framework for Human Performance Criteria for Advanced Reactor Operational Concepts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacques V Hugo; David I Gertman; Jeffrey C Joe

    2014-08-01

    This report supports the determination of new Operational Concept models needed in support of the operational design of new reactors. The objective of this research is to establish the technical bases for human performance and human performance criteria frameworks, models, and guidance for operational concepts for advanced reactor designs. The report includes a discussion of operating principles for advanced reactors, the human performance issues and requirements for human performance based upon work domain analysis and current regulatory requirements, and a description of general human performance criteria. The major findings and key observations to date are that there is some operating experience that informs operational concepts for baseline designs for SFRs and HTGRs, with the Experimental Breeder Reactor-II (EBR-II) as a best-case predecessor design. This report summarizes the theoretical and operational foundations for the development of a framework and model for human performance criteria that will influence the development of future Operational Concepts. The report also highlights issues associated with advanced reactor design and clarifies and codifies the identified aspects of technology and operating scenarios.

  16. BEHAVE: fire behavior prediction and fuel modeling system-BURN Subsystem, part 1

    Treesearch

    Patricia L. Andrews

    1986-01-01

    Describes BURN Subsystem, Part 1, the operational fire behavior prediction subsystem of the BEHAVE fire behavior prediction and fuel modeling system. The manual covers operation of the computer program, assumptions of the mathematical models used in the calculations, and application of the predictions.

  17. Can human experts predict solubility better than computers?

    PubMed

    Boobier, Samuel; Osbourn, Anne; Mitchell, John B O

    2017-12-13

    In this study, we design and carry out a survey, asking human experts to predict the aqueous solubility of druglike organic compounds. We investigate whether these experts, drawn largely from the pharmaceutical industry and academia, can match or exceed the predictive power of algorithms. Alongside this, we implement 10 typical machine learning algorithms on the same dataset. The best algorithm, a variety of neural network known as a multi-layer perceptron, gave an RMSE of 0.985 log S units and an R² of 0.706. We would not have predicted the relative success of this particular algorithm in advance. We found that the best individual human predictor generated an almost identical prediction quality with an RMSE of 0.942 log S units and an R² of 0.723. The collection of algorithms contained a higher proportion of reasonably good predictors, nine out of ten compared with around half of the humans. We found that, for either humans or algorithms, combining individual predictions into a consensus predictor by taking their median generated excellent predictivity. While our consensus human predictor achieved very slightly better headline figures on various statistical measures, the difference between it and the consensus machine learning predictor was both small and statistically insignificant. We conclude that human experts can predict the aqueous solubility of druglike molecules essentially equally well as machine learning algorithms. We find that, for either humans or algorithms, combining individual predictions into a consensus predictor by taking their median is a powerful way of benefitting from the wisdom of crowds.
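The median-consensus approach described in this abstract can be sketched in a few lines; the predictions and measured log S values below are hypothetical, chosen purely to illustrate the mechanics:

```python
import math
import statistics

# Hypothetical log S predictions for 4 compounds from 3 individual
# predictors (human or algorithmic) -- illustrative numbers only.
predictions = [
    [-2.1, -2.6, -1.9],
    [-4.0, -3.4, -3.8],
    [-0.9, -1.5, -1.1],
    [-5.2, -4.8, -5.5],
]
measured = [-2.3, -3.7, -1.2, -5.0]

# Consensus prediction: the median of the individual predictions
# for each compound, as the abstract describes.
consensus = [statistics.median(p) for p in predictions]

def rmse(pred, obs):
    """Root-mean-square error between predicted and observed values."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

print(rmse(consensus, measured))
```

The same `rmse` helper can be applied to each individual predictor's column to compare it against the consensus, which is how a "wisdom of crowds" gain would show up.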

  18. Comparison of predictability for human pharmacokinetics parameters among monkeys, rats, and chimeric mice with humanised liver.

    PubMed

    Miyamoto, Maki; Iwasaki, Shinji; Chisaki, Ikumi; Nakagawa, Sayaka; Amano, Nobuyuki; Hirabayashi, Hideki

    2017-12-01

    1. The aim of the present study was to evaluate the usefulness of chimeric mice with humanised liver (PXB mice) for the prediction of clearance (CLt) and volume of distribution at steady state (Vdss), in comparison with monkeys, which have been reported as a reliable model for human pharmacokinetics (PK) prediction, and with rats, as a conventional PK model. 2. CLt and Vdss values in PXB mice, monkeys and rats were determined following intravenous administration of 30 compounds known to be mainly eliminated in humans via the hepatic metabolism by various drug-metabolising enzymes. Using single-species allometric scaling, human CLt and Vdss values were predicted from the three animal models. 3. Predicted CLt values from PXB mice exhibited the highest predictability: 25 for PXB mice, 21 for monkeys and 14 for rats were predicted within a three-fold range of actual values among 30 compounds. For predicted human Vdss values, the number of compounds falling within a three-fold range was 23 for PXB mice, 24 for monkeys, and 16 for rats among 29 compounds. PXB mice indicated a higher predictability for CLt and Vdss values than the other animal models. 4. These results demonstrate the utility of PXB mice in predicting human PK parameters.
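Single-species allometric scaling, as used in this study, conventionally scales absolute clearance by body weight raised to a fixed exponent. The sketch below uses the standard 0.75 exponent and a 70 kg human, which are common textbook assumptions rather than this paper's exact parameters (the paper additionally applied a plasma fraction-unbound correction not shown here); the monkey values are hypothetical:

```python
def scale_clearance(cl_animal_ml_min_kg, bw_animal_kg,
                    bw_human_kg=70.0, exponent=0.75):
    """Single-species allometric scaling of total clearance.

    Clearance is converted to absolute units (mL/min), scaled by
    (BW_human / BW_animal) ** exponent, then normalised back to
    per-kg units at the human body weight.
    """
    cl_abs = cl_animal_ml_min_kg * bw_animal_kg            # mL/min in the animal
    cl_human_abs = cl_abs * (bw_human_kg / bw_animal_kg) ** exponent
    return cl_human_abs / bw_human_kg                      # mL/min/kg in human

# Hypothetical monkey clearance of 10 mL/min/kg in a 5 kg monkey:
cl_pred = scale_clearance(10.0, 5.0)
print(cl_pred)
```

A predicted value within three-fold of the observed human clearance would count as a success under the criterion used in the abstract.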

  19. Determination of the Proper Rest Time for a Cyclic Mental Task Using ACT-R Architecture.

    PubMed

    Atashfeshan, Nooshin; Razavi, Hamideh

    2017-03-01

    Objective: Analysis of the effect of mental fatigue on a cognitive task and determination of the right start time for rest breaks in work environments. Background: Mental fatigue has been recognized as one of the most important factors influencing individual performance. Subjective and physiological measures are popular methods for analyzing fatigue, but they are restricted to physical experiments. Computational cognitive models are useful for predicting operator performance and can be used for analyzing fatigue in the design phase, particularly in industrial operations and inspections where cognitive tasks are frequent and the effects of mental fatigue are crucial. Method: A cyclic mental task is modeled by the ACT-R architecture, and the effect of mental fatigue on response time and error rate is studied. The task includes visual inspections in a production line or control workstation where an operator has to check products' conformity to specifications. Initially, simulated and experimental results are compared using correlation coefficients and paired t test statistics. After validation of the model, the effects are studied by human and simulated results, which are obtained by running 50-minute tests. Results: It is revealed that during the last 20 minutes of the tests, the response time increased by 20%, and during the last 12.5 minutes, the error rate increased by 7% on average. Conclusion: The proper start time for the rest period can be identified by setting a limit on the error rate or response time. Application: The proposed model can be applied early in production planning to decrease the negative effects of mental fatigue by predicting the operator performance. It can also be used for determining the rest breaks in the design phase without an operator in the loop.
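The paper's rule of setting a limit on the error rate to locate the rest-break start can be sketched directly; the error-rate series (with an upward drift mimicking the fatigue effect) and the 4% limit below are hypothetical:

```python
# Hypothetical per-5-minute error rates (%) from a simulated 50-minute
# session -- the rise toward the end mimics the fatigue effect reported
# in the abstract.
minutes = [5, 10, 15, 20, 25, 30, 35, 40, 45, 50]
error_rate = [2.0, 2.1, 2.0, 2.3, 2.5, 3.0, 4.2, 5.5, 7.1, 9.0]

ERROR_LIMIT = 4.0  # assumed acceptable error rate (%)

def rest_start(times, rates, limit):
    """Return the first time at which the error rate exceeds the limit,
    i.e. the proposed start of the rest break."""
    for t, r in zip(times, rates):
        if r > limit:
            return t
    return None  # limit never exceeded within the session

print(rest_start(minutes, error_rate, ERROR_LIMIT))
```

The same scan could be run on predicted response times instead, since the paper allows either quantity as the limiting criterion.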

  20. Accuracy, calibration and clinical performance of the EuroSCORE: can we reduce the number of variables?

    PubMed

    Ranucci, Marco; Castelvecchio, Serenella; Menicanti, Lorenzo; Frigiola, Alessandro; Pelissero, Gabriele

    2010-03-01

    The European system for cardiac operative risk evaluation (EuroSCORE) is currently used in many institutions and is considered a reference tool in many countries. We hypothesised that too many variables were included in the EuroSCORE using limited patient series. We tested different models using a limited number of variables. A total of 11,150 adult patients undergoing cardiac operations at our institution (2001-2007) were retrospectively analysed. The 17 risk factors composing the EuroSCORE were separately analysed and ranked for accuracy of prediction of hospital mortality. Seventeen models were created by progressively including one factor at a time. The models were compared for accuracy with a receiver operating characteristic (ROC) analysis and area under the curve (AUC) evaluation. Calibration was tested with Hosmer-Lemeshow statistics. Clinical performance was assessed by comparing the predicted with the observed mortality rates. The best accuracy (AUC 0.76) was obtained using a model including only age, left ventricular ejection fraction, serum creatinine, emergency operation and non-isolated coronary operation. The EuroSCORE AUC (0.75) was not significantly different. Calibration and clinical performance were better in the five-factor model than in the EuroSCORE. Only in high-risk patients were 12 factors needed to achieve a good performance. Including many factors in multivariable logistic models increases the risk for overfitting, multicollinearity and human error. A five-factor model offers the same level of accuracy but demonstrated better calibration and clinical performance. Models with a limited number of factors may work better than complex models when applied to a limited number of patients. Copyright (c) 2009 European Association for Cardio-Thoracic Surgery. Published by Elsevier B.V. All rights reserved.
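The AUC comparison behind this model ranking can be illustrated with the rank-based (Mann-Whitney) form of the statistic, which needs no curve construction; the predicted risks below are invented purely for illustration:

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a randomly chosen positive case (death)
    receives a higher predicted risk than a randomly chosen negative
    case (survivor), counting ties as one half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical predicted mortality risks from a five-factor model:
died = [0.30, 0.22, 0.18]
survived = [0.05, 0.12, 0.20, 0.08]
print(auc(died, survived))
```

Computing this for each candidate model over the same patients is what allows the five-factor model (AUC 0.76) and the full EuroSCORE (AUC 0.75) to be compared directly.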

  1. Autonomous Mission Operations for Sensor Webs

    NASA Astrophysics Data System (ADS)

    Underbrink, A.; Witt, K.; Stanley, J.; Mandl, D.

    2008-12-01

    We present interim results of a 2005 ROSES AIST project entitled, "Using Intelligent Agents to Form a Sensor Web for Autonomous Mission Operations", or SWAMO. The goal of the SWAMO project is to shift the control of spacecraft missions from a ground-based, centrally controlled architecture to a collaborative, distributed set of intelligent agents. The network of intelligent agents intends to reduce management requirements by utilizing model-based system prediction and autonomic model/agent collaboration. SWAMO agents are distributed throughout the Sensor Web environment, which may include multiple spacecraft, aircraft, ground systems, and ocean systems, as well as manned operations centers. The agents monitor and manage sensor platforms, Earth sensing systems, and Earth sensing models and processes. The SWAMO agents form a Sensor Web of agents via peer-to-peer coordination. Some of the intelligent agents are mobile and able to traverse between on-orbit and ground-based systems. Other agents in the network are responsible for encapsulating system models to perform prediction of future behavior of the modeled subsystems and components to which they are assigned. The software agents use semantic web technologies to enable improved information sharing among the operational entities of the Sensor Web. The semantics include ontological conceptualizations of the Sensor Web environment, plus conceptualizations of the SWAMO agents themselves. By conceptualizations of the agents, we mean knowledge of their state, operational capabilities, current operational capacities, Web Service search and discovery results, agent collaboration rules, etc. The need for ontological conceptualizations over the agents is to enable autonomous and autonomic operations of the Sensor Web. The SWAMO ontology enables automated decision making and responses to the dynamic Sensor Web environment and to end user science requests. 
The current ontology is compatible with Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) Sensor Model Language (SensorML) concepts and structures. The agents are currently deployed on the U.S. Naval Academy MidSTAR-1 satellite and are actively managing the power subsystem on-orbit without the need for human intervention.

  2. Gas-Liquid Two-Phase Flows Through Packed Bed Reactors in Microgravity

    NASA Technical Reports Server (NTRS)

    Motil, Brian J.; Balakotaiah, Vemuri

    2001-01-01

    The simultaneous flow of gas and liquid through a fixed bed of particles occurs in many unit operations of interest to the designers of space-based as well as terrestrial equipment. Examples include separation columns, gas-liquid reactors, humidification, drying, extraction, and leaching. These operations are critical to a wide variety of industries such as petroleum, pharmaceutical, mining, biological, and chemical. NASA recognizes that similar operations will need to be performed in space and on planetary bodies such as Mars if we are to achieve our goals of human exploration and the development of space. The goal of this research is to understand how to apply our current understanding of two-phase fluid flow through fixed-bed reactors to zero- or partial-gravity environments. Previous experiments by NASA have shown that reactors designed to work on Earth do not necessarily function in a similar manner in space. Two experiments, the Water Processor Assembly and the Volatile Removal Assembly have encountered difficulties in predicting and controlling the distribution of the phases (a crucial element in the operation of this type of reactor) as well as the overall pressure drop.

  3. Intelligent Pilot Aids for Flight Re-Planning in Emergencies

    NASA Technical Reports Server (NTRS)

    Pritchett, Amy R.

    2002-01-01

    Experimental studies were conducted with pilots to investigate the attributes of automation that would be appropriate for aiding pilots in emergencies. The specific focus of this year was on methods of mitigating automation brittleness. Brittleness occurs when the automatic system is used in circumstances it was not designed for, causing it to choose an incorrect action or make an inaccurate decision for the situation. Brittleness is impossible to avoid since it is impossible to predict every potential situation the automatic system will be exposed to over its life. However, operators are always ultimately responsible for the actions and decisions of the automation they are monitoring or using, which means they must evaluate the automation's decisions and actions for accuracy. As has been pointed out, this is a difficult thing for human operators to do. There have been various suggestions as to how to aid operators with this evaluation. In the study described in this report we studied how presentation of contextual information about an automatic system's decision might impact the ability of the human operators to evaluate that decision. This study focused on the planning of emergency descents. Fortunately, emergencies (e.g., mechanical or electrical malfunction, on-board fire, and medical emergency) happen quite rarely. However, they can be catastrophic when they do. For all predictable or conceivable emergencies, pilots have emergency procedures that they are trained on, but those procedures often end with 'determine suitable airport and land as quickly as possible.' Planning an emergency descent to an unplanned airport is a difficult task, particularly under the time pressures of an emergency. Automatic decision aids could be very efficient at the task of determining an appropriate airport and calculating an optimal trajectory to that airport. 
This information could be conveyed to the pilot through an emergency descent procedure listing all of the actions necessary to safely land the plane. However, there is still the potential problem of brittleness. This study examined the impact of contextual information in presentations of emergency descent procedures to see if they might impact the pilot's evaluation of the feasibility of the presented procedure. The study and its results are described in detail.

  4. PSSMHCpan: a novel PSSM-based software for predicting class I peptide-HLA binding affinity

    PubMed Central

    Liu, Geng; Li, Dongli; Li, Zhang; Qiu, Si; Li, Wenhui; Chao, Cheng-chi; Yang, Naibo; Li, Handong; Cheng, Zhen; Song, Xin; Cheng, Le; Zhang, Xiuqing; Wang, Jian; Yang, Huanming

    2017-01-01

    Predicting peptide binding affinity with human leukocyte antigen (HLA) is a crucial step in developing powerful antitumor vaccine for cancer immunotherapy. Currently available methods work quite well in predicting peptide binding affinity with HLA alleles such as HLA-A*0201, HLA-A*0101, and HLA-B*0702 in terms of sensitivity and specificity. However, quite a few types of HLA alleles that are present in the majority of human populations including HLA-A*0202, HLA-A*0203, HLA-A*6802, HLA-B*5101, HLA-B*5301, HLA-B*5401, and HLA-B*5701 still cannot be predicted with satisfactory accuracy using currently available methods. Furthermore, currently the most popularly used methods for predicting peptide binding affinity are inefficient in identifying neoantigens from a large quantity of whole genome and transcriptome sequencing data. Here we present a Position Specific Scoring Matrix (PSSM)-based software called PSSMHCpan to accurately and efficiently predict peptide binding affinity with a broad coverage of HLA class I alleles. We evaluated the performance of PSSMHCpan by analyzing 10-fold cross-validation on a training database containing 87 HLA alleles and obtained an average area under receiver operating characteristic curve (AUC) of 0.94 and accuracy (ACC) of 0.85. In an independent dataset (Peptide Database of Cancer Immunity) evaluation, PSSMHCpan is substantially better than the popularly used NetMHC-4.0, NetMHCpan-3.0, PickPocket, Nebula, and SMM with a sensitivity of 0.90, as compared to 0.74, 0.81, 0.77, 0.24, and 0.79. In addition, PSSMHCpan is more than 197 times faster than NetMHC-4.0, NetMHCpan-3.0, PickPocket, sNebula, and SMM when predicting neoantigens from 661 263 peptides from a breast tumor sample. Finally, we built a neoantigen prediction pipeline and identified 117 017 neoantigens from 467 cancer samples of various cancers from TCGA. 
PSSMHCpan is superior to the currently available methods in predicting peptide binding affinity with a broad coverage of HLA class I alleles. PMID:28327987
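The core of PSSM-based scoring is a sum of position-specific log-odds weights over the residues of a peptide. The toy matrix below is entirely hypothetical and far smaller than what PSSMHCpan uses (real class I matrices cover 8-11-mer positions over all 20 amino acids, trained per HLA allele), but it shows the mechanics:

```python
# Hypothetical position-specific scoring matrix for a toy 3-mer
# "alphabet" of three residues -- the weights are made up, not taken
# from PSSMHCpan.
pssm = [
    {"A": 1.2, "L": 0.4, "K": -0.8},   # position 1
    {"A": -0.1, "L": 1.5, "K": -1.2},  # position 2
    {"A": 0.3, "L": -0.5, "K": 2.0},   # position 3
]

def pssm_score(peptide, matrix):
    """Sum the position-specific weight of each residue of the peptide.
    Higher scores correspond to stronger predicted binding."""
    if len(peptide) != len(matrix):
        raise ValueError("peptide length must match matrix length")
    return sum(matrix[i][aa] for i, aa in enumerate(peptide))

print(pssm_score("ALK", pssm))
```

Because scoring a peptide is just a table lookup and a sum, this kind of method scans large peptide sets very quickly, which is the efficiency advantage the abstract reports over the neural-network predictors.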

  5. Prediction of Post-operative Mortality in Patients with HCV-related Cirrhosis Undergoing Non-Hepatic Surgeries

    PubMed Central

    Hemida, Khalid; Shabana, Sherif Sadek; Said, Hani; Ali-Eldin, Fatma

    2016-01-01

    Introduction: Patients with chronic liver diseases are at great risk for both morbidity and mortality during the post-operative period due to the stress of surgery and the effects of general anaesthesia. Aim: The main aim of this study was to evaluate the value of Model for End-stage Liver Disease (MELD) score, as compared to Child-Turcotte-Pugh (CTP) score, for prediction of 30-day post-operative mortality in Egyptian patients with liver cirrhosis undergoing non-hepatic surgery under general anaesthesia. Materials and Methods: A total of 60 patients with Hepatitis C Virus (HCV)-related liver cirrhosis were included in this study. Sensitivity and specificity of MELD and CTP scores were evaluated for the prediction of post-operative mortality. A total of 20 patients who had no clinical, biochemical or radiological evidence of liver disease were included to serve as a control group. Results: The highest sensitivity and specificity for detection of post-operative mortality was detected at a MELD score of 13.5. CTP score had a sensitivity of 75%, a specificity of 96.4%, and an overall accuracy of 95% for prediction of post-operative mortality. On the other side and at a cut-off value of 13.5, MELD score had a sensitivity of 100%, a specificity of 64.0%, and an overall accuracy of 66.6% for prediction of post-operative mortality in patients with HCV-related liver cirrhosis. Conclusion: MELD score proved to be more sensitive but less specific than CTP score for prediction of post-operative mortality. CTP and MELD scores may be complementary rather than competitive in predicting post-operative mortality in patients with HCV-related liver cirrhosis. PMID:27891371
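For reference, the classic MELD score used in studies like this one combines bilirubin, INR and creatinine on a log scale. The sketch below uses the standard UNOS-style coefficients and bounds; these are general assumptions about the MELD formula, not details taken from this abstract:

```python
import math

def meld(bilirubin_mg_dl, inr, creatinine_mg_dl):
    """Classic (pre-2016) MELD score. Following the usual UNOS
    conventions, each laboratory value below 1.0 is floored at 1.0,
    creatinine is capped at 4.0 mg/dL, and the result is rounded to
    the nearest integer."""
    bili = max(bilirubin_mg_dl, 1.0)
    inr = max(inr, 1.0)
    cr = min(max(creatinine_mg_dl, 1.0), 4.0)
    score = (3.78 * math.log(bili)
             + 11.2 * math.log(inr)
             + 9.57 * math.log(cr)
             + 6.43)
    return round(score)

# Hypothetical patient: bilirubin 2.5 mg/dL, INR 1.8, creatinine 1.3 mg/dL
print(meld(2.5, 1.8, 1.3))
```

Against the study's cut-off of 13.5, this hypothetical patient (score 19) would be classed as high risk for 30-day post-operative mortality.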

  6. Hepatobiliary Clearance Prediction: Species Scaling From Monkey, Dog, and Rat, and In Vitro-In Vivo Extrapolation of Sandwich-Cultured Human Hepatocytes Using 17 Drugs.

    PubMed

    Kimoto, Emi; Bi, Yi-An; Kosa, Rachel E; Tremaine, Larry M; Varma, Manthena V S

    2017-09-01

    Hepatobiliary elimination can be a major clearance pathway dictating the pharmacokinetics of drugs. Here, we first compared the dose eliminated in bile in preclinical species (monkey, dog, and rat) with that in human and further evaluated single-species scaling (SSS) to predict human hepatobiliary clearance. Six compounds dosed in bile duct-cannulated (BDC) monkeys showed biliary excretion comparable to human; and the SSS of hepatobiliary clearance with plasma fraction unbound correction yielded reasonable predictions (within 3-fold). Although dog SSS also showed reasonable predictions, rat overpredicted hepatobiliary clearance for 13 of 24 compounds. Second, we evaluated the translatability of in vitro sandwich-cultured human hepatocytes (SCHHs) to predict human hepatobiliary clearance for 17 drugs. For drugs with no significant active uptake in SCHH studies (i.e., with or without rifamycin SV), measured intrinsic biliary clearance was directly scalable with good predictability (absolute average fold error [AAFE] = 1.6). Drugs showing significant active uptake in SCHH, however, showed improved predictability when scaled based on extended clearance term (AAFE = 2.0), which incorporated sinusoidal uptake along with a global scaling factor for active uptake and the canalicular efflux clearance. In conclusion, SCHH is a useful tool to predict human hepatobiliary clearance, whereas BDC monkey model may provide further confidence in the prospective predictions. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  7. [Evaluation of performance of five bioinformatics software for the prediction of missense mutations].

    PubMed

    Chen, Qianting; Dai, Congling; Zhang, Qianjun; Du, Juan; Li, Wen

    2016-10-01

    To evaluate the prediction performance of five bioinformatics software tools (SIFT, PolyPhen2, MutationTaster, Provean, MutationAssessor). From our own database of genetic mutations collected over the past five years, the Chinese literature database, the Human Gene Mutation Database, and dbSNP, 121 missense mutations confirmed by functional studies and 121 missense mutations suspected to be pathogenic by pedigree analysis were used as the positive gold standard, while 242 missense mutations with minor allele frequency (MAF) >5% in dominant hereditary diseases were used as the negative gold standard. The selected mutations were predicted with the five tools. Based on the results, the performance of the five tools was evaluated in terms of sensitivity, specificity, positive predictive value, false positive rate, negative predictive value, false negative rate, false discovery rate, accuracy, and the receiver operating characteristic (ROC) curve. In terms of sensitivity, negative predictive value and false negative rate, the rank was MutationTaster, PolyPhen2, Provean, SIFT, and MutationAssessor. For specificity and false positive rate, the rank was MutationTaster, Provean, MutationAssessor, SIFT, and PolyPhen2. For positive predictive value and false discovery rate, the rank was MutationTaster, Provean, MutationAssessor, PolyPhen2, and SIFT. For area under the ROC curve (AUC) and accuracy, the rank was MutationTaster, Provean, PolyPhen2, MutationAssessor, and SIFT. The prediction performance of the software may differ when different parameters are used. Among the five tools, MutationTaster showed the best prediction performance.
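All of the measures ranked in this evaluation derive from a single 2x2 confusion matrix. A minimal sketch, with hypothetical counts (not the study's actual results):

```python
def diagnostics(tp, fp, tn, fn):
    """Standard performance measures from a 2x2 confusion matrix,
    matching the quantities ranked in the abstract."""
    sens = tp / (tp + fn)   # sensitivity (true positive rate)
    spec = tn / (tn + fp)   # specificity
    ppv = tp / (tp + fp)    # positive predictive value
    npv = tn / (tn + fn)    # negative predictive value
    return {
        "sensitivity": sens,
        "specificity": spec,
        "ppv": ppv,
        "npv": npv,
        "false_positive_rate": 1 - spec,
        "false_negative_rate": 1 - sens,
        "false_discovery_rate": 1 - ppv,
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }

# Hypothetical tool output against 242 positive and 242 negative
# gold-standard mutations (illustrative counts only):
m = diagnostics(tp=220, fp=30, tn=212, fn=22)
```

Running `diagnostics` once per tool on the same gold-standard sets yields directly comparable columns, from which rankings like those in the abstract follow.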

  8. Generalized role for the cerebellum in encoding internal models: evidence from semantic processing.

    PubMed

    Moberget, Torgeir; Gullesen, Eva Hilland; Andersson, Stein; Ivry, Richard B; Endestad, Tor

    2014-02-19

    The striking homogeneity of cerebellar microanatomy is strongly suggestive of a corresponding uniformity of function. Consequently, theoretical models of the cerebellum's role in motor control should offer important clues regarding cerebellar contributions to cognition. One such influential theory holds that the cerebellum encodes internal models, neural representations of the context-specific dynamic properties of an object, to facilitate predictive control when manipulating the object. The present study examined whether this theoretical construct can shed light on the contribution of the cerebellum to language processing. We reasoned that the cerebellum might perform a similar coordinative function when the context provided by the initial part of a sentence can be highly predictive of the end of the sentence. Using functional MRI in humans we tested two predictions derived from this hypothesis, building on previous neuroimaging studies of internal models in motor control. First, focal cerebellar activation-reflecting the operation of acquired internal models-should be enhanced when the linguistic context leads terminal words to be predictable. Second, more widespread activation should be observed when such predictions are violated, reflecting the processing of error signals that can be used to update internal models. Both predictions were confirmed, with predictability and prediction violations associated with increased blood oxygenation level-dependent signal in the posterior cerebellum (Crus I/II). Our results provide further evidence for cerebellar involvement in predictive language processing and suggest that the notion of cerebellar internal models may be extended to the language domain.

  9. Metabolome of human gut microbiome is predictive of host dysbiosis.

    PubMed

    Larsen, Peter E; Dai, Yang

    2015-01-01

    Humans live in constant and vital symbiosis with a closely linked bacterial ecosystem called the microbiome, which influences many aspects of human health. When this microbial ecosystem becomes disrupted, the health of the human host can suffer; a condition called dysbiosis. However, the community compositions of human microbiomes also vary dramatically from individual to individual, and over time, making it difficult to uncover the underlying mechanisms linking the microbiome to human health. We propose that a microbiome's interaction with its human host is not necessarily dependent upon the presence or absence of particular bacterial species, but instead is dependent on its community metabolome; an emergent property of the microbiome. Using data from a previously published, longitudinal study of microbiome populations of the human gut, we extrapolated information about microbiome community enzyme profiles and metabolome models. Using machine learning techniques, we demonstrated that the aggregate predicted community enzyme function profiles and modeled metabolomes of a microbiome are more predictive of dysbiosis than either observed microbiome community composition or predicted enzyme function profiles. Specific enzyme functions and metabolites predictive of dysbiosis provide insights into the molecular mechanisms of microbiome-host interactions. The ability to use machine learning to predict dysbiosis from microbiome community interaction data provides a potentially powerful tool for understanding the links between the human microbiome and human health, pointing to potential microbiome-based diagnostics and therapeutic interventions.

  10. Metabolome of human gut microbiome is predictive of host dysbiosis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larsen, Peter E.; Dai, Yang

    Background: Humans live in constant and vital symbiosis with a closely linked bacterial ecosystem called the microbiome, which influences many aspects of human health. When this microbial ecosystem becomes disrupted, the health of the human host can suffer; a condition called dysbiosis. The community compositions of human microbiomes also vary dramatically from individual to individual, and over time, making it difficult to uncover the underlying mechanisms linking the microbiome to human health. We propose that a microbiome’s interaction with its human host is not necessarily dependent upon the presence or absence of particular bacterial species, but instead is dependent on its community metabolome; an emergent property of the microbiome. Results: Using data from a previously published, longitudinal study of microbiome populations of the human gut, we extrapolated information about microbiome community enzyme profiles and metabolome models. Using machine learning techniques, we demonstrated that the aggregate predicted community enzyme function profiles and modeled metabolomes of a microbiome are more predictive of dysbiosis than either observed microbiome community composition or predicted enzyme function profiles. Conclusions: Specific enzyme functions and metabolites predictive of dysbiosis provide insights into the molecular mechanisms of microbiome–host interactions. The ability to use machine learning to predict dysbiosis from microbiome community interaction data provides a potentially powerful tool for understanding the links between the human microbiome and human health, pointing to potential microbiome-based diagnostics and therapeutic interventions.

  12. Metabolome of human gut microbiome is predictive of host dysbiosis

    DOE PAGES

    Larsen, Peter E.; Dai, Yang

    2015-09-14

    Background: Humans live in constant and vital symbiosis with a closely linked bacterial ecosystem called the microbiome, which influences many aspects of human health. When this microbial ecosystem becomes disrupted, the health of the human host can suffer, a condition called dysbiosis. The community compositions of human microbiomes also vary dramatically from individual to individual, and over time, making it difficult to uncover the underlying mechanisms linking the microbiome to human health. We propose that a microbiome’s interaction with its human host is not necessarily dependent upon the presence or absence of particular bacterial species, but instead is dependent on its community metabolome, an emergent property of the microbiome. Results: Using data from a previously published, longitudinal study of microbiome populations of the human gut, we extrapolated information about microbiome community enzyme profiles and metabolome models. Using machine learning techniques, we demonstrated that the aggregate predicted community enzyme function profiles and modeled metabolomes of a microbiome are more predictive of dysbiosis than either observed microbiome community composition or predicted enzyme function profiles. Conclusions: Specific enzyme functions and metabolites predictive of dysbiosis provide insights into the molecular mechanisms of microbiome–host interactions. The ability to use machine learning to predict dysbiosis from microbiome community interaction data provides a potentially powerful tool for understanding the links between the human microbiome and human health, pointing to potential microbiome-based diagnostics and therapeutic interventions.

  13. Using mean duration and variation of procedure times to plan a list of surgical operations to fit into the scheduled list time.

    PubMed

    Pandit, Jaideep J; Tavare, Aniket

    2011-07-01

    It is important that a surgical list is planned to utilise as much of the scheduled time as possible while not over-running, because this can lead to cancellation of operations. We wished to assess whether, theoretically, the known duration of individual operations could be used quantitatively to predict the likely duration of the operating list. In a university hospital setting, we first assessed the extent to which the current ad-hoc method of operating list planning was able to match the scheduled operating list times for 153 consecutive historical lists. Using receiver operating characteristic curve analysis, we assessed the ability of an alternative method to predict operating list duration for the same operating lists. This method uses a simple formula: the sum of individual operation times and a pooled standard deviation of these times. We used the operating list duration estimated from this formula to generate a probability that the operating list would finish within its scheduled time. Finally, we applied the simple formula prospectively to 150 operating lists, 'shadowing' the current ad-hoc method, to confirm the predictive ability of the formula. The ad-hoc method was very poor at planning: 50% of historical operating lists were under-booked and 37% over-booked. In contrast, the simple formula predicted the correct outcome (under-run or over-run) for 76% of these operating lists. The calculated probability that a planned series of operations will over-run or under-run was found useful in developing an algorithm to adjust the planned cases optimally. In the prospective series, 65% of operating lists were over-booked and 10% were under-booked. The formula predicted the correct outcome for 84% of operating lists. A simple quantitative method of estimating operating list duration for a series of operations leads to an algorithm (readily created on an Excel spreadsheet, http://links.lww.com/EJA/A19) that can potentially improve operating list planning.
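
    The formula described above (sum of the individual operation means, with a pooled standard deviation) yields an over-run probability under a normal approximation. A minimal sketch, assuming independent, normally distributed procedure times; the function name and example durations are illustrative, not from the paper:

```python
import math

def overrun_probability(mean_minutes, sd_minutes, scheduled_minutes):
    """P(list over-runs its slot): total mean = sum of means, total
    variance = sum of variances (the pooled spread of the combined list)."""
    total_mean = sum(mean_minutes)
    total_sd = math.sqrt(sum(sd ** 2 for sd in sd_minutes))
    z = (scheduled_minutes - total_mean) / total_sd
    p_finish = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # normal CDF at z
    return 1.0 - p_finish

# Illustrative list: three cases (90/60/120 min, SDs 20/15/30) in a 480-min slot
p_over = overrun_probability([90, 60, 120], [20, 15, 30], 480)
```

    A planner could then add or remove cases until the over-run probability falls within an acceptable band, which is essentially the adjustment algorithm the abstract describes.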

  14. Channelized relevance vector machine as a numerical observer for cardiac perfusion defect detection task

    NASA Astrophysics Data System (ADS)

    Kalayeh, Mahdi M.; Marin, Thibault; Pretorius, P. Hendrik; Wernick, Miles N.; Yang, Yongyi; Brankov, Jovan G.

    2011-03-01

    In this paper, we present a numerical observer for image quality assessment, aiming to predict human observer accuracy in a cardiac perfusion defect detection task for single-photon emission computed tomography (SPECT). In medical imaging, image quality should be assessed by evaluating the human observer accuracy for a specific diagnostic task. This approach is known as task-based assessment. Such evaluations are important for optimizing and testing imaging devices and algorithms. Unfortunately, human observer studies with expert readers are costly and time-demanding. To address this problem, numerical observers have been developed as a surrogate for human readers to predict human diagnostic performance. The channelized Hotelling observer (CHO) with internal noise model has been found to predict human performance well in some situations, but does not always generalize well to unseen data. We have argued in the past that finding a model to predict human observers could be viewed as a machine learning problem. Following this approach, in this paper we propose a channelized relevance vector machine (CRVM) to predict human diagnostic scores in a detection task. We have previously used channelized support vector machines (CSVM) to predict human scores and have shown that this approach offers better and more robust predictions than the classical CHO method. The comparison of the proposed CRVM with our previously introduced CSVM method suggests that CRVM can achieve similar generalization accuracy, while dramatically reducing model complexity and computation time.

  15. Predictors of operating room extubation in adult cardiac surgery.

    PubMed

    Subramaniam, Kathirvel; DeAndrade, Diana S; Mandell, Daniel R; Althouse, Andrew D; Manmohan, Rajan; Esper, Stephen A; Varga, Jeffrey M; Badhwar, Vinay

    2017-11-01

    The primary objective of the study was to identify perioperative factors associated with successful immediate extubation in the operating room after adult cardiac surgery. The secondary objective was to derive a simplified predictive scoring system to guide clinicians in operating room extubation. All 1518 patients in this retrospective cohort study underwent standardized fast-track cardiac anesthetic protocol during adult cardiac surgery. Perioperative variables between patients who had successful extubation in the operating room versus in the intensive care unit were retrospectively analyzed using both univariate and multivariable logistic regression analyses. A predictive score of successful operating room extubation was constructed from the multivariable results of 800 patients (derivation set), and the scoring system was further tested using a validation set of 398 patients. Younger age, lower body mass index, higher preoperative serum albumin, absence of chronic lung disease and diabetes, less-invasive surgical approach, isolated coronary bypass surgery, elective surgery, and lower doses of intraoperative intravenous fentanyl were independently associated with higher probability of operating room extubation. The extubation prediction score created in a derivation set of patients performed well in the validation set. Patient scores less than 0 had a minimal probability of successful operating room extubation. Operating room extubation was highly predicted with scores of 5 or greater. Perioperative factors that are independently associated with successful operating room extubation after adult cardiac operations were identified, and an operating room extubation prediction scoring system was validated. This scoring system may be used to guide safe operating room extubation after cardiac operations. Copyright © 2017 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.

  16. Transient-Free Operations With Physics-Based Real-time Analysis and Control

    NASA Astrophysics Data System (ADS)

    Kolemen, Egemen; Burrell, Keith; Eggert, William; Eldon, David; Ferron, John; Glasser, Alex; Humphreys, David

    2016-10-01

    In order to understand and predict disruptions, the two most common methods currently employed in tokamak analysis are the time-consuming "kinetic EFITs," which are done offline with significant human involvement, and the search for correlations with global precursors using various parameterization techniques. We are developing automated "kinetic EFITs" at DIII-D to enable calculation of the stability as the plasma evolves close to the disruption. This allows us to quantify the probabilistic nature of the stability calculations and provides a stability metric for all possible linear perturbations to the plasma. This study also provides insight into how the control system can avoid the unstable operating space, which is critical for high-performance operations close to stability thresholds at ITER. A novel, efficient ideal stability calculation method and new real-time CER acquisition system are being developed, and a new 77-core server has been installed on the DIII-D PCS to enable experimental use. Sponsored by US DOE under DE-SC0015878 and DE-FC02-04ER54698.

  17. Hierarchical HMM based learning of navigation primitives for cooperative robotic endovascular catheterization.

    PubMed

    Rafii-Tari, Hedyeh; Liu, Jindong; Payne, Christopher J; Bicknell, Colin; Yang, Guang-Zhong

    2014-01-01

    Despite increased use of remote-controlled steerable catheter navigation systems for endovascular intervention, most current designs are based on master configurations which tend to alter natural operator tool interactions. This introduces problems to both ergonomics and shared human-robot control. This paper proposes a novel cooperative robotic catheterization system based on learning-from-demonstration. By encoding the higher-level structure of a catheterization task as a sequence of primitive motions, we demonstrate how to achieve prospective learning for complex tasks whilst incorporating subject-specific variations. A hierarchical Hidden Markov Model is used to model each movement primitive as well as their sequential relationship. This model is applied to generation of motion sequences, recognition of operator input, and prediction of future movements for the robot. The framework is validated by comparing catheter tip motions against the manual approach, showing significant improvements in the quality of catheterization. The results motivate the design of collaborative robotic systems that are intuitive to use, while reducing the cognitive workload of the operator.
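
    The sequencing level of the model above can be illustrated in miniature: a first-order transition model over motion primitives, learned from demonstration sequences and used to predict the operator's next movement. This is a deliberately simplified stand-in for the paper's hierarchical Hidden Markov Model (it omits the within-primitive dynamics), and the primitive names and demonstration sequences are hypothetical:

```python
import numpy as np

# Hypothetical catheter-motion primitives (names illustrative)
primitives = ["advance", "retract", "rotate_cw", "rotate_ccw", "dwell"]

def fit_transition_matrix(sequences, n_states):
    """Maximum-likelihood first-order transition matrix over primitive
    indices, with add-one smoothing -- the sequencing level only."""
    counts = np.ones((n_states, n_states))  # Laplace smoothing
    for seq in sequences:
        for a, b in zip(seq[:-1], seq[1:]):
            counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

def predict_next(trans, current_state):
    """Most probable next primitive given the current one."""
    return int(np.argmax(trans[current_state]))

# Hypothetical demonstration sequences (indices into `primitives`)
demos = [[0, 2, 0, 4, 1], [0, 2, 0, 0, 4], [0, 3, 0, 4, 1]]
T = fit_transition_matrix(demos, len(primitives))
```

    In the full system, each primitive is itself an HMM over sensor observations, so recognition of the operator's current primitive and prediction of the next one share the same learned structure.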

  18. Cortical Hierarchies Perform Bayesian Causal Inference in Multisensory Perception

    PubMed Central

    Rohe, Tim; Noppeney, Uta

    2015-01-01

    To form a veridical percept of the environment, the brain needs to integrate sensory signals from a common source but segregate those from independent sources. Thus, perception inherently relies on solving the “causal inference problem.” Behaviorally, humans solve this problem optimally as predicted by Bayesian Causal Inference; yet, the underlying neural mechanisms are unexplored. Combining psychophysics, Bayesian modeling, functional magnetic resonance imaging (fMRI), and multivariate decoding in an audiovisual spatial localization task, we demonstrate that Bayesian Causal Inference is performed by a hierarchy of multisensory processes in the human brain. At the bottom of the hierarchy, in auditory and visual areas, location is represented on the basis that the two signals are generated by independent sources (= segregation). At the next stage, in posterior intraparietal sulcus, location is estimated under the assumption that the two signals are from a common source (= forced fusion). Only at the top of the hierarchy, in anterior intraparietal sulcus, the uncertainty about the causal structure of the world is taken into account and sensory signals are combined as predicted by Bayesian Causal Inference. Characterizing the computational operations of signal interactions reveals the hierarchical nature of multisensory perception in human neocortex. It unravels how the brain accomplishes Bayesian Causal Inference, a statistical computation fundamental for perception and cognition. Our results demonstrate how the brain combines information in the face of uncertainty about the underlying causal structure of the world. PMID:25710328
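
    The Bayesian Causal Inference computation described above can be written compactly for two spatial cues: evaluate the likelihood of the cue pair under a common source versus independent sources, form the posterior over causal structures, and model-average the location estimates. A sketch with Gaussian likelihoods and a zero-mean spatial prior, following the standard Koerding-style formulation rather than this paper's exact fitted parameters (all values illustrative):

```python
import math

def gauss(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def causal_inference(x_a, x_v, var_a, var_v, var_p=100.0, p_common=0.5):
    """Posterior probability of a common cause and the model-averaged
    auditory location estimate (Gaussian cues, zero-mean spatial prior)."""
    # Likelihood of the cue pair under one common source (source integrated out)
    var_sum = var_a * var_v + var_a * var_p + var_v * var_p
    like_c1 = math.exp(-((x_a - x_v) ** 2 * var_p + x_a ** 2 * var_v
                         + x_v ** 2 * var_a) / (2.0 * var_sum)) \
              / (2.0 * math.pi * math.sqrt(var_sum))
    # Likelihood under two independent sources (= segregation)
    like_c2 = gauss(x_a, 0.0, var_a + var_p) * gauss(x_v, 0.0, var_v + var_p)
    post_c1 = like_c1 * p_common / (like_c1 * p_common + like_c2 * (1.0 - p_common))
    # Precision-weighted location estimates under each causal structure
    s_c1 = (x_a / var_a + x_v / var_v) / (1.0 / var_a + 1.0 / var_v + 1.0 / var_p)
    s_c2 = (x_a / var_a) / (1.0 / var_a + 1.0 / var_p)
    return post_c1, post_c1 * s_c1 + (1.0 - post_c1) * s_c2  # model averaging

post_near, s_hat = causal_inference(1.0, 2.0, 4.0, 4.0)  # nearby cues
post_far, _ = causal_inference(0.0, 20.0, 4.0, 4.0)      # discrepant cues
```

    The three intermediate quantities mirror the cortical hierarchy in the abstract: `like_c2` corresponds to segregation, `s_c1` to forced fusion, and the model-averaged estimate to full Bayesian Causal Inference.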

  19. Surface mine planning and design implications and theory of a visual environmental quality predictive model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burley, J.B.

    1999-07-01

    Surface mine planners and designers are searching for scientifically based tools to assist in the pre-mine planning and post-mine development of surface mine sites. In this study, the author presents a science-based visual and environmental quality predictive model useful in preparing and assessing landscape treatments for surface mine sites. The equation explains 67 percent of respondent preference, with an overall p-value for the equation <0.0001 and a p-value <0.05 for each regressor. Regressors employed in the equation include an environmental quality index, foreground vegetation, distant nonvegetation, people, vehicles, utilities, foreground flowers, foreground erosion, wildlife, landscape openness, landscape mystery, and noosphericness (a measure of human disturbance). The equation can be explained with an Intrusion/Neutral Modifier/Temporal Enhancement Theory, which suggests that human intrusions upon other humans result in landscapes of low preference and that landscapes containing natural and special temporal features such as wildlife and flowers can enhance the value of a landscape scene. This research supports the importance of visual barriers such as berms and vegetation screens during mining operations and supports public perceptions concerning many types of industrial activities. In addition, the equation can be applied to study post-mining landscape development plans to maximize the efficiency and effectiveness of landscape treatments.

  20. Eliciting management action: Using THERP to highlight human factors deficiencies for trip reduction programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fuld, R.; Cybert, S.

    Methods and criteria for performing human factors evaluations of plant systems and procedures are well developed and available. For a design review to produce a positive impact on operations, however, it is not enough to simply document deficiencies and solutions. The results must be presented to management in a clear and compelling form that will direct attention to the heart of a problem and present proposed solutions in terms of explicit, quantified cost/benefits. A proactive program of trip reduction provides an excellent opportunity to accomplish human factors-related upgrades. As an evaluative context, trip reduction imposes a uniform goodness criterion on all situations: the probability of inadvertent plant trip. This in turn means that findings can be compared in terms of a common quantitative reference point: the cost of an inadvertent shutdown. To interpret human factors deficiencies in terms of trip probabilities, the Technique for Human Error Rate Prediction (THERP) can be used. THERP provides an accessible compilation of human reliability data for generic, discrete task elements. Sequences of such values are combined in standard event trees to determine the probability of failure (e.g., trip) for a given evolution. THERP is widely accepted as one of the best available alternatives for assessing human reliability.

  1. Forecasting Propagation and Evolution of CMEs in an Operational Setting: What Has Been Learned

    NASA Technical Reports Server (NTRS)

    Zheng, Yihua; Macneice, Peter; Odstrcil, Dusan; Mays, M. L.; Rastaetter, Lutz; Pulkkinen, Antti; Taktakishvili, Aleksandre; Hesse, Michael; Kuznetsova, M. Masha; Lee, Hyesook

    2013-01-01

    One of the major types of solar eruption, coronal mass ejections (CMEs) not only impact space weather, but also can have significant societal consequences. CMEs cause intense geomagnetic storms and drive fast mode shocks that accelerate charged particles, potentially resulting in enhanced radiation levels both in ions and electrons. Human and technological assets in space can be endangered as a result. CMEs are also the major contributor to generating large amplitude Geomagnetically Induced Currents (GICs), which are a source of concern for power grid safety. Due to their space weather significance, forecasting the evolution and impacts of CMEs has become a much desired capability for space weather operations worldwide. Based on our operational experience at Space Weather Research Center at NASA Goddard Space Flight Center (http://swrc.gsfc.nasa.gov), we present here some of the insights gained about accurately predicting CME impacts, particularly in relation to space weather operations. These include: 1. The need to maximize information to get an accurate handle of three-dimensional (3-D) CME kinetic parameters and therefore improve CME forecast; 2. The potential use of CME simulation results for qualitative prediction of regions of space where solar energetic particles (SEPs) may be found; 3. The need to include all CMEs occurring within a 24 h period for a better representation of the CME interactions; 4. Various other important parameters in forecasting CME evolution in interplanetary space, with special emphasis on the CME propagation direction. It is noted that a future direction for our CME forecasting is to employ the ensemble modeling approach.

  2. Forecasting propagation and evolution of CMEs in an operational setting: What has been learned

    NASA Astrophysics Data System (ADS)

    Zheng, Yihua; Macneice, Peter; Odstrcil, Dusan; Mays, M. L.; Rastaetter, Lutz; Pulkkinen, Antti; Taktakishvili, Aleksandre; Hesse, Michael; Masha Kuznetsova, M.; Lee, Hyesook; Chulaki, Anna

    2013-10-01

    One of the major types of solar eruption, coronal mass ejections (CMEs) not only impact space weather, but also can have significant societal consequences. CMEs cause intense geomagnetic storms and drive fast mode shocks that accelerate charged particles, potentially resulting in enhanced radiation levels both in ions and electrons. Human and technological assets in space can be endangered as a result. CMEs are also the major contributor to generating large amplitude Geomagnetically Induced Currents (GICs), which are a source of concern for power grid safety. Due to their space weather significance, forecasting the evolution and impacts of CMEs has become a much desired capability for space weather operations worldwide. Based on our operational experience at Space Weather Research Center at NASA Goddard Space Flight Center (http://swrc.gsfc.nasa.gov), we present here some of the insights gained about accurately predicting CME impacts, particularly in relation to space weather operations. These include: 1. The need to maximize information to get an accurate handle of three-dimensional (3-D) CME kinetic parameters and therefore improve CME forecast; 2. The potential use of CME simulation results for qualitative prediction of regions of space where solar energetic particles (SEPs) may be found; 3. The need to include all CMEs occurring within a 24 h period for a better representation of the CME interactions; 4. Various other important parameters in forecasting CME evolution in interplanetary space, with special emphasis on the CME propagation direction. It is noted that a future direction for our CME forecasting is to employ the ensemble modeling approach.

  3. A system performance throughput model applicable to advanced manned telescience systems

    NASA Technical Reports Server (NTRS)

    Haines, Richard F.

    1990-01-01

    As automated space systems become more complex, autonomous, and opaque to the flight crew, it becomes increasingly difficult to determine whether the total system is performing as it should. Some of the complex and interrelated human performance measurement issues are addressed that are related to total system validation. An evaluative throughput model is presented which can be used to generate a human operator-related benchmark or figure of merit for a given system which involves humans at the input and output ends as well as other automated intelligent agents. The concept of sustained and accurate command/control data information transfer is introduced. The first two input parameters of the model involve nominal and off-nominal predicted events. The first of these calls for a detailed task analysis while the second is for a contingency event assessment. The last two required input parameters involve actual (measured) events, namely human performance and continuous semi-automated system performance. An expression combining these four parameters was found using digital simulations and identical, representative, random data to yield the smallest variance.

  4. High-Efficiency Nested Hall Thrusters for Robotic Solar System Exploration

    NASA Technical Reports Server (NTRS)

    Hofer, Richard R.

    2013-01-01

    This work describes the scaling and design attributes of Nested Hall Thrusters (NHT) with extremely large operational envelopes, including a wide range of throttleability in power and specific impulse at high efficiency (>50%). NHTs have the potential to provide the game-changing performance, power-processing capabilities, and cost effectiveness required to enable missions that cannot otherwise be accomplished. NHTs were first identified in the electric propulsion community as a path to 100-kW class thrusters for human missions. This study aimed to identify the performance capabilities NHTs can provide for NASA robotic and human missions, with an emphasis on 10-kW class thrusters well-suited for robotic exploration. A key outcome of this work has been the identification of NHTs as nearly constant-efficiency devices over large power throttling ratios, especially in direct-drive power systems. NHT systems sized for robotic solar system exploration are predicted to be capable of high-efficiency operation over nearly their entire power throttling range. A traditional Annular Hall Thruster (AHT) consists of a single annular discharge chamber where the propellant is ionized and accelerated. In an NHT, multiple annular channels are concentrically stacked. The channels can be operated in unison or individually depending on the available power or required performance. When throttling an AHT, performance must be sacrificed since a single channel cannot satisfy the diverse design attributes needed to maintain high thrust efficiency. NHTs can satisfy these requirements by varying which channels are operated and thereby offer significant benefits in terms of thruster performance, especially under deep power throttling conditions where the efficiency of an AHT suffers since a single channel can only operate efficiently (>50%) over a narrow power throttling ratio (3:1). Designs for 10-kW class NHTs were developed and compared with AHT systems.
Power processing systems were considered using either traditional Power Processing Units (PPU) or Direct Drive Units (DDU). In a PPU-based system, power from the solar arrays is transformed from the low voltage of the arrays to the high voltage needed by the thruster. In a DDU-based system, power from the solar arrays is fed to the thruster without conversion. DDU-based systems are attractive for their simplicity since they eliminate the most complex and expensive part of the propulsion system. The results point to the strong potential of NHTs operating with either PPUs or DDUs to benefit robotic and human missions through their unprecedented power and specific impulse throttling capabilities. NHTs coupled to traditional PPUs are predicted to offer high-efficiency (>50%) power throttling ratios 320% greater than present capabilities, while NHTs with direct-drive power systems (DDU) could exceed existing capabilities by 340%. Because the NHT-DDU approach is implicitly low-cost, NHT-DDU technology has the potential to radically reduce the cost of SEP-enabled NASA missions while simultaneously enabling unprecedented performance capability.

  5. Is Earth F**ked? Dynamical Futility of Global Environmental Management and Possibilities for Sustainability via Direct Action Activism

    NASA Astrophysics Data System (ADS)

    Werner, B.

    2012-12-01

    Environmental challenges are dynamically generated within the dominant global culture principally by the mismatch between short-time-scale market and political forces driving resource extraction/use and longer-time-scale accommodations of the Earth system to these changes. Increasing resource demand is leading to the development of two-way, nonlinear interactions between human societies and environmental systems that are becoming global in extent, either through globalized markets and other institutions or through coupling to global environmental systems such as climate. These trends are further intensified by dissipation-reducing technological advances in transactions, communication and transport, which suppress emergence of longer-time-scale economic and political levels of description and facilitate long-distance connections, and by predictive environmental modeling, which strengthens human connections to a short-time-scale virtual Earth, and weakens connections to the longer time scales of the actual Earth. Environmental management seeks to steer fast scale economic and political interests of a coupled human-environmental system towards longer-time-scale consideration of benefits and costs by operating within the confines of the dominant culture using a linear, engineering-type connection to the system. Perhaps as evidenced by widespread inability to meaningfully address such global environmental challenges as climate change and soil degradation, nonlinear connections reduce the ability of managers to operate outside coupled human-environmental systems, decreasing their effectiveness in steering towards sustainable interactions and resulting in managers slaved to short-to-intermediate-term interests. In sum, the dynamics of the global coupled human-environmental system within the dominant culture precludes management for stable, sustainable pathways and promotes instability. 
Environmental direct action, resistance taken from outside the dominant culture, as in protests, blockades and sabotage by indigenous peoples, workers, anarchists and other activist groups, increases dissipation within the coupled system over fast to intermediate scales and pushes for changes in the dominant culture that favor transition to a stable, sustainable attractor. These dynamical relationships are illustrated and explored using a numerical model that simulates the short-, intermediate- and long-time-scale dynamics of the coupled human-environmental system. At fast scales, economic and political interests exploit environmental resources through a maze of environmental management and resistance, guided by virtual Earth predictions. At intermediate scales, managers become slaved to economic and political interests, which adapt to and repress resistance, and resistance is guided by patterns of environmental destruction. At slow scales, resistance interacts with the cultural context, which co-evolves with the environment. The transition from unstable dynamics to sustainability is sensitively dependent on the level of participation in and repression of resistance. Because of their differing impact inside and outside the dominant culture, virtual Earth predictions can either promote or oppose sustainability. Supported by the National Science Foundation, Geomorphology and Land Use Dynamics Program.

  6. Estimation of left ventricular operating stiffness from Doppler early filling deceleration time in humans.

    PubMed

    Garcia, M J; Firstenberg, M S; Greenberg, N L; Smedira, N; Rodriguez, L; Prior, D; Thomas, J D

    2001-02-01

    Shortened early transmitral deceleration times (E(DT)) have been qualitatively associated with increased filling pressure and reduced survival in patients with cardiac disease and increased left ventricular operating stiffness (K(LV)). An equation relating K(LV) quantitatively to E(DT) has previously been described in a canine model but not in humans. During several varying hemodynamic conditions, we studied 18 patients undergoing open-heart surgery. Transesophageal echocardiographic two-dimensional volumes and Doppler flows were combined with high-fidelity left atrial (LA) and left ventricular (LV) pressures to determine K(LV). From digitized Doppler recordings, E(DT) was measured and compared against changes in LV and LA diastolic volumes and pressures. E(DT) (180 +/- 39 ms) was inversely associated with LV end-diastolic pressures (r = -0.56, P = 0.004) and net atrioventricular stiffness (r = -0.55, P = 0.006) but had its strongest association with K(LV) (r = -0.81, P < 0.001). K(LV) was predicted assuming a nonrestrictive orifice (K(nonrest)) from E(DT) as K(nonrest) = (0.07/E(DT))^2 with K(LV) = 1.01 K(nonrest) - 0.02; r = 0.86, P < 0.001, DeltaK (K(nonrest) - K(LV)) = 0.02 +/- 0.06 mm Hg/ml. In adults with cardiac disease, E(DT) provides an accurate estimate of LV operating stiffness and supports its application as a practical noninvasive index in the evaluation of diastolic function.
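
    The reported relation can be applied directly: a short sketch that converts a measured deceleration time into an operating-stiffness estimate using the equations quoted in the abstract (E(DT) in seconds, stiffness in mm Hg/ml; the function name is illustrative):

```python
def lv_operating_stiffness(e_dt_seconds):
    """K_LV (mm Hg/ml) from early filling deceleration time, via
    K_nonrest = (0.07 / E_DT)^2 and K_LV = 1.01 * K_nonrest - 0.02."""
    k_nonrest = (0.07 / e_dt_seconds) ** 2
    return 1.01 * k_nonrest - 0.02

# Mean E_DT in the study was 180 +/- 39 ms
k = lv_operating_stiffness(0.180)
```

    Note the inverse-square form: shorter deceleration times map to sharply higher stiffness, which is why restrictive filling patterns carry prognostic weight.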

  7. Estimation of left ventricular operating stiffness from Doppler early filling deceleration time in humans

    NASA Technical Reports Server (NTRS)

    Garcia, M. J.; Firstenberg, M. S.; Greenberg, N. L.; Smedira, N.; Rodriguez, L.; Prior, D.; Thomas, J. D.

    2001-01-01

    Shortened early transmitral deceleration times (E(DT)) have been qualitatively associated with increased filling pressure and reduced survival in patients with cardiac disease and increased left ventricular operating stiffness (K(LV)). An equation relating K(LV) quantitatively to E(DT) has previously been described in a canine model but not in humans. During several varying hemodynamic conditions, we studied 18 patients undergoing open-heart surgery. Transesophageal echocardiographic two-dimensional volumes and Doppler flows were combined with high-fidelity left atrial (LA) and left ventricular (LV) pressures to determine K(LV). From digitized Doppler recordings, E(DT) was measured and compared against changes in LV and LA diastolic volumes and pressures. E(DT) (180 +/- 39 ms) was inversely associated with LV end-diastolic pressures (r = -0.56, P = 0.004) and net atrioventricular stiffness (r = -0.55, P = 0.006) but had its strongest association with K(LV) (r = -0.81, P < 0.001). K(LV) was predicted assuming a nonrestrictive orifice (K(nonrest)) from E(DT) as K(nonrest) = (0.07/E(DT))^2 with K(LV) = 1.01 K(nonrest) - 0.02; r = 0.86, P < 0.001, DeltaK (K(nonrest) - K(LV)) = 0.02 +/- 0.06 mm Hg/ml. In adults with cardiac disease, E(DT) provides an accurate estimate of LV operating stiffness and supports its application as a practical noninvasive index in the evaluation of diastolic function.

  8. Real-time stylistic prediction for whole-body human motions.

    PubMed

    Matsubara, Takamitsu; Hyon, Sang-Ho; Morimoto, Jun

    2012-01-01

    The ability to predict human motion is crucial in several contexts such as human tracking by computer vision and the synthesis of human-like computer graphics. Previous work has focused on off-line processes with well-segmented data; however, many applications such as robotics require real-time control with efficient computation. In this paper, we propose a novel approach called real-time stylistic prediction for whole-body human motions to satisfy these requirements. This approach uses a novel generative model to represent a whole-body human motion including rhythmic motion (e.g., walking) and discrete motion (e.g., jumping). The generative model is composed of a low-dimensional state (phase) dynamics and a two-factor observation model, allowing it to capture the diversity of motion styles in humans. A real-time adaptation algorithm was derived to estimate both state variables and style parameter of the model from non-stationary unlabeled sequential observations. Moreover, with a simple modification, the algorithm allows real-time adaptation even from incomplete (partial) observations. Based on the estimated state and style, a future motion sequence can be accurately predicted. In our implementation, it takes less than 15 ms for both adaptation and prediction at each observation. Our real-time stylistic prediction was evaluated for human walking, running, and jumping behaviors. Copyright © 2011 Elsevier Ltd. All rights reserved.
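
    The generative model pairs low-dimensional phase dynamics with a two-factor observation model. A heavily simplified sketch of that structure (the basis functions, style prototypes, and all parameter values below are illustrative stand-ins, not the paper's learned model):

```python
import numpy as np

def predict_motion(phase, style, n_steps, omega=2 * np.pi, dt=0.01):
    """Toy sketch of phase-driven motion prediction: a 1-D phase
    dynamic advances at rate omega, and a two-factor observation
    model (phase basis functions x style weights) emits a pose value.
    Centers and prototypes are illustrative, not learned parameters."""
    centers = np.linspace(0, 2 * np.pi, 8, endpoint=False)  # RBF centers over phase
    # two hypothetical style prototypes; `style` in [0, 1] interpolates them
    prototypes = np.vstack([np.sin(centers), np.cos(centers)])
    poses = []
    for _ in range(n_steps):
        phase = (phase + omega * dt) % (2 * np.pi)
        basis = np.exp(-0.5 * ((phase - centers) / 0.5) ** 2)
        basis /= basis.sum()                     # normalized phase features
        weights = style * prototypes[0] + (1 - style) * prototypes[1]
        poses.append(float(weights @ basis))
    return poses
```

    The point of the structure is that only the low-dimensional phase and style need updating per observation, which is what makes millisecond-scale adaptation and prediction plausible.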

  9. Multi-Robot Interfaces and Operator Situational Awareness: Study of the Impact of Immersion and Prediction

    PubMed Central

    Peña-Tapia, Elena; Martín-Barrio, Andrés; Olivares-Méndez, Miguel A.

    2017-01-01

    Multi-robot missions are a challenge for operators in terms of workload and situational awareness. These operators have to receive data from the robots, extract information, understand the situation properly, make decisions, generate the adequate commands, and send them to the robots. The consequences of excessive workload and lack of awareness can vary from inefficiencies to accidents. This work focuses on the study of future operator interfaces of multi-robot systems, taking into account relevant issues such as multimodal interactions, immersive devices, predictive capabilities and adaptive displays. Specifically, four interfaces have been designed and developed: a conventional, a predictive conventional, a virtual reality and a predictive virtual reality interface. The four interfaces have been validated by the performance of twenty-four operators that supervised eight multi-robot missions of fire surveillance and extinguishing. The results of the workload and situational awareness tests show that virtual reality improves the situational awareness without increasing the workload of operators, whereas the effects of predictive components are not significant and depend on their implementation. PMID:28749407

  10. Reduced Pressure Cabin Testing of the Orion Atmosphere Revitalization Technology

    NASA Technical Reports Server (NTRS)

    Button, Amy; Sweterlitsch, Jeffrey

    2011-01-01

    An amine-based carbon dioxide (CO2) and water vapor sorbent in pressure-swing regenerable beds has been developed by Hamilton Sundstrand and baselined for the Atmosphere Revitalization System for moderate duration missions of the Orion Multipurpose Crew Vehicle. In previous years at this conference, reports were presented on extensive Johnson Space Center testing of this technology in a sea-level pressure environment with simulated and actual human metabolic loads in both open and closed-loop configurations. In 2011, the technology was tested in an open cabin-loop configuration at ambient and two sub-ambient pressures to compare the performance of the system to the results of previous tests at ambient pressure. The testing used a human metabolic simulator with a different type of water vapor generation than previously used, which added some unique challenges in the data analysis. This paper summarizes the results of: baseline and some matrix testing at all three cabin pressures, increased vacuum regeneration line pressure with a high metabolic load, a set of tests studying CO2 and water vapor co-adsorption effects relative to model-predicted performance, and validation tests of flight program computer model predictions with specific operating conditions.

  11. Reduced Pressure Cabin Testing of the Orion Atmosphere Revitalization Technology

    NASA Technical Reports Server (NTRS)

    Button, Amy; Sweterlitsch, Jeffrey J.

    2013-01-01

    An amine-based carbon dioxide (CO2) and water vapor sorbent in pressure-swing regenerable beds has been developed by Hamilton Sundstrand and baselined for the Atmosphere Revitalization System for moderate duration missions of the Orion Multipurpose Crew Vehicle. In previous years at this conference, reports were presented on extensive Johnson Space Center testing of this technology in a sea-level pressure environment with simulated and actual human metabolic loads in both open and closed-loop configurations. In 2011, the technology was tested in an open cabin-loop configuration at ambient and two sub-ambient pressures to compare the performance of the system to the results of previous tests at ambient pressure. The testing used a human metabolic simulator with a different type of water vapor generation than previously used, which added some unique challenges in the data analysis. This paper summarizes the results of: baseline and some matrix testing at all three cabin pressures, increased vacuum regeneration line pressure with a high metabolic load, a set of tests studying CO2 and water vapor co-adsorption effects relative to model-predicted performance, and validation tests of flight program computer model predictions with specific operating conditions.

  12. Reduced Pressure Cabin Testing of the Orion Atmosphere Revitalization Technology

    NASA Technical Reports Server (NTRS)

    Button, Amy B.; Sweterlitsch, Jeffrey J.

    2013-01-01

    An amine-based carbon dioxide (CO2) and water vapor sorbent in pressure-swing regenerable beds has been developed by United Technologies Corp. Aerospace Systems (UTAS, formerly Hamilton Sundstrand) and baselined for the Atmosphere Revitalization System for moderate duration missions of the Orion Multipurpose Crew Vehicle (MPCV). In previous years at this conference, reports were presented on extensive Johnson Space Center testing of this technology in a sea-level pressure environment with simulated and actual human metabolic loads in both open and closed-loop configurations. In 2011, the technology was tested in an open cabin-loop configuration at ambient and two sub-ambient pressures to compare the performance of the system to the results of previous tests at ambient pressure. The testing used a human metabolic simulator with a different type of water vapor generation than previously used, which added some unique challenges in the data analysis. This paper summarizes the results of: baseline and some matrix testing at all three cabin pressures, increased vacuum regeneration line pressure testing with a high metabolic load, a set of tests studying CO2 and water vapor co-adsorption effects relative to model-predicted performance, and validation tests of flight project computer model predictions with specific operating conditions.

  13. Dogs’ Body Language Relevant to Learning Achievement

    PubMed Central

    Hasegawa, Masashi; Ohtani, Nobuyo; Ohta, Mitsuaki

    2014-01-01

    Simple Summary For humans and dogs to live together amiably, dog training is required, and a lack of obedience training is significantly related to the prevalence of certain behavioral problems. To train efficiently, it is important that the trainer/owner ascertains the learning level of the dog. Understanding the dog’s body language helps humans understand the animal’s emotions. This study evaluated the posture of certain dog body parts during operant conditioning. Our findings suggest that certain postures were related to the dog’s learning level during operant conditioning. Being aware of these postures could be helpful in understanding canine emotion during learning. Abstract The facial expressions and body postures of dogs can give helpful information about their moods and emotional states. People can obedience-train their dogs more effectively if the mannerisms associated with learning in dogs can be identified. The aim of this study was to clarify, by measuring the duration of behaviors, which aspects of a dog’s body language during operant conditioning predict achievement in the test that followed. Forty-six untrained dogs (17 males and 26 females) of various breeds were used. Each session consisted of 5 minutes of training with a treat reward followed by 3 minutes of rest and finally an operant conditioning test that consisted of 20 “hand motion” cues. The operant tests were conducted a total of nine times over three consecutive days, and the success numbers were counted. The duration of the dog’s behavior, focusing on the dog’s eyes, mouth, ears, tail and tail-wagging, was recorded during the operant conditioning sessions before the test. Particular behaviors, including wide eyes, closed mouth, erect ears, and forward and high tail carriage, without wagging or with short and quick wagging, related to high achievement results. It is concluded that dogs' body language during operant conditioning was related to their success rate. PMID:26479883

  14. Systematic Evaluation of Wajima Superposition (Steady-State Concentration to Mean Residence Time) in the Estimation of Human Intravenous Pharmacokinetic Profile.

    PubMed

    Lombardo, Franco; Berellini, Giuliano; Labonte, Laura R; Liang, Guiqing; Kim, Sean

    2016-03-01

    We present a systematic evaluation of the Wajima superpositioning method to estimate the human intravenous (i.v.) pharmacokinetic (PK) profile based on a set of 54 marketed drugs with diverse structure and range of physicochemical properties. We illustrate the use of average of "best methods" for the prediction of clearance (CL) and volume of distribution at steady state (VDss) as described in our earlier work (Lombardo F, Waters NJ, Argikar UA, et al. J Clin Pharmacol. 2013;53(2):178-191; Lombardo F, Waters NJ, Argikar UA, et al. J Clin Pharmacol. 2013;53(2):167-177). These methods provided much more accurate prediction of human PK parameters, yielding 88% and 70% of the predictions within 2-fold error for VDss and CL, respectively. The prediction of human i.v. profile using Wajima superpositioning of rat, dog, and monkey time-concentration profiles was tested against the observed human i.v. PK using fold error statistics. The results showed that 63% of the compounds yielded a geometric mean of fold error below 2-fold, and an additional 19% yielded a geometric mean of fold error between 2- and 3-fold, leaving only 18% of the compounds with a relatively poor prediction. Our results showed that good superposition was observed in all cases, demonstrating the predictive value of the Wajima approach, and that the cause of poor prediction of human i.v. profile was mainly due to the poorly predicted CL value, while VDss prediction had a minor impact on the accuracy of human i.v. profile prediction. Copyright © 2016. Published by Elsevier Inc.
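
    The fold-error statistic used above can be made concrete; a minimal sketch (the function name is ours; the geometric mean fold error is as commonly defined in PK prediction work):

```python
import math

def geometric_mean_fold_error(predicted, observed):
    """Geometric mean fold error (GMFE) between paired predicted and
    observed values (e.g. points of a concentration-time profile):
        GMFE = exp( mean( |ln(predicted_i / observed_i)| ) )
    A GMFE below 2 corresponds to the 'within 2-fold error' criterion."""
    logs = [abs(math.log(p / o)) for p, o in zip(predicted, observed)]
    return math.exp(sum(logs) / len(logs))

# hypothetical predicted vs. observed concentrations
gmfe = geometric_mean_fold_error([1.2, 2.6, 3.9], [1.0, 2.0, 4.0])
```

    Because the log ratios are averaged in absolute value, over- and under-prediction cannot cancel, which is why GMFE is stricter than a plain ratio of means.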

  15. Using optical coherence tomography (OCT) to evaluate the status of human donor kidneys (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Andrews, Peter M.; Konkel, Brandon; Anderson, Erik; Stein, Matthew; Cooper, Matthew; Verbesey, Jennifer E.; Ghasemian, Seyed; Chen, Yu

    2016-02-01

    The main cause of delayed renal function following the transplant of donor kidneys is ischemic induced acute tubular necrosis (ATN). The ability to determine the degree of ATN suffered by donor kidneys prior to their transplant would enable transplant surgeons to use kidneys that might otherwise be discarded and better predict post-transplant renal function. Currently, there are no reliable tests to determine the extent of ATN of donor kidneys prior to their transplant. In ongoing clinical trials, we have been using optical coherence tomography (OCT) to non-invasively image the superficial proximal tubules of human donor kidneys prior to and following transplant, and correlate these observations with post-transplant renal function. Thus far we have studied over 40 living donor kidneys and 10 cadaver donor kidneys, and demonstrated that this imaging can be performed in a sterile and expeditious fashion in the operating room (OR). Because of many variables associated with a diverse population of donors/recipients and transplant operation parameters, more transplant data must be collected prior to drawing definite conclusions. Nevertheless, our observations have thus far mirrored our previously published laboratory results indicating that damage to the kidney proximal tubules as indicated by tubule swelling is a good measure of post-transplant ATN and delayed graft function. We conclude that OCT is a useful procedure for analyzing human donor kidneys.

  16. Mercury in the national parks

    USGS Publications Warehouse

    Pritz, Colleen Flanagan; Eagles-Smith, Collin A.; Krabbenhoft, David

    2014-01-01

    One thing is certain: Even for trained researchers, predicting mercury’s behavior in the environment is challenging. Fundamentally it is one of 98 naturally occurring elements, with natural sources, such as volcanoes, and concentrated ore deposits, such as cinnabar. Yet there are also human-caused sources, such as emissions from both coal-burning power plants and mining operations for gold and silver. There are elemental forms, inorganic or organic forms, reactive and unreactive species. Mercury is emitted, then deposited, then re-emitted—thus earning its mercurial reputation. Most importantly, however, it is ultimately transferred into food chains through processes fueled by tiny microscopic creatures: bacteria.

  17. Depth Perception In Remote Stereoscopic Viewing Systems

    NASA Technical Reports Server (NTRS)

    Diner, Daniel B.; Von Sydow, Marika

    1989-01-01

    Report describes theoretical and experimental studies of perception of depth by human operators through stereoscopic video systems. Purpose of such studies is to optimize dual-camera configurations used to view workspaces of remote manipulators at distances of 1 to 3 m from cameras. According to analysis, static stereoscopic depth distortion decreased, without decreasing stereoscopic depth resolution, by increasing camera-to-object and intercamera distances and camera focal length. Further predicts dynamic stereoscopic depth distortion reduced by rotating cameras around center of circle passing through point of convergence of viewing axes and first nodal points of two camera lenses.
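
    The reported dependence of depth perception on intercamera distance and focal length follows from the standard triangulation relation for the simplified parallel-axis case, Z = f*b/d; the converged-camera geometry studied in the report adds correction terms not modeled in this sketch (variable names are ours):

```python
def stereo_depth(focal_mm, baseline_mm, disparity_mm):
    """Parallel-axis stereo triangulation: depth Z = f * b / d, where
    f is the focal length, b the intercamera baseline, and d the
    image-plane disparity. Larger f or b produces more disparity per
    unit of depth, i.e. finer depth resolution, consistent with the
    report's recommendation to increase both."""
    return focal_mm * baseline_mm / disparity_mm
```

    For example, a 10 mm lens with a 100 mm baseline maps 1 mm of disparity to 1 m of depth.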

  18. Towards Assessing the Human Trajectory Planning Horizon

    PubMed Central

    Carton, Daniel; Nitsch, Verena; Meinzer, Dominik; Wollherr, Dirk

    2016-01-01

    Mobile robots are envisioned to cooperate closely with humans and to integrate seamlessly into a shared environment. For locomotion, these environments resemble traversable areas which are shared between multiple agents like humans and robots. The seamless integration of mobile robots into these environments requires accurate predictions of human locomotion. This work considers optimal control and model predictive control approaches for accurate trajectory prediction and proposes to integrate aspects of human behavior to improve their performance. Recently developed models are not able to accurately reproduce trajectories that result from sudden avoidance maneuvers. In particular, human locomotion behavior when handling disturbances from other agents poses a problem. The goal of this work is to investigate whether humans alter their trajectory planning horizon in order to resolve abruptly emerging collision situations. By modeling humans as model predictive controllers, the influence of the planning horizon is investigated in simulations. Based on these results, an experiment is designed to identify whether humans initiate a change in their locomotion planning behavior while moving in a complex environment. The results support the hypothesis that humans employ a shorter planning horizon to avoid collisions that are triggered by unexpected disturbances. Observations presented in this work are expected to further improve the generalizability and accuracy of prediction methods based on dynamic models. PMID:27936015
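
    To see how the horizon length shapes behavior, here is a toy receding-horizon (MPC-style) planner for a 1-D walker; the cost terms, action set, and parameters are ours for illustration only, not the paper's model:

```python
import itertools

def mpc_step(x, goal, obstacle, horizon, actions=(-1.0, 0.0, 1.0), dt=0.1):
    """Toy receding-horizon planner: enumerate every velocity sequence
    of length `horizon`, score the terminal distance to the goal plus
    a penalty for entering the collision zone around the obstacle, and
    return the first action of the best sequence (MPC-style)."""
    def cost(seq):
        pos, c = x, 0.0
        for a in seq:
            pos += a * dt
            if abs(pos - obstacle) < 0.2:   # inside the collision zone
                c += 100.0
        return c + abs(pos - goal)
    best = min(itertools.product(actions, repeat=horizon), key=cost)
    return best[0]
```

    With a clear path the planner advances toward the goal; with an obstacle directly ahead, a short horizon cannot see past it and the planner declines to advance, the kind of horizon-dependent avoidance behavior the study probes.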

  19. Towards Assessing the Human Trajectory Planning Horizon.

    PubMed

    Carton, Daniel; Nitsch, Verena; Meinzer, Dominik; Wollherr, Dirk

    2016-01-01

    Mobile robots are envisioned to cooperate closely with humans and to integrate seamlessly into a shared environment. For locomotion, these environments resemble traversable areas which are shared between multiple agents like humans and robots. The seamless integration of mobile robots into these environments requires accurate predictions of human locomotion. This work considers optimal control and model predictive control approaches for accurate trajectory prediction and proposes to integrate aspects of human behavior to improve their performance. Recently developed models are not able to accurately reproduce trajectories that result from sudden avoidance maneuvers. In particular, human locomotion behavior when handling disturbances from other agents poses a problem. The goal of this work is to investigate whether humans alter their trajectory planning horizon in order to resolve abruptly emerging collision situations. By modeling humans as model predictive controllers, the influence of the planning horizon is investigated in simulations. Based on these results, an experiment is designed to identify whether humans initiate a change in their locomotion planning behavior while moving in a complex environment. The results support the hypothesis that humans employ a shorter planning horizon to avoid collisions that are triggered by unexpected disturbances. Observations presented in this work are expected to further improve the generalizability and accuracy of prediction methods based on dynamic models.

  20. NASA Space Launch System Operations Outlook

    NASA Technical Reports Server (NTRS)

    Hefner, William Keith; Matisak, Brian P.; McElyea, Mark; Kunz, Jennifer; Weber, Philip; Cummings, Nicholas; Parsons, Jeremy

    2014-01-01

    The National Aeronautics and Space Administration's (NASA) Space Launch System (SLS) Program, managed at the Marshall Space Flight Center (MSFC), is working with the Ground Systems Development and Operations (GSDO) Program, based at the Kennedy Space Center (KSC), to deliver a new safe, affordable, and sustainable capability for human and scientific exploration beyond Earth's orbit (BEO). Larger than the Saturn V Moon rocket, SLS will provide 10 percent more thrust at liftoff in its initial 70 metric ton (t) configuration and 20 percent more in its evolved 130-t configuration. The primary mission of the SLS rocket will be to launch astronauts to deep space destinations in the Orion Multi-Purpose Crew Vehicle (MPCV), also in development and managed by the Johnson Space Center. Several high-priority science missions also may benefit from the increased payload volume and reduced trip times offered by this powerful, versatile rocket. Reducing the lifecycle costs for NASA's space transportation flagship will maximize the exploration and scientific discovery returned from the taxpayer's investment. To that end, decisions made during development of SLS and associated systems will impact the nation's space exploration capabilities for decades. This paper will provide an update to the operations strategy presented at SpaceOps 2012. It will focus on: 1) Preparations to streamline the processing flow and infrastructure needed to produce and launch the world's largest rocket (i.e., through incorporation and modification of proven, heritage systems into the vehicle and ground systems); 2) Implementation of a lean approach to reach-back support of hardware manufacturing, green-run testing, and launch site processing and activities; and 3) Partnering between the vehicle design and operations communities on state-of-the-art predictive operations analysis techniques.
    An example of innovation is testing the integrated vehicle at the processing facility in parallel, rather than sequentially, saving both time and money. These themes are accomplished under the context of a new cross-program integration model that emphasizes peer-to-peer accountability and collaboration towards a common, shared goal. Utilizing the lessons learned through 50 years of human space flight experience, SLS is assigning the right number of people from appropriate backgrounds, providing them the right tools, and exercising the right processes for the job. The result will be a powerful, versatile, and capable heavy-lift, human-rated asset for the future human and scientific exploration of space.

  1. NASA Space Launch System Operations Outlook

    NASA Technical Reports Server (NTRS)

    Hefner, William Keith; Matisak, Brian P.; McElyea, Mark; Kunz, Jennifer; Weber, Philip; Cummings, Nicholas; Parsons, Jeremy

    2014-01-01

    The National Aeronautics and Space Administration's (NASA) Space Launch System (SLS) Program, managed at the Marshall Space Flight Center (MSFC), is working with the Ground Systems Development and Operations (GSDO) Program, based at the Kennedy Space Center (KSC), to deliver a new safe, affordable, and sustainable capability for human and scientific exploration beyond Earth's orbit (BEO). Larger than the Saturn V Moon rocket, SLS will provide 10 percent more thrust at liftoff in its initial 70 metric ton (t) configuration and 20 percent more in its evolved 130-t configuration. The primary mission of the SLS rocket will be to launch astronauts to deep space destinations in the Orion Multi-Purpose Crew Vehicle (MPCV), also in development and managed by the Johnson Space Center. Several high-priority science missions also may benefit from the increased payload volume and reduced trip times offered by this powerful, versatile rocket. Reducing the life-cycle costs for NASA's space transportation flagship will maximize the exploration and scientific discovery returned from the taxpayer's investment. To that end, decisions made during development of SLS and associated systems will impact the nation's space exploration capabilities for decades. This paper will provide an update to the operations strategy presented at SpaceOps 2012. It will focus on: 1) Preparations to streamline the processing flow and infrastructure needed to produce and launch the world's largest rocket (i.e., through incorporation and modification of proven, heritage systems into the vehicle and ground systems); 2) Implementation of a lean approach to reach-back support of hardware manufacturing, green-run testing, and launch site processing and activities; and 3) Partnering between the vehicle design and operations communities on state-of-the-art predictive operations analysis techniques.
    An example of innovation is testing the integrated vehicle at the processing facility in parallel, rather than sequentially, saving both time and money. These themes are accomplished under the context of a new cross-program integration model that emphasizes peer-to-peer accountability and collaboration towards a common, shared goal. Utilizing the lessons learned through 50 years of human space flight experience, SLS is assigning the right number of people from appropriate backgrounds, providing them the right tools, and exercising the right processes for the job. The result will be a powerful, versatile, and capable heavy-lift, human-rated asset for the future human and scientific exploration of space.

  2. ALLY: An operator's associate for satellite ground control systems

    NASA Technical Reports Server (NTRS)

    Bushman, J. B.; Mitchell, Christine M.; Jones, P. M.; Rubin, K. S.

    1991-01-01

    The key characteristics of an intelligent advisory system are explored. A central feature is that human-machine cooperation should be based on a metaphor of human-to-human cooperation. ALLY, a computer-based operator's associate based on a preliminary theory of human-to-human cooperation, is discussed. ALLY assists the operator in carrying out the supervisory control functions for a simulated NASA ground control system. Experimental evaluation of ALLY indicates that operators using ALLY performed at least as well as they did when using a human associate and in some cases even better.

  3. Hierarchical analytical and simulation modelling of human-machine systems with interference

    NASA Astrophysics Data System (ADS)

    Braginsky, M. Ya; Tarakanov, D. V.; Tsapko, S. G.; Tsapko, I. V.; Baglaeva, E. A.

    2017-01-01

    The article considers the principles of building an analytical and simulation model of the human operator and of the industrial control system hardware and software. E-networks, an extension of Petri nets, are used as the mathematical apparatus. This approach allows simulating complex parallel distributed processes in human-machine systems. A structural, hierarchical approach is used to build the mathematical model of the human operator: the upper level is a logical dynamic model of decision making based on E-networks, while the lower level reflects the psychophysiological characteristics of the human operator.
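
    E-networks add timing and resolution procedures on top of ordinary Petri nets; the basic token-firing rule they build on can be sketched in a few lines (place names and the example transition are illustrative, not from the article):

```python
def fire(marking, transition):
    """Fire one Petri-net transition if it is enabled.
    marking: dict mapping place name -> token count
    transition: (inputs, outputs), each a dict place name -> arc weight
    Returns the new marking, or None if the transition is not enabled."""
    inputs, outputs = transition
    if any(marking.get(p, 0) < w for p, w in inputs.items()):
        return None  # not enough tokens on some input place
    new = dict(marking)
    for p, w in inputs.items():
        new[p] -= w
    for p, w in outputs.items():
        new[p] = new.get(p, 0) + w
    return new

# hypothetical two-place net: an 'idle' operator consumes an 'alarm'
# token and moves to 'acting'
t_react = ({"idle": 1, "alarm": 1}, {"acting": 1})
m = fire({"idle": 1, "alarm": 1}, t_react)
```

    Hierarchical models of the kind described compose such nets: a firing at the decision-making level can inject tokens into a lower-level net representing psychophysiological response.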

  4. Human fertility, molecular genetics, and natural selection in modern societies.

    PubMed

    Tropf, Felix C; Stulp, Gert; Barban, Nicola; Visscher, Peter M; Yang, Jian; Snieder, Harold; Mills, Melinda C

    2015-01-01

    Research on genetic influences on human fertility outcomes such as number of children ever born (NEB) or the age at first childbirth (AFB) has been solely based on twin and family designs that suffer from problematic assumptions and practical limitations. The current study exploits recent advances in the field of molecular genetics by applying the genomic-relationship-matrix-based restricted maximum likelihood (GREML) methods to quantify for the first time the extent to which common genetic variants influence the NEB and the AFB of women. Using data from the UK and the Netherlands (N = 6,758), results show significant additive genetic effects on both traits explaining 10% (SE = 5) of the variance in the NEB and 15% (SE = 4) in the AFB. We further find a significant negative genetic correlation between AFB and NEB in the pooled sample of -0.62 (SE = 0.27, p-value = 0.02). This finding implies that individuals with genetic predispositions for an earlier AFB had a reproductive advantage and that natural selection operated not only in historical, but also in contemporary populations. The observed postponement in the AFB across the past century in Europe contrasts with these findings, suggesting an evolutionary override by environmental effects and underscoring that evolutionary predictions in modern human societies are not straightforward. It emphasizes the necessity for an integrative research design from the fields of genetics and social sciences in order to understand and predict fertility outcomes. Finally, our results suggest that we may be able to find genetic variants associated with human fertility when conducting GWAS meta-analyses with sufficient sample size.

  5. Cognitive engineering models in space systems

    NASA Technical Reports Server (NTRS)

    Mitchell, Christine M.

    1992-01-01

    NASA space systems, including mission operations on the ground and in space, are complex, dynamic, predominantly automated systems in which the human operator is a supervisory controller. The human operator monitors and fine-tunes computer-based control systems and is responsible for ensuring safe and efficient system operation. In such systems, the potential consequences of human mistakes and errors may be very large, even though the probability of such events is low. Thus, models of cognitive functions in complex systems are needed to describe human performance and form the theoretical basis of operator workstation design, including displays, controls, and decision support aids. The operator function model represents normative operator behavior: the operator activities expected given the current system state. The extension of the theoretical structure of the operator function model and its application to NASA Johnson mission operations and space station applications is discussed.

  6. Global precipitation measurement (GPM) preliminary design

    NASA Astrophysics Data System (ADS)

    Neeck, Steven P.; Kakar, Ramesh K.; Azarbarzin, Ardeshir A.; Hou, Arthur Y.

    2008-10-01

    The overarching Earth science mission objective of the Global Precipitation Measurement (GPM) mission is to develop a scientific understanding of the Earth system and its response to natural and human-induced changes. This will enable improved prediction of climate, weather, and natural hazards for present and future generations. The specific scientific objectives of GPM are advancing: Precipitation Measurement through combined use of active and passive remote-sensing techniques, Water/Energy Cycle Variability through improved knowledge of the global water/energy cycle and fresh water availability, Climate Prediction through better understanding of surface water fluxes, soil moisture storage, cloud/precipitation microphysics and latent heat release, Weather Prediction through improved numerical weather prediction (NWP) skills from more accurate and frequent measurements of instantaneous rain rates with better error characterizations and improved assimilation methods, Hydrometeorological Prediction through better temporal sampling and spatial coverage of high-resolution precipitation measurements and innovative hydrometeorological modeling. GPM is a joint initiative with the Japan Aerospace Exploration Agency (JAXA) and other international partners and is the backbone of the Committee on Earth Observation Satellites (CEOS) Precipitation Constellation. It will unify and improve global precipitation measurements from a constellation of dedicated and operational active/passive microwave sensors. GPM is completing the Preliminary Design Phase and is advancing towards launch in 2013 and 2014.

  7. Fractal dynamics in physiology: Alterations with disease and aging

    PubMed Central

    Goldberger, Ary L.; Amaral, Luis A. N.; Hausdorff, Jeffrey M.; Ivanov, Plamen Ch.; Peng, C.-K.; Stanley, H. Eugene

    2002-01-01

    According to classical concepts of physiologic control, healthy systems are self-regulated to reduce variability and maintain physiologic constancy. Contrary to the predictions of homeostasis, however, the output of a wide variety of systems, such as the normal human heartbeat, fluctuates in a complex manner, even under resting conditions. Scaling techniques adapted from statistical physics reveal the presence of long-range, power-law correlations, as part of multifractal cascades operating over a wide range of time scales. These scaling properties suggest that the nonlinear regulatory systems are operating far from equilibrium, and that maintaining constancy is not the goal of physiologic control. In contrast, for subjects at high risk of sudden death (including those with heart failure), fractal organization, along with certain nonlinear interactions, breaks down. Application of fractal analysis may provide new approaches to assessing cardiac risk and forecasting sudden cardiac death, as well as to monitoring the aging process. Similar approaches show promise in assessing other regulatory systems, such as human gait control in health and disease. Elucidating the fractal and nonlinear mechanisms involved in physiologic control and complex signaling networks is emerging as a major challenge in the postgenomic era. PMID:11875196
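
    The long-range, power-law correlations described above are commonly quantified with scaling methods such as detrended fluctuation analysis (DFA). The sketch below (not the authors' code; the signal and window sizes are illustrative) estimates a scaling exponent from the fluctuation function at two window sizes; uncorrelated noise should yield an exponent near 0.5, while healthy heartbeat series typically show values closer to 1.

```python
# Minimal detrended fluctuation analysis (DFA) sketch.
import math
import random

def dfa_fluctuation(x, window):
    """RMS fluctuation of the integrated, piecewise-linearly detrended signal."""
    mean = sum(x) / len(x)
    # Integrate the mean-centered signal (the "profile").
    y, s = [], 0.0
    for v in x:
        s += v - mean
        y.append(s)
    # Least-squares detrend in non-overlapping windows, accumulate residuals.
    sq, count = 0.0, 0
    for start in range(0, len(y) - window + 1, window):
        seg = y[start:start + window]
        ts = list(range(window))
        mt = (window - 1) / 2
        my = sum(seg) / window
        slope = sum((t - mt) * (v - my) for t, v in zip(ts, seg)) / \
                sum((t - mt) ** 2 for t in ts)
        for t, v in zip(ts, seg):
            sq += (v - (my + slope * (t - mt))) ** 2
            count += 1
    return math.sqrt(sq / count)

# Synthetic uncorrelated signal (white noise): expected exponent ~0.5.
rng = random.Random(0)
signal = [rng.gauss(0, 1) for _ in range(4096)]
f1, f2 = dfa_fluctuation(signal, 16), dfa_fluctuation(signal, 64)
alpha = math.log(f2 / f1) / math.log(64 / 16)
print(round(alpha, 2))  # scaling exponent, near 0.5 for white noise
```

    A full analysis would fit the log-log slope over many window sizes rather than two, but the exponent computed this way already distinguishes uncorrelated from long-range-correlated signals.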

  8. A psychologist's view of validating aviation systems

    NASA Technical Reports Server (NTRS)

    Stein, Earl S.; Wagner, Dan

    1994-01-01

    All systems, no matter what they are designed to do, have shortcomings that may make them less productive than was hoped during initial development. Such shortcomings can arise at any stage of development, from conception to the end of the implementation life cycle. While system failures and errors of a lesser magnitude can occur as a function of mechanical or software breakdown, the majority of such problems in aviation are usually laid on the shoulders of the human operator and, to a lesser extent, on human factors. The operator bears the responsibility and blame even though, from a human factors perspective, error may have been designed into the system. Human factors is not a new concept in aviation. The name may be new, but the issues related to operators in the loop date back to the industrial revolution of the nineteenth century and certainly to the aviation build-up for World War I. During this first global confrontation, military services on all sides discovered rather quickly that poor selection and training led to drastically increased personnel losses. While hardware design became an issue later, the early efforts were primarily focused on increased care in pilot selection and on training. This actually involved early labor-intensive simulation, using such devices as sticks and chairs mounted on rope networks that could be manually moved in response to control input. The use of selection criteria and improved training led to more viable person-machine systems. More pilots survived training and their first ten missions in the air; surviving those first ten missions was a rule of thumb, arrived at by experience, that predicted ultimate survival better than any other. This rule was to hold through World War II. At that time, personnel selection and training became very sophisticated by previous standards, and many psychologists were drafted into Army Air Corps programs geared towards refining the human factor.
    However, despite the talent involved in these programs and the tremendous build-up of aviation during the war, there were still aircraft designs that were man killers (no sexism implied, since all combat pilots were men). One classic design error, identified fifty years ago, was the multipointer altimeter, which could easily be misread, especially by a pilot under considerable task load. It led to fully operational aircraft being flown into terrain. The authors of the research that formally identified this problem put 'Human Errors' in quotes to express their dissatisfaction with the traditional approach to accident investigation, which places the burden of guilt on the operator. Some of these altimeters still exist in older aircraft to this day.

  9. Evaluation of Bioinformatic Programmes for the Analysis of Variants within Splice Site Consensus Regions

    PubMed Central

    Tang, Rongying; Prosser, Debra O.; Love, Donald R.

    2016-01-01

    The increasing diagnostic use of gene sequencing has led to an expanding dataset of novel variants that lie within consensus splice junctions. The challenge for diagnostic laboratories is the evaluation of these variants in order to determine whether they affect splicing or are merely benign. A common evaluation strategy is in silico analysis, for which a number of programmes are available online; however, there are currently no consensus guidelines on the selection of programmes or on protocols for interpreting the prediction results. Using a collection of 222 pathogenic mutations and 50 benign polymorphisms, we evaluated the sensitivity and specificity of four in silico programmes in predicting the effect of each variant on splicing. The programmes comprised Human Splice Finder (HSF), Max Entropy Scan (MES), NNSplice, and ASSP. The MES and ASSP programmes gave the highest performance based on Receiver Operating Characteristic (ROC) analysis, with an optimal cut-off of a 10% reduction in score. The study also showed that the sensitivity of prediction is affected by the level of conservation of individual positions, with in silico predictions for variants at positions −4 and +7 within consensus splice sites being largely uninformative. PMID:27313609
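
    The evaluation above amounts to sweeping a score-reduction cut-off and measuring sensitivity/specificity against known pathogenic and benign variants. A minimal sketch of that ROC-style procedure (the score reductions below are invented, not the study's data):

```python
# Choosing a splice-score cut-off by ROC-style analysis: a variant is
# called "splice-affecting" when the in silico score drops by at least
# `cutoff` percent relative to the wild-type sequence.

def confusion(pathogenic_drops, benign_drops, cutoff):
    """Return (sensitivity, specificity) at one cut-off (% score reduction)."""
    tp = sum(d >= cutoff for d in pathogenic_drops)   # pathogenic, flagged
    tn = sum(d < cutoff for d in benign_drops)        # benign, not flagged
    return tp / len(pathogenic_drops), tn / len(benign_drops)

def best_cutoff(pathogenic_drops, benign_drops, cutoffs):
    """Pick the cut-off maximizing Youden's J = sensitivity + specificity - 1."""
    return max(cutoffs,
               key=lambda c: sum(confusion(pathogenic_drops, benign_drops, c)) - 1)

# Hypothetical score reductions (%) for known variants:
pathogenic = [35, 28, 60, 12, 45, 18, 90, 25]
benign = [0, 2, 5, 1, 8, 3, 0, 4]
print(best_cutoff(pathogenic, benign, cutoffs=range(0, 51, 5)))  # → 10
```

    On these toy numbers a 10% cut-off separates the two groups perfectly, which mirrors the form (though of course not the evidence) of the study's optimal 10% threshold.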

  10. Triadic social interactions operate across time: a field experiment with wild chimpanzees.

    PubMed

    Wittig, Roman M; Crockford, Catherine; Langergraber, Kevin E; Zuberbühler, Klaus

    2014-03-22

    Social animals cooperate with bonding partners to outcompete others. Predicting a competitor's supporter is likely to be beneficial, regardless of whether the supporting relationship is stable or transient, or whether the support happens immediately or later. Although humans make such predictions frequently, it is unclear to what extent animals have the cognitive abilities to recognize others' transient bond partners and to predict others' coalitions that extend beyond the immediate present. We conducted playback experiments with wild chimpanzees to test this. About 2 h after fighting, subjects heard recordings of aggressive barks of a bystander, who was or was not a bond partner of the former opponent. Subjects looked longer and moved away more often from barks of the former opponents' bond partners than non-bond partners. In an additional experiment, subjects moved away more from barks than socially benign calls of the same bond partner. These effects were present despite differences in genetic relatedness and considerable time delays between the two events. Chimpanzees, it appears, integrate memories of social interactions from different sources to make inferences about current interactions. This ability is crucial for connecting triadic social interactions across time, a requirement for predicting aggressive support even after a time delay.

  11. Probabilistic risk assessment for a loss of coolant accident in McMaster Nuclear Reactor and application of reliability physics model for modeling human reliability

    NASA Astrophysics Data System (ADS)

    Ha, Taesung

    A probabilistic risk assessment (PRA) was conducted for a loss of coolant accident (LOCA) in the McMaster Nuclear Reactor (MNR). A level 1 PRA was completed, including event sequence modeling, system modeling, and quantification. To support the quantification of the accident sequences identified, data analysis using the Bayesian method and human reliability analysis (HRA) using the accident sequence evaluation procedure (ASEP) approach were performed. Since human performance in research reactors differs significantly from that in power reactors, a time-oriented HRA model (reliability physics model) was applied for the human error probability (HEP) estimation of the core relocation. This model is based on two competing random variables: phenomenological time and performance time. Response surface and direct Monte Carlo simulation with Latin hypercube sampling were applied for estimating the phenomenological time, whereas the performance time was obtained from interviews with operators. An appropriate probability distribution for the phenomenological time was assigned by statistical goodness-of-fit tests, and the HEP for the core relocation was estimated from the two competing quantities. The sensitivity of each probability distribution in human reliability estimation was investigated. In order to quantify the uncertainty in the predicted HEPs, a Bayesian approach was selected for its capability of incorporating uncertainties in the model itself and in the model's parameters. The HEP from the current time-oriented model was compared with that from the ASEP approach, and both results were used to evaluate the sensitivity of alternative human reliability modeling for the manual core relocation in the LOCA risk model. This exercise demonstrated the applicability of a reliability physics model supplemented with a Bayesian approach for modeling human reliability, and its potential usefulness for quantifying model uncertainty through sensitivity analysis in the PRA model.
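
    The core of the reliability physics model is the competition between the two random times: a human error occurs when the operators' performance time exceeds the phenomenological time available. A Monte Carlo sketch of that idea (the lognormal parameters are assumptions for illustration, not the thesis data):

```python
# HEP = P(performance time > phenomenological time), estimated by
# Monte Carlo sampling of the two competing random variables.
import random

def hep_monte_carlo(n=100_000, seed=1):
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        # Phenomenological time available before core damage
        # (assumed lognormal, median exp(3.4) ≈ 30 min).
        t_phenom = rng.lognormvariate(mu=3.4, sigma=0.4)
        # Operator performance time (assumed lognormal,
        # median exp(2.3) ≈ 10 min, e.g. from interviews).
        t_perform = rng.lognormvariate(mu=2.3, sigma=0.6)
        failures += t_perform > t_phenom
    return failures / n

# Probability that operators run out of time under these assumptions:
print(hep_monte_carlo())
```

    Swapping in other fitted distributions for either time (as the goodness-of-fit tests in the study would dictate) changes only the two sampling lines, which is what makes the distribution-sensitivity study straightforward.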

  12. Generalization Gradients in Human Predictive Learning: Effects of Discrimination Training and within-Subjects Testing

    ERIC Educational Resources Information Center

    Vervliet, Bram; Iberico, Carlos; Vervoort, Ellen; Baeyens, Frank

    2011-01-01

    Generalization gradients have been investigated widely in animal conditioning experiments, but much less so in human predictive learning tasks. Here, we apply the experimental design of a recent study on conditioned fear generalization in humans (Lissek et al., 2008) to a predictive learning task, and examine the effects of a number of relevant…

  13. Prediction of a Therapeutic Dose for Buagafuran, a Potent Anxiolytic Agent by Physiologically Based Pharmacokinetic/Pharmacodynamic Modeling Starting from Pharmacokinetics in Rats and Human.

    PubMed

    Yang, Fen; Wang, Baolian; Liu, Zhihao; Xia, Xuejun; Wang, Weijun; Yin, Dali; Sheng, Li; Li, Yan

    2017-01-01

    Physiologically based pharmacokinetic (PBPK)/pharmacodynamic (PD) models can contribute to animal-to-human extrapolation and therapeutic dose prediction. Buagafuran is a novel anxiolytic agent, and phase I clinical trials of buagafuran have been completed. In this paper, a potentially effective dose for buagafuran of 30 mg t.i.d. in humans was estimated based on the human brain concentration predicted by PBPK/PD modeling. The software GastroPlus™ was used to build the PBPK/PD model for buagafuran in rats, relating brain tissue concentrations of buagafuran to the number of times animals entered the open arms in the elevated plus-maze pharmacological model. Buagafuran concentrations in human plasma were fitted and brain tissue concentrations were predicted using a human PBPK model in which the predicted plasma profiles were in good agreement with observations. The results provide supportive data for the rational use of buagafuran in the clinic.

  14. Complexity analysis of the Next Gen Air Traffic Management System: trajectory based operations.

    PubMed

    Lyons, Rhonda

    2012-01-01

    According to Federal Aviation Administration traffic predictions, our Air Traffic Management (ATM) system is currently operating at 150 percent of capacity, with traffic forecast to increase to a staggering 250 percent within the next two decades [17]. This will require a major redesign of the system. Today's ATM system is complex. It is designed to safely, economically, and efficiently provide air traffic services through the cost-effective provision of facilities and seamless services in collaboration with multiple agents; however, contrary to that vision, the system is loosely integrated and suffers tremendously from antiquated equipment and saturated airways. The new Next Generation (NextGen) ATM system is designed to transform the current system into an agile, robust, and responsive set of operations that can safely manage the growing needs of a projected increasingly complex and diverse set of air transportation system users and massive projected worldwide traffic rates. This revolutionary technology-centric system is dynamically complex and much more sophisticated than its soon-to-be predecessor. ATM system failures could yield large-scale catastrophic consequences, as it is a safety-critical system. This work attempts to describe complexity and the complex nature of the NextGen ATM system and Trajectory Based Operations (TBO). Complex human factors interactions within NextGen are analyzed using a proposed dual experimental approach designed to identify hazards and gaps and to elicit emergent hazards that would not be visible if studied in isolation. Suggestions are made, along with a proposal for future human factors research in the TBO safety-critical NextGen environment.

  15. Best Practices for the Application of Functional Near Infrared Spectroscopy to Operator State Sensing

    NASA Technical Reports Server (NTRS)

    Harrivel, Angela R.; Hylton, Alan G.; Hearn, Tristan A.

    2012-01-01

    Functional Near Infrared Spectroscopy (fNIRS) is an emerging neuronal measurement technique with many advantages for application in operational and training contexts. Instrumentation and protocol improvements, however, are required to obtain useful signals and to produce headgear that is quickly self-applied, comfortable, and unobtrusive. Approaches for improving the validity and reliability of fNIRS data for the purpose of sensing the mental state of commercial aircraft operators are identified, and an exemplary system design for attentional state monitoring is outlined. Intelligent flight decks of the future can be responsive to state changes to optimally support human performance; thus, the identification of cognitive performance decrement, such as lapses in operator attention, may be used to predict and avoid error-prone states. We propose that attentional performance may be monitored with fNIRS through the quantification of hemodynamic activations in cortical regions which are part of functionally-connected attention and resting state networks. Activations in these regions have been shown to correlate with behavioral performance and task engagement. These regions lie beneath superficial tissue in head regions beyond the forehead. Headgear development is key to reliably and robustly accessing locations beyond the hairline to measure functionally-connected networks across the whole head. Human subject trials using both fNIRS and functional Magnetic Resonance Imaging (fMRI) will be used to test this system. Data processing employs Support Vector Machines for state classification based on the fNIRS signals. If accurate state classification is achieved based on sensed activation patterns, fNIRS will be shown to be useful for monitoring attentional performance.
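
    The classification stage described above maps windowed hemodynamic features to an attentional-state label. The abstract names Support Vector Machines; the sketch below substitutes a minimal nearest-centroid classifier purely to illustrate the feature-vector-to-label pipeline, and every feature value and label is hypothetical:

```python
# Toy state classifier: train per-state centroids from labeled feature
# windows, then assign new windows to the nearest centroid.

def train_centroids(samples):
    """samples: list of (feature_vector, state_label) pairs."""
    sums, counts = {}, {}
    for x, label in samples:
        acc = sums.setdefault(label, [0.0] * len(x))
        for i, v in enumerate(x):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc]
            for label, acc in sums.items()}

def classify(centroids, x):
    """Return the label whose centroid is closest (squared Euclidean)."""
    def dist2(c):
        return sum((a - b) ** 2 for a, b in zip(c, x))
    return min(centroids, key=lambda label: dist2(centroids[label]))

# Hypothetical oxy-/deoxy-hemoglobin features per time window:
train = [([0.8, -0.3], "engaged"), ([0.7, -0.2], "engaged"),
         ([0.1, 0.0], "lapsed"), ([0.2, 0.1], "lapsed")]
model = train_centroids(train)
print(classify(model, [0.75, -0.25]))  # → engaged
```

    A real pipeline would replace the centroid rule with a trained SVM and real fNIRS features, but the surrounding structure (labeled training windows, a fitted model, per-window prediction) is the same.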

  16. Earth orbital teleoperator systems evaluation

    NASA Technical Reports Server (NTRS)

    Shields, N. L., Jr.; Slaughter, P. H.; Brye, R. G.; Henderson, D. E.

    1979-01-01

    The mechanical extension of the human operator to remote and specialized environments poses a series of complex operational questions. A technical and scientific team was organized to investigate these questions through conducting specific laboratory and analytical studies. The intent of the studies was to determine the human operator requirements for remotely manned systems and to determine the particular effects that various system parameters have on human operator performance. In so doing, certain design criteria based on empirically derived data concerning the ultimate control system, the human operator, were added to the Teleoperator Development Program.

  17. Human Centered Autonomous and Assistant Systems Testbed for Exploration Operations

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Mount, Frances; Carreon, Patricia; Torney, Susan E.

    2001-01-01

    The Engineering and Mission Operations Directorates at NASA Johnson Space Center are combining laboratories and expertise to establish the Human Centered Autonomous and Assistant Systems Testbed for Exploration Operations. This is a testbed for human centered design, development and evaluation of intelligent autonomous and assistant systems that will be needed for human exploration and development of space. This project will improve human-centered analysis, design and evaluation methods for developing intelligent software. This software will support human-machine cognitive and collaborative activities in future interplanetary work environments where distributed computer and human agents cooperate. We are developing and evaluating prototype intelligent systems for distributed multi-agent mixed-initiative operations. The primary target domain is control of life support systems in a planetary base. Technical approaches will be evaluated for use during extended manned tests in the target domain, the Bioregenerative Advanced Life Support Systems Test Complex (BIO-Plex). A spinoff target domain is the International Space Station (ISS) Mission Control Center (MCC). Products of this project include human-centered intelligent software technology, innovative human interface designs, and human-centered software development processes, methods and products. The testbed uses adjustable autonomy software and life support systems simulation models from the Adjustable Autonomy Testbed, to represent operations on the remote planet. Ground operations prototypes and concepts will be evaluated in the Exploration Planning and Operations Center (ExPOC) and Jupiter Facility.

  18. Implementing Operational Analytics using Big Data Technologies to Detect and Predict Sensor Anomalies

    NASA Astrophysics Data System (ADS)

    Coughlin, J.; Mital, R.; Nittur, S.; SanNicolas, B.; Wolf, C.; Jusufi, R.

    2016-09-01

    Operational analytics when combined with Big Data technologies and predictive techniques have been shown to be valuable in detecting mission critical sensor anomalies that might be missed by conventional analytical techniques. Our approach helps analysts and leaders make informed and rapid decisions by analyzing large volumes of complex data in near real-time and presenting it in a manner that facilitates decision making. It provides cost savings by being able to alert and predict when sensor degradations pass a critical threshold and impact mission operations. Operational analytics, which uses Big Data tools and technologies, can process very large data sets containing a variety of data types to uncover hidden patterns, unknown correlations, and other relevant information. When combined with predictive techniques, it provides a mechanism to monitor and visualize these data sets and provide insight into degradations encountered in large sensor systems such as the space surveillance network. In this study, data from a notional sensor is simulated and we use big data technologies, predictive algorithms and operational analytics to process the data and predict sensor degradations. This study uses data products that would commonly be analyzed at a site. This study builds on a big data architecture that has previously been proven valuable in detecting anomalies. This paper outlines our methodology of implementing an operational analytic solution through data discovery, learning and training of data modeling and predictive techniques, and deployment. Through this methodology, we implement a functional architecture focused on exploring available big data sets and determine practical analytic, visualization, and predictive technologies.
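
    The "alert and predict when sensor degradations pass a critical threshold" idea above can be sketched very simply: fit a trend to a sensor health metric and extrapolate the crossing time. This is a hedged illustration with notional values, not the study's architecture (which involves Big Data tooling and far richer models):

```python
# Linear-trend extrapolation of a degrading sensor health metric to a
# critical alert threshold.

def linear_fit(ts, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(ts)
    mt, my = sum(ts) / n, sum(ys) / n
    slope = (sum((t - mt) * (y - my) for t, y in zip(ts, ys))
             / sum((t - mt) ** 2 for t in ts))
    return slope, my - slope * mt

def time_to_threshold(ts, ys, threshold):
    """Extrapolated time the downward trend crosses `threshold`, or None."""
    slope, intercept = linear_fit(ts, ys)
    if slope >= 0:
        return None  # metric is not degrading
    return (threshold - intercept) / slope

# Notional health metric declining from 1.0 toward a 0.6 alert threshold:
hours = [0, 1, 2, 3, 4]
health = [1.00, 0.98, 0.96, 0.94, 0.92]
print(round(time_to_threshold(hours, health, 0.6), 2))  # → 20.0
```

    Production systems would use more robust models and streaming data, but the payoff is the same: the alert fires on the predicted crossing time, before the degradation actually impacts operations.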

  19. Designing an operator interface? Consider user's 'psychology'

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toffer, D.E.

    The modern operator interface is a channel of communication between operators and the plant that, ideally, provides them with the information necessary to keep the plant running at maximum efficiency. Advances in automation technology have increased information flow from the field to the screen. New and improved Supervisory Control and Data Acquisition (SCADA) packages provide designers with powerful and open design considerations. All too often, however, systems go to the field designed for the software rather than the operator. Plant operators' jobs have changed fundamentally, from controlling their plants out in the field to doing so from within control rooms. Control room-based operation does not denote idleness: trained operators should be engaged in examination of plant status and cognitive evaluation of plant efficiencies. Designers who are extremely computer literate often do not consider the demographics of field operators. Many field operators have little knowledge of modern computer systems and, as a result, do not take full advantage of the interface's capabilities. Designers often fail to understand the true nature of how operators run their plants. To aid field operators, designers must provide familiar controls and intuitive choices. To achieve success in interface design, it is necessary to understand the ways in which humans think conceptually, and to understand how they process this information physically. The physical and the conceptual are closely related when working with any type of interface. Designers should ask themselves: "What type of information is useful to the field operator?" Let's explore an integration model that contains the following key elements: (1) easily navigated menus; (2) reduced chances for misunderstanding; (3) accurate representations of the plant or operation; (4) consistent and predictable operation; (5) a pleasant and engaging interface that conforms to the operator's expectations.

  20. Affective forecasting in an orangutan: predicting the hedonic outcome of novel juice mixes.

    PubMed

    Sauciuc, Gabriela-Alina; Persson, Tomas; Bååth, Rasmus; Bobrowicz, Katarzyna; Osvath, Mathias

    2016-11-01

    Affective forecasting is an ability that allows the prediction of the hedonic outcome of never-before experienced situations, by mentally recombining elements of prior experiences into possible scenarios, and pre-experiencing what these might feel like. It has been hypothesised that this ability is uniquely human. For example, given prior experience with the ingredients, but in the absence of direct experience with the mixture, only humans are said to be able to predict that lemonade tastes better with sugar than without it. Non-human animals, on the other hand, are claimed to be confined to predicting-exclusively and inflexibly-the outcome of previously experienced situations. Relying on gustatory stimuli, we devised a non-verbal method for assessing affective forecasting and tested comparatively one Sumatran orangutan and ten human participants. Administered as binary choices, the test required the participants to mentally construct novel juice blends from familiar ingredients and to make hedonic predictions concerning the ensuing mixes. The orangutan's performance was within the range of that shown by the humans. Both species made consistent choices that reflected independently measured taste preferences for the stimuli. Statistical models fitted to the data confirmed the predictive accuracy of such a relationship. The orangutan, just like humans, thus seems to have been able to make hedonic predictions concerning never-before experienced events.

  1. How smart is your BEOL? productivity improvement through intelligent automation

    NASA Astrophysics Data System (ADS)

    Schulz, Kristian; Egodage, Kokila; Tabbone, Gilles; Garetto, Anthony

    2017-07-01

    The back end of line (BEOL) workflow in the mask shop still has crucial issues throughout all standard steps, which are inspection, disposition, photomask repair, and verification of repair success. All involved tools are typically run by highly trained operators or engineers who set up jobs and recipes, execute tasks, analyze data, and make decisions based on the results. No matter how experienced operators are and how well the systems perform, there is one aspect that always limits the productivity and effectiveness of the operation: the human aspect. Human errors can range from seemingly harmless slip-ups to mistakes with serious and direct economic impact, including mask rejects, customer returns, and line stops in the wafer fab. Even with the introduction of quality control mechanisms that help to reduce these critical but unavoidable faults, they can never be completely eliminated. Therefore the mask shop BEOL cannot run in the most efficient manner, as unnecessary time and money are spent on processes that remain labor intensive. The best way to address this issue is to automate critical segments of the workflow that are prone to human errors. In fact, manufacturing errors can occur at each BEOL step where operators intervene. These processes comprise image evaluation, setting up tool recipes, data handling, and all the other tedious but required steps. With the help of smart solutions, operators can work more efficiently and dedicate their time to less mundane tasks. Smart solutions connect tools, taking over the data handling and analysis typically performed by operators and engineers. These solutions not only eliminate the human error factor in the manufacturing process but can provide benefits in terms of shorter cycle times, reduced bottlenecks, and prediction of an optimized workflow. In addition, such software solutions consist of building blocks that seamlessly integrate applications and allow customers to use tailored solutions.
To accommodate the variability and complexity in mask shops today, individual workflows can be supported according to the needs of any particular manufacturing line with respect to the necessary measurement and production steps. At the same time, the efficiency of assets is increased by avoiding unneeded cycle time and wasted resources on process steps that are not crucial for a given technology. In this paper we present details of which areas of the BEOL can benefit most from intelligent automation, what solutions exist, and a quantification of the benefits to a mask shop with full automation through the use of a back end of line model.

  2. Pre-operative labs: Wasted dollars or predictors of post-operative cardiac and septic events in orthopaedic trauma patients?

    PubMed

    Lakomkin, Nikita; Sathiyakumar, Vasanth; Dodd, Ashley C; Jahangir, A Alex; Whiting, Paul S; Obremskey, William T; Sethi, Manish K

    2016-06-01

    As US healthcare expenditures continue to rise, there is significant pressure to reduce the cost of inpatient medical services. Studies have estimated that over 70% of routine labs may not yield clinical benefits while adding over $300 in costs per day for every inpatient. Although orthopaedic trauma patients tend to have longer inpatient stays and hip fractures have been associated with significant morbidity, there is a dearth of data examining pre-operative labs in predicting post-operative adverse events in these populations. The purpose of this study was to assess whether pre-operative labs significantly predict post-operative cardiac and septic complications in orthopaedic trauma and hip fracture patients. Between 2006 and 2013, 56,336 (15.6%) orthopaedic trauma patients were identified and 27,441 patients (7.6%) were diagnosed with hip fractures. Pre-operative labs included sodium, BUN, creatinine, albumin, bilirubin, SGOT, alkaline phosphatase, white count, hematocrit, platelet count, prothrombin time, INR, and partial thromboplastin time. For each of these labs, patients were deemed to have normal or abnormal values. Patients were noted to have developed cardiac or septic complications if they sustained (1) myocardial infarction (MI), (2) cardiac arrest, or (3) septic shock within 30 days after surgery. Separate regressions incorporating over 40 patient characteristics including age, gender, pre-operative comorbidities, and labs were performed for orthopaedic trauma patients in order to determine whether pre-operative labs predicted adverse cardiac or septic outcomes. 749 (1.3%) orthopaedic trauma patients developed cardiac complications and 311 (0.6%) developed septic shock. 
Multivariate regression demonstrated that abnormal pre-operative platelet values were significantly predictive of post-operative cardiac arrest (OR: 11.107, p=0.036), and abnormal bilirubin levels were predictive (OR: 8.487, p=0.008) of the development of septic shock in trauma patients. In the hip fracture cohort, abnormal partial thromboplastin time was significantly associated with post-operative myocardial infarction (OR: 15.083, p=0.046), and abnormal bilirubin (OR: 58.674, p=0.002) significantly predicted the onset of septic shock. This is the first study to demonstrate the utility of pre-operative labs in predicting perioperative cardiac and septic adverse events in orthopaedic trauma and hip fracture patients. Particular attention should be paid to haematologic/coagulation labs (platelets, PTT) and bilirubin values. Prognostic Level II.
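
    Odds ratios like those reported above come from logistic-regression coefficients (OR = e^β). For a single binary predictor the same quantity can be read off a 2x2 table, which makes the reported magnitudes concrete; the counts below are invented for illustration and are not the study's data:

```python
# Odds ratio from a 2x2 table: (events/non-events among exposed)
# divided by (events/non-events among unexposed).
import math

def odds_ratio(exposed_events, exposed_total, unexposed_events, unexposed_total):
    a, b = exposed_events, exposed_total - exposed_events      # exposed row
    c, d = unexposed_events, unexposed_total - unexposed_events  # unexposed row
    return (a / b) / (c / d)

# Hypothetical: cardiac arrest among patients with abnormal vs normal
# pre-operative platelet counts.
print(round(odds_ratio(20, 1000, 2, 1000), 2))  # → 10.18
```

    In the study the ORs are adjusted for over 40 covariates via multivariate regression, so they are not raw 2x2 ratios, but the interpretation of their scale is the same.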

  3. Using species distribution models to optimize vector control in the framework of the tsetse eradication campaign in Senegal

    PubMed Central

    Dicko, Ahmadou H.; Lancelot, Renaud; Seck, Momar T.; Guerrini, Laure; Sall, Baba; Lo, Mbargou; Vreysen, Marc J. B.; Lefrançois, Thierry; Fonta, William M.; Peck, Steven L.; Bouyer, Jérémy

    2014-01-01

    Tsetse flies are vectors of human and animal trypanosomoses in sub-Saharan Africa and are the target of the Pan African Tsetse and Trypanosomiasis Eradication Campaign (PATTEC). Glossina palpalis gambiensis (Diptera: Glossinidae) is a riverine species that is still present as an isolated metapopulation in the Niayes area of Senegal. It is targeted by a national eradication campaign combining a population reduction phase based on insecticide-treated targets (ITTs) and cattle and an eradication phase based on the sterile insect technique. In this study, we used species distribution models to optimize control operations. We compared the probability of the presence of G. p. gambiensis and habitat suitability using a regularized logistic regression and Maxent, respectively. Both models performed well, with an area under the curve of 0.89 and 0.92, respectively. Only the Maxent model predicted an expert-based classification of landscapes correctly. Maxent predictions were therefore used throughout the eradication campaign in the Niayes to make control operations more efficient in terms of deployment of ITTs, release density of sterile males, and location of monitoring traps used to assess program progress. We discuss how the models’ results informed about the particular ecology of tsetse in the target area. Maxent predictions allowed optimizing efficiency and cost within our project, and might be useful for other tsetse control campaigns in the framework of the PATTEC and, more generally, other vector or insect pest control programs. PMID:24982143
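
    Both models above are compared by area under the ROC curve (AUC), which equals the probability that a randomly chosen presence site receives a higher suitability score than a randomly chosen absence site. A compact rank-based sketch of that statistic (the site scores below are invented, not the study's data):

```python
# Pairwise (Mann-Whitney style) computation of AUC from model scores at
# known presence and absence sites.

def auc(presence_scores, absence_scores):
    wins = 0.0
    for p in presence_scores:
        for a in absence_scores:
            if p > a:
                wins += 1       # presence ranked above absence
            elif p == a:
                wins += 0.5     # ties count half
    return wins / (len(presence_scores) * len(absence_scores))

# Hypothetical habitat-suitability scores at tsetse presence/absence sites:
presence = [0.9, 0.8, 0.75, 0.6, 0.55]
absence = [0.4, 0.3, 0.5, 0.2, 0.65]
print(round(auc(presence, absence), 2))  # → 0.92
```

    An AUC of 0.5 means the model ranks sites no better than chance, so values of 0.89 and 0.92 as reported indicate strong discrimination for both the logistic regression and Maxent models.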

  4. Using species distribution models to optimize vector control in the framework of the tsetse eradication campaign in Senegal.

    PubMed

    Dicko, Ahmadou H; Lancelot, Renaud; Seck, Momar T; Guerrini, Laure; Sall, Baba; Lo, Mbargou; Vreysen, Marc J B; Lefrançois, Thierry; Fonta, William M; Peck, Steven L; Bouyer, Jérémy

    2014-07-15

    Tsetse flies are vectors of human and animal trypanosomoses in sub-Saharan Africa and are the target of the Pan African Tsetse and Trypanosomiasis Eradication Campaign (PATTEC). Glossina palpalis gambiensis (Diptera: Glossinidae) is a riverine species that is still present as an isolated metapopulation in the Niayes area of Senegal. It is targeted by a national eradication campaign combining a population reduction phase based on insecticide-treated targets (ITTs) and cattle and an eradication phase based on the sterile insect technique. In this study, we used species distribution models to optimize control operations. We compared the probability of the presence of G. p. gambiensis and habitat suitability using a regularized logistic regression and Maxent, respectively. Both models performed well, with an area under the curve of 0.89 and 0.92, respectively. Only the Maxent model predicted an expert-based classification of landscapes correctly. Maxent predictions were therefore used throughout the eradication campaign in the Niayes to make control operations more efficient in terms of deployment of ITTs, release density of sterile males, and location of monitoring traps used to assess program progress. We discuss how the models' results informed about the particular ecology of tsetse in the target area. Maxent predictions allowed optimizing efficiency and cost within our project, and might be useful for other tsetse control campaigns in the framework of the PATTEC and, more generally, other vector or insect pest control programs.
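Both models above are scored by the area under the ROC curve, which equals the probability that a randomly chosen presence site receives a higher suitability score than a randomly chosen absence site (the Mann-Whitney formulation). A minimal sketch with hypothetical suitability scores, not the study's data:

```python
def auc(scores_pos, scores_neg):
    # Mann-Whitney formulation of the ROC AUC: fraction of (presence, absence)
    # pairs ranked correctly, with ties counted as 0.5.
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Toy suitability scores at surveyed sites (hypothetical, for illustration only).
present = [0.9, 0.8, 0.75, 0.6]
absent = [0.7, 0.4, 0.3, 0.2]
print(auc(present, absent))  # 0.9375
```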
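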

  5. BIG DATA ANALYTICS AND PRECISION ANIMAL AGRICULTURE SYMPOSIUM: Data to decisions.

    PubMed

    White, B J; Amrine, D E; Larson, R L

    2018-04-14

    Big data are frequently used in many facets of business and agronomy to enhance knowledge needed to improve operational decisions. Livestock operations collect data of sufficient quantity to perform predictive analytics. Predictive analytics can be defined as a methodology and suite of data evaluation techniques to generate a prediction for specific target outcomes. The objective of this manuscript is to describe the process of using big data and the predictive analytic framework to create tools to drive decisions in livestock production, health, and welfare. The predictive analytic process involves selecting a target variable, managing the data, partitioning the data, then creating algorithms, refining algorithms, and finally comparing accuracy of the created classifiers. The partitioning of the datasets allows model building and refining to occur prior to testing the predictive accuracy of the model with naive data to evaluate overall accuracy. Many different classification algorithms are available for predictive use and testing multiple algorithms can lead to optimal results. Application of a systematic process for predictive analytics using data that is currently collected or that could be collected on livestock operations will facilitate precision animal management through enhanced livestock operational decisions.
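The partition-train-refine-compare workflow described above can be sketched in a few lines. Everything below is hypothetical, the dataset, the candidate classifiers, and the split fractions alike; it only illustrates holding out naive data to estimate overall accuracy:

```python
import random

def partition(rows, train=0.6, valid=0.2, seed=42):
    # Split records into training, validation (refinement), and naive test sets.
    rows = rows[:]
    random.Random(seed).shuffle(rows)
    n = len(rows)
    a, b = int(n * train), int(n * (train + valid))
    return rows[:a], rows[a:b], rows[b:]

def accuracy(model, rows):
    return sum(model(x) == y for x, y in rows) / len(rows)

# Hypothetical records: (rectal temperature, sick?) for feeder cattle.
temps = [38.2, 39.9, 40.1, 38.8, 39.0, 40.4, 38.5, 39.7, 40.0, 38.9,
         39.6, 38.4, 40.2, 39.8, 38.6, 39.1, 40.3, 38.3, 39.4, 40.5]
data = [(t, t > 39.5) for t in temps]
train, valid, test = partition(data)

# Two candidate classifiers; the validation set is used to pick between them,
# and only the untouched test set estimates real-world accuracy.
candidates = [lambda t: t > 39.5, lambda t: t > 40.0]
best = max(candidates, key=lambda m: accuracy(m, valid))
print(accuracy(best, test))
```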

  6. Sensors and systems for space applications: a methodology for developing fault detection, diagnosis, and recovery

    NASA Astrophysics Data System (ADS)

    Edwards, John L.; Beekman, Randy M.; Buchanan, David B.; Farner, Scott; Gershzohn, Gary R.; Khuzadi, Mbuyi; Mikula, D. F.; Nissen, Gerry; Peck, James; Taylor, Shaun

    2007-04-01

    Human space travel is inherently dangerous. Hazardous conditions will exist. Real time health monitoring of critical subsystems is essential for providing a safe abort timeline in the event of a catastrophic subsystem failure. In this paper, we discuss a practical and cost-effective process for developing critical subsystem failure detection, diagnosis and response (FDDR). We also present the results of a real time health monitoring simulation of a propellant ullage pressurization subsystem failure. The health monitoring development process identifies hazards, isolates hazard causes, defines software partitioning requirements and quantifies software algorithm development. The process provides a means to establish the number and placement of sensors necessary to provide real time health monitoring. We discuss how health monitoring software tracks subsystem control commands, interprets off-nominal operational sensor data, predicts failure propagation timelines, corroborates failure predictions, and formats failure protocols.

  7. Noninvasive prediction of shunt operation outcome in idiopathic normal pressure hydrocephalus

    PubMed Central

    Aoki, Yasunori; Kazui, Hiroaki; Tanaka, Toshihisa; Ishii, Ryouhei; Wada, Tamiki; Ikeda, Shunichiro; Hata, Masahiro; Canuet, Leonides; Katsimichas, Themistoklis; Musha, Toshimitsu; Matsuzaki, Haruyasu; Imajo, Kaoru; Kanemoto, Hideki; Yoshida, Tetsuhiko; Nomura, Keiko; Yoshiyama, Kenji; Iwase, Masao; Takeda, Masatoshi

    2015-01-01

    Idiopathic normal pressure hydrocephalus (iNPH) is a syndrome characterized by gait disturbance, cognitive deterioration and urinary incontinence in elderly individuals. These symptoms can be improved by shunt operation in some but not all patients. Therefore, discovering predictive factors for the surgical outcome is of great clinical importance. We used normalized power variance (NPV) of electroencephalography (EEG) waves, a sensitive measure of the instability of cortical electrical activity, and found significantly higher NPV in beta frequency band at the right fronto-temporo-occipital electrodes (Fp2, T4 and O2) in shunt responders compared to non-responders. By utilizing these differences, we were able to correctly identify responders and non-responders to shunt operation with a positive predictive value of 80% and a negative predictive value of 88%. Our findings indicate that NPV can be useful in noninvasively predicting the clinical outcome of shunt operation in patients with iNPH. PMID:25585705
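Normalized power variance is, in essence, the variance of a frequency band's time-varying power divided by the squared mean power, a dimensionless index of instability. The sketch below assumes the beta-band power series has already been extracted (the filtering and windowing pipeline is not specified here, so this is an illustration of the ratio only):

```python
def npv(power):
    # Normalized power variance: variance of the time-varying band power
    # divided by the squared mean power. Higher values indicate less stable
    # cortical electrical activity.
    m = sum(power) / len(power)
    var = sum((p - m) ** 2 for p in power) / len(power)
    return var / m ** 2

# Hypothetical beta-band power series from two electrodes (arbitrary units):
stable = [4.0, 4.1, 3.9, 4.0, 4.2, 3.8]
unstable = [1.0, 7.0, 2.0, 8.0, 1.5, 6.5]
print(round(npv(stable), 4), round(npv(unstable), 4))
```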

  8. Predictive Mechanisms Are Not Involved the Same Way during Human-Human vs. Human-Machine Interactions: A Review

    PubMed Central

    Sahaï, Aïsha; Pacherie, Elisabeth; Grynszpan, Ouriel; Berberian, Bruno

    2017-01-01

    Nowadays, interactions with others involve not only human peers but also automated systems. Many studies suggest that the motor predictive systems that are engaged during action execution are also involved during joint actions with peers and during other human generated action observation. Indeed, the comparator model hypothesis suggests that the comparison between a predicted state and an estimated real state enables motor control, and by a similar functioning, understanding and anticipating observed actions. Such a mechanism allows making predictions about an ongoing action, and is essential to action regulation, especially during joint actions with peers. Interestingly, the same comparison process has been shown to be involved in the construction of an individual's sense of agency, both for self-generated and observed other human generated actions. However, the implication of such predictive mechanisms during interactions with machines is not consensual, probably due to the high heterogeneity of the automata used in the experiments, from very simplistic devices to full humanoid robots. The discrepancies that are observed during human/machine interactions could arise from the absence of action/observation matching abilities when interacting with traditional low-level automata. Consistently, the difficulty of building joint agency with this kind of machine could stem from the same problem. In this context, we aim to review the studies investigating predictive mechanisms during social interactions with humans and with automated artificial systems. We will start by presenting human data that show the involvement of predictions in action control and in the sense of agency during social interactions. Thereafter, we will confront this literature with data from the robotic field. Finally, we will address the upcoming issues in the field of robotics related to automated systems aimed at acting as collaborative agents. PMID:29081744
    Nowadays, interactions with others involve not only human peers but also automated systems. Many studies suggest that the motor predictive systems that are engaged during action execution are also involved during joint actions with peers and during other human generated action observation. Indeed, the comparator model hypothesis suggests that the comparison between a predicted state and an estimated real state enables motor control, and by a similar functioning, understanding and anticipating observed actions. Such a mechanism allows making predictions about an ongoing action, and is essential to action regulation, especially during joint actions with peers. Interestingly, the same comparison process has been shown to be involved in the construction of an individual's sense of agency, both for self-generated and observed other human generated actions. However, the implication of such predictive mechanisms during interactions with machines is not consensual, probably due to the high heterogeneity of the automata used in the experiments, from very simplistic devices to full humanoid robots. The discrepancies that are observed during human/machine interactions could arise from the absence of action/observation matching abilities when interacting with traditional low-level automata. Consistently, the difficulty of building joint agency with this kind of machine could stem from the same problem. In this context, we aim to review the studies investigating predictive mechanisms during social interactions with humans and with automated artificial systems. We will start by presenting human data that show the involvement of predictions in action control and in the sense of agency during social interactions. Thereafter, we will confront this literature with data from the robotic field. Finally, we will address the upcoming issues in the field of robotics related to automated systems aimed at acting as collaborative agents. PMID:29081744


  9. Efficient prediction of human protein-protein interactions at a global scale.

    PubMed

    Schoenrock, Andrew; Samanfar, Bahram; Pitre, Sylvain; Hooshyar, Mohsen; Jin, Ke; Phillips, Charles A; Wang, Hui; Phanse, Sadhna; Omidi, Katayoun; Gui, Yuan; Alamgir, Md; Wong, Alex; Barrenäs, Fredrik; Babu, Mohan; Benson, Mikael; Langston, Michael A; Green, James R; Dehne, Frank; Golshani, Ashkan

    2014-12-10

    Our knowledge of global protein-protein interaction (PPI) networks in complex organisms such as humans is hindered by technical limitations of current methods. On the basis of short co-occurring polypeptide regions, we developed a tool called MP-PIPE capable of predicting a global human PPI network within 3 months. With a recall of 23% at a precision of 82.1%, we predicted 172,132 putative PPIs. We demonstrate the usefulness of these predictions through a range of experiments. The speed and accuracy associated with MP-PIPE can make this a potential tool to study individual human PPI networks (from genomic sequences alone) for personalized medicine.
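The reported operating point (23% recall at 82.1% precision) combines two standard ratios over the confusion matrix. The counts below are hypothetical, chosen only to reproduce those ratios, not MP-PIPE's actual results:

```python
def precision_recall(tp, fp, fn):
    # Precision: fraction of predicted interactions that are real.
    # Recall: fraction of real interactions that are recovered.
    return tp / (tp + fp), tp / (tp + fn)

# Hypothetical counts illustrating ~82.1% precision at 23% recall:
p, r = precision_recall(tp=230, fp=50, fn=770)
print(round(p, 3), round(r, 2))  # 0.821 0.23
```

The trade-off is typical of genome-scale prediction: a high-precision, low-recall operating point yields a conservative but trustworthy set of putative PPIs.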

  10. An improved version of the consequence analysis model for chemical emergencies, ESCAPE

    NASA Astrophysics Data System (ADS)

    Kukkonen, J.; Nikmo, J.; Riikonen, K.

    2017-02-01

    We present a refined version of a mathematical model called ESCAPE, "Expert System for Consequence Analysis and Preparing for Emergencies". The model has been designed for evaluating the releases of toxic and flammable gases into the atmosphere, their atmospheric dispersion and the effects on humans and the environment. We describe (i) the mathematical treatments of this model, (ii) a verification and evaluation of the model against selected experimental field data, and (iii) a new operational implementation of the model. The new mathematical treatments include state-of-the-art atmospheric vertical profiles and new submodels for dense gas and passive atmospheric dispersion. The model performance was first successfully verified using the data of the Thorney Island campaign, and then evaluated against the Desert Tortoise campaign. For the latter campaign, the geometric mean bias was 1.72 (this corresponds to an underprediction of approximately 70%) and 0.71 (overprediction of approximately 30%) for the concentration and the plume half-width, respectively. The geometric variance was <1.5 (this corresponds to an agreement that is better than a factor of two). These values can be considered to indicate a good agreement of predictions and data, in comparison to values evaluated for a range of other similar models. The model has also been adapted to be able to automatically use the real time predictions and forecasts of the numerical weather prediction model HIRLAM, "HIgh Resolution Limited Area Model". The operational implementation of the ESCAPE modelling system can be accessed anywhere using internet browsers, on laptop computers, tablets and mobile phones. The predicted results can be post-processed using geographic information systems. The model has already proved to be a useful assessment tool for emergency response authorities in contingency planning.
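The geometric mean bias and geometric variance quoted above are standard dispersion-model evaluation statistics, computed from the log-ratios of observed to predicted concentrations. A sketch with hypothetical observed/predicted pairs (not the campaign data):

```python
import math

def geometric_stats(observed, predicted):
    # MG = exp(mean ln(Co/Cp)): MG > 1 means the model underpredicts on average.
    # VG = exp(mean (ln(Co/Cp))^2): small VG (near 1) means predictions and
    # observations agree closely; roughly, VG below ~1.6 corresponds to
    # agreement within a factor of two.
    logs = [math.log(o / p) for o, p in zip(observed, predicted)]
    mg = math.exp(sum(logs) / len(logs))
    vg = math.exp(sum(l * l for l in logs) / len(logs))
    return mg, vg

# Hypothetical observed vs predicted concentration pairs (arbitrary units):
observed = [2.0, 3.0, 1.5, 2.5]
predicted = [1.0, 2.0, 1.0, 2.0]
mg, vg = geometric_stats(observed, predicted)
print(round(mg, 2), round(vg, 2))
```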

  11. Operational Prototype Development of a Global Aircraft Radiation Exposure Nowcast

    NASA Astrophysics Data System (ADS)

    Mertens, Christopher; Kress, Brian; Wiltberger, Michael; Tobiska, W. Kent; Bouwer, Dave

    Galactic cosmic rays (GCR) and solar energetic particles (SEP) are the primary sources of human exposure to high linear energy transfer (LET) radiation in the atmosphere. High-LET radiation is effective at directly breaking DNA strands in biological tissue, or producing chemically active radicals in tissue that alter the cell function, both of which can lead to cancer or other adverse health effects. A prototype operational nowcast model of air-crew radiation exposure is currently under development and funded by NASA. The model predicts air-crew radiation exposure levels from both GCR and SEP that may accompany solar storms. The new air-crew radiation exposure model is called the Nowcast of Atmospheric Ionizing Radiation for Aviation Safety (NAIRAS) model. NAIRAS will provide global, data-driven, real-time exposure predictions of biologically harmful radiation at aviation altitudes. Observations are utilized from the ground (neutron monitors), from the atmosphere (the NCEP Global Forecast System), and from space (NASA/ACE and NOAA/GOES). Atmospheric observations characterize the overhead mass shielding, and the ground- and space-based observations provide boundary conditions on the incident GCR and SEP particle flux distributions for transport and dosimetry calculations. Radiation exposure rates are calculated using the NASA physics-based HZETRN (High Charge (Z) and Energy TRaNsport) code. An overview of the NAIRAS model is given: the concept, design, prototype implementation status, data access, and example results. Issues encountered thus far, as well as known and/or anticipated hurdles in the research-to-operations transition, are also discussed.

  12. Acute Hydrocortisone Treatment Increases Anxiety but Not Fear in Healthy Volunteers: A Fear-Potentiated Startle Study

    PubMed Central

    Grillon, Christian; Heller, Randi; Hirschhorn, Elizabeth; Kling, Mitchel A.; Pine, Daniel S.; Schulkin, Jay; Vythilingam, Meena

    2011-01-01

    Background The debilitating effects of chronic glucocorticoid excess are well-known, but comparatively little is understood about the role of acute cortisol. Indirect evidence in rodents suggests that acute cortisone could selectively increase some forms of long-duration aversive states, such as “anxiety,” but not relatively similar, briefer aversive states, such as “fear.” However, no prior experimental studies in humans consider the unique effects of cortisol on anxiety and fear, using well-validated methods for eliciting these two similar but dissociable aversive states. The current study examines these effects, as instantiated with short- and long-duration threats. Methods Healthy volunteers (n = 18) received placebo or a low (20 mg) or a high (60 mg) dose of hydrocortisone in a double-blind crossover design. Subjects were exposed repeatedly to three 150-sec duration conditions: no shock; predictable shocks, in which shocks were signaled by a short-duration threat cue; and unpredictable shocks. Aversive states were indexed by acoustic startle. Fear was operationally defined as the increase in startle reactivity during the threat cue in the predictable condition (fear-potentiated startle). Anxiety was operationally defined as the increase in baseline startle from the no shock to the two threat conditions (anxiety-potentiated startle). Results Hydrocortisone affected neither baseline nor short-duration, fear-potentiated startle but increased long-duration anxiety-potentiated startle. Conclusions These results suggest that hydrocortisone administration in humans selectively increases anxiety but not fear. Possible mechanisms implicated are discussed in light of prior data in rodents. Specifically, hydrocortisone might increase anxiety via sensitization of corticotrophin-releasing hormones in the bed nucleus of the stria terminalis. PMID:21277566

  13. Quantitative and Systems Pharmacology. 1. In Silico Prediction of Drug-Target Interactions of Natural Products Enables New Targeted Cancer Therapy.

    PubMed

    Fang, Jiansong; Wu, Zengrui; Cai, Chuipu; Wang, Qi; Tang, Yun; Cheng, Feixiong

    2017-11-27

    Natural products with diverse chemical scaffolds have been recognized as an invaluable source of compounds in drug discovery and development. However, systematic identification of drug targets for natural products at the human proteome level via various experimental assays is highly expensive and time-consuming. In this study, we proposed a systems pharmacology infrastructure to predict new drug targets and anticancer indications of natural products. Specifically, we reconstructed a global drug-target network with 7,314 interactions connecting 751 targets and 2,388 natural products and built predictive network models via a balanced substructure-drug-target network-based inference approach. A high area under receiver operating characteristic curve of 0.96 was yielded for predicting new targets of natural products during cross-validation. The newly predicted targets of natural products (e.g., resveratrol, genistein, and kaempferol) with high scores were validated by various literature studies. We further built the statistical network models for identification of new anticancer indications of natural products through integration of both experimentally validated and computationally predicted drug-target interactions of natural products with known cancer proteins. We showed that the significantly predicted anticancer indications of multiple natural products (e.g., naringenin, disulfiram, and metformin) with new mechanism-of-action were validated by various published experimental evidence. In summary, this study offers powerful computational systems pharmacology approaches and tools for the development of novel targeted cancer therapies by exploiting the polypharmacology of natural products.

  14. SPRINT: ultrafast protein-protein interaction prediction of the entire human interactome.

    PubMed

    Li, Yiwei; Ilie, Lucian

    2017-11-15

    Proteins perform their functions usually by interacting with other proteins. Predicting which proteins interact is a fundamental problem. Experimental methods are slow, expensive, and have a high rate of error. Many computational methods have been proposed among which sequence-based ones are very promising. However, so far no such method is able to predict effectively the entire human interactome: they require too much time or memory. We present SPRINT (Scoring PRotein INTeractions), a new sequence-based algorithm and tool for predicting protein-protein interactions. We comprehensively compare SPRINT with state-of-the-art programs on seven most reliable human PPI datasets and show that it is more accurate while running orders of magnitude faster and using very little memory. SPRINT is the only sequence-based program that can effectively predict the entire human interactome: it requires between 15 and 100 min, depending on the dataset. Our goal is to transform the very challenging problem of predicting the entire human interactome into a routine task. The source code of SPRINT is freely available from https://github.com/lucian-ilie/SPRINT/ and the datasets and predicted PPIs from www.csd.uwo.ca/faculty/ilie/SPRINT/ .

  15. A correlational approach to predicting operator status

    NASA Technical Reports Server (NTRS)

    Shingledecker, Clark A.

    1988-01-01

    This paper discusses a research approach for identifying and validating candidate physiological and behavioral parameters which can be used to predict the performance capabilities of aircrew and other system operators. In this methodology, concurrent and advance correlations are computed between predictor values and criterion performance measures. Continuous performance and sleep loss are used as stressors to promote performance variation. Preliminary data are presented which suggest dependence of prediction capability on the resource allocation policy of the operator.
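Concurrent and advance correlations of this kind reduce to a correlation coefficient between a predictor series and a criterion performance measure. A sketch using the Pearson coefficient; the predictor, criterion, and values below are invented for illustration, not the paper's data:

```python
def pearson_r(x, y):
    # Pearson correlation between a candidate predictor (e.g., a physiological
    # index) and a criterion performance measure across observation epochs.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Hypothetical data: eye-blink rate during sleep loss vs tracking error one
# epoch later (an "advance" correlation; an r near +/-1 would mark the blink
# rate as a useful predictor of upcoming performance).
blink = [10, 12, 15, 18, 22, 25]
error = [1.1, 1.3, 1.4, 1.9, 2.2, 2.6]
print(round(pearson_r(blink, error), 2))
```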

  16. Prediction of Human Phenotype Ontology terms by means of hierarchical ensemble methods.

    PubMed

    Notaro, Marco; Schubach, Max; Robinson, Peter N; Valentini, Giorgio

    2017-10-12

    The prediction of human gene-abnormal phenotype associations is a fundamental step toward the discovery of novel genes associated with human disorders, especially when no genes are known to be associated with a specific disease. In this context the Human Phenotype Ontology (HPO) provides a standard categorization of the abnormalities associated with human diseases. While the problem of the prediction of gene-disease associations has been widely investigated, the related problem of gene-phenotypic feature (i.e., HPO term) associations has been largely overlooked, even if for most human genes no HPO term associations are known and despite the increasing application of the HPO to relevant medical problems. Moreover, most of the methods proposed in the literature are not able to capture the hierarchical relationships between HPO terms, thus resulting in inconsistent and relatively inaccurate predictions. We present two hierarchical ensemble methods that we formally prove to provide biologically consistent predictions according to the hierarchical structure of the HPO. The modular structure of the proposed methods, which consists of a "flat" learning first step and a hierarchical combination of the predictions in the second step, allows the predictions of virtually any flat learning method to be enhanced. The experimental results show that hierarchical ensemble methods are able to predict novel associations between genes and abnormal phenotypes with results that are competitive with state-of-the-art algorithms and with a significant reduction of the computational complexity. Hierarchical ensembles are efficient computational methods that guarantee biologically meaningful predictions that obey the true path rule, and can be used as a tool to improve and make consistent the HPO terms predictions starting from virtually any flat learning method. The implementation of the proposed methods is available as an R package from the CRAN repository.
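The true path rule requires that a gene predicted for an HPO term also be predicted, at least as strongly, for every ancestor of that term. One generic way to enforce this on flat scores is a top-down capping pass; the sketch below illustrates that consistency fix only (it is not the paper's specific ensemble, and the ontology and scores are toy values):

```python
def true_path_correct(scores, parents):
    # Cap each term's score by its parents' corrected scores, so no term is
    # predicted more strongly than the more general terms it implies.
    # `parents` maps each term to its parent terms; roots map to an empty list.
    corrected = {}

    def walk(term):
        if term in corrected:
            return corrected[term]
        cap = min((walk(p) for p in parents.get(term, [])), default=1.0)
        corrected[term] = min(scores[term], cap)
        return corrected[term]

    for t in scores:
        walk(t)
    return corrected

# Toy ontology: "gait" is-a "movement" is-a "root" (hypothetical HPO fragment).
parents = {"gait": ["movement"], "movement": ["root"], "root": []}
flat = {"root": 0.9, "movement": 0.4, "gait": 0.7}  # inconsistent flat scores
out = true_path_correct(flat, parents)
print(out)  # "gait" is capped at its parent's 0.4
```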

  17. Development of a neural net paradigm that predicts simulator sickness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allgood, G.O.

    1993-03-01

    A disease exists that affects pilots and aircrew members who use Navy Operational Flight Training Systems. This malady, commonly referred to as simulator sickness and whose symptomatology closely aligns with that of motion sickness, can compromise the use of these systems because of a reduced utilization factor, negative transfer of training, and reduction in combat readiness. A report is submitted that develops an artificial neural network (ANN) and behavioral model that predicts the onset and level of simulator sickness in the pilots and aircrews who use these systems. It is proposed that the paradigm could be implemented in real time as a biofeedback monitor to reduce the risk to users of these systems. The model captures the neurophysiological impact of use (human-machine interaction) by developing a structure that maps the associative and nonassociative behavioral patterns (learned expectations) and vestibular (otolith and semicircular canals of the inner ear) and tactile interaction, derived from system acceleration profiles, onto an abstract space that predicts simulator sickness for a given training flight.

  18. Analysis of Ribosome Stalling and Translation Elongation Dynamics by Deep Learning.

    PubMed

    Zhang, Sai; Hu, Hailin; Zhou, Jingtian; He, Xuan; Jiang, Tao; Zeng, Jianyang

    2017-09-27

    Ribosome stalling is manifested by the local accumulation of ribosomes at specific codon positions of mRNAs. Here, we present ROSE, a deep learning framework to analyze high-throughput ribosome profiling data and estimate the probability of a ribosome stalling event occurring at each genomic location. Extensive validation tests on independent data demonstrated that ROSE possessed higher prediction accuracy than conventional prediction models, with an increase in the area under the receiver operating characteristic curve by up to 18.4%. In addition, genome-wide statistical analyses showed that ROSE predictions can be well correlated with diverse putative regulatory factors of ribosome stalling. Moreover, the genome-wide ribosome stalling landscapes of both human and yeast computed by ROSE recovered the functional interplays between ribosome stalling and cotranslational events in protein biogenesis, including protein targeting by the signal recognition particles and protein secondary structure formation. Overall, our study provides a novel method to complement the ribosome profiling techniques and further decipher the complex regulatory mechanisms underlying translation elongation dynamics encoded in the mRNA sequence. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahrens, J.S.

    For over fifteen years Sandia National Laboratories has been involved in laboratory testing of biometric identification devices. The key concept of biometric identification devices is the ability for the system to identify some unique aspect of the individual rather than some object a person may be carrying or some password they are required to know. Tests were conducted to verify manufacturers' performance claims, to determine strengths/weaknesses of devices, and to determine devices that meet the US Department of Energy's needs. However, during recent field installation, significantly different performance was observed than was predicted by laboratory tests. Although most people using the device believed it operated adequately, the performance observed was over an order of magnitude worse than predicted. The search for reasons behind this gap between the predicted and the actual performance has revealed many possible contributing factors. As engineers, the most valuable lesson to be learned from this experience is the value of scientists and engineers with (1) common sense, (2) knowledge of human behavior, (3) the ability to observe the real world, and (4) the capability to realize the significant differences between controlled experiments and actual installations.

  20. 1H MR spectroscopy in cervical carcinoma using external phased array body coil at 3.0 Tesla: Prediction of poor prognostic human papillomavirus genotypes.

    PubMed

    Lin, Gigin; Lai, Chyong-Huey; Tsai, Shang-Yueh; Lin, Yu-Chun; Huang, Yu-Ting; Wu, Ren-Chin; Yang, Lan-Yan; Lu, Hsin-Ying; Chao, Angel; Wang, Chiun-Chieh; Ng, Koon-Kwan; Ng, Shu-Hang; Chou, Hung-Hsueh; Yen, Tzu-Chen; Hung, Ji-Hong

    2017-03-01

    To assess the clinical value of proton (1H) MR spectroscopy in cervical carcinomas for the prediction of poor prognostic human papillomavirus (HPV) genotypes, as well as persistent disease following concurrent chemoradiotherapy (CCRT). 1H MR spectroscopy using an external phased array coil was performed in 52 consecutive cervical cancer patients at 3 Tesla (T). Poor prognostic HPV genotypes (alpha-7 species or absence of HPV infection) and persistent cervical carcinoma after CCRT were recorded. Statistical significance was calculated with the Mann-Whitney two-sided nonparametric test and areas under the receiver operating characteristics curve (AUC) analysis. A 4.3-fold (P = 0.032) increased level of methyl resonance at 0.9 ppm was found in the poor prognostic HPV genotypes, mainly attributed to the presence of HPV18, with a sensitivity of 75%, a specificity of 81%, and an AUC of 0.76. Poor prognostic HPV genotypes were more frequently observed in patients with adeno-/adenosquamous carcinoma (Chi-square, P < 0.0001). In predicting the four patients with persistent disease after CCRT, elevated methyl resonance demonstrated a sensitivity of 100%, a specificity of 74%, and an AUC of 0.82. 1H MR spectroscopy at 3T can be used to depict the elevated lipid resonance levels in cervical carcinomas, as well as help to predict the poor prognostic HPV genotypes and persistent disease following CCRT. Further large studies with longer follow-up times are warranted to validate our initial findings. Level of Evidence: 1. J. Magn. Reson. Imaging 2017;45:899-907. © 2016 International Society for Magnetic Resonance in Medicine.

  1. Prognostic and Pathogenetic Value of Combining Clinical and Biochemical Indices in Patients With Acute Lung Injury

    PubMed Central

    Koyama, Tatsuki; Billheimer, D. Dean; Wu, William; Bernard, Gordon R.; Thompson, B. Taylor; Brower, Roy G.; Standiford, Theodore J.; Martin, Thomas R.; Matthay, Michael A.

    2010-01-01

    Background: No single clinical or biologic marker reliably predicts clinical outcomes in acute lung injury (ALI)/ARDS. We hypothesized that a combination of biologic and clinical markers would be superior to either biomarkers or clinical factors alone in predicting ALI/ARDS mortality and would provide insight into the pathogenesis of clinical ALI/ARDS. Methods: Eight biologic markers that reflect endothelial and epithelial injury, inflammation, and coagulation (von Willebrand factor antigen, surfactant protein D [SP-D], tumor necrosis factor receptor-1, interleukin [IL]-6, IL-8, intercellular adhesion molecule-1, protein C, plasminogen activator inhibitor-1) were measured in baseline plasma from 549 patients in the ARDSNet trial of low vs high positive end-expiratory pressure. Mortality was modeled with multivariable logistic regression. Predictors were selected using backward elimination. Comparisons between candidate models were based on the receiver operating characteristics (ROC) and tests of integrated discrimination improvement. Results: Clinical predictors (Acute Physiology And Chronic Health Evaluation III [APACHE III], organ failures, age, underlying cause, alveolar-arterial oxygen gradient, plateau pressure) predicted mortality with an area under the ROC curve (AUC) of 0.82; a combination of eight biomarkers and the clinical predictors had an AUC of 0.85. The best performing biomarkers were the neutrophil chemotactic factor, IL-8, and SP-D, a product of alveolar type 2 cells, supporting the concept that acute inflammation and alveolar epithelial injury are important pathogenetic pathways in human ALI/ARDS. Conclusions: A combination of biomarkers and clinical predictors is superior to clinical predictors or biomarkers alone for predicting mortality in ALI/ARDS and may be useful for stratifying patients in clinical trials. From a pathogenesis perspective, the degree of acute inflammation and alveolar epithelial injury are highly associated with the outcome of human ALI/ARDS. PMID:19858233
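Integrated discrimination improvement, used above to compare candidate models, is the change in mean predicted risk among events minus the change among non-events when moving from a reference model to a new one. A sketch with hypothetical predicted risks (not the trial's data):

```python
def idi(p_old, p_new, events):
    # IDI > 0 means the new model pushes predicted risks up for events and/or
    # down for non-events relative to the reference model.
    def mean(xs):
        return sum(xs) / len(xs)
    ev_old = [p for p, e in zip(p_old, events) if e]
    ev_new = [p for p, e in zip(p_new, events) if e]
    ne_old = [p for p, e in zip(p_old, events) if not e]
    ne_new = [p for p, e in zip(p_new, events) if not e]
    return (mean(ev_new) - mean(ev_old)) - (mean(ne_new) - mean(ne_old))

# Hypothetical mortality risks for four patients (two deaths, two survivors):
events = [True, True, False, False]
clinical_only = [0.60, 0.55, 0.40, 0.35]    # reference model
with_biomarkers = [0.70, 0.68, 0.33, 0.30]  # clinical + biomarker model
delta = idi(clinical_only, with_biomarkers, events)
print(delta)
```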

  2. Using open source computational tools for predicting human metabolic stability and additional absorption, distribution, metabolism, excretion, and toxicity properties.

    PubMed

    Gupta, Rishi R; Gifford, Eric M; Liston, Ted; Waller, Chris L; Hohman, Moses; Bunin, Barry A; Ekins, Sean

    2010-11-01

    Ligand-based computational models could be more readily shared between researchers and organizations if they were generated with open source molecular descriptors [e.g., chemistry development kit (CDK)] and modeling algorithms, because this would negate the requirement for proprietary commercial software. We initially evaluated open source descriptors and model building algorithms using a training set of approximately 50,000 molecules and a test set of approximately 25,000 molecules with human liver microsomal metabolic stability data. A C5.0 decision tree model demonstrated that CDK descriptors together with a set of SMILES Arbitrary Target Specification (SMARTS) keys had good statistics [κ = 0.43, sensitivity = 0.57, specificity = 0.91, and positive predicted value (PPV) = 0.64], equivalent to those of models built with commercial Molecular Operating Environment 2D (MOE2D) and the same set of SMARTS keys (κ = 0.43, sensitivity = 0.58, specificity = 0.91, and PPV = 0.63). Extending the dataset to ∼193,000 molecules and generating a continuous model using Cubist with a combination of CDK and SMARTS keys or MOE2D and SMARTS keys confirmed this observation. When the continuous predictions and actual values were binned to get a categorical score we observed a similar κ statistic (0.42). The same combination of descriptor set and modeling method was applied to passive permeability and P-glycoprotein efflux data with similar model testing statistics. In summary, open source tools demonstrated predictive results comparable to those of commercial software with attendant cost savings. We discuss the advantages and disadvantages of open source descriptors and the opportunity for their use as a tool for organizations to share data precompetitively, avoiding repetition and assisting drug discovery.
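The κ statistic reported above measures agreement between predicted and observed classes beyond what chance alone would produce. A sketch with hypothetical confusion-matrix counts, chosen to reproduce the reported sensitivity (0.57) and specificity (0.91); the resulting κ differs from the paper's because the class balance here is invented:

```python
def cohens_kappa(tp, fp, fn, tn):
    # Cohen's kappa: (observed agreement - chance agreement) / (1 - chance).
    n = tp + fp + fn + tn
    po = (tp + tn) / n
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    return (po - pe) / (1 - pe)

# Hypothetical counts: 100 metabolically unstable compounds (57 caught),
# 300 stable compounds (273 correctly rejected).
k = cohens_kappa(tp=57, fp=27, fn=43, tn=273)
print(round(k, 2))  # 0.51
```

Because κ discounts chance agreement, it is a fairer headline statistic than raw accuracy when the stable/unstable classes are imbalanced, which is why the abstract leads with it.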

  3. Analysis of the human operator subsystems

    NASA Technical Reports Server (NTRS)

    Jones, Lynette A.; Hunter, Ian W.

    1991-01-01

    Except in low-bandwidth systems, knowledge of the human operator transfer function is essential for high-performance telerobotic systems. This information has usually been derived from detailed analyses of tracking performance, in which the human operator is considered as a complete system rather than as a summation of a number of subsystems, each of which influences the operator's output. Studies of one of these subsystems, the limb mechanics system, demonstrate that large parameter variations can occur that can have a profound effect on the stability of force-reflecting telerobot systems. An objective of this research was to decompose the performance of the human operator system in order to establish how the dynamics of each of the elements influence the operator's responses.

  4. A Novel Biclustering Approach to Association Rule Mining for Predicting HIV-1–Human Protein Interactions

    PubMed Central

    Mukhopadhyay, Anirban; Maulik, Ujjwal; Bandyopadhyay, Sanghamitra

    2012-01-01

    Identification of potential viral-host protein interactions is a vital and useful approach towards the development of new drugs targeting those interactions. In recent years, computational tools have been used for predicting viral-host interactions. Recently a database containing records of experimentally validated interactions between a set of HIV-1 proteins and a set of human proteins has been published. The problem of predicting new interactions based on this database is usually posed as a classification problem. However, posing the problem as a classification one suffers from the lack of biologically validated negative interactions. Therefore it is beneficial to use the existing database for predicting new viral-host interactions without the need for negative samples. Motivated by this, in this article, the HIV-1–human protein interaction database has been analyzed using association rule mining. The main objective is to identify a set of association rules both among the HIV-1 proteins and among the human proteins, and use these rules for predicting new interactions. In this regard, a novel association rule mining technique based on biclustering has been proposed for discovering frequent closed itemsets followed by the association rules from the adjacency matrix of the HIV-1–human interaction network. Novel HIV-1–human interactions have been predicted based on the discovered association rules and tested for biological significance. For validation of the predicted new interactions, gene ontology-based and pathway-based studies have been performed. These studies show that the human proteins which are predicted to interact with a particular viral protein share many common biological activities. Moreover, a literature survey has been used for validation, identifying some predicted interactions that are already validated experimentally but not present in the database. Comparison with other prediction methods is also discussed. PMID:22539940
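
    The rule-mining step can be illustrated with a toy version of the approach: treat each viral protein's row of the adjacency matrix as a transaction, count frequently co-occurring human proteins, and emit association rules that meet a confidence threshold. The sketch below uses hypothetical protein identifiers, not entries from the published database, and plain pair counting in place of the paper's biclustering-based miner:

```python
from itertools import combinations

# toy interaction data: each HIV-1 protein maps to the human proteins it binds
# (hypothetical identifiers, not records from the published database)
interactions = {
    "tat": {"CDK9", "CCNT1", "NFKB1"},
    "vpr": {"CDK9", "CCNT1"},
    "nef": {"NFKB1", "LCK"},
    "env": {"CDK9", "CCNT1", "LCK"},
}

def frequent_pairs(db, minsup):
    """Count how often each pair of human proteins co-occurs across transactions."""
    counts = {}
    for items in db.values():
        for pair in combinations(sorted(items), 2):
            counts[pair] = counts.get(pair, 0) + 1
    return {p: c for p, c in counts.items() if c >= minsup}

def rules(db, pairs, minconf):
    """Emit A -> B whenever P(B | A) over transactions meets the confidence bound."""
    out = []
    for (a, b), sup in pairs.items():
        sup_a = sum(1 for items in db.values() if a in items)
        sup_b = sum(1 for items in db.values() if b in items)
        if sup / sup_a >= minconf:
            out.append((a, b, sup / sup_a))
        if sup / sup_b >= minconf:
            out.append((b, a, sup / sup_b))
    return out

pairs = frequent_pairs(interactions, minsup=2)
found = rules(interactions, pairs, minconf=1.0)
```

    A rule such as CDK9 → CCNT1 would then suggest that a viral protein known to bind one partner may also interact with the other, which is the paper's route to predicting new interactions.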

  5. Addressing Challenges to the Design & Test of Operational Lighting Environments for the International Space Station

    NASA Technical Reports Server (NTRS)

    Clark, Toni A.

    2014-01-01

    In our day-to-day lives, the availability of light with which to see our environment is often taken for granted. The designers of land-based lighting systems use sunlight and artificial light as their toolset. The availability of power, quantity of light sources, and variety of design options are often unlimited. The accessibility of most land-based lighting systems makes it easy for the architect and engineer to verify and validate their design ideas. Failures with an implementation, while sometimes costly, can easily be addressed by renovation. Consider now an architectural facility orbiting in space, 260 miles above the surface of the earth. This human-rated architectural facility, the International Space Station (ISS), must maintain operations every day, including life support and appropriate human comforts, without fail. The facility must also handle the logistics of regular shipments of cargo, including new passengers. The ISS requires accommodations necessary for human control of machine systems. Additionally, the ISS is a research facility and supports investigations performed inside and outside its livable volume. Finally, the facility must support remote operations and observations by ground controllers. All of these architectural needs require a functional, safe, and even aesthetic lighting environment. At Johnson Space Center, our Habitability and Human Factors team assists our diverse customers with their lighting environment challenges via physical test and computer-based analysis. Because of the complexity of the ISS operational environment, our team has learned and developed processes that help ISS operate safely. Because of the dynamic exterior lighting environment, the team uses computational modeling to predict the lighting environment. The ISS's orbit exposes it to a sunrise every 90 minutes, causing work surfaces to quickly change from direct sunlight to earthshine to total darkness. Proper planning of vehicle approaches, robotics operations, and crewed Extra Vehicular Activities is mandatory to ensure the safety of the crew and all others involved. Innovation in testing techniques is important as well. The advent of solid-state lighting technology and the lack of stable national and international standards for its implementation pose new challenges in how to design, test, and verify individual light fixtures and the environment that uses them. The ISS will soon be replacing its internal fluorescent lighting system with a solid-state LED system. The Solid State Lighting Assembly will be used not only for general lighting but also as a medical countermeasure to control the circadian rhythm of the crew. The new light source has performance criteria very specific to its spectral fingerprint, creating new challenges that were not as significant during the original design of the ISS. This presentation will showcase findings and toolsets our team is using to assist in the planning of tasks and the design of operational lighting environments on the International Space Station.

  6. Lunar Surface Mission Operations Scenario and Considerations

    NASA Technical Reports Server (NTRS)

    Arnold, Larissa S.; Torney, Susan E.; Rask, John Doug; Bleisath, Scott A.

    2006-01-01

    Planetary surface operations have been studied since the last visit of humans to the Moon, including conducting analog missions. Mission Operations lessons from these activities are summarized. Characteristics of forecasted surface operations are compared to current human mission operations approaches. Considerations for future designs of mission operations are assessed.

  7. Scheduling Software for Complex Scenarios

    NASA Technical Reports Server (NTRS)

    2006-01-01

    Preparing a vehicle and its payload for a single launch is a complex process that involves thousands of operations. Because the equipment and facilities required to carry out these operations are extremely expensive and limited in number, optimal assignment and efficient use are critically important. Overlapping missions that compete for the same resources, ground rules, safety requirements, and the unique needs of processing vehicles and payloads destined for space impose numerous constraints that, when combined, require advanced scheduling. Traditional scheduling systems use simple algorithms and criteria when selecting activities and assigning resources and times to each activity. Schedules generated by these simple decision rules are, however, frequently far from optimal. To resolve mission-critical scheduling issues and predict possible problem areas, NASA historically relied upon expert human schedulers who used their judgment and experience to determine where things should happen, whether they will happen on time, and whether the requested resources are truly necessary.
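
    The "simple algorithms and criteria" of traditional schedulers can be illustrated with a greedy dispatch sketch: process activities longest-first and give each one the earliest free slot on its required resource. The activity and resource names below are invented for illustration:

```python
def greedy_schedule(activities, resources):
    """Assign each activity the earliest start time on its required resource,
    using a longest-duration-first priority rule (a simple decision rule)."""
    free_at = {r: 0 for r in resources}     # when each resource next becomes free
    schedule = {}
    for name, (resource, duration) in sorted(
            activities.items(), key=lambda kv: -kv[1][1]):
        start = free_at[resource]
        schedule[name] = (start, start + duration)
        free_at[resource] = start + duration
    return schedule

# hypothetical processing activities: name -> (required resource, duration)
sched = greedy_schedule(
    {"fuel": ("pad", 5), "stack": ("crane", 3), "checkout": ("pad", 2)},
    resources=["pad", "crane"])
```

    Rules like this produce feasible but often far-from-optimal schedules, which is exactly the shortcoming the abstract attributes to traditional systems.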

  8. Systems Modeling for Crew Core Body Temperature Prediction Postlanding

    NASA Technical Reports Server (NTRS)

    Cross, Cynthia; Ochoa, Dustin

    2010-01-01

    The Orion Crew Exploration Vehicle, NASA's latest crewed spacecraft project, presents many challenges to its designers, including ensuring crew survivability during nominal and off-nominal landing conditions. With a nominal water landing planned off the coast of San Clemente, California, off-nominal water landings could range from the far North Atlantic Ocean to the middle of the equatorial Pacific Ocean. For all of these conditions, the vehicle must provide sufficient life support resources to ensure that the crew members' core body temperatures are maintained at a safe level prior to crew rescue. This paper will examine the natural environments, environments created inside the cabin, and constraints associated with postlanding operations that affect the temperature of the crew member. Models of the capsule and the crew members are examined and analysis results are compared to the requirement for safe human exposure. Further, recommendations for updated modeling techniques and operational limits are included.

  9. Forensic DNA phenotyping: Developing a model privacy impact assessment.

    PubMed

    Scudder, Nathan; McNevin, Dennis; Kelty, Sally F; Walsh, Simon J; Robertson, James

    2018-05-01

    Forensic scientists around the world are adopting new technology platforms capable of efficiently analysing a larger proportion of the human genome. Undertaking this analysis could provide significant operational benefits, particularly in giving investigators more information about the donor of genetic material, a particularly useful investigative lead. Such information could include predicting externally visible characteristics such as eye and hair colour, as well as biogeographical ancestry. This article looks at the adoption of this new technology from a privacy perspective, using this to inform and critique the application of a Privacy Impact Assessment to this emerging technology. Noting the benefits and limitations, the article develops a number of themes that would influence a model Privacy Impact Assessment as a contextual framework for forensic laboratories and law enforcement agencies considering implementing forensic DNA phenotyping for operational use. Copyright © 2018 Elsevier B.V. All rights reserved.

  10. Users matter: multi-agent systems model of high performance computing cluster users.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    North, M. J.; Hood, C. S.; Decision and Information Sciences

    2005-01-01

    High performance computing clusters have been a critical resource for computational science for over a decade and have more recently become integral to large-scale industrial analysis. Despite their well-specified components, the aggregate behavior of clusters is poorly understood. The difficulties arise from complicated interactions between cluster components during operation. These interactions have been studied by many researchers, some of whom have identified the need for holistic multi-scale modeling that simultaneously includes network level, operating system level, process level, and user level behaviors. Each of these levels presents its own modeling challenges, but the user level is the most complex due to the adaptability of human beings. In this vein, there are several major user modeling goals, namely descriptive modeling, predictive modeling and automated weakness discovery. This study shows how multi-agent techniques were used to simulate a large-scale computing cluster at each of these levels.
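
    A toy version of such a multi-agent model can be sketched in a few lines: user agents stochastically submit jobs, and a fixed pool of nodes drains a FIFO queue. This is only a schematic of the modeling idea, not the study's simulation:

```python
import random

def simulate(n_users, n_nodes, steps, p_submit, seed=0):
    """Toy agent-based cluster: each step, every user agent may submit one
    single-node, one-step job; free nodes drain a FIFO queue of waiting jobs."""
    rng = random.Random(seed)
    queue, finished = [], 0
    for _ in range(steps):
        for user in range(n_users):        # user-level behavior (stochastic)
            if rng.random() < p_submit:
                queue.append(user)
        running = queue[:n_nodes]          # node-level allocation (FIFO)
        queue = queue[n_nodes:]
        finished += len(running)
    return finished, len(queue)

done, waiting = simulate(n_users=50, n_nodes=16, steps=100, p_submit=0.2)
```

    Even this caricature exhibits the qualitative behavior of interest: when aggregate user demand exceeds node capacity, the queue grows without bound, which is the kind of emergent, multi-level effect the study models.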

  11. Predicting drug-induced liver injury in human with Naïve Bayes classifier approach.

    PubMed

    Zhang, Hui; Ding, Lan; Zou, Yi; Hu, Shui-Qing; Huang, Hai-Guo; Kong, Wei-Bao; Zhang, Ji

    2016-10-01

    Drug-induced liver injury (DILI) is one of the major safety concerns in drug development. Although various toxicological studies assessing DILI risk have been developed, these methods were not sufficient for predicting DILI in humans. Thus, developing new tools and approaches to better predict DILI risk in humans has become an important and urgent task. In this study, we aimed to develop a computational model for assessment of the DILI risk using a large-scale human dataset and a Naïve Bayes classifier. The established Naïve Bayes prediction model was evaluated by 5-fold cross validation and an external test set. For the training set, the overall prediction accuracy of the 5-fold cross validation was 94.0 %. The sensitivity, specificity, positive predictive value and negative predictive value were 97.1, 89.2, 93.5 and 95.1 %, respectively. The test set showed a concordance of 72.6 %, sensitivity of 72.5 %, specificity of 72.7 %, positive predictive value of 80.4 % and negative predictive value of 63.2 %. Furthermore, some important molecular descriptors related to DILI risk and some toxic/non-toxic fragments were identified. Thus, we hope the prediction model established here could be employed for the assessment of human DILI risk, and the obtained molecular descriptors and substructures should be taken into consideration in the design of new candidate compounds to help medicinal chemists rationally select the chemicals with the best prospects to be effective and safe.
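
    The classifier itself can be sketched as a Bernoulli Naïve Bayes over binary structural fingerprints (fragment present/absent) with Laplace smoothing. The fingerprints and labels below are invented toy data, not the study's dataset:

```python
import math

def train_nb(X, y):
    """Bernoulli Naive Bayes with Laplace smoothing over binary features."""
    classes = sorted(set(y))
    n_feat = len(X[0])
    model = {}
    for c in classes:
        rows = [x for x, label in zip(X, y) if label == c]
        prior = len(rows) / len(X)
        # smoothed P(feature = 1 | class)
        p1 = [(sum(r[j] for r in rows) + 1) / (len(rows) + 2)
              for j in range(n_feat)]
        model[c] = (math.log(prior), p1)
    return model

def predict_nb(model, x):
    def loglik(c):
        logp, p1 = model[c]
        return logp + sum(math.log(p if xi else 1 - p)
                          for xi, p in zip(x, p1))
    return max(model, key=loglik)

# toy binary "structural fingerprints" (1 = fragment present); label 1 = DILI-positive
X = [[1, 0, 1], [1, 1, 1], [0, 0, 1], [0, 1, 0], [0, 0, 0]]
y = [1, 1, 1, 0, 0]
model = train_nb(X, y)
```

    The per-class, per-feature probabilities are also what make the abstract's fragment analysis possible: fragments whose presence strongly shifts the likelihood toward the toxic class are candidate "toxic substructures".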

  12. The North American Multi-Model Ensemble (NMME): Phase-1 Seasonal to Interannual Prediction, Phase-2 Toward Developing Intra-Seasonal Prediction

    NASA Technical Reports Server (NTRS)

    Kirtman, Ben P.; Min, Dughong; Infanti, Johnna M.; Kinter, James L., III; Paolino, Daniel A.; Zhang, Qin; vandenDool, Huug; Saha, Suranjana; Mendez, Malaquias Pena; Becker, Emily

    2013-01-01

    The recent US National Academies report "Assessment of Intraseasonal to Interannual Climate Prediction and Predictability" was unequivocal in recommending the need for the development of a North American Multi-Model Ensemble (NMME) operational predictive capability. Indeed, this effort is required to meet the specific tailored regional prediction and decision support needs of a large community of climate information users. The multi-model ensemble approach has proven extremely effective at quantifying prediction uncertainty due to uncertainty in model formulation, and has proven to produce better prediction quality (on average) than any single model ensemble. This multi-model approach is the basis for several international collaborative prediction research efforts and an operational European system, and there are numerous examples of how this multi-model ensemble approach yields superior forecasts compared to any single model. Based on two NOAA Climate Test Bed (CTB) NMME workshops (February 18 and April 8, 2011) a collaborative and coordinated implementation strategy for a NMME prediction system has been developed and is currently delivering real-time seasonal-to-interannual predictions on the NOAA Climate Prediction Center (CPC) operational schedule. The hindcast and real-time prediction data are readily available (e.g., http://iridl.ldeo.columbia.edu/SOURCES/.Models/.NMME/) and in graphical format from CPC (http://origin.cpc.ncep.noaa.gov/products/people/wd51yf/NMME/index.html). Moreover, the NMME forecasts are already being used as guidance for operational forecasters. This paper describes the new NMME effort, presents an overview of the multi-model forecast quality, and the complementary skill associated with individual models.
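
    The core of the multi-model idea is simple to sketch: average each model's ensemble mean with equal weights, and use the spread across model means as a crude measure of formulation uncertainty. The model names and values below are illustrative only, not NMME output:

```python
from statistics import mean, stdev

# hypothetical seasonal forecasts (degC anomaly) from several models,
# each with its own ensemble of members; names are illustrative only
forecasts = {
    "model_a": [0.3, 0.5, 0.4],
    "model_b": [0.6, 0.7],
    "model_c": [0.2, 0.4, 0.3, 0.5],
}

def multi_model_ensemble(forecasts):
    """Equal-weight multi-model ensemble: average the per-model ensemble means,
    and report the spread of those means as an uncertainty indicator."""
    model_means = [mean(members) for members in forecasts.values()]
    return mean(model_means), stdev(model_means)

mme_mean, mme_spread = multi_model_ensemble(forecasts)
```

    Averaging model means first (rather than pooling all members) gives each model equal weight regardless of ensemble size, which is one common convention in multi-model combination.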

  13. Sleep and Predicted Cognitive Performance of New Cadets during Cadet Basic Training at the United States Military Academy

    DTIC Science & Technology

    2005-09-01

    Excerpts from the report's front matter cover sleep architecture, the circadian rhythm and human sleep drive, a figure of circadian body temperature (Van Dongen & Dinges, 2000), and an EEG recording of human brain activity during sleep. The report derives predicted levels of human performance from circadian rhythms and the amount and quality of sleep, and combines these into cognitive performance predictions.

  14. On the design of flight-deck procedures

    NASA Technical Reports Server (NTRS)

    Degani, Asaf; Wiener, Earl L.

    1994-01-01

    In complex human-machine systems, operations, training, and standardization depend on an elaborate set of procedures which are specified and mandated by the operational management of the organization. The intent is to provide guidance to the pilots, to ensure a logical, efficient, safe, and predictable means of carrying out the mission objectives. In this report the authors examine the issue of procedure use and design from a broad viewpoint. The authors recommend a process which we call 'The Four P's:' philosophy, policies, procedures, and practices. We believe that if an organization commits to this process, it can create a set of procedures that are more internally consistent, less confusing, better respected by the flight crews, and that will lead to greater conformity. The 'Four-P' model, and the guidelines for procedural development in appendix 1, resulted from cockpit observations, extensive interviews with airline management and pilots, interviews and discussion at one major airframe manufacturer, and an examination of accident and incident reports. Although this report is based on airline operations, we believe that the principles may be applicable to other complex, high-risk systems, such as nuclear power production, manufacturing process control, space flight, and military operations.

  15. Overview of the Smart Network Element Architecture and Recent Innovations

    NASA Technical Reports Server (NTRS)

    Perotti, Jose M.; Mata, Carlos T.; Oostdyk, Rebecca L.

    2008-01-01

    In industrial environments, system operators rely on the availability and accuracy of sensors to monitor processes and detect failures of components and/or processes. The sensors must be networked in such a way that their data is reported to a central human interface, where operators are tasked with making real-time decisions based on the state of the sensors and the components that are being monitored. Incorporating health management functions at this central location aids the operator by automating the decision-making process to suggest, and sometimes perform, the action required by current operating conditions. Integrated Systems Health Management (ISHM) aims to incorporate data from many sources, including real-time and historical data and user input, and extract information and knowledge from that data to diagnose failures and predict future failures of the system. By distributing health management processing to lower levels of the architecture, less bandwidth is required for ISHM, data fusion is enhanced, systems and processes become more robust, and resolution for the detection and isolation of failures in a system, subsystem, component, or process is improved. The Smart Network Element (SNE) has been developed at NASA Kennedy Space Center to perform intelligent functions at the sensor and actuator level in support of ISHM.

  16. A learning controller for nonrepetitive robotic operation

    NASA Technical Reports Server (NTRS)

    Miller, W. T., III

    1987-01-01

    A practical learning control system is described which is applicable to complex robotic and telerobotic systems involving multiple feedback sensors and multiple command variables. In the controller, the learning algorithm is used to learn to reproduce the nonlinear relationship between the sensor outputs and the system command variables over particular regions of the system state space, rather than learning the actuator commands required to perform a specific task. The learned information is used to predict the command signals required to produce desired changes in the sensor outputs. The desired sensor output changes may result from automatic trajectory planning or may be derived from interactive input from a human operator. The learning controller requires no a priori knowledge of the relationships between the sensor outputs and the command variables. The algorithm is well suited for real time implementation, requiring only fixed point addition and logical operations. The results of learning experiments using a General Electric P-5 manipulator interfaced to a VAX-11/730 computer are presented. These experiments involved interactive operator control, via joysticks, of the position and orientation of an object in the field of view of a video camera mounted on the end of the robot arm.

  17. Concurrent Schedules of Positive and Negative Reinforcement: Differential-Impact and Differential-Outcomes Hypotheses

    PubMed Central

    Magoon, Michael A; Critchfield, Thomas S

    2008-01-01

    Considerable evidence from outside of operant psychology suggests that aversive events exert greater influence over behavior than equal-sized positive-reinforcement events. Operant theory is largely moot on this point, and most operant research is uninformative because of a scaling problem that prevents aversive events and those based on positive reinforcement from being directly compared. In the present investigation, humans' mouse-click responses were maintained on similarly structured, concurrent schedules of positive (money gain) and negative (avoidance of money loss) reinforcement. Because gains and losses were of equal magnitude, according to the analytical conventions of the generalized matching law, bias (log b ≠ 0) would indicate differential impact by one type of consequence; however, no systematic bias was observed. Further research is needed to reconcile this outcome with apparently robust findings in other literatures of superior behavior control by aversive events. In an incidental finding, the linear function relating log behavior ratio and log reinforcement ratio was steeper for concurrent negative and positive reinforcement than for control conditions involving concurrent positive reinforcement. This may represent the first empirical confirmation of a free-operant differential-outcomes effect predicted by contingency-discriminability theories of choice. PMID:18683609
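
    The generalized matching law invoked above, log(B1/B2) = a·log(R1/R2) + log b, can be fit by ordinary least squares; bias then appears as a nonzero intercept log b, and the "steeper linear function" finding corresponds to a larger slope a. A minimal sketch with fabricated session ratios (perfect matching, no bias):

```python
import math

def fit_matching_law(behavior_ratios, reinforcement_ratios):
    """Least-squares fit of the generalized matching law:
       log(B1/B2) = a * log(R1/R2) + log b
    Returns sensitivity a and bias log b (log b != 0 indicates bias)."""
    xs = [math.log10(r) for r in reinforcement_ratios]
    ys = [math.log10(v) for v in behavior_ratios]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    log_b = my - a * mx            # intercept = bias term
    return a, log_b

# fabricated session data: perfect matching (a = 1) with no bias (log b = 0)
a, log_b = fit_matching_law([0.25, 1.0, 4.0], [0.25, 1.0, 4.0])
```

    In the study's terms, differential impact of negative reinforcement would have shown up as log b systematically different from zero, which was not observed.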

  18. Control of the TSU 2-m automatic telescope

    NASA Astrophysics Data System (ADS)

    Eaton, Joel A.; Williamson, Michael H.

    2004-09-01

    Tennessee State University is operating a 2-m automatic telescope for high-dispersion spectroscopy. The alt-azimuth telescope is fiber-coupled to a conventional echelle spectrograph with two resolutions (R=30,000 and 70,000). We control this instrument with four computers running Linux and communicating over Ethernet through the UDP protocol. A computer physically located on the telescope handles the acquisition and tracking of stars. We avoid the need for real-time programming in this application by periodically latching the positions of the axes in a commercial motion controller and the time in a GPS receiver. A second (spectrograph) computer sets up the spectrograph and runs its CCD, a third (roof) computer controls the roll-off roof and front flap of the telescope enclosure, and the fourth (executive) computer makes decisions about which stars to observe and when to close the observatory for bad weather. The only human intervention in the telescope's operation involves changing the observing program, copying data back to TSU, and running quality-control checks on the data. It has been running reliably in this completely automatic, unattended mode for more than a year, with all day-to-day administration carried out over the Internet. To support automatic operation, we have written a number of useful tools to predict and analyze what the telescope does. These include a simulator that predicts roughly how the telescope will operate on a given night, a quality-control program to parse logfiles from the telescope and identify problems, and a rescheduling program that calculates new priorities to keep the frequency of observation for the various stars roughly as desired. We have also set up a database to keep track of the tens of thousands of spectra we expect to get each year.

  19. Economic optimization of operations for hybrid energy systems under variable markets

    DOE PAGES

    Chen, Jen; Garcia, Humberto E.

    2016-05-21

    We propose a hybrid energy system (HES) as an important element to enable increasing penetration of clean energy. Our paper investigates the operations flexibility of HES, and develops a methodology for operations optimization to maximize economic value based on predicted renewable generation and market information. A multi-environment computational platform for performing such operations optimization is also developed. In order to compensate for prediction error, a control strategy is accordingly designed to operate a standby energy storage element (ESE) to avoid energy imbalance within HES. The proposed operations optimizer allows systematic control of energy conversion for maximal economic value. Simulation results of two specific HES configurations are included to illustrate the proposed methodology and computational capability. These results demonstrate the economic viability of HES under the proposed operations optimizer, suggesting the diversion of energy for alternative energy output while participating in the ancillary service market. Economic advantages of such an operations optimizer and the associated flexible operations are illustrated by comparing the economic performance of flexible operations against that of constant operations. Sensitivity analyses with respect to market variability and prediction error are also performed.
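
    The role of the standby storage element can be sketched as a simple dispatch rule: absorb any surplus between realized and predicted generation up to capacity, discharge to cover shortfalls, and report the residual imbalance the rest of the system must cover. This is a schematic illustration, not the paper's optimizer:

```python
def storage_dispatch(predicted, actual, capacity, level=0.0):
    """Charge/discharge a standby storage element to absorb the gap between
    predicted and realized generation (positive gap = surplus to store)."""
    history = []
    for p, a in zip(predicted, actual):
        surplus = a - p                      # prediction error this interval
        new_level = min(max(level + surplus, 0.0), capacity)
        absorbed = new_level - level         # what storage actually took/gave
        imbalance = surplus - absorbed       # residual the grid must cover
        level = new_level
        history.append((level, imbalance))
    return history

# illustrative numbers: 3 intervals, storage capacity 3.0 energy units
h = storage_dispatch(predicted=[5, 5, 5], actual=[7, 4, 2], capacity=3.0)
```

    The nonzero residual in the final interval shows why storage sizing interacts with prediction error: once the element is empty (or full), the remaining imbalance must be traded or curtailed.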

  20. Economic optimization of operations for hybrid energy systems under variable markets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Jen; Garcia, Humberto E.

    We propose a hybrid energy system (HES) as an important element to enable increasing penetration of clean energy. Our paper investigates the operations flexibility of HES, and develops a methodology for operations optimization to maximize economic value based on predicted renewable generation and market information. A multi-environment computational platform for performing such operations optimization is also developed. In order to compensate for prediction error, a control strategy is accordingly designed to operate a standby energy storage element (ESE) to avoid energy imbalance within HES. The proposed operations optimizer allows systematic control of energy conversion for maximal economic value. Simulation results of two specific HES configurations are included to illustrate the proposed methodology and computational capability. These results demonstrate the economic viability of HES under the proposed operations optimizer, suggesting the diversion of energy for alternative energy output while participating in the ancillary service market. Economic advantages of such an operations optimizer and the associated flexible operations are illustrated by comparing the economic performance of flexible operations against that of constant operations. Sensitivity analyses with respect to market variability and prediction error are also performed.

  1. Modeling of human operator dynamics in simple manual control utilizing time series analysis. [tracking (position)

    NASA Technical Reports Server (NTRS)

    Agarwal, G. C.; Osafo-Charles, F.; Oneill, W. D.; Gottlieb, G. L.

    1982-01-01

    Time series analysis is applied to model human operator dynamics in pursuit and compensatory tracking modes. The normalized residual criterion is used as a one-step analytical tool to encompass the processes of identification, estimation, and diagnostic checking. A parameter constraining technique is introduced to develop more reliable models of human operator dynamics. The human operator is adequately modeled by a second order dynamic system both in pursuit and compensatory tracking modes. In comparing the data sampling rates, 100 msec between samples is adequate and is shown to provide better results than 200 msec sampling. The residual power spectrum and eigenvalue analysis show that the human operator is not a generator of periodic characteristics.
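
    A second-order model of the kind described can be written as an ARX difference equation, y[t] = a1·y[t-1] + a2·y[t-2] + b·u[t-1], and estimated by least squares on input-output records. The sketch below solves the 3×3 normal equations directly; the parameters and input sequence are invented for illustration, not the paper's tracking data:

```python
def fit_arx2(u, y):
    """Least-squares fit of a second-order ARX operator model:
       y[t] = a1*y[t-1] + a2*y[t-2] + b*u[t-1]
    solved via the 3x3 normal equations (no external libraries)."""
    rows = [[y[t-1], y[t-2], u[t-1]] for t in range(2, len(y))]
    rhs = [y[t] for t in range(2, len(y))]
    # normal equations: (A^T A) x = A^T b
    ata = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    atb = [sum(r[i] * v for r, v in zip(rows, rhs)) for i in range(3)]
    # Gaussian elimination (no pivoting; fine for this well-conditioned demo)
    for i in range(3):
        for j in range(i + 1, 3):
            f = ata[j][i] / ata[i][i]
            ata[j] = [aj - f * ai for aj, ai in zip(ata[j], ata[i])]
            atb[j] -= f * atb[i]
    x = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):   # back substitution
        x[i] = (atb[i] - sum(ata[i][j] * x[j] for j in range(i + 1, 3))) / ata[i][i]
    return x  # [a1, a2, b]

# simulate a known second-order "operator" (illustrative parameters) and refit
a1, a2, b = 1.2, -0.5, 0.3
u = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1]   # hypothetical input sequence
y = [0.0, 0.0]
for t in range(2, len(u)):
    y.append(a1 * y[t-1] + a2 * y[t-2] + b * u[t-1])
est = fit_arx2(u, y)   # recovers [a1, a2, b] exactly on noiseless data
```

    On real tracking records the residuals would not vanish, and diagnostic checks such as the residual power spectrum (as in the paper) decide whether second order is adequate.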

  2. Reconciled rat and human metabolic networks for comparative toxicogenomics and biomarker predictions

    PubMed Central

    Blais, Edik M.; Rawls, Kristopher D.; Dougherty, Bonnie V.; Li, Zhuo I.; Kolling, Glynis L.; Ye, Ping; Wallqvist, Anders; Papin, Jason A.

    2017-01-01

    The laboratory rat has been used as a surrogate to study human biology for more than a century. Here we present the first genome-scale network reconstruction of Rattus norvegicus metabolism, iRno, and a significantly improved reconstruction of human metabolism, iHsa. These curated models comprehensively capture metabolic features known to distinguish rats from humans including vitamin C and bile acid synthesis pathways. After reconciling network differences between iRno and iHsa, we integrate toxicogenomics data from rat and human hepatocytes, to generate biomarker predictions in response to 76 drugs. We validate comparative predictions for xanthine derivatives with new experimental data and literature-based evidence delineating metabolite biomarkers unique to humans. Our results provide mechanistic insights into species-specific metabolism and facilitate the selection of biomarkers consistent with rat and human biology. These models can serve as powerful computational platforms for contextualizing experimental data and making functional predictions for clinical and basic science applications. PMID:28176778

  3. Analytic Guided-Search Model of Human Performance Accuracy in Target- Localization Search Tasks

    NASA Technical Reports Server (NTRS)

    Eckstein, Miguel P.; Beutter, Brent R.; Stone, Leland S.

    2000-01-01

    Current models of human visual search have extended the traditional serial/parallel search dichotomy. Two successful models for predicting human visual search are the Guided Search model and the Signal Detection Theory model. Although these models are inherently different, it has been difficult to compare them because the Guided Search model is designed to predict response time, while Signal Detection Theory models are designed to predict performance accuracy. Moreover, current implementations of the Guided Search model require the use of Monte-Carlo simulations, a method that makes fitting the model's performance quantitatively to human data more computationally time consuming. We have extended the Guided Search model to predict human accuracy in target-localization search tasks. We have also developed analytic expressions that simplify simulation of the model to the evaluation of a small set of equations using only three free parameters. This new implementation and extension of the Guided Search model will enable direct quantitative comparisons with human performance in target-localization search experiments and with the predictions of Signal Detection Theory and other search accuracy models.
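
    The signal-detection side of such accuracy models has a standard closed form: with m locations and target strength d′, the target is localized correctly when its noisy response exceeds all m−1 distractor responses, P = ∫ φ(x−d′)Φ(x)^(m−1) dx. A numeric sketch of that integral (the paper's Guided Search extension is more elaborate than this):

```python
import math

def phi(x):
    """Standard normal probability density."""
    return math.exp(-0.5 * x * x) / math.sqrt(2 * math.pi)

def Phi(x):
    """Standard normal cumulative distribution."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def p_correct_localization(d_prime, m, lo=-8.0, hi=8.0, n=4000):
    """Trapezoidal evaluation of P(correct) in an m-location search:
       P = integral of phi(x - d') * Phi(x)**(m - 1) dx."""
    h = (hi - lo) / n
    total = 0.0
    for i in range(n + 1):
        x = lo + i * h
        w = 0.5 if i in (0, n) else 1.0     # trapezoidal endpoint weights
        total += w * phi(x - d_prime) * Phi(x) ** (m - 1)
    return total * h

p = p_correct_localization(d_prime=0.0, m=4)   # d' = 0 gives chance, 1/m
```

    At d′ = 0 the integral reduces to chance performance 1/m, and it rises monotonically with d′, which is the basic accuracy prediction these models share.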

  4. The transition to foraging for dense and predictable resources and its impact on the evolution of modern humans.

    PubMed

    Marean, Curtis W

    2016-07-05

    Scientists have identified a series of milestones in the evolution of the human food quest that are anticipated to have had far-reaching impacts on biological, behavioural and cultural evolution: the inclusion of substantial portions of meat, the broad spectrum revolution and the transition to food production. The foraging shift to dense and predictable resources is another key milestone that had consequential impacts on the later part of human evolution. The theory of economic defendability predicts that this shift had an important consequence: elevated levels of intergroup territoriality and conflict. In this paper, this theory is integrated with a well-established general theory of hunter-gatherer adaptations and is used to make predictions for the sequence of appearance of several evolved traits of modern humans. The distribution of dense and predictable resources in Africa is reviewed and found to occur only in aquatic contexts (coasts, rivers and lakes). The palaeoanthropological empirical record contains recurrent evidence for a shift to the exploitation of dense and predictable resources by 110 000 years ago, and the first known occurrence is in a marine coastal context in South Africa. Some theory predicts that this elevated conflict would have provided the conditions for selection for the hyperprosocial behaviours unique to modern humans. This article is part of the themed issue 'Major transitions in human evolution'. © 2016 The Author(s).

  5. The transition to foraging for dense and predictable resources and its impact on the evolution of modern humans

    PubMed Central

    Marean, Curtis W.

    2016-01-01

    Scientists have identified a series of milestones in the evolution of the human food quest that are anticipated to have had far-reaching impacts on biological, behavioural and cultural evolution: the inclusion of substantial portions of meat, the broad spectrum revolution and the transition to food production. The foraging shift to dense and predictable resources is another key milestone that had consequential impacts on the later part of human evolution. The theory of economic defendability predicts that this shift had an important consequence—elevated levels of intergroup territoriality and conflict. In this paper, this theory is integrated with a well-established general theory of hunter–gatherer adaptations and is used to make predictions for the sequence of appearance of several evolved traits of modern humans. The distribution of dense and predictable resources in Africa is reviewed and found to occur only in aquatic contexts (coasts, rivers and lakes). The palaeoanthropological empirical record contains recurrent evidence for a shift to the exploitation of dense and predictable resources by 110 000 years ago, and the first known occurrence is in a marine coastal context in South Africa. Some theory predicts that this elevated conflict would have provided the conditions for selection for the hyperprosocial behaviours unique to modern humans. This article is part of the themed issue ‘Major transitions in human evolution’. PMID:27298470

  6. Future of Mechatronics and Human

    NASA Astrophysics Data System (ADS)

    Harashima, Fumio; Suzuki, Satoshi

    This paper surveys the state of mechatronics, which sustains our human society, and introduces the HAM (Human Adaptive Mechatronics) project as one research effort to create new human-machine systems. The key concept of HAM is skill: the main research concerns are the analysis of skill and the establishment of assist methods that enhance the total performance of the human-machine system. Because the study of skill is an elucidation of the human itself, analyses of higher human functions are significant. After surveying research on human brain functions, an experimental analysis of human characteristics in machine operation is presented as one example of our research activities. A hovercraft simulator was used as a verification system involving observation, voluntary motion control and machine operation, all of which are required in general machine operation. The process of, and factors in, becoming skilled were investigated by identifying human control characteristics while measuring the operator's line of sight. It was confirmed that early switching of sub-controllers/reference signals in the human and enhancement of space perception are significant.

  7. An evaluation of the real-time tropical cyclone forecast skill of the Navy Operational Global Atmospheric Prediction System in the western North Pacific

    NASA Technical Reports Server (NTRS)

    Fiorino, Michael; Goerss, James S.; Jensen, Jack J.; Harrison, Edward J., Jr.

    1993-01-01

    The paper evaluates the meteorological quality and operational utility of the Navy Operational Global Atmospheric Prediction System (NOGAPS) in forecasting tropical cyclones. It is shown that the model can provide useful predictions of motion and formation on a real-time basis in the western North Pacific. The meteorological characteristics of the NOGAPS tropical cyclone predictions are evaluated by examining the formation of low-level cyclone systems in the tropics and the vortex structure in the NOGAPS analysis, and by verifying 72-h forecasts. The adjusted NOGAPS track forecasts showed skill comparable to the baseline aid and the dynamical model. NOGAPS successfully predicted unusual equatorward turns for several straight-running cyclones.

  8. Human Factors Analysis of Pipeline Monitoring and Control Operations: Final Technical Report

    DOT National Transportation Integrated Search

    2008-11-26

    The purpose of the Human Factors Analysis of Pipeline Monitoring and Control Operations project was to develop procedures that could be used by liquid pipeline operators to assess and manage the human factors risks in their control rooms that may adv...

  9. Recent Human Factors Contributions to Improve Military Operations (Human Factors and Ergonomics Society Bulletin. Volume 46, Number 12, December 2003)

    DTIC Science & Technology

    2003-12-01

    operations run the full gamut from large-scale, theater-wide combat, as witnessed in Operation Iraqi Freedom, to small-scale operations against terrorists, to operations...

  10. Prediction and warning system of SEP events and solar flares for risk estimation in space launch operations

    NASA Astrophysics Data System (ADS)

    García-Rigo, Alberto; Núñez, Marlon; Qahwaji, Rami; Ashamari, Omar; Jiggens, Piers; Pérez, Gustau; Hernández-Pajares, Manuel; Hilgers, Alain

    2016-07-01

    A web-based prototype system for predicting solar energetic particle (SEP) events and solar flares for use by space launch operators is presented. The system has been developed as a result of the European Space Agency (ESA) project SEPsFLAREs (Solar Events Prediction system For space LAunch Risk Estimation). The system consists of several modules covering the prediction of solar flares and early SEP warnings (labeled the Warning tool), the prediction of SEP event occurrence and onset, and the prediction of SEP event peak and duration. In addition, the system acquires data for solar flare nowcasting from Global Navigation Satellite Systems (GNSS)-based techniques (the GNSS Solar Flare Detector, GSFLAD, and the Sunlit Ionosphere Sudden Total Electron Content Enhancement Detector, SISTED) as additional independent products that may also prove useful for space launch operators.

  11. Leveraging ISI Multi-Model Prediction for Navy Operations: Proposal to the Office of Naval Research

    DTIC Science & Technology

    2014-09-30

    Leveraging ISI Multi-Model Prediction for Navy Operations: Proposal to the Office of Naval Research. PI: James L. Kinter III, Director, Center for Ocean-Land... Dates covered: 00-00-2014 to 00-00-2014.

  12. Dopamine modulates episodic memory persistence in old age

    PubMed Central

    Chowdhury, Rumana; Guitart-Masip, Marc; Bunzeck, Nico; Dolan, Raymond J; Düzel, Emrah

    2013-01-01

    Activation of the hippocampus is required to encode memories for new events (or episodes). Observations from animal studies suggest that for these memories to persist beyond 4 to 6 hours, a release of dopamine generated by strong hippocampal activation is needed. This predicts that dopaminergic enhancement should improve human episodic memory persistence even for events encoded with weak hippocampal activation. Here, using pharmacological fMRI in an elderly population in which there is a loss of dopamine neurons as part of normal aging, we show this very effect. The dopamine precursor levodopa led to a dose-dependent persistent episodic memory benefit for images of scenes when tested after 6 hours, independent of whether encoding-related hippocampal fMRI activity was weak or strong, following an inverted-U-shaped dose-response relationship. This lasting improvement even for weakly encoded events supports a role for dopamine in human episodic memory consolidation, albeit operating within a narrow dose range. PMID:23055489

  13. An evaluative model of system performance in manned teleoperational systems

    NASA Technical Reports Server (NTRS)

    Haines, Richard F.

    1989-01-01

    Manned teleoperational systems are used in aerospace operations in which humans must interact with machines remotely. Manual guidance of remotely piloted vehicles, controlling a wind tunnel, and carrying out a scientific procedure remotely are examples of teleoperations. A four-input-parameter throughput (Tp) model is presented which can be used to evaluate complex, manned, teleoperations-based systems and make critical comparisons among candidate control systems. The first two parameters of this model deal with nominal (A) and off-nominal (B) predicted events, while the last two focus on measured events of two types, human performance (C) and system performance (D). Digital simulations showed that the expression A(1-B)/(C+D) produced the greatest homogeneity of variance and distribution symmetry. Results from a recently completed manned life science telescience experiment will be used to further validate the model. Complex, interacting teleoperational systems may be systematically evaluated using this expression, much as a computer benchmark is used.
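    The throughput expression in the abstract prints with an unbalanced parenthesis; assuming the intended form is Tp = A(1 - B)/(C + D), it can be sketched directly (the function and argument names are illustrative, not from the paper).

```python
def throughput(a_nominal, b_offnominal, c_human, d_system):
    """Hypothetical throughput score Tp = A * (1 - B) / (C + D), assuming the
    abstract's expression has a balanced denominator (C + D). Higher predicted
    nominal performance (A) raises Tp; predicted off-nominal events (B) and
    measured human (C) and system (D) costs lower it."""
    return a_nominal * (1.0 - b_offnominal) / (c_human + d_system)

tp = throughput(0.9, 0.1, 0.5, 0.4)  # 0.9 * 0.9 / 0.9 = 0.9
```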

  14. Head Pose Estimation Using Multilinear Subspace Analysis for Robot Human Awareness

    NASA Technical Reports Server (NTRS)

    Ivanov, Tonislav; Matthies, Larry; Vasilescu, M. Alex O.

    2009-01-01

    Mobile robots operating in unconstrained indoor and outdoor environments would benefit in many ways from perception of the human awareness around them. Knowledge of people's head pose and gaze directions would enable the robot to deduce which people are aware of its presence and to predict the future motions of people for better path planning. Making such inferences requires estimating head pose from facial images that are a combination of multiple varying factors, such as identity, appearance, head pose, and illumination. By applying multilinear algebra, the algebra of higher-order tensors, we can separate these factors and estimate head pose regardless of the subject's identity or image conditions. Furthermore, we can automatically handle uncertainty in the size of the face and its location. We demonstrate a pipeline of on-the-move detection of pedestrians with a robot stereo vision system, segmentation of the head, and head pose estimation in cluttered urban street scenes.

  15. Recovery of Lunar Surface Access Module Residual and Reserve Propellants

    NASA Technical Reports Server (NTRS)

    Notardonato, William U.

    2007-01-01

    The Vision for Space Exploration calls for human exploration of the lunar surface in the 2020 timeframe. Sustained human exploration of the lunar surface will require supply, storage, and distribution of consumables for a variety of mission elements. These elements include propulsion systems for ascent and descent stages, life support for habitats and extra-vehicular activity, and reactants for power systems. NASA KSC has been tasked to develop technologies and strategies for consumables transfer for lunar exploration as part of the Exploration Technology Development Program. This paper will investigate details of operational concepts to scavenge residual propellants from the lunar descent propulsion system. Predictions on the mass of residuals and reserves are made. Estimates of heat transfer and boiloff rates are calculated and transient tank thermodynamic issues post-engine cutoff are modeled. Recovery and storage options including cryogenic liquid, vapor and water are discussed, and possible reuse of LSAM assets is presented.

  16. Measurement of whole-body human centers of gravity and moments of inertia.

    PubMed

    Albery, C B; Schultz, R B; Bjorn, V S

    1998-06-01

    With the inclusion of women in combat aircraft, the question of safe ejection seat operation has been raised. The potentially expanded population of combat pilots would include both smaller and larger ejection seat occupants, which could significantly affect seat performance. The method developed to measure human whole-body CG and MOI used a scale, a knife-edge balance, and an inverted torsional pendulum. Subjects' moments of inertia were measured along six different axes. The inertia tensor was calculated from these values, and the principal moments of inertia were then derived. Thirty-eight anthropometric measurements were also taken for each subject to provide a means for direct correlation of inertial properties to body dimensions and for modeling purposes. Data collected in this study have been used to validate whole-body mass properties predictions. In addition, the data will be used to improve Air Force and Navy ejection seat trajectory models for the expanded population.
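    The step from a measured inertia tensor to principal moments is a standard eigendecomposition, and can be sketched as follows; the tensor values below are illustrative placeholders, not data from the study.

```python
import numpy as np

# Symmetric inertia tensor assembled from moments measured about six axes
# (values are illustrative, in kg*m^2, not measurements from the study).
inertia = np.array([
    [12.0, -0.8, -0.5],
    [-0.8, 11.0, -0.3],
    [-0.5, -0.3,  2.5],
])

# Principal moments of inertia are the eigenvalues of the inertia tensor;
# the corresponding eigenvectors give the principal axes.
principal_moments = np.linalg.eigvalsh(inertia)
```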

  17. USSR Space Life Sciences Digest, issue 11

    NASA Technical Reports Server (NTRS)

    Hooke, Lydia Razran (Editor); Radtke, Mike (Editor)

    1987-01-01

    This is the eleventh issue of NASA's USSR Space Life Sciences Digest. It contains abstracts of 54 papers recently published in Russian language periodicals and bound collections and of four new Soviet monographs. Selected abstracts are illustrated. Additional features include the translation of a paper presented in Russian to the United Nations, a review of a book on space ecology, and report of a conference on evaluating human functional capacities and predicting health. Current Soviet Life Sciences titles available in English are cited. The materials included in this issue have been identified as relevant to 30 areas of aerospace medicine and space biology. These areas are: adaptation, aviation physiology, biological rhythms, biospherics, body fluids, botany, cardiovascular and respiratory systems, cosmonaut training, developmental biology, endocrinology, enzymology, equipment and instrumentation, gastrointestinal systems, group dynamics, genetics, hematology, human performance, immunology, life support systems, mathematical modeling, metabolism, microbiology, musculoskeletal system, neurophysiology, nutrition, operational medicine, perception, personnel selection, psychology, and radiobiology.

  18. RPD-based Hypothesis Reasoning for Cyber Situation Awareness

    NASA Astrophysics Data System (ADS)

    Yen, John; McNeese, Michael; Mullen, Tracy; Hall, David; Fan, Xiaocong; Liu, Peng

    Intelligence workers such as analysts, commanders, and soldiers often need a hypothesis reasoning framework to gain improved situation awareness of the highly dynamic cyber space. The development of such a framework requires the integration of interdisciplinary techniques, including supports for distributed cognition (human-in-the-loop hypothesis generation), supports for team collaboration (identification of information for hypothesis evaluation), and supports for resource-constrained information collection (hypotheses competing for information collection resources). We here describe a cognitively-inspired framework that is built upon Klein’s recognition-primed decision model and integrates the three components of Endsley’s situation awareness model. The framework naturally connects the logic world of tools for cyber situation awareness with the mental world of human analysts, enabling the perception, comprehension, and prediction of cyber situations for better prevention, survival, and response to cyber attacks by adapting missions at the operational, tactical, and strategic levels.

  19. The role of developmental plasticity and epigenetics in human health.

    PubMed

    Gluckman, Peter D; Hanson, Mark A; Low, Felicia M

    2011-03-01

    Considerable epidemiological, experimental and clinical data have amassed showing that the risk of developing disease in later life is dependent on early life conditions, mainly operating within the normative range of developmental exposures. This relationship reflects plastic responses made by the developing organism as an evolved strategy to cope with immediate or predicted circumstances, to maximize fitness in the context of the range of environments potentially faced. There is now increasing evidence, both in animals and humans, that such developmental plasticity is mediated in part by epigenetic mechanisms. However, recognition of the importance of developmental plasticity as a factor influencing later life health, particularly within the medical and public health communities, is low, and we argue that this indifference cannot be sustained in light of the growing understanding of developmental processes and the rapid rise in the prevalence of obesity and metabolic disease globally. Copyright © 2011 Wiley-Liss, Inc.

  20. A Physiologically-Based Description of the Inhalation Pharmacokinetics of Styrene in Rats and Humans

    DTIC Science & Technology

    1983-01-01

    ...the model for the rat was scaled to give a description of human kinetics, and the predictions agreed closely with available data from the literature (Fig. 4)... for predicting human kinetics from a data base in other mammalian species. The ability to anticipate kinetic behavior in humans could very much improve...

  1. Jellyfish prediction of occurrence from remote sensing data and a non-linear pattern recognition approach

    NASA Astrophysics Data System (ADS)

    Albajes-Eizagirre, Anton; Romero, Laia; Soria-Frisch, Aureli; Vanhellemont, Quinten

    2011-11-01

    The impact of jellyfish on human activities has been increasingly reported worldwide in recent years. Sectors such as tourism, water sports and leisure, fisheries and aquaculture are commonly damaged when facing blooms of gelatinous zooplankton. Hence the prediction of the appearance and disappearance of jellyfish on our coasts, which is not fully understood from a biological point of view, is approached as a pattern recognition problem in the paper presented herein, where a set of potential ecological cues was selected to test their usefulness for prediction. Remote sensing data were used to describe environmental conditions that could support the occurrence of jellyfish blooms, with the aim of capturing physical-biological interactions: forcing, coastal morphology, food availability, and water mass characteristics are some of the variables that seem to exert an effect on jellyfish accumulation on the shoreline under specific spatial and temporal windows. A data-driven model based on computational intelligence techniques has been designed and implemented to predict jellyfish events in the beach area as a function of environmental conditions. Data from 2009 over the NW Mediterranean continental shelf have been used to train and test this prediction protocol. Standard level 2 products are used from MODIS (NASA OceanColor) and MERIS (ESA - FRS data). The procedure for designing the analysis system can be described as follows. The aforementioned satellite data have been used as the feature set for the performance evaluation. Ground truth has been extracted from visual observations by human agents at different beach sites along the Catalan coast. After collecting the evaluation data set, the performance of different computational intelligence approaches has been compared; the one that performs best in terms of generalization capability has been selected for the prediction stage.
    Different tests have been conducted to assess the prediction capability of the resulting system under operational conditions, taking into account several types of features at different distances, in both the spatial and temporal domains, from the prediction time and site. Moreover, the generalization capability has been measured via cross-fold validation. The implementation and performance evaluation results are detailed in the present communication, together with the feature extraction from satellite data. To the best of our knowledge, the developed application constitutes the first implementation of an automated system for the prediction of jellyfish appearance founded on remote sensing technologies.
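    The cross-fold validation used above to measure generalization capability can be sketched generically; the helper names and the trivial fit/predict pair in the usage are illustrative, not from the paper.

```python
import numpy as np

def kfold_indices(n_samples, k, seed=0):
    """Split sample indices into k disjoint folds for cross-validation,
    as used to measure a classifier's generalization capability."""
    rng = np.random.default_rng(seed)
    order = rng.permutation(n_samples)
    return [order[i::k] for i in range(k)]

def cross_val_accuracy(fit, predict, X, y, k=5, seed=0):
    """Average held-out accuracy over k folds for any fit/predict pair."""
    folds = kfold_indices(len(y), k, seed)
    scores = []
    for i, test_idx in enumerate(folds):
        train_idx = np.concatenate([f for j, f in enumerate(folds) if j != i])
        model = fit(X[train_idx], y[train_idx])
        scores.append(np.mean(predict(model, X[test_idx]) == y[test_idx]))
    return float(np.mean(scores))
```

    Comparing candidate models by this score is one way to select "the outperforming one in terms of its generalization capability".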

  2. Operator function modeling: Cognitive task analysis, modeling and intelligent aiding in supervisory control systems

    NASA Technical Reports Server (NTRS)

    Mitchell, Christine M.

    1990-01-01

    The design, implementation, and empirical evaluation of task-analytic models and intelligent aids for operators in the control of complex dynamic systems, specifically aerospace systems, are studied. Three related activities are included: (1) models of operator decision making in complex and predominantly automated space systems were developed and applied; (2) the Operator Function Model (OFM) was used to represent operator activities; and (3) the Operator Function Model Expert System (OFMspert), a stand-alone knowledge-based system that interacts with a human operator in a manner similar to a human assistant in the control of aerospace systems, was developed. OFMspert is an architecture for an operator's assistant that uses the OFM as its system and operator knowledge base and a blackboard paradigm of problem solving to dynamically generate expectations about upcoming operator activities and to interpret actual operator actions. An experiment validated OFMspert's intent inferencing capability and showed that it inferred the intentions of operators in ways comparable to both a human expert and the operators themselves. OFMspert was also augmented with control capabilities. An interface allowed the operator to interact with OFMspert, delegating as much or as little control responsibility as the operator chose. With its design based on the OFM, OFMspert's control capabilities were available at multiple levels of abstraction and allowed the operator a great deal of discretion over the amount and level of delegated control. An experiment showed that overall system performance was comparable for teams of two human operators and teams of a human operator and OFMspert.

  3. Human Factors Guidance for Control Room and Digital Human-System Interface Design and Modification, Guidelines for Planning, Specification, Design, Licensing, Implementation, Training, Operation and Maintenance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R. Fink, D. Hill, J. O'Hara

    2004-11-30

    Nuclear plant operators face a significant challenge designing and modifying control rooms. This report provides guidance on planning, designing, implementing and operating modernized control rooms and digital human-system interfaces.

  4. Reconciled Rat and Human Metabolic Networks for Comparative Toxicogenomics and Biomarker Predictions

    DTIC Science & Technology

    2017-02-08

    ...compared with the original human GPR rules (Supplementary Fig. 3). The consensus-based approach for filtering orthology annotations was designed to... (Received 29 Jan 2016; Accepted 13 Dec 2016; Published 8 Feb 2017) ...predictions in response to 76 drugs. We validate comparative predictions for xanthine derivatives with new experimental data and literature-based evidence

  5. Genome-Wide Prediction and Analysis of 3D-Domain Swapped Proteins in the Human Genome from Sequence Information.

    PubMed

    Upadhyay, Atul Kumar; Sowdhamini, Ramanathan

    2016-01-01

    3D-domain swapping is one of the mechanisms of protein oligomerization, and proteins exhibiting this phenomenon have many biological functions. Proteins that undergo domain swapping have attracted much attention owing to their involvement in human diseases, such as conformational diseases, amyloidosis, serpinopathies, proteinopathies, etc. Early identification of the proteins in the whole human genome that retain a tendency to domain swap would enable many aspects of disease control and management. Predictive models were developed using machine learning approaches, with an average accuracy of 78% (sensitivity of 85.6%, specificity of 87.5% and an MCC value of 0.72), to predict putative domain swapping in protein sequences. These models were applied to many complete genomes, with special emphasis on the human genome. Nearly 44% of the protein sequences in the human genome were predicted positive for domain swapping. Enrichment analysis was performed on the positively predicted sequences from the human genome for their domain distribution, disease association and functional importance based on Gene Ontology (GO), providing a better understanding of the functional importance of these sequences. Finally, we developed hinge region prediction for a given putative domain-swapped sequence by using important physicochemical properties of amino acids.
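    The MCC value reported alongside sensitivity and specificity follows from the standard Matthews correlation formula over confusion-matrix counts; a minimal sketch with illustrative counts (not the study's data):

```python
import math

def mcc(tp, tn, fp, fn):
    """Matthews correlation coefficient from confusion-matrix counts:
    +1 for perfect prediction, 0 for chance-level, -1 for total disagreement."""
    num = tp * tn - fp * fn
    den = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return num / den if den else 0.0

# A balanced, error-free classifier scores 1.0; random guessing scores ~0.
perfect = mcc(50, 50, 0, 0)
```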

  6. New techniques for the analysis of manual control systems. [mathematical models of human operator behavior

    NASA Technical Reports Server (NTRS)

    Bekey, G. A.

    1971-01-01

    Studies are summarized on the application of advanced analytical and computational methods to the development of mathematical models of human controllers in multiaxis manual control systems. Specific accomplishments include the following: (1) The development of analytical and computer methods for the measurement of random parameters in linear models of human operators. (2) Discrete models of human operator behavior in a multiple display situation were developed. (3) Sensitivity techniques were developed which make possible the identification of unknown sampling intervals in linear systems. (4) The adaptive behavior of human operators following particular classes of vehicle failures was studied and a model structure proposed.

  7. Petri nets as a modeling tool for discrete concurrent tasks of the human operator. [describing sequential and parallel demands on human operators

    NASA Technical Reports Server (NTRS)

    Schumacher, W.; Geiser, G.

    1978-01-01

    The basic concepts of Petri nets are reviewed, as well as their application as the fundamental model of technical systems with concurrent discrete events, such as hardware systems and software models of computers. The use of Petri nets is proposed for modeling the human operator dealing with concurrent discrete tasks. Their properties useful in modeling the human operator are discussed and practical examples are given. By means of an experimental investigation of binary concurrent tasks presented in a serial manner, the representation of human behavior by Petri nets is demonstrated.
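    The firing rule that makes Petri nets suitable for concurrent discrete tasks can be sketched in a few lines; the place and transition names below are illustrative, not taken from the paper's examples.

```python
# A transition in a place/transition Petri net is enabled when every input
# place holds at least one token; firing it moves tokens from the input
# places to the output places. Two concurrent tasks can thus proceed in
# either order before a synchronization transition joins them.

def enabled(marking, transition):
    inputs, _ = transition
    return all(marking.get(p, 0) >= 1 for p in inputs)

def fire(marking, transition):
    inputs, outputs = transition
    m = dict(marking)
    for p in inputs:
        m[p] -= 1
    for p in outputs:
        m[p] = m.get(p, 0) + 1
    return m

# Two concurrent tasks t1, t2 joined by a synchronization transition t3.
t1 = (["ready1"], ["done1"])
t2 = (["ready2"], ["done2"])
t3 = (["done1", "done2"], ["finished"])
marking = {"ready1": 1, "ready2": 1}
```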

  8. Identification of human operator performance models utilizing time series analysis

    NASA Technical Reports Server (NTRS)

    Holden, F. M.; Shinners, S. M.

    1973-01-01

    The results of an effort performed by Sperry Systems Management Division for AMRL in applying time series analysis as a tool for modeling the human operator are presented. This technique is utilized for determining the variation of the human transfer function under various levels of stress. The human operator's model is determined based on actual input and output data from a tracking experiment.
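    Identifying an operator model from input and output tracking data can be sketched as least-squares fitting of a discrete first-order model; this is a generic time-domain illustration under assumed dynamics, not the Sperry time series method itself.

```python
import numpy as np

# Identify a first-order discrete model y[k] = a*y[k-1] + b*u[k-1] from
# simulated input/output tracking data via ordinary least squares.
rng = np.random.default_rng(1)
a_true, b_true = 0.8, 0.5            # assumed "operator" parameters
u = rng.normal(size=500)             # tracking input
y = np.zeros(500)
for k in range(1, 500):
    y[k] = a_true * y[k - 1] + b_true * u[k - 1] + 0.01 * rng.normal()

# Regressor matrix: past output and past input.
Phi = np.column_stack([y[:-1], u[:-1]])
a_hat, b_hat = np.linalg.lstsq(Phi, y[1:], rcond=None)[0]
```

    Refitting the same model on data gathered under different stress levels would expose how the transfer function varies, in the spirit of the study above.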

  9. Intra-Operative Frozen Sections for Ovarian Tumors – A Tertiary Center Experience

    PubMed Central

    Arshad, Nur Zaiti Md; Ng, Beng Kwang; Paiman, Noor Asmaliza Md; Mahdy, Zaleha Abdullah; Noor, Rushdan Mohd

    2018-01-01

    Background: Accuracy of diagnosis with intra-operative frozen sections is extremely important in the evaluation of ovarian tumors so that appropriate surgical procedures can be selected. Study design: All patients who underwent intra-operative frozen sections for ovarian masses in a tertiary center over the nine years from June 2008 until April 2017 were reviewed. Frozen section diagnoses and final histopathological reports were compared. Main outcome measures: Sensitivity, specificity, and positive and negative predictive values of intra-operative frozen section as compared with final histopathological results for ovarian tumors. Results: A total of 92 cases were recruited for final evaluation. The frozen section diagnoses agreed with the final histopathological reports in 83.7% of cases. The sensitivity, specificity, positive predictive value and negative predictive value were 95.6%, 85.1%, 86.0% and 95.2% for benign tumors and 69.2%, 100%, 100% and 89.2% for malignant tumors, respectively. For borderline ovarian tumors, the sensitivity and specificity were 76.2% and 88.7%, respectively; the positive predictive value was 66.7% and the negative predictive value was 92.7%. Conclusion: The accuracy of intra-operative frozen section diagnoses for ovarian tumors is high, and this approach remains a reliable option for assessing ovarian masses intra-operatively. PMID:29373916
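    The four diagnostic measures reported above follow from standard confusion-matrix formulas comparing the test (frozen section) against the reference (final histopathology); the counts in the sketch are illustrative, not the study's data.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard diagnostic accuracy measures from a 2x2 confusion matrix:
    tp/fp/fn/tn count test-positive/negative cases against the reference
    diagnosis (here, frozen section vs. final histopathology)."""
    return {
        "sensitivity": tp / (tp + fn),   # true positives among diseased
        "specificity": tn / (tn + fp),   # true negatives among healthy
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

metrics = diagnostic_metrics(tp=90, fp=10, fn=5, tn=95)
```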

  10. Human mobility prediction from region functions with taxi trajectories.

    PubMed

    Wang, Minjie; Yang, Su; Sun, Yi; Gao, Jun

    2017-01-01

    People in cities nowadays suffer from increasingly severe traffic jams, owing in part to limited awareness of how collective human mobility is affected by urban planning. Moreover, understanding how region functions shape human mobility is critical for business planning but remains unsolved so far. This study aims to discover the association between region functions and the resulting human mobility. We establish a linear regression model to predict the traffic flows of Beijing based on input referred to as a bag of POIs. By solving the predictor in the sense of sparse representation, we find that the average prediction precision is over 74% and that each type of POI contributes differently to the predictor, which accounts for what factors, and to what extent, region functions attract people to visit. Based on these findings, predicted human mobility could be taken into account when planning new regions and region functions.
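    Solving a linear predictor "in the sense of sparse representation" is commonly done with an L1 penalty, which drives the weights of irrelevant features to exactly zero. A minimal coordinate-descent (Lasso) sketch on synthetic data, not the authors' exact method:

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Sparse linear predictor fit by coordinate descent with soft
    thresholding; irrelevant features get exactly-zero weights."""
    n, p = X.shape
    w = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ w + X[:, j] * w[j]        # partial residual
            rho = X[:, j] @ r / n
            w[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return w

# Synthetic "bag of features": only columns 0 and 2 actually matter.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
w_true = np.array([2.0, 0.0, -1.5, 0.0, 0.0, 0.0])
y = X @ w_true + 0.05 * rng.normal(size=200)
w_hat = lasso_cd(X, y, lam=0.1)
```

    Inspecting which weights survive (and their magnitudes) is what lets one read off how much each feature type contributes to the predictor.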

  11. Assessing the viability of `over-the-loop' real-time short-to-medium range ensemble streamflow forecasts

    NASA Astrophysics Data System (ADS)

    Wood, A. W.; Clark, E.; Mendoza, P. A.; Nijssen, B.; Newman, A. J.; Clark, M. P.; Arnold, J.; Nowak, K. C.

    2016-12-01

    Many, if not most, national operational short-to-medium range streamflow prediction systems rely on a forecaster-in-the-loop approach in which some parts of the forecast workflow are automated, but others require the hands-on effort of an experienced human forecaster. This approach evolved out of the need to correct for deficiencies in the models and datasets that were available for forecasting, and it often leads to skillful predictions despite the use of relatively simple, conceptual models. On the other hand, the process is not reproducible, which limits opportunities to assess and incorporate process variations, and the effort required to make forecasts in this way is an obstacle to expanding forecast services, e.g., through adding new forecast locations or more frequent forecast updates, running more complex models, or producing forecast ensembles and hindcasts that can support verification. In the last decade, the hydrologic forecasting community has begun to develop more centralized, `over-the-loop' systems. The quality of these new forecast products will depend on their ability to leverage research in areas including earth system modeling, parameter estimation, data assimilation, statistical post-processing, weather and climate prediction, verification, and uncertainty estimation through the use of ensembles. Currently, the operational streamflow forecasting and water management communities have little experience with the strengths and weaknesses of over-the-loop approaches, even as these systems are being rolled out in major operational forecasting centers. There is thus a need both to evaluate these forecasting advances and to demonstrate their potential in a public arena, raising awareness in forecast user communities and development programs alike.
To address this need, the National Center for Atmospheric Research is collaborating with the University of Washington, the Bureau of Reclamation and the US Army Corps of Engineers, using the NCAR 'System for Hydromet Analysis, Research, and Prediction' (SHARP) to implement, assess and demonstrate real-time over-the-loop forecasts. We present early hindcast and verification results from SHARP for short to medium range streamflow forecasts in a number of US case study watersheds.
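    Hindcast verification of ensemble streamflow forecasts of the kind described above is often scored with the continuous ranked probability score (CRPS); the sketch below is a generic ensemble estimator, not code from SHARP.

```python
import numpy as np

def crps_ensemble(members, obs):
    """Continuous ranked probability score for an ensemble forecast
    against one observation (lower is better):
    E|X - y| - 0.5 * E|X - X'| over ensemble members X, X'."""
    members = np.asarray(members, dtype=float)
    term1 = np.abs(members - obs).mean()
    term2 = np.abs(members[:, None] - members[None, :]).mean()
    return term1 - 0.5 * term2

# A sharp, well-centred ensemble scores much better than a biased one.
good = crps_ensemble([9.8, 10.0, 10.2], obs=10.0)
biased = crps_ensemble([14.8, 15.0, 15.2], obs=10.0)
```

    Averaging this score over many forecast/observation pairs in a hindcast is one standard way to compare forecasting system variants reproducibly.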

  12. Importance of good manufacturing practices in microbiological monitoring in processing human tissues for transplant.

    PubMed

    Pianigiani, Elisa; Ierardi, Francesca; Fimiani, Michele

    2013-12-01

    Skin allografts represent an important therapeutic resource in the treatment of severe skin loss. The risk associated with application of processed tissues in humans is very low; however, human material always carries the risk of disease transmission. To minimise the risk of contamination of grafts, processing is carried out in clean rooms where air quality is monitored. Procedures and quality control tests are performed to standardise the production process and to guarantee the final product for human use. Since we only validate and distribute aseptic tissues, we conducted a study to determine which types of quality controls for skin processing are the most suitable for detecting processing errors and intercurrent contamination, and for faithfully mapping the process without unduly increasing production costs. Two different methods for quality control were statistically compared using the Fisher exact test. On the basis of the current study we selected our quality control procedure based on pre- and post-processing tissue controls, operator controls and environmental controls. Evaluation of the predictive value of our control methods showed that tissue control was the most reliable method of revealing microbial contamination of grafts. We obtained 100% sensitivity by doubling tissue controls, while maintaining high specificity (77%).
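    The abstract states that the two quality-control methods were compared with the Fisher exact test. A minimal stdlib sketch of that test on a 2x2 detection table follows; the counts are made up for illustration, since the study's raw data are not given in the abstract.

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher exact test for the 2x2 table [[a, b], [c, d]]:
    sum the hypergeometric probabilities of all tables with the same
    margins that are no more likely than the observed one."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    denom = comb(n, col1)

    def prob(x):  # P(x of the col1 "detected" cases fall in row 1)
        return comb(row1, x) * comb(n - row1, col1 - x) / denom

    p_obs = prob(a)
    lo, hi = max(0, col1 - (n - row1)), min(row1, col1)
    return sum(prob(x) for x in range(lo, hi + 1) if prob(x) <= p_obs + 1e-12)

# Hypothetical counts: method A detects 18/20 contaminated grafts, method B 11/20.
p = fisher_exact_2x2(18, 2, 11, 9)
print(f"two-sided p = {p:.4f}")
```

    A small p would indicate that the two control methods detect contamination at genuinely different rates, which is the kind of evidence the study used to choose its procedure.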

  13. Intelligent Context-Aware and Adaptive Interface for Mobile LBS

    PubMed Central

    Liu, Yanhong

    2015-01-01

    Context-aware user interfaces play an important role in many human-computer interaction tasks of location-based services. Although spatial models for context-aware systems have been studied extensively, how to locate specific spatial information for users is still not well resolved, which matters in the mobile environment, where location-based services users are impeded by device limitations. Better context-aware human-computer interaction models of mobile location-based services are needed not just to predict performance outcomes, such as whether people will be able to find the information needed to complete a human-computer interaction task, but to understand the human processes at work in spatial query, which will in turn inform the detailed design of better user interfaces in mobile location-based services. In this study, a context-aware adaptive model for mobile location-based services interfaces is proposed, which contains three major sections: purpose, adjustment, and adaptation. Based on this model we describe the process of user operation and interface adaptation through the dynamic interaction between users and the interface. We then show how the model accommodates users' demands in a complicated environment and demonstrate its feasibility through experimental results. PMID:26457077

  14. Automated confidence ranked classification of randomized controlled trial articles: an aid to evidence-based medicine

    PubMed Central

    Smalheiser, Neil R; McDonagh, Marian S; Yu, Clement; Adams, Clive E; Davis, John M; Yu, Philip S

    2015-01-01

    Objective: For many literature review tasks, including systematic review (SR) and other aspects of evidence-based medicine, it is important to know whether an article describes a randomized controlled trial (RCT). Current manual annotation is neither complete nor flexible enough for the SR process. In this work, highly accurate machine learning predictive models were built that include confidence predictions of whether an article is an RCT. Materials and Methods: The LibSVM classifier was used with forward selection of potential feature sets on a large human-related subset of MEDLINE to create a classification model requiring only the citation, abstract, and MeSH terms for each article. Results: The model achieved an area under the receiver operating characteristic curve of 0.973 and a mean squared error of 0.013 on the held-out year 2011 data. Accurate confidence estimates were confirmed on a manually reviewed set of test articles. A second model not requiring MeSH terms was also created, and it performs almost as well. Discussion: Both models accurately rank and predict article RCT confidence. Using the model and the manually reviewed samples, it is estimated that about 8000 (3%) additional RCTs can be identified in MEDLINE, and that 5% of articles tagged as RCTs in MEDLINE may not be identified. Conclusion: Retagging human-related studies with a continuously valued RCT confidence is potentially more useful for article ranking and review than a simple yes/no prediction. The automated RCT tagging tool should offer significant savings of time and effort during the process of writing SRs, and is a key component of a multistep text mining pipeline that we are building to streamline SR workflow. In addition, the model may be useful for identifying errors in MEDLINE publication types.
The RCT confidence predictions described here have been made available to users as a web service with a user query form front end at: http://arrowsmith.psych.uic.edu/cgi-bin/arrowsmith_uic/RCT_Tagger.cgi. PMID:25656516
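    The headline metric above, area under the ROC curve, equals the probability that a randomly chosen RCT article receives a higher confidence score than a randomly chosen non-RCT article. A minimal stdlib sketch of that rank-based (Mann-Whitney) computation follows; the scores and labels are made up for illustration, not outputs of the paper's LibSVM model.

```python
def roc_auc(scores, labels):
    """AUC via the Mann-Whitney identity: the fraction of (positive, negative)
    pairs in which the positive outscores the negative, with ties counting half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical RCT-confidence scores (label 1 = article is an RCT).
scores = [0.97, 0.91, 0.80, 0.45, 0.30, 0.12]
labels = [1,    1,    0,    1,    0,    0]
print(f"AUC = {roc_auc(scores, labels):.3f}")  # prints AUC = 0.889
```

    An AUC near 1, like the paper's 0.973, means the confidence scores almost always rank true RCTs above non-RCTs, which is what makes them useful for prioritising articles during SR screening.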

  15. MIDAS-FAST: Design and Validation of a Model-Based Tool to Predict Operator Performance with Robotic Arm Automation

    NASA Technical Reports Server (NTRS)

    Sebok, Angelia (Principal Investigator); Wickens, Christopher; Gacy, Marc; Brehon, Mark; Scott-Nash, Shelly; Sarter, Nadine; Li, Huiyang; Gore, Brian; Hooey, Becky

    2017-01-01

    The Coalition for Aerospace and Science (CAS) is hosting an exhibition on Capitol Hill on June 14, 2017, to highlight the contributions of CAS members to NASA's portfolio of activities. This exhibition represents an opportunity for an HFES member's groundbreaking work to be displayed and to build support within Congress for NASA's human research program, including in those areas that are of specific interest to the HFE community. The intent of this poster presentation is to demonstrate the positive outcome that comes from funding HFE-related research on a project like the one exemplified by MIDAS-FAST.

  16. From crowd modeling to safety problems. Comment on "Human behaviours in evacuation crowd dynamics: From modelling to "big data" toward crisis management" by Nicola Bellomo et al.

    NASA Astrophysics Data System (ADS)

    Elaiw, Ahmed

    2016-09-01

    Paper [3] presents a survey and a critical analysis of models of crowd dynamics derived to support crisis management related to safety problems. This is an important topic that can have a significant impact on the wellbeing of our society. We are very interested in this topic as we operate in a country, Saudi Arabia, where huge crowds can be present and where stress conditions can occasionally be induced by unpredictable events. In these situations the problem of crisis management is of fundamental importance.

  17. Factors determining the smooth flow and the non-operative time in a one-induction room to one-operating room setting

    PubMed Central

    Mulier, Jan P; De Boeck, Liesje; Meulders, Michel; Beliën, Jeroen; Colpaert, Jan; Sels, Annabel

    2015-01-01

    Rationale, aims and objectives: What factors determine the use of an anaesthesia preparation room and shorten non-operative time? Methods: A logistic regression is applied to 18 751 surgery records from AZ Sint-Jan Brugge AV, Belgium, where each operating room has its own anaesthesia preparation room. Surgeries in which the patient's induction has already started when the preceding patient's surgery ends belong to a first group, where the preparation room is used as an induction room. Surgeries not fulfilling this property belong to a second group. A logistic regression model predicts the probability that a surgery will be classified into a specific group. Non-operative time is calculated as the time between the end of the previous surgery and the incision of the next surgery. A log-linear regression of this non-operative time is performed. Results: It was found that a switch in surgeons, a surgery being non-elective, and the previous surgery being non-elective all increase the probability of being classified into the second group. Only a few surgery types, anaesthesiologists and operating rooms can be found exclusively in one of the two groups. Analysis of variance demonstrates that the first group has significantly lower non-operative times. Switches in surgeons, switches in anaesthesiologists and longer scheduled durations of the previous surgery increase the non-operative time. A switch in both surgeon and anaesthesiologist strengthens this negative effect. Only a few operating rooms and surgery types influence the non-operative time. Conclusion: The use of the anaesthesia preparation room shortens the non-operative time and is determined by several human and structural factors. PMID:25496600
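    The classification step above is a logistic regression of group membership on surgery characteristics. A minimal gradient-ascent sketch follows; the two binary predictors (a surgeon switch and a non-elective flag) and the toy records are assumptions for illustration, not the study's 18 751 records or its full predictor set.

```python
import math

def fit_logistic(X, y, lr=0.5, epochs=2000):
    """Stochastic gradient ascent on the log-likelihood of a logistic model."""
    w = [0.0] * (len(X[0]) + 1)              # intercept + one weight per predictor
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))   # predicted probability of group 2
            err = yi - p                      # log-likelihood gradient factor
            w[0] += lr * err
            for j, xj in enumerate(xi):
                w[j + 1] += lr * err * xj
    return w

def predict_prob(w, xi):
    z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
    return 1.0 / (1.0 + math.exp(-z))

# Toy records: (surgeon switch, non-elective) -> 1 if in the second group.
X = [(1, 0), (1, 1), (0, 0), (0, 1), (1, 1), (0, 0)]
y = [1, 1, 0, 0, 1, 0]
w = fit_logistic(X, y)
print(f"P(group 2 | switch, non-elective) = {predict_prob(w, (1, 1)):.2f}")
```

    The fitted weights play the role of the study's regression coefficients: a positive weight on the surgeon-switch predictor corresponds to the reported finding that switches raise the probability of falling into the second group.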

  18. High Sensitivity, Wearable, Piezoresistive Pressure Sensors Based on Irregular Microhump Structures and Its Applications in Body Motion Sensing.

    PubMed

    Wang, Zongrong; Wang, Shan; Zeng, Jifang; Ren, Xiaochen; Chee, Adrian J Y; Yiu, Billy Y S; Chung, Wai Choi; Yang, Yong; Yu, Alfred C H; Roberts, Robert C; Tsang, Anderson C O; Chow, Kwok Wing; Chan, Paddy K L

    2016-07-01

    A pressure sensor based on irregular microhump patterns has been proposed and developed. The devices show high sensitivity and a broad operating pressure regime compared with regular micropattern devices. Finite element analysis (FEA) is utilized to confirm the sensing mechanism and predict the performance of the pressure sensor based on the microhump structures. Silicon carbide sandpaper is employed as the mold to develop polydimethylsiloxane (PDMS) microhump patterns of various sizes. The active layer of the piezoresistive pressure sensor is developed by spin coating PEDOT:PSS on top of the patterned PDMS. The devices show an averaged sensitivity as high as 851 kPa(-1), a broad operating pressure range (20 kPa), low operating power (100 nW), and fast response speed (6.7 kHz). Owing to their flexible properties, the devices are applied to human body motion sensing and radial artery pulse monitoring. These flexible, high-sensitivity devices show great potential in the next generation of smart sensors for robotics, real-time health monitoring, and biomedical applications. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
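    The sensitivity quoted above in kPa(-1) is conventionally the relative change in output signal per unit pressure. A small sketch of that calculation follows; the current readings are hypothetical values of the reported order of magnitude, not the device's measured data.

```python
def sensitivity(i0, i1, dp_kpa):
    """Piezoresistive sensitivity S = (delta_I / I0) / delta_P, in kPa^-1:
    relative change in output current per kPa of applied pressure."""
    return (i1 - i0) / i0 / dp_kpa

# Hypothetical readings: current rises from 2 nA to 1.7 uA under a 1 kPa load.
print(f"S = {sensitivity(2e-9, 1.7e-6, 1.0):.0f} kPa^-1")
```

    Because S divides by the baseline current, a tiny off-state current (hence the nW operating power noted above) is what allows such large sensitivity values.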

  19. Application of finite element analysis in pre-operative planning for deformity correction of abnormal hip joints--a case series.

    PubMed

    Rhyu, K H; Kim, Y H; Park, W M; Kim, K; Cho, T-J; Choi, I H

    2011-09-01

    In experimental and clinical research, it is difficult to directly measure responses in the human body, such as contact pressure and stress in a joint, but finite element analysis (FEA) enables the examination of in vivo responses by contact analysis. Hence, FEA is useful for pre-operative planning prior to orthopaedic surgeries, in order to gain insight into which surgical options will result in the best outcome. The present study develops a numerical simulation technique based on FEA to predict the surgical outcomes of osteotomy methods for the treatment of slipped capital femoral epiphyses. The correlation of biomechanical parameters including contact pressure and stress, for moderate and severe cases, is investigated. For severe slips, a base-of-neck osteotomy is thought to be the most reliable and effective surgical treatment, while any osteotomy may produce dramatic improvement for moderate slips. This technology of pre-operative planning using FEA can provide information regarding biomechanical parameters that might facilitate the selection of optimal osteotomy methods and corresponding surgical options.

  20. Punishment in human choice: direct or competitive suppression?

    PubMed Central

    Critchfield, Thomas S; Paletz, Elliott M; MacAleese, Kenneth R; Newland, M Christopher

    2003-01-01

    This investigation compared the predictions of two models describing the integration of reinforcement and punishment effects in operant choice. Deluty's (1976) competitive-suppression model (conceptually related to two-factor punishment theories) and de Villiers' (1980) direct-suppression model (conceptually related to one-factor punishment theories) have been tested previously in nonhumans but not at the individual level in humans. Mouse clicking by college students was maintained in a two-alternative concurrent schedule of variable-interval money reinforcement. Punishment consisted of variable-interval money losses. Experiment 1 verified that money loss was an effective punisher in this context. Experiment 2 consisted of qualitative model comparisons similar to those used in previous studies involving nonhumans. Following a no-punishment baseline, punishment was superimposed upon both response alternatives. Under schedule values for which the direct-suppression model, but not the competitive-suppression model, predicted distinct shifts from baseline performance, or vice versa, 12 of 14 individual-subject functions, generated by 7 subjects, supported the direct-suppression model. When the punishment models were converted to the form of the generalized matching law, least-squares linear regression fits for a direct-suppression model were superior to those of a competitive-suppression model for 6 of 7 subjects. In Experiment 3, a more thorough quantitative test of the modified models, fits for a direct-suppression model were superior in 11 of 13 cases. These results correspond well to those of investigations conducted with nonhumans and provide the first individual-subject evidence that a direct-suppression model, evaluated both qualitatively and quantitatively, describes human punishment better than a competitive-suppression model. We discuss implications for developing better punishment models and future investigations of punishment in human choice. 
PMID:13677606
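    The model comparison above converts each punishment model to the form of the generalized matching law, log10(B1/B2) = a * log10(R1/R2) + log10(b), and fits it by least squares. A minimal stdlib sketch of that fit follows; the condition ratios are fabricated for illustration, not the study's data, and under de Villiers' direct-suppression model each reinforcement rate would first be reduced by its punishment rate before forming the ratio.

```python
import math

def fit_matching(behavior_ratios, reinf_ratios):
    """Least-squares fit of log10(B1/B2) = a * log10(R1/R2) + log10(b);
    returns the sensitivity a and the bias term log10(b)."""
    xs = [math.log10(r) for r in reinf_ratios]
    ys = [math.log10(b) for b in behavior_ratios]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

# Fabricated conditions: undermatching (a = 0.8) with a slight bias (log b = 0.1).
reinf = [0.25, 0.5, 1.0, 2.0, 4.0]
behav = [10 ** (0.8 * math.log10(r) + 0.1) for r in reinf]
a, log_b = fit_matching(behav, reinf)
print(f"sensitivity a = {a:.2f}, bias log b = {log_b:.2f}")
```

    Comparing the quality of such fits when R is net reinforcement (direct suppression) versus when punishment enters as added reinforcement for the other alternative (competitive suppression) is the quantitative test the study performed.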
